Loving what you do

March 29, 2015

The years 2010 and 2011 were a major turning point in my life. I went from a career that I had grown to hate (Mechanical Engineering) to one that inspires me (Software Engineering). I’ve been very fortunate to be able to make this switch, and I still find myself inspired by my day job as an embedded Linux coder.

Though, in 2010-2011 my open-source presence seemed to take a nose-dive. At the start of 2010 I had grand plans for Composite and was making good progress on it. But for most of 2010 and 2011 I had taken on a 2nd job and couldn’t work on Composite at all. And in the summer of 2011 I got a day job with TI. When I wanted to do some extra hacking, it was more apropos to do work-related hacking than my own. I was just as interested in the problems at work as I was in hobby software.

And this is still true today.

However, it makes me sad that I more or less vanished from the open source community — because that was a place I really enjoyed being in. But with the way device manufacturers approach IP these days, community involvement is discouraged because it’s an IP liability. So, I played it safe for a few years.

Will this change for me in 2015? I hope so. Today I moved Composite from gabe.is-a-geek.org to riggable.com (something I planned to do 4 years ago). I also want to make some progress on Composite — though I may need to re-think the strategy and motivations. The original motivation was mainly obvious, but there was also a subversive motivation of having a publicly available work that demonstrates my software skills (remember, I was trying to change careers at the time).

Also, Composite didn’t go as planned because it didn’t attract developers… neither front-end UI developers nor back-end audio developers. And I absolutely totally hate UI development. If you look at the commit log, things got really slow and unclear whenever I turned to the UI work. Looking to the future, I’ll need to scale back the “innovative UI” ideas in order for it to be something that _I_ can produce. (I’m all in to the “innovative back-end” stuff. :-))

Meanwhile, even Garage Band offers a lot of the features that Composite set out to provide. So, there’s not much innovation left in the idea any more. It’s now just a “me, too” project. An OSS alternative. So, it’s an open question as to whether Composite should even continue.

Anyway, I want to hammer some of this out in the coming year.

Usually after I hit “Publish” on a post like this, things get really heated up at work. So, I can’t really say what tomorrow will look like. Will Composite continue? Can’t say. Will I blog more? Dunno. Will I actually read my “Linux Audio Developers” mailing list mail? Who knows. But I’m taking a moment to reflect and also think about the future.


Last time I talked about getting the X server to send XInput 2 multitouch events in your Qt program. In this post we decode them and send them to the widget of interest.


For TouchBegin, TouchUpdate, and TouchEnd the “cookie” in our XGenericEventCookie struct is a struct XIDeviceEvent. It looks like this:[1]

typedef struct {
    int           type;         /* GenericEvent */
    unsigned long serial;       /* # of last request processed by server */
    Bool          send_event;   /* true if this came from a SendEvent request */
    Display       *display;     /* Display the event was read from */
    int           extension;    /* XI extension offset */
    int           evtype;       /* TouchBegin, TouchEnd, ... */
    Time          time;         /* Timestamp of event */
    int           deviceid;     /* Device generating event */
    int           sourceid;     /* *actual* device that gen'd event */
    int           detail;       /* Touch ID / Finger ID */
    Window        root;         /* Window ID of root window (usu. 0) */
    Window        event;        /* Window ID of Window that should receive event */
    Window        child;        /* Window ID of Window where the event occurred */
    double        root_x;       /* (x,y) coords in root window */
    double        root_y;
    double        event_x;      /* (x,y) coords in event window */
    double        event_y;
    int           flags;        /* misc modifiers, see spec */
    XIButtonState       buttons;
    XIValuatorState     valuators;
    XIModifierState     mods;
    XIGroupState        group;
} XIDeviceEvent;

Usually we’re only interested in event,[2] event_x, event_y, and detail. The detail allows you to track individual fingers. The XInput extension promises that you’ll always get this sequence for each finger:

TouchBegin -> [TouchUpdate -> ] TouchEnd

…that is to say, you’ll always get a TouchEnd event, even if some other window grabs the finger from you. Furthermore, the events you get are individual. The events for each finger come in independently… not all grouped together in one event.
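That per-finger bookkeeping is easy to get wrong, so here is a small stand-alone sketch of it in C++. Note that FingerTracker, TouchKind, and TouchPoint are names I made up for illustration — they are not part of XInput or Qt — but the logic of keying state off the detail field is exactly what the extension’s guarantee enables:

```cpp
#include <cstddef>
#include <map>

// Mirrors the three XI2 touch event kinds we care about.
enum TouchKind { TouchBegin, TouchUpdate, TouchEnd };

struct TouchPoint {
    double x;
    double y;
};

// Hypothetical helper: tracks each finger by the XIDeviceEvent
// "detail" field.  The Begin -> [Update ->] End guarantee means
// entries never leak, as long as we erase on TouchEnd.
class FingerTracker {
public:
    void handle(TouchKind kind, int detail, double x, double y) {
        switch (kind) {
        case TouchBegin:
        case TouchUpdate:
            m_fingers[detail] = TouchPoint{x, y};  // insert or move finger
            break;
        case TouchEnd:
            m_fingers.erase(detail);               // finger lifted (or grabbed away)
            break;
        }
    }
    std::size_t activeFingers() const { return m_fingers.size(); }
private:
    std::map<int, TouchPoint> m_fingers;  // detail -> last known position
};
```

Because a TouchEnd is guaranteed even when another window grabs the finger, erasing on TouchEnd is enough to keep the map from leaking entries.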

Sending Events to a QWidget

If we decide to handle an event inside TApplication::x11EventFilter() then we do the handling immediately and return true. Handling it means sending it to the window. Fortunately, since the touch events give us a window ID, it’s pretty easy:

    XIDeviceEvent *de = (XIDeviceEvent*)xev->data;
    QEvent *qev = translate_to_some_kind_of_qevent(de);
    QWidget *target = QWidget::find(de->event);
    return notify(target, qev);

The only trick is… what kind of event should we send?

Translating Touch Events

Ideally, we would find a way to send QTouchEvents to our QWidget. That way the widget logic uses only one touch API. And QTouchEvent is a fairly nice API… since the event maintains the state of all fingers that are currently in play.

We can do this, but it would raise the ire of the Qt gods. QTouchEvent has a sub-class called QTouchEvent::TouchPoint that is intended to be used as a read-only object (the “write” methods are all marked “internal”).[3] So we do it at our own risk… which isn’t a great idea for your production/stable code.

As an alternative, we can create our own custom touch events that can be used by the QWidget. If it were me, I would make something that looks very similar to QTouchEvent.

Sending a custom QEvent

Both options are a bit of work to implement. Since this is a bit of a rabbit trail, I’m not going to create (or translate) the touch events. Instead, I’ll just send the widget a custom QEvent and call it a day. The event is pretty simple:

class TouchEvent : public QEvent
{
public:
    enum {
        TouchEventId = QEvent::User + 1
    };

    TouchEvent(int id);
    virtual ~TouchEvent();

    int id() { return m_id; }

private:
    int m_id;
};

With one critical piece:

TouchEvent::TouchEvent(int id) :
    QEvent( QEvent::Type(TouchEventId) ),
    m_id(id)
{
}

If we don’t initialize QEvent with the event id… then it won’t return the right number for QEvent::type().

We’re careful to make sure that our event ID number is within the range that Qt has allocated: QEvent::User + 1. Note that all of the documentation says that our event number must be greater than QEvent::User. I suspect this is a mis-print, though.

Now, when we send the event, it arrives at the virtual QWidget::event() method, where we check the type and then cast it back to our original type:

bool ScribbleArea::event(QEvent *event)
{
    switch (event->type()) {
    case TouchEvent::TouchEventId: {
        TouchEvent *tev = static_cast<TouchEvent*>(event);
        qDebug() << "Received TouchEvent #" << tev->id();
    }   break;
    /* ... */
    default:
        return QWidget::event(event);
    }
    return true;
}
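As a sanity check on the whole pattern, the round trip can be modeled in stand-alone C++ with no Qt dependency. The class names below deliberately mirror the Qt ones, but this is only a sketch of the type-tag logic, not Qt’s actual implementation:

```cpp
// Stand-in for QEvent: the base class stores a type tag, the subclass
// passes its own tag up, and the receiver checks the tag before
// downcasting.  All names here are illustrative, not Qt's.
class Event {
public:
    enum { User = 1000 };  // mimics QEvent::User
    explicit Event(int type) : m_type(type) {}
    virtual ~Event() {}
    int type() const { return m_type; }
private:
    int m_type;
};

class TouchEvent : public Event {
public:
    enum { TouchEventId = Event::User + 1 };
    // The critical piece: pass TouchEventId up to the base class.
    explicit TouchEvent(int id) : Event(TouchEventId), m_id(id) {}
    int id() const { return m_id; }
private:
    int m_id;
};

// Receiver: only downcast after checking the type tag.
int handleEvent(Event *ev) {
    if (ev->type() == TouchEvent::TouchEventId)
        return static_cast<TouchEvent*>(ev)->id();
    return -1;  // not ours
}
```

If TouchEvent forgot to pass TouchEventId up to the base constructor, type() would report the wrong value and handleEvent() would silently ignore the event — which is exactly the failure mode described above.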


So, we didn’t meet our goal of getting fingerpaint refactored… but the end is in sight (and I’m stopping so that I can move on to larger goals). For Composite I will probably stick with Ubuntu 12.04’s patch for as long as I can get away with it… instead of rolling my own XInput 2 code. If I really need to add the support, I will probably go ahead and cheat… using the private API of QTouchEvent::TouchPoint.


I dropped the code at http://gabe.is-a-geek.org/blog_content/2012/07/xinput2-part2/. See the previous post for links to docs.

[1] – source: /usr/include/X11/extensions/XInput2.h, with extra annotation from the protocol spec added by me.

[2] – root is the root window of the entire X Display… and the value is usually 0. child is usually None, but it is set when the event actually happened in a child of the event window and a redirection sent the event to the event window instead.

[3] – See /usr/include/qt4/QtGui/qevent.h

Fennec (Firefox Mobile)

Fennec is the code name for the latest Firefox Mobile browser.  I was playing with the version on the MeeGo Handset UX (on an Atom-based Ideapad)… and pulled up some Youtube videos.  At first the video seemed a little jumpy… no doubt because of the compositing window manager I was using (mcompositor).

Then I clicked fullscreen.

The quality of the fullscreen video was amazing.  It was like watching a DVD or television.  I didn’t detect any jitter or pixelation… just clean video.  I was impressed!

Other video players in MeeGo are no slouch.  For example, watching the short film “Big Buck Bunny” in the Netbook UX with Banshee (a media player) is also impressively snappy.  And flash video inside the Chromium browser performs well, too.  But in Chromium, if you hit the button for full-screen flash, you’ll just get full-screen white. 😦

This is just one of the things that is really well done with Firefox Mobile.  If you get a chance to play with the latest version — take it.