The iTunes mini-player (to give just one example) supports click-through where the application isn't brought to the front when the play/pause and volume controls are used.
How is this done?
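For context, the documented Cocoa hooks for click-through live on NSView: `acceptsFirstMouse:` lets a click reach a view in an inactive window, and `shouldDelayWindowOrderingForEvent:` together with `[NSApp preventWindowOrdering]` can stop the window from being brought forward at all. A minimal sketch (the class name `ControlView` is made up for illustration):

```objective-c
#import <Cocoa/Cocoa.h>

@interface ControlView : NSView
@end

@implementation ControlView

// Let the first click reach this view even when its window is inactive.
- (BOOL)acceptsFirstMouse:(NSEvent *)theEvent
{
    return YES;
}

// Ask AppKit to delay ordering the window forward for this click...
- (BOOL)shouldDelayWindowOrderingForEvent:(NSEvent *)theEvent
{
    return YES;
}

- (void)mouseDown:(NSEvent *)theEvent
{
    // ...and cancel the ordering entirely, so the window stays where it is.
    [NSApp preventWindowOrdering];
    // Handle the click (e.g. toggle play/pause) here.
}

@end
```

I don't know whether this is what iTunes actually does, but it is the mechanism AppKit exposes for this behaviour.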
I've been looking through Apple's documentation and have little to go on. In the Cocoa Event-Handling Guide, under "Event Dispatch", it states:
Some events, many of which are defined by the Application Kit (type NSAppKitDefined), have to do with actions controlled by a window or the application object itself. Examples of these events are those related to activating, deactivating, hiding, and showing the application. NSApp filters out these events early in its dispatch routine and handles them itself.
So, from my limited understanding (How an Event Enters a Cocoa Application), subclassing NSApplication and overriding
- (void)sendEvent:(NSEvent *)theEvent
should trap every mouse and keyboard event. But the window is still raised on click, so either the window is raised before the event is seen by NSApplication, or I'm missing something else.
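For reference, what I'm trying looks roughly like this (a minimal sketch, assuming the app's principal class in Info.plist is set to the subclass, here called MyApplication):

```objective-c
#import <Cocoa/Cocoa.h>

@interface MyApplication : NSApplication
@end

@implementation MyApplication

// Every event dispatched to the app should pass through here.
- (void)sendEvent:(NSEvent *)theEvent
{
    // Log the raw event type before normal dispatch.
    NSLog(@"event type: %lu", (unsigned long)[theEvent type]);
    [super sendEvent:theEvent];
}

@end
```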
I've also worked through Matt Gallagher's Demystifying NSApplication by recreating it, but unfortunately he doesn't cover the event queue, so beyond that I'm stumped.
Any help would be appreciated, thanks.
Edited to add: I found a post at Lloyd's Lounge in which he talks about the same problem and links to a post at CocoaBuilder, capture first right mouse down. I'm currently trying out the code supplied there; after some fiddling around and reactivating the NSLog for [theEvent type], the left mouse button activity is being caught.
Now, left-clicking on the window to bring it forward produces a sequence of event types (13, 1, 13): NSAppKitDefined, NSLeftMouseDown, and NSAppKitDefined again. Can I filter these out, or find where they are going?
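As an experiment, I'm thinking of swallowing those NSAppKitDefined events inside the sendEvent: override, something like the sketch below. (This assumes dropping the event before [super sendEvent:] prevents the activation handling; I don't know yet whether activation actually happens earlier, outside this method.)

```objective-c
// Inside the NSApplication subclass from above.
- (void)sendEvent:(NSEvent *)theEvent
{
    // Type 13 is NSAppKitDefined; drop it instead of dispatching it.
    if ([theEvent type] == NSAppKitDefined) {
        NSLog(@"filtered NSAppKitDefined, subtype: %d", (int)[theEvent subtype]);
        return;
    }
    [super sendEvent:theEvent];
}
```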