I've never worked with touch screens, but I'm inclined to think they behave somewhat like a mouse. If that's true, you may be able to use Windows hooks to capture events. If the event comes from the keyboard (which I assume there is only one of), direct it to your first window. If the event comes from the mouse, check its coordinates (or perhaps a device ID, if there is one; I don't know). If it falls within your first window, direct the input there; if it falls on the second screen, direct the input to your second window.
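As a rough sketch of the coordinate idea (not a tested solution), a low-level mouse hook (`WH_MOUSE_LL`) lets you inspect each event's screen position before it is dispatched. The `SECOND_SCREEN_LEFT_EDGE` constant and the `ForwardToSecondWindow` helper are placeholders I made up; in a real program you'd query the monitor layout with `EnumDisplayMonitors` or `MonitorFromPoint` and decide how to forward the event yourself:

```cpp
#include <windows.h>

static HHOOK g_mouseHook = nullptr;

// Placeholder: x-coordinate (in virtual-desktop coordinates) where the
// second monitor starts. Query the real layout instead of hard-coding it.
const LONG SECOND_SCREEN_LEFT_EDGE = 1920;

LRESULT CALLBACK MouseProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION)
    {
        const MSLLHOOKSTRUCT* info = reinterpret_cast<MSLLHOOKSTRUCT*>(lParam);
        if (info->pt.x >= SECOND_SCREEN_LEFT_EDGE)
        {
            // Event is on the second screen: forward it to your second
            // window here (e.g. via PostMessage) and swallow the original.
            // ForwardToSecondWindow(wParam, info); // hypothetical helper
            return 1; // non-zero blocks normal processing of the event
        }
        // Otherwise fall through: the first screen handles it normally.
    }
    return CallNextHookEx(g_mouseHook, nCode, wParam, lParam);
}

int main()
{
    g_mouseHook = SetWindowsHookEx(WH_MOUSE_LL, MouseProc,
                                   GetModuleHandle(nullptr), 0);

    // A low-level hook needs a message loop on the installing thread.
    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWindowsHookEx(g_mouseHook);
    return 0;
}
```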
If the screen doesn't behave like a mouse, I guess you'll have to find another way to hook it.
I don't know whether this would work. It's just a conceptual idea.
Edit: Of course, you need to identify which application should receive the messages on the first screen. I guess the Z-order can help, unless the user opens an always-on-top application. It may be better to track focus messages (or something similar) to keep track of which application has focus (excluding your other app); see the sketch below.
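One way to do that focus tracking, again just a sketch, is `SetWinEventHook` with `EVENT_SYSTEM_FOREGROUND`, which reports foreground-window changes; with `WINEVENT_OUTOFCONTEXT` it needs no DLL injection. Here `g_ourWindow` is an assumed handle to your own app's window, which we skip:

```cpp
#include <windows.h>

static HWND g_lastForeground = nullptr; // window that should get forwarded input
static HWND g_ourWindow = nullptr;      // assumed: set to your own app's window

void CALLBACK ForegroundChanged(HWINEVENTHOOK, DWORD event, HWND hwnd,
                                LONG, LONG, DWORD, DWORD)
{
    // Remember the newly focused top-level window, ignoring our own app.
    if (event == EVENT_SYSTEM_FOREGROUND && hwnd != g_ourWindow)
        g_lastForeground = hwnd;
}

int main()
{
    HWINEVENTHOOK hook = SetWinEventHook(
        EVENT_SYSTEM_FOREGROUND, EVENT_SYSTEM_FOREGROUND,
        nullptr, ForegroundChanged,
        0, 0,                   // all processes, all threads
        WINEVENT_OUTOFCONTEXT); // callback runs in our process, no DLL

    MSG msg;
    while (GetMessage(&msg, nullptr, 0, 0) > 0)
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    UnhookWinEvent(hook);
    return 0;
}
```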
(Comments explaining any deficiencies with this method are very welcome!)