views: 41 · answers: 2
I am developing a WPF kiosk-like client that will be deployed on an HP TouchSmart Windows 7 machine with multi-touch enabled. My problem is that with Windows 7 multi-touch, the application does not recognize a finger 'tap' as a button press event, so the button-press trigger that changes the color is never fired.

The Windows 7 touch animation usually displays, and the button Click event fires fine. It is only the XAML-defined style trigger for 'IsPressed' that does not function as intended on a finger tap. It does eventually work if enough pressure is applied with a finger and/or you roll or press as you would for a fingerprint. Is there a workaround to make a 'tap' fire a press/click event?

<Trigger Property="AreAnyTouchesOver" Value="true">
    <Trigger.ExitActions>
        <BeginStoryboard Storyboard="{StaticResource PressedOff}" />
    </Trigger.ExitActions>
    <Trigger.EnterActions>
        <BeginStoryboard Storyboard="{StaticResource PressedOn}" />
    </Trigger.EnterActions>
</Trigger>

<Trigger Property="AreAnyTouchesDirectlyOver" Value="true">
    <Trigger.ExitActions>
        <BeginStoryboard Storyboard="{StaticResource PressedOff}" />
    </Trigger.ExitActions>
    <Trigger.EnterActions>
        <BeginStoryboard Storyboard="{StaticResource PressedOn}" />
    </Trigger.EnterActions>
</Trigger>
A: 

I am not clear on what you want to do.

Some explanation on your issue:

With touch, IsPressed is only triggered when your contact is held down over the element and the movement threshold is passed to create a "press" (that is why you see it if you press your finger down and slide it slightly).

As soon as the contact comes up, IsPressed = false.

When you "tap" a button you are, in essence, doing a mouse click. This is why the Click event fires as expected.

So what exactly are you trying to do? Do you want to fire a storyboard animation when a button is held down (IsPressed), or when a button is clicked (tapped)?

These are two different situations. Also note that desktop metaphors often do not translate well to the touch environment; you may want to rethink how you are giving visual feedback to the user.

I think the easiest answer for what you want is to trigger on "TouchDown", if I understand your question correctly.

UPDATE:

This should work for a TouchDown trigger:

<Button Content="Touch Down Example" Height="20" Width="20">
    <Button.Triggers>
        <EventTrigger RoutedEvent="Button.TouchDown">
            <BeginStoryboard>
                ....
            </BeginStoryboard>
        </EventTrigger>
    </Button.Triggers>
</Button>

NOTE: I don't have a touch screen in front of me, so there could be errors, but it looks correct.
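Filling in the elided storyboard, a sketch that wires both TouchDown and TouchUp back to the PressedOn/PressedOff storyboards from the question might look like the following (untested; it assumes those storyboards are defined as resources in scope, and the button size is arbitrary):

```xml
<!-- Sketch only: assumes PressedOn/PressedOff are defined as resources,
     as in the question's triggers. The EventTriggers fire the pressed
     visual on first contact and revert it when the finger lifts. -->
<Button Content="Touch Down Example" Height="20" Width="20">
    <Button.Triggers>
        <EventTrigger RoutedEvent="Button.TouchDown">
            <BeginStoryboard Storyboard="{StaticResource PressedOn}" />
        </EventTrigger>
        <EventTrigger RoutedEvent="Button.TouchUp">
            <BeginStoryboard Storyboard="{StaticResource PressedOff}" />
        </EventTrigger>
    </Button.Triggers>
</Button>
```

Unlike the property triggers on AreAnyTouchesOver, these event triggers do not depend on IsPressed ever becoming true, so they should fire on a quick tap as well as a sustained press.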

Foovanadil
Sorry for being unclear - I'm having difficulty communicating what I want. You are correct, I want a trigger on 'TouchDown'; how do I do this? Some background: the kiosk client is being developed on a standard Gvision touch screen (no multi-touch), so touch/press/click all work the same. The app is being deployed to a multi-touch-enabled PC, hence my communication issues. Hope that helps.
aurealus
Updated my answer to include example
Foovanadil
Tried your solution, I'm still not seeing the result I expect. I'm wondering if Windows 7 is interpreting the quick tap as an accidental touch. Also tried suggestion from http://connect.microsoft.com/VisualStudio/feedback/details/527886/touchdown-does-not-trigger-until-the-touch-moves
aurealus
With no success
aurealus
Does the device you are testing support the Windows 7 Touch API correctly? If it does, it should be triggering the TouchDown event on the button correctly. It is highly likely that your hardware is not Win7 compliant (I have seen many vendors claim they are but not actually support the API correctly).
Foovanadil
That is a possibility.
aurealus
+2  A: 

For touch development, I would recommend using the Microsoft® Surface® Toolkit for Windows Touch Beta in your project and using the Surface controls it includes, since they handle touch events properly, along with a multitude of use cases, such as a user pressing a button with two fingers and lifting them one at a time; only the last "lift" (TouchUp event) should trigger a Click event.

All of the surface controls are subclasses of the Windows controls, so SurfaceListBox subclasses ListBox, SurfaceButton subclasses Button, etc. This way, all your styles and templates will still work with these controls.

In your case, you will want to use the SurfaceButton control. Simply add a reference to the toolkit assemblies and, in your outer element (Window, UserControl, etc.), add a reference to the Surface namespace, such as

xmlns:s="http://schemas.microsoft.com/surface/2008"

and then add the surface controls from that namespace, such as

<s:SurfaceButton x:Name="button1" Click="OnButtonClick" />
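Putting the two snippets together, a minimal sketch might look like this (untested; the class name and click handler are placeholders):

```xml
<!-- Sketch only: Kiosk.KioskWindow and OnButtonClick are placeholder
     names; replace them with your own window class and handler. -->
<Window x:Class="Kiosk.KioskWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:s="http://schemas.microsoft.com/surface/2008">
    <!-- SurfaceButton derives from Button, so existing Button styles
         and templates still apply to it. -->
    <s:SurfaceButton x:Name="button1"
                     Content="Touch Me"
                     Click="OnButtonClick" />
</Window>
```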
Hugo
Agree completely. However, if the problem the author is seeing is due to the touch screen vendor not properly supporting the Windows 7 Touch API (as it seems might be the case), then the Surface Toolkit won't help much there. Regardless, if beta software is an option for this project, then definitely use the Surface Toolkit.
Foovanadil
@Foovanadil - missed this comment when you first posted it, sorry. The reason I mentioned it explicitly is that I am presently working with a device in the HP Touchsmart product line, and this was the cause of my problem as well. Unhandled touch events *should* get promoted to mouse events, but in reality, a number of things could be swallowing those events without further notice.
Hugo
Thanks Hugo, I will try out the Surface API when I have a chance. For now we went back to a non-multi-touch touchscreen.
aurealus