I have a naive question about input events: to what extent is the Polymer
input events library
<http://www.polymer-project.org/docs/polymer/touch.html> meant to obviate
gesture disambiguation in script?

Specifically, if I have an element on which I'm interested in both taps and
swipes, I can listen for the tap event and the trackstart/track/trackend
events with Polymer's gesture event library. But the tap gesture fires on
touch up regardless of whether the user tracked in the meantime.

It isn't too painful to stop listening for other events when a track
begins, but in an ideal world we'd lean on the browser to disambiguate all
of the gestures that the gesture library exposes. The browser has to do
this anyway for its own UI (like long press and scrolling), and it would
give components much neater encapsulation of behavior: the various gesture
handlers wouldn't need to know about one another.
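For concreteness, the workaround I have in mind is roughly the following
plain-JS sketch (assuming Polymer-style tap/trackstart/track/trackend
events; `makeGestureHandler`, `onTap`, and `onSwipe` are hypothetical names
of mine, not part of the library):

```javascript
// Minimal sketch of manual tap/track disambiguation. The handler
// object's methods would be wired up with, e.g.,
// element.addEventListener('trackstart', handler.trackstart), etc.
function makeGestureHandler(onTap, onSwipe) {
  let tracking = false;
  return {
    trackstart() { tracking = true; },
    track(e)    { /* could accumulate e.dx / e.dy here */ },
    trackend(e) { onSwipe(e); },
    // 'tap' fires on touch up even after a track, so swallow it
    // manually when a track just happened.
    tap(e) {
      if (tracking) { tracking = false; return; }
      onTap(e);
    },
  };
}
```

The annoyance is that every component ends up carrying this little state
machine around, rather than the library (or browser) delivering only the
winning gesture.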

Do we have a path to making that happen?

--- 
You received this message because you are subscribed to the Google Groups 
"Polymer" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/polymer-dev/CAK-G-KWXBKor5hZ6%3DV0_kEvOep6s57miTYjqsPLN%3Dw0swJLZBg%40mail.gmail.com.