The ASCII sketch looked fine on my screen before I sent the mail :( I
hope the idea is clear from the text
(now, in the reply dialog, it also looks good)
Assaf
On 11/11/2013 12:51 PM, Assaf Yavnai wrote:
Hi Guys,
I hope that I'm right about this, but it seems that touch events in
glass are translated (and reported) as single-point events (x & y)
without an area, like pointer events.
AFAIK, the controls respond to touch events the same way as to mouse
events (using the same pickers), and as a result a button press, for
example, will only be triggered if the x & y of the touch event fall
within the control's area.
This means that small controls, or even quite large controls (like
buttons with text), will often be missed because of the 'strict' node
picking. From a UX point of view this is strange, as the user clearly
pressed on a node (the finger was clearly above it), but nothing
happens...
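To make the idea concrete, here is a minimal sketch (plain Java, not
existing glass/Prism code) of what an area-aware hit test could look
like, assuming the platform could report a touch radius in pixels;
the class and method names are made up for illustration:

import javafx.geometry.Point2D;
import javafx.scene.Node;

public final class FuzzyPickSketch {
    // Treat the finger as a square of side 2*touchRadius centered on
    // the reported point, and test that square against the node's
    // bounds instead of testing the single reported point alone.
    public static boolean pickWithArea(Node node,
                                       double sceneX, double sceneY,
                                       double touchRadius) {
        Point2D local = node.sceneToLocal(sceneX, sceneY);
        return node.intersects(local.getX() - touchRadius,
                               local.getY() - touchRadius,
                               2 * touchRadius,
                               2 * touchRadius);
    }
}

With a test like this, the tap in the sketch below would still reach
the button, even though the reported '+' falls outside its bounds.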
With the current implementation it's hard to use small features in
controls, like scrollbars in lists, and it is almost impossible to
implement something like a 'screen navigator' (the series of small
dots at the bottom of a smartphone screen which lets you jump
directly to a 'far away' screen).
To illustrate, consider the low-resolution sketch below, where the
"+" is the actual x,y reported, the ellipse is the finger's touch
area, and the rectangle is the node. With the current implementation
this type of tap will not trigger the node's handlers:
         ____
        /    \
       /  +   \
   ___/        \___      in this scenario the 'button' will not
  |   \        /   |     get pressed
  |____\______/____|
        \____/
If your smartphone supports it, turn on the touch debugging option in
settings and see that each point translates to a quite large circle,
and whatever falls within it, or reasonably close to it, gets picked.
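That 'reasonably close' behavior could also be sketched: if an
area-aware pass (like the pickWithArea sketch above) yields several
candidate nodes, one simple tie-breaker is to pick the node whose
center is nearest to the reported point. Again, this is only an
illustration, not existing FX code:

import java.util.List;
import javafx.geometry.Bounds;
import javafx.geometry.Point2D;
import javafx.scene.Node;

public final class NearestCandidateSketch {
    public static Node nearest(List<Node> candidates,
                               double sceneX, double sceneY) {
        Node best = null;
        double bestDist = Double.MAX_VALUE;
        for (Node n : candidates) {
            // Node center in scene coordinates.
            Bounds b = n.localToScene(n.getBoundsInLocal());
            Point2D center = new Point2D((b.getMinX() + b.getMaxX()) / 2,
                                         (b.getMinY() + b.getMaxY()) / 2);
            double d = center.distance(sceneX, sceneY);
            if (d < bestDist) {
                bestDist = d;
                best = n;
            }
        }
        return best;
    }
}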
I want to start a discussion to understand whether my perspective is
accurate and what can be done, if anything, for the coming release or
the next one.
We might use the recently opened RT-34136
<https://javafx-jira.kenai.com/browse/RT-34136> to track this, or
open a new JIRA for it.
Thanks,
Assaf