Hi,

I believe my conceptual question about touch/mouse events was missed among the 
other questions in the "JAVAFX on ANDROID" thread, so I would like to start a 
new discussion about touch events.


1. The main question is: how are touch events and internal mouse events handled? 
JavaFX controls seem to rely on mouse events, which is why I assume there must 
be some kind of emulation layer. Are the mouse events emulated in Prism, in 
Glass (the Java Glass classes), or even lower? Where is the mouse emulation 
supposed to happen?

What I've seen so far is that the native iOS Glass does the mouse emulation 
itself in GlassViewDelegate.m, so both touch and mouse events are sent from the 
lowest layer. On Android, only touch events are passed to the Lens 
implementation. The udev implementation, which I assume is the one used for the 
DukePad, also passes only touch events. Since udev and Android are both Lens 
implementations, they share the same Java classes, which do a kind of mouse 
emulation for touch events. But it's not exactly the same as what the iOS code 
does.

iOS:
sends Touch, Mouse-Enter and Mouse-Down

Lens (Android/DukePad):
sends Mouse-Enter and Touch


The difference in event order and the missing Mouse-Down lead me to the 
assumption that some events are actually missing.
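
To make clearer what I mean by "mouse emulation", here is a minimal sketch of 
the idea as I understand it. This is not the actual Glass/Lens code; the class, 
method and state names are made up for illustration. The point is only that 
some Java-level shim has to synthesize mouse events (Enter, Down, Drag, Up) 
from the raw touch events before the controls ever see them:

    // Hypothetical sketch only; names do not correspond to the real Glass/Lens API.
    final class TouchToMouseSynthesizer {

        static final int TOUCH_PRESSED  = 0;
        static final int TOUCH_MOVED    = 1;
        static final int TOUCH_RELEASED = 2;

        interface MouseSink {
            void mouseEnter(int x, int y);
            void mouseDown(int x, int y);
            void mouseDrag(int x, int y);
            void mouseUp(int x, int y);
        }

        private final MouseSink sink;
        private boolean pressed;

        TouchToMouseSynthesizer(MouseSink sink) {
            this.sink = sink;
        }

        /** Called by the native input layer for the primary touch point. */
        void onTouch(int state, int x, int y) {
            switch (state) {
                case TOUCH_PRESSED:
                    // The ordering I would expect: Enter first, then Down.
                    sink.mouseEnter(x, y);
                    sink.mouseDown(x, y);
                    pressed = true;
                    break;
                case TOUCH_MOVED:
                    if (pressed) {
                        sink.mouseDrag(x, y);
                    }
                    break;
                case TOUCH_RELEASED:
                    if (pressed) {
                        sink.mouseUp(x, y);
                        pressed = false;
                    }
                    break;
            }
        }
    }

If the Lens Java classes do something along these lines but skip the Mouse-Down 
on press, that would explain the difference I listed above.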



2. Is that mouse emulation supposed to be eliminated by the latest lensWindow 
changes?
  I believe it should be handled in higher layers, not in the input layer itself.


3. What is the input layer for the DukePad? I think it's the udev 
implementation, which does pretty much the same as the current Android 
implementation. I just want a "stable" reference to look at ;)


4. Has anyone with a DukePad had the opportunity to test the ListView example? 
For me, on Android, it doesn't scroll at all in response to touches.
With the automatic scrolling (from Richard's sources) I get around 30 fps on a 
Samsung Galaxy Tab 10.1.
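
In case it helps to reproduce, this is roughly the minimal ListView test I run 
(my own sketch, not Richard's code; there is nothing Android-specific in it):

    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.control.ListView;
    import javafx.stage.Stage;

    public class ListViewTouchTest extends Application {

        @Override
        public void start(Stage stage) {
            ListView<String> list = new ListView<>();
            // Enough items to force scrolling.
            for (int i = 0; i < 1000; i++) {
                list.getItems().add("Item " + i);
            }
            stage.setScene(new Scene(list, 600, 800));
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }

On Android this renders for me, but it does not react to touch scrolling at 
all, as described above.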



regards
Matthias
