On Tue, 22 Jul 2008 16:08:55 +0200 Kalle Happonen <[EMAIL PROTECTED]> babbled:
> > what gesture, where? how? how will this be able to not conflict with
> > operation of other apps? i am not so hot on gestures - especially ones that
> > use up the "whole screen" or parts of the screen where apps run - as now
> > gestures fight for usability with apps themselves. there is no
> > coordination. example:
> >
> > if the gesture was "slide up the screen from bottom to top" - how is this
> > gesture different from me dragging my finger to scroll a list in the
> > application on my screen? how do i make sure only ONE of these happens (the
> > keyboard pops up OR the scroll happens) and not both?
>
> I'm not sure, but I think he meant gesture as in accelerometer. Double
> tap the phone for instance, or tap it on the bottom and it slides up,
> and tap it on the top and it slides down... or... hmm

accelerometer - not going there right now. i have never played with them, but
i do see a good use for them in something like this: twitch the phone up to
display the keyboard, twitch it down to hide it. but until i have
accelerometers firmly in hand, i'm sticking to the screen and buttons as
input. not saying "no" - just saying "not there yet". i need to work out
whether E should listen to them as another kind of input device and generate
internal events, or whether a daemon should watch them and send commands (or
emit keystrokes) based on gestures. lots of things to solve there before using
accelerometers for this.
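to give an idea of what i mean by "a daemon" - a tiny process that watches the
accelerometer and turns big jerks into higher-level events. very rough sketch
only: it assumes the accelerometer shows up as a plain linux evdev node, and
the device path (/dev/input/event3), the axis (ABS_Z), the thresholds and the
printf placeholders are all made up - i have not checked what the freerunner
kernel actually exposes.

  /* rough sketch - watch an evdev accelerometer node and report twitches.
   * device path, axis and thresholds are guesses, not verified. */
  #include <stdio.h>
  #include <fcntl.h>
  #include <unistd.h>
  #include <linux/input.h>

  #define ACCEL_DEV   "/dev/input/event3" /* hypothetical accelerometer node */
  #define TWITCH_UP    800  /* raw delta threshold for an upward jerk - tune me */
  #define TWITCH_DOWN -800  /* raw delta threshold for a downward jerk */

  int
  main(void)
  {
     struct input_event ev;
     int fd, prev = 0, delta;

     fd = open(ACCEL_DEV, O_RDONLY);
     if (fd < 0) { perror("open"); return 1; }

     while (read(fd, &ev, sizeof(ev)) == sizeof(ev))
       {
          if ((ev.type != EV_ABS) || (ev.code != ABS_Z)) continue;
          /* look at the change between samples, not the absolute value,
           * so gravity and holding angle don't matter (first sample may
           * give a bogus delta - it's a sketch) */
          delta = ev.value - prev;
          prev = ev.value;
          if (delta > TWITCH_UP)
            /* placeholder - a real daemon would message E or fake a
             * keypress here instead of printing */
            printf("twitch up -> show keyboard\n");
          else if (delta < TWITCH_DOWN)
            printf("twitch down -> hide keyboard\n");
       }
     close(fd);
     return 0;
  }

the nice bit of doing it in a daemon like this is that E (or any app) would
only ever see a clean "show/hide keyboard" command and never has to care
about raw sensor noise - but that's exactly the design question above.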
--
Carsten Haitzler (The Rasterman) <[EMAIL PROTECTED]>