Hi Ariya,

That is the problem: how can I "separate" the touch events implemented by web sites (e.g. pinching the map on Google Maps, or swiping through result pages of a Google Images search) from the "gestures for user interaction" (e.g. changing the zoom factor or scrolling the page)?
It seems that I should first give the "page" a chance to deal with the touch events, and only if the page does not "need" them let them drive the whole view. Every "sample" I have seen starts by intercepting all user mouse/touch/gesture events and forwards only the mouse clicks to WebKit. (A rough sketch of what I mean is at the end of this message.)

P.S. I really enjoy reading your blog and have come across your work many times. Thanks!

Felipe

On Sun, Oct 16, 2011 at 11:15 PM, Ariya Hidayat <[email protected]> wrote:

> On Sun, Oct 16, 2011 at 6:40 PM, Felipe Crochik <[email protected]>
> wrote:
> > I can't seem to find a definitive answer on whether Qt WebKit can or can't
> > support gestures (pinch, swipe, ...)
>
> Do you mean the gestures for user interactions? For example, pinch can
> be used to zoom in and out, swipe to flick the view, etc. In that
> case, those gestures are application-specific gestures and should be
> handled at the application level, e.g. the browser which uses
> QtWebKit.
>
> If what you mean is touch events (http://www.w3.org/TR/touch-events/),
> then see http://trac.webkit.org/wiki/QtWebKitFeatures21.
>
>
> --
> Ariya Hidayat, http://ariya.ofilabs.com
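For reference, here is a minimal sketch of the "page first, view second" idea, written against Qt 4.x / QtWebKit 2.x style APIs. The class name GestureAwareWebView is made up, and the key assumption -- that QWebPage::event() leaves a touch event marked as accepted when the page's JavaScript handles it with preventDefault() -- may depend on the QtWebKit version, so it needs verifying:

// Sketch only: the acceptance semantics assumed for QWebPage::event()
// are not confirmed and may vary between QtWebKit versions.
#include <QWebView>
#include <QEvent>
#include <QTouchEvent>
#include <QGestureEvent>
#include <QPinchGesture>

class GestureAwareWebView : public QWebView   // hypothetical name
{
public:
    explicit GestureAwareWebView(QWidget *parent = 0)
        : QWebView(parent)
    {
        // Receive raw touch points and high-level pinch gestures.
        setAttribute(Qt::WA_AcceptTouchEvents);
        grabGesture(Qt::PinchGesture);
    }

protected:
    bool event(QEvent *e)
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd:
            // Give the page the first chance: QWebPage dispatches W3C touch
            // events to the content. Assumption: if a handler calls
            // preventDefault(), the event comes back accepted.
            page()->event(e);
            if (e->isAccepted())
                return true;
            // The page did not want it; let the default handling (and the
            // gesture framework) see it.
            return QWebView::event(e);

        case QEvent::Gesture: {
            // Only reached when the page did not consume the touch points,
            // so the pinch can safely drive the application-level zoom.
            QGestureEvent *ge = static_cast<QGestureEvent *>(e);
            if (QGesture *g = ge->gesture(Qt::PinchGesture)) {
                QPinchGesture *pinch = static_cast<QPinchGesture *>(g);
                // scaleFactor() vs. totalScaleFactor() depends on how the
                // zoom should accumulate; this uses the incremental value.
                setZoomFactor(zoomFactor() * pinch->scaleFactor());
                ge->accept(pinch);
                return true;
            }
            return QWebView::event(e);
        }

        default:
            return QWebView::event(e);
        }
    }
};

The intent is exactly the "page first" idea above: forward the raw touch events to the QWebPage, and only treat them as an application gesture (here, pinch-to-zoom via setZoomFactor()) when the page leaves them unaccepted.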
