On Thu, Oct 3, 2013 at 7:58 AM, Carsten Haitzler <ras...@rasterman.com> wrote:
> On Wed, 2 Oct 2013 22:28:54 -0700 Jason Gerecke <killert...@gmail.com> said:
>
>> On Wed, Oct 2, 2013 at 8:21 PM, Carsten Haitzler <ras...@rasterman.com> 
>> wrote:
>> > On Wed, 2 Oct 2013 13:59:47 -0700 Jason Gerecke <killert...@gmail.com> 
>> > said:
>> >
>> > 3 things here.
>> >
>> > 1. for general device queries (get name, description, device classes etc.)
>> > there is already an evas_device api. right now though nothing populates the
>> > evas device information from the lower levels (xi/xi2 etc.), so it's
>> > unused. but it's there.
>> Would clients be expected to call this just once prior to receiving
>> input, or is there some way to notify them that a change has occurred?
>
> the query of the devices and building of their nodes and tree should be in
> ecore-evas. EVAS_CALLBACK_DEVICE_CHANGED is already there for this - you can
> set a callback on the evas itself to listen for it. the idea is that if you
> get this callback - re-query the device tree and see. :)
>
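For reference, roughly what that client-side listening could look like
with the current evas_device calls - a sketch only, and since nothing
populates the tree yet, the list may well come back empty:

  #include <stdio.h>
  #include <Evas.h>

  /* re-walk the device tree whenever the canvas reports a change */
  static void
  _device_changed_cb(void *data, Evas *e, void *event_info)
  {
     const Eina_List *l;
     Evas_Device *dev;

     /* NULL asks for the top-level device list */
     EINA_LIST_FOREACH(evas_device_list(e, NULL), l, dev)
       printf("device: %s (class %d)\n",
              evas_device_name_get(dev),
              (int)evas_device_class_get(dev));
  }

  /* and once, after setting up the canvas: */
  static void
  _devices_watch(Evas *canvas)
  {
     evas_event_callback_add(canvas, EVAS_CALLBACK_DEVICE_CHANGED,
                             _device_changed_cb, NULL);
  }
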
>> I ask because the type information (e.g. pen tip vs. eraser or
>> airbrush vs. inking pen) is likely to change fairly regularly. We
>> would want to be sure that clients can get their hands on that,
>> whether it means notifying that they need to refresh their
>> understanding or sending the type information in each event.
>
> well, device add/del is really more about plug and unplug. here i'd expect
> there to be a single device representing the pen, and pen vs eraser vs brush
> i'd expect to be a mode field on that pen (like i mention below about the
> eraser)
>
>> > 2. i've talked with some people about this and the general take was that we
>> > need to add new pen events (a la multi) because they need to handle more
>> > than multi: e.g.:
>> >
>> >   * button number on pen pressed/released SEPARATELY from pen touching.
>> >   * pen touch vs eraser touch (ie indicate which "end" of N ends a pen
>> > presses down).
>> >   * some pens support a hover ability - so that means motion events without
>> > down/up begin/end points like multi, BUT we would need/want to report
>> > distance as a value during this hover
>> >   * possibly other custom inputs on the pens themselves that are not
>> > accounted for.
>> >
>> > we COULD extend multi events and add fields, but i think you are mistaking
>> > habitual over-engineering in efl for intent for these to be pen events. a
>> > lot of stuff gets extended beyond its initial scope "in case". e.g. in case
>> > our touch surface can report size, pressure and angle of your finger... :)
>> > also it'd cause issues with existing multi event usage.
>> >
>> As I said, the multi event is interesting for its extra fields, but
>> might not be a good semantic match. A pen isn't anything more than a
>> fancy mouse: it has motion without a touch event and a set of buttons
>> to worry about. The only thing that makes it special is all those
>> extra axes, and most of those could be imagined on a
>> sufficiently-advanced mouse. Wacom actually makes such a "mouse" for
>> their Intuos tablets which reports things like height above the pad,
>> rotation about the z-axis, and the absolute position of a spring-loaded
>> fingerwheel.
>>
>> The way most APIs handle pen input is to just pass the data alongside
>> the X/Y position you'd expect for a mouse. Throw in an enum or
>> function to let clients distinguish pen tip from eraser and you're
>> set.
>
> i'm thinking similarly - have down/up and move events, with a bunch of extra
> fields. only question is.. what set of fields pretty much covers every aspect
> of a possible pen... :)
>
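To make that question concrete, here is a strawman of the fields - every
name below (Evas_Event_Pen_Move, Evas_Pen_Tool and friends) is
hypothetical, just something to poke holes in:

  /* which "end" or mode of the pen is active */
  typedef enum
  {
     EVAS_PEN_TOOL_TIP,
     EVAS_PEN_TOOL_ERASER,
     EVAS_PEN_TOOL_BRUSH
  } Evas_Pen_Tool;

  typedef struct
  {
     int           device;         /* which pen, for multi-pen tablets */
     Evas_Pen_Tool tool;           /* tip vs eraser vs brush */
     Evas_Coord    x, y;           /* canvas position */
     double        pressure;       /* 0.0 (hovering) .. 1.0 (full press) */
     double        distance;       /* height above the pad while hovering */
     double        tilt_x, tilt_y; /* pen tilt from vertical */
     double        rotation;       /* rotation about the z axis */
     unsigned int  buttons;        /* barrel button state as a bitmask */
     unsigned int  timestamp;
     void         *data;
  } Evas_Event_Pen_Move;

Down/up variants would additionally carry the button number, mirroring
the mouse events.
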
>> > 3. if you want to talk about the extra buttons and whatnot that you find on
>> > pen tablets (i have a bamboo sitting on my desk at home - i know, a cheapo
>> > little pen tablet, but it's indicative at a small scale of a lot of them), i
>> > believe these should just be "keys" like any keyboard. i don't think these
>> > belong in any specialized event system. as with #1 we CAN attach a special
>> > device handle to them though so you can differentiate where the key comes
>> > from... :)
>> >
>> Most implementations send these as "buttons" rather than "keys". In
>> the case of evdev, the Intuos and Cintiq tablets will send the
>> "meaningless" BTN_{0,1,2,3,...} buttons. For a Bamboo though, you'll
>> find the mouse buttons BTN_{LEFT,RIGHT,FORWARD,BACK} instead. In the X
>> driver these are all mapped to mouse buttons that clients can easily
>> understand. The first few mouse buttons usually have attached
>> semantics (e.g. button 1 => left click => primary action) which makes
>> them non-ideal, but "keys" don't fare much better since you'd need
>> clients to properly understand a non-standard keyboard layout. These
>> buttons are always a tricky issue.
>
> sure, but i see keys as better since they at least have keysyms with
> enumerations... if the intent is to emulate a mouse - then buttons sound
> right... anyway... if buttons, then it's kind of weird to expose a button
> press from the tablet base, as this press event carries x/y info but
> actually has nothing to do with location... :)
>
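For what it's worth, this is roughly what those buttons look like at the
raw evdev level - the device node path is just an example:

  #include <fcntl.h>
  #include <stdio.h>
  #include <unistd.h>
  #include <linux/input.h>

  int
  main(void)
  {
     struct input_event ev;
     int fd = open("/dev/input/event5", O_RDONLY); /* example node */

     if (fd < 0) return 1;
     while (read(fd, &ev, sizeof(ev)) == sizeof(ev))
       {
          if (ev.type != EV_KEY) continue;
          /* intuos/cintiq pads report "meaningless" BTN_0.. codes */
          if (ev.code >= BTN_0 && ev.code <= BTN_9)
            printf("pad button %d %s\n", ev.code - BTN_0,
                   ev.value ? "pressed" : "released");
          /* a bamboo reports mouse-style codes instead */
          else if (ev.code == BTN_LEFT || ev.code == BTN_RIGHT)
            printf("mouse-style button %s\n",
                   ev.value ? "pressed" : "released");
       }
     close(fd);
     return 0;
  }
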
>> > adding a new event type isn't too hard in evas - you add the structures,
>> > enums (at the end of the current list of callbacks), add the feed apis to
>> > feed the event in, add the appropriate routing (the same we do for multi
>> > and mouse), and then that's done.
>> >
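The feed side for pen events might then look something like this,
modeled on the existing evas_event_feed_multi_*() family - names and
parameter lists are a sketch, not actual api (Evas_Pen_Tool being the
hypothetical enum from the strawman above):

  EAPI void evas_event_feed_pen_down(Evas *e, int device,
                                     Evas_Pen_Tool tool, int x, int y,
                                     double pressure, double tilt_x,
                                     double tilt_y,
                                     unsigned int timestamp,
                                     const void *data);
  EAPI void evas_event_feed_pen_move(Evas *e, int device,
                                     Evas_Pen_Tool tool, int x, int y,
                                     double pressure, double distance,
                                     double tilt_x, double tilt_y,
                                     unsigned int timestamp,
                                     const void *data);
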
>> > then there is edje - likely you want to expose the pen events as signals
>> > like mouse. maybe not? optional really.
>> >
>> > and then the important bit - ecore-evas + ecore-input + ecore-input-evas
>> > needs to gather the events from the next layer down and call the feed
>> > calls. the layer below could be evdev/console input devices (fb), x11
>> > (xi/xi2 etc.), or wl.. or win32.. or anything... and then for these
>> > ecore-x, ecore-fb, ecore-wl etc. need to gather the appropriate events
>> > from the display system/event input system below that. one thing missing in
>> > ecore-evas is what i describe in #1 - querying all available devices,
>> > monitoring for new devices being plugged/unplugged and appropriately
>> > populating/managing the evas_device tree per canvas. :) it's something we
>> > need to get around to doing some time... :)
>> >
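That missing piece might boil down to something like this on the
ecore-evas side, using the existing evas_device calls (a sketch - the
actual discovery of the device node is elided, and whether
EVAS_DEVICE_CLASS_PEN is the right class value is an assumption):

  /* register a newly plugged tablet pen in the canvas' device tree so
   * apps can query it; removal would be the evas_device_del() mirror */
  static void
  _pen_device_register(Evas *canvas, const char *name)
  {
     Evas_Device *dev = evas_device_add(canvas);

     evas_device_name_set(dev, name); /* e.g. "Wacom Bamboo Pen" */
     evas_device_class_set(dev, EVAS_DEVICE_CLASS_PEN);
     /* adding/removing a device should end up firing
      * EVAS_CALLBACK_DEVICE_CHANGED on the canvas */
  }
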
>> Sounds like there'll be a bit of code spelunking in my future. I'll be
>> happy once I wrap my head around how each of these components (edje,
>> ecore, evas, etc.) fit together.
>
> indeed. i hope this helped a bit. keep this dialog going and ask questions,
> swap initial struct definitions or ideas etc. :)

Put a breakpoint on _mouse_down_cb in that evas_multi_touch example,
and you will find that it is delivered by the
ecore_event_evas_mouse_button_down function, which is a handler for an
Ecore event called ECORE_EVENT_MOUSE_BUTTON_DOWN. That one is
delivered by the underlying input system for that engine. Find out
which part of the code delivers that event and then you can trace
where it started with a simple gdb backtrace.

I suggest using eo_backtrace for debugging, to keep you sane amid all
the eo-internal stuff that shows up in the middle of the backtraces.
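
Something like this, assuming the example binary's name:

  $ gdb ./evas_multi_touch
  (gdb) break _mouse_down_cb
  (gdb) run
  (gdb) bt    # or eo_backtrace, with the efl gdb helpers loaded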

I hope that helps,
Rafael
