2009/3/21 John M McIntosh <[email protected]>:
>
> On 21-Mar-09, at 3:09 AM, Igor Stasenko wrote:
>
>> Right, but here we're talking about doing such conversion much
>> earlier (at the event source object),
>> so the event sensor already deals with first-class event objects.
>> I want to know whether the scheme I described in the first post is
>> plausible.
>
> For the iPhone VM I return a complex event type, which then points to
> Smalltalk objects that represent the touch events. For location and
> acceleration data I return the actual Objective-C objects.
> This data is then processed by EventSensor.
>
> If you choose to push the responsibility for creating event objects to
> the VM, then you need to be cognizant of the fact that whatever is
> proposed has to change very little over time; otherwise you end up with
> the issue of image-versus-VM compatibility, and the fact that VM
> version changes proceed at a slow rate.
>
Nope. I don't want the VM to deal with real event objects.
The VM will still use the old event buffers to deliver events to the image.
But once the image receives one, it should convert it to an event
instance as close to the source as possible.
That is the role of the EventSource class: represent the VM as an event
source which produces instances of the KernelXXXEvent classes, hiding
the details of converting raw event buffers from the higher layers that
then handle the event (EventSensor/Morphic etc.).
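
Roughly, the idea looks like this (a minimal sketch only: EventSource,
the KernelXXXEvent classes and their fromBuffer: selector are
illustrative names, not existing classes; only the 8-word buffer,
primGetNextEvent: and the EventType* constants mirror what EventSensor
already uses today):

```smalltalk
EventSource >> nextEventObject
	"Poll the VM for the next raw event buffer and answer a
	 first-class event object, or nil if nothing is pending."
	| buf |
	buf := Array new: 8.
	self primGetNextEvent: buf.
	^ self eventFromBuffer: buf

EventSource >> eventFromBuffer: buf
	"Hide the raw-buffer details here: dispatch on the type code
	 and let the matching event class decode the remaining words."
	(buf at: 1) = EventTypeMouse
		ifTrue: [ ^ KernelMouseEvent fromBuffer: buf ].
	(buf at: 1) = EventTypeKeyboard
		ifTrue: [ ^ KernelKeyboardEvent fromBuffer: buf ].
	^ nil	"unknown or empty event"
```

That way EventSensor (and Morphic above it) only ever sees event
objects and never has to know the buffer layout.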


>
> --
> ===========================================================================
> John M. McIntosh <[email protected]>
> Corporate Smalltalk Consulting Ltd.  http://www.smalltalkconsulting.com
> ===========================================================================



-- 
Best regards,
Igor Stasenko AKA sig.

_______________________________________________
Pharo-project mailing list
[email protected]
http://lists.gforge.inria.fr/cgi-bin/mailman/listinfo/pharo-project
