Re: A small present. (PenTablet-support installation script for 656/703)
On Mon, May 5, 2008 at 4:41 PM, Sameer Verma [EMAIL PROTECTED] wrote:

Sameer Verma wrote:
Michael Stone wrote:

Friends,

Over a month ago, Blake and I pieced together some software to turn on PenTablet support. At Kim's and SJ's urging, I have prepared http://teach.laptop.org/~mstone/fix-tablet.sh which makes it a bit easier to install the relevant pieces. I believe it will enable the PT on both 656 and 703, and I'm not _aware_ of any side effects, but it's definitely experimental software. Enjoy, unless you prefer to help out further by testing Andres' 2.6.25 kernels.

Michael

___
Devel mailing list
Devel@lists.laptop.org
http://lists.laptop.org/listinfo/devel

Hi Michael,

I ran your script without a hitch, but I am unable to see it work in Paint. Which activity will support its use?

Sameer

Following up on my post... I upgraded my G1G1 from 656/Q2D07 to 703/Q2D14 and updated the activities using Bert's script. I re-ran Michael's pen-tablet script. I still didn't see the pen-tablet feature working in Paint, until... I applied a lot more pressure than I had anticipated. The cursor moved! So I guess the pen-tablet script makes it work, but the amount of pressure needed is a lot more than what I would apply to my Nokia 770 or 810. Is this expected?

Yep. For whatever reason, the OLPC tablet requires quite a bit of pressure to work -- much more than conventional graphics tablets.

Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
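[A side note on pressure handling: an activity that reads raw evdev events can apply its own pressure cutoff on top of whatever the hardware reports, e.g. to ignore accidental light touches. A minimal sketch, assuming Linux evdev-style (type, code, value) tuples; the threshold value is invented for illustration and would need tuning on real hardware.]

```python
# Hypothetical pressure filter for raw tablet events, represented as
# (type, code, value) tuples in Linux evdev style.
# Standard evdev constants: EV_ABS = 3, ABS_PRESSURE = 24.

EV_ABS = 3
ABS_PRESSURE = 24

def filter_by_pressure(events, threshold=10):
    """Pass through all events, but drop ABS_PRESSURE readings below
    `threshold` so very light touches are ignored.  The default
    threshold here is a guess, not a value from the driver."""
    kept = []
    for etype, code, value in events:
        if etype == EV_ABS and code == ABS_PRESSURE and value < threshold:
            continue  # too light -- drop this pressure reading
        kept.append((etype, code, value))
    return kept

# Example: two pressure readings (one light, one firm) and an ABS_X event.
events = [(3, 24, 4), (3, 24, 200), (3, 0, 150)]
print(filter_by_pressure(events))  # the light (value 4) reading is dropped
```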
Re: A small present. (PenTablet-support installation script for 656/703)
Hi Michael,

Thanks for this. I'd like to give it a try soon, but I was just wondering... do you know if there's a way to use this without the tablet controlling the core pointer in X? From a usability point of view, I think it should work as an independent device.

Thanks,
Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci

On Sat, May 3, 2008 at 12:32 AM, Michael Stone [EMAIL PROTECTED] wrote:

Friends,

Over a month ago, Blake and I pieced together some software to turn on PenTablet support. At Kim's and SJ's urging, I have prepared http://teach.laptop.org/~mstone/fix-tablet.sh which makes it a bit easier to install the relevant pieces. I believe it will enable the PT on both 656 and 703, and I'm not _aware_ of any side effects, but it's definitely experimental software. Enjoy, unless you prefer to help out further by testing Andres' 2.6.25 kernels.

Michael
Re: Usability testing
On Sun, Apr 13, 2008 at 3:08 PM, Greg Smith (gregmsmi) [EMAIL PROTECTED] wrote:

Hi Tomeu et al,

I have done a few usability tests, and they are a lot of work and not easy to turn into code later.

Many people seem to believe this myth that usability tests are too much work. But as Jakob Nielsen says:

Some people think that usability is very costly and complex and that user tests should be reserved for the rare web design project with a huge budget and a lavish time schedule. Not true. Elaborate usability tests are a waste of resources. The best results come from testing no more than 5 users and running as many small tests as you can afford.

(The whole article is worth a read. Why You Only Need to Test With 5 Users: http://www.useit.com/alertbox/2319.html)

[snip]

The report by Carol's daughter (see the South Bronx Teacher Feedback link at http://wiki.laptop.org/go/User_talk:Gregorio#User_experience.2C_input.2C_ideas_and_blogs): one key idea there is that kids won't wait for an activity to load. The activity icon blinks, but the kids didn't get that. Maybe an animated GIF or a mini-animation would help. Or maybe paint the activity window right away, then fill it in slowly. The downside of that is you are tied to the activity even if it never loads. Two ideas, but we need more user feedback that it's an important issue before I would suggest it's a development priority.

And that is a perfect example of how informal usability testing can be done without too much work. I'd love to see more reports from the field like Robin's.

Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
Re: Usability testing
(CC'ed to the sugar ml. For context, see http://lists.laptop.org/pipermail/devel/2008-April/012674.html)

Like Carol, when I first got involved with OLPC, I was surprised to find that so little user testing had been done. I was told by someone that to their knowledge, no formal user testing had been done *at all*. But, as the same person explained, the entire system has been developed by a small number of people under immense time pressure, which naturally makes it difficult to do iterative design.

I think everyone is in agreement that it would be great if more user testing could be done on the Sugar interface. But that doesn't mean everyone agrees on the methods.

On Fri, Apr 11, 2008 at 10:40 AM, Benjamin M. Schwartz [EMAIL PROTECTED] wrote:

Carol Lerche wrote:
| A good example is the rococo color picking widget. According to my
| observation this is very difficult for small children to use, and to learn.

Perhaps you would care to look at http://wiki.laptop.org/go/Designs/Toolbars#11

A glance through the Designs section of the wiki will answer your question about why there isn't a bigger push for usability testing: because Sugar isn't anywhere close to implementing its design. Personally, I expect that usability testing will serve almost exclusively to tell us what we already know is wrong with the current implementation. Once the new designs are implemented, then it will be a good time to test usability to search for further improvements.

Ben, you're right that it will be easier to do testing once the designs have been implemented. But we don't need to wait for a full implementation -- user testing could be done on the existing designs using paper prototyping (http://en.wikipedia.org/wiki/Paper_prototyping) or other lightweight usability methods. This makes it easier to iterate on the designs without wasting a lot of development time.
In the end, nothing beats putting a working interface in front of the actual users, but given the constraints of this project, that kind of testing is difficult. For one thing, it's tough for most of the developers to get access to children in developing countries. However, doing some kind of testing is better than nothing. I'd encourage people to do any kind of user testing they can -- paper prototypes, informal think-alouds with friends and family, etc. Simply putting your design in front of someone who's not familiar with the project will teach you lots.

In fact, I just completed a study like this on the Pen Tablet user interface: http://wiki.laptop.org/go/Pen_Tablet_UI/User_Study

If there's one conclusion we can make here, it's that we could do a better job of coordinating our usability efforts. In the next few days, I'll try to set up a central place on the wiki that we can use to do this. Anyone else who is interested should feel free to pitch in there, or simply get in touch with me to let me know you're interested.

Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
http://wiki.laptop.org/go/User:Pdubroy
Pen Tablet support and a usability study
For the last couple months, I've been working on the higher-level support for the XO Pen Tablet (for more information, see http://wiki.laptop.org/go/Pen_Tablet_Support).

If anyone's interested, I've got a couple of activities available which will let you play with the tablet on an XO:

- TabletAreaTest (http://wiki.laptop.org/go/Pen_Tablet_Support/GTK_Widget#Sample_Activity) demonstrates a GTK widget that is mapped to tablet input
- tabletui (http://wiki.laptop.org/go/Pen_Tablet_UI#Prototype) explores some user interface prototypes for the unconstrained drawing case (http://wiki.laptop.org/go/Pen_Tablet_UI#Unconstrained_drawing)

These activities have been developed for build 656, meaning they don't require any special driver support (other than /dev/input/eventx). I'd love to see people give these a go and let me know what they think.

I also just completed a user study on some of my prototype user interfaces. I've summarized the results here: http://wiki.laptop.org/go/Pen_Tablet_UI/User_Study

Feedback is welcome -- either directly, or on the appropriate Talk page.

Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
http://wiki.laptop.org/go/User:Pdubroy
Project application: TabletUI
1. Project name: TabletUI

2. Existing website, if any:

3. One-line description: A prototype activity for exploring user interfaces for the XO Pen Tablet.

4. Longer description: See http://wiki.laptop.org/go/PenTablet_UI

5. URLs of similar projects:

6. Committer list

Please list the maintainer (lead developer) as the first entry. Only list developers who need to be given accounts so that they can commit to your project's code repository, or push their own. There is no need to list non-committer developers.

   Username   Full name        SSH2 key URL                            E-mail
#1 pdubroy    Patrick Dubroy   http://dubroy.com/patrick/id_rsa.pub   [EMAIL PROTECTED]

If any developers don't have their SSH2 keys on the web, please attach them to the application e-mail.

7. Preferred development model

[X] Central tree. Every developer can push his changes directly to the project's git tree. This is the standard model that will be familiar to CVS and Subversion users, and that tends to work well for most projects.

[ ] Maintainer-owned tree. Every developer creates his own git tree, or multiple git trees. He periodically asks the maintainer to look at one or more of these trees, and merge changes into the maintainer-owned, main tree. This is the model used by the Linux kernel, and is well-suited to projects wishing to maintain tighter control on code entering the main tree.

If you choose the maintainer-owned tree model, but wish to set up some shared trees where all of your project's committers can commit directly, as might be the case with a discussion tree, or a tree for an individual feature, you may send us such a request by e-mail, and we will set up the tree for you.

8. Set up a project mailing list:

[ ] Yes, named after our project name
[ ] Yes, named __
[X] No

When your project is just getting off the ground, we suggest you eschew a separate mailing list and instead keep discussion about your project on the main OLPC development list. This will give you more input and potentially attract more developers to your project; when the volume of messages related to your project reaches some critical mass, we can trivially create a separate mailing list for you. If you need multiple lists, let us know. We discourage having many mailing lists for smaller projects, as this tends to stunt the growth of your project community. You can always add more lists later.

9. Commit notifications

[ ] Notification of commits to the main tree should be e-mailed to the list we chose to create above
[ ] A separate mailing list, projectname-git, should be created for commit notifications
[X] No commit notifications, please

10. Shell accounts

As a general rule, we don't provide shell accounts to developers unless there's a demonstrated need. If you have one, please explain here, and list the usernames of the committers above needing shell access.

It's been suggested (http://lists.laptop.org/pipermail/sugar/2008-April/004921.html) that I should get a shell account in order to create personal git trees for sugar-toolkit and Paint, as a place to begin integrating Pen Tablet support into Sugar.

11. Translation

[X] Set up the laptop.org Pootle server to allow translation commits to be made
[ ] Translation arrangements have already been made at ___

12. Notes/comments:
Re: Fixing the Pen Tablet
I've heard that the tablet is enabled in recent joyride builds. Is there a build that has it that would be particularly good to try out?

And how does the tablet mapping work? Does it control the core pointer, or is it accessed as an XInput device? I'm really interested in trying this out as an alternative to reading directly from /dev/input/event5, as I'm doing now.

Thanks,
Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and design

2008/3/23 Michael Stone [EMAIL PROTECTED]:

Folks,

Below are patches to the kernel source code and the xorg-x11-server and xorg-x11-drv-evdev packages which restore function to the ALPS Pen Tablet (dlo#5268), which fix the twin clicks bug that plagued previous approaches (dlo#6079), and which cause X to configure the Pen Tablet in absolute mode (mapped to the entire screen, dlo#1002) while leaving the Glide Sensor in relative mode (discussion at dlo#4260).

Blake
Michael
Summer of Code mentorship
I've indicated my interest in mentoring a Summer of Code project on the wiki, and sent an email to SJ. However, according to Google, I still haven't been accepted as a mentor.

I've got several undergraduate students here at the University of Toronto who are applying under the assumption that I can mentor them, and I've been told that their applications will be stronger if I've been approved by the time they submit (the deadline is Monday, I think). Does anyone know if there's something I can do to speed this process up?

Thanks,
Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
PenTablet user interface
Hi,

I'm working on a project to improve the PenTablet support. I have two main goals:

1. Build a GTK widget that an application developer could use to get tablet support for free. The widget would provide a 1-to-1 mapping between the physical tablet and the on-screen drawing area. For example, a penmanship application might have an area on-screen where a child could practice their writing.

2. For the more complicated case of freehand drawing in e.g. the Paint activity, my goal is to define the interface through which the user will be able to draw on an arbitrary area of the canvas.

Of course, this is all pending proper driver support for the PenTablet. For now, I am prototyping these applications by reading directly from /dev/input/event5.

I know that this has been discussed previously on the mailing list, but to my knowledge there's been no agreement on exactly how the UI will work for the PenTablet. I've created a page in the wiki (http://wiki.laptop.org/go/PenTablet_UI) that summarizes the previous suggestions that I am aware of. If you have any opinions on this, please take a look and let me know what you think.

My personal feeling is that the best option is this:

- the tablet is always mapped to a rectangle in the center of the screen. Using the grab button and the stylus, the canvas can be moved around underneath the rectangle (Option 1 for Adjusting the Mapping)
- to allow for precise drawing, the user can engage a hover mode by holding down the Alt key while dragging the stylus (Option B for Precise Drawing)

I have an application which demonstrates some of these techniques, which I could make available to anyone who is interested. I am also planning to do a small, informal user study to test some of the techniques.

Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci
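[The first option above -- the tablet always mapped to a fixed rectangle in the center of the screen -- amounts to a linear rescale of the tablet's absolute coordinate range into that rectangle. A sketch of the arithmetic: the XO's screen is 1200x900, but the 1024x768 tablet resolution and the rectangle geometry used here are made-up values for illustration only.]

```python
def map_tablet_to_rect(x, y, tablet_w, tablet_h, rect):
    """Map absolute tablet coordinates (x, y), ranging over
    [0, tablet_w) x [0, tablet_h), into an on-screen rectangle.
    `rect` is (left, top, width, height) in screen pixels."""
    left, top, width, height = rect
    sx = left + x * width // tablet_w
    sy = top + y * height // tablet_h
    return sx, sy

# Illustration only: a hypothetical 1024x768-unit tablet mapped to a
# centered 600x450 rectangle on the XO's 1200x900 screen.
rect = (300, 225, 600, 450)
print(map_tablet_to_rect(0, 0, 1024, 768, rect))       # top-left -> (300, 225)
print(map_tablet_to_rect(512, 384, 1024, 768, rect))   # center   -> (600, 450)
print(map_tablet_to_rect(1023, 767, 1024, 768, rect))  # bottom-right corner
```

Moving the canvas "underneath" the rectangle (the grab-button interaction) would then just offset the canvas origin while this screen mapping stays fixed.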
Re: Accessing PenTablet coordinates from PyGTK
On Fri, Feb 29, 2008 at 12:41 PM, Andres Salomon [EMAIL PROTECTED] wrote:

[...] I've tried setting SendCoreEvents to false, and then creating a GTK widget and registering for extension events. This allows me to receive motion events from the PenTablet, but the x and y values in the event object are inf and nan. Anybody know why this would be? Any suggestions?

Have you tried using a kernel from the master branch? In that, the PT coordinates are ABS by default.

I just got around to trying this now. I grabbed a kernel from the master branch built on 20080123, and the PenTablet works, but it still controls the core pointer in relative mode. If the coordinates are ABS, what kind of behaviour should I be seeing? And any idea where I could poke around to see what the problem is?
Re: Accessing PenTablet coordinates from PyGTK
On Fri, Feb 29, 2008 at 1:38 PM, Tomeu Vizoso [EMAIL PROTECTED] wrote:

On Fri, Feb 29, 2008 at 6:08 PM, Patrick Dubroy [EMAIL PROTECTED] wrote:

Apparently there used to be a method gdk.Window.input_get_pointer that would allow you to query the location of the pointer for XInput devices. However, this method appears to be gone in the current version of GTK. Has it moved somewhere?

I see it here in line 608: http://svn.gnome.org/viewvc/pygtk/trunk/gtk/gtk-types.c

You're right, I see it there. But:

python -c 'import gtk; print hasattr(gtk.gdk.Window, "input_get_pointer")'

returns False. And

strings /usr/lib/python2.5/site-packages/gtk-2.0/gtk/_gtk.so | grep input_get_pointer

doesn't find it either.

Pat
Accessing PenTablet coordinates from PyGTK
I'm trying to get access to the PenTablet from within a PyGTK application. Since the PenTablet isn't working in the current system, I had to modify xorg.conf to use the evdev driver (as in this example: http://dev.laptop.org/attachment/ticket/2198/new-xorg.conf). Doing this, I can get the PenTablet to work in relative mode to control the core pointer. But what I really want is to access the absolute position data from the PenTablet, to be able to draw in a GTK widget independent of the core pointer.

I've tried setting SendCoreEvents to false, and then creating a GTK widget and registering for extension events. This allows me to receive motion events from the PenTablet, but the x and y values in the event object are inf and nan. Anybody know why this would be? Any suggestions?

Apparently there used to be a method gdk.Window.input_get_pointer that would allow you to query the location of the pointer for XInput devices. However, this method appears to be gone in the current version of GTK. Has it moved somewhere?

Pat
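[For reference, reading directly from /dev/input/eventX -- the fallback approach mentioned elsewhere in these threads -- means decoding the kernel's struct input_event records. On the XO's 32-bit system each record is 16 bytes: a two-long timeval, then 16-bit type and code, then a 32-bit value. A sketch using Python's struct module; the device path and the synthetic event values are assumptions for illustration.]

```python
import struct

# struct input_event layout:
#   struct timeval time;   /* two native longs: seconds, microseconds */
#   __u16 type; __u16 code; __s32 value;
EVENT_FORMAT = "llHHi"                       # 16 bytes on a 32-bit system
EVENT_SIZE = struct.calcsize(EVENT_FORMAT)   # computed, so it is always right

def decode_event(data):
    """Decode one raw input_event record into a dict."""
    sec, usec, etype, code, value = struct.unpack(EVENT_FORMAT, data)
    return {"time": sec + usec / 1e6, "type": etype, "code": code, "value": value}

# Reading from the device would look roughly like (not run here):
#   with open("/dev/input/event5", "rb") as dev:
#       while True:
#           event = decode_event(dev.read(EVENT_SIZE))

# Demonstrate with a synthetic EV_ABS (type 3) / ABS_X (code 0) event at x=500:
raw = struct.pack(EVENT_FORMAT, 10, 250000, 3, 0, 500)
print(decode_event(raw))
```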
Re: Controlling the Glide/Pressure sensor
I've got my hands on a B2 machine, and finally got around to giving this a go. When I boot into the firmware tests and get to the tablet test, it doesn't seem to detect anything at all except in the center third (the capacitive area). Does this mean that the firmware in this machine doesn't support the pressure tablet at all?

On 1/14/08, Wade Brainerd [EMAIL PROTECTED] wrote:

You can watch the output of the PT by downloading and compiling evtest:

wget http://david.woodhou.se/evtest.c
gcc -o evtest evtest.c
./evtest /dev/input/event5 0

It's event5 on my XO; you might have to use a different number. Anyway, then drag something around on the PT and watch the output. This is the output from the olpc.c driver in the kernel. So the driver appears to be working, but there is something preventing X (possibly the HAL?) from recognizing the events as mouse movement. I'm not sure whether the driver correctly reports the GS's (quite limited) pressure sensitivity, though.

If you want to see the raw data, just boot the XO with the left gamepad key held down and run through the firmware tests to the one which tests the tablet; it lets you see both GS and PT input as well as pressure.

Best,
Wade

On Jan 14, 2008 11:29 AM, Patrick Dubroy [EMAIL PROTECTED] wrote:

I'm the student Mike mentioned who will be working on this project. Does anyone have any more details on how much low-level work needs to be done? I know there will need to be work done to map the input from the tablet to X events. Is the device driver fully functional? I'd appreciate any more details that people could give me.

Thanks,
Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci

On 1/10/08, Mike C. Fletcher [EMAIL PROTECTED] wrote:

I have a student who's interested in doing a term project on the UI for the track sensor. I've put together this quick summary. Deadline looms for starting the project, so if people have "don't do that" or "we've already done that" feedback, please speak up ASAP.
[snip: the quoted remainder of Mike's message, which appears in full in the next message of this archive]
Re: Controlling the Glide/Pressure sensor
I'm the student Mike mentioned who will be working on this project. Does anyone have any more details on how much low-level work needs to be done? I know there will need to be work done to map the input from the tablet to X events. Is the device driver fully functional? I'd appreciate any more details that people could give me.

Thanks,
Pat

--
Patrick Dubroy
http://dubroy.com/blog - on programming, usability, and hci

On 1/10/08, Mike C. Fletcher [EMAIL PROTECTED] wrote:

I have a student who's interested in doing a term project on the UI for the track sensor. I've put together this quick summary. Deadline looms for starting the project, so if people have "don't do that" or "we've already done that" feedback, please speak up ASAP.

Background:

* XO has two different devices: resistive glide-sensor and pressure-sensitive tablet
  o Both of these are currently showing up as core pointer events in X, AFAIK
  o Changes between pressure and glide-sensor activity have the potential to cause jumps of the cursor (absolute versus relative mode)
* There is currently no UI to map the pressure-sensitive tablet's area into a particular area on the screen (nor, AFAIK, an API to accomplish the mapping)
  o Use case: use the entire drawing area to draw into a small area of a drawing in Paint
* Activities currently do not have control over the mapping of the area
  o Use case: in a penmanship course, collect samples of the child's letters in special widget areas within a test; focusing a new area should remap the pen to that area

Trackpad UI Design Requirements:

* API for configuring the resistive/pressure sensor, allowing control of all the major features
  o Note that there will likely be some X11 hacking here, to get the pointer mapping working
* Standard UI controls for redirecting input areas
  o Standard GTK UI for positioning and scaling
  o Standard GTK widget for e.g. handwritten text entry; provide access as a bitmap (or a series of strokes, optional)
    + Allow for capturing all data (full resolution) or just scaled data as a configuration option
  o Intuitive (HIG-compliant) standard mechanisms for controlling the various configuration parameters
  o A 6-year-old should be able to figure out how to direct their drawings, written text and the like into the desired areas
  o Standard feedback on where the tablet area is bounded on screen when drawing with the tablet
* System-level (possibly on Sugar's Frame) trigger to bring up the control mechanisms (optional)
  o Most pen-aware applications will likely use internal logic and the API to determine position and the like, but a general trigger to the functionality should be useful for non-pen-aware activities
* Paint Controls
  o Work with Paint's authors to provide intuitive controls to make using the pen/tablet intuitive within the context of Paint

Obviously we would need to find a machine to work on to make the project feasible. I'll see if we can repurpose one that's local to the task.

Discussions welcome,
Mike

--
Mike C. Fletcher
Designer, VR Plumber, Coder
http://www.vrplumber.com
http://blog.vrplumber.com
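[The penmanship use case above -- "focusing a new area should remap the pen to that area" -- suggests an API where whichever widget has focus claims the tablet and receives coordinates scaled into its own rectangle. Purely as a hypothetical sketch: none of these class or method names exist in Sugar or GTK, and the dimensions are invented.]

```python
class TabletMapper:
    """Hypothetical per-widget tablet remapping.  The widget that
    currently has focus owns the mapping; tablet coordinates are
    scaled into that widget's on-screen rectangle."""

    def __init__(self, tablet_w, tablet_h):
        self.tablet_w = tablet_w
        self.tablet_h = tablet_h
        self.rect = None  # (left, top, width, height), or None if unclaimed

    def focus(self, rect):
        """Called when a pen-aware widget gains focus and claims the tablet."""
        self.rect = rect

    def unfocus(self):
        """Called when the widget loses focus; the tablet is released."""
        self.rect = None

    def map_point(self, x, y):
        """Scale a tablet point into the focused widget's rectangle,
        or return None when no widget has claimed the tablet."""
        if self.rect is None:
            return None
        left, top, w, h = self.rect
        return (left + x * w // self.tablet_w,
                top + y * h // self.tablet_h)

mapper = TabletMapper(1024, 768)               # invented tablet resolution
print(mapper.map_point(512, 384))              # no focused widget -> None
mapper.focus((100, 100, 200, 150))             # e.g. a handwriting-sample box
print(mapper.map_point(512, 384))              # -> (200, 175)
```

A real implementation would hang this off focus-in/focus-out events on the widgets, but the remapping itself is just this rescaling step.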