Eric,

Although I would most likely enjoy the former option, I feel the latter would be 
most appropriate for contacting you. Thanks for getting back to me and 
explaining some of this. I will certainly contact you off list so as not to 
fill up the list's mailboxes with this topic. 

I will say that Jacob is absolutely correct that those of us who rely on 
screen readers to interact with our computers have a few things we do, and a 
few things we tend not to do.
• Do type very efficiently.
• Do listen to our speech output at a very fast rate.
• Do interact with the text a line, word, and character at a time.
• Do not use a mouse.
• Do not enjoy reading through several lines of code at a time to locate a 
specific issue, method, or block of code.
• Do not have access to debugging tools.

These are just a few of the things we typically do and do not do, but the 
point is that even though some things are similar, others are quite different. 
Here is an example: I would not use a voice-driven editor because I tend to 
build out a layout of code before I begin filling in methods, functions, and 
constructors. Doing this by voice would require a mouse to place the cursor 
where I want each line of code to go, and since I do not have access to the 
mouse in that way, this would not work well. On the other hand, if the program 
could accurately interpret what you are speaking to build out method 
signatures and individual lines of code, then there is a good chance that same 
system could provide a great interface for a screen reader. Thank you again 
for the response, and expect an email from me shortly, sir.  
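On that note, the kind of structural overview this thread keeps circling — a layout of classes and methods announced at a high level, rather than read line by line — could be prototyped cheaply. Here is a minimal sketch, assuming Python and its standard ast module; the outline function and its summary format are my own hypothetical choices, not taken from any existing tool:

```python
import ast
import textwrap

def outline(source):
    """Summarize each class and function in one spoken-friendly line,
    so a listener can hear the layout without reading every body."""
    summary = []

    def visit(node, depth):
        for child in ast.iter_child_nodes(node):
            if isinstance(child, ast.ClassDef):
                # Announce the class and how much it contains, then descend.
                summary.append(f"{'  ' * depth}class {child.name}, "
                               f"{len(child.body)} statement(s)")
                visit(child, depth + 1)
            elif isinstance(child, (ast.FunctionDef, ast.AsyncFunctionDef)):
                # Announce the signature and body size, but not the body itself.
                args = ", ".join(a.arg for a in child.args.args)
                summary.append(f"{'  ' * depth}def {child.name}({args}), "
                               f"{len(child.body)} statement(s)")

    visit(ast.parse(source), 0)
    return summary

sample = textwrap.dedent("""\
    class Stack:
        def __init__(self):
            self.items = []

        def push(self, item):
            self.items.append(item)
    """)

for line in outline(sample):
    print(line)
```

A screen reader could then step through these summary lines one at a time, expanding a block into its full text only when it sounds relevant — which is exactly the "collapse and hear a high-level description" behavior described below.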
> On Feb 20, 2015, at 11:39 AM, Jacob Kruger <ja...@blindza.co.za> wrote:
> 
> Eric, issue is that with screenreaders, we're generally way more into
> navigating code and interface character by character/by keyboard, so, yes,
> keeping interface relatively simple is a good thing, but, we also would
> prefer to primarily keep all interface elements to make use of standard UI
> controls, and make sure tab index/order is suitable/relevant at times, etc.
> etc.
> 
> As in, I think we'd primarily want to avoid having to use a mouse at all if
> possible, but anyway.
> 
> Stay well
> 
> Jacob Kruger
> Blind Biker
> Skype: BlindZA
> "Roger Wilco wants to welcome you...to the space janitor's closet..."
> 
> ----- Original Message ----- From: "Eric S. Johansson" <e...@harvee.org>
> To: <python-list@python.org>
> Sent: Friday, February 20, 2015 7:22 PM
> Subject: Re: Accessible tools
> 
> 
>> 
>> On 2/19/2015 10:33 AM, Bryan Duarte wrote:
>>> Thank you jwi, and Jacob,
>>> 
>>> I took a look at that posting and it seems pretty unique. I am not much 
>>> interested in the speech driven development, but I am very interested in 
>>> developing an accessible IDE.
>> 
>> Well, you should be, because an aural interface (speech instead of 
>> keyboards) uses the same kinds of data to drive either a text-to-speech or 
>> a speech-recognition environment.
>>> A professor and I have been throwing around the idea of developing a 
>>> completely text based IDE. There are a lot of reasons this could be 
>>> beneficial to a blind developer and maybe even some sighted developers who 
>>> are comfortable in the terminal. The idea would be really just to provide a 
>>> way of easily navigating blocks of code using some kind of tabular 
>>> formatting, and being able to collapse blocks of code and hearing from a 
>>> high level information about the code within. All tools and features would 
>>> obviously be spoken or output in some kind of audio manner.
>> I've been working with another professor on some of these issues as well. 
>> His focus has been mostly blind young adults in India. We've come up with 
>> some pretty cool concepts that look very usable. The challenge now is to 
>> make them work and, quite frankly, to monetize the effort to pay for the 
>> development.
>> 
>> Again, this shows the similarities in functionality used by both speech 
>> recognition and text-to-speech. All I care about is text and what I can say. 
>> We're now working with constructs such as with-open, argument by number, 
>> plaintext symbol names (with bidirectional transform to and from code form), 
>> guided construct generation for things like classes, methods, comprehensions 
>> etc.
>> 
>> All of these things would be useful to handed programmers as well as a way 
>> of accelerating co-creation and editing. Unfortunately, like with disabled 
>> people stove piping text-to-speech versus speech recognition, handed 
>> developers stovepipe keyboard interfaces and don't really think about what 
>> they are trying to do, only how they are doing it.
>> 
>> Yes yes, it's a broadbrush that you can probably slap me with. :-)
>>> 
>>> Oh and before I forget does anyone know how to contact Eric who was 
>>> developing that accessible speech driven IDE? Thanks
>> 
>> Well, you could try looking in a mirror and speaking my name three times at 
>> midnight. But you would get better results if you used my non-mailing list 
>> email address. e...@eggo.org.
>> 
>> --- eric
>> -- 
>> https://mail.python.org/mailman/listinfo/python-list
>> 
> 
