Wasn't John McCarthy's Elephant programming language based on the metaphor
of conversation?  Perhaps voice-based programming interactions are
addressed there?
On Apr 9, 2013 8:46 AM, "David Barbour" <dmbarb...@gmail.com> wrote:

>
> On Tue, Apr 9, 2013 at 1:48 AM, Casey Ransberger <casey.obrie...@gmail.com
> > wrote:
>
>>
>> The computer is going to keep getting smaller. How do you program a
>> phone? It would be nice to be able to just talk to it, but it needs to be
>> able -- in a programming context -- to eliminate ambiguity by asking me
>> questions about what I meant. Or *something.*
>>
>
> Well, once computers get small enough that we can easily integrate them
> with our senses and gestures, it will become easier to program again.
>
> Phones are an especially difficult target (big hands and fingers, small
> screens, poor tactile feedback, noisy environments). But something like
> Project Glass or AR glasses could project information onto different
> surfaces - screens the size of walls, effectively - or perhaps the size of
> our Moleskine notebooks [1]. Something like Myo [2] would support pointer
> and gesture control without interfering much with our use of our hands.
>
> That said, I think supporting ambiguity and resolving it will be one of
> the upcoming major revolutions in HCI. It also has a rather deep impact on
> software design [3].
>
> (Your Siri conversation had me laughing out loud. Appreciated.)
>
> [1]
> http://awelonblue.wordpress.com/2012/10/26/ubiquitous-programming-with-pen-and-paper/
> [2] https://getmyo.com/
> [3]
> http://awelonblue.wordpress.com/2012/05/20/abandoning-commitment-in-hci/
>
>
> _______________________________________________
> fonc mailing list
> fonc@vpri.org
> http://vpri.org/mailman/listinfo/fonc
>
>
