Ted Leung wrote:
> The whole command line approach seems wrong to me.  Agenda, and Apple's
> Newton, both of which ran on significantly less capable hardware than we
> have today, were able to do recognition of people, dates, etc without
> the need for a semantic hint such as /event or /task.   It might be that
> the first cut of recognition needs those hints to be implementable, but
> I think that a longer-range goal should be to reduce the need for these
> kinds of hints.

Yeah, it seemed kinda awkward to me too, but I haven't used Agenda
etc., so I don't have a good point of comparison.

Before this command line discussion I had vaguely assumed that one
would hit the "New" button in the toolbar, get a note, and start
typing. As the NLP recognized things, it would collect dates etc.
from the flow of text.
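
Roughly the shape I have in mind for that first pass; the regex here
is just a crude stand-in for whatever real recognizer we end up with,
it only shows the "scan the text, yield spans" part:

import re

# Crude stand-in for the real NLP layer: spot a few date-like
# phrases and report where they are in the text.
DATE_RE = re.compile(
    r'\b(today|tomorrow|next \w+day|\d{1,2}(:\d{2})?\s*[ap]m)\b',
    re.IGNORECASE)

def recognize(text):
    """Yield (start, end, matched_text) for each date-like span."""
    for m in DATE_RE.finditer(text):
        yield m.start(), m.end(), m.group(0)

# e.g. list(recognize("lunch with Ted next Tuesday at 1pm"))
# -> [(15, 27, 'next Tuesday'), (31, 34, '1pm')]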

I was thinking it could perhaps underline the special things it
understood (might be hard with wx).
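
Then again, maybe wx.TextCtrl.SetStyle() on a rich control gets us
most of the way there. A rough, untested sketch (it reuses the
recognize() helper from the snippet above, and restyling the whole
buffer on every keystroke is obviously too naive for real use):

import wx

class NoteCtrl(wx.TextCtrl):
    """Plain note field that underlines date-like spans as you type."""

    def __init__(self, parent):
        # TE_RICH2 so SetStyle() takes effect on Windows too
        wx.TextCtrl.__init__(self, parent,
                             style=wx.TE_MULTILINE | wx.TE_RICH2)
        self.Bind(wx.EVT_TEXT, self.on_text)

    def on_text(self, event):
        text = self.GetValue()

        # Drop any old styling first.
        plain = wx.TextAttr()
        plain.SetFont(self.GetFont())
        self.SetStyle(0, self.GetLastPosition(), plain)

        # Underline everything the recognizer picked out.  (The
        # regex offsets may need adjusting for CRLF on Windows.)
        font = self.GetFont()
        font.SetUnderlined(True)
        underline = wx.TextAttr()
        underline.SetFont(font)
        for start, end, _match in recognize(text):
            self.SetStyle(start, end, underline)

        event.Skip()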

I was also thinking it could automatically add fields to the detail
view as it recognized them, but that would cause the note area to
shift downwards while you were typing, which would be awkward. The
same thing would happen if it automatically stamped the note based on
what kind of data you had written.

Another unfortunate thing with this approach is that currently the
new item's type depends on what is selected in the toolbar (Calendar
-> create event), and with anything other than a plain note, those
date fields etc. get in the way if you would rather rely on NLP.

-- 
  Heikki Toivonen

