Nicolas:

>> However, it is probably less work to embed features into the
>> lock screen program than to add new programs into the Trusted
>> Path, I'd think.
> 
> Has anyone looked at what it would take to do the latter?

I don't think so.  Also, this question is a bit fuzzy.  There are
a lot of a11y components.  It's probably reasonable to consider
ATK a part of the Trusted Path already, since it is a part of GTK+.

However, if we want to launch AT programs from the dialog, then
we need at-spi-registryd and the AT programs themselves.

We probably don't need the full range of AT programs in the
dialog.  For example, it might make more sense to support a
hotkey to increase/decrease the font size than to support a
separate magnifier application.
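To make the hotkey idea concrete, here is a minimal sketch of the
logic the dialog itself could carry instead of a magnifier.  All
names, key identifiers, and size limits here are illustrative
assumptions, not taken from any existing codebase:

```python
# Hypothetical sketch: the lock dialog handles a zoom hotkey itself
# rather than launching a separate magnifier AT program.

MIN_PT, MAX_PT = 8, 48  # assumed sane bounds for the dialog font


def adjust_font_size(current_pt, key):
    """Return the new font size (in points) after a hotkey press."""
    if key == "zoom-in":
        return min(current_pt + 2, MAX_PT)
    if key == "zoom-out":
        return max(current_pt - 2, MIN_PT)
    return current_pt
```

Since this is pure dialog-internal logic, nothing new would need to
enter the Trusted Path for it.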

>>> Text-to-speech cannot be avoided.  Orca will have to be part of the
>>> trusted path.
>> I don't think this is the case if the dialog passes strings to
>> the userland eye-candy program and these are passed along to
>> orca.  Since the lock screen dialog and PAM have control over
>> the strings displayed, we know no sensitive information would
>> ever be passed along.  The only way this would happen is if
>> PAM passed a message back saying your password or something,
>> which should never happen.
> 
> The question becomes: can having access to the user's xauth (or to the
> session's DBus, or whatever) allow a process to modify those strings.

Yes, I'd imagine if someone had access to the user's xauth or session
D-Bus or ORBit2 CORBA connection, that they could manipulate these
strings.  However, they wouldn't have access to any sensitive data
or any information about what the user is typing.  You could perhaps
do something like change the string "Enter your password" to "Enter
your mother's maiden name".  I suppose this could cause enough
confusion for a user that it would amount to a DoS.

The point of providing this information is mostly just to allow
the user to know what is going on in the dialog, so if they move
focus to the "OK" button, the text-to-speech engine or braille
display engine should report "Focus moved to OK button", and if they
move focus to the entry field it should say whatever the PAM prompt
reads.
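Since the dialog has so few focus targets, the mapping from focus
event to reported string could be little more than a lookup.  A
hypothetical sketch, with invented widget names (the only
non-invented part is the idea that the entry field reports the PAM
prompt):

```python
# Hypothetical sketch: map the dialog's few focus targets to the
# strings handed to the text-to-speech or braille back end.


def announcement_for(widget, pam_prompt):
    """Build the string reported when focus moves to a widget."""
    if widget == "ok-button":
        return "Focus moved to OK button"
    if widget == "cancel-button":
        return "Focus moved to Cancel button"
    if widget == "entry":
        # Report whatever PAM asked for, e.g. "Password: ".
        return pam_prompt
    return ""
```

Because the dialog and PAM control every string in this table, no
sensitive data can leak through it.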

I'm unsure if it would be a good idea to report what the user is
typing into the entry field, but it could work either way.  The
user could be expected to just type whatever they need to into
the entry field properly without feedback.

> If the answer is yes, then that needs to be fixed.  I'm guessing
> __wildly__ that it will be easier to make all the a11y infrastructure
> part of the trusted path.

Yes, if it is a requirement to ensure that this sort of information
cannot be manipulated, then it would be necessary to call the
text-to-speech and braille display engines directly from the
dialog.

I don't think this means that we need to include "all of the a11y
infrastructure".  It just means that we need to include those
components needed to translate text to speech and braille display
output.

This doesn't necessarily mean that we need Orca; we could call the
lower-level interfaces directly.  This could probably be done
without using at-spi-registryd at all, if we wanted.
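As a sketch of what "calling the lower-level interfaces directly"
might look like: assuming a speech back end such as
speech-dispatcher is available, the dialog could hand it the string
itself, bypassing AT-SPI entirely.  The spd-say client is real; the
wrapper and its injectable runner are my own illustration:

```python
import subprocess


def speak(text, runner=subprocess.run):
    """Send a string straight to a speech back end, skipping AT-SPI.

    Assumes speech-dispatcher's spd-say command-line client; the
    runner argument exists so the external call can be stubbed out.
    """
    runner(["spd-say", text], check=False)
```

Whether such a back end could itself be accepted into the Trusted
Path is exactly the open question above.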

Since the login GUI is very simple, there are probably only a
few focus change events and strings that would need to be
reported.

Brian
