At 2005-01-22T23:24:53+1300, Andrew Errington wrote:
> To log in to this particular library you need your library card number
> and a PIN.  It is sent to the server as a POST form using
> x-www-form-urlencoded, which implies it is not encrypted during
> transmission.

Probably not, though it's possible that the details are encoded in some
form.  Is the site using SSL at least?  If not, then other than for the
sake of 'doing it right' there's not much point worrying about the
security of the user's details on your own disk when you're sending them
unencrypted across the Internet every day.

> I am storing the library card number and PIN in a hidden text file in
> the user's home directory, with only their account having read
> privileges.  I need the PIN as plain text to send to the server, but I
> don't know enough to make it safer (assuming it needs to be).

You can't make it a great deal safer.  At some point you need an
unencrypted copy of the user's details to send them to the server.  It's
probably a good idea to programmatically check the file permissions to
ensure they are strict enough, i.e. read access for the user only.  You
could encrypt the details on disk, but at some stage you need to store a
secret somewhere to decrypt them, so you're just shifting the problem.
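
Something like this would do for the permission check (an untested
Python sketch; the filename is one I've made up):

    import os, stat, sys

    # Hypothetical name for the hidden credentials file.
    path = os.path.expanduser("~/.libraryrc")
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        sys.exit("%s is readable by group/other; chmod 600 it" % path)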

If you wanted to be fancy, encrypt the user's details on disk and make
the program persistent (i.e. run all the time, don't start it via cron).
When the program is first started, the user would be required to enter a
password to decrypt the details, which would then be stored in memory
only (this is how ssh-agent works).  Of course, if you don't trust the
other users of the machine where the program is running, having the
unencrypted data only in memory is not really any more secure than
storing it on disk unencrypted.
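
A rough sketch of that approach (untested Python; it assumes the
details were encrypted beforehand with openssl(1)):

    import getpass, subprocess, time

    # Prompt once at startup; the decrypted details live in memory only.
    passphrase = getpass.getpass("Passphrase for library details: ")

    # Assumes the file was created with something like:
    #   openssl enc -aes-256-cbc -in details.txt -out details.enc
    details = subprocess.run(
        ["openssl", "enc", "-d", "-aes-256-cbc", "-in", "details.enc",
         "-pass", "stdin"],
        input=passphrase.encode(), stdout=subprocess.PIPE,
        check=True).stdout
    del passphrase

    while True:
        # ... log in and fetch the page using `details` ...
        time.sleep(24 * 60 * 60)   # poll daily rather than via cron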

One other thing: make sure you don't pass the username and password
details on the command line to the tool you're using to fetch the page.
If you do this, other users on the machine can find out your username
and password details easily using tools such as ps(1).  Most decent
tools that accept usernames and passwords as command line arguments will
also let you use environment variables or files as the data source.
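
With curl, for instance, "--data @file" reads the POST body from a
file instead of the command line.  And if you move to a scripting
language, the details never need to touch argv at all; a minimal
Python sketch (the URL, filename, and form field names are all made
up):

    import os, urllib.parse, urllib.request

    # Hypothetical layout: card number on one line, PIN on the next.
    with open(os.path.expanduser("~/.libraryrc")) as f:
        card, pin = f.read().split()

    # Use whatever field names the real login form actually posts.
    body = urllib.parse.urlencode({"cardnumber": card,
                                   "pin": pin}).encode()
    page = urllib.request.urlopen("http://library.example.org/login",
                                  data=body).read()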

> PS I'm using curl, grep and sed so far.  Tried to fit awk in, but not
> at the moment, although there is still plenty to do.

It sounds like you've got the basics working already, but it might be
worth considering using a tool that can parse HTML properly.  If the
page is simple or very unlikely to change, you can get away with using
things like grep, sed, and awk, but for any serious parsing they're the
wrong tools.  The common scripting languages such as Perl, Python, Ruby,
Tcl, etc. all have HTML parsing modules
available.  There may be standalone command line tools that munge HTML
into something easier to handle with grep, sed, or awk, but I don't know
of any off the top of my head.
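
As an example of the first option, Python's standard library ships an
HTML parser that copes with real-world markup far better than regular
expressions do.  An untested sketch (it assumes the data you want sits
in table cells, and the filename is made up):

    from html.parser import HTMLParser

    class CellExtractor(HTMLParser):
        """Collect the text of every <td> cell on the page."""
        def __init__(self):
            super().__init__()
            self.in_cell = False
            self.cells = []

        def handle_starttag(self, tag, attrs):
            if tag == "td":
                self.in_cell = True

        def handle_endtag(self, tag):
            if tag == "td":
                self.in_cell = False

        def handle_data(self, data):
            if self.in_cell and data.strip():
                self.cells.append(data.strip())

    parser = CellExtractor()
    parser.feed(open("loans.html").read())
    print(parser.cells)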

Cheers,
-mjg
-- 
Matthew Gregan                     |/
                                  /|                [EMAIL PROTECTED]
