On Sun, 23 Jan 2005 13:38, you wrote:
> At 2005-01-22T23:24:53+1300, Andrew Errington wrote:
> > To log in to this particular library you need your library card number
> > and a PIN.  These are sent to the server in an HTTP POST using
> > application/x-www-form-urlencoded, which suggests they are not
> > encrypted in transit.
>
> Probably not, though it's possible that the details are encoded in some
> form.  Is the site using SSL at least?  If not, then other than for the
> sake of 'doing it right' there's not much point worrying about the
> security of the user's details on your own disk when you're sending
> them unencrypted across the Internet every day.

No SSL.
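
For the record, reproduced with curl the login request looks roughly like
this (the URL and form field names here are made up):

  # -d sends the body as application/x-www-form-urlencoded; over plain
  # HTTP it travels in the clear, as you say.  Multiple -d arguments
  # are joined with '&'.  -c saves the session cookie for later requests.
  curl -d 'cardnumber=01234567890' \
       -d 'pin=0000' \
       -c cookies.txt \
       http://library.example.org/login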

> > I am storing the library card number and PIN in a hidden text file in
> > the user's home directory, with only their account having read
> > privileges.  I need the PIN as plain text to send to the server, but I
> > don't know enough to make it safer (assuming it needs to be).
>
> You can't make it a great deal safer.  At some point you need an
> unencrypted copy of the user's details to send them to the server.  It's
<snip>
> If you wanted to be fancy, encrypt the user's details on disk and make
> the program persistent (i.e. run all the time, don't start it via cron).

No, don't want to be fancy.
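
For the file itself I do something along these lines (the filename
.libraryrc is made up here):

  # One-off setup: store the card number and PIN, already urlencoded,
  # in a hidden file readable only by this account.  umask 077 keeps
  # the file from ever being world-readable, even before the chmod.
  ( umask 077
    printf 'cardnumber=01234567890&pin=0000\n' > "$HOME/.libraryrc" )
  chmod 600 "$HOME/.libraryrc"

Keeping the contents pre-encoded as key=value pairs means curl can post
the file directly -- see below.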

<snip>
> One other thing, make sure you don't pass the username and password
> details on the command line to the tool you're using to fetch the page.
> If you do this, other users on the machine can find out your username
> and password details easily using tools such as ps(1).  Most decent
> tools that accept usernames and passwords as command line arguments will
> also let you use environment variables or files as the data source.

Got that.  So probably I'm doing the right thing.  I guess I have the 
following things covered:

1) Physical security - the machine is as safe as anything else in the house.
2) Local security - the file containing the username and password is
protected from other users (except root, of course).
3) Network security - the machine is protected by a firewall, and
non-essential services on the box are turned off.  Plus I apt-get
update/upgrade regularly.

In fact, the password is probably safer on the machine than it is when it 
is used!
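
Concretely, curl can read the POST body straight from the protected file,
so the PIN never appears as a command-line argument visible to ps(1).
Same made-up filename and field names as above:

  # @file tells curl to take the request body from the named file;
  # nothing secret shows up in the process's argument list.
  curl -d @"$HOME/.libraryrc" \
       -c cookies.txt \
       http://library.example.org/login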

> > PS I'm using curl, grep and sed so far.  I tried to fit awk in, but
> > haven't found a use for it yet, although there is still plenty to do.
>
> It sounds like you've got the basics working already, but it might be
> worth considering using a tool that can parse HTML properly.  If the
> page is simple or very unlikely to change, you can get away with using
> things like grep, sed, and awk, but to do any serious parsing you're
> wasting your time with these tools.  The common scripting languages
> (Perl, Python, Ruby, Tcl, etc.) all have HTML parsing modules
> available.  There may be standalone command line tools that munge HTML
> into something easier to handle with grep, sed, or awk, but I don't know
> of any off the top of my head.

Thanks for the advice.  I could have started with a Perl script, but I
thought I'd have a go at stringing some Unix tools together.  It's been
fun so far, but if I find something I can't do then I'll rethink.
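
For the curious, this is the sort of pipeline I mean, against made-up
markup (it holds up only as long as the page layout does, which is your
caveat exactly):

  # Fetch the loans page with the saved session cookie, then scrape
  # out the due dates.  The <td class="due"> cells are invented for
  # illustration; the real page will differ.
  curl -s -b cookies.txt http://library.example.org/loans \
    | grep -o '<td class="due">[^<]*</td>' \
    | sed -e 's/<[^>]*>//g'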

Andy
