Hi everyone,

I have not yet ventured into Linux on the client side
here in the office. Admittedly there are only two Linux computers here to
date. One is the server, and the other is my personal laptop. So far the
only hindrance has been authentication.

I obviously do not want to create and keep synchronized the
group/gshadow/passwd/shadow files on each of the clients. I'm sure
everyone knows how crazy this would be. There are a number of ways around
this that I know of: I've heard/read of NIS/YP, LDAP
authentication, and Samba authentication via a PAM module. I do not know
how stable the SQL authentication modules are.

It seems the most reliable of these are NIS/YP and LDAP. I am particularly
inclined to use LDAP because of the promise of scalability (despite the
fact that I honestly do not expect to get 64k users any time in the near
future). Unfortunately my latest attempt at using libnss-ldap had some
problems.

I set up my nsswitch.conf to use LDAP only, and while logins worked, the
user could not be identified. In particular the shell read something like
"I do not know you" instead of the username at the prompt.

I have never tried to use NIS/YP.

Perhaps people who have set up such authentication systems can share with
me their views about both approaches? I do not know how they compare with
each other as far as performance and security are concerned (by
performance I mean, for example: if you ls an entire filesystem tree,
will the system slow down resolving the numerical UIDs to usernames?).
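
On the performance side, I gather the usual answer is to run nscd (the
glibc name service cache daemon) on each client, so repeated UID-to-name
lookups are answered from a local cache instead of hitting the directory
every time. A rough sketch of the relevant part of its config (assuming
the stock glibc nscd; only the passwd and group caches shown):

    # /etc/nscd.conf -- cache passwd and group lookups for ten minutes
    enable-cache            passwd  yes
    positive-time-to-live   passwd  600
    enable-cache            group   yes
    positive-time-to-live   group   600

I would still like to hear how well this works in practice with either
NIS/YP or LDAP behind it.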

Pointers to documents aside from the HOWTOs at LinuxDoc would also be great.

I am inspired to set up Linux on the client workstations here in the
office because of, among other reasons, the viability of GnuCash over
Quickbooks. I just checked their website out and the latest version of
GnuCash already has multi-user support with a PostgreSQL database backend.
WOWOWEE!!! This honestly beats Quickbooks anytime. Quickbooks is great, but
the five-user "value" pack costs US$500. That's a whopping P25k (or more
depending on the exchange rate when you read this message).

In line with this I wonder how I should approach software installation.
One approach would be to install all the client software on all the Linux
workstations. This doesn't sound too attractive, though.

There are three alternatives I have in mind, and again I was hoping for
more comments from those with experience:

 o NFS-mounted share with the applications.
 o Remote X connections with all applications server-side.
 o Local X connections with connections to the server via SSH.

The first alternative sounds a little off, but maybe someone has done
this and can shed light on it to make it sound less off and more on.
;>
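
To be concrete about what I mean, something like an export on the server
and a matching mount on each client (the paths and network here are made
up, just to illustrate):

    # on the server, /etc/exports -- share the application tree read-only
    /opt/apps   192.168.1.0/255.255.255.0(ro)

    # on each client, /etc/fstab -- mount it at the same location
    server:/opt/apps   /opt/apps   nfs   ro,hard,intr   0  0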

The second alternative I've read about a number of times already (I think
there's an article on the Linux Gazette about this). I do not know how
safe this is, though, considering some factors that I will list later on
together with the server hardware. I also do not know how well this
performs, although the computers are connected in a 10/100Mbps LAN with
most of them connected at 100Mbps (although throughput only averages two
to four MBps).
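
For the record, the setup I have read about for this is XDMCP: the server
runs xdm (or kdm/gdm) with XDMCP enabled, and each workstation runs
nothing but a local X server that queries it for a login screen, roughly
(the hostname is a placeholder):

    # on a workstation: local X server, everything else runs on the server
    X :0 -query server.example.ph

As far as I understand, all of the X traffic (keystrokes included) then
crosses the LAN unencrypted, which is part of why I am unsure how safe
this is.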

The third alternative I've been hearing about on the list lately. I do not
know why this is being done instead of the second alternative, so I'd like
to learn more about it. Off the top of my head, I would guess security is
the reason.
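
If I understand the idea correctly, each workstation runs its own X
server and display manager, and the applications are started on the
server over SSH with X11 forwarding, so the X traffic is encrypted in
transit. Something like (assuming X11Forwarding is enabled in the
server's sshd_config; the hostname is a placeholder):

    # from a workstation; ssh sets DISPLAY on the remote side for you
    ssh -X jijo@server.example.ph gnucash

Please correct me if that is not what people on the list have been doing.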

The server has the following hardware:

 o Intel Pentium III 733MHz
 o 512MB RAM
 o 96.6GB hard drive space via 3ware RAID5
 o 100Mbps NIC to switch, with option to channel bond two NICs if the
   switch can be upgraded

The server currently has the following roles:

 o Gateway/firewall to the Internet via PLDT DSL
 o Serving of all files via Samba, stored on XFS (will work on ACLs to
   provide functionality similar to Samba's even on local logins; see the
   sketch after this list)
 o E-mail server (Postfix, Courier IMAP, Horde IMP)
 o Database server (PostgreSQL)
 o Web server (small website, Horde IMP)
 o DNS server (BIND v9, might go djbdns if I like it and learn it)
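
Regarding the ACL note above: what I have in mind is POSIX ACLs on the
XFS volume, so that the group permissions people see through Samba are
also enforced for local logins. A rough sketch of the kind of thing I
mean (the group and path are made up):

    # give the accounting group read/write on an existing tree, and make
    # that the default ACL for files created under it later
    setfacl -R -m g:accounting:rwx /data/accounting
    setfacl -R -m d:g:accounting:rwx /data/accounting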

At present we cannot afford to have a second server to handle the
application serving. We may be able to afford a RAM upgrade, though. The
server currently has 2 x Apacer 256MB PC133 SDRAM, with two more slots
free. I do not know if I must install identically-sized SDRAM modules. If
this is the case then at least I can add two more for a total memory of
1GB. If not, it can go even higher: with two 512MB SDRAM modules the
total would be 1.5GB (although I think that would cost too much for me,
and I don't think I will need it).

I do not know if having remote X and/or the applications being loaded
locally will have severe security impacts. In particular I wonder how
Linux will handle frozen applications like, say, Netscape (although I'll
probably install Mozilla since the server has enough CPU for this). Unlike
workstations, the server cannot be rebooted whenever an application goes
down. I could teach the users to use kill, but I do not know how one
user's instance of an application affects another's that's loaded at the
same time.
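
For what it's worth, my understanding is that each user's copy of an
application is a separate process owned by that user, so killing one
person's stuck Netscape should not touch anyone else's. What I would
probably teach them is something along these lines (the username and PID
are placeholders):

    # list your own processes, find the stuck one, then kill it
    ps -u jijo -f
    kill 12345          # ask it nicely first (SIGTERM)
    kill -9 12345       # only if it refuses to die (SIGKILL)

I would be glad to hear if there are gotchas with this on a shared
application server.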

I hope some of you can help me out with my questions questions questions.
As always documentation that can help me out is most welcome.

Thanks in advance everyone!

 --> Jijo

--
Federico Sevilla III  :: [EMAIL PROTECTED]
Network Administrator :: The Leather Collection, Inc.

