Emir Prcic wrote:

We have planned on using optical backbones (one cable per floor), so for a building with 5 floors I would have 5 cables running vertically.

1. Do not worry too much about the hardware.
2. Do not make grandiose plans. Start small and simple and then keep adding to it.
3. Choose a ward near your office and start your IT work in it. That way it will be easier to stay in contact with the people from that ward. You need their trust and collaboration, so do not use the phone; go and meet them face to face. To get their trust you need to be near them, helping with the day-to-day problems that will come up.

separate the cables on each floor and plug them into a switch, thereby
getting a spanning-tree-protocol sort of connection.

4. Keep an eye on the cost/efficacy (c/e) ratio and the hospital administration will trust you.
5. Start with good, reliable CAT5 copper cable and off-the-shelf (Far East made) switches (24-port switches have the best c/e ratio).
6. Place at least one unpretentious server on each floor, or 1 server for each 30 LTSP workstations (http://www.ltsp.org/), if you have more than 30 workstations per floor.

6.1. As SERVER, try:
6.1.1. CPU: an AMD64 CPU.
6.1.2. RAM: 1 GB for the server plus 100 MB for each workstation. That
gives 1 GB + 3 GB (30 x 100 MB) for a 30-LTSP-workstation setting. If
you can afford it, go for ECC RAM chips (and a supporting motherboard),
as they are worth the extra 15% cost.
6.1.3. HD: SATA, 8 or 16 MB cache, with at least 160 GB each, as they
will now provide a better c/e ratio than SCSI or fiber optic devices.
6.1.4. Use a good and reliable PSU (I tend to prefer units of at least
400 W, with PFC regulation and 120 mm fans).
6.1.5. ALWAYS BACK THE SERVER WITH A GOOD UPS (500 W units from APC with
USB or RS232 feedback have worked for us). Additionally, the UPSes may be
used as good "stonith" devices if you ever think of going into cluster
networking.
6.1.6. ALWAYS LOCK THE SERVER (install iron rings in the box and close
them with a lock), INSIDE A LOCKED ROOM (lock the server room as if you
were locking up money).

6.2. As WORKSTATIONS:
6.2.1. Keep in mind that the equipment must be small, cheap, VERY quiet,
and without moving parts inside. No fans to make noise or to suck and
blow airborne pollutants/infectants around the ward.
6.2.2. The equipment that interfaces with the humans (keyboards and
pointing devices) should be easily cleanable and sterilizable.
Keep in mind that the average ward computer keyboard is 4 times more
infected than the average toilet seat in that ward.
Chances are that the toilet seat is cleaned with some kind of
antiseptic product at least once a day and... when was the last time
you saw a ward keyboard or mouse being sterilized?
And, as we all know, people just keep coming back to that keyboard after
manipulating all kinds of stuff inside that ward (I have seen people
with rubber gloves who, just after manipulating infectious material,
came to grab the mouse and punch the keyboard without even taking off
the gloves. Think of the innocent who needed to use it after that...)
6.2.3. CPU: any CPU in the 1 GHz range will be enough. You should favour
passively cooled (fanless) units.
6.2.4. RAM: 128 MB.
6.2.5. Onboard 100 Mb (or better) Ethernet card, PXE or Etherboot
compatible, in order to make it boot automatically from the remote
SERVER (see the dhcpd.conf sketch after this list).
6.2.6. No floppy disk, no hard disk, no CD/DVD drive. They have moving
parts inside that make them unreliable. They are noisy. They are a
security weakness. They are simply not needed once the workstation has
been set up and correctly boots from the server.
6.2.7. If you have special needs (like special biometric security
devices to identify users) you may deploy the needed algorithm on a USB
flash drive that you securely install and solder inside the
workstation box.
6.2.8. If you can afford it, go for wall-bolted, touch-sensitive LCD/TFT
screens.
6.2.9. If you can afford it, go for small touchpads instead of mice.
6.2.10. If you can afford it, specify sterilizable membranes (plastic or
silicone based) to cover the keyboards.
6.2.11. ALWAYS LOCK THE WORKSTATION (install iron rings in the box and
close them with a lock). Bolt the unit to a wall. It won't be in the way
if the personnel need to move a stretcher around quickly... and it will
be a lot harder for that unit to leave the ward on an unauthorized
holiday :-)
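
Since the PXE/Etherboot item above may sound abstract, here is a minimal
sketch of the dhcpd.conf stanza that boots such diskless workstations
from the floor server. All the addresses and LTSP paths below are just
assumptions for illustration; adapt them to your own network and LTSP
version:

    # /etc/dhcpd.conf -- minimal sketch for PXE/Etherboot LTSP clients
    # (all addresses and paths below are examples only)
    subnet 192.168.0.0 netmask 255.255.255.0 {
        range 192.168.0.100 192.168.0.199;              # pool for the ward workstations
        option routers 192.168.0.1;
        next-server 192.168.0.1;                        # TFTP server = the floor LTSP server
        filename "/lts/pxelinux.0";                     # boot loader fetched over TFTP
        option root-path "192.168.0.1:/opt/ltsp/i386";  # NFS root for the thin clients
    }

The workstation's boot ROM gets an address, pulls the kernel over TFTP
and then mounts its root filesystem read-only over NFS from the server,
so nothing needs to be stored on the client itself.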

7. The Software
Ah, the software! Do you need reliability or do you need Windows?
Please forget that. I never said that!

You will need something like:

7.1. NON-PROPRIETARY High Availability Clustering.

Try this first: "Getting Started with Linux-HA (heartbeat)" at
http://linux-ha.org/download/GettingStarted.html

7.1.1. Heartbeat (http://linux-ha.org/heartbeat/), also see
(http://wiki.trick.ca/linux-ha/GettingStartedWithHeartbeat)
7.1.2. DRBD (http://www.drbd.org/); just see it as a VERY RELIABLE
software RAID 1 system with an unbeatable cost/efficacy ratio.
7.1.3. DHCP (http://www.isc.org/); see DHCP under the Software folder of
that site. The ISC DHCP server will do both address assignment and
failover/load balancing for you. See also "DHCP Failover/load balancing".
7.1.4. MON (http://www.kernel.org/software/mon/), a general-purpose
scheduler and alert management tool used for monitoring service
availability and triggering alerts/actions upon failure detection.
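
To make those pieces less abstract, here are minimal configuration
sketches. All hostnames, addresses and resource/service names are just
assumptions for illustration. A two-node heartbeat (v1 style) pair that
fails over a virtual IP, the DRBD-backed filesystem and the database
could look like:

    # /etc/ha.d/ha.cf (same on both nodes)
    logfacility local0
    keepalive 2            # heartbeat interval, in seconds
    deadtime 30            # declare the peer dead after 30 s of silence
    bcast eth0             # heartbeats over the LAN; add a serial link if you can
    auto_failback on
    node server1
    node server2

    # /etc/ha.d/haresources (identical on both nodes)
    # server1 is the preferred owner of the virtual IP, the DRBD disc
    # and the database service
    server1 192.168.0.10 drbddisk::r0 Filesystem::/dev/drbd0::/data::ext3 postgresql

For the ISC DHCP failover pair, the heart of it is a "failover peer"
declaration that each address pool then references:

    # /etc/dhcpd.conf on the primary (the peer mirrors this with "secondary")
    failover peer "ward-dhcp" {
        primary;
        address 192.168.0.10;    port 647;
        peer address 192.168.0.11;    peer port 647;
        max-response-delay 60;
        max-unacked-updates 10;
        mclt 3600;
        split 128;           # share each pool 50/50 between the two servers
    }
    # ...and inside each pool declaration: failover peer "ward-dhcp";

And a MON fragment that pings the two servers every minute and mails you
when one stops answering:

    # /etc/mon/mon.cf
    hostgroup wardservers 192.168.0.10 192.168.0.11

    watch wardservers
        service ping
            interval 1m
            monitor fping.monitor
            period wd {Sun-Sat}
                alert mail.alert root@localhost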

That will all be very easy if you grab a copy of, for instance, Knoppix
(a Debian-based and easily installed Linux) at http://www.knoppix.org/
and install it; after that it is a simple matter of "apt-get install"
for the needed packages:
care2x
Heartbeat
DRBD
DHCP
MON
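
On a fresh Debian/Knoppix box that boils down to something like the
lines below. The exact package names vary with the Debian release (and
care2x itself is normally installed from its own tarball rather than
from the Debian archive), so treat this as a sketch:

    apt-get update
    apt-get install heartbeat drbd0.7-utils dhcp3-server mon postgresql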

- a good database; you choose! (I chose Postgres. 300,000-500,000
patients may lead to 20,000,000 records. Postgres is good for this.)

Yes, PostgreSQL is good for medical records.

I am a MySQL person... is Postgres better than MySQL for this amount of
data, and what about replication? If I have a RAID 10 with 4 x 146 GB
SCSI = 292 GB, isn't that enough for about 20,000,000 records??? What
about MaxDB??? Any experience with it?

Go for PostgreSQL.
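
A rough back-of-the-envelope check, assuming an average of 1-2 KB per
record: 20,000,000 records amount to some 20-40 GB of table data, and
even after doubling that for indexes and working space you stay far
below the 292 GB of your RAID 10, so disc space is not the issue. And
the DRBD + heartbeat combination described above already keeps a hot,
up-to-date copy of the whole database volume on the second node, which
answers most of the replication worry.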

Anyhow it is a HUGE amount of work even for a team, but you can handle it alone in more steps.

It certainly is. Are you sure that you will be able to do it alone? A preliminary check, based on the numbers you gave, seems to point to the need for a team of at least 6 very qualified, fully motivated persons. And they would need to be working full time for at least 6 to 12 months.

I am planning all the necessary steps and my biggest problem is the servers.

I am afraid that the servers will be your least problem, so don't worry too much about them... a long walk starts with a single step... just start working with whatever hardware your institution already has.

I want to build a bullet proof thing so that I can focus more on CARE2x.

How bullet proof? 90% uptime? 99%? 99.9%? Each nine that you add will at best linearly (sometimes exponentially) increase your expenses. Do see "The magic of nines" in Robertson's article "Highly-Affordable High Availability" at http://www.linux-mag.com/2003-11/availability_01.html
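
To put numbers on those nines: 99% uptime still allows about 87.6 hours (3.65 days) of downtime per year, 99.9% allows about 8.76 hours per year, and 99.99% only about 53 minutes per year. That is why each extra nine costs so much more than the previous one.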


Best regards, J. Antas



