IMHO the size of the DB is less relevant than the query workload. For
example, if you're storing 100 GB of data but only doing a single
index scan on it every 10 seconds, any modern machine with enough disk
space should be fine.
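To make that concrete, here is a back-of-envelope sketch. All throughput and latency figures below are my own assumptions for illustration, not numbers from this thread:

```python
# Why workload matters more than raw size: a full sequential scan of
# 100 GB at ~50 MB/s (assumed IDE-era disk bandwidth) blows the 10-second
# query budget by two orders of magnitude, while an index scan touches
# only a few pages regardless of total database size.

DB_SIZE_MB = 100 * 1024          # 100 GB database
DISK_MB_PER_S = 50               # assumed sequential read throughput
QUERY_INTERVAL_S = 10            # one query every 10 seconds

seq_scan_s = DB_SIZE_MB / DISK_MB_PER_S
print(f"sequential scan: {seq_scan_s:.0f} s")   # ~2048 s, ~200x over budget

# An index scan touches only a handful of 8 kB pages per lookup:
pages_touched = 4                    # assumed B-tree depth + heap fetch
index_scan_s = pages_touched * 0.01  # assumed ~10 ms per random seek
print(f"index scan: {index_scan_s * 1000:.0f} ms")
```

The point of the sketch is only the ratio: the sequential scan cost grows with database size, the index scan cost does not.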
I agree that the workload is likely to be the main issue in most cases.
Hi,
I have run some performance tests on 1 GB and 4 GB databases on a
single Pentium 4 with 1 GB RAM, using an IDE disk, SCSI disks, and a
RAID0 LUN on a DAS 5300, under Linux Red Hat 7.3.
In each case my tests ran selects, updates, and inserts.
One of them is pgbench. You can find it in Postgres/contrib/pgbench.
The
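For reference, a typical pgbench session looks roughly like this. The database name and scale factor are my own placeholders, not details from this thread:

```shell
# Initialize the pgbench schema in a scratch database at scale factor 10
# (scale 10 is roughly a 150 MB accounts table -- adjust to taste).
pgbench -i -s 10 bench

# Run 10 concurrent clients, 1000 transactions each, and report TPS.
pgbench -c 10 -t 1000 bench
```

This assumes a running PostgreSQL server and an existing database named `bench` (e.g. created with `createdb bench`); the reported transactions-per-second figure is what you would compare across the IDE, SCSI, and RAID0 configurations.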
LIANHE SHAO [EMAIL PROTECTED] writes:
We will have a very large database to store microarray data (it may
exceed 80-100 GB some day). We currently have 1 GB RAM, a 2 GHz
Pentium 4 with one CPU, and enough hard disk.
Could anybody tell me whether our hardware is an issue or not?
-
From: Neil Conway [EMAIL PROTECTED]
Date: Wednesday, November 26, 2003 10:03 pm
Subject: Re: [PERFORM] very large db performance question