Hello, I am working on a project that acquires real-time data from an external device; the data needs to be stored, searchable, and quick to retrieve. My application receives packets ranging from 300 to 5000 bytes every 50 milliseconds, for a minimum duration of 24 hours before the data is purged or archived off disk. There are several fields in the data that I'd like to be able to search on to retrieve records at a later time. Using a SQL database such as PostgreSQL or MySQL seems like it would make this task much easier.

My questions are: can a SQL database such as PostgreSQL handle this kind of activity, saving a 5000-byte record 20 times a second? And how well will it perform when searching a database containing nearly two million records totaling about 8-9 gigabytes, assuming I have adequate hardware? I am trying to determine whether a SQL database would work well for this, or whether I need to write my own custom database for this project.

If anyone has experience doing anything similar with PostgreSQL, I would love to hear about your findings.
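For what it's worth, here is a minimal sketch of what such a schema might look like in PostgreSQL. The field names (device_id, msg_type) are placeholders for whatever fields you actually need to search on; the raw packet goes in a bytea column:

```sql
-- Hypothetical schema sketch; device_id and msg_type are illustrative
-- stand-ins for the searchable fields in your data.
CREATE TABLE packets (
    id          bigserial   PRIMARY KEY,
    received_at timestamptz NOT NULL DEFAULT now(),
    device_id   integer     NOT NULL,   -- example searchable field
    msg_type    smallint    NOT NULL,   -- example searchable field
    payload     bytea       NOT NULL    -- raw packet, 300-5000 bytes
);

-- Index the columns you expect to search and purge by:
CREATE INDEX packets_received_at_idx ON packets (received_at);
CREATE INDEX packets_device_type_idx ON packets (device_id, msg_type);

-- A periodic 24-hour purge could then be a simple range delete:
DELETE FROM packets WHERE received_at < now() - interval '24 hours';
```

With an index on received_at, both the purge and time-range retrievals avoid scanning the whole table; partitioning the table by day is another common approach, since dropping an old partition is much cheaper than a bulk DELETE.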
Thanks,
Mark