I'm pleased to announce the release of pg_bulkload, a high-speed data loading utility. It was presented by the NTT OSS Center at the PostgreSQL Anniversary Summit in Toronto last summer. It is now available for download:
http://pgfoundry.org/projects/pgbulkload/

I would like to take this opportunity to thank all those who gave their time to help with the development, testing, translation, and packaging of this release.

Introduction (from README.pg_bulkload)
---------------------------------------
pg_bulkload provides high-speed data loading capability to PostgreSQL users. When we load a huge amount of data into a database, it is a common situation that the data set to be loaded is already valid and consistent; for example, dedicated tools are used to prepare such data, providing data validation in advance. In such cases, we would like to bypass any overheads within the database system and load the data as quickly as possible. pg_bulkload was developed to help in such situations. Therefore, it is not pg_bulkload's goal to provide detailed data validation. Rather, pg_bulkload assumes that the loaded data set has been validated by separate means. If you are not in such a situation, you should use the COPY command in PostgreSQL instead.

--
Toru SHIMOGAKI <[EMAIL PROTECTED]>
NTT Open Source Software Center
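
For readers whose data does still need checking at load time, the COPY alternative mentioned in the README excerpt looks roughly like the sketch below. The table name accounts and the file path /tmp/accounts.csv are placeholders for illustration, and the CSV option assumes PostgreSQL 8.0 or later.

    -- Server-side load of a comma-separated file into an existing table.
    -- COPY parses and type-checks every row, so malformed input is rejected.
    COPY accounts FROM '/tmp/accounts.csv' WITH CSV;

    -- Client-side equivalent from psql, for a file that lives on the client:
    \copy accounts FROM '/tmp/accounts.csv' WITH CSV

COPY keeps all of the usual per-row validation, which is exactly the overhead pg_bulkload is designed to bypass when the input has already been validated elsewhere.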