Hi.

I'm trying to run a script that executes multiple \copy commands to 
import about 5 GB of data. All the input files are computer generated, 
simple (5 numeric columns, with "\N" for NULL in some cases), use the 
default delimiters, and _should_ be error free. But I keep getting error 
messages for _some_ of the \copy commands.

e.g.:
pqReadData() -- backend closed the channel unexpectedly.
         This probably means the backend terminated abnormally
         before or while processing the request.
PQendcopy: resetting connection

Questions:

1. Are there any size restrictions on the input files?

2. How do I tell which file, or better yet which line, is tripping up 
the system? I could cut the list in half repeatedly until I find the 
problem, but that would be a huge waste of time given how long each 
import takes. I could set up more verbose logging on the backend, but 
will that make a mess if the error is 2 GB into the import?

3. I'm running the Windows/cygwin version of the psql client and a Linux 
backend, if that makes a difference.

Any help would be much appreciated.

-Xavier




