I've got a directory with ~14000 files in it.  I would like
to enter each file's name and contents into a database table
on another host, so I use a shell command to iterate through
the files.  Most of the inserts work, but others fail, as you
can see below.  Across multiple runs in the same directory,
some files that failed before succeed and some that succeeded
before fail.

Anybody got any ideas on this one?  Could this be a network
problem?  Or is it a Postgres issue?  Maybe my postmaster
is getting tied up with the quantity of inserts?

Here is my shell command (I've subbed a dummy hostname):

bash$ for file in `ls` ; do /usr/local/pgsql/bin/psql \
        -h  not.a.real.address.edu \
        -d pipeline \
        -c "INSERT INTO seq(seqname, sequence) values('$file', '`cat $file`')";
done


And here is the output:

INSERT 2246656 1
INSERT 2246688 1
INSERT 2246720 1
INSERT 2246752 1
INSERT 2246784 1
INSERT 2246816 1
INSERT 2246848 1
INSERT 2246880 1
INSERT 2246912 1
INSERT 2246944 1
INSERT 2246976 1
INSERT 2247008 1
INSERT 2247040 1
INSERT 2247072 1

Connection to database 'pipeline' failed.
connectDB() --  unknown hostname:  not.a.real.address.edu

Connection to database 'pipeline' failed.
connectDB() --  unknown hostname: not.a.real.address.edu
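
In case it helps to compare, a single-connection variant would look
something like the sketch below (it makes the same assumption as my
loop above, namely that the file names and contents contain no single
quotes).  Each pass of my original loop opens a fresh connection and
resolves the hostname again, so I wonder whether that is what makes
the failures intermittent.

bash$ for file in * ; do
        # emit one INSERT per file; every statement goes down one connection
        printf "INSERT INTO seq(seqname, sequence) VALUES ('%s', '%s');\n" \
               "$file" "$(cat "$file")"
      done | /usr/local/pgsql/bin/psql \
        -h not.a.real.address.edu \
        -d pipeline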

-- 
John Ziniti
Mass General Hospital
CVRC 149-4201
149 13th Street
Charlestown, MA 02129
tel (617) 726-4347
fax (617) 726-5806
[EMAIL PROTECTED]
