Nageshwar Rao wrote:
Hi,
Can you please let me know what parameters need to be adjusted in order to
complete this operation? It is still running, and when I checked the table
there were still no records in it. This table does not have any constraints.

That is odd, then. The only thing that should slow you down would be testing constraints against another table where there is no suitable index. Otherwise, 5000 rows should take about 100 times as long as 50 rows.
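For anyone who does have that problem: a foreign key forces a lookup in the referenced table for every row inserted, and without an index that lookup is a sequential scan. A sketch of the fix, with made-up table and column names:

-- hypothetical schema: orders.customer_id references customer(id)
-- if customer.id has no index, every row inserted into orders
-- scans the whole customer table to check the constraint
CREATE INDEX customer_id_idx ON customer (id);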


Checked the same operation with 4 records and it is fine. The only problem
is when I do a bulk insert of 5000 records.

Something strange is going on here. I can only think of two possibilities:

1. The table is locked, and the copy is waiting for the lock to be released. Even so, I would expect it to time out and return an error by now.
2. There is something wrong with the file.


Option #1
You should be able to see what processes are running from the command-line with:
ps auxw | grep postgres
One of the processes should be your copy - what does it say?
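Each backend rewrites its process title, so the copy should show up as something roughly like this (the user and database names here are placeholders):

postgres: nageshwar mydb [local] COPY waiting

If the title ends in "waiting", the backend is blocked waiting for a lock.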


Also, check the output of "top" and "vmstat 1" - is there any CPU or disk activity?
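Roughly what to look for in vmstat: the bi/bo columns are disk blocks read and written, and us/sy are CPU time. If they all sit near zero while the COPY is "running", the backend is waiting rather than working.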

If you were running a later version, you could check the locks directly, but I don't think that's possible with 7.2. If it is, "SELECT * FROM pg_locks" will show you any locks held.
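On 7.3 or later, something along these lines will narrow that down to just the locks that have not been granted (a sketch - the exact columns vary between versions):

-- list lock requests that are still waiting
SELECT relation, pid, mode, granted
FROM pg_locks
WHERE NOT granted;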

Option #2
If the data is not confidential, feel free to send me a copy of the table definition and import file and I'll take a look at it. Email it to me directly, since the mailing list doesn't like large attachments.


Failing that, try splitting the import file into sections. You could use something like:
split -l 500 input_file output_file
That will split "input_file" into 500-line chunks named output_fileaa, output_fileab, and so on. You'll need to copy the header/footer of the copy command onto each chunk too.
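For a pg_dump-style file, that means prepending the COPY line and appending the "\." terminator to every chunk. A rough shell sketch (the table name is made up):

# wrap each chunk in the COPY header and terminator
for f in output_file*; do
  { echo 'COPY mytable FROM stdin;'; cat "$f"; printf '%s\n' '\.'; } > "$f.sql"
done

Then feed the .sql files to psql one at a time and see which one misbehaves.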
--
Richard Huxton
Archonet Ltd

