On Fri, Nov 07, 2008 at 11:15:43AM -0500, Bruno Lavoie wrote:
> Is there a way to easily import a relatively huge text file into a table 
> column?

How big is "relatively huge"?

> I'd like to use psql and I've looked at the lo_* commands, but I 
> can't figure out how to import my text file into my TEXT column.

the "lo_*" commands are for working with large objects; these have
somewhat unusual semantics compared to the normal data in columns in
PG.  If you're routinely expecting files of more than, say, one MB then
they're probably a good way to go, but it's a lot more work getting them
going in the first place.
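
Just to illustrate (the table and column names below are made up),
from psql you can do something like:

  -- the column only stores the large object's OID
  CREATE TABLE docs (id serial PRIMARY KEY, body oid);

  -- \lo_import creates a new large object from the file and leaves
  -- its OID in psql's :LASTOID variable
  \lo_import '/path/to/file'
  INSERT INTO docs (body) VALUES (:LASTOID);

Getting the text back out again goes through \lo_export or the
server-side lo_* functions, which is most of the extra work I mean.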

> My last 
> solution is to write a little script to load my text file in a var and 
> then insert it into the database.

If you just want to get the data into a TEXT column as quickly as
possible, I'd probably write a little bit of code to perform the
escaping that PG requires on the file (in COPY's text format,
backslash, newline, carriage return and tab have to be written as
\\, \n, \r and \t).  You can then simply do:

  COPY tbl (col) FROM '/path/to/escaped/file';
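
If the file lives on the client machine rather than on the server
(server-side COPY also wants superuser rights), psql's \copy reads
the same escaped file from the client side instead:

  \copy tbl (col) from '/path/to/escaped/file'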

I'm not sure if this is really what you want though!  "Enormous" TEXT
columns can be a bit fiddly to work with.


  Sam

