Thus far the PostgreSQL COPY command seems to be doing the best job for the
task in terms of speed.  The BatchUpdate settings on the Remote View helped,
but it still takes a lot of time to push several million records into the
view and then update them back to the source table.  Now to learn a little
Python to help automate the process.  That's a task for another day; I need
to get back to some regular projects for a few weeks.  But despite the large
imports understandably taking some time, the VFP-to-PostgreSQL union is
doing great.  Once the DoNotCall table is populated with the records
imported from the FTC source files, pulling them out into a VFP CURSOR is
fast.  I am using a parameterized view to keep any temp table used for the
CURSOR from getting over VFP's 2GB limit.
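
For the automation step, here is a rough sketch of what the Python side
might look like.  Everything here is an assumption on my part -- the FTC
file layout (one "areacode,number" pair per line), the table name
(donotcall), and the column names are all placeholders, not the real
schema.  The idea is just to normalize the raw source file into clean CSV
that COPY will accept, then hand the cleaned file to psql:

```python
import csv
import io

def ftc_to_copy_csv(lines):
    """Normalize raw FTC Do-Not-Call lines (assumed format:
    'areacode,number' per line) into CSV text suitable for
    PostgreSQL's COPY ... FROM ... WITH (FORMAT csv)."""
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for line in lines:
        line = line.strip()
        if not line:
            continue                     # skip blank lines in the source file
        area, number = line.split(",", 1)
        writer.writerow([area, number])  # one row per registered number
    return out.getvalue()
```

The cleaned file could then be loaded server-side with something like
`psql -c "\copy donotcall(areacode, phone) FROM 'dnc.csv' WITH (FORMAT csv)"`
(again, hypothetical table and column names), which keeps the bulk load
entirely out of the Remote View path.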

Gil

> -----Original Message-----
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] Behalf Of Ed Leafe
> Sent: Saturday, April 07, 2007 6:12 PM
> To: [EMAIL PROTECTED]
> Subject: Re: PostgreSQL text file import performance
>
>
> On Apr 7, 2007, at 5:21 PM, mrgmhale wrote:
>
> > So, how are other folks handling large text/csv file imports into
> > remote
> > updateable views?
>
>       Have you looked into PostgreSQL's COPY command?
>
> http://www.postgresql.org/docs/current/static/sql-copy.html
>
> -- Ed Leafe
> -- http://leafe.com
> -- http://dabodev.com
>
[excessive quoting removed by server]

_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/profox
OT-free version of this list: http://leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: http://leafe.com/archives/byMID/profox/[EMAIL PROTECTED]
** All postings, unless explicitly stated otherwise, are the opinions of the 
author, and do not constitute legal or medical advice. This statement is added 
to the messages for those lawyers who are too stupid to see the obvious.
