Grant,

 

The most efficient way that I can think of off the top of my head is to
write a .Net / C++ / Java application that will buffer-read the file. I
came across this once before, and the only real solution is to write a
parser in a language that can file-seek through the file.
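To illustrate the idea, here is a minimal Java sketch of that buffered, line-at-a-time approach. The file path, the batch size, and the flush() method (a stand-in for a real batched JDBC insert via addBatch/executeBatch) are all hypothetical, and the comma split is naive (no quoted-field handling):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class CsvBatchReader {
    // Rows to accumulate before flushing a batch to the DB (tunable).
    static final int BATCH_SIZE = 1000;

    public static long process(String path) throws IOException {
        long total = 0;
        List<String[]> batch = new ArrayList<>(BATCH_SIZE);
        // BufferedReader streams the file; the whole CSV is never in memory.
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = in.readLine()) != null) {
                batch.add(line.split(","));   // naive split, no quoted fields
                if (batch.size() >= BATCH_SIZE) {
                    total += flush(batch);
                }
            }
            total += flush(batch);            // flush the final partial batch
        }
        return total;
    }

    // Hypothetical stand-in for a batched insert (e.g. JDBC executeBatch).
    static int flush(List<String[]> batch) {
        int n = batch.size();
        batch.clear();
        return n;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(process(args[0]) + " rows processed");
    }
}
```

Because only one batch of rows is ever held at a time, memory use stays flat no matter how big the CSV gets.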

 

I am not 100% sure, but I also believe there are tools out there that could
use the CSV as a datasource and migrate it that way; I am not sure, though,
whether they file-seek or hold the entire file open.

 

HTH

 

It's been 10 years since I had to write a program to do just that.

 



Andrew Scott
Senior Coldfusion Developer
Aegeon Pty. Ltd.
www.aegeon.com.au
Phone: +613 8676 4223
Mobile: 0404 998 273

 

 

From: cfaussie@googlegroups.com [mailto:[EMAIL PROTECTED] On Behalf
Of grant
Sent: Thursday, 14 June 2007 9:54 AM
To: cfaussie@googlegroups.com
Subject: [cfaussie] Large CSV's and server timeouts

 

Hi y'All

Here's the scenario: we have a CSV file that contains 1.5 million rows and
is provided by an external vendor. We need to either (a) parse the CSV and
insert it into a DB, or (b) do lookups directly on the CSV.

We do not and cannot have SQL*Loader. The DB is Oracle 10g. The servers are
old and under fairly high load all the time.

We've tried splitting the CSV into smaller chunks and inserting into the DB,
and also a SQL link straight into the CSV. Everything we try times out
and basically brings down the server.

Who can shed light on the most efficient way to solve this problem?
Thanks
Grant



