Re: Large CSV File

2007-06-12 Thread Phillip M. Vector
Is it all brand new records? Or will most of the data after the first load already be in there and need to be updated? I learned that the best way to parse it is to set up an upload once (using cffile) and then every week, load the CSV file into an array and then compare it with the data already
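The thread is about ColdFusion, but the compare step described above is language-neutral. A minimal sketch in Python, with an invented `product_id` key column and made-up sample data:

```python
import csv
import io

def split_rows(csv_text, existing):
    """Split incoming CSV rows into inserts vs. updates by primary key.

    existing: dict mapping product id -> the row currently stored.
    """
    inserts, updates = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        key = row["product_id"]      # hypothetical key column
        if key not in existing:
            inserts.append(row)      # brand new record
        elif existing[key] != row:
            updates.append(row)      # changed since the last load
    return inserts, updates

# Tiny demo with made-up feed data
feed = "product_id,price\nA1,10\nB2,20\n"
current = {"A1": {"product_id": "A1", "price": "9"}}
ins, upd = split_rows(feed, current)
```

Rows whose key is unknown go to the insert list; rows whose key exists but whose values changed go to the update list; unchanged rows are skipped entirely.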

RE: Large CSV File

2007-06-12 Thread Ben Forta
work (a scheduled stored procedure, a trigger on INSERT on that import table, or something like that). --- Ben
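A sketch of that import-table-plus-trigger idea, using SQLite (via Python's stdlib) in place of SQL Server; table and column names are invented, and SQLite's `INSERT OR REPLACE` stands in for whatever merge logic the real stored procedure would do:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products (id TEXT PRIMARY KEY, price REAL);
CREATE TABLE import_products (id TEXT, price REAL);

-- Trigger fires for each row landed in the import table and
-- upserts it into the live table.
CREATE TRIGGER import_merge AFTER INSERT ON import_products
BEGIN
    INSERT OR REPLACE INTO products (id, price)
    VALUES (NEW.id, NEW.price);
END;
""")

# Made-up feed: note A1 appears twice, so the second value wins.
rows = [("A1", 9.99), ("B2", 19.99), ("A1", 8.49)]
conn.executemany("INSERT INTO import_products VALUES (?, ?)", rows)
conn.commit()
```

The application code only ever bulk-loads the import table; the database itself moves the data into the live table, which is the point Ben is making.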

Re: Large CSV File

2007-06-12 Thread Crow T. Robot
DTS bulk import into SQL Server, then query away as usual. DTS is incredibly fast, and there are tutorials (Google "Pengoworks" and "DTS") on how exactly to set it up using stored procedures and CF. On 6/12/07, Les Mizzell [EMAIL PROTECTED] wrote:
> I've got a vendor that's going to FTP product data

RE: Large CSV File

2007-06-12 Thread Robert Rawlins - Think Blue
decent :-D I would import into the database and then do the manipulation there with stored procs for sure. Rob

Re: Large CSV File

2007-06-12 Thread Qasim Rasheed

Re: Large CSV File

2007-06-12 Thread Phillip M. Vector

Re: Large CSV File

2007-06-12 Thread Les Mizzell
> Les, what database are you using? *Crosses fingers he says something decent :-D
MySQL. I really see no way to do anything with this without doing a database import first. What's going to make it interesting is that the file is named according to date, so it will have a different name each

Re: Large CSV File

2007-06-12 Thread Tom Chiverton
On Tuesday 12 Jun 2007, Les Mizzell wrote:
> to date, so it will have a different name each time they ftp it over. I
Just grab the newest file from the directory. -- Tom Chiverton Helping to completely unleash fourth-generation data on: http://thefalken.livejournal.com

RE: Large CSV File

2007-06-12 Thread James Smith
> What's going to make it interesting is that the file is named according to date, so it will have a different name each time they ftp it over. I
Shouldn't be a problem, just have them ftp it to an empty directory then use a cfdirectory to get the name of the only file in the directory, do your
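The "grab whatever file is in the drop directory" step uses cfdirectory in the thread; the same logic in Python (filename below is made up) is just a listing plus a max over modification time, which also covers Tom's "grab the newest file" variant when the directory isn't emptied between loads:

```python
import os
import tempfile

def newest_file(directory):
    """Return the path of the most recently modified file in a directory."""
    paths = [os.path.join(directory, f) for f in os.listdir(directory)]
    paths = [p for p in paths if os.path.isfile(p)]
    return max(paths, key=os.path.getmtime)

# Demo in a throwaway drop directory with one date-named file
d = tempfile.mkdtemp()
open(os.path.join(d, "products-2007-06-12.csv"), "w").close()
print(os.path.basename(newest_file(d)))  # prints products-2007-06-12.csv
```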

RE: Large CSV File

2007-06-12 Thread Ben Forta

Re: Large CSV File

2007-06-12 Thread Jake Pilgrim
A number of excellent database-based solutions have already been posted (and a database solution is probably preferred), but don't overlook CFHTTP. CFHTTP is actually quite good at parsing a CSV and spooling it into a CFQUERY. Depending on exactly what you want to do with your data, this may be

Re: Large CSV File

2007-06-12 Thread James Wolfe
If you have MS SQL at your disposal, load the stuff into a table and query that. You can use DTS (though that's harder to do on the fly) or you can simply use the BULK INSERT command, which is super fast and very easy. If you don't have a database available to you, you can use ASP (let the
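BULK INSERT itself is T-SQL (roughly `BULK INSERT products FROM 'file.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')`). A cross-database sketch of the same idea, one batched statement instead of one round-trip per row, using SQLite and made-up feed data:

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id TEXT, price REAL)")

feed = io.StringIO("A1,9.99\nB2,19.99\nC3,4.50\n")  # made-up data
rows = [(r[0], float(r[1])) for r in csv.reader(feed)]

# One batched call instead of one INSERT per row -- the same
# batching idea that makes BULK INSERT fast.
conn.executemany("INSERT INTO products VALUES (?, ?)", rows)
conn.commit()
```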

Re: Large CSV File

2007-06-12 Thread Matt Robertson
And you can always drop to Java and read the little beast in, one line at a time, that way. You won't have any memory issues like you would if you try to read an 80MB file into a single cosmic-scale array. I have a similar monster-file situation and solved it this way, although mine is daily
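The memory-safe pattern Matt describes, stream the file instead of slurping it, doesn't need Java specifically. A Python sketch that yields small batches so only a bounded number of rows are ever in memory (batch size is arbitrary):

```python
import csv
import io

def stream_rows(fileobj, batch_size=500):
    """Yield CSV rows in small batches instead of reading the whole file.

    Memory stays flat for large files: at most batch_size rows are
    held at once, however big the CSV is.
    """
    reader = csv.reader(fileobj)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# Demo with a tiny in-memory file and batch_size=2
data = io.StringIO("a,1\nb,2\nc,3\n")
batches = list(stream_rows(data, batch_size=2))
```

Each batch can be inserted into the database and discarded before the next is read, so an 80MB file never lives in memory all at once.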

Re: Large CSV File

2007-06-12 Thread Dinner
On 6/12/07, Ben Forta wrote:
> > CF may be able to do the same thing, but the fact that Ben Forta didn't mention it doesn't bode well for that being true.
> It's doable, via the text ODBC driver, but it's slow, and the SQL is a pain. I still think the best option is to do this in the DBMS.