Depending on how big the files are and how quickly they need to be processed, I would recommend you build a scanner that toggles at least two flags.

Flag 1 is "flgQuoteRunOn" as true or false
Flag 2 is "flgEscapeChar" as true or false

put false into flgQuoteRunOn
put false into flgEscapeChar
put "/" into escChar
put quote into q
put comma into c

Now run a repeat loop to re-cast the strings into logical chunks.
Afterwards, use human inspection to find the spots that cannot be converted with logic loops (see the examples below).

Be careful of a few things:
 embedded commas, quotes, and escape chars

15446,"gold,silver watch", 599.00
15447,"gold,silver, 18" chain", 199.00
15447,"gold,silver, 18 1/2" chain", 199.00
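The two-flag scanner above can be sketched like this (in Python, for illustration only; the function name and parameters are my own, not from the post): walk the line character by character, toggle an "inside a quoted run" flag and an "escape" flag, and split only on commas that fall outside quoted runs.

```python
def split_csv_line(line, quote='"', sep=',', esc='/'):
    """Split one CSV line, honoring quoted runs and an escape character."""
    fields, chunk = [], []
    flg_quote_run_on = False   # are we inside a quoted run?
    flg_escape_char = False    # was the previous char the escape char?
    for ch in line:
        if flg_escape_char:
            chunk.append(ch)            # take the escaped char literally
            flg_escape_char = False
        elif ch == esc:
            flg_escape_char = True      # next char is literal
        elif ch == quote:
            flg_quote_run_on = not flg_quote_run_on
        elif ch == sep and not flg_quote_run_on:
            fields.append(''.join(chunk).strip())
            chunk = []
        else:
            chunk.append(ch)
    fields.append(''.join(chunk).strip())
    return fields

print(split_csv_line('15446,"gold,silver watch", 599.00'))
# the embedded comma inside the quotes is preserved:
# ['15446', 'gold,silver watch', '599.00']
```

Note that this handles the first example line, but not the second and third: the unescaped inch mark in `18" chain` toggles the quote flag at the wrong place, which is exactly why those spots need human inspection.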

Just so you know, CSV is the second-worst format ever invented.
They are still searching for the worst one, but have not found it yet.



On Dec 4, 2009, at 12:46 PM, David Coker wrote:

Hello folks,
I'm in the planning stages of a possible new app which will include populating a Rev Database (SQLite) primarily from a standard Excel-based CSV file. What I've run into while doing some research is that the format seems to leave a lot to be desired. It seems that the CSV data I'll be working with has all kinds of spurious line breaks and such embedded, so converting to a tab-delimited format doesn't work well.

Does anyone have any suggestions as to how to make something like this reliable as far as maintaining record integrity during import?


Jim Ault
Las Vegas


_______________________________________________
use-revolution mailing list
[email protected]
Please visit this url to subscribe, unsubscribe and manage your subscription 
preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
