Sorry, I made a mistake.
Actually, the data size is 805 vars, 118519 obs.
2006/1/5, ronggui wong <[EMAIL PROTECTED]>:
Thanks to all who responded with help.
This is my solution, using the language I am familiar with: R (http://www.r-project.org).
I used the code below to read an 11819x807 CSV file, and it took 10 minutes. I think that is
not too slow. (My PC: 1.7 GHz, 512 MB RAM)
#code begins
rm(list = ls())
f <- file("D:\\wvsevs_sb_v4.csv", "r")  # 134M
Arjen Markus wrote:
Hm, there is a CSV reading module in Tcllib, so one could contemplate
using Tcl instead of Perl for this. That ought to take care of the
quotes and other nastiness...
Perl's Text::CSV module available from CPAN also handles these issues.
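As an illustration of the quoting issues these modules solve (Python used here only as a compact, runnable stand-in; it is not part of the original thread), a quoted CSV field can contain the separator itself and doubled quotes, and a proper parser must unescape both:

```python
import csv
import io

# Sample data with quoted fields: one embedded comma, one doubled quote.
sample = '"a","b","c"\n"1","hello, world","say ""hi"""\n'

rows = list(csv.reader(io.StringIO(sample)))
print(rows)
# The embedded comma stays inside the field, and "" becomes a single ".
```

A naive `split(",")` would break the second row into four fields; that is the "nastiness" a real CSV parser handles for you.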
Jay Sprenkle wrote:
On 12/6/05, ronggui wong <[EMAIL PROTECTED]> wrote:
> I have a very large CSV file with 1 rows and 100 columns, and the
> file looks like the following:
> "a","b","c","d",
> "1","2","1","3",
> "3","2","2","1",
> ..
>
> If I use .import, it seems I have to set the variable names manually.
>
Aaron Peterson wrote:
On 12/7/05, Teg <[EMAIL PROTECTED]> wrote:
> Hello All,
>
> Wouldn't it make sense to write a program that reads it in, one line
> at a time, splits and inserts the data into the proper tables? Even
> creating the table on the fly? That's what I'd do, a little command
> line utility.
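The little utility Teg describes can be sketched as follows (an illustrative sketch in Python, not code from the thread; the file paths and table name are made up): read the header, create the table on the fly from it, then insert every row inside a single transaction, which is what makes the bulk load fast.

```python
import csv
import os
import sqlite3
import tempfile

def import_csv(csv_path, db_path, table):
    """Create `table` from the CSV header row, then bulk-insert the rest."""
    con = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                        # first row -> column names
        cols = ", ".join('"%s"' % c for c in header)
        marks = ", ".join("?" * len(header))
        con.execute('CREATE TABLE "%s" (%s)' % (table, cols))
        # executemany runs all inserts in one transaction; commit once at the end.
        con.executemany('INSERT INTO "%s" VALUES (%s)' % (table, marks), reader)
    con.commit()
    con.close()

# Demo with a throwaway two-row CSV (hypothetical paths).
d = tempfile.mkdtemp()
path = os.path.join(d, "data.csv")
with open(path, "w", newline="") as f:
    f.write("a,b,c\n1,2,3\n4,5,6\n")
import_csv(path, os.path.join(d, "data.db"), "t")
con = sqlite3.connect(os.path.join(d, "data.db"))
n = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

Committing once per file, rather than once per row, is the key to speed here; per-row commits force a disk sync for every INSERT.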
One could use PBDBMS on www.hellobasic.com, all through ADO.
- Original Message -
From: "Cariotoglou Mike" <[EMAIL PROTECTED]>
To: <sqlite-users@sqlite.org>
Sent: Wednesday, December 07, 2005 9:23 AM
Subject: RE: [sqlite] how can I import CSV file into SQLite quickly
sqlite3Explorer does that
From: John Stanton [mailto:[EMAIL PROTECTED]
Sent: Wed 07-Dec-05 8:00 AM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] how can I import CSV file into SQLite quickly
Someone somewhere must have a simple Perl script which does what you want.
JS
Robert L Cochran wrote:
It's basically the "convert to SQL" technique you suggest where
I create an SQL file that has contents like this:
[EMAIL PROTECTED] elections]$ cat insert_precinct.sql
BEGIN TRANSACTION;
INSERT INTO "precinct" VALUES(1, 'Community Center 15 Crescent Road', 3,
'Greenbelt', 'Maryland', 0);
INSERT INTO "precinct" VALUES(2, 'Police Station 550 Crescent Road',
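This convert-to-SQL step can itself be generated from the CSV (a sketch in Python, not from the thread; the file path is made up): emit BEGIN TRANSACTION, one INSERT per row with single quotes SQL-escaped, then COMMIT, and feed the result to the sqlite3 shell.

```python
import csv
import os
import tempfile

def csv_to_sql(csv_path, table):
    """Yield SQL statements that bulk-load csv_path into `table`."""
    yield "BEGIN TRANSACTION;"
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            # SQL-escape single quotes by doubling them.
            vals = ", ".join("'%s'" % v.replace("'", "''") for v in row)
            yield 'INSERT INTO "%s" VALUES(%s);' % (table, vals)
    yield "COMMIT;"

# Demo with a throwaway one-row CSV (hypothetical path).
path = os.path.join(tempfile.mkdtemp(), "precinct.csv")
with open(path, "w", newline="") as f:
    f.write("1,Community Center 15 Crescent Road\n")
stmts = list(csv_to_sql(path, "precinct"))
```

Wrapping everything in one transaction, exactly as in the hand-written insert_precinct.sql above, avoids a disk sync per statement.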
I have a very large CSV file with 1 rows and 100 columns, and the
file looks like the following:
"a","b","c","d",
"1","2","1","3",
"3","2","2","1",
..
If I use .import, it seems I have to set the variable names manually.
Is there any way to import the whole data file into SQLite quickly?