Re: [sqlite] problem with ".dump" -- not generating insert statements on biggish table (12K by 2.8K)

2009-01-05 Thread Webb Sprague
Hi again. Just hoping that someone might have an answer to my question about ".dump". I just tried it again on a different database with a similar result... -W On Sun, Jan 4, 2009 at 2:39 PM, Webb Sprague wrote: > Hi All, > > When I run ".dump" on a small tabl

Re: [sqlite] "meta command" via string via shell?

2009-01-05 Thread Webb Sprague
> If I understand correctly, all you need to do is write the desired > commands out to a text file, then either direct stdin to the text file, > or use the '.read' command. Yes, I could write the commands out to a file (ick!), but I don't really want to add four lines and a whole lot of I/O. I co

[sqlite] "meta command" via string via shell?

2009-01-05 Thread Webb Sprague
Hi list, I would like to set my ".mode tabs" and then run a command like so (yielding output separated by tabs instead of pipes): sqlite3 NLSY.db '.mode tabs; select * from DS0001 limit 1;' but this doesn't work because the .mode tabs isn't separated from the regular SQL. Could someone point out
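Later sqlite3 shells accept each dot-command as a separate argument (e.g. sqlite3 NLSY.db '.mode tabs' 'select * from DS0001 limit 1;'), though whether the poster's build supports that is version-dependent. The same tab-separated output can also be produced outside the shell entirely; here is a sketch using Python's standard-library sqlite3 module, with an in-memory stand-in for NLSY.db and an invented three-column version of DS0001:

```python
# Sketch: tab-separated query output without the shell's ".mode tabs",
# using Python's standard-library sqlite3 module. The database and table
# names (NLSY.db, DS0001) come from the post; the schema is invented and
# an in-memory database stands in so the example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")          # stand-in for NLSY.db
conn.execute("CREATE TABLE DS0001 (a, b, c)")
conn.execute("INSERT INTO DS0001 VALUES (1, 2, 3)")

for row in conn.execute("SELECT * FROM DS0001 LIMIT 1"):
    print("\t".join(str(v) for v in row))   # tabs instead of pipes
```

This sidesteps the dot-command parsing problem entirely, at the cost of leaving the sqlite3 shell.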

[sqlite] problem with ".dump" -- not generating insert statements on biggish table (12K by 2.8K)

2009-01-04 Thread Webb Sprague
Hi All, When I run ".dump" on a small table I created at the prompt, it works fine. But when I try to run .dump on a big table it only gives me the table schema, pauses, then stops (see below for example). It happens when I run in batch from the command line, too. I am able to select from the
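One way to sanity-check whether INSERT statements can be generated for the big table at all is to produce the dump programmatically rather than through the shell. This is a sketch using the iterdump() method of Python's standard-library sqlite3 module, with a tiny invented table standing in for the poster's 12K-row one; it is a cross-check, not a diagnosis of the shell's .dump behavior:

```python
# Sketch: generating a SQL dump programmatically with Python's
# standard-library sqlite3 module. iterdump() yields the schema plus one
# INSERT statement per row, much like the shell's ".dump". The table here
# is invented; substitute a connection to the real database file.
import sqlite3

conn = sqlite3.connect(":memory:")   # stand-in for the poster's database
conn.execute("CREATE TABLE t (x)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(3)])
conn.commit()

for line in conn.iterdump():
    print(line)                      # BEGIN, CREATE TABLE, INSERTs, COMMIT
```

If iterdump() emits the INSERTs but the shell's .dump does not, that points at the shell build rather than the data.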

[sqlite] Transaction within script

2009-01-01 Thread Webb Sprague
Hi all, I have a script containing the following, which works fine except that at the end of the script I get "SQL error near line 5: cannot commit - no transaction is active". The table specified in the file on line 2 gets created just fine, and populated just fine on line 4. I am using sqlite
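One common cause of that message (a guess, not a diagnosis of this particular script) is a COMMIT executing after the transaction has already been closed, for example by an earlier error or an implicit commit. The intended create-then-populate flow can be sketched with Python's standard-library sqlite3 module; the table name and columns are invented, and isolation_level=None makes BEGIN/COMMIT behave as they would in a shell script:

```python
# Sketch: explicit BEGIN/COMMIT around a create-and-populate step, using
# Python's standard-library sqlite3 module. isolation_level=None disables
# the module's implicit transaction handling, so the statements run as a
# shell script's would. The "survey" schema is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("BEGIN")
conn.execute("CREATE TABLE survey (id INTEGER, answer TEXT)")
conn.executemany("INSERT INTO survey VALUES (?, ?)", [(1, "yes"), (2, "no")])
conn.execute("COMMIT")

# A second COMMIT, with no transaction open, reproduces the post's error:
try:
    conn.execute("COMMIT")
except sqlite3.OperationalError as e:
    print(e)   # cannot commit - no transaction is active
```

The table is created and populated either way; the error only reports that the final COMMIT found no transaction left to close.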

Re: [sqlite] SQLITE_MAX_VARIABLE_NUMBER and .import for very wide file

2008-12-31 Thread Webb Sprague
See below. On Sun, Dec 28, 2008 at 11:56 PM, Chris Wedgwood wrote: > On Sun, Dec 28, 2008 at 11:49:34PM -0800, Webb Sprague wrote: > >> I am sure there is a better way to deal with 12K rows by 2500 columns, >> but I can't figure it out > > 2500 columns sounds

Re: [sqlite] SQLITE_MAX_VARIABLE_NUMBER and .import for very wide file

2008-12-29 Thread Webb Sprague
>> I am sure there is a better way to deal with 12K rows by 2500 columns, >> but I can't figure it out > > 2500 columns sounds like a nightmare to deal with > > could you perhaps explain that data layout a little? It is a download of a huge longitudinal survey (www.bls.gov/nls/nlsy79.htm) that

[sqlite] SQLITE_MAX_VARIABLE_NUMBER and .import for very wide file

2008-12-28 Thread Webb Sprague
Hi All, What are the ramifications of increasing SQLITE_MAX_VARIABLE_NUMBER, probably to ? I am trying to import a csv file from the National Longitudinal Study of Youth 79, and .import errors out, though creating the table worked ok. I am sure there is a better way to deal with 12K rows by
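Rather than raising SQLITE_MAX_VARIABLE_NUMBER (and likely SQLITE_MAX_COLUMN as well, which defaults to 2000), one way around a roughly 2500-column extract is to load it in "long" form: one (row, variable, value) triple per cell, so no statement ever needs thousands of columns or bound parameters. A minimal sketch in Python's standard-library sqlite3 module, with an invented miniature CSV standing in for the NLSY79 download:

```python
# Sketch: loading a very wide CSV (12K rows x ~2500 columns) in "long"
# (row_id, variable, value) form, avoiding SQLite's column and bound-
# parameter limits. The variable names and the tiny CSV are invented
# stand-ins for the NLSY79 extract.
import csv
import io
import sqlite3

csv_text = "R0000100,R0000200,R0000300\n1,52,3\n2,47,1\n"  # 3 columns, not 2500

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nlsy_long (row_id INTEGER, variable TEXT, value TEXT)")

reader = csv.reader(io.StringIO(csv_text))
header = next(reader)
for row_id, row in enumerate(reader, start=1):
    conn.executemany(
        "INSERT INTO nlsy_long VALUES (?, ?, ?)",
        [(row_id, var, val) for var, val in zip(header, row)],
    )
conn.commit()

# Any single variable comes back without a 2500-column schema:
print(conn.execute(
    "SELECT value FROM nlsy_long WHERE variable = 'R0000200' ORDER BY row_id"
).fetchall())
```

The trade-off is that cross-variable queries become joins or pivots, but imports and single-variable pulls stay well inside the default limits.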