Hi All,

When I run ".dump" on a small table I created at the prompt, it works
fine.  But when I try to run .dump on a big table, it gives me only the
table schema, pauses, then stops (see below for an example).  The same
thing happens when I run in batch from the command line.  I am able to
select from the large table just fine ('select count(*) from nlsy79;'
gives 25374, GROUP BY queries work fine, etc.).
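
For completeness, the batch invocation is along these lines (the
database file name here is just a placeholder, not my real one):

sqlite3 mydata.db ".dump NLSY79" > nlsy79.sql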

I am working with a modified 3.6.7 codebase (I edited various max_*
limits to be very large).  I can't imagine why that would matter,
unless there is something that won't dump unless it is under a certain
max.
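
For reference, my edits amount to roughly the same thing as overriding
the compile-time limits when building the amalgamation, something like
the following (the exact flags and values here are illustrative, not
the precise ones I used):

gcc -DSQLITE_MAX_COLUMN=32767 \
    -DSQLITE_MAX_SQL_LENGTH=1000000000 \
    shell.c sqlite3.c -o sqlite3 -lpthread -ldl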

(Note that, because of the nature of the problem, I am not looking for
solutions that require truncating the table or dropping columns.)

Example 1, first table, works fine:

sqlite> .dump blah
BEGIN TRANSACTION;
CREATE TABLE blah (i int, s text);
INSERT INTO "blah" VALUES(10,'blah blah');
COMMIT;

Example 2, second table; notice the lack of INSERT statements:

sqlite> .dump NLSY79
BEGIN TRANSACTION;
CREATE TABLE NLSY79 ( "Obs" integer
, "R0000100" integer
, "R0000149" integer
, "R0000150" integer
--... (SNIP some 2000+ columns)
 , "W0072300" integer
, "W0072400" integer
);
COMMIT;
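
In case anyone wants to reproduce this, a similarly wide table can be
generated with a quick shell sketch like the one below (untested, and
the table/column names are made up; it also needs a build with
SQLITE_MAX_COLUMN raised, like mine, since 2500 columns is over the
default limit of 2000):

# build a 2500-column table with a single row, then try to dump it
{ printf 'CREATE TABLE wide (c0 integer'
  for i in $(seq 1 2499); do printf ', c%d integer' "$i"; done
  printf ');\n'
  printf 'INSERT INTO wide VALUES(0'
  for i in $(seq 1 2499); do printf ',%d' "$i"; done
  printf ');\n'
} | sqlite3 wide.db
sqlite3 wide.db '.dump wide'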

Thanks for all the help!