Hi all,

I'm trying to import a large file of about 13GB using SQLite 3.7.6.3
on Ubuntu. I use the precompiled Linux binary.

The commands are:
=========================
.separator ";"
.import largefile.csv mytable
=========================

but then I get: Error: cannot open "largefile.csv"

I can view the file with "head" or "less", so there seems to be no
problem with readability or permissions. Moreover, I can (partially)
import the same file on a different Ubuntu system using the exact same
commands (but run out of storage space before the import completes).

So I searched the archives of this list and found two threads on this:

http://www.mail-archive.com/sqlite-users@sqlite.org/msg51574.html
http://www.mail-archive.com/sqlite-users@sqlite.org/msg48649.html

The first thread got no answers, but the second suggests either
splitting the file or recompiling sqlite3 with large file support
enabled. My understanding is that large file support has been switched
on by default since version 3.5.9, so that should not be the problem
(http://www.sqlite.org/changes.html). Splitting the file, however,
does seem to work around the problem. I would prefer not to have to
split the file first.
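In the meantime, a sketch of a workaround I'm considering: streaming the
CSV into the database through Python's built-in sqlite3 module instead of
the shell's .import, so the file is only ever read sequentially. The
function name, batch size, and column layout below are illustrative, not
anything from the sqlite3 shell itself:

```python
import csv
import sqlite3

def import_csv(db_path, csv_path, table, sep=";", batch=10000):
    """Stream a large separator-delimited file into an existing SQLite
    table in batches, as an alternative to the shell's .import."""
    conn = sqlite3.connect(db_path)
    with open(csv_path, newline="") as f:
        reader = csv.reader(f, delimiter=sep)
        rows = []
        placeholders = None
        for row in reader:
            if placeholders is None:
                # Build "?,?,...,?" once, from the first row's width.
                placeholders = ",".join("?" * len(row))
            rows.append(row)
            if len(rows) >= batch:
                conn.executemany(
                    "INSERT INTO %s VALUES (%s)" % (table, placeholders),
                    rows)
                rows.clear()
        if rows:  # flush the final partial batch
            conn.executemany(
                "INSERT INTO %s VALUES (%s)" % (table, placeholders),
                rows)
    conn.commit()
    conn.close()
```

This assumes the target table already exists with the right number of
columns; it just sidesteps whatever the precompiled shell binary is doing
when it opens the file.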

Any ideas on what causes this problem?
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users