On May 5, 2008, at 11:37 AM, Michael Lackhoff wrote:

> Hello,
>
> thanks to the flexibility of SQLite I can use a Sybase database dump
> as a source for import into my database. The only problem: the file
> to import is over 2GB and I get a "file not found" error. The file is
> there, it can be read by split and imported after the split, and the
> resulting SQLite database is over 2GB as well, so large file support
> doesn't seem to be a general problem. I wonder if there is a way to
> make this work in one go (without the split). It would speed things
> up and would save quite a lot of (temporary) disk space.
>
> I am using version 3.5.8 on Solaris 9.
>
> The job file looks like this:
> .separator ^A
> create table mytable (...);
> .import mydata.bcp mytable
>

Looking at the code, I do not see why this fails, assuming you have
large-file support turned on (which you seem to have, and it is the
default, after all). The ".import" code does fopen() then fgets() to
read the file. Does fopen() not work for large files on Solaris 9?
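
If you want to rule fopen() in or out, a tiny standalone test compiled
with and without large-file support might help. This is just a sketch:
the file name is only an example, and the LFS flags shown are the usual
ones (on Solaris, "getconf LFS_CFLAGS" prints the right set for your
system).

    /* lfs_test.c -- try to fopen() a >2GB file and read one line.
     * Build without LFS:  cc -o lfs_test lfs_test.c
     * Build with LFS:     cc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE \
     *                        -o lfs_test lfs_test.c
     */
    #include <stdio.h>
    #include <string.h>
    #include <errno.h>

    int main(int argc, char **argv){
      char line[4096];
      const char *fname = argc>1 ? argv[1] : "mydata.bcp";
      FILE *in = fopen(fname, "rb");
      if( in==0 ){
        /* On a 32-bit build without LFS, opening a >2GB file typically
         * fails here with EOVERFLOW ("Value too large for defined data
         * type"). */
        fprintf(stderr, "fopen(%s) failed: %s\n", fname, strerror(errno));
        return 1;
      }
      if( fgets(line, sizeof(line), in) ){
        printf("first line read OK (%d bytes)\n", (int)strlen(line));
      }
      fclose(in);
      return 0;
    }

If the non-LFS build fails but the LFS build reads the first line fine,
then rebuilding the shell with those flags should make the one-step
.import work.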

The code to do an import is not part of the core SQLite, btw. It is
part of the CLI. You can find the code by searching for "import" in
the shell.c source file.

D. Richard Hipp
[EMAIL PROTECTED]


