RE: [sqlite] Is there a method for doing bulk insertion?
Thanks!

-----Original Message-----
From: jphillip [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 19, 2006 4:24 PM
To: sqlite-users@sqlite.org
Subject: RE: [sqlite] Is there a method for doing bulk insertion?

Issue .help and look for .separator. For example, for a CSV file with
colon (:) separators, issue:

  .separator ':'

Alternatively, use an editor to change the existing separator
character(s) to the character you want to use.

-----------------------------------------------------------------------
To unsubscribe, send email to [EMAIL PROTECTED]
-----------------------------------------------------------------------
RE: [sqlite] Is there a method for doing bulk insertion?
Issue .help and look for .separator. For example, for a CSV file with
colon (:) separators, issue:

  .separator ':'

Alternatively, use an editor to change the existing separator
character(s) to the character you want to use.

On Tue, 19 Dec 2006, Anderson, James H (IT) wrote:

> So I can assume that there's no way to use a delimiter other than a
> comma to import a CSV file?

--
You have to be BRAVE to grow OLD.
There are no old CARELESS pilots or electricians.
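The same colon-separated import can also be done programmatically instead of
through the shell's .separator/.import commands. Here is a minimal sketch
using Python's stdlib sqlite3 and csv modules; the table name, columns, and
sample data are invented for illustration:

```python
# Programmatic equivalent of ".separator ':'" followed by ".import":
# parse colon-delimited lines with csv.reader, then bulk-insert them.
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")

# Stand-in for open("data.txt"); any iterable of lines works.
rows = csv.reader(["alice:30", "bob:25"], delimiter=":")

with conn:  # one transaction for the whole import
    conn.executemany("INSERT INTO people VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM people").fetchone()[0])  # → 2
```

The delimiter argument to csv.reader plays the role of the shell's
.separator setting.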
RE: [sqlite] Is there a method for doing bulk insertion?
Thanks!

-----Original Message-----
From: Jeff Godfrey [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 19, 2006 4:01 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] Is there a method for doing bulk insertion?

Take a look at the ".separator" command. It seems to be what you need...

Jeff
Re: [sqlite] Is there a method for doing bulk insertion?
Take a look at the ".separator" command. It seems to be what you need...

Jeff

----- Original Message -----
From: "Anderson, James H (IT)" <[EMAIL PROTECTED]>
To: <sqlite-users@sqlite.org>
Sent: Tuesday, December 19, 2006 2:52 PM
Subject: RE: [sqlite] Is there a method for doing bulk insertion?

> So I can assume that there's no way to use a delimiter other than a
> comma to import a CSV file?
RE: [sqlite] Is there a method for doing bulk insertion?
So I can assume that there's no way to use a delimiter other than a
comma to import a CSV file?

-----Original Message-----
From: jphillip [mailto:[EMAIL PROTECTED]
Sent: Tuesday, December 19, 2006 3:47 PM
To: sqlite-users@sqlite.org
Subject: RE: [sqlite] Is there a method for doing bulk insertion?

> .help pretty well sums it up.
RE: [sqlite] Is there a method for doing bulk insertion?
.help pretty well sums it up.

On Tue, 19 Dec 2006, Anderson, James H (IT) wrote:

> How do I find doc on .import?
>
> Is there a way to specify the delimiter for the CSV file?
RE: [sqlite] Is there a method for doing bulk insertion?
How do I find doc on .import?

Is there a way to specify the delimiter for the CSV file?

Thanks,

jim

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Monday, December 18, 2006 9:12 AM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] Is there a method for doing bulk insertion?

> The sqlite3 command-line shell has a ".import" command which can be
> used to read CSV data. [...]
RE: [sqlite] Is there a method for doing bulk insertion?
Thanks.

-----Original Message-----
From: Jay Sprenkle [mailto:[EMAIL PROTECTED]
Sent: Monday, December 18, 2006 8:32 PM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] Is there a method for doing bulk insertion?

> I created a C++ version for my own use. The source code is
> downloadable if that's of any help to you. See my sig line for the
> address.
Re: [sqlite] Is there a method for doing bulk insertion?
On 12/18/06, Anderson, James H (IT) <[EMAIL PROTECTED]> wrote:

> I was hoping there was the equivalent of Sybase's BCP program. I was
> also hoping something programmatic was available, i.e., not something
> from the command shell.

I created a C++ version for my own use. The source code is downloadable
if that's of any help to you. See my sig line for the address.

--
The PixAddixImage Collector suite:
http://groups-beta.google.com/group/pixaddix

SqliteImporter and SqliteReplicator: Command line utilities for Sqlite
http://www.reddawn.net/~jsprenkl/Sqlite

Cthulhu Bucks!
http://www.cthulhubucks.com
RE: [sqlite] Is there a method for doing bulk insertion?
I was hoping there was the equivalent of Sybase's BCP program. I was
also hoping something programmatic was available, i.e., not something
from the command shell. Maybe a little background would help.

I'm planning on using the Perl package DBD::SQLite. My department is a
big Sybase user, but because of the nature of our workload, we
experience a lot of contention in both the transaction log and tempdb
(the database that houses temporary tables). I'm investigating the
feasibility of transferring data into SQLite, doing all the data
manipulations there, and then transferring it back to the appropriate
Sybase tables. I suspect this could be a big win for a number of our
applications.

But if it can be avoided, I don't want to do a CSV conversion, nor do I
want to shell out of the code to invoke this.

jim

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Monday, December 18, 2006 9:12 AM
To: sqlite-users@sqlite.org
Subject: Re: [sqlite] Is there a method for doing bulk insertion?

> The sqlite3 command-line shell has a ".import" command which can be
> used to read CSV data. [...]
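The programmatic route described above — rows arriving in memory from another
database and bulk-inserted into SQLite without a CSV intermediate or shelling
out — can be sketched as follows. Python's stdlib sqlite3 stands in for
Perl's DBD::SQLite here, and the table and rows are invented for
illustration:

```python
# Bulk-insert an in-memory result set into SQLite inside a single
# transaction; batching the INSERTs this way avoids one fsync per row
# and is the usual way to make bulk loads fast.
import sqlite3

rows = [("widget", 3), ("gadget", 7), ("sprocket", 1)]  # pretend Sybase result set

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (item TEXT, qty INTEGER)")

with conn:  # one transaction around all the INSERTs
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

total = conn.execute("SELECT SUM(qty) FROM staging").fetchone()[0]
print(total)  # → 11
```

DBD::SQLite offers the same shape of API (prepare/execute with placeholders,
and explicit transactions), so the pattern carries over directly.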
Re: [sqlite] Is there a method for doing bulk insertion?
"Anderson, James H \(IT\)" <[EMAIL PROTECTED]> wrote:

> or do I have to create a gazillion insert statements?

The sqlite3 command-line shell has a ".import" command which can be used
to read CSV data. But the way this works internally is that the
command-line shell constructs an INSERT statement, parses each line of
the CSV file, binds the values to that INSERT statement, then runs the
INSERT statement for each line. So at the end of the day, a bunch of
INSERT statements are still getting evaluated - you just don't see them.

On my workstation, an INSERT statement can be parsed, compiled, and
evaluated in 25-40 microseconds. That's about 30,000 rows per second.
How much performance do you need?

--
D. Richard Hipp <[EMAIL PROTECTED]>
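The internal loop described above — construct one INSERT, then parse and bind
each CSV line to it — can be sketched in Python's stdlib sqlite3 module
(which caches compiled statements, approximating prepare-once, bind-per-row).
The table and data are invented for illustration:

```python
# What the shell's .import does internally, in miniature: one INSERT
# statement text, with each parsed CSV line bound and executed in turn.
import csv
import sqlite3

lines = ["1,apple", "2,banana", "3,cherry"]  # stand-in for the CSV file

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fruit (id INTEGER, name TEXT)")

insert_sql = "INSERT INTO fruit VALUES (?, ?)"  # constructed once
for fields in csv.reader(lines):                # parse each line
    conn.execute(insert_sql, fields)            # bind values, run the INSERT
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM fruit").fetchone()[0])  # → 3
```

As the message says, a bunch of INSERT statements still get evaluated — the
.import command just hides the loop.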