Re: [sqlite] Deleting duplicate records

2009-01-06 Thread Craig Smith
On Jan 6, 2009, at 6:14 PM, sqlite-users-requ...@sqlite.org wrote: > delete from talks where exists (select 1 from talks t2 where talks.member_id = t2.member_id and talks.date = t2.date and talks.rowid > t2.rowid); Igor, this worked fabulously, thank you very much. I also tried your
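For readability, here is the statement quoted above laid out on its own. It deletes every row for which an earlier copy (same member_id and date, smaller rowid) exists, so exactly one row per duplicate group survives.

    -- keep only the row with the lowest rowid in each (member_id, date) group
    DELETE FROM talks
     WHERE EXISTS (SELECT 1
                     FROM talks t2
                    WHERE talks.member_id = t2.member_id
                      AND talks.date      = t2.date
                      AND talks.rowid     > t2.rowid);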

[sqlite] CREATE INDEX performance

2009-01-06 Thread Christopher Mason
Hi. I have a database with a table having a text column that averages 7.1 characters per row; there are 11 million rows in this table. In one version of the source file that is used to populate this table, the file is laid out such that the column data is populated in sorted order; in
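For context, the setup being described is roughly the sketch below; the table and column names are hypothetical, since the post does not give the schema.

    -- hypothetical schema: one short text column (~7 characters on average), ~11 million rows
    CREATE TABLE entries (name TEXT);

    -- bulk load from the source file happens here, in either sorted or unsorted order

    -- the index build whose speed appears to depend on the input ordering
    CREATE INDEX entries_name_idx ON entries(name);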

Re: [sqlite] SQLite with NAS storage

2009-01-06 Thread Edward J. Yoon
Thanks. In more detail, SQLite is used for user-based applications (20 million app users), and MySQL is used for user location (file path on NAS) addressing. On Wed, Jan 7, 2009 at 1:31 PM, P Kishor wrote: > On 1/6/09, Edward J. Yoon wrote:

Re: [sqlite] SQLite with NAS storage

2009-01-06 Thread P Kishor
On 1/6/09, Edward J. Yoon wrote: > > Do you have 20 million sqlite databases? > > > Yes. Since all these databases are just files, you should stuff them into a Postgres database, then write an application that extracts the specific row from the pg database with 20 mil rows

Re: [sqlite] SQLite with NAS storage

2009-01-06 Thread Edward J. Yoon
> Do you have 20 million sqlite databases? Yes. On Wed, Jan 7, 2009 at 12:36 PM, Jim Dodgen wrote: > I think the question was about the structure of your data > > a sqlite database is a file and can contain many tables. tables can contain > many rows. > > Do you have 20 million

Re: [sqlite] SQLite with NAS storage

2009-01-06 Thread Jim Dodgen
I think the question was about the structure of your data. A sqlite database is a file and can contain many tables. Tables can contain many rows. Do you have 20 million sqlite databases? This information can help people formulate an answer. On Tue, Jan 6, 2009 at 6:14 PM, Edward J. Yoon

Re: [sqlite] SQLite with NAS storage

2009-01-06 Thread Edward J. Yoon
Thanks for your reply. > That's a lot of files. Or did you mean rows? > Are you sure? There can be many other reasons. There are a lot of files. So, I don't know exactly why at this time, but I thought network latency can't be ruled out. /Edward On Wed, Jan 7, 2009 at 4:07 AM, Kees Nuyt

Re: [sqlite] Deleting duplicate records

2009-01-06 Thread Igor Tandetnik
Craig Smith wrote: > By searching the archives of this list, I was able to come up with > this syntax to identify duplicate records and place a single copy of > each duplicate into another table: > > CREATE TABLE dup_killer (member_id INTEGER, date DATE); INSERT INTO >

Re: [sqlite] Deleting duplicate records

2009-01-06 Thread Kees Nuyt
On Tue, 6 Jan 2009 08:29:43 -0800, Craig Smith wrote in General Discussion of SQLite Database: > By searching the archives of this list, I was able to come up with this syntax to identify duplicate records and place a single copy of each

Re: [sqlite] Deleting duplicate records

2009-01-06 Thread Alexey Pechnikov
Hello! In a message dated Tuesday 06 January 2009 19:29:43, Craig Smith wrote: > By searching the archives of this list, I was able to come up with this syntax to identify duplicate records and place a single copy of each duplicate into another table: There is a simple way: dump your

Re: [sqlite] Deleting duplicate records

2009-01-06 Thread Brad Stiles
> CREATE TABLE dup_killer (member_id INTEGER, date DATE); INSERT INTO dup_killer (member_id, date) SELECT * FROM talks GROUP BY member_id, date HAVING count(*)>1; > But, now that I have the copies in the dup_killer table, I have not been able to discover an efficient way to go back to the

[sqlite] Deleting duplicate records

2009-01-06 Thread Craig Smith
By searching the archives of this list, I was able to come up with this syntax to identify duplicate records and place a single copy of each duplicate into another table: CREATE TABLE dup_killer (member_id INTEGER, date DATE); INSERT INTO dup_killer (member_id, date) SELECT * FROM talks
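As a side note, the INSERT quoted above relies on SELECT * lining up with dup_killer's two columns; a version with explicit column names would look like the sketch below (illustrative, not the poster's exact schema).

    CREATE TABLE dup_killer (member_id INTEGER, date DATE);

    -- one row per (member_id, date) combination that appears more than once in talks
    INSERT INTO dup_killer (member_id, date)
    SELECT member_id, date
      FROM talks
     GROUP BY member_id, date
    HAVING count(*) > 1;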

Re: [sqlite] Exporting database to CSV file

2009-01-06 Thread Alexey Pechnikov
Hello! In a message dated Tuesday 06 January 2009 15:33:42, Sylvain Pointeau wrote: > The import has the big limitation that it cannot import the file when a field spans multiple lines. I don't know if this is the same for the export... See the virtualtext extension from the spatialite project.

[sqlite] nested transactions

2009-01-06 Thread Chris Wedgwood
On Thu, Jan 01, 2009 at 08:19:01PM -0500, D. Richard Hipp wrote: > FWIW, nested transactions (in the form of SAVEPOINTs) will appear in > the next SQLite release, which we hope to get out by mid-January. Is that going to be 4.0.x then? I'm assuming there will need to be incompatible file format
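For reference, the SAVEPOINT form of nested transactions uses syntax like the sketch below (the table is illustrative, reusing talks from the other thread); as it turned out, savepoints shipped in the 3.6.x series rather than a 4.0 release.

    BEGIN;
    INSERT INTO talks VALUES (1, '2009-01-06');

    SAVEPOINT sp1;                 -- open a nested scope
    INSERT INTO talks VALUES (2, '2009-01-07');
    ROLLBACK TO sp1;               -- undo only the work done since the savepoint
    RELEASE sp1;                   -- discard the savepoint itself

    COMMIT;                        -- only the first insert is kept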

Re: [sqlite] Exporting database to CSV file

2009-01-06 Thread Sylvain Pointeau
The import has the big limitation that it cannot import the file when a field spans multiple lines. I don't know if this is the same for the export... Cheers, Sylvain On Tue, Jan 6, 2009 at 11:08 AM, MikeW wrote: > Jonathon writes: > > > > Awesome.
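For comparison, a CSV export from the sqlite3 shell typically looks like the sketch below (table and file names are illustrative); whether fields with embedded newlines round-trip back through .import is exactly the limitation being discussed.

    sqlite> .mode csv
    sqlite> .headers on
    sqlite> .output talks.csv
    sqlite> SELECT * FROM talks;
    sqlite> .output stdout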

Re: [sqlite] Exporting database to CSV file

2009-01-06 Thread MikeW
Jonathon writes: > > Awesome. Thanks for the quick reply Deech :) > > J "Awesome"? You must be easily impressed/scared! ;-) MikeW