Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Steven D'Aprano
On Sun, 22 Mar 2009 23:10:21 -0400, Terry Reedy wrote: Searching for a key in, say, 10 dicts will be slower than searching for it in just one. The only reason I would do this would be if the dict had to be split, say over several machines. But then, you could query them in parallel. That ...
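For illustration, a minimal sketch of the splitting discussed here: shard keys by hash into several smaller dicts (the shard count and helper name below are invented for the example). A single-process lookup must first pick a shard, so this is strictly more work than one big dict; any win would come from querying the shards in parallel on separate machines.

N_SHARDS = 10  # hypothetical shard count

def shard_for(key, shards):
    # hash() varies between runs in modern Python; use a stable hash
    # (e.g. hashlib) if shards live in separate processes or machines.
    return shards[hash(key) % len(shards)]

shards = [dict() for _ in range(N_SHARDS)]
shard_for('some_key', shards)['some_key'] = 'value'
print(shard_for('some_key', shards)['some_key'])  # -> 'value'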

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread John Machin
On Mar 23, 1:32 pm, per perfr...@gmail.com wrote: hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the ...

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Dave Angel
As others have said, a database is probably the right answer. There, the data is kept on disk, and only a few records at a time are read for each access, with modification transactions usually being synchronous. However, there are use cases where your approach makes more sense. And it ...
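One stdlib option in this spirit (an illustration, not necessarily what Dave had in mind) is shelve, which keeps a dict-like mapping on disk and reads only the records you touch; the filename below is made up for the example.

import shelve

db = shelve.open('bigdict.db')  # illustrative filename
db['some_key'] = [1, 2, 3]      # values are pickled to disk transparently
db.close()

db = shelve.open('bigdict.db')
print(db['some_key'])           # only this record is read back from disk
db.close()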

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Tim Chase
i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the dictionary every time i run my program. however currently the ...

Re: splitting a large dictionary into smaller ones

2009-03-23 Thread Steve Holden
per wrote: hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the dictionary every time i run my ...

splitting a large dictionary into smaller ones

2009-03-22 Thread per
hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the dictionary every time i run my program. however ...
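A minimal sketch of the approach this post goes on to describe, assuming the plan is to carve the big dict into fixed-size chunks and pickle each chunk to its own file (the chunk size, file names, and helper are invented here; the replies argue a database is the better tool):

import pickle

def pickle_in_chunks(big_dict, chunk_size=500000, prefix='chunk'):
    # Write big_dict out as a series of smaller pickles, chunk_size
    # keys per file, so no single pickle has to hold everything.
    chunk, index = {}, 0
    for key, value in big_dict.items():
        chunk[key] = value
        if len(chunk) >= chunk_size:
            with open('%s_%d.pkl' % (prefix, index), 'wb') as f:
                pickle.dump(chunk, f, pickle.HIGHEST_PROTOCOL)
            chunk, index = {}, index + 1
    if chunk:  # remainder
        with open('%s_%d.pkl' % (prefix, index), 'wb') as f:
            pickle.dump(chunk, f, pickle.HIGHEST_PROTOCOL)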

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Paul Rubin
per perfr...@gmail.com writes: "i would like to split the dictionary into smaller ones, containing only hundreds of thousands of keys, and then try to pickle them." That already sounds like the wrong approach. You want a database. -- http://mail.python.org/mailman/listinfo/python-list

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread odeits
On Mar 22, 7:32 pm, per perfr...@gmail.com wrote: hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the ...

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread per
On Mar 22, 10:51 pm, Paul Rubin http://phr...@nospam.invalid wrote: per perfr...@gmail.com writes: i would like to split the dictionary into smaller ones, containing only hundreds of thousands of keys, and then try to pickle them. That already sounds like the wrong approach. You want a ...

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Armin
On Monday 23 March 2009 00:01:40 per wrote: On Mar 22, 10:51 pm, Paul Rubin http://phr...@nospam.invalid wrote: per perfr...@gmail.com writes: i would like to split the dictionary into smaller ones, containing only hundreds of thousands of keys, and then try to pickle them. That ...

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Paul Rubin
per perfr...@gmail.com writes: "fair enough - what native python database would you recommend? i prefer not to install anything commercial or anything other than python modules" I think sqlite is the preferred one these days. -- http://mail.python.org/mailman/listinfo/python-list
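A minimal key/value sketch using the stdlib sqlite3 module along the lines Paul suggests; the database file, table, and column names are invented for the example.

import sqlite3

conn = sqlite3.connect('bigdict.sqlite')  # data lives on disk, not in RAM
conn.execute('CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)')
conn.executemany('INSERT OR REPLACE INTO kv VALUES (?, ?)',
                 [('alpha', '1'), ('beta', '2')])
conn.commit()

row = conn.execute('SELECT value FROM kv WHERE key = ?', ('alpha',)).fetchone()
print(row[0])  # -> '1'
conn.close()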

Re: splitting a large dictionary into smaller ones

2009-03-22 Thread Terry Reedy
per wrote: hi all, i have a very large dictionary object that is built from a text file that is about 800 MB -- it contains several million keys. ideally i would like to pickle this object so that i wouldn't have to parse this large file to compute the dictionary every time i run my program. ...