On Sun, 22 Mar 2009 23:10:21 -0400, Terry Reedy wrote:
Searching for a key in, say, 10 dicts will be slower than searching for
it in just one. The only reason I would do this would be if the dict
had to be split, say over several machines. But then, you could query
them in parallel.
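To make that trade-off concrete, here is a rough sketch (not from the thread; the shard layout and helper names are invented) of what lookups cost once a dict has been split. Scanning every shard adds a probe per miss, while a known split rule such as hash(key) % N lets you go straight to the owning shard, which is roughly what a distributed setup would do.

# Hypothetical illustration: one dict vs. the same data split into 10 shards.
big = dict(("key%d" % i, i) for i in range(1000))

shards = [{} for _ in range(10)]
for k, v in big.items():
    shards[hash(k) % 10][k] = v          # split by hash of the key

def lookup_scan(shards, key, default=None):
    # Naive lookup: probe shards one by one -- slower than a single dict.
    for shard in shards:
        if key in shard:
            return shard[key]
    return default

def lookup_routed(shards, key, default=None):
    # If the split rule is known, only the owning shard is probed.
    return shards[hash(key) % len(shards)].get(key, default)

assert lookup_scan(shards, "key42") == lookup_routed(shards, "key42") == big["key42"]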
On Mar 23, 1:32 pm, per perfr...@gmail.com wrote:
hi all,
i have a very large dictionary object that is built from a text file
that is about 800 MB -- it contains several million keys. ideally i
would like to pickle this object so that i wouldn't have to parse this
large file to compute the dictionary every time i run my program.
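For reference, a minimal sketch of the caching idea the post describes (the file names and line format are assumptions, not the poster's actual data): parse the big text file once, pickle the resulting dict, and on later runs load the pickle instead of re-parsing.

import os
import pickle   # on Python 2, cPickle is the faster drop-in replacement

CACHE = "parsed_dict.pkl"                 # hypothetical cache file

def load_table(source="huge_input.txt"):  # hypothetical source file
    if os.path.exists(CACHE):
        # Later runs: just unpickle, no re-parsing of the 800 MB file.
        with open(CACHE, "rb") as f:
            return pickle.load(f)
    table = {}
    with open(source) as f:
        for line in f:
            key, value = line.rstrip("\n").split("\t", 1)  # assumed tab-separated
            table[key] = value
    with open(CACHE, "wb") as f:
        # Binary protocol keeps the pickle smaller and faster to write/read.
        pickle.dump(table, f, pickle.HIGHEST_PROTOCOL)
    return table

Note that this only avoids re-parsing; the whole dict still has to fit in memory, which is where the database suggestions below come in.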
As others have said, a database is probably the right answer. There,
the data is kept on disk, and only a few records at a time are read for
each access, with modification transactions usually being synchronous.
However, there are use cases where your approach makes more sense.
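One standard-library way to get exactly that on-disk behaviour, sketched here as a suggestion rather than what was actually used, is the shelve module: a dict-like object backed by a dbm file, so only the records you actually touch are read from disk.

import shelve

# Build the on-disk mapping once; shelve keys must be strings,
# values can be any picklable object.
db = shelve.open("big_table.shelf")       # hypothetical file name
db["some_key"] = ["any", "picklable", "value"]
db.close()

# Later runs just reopen the file -- nothing is parsed or loaded up front.
db = shelve.open("big_table.shelf")
print(db.get("some_key"))
db.close()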
per perfr...@gmail.com writes:
i would like to split the dictionary into smaller ones, containing
only hundreds of thousands of keys, and then try to pickle them.
That already sounds like the wrong approach. You want a database.
per perfr...@gmail.com writes:
fair enough - what native python database would you recommend? i
prefer not to install anything commercial or anything other than
python modules
I think sqlite is the preferred one these days.
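A rough sketch of using the bundled sqlite3 module as a simple key/value store (table name and schema made up for illustration); it has shipped with Python since 2.5, so nothing beyond the standard library is needed.

import sqlite3

conn = sqlite3.connect("big_table.sqlite")   # hypothetical file name
conn.execute("CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)")

# Bulk-load inside one transaction; executemany keeps this reasonably fast.
rows = (("key%d" % i, "value%d" % i) for i in range(100000))
with conn:
    conn.executemany("INSERT OR REPLACE INTO kv VALUES (?, ?)", rows)

# Point lookups use the primary-key index instead of loading everything.
cur = conn.execute("SELECT value FROM kv WHERE key = ?", ("key42",))
print(cur.fetchone())
conn.close()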