On Aug 4, 2008, at 4:12 AM, Jörgen Grahn wrote:

> (You might want to post this to comp.lang.python rather than to me --
> I am just another c.l.p reader.  If you have already done so, please
> disregard this.)

Yeah, I hit "reply" by mistake and didn't realize it.  My bad.

>>> (I assume here that Berkeley DB supports 7GB data sets.)

>> If I remember correctly, BerkeleyDB is limited to a single file size
>> of 2GB.

> Sounds likely.  But with some luck maybe they have increased this in
> later releases?  There seem to be many competing Berkeley DB releases.

It's worth investigating, but that leads me to:

>>> I haven't caught the earlier parts of this thread, but do I
>>> understand correctly that someone wants to load a 7GB dataset into
>>> the form of a dictionary?

Yes, he claimed the dictionary was 6.8 GB.  How he measured that, I
don't know.
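
Coming back to the file-size question: if newer Berkeley DB releases
have raised the limit, the cheapest way to find out may be to just push
a few GB through Python's stdlib bindings and see what happens.  A
minimal sketch, assuming the Python 2 bsddb module (the file name and
sizes here are made up):

    import bsddb

    # Open (creating if necessary) an on-disk hash table.  The object
    # behaves like a dict, but keys and values must be byte strings.
    db = bsddb.hashopen('/tmp/bigtest.db', 'c')

    # Write ~4 GB in 1 MB chunks; if a 2 GB file limit still applies,
    # this should blow up partway through.
    chunk = 'x' * (1024 * 1024)
    for i in range(4 * 1024):
        db['key-%08d' % i] = chunk

    db.close()

If that works, the same dict-like interface would let him keep the 7GB
of key/value pairs on disk instead of in RAM.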


To the OP: how did you measure this?
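
In case it helps to compare notes: one common way to get such a number
is to look at the process's peak resident size before and after
building the dict.  A rough sketch, assuming a Unix box (the stdlib
resource module is not available on Windows, and the data below is a
stand-in for the real set):

    import resource

    def rss_kb():
        # Peak resident set size of this process.  On Linux ru_maxrss
        # is reported in kilobytes; on Mac OS X it is in bytes.
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    before = rss_kb()
    d = dict(('key-%d' % i, i) for i in xrange(10 ** 6))
    after = rss_kb()
    print('building the dict cost roughly %d kB' % (after - before))

If the 6.8 GB came from something like this, note that it measures the
whole process, not just the dict.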

--
Avi

