> -----Original Message-----
> From: python-dev-bounces+kristjan=ccpgames....@python.org
> [mailto:python-dev-bounces+kristjan=ccpgames....@python.org] On
> Behalf Of mar...@v.loewis.de
> Sent: 24. apríl 2012 17:44
> To: python-dev@python.org
> Subject: Re: [Python-Dev] cpython: Implement PEP 412: Key-sharing
> dictionaries (closes #13903)
> 
> > Benchmarks should measure memory usage too, of course.  Sadly that is
> > not possible in standard cPython.
> 
> It's actually very easy in standard CPython, using sys.getsizeof.
>
Yes, you can query each Python object about how big it thinks it is.
What I'm speaking of is more like:
start_allocs, start_mem = allocator.get_current()
allocator.reset_limits()
run_complicated_tests()

end_allocs, end_mem = allocator.get_current()

Print "delta blocks: %d, delta mem: %d"%(end_allocs-start_allocs, 
end_mem-start_mem)
print "peak blocks: %d, peak mem: %d"%allocator.peak()

 
> > Btw, this is of great interest to me at the moment, our Shanghai
> > engineers are screaming at the memory waste incurred by dictionaries.
> > A 10 item dictionary consumes 1/2k on 32 bits, did you know this?
> 
> I did.
> 
> In Python 3.3, this now goes down to 248 bytes (32 bits).
> 
I'm going to experiment with tunable parameters in 2.7 to trade performance for 
memory.  In some applications, memory trumps performance.
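
For reference, the per-dict figures above are easy to sanity-check with
sys.getsizeof alone; the exact number depends on pointer size and build, so
treat this as a ballpark check rather than a benchmark:

import sys

d = dict((i, i) for i in range(10))    # a 10-item dictionary
# print() written with parentheses so the same snippet runs on 2.7 and 3.3
print("10-item dict: %d bytes" % sys.getsizeof(d))
# On a 32-bit 2.7 build this comes out around the ~1/2 KB mentioned above;
# on a 32-bit 3.3 build it should be in the region of the 248 bytes quoted.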

K
