On Wednesday 08 December 2010 01:32:12, david.bri...@ubs.com wrote:
> Hi
> 
> I have loaded around 2,200 tables totalling 1.1 GB (I have a lot more
> to load - the CSV size is 12 GB).  The load crashes with a memory
> error:
[clip]

>   File "c:\python26qv\lib\site-packages\tables\table.py", line 687,
> in _get_container
>     return numpy.empty(shape=shape, dtype=self._v_dtype)
> MemoryError
> 
> 
> When I open the db I wish to cycle through each table and store in
> memory the attributes, the first row of the first column, and the
> last row of the last column, but no other table data.
> 
> If I open the db with NODE_CACHE_SLOTS=-10000, Python uses about
> 1 GB of RAM and then stops with a memory error; the same happens for
> NODE_CACHE_SLOTS=-2000.  If I run it with NODE_CACHE_SLOTS=2000, I
> still get a memory error, followed by a performance warning thus:

May I ask why you insist on putting so many slots in the cache?  Can you 
reproduce the problem when using the default (256)?
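
For the reading part, something like the sketch below (untested, and
assuming your file is called data.h5) should be enough to collect the
attributes plus the first and last values without keeping whole tables
in memory:

    import tables

    # Open read-only with the default node cache (NODE_CACHE_SLOTS=256).
    h5file = tables.openFile("data.h5", mode="r")

    summary = {}
    for table in h5file.walkNodes("/", classname="Table"):
        if table.nrows == 0:
            continue
        first = table[0][table.colnames[0]]    # first row, first column
        last = table[-1][table.colnames[-1]]   # last row, last column
        attrs = dict((name, getattr(table.attrs, name))
                     for name in table.attrs._v_attrnamesuser)
        summary[table._v_pathname] = (attrs, first, last)

    h5file.close()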

Also, you should double-check that you are not creating a leak in your 
own application (for example, by keeping references to data imported 
from CSV files).
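
If the error shows up during the load itself, try appending the CSV data
in modest chunks and dropping each chunk once it has been written.  A
rough sketch (the file names and the table path are just placeholders):

    import csv
    import tables

    h5file = tables.openFile("data.h5", mode="a")
    table = h5file.getNode("/some/table")   # placeholder path

    reader = csv.reader(open("input.csv", "rb"))
    chunk = []
    for row in reader:
        # In a real script you would convert the fields to the proper
        # types expected by the table description here.
        chunk.append(tuple(row))
        if len(chunk) >= 10000:
            table.append(chunk)   # write this chunk to disk
            table.flush()
            chunk = []            # drop the reference so it can be freed
    if chunk:
        table.append(chunk)
        table.flush()

    h5file.close()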

In case you are sure that the problem is on the PyTables side, could you 
please provide a self-contained script reproducing the problem?
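
By self-contained I mean a small script along these lines (the sizes are
just an example) that anybody can run and that ends with the reading
code that triggers the MemoryError:

    import tables

    class Record(tables.IsDescription):
        x = tables.Float64Col()
        y = tables.Float64Col()

    h5file = tables.openFile("repro.h5", mode="w")
    for i in range(2200):
        group = h5file.createGroup("/", "g%04d" % i)
        table = h5file.createTable(group, "t", Record)
        table.append([(1.0, 2.0)] * 10)
        table.flush()
    h5file.close()

    # ... plus the reading loop that fails on your side.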

Finally, when you post about a new subject, please do not reply to an 
existing (and unrelated) post; that is confusing.

Cheers,

-- 
Francesc Alted
