Hello PyTables Users,
My current implementation now works well and achieves the write speeds I am
looking for; however, after about 20 minutes of execution, once the file
reaches roughly 127 MB with level-3 Blosc compression, I start getting memory
allocation errors. The traces I get are below; if anybody can shed light on
this, that would be excellent. Does my implementation hog all of my memory?
Is there a memory leak?
HDF5-DIAG: Error detected in HDF5 (1.8.8) thread 0:
  #000: ..\..\hdf5-1.8.8\src\H5Dio.c line 266 in H5Dwrite(): can't write data
    major: Dataset
    minor: Write failed
  #001: ..\..\hdf5-1.8.8\src\H5Dio.c line 671 in H5D_write(): can't write data
    major: Dataset
    minor: Write failed
  #002: ..\..\hdf5-1.8.8\src\H5Dchunk.c line 1861 in H5D_chunk_write(): unable to read raw data chunk
    major: Low-level I/O
    minor: Read failed
  #003: ..\..\hdf5-1.8.8\src\H5Dchunk.c line 2776 in H5D_chunk_lock(): memory allocation failed for raw data chunk
    major: Resource unavailable
    minor: No space available for allocation
Exception in thread bookthread:
Traceback (most recent call last):
  File "C:\Python27bit\lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "../PyTablesInterface\Acceptor.py", line 21, in run
    BookDataWrapper.acceptDict()
  File "../PyTablesInterface\BookDataWrapper.py", line 50, in acceptDict
    tableD.append(dataArray)
  File "C:\Python27bit\lib\site-packages\tables\table.py", line 2081, in append
    self._saveBufferedRows(wbufRA, lenrows)
  File "C:\Python27bit\lib\site-packages\tables\table.py", line 2016, in _saveBufferedRows
    self._append_records(lenrows)
  File "tableExtension.pyx", line 454, in tables.tableExtension.Table._append_records (tables\tableExtension.c:4623)
HDF5ExtError: Problems appending the records.
##### THIS IS A LATER ERROR #####
Exception in thread CME_10_B:
Traceback (most recent call last):
  File "C:\Python27bit\lib\threading.py", line 551, in __bootstrap_inner
    self.run()
  File "C:\Users\jacob.bennett\development\MarketDataReader\IO\__init__.py", line 19, in run
    self.socket.rec()
  File "C:\Users\jacob.bennett\development\MarketDataReader\IO\MarketSocket.py", line 33, in rec
    Parser.parse(self.sock.recv(1024*16), self.exchange)
  File "../Parser\Parser.py", line 39, in parse
    SendInBatch.acceptBookData(instrumentId, timestamp, 0, i, bidPrice, bidQuant, bidOrders, exchange, source)
  File "../PyTablesInterface\SendInBatch.py", line 28, in acceptBookData
    maindict[(instrumentId, yearmonthday)] = [(timestamp1, timestamp2, side, level, price, quant, orders, source, 1)]
MemoryError
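For context, SendInBatch buffers rows in a dict keyed by (instrumentId, yearmonthday) before they reach the table. A simplified sketch of that pattern is below (the names, the threshold value, and the write_batch callback are all illustrative, not my exact code); the point is that if buffered rows are never evicted after a write, the dict grows without bound:

```python
from collections import defaultdict

# Buffered rows per (instrumentId, yearmonthday) key -- illustrative only.
maindict = defaultdict(list)

def accept_book_data(key, row, write_batch, threshold=10_000):
    """Accumulate rows for `key`; once `threshold` rows are buffered,
    hand the batch to `write_batch` (e.g. table.append + flush) and
    drop it from the dict so memory use stays bounded."""
    maindict[key].append(row)
    if len(maindict[key]) >= threshold:
        # pop() removes the buffered rows -- forgetting this step is the
        # kind of unbounded growth that would eventually raise MemoryError.
        write_batch(key, maindict.pop(key))
```

If the real code keeps every batch in the dict after appending it to the table (or only ever assigns a fresh list per key without writing it out), that alone would explain the steady memory growth over 20 minutes.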
Thanks,
Jacob Bennett
--
Jacob Bennett
Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Class of 2014| benne...@mit.edu
_______________________________________________
Pytables-users mailing list
Pytables-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/pytables-users