Hello Edward,
I don't think a document like the one you propose, translating HDF5 errors
into PyTables terms, exists. Your problem does appear to be an issue with
the Blosc filter, though. For a quick fix, you could try turning
compression off or switching to a different filter. If you want more
specific help, please post some code that reproduces the error.
Be Well
Anthony
On Mon, Dec 5, 2011 at 4:49 AM, PyTables Org <pytab...@googlemail.com> wrote:
> Forwarding to list. ~Josh.
>
> Begin forwarded message:
>
> *From: *pytables-users-boun...@lists.sourceforge.net
> *Date: *December 3, 2011 12:52:46 AM GMT+01:00
> *To: *pytables-users-ow...@lists.sourceforge.net
> *Subject: **Auto-discard notification*
>
> The attached message has been automatically discarded.
> *From: *"Edward C. Jones" <edcjo...@comcast.net>
> *Date: *December 3, 2011 12:50:13 AM GMT+01:00
> *To: *pytables-users@lists.sourceforge.net
> *Subject: **Giant HDF5/PyTables error message*
>
>
> The following gigantic (but hard-to-interpret) error message was generated
> by this line of code:
>
> self.h5.root.sortlist.cols.score.createCSIndex()
>
> Is there a document on how to interpret HDF5 error messages in a PyTables
> context? What kinds of errors in my code produce these messages? Which
> styles of PyTables programming tend to lead to HDF5 errors? If I cannot
> figure out a way to analyze these messages, I will have to remove much of
> the PyTables from my code.
>
> (I wrapped long lines)
> HDF5-DIAG: Error detected in HDF5 (1.8.4-patch1) thread 4148012736:
> #000: ../../../src/H5Dio.c line 174 in H5Dread(): can't read data
> major: Dataset
> minor: Read failed
> #001: ../../../src/H5Dio.c line 404 in H5D_read(): can't read data
> major: Dataset
> minor: Read failed
> #002: ../../../src/H5Dchunk.c line 1733 in H5D_chunk_read():
> unable to read raw data chunk
> major: Low-level I/O
> minor: Read failed
> #003: ../../../src/H5Dchunk.c line 2742 in H5D_chunk_lock():
> data pipeline read failed
> major: Data filters
> minor: Filter operation failed
> #004: ../../../src/H5Z.c line 1017 in H5Z_pipeline():
> filter returned failure during read
> major: Data filters
> minor: Read failed
> #005: blosc/blosc_filter.c line 232 in blosc_filter():
> Blosc decompression error
> major: Data filters
> minor: Callback failed
> Traceback (most recent call last):
> File "./mrq.py", line 133, in <module>
> options.scoretask)
> File "/home/edcjones/imagedb/imgSeek/mrquery/DB.py", line 56, in __init__
> self.New(basename, topdirs, scoretask, excluded_dirs)
> File "/home/edcjones/imagedb/imgSeek/mrquery/DB.py", line 180, in New
> hashbin.HashBin(self)
> File "/home/edcjones/imagedb/imgSeek/mrquery/hashbin.py",
> line 74, in __init__
> self.mrq.reversesort_sortlist(self.quota)
> File "/home/edcjones/imagedb/imgSeek/mrquery/DB.py",
> line 951, in reversesort_sortlist
> self.h5.root.sortlist.cols.score.createCSIndex()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3437, in createCSIndex
> _blocksizes=_blocksizes, _testmode=_testmode, _verbose=_verbose)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3412, in createIndex
> tmp_dir, _blocksizes, _verbose)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 360, in _column__createIndex
> self.pathname, 0, table.nrows, lastrow=True, update=False )
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2405, in _addRowsToIndex
> [self._read(startLR, self.nrows, 1, colname)],
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 1759, in _read
> self.row._fillCol(result, start, stop, step, field)
> File "tableExtension.pyx", line 1134,
> in tables.tableExtension.Row._fillCol (tables/tableExtension.c:10214)
> File "tableExtension.pyx",
> line 548, in tables.tableExtension.Table._read_records
> (tables/tableExtension.c:5400)
> tables.exceptions.HDF5ExtError: Problems reading records.
> Closing remaining open files: /extra/thumbs/BrianYoung_data/temp.h5... done
> /extra/thumbs/BrianYoung_data/pytables-HsQPfE.tmp... done
> /extra/thumbs/BrianYoung_data/mrq.h5...
> HDF5-DIAG: Error detected in HDF5 (1.8.4-patch1) thread 4148012736:
> #000: ../../../src/H5Dio.c line 174 in H5Dread(): can't read data
> major: Dataset
> minor: Read failed
> #001: ../../../src/H5Dio.c line 404 in H5D_read(): can't read data
> major: Dataset
> minor: Read failed
> #002: ../../../src/H5Dchunk.c line 1733 in H5D_chunk_read():
> unable to read raw data chunk
> major: Low-level I/O
> minor: Read failed
> #003: ../../../src/H5Dchunk.c line 2742 in H5D_chunk_lock():
> data pipeline read failed
> major: Data filters
> minor: Filter operation failed
> #004: ../../../src/H5Z.c line 1017 in H5Z_pipeline():
> filter returned failure during read
> major: Data filters
> minor: Read failed
> #005: blosc/blosc_filter.c line 232 in blosc_filter():
> Blosc decompression error
> major: Data filters
> minor: Callback failed
> Error in atexit._run_exitfuncs:
> Traceback (most recent call last):
> File "/usr/lib/python2.6/atexit.py", line 24, in _run_exitfuncs
> func(*targs, **kargs)
> File "/usr/local/lib/python2.6/dist-packages/tables/file.py",
> line 2337, in close_open_files
> fileh.close()
> File "/usr/local/lib/python2.6/dist-packages/tables/file.py",
> line 2141, in close
> self.root._f_close()
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 968, in _f_close
> self._g_closeDescendents()
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 918, in _g_closeDescendents
> lambda path: aliveNodes[path])
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 899, in closeNodes
> node._f_close()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2764, in _f_close
> self.flush()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2711, in flush
> self.reIndexDirty()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2579, in reIndexDirty
> self._doReIndex(dirty=True)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2548, in _doReIndex
> indexedrows = indexcol._doReIndex(dirty)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3459, in _doReIndex
> kind=kind, optlevel=optlevel, filters=filters))
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3412, in createIndex
> tmp_dir, _blocksizes, _verbose)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 360, in _column__createIndex
> self.pathname, 0, table.nrows, lastrow=True, update=False )
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2405, in _addRowsToIndex
> [self._read(startLR, self.nrows, 1, colname)],
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 1759, in _read
> self.row._fillCol(result, start, stop, step, field)
> File "tableExtension.pyx",
> line 1134, in tables.tableExtension.Row._fillCol
> (tables/tableExtension.c:10214)
> File "tableExtension.pyx",
> line 548, in tables.tableExtension.Table._read_records
> (tables/tableExtension.c:5400)
> HDF5ExtError: Problems reading records.
> Error in sys.exitfunc:
> Traceback (most recent call last):
> File "/usr/lib/python2.6/atexit.py", line 24, in _run_exitfuncs
> func(*targs, **kargs)
> File "/usr/local/lib/python2.6/dist-packages/tables/file.py",
> line 2337, in close_open_files
> fileh.close()
> File "/usr/local/lib/python2.6/dist-packages/tables/file.py",
> line 2141, in close
> self.root._f_close()
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 968, in _f_close
> self._g_closeDescendents()
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 918, in _g_closeDescendents
> lambda path: aliveNodes[path])
> File "/usr/local/lib/python2.6/dist-packages/tables/group.py",
> line 899, in closeNodes
> node._f_close()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2764, in _f_close
> self.flush()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2711, in flush
> self.reIndexDirty()
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2579, in reIndexDirty
> self._doReIndex(dirty=True)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2548, in _doReIndex
> indexedrows = indexcol._doReIndex(dirty)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3459, in _doReIndex
> kind=kind, optlevel=optlevel, filters=filters))
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 3412, in createIndex
> tmp_dir, _blocksizes, _verbose)
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 360, in _column__createIndex
> self.pathname, 0, table.nrows, lastrow=True, update=False )
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 2405, in _addRowsToIndex
> [self._read(startLR, self.nrows, 1, colname)],
> File "/usr/local/lib/python2.6/dist-packages/tables/table.py",
> line 1759, in _read
> self.row._fillCol(result, start, stop, step, field)
> File "tableExtension.pyx",
> line 1134, in tables.tableExtension.Row._fillCol
> (tables/tableExtension.c:10214)
> File "tableExtension.pyx",
> line 548, in tables.tableExtension.Table._read_records
> (tables/tableExtension.c:5400)
> tables.exceptions.HDF5ExtError: Problems reading records.
>
> ------------------------------------------------------------------------------
> All the data continuously generated in your IT infrastructure
> contains a definitive record of customers, application performance,
> security threats, fraudulent activity, and more. Splunk takes this
> data and makes sense of it. IT sense. And common sense.
> http://p.sf.net/sfu/splunk-novd2d
> _______________________________________________
> Pytables-users mailing list
> Pytables-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/pytables-users