On Mon, 2006-11-06 at 16:04 +0100, Alexandre Fayolle wrote:
> Hi,
>
> I have developed an application using pytables as backend storage,
> which runs fine most of the time, but we have received a number of
> reports mentioning error messages such as:
>
> 10:27:20 ERROR : Error during projection HDF5ExtError: Can't set attribute
> 'reductor' in node: /projections/EroDilSpeedCurve-01-Images-05-01 (Group) ''.
> Traceback (most recent call last):
>   File "D:\castext\castext-0.6.0\surfcar\gui\castext_researchlab.py", line 541, in doProjectLocally
>     surfaces):
>   File "D:\castext\castext-0.6.0\surfcar\surfaceset.py", line 371, in project_locally
>     self.add_projection(reductor, projected, naxis, descriptor_name, surfaces)
>   File "D:\castext\castext-0.6.0\surfcar\surfaceset.py", line 879, in add_projection
>     group._v_attrs.reductor = reductor
>   File "C:\Program Files\Python24\lib\site-packages\tables\AttributeSet.py", line 363, in __setattr__
>     self._g__setattr(name, value)
>   File "C:\Program Files\Python24\lib\site-packages\tables\AttributeSet.py", line 293, in _g__setattr
>     self._g_setAttr(name, value)
>   File "hdf5Extension.pyx", line 983, in hdf5Extension.AttributeSet._g_setAttr
>   File "hdf5Extension.pyx", line 727, in hdf5Extension.AttributeSet._g_setAttrStr
> HDF5ExtError: Can't set attribute 'reductor' in node:
> /projections/EroDilSpeedCurve-01-Images-05-01 (Group) ''.
>
> I have not been able to reproduce the behaviour locally yet.
>
> reductor is a Python object. Are there any limitations to the size of an
> object stored as an attribute in a tables file? I would be very
> grateful for any additional hint towards a solution.
Well, I'm not completely sure about this. From the HDF5 User's Guide
(http://hdfgroup.com/HDF5/doc/UG/), in the "Attributes" chapter:

"""
Note: Attributes are small datasets but not separate objects; they are
contained within the object header of a primary data object. As such,
attributes are opened, read, or written only with H5A functions.
"""

Further down in the same chapter, under the "Special issues" section, one
can read:

"""
How small is small and how large is large are not defined by the library;
it is left to the user's interpretation. (In considering attributes and
size, the HDF5 development team has considered attributes to be up to 16K,
but this has never been set as a design or implementation limit.)
"""

So this is not completely clear, but it seems to me that trying to save
attributes larger than 16 KB can lead to problems. How large is the pickled
"reductor" object? You can compute it by issuing
len(pickle.dumps(reductor_object)). In any case, if attributes are very
large, it is better to keep them in a separate dataset.

Cheers,

--
Francesc Altet    |  Be careful about using the following code --
Carabos Coop. V.  |  I've only proven that it works,
www.carabos.com    |  I haven't tested it.  --  Donald Knuth

_______________________________________________
Pytables-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/pytables-users
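P.S. A quick way to check whether an attribute value is in that informal
16 KB danger zone is to measure its pickled size before storing it. The
sketch below uses a made-up dictionary as a stand-in for the application's
actual reductor object (I don't know its real contents), so treat the
numbers as illustrative only:

```python
import pickle

# Hypothetical stand-in for the application's "reductor" object;
# the real one from the traceback is unknown to me.
reductor = {"name": "EroDilSpeedCurve", "params": list(range(10000))}

# This is exactly what PyTables pickles when you assign a generic
# Python object to a node attribute.
pickled = pickle.dumps(reductor)
size = len(pickled)

# The HDF5 team has informally assumed attributes stay under 16 KB,
# though the library sets no hard design or implementation limit.
LIMIT = 16 * 1024
if size > LIMIT:
    print("%d bytes: too large for an attribute; use a separate dataset" % size)
else:
    print("%d bytes: small enough to store as a node attribute" % size)
```

If the check fails, the pickled bytes can be written as a regular array
node (e.g. via File.createArray on a uint8 view of the string) instead of
as an attribute, which keeps the object header small.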
