Hello Shyam,
Can you please post the full traceback? In any event, I am fairly certain
that this error is coming from the np.fromiter step. The problem here is
that you are trying to read your entire SQL query result into a single numpy
array in memory. This is impossible because you don't have enough memory to
hold the whole result set at once.
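If that is the problem, one way around it is to avoid building the full result
set in memory at all: fetch the query in bounded chunks and append each chunk
to the table as you go. A rough sketch of the idea (the dtype, chunk size, and
the cursor/table names are placeholders rather than code from this thread;
cursor is whatever DB-API cursor the DB2 driver gives you and table is an
already-created PyTables table):

import numpy as np

# Placeholder column layout; use the real columns of the SQL query.
dt = np.dtype([("id", "i8"), ("name", "S64"), ("value", "f8")])

chunk_size = 100000
while True:
    rows = cursor.fetchmany(chunk_size)     # only chunk_size rows at a time
    if not rows:
        break
    table.append(np.array(rows, dtype=dt))  # write this chunk out to HDF5
table.flush()

This keeps peak memory proportional to chunk_size instead of to the size of
the whole result set.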
Hello Anthony,
Thank you for your suggestions. When I mentioned that I am reading the
data from a database, I meant a DB2 database, not an HDF5 database/file.
I followed your suggestions, so the code now looks as follows:
import tables

def createHDF5File():
    # "data.h5" is a placeholder; the filename argument was cut off in the original message
    h5File = tables.openFile("data.h5", mode="a")
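Presumably the function then goes on to create the table itself. A minimal
sketch of that step, in the PyTables 2.x spelling that matches openFile above
(the table name, title, and column layout are made up here, and h5File is the
file handle opened above):

import numpy as np

# Hypothetical columns; replace with the ones coming back from DB2.
dt = np.dtype([("id", "i8"), ("name", "S64"), ("value", "f8")])

# A numpy dtype can serve as the table description
# (a dict or an IsDescription subclass works as well).
table = h5File.createTable("/", "data", dt, title="rows from DB2")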
On Thu, Apr 11, 2013 at 5:16 PM, Shyam Parimal Katti wrote:
> Hello Anthony,
>
> Thank you for replying back with suggestions.
>
> In response to your suggestions, I am *not reading the data from a file
> in the first step, but instead a database*.
>
Hello Shyam,
To put too fine a point on it,
Hello Anthony,
Thank you for replying back with suggestions.
In response to your suggestions, I am *not reading the data from a file in
the first step, but instead a database*. I did try out your 1st suggestion
of doing a table.append(list of tuples), which took a little more than the
execution time
Hi Shyam,
The pattern that you are using to write to a table is basically one for
writing Python data to HDF5. However, your data is already in a machine /
HDF5 native format. Thus what you are doing here is an excessive amount of
work: read data from file -> convert to Python data structures -> convert
back to a machine-native format -> write to HDF5.
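Concretely, the contrast reads to me as roughly the following (a sketch with
made-up columns; cursor is the DB2 DB-API cursor and table an already-created
PyTables table, neither taken from this thread):

import numpy as np

dt = np.dtype([("id", "i8"), ("value", "f8")])  # hypothetical columns

rows = cursor.fetchall()  # Python tuples straight from the DB2 driver

# Either append the Python tuples and let the conversion happen inside
# table.append() ...
# table.append(rows)

# ... or convert once per batch into a structured array, so PyTables
# receives data that is already in a machine-native layout:
table.append(np.array(rows, dtype=dt))
table.flush()

For a large result set you would still fetch in chunks (fetchmany) rather
than fetchall, as in the earlier sketch.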