Thanks. I will try it out and post any findings.

Pushkar

On Thu, Jul 18, 2013 at 12:36 AM, Andreas Hilboll <li...@hilboll.de> wrote:

> You could use pandas_ and the read_table function. There, you have nrows
> and skiprows parameters with which you can easily do your own 'streaming'.
>
> .. _pandas: http://pandas.pydata.org/
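A minimal sketch of the chunked reading Andreas describes. Modern pandas exposes this directly via the chunksize parameter of read_csv (read_table is an older alias); the sample data here is made up for illustration:

```python
import io

import pandas as pd

# Stand-in for a large on-disk file; in practice you would pass a filename.
data = io.StringIO("a,b\n1,2\n3,4\n5,6\n7,8\n")

# chunksize makes read_csv return an iterator of DataFrames,
# so only one block of rows is in memory at a time.
total = 0
for chunk in pd.read_csv(data, chunksize=2):
    total += chunk["a"].sum()

print(total)  # 16
```

The same effect can be had with nrows/skiprows in a loop, as suggested above, but chunksize avoids re-opening and re-scanning the file on every iteration.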



On Thu, Jul 18, 2013 at 1:00 AM, Antonio Valentino <
antonio.valent...@tiscali.it> wrote:

> Hi Pushkar,
>
> Il 18/07/2013 08:45, Pushkar Raj Pande ha scritto:
> > Both loadtxt and genfromtxt read the entire data into memory which is not
> > desirable. Is there a way to achieve streaming reads?
> >
>
> OK, probably fromfile [1] can help you cook up something that works
> without loading the entire file into memory (and without too many
> iterations over the file).
>
> Anyway I strongly recommend that you not perform read/write cycles on
> single lines; rather, define a reasonable data block size (number of
> rows) and process the file in chunks.
>
> If you find a reasonably simple solution it would be nice to include it
> in our documentation as an example or a "recipe" [2].
>
> [1]
>
> http://docs.scipy.org/doc/numpy/reference/generated/numpy.fromfile.html#numpy.fromfile
> [2] http://pytables.github.io/latest/cookbook/index.html
>
> best regards
>
> antonio
>
>
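A minimal sketch of the fromfile-based chunking Antonio suggests, for a binary file of fixed-dtype records. The count parameter limits how many items each call reads, so memory stays bounded; the file contents here are made up for illustration:

```python
import os
import tempfile

import numpy as np

# Write a small binary file of float64 values as a stand-in for a large dataset.
values = np.arange(10, dtype=np.float64)
path = os.path.join(tempfile.mkdtemp(), "data.bin")
values.tofile(path)

# Read it back in blocks of 4 values; fromfile advances the file position,
# so repeated calls walk through the file without loading it all at once.
block_sums = []
with open(path, "rb") as f:
    while True:
        block = np.fromfile(f, dtype=np.float64, count=4)
        if block.size == 0:
            break
        block_sums.append(block.sum())

print(sum(block_sums))  # 45.0
```

Note fromfile needs a known dtype and works best on binary data; for text files the pandas approach above is usually simpler.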
_______________________________________________
Pytables-users mailing list
Pytables-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/pytables-users
