kalakouentin wrote:
I use Python to analyze my data, which is in text form. The
script is fairly simple. It reads a line from the input file, computes
what it must compute and then writes the result to a buffer/list. When the
whole input file has been processed (essentially all lines), the
algorithm goes ahead and writes the results one by one to the output file.
It works fine. But because of the continuous I/O it takes a lot of
time to execute.
I think that the output phase is more or less optimized (a loop that
walks the solutions list sequentially and inserts "\n" at the appropriate
intervals). Do you know a way to actually load my data in a more
"batch-like" way so I can avoid the constant line-by-line reading?
I guess I could read and store the whole text in a list, with each cell
being a line, and then process each line one by one again, but I
don't really think that would offer me a significant time gain.

Python already does that batch-like reading under the hood. There is probably nothing you could do to improve its speed.
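For what it's worth, an explicit batch read is only a couple of lines, but since the line-by-line loop is already buffered internally I would not expect it to be noticeably faster (a sketch, with a placeholder computation and made-up file name):

with open("input.txt") as infile:
    lines = infile.readlines()       # whole file in memory, one list entry per line

# same processing as before, just driven from the in-memory list
results = [line.strip().upper() for line in lines]   # placeholder computation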

But what makes you say your output is optimized? I think it's *far* more likely that the bottleneck is your processing or your output. Try this test (as a prelude to doing some *real* profiling):

Comment out your computation and output, then run it. I expect this will be fast.

Then put the computation back in and run. Is it slightly slower or much slower?

Then put the output back in and run.

If the results are not clear-cut, then try some real profiling.
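If it helps, here is a rough way to time the three phases separately without editing the script over and over (a sketch; compute() and the file names are placeholders for your real code):

import time

def compute(line):
    return line.strip().upper()          # stand-in for the real per-line work

t0 = time.time()
with open("input.txt") as infile:
    lines = list(infile)                 # reading only
t1 = time.time()

results = [compute(line) for line in lines]    # computation only
t2 = time.time()

with open("output.txt", "w") as outfile:
    outfile.write("\n".join(results))          # output only
t3 = time.time()

print("read:    %.3f s" % (t1 - t0))
print("compute: %.3f s" % (t2 - t1))
print("write:   %.3f s" % (t3 - t2))

For real profiling, the standard library's cProfile module gives a per-function breakdown, e.g. python -m cProfile -s cumulative yourscript.py.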

Gary Herron


Thanx in advance for taking the time to read this.
Pantelis



--
http://mail.python.org/mailman/listinfo/python-list
