On 2009-10-15 13:57, BDL wrote:
I have a large amount of binary data that needs to be pushed across
the network.  Profiling shows that packing the data dominates the run
time (about 50% of the total).  Here is a complete minimal example
that shows the problem.

from numpy import random
from struct import pack
import time

lenstim = 10084200
sigma = 0.1
stim = random.normal(0., sigma, lenstim)  # 10084200 Gaussian random doubles

fmt = '!h'+str(stim.size)+'d'  # makes fmt = '!h10084200d'
cmd = 4

startTime = time.time()
packdat = pack(fmt, cmd, *stim)
elapsed = time.time() - startTime
print "Time to pack the command and data %.6f seconds " % elapsed

Is there a faster method to do this?  Is it possible to use the
array module instead?  Any suggestions would be appreciated.

If you're already using numpy, use its routines to convert to string
representations (like stim.tostring(), but there are better
alternatives like the NPY file format).  Don't use struct.pack() for
large homogeneous data.
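
A minimal sketch of that approach, assuming the receiver expects the
same wire layout as the '!h10084200d' format above (one big-endian
short followed by big-endian doubles).  pack_fast is a hypothetical
helper name, not a numpy function:

import struct
import numpy as np

def pack_fast(cmd, stim):
    # 2-byte big-endian command header, as in the original format.
    header = struct.pack('!h', cmd)
    # astype('>f8') converts the whole array to big-endian doubles in
    # one C-level pass; tostring() then returns the raw bytes.  (On
    # newer numpy this method is spelled tobytes().)
    return header + stim.astype('>f8').tostring()

packdat = pack_fast(4, stim)

This produces the same byte string as pack('!h10084200d', 4, *stim),
but without first expanding the array into ten million Python float
objects.  If both ends of the connection are little-endian, you can
skip the astype() byte swap and send stim.tostring() directly.  For
the NPY route, numpy.save() accepts any file-like object, so you can
serialize into a cStringIO buffer and get the dtype and shape recorded
for free.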

If you have further numpy questions, you should ask them on the numpy mailing 
list:

  http://www.scipy.org/Mailing_Lists

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco
