Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-11 Thread David Cournapeau
Matthieu Brucher wrote:

  Python 2.6 seems to use VC 2005 Express, I don't know about py3000(?),
  with associated upgrade issues.
 But what if the next MS compiler again has a broken libc
 implementation?
 (Incidentally, VS2005 was not used for Python 2.5 because its libc was
 even more broken than the 2003 one):
 http://mail.python.org/pipermail/python-dev/2006-April/064126.html


 I don't know what he meant by a broken libc; if it is the fact that 
 there are a lot of deprecated standard functions, I wouldn't call it 
 broken (besides, this deprecation follows a technical paper that 
 describes the new safe functions, although the paper itself does not 
 deprecate the old ones).
If unilaterally deprecating standard functions which have existed for 
years is not broken, I really wonder what is :)

cheers,

David
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-11 Thread Matthieu Brucher

  I don't know what he meant by a broken libc; if it is the fact that
  there are a lot of deprecated standard functions, I wouldn't call it
  broken (besides, this deprecation follows a technical paper that
  describes the new safe functions, although the paper itself does not
  deprecate the old ones).
 If unilaterally deprecating standard functions which have existed for
 years is not broken, I really wonder what is :)


They are deprecated (although a simple flag can get rid of those
deprecation warnings), not removed. Besides, the deprecated functions are in
fact functions that can lead to security issues (for once Microsoft did
something not completely stupid on this topic), so telling the programmer
not to use them and to prefer the more secure ones may be seen as a good
idea (from a certain point of view).
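
For what it's worth, that "simple flag" can also be set from a distutils
build; a minimal sketch (untested, extension name hypothetical), assuming the
macro in question is VC8's _CRT_SECURE_NO_DEPRECATE:

# setup.py sketch: silence the "deprecated CRT function" warnings when
# compiling an extension against the VC 2005 runtime.
from distutils.core import setup, Extension

ext = Extension(
    "myext",                                         # hypothetical name
    sources=["myext.c"],
    define_macros=[("_CRT_SECURE_NO_DEPRECATE", "1")],
)

setup(name="myext", version="0.1", ext_modules=[ext])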

Matthieu
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-10 Thread Ray S
Thanks all:

At 10:00 AM 10/10/2007, Robert Kern wrote:
 Something like the following should suffice (untested, though 
I've done similar things with ctypes before):

I tested, successfully:

# nFromAddress.py

def fromaddress(address, dtype, shape, strides=None):
    """Create a numpy array from an integer address, a dtype
    or dtype string, a shape tuple, and possibly strides.
    """
    import numpy
    # Make sure our dtype is a dtype, not just "f" or whatever.
    dtype = numpy.dtype(dtype)

    class Dummy(object):
        pass
    d = Dummy()
    d.__array_interface__ = dict(
        data = (address, False),
        typestr = dtype.str,
        descr = dtype.descr,
        shape = shape,
        strides = strides,
        version = 3,
        )
    return numpy.asarray(d)

## Numeric example, with address kludge
import Numeric, numpy, ctypes, string
a0 = Numeric.zeros((1), Numeric.Int16)
# Parse the object's address out of the repr of a bound method, then read
# the data pointer out of the array struct (the +8 offset is what worked
# here; it depends on the Numeric build).
nAddress = int(string.split(repr(a0.__copy__))[-1][:-1], 16)
tmp = (ctypes.c_long*1)(0)
ctypes.memmove(tmp, nAddress+8, 4)
nAddress = tmp[0]
a1 = fromaddress(nAddress, numpy.int16, (1,))  ## explicit type
a0[0] = 5
print a1[0]

## numpy example
a2 = numpy.zeros(1, numpy.int16)
nAddress = a2.__array_interface__['data'][0]
nType = a2.__array_interface__['typestr']
nShape = a2.__array_interface__['shape']
a3 = fromaddress(nAddress, nType, nShape)
a2[0] = 5
print a3[0]

So, now with little effort the relevant info can be passed over 
pipes, shared memory, etc., and shared views created in other 
processes, since it consists of plain ints and strings rather than objects.
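
For illustration, a minimal sketch of how that info might be flattened to a
string for argv or a pipe and rebuilt on the other side (the helper names are
hypothetical, and it assumes the memory at the address really is visible to
the receiving process, e.g. because it lives in a shared mapping; a plain
heap address from another process will not be):

import numpy

def pack_view_info(a):
    # Flatten (address, typestr, shape) into one plain string.
    addr, readonly = a.__array_interface__['data']
    typestr = a.__array_interface__['typestr']
    shape = a.__array_interface__['shape']
    return "%d;%s;%s" % (addr, typestr, ",".join([str(n) for n in shape]))

def unpack_view_info(s):
    # Inverse of pack_view_info: (address, dtype string, shape tuple).
    addr, typestr, shape = s.split(";")
    return int(addr), typestr, tuple([int(n) for n in shape.split(",")])

# e.g. pass pack_view_info(a2) on the child's argv, and in the child do:
#   address, typestr, shape = unpack_view_info(sys.argv[1])
#   view = fromaddress(address, typestr, shape)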

 David Cournapeau wrote:
 Basically, you cannot expect file descriptors (or even file handles: the
 standard ones from the C library's fopen) to cross DLL boundaries if the
 DLLs do not use exactly the same runtime.

It sounds like there is a general dilemma: no one with Python 2.4 or 
2.5 can reliably expect to compile extensions/modules if they did not 
install the 7.1 compiler in time.
Python 2.6 seems to use VC 2005 Express, I don't know about py3000(?), 
with associated upgrade issues.
It would be nice if the build bots could also compile suggested 
modules/extensions.

Thanks again,
Ray

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-10 Thread David Cournapeau
Ray S wrote:
 Thanks all:

 At 10:00 AM 10/10/2007, Robert Kern wrote:
  Something like the following should suffice (untested, though 
 I've done similar things with ctypes before):

 I tested, successfully:

 # nFromAddress.py

 def fromaddress(address, dtype, shape, strides=None):
     """Create a numpy array from an integer address, a dtype
     or dtype string, a shape tuple, and possibly strides.
     """
     import numpy
     # Make sure our dtype is a dtype, not just "f" or whatever.
     dtype = numpy.dtype(dtype)

     class Dummy(object):
         pass
     d = Dummy()
     d.__array_interface__ = dict(
         data = (address, False),
         typestr = dtype.str,
         descr = dtype.descr,
         shape = shape,
         strides = strides,
         version = 3,
         )
     return numpy.asarray(d)

 ## Numeric example, with address kludge
 import Numeric, numpy, ctypes, string
 a0 = Numeric.zeros((1), Numeric.Int16)
 nAddress = int(string.split(repr(a0.__copy__))[-1][:-1], 16)
 tmp=(ctypes.c_long*1)(0)
 ctypes.memmove(tmp, nAddress+8, 4)
 nAddress = tmp[0]
 a1 = fromaddress(nAddress, numpy.int16, (1,)) ## explicit type
 a0[0] = 5
 print a1[0]

 ## numpy example
 a2 = numpy.zeros(1, numpy.int16)
 nAddress = a2.__array_interface__['data'][0]
 nType = a2.__array_interface__['typestr']
 nShape = a2.__array_interface__['shape']
 a3 = fromaddress(nAddress, nType, nShape)
 a2[0] = 5
 print a3[0]

 So, now with little effort the relevant info can be passed over 
 pipes, shared memory, etc., and shared views created in other 
 processes, since it consists of plain ints and strings rather than objects.

  David Cournapeau wrote:
  Basically, you cannot expect file descriptors (or even file handles: the
  standard ones from the C library's fopen) to cross DLL boundaries if the
  DLLs do not use exactly the same runtime.

 It sounds like there is a general dilemma: no one with Python 2.4 or 
 2.5 can reliably expect to compile extensions/modules if they did not 
 install the 7.1 compiler in time.
Well, in theory you could: 'just' recompile python. The problem is not 
the compiler as such, but the C runtime. I don't see any solution to 
this situation, unfortunately; if MS decides to ship a broken libc, it 
is hard to get around that in a portable way.

For files (I don't know of any other problems, but that certainly does not 
mean they do not exist), the only way I know is to use the Win32 file 
handles. At least, it works in C (I had similar problems when dealing 
with tmp files on win32). To do it directly in Python, you may need 
pywin32-specific functions (I cannot remember them off the top of my head).
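
For what it's worth, part of that dance can be done from the stdlib msvcrt
module rather than pywin32; a rough sketch (untested, Windows only, file name
hypothetical) of converting between C-runtime descriptors and OS-level Win32
handles, which are what can safely cross the runtime boundary:

import os
import msvcrt

f = open("data.bin", "rb")                  # hypothetical file
handle = msvcrt.get_osfhandle(f.fileno())   # CRT fd -> Win32 HANDLE

# ... hand `handle` (a plain integer) to code linked against another CRT ...

fd = msvcrt.open_osfhandle(handle, os.O_RDONLY)  # Win32 HANDLE -> new CRT fd
g = os.fdopen(fd, "rb")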

 Python 2.6 seems to use VC 2005 Express, I don't know about py3000(?), 
 with associated upgrade issues.
But what if the next MS compiler again has a broken libc implementation? 
(Incidentally, VS2005 was not used for Python 2.5 because its libc was even 
more broken than the 2003 one): 
http://mail.python.org/pipermail/python-dev/2006-April/064126.html

cheers,

David
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread Sebastian Haase
Hi!
I was in fact experimenting with this. The solution seemed to lie in
simple memmap as it is implemented in Windows:

import numpy as N

def arrSharedMemory(shape, dtype, tag="PriithonSharedMemory"):
    """
    Windows only!
    Share memory between different processes if the same `tag` is used.
    """
    itemsize = N.dtype(dtype).itemsize
    count = N.product(shape)
    size = count * itemsize

    import mmap
    sharedmem = mmap.mmap(0, size, tag)
    a = N.frombuffer(sharedmem, dtype, count)
    a.shape = shape
    return a

For an explanation, look up the mmap documentation on the Microsoft site
and/or the Python docs for mmap.
((I have to mention that I could crash a process while testing this ...))

If anyone here knows an equivalent way of doing this on
Linux/OS-X, we would be back to a cross-platform function.
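
For reference, a minimal sketch (untested) of a POSIX-flavoured counterpart:
back the mapping with a small real file named after the tag (on Linux,
putting it under /dev/shm keeps it in RAM). The directory choice and helper
name are just assumptions, not a drop-in implementation:

import os
import mmap
import numpy as N

def arrSharedMemoryPosix(shape, dtype, tag="PriithonSharedMemory"):
    size = N.dtype(dtype).itemsize * N.product(shape)
    if os.path.isdir("/dev/shm"):
        dirname = "/dev/shm"        # RAM-backed tmpfs on most Linuxes
    else:
        dirname = "/tmp"
    path = os.path.join(dirname, tag)
    fd = os.open(path, os.O_CREAT | os.O_RDWR, 0666)
    os.ftruncate(fd, size)          # make sure the file is big enough
    sharedmem = mmap.mmap(fd, size, mmap.MAP_SHARED,
                          mmap.PROT_READ | mmap.PROT_WRITE)
    a = N.frombuffer(sharedmem, dtype, N.product(shape))
    a.shape = shape
    return a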



Hope this helps,
Sebastian Haase

On 10/9/07, David Cournapeau [EMAIL PROTECTED] wrote:
 Ray S wrote:
  Is anyone sharing arrays between processes on Windows?
  I tried compiling the posh sources (once, so far) with the new MS
  toolkit and failed...
  What other solutions are in use?
 
  Having a second process create an array view from an address would
  suffice for this particular purpose. I could pass the address as a
  parameter of the second process's argv.
 
  I've also tried things like
  pb=pythonapi.PyBuffer_FromReadWriteMemory(9508824, 9*sizeof(c_int))
  N.frombuffer(pb, N.int32)
  which fails since pb is an int. What are my options?
 
 (disclaimer: I know nothing about Windows idiosyncrasies)

 Could not this be because you compiled the posh sources with a
 compiler/runtime which is different than the other extensions and python
 interpreter ? I don't know the details, but since most of the POSIX
 functions related to files and processes are broken beyond repair on
 Windows, and in particular many POSIX handles cannot cross boundaries
 between DLLs compiled by different compilers, I would not be surprised if
 this caused some trouble.

 The fact that POSH is said to be POSIX-only on python.org
 (http://wiki.python.org/moin/ParallelProcessing) would imply that people
 do not care much about Windows either (but again, this is just from
 reading what posh is about; I have never used it personally).

 cheers,

 David

 ___
 Numpy-discussion mailing list
 Numpy-discussion@scipy.org
 http://projects.scipy.org/mailman/listinfo/numpy-discussion

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread David Cournapeau
Sebastian Haase wrote:
 Hi!
 I was in fact experimenting with this. The solution seemed to lie in
 simple memmap as it is implemented in Windows:

 import numpy as N

 def arrSharedMemory(shape, dtype, tag="PriithonSharedMemory"):
     """
     Windows only!
     Share memory between different processes if the same `tag` is used.
     """
     itemsize = N.dtype(dtype).itemsize
     count = N.product(shape)
     size = count * itemsize

     import mmap
     sharedmem = mmap.mmap(0, size, tag)
     a = N.frombuffer(sharedmem, dtype, count)
     a.shape = shape
     return a

 For an explanation, look up the mmap documentation on the Microsoft site
 and/or the Python docs for mmap.
 ((I have to mention that I could crash a process while testing this ...))

 If anyone here knows an equivalent way of doing this on
 Linux/OS-X, we would be back to a cross-platform function.
   
AFAIK, the tag thing is pretty much Windows-specific, so why not just 
ignore it on non-Windows platforms? (Or interpret the tag argument 
as the flags argument for mmap, which would be consistent with the Python 
mmap API?)

cheers,

David
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread Sebastian Haase
On 10/9/07, David Cournapeau [EMAIL PROTECTED] wrote:
 Sebastian Haase wrote:
  Hi!
  I was in fact experimenting with this. The solution seemed to lie in
  simple memmap as it is implemented in Windows:
 
  import numpy as N

  def arrSharedMemory(shape, dtype, tag="PriithonSharedMemory"):
      """
      Windows only!
      Share memory between different processes if the same `tag` is used.
      """
      itemsize = N.dtype(dtype).itemsize
      count = N.product(shape)
      size = count * itemsize

      import mmap
      sharedmem = mmap.mmap(0, size, tag)
      a = N.frombuffer(sharedmem, dtype, count)
      a.shape = shape
      return a
 
  For an explanation, look up the mmap documentation on the Microsoft site
  and/or the Python docs for mmap.
  ((I have to mention that I could crash a process while testing this ...))
 
  If anyone here knows an equivalent way of doing this on
  Linux/OS-X, we would be back to a cross-platform function.
 
 AFAIK, the tag thing is pretty much Windows-specific, so why not just
 ignore it on non-Windows platforms? (Or interpret the tag argument
 as the flags argument for mmap, which would be consistent with the Python
 mmap API?)

As I recollect, the tag was the key to turning the mmap into a
not-really-memmapped file, that is, a memory map without a corresponding
file on disk.
In other words, isn't an mmap (without(!) a tag) always bound to a
real file in the file system?

-Sebastian
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread Ray Schumacher
At 05:22 AM 10/9/2007, David Cournapeau wrote:
Could not this be because you compiled the posh sources with a
compiler/runtime which is different than the other extensions and python
interpreter ?

It definitely was - since my 2.4 wanted the free 7.1 compiler, I (and 
anyone else who didn't download it in time) am now seemingly SOL 
since it is no longer available. I saw much discussion of this as 
well, but even 2.5 is now fixed on 7.1, and reports of compiling 
distutils modules with the new MS SDK and having them work at all with 
2.4 were very mixed. I also tried GCC and had a litany of other 
errors with posh.

Sebastian Haase added:
I was in fact experimenting with this. The solution seemed to lie in
simple memmap as it is implemented in Windows:

snip
I had just found and started to write some tests with that MS 
function. If I can truly write to the array in one process and 
instantly read it in the other I'll be happy. Did you find that locks 
or semaphores were needed?

(( I have to mention, that I could crash a process while testing this ... ))

That was one of my first results! I also found that using ctypes to 
create arrays from the other process's address and laying a numpy 
array on top was prone to that in experimentation. But I had the same 
issue as Mark Heslep
http://aspn.activestate.com/ASPN/Mail/Message/ctypes-users/3192422
of creating a numpy array from a raw address (not a c_array).

Thanks,
Ray Schumacher


___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread Sebastian Haase
On 10/9/07, Ray Schumacher [EMAIL PROTECTED] wrote:
 At 05:22 AM 10/9/2007, David Cournapeau wrote:
 Could not this be because you compiled the posh sources with a
 compiler/runtime which is different than the other extensions and python
 interpreter ?

 It definitely was - since my 2.4 wanted the free 7.1 compiler, I (and
 anyone else who didn't download it in time) am now seemingly SOL
 since it is no longer available. I saw much discussion of this as
 well, but even 2.5 is now fixed on 7.1, and reports of compiling
 distutils modules with the new MS SDK and having them work at all with
 2.4 were very mixed. I also tried GCC and had a litany of other
 errors with posh.

 Sebastian Haase added:
 I was in fact experimenting with this. The solution seemed to lie in
 simple memmap as it is implemented in Windows:

 snip
 I had just found and started to write some tests with that MS
 function. If I can truly write to the array in one process and
 instantly read it in the other I'll be happy. Did you find that locks
 or semaphores were needed?

Maybe that's why it crashed ;-) !?  But for simple use it seems fine.
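
If locking does turn out to be needed, one low-tech possibility is a named
kernel mutex that both processes open by name, much like the mmap tag; a
rough sketch (untested, Windows only, mutex name and helper hypothetical)
via ctypes:

import ctypes

kernel32 = ctypes.windll.kernel32
INFINITE = -1   # 0xFFFFFFFF as a DWORD

# Both processes create/open the same named mutex; the name plays the
# same role as the mmap tag.
hMutex = kernel32.CreateMutexA(None, False, "PriithonSharedMemoryLock")

def locked_update(a, index, value):
    # Write to the shared array while holding the named mutex.
    kernel32.WaitForSingleObject(hMutex, INFINITE)   # acquire
    try:
        a[index] = value
    finally:
        kernel32.ReleaseMutex(hMutex)                # release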


 (( I have to mention, that I could crash a process while testing this ... ))

 That was one of my first results! I also found that using ctypes to
 create arrays from the other process's address and laying a numpy
 array on top was prone to that in experimentation. But I had the same
 issue as Mark Heslep
 http://aspn.activestate.com/ASPN/Mail/Message/ctypes-users/3192422
 of creating a numpy array from a raw address (not a c_array).

I assume this is a different issue, but haven't looked into it yet.

-Sebastian
___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-09 Thread Ray S
On 10/9/07, Sebastian Haase replied:
   Did you find that locks
   or semaphores were needed?
  Maybe that's why it crashed ;-) !?  But for simple use it seems fine.

I just did some code (below) that reads/writes to the array as fast as 
possible, and there is no crash, or any other issue (Win2000, py2.4, numpy 
1.0b1).
Without the print statements it maxes both processors; with the 
print I/O, only 58%.
Both processes can also modify the array without issue.
I'll experiment with

I had seen the Win mmap in this thread:
http://objectmix.com/python/127666-shared-memory-pointer.html
and here:
http://www.codeproject.com/cpp/embedpython_2.asp

Note also that the Python mmap docs read "In either case you must 
provide a file descriptor for a file opened for update." with no 
mention of the integer zero descriptor option.
Access options behave as presented.

Because *NIX has MAP_SHARED as an option, you'd think there might 
be cross-platform sharing behavior with some platform-checking if 
statements. Without a tag, though, how does another process reference 
the same memory on *NIX; a filename? (It seems so.)

   But I had the same
   issue as Mark Heslep
   
http://aspn.activestate.com/ASPN/Mail/Message/ctypes-users/3192422
   of creating a numpy array from a raw address (not a c_array).
 I assume this is a different issue, but haven't looked into it yet.

Yes, a different methodology attempt. It would still be interesting to 
know how to create a numpy array from a raw address; it's probably 
buried in the undocumented C-API that I don't grok, and likely 
frowned upon.
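
For what it's worth, one route that seems to avoid the C-API entirely (a
sketch only, not battle-tested, helper name hypothetical): let ctypes wrap
the raw address in an object exposing the buffer interface and hand that to
numpy.frombuffer:

import ctypes
import numpy

def array_from_address(address, dtype, count):
    # View `count` items of `dtype` starting at `address`; the ctypes
    # array made by from_address() copies nothing, it just wraps the
    # memory, so the usual "keep the owner alive" caveat applies.
    dtype = numpy.dtype(dtype)
    buf = (ctypes.c_byte * (dtype.itemsize * count)).from_address(address)
    return numpy.frombuffer(buf, dtype, count)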

Thanks,
Ray

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] numpy/Windows shared arrays between processes?

2007-10-08 Thread Ray S
Is anyone sharing arrays between processes on Windows?
I tried compiling the posh sources (once, so far) with the new MS 
toolkit and failed...
What other solutions are in use?

Having a second process create an array view from an address would 
suffice for this particular purpose. I could pass the address as a 
parameter of the second process's argv.

I've also tried things like
pb=pythonapi.PyBuffer_FromReadWriteMemory(9508824, 9*sizeof(c_int))
N.frombuffer(pb, N.int32)
which fails since pb is an int. What are my options?
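
For what it's worth, one possible reason pb comes back as a plain int is
that ctypes assumes every foreign function returns a C int unless told
otherwise; a hedged, untested sketch of declaring the return type (keeping
the address from the snippet above, which is of course only meaningful
inside the process that owns it):

from ctypes import pythonapi, py_object, sizeof, c_int
import numpy as N

# Tell ctypes that PyBuffer_FromReadWriteMemory returns a Python object,
# not an int, so frombuffer gets a real buffer object to work with.
pythonapi.PyBuffer_FromReadWriteMemory.restype = py_object

pb = pythonapi.PyBuffer_FromReadWriteMemory(9508824, 9*sizeof(c_int))
a = N.frombuffer(pb, N.int32)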

Ray Schumacher

___
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://projects.scipy.org/mailman/listinfo/numpy-discussion