Here's my own open source project built with PyOpenGL and PyQt.
http://code.google.com/p/ice-cache-explorer/
Hope it helps.
-mab
"The world goin' one way, people another!" - Poot
> From: da...@boddie.org.uk
> To: pyqt@riverbankcomputing.com
> Date: Mon, 30 May 2011 22:04:06 +0200
> Subject: R
> On Mon, 9 May 2011 12:15:54 -0400, Belzile Marc-André wrote:
> > Hi,
> > I want to compile my sip module in debug but the linker fails as I don't
> > have python debug installed locally.
> > link /NOLOGO /DLL /MANIFEST /MANIFESTFILE:sipy.pyd.manifest
>
Hi,
I want to compile my SIP module in debug mode, but the linker fails because I
don't have the Python debug libraries installed locally.
link /NOLOGO /DLL /MANIFEST /MANIFESTFILE:sipy.pyd.manifest
/SUBSYSTEM:WINDOWS "/MANIFESTDEPENDENCY:type='win32'
name='Microsoft.Windows.Common-Controls' version='6.0.0.0'
> a 512x512x1500 array of unsigned int
> using HDF5 from one process to another?
>
>
>
> On Mon, May 9, 2011 at 10:16 AM, Belzile Marc-André wrote:
> > Indeed, QProcess is meant to run external Python scripts, which allows
> > Python to run one script per av
> ou use it with Python. My dataset
> could be very big (or what I think is big), for example, a series of
> 1500 DICOM images, so I think that HDF5 could be useful to me.
>
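For the HDF5 idea in the quoted message above, here is a minimal sketch using
h5py. The file and dataset names are hypothetical, and a small array stands in
for the real 512x512x1500 series:

```python
import numpy as np
import h5py

# Small stand-in for the real series (which would be ~1500 x 512 x 512
# unsigned ints); shapes and names here are illustrative only.
frames = np.arange(8 * 64 * 64, dtype=np.uint32).reshape(8, 64, 64)

with h5py.File("series.h5", "w") as f:
    f.create_dataset(
        "frames",
        data=frames,
        chunks=(1, 64, 64),   # one image per chunk -> cheap single-slice reads
        compression="gzip",   # optional; trades CPU time for disk space
    )

# A reader (e.g. another worker process) can then pull one slice at a time
# without loading the whole stack into memory:
with h5py.File("series.h5", "r") as f:
    middle = f["frames"][4]   # reads only the chunk(s) backing that slice
```

Since an HDF5 file is just a file on disk, this also doubles as a simple way to
hand large arrays from one process to another, as discussed in this thread.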
> On Mon, May 9, 2011 at 9:24 AM, Belzile Marc-André wrote:
> > As an alternative to python multiprocessing
As an alternative to the Python multiprocessing module, I'm using QProcess for
loading and exporting files with my own pool manager. It's pretty darn fast and
works well; my data sets are managed with HDF5, which handles huge data sets
pretty easily. If you want to pass data back to t
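The pool-manager idea above can be sketched with the stdlib subprocess module
standing in for QProcess (QProcess adds signals and event-loop integration, but
the pooling logic is the same); `run_pool` and the worker commands below are
hypothetical, not the author's actual code:

```python
import subprocess
import sys
import time

def run_pool(commands, max_workers=4):
    """Run each command line as an external process, at most
    `max_workers` at a time, and collect their stdout."""
    pending = list(commands)
    running = []
    results = []
    while pending or running:
        # Launch workers until the pool is full.
        while pending and len(running) < max_workers:
            running.append(
                subprocess.Popen(pending.pop(0), stdout=subprocess.PIPE)
            )
        # Reap finished workers; keep the rest running.
        still_running = []
        for proc in running:
            if proc.poll() is None:
                still_running.append(proc)
            else:
                out, _ = proc.communicate()
                results.append(out.decode())
        running = still_running
        time.sleep(0.01)  # avoid busy-waiting between polls
    return results

# Example: three trivial workers, each a separate Python interpreter,
# run through a pool of size two.
outputs = run_pool(
    [[sys.executable, "-c", "print('worker %d done')" % i] for i in range(3)],
    max_workers=2,
)
```

Results arrive in completion order, not submission order, which is usually fine
for batch loading/exporting jobs like the ones described above.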