I've been looking at using Nim to create extension modules for Python. I would
typically do this with Cython and C/C++, using the workflow: write in Python ->
profile -> rewrite slow functions in Cython -> compile to a shared object (via
C). Using the nimpy library, I've been doing essentially the same thing with
Nim, and after the initial set-up it actually seems easier!
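
For context, here's a minimal sketch of what the nimpy side looks like (the
module and proc names are placeholders; {.exportpy.} is nimpy's pragma for
exposing a proc to Python):

    # fib.nim -- compile with:
    #   nim c --app:lib --out:fib.so fib.nim    (Linux)
    #   nim c --app:lib --out:fib.pyd fib.nim   (Windows)
    import nimpy

    proc fib(n: int): int {.exportpy.} =
      ## Naive Fibonacci, callable from Python once compiled.
      if n < 2: n
      else: fib(n - 1) + fib(n - 2)

From Python it's then just "import fib; fib.fib(30)", as long as the shared
object is importable from the working directory or sys.path.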

But I'm confused about how to approach distribution of the extensions as
Python libraries. With Cython-written extensions, I would:

1) create a source distribution of .py and .c files. The end user builds it so
that it matches their system architecture/OS (a setup.py sketch of this route
follows the list);

2) and/or create platform-specific wheels for the most common platforms from
the .py and .so/.pyd files;

3) and/or create a general wheel from the .py and .pyx files, with a Cython
dependency.
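
For reference, option 1) with Cython usually boils down to a setup.py along
these lines (package/module names are placeholders); the same structure would
work for Nim-generated C, if that C were portable:

    # setup.py -- sdist route: ship pre-generated .c, end user compiles it
    from setuptools import setup, Extension

    setup(
        name="mypkg",
        packages=["mypkg"],
        ext_modules=[
            # mymod.c was generated ahead of time (cython mymod.pyx)
            Extension("mypkg.mymod", sources=["mypkg/mymod.c"]),
        ],
    )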

With Nim, the generated .c is platform-specific; does this mean option 1) is
out?

Adding a Nim dependency for option 3) doesn't seem feasible to me, even if the
end user were willing to install Nim (Cython is a PyPI package, so the
installer can fetch it and put it in the right place automatically).

So that just leaves option 2): building binaries for every platform
combination (e.g. 22(!) in numpy's case) and publishing platform-specific
wheels.
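
For concreteness, the way I picture option 2) is a custom build step that
shells out to the Nim compiler and is run once per target platform. Here's an
untested sketch, with "mypkg"/"mymod" as placeholder names:

    # setup.py -- platform-wheel route: compile the Nim module at build time,
    # then run `python setup.py bdist_wheel` on each target platform.
    import subprocess
    import sys
    from pathlib import Path

    from setuptools import setup, Distribution
    from setuptools.command.build_py import build_py


    class NimBuild(build_py):
        def run(self):
            ext = ".pyd" if sys.platform == "win32" else ".so"
            out = Path("mypkg") / ("mymod" + ext)
            # nim c --app:lib builds a shared library from the Nim source
            subprocess.check_call([
                "nim", "c", "--app:lib", "-d:release",
                "--out:" + str(out), "mypkg/mymod.nim",
            ])
            super().run()


    class BinaryDistribution(Distribution):
        # Mark the distribution as non-pure so bdist_wheel produces a
        # platform-tagged wheel rather than a "py3-none-any" one.
        def has_ext_modules(self):
            return True


    setup(
        name="mypkg",
        packages=["mypkg"],
        package_data={"mypkg": ["*.so", "*.pyd"]},
        cmdclass={"build_py": NimBuild},
        distclass=BinaryDistribution,
        zip_safe=False,
    )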

Are there any better ways to do this? The only Nim-extension PyPI libraries
I've seen (e.g. faster_than_requests) aren't cross-platform enough even for my
limited needs (Linux and Windows, 32- and 64-bit).
