On Thu, Jul 11, 2019 at 11:26 AM Antoine Pitrou <anto...@python.org> wrote:
>
> Le 11/07/2019 à 17:52, Krisztián Szűcs a écrit :
> > Hi All,
> >
> > I have a couple of questions about the wheel packaging:
> >
> > - why do we build an arrow namespaced boost on linux and osx, could we
> > link statically like with the windows wheels?
>
> No idea. Boost shouldn't leak in the public APIs, so theoretically a
> static build would be fine...
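As a quick way to verify that claim, one could inspect the symbols a shared library actually exports with `nm` and grep for `boost`. The snippet below is a self-contained demo of the technique on a toy library (the file names are made up for illustration); the same command applied to the wheel's `libarrow.so` would reveal any leaked Boost symbols:

```shell
# Demo: build a tiny shared library, then list its dynamic, defined
# symbols with nm. Only non-static symbols are exported; applying the
# same nm invocation to libarrow.so (and grepping for "boost") is how
# one would check for Boost leakage in practice.
cat > demo.c <<'EOF'
int visible_symbol(void) { return 42; }           /* exported */
static int hidden_symbol(void) { return 7; }      /* not exported */
EOF
cc -shared -fPIC -o libdemo.so demo.c
nm -D --defined-only libdemo.so
```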
In principle the privately-namespaced Boost could be statically linked. We
are using bcp to change the C++ namespace of the symbols so that our Boost
symbols don't conflict with other wheels' Boost symbols (which may have
come from a different Boost version). I'll let Uwe comment further on the
desire for dynamic linking.

> > - do we explicitly say somewhere in the linux wheels to link the
> > 3rdparty dependencies statically or just implicitly, by removing (or
> > not building) the shared libs for the 3rdparty dependencies?
>
> It's implicit by removing the shared libs (or not building them).
> Some time ago the compression libs were always linked statically by
> default, but it was changed to dynamic over time, probably to
> please system packagers.

I think only the libz shared library is being bundled, for security reasons.

> > - couldn't we use the 3rdparty toolchain to build the smaller 3rdparty
> > dependencies for the linux wheels instead of building them manually in
> > the manylinux docker image - it'd be easier to say
> > <dependency>_SOURCE=BUNDLED
>
> I don't think so. The conda-forge and Anaconda packages use a different
> build chain (different compiler, different libstdc++ version) and may
> not be usable directly on manylinux-compliant systems.

I think you may misunderstand. Krisztian is suggesting building the
dependencies through the ExternalProject mechanism during "docker run" on
the image rather than caching pre-built versions in the Docker image. For
small dependencies, I don't see why we couldn't use the BUNDLED approach.
This might spare us having to maintain some of the build scripts. It will
strictly increase build times, though -- I think the reason that everything
is cached now is to save on build times (which have historically been
quite long).

> Regards
>
> Antoine.
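For reference, a sketch of what the BUNDLED approach discussed above might look like when invoking the Arrow C++ CMake build. The option names follow Arrow's `<dependency>_SOURCE` convention mentioned in the thread; which dependencies to list, their exact casing, and the source path are illustrative assumptions, not a tested configuration:

```shell
# Sketch only: ask CMake's ExternalProject machinery to build selected
# small third-party dependencies from source ("BUNDLED") at build time,
# instead of pre-building them into the manylinux Docker image.
# Dependency names and the ../cpp path are assumptions for illustration.
cmake ../cpp \
  -DARROW_DEPENDENCY_SOURCE=AUTO \
  -DZSTD_SOURCE=BUNDLED \
  -DLZ4_SOURCE=BUNDLED \
  -DBROTLI_SOURCE=BUNDLED
```

The trade-off is as noted above: no pre-built caches to maintain in the image, at the cost of rebuilding those dependencies on every "docker run".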