Hello,

I am having some trouble using the Continuum PyArrow conda packages 
in conjunction with internal C++ extension modules.

Apparently, Arrow and Parquet link Boost statically.  We have some internal 
packages containing C++ code that links the Boost libraries dynamically.  If we 
import Feather as well as our own extension modules into the same Python 
process, we get random segfaults inside Boost.  I think what's happening is 
that our extension modules are picking up Boost symbols from the Arrow and 
Parquet libraries already loaded into the process, rather than from our own 
Boost shared libraries.
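For what it's worth, one workaround we have been experimenting with is forcing our own extension modules to be loaded with RTLD_LOCAL so their symbol tables are not shared process-wide. This is only a sketch (the `our_extension` module name is a placeholder for one of our internal modules), and I'm not sure it is a complete fix given how the conda packages were built:

```python
import os
import sys

# Save the interpreter's current dlopen() flags so we can restore them.
flags = sys.getdlopenflags()

# RTLD_LOCAL keeps the extension's symbols out of the global namespace,
# so its Boost references are less likely to be resolved against the
# statically linked copies in libarrow/libparquet.
sys.setdlopenflags(os.RTLD_NOW | os.RTLD_LOCAL)
try:
    pass  # import our_extension  # placeholder for an internal module
finally:
    # Restore the original flags so later imports behave normally.
    sys.setdlopenflags(flags)
```

Whether this actually avoids the interposition depends on the symbol visibility the Arrow and Parquet shared libraries were compiled with, which is part of what I'm hoping someone can clarify.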

Could anyone explain the policy for linking Boost in binary distributions, 
particularly conda packages?  What is your expectation for how other C++ 
extension modules should be built?

Thanks in advance,
Alex
