[ https://issues.apache.org/jira/browse/ARROW-8135?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17061208#comment-17061208 ]

Wes McKinney commented on ARROW-8135:
-------------------------------------

It looks like you're missing transitive dependencies of pyarrow. You also need 
the runtime dependencies of arrow-cpp, the C++ library:

https://github.com/conda-forge/arrow-cpp-feedstock/blob/master/recipe/meta.yaml#L50

Recommend closing the issue
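One way to confirm which runtime libraries the dynamic loader cannot find is to try loading them directly with `ctypes`; a minimal sketch (the library name below is the one from the reporter's traceback, and `can_load` is a hypothetical helper, not part of pyarrow):

```python
import ctypes

def can_load(libname):
    """Return True if the dynamic loader can find and load libname."""
    try:
        ctypes.CDLL(libname)
        return True
    except OSError:
        return False

# The library pyarrow failed to load in the reported traceback:
print(can_load("libaws-cpp-sdk-s3.so"))
```

If this prints False, the library is absent from the loader's search path (e.g. `LD_LIBRARY_PATH` or the conda environment's `lib/` directory) and needs to be installed alongside arrow-cpp's other runtime dependencies.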

> [Python] Problem importing PyArrow on a cluster
> -----------------------------------------------
>
>                 Key: ARROW-8135
>                 URL: https://issues.apache.org/jira/browse/ARROW-8135
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: C++
>    Affects Versions: 0.16.0
>         Environment: Linux, RedHat CentOS 7
>            Reporter: Matej Murin
>            Priority: Major
>              Labels: newbie
>
> Hi, when I am trying to import pyarrow in python, I get the following error:
> File "<stdin>", line 1, in <module>
>   File "/services/matejm/anaconda3/lib/python3.7/site-packages/pyarrow/__init__.py", line 49, in <module>
>     from pyarrow.lib import cpu_count, set_cpu_count
> ImportError: libaws-cpp-sdk-s3.so: cannot open shared object file: No such file or directory
> What can this be related to? I have searched wherever I could and could not 
> find any reason for it, so I figured I might as well try here.
> Thank you very much.
> Note: I have installed pyarrow and its dependencies offline, since our 
> cluster has a company firewall that does not allow pip or conda installation.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
