[
https://issues.apache.org/jira/browse/ARROW-8135?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17061108#comment-17061108
]
Matej Murin commented on ARROW-8135:
------------------------------------
I ran:
```
conda info pyarrow
```
locally on my machine, found my installed version, and checked whether every
dependency listed there appears in my *conda list* output.
For the ones that were missing, I found the packages on the conda-forge website,
downloaded the tar.bz2 files manually, transferred them to the cluster, and
installed them offline.
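Before reinstalling, it can help to confirm which of the shared libraries are actually resolvable by the dynamic loader. This is only a diagnostic sketch; the library names below are taken from the ImportError in this issue and may differ between pyarrow versions:

```python
import ctypes.util

# Check whether the shared libraries named in the ImportError can be
# located by the dynamic loader. A result of None means the loader
# cannot find the library, which would explain the import failure.
for lib in ("aws-cpp-sdk-s3", "aws-cpp-sdk-core"):
    path = ctypes.util.find_library(lib)
    print(f"{lib}: {path or 'NOT FOUND'}")
```

If a library shows up as NOT FOUND, the corresponding conda-forge package (or its directory on LD_LIBRARY_PATH) is likely what is missing.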
> [Python] Problem importing PyArrow on a cluster
> -----------------------------------------------
>
> Key: ARROW-8135
> URL: https://issues.apache.org/jira/browse/ARROW-8135
> Project: Apache Arrow
> Issue Type: Bug
> Components: C++
> Affects Versions: 0.16.0
> Environment: Linux, RedHat CentOS 7
> Reporter: Matej Murin
> Priority: Major
> Labels: newbie
>
> Hi, when I am trying to import pyarrow in python, I get the following error:
> ```
> File "<stdin>", line 1, in <module>
> File "/services/matejm/anaconda3/lib/python3.7/site-packages/pyarrow/__init__.py", line 49, in <module>
>   from pyarrow.lib import cpu_count, set_cpu_count
> ImportError: libaws-cpp-sdk-s3.so: cannot open shared object file: No such file or directory
> ```
> What can this be related to? I have searched everywhere I could and could not
> find any reason for it, so I figured I might as well ask here.
> Thank you very much.
> Note: I installed pyarrow and its dependencies offline, since our cluster
> sits behind a company firewall that does not allow pip or conda to download
> packages.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)