vmalisz commented on issue #33814:
URL: https://github.com/apache/arrow/issues/33814#issuecomment-1404014045
hi @kou! I'm experiencing a new issue, but I'm not sure how related it is to the
install, so I don't want to create a new thread unless you tell me to.
I initially needed to install pyarrow because I couldn't install db-dtypes
without it. Now I have both, and running `pip install db-dtypes` again confirms they are installed.
It looks like the issue is still with pyarrow though:
```
import pyarrow.lib as _lib
ImportError: libarrow_dataset.so.1100: cannot open shared object file: No such file or directory
```
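That short ImportError already points at pyarrow itself, so a minimal reproduction sketch that takes google-cloud-bigquery and db-dtypes out of the picture entirely would just be the same import on its own (this is my reading of the traceback, not something specific to my script):

```python
# Minimal reproduction sketch: this is the same import that fails inside
# db_dtypes/__init__.py and pyarrow/__init__.py, with google-cloud-bigquery
# and db-dtypes removed from the picture. If pyarrow itself is the problem,
# this should raise the identical ImportError about libarrow_dataset.so.1100.
import pyarrow.lib as _lib

print("pyarrow.lib loaded from:", _lib.__file__)
```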
The full error (worth mentioning that the script works fine on my laptop... maybe
I should just consider not using a Pi for this project):
```
Traceback (most recent call last):
  File "/home/pi/pyarrow-dev/lib/python3.7/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 41, in <module>
    import db_dtypes  # type: ignore
  File "/home/pi/pyarrow-dev/lib/python3.7/site-packages/db_dtypes/__init__.py", line 26, in <module>
    import pyarrow
  File "/home/pi/pyarrow-dev/lib/python3.7/site-packages/pyarrow/__init__.py", line 65, in <module>
    import pyarrow.lib as _lib
ImportError: libarrow_dataset.so.1100: cannot open shared object file: No such file or directory

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "run_gscpull.py", line 5, in <module>
    gscdatapull(backfill=5)#, end_date='2022-11-10')
  File "/home/pi/gsc-bigquery/class_gscdailypull.py", line 22, in __init__
    self.do_allthethings(all_users_settings,self.backfill)
  File "/home/pi/gsc-bigquery/class_gscdailypull.py", line 58, in do_allthethings
    mydata = GSC(domain=domain,wp_user_id=user_data['user_id'],backfill=backfill, refresh=user_data['refresh_token'], end_date=self.end_date)
  File "/home/pi/gsc-bigquery/class_gsc.py", line 47, in __init__
    self.db_dates_data = bigquery.checkdata(self.start_date, self.end_date).to_dataframe()
  File "/home/pi/pyarrow-dev/lib/python3.7/site-packages/google/cloud/bigquery/table.py", line 1993, in to_dataframe
    _pandas_helpers.verify_pandas_imports()
  File "/home/pi/pyarrow-dev/lib/python3.7/site-packages/google/cloud/bigquery/_pandas_helpers.py", line 1006, in verify_pandas_imports
    raise ValueError(_NO_DB_TYPES_ERROR) from db_dtypes_import_exception
ValueError: Please install the 'db-dtypes' package to use this function.
```
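In case it helps narrow things down, here is a small stdlib-only diagnostic sketch that checks whether libarrow_dataset.so.1100 is bundled with the pyarrow install at all, without triggering the failing import (the path is copied from the traceback above; adjust it for a different environment):

```python
# Diagnostic sketch only (stdlib, nothing imported from pyarrow): list the
# Arrow shared libraries that actually shipped with this install, without
# triggering the failing "import pyarrow.lib".
import glob
import os

pkg_dir = "/home/pi/pyarrow-dev/lib/python3.7/site-packages/pyarrow"
print("pyarrow package dir exists:", os.path.isdir(pkg_dir))
for so in sorted(glob.glob(os.path.join(pkg_dir, "libarrow*.so*"))):
    print(os.path.basename(so))
# If libarrow_dataset.so.1100 is not in the list, the wheel/build does not
# include the dataset library; if it is there, the dynamic loader is probably
# not finding it (e.g. LD_LIBRARY_PATH for a from-source build).
```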