Mgmaplus opened a new issue #10408:
URL: https://github.com/apache/arrow/issues/10408


   Hi,
   
   I'm using bigquery_storage_v1, and it fails with an AttributeError when it 
   calls pyarrow.ipc.read_schema. It seems this function lives in pyarrow.lib.
   
   I have pyarrow 4.0 installed, which makes me wonder whether I need to 
   install an earlier release.
   
   Any help is appreciated.
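   For what it's worth, here is a small stdlib-only check I can run inside the 
   Lambda environment to see whether the pyarrow.ipc submodule is resolvable at 
   all (a partial or stale install can leave `pyarrow` importable while its 
   submodules are missing, which would surface exactly as this AttributeError). 
   The `has_submodule` helper is just an illustrative name, not part of any 
   library:
   
   ```python
   import importlib.util
   
   def has_submodule(name: str) -> bool:
       """Return True if `name` resolves to an importable (sub)module."""
       try:
           # find_spec locates the module without fully importing it;
           # it returns None if the top-level package is absent, and
           # raises ModuleNotFoundError if a parent package is missing.
           return importlib.util.find_spec(name) is not None
       except ModuleNotFoundError:
           return False
   
   # In the Lambda environment, check e.g.:
   # has_submodule("pyarrow"), has_submodule("pyarrow.ipc")
   ```
   
   If `has_submodule("pyarrow")` is True but `has_submodule("pyarrow.ipc")` is 
   False, the pyarrow package on /mnt/efs is likely incomplete rather than the 
   wrong version.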
   
   [ERROR] AttributeError: module 'pyarrow' has no attribute 'ipc'
   Traceback (most recent call last):
     File "/var/task/lambda_handler.py", line 89, in lambda_handler
       raise err
     File "/var/task/lambda_handler.py", line 80, in lambda_handler
       report_json = fetchPlatform(job['platform'], job['credentials'], job['ranges'], date_ranges, job['labels_to_add'])
     File "/var/task/generators/gen_reports_rev.py", line 81, in fetchPlatform
       listOfFigs = BQ_report_builder(cred_dict, listOfDateRanges[i][0], listOfDateRanges[i][1], labelsToAdd)
     File "/var/task/generators/gen_reports_rev.py", line 583, in BQ_report_builder
       df = BIGQ_downloadLastDay(json_cert, projectID, bucketID, params, date = end)
     File "/var/task/generators/data_download.py", line 604, in BIGQ_downloadLastDay
       dataframe = (bqclient.query(query_string).result().to_dataframe())
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/table.py", line 1852, in to_dataframe
       record_batch = self.to_arrow(
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/table.py", line 1667, in to_arrow
       for record_batch in self._to_arrow_iterable(
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/table.py", line 1567, in _to_page_iterable
       yield from result_pages
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/_pandas_helpers.py", line 717, in _download_table_bqstorage
       future.result()
     File "/var/lang/lib/python3.8/concurrent/futures/_base.py", line 437, in result
       return self.__get_result()
     File "/var/lang/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
       raise self._exception
     File "/var/lang/lib/python3.8/concurrent/futures/thread.py", line 57, in run
       result = self.fn(*self.args, **self.kwargs)
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/_pandas_helpers.py", line 598, in _download_table_bqstorage_stream
       item = page_to_item(page)
     File "/mnt/efs/__python_packages__/google/cloud/bigquery/_pandas_helpers.py", line 581, in _bqstorage_page_to_arrow
       return page.to_arrow()
     File "/mnt/efs/__python_packages__/google/cloud/bigquery_storage_v1/reader.py", line 453, in to_arrow
       return self._stream_parser.to_arrow(self._message)
     File "/mnt/efs/__python_packages__/google/cloud/bigquery_storage_v1/reader.py", line 626, in to_arrow
       return self._parse_arrow_message(message)
     File "/mnt/efs/__python_packages__/google/cloud/bigquery_storage_v1/reader.py", line 650, in _parse_arrow_message
       self._parse_arrow_schema()
     File "/mnt/efs/__python_packages__/google/cloud/bigquery_storage_v1/reader.py", line 661, in _parse_arrow_schema
       self._schema = pyarrow.ipc.read_schema(
   
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

