https://issues.apache.org/jira/browse/BEAM-11731?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17276621#comment-17276621

Brian Hulette commented on BEAM-11731:
--------------------------------------

Took a quick look, since this looks like an issue with pandas/pyarrow/parquet. 
The first cron failure was 
https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/3797/, which has no 
Beam changes. It doesn't seem to be due to a pyarrow or pandas release either; 
we're still running with pyarrow 2.0.0 and pandas 1.1.5.

I noticed that of all the various configurations we test with, only the py37-* 
ones are failing; py36 and py38 are fine.

> Py precommit failing: test_read_write_10_parquet
> ------------------------------------------------
>
>                 Key: BEAM-11731
>                 URL: https://issues.apache.org/jira/browse/BEAM-11731
>             Project: Beam
>          Issue Type: Bug
>          Components: test-failures
>            Reporter: Kyle Weaver
>            Priority: P1
>
> pyarrow.lib.ArrowTypeError: ("Did not pass numpy.dtype object [while running 
> '_WriteToPandas/WriteToFiles/ParDo(_WriteUnshardedRecordsFn)/ParDo(_WriteUnshardedRecordsFn)']",
>  'Conversion failed for column rank with type int64')
> https://ci-beam.apache.org/job/beam_PreCommit_Python_Commit/17083/testReport/junit/apache_beam.dataframe.io_test/IOTest/test_read_write_10_parquet/



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
