[
https://issues.apache.org/jira/browse/ARROW-10264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17263488#comment-17263488
]
Antoine Pitrou commented on ARROW-10264:
----------------------------------------
[~jorisvandenbossche] Is this something we can fix easily for 3.0? (either
change the test or fix PyArrow if it's an actual bug)
> [C++][Python] Parquet test failing with HadoopFileSystem URI
> ------------------------------------------------------------
>
> Key: ARROW-10264
> URL: https://issues.apache.org/jira/browse/ARROW-10264
> Project: Apache Arrow
> Issue Type: Bug
> Components: Python
> Reporter: Joris Van den Bossche
> Priority: Major
> Labels: filesystem, hdfs
> Fix For: 3.0.0
>
>
> Follow-up on ARROW-10175. In the HDFS integration tests, a test that
> passes a URI fails when the new filesystem / dataset implementation is used:
> {code}
> FAILED
> opt/conda/envs/arrow/lib/python3.7/site-packages/pyarrow/tests/test_hdfs.py::TestLibHdfs::test_read_multiple_parquet_files_with_uri
> {code}
> fails with
> {code}
> pyarrow.lib.ArrowInvalid: Path
> '/tmp/pyarrow-test-838/multi-parquet-uri-48569714efc74397816722c9c6723191/0.parquet'
> is not relative to '/user/root'
> {code}
> even though it passes a URI (and not a filesystem object) to
> {{parquet.read_table}}, and the new filesystem/dataset implementation should
> be able to handle URIs.
> cc [~apitrou]
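The error suggests the URI's absolute path component is being checked against the HDFS working directory ({{/user/root}}) instead of being used as-is. A minimal sketch of the URI splitting the dataset layer would need to perform, using only the standard library (the function name and the namenode host/port below are illustrative, not PyArrow APIs):

```python
# Illustrative sketch: split an hdfs:// URI into connection info and an
# absolute path. The absolute path should be used directly, not resolved
# relative to the HDFS working directory (e.g. '/user/root').
from urllib.parse import urlparse

def split_hdfs_uri(uri):
    """Split an hdfs:// URI into (host, port, absolute path)."""
    parsed = urlparse(uri)
    if parsed.scheme != "hdfs":
        raise ValueError(f"not an HDFS URI: {uri}")
    return parsed.hostname, parsed.port, parsed.path

host, port, path = split_hdfs_uri(
    "hdfs://namenode:8020/tmp/pyarrow-test/0.parquet")
# path is '/tmp/pyarrow-test/0.parquet' -- already absolute, so treating
# it as relative to '/user/root' (as the error message implies) is wrong.
assert path.startswith("/")
```

This mirrors what {{pyarrow.fs.FileSystem.from_uri}} is expected to do for the dataset/parquet readers: derive the filesystem from the scheme and authority, and keep the path absolute.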
--
This message was sent by Atlassian Jira
(v8.3.4#803005)