[ https://issues.apache.org/jira/browse/ARROW-6520?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Furkan Tektas updated ARROW-6520:
---------------------------------
    Description: 
I'm not sure if this should be reported to Parquet or here.

When I try to serialize a pyarrow table with a fixed-size binary field (holding 
16-byte UUID4 values) to a Parquet file, a segmentation fault occurs.
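For reference, the actual data is built roughly like this (a sketch of the use case; the minimal repro below uses 4-byte values instead to keep it small):

{code:python}
import uuid
import pyarrow as pa

# Sketch of the real use case: 16-byte UUID4 values stored in a
# fixed-size binary field (the repro below uses 4-byte values)
uuid_array = pa.array([uuid.uuid4().bytes for _ in range(10)],
                      type=pa.binary(16))
{code}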

Here is a minimal example to reproduce the crash:

{code:python}
import pyarrow as pa
from pyarrow import parquet as pq

data = {"col": pa.array([b"1234" for _ in range(10)])}
fields = [("col", pa.binary(4))]
schema = pa.schema(fields)
table = pa.table(data, schema)
pq.write_table(table, "test.parquet")
{code}

Running this crashes the interpreter:

{noformat}
segmentation fault (core dumped)  ipython
{noformat}
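Until this is fixed, one possible workaround (an untested sketch; it assumes the fixed-size-binary to variable-length-binary cast is available in this pyarrow version) is to cast the column before writing:

{code:python}
# Possible workaround (sketch, not verified on 0.14.1): cast the
# fixed-size binary column to variable-length binary before writing.
# This loses the fixed-size type on disk but avoids the crashing path.
target = pa.schema([("col", pa.binary())])
pq.write_table(table.cast(target), "test.parquet")
{code}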

 

Yet it works if I don't specify the size of the binary field:

{code:python}
import pyarrow as pa
from pyarrow import parquet as pq

data = {"col": pa.array([b"1234" for _ in range(10)])}
fields = [("col", pa.binary())]
schema = pa.schema(fields)
table = pa.table(data, schema)
pq.write_table(table, "test.parquet")
{code}
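As a sanity check (not part of the original repro), the file written by the working variant can be read back to confirm the column type:

{code:python}
# Sanity check: read the file back and inspect the schema.
# Expected output (assumption): "col: binary"
result = pq.read_table("test.parquet")
print(result.schema)
{code}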

Thanks,



> Segmentation fault on writing tables with fixed size binary fields 
> -------------------------------------------------------------------
>
>                 Key: ARROW-6520
>                 URL: https://issues.apache.org/jira/browse/ARROW-6520
>             Project: Apache Arrow
>          Issue Type: Bug
>          Components: Python
>    Affects Versions: 0.14.1
>         Environment: Arch Linux x86_64
> arrow-cpp                 0.14.1           py37h6b969ab_1    conda-forge
> parquet-cpp               1.5.1                         2    conda-forge
> pyarrow                   0.14.1           py37h8b68381_0    conda-forge
> python                    3.7.3                h33d41f4_1    conda-forge
>            Reporter: Furkan Tektas
>            Priority: Critical
>              Labels: newbie
>



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
