[ 
https://issues.apache.org/jira/browse/ARROW-2555?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16802129#comment-16802129
 ] 

Jakub Okoński commented on ARROW-2555:
--------------------------------------

Does this depend on a specific pandas version? I'm still having problems on 
pyarrow 0.12.0:

 

{code:python}
In [35]: pyarrow.__version__
Out[35]: '0.12.0'

In [36]: pd.__version__
Out[36]: '0.20.3'

In [38]: from datetime import datetime
    ...: import pandas as pd
    ...:
    ...: dt = datetime(day=1, month=1, year=2017, hour=1, minute=1, second=1, microsecond=1)
    ...: values = [(dt,)]
    ...: df = pd.DataFrame.from_records(values, columns=['ts'])
    ...:

In [39]: df
Out[39]:
                          ts
0 2017-01-01 01:01:01.000001

In [40]: pyarrow.parquet.write_table(table=pyarrow.Table.from_pandas(df, schema=pyarrow.schema([pyarrow.field('ts', pyarrow.timestamp('ms'))])), coerce_timestamps='ms', where='/array/test.parquet', allow_truncated_timestamps=True)
---------------------------------------------------------------------------
ArrowInvalid                              Traceback (most recent call last)
...

ArrowInvalid: ('Casting from timestamp[ns] to timestamp[ms] would lose data: 1483232461000001000', 'Conversion failed for column ts with type datetime64[ns]')
{code}
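For reference, the arithmetic behind the error is plain integer truncation: the nanosecond value quoted in the traceback cannot be represented at millisecond resolution without dropping the trailing microsecond. A stdlib-only sketch, using the constant taken from the error message above:

```python
# The ns value from the ArrowInvalid message: 2017-01-01 01:01:01.000001 UTC.
ns = 1483232461000001000

# Truncating to millisecond resolution drops the 1 microsecond...
ms = ns // 1_000_000
print(ms)                    # 1483232461000

# ...so the conversion does not round-trip, which is why Arrow refuses
# the cast unless the caller explicitly allows truncation.
print(ms * 1_000_000 == ns)  # False
```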

> [Python] Provide an option to convert on coerce_timestamps instead of error
> ---------------------------------------------------------------------------
>
>                 Key: ARROW-2555
>                 URL: https://issues.apache.org/jira/browse/ARROW-2555
>             Project: Apache Arrow
>          Issue Type: Improvement
>          Components: Python
>            Reporter: Uwe L. Korn
>            Assignee: Eric Conlon
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 0.11.0
>
>          Time Spent: 3h 10m
>  Remaining Estimate: 0h
>
> At the moment, we error out on {{coerce_timestamps='ms'}} in 
> {{pyarrow.parquet.write_table}} if the data contains a timestamp that would 
> lose information when converted to milliseconds. In many cases the user 
> does not care about this granularity and would rather have the convenience 
> of the timestamps being stored in Parquet regardless. Thus we should 
> provide an option to ignore the error and perform the lossy conversion.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
