honno opened a new issue, #35535:
URL: https://github.com/apache/arrow/issues/35535

   ### Describe the bug, including details regarding any error messages, 
version, and platform.
   
   First off, this might not be a problem; let me know if that's the case! It's hard
for me to wrap my head around the consequences of Arrow's NA model when
interchanging :sweat_smile:
   
   So, if one interchanges a `pandas.DataFrame` containing NaNs into a
`pyarrow.Table`, the NaNs come out as nulls.
   
   ```python
   >>> import pandas as pd
   >>> import numpy as np
   >>> df = pd.DataFrame({"foo": pd.Series([float("nan")], dtype=np.float64)})
   >>> from pyarrow.interchange import from_dataframe
   >>> from_dataframe(df)
   pyarrow.Table
   foo: double
   ----
   foo: [[null]]    # expect NaN?
   ```
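   
   Poking at the interchange objects themselves suggests why: pandas reports float columns as using NaN as the null sentinel, so a consumer like `from_dataframe()` treats every NaN as missing. A minimal check (the exact enum repr may vary across pandas/Python versions):
   
   ```python
   >>> import numpy as np
   >>> import pandas as pd
   >>> df = pd.DataFrame({"foo": pd.Series([float("nan")], dtype=np.float64)})
   >>> # describe_null returns (kind, sentinel value) per the interchange protocol
   >>> df.__dataframe__().get_column_by_name("foo").describe_null
   (<ColumnNullType.USE_NAN: 1>, None)
   ```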
   
   We get the same with `modin`, which also adopts the interchange protocol.
   
   ```python
   >>> import numpy as np
   >>> from pyarrow.interchange import from_dataframe
   >>> import ray
   >>> ray.init(local_mode=True)
   >>> from modin.config import Engine
   >>> Engine.put("ray")
   >>> from modin import pandas as mpd
   >>> df = mpd.DataFrame({"foo": mpd.Series([float("nan")], dtype=np.float64)})
   >>> from_dataframe(df)
   pyarrow.Table
   foo: double
   ----
   foo: [[null]]
   ```
   
   I see that interchanging another `pa.Table` containing NaNs works fine, presumably
because `from_dataframe()` short-circuits when it gets a
`pa.Table`/`pa.RecordBatch`.
   
   ```python
   >>> import pyarrow as pa
   >>> table = pa.Table.from_pydict({"foo": pa.array([float("nan")],
type=pa.float64())})
   >>> from_dataframe(table)
   pyarrow.Table
   foo: double
   ----
   foo: [[nan]]
   ```
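   
   For what it's worth, Arrow's own memory model keeps NaN and null distinct (NaN is an ordinary float value; null lives in the validity bitmap), which is why the mapping above surprised me. A quick demonstration:
   
   ```python
   >>> import pyarrow as pa
   >>> arr = pa.array([float("nan"), None], type=pa.float64())
   >>> arr.null_count   # only None counts as null; NaN stays a regular value
   1
   ```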
   
   I'm using the Arrow nightly from https://pypi.fury.io/arrow-nightlies/
   
   cc @AlenkaF
   
   
   
   ### Component(s)
   
   Python

