wirable23 commented on issue #35584:
URL: https://github.com/apache/arrow/issues/35584#issuecomment-1553476325

   Thanks @westonpace for the clarification. The pa.array API is documented as safe by default, while the cast compute API treats truncation as unsafe. Why is truncation allowed in pa.array but rejected as unsafe in cast?
   
   >>> a = pa.array([1.2])
   >>> a
   <pyarrow.lib.DoubleArray object at 0x000002BD16F34EE0>
   [
     1.2
   ]
   >>> a.cast(pa.int64())
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "pyarrow\array.pxi", line 935, in pyarrow.lib.Array.cast
     File "C:\ca2_ps_311\Lib\site-packages\pyarrow\compute.py", line 400, in 
cast
       return call_function("cast", [arr], options, memory_pool)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     File "pyarrow\_compute.pyx", line 572, in pyarrow._compute.call_function
     File "pyarrow\_compute.pyx", line 367, in pyarrow._compute.Function.call
     File "pyarrow\error.pxi", line 144, in 
pyarrow.lib.pyarrow_internal_check_status
     File "pyarrow\error.pxi", line 100, in pyarrow.lib.check_status
   pyarrow.lib.ArrowInvalid: Float value 1.2 was truncated converting to int64
   >>>
   
   In a safe cast, truncation raises an error, but safe array creation checks every other condition except truncation. If this is by design, should this issue simply be closed?
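   
   For contrast, here is a minimal sketch of the array-construction side (the silent truncation this issue describes; the exact repr and output are illustrative and may vary by pyarrow version):
   
   >>> import pyarrow as pa
   >>> pa.array([1.2], type=pa.int64())  # safe=True is the default
   <pyarrow.lib.Int64Array object at 0x...>
   [
     1
   ]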

