raulcd opened a new issue, #49083:
URL: https://github.com/apache/arrow/issues/49083

   ### Describe the bug, including details regarding any error messages, 
version, and platform.
   
   The nightly job for dask ([test-conda-python-3.11-dask-upstream_devel](https://github.com/ursacomputing/crossbow/actions/runs/21500242032/job/61944769588)) has been failing for the last ~10 days, which matches the release of pandas 3.0.0.
   
   There are 10 test failures. From what I can see they don't all share the same cause; some examples:
   ```
           if check_dtype and str(adt) != str(bdt):
   >           raise AssertionError(f"a and b have different dtypes: (a: {adt}, b: {bdt})")
   E           AssertionError: a and b have different dtypes: (a: str, b: object)
   ```
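This first failure looks like the pandas 3 change of the default string dtype (columns that used to report `object` now report `str`). A minimal pure-Python sketch of the dtype assertion that trips, with hypothetical function and parameter names (not dask's actual helper):

```python
def assert_same_dtype(adt, bdt, check_dtype=True):
    """Simplified sketch of the dtype assertion failing above.

    Under pandas 3, a string column that previously reported dtype
    ``object`` now reports the new default string dtype ``str``, so a
    string comparison of the two dtypes no longer matches.
    """
    if check_dtype and str(adt) != str(bdt):
        raise AssertionError(
            f"a and b have different dtypes: (a: {adt}, b: {bdt})"
        )
```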
   or
   ```
   pyarrow/error.pxi:155: in pyarrow.lib.pyarrow_internal_check_status
       ???
   _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
   
   >   ???
   E   pyarrow.lib.ArrowNotImplementedError: Function 'quantile' has no kernel matching input types (large_string)
   ```
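This second failure is consistent with a column that used to be numeric (or `object`) arriving at `quantile` as a (large_)string array, for which pyarrow registers no kernel. A pure-Python sketch of that kind of kernel dispatch; the kernel table and function names here are hypothetical, for illustration only:

```python
# Illustrative only: 'quantile' registers kernels for numeric inputs, so a
# string-typed input has no matching kernel and dispatch fails.
QUANTILE_KERNELS = {"int64", "float64", "double"}  # hypothetical registry

def call_quantile(input_type):
    """Sketch of compute-function dispatch raising a 'no kernel' error."""
    if input_type not in QUANTILE_KERNELS:
        raise NotImplementedError(
            f"Function 'quantile' has no kernel matching input types ({input_type})"
        )
    return "dispatched"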
   or
   ```
           if any(blk.dtype.kind in "mM" for blk in self._mgr.blocks):
               msg = (
                   "DataFrame contains columns with dtype datetime64 "
                   "or timedelta64, which are not supported for cov."
               )
   >           raise TypeError(msg)
   E           TypeError: DataFrame contains columns with dtype datetime64 or timedelta64, which are not supported for cov.
   ```
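For the third failure, the pandas check rejects any block whose NumPy dtype kind code is `'M'` (datetime64) or `'m'` (timedelta64), so a conversion difference that leaves a datetime-like column where the test previously saw another dtype would trigger it. A simplified sketch of that check, with a hypothetical helper name:

```python
def check_cov_dtypes(column_kinds):
    """Simplified sketch of the pandas ``cov`` dtype check above.

    NumPy dtype kind codes: 'M' is datetime64, 'm' is timedelta64.
    """
    if any(kind in "mM" for kind in column_kinds):
        raise TypeError(
            "DataFrame contains columns with dtype datetime64 "
            "or timedelta64, which are not supported for cov."
        )
```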
   
   but all of them seem to be related to differences in dtype conversion, and all of this appears tied to the pandas 3 release. I think we've seen some of these issues in the pandas integration tests before.
   
   ### Component(s)
   
   Continuous Integration, Python


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
