[GitHub] [arrow] kszucs edited a comment on pull request #9115: ARROW-10457: [CI] Fix Spark branch-3.0 integration tests

2021-01-15 Thread GitBox


kszucs edited a comment on pull request #9115:
URL: https://github.com/apache/arrow/pull/9115#issuecomment-76146


   ```
   Caused by: java.lang.RuntimeException: No DefaultAllocationManager found on
   classpath. Can't allocate Arrow buffers. Please consider adding
   arrow-memory-netty or arrow-memory-unsafe as a dependency.
   ```
   
   This change was introduced by 
https://github.com/apache/arrow/commit/2092e18752a9c0494799493b12eb1830052217a2,
   which is part of the previous release, so I assume this is a testing issue (or 
something that should be modified on Spark's side).
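
   The error message itself suggests the fix: one of Arrow's allocation backends
   must be on the test classpath. A minimal Maven sketch (the version shown is
   illustrative and should match the Arrow release being tested):
   ```xml
   <!-- One of these backends provides the DefaultAllocationManager that the
        arrow-memory-core module looks up at runtime; arrow-memory-unsafe is
        an alternative to arrow-memory-netty. Version is illustrative. -->
   <dependency>
     <groupId>org.apache.arrow</groupId>
     <artifactId>arrow-memory-netty</artifactId>
     <version>3.0.0</version>
     <scope>runtime</scope>
   </dependency>
   ```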



This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [arrow] kszucs edited a comment on pull request #9115: ARROW-10457: [CI] Fix Spark branch-3.0 integration tests

2021-01-15 Thread GitBox


kszucs edited a comment on pull request #9115:
URL: https://github.com/apache/arrow/pull/9115#issuecomment-760917973


   We should definitely test against Spark releases.
   
   I ran the build against Spark 3.0.1, but it fails with:
   ```
   - max records in batch conf *** FAILED ***
   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
   in stage 21.0 failed 1 times, most recent failure: Lost task 0.0 in stage 21.0
   (TID 25, 5fc0f8cfe8d2, executor driver): java.lang.NoClassDefFoundError: Could
   not initialize class org.apache.spark.sql.util.ArrowUtils$
   ```
   
   cc @BryanCutler 
