vrozov commented on PR #52099:
URL: https://github.com/apache/spark/pull/52099#issuecomment-3238073241

   > To @sarutak, I have no opinion on the PR because apparently it hasn't passed CI yet. What I can say for now is that the Apache Spark community doesn't allow [this kind of regression](https://github.com/apache/spark/pull/52099/files#diff-9c5fb3d1b7e3b0f54bc5c4182965c4fe1f9023d449017cece3005d3f90e8e4d8R210) (from this PR). I'm not sure why the Hive 4.1 upgrade proposal forces us to downgrade the dependencies. I hope it's a mistake, and we should make sure to avoid this kind of hidden change.
   > 
   > ```
   > - <antlr4.version>4.13.1</antlr4.version>
   > + <antlr4.version>4.9.3</antlr4.version>
   > ```
   
   @dongjoon-hyun @sarutak Unfortunately it is not a mistake. Hive uses ANTLR 4.9.3, which is not compatible with 4.10.x and above: "_Mixing ANTLR 4.9.3 and 4.10 can lead to errors that point to a version mismatch. A very common Java error looks like this:
   java.io.InvalidClassException: org.antlr.v4.runtime.atn.ATN; Could not deserialize ATN with version 4 (expected 3)_", and this is exactly the error I got without changing `antlr4.version`.
   
   One possible solution is to shade ANTLR 4 in Spark so that there is no conflict between the Spark and Hive versions; a rough sketch of the relocation is below. Please let me know what you think, and if it sounds reasonable, I'll open a separate PR for the shading.
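   
   To make the idea concrete, here is a minimal sketch of what the shading could look like with maven-shade-plugin. The relocated package name `org.sparkproject.antlr.v4` and the module this would live in are assumptions for illustration, not a final proposal:
   
   ```xml
   <!-- Hypothetical relocation of the ANTLR 4 runtime; the shadedPattern
        package name is an illustration, not an agreed-upon choice. -->
   <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-shade-plugin</artifactId>
     <configuration>
       <artifactSet>
         <includes>
           <include>org.antlr:antlr4-runtime</include>
         </includes>
       </artifactSet>
       <relocations>
         <relocation>
           <pattern>org.antlr.v4</pattern>
           <shadedPattern>org.sparkproject.antlr.v4</shadedPattern>
         </relocation>
       </relocations>
     </configuration>
   </plugin>
   ```
   
   The intent would be that Spark's parser code references the relocated runtime while Hive keeps loading its own unshaded 4.9.3 classes, so the two ATN serialization formats never collide on the same classpath.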

