Hi,
https://issues.apache.org/jira/browse/SPARK-19076
Regards,
Jacek Laskowski
https://about.me/JacekLaskowski
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowsk
Hi all,
I want to include Sentry 2.0.0 in my Spark project, but it bundles
Hive 2.3.2. I see that even the newest Spark, 2.2.1, still bundles old
Hive jars, for example hive-exec-1.2.1.spark2.jar. Why hasn't Spark
upgraded to the newer Hive? Are the two versions compatible?
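A common workaround while the two Hive versions conflict is to exclude the Hive jars that Sentry pulls in transitively, so Spark's bundled Hive 1.2.1 classes win on the classpath. The sketch below assumes an sbt build and the sentry-binding-hive artifact; the exact Sentry module and which Hive artifacts it drags in may differ in your project, so verify with a dependency report first:

```
// build.sbt sketch (assumptions: sbt build, sentry-binding-hive is the
// module in use, and hive-exec / hive-metastore are the conflicting jars).
libraryDependencies ++= Seq(
  // Spark provides its own forked hive-exec-1.2.1.spark2.jar at runtime.
  "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided,
  // Keep Sentry but drop its Hive 2.3.2 transitive dependencies.
  ("org.apache.sentry" % "sentry-binding-hive" % "2.0.0")
    .exclude("org.apache.hive", "hive-exec")
    .exclude("org.apache.hive", "hive-metastore")
)
```

Run `sbt dependencyTree` (via the sbt-dependency-graph plugin) or `mvn dependency:tree` to confirm which Hive coordinates actually come in transitively before choosing what to exclude.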
Regards,
Qin An.