Hi,

https://issues.apache.org/jira/browse/SPARK-19076

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

On Mon, Dec 11, 2017 at 7:43 AM, An Qin <a...@qilinsoft.com> wrote:
> Hi, all,
>
> I want to include Sentry 2.0.0 in my Spark project. However, it bundles
> Hive 2.3.2. I find that the newest Spark 2.2.1 still bundles old Hive
> jars, for example, hive-exec-1.2.1.spark2.jar. Why doesn't it upgrade to
> the new Hive? Are they compatible?
>
> Regards,
>
> Qin An
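[Editor's note] One common way to work around this kind of clash is to keep Spark's own Hive jars on the classpath and exclude the Hive artifacts that the other dependency pulls in transitively. The sketch below shows the general Maven exclusion mechanism only; the Sentry artifact coordinates and the exact set of transitive Hive artifacts are assumptions that would need to be verified against the real Sentry 2.0.0 POM.

```xml
<!-- Sketch: exclude the Hive jars that the Sentry dependency drags in,
     so they do not conflict with Spark's bundled hive-exec-1.2.1.spark2.jar.
     groupId/artifactId values here are illustrative assumptions. -->
<dependency>
  <groupId>org.apache.sentry</groupId>
  <artifactId>sentry-binding-hive</artifactId>
  <version>2.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-exec</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-metastore</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Whether Sentry 2.0.0 actually works against Hive 1.2.1 classes at runtime is a separate compatibility question that the exclusion alone does not answer.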