Hi, All.

Since Apache Spark 3.0.0, Apache Hive 2.3.7 has been the default
Hive execution library. The forked Hive 1.2.1 library is not
recommended because it is no longer properly maintained.

In Apache Spark 3.1, planned for December 2020, we are going to
remove it from our official distribution.

    https://github.com/apache/spark/pull/29856
    SPARK-32981 Remove hive-1.2/hadoop-2.7 from Apache Spark 3.1 distribution

Of course, users can still build it from source because the
`hive-1.2` profile is still available.
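
For reference, such a build would look roughly like the following
(a sketch only; the exact profile combination and flags may vary by
branch):

    ./build/mvn -Phive-1.2 -Phadoop-2.7 -DskipTests clean package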

Please let us know if you still plan to build Apache Spark 3.1 with
the forked, unofficial Hive 1.2.1 library. We want to hear about your
pain points before moving forward in this area. Eventually, we will
remove Hive 1.2 as the last piece of the migration to Hive 2.3/Hadoop 3/Java 11+.

Bests,
Dongjoon.
