Hi,
Good day.
My setup:
1. Single-node Hadoop 2.7.3 on Ubuntu 16.04.
2. Hive 2.1.1 with metastore in MySQL.
3. Spark 2.1.0 configured using hive-site.xml to use MySQL metastore.
4. The VERSION table contains SCHEMA_VERSION = 2.1.0
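For reference, this is how I check the schema version directly in the MySQL metastore (the database name `metastore` and the user `hiveuser` are placeholders from my local setup):

```shell
# Query the Hive metastore's VERSION table in MySQL.
# Database name and credentials are specific to my setup.
mysql -u hiveuser -p -D metastore \
  -e "SELECT VER_ID, SCHEMA_VERSION, VERSION_COMMENT FROM VERSION;"
```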
Hive CLI works fine.
However, when I start spark-shell or spark-sql, Spark resets
SCHEMA_VERSION to 1.2.0.
The Hive CLI then fails to start. After I manually update the
VERSION table, it works fine again.
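For completeness, this is the manual fix I run each time (against the same MySQL metastore database; names are placeholders from my setup):

```shell
# Restore the schema version so that Hive CLI 2.1.1 starts again.
# Assumes a single row in VERSION with VER_ID = 1, as in my metastore.
mysql -u hiveuser -p -D metastore \
  -e "UPDATE VERSION SET SCHEMA_VERSION = '2.1.0' WHERE VER_ID = 1;"
```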
I see in the spark/jars directory that the Hive-related jars are
version 1.2.1.
I also tried building Spark from source, but since Spark uses Hive
1.2.1 by default, I get the same set of jars.
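This is what I see on my installation (abridged):

```shell
# List the Hive jars bundled with Spark 2.1.0.
ls $SPARK_HOME/jars | grep hive
```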
How can we make Spark 2.1.0 work with Hive 2.1.1?
Thanks in advance!
Best regards / Mit freundlichen Grüßen / Sincères salutations
M. Lohith Samaga
[cid:[email protected]]