GitHub user blue666man commented on the issue:
https://github.com/apache/spark/pull/21398
To weigh in a bit, we're one of the larger enterprises that hit this bug,
which we consider a regression going from Spark 1.6.3 (on Mesos, performing
DDL against a Hive instance on CDH) to >= 2.1.x. I think many more large
enterprises will hit this when running Spark on K8s against Sentry-secured
Hive databases.
IMO, [this
commit](https://github.com/apache/spark/commit/ce13c2672318242748f7520ed4ce6bcfad4fb428),
while well-intentioned in removing some of the hacks for 2.1.0, was a reversal
of Spark's sort-of-implicit commitment to keeping Sentry/Hive DDL working. I
agree this should also be fixed upstream in Sentry, but that would be a long
wait for us: the inability to create tables from Spark 2.x on Mesos is blocking
hundreds of devs from upgrading to Spark 2.
Basically, the hacks/workarounds existed in the past and worked. Please
add a workaround back until Sentry can fix the location issue on their side.