Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/7929#discussion_r36273851
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/ClientWrapper.scala ---
@@ -62,6 +64,52 @@ private[hive] class ClientWrapper(
extends ClientInterface
with Logging {
+ overrideHadoopShims()
+
+ // !! HACK ALERT !!
+ //
+ // This method is a surgical fix for Hadoop version 2.0.0-mr1-cdh4.1.1, which is used by Spark EC2
+ // scripts. We should remove this after upgrading Spark EC2 scripts to some more recent Hadoop
+ // version in the future.
+ //
+ // Internally, Hive `ShimLoader` tries to load different versions of Hadoop shims by checking
+ // version information gathered from Hadoop jar files. If the major version number is 1,
+ // `Hadoop20SShims` will be loaded. Otherwise, if the major version number is 2, `Hadoop23Shims`
+ // will be chosen.
+ //
+ // However, parts of the APIs in Hadoop 2.0.x and 2.1.x were in flux due to historical
+ // reasons. So 2.0.0-mr1-cdh4.1.1 is actually more Hadoop-1-like and should be used together with
--- End diff --
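The major-version check that the quoted comment attributes to Hive's `ShimLoader` can be sketched roughly as follows. This is a minimal illustration only; `ShimSelector` and `shimClassFor` are hypothetical names for this sketch, not Hive's actual API, and the real `ShimLoader` derives the version from Hadoop jar metadata rather than a plain string argument.

```scala
// Hypothetical sketch of version-based shim selection, assuming the
// Hadoop version is already available as a string such as "2.0.0-mr1-cdh4.1.1".
object ShimSelector {
  // Mirrors the described logic: major version 1 -> Hadoop20SShims,
  // major version 2 -> Hadoop23Shims.
  def shimClassFor(hadoopVersion: String): String = {
    // Take everything before the first '.' as the major version number.
    val major = hadoopVersion.split('.').head.toInt
    major match {
      case 1 => "org.apache.hadoop.hive.shims.Hadoop20SShims"
      case 2 => "org.apache.hadoop.hive.shims.Hadoop23Shims"
      case v => throw new IllegalArgumentException(s"Unsupported Hadoop major version: $v")
    }
  }
}
```

Under this selection rule, 2.0.0-mr1-cdh4.1.1 maps to `Hadoop23Shims` purely because its major version is 2, which is exactly the mismatch the hack above works around.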
My gut is that there's much more reason to believe other 2.0.x builds work
the same way. The method in question here (as far as I understand) never
appeared in any 2.0.x release. Occam's razor would suggest not special-casing
here. I don't know that CDH4 is the only relevant 2.0.x release; certainly
upstream Apache Hadoop made a number of 2.0.x releases that this change would
(again, as far as I understand) also affect, yet they would be left out.
At the least, let's get the comment updated. Also, `mr1` really isn't
relevant. I would not special-case cdh4, since the comments will say it's not
special.