Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/11796#issuecomment-199921064
These tests fail sporadically; I think there's some weird sbt dependency
resolution issue causing different Hadoop versions to get mixed up on the classpath. e.g.
```
16/03/21 19:43:11.830 redirect stderr for command ./bin/spark-submit INFO
Utils: Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.hadoop.conf.Configuration.addDeprecations([Lorg/apache/hadoop/conf/Configuration$DeprecationDelta;)V
16/03/21 19:43:11.830 redirect stderr for command ./bin/spark-submit INFO
Utils: at
org.apache.hadoop.mapreduce.util.ConfigUtil.addDeprecatedKeys(ConfigUtil.java:54)
16/03/21 19:43:11.830 redirect stderr for command ./bin/spark-submit INFO
Utils: at
org.apache.hadoop.mapreduce.util.ConfigUtil.loadResources(ConfigUtil.java:42)
16/03/21 19:43:11.830 redirect stderr for command ./bin/spark-submit INFO
Utils: at org.apache.hadoop.mapred.JobConf.<clinit>(JobConf.java:118)
```
That's not the whole stack trace, but every frame in it is Hadoop code,
so it's not Spark calling a method that was removed.
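One way to confirm this kind of version mix-up is to check at runtime whether the class the JVM actually loaded has the method the stack trace says is missing, and which jar it came from. A minimal sketch (the `hasMethod` helper is hypothetical, not part of Spark or Hadoop; the JDK classes stand in for `org.apache.hadoop.conf.Configuration` so the snippet is self-contained):

```java
import java.lang.reflect.Method;
import java.security.CodeSource;

public class ClasspathCheck {
    // Hypothetical helper: does the loaded version of `className`
    // expose a public method named `methodName` (any signature)?
    static boolean hasMethod(String className, String methodName) {
        try {
            for (Method m : Class.forName(className).getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
        } catch (ClassNotFoundException e) {
            // Class isn't on the classpath at all.
        }
        return false;
    }

    // Hypothetical helper: where was this class loaded from?
    // Returns null for JDK bootstrap classes; for app classes it
    // points at the jar, which reveals which version actually won.
    static String loadedFrom(String className) throws ClassNotFoundException {
        CodeSource src = Class.forName(className)
                .getProtectionDomain().getCodeSource();
        return src == null ? null : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // In the failing test you'd ask about
        // "org.apache.hadoop.conf.Configuration" and "addDeprecations";
        // here a JDK class keeps the sketch runnable.
        System.out.println(hasMethod("java.lang.String", "isEmpty"));
        System.out.println(hasMethod("java.lang.String", "addDeprecations"));
        System.out.println(loadedFrom("java.lang.String"));
    }
}
```

If `hasMethod` returns false for a method that the compiled-against Hadoop version declares, an older Hadoop jar shadowed the expected one, which matches the `NoSuchMethodError` above.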