[
https://issues.apache.org/jira/browse/SPARK-5115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14267074#comment-14267074
]
Sean Owen commented on SPARK-5115:
----------------------------------
I just deleted my IntelliJ project config for Spark ({{.idea/}} and all {{.iml}}
files), reimported the Maven build from master, and accepted all the defaults. The
build is fine for me*, and {{yarn/}} is not even a module, since the {{yarn}}
profile, which enables that module, is not on by default. So I think you have
somehow activated the YARN-related module; doing that deliberately takes another
step or two in the build -- activating the {{yarn}} and {{hadoop-2.4}} profiles,
for example, is what I do.
If I turn on these profiles, reimport the Maven project, and rebuild in IntelliJ,
{{yarn}} becomes a module and it builds OK for me.
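For reference, the way the {{yarn}} profile pulls that module in looks roughly
like this in the parent {{pom.xml}} (a sketch from memory, not a verbatim copy):
{code:xml}
<!-- Sketch of the relevant profile in the parent pom.xml; the real file may
     differ in details. Activating -Pyarn adds the YARN modules to the reactor,
     so IntelliJ only sees them if the profile is enabled at import time. -->
<profile>
  <id>yarn</id>
  <modules>
    <module>yarn</module>
    <module>network/yarn</module>
  </modules>
</profile>
{code}
On the command line that corresponds to something like
{{mvn -Pyarn -Phadoop-2.4 -DskipTests package}}; in IntelliJ the equivalent is
checking those profiles under Profiles in the Maven Projects tool window before
reimporting.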
I hope that resolves the compile error you see and gets rid of the red. This is
why I'm saying I don't see that there's a basic developer sanity problem to
fix. The build seems to do what it's supposed to when put into IntelliJ.
Separately, the idea of updating the default Hadoop version to something more
modern (Hadoop 2.4? YARN-enabled?) sounds fine to me on its own -- not because it
solves a problem, but because it feels like a more sensible default in 2015.
* I find I have to run 'Generate Sources' in IntelliJ before the first build, or
else Make won't find the generated sources in the flume-sink module, but I think
that's unrelated here.
** Hm, I see some crazy-looking compiler errors from the Catalyst DSL package the
first time I compile, which then go away, but I also think that's unrelated, or
something to do with code generation.
> Intellij fails to find hadoop classes in Spark "yarn" modules
> -------------------------------------------------------------
>
> Key: SPARK-5115
> URL: https://issues.apache.org/jira/browse/SPARK-5115
> Project: Spark
> Issue Type: Improvement
> Components: YARN
> Affects Versions: 1.2.0
> Reporter: Ryan Williams
>
> IntelliJ's parsing of Spark's POMs works like a charm for the most part, but it
> fails to resolve the Hadoop and YARN dependencies in the Spark {{yarn}} and
> {{network/yarn}} modules.
> Imports and later references to the imported classes show up as errors, e.g.:
> !http://f.cl.ly/items/0g3w3s0t45402z30011l/Screen%20Shot%202015-01-06%20at%206.42.52%20PM.png!
> Opening the module settings, we see that IntelliJ is looking for version
> {{1.0.4}} of [each YARN JAR that the Spark YARN module depends
> on|https://github.com/apache/spark/blob/bb38ebb1abd26b57525d7d29703fd449e40cd6de/yarn/pom.xml#L41-L56]
> and failing to find it:
> !http://f.cl.ly/items/2d320l2h2o2N1m0t2X3b/yarn.png!
> This, in turn, is due to the parent POM [defaulting {{hadoop.version}} to
> {{1.0.4}}|https://github.com/apache/spark/blob/bb38ebb1abd26b57525d7d29703fd449e40cd6de/pom.xml#L122].
> AFAIK, having the default Hadoop version be {{1.0.4}} is not that important and
> may just be an accident of history; when building Spark, people typically select
> a Maven profile that matches the version of Hadoop they intend to run with.
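> Concretely, the relevant pieces of the parent POM look roughly like this
> (paraphrased; see the links above for the real file):
> {code:xml}
> <!-- Paraphrased from the parent pom.xml: the default used when no
>      hadoop-* profile is selected... -->
> <properties>
>   <hadoop.version>1.0.4</hadoop.version>
> </properties>
>
> <!-- ...and a profile such as hadoop-2.4 overrides it at build time. -->
> <profile>
>   <id>hadoop-2.4</id>
>   <properties>
>     <hadoop.version>2.4.0</hadoop.version>
>   </properties>
> </profile>
> {code}
> IntelliJ resolves dependencies against whichever profiles are enabled at import
> time, so with none checked it falls back to {{1.0.4}} -- and the
> {{hadoop-yarn-*}} artifacts simply don't exist at that version, since YARN only
> arrived with Hadoop 2.x.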
> This suggests one possible fix: bump the default Hadoop version to >= 2. I've
> tried this locally and it resolves IntelliJ's difficulties with the {{yarn}}
> and {{network/yarn}} modules; [PR
> #3917|https://github.com/apache/spark/pull/3917] does this.
> Another fix would be to declare a {{hadoop.version}} property in
> {{yarn/pom.xml}} and use it for the YARN dependencies declared in that file; [PR
> #3918|https://github.com/apache/spark/pull/3918] does that.
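> Roughly, that second change amounts to something like the following in
> {{yarn/pom.xml}} (an illustrative sketch only; the version shown is a
> placeholder, see the PR for the actual change):
> {code:xml}
> <!-- Illustrative sketch; see PR #3918 for the real change. Give the YARN
>      module its own default for hadoop.version and reference it in the
>      YARN dependencies declared in this file. -->
> <properties>
>   <hadoop.version>2.4.0</hadoop.version>
> </properties>
>
> <dependencies>
>   <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-yarn-api</artifactId>
>     <version>${hadoop.version}</version>
>   </dependency>
>   <!-- ...the other hadoop/yarn dependencies follow the same pattern... -->
> </dependencies>
> {code}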
> With the former, it is more obvious to me that the existing rules governing
> which {{hadoop.version}} the YARN dependencies inherit will still apply. For the
> latter, or for other ways of configuring IntelliJ / Spark's POMs to address this
> issue, someone with more Maven/IntelliJ fu may need to chime in.