GitHub user srowen opened a pull request:

    https://github.com/apache/spark/pull/3952

    SPARK-5136 [DOCS] Improve documentation around setting up Spark IntelliJ 
project

    This PR simply points to the IntelliJ wiki page instead of also including 
IntelliJ notes in the docs. The intent, however, is to also update the wiki 
page with refreshed tips. This is the text I propose for the IntelliJ section 
on the wiki. I realize it omits some of the existing instructions on the wiki 
about enabling Hive, but I think those are actually optional.
    
    ------
    
    IntelliJ supports both Maven- and SBT-based projects. It is recommended, 
however, to import Spark as a Maven project. Choose "Import Project..." from 
the File menu, and select the `pom.xml` file in the Spark root directory. 
    
    It is fine to leave all settings at their default values in the Maven 
import wizard, with two caveats. First, it is usually useful to enable "Import 
Maven projects automatically", since changes to the project structure will 
then automatically update the IntelliJ project.
    
    Second, note the step that prompts you to choose active Maven build 
profiles. As documented above, some build configurations require specific 
profiles to be enabled. The same profiles that are enabled with `-P[profile 
name]` above may be enabled on this screen. For example, if developing for 
Hadoop 2.4 with YARN support, enable the `yarn` and `hadoop-2.4` profiles.
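
    For reference, these are the same profiles that drive a command-line 
build of Spark. A sketch of the equivalent build command, assuming the Hadoop 
2.4 / YARN example above (exact flags vary by Spark version):

        $ mvn -Pyarn -Phadoop-2.4 -DskipTests clean package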
    
    These selections can be changed later by accessing the "Maven Projects" 
tool window from the View menu, and expanding the Profiles section.
    
    "Rebuild Project" can fail the first time the project is compiled, because 
generate source files are not automatically generated. Try clicking the  
"Generate Sources and Update Folders For All Projects" button in the "Maven 
Projects" tool window to manually generate these sources.
    
    Compilation may fail with an error like "scalac: bad option: 
-P:/home/jakub/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar".
 If so, go to Preferences > Build, Execution, Deployment > Scala Compiler and 
clear the "Additional compiler options" field. Compilation will then work, 
although the option will come back when the project is re-imported.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/srowen/spark SPARK-5136

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/3952.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3952
    
----
commit 016b7dfbeb3f9272a3a80aa84acda8c2540ac31e
Author: Sean Owen <[email protected]>
Date:   2015-01-08T14:11:04Z

    Point to IntelliJ wiki page instead of also including IntelliJ notes in the 
docs

----

