[ https://issues.apache.org/jira/browse/SPARK-14220?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16540478#comment-16540478 ]

Erik LaBianca edited comment on SPARK-14220 at 7/11/18 6:07 PM:
----------------------------------------------------------------

Sorry for the newbie question, but I'd like to try a full Spark + Scala 2.12 
build, and none of the obvious paths work:

`build/mvn -DskipTests -P scala-2.12 clean package` fails with a number of 
missing dependencies, and

`build/sbt ++2.12.6 package` fails on a missing genjavadoc plugin. I have been 
able to build several of the individual modules this way, however.

I'm happy to update the "Building Spark" document with the official Scala 2.12 
build steps once we work out what they are.
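For reference, the recipe documented for the earlier 2.10/2.11 cross-build was to run the version-switching script before invoking Maven. Assuming the `dev/change-scala-version.sh` script accepts `2.12` and a matching `scala-2.12` profile is wired up (both are assumptions here, not confirmed), the equivalent would be:

```shell
# Sketch only: mirrors the documented 2.11 cross-build procedure.
# Assumes change-scala-version.sh supports a 2.12 argument and that
# a scala-2.12 Maven profile exists in the current checkout.
./dev/change-scala-version.sh 2.12
./build/mvn -Pscala-2.12 -DskipTests clean package
```

If the missing-dependency errors persist after switching the POMs, that would suggest the 2.12 artifacts for some dependencies have not been published yet rather than a local configuration problem.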

 


> Build and test Spark against Scala 2.12
> ---------------------------------------
>
>                 Key: SPARK-14220
>                 URL: https://issues.apache.org/jira/browse/SPARK-14220
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Build, Project Infra
>    Affects Versions: 2.1.0
>            Reporter: Josh Rosen
>            Priority: Blocker
>
> This umbrella JIRA tracks the requirements for building and testing Spark 
> against the current Scala 2.12 milestone.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
