Hi,

Bumping this up again! Why do the Spark modules still depend on Scala 2.11
versions in spite of changing the pom.xmls using ./dev/change-scala-version.sh 2.10?
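In case it helps narrow things down: grepping the tree for leftover 2.11
references after the script runs (a quick sanity check, assuming a stock
Spark checkout) is one way to see whether the rewrite missed anything:

grep -rn --include=pom.xml "_2.11" .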
Appreciate any quick help!!

Thanks

On Fri, Jun 16, 2017 at 2:59 PM, Kanagha Kumar <kpra...@salesforce.com>
wrote:

> Hey all,
>
>
> I'm trying to use Spark 2.0.2 with Scala 2.10 by following
> https://spark.apache.org/docs/2.0.2/building-spark.html#building-for-scala-210
>
> ./dev/change-scala-version.sh 2.10
> ./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
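>
> (Assuming the stock layout where the spark-sketch module lives under
> common/sketch, one quick check that the rewrite took effect is:
>
> grep -n "spark-tags" common/sketch/pom.xml
>
> which should now show a _2.10 suffix or a ${scala.binary.version}
> placeholder rather than a hard-coded _2.11.)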
>
>
> I could build the distribution successfully using
> bash -xv dev/make-distribution.sh --tgz  -Dscala-2.10 -DskipTests
>
> But when I run a Maven release with the command below, it keeps failing
> with the following error:
>
>
> Executing Maven: -B -f pom.xml -DscmCommentPrefix=[maven-release-plugin]
> -e -Dscala-2.10 -Pyarn -Phadoop-2.7 -Phadoop-provided -DskipTests
> -Dresume=false -U -X release:prepare release:perform
>
> Failed to execute goal on project spark-sketch_2.10: Could not resolve
> dependencies for project
> org.apache.spark:spark-sketch_2.10:jar:2.0.2-sfdc-3.0.0:
> Failure to find org.apache.spark:spark-tags_2.11:jar:2.0.2-sfdc-3.0.0
> in <a .. nexus repo...> was cached in the local repository, resolution will
> not be reattempted until the update interval of nexus has elapsed or
> updates are forced -> [Help 1]
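>
> (The "was cached in the local repository" part also means Maven is
> reusing a failed lookup; assuming the default local repository location,
> clearing the stale entry forces a fresh resolution attempt:
>
> rm -rf ~/.m2/repository/org/apache/spark/spark-tags_2.11
>
> though that alone wouldn't explain the 2.11 suffix.)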
>
>
> Why does spark-sketch depend on spark-tags_2.11 when I have already
> compiled against Scala 2.10? Any pointers would be helpful.
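>
> My working theory, unverified: release:perform builds from a fresh SCM
> checkout of the tag in a forked Maven process, so uncommitted pom edits
> and the -Dscala-2.10 property may never reach that build. The release
> plugin only forwards extra flags to the forked build through its
> arguments parameter, e.g.:
>
> mvn -B -Dscala-2.10 -Darguments="-Dscala-2.10 -DskipTests" release:prepare release:perform
>
> If spark-sketch declares its spark-tags dependency via
> ${scala.binary.version}, a missing -Dscala-2.10 in the forked build would
> resolve it back to 2.11, which matches the error above.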
> Thanks
> Kanagha
>
