Hi,
well, Spark 1.2 was prepared for Scala 2.10. If you want a stable and fully
functional tool, I'd compile it with this default compiler.

*I was able to compile Spark 1.2 with Java 7 and Scala 2.10 seamlessly.*

I also tried Java 8 and Scala 2.11 (no -Dscala.usejavacp=true), but the
build failed with a different problem:

mvn -Pyarn -Phadoop-2.5 -Dhadoop.version=2.5.0 -Dscala-2.11 -X -DskipTests clean package
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 14.453 s]
[INFO] Spark Project Core ................................. SUCCESS [ 47.508 s]
[INFO] Spark Project Bagel ................................ SUCCESS [  3.646 s]
[INFO] Spark Project GraphX ............................... SUCCESS [  5.533 s]
[INFO] Spark Project ML Library ........................... SUCCESS [ 12.715 s]
[INFO] Spark Project Tools ................................ SUCCESS [  1.854 s]
[INFO] Spark Project Networking ........................... SUCCESS [  6.580 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  5.290 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 10.846 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [  8.296 s]
[INFO] Spark Project SQL .................................. SUCCESS [ 12.921 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 28.931 s]
[INFO] Spark Project Assembly ............................. FAILURE [01:09 min]
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Parent POM ...................... SKIPPED
[INFO] Spark Project YARN Stable API ...................... SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:49 min
[INFO] Finished at: 2014-12-30T12:41:59+01:00
[INFO] Final Memory: 59M/417M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop-2.5" could not be activated because it does not exist.
[ERROR] Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.2.0: The following artifacts could not be resolved: org.apache.spark:spark-repl_2.11:jar:1.2.0, org.apache.spark:spark-yarn_2.11:jar:1.2.0: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.2.0: The following artifacts could not be resolved: org.apache.spark:spark-repl_2.11:jar:1.2.0, org.apache.spark:spark-yarn_2.11:jar:1.2.0: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2)
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:220)
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.resolveProjectDependencies(LifecycleDependencyResolver.java:127)
        at org.apache.maven.lifecycle.internal.MojoExecutor.ensureDependenciesAreResolved(MojoExecutor.java:257)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:200)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:347)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:154)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.project.DependencyResolutionException: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.2.0: The following artifacts could not be resolved: org.apache.spark:spark-repl_2.11:jar:1.2.0, org.apache.spark:spark-yarn_2.11:jar:1.2.0: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2)
        at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:198)
        at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:195)
        ... 22 more
Caused by: org.eclipse.aether.resolution.DependencyResolutionException: The following artifacts could not be resolved: org.apache.spark:spark-repl_2.11:jar:1.2.0, org.apache.spark:spark-yarn_2.11:jar:1.2.0: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2)
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:384)
        at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:192)
        ... 23 more
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: The following artifacts could not be resolved: org.apache.spark:spark-repl_2.11:jar:1.2.0, org.apache.spark:spark-yarn_2.11:jar:1.2.0: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:459)
        at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:262)
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:367)
        ... 24 more
Caused by: org.eclipse.aether.transfer.ArtifactNotFoundException: Could not find artifact org.apache.spark:spark-repl_2.11:jar:1.2.0 in central (https://repo1.maven.org/maven2)
        at org.eclipse.aether.connector.wagon.WagonRepositoryConnector$6.wrap(WagonRepositoryConnector.java:1012)
        at org.eclipse.aether.connector.wagon.WagonRepositoryConnector$6.wrap(WagonRepositoryConnector.java:1004)
        at org.eclipse.aether.connector.wagon.WagonRepositoryConnector$GetTask.run(WagonRepositoryConnector.java:725)
        at org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:67)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:744)
[ERROR] 
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-assembly_2.10/
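For what it's worth, you can see the root cause in the log: the build is still using the `_2.10` POMs (note `spark-assembly_2.10`) while `-Dscala-2.11` makes the assembly depend on `spark-repl_2.11` and `spark-yarn_2.11` artifacts that were never built. The Spark 1.2 "Building Spark" docs say to first run a script that rewrites the module POMs to the 2.11 artifact IDs; also, 1.2 has no `hadoop-2.5` profile (hence the warning), so `-Phadoop-2.4` with an explicit `hadoop.version` is the closest match. A sketch of what I believe the intended invocation is (from the Spark source root; I haven't re-verified this against 1.2.0 myself, so treat it as a pointer to the docs rather than a guaranteed fix):

```shell
# Rewrite all module POMs from _2.10 to _2.11 artifact IDs first
# (this script ships with the Spark 1.2 source tree).
dev/change-version-to-2.11.sh

# Spark 1.2 has no hadoop-2.5 profile; use hadoop-2.4 and set
# hadoop.version explicitly for a 2.5.0 cluster.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.0 -Dscala-2.11 -DskipTests clean package
```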
--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/building-spark1-2-meet-error-tp20853p20905.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
