If you're asking about a compile error, you should include the command
you used to compile.

I am able to compile branch-1.2 successfully with "mvn -DskipTests
clean package".
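
For reference, a sketch of the full sequence I'd use (the branch name
branch-1.2 and the MAVEN_OPTS values are my assumptions here; the memory
settings are the ones the Spark building docs suggest for JDK 7, so treat
them as a starting point and adjust for your machine):

  # Give Maven enough heap / PermGen for the Spark build (suggested values).
  export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
  git checkout branch-1.2
  mvn -DskipTests clean package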

This error is actually coming from scalac itself, not a compilation
error in the code. It sounds as if the build has not been able to
download the Scala dependencies correctly. Check, or perhaps recreate,
your environment.
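
One hedged suggestion, assuming the cached Scala artifacts in your local
Maven repository (default location ~/.m2) are what's broken: delete them
and re-run the build so Maven fetches them fresh:

  # Remove possibly corrupted cached Scala compiler/library jars, then rebuild.
  rm -rf ~/.m2/repository/org/scala-lang
  mvn -DskipTests clean package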

On Fri, Apr 3, 2015 at 3:19 AM, myelinji <myeli...@aliyun.com> wrote:
> Hi, all:
>    I just checked out the spark-1.2 branch from GitHub and wanted to build it
> with Maven; however, I encountered the following error during compilation:
>
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-catalyst_2.10: wrap: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found. -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-catalyst_2.10: wrap: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
> at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
> at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
> at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
> at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
> at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
> at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
> at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
> at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
> at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
> at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
> at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
> at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
> Caused by: org.apache.maven.plugin.MojoExecutionException: wrap: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
> at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:490)
> at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
> at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
> ... 19 more
> Caused by: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
> at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
> at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
> at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
> at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
> at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
> at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
> at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
> at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
> at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
> at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
> at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
> at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
> at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
> at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
> at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
> at xsbt.CachedCompiler0$$anon$2.<init>(CompilerInterface.scala:113)
> at xsbt.CachedCompiler0.run(CompilerInterface.scala:113)
> at xsbt.CachedCompiler0.run(CompilerInterface.scala:99)
> at xsbt.CompilerInterface.run(CompilerInterface.scala:27)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
> at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
> at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
> at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
> at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
> at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
> at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
> at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
> at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
> at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
> at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
> at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
> at sbt.inc.Incremental$.compile(Incremental.scala:37)
> at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
> at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
> at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
> at com.typesafe.zinc.Compiler.compile(Compiler.scala:184)
> at com.typesafe.zinc.Compiler.compile(Compiler.scala:164)
> at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:92)
> at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
> at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
> at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
> at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
> ... 21 more
>
>
> The environment is:
> JDK: 1.7
> Scala: 2.10.4
> Maven: 3.0.5
>
> However, I can compile successfully with sbt. Can anyone help me?

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
