Re: Error Running SparkPi.scala Example

2016-06-17 Thread Krishna Kalyan
Hi Jacek,

Maven build output from *mvn clean install*:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 30:12 min
[INFO] Finished at: 2016-06-17T15:15:46+02:00
[INFO] Final Memory: 82M/1253M
[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal
org.scalatest:scalatest-maven-plugin:1.0:test (test) on project
spark-core_2.11: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the
command
[ERROR]   mvn <goals> -rf :spark-core_2.11


and the test failures:
- handles standalone cluster mode *** FAILED ***
  Map("spark.driver.memory" -> "4g", "SPARK_SUBMIT" -> "true",
"spark.driver.cores" -> "5", "spark.ui.enabled" -> "false",
"spark.driver.supervise" -> "true", "spark.app.name" -> "org.SomeClass",
"spark.jars" -> "file:/Users/krishna/Experiment/spark/core/thejar.jar",
"spark.submit.deployMode" -> "cluster", "spark.executor.extraClassPath" ->
"~/mysql-connector-java-5.1.12.jar", "spark.master" -> "spark://h:p",
"spark.driver.extraClassPath" -> "~/mysql-connector-java-5.1.12.jar") had
size 11 instead of expected size 9 (SparkSubmitSuite.scala:294)
- handles legacy standalone cluster mode *** FAILED ***
  Map("spark.driver.memory" -> "4g", "SPARK_SUBMIT" -> "true",
"spark.driver.cores" -> "5", "spark.ui.enabled" -> "false",
"spark.driver.supervise" -> "true", "spark.app.name" -> "org.SomeClass",
"spark.jars" -> "file:/Users/krishna/Experiment/spark/core/thejar.jar",
"spark.submit.deployMode" -> "cluster", "spark.executor.extraClassPath" ->
"~/mysql-connector-java-5.1.12.jar", "spark.master" -> "spark://h:p",
"spark.driver.extraClassPath" -> "~/mysql-connector-java-5.1.12.jar") had
size 11 instead of expected size 9 (SparkSubmitSuite.scala:294)
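
Looking at the failing assertion, the two surplus entries are
spark.driver.extraClassPath and spark.executor.extraClassPath, both
pointing at ~/mysql-connector-java-5.1.12.jar, so my guess is that
settings from my local conf/spark-defaults.conf (or SPARK_CONF_DIR) are
leaking into the suite. A minimal way to re-check just this suite
against an empty conf dir might be (a sketch; it assumes the
scalatest-maven-plugin wildcardSuites property and that /tmp/empty-conf
exists and is empty):

  mkdir -p /tmp/empty-conf
  SPARK_CONF_DIR=/tmp/empty-conf ./build/mvn -pl core -Dtest=none \
    -DwildcardSuites=org.apache.spark.deploy.SparkSubmitSuite test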


On Thu, Jun 16, 2016 at 1:57 PM, Jacek Laskowski  wrote:

> Hi,
>
> Before you try to do it inside another environment like an IDE, could
> you build Spark using mvn or sbt and, only when that succeeds, try to
> run SparkPi using spark-submit or the run-example script? Once that
> works, you can move on to a complete environment inside your beloved
> IDE (and I'm very glad to hear it's IDEA :))
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Thu, Jun 16, 2016 at 1:37 AM, Krishna Kalyan
>  wrote:
> > Hello,
> > I am facing problems when I try to run SparkPi.scala.
> > I took the following steps:
> > a) git pull https://github.com/apache/spark
> > b) Import the project in Intellij as a maven project
> > c) Run 'SparkPi'
> >
> > Error Below:
> > Information:16/06/16 01:34 - Compilation completed with 10 errors and 5
> > warnings in 5s 843ms
> > Warning:scalac: Class org.jboss.netty.channel.ChannelFactory not found -
> > continuing with a stub.
> > Warning:scalac: Class org.jboss.netty.channel.ChannelPipelineFactory not
> > found - continuing with a stub.
> > Warning:scalac: Class org.jboss.netty.handler.execution.ExecutionHandler
> not
> > found - continuing with a stub.
> > Warning:scalac: Class org.jboss.netty.channel.group.ChannelGroup not
> found -
> > continuing with a stub.
> > Warning:scalac: Class com.google.common.collect.ImmutableMap not found -
> > continuing with a stub.
> >
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala
> > Error:(45, 66) not found: type SparkFlumeProtocol
> >   val transactionTimeout: Int, val backOffInterval: Int) extends
> > SparkFlumeProtocol with Logging {
> >  ^
> > Error:(70, 39) not found: type EventBatch
> >   override def getEventBatch(n: Int): EventBatch = {
> >   ^
> > Error:(85, 13) not found: type EventBatch
> > new EventBatch("Spark sink has been stopped!", "",
> > java.util.Collections.emptyList())
> > ^
> >
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala
> > Error:(80, 22) not found: type EventBatch
> >   def getEventBatch: EventBatch = {
> >  ^
> > Error:(48, 37) not found: type EventBatch
> >   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
> > Error", "",
> > ^
> > 

Re: Error Running SparkPi.scala Example

2016-06-16 Thread Jacek Laskowski
Hi,

Before you try to do it inside another environment like an IDE, could
you build Spark using mvn or sbt and, only when that succeeds, try to
run SparkPi using spark-submit or the run-example script? Once that
works, you can move on to a complete environment inside your beloved
IDE (and I'm very glad to hear it's IDEA :))
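
For example, with the Maven wrapper that ships in the repo (SparkPi
takes an optional number-of-partitions argument; 10 below is arbitrary):

  ./build/mvn -DskipTests clean package
  ./bin/run-example SparkPi 10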

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jun 16, 2016 at 1:37 AM, Krishna Kalyan
 wrote:
> Hello,
> I am facing problems when I try to run SparkPi.scala.
> I took the following steps:
> a) git pull https://github.com/apache/spark
> b) Import the project in Intellij as a maven project
> c) Run 'SparkPi'
>
> Error Below:
> Information:16/06/16 01:34 - Compilation completed with 10 errors and 5
> warnings in 5s 843ms
> Warning:scalac: Class org.jboss.netty.channel.ChannelFactory not found -
> continuing with a stub.
> Warning:scalac: Class org.jboss.netty.channel.ChannelPipelineFactory not
> found - continuing with a stub.
> Warning:scalac: Class org.jboss.netty.handler.execution.ExecutionHandler not
> found - continuing with a stub.
> Warning:scalac: Class org.jboss.netty.channel.group.ChannelGroup not found -
> continuing with a stub.
> Warning:scalac: Class com.google.common.collect.ImmutableMap not found -
> continuing with a stub.
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala
> Error:(45, 66) not found: type SparkFlumeProtocol
>   val transactionTimeout: Int, val backOffInterval: Int) extends
> SparkFlumeProtocol with Logging {
>  ^
> Error:(70, 39) not found: type EventBatch
>   override def getEventBatch(n: Int): EventBatch = {
>   ^
> Error:(85, 13) not found: type EventBatch
> new EventBatch("Spark sink has been stopped!", "",
> java.util.Collections.emptyList())
> ^
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/TransactionProcessor.scala
> Error:(80, 22) not found: type EventBatch
>   def getEventBatch: EventBatch = {
>  ^
> Error:(48, 37) not found: type EventBatch
>   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
> Error", "",
> ^
> Error:(48, 54) not found: type EventBatch
>   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown
> Error", "",
>  ^
> Error:(115, 41) not found: type SparkSinkEvent
> val events = new util.ArrayList[SparkSinkEvent](maxBatchSize)
> ^
> Error:(146, 28) not found: type EventBatch
>   eventBatch = new EventBatch("", seqNum, events)
>^
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSinkUtils.scala
> Error:(25, 27) not found: type EventBatch
>   def isErrorBatch(batch: EventBatch): Boolean = {
>   ^
> /Users/krishna/Experiment/spark/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkSink.scala
> Error:(86, 51) not found: type SparkFlumeProtocol
> val responder = new SpecificResponder(classOf[SparkFlumeProtocol],
> handler.get)
>
> Thanks,
> Krishna
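
(On the "not found: type" errors above: SparkFlumeProtocol, EventBatch
and SparkSinkEvent are not hand-written classes; they are generated from
the Avro IDL under external/flume-sink/src/main/avro by the
avro-maven-plugin during the Maven build, so the IDE cannot resolve them
until those sources have been generated at least once. A likely fix,
assuming a stock checkout, is to generate them from the command line and
then use "Generate Sources and Update Folders" in IntelliJ's Maven tool
window:

  ./build/mvn -pl external/flume-sink generate-sources
)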

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org