Re: Compilation error

2015-03-12 Thread Sean Owen
A couple points:

You've got mismatched versions here -- 1.2.0 vs 1.2.1. You should fix
that, but it's not the cause of your problem.

These are also supposed to be 'provided' scope dependencies in Maven.

You should get the Scala deps transitively and be able to import
scala.* classes. It would be a little more correct to depend directly
on the Scala library, but in practice it's easiest not to in simple
use cases.

If you're still having trouble, look at the output of mvn dependency:tree.
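
For example, a sketch of what the aligned dependencies could look like
(assuming you standardize on 1.2.1 and mark both artifacts provided, as above):

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.1</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.2.1</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>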

On Tue, Mar 10, 2015 at 6:32 PM, Mohit Anchlia mohitanch...@gmail.com wrote:
 I am using maven and my dependency looks like this, but this doesn't seem to
 be working

 <dependencies>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.10</artifactId>
     <version>1.2.0</version>
   </dependency>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.2.1</version>
   </dependency>
 </dependencies>


 On Tue, Mar 10, 2015 at 11:06 AM, Tathagata Das t...@databricks.com wrote:

 If you are using tools like SBT/Maven/Gradle/etc., they figure out all the
 recursive dependencies and include them in the classpath. I haven't
 touched Eclipse in years, so I am not sure off the top of my head what's
 going on instead. If you only downloaded spark-streaming_2.10.jar, then
 that is indeed insufficient and you have to download all the recursive
 dependencies. Maybe you should create a Maven project inside Eclipse?

 TD

 On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia mohitanch...@gmail.com
 wrote:

 How do I do that? I haven't used Scala before.

 Also, linking page doesn't mention that:


 http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking

 On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:

 It means you do not have Scala library classes in your project
 classpath.

 On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
   I am trying out the streaming example as documented, using the Spark
   1.2.1 streaming artifact from Maven for Java.

   When I add this code I get a compilation error, and Eclipse is not able
   to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

   http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

   private void map(JavaReceiverInputDStream<String> lines) {

     JavaDStream<String> words = lines.flatMap(
       new FlatMapFunction<String, String>() {
         @Override public Iterable<String> call(String x) {
           return Arrays.asList(x.split(" "));
         }
       });

     // Count each word in each batch
     JavaPairDStream<String, Integer> pairs = words.map(
       new PairFunction<String, String, Integer>() {
         @Override public Tuple2<String, Integer> call(String s) throws Exception {
           return new Tuple2<String, Integer>(s, 1);
         }
       });
   }





-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Compilation error

2015-03-12 Thread Mohit Anchlia
It works after sync, thanks for the pointers

On Tue, Mar 10, 2015 at 1:22 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:

 I navigated to maven dependency and found scala library. I also found
 Tuple2.class and when I click on it in eclipse I get invalid LOC header
 (bad signature)

 java.util.zip.ZipException: invalid LOC header (bad signature)
  at java.util.zip.ZipFile.read(Native Method)

 I am wondering if I should delete that file from local repo and re-sync

 On Tue, Mar 10, 2015 at 1:08 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:

 I ran the dependency command and see the following dependencies:

 I only see org.scala-lang.

 [INFO] org.spark.test:spak-test:jar:0.0.1-SNAPSHOT

 [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.2.0:compile

 [INFO] | +- org.eclipse.jetty:jetty-server:jar:8.1.14.v20131031:compile

 [INFO] | | +-
 org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:co mpile

 [INFO] | | +-
 org.eclipse.jetty:jetty-continuation:jar:8.1.14.v20131031:compil e

 [INFO] | | \- org.eclipse.jetty:jetty-http:jar:8.1.14.v20131031:compile

 [INFO] | | \- org.eclipse.jetty:jetty-io:jar:8.1.14.v20131031:compile

 [INFO] | +- org.scala-lang:scala-library:jar:2.10.4:compile

 [INFO] | \- org.spark-project.spark:unused:jar:1.0.0:compile

 [INFO] \- org.apache.spark:spark-core_2.10:jar:1.2.1:compile

 [INFO] +- com.twitter:chill_2.10:jar:0.5.0:compile

 [INFO] | \- com.esotericsoftware.kryo:kryo:jar:2.21:compile

 [INFO] | +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:co
 mpile

 [INFO] | +- com.esotericsoftware.minlog:minlog:jar:1.2:compile

 [INFO] | \- org.objenesis:objenesis:jar:1.2:compile

 [INFO] +- com.twitter:chill-java:jar:0.5.0:compile

 [INFO] +- org.apache.hadoop:hadoop-client:jar:2.2.0:compile

 [INFO] | +- org.apache.hadoop:hadoop-common:jar:2.2.0:compile

 [INFO] | | +- commons-cli:commons-cli:jar:1.2:compile

 [INFO] | | +- org.apache.commons:commons-math:jar:2.1:compile

 [INFO] | | +- xmlenc:xmlenc:jar:0.52:compile

 [INFO] | | +- commons-io:commons-io:jar:2.1:compile

 [INFO] | | +- commons-logging:commons-logging:jar:1.1.1:compile

 [INFO] | | +- commons-lang:commons-lang:jar:2.5:compile

 [INFO] | | +- commons-configuration:commons-configuration:jar:1.6:compile

 [INFO] | | | +- commons-collections:commons-collections:jar:3.2.1:compile

 [INFO] | | | +- commons-digester:commons-digester:jar:1.8:compile

 [INFO] | | | | \- commons-beanutils:commons-beanutils:jar:1.7.0:compile

 [INFO] | | | \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile

 [INFO] | | +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile

 [INFO] | | +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile

 [INFO] | | +- org.apache.avro:avro:jar:1.7.4:compile

 [INFO] | | +- com.google.protobuf:protobuf-java:jar:2.5.0:compile

 [INFO] | | +- org.apache.hadoop:hadoop-auth:jar:2.2.0:compile

 [INFO] | | \- org.apache.commons:commons-compress:jar:1.4.1:compile

 [INFO] | | \- org.tukaani:xz:jar:1.0:compile

 [INFO] | +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:compile

 [INFO] | | \- org.mortbay.jetty:jetty-util:jar:6.1.26:compile

 [INFO] | +-
 org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:compile

 [INFO] | | +-
 org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:co mpile

 [INFO] | | | +- org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:compile

 [INFO] | | | | +- com.google.inject:guice:jar:3.0:compile

 [INFO] | | | | | +- javax.inject:javax.inject:jar:1:compile

 [INFO] | | | | | \- aopalliance:aopalliance:jar:1.0:compile

 [INFO] | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-framew
 ork-grizzly2:jar:1.9:compile

 [INFO] | | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-fra
 mework-core:jar:1.9:compile

 [INFO] | | | | | | +- javax.servlet:javax.servlet-api:jar:3.0.1:compile

 [INFO] | | | | | | \- com.sun.jersey:jersey-client:jar:1.9:compile

 [INFO] | | | | | \- com.sun.jersey:jersey-grizzly2:jar:1.9:compile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http:jar:2.1.2:comp ile

 [INFO] | | | | | | \- org.glassfish.grizzly:grizzly-framework:jar:2.
 1.2:compile

 [INFO] | | | | | | \- org.glassfish.gmbal:gmbal-api-only:jar:3.0.
 0-b023:compile

 [INFO] | | | | | | \- org.glassfish.external:management-api:ja
 r:3.0.0-b012:compile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-server:jar:2.1
 .2:compile

 [INFO] | | | | | | \- org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:co mpile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-servlet:jar:2.
 1.2:compile

 [INFO] | | | | | \- org.glassfish:javax.servlet:jar:3.1:compile

 [INFO] | | | | +- com.sun.jersey:jersey-server:jar:1.9:compile

 [INFO] | | | | | +- asm:asm:jar:3.1:compile

 [INFO] | | | | | \- com.sun.jersey:jersey-core:jar:1.9:compile

 [INFO] | | | | +- com.sun.jersey:jersey-json:jar:1.9:compile

 [INFO] | | | | | +- org.codehaus.jettison:jettison:jar:1.1:compile

 [INFO] | | | | | | \- stax:stax-api:jar:1.0.1:compile

 [INFO] | | 

Re: Compilation error

2015-03-10 Thread Tathagata Das
If you are using tools like SBT/Maven/Gradle/etc., they figure out all the
recursive dependencies and include them in the classpath. I haven't
touched Eclipse in years, so I am not sure off the top of my head what's
going on instead. If you only downloaded spark-streaming_2.10.jar, then
that is indeed insufficient and you have to download all the recursive
dependencies. Maybe you should create a Maven project inside Eclipse?

TD

On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:

 How do I do that? I haven't used Scala before.

 Also, linking page doesn't mention that:

 http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking

 On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:

 It means you do not have Scala library classes in your project classpath.

 On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
  I am trying out the streaming example as documented, using the Spark
  1.2.1 streaming artifact from Maven for Java.

  When I add this code I get a compilation error, and Eclipse is not able
  to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

  http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

  private void map(JavaReceiverInputDStream<String> lines) {

    JavaDStream<String> words = lines.flatMap(
      new FlatMapFunction<String, String>() {
        @Override public Iterable<String> call(String x) {
          return Arrays.asList(x.split(" "));
        }
      });

    // Count each word in each batch
    JavaPairDStream<String, Integer> pairs = words.map(
      new PairFunction<String, String, Integer>() {
        @Override public Tuple2<String, Integer> call(String s) throws Exception {
          return new Tuple2<String, Integer>(s, 1);
        }
      });
  }





Re: Compilation error

2015-03-10 Thread Tathagata Das
You have to include Scala libraries in the Eclipse dependencies.
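
In Maven that would be something like this (a sketch; 2.10.4 is the
scala-library version that Spark 1.2.x pulls in transitively):

  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.4</version>
  </dependency>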

TD

On Tue, Mar 10, 2015 at 10:54 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:

 I am trying out the streaming example as documented, using the Spark 1.2.1
 streaming artifact from Maven for Java.

 When I add this code I get a compilation error, and Eclipse is not able to
 recognize Tuple2. I also don't see any scala.Tuple2 class to import.

 http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

 private void map(JavaReceiverInputDStream<String> lines) {

   JavaDStream<String> words = lines.flatMap(
     new FlatMapFunction<String, String>() {
       @Override public Iterable<String> call(String x) {
         return Arrays.asList(x.split(" "));
       }
     });

   // Count each word in each batch
   JavaPairDStream<String, Integer> pairs = words.map(
     new PairFunction<String, String, Integer>() {
       @Override public Tuple2<String, Integer> call(String s) throws Exception {
         return new Tuple2<String, Integer>(s, 1);
       }
     });
 }



Re: Compilation error

2015-03-10 Thread Mohit Anchlia
How do I do that? I haven't used Scala before.

Also, linking page doesn't mention that:

http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking

On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:

 It means you do not have Scala library classes in your project classpath.

 On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
  I am trying out the streaming example as documented, using the Spark
  1.2.1 streaming artifact from Maven for Java.

  When I add this code I get a compilation error, and Eclipse is not able
  to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

  http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

  private void map(JavaReceiverInputDStream<String> lines) {

    JavaDStream<String> words = lines.flatMap(
      new FlatMapFunction<String, String>() {
        @Override public Iterable<String> call(String x) {
          return Arrays.asList(x.split(" "));
        }
      });

    // Count each word in each batch
    JavaPairDStream<String, Integer> pairs = words.map(
      new PairFunction<String, String, Integer>() {
        @Override public Tuple2<String, Integer> call(String s) throws Exception {
          return new Tuple2<String, Integer>(s, 1);
        }
      });
  }



Re: Compilation error

2015-03-10 Thread Mohit Anchlia
I navigated to the Maven dependencies and found the Scala library. I also
found Tuple2.class, and when I click on it in Eclipse I get "invalid LOC
header (bad signature)":

java.util.zip.ZipException: invalid LOC header (bad signature)
 at java.util.zip.ZipFile.read(Native Method)

I am wondering if I should delete that file from local repo and re-sync
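
For example (a sketch, assuming the default local repository location):

  rm -rf ~/.m2/repository/org/scala-lang/scala-library
  mvn -U dependency:resolve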

On Tue, Mar 10, 2015 at 1:08 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:

 I ran the dependency command and see the following dependencies:

 I only see org.scala-lang.

 [INFO] org.spark.test:spak-test:jar:0.0.1-SNAPSHOT

 [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.2.0:compile

 [INFO] | +- org.eclipse.jetty:jetty-server:jar:8.1.14.v20131031:compile

 [INFO] | | +-
 org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:co mpile

 [INFO] | | +-
 org.eclipse.jetty:jetty-continuation:jar:8.1.14.v20131031:compil e

 [INFO] | | \- org.eclipse.jetty:jetty-http:jar:8.1.14.v20131031:compile

 [INFO] | | \- org.eclipse.jetty:jetty-io:jar:8.1.14.v20131031:compile

 [INFO] | +- org.scala-lang:scala-library:jar:2.10.4:compile

 [INFO] | \- org.spark-project.spark:unused:jar:1.0.0:compile

 [INFO] \- org.apache.spark:spark-core_2.10:jar:1.2.1:compile

 [INFO] +- com.twitter:chill_2.10:jar:0.5.0:compile

 [INFO] | \- com.esotericsoftware.kryo:kryo:jar:2.21:compile

 [INFO] | +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:co
 mpile

 [INFO] | +- com.esotericsoftware.minlog:minlog:jar:1.2:compile

 [INFO] | \- org.objenesis:objenesis:jar:1.2:compile

 [INFO] +- com.twitter:chill-java:jar:0.5.0:compile

 [INFO] +- org.apache.hadoop:hadoop-client:jar:2.2.0:compile

 [INFO] | +- org.apache.hadoop:hadoop-common:jar:2.2.0:compile

 [INFO] | | +- commons-cli:commons-cli:jar:1.2:compile

 [INFO] | | +- org.apache.commons:commons-math:jar:2.1:compile

 [INFO] | | +- xmlenc:xmlenc:jar:0.52:compile

 [INFO] | | +- commons-io:commons-io:jar:2.1:compile

 [INFO] | | +- commons-logging:commons-logging:jar:1.1.1:compile

 [INFO] | | +- commons-lang:commons-lang:jar:2.5:compile

 [INFO] | | +- commons-configuration:commons-configuration:jar:1.6:compile

 [INFO] | | | +- commons-collections:commons-collections:jar:3.2.1:compile

 [INFO] | | | +- commons-digester:commons-digester:jar:1.8:compile

 [INFO] | | | | \- commons-beanutils:commons-beanutils:jar:1.7.0:compile

 [INFO] | | | \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile

 [INFO] | | +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile

 [INFO] | | +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile

 [INFO] | | +- org.apache.avro:avro:jar:1.7.4:compile

 [INFO] | | +- com.google.protobuf:protobuf-java:jar:2.5.0:compile

 [INFO] | | +- org.apache.hadoop:hadoop-auth:jar:2.2.0:compile

 [INFO] | | \- org.apache.commons:commons-compress:jar:1.4.1:compile

 [INFO] | | \- org.tukaani:xz:jar:1.0:compile

 [INFO] | +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:compile

 [INFO] | | \- org.mortbay.jetty:jetty-util:jar:6.1.26:compile

 [INFO] | +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:compile

 [INFO] | | +-
 org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:co mpile

 [INFO] | | | +- org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:compile

 [INFO] | | | | +- com.google.inject:guice:jar:3.0:compile

 [INFO] | | | | | +- javax.inject:javax.inject:jar:1:compile

 [INFO] | | | | | \- aopalliance:aopalliance:jar:1.0:compile

 [INFO] | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-framew
 ork-grizzly2:jar:1.9:compile

 [INFO] | | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-fra
 mework-core:jar:1.9:compile

 [INFO] | | | | | | +- javax.servlet:javax.servlet-api:jar:3.0.1:compile

 [INFO] | | | | | | \- com.sun.jersey:jersey-client:jar:1.9:compile

 [INFO] | | | | | \- com.sun.jersey:jersey-grizzly2:jar:1.9:compile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http:jar:2.1.2:comp ile

 [INFO] | | | | | | \- org.glassfish.grizzly:grizzly-framework:jar:2.
 1.2:compile

 [INFO] | | | | | | \- org.glassfish.gmbal:gmbal-api-only:jar:3.0.
 0-b023:compile

 [INFO] | | | | | | \- org.glassfish.external:management-api:ja
 r:3.0.0-b012:compile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-server:jar:2.1
 .2:compile

 [INFO] | | | | | | \- org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:co mpile

 [INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-servlet:jar:2.
 1.2:compile

 [INFO] | | | | | \- org.glassfish:javax.servlet:jar:3.1:compile

 [INFO] | | | | +- com.sun.jersey:jersey-server:jar:1.9:compile

 [INFO] | | | | | +- asm:asm:jar:3.1:compile

 [INFO] | | | | | \- com.sun.jersey:jersey-core:jar:1.9:compile

 [INFO] | | | | +- com.sun.jersey:jersey-json:jar:1.9:compile

 [INFO] | | | | | +- org.codehaus.jettison:jettison:jar:1.1:compile

 [INFO] | | | | | | \- stax:stax-api:jar:1.0.1:compile

 [INFO] | | | | | +- com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:compile

 [INFO] | | | | | | \- javax.xml.bind:jaxb-api:jar:2.2.2:compile

 [INFO] | 

Re: Compilation error on JavaPairDStream

2015-03-10 Thread Sean Owen
Ah, that's a typo in the example: use words.mapToPair
I can make a little PR to fix that.
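
In other words, something like this (a sketch of the corrected snippet):

  JavaPairDStream<String, Integer> pairs = words.mapToPair(
    new PairFunction<String, String, Integer>() {
      @Override public Tuple2<String, Integer> call(String s) throws Exception {
        return new Tuple2<String, Integer>(s, 1);
      }
    });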

On Tue, Mar 10, 2015 at 8:32 PM, Mohit Anchlia mohitanch...@gmail.com wrote:
 I am getting the following error. When I look at the sources it seems to
 be a Scala source, but I'm not sure why it's complaining about it.

 The method map(Function<String,R>) in the type JavaDStream<String> is not
 applicable for the arguments (new PairFunction<String,String,Integer>(){})

 And my code has been taken from the Spark examples site:

 JavaPairDStream<String, Integer> pairs = words.map(
   new PairFunction<String, String, Integer>() {
     @Override public Tuple2<String, Integer> call(String s) throws Exception {
       return new Tuple2<String, Integer>(s, 1);
     }
   });



-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



RE: Compilation error

2015-03-10 Thread java8964
Or another option is to use Scala-IDE, which is built on top of Eclipse, 
instead of pure Eclipse, so Scala comes with it.
Yong

 From: so...@cloudera.com
 Date: Tue, 10 Mar 2015 18:40:44 +
 Subject: Re: Compilation error
 To: mohitanch...@gmail.com
 CC: t...@databricks.com; user@spark.apache.org
 
 A couple points:
 
  You've got mismatched versions here -- 1.2.0 vs 1.2.1. You should fix
  that, but it's not the cause of your problem.
  
  These are also supposed to be 'provided' scope dependencies in Maven.
  
  You should get the Scala deps transitively and be able to import
  scala.* classes. It would be a little more correct to depend directly
  on the Scala library, but in practice it's easiest not to in simple
  use cases.
  
  If you're still having trouble, look at the output of mvn dependency:tree.
 
 On Tue, Mar 10, 2015 at 6:32 PM, Mohit Anchlia mohitanch...@gmail.com wrote:
  I am using maven and my dependency looks like this, but this doesn't seem to
  be working
 
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.1</version>
    </dependency>
  </dependencies>
 
 
  On Tue, Mar 10, 2015 at 11:06 AM, Tathagata Das t...@databricks.com wrote:
 
  If you are using tools like SBT/Maven/Gradle/etc., they figure out all the
  recursive dependencies and include them in the classpath. I haven't
  touched Eclipse in years, so I am not sure off the top of my head what's
  going on instead. If you only downloaded spark-streaming_2.10.jar, then
  that is indeed insufficient and you have to download all the recursive
  dependencies. Maybe you should create a Maven project inside Eclipse?
 
  TD
 
  On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia mohitanch...@gmail.com
  wrote:
 
  How do I do that? I haven't used Scala before.
 
  Also, linking page doesn't mention that:
 
 
  http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking
 
  On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:
 
  It means you do not have Scala library classes in your project
  classpath.
 
  On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
  wrote:
    I am trying out the streaming example as documented, using the Spark
    1.2.1 streaming artifact from Maven for Java.

    When I add this code I get a compilation error, and Eclipse is not
    able to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

    http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

    private void map(JavaReceiverInputDStream<String> lines) {

      JavaDStream<String> words = lines.flatMap(
        new FlatMapFunction<String, String>() {
          @Override public Iterable<String> call(String x) {
            return Arrays.asList(x.split(" "));
          }
        });

      // Count each word in each batch
      JavaPairDStream<String, Integer> pairs = words.map(
        new PairFunction<String, String, Integer>() {
          @Override public Tuple2<String, Integer> call(String s) throws Exception {
            return new Tuple2<String, Integer>(s, 1);
          }
        });
    }
 
 
 
 
 
 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org
 
  

Re: Compilation error

2015-03-10 Thread Mohit Anchlia
I am using maven and my dependency looks like this, but this doesn't seem
to be working

 <dependencies>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-streaming_2.10</artifactId>
     <version>1.2.0</version>
   </dependency>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.2.1</version>
   </dependency>
 </dependencies>

On Tue, Mar 10, 2015 at 11:06 AM, Tathagata Das t...@databricks.com wrote:

 If you are using tools like SBT/Maven/Gradle/etc., they figure out all the
 recursive dependencies and include them in the classpath. I haven't
 touched Eclipse in years, so I am not sure off the top of my head what's
 going on instead. If you only downloaded spark-streaming_2.10.jar, then
 that is indeed insufficient and you have to download all the recursive
 dependencies. Maybe you should create a Maven project inside Eclipse?

 TD

 On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia mohitanch...@gmail.com
 wrote:

 How do I do that? I haven't used Scala before.

 Also, linking page doesn't mention that:


 http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking

 On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:

 It means you do not have Scala library classes in your project classpath.

 On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
  I am trying out the streaming example as documented, using the Spark
  1.2.1 streaming artifact from Maven for Java.

  When I add this code I get a compilation error, and Eclipse is not able
  to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

  http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

  private void map(JavaReceiverInputDStream<String> lines) {

    JavaDStream<String> words = lines.flatMap(
      new FlatMapFunction<String, String>() {
        @Override public Iterable<String> call(String x) {
          return Arrays.asList(x.split(" "));
        }
      });

    // Count each word in each batch
    JavaPairDStream<String, Integer> pairs = words.map(
      new PairFunction<String, String, Integer>() {
        @Override public Tuple2<String, Integer> call(String s) throws Exception {
          return new Tuple2<String, Integer>(s, 1);
        }
      });
  }






Re: Compilation error

2015-03-10 Thread Tathagata Das
See if you can import scala libraries in your project.
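
For example, a quick sanity check is whether this import resolves in your
Java source (just a sketch):

  import scala.Tuple2;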

On Tue, Mar 10, 2015 at 11:32 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:

 I am using maven and my dependency looks like this, but this doesn't seem
 to be working

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.2.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.2.1</version>
    </dependency>
  </dependencies>

 On Tue, Mar 10, 2015 at 11:06 AM, Tathagata Das t...@databricks.com
 wrote:

 If you are using tools like SBT/Maven/Gradle/etc., they figure out all the
 recursive dependencies and include them in the classpath. I haven't
 touched Eclipse in years, so I am not sure off the top of my head what's
 going on instead. If you only downloaded spark-streaming_2.10.jar, then
 that is indeed insufficient and you have to download all the recursive
 dependencies. Maybe you should create a Maven project inside Eclipse?

 TD

 On Tue, Mar 10, 2015 at 11:00 AM, Mohit Anchlia mohitanch...@gmail.com
 wrote:

 How do I do that? I haven't used Scala before.

 Also, linking page doesn't mention that:


 http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#linking

 On Tue, Mar 10, 2015 at 10:57 AM, Sean Owen so...@cloudera.com wrote:

 It means you do not have Scala library classes in your project
 classpath.

 On Tue, Mar 10, 2015 at 5:54 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
  I am trying out the streaming example as documented, using the Spark
  1.2.1 streaming artifact from Maven for Java.

  When I add this code I get a compilation error, and Eclipse is not able
  to recognize Tuple2. I also don't see any scala.Tuple2 class to import.

  http://spark.apache.org/docs/1.2.0/streaming-programming-guide.html#a-quick-example

  private void map(JavaReceiverInputDStream<String> lines) {

    JavaDStream<String> words = lines.flatMap(
      new FlatMapFunction<String, String>() {
        @Override public Iterable<String> call(String x) {
          return Arrays.asList(x.split(" "));
        }
      });

    // Count each word in each batch
    JavaPairDStream<String, Integer> pairs = words.map(
      new PairFunction<String, String, Integer>() {
        @Override public Tuple2<String, Integer> call(String s) throws Exception {
          return new Tuple2<String, Integer>(s, 1);
        }
      });
  }







Re: Compilation error

2015-03-10 Thread Mohit Anchlia
I ran the dependency command and see the following dependencies:

I only see org.scala-lang.

[INFO] org.spark.test:spak-test:jar:0.0.1-SNAPSHOT

[INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.2.0:compile

[INFO] | +- org.eclipse.jetty:jetty-server:jar:8.1.14.v20131031:compile

[INFO] | | +-
org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:co mpile

[INFO] | | +-
org.eclipse.jetty:jetty-continuation:jar:8.1.14.v20131031:compil e

[INFO] | | \- org.eclipse.jetty:jetty-http:jar:8.1.14.v20131031:compile

[INFO] | | \- org.eclipse.jetty:jetty-io:jar:8.1.14.v20131031:compile

[INFO] | +- org.scala-lang:scala-library:jar:2.10.4:compile

[INFO] | \- org.spark-project.spark:unused:jar:1.0.0:compile

[INFO] \- org.apache.spark:spark-core_2.10:jar:1.2.1:compile

[INFO] +- com.twitter:chill_2.10:jar:0.5.0:compile

[INFO] | \- com.esotericsoftware.kryo:kryo:jar:2.21:compile

[INFO] | +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:co
mpile

[INFO] | +- com.esotericsoftware.minlog:minlog:jar:1.2:compile

[INFO] | \- org.objenesis:objenesis:jar:1.2:compile

[INFO] +- com.twitter:chill-java:jar:0.5.0:compile

[INFO] +- org.apache.hadoop:hadoop-client:jar:2.2.0:compile

[INFO] | +- org.apache.hadoop:hadoop-common:jar:2.2.0:compile

[INFO] | | +- commons-cli:commons-cli:jar:1.2:compile

[INFO] | | +- org.apache.commons:commons-math:jar:2.1:compile

[INFO] | | +- xmlenc:xmlenc:jar:0.52:compile

[INFO] | | +- commons-io:commons-io:jar:2.1:compile

[INFO] | | +- commons-logging:commons-logging:jar:1.1.1:compile

[INFO] | | +- commons-lang:commons-lang:jar:2.5:compile

[INFO] | | +- commons-configuration:commons-configuration:jar:1.6:compile

[INFO] | | | +- commons-collections:commons-collections:jar:3.2.1:compile

[INFO] | | | +- commons-digester:commons-digester:jar:1.8:compile

[INFO] | | | | \- commons-beanutils:commons-beanutils:jar:1.7.0:compile

[INFO] | | | \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile

[INFO] | | +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile

[INFO] | | +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile

[INFO] | | +- org.apache.avro:avro:jar:1.7.4:compile

[INFO] | | +- com.google.protobuf:protobuf-java:jar:2.5.0:compile

[INFO] | | +- org.apache.hadoop:hadoop-auth:jar:2.2.0:compile

[INFO] | | \- org.apache.commons:commons-compress:jar:1.4.1:compile

[INFO] | | \- org.tukaani:xz:jar:1.0:compile

[INFO] | +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:compile

[INFO] | | \- org.mortbay.jetty:jetty-util:jar:6.1.26:compile

[INFO] | +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:compile

[INFO] | | +- org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:co
mpile

[INFO] | | | +- org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:compile

[INFO] | | | | +- com.google.inject:guice:jar:3.0:compile

[INFO] | | | | | +- javax.inject:javax.inject:jar:1:compile

[INFO] | | | | | \- aopalliance:aopalliance:jar:1.0:compile

[INFO] | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-framew
ork-grizzly2:jar:1.9:compile

[INFO] | | | | | +- com.sun.jersey.jersey-test-framework:jersey-test-fra
mework-core:jar:1.9:compile

[INFO] | | | | | | +- javax.servlet:javax.servlet-api:jar:3.0.1:compile

[INFO] | | | | | | \- com.sun.jersey:jersey-client:jar:1.9:compile

[INFO] | | | | | \- com.sun.jersey:jersey-grizzly2:jar:1.9:compile

[INFO] | | | | | +- org.glassfish.grizzly:grizzly-http:jar:2.1.2:comp ile

[INFO] | | | | | | \- org.glassfish.grizzly:grizzly-framework:jar:2.
1.2:compile

[INFO] | | | | | | \- org.glassfish.gmbal:gmbal-api-only:jar:3.0.
0-b023:compile

[INFO] | | | | | | \- org.glassfish.external:management-api:ja
r:3.0.0-b012:compile

[INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-server:jar:2.1
.2:compile

[INFO] | | | | | | \- org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:co mpile

[INFO] | | | | | +- org.glassfish.grizzly:grizzly-http-servlet:jar:2.
1.2:compile

[INFO] | | | | | \- org.glassfish:javax.servlet:jar:3.1:compile

[INFO] | | | | +- com.sun.jersey:jersey-server:jar:1.9:compile

[INFO] | | | | | +- asm:asm:jar:3.1:compile

[INFO] | | | | | \- com.sun.jersey:jersey-core:jar:1.9:compile

[INFO] | | | | +- com.sun.jersey:jersey-json:jar:1.9:compile

[INFO] | | | | | +- org.codehaus.jettison:jettison:jar:1.1:compile

[INFO] | | | | | | \- stax:stax-api:jar:1.0.1:compile

[INFO] | | | | | +- com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:compile

[INFO] | | | | | | \- javax.xml.bind:jaxb-api:jar:2.2.2:compile

[INFO] | | | | | | \- javax.activation:activation:jar:1.1:compile

[INFO] | | | | | +- org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:compile

[INFO] | | | | | \- org.codehaus.jackson:jackson-xc:jar:1.8.3:compile

[INFO] | | | | \- com.sun.jersey.contribs:jersey-guice:jar:1.9:compile

[INFO] | | | \- org.apache.hadoop:hadoop-yarn-server-common:jar:2.2.0:comp
ile

[INFO] | | \- org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.2.0:c
ompile

[INFO] | +- 

Re: Compilation error on JavaPairDStream

2015-03-10 Thread Mohit Anchlia
works now. I should have checked :)

On Tue, Mar 10, 2015 at 1:44 PM, Sean Owen so...@cloudera.com wrote:

 Ah, that's a typo in the example: use words.mapToPair
 I can make a little PR to fix that.

 On Tue, Mar 10, 2015 at 8:32 PM, Mohit Anchlia mohitanch...@gmail.com
 wrote:
  I am getting the following error. When I look at the sources it seems to
  be a Scala source, but I'm not sure why it's complaining about it.

  The method map(Function<String,R>) in the type JavaDStream<String> is not
  applicable for the arguments (new PairFunction<String,String,Integer>(){})

  And my code has been taken from the Spark examples site:

  JavaPairDStream<String, Integer> pairs = words.map(
    new PairFunction<String, String, Integer>() {
      @Override public Tuple2<String, Integer> call(String s) throws Exception {
        return new Tuple2<String, Integer>(s, 1);
      }
    });
 
 



Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-28 Thread arthur.hk.c...@gmail.com
Hi,

tried
mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests dependency:tree > dep.txt

Attached the dep.txt for your information.

[WARNING] 
[WARNING] Some problems were encountered while building the effective settings
[WARNING] Unrecognised tag: 'mirrors' (position: START_TAG seen .../mirror\n  
   --\n\n\n  mirrors... @161:12)  @ 
/opt/apache-maven-3.1.1/conf/settings.xml, line 161, column 12
[WARNING] 
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building Spark Project Examples 1.0.2
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ spark-examples_2.10 
---
[INFO] org.apache.spark:spark-examples_2.10:jar:1.0.2
[INFO] +- org.apache.spark:spark-core_2.10:jar:1.0.2:provided
[INFO] |  +- org.apache.hadoop:hadoop-client:jar:2.4.1:provided
[INFO] |  |  +- org.apache.hadoop:hadoop-common:jar:2.4.1:provided
[INFO] |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.4.1:provided
[INFO] |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.4.1:provided
[INFO] |  |  +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.4.1:provided
[INFO] |  |  |  +- 
org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.4.1:provided
[INFO] |  |  |  |  +- org.apache.hadoop:hadoop-yarn-client:jar:2.4.1:provided
[INFO] |  |  |  |  |  \- com.sun.jersey:jersey-client:jar:1.9:provided
[INFO] |  |  |  |  \- 
org.apache.hadoop:hadoop-yarn-server-common:jar:2.4.1:provided
[INFO] |  |  |  \- 
org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.4.1:provided
[INFO] |  |  +- org.apache.hadoop:hadoop-yarn-api:jar:2.4.1:provided
[INFO] |  |  +- 
org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.4.1:provided
[INFO] |  |  |  \- org.apache.hadoop:hadoop-yarn-common:jar:2.4.1:provided
[INFO] |  |  +- 
org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.4.1:provided
[INFO] |  |  \- org.apache.hadoop:hadoop-annotations:jar:2.4.1:provided
[INFO] |  +- net.java.dev.jets3t:jets3t:jar:0.9.0:runtime
[INFO] |  |  +- org.apache.httpcomponents:httpclient:jar:4.1.2:compile
[INFO] |  |  +- org.apache.httpcomponents:httpcore:jar:4.1.2:compile
[INFO] |  |  \- com.jamesmurty.utils:java-xmlbuilder:jar:0.4:runtime
[INFO] |  +- org.apache.curator:curator-recipes:jar:2.4.0:provided
[INFO] |  |  \- org.apache.curator:curator-framework:jar:2.4.0:provided
[INFO] |  | \- org.apache.curator:curator-client:jar:2.4.0:provided
[INFO] |  +- org.eclipse.jetty:jetty-plus:jar:8.1.14.v20131031:provided
[INFO] |  |  +- 
org.eclipse.jetty.orbit:javax.transaction:jar:1.1.1.v201105210645:provided
[INFO] |  |  +- org.eclipse.jetty:jetty-webapp:jar:8.1.14.v20131031:provided
[INFO] |  |  |  +- org.eclipse.jetty:jetty-xml:jar:8.1.14.v20131031:provided
[INFO] |  |  |  \- org.eclipse.jetty:jetty-servlet:jar:8.1.14.v20131031:provided
[INFO] |  |  \- org.eclipse.jetty:jetty-jndi:jar:8.1.14.v20131031:provided
[INFO] |  | \- 
org.eclipse.jetty.orbit:javax.mail.glassfish:jar:1.4.1.v201005082020:provided
[INFO] |  |\- 
org.eclipse.jetty.orbit:javax.activation:jar:1.1.0.v201105071233:provided
[INFO] |  +- org.eclipse.jetty:jetty-security:jar:8.1.14.v20131031:provided
[INFO] |  +- org.eclipse.jetty:jetty-util:jar:8.1.14.v20131031:compile
[INFO] |  +- com.google.guava:guava:jar:14.0.1:compile
[INFO] |  +- org.apache.commons:commons-lang3:jar:3.3.2:provided
[INFO] |  +- com.google.code.findbugs:jsr305:jar:1.3.9:provided
[INFO] |  +- org.slf4j:slf4j-api:jar:1.7.5:compile
[INFO] |  +- org.slf4j:jul-to-slf4j:jar:1.7.5:provided
[INFO] |  +- org.slf4j:jcl-over-slf4j:jar:1.7.5:provided
[INFO] |  +- log4j:log4j:jar:1.2.17:compile
[INFO] |  +- org.slf4j:slf4j-log4j12:jar:1.7.5:compile
[INFO] |  +- com.ning:compress-lzf:jar:1.0.0:provided
[INFO] |  +- org.xerial.snappy:snappy-java:jar:1.0.5:compile
[INFO] |  +- com.twitter:chill_2.10:jar:0.3.6:provided
[INFO] |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
[INFO] |  | +- 
com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
[INFO] |  | +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
[INFO] |  | \- org.objenesis:objenesis:jar:1.2:provided
[INFO] |  +- com.twitter:chill-java:jar:0.3.6:provided
[INFO] |  +- commons-net:commons-net:jar:2.2:compile
[INFO] |  +- 
org.spark-project.akka:akka-remote_2.10:jar:2.2.3-shaded-protobuf:provided
[INFO] |  |  +- 
org.spark-project.akka:akka-actor_2.10:jar:2.2.3-shaded-protobuf:compile
[INFO] |  |  |  \- com.typesafe:config:jar:1.0.2:compile
[INFO] |  |  +- io.netty:netty:jar:3.6.6.Final:provided
[INFO] |  |  +- 
org.spark-project.protobuf:protobuf-java:jar:2.4.1-shaded:compile
[INFO] |  |  \- org.uncommons.maths:uncommons-maths:jar:1.2.2a:provided
[INFO] |  +- 
org.spark-project.akka:akka-slf4j_2.10:jar:2.2.3-shaded-protobuf:provided
[INFO] |  +- org.scala-lang:scala-library:jar:2.10.4:compile

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-28 Thread Ted Yu
I see 0.98.5 in dep.txt

You should be good to go.


On Thu, Aug 28, 2014 at 3:16 AM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 Hi,

 tried
 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
  dependency:tree > dep.txt

 Attached the dep. txt for your information.


 Regards
 Arthur

 On 28 Aug, 2014, at 12:22 pm, Ted Yu yuzhih...@gmail.com wrote:

 I forgot to include '-Dhadoop.version=2.4.1' in the command below.

 The modified command passed.

 You can verify the dependence on hbase 0.98 through this command:

 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
  dependency:tree > dep.txt

 Cheers


 On Wed, Aug 27, 2014 at 8:58 PM, Ted Yu yuzhih...@gmail.com wrote:

 Looks like the patch given by that URL only had the last commit.

 I have attached pom.xml for spark-1.0.2 to SPARK-1297
 You can download it and replace examples/pom.xml with the downloaded pom

 I am running this command locally:

 mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package

 Cheers


 On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thanks.

 Tried [patch -p1 -i 1893.patch](Hunk #1 FAILED at 45.)
 Is this normal?

 Regards
 Arthur


 patch -p1 -i 1893.patch
 patching file examples/pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 succeeded at 94 (offset -16 lines).
 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file examples/pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
  Hunk #3 succeeded at 122 (offset -49 lines).
 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file docs/building-with-maven.md
 patching file examples/pom.xml
 Hunk #1 succeeded at 122 (offset -40 lines).
 Hunk #2 succeeded at 195 (offset -40 lines).


 On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:

 Can you use this command ?

 patch -p1 -i 1893.patch

 Cheers


 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 I tried the following steps to apply the patch 1893 but got Hunk
 FAILED, can you please advise how to get thru this error? or is my
 spark-1.0.2 source not the correct one?

 Regards
 Arthur

 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
 patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:



 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:

 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch

 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the
 pom.xml
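
 That is, something like this in the pom (a sketch):

   <hbase.version>0.98.5-hadoop2</hbase.version>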

 Cheers


 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thank you so much!!

 As I am new to Spark, can you please advise the steps about how to
 apply this patch to my spark-1.0.2 source folder?

 Regards
 Arthur


 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297

  The pull request is here:
 https://github.com/apache/spark/pull/1893


 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 (correction: Compilation Error:  Spark 1.0.2 with HBase 0.98” ,
 please ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2
 with HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
    // HBase version; set as appropriate.
    val HBASE_VERSION = "0.98.2"


  edit pom.xml with following values
  <hadoop.version>2.4.1</hadoop.version>
  <protobuf.version>2.5.0</protobuf.version>
  <yarn.version>${hadoop.version}</yarn.version>
  <hbase.version>0.98.5</hbase.version>
  <zookeeper.version>3.4.6</zookeeper.version>
  <hive.version>0.13.1</hive.version>


 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-28 Thread arthur.hk.c...@gmail.com
Hi,

I tried to start Spark but failed:

$ ./sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to 
/mnt/hadoop/spark-1.0.2/sbin/../logs/spark-edhuser-org.apache.spark.deploy.master.Master-1-m133.out
failed to launch org.apache.spark.deploy.master.Master:
  Failed to find Spark assembly in 
/mnt/hadoop/spark-1.0.2/assembly/target/scala-2.10/

$ ll assembly/
total 20
-rw-rw-r--. 1 hduser hadoop 11795 Jul 26 05:50 pom.xml
-rw-rw-r--. 1 hduser hadoop   507 Jul 26 05:50 README
drwxrwxr-x. 4 hduser hadoop  4096 Jul 26 05:50 src



Regards
Arthur



On 28 Aug, 2014, at 6:19 pm, Ted Yu yuzhih...@gmail.com wrote:

 I see 0.98.5 in dep.txt
 
 You should be good to go.
 
 
 On Thu, Aug 28, 2014 at 3:16 AM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi,
 
 tried 
 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests 
  dependency:tree > dep.txt
 
 Attached the dep. txt for your information. 
 
 
 Regards
 Arthur
 
 On 28 Aug, 2014, at 12:22 pm, Ted Yu yuzhih...@gmail.com wrote:
 
 I forgot to include '-Dhadoop.version=2.4.1' in the command below.
 
 The modified command passed.
 
 You can verify the dependence on hbase 0.98 through this command:
 
 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests 
  dependency:tree > dep.txt
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 8:58 PM, Ted Yu yuzhih...@gmail.com wrote:
 Looks like the patch given by that URL only had the last commit.
 
 I have attached pom.xml for spark-1.0.2 to SPARK-1297
 You can download it and replace examples/pom.xml with the downloaded pom
 
 I am running this command locally:
 
 mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted, 
 
 Thanks. 
 
 Tried [patch -p1 -i 1893.patch](Hunk #1 FAILED at 45.)
 Is this normal?
 
 Regards
 Arthur
 
 
 patch -p1 -i 1893.patch
 patching file examples/pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 succeeded at 94 (offset -16 lines).
 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file examples/pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 succeeded at 122 (offset -49 lines).
 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file docs/building-with-maven.md
 patching file examples/pom.xml
 Hunk #1 succeeded at 122 (offset -40 lines).
 Hunk #2 succeeded at 195 (offset -40 lines).
 
 
 On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:
 
 Can you use this command ?
 
 patch -p1 -i 1893.patch
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted,
 
 I tried the following steps to apply the patch 1893 but got Hunk FAILED, 
 can you please advise how to get thru this error? or is my spark-1.0.2 
 source not the correct one?
 
 Regards
 Arthur
  
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
  patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:
 
 
 
 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:
 
 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch
 
 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the 
 pom.xml
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted,
 
 Thank you so much!!
 
 As I am new to Spark, can you please advise the steps about how to apply 
 this patch to my spark-1.0.2 source folder?
 
 Regards
 Arthur
 
 
 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:
 
 See SPARK-1297
 
 The pull request is here:
 https://github.com/apache/spark/pull/1893
 
 
 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 (correction: Compilation Error:  Spark 1.0.2 with HBase 0.98” , please 
 ignore if duplicated)
 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-28 Thread Ted Yu
I didn't see that problem.
Did you run this command ?

mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
clean package

Here is what I got:

TYus-MacBook-Pro:spark-1.0.2 tyu$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to
/Users/tyu/spark-1.0.2/sbin/../logs/spark-tyu-org.apache.spark.deploy.master.Master-1-TYus-MacBook-Pro.local.out
localhost: ssh: connect to host localhost port 22: Connection refused

TYus-MacBook-Pro:spark-1.0.2 tyu$ vi
logs/spark-tyu-org.apache.spark.deploy.master.Master-1-TYus-MacBook-Pro.local.out
TYus-MacBook-Pro:spark-1.0.2 tyu$ jps
11563 Master
11635 Jps

TYus-MacBook-Pro:spark-1.0.2 tyu$ ps aux | grep 11563
tyu 11563   0.7  0.8  196 142444 s003  S 6:52AM
0:02.72
/Library/Java/JavaVirtualMachines/jdk1.7.0_60.jdk/Contents/Home/bin/java
-cp
::/Users/tyu/spark-1.0.2/conf:/Users/tyu/spark-1.0.2/assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar
-XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m
org.apache.spark.deploy.master.Master --ip TYus-MacBook-Pro.local --port
7077 --webui-port 8080

TYus-MacBook-Pro:spark-1.0.2 tyu$ ls -l
assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar
-rw-r--r--  1 tyu  staff  121182305 Aug 27 21:13
assembly/target/scala-2.10/spark-assembly-1.0.2-hadoop2.4.1.jar

Cheers


On Thu, Aug 28, 2014 at 3:42 AM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 Hi,

 I tried to start Spark but failed:

 $ ./sbin/start-all.sh
 starting org.apache.spark.deploy.master.Master, logging to
 /mnt/hadoop/spark-1.0.2/sbin/../logs/spark-edhuser-org.apache.spark.deploy.master.Master-1-m133.out
 failed to launch org.apache.spark.deploy.master.Master:
   Failed to find Spark assembly in
 /mnt/hadoop/spark-1.0.2/assembly/target/scala-2.10/

 $ ll assembly/
 total 20
 -rw-rw-r--. 1 hduser hadoop 11795 Jul 26 05:50 pom.xml
 -rw-rw-r--. 1 hduser hadoop   507 Jul 26 05:50 README
 drwxrwxr-x. 4 hduser hadoop  4096 Jul 26 05:50 *src*



 Regards
 Arthur



 On 28 Aug, 2014, at 6:19 pm, Ted Yu yuzhih...@gmail.com wrote:

 I see 0.98.5 in dep.txt

 You should be good to go.


 On Thu, Aug 28, 2014 at 3:16 AM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi,

 tried
 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
  dependency:tree > dep.txt

 Attached the dep. txt for your information.


 Regards
 Arthur

 On 28 Aug, 2014, at 12:22 pm, Ted Yu yuzhih...@gmail.com wrote:

 I forgot to include '-Dhadoop.version=2.4.1' in the command below.

 The modified command passed.

 You can verify the dependence on hbase 0.98 through this command:

 mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
  dependency:tree > dep.txt

 Cheers


 On Wed, Aug 27, 2014 at 8:58 PM, Ted Yu yuzhih...@gmail.com wrote:

 Looks like the patch given by that URL only had the last commit.

 I have attached pom.xml for spark-1.0.2 to SPARK-1297
 You can download it and replace examples/pom.xml with the downloaded pom

 I am running this command locally:

 mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package

 Cheers


 On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thanks.

 Tried [patch -p1 -i 1893.patch](Hunk #1 FAILED at 45.)
 Is this normal?

 Regards
 Arthur


 patch -p1 -i 1893.patch
 patching file examples/pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 succeeded at 94 (offset -16 lines).
 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file examples/pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
  Hunk #3 succeeded at 122 (offset -49 lines).
 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file docs/building-with-maven.md
 patching file examples/pom.xml
 Hunk #1 succeeded at 122 (offset -40 lines).
 Hunk #2 succeeded at 195 (offset -40 lines).


 On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:

 Can you use this command ?

 patch -p1 -i 1893.patch

 Cheers


 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 I tried the following steps to apply the patch 1893 but got Hunk
 FAILED, can you please advise how to get thru this error? or is my
 spark-1.0.2 source not the correct one?

 Regards
 Arthur

 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
  patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread Ted Yu
See SPARK-1297

The pull request is here:
https://github.com/apache/spark/pull/1893


On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2 with
 HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
    // HBase version; set as appropriate.
    val HBASE_VERSION = "0.98.2"


  edit pom.xml with following values
  <hadoop.version>2.4.1</hadoop.version>
  <protobuf.version>2.5.0</protobuf.version>
  <yarn.version>${hadoop.version}</yarn.version>
  <hbase.version>0.98.5</hbase.version>
  <zookeeper.version>3.4.6</zookeeper.version>
  <hive.version>0.13.1</hive.version>


 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2

 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? Or
 should I set HBASE_VERSION back to "0.94.6"?

 Regards
 Arthur




 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::

 sbt.ResolveException: unresolved dependency:
 org.apache.hbase#hbase;0.98.2: not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
 at
 xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
 at
 xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
 at xsbt.boot.Using$.withResource(Using.scala:11)
 at xsbt.boot.Using$.apply(Using.scala:10)
 at
 xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
 at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
 at xsbt.boot.Locks$.apply0(Locks.scala:31)
 at xsbt.boot.Locks$.apply(Locks.scala:28)
 at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
 at sbt.IvySbt.withIvy(Ivy.scala:101)
 at sbt.IvySbt.withIvy(Ivy.scala:97)
 at sbt.IvySbt$Module.withModule(Ivy.scala:116)
 at sbt.IvyActions$.update(IvyActions.scala:125)
 at
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1170)
 at
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1168)
 at
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1191)
 at
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1189)
 at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1193)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1188)
 at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
 at sbt.Classpaths$.cachedUpdate(Defaults.scala:1196)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1161)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1139)
 at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
 at
 sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
 at sbt.std.Transform$$anon$4.work(System.scala:64)
 at
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
 at sbt.Execute.work(Execute.scala:244)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at
 sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
 at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
 at
 java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
 at java.util.concurrent.FutureTask.run(FutureTask.java:138)
 at
 java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
 at
 java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
 at java.util.concurrent.FutureTask.run(FutureTask.java:138)
 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread arthur.hk.c...@gmail.com
Hi Ted,

Thank you so much!!

As I am new to Spark, can you please advise the steps about how to apply this 
patch to my spark-1.0.2 source folder?

Regards
Arthur


On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297
 
 The pull request is here:
 https://github.com/apache/spark/pull/1893
 
 
 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)
 
 
 Hi,
 
 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2 with 
 HBase 0.98,
 
 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 
 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or should I 
 set HBASE_VERSION back to "0.94.6"?
 
 Regards
 Arthur
 
 
 
 
 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::
 
 sbt.ResolveException: unresolved dependency: org.apache.hbase#hbase;0.98.2: 
 not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
 at 
 xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
 at 
 xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
 at xsbt.boot.Using$.withResource(Using.scala:11)
 at xsbt.boot.Using$.apply(Using.scala:10)
 at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
 at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
 at xsbt.boot.Locks$.apply0(Locks.scala:31)
 at xsbt.boot.Locks$.apply(Locks.scala:28)
 at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
 at sbt.IvySbt.withIvy(Ivy.scala:101)
 at sbt.IvySbt.withIvy(Ivy.scala:97)
 at sbt.IvySbt$Module.withModule(Ivy.scala:116)
 at sbt.IvyActions$.update(IvyActions.scala:125)
 at 
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1170)
 at 
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1168)
 at 
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1191)
 at 
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1189)
 at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1193)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1188)
 at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
 at sbt.Classpaths$.cachedUpdate(Defaults.scala:1196)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1161)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1139)
 at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
 at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
 at sbt.std.Transform$$anon$4.work(System.scala:64)
 at 
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at 
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
 at sbt.Execute.work(Execute.scala:244)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at 
 sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
 at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
 at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
 at 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread Ted Yu
You can get the patch from this URL:
https://github.com/apache/spark/pull/1893.patch

BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the pom.xml

Cheers


On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thank you so much!!

 As I am new to Spark, can you please advise the steps about how to apply
 this patch to my spark-1.0.2 source folder?

 Regards
 Arthur


 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297

 The pull request is here:
 https://github.com/apache/spark/pull/1893


 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2 with
 HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or
 should I set HBASE_VERSION back to "0.94.6"?

 Regards
 Arthur




 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::

 sbt.ResolveException: unresolved dependency:
 org.apache.hbase#hbase;0.98.2: not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
 at
 xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
 at
 xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
 at xsbt.boot.Using$.withResource(Using.scala:11)
 at xsbt.boot.Using$.apply(Using.scala:10)
 at
 xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
 at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
 at xsbt.boot.Locks$.apply0(Locks.scala:31)
 at xsbt.boot.Locks$.apply(Locks.scala:28)
 at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
 at sbt.IvySbt.withIvy(Ivy.scala:101)
 at sbt.IvySbt.withIvy(Ivy.scala:97)
 at sbt.IvySbt$Module.withModule(Ivy.scala:116)
 at sbt.IvyActions$.update(IvyActions.scala:125)
 at
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1170)
 at
 sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1168)
 at
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1191)
 at
 sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$73.apply(Defaults.scala:1189)
 at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:35)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1193)
 at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1188)
 at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:45)
 at sbt.Classpaths$.cachedUpdate(Defaults.scala:1196)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1161)
 at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1139)
 at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
 at
 sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
 at sbt.std.Transform$$anon$4.work(System.scala:64)
 at
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at
 sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
 at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
 at sbt.Execute.work(Execute.scala:244)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
 at
 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread arthur.hk.c...@gmail.com
Hi Ted,

I tried the following steps to apply the patch 1893 but got Hunk FAILED, can 
you please advise how to get thru this error? or is my spark-1.0.2 source not 
the correct one?

Regards
Arthur
 
wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
tar -vxf spark-1.0.2.tgz
cd spark-1.0.2
wget https://github.com/apache/spark/pull/1893.patch
patch < 1893.patch
patching file pom.xml
Hunk #1 FAILED at 45.
Hunk #2 FAILED at 110.
2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
patching file pom.xml
Hunk #1 FAILED at 54.
Hunk #2 FAILED at 72.
Hunk #3 FAILED at 171.
3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
can't find file to patch at input line 267
Perhaps you should have used the -p or --strip option?
The text leading up to this was:
--
|
|From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
|From: tedyu yuzhih...@gmail.com
|Date: Mon, 11 Aug 2014 15:57:46 -0700
|Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
| description to building-with-maven.md
|
|---
| docs/building-with-maven.md | 3 +++
| 1 file changed, 3 insertions(+)
|
|diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
|index 672d0ef..f8bcd2b 100644
|--- a/docs/building-with-maven.md
|+++ b/docs/building-with-maven.md
--
File to patch:



On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:

 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch
 
 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the pom.xml
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted,
 
 Thank you so much!!
 
 As I am new to Spark, can you please advise the steps about how to apply this 
 patch to my spark-1.0.2 source folder?
 
 Regards
 Arthur
 
 
 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:
 
 See SPARK-1297
 
 The pull request is here:
 https://github.com/apache/spark/pull/1893
 
 
 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)
 
 
 Hi,
 
 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2 with 
 HBase 0.98,
 
 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 
 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or should 
 I set HBASE_VERSION back to "0.94.6"?
 
 Regards
 Arthur
 
 
 
 
 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::
 
 sbt.ResolveException: unresolved dependency: org.apache.hbase#hbase;0.98.2: 
 not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
 at 
 xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
 at 
 xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
 at xsbt.boot.Using$.withResource(Using.scala:11)
 at xsbt.boot.Using$.apply(Using.scala:10)
 at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
 at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
 at xsbt.boot.Locks$.apply0(Locks.scala:31)
 at xsbt.boot.Locks$.apply(Locks.scala:28)
 at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
 at sbt.IvySbt.withIvy(Ivy.scala:101)
 at sbt.IvySbt.withIvy(Ivy.scala:97)
 at sbt.IvySbt$Module.withModule(Ivy.scala:116)
 at sbt.IvyActions$.update(IvyActions.scala:125)
 at 
 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread Ted Yu
Can you use this command ?

patch -p1 -i 1893.patch

Cheers


On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 I tried the following steps to apply the patch 1893 but got Hunk FAILED,
 can you please advise how to get thru this error? or is my spark-1.0.2
 source not the correct one?

 Regards
 Arthur

 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
 patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:



 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:

 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch

 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the
 pom.xml

 Cheers


 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thank you so much!!

 As I am new to Spark, can you please advise the steps about how to apply
 this patch to my spark-1.0.2 source folder?

 Regards
 Arthur


 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297

  The pull request is here:
 https://github.com/apache/spark/pull/1893


 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2
 with HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or
 should I set HBASE_VERSION back to "0.94.6"?

 Regards
 Arthur




 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::

 sbt.ResolveException: unresolved dependency:
 org.apache.hbase#hbase;0.98.2: not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
 at
 xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:81)
 at
 xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
 at xsbt.boot.Using$.withResource(Using.scala:11)
 at xsbt.boot.Using$.apply(Using.scala:10)
 at
 xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
 at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
 at xsbt.boot.Locks$.apply0(Locks.scala:31)
 at xsbt.boot.Locks$.apply(Locks.scala:28)
 at sbt.IvySbt.withDefaultLogger(Ivy.scala:60)
 at sbt.IvySbt.withIvy(Ivy.scala:101)
 at sbt.IvySbt.withIvy(Ivy.scala:97)
 at 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread arthur.hk.c...@gmail.com
Hi Ted, 

Thanks. 

Tried "patch -p1 -i 1893.patch" but still got "Hunk #1 FAILED at 45."
Is this normal?

Regards
Arthur


patch -p1 -i 1893.patch
patching file examples/pom.xml
Hunk #1 FAILED at 45.
Hunk #2 succeeded at 94 (offset -16 lines).
1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
patching file examples/pom.xml
Hunk #1 FAILED at 54.
Hunk #2 FAILED at 72.
Hunk #3 succeeded at 122 (offset -49 lines).
2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
patching file docs/building-with-maven.md
patching file examples/pom.xml
Hunk #1 succeeded at 122 (offset -40 lines).
Hunk #2 succeeded at 195 (offset -40 lines).


On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:

 Can you use this command ?
 
 patch -p1 -i 1893.patch
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted,
 
 I tried the following steps to apply the patch 1893 but got Hunk FAILED, can 
 you please advise how to get thru this error? or is my spark-1.0.2 source not 
 the correct one?
 
 Regards
 Arthur
  
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
 patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:
 
 
 
 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:
 
 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch
 
 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the pom.xml
 
 Cheers
 
 
 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 Hi Ted,
 
 Thank you so much!!
 
 As I am new to Spark, can you please advise the steps about how to apply 
 this patch to my spark-1.0.2 source folder?
 
 Regards
 Arthur
 
 
 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:
 
 See SPARK-1297
 
 The pull request is here:
 https://github.com/apache/spark/pull/1893
 
 
 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:
 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)
 
 
 Hi,
 
 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2 with 
 HBase 0.98,
 
 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 
 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or should 
 I set HBASE_VERSION back to "0.94.6"?
 
 Regards
 Arthur
 
 
 
 
 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::
 
 sbt.ResolveException: unresolved dependency: org.apache.hbase#hbase;0.98.2: 
 not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:125)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:116)
 at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:104)
 at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:51)
 at sbt.IvySbt$$anon$3.call(Ivy.scala:60)
 at 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread Ted Yu
Looks like the patch given by that URL only had the last commit.

I have attached pom.xml for spark-1.0.2 to SPARK-1297
You can download it and replace examples/pom.xml with the downloaded pom

I am running this command locally:

mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package

Cheers


On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com 
arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thanks.

 Tried "patch -p1 -i 1893.patch" but still got "Hunk #1 FAILED at 45."
 Is this normal?

 Regards
 Arthur


 patch -p1 -i 1893.patch
 patching file examples/pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 succeeded at 94 (offset -16 lines).
 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file examples/pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 succeeded at 122 (offset -49 lines).
 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file docs/building-with-maven.md
 patching file examples/pom.xml
 Hunk #1 succeeded at 122 (offset -40 lines).
 Hunk #2 succeeded at 195 (offset -40 lines).


 On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:

 Can you use this command ?

 patch -p1 -i 1893.patch

 Cheers


 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 I tried the following steps to apply the patch 1893 but got Hunk FAILED,
 can you please advise how to get thru this error? or is my spark-1.0.2
 source not the correct one?

 Regards
 Arthur

 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
 patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:



 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:

 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch

 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the
 pom.xml

 Cheers


 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thank you so much!!

 As I am new to Spark, can you please advise the steps about how to apply
 this patch to my spark-1.0.2 source folder?

 Regards
 Arthur


 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297

  The pull request is here:
 https://github.com/apache/spark/pull/1893


 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98", please
 ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2
 with HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or
 should I set HBASE_VERSION back to "0.94.6"?

 Regards
 Arthur




 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: org.apache.hbase#hbase;0.98.2: not found
 [warn]  ::

 sbt.ResolveException: unresolved dependency:
 org.apache.hbase#hbase;0.98.2: not found
 at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
 at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
 at 

Re: Compilation Error: Spark 1.0.2 with HBase 0.98

2014-08-27 Thread Ted Yu
I forgot to include '-Dhadoop.version=2.4.1' in the command below.

The modified command passed.

You can verify the dependence on hbase 0.98 through this command:

mvn -Phbase-hadoop2,hadoop-2.4,yarn -Dhadoop.version=2.4.1 -DskipTests
dependency:tree > dep.txt

Cheers


On Wed, Aug 27, 2014 at 8:58 PM, Ted Yu yuzhih...@gmail.com wrote:

 Looks like the patch given by that URL only had the last commit.

 I have attached pom.xml for spark-1.0.2 to SPARK-1297
 You can download it and replace examples/pom.xml with the downloaded pom

 I am running this command locally:

 mvn -Phbase-hadoop2,hadoop-2.4,yarn -DskipTests clean package

 Cheers


 On Wed, Aug 27, 2014 at 7:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thanks.

 Tried "patch -p1 -i 1893.patch" but still got "Hunk #1 FAILED at 45."
 Is this normal?

 Regards
 Arthur


 patch -p1 -i 1893.patch
 patching file examples/pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 succeeded at 94 (offset -16 lines).
 1 out of 2 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file examples/pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 succeeded at 122 (offset -49 lines).
 2 out of 3 hunks FAILED -- saving rejects to file examples/pom.xml.rej
 patching file docs/building-with-maven.md
 patching file examples/pom.xml
 Hunk #1 succeeded at 122 (offset -40 lines).
 Hunk #2 succeeded at 195 (offset -40 lines).


 On 28 Aug, 2014, at 10:53 am, Ted Yu yuzhih...@gmail.com wrote:

 Can you use this command ?

 patch -p1 -i 1893.patch

 Cheers


 On Wed, Aug 27, 2014 at 7:41 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 I tried the following steps to apply the patch 1893 but got Hunk FAILED,
 can you please advise how to get thru this error? or is my spark-1.0.2
 source not the correct one?

 Regards
 Arthur

 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2
 wget https://github.com/apache/spark/pull/1893.patch
 patch < 1893.patch
 patching file pom.xml
 Hunk #1 FAILED at 45.
 Hunk #2 FAILED at 110.
 2 out of 2 hunks FAILED -- saving rejects to file pom.xml.rej
 patching file pom.xml
 Hunk #1 FAILED at 54.
 Hunk #2 FAILED at 72.
 Hunk #3 FAILED at 171.
 3 out of 3 hunks FAILED -- saving rejects to file pom.xml.rej
 can't find file to patch at input line 267
 Perhaps you should have used the -p or --strip option?
 The text leading up to this was:
 --
 |
 |From cd58437897bf02b644c2171404ccffae5d12a2be Mon Sep 17 00:00:00 2001
 |From: tedyu yuzhih...@gmail.com
 |Date: Mon, 11 Aug 2014 15:57:46 -0700
 |Subject: [PATCH 3/4] SPARK-1297 Upgrade HBase dependency to 0.98 - add
 | description to building-with-maven.md
 |
 |---
 | docs/building-with-maven.md | 3 +++
 | 1 file changed, 3 insertions(+)
 |
 |diff --git a/docs/building-with-maven.md b/docs/building-with-maven.md
 |index 672d0ef..f8bcd2b 100644
 |--- a/docs/building-with-maven.md
 |+++ b/docs/building-with-maven.md
 --
 File to patch:



 On 28 Aug, 2014, at 10:24 am, Ted Yu yuzhih...@gmail.com wrote:

 You can get the patch from this URL:
 https://github.com/apache/spark/pull/1893.patch

 BTW 0.98.5 has been released - you can specify 0.98.5-hadoop2 in the
 pom.xml

 Cheers


 On Wed, Aug 27, 2014 at 7:18 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 Hi Ted,

 Thank you so much!!

 As I am new to Spark, can you please advise the steps about how to
 apply this patch to my spark-1.0.2 source folder?

 Regards
 Arthur


 On 28 Aug, 2014, at 10:13 am, Ted Yu yuzhih...@gmail.com wrote:

 See SPARK-1297

  The pull request is here:
 https://github.com/apache/spark/pull/1893


 On Wed, Aug 27, 2014 at 6:57 PM, arthur.hk.c...@gmail.com 
 arthur.hk.c...@gmail.com wrote:

 (correction: "Compilation Error: Spark 1.0.2 with HBase 0.98",
 please ignore if duplicated)


 Hi,

 I need to use Spark with HBase 0.98 and tried to compile Spark 1.0.2
 with HBase 0.98,

 My steps:
 wget http://d3kbcqa49mib13.cloudfront.net/spark-1.0.2.tgz
 tar -vxf spark-1.0.2.tgz
 cd spark-1.0.2

 edit project/SparkBuild.scala, set HBASE_VERSION
   // HBase version; set as appropriate.
   val HBASE_VERSION = "0.98.2"
 
 
 edit pom.xml with following values
 <hadoop.version>2.4.1</hadoop.version>
 <protobuf.version>2.5.0</protobuf.version>
 <yarn.version>${hadoop.version}</yarn.version>
 <hbase.version>0.98.5</hbase.version>
 <zookeeper.version>3.4.6</zookeeper.version>
 <hive.version>0.13.1</hive.version>
 
 
 SPARK_HADOOP_VERSION=2.4.1 SPARK_YARN=true sbt/sbt clean assembly
 but it fails because of UNRESOLVED DEPENDENCIES hbase;0.98.2
 
 Can you please advise how to compile Spark 1.0.2 with HBase 0.98? or
 should I set HBASE_VERSION back to "0.94.6"?

 Regards
 Arthur




 [warn]  ::
 [warn]  ::  UNRESOLVED DEPENDENCIES ::
 [warn]  ::
 [warn]  :: 

Re: Compilation error in Spark 1.0.0

2014-07-09 Thread Silvina Caíno Lores
Right, the compile error is a casting issue telling me I cannot assign
a JavaPairRDD<Partition, Body> to a JavaPairRDD<Object, Object>. It happens
in the mapToPair() method.




On 9 July 2014 19:52, Sean Owen so...@cloudera.com wrote:

 You forgot the compile error!


 On Wed, Jul 9, 2014 at 6:14 PM, Silvina Caíno Lores silvi.ca...@gmail.com
  wrote:

 Hi everyone,

  I am new to Spark and I'm having problems getting my code to compile. I
 have the feeling I might be misunderstanding the functions, so I would be
 very glad to get some insight into what could be wrong.

 The problematic code is the following:

 JavaRDD<Body> bodies = lines.map(l -> {Body b = new Body(); b.parse(l);});

 JavaPairRDD<Partition, Iterable<Body>> partitions =
 bodies.mapToPair(b ->
 b.computePartitions(maxDistance)).groupByKey();

  Partition and Body are defined inside the driver class. Body contains
 the following definition:

 protected Iterable<Tuple2<Partition, Body>> computePartitions(int
 maxDistance)

 The idea is to reproduce the following schema:

 The first map results in: *body1, body2, ...*
 The mapToPair should output several of these: *(partition_i, body1),
 (partition_i, body2), ...*
 Which are gathered by key as follows: *(partition_i, (body1, ..., body_n)),
 (partition_i', (body2, ..., body_n')), ...*

 Thanks in advance.
 Regards,
 Silvina
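
The thread ends here without a posted fix. For reference, below is a minimal,
self-contained sketch of one way to keep the pair types, using made-up stand-in
Partition and Body classes (the real ones live in the driver class and are not
shown in the thread). Because computePartitions() returns several
(Partition, Body) tuples per Body, the second step matches flatMapToPair rather
than mapToPair, and writing the function as an explicit
PairFlatMapFunction<Body, Partition, Body> (Spark 1.x Java API) keeps the result
from being inferred as JavaPairRDD<Object, Object>. The first map also has to
return the Body it builds.

import java.util.Arrays;

import scala.Tuple2;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.PairFlatMapFunction;

public class PartitionBodies {

  // Hypothetical stand-ins for the Partition and Body classes from the thread.
  // Partition is used as a groupByKey key, so it needs equals/hashCode.
  public static class Partition implements java.io.Serializable {
    final int id;
    Partition(int id) { this.id = id; }
    @Override public int hashCode() { return id; }
    @Override public boolean equals(Object o) {
      return o instanceof Partition && ((Partition) o).id == id;
    }
  }

  public static class Body implements java.io.Serializable {
    double position;
    void parse(String line) { position = Double.parseDouble(line.trim()); }
    // Each body can fall into more than one partition, hence an Iterable of pairs.
    Iterable<Tuple2<Partition, Body>> computePartitions(int maxDistance) {
      int p = (int) (position / maxDistance);
      return Arrays.asList(
          new Tuple2<>(new Partition(p), this),
          new Tuple2<>(new Partition(p + 1), this));
    }
  }

  public static void main(String[] args) {
    JavaSparkContext sc = new JavaSparkContext(
        new SparkConf().setAppName("partition-bodies").setMaster("local[2]"));

    JavaRDD<String> lines = sc.parallelize(Arrays.asList("1.0", "7.5", "12.0"));

    // Unlike the snippet in the question, the Body that gets built is returned.
    JavaRDD<Body> bodies = lines.map(new Function<String, Body>() {
      @Override public Body call(String l) {
        Body b = new Body();
        b.parse(l);
        return b;
      }
    });

    final int maxDistance = 5;

    // One Body produces several (Partition, Body) pairs, so this is a
    // flatMapToPair; the explicit type parameters keep the result typed as
    // JavaPairRDD<Partition, Body> before the groupByKey.
    JavaPairRDD<Partition, Iterable<Body>> partitions =
        bodies.flatMapToPair(new PairFlatMapFunction<Body, Partition, Body>() {
          @Override public Iterable<Tuple2<Partition, Body>> call(Body b) {
            return b.computePartitions(maxDistance);
          }
        }).groupByKey();

    System.out.println(partitions.collect());
    sc.stop();
  }
}

With Java 8 lambdas the same calls can stay one-liners, but the anonymous
classes above spell out the inferred types, which is usually the quickest way
to see where an Object, Object pair RDD is coming from.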