Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Vasu Parameswaran
Added these to the pom and still the same error :-(. I will look into sbt
as well.





Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Tristan Nixon
You must be relying on IntelliJ to compile your scala, because you haven’t set 
up any scala plugin to compile it from maven.
You should have something like this in your plugins:

<plugin>
 <groupId>net.alchim31.maven</groupId>
 <artifactId>scala-maven-plugin</artifactId>
 <executions>
  <execution>
   <id>scala-compile-first</id>
   <phase>process-resources</phase>
   <goals>
    <goal>compile</goal>
   </goals>
  </execution>
  <execution>
   <id>scala-test-compile</id>
   <phase>process-test-resources</phase>
   <goals>
    <goal>testCompile</goal>
   </goals>
  </execution>
 </executions>
</plugin>

PS - I use maven to compile all my scala and haven’t had a problem with it. I 
know that sbt has some wonderful things, but I’m just set in my ways ;)
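[Editor's sketch: the plugin block above omits a plugin version and a Scala version. A fuller configuration might look like the following — note that 3.2.2 and 2.11.7 are assumed version numbers, not taken from this thread:

```xml
<!-- Sketch only: plugin version 3.2.2 and Scala 2.11.7 are assumptions -->
<plugin>
 <groupId>net.alchim31.maven</groupId>
 <artifactId>scala-maven-plugin</artifactId>
 <version>3.2.2</version>
 <configuration>
  <!-- pin the Scala version the plugin compiles against -->
  <scalaVersion>2.11.7</scalaVersion>
 </configuration>
 <executions>
  <execution>
   <id>scala-compile-first</id>
   <phase>process-resources</phase>
   <goals>
    <goal>compile</goal>
   </goals>
  </execution>
  <execution>
   <id>scala-test-compile</id>
   <phase>process-test-resources</phase>
   <goals>
    <goal>testCompile</goal>
   </goals>
  </execution>
 </executions>
</plugin>
```

Pinning scalaVersion here makes the Maven build, rather than the IDE, the source of truth for which Scala the code is compiled against.]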




Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Jacek Laskowski
Hi,

Doh! My eyes are bleeding to go through XMLs... 😁

Where did you specify the Scala version? Dunno how it's done in Maven.

p.s. I *strongly* recommend sbt.

Jacek


Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Vasu Parameswaran
Thanks Jacek.  Pom is below (currently set to Spark 1.6.1, but I started out
with 1.6.0 with the same problem).



<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>spark</artifactId>
        <groupId>com.test</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <artifactId>sparktest</artifactId>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </dependency>
        <dependency>
            <groupId>commons-cli</groupId>
            <artifactId>commons-cli</artifactId>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.3.1</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>1.6.1</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.2</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <finalName>${project.artifactId}-${project.version}-with-dependencies</finalName>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>




Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Vasu Parameswaran
Thanks Ted.

I haven't explicitly specified Scala (I tried different versions in pom.xml
as well).

For what it is worth, this is what I get when I run mvn dependency:tree.  I
wonder if the 2.11.2 coming from scala-reflect matters:


[INFO] |  | \- org.scala-lang:scalap:jar:2.11.0:compile
[INFO] |  |    \- org.scala-lang:scala-compiler:jar:2.11.0:compile
[INFO] |  |       +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.1:compile
[INFO] |  |       \- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.1:compile
[INFO] |  +- com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.4.4:compile
[INFO] |  |  +- org.scala-lang:scala-reflect:jar:2.11.2:compile
[INFO] \- org.scala-lang:scala-library:jar:2.11.0:compile
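[Editor's sketch: one quick way to spot a split like the 2.11.0/2.11.2 mix in the tree above is to scan the mvn dependency:tree output for org.scala-lang artifacts and collect the distinct versions. The parsing below is my own illustration, not something from this thread:

```python
import re

def scala_versions(tree_lines):
    """Collect distinct org.scala-lang artifact versions from
    `mvn dependency:tree` output lines."""
    pat = re.compile(r"org\.scala-lang:[\w-]+:jar:([\w.-]+):")
    found = set()
    for line in tree_lines:
        m = pat.search(line)
        if m:
            found.add(m.group(1))
    return found

# Using the tree output pasted above:
tree = [
    "[INFO] |  | \\- org.scala-lang:scalap:jar:2.11.0:compile",
    "[INFO] |  |  +- org.scala-lang:scala-reflect:jar:2.11.2:compile",
    "[INFO] \\- org.scala-lang:scala-library:jar:2.11.0:compile",
]
print(sorted(scala_versions(tree)))  # ['2.11.0', '2.11.2']
```

More than one entry in the result means different Scala patch levels are on the classpath; usually harmless within 2.11.x, but worth ruling out.]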





Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Jacek Laskowski
Hi,

Why do you use maven not sbt for Scala?

Can you show the entire pom.xml and the command to execute the app?

Jacek


Re: Newbie question - Help with runtime error on augmentString

2016-03-11 Thread Ted Yu
Looks like a Scala version mismatch.

Are you using 2.11 everywhere?
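[Editor's sketch: one concrete way to answer that question for the assembly jar is to read the library.properties file that scala-library ships, and which an assembly jar typically bundles at its root. This is an assumption about the jar's contents, so the function returns None if the file isn't there:

```python
import zipfile

def bundled_scala_version(jar_path):
    """Read scala-library's library.properties from a jar and return its
    version.number, or None if the file isn't bundled."""
    with zipfile.ZipFile(jar_path) as jar:
        try:
            props = jar.read("library.properties").decode("utf-8")
        except KeyError:  # file not present in this jar
            return None
    for line in props.splitlines():
        if line.startswith("version.number"):
            return line.split("=", 1)[1].strip()
    return None

# e.g. bundled_scala_version("spark-assembly-1.6.0-hadoop2.6.0.jar")
# returning a 2.10.x value while the pom uses spark-core_2.11 would
# explain the NoSuchMethodError on scala.Predef$.augmentString.
```

The prebuilt Spark 1.6.0 downloads were compiled against Scala 2.10 by default, so a 2.10.x answer here alongside a _2.11 dependency in the pom would be exactly the mismatch described above.]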



Newbie question - Help with runtime error on augmentString

2016-03-11 Thread vasu20
Hi

Any help appreciated on this.  I am trying to write a Spark program using
IntelliJ.  I get a run time error as soon as new SparkConf() is called from
main.  Top few lines of the exception are pasted below.

These are the versions involved:

Spark jar:  spark-assembly-1.6.0-hadoop2.6.0.jar
pom:  spark-core_2.11 (version 1.6.0)

I have installed the Scala plugin in IntelliJ and added a dependency.

I have also added a library dependency in the project structure.

Thanks for any help!

Vasu


Exception in thread "main" java.lang.NoSuchMethodError:
scala.Predef$.augmentString(Ljava/lang/String;)Ljava/lang/String;
at org.apache.spark.util.Utils$.<init>(Utils.scala:1682)
at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)






--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Newbie-question-Help-with-runtime-error-on-augmentString-tp26462.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org