hello

2021-01-02 Thread somni fourfiveone
Hi, Is anybody out there? Somni-451

Re: Hello !

2016-04-11 Thread mylisttech
Thank you ! On Apr 12, 2016, at 1:41, Ted Yu wrote: > For SparkR, please refer to https://spark.apache.org/docs/latest/sparkr.html > > bq. on Ubuntu or CentOS > > Both platforms are supported. > > On Mon, Apr 11, 2016 at 1:08 PM, wrote: > Dear Experts , > > I am posting this for your inf

Re: Hello !

2016-04-11 Thread Ted Yu
For SparkR, please refer to https://spark.apache.org/docs/latest/sparkr.html bq. on Ubuntu or CentOS Both platforms are supported. On Mon, Apr 11, 2016 at 1:08 PM, wrote: > Dear Experts , > > I am posting this for your information. I am a newbie to spark. > I am interested in understanding Spa

Hello !

2016-04-11 Thread mylisttech
Dear Experts, I am posting this for your information. I am a newbie to Spark. I am interested in understanding Spark at the internal level. I need your opinion: which Unix flavor should I install Spark on, Ubuntu or CentOS? I have had enough trouble with the Windows version (1.6.1 with Hadoop 2

Re: Unable to start Pi (hello world) application on Spark 1.4

2015-06-28 Thread ๏̯͡๏
Figured it out. All the jars that are specified with driver-class-path are now exported through SPARK_CLASSPATH and it's working now. I thought SPARK_CLASSPATH was dead. Looks like it's flipping ON/OFF On Sun, Jun 28, 2015 at 12:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote: > Any thoughts on this ? > > On Fri, Ju
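The workaround described above can be sketched in a few lines of shell. The jar paths below are hypothetical stand-ins for whatever was previously passed via --driver-class-path; SPARK_CLASSPATH expects a colon-separated list:

```shell
# Hypothetical jars that were previously passed via --driver-class-path
JARS="/opt/libs/guava.jar /opt/libs/joda-time.jar"

# Join the space-separated list with ':' to form a classpath string,
# then export it so spark-submit picks it up via SPARK_CLASSPATH
SPARK_CLASSPATH=$(echo "$JARS" | tr ' ' ':')
export SPARK_CLASSPATH
echo "$SPARK_CLASSPATH"
```

Note that SPARK_CLASSPATH was already deprecated at the time of this thread (the poster's "I thought SPARK_CLASSPATH was dead"), so this is a stopgap rather than a recommended configuration.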

Re: Unable to start Pi (hello world) application on Spark 1.4

2015-06-28 Thread ๏̯͡๏
Any thoughts on this? On Fri, Jun 26, 2015 at 2:27 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote: > It used to work with 1.3.1, however with 1.4.0 I get the following > exception > > > export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4 > export > SPARK_JAR=/home/dvasthimal/spark1.4/spark-1.4.0-bin

Unable to start Pi (hello world) application on Spark 1.4

2015-06-26 Thread ๏̯͡๏
It used to work with 1.3.1, however with 1.4.0 I get the following exception: export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4 export SPARK_JAR=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar export HADOOP_CONF_DIR=/apache/hadoop/co
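A minimal sketch of the environment setup this post describes, with placeholder paths rather than the poster's site-specific layout. One plausible factor in the 1.3.1-to-1.4.0 behavior change: the SPARK_JAR environment variable was deprecated during the 1.x line in favor of the spark.yarn.jar configuration property, so relying on it in 1.4 may behave differently than it did in 1.3:

```shell
# Hypothetical install locations; adjust to your own layout
export SPARK_HOME="$HOME/spark-1.4.0-bin-hadoop2.4"
export HADOOP_CONF_DIR="/etc/hadoop/conf"

# SPARK_JAR (used in the original post) is deprecated in Spark 1.x;
# the spark.yarn.jar config property is the documented replacement
# for pointing YARN at the assembly jar.
echo "SPARK_HOME=$SPARK_HOME"
```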

Re: Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Akhil Das
http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11 > > Has anyone experienced anything similar? > > Thank you! >

Spark 1.2.1: ClassNotFoundException when running hello world example in scala 2.11

2015-02-19 Thread Luis Solano
I'm having an issue with spark 1.2.1 and scala 2.11. I detailed the symptoms in this stackoverflow question. http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11 Has anyone experienced anything similar? Thank you!

Re: hello

2014-12-18 Thread Harihar Nahak

Re: Fails to run simple Spark (Hello World) scala program

2014-09-23 Thread Moshe Beeri
} } Runs with Scala 2.10.4 The problem is this [vague] exception: at com.example.scamel.Nizoz.main(Nizoz.scala) Caused by: java.lang.RuntimeEx

Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ...

Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Sean Owen
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ... ... 10 more Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBa

Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
wrote: Thank Manu, I just saw I have included hadoop client 2.x in my pom.xml, removing it solved the problem. Thanks for your help

Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
Thank Manu, I just saw I have included hadoop client 2.x in my pom.xml, removing it solved the problem. Thanks for your help -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Fails-to-run-simple-Spark-Hello-World-scala-program-tp14718p14721.html
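The fix here was spotting a stray hadoop-client 2.x dependency in the project's pom.xml. A minimal sketch of checking for it (the pom snippet below is a hypothetical reproduction, not the poster's actual file): spark-core already pulls in a compatible Hadoop client, and a second, mismatched one (2.x against a Hadoop 1.2.1 install) is consistent with the UnsatisfiedLinkError in JniBasedUnixGroupsMapping seen earlier in the thread.

```shell
# Write a hypothetical pom fragment containing the conflicting dependency
POM=$(mktemp)
cat > "$POM" <<'EOF'
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version>
</dependency>
EOF

# Count lines mentioning hadoop-client; any hit in your own pom is a
# candidate to remove (or to align with your cluster's Hadoop version)
FOUND=$(grep -c 'hadoop-client' "$POM")
echo "hadoop-client occurrences: $FOUND"
rm -f "$POM"
```

In a real Maven project, `mvn dependency:tree` is the usual way to see which Hadoop client versions are actually on the classpath.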

Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Manu Suryavansh
hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method) at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<init>(JniBasedUnixGroupsMapping.java:49) I have Hadoop 1.2

Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
psMapping.(JniBasedUnixGroupsMapping.java:49) I have Hadoop 1.2.1 running on Ubuntu 14.04, the Scala console runs as expected. What am I doing wrong? Any idea will be welcome