Hi,
Is anybody out there?
Somni-451
Thank you!
On Apr 12, 2016, at 1:41, Ted Yu wrote:
For SparkR, please refer to https://spark.apache.org/docs/latest/sparkr.html
bq. on Ubuntu or CentOS
Both platforms are supported.
On Mon, Apr 11, 2016 at 1:08 PM, wrote:
Dear Experts,

I am posting this for your information. I am a newbie to Spark.
I am interested in understanding Spark at the internal level.
I need your opinion: which Unix flavor should I install Spark on, Ubuntu or
CentOS? I have had enough trouble with the Windows version (1.6.1 with Hadoop
2
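For what it's worth, on either distro a first install is usually just unpacking a prebuilt tarball. A sketch, assuming the 1.6.1 / Hadoop 2.6 prebuilt package from the Apache archive (adjust the version and URL to your needs):

```shell
# Download and unpack a prebuilt Spark (version and URL are illustrative assumptions).
wget https://archive.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.6.tgz
tar -xzf spark-1.6.1-bin-hadoop2.6.tgz
cd spark-1.6.1-bin-hadoop2.6

# Launch the interactive shell in local mode to verify the install works.
./bin/spark-shell --master "local[2]"
```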
Figured it out.
All the jars that were specified with --driver-class-path are now exported
through SPARK_CLASSPATH, and it's working now.
I thought SPARK_CLASSPATH was dead. Looks like it's flipping ON/OFF between
releases.
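The workaround described above can be sketched as a small shell fragment; the jar paths here are hypothetical placeholders, not the actual jars from this thread:

```shell
# Jars that were previously passed via --driver-class-path (hypothetical paths).
JARS="/opt/libs/dep-a.jar /opt/libs/dep-b.jar"

# Join them with ':' (the JVM classpath separator) and export so Spark
# prepends them to its classpath at launch.
SPARK_CLASSPATH=$(echo "$JARS" | tr ' ' ':')
export SPARK_CLASSPATH
echo "$SPARK_CLASSPATH"
```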
On Sun, Jun 28, 2015 at 12:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
Any thoughts on this?
On Fri, Jun 26, 2015 at 2:27 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) wrote:
It used to work with 1.3.1; however, with 1.4.0 I get the following exception:

export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4
export SPARK_JAR=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar
export HADOOP_CONF_DIR=/apache/hadoop/co
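For context, exports like the ones above usually precede a spark-submit launch. A sketch of the equivalent flag-based invocation; the application class, application jar, and extra-jar paths are hypothetical:

```shell
export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4

# --driver-class-path is spark-submit's flag-based counterpart to the
# SPARK_CLASSPATH export discussed in this thread.
"$SPARK_HOME"/bin/spark-submit \
  --master yarn-client \
  --driver-class-path /opt/libs/dep-a.jar:/opt/libs/dep-b.jar \
  --class com.example.MyApp \
  /opt/apps/my-app.jar
```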
I'm having an issue with Spark 1.2.1 and Scala 2.11. I detailed the
symptoms in this Stack Overflow question:
http://stackoverflow.com/questions/28612837/spark-classnotfoundexception-when-running-hello-world-example-in-scala-2-11
Has anyone experienced anything similar?
Thank you!
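One thing worth checking: in the 1.x era the prebuilt Spark downloads were compiled against Scala 2.10, so running against 2.11 generally meant building Spark yourself. The steps below follow the 1.2.x-era build instructions as I recall them, so verify against the build docs for your exact version:

```shell
# From a Spark 1.2.x source checkout: switch the POMs to Scala 2.11, then build.
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
```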
>> }
>> }
>> Runs with Scala 2.10.4
>> The problem is this [vague] exception:
>>
>>     at com.example.scamel.Nizoz.main(Nizoz.scala)
>> Caused by: java.lang.RuntimeEx
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>     ...
>>     ... 10 more
>> Caused by: java.lang.UnsatisfiedLinkError:
>> org.apache.hadoop.security.JniBa
Thanks, Manu.

I just saw that I had included the Hadoop client 2.x in my pom.xml; removing
it solved the problem.

Thanks for your help.

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Fails-to-run-simple-Spark-Hello-World-scala-program-tp14718p14721.html
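A quick way to catch this kind of mismatch before it bites is to inspect which Hadoop artifacts Maven actually resolves. A sketch, run from the project root of a Maven build:

```shell
# List only the org.apache.hadoop artifacts in the resolved dependency tree;
# a hadoop-client 2.x entry alongside a Spark build targeting Hadoop 1.x
# signals the conflict described in this thread.
mvn dependency:tree -Dincludes=org.apache.hadoop
```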
Caused by: java.lang.UnsatisfiedLinkError:
org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)

I have Hadoop 1.2.1 running on Ubuntu 14.04; the Scala console runs as
expected.
What am I doing wrong?
Any idea will be welcome.