-library is provided, you need to change
it to compile scope to run SparkPi in IntelliJ. As I remember, you also need to
change the guava and jetty related libraries to compile scope too.
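A sketch of that scope change, assuming an sbt-based build (the module names and versions below are illustrative, not taken from the thread):

```scala
// build.sbt fragment (illustrative): dependencies the Spark build marks
// "provided" are absent from IntelliJ's run classpath, which produces
// NoClassDefFoundError when running SparkPi from the IDE. Declaring them
// in the default compile scope puts them back on the run classpath.
libraryDependencies ++= Seq(
  "org.scala-lang"    % "scala-library" % "2.10.4",
  "com.google.guava"  % "guava"         % "14.0.1",
  "org.eclipse.jetty" % "jetty-server"  % "8.1.14.v20131031"
)
```

The same effect can be had in IntelliJ by editing the module dependencies and changing the scope of these entries from Provided to Compile.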
On Mon, Aug 17, 2015 at 2:14 AM, xiaohe lan zombiexco...@gmail.com
wrote:
Hi,
I am trying to run SparkPi in IntelliJ and am getting a NoClassDefFoundError.
Has anyone else seen this issue before?
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Seq
at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
at
Changing the JDK from 1.8.0_45 to 1.7.0_79 solved this issue.
I saw https://issues.apache.org/jira/browse/SPARK-6388,
but it turned out not to be the problem.
On Thu, Jul 2, 2015 at 1:30 PM, xiaohe lan zombiexco...@gmail.com wrote:
Hi experts,
Hadoop version: 2.4
Spark version: 1.3.1
I am running the SparkPi example application.
bin/spark-submit --class org.apache.spark.examples.SparkPi --master
yarn-client --executor-memory 2G lib/spark-examples-1.3.1-hadoop2.4.0.jar
2
The same command sometimes gets WARN
...@cloudera.com
wrote:
Awesome!
It's documented here:
https://spark.apache.org/docs/latest/submitting-applications.html
-Sandy
On Mon, May 18, 2015 at 8:03 PM, xiaohe lan zombiexco...@gmail.com
wrote:
Hi Sandy,
Thanks for your information. Yes, spark-submit --master yarn
--num-executors 5
, Sandy Ryza sandy.r...@cloudera.com
wrote:
Hi Xiaohe,
All Spark options must go before the jar or they won't take effect.
-Sandy
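To illustrate the ordering rule (jar name and option values taken from earlier in this thread):

```shell
# Wrong: anything after the application jar is handed to SparkPi as an
# application argument, not interpreted by spark-submit itself.
bin/spark-submit --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-1.3.1-hadoop2.4.0.jar \
  --master yarn --num-executors 5

# Right: all spark-submit options come before the jar; only application
# arguments (here, the number of slices) follow it.
bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn --num-executors 5 \
  lib/spark-examples-1.3.1-hadoop2.4.0.jar 2
```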
On Sun, May 17, 2015 at 8:59 AM, xiaohe lan zombiexco...@gmail.com
wrote:
Sorry, both of them are actually assigned tasks.
Aggregated Metrics by Executor
[per-executor metrics table garbled in the paste; it showed executor host2:62072 with 21.7 min task time and input/shuffle figures in MB, e.g. 1646.6 MB and 304.8 MB]
On Sun, May 17, 2015 at 11:50 PM, xiaohe lan zombiexco...@gmail.com wrote:
bash-4.1$ ps aux | grep SparkSubmit
xilan 1704 13.2 1.2 5275520 380244 pts/0 Sl+ 08:39 0:13
/scratch/xilan/jdk1.8.0_45/bin
Did you try the --executor-cores param? While the job is running, do a ps aux
| grep spark-submit and check the exact command parameters.
Thanks
Best Regards
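The suggested check can be sketched end to end (the submit command mirrors the SparkPi invocation earlier in the thread; the --executor-cores value is illustrative):

```shell
# Submit with an explicit cores setting, in the background...
bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn-client --executor-memory 2G --executor-cores 4 \
  lib/spark-examples-1.3.1-hadoop2.4.0.jar 2 &

# ...then inspect the exact parameters the JVM actually received.
# The [s] trick keeps the grep process itself out of the results.
ps aux | grep '[s]park-submit'
```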
On Sat, May 16, 2015 at 12:31 PM, xiaohe lan zombiexco...@gmail.com
wrote:
Hi,
I have a 5-node YARN cluster, and I used spark-submit
link:
http://mbonaci.github.io/mbo-spark/
You don't need to install Spark on every node. Just install it on one node,
or install it on a remote system and drive the cluster from there.
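A minimal sketch of that client-side setup, assuming a downloaded Spark 1.3.1 binary distribution and a Hadoop config directory at /etc/hadoop/conf (both paths are illustrative):

```shell
# Unpack a Spark build on one client node only; when submitting to YARN,
# the needed jars are shipped to the worker nodes for you.
tar xzf spark-1.3.1-bin-hadoop2.4.tgz
cd spark-1.3.1-bin-hadoop2.4

# Point Spark at the cluster's Hadoop/YARN configuration.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# yarn-client runs the driver locally; yarn-cluster runs it inside YARN.
bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  lib/spark-examples-1.3.1-hadoop2.4.0.jar 10
```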
Thanks
Madhvi
On Thursday 30 April 2015 09:31 AM, xiaohe lan wrote:
Hi experts,
I see Spark on YARN has yarn-client and yarn-cluster modes. I also have a
5-node Hadoop cluster (Hadoop 2.4). How do I install Spark if I want to try
the Spark on YARN mode?
Do I need to install Spark on each node of the Hadoop cluster?
Thanks,
Xiaohe