Hi, all:
Sometimes a task fails with the exception "Received LaunchTask command but
executor was null", and I found it is a common problem:
https://issues.apache.org/jira/browse/SPARK-13112
https://issues.apache.org/jira/browse/SPARK-13060
I have a ques
I have no idea... We use Scala. You upgraded to 1.4 so quickly... Are you
using Spark in production? Spark 1.3 is better than Spark 1.4.
------ Original Message ------
From: "??";
Sent: August 14, 2015, 11:14
To: "Sea"<
Yes, I guess so. I have seen this bug before.
------ Original Message ------
From: "??";
Sent: August 13, 2015, 9:30
To: "Sea"<261810...@qq.com>; "dev@spark.apache.org";
Subject: Re: please help with ClassNotFoundException
Are you using 1.4.0? If yes, use 1.4.1
------ Original Message ------
From: "??";
Sent: August 13, 2015, 6:04
To: "dev";
Subject: please help with ClassNotFoundException
Hi, I am using Spark 1.4 and have run into an issue.
I am trying to use the
This exception is so ugly!!! The screen is full of this message when the
program runs for a long time, though it does not fail the job.
I commented it out in the source code. I think this message is useless,
because the executor has already been removed, and I don't know what the
executor id means.
Yes, things work well now. It was a problem with SimpleDateFormat. Thank you all.
------ Original Message ------
From: "Dumas Hwang";
Sent: June 27, 2015, 8:16
To: "Tathagata Das";
Cc: "Emrehan Tüzün"; "Sea"<2
Yes, I made it work.
------ Original Message ------
From: "Gerard Maas";
Sent: June 26, 2015, 5:40
To: "Sea"<261810...@qq.com>;
Cc: "user"; "dev";
Subject: Re: Time is ugly in Spark Streaming
Are you shari
Hi, all
I have found a problem in Spark Streaming: when I use the time in the
foreachRDD function... the time looks very strange.
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder,
StringDecoder](ssc, kafkaParams, topicsSet)
dataStream.map(x => createGroup(x._2,
dimensio
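The thread later traces this to SimpleDateFormat, which is not thread-safe and produces corrupted output when one instance is shared across tasks formatting timestamps in parallel. Below is a minimal sketch of a safe formatter using one instance per thread; the object and method names are mine, not from the original job:

```scala
import java.text.SimpleDateFormat
import java.util.{Date, TimeZone}

object SafeTime {
  // SimpleDateFormat keeps internal mutable state, so each thread
  // gets its own instance instead of sharing one across tasks.
  private val fmt = new ThreadLocal[SimpleDateFormat] {
    override def initialValue(): SimpleDateFormat = {
      val f = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
      f.setTimeZone(TimeZone.getTimeZone("UTC"))
      f
    }
  }

  def format(epochMillis: Long): String =
    fmt.get.format(new Date(epochMillis))
}
```

An immutable `java.time.DateTimeFormatter` would also work, but that requires Java 8, which may not be available on a Spark 1.x cluster.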
Sent: 11:19
To: "Sea"<261810...@qq.com>;
Cc: "dev";
Subject: Re: Spark-sql(yarn-client) java.lang.NoClassDefFoundError:
org/apache/spark/deploy/yarn/ExecutorLauncher
Is it the full stack trace?
On Thu, Jun 18, 2015 at 6:39 AM, Sea <261810...@qq.com> wrote:
Hi, all:
I
Hi, all:
I want to run Spark SQL on YARN (yarn-client), but ... I have already set
"spark.yarn.jar" and "spark.jars" in conf/spark-defaults.conf.
./bin/spark-sql -f game.sql --executor-memory 2g --num-executors 100 > game.txt
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spar
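For reference, the two properties mentioned above look roughly like this in conf/spark-defaults.conf. The paths are hypothetical; the key point is that spark.yarn.jar must point at the assembly jar matching the Spark version actually running, or YARN cannot find classes such as ExecutorLauncher:

```
spark.yarn.jar   hdfs:///user/spark/share/lib/spark-assembly-1.3.0-hadoop2.2.0.jar
spark.jars       /path/to/app-dependencies.jar
```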
nice.
------ Original Message ------
From: "Akhil Das";
Sent: June 15, 2015, 5:36 PM
To: "Sea"<261810...@qq.com>;
Cc: "dev";
Subject: Re: About HostName display in SparkUI
In the conf/slaves file, do you have the ip
In Spark 1.4.0, I find that the Address shown is the IP (it was the hostname
in v1.3.0). Why? Who changed it?
Hi, all:
I use the function updateStateByKey in Spark Streaming. I need to store the
state for one minute, so I set "spark.cleaner.ttl" to 120 with a batch
duration of 2 seconds, but it throws an exception:
Caused by:
org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does
not exist
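spark.cleaner.ttl deletes old metadata and files wholesale, which can remove files the job still needs (a likely cause of the FileNotFoundException above). A safer pattern is to expire entries inside the update function passed to updateStateByKey, by returning None for stale keys. This is a sketch with hypothetical names, not the original job's code:

```scala
// State carries (count, lastUpdatedMillis); returning None removes the
// key from the state RDD, so old state expires without spark.cleaner.ttl.
case class Counter(count: Int, lastUpdated: Long)

def updateFn(now: Long, ttlMillis: Long)(
    newValues: Seq[Int], state: Option[Counter]): Option[Counter] = {
  val prev = state.getOrElse(Counter(0, now))
  if (newValues.nonEmpty)
    Some(Counter(prev.count + newValues.sum, now)) // new data: update and touch
  else if (now - prev.lastUpdated > ttlMillis)
    None                                           // stale: drop the key
  else
    Some(prev)                                     // quiet but still fresh
}
```

In the streaming job this would be wired up as `stream.updateStateByKey(updateFn(System.currentTimeMillis, 60000L) _)`, recomputed per batch.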
Hi, Vinodkc
Yes, I found another solution, https://github.com/apache/spark/pull/4771/
I will test it later.
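A workaround often suggested for this class of "Filesystem closed" errors is disabling the shared Hadoop FileSystem cache, so each client gets its own instance and a shutdown hook closing one does not invalidate the others. It can be passed through Spark via the spark.hadoop.* prefix; whether it fixes this particular case is an assumption:

```
spark.hadoop.fs.hdfs.impl.disable.cache  true
```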
------ Original Message ------
From: "vinodkc";
Sent: March 21, 2015 (Saturday), 4:52 PM
To: "dev";
Subject: Re: Filesystem closed Exception
Hi Sea,
I've rai
Hi, all:
When I exit the spark-sql console, the following exception is thrown.
My Spark version is 1.3.0 and my Hadoop version is 2.2.0.
Exception in thread "Thread-3" java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
at
Hi, all:
When I exit the spark-sql console, the following exception is thrown.
Exception in thread "Thread-3" java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClien
Hi, all:
Spark1.3.0 hadoop2.2.0
I put the following params in the spark-defaults.conf
spark.dynamicAllocation.enabled true
spark.dynamicAllocation.minExecutors 20
spark.dynamicAllocation.maxExecutors 300
spark.dynamicAllocation.executorIdleTimeout 300
spark.shuffle.service.enabled true
I