Hi, all:
Sometimes a task will fail with the exception "Received
LaunchTask command but executor was null", and I find it is a common problem:
https://issues.apache.org/jira/browse/SPARK-13112
https://issues.apache.org/jira/browse/SPARK-13060
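For context, those two JIRAs describe a startup race: the executor backend can receive a LaunchTask message before the RegisteredExecutor message that actually creates the executor, so the task hits a null executor. A minimal, Spark-free sketch of that ordering problem (the names below are illustrative, not Spark's real classes):

```scala
// Sketch of the race behind "Received LaunchTask command but executor was null".
// A message handler may see LaunchTask before the RegisteredExecutor message
// that initializes the executor; without a guard it dereferences null.
object LaunchRace {
  sealed trait Msg
  case object RegisteredExecutor extends Msg
  case class LaunchTask(id: Int) extends Msg

  var executor: Option[String] = None

  def receive(msg: Msg): String = msg match {
    case RegisteredExecutor =>
      executor = Some("executor-1")
      "registered"
    case LaunchTask(id) if executor.isEmpty =>
      // The guard turns a NullPointerException into a clean, visible failure.
      s"task $id arrived before registration; failing fast instead of NPE"
    case LaunchTask(id) =>
      s"task $id launched on ${executor.get}"
  }

  def main(args: Array[String]): Unit = {
    println(receive(LaunchTask(0))) // out-of-order delivery
    println(receive(RegisteredExecutor))
    println(receive(LaunchTask(1)))
  }
}
```

This only illustrates why message ordering matters; the actual Spark fix is the one discussed in the linked JIRAs.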
I have a
Are you using 1.4.0? If yes, use 1.4.1
-- Original Message --
From: qhz...@apache.org
Date: Thu, Aug 13, 2015, 6:04
To: dev@spark.apache.org
Subject: please help with ClassNotFoundException
Hi, I am using Spark 1.4 when an issue occurs
Sea: Is it the same issue as
https://issues.apache.org/jira/browse/SPARK-8368
Sea 261810...@qq.com wrote on Aug 13, 2015, 6:52:
Are you using 1.4.0? If yes, use 1.4.1
-- Original Message --
From: qhz...@apache.org
Date: Thu, Aug 13, 2015, 6
To: dev@spark.apache.org
Subject: Re: please help with ClassNotFoundException
Hi Sea, I have updated Spark to 1.4.1; however, the problem still exists. Any
idea?
Sea 261810...@qq.com wrote on Aug 14, 2015, 12:36:
Yes, I guess so. I have seen this bug before.
t...@databricks.com wrote:
Could you print the time on the driver (that is, in foreachRDD but before
RDD.foreachPartition) and see if it is behaving weird?
TD
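TD's suggestion boils down to this: a time read in foreachRDD is evaluated on the driver when the batch is set up, while code inside foreachPartition runs later on an executor, so the two can legitimately differ by the scheduling delay. A small, Spark-free sketch of that gap (names are illustrative):

```scala
// Spark-free sketch: the timestamp read when the batch closure is built
// ("in foreachRDD", driver side) vs. the one read when the partition work
// finally runs ("in foreachPartition", executor side).
object TimeCapture {
  def measure(delayMs: Long): (Long, Long) = {
    val driverTime = System.currentTimeMillis()       // read now, driver side
    val runOnExecutor = () => System.currentTimeMillis() // read later, task side
    Thread.sleep(delayMs)                              // stand-in for scheduling delay
    (driverTime, runOnExecutor())
  }

  def main(args: Array[String]): Unit = {
    val (d, e) = measure(50)
    println(s"gap between driver-side and task-side time: ${e - d} ms")
  }
}
```

If the driver-side time printed in foreachRDD is already wrong, the problem is upstream of the executors; if only the task-side time drifts, it points at scheduling delay.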
On Fri, Jun 26, 2015 at 3:57 PM, Emrehan Tüzün emrehan.tu...@gmail.com
wrote:
On Fri, Jun 26, 2015 at 12:30 PM, Sea 261810
Hi, all
I found a problem in Spark Streaming: when I use the time inside the function
foreachRDD... I find the time behaves very strangely.
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder,
StringDecoder](ssc, kafkaParams, topicsSet)
dataStream.map(x => createGroup(x._2,
Hi, all:
I want to run Spark SQL on YARN (yarn-client), but ... I have already set
spark.yarn.jar and spark.jars in conf/spark-defaults.conf.
./bin/spark-sql -f game.sql --executor-memory 2g --num-executors 100 game.txt
Exception in thread "main" java.lang.NoClassDefFoundError:
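A NoClassDefFoundError for org/apache/spark/deploy/yarn/ExecutorLauncher usually means the jar on the classpath was built without YARN support, or spark.yarn.jar points at a jar that lacks the YARN classes. A sketch of the usual checks, with an illustrative path (not Sea's actual setup):

```
# conf/spark-defaults.conf -- example path, adjust to your cluster
spark.yarn.jar  hdfs:///spark/share/spark-assembly-1.3.0-hadoop2.2.0.jar

# and the assembly itself must be built with the YARN profile, e.g.:
#   mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
```

If spark.yarn.jar is unset, Spark uploads its own assembly instead, which is a quick way to rule out a stale jar on HDFS.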
Date: Thu, 11:19
To: Sea 261810...@qq.com
Cc: dev@spark.apache.org
Subject: Re: Spark-sql(yarn-client) java.lang.NoClassDefFoundError:
org/apache/spark/deploy/yarn/ExecutorLauncher
Is it the full stack trace?
On Thu, Jun 18, 2015 at 6:39 AM, Sea 261810...@qq.com wrote:
Hi, all:
I want to run
? or the hostnames?
Thanks, Best Regards
On Sat, Jun 13, 2015 at 9:51 PM, Sea 261810...@qq.com wrote:
In Spark 1.4.0, I find that the Address is an IP (it was a hostname in v1.3.0). Why?
Who did it?
Hi, all:
I use the function updateStateByKey in Spark Streaming. I need to store the states
for one minute, so I set spark.cleaner.ttl to 120; the duration is 2 seconds,
but it throws an exception:
Caused by:
org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does
not exist:
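Setting spark.cleaner.ttl to 120 tells Spark to throw away metadata and persisted data older than 120 seconds across the board, which can delete files the state DStream still needs and would explain the FileNotFoundException. The usual pattern is to leave spark.cleaner.ttl unset and age the state out in the update function itself. A sketch, assuming a Spark 1.x StreamingContext `ssc` and a pair DStream `events` (both hypothetical names; this needs a Spark runtime and is not standalone-runnable):

```
// Sketch only -- requires a Spark 1.x streaming job to run.
// State is trimmed by returning None once a key is a minute old,
// instead of relying on spark.cleaner.ttl.
val updateFunc = (values: Seq[Long], state: Option[(Long, Long)]) => {
  val now = System.currentTimeMillis()
  val (count, firstSeen) = state.getOrElse((0L, now))
  if (now - firstSeen > 60 * 1000) None              // drop minute-old state
  else Some((count + values.sum, firstSeen))
}
ssc.checkpoint("hdfs:///tmp/streaming-ckpt")         // required by updateStateByKey
val counts = events.updateStateByKey[(Long, Long)](updateFunc)
```

Returning None from the update function is how updateStateByKey removes a key, so no separate cleaner setting is needed for the state itself.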
Sea,
I've raised a JIRA issue on this:
https://issues.apache.org/jira/browse/SPARK-6445. Making a PR now.
On Sat, Mar 21, 2015 at 11:06 AM, Sea [via Apache Spark Developers List]
ml-node+s1001551n11145...@n3.nabble.com wrote:
Hi, all:
When I exit the console of spark-sql, the following
Hi, all:
When I exit the console of spark-sql, the following exception is thrown:
Exception in thread "Thread-3" java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
at
Hi, all:
Spark1.3.0 hadoop2.2.0
I put the following params in the spark-defaults.conf
spark.dynamicAllocation.enabled true
spark.dynamicAllocation.minExecutors 20
spark.dynamicAllocation.maxExecutors 300
spark.dynamicAllocation.executorIdleTimeout 300
spark.shuffle.service.enabled true
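One thing to verify with this setup: spark.shuffle.service.enabled true requires the external shuffle service to be running inside every YARN NodeManager, otherwise executors cannot register with it and dynamic allocation stalls. A sketch of the usual NodeManager configuration (check it against your Spark version's YARN docs):

```xml
<!-- yarn-site.xml on every NodeManager; illustrative, not Sea's actual config -->
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
```

The spark-&lt;version&gt;-yarn-shuffle.jar also has to be on the NodeManager classpath, and the NodeManagers restarted afterwards.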