Re: java.io.FileNotFoundException

2016-07-05 Thread Jacek Laskowski
On Tue, Jul 5, 2016 at 2:16 AM, kishore kumar wrote: > 2016-07-04 05:11:53,972 [dispatcher-event-loop-0] ERROR > org.apache.spark.scheduler.LiveListenerBus - Dropping SparkListenerEvent > because no remaining room in event queue. This likely means one of the SparkListeners is too slow and canno

Re: java.io.FileNotFoundException

2016-07-04 Thread kishore kumar
loy mode? Have >>>> you seen any other exceptions before? How long did the application run >>>> before the exception? >>>> >>>> Pozdrawiam, >>>> Jacek Laskowski >>>> >>>> https://medium.com/@jaceklaskowski/

Re: java.io.FileNotFoundException

2016-07-04 Thread kishore kumar
t; > We've upgraded spark version from 1.2 to 1.6 still the same problem, >>> > >>> > Exception in thread "main" org.apache.spark.SparkException: Job >>> aborted due >>> > to stage failure: Task 286 in stage >>> > 2397.0 f

Re: java.io.FileNotFoundException

2016-07-04 Thread Jacek Laskowski
twitter.com/jaceklaskowski >> >> >> On Mon, Jul 4, 2016 at 10:57 AM, kishore kumar >> wrote: >> > We've upgraded spark version from 1.2 to 1.6 still the same problem, >> > >> > Exception in thread "main" org.apache.spark.SparkExcep

Re: java.io.FileNotFoundException

2016-07-04 Thread kishore kumar
e failure: Task 286 in stage > > 2397.0 failed 4 times, most recent failure: Lost task 286.3 in stage > 2397.0 > > (TID 314416, salve-06.domain.com): java.io.FileNotFoundException: > > /opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/user1/appcache/app

Re: java.io.FileNotFoundException

2016-07-04 Thread Jacek Laskowski
failure: Task 286 in stage > 2397.0 failed 4 times, most recent failure: Lost task 286.3 in stage 2397.0 > (TID 314416, salve-06.domain.com): java.io.FileNotFoundException: > /opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/user1/appcache/application_1467474162580_29353/blockm

Re: java.io.FileNotFoundException

2016-07-04 Thread karthi keyan
che.spark.SparkException: Job aborted > due to stage failure: Task 286 in stage > 2397.0 failed 4 times, most recent failure: Lost task 286.3 in stage > 2397.0 (TID 314416, salve-06.domain.com): java.io.FileNotFoundException: > /opt/mapr/tmp/hadoop-tmp/hadoop-

Re: java.io.FileNotFoundException

2016-07-04 Thread kishore kumar
e-06.domain.com): java.io.FileNotFoundException: /opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/user1/appcache/application_1467474162580_29353/blockmgr-bd075392-19c2-4cb8-8033-0fe54d683c8f/12/shuffle_530_286_0.index.c374502a-4cf2-4052-abcf-42977f1623d0 (No such file or directory) Kindly hel

Re: java.io.FileNotFoundException

2016-06-04 Thread kishore kumar
; Exception in thread "main" org.apache.spark.SparkException: Job aborted >>>> due to stage failure: Task 38 in stage 26800.0 failed 4 times, most recent >>>> failure: Lost task 38.3 in stage 26800.0 (TID 4990082, hdprd-c01-r04-03): >>>> java.io.FileNotF

Re: java.io.FileNotFoundException

2016-06-03 Thread kishore kumar
rch, >>> >>> the error which we are encountering is >>> Exception in thread "main" org.apache.spark.SparkException: Job aborted >>> due to stage failure: Task 38 in stage 26800.0 failed 4 times, most recent >>> failure: Lost task 38.3 in stage 26800.0 (TID 4990082,

Re: java.io.FileNotFoundException

2016-06-03 Thread Jeff Zhang
led 4 times, most recent >> failure: Lost task 38.3 in stage 26800.0 (TID 4990082, hdprd-c01-r04-03): >> java.io.FileNotFoundException: >> /opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/sparkuser/appcache/application_1463194314221_211370/spark-3cc37dc7-fa3c-4b98-aa60-0acdf

Re: java.io.FileNotFoundException

2016-06-03 Thread kishore kumar
is > Exception in thread "main" org.apache.spark.SparkException: Job aborted > due to stage failure: Task 38 in stage 26800.0 failed 4 times, most recent > failure: Lost task 38.3 in stage 26800.0 (TID 4990082, hdprd-c01-r04-03): > java.io.FileNotFoundException: > /opt/m

java.io.FileNotFoundException

2016-05-31 Thread kishore kumar
ge 26800.0 failed 4 times, most recent failure: Lost task 38.3 in stage 26800.0 (TID 4990082, hdprd-c01-r04-03): java.io.FileNotFoundException: /opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/sparkuser/appcache/application_1463194314221_211370/spark-3cc37dc7-fa3c-4b98-aa60-0acdf

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2016-01-06 Thread Priya Ch
The line of code which I highlighted in the screenshot is within the spark source code. Spark implements sort-based shuffle implementation and the spilled files are merged using the merge sort. Here is the link https://issues.apache.org/jira/secure/attachment/12655884/Sort-basedshuffledesign.pdf w

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2016-01-06 Thread Priya Ch
Running 'lsof' will list the open files, but how do we find the root cause behind opening so many files? Thanks, Padma CH On Wed, Jan 6, 2016 at 8:39 AM, Hamel Kothari wrote: > The "Too Many Files" part of the exception is just indicative of the fact > that when that call was mad
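To go from the "too many open files" symptom to a culprit, one approach consistent with this thread is to count descriptors per process, then group one process's open files by path. A minimal sketch, assuming a Linux host (the `/proc` inspection below is self-contained; the commented `lsof` variants are the cluster-wide version and may need root):

```shell
# Count the file descriptors this shell currently holds open, using /proc
# (the same data 'lsof -p <pid>' reports for a single process).
pid=$$
fd_count=$(ls /proc/$pid/fd | wc -l)
echo "process $pid has $fd_count open file descriptors"

# Cluster-wide, the usual triage is to rank processes by descriptor count,
# then group one executor's open files by path to spot leaking
# shuffle-spill directories (run as root to see other users' processes):
#   lsof -n | awk '{print $2}' | sort | uniq -c | sort -rn | head
#   lsof -n -p <executor-pid> | awk '{print $NF}' | sort | uniq -c | sort -rn | head
```

Sampling this repeatedly while the streaming job runs shows whether the count grows without bound (a leak) or spikes only during shuffles.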

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2016-01-05 Thread Priya Ch
Yes, the FileInputStream is closed. Maybe I didn't show it in the screenshot. As Spark implements sort-based shuffle, there is a parameter called maximum merge factor which decides the number of files that can be merged at once, and this avoids too many open files. I am suspecting that it is somet

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2016-01-05 Thread Annabel Melongo
Vijay, Are you closing the FileInputStream at the end of each loop (in.close())? My guess is those streams aren't closed, hence the "too many open files" exception. On Tuesday, January 5, 2016 8:03 AM, Priya Ch wrote: Can some one throw light on this ? Regards, Padma Ch On Mon, Dec 2

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2016-01-05 Thread Priya Ch
Can someone throw light on this ? Regards, Padma Ch On Mon, Dec 28, 2015 at 3:59 PM, Priya Ch wrote: > Chris, we are using spark 1.3.0 version. we have not set > spark.streaming.concurrentJobs > this parameter. It takes the default value. > > Vijay, > > From the stack trace it is evident th

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-28 Thread Priya Ch
Chris, we are using Spark version 1.3.0. We have not set the spark.streaming.concurrentJobs parameter; it takes the default value. Vijay, From the stack trace it is evident that org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$1.apply$mcVI$sp(ExternalSorter.scala:73

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-25 Thread Chris Fregly
and which version of Spark/Spark Streaming are you using? are you explicitly setting the spark.streaming.concurrentJobs to something larger than the default of 1? if so, please try setting that back to 1 and see if the problem still exists. this is a dangerous parameter to modify from the defaul
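Chris's check can be made explicit at submit time. A sketch, assuming a YARN deployment (spark.streaming.concurrentJobs is the real property, defaulting to 1; the class and jar names are placeholders invented for illustration):

```shell
# Pin spark.streaming.concurrentJobs back to its default of 1.
# Values > 1 let batches execute concurrently, multiplying the number of
# simultaneously open shuffle files. Class and jar names are hypothetical.
spark-submit \
  --master yarn \
  --conf spark.streaming.concurrentJobs=1 \
  --class com.example.StreamingApp \
  streaming-app.jar
```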

java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-23 Thread Vijay Gharge
Few indicators - 1) during execution time - check total number of open files using lsof command. Need root permissions. If it is cluster not sure much ! 2) which exact line in the code is triggering this error ? Can you paste that snippet ? On Wednesday 23 December 2015, Priya Ch > wrote: > ulim

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-23 Thread Priya Ch
ulimit -n 65000 fs.file-max = 65000 (in /etc/sysctl.conf) Thanks, Padma Ch On Tue, Dec 22, 2015 at 6:47 PM, Yash Sharma wrote: > Could you share the ulimit for your setup please ? > > - Thanks, via mobile, excuse brevity. > On Dec 22, 2015 6:39 PM, "Priya Ch" wrote: > >> Jakob, >> >>
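Raising these limits only helps if they reach the user that actually runs the executors (on YARN, the NodeManager user) and if the daemons are restarted afterwards, which may be why the error persisted in this thread. A sketch of checking and raising them, using the values quoted above (they come from the thread, not from any sizing recommendation; `sparkuser` is illustrative):

```shell
# Inspect current limits
ulimit -n                    # per-process soft limit for this session
cat /proc/sys/fs/file-max    # system-wide ceiling

# Raise them persistently (values and username are from/for this thread):
# /etc/security/limits.conf
#   sparkuser  soft  nofile  65000
#   sparkuser  hard  nofile  65000
# /etc/sysctl.conf
#   fs.file-max = 65000
sudo sysctl -p               # reload sysctl settings without a reboot

# Verify as the executor user in a fresh login shell, then restart the
# NodeManager/worker daemons so executors inherit the new limit:
#   su - sparkuser -c 'ulimit -n'
```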

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-22 Thread Yash Sharma
Could you share the ulimit for your setup please ? - Thanks, via mobile, excuse brevity. On Dec 22, 2015 6:39 PM, "Priya Ch" wrote: > Jakob, > >Increased the settings like fs.file-max in /etc/sysctl.conf and also > increased user limit in /etc/security/limits.conf. But still see the same >

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-22 Thread Priya Ch
Jakob, Increased the settings like fs.file-max in /etc/sysctl.conf and also increased user limit in /etc/security/limits.conf. But still see the same issue. On Fri, Dec 18, 2015 at 12:54 AM, Jakob Odersky wrote: > It might be a good idea to see how many files are open and try increasing > th

Re: java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-17 Thread Jakob Odersky
It might be a good idea to see how many files are open and try increasing the open file limit (this is done on an os level). In some application use-cases it is actually a legitimate need. If that doesn't help, make sure you close any unused files and streams in your code. It will also be easier t

java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-17 Thread Priya Ch
Hi All, When running streaming application, I am seeing the below error: java.io.FileNotFoundException: /data1/yarn/nm/usercache/root/appcache/application_1450172646510_0004/blockmgr-a81f42cd-6b52-4704-83f3-2cfc12a11b86/02/temp_shuffle_589ddccf-d436-4d2c-9935-e5f8c137b54b (Too many open

Re: java.io.FileNotFoundException: Job aborted due to stage failure

2015-11-26 Thread Ted Yu
re: > Task 4 in stage 5710.0 failed 4 times, most recent failure: Lost task > 4.3 in stage 5710.0 (TID 341269, > ip-10-0-1-80.us-west-2.compute.internal): > java.io.FileNotFoundException: > /mnt/md0/var/lib/spark/spark-549f7d96-82da-4b8d-b9fe-7f6fe8238478/blockmgr-f44be41a-90

java.io.FileNotFoundException: Job aborted due to stage failure

2015-11-26 Thread Sahil Sareen
c on s.property = c.property from X YZ org.apache.spark.SparkException: Job aborted due to stage failure: Task 4 in stage 5710.0 failed 4 times, most recent failure: Lost task 4.3 in stage 5710.0 (TID 341269, ip-10-0-1-80.us-west-2.compute.internal): java.io.FileNotFoundException: /mnt/md0/var/lib/

Spark 1.4.2- java.io.FileNotFoundException: Job aborted due to stage failure

2015-11-24 Thread Sahil Sareen
: Job aborted due to stage failure: Task 4 in stage 5710.0 failed 4 times, most recent failure: Lost task 4.3 in stage 5710.0 (TID 341269, ip-10-0-1-80.us-west-2.compute.internal): java.io.FileNotFoundException: /mnt/md0/var/lib/spark/spark-549f7d96-82da-4b8d-b9fe-7f6fe8238478/blockmgr-f44be41a-9036-4b93

get java.io.FileNotFoundException when use addFile Function

2015-07-15 Thread prateek arora
Hi I am trying to write a simple program using the addFile function but getting an error in my worker node that the file does not exist: stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, slave2.novalocal): java.io.FileNotFoundException: File file:/tmp

Re: spark java.io.FileNotFoundException: /user/spark/applicationHistory/application

2015-05-29 Thread igor.berman
In yarn mode your executors might run on any node in your cluster, so you need to configure the Spark history location to be on HDFS (so it will be accessible to every executor). Probably you've switched from local to yarn mode when submitting.
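Concretely, the fix described here is to point both event logging and the history server at the same HDFS directory. A sketch, assuming the standard Spark property names (the path mirrors the one in the error and is otherwise arbitrary):

```shell
# conf/spark-defaults.conf -- log events to HDFS so every node and the
# history server can read them (local paths break under YARN):
#   spark.eventLog.enabled         true
#   spark.eventLog.dir             hdfs:///user/spark/applicationHistory
#   spark.history.fs.logDirectory  hdfs:///user/spark/applicationHistory

# Create the directory once, writable by all submitting users:
hdfs dfs -mkdir -p /user/spark/applicationHistory
hdfs dfs -chmod 1777 /user/spark/applicationHistory
```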

spark java.io.FileNotFoundException: /user/spark/applicationHistory/application

2015-05-28 Thread roy
hi, Suddenly spark jobs started failing with following error Exception in thread "main" java.io.FileNotFoundException: /user/spark/applicationHistory/application_1432824195832_1275.inprogress (No such file or directory) full trace here [21:50:04 x...@hadoop-client01.dev:~]$ sp

Re: java.io.FileNotFoundException when using HDFS in cluster mode

2015-03-30 Thread nsalian
Try running it like this: sudo -u hdfs spark-submit --class org.apache.spark.examples.SparkPi --deploy-mode cluster --master yarn hdfs:///user/spark/spark-examples-1.2.0-cdh5.3.2-hadoop2.5.0-cdh5.3.2.jar 10 Caveats: 1) Make sure the permissions of /user/nick is 775 or 777. 2) No need for hostnam

RE: java.io.FileNotFoundException when using HDFS in cluster mode

2015-03-30 Thread java8964
pache.org > Subject: java.io.FileNotFoundException when using HDFS in cluster mode > > Hi List, > > I'm following this example here > <https://github.com/databricks/learning-spark/tree/master/mini-complete-example> > > with the following: > > $SPARK

Re: java.io.FileNotFoundException when using HDFS in cluster mode

2015-03-29 Thread Akhil Das
5K Mar 29 22:05 > learning-spark-mini-example_2.10-0.0.1.jar > -rw-r--r-- 1 nickt nickt 9.2K Mar 29 22:05 stderr > -rw-r--r-- 1 nickt nickt 0 Mar 29 22:05 stdout > > But it's failing due to a java.io.FileNotFoundException saying my input > file > is missing: > > Cau

java.io.FileNotFoundException when using HDFS in cluster mode

2015-03-29 Thread Nick Travers
rom HDFS ok): -rw-r--r-- 1 nickt nickt 15K Mar 29 22:05 learning-spark-mini-example_2.10-0.0.1.jar -rw-r--r-- 1 nickt nickt 9.2K Mar 29 22:05 stderr -rw-r--r-- 1 nickt nickt 0 Mar 29 22:05 stdout But it's failing due to a java.io.FileNotFoundException saying my input file is mis

When uses SparkFiles.get("GeoIP.dat"), got exception in thread "main" java.io.FileNotFoundException

2015-02-07 Thread Gmail
2015-02-08 06:51:17,064 INFO [main] handler.ContextHandler (ContextHandler.java:startContext(737)) - started o.e.j.s.ServletContextHandler{/streaming,null} 2015-02-08 06:51:17,065 INFO [main] handler.ContextHandler (ContextHandler.java:startContext(737)) - started o.e.j.s.ServletContex

Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-09 Thread Rahul Kumar Singh
n Owen > > To: > > Date: 08.10.2014 18:05 > > Subject: Re: SparkContext.wholeTextFiles() > java.io.FileNotFoundException: File does not exist: > > > > > CC: "user@spark.apache.org" > > Take this as a bit of a guess, since I don't use S3 much an

Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-09 Thread jan.zikes
m the standard EC2 installation? __ From: Sean Owen To: Date: 08.10.2014 18:05 Subject: Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist: CC: "user@spark.apache.org" Take this as a bit of a

Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-08 Thread Sean Owen
ions().size() > > File "/root/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", > line 538, in __call__ > > File "/root/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line > 300, in get_return_value > > py4j.protocol.Py4JJavaEr

Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-08 Thread jan.zikes
__call__   File "/root/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value py4j.protocol.Py4JJavaError: An error occurred while calling o30.partitions. : java.io.FileNotFoundException: File does not exist: /wikiinput/wiki.xml.gz at org.apache.hadoop.

Re: SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-08 Thread jan.zikes
ile "/root/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value py4j.protocol.Py4JJavaError: An error occurred while calling o30.partitions. : java.io.FileNotFoundException: File does not exist: /wikiinput/wiki.xml.gz at org.apache.hadoop.hdfs.DistributedF

SparkContext.wholeTextFiles() java.io.FileNotFoundException: File does not exist:

2014-10-08 Thread jan.zikes
ark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__   File "/root/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value py4j.protocol.Py4JJavaError: An error occurred while calling o30.partitions. : java.io.FileNotFoundExce

java.io.FileNotFoundException in usercache

2014-09-25 Thread Egor Pahomov
-20140925151931-a4c3/3a/shuffle_4_30_174 java.io.FileNotFoundException: /local/hd2/yarn/local/usercache/epahomov/appcache/application_1411219858924_15501/spark-local-20140925151931-a4c3/3a/shuffle_4_30_174 (No such file or directory) couple days ago. After this error the spark context shut down. I'm are

Re: org.apache.spark.SparkException: java.io.FileNotFoundException: does not exist)

2014-09-16 Thread Aris
er: Serialized task 12.0:1 as 1733 > bytes in 0 ms > 14/09/16 10:55:21 WARN TaskSetManager: Lost TID 24 (task 12.0:0) > 14/09/16 10:55:21 WARN TaskSetManager: Loss was due to > java.io.FileNotFoundException > java.io.FileNotFoundException: File file:/root/test/sample_svm_data.txt > does not ex

org.apache.spark.SparkException: java.io.FileNotFoundException: does not exist)

2014-09-16 Thread Hui Li
Loss was due to java.io.FileNotFoundException java.io.FileNotFoundException: File file:/root/test/sample_svm_data.txt does not exist at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:511)

java.io.FileNotFoundException: shuffle

2014-07-02 Thread nit
pressure..but I could never figure out the root cause. -- 14/07/02 07:34:45 WARN TaskSetManager: Loss was due to java.io.FileNotFoundException java.io.FileNotFoundException: /var/storage/sda3/nm-local/usercache/nit/appcache/application_1403208801430_0183/spark-local-20140702065054-388d/0e

java.io.FileNotFoundException: http:///broadcast_1

2014-07-01 Thread Honey Joshi
Hi All, We are using shark table to dump the data, we are getting the following error : Exception in thread "main" org.apache.spark.SparkException: Job aborted: Task 1.0:0 failed 1 times (most recent failure: Exception failure: java.io.FileNotFoundException: http:///broadcast_1) We

Re: Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-11 Thread Francis . Hu
Subject: Re: Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or directory) i looked into the log again, all exceptions are about FileNotFoundException. In the WebUI, no more info I can check except for the basic description of job

Re: Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Tathagata Das
ata Das [mailto:tathagata.das1...@gmail.com] > *Sent:* Tuesday, May 06, 2014 3:45 > *To:* user@spark.apache.org > *Subject:* Re: java.io.FileNotFoundException: > /test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or > directory) > > > > Do those file actually

Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or directory) Do those files actually exist? Those stdout/stderr should have the output of the Spark executors running in the workers, and it's weird that they don't exist. Could be permi

Re: java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Tathagata Das
in workers' > logs/spark-francis-org.apache.spark.deploy.worker.Worker-1-ubuntu-4.out > > > > 14/05/05 02:39:39 WARN AbstractHttpConnection: > /logPage/?appId=app-20140505053550-&executorId=2&logType=stdout > > java.io.FileNotFoundExceptio

java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-0000/2/stdout (No such file or directory)

2014-05-05 Thread Francis . Hu
02:39:39 WARN AbstractHttpConnection: /logPage/?appId=app-20140505053550-&executorId=2&logType=stdout java.io.FileNotFoundException: /test/spark-0.9.1/work/app-20140505053550-/2/stdout (No such file or directory) at java.io.FileInputStream.open(Native Method) at ja