[jira] [Updated] (BEAM-6307) Nexmark Spark failed due to FileNotFoundException

2019-02-14 Thread Ahmet Altay (JIRA)


 [ https://issues.apache.org/jira/browse/BEAM-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ahmet Altay updated BEAM-6307:
--
Fix Version/s: (was: 2.11.0)

> Nexmark Spark failed due to FileNotFoundException
> -
>
> Key: BEAM-6307
> URL: https://issues.apache.org/jira/browse/BEAM-6307
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark
>Reporter: Andrew Pilloud
>Priority: Major
>  Labels: flake
>
> Nexmark Spark failed due to FileNotFoundException.
> https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/1417/consoleFull
> {code:java}
> 09:28:16 18/12/22 17:27:58 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-d4a681cf-2c36-4c8c-a238-cf16e4a89e73/12/temp_shuffle_c59ca414-109d-4c7b-b1d1-3514aa936662
> 09:28:16 java.io.FileNotFoundException: /tmp/blockmgr-d4a681cf-2c36-4c8c-a238-cf16e4a89e73/12/temp_shuffle_c59ca414-109d-4c7b-b1d1-3514aa936662 (No such file or directory)
> 09:28:16  at java.io.FileOutputStream.open0(Native Method)
> 09:28:16  at java.io.FileOutputStream.open(FileOutputStream.java:270)
> 09:28:16  at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
> 09:28:16  at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
> 09:28:16  at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
> 09:28:16  at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
> 09:28:16  at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
> 09:28:16  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
> 09:28:16  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
> 09:28:16  at org.apache.spark.scheduler.Task.run(Task.scala:109)
> 09:28:16  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> 09:28:16  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 09:28:16  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 09:28:16  at java.lang.Thread.run(Thread.java:748)
> 09:28:16 18/12/22 17:27:58 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-d4a681cf-2c36-4c8c-a238-cf16e4a89e73/12/temp_shuffle_c59ca414-109d-4c7b-b1d1-3514aa936662
> 09:28:16 18/12/22 17:27:58 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-d4a681cf-2c36-4c8c-a238-cf16e4a89e73/12/temp_shuffle_0f0fa603-712e-4e36-8ce8-e9770dbab93e
> 09:28:16 java.io.FileNotFoundException: /tmp/blockmgr-d4a681cf-2c36-4c8c-a238-cf16e4a89e73/12/temp_shuffle_0f0fa603-712e-4e36-8ce8-e9770dbab93e (No such file or directory)
> 09:28:16  at java.io.FileOutputStream.open0(Native Method)
> 09:28:16  at java.io.FileOutputStream.open(FileOutputStream.java:270)
> 09:28:16  at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
> 09:28:16  at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
> 09:28:16  at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
> 09:28:16  at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
> 09:28:16  at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
> 09:28:16  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
> 09:28:16  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
> 09:28:16  at org.apache.spark.scheduler.Task.run(Task.scala:109)
> 09:28:16  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> 09:28:16  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 09:28:16  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 09:28:16  at java.lang.Thread.run(Thread.java:748){code}
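For context on why this reads as a flake: the frames above show revertPartialWritesAndClose reopening the temp shuffle file so the partial write can be truncated, and that reopen is what throws once the block manager's temp directory under /tmp has already been removed (for example by executor teardown or an external /tmp cleaner). Below is a minimal, self-contained sketch of that JDK-level behavior using only standard java.io/java.nio APIs; the class name and paths are hypothetical and this is not Spark code.

{code:java}
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReopenAfterCleanup {
  public static void main(String[] args) throws Exception {
    // A stand-in for a block-manager temp dir holding a partially written shuffle file.
    Path dir = Files.createTempDirectory("blockmgr-demo");
    File shuffleFile = new File(dir.toFile(), "temp_shuffle_demo");
    try (FileOutputStream out = new FileOutputStream(shuffleFile)) {
      out.write(new byte[] {1, 2, 3}); // partial write
    }

    // Simulate an external cleanup (executor teardown, /tmp reaper) removing the directory.
    Files.delete(shuffleFile.toPath());
    Files.delete(dir);

    // A revert step that reopens the file in append mode to truncate the partial write,
    // as the DiskBlockObjectWriter frames above suggest. With the directory gone, the
    // FileOutputStream constructor throws FileNotFoundException (No such file or directory).
    try (FileOutputStream revert = new FileOutputStream(shuffleFile, true)) {
      revert.getChannel().truncate(0);
    } catch (FileNotFoundException expected) {
      System.out.println("Reproduced: " + expected.getMessage());
    }
  }
}
{code}

Running this prints the same "No such file or directory" message, which is consistent with the temp directory disappearing between the write and the revert rather than with a Nexmark-specific bug.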





[jira] [Updated] (BEAM-6307) Nexmark Spark failed due to FileNotFoundException

2019-01-15 Thread Kenneth Knowles (JIRA)


 [ https://issues.apache.org/jira/browse/BEAM-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-6307:
--
Fix Version/s: 2.11.0

> Nexmark Spark failed due to FileNotFoundException
> -
>
> Key: BEAM-6307
> URL: https://issues.apache.org/jira/browse/BEAM-6307
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark
>Reporter: Andrew Pilloud
>Priority: Major
>  Labels: flake
> Fix For: 2.11.0
>





[jira] [Updated] (BEAM-6307) Nexmark Spark failed due to FileNotFoundException

2019-01-08 Thread Kenneth Knowles (JIRA)


 [ https://issues.apache.org/jira/browse/BEAM-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-6307:
--
Component/s: (was: test-failures)

> Nexmark Spark failed due to FileNotFoundException
> -
>
> Key: BEAM-6307
> URL: https://issues.apache.org/jira/browse/BEAM-6307
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark
>Reporter: Andrew Pilloud
>Assignee: Kenneth Knowles
>Priority: Major
>





[jira] [Updated] (BEAM-6307) Nexmark Spark failed due to FileNotFoundException

2019-01-08 Thread Kenneth Knowles (JIRA)


 [ https://issues.apache.org/jira/browse/BEAM-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-6307:
--
Labels: flake  (was: )

> Nexmark Spark failed due to FileNotFoundException
> -
>
> Key: BEAM-6307
> URL: https://issues.apache.org/jira/browse/BEAM-6307
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark
>Reporter: Andrew Pilloud
>Assignee: Kenneth Knowles
>Priority: Major
>  Labels: flake
>





[jira] [Updated] (BEAM-6307) Nexmark Spark failed due to FileNotFoundException

2019-01-08 Thread Kenneth Knowles (JIRA)


 [ https://issues.apache.org/jira/browse/BEAM-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-6307:
--
Component/s: test-failures

> Nexmark Spark failed due to FileNotFoundException
> -
>
> Key: BEAM-6307
> URL: https://issues.apache.org/jira/browse/BEAM-6307
> Project: Beam
>  Issue Type: Bug
>  Components: examples-nexmark, test-failures
>Reporter: Andrew Pilloud
>Assignee: Kenneth Knowles
>Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)