[ 
https://issues.apache.org/jira/browse/SPARK-37141?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun resolved SPARK-37141.
-----------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 34420
[https://github.com/apache/spark/pull/34420]

> WorkerSuite cannot run on Mac OS
> --------------------------------
>
>                 Key: SPARK-37141
>                 URL: https://issues.apache.org/jira/browse/SPARK-37141
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 3.3.0
>            Reporter: Yang Jie
>            Assignee: Yazhi Wang
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> After SPARK-35907, running `org.apache.spark.deploy.worker.WorkerSuite` on macOS 
> (both M1 and Intel) fails:
> {code:bash}
> mvn clean install -DskipTests -pl core -am
> mvn test -pl core -Dtest=none -DwildcardSuites=org.apache.spark.deploy.worker.WorkerSuite
> {code}
> {code:java}
> WorkerSuite:
> - test isUseLocalNodeSSLConfig
> - test maybeUpdateSSLSettings
> - test clearing of finishedExecutors (small number of executors)
> - test clearing of finishedExecutors (more executors)
> - test clearing of finishedDrivers (small number of drivers)
> - test clearing of finishedDrivers (more drivers)
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time:  47.973 s
> [INFO] Finished at: 2021-10-28T13:46:56+08:00
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:2.0.2:test (test) on project spark-core_2.12: There are test failures -> [Help 1]
> [ERROR] 
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> {code}
> {code:java}
> 21/10/28 13:46:56.133 dispatcher-event-loop-1 ERROR Utils: Failed to create directory /tmp
> java.nio.file.FileAlreadyExistsException: /tmp
>         at sun.nio.fs.UnixException.translateToIOException(UnixException.java:88)
>         at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
>         at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
>         at sun.nio.fs.UnixFileSystemProvider.createDirectory(UnixFileSystemProvider.java:384)
>         at java.nio.file.Files.createDirectory(Files.java:674)
>         at java.nio.file.Files.createAndCheckIsDirectory(Files.java:781)
>         at java.nio.file.Files.createDirectories(Files.java:727)
>         at org.apache.spark.util.Utils$.createDirectory(Utils.scala:292)
>         at org.apache.spark.deploy.worker.Worker.createWorkDir(Worker.scala:221)
>         at org.apache.spark.deploy.worker.Worker.onStart(Worker.scala:232)
>         at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:120)
>         at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213)
>         at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
>         at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75)
>         at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> {code}
>  
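For context (not stated in the original report): on macOS, `/tmp` is a symbolic link to `/private/tmp`, and `Files.createDirectories` rethrows `FileAlreadyExistsException` when the existing path element is a symlink rather than a real directory, because its existence check uses `isDirectory(dir, NOFOLLOW_LINKS)`. This matches the `createAndCheckIsDirectory` frame in the stack trace above. A minimal sketch reproducing that JDK behaviour in a scratch directory (the class name and the temp-directory layout are illustrative, not from the Spark code):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.FileAlreadyExistsException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SymlinkCreateDirs {
    /** Returns true if Files.createDirectories throws FileAlreadyExistsException
     *  for a symlink that points at an existing directory. */
    static boolean reproduces() {
        try {
            // Illustrative layout mimicking macOS, where /tmp -> /private/tmp.
            Path base = Files.createTempDirectory("spark-37141-");
            Path real = Files.createDirectory(base.resolve("private-tmp"));
            Path link = Files.createSymbolicLink(base.resolve("tmp"), real);
            // The underlying mkdir fails with EEXIST; createDirectories then
            // checks isDirectory(link, NOFOLLOW_LINKS), which is false for a
            // symlink, so the FileAlreadyExistsException escapes to the caller.
            Files.createDirectories(link);
            return false;
        } catch (FileAlreadyExistsException expected) {
            return true;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("reproduced: " + reproduces());
    }
}
```

Resolving the work directory to its real path (or avoiding `/tmp` as a test work dir) sidesteps the symlink check, which is why the failure only shows up on macOS and not on typical Linux setups where `/tmp` is a real directory.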



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
