+ dev mailing list

If this is supposed to work, is there a regression then?

The Spark core code shows that the permission for a file copied to \work is set to a+x at line 442 of Utils.scala <https://github.com/apache/spark/blob/b271c265b742fa6947522eda4592e9e6a7fd1f3a/core/src/main/scala/org/apache/spark/util/Utils.scala>. The example jar I used had all permissions, including Read & Execute, prior to spark-submit:

[screenshot: jar permissions before spark-submit]

However, after the jar was copied to the worker node's \work folder, only limited permissions remained on it, with no execute right:

[screenshot: jar permissions under the worker's \work folder]
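For reference, below is a minimal sketch (not the actual Spark code; the object name and the work-directory path are placeholders) that walks a worker's \work folder and re-applies the execute bit to any jar that lost it during the copy, mirroring the a+x intent of the Utils.scala line referenced above. Running it as the non-admin user is a quick way to confirm whether the missing execute permission is the only thing blocking class loading:

import java.io.File

object FixWorkDirJarPermissions {
  def main(args: Array[String]): Unit = {
    // Placeholder path; substitute the actual <spark_home>\work directory.
    val workDir = new File("""C:\spark\work""")

    // Recursively collect all files under the work directory.
    def allFiles(dir: File): Seq[File] = {
      val children = Option(dir.listFiles()).map(_.toSeq).getOrElse(Seq.empty)
      children.flatMap(f => if (f.isDirectory) allFiles(f) else Seq(f))
    }

    // Re-apply the execute bit for all users (the "a+x" intent) to jars that lost it.
    allFiles(workDir).filter(_.getName.endsWith(".jar")).foreach { jar =>
      if (!jar.canExecute) {
        val restored = jar.setExecutable(true, /* ownerOnly = */ false)
        println(s"${jar.getAbsolutePath}: execute bit " +
          (if (restored) "restored" else "could not be set"))
      }
    }
  }
}

If the class still cannot be loaded after the execute bit is restored, the problem is more likely in the Windows ACLs applied during the copy than in Spark's chmod call.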
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, February 18, 2015 10:40 PM
To: Judy Nash
Cc: u...@spark.apache.org
Subject: Re: spark slave cannot execute without admin permission on windows

You do not need admin permission; just make sure all those jars have execute permission (read/write access).

Thanks
Best Regards

On Thu, Feb 19, 2015 at 11:30 AM, Judy Nash <judyn...@exchange.microsoft.com> wrote:

Hi,

Is it possible to configure Spark to run without admin permission on Windows?

My current setup runs master & slave successfully with admin permission. However, if I downgrade the permission level from admin to user, SparkPi fails with the following exception on the slave node:

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 9, workernode0.jnashsparkcurr2.d10.internal.cloudapp.net): java.lang.ClassNotFoundException: org.apache.spark.examples.SparkPi$$anonfun$1
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)

Upon investigation, it appears that the SparkPi jar under spark_home\worker\appname\*.jar does not have execute permission set, causing Spark to be unable to find the class.

Advice would be very much appreciated.

Thanks,
Judy