MonkeyCanCode opened a new pull request, #911:
URL: https://github.com/apache/polaris/pull/911

   The new getting-started Docker Compose setup mounts the local `~/.ivy2` 
directory into `/home/spark/.ivy2`. Users can hit the following permission error:
   ```
   spark-sql-1          | Exception in thread "main" java.io.FileNotFoundException: /home/spark/.ivy2/cache/resolved-org.apache.spark-spark-submit-parent-83e429a6-2332-4f48-9b70-e9f6e356aeeb-1.0.xml (Permission denied)
   spark-sql-1          |  at java.base/java.io.FileOutputStream.open0(Native Method)
   spark-sql-1          |  at java.base/java.io.FileOutputStream.open(FileOutputStream.java:293)
   spark-sql-1          |  at java.base/java.io.FileOutputStream.<init>(FileOutputStream.java:235)
   spark-sql-1          |  at java.base/java.io.FileOutputStream.<init>(FileOutputStream.java:184)
   spark-sql-1          |  at org.apache.ivy.plugins.parser.xml.XmlModuleDescriptorWriter.write(XmlModuleDescriptorWriter.java:71)
   spark-sql-1          |  at org.apache.ivy.plugins.parser.xml.XmlModuleDescriptorWriter.write(XmlModuleDescriptorWriter.java:63)
   spark-sql-1          |  at org.apache.ivy.core.module.descriptor.DefaultModuleDescriptor.toIvyFile(DefaultModuleDescriptor.java:553)
   spark-sql-1          |  at org.apache.ivy.core.cache.DefaultResolutionCacheManager.saveResolvedModuleDescriptor(DefaultResolutionCacheManager.java:184)
   spark-sql-1          |  at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:259)
   spark-sql-1          |  at org.apache.ivy.Ivy.resolve(Ivy.java:522)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1585)
   spark-sql-1          |  at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134)
   spark-sql-1          |  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   spark-sql-1 exited with code 1
   ```
   
   Instead, it is more common to override the Ivy home to a path under `/tmp`, 
which is always writable inside the container. The official Spark documentation 
uses the same approach: 
https://github.com/apache/spark/blob/7243de6fe7162ac491e73d110425b15ef397ec88/docs/running-on-kubernetes.md?plain=1#L245
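
   For illustration, the override can be passed via the real Spark property 
`spark.jars.ivy` in the compose service command. A minimal sketch, assuming a 
hypothetical service layout (the service name, image tag, and package 
coordinates below are placeholders, not taken from this PR):
   ```yaml
   services:
     spark-sql:
       image: apache/spark:3.5.0        # placeholder image tag
       command: >
         /opt/spark/bin/spark-sql
         --conf spark.jars.ivy=/tmp/.ivy2
         --packages org.example:example-package:1.0.0   # placeholder coordinates
   ```
   With `spark.jars.ivy=/tmp/.ivy2`, Ivy writes its resolution cache inside the 
container instead of the read-only or permission-mismatched host mount, so no 
`~/.ivy2` volume is needed at all.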


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
