dongjoon-hyun commented on PR #53351:
URL: https://github.com/apache/spark/pull/53351#issuecomment-3679606370

   No, you don't need to do that for your applications, @rmannibucau .
   > This is exactly the point to have a custom distribution, kind of include the application inside it.
   
   Please submit your application according to the Apache Spark community guideline:
   - https://spark.apache.org/docs/latest/submitting-applications.html
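   
   For reference, the general shape of a submission from that page looks like the following; the angle-bracket values are placeholders you fill in for your own application, not literal flags:
   ```
   # General spark-submit invocation (see submitting-applications.html);
   # <main-class>, <master-url>, <deploy-mode>, and <application-jar>
   # are placeholders for your own application's values.
   $ ./bin/spark-submit \
     --class <main-class> \
     --master <master-url> \
     --deploy-mode <deploy-mode> \
     <application-jar> \
     [application-arguments]
   ```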
   
   Regarding the following question: `Spark Shell` is part of the Apache Spark interactive environment, not an application.
   > Why do you have spark shell in spark distribution? It is an application so must not be there from your statement, this is exactly the same.
   
   Let's talk about the applications (including yours). Since an application should be built and deployed independently, `spark-examples_2.13-4.1.0.jar` is only `1.5MB` and lives in the `spark-4.1.0-bin-hadoop3/examples/jars` directory instead of `spark-4.1.0-bin-hadoop3/jars`, @rmannibucau . You need to build your application independently, like `spark-examples_2.13-4.1.0.jar`, and provide it via the `submitting-applications.html` workflow (see the sketch after the listing below).
   ```
   $ ls -alh spark-4.1.0-bin-hadoop3/examples/jars
   total 3296
   drwxr-xr-x@ 4 dongjoon  staff   128B Dec 11 23:32 .
   drwxr-xr-x@ 4 dongjoon  staff   128B Dec 11 23:32 ..
   -rw-r--r--@ 1 dongjoon  staff    79K Dec 11 23:32 scopt_2.13-3.7.1.jar
   -rw-r--r--@ 1 dongjoon  staff   1.5M Dec 11 23:32 spark-examples_2.13-4.1.0.jar
   ```
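   
   As a concrete minimal sketch, submitting that examples jar locally might look like this; `org.apache.spark.examples.SparkPi` is the standard example class shipped in that jar, and `local[4]` is just an arbitrary local master choice:
   ```
   # Run the bundled SparkPi example from the separately-built examples jar;
   # the jar stays outside spark-4.1.0-bin-hadoop3/jars and is passed to
   # spark-submit explicitly, exactly as a user application would be.
   $ cd spark-4.1.0-bin-hadoop3
   $ ./bin/spark-submit \
     --class org.apache.spark.examples.SparkPi \
     --master "local[4]" \
     examples/jars/spark-examples_2.13-4.1.0.jar \
     100
   ```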

