KKcorps opened a new pull request, #8559:
URL: https://github.com/apache/pinot/pull/8559

   Spark-core classes bundled in Pinot currently create many conflicts with the 
Spark runtime classes present in the user's environment. These dependencies 
should not be part of the jar in the first place.
   
   The [Spark docs](https://spark.apache.org/docs/latest/cluster-overview.html) 
recommend the same:
   
   ```
   Application jar: A jar containing the user's Spark application. In some 
cases users will want to create an "uber jar" containing their application 
along with its dependencies. The user's jar should never include Hadoop or 
Spark libraries, however, these will be added at runtime.
   ```
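
   For illustration, one common way to achieve this in a Maven build is to 
mark the Spark dependency as `provided`, so it stays on the compile classpath 
but is excluded from the packaged jar. This is a sketch only; the actual 
artifact IDs, Scala suffix, and version property in Pinot's pom may differ:

   ```xml
   <!-- Hypothetical pom.xml fragment: Spark classes are supplied by the
        cluster at runtime, so they should not be bundled in the uber jar. -->
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.12</artifactId>
     <version>${spark.version}</version>
     <!-- "provided": available at compile time, excluded from the
          packaged/shaded artifact -->
     <scope>provided</scope>
   </dependency>
   ```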
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
