gh-yzou commented on PR #1211:
URL: https://github.com/apache/polaris/pull/1211#issuecomment-2741672784

   By *dependency stuff*, do you mean we need to use the correct implementation library (org.apache.spark:spark-sql_&lt;scalaVersion&gt;:&lt;sparkVersion&gt;) in our plugin development? If that is what you mean, I think we are already doing that: we configure the Spark and Scala versions in a spark-scala.properties file, similar to what Nessie does, and add all targets to the default build.
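   For illustration, a minimal sketch of what that setup can look like in a Gradle Kotlin DSL build (the property keys, versions, and file layout here are assumptions modeled on Nessie's approach, not the exact contents of this PR):

```kotlin
// build.gradle.kts -- hypothetical sketch, not the actual Polaris build script.
// Assumes a spark-scala.properties file next to the build script, e.g.:
//   scalaVersions=2.12,2.13
//   sparkVersions=3.5

import java.util.Properties

plugins {
    `java-library` // Java-only module; the Scala plugin is intentionally not applied
}

// Load the configured Scala versions from the properties file.
val sparkScalaProps = Properties().apply {
    file("spark-scala.properties").inputStream().use { load(it) }
}

// For brevity, pick the first configured Scala version; the real build
// would create one target per Spark/Scala combination.
val scalaVersion = sparkScalaProps.getProperty("scalaVersions", "2.12").split(",").first().trim()
val sparkVersion = "3.5.3" // illustrative concrete Spark release

dependencies {
    // The Scala version only selects the artifact suffix of the published jar;
    // no Scala compilation happens in this module.
    compileOnly("org.apache.spark:spark-sql_$scalaVersion:$sparkVersion")
}
```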
   My point is that the properties file is only used to configure the dependency library versions; the plugin does not need a Scala build if it does not introduce any new Scala classes. So the Scala build infrastructure is not needed by the Spark plugin. If we still decide to go with Scala for PR 1208, it will be required there, but there seems to be an ongoing discussion about where that change should go.
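   By contrast, the Scala toolchain would only come into play for a module that actually compiles Scala sources. A hedged sketch of the extra build configuration such a module would need (plugin application and version are illustrative, not a prescription for PR 1208):

```kotlin
// build.gradle.kts -- hypothetical, only for a module that adds Scala classes.
plugins {
    scala // pulls in the Scala compilation toolchain
}

dependencies {
    // The Scala standard library version must line up with the _<scalaVersion>
    // suffix of the Spark artifacts the module depends on.
    implementation("org.scala-lang:scala-library:2.12.19")
}
```

   A plain Java module, like the sketch above, never needs this; it only references the Scala version as a string when forming dependency coordinates.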

