pan3793 commented on code in PR #47402:
URL: https://github.com/apache/spark/pull/47402#discussion_r1708690764


##########
assembly/pom.xml:
##########
@@ -159,6 +159,78 @@
             </target>
           </configuration>
       </plugin>
+      <plugin>
+        <!--
+          Here we download ammonite dependency required for Spark Connect REPL and copy
+          Spark Connect client to target's jars/connect-repl directory. Both jars will
+          only be loaded when we run Spark Connect shell, see also
+          AbstractCommandBuilder.buildClassPath and SPARK-48936.
+        -->
+        <groupId>org.codehaus.mojo</groupId>
+        <artifactId>exec-maven-plugin</artifactId>
+        <executions>
+          <execution>
+            <id>get-ammonite-jar</id>
+            <phase>package</phase>
+            <goals>
+              <goal>exec</goal>
+            </goals>
+            <configuration>
+              <executable>${basedir}/../build/mvn</executable>

Review Comment:
   this is a little bit hacky. if we use `./dev/make-distribution.sh <extra-args>` to build a binary distribution, `extra-args` won't be passed into this nested command, which may cause issues.
   
   is it possible to collect artifacts in the `spark-connect-client-jvm` module and only perform a simple copy here, to avoid recursive `build/mvn` calls?
   
   also cc @LuciferYang, do you have any suggestions?
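   
   a rough sketch of the kind of thing I have in mind (untested; the `${ammonite.version}` property, the `ammonite_${scala.version}` artifactId, and the `connect-repl` output path are only placeholders): resolve ammonite with `maven-dependency-plugin` inside `spark-connect-client-jvm`, so the assembly module only needs a plain file copy:
   
   ```xml
   <!-- Hypothetical sketch: copy the ammonite jar during the normal reactor build
        instead of launching a nested build/mvn. Coordinates/versions are placeholders. -->
   <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-dependency-plugin</artifactId>
     <executions>
       <execution>
         <id>copy-connect-repl-deps</id>
         <phase>package</phase>
         <goals>
           <goal>copy</goal>
         </goals>
         <configuration>
           <artifactItems>
             <artifactItem>
               <groupId>com.lihaoyi</groupId>
               <artifactId>ammonite_${scala.version}</artifactId>
               <version>${ammonite.version}</version>
             </artifactItem>
           </artifactItems>
           <outputDirectory>${project.build.directory}/connect-repl</outputDirectory>
         </configuration>
       </execution>
     </executions>
   </plugin>
   ```
   
   that way `./dev/make-distribution.sh <extra-args>` stays a single Maven invocation and the extra args apply everywhere.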



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

