This is an automated email from the ASF dual-hosted git repository.

hvanhovell pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 9023f301c78 [SPARK-42656][CONNECT] Adding SCALA REPL shell script for JVM client
9023f301c78 is described below

commit 9023f301c78f0faa68b555389f40a35df9b50437
Author: Zhen Li <zhenli...@users.noreply.github.com>
AuthorDate: Thu Mar 2 22:00:32 2023 -0400

    [SPARK-42656][CONNECT] Adding SCALA REPL shell script for JVM client
    
    ### What changes were proposed in this pull request?
    Add a simple script to start the Scala client in the Scala REPL, as well as a script to start the Spark Connect server for the client to connect to.
    
    ### Why are the changes needed?
    Make the JVM client easier to use.
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Manually tested.
    
    Closes #40257 from zhenlineo/jshell.
    
    Authored-by: Zhen Li <zhenli...@users.noreply.github.com>
    Signed-off-by: Herman van Hovell <her...@databricks.com>
    (cherry picked from commit 7505e5cdd12661dd5c96bfd004185bf6fc4eb33b)
    Signed-off-by: Herman van Hovell <her...@databricks.com>
---
 connector/connect/bin/spark-connect                | 32 ++++++++++++++
 connector/connect/bin/spark-connect-scala-client   | 50 ++++++++++++++++++++++
 .../connect/bin/spark-connect-scala-client.sc      | 15 +++++++
 3 files changed, 97 insertions(+)

diff --git a/connector/connect/bin/spark-connect b/connector/connect/bin/spark-connect
new file mode 100755
index 00000000000..008209c8440
--- /dev/null
+++ b/connector/connect/bin/spark-connect
@@ -0,0 +1,32 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Go to the Spark project root directory
+FWDIR="$(cd "`dirname "$0"`"/../../..; pwd)"
+cd "$FWDIR"
+export SPARK_HOME=$FWDIR
+
+# Build the jars needed for spark submit and spark connect
+build/sbt package
+
+SCALA_BINARY_VER=`grep "scala.binary.version" "${SPARK_HOME}/pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+
+CONNECT_JAR=`ls "${SPARK_HOME}"/connector/connect/server/target/scala-"${SCALA_BINARY_VER}"/spark-connect-assembly*.jar | paste -sd ',' -`
+
+exec "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.sql.connect.SimpleSparkConnectService "$CONNECT_JAR"
\ No newline at end of file
diff --git a/connector/connect/bin/spark-connect-scala-client b/connector/connect/bin/spark-connect-scala-client
new file mode 100755
index 00000000000..902091a74de
--- /dev/null
+++ b/connector/connect/bin/spark-connect-scala-client
@@ -0,0 +1,50 @@
+#!/usr/bin/env bash
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Use the spark connect JVM client to connect to a spark connect server.
+#
+# Start a local server:
+# A local spark-connect server with default settings can be started using the following command:
+#  `connector/connect/bin/spark-connect`
+# The client should be able to connect to this server directly with the default client settings.
+#
+# Connect to a remote server:
+# To connect to a remote server, use env var `SPARK_REMOTE` to configure the client connection
+# string. e.g.
+#  `export SPARK_REMOTE="sc://<URL>:<port>/;token=<auth token>;<param1>=<value1>"`
+
+# Go to the Spark project root directory
+FWDIR="$(cd "`dirname "$0"`"/../../..; pwd)"
+cd "$FWDIR"
+export SPARK_HOME=$FWDIR
+
+# Build the jars needed for spark connect JVM client
+build/sbt "sql/package;connect-client-jvm/assembly"
+
+CONNECT_CLASSPATH="$(build/sbt -DcopyDependencies=false "export connect-client-jvm/fullClasspath" | grep jar | tail -n1)"
+SQL_CLASSPATH="$(build/sbt -DcopyDependencies=false "export sql/fullClasspath" | grep jar | tail -n1)"
+
+INIT_SCRIPT="${SPARK_HOME}"/connector/connect/bin/spark-connect-scala-client.sc
+
+# Determine the Scala version used in Spark
+SCALA_BINARY_VER=`grep "scala.binary.version" "${SPARK_HOME}/pom.xml" | head -n1 | awk -F '[<>]' '{print $3}'`
+SCALA_VER=`grep "scala.version" "${SPARK_HOME}/pom.xml" | grep ${SCALA_BINARY_VER} | head -n1 | awk -F '[<>]' '{print $3}'`
+SCALA_BIN="${SPARK_HOME}/build/scala-${SCALA_VER}/bin/scala"
+
+exec "${SCALA_BIN}" -cp "$CONNECT_CLASSPATH:$SQL_CLASSPATH" -i $INIT_SCRIPT
\ No newline at end of file
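
The comments in the script above describe the `SPARK_REMOTE` connection string that is forwarded to the REPL. As an illustrative sketch only (not taken from the commit itself), the init script in the next file boils down to roughly the following when `SPARK_REMOTE` is set, assuming a hypothetical endpoint at sc://localhost:15002 (15002 being the default Spark Connect port):

  // Illustrative only: build a session against an assumed local endpoint.
  import org.apache.spark.sql.SparkSession

  val spark = SparkSession
    .builder()
    .remote("sc://localhost:15002")  // same format as SPARK_REMOTE described above
    .build()

If `SPARK_REMOTE` is unset, the init script falls back to a builder without a remote address, as shown in the diff that follows.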
diff --git a/connector/connect/bin/spark-connect-scala-client.sc b/connector/connect/bin/spark-connect-scala-client.sc
new file mode 100644
index 00000000000..0aa96545e53
--- /dev/null
+++ b/connector/connect/bin/spark-connect-scala-client.sc
@@ -0,0 +1,15 @@
+import org.apache.spark.sql.functions._
+import org.apache.spark.sql.SparkSession
+
+val conStr = if (sys.env.contains("SPARK_REMOTE")) sys.env("SPARK_REMOTE") else ""
+val sessionBuilder = SparkSession.builder()
+val spark = if (conStr.isEmpty) sessionBuilder.build() else sessionBuilder.remote(conStr).build()
+println(
+  """
+    |   _____                  __      ______                            __
+    |  / ___/____  ____ ______/ /__   / ____/___  ____  ____  ___  _____/ /_
+    |  \__ \/ __ \/ __ `/ ___/ //_/  / /   / __ \/ __ \/ __ \/ _ \/ ___/ __/
+    | ___/ / /_/ / /_/ / /  / ,<    / /___/ /_/ / / / / / / /  __/ /__/ /_
+    |/____/ .___/\__,_/_/  /_/|_|   \____/\____/_/ /_/_/ /_/\___/\___/\__/
+    |    /_/
+    |""".stripMargin)
\ No newline at end of file
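
Once the REPL comes up with the banner above, the `spark` value defined by the init script can be used directly. A small hypothetical usage sketch, assuming these basic Dataset operations are supported by this version of the JVM client:

  // Hypothetical REPL input; `spark` is created by spark-connect-scala-client.sc above.
  val df = spark.range(5)               // lazily define a small Dataset
  println(df.collect().mkString(", "))  // execute the plan and fetch the rows

The plan is only executed when an action such as collect() runs, at which point it is sent over the Spark Connect gRPC channel to the server started by connector/connect/bin/spark-connect (or to the remote configured via SPARK_REMOTE).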


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
