This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new b9c2a4159c94 [SPARK-55586][EXAMPLE] Add `jdbc.py` Example
b9c2a4159c94 is described below
commit b9c2a4159c9439e2a0467eadbef9e4606ce94f41
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Wed Feb 18 05:53:04 2026 -0800
[SPARK-55586][EXAMPLE] Add `jdbc.py` Example
### What changes were proposed in this pull request?
This PR aims to add a `jdbc.py` example.
### Why are the changes needed?
To provide a working example which can be used downstream as a test case.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manual tests.
**1. Run this example with `PostgreSQL`**
```bash
$ docker run -e POSTGRES_PASSWORD=rootpass -p 5432:5432 -d postgres:latest
```
```bash
$ curl -LO https://repo1.maven.org/maven2/org/postgresql/postgresql/42.7.7/postgresql-42.7.7.jar
$ bin/spark-submit \
  --driver-class-path postgresql-42.7.7.jar \
  --jars postgresql-42.7.7.jar \
  -c spark.log.level=ERROR \
  examples/src/main/python/sql/jdbc.py \
  'jdbc:postgresql://127.0.0.1:5432/?user=postgres&password=rootpass'
...
+---+----+
| id|name|
+---+----+
| 2| bar|
| 1| foo|
+---+----+
```
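The PostgreSQL URL above carries the credentials in its query string rather than in a separate properties dict. Not part of the patch, but as a quick plain-Python sanity check of that URL shape (the URL value here is just the one from the command above), the standard library can parse it once the `jdbc:` prefix is stripped:

```python
from urllib.parse import urlparse, parse_qs

# The URL passed to jdbc.py in the PostgreSQL run above.
url = "jdbc:postgresql://127.0.0.1:5432/?user=postgres&password=rootpass"

# A Spark JDBC URL is the database URL prefixed with "jdbc:", so stripping
# that prefix leaves something urlparse understands.
parsed = urlparse(url.removeprefix("jdbc:"))
params = {k: v[0] for k, v in parse_qs(parsed.query).items()}

print(parsed.scheme)                 # postgresql
print(parsed.hostname, parsed.port)  # 127.0.0.1 5432
print(params)                        # {'user': 'postgres', 'password': 'rootpass'}
```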
**2. Run this example with `MySQL`**
```bash
$ docker run --name mysql.server -p 3306:3306 -e TZ=UTC \
  -e MYSQL_ROOT_PASSWORD=rootpass -d mysql:latest
```
```bash
$ curl -LO https://repo1.maven.org/maven2/com/mysql/mysql-connector-j/9.6.0/mysql-connector-j-9.6.0.jar
$ bin/spark-submit \
  --driver-class-path mysql-connector-j-9.6.0.jar \
  --jars mysql-connector-j-9.6.0.jar \
  -c spark.log.level=ERROR \
  examples/src/main/python/sql/jdbc.py \
  'jdbc:mysql://127.0.0.1:3306/db?user=root&password=rootpass&createDatabaseIfNotExist=true'
...
+---+----+
| id|name|
+---+----+
| 2| bar|
| 1| foo|
+---+----+
```
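Because the credentials ride in the URL, `jdbc.py` can pass `properties={}` to `read.jdbc`/`write.jdbc`. Those methods also accept the split form, a bare URL plus a properties dict. Not part of the patch, and `split_jdbc_url` is a hypothetical helper name, but a plain-Python sketch of converting the MySQL URL above into that form:

```python
from urllib.parse import parse_qs

# Hypothetical helper: split a credentials-in-URL JDBC URL into the
# bare URL plus properties-dict form that read.jdbc/write.jdbc accept.
def split_jdbc_url(url: str):
    base, _, query = url.partition("?")
    props = {k: v[0] for k, v in parse_qs(query).items()}
    return base, props

base, props = split_jdbc_url(
    "jdbc:mysql://127.0.0.1:3306/db?user=root&password=rootpass&createDatabaseIfNotExist=true"
)
print(base)   # jdbc:mysql://127.0.0.1:3306/db
print(props)  # {'user': 'root', 'password': 'rootpass', 'createDatabaseIfNotExist': 'true'}
```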
### Was this patch authored or co-authored using generative AI tooling?
Generated-by: `Gemini 3 Pro (High)` on `Antigravity`
Closes #54363 from dongjoon-hyun/SPARK-55586.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
examples/src/main/python/sql/jdbc.py | 48 ++++++++++++++++++++++++++++++++++++
1 file changed, 48 insertions(+)
diff --git a/examples/src/main/python/sql/jdbc.py
b/examples/src/main/python/sql/jdbc.py
new file mode 100644
index 000000000000..7e67ab351f63
--- /dev/null
+++ b/examples/src/main/python/sql/jdbc.py
@@ -0,0 +1,48 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+"""
+A simple example demonstrating Spark SQL JDBC integration.
+Run with:
+ ./bin/spark-submit examples/src/main/python/sql/jdbc.py [jdbc_url]
+"""
+import sys
+from pyspark.sql import SparkSession
+
+
+if __name__ == "__main__":
+ if len(sys.argv) < 2:
+ print("Usage: jdbc.py <jdbc_url>", file=sys.stderr)
+ sys.exit(-1)
+ url = sys.argv[1]
+
+ spark = SparkSession \
+ .builder \
+ .appName("Python Spark SQL JDBC integration example") \
+ .getOrCreate()
+
+ # 1. Create a DataFrame
+ df = spark.createDataFrame([(1, "foo"), (2, "bar")], ["id", "name"])
+
+ # 2. Write data to a JDBC source
+ df.write.jdbc(url, "test_table", mode="overwrite", properties={})
+
+ # 3. Read data from a JDBC source
+ jdbcDF = spark.read.jdbc(url, "test_table", properties={})
+ jdbcDF.show()
+
+ spark.stop()
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]