This is an automated email from the ASF dual-hosted git repository.

techdocsmith pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/druid.git


The following commit(s) were added to refs/heads/master by this push:
     new cc2e4a80ff doc: add a basic JDBC tutorial (#13343)
cc2e4a80ff is described below

commit cc2e4a80ff485a15a9bba1ea94bcf0915a35b264
Author: 317brian <[email protected]>
AuthorDate: Wed Nov 30 16:25:35 2022 -0800

    doc: add a basic JDBC tutorial (#13343)
    
    * initial commit for jdbc tutorial
    
    (cherry picked from commit 04c4adad71e5436b76c3425fe369df03aaaf0acb)
    
    * add commentary
    
    * address comments from charles
    
    * add query context to example
    
    * fix typo
    
    * add links
    
    * Apply suggestions from code review
    
    Co-authored-by: Frank Chen <[email protected]>
    
    * fix datatype
    
    * address feedback
    
    * add parameterize to spelling file. the past tense version was already 
there
    
    Co-authored-by: Frank Chen <[email protected]>
---
 docs/querying/sql-jdbc.md       | 121 +++++++++++++++++++++++++++++++++++++++-
 docs/tutorials/tutorial-jdbc.md |  31 ++++++++++
 website/.spelling               |   1 +
 website/sidebars.json           |   5 +-
 4 files changed, 154 insertions(+), 4 deletions(-)

diff --git a/docs/querying/sql-jdbc.md b/docs/querying/sql-jdbc.md
index 0f041c6fad..a558637bc1 100644
--- a/docs/querying/sql-jdbc.md
+++ b/docs/querying/sql-jdbc.md
@@ -29,6 +29,12 @@ sidebar_label: "JDBC driver API"
 
 You can make [Druid SQL](./sql.md) queries using the [Avatica JDBC 
driver](https://calcite.apache.org/avatica/downloads/). We recommend using 
Avatica JDBC driver version 1.17.0 or later. Note that as of the time of this 
writing, Avatica 1.17.0, the latest version, does not support passing 
connection string parameters from the URL to Druid, so you must pass them using 
a `Properties` object. Once you've downloaded the Avatica client jar, add it to 
your classpath and use the connect string  [...]
 
+When using the JDBC connector for the [examples](#examples) or in general, 
it's helpful to understand the parts of the connect string stored in the `url` 
variable:
+
+  - `jdbc:avatica:remote:url=` is prepended to the hostname and port.
+  - The hostname and port number for your Druid deployment depend on whether you want to connect to the Router or a specific Broker. For more information, see [Connection stickiness](#connection-stickiness). For the quickstart deployment, the hostname and port are `http://localhost:8888`, which connects to the Router running on your local machine.
+  - The SQL endpoint in Druid for the Avatica driver is `/druid/v2/sql/avatica/`.
+
 Example code:
 
 ```java
@@ -51,6 +57,8 @@ try (Connection connection = DriverManager.getConnection(url, 
connectionProperti
 }
 ```
 
+For a complete, runnable example, see [Examples](#examples).
+
 It is also possible to use a protocol buffers JDBC connection with Druid, which offers reduced bloat and potential performance improvements for larger result sets. To use it, apply the following connection URL instead; everything else remains the same:
 ```
@@ -60,12 +68,12 @@ String url = 
"jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica
 > The protobuf endpoint is also known to work with the official [Golang 
 > Avatica driver](https://github.com/apache/calcite-avatica-go)
 
 Table metadata is available over JDBC using `connection.getMetaData()` or by 
querying the
-["INFORMATION_SCHEMA" tables](sql-metadata-tables.md).
+["INFORMATION_SCHEMA" tables](sql-metadata-tables.md). For an example of this, 
see [Get the metadata for a datasource](#get-the-metadata-for-a-datasource).
 
 ## Connection stickiness
 
 Druid's JDBC server does not share connection state between Brokers. This 
means that if you're using JDBC and have
-multiple Druid Brokers, you should either connect to a specific Broker, or use 
a load balancer with sticky sessions
+multiple Druid Brokers, you should either connect to a specific Broker or use 
a load balancer with sticky sessions
 enabled. The Druid Router process provides connection stickiness when 
balancing JDBC requests, and can be used to achieve
 the necessary stickiness even with a normal non-sticky load balancer. Please 
see the
 [Router](../design/router.md) documentation for more details.
@@ -82,3 +90,112 @@ statement.setString(1, "abc");
 statement.setString(2, "def");
 final ResultSet resultSet = statement.executeQuery();
 ```
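
The fragment above shows only the `PreparedStatement` calls. Expanded into a complete program against the quickstart `wikipedia` datasource, it might look like the following sketch (the filter values and the `connectUrl` helper are illustrative, not part of the driver API):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Properties;

public class JdbcDynamicParameters {

    // Builds the Avatica connect string for a given host and port.
    static String connectUrl(String host, int port)
    {
        return "jdbc:avatica:remote:url=http://" + host + ":" + port + "/druid/v2/sql/avatica/";
    }

    public static void main(String[] args) throws SQLException
    {
        // Connect to /druid/v2/sql/avatica/ on your Router.
        String url = connectUrl("localhost", 8888);
        Properties connectionProperties = new Properties();

        try (Connection connection = DriverManager.getConnection(url, connectionProperties);
             PreparedStatement statement = connection.prepareStatement(
                 "SELECT __time, comment FROM wikipedia WHERE countryName = ? OR countryName = ?")) {
            // Bind the dynamic parameters by position, starting at 1.
            statement.setString(1, "Japan");
            statement.setString(2, "France");
            try (ResultSet resultSet = statement.executeQuery()) {
                while (resultSet.next()) {
                    System.out.println(resultSet.getTimestamp("__time") + " " + resultSet.getString("comment"));
                }
            }
        }
    }
}
```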
+
+## Examples
+
+<!-- docs/tutorial-jdbc.md redirects here -->
+
+This section contains two complete examples that use the JDBC connector:
+
+- [Get the metadata for a datasource](#get-the-metadata-for-a-datasource) shows you how to query `INFORMATION_SCHEMA` to get metadata such as column names.
+- [Query data](#query-data) runs a select query against the datasource.
+
+You can try out these examples after verifying that you meet the 
[prerequisites](#prerequisites).
+
+For more information about the connection options, see [Client 
Reference](https://calcite.apache.org/avatica/docs/client_reference.html).
+
+### Prerequisites
+
+Make sure you meet the following requirements before trying these examples:
+
+- A supported Java version, such as Java 8
+
+- [Avatica JDBC driver](https://calcite.apache.org/avatica/downloads/). You can add the JAR to your `CLASSPATH` directly or manage it externally, such as through Maven and a `pom.xml` file.
+
+- An available Druid instance. You can use the `micro-quickstart` 
configuration described in [Quickstart (local)](../tutorials/index.md). The 
examples assume that you are using the quickstart, so no authentication or 
authorization is expected unless explicitly mentioned. 
+
+- The example `wikipedia` datasource from the quickstart is loaded on your 
Druid instance. If you have a different datasource loaded, you can still try 
these examples. You'll have to update the table name and column names to match 
your datasource.
+
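
If you prefer to manage the driver through Maven rather than the `CLASSPATH`, a dependency along these lines pulls in the Avatica client. This is a sketch: `1.17.0` matches the minimum version recommended above, and newer releases may be available on the downloads page.

```xml
<dependency>
  <groupId>org.apache.calcite.avatica</groupId>
  <artifactId>avatica</artifactId>
  <version>1.17.0</version>
</dependency>
```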
+### Get the metadata for a datasource
+
+Metadata, such as column names, is available either through the [`INFORMATION_SCHEMA`](../querying/sql-metadata-tables.md) tables or through `connection.getMetaData()`. The following example uses the `INFORMATION_SCHEMA` table to retrieve and print the list of column names for the `wikipedia` datasource that you loaded during a previous tutorial.
+
+```java
+import java.sql.*;
+import java.util.Properties;
+
+public class JdbcListColumns {
+
+    public static void main(String[] args) throws SQLException
+    {
+        // Connect to /druid/v2/sql/avatica/ on your Router. 
+        // You can connect to a Broker but must configure connection 
stickiness if you do. 
+        String url = "jdbc:avatica:remote:url=http://localhost:8888/druid/v2/sql/avatica/";
+
+        String query = "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'wikipedia' AND TABLE_SCHEMA = 'druid'";
+        // Set any connection context parameters you need here
+        // Or leave empty for default behavior.
+        Properties connectionProperties = new Properties();
+
+        try (Connection connection = DriverManager.getConnection(url, 
connectionProperties)) {
+            try (
+                    final Statement statement = connection.createStatement();
+                    final ResultSet rs = statement.executeQuery(query)
+            ) {
+                while (rs.next()) {
+                    String columnName = rs.getString("COLUMN_NAME");
+                    System.out.println(columnName);
+                }
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
+
+    }
+}
+```
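+
The same information is also reachable through the standard JDBC metadata API mentioned at the top of this page. A minimal sketch using `connection.getMetaData()` and `DatabaseMetaData.getColumns()` follows; the `formatColumn` helper is illustrative, and the `druid` schema name assumes the quickstart deployment:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Properties;

public class JdbcColumnsViaMetaData {

    // Renders one column description line, e.g. "__time (TIMESTAMP)".
    static String formatColumn(String name, String type)
    {
        return name + " (" + type + ")";
    }

    public static void main(String[] args) throws SQLException
    {
        // Connect to /druid/v2/sql/avatica/ on your Router.
        String url = "jdbc:avatica:remote:url=http://localhost:8888/druid/v2/sql/avatica/";

        try (Connection connection = DriverManager.getConnection(url, new Properties())) {
            DatabaseMetaData metaData = connection.getMetaData();
            // Catalog is null; "druid" is the schema that holds datasources.
            try (ResultSet rs = metaData.getColumns(null, "druid", "wikipedia", null)) {
                while (rs.next()) {
                    System.out.println(formatColumn(rs.getString("COLUMN_NAME"), rs.getString("TYPE_NAME")));
                }
            }
        }
    }
}
```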
+
+### Query data
+
+Now that you know what columns are available, you can start querying the data. The following example queries the datasource named `wikipedia` for the timestamps and comments from Japan. It also sets the [query context parameter](../querying/sql-query-context.md) `sqlTimeZone`. Optionally, you can parameterize queries by using [dynamic parameters](#dynamic-parameters).
+
+```java
+import java.sql.*;
+import java.util.Properties;
+
+public class JdbcCountryAndTime {
+
+    public static void main(String[] args) throws SQLException
+    {
+        // Connect to /druid/v2/sql/avatica/ on your Router. 
+        // You can connect to a Broker but must configure connection 
stickiness if you do. 
+        String url = "jdbc:avatica:remote:url=http://localhost:8888/druid/v2/sql/avatica/";
+
+        // The query you want to run.
+        String query = "SELECT __time, isRobot, countryName, comment FROM 
wikipedia WHERE countryName='Japan'";
+        // Set any connection context parameters you need here
+        // Or leave empty for default behavior.
+        Properties connectionProperties = new Properties();
+        connectionProperties.setProperty("sqlTimeZone", "America/Los_Angeles");
+
+        try (Connection connection = DriverManager.getConnection(url, 
connectionProperties)) {
+            try (
+                    final Statement statement = connection.createStatement();
+                    final ResultSet rs = statement.executeQuery(query)
+            ) {
+                while (rs.next()) {
+                    Timestamp timeStamp = rs.getTimestamp("__time");
+                    String comment = rs.getString("comment");
+                    System.out.println(timeStamp);
+                    System.out.println(comment);
+                }
+            }
+        } catch (SQLException e) {
+            throw new RuntimeException(e);
+        }
+
+    }
+}
+```
+
+
diff --git a/docs/tutorials/tutorial-jdbc.md b/docs/tutorials/tutorial-jdbc.md
new file mode 100644
index 0000000000..28cee144c3
--- /dev/null
+++ b/docs/tutorials/tutorial-jdbc.md
@@ -0,0 +1,31 @@
+---
+id: tutorial-jdbc
+title: "Tutorial: Using the JDBC driver to query Druid"
+sidebar_label: JDBC connector
+---
+
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one
+  ~ or more contributor license agreements.  See the NOTICE file
+  ~ distributed with this work for additional information
+  ~ regarding copyright ownership.  The ASF licenses this file
+  ~ to you under the Apache License, Version 2.0 (the
+  ~ "License"); you may not use this file except in compliance
+  ~ with the License.  You may obtain a copy of the License at
+  ~
+  ~   http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing,
+  ~ software distributed under the License is distributed on an
+  ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+  ~ KIND, either express or implied.  See the License for the
+  ~ specific language governing permissions and limitations
+  ~ under the License.
+  -->
+
+Redirecting you to the JDBC connector examples...
+
+<script>window.location.replace("https://druid.apache.org/docs/latest/querying/sql-jdbc.html#examples")</script>
+<a href="https://druid.apache.org/docs/latest/querying/sql-jdbc.html#examples">Click here if you are not redirected.</a>
+
diff --git a/website/.spelling b/website/.spelling
index d02f4f8467..7c4d471aaf 100644
--- a/website/.spelling
+++ b/website/.spelling
@@ -379,6 +379,7 @@ non-nullable
 noop
 numerics
 numShards
 objectGlob
+parameterize
 parameterized
 parse_json
diff --git a/website/sidebars.json b/website/sidebars.json
index 1ebc214027..e0fcbf0403 100644
--- a/website/sidebars.json
+++ b/website/sidebars.json
@@ -13,7 +13,7 @@
       "tutorials/tutorial-batch-hadoop",
       "tutorials/tutorial-query",
       "tutorials/tutorial-rollup",
-      "tutorials/tutorial-sketches-theta",      
+      "tutorials/tutorial-sketches-theta",
       "tutorials/tutorial-retention",
       "tutorials/tutorial-update-data",
       "tutorials/tutorial-compaction",
@@ -22,7 +22,8 @@
       "tutorials/tutorial-transform-spec",
       "tutorials/docker",
       "tutorials/tutorial-kerberos-hadoop",
-      "tutorials/tutorial-msq-convert-spec"
+      "tutorials/tutorial-msq-convert-spec",
+      "tutorials/tutorial-jdbc"
     ],
     "Design": [
       "design/architecture",


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
