This is an automated email from the ASF dual-hosted git repository.
bowenliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/kyuubi.git
The following commit(s) were added to refs/heads/master by this push:
new 4be20e1ca [KYUUBI #5404] JDBC Engine supports StarRocks
4be20e1ca is described below
commit 4be20e1ca7691245addacf0373b9524afb428527
Author: Bowen Liang <[email protected]>
AuthorDate: Thu Dec 21 16:19:36 2023 +0800
[KYUUBI #5404] JDBC Engine supports StarRocks
# :mag: Description
## Issue References 🔗
This pull request fixes #5404
## Describe Your Solution 🔧
- Introduce StarRocks support in the JDBC engine
- Add dialects and tests for StarRocks
- Tested with the official open-source StarRocks 3.x Docker image, running a
mini cluster with FE and BE set up.
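For illustration only, a minimal `kyuubi-defaults.conf` sketch for pointing the JDBC engine at StarRocks. The `kyuubi.engine.jdbc.*` keys are documented Kyuubi configs; the concrete values (the `starrocks` type name, the FE host/port, and the MySQL driver class, since StarRocks speaks the MySQL protocol) are assumptions for this sketch, not confirmed by this patch:

```properties
# Hypothetical fragment of $KYUUBI_HOME/conf/kyuubi-defaults.conf.
# Keys are real Kyuubi configs; the values below are illustrative assumptions.
kyuubi.engine.type=JDBC
kyuubi.engine.jdbc.type=starrocks
# StarRocks FE conventionally exposes the MySQL protocol on port 9030
kyuubi.engine.jdbc.connection.url=jdbc:mysql://starrocks-fe:9030
kyuubi.engine.jdbc.connection.user=root
kyuubi.engine.jdbc.connection.password=
kyuubi.engine.jdbc.driver.class=com.mysql.cj.jdbc.Driver
```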
## Types of changes :bookmark:
- [ ] Bugfix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing
functionality to change)
## Test Plan 🧪
#### Behavior Without This Pull Request :coffin:
#### Behavior With This Pull Request :tada:
StarRocks is now supported in the JDBC engine.
#### Related Unit Tests
Added `StarRocksOperationSuite`, `StarRocksOperationWithEngineSuite`,
`StarRocksSessionSuite`, and `StarRocksStatementSuite`.
---
# Checklists
## 📝 Author Self Checklist
- [x] My code follows the [style
guidelines](https://kyuubi.readthedocs.io/en/master/contributing/code/style.html)
of this project
- [x] I have performed a self-review
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have made corresponding changes to the documentation
- [ ] My changes generate no new warnings
- [x] I have added tests that prove my fix is effective or that my feature
works
- [x] New and existing unit tests pass locally with my changes
- [x] This patch was not authored or co-authored using [Generative
Tooling](https://www.apache.org/legal/generative-tooling.html)
## 📝 Committer Pre-Merge Checklist
- [ ] Pull request title is okay.
- [ ] No license issues.
- [ ] Milestone correctly set?
- [ ] Test coverage is ok
- [ ] Assignees are selected.
- [ ] Minimum number of approvals
- [ ] No changes are requested
**Be nice. Be informative.**
Closes #5882 from bowenliang123/jdbc-sr.
Closes #5404
54927d341 [Bowen Liang] update
5b1bbf71f [Bowen Liang] update doc
f01da74ea [Bowen Liang] update StarRocksStatementSuite
1018bc95a [Bowen Liang] MySQL8ConnectionProvider
59cba957f [Bowen Liang] simplify StarRocksDialect
4bf55bba7 [Bowen Liang] simplify StarRocksTRowSetGenerator
c85722481 [Bowen Liang] jdbc starrocks
Authored-by: Bowen Liang <[email protected]>
Signed-off-by: Bowen Liang <[email protected]>
---
docs/configuration/settings.md | 168 ++++++-------
...i.engine.jdbc.connection.JdbcConnectionProvider | 1 +
...g.apache.kyuubi.engine.jdbc.dialect.JdbcDialect | 1 +
.../StarRocksDialect.scala} | 16 +-
.../jdbc/doris/DorisConnectionProvider.scala | 4 +-
...ovider.scala => MySQL8ConnectionProvider.scala} | 10 +-
.../jdbc/mysql/MySQLConnectionProvider.scala | 2 +-
.../StarRocksConnectionProvider.scala} | 8 +-
.../StarRocksSchemaHelper.scala} | 11 +-
.../StarRocksTRowSetGenerator.scala} | 7 +-
.../jdbc/starrocks/StarRocksOperationSuite.scala | 261 +++++++++++++++++++++
.../StarRocksOperationWithEngineSuite.scala | 78 ++++++
.../jdbc/starrocks/StarRocksSessionSuite.scala} | 26 +-
.../jdbc/starrocks/StarRocksStatementSuite.scala | 105 +++++++++
.../jdbc/starrocks/WithStarRocksContainer.scala | 57 +++++
.../jdbc/starrocks/WithStarRocksEngine.scala} | 20 +-
.../org/apache/kyuubi/config/KyuubiConf.scala | 7 +-
17 files changed, 656 insertions(+), 126 deletions(-)
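The two one-line files added under `META-INF/services` above (for `...JdbcConnectionProvider` and `...JdbcDialect`) follow the standard `java.util.ServiceLoader` registration pattern: each file is named after an interface and contains the fully-qualified name of one implementation, which the loader then discovers on the classpath at runtime. A minimal sketch of that lookup, using `java.sql.Driver` only as a stand-in interface the JDK already ships service metadata for (the Kyuubi-specific interfaces are resolved the same way):

```java
import java.sql.Driver;
import java.util.ServiceLoader;

public class ServiceLoaderDemo {
    public static void main(String[] args) {
        // Discovers every implementation registered via a
        // META-INF/services/java.sql.Driver resource on the classpath.
        ServiceLoader<Driver> loader = ServiceLoader.load(Driver.class);
        for (Driver d : loader) {
            System.out.println(d.getClass().getName());
        }
    }
}
```

With this pattern, adding StarRocks support needs no central registry change: dropping a jar that ships the provider class plus its service file onto the classpath is enough for discovery.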
diff --git a/docs/configuration/settings.md b/docs/configuration/settings.md
index 4341d15d0..299cb7052 100644
--- a/docs/configuration/settings.md
+++ b/docs/configuration/settings.md
@@ -120,90 +120,90 @@ You can configure the Kyuubi properties in `$KYUUBI_HOME/conf/kyuubi-defaults.co
### Engine
-| Key | Default | [...]
-|---|---|--- [...]
-| kyuubi.engine.chat.ernie.http.connect.timeout | PT2M | The timeout[ms] for establishing the connection with the ernie bot server. A timeout value of zero is interpreted as an infinite timeout. [...]
-| kyuubi.engine.chat.ernie.http.proxy | <undefined> | HTTP proxy url for API calling in ernie bot engine. e.g. http://127.0.0.1:1088 [...]
-| kyuubi.engine.chat.ernie.http.socket.timeout | PT2M | The timeout[ms] for waiting for data packets after ernie bot server connection is established. A timeout value of zero is interpreted as an infinite timeout. [...]
-| kyuubi.engine.chat.ernie.model | completions | ID of the model used in ernie bot. Available models are completions_pro, ernie_bot_8k, completions and eb-instant[Model overview](https://cloud.baidu.com/doc/WENXINWORKSHOP/s/6lp69is2a). [...]
-| kyuubi.engine.chat.ernie.token | <undefined> | The token to access ernie bot open API, which could be got at https://cloud.baidu.com/doc/WENXINWORKSHOP/s/Ilkkrb0i5 [...]
-| kyuubi.engine.chat.extra.classpath | <undefined> | The extra classpath for the Chat engine, for configuring the location of the SDK and etc. [...]
-| kyuubi.engine.chat.gpt.apiKey | <undefined> | The key to access OpenAI open API, which could be got at https://platform.openai.com/account/api-keys [...]
-| kyuubi.engine.chat.gpt.http.connect.timeout | PT2M | The timeout[ms] for establishing the connection with the Chat GPT server. A timeout value of zero is interpreted as an infinite timeout. [...]
-| kyuubi.engine.chat.gpt.http.proxy | <undefined> | HTTP proxy url for API calling in Chat GPT engine. e.g. http://127.0.0.1:1087 [...]
-| kyuubi.engine.chat.gpt.http.socket.timeout | PT2M | The timeout[ms] for waiting for data packets after Chat GPT server connection is established. A timeout value of zero is interpreted as an infinite timeout. [...]
-| kyuubi.engine.chat.gpt.model | gpt-3.5-turbo | ID of the model used in ChatGPT. Available models refer to OpenAI's [Model overview](https://platform.openai.com/docs/models/overview). [...]
-| kyuubi.engine.chat.java.options | <undefined> | The extra Java options for the Chat engine [...]
-| kyuubi.engine.chat.memory | 1g | The heap memory for the Chat engine [...]
-| kyuubi.engine.chat.provider | ECHO | The provider for the Chat engine. Candidates: <ul> <li>ECHO: simply replies a welcome message.</li> <li>GPT: a.k.a ChatGPT, powered by OpenAI.</li> <li>ERNIE: ErnieBot, powered by Baidu.</li></ul> [...]
-| kyuubi.engine.connection.url.use.hostname | true | (deprecated) When true, the engine registers with hostname to zookeeper. When Spark runs on K8s with cluster mode, set to false to ensure that server can connect to engine [...]
-| kyuubi.engine.deregister.exception.classes || A comma-separated list of exception classes. If there is any exception thrown, whose class matches the specified classes, the engine would deregister itself. [...]
-| kyuubi.engine.deregister.exception.messages || A comma-separated list of exception messages. If there is any exception thrown, whose message or stacktrace matches the specified message list, the engine would deregister itself. [...]
-| kyuubi.engine.deregister.exception.ttl | PT30M | Time to live(TTL) for exceptions pattern specified in kyuubi.engine.deregister.exception.classes and kyuubi.engine.deregister.exception.messages to deregister engines. Once the total error count hits the kyuubi.engine.deregister.job.max.failures within the TTL, an engine will deregister itself and wait for self-terminated. Otherwise, we suppose that the engine has recovered from temporary failures. [...]
-| kyuubi.engine.deregister.job.max.failures | 4 | Number of failures of job before deregistering the engine. [...]
-| kyuubi.engine.event.json.log.path | file:///tmp/kyuubi/events | The location where all the engine events go for the built-in JSON logger.<ul><li>Local Path: start with 'file://'</li><li>HDFS Path: start with 'hdfs://'</li></ul> [...]
-| kyuubi.engine.event.loggers | SPARK | A comma-separated list of engine history loggers, where engine/session/operation etc events go.<ul> <li>SPARK: the events will be written to the Spark listener bus.</li> <li>JSON: the events will be written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be done</li> <li>CUSTOM: User-defined event handlers.</li></ul> Note that: Kyuubi supports custom event handlers with the Jav [...]
-| kyuubi.engine.flink.application.jars | <undefined> | A comma-separated list of the local jars to be shipped with the job to the cluster. For example, SQL UDF jars. Only effective in yarn application mode. [...]
-| kyuubi.engine.flink.extra.classpath | <undefined> | The extra classpath for the Flink SQL engine, for configuring the location of hadoop client jars, etc. Only effective in yarn session mode. [...]
-| kyuubi.engine.flink.initialize.sql | SHOW DATABASES | The initialize sql for Flink engine. It fallback to `kyuubi.engine.initialize.sql`. [...]
-| kyuubi.engine.flink.java.options | <undefined> | The extra Java options for the Flink SQL engine. Only effective in yarn session mode. [...]
-| kyuubi.engine.flink.memory | 1g | The heap memory for the Flink SQL engine. Only effective in yarn session mode. [...]
-| kyuubi.engine.hive.event.loggers | JSON | A comma-separated list of engine history loggers, where engine/session/operation etc events go.<ul> <li>JSON: the events will be written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be done</li> <li>CUSTOM: to be done.</li></ul> [...]
-| kyuubi.engine.hive.extra.classpath | <undefined> | The extra classpath for the Hive query engine, for configuring location of the hadoop client jars and etc. [...]
-| kyuubi.engine.hive.java.options | <undefined> | The extra Java options for the Hive query engine [...]
-| kyuubi.engine.hive.memory | 1g | The heap memory for the Hive query engine [...]
-| kyuubi.engine.initialize.sql | SHOW DATABASES | SemiColon-separated list of SQL statements to be initialized in the newly created engine before queries. i.e. use `SHOW DATABASES` to eagerly active HiveClient. This configuration can not be used in JDBC url due to the limitation of Beeline/JDBC driver. [...]
-| kyuubi.engine.jdbc.connection.password | <undefined> | The password is used for connecting to server [...]
-| kyuubi.engine.jdbc.connection.propagateCredential | false | Whether to use the session's user and password to connect to database [...]
-| kyuubi.engine.jdbc.connection.properties || The additional properties are used for connecting to server [...]
-| kyuubi.engine.jdbc.connection.provider | <undefined> | A JDBC connection provider plugin for the Kyuubi Server to establish a connection to the JDBC URL. The configuration value should be a subclass of `org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider`. Kyuubi provides the following built-in implementations: <li>doris: For establishing Doris connections.</li> <li>mysql: For establishing MySQL connections.</li> <li>phoenix: For establishing [...]
-| kyuubi.engine.jdbc.connection.url | <undefined> | The server url that engine will connect to [...]
-| kyuubi.engine.jdbc.connection.user | <undefined> | The user is used for connecting to server [...]
-| kyuubi.engine.jdbc.driver.class | <undefined> | The driver class for JDBC engine connection [...]
-| kyuubi.engine.jdbc.extra.classpath | <undefined> | The extra classpath for the JDBC query engine, for configuring the location of the JDBC driver and etc. [...]
-| kyuubi.engine.jdbc.fetch.size | 1000 | The fetch size of JDBC engine [...]
-| kyuubi.engine.jdbc.initialize.sql | SELECT 1 | SemiColon-separated list of SQL statements to be initialized in the newly created engine before queries. i.e. use `SELECT 1` to eagerly active JDBCClient. [...]
-| kyuubi.engine.jdbc.java.options | <undefined> | The extra Java options for the JDBC query engine [...]
-| kyuubi.engine.jdbc.memory | 1g | The heap memory for the JDBC query engine [...]
-| kyuubi.engine.jdbc.session.initialize.sql || SemiColon-separated list of SQL statements to be initialized in the newly created engine session before queries. [...]
-| kyuubi.engine.jdbc.type | <undefined> | The short name of JDBC type [...]
-| kyuubi.engine.kubernetes.submit.timeout | PT30S | The engine submit timeout for Kubernetes application. [...]
-| kyuubi.engine.operation.convert.catalog.database.enabled | true | When set to true, The engine converts the JDBC methods of set/get Catalog and set/get Schema to the implementation of different engines [...]
-| kyuubi.engine.operation.log.dir.root | engine_operation_logs | Root directory for query operation log at engine-side. [...]
-| kyuubi.engine.pool.name | engine-pool | The name of the engine pool. [...]
-| kyuubi.engine.pool.selectPolicy | RANDOM | The select policy of an engine from the corresponding engine pool engine for a session. <ul><li>RANDOM - Randomly use the engine in the pool</li><li>POLLING - Polling use the engine in the pool</li></ul> [...]
-| kyuubi.engine.pool.size | -1 | The size of the engine pool. Note that, if the size is less than 1, the engine pool will not be enabled; otherwise, the size of the engine pool will be min(this, kyuubi.engine.pool.size.threshold). [...]
-| kyuubi.engine.pool.size.threshold | 9 | This parameter is introduced as a server-side parameter controlling the upper limit of the engine pool. [...]
-| kyuubi.engine.session.initialize.sql || SemiColon-separated list of SQL statements to be initialized in the newly created engine session before queries. This configuration can not be used in JDBC url due to the limitation of Beeline/JDBC driver. [...]
-| kyuubi.engine.share.level | USER | Engines will be shared in different levels, available configs are: <ul> <li>CONNECTION: engine will not be shared but only used by the current client connection</li> <li>USER: engine will be shared by all sessions created by a unique username, see also kyuubi.engine.share.level.subdomain</li> <li>GROUP: the engine will be shared by all sessions created by all users belong to the same primary group na [...]
-| kyuubi.engine.share.level.sub.domain | <undefined> | (deprecated) - Using kyuubi.engine.share.level.subdomain instead [...]
-| kyuubi.engine.share.level.subdomain | <undefined> | Allow end-users to create a subdomain for the share level of an engine. A subdomain is a case-insensitive string values that must be a valid zookeeper subpath. For example, for the `USER` share level, an end-user can share a certain engine within a subdomain, not for all of its clients. End-users are free to create multiple engines in the `USER` share level. When disable engine pool, use 'default' if [...]
-| kyuubi.engine.single.spark.session | false | When set to true, this engine is running in a single session mode. All the JDBC/ODBC connections share the temporary views, function registries, SQL configuration and the current database. [...]
-| kyuubi.engine.spark.event.loggers | SPARK | A comma-separated list of engine loggers, where engine/session/operation etc events go.<ul> <li>SPARK: the events will be written to the Spark listener bus.</li> <li>JSON: the events will be written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be done</li> <li>CUSTOM: to be done.</li></ul> [...]
-| kyuubi.engine.spark.initialize.sql | SHOW DATABASES | The initialize sql for Spark engine. It fallback to `kyuubi.engine.initialize.sql`. [...]
-| kyuubi.engine.spark.python.env.archive | <undefined> | Portable Python env archive used for Spark engine Python language mode. [...]
-| kyuubi.engine.spark.python.env.archive.exec.path | bin/python | The Python exec path under the Python env archive. [...]
-| kyuubi.engine.spark.python.home.archive | <undefined> | Spark archive containing $SPARK_HOME/python directory, which is used to init session Python worker for Python language mode. [...]
-| kyuubi.engine.submit.timeout | PT30S | Period to tolerant Driver Pod ephemerally invisible after submitting. In some Resource Managers, e.g. K8s, the Driver Pod is not visible immediately after `spark-submit` is returned. [...]
-| kyuubi.engine.trino.connection.keystore.password | <undefined> | The keystore password used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.keystore.path | <undefined> | The keystore path used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.keystore.type | <undefined> | The keystore type used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.password | <undefined> | The password used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.truststore.password | <undefined> | The truststore password used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.truststore.path | <undefined> | The truststore path used for connecting to trino cluster [...]
-| kyuubi.engine.trino.connection.truststore.type | <undefined> | The truststore type used for connecting to trino cluster [...]
-| kyuubi.engine.trino.event.loggers | JSON | A comma-separated list of engine history loggers, where engine/session/operation etc events go.<ul> <li>JSON: the events will be written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be done</li> <li>CUSTOM: to be done.</li></ul> [...]
-| kyuubi.engine.trino.extra.classpath | <undefined> | The extra classpath for the Trino query engine, for configuring other libs which may need by the Trino engine [...]
-| kyuubi.engine.trino.java.options | <undefined> | The extra Java options for the Trino query engine [...]
-| kyuubi.engine.trino.memory | 1g | The heap memory for the Trino query engine [...]
-| kyuubi.engine.type | SPARK_SQL | Specify the detailed engine supported by Kyuubi. The engine type bindings to SESSION scope. This configuration is experimental. Currently, available configs are: <ul> <li>SPARK_SQL: specify this engine type will launch a Spark engine which can provide all the capacity of the Apache Spark. Note, it's a default engine type.</li> <li>FLINK_SQL: specify this engine type will launch a Flink engine which c [...]
-| kyuubi.engine.ui.retainedSessions | 200 | The number of SQL client sessions kept in the Kyuubi Query Engine web UI. [...]
-| kyuubi.engine.ui.retainedStatements | 200 | The number of statements kept in the Kyuubi Query Engine web UI. [...]
-| kyuubi.engine.ui.stop.enabled | true | When true, allows Kyuubi engine to be killed from the Spark Web UI. [...]
-| kyuubi.engine.user.isolated.spark.session | true | When set to false, if the engine is running in a group or server share level, all the JDBC/ODBC connections will be isolated against the user. Including the temporary views, function registries, SQL configuration, and the current database. Note that, it does not affect if the share level is connection or user. [...]
-| kyuubi.engine.user.isolated.spark.session.idle.interval | PT1M | The interval to check if the user-isolated Spark session is timeout. [...]
-| kyuubi.engine.user.isolated.spark.session.idle.timeout | PT6H | If kyuubi.engine.user.isolated.spark.session is false, we will release the Spark session if its corresponding user is inactive after this configured timeout. [...]
-| kyuubi.engine.yarn.submit.timeout | PT30S | The engine submit timeout for YARN application. [...]
+| Key | Default
|
[...]
+|----------------------------------------------------------|---------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
[...]
+| kyuubi.engine.chat.ernie.http.connect.timeout | PT2M
| The timeout[ms] for establishing the connection with the ernie bot
server. A timeout value of zero is interpreted as an infinite timeout.
[...]
+| kyuubi.engine.chat.ernie.http.proxy | <undefined>
| HTTP proxy url for API calling in ernie bot engine. e.g.
http://127.0.0.1:1088
[...]
+| kyuubi.engine.chat.ernie.http.socket.timeout | PT2M
| The timeout[ms] for waiting for data packets after ernie bot server
connection is established. A timeout value of zero is interpreted as an
infinite timeout.
[...]
+| kyuubi.engine.chat.ernie.model | completions
| ID of the model used in ernie bot. Available models are
completions_pro, ernie_bot_8k, completions and eb-instant[Model
overview](https://cloud.baidu.com/doc/WENXINWORKSHOP/s/6lp69is2a).
[...]
+| kyuubi.engine.chat.ernie.token | <undefined>
| The token to access ernie bot open API, which could be got at
https://cloud.baidu.com/doc/WENXINWORKSHOP/s/Ilkkrb0i5
[...]
+| kyuubi.engine.chat.extra.classpath | <undefined>
| The extra classpath for the Chat engine, for configuring the location
of the SDK and etc.
[...]
+| kyuubi.engine.chat.gpt.apiKey | <undefined>
| The key to access OpenAI open API, which could be got at
https://platform.openai.com/account/api-keys
[...]
+| kyuubi.engine.chat.gpt.http.connect.timeout | PT2M
| The timeout[ms] for establishing the connection with the Chat GPT
server. A timeout value of zero is interpreted as an infinite timeout.
[...]
+| kyuubi.engine.chat.gpt.http.proxy | <undefined>
| HTTP proxy url for API calling in Chat GPT engine. e.g.
http://127.0.0.1:1087
[...]
+| kyuubi.engine.chat.gpt.http.socket.timeout | PT2M
| The timeout[ms] for waiting for data packets after Chat GPT server
connection is established. A timeout value of zero is interpreted as an
infinite timeout.
[...]
+| kyuubi.engine.chat.gpt.model | gpt-3.5-turbo
| ID of the model used in ChatGPT. Available models refer to OpenAI's
[Model overview](https://platform.openai.com/docs/models/overview).
[...]
+| kyuubi.engine.chat.java.options | <undefined>
| The extra Java options for the Chat engine
[...]
+| kyuubi.engine.chat.memory | 1g
| The heap memory for the Chat engine
[...]
+| kyuubi.engine.chat.provider | ECHO
| The provider for the Chat engine. Candidates: <ul> <li>ECHO: simply
replies a welcome message.</li> <li>GPT: a.k.a ChatGPT, powered by OpenAI.</li>
<li>ERNIE: ErnieBot, powered by Baidu.</li></ul>
[...]
+| kyuubi.engine.connection.url.use.hostname | true
| (deprecated) When true, the engine registers with hostname to
zookeeper. When Spark runs on K8s with cluster mode, set to false to ensure
that server can connect to engine
[...]
+| kyuubi.engine.deregister.exception.classes
|| A comma-separated list of exception classes. If there is any
exception thrown, whose class matches the specified classes, the engine would
deregister itself.
[...]
+| kyuubi.engine.deregister.exception.messages
|| A comma-separated list of exception messages. If there is any
exception thrown, whose message or stacktrace matches the specified message
list, the engine would deregister itself.
[...]
+| kyuubi.engine.deregister.exception.ttl | PT30M
| Time to live(TTL) for exceptions pattern specified in
kyuubi.engine.deregister.exception.classes and
kyuubi.engine.deregister.exception.messages to deregister engines. Once the
total error count hits the kyuubi.engine.deregister.job.max.failures within the
TTL, an engine will deregister itself and wait for self-terminated. Otherwise,
we suppose that the engine has recovered from temporary failures. [...]
+| kyuubi.engine.deregister.job.max.failures | 4
| Number of failures of job before deregistering the engine.
[...]
+| kyuubi.engine.event.json.log.path |
file:///tmp/kyuubi/events | The location where all the engine events go for the
built-in JSON logger.<ul><li>Local Path: start with 'file://'</li><li>HDFS
Path: start with 'hdfs://'</li></ul>
[...]
+| kyuubi.engine.event.loggers | SPARK
| A comma-separated list of engine history loggers, where
engine/session/operation etc events go.<ul> <li>SPARK: the events will be
written to the Spark listener bus.</li> <li>JSON: the events will be written to
the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be
done</li> <li>CUSTOM: User-defined event handlers.</li></ul> Note that: Kyuubi
supports custom event handlers with the Jav [...]
+| kyuubi.engine.flink.application.jars | <undefined>
| A comma-separated list of the local jars to be shipped with the job
to the cluster. For example, SQL UDF jars. Only effective in yarn application
mode.
[...]
+| kyuubi.engine.flink.extra.classpath | <undefined>
| The extra classpath for the Flink SQL engine, for configuring the
location of hadoop client jars, etc. Only effective in yarn session mode.
[...]
+| kyuubi.engine.flink.initialize.sql | SHOW DATABASES
| The initialize sql for Flink engine. It fallback to
`kyuubi.engine.initialize.sql`.
[...]
+| kyuubi.engine.flink.java.options | <undefined>
| The extra Java options for the Flink SQL engine. Only effective in
yarn session mode.
[...]
+| kyuubi.engine.flink.memory | 1g
| The heap memory for the Flink SQL engine. Only effective in yarn
session mode.
[...]
+| kyuubi.engine.hive.event.loggers | JSON
| A comma-separated list of engine history loggers, where
engine/session/operation etc events go.<ul> <li>JSON: the events will be
written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to
be done</li> <li>CUSTOM: to be done.</li></ul>
[...]
+| kyuubi.engine.hive.extra.classpath | <undefined>
| The extra classpath for the Hive query engine, for configuring
location of the hadoop client jars and etc.
[...]
+| kyuubi.engine.hive.java.options | <undefined>
| The extra Java options for the Hive query engine
[...]
+| kyuubi.engine.hive.memory | 1g
| The heap memory for the Hive query engine
[...]
+| kyuubi.engine.initialize.sql | SHOW DATABASES
| SemiColon-separated list of SQL statements to be initialized in the
newly created engine before queries. i.e. use `SHOW DATABASES` to eagerly
active HiveClient. This configuration can not be used in JDBC url due to the
limitation of Beeline/JDBC driver.
[...]
+| kyuubi.engine.jdbc.connection.password | <undefined>
| The password is used for connecting to server
[...]
+| kyuubi.engine.jdbc.connection.propagateCredential | false
| Whether to use the session's user and password to connect to database
[...]
+| kyuubi.engine.jdbc.connection.properties
|| The additional properties are used for connecting to server
[...]
+| kyuubi.engine.jdbc.connection.provider | <undefined>
| A JDBC connection provider plugin for the Kyuubi Server to establish
a connection to the JDBC URL. The configuration value should be a subclass of
`org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider`. Kyuubi
provides the following built-in implementations: <li>doris: For establishing
Doris connections.</li> <li>mysql: For establishing MySQL connections.</li>
<li>phoenix: For establishing [...]
+| kyuubi.engine.jdbc.connection.url | <undefined>
| The server url that engine will connect to
[...]
+| kyuubi.engine.jdbc.connection.user | <undefined>
| The user is used for connecting to server
[...]
+| kyuubi.engine.jdbc.driver.class | <undefined>
| The driver class for JDBC engine connection
[...]
+| kyuubi.engine.jdbc.extra.classpath | <undefined>
| The extra classpath for the JDBC query engine, for configuring the
location of the JDBC driver and etc.
[...]
+| kyuubi.engine.jdbc.fetch.size | 1000
| The fetch size of JDBC engine
[...]
+| kyuubi.engine.jdbc.initialize.sql | SELECT 1
| SemiColon-separated list of SQL statements to be initialized in the
newly created engine before queries. i.e. use `SELECT 1` to eagerly active
JDBCClient.
[...]
+| kyuubi.engine.jdbc.java.options | <undefined>
| The extra Java options for the JDBC query engine
[...]
+| kyuubi.engine.jdbc.memory | 1g
| The heap memory for the JDBC query engine
[...]
+| kyuubi.engine.jdbc.session.initialize.sql
|| SemiColon-separated list of SQL statements to be initialized in the
newly created engine session before queries.
[...]
+| kyuubi.engine.jdbc.type | <undefined>
| The short name of JDBC type
[...]
+| kyuubi.engine.kubernetes.submit.timeout | PT30S
| The engine submit timeout for Kubernetes application.
[...]
+| kyuubi.engine.operation.convert.catalog.database.enabled | true
| When set to true, The engine converts the JDBC methods of set/get
Catalog and set/get Schema to the implementation of different engines
[...]
+| kyuubi.engine.operation.log.dir.root |
engine_operation_logs | Root directory for query operation log at
engine-side.
[...]
+| kyuubi.engine.pool.name | engine-pool
| The name of the engine pool.
[...]
+| kyuubi.engine.pool.selectPolicy | RANDOM
| The select policy of an engine from the corresponding engine pool
engine for a session. <ul><li>RANDOM - Randomly use the engine in the
pool</li><li>POLLING - Polling use the engine in the pool</li></ul>
[...]
+| kyuubi.engine.pool.size | -1
| The size of the engine pool. Note that, if the size is less than 1,
the engine pool will not be enabled; otherwise, the size of the engine pool
will be min(this, kyuubi.engine.pool.size.threshold).
[...]
+| kyuubi.engine.pool.size.threshold | 9
| This parameter is introduced as a server-side parameter controlling
the upper limit of the engine pool.
[...]
+| kyuubi.engine.session.initialize.sql
|| SemiColon-separated list of SQL statements to be initialized in the
newly created engine session before queries. This configuration can not be used
in JDBC url due to the limitation of Beeline/JDBC driver.
[...]
+| kyuubi.engine.share.level | USER
| Engines will be shared in different levels, available configs are:
<ul> <li>CONNECTION: engine will not be shared but only used by the current
client connection</li> <li>USER: engine will be shared by all sessions created
by a unique username, see also kyuubi.engine.share.level.subdomain</li>
<li>GROUP: the engine will be shared by all sessions created by all users
belong to the same primary group na [...]
+| kyuubi.engine.share.level.sub.domain | <undefined>
| (deprecated) - Using kyuubi.engine.share.level.subdomain instead
[...]
+| kyuubi.engine.share.level.subdomain | <undefined>
| Allow end-users to create a subdomain for the share level of an
engine. A subdomain is a case-insensitive string values that must be a valid
zookeeper subpath. For example, for the `USER` share level, an end-user can
share a certain engine within a subdomain, not for all of its clients.
End-users are free to create multiple engines in the `USER` share level. When
disable engine pool, use 'default' if [...]
+| kyuubi.engine.single.spark.session | false
| When set to true, this engine is running in a single session mode.
All the JDBC/ODBC connections share the temporary views, function registries,
SQL configuration and the current database.
[...]
+| kyuubi.engine.spark.event.loggers | SPARK
| A comma-separated list of engine loggers, where
engine/session/operation etc events go.<ul> <li>SPARK: the events will be
written to the Spark listener bus.</li> <li>JSON: the events will be written to
the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to be
done</li> <li>CUSTOM: to be done.</li></ul>
[...]
+| kyuubi.engine.spark.initialize.sql | SHOW DATABASES
| The initialize sql for Spark engine. It fallback to
`kyuubi.engine.initialize.sql`.
[...]
+| kyuubi.engine.spark.python.env.archive | <undefined>
| Portable Python env archive used for Spark engine Python language
mode.
[...]
+| kyuubi.engine.spark.python.env.archive.exec.path | bin/python
| The Python exec path under the Python env archive.
[...]
+| kyuubi.engine.spark.python.home.archive | <undefined>
| Spark archive containing the $SPARK_HOME/python directory, used to
initialize the session Python worker for Python language mode.
[...]
+| kyuubi.engine.submit.timeout | PT30S
| Period to tolerate the Driver Pod being ephemerally invisible after
submission. In some resource managers, e.g. K8s, the Driver Pod is not
visible immediately after `spark-submit` returns.
[...]
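As a rough sketch of the tolerance window described above (assumed polling logic, not Kyuubi's actual submit path), the `PT30S` default parses as an ISO-8601 duration and bounds how long the driver may stay invisible:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.function.BooleanSupplier;

// Illustrative sketch: keep checking driver visibility until the configured
// timeout (default PT30S, an ISO-8601 duration) elapses.
public class SubmitTimeout {
    static boolean waitVisible(BooleanSupplier visible, Duration timeout) {
        Instant deadline = Instant.now().plus(timeout);
        while (Instant.now().isBefore(deadline)) {
            if (visible.getAsBoolean()) {
                return true; // driver showed up within the tolerated window
            }
            // real code would sleep between polls instead of busy-waiting
        }
        return visible.getAsBoolean();
    }

    public static void main(String[] args) {
        Duration timeout = Duration.parse("PT30S");
        System.out.println(timeout.getSeconds());             // 30
        System.out.println(waitVisible(() -> true, timeout)); // true
    }
}
```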
+| kyuubi.engine.trino.connection.keystore.password | <undefined>
| The keystore password used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.keystore.path | <undefined>
| The keystore path used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.keystore.type | <undefined>
| The keystore type used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.password | <undefined>
| The password used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.truststore.password | <undefined>
| The truststore password used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.truststore.path | <undefined>
| The truststore path used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.connection.truststore.type | <undefined>
| The truststore type used for connecting to the Trino cluster
[...]
+| kyuubi.engine.trino.event.loggers | JSON
| A comma-separated list of engine history loggers, where
engine/session/operation etc events go.<ul> <li>JSON: the events will be
written to the location of kyuubi.engine.event.json.log.path</li> <li>JDBC: to
be done</li> <li>CUSTOM: to be done.</li></ul>
[...]
+| kyuubi.engine.trino.extra.classpath | <undefined>
| The extra classpath for the Trino query engine, for other
libraries that the Trino engine may need
[...]
+| kyuubi.engine.trino.java.options | <undefined>
| The extra Java options for the Trino query engine
[...]
+| kyuubi.engine.trino.memory | 1g
| The heap memory for the Trino query engine
[...]
+| kyuubi.engine.type | SPARK_SQL
| Specify the detailed engine supported by Kyuubi. The engine type
binds to the SESSION scope. This configuration is experimental. Currently,
available configs are: <ul> <li>SPARK_SQL: this engine type launches
a Spark engine providing all the capabilities of Apache Spark. Note,
it's the default engine type.</li> <li>FLINK_SQL: this engine type will
launch a Flink engine which c [...]
+| kyuubi.engine.ui.retainedSessions | 200
| The number of SQL client sessions kept in the Kyuubi Query Engine web
UI.
[...]
+| kyuubi.engine.ui.retainedStatements | 200
| The number of statements kept in the Kyuubi Query Engine web UI.
[...]
+| kyuubi.engine.ui.stop.enabled | true
| When true, allows Kyuubi engine to be killed from the Spark Web UI.
[...]
+| kyuubi.engine.user.isolated.spark.session | true
| When set to false, if the engine is running at a group or server
share level, the JDBC/ODBC connections of each user are isolated from other
users, including temporary views, function registries, SQL configuration, and
the current database. Note that this has no effect if the share level is
connection or user.
[...]
+| kyuubi.engine.user.isolated.spark.session.idle.interval | PT1M
| The interval to check whether the user-isolated Spark session has timed out.
[...]
+| kyuubi.engine.user.isolated.spark.session.idle.timeout | PT6H
| If kyuubi.engine.user.isolated.spark.session is false, we will
release the Spark session if its corresponding user is inactive after this
configured timeout.
[...]
+| kyuubi.engine.yarn.submit.timeout | PT30S
| The engine submit timeout for YARN applications.
[...]
### Event
diff --git
a/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
b/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
index 1a7ac9467..0d8a2c58e 100644
---
a/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
+++
b/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
@@ -19,3 +19,4 @@ org.apache.kyuubi.engine.jdbc.doris.DorisConnectionProvider
org.apache.kyuubi.engine.jdbc.mysql.MySQLConnectionProvider
org.apache.kyuubi.engine.jdbc.phoenix.PhoenixConnectionProvider
org.apache.kyuubi.engine.jdbc.postgresql.PostgreSQLConnectionProvider
+org.apache.kyuubi.engine.jdbc.starrocks.StarRocksConnectionProvider
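The service file above registers `StarRocksConnectionProvider` for `java.util.ServiceLoader`. A minimal sketch of the selection pattern behind such registrations (hypothetical classes; in Kyuubi the provider list comes from `ServiceLoader` and `canHandle` compares driver class names):

```java
import java.util.List;

// Sketch of the provider-selection pattern behind META-INF/services
// registration: iterate registered providers, pick the first whose
// canHandle accepts the configured driver class. Class names are hypothetical.
public class ProviderLookup {
    interface ConnectionProvider {
        String driverClass();
        default boolean canHandle(String driver) {
            return driverClass().equalsIgnoreCase(driver);
        }
    }

    // StarRocks speaks the MySQL wire protocol, hence the MySQL driver class.
    static class StarRocksProvider implements ConnectionProvider {
        public String driverClass() { return "com.mysql.cj.jdbc.Driver"; }
    }

    static class PostgresProvider implements ConnectionProvider {
        public String driverClass() { return "org.postgresql.Driver"; }
    }

    // In Kyuubi this list would come from ServiceLoader.load(...).
    static ConnectionProvider select(List<ConnectionProvider> providers, String driver) {
        return providers.stream()
                .filter(p -> p.canHandle(driver))
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException("no provider for " + driver));
    }

    public static void main(String[] args) {
        ConnectionProvider p = select(
                List.of(new PostgresProvider(), new StarRocksProvider()),
                "com.mysql.cj.jdbc.Driver");
        System.out.println(p.getClass().getSimpleName()); // StarRocksProvider
    }
}
```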
diff --git
a/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.dialect.JdbcDialect
b/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.dialect.JdbcDialect
index 9f97ab5d7..c5a75ec9c 100644
---
a/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.dialect.JdbcDialect
+++
b/externals/kyuubi-jdbc-engine/src/main/resources/META-INF/services/org.apache.kyuubi.engine.jdbc.dialect.JdbcDialect
@@ -19,3 +19,4 @@ org.apache.kyuubi.engine.jdbc.dialect.DorisDialect
org.apache.kyuubi.engine.jdbc.dialect.MySQLDialect
org.apache.kyuubi.engine.jdbc.dialect.PhoenixDialect
org.apache.kyuubi.engine.jdbc.dialect.PostgreSQLDialect
+org.apache.kyuubi.engine.jdbc.dialect.StarRocksDialect
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/dialect/StarRocksDialect.scala
similarity index 62%
copy from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
copy to
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/dialect/StarRocksDialect.scala
index 8dc930e48..aa4054eaa 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/dialect/StarRocksDialect.scala
@@ -14,18 +14,16 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.mysql
+package org.apache.kyuubi.engine.jdbc.dialect
-import org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
+import org.apache.kyuubi.engine.jdbc.schema.{JdbcTRowSetGenerator, SchemaHelper}
+import org.apache.kyuubi.engine.jdbc.starrocks.{StarRocksSchemaHelper, StarRocksTRowSetGenerator}
-class Mysql8ConnectionProvider extends JdbcConnectionProvider {
+class StarRocksDialect extends MySQLDialect {
+ override def name(): String = "starrocks"
- override val name: String = classOf[Mysql8ConnectionProvider].getSimpleName
+ override def getTRowSetGenerator(): JdbcTRowSetGenerator = new StarRocksTRowSetGenerator
- override val driverClass: String = "com.mysql.cj.jdbc.Driver"
-
- override def canHandle(providerClass: String): Boolean = {
- driverClass.equalsIgnoreCase(providerClass)
- }
+ override def getSchemaHelper(): SchemaHelper = new StarRocksSchemaHelper
}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
index 291e85d2d..c38bf7845 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
@@ -16,9 +16,9 @@
*/
package org.apache.kyuubi.engine.jdbc.doris
-import org.apache.kyuubi.engine.jdbc.mysql.Mysql8ConnectionProvider
+import org.apache.kyuubi.engine.jdbc.mysql.MySQL8ConnectionProvider
-class DorisConnectionProvider extends Mysql8ConnectionProvider {
+class DorisConnectionProvider extends MySQL8ConnectionProvider {
override val name: String = classOf[DorisConnectionProvider].getSimpleName
}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQL8ConnectionProvider.scala
similarity index 78%
copy from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
copy to
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQL8ConnectionProvider.scala
index 8dc930e48..563d5758b 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQL8ConnectionProvider.scala
@@ -18,14 +18,18 @@ package org.apache.kyuubi.engine.jdbc.mysql
import org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
-class Mysql8ConnectionProvider extends JdbcConnectionProvider {
+class MySQL8ConnectionProvider extends JdbcConnectionProvider {
- override val name: String = classOf[Mysql8ConnectionProvider].getSimpleName
+ override val name: String = classOf[MySQL8ConnectionProvider].getSimpleName
- override val driverClass: String = "com.mysql.cj.jdbc.Driver"
+ override val driverClass: String = MySQL8ConnectionProvider.driverClass
override def canHandle(providerClass: String): Boolean = {
driverClass.equalsIgnoreCase(providerClass)
}
}
+
+object MySQL8ConnectionProvider {
+ val driverClass: String = "com.mysql.cj.jdbc.Driver"
+}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
index 249ea0c31..bd57d1f53 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
@@ -16,7 +16,7 @@
*/
package org.apache.kyuubi.engine.jdbc.mysql
-class MySQLConnectionProvider extends Mysql8ConnectionProvider {
+class MySQLConnectionProvider extends MySQL8ConnectionProvider {
override val name: String = classOf[MySQLConnectionProvider].getSimpleName
}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksConnectionProvider.scala
similarity index 75%
copy from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
copy to
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksConnectionProvider.scala
index 291e85d2d..09b7efb3f 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksConnectionProvider.scala
@@ -14,11 +14,11 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.doris
+package org.apache.kyuubi.engine.jdbc.starrocks
-import org.apache.kyuubi.engine.jdbc.mysql.Mysql8ConnectionProvider
+import org.apache.kyuubi.engine.jdbc.mysql.MySQL8ConnectionProvider
-class DorisConnectionProvider extends Mysql8ConnectionProvider {
+class StarRocksConnectionProvider extends MySQL8ConnectionProvider {
- override val name: String = classOf[DorisConnectionProvider].getSimpleName
+ override val name: String = classOf[StarRocksConnectionProvider].getSimpleName
}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSchemaHelper.scala
similarity index 70%
copy from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
copy to
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSchemaHelper.scala
index 291e85d2d..18c2f5112 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSchemaHelper.scala
@@ -14,11 +14,14 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.doris
+package org.apache.kyuubi.engine.jdbc.starrocks
-import org.apache.kyuubi.engine.jdbc.mysql.Mysql8ConnectionProvider
+import org.apache.kyuubi.engine.jdbc.schema.SchemaHelper
+import org.apache.kyuubi.shaded.hive.service.rpc.thrift._
-class DorisConnectionProvider extends Mysql8ConnectionProvider {
+class StarRocksSchemaHelper extends SchemaHelper {
- override val name: String = classOf[DorisConnectionProvider].getSimpleName
+ override def tinyIntToTTypeId: TTypeId = TTypeId.INT_TYPE
+
+ override def smallIntToTTypeId: TTypeId = TTypeId.INT_TYPE
}
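`StarRocksSchemaHelper` above widens StarRocks `TINYINT` and `SMALLINT` result columns to Thrift `INT_TYPE`. A standalone sketch of that mapping (the enum stands in for the real `TTypeId` from the Hive RPC Thrift classes; entries beyond the two overrides above are illustrative):

```java
import java.util.Map;

// Sketch of the type-widening rule in StarRocksSchemaHelper: TINYINT and
// SMALLINT are reported to Thrift clients as INT.
public class TypeWidening {
    enum TTypeId { TINYINT_TYPE, SMALLINT_TYPE, INT_TYPE, BIGINT_TYPE }

    static final Map<String, TTypeId> STARROCKS_TO_THRIFT = Map.of(
            "tinyint", TTypeId.INT_TYPE,   // widened, per the override above
            "smallint", TTypeId.INT_TYPE,  // widened, per the override above
            "int", TTypeId.INT_TYPE,
            "bigint", TTypeId.BIGINT_TYPE);

    public static void main(String[] args) {
        System.out.println(STARROCKS_TO_THRIFT.get("tinyint"));  // INT_TYPE
        System.out.println(STARROCKS_TO_THRIFT.get("bigint"));   // BIGINT_TYPE
    }
}
```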
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksTRowSetGenerator.scala
similarity index 81%
copy from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
copy to
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksTRowSetGenerator.scala
index 249ea0c31..736ce7664 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/MySQLConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksTRowSetGenerator.scala
@@ -14,9 +14,8 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.mysql
+package org.apache.kyuubi.engine.jdbc.starrocks
-class MySQLConnectionProvider extends Mysql8ConnectionProvider {
+import org.apache.kyuubi.engine.jdbc.mysql.MySQLTRowSetGenerator
- override val name: String = classOf[MySQLConnectionProvider].getSimpleName
-}
+class StarRocksTRowSetGenerator extends MySQLTRowSetGenerator {}
diff --git
a/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationSuite.scala
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationSuite.scala
new file mode 100644
index 000000000..575467143
--- /dev/null
+++
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationSuite.scala
@@ -0,0 +1,261 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.kyuubi.engine.jdbc.starrocks
+
+import java.sql.ResultSet
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.kyuubi.operation.HiveJDBCTestHelper
+import org.apache.kyuubi.operation.meta.ResultSetSchemaConstant._
+
+abstract class StarRocksOperationSuite extends WithStarRocksEngine with HiveJDBCTestHelper {
+ test("starrocks - get tables") {
+ case class Table(catalog: String, schema: String, tableName: String, tableType: String)
+
+ withJdbcStatement() { statement =>
+ val meta = statement.getConnection.getMetaData
+ val resultBuffer = ArrayBuffer[Table]()
+
+ var tables = meta.getTables(null, null, null, null)
+ while (tables.next()) {
+ resultBuffer +=
+ Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "information_schema", "tables", "SYSTEM VIEW")))
+ assert(resultBuffer.contains(Table("def", "information_schema", "views", "SYSTEM VIEW")))
+ resultBuffer.clear()
+
+ statement.execute("create database if not exists db1")
+ statement.execute("create table db1.test1(id bigint)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+ statement.execute("create table db1.test2(id bigint)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+
+ statement.execute("create database if not exists db2")
+ statement.execute("create table db2.test1(id bigint)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+ statement.execute("create table db2.test2(id bigint)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+
+ statement.execute("create view db1.view1 (k1) as select id from db1.test1")
+
+ tables = meta.getTables(null, "db1", "test1", Array("BASE TABLE"))
+ while (tables.next()) {
+ val table = Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ assert(table == Table("def", "db1", "test1", "BASE TABLE"))
+ }
+
+ tables = meta.getTables(null, "db1", null, null)
+ while (tables.next()) {
+ resultBuffer += Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "db1", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db1", "test2", "BASE TABLE")))
+ resultBuffer.clear()
+
+ tables = meta.getTables(null, null, "test1", null)
+ while (tables.next()) {
+ resultBuffer += Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "db1", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db2", "test1", "BASE TABLE")))
+ resultBuffer.clear()
+
+ tables = meta.getTables(null, "db%", "test1", null)
+ while (tables.next()) {
+ resultBuffer += Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "db1", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db2", "test1", "BASE TABLE")))
+ resultBuffer.clear()
+
+ tables = meta.getTables(null, "db2", "test%", null)
+ while (tables.next()) {
+ resultBuffer += Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "db2", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db2", "test2", "BASE TABLE")))
+ resultBuffer.clear()
+
+ tables = meta.getTables(null, "fake_db", "test1", null)
+ assert(!tables.next())
+
+ tables = meta.getTables(null, null, null, Array("VIEW"))
+ while (tables.next()) {
+ val table = Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ assert(table == Table("def", "db1", "view1", "VIEW"))
+ }
+
+ tables = meta.getTables(null, null, null, Array("VIEW", "BASE TABLE"))
+ while (tables.next()) {
+ resultBuffer += Table(
+ tables.getString(TABLE_CATALOG),
+ tables.getString(TABLE_SCHEMA),
+ tables.getString(TABLE_NAME),
+ tables.getString(TABLE_TYPE))
+ }
+ assert(resultBuffer.contains(Table("def", "db1", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db1", "test2", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db2", "test1", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db2", "test2", "BASE TABLE")))
+ assert(resultBuffer.contains(Table("def", "db1", "view1", "VIEW")))
+ resultBuffer.clear()
+
+ statement.execute("drop view db1.view1")
+ statement.execute("drop table db1.test1")
+ statement.execute("drop table db1.test2")
+ statement.execute("drop table db2.test1")
+ statement.execute("drop table db2.test2")
+ statement.execute("drop database db1")
+ statement.execute("drop database db2")
+ }
+ }
+
+ test("starrocks - get columns") {
+ case class Column(tableSchema: String, tableName: String, columnName: String)
+
+ def buildColumn(resultSet: ResultSet): Column = {
+ val schema = resultSet.getString(TABLE_SCHEMA)
+ val tableName = resultSet.getString(TABLE_NAME)
+ val columnName = resultSet.getString(COLUMN_NAME)
+ val column = Column(schema, tableName, columnName)
+ column
+ }
+
+ withJdbcStatement() { statement =>
+ val metadata = statement.getConnection.getMetaData
+ statement.execute("create database if not exists db1")
+ statement.execute("create table if not exists db1.test1" +
+ "(id bigint, str1 string, str2 string, age int)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+ statement.execute("create table if not exists db1.test2" +
+ "(id bigint, str1 string, str2 string, age int)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+
+ statement.execute("create database if not exists db2")
+
+ statement.execute("create table if not exists db2.test1" +
+ "(id bigint, str1 string, str2 string, age int)" +
+ "ENGINE=OLAP DISTRIBUTED BY HASH(`id`) BUCKETS 32 " +
+ "PROPERTIES ('replication_num' = '1')")
+
+ val resultBuffer = ArrayBuffer[Column]()
+ val resultSet1 = metadata.getColumns(null, "db1", null, null)
+ while (resultSet1.next()) {
+ val column = buildColumn(resultSet1)
+ resultBuffer += column
+ }
+
+ assert(resultBuffer.contains(Column("db1", "test1", "id")))
+ assert(resultBuffer.contains(Column("db1", "test1", "str1")))
+ assert(resultBuffer.contains(Column("db1", "test1", "str2")))
+ assert(resultBuffer.contains(Column("db1", "test1", "age")))
+
+ assert(resultBuffer.contains(Column("db1", "test2", "id")))
+ assert(resultBuffer.contains(Column("db1", "test2", "str1")))
+ assert(resultBuffer.contains(Column("db1", "test2", "str2")))
+ assert(resultBuffer.contains(Column("db1", "test2", "age")))
+
+ resultBuffer.clear()
+
+ val resultSet2 = metadata.getColumns(null, null, "test1", null)
+ while (resultSet2.next()) {
+ val column = buildColumn(resultSet2)
+ resultBuffer += column
+ }
+
+ assert(resultBuffer.contains(Column("db1", "test1", "id")))
+ assert(resultBuffer.contains(Column("db1", "test1", "str1")))
+ assert(resultBuffer.contains(Column("db1", "test1", "str2")))
+ assert(resultBuffer.contains(Column("db1", "test1", "age")))
+
+ assert(resultBuffer.contains(Column("db2", "test1", "id")))
+ assert(resultBuffer.contains(Column("db2", "test1", "str1")))
+ assert(resultBuffer.contains(Column("db2", "test1", "str2")))
+ assert(resultBuffer.contains(Column("db2", "test1", "age")))
+
+ resultBuffer.clear()
+
+ val resultSet3 = metadata.getColumns(null, null, null, "age")
+ while (resultSet3.next()) {
+ val column = buildColumn(resultSet3)
+ resultBuffer += column
+ }
+
+ assert(resultBuffer.contains(Column("db1", "test1", "age")))
+ assert(resultBuffer.contains(Column("db1", "test2", "age")))
+ assert(resultBuffer.contains(Column("db2", "test1", "age")))
+
+ resultBuffer.clear()
+
+ val resultSet4 = metadata.getColumns(null, "d%1", "t%1", "str%")
+ while (resultSet4.next()) {
+ val column = buildColumn(resultSet4)
+ resultBuffer += column
+ }
+
+ assert(resultBuffer.contains(Column("db1", "test1", "str1")))
+ assert(resultBuffer.contains(Column("db1", "test1", "str2")))
+
+ resultBuffer.clear()
+
+ val resultSet5 = metadata.getColumns(null, "d%1", "t%1", "fake")
+ assert(!resultSet5.next())
+
+ statement.execute("drop table db1.test1")
+ statement.execute("drop table db1.test2")
+ statement.execute("drop database db1")
+ statement.execute("drop table db2.test1")
+ statement.execute("drop database db2")
+ }
+ }
+}
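The `getTables`/`getColumns` calls in the suite above rely on JDBC metadata search-pattern semantics: `%` matches any substring and `_` matches exactly one character. A standalone sketch of that matching rule (hypothetical helper, mirroring the rule rather than any driver's internals):

```java
// Sketch of JDBC metadata search-pattern semantics used by getTables /
// getColumns: '%' matches any substring, '_' matches one character.
public class JdbcPattern {
    static boolean matches(String pattern, String value) {
        // translate the JDBC pattern into an equivalent regular expression;
        // literal dots are escaped before '%' introduces regex metacharacters
        String regex = pattern
                .replace(".", "\\.")
                .replace("%", ".*")
                .replace("_", ".");
        return value.matches(regex);
    }

    public static void main(String[] args) {
        System.out.println(matches("db%", "db1"));   // true
        System.out.println(matches("t%1", "test1")); // true
        System.out.println(matches("str%", "age"));  // false
    }
}
```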
diff --git
a/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationWithEngineSuite.scala
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationWithEngineSuite.scala
new file mode 100644
index 000000000..acbc028f8
--- /dev/null
+++
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksOperationWithEngineSuite.scala
@@ -0,0 +1,78 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.kyuubi.engine.jdbc.starrocks
+
+import org.apache.kyuubi.config.KyuubiConf
+import org.apache.kyuubi.engine.jdbc.connection.ConnectionProvider
+import org.apache.kyuubi.operation.HiveJDBCTestHelper
+import org.apache.kyuubi.shaded.hive.service.rpc.thrift._
+
+class StarRocksOperationWithEngineSuite extends StarRocksOperationSuite with HiveJDBCTestHelper {
+
+ override protected def jdbcUrl: String = jdbcConnectionUrl
+
+ test("starrocks - test for Jdbc engine getInfo") {
+ val metaData = ConnectionProvider.create(kyuubiConf).getMetaData
+
+ withSessionConf(Map(KyuubiConf.SERVER_INFO_PROVIDER.key -> "ENGINE"))()() {
+ withSessionHandle { (client, handle) =>
+ val req = new TGetInfoReq()
+ req.setSessionHandle(handle)
+ req.setInfoType(TGetInfoType.CLI_DBMS_NAME)
+ assert(client.GetInfo(req).getInfoValue.getStringValue == metaData.getDatabaseProductName)
+
+ val req2 = new TGetInfoReq()
+ req2.setSessionHandle(handle)
+ req2.setInfoType(TGetInfoType.CLI_DBMS_VER)
+ assert(
+ client.GetInfo(req2).getInfoValue.getStringValue == metaData.getDatabaseProductVersion)
+
+ val req3 = new TGetInfoReq()
+ req3.setSessionHandle(handle)
+ req3.setInfoType(TGetInfoType.CLI_MAX_COLUMN_NAME_LEN)
+ assert(client.GetInfo(req3).getInfoValue.getLenValue == metaData.getMaxColumnNameLength)
+
+ val req4 = new TGetInfoReq()
+ req4.setSessionHandle(handle)
+ req4.setInfoType(TGetInfoType.CLI_MAX_SCHEMA_NAME_LEN)
+ assert(client.GetInfo(req4).getInfoValue.getLenValue == metaData.getMaxSchemaNameLength)
+
+ val req5 = new TGetInfoReq()
+ req5.setSessionHandle(handle)
+ req5.setInfoType(TGetInfoType.CLI_MAX_TABLE_NAME_LEN)
+ assert(client.GetInfo(req5).getInfoValue.getLenValue == metaData.getMaxTableNameLength)
+ }
+ }
+ }
+
+ test("starrocks - JDBC ExecuteStatement operation should contain operationLog") {
+ withSessionHandle { (client, handle) =>
+ val tExecuteStatementReq = new TExecuteStatementReq()
+ tExecuteStatementReq.setSessionHandle(handle)
+ tExecuteStatementReq.setStatement("SELECT 1")
+ val tExecuteStatementResp = client.ExecuteStatement(tExecuteStatementReq)
+
+ val tFetchResultsReq = new TFetchResultsReq()
+ tFetchResultsReq.setOperationHandle(tExecuteStatementResp.getOperationHandle)
+ tFetchResultsReq.setFetchType(1)
+ tFetchResultsReq.setMaxRows(1)
+
+ val tFetchResultsResp = client.FetchResults(tFetchResultsReq)
+ assert(tFetchResultsResp.getStatus.getStatusCode === TStatusCode.SUCCESS_STATUS)
+ }
+ }
+}
diff --git
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSessionSuite.scala
similarity index 55%
rename from
externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
rename to
externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSessionSuite.scala
index 8dc930e48..f1c11c967 100644
---
a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/mysql/Mysql8ConnectionProvider.scala
+++
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksSessionSuite.scala
@@ -14,18 +14,26 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.mysql
+package org.apache.kyuubi.engine.jdbc.starrocks
-import org.apache.kyuubi.engine.jdbc.connection.JdbcConnectionProvider
+import org.apache.kyuubi.operation.HiveJDBCTestHelper
-class Mysql8ConnectionProvider extends JdbcConnectionProvider {
+class StarRocksSessionSuite extends WithStarRocksEngine with HiveJDBCTestHelper {
- override val name: String = classOf[Mysql8ConnectionProvider].getSimpleName
-
- override val driverClass: String = "com.mysql.cj.jdbc.Driver"
-
- override def canHandle(providerClass: String): Boolean = {
- driverClass.equalsIgnoreCase(providerClass)
+ test("starrocks - test session") {
+ withJdbcStatement() { statement =>
+ val resultSet = statement.executeQuery(
+ "select '1' as id")
+ val metadata = resultSet.getMetaData
+ for (i <- 1 to metadata.getColumnCount) {
+ assert(metadata.getColumnName(i) == "id")
+ }
+ while (resultSet.next()) {
+ val id = resultSet.getObject(1)
+ assert(id == "1")
+ }
+ }
}
+ override protected def jdbcUrl: String = jdbcConnectionUrl
}
diff --git
a/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksStatementSuite.scala
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksStatementSuite.scala
new file mode 100644
index 000000000..596701d7e
--- /dev/null
+++
b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/StarRocksStatementSuite.scala
@@ -0,0 +1,105 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.kyuubi.engine.jdbc.starrocks
+
+import java.sql.{Date, Timestamp}
+
+import org.apache.kyuubi.operation.HiveJDBCTestHelper
+
+class StarRocksStatementSuite extends WithStarRocksEngine with HiveJDBCTestHelper {
+
+ test("starrocks - test select") {
+ withJdbcStatement("test1") { statement =>
+ statement.execute("create database if not exists db1")
+ statement.execute("use db1")
+ statement.execute(
+ """CREATE TABLE db1.test1(id bigint, name varchar(255), age int)
+ | ENGINE=OLAP
+ | DISTRIBUTED BY HASH(`id`)
+ | PROPERTIES ('replication_num' = '1', 'in_memory' = 'true')
+ |""".stripMargin)
+ statement.execute("insert into db1.test1 values(1, 'a', 11)")
+
+ val resultSet1 = statement.executeQuery("select * from db1.test1")
+ while (resultSet1.next()) {
+ val id = resultSet1.getObject(1)
+ assert(id == 1)
+ val name = resultSet1.getObject(2)
+ assert(name == "a")
+ val age = resultSet1.getObject(3)
+ assert(age == 11)
+ }
+ }
+ }
+
+ test("starrocks - test types") {
+ withJdbcStatement("test1") { statement =>
+ statement.execute("create database if not exists db1")
+ statement.execute("use db1")
+ statement.execute(
+ """ CREATE TABLE db1.type_test(
+ | id bigint,
+ | tiny_col tinyint,
+ | smallint_col smallint,
+ | int_col int,
+ | bigint_col bigint,
+ | largeint_col largeint,
+ | decimal_col decimal(27, 9),
+ | date_col date,
+ | datetime_col datetime,
+ | char_col char,
+ | varchar_col varchar(255),
+ | string_col string,
+ | boolean_col boolean,
+ | double_col double,
+ | float_col float)
+ | ENGINE=OLAP
+ | DISTRIBUTED BY HASH(`id`)
+ | PROPERTIES ('replication_num' = '1', 'in_memory' = 'true')
+ |""".stripMargin)
+ statement.execute(
+ """ insert into db1.type_test
+ | (id, tiny_col, smallint_col, int_col, bigint_col, largeint_col, decimal_col,
+ | date_col, datetime_col, char_col, varchar_col, string_col, boolean_col,
+ | double_col, float_col)
+ | VALUES (1, 2, 3, 4, 5, 6, 7.7,
+ | '2022-05-08', '2022-05-08 17:47:45', 'a', 'Hello', 'Hello, Kyuubi', true,
+ | 8.8, 9.9)
+ |""".stripMargin)
+ val resultSet1 = statement.executeQuery("select * from db1.type_test")
+ while (resultSet1.next()) {
+ assert(resultSet1.getObject(1) == 1)
+ assert(resultSet1.getObject(2) == 2)
+ assert(resultSet1.getObject(3) == 3)
+ assert(resultSet1.getObject(4) == 4)
+ assert(resultSet1.getObject(5) == 5)
+ assert(resultSet1.getObject(6) == "6")
+ assert(resultSet1.getObject(7) == new java.math.BigDecimal("7.700000000"))
+ assert(resultSet1.getObject(8) == Date.valueOf("2022-05-08"))
+ assert(resultSet1.getObject(9) == Timestamp.valueOf("2022-05-08 17:47:45"))
+ assert(resultSet1.getObject(10) == "a")
+ assert(resultSet1.getObject(11) == "Hello")
+ assert(resultSet1.getObject(12) == "Hello, Kyuubi")
+ assert(resultSet1.getObject(13) == true)
+ assert(resultSet1.getObject(14) == 8.8)
+ assert(resultSet1.getObject(15) == 9.9)
+ }
+ }
+ }
+
+ override protected def jdbcUrl: String = jdbcConnectionUrl
+}
diff --git a/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksContainer.scala b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksContainer.scala
new file mode 100644
index 000000000..9c229a636
--- /dev/null
+++ b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksContainer.scala
@@ -0,0 +1,57 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.kyuubi.engine.jdbc.starrocks
+
+import java.time.Duration
+
+import com.dimafeng.testcontainers.GenericContainer
+import org.testcontainers.containers.wait.strategy.{Wait, WaitAllStrategy}
+import org.testcontainers.containers.wait.strategy.Wait._
+
+import org.apache.kyuubi.engine.jdbc.WithJdbcServerContainer
+
+trait WithStarRocksContainer extends WithJdbcServerContainer {
+
+ private val starrocksDockerImage = "starrocks/allin1-ubuntu:3.1.6"
+
+ private val STARROCKS_FE_MYSQL_PORT = 9030
+ private val STARROCKS_FE_HTTP_PORT = 8030
+ private val STARROCKS_BE_THRIFT_PORT = 9060
+ private val STARROCKS_BE_HTTP_PORT = 8040
+ private val STARROCKS_BE_HEARTBEAT_PORT = 9050
+ private val ports = Seq(
+ STARROCKS_FE_MYSQL_PORT,
+ STARROCKS_FE_HTTP_PORT,
+ STARROCKS_BE_THRIFT_PORT,
+ STARROCKS_BE_HTTP_PORT,
+ STARROCKS_BE_HEARTBEAT_PORT)
+
+ override val containerDef: GenericContainer.Def[GenericContainer] = GenericContainer.Def(
+ dockerImage = starrocksDockerImage,
+ exposedPorts = ports,
+ waitStrategy = new WaitAllStrategy().withStartupTimeout(Duration.ofMinutes(10))
+ .withStrategy(Wait.forListeningPorts(ports: _*))
+ .withStrategy(forLogMessage(".*broker service already added into FE service.*", 1))
+ .withStrategy(
+ forLogMessage(".*Enjoy the journal to StarRocks blazing-fast lake-house engine.*", 1)))
+
+ protected def feJdbcUrl: String = withContainers { container =>
+ val queryServerHost: String = container.host
+ val queryServerPort: Int = container.mappedPort(STARROCKS_FE_MYSQL_PORT)
+ s"jdbc:mysql://$queryServerHost:$queryServerPort"
+ }
+}
diff --git a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksEngine.scala
similarity index 52%
copy from externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
copy to externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksEngine.scala
index 291e85d2d..6423186c0 100644
--- a/externals/kyuubi-jdbc-engine/src/main/scala/org/apache/kyuubi/engine/jdbc/doris/DorisConnectionProvider.scala
+++ b/externals/kyuubi-jdbc-engine/src/test/scala/org/apache/kyuubi/engine/jdbc/starrocks/WithStarRocksEngine.scala
@@ -14,11 +14,23 @@
* See the License for the specific language governing permissions and
* limitations under the License.
*/
-package org.apache.kyuubi.engine.jdbc.doris
+package org.apache.kyuubi.engine.jdbc.starrocks
-import org.apache.kyuubi.engine.jdbc.mysql.Mysql8ConnectionProvider
+import org.apache.kyuubi.config.KyuubiConf._
+import org.apache.kyuubi.engine.jdbc.WithJdbcEngine
+import org.apache.kyuubi.engine.jdbc.mysql.MySQL8ConnectionProvider
-class DorisConnectionProvider extends Mysql8ConnectionProvider {
+trait WithStarRocksEngine extends WithJdbcEngine with WithStarRocksContainer {
- override val name: String = classOf[DorisConnectionProvider].getSimpleName
+ private val user = "root"
+ private val password = ""
+
+ override def withKyuubiConf: Map[String, String] = Map(
+ ENGINE_SHARE_LEVEL.key -> "SERVER",
+ ENGINE_JDBC_CONNECTION_URL.key -> feJdbcUrl,
+ ENGINE_JDBC_CONNECTION_USER.key -> user,
+ ENGINE_JDBC_CONNECTION_PASSWORD.key -> password,
+ ENGINE_TYPE.key -> "jdbc",
+ ENGINE_JDBC_SHORT_NAME.key -> "starrocks",
+ ENGINE_JDBC_DRIVER_CLASS.key -> MySQL8ConnectionProvider.driverClass)
}
diff --git a/kyuubi-common/src/main/scala/org/apache/kyuubi/config/KyuubiConf.scala b/kyuubi-common/src/main/scala/org/apache/kyuubi/config/KyuubiConf.scala
index 97800e157..fd01e718c 100644
--- a/kyuubi-common/src/main/scala/org/apache/kyuubi/config/KyuubiConf.scala
+++ b/kyuubi-common/src/main/scala/org/apache/kyuubi/config/KyuubiConf.scala
@@ -2100,7 +2100,7 @@ object KyuubiConf {
" all the capacity of the Hive Server2.</li>" +
" <li>JDBC: specify this engine type will launch a JDBC engine which can forward " +
" queries to the database system through the certain JDBC driver, " +
- " for now, it supports Doris and Phoenix.</li>" +
+ " for now, it supports Doris, MySQL, Phoenix, PostgreSQL and StarRocks.</li>" +
" <li>CHAT: specify this engine type will launch a Chat engine.</li>" +
"</ul>")
.version("1.4.0")
@@ -2893,7 +2893,8 @@ object KyuubiConf {
"<li>doris: For establishing Doris connections.</li> " +
"<li>mysql: For establishing MySQL connections.</li> " +
"<li>phoenix: For establishing Phoenix connections.</li> " +
- "<li>postgresql: For establishing PostgreSQL connections.</li>")
+ "<li>postgresql: For establishing PostgreSQL connections.</li>" +
+ "<li>starrocks: For establishing StarRocks connections.</li>")
.version("1.6.0")
.stringConf
.transform {
@@ -2905,6 +2906,8 @@ object KyuubiConf {
"org.apache.kyuubi.engine.jdbc.phoenix.PhoenixConnectionProvider"
case "PostgreSQL" | "postgresql" | "PostgreSQLConnectionProvider" =>
"org.apache.kyuubi.engine.jdbc.postgresql.PostgreSQLConnectionProvider"
+ case "StarRocks" | "starrocks" | "StarRocksConnectionProvider" =>
+ "org.apache.kyuubi.engine.jdbc.starrocks.StarRocksConnectionProvider"
case other => other
}
.createOptional
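
For reference, once this patch is in place the new `starrocks` short name can be wired up in `kyuubi-defaults.conf` roughly as follows. This is a sketch derived from the config keys set in `WithStarRocksEngine` above; the host and FE MySQL port are placeholders for your own deployment, and it assumes the MySQL Connector/J driver (the `com.mysql.cj.jdbc.Driver` class used by `Mysql8ConnectionProvider`) is on the engine classpath:

```properties
# Hypothetical kyuubi-defaults.conf snippet; replace starrocks-fe-host:9030
# with the address of your StarRocks FE MySQL-protocol endpoint.
kyuubi.engine.type=jdbc
kyuubi.engine.jdbc.type=starrocks
kyuubi.engine.jdbc.connection.url=jdbc:mysql://starrocks-fe-host:9030
kyuubi.engine.jdbc.connection.user=root
kyuubi.engine.jdbc.connection.password=
kyuubi.engine.jdbc.driver.class=com.mysql.cj.jdbc.Driver
```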