lsyldliu commented on code in PR #20630:
URL: https://github.com/apache/flink/pull/20630#discussion_r965450158
##########
docs/content/docs/dev/table/sql/jar.md:
##########
@@ -68,22 +70,24 @@ Flink SQL> REMOVE JAR '/path/hello.jar';
ADD JAR '<path_to_filename>.jar'
```
-Currently it only supports to add the local jar into the session classloader.
+Currently it supports to add the jar locates in a local or remote [file system]({{< ref "docs/deployment/filesystems/overview" >}}) into the session classloader.
-## REMOVE JAR
+## SHOW JARS
```sql
-REMOVE JAR '<path_to_filename>.jar'
+SHOW JARS
```
-Currently it only supports to remove the jar that is added by the [`ADD JAR`](#add-jar) statements.
+Show all added jars in the session classloader which are added by [`ADD JAR`](#add-jar) and [`USING JAR`]({{< ref "docs/dev/table/sql/create" >}}#create-function) statements.
Review Comment:
After reading the Spark [ADD JAR](https://spark.apache.org/docs/latest/sql-ref-syntax-aux-resource-mgmt-add-jar.html) and [LIST JAR](https://spark.apache.org/docs/latest/sql-ref-syntax-aux-resource-mgmt-list-jar.html) documentation, I think you are right: their docs say LIST JAR only lists the jars added by the ADD JAR statement, so the semantics appear consistent. However, when I actually tried the Spark SQL and Hive CLIs, I found that LIST JARS also shows the jars added by `CREATE FUNCTION ... USING JAR` statements, so the documentation and the actual behavior are not consistent there. Regarding our documentation, I think we can follow Spark.
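
As a minimal sketch of the behavior described above (the jar paths, function name, and class name below are made up for illustration):

```sql
-- Hypothetical Flink SQL session; paths and names are placeholders.
ADD JAR 'hdfs://namenode:9000/jars/udf1.jar';

CREATE FUNCTION my_func AS 'com.example.MyFunc' USING JAR '/tmp/udf2.jar';

-- Following the Spark/Hive behavior observed above, SHOW JARS would list
-- both udf1.jar (added via ADD JAR) and udf2.jar (added via USING JAR).
SHOW JARS;
```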