wuchong commented on a change in pull request #12427:
URL: https://github.com/apache/flink/pull/12427#discussion_r436452417
##########
File path:
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/JdbcBatchStatementExecutor.java
##########
@@ -34,7 +34,7 @@
/**
* Open the writer by JDBC Connection. It can create Statement from Connection.
Review comment:
Please update the Javadoc.
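For illustration, assuming the PR renames this method to `prepareStatements(Connection)` (the new name is not visible in this hunk), the updated Javadoc could read roughly as follows:

```java
/**
 * Prepares the statements used to buffer and execute records, using the
 * given JDBC connection. (Sketch only; align with the actual method name
 * and semantics introduced by this PR.)
 */
void prepareStatements(Connection connection) throws SQLException;
```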
##########
File path:
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/SimpleBatchStatementExecutor.java
##########
@@ -30,24 +35,27 @@
/**
* A {@link JdbcBatchStatementExecutor} that executes supplied statement for given the records (without any pre-processing).
*/
-class SimpleBatchStatementExecutor<T, V> implements JdbcBatchStatementExecutor<T> {
+@Internal
+public class SimpleBatchStatementExecutor<T, V> implements JdbcBatchStatementExecutor<T> {
Review comment:
Not necessary to declare it public?
##########
File path:
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/options/JdbcOptions.java
##########
@@ -34,6 +34,8 @@
private static final long serialVersionUID = 1L;
+ public static final int CONNECTION_CHECK_TIMEOUT = 60;
Review comment:
Better to describe the time unit in the variable name: `CONNECTION_CHECK_TIMEOUT_SECONDS`.
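A minimal sketch of the rename (the call site below is an assumption for illustration; `java.sql.Connection#isValid(int)` takes its timeout in seconds, so carrying the unit in the name keeps the two consistent):

```java
// Name carries the unit, so call sites read unambiguously.
public static final int CONNECTION_CHECK_TIMEOUT_SECONDS = 60;

// Hypothetical call site: Connection#isValid expects a timeout in seconds.
if (!connection.isValid(CONNECTION_CHECK_TIMEOUT_SECONDS)) {
    connection = reestablishConnection(); // hypothetical helper
}
```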
##########
File path:
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/JdbcBatchStatementExecutor.java
##########
@@ -46,7 +46,7 @@
/**
* Close JDBC related statements and other classes.
Review comment:
Please update the Javadoc.
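Since the diff below renames `close()` to `closeStatements()` in `SimpleBatchStatementExecutor`, a possible wording (sketch only, to be aligned with the PR):

```java
/**
 * Closes the JDBC statements prepared by this executor. The underlying
 * connection is managed by the caller and is not closed here.
 */
void closeStatements() throws SQLException;
```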
##########
File path:
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/table/JdbcRowDataLookupFunctionTest.java
##########
@@ -0,0 +1,232 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.connector.jdbc.table;
+
+import org.apache.flink.connector.jdbc.JdbcTestFixture;
+import org.apache.flink.connector.jdbc.internal.options.JdbcLookupOptions;
+import org.apache.flink.connector.jdbc.internal.options.JdbcOptions;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.data.StringData;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.RowType;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.util.Collector;
+
+import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
+
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.List;
+import java.util.stream.Collectors;
+
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Test suite for {@link JdbcRowDataLookupFunction}.
+ */
+public class JdbcRowDataLookupFunctionTest extends AbstractTestBase {
+
+ public static final String DB_URL = "jdbc:derby:memory:lookup";
+ public static final String LOOKUP_TABLE = "lookup_table";
+ public static final String DB_DRIVER = "org.apache.derby.jdbc.EmbeddedDriver";
+
+ private static String[] fieldNames = new String[] {"id1", "id2", "comment1", "comment2"};
+ private static DataType[] fieldDataTypes = new DataType[] {
+ DataTypes.INT(),
+ DataTypes.STRING(),
+ DataTypes.STRING(),
+ DataTypes.STRING()
+ };
+
+ private static String[] lookupKeys = new String[] {"id1", "id2"};
+
+ @Before
+ public void before() throws ClassNotFoundException, SQLException {
Review comment:
Could the setup and shutdown code be shared among `JdbcRowDataLookupFunctionTest`, `JdbcLookupFunctionTest`, and `JdbcLookupTableITCase`?
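A rough sketch of such a shared fixture (class name is hypothetical, imports omitted since they match the test above):

```java
/** Hypothetical shared Derby fixture for the JDBC lookup tests. */
public abstract class JdbcLookupTestBase extends AbstractTestBase {

	public static final String DB_URL = "jdbc:derby:memory:lookup";
	public static final String LOOKUP_TABLE = "lookup_table";
	public static final String DB_DRIVER = "org.apache.derby.jdbc.EmbeddedDriver";

	@Before
	public void before() throws ClassNotFoundException, SQLException {
		Class.forName(DB_DRIVER);
		try (
			Connection conn = DriverManager.getConnection(DB_URL + ";create=true");
			Statement stat = conn.createStatement()) {
			stat.executeUpdate("CREATE TABLE " + LOOKUP_TABLE + " (" +
				"id1 INT NOT NULL DEFAULT 0," +
				"id2 VARCHAR(20) NOT NULL," +
				"comment1 VARCHAR(1000)," +
				"comment2 VARCHAR(1000))");
			// insert the shared test rows here, exactly as in the current before()
		}
	}

	@After
	public void clearOutputTable() throws Exception {
		Class.forName(DB_DRIVER);
		try (
			Connection conn = DriverManager.getConnection(DB_URL);
			Statement stat = conn.createStatement()) {
			stat.execute("DROP TABLE " + LOOKUP_TABLE);
		}
	}
}
```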
##########
File path:
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/table/JdbcRowDataLookupFunctionTest.java
##########
@@ -0,0 +1,232 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.connector.jdbc.table;
+
+import org.apache.flink.connector.jdbc.JdbcTestFixture;
+import org.apache.flink.connector.jdbc.internal.options.JdbcLookupOptions;
+import org.apache.flink.connector.jdbc.internal.options.JdbcOptions;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.data.RowData;
+import org.apache.flink.table.data.StringData;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.RowType;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.util.Collector;
+
+import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
+
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.List;
+import java.util.stream.Collectors;
+
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Test suite for {@link JdbcRowDataLookupFunction}.
+ */
+public class JdbcRowDataLookupFunctionTest extends AbstractTestBase {
+
+ public static final String DB_URL = "jdbc:derby:memory:lookup";
+ public static final String LOOKUP_TABLE = "lookup_table";
+ public static final String DB_DRIVER = "org.apache.derby.jdbc.EmbeddedDriver";
+
+ private static String[] fieldNames = new String[] {"id1", "id2", "comment1", "comment2"};
+ private static DataType[] fieldDataTypes = new DataType[] {
+ DataTypes.INT(),
+ DataTypes.STRING(),
+ DataTypes.STRING(),
+ DataTypes.STRING()
+ };
+
+ private static String[] lookupKeys = new String[] {"id1", "id2"};
+
+ @Before
+ public void before() throws ClassNotFoundException, SQLException {
+ System.setProperty("derby.stream.error.field",
JdbcTestFixture.class.getCanonicalName() + ".DEV_NULL");
+
+ Class.forName(DB_DRIVER);
+ try (
+ Connection conn = DriverManager.getConnection(DB_URL + ";create=true");
+ Statement stat = conn.createStatement()) {
+ stat.executeUpdate("CREATE TABLE " + LOOKUP_TABLE + " (" +
+ "id1 INT NOT NULL DEFAULT 0," +
+ "id2 VARCHAR(20) NOT NULL," +
+ "comment1 VARCHAR(1000)," +
+ "comment2 VARCHAR(1000))");
+
+ Object[][] data = new Object[][] {
+ new Object[] {1, "1", "11-c1-v1", "11-c2-v1"},
+ new Object[] {1, "1", "11-c1-v2", "11-c2-v2"},
+ new Object[] {2, "3", null, "23-c2"},
+ new Object[] {2, "5", "25-c1", "25-c2"},
+ new Object[] {3, "8", "38-c1", "38-c2"}
+ };
+ boolean[] surroundedByQuotes = new boolean[] {
+ false, true, true, true
+ };
+
+ StringBuilder sqlQueryBuilder = new StringBuilder(
+ "INSERT INTO " + LOOKUP_TABLE + " (id1, id2,
comment1, comment2) VALUES ");
+ for (int i = 0; i < data.length; i++) {
+ sqlQueryBuilder.append("(");
+ for (int j = 0; j < data[i].length; j++) {
+ if (data[i][j] == null) {
+ sqlQueryBuilder.append("null");
+ } else {
+ if (surroundedByQuotes[j]) {
+ sqlQueryBuilder.append("'");
+ }
+ sqlQueryBuilder.append(data[i][j]);
+ if (surroundedByQuotes[j]) {
+ sqlQueryBuilder.append("'");
+ }
+ }
+ if (j < data[i].length - 1) {
+ sqlQueryBuilder.append(", ");
+ }
+ }
+ sqlQueryBuilder.append(")");
+ if (i < data.length - 1) {
+ sqlQueryBuilder.append(", ");
+ }
+ }
+ stat.execute(sqlQueryBuilder.toString());
+ }
+ }
+
+ @After
+ public void clearOutputTable() throws Exception {
+ Class.forName(DB_DRIVER);
+ try (
+ Connection conn = DriverManager.getConnection(DB_URL);
+ Statement stat = conn.createStatement()) {
+ stat.execute("DROP TABLE " + LOOKUP_TABLE);
+ }
+ }
+
+ @Test
+ public void testEval() throws Exception {
+
+ JdbcRowDataLookupFunction lookupFunction = buildRowDataLookupFunction();
+
+ lookupFunction.open(null);
+
+ ListOutputCollector collector = new ListOutputCollector();
+ lookupFunction.setCollector(collector);
+
+ lookupFunction.eval(1, StringData.fromString("1"));
+
+ List<String> result = Lists.newArrayList(collector.getOutputs()).stream()
+ .map(row -> {
+ return String.format("%d,%s,%s,%s",
+ row.getInt(0), row.getString(1), row.getString(2), row.getString(3));
+ })
Review comment:
Use `.map(RowData::toString)`?
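For reference, `JdbcLookupFunctionTest` below already maps with `Row::toString`; the equivalent simplification here would be (note the expected strings in the assertions may need adjusting if `RowData#toString` formats rows differently from this manual `String.format`):

```java
List<String> result = Lists.newArrayList(collector.getOutputs()).stream()
	.map(RowData::toString)
	.sorted()
	.collect(Collectors.toList());
```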
##########
File path:
flink-connectors/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/JdbcLookupFunctionTest.java
##########
@@ -0,0 +1,222 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.connector.jdbc;
+
+import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.connector.jdbc.internal.options.JdbcLookupOptions;
+import org.apache.flink.connector.jdbc.internal.options.JdbcOptions;
+import org.apache.flink.connector.jdbc.table.JdbcLookupFunction;
+import org.apache.flink.table.functions.TableFunction;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+
+import org.apache.flink.shaded.guava18.com.google.common.collect.Lists;
+
+import org.junit.After;
+import org.junit.Before;
+import org.junit.Test;
+
+import java.sql.Connection;
+import java.sql.DriverManager;
+import java.sql.SQLException;
+import java.sql.Statement;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+import java.util.stream.Collectors;
+
+import static org.junit.Assert.assertEquals;
+
+/**
+ * Test suite for {@link JdbcLookupFunction}.
+ */
+public class JdbcLookupFunctionTest extends AbstractTestBase {
+
+ public static final String DB_URL = "jdbc:derby:memory:lookup";
+ public static final String LOOKUP_TABLE = "lookup_table";
+ public static final String DB_DRIVER = "org.apache.derby.jdbc.EmbeddedDriver";
+
+ private static String[] fieldNames = new String[] {"id1", "id2", "comment1", "comment2"};
+ private static TypeInformation[] fieldTypes = new TypeInformation[] {
+ BasicTypeInfo.INT_TYPE_INFO,
+ BasicTypeInfo.STRING_TYPE_INFO,
+ BasicTypeInfo.STRING_TYPE_INFO,
+ BasicTypeInfo.STRING_TYPE_INFO
+ };
+
+ private static String[] lookupKeys = new String[] {"id1", "id2"};
+
+ private TableFunction lookupFunction;
+
+ @Before
+ public void before() throws ClassNotFoundException, SQLException {
+ System.setProperty("derby.stream.error.field",
JdbcTestFixture.class.getCanonicalName() + ".DEV_NULL");
+
+ Class.forName(DB_DRIVER);
+ try (
+ Connection conn = DriverManager.getConnection(DB_URL + ";create=true");
+ Statement stat = conn.createStatement()) {
+ stat.executeUpdate("CREATE TABLE " + LOOKUP_TABLE + " (" +
+ "id1 INT NOT NULL DEFAULT 0," +
+ "id2 VARCHAR(20) NOT NULL," +
+ "comment1 VARCHAR(1000)," +
+ "comment2 VARCHAR(1000))");
+
+ Object[][] data = new Object[][] {
+ new Object[] {1, "1", "11-c1-v1", "11-c2-v1"},
+ new Object[] {1, "1", "11-c1-v2", "11-c2-v2"},
+ new Object[] {2, "3", null, "23-c2"},
+ new Object[] {2, "5", "25-c1", "25-c2"},
+ new Object[] {3, "8", "38-c1", "38-c2"}
+ };
+ boolean[] surroundedByQuotes = new boolean[] {
+ false, true, true, true
+ };
+
+ StringBuilder sqlQueryBuilder = new StringBuilder(
+ "INSERT INTO " + LOOKUP_TABLE + " (id1, id2,
comment1, comment2) VALUES ");
+ for (int i = 0; i < data.length; i++) {
+ sqlQueryBuilder.append("(");
+ for (int j = 0; j < data[i].length; j++) {
+ if (data[i][j] == null) {
+ sqlQueryBuilder.append("null");
+ } else {
+ if (surroundedByQuotes[j]) {
+ sqlQueryBuilder.append("'");
+ }
+ sqlQueryBuilder.append(data[i][j]);
+ if (surroundedByQuotes[j]) {
+ sqlQueryBuilder.append("'");
+ }
+ }
+ if (j < data[i].length - 1) {
+ sqlQueryBuilder.append(", ");
+ }
+ }
+ sqlQueryBuilder.append(")");
+ if (i < data.length - 1) {
+ sqlQueryBuilder.append(", ");
+ }
+ }
+ stat.execute(sqlQueryBuilder.toString());
+ }
+ }
+
+ @After
+ public void clearOutputTable() throws Exception {
+ Class.forName(DB_DRIVER);
+ try (
+ Connection conn = DriverManager.getConnection(DB_URL);
+ Statement stat = conn.createStatement()) {
+ stat.execute("DROP TABLE " + LOOKUP_TABLE);
+ }
+ }
+
+ @Test
+ public void testEval() throws Exception {
+
+ JdbcLookupFunction lookupFunction = buildLookupFunction();
+
+ lookupFunction.open(null);
+
+ ListOutputCollector collector = new ListOutputCollector();
+ lookupFunction.setCollector(collector);
+
+ lookupFunction.eval(1, "1");
+
+ List<String> result = Lists.newArrayList(collector.getOutputs()).stream()
+ .map(Row::toString)
+ .sorted()
+ .collect(Collectors.toList());
+
+ List<String> expected = new ArrayList<>();
+ expected.add("1,1,11-c1-v1,11-c2-v1");
+ expected.add("1,1,11-c1-v2,11-c2-v2");
+ Collections.sort(expected);
+
+ assertEquals(expected, result);
+ }
+
+ @Test
+ public void testInvalidConnectionInEval() throws Exception {
Review comment:
This method and the one above share most of their code. I think you can process multiple elements and disable the connection after the first element has been processed, so that these two tests can be combined into one.
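A possible shape for the combined test (sketch only: `buildLookupFunction()` is the helper already used in `testEval`, while `getDbConnection()` is a hypothetical test hook for breaking the connection between lookups):

```java
@Test
public void testEvalWithBrokenConnection() throws Exception {
	JdbcLookupFunction lookupFunction = buildLookupFunction();
	lookupFunction.open(null);

	ListOutputCollector collector = new ListOutputCollector();
	lookupFunction.setCollector(collector);

	// First lookup runs over the healthy connection.
	lookupFunction.eval(1, "1");

	// Simulate a dropped connection; the next lookup should trigger a reconnect.
	lookupFunction.getDbConnection().close();
	lookupFunction.eval(2, "3");

	List<String> result = Lists.newArrayList(collector.getOutputs()).stream()
		.map(Row::toString)
		.sorted()
		.collect(Collectors.toList());

	List<String> expected = new ArrayList<>();
	expected.add("1,1,11-c1-v1,11-c2-v1");
	expected.add("1,1,11-c1-v2,11-c2-v2");
	expected.add("2,3,null,23-c2");
	Collections.sort(expected);

	assertEquals(expected, result);
}
```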
##########
File path:
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/internal/executor/SimpleBatchStatementExecutor.java
##########
@@ -69,13 +77,15 @@ public void executeBatch() throws SQLException {
}
@Override
- public void close() throws SQLException {
+ public void closeStatements() throws SQLException {
if (st != null) {
st.close();
st = null;
}
- if (batch != null) {
- batch.clear();
- }
+ }
+
+ @VisibleForTesting
+ public PreparedStatement getStatement() {
Review comment:
Not used.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]