lirui-apache commented on a change in pull request #15712:
URL: https://github.com/apache/flink/pull/15712#discussion_r630779155
##########
File path:
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/TableEnvHiveConnectorITCase.java
##########
@@ -531,6 +531,34 @@ public void testLocationWithComma() throws Exception {
}
}
+ @Test
+ public void testReadHiveDataWithEmptyMapForHiveShim20X() throws Exception {
+ TableEnvironment tableEnv = getTableEnvWithHiveCatalog();
+
+ try {
+ String format = "parquet";
+ String folderURI = this.getClass().getResource("/parquet").getPath();
+
+ tableEnv.getConfig()
+ .getConfiguration()
+ .set(HiveOptions.TABLE_EXEC_HIVE_FALLBACK_MAPRED_READER, true);
+ tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
Review comment:
No need to set the Hive dialect here; it's already set in
`getTableEnvWithHiveCatalog`.
##########
File path:
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/TableEnvHiveConnectorITCase.java
##########
@@ -531,6 +531,34 @@ public void testLocationWithComma() throws Exception {
}
}
+ @Test
+ public void testReadHiveDataWithEmptyMapForHiveShim20X() throws Exception {
Review comment:
This test should only run for Hive <= 2.0.0. Guard it with something like
`Assume.assumeTrue(HiveShimLoader.getHiveVersion().compareTo("2.0.0") <= 0)`
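As a standalone sketch of how the suggested guard behaves: it relies on lexicographic `String.compareTo`, which orders version strings in this range correctly because each component is a single digit. The class, helper, and sample version strings below are illustrative, not part of the PR under review.

```java
// Sketch of the proposed version guard; shouldRun and the sample
// versions are hypothetical names for illustration only.
public class VersionGuardSketch {
    // Mirrors the suggested check: run the test only for Hive <= 2.0.0.
    static boolean shouldRun(String hiveVersion) {
        // Lexicographic compare is safe here because Hive version
        // components around the 2.0.0 boundary are single digits.
        return hiveVersion.compareTo("2.0.0") <= 0;
    }

    public static void main(String[] args) {
        System.out.println(shouldRun("1.2.1")); // older than 2.0.0, runs
        System.out.println(shouldRun("2.0.0")); // boundary is inclusive, runs
        System.out.println(shouldRun("2.3.4")); // newer shim, skipped
    }
}
```

Note that lexicographic comparison would misorder multi-digit components (e.g. `"10.0.0"` vs `"2.0.0"`), so this shortcut only works for the version range involved here.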
##########
File path:
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/TableEnvHiveConnectorITCase.java
##########
@@ -531,6 +531,30 @@ public void testLocationWithComma() throws Exception {
}
}
+ @Test
+ public void testReadHiveDataWithEmptyMapForHiveShim20X() throws Exception {
+ TableEnvironment tableEnv = getTableEnvWithHiveCatalog();
+ tableEnv.getConfig()
+ .getConfiguration()
+ .set(HiveOptions.TABLE_EXEC_HIVE_FALLBACK_MAPRED_READER, true);
+ try {
+ tableEnv.executeSql("create table src(x map<string, string>) stored as parquet");
+ HiveTestUtils.createTextTableInserter(hiveCatalog, "default", "src")
+ .addRow(new Object[] {new HashMap<String, String>()})
+ .addRow(new Object[] {new HashMap<String, String>()})
+ .commit();
+
+ List<Row> results =
Review comment:
Maybe we should also test empty arrays?
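To illustrate the empty-collection values such an extra test row would carry, here is a hedged standalone sketch. It only mirrors the shape of the `addRow(new Object[] {...})` calls in the diff above; `buildRow` is a hypothetical helper, not the Flink test utility.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch: the values an extended test row could hold for a
// map<string, string> column plus an array<string> column.
public class EmptyCollectionRowSketch {
    // Hypothetical helper mirroring addRow(new Object[] {...}).
    public static Object[] buildRow() {
        Map<String, String> emptyMap = new HashMap<>(); // map column value
        List<String> emptyArray = new ArrayList<>();    // array column value
        return new Object[] {emptyMap, emptyArray};
    }

    public static void main(String[] args) {
        Object[] row = buildRow();
        // Both values are present but empty, not null - the distinction
        // the empty-map test above is exercising for the 2.0.x shim.
        System.out.println(((Map<?, ?>) row[0]).isEmpty());
        System.out.println(((List<?>) row[1]).isEmpty());
    }
}
```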
##########
File path:
flink-connectors/flink-connector-hive/src/test/java/org/apache/flink/connectors/hive/TableEnvHiveConnectorITCase.java
##########
@@ -531,6 +531,34 @@ public void testLocationWithComma() throws Exception {
}
}
+ @Test
+ public void testReadHiveDataWithEmptyMapForHiveShim20X() throws Exception {
+ TableEnvironment tableEnv = getTableEnvWithHiveCatalog();
+
+ try {
+ String format = "parquet";
+ String folderURI = this.getClass().getResource("/parquet").getPath();
+
+ tableEnv.getConfig()
+ .getConfiguration()
+ .set(HiveOptions.TABLE_EXEC_HIVE_FALLBACK_MAPRED_READER, true);
+ tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
+ tableEnv.executeSql(
+ String.format(
+ "create external table src_t (a string, b map<string, string>) stored as %s location 'file://%s'",
+ format, folderURI));
+
+ List<Row> results =
+ CollectionUtil.iteratorToList(
+ tableEnv.sqlQuery("select * from src_t").execute().collect());
+ } finally {
+ tableEnv.getConfig()
Review comment:
No need to reset this config in the `finally` block, because `tableEnv` is
not reused across test cases.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]