[GitHub] flink pull request #3723: [FLINK-6307] [jdbc] Refactor JDBC tests

2017-04-19 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/3723


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] flink pull request #3723: [FLINK-6307] [jdbc] Refactor JDBC tests

2017-04-18 Thread zentol
Github user zentol commented on a diff in the pull request:

https://github.com/apache/flink/pull/3723#discussion_r111971942
  
--- Diff: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java ---
@@ -272,63 +209,61 @@ public void testJDBCInputFormatWithParallelismAndGenericSplitting() throws IOExc
 				.setParametersProvider(paramProvider)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
+
 		jdbcInputFormat.openInputFormat();
 		InputSplit[] splits = jdbcInputFormat.createInputSplits(1);
 		//this query exploit parallelism (1 split for every queryParameters row)
 		Assert.assertEquals(queryParameters.length, splits.length);
-		int recordCount = 0;
+
+		verifySplit(splits[0], TEST_DATA[3].id);
+		verifySplit(splits[1], TEST_DATA[0].id + TEST_DATA[1].id);
+
+		jdbcInputFormat.closeInputFormat();
+	}
+
+	private void verifySplit(InputSplit split, int expectedIDSum) throws IOException {
+		int sum = 0;
+
 		Row row = new Row(5);
-		for (int i = 0; i < splits.length; i++) {
-			jdbcInputFormat.open(splits[i]);
-			while (!jdbcInputFormat.reachedEnd()) {
-				Row next = jdbcInputFormat.nextRecord(row);
-				if (next == null) {
-					break;
-				}
-				if (next.getField(0) != null) {
-					Assert.assertEquals("Field 0 should be int", Integer.class, next.getField(0).getClass());
-				}
-				if (next.getField(1) != null) {
-					Assert.assertEquals("Field 1 should be String", String.class, next.getField(1).getClass());
-				}
-				if (next.getField(2) != null) {
-					Assert.assertEquals("Field 2 should be String", String.class, next.getField(2).getClass());
-				}
-				if (next.getField(3) != null) {
-					Assert.assertEquals("Field 3 should be float", Double.class, next.getField(3).getClass());
-				}
-				if (next.getField(4) != null) {
-					Assert.assertEquals("Field 4 should be int", Integer.class, next.getField(4).getClass());
-				}
+		jdbcInputFormat.open(split);
+		while (!jdbcInputFormat.reachedEnd()) {
+			row = jdbcInputFormat.nextRecord(row);
 
-				recordCount++;
-			}
-			jdbcInputFormat.close();
+			int id = ((int) row.getField(0));
+			int testDataIndex = id - 1001;
+
+			compare(TEST_DATA[testDataIndex], row);
+			sum += id;
 		}
-		Assert.assertEquals(3, recordCount);
-		jdbcInputFormat.closeInputFormat();
+
+		Assert.assertEquals(expectedIDSum, sum);
 	}
 
 	@Test
-	public void testEmptyResults() throws IOException, InstantiationException, IllegalAccessException {
+	public void testEmptyResults() throws IOException {
 		jdbcInputFormat = JDBCInputFormat.buildJDBCInputFormat()
 				.setDrivername(DRIVER_CLASS)
 				.setDBUrl(DB_URL)
 				.setQuery(SELECT_EMPTY)
 				.setRowTypeInfo(rowTypeInfo)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
-		jdbcInputFormat.openInputFormat();
-		jdbcInputFormat.open(null);
-		Row row = new Row(5);
-		int recordsCnt = 0;
-		while (!jdbcInputFormat.reachedEnd()) {
-			Assert.assertNull(jdbcInputFormat.nextRecord(row));
-			recordsCnt++;
+		try {
+			jdbcInputFormat.openInputFormat();
+			jdbcInputFormat.open(null);
+			Assert.assertTrue(jdbcInputFormat.reachedEnd());
+		} finally {
+			jdbcInputFormat.close();
+			jdbcInputFormat.closeInputFormat();
 		}
-		jdbcInputFormat.close();
-		jdbcInputFormat.closeInputFormat();
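The try/finally shape in the refactored testEmptyResults above generalizes to any open/close resource lifecycle in tests: close() must run even when an assertion inside the try block fails, so a failing test cannot leak the connection. A minimal stdlib-only sketch of that pattern; all names here are hypothetical stand-ins, not Flink API:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch of the try/finally test-lifecycle pattern: cleanup runs on both the
// success path and the assertion-failure path.
public class LifecycleSketch {
	static final AtomicBoolean closed = new AtomicBoolean(false);

	static void open() { closed.set(false); }
	static void close() { closed.set(true); }

	static void runTest(boolean failInside) {
		try {
			open();
			if (failInside) {
				// Stand-in for a failing Assert inside the test body.
				throw new AssertionError("assertion failed inside the test body");
			}
		} finally {
			close(); // executes whether or not the try block threw
		}
	}

	public static void main(String[] args) {
		runTest(false);
		System.out.println("closed after success: " + closed.get());
		try {
			runTest(true);
		} catch (AssertionError expected) {
			// the assertion still propagates to the test runner
		}
		System.out.println("closed after failure: " + closed.get());
	}
}
```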

[GitHub] flink pull request #3723: [FLINK-6307] [jdbc] Refactor JDBC tests

2017-04-18 Thread aljoscha
Github user aljoscha commented on a diff in the pull request:

https://github.com/apache/flink/pull/3723#discussion_r111966195
  
--- Diff: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java ---
@@ -272,63 +209,61 @@ public void testJDBCInputFormatWithParallelismAndGenericSplitting() throws IOExc
 				.setParametersProvider(paramProvider)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
+
 		jdbcInputFormat.openInputFormat();
 		InputSplit[] splits = jdbcInputFormat.createInputSplits(1);
 		//this query exploit parallelism (1 split for every queryParameters row)
 		Assert.assertEquals(queryParameters.length, splits.length);
-		int recordCount = 0;
+
+		verifySplit(splits[0], TEST_DATA[3].id);
+		verifySplit(splits[1], TEST_DATA[0].id + TEST_DATA[1].id);
+
+		jdbcInputFormat.closeInputFormat();
+	}
+
+	private void verifySplit(InputSplit split, int expectedIDSum) throws IOException {
+		int sum = 0;
+
 		Row row = new Row(5);
-		for (int i = 0; i < splits.length; i++) {
-			jdbcInputFormat.open(splits[i]);
-			while (!jdbcInputFormat.reachedEnd()) {
-				Row next = jdbcInputFormat.nextRecord(row);
-				if (next == null) {
-					break;
-				}
-				if (next.getField(0) != null) {
-					Assert.assertEquals("Field 0 should be int", Integer.class, next.getField(0).getClass());
-				}
-				if (next.getField(1) != null) {
-					Assert.assertEquals("Field 1 should be String", String.class, next.getField(1).getClass());
-				}
-				if (next.getField(2) != null) {
-					Assert.assertEquals("Field 2 should be String", String.class, next.getField(2).getClass());
-				}
-				if (next.getField(3) != null) {
-					Assert.assertEquals("Field 3 should be float", Double.class, next.getField(3).getClass());
-				}
-				if (next.getField(4) != null) {
-					Assert.assertEquals("Field 4 should be int", Integer.class, next.getField(4).getClass());
-				}
+		jdbcInputFormat.open(split);
+		while (!jdbcInputFormat.reachedEnd()) {
+			row = jdbcInputFormat.nextRecord(row);
 
-				recordCount++;
-			}
-			jdbcInputFormat.close();
+			int id = ((int) row.getField(0));
+			int testDataIndex = id - 1001;
+
+			compare(TEST_DATA[testDataIndex], row);
+			sum += id;
 		}
-		Assert.assertEquals(3, recordCount);
-		jdbcInputFormat.closeInputFormat();
+
+		Assert.assertEquals(expectedIDSum, sum);
 	}
 
 	@Test
-	public void testEmptyResults() throws IOException, InstantiationException, IllegalAccessException {
+	public void testEmptyResults() throws IOException {
 		jdbcInputFormat = JDBCInputFormat.buildJDBCInputFormat()
				.setDrivername(DRIVER_CLASS)
 				.setDBUrl(DB_URL)
 				.setQuery(SELECT_EMPTY)
 				.setRowTypeInfo(rowTypeInfo)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
-		jdbcInputFormat.openInputFormat();
-		jdbcInputFormat.open(null);
-		Row row = new Row(5);
-		int recordsCnt = 0;
-		while (!jdbcInputFormat.reachedEnd()) {
-			Assert.assertNull(jdbcInputFormat.nextRecord(row));
-			recordsCnt++;
+		try {
+			jdbcInputFormat.openInputFormat();
+			jdbcInputFormat.open(null);
+			Assert.assertTrue(jdbcInputFormat.reachedEnd());
+		} finally {
+			jdbcInputFormat.close();
+			jdbcInputFormat.closeInputFormat();
 		}
-		jdbcInputFormat.close();
-		jdbcInputFormat.closeInputFormat();

[GitHub] flink pull request #3723: [FLINK-6307] [jdbc] Refactor JDBC tests

2017-04-15 Thread zentol
GitHub user zentol opened a pull request:

https://github.com/apache/flink/pull/3723

[FLINK-6307] [jdbc] Refactor JDBC tests

Builds on top of #3686.

List of changes:

JDBCFullTest:
- split testJdbcInOut into 2 methods to avoid manual test-lifecycle calls
JDBCTestBase:
- remove all qualified static accesses
- remove static Connection field
- remove (now) unused prepareTestDB method
- create RowTypeInfo directly instead of first allocating a separate
TypeInfo[]
- rename testData to TEST_DATA in line with naming conventions
- rework test data to not rely on Object arrays

JDBCInputFormatTest:
- call InputFormat#closeInputFormat() in tearDown()
- simplify method exception declarations
- remove unreachable branch when format returns null (this should fail
the test)
- replace loops over splits with for-each loops
- rework comparisons; no longer ignore nulls, no longer check class,
compare directly against expected value

JDBCOutputFormatTest:
- directly create Row instead of first creating a tuple
- simplify method exception declarations

General:
- do not catch exceptions if the catch block only calls Assert.fail()
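The last point, dropping catch blocks that only call Assert.fail(), targets a common test anti-pattern: catching a checked exception and re-failing discards the original stack trace. A minimal sketch of the before/after, using hypothetical stand-in names rather than code from the PR:

```java
import java.io.IOException;

// Sketch: why a test should declare checked exceptions instead of catching
// them only to fail.
public class FailurePropagationSketch {

	// Stand-in for a call that can fail, e.g. an InputFormat open().
	static void openResource() throws IOException {
		throw new IOException("connection refused");
	}

	// Anti-pattern: the original exception and its stack trace are discarded,
	// leaving only a generic failure message in the test report.
	static void antiPattern() {
		try {
			openResource();
		} catch (IOException e) {
			throw new AssertionError("test failed"); // cause of 'e' is lost
		}
	}

	// Preferred: declare the exception on the method and let it propagate, so
	// the test runner reports the real cause directly.
	static void preferred() throws IOException {
		openResource();
	}

	public static void main(String[] args) {
		try {
			antiPattern();
		} catch (AssertionError e) {
			System.out.println("anti-pattern keeps cause: " + (e.getCause() != null));
		}
		try {
			preferred();
		} catch (IOException e) {
			System.out.println("preferred reports: " + e.getMessage());
		}
	}
}
```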

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/zentol/flink 6307_jdbc_tests

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/3723.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3723


commit 993a712eaff4b7b29dbcd45897e3afe7323256a7
Author: Flavio Pompermaier 
Date:   2017-04-06T10:01:51Z

[FLINK-6271] [jdbc] Fix NPE when there's a single split

This closes #3686.

commit fb510ce440577d07b7cd7229db1a624a150e66b0
Author: zentol 
Date:   2017-04-15T16:07:15Z

[FLINK-6307] [jdbc] Refactor JDBC tests

JDBCFullTest:
- split testJdbcInOut into 2 methods to avoid manual test-lifecycle calls
JDBCTestBase:
- remove all qualified static accesses
- remove static Connection field
- remove (now) unused prepareTestDB method
- create RowTypeInfo directly instead of first allocating a separate
TypeInfo[]
- rename testData to TEST_DATA in line with naming conventions
- rework test data to not rely on Object arrays

JDBCInputFormatTest:
- call InputFormat#closeInputFormat() in tearDown()
- simplify method exception declarations
- remove unreachable branch when format returns null (this should fail
the test)
- replace loops over splits with for-each loops
- rework comparisons; no longer ignore nulls, no longer check class,
compare directly against expected value

JDBCOutputFormatTest:
- directly create Row instead of first creating a tuple
- simplify method exception declarations

General:
- do not catch exceptions if the catch block only calls Assert.fail()



