[jira] [Commented] (FLINK-6307) Refactor JDBC tests

2017-04-19 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15974421#comment-15974421 ]

ASF GitHub Bot commented on FLINK-6307:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/3723


> Refactor JDBC tests
> ---
>
> Key: FLINK-6307
> URL: https://issues.apache.org/jira/browse/FLINK-6307
> Project: Flink
>  Issue Type: Improvement
>  Components: Batch Connectors and Input/Output Formats, Tests
>Affects Versions: 1.3.0
>Reporter: Chesnay Schepler
>Assignee: Chesnay Schepler
>Priority: Minor
> Fix For: 1.3.0
>
>
> While glancing over the JDBC-related tests I've found a lot of odd things
> that have accumulated over time.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6307) Refactor JDBC tests

2017-04-18 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15973047#comment-15973047 ]

ASF GitHub Bot commented on FLINK-6307:
---

Github user zentol commented on the issue:

https://github.com/apache/flink/pull/3723
  
Will merge this and address the comment on the way.




[jira] [Commented] (FLINK-6307) Refactor JDBC tests

2017-04-18 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15972814#comment-15972814 ]

ASF GitHub Bot commented on FLINK-6307:
---

Github user zentol commented on a diff in the pull request:

https://github.com/apache/flink/pull/3723#discussion_r111971942
  
--- Diff: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java ---
@@ -272,63 +209,61 @@ public void testJDBCInputFormatWithParallelismAndGenericSplitting() throws IOExc
 				.setParametersProvider(paramProvider)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
+
 		jdbcInputFormat.openInputFormat();
 		InputSplit[] splits = jdbcInputFormat.createInputSplits(1);
 		//this query exploit parallelism (1 split for every queryParameters row)
 		Assert.assertEquals(queryParameters.length, splits.length);
-		int recordCount = 0;
+
+		verifySplit(splits[0], TEST_DATA[3].id);
+		verifySplit(splits[1], TEST_DATA[0].id + TEST_DATA[1].id);
+
+		jdbcInputFormat.closeInputFormat();
+	}
+
+	private void verifySplit(InputSplit split, int expectedIDSum) throws IOException {
+		int sum = 0;
+
 		Row row =  new Row(5);
-		for (int i = 0; i < splits.length; i++) {
-			jdbcInputFormat.open(splits[i]);
-			while (!jdbcInputFormat.reachedEnd()) {
-				Row next = jdbcInputFormat.nextRecord(row);
-				if (next == null) {
-					break;
-				}
-				if (next.getField(0) != null) {
-					Assert.assertEquals("Field 0 should be int", Integer.class, next.getField(0).getClass());
-				}
-				if (next.getField(1) != null) {
-					Assert.assertEquals("Field 1 should be String", String.class, next.getField(1).getClass());
-				}
-				if (next.getField(2) != null) {
-					Assert.assertEquals("Field 2 should be String", String.class, next.getField(2).getClass());
-				}
-				if (next.getField(3) != null) {
-					Assert.assertEquals("Field 3 should be float", Double.class, next.getField(3).getClass());
-				}
-				if (next.getField(4) != null) {
-					Assert.assertEquals("Field 4 should be int", Integer.class, next.getField(4).getClass());
-				}
+		jdbcInputFormat.open(split);
+		while (!jdbcInputFormat.reachedEnd()) {
+			row = jdbcInputFormat.nextRecord(row);
 
-				recordCount++;
-			}
-			jdbcInputFormat.close();
+			int id = ((int) row.getField(0));
+			int testDataIndex = id - 1001;
+
+			compare(TEST_DATA[testDataIndex], row);
+			sum += id;
 		}
-		Assert.assertEquals(3, recordCount);
-		jdbcInputFormat.closeInputFormat();
+
+		Assert.assertEquals(expectedIDSum, sum);
 	}
 
 	@Test
-	public void testEmptyResults() throws IOException, InstantiationException, IllegalAccessException {
+	public void testEmptyResults() throws IOException {
 		jdbcInputFormat = JDBCInputFormat.buildJDBCInputFormat()
 				.setDrivername(DRIVER_CLASS)
 				.setDBUrl(DB_URL)
 				.setQuery(SELECT_EMPTY)
 				.setRowTypeInfo(rowTypeInfo)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
-		jdbcInputFormat.openInputFormat();
-		jdbcInputFormat.open(null);
-		Row row = new Row(5);
-		int recordsCnt = 0;
-		while (!jdbcInputFormat.reachedEnd()) {
-			Assert.assertNull(jdbcInputFormat.nextRecord(row));
-			recordsCnt++;
+		try {
+			jdbcInputFormat.openInputFormat();
+			jdbcInputFormat.open(null);
+			Assert.assertTrue(jdbcInputFormat.reachedEnd());
+		} finally {
+

[jira] [Commented] (FLINK-6307) Refactor JDBC tests

2017-04-18 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15972759#comment-15972759 ]

ASF GitHub Bot commented on FLINK-6307:
---

Github user aljoscha commented on a diff in the pull request:

https://github.com/apache/flink/pull/3723#discussion_r111966195
  
--- Diff: flink-connectors/flink-jdbc/src/test/java/org/apache/flink/api/java/io/jdbc/JDBCInputFormatTest.java ---
@@ -272,63 +209,61 @@ public void testJDBCInputFormatWithParallelismAndGenericSplitting() throws IOExc
 				.setParametersProvider(paramProvider)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
+
 		jdbcInputFormat.openInputFormat();
 		InputSplit[] splits = jdbcInputFormat.createInputSplits(1);
 		//this query exploit parallelism (1 split for every queryParameters row)
 		Assert.assertEquals(queryParameters.length, splits.length);
-		int recordCount = 0;
+
+		verifySplit(splits[0], TEST_DATA[3].id);
+		verifySplit(splits[1], TEST_DATA[0].id + TEST_DATA[1].id);
+
+		jdbcInputFormat.closeInputFormat();
+	}
+
+	private void verifySplit(InputSplit split, int expectedIDSum) throws IOException {
+		int sum = 0;
+
 		Row row =  new Row(5);
-		for (int i = 0; i < splits.length; i++) {
-			jdbcInputFormat.open(splits[i]);
-			while (!jdbcInputFormat.reachedEnd()) {
-				Row next = jdbcInputFormat.nextRecord(row);
-				if (next == null) {
-					break;
-				}
-				if (next.getField(0) != null) {
-					Assert.assertEquals("Field 0 should be int", Integer.class, next.getField(0).getClass());
-				}
-				if (next.getField(1) != null) {
-					Assert.assertEquals("Field 1 should be String", String.class, next.getField(1).getClass());
-				}
-				if (next.getField(2) != null) {
-					Assert.assertEquals("Field 2 should be String", String.class, next.getField(2).getClass());
-				}
-				if (next.getField(3) != null) {
-					Assert.assertEquals("Field 3 should be float", Double.class, next.getField(3).getClass());
-				}
-				if (next.getField(4) != null) {
-					Assert.assertEquals("Field 4 should be int", Integer.class, next.getField(4).getClass());
-				}
+		jdbcInputFormat.open(split);
+		while (!jdbcInputFormat.reachedEnd()) {
+			row = jdbcInputFormat.nextRecord(row);
 
-				recordCount++;
-			}
-			jdbcInputFormat.close();
+			int id = ((int) row.getField(0));
+			int testDataIndex = id - 1001;
+
+			compare(TEST_DATA[testDataIndex], row);
+			sum += id;
 		}
-		Assert.assertEquals(3, recordCount);
-		jdbcInputFormat.closeInputFormat();
+
+		Assert.assertEquals(expectedIDSum, sum);
 	}
 
 	@Test
-	public void testEmptyResults() throws IOException, InstantiationException, IllegalAccessException {
+	public void testEmptyResults() throws IOException {
 		jdbcInputFormat = JDBCInputFormat.buildJDBCInputFormat()
 				.setDrivername(DRIVER_CLASS)
 				.setDBUrl(DB_URL)
 				.setQuery(SELECT_EMPTY)
 				.setRowTypeInfo(rowTypeInfo)
 				.setResultSetType(ResultSet.TYPE_SCROLL_INSENSITIVE)
 				.finish();
-		jdbcInputFormat.openInputFormat();
-		jdbcInputFormat.open(null);
-		Row row = new Row(5);
-		int recordsCnt = 0;
-		while (!jdbcInputFormat.reachedEnd()) {
-			Assert.assertNull(jdbcInputFormat.nextRecord(row));
-			recordsCnt++;
+		try {
+			jdbcInputFormat.openInputFormat();
+			jdbcInputFormat.open(null);
+			Assert.assertTrue(jdbcInputFormat.reachedEnd());
+		} finally {
+

[jira] [Commented] (FLINK-6307) Refactor JDBC tests

2017-04-15 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-6307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15970011#comment-15970011 ]

ASF GitHub Bot commented on FLINK-6307:
---

GitHub user zentol opened a pull request:

https://github.com/apache/flink/pull/3723

[FLINK-6307] [jdbc] Refactor JDBC tests

Builds on top of #3686.

List of changes:

JDBCFullTest:
- split testJdbcInOut into 2 methods to avoid manual test-lifecycle calls
JDBCTestBase:
- remove all qualified static accesses
- remove static Connection field
- remove (now) unused prepareTestDB method
- create RowTypeInfo directly instead of first allocating a separate
TypeInfo[]
- rename testData to TEST_DATA in line with naming conventions
- rework test data to not rely on Object arrays

JDBCInputFormatTest:
- call InputFormat#closeInputFormat() in tearDown()
- simplify method exception declarations
- remove unreachable branch when format returns null (this should fail
the test)
- replace loops over splits with for-each loops
- rework comparisons; no longer ignore nulls, no longer check class,
compare directly against expected value

JDBCOutputFormatTest:
- directly create Row instead of first creating a tuple
- simplify method exception declarations

General:
- do not catch exceptions if the catch block only calls Assert.fail()
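The comparison rework listed under JDBCInputFormatTest (look up the expected record by id, compare directly, and check a per-split id sum) can be sketched as follows. This is a minimal, self-contained illustration: `TestEntry`, the placeholder titles, and the `verifySplit` signature here are stand-ins for the actual Flink test code, whose ids start at 1001.

```java
// Sketch of the id-sum split verification: each record carries an id, the
// expected entry is found by index (id - 1001), and the sum of ids seen in a
// split is checked against an expected total. All names are illustrative.
public class SplitVerificationSketch {

	// Hypothetical stand-in for the typed test data that replaced Object[] rows.
	static final class TestEntry {
		final int id;
		final String title;
		TestEntry(int id, String title) { this.id = id; this.title = title; }
	}

	static final TestEntry[] TEST_DATA = {
		new TestEntry(1001, "entry-1001"),
		new TestEntry(1002, "entry-1002"),
		new TestEntry(1003, "entry-1003"),
		new TestEntry(1004, "entry-1004")
	};

	// Ids start at 1001, so id - 1001 indexes directly into TEST_DATA.
	static int verifySplit(int[] idsInSplit) {
		int sum = 0;
		for (int id : idsInSplit) {
			TestEntry expected = TEST_DATA[id - 1001];
			if (expected.id != id) {
				throw new AssertionError("unexpected id " + id);
			}
			sum += id;
		}
		return sum;
	}

	public static void main(String[] args) {
		// One split holds the row with TEST_DATA[3].id; another holds the rows
		// with TEST_DATA[0].id and TEST_DATA[1].id, mirroring the diff above.
		int sum0 = verifySplit(new int[]{TEST_DATA[3].id});
		int sum1 = verifySplit(new int[]{TEST_DATA[0].id, TEST_DATA[1].id});
		System.out.println(sum0 + " " + sum1); // prints "1004 2003"
	}
}
```

Summing ids is an order-independent way to check that a split produced exactly the expected rows, which is why it replaces the old per-field class checks.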

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/zentol/flink 6307_jdbc_tests

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/3723.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3723


commit 993a712eaff4b7b29dbcd45897e3afe7323256a7
Author: Flavio Pompermaier 
Date:   2017-04-06T10:01:51Z

[FLINK-6271] [jdbc] Fix NPE when there's a single split

This closes #3686.

commit fb510ce440577d07b7cd7229db1a624a150e66b0
Author: zentol 
Date:   2017-04-15T16:07:15Z

[FLINK-6307] [jdbc] Refactor JDBC tests

JDBCFullTest:
- split testJdbcInOut into 2 methods to avoid manual test-lifecycle calls
JDBCTestBase:
- remove all qualified static accesses
- remove static Connection field
- remove (now) unused prepareTestDB method
- create RowTypeInfo directly instead of first allocating a separate
TypeInfo[]
- rename testData to TEST_DATA in line with naming conventions
- rework test data to not rely on Object arrays

JDBCInputFormatTest:
- call InputFormat#closeInputFormat() in tearDown()
- simplify method exception declarations
- remove unreachable branch when format returns null (this should fail
the test)
- replace loops over splits with for-each loops
- rework comparisons; no longer ignore nulls, no longer check class,
compare directly against expected value

JDBCOutputFormatTest:
- directly create Row instead of first creating a tuple
- simplify method exception declarations

General:
- do not catch exceptions if the catch block only calls Assert.fail()
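The last point above, not catching exceptions only to call Assert.fail(), can be illustrated with a rough sketch. `FakeFormat` and both helper methods are hypothetical, not Flink APIs; the point is that swallowing a checked exception discards the diagnostic, while declaring it lets the test runner report the full stack trace.

```java
import java.io.IOException;

// Sketch of the "don't catch just to fail" cleanup. All names are illustrative.
public class ExceptionDeclarationSketch {

	static class FakeFormat {
		void open(boolean shouldFail) throws IOException {
			if (shouldFail) {
				throw new IOException("connection refused");
			}
		}
	}

	// Anti-pattern: the catch block hides the real cause behind a bare failure.
	static String openSwallowing(FakeFormat f, boolean shouldFail) {
		try {
			f.open(shouldFail);
			return "ok";
		} catch (IOException e) {
			return "failed"; // stack trace and message are lost here
		}
	}

	// Preferred: declare the exception and let it propagate; on failure the
	// test runner prints the full stack trace, so the cause stays visible.
	static String openPropagating(FakeFormat f, boolean shouldFail) throws IOException {
		f.open(shouldFail);
		return "ok";
	}

	public static void main(String[] args) throws IOException {
		FakeFormat f = new FakeFormat();
		System.out.println(openSwallowing(f, true));  // only "failed", no cause
		System.out.println(openPropagating(f, false)); // prints "ok"
	}
}
```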



