[GitHub] flink issue #1962: [FLINK-3857][Streaming Connectors]Add reconnect attempt t...

2017-02-17 Thread sbcd90
Github user sbcd90 commented on the issue:

https://github.com/apache/flink/pull/1962
  
Hello @tzulitai ,

I have rebased the changes. Can you please review?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] flink issue #1962: [FLINK-3857][Streaming Connectors]Add reconnect attempt t...

2017-01-25 Thread sbcd90
Github user sbcd90 commented on the issue:

https://github.com/apache/flink/pull/1962
  
Hi @tzulitai , thanks for the updates. I'll refactor the code & rebase the PR.




[GitHub] flink pull request #2644: [FLINK-4837] flink-streaming-akka source connector

2016-10-25 Thread sbcd90
Github user sbcd90 closed the pull request at:

https://github.com/apache/flink/pull/2644




[GitHub] flink issue #2644: [FLINK-4837] flink-streaming-akka source connector

2016-10-25 Thread sbcd90
Github user sbcd90 commented on the issue:

https://github.com/apache/flink/pull/2644
  
Hello @rmetzger ,

I'll open a new PR in bahir. Closing this.




[GitHub] flink pull request #2644: [FLINK-4837] flink-streaming-akka source connector

2016-10-16 Thread sbcd90
GitHub user sbcd90 opened a pull request:

https://github.com/apache/flink/pull/2644

[FLINK-4837] flink-streaming-akka source connector

This PR is created to propose the idea of having a `flink-streaming-akka 
source connector`. 

The source connector can be used to receive messages from an Akka feeder or 
publisher actor, & these messages can then be processed using Flink streaming.

The source connector has the following features:

- It supports several different message formats, such as iterable data, byte arrays & data with timestamps.
- It can send back acknowledgements to the feeder actor.
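
For reference, a minimal usage sketch of the idea (the `AkkaSource` class name, its constructor arguments & the actor URL below are illustrative assumptions, not necessarily the exact API in this PR):

```
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class AkkaSourceExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical source that subscribes to a remote Akka feeder/publisher
        // actor and emits every received message into the Flink stream.
        DataStream<Object> messages = env.addSource(
            new AkkaSource("receiverActor", "akka.tcp://feederSystem@127.0.0.1:5150/user/feederActor"));

        messages.print();
        env.execute("flink-streaming-akka example");
    }
}
```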



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sbcd90/flink flink-akka

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2644.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2644


commit c05274058a2a7f152e668ea464e257ca9dc5aac0
Author: Subhobrata Dey <sbc...@gmail.com>
Date:   2016-10-15T22:09:56Z

[FLINK-4837] flink-streaming-akka source connector






[GitHub] flink issue #1962: [FLINK-3857][Streaming Connectors]Add reconnect attempt t...

2016-06-09 Thread sbcd90
Github user sbcd90 commented on the issue:

https://github.com/apache/flink/pull/1962
  
Hello @tzulitai ,

I think the default value for an int in Java is 0.
Checking whether the connection is lost & then retrying the connection is a good suggestion. I made the change & separated the methods for connection creation & connection-status check.
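
To make the structure concrete, a rough sketch of that separation (method & field names such as `createTransportClient`, `isConnected` & `maxReconnectAttempts` are assumptions for illustration, not the exact code in the PR):

```
private void createTransportClient() {
    // build the TransportClient & add the configured Elasticsearch nodes
}

private boolean isConnected() {
    // connection-status check, e.g. the client still sees connected nodes
    return !client.connectedNodes().isEmpty();
}

private void reconnectIfNeeded() throws IOException {
    int attempts = 0;                        // int defaults to 0 anyway
    while (!isConnected() && attempts < maxReconnectAttempts) {
        createTransportClient();             // retry connection creation
        attempts++;
    }
    if (!isConnected()) {
        throw new IOException("Could not connect to Elasticsearch cluster");
    }
}
```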




[GitHub] flink pull request: [FLINK-3857][Streaming Connectors]Add reconnec...

2016-05-06 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1962#issuecomment-217540405
  
Hello @rmetzger ,

I have now added a test case to the `ElasticsearchSinkITCase.java` list of tests. 
Can you kindly have a look?




[GitHub] flink pull request: [FLINK-3857][Streaming Connectors]Add reconnec...

2016-05-05 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1962#issuecomment-217342680
  
Hello @rmetzger ,

Looking at the test case `ElasticsearchSinkITCase.testTransportClient`, I think that to test the reconnect scenario, `hasFailure` may need to be made `public` so that the test method can set it.
Can you kindly provide some suggestions?
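
One possible alternative to making the field `public` (just a sketch, assuming the flag lives in the sink & the test sits in the same package; the setter name is hypothetical):

```
// keep the flag private & expose a package-private, test-only setter
private volatile boolean hasFailure = false;

void setHasFailureForTesting(boolean hasFailure) {
    this.hasFailure = hasFailure;
}
```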




[GitHub] flink pull request: [FLINK-3857][Streaming Connectors]Add reconnec...

2016-05-05 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1962#issuecomment-217298024
  
Hello @rmetzger ,

Thanks a lot for reviewing the PR.
I have made all the changes you mentioned in your inline comments & added some documentation.
Kindly have a look now.




[GitHub] flink pull request: [FLINK-3857][Streaming Connectors]Add reconnec...

2016-05-04 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1962#issuecomment-216874916
  
Hello @fhueske , the tests do not fail because of the changes made in this PR. I ran the JUnit tests for the Elasticsearch connector & all of them run fine.
Can you kindly have a look?




[GitHub] flink pull request: [FLINK-3857][Streaming Connectors]Add reconnec...

2016-05-03 Thread sbcd90
GitHub user sbcd90 opened a pull request:

https://github.com/apache/flink/pull/1962

[FLINK-3857][Streaming Connectors]Add reconnect attempt to Elasticsearch 
host

- [ ] General
  - The pull request references the related JIRA issue ("[FLINK-3857] Add 
reconnect attempt to Elasticsearch host")

- [ ] Documentation
  - Documentation added based on the changes made.

- [ ] Tests & Build
  - Functionality added by the pull request is covered by tests
  - `mvn clean verify` has been executed successfully locally or a Travis 
build has passed

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sbcd90/flink elasticSearchRetryIssue

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/1962.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1962


commit 62990c984f0d2eca3ba89ed9c2d22c469f16b136
Author: Subhobrata Dey <sbc...@gmail.com>
Date:   2016-05-04T02:16:35Z

[FLINK-3857][Streaming Connectors]Add reconnect attempt to Elasticsearch 
host






[GitHub] flink pull request: [FLINK-3035] Redis as State Backend

2016-02-22 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1617#issuecomment-187515384
  
Hello @mjsax , it is possible to use Travis for testing Redis: Travis can start Redis during the build. Please check out an example 
[here](https://github.com/sbcd90/redis-travis-test).

Let me know the final decision on how to proceed & I'll modify my implementation accordingly.
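
The Travis-side setup for this is small; a sketch of the relevant part (the actual file in the linked example repository may differ):

```
# .travis.yml sketch: Travis starts a Redis server for the build,
# so integration tests can connect to localhost:6379.
language: java
services:
  - redis-server
script:
  - mvn clean verify
```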




[GitHub] flink pull request: [FLINK-3035] Redis as State Backend

2016-02-10 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1617#issuecomment-182688977
  
@mjsax Would you open a new thread on the dev mailing list to discuss the Redis state backend? I'm really interested in the topic & would like to work on it. I'm following the dev mailing list.




[GitHub] flink pull request: [FLINK-3035] Redis as State Backend

2016-02-09 Thread sbcd90
GitHub user sbcd90 opened a pull request:

https://github.com/apache/flink/pull/1617

[FLINK-3035] Redis as State Backend

@mjsax please review.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sbcd90/flink FLINK-3035

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/1617.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1617


commit 5a4b2f09e6990185ca6cdf3d91a4561dcb23098b
Author: Subhobrata Dey <sbc...@gmail.com>
Date:   2016-02-10T01:33:13Z

[FLINK-3035] Redis as State Backend






[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-05 Thread sbcd90
Github user sbcd90 closed the pull request at:

https://github.com/apache/flink/pull/1566




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-04 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1566#issuecomment-180152337
  
Hello @tillrohrmann , please review the commit; I believe the work is nearly complete. Please comment.




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-03 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1566#issuecomment-179599473
  
Hello @tillrohrmann , I have made all the changes you mentioned:

- added support for CompositeType
- added test cases for both primitive & composite types
- modified the changed test case in DataSinkTest.java
- renamed the comparator to ObjectArrayComparator.java
- Javadocs are auto-generated by IntelliJ

Please review my code now. Kindly let me know if there are any further changes required, or if this can be merged.




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-01 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1566#issuecomment-178174969
  
@tillrohrmann Thanks for the review. I had a few questions based on your comments. Kindly help me get them answered so that I can proceed.




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-01 Thread sbcd90
Github user sbcd90 commented on a diff in the pull request:

https://github.com/apache/flink/pull/1566#discussion_r51476173
  
--- Diff: flink-core/src/main/java/org/apache/flink/api/common/typeutils/base/GenericArrayComparator.java ---
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.common.typeutils.base;
+
+import org.apache.flink.api.common.typeutils.TypeComparator;
+import org.apache.flink.api.common.typeutils.TypeSerializer;
+import org.apache.flink.core.memory.DataInputView;
+import org.apache.flink.core.memory.DataOutputView;
+import org.apache.flink.core.memory.MemorySegment;
+
+import java.io.IOException;
+import java.lang.reflect.Array;
+import java.util.Arrays;
+
+public class GenericArrayComparator<T> extends TypeComparator<T[]> implements java.io.Serializable {
+
+   private static final long serialVersionUID = 1L;
+
+   private transient T[] reference;
+
+   protected final boolean ascendingComparison;
+
+   private final TypeSerializer<T[]> serializer;
+
+   // For use by getComparators
+   @SuppressWarnings("rawtypes")
+   private final TypeComparator[] comparators = new TypeComparator[] {this};
+
+   public GenericArrayComparator(boolean ascending, TypeSerializer<T[]> serializer) {
+      this.ascendingComparison = ascending;
+      this.serializer = serializer;
+   }
+
+   @Override
+   public void setReference(T[] reference) {
+      this.reference = reference;
+   }
+
+   @Override
+   public boolean equalToReference(T[] candidate) {
+      return compare(this.reference, candidate) == 0;
+   }
+
+   @Override
+   public int compareToReference(TypeComparator<T[]> referencedComparator) {
+      int comp = compare(((GenericArrayComparator) referencedComparator).reference, reference);
+      return ascendingComparison ? comp : -comp;
+   }
+
+   @Override
+   public int compareSerialized(DataInputView firstSource, DataInputView secondSource) throws IOException {
+      T[] firstArray = serializer.deserialize(firstSource);
+      T[] secondArray = serializer.deserialize(secondSource);
+
+      int comp = compare(firstArray, secondArray);
+      return ascendingComparison ? comp : -comp;
+   }
+
+   @Override
+   public int extractKeys(Object record, Object[] target, int index) {
+      target[index] = record;
+      return 1;
+   }
+
+   @Override
+   public TypeComparator[] getFlatComparators() {
+      return comparators;
+   }
+
+   @Override
+   public boolean supportsNormalizedKey() {
+      return false;
+   }
+
+   @Override
+   public boolean supportsSerializationWithKeyNormalization() {
+      return false;
+   }
+
+   @Override
+   public int getNormalizeKeyLen() {
+      return 0;
+   }
+
+   @Override
+   public boolean isNormalizedKeyPrefixOnly(int keyBytes) {
+      throw new UnsupportedOperationException();
+   }
+
+   @Override
+   public void putNormalizedKey(T[] record, MemorySegment target, int offset, int numBytes) {
+      throw new UnsupportedOperationException();
+   }
+
+   @Override
+   public void writeWithKeyNormalization(T[] record, DataOutputView target) throws IOException {
+      throw new UnsupportedOperationException();
+   }
+
+   @Override
+   public T[] readWithKeyDenormalization(T[] reuse, DataInputView source) throws IOException {
+      throw new UnsupportedOperationException();
+   }
+
+   @Override
+   public boolean invertNormalizedKey() {
+      return !ascendingComparison;
+   }
+
+   @Override
+   public int hash(T[] record) {
+      return Arrays

[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-01 Thread sbcd90
Github user sbcd90 commented on a diff in the pull request:

https://github.com/apache/flink/pull/1566#discussion_r51476344
  
--- Diff: flink-core/src/test/java/org/apache/flink/api/common/typeutils/base/GenericArrayComparatorTest.java ---
@@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.api.common.typeutils.base;
+
+import org.apache.flink.api.common.typeinfo.PrimitiveArrayTypeInfo;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeutils.ComparatorTestBase;
+import org.apache.flink.api.common.typeutils.TypeComparator;
+import org.apache.flink.api.common.typeutils.TypeSerializer;
+import org.junit.Assert;
+
+public class GenericArrayComparatorTest extends ComparatorTestBase<char[][]> {
--- End diff --

Should another test class be created for testing non-primitive types?




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-02-01 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1566#issuecomment-178034715
  
@StephanEwen could you also help reviewing the code?




[GitHub] flink pull request: [FLINK-2678]DataSet API does not support multi...

2016-01-31 Thread sbcd90
GitHub user sbcd90 opened a pull request:

https://github.com/apache/flink/pull/1566

[FLINK-2678]DataSet API does not support multi-dimensional arrays as keys

Hello,

@tillrohrmann I have added support for multi-dimensional arrays as keys in the DataSet API. Please review & merge.
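
As an illustration of what this enables (assuming the change lets a multi-dimensional array field be used directly as a grouping key; this is not a test from the PR itself):

```
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;

public class MultiDimArrayKeyExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Tuple2<int[][], Integer>> input = env.fromElements(
            new Tuple2<>(new int[][]{{1, 2}, {3, 4}}, 1),
            new Tuple2<>(new int[][]{{1, 2}, {3, 4}}, 2),
            new Tuple2<>(new int[][]{{5, 6}, {7, 8}}, 3));

        // group on the multi-dimensional array field & sum the counts
        input.groupBy(0).sum(1).print();
    }
}
```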

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sbcd90/flink FLINK-2678

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/1566.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1566


commit ec0846c64c1143dbddabc69df984dbbe64110ca6
Author: Subhobrata Dey <sbc...@gmail.com>
Date:   2016-02-01T01:50:43Z

[FLINK-2678]DataSet API does not support multi-dimensional arrays as keys






[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-28 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1551#issuecomment-176153352
  
Hello zentol,

I don't see any updates from your side. Can you kindly let me know if this PR is good to be merged so that I can close the issue?




[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-27 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1551#issuecomment-175810825
  
Hello zentol,

I have squashed all changes. Can you please merge now?




[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-27 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1551#issuecomment-175592466
  
Hello zentol,

To answer your initial question,

```
is there any way to check which createStatement method is supported? what 
happens if you use the wrong statement?
```

I went into the JDBC driver code & found that both methods are supported. However, with the values defaulted in the code, `ResultSet.TYPE_SCROLL_INSENSITIVE` & `ResultSet.CONCUR_READ_ONLY`, I got this error:

```
java.sql.SQLException: Result set type is TYPE_FORWARD_ONLY
```

I think both variants should be supported & it should be up to the API user to specify which variant to go for; hence this PR. Kindly let me know if the code changes are fine now.
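
For context, these are the two `createStatement` variants in question in plain JDBC (the Derby URL is only an illustrative placeholder; how flink-jdbc exposes the choice is what this PR proposes):

```
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CreateStatementVariants {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:derby:memory:demo;create=true")) {
            // variant 1: driver defaults, typically TYPE_FORWARD_ONLY / CONCUR_READ_ONLY
            Statement forwardOnly = conn.createStatement();

            // variant 2: explicitly request scrollable, read-only result sets;
            // drivers that only support forward-only result sets may fail here
            // or later, as in the error shown above
            Statement scrollable = conn.createStatement(
                ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
        }
    }
}
```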




[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-27 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1551#issuecomment-175589833
  
Hello zentol,

I have made the changes according to your post. Kindly review now.




[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-27 Thread sbcd90
Github user sbcd90 commented on the pull request:

https://github.com/apache/flink/pull/1551#issuecomment-175653356
  
Hello zentol,

Can you check the code changes now? The changes look exactly like what you suggested. Can you let me know if this can be merged, or should I open a new PR?




[GitHub] flink pull request: [FLINK-3292]Fix for Bug in flink-jdbc. Not all...

2016-01-26 Thread sbcd90
GitHub user sbcd90 opened a pull request:

https://github.com/apache/flink/pull/1551

[FLINK-3292]Fix for Bug in flink-jdbc. Not all JDBC drivers supported

Hello,

Here is the fix for issue FLINK-3292. Kindly review & merge.

Thanks & regards,
Subhobrata

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/sbcd90/flink master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/1551.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1551


commit 8f1ab49c823d53d4b46eb57789bdca29533bb37e
Author: Subhobrata Dey <sbc...@gmail.com>
Date:   2016-01-26T20:18:49Z

[FLINK-3292]Fix for Bug in flink-jdbc. Not all JDBC drivers supported



