Re: Are flink connectors included in the binary release ?

2018-11-14 Thread Jeff Zhang
Thanks Chesnay, but if users want to use connectors in the Scala shell, they
have to download them.

On Wed, Nov 14, 2018 at 5:22 PM Chesnay Schepler  wrote:

> Connectors are never contained in binary releases, as they are supposed to
> be packaged into the user-jar.
>

-- 
Best Regards

Jeff Zhang


Re: Are flink connectors included in the binary release ?

2018-11-14 Thread Chesnay Schepler
Connectors are never contained in binary releases, as they are supposed to
be packaged into the user-jar.
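For reference, a minimal sbt sketch of the user-jar route (the connector artifact and version are illustrative, not from this thread): declaring the connector without the "provided" scope lets sbt-assembly package it, and its transitive dependencies, into the fat jar, while core Flink stays "provided" because the cluster already ships it.

```scala
// Connector packaged into the user-jar: no "provided" scope, so sbt-assembly
// includes it (and its transitive dependencies) in the fat jar.
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % flinkVersion

// Core Flink stays "provided": the cluster classpath already contains it.
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided"
```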





Are flink connectors included in the binary release ?

2018-11-14 Thread Jeff Zhang
I don't see the jars of the Flink connectors in the binary release of Flink
1.6.1, so I just want to confirm whether the Flink binary release includes
these connectors. Thanks

-- 
Best Regards

Jeff Zhang


Re: Possible conflict between in Flink connectors

2017-08-25 Thread Federico D'Ambrosio
Hi,

At the beginning I was wondering that myself too, and I don't know why
hbase-common wasn't being downloaded and included, so I added it
explicitly.

I was about to write that I may have solved this weird issue:
apparently the shading worked, and the ClassDefNotFound issue was caused by
the hbase-client jar missing from the fat jar, even though it should
have been there! So, I added the explicit dependency for it back to the
build.sbt, and now it's working.
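A sketch of the fix described above, assuming the HBase version already pinned for hbase-common in the original build.sbt:

```scala
// Pin hbase-client explicitly so it ends up in the fat jar, matching the
// hbase-common version already declared in the build (1.3.0 here).
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.3.0"
```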


2017-08-25 23:03 GMT+02:00 Robert Metzger :

> Hi,
>
> why do you need to add hbase-common as a separate dependency? Doesn't the
> "flink-hbase" dependency transitively pull in hbase?
>

Re: Possible conflict between in Flink connectors

2017-08-25 Thread Ted Yu
Looking at the dependencies for flink-hbase, we have:

[INFO] +- org.apache.hbase:hbase-server:jar:1.3.1:compile
[INFO] |  +- org.apache.hbase:hbase-common:jar:1.3.1:compile
[INFO] |  +- org.apache.hbase:hbase-protocol:jar:1.3.1:compile
[INFO] |  +- org.apache.hbase:hbase-procedure:jar:1.3.1:compile
[INFO] |  |  \- org.apache.hbase:hbase-common:jar:tests:1.3.1:compile
[INFO] |  +- org.apache.hbase:hbase-client:jar:1.3.1:compile

On Fri, Aug 25, 2017 at 2:03 PM, Robert Metzger  wrote:

> Hi,
>
> why do you need to add hbase-common as a separate dependency? Doesn't the
> "flink-hbase" dependency transitively pull in hbase?
>

Re: Possible conflict between in Flink connectors

2017-08-25 Thread Robert Metzger
Hi,

why do you need to add hbase-common as a separate dependency? Doesn't the
"flink-hbase" dependency transitively pull in hbase?

On Fri, Aug 25, 2017 at 6:35 PM, Ted Yu  wrote:

> If Guava 18.0 is used to build hbase 1.3, there would be compilation
> errors such as the following:
>
> [ERROR] /mnt/disk2/a/1.3-h/hbase-server/src/main/java/org/apache/hadoop/hbase/replication/regionserver/ReplicationSource.java:[271,25] error: cannot find symbol
> [ERROR]   symbol:   method stopAndWait()
> [ERROR]   location: variable replicationEndpoint of type ReplicationEndpoint
> [ERROR] /mnt/disk2/a/1.3-h/hbase-server/src/main/java/org/apache/hadoop/hbase/replication/regionserver/ReplicationSource.java:[281,47] error: cannot find symbol
> [ERROR]   symbol:   method start()
> [ERROR]   location: variable replicationEndpoint of type ReplicationEndpoint
>
> Maybe you can shade the guava dependency in hbase 1.3
>
> In the upcoming hbase 2.0 release, third party dependencies such as guava
> and netty are shaded. Meaning there wouldn't be such conflict with other
> components you may use.
>

Re: Possible conflict between in Flink connectors

2017-08-25 Thread Ted Yu
If Guava 18.0 is used to build hbase 1.3, there would be compilation errors
such as the following:

[ERROR] /mnt/disk2/a/1.3-h/hbase-server/src/main/java/org/apache/hadoop/hbase/replication/regionserver/ReplicationSource.java:[271,25] error: cannot find symbol
[ERROR]   symbol:   method stopAndWait()
[ERROR]   location: variable replicationEndpoint of type ReplicationEndpoint
[ERROR] /mnt/disk2/a/1.3-h/hbase-server/src/main/java/org/apache/hadoop/hbase/replication/regionserver/ReplicationSource.java:[281,47] error: cannot find symbol
[ERROR]   symbol:   method start()
[ERROR]   location: variable replicationEndpoint of type ReplicationEndpoint

Maybe you can shade the guava dependency in hbase 1.3

In the upcoming hbase 2.0 release, third-party dependencies such as guava
and netty are shaded, meaning there wouldn't be such conflicts with other
components you may use.


Possible conflict between in Flink connectors

2017-08-25 Thread Federico D'Ambrosio
Hello everyone, I'm new to Flink and am encountering a nasty problem while
trying to submit a streaming Flink Job. I'll try to explain it as
thoroughly as possible.

Premise: I'm using an HDP 2.6 hadoop cluster, with hadoop version
2.7.3.2.6.1.0-129, Flink compiled from sources accordingly (maven 3.0.5) as
per documentation

and I submit the job using yarn-session.sh and then flink run.

Getting into more details, the Flink Job is a fat jar built with sbt
assembly and the following dependencies:

libraryDependencies += "org.apache.flink" %% "flink-scala" %
flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" %
flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %% "flink-connector-kinesis"
% flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %% "flink-hbase" %
flinkVersion % "provided"
libraryDependencies += "org.apache.flink" %%
"flink-connector-filesystem" % flinkVersion % "provided"
libraryDependencies += "org.apache.hbase" % "hbase-common" % "1.3.0"
libraryDependencies += "org.joda" % "joda-convert" % "1.8.3"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.6.2"


assemblyOption in assembly := (assemblyOption in
assembly).value.copy(includeScala = false)


Inside the Flink lib folder I have the following jars:

flink-connector-filesystem_2.10-1.3.2.jar
flink-python_2.10-1.3.2.jar
flink-connector-kinesis_2.10-1.3.2.jar
flink-shaded-hadoop2-uber-1.3.2.jar
flink-dist_2.10-1.3.2.jar
log4j-1.2.17.jar
flink-hbase_2.10-1.3.2.jar
slf4j-log4j12-1.7.7.jar

Right now, the issue is that when I run the job with flink run, I get a
NoClassDefFound for the org/apache/hadoop/hbase/client/Put class, despite it
being inside the fat jar. So I tried putting both the hbase-common and
hbase-client jars inside the lib folder, only to get another
NoClassDefFound, this time for com/google/common/collect/ListMultimap.
Now, I noticed that flink-connector-kinesis and the hbase packages
(both the hbase-common and flink-hbase dependencies) depend on different
versions of guava: the first needs version 18.0, while the second needs
12.0.1. I then tried to enable shading of the guava packages with
the following:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "shade.com.google.@1").inAll
)


To no avail: I got the exact same error as before. And I really
don't know how to submit the job without pouring every single dependency
into the lib folder. I thought it could be a compatibility issue with the
YARN cluster, but I tried launching the job locally, only to get the same
error.

As additional info, I get the following warnings if I don't specify
the flink packages as provided:

[warn] * org.scalamacros:quasiquotes_2.10:2.1.0 is selected over 2.0.1
[warn] +- org.typelevel:macro-compat_2.10:1.1.1
(depends on 2.1.0)
[warn] +- org.apache.flink:flink-scala_2.10:1.3.2
(depends on 2.0.1)
[warn]
[warn] * org.mortbay.jetty:jetty-util:6.1.26 is selected over 6.1.26.hwx
[warn] +- org.apache.hbase:hbase-common:1.3.0
(depends on 6.1.26)
[warn] +- org.apache.flink:flink-shaded-hadoop2:1.3.2
(depends on 6.1.26.hwx)
[warn]
[warn] * io.netty:netty:3.8.0.Final is selected over 3.7.0.Final
[warn] +- com.data-artisans:flakka-remote_2.10:2.3-custom
(depends on 3.8.0.Final)
[warn] +- org.apache.zookeeper:zookeeper:3.4.6
(depends on 3.7.0.Final)
[warn]
[warn] * com.google.guava:guava:18.0 is selected over 12.0.1
[warn] +- org.apache.flink:flink-connector-kinesis_2.10:1.3.2
(depends on 18.0)
[warn] +- org.apache.hbase:hbase-prefix-tree:1.3.0
(depends on 12.0.1)
[warn] +- org.apache.hbase:hbase-procedure:1.3.0
(depends on 12.0.1)
[warn] +- org.apache.hbase:hbase-client:1.3.0
(depends on 12.0.1)
[warn] +- org.apache.hbase:hbase-common:1.3.0
(depends on 12.0.1)
[warn] +- org.apache.hbase:hbase-server:1.3.0
(depends on 12.0.1)
[warn]
[warn] * io.netty:netty-all:4.0.27.Final is selected over 4.0.23.Final
[warn] +- org.apache.flink:flink-runtime_2.10:1.3.2
(depends on 4.0.27.Final)
[warn] +- org.apache.flink:flink-shaded-hadoop2:1.3.2
(depends on 4.0.27.Final)
[warn] +- org.apache.hbase:hbase-prefix-tree:1.3.0
(depends on 4.0.23.Final)
[warn] +- org.apache.hbase:hbase-client:1.3.0
(depends on 4.0.23.Final)
[warn] +- org.apache.hbase:hbase-server:1.3.0
(depends on 4.0.23.Final)
[warn]
[warn] * junit:junit:4.12 is selected over 3.8.1
[warn] +- org.apache.hbase:hbase-protocol:1.3.0
(depends on 4.12)
[warn] +- org.apache.hbase:hbase-annotations:1.3.0
(depends on 4.12)
[warn] +- org.apache.hbase:hbase-prefix-tree:1.3.0
(depends on 4.12)
[warn] +- org.apache.hbase:hbase-procedure:1.3.0
(depends on 4.12)
[warn] +- 

Re: Why not add flink-connectors to flink dist?

2017-05-15 Thread Chesnay Schepler
You can either package the connector into the user-jar or place it in 
the /lib directory of the distribution.
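The second option can be sketched as a few shell commands; all paths below are placeholders (a demo directory stands in for a real Flink distribution, and a touched file stands in for the connector jar):

```shell
# Drop a connector jar into the distribution's lib folder so every job
# submitted to this installation can load its classes.
# Placeholder paths: flink-dist-demo stands in for the real Flink install.
mkdir -p flink-dist-demo/lib
touch flink-connector-kafka-demo.jar            # stand-in for the real jar
cp flink-connector-kafka-demo.jar flink-dist-demo/lib/
ls flink-dist-demo/lib
```

Note that jars in /lib are visible to every job on the cluster, which is exactly why dependency conflicts between connectors can bite there.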




Why not add flink-connectors to flink dist?

2017-05-15 Thread yunfan123
So how can I use it?
Does every jar file I submit have to contain the specific connector class?
Can I package it into flink-dist?



--
View this message in context:
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Why-not-add-flink-connectors-to-flink-dist-tp13134.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.


Re: flink connectors

2015-11-27 Thread Stephan Ewen
The reason the binary distribution does not contain all connectors is
that this would add all libraries used by the connectors to the binary
distribution jar.

These libraries partly conflict with each other, and often conflict with
the libraries used by users' programs. Not including them in the binary
distribution makes users' lives much easier with respect to dependency
conflicts.

Greetings,
Stephan
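When such a conflict does surface in a user-jar, one common workaround (a sketch, not something proposed in this thread) is to exclude the conflicting transitive dependency in the build so only one version remains on the classpath, e.g. in sbt:

```scala
// Hypothetical example: keep a single Guava on the classpath by excluding
// the transitive copy pulled in by an HBase dependency.
libraryDependencies += "org.apache.hbase" % "hbase-client" % "1.3.0" exclude("com.google.guava", "guava")
```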


On Fri, Nov 27, 2015 at 3:06 PM, Radu Tudoran <radu.tudo...@huawei.com>
wrote:

> Hi,
>
> Thank you for the tips!
>
> For future reference, in case someone else wants to search for the
> binaries, I would like to share the link to the maven repository
>
> http://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka
>
>
>
> Dr. Radu Tudoran
> Research Engineer
> IT R&D Division
>
>
> HUAWEI TECHNOLOGIES Duesseldorf GmbH
> European Research Center
> Riesstrasse 25, 80992 München
>
> E-mail: radu.tudo...@huawei.com
> Mobile: +49 15209084330
> Telephone: +49 891588344173
>
> HUAWEI TECHNOLOGIES Duesseldorf GmbH
> Hansaallee 205, 40549 Düsseldorf, Germany, www.huawei.com
> Registered Office: Düsseldorf, Register Court Düsseldorf, HRB 56063,
> Managing Director: Jingwen TAO, Wanzhou MENG, Lifang CHEN
> Sitz der Gesellschaft: Düsseldorf, Amtsgericht Düsseldorf, HRB 56063,
> Geschäftsführer: Jingwen TAO, Wanzhou MENG, Lifang CHEN
> This e-mail and its attachments contain confidential information from
> HUAWEI, which is intended only for the person or entity whose address is
> listed above. Any use of the information contained herein in any way
> (including, but not limited to, total or partial disclosure, reproduction,
> or dissemination) by persons other than the intended recipient(s) is
> prohibited. If you receive this e-mail in error, please notify the sender
> by phone or email immediately and delete it!
>
> -Original Message-
> From: Matthias J. Sax [mailto:mj...@apache.org]
> Sent: Friday, November 27, 2015 2:53 PM
> To: user@flink.apache.org
> Subject: Re: flink connectors
>
> If I understand the question right, you just want to download the jar
> manually?
>
> Just go to the maven repository website and download the jar from there.
>
>
> -Matthias
>
> On 11/27/2015 02:49 PM, Robert Metzger wrote:
> > Maybe there is a maven mirror you can access from your network?
> >
> > This site contains a list of some mirrors:
> > http://stackoverflow.com/questions/5233610/what-are-the-official-mirrors-of-the-maven-central-repository
> > You don't have to use the maven tool, you can also manually browse for
> > the jars and download what you need.
> >
> >
> > On Fri, Nov 27, 2015 at 2:46 PM, Fabian Hueske <fhue...@gmail.com> wrote:
> >
> > You can always build Flink from source, but apart from that I am not
> > aware of an alternative.
> >

RE: flink connectors

2015-11-27 Thread Radu Tudoran
Hi,

Is there any alternative to using Maven?
That is why I was curious whether there is a binary distribution of this
available for direct download.
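As the replies later in this thread note, the jars can be fetched from Maven Central without the Maven tool; a sketch of the repository's URL layout (the artifact name and version are illustrative, check the repository for the coordinates matching your Flink release):

```shell
# Maven Central lays artifacts out as:
#   https://repo1.maven.org/maven2/<group-with-slashes>/<artifact>/<version>/<artifact>-<version>.jar
GROUP="org/apache/flink"
ARTIFACT="flink-connector-kafka"
VERSION="0.10.0"
URL="https://repo1.maven.org/maven2/${GROUP}/${ARTIFACT}/${VERSION}/${ARTIFACT}-${VERSION}.jar"
echo "$URL"
# The jar can then be downloaded directly, e.g. with: curl -O "$URL"
```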


From: Fabian Hueske [mailto:fhue...@gmail.com]
Sent: Friday, November 27, 2015 2:41 PM
To: user@flink.apache.org
Subject: Re: flink connectors

Hi Radu,
the connectors are available in Maven Central.
Just add them as a dependency in your project and they will be fetched and 
included.
Best, Fabian

2015-11-27 14:38 GMT+01:00 Radu Tudoran 
<radu.tudo...@huawei.com<mailto:radu.tudo...@huawei.com>>:
Hi,

I was trying to use flink connectors. However, when I tried to import this

import org.apache.flink.streaming.connectors.*;

I saw that they are not present in the binary distribution as downloaded from 
website (flink-dist-0.10.0.jar). Is this intentionally? Is there also a binary 
distribution that contains these connectors?

Regards,

Dr. Radu Tudoran




Re: flink connectors

2015-11-27 Thread Fabian Hueske
You can always build Flink from source, but apart from that I am not aware
of an alternative.

2015-11-27 14:42 GMT+01:00 Radu Tudoran <radu.tudo...@huawei.com>:

> Hi,
>
>
>
> Is there an alternative that avoids Maven?
>
> That is why I was curious whether there is a binary distribution of these
> connectors available for download directly.
>
>
>
> Dr. Radu Tudoran
>
>
>
>
> *From:* Fabian Hueske [mailto:fhue...@gmail.com]
> *Sent:* Friday, November 27, 2015 2:41 PM
> *To:* user@flink.apache.org
> *Subject:* Re: flink connectors
>
>
>
> Hi Radu,
>
> the connectors are available in Maven Central.
>
> Just add them as a dependency in your project and they will be fetched and
> included.
>
> Best, Fabian
>
>
>


Re: flink connectors

2015-11-27 Thread Matthias J. Sax
If I understand the question right, you just want to download the jar
manually?

Just go to the maven repository website and download the jar from there.


-Matthias
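
For readers who want to fetch a connector jar by hand, the standard Maven
Central directory layout makes the URL predictable from the artifact's
coordinates. The sketch below assumes that layout; the coordinates shown
(flink-connector-kafka, version 0.10.0) are illustrative and should be
replaced with whatever artifact and version you actually need:

```shell
# Derive the Maven Central download URL from Maven coordinates, assuming
# the standard repository layout (dots in the groupId become slashes).
GROUP="org.apache.flink"
ARTIFACT="flink-connector-kafka"
VERSION="0.10.0"

# Convert the groupId to a directory path, e.g. org.apache.flink -> org/apache/flink
GROUP_PATH=$(echo "$GROUP" | tr '.' '/')

URL="https://repo1.maven.org/maven2/${GROUP_PATH}/${ARTIFACT}/${VERSION}/${ARTIFACT}-${VERSION}.jar"
echo "$URL"
# The jar can then be fetched with any HTTP client, e.g.:  curl -O "$URL"
```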

On 11/27/2015 02:49 PM, Robert Metzger wrote:
> Maybe there is a maven mirror you can access from your network?
> 
> This site contains a list of some mirrors
> http://stackoverflow.com/questions/5233610/what-are-the-official-mirrors-of-the-maven-central-repository
> You don't have to use the maven tool, you can also manually browse for
> the jars and download what you need.
> 
> 

RE: flink connectors

2015-11-27 Thread Radu Tudoran
Hi,

Thank you for the tips!

For future reference, in case someone else wants to search for the binaries,
here is the link to the Maven repository:

http://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka
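
In a project's pom.xml, the dependency from that page would be declared
roughly as follows (the version is illustrative; use the one matching your
Flink release):

```xml
<!-- Illustrative Maven dependency for the Kafka connector; adjust the
     version to match the Flink release in use. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>0.10.0</version>
</dependency>
```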



Dr. Radu Tudoran
Research Engineer
IT R&D Division


HUAWEI TECHNOLOGIES Duesseldorf GmbH
European Research Center
Riesstrasse 25, 80992 München

E-mail: radu.tudo...@huawei.com
Mobile: +49 15209084330
Telephone: +49 891588344173


-----Original Message-----
From: Matthias J. Sax [mailto:mj...@apache.org] 
Sent: Friday, November 27, 2015 2:53 PM
To: user@flink.apache.org
Subject: Re: flink connectors

If I understand the question right, you just want to download the jar manually?

Just go to the maven repository website and download the jar from there.


-Matthias

On 11/27/2015 02:49 PM, Robert Metzger wrote:
> Maybe there is a maven mirror you can access from your network?
> 
> This site contains a list of some mirrors 
> http://stackoverflow.com/questions/5233610/what-are-the-official-mirrors-of-the-maven-central-repository
> You don't have to use the maven tool, you can also manually browse for 
> the jars and download what you need.
> 
> 

flink connectors

2015-11-27 Thread Radu Tudoran
Hi,

I was trying to use flink connectors. However, when I tried to import this

import org.apache.flink.streaming.connectors.*;

I saw that they are not present in the binary distribution as downloaded from
the website (flink-dist-0.10.0.jar). Is this intentional? Is there also a binary
distribution that contains these connectors?

Regards,

Dr. Radu Tudoran
Research Engineer
IT R&D Division




Re: flink connectors

2015-11-27 Thread Robert Metzger
Maybe there is a maven mirror you can access from your network?

This site contains a list of some mirrors
http://stackoverflow.com/questions/5233610/what-are-the-official-mirrors-of-the-maven-central-repository
You don't have to use the maven tool, you can also manually browse for the
jars and download what you need.


On Fri, Nov 27, 2015 at 2:46 PM, Fabian Hueske <fhue...@gmail.com> wrote:

> You can always build Flink from source, but apart from that I am not aware
> of an alternative.
>


Re: flink connectors

2015-11-27 Thread Ovidiu-Cristian MARCU
Hi,

The main question here is why the binary release doesn't contain the
connector dependencies.
It is fair to say that it does not have to (which connectors would it include,
some or all?). So, just like Spark, Flink offers a binary distribution for
Hadoop only, without bundling other dependencies.

The thing to consider is whether it would help to offer, on the
flink.apache.org download page, a customised page that lets the user also
choose dependencies (connectors) to be included in the downloaded binary.

Best regards,
Ovidiu



> On 27 Nov 2015, at 14:52, Matthias J. Sax <mj...@apache.org> wrote:
> 
> If I understand the question right, you just want to download the jar
> manually?
> 
> Just go to the maven repository website and download the jar from there.
> 
> 
> -Matthias
> 
> On 11/27/2015 02:49 PM, Robert Metzger wrote:
>> Maybe there is a maven mirror you can access from your network?
>> 
>> This site contains a list of some mirrors
>> http://stackoverflow.com/questions/5233610/what-are-the-official-mirrors-of-the-maven-central-repository
>> You don't have to use the maven tool, you can also manually browse for
>> the jars and download what you need.