Yes, it would probably work. I cloned the master repo, compiled against Hadoop
2.8.0, and it worked. It would be nice to have Hadoop 2.8 binaries, since
Hadoop 2.8.1 has also been released.
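
For anyone else hitting this: once on a Hadoop 2.8.x build, the endpoint and
path-style settings go into the Hadoop configuration. Roughly like this (the
endpoint value is just my local example):

```xml
<!-- core-site.xml (or whichever Hadoop conf Flink picks up); values are examples -->
<configuration>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>http://localhost:9000</value>
  </property>
  <property>
    <!-- available from hadoop-aws 2.8.0; forces http://host/bucket/key URLs -->
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
</configuration>
```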

Mustafa Akın
www.mustafaak.in

On Wed, Aug 9, 2017 at 9:00 PM, Eron Wright <eronwri...@gmail.com> wrote:

> For reference: [FLINK-6466] Build Hadoop 2.8.0 convenience binaries
>
> On Wed, Aug 9, 2017 at 6:41 AM, Aljoscha Krettek <aljos...@apache.org>
> wrote:
>
>> So you're saying that this works if you manually compile Flink for Hadoop
>> 2.8.0? If yes, I think the solution is that we have to provide binaries for
>> Hadoop 2.8.0. If we did that with a possible Flink 1.3.3 release and
>> starting from Flink 1.4.0, would this be an option for you?
>>
>> Best,
>> Aljoscha
>>
>> On 11. Jul 2017, at 10:47, Mustafa AKIN <mustaf...@gmail.com> wrote:
>>
>> Hi all,
>>
>> I am trying to use the S3 backend with a custom endpoint. However, it is not
>> supported in hadoop-aws@2.7.3; I need at least version 2.8.0. The
>> underlying reason is that requests are sent as follows:
>>
>> DEBUG [main] (AmazonHttpClient.java:337) - Sending Request: HEAD
>> http://mustafa.localhost:9000 / Headers:
>>
>> because "fs.s3a.path.style.access" is not recognized in the old version. I
>> want the domain to remain the same and the bucket name to be appended to
>> the path (http://localhost:9000/mustafa/...).
>>
>> I cannot blindly bump the aws-java-sdk version to the latest; that causes:
>>
>> Caused by: java.lang.NoClassDefFoundError: Could not initialize class
>> com.amazonaws.ClientConfiguration
>> at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:182)
>>
>> So, if I increase hadoop-aws to 2.8.0 with the latest client, it causes
>> the following error:
>>
>> Caused by: java.lang.IllegalAccessError: tried to access method
>> org.apache.hadoop.metrics2.lib.MutableCounterLong.<init>(Lorg/apache/hadoop/metrics2/MetricsInfo;J)V
>> from class org.apache.hadoop.fs.s3a.S3AInstrumentation
>> at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:194)
>>
>> According to
>> https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#provide-s3-filesystem-dependency
>> I need hadoop-aws@2.7.2.
>>
>>
>> Should I be excluding hadoop-common from Flink somehow? Building Flink
>> from source with mvn clean install -DskipTests -Dhadoop.version=2.8.0 works,
>> but I want to manage this via Maven as much as possible.
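>>
>> (A sketch of what I mean by excluding it; the artifact that pulls in
>> hadoop-common is my guess — I assume it is flink-shaded-hadoop2 in Flink 1.3:)
>>
>> ```xml
>> <!-- pom.xml sketch: exclude hadoop-common from Flink's shaded Hadoop -->
>> <!-- dependency; flink-shaded-hadoop2 as the carrier is an assumption -->
>> <dependency>
>>   <groupId>org.apache.flink</groupId>
>>   <artifactId>flink-shaded-hadoop2</artifactId>
>>   <version>1.3.2</version>
>>   <exclusions>
>>     <exclusion>
>>       <groupId>org.apache.hadoop</groupId>
>>       <artifactId>hadoop-common</artifactId>
>>     </exclusion>
>>   </exclusions>
>> </dependency>
>> ```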
>>
>>
>>
>
