The version problems stem from using hadoop-aws-2.7.3 alongside
aws-sdk-1.7.4 in hadoop-2.7.3, where DynamoDB functionality is limited
(it may not even operate against deployed versions of the service). I've
stripped out usage of DynamoDB from the driver program in the meantime
(using it in a
On 27 Oct 2016, at 23:04, adam kramer wrote:
Is the version of Spark built for Hadoop 2.7 and later only for 2.x releases?
Is there any reason why Hadoop 3.0 is a non-starter for use with Spark
2.0? The version of aws-sdk in 3.0 actually works for DynamoDB which
would resolve our driver dependency issues.
Worked for me 2 weeks ago with a 3.0.0-alpha2 snapshot. Just changed
hadoop.version while building.
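For reference, the override described above can be done through Spark's own Maven build. A minimal sketch, assuming the `-Phadoop-2.7` profile (the closest one Spark ships) and the `3.0.0-alpha2` snapshot version mentioned above; adjust the version string to whatever Hadoop 3 artifact your repository actually provides:

```shell
# Sketch: build Spark against a Hadoop 3.x artifact by overriding hadoop.version.
# 3.0.0-alpha2 is the snapshot named in this thread; substitute your own version.
./build/mvn -Phadoop-2.7 -Dhadoop.version=3.0.0-alpha2 -DskipTests clean package
```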
On Fri, Oct 28, 2016, 11:50 Sean Owen wrote:
> I don't think it works, but, there is no Hadoop 3.0 right now either. As
> the version implies, it's going to be somewhat different API-wise.
I don't think it works, but, there is no Hadoop 3.0 right now either. As
the version implies, it's going to be somewhat different API-wise.
On Thu, Oct 27, 2016 at 11:04 PM adam kramer wrote:
> Is the version of Spark built for Hadoop 2.7 and later only for 2.x
> releases?
Is the version of Spark built for Hadoop 2.7 and later only for 2.x releases?
Is there any reason why Hadoop 3.0 is a non-starter for use with Spark
2.0? The version of aws-sdk in 3.0 actually works for DynamoDB which
would resolve our driver dependency issues.
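One way to see the dependency conflict at issue here is to inspect the dependency tree of a project that pulls in hadoop-aws; in Hadoop 2.7.x, hadoop-aws depends on aws-java-sdk 1.7.4, which predates much of the current DynamoDB API. A sketch, assuming a Maven project that declares `org.apache.hadoop:hadoop-aws:2.7.3` as a dependency:

```shell
# Sketch: list which com.amazonaws artifacts (and versions) hadoop-aws drags in,
# to confirm the clash with a newer AWS SDK needed by DynamoDB code.
mvn dependency:tree -Dincludes=com.amazonaws
```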
Thanks,
Adam