Many people use it. How do you add the AWS SDK to the classpath? Check the Environment tab in the Spark UI to see what is actually on your classpath. You should make sure the AWS SDK version on your classpath is compatible with the one Spark was compiled against; I think 1.7.4 is compatible (at least that is what we use).
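For example, one way to put a known-compatible SDK version on the classpath at submit time (the jar paths, versions, and application names below are placeholders, not from this thread):

```shell
# Explicitly supply a pinned AWS SDK and the matching hadoop-aws jar
# when launching. Paths, versions, and the main class are placeholders --
# adjust them for your own setup.
spark-submit \
  --jars /opt/jars/aws-java-sdk-1.7.4.jar,/opt/jars/hadoop-aws-2.6.0.jar \
  --class com.example.MyApp \
  myapp.jar
```

You can then confirm the jars were picked up under the Environment tab of the running application's UI.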
Also make sure you don't pull in other versions through transitive dependencies (check the effective POM).

On 22 October 2015 at 17:12, Ashish Shrowty <[email protected]> wrote:

> I understand that there is some incompatibility with the API between Hadoop
> 2.6/2.7 and the Amazon AWS SDK, where they changed the signature of
> com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold.
> The JIRA indicates that this would be fixed in Hadoop 2.8.
> (https://issues.apache.org/jira/browse/HADOOP-12420)
>
> My question is - what are people doing today to access S3? I am unable to
> find an older JAR of the AWS SDK to test with.
>
> Thanks,
> Ashish
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-5-1-Hadoop2-6-unable-to-write-to-S3-HADOOP-12420-tp25163.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [email protected]
> For additional commands, e-mail: [email protected]
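To follow up on the transitive-dependency point above: with a Maven build, you can check which AWS SDK version actually wins in the resolved dependency graph (commands below assume Maven; the equivalent exists for sbt/Gradle):

```shell
# Show every path through which the AWS SDK enters the dependency graph,
# including transitive routes that may drag in a newer, incompatible version.
mvn dependency:tree -Dincludes=com.amazonaws:aws-java-sdk

# Render the effective POM to confirm which version Maven finally resolves.
mvn help:effective-pom | grep -B 2 -A 2 aws-java-sdk
```

If a newer version sneaks in transitively, you can add an `<exclusion>` on the offending dependency and declare the 1.7.4 artifact directly.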
