Hi,
   The fix is below. I am not sure if it's a proper fix; if someone
reviews it or requires any changes, I'm happy to open a PR.

@Override
public int read(ByteBuffer dst) throws IOException {
  if (closed) {
    throw new IOException("Channel is closed");
  }
  try {
    // Fast path: works when the underlying stream supports
    // ByteBuffer reads directly.
    return inputStream.read(dst);
  } catch (UnsupportedOperationException e) {
    // Fallback for streams without ByteBuffer support (e.g. S3):
    // read into a scratch array sized by remaining(), then copy only
    // the bytes actually read; read() may return fewer than requested,
    // or -1 at end of stream.
    byte[] dstb = new byte[dst.remaining()];
    int bytesRead = inputStream.read(dstb);
    if (bytesRead > 0) {
      dst.put(dstb, 0, bytesRead);
    }
    return bytesRead;
  }
}
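
For context: FSDataInputStream.read(ByteBuffer) only works when the
wrapped stream implements ByteBufferReadable, and the S3 input stream
does not, which is why the UnsupportedOperationException fallback is
needed. Here is a minimal, self-contained sketch of that fallback
pattern (class and method names here are mine for illustration, not
Beam's):

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.ByteBuffer;

public class FallbackReadSketch {

  // Stand-in for FSDataInputStream.read(ByteBuffer) on a stream that
  // is not ByteBufferReadable: it always throws, forcing the fallback.
  static int readViaByteBuffer(InputStream in, ByteBuffer dst) {
    throw new UnsupportedOperationException("Byte-buffer read unsupported");
  }

  static int read(InputStream in, ByteBuffer dst) throws IOException {
    try {
      return readViaByteBuffer(in, dst);
    } catch (UnsupportedOperationException e) {
      // Size the scratch array by remaining(), then copy only the
      // bytes actually read; read() may return fewer, or -1 at EOF.
      byte[] scratch = new byte[dst.remaining()];
      int bytesRead = in.read(scratch);
      if (bytesRead > 0) {
        dst.put(scratch, 0, bytesRead);
      }
      return bytesRead;
    }
  }

  public static void main(String[] args) throws IOException {
    InputStream in = new ByteArrayInputStream("hello s3".getBytes("UTF-8"));
    ByteBuffer dst = ByteBuffer.allocate(64);
    System.out.println(read(in, dst));  // prints 8
    System.out.println(read(in, dst));  // prints -1 (end of stream)
  }
}

On the write side, the NPE in the trace below is raised from
S3OutputStream.newBackupFile, which does
new File(conf.get("fs.s3.buffer.dir")), and File's String constructor
throws NullPointerException when the path is null. This is only a
guess, but if the Configuration built from --hdfsConfiguration never
picks up the default (${hadoop.tmp.dir}/s3 from core-default.xml),
setting the key explicitly might help, e.g.:

--hdfsConfiguration='[{"fs.default.name": "s3://xxx-output",
"fs.s3.awsAccessKeyId": "xxx", "fs.s3.awsSecretAccessKey": "yyy",
"fs.s3.buffer.dir": "/tmp/s3"}]'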


On Thu, Jul 6, 2017 at 1:29 PM, Lukasz Cwik <[email protected]>
wrote:

> Jyotirmoy, would you like to open a PR with the changes you made to get
> S3 reading working?
>
> On Thu, Jul 6, 2017 at 1:26 PM, Jyotirmoy Sundi <[email protected]>
> wrote:
>
> > Hi Ted,
> >     BEAM-2500 seems to cover reading. I made a couple of changes in
> > Beam's HadoopFileSystem and am able to read S3 data, but for writes I
> > am still facing the issues above. Any help would be highly appreciated.
> >
> > On Wed, Jul 5, 2017 at 8:49 PM, Ted Yu <[email protected]> wrote:
> >
> > > Please take a look at BEAM-2500 (and related JIRAs).
> > >
> > > Cheers
> > >
> > > On Wed, Jul 5, 2017 at 8:00 PM, Jyotirmoy Sundi <[email protected]>
> > > wrote:
> > >
> > > > Hi Folks,
> > > >
> > > >      I am trying to write to S3 from Beam.
> > > >
> > > > These are the configs I am passing:
> > > >
> > > > --hdfsConfiguration='[{"fs.default.name": "s3://xxx-output",
> > > > "fs.s3.awsAccessKeyId": "xxx", "fs.s3.awsSecretAccessKey": "yyy"}]'
> > > > --input="/home/hadoop/data" --output="s3://xx-output/beam-output/"
> > > >
> > > > *Any idea how I can write to S3? I am using Beam release-2.0.0.*
> > > >
> > > > *Trace*
> > > >
> > > > 17/07/06 02:55:46 WARN TaskSetManager: Lost task 7.0 in stage 2.0 (TID 31, ip-10-130-237-28.vpc.internal): org.apache.beam.sdk.util.UserCodeException: java.lang.NullPointerException
> > > > at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
> > > > at org.apache.beam.sdk.io.WriteFiles$WriteShardedBundles$auxiliary$TXDiaduA.invokeProcessElement(Unknown Source)
> > > > at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:197)
> > > > at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:155)
> > > > at org.apache.beam.runners.spark.translation.DoFnRunnerWithMetrics.processElement(DoFnRunnerWithMetrics.java:64)
> > > > at org.apache.beam.runners.spark.translation.SparkProcessContext$ProcCtxtIterator.computeNext(SparkProcessContext.java:165)
> > > > at org.apache.beam.runners.spark.repackaged.com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145)
> > > > at org.apache.beam.runners.spark.repackaged.com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140)
> > > > at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
> > > > at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
> > > > at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> > > > at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
> > > > at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> > > > at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> > > > at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
> > > > at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
> > > > at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
> > > > at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
> > > > at scala.collection.AbstractIterator.to(Iterator.scala:1157)
> > > > at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
> > > > at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
> > > > at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
> > > > at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
> > > > at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:927)
> > > > at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:927)
> > > > at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> > > > at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> > > > at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> > > > at org.apache.spark.scheduler.Task.run(Task.scala:89)
> > > > at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> > > > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > > at java.lang.Thread.run(Thread.java:745)
> > > >
> > > > Caused by: java.lang.NullPointerException
> > > > at java.io.File.<init>(File.java:277)
> > > > at org.apache.hadoop.fs.s3.S3OutputStream.newBackupFile(S3OutputStream.java:92)
> > > > at org.apache.hadoop.fs.s3.S3OutputStream.<init>(S3OutputStream.java:84)
> > > > at org.apache.hadoop.fs.s3.S3FileSystem.create(S3FileSystem.java:252)
> > > > at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:915)
> > > > at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:896)
> > > > at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:793)
> > > > at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:782)
> > > > at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:103)
> > > > at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:67)
> > > > at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:207)
> > > > at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:194)
> > > > at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:876)
> > > > at org.apache.beam.sdk.io.FileBasedSink$Writer.openUnwindowed(FileBasedSink.java:842)
> > > > at org.apache.beam.sdk.io.WriteFiles$WriteShardedBundles.processElement(WriteFiles.java:362)
> > > >
> > > >
> > > >
> > > > --
> > > > Best Regards,
> > > > Jyotirmoy Sundi
> > > >
> > >
> >
> >
> >
> > --
> > Best Regards,
> > Jyotirmoy Sundi
> >
>



-- 
Best Regards,
Jyotirmoy Sundi
