BTW, the error happens when I configure Spark to read the input file from
Tachyon, as follows:

/home/ubuntu/spark-1.6.0/bin/spark-submit \
  --properties-file /home/ubuntu/HiBench/report/kmeans/spark/java/conf/sparkbench/spark.conf \
  --class org.apache.spark.examples.mllib.JavaKMeans \
  --master spark://ip-10-73-198-35:7077 \
  /home/ubuntu/HiBench/src/sparkbench/target/sparkbench-5.0-SNAPSHOT-MR2-spark1.5-jar-with-dependencies.jar \
  tachyon://localhost:19998/Kmeans/Input/samples 10 5
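For what it's worth, the same Tachyon read path can be hit from a few lines of
standalone Spark (Scala) code. This is only a minimal sketch, not the
HiBench/JavaKMeans code: the object name and the textFile call are illustrative,
and it assumes the Tachyon 0.8.2 client jar is on the driver/executor classpath
so the "tachyon" scheme resolves to tachyon.hadoop.TFS.

import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: point Spark at the same tachyon:// URI and force a read
// with count(). Names here are illustrative only.
object TachyonReadSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("TachyonReadSketch")
    val sc = new SparkContext(conf)

    // Same input URI as the spark-submit command above.
    val input = "tachyon://localhost:19998/Kmeans/Input/samples"

    // textFile is used only for illustration; the actual HiBench input is a
    // SequenceFile (see SequenceFileRecordReader in the stack trace below).
    val records = sc.textFile(input)
    println(s"record count: ${records.count()}")

    sc.stop()
  }
}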

On Wed, Jan 27, 2016 at 5:02 AM, Jia Zou <jacqueline...@gmail.com> wrote:

> Dears, I keep getting the exception below when using Spark 1.6.0 on top of
> Tachyon 0.8.2. Tachyon storage is 93% used and configured as CACHE_THROUGH.
>
> Any suggestions will be appreciated, thanks!
>
> =====================================
>
> Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 13 in stage 0.0 failed 4 times, most recent
> failure: Lost task 13.3 in stage 0.0 (TID 33,
> ip-10-73-198-35.ec2.internal): java.io.IOException:
> tachyon.org.apache.thrift.transport.TTransportException
> at tachyon.worker.WorkerClient.unlockBlock(WorkerClient.java:416)
> at tachyon.client.block.LocalBlockInStream.close(LocalBlockInStream.java:87)
> at tachyon.client.file.FileInStream.close(FileInStream.java:105)
> at tachyon.hadoop.HdfsFileInputStream.read(HdfsFileInputStream.java:171)
> at java.io.DataInputStream.readInt(DataInputStream.java:388)
> at org.apache.hadoop.io.SequenceFile$Reader.readRecordLength(SequenceFile.java:2325)
> at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:2356)
> at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:2493)
> at org.apache.hadoop.mapred.SequenceFileRecordReader.next(SequenceFileRecordReader.java:82)
> at org.apache.spark.rdd.HadoopRDD$$anon$1.getNext(HadoopRDD.scala:246)
> at org.apache.spark.rdd.HadoopRDD$$anon$1.getNext(HadoopRDD.scala:208)
> at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
> at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:369)
> at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:193)
> at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:39)
> at org.apache.spark.rdd.RDD$$anonfun$zip$1$$anonfun$apply$31$$anon$1.hasNext(RDD.scala:851)
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:369)
> at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1595)
> at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1143)
> at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1143)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:89)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: tachyon.org.apache.thrift.transport.TTransportException
> at tachyon.org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> at tachyon.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> at tachyon.org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129)
> at tachyon.org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
> at tachyon.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
> at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
> at tachyon.org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
> at tachyon.org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
> at tachyon.thrift.WorkerService$Client.recv_unlockBlock(WorkerService.java:455)
> at tachyon.thrift.WorkerService$Client.unlockBlock(WorkerService.java:441)
> at tachyon.worker.WorkerClient.unlockBlock(WorkerClient.java:413)
> ... 28 more
