Hi, I'm running Spark on YARN. My code is very simple: I want to kill one
executor while "data.repartition(10)" is being executed. How can I do that in
an easy way?


import java.nio.ByteBuffer

import org.apache.avro.mapred.AvroKey
import org.apache.avro.mapreduce.AvroKeyOutputFormat
import org.apache.hadoop.io.{BytesWritable, NullWritable}

// Read the sequence file and decode each record (Data is my own class).
val data = sc.sequenceFile[NullWritable, BytesWritable](inputPath)
  .map { case (_, value) =>
    Data.fromBytes(value)
  }

val process = data.repartition(10) // kill one executor here

// Re-encode each record as Avro bytes and write the result out.
process
  .map { d =>
    val bytes = d.toByteArray
    (new AvroKey(ByteBuffer.wrap(bytes)), NullWritable.get())
  }
  .saveAsNewAPIHadoopFile[AvroKeyOutputFormat[ByteBuffer]](outputPath)
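
One idea I had (not sure it's the easiest way) is to let the driver kill an
executor with SparkContext.killExecutor, triggered from a SparkListener so the
kill happens while the repartition stage is running. Below is a rough, untested
sketch of that idea; the listener logic and the choice of which executor/stage
to target are just assumptions on my part, and I don't know whether
killExecutor works on YARN without dynamic allocation enabled.

import java.util.concurrent.ConcurrentLinkedQueue

import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded, SparkListenerStageSubmitted}

// Rough sketch, untested: remember executor IDs as they register, then ask
// the driver to kill one of them when a stage is submitted. A real version
// would have to match only the stage produced by repartition(10).
val knownExecutors = new ConcurrentLinkedQueue[String]()

sc.addSparkListener(new SparkListener {
  override def onExecutorAdded(added: SparkListenerExecutorAdded): Unit = {
    knownExecutors.add(added.executorId)
  }

  override def onStageSubmitted(stage: SparkListenerStageSubmitted): Unit = {
    // killExecutor returns false if the request could not be sent.
    Option(knownExecutors.poll()).foreach(sc.killExecutor)
  }
})

Is something like this reasonable, or is there a simpler way (e.g. just
kill -9 the executor JVM on its NodeManager host while the job is in that
stage)?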
