Hi,

Simply put, you are submitting another job from the scheduler's event thread. That
thread is blocked waiting for the result, so it can never process the new job
submission event. Your second job is therefore never run, and
getPreferredLocations never returns.
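
For example, here is a minimal sketch of one way to avoid the deadlock: run the
job on partitionsRDD in the driver first, build a partition-index-to-hosts map
from its result, and only read that map inside getPreferredLocations. The names
here (DataChunkRDD, preferredHosts) and the placeholder DataChunkPartition are
just for illustration, not your actual code:

    import org.apache.spark.{Partition, TaskContext}
    import org.apache.spark.rdd.RDD

    // Placeholder for the element type mentioned in your post.
    case class DataChunkPartition(id: Int, data: Array[Byte])

    // Wraps partitionsRDD with a one-to-one dependency. preferredHosts maps
    // partition index -> hosts and is computed in the driver *before* this
    // RDD is handed to the scheduler, so getPreferredLocations never needs
    // to submit a job from the event loop thread.
    class DataChunkRDD(
        partitionsRDD: RDD[DataChunkPartition],
        preferredHosts: Map[Int, Seq[String]])
      extends RDD[DataChunkPartition](partitionsRDD) {

      override protected def getPartitions: Array[Partition] =
        firstParent[DataChunkPartition].partitions

      override def compute(split: Partition, context: TaskContext): Iterator[DataChunkPartition] =
        firstParent[DataChunkPartition].iterator(split, context)

      // No runJob here, so the DAGScheduler event loop is never blocked.
      override protected def getPreferredLocations(split: Partition): Seq[String] =
        preferredHosts.getOrElse(split.index, Nil)
    }

On the driver side, something like:

    // Run the job normally (outside getPreferredLocations), derive the host
    // map from the result, then construct the custom RDD.
    val parts: Array[Array[DataChunkPartition]] =
      sc.runJob(partitionsRDD, (it: Iterator[DataChunkPartition]) => it.toArray)
    // Replace this with your real host lookup per partition index.
    val hosts: Map[Int, Seq[String]] =
      parts.indices.map(i => i -> Seq.empty[String]).toMap
    val rdd = new DataChunkRDD(partitionsRDD, hosts)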



Fei Hu wrote
> Dear all,
> 
> I tried to customize my own RDD. In the getPreferredLocations() function,
> I used the following code to query another RDD, which was used as an input
> to initialize this customized RDD:
> 
>     val results: Array[Array[DataChunkPartition]] =
>       context.runJob(partitionsRDD,
>         (context: TaskContext, partIter: Iterator[DataChunkPartition]) => partIter.toArray,
>         partitions, allowLocal = true)
> 
> The problem is that when executing the above code, the task seemed to be
> suspended: the job just hung at this call, with no errors and no output.
> 
> What is the reason for it?
> 
> Thanks,
> Fei





-----
Liang-Chi Hsieh | @viirya 
Spark Technology Center 
http://www.spark.tc/ 

