Can you paste the exception stack here?

Thanks
Best Regards

On Mon, Mar 21, 2016 at 1:42 PM, Sarath Chandra <
sarathchandra.jos...@algofusiontech.com> wrote:

> I'm using Hadoop 1.0.4 and Spark 1.2.0.
>
> I'm facing a strange issue. I have a requirement to read a small file from
> HDFS, and all of its contents must be read in one shot. So I'm using the
> spark context's wholeTextFiles API, passing the HDFS URL for the file.
>
> When I try this from a spark shell it works as described in the
> documentation, but when I try the same through a program (by submitting the
> job to the cluster) I get a FileNotFoundException. I have all compatible
> JARs in place.
>
> Please help.
>
>
>
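For reference, a minimal sketch of the usage described above (the app name, namenode host/port, and path below are placeholders, not details from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Standalone sketch of reading small files in one shot with wholeTextFiles.
val conf = new SparkConf().setAppName("WholeTextFilesExample")
val sc = new SparkContext(conf)

// wholeTextFiles returns an RDD of (path, fileContent) pairs, so each
// small file's entire content arrives as a single string.
val files = sc.wholeTextFiles("hdfs://namenode:9000/path/to/dir")

files.collect().foreach { case (path, content) =>
  println(s"$path -> ${content.length} chars")
}
```

Note that when submitting to a cluster, a fully qualified hdfs:// URL is usually needed; a bare path may resolve against a different default filesystem than it does in the spark shell, which can surface as a FileNotFoundException.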
