Do you see similar hangs when accessing files through the `hadoop fs`
command?  Or does this just happen in Spark?

I ran into a similar problem earlier this year that turned out to be due to
a missing Kerberos configuration file (even though I wasn't using secure
Hadoop):
http://stackoverflow.com/questions/9952094/hadoop-filesystem-getfs-pauses-for-about-2-minutes
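
If it is the same issue, timing FileSystem.getLocal(conf) on its own from
the spark shell should show the delay with Spark out of the picture. A rough
sketch (this assumes the Hadoop classes bundled with Spark are on the
shell's classpath; hadoop.security.authentication is the standard security
switch and normally reads "simple" when Kerberos is off):

scala> import org.apache.hadoop.conf.Configuration
scala> import org.apache.hadoop.fs.FileSystem

scala> val conf = new Configuration()
scala> conf.get("hadoop.security.authentication")    // expect "simple" without Kerberos

scala> val t0 = System.nanoTime
scala> val fs = FileSystem.getLocal(conf)             // the call you traced the hang to
scala> println("getLocal took " + (System.nanoTime - t0) / 1e9 + " s")

If that first getLocal call alone takes ~60 seconds, the delay is coming
from the Hadoop/JAAS layer rather than from Spark itself.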


On Thu, Dec 12, 2013 at 9:10 AM, Milos Nikolic <[email protected]> wrote:

> Hello,
>
> When trying to read from a file, sc.textFile() hangs for exactly one
> minute. From the Spark shell:
>
> scala> val v = sc.textFile("README.txt")        // Hangs for one minute
>
> After one minute the command successfully returns the result. Now, v.count
> also blocks for one minute but returns the correct result. Interestingly,
> this happens only once. Any subsequent sc.textFile and count calls on
> different input files work without any problems.
>
> In the example I'm trying to read from a local filesystem. I've tracked
> down the problem to the FileSystem.getLocal(conf) call. Any ideas what
> might be the cause of the one-minute delay?
>
> Environment: Spark 0.8.0, Solaris 10
>
> Thanks,
> Milos
