Hmm. Strange. Even the code below hangs.

val r = sc.parallelize(List(1,2,3,4,5,6,7,8))
r.count

I then looked at the web UI on port 8080 and realized that the Spark shell is in WAITING status because another job is running on the standalone cluster. This may sound like a very stupid question, but my expectation was that I could submit multiple jobs at the same time and some kind of fair strategy would run them in turn. What Spark basics have I slept through? :)
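In case it is the default scheduling behavior at play: in standalone mode, an application grabs all available cores unless it is capped, so a second application (here, the shell) stays in WAITING until the first one finishes. A minimal sketch of the usual workaround, assuming the standalone-mode `spark.cores.max` setting (the value 4 is just an example):

```
# conf/spark-defaults.conf -- cap the cores each application takes, so the
# standalone scheduler has free cores left over for other applications
spark.cores.max    4
```

With that in place, the cluster can schedule the shell alongside the already-running job instead of queueing it.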

Thanks!
Ognen

On 3/24/14, 4:00 PM, Ognen Duzlevski wrote:
Is someRDD.saveAsTextFile("hdfs://ip:port/path/final_filename.txt") supposed to work? That is, can I save files to HDFS this way?

I tried:

val r = sc.parallelize(List(1,2,3,4,5,6,7,8))
r.saveAsTextFile("hdfs://ip:port/path/file.txt")

and it just hangs. Meanwhile, on my HDFS it created file.txt, but as a directory containing subdirectories (the final one is empty).
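For what it's worth on the directory part: that layout is the expected Hadoop-style output, not a failure. saveAsTextFile writes one part file per partition under the path you give it, so "file.txt" becomes a directory. A hedged spark-shell sketch (the hdfs://ip:port/path URL is the placeholder from the original mail; this needs a live cluster to run):

```scala
// saveAsTextFile produces Hadoop-style output: the given path becomes a
// directory holding one part-NNNNN file per partition, plus a _SUCCESS
// marker once the job completes -- not a single text file.
val r = sc.parallelize(List(1, 2, 3, 4, 5, 6, 7, 8), 2) // 2 partitions
r.saveAsTextFile("hdfs://ip:port/path/file.txt")        // part-00000, part-00001

// Reading it back: textFile accepts the directory and picks up the part files.
val back = sc.textFile("hdfs://ip:port/path/file.txt")
```

The hang itself is a separate issue (likely the same scheduling wait as with count); the directory output alone is normal.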

Thanks!
Ognen

--
“No matter what they ever do to us, we must always act for the love of our 
people and the earth. We must not react out of hatred against those who have no 
sense.”
-- John Trudell
