Hi,

I'm starting spark-shell like this:

SPARK_MEM=1g SPARK_JAVA_OPTS="-Dspark.cleaner.ttl=3600"
/spark/bin/spark-shell -c 3

but when I try to create a streaming context
val scc = new StreamingContext(sc, Seconds(10))

 I get:

org.apache.spark.SparkException: Spark Streaming cannot be used
without setting spark.cleaner.ttl; set this property before creating a
SparkContext (use SPARK_JAVA_OPTS for the shell)

        at org.apache.spark.streaming.StreamingContext.&lt;init&gt;(StreamingContext.scala:121)


I also tried running export SPARK_JAVA_OPTS="-Dspark.cleaner.ttl=3600"
before calling spark-shell, but with no luck...
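For completeness, here is the exact invocation I'm using, written out on one line (a sketch of my setup: /spark is just where my install lives, and -c 3 is the 0.9.x flag requesting cores for the shell):

```shell
# TTL is in seconds; SPARK_JAVA_OPTS is how 0.9.x passes -D properties to the shell JVM
export SPARK_JAVA_OPTS="-Dspark.cleaner.ttl=3600"
SPARK_MEM=1g /spark/bin/spark-shell -c 3
```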

What am I doing wrong? This is Spark 0.9.1, and I cannot upgrade.
