FYI I wrote a small test to try to reproduce this, and filed
SPARK-6688 to track the fix.
On Tue, Mar 31, 2015 at 1:15 PM, Marcelo Vanzin van...@cloudera.com wrote:
Hmmm... could you try to set the log dir to
file:/home/hduser/spark/spark-events?
I checked the code and it might be the case that the behaviour changed
between 1.2 and 1.3...
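For readers hitting the same error, Marcelo's suggestion can be sketched as follows. This is not from the thread itself, just an illustration; it uses $PWD instead of the thread's /home/hduser paths so it runs anywhere, and assumes the conf file lives next to where you run it:

```shell
# Spark does not create the event log directory for you -- create it first.
EVENTDIR="$PWD/spark-events"
mkdir -p "$EVENTDIR"

# Use an explicit file: URI and an absolute path in spark-defaults.conf;
# "~" is not expanded there.
cat >> spark-defaults.conf <<EOF
spark.eventLog.enabled true
spark.eventLog.dir     file:$EVENTDIR
EOF
```

The file: scheme makes it unambiguous that the directory is on the local filesystem rather than on HDFS or whatever the default filesystem is.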
On Mon, Mar 30, 2015 at 6:44 PM, Tom Hubregtsen thubregt...@gmail.com wrote:
Updated spark-defaults and spark-env:
Log directory /home/hduser/spark/spark-events does not exist.
(It also did not work with the default /tmp/spark-events.)
On 30 March 2015 at 18:03, Marcelo Vanzin van...@cloudera.com wrote:
Are those config values in spark-defaults.conf? I don't think you can
use ~ there - IIRC it does not do any kind of variable expansion.
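Marcelo's point about "~" can be demonstrated directly. This small example is not from the thread; it just shows that a tilde read from a config file stays a literal character, since tilde expansion is something the shell does to unquoted words, not something config parsers do:

```shell
# Write a config line like the one in the thread.
echo 'spark.eventLog.dir ~/spark/spark-events' > demo.conf

# Read it back the way a config parser would: as plain text.
read -r key value < demo.conf
echo "$value"   # prints the literal string "~/spark/spark-events"
```

Hence the later error message quoting a literal "~/spark/spark-events" path: the tilde was never turned into the home directory.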
On Mon, Mar 30, 2015 at 3:50 PM, Tom thubregt...@gmail.com wrote:
I have set
spark.eventLog.enabled true
as I try to preserve log files. When I run, I get
Log directory /tmp/spark-events does not exist.
I set
spark.local.dir ~/spark
spark.eventLog.dir ~/spark/spark-events
and
SPARK_LOCAL_DIRS=~/spark
Now I get:
Log directory ~/spark/spark-events does not exist.
Are you running Spark in cluster mode by any chance?
(It always helps to show the command line you're actually running, and
if there's an exception, the first few frames of the stack trace.)
On Mon, Mar 30, 2015 at 4:11 PM, Tom Hubregtsen thubregt...@gmail.com wrote:
I run Spark in local mode.
Command line (added some debug info):
hduser@hadoop7:~/spark-terasort$ ./bin/run-example SparkPi 10
Jar:
/home/hduser/spark-terasort/examples/target/scala-2.10/spark-examples-1.3.0-SNAPSHOT-hadoop2.4.0.jar
/home/hduser/spark-terasort/bin/spark-submit --master local[*]
The stack trace for the first scenario and your suggested improvement is
similar; the only difference is the first line (sorry for not including
this earlier):
Log directory /home/hduser/spark/spark-events does not exist.
To verify your premise, I cd'ed into the directory by copy-pasting the
path.
So, the error below is still showing the invalid configuration.
You mentioned in the other e-mails that you also changed the
configuration, and that the directory really, really exists. Given the
exception below, the only ways you'd get the error with a valid
configuration would be if (i) the