hi Andrew
Thanks for the reply. I run SparkPi on a cluster with 1 master + 2 slaves based on
YARN. I did not specify the deploy mode, so I think it should be in client
mode. I checked the console log and did not find the EventLoggingListener
keyword. *It seems spark-defaults.conf is not passed correctly.*
The odd thing is that when I run
root@Master:/usr/local/spark/spark-1.6.0-bin-hadoop2.6/sbin#
./start-history-server.sh, the log file
spark-root-org.apache.spark.deploy.history.historyserver-1-Master.out
says:
Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:235)
	at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.IllegalArgumentException: Log directory specified does not exist: file:/tmp/spark-events. Did you configure the correct one through spark.history.fs.logDirectory?
	at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$startPolling(FsHistoryProvider.scala:168)
	at org.apache.spark.deploy.history.FsHistoryProvider.initialize(FsHistoryProvider.scala:120)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:116)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:49)
	... 6 more
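For reference, my understanding is that the directory the history server polls is controlled by spark.history.fs.logDirectory in conf/spark-defaults.conf; a sketch of the relevant line (the path here is just an example, matching the HDFS path I tried later, not necessarily the right one for every setup):

```
# Directory the history server reads completed-application event logs from;
# must already exist, and must match wherever applications write their logs
spark.history.fs.logDirectory   hdfs://master:9000/historyserverforspark
```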
Then I specified SPARK_CONF_DIR=/usr/local/spark/spark-1.6.0-bin-hadoop2.6/conf
in spark-env.sh and scp'd that file to the two slaves, but I still got the
same error as above in
spark-root-org.apache.spark.deploy.history.historyserver-1-Master.out after
trying to start the history server. To check whether spark-env.sh was also
not being picked up, I set SPARK_LOG_DIR in spark-env.sh and ran
./start-all.sh; the console log showed the log folder was successfully
changed to the new folder. *So I'm sure spark-env.sh is passed correctly.*
So I tried two ways:
1. Manually pass the log directory as a parameter:
root@Master:/usr/local/spark/spark-1.6.0-bin-hadoop2.6/sbin#
./start-history-server.sh hdfs://master:9000/historyserverforspark
2. Manually create the folder spark-events under /tmp.
Both approaches start the history server, and I can browse the Spark history
server web UI fine. *But there are still no log files left after the
application completes.*
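For completeness, my understanding (please correct me if wrong) is that starting the history server only fixes the reading side; for log files to be written at all, the applications themselves need event logging enabled in spark-defaults.conf, with spark.eventLog.dir pointing at the same location the history server reads. A sketch (paths are examples):

```
# conf/spark-defaults.conf on the machine running spark-submit
spark.eventLog.enabled          true
# Where running applications write event logs (example path)
spark.eventLog.dir              hdfs://master:9000/historyserverforspark
# Where the history server reads them from -- should match the line above
spark.history.fs.logDirectory   hdfs://master:9000/historyserverforspark
```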
*The command I used to run SparkPi is: ./spark-submit --class
org.apache.spark.examples.SparkPi --master spark://Master:7077
../lib/spark-examples-1.6.0-hadoop2.6.0.jar 1000*
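One thing I notice while writing this up: spark://Master:7077 is a standalone master URL, so this submit may be going to the standalone master rather than YARN. If the cluster is really YARN-based, my understanding is the submission would instead look something like the sketch below (assuming HADOOP_CONF_DIR points at the cluster's Hadoop configuration):

```
# Hedged sketch: submitting to YARN instead of the standalone master
./spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn --deploy-mode client \
  ../lib/spark-examples-1.6.0-hadoop2.6.0.jar 1000
```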
Any suggestion?
2016-03-09 3:46 GMT+08:00 Andrew Or :
> Hi Patrick,
>
> I think he means just write `/tmp/sparkserverlog` instead of
> `file:/tmp/sparkserverlog`. However, I think both should work. What mode
> are you running in, client mode (the default) or cluster mode? If the
> latter, your driver will be run on the cluster, and so your event logs won't
> be on the machine you ran spark-submit from. Also, are you running
> standalone, YARN or Mesos?
>
> As Jeff commented above, if event logging is in fact enabled you should see
> the log message from EventLoggingListener. If the log message is not
> present in your driver logs, it's likely that the configurations in your
> spark-defaults.conf are not passed correctly.
>
> -Andrew
>
> 2016-03-03 19:57 GMT-08:00 PatrickYu :
>
>> alvarobrandon wrote
>> > Just write /tmp/sparkserverlog without the file part.
>>
>> I don't get your point. What do you mean by 'without the file part'?
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/No-event-log-in-tmp-spark-events-tp26318p26394.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> -
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>>
>
--
Thanks,
Xin