Erik Baumert created SPARK-23339:
------------------------------------
Summary: Spark UI not loading *.js/*.css files, only raw HTML
Key: SPARK-23339
URL: https://issues.apache.org/jira/browse/SPARK-23339
Project: Spark
Issue Type: Bug
Components: Web UI
Affects Versions: 2.2.0
Environment: Spark 2.2.0, YARN, 2 Ubuntu 16.04 nodes, openjdk 1.8.0_151
Reporter: Erik Baumert
I have never reported anything before, and hope this is the right place as I
think I have come across a bug. If I missed the solution, please feel free to
correct me.
I set up Spark 2.2.0 on a 2-node Ubuntu cluster. I use Jupyter notebook to
access the pyspark-shell. However, the UI via [http://IP:4040/|http://ip:4040/]
is broken. Has anyone ever seen something like this?
When I inspect the page in Chrome, it says "Failed to load resource:
net::ERR_EMPTY_RESPONSE" for various .js and .css files.
I did a fresh install and added my configurations one by one until the problem
occurred again. Everything worked fine until I edited spark-defaults.conf to
contain the following lines:
{code}
spark.driver.extraClassPath /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar
spark.executor.extraClassPath /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar
{code}
How can I add these jars to my classpath without breaking the UI? If I just
supply them using the --jars parameter in the terminal, it works fine. But I'd
like to have them configured, as explained in the manual:
[https://phoenix.apache.org/phoenix_spark.html]
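For reference, the working terminal invocation mentioned above looks roughly like this (a sketch; the jar path is the same one used in my spark-defaults.conf):

{code}
pyspark --jars /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar
{code}

With --jars the UI renders normally; only the extraClassPath variant breaks it.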
I posted the question on Stackoverflow some time ago
[here|https://stackoverflow.com/questions/47291547/spark-ui-fails-to-load-js-displays-bare-html]
and apparently I'm not the only one
([here|https://stackoverflow.com/questions/47875064/spark-ui-appears-with-wrong-format]).
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]