[ https://issues.apache.org/jira/browse/SPARK-47232?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sharad Mishra updated SPARK-47232:
----------------------------------
    Description: 
When reverse proxy is enabled in the master and worker configOptions, we're not 
able to access the different tabs available in the Spark UI (e.g. Stages, 
Environment, Storage).
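The behavior described below suggests the tab links resolve as absolute paths, so the /proxy/<app-id>/ prefix is lost. A minimal sketch of that URL resolution in Python (the base URL is taken from this report; the snippet is illustrative, not Spark's actual link generation):

```python
from urllib.parse import urljoin

# Base URL of the proxied application UI (taken from the report).
base = "https://spark.staging.abc.com/proxy/app-20240208103209-0030/stages/"

# An absolute href such as "/environment/" resolves against the host root,
# dropping the /proxy/<app-id>/ prefix -- matching the observed redirect
# back to the cluster UI.
print(urljoin(base, "/environment/"))
# -> https://spark.staging.abc.com/environment/

# A link written relative to the current page keeps the proxy prefix.
print(urljoin(base, "../environment/"))
# -> https://spark.staging.abc.com/proxy/app-20240208103209-0030/environment/
```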

We're deploying Spark through the bitnami Helm chart:
https://github.com/bitnami/charts/tree/main/bitnami/spark

Name and Version

bitnami/spark - 6.0.0

What steps will reproduce the bug?

Kubernetes Version: 1.25
Spark: 3.4.2
Helm chart: 6.0.0

 

  was:
### Name and Version

bitnami/spark - 6.0.0

### What architecture are you using?

None

### What steps will reproduce the bug?

Kubernetes Version: 1.25
Spark: 3.4.2
Helm chart: 6.0.0

When reverse proxy is enabled in the master and worker configOptions, we're not 
able to access the different tabs available in the Spark UI (e.g. Stages, 
Environment, Storage).


> Spark-UI stages and other tabs not accessible in standalone mode when 
> reverse-proxy is enabled
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-47232
>                 URL: https://issues.apache.org/jira/browse/SPARK-47232
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Structured Streaming, Web UI
>    Affects Versions: 3.4.2
>         Environment: Steps to reproduce:
> After installing the chart, the Spark cluster (master and worker) UI is 
> available at:
> https://spark.staging.abc.com/
> We are able to access a running application by clicking its application ID 
> under the Running Applications link.
> Clicking Application Detail UI takes us to the Jobs tab of the Spark UI.
> The URL looks like:
> https://spark.staging.abc.com/proxy/app-20240208103209-0030/stages/
> When we click any tab in the Spark UI (e.g. Stages or Environment), it takes 
> us back to the Spark cluster UI page.
> We noticed that the endpoint changes to
> https://spark.staging.abc.com/stages/
> instead of
> https://spark.staging.abc.com/proxy/app-20240208103209-0030/stages/
>  
> Are you using any custom parameters or values?
> Configurations set in values.yaml
> ```yaml
> master:
>   configOptions:
>     -Dspark.ui.reverseProxy=true
>     -Dspark.ui.reverseProxyUrl=https://spark.staging.abc.com
> worker:
>   configOptions:
>     -Dspark.ui.reverseProxy=true
>     -Dspark.ui.reverseProxyUrl=https://spark.staging.abc.com
> service:
>   type: ClusterIP
>   ports:
>     http: 8080
>     https: 443
>     cluster: 7077
> ingress:
>   enabled: true
>   pathType: ImplementationSpecific
>   apiVersion: ""
>   hostname: spark.staging.abc.com
>   ingressClassName: "staging"
>   path: /
> ```
>  
> What is the expected behavior?
> The expected behavior is that clicking the Stages tab takes me to:
> https://spark.staging.abc.com/proxy/app-20240208103209-0030/stages/
> instead of:
> https://spark.staging.abc.com/stages/
> What do you see instead?
> The current behavior is that it takes me to 
> https://spark.staging.abc.com/stages/ , which shows the Spark cluster UI with 
> master and worker details.
>            Reporter: Sharad Mishra
>            Priority: Major
>
> When reverse proxy is enabled in the master and worker configOptions, we're 
> not able to access the different tabs available in the Spark UI (e.g. Stages, 
> Environment, Storage).
> We're deploying Spark through the bitnami Helm chart:
> https://github.com/bitnami/charts/tree/main/bitnami/spark
> Name and Version
> bitnami/spark - 6.0.0
> What steps will reproduce the bug?
> Kubernetes Version: 1.25
> Spark: 3.4.2
> Helm chart: 6.0.0
>  



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
