Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/19520
Can one of the admins verify this patch?
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19520
We should close this. I don't see any user benefit in supporting slashes in
app ids.
---
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/19520
I would like to ask: under what circumstances would the application id contain a forward slash?
---
Github user ajbozarth commented on the issue:
https://github.com/apache/spark/pull/19520
I'm OK either way, but if we add this, we should probably make sure we cover this issue anywhere else it may pop up.
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19520
IMO, non-URL-friendly IDs (such as those containing slashes) are also not very user-friendly. So yes, this is a no-op for any case that we care about, but it's exposing what I consider a usability problem.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/19520
Should a cluster manager have to generate URL-friendly app IDs? I don't see
a strong reason for that expectation. This doesn't affect any supported Spark
integration, true. But this is basically a
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19520
Which is not necessary if the cluster manager is doing the right thing,
which it's not here.
---
Github user alexnaspo commented on the issue:
https://github.com/apache/spark/pull/19520
The only goal is to make the executor page resilient to this.
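For context, the kind of resilience being discussed would amount to percent-encoding the app ID before embedding it in executor-page URLs, so that a slash inside the ID is not parsed as a path separator. A minimal sketch, not Spark's actual code; the helper name `encodeAppId` and the sample ID are illustrative:

```scala
import java.net.URLEncoder

// Hypothetical helper: percent-encode an app ID so that a '/'
// inside it cannot be misread as a URL path separator.
def encodeAppId(appId: String): String =
  URLEncoder.encode(appId, "UTF-8")

// "/" becomes "%2F", keeping the whole ID in one path segment,
// e.g. /history/nomad%2Fmy-job/executors
println(encodeAppId("nomad/my-job"))  // prints "nomad%2Fmy-job"
```

Note that `URLEncoder` performs form encoding (spaces become `+`), so a stricter URI encoder might be preferable in a real patch.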
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19520
The app id is something that the cluster manager should be setting, not users, and it's pretty easy for a cluster manager to generate Spark-friendly IDs. So I'm not sure what problem this would be solving.
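As a sketch of the "cluster manager does the right thing" approach suggested above: an integration that lets users choose their own IDs could sanitize them before handing them to Spark. The function below is hypothetical, not part of any Spark or Nomad API:

```scala
// Hypothetical sanitizer: keep URL-safe characters, map everything
// else (including '/') to '-' so the ID is safe in Spark UI routes.
def sanitizeAppId(raw: String): String =
  raw.map(c => if (c.isLetterOrDigit || "-_.".contains(c)) c else '-')

println(sanitizeAppId("nomad/my-job:2017"))  // prints "nomad-my-job-2017"
```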
Github user alexnaspo commented on the issue:
https://github.com/apache/spark/pull/19520
I agree that a / in the appId is not necessarily a good practice. However,
would you think that the executor page should be resilient to this regardless
of that fact?
---
Github user ajbozarth commented on the issue:
https://github.com/apache/spark/pull/19520
I don't think this is something to "fix" in Spark; having slashes in the appId seems like a generally bad idea, and this may not be the only place that it breaks things. Instead it may be better
Github user alexnaspo commented on the issue:
https://github.com/apache/spark/pull/19520
Yes, we can definitely remove it. However, it did take me a bit of time to determine why the executor page was broken; I was hoping to save that hassle for someone else in the future. It is unclear
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/19520
Could you not do that instead? Or not use slashes?
---
Github user alexnaspo commented on the issue:
https://github.com/apache/spark/pull/19520
We set our own appId when running spark on Nomad.
https://www.nomadproject.io/guides/spark/spark.html
---
Github user ajbozarth commented on the issue:
https://github.com/apache/spark/pull/19520
I'm a bit confused by the issue this is addressing. How do you get an appId with a `/` in it to begin with? Last I checked, appId formats were hard-coded inside Spark.
---