Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/22598#discussion_r237622443
--- Diff:
external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaSecurityHelperSuite.scala
---
@@ -0,0 +1,100
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23151#discussion_r237617706
--- Diff:
core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala ---
@@ -494,13 +494,12 @@ class SparkSubmitSuite
}
test
[
https://issues.apache.org/jira/browse/LIVY-503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin reassigned LIVY-503:
---
Assignee: Marcelo Vanzin
> Move RPC classes used in thriftserver in a separate mod
[
https://issues.apache.org/jira/browse/LIVY-503?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved LIVY-503.
-
Resolution: Fixed
Fix Version/s: 0.6.0
Issue resolved by pull request 118
[https
RSC pom so that the shaded jar
is generated in the correct target directory and later copied into the
staging directory; this solves an issue with using the RSC in test code
in other modules (such as the one being added).
Author: Marcelo Vanzin
Closes #118 from vanzin/LIVY-503.
Project: http://g
Repository: incubator-livy
Updated Branches:
refs/heads/master ae2228f56 -> 39fa887cf
http://git-wip-us.apache.org/repos/asf/incubator-livy/blob/39fa887c/thriftserver/session/src/main/java/org/apache/livy/thriftserver/session/SqlJob.java
-
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
Merging to master.
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23174
(In fact, env variables don't even show up in the UI or event logs, as far
as I can see. Other configs - Spark config, system properties, e.g. - do show
up, and are redacted to mask se
[
https://issues.apache.org/jira/browse/SPARK-26015?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin reassigned SPARK-26015:
--
Assignee: Rob Vesse
> Include a USER directive in project provided Spark Dockerfi
[
https://issues.apache.org/jira/browse/SPARK-26015?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26015.
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 23017
[https
that UID is as expected.
Tried customising the UID from the default via the new `-u` argument to
`docker-image-tool.sh` and again checked the resulting image for the correct
runtime UID.
cc felixcheung skonto vanzin
Closes #23017 from rvesse/SPARK-26015.
Authored-by: Rob Vesse
Signed-off-by:
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23017
Merging to master.
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h
[
https://issues.apache.org/jira/browse/SPARK-26184?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin reassigned SPARK-26184:
--
Assignee: shahid
> Last updated time is not getting updated in the History Server
[
https://issues.apache.org/jira/browse/SPARK-26184?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26184.
Resolution: Fixed
Fix Version/s: 2.4.1
3.0.0
Issue resolved by
[
https://issues.apache.org/jira/browse/SPARK-26186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26186.
Resolution: Fixed
Fix Version/s: 2.4.1
3.0.0
Issue resolved by
[
https://issues.apache.org/jira/browse/SPARK-26186?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin reassigned SPARK-26186:
--
Assignee: shahid
> In progress applications with last updated time is lesser t
8-947a-e8a2a09a785a.png)
Closes #23158 from shahidki31/HistoryLastUpdateTime.
Authored-by: Shahid
Signed-off-by: Marcelo Vanzin
(cherry picked from commit 24e78b7f163acf6129d934633ae6d3e6d568656a)
Signed-off-by: Marcelo Vanzin
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23158
Merging to master / 2.4.
---
a2a09a785a.png)
Closes #23158 from shahidki31/HistoryLastUpdateTime.
Authored-by: Shahid
Signed-off-by: Marcelo Vanzin
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/24e78b7f
Tree: http://git-wip-us.apache.org/repos/asf/s
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23174
> if the secret would be listed under the environment variables in the
Spark UI
Secrets are redacted in the UI and event logs. We already use env variables
in other contexts (
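The comment above refers to Spark masking sensitive configuration entries before they reach the UI or event logs. The sketch below illustrates that kind of key-based redaction; the regex and the placeholder string are assumptions modeled loosely on Spark's behavior, not its actual implementation.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Illustrative sketch of key-based redaction, loosely modeled on how Spark
// masks sensitive config entries before showing them in the UI or event logs.
// The pattern and placeholder below are assumptions, not Spark's exact values.
public class ConfigRedactor {
    private static final Pattern SENSITIVE =
        Pattern.compile("(?i)secret|password|token");
    private static final String PLACEHOLDER = "*********(redacted)";

    public static Map<String, String> redact(Map<String, String> conf) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : conf.entrySet()) {
            // Mask the value whenever the key looks sensitive.
            boolean sensitive = SENSITIVE.matcher(e.getKey()).find();
            out.put(e.getKey(), sensitive ? PLACEHOLDER : e.getValue());
        }
        return out;
    }
}
```

Redacting by key (rather than by value) is what makes this safe to apply uniformly: the display layer never needs to know which values are actually secret.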
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/22959#discussion_r237587099
--- Diff:
resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesConf.scala
---
@@ -112,125 +72,139 @@ private[spark] case
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23174
> via a mounted file
> Also the user should be able to specify their own mounted file
The point is that the user shouldn't need to set this at all. You enable
auth, Spark ta
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237315238
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/LivyThriftServer.scala
---
@@ -114,24 +98,56 @@ object LivyThriftServer
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237314382
--- Diff: server/src/main/scala/org/apache/livy/LivyConf.scala ---
@@ -98,10 +98,78 @@ object LivyConf {
val LAUNCH_KERBEROS_REFRESH_INTERVAL
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237319002
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/operation/GetTableTypesOperation.scala
---
@@ -0,0 +1,63
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237318955
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/operation/GetCatalogsOperation.scala
---
@@ -0,0 +1,62
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237318294
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/auth/PlainSaslServer.scala
---
@@ -0,0 +1,184 @@
+/*
+ * Licensed
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237319368
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/operation/GetTypeInfoOperation.scala
---
@@ -0,0 +1,130
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237313919
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/LivyCLIService.scala
---
@@ -22,60 +22,57 @@ import java.util
import
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237319439
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/operation/GetTypeInfoOperation.scala
---
@@ -0,0 +1,130
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237317373
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/auth/AuthFactory.scala
---
@@ -0,0 +1,197 @@
+/*
+ * Licensed to
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237316485
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/auth/AuthBridgeServer.scala
---
@@ -0,0 +1,301 @@
+/*
+ * Licensed
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/117#discussion_r237318814
--- Diff:
thriftserver/server/src/main/scala/org/apache/livy/thriftserver/cli/ThriftHttpServlet.scala
---
@@ -0,0 +1,500 @@
+/*
+ * Licensed
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23158#discussion_r237293492
--- Diff:
core/src/test/scala/org/apache/spark/deploy/history/FsHistoryProviderSuite.scala
---
@@ -334,6 +334,42 @@ class FsHistoryProviderSuite extends
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23058#discussion_r237284719
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -718,13 +718,9 @@ private[spark] class BlockManager
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/128#discussion_r237282583
--- Diff: rsc/src/main/java/org/apache/livy/rsc/driver/JobWrapper.java ---
@@ -39,20 +39,27 @@
private final RSCDriver driver
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23158#discussion_r237280881
--- Diff:
core/src/test/scala/org/apache/spark/deploy/history/FsHistoryProviderSuite.scala
---
@@ -334,6 +334,42 @@ class FsHistoryProviderSuite extends
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23158#discussion_r237280935
--- Diff:
core/src/test/scala/org/apache/spark/deploy/history/FsHistoryProviderSuite.scala
---
@@ -334,6 +334,42 @@ class FsHistoryProviderSuite extends
[
https://issues.apache.org/jira/browse/LIVY-537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved LIVY-537.
-
Resolution: Fixed
Fix Version/s: 0.6.0
Issue resolved by pull request 130
[https
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23172#discussion_r237276972
--- Diff: project/SparkBuild.scala ---
@@ -494,7 +494,12 @@ object KubernetesIntegrationTests {
dockerBuild := {
if (shouldBuildImage
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/129
The code shouldn't be spending that much time in the synchronized block. It
does do quite a lot of things, including spawning a child process, but it
doesn't wait for that process
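The point in the comment above — spawning a child process is expensive, so it should not happen while holding the lock — can be sketched as follows. The class and method names are hypothetical, not Livy's actual code; the sketch only shows the pattern of doing slow work outside the critical section and keeping only cheap bookkeeping inside it.

```java
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: spawn the child process outside the lock, and hold
// the lock only while updating shared state. The launcher does not wait
// for the child process to finish.
public class JobLauncher {
    private final Object lock = new Object();
    private final Map<String, Process> running = new HashMap<>();

    public Process launch(String jobId, String... command) throws IOException {
        // Spawning is slow; do it without holding the lock.
        Process child = new ProcessBuilder(command).start();
        synchronized (lock) {
            // Only cheap bookkeeping happens inside the critical section.
            running.put(jobId, child);
        }
        return child;
    }

    public int runningCount() {
        synchronized (lock) {
            return running.size();
        }
    }
}
```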
ido
Authored: Wed Nov 28 13:48:58 2018 -0800
Committer: Marcelo Vanzin
Committed: Wed Nov 28 13:48:58 2018 -0800
--
.../org/apache/livy/thriftserver/LivyThriftSessionManager.scala| 2 ++
1 file changed, 2 inserti
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/130
Merging to master.
---
GitHub user vanzin opened a pull request:
https://github.com/apache/spark/pull/23174
[SPARK-26194][k8s] Auto generate auth secret for k8s apps.
This change modifies the logic in the SecurityManager to do two
things:
- generate unique app secrets also when k8s is being
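The idea behind SPARK-26194 is that each app submission gets a fresh random auth secret, so no two apps share one. A minimal sketch of that kind of secret generation is below; the class name and hex encoding are assumptions for illustration, and Spark's actual SecurityManager logic differs in detail.

```java
import java.security.SecureRandom;

// Hypothetical sketch of per-app secret generation: every call produces a
// fresh random value from a cryptographically strong RNG, so no two apps
// end up sharing an auth secret.
public class AuthSecret {
    public static String generate(int numBits) {
        byte[] bytes = new byte[numBits / 8];
        new SecureRandom().nextBytes(bytes);
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));  // hex-encode each byte
        }
        return sb.toString();
    }
}
```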
GitHub user vanzin opened a pull request:
https://github.com/apache/spark/pull/23172
[SPARK-25957][followup] Build python docker image in sbt build too.
docker-image-tool.sh requires explicit argument to create the python
image now; do that from the sbt integration tests target
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
Another unrelated failure. The intersection of the last test runs did pass
all the combinations, though.
---
GitHub user vanzin reopened a pull request:
https://github.com/apache/incubator-livy/pull/118
[LIVY-503] Separate thrift server session code in separate module.
This change creates a new module ("livy-thriftserver-session") with
the code related to the Thrift serv
Github user vanzin closed the pull request at:
https://github.com/apache/incubator-livy/pull/118
---
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23088#discussion_r237197020
--- Diff: core/src/main/scala/org/apache/spark/status/AppStatusStore.scala
---
@@ -222,29 +223,20 @@ private[spark] class AppStatusStore(
val
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22887
> So it's reasonable for users to expect that, if they set hadoop config via
the SQL SET command, it should override the one in spark-defaults.conf.
I agree with that. But the
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23055#discussion_r237188935
--- Diff: python/pyspark/worker.py ---
@@ -22,7 +22,12 @@
import os
import sys
import time
-import resource
+# 'resource'
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
I was using an old beeline (Hive 1.1 I think). The one from the client
module was having trouble with kerberos, or maybe the connection URL syntax has
changed in 3.0... anyway, didn't g
GitHub user vanzin reopened a pull request:
https://github.com/apache/incubator-livy/pull/118
[LIVY-503] Separate thrift server session code in separate module.
This change creates a new module ("livy-thriftserver-session") with
the code related to the Thrift serv
Github user vanzin closed the pull request at:
https://github.com/apache/incubator-livy/pull/118
---
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22887
OK, that makes sense, in that I understand what you're saying, but I'm not sure
it's what you actually want.
Why shouldn't "set spark.hadoop.*" override spark-defaults.c
Marcelo Vanzin created SPARK-26194:
--
Summary: Support automatic spark.authenticate secret in Kubernetes
backend
Key: SPARK-26194
URL: https://issues.apache.org/jira/browse/SPARK-26194
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-24219?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-24219.
Resolution: Duplicate
I fixed this as part of SPARK-26025.
> Improve the docker bu
[
https://issues.apache.org/jira/browse/SPARK-24383?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-24383.
Resolution: Not A Problem
This has been working reliably for me. If your k8s server is
[
https://issues.apache.org/jira/browse/SPARK-24577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-24577.
Resolution: Duplicate
> Spark submit fails with documentation example spark
[
https://issues.apache.org/jira/browse/SPARK-24600?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-24600.
Resolution: Duplicate
> Improve support for building different types of images
[
https://issues.apache.org/jira/browse/SPARK-26096?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26096.
Resolution: Duplicate
> k8s integration tests should run R te
[
https://issues.apache.org/jira/browse/SPARK-26125?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16701200#comment-16701200
]
Marcelo Vanzin commented on SPARK-26125:
Pretty sure this works with my p
[
https://issues.apache.org/jira/browse/SPARK-25744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-25744.
Resolution: Duplicate
> Allow kubernetes integration tests to be run against a r
[
https://issues.apache.org/jira/browse/SPARK-26064?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26064.
Resolution: Invalid
I'm closing this for the time being. If you have a question p
[
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26190.
Resolution: Won't Fix
I'm closing this for now until I see a better use case. S
[
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16701140#comment-16701140
]
Marcelo Vanzin commented on SPARK-26190:
bq. Can you give me one good re
Marcelo Vanzin created LIVY-538:
---
Summary: Add integration tests for the thrift server
Key: LIVY-538
URL: https://issues.apache.org/jira/browse/LIVY-538
Project: Livy
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16700937#comment-16700937
]
Marcelo Vanzin commented on SPARK-26190:
If you need to run things before s
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
Yep, same NPE with master. So not a problem with this patch.
---
[
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16700896#comment-16700896
]
Marcelo Vanzin commented on SPARK-26190:
Just run your script wit
[
https://issues.apache.org/jira/browse/SPARK-26190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16700890#comment-16700890
]
Marcelo Vanzin commented on SPARK-26190:
Based solely on what you wrote
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
test failure is unrelated
---
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
I think the latter might be because my beeline is older than the Hive jars
Livy is using.
---
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
BTW, is it just me, or are there actually no integration tests for the thrift
server? There are unit tests that run the thrift server, but that's different.
If that's the case, it
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
Getting this:
```
Exception in thread "Livy-Thriftserver" java.lang.NoSuchFieldError:
cliService
at
org.apache.livy.thriftserver.LivyThriftS
[
https://issues.apache.org/jira/browse/SPARK-26025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin reassigned SPARK-26025:
--
Assignee: Marcelo Vanzin
> Docker image build on dev builds is s
0M for the main image, 1M for the
pyspark image and 8M for the R image. That speeds up the image builds
considerably.
I also snuck in a fix to the k8s integration test dependencies in the sbt
build, so that the examples are properly built (without having to do it
manually).
Closes #23019 from vanzin/S
[
https://issues.apache.org/jira/browse/SPARK-26025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-26025.
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 23019
[https
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23019
Merging to master.
---
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
BTW I just found out that support for running ITs on a pre-defined cluster was
removed in LIVY-473. That's very disappointing. It made it easy to test
things on an existing cluster, e.g. to
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22887
> I think this is what this PR tries to fix?
To be fair I'm not sure I fully understand the PR description. But I know
that the previous patch (which I commented on) broke the funct
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/incubator-livy/pull/118#discussion_r236741064
--- Diff:
thriftserver/session/src/main/java/org/apache/livy/thriftserver/session/ThriftSessionState.java
---
@@ -0,0 +1,126
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
Integration tests run in YARN; that is basically the same thing as running
a "real cluster". But, sure, if you insist, I'll run it on a "real cluster".
---
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/22598#discussion_r236738629
--- Diff:
core/src/main/scala/org/apache/spark/deploy/security/KafkaTokenUtil.scala ---
@@ -0,0 +1,168 @@
+/*
+ * Licensed to the Apache Software
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23055#discussion_r236737983
--- Diff:
core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala ---
@@ -74,8 +74,13 @@ private[spark] abstract class BasePythonRunner[IN, OUT
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23098#discussion_r236736341
--- Diff: dev/create-release/release-build.sh ---
@@ -110,16 +110,18 @@ fi
# Depending on the version being built, certain extra profiles need to be
Github user vanzin commented on the issue:
https://github.com/apache/incubator-livy/pull/118
No, but that's what integration tests do, isn't it?
---
Hi Alex,
There have been other attempts at this (e.g.
https://github.com/apache/commons-crypto/pull/90), and my main concern
with those is that they require different builds for OpenSSL 1.0 and
1.1. It seems your approach falls in that bucket, at least without
additional work.
The problem with th
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23055#discussion_r236485186
--- Diff:
core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala ---
@@ -74,8 +74,13 @@ private[spark] abstract class BasePythonRunner[IN, OUT
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23055#discussion_r236483417
--- Diff:
core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala ---
@@ -74,8 +74,13 @@ private[spark] abstract class BasePythonRunner[IN, OUT
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/22997#discussion_r236474610
--- Diff: dev/make-distribution.sh ---
@@ -241,6 +241,10 @@ fi
# Make R package - this is used for both CRAN release and packing R layout
into
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22997
> It is not possible to build a distribution that doesn't contain hadoop
dependencies but include SparkR
I wouldn't say that. It seems like it's possible, it just can'
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/22887
Sorry, this is a breaking change. It changes the behavior from "I can
currently override any Hadoop configs, even final ones, using spark.hadoop.*"
to "I can never do that".
asf/spark/tree/9deaa726
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/9deaa726
Branch: refs/heads/master
Commit: 9deaa726ef1645746892a23d369c3d14677a48ff
Parents: 6f1a1c1
Author: Marcelo Vanzin
Authored: Mon Nov 26 15:33:21 2018 -0800
Committer: Marcelo Vanzin
Committed: Mon Nov 26 15:33
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23037#discussion_r236468362
--- Diff:
resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/PythonTestsSuite.scala
---
@@ -89,6
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23037
Seems like a legitimate failure.
```
name="Run PySpark shell"
classname="org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite&
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23037
retest this please
---
Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/23017#discussion_r236456515
--- Diff: docs/running-on-kubernetes.md ---
@@ -19,9 +19,9 @@ Please see [Spark Security](security.html) and the
specific advice below before
Repository: spark
Updated Branches:
refs/heads/master fbf62b710 -> 6f1a1c124
[SPARK-25451][HOTFIX] Call stage.attemptNumber instead of attemptId.
Closes #23149 from vanzin/SPARK-25451.hotfix.
Authored-by: Marcelo Vanzin
Signed-off-by: Marcelo Vanzin
Project: http://git-wip-us.apache.
Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/23149
The PR builder is past the compile failure, so I'm merging this.
---