Github user ambauma closed the pull request at:
https://github.com/apache/spark/pull/19538
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19538
No argument.
On Thu, Sep 13, 2018, 12:25 PM Dongjoon Hyun wrote:
> @ambauma <https://github.com/ambauma> Unfortunately, it seems to be too
> old and the PR
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
I'm unable to duplicate the PySpark failures locally. I assume I need a
specific version of SciPy to duplicate the error. Is there a way I could find
out what versions the build server is running?
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19538#discussion_r146111073
--- Diff: core/src/main/scala/org/apache/spark/ui/UIUtils.scala ---
@@ -506,4 +510,33 @@ private[spark] object UIUtils extends Logging {
def
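The diff above is truncated in the archive, but the thrust of SPARK-20393 is sanitizing user-supplied request parameters before they are echoed back into Spark UI pages. A minimal Python sketch of the idea (the helper name `strip_xss` is hypothetical here; the actual patch lives in Scala in `UIUtils.scala`):

```python
import html

def strip_xss(param):
    # Hypothetical helper illustrating the patch's intent: HTML-escape any
    # user-supplied request parameter before it is embedded in a page, so
    # injected markup like <script> tags is rendered inert text.
    if param is None:
        return None
    return html.escape(param, quote=True)

print(strip_xss('<script>alert("pwn")</script>'))
# &lt;script&gt;alert(&quot;pwn&quot;)&lt;/script&gt;
```

Escaping at the point of output, rather than trying to blacklist dangerous substrings, is the usual defense because it neutralizes any markup regardless of how it was encoded.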
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146095022
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/ui/DriverPage.scala
---
@@ -0,0 +1,180 @@
+/*
--- End diff
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146084730
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures,
numClasses
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146084021
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala ---
@@ -16,9 +16,9 @@
*/
package org.apache.spark.ui.jobs
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146080377
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/ui/DriverPage.scala
---
@@ -0,0 +1,180 @@
+/*
--- End diff
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146080311
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures,
numClasses
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146080177
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala ---
@@ -16,9 +16,9 @@
*/
package org.apache.spark.ui.jobs
Github user ambauma commented on a diff in the pull request:
https://github.com/apache/spark/pull/19528#discussion_r146080089
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures,
numClasses
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19538
I'm not looking for an official release. My goal is to get the fix into
the official branch-1.6 to reduce the number of forks necessary, and so that if
CVE-2018- comes and I've moved on my
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
Believed fixed. Hard to say for sure without knowing the precise Python
and NumPy versions the build is using.
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
Able to duplicate. Working theory is that this is related to numpy 1.12.1.
Here is my conda env:
(spark-1.6) andrew@andrew-Inspiron-7559:~/git/spark$ conda list
# packages in environment
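Since the working theory blames numpy 1.12.1, one way to make the reproduction attempt explicit is to compare the locally installed version against the suspected one before running the failing tests. A small sketch (the version `1.12.1` is the one hypothesized in this thread; nothing below is from the actual Spark test harness):

```python
def parse_version(v):
    """Turn a dotted version string like '1.12.1' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".")[:3])

# Version hypothesized to trigger the PySpark test failure.
SUSPECT = parse_version("1.12.1")

def matches_suspect(local_version):
    # True when the locally installed version equals the one under suspicion,
    # i.e. when a local run is a fair reproduction of the build server.
    return parse_version(local_version) == SUSPECT

print(matches_suspect("1.12.1"))  # True
print(matches_suspect("1.13.3"))  # False
```

Pinning the same version in the conda environment (`conda install numpy=1.12.1`) would then make local runs comparable to the build server's.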
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
Working on duplicating PySpark failures...
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
I just posted the 2.0 pull request.
https://github.com/apache/spark/pull/19538
GitHub user ambauma opened a pull request:
https://github.com/apache/spark/pull/19538
[SPARK-20393][WEB UI][2.0] Strengthen Spark to prevent XSS vulnerabilities
## What changes were proposed in this pull request?
This is the fix for the master branch applied to the 2.0
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
I have a release in my fork for my immediate needs. However, Spark 1.6 is
still included in Hortonworks and is default in Cloudera. This patch addresses
CVE-2017-7678. Some companies in strict
Github user ambauma commented on the issue:
https://github.com/apache/spark/pull/19528
Understood. Working on porting to 2.0...
GitHub user ambauma opened a pull request:
https://github.com/apache/spark/pull/19528
[SPARK-20393] [Core] Existing patch applied to 1.6 branch.
## What changes were proposed in this pull request?
This is the fix for the master branch applied to the 1.6 branch. My