[GitHub] spark pull request #19538: [SPARK-20393][WEBU UI][BACKPORT-2.0] Strengthen S...

2018-09-15 Thread ambauma
Github user ambauma closed the pull request at:

https://github.com/apache/spark/pull/19538


---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark issue #19538: [SPARK-20393][WEBU UI][BACKPORT-2.0] Strengthen Spark to...

2018-09-13 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19538
  
No argument.

On Thu, Sep 13, 2018, 12:25 PM Dongjoon Hyun wrote:

> @ambauma <https://github.com/ambauma> Unfortunately, it seems to be too
> old and the PR on 1.6 also is closed. Can we close this, too?
>
> My goal is to get the fix into the official branch 1.6 to reduce the
> number of forks necessary, and so that if CVE-2018- comes and I've moved
> on, my replacement doesn't have to apply this plus that.
>
> —
> You are receiving this because you were mentioned.
> Reply to this email directly, view it on GitHub
> <https://github.com/apache/spark/pull/19538#issuecomment-421086285>, or
> mute the thread
> <https://github.com/notifications/unsubscribe-auth/AL2KaybesYvjeXb-sJC-PvdFttBTQ671ks5uapUHgaJpZM4P_n6c>.
>



---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-27 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
I'm unable to duplicate the PySpark failures locally.  I assume I need a
specific version of SciPy to duplicate the error.  Is there a way I could get
the versions the build server is running?  Something like:
`sorted(["%s==%s" % (i.key, i.version) for i in pip.get_installed_distributions()])`
for Python and Python 3.4?
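As an aside for anyone reading this thread today: `pip.get_installed_distributions()` was an internal pip API that was later removed. A rough equivalent of the one-liner above can be sketched on `importlib.metadata` (Python 3.8+); the helper name here is illustrative, not from the original thread:

```python
# Sketch of a "pip freeze"-style listing using importlib.metadata (Python 3.8+),
# since pip.get_installed_distributions() no longer exists in modern pip.
from importlib.metadata import distributions

def frozen_requirements():
    """Return sorted 'name==version' strings for every installed distribution."""
    return sorted(
        "%s==%s" % (dist.metadata["Name"], dist.version)
        for dist in distributions()
    )

if __name__ == "__main__":
    print("\n".join(frozen_requirements()))
```

Running this on the build server would have answered the version question directly.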


---




[GitHub] spark pull request #19538: [SPARK-20393][WEBU UI][BACKPORT-2.0] Strengthen S...

2017-10-21 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19538#discussion_r146111073
  
--- Diff: core/src/main/scala/org/apache/spark/ui/UIUtils.scala ---
@@ -506,4 +510,33 @@ private[spark] object UIUtils extends Logging {
 
   def getTimeZoneOffset() : Int =
 TimeZone.getDefault().getOffset(System.currentTimeMillis()) / 1000 / 60
+
+  /**
+  * Return the correct Href after checking if master is running in the
+  * reverse proxy mode or not.
+  */
+  def makeHref(proxy: Boolean, id: String, origHref: String): String = {
--- End diff --

I think this method came with the original patch.  I don't see anything
calling it.  I will remove it.


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146095022
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/ui/DriverPage.scala ---
@@ -0,0 +1,180 @@
+/*
--- End diff --

I'm not sure what I did to make this whole file look new, but I've copied
the 1.6 current and reapplied stripXSS locally.  Waiting for my build to pass
to commit again.


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146084730
  
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures, numClasses):
 self._dataWithBiasSize = None
 self._weightsMatrix = None
 else:
-self._dataWithBiasSize = self._coeff.size / (self._numClasses - 1)
+self._dataWithBiasSize = self._coeff.size // (self._numClasses - 1)
--- End diff --

The NewSparkPullRequestBuilder failed on python tests.  I was only able to 
duplicate the failure with Python 3.4 and numpy 1.12.1, which I'm guessing is 
the versions that NewSparkPullRequestBuilder is using.  Older and newer 
versions of numpy build clean either way.
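The one-character `/` → `//` change in the diff above matters because Python 3 made `/` true division. A small sketch of the difference (the numbers are made up for illustration):

```python
# In Python 3, "/" is true division and always yields a float, while "//"
# floors to an int. _dataWithBiasSize is later used as an array dimension,
# so it must stay an integer. The values below are illustrative stand-ins.
coeff_size = 30      # stands in for self._coeff.size
num_classes = 4      # stands in for self._numClasses

true_div = coeff_size / (num_classes - 1)    # 10.0, a float under Python 3
floor_div = coeff_size // (num_classes - 1)  # 10, an int under Python 2 and 3

print(true_div, floor_div)
```

Under Python 2 both expressions give an int, which is why the bug only surfaced once the test builder ran the suite under Python 3.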


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146084021
  
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala ---
@@ -16,9 +16,9 @@
  */
 
 package org.apache.spark.ui.jobs
-
+import javax.servlet.http.HttpServletRequest
--- End diff --

Agreed, will remove.


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146080377
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/ui/DriverPage.scala ---
@@ -0,0 +1,180 @@
+/*
--- End diff --

I'll look into this as well...


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146080311
  
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures, numClasses):
 self._dataWithBiasSize = None
 self._weightsMatrix = None
 else:
-self._dataWithBiasSize = self._coeff.size / (self._numClasses - 1)
+self._dataWithBiasSize = self._coeff.size // (self._numClasses - 1)
--- End diff --

This is already fixed in the 2.0 branch, btw.  Just was never applied to 
1.6.  [SPARK-20862]


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146080177
  
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/JobsTab.scala ---
@@ -16,9 +16,9 @@
  */
 
 package org.apache.spark.ui.jobs
-
+import javax.servlet.http.HttpServletRequest
--- End diff --

Will look into this...


---




[GitHub] spark pull request #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to p...

2017-10-20 Thread ambauma
Github user ambauma commented on a diff in the pull request:

https://github.com/apache/spark/pull/19528#discussion_r146080089
  
--- Diff: python/pyspark/mllib/classification.py ---
@@ -173,7 +173,7 @@ def __init__(self, weights, intercept, numFeatures, numClasses):
 self._dataWithBiasSize = None
 self._weightsMatrix = None
 else:
-self._dataWithBiasSize = self._coeff.size / (self._numClasses - 1)
+self._dataWithBiasSize = self._coeff.size // (self._numClasses - 1)
--- End diff --

I had to apply this to get past a python unit test failure.  My assumption 
is that the NewSparkPullRequestBuilder is on a different version of numpy than 
when the Spark 1.6 branch was last built.  The current python unit test failure 
looks like it has to do with a novel version of SciPy.


---




[GitHub] spark issue #19538: [SPARK-20393][WEBU UI][BACKPORT-2.0] Strengthen Spark to...

2017-10-20 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19538
  
I'm not looking for an official release.  My goal is to get the fix into
the official branch 1.6 to reduce the number of forks necessary, and so that if
CVE-2018- comes and I've moved on, my replacement doesn't have to apply this
plus that.


---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-19 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
Believed fixed.  Hard to say for sure without knowing the precise Python
and numpy versions the build is using.


---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-19 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
Able to duplicate.  Working theory is that this is related to numpy 1.12.1. 
 Here is my conda env:
(spark-1.6) andrew@andrew-Inspiron-7559:~/git/spark$ conda list
# packages in environment at /home/andrew/.conda/envs/spark-1.6:
#
ca-certificates  2017.08.26  h1d4fec5_0
certifi          2016.2.28   py34_0
intel-openmp     2018.0.0    h15fc484_7
libedit          3.1         heed3624_0
libffi           3.2.1       h4deb6c0_3
libgcc-ng        7.2.0       h7cc24e2_2
libgfortran      1.0         0
libstdcxx-ng     7.2.0       h7a57d05_2
mkl              2017.0.3    0
ncurses          6.0         h06874d7_1
numpy            1.12.1      py34_0
openblas         0.2.19      0
openssl          1.0.2l      h077ae2c_5
pip              9.0.1       py34_1
python           3.4.5       0
readline         6.2         2
setuptools       27.2.0      py34_0
sqlite           3.13.0      0
tk               8.5.18      0
wheel            0.29.0      py34_0
xz               5.2.3       h2bcbf08_1
zlib             1.2.11      hfbfcf68_1


---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-19 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
Working on duplicating PySpark failures...


---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-19 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
I just posted the 2.0 pull request.  
https://github.com/apache/spark/pull/19538


---




[GitHub] spark pull request #19538: [SPARK-20393][WEBU UI][2.0] Strengthen Spark to p...

2017-10-19 Thread ambauma
GitHub user ambauma opened a pull request:

https://github.com/apache/spark/pull/19538

[SPARK-20393][WEBU UI][2.0] Strengthen Spark to prevent XSS vulnerabilities

## What changes were proposed in this pull request?

This is the fix for the master branch applied to the 2.0 branch. My 
(unnamed) company will be using Spark 1.6 probably for another year. We have 
been blocked from having Spark 1.6 on our workstations until CVE-2017-7678 is 
patched, which SPARK-20393 does. I was told I need to patch branch 2.0 before 
branch 1.6 could be patched.

## How was this patch tested?

The patch came with unit tests. The test build passed. Manual testing on
one of the affected screens showed the newline character removed. Screen
display was the same regardless (HTML ignores newline characters).
![screenshot from 2017-10-19 12-54-01](https://user-images.githubusercontent.com/12421739/31786133-09ab7ea2-b4cd-11e7-88db-68c09e5b955b.png)



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ambauma/spark branch-2.0

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19538.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19538


commit 94918ea5e46ec1a1e8f12677bce51634efee6e35
Author: NICHOLAS T. MARION <nmar...@us.ibm.com>
Date:   2017-05-10T09:59:57Z

[SPARK-20393][WEBU UI] Strengthen Spark to prevent XSS vulnerabilities

Add stripXSS and stripXSSMap to Spark Core's UIUtils, and call these
functions at any point that getParameter is called against an HttpServletRequest.

Unit tests, IBM Security AppScan Standard no longer showing 
vulnerabilities, manual verification of WebUI pages.

Author: NICHOLAS T. MARION <nmar...@us.ibm.com>

Closes #17686 from n-marion/xss-fix.

commit 3e01302e8870c3193232463b03a734a0980be554
Author: ambauma <andrew.m.baum...@gmail.com>
Date:   2017-10-19T00:54:58Z

Changes based on code review.




---




[GitHub] spark issue #19528: [SPARK-20393][WEBU UI][1.6] Strengthen Spark to prevent ...

2017-10-19 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
I have a release in my fork for my immediate needs.  However, Spark 1.6 is
still included in Hortonworks and is the default in Cloudera.  This patch addresses
CVE-2017-7678.  Some companies in strict regulatory environments may fail 
audits and be forced to remove Spark 1.6 if it is not patched.  Rather than 
keeping security patches in forks, I think it makes sense to merge them back 
into the mainline for branches that are still in active use.  That way if I get 
hit by a bus and CVE-2018- comes out, CVE-2017-7678 will already be covered 
and the work will not need to be duplicated.


---




[GitHub] spark issue #19528: [SPARK-20393] [Core] Existing patch applied to 1.6 branc...

2017-10-18 Thread ambauma
Github user ambauma commented on the issue:

https://github.com/apache/spark/pull/19528
  
Understood.  Working on porting to 2.0...


---




[GitHub] spark pull request #19528: [SPARK-20393] [Core] Existing patch applied to 1....

2017-10-18 Thread ambauma
GitHub user ambauma opened a pull request:

https://github.com/apache/spark/pull/19528

[SPARK-20393] [Core] Existing patch applied to 1.6 branch.

## What changes were proposed in this pull request?

This is the fix for the master branch applied to the 1.6 branch.  My 
(unnamed) company will be using Spark 1.6 probably for another year.   We have 
been blocked from having Spark 1.6 on our workstations until CVE-2017-7678 is 
patched, which SPARK-20393 does.  I realize there will not be an official Spark 
1.6.4 release, but it still seems wise to keep the code there patched for those 
who are stuck on that version.  Otherwise I imagine several forks duplicating 
1.6 compliance and security fixes.

## How was this patch tested?

The patch came with unit tests.  The test build passed.  Manual testing on
one of the affected screens showed the newline character removed.  Screen
display was the same regardless (HTML ignores newline characters).
![screenshot from 2017-10-18 14-18-17](https://user-images.githubusercontent.com/12421739/31739388-50db67c0-b413-11e7-8928-c5c874380835.png)


The patch itself is from previous pull requests associated with SPARK-20393.
My original "work" was deciding what to apply to branch 1.6, and I license
the work to the project under the project's open source license.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/ambauma/spark branch-1.6

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19528.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19528


commit d9a45aaabfe7a264a34a15896aa0352d911daf45
Author: NICHOLAS T. MARION <nmar...@us.ibm.com>
Date:   2017-05-10T09:59:57Z

Initial Merge of SPARK-20393 to 1.6 branch

commit 630854a58d8fbf562e65ea8b02fd6cd32430f957
Author: ambauma <andrew.m.baum...@gmail.com>
Date:   2017-10-10T20:33:21Z

Removing what I believe is extra code never intended for the Spark 1.6 
branch from the merge of SPARK-20393

commit ffe3e9867ef84cfbee72b7ef3d41d902169ec287
Author: ambauma <andrew.m.baum...@gmail.com>
Date:   2017-10-10T21:08:07Z

Adding back in DriverPage.scala




---
