[GitHub] spark issue #9287: SPARK-11326: Split networking in standalone mode

2016-10-09 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the issue:

https://github.com/apache/spark/pull/9287
  
Closing, although I don't really understand why it wasn't accepted.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #9287: SPARK-11326: Split networking in standalone mode

2016-10-09 Thread jacek-lewandowski
Github user jacek-lewandowski closed the pull request at:

https://github.com/apache/spark/pull/9287





[GitHub] spark pull request #14200: [SPARK-16528][SQL] Fix NPE problem in HiveClientI...

2016-07-19 Thread jacek-lewandowski
Github user jacek-lewandowski commented on a diff in the pull request:

https://github.com/apache/spark/pull/14200#discussion_r71277387
  
--- Diff: 
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala 
---
@@ -320,7 +320,7 @@ private[hive] class HiveClientImpl(
 name = d.getName,
 description = d.getDescription,
 locationUri = d.getLocationUri,
-properties = d.getParameters.asScala.toMap)
+properties = Option(d.getParameters).map(_.asScala.toMap).orNull)
--- End diff --

Perhaps... however, this would change the semantics, which was out of the 
scope of this ticket. 





[GitHub] spark issue #14200: SPARK-16528: Fix NPE problem in HiveClientImpl

2016-07-14 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the issue:

https://github.com/apache/spark/pull/14200
  
test this please





[GitHub] spark pull request #14200: SPARK-16528: Fix NPE problem in HiveClientImpl

2016-07-14 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/14200

SPARK-16528: Fix NPE problem in HiveClientImpl

## What changes were proposed in this pull request?

There are some calls to methods or fields (getParameters, properties) whose 
results are then passed to Java/Scala collection converters. Unfortunately, 
those values can be null in some cases, and the conversion then throws an NPE. 
We fix this by wrapping the calls to those fields and methods in an Option and 
only then doing the conversion.
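
As a rough sketch of the pattern this PR applies (the helper name below is 
illustrative, not an actual Spark API), the null-safe conversion looks 
roughly like this:

```scala
import scala.collection.JavaConverters._

// Illustrative helper (not the real HiveClientImpl code): wrap a
// possibly-null Java map in Option before converting, so a null value
// never reaches asScala and cannot throw an NPE.
def toScalaMap(params: java.util.Map[String, String]): Map[String, String] =
  Option(params).map(_.asScala.toMap).orNull

val m = new java.util.HashMap[String, String]()
m.put("k", "v")
// toScalaMap(m) == Map("k" -> "v"); toScalaMap(null) == null, no NPE
```

Calling `null.asScala` directly would throw; routing through `Option` keeps 
the existing null-means-absent semantics intact.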

## How was this patch tested?

Manually tested with a custom Hive metastore. 


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-16528

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/14200.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #14200


commit 667abc0d842266405ba412f468358e76b3d815c8
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2016-07-14T10:14:36Z

SPARK-16528: Fix NPE problem in HiveClientImpl

There are some calls to methods or fields (getParameters, properties) whose 
results are then passed to Java/Scala collection converters. Unfortunately, 
those values can be null in some cases, and the conversion then throws an NPE. 
We fix this by wrapping the calls to those fields and methods in an Option and 
only then doing the conversion.







[GitHub] spark pull request: [SPARK-12639] [SQL] Mark Filters Fully Handled...

2016-05-10 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/11317#issuecomment-218287519
  
jenkins, test this please





[GitHub] spark pull request: [SPARK-12639] [SQL] Mark Filters Fully Handled...

2016-05-10 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/11317#issuecomment-218287442
  
test this please





[GitHub] spark pull request: SPARK-11882: Custom scheduler support

2015-12-21 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/10292#issuecomment-166396719
  
@ScrapCodes - yes, we are building Spark on top of this change and it is 
working correctly. 





[GitHub] spark pull request: SPARK-11882: Custom scheduler support

2015-12-14 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/10292

SPARK-11882: Custom scheduler support



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-11882

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/10292.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #10292


commit 1911689463bdd7ac8625d1a9fe708b034f8168fc
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-11-13T22:14:50Z

SPARK-11882: Custom scheduler support







[GitHub] spark pull request: SPARK-11726: Throw exception on timeout when w...

2015-11-13 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/9692

SPARK-11726: Throw exception on timeout when waiting for REST server 
response



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-11726

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/9692.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #9692


commit 79fdabd5888a01ff7b98ce3a0bf39de531486906
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-11-13T13:55:46Z

SPARK-11726: Throw exception on timeout when waiting for REST server 
response







[GitHub] spark pull request: [SPARK-2750][WEB UI]Add Https support for Web ...

2015-11-12 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5664#issuecomment-156095274
  
I rebased this PR on the current master, with a little pain :smile: 
Will create a new PR soon.






[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-11-11 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-155764855
  
Rebased and more reasonably split into commits.





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-11-11 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-155872516
  
jenkins, test this please





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-11-09 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-155208288
  
jenkins, test this please





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-11-06 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-154496831
  
@vanzin I've changed it a little bit. The master no longer creates two 
endpoints; instead, different users can be used with a single endpoint. So 
basically the user can plug in their own password authenticator, which will be 
used to authenticate Spark users with the master. 

To authenticate workers with the master, and clients when no special 
password authenticator is plugged in, the default SASL user is used (as 
before). When the user wants to enable more sophisticated authentication, 
they simply need to provide an authenticator, or just set different secrets for 
the default user {{sparkSaslUser}} (which is used by the workers) and for other 
users (the default authenticator allows setting multiple secrets like 
{{spark.authenticate.secrets.=password}}, while the default password 
is set as usual with {{spark.authenticate.secret}}). In this case, the users 
simply do not need to know the default user's password. Obviously, this only 
works for Netty RPC. The original behaviour when no secret or a single 
shared secret is provided remains unchanged.
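
A minimal sketch of the per-user secret lookup with a default fallback 
described above; the configuration key names here are assumptions for 
illustration, not the actual Spark authenticator code:

```scala
// Hypothetical sketch: look up a per-user secret, falling back to the
// shared default secret. Key names are illustrative assumptions.
def secretFor(conf: Map[String, String], user: String): Option[String] =
  conf.get(s"spark.authenticate.secrets.$user")
    .orElse(conf.get("spark.authenticate.secret"))

val conf = Map(
  "spark.authenticate.secret" -> "default-pass",
  "spark.authenticate.secrets.alice" -> "alice-pass")
// secretFor(conf, "alice") == Some("alice-pass")
// secretFor(conf, "bob")   == Some("default-pass")
```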

The last PR is not polished, but it removes most of the changes from the 
master. The remaining thing is authorisation: determining who is communicating 
with the master and authorising the requests (for example, only the default 
user can exchange worker messages, or only the application owner can remove 
their application).






[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-11-05 Thread jacek-lewandowski
Github user jacek-lewandowski commented on a diff in the pull request:

https://github.com/apache/spark/pull/9287#discussion_r44104918
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/Client.scala ---
@@ -23,9 +23,10 @@ import scala.reflect.ClassTag
 import scala.util.{Failure, Success}
 
 import org.apache.log4j.{Level, Logger}
+import org.apache.spark.SecurityManager.{SPARK_AUTH_SECRET_CONF, 
ENV_AUTH_SECRET}
--- End diff --

Weird that the IDE didn't take care of that :(





[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-11-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9299#issuecomment-152950651
  
@srowen That is the intent. I think I cannot remove it from 
`ApplicationDescription` because this address is set by the driver. The 
driver puts it into the `ApplicationDescription` object and sends it to the 
Master. If I removed it, the Master wouldn't learn the UI address of the driver.





[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-11-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9299#issuecomment-152970309
  
Ok, agree





[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-11-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9299#issuecomment-152962517
  
@srowen I can rename it if you insist. However, this value is in fact 
immutable. The driver UI address does not get changed. The Master replaces the 
URL which points to the driver's UI when the driver is dead, but this doesn't 
mean the driver UI address was changed; it just became unavailable. So, imo the 
current names are ok. Though, if you think they are confusing, it would be 
better to change the name in `ApplicationInfo` to something like 
`appUIOrNotFound` or `appUIPlaceholder`. 





[GitHub] spark pull request: [SPARK-2750][WEB UI]Add Https support for Web ...

2015-10-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5664#issuecomment-152500303
  
@WangTaoTheTonic if you don't have time, would you mind if I take your 
commits, rebase and squash them, and add a few changes?





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-10-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-152174907
  
jenkins, test this please





[GitHub] spark pull request: [SPARK-2750][WEB UI]Add Https support for Web ...

2015-10-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5664#issuecomment-152194366
  
@WangTaoTheTonic can you rebase and squash?





[GitHub] spark pull request: SPARK-11402: Use ChildRunnerProvider to create...

2015-10-29 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/9354

SPARK-11402: Use ChildRunnerProvider to create ExecutorRunner and 
DriverRunner

Abstracted ExecutorRunner and DriverRunner. The current implementations 
were renamed to ExecutorRunnerImpl and DriverRunnerImpl respectively.
Added a way to provide a custom implementation of the runners by defining 
their factories.
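
A rough sketch of the factory abstraction described above; the trait and 
method names are illustrative assumptions, not the actual signatures in the PR:

```scala
// Illustrative sketch of pluggable runner factories, not the actual API.
trait ExecutorRunner { def start(): Unit }

trait ExecutorRunnerFactory {
  def create(appId: String): ExecutorRunner
}

// A stand-in for a default implementation such as ExecutorRunnerImpl.
class LoggingRunner(appId: String) extends ExecutorRunner {
  def start(): Unit = println(s"starting executor for $appId")
}

object DefaultFactory extends ExecutorRunnerFactory {
  def create(appId: String): ExecutorRunner = new LoggingRunner(appId)
}
```

The Worker would be handed a factory instead of constructing runners 
directly, so deployments can swap in custom runner implementations.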

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-11402

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/9354.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #9354


commit a2d2ec8a555d5a2adf20cb2cf29fc76b02e923a6
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-15T15:08:21Z

SPARK-11402: Use ChildRunnerProvider to create ExecutorRunner and 
DriverRunner

Abstracted ExecutorRunner and DriverRunner. The current implementations 
were renamed to ExecutorRunnerImpl and DriverRunnerImpl respectively.
Added a way to provide a custom implementation of the runners by defining 
their factories.







[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-10-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on a diff in the pull request:

https://github.com/apache/spark/pull/9299#discussion_r43393292
  
--- Diff: 
core/src/main/scala/org/apache/spark/deploy/master/ApplicationInfo.scala ---
@@ -32,7 +32,8 @@ private[spark] class ApplicationInfo(
 val desc: ApplicationDescription,
 val submitDate: Date,
 val driver: RpcEndpointRef,
-defaultCores: Int)
+defaultCores: Int,
+var appUiUrl: String)
--- End diff --

Yes, I moved it to that class because it represents the mutable state of the 
application in the Master, and if the appUiUrl changes, it is due to an 
application state change. This class has many more mutable fields, so it 
doesn't make things worse. The main purpose of this ticket was to move mutable 
fields out of `ApplicationDescription`, which is used to transfer data from one 
process to another and by definition should be immutable.





[GitHub] spark pull request: SPARK-11415: Remove timezone shift of Catalyst...

2015-10-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9369#issuecomment-152417671
  
jenkins, test this please





[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-10-27 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/9299

SPARK-11344: Made ApplicationDescription and DriverDescription case classes



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-11344

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/9299.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #9299


commit 1870a5ad0c5d0f337b78e502b181092cfc1b46fc
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-27T11:35:28Z

SPARK-11344: Made ApplicationDescription and DriverDescription case classes







[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-10-27 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9299#issuecomment-151527896
  
Thanks @srowen 





[GitHub] spark pull request: SPARK-11344: Made ApplicationDescription and D...

2015-10-27 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9299#issuecomment-151611992
  
@JoshRosen could you verify this?





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-10-26 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-151310713
  
@vanzin could you take a look?





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-10-26 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/9287#issuecomment-151310670
  
jenkins, test this please





[GitHub] spark pull request: SPARK-11326: Split networking in standalone mo...

2015-10-26 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/9287

SPARK-11326: Split networking in standalone mode



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-11326

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/9287.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #9287


commit 65323ed474b1f6ebcd8dde8aae8da0dcaeb5b3df
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-23T10:39:35Z

Add fromNamespace method to SparkConf

This method allows converting properties at a given namespace into 
properties at the base namespace, for example spark.ns1.xxx to spark.xxx. 
This will be very useful when we have the same set of properties for different 
components and we don't want to modify, say, SecurityManager.
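
The namespace flattening described here could be sketched roughly as follows 
(a toy standalone function, not the actual SparkConf method):

```scala
// Toy sketch of the fromNamespace idea: rewrite spark.<ns>.xxx keys
// into spark.xxx keys, dropping properties outside the namespace.
def fromNamespace(props: Map[String, String], ns: String): Map[String, String] =
  props.collect {
    case (k, v) if k.startsWith(s"spark.$ns.") =>
      ("spark." + k.stripPrefix(s"spark.$ns.")) -> v
  }

val props = Map("spark.ns1.auth" -> "on", "spark.other" -> "x")
// fromNamespace(props, "ns1") == Map("spark.auth" -> "on")
```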

commit 6b2c23eb6c292aa30988f74f8aedb628e8a9
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-26T11:35:41Z

SecurityManager does not mix usages of env variables and SparkConf

Setting secret key from env variable for executors has been moved to 
executor backends because this logic is not a part of SecurityManager. It also 
makes SecurityManager purely configured by SparkConf passed as parameter.

commit 7ef3d6af55cd2ab8ea19cc45b626f445e1696653
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-22T03:45:18Z

Added a secondary RPC interface in Master.

Secondary RPC interfaces is intended to handle only client communication. 
It doesn't handle messages normally sent by workers.
The purpose of such demarcation is making it possible for the cluster 
(master and workers) to have a separate security configuration (distinct secure 
token) which is not disclosed to the clients.
This commit doesn't introduce separate security configurations for both 
interfaces.

For simplicity and to retain backward compatibility, the primary RPC 
service remains unchanged and accepts all kinds of messages, while the 
secondary service just forwards a subset of messages to the primary service 
(those used for communication with the client/driver). This is fine because 
even in a secure cluster only workers will be able to communicate with the 
master (only workers will have the proper secure token), and they will not 
send any client-like messages. This approach reduces the number of meaningful 
changes and avoids synchronisation issues between the two RPC handlers.
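The forwarding scheme described in this commit message can be sketched like this (the message types and handler names are made up for illustration; they are not the actual Master RPC API):

```scala
sealed trait Message
// Client-facing message, allowed on the secondary interface.
final case class RegisterApplication(appName: String) extends Message
// Worker-only message, handled exclusively by the primary interface.
final case class WorkerHeartbeat(workerId: String) extends Message

class PrimaryHandler {
  val received = scala.collection.mutable.ListBuffer.empty[Message]
  def receive(m: Message): Unit = received += m
}

// The secondary handler forwards only the client subset to the primary
// handler and refuses everything else, so no worker logic is duplicated.
class SecondaryHandler(primary: PrimaryHandler) {
  def receive(m: Message): Boolean = m match {
    case c: RegisterApplication => primary.receive(c); true
    case _                      => false
  }
}
```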

commit 9dde6288c81868ba432c89a6b07fb36c1f3c4a94
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-23T07:50:48Z

Separate RPC for AppClient

The application client, which is essentially the entity responsible for 
communicating with the Spark Master in standalone mode, was using the RPC env 
inherited from SparkEnv. It has been changed so that it now sets up its own 
RPC env with a distinct configuration. By default it takes the same host as 
SparkEnv and a random port.

This change allows a separate network configuration for communication with 
the scheduler and for internal application communication (driver and 
executors).
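The default mentioned above amounts to something like the sketch below (hypothetical names; port 0 conventionally asks the OS for a random free port):

```scala
object ClientRpc {
  final case class RpcEnvConfig(name: String, host: String, port: Int)

  // Derive the client-side env from the driver's: same host, random port.
  def clientRpcConfig(driver: RpcEnvConfig): RpcEnvConfig =
    driver.copy(name = driver.name + "-client", port = 0)
}
```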

commit f3b2d9f51b1b28da9e74646024d9fb7ec4a6df9d
Author: Jacek Lewandowski <lewandowski.ja...@gmail.com>
Date:   2015-10-23T10:07:05Z

Use ShuffleSecretManager in standalone mode

The purpose of this change is to allow applications to use distinct secret 
keys in standalone mode. This commit should not change the default behaviour, 
though, because the token is still retrieved from SecurityManager.
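Per-application secrets boil down to a registry keyed by application id, roughly as below (a minimal sketch with hypothetical names, not the actual ShuffleSecretManager API):

```scala
class SecretRegistry {
  private val secrets = scala.collection.mutable.Map.empty[String, String]

  // Called when an application starts; each app brings its own key.
  def register(appId: String, secret: String): Unit = secrets(appId) = secret

  // A token only unlocks the application it was registered for.
  def check(appId: String, secret: String): Boolean =
    secrets.get(appId).contains(secret)

  // Called when the application finishes.
  def unregister(appId: String): Unit = secrets -= appId
}
```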




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-2750][WEB UI]Add Https support for Web ...

2015-09-01 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5664#issuecomment-136934630
  
@vanzin @andrewor14 I may take a look at this and finish it, or create a 
separate PR with my approach if you like.





[GitHub] spark pull request: SPARK-7171: Added a method to retrieve metrics...

2015-07-20 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5805#issuecomment-123018885
  
@andrewor14 is it ok now?





[GitHub] spark pull request: [SPARK-6602][Core]Replace Akka Serialization w...

2015-07-16 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/7159#issuecomment-121929208
  
@JoshRosen this shouldn't be a problem for us. Thanks for pinging me anyway.





[GitHub] spark pull request: SPARK-8562: Log the lost of an executor only i...

2015-06-23 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/6952#issuecomment-114537354
  
jenkins, retest this please





[GitHub] spark pull request: SPARK-8562: Log the lost of an executor only i...

2015-06-23 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/6952

SPARK-8562: Log the lost of an executor only if SparkContext is alive



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-8562-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/6952.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #6952


commit 1963a1b3520d4414fa09bf7790f6c43b58a67867
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-06-23T12:28:01Z

SPARK-8562: Log the lost of an executor only if SparkContext is alive







[GitHub] spark pull request: SPARK-7171: Added a method to retrieve metrics...

2015-06-22 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5805#issuecomment-114020441
  
@andrewor14 rebased





[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-06-19 Thread jacek-lewandowski
Github user jacek-lewandowski closed the pull request at:

https://github.com/apache/spark/pull/5788





[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-06-01 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5788#issuecomment-107807461
  
@vanzin I'm really sorry - I must have missed the notification





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5977#issuecomment-100116778
  
@ScrapCodes do you mean should matchers?





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5977#issuecomment-100158160
  
I don't agree with this - you already use such notation in a few places 
(in IDEA, searching for `\d+\sseconds` excluding comments and string literals 
shows 16 usages in production code and 71 usages in test code; searching for 
usages of the dot-less notation of the `===` method shows 4k+ usages).

Although I can see the benefits of this style guide item for production 
code, I cannot understand how it could be forbidden for test code. How would 
you use should matchers without this notation? Scala test frameworks are 
designed to use dot-less notation to improve the readability and 
understanding of test cases, as well as to decrease unneeded verbosity. 






[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5977#issuecomment-100185582
  
@ScrapCodes so this one is enough then?





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5977#issuecomment-100182540
  
Thanks @ScrapCodes :)
Other pull requests with the same issue id are against branch-1.3 and 
branch-1.4. Isn't it going to be applied to those branches?






[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski closed the pull request at:

https://github.com/apache/spark/pull/5975





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5977#issuecomment-100204960
  
Ahh, I do remember now - I needed to change something in the test while 
creating a branch for some release... so I would keep them.





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-07 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/5976

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-7436-1.4

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/5976.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5976


commit 629831306a63c4eed3659f3c2551d999ca927515
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-05-07T09:06:28Z

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests







[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-07 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/5975

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-7436-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/5975.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5975


commit 3988817bbe121d3f431d76df718f6f8992eb74e4
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-05-07T09:06:28Z

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests







[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-07 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/5977

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-7436

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/5977.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5977


commit ff0a3c26d49073f7f3d98425f4df59c55ca02670
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-05-07T09:06:28Z

SPARK-7436: Fixed instantiation of custom recovery mode factory and added 
tests







[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-07 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5975#issuecomment-99937988
  
retest this please





[GitHub] spark pull request: SPARK-7436: Fixed instantiation of custom reco...

2015-05-07 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5975#issuecomment-99938103
  
test this please





[GitHub] spark pull request: SPARK-7171: Added a method to retrieve metrics...

2015-05-03 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5805#issuecomment-98590756
  
@squito so who should I ask for this?





[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-05-01 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5788#issuecomment-98233328
  
@jerryshao because this is a very minor fix, I wanted to prepare a PR for 
the lowest version I think is suitable. After a successful review and 
agreement to merge it to particular versions, I'll create the corresponding 
PRs.





[GitHub] spark pull request: SPARK-7171: Added a method to retrieve metrics...

2015-04-30 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/5805

SPARK-7171: Added a method to retrieve metrics sources in TaskContext



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-7171

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/5805.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5805


commit 92aa76fc559499470595fcd772d750b34d128cc6
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-04-30T08:30:34Z

SPARK-7171: Added a method to retrieve metrics sources in TaskContext







[GitHub] spark pull request: SPARK-7171: Added a method to retrieve metrics...

2015-04-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5805#issuecomment-97714183
  
retest this please





[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-04-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5788#issuecomment-97577823
  
Jenkins test this please





[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-04-29 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/5788

SPARK-7169: Metrics can be additionally configured from Spark configuration



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-7169

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/5788.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5788


commit 750927c78912cbd970584be18ec5b3424e30ffd7
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-04-29T20:26:16Z

SPARK-7169: Metrics can be additionally configured from Spark configuration







[GitHub] spark pull request: SPARK-7169: Metrics can be additionally config...

2015-04-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/5788#issuecomment-97578751
  
retest this please





[GitHub] spark pull request: SPARK-5548: Fix for AkkaUtilsSuite failure - a...

2015-02-17 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4653

SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5548-2-master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4653.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4653


commit 843eafb343bac705e8a4a091501ca2d55df78240
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-02-17T19:00:38Z

SPARK-5548: Fix for AkkaUtilsSuite failure - attempt 2







[GitHub] spark pull request: SPARK-5548: Fixed a race condition in AkkaUtil...

2015-02-17 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4343#issuecomment-74728480
  
Here is the new PR https://github.com/apache/spark/pull/4653





[GitHub] spark pull request: SPARK-5408: Use -XX:MaxPermSize specified by u...

2015-02-08 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4203#issuecomment-73376469
  
Thanks @srowen 





[GitHub] spark pull request: SPARK-5548: Fixed a race condition in AkkaUtil...

2015-02-05 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4343#issuecomment-73118448
  
@pwendell so is this PR enough (according to what @JoshRosen said), or 
should I create another one for master?





[GitHub] spark pull request: SPARK-5548: Fixed a race condition in AkkaUtil...

2015-02-04 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4343#issuecomment-72858032
  
@pwendell I think the problem with catching either exception is that the 
`ActorNotFoundException` is also thrown in other situations. In this 
particular test we want to prove that actor system A cannot connect to 
actor system B because A doesn't trust B, and this results in a timeout. 
This is an even lower communication layer than SASL. 
On the other hand, `ActorNotFoundException` is thrown in case of a timeout, 
when there is no actor with that name, or in case of authentication problems 
- completely different situations, because the latter two occur when we have 
successfully connected to the other actor system and that remote actor 
system refuses to return a reference to the requested actor. 
Therefore, an `ActorNotFoundException` not related to the timeout should 
cause the test to fail, because it means that we successfully connected to 
the untrusted actor system.
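The distinction argued for above can be sketched as a predicate over the lookup failure (a stand-in type for illustration; the real test would inspect Akka's `ActorNotFoundException` and the timeout instead):

```scala
object UntrustedCheck {
  // Stand-in for the actor lookup failure; only the reason matters here.
  final case class ActorLookupFailure(reason: String)

  // A timeout is the expected outcome when the remote system refuses us at
  // the transport layer; any other failure means we actually got through.
  def isExpectedTimeout(f: ActorLookupFailure): Boolean =
    f.reason.toLowerCase.contains("timed out")
}
```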






[GitHub] spark pull request: SPARK-5548: Fixed a race condition in AkkaUtil...

2015-02-03 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4343

SPARK-5548: Fixed a race condition in AkkaUtilsSuite



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5548-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4343.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4343


commit b9ba47e635cb31d3a66c7417cb0048edd667ed42
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-02-03T21:08:36Z

SPARK-5548: Fixed a race condition in AkkaUtilsSuite







[GitHub] spark pull request: SPARK-5548: Fixed a race condition in AkkaUtil...

2015-02-03 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4343#issuecomment-72735186
  
Once approved, I'll create another PR for master.





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-02-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72544044
  
@JoshRosen I've just added the documentation. I hope there is enough of it 
:)





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-02-02 Thread jacek-lewandowski
Github user jacek-lewandowski closed the pull request at:

https://github.com/apache/spark/pull/4220





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-02-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72529009
  
@JoshRosen is it ok to go now?





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-02-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72560136
  
I don't get why the tests failed. I only added documentation!





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-02-02 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72440499
  
Ok @JoshRosen I've found a way to do this. 






[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-31 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72309821
  
Agreed, I'll make the changes and add the clarifying comments.





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72274441
  
It serializes the object and then deserializes it, so I suppose this is a deep 
copy. 

For `stringPropertyNames` - you can, but this will not be a 1:1 copy:

```scala
val parent = new Properties()
parent.setProperty("test1", "A")

val child = new Properties(parent)
child.put("test1", "C")
child.put("test2", "B")

child.getProperty("test1")
child.remove("test1")
child.getProperty("test1")
```
will give you
```
res17: Object = C
res18: String = A
```

When you copy in the way you suggested, there will be `null` after the removal.






[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72272628
  
Look at this simple example:
```scala
val parent = new Properties()
parent.setProperty("test1", "A")

val child = new Properties(parent)
child.put("test2", "B")

val copy = new Properties()
copy.putAll(child)

child.getProperty("test1")
child.getProperty("test2")

copy.getProperty("test1")
copy.getProperty("test2")
```
which will result in:
```
res3: String = A
res4: String = B
res5: String = null
res6: String = B
```

In other words: `new Properties(oldProperties)` initialises a new `Properties` 
object by setting `oldProperties` as its parent (the defaults). On the other hand, 
`new Properties().putAll(oldProperties)` copies only those properties which 
were explicitly set and cuts off the whole hierarchy of defaults. Only cloning 
gives you the same object.
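
The cloning point can be sketched like this (a minimal standalone illustration, 
not code from the PR): `clone()` copies the explicit entries and keeps the 
reference to the defaults chain, so lookups behave identically on the copy.

```scala
import java.util.Properties

val parent = new Properties()
parent.setProperty("test1", "A")

val child = new Properties(parent)
child.put("test2", "B")

// clone() copies the explicit entries and keeps the reference to the
// defaults, so the copy resolves inherited keys just like the original
val cloned = child.clone().asInstanceOf[Properties]
println(cloned.getProperty("test1")) // inherited from parent
println(cloned.getProperty("test2")) // explicit entry
```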






[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72268328
  
@srowen - unfortunately they are something more - they inherit from 
`Hashtable`, but they make a hierarchy by referencing a parent `Properties` 
object which holds the defaults. As the defaults are also `Properties`, they have 
their own parent, and so on.
 





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72308249
  
`Utils.getSystemProperties` is fine, but not as a replacement for 
`stringPropertyNames`. There were actually two problems - one related to 
`ConcurrentModificationException` and the other related to completely abandoning 
the defaults. Unfortunately, `Utils.getSystemProperties` is not enough, because 
the Scala wrapper over Java properties (which is used when you just iterate over 
them) does not consider the default values defined for the properties (as I said - 
they make a hierarchy). Since you still need to use `stringPropertyNames`, which 
creates a defensive copy anyway, cloning the properties before you use them seems 
redundant. Maybe a clearer solution would be to modify `Utils.getSystemProperties` 
to use `stringPropertyNames` to make a copy and return just a simple `Map` instead?
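
A minimal sketch of that suggestion (illustrative only, not the actual `Utils` 
code): snapshot the properties via `stringPropertyNames`, which already resolves 
the defaults chain, and return an immutable `Map`.

```scala
// Illustrative helper: snapshot the system properties, including keys
// that come from the defaults chain, into an immutable Scala Map
def systemPropertiesSnapshot(): Map[String, String] = {
  val names = System.getProperties.stringPropertyNames()
  val it = names.iterator()
  val builder = Map.newBuilder[String, String]
  while (it.hasNext) {
    val key = it.next()
    builder += key -> System.getProperty(key)
  }
  builder.result()
}
```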





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-30 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72168541
  
@JoshRosen can you take a look and maybe merge it?





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-71988851
  
I found that there is a small difference in the outcome of my code compared to 
the original code. I don't know how it affects this test suite yet, but: the 
default Java wrapper over the system properties, which was used originally, uses 
the `entrySet` method to get the properties. That means that when there are some 
defaults defined, they are not used. On the other hand, the `stringPropertyNames` 
method returns the set of property keys both defined explicitly and provided as 
default values, and the same goes for the `System.getProperty` method. The latter 
behaviour seems more appropriate here.
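
The difference can be seen directly in a standalone snippet (illustrative, not 
part of the patch):

```scala
import java.util.Properties

val defaults = new Properties()
defaults.setProperty("a", "1")

val props = new Properties(defaults)
props.setProperty("b", "2")

// entrySet sees only entries set explicitly on this object...
println(props.entrySet().size())            // only "b"
// ...while stringPropertyNames also walks the defaults chain
println(props.stringPropertyNames().size()) // "a" and "b"
```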






[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-71992373
  
It looks like there is a `ResetSystemProperties` trait which works completely 
incorrectly. Its aim is to save the system properties before the test and then 
restore them after the test. However, the way the copy was created and then 
restored was invalid. 

The copy was created in this way:
```scala
oldProperties = new Properties(System.getProperties)
```

which did not initialize the properties as they were in the original system 
properties, but rather set them as defaults in the new properties object, which 
in turn made `JPropertiesWrapper` not consider them at all (actually, 
`JPropertiesWrapper` would return an empty iterator then).

The way I'm gonna fix the `ResetSystemProperties` trait is to use 
`SerializationUtils.clone` to save the system properties.
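
`SerializationUtils.clone` is just a serialization round-trip; the same effect 
can be sketched with plain JDK streams (an illustration of the idea - the fix 
itself uses the commons-lang helper):

```scala
import java.io._
import java.util.Properties

// Deep-copy Properties by serializing and deserializing it; unlike
// new Properties(p), this preserves the explicit entries themselves
// (and the serialized defaults chain) rather than demoting everything
// to defaults of a fresh object
def deepClone(p: Properties): Properties = {
  val buffer = new ByteArrayOutputStream()
  val out = new ObjectOutputStream(buffer)
  out.writeObject(p)
  out.close()
  val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
  in.readObject().asInstanceOf[Properties]
}
```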








[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-71993756
  
@JoshRosen I'll add a way to create `SSLOptions` which treats another 
`SSLOptions` instance as a set of defaults. I don't expect any API changes, since 
the only thing you will have to do is to create additional `SSLOptions` objects in 
`SecurityManager`.
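
The defaults mechanism could look roughly like this (the field names are purely 
illustrative - this is not the actual `SSLOptions` API):

```scala
// Hypothetical shape: every unset field falls back to the defaults instance
case class Opts(
    enabled: Option[Boolean] = None,
    keyStorePath: Option[String] = None) {
  def withDefaults(defaults: Opts): Opts =
    Opts(
      enabled = enabled.orElse(defaults.enabled),
      keyStorePath = keyStorePath.orElse(defaults.keyStorePath))
}

val globalDefaults = Opts(enabled = Some(true), keyStorePath = Some("/etc/spark/keystore"))
val akkaSpecific = Opts(keyStorePath = Some("/etc/spark/akka-keystore"))
// unset fields come from the defaults, set fields win
val effective = akkaSpecific.withDefaults(globalDefaults)
```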





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72008176
  
Since you accepted my changes, I squashed them and rebased the PR. The 
second commit is for what I mentioned in the previous comment - the sample 
usage is in the test suite. 






[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4222#issuecomment-72008939
  
Explanation of the second commit is in 
https://github.com/apache/spark/pull/4220





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4221#issuecomment-72008957
  
Explanation of the second commit is in 
https://github.com/apache/spark/pull/4220





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-7201
  
After thinking more about this, I decided to refactor things a little bit 
to make it more consistent.





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72042566
  
@vanzin @JoshRosen wdyt about the recent changes?





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-72118648
  
So is it going to be merged?





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-29 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-72132146
  
AFAIK the Typesafe configuration library notifies about missing properties by 
throwing an exception. That's why the `Try` - I just thought that this is a 
cleaner form than a `try...catch` block.
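
The pattern in question, shown with a plain map lookup instead of the Typesafe 
config API (an illustrative sketch, not the PR code):

```scala
import scala.util.Try

// A lookup that throws on a missing key, wrapped in Try so a missing
// value becomes a default instead of an exception escaping
def lookupOrElse(settings: Map[String, String], key: String, default: String): String =
  Try(settings(key)).getOrElse(default)

val settings = Map("port" -> "7077")
lookupOrElse(settings, "port", "0")         // key present
lookupOrElse(settings, "host", "localhost") // key missing, falls back
```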





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-27 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-71651118
  
What is going on with these tests??? I've created three PRs - for 1.1, 1.2 
and 1.3 and all of them failed in a very strange way. 





[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-27 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4220

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5425-1.1

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4220.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4220


commit 685780e4ed8d2bc4a734e002e03482bf52a6c517
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-27T11:36:52Z

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf







[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-27 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4221

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5425-1.2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4221.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4221


commit 94aeacf6fcc7fae6d045d35b9d8f1fe4c2594780
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-27T12:02:29Z

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf







[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-27 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4222

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5425-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4222.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4222


commit 51987d24ea6b29c9607679daa2b482d5855be361
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-27T12:10:51Z

SPARK-5425: Use synchronised methods in system properties to create 
SparkConf







[GitHub] spark pull request: SPARK-5425: Use synchronised methods in system...

2015-01-27 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4220#issuecomment-71640081
  
@srowen exactly - that was the idea :)





[GitHub] spark pull request: SPARK-5408: Use -XX:MaxPermSize specified by u...

2015-01-26 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4202#issuecomment-71519044
  
I guess it should be overridden, but I couldn't find any specification which 
says that it actually works in this way.





[GitHub] spark pull request: SPARK-5408: Use -XX:MaxPermSize specified by u...

2015-01-26 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4202

SPARK-5408: Use -XX:MaxPermSize specified by used instead of default in ...

...ExecutorRunner and DriverRunner

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5408-1.2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4202.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4202


commit 6062321aeee564da6cc57cd3e6d27a5b2b7ce120
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-26T08:44:30Z

SPARK-5408: Use -XX:MaxPermSize specified by used instead of default in 
ExecutorRunner and DriverRunner







[GitHub] spark pull request: SPARK-5408: Use -XX:MaxPermSize specified by u...

2015-01-26 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4203

SPARK-5408: Use -XX:MaxPermSize specified by used instead of default in ...

...ExecutorRunner and DriverRunner

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5408-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4203.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4203


commit d91368639820f9f4696f7b70d484b7d42b39eeb4
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-26T08:46:34Z

SPARK-5408: Use -XX:MaxPermSize specified by used instead of default in 
ExecutorRunner and DriverRunner







[GitHub] spark pull request: SPARK-5408: Use -XX:MaxPermSize specified by u...

2015-01-26 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/4202#issuecomment-71551333
  
@pwendell for the JVM which I use it is overridden. But this is only one JVM. 
As I said - if such behaviour is in the JVM specification, that's good. However, 
if it is not, this should be considered non-deterministic. I couldn't find that 
information.





[GitHub] spark pull request: SPARK-5382: Use SPARK_CONF_DIR in spark-class ...

2015-01-25 Thread jacek-lewandowski
Github user jacek-lewandowski closed the pull request at:

https://github.com/apache/spark/pull/4177





[GitHub] spark pull request: SPARK-5382: Use SPARK_CONF_DIR in spark-class ...

2015-01-23 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4177

SPARK-5382: Use SPARK_CONF_DIR in spark-class and spark-submit, spark-su...

...bmit2.cmd if it is defined

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5382-1.2

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4177.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4177


commit 41cef2576e076a6bcf26850b4f63517caefa4d52
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-23T10:24:27Z

SPARK-5382: Use SPARK_CONF_DIR in spark-class and spark-submit, 
spark-submit2.cmd if it is defined







[GitHub] spark pull request: SPARK-5382: Use SPARK_CONF_DIR in spark-class ...

2015-01-23 Thread jacek-lewandowski
GitHub user jacek-lewandowski opened a pull request:

https://github.com/apache/spark/pull/4179

SPARK-5382: Use SPARK_CONF_DIR in spark-class if it is defined



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jacek-lewandowski/spark SPARK-5382-1.3

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/4179.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #4179


commit 55d77917ca2b84e801b420b575c6e3b5ac1dbd63
Author: Jacek Lewandowski lewandowski.ja...@gmail.com
Date:   2015-01-23T10:37:04Z

SPARK-5382: Use SPARK_CONF_DIR in spark-class if it is defined







[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-23 Thread jacek-lewandowski
Github user jacek-lewandowski commented on the pull request:

https://github.com/apache/spark/pull/3571#issuecomment-71164426
  
@JoshRosen what is going on with these tests? I can see that Jenkins ran 
them twice, and the second comment says they failed, although it has a lower 
build number than the previously mentioned build, which passed the tests. Also, 
the failing tests seem to have nothing to do with this PR.





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-22 Thread jacek-lewandowski
Github user jacek-lewandowski commented on a diff in the pull request:

https://github.com/apache/spark/pull/3571#discussion_r23418026
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/worker/WorkerTest.scala ---
@@ -0,0 +1,56 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.deploy.worker
+
+import org.apache.spark.SparkConf
+import org.apache.spark.deploy.Command
+import org.scalatest.{Matchers, FunSuite}
+
+class WorkerTest extends FunSuite with Matchers {
+
+  def cmd(javaOpts: String*) = Command("", Seq.empty, Map.empty, Seq.empty, Seq.empty, Seq(javaOpts: _*))
+  def conf(opts: (String, String)*) = new SparkConf(loadDefaults = false).setAll(opts)
+
+  test("test isUseLocalNodeSSLConfig") {
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dasdf=dfgh")) shouldBe false
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=true")) shouldBe true
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=false")) shouldBe false
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=")) shouldBe false
+  }
+
+  test("test maybeUpdateSSLSettings") {
+    Worker.maybeUpdateSSLSettings(
+      cmd("-Dasdf=dfgh", "-Dspark.ssl.opt1=x"),
+      conf("spark.ssl.opt1" -> "y", "spark.ssl.opt2" -> "z"))
+      .javaOpts should contain theSameElementsInOrderAs Seq(
+        "-Dasdf=dfgh", "-Dspark.ssl.opt1=x")
+
+    Worker.maybeUpdateSSLSettings(
+      cmd("-Dspark.ssl.useNodeLocalConf=false", "-Dspark.ssl.opt1=x"),
+      conf("spark.ssl.opt1" -> "y", "spark.ssl.opt2" -> "z"))
+      .javaOpts should contain theSameElementsInOrderAs Seq(
--- End diff --

of course... I can change it if it doesn't conform to the requirements of 
the Spark test suites
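
For reference, the behaviour the `isUseLocalNodeSSLConfig` assertions above pin down can be sketched in isolation. This is a standalone re-implementation for illustration only, not the actual `Worker` code:

```scala
// Standalone sketch (assumption: this mirrors, rather than reproduces, the
// real Worker.isUseLocalNodeSSLConfig): the flag counts as enabled only when
// an option is literally "-Dspark.ssl.useNodeLocalConf=true".
object SSLConfigCheck {
  private val pattern = "-Dspark\\.ssl\\.useNodeLocalConf=(.+)".r

  def isUseLocalNodeSSLConfig(javaOpts: Seq[String]): Boolean =
    javaOpts.exists {
      case pattern(value) => value == "true" // any other value, incl. empty, is false
      case _              => false
    }
}
```

With this sketch, `Seq("-Dspark.ssl.useNodeLocalConf=true")` yields `true`, while the unrelated-option, `false`, and empty-value cases yield `false`, matching the four assertions in the quoted test.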





[GitHub] spark pull request: Spark 3883: SSL support for HttpServer and Akk...

2015-01-22 Thread jacek-lewandowski
Github user jacek-lewandowski commented on a diff in the pull request:

https://github.com/apache/spark/pull/3571#discussion_r23417992
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/worker/WorkerTest.scala ---
@@ -0,0 +1,56 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.deploy.worker
+
+import org.apache.spark.SparkConf
+import org.apache.spark.deploy.Command
+import org.scalatest.{Matchers, FunSuite}
+
+class WorkerTest extends FunSuite with Matchers {
+
+  def cmd(javaOpts: String*) = Command("", Seq.empty, Map.empty, Seq.empty, Seq.empty, Seq(javaOpts: _*))
+  def conf(opts: (String, String)*) = new SparkConf(loadDefaults = false).setAll(opts)
+
+  test("test isUseLocalNodeSSLConfig") {
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dasdf=dfgh")) shouldBe false
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=true")) shouldBe true
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=false")) shouldBe false
+    Worker.isUseLocalNodeSSLConfig(cmd("-Dspark.ssl.useNodeLocalConf=")) shouldBe false
+  }
+
+  test("test maybeUpdateSSLSettings") {
+    Worker.maybeUpdateSSLSettings(
+      cmd("-Dasdf=dfgh", "-Dspark.ssl.opt1=x"),
+      conf("spark.ssl.opt1" -> "y", "spark.ssl.opt2" -> "z"))
+      .javaOpts should contain theSameElementsInOrderAs Seq(
+        "-Dasdf=dfgh", "-Dspark.ssl.opt1=x")
+
+    Worker.maybeUpdateSSLSettings(
+      cmd("-Dspark.ssl.useNodeLocalConf=false", "-Dspark.ssl.opt1=x"),
+      conf("spark.ssl.opt1" -> "y", "spark.ssl.opt2" -> "z"))
+      .javaOpts should contain theSameElementsInOrderAs Seq(
--- End diff --

It is just more generic: in this test I'm interested only in whether the 
elements in the resulting collection are the same and in the same order, 
not in which exact collection implementation is used.
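
The point about comparing content and order rather than concrete collection types can be illustrated without ScalaTest; `sameElements` is the plain-library analogue of `theSameElementsInOrderAs` (illustrative example, not taken from the PR):

```scala
// Arrays in Scala use reference equality, so == is false even when the
// contents match; sameElements compares contents in order across
// different collection kinds.
object SameElementsSketch {
  def main(args: Array[String]): Unit = {
    val fromCommand = Array("-Dasdf=dfgh", "-Dspark.ssl.opt1=x")
    val expected    = List("-Dasdf=dfgh", "-Dspark.ssl.opt1=x")
    println(fromCommand == expected)            // prints "false": reference equality
    println(fromCommand.sameElements(expected)) // prints "true": same elements, same order
  }
}
```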





