GitHub user agsachin opened a pull request:
https://github.com/apache/spark/pull/14601
[SPARK-13979][Spark Core][WIP] Killed executor is respawned without AWS key…
## What changes were proposed in this pull request?
*Summary: we need to make two changes*
1) In DataSourceStrategy.scala, use
sqlContext.sparkContext.hadoopConfiguration instead of SparkHadoopUtil.get.conf.
2) Update the def newConfiguration(conf: SparkConf): Configuration = {..}
function to copy only S3- and Swift-related confs.
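The second change can be sketched as a filter over the source configuration's entries. This is a minimal illustration only: the prefix list below (`fs.s3`, `fs.s3a`, `fs.s3n`, `fs.swift`) and the helper name `filterConf` are assumptions for this sketch, not the exact set or code used in the patch.

```scala
// Sketch of the proposed newConfiguration change: rather than propagating
// the full Hadoop configuration, keep only S3- and Swift-related entries
// (e.g. credentials) when rebuilding the Configuration for an executor.
object ConfFilter {
  // Key prefixes assumed to cover S3/Swift settings; illustrative only.
  val keepPrefixes: Seq[String] = Seq("fs.s3", "fs.s3a", "fs.s3n", "fs.swift")

  // Given all (key, value) pairs from a source configuration, retain only
  // the entries whose keys start with one of the kept prefixes.
  def filterConf(all: Map[String, String]): Map[String, String] =
    all.filter { case (k, _) => keepPrefixes.exists(p => k.startsWith(p)) }
}
```

In the real patch the filtered entries would then be set on the freshly created `org.apache.hadoop.conf.Configuration` inside `newConfiguration`.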
## How was this patch tested?
Tested manually on a local machine in local mode using the Spark shell.
…s in standalone spark cluster
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/agsachin/spark branch-1.6
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/14601.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #14601
----
commit 9efc14086bbfc0014ed6d5cac8603af93b2dd723
Author: sachin aggarwal <[email protected]>
Date: 2016-08-11T09:14:09Z
[SPARK-13979][Spark Core]Killed executor is respawned without AWS keys in
standalone spark cluster
----
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]