Repository: incubator-airflow
Updated Branches:
  refs/heads/master 1df6c2456 -> fe73f2215


[AIRFLOW-1319] Fix misleading SparkSubmitOperator and SparkSubmitHook docstring

- Reword docstring to reduce ambiguity in the
usage of the `--files` option.
- See JIRA issue
https://issues.apache.org/jira/browse/AIRFLOW-1319
for more info.

Closes #2377 from chrissng/airflow-1319


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/fe73f221
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/fe73f221
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/fe73f221

Branch: refs/heads/master
Commit: fe73f22153118601a22300b21c15975632321338
Parents: 1df6c24
Author: chrissng <[email protected]>
Authored: Sat Feb 10 19:44:28 2018 +0100
Committer: Fokko Driesprong <[email protected]>
Committed: Sat Feb 10 19:44:28 2018 +0100

----------------------------------------------------------------------
 airflow/contrib/hooks/spark_submit_hook.py         | 5 +++--
 airflow/contrib/operators/spark_submit_operator.py | 5 +++--
 2 files changed, 6 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/fe73f221/airflow/contrib/hooks/spark_submit_hook.py
----------------------------------------------------------------------
diff --git a/airflow/contrib/hooks/spark_submit_hook.py b/airflow/contrib/hooks/spark_submit_hook.py
index 16e14b4..0c89a9c 100644
--- a/airflow/contrib/hooks/spark_submit_hook.py
+++ b/airflow/contrib/hooks/spark_submit_hook.py
@@ -32,8 +32,9 @@ class SparkSubmitHook(BaseHook, LoggingMixin):
     :param conn_id: The connection id as configured in Airflow administration. When an
                     invalid connection_id is supplied, it will default to yarn.
     :type conn_id: str
-    :param files: Upload additional files to the container running the job, separated by a
-                  comma. For example hive-site.xml.
+    :param files: Upload additional files to the executor running the job, separated by a
+                  comma. Files will be placed in the working directory of each executor.
+                  For example, serialized objects.
     :type files: str
     :param py_files: Additional python files used by the job, can be .zip, .egg or .py.
     :type py_files: str

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/fe73f221/airflow/contrib/operators/spark_submit_operator.py
----------------------------------------------------------------------
diff --git a/airflow/contrib/operators/spark_submit_operator.py b/airflow/contrib/operators/spark_submit_operator.py
index 2eb8919..38a0db0 100644
--- a/airflow/contrib/operators/spark_submit_operator.py
+++ b/airflow/contrib/operators/spark_submit_operator.py
@@ -31,8 +31,9 @@ class SparkSubmitOperator(BaseOperator):
     :param conn_id: The connection id as configured in Airflow administration. When an
                     invalid connection_id is supplied, it will default to yarn.
     :type conn_id: str
-    :param files: Upload additional files to the container running the job, separated by a
-                  comma. For example hive-site.xml.
+    :param files: Upload additional files to the executor running the job, separated by a
+                  comma. Files will be placed in the working directory of each executor.
+                  For example, serialized objects.
     :type files: str
     :param py_files: Additional python files used by the job, can be .zip, .egg or .py.
     :type py_files: str
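The clarified docstring mirrors how `spark-submit --files` behaves: the comma-separated paths are shipped to the working directory of every executor, which is why they need not be cluster-wide paths like `hive-site.xml`. As an illustrative sketch (the helper below is hypothetical, not part of the Airflow codebase), the `files` string maps onto the spark-submit command line roughly like this:

```python
# Hypothetical helper (not Airflow code) showing how the comma-separated
# `files` string accepted by SparkSubmitHook/SparkSubmitOperator translates
# into the `--files` flag of a spark-submit invocation. spark-submit copies
# each listed file into the working directory of every executor.

def build_spark_submit_cmd(application, files=None):
    cmd = ["spark-submit"]
    if files:
        # `files` is a single comma-separated string, e.g. "model.pkl,lookup.bin"
        cmd += ["--files", files]
    cmd.append(application)
    return cmd

print(build_spark_submit_cmd("job.py", files="model.pkl,lookup.bin"))
# → ['spark-submit', '--files', 'model.pkl,lookup.bin', 'job.py']
```

Inside the job, the uploaded files are then opened by bare filename (e.g. `open("model.pkl")`), since they sit in each executor's working directory.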
