http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/1f06fa0e/code.html
----------------------------------------------------------------------
diff --git a/code.html b/code.html
index 4208c5a..9c74cb2 100644
--- a/code.html
+++ b/code.html
@@ -101,20 +101,20 @@
 <li class="toctree-l3"><a class="reference internal" 
href="#baseoperator">BaseOperator</a></li>
 <li class="toctree-l3"><a class="reference internal" 
href="#basesensoroperator">BaseSensorOperator</a></li>
 <li class="toctree-l3"><a class="reference internal" 
href="#core-operators">Core Operators</a><ul>
-<li class="toctree-l4"><a class="reference internal" 
href="#id3">Operators</a></li>
+<li class="toctree-l4"><a class="reference internal" 
href="#id1">Operators</a></li>
 <li class="toctree-l4"><a class="reference internal" 
href="#sensors">Sensors</a></li>
 </ul>
 </li>
 <li class="toctree-l3"><a class="reference internal" 
href="#community-contributed-operators">Community-contributed Operators</a><ul>
-<li class="toctree-l4"><a class="reference internal" 
href="#id4">Operators</a></li>
-<li class="toctree-l4"><a class="reference internal" 
href="#id11">Sensors</a></li>
+<li class="toctree-l4"><a class="reference internal" 
href="#id2">Operators</a></li>
+<li class="toctree-l4"><a class="reference internal" 
href="#id9">Sensors</a></li>
 </ul>
 </li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#macros">Macros</a><ul>
 <li class="toctree-l3"><a class="reference internal" 
href="#default-variables">Default Variables</a></li>
-<li class="toctree-l3"><a class="reference internal" 
href="#id13">Macros</a></li>
+<li class="toctree-l3"><a class="reference internal" 
href="#id11">Macros</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" 
href="#models">Models</a></li>
@@ -220,7 +220,7 @@ to understand the primitive features that can be leveraged 
in your
 DAGs.</p>
 <dl class="class">
 <dt id="airflow.models.BaseOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.models.</code><code 
class="descname">BaseOperator</code><span 
class="sig-paren">(</span><em>task_id</em>, <em>owner='Airflow'</em>, 
<em>email=None</em>, <em>email_on_retry=True</em>, 
<em>email_on_failure=True</em>, <em>retries=0</em>, 
<em>retry_delay=datetime.timedelta(0</em>, <em>300)</em>, 
<em>retry_exponential_backoff=False</em>, <em>max_retry_delay=None</em>, 
<em>start_date=None</em>, <em>end_date=None</em>, 
<em>schedule_interval=None</em>, <em>depends_on_past=False</em>, 
<em>wait_for_downstream=False</em>, <em>dag=None</em>, <em>params=None</em>, 
<em>default_args=None</em>, <em>adhoc=False</em>, <em>priority_weight=1</em>, 
<em>weight_rule=u'downstream'</em>, <em>queue='default'</em>, 
<em>pool=None</em>, <em>sla=None</em>, <em>execution_timeout=None</em>, 
<em>on_failure_callback=None</em>, <em>on_success_callback=None</em>, 
<em>on_retry_callback=None</em>, <em>trigger_rule=u'all_success'</em>, 
<em>resources=None
 </em>, <em>run_as_user=None</em>, <em>task_concurrency=None</em>, 
<em>executor_config=None</em>, <em>inlets=None</em>, <em>outlets=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator" title="Permalink to this 
definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.models.</code><code 
class="descname">BaseOperator</code><span 
class="sig-paren">(</span><em>task_id</em>, <em>owner='Airflow'</em>, 
<em>email=None</em>, <em>email_on_retry=True</em>, 
<em>email_on_failure=True</em>, <em>retries=0</em>, 
<em>retry_delay=datetime.timedelta(0</em>, <em>300)</em>, 
<em>retry_exponential_backoff=False</em>, <em>max_retry_delay=None</em>, 
<em>start_date=None</em>, <em>end_date=None</em>, 
<em>schedule_interval=None</em>, <em>depends_on_past=False</em>, 
<em>wait_for_downstream=False</em>, <em>dag=None</em>, <em>params=None</em>, 
<em>default_args=None</em>, <em>adhoc=False</em>, <em>priority_weight=1</em>, 
<em>weight_rule='downstream'</em>, <em>queue='default'</em>, 
<em>pool=None</em>, <em>sla=None</em>, <em>execution_timeout=None</em>, 
<em>on_failure_callback=None</em>, <em>on_success_callback=None</em>, 
<em>on_retry_callback=None</em>, <em>trigger_rule='all_success'</em>, 
<em>resources=None</em>, <em>run_as_user=None</em>, <em>task_concurrency=None</em>, 
<em>executor_config=None</em>, <em>inlets=None</em>, <em>outlets=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>Bases: <code class="xref py py-class docutils literal 
notranslate"><span 
class="pre">airflow.utils.log.logging_mixin.LoggingMixin</span></code></p>
 <p>Abstract base class for all operators. Since operators create objects that
 become nodes in the dag, BaseOperator contains many recursive methods for
@@ -326,7 +326,7 @@ of this task fails. a context dictionary is passed as a 
single
 parameter to this function. Context contains references to related
 objects to the task instance and is documented under the macros
 section of the API.</li>
-<li><strong>on_retry_callback</strong> – much like the <code class="docutils 
literal notranslate"><span class="pre">on_failure_callback</span></code> except
+<li><strong>on_retry_callback</strong> (<em>callable</em>) – much like the 
<code class="docutils literal notranslate"><span 
class="pre">on_failure_callback</span></code> except
 that it is executed when retries occur.</li>
 <li><strong>on_success_callback</strong> (<em>callable</em>) – much like the 
<code class="docutils literal notranslate"><span 
class="pre">on_failure_callback</span></code> except
 that it is executed when the task succeeds.</li>
@@ -344,17 +344,17 @@ Resources constructor) to their values.</li>
 runs across execution_dates</li>
 <li><strong>executor_config</strong> (<em>dict</em>) – <p>Additional 
task-level configuration parameters that are
 interpreted by a specific executor. Parameters are namespaced by the name of
-executor.
-<a href="#id1"><span class="problematic" id="id2">``</span></a>example: to run 
this task in a specific docker container through
-the KubernetesExecutor
-MyOperator(…,</p>
-<blockquote>
-<div>executor_config={
-“KubernetesExecutor”:<blockquote>
-<div>{“image”: “myCustomDockerImage”}
-}</div></blockquote>
-</div></blockquote>
-<p>)``</p>
+executor.</p>
+<p><strong>Example</strong>: to run this task in a specific docker container 
through
+the KubernetesExecutor</p>
+<div class="highlight-default notranslate"><div 
class="highlight"><pre><span></span><span class="n">MyOperator</span><span 
class="p">(</span><span class="o">...</span><span class="p">,</span>
+    <span class="n">executor_config</span><span class="o">=</span><span 
class="p">{</span>
+    <span class="s2">&quot;KubernetesExecutor&quot;</span><span 
class="p">:</span>
+        <span class="p">{</span><span class="s2">&quot;image&quot;</span><span 
class="p">:</span> <span class="s2">&quot;myCustomDockerImage&quot;</span><span 
class="p">}</span>
+        <span class="p">}</span>
+<span class="p">)</span>
+</pre></div>
+</div>
 </li>
 </ul>
 </td>
@@ -363,7 +363,7 @@ MyOperator(…,</p>
 </table>
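The executor_config example above is just a nested Python dict, namespaced by executor name. A minimal sketch of building and reading it in plain Python (the "KubernetesExecutor"/"image" keys follow the documented example; the rest is illustrative and not Airflow API):

```python
# Sketch of the task-level executor_config dict from the example above.
# MyOperator is the documentation's own placeholder, so here we only
# build and inspect the dict that would be passed to it.
executor_config = {
    "KubernetesExecutor": {"image": "myCustomDockerImage"},
}

# Parameters are namespaced by executor name, presumably so an executor
# can look up only its own namespace and leave the others alone.
kube_overrides = executor_config.get("KubernetesExecutor", {})
print(kube_overrides["image"])  # -> myCustomDockerImage
```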
 <dl class="method">
 <dt id="airflow.models.BaseOperator.clear">
-<code class="descname">clear</code><span 
class="sig-paren">(</span><em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.clear"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.clear" title="Permalink to this 
definition">¶</a></dt>
+<code class="descname">clear</code><span 
class="sig-paren">(</span><em>start_date=None</em>, <em>end_date=None</em>, 
<em>upstream=False</em>, <em>downstream=False</em>, <em>session=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.clear"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.clear" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>Clears the state of task instances associated with the task, following
 the parameters specified.</p>
 </dd></dl>
@@ -446,7 +446,7 @@ ghost processes behind.</p>
 
 <dl class="method">
 <dt id="airflow.models.BaseOperator.post_execute">
-<code class="descname">post_execute</code><span 
class="sig-paren">(</span><em>context</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.post_execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.post_execute" title="Permalink to this 
definition">¶</a></dt>
+<code class="descname">post_execute</code><span 
class="sig-paren">(</span><em>context</em>, <em>result=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.post_execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.post_execute" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>This hook is triggered right after self.execute() is called.
 It is passed the execution context and any results returned by the
 operator.</p>
@@ -454,7 +454,7 @@ operator.</p>
 
 <dl class="method">
 <dt id="airflow.models.BaseOperator.pre_execute">
-<code class="descname">pre_execute</code><span 
class="sig-paren">(</span><em>context</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.pre_execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.pre_execute" title="Permalink to this 
definition">¶</a></dt>
+<code class="descname">pre_execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.pre_execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.pre_execute" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>This hook is triggered right before self.execute() is called.</p>
 </dd></dl>
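Together with execute(), these hooks run in a fixed order: pre_execute, then execute, then post_execute with the result. A plain-Python sketch of that sequence (ToyOperator and its run() driver are hypothetical stand-ins, not Airflow's task runner):

```python
class ToyOperator:
    """Hypothetical operator illustrating the documented hook order:
    pre_execute -> execute -> post_execute."""

    def __init__(self):
        self.calls = []

    def pre_execute(self, context):
        # Triggered right before self.execute() is called.
        self.calls.append("pre_execute")

    def execute(self, context):
        self.calls.append("execute")
        return "result"

    def post_execute(self, context, result=None):
        # Triggered right after self.execute(); receives its return value.
        self.calls.append(("post_execute", result))

    def run(self, context):
        # Simplified driver; the real task runner does much more
        # (templating, retries, callbacks, ...).
        self.pre_execute(context)
        result = self.execute(context)
        self.post_execute(context, result=result)
        return result

op = ToyOperator()
op.run(context={})
print(op.calls)
```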
 
@@ -519,7 +519,7 @@ task.</p>
 
 <dl class="method">
 <dt id="airflow.models.BaseOperator.xcom_pull">
-<code class="descname">xcom_pull</code><span 
class="sig-paren">(</span><em>context</em>, <em>task_ids=None</em>, 
<em>dag_id=None</em>, <em>key=u'return_value'</em>, 
<em>include_prior_dates=None</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.xcom_pull"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.xcom_pull" title="Permalink to this 
definition">¶</a></dt>
+<code class="descname">xcom_pull</code><span 
class="sig-paren">(</span><em>context</em>, <em>task_ids=None</em>, 
<em>dag_id=None</em>, <em>key='return_value'</em>, 
<em>include_prior_dates=None</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/models.html#BaseOperator.xcom_pull"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.models.BaseOperator.xcom_pull" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>See TaskInstance.xcom_pull()</p>
 </dd></dl>
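The task_ids / key semantics behind xcom_pull can be sketched with a toy in-memory store (illustrative only; Airflow's real XCom values live in the metadata database, and the helper functions below are hypothetical):

```python
# Toy in-memory stand-in for XCom: values keyed by (task_id, key),
# mirroring the task_ids / key parameters of xcom_pull.
store = {}

def xcom_push(task_id, key, value):
    store[(task_id, key)] = value

def xcom_pull(task_ids, key="return_value"):
    # A single task_id returns one value; a list returns one per task.
    if isinstance(task_ids, str):
        return store.get((task_ids, key))
    return [store.get((t, key)) for t in task_ids]

xcom_push("extract", "return_value", {"rows": 42})
print(xcom_pull("extract"))            # the default key is "return_value"
print(xcom_pull(["extract", "load"]))  # missing entries come back as None
```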
 
@@ -540,7 +540,7 @@ attributes.</p>
 <dl class="class">
 <dt id="airflow.sensors.base_sensor_operator.BaseSensorOperator">
 <em class="property">class </em><code 
class="descclassname">airflow.sensors.base_sensor_operator.</code><code 
class="descname">BaseSensorOperator</code><span 
class="sig-paren">(</span><em>poke_interval=60</em>, <em>timeout=604800</em>, 
<em>soft_fail=False</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/sensors/base_sensor_operator.html#BaseSensorOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.sensors.base_sensor_operator.BaseSensorOperator" 
title="Permalink to this definition">¶</a></dt>
-<dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a>, <code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.models.SkipMixin</span></code></p>
+<dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a>, <a class="reference 
internal" href="#airflow.models.SkipMixin" 
title="airflow.models.SkipMixin"><code class="xref py py-class docutils literal 
notranslate"><span class="pre">airflow.models.SkipMixin</span></code></a></p>
 <p>Sensor operators are derived from this class and inherit these 
attributes.</p>
 <dl class="docutils">
 <dt>Sensor operators keep executing at a time interval and succeed when</dt>
@@ -561,6 +561,14 @@ between each tries</li>
 </tbody>
 </table>
 <dl class="method">
+<dt id="airflow.sensors.base_sensor_operator.BaseSensorOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/sensors/base_sensor_operator.html#BaseSensorOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.sensors.base_sensor_operator.BaseSensorOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
 <dt id="airflow.sensors.base_sensor_operator.BaseSensorOperator.poke">
 <code class="descname">poke</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/sensors/base_sensor_operator.html#BaseSensorOperator.poke"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.sensors.base_sensor_operator.BaseSensorOperator.poke" 
title="Permalink to this definition">¶</a></dt>
 <dd><p>Function that the sensors defined while deriving this class should
@@ -572,8 +580,8 @@ override.</p>
 </div>
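The sensor contract above — call poke() every poke_interval seconds and fail once timeout elapses — can be sketched in plain Python (a simplified model; the real BaseSensorOperator.execute also handles soft_fail, and run_sensor here is hypothetical):

```python
import time

class SensorTimeout(Exception):
    """Raised when the sensor's criteria are not met within `timeout`."""

def run_sensor(poke, poke_interval=60, timeout=604800,
               clock=time.monotonic, sleep=time.sleep):
    # Keep calling poke() at poke_interval until it returns True,
    # failing once `timeout` seconds have elapsed.
    started = clock()
    while not poke():
        if clock() - started > timeout:
            raise SensorTimeout("criteria not met within timeout")
        sleep(poke_interval)
    return True

# Example: a criterion that succeeds on the third poke.
attempts = {"n": 0}
def criterion():
    attempts["n"] += 1
    return attempts["n"] >= 3

print(run_sensor(criterion, poke_interval=0, timeout=10))  # -> True
```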
 <div class="section" id="core-operators">
 <h3>Core Operators<a class="headerlink" href="#core-operators" 
title="Permalink to this headline">¶</a></h3>
-<div class="section" id="id3">
-<h4>Operators<a class="headerlink" href="#id3" title="Permalink to this 
headline">¶</a></h4>
+<div class="section" id="id1">
+<h4>Operators<a class="headerlink" href="#id1" title="Permalink to this 
headline">¶</a></h4>
 <dl class="class">
 <dt id="airflow.operators.bash_operator.BashOperator">
 <em class="property">class </em><code 
class="descclassname">airflow.operators.bash_operator.</code><code 
class="descname">BashOperator</code><span 
class="sig-paren">(</span><em>bash_command</em>, <em>xcom_push=False</em>, 
<em>env=None</em>, <em>output_encoding='utf-8'</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/bash_operator.html#BashOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.bash_operator.BashOperator" title="Permalink to this 
definition">¶</a></dt>
@@ -604,12 +612,21 @@ behavior. (templated)</li>
 which will be cleaned afterwards</p>
 </dd></dl>
 
+<dl class="method">
+<dt id="airflow.operators.bash_operator.BashOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/bash_operator.html#BashOperator.on_kill"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.bash_operator.BashOperator.on_kill" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>Override this method to cleanup subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
 </dd></dl>
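A plain-subprocess sketch of the BashOperator contract — run the command through bash and, as with xcom_push=True, keep the last line written to stdout (run_bash is a hypothetical helper, not Airflow API, and assumes bash is on PATH):

```python
import subprocess

def run_bash(bash_command, env=None, output_encoding="utf-8"):
    """Hypothetical helper mirroring BashOperator's contract: execute the
    command through bash and return the last line of stdout, which is the
    value BashOperator pushes to XCom when xcom_push=True."""
    proc = subprocess.run(
        ["bash", "-c", bash_command],
        env=env,               # None inherits the current environment
        capture_output=True,
        check=True,            # non-zero exit raises CalledProcessError
    )
    lines = proc.stdout.decode(output_encoding).splitlines()
    return lines[-1] if lines else ""

print(run_bash("echo first; echo last"))  # -> last
```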
 
 <dl class="class">
 <dt id="airflow.operators.python_operator.BranchPythonOperator">
 <em class="property">class </em><code 
class="descclassname">airflow.operators.python_operator.</code><code 
class="descname">BranchPythonOperator</code><span 
class="sig-paren">(</span><em>python_callable</em>, <em>op_args=None</em>, 
<em>op_kwargs=None</em>, <em>provide_context=False</em>, 
<em>templates_dict=None</em>, <em>templates_exts=None</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/python_operator.html#BranchPythonOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.python_operator.BranchPythonOperator" title="Permalink 
to this definition">¶</a></dt>
-<dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.python_operator.PythonOperator" 
title="airflow.operators.python_operator.PythonOperator"><code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.operators.python_operator.PythonOperator</span></code></a>, 
<code class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.SkipMixin</span></code></p>
+<dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.python_operator.PythonOperator" 
title="airflow.operators.python_operator.PythonOperator"><code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.operators.python_operator.PythonOperator</span></code></a>, 
<a class="reference internal" href="#airflow.models.SkipMixin" 
title="airflow.models.SkipMixin"><code class="xref py py-class docutils literal 
notranslate"><span class="pre">airflow.models.SkipMixin</span></code></a></p>
 <p>Allows a workflow to “branch” or follow a single path following the
 execution of this task.</p>
 <p>It derives the PythonOperator and expects a Python function that returns
@@ -624,6 +641,14 @@ to be inferred.</p>
 will invariably lead to blocking tasks that depend on their past successes. 
 <code class="docutils literal notranslate"><span 
class="pre">skipped</span></code> state propagates when all directly upstream 
tasks are
 <code class="docutils literal notranslate"><span 
class="pre">skipped</span></code>.</p>
+<dl class="method">
+<dt id="airflow.operators.python_operator.BranchPythonOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/python_operator.html#BranchPythonOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.python_operator.BranchPythonOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
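The branching rule above — follow the single task_id returned by the callable and skip the other direct downstream tasks — can be modeled without Airflow (a toy function; real skipping is done via SkipMixin):

```python
def branch(python_callable, downstream_task_ids):
    """Toy model of BranchPythonOperator: the callable picks one
    downstream task_id; every other direct downstream task is skipped."""
    chosen = python_callable()
    if chosen not in downstream_task_ids:
        raise ValueError(f"unknown task_id: {chosen!r}")
    skipped = [t for t in downstream_task_ids if t != chosen]
    return chosen, skipped

chosen, skipped = branch(lambda: "weekend_path",
                         ["weekday_path", "weekend_path"])
print(chosen, skipped)  # -> weekend_path ['weekday_path']
```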
 
 <dl class="class">
@@ -664,11 +689,19 @@ single record from an external source.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.check_operator.CheckOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/check_operator.html#CheckOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.check_operator.CheckOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.docker_operator.DockerOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.docker_operator.</code><code 
class="descname">DockerOperator</code><span 
class="sig-paren">(</span><em>image</em>, <em>api_version=None</em>, 
<em>command=None</em>, <em>cpus=1.0</em>, 
<em>docker_url='unix://var/run/docker.sock'</em>, <em>environment=None</em>, 
<em>force_pull=False</em>, <em>mem_limit=None</em>, <em>network_mode=None</em>, 
<em>tls_ca_cert=None</em>, <em>tls_client_cert=None</em>, 
<em>tls_client_key=None</em>, <em>tls_hostname=None</em>, 
<em>tls_ssl_version=None</em>, <em>tmp_dir='/tmp/airflow'</em>, 
<em>user=None</em>, <em>volumes=None</em>, <em>working_dir=None</em>, 
<em>xcom_push=False</em>, <em>xcom_all=False</em>, 
<em>docker_conn_id=None</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/docker_operator.html#DockerOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.docker_operator.DockerOperator" title="Permalink to this 
definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.docker_operator.</code><code 
class="descname">DockerOperator</code><span 
class="sig-paren">(</span><em>image</em>, <em>api_version=None</em>, 
<em>command=None</em>, <em>cpus=1.0</em>, 
<em>docker_url='unix://var/run/docker.sock'</em>, <em>environment=None</em>, 
<em>force_pull=False</em>, <em>mem_limit=None</em>, <em>network_mode=None</em>, 
<em>tls_ca_cert=None</em>, <em>tls_client_cert=None</em>, 
<em>tls_client_key=None</em>, <em>tls_hostname=None</em>, 
<em>tls_ssl_version=None</em>, <em>tmp_dir='/tmp/airflow'</em>, 
<em>user=None</em>, <em>volumes=None</em>, <em>working_dir=None</em>, 
<em>xcom_push=False</em>, <em>xcom_all=False</em>, 
<em>docker_conn_id=None</em>, <em>dns=None</em>, <em>dns_search=None</em>, 
<em>shm_size=None</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/docker_operator.html#DockerOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.docker_operator.DockerOperator" title="Permalink to 
this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Execute a command inside a docker container.</p>
 <p>A temporary directory is created on the host and
@@ -684,17 +717,20 @@ be provided with the parameter <code class="docutils 
literal notranslate"><span
 <col class="field-body" />
 <tbody valign="top">
 <tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><ul class="first last simple">
-<li><strong>image</strong> (<em>str</em>) – Docker image from which to 
create the container.</li>
+<li><strong>image</strong> (<em>str</em>) – Docker image from which to 
create the container.
+If image tag is omitted, “latest” will be used.</li>
 <li><strong>api_version</strong> (<em>str</em>) – Remote API version. Set to 
<code class="docutils literal notranslate"><span class="pre">auto</span></code> 
to automatically
 detect the server’s version.</li>
 <li><strong>command</strong> (<em>str</em><em> or </em><em>list</em>) – 
Command to be run in the container. (templated)</li>
 <li><strong>cpus</strong> (<em>float</em>) – Number of CPUs to assign to the 
container.
 This value gets multiplied with 1024. See
 <a class="reference external" 
href="https://docs.docker.com/engine/reference/run/#cpu-share-constraint">https://docs.docker.com/engine/reference/run/#cpu-share-constraint</a></li>
+<li><strong>dns</strong> (<em>list of strings</em>) – Docker custom DNS 
servers</li>
+<li><strong>dns_search</strong> (<em>list of strings</em>) – Docker custom 
DNS search domain</li>
 <li><strong>docker_url</strong> (<em>str</em>) – URL of the host running the 
docker daemon.
 Default is unix://var/run/docker.sock</li>
 <li><strong>environment</strong> (<em>dict</em>) – Environment variables to 
set in the container. (templated)</li>
-<li><strong>force_pull</strong> (<em>bool</em>) – Pull the docker image on 
every run. Default is false.</li>
+<li><strong>force_pull</strong> (<em>bool</em>) – Pull the docker image on 
every run. Default is False.</li>
 <li><strong>mem_limit</strong> (<em>float</em><em> or </em><em>str</em>) – 
Maximum amount of memory the container can use.
 Either a float value, which represents the limit in bytes,
 or a string like <code class="docutils literal notranslate"><span 
class="pre">128m</span></code> or <code class="docutils literal 
notranslate"><span class="pre">1g</span></code>.</li>
@@ -721,11 +757,30 @@ The default is False.</li>
 <li><strong>xcom_all</strong> (<em>bool</em>) – Push all the stdout or just 
the last line.
 The default is False (last line).</li>
 <li><strong>docker_conn_id</strong> (<em>str</em>) – ID of the Airflow 
connection to use</li>
+<li><strong>shm_size</strong> (<em>int</em>) – Size of <code class="docutils 
literal notranslate"><span class="pre">/dev/shm</span></code> in bytes. The 
size must be
+greater than 0. If omitted uses system default.</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.docker_operator.DockerOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/docker_operator.html#DockerOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.docker_operator.DockerOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.operators.docker_operator.DockerOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/docker_operator.html#DockerOperator.on_kill"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.docker_operator.DockerOperator.on_kill" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to cleanup subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
 </dd></dl>
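The mem_limit parameter accepts either a float (bytes) or a string like 128m or 1g. A small parser sketch of that convention (parse_mem_limit is a hypothetical helper, not part of Airflow or docker-py):

```python
def parse_mem_limit(value):
    """Normalize a mem_limit value to bytes: floats/ints pass through,
    strings use docker-style suffixes b, k, m, g."""
    if isinstance(value, (int, float)):
        return int(value)
    units = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    suffix = value[-1].lower()
    if suffix in units:
        return int(float(value[:-1]) * units[suffix])
    return int(value)  # plain numeric string, already in bytes

print(parse_mem_limit("128m"))  # -> 134217728
print(parse_mem_limit("1g"))    # -> 1073741824
```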
 
 <dl class="class">
@@ -734,6 +789,14 @@ The default is False (last line).</li>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Operator that does literally nothing. It can be used to group tasks in a
 DAG.</p>
+<dl class="method">
+<dt id="airflow.operators.dummy_operator.DummyOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/dummy_operator.html#DummyOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.dummy_operator.DummyOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -776,6 +839,14 @@ without stopping the progress of the DAG.</p>
 </tbody>
 </table>
 <dl class="method">
+<dt id="airflow.operators.druid_check_operator.DruidCheckOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/druid_check_operator.html#DruidCheckOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.druid_check_operator.DruidCheckOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
 <dt id="airflow.operators.druid_check_operator.DruidCheckOperator.get_db_hook">
 <code class="descname">get_db_hook</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/druid_check_operator.html#DruidCheckOperator.get_db_hook"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.druid_check_operator.DruidCheckOperator.get_db_hook" 
title="Permalink to this definition">¶</a></dt>
 <dd><p>Return the druid db api hook.</p>
@@ -822,6 +893,14 @@ header.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.email_operator.EmailOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/email_operator.html#EmailOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.email_operator.EmailOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -849,6 +928,14 @@ executed prior to loading the data. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.generic_transfer.GenericTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/generic_transfer.html#GenericTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.generic_transfer.GenericTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -899,6 +986,14 @@ hive for the staging table</li>
 </table>
 </dd></dl>
 
+<dl class="method">
+<dt id="airflow.operators.hive_to_druid.HiveToDruidTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/hive_to_druid.html#HiveToDruidTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_to_druid.HiveToDruidTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -934,6 +1029,14 @@ destination MySQL connection: {‘local_infile’: 
true}.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.hive_to_mysql.HiveToMySqlTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/hive_to_mysql.html#HiveToMySqlTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_to_mysql.HiveToMySqlTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -955,11 +1058,19 @@ results of the query as a csv to a Samba location.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.hive_to_samba_operator.Hive2SambaOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/hive_to_samba_operator.html#Hive2SambaOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_to_samba_operator.Hive2SambaOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.hive_operator.HiveOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.hive_operator.</code><code 
class="descname">HiveOperator</code><span 
class="sig-paren">(</span><em>hql</em>, 
<em>hive_cli_conn_id=u'hive_cli_default'</em>, <em>schema=u'default'</em>, 
<em>hiveconfs=None</em>, <em>hiveconf_jinja_translate=False</em>, 
<em>script_begin_tag=None</em>, <em>run_as_owner=False</em>, 
<em>mapred_queue=None</em>, <em>mapred_queue_priority=None</em>, 
<em>mapred_job_name=None</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/hive_operator.html#HiveOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_operator.HiveOperator" title="Permalink to this 
definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.hive_operator.</code><code 
class="descname">HiveOperator</code><span 
class="sig-paren">(</span><em>hql</em>, 
<em>hive_cli_conn_id='hive_cli_default'</em>, <em>schema='default'</em>, 
<em>hiveconfs=None</em>, <em>hiveconf_jinja_translate=False</em>, 
<em>script_begin_tag=None</em>, <em>run_as_owner=False</em>, 
<em>mapred_queue=None</em>, <em>mapred_queue_priority=None</em>, 
<em>mapred_job_name=None</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/hive_operator.html#HiveOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_operator.HiveOperator" title="Permalink to this 
definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Executes hql code or hive script in a specific Hive database.</p>
 <table class="docutils field-list" frame="void" rules="none">
@@ -991,6 +1102,32 @@ This can make monitoring easier.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.hive_operator.HiveOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/hive_operator.html#HiveOperator.execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_operator.HiveOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.operators.hive_operator.HiveOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/hive_operator.html#HiveOperator.on_kill"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_operator.HiveOperator.on_kill" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>Override this method to cleanup subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.operators.hive_operator.HiveOperator.prepare_template">
+<code class="descname">prepare_template</code><span 
class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference 
internal" 
href="_modules/airflow/operators/hive_operator.html#HiveOperator.prepare_template"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_operator.HiveOperator.prepare_template" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>Hook that is triggered after the templated fields get replaced
+by their content. If you need your operator to alter the
+content of the file before the template is rendered,
+it should override this method to do so.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1029,6 +1166,14 @@ column.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt 
id="airflow.operators.hive_stats_operator.HiveStatsCollectionOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/hive_stats_operator.html#HiveStatsCollectionOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.hive_stats_operator.HiveStatsCollectionOperator.execute"
 title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1054,6 +1199,14 @@ against. Defaults to 7 days</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.check_operator.IntervalCheckOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/check_operator.html#IntervalCheckOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.check_operator.IntervalCheckOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1076,16 +1229,32 @@ Template reference are recognized by str ending in 
'.sql'</em>) – the sql code
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.jdbc_operator.JdbcOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/jdbc_operator.html#JdbcOperator.execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.jdbc_operator.JdbcOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.latest_only_operator.LatestOnlyOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.latest_only_operator.</code><code 
class="descname">LatestOnlyOperator</code><span 
class="sig-paren">(</span><em>task_id</em>, <em>owner='Airflow'</em>, 
<em>email=None</em>, <em>email_on_retry=True</em>, 
<em>email_on_failure=True</em>, <em>retries=0</em>, 
<em>retry_delay=datetime.timedelta(0</em>, <em>300)</em>, 
<em>retry_exponential_backoff=False</em>, <em>max_retry_delay=None</em>, 
<em>start_date=None</em>, <em>end_date=None</em>, 
<em>schedule_interval=None</em>, <em>depends_on_past=False</em>, 
<em>wait_for_downstream=False</em>, <em>dag=None</em>, <em>params=None</em>, 
<em>default_args=None</em>, <em>adhoc=False</em>, <em>priority_weight=1</em>, 
<em>weight_rule=u'downstream'</em>, <em>queue='default'</em>, 
<em>pool=None</em>, <em>sla=None</em>, <em>execution_timeout=None</em>, 
<em>on_failure_callback=None</em>, <em>on_success_callback=None</em>, 
<em>on_retry_callback=None</em>, <em>trigger_rule=u'all_success'</em>, <em>resources=None</em>, <em>run_as_user=None</em>, 
<em>task_concurrency=None</em>, <em>executor_config=None</em>, 
<em>inlets=None</em>, <em>outlets=None</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/latest_only_operator.html#LatestOnlyOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.latest_only_operator.LatestOnlyOperator" 
title="Permalink to this definition">¶</a></dt>
-<dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a>, <code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.models.SkipMixin</span></code></p>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.latest_only_operator.</code><code 
class="descname">LatestOnlyOperator</code><span 
class="sig-paren">(</span><em>task_id</em>, <em>owner='Airflow'</em>, 
<em>email=None</em>, <em>email_on_retry=True</em>, 
<em>email_on_failure=True</em>, <em>retries=0</em>, 
<em>retry_delay=datetime.timedelta(0</em>, <em>300)</em>, 
<em>retry_exponential_backoff=False</em>, <em>max_retry_delay=None</em>, 
<em>start_date=None</em>, <em>end_date=None</em>, 
<em>schedule_interval=None</em>, <em>depends_on_past=False</em>, 
<em>wait_for_downstream=False</em>, <em>dag=None</em>, <em>params=None</em>, 
<em>default_args=None</em>, <em>adhoc=False</em>, <em>priority_weight=1</em>, 
<em>weight_rule='downstream'</em>, <em>queue='default'</em>, 
<em>pool=None</em>, <em>sla=None</em>, <em>execution_timeout=None</em>, 
<em>on_failure_callback=None</em>, <em>on_success_callback=None</em>, 
<em>on_retry_callback=None</em>, <em>trigger_rule='all_success'</em>, <em>resources=None</em>, <em>run_as_user=None</em>, 
<em>task_concurrency=None</em>, <em>executor_config=None</em>, 
<em>inlets=None</em>, <em>outlets=None</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/latest_only_operator.html#LatestOnlyOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.latest_only_operator.LatestOnlyOperator" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a>, <a class="reference 
internal" href="#airflow.models.SkipMixin" 
title="airflow.models.SkipMixin"><code class="xref py py-class docutils literal 
notranslate"><span class="pre">airflow.models.SkipMixin</span></code></a></p>
 <p>Allows a workflow to skip tasks that are not running during the most
 recent schedule interval.</p>
 <p>If the task is run outside of the latest schedule interval, all
 directly downstream tasks will be skipped.</p>
+<dl class="method">
+<dt id="airflow.operators.latest_only_operator.LatestOnlyOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/latest_only_operator.html#LatestOnlyOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.latest_only_operator.LatestOnlyOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1107,11 +1276,19 @@ extension.</em><em> (</em><em>templated</em><em>)</em>) 
– the sql code to be e
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.mssql_operator.MsSqlOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mssql_operator.html#MsSqlOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mssql_operator.MsSqlOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.mssql_to_hive.MsSqlToHiveTransfer">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.mssql_to_hive.</code><code 
class="descname">MsSqlToHiveTransfer</code><span 
class="sig-paren">(</span><em>sql</em>, <em>hive_table</em>, 
<em>create=True</em>, <em>recreate=False</em>, <em>partition=None</em>, 
<em>delimiter=u'\x01'</em>, <em>mssql_conn_id='mssql_default'</em>, 
<em>hive_cli_conn_id='hive_cli_default'</em>, <em>tblproperties=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mssql_to_hive.html#MsSqlToHiveTransfer"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mssql_to_hive.MsSqlToHiveTransfer" title="Permalink to 
this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.mssql_to_hive.</code><code 
class="descname">MsSqlToHiveTransfer</code><span 
class="sig-paren">(</span><em>sql</em>, <em>hive_table</em>, 
<em>create=True</em>, <em>recreate=False</em>, <em>partition=None</em>, 
<em>delimiter='\x01'</em>, <em>mssql_conn_id='mssql_default'</em>, 
<em>hive_cli_conn_id='hive_cli_default'</em>, <em>tblproperties=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mssql_to_hive.html#MsSqlToHiveTransfer"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mssql_to_hive.MsSqlToHiveTransfer" title="Permalink to 
this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Moves data from Microsoft SQL Server to Hive. The operator runs
 your query against Microsoft SQL Server, stores the file locally
@@ -1147,6 +1324,14 @@ values. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.mssql_to_hive.MsSqlToHiveTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mssql_to_hive.html#MsSqlToHiveTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mssql_to_hive.MsSqlToHiveTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1169,11 +1354,19 @@ Template reference are recognized by str ending in 
'.sql'</em>) – the sql code
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.mysql_operator.MySqlOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mysql_operator.html#MySqlOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mysql_operator.MySqlOperator.execute" title="Permalink 
to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.mysql_to_hive.MySqlToHiveTransfer">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.mysql_to_hive.</code><code 
class="descname">MySqlToHiveTransfer</code><span 
class="sig-paren">(</span><em>sql</em>, <em>hive_table</em>, 
<em>create=True</em>, <em>recreate=False</em>, <em>partition=None</em>, 
<em>delimiter=u'\x01'</em>, <em>mysql_conn_id='mysql_default'</em>, 
<em>hive_cli_conn_id='hive_cli_default'</em>, <em>tblproperties=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mysql_to_hive.html#MySqlToHiveTransfer"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mysql_to_hive.MySqlToHiveTransfer" title="Permalink to 
this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.mysql_to_hive.</code><code 
class="descname">MySqlToHiveTransfer</code><span 
class="sig-paren">(</span><em>sql</em>, <em>hive_table</em>, 
<em>create=True</em>, <em>recreate=False</em>, <em>partition=None</em>, 
<em>delimiter='\x01'</em>, <em>mysql_conn_id='mysql_default'</em>, 
<em>hive_cli_conn_id='hive_cli_default'</em>, <em>tblproperties=None</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mysql_to_hive.html#MySqlToHiveTransfer"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mysql_to_hive.MySqlToHiveTransfer" title="Permalink to 
this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Moves data from MySql to Hive. The operator runs your query against
 MySQL, stores the file locally before loading it into a Hive table.
@@ -1208,6 +1401,14 @@ and values. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.mysql_to_hive.MySqlToHiveTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/mysql_to_hive.html#MySqlToHiveTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.mysql_to_hive.MySqlToHiveTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1222,6 +1423,14 @@ and values. (templated)</li>
 <blockquote>
 <div>a list of str (sql statements), or reference to a template file.
 Template reference are recognized by str ending in 
‘.sql’</div></blockquote>
+<dl class="method">
+<dt id="airflow.operators.oracle_operator.OracleOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/oracle_operator.html#OracleOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.oracle_operator.OracleOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1246,6 +1455,32 @@ object documentation for more details.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.pig_operator.PigOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/pig_operator.html#PigOperator.execute"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.pig_operator.PigOperator.execute" title="Permalink to 
this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.operators.pig_operator.PigOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/pig_operator.html#PigOperator.on_kill"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.pig_operator.PigOperator.on_kill" title="Permalink to 
this definition">¶</a></dt>
+<dd><p>Override this method to cleanup subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.operators.pig_operator.PigOperator.prepare_template">
+<code class="descname">prepare_template</code><span 
class="sig-paren">(</span><span class="sig-paren">)</span><a class="reference 
internal" 
href="_modules/airflow/operators/pig_operator.html#PigOperator.prepare_template"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.pig_operator.PigOperator.prepare_template" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>Hook that is triggered after the templated fields get replaced
+by their content. If you need your operator to alter the
+content of the file before the template is rendered,
+it should override this method to do so.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1268,6 +1503,14 @@ Template reference are recognized by str ending in 
'.sql'</em>) – the sql code
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.postgres_operator.PostgresOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/postgres_operator.html#PostgresOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.postgres_operator.PostgresOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1360,6 +1603,14 @@ the task twice won’t double load data). 
(templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.presto_to_mysql.PrestoToMySqlTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/presto_to_mysql.html#PrestoToMySqlTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.presto_to_mysql.PrestoToMySqlTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1412,6 +1663,14 @@ processing templated fields, for examples <code 
class="docutils literal notransl
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.python_operator.PythonOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/python_operator.html#PythonOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.python_operator.PythonOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1495,6 +1754,14 @@ omit the transformation script if S3 Select expression 
is specified.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt 
id="airflow.operators.s3_file_transform_operator.S3FileTransformOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/s3_file_transform_operator.html#S3FileTransformOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.s3_file_transform_operator.S3FileTransformOperator.execute"
 title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1545,6 +1812,14 @@ required to process headers</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.s3_to_hive_operator.S3ToHiveTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/s3_to_hive_operator.html#S3ToHiveTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.s3_to_hive_operator.S3ToHiveTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1569,12 +1844,20 @@ required to process headers</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt 
id="airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/s3_to_redshift_operator.html#S3ToRedshiftTransfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.s3_to_redshift_operator.S3ToRedshiftTransfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used as when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt id="airflow.operators.python_operator.ShortCircuitOperator">
 <em class="property">class </em><code 
class="descclassname">airflow.operators.python_operator.</code><code 
class="descname">ShortCircuitOperator</code><span 
class="sig-paren">(</span><em>python_callable</em>, <em>op_args=None</em>, 
<em>op_kwargs=None</em>, <em>provide_context=False</em>, 
<em>templates_dict=None</em>, <em>templates_exts=None</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/python_operator.html#ShortCircuitOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.python_operator.ShortCircuitOperator" title="Permalink 
to this definition">¶</a></dt>
-<dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.python_operator.PythonOperator" 
title="airflow.operators.python_operator.PythonOperator"><code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.operators.python_operator.PythonOperator</span></code></a>, 
<code class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.SkipMixin</span></code></p>
+<dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.python_operator.PythonOperator" 
title="airflow.operators.python_operator.PythonOperator"><code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.operators.python_operator.PythonOperator</span></code></a>, 
<a class="reference internal" href="#airflow.models.SkipMixin" 
title="airflow.models.SkipMixin"><code class="xref py py-class docutils literal 
notranslate"><span class="pre">airflow.models.SkipMixin</span></code></a></p>
 <p>Allows a workflow to continue only if a condition is met. Otherwise, the
 workflow “short-circuits” and downstream tasks are skipped.</p>
 <p>The ShortCircuitOperator is derived from the PythonOperator. It evaluates a
@@ -1582,11 +1865,19 @@ condition and short-circuits the workflow if the 
condition is False. Any
 downstream tasks are marked with a state of “skipped”. If the condition is
 True, downstream tasks proceed as normal.</p>
 <p>The condition is determined by the result of 
<cite>python_callable</cite>.</p>
+<dl class="method">
+<dt id="airflow.operators.python_operator.ShortCircuitOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/python_operator.html#ShortCircuitOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.python_operator.ShortCircuitOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
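As a minimal plain-Python sketch (not the operator itself) of the skip semantics documented above: when the callable's result is falsy, every downstream task is marked "skipped", otherwise downstream tasks proceed as normal.

```python
def short_circuit(condition_result, downstream_tasks):
    """Mimic ShortCircuitOperator's skip logic: a falsy condition marks
    all downstream tasks 'skipped'; a truthy one leaves them scheduled."""
    state = "scheduled" if condition_result else "skipped"
    return {task: state for task in downstream_tasks}
```

For example, `short_circuit(False, ["load", "report"])` marks both tasks skipped.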
 
 <dl class="class">
 <dt id="airflow.operators.http_operator.SimpleHttpOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.http_operator.</code><code 
class="descname">SimpleHttpOperator</code><span 
class="sig-paren">(</span><em>endpoint</em>, <em>method='POST'</em>, 
<em>data=None</em>, <em>headers=None</em>, <em>response_check=None</em>, 
<em>extra_options=None</em>, <em>xcom_push=False</em>, 
<em>http_conn_id='http_default'</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/http_operator.html#SimpleHttpOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.http_operator.SimpleHttpOperator" title="Permalink to 
this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.http_operator.</code><code 
class="descname">SimpleHttpOperator</code><span 
class="sig-paren">(</span><em>endpoint</em>, <em>method='POST'</em>, 
<em>data=None</em>, <em>headers=None</em>, <em>response_check=None</em>, 
<em>extra_options=None</em>, <em>xcom_push=False</em>, 
<em>http_conn_id='http_default'</em>, <em>log_response=False</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/http_operator.html#SimpleHttpOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.http_operator.SimpleHttpOperator" title="Permalink to 
this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
 <p>Calls an endpoint on an HTTP system to execute an action</p>
 <table class="docutils field-list" frame="void" rules="none">
@@ -1606,11 +1897,21 @@ Returns True for ‘pass’ and False otherwise.</li>
 <li><strong>extra_options</strong> (<em>A dictionary of options</em><em>, 
</em><em>where key is string and value
 depends on the option that's being modified.</em>) – Extra options for the 
‘requests’ library, see the
 ‘requests’ documentation (options to modify timeout, ssl, etc.)</li>
+<li><strong>xcom_push</strong> (<em>bool</em>) – Push the response to Xcom 
(default: False)</li>
+<li><strong>log_response</strong> (<em>bool</em>) – Log the response 
(default: False)</li>
 </ul>
 </td>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.http_operator.SimpleHttpOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/http_operator.html#SimpleHttpOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.http_operator.SimpleHttpOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
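The `response_check` parameter described above is a callable that receives the HTTP response and returns True for 'pass' and False otherwise. A minimal sketch, using a `SimpleNamespace` stand-in for the `requests` response object the real operator would pass:

```python
from types import SimpleNamespace

def response_check(response):
    # Return True for 'pass' and False otherwise; here `response` is a
    # SimpleNamespace stand-in for a real `requests` Response.
    return response.status_code == 200 and "ok" in response.text

fake = SimpleNamespace(status_code=200, text='{"status": "ok"}')
```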
 
 <dl class="class">
@@ -1709,12 +2010,28 @@ a '.sql' extensions.</em>) – the sql code to be 
executed. (templated)</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.sqlite_operator.SqliteOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/sqlite_operator.html#SqliteOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.sqlite_operator.SqliteOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
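What `SqliteOperator.execute` boils down to can be sketched with the stdlib `sqlite3` module; this hypothetical helper (not the operator's actual hook code) runs a sequence of SQL statements and returns the rows from the last one:

```python
import sqlite3

def run_sqlite(sql_statements, db_path=":memory:"):
    """Illustrative sketch of running SQL against a SQLite database,
    roughly what SqliteOperator does via its hook."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.cursor()
        for sql in sql_statements:
            cur.execute(sql)
        conn.commit()
        return cur.fetchall()
    finally:
        conn.close()
```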
 
 <dl class="class">
 <dt id="airflow.operators.subdag_operator.SubDagOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.operators.subdag_operator.</code><code 
class="descname">SubDagOperator</code><span 
class="sig-paren">(</span><em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/subdag_operator.html#SubDagOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.subdag_operator.SubDagOperator" title="Permalink to 
this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.operators.subdag_operator.</code><code 
class="descname">SubDagOperator</code><span 
class="sig-paren">(</span><em>subdag</em>, 
<em>executor=&lt;airflow.executors.sequential_executor.SequentialExecutor 
object&gt;</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/subdag_operator.html#SubDagOperator"><span 
class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.subdag_operator.SubDagOperator" title="Permalink to 
this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.models.BaseOperator" title="airflow.models.BaseOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.models.BaseOperator</span></code></a></p>
+<dl class="method">
+<dt id="airflow.operators.subdag_operator.SubDagOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/subdag_operator.html#SubDagOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.subdag_operator.SubDagOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
@@ -1743,6 +2060,14 @@ should look like <code class="docutils literal 
notranslate"><span class="pre">de
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.dagrun_operator.TriggerDagRunOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/dagrun_operator.html#TriggerDagRunOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.dagrun_operator.TriggerDagRunOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
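The callable TriggerDagRunOperator accepts follows the 1.x-era signature the surrounding docs describe: it receives the context and a dag-run object, and returns that object to fire the target DAG or None to skip. A sketch with a minimal stand-in class (`DagRunOrder` here is illustrative, not imported from Airflow):

```python
class DagRunOrder:
    # minimal stand-in for the dag-run object the real operator passes in
    def __init__(self, run_id=None, payload=None):
        self.run_id = run_id
        self.payload = payload

def conditionally_trigger(context, dag_run_obj):
    """Return dag_run_obj to trigger the target DAG, or None to skip."""
    if context.get("params", {}).get("condition"):
        dag_run_obj.payload = {"message": "triggered"}
        return dag_run_obj
    return None
```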
 
 <dl class="class">
@@ -1761,6 +2086,14 @@ single record from an external source.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.operators.check_operator.ValueCheckOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/operators/check_operator.html#ValueCheckOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.check_operator.ValueCheckOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
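The value check documented above reduces, roughly, to the following comparison (a sketch of the semantics, not ValueCheckOperator's actual code): exact match when no tolerance is given, otherwise accept results within `pass_value * (1 ± tolerance)`.

```python
def value_check(result, pass_value, tolerance=None):
    """Sketch of a value check: exact equality, or a relative
    tolerance band around pass_value when tolerance is set."""
    if tolerance is None:
        return result == pass_value
    return pass_value * (1 - tolerance) <= result <= pass_value * (1 + tolerance)
```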
 
 <dl class="class">
@@ -1785,6 +2118,14 @@ single record from an external source.</p>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt 
id="airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/operators/redshift_to_s3_operator.html#RedshiftToS3Transfer.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.operators.redshift_to_s3_operator.RedshiftToS3Transfer.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 </div>
@@ -1820,7 +2161,7 @@ or execution_date_fn can be passed to ExternalTaskSensor, 
but not both.</li>
 </table>
 <dl class="method">
 <dt id="airflow.sensors.external_task_sensor.ExternalTaskSensor.poke">
-<code class="descname">poke</code><span 
class="sig-paren">(</span><em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/sensors/external_task_sensor.html#ExternalTaskSensor.poke"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.sensors.external_task_sensor.ExternalTaskSensor.poke" 
title="Permalink to this definition">¶</a></dt>
+<code class="descname">poke</code><span 
class="sig-paren">(</span><em>context</em>, <em>session=None</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/sensors/external_task_sensor.html#ExternalTaskSensor.poke"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.sensors.external_task_sensor.ExternalTaskSensor.poke" 
title="Permalink to this definition">¶</a></dt>
 <dd><p>Function that the sensors defined while deriving this class should
 override.</p>
 </dd></dl>
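Sensors override `poke()` as described above; the base sensor then calls it repeatedly until it returns truthy or a timeout elapses. A generic sketch of that loop (assumed shape, not BaseSensorOperator's actual implementation):

```python
import time

def run_sensor(poke, poke_interval=1, timeout=5,
               clock=time.monotonic, sleep=time.sleep):
    """Call poke() until it returns truthy, sleeping poke_interval
    between tries; raise if timeout seconds pass first."""
    start = clock()
    while not poke():
        if clock() - start > timeout:
            raise TimeoutError("sensor timed out")
        sleep(poke_interval)
    return True
```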
@@ -2075,6 +2416,7 @@ are NOT special characters in the Python regex engine.</p>
 <li><strong>prefix</strong> (<em>str</em>) – The prefix being waited on. 
Relative path from bucket root level.</li>
 <li><strong>delimiter</strong> (<em>str</em>) – The delimiter intended to 
show hierarchy.
 Defaults to ‘/’.</li>
+<li><strong>aws_conn_id</strong> (<em>str</em>) – a reference to the s3 
connection</li>
 </ul>
 </td>
 </tr>
@@ -2182,8 +2524,8 @@ override.</p>
 </div>
 <div class="section" id="community-contributed-operators">
 <h3>Community-contributed Operators<a class="headerlink" 
href="#community-contributed-operators" title="Permalink to this 
headline">¶</a></h3>
-<div class="section" id="id4">
-<h4>Operators<a class="headerlink" href="#id4" title="Permalink to this 
headline">¶</a></h4>
+<div class="section" id="id2">
+<h4>Operators<a class="headerlink" href="#id2" title="Permalink to this 
headline">¶</a></h4>
 <dl class="class">
 <dt id="airflow.contrib.operators.awsbatch_operator.AWSBatchOperator">
 <em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.awsbatch_operator.</code><code 
class="descname">AWSBatchOperator</code><span 
class="sig-paren">(</span><em>job_name</em>, <em>job_definition</em>, 
<em>job_queue</em>, <em>overrides</em>, <em>max_retries=4200</em>, 
<em>aws_conn_id=None</em>, <em>region_name=None</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/contrib/operators/awsbatch_operator.html#AWSBatchOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.awsbatch_operator.AWSBatchOperator" 
title="Permalink to this definition">¶</a></dt>
@@ -2193,34 +2535,47 @@ override.</p>
 <col class="field-name" />
 <col class="field-body" />
 <tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><ul class="first simple">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><ul class="first last simple">
 <li><strong>job_name</strong> (<em>str</em>) – the name for the job that 
will run on AWS Batch</li>
 <li><strong>job_definition</strong> (<em>str</em>) – the job definition name 
on AWS Batch</li>
 <li><strong>job_queue</strong> (<em>str</em>) – the queue name on AWS 
Batch</li>
-<li><strong>max_retries</strong> (<em>int</em>) – exponential backoff 
retries while waiter is not merged, 4200 = 48 hours</li>
+<li><strong>overrides</strong> (<em>dict</em>) – the same parameter that 
boto3 will receive on
+containerOverrides (templated).
+<a class="reference external" 
href="http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job";>http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job</a></li>
+<li><strong>max_retries</strong> (<em>int</em>) – exponential backoff 
retries while waiter is not merged,
+4200 = 48 hours</li>
 <li><strong>aws_conn_id</strong> (<em>str</em>) – connection id of AWS 
credentials / region name. If None,
 credential boto3 strategy will be used
 (<a class="reference external" 
href="http://boto3.readthedocs.io/en/latest/guide/configuration.html";>http://boto3.readthedocs.io/en/latest/guide/configuration.html</a>).</li>
-<li><strong>region_name</strong> – region name to use in AWS Hook.
+<li><strong>region_name</strong> (<em>str</em>) – region name to use in AWS 
Hook.
 Override the region_name in connection (if provided)</li>
 </ul>
 </td>
 </tr>
-<tr class="field-even field"><th class="field-name">Param:</th><td 
class="field-body"><p class="first">overrides: the same parameter that boto3 
will receive on
-containerOverrides (templated):
-<a class="reference external" 
href="http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job";>http://boto3.readthedocs.io/en/latest/reference/services/batch.html#submit_job</a></p>
-</td>
-</tr>
-<tr class="field-odd field"><th class="field-name">Type:</th><td 
class="field-body"><p class="first last">overrides: dict</p>
-</td>
-</tr>
 </tbody>
 </table>
+<dl class="method">
+<dt id="airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/contrib/operators/awsbatch_operator.html#AWSBatchOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.execute" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
+<dl class="method">
+<dt id="airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.on_kill">
+<code class="descname">on_kill</code><span class="sig-paren">(</span><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/contrib/operators/awsbatch_operator.html#AWSBatchOperator.on_kill"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.awsbatch_operator.AWSBatchOperator.on_kill" 
title="Permalink to this definition">¶</a></dt>
+<dd><p>Override this method to cleanup subprocesses when a task instance
+gets killed. Any use of the threading, subprocess or multiprocessing
+module within an operator needs to be cleaned up or it will leave
+ghost processes behind.</p>
+</dd></dl>
+
 </dd></dl>
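The `max_retries` parameter above bounds the operator's exponential-backoff polling of the Batch job. Purely as an illustration of capped exponential backoff (the exact schedule AWSBatchOperator uses is not reproduced here):

```python
def backoff_delays(max_retries, base=1.0, cap=60.0):
    """Illustrative capped exponential backoff: the delay doubles
    each attempt, up to `cap` seconds."""
    delays = []
    delay = base
    for _ in range(max_retries):
        delays.append(delay)
        delay = min(delay * 2, cap)
    return delays
```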
 
 <dl class="class">
 <dt 
id="airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryCheckOperator</code><span 
class="sig-paren">(</span><em>sql</em>, 
<em>bigquery_conn_id='bigquery_default'</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator" 
title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryCheckOperator</code><span 
class="sig-paren">(</span><em>sql</em>, 
<em>bigquery_conn_id='bigquery_default'</em>, <em>use_legacy_sql=True</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryCheckOperator" 
title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.check_operator.CheckOperator" 
title="airflow.operators.check_operator.CheckOperator"><code class="xref py 
py-class docutils literal notranslate"><span 
class="pre">airflow.operators.check_operator.CheckOperator</span></code></a></p>
 <p>Performs checks against BigQuery. The <code class="docutils literal 
notranslate"><span class="pre">BigQueryCheckOperator</span></code> expects
 a sql query that will return a single row. Each value on that
@@ -2252,6 +2607,8 @@ without stopping the progress of the DAG.</p>
 <tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><ul class="first last simple">
 <li><strong>sql</strong> (<em>string</em>) – the sql to be executed</li>
 <li><strong>bigquery_conn_id</strong> (<em>string</em>) – reference to the 
BigQuery database</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use 
legacy SQL (true)
+or standard SQL (false).</li>
 </ul>
 </td>
 </tr>
@@ -2261,14 +2618,19 @@ without stopping the progress of the DAG.</p>
 
 <dl class="class">
 <dt 
id="airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryValueCheckOperator</code><span 
class="sig-paren">(</span><em>sql</em>, <em>pass_value</em>, 
<em>tolerance=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryValueCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator"
 title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryValueCheckOperator</code><span 
class="sig-paren">(</span><em>sql</em>, <em>pass_value</em>, 
<em>tolerance=None</em>, <em>bigquery_conn_id='bigquery_default'</em>, 
<em>use_legacy_sql=True</em>, <em>*args</em>, <em>**kwargs</em><span 
class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryValueCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryValueCheckOperator"
 title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.check_operator.ValueCheckOperator" 
title="airflow.operators.check_operator.ValueCheckOperator"><code class="xref 
py py-class docutils literal notranslate"><span 
class="pre">airflow.operators.check_operator.ValueCheckOperator</span></code></a></p>
 <p>Performs a simple value check using sql code.</p>
 <table class="docutils field-list" frame="void" rules="none">
 <col class="field-name" />
 <col class="field-body" />
 <tbody valign="top">
-<tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><strong>sql</strong> (<em>string</em>) – the sql to be 
executed</td>
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td 
class="field-body"><ul class="first last simple">
+<li><strong>sql</strong> (<em>string</em>) – the sql to be executed</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use 
legacy SQL (true)
+or standard SQL (false).</li>
+</ul>
+</td>
 </tr>
 </tbody>
 </table>
@@ -2276,7 +2638,7 @@ without stopping the progress of the DAG.</p>
 
 <dl class="class">
 <dt 
id="airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator">
-<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryIntervalCheckOperator</code><span 
class="sig-paren">(</span><em>table</em>, <em>metrics_thresholds</em>, 
<em>date_filter_column='ds'</em>, <em>days_back=-7</em>, 
<em>bigquery_conn_id='bigquery_default'</em>, <em>*args</em>, 
<em>**kwargs</em><span class="sig-paren">)</span><a class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryIntervalCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator"
 title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code 
class="descclassname">airflow.contrib.operators.bigquery_check_operator.</code><code
 class="descname">BigQueryIntervalCheckOperator</code><span 
class="sig-paren">(</span><em>table</em>, <em>metrics_thresholds</em>, 
<em>date_filter_column='ds'</em>, <em>days_back=-7</em>, 
<em>bigquery_conn_id='bigquery_default'</em>, <em>use_legacy_sql=True</em>, 
<em>*args</em>, <em>**kwargs</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_check_operator.html#BigQueryIntervalCheckOperator"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_check_operator.BigQueryIntervalCheckOperator"
 title="Permalink to this definition">¶</a></dt>
 <dd><p>Bases: <a class="reference internal" 
href="#airflow.operators.check_operator.IntervalCheckOperator" 
title="airflow.operators.check_operator.IntervalCheckOperator"><code 
class="xref py py-class docutils literal notranslate"><span 
class="pre">airflow.operators.check_operator.IntervalCheckOperator</span></code></a></p>
 <p>Checks that the values of metrics given as SQL expressions are within
 a certain tolerance of the ones from days_back before.</p>
@@ -2296,6 +2658,8 @@ against. Defaults to 7 days</li>
 <li><strong>metrics_threshold</strong> (<em>dict</em>) – a dictionary of 
ratios indexed by metrics, for
 example ‘COUNT(*)’: 1.5 would require a 50 percent or less difference
 between the current day, and the prior days_back.</li>
+<li><strong>use_legacy_sql</strong> (<em>boolean</em>) – Whether to use 
legacy SQL (true)
+or standard SQL (false).</li>
 </ul>
 </td>
 </tr>
@@ -2352,11 +2716,19 @@ delegation enabled.</li>
 </tr>
 </tbody>
 </table>
+<dl class="method">
+<dt 
id="airflow.contrib.operators.bigquery_get_data.BigQueryGetDataOperator.execute">
+<code class="descname">execute</code><span 
class="sig-paren">(</span><em>context</em><span class="sig-paren">)</span><a 
class="reference internal" 
href="_modules/airflow/contrib/operators/bigquery_get_data.html#BigQueryGetDataOperator.execute"><span
 class="viewcode-link">[source]</span></a><a class="headerlink" 
href="#airflow.contrib.operators.bigquery_get_data.BigQueryGetDataOperator.execute"
 title="Permalink to this definition">¶</a></dt>
+<dd><p>This is the main method to derive when creating an operator.
+Context is the same dictionary used when rendering jinja templates.</p>
+<p>Refer to get_template_context for more context.</p>
+</dd></dl>
+
 </dd></dl>
 
 <dl class="class">
 <dt 
id="airflow.contrib.operators.bigquery_operator.BigQueryCreateEmptyTableOperator">
-<em class="p

<TRUNCATED>
