[jira] [Created] (AIRFLOW-948) cannot run "airflow initdb"

2017-03-06 Thread Tony Kucera (JIRA)
Tony Kucera created AIRFLOW-948:
---

 Summary: cannot run "airflow initdb"
 Key: AIRFLOW-948
 URL: https://issues.apache.org/jira/browse/AIRFLOW-948
 Project: Apache Airflow
  Issue Type: Bug
Affects Versions: Airflow 1.7.1
 Environment: Debian Jessie 8.2, Python 2.7.9, Python 3.4.2
Reporter: Tony Kucera


[2017-03-07 08:45:38,241] {__init__.py:36} INFO - Using executor 
SequentialExecutor
[2017-03-07 08:45:38,538] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.4/lib2to3/Grammar.txt
[2017-03-07 08:45:38,578] {driver.py:120} INFO - Generating grammar tables from 
/usr/lib/python3.4/lib2to3/PatternGrammar.txt
DB: sqlite:root/airflow/airflow.db
[2017-03-07 08:45:38,898] {db.py:222} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
ERROR [airflow.models.DagBag] Failed to import: 
/usr/local/lib/python3.4/dist-packages/airflow/example_dags/example_twitter_dag.py
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/dist-packages/airflow/models.py", line 247, in 
process_file
m = imp.load_source(mod_name, filepath)
  File "/usr/lib/python3.4/imp.py", line 171, in load_source
module = methods.load()
  File "", line 1220, in load
  File "", line 1200, in _load_unlocked
  File "", line 1129, in _exec
  File "", line 1471, in exec_module
  File "", line 321, in _call_with_frames_removed
  File 
"/usr/local/lib/python3.4/dist-packages/airflow/example_dags/example_twitter_dag.py",
 line 26, in 
from airflow.operators import BashOperator, HiveOperator, PythonOperator
ImportError: cannot import name 'HiveOperator'
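The traceback shows the example DAG failing on a bulk import: HiveOperator is only registered under airflow.operators when the Hive extras and their dependencies are installed (the usual fix is `pip install "airflow[hive]"`). A minimal, hypothetical sketch of a defensive import that lets the rest of a DAG file load when an optional operator is missing:

```python
# Hypothetical defensive-import sketch; not the actual example DAG code.
# The real fix for AIRFLOW-948 is installing the hive extras so that
# HiveOperator is importable from airflow.operators.
try:
    from airflow.operators import HiveOperator  # fails without hive extras
except ImportError:
    HiveOperator = None  # skip Hive tasks when the dependency is absent

def build_tasks():
    """Return only the operator classes that are actually importable."""
    return [op for op in (HiveOperator,) if op is not None]
```

This pattern keeps `airflow initdb` from choking while scanning example DAGs, at the cost of silently dropping the Hive tasks.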



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (AIRFLOW-947) Make PrestoHook surface better messages when the Presto Cluster is unavailable.

2017-03-06 Thread Arthur Wiedmer (JIRA)
Arthur Wiedmer created AIRFLOW-947:
--

 Summary: Make PrestoHook surface better messages when the Presto 
Cluster is unavailable.
 Key: AIRFLOW-947
 URL: https://issues.apache.org/jira/browse/AIRFLOW-947
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Arthur Wiedmer
Assignee: Arthur Wiedmer
Priority: Minor
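The issue has no description yet, but "surfacing better messages" presumably means catching the low-level connection error and re-raising something actionable. A purely hypothetical sketch (not PrestoHook's actual code; the exception class, callable shape, and host parameter are all assumptions):

```python
# Hypothetical sketch of clearer error surfacing when Presto is down.
class PrestoClusterUnavailable(Exception):
    """Raised instead of a bare socket/OSError when Presto is unreachable."""

def run_presto_query(execute, sql, host="presto.example.com"):
    """execute: a callable that runs `sql`; assumed to raise OSError
    (connection refused / timeout) when the cluster is unreachable."""
    try:
        return execute(sql)
    except OSError as exc:
        raise PrestoClusterUnavailable(
            "Presto cluster at %s is unavailable: %s" % (host, exc)
        ) from exc
```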








[jira] [Created] (AIRFLOW-946) Virtualenv not explicitly used by webserver/worker subprocess

2017-03-06 Thread Daniel Huang (JIRA)
Daniel Huang created AIRFLOW-946:


 Summary: Virtualenv not explicitly used by webserver/worker 
subprocess
 Key: AIRFLOW-946
 URL: https://issues.apache.org/jira/browse/AIRFLOW-946
 Project: Apache Airflow
  Issue Type: Bug
  Components: cli
Reporter: Daniel Huang
Assignee: Daniel Huang
Priority: Minor


I have airflow installed in a virtualenv. I'd expect calling 
{{/path/to/venv/bin/airflow webserver}} or {{/path/to/venv/bin/airflow worker}} 
*without activating my virtualenv* to work. However, they both fail to run 
properly because they spawn a process that is called without specifying the 
virtualenv.
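The usual fix for this class of bug is to spawn children via `sys.executable` so the subprocess inherits the parent's interpreter (and hence its virtualenv) rather than whatever `python`/`airflow` happens to be first on PATH. A minimal sketch of that pattern (not Airflow's actual spawning code):

```python
import subprocess
import sys

# Spawn a child with the parent's own interpreter, i.e. the virtualenv's
# python, instead of resolving "python" through PATH.
def spawn_in_same_env(args):
    return subprocess.run(
        [sys.executable] + args, capture_output=True, text=True
    )

# The child reports which interpreter it is running under.
result = spawn_in_same_env(["-c", "import sys; print(sys.executable)"])
```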





[jira] [Commented] (AIRFLOW-288) Make system timezone configurable

2017-03-06 Thread David Klosowski (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-288?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15898590#comment-15898590
 ] 

David Klosowski commented on AIRFLOW-288:
-

Thoughts on support for this? It would be nice to see this at the DAG level so
individual DAGs could be triggered in a timezone-aware way by the
scheduler.

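Airflow at this point is timezone-naive (schedules are effectively in the system/UTC clock), so DAG-level timezone awareness would amount to converting a desired local wall-clock time into the UTC time the scheduler actually fires on. A stdlib-only sketch of that conversion (this is not an Airflow API, just an illustration of the requested behavior):

```python
from datetime import datetime, timedelta, timezone

# Illustration only: map a daily local-time schedule to its UTC equivalent,
# which is what a timezone-aware DAG schedule would need internally.
def local_schedule_to_utc(hour, minute, utc_offset_hours):
    """Return the (hour, minute) in UTC for a daily schedule expressed in a
    fixed-offset local timezone."""
    local_tz = timezone(timedelta(hours=utc_offset_hours))
    local = datetime(2017, 3, 6, hour, minute, tzinfo=local_tz)
    utc = local.astimezone(timezone.utc)
    return utc.hour, utc.minute
```

Note a fixed offset sidesteps DST transitions; a real implementation would need a full tz database to handle those.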

> Make system timezone configurable
> -
>
> Key: AIRFLOW-288
> URL: https://issues.apache.org/jira/browse/AIRFLOW-288
> Project: Apache Airflow
>  Issue Type: Improvement
>Reporter: Vineet Goel
>Assignee: Vineet Goel
>






incubator-airflow git commit: [AIRFLOW-941] Use defined parameters for psycopg2

2017-03-06 Thread bolke
Repository: incubator-airflow
Updated Branches:
  refs/heads/master 2cfe28244 -> e79dee871


[AIRFLOW-941] Use defined parameters for psycopg2

This works around
https://github.com/psycopg/psycopg2/issues/517 .

Closes #2126 from bolkedebruin/AIRFLOW-941


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/e79dee87
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/e79dee87
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/e79dee87

Branch: refs/heads/master
Commit: e79dee8718ca7d69a6905509f45fd2c5e9267209
Parents: 2cfe282
Author: Bolke de Bruin 
Authored: Mon Mar 6 21:03:14 2017 +0100
Committer: Bolke de Bruin 
Committed: Mon Mar 6 21:03:14 2017 +0100

--
 airflow/hooks/postgres_hook.py | 15 +++
 1 file changed, 11 insertions(+), 4 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/e79dee87/airflow/hooks/postgres_hook.py
--
diff --git a/airflow/hooks/postgres_hook.py b/airflow/hooks/postgres_hook.py
index 372e4e5..4b460c1 100644
--- a/airflow/hooks/postgres_hook.py
+++ b/airflow/hooks/postgres_hook.py
@@ -36,10 +36,17 @@ class PostgresHook(DbApiHook):
         conn = self.get_connection(self.postgres_conn_id)
         conn_args = dict(
             host=conn.host,
-            user=conn.login,
-            password=conn.password,
-            dbname=self.schema or conn.schema,
-            port=conn.port)
+            dbname=self.schema or conn.schema)
+        # work around for https://github.com/psycopg/psycopg2/issues/517
+        # todo: remove when psycopg2 2.7.1 is released
+        # https://issues.apache.org/jira/browse/AIRFLOW-945
+        if conn.port:
+            conn_args['port'] = conn.port
+        if conn.login:
+            conn_args['user'] = conn.login
+        if conn.password:
+            conn_args['password'] = conn.password
+
         # check for ssl parameters in conn.extra
         for arg_name, arg_val in conn.extra_dejson.items():
             if arg_name in ['sslmode', 'sslcert', 'sslkey', 'sslrootcert',
                             'sslcrl', 'application_name']:
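The pattern in the diff above can be sketched standalone: build the kwargs dict with only the parameters that are actually set, so psycopg2 2.7.0 never receives port=None (the regression tracked in psycopg2 issue 517). The function name here is illustrative, not Airflow's API:

```python
# Standalone sketch of the workaround: omit unset connection parameters
# entirely instead of passing None values through to psycopg2.connect().
def build_conn_args(host, dbname, port=None, login=None, password=None):
    conn_args = {"host": host, "dbname": dbname}
    if port:
        conn_args["port"] = port
    if login:
        conn_args["user"] = login
    if password:
        conn_args["password"] = password
    return conn_args  # e.g. psycopg2.connect(**conn_args)
```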



[jira] [Commented] (AIRFLOW-941) Psycopg2 2.7.0 has a regression when port is 'None'

2017-03-06 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/AIRFLOW-941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15897952#comment-15897952
 ] 

ASF subversion and git services commented on AIRFLOW-941:
-

Commit e79dee8718ca7d69a6905509f45fd2c5e9267209 in incubator-airflow's branch 
refs/heads/master from [~bolke]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=e79dee8 ]

[AIRFLOW-941] Use defined parameters for psycopg2

This works around
https://github.com/psycopg/psycopg2/issues/517 .

Closes #2126 from bolkedebruin/AIRFLOW-941


> Psycopg2 2.7.0 has a regression when port is 'None'
> ---
>
> Key: AIRFLOW-941
> URL: https://issues.apache.org/jira/browse/AIRFLOW-941
> Project: Apache Airflow
>  Issue Type: Bug
>  Components: operators
>Affects Versions: 1.8.0rc4
>Reporter: Bolke de Bruin
>Assignee: Bolke de Bruin
> Fix For: 1.8.0rc5
>
>
> workaround is to not specify username/password if they are not available. See 
> external issue URL.







[10/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/down-pressed.png
--
diff --git a/_static/down-pressed.png b/_static/down-pressed.png
index 7c30d00..5756c8c 100644
Binary files a/_static/down-pressed.png and b/_static/down-pressed.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/down.png
--
diff --git a/_static/down.png b/_static/down.png
index f48098a..1b3bdad 100644
Binary files a/_static/down.png and b/_static/down.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/file.png
--
diff --git a/_static/file.png b/_static/file.png
index 254c60b..a858a41 100644
Binary files a/_static/file.png and b/_static/file.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/Inconsolata-Bold.ttf
--
diff --git a/_static/fonts/Inconsolata-Bold.ttf 
b/_static/fonts/Inconsolata-Bold.ttf
index 58c9fef..809c1f5 100644
Binary files a/_static/fonts/Inconsolata-Bold.ttf and 
b/_static/fonts/Inconsolata-Bold.ttf differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/Inconsolata-Regular.ttf
--
diff --git a/_static/fonts/Inconsolata-Regular.ttf 
b/_static/fonts/Inconsolata-Regular.ttf
index a87ffba..fc981ce 100644
Binary files a/_static/fonts/Inconsolata-Regular.ttf and 
b/_static/fonts/Inconsolata-Regular.ttf differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/Lato-Bold.ttf
--
diff --git a/_static/fonts/Lato-Bold.ttf b/_static/fonts/Lato-Bold.ttf
index 7434369..1d23c70 100644
Binary files a/_static/fonts/Lato-Bold.ttf and b/_static/fonts/Lato-Bold.ttf 
differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/Lato-Regular.ttf
--
diff --git a/_static/fonts/Lato-Regular.ttf b/_static/fonts/Lato-Regular.ttf
index 04ea8ef..0f3d0f8 100644
Binary files a/_static/fonts/Lato-Regular.ttf and 
b/_static/fonts/Lato-Regular.ttf differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/fontawesome-webfont.eot
--
diff --git a/_static/fonts/fontawesome-webfont.eot 
b/_static/fonts/fontawesome-webfont.eot
index 84677bc..c7b00d2 100644
Binary files a/_static/fonts/fontawesome-webfont.eot and 
b/_static/fonts/fontawesome-webfont.eot differ



[19/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/api.html
--
diff --git a/api.html b/api.html
new file mode 100644
index 000..7aca3bb
--- /dev/null
+++ b/api.html
@@ -0,0 +1,279 @@
+[HTML boilerplate stripped during archiving: doctype, head (title "Experimental Rest API - Airflow Documentation", CSS/JS includes), and the theme navigation sidebar (Project, License, Quick Start, Installation, Tutorial, Configuration, UI / Screenshots, Concepts, Data Profiling, Command Line Interface, Scheduling & Triggers, Plugins, Security, Experimental Rest API with Endpoints/CLI/Authentication, Integration, FAQ, API Reference)]
+Experimental Rest API¶
+Airflow exposes an experimental Rest API. It is available through the
+webserver. Endpoints are available at /api/experimental/. Please note that
+we expect the endpoint definitions to change.
+
+Endpoints¶
+This is a placeholder until the swagger definitions are active:
+
+* /api/experimental/dags/DAG_ID/tasks/TASK_ID returns info for a task (GET).
+* /api/experimental/dags/DAG_ID/dag_runs creates a dag_run for a given dag id (POST).
+
+CLI¶
+For some functions the CLI can use the API. To configure the CLI to use the
+API when available, configure as follows:
+
+[cli]
+api_client = airflow.api.client.json_client
+endpoint_url = http://WEBSERVER:PORT
+
+Authentication¶
+Only Kerberos authentication is currently supported for the API. To enable
+this, set the following in the configuration:
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
+[kerberos]
+keytab = KEYTAB
+
+The Kerberos service is configured as airflow/fully.qualified.domainname@REALM.
+Make sure this principal exists in the keytab file.
+[page footer, Sphinx theme scripts, and DOCUMENTATION_OPTIONS inline JavaScript stripped during archiving]
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/integration.html
--
diff --git a/integration.html b/integration.html
new file mode 100644
index 000..a55aa41
--- /dev/null
+++ b/integration.html
@@ -0,0 +1,424 @@
+[HTML boilerplate stripped during archiving: doctype, head (title "Integration - Airflow Documentation", CSS/JS includes), and theme navigation header]
+Project
+License
+Quick 
Start
+Installation
+Tutorial
+Configuration
+UI / 
Screenshots
+Concepts
+Data Profiling
+Command 
Line Interface
+Scheduling  Triggers
+Plugins
+Security
+Experimental Rest API
+Integration
+AWS: Amazon Webservices
+GCP: Google Cloud Platform
+BigQuery
+BigQuery Operators
+BigQueryHook
+
+
+Cloud DataFlow
+DataFlow Operators
+DataFlowHook
+
+
+Cloud DataProc
+DataProc Operators
+DataProcPySparkOperator
+
+
+Cloud Datastore
+Datastore Operators
+
+
+Cloud Storage
+Storage Operators
+GoogleCloudStorageHook
+
+
+
+
+
+
+FAQ
+API 
Reference
+
+
+[navigation footer markup stripped; remainder of the integration.html diff truncated in the digest]

[18/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
Latest docs version as of 1.8.x


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/repo
Commit: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/commit/9c75ee9e
Tree: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/tree/9c75ee9e
Diff: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/diff/9c75ee9e

Branch: refs/heads/asf-site
Commit: 9c75ee9e46cc5ec459979dbfd2c911743c07fd63
Parents: 4af0850
Author: Maxime Beauchemin 
Authored: Mon Mar 6 08:43:12 2017 -0800
Committer: Maxime Beauchemin 
Committed: Mon Mar 6 08:43:12 2017 -0800

--
 .../contrib/executors/mesos_executor.html   |   81 +-
 .../contrib/operators/hipchat_operator.html |   72 +-
 _modules/airflow/executors/local_executor.html  |   73 +-
 .../airflow/executors/sequential_executor.html  |   59 +-
 _modules/airflow/macros.html|   79 +-
 _modules/airflow/macros/hive.html   |   61 +-
 _modules/airflow/models.html| 1993 ++
 _modules/airflow/operators/sensors.html |  242 ++-
 _modules/bash_operator.html |   77 +-
 _modules/dagrun_operator.html   |   67 +-
 _modules/dbapi_hook.html|  130 +-
 _modules/dummy_operator.html|   49 +-
 _modules/email_operator.html|   61 +-
 _modules/ftp_hook.html  |   69 +-
 _modules/generic_transfer.html  |   57 +-
 _modules/http_hook.html |   61 +-
 _modules/http_operator.html |   71 +-
 _modules/index.html |   74 +-
 _modules/mysql_hook.html|   95 +-
 _modules/mysql_operator.html|   61 +-
 _modules/presto_check_operator.html |   60 +-
 _modules/presto_hook.html   |   63 +-
 _modules/python_operator.html   |   69 +-
 _modules/sensors.html   |  242 ++-
 _modules/sqlite_hook.html   |   49 +-
 _modules/ssh_execute_operator.html  |   75 +-
 _modules/ssh_hook.html  |   81 +-
 _static/basic.css   |   68 +-
 _static/comment-bright.png  |  Bin 3500 -> 756 bytes
 _static/comment-close.png   |  Bin 3578 -> 829 bytes
 _static/comment.png |  Bin 3445 -> 641 bytes
 _static/css/badge_only.css  |2 +-
 _static/css/theme.css   |4 +-
 _static/down-pressed.png|  Bin 347 -> 222 bytes
 _static/down.png|  Bin 347 -> 202 bytes
 _static/file.png|  Bin 358 -> 286 bytes
 _static/fonts/Inconsolata-Bold.ttf  |  Bin 66352 -> 109948 bytes
 _static/fonts/Inconsolata-Regular.ttf   |  Bin 84548 -> 96964 bytes
 _static/fonts/Lato-Bold.ttf |  Bin 121788 -> 656544 bytes
 _static/fonts/Lato-Regular.ttf  |  Bin 120196 -> 656568 bytes
 _static/fonts/fontawesome-webfont.eot   |  Bin 56006 -> 76518 bytes
 _static/fonts/fontawesome-webfont.svg   |  207 +-
 _static/fonts/fontawesome-webfont.ttf   |  Bin 112160 -> 152796 bytes
 _static/fonts/fontawesome-webfont.woff  |  Bin 65452 -> 90412 bytes
 _static/jquery.js   |8 +-
 _static/js/theme.js |   58 +-
 _static/minus.png   |  Bin 173 -> 90 bytes
 _static/plus.png|  Bin 173 -> 90 bytes
 _static/searchtools.js  |  115 +-
 _static/up-pressed.png  |  Bin 345 -> 214 bytes
 _static/up.png  |  Bin 345 -> 203 bytes
 cli.html|  482 +++--
 code.html   | 1933 +++--
 concepts.html   |  129 +-
 configuration.html  |   97 +-
 faq.html|   61 +-
 genindex.html   | 1541 --
 index.html  |  115 +-
 installation.html   |   51 +-
 license.html|   51 +-
 objects.inv |  Bin 2326 -> 2147 bytes
 plugins.html|   64 +-
 profiling.html  |   51 +-
 project.html|   76 +-
 py-modindex.html|   63 +-
 scheduler.html  |   79 +-
 search.html   

[02/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/searchindex.js
--
diff --git a/searchindex.js b/searchindex.js
index 3d2f62d..625cb26 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
-Search.setIndex({envversion:47,filenames:["cli","code","concepts","configuration","faq","index","installation","license","plugins","profiling","project","scheduler","security","start","tutorial","ui"],objects:{"airflow.contrib":{hooks:[1,1,0,"-"],operators:[1,1,0,"-"]},"airflow.contrib.executors.mesos_executor":{MesosExecutor:[1,0,1,""]},"airflow.contrib.hooks":{BigQueryHook:[1,0,1,""],CloudantHook:[1,0,1,""],FTPHook:[1,0,1,""],GoogleCloudStorageHook:[1,0,1,""],SSHHook:[1,0,1,""],VerticaHook:[1,0,1,""]},"airflow.contrib.hooks.BigQueryHook":{get_conn:[1,2,1,""],get_pandas_df:[1,2,1,""],get_service:[1,2,1,""],insert_rows:[1,2,1,""]},"airflow.contrib.hooks.CloudantHook":{db:[1,2,1,""]},"airflow.contrib.hooks.FTPHook":{close_conn:[1,2,1,""],create_directory:[1,2,1,""],delete_directory:[1,2,1,""],delete_file:[1,2,1,""],describe_directory:[1,2,1,""],get_conn:[1,2,1,""],list_directory:[1,2,1,""],rename:[1,2,1,""],retrieve_file:[1,2,1,""],store_file:[1,2,1,""]},"airflow.contrib.hooks.Google
 
CloudStorageHook":{download:[1,2,1,""],get_conn:[1,2,1,""],upload:[1,2,1,""]},"airflow.contrib.hooks.SSHHook":{Popen:[1,2,1,""],check_output:[1,2,1,""],tunnel:[1,2,1,""]},"airflow.contrib.hooks.VerticaHook":{get_conn:[1,2,1,""]},"airflow.contrib.hooks.gcs_hook":{GoogleCloudStorageHook:[1,0,1,""]},"airflow.contrib.operators":{QuboleOperator:[1,0,1,""],SSHExecuteOperator:[1,0,1,""],VerticaOperator:[1,0,1,""],VerticaToHiveTransfer:[1,0,1,""]},"airflow.contrib.operators.bigquery_operator":{BigQueryOperator:[1,0,1,""]},"airflow.contrib.operators.bigquery_to_gcs":{BigQueryToCloudStorageOperator:[1,0,1,""]},"airflow.contrib.operators.gcs_download_operator":{GoogleCloudStorageDownloadOperator:[1,0,1,""]},"airflow.contrib.operators.hipchat_operator":{HipChatAPIOperator:[1,0,1,""],HipChatAPISendRoomNotificationOperator:[1,0,1,""]},"airflow.executors":{CeleryExecutor:[1,0,1,""],LocalExecutor:[1,0,1,""],SequentialExecutor:[1,0,1,""]},"airflow.hooks":{DbApiHook:[1,0,1,""],DruidHook:[1,0,1,""],Hi
 
veCliHook:[1,0,1,""],HiveMetastoreHook:[1,0,1,""],HiveServer2Hook:[1,0,1,""],HttpHook:[1,0,1,""],MsSqlHook:[1,0,1,""],MySqlHook:[1,0,1,""],PostgresHook:[1,0,1,""],PrestoHook:[1,0,1,""],S3Hook:[1,0,1,""],SqliteHook:[1,0,1,""],WebHDFSHook:[1,0,1,""]},"airflow.hooks.DbApiHook":{bulk_dump:[1,2,1,""],bulk_load:[1,2,1,""],get_conn:[1,2,1,""],get_cursor:[1,2,1,""],get_first:[1,2,1,""],get_pandas_df:[1,2,1,""],get_records:[1,2,1,""],insert_rows:[1,2,1,""],run:[1,2,1,""]},"airflow.hooks.DruidHook":{construct_ingest_query:[1,2,1,""],get_conn:[1,2,1,""],load_from_hdfs:[1,2,1,""]},"airflow.hooks.HiveCliHook":{load_file:[1,2,1,""],run_cli:[1,2,1,""],test_hql:[1,2,1,""]},"airflow.hooks.HiveMetastoreHook":{check_for_named_partition:[1,2,1,""],check_for_partition:[1,2,1,""],get_databases:[1,2,1,""],get_metastore_client:[1,2,1,""],get_partitions:[1,2,1,""],get_table:[1,2,1,""],get_tables:[1,2,1,""],max_partition:[1,2,1,""],table_exists:[1,2,1,""]},"airflow.hooks.HiveServer2Hook":{get_pandas_df:[1,2,
 
1,""],get_records:[1,2,1,""]},"airflow.hooks.HttpHook":{get_conn:[1,2,1,""],run:[1,2,1,""],run_and_check:[1,2,1,""]},"airflow.hooks.MsSqlHook":{get_conn:[1,2,1,""]},"airflow.hooks.MySqlHook":{bulk_load:[1,2,1,""],get_conn:[1,2,1,""]},"airflow.hooks.PrestoHook":{get_conn:[1,2,1,""],get_first:[1,2,1,""],get_pandas_df:[1,2,1,""],get_records:[1,2,1,""],run:[1,2,1,""]},"airflow.hooks.S3Hook":{check_for_bucket:[1,2,1,""],check_for_key:[1,2,1,""],check_for_prefix:[1,2,1,""],check_for_wildcard_key:[1,2,1,""],get_bucket:[1,2,1,""],get_conn:[1,2,1,""],get_key:[1,2,1,""],get_wildcard_key:[1,2,1,""],list_keys:[1,2,1,""],list_prefixes:[1,2,1,""],load_file:[1,2,1,""],load_string:[1,2,1,""]},"airflow.hooks.SqliteHook":{get_conn:[1,2,1,""]},"airflow.hooks.WebHDFSHook":{check_for_path:[1,2,1,""],get_conn:[1,2,1,""],load_file:[1,2,1,""]},"airflow.macros":{ds_add:[1,3,1,""],ds_format:[1,3,1,""],hive:[1,1,0,"-"],random:[1,3,1,""]},"airflow.macros.hive":{closest_ds_partition:[1,3,1,""],max_partition:[1,
 

[01/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
Repository: incubator-airflow-site
Updated Branches:
  refs/heads/asf-site 4af0850c3 -> 5e5740122


http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/security.html
--
diff --git a/security.html b/security.html
index 25ada3c..d0156b7 100644
--- a/security.html
+++ b/security.html
@@ -30,8 +30,11 @@
+[head and navigation markup changes; tags stripped during archiving]
@@ -102,12 +106,27 @@
 Limitations
 Enabling kerberos
 Using kerberos authentication
-GitHub Enterprise (GHE) 
Authentication
-Setting up GHE Authentication
 
 
+OAuth Authentication
+GitHub Enterprise (GHE) 
Authentication
+Setting up GHE Authentication
+
+
+Google Authentication
+Setting up Google 
Authentication
+
+
+
+
+SSL
+Impersonation
 
 
+
+
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -122,8 +141,10 @@
+[theme header, breadcrumb ("Docs / Security"), and "View page source" markup changes; tags stripped during archiving]
@@ -156,13 +194,13 @@
 
   
 Security¶
-
-Web Authentication¶
 By default, all gates are opened. An easy way to restrict access
 to the web application is to do it at the network level, or by using
 SSH tunnels.
 It is however possible to switch on authentication by either using one of 
the supplied
 backends or create your own.
+
+Web Authentication¶
 
 Password¶
 One of the simplest mechanisms for authentication is requiring users to 
specify a password before logging in.
@@ -331,6 +369,9 @@ section of the connection. For the login user specify the 
following as extra:
 
 
+
+
+OAuth Authentication¶
 
 GitHub Enterprise (GHE) Authentication¶
 The GitHub Enterprise authentication backend can be used to authenticate 
users
@@ -348,12 +389,11 @@ your GHE installation will be able to login to 
Airflow.
 client_id = 
oauth_key_from_github_enterprise
 client_secret = 
oauth_secret_from_github_enterprise
 oauth_callback_route = 
/example/ghe_oauth/callback
-allowed_teams = example_team_1, 
example_team_2
+allowed_teams = 1, 345, 23
 
 
-
 
-Setting up GHE Authentication¶
+Setting up GHE Authentication¶
 An application must be setup in GHE before you can use the GHE 
authentication
 backend. In order to setup an application:
 
@@ -367,19 +407,87 @@ backend. In order to setup an application:
 
 
 
+
+Google Authentication¶
+The Google authentication backend can be used to authenticate users
+against Google using OAuth2. You must specify a domain to restrict login
+to only members of that domain.
+[webserver]
+authenticate = True
+auth_backend = 
airflow.contrib.auth.backends.google_auth
+
+[google]
+client_id = google_client_id
+client_secret = 
google_client_secret
+oauth_callback_route = 
/oauth2callback
+domain = example.com
+
+
+
+Setting up Google Authentication¶
+An application must be setup in the Google API Console before you can use 
the Google authentication
+backend. In order to setup an application:
+
+Navigate to https://console.developers.google.com/apis/
+Select Credentials from the left hand nav
+Click Create credentials and choose OAuth client 
ID
+Choose Web application
+Fill in the required information (the Authorized redirect
URIs must be fully qualified, e.g. http://airflow.example.com/oauth2callback)
+Click Create
+Copy Client ID, Client Secret, and your 
redirect URI to your airflow.cfg according to the above example
+
+
+
+
+
+SSL¶
+SSL can be enabled by providing a certificate and key. Once enabled, be 
sure to use
+https:// in
your browser.
+[webserver]
+web_server_ssl_cert = path 
to cert
+web_server_ssl_key = path 
to key
+
+
+Enabling SSL will not automatically change the web server port. If you want 
to use the
standard port 443, you'll need to configure that too. Be aware that 
super user privileges
+(or cap_net_bind_service on Linux) are required to listen on port 443.
+# Optionally, set the server to listen on the standard SSL 
port.
+web_server_port = 443
+base_url = http://hostname 
or IP:443
+
+
+
+Impersonation¶
+Airflow has the ability to impersonate a unix user while running task
+instances based on the task's run_as_user parameter, which takes a user's
+name.
+NOTE: For impersonations to work, Airflow must be run with 
sudo as subtasks are run
+with sudo -u and permissions of files are changed. Furthermore, 
the unix user needs to
+exist on the worker. Here is what a simple sudoers file entry could look like 
to achieve
+this, assuming airflow is running as 

[22/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
Latest docs version as of 1.8.x


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/repo
Commit: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/commit/5e574012
Tree: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/tree/5e574012
Diff: 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/diff/5e574012

Branch: refs/heads/asf-site
Commit: 5e5740122ed33a22a30047e75e6ca4c7da3961b4
Parents: 9c75ee9
Author: Maxime Beauchemin 
Authored: Mon Mar 6 08:43:25 2017 -0800
Committer: Maxime Beauchemin 
Committed: Mon Mar 6 08:43:25 2017 -0800

--
 _images/latest_only_with_trigger.png |   Bin 0 -> 40034 bytes
 _sources/api.rst.txt |43 +
 _sources/cli.rst.txt |11 +
 _sources/code.rst.txt|   255 +
 _sources/concepts.rst.txt|   833 +++
 _sources/configuration.rst.txt   |   284 +
 _sources/faq.rst.txt |   147 +
 _sources/index.rst.txt   |89 +
 _sources/installation.rst.txt|90 +
 _sources/integration.rst.txt |   246 +
 _sources/license.rst.txt |   211 +
 _sources/plugins.rst.txt |   144 +
 _sources/profiling.rst.txt   |39 +
 _sources/project.rst.txt |49 +
 _sources/scheduler.rst.txt   |   153 +
 _sources/security.rst.txt|   334 +
 _sources/start.rst.txt   |49 +
 _sources/tutorial.rst.txt|   429 ++
 _sources/ui.rst.txt  |   102 +
 _static/fonts/Inconsolata.ttf|   Bin 0 -> 63184 bytes
 _static/jquery-3.1.0.js  | 10074 
 api.html |   279 +
 integration.html |   424 ++
 23 files changed, 14285 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/_images/latest_only_with_trigger.png
--
diff --git a/_images/latest_only_with_trigger.png 
b/_images/latest_only_with_trigger.png
new file mode 100644
index 000..629adfa
Binary files /dev/null and b/_images/latest_only_with_trigger.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/_sources/api.rst.txt
--
diff --git a/_sources/api.rst.txt b/_sources/api.rst.txt
new file mode 100644
index 000..eef671c
--- /dev/null
+++ b/_sources/api.rst.txt
@@ -0,0 +1,43 @@
+Experimental Rest API
+=
+
+Airflow exposes an experimental Rest API. It is available through the 
webserver. Endpoints are
+available at /api/experimental/. Please note that we expect the endpoint 
definitions to change.
+
+Endpoints
+-
+
+This is a place holder until the swagger definitions are active
+
+* /api/experimental/dags//tasks/ returns info for a task 
(GET).
+* /api/experimental/dags//dag_runs creates a dag_run for a given dag 
id (POST).
+
+CLI
+-
+
+For some functions the cli can use the API. To configure the CLI to use the 
API when available
+configure as follows:
+
+.. code-block:: bash
+
+[cli]
+api_client = airflow.api.client.json_client
+endpoint_url = http://:
+
+
+Authentication
+--
+
+Only Kerberos authentication is currently supported for the API. To enable 
this set the following
+in the configuration:
+
+.. code-block:: bash
+
+[api]
+auth_backend = airflow.api.auth.backend.default
+
+[kerberos]
+keytab = 
+
+The Kerberos service is configured as 
`airflow/fully.qualified.domainname@REALM`. Make sure this
+principal exists in the keytab file.
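The endpoint shapes listed above can be sketched as a tiny URL builder (the base URL is a placeholder, mirroring the WEBSERVER:PORT placeholder in the config; actually calling the endpoints would additionally require the Kerberos auth described above):

```python
# Sketch of a client-side URL builder for the experimental API endpoints
# documented above; base URL is a placeholder, not a real host.
BASE = "http://WEBSERVER:PORT/api/experimental"

def task_info_url(dag_id, task_id):
    # GET on this URL returns info for a task
    return "%s/dags/%s/tasks/%s" % (BASE, dag_id, task_id)

def trigger_dag_url(dag_id):
    # POST to this URL creates a dag_run for the given dag id
    return "%s/dags/%s/dag_runs" % (BASE, dag_id)
```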

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/_sources/cli.rst.txt
--
diff --git a/_sources/cli.rst.txt b/_sources/cli.rst.txt
new file mode 100644
index 000..f05cbfb
--- /dev/null
+++ b/_sources/cli.rst.txt
@@ -0,0 +1,11 @@
+Command Line Interface
+==
+
+Airflow has a very rich command line interface that allows for
+many types of operation on a DAG, starting services, and supporting
+development and testing.
+
+.. argparse::
+   :module: airflow.bin.cli
+   :func: get_parser
+   :prog: airflow

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/_sources/code.rst.txt
--
diff --git a/_sources/code.rst.txt b/_sources/code.rst.txt
new file mode 100644
index 000..fabe6db
--- /dev/null
+++ b/_sources/code.rst.txt
@@ -0,0 +1,255 @@
+API Reference
+=
+
+Operators
+-
+Operators allow for generation of certain types of tasks that become nodes in
+the DAG when 

[05/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/concepts.html
--
diff --git a/concepts.html b/concepts.html
index 15b2b4d..1ed51a5 100644
--- a/concepts.html
+++ b/concepts.html
@@ -30,6 +30,9 @@
   
 
   
+
+
 
 
  
@@ -41,6 +44,7 @@
 
 
 
+   
   
 
 
@@ -114,6 +118,7 @@
 SubDAGs
 SLAs
 Trigger Rules
+Latest Run Only
 Zombies  Undeads
 Cluster Policy
 Documentation  Notes
@@ -128,6 +133,8 @@
 Scheduling  Triggers
 Plugins
 Security
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -142,8 +149,10 @@
 
   
   
-
-Airflow
+
+  
+  Airflow
+
   
 
 
@@ -152,23 +161,40 @@
 
   
 
- 
+
+
+
+
+
+
+
+
+
+
+
 
 
 
 
+
   
-Docs 
-  
-Concepts
+
+  Docs 
+
+  Concepts
+
+
   
 
-  
- View page 
source
+
+ View page 
source
   
 
   
+
   
+
+  
   
 
   http://schema.org/Article;>
@@ -341,8 +367,7 @@ object is always returned. For example:
 We can put this all together to build a simple pipeline:
with DAG('my_dag', start_date=datetime(2016, 1, 1)) as dag:
(
-dag
->> DummyOperator(task_id='dummy_1')
+DummyOperator(task_id='dummy_1')
 >> BashOperator(
task_id='bash_1',
bash_command='echo HELLO!')
@@ -452,10 +477,11 @@ variables from the operating system. The environment 
variable needs to be
 prefixed with AIRFLOW_CONN_ to be considered a connection. When
 referencing the connection in the Airflow pipeline, the conn_id should
 be the name of the variable without the prefix. For example, if the conn_id
-is named POSTGRES_MASTER the environment variable should be named
-AIRFLOW_CONN_POSTGRES_MASTER. Airflow assumes the value returned
-from the environment variable to be in a URI format
-(e.g. postgres://user:password@localhost:5432/master).
+is named postgres_master the environment variable should be named
+AIRFLOW_CONN_POSTGRES_MASTER (note that the environment variable must be
+all uppercase). Airflow assumes the value returned from the environment
+variable to be in a URI format (e.g.
+postgres://user:password@localhost:5432/master or s3://accesskey:secretkey@S3).
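The URI convention described above can be illustrated with a short, self-contained sketch; `conn_from_env` is a hypothetical helper for illustration, not Airflow's actual connection parser:

```python
import os
from urllib.parse import urlparse

def conn_from_env(conn_id):
    # Airflow looks up AIRFLOW_CONN_<CONN_ID IN UPPERCASE>; this mirrors
    # that naming convention for illustration only.
    uri = os.environ["AIRFLOW_CONN_" + conn_id.upper()]
    parsed = urlparse(uri)
    return {
        "conn_type": parsed.scheme,
        "login": parsed.username,
        "password": parsed.password,
        "host": parsed.hostname,
        "port": parsed.port,
        "schema": parsed.path.lstrip("/"),
    }

os.environ["AIRFLOW_CONN_POSTGRES_MASTER"] = "postgres://user:password@localhost:5432/master"
conn = conn_from_env("postgres_master")
print(conn["host"], conn["port"], conn["schema"])
```

Referencing the connection as `postgres_master` in a pipeline would resolve to the uppercase environment variable shown here.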
 
 
 Queues¶
@@ -674,6 +700,71 @@ while creating tasks:
 that, when set to True, keeps a task from getting triggered if the
 previous schedule for the task hasn't succeeded.
 
+
+Latest Run Only¶
+Standard workflow behavior involves running a series of tasks for a
+particular date/time range. Some workflows, however, perform tasks that
+are independent of run time but need to be run on a schedule, much like a
+standard cron job. In these cases, backfills or running jobs missed during
+a pause just wastes CPU cycles.
+For situations like this, you can use the LatestOnlyOperator to skip
+tasks that are not being run during the most recent scheduled run for a
+DAG. The LatestOnlyOperator skips all immediate downstream tasks, and
+itself, if the time right now is not between its execution_time and the
+next scheduled execution_time.
+One must be aware of the interaction between skipped tasks and trigger
+rules. Skipped tasks will cascade through trigger rules all_success
+and all_failed, but not all_done, one_failed, one_success, and dummy.
+If you would like to use the LatestOnlyOperator with trigger rules that
+do not cascade skips, you will need to ensure that the
+LatestOnlyOperator is directly upstream of the task you would like to skip.
+It is possible, through use of trigger rules, to mix tasks that should run
+in the typical date/time dependent mode and those using the
+LatestOnlyOperator.
+For example, consider the following DAG:
+#dags/latest_only_with_trigger.py
+import datetime as dt
+
+from airflow.models import DAG
+from airflow.operators.dummy_operator import DummyOperator
+from airflow.operators.latest_only_operator import LatestOnlyOperator
+from airflow.utils.trigger_rule import TriggerRule
+
+
+dag = DAG(
+    dag_id='latest_only_with_trigger',
+    schedule_interval=dt.timedelta(hours=4),
+    start_date=dt.datetime(2016, 9, 20),
+)
+
+latest_only = LatestOnlyOperator(task_id='latest_only', dag=dag)
+
+task1 = DummyOperator(task_id='task1', dag=dag)
+task1.set_upstream(latest_only)
+
+task2 = DummyOperator(task_id='task2', dag=dag)
+
+task3 = DummyOperator(task_id='task3', dag=dag)
+task3.set_upstream([task1, task2])
+
+task4 = DummyOperator(task_id='task4', dag=dag,
+                      trigger_rule=TriggerRule.ALL_DONE)
+task4.set_upstream([task1, task2])
+
+
+In the case of this DAG, the latest_only task will show up as skipped
+for all runs except the latest run. task1 is directly downstream of
+latest_only and will also skip for all runs except the latest.
+task2 is entirely independent of latest_only and will run in all
+scheduled periods.
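The "between its execution_time and the next scheduled execution_time" check described above can be sketched as a self-contained function. This is a simplified illustration of the idea, not Airflow's LatestOnlyOperator implementation:

```python
import datetime as dt

def latest_only_should_run(now, execution_date, schedule_interval):
    # Downstream tasks run only when "now" falls inside the interval of the
    # most recent run: between this run's execution date and the next
    # scheduled one. Otherwise the run is a backfill/catch-up and is skipped.
    return execution_date <= now < execution_date + schedule_interval

interval = dt.timedelta(hours=4)
now = dt.datetime(2016, 9, 21, 10, 0)

# The run covering 08:00-12:00 on the current day is the latest run.
print(latest_only_should_run(now, dt.datetime(2016, 9, 21, 8, 0), interval))
# A run from the previous day is a backfill and would be skipped.
print(latest_only_should_run(now, dt.datetime(2016, 9, 20, 8, 0), interval))
```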

[13/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_modules/sensors.html
--
diff --git a/_modules/sensors.html b/_modules/sensors.html
index 6567e38..f2b1cfe 100644
--- a/_modules/sensors.html
+++ b/_modules/sensors.html
@@ -30,6 +30,9 @@
   
 
   
+
+
 
  
 
@@ -40,6 +43,7 @@
 
 
 
+   
   
 
 
@@ -90,6 +94,8 @@
 Scheduling  Triggers
 Plugins
 Security
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -104,8 +110,10 @@
 
   
   
-
-Airflow
+
+  
+  Airflow
+
   
 
 
@@ -118,19 +126,36 @@
 
 
 
+
+
+
+
+
+
+
+
+
+
 
+
   
-Docs 
-  
+
+  Docs 
+
   Module code 
-  
-sensors
+
+  sensors
+
+
   
 
-  
+
 
   
+
   
+
+  
   
 
   http://schema.org/Article;>
@@ -151,24 +176,27 @@
 # See the License for the specific language governing 
permissions and
 # limitations under the License.
 
-from __future__ import print_function
-from future import standard_library
+from __future__ import print_function
+from future import standard_library
 standard_library.install_aliases()
-from builtins import str
-from past.builtins import basestring
+from builtins import str
+from past.builtins import basestring
 
-from datetime import datetime
+from datetime import datetime
 import logging
-from urllib.parse import urlparse
-from time import sleep
+from urllib.parse import urlparse
+from time import sleep
+import re
+import sys
 
 import airflow
-from airflow import hooks, 
settings
-from airflow.exceptions import AirflowException, AirflowSensorTimeout, AirflowSkipException
-from airflow.models import BaseOperator, TaskInstance, 
Connection as DB
-from airflow.hooks.base_hook 
import BaseHook
-from airflow.utils.state import State
-from airflow.utils.decorators 
import apply_defaults
+from airflow import hooks, 
settings
+from airflow.exceptions import AirflowException, AirflowSensorTimeout, AirflowSkipException
+from airflow.models import BaseOperator, TaskInstance
+from airflow.hooks.base_hook 
import BaseHook
+from airflow.hooks.hdfs_hook 
import HDFSHook
+from airflow.utils.state import State
+from airflow.utils.decorators 
import apply_defaults
 
 
 class BaseSensorOperator(BaseOperator):
@@ -193,7 +221,7 @@
 self,
 poke_interval=60,
 timeout=60*60*24*7,
-soft_fail=False,
+soft_fail=False,
 *args, **kwargs):
 super(BaseSensorOperator, self).__init__(*args, **kwargs)
 self.poke_interval = poke_interval
@@ -245,13 +273,13 @@
 logging.info('Poking: ' + self.sql)
 records = hook.get_records(self.sql)
 if not records:
-return False
+return False
 else:
 if str(records[0][0]) in ('0', ''):
-return False
+return False
 else:
-return True
-print(records[0][0])
+return True
+print(records[0][0])
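The SqlSensor poke criteria in the hunk above can be sketched standalone; `sql_sensor_criteria_met` is a hypothetical helper name used for illustration only:

```python
def sql_sensor_criteria_met(records):
    # Mirrors the SqlSensor.poke logic above: no rows, or a first cell of
    # "0" or empty string, means the criteria are not yet met.
    if not records:
        return False
    return str(records[0][0]) not in ('0', '')

print(sql_sensor_criteria_met([]))        # no rows yet
print(sql_sensor_criteria_met([('0',)]))  # first cell is "0"
print(sql_sensor_criteria_met([(42,)]))   # a real value
```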
 
 
 [docs]class MetastorePartitionSensor(SqlSensor):
@@ -286,13 +314,18 @@
 self.partition_name = partition_name
 self.table = table
 self.schema = schema
-self.first_poke = True
+self.first_poke = True
 self.conn_id = mysql_conn_id
+# TODO(aoen): We shouldn't be using SqlSensor here but MetastorePartitionSensor.
+# The problem is the way apply_defaults works isn't compatible with inheritance.
+# The inheritance model needs to be reworked in order to support overriding args/
+# kwargs with arguments here, then conn_id and sql can be passed into the
+# constructor below and apply_defaults will no longer throw an exception.
 super(SqlSensor, self).__init__(*args, **kwargs)
 
 def poke(self, context):
 if self.first_poke:
-self.first_poke = False
+self.first_poke = False
 if . 
in self.table:
 self.schema, self.table 
= self.table.split(.)
 self.sql = 
@@ -301,9 +334,9 @@
 LEFT OUTER JOIN TBLS B0 ON A0.TBL_ID = B0.TBL_ID
 LEFT OUTER JOIN DBS C0 ON B0.DB_ID = C0.DB_ID
 WHERE
-B0.TBL_NAME = '{self.table}' AND
-C0.NAME = '{self.schema}' AND
-A0.PART_NAME = '{self.partition_name}';
+B0.TBL_NAME = '{self.table}' AND
+C0.NAME = '{self.schema}' AND
+A0.PART_NAME = '{self.partition_name}';
 .format(self=self)
 return super(MetastorePartitionSensor, self).poke(context)
 
@@ -338,13 +371,13 @@
 self,
 external_dag_id,
 external_task_id,
-allowed_states=None,
-execution_delta=None,
-execution_date_fn=None,
+

[15/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_modules/dbapi_hook.html
--
diff --git a/_modules/dbapi_hook.html b/_modules/dbapi_hook.html
index 604..ae37fd3 100644
--- a/_modules/dbapi_hook.html
+++ b/_modules/dbapi_hook.html
@@ -30,6 +30,9 @@
   
 
   
+
+
 
  
 
@@ -40,6 +43,7 @@
 
 
 
+   
   
 
 
@@ -90,6 +94,8 @@
 Scheduling  Triggers
 Plugins
 Security
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -104,8 +110,10 @@
 
   
   
-
-Airflow
+
+  
+  Airflow
+
   
 
 
@@ -118,19 +126,36 @@
 
 
 
+
+
+
+
+
+
+
+
+
+
 
+
   
-Docs 
-  
+
+  Docs 
+
   Module code 
-  
-dbapi_hook
+
+  dbapi_hook
+
+
   
 
-  
+
 
   
+
   
+
+  
   
 
   http://schema.org/Article;>
@@ -151,15 +176,17 @@
 # See the License for the specific language governing 
permissions and
 # limitations under the License.
 
-from builtins import str
-from past.builtins import basestring
-from datetime import datetime
+from builtins import str
+from past.builtins import basestring
+from datetime import datetime
 import numpy
 import logging
 import sys
 
-from airflow.hooks.base_hook 
import BaseHook
-from airflow.exceptions import AirflowException
+from sqlalchemy import create_engine
+
+from airflow.hooks.base_hook 
import BaseHook
+from airflow.exceptions import AirflowException
 
 
 [docs]class DbApiHook(BaseHook):
@@ -167,13 +194,13 @@
 Abstract base class for sql hooks.
 
 # Override to provide the connection name.
-conn_name_attr = None
+conn_name_attr = None
 # Override to have a default connection id for a 
particular dbHook
 default_conn_name = 'default_conn_id'
 # Override if this db supports autocommit.
-supports_autocommit = False
+supports_autocommit = False
 # Override with the object that exposes the connect 
method
-connector = None
+connector = None
 
 def __init__(self, *args, **kwargs):
 if not self.conn_name_attr:
@@ -195,7 +222,23 @@
 username=db.login,
 schema=db.schema)
 
-[docs]def get_pandas_df(self, sql, parameters=None):
+def get_uri(self):
+    conn = self.get_connection(getattr(self, self.conn_name_attr))
+    login = ''
+    if conn.login:
+        login = '{conn.login}:{conn.password}@'.format(conn=conn)
+    host = conn.host
+    if conn.port is not None:
+        host += ':{port}'.format(port=conn.port)
+    return '{conn.conn_type}://{login}{host}/{conn.schema}'.format(
+        conn=conn, login=login, host=host)
+
+def get_sqlalchemy_engine(self, engine_kwargs=None):
+    if engine_kwargs is None:
+        engine_kwargs = {}
+    return create_engine(self.get_uri(), **engine_kwargs)
+
+[docs]def get_pandas_df(self, sql, parameters=None):
 
 Executes the sql and returns a pandas dataframe
 
@@ -207,13 +250,13 @@
 
 if sys.version_info[0] < 3:
 sql = sql.encode('utf-8')
-import pandas.io.sql 
as psql
+import pandas.io.sql 
as psql
 conn = self.get_conn()
 df = psql.read_sql(sql, con=conn, params=parameters)
 conn.close()
 return df
 
-[docs]def get_records(self, sql, parameters=None):
+[docs]def get_records(self, sql, parameters=None):
 
 Executes the sql and returns a set of records.
 
@@ -227,7 +270,7 @@
 sql = sql.encode('utf-8')
 conn = self.get_conn()
 cur = self.get_cursor()
-if parameters is not None:
+if parameters is not None:
 cur.execute(sql, parameters)
 else:
 cur.execute(sql)
@@ -236,7 +279,7 @@
 conn.close()
 return rows
 
-[docs]def get_first(self, sql, parameters=None):
+[docs]def get_first(self, sql, parameters=None):
 
 Executes the sql and returns the first resulting 
row.
 
@@ -250,7 +293,7 @@
 sql = sql.encode('utf-8')
 conn = self.get_conn()
 cur = conn.cursor()
-if parameters is not None:
+if parameters is not None:
 cur.execute(sql, parameters)
 else:
 cur.execute(sql)
@@ -259,7 +302,7 @@
 conn.close()
 return rows
 
-[docs]def run(self, sql, autocommit=False, parameters=None):
+[docs]def run(self, sql, autocommit=False, parameters=None):
 
 Runs a command or a list of commands. Pass a list of 
sql
 statements to the sql parameter to get them to 
execute
@@ -275,7 +318,7 @@
 :type parameters: mapping or iterable
 
 conn = self.get_conn()
-if isinstance(sql, basestring):
+if isinstance(sql, basestring):
 sql = [sql]
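The `get_uri`/`get_sqlalchemy_engine` logic added in the hunk above builds a SQLAlchemy URI from connection fields. A self-contained sketch, with a stand-in `Conn` class instead of Airflow's Connection model:

```python
class Conn:
    # Stand-in carrying the attributes DbApiHook.get_uri reads.
    def __init__(self, conn_type, host, login=None, password=None,
                 port=None, schema=None):
        self.conn_type, self.host = conn_type, host
        self.login, self.password = login, password
        self.port, self.schema = port, schema

def build_uri(conn):
    # Same assembly as the get_uri hunk: optional login:password@,
    # optional :port, then /schema.
    login = ''
    if conn.login:
        login = '{conn.login}:{conn.password}@'.format(conn=conn)
    host = conn.host
    if conn.port is not None:
        host += ':{port}'.format(port=conn.port)
    return '{conn.conn_type}://{login}{host}/{conn.schema}'.format(
        conn=conn, login=login, host=host)

uri = build_uri(Conn('postgres', 'localhost', 'user', 'secret', 5432, 'master'))
print(uri)
```

The resulting string is what `get_sqlalchemy_engine` would pass to SQLAlchemy's `create_engine`.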
 
 

[08/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/fonts/fontawesome-webfont.woff
--
diff --git a/_static/fonts/fontawesome-webfont.woff 
b/_static/fonts/fontawesome-webfont.woff
index 628b6a5..6e7483c 100644
Binary files a/_static/fonts/fontawesome-webfont.woff and 
b/_static/fonts/fontawesome-webfont.woff differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/jquery.js
--
diff --git a/_static/jquery.js b/_static/jquery.js
index ab28a24..f6a6a99 100644
--- a/_static/jquery.js
+++ b/_static/jquery.js
@@ -1,4 +1,4 @@
-/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | 
jquery.org/license */
-!function(a,b){"object"==typeof module&&"object"==typeof 
module.exports?module.exports=a.document?b(a,!0):function(a){if(!a.document)throw
 new Error("jQuery requires a window with a document");return 
b(a)}:b(a)}("undefined"!=typeof window?window:this,function(a,b){var 
c=[],d=c.slice,e=c.concat,f=c.push,g=c.indexOf,h={},i=h.toString,j=h.hasOwnProperty,k={},l="1.11.1",m=function(a,b){return
 new 
m.fn.init(a,b)},n=/^[\s\uFEFF\xA0]+|[\s\uFEFF\xA0]+$/g,o=/^-ms-/,p=/-([\da-z])/gi,q=function(a,b){return
 
b.toUpperCase()};m.fn=m.prototype={jquery:l,constructor:m,selector:"",length:0,toArray:function(){return
 d.call(this)},get:function(a){return 
null!=a?0>a?this[a+this.length]:this[a]:d.call(this)},pushStack:function(a){var 
b=m.merge(this.constructor(),a);return 
b.prevObject=this,b.context=this.context,b},each:function(a,b){return 
m.each(this,a,b)},map:function(a){return 
this.pushStack(m.map(this,function(b,c){return 
a.call(b,c,b)}))},slice:function(){return this.pushStack(d.apply(this,argumen
 ts))},first:function(){return this.eq(0)},last:function(){return 
this.eq(-1)},eq:function(a){var b=this.length,c=+a+(0>a?b:0);return 
this.pushStack(c>=0&>c?[this[c]]:[])},end:function(){return 
this.prevObject||this.constructor(null)},push:f,sort:c.sort,splice:c.splice},m.extend=m.fn.extend=function(){var
 
a,b,c,d,e,f,g=arguments[0]||{},h=1,i=arguments.length,j=!1;for("boolean"==typeof
 g&&(j=g,g=arguments[h]||{},h++),"object"==typeof 
g||m.isFunction(g)||(g={}),h===i&&(g=this,h--);i>h;h++)if(null!=(e=arguments[h]))for(d
 in 
e)a=g[d],c=e[d],g!==c&&(j&&&(m.isPlainObject(c)||(b=m.isArray(c)))?(b?(b=!1,f=a&(a)?a:[]):f=a&(a)?a:{},g[d]=m.extend(j,f,c)):void
 0!==c&&(g[d]=c));return 
g},m.extend({expando:"jQuery"+(l+Math.random()).replace(/\D/g,""),isReady:!0,error:function(a){throw
 new 
Error(a)},noop:function(){},isFunction:function(a){return"function"===m.type(a)},isArray:Array.isArray||function(a){return"array"===m.type(a)},isWindow:function(a){return
 null!=a&=
 
=a.window},isNumeric:function(a){return!m.isArray(a)&(a)>=0},isEmptyObject:function(a){var
 b;for(b in a)return!1;return!0},isPlainObject:function(a){var 
b;if(!a||"object"!==m.type(a)||a.nodeType||m.isWindow(a))return!1;try{if(a.constructor&&!j.call(a,"constructor")&&!j.call(a.constructor.prototype,"isPrototypeOf"))return!1}catch(c){return!1}if(k.ownLast)for(b
 in a)return j.call(a,b);for(b in a);return void 
0===b||j.call(a,b)},type:function(a){return null==a?a+"":"object"==typeof 
a||"function"==typeof a?h[i.call(a)]||"object":typeof 
a},globalEval:function(b){b&(b)&&(a.execScript||function(b){a.eval.call(a,b)})(b)},camelCase:function(a){return
 a.replace(o,"ms-").replace(p,q)},nodeName:function(a,b){return 
a.nodeName&()===b.toLowerCase()},each:function(a,b,c){var
 
d,e=0,f=a.length,g=r(a);if(c){if(g){for(;f>e;e++)if(d=b.apply(a[e],c),d===!1)break}else
 for(e in a)if(d=b.apply(a[e],c),d===!1)break}else 
if(g){for(;f>e;e++)if(d=b.call(a[e],e,a[e]),d
 ===!1)break}else for(e in a)if(d=b.call(a[e],e,a[e]),d===!1)break;return 
a},trim:function(a){return 
null==a?"":(a+"").replace(n,"")},makeArray:function(a,b){var c=b||[];return 
null!=a&&(r(Object(a))?m.merge(c,"string"==typeof 
a?[a]:a):f.call(c,a)),c},inArray:function(a,b,c){var d;if(b){if(g)return 
g.call(b,a,c);for(d=b.length,c=c?0>c?Math.max(0,d+c):c:0;d>c;c++)if(c in 
b&[c]===a)return c}return-1},merge:function(a,b){var 
c=+b.length,d=0,e=a.length;while(c>d)a[e++]=b[d++];if(c!==c)while(void 
0!==b[d])a[e++]=b[d++];return a.length=e,a},grep:function(a,b,c){for(var 
d,e=[],f=0,g=a.length,h=!c;g>f;f++)d=!b(a[f],f),d!==h&(a[f]);return 
e},map:function(a,b,c){var 
d,f=0,g=a.length,h=r(a),i=[];if(h)for(;g>f;f++)d=b(a[f],f,c),null!=d&(d);else
 for(f in a)d=b(a[f],f,c),null!=d&(d);return 
e.apply([],i)},guid:1,proxy:function(a,b){var c,e,f;return"string"==typeof 
b&&(f=a[b],b=a,a=f),m.isFunction(a)?(c=d.call(arguments,2),e=function(){return 
a.apply(b||this,c.concat(d.call(ar
 guments)))},e.guid=a.guid=a.guid||m.guid++,e):void 
0},now:function(){return+new Date},support:k}),m.each("Boolean Number String 
Function Array Date RegExp Object Error".split(" "),function(a,b){h["[object 

[20/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/5e574012/_static/jquery-3.1.0.js
--
diff --git a/_static/jquery-3.1.0.js b/_static/jquery-3.1.0.js
new file mode 100644
index 000..f2fc274
--- /dev/null
+++ b/_static/jquery-3.1.0.js
@@ -0,0 +1,10074 @@
+/*eslint-disable no-unused-vars*/
+/*!
+ * jQuery JavaScript Library v3.1.0
+ * https://jquery.com/
+ *
+ * Includes Sizzle.js
+ * https://sizzlejs.com/
+ *
+ * Copyright jQuery Foundation and other contributors
+ * Released under the MIT license
+ * https://jquery.org/license
+ *
+ * Date: 2016-07-07T21:44Z
+ */
+( function( global, factory ) {
+
+   "use strict";
+
+   if ( typeof module === "object" && typeof module.exports === "object" ) 
{
+
+   // For CommonJS and CommonJS-like environments where a proper 
`window`
+   // is present, execute the factory and get jQuery.
+   // For environments that do not have a `window` with a 
`document`
+   // (such as Node.js), expose a factory as module.exports.
+   // This accentuates the need for the creation of a real 
`window`.
+   // e.g. var jQuery = require("jquery")(window);
+   // See ticket #14549 for more info.
+   module.exports = global.document ?
+   factory( global, true ) :
+   function( w ) {
+   if ( !w.document ) {
+   throw new Error( "jQuery requires a 
window with a document" );
+   }
+   return factory( w );
+   };
+   } else {
+   factory( global );
+   }
+
+// Pass this if window is not defined yet
+} )( typeof window !== "undefined" ? window : this, function( window, noGlobal 
) {
+
+// Edge <= 12 - 13+, Firefox <=18 - 45+, IE 10 - 11, Safari 5.1 - 9+, iOS 6 - 
9.1
+// throw exceptions when non-strict code (e.g., ASP.NET 4.5) accesses strict 
mode
+// arguments.callee.caller (trac-13335). But as of jQuery 3.0 (2016), strict 
mode should be common
+// enough that all such attempts are guarded in a try block.
+"use strict";
+
+var arr = [];
+
+var document = window.document;
+
+var getProto = Object.getPrototypeOf;
+
+var slice = arr.slice;
+
+var concat = arr.concat;
+
+var push = arr.push;
+
+var indexOf = arr.indexOf;
+
+var class2type = {};
+
+var toString = class2type.toString;
+
+var hasOwn = class2type.hasOwnProperty;
+
+var fnToString = hasOwn.toString;
+
+var ObjectFunctionString = fnToString.call( Object );
+
+var support = {};
+
+
+
+   function DOMEval( code, doc ) {
+   doc = doc || document;
+
+   var script = doc.createElement( "script" );
+
+   script.text = code;
+   doc.head.appendChild( script ).parentNode.removeChild( script );
+   }
+/* global Symbol */
+// Defining this global in .eslintrc would create a danger of using the global
+// unguarded in another place, it seems safer to define global only for this 
module
+
+
+
+var
+   version = "3.1.0",
+
+   // Define a local copy of jQuery
+   jQuery = function( selector, context ) {
+
+   // The jQuery object is actually just the init constructor 
'enhanced'
+   // Need init if jQuery is called (just allow error to be thrown 
if not included)
+   return new jQuery.fn.init( selector, context );
+   },
+
+   // Support: Android <=4.0 only
+   // Make sure we trim BOM and NBSP
+   rtrim = /^[\s\uFEFF\xA0]+|[\s\uFEFF\xA0]+$/g,
+
+   // Matches dashed string for camelizing
+   rmsPrefix = /^-ms-/,
+   rdashAlpha = /-([a-z])/g,
+
+   // Used by jQuery.camelCase as callback to replace()
+   fcamelCase = function( all, letter ) {
+   return letter.toUpperCase();
+   };
+
+jQuery.fn = jQuery.prototype = {
+
+   // The current version of jQuery being used
+   jquery: version,
+
+   constructor: jQuery,
+
+   // The default length of a jQuery object is 0
+   length: 0,
+
+   toArray: function() {
+   return slice.call( this );
+   },
+
+   // Get the Nth element in the matched element set OR
+   // Get the whole matched element set as a clean array
+   get: function( num ) {
+   return num != null ?
+
+   // Return just the one element from the set
+   ( num < 0 ? this[ num + this.length ] : this[ num ] ) :
+
+   // Return all the elements in a clean array
+   slice.call( this );
+   },
+
+   // Take an array of elements and push it onto the stack
+   // (returning the new matched element set)
+   pushStack: function( elems ) {
+
+   // Build a new jQuery matched element set
+   var ret = jQuery.merge( 

[11/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/css/theme.css
--
diff --git a/_static/css/theme.css b/_static/css/theme.css
index 7be9339..c1631d8 100644
--- a/_static/css/theme.css
+++ b/_static/css/theme.css
@@ -1,5 +1,5 @@
 
*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}audio:not([controls]){display:none}[hidden]{display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:hover,a:active{outline:0}abbr[title]{border-bottom:1px
 
dotted}b,strong{font-weight:bold}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;color:#000;text-decoration:none}mark{background:#ff0;color:#000;font-style:italic;font-weight:bold}pre,code,.rst-content
 tt,.rst-content 
code,kbd,samp{font-family:monospace,serif;_font-family:"courier 
new",monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:before,q:after{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;ver
 
tical-align:baseline}sup{top:-0.5em}sub{bottom:-0.25em}ul,ol,dl{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure{margin:0}form{margin:0}fieldset{border:0;margin:0;padding:0}label{cursor:pointer}legend{border:0;*margin-left:-7px;padding:0;white-space:normal}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type="button"],input[type="reset"],input[type="submit"]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type="checkbox"],input[type="radio"]{box-sizing:border-box;padding:0;*width:13px;*height:13px}input[type="search"]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}input[type="search"]::-webkit-search-decor
 
ation,input[type="search"]::-webkit-search-cancel-button{-webkit-appearance:none}button::-moz-focus-inner,input::-moz-focus-inner{border:0;padding:0}textarea{overflow:auto;vertical-align:top;resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:0.2em
 0;background:#ccc;color:#000;padding:0.2em 
0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir
 br{display:none}.hidden{display:none 
!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 
0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media
 print{html,body,section{background:none !important}*{box-shadow:n
 one !important;text-shadow:none !important;filter:none 
!important;-ms-filter:none !important}a,a:visited{text-decoration:underline}.ir 
a:after,a[href^="javascript:"]:after,a[href^="#"]:after{content:""}pre,blockquote{page-break-inside:avoid}thead{display:table-header-group}tr,img{page-break-inside:avoid}img{max-width:100%
 !important}@page{margin:0.5cm}p,h2,.rst-content .toctree-wrapper 
p.caption,h3{orphans:3;widows:3}h2,.rst-content .toctree-wrapper 
p.caption,h3{page-break-after:avoid}}.fa:before,.wy-menu-vertical li 
span.toctree-expand:before,.wy-menu-vertical li.on a 
span.toctree-expand:before,.wy-menu-vertical li.current>a 
span.toctree-expand:before,.rst-content .admonition-title:before,.rst-content 
h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 
.headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 
.headerlink:before,.rst-content h6 .headerlink:before,.rst-content dl dt 
.headerlink:before,.rst-content p.caption .headerlink:before,.rst-con
 tent tt.download span:first-child:before,.rst-content code.download 
span:first-child:before,.icon:before,.wy-dropdown 
.caret:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before,.wy-inline-validate.wy-inline-validate-danger 
.wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning 
.wy-input-context:before,.wy-inline-validate.wy-inline-validate-info 
.wy-input-context:before,.wy-alert,.rst-content .note,.rst-content 
.attention,.rst-content .caution,.rst-content .danger,.rst-content 
.error,.rst-content .hint,.rst-content .important,.rst-content 
.tip,.rst-content .warning,.rst-content 

[12/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_modules/ssh_hook.html
--
diff --git a/_modules/ssh_hook.html b/_modules/ssh_hook.html
index e094b62..8da4b35 100644
--- a/_modules/ssh_hook.html
+++ b/_modules/ssh_hook.html
@@ -30,6 +30,9 @@
   
 
   
+
+
 
  
 
@@ -40,6 +43,7 @@
 
 
 
+   
   
 
 
@@ -90,6 +94,8 @@
 Scheduling  Triggers
 Plugins
 Security
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -104,8 +110,10 @@
 
   
   
-
-Airflow
+
+  
+  Airflow
+
   
 
 
@@ -118,19 +126,36 @@
 
 
 
+
+
+
+
+
+
+
+
+
+
 
+
   
-Docs 
-  
+
+  Docs 
+
   Module code 
-  
-ssh_hook
+
+  ssh_hook
+
+
   
 
-  
+
 
   
+
   
+
+  
   
 
   http://schema.org/Article;>
@@ -156,10 +181,10 @@
 #
 # This is a port of Luigis ssh implementation. All 
credits go there.
 import subprocess
-from contextlib import contextmanager
+from contextlib import contextmanager
 
-from airflow.hooks.base_hook 
import BaseHook
-from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook 
import BaseHook
+from airflow.exceptions import AirflowException
 
 import logging
 
@@ -191,11 +216,13 @@
 
 def __init__(self, conn_id='ssh_default'):
 conn = self.get_connection(conn_id)
-self.key_file = conn.extra_dejson.get('key_file', None)
-self.connect_timeout = conn.extra_dejson.get('connect_timeout', None)
-self.no_host_key_check = conn.extra_dejson.get('no_host_key_check', False)
-self.tty = conn.extra_dejson.get('tty', False)
-self.sshpass = conn.extra_dejson.get('sshpass', False)
+self.key_file = conn.extra_dejson.get('key_file', None)
+self.connect_timeout = conn.extra_dejson.get('connect_timeout', None)
+self.tcp_keepalive = conn.extra_dejson.get('tcp_keepalive', False)
+self.server_alive_interval = conn.extra_dejson.get('server_alive_interval', 60)
+self.no_host_key_check = conn.extra_dejson.get('no_host_key_check', False)
+self.tty = conn.extra_dejson.get('tty', False)
+self.sshpass = conn.extra_dejson.get('sshpass', False)
 self.conn = conn
 
 def get_conn(self):
@@ -203,7 +230,7 @@
 
 def _host_ref(self):
 if self.conn.login:
-return '{0}@{1}'.format(self.conn.login, self.conn.host)
+return '{0}@{1}'.format(self.conn.login, self.conn.host)
 else:
 return self.conn.host
 
@@ -218,7 +245,11 @@
 connection_cmd += ['-p', str(self.conn.port)]
 
 if self.connect_timeout:
-connection_cmd += ['-o', 'ConnectionTimeout={}'.format(self.connect_timeout)]
+connection_cmd += ['-o', 'ConnectionTimeout={}'.format(self.connect_timeout)]
+
+if self.tcp_keepalive:
+connection_cmd += ['-o', 'TCPKeepAlive=yes']
+connection_cmd += ['-o', 'ServerAliveInterval={}'.format(self.server_alive_interval)]
 
 if self.no_host_key_check:
 connection_cmd += ['-o', 'UserKnownHostsFile=/dev/null',
@@ -231,7 +262,7 @@
 connection_cmd += ['-t']
 
 connection_cmd += cmd
-logging.debug('SSH cmd: {} '.format(connection_cmd))
+logging.debug('SSH cmd: {} '.format(connection_cmd))
 
 return connection_cmd
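The option assembly in this hunk can be sketched as a standalone function. `build_ssh_options` is a hypothetical name, and the `StrictHostKeyChecking=no` option is an assumption, since the `no_host_key_check` branch is truncated in the diff above:

```python
def build_ssh_options(connect_timeout=None, tcp_keepalive=False,
                      server_alive_interval=60, no_host_key_check=False):
    # Assemble ssh -o options the way the hook hunk above does;
    # illustrative only, not SSHHook itself.
    cmd = []
    if connect_timeout:
        cmd += ['-o', 'ConnectionTimeout={}'.format(connect_timeout)]
    if tcp_keepalive:
        cmd += ['-o', 'TCPKeepAlive=yes']
        cmd += ['-o', 'ServerAliveInterval={}'.format(server_alive_interval)]
    if no_host_key_check:
        # Assumed continuation of the truncated branch in the diff.
        cmd += ['-o', 'UserKnownHostsFile=/dev/null',
                '-o', 'StrictHostKeyChecking=no']
    return cmd

print(build_ssh_options(connect_timeout=10, tcp_keepalive=True))
```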
 
@@ -259,13 +290,13 @@
 
 if p.returncode != 
0:
 # I like this better: 
RemoteCalledProcessError(p.returncode, cmd, self.host, output=output)
-raise AirflowException('Cannot execute {} on {}. Error code is: {}. Output: {}, Stderr: {}'.format(
+raise AirflowException('Cannot execute {} on {}. Error code is: {}. Output: {}, Stderr: {}'.format(
 cmd, self.conn.host, p.returncode, output, stderr))
 
 return output
 
 @contextmanager
-[docs]def tunnel(self, local_port, remote_port=None, remote_host='localhost'):
+[docs]def tunnel(self, local_port, remote_port=None, remote_host='localhost'):
 
 Creates a tunnel between two hosts. Like ssh -L 
LOCAL_PORT:host:REMOTE_PORT.
 Remember to close() the returned tunnel 
object in order to clean up
@@ -279,7 +310,7 @@
 :type remote_host: str
 :return:
 
-tunnel_host = '{0}:{1}:{2}'.format(local_port, remote_host, remote_port)
+tunnel_host = '{0}:{1}:{2}'.format(local_port, remote_host, remote_port)
 proc = self.Popen(['-L', tunnel_host, 'echo -n ready && cat'],
   stdin=subprocess.PIPE, stdout=subprocess.PIPE)
 
@@ -287,11 +318,14 @@
 assert ready == b'ready', 'Did not get ready from remote'
 yield
 proc.communicate()
-assert proc.returncode == 
0, Tunnel 
process did unclean exit (returncode {}.format(proc.returncode)
+assert 

[03/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/plugins.html
--
diff --git a/plugins.html b/plugins.html
index 68c615d..be1cfbb 100644
--- a/plugins.html
+++ b/plugins.html
@@ -30,6 +30,9 @@
   
 
   
+
+
 
 
  
@@ -41,6 +44,7 @@
 
 
 
+   
   
 
 
@@ -97,6 +101,8 @@
 
 
 Security
+Experimental Rest API
+Integration
 FAQ
 API 
Reference
 
@@ -111,8 +117,10 @@
 
   
   
-
-Airflow
+
+  
+  Airflow
+
   
 
 
@@ -121,23 +129,40 @@
 
   
 
- 
+
+
+
+
+
+
+
+
+
+
+
 
 
 
 
+
   
-Docs 
-  
-Plugins
+
+  Docs 
+
+  Plugins
+
+
   
 
-  
- View page source
+
+ View page 
source
   
 
   
+
   
+
+  
   
 
   http://schema.org/Article;>
@@ -226,18 +251,22 @@ definitions in Airflow.
 from airflow.models import BaseOperator
 from airflow.executors.base_executor import BaseExecutor

-# Will show up under airflow.hooks.PluginHook
+# Will show up under airflow.hooks.test_plugin.PluginHook
 class PluginHook(BaseHook):
     pass

-# Will show up under airflow.operators.PluginOperator
+# Will show up under airflow.operators.test_plugin.PluginOperator
 class PluginOperator(BaseOperator):
     pass

-# Will show up under airflow.executors.PluginExecutor
+# Will show up under airflow.executors.test_plugin.PluginExecutor
 class PluginExecutor(BaseExecutor):
     pass

+# Will show up under airflow.macros.test_plugin.plugin_macro
+def plugin_macro():
+    pass
+
 # Creating a flask admin BaseView
 class TestView(BaseView):
     @expose("/")
@@ -262,10 +291,11 @@ definitions in Airflow.
 class AirflowTestPlugin(AirflowPlugin):
     name = "test_plugin"
     operators = [PluginOperator]
-    flask_blueprints = [bp]
     hooks = [PluginHook]
     executors = [PluginExecutor]
+    macros = [plugin_macro]
     admin_views = [v]
+    flask_blueprints = [bp]
     menu_links = [ml]
 
 
[footer hunks elided: Next/Previous pagination links re-rendered and SOURCELINK_SUFFIX: '.txt' added to DOCUMENTATION_OPTIONS]

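The plugins.html diff above documents that plugin classes now surface under per-plugin namespaces such as airflow.operators.test_plugin.PluginOperator instead of airflow.operators.PluginOperator. As a rough illustration of that mechanism (this is NOT Airflow's actual plugin loader, and the helper name integrate is hypothetical), the integration step amounts to building a module named after the plugin and attaching each registered object to it:

```python
import types

class PluginOperator:      # stand-in for the operator defined in the plugin
    pass

def plugin_macro():        # stand-in for the macro defined in the plugin
    pass

def integrate(plugin_name, objects, parent="airflow.operators"):
    # Build a module named after the plugin and attach each registered
    # object, mirroring the airflow.operators.<plugin name>.<class> layout.
    mod = types.ModuleType(parent + "." + plugin_name)
    for obj in objects:
        setattr(mod, obj.__name__, obj)
    return mod

test_plugin = integrate("test_plugin", [PluginOperator])
# test_plugin.PluginOperator is now reachable under the plugin's own name
```

Namespacing by plugin name is what lets two plugins register a class with the same name without colliding, which is the motivation behind the doc change in this commit.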
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/profiling.html
--
diff --git a/profiling.html b/profiling.html
index 795ec81..350c015 100644
--- a/profiling.html
+++ b/profiling.html
[theme, navigation, and footer hunks elided: same theme update as plugins.html, adding "Experimental Rest API" and "Integration" sidebar entries, reworked breadcrumb markup, and SOURCELINK_SUFFIX: '.txt' in DOCUMENTATION_OPTIONS]
 
   

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/project.html
--
diff --git a/project.html b/project.html
index d2eb7f2..1bc09d0 100644
--- a/project.html
+++ b/project.html
[theme and navigation hunks elided: "Experimental Rest API" and "Integration" sidebar entries plus reworked breadcrumb markup; the message is truncated here]

[04/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/genindex.html
--
diff --git a/genindex.html b/genindex.html
index d4a80fb..76acd11 100644
--- a/genindex.html
+++ b/genindex.html
[theme, navigation, and index hunks elided: the sidebar gains "Experimental Rest API" and "Integration"; the alphabetical index is regenerated in a more compact markup, the Q and V letter sections are dropped, and the A, B, and C entries (add_task(), the airflow.* modules, BaseOperator, bulk_dump()/bulk_load(), CeleryExecutor, the check_for_*() hooks, and so on) are reflowed; the message is truncated here]

[07/22] incubator-airflow-site git commit: Latest docs version as of 1.8.x

2017-03-06 Thread maximebeauchemin
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/js/theme.js
--
diff --git a/_static/js/theme.js b/_static/js/theme.js
index 48a9f06..af661a9 100644
--- a/_static/js/theme.js
+++ b/_static/js/theme.js
@@ -13,33 +13,36 @@ function ThemeNav () {
 winPosition: 0,
 winHeight: null,
 docHeight: null,
-isRunning: null
+isRunning: false
 };
 
 nav.enable = function () {
 var self = this;
 
-jQuery(function ($) {
-self.init($);
-
-self.reset();
-self.win.on('hashchange', self.reset);
-
-// Set scroll monitor
-self.win.on('scroll', function () {
-if (!self.linkScroll) {
-self.winScroll = true;
-}
-});
-setInterval(function () { if (self.winScroll) self.onScroll(); }, 25);
-
-// Set resize monitor
-self.win.on('resize', function () {
-self.winResize = true;
+if (!self.isRunning) {
+self.isRunning = true;
+jQuery(function ($) {
+self.init($);
+
+self.reset();
+self.win.on('hashchange', self.reset);
+
+// Set scroll monitor
+self.win.on('scroll', function () {
+if (!self.linkScroll) {
+self.winScroll = true;
+}
+});
+setInterval(function () { if (self.winScroll) self.onScroll(); }, 25);
+
+// Set resize monitor
+self.win.on('resize', function () {
+self.winResize = true;
+});
+setInterval(function () { if (self.winResize) self.onResize(); }, 25);
+self.onResize();
 });
-setInterval(function () { if (self.winResize) self.onResize(); }, 25);
-self.onResize();
-});
+};
 };
 
 nav.init = function ($) {
@@ -95,6 +98,19 @@ function ThemeNav () {
 try {
 var link = $('.wy-menu-vertical')
 .find('[href="' + anchor + '"]');
+// If we didn't find a link, it may be because we clicked on
+// something that is not in the sidebar (eg: when using
+// sphinxcontrib.httpdomain it generates headerlinks but those
+// aren't picked up and placed in the toctree). So let's find
+// the closest header in the document and try with that one.
+if (link.length === 0) {
+  var doc_link = $('.document a[href="' + anchor + '"]');
+  var closest_section = doc_link.closest('div.section');
+  // Try again with the closest section entry.
+  link = $('.wy-menu-vertical')
+.find('[href="#' + closest_section.attr("id") + '"]');
+
+}
 $('.wy-menu-vertical li.toctree-l1 li.current')
 .removeClass('current');
 link.closest('li.toctree-l2').addClass('current');

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/minus.png
--
diff --git a/_static/minus.png b/_static/minus.png
index 0f22b16..d96755f 100644
Binary files a/_static/minus.png and b/_static/minus.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/plus.png
--
diff --git a/_static/plus.png b/_static/plus.png
index 0cfe084..7107cec 100644
Binary files a/_static/plus.png and b/_static/plus.png differ

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/9c75ee9e/_static/searchtools.js
--
diff --git a/_static/searchtools.js b/_static/searchtools.js
index 066857c..bbfb3ac 100644
--- a/_static/searchtools.js
+++ b/_static/searchtools.js
@@ -226,6 +226,106 @@ var Scorer = {
 };
 
 
+
+
+
+var splitChars = (function() {
+    var result = {};
+    var singles = [96, 180, 187, 191, 215, 247, 749, 885, 903, 907, 909, 930, 1014, 1648,
+        1748, 1809, 2416, 2473, 2481, 2526, 2601, 2609, 2612, 2615, 2653, 2702,
+        2706, 2729, 2737, 2740, 2857, 2865, 2868, 2910, 2928, 2948, 2961, 2971,
+        2973, 3085, 3089, 3113, 3124, 3213, 3217, 3241, 3252, 3295, 3341, 3345,
+        3369, 3506, 3516, 3633, 3715, 3721, 3736, 3744, 3748, 3750, 3756, 3761,
+        3781, 3912, 4239, 4347, 4681, 4695, 4697, 4745, 4785, 4799, 4801, 4823,
+        4881, 5760, 5901, 5997, 6313, 7405, 8024, 8026, 8028, 8030, 8117, 8125,
+        8133, 8181, 8468, 8485, 8487, 8489, 8494, 8527, 11311, 11359, 11687, 11695,
+        11703,