[GitHub] codecov-io edited a comment on issue #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3733: [AIRFLOW-491] Add cache parameter 
in BigQuery query method - with 'api_resource_configs'
URL: 
https://github.com/apache/incubator-airflow/pull/3733#issuecomment-413105867
 
 
   # [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=h1) Report
   > Merging 
[#3733](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/ac9033db0981ae1f770a8bdb5597055751ab15bd?src=pr=desc)
 will **decrease** coverage by `0.32%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3733/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master    #3733      +/-   ##
   ==========================================
   - Coverage   77.41%   77.08%   -0.33%
   ==========================================
     Files         203      203
     Lines       15817    15817
   ==========================================
   - Hits        12244    12192      -52
   - Misses       3573     3625      +52
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/hooks/hdfs\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9oZGZzX2hvb2sucHk=)
 | `27.5% <0%> (-65%)` | :arrow_down: |
   | 
[airflow/utils/decorators.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9kZWNvcmF0b3JzLnB5)
 | `85.41% <0%> (-6.25%)` | :arrow_down: |
   | 
[airflow/utils/sqlalchemy.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy91dGlscy9zcWxhbGNoZW15LnB5)
 | `75.71% <0%> (-5.72%)` | :arrow_down: |
   | 
[airflow/hooks/dbapi\_hook.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy9ob29rcy9kYmFwaV9ob29rLnB5)
 | `79.67% <0%> (-3.26%)` | :arrow_down: |
   | 
[airflow/task/task\_runner/base\_task\_runner.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy90YXNrL3Rhc2tfcnVubmVyL2Jhc2VfdGFza19ydW5uZXIucHk=)
 | `77.96% <0%> (-1.7%)` | :arrow_down: |
   | 
[airflow/operators/docker\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy9vcGVyYXRvcnMvZG9ja2VyX29wZXJhdG9yLnB5)
 | `96.51% <0%> (-1.17%)` | :arrow_down: |
   | 
[airflow/www\_rbac/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy93d3dfcmJhYy9hcHAucHk=)
 | `96.66% <0%> (-1.12%)` | :arrow_down: |
   | 
[airflow/configuration.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy9jb25maWd1cmF0aW9uLnB5)
 | `82.96% <0%> (-1.12%)` | :arrow_down: |
   | 
[airflow/www/app.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy93d3cvYXBwLnB5)
 | `98.97% <0%> (-1.03%)` | :arrow_down: |
   | 
[airflow/bin/cli.py](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree#diff-YWlyZmxvdy9iaW4vY2xpLnB5)
 | `64.53% <0%> (-0.26%)` | :arrow_down: |
   | ... and [3 
more](https://codecov.io/gh/apache/incubator-airflow/pull/3733/diff?src=pr=tree-more)
 | |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=footer).
 Last update 
[ac9033d...ca276d7](https://codecov.io/gh/apache/incubator-airflow/pull/3733?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash 
and next_ds_nodash macro
URL: 
https://github.com/apache/incubator-airflow/pull/3821#issuecomment-417186769
 
 
   # [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=h1) Report
   > Merging 
[#3821](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/ac9033db0981ae1f770a8bdb5597055751ab15bd?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3821/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3821      +/-   ##
   =========================================
   + Coverage   77.41%   77.41%   +<.01%
   =========================================
     Files         203      203
     Lines       15817    15821       +4
   =========================================
   + Hits        12244    12248       +4
     Misses       3573     3573
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/3821/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `88.8% <100%> (+0.01%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=footer).
 Last update 
[ac9033d...317d3d2](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] codecov-io commented on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread GitBox
codecov-io commented on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash and 
next_ds_nodash macro
URL: 
https://github.com/apache/incubator-airflow/pull/3821#issuecomment-417186769
 
 
   # [Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=h1) Report
   > Merging 
[#3821](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/ac9033db0981ae1f770a8bdb5597055751ab15bd?src=pr=desc)
 will **increase** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3821/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=tree)
   
   ```diff
   @@            Coverage Diff            @@
   ##           master    #3821      +/-   ##
   =========================================
   + Coverage   77.41%   77.41%   +<.01%
   =========================================
     Files         203      203
     Lines       15817    15821       +4
   =========================================
   + Hits        12244    12248       +4
     Misses       3573     3573
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/3821/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `88.8% <100%> (+0.01%)` | :arrow_up: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=footer).
 Last update 
[ac9033d...317d3d2](https://codecov.io/gh/apache/incubator-airflow/pull/3821?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   




[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213896240
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -566,95 +612,108 @@ def run_query(self,
                   'Airflow.',
                   category=DeprecationWarning)
 
-        if sql is None:
-            raise TypeError('`BigQueryBaseCursor.run_query` missing 1 required '
-                            'positional argument: `sql`')
+        if not sql and not configuration['query'].get('query', None):
+            raise TypeError('`BigQueryBaseCursor.run_query` '
+                            'missing 1 required positional argument: `sql`')
+
+        # BigQuery also allows you to define how you want a table's schema
+        # to change as a side effect of a query job for more details:
+        # https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
 
-        # BigQuery also allows you to define how you want a table's schema to change
-        # as a side effect of a query job
-        # for more details:
-        #   https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
         allowed_schema_update_options = [
             'ALLOW_FIELD_ADDITION', "ALLOW_FIELD_RELAXATION"
         ]
-        if not set(allowed_schema_update_options).issuperset(
-                set(schema_update_options)):
-            raise ValueError(
-                "{0} contains invalid schema update options. "
-                "Please only use one or more of the following options: {1}"
-                .format(schema_update_options, allowed_schema_update_options))
 
-        if use_legacy_sql is None:
-            use_legacy_sql = self.use_legacy_sql
+        if not set(allowed_schema_update_options
+                   ).issuperset(set(schema_update_options)):
+            raise ValueError("{0} contains invalid schema update options. "
+                             "Please only use one or more of the following "
+                             "options: {1}"
+                             .format(schema_update_options,
+                                     allowed_schema_update_options))
 
-        configuration = {
-            'query': {
-                'query': sql,
-                'useLegacySql': use_legacy_sql,
-                'maximumBillingTier': maximum_billing_tier,
-                'maximumBytesBilled': maximum_bytes_billed,
-                'priority': priority
-            }
-        }
+        if schema_update_options:
+            if write_disposition not in ["WRITE_APPEND", "WRITE_TRUNCATE"]:
+                raise ValueError("schema_update_options is only "
+                                 "allowed if write_disposition is "
+                                 "'WRITE_APPEND' or 'WRITE_TRUNCATE'.")
 
         if destination_dataset_table:
-            if '.' not in destination_dataset_table:
-                raise ValueError(
-                    'Expected destination_dataset_table name in the format of '
-                    '<dataset>.<table>. Got: {}'.format(
-                        destination_dataset_table))
             destination_project, destination_dataset, destination_table = \
                 _split_tablename(table_input=destination_dataset_table,
                                  default_project_id=self.project_id)
-            configuration['query'].update({
-                'allowLargeResults': allow_large_results,
-                'flattenResults': flatten_results,
-                'writeDisposition': write_disposition,
-                'createDisposition': create_disposition,
-                'destinationTable': {
-                    'projectId': destination_project,
-                    'datasetId': destination_dataset,
-                    'tableId': destination_table,
-                }
-            })
-        if udf_config:
-            if not isinstance(udf_config, list):
-                raise TypeError("udf_config argument must have a type 'list'"
-                                " not {}".format(type(udf_config)))
-            configuration['query'].update({
-                'userDefinedFunctionResources': udf_config
-            })
 
-        if query_params:
-            if self.use_legacy_sql:
-                raise ValueError("Query parameters are not allowed when using "
-                                 "legacy SQL")
-            else:
-                configuration['query']['queryParameters'] = query_params
+            destination_dataset_table = {
+                'projectId': destination_project,
+                'datasetId': destination_dataset,
+                'tableId': destination_table,
+            }
 
-        if labels:
-            configuration['labels'] = labels
+        query_param_list = [
+            (sql, 
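
For context on the change under review, here is a minimal usage sketch of the caching knob this PR discusses. It assumes the `api_resource_configs` parameter named in the PR title and BigQuery's `useQueryCache` field; the exact merged signature may differ:

```python
from airflow.contrib.hooks.bigquery_hook import BigQueryHook

# Assumed shape: api_resource_configs passes raw jobs.query fields
# (e.g. useQueryCache) through to the BigQuery API, instead of the
# hook growing one keyword argument per API field.
hook = BigQueryHook(bigquery_conn_id='bigquery_default')
cursor = hook.get_conn().cursor()
cursor.run_query(
    sql='SELECT 1',
    api_resource_configs={'query': {'useQueryCache': False}},
)
```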

[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213896015
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1114,18 +1184,18 @@ def get_schema(self, dataset_id, table_id):
         :param table_id: the table ID of the requested table
         :return: a table schema
         """
-        tables_resource = self.service.tables() \
-            .get(projectId=self.project_id, datasetId=dataset_id, tableId=table_id) \
-            .execute()
+        tables_resource = self.service.tables().get(
+            projectId=self.project_id, datasetId=dataset_id,
+            tableId=table_id).execute()
         return tables_resource['schema']
 
     def get_tabledata(self, dataset_id, table_id,
                       max_results=None, selected_fields=None, page_token=None,
                       start_index=None):
         """
-        Get the data of a given dataset.table and optionally with selected columns.
-        see https://cloud.google.com/bigquery/docs/reference/v2/tabledata/list
-
+        Get the data of a given dataset.table and optionally
+        with selected columns. see:
+        https://cloud.google.com/bigquery/docs/reference/v2/tabledata/list
 
 Review comment:
   fixed




[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213894129
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -566,95 +612,108 @@ def run_query(self,
   'Airflow.',
   category=DeprecationWarning)
 
-if sql is None:
-raise TypeError('`BigQueryBaseCursor.run_query` missing 1 required 
'
-'positional argument: `sql`')
+if not sql and not configuration['query'].get('query', None):
+raise TypeError('`BigQueryBaseCursor.run_query` '
+'missing 1 required positional argument: `sql`')
+
+# BigQuery also allows you to define how you want a table's schema
+# to change as a side effect of a query job for more details:
+# 
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
 
-# BigQuery also allows you to define how you want a table's schema to 
change
-# as a side effect of a query job
-# for more details:
-#   
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
 allowed_schema_update_options = [
 'ALLOW_FIELD_ADDITION', "ALLOW_FIELD_RELAXATION"
 ]
-if not set(allowed_schema_update_options).issuperset(
-set(schema_update_options)):
-raise ValueError(
-"{0} contains invalid schema update options. "
-"Please only use one or more of the following options: {1}"
-.format(schema_update_options, allowed_schema_update_options))
 
-if use_legacy_sql is None:
-use_legacy_sql = self.use_legacy_sql
+if not set(allowed_schema_update_options
+   ).issuperset(set(schema_update_options)):
+raise ValueError("{0} contains invalid schema update options. "
+ "Please only use one or more of the following "
+ "options: {1}"
+ .format(schema_update_options,
+ allowed_schema_update_options))
 
-configuration = {
-'query': {
-'query': sql,
-'useLegacySql': use_legacy_sql,
-'maximumBillingTier': maximum_billing_tier,
-'maximumBytesBilled': maximum_bytes_billed,
-'priority': priority
-}
-}
+if schema_update_options:
+if write_disposition not in ["WRITE_APPEND", "WRITE_TRUNCATE"]:
+raise ValueError("schema_update_options is only "
+ "allowed if write_disposition is "
+ "'WRITE_APPEND' or 'WRITE_TRUNCATE'.")
 
 if destination_dataset_table:
-if '.' not in destination_dataset_table:
-raise ValueError(
-'Expected destination_dataset_table name in the format of '
-'.. Got: {}'.format(
-destination_dataset_table))
 destination_project, destination_dataset, destination_table = \
 _split_tablename(table_input=destination_dataset_table,
  default_project_id=self.project_id)
-configuration['query'].update({
-'allowLargeResults': allow_large_results,
-'flattenResults': flatten_results,
-'writeDisposition': write_disposition,
-'createDisposition': create_disposition,
-'destinationTable': {
-'projectId': destination_project,
-'datasetId': destination_dataset,
-'tableId': destination_table,
-}
-})
-if udf_config:
-if not isinstance(udf_config, list):
-raise TypeError("udf_config argument must have a type 'list'"
-" not {}".format(type(udf_config)))
-configuration['query'].update({
-'userDefinedFunctionResources': udf_config
-})
 
-if query_params:
-if self.use_legacy_sql:
-raise ValueError("Query parameters are not allowed when using "
- "legacy SQL")
-else:
-configuration['query']['queryParameters'] = query_params
+destination_dataset_table = {
+'projectId': destination_project,
+'datasetId': destination_dataset,
+'tableId': destination_table,
+}
 
-if labels:
-configuration['labels'] = labels
+query_param_list = [
+(sql, 

[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213894206
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1114,18 +1184,18 @@ def get_schema(self, dataset_id, table_id):
         :param table_id: the table ID of the requested table
         :return: a table schema
         """
-        tables_resource = self.service.tables() \
-            .get(projectId=self.project_id, datasetId=dataset_id, tableId=table_id) \
 
 Review comment:
   Just saw that max-line-length = 90 in Airflow; removed the changes to
   line 79.
   
   I will be more careful next time.




[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213895073
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -220,26 +229,29 @@ def create_empty_table(self,
         :type table_id: str
         :param schema_fields: If set, the schema field list as defined here:
             https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.load.schema
-        :param labels: a dictionary containing labels for the table, passed to BigQuery
+        :param labels: a dictionary containing labels for the table,
+            passed to BigQuery
         :type labels: dict
 
         **Example**: ::
 
-            schema_fields=[{"name": "emp_name", "type": "STRING", "mode": "REQUIRED"},
-                           {"name": "salary", "type": "INTEGER", "mode": "NULLABLE"}]
+            schema_fields=[
+                {"name": "emp_name", "type": "STRING", "mode": "REQUIRED"},
+                {"name": "salary", "type": "INTEGER", "mode": "NULLABLE"}
+            ]
 
         :type schema_fields: list
-        :param time_partitioning: configure optional time partitioning fields i.e.
-            partition by field, type and expiration as per API specifications.
+        :param time_partitioning: configure optional time partitioning
+            fields i.e. partition by field, type and expiration
+            as per API specifications.
 
         .. seealso::
             https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#timePartitioning
         :type time_partitioning: dict
 
         :return:
         """
-        if time_partitioning is None:
 
 Review comment:
   This is about DRY: the check was duplicated in several places in the code,
   and the only place where the type of the local var time_partitioning is
   critical is the function _cleanse_time_partitioning, so I moved the check
   inside that function to avoid duplicating it (see the sketch below).
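
A minimal sketch of that DRY move, assuming the rough shape of `_cleanse_time_partitioning` from this PR (illustrative, not the exact diff):

```python
def _cleanse_time_partitioning(destination_dataset_table, time_partitioning_in):
    # Normalize the None default once, here, instead of repeating
    # `if time_partitioning is None: ...` at every call site.
    if time_partitioning_in is None:
        time_partitioning_in = {}
    time_partitioning_out = {}
    # A '$' in the table name denotes a partition decorator
    # (dataset.table$YYYYMMDD), which implies day partitioning.
    if destination_dataset_table and '$' in destination_dataset_table:
        time_partitioning_out['type'] = 'DAY'
    time_partitioning_out.update(time_partitioning_in)
    return time_partitioning_out
```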




[GitHub] r39132 commented on issue #3366: [AIRFLOW-2476] allow tabulate up to 0.8.2

2018-08-29 Thread GitBox
r39132 commented on issue #3366: [AIRFLOW-2476] allow tabulate up to 0.8.2
URL: 
https://github.com/apache/incubator-airflow/pull/3366#issuecomment-417164457
 
 
   @Tagar kindly rebase




[GitHub] gerardo commented on a change in pull request #3805: [AIRFLOW-2062] Add per-connection KMS encryption.

2018-08-29 Thread GitBox
gerardo commented on a change in pull request #3805: [AIRFLOW-2062] Add 
per-connection KMS encryption.
URL: https://github.com/apache/incubator-airflow/pull/3805#discussion_r213878647
 
 

 ##
 File path: airflow/bin/cli.py
 ##
 @@ -1785,6 +1792,15 @@ class CLIFactory(object):
         ('--conn_extra',),
         help='Connection `Extra` field, optional when adding a connection',
         type=str),
+    'kms_conn_id': Arg(
+        ('--kms_conn_id',),
+        help='An existing connection to use when encrpting this connection with a '
 
 Review comment:
   There's a typo here




[GitHub] gerardo commented on issue #3805: [AIRFLOW-2062] Add per-connection KMS encryption.

2018-08-29 Thread GitBox
gerardo commented on issue #3805: [AIRFLOW-2062] Add per-connection KMS 
encryption.
URL: 
https://github.com/apache/incubator-airflow/pull/3805#issuecomment-417157895
 
 
   @jakahn this PR makes things pretty confusing, as AWS has a product called 
[KMS](https://aws.amazon.com/kms/) as well, which has been around for longer. 
The docs you submitted also assume Google Cloud KMS is the _only_ thing called 
KMS.
   
   I still like the idea, but the implementation is very specific to Google 
Cloud and hard to reuse for other implementations. Is there a way you could 
make it more generic?
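
One possible shape for the more generic interface suggested here, with hypothetical names that are not from the PR: a provider-agnostic base hook that Google Cloud KMS, AWS KMS, and other backends could each implement.

```python
class BaseKmsHook(object):
    """Hypothetical provider-agnostic KMS interface; concrete hooks for
    Google Cloud KMS, AWS KMS, etc. would each implement these methods."""

    def encrypt(self, plaintext):
        """Return ciphertext bytes for the given plaintext bytes."""
        raise NotImplementedError()

    def decrypt(self, ciphertext):
        """Return plaintext bytes for the given ciphertext bytes."""
        raise NotImplementedError()
```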




[GitHub] r39132 edited a comment on issue #3656: [AIRFLOW-2803] Fix all ESLint issues

2018-08-29 Thread GitBox
r39132 edited a comment on issue #3656: [AIRFLOW-2803] Fix all ESLint issues
URL: 
https://github.com/apache/incubator-airflow/pull/3656#issuecomment-416415615
 
 
   @tedmiston Please resolve conflicts and squash your commits. @verdan thx for 
your feedback. It would be great if you could do one more pass after the rebase 
& squash! Are the steps you outline above (i.e. `npm run lint`) still valid? 
Thx for your work on this btw!
   




[GitHub] r39132 commented on issue #3656: [AIRFLOW-2803] Fix all ESLint issues

2018-08-29 Thread GitBox
r39132 commented on issue #3656: [AIRFLOW-2803] Fix all ESLint issues
URL: 
https://github.com/apache/incubator-airflow/pull/3656#issuecomment-417149187
 
 
   @tedmiston any updates?




[GitHub] feng-tao commented on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread GitBox
feng-tao commented on issue #3821: [AIRFLOW-2983] Add prev_ds_nodash and 
next_ds_nodash macro
URL: 
https://github.com/apache/incubator-airflow/pull/3821#issuecomment-417147572
 
 
   PTAL @Fokko @kaxil 




[GitHub] feng-tao opened a new pull request #3821: [AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread GitBox
feng-tao opened a new pull request #3821: [AIRFLOW-2983] Add prev_ds_nodash and 
next_ds_nodash macro
URL: https://github.com/apache/incubator-airflow/pull/3821
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-2983
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   Similar to https://github.com/apache/incubator-airflow/pull/3418, but 
introduce two nodash macros:
   {{ prev_ds_nodash }}: the previous execution date as YYYYMMDD
   {{ next_ds_nodash }}: the next execution date as YYYYMMDD
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
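
For illustration, a minimal DAG using the proposed macros in a templated command (`ds_nodash` already exists; rendered values assume a daily schedule):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG('nodash_macro_demo',
          start_date=datetime(2018, 8, 1),
          schedule_interval='@daily')

# For execution date 2018-08-29 this renders as:
# "20180828 20180829 20180830"
task = BashOperator(
    task_id='print_nodash_dates',
    bash_command='echo {{ prev_ds_nodash }} {{ ds_nodash }} {{ next_ds_nodash }}',
    dag=dag)
```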
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   




[jira] [Commented] (AIRFLOW-2983) Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596948#comment-16596948
 ] 

ASF GitHub Bot commented on AIRFLOW-2983:
-

feng-tao opened a new pull request #3821: [AIRFLOW-2983] Add prev_ds_nodash and 
next_ds_nodash macro
URL: https://github.com/apache/incubator-airflow/pull/3821
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-2983
 - In case you are fixing a typo in the documentation you can prepend your 
commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   Similar to https://github.com/apache/incubator-airflow/pull/3418, but 
introduce two nodash macros:
   {{ prev_ds_nodash }}: the previous execution date as YYYYMMDD
   {{ next_ds_nodash }}: the next execution date as YYYYMMDD
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   




> Add prev_ds_nodash and next_ds_nodash macro 
> 
>
> Key: AIRFLOW-2983
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2983
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Tao Feng
>Assignee: Tao Feng
>Priority: Minor
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Closed] (AIRFLOW-2980) Missing operators in the docs

2018-08-29 Thread Siddharth Anand (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2980?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Siddharth Anand closed AIRFLOW-2980.

Resolution: Fixed

> Missing operators in the docs
> -
>
> Key: AIRFLOW-2980
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2980
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
>
> Some of the operators are missing from the API reference
> part of the docs (HiveOperator for instance). 





[GitHub] r39132 closed pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
r39132 closed pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API 
Reference
URL: https://github.com/apache/incubator-airflow/pull/3818
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/.readthedocs.yml b/.readthedocs.yml
index c6a4da8d69..87fb227d81 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -19,7 +19,14 @@
 python:
   pip_install: true
   extra_requirements:
+  - all_dbs
+  - databricks
   - doc
   - docker
-  - gcp_api
   - emr
+  - gcp_api
+  - s3
+  - salesforce
+  - sendgrid
+  - ssh
+  - slack


 




[jira] [Commented] (AIRFLOW-2980) Missing operators in the docs

2018-08-29 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596947#comment-16596947
 ] 

ASF GitHub Bot commented on AIRFLOW-2980:
-

r39132 closed pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API 
Reference
URL: https://github.com/apache/incubator-airflow/pull/3818
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/.readthedocs.yml b/.readthedocs.yml
index c6a4da8d69..87fb227d81 100644
--- a/.readthedocs.yml
+++ b/.readthedocs.yml
@@ -19,7 +19,14 @@
 python:
   pip_install: true
   extra_requirements:
+  - all_dbs
+  - databricks
   - doc
   - docker
-  - gcp_api
   - emr
+  - gcp_api
+  - s3
+  - salesforce
+  - sendgrid
+  - ssh
+  - slack


 




> Missing operators in the docs
> -
>
> Key: AIRFLOW-2980
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2980
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
>
> Some of the operators are missing from the API reference
> part of the docs (HiveOperator for instance). 





[GitHub] r39132 closed pull request #3819: [AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

2018-08-29 Thread GitBox
r39132 closed pull request #3819: [AIRFLOW-XXX] Fix Broken Link in 
CONTRIBUTING.md
URL: https://github.com/apache/incubator-airflow/pull/3819
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

As this is a foreign pull request (from a fork), the diff is supplied
below (as it won't show otherwise due to GitHub magic):

diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index beaf609b5a..152d5d9aab 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -88,7 +88,7 @@ There are three ways to setup an Apache Airflow development environment.
 1. Using tools and libraries installed directly on your system.
 
   Install Python (2.7.x or 3.4.x), MySQL, and libxml by using system-level package
-  managers like yum, apt-get for Linux, or Homebrew for Mac OS at first. Refer to the [base CI Dockerfile](https://github.com/apache/incubator-airflow-ci/blob/master/Dockerfile.base) for
+  managers like yum, apt-get for Linux, or Homebrew for Mac OS at first. Refer to the [base CI Dockerfile](https://github.com/apache/incubator-airflow-ci/blob/master/Dockerfile) for
   a comprehensive list of required packages.
 
   Then install python development requirements. It is usually best to work in a virtualenv:


 




[jira] [Created] (AIRFLOW-2983) Add prev_ds_nodash and next_ds_nodash macro

2018-08-29 Thread Tao Feng (JIRA)
Tao Feng created AIRFLOW-2983:
-

 Summary: Add prev_ds_nodash and next_ds_nodash macro 
 Key: AIRFLOW-2983
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2983
 Project: Apache Airflow
  Issue Type: Bug
Reporter: Tao Feng
Assignee: Tao Feng








[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213867433
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -536,24 +569,37 @@ def run_query(self,
         :param labels a dictionary containing labels for the job/query,
             passed to BigQuery
         :type labels: dict
-        :param schema_update_options: Allows the schema of the desitination
+        :param schema_update_options: Allows the schema of the destination
             table to be updated as a side effect of the query job.
         :type schema_update_options: tuple
         :param priority: Specifies a priority for the query.
             Possible values include INTERACTIVE and BATCH.
             The default value is INTERACTIVE.
         :type priority: string
-        :param time_partitioning: configure optional time partitioning fields i.e.
+        :param time_partitioning: configure optional time partitioning
+            fields i.e.
             partition by field, type and
-            expiration as per API specifications. Note that 'field' is not available in
-            conjunction with dataset.table$partition.
+            expiration as per API specifications.
+            Note that 'field' is not available in conjunction
 
 Review comment:
   Indentation




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213868118
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -840,43 +902,50 @@ def run_load(self,
         :type source_uris: list
         :param source_format: File format to export.
         :type source_format: string
-        :param create_disposition: The create disposition if the table doesn't exist.
+        :param create_disposition: The create disposition if the table
+            doesn't exist.
         :type create_disposition: string
-        :param skip_leading_rows: Number of rows to skip when loading from a CSV.
+        :param skip_leading_rows: Number of rows to skip when loading
+            from a CSV.
         :type skip_leading_rows: int
-        :param write_disposition: The write disposition if the table already exists.
+        :param write_disposition: The write disposition if the table
+            already exists.
         :type write_disposition: string
-        :param field_delimiter: The delimiter to use when loading from a CSV.
+        :param field_delimiter: The delimiter to use when loading
+            from a CSV.
 
 Review comment:
   indentation




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213868479
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1114,18 +1184,18 @@ def get_schema(self, dataset_id, table_id):
         :param table_id: the table ID of the requested table
         :return: a table schema
         """
-        tables_resource = self.service.tables() \
-            .get(projectId=self.project_id, datasetId=dataset_id, tableId=table_id) \
-            .execute()
+        tables_resource = self.service.tables().get(
+            projectId=self.project_id, datasetId=dataset_id,
+            tableId=table_id).execute()
         return tables_resource['schema']
 
     def get_tabledata(self, dataset_id, table_id,
                       max_results=None, selected_fields=None, page_token=None,
                       start_index=None):
         """
-        Get the data of a given dataset.table and optionally with selected columns.
-        see https://cloud.google.com/bigquery/docs/reference/v2/tabledata/list
-
+        Get the data of a given dataset.table and optionally
+        with selected columns. see:
+        https://cloud.google.com/bigquery/docs/reference/v2/tabledata/list
 
 Review comment:
   Missing newline between description and params list




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213867642
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -473,20 +497,21 @@ def create_external_table(self,
     def run_query(self,
                   bql=None,
                   sql=None,
-                  destination_dataset_table=False,
+                  destination_dataset_table=None,
                   write_disposition='WRITE_EMPTY',
                   allow_large_results=False,
                   flatten_results=False,
-                  udf_config=False,
+                  udf_config=None,
                   use_legacy_sql=None,
                   maximum_billing_tier=None,
                   maximum_bytes_billed=None,
                   create_disposition='CREATE_IF_NEEDED',
                   query_params=None,
                   labels=None,
                   schema_update_options=(),
-                  priority='INTERACTIVE',
-                  time_partitioning=None):
+                  priority=None,
 
 Review comment:
   I would not change the default value here, as it is going to show in 
autocompletion.




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213868377
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -1114,18 +1184,18 @@ def get_schema(self, dataset_id, table_id):
         :param table_id: the table ID of the requested table
         :return: a table schema
         """
-        tables_resource = self.service.tables() \
-            .get(projectId=self.project_id, datasetId=dataset_id, tableId=table_id) \
 
 Review comment:
   I prefer this; it looks cleaner.




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213867932
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -566,95 +612,108 @@ def run_query(self,
                   'Airflow.',
                   category=DeprecationWarning)
 
-        if sql is None:
-            raise TypeError('`BigQueryBaseCursor.run_query` missing 1 required '
-                            'positional argument: `sql`')
+        if not sql and not configuration['query'].get('query', None):
+            raise TypeError('`BigQueryBaseCursor.run_query` '
+                            'missing 1 required positional argument: `sql`')
+
+        # BigQuery also allows you to define how you want a table's schema
+        # to change as a side effect of a query job for more details:
+        # https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
 
-        # BigQuery also allows you to define how you want a table's schema to change
-        # as a side effect of a query job
-        # for more details:
-        #   https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.schemaUpdateOptions
         allowed_schema_update_options = [
             'ALLOW_FIELD_ADDITION', "ALLOW_FIELD_RELAXATION"
         ]
-        if not set(allowed_schema_update_options).issuperset(
-                set(schema_update_options)):
-            raise ValueError(
-                "{0} contains invalid schema update options. "
-                "Please only use one or more of the following options: {1}"
-                .format(schema_update_options, allowed_schema_update_options))
 
-        if use_legacy_sql is None:
-            use_legacy_sql = self.use_legacy_sql
+        if not set(allowed_schema_update_options
+                   ).issuperset(set(schema_update_options)):
+            raise ValueError("{0} contains invalid schema update options. "
+                             "Please only use one or more of the following "
+                             "options: {1}"
+                             .format(schema_update_options,
+                                     allowed_schema_update_options))
 
-        configuration = {
-            'query': {
-                'query': sql,
-                'useLegacySql': use_legacy_sql,
-                'maximumBillingTier': maximum_billing_tier,
-                'maximumBytesBilled': maximum_bytes_billed,
-                'priority': priority
-            }
-        }
+        if schema_update_options:
+            if write_disposition not in ["WRITE_APPEND", "WRITE_TRUNCATE"]:
+                raise ValueError("schema_update_options is only "
+                                 "allowed if write_disposition is "
+                                 "'WRITE_APPEND' or 'WRITE_TRUNCATE'.")
 
         if destination_dataset_table:
-            if '.' not in destination_dataset_table:
-                raise ValueError(
-                    'Expected destination_dataset_table name in the format of '
-                    '<dataset>.<table>. Got: {}'.format(
-                        destination_dataset_table))
             destination_project, destination_dataset, destination_table = \
                 _split_tablename(table_input=destination_dataset_table,
                                  default_project_id=self.project_id)
-            configuration['query'].update({
-                'allowLargeResults': allow_large_results,
-                'flattenResults': flatten_results,
-                'writeDisposition': write_disposition,
-                'createDisposition': create_disposition,
-                'destinationTable': {
-                    'projectId': destination_project,
-                    'datasetId': destination_dataset,
-                    'tableId': destination_table,
-                }
-            })
-        if udf_config:
-            if not isinstance(udf_config, list):
-                raise TypeError("udf_config argument must have a type 'list'"
-                                " not {}".format(type(udf_config)))
-            configuration['query'].update({
-                'userDefinedFunctionResources': udf_config
-            })
 
-        if query_params:
-            if self.use_legacy_sql:
-                raise ValueError("Query parameters are not allowed when using "
-                                 "legacy SQL")
-            else:
-                configuration['query']['queryParameters'] = query_params
+            destination_dataset_table = {
+                'projectId': destination_project,
+                'datasetId': destination_dataset,
+                'tableId': destination_table,
+            }
 
-        if labels:
-            configuration['labels'] = labels
+        query_param_list = [
+            (sql, 

[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213867370
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -536,24 +569,37 @@ def run_query(self,
         :param labels a dictionary containing labels for the job/query,
             passed to BigQuery
         :type labels: dict
-        :param schema_update_options: Allows the schema of the desitination
+        :param schema_update_options: Allows the schema of the destination
             table to be updated as a side effect of the query job.
         :type schema_update_options: tuple
         :param priority: Specifies a priority for the query.
             Possible values include INTERACTIVE and BATCH.
             The default value is INTERACTIVE.
         :type priority: string
-        :param time_partitioning: configure optional time partitioning fields i.e.
+        :param time_partitioning: configure optional time partitioning
+            fields i.e.
             partition by field, type and
 
 Review comment:
   same here, wrap it into 2 lines with proper indentation




[GitHub] dimberman commented on a change in pull request #3782: [AIRFLOW-2936] Use official Python images as base image for Docker

2018-08-29 Thread GitBox
dimberman commented on a change in pull request #3782: [AIRFLOW-2936] Use 
official Python images as base image for Docker
URL: https://github.com/apache/incubator-airflow/pull/3782#discussion_r213867121
 
 

 ##
 File path: scripts/ci/kubernetes/docker/airflow-init.sh
 ##
 @@ -17,9 +17,10 @@
 #  specific language governing permissions and limitations  *
 #  under the License.
 
-cd /usr/local/lib/python2.7/dist-packages/airflow && \
-cp -R example_dags/* /root/airflow/dags/ && \
+set -e
+
+cd /usr/local/lib/python3.7/site-packages/airflow/ && \
+cp -R example_dags/* /home/airflow/dags/ && \
 airflow initdb && \
 alembic upgrade heads && \
-(airflow create_user -u airflow -l airflow -f jon -e airf...@apache.org -r Admin -p airflow || true) && \
-echo "retrieved from mount" > /root/test_volume/test.txt
 
 Review comment:
   So the reason this line didn't break the CI is that a previous PR silently
   broke all kubernetes tests. I'm actively trying to fix this and will report
   back when I have it working.




[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213866794
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -314,48 +326,56 @@ def create_external_table(self,
         :type source_uris: list
         :param source_format: File format to export.
         :type source_format: string
-        :param autodetect: Try to detect schema and format options automatically.
-            Any option specified explicitly will be honored.
+        :param autodetect: Try to detect schema and format options
+            automatically. Any option specified explicitly will be honored.
         :type autodetect: bool
         :param compression: [Optional] The compression type of the data source.
             Possible values include GZIP and NONE.
             The default value is NONE.
             This setting is ignored for Google Cloud Bigtable,
             Google Cloud Datastore backups and Avro formats.
         :type compression: string
-        :param ignore_unknown_values: [Optional] Indicates if BigQuery should allow
-            extra values that are not represented in the table schema.
-            If true, the extra values are ignored. If false, records with extra columns
-            are treated as bad records, and if there are too many bad records, an
-            invalid error is returned in the job result.
+        :param ignore_unknown_values: [Optional] Indicates if
+            BigQuery should allow extra values that are
+            not represented in the table schema. If true, the extra values
+            are ignored. If false, records with extra columns
+            are treated as bad records, and if there are too many
+            bad records, an invalid error is returned in the job result.
         :type ignore_unknown_values: bool
-        :param max_bad_records: The maximum number of bad records that BigQuery can
-            ignore when running the job.
+        :param max_bad_records: The maximum number of bad records that
+            BigQuery can ignore when running the job.
         :type max_bad_records: int
-        :param skip_leading_rows: Number of rows to skip when loading from a CSV.
+        :param skip_leading_rows: Number of rows to skip when
+            loading from a CSV.
         :type skip_leading_rows: int
         :param field_delimiter: The delimiter to use when loading from a CSV.
         :type field_delimiter: string
-        :param quote_character: The value that is used to quote data sections in a CSV
+        :param quote_character: The value that is used to quote data
+            sections in a CSV
             file.
 
 Review comment:
   Just have it in 2 lines instead of 3
   ```
   :param quote_character: The value that is used to quote data
   sections in a CSV file.
   ```




[jira] [Commented] (AIRFLOW-2955) Kubernetes pod operator: Unable to set requests/limits on task pods

2018-08-29 Thread Daniel Imberman (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2955?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596934#comment-16596934
 ] 

Daniel Imberman commented on AIRFLOW-2955:
--

[~jpds] The problem here is that the operator expects a "Resources" class:
https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/kubernetes/pod.py#L19
It might actually make more sense to have users just pass a dict and generate
that class ourselves; that should be a pretty easy fix. Until then, try
creating that class yourself (see the sketch below) and it should work.
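
A sketch of that workaround using the Resources class linked above, with argument names assumed from airflow/contrib/kubernetes/pod.py at the time:

```python
from airflow.contrib.kubernetes.pod import Resources
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

# Build the Resources object the request factory expects, instead of
# passing a plain dict (which triggers the AttributeError below).
pod_resources = Resources(request_cpu=1, limit_cpu=1)

task = KubernetesPodOperator(
    task_id='resource_limited_task',
    name='resource-limited-task',
    namespace='default',
    image='python:3.6',
    cmds=['python', '-c', 'print("hello")'],
    resources=pod_resources,
)
```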

> Kubernetes pod operator: Unable to set requests/limits on task pods
> ---
>
> Key: AIRFLOW-2955
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2955
> Project: Apache Airflow
>  Issue Type: Bug
>Reporter: Jon Davies
>Priority: Major
>
> When I try and set a resource limit/request on a DAG task with the 
> KubernetesPodOperator as follows:
> {code:java}
> resources={"limit_cpu": 1, "request_cpu": 1},
> {code}
> ...I get:
> {code:java}
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task Traceback (most recent call last):
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/bin/airflow", line 32, in <module>
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task args.func(args)
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", 
> line 74, in wrapper
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task return f(*args, **kwargs)
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 
> 498, in run
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task _run(args, dag, ti)
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 
> 402, in _run
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task pool=args.pool,
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", 
> line 74, in wrapper
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task return func(*args, **kwargs)
> [2018-08-24 15:51:27,795] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File "/usr/local/lib/python3.7/site-packages/airflow/models.py", line 
> 1633, in _run_raw_task
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task result = task_copy.execute(context=context)
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File 
> "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py",
>  line 115, in execute
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task get_logs=self.get_logs)
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File 
> "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py",
>  line 71, in run_pod
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task resp = self.run_pod_async(pod)
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File 
> "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/pod_launcher.py",
>  line 52, in run_pod_async
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task req = self.kube_req_factory.create(pod)
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File 
> "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/pod_request_factory.py",
>  line 56, in create
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task self.extract_resources(pod, req)
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task   File 
> "/usr/local/lib/python3.7/site-packages/airflow/contrib/kubernetes/kubernetes_request_factory/kubernetes_request_factory.py",
>  line 160, in extract_resources
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task if not pod.resources or pod.resources.is_empty_resource_request():
> [2018-08-24 15:51:27,796] {base_task_runner.py:107} INFO - Job 2: Subtask 
> task AttributeError: 'dict' object has no attribute 
> 'is_empty_resource_request'
> {code}
> ...setting 
> 

[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213866316
 
 

 ##
 File path: airflow/contrib/hooks/bigquery_hook.py
 ##
 @@ -220,26 +229,29 @@ def create_empty_table(self,
 :type table_id: str
 :param schema_fields: If set, the schema field list as defined here:
 
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.load.schema
-:param labels: a dictionary containing labels for the table, passed to BigQuery
+:param labels: a dictionary containing labels for the table,
+passed to BigQuery
 :type labels: dict
 
 **Example**: ::
 
-schema_fields=[{"name": "emp_name", "type": "STRING", "mode": "REQUIRED"},
-   {"name": "salary", "type": "INTEGER", "mode": "NULLABLE"}]
+schema_fields=[
+{"name": "emp_name", "type": "STRING", "mode": "REQUIRED"},
+{"name": "salary", "type": "INTEGER", "mode": "NULLABLE"}
+]
 
 :type schema_fields: list
-:param time_partitioning: configure optional time partitioning fields i.e.
-partition by field, type and expiration as per API specifications.
+:param time_partitioning: configure optional time partitioning
+fields i.e. partition by field, type and expiration
+as per API specifications.
 
 .. seealso::
 
https://cloud.google.com/bigquery/docs/reference/rest/v2/tables#timePartitioning
 :type time_partitioning: dict
 
 :return:
 """
-if time_partitioning is None:
 
 Review comment:
   Why remove this?
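
   (For context, the removed line is the usual guard for a mutable default 
argument; a sketch, assuming the original body assigned an empty dict when 
the caller passed nothing:)
   ```python
   def create_empty_table(time_partitioning=None):
       # Initialize inside the function body so callers never share a
       # mutable default dict across invocations.
       if time_partitioning is None:
           time_partitioning = {}
       return time_partitioning
   ```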


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
kaxil commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213865966
 
 

 ##
 File path: airflow/contrib/operators/bigquery_operator.py
 ##
 @@ -64,8 +64,9 @@ class BigQueryOperator(BaseOperator):
 :param udf_config: The User Defined Function configuration for the query.
See https://cloud.google.com/bigquery/user-defined-functions for details.
:type udf_config: list
-:param use_legacy_sql: Whether to use legacy SQL (true) or standard SQL (false).
-:type use_legacy_sql: boolean
+:param use_legacy_sql: Whether to use legacy SQL (true) or
+standard SQL (false).
 
 Review comment:
  Can you add a tab here if you want to break `standard SQL (false).` onto a 
new line?


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil opened a new pull request #3820: [AIRFLOW-XXX] Fix Docstrings for Hooks/Operators

2018-08-29 Thread GitBox
kaxil opened a new pull request #3820: [AIRFLOW-XXX] Fix Docstrings for 
Hooks/Operators
URL: https://github.com/apache/incubator-airflow/pull/3820
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   (1)
   **Before**:
   
![image](https://user-images.githubusercontent.com/8811558/44820836-a2649d80-abea-11e8-9a5b-3b3511c3ae50.png)
   
   **After**:
   
![image](https://user-images.githubusercontent.com/8811558/44820862-ba3c2180-abea-11e8-9729-608f99753948.png)
   
   (2)
   **Before**:
   
![image](https://user-images.githubusercontent.com/8811558/44820884-db9d0d80-abea-11e8-90c9-f0f285f98e54.png)
   
   **After**:
   
![image](https://user-images.githubusercontent.com/8811558/44820892-e657a280-abea-11e8-9c47-2dbeeaedc022.png)
   
   (3)
   **Before**:
   
![image](https://user-images.githubusercontent.com/8811558/44820930-04250780-abeb-11e8-9b3e-c985030790b6.png)
   
   
   **After**:
   
![image](https://user-images.githubusercontent.com/8811558/44820935-0e470600-abeb-11e8-859c-7c356752b8d2.png)
   
   etc.
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3806: [AIRFLOW-2956] added kubernetes tolerations to kubernetes pod operator

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3806: [AIRFLOW-2956] added kubernetes 
tolerations to kubernetes pod operator
URL: 
https://github.com/apache/incubator-airflow/pull/3806#issuecomment-416340975
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=h1)
 Report
   > Merging 
[#3806](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `100%`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3806/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master   #3806  +/-   ##
   =
   - Coverage   77.41%   77.4%   -0.01% 
   =
 Files 203 203  
 Lines   15817   15818   +1 
   =
 Hits12244   12244  
   - Misses   35733574   +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[...irflow/example\_dags/example\_kubernetes\_operator.py](https://codecov.io/gh/apache/incubator-airflow/pull/3806/diff?src=pr=tree#diff-YWlyZmxvdy9leGFtcGxlX2RhZ3MvZXhhbXBsZV9rdWJlcm5ldGVzX29wZXJhdG9yLnB5)
 | `76.92% <100%> (+1.92%)` | :arrow_up: |
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/3806/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `88.74% <0%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=footer).
 Last update 
[0b0f4ac...be965c1](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3806: [AIRFLOW-2956] added kubernetes tolerations to kubernetes pod operator

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3806: [AIRFLOW-2956] added kubernetes 
tolerations to kubernetes pod operator
URL: 
https://github.com/apache/incubator-airflow/pull/3806#issuecomment-416340975
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=h1)
 Report
   > Merging 
[#3806](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3806/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#3806   +/-   ##
   ===
 Coverage   77.41%   77.41%   
   ===
 Files 203  203   
 Lines   1581715817   
   ===
 Hits1224412244   
 Misses   3573 3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=footer).
 Last update 
[0b0f4ac...be965c1](https://codecov.io/gh/apache/incubator-airflow/pull/3806?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
xnuinside commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213858150
 
 

 ##
 File path: airflow/contrib/operators/bigquery_operator.py
 ##
 @@ -64,8 +64,9 @@ class BigQueryOperator(BaseOperator):
 :param udf_config: The User Defined Function configuration for the query.
See https://cloud.google.com/bigquery/user-defined-functions for details.
:type udf_config: list
-:param use_legacy_sql: Whether to use legacy SQL (true) or standard SQL (false).
-:type use_legacy_sql: boolean
+:param use_legacy_sql: Whether to use legacy SQL (true) or
+standard SQL (false).
+:type use_legacy_sql: booleanq
 
 Review comment:
   Yeah, I will change it. I've been racking my brain over an error in the 
test for the deprecated `bql` warning. It passes on Python 3.5 but errors on 
Python 2.7, even though it didn't error before and doesn't fail on the master 
branch, and I did not touch the warning or the test... I've been trying to 
fix it for several days; the most fun fact is that in my local env this test 
always fails with Python 2.7 (master too)...  panic


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] tswast commented on a change in pull request #3733: [AIRFLOW-491] Add cache parameter in BigQuery query method - with 'api_resource_configs'

2018-08-29 Thread GitBox
tswast commented on a change in pull request #3733: [AIRFLOW-491] Add cache 
parameter in BigQuery query method - with 'api_resource_configs'
URL: https://github.com/apache/incubator-airflow/pull/3733#discussion_r213854044
 
 

 ##
 File path: airflow/contrib/operators/bigquery_operator.py
 ##
 @@ -64,8 +64,9 @@ class BigQueryOperator(BaseOperator):
 :param udf_config: The User Defined Function configuration for the query.
See https://cloud.google.com/bigquery/user-defined-functions for details.
:type udf_config: list
-:param use_legacy_sql: Whether to use legacy SQL (true) or standard SQL (false).
-:type use_legacy_sql: boolean
+:param use_legacy_sql: Whether to use legacy SQL (true) or
+standard SQL (false).
+:type use_legacy_sql: booleanq
 
 Review comment:
   Typo `booleanq`.
   
   Should this be `bool`?
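
   Combining this with the earlier comment about line breaks, the corrected 
docstring lines would read (a sketch):
   ```
   :param use_legacy_sql: Whether to use legacy SQL (true) or
       standard SQL (false).
   :type use_legacy_sql: bool
   ```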


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Updated] (AIRFLOW-2982) DAG Graph View: error: INTERNAL SERVER ERROR

2018-08-29 Thread Ashok Kumar (JIRA)


 [ 
https://issues.apache.org/jira/browse/AIRFLOW-2982?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ashok Kumar updated AIRFLOW-2982:
-
Description: 
I followed the steps mentioned in 
[https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/]
 and the RBAC-related setup presented in 
[https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md#setting-up-the-node--npm-javascript-environment-only-for-www_rbac]

The tree view is empty to begin with and when refreshed gives: error: INTERNAL 
SERVER ERROR

Inspect on the browser shows:

{{Failed to load resource: the server responded with a status of 404 (NOT 
FOUND)}}{{dagre-d3.min.js:24}}

{{Uncaught TypeError: svgEdgeLabels.exit is not a function}}
 {{at createEdgeLabels (dagre-d3.min.js:24)}}
 {{at Array.fn (dagre-d3.min.js:90)}}
 {{at Array.Co.call (d3.min.js:3)}}
 {{at graph?dag_id=example_bash_operator:1160}}

{{:8080/object/task_instances?execution_date=+%2B+execution_date+%2B+_id=%2B%0A++dag.dag_id+%2B+:1
 }}

{{Failed to load resource: the server responded with a status of 500 (INTERNAL 
SERVER ERROR)}}

  was:
I followed the steps mentioned in 
[https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/]
 and the RBAC-related setup presented in 
[https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md#setting-up-the-node--npm-javascript-environment-only-for-www_rbac]

The tree view is empty to begin with and when refreshed gives: error: INTERNAL 
SERVER ERROR

Inspect on the browser shows:

{{Failed to load resource: the server responded with a status of 404 (NOT 
FOUND)}}
{{dagre-d3.min.js:24 Uncaught TypeError: svgEdgeLabels.exit is not a function}}
{{at createEdgeLabels (dagre-d3.min.js:24)}}
{{at Array.fn (dagre-d3.min.js:90)}}
{{at Array.Co.call (d3.min.js:3)}}
{{at graph?dag_id=example_bash_operator:1160}}
{{:8080/object/task_instances?execution_date=+%2B+execution_date+%2B+dag_id=%2B%0A++dag.dag_id+%2B+:1
 Failed to load resource: the server responded with a status of 500 (INTERNAL 
SERVER ERROR)}}

 

This is the line in 


> DAG Graph View: error: INTERNAL SERVER ERROR
> 
>
> Key: AIRFLOW-2982
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2982
> Project: Apache Airflow
>  Issue Type: Bug
> Environment: Build environment:
> Ubuntu 16.04
> Kubernetes:
> Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.2", 
> GitCommit:"bb9ffb1654d4a729bb4cec18ff088eacc153c239", GitTreeState:"clean", 
> BuildDate:"2018-08-07T23:17:28Z", GoVersion:"go1.10.3", Compiler:"gc", 
> Platform:"linux/amd64"}
> Server Version: version.Info{Major:"1", Minor:"9", GitVersion:"v1.9.4", 
> GitCommit:"bee2d1505c4fe820744d26d41ecd3fdd4a3d6546", GitTreeState:"clean", 
> BuildDate:"2018-03-12T16:21:35Z", GoVersion:"go1.9.3", Compiler:"gc", 
> Platform:"linux/amd64"}
>Reporter: Ashok Kumar
>Priority: Minor
>
> I followed the steps mentioned in 
> [https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/]
>  and the RBAC-related setup presented in 
> [https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md#setting-up-the-node--npm-javascript-environment-only-for-www_rbac]
> The tree view is empty to begin with and when refreshed gives: error: 
> INTERNAL SERVER ERROR
> Inspect on the browser shows:
> {{Failed to load resource: the server responded with a status of 404 (NOT 
> FOUND)}}{{dagre-d3.min.js:24}}
> {{Uncaught TypeError: svgEdgeLabels.exit is not a function}}
>  {{at createEdgeLabels (dagre-d3.min.js:24)}}
>  {{at Array.fn (dagre-d3.min.js:90)}}
>  {{at Array.Co.call (d3.min.js:3)}}
>  {{at graph?dag_id=example_bash_operator:1160}}
> {{:8080/object/task_instances?execution_date=+%2B+execution_date+%2B+_id=%2B%0A++dag.dag_id+%2B+:1
>  }}
> {{Failed to load resource: the server responded with a status of 500 
> (INTERNAL SERVER ERROR)}}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix 
Missing API Reference
URL: 
https://github.com/apache/incubator-airflow/pull/3818#issuecomment-417109229
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=h1)
 Report
   > Merging 
[#3818](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3818/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#3818   +/-   ##
   ===
 Coverage   77.41%   77.41%   
   ===
 Files 203  203   
 Lines   1581715817   
   ===
 Hits1224412244   
 Misses   3573 3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=footer).
 Last update 
[0b0f4ac...d98ce34](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-2982) DAG Graph View: error: INTERNAL SERVER ERROR

2018-08-29 Thread Ashok Kumar (JIRA)
Ashok Kumar created AIRFLOW-2982:


 Summary: DAG Graph View: error: INTERNAL SERVER ERROR
 Key: AIRFLOW-2982
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2982
 Project: Apache Airflow
  Issue Type: Bug
 Environment: Build environment:
Ubuntu 16.04
Kubernetes:
Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.2", 
GitCommit:"bb9ffb1654d4a729bb4cec18ff088eacc153c239", GitTreeState:"clean", 
BuildDate:"2018-08-07T23:17:28Z", GoVersion:"go1.10.3", Compiler:"gc", 
Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"9", GitVersion:"v1.9.4", 
GitCommit:"bee2d1505c4fe820744d26d41ecd3fdd4a3d6546", GitTreeState:"clean", 
BuildDate:"2018-03-12T16:21:35Z", GoVersion:"go1.9.3", Compiler:"gc", 
Platform:"linux/amd64"}
Reporter: Ashok Kumar


I followed the steps mentioned in 
[https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/]
 and the RBAC-related setup presented in 
[https://github.com/apache/incubator-airflow/blob/master/CONTRIBUTING.md#setting-up-the-node--npm-javascript-environment-only-for-www_rbac]

The tree view is empty to begin with and when refreshed gives: error: INTERNAL 
SERVER ERROR

Inspect on the browser shows:

{{Failed to load resource: the server responded with a status of 404 (NOT 
FOUND)}}
{{dagre-d3.min.js:24 Uncaught TypeError: svgEdgeLabels.exit is not a function}}
{{at createEdgeLabels (dagre-d3.min.js:24)}}
{{at Array.fn (dagre-d3.min.js:90)}}
{{at Array.Co.call (d3.min.js:3)}}
{{at graph?dag_id=example_bash_operator:1160}}
{{:8080/object/task_instances?execution_date=+%2B+execution_date+%2B+dag_id=%2B%0A++dag.dag_id+%2B+:1
 Failed to load resource: the server responded with a status of 500 (INTERNAL 
SERVER ERROR)}}

 

This is the line in 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix 
Missing API Reference
URL: 
https://github.com/apache/incubator-airflow/pull/3818#issuecomment-417109229
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=h1)
 Report
   > Merging 
[#3818](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **decrease** coverage by `<.01%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3818/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=tree)
   
   ```diff
   @@Coverage Diff@@
   ##   master   #3818  +/-   ##
   =
   - Coverage   77.41%   77.4%   -0.01% 
   =
 Files 203 203  
 Lines   15817   15817  
   =
   - Hits12244   12243   -1 
   - Misses   35733574   +1
   ```
   
   
   | [Impacted 
Files](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=tree) 
| Coverage Δ | |
   |---|---|---|
   | 
[airflow/models.py](https://codecov.io/gh/apache/incubator-airflow/pull/3818/diff?src=pr=tree#diff-YWlyZmxvdy9tb2RlbHMucHk=)
 | `88.74% <0%> (-0.05%)` | :arrow_down: |
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=footer).
 Last update 
[0b0f4ac...d98ce34](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Created] (AIRFLOW-2981) TypeError in dataflow operators when using GCS jar or py_file

2018-08-29 Thread Jeffrey Payne (JIRA)
Jeffrey Payne created AIRFLOW-2981:
--

 Summary:  TypeError in dataflow operators when using GCS jar or 
py_file
 Key: AIRFLOW-2981
 URL: https://issues.apache.org/jira/browse/AIRFLOW-2981
 Project: Apache Airflow
  Issue Type: Bug
  Components: contrib, Dataflow
Affects Versions: 1.9.0, 1.10
Reporter: Jeffrey Payne
Assignee: Jeffrey Payne


The {{GoogleCloudBucketHelper.google_cloud_to_local}} function attempts to 
compare a list to an int, resulting in the TypeError, with:
{noformat}
...
path_components = file_name[self.GCS_PREFIX_LENGTH:].split('/')
if path_components < 2:
...
{noformat}
This should be {{if len(path_components) < 2:}}.
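
For context, a minimal reproduction of the comparison error (Python 3 refuses 
to order a list against an int; the bucket path here is only illustrative, 
with 5 standing in for GCS_PREFIX_LENGTH):
{noformat}
path_components = "gs://bucket/path/file.jar"[5:].split('/')
path_components < 2       # Python 3: TypeError: '<' not supported between instances of 'list' and 'int'
len(path_components) < 2  # correct: compares the number of path components
{noformat}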



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] codecov-io edited a comment on issue #3819: [AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3819: [AIRFLOW-XXX] Fix Broken Link in 
CONTRIBUTING.md
URL: 
https://github.com/apache/incubator-airflow/pull/3819#issuecomment-417110927
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=h1)
 Report
   > Merging 
[#3819](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3819/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#3819   +/-   ##
   ===
 Coverage   77.41%   77.41%   
   ===
 Files 203  203   
 Lines   1581715817   
   ===
 Hits1224412244   
 Misses   3573 3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=footer).
 Last update 
[0b0f4ac...e82cb52](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
codecov-io edited a comment on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix 
Missing API Reference
URL: 
https://github.com/apache/incubator-airflow/pull/3818#issuecomment-417109229
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=h1)
 Report
   > Merging 
[#3818](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3818/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=tree)
   
   ```diff
   @@   Coverage Diff   @@
   ##   master#3818   +/-   ##
   ===
 Coverage   77.41%   77.41%   
   ===
 Files 203  203   
 Lines   1581715817   
   ===
 Hits1224412244   
 Misses   3573 3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute  (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=footer).
 Last update 
[0b0f4ac...d98ce34](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3819: [AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

2018-08-29 Thread GitBox
codecov-io commented on issue #3819: [AIRFLOW-XXX] Fix Broken Link in 
CONTRIBUTING.md
URL: 
https://github.com/apache/incubator-airflow/pull/3819#issuecomment-417110927
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=h1)
 Report
   > Merging 
[#3819](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3819/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3819   +/-   ##
   =======================================
     Coverage   77.41%   77.41%
   =======================================
     Files         203      203
     Lines       15817    15817
   =======================================
     Hits        12244    12244
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=footer).
 Last update 
[0b0f4ac...e82cb52](https://codecov.io/gh/apache/incubator-airflow/pull/3819?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] kaxil commented on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
kaxil commented on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API 
Reference
URL: 
https://github.com/apache/incubator-airflow/pull/3818#issuecomment-417109434
 
 
   Docs for `HiveOperator` can now be seen at 
https://airflow-fork-k1.readthedocs.io/en/airflow-2980-rtd-fix-missing-docs/code.html#airflow.operators.hive_operator.HiveOperator
 (RTD on my fork).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] codecov-io commented on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
codecov-io commented on issue #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing 
API Reference
URL: 
https://github.com/apache/incubator-airflow/pull/3818#issuecomment-417109229
 
 
   # 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=h1)
 Report
   > Merging 
[#3818](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=desc)
 into 
[master](https://codecov.io/gh/apache/incubator-airflow/commit/0b0f4ac3caca72e67273f9e80221677d78ad5c0e?src=pr=desc)
 will **not change** coverage.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree 
graph](https://codecov.io/gh/apache/incubator-airflow/pull/3818/graphs/tree.svg?width=650=WdLKlKHOAU=150=pr)](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=tree)
   
   ```diff
   @@           Coverage Diff           @@
   ##           master    #3818   +/-   ##
   =======================================
     Coverage   77.41%   77.41%
   =======================================
     Files         203      203
     Lines       15817    15817
   =======================================
     Hits        12244    12244
     Misses       3573     3573
   ```
   
   
   
   --
   
   [Continue to review full report at 
Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=continue).
   > **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by 
[Codecov](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=footer).
 Last update 
[0b0f4ac...d98ce34](https://codecov.io/gh/apache/incubator-airflow/pull/3818?src=pr=lastupdated).
 Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] mascah edited a comment on issue #3549: [AIRFLOW-1840] Support back-compat on old celery config

2018-08-29 Thread GitBox
mascah edited a comment on issue #3549: [AIRFLOW-1840] Support back-compat on 
old celery config
URL: 
https://github.com/apache/incubator-airflow/pull/3549#issuecomment-417080253
 
 
   This looks like it's missing the option for `celery_result_backend` -> 
`result_backend`


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
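
For context, a hedged sketch of the fallback mascah's comment asks for: reading the renamed `result_backend` option while still honouring the deprecated `celery_result_backend` spelling. The mapping and helper below are illustrative only, not Airflow's actual configuration code.

```python
import warnings
from configparser import ConfigParser

# (section, new_name) -> deprecated old name. Illustrative mapping only;
# it covers just the option the comment above says is missing.
DEPRECATED_OPTIONS = {
    ("celery", "result_backend"): "celery_result_backend",
}


def get_with_fallback(conf, section, key):
    """Read [section] key, falling back to its deprecated spelling."""
    if conf.has_option(section, key):
        return conf.get(section, key)
    old_key = DEPRECATED_OPTIONS.get((section, key))
    if old_key is not None and conf.has_option(section, old_key):
        warnings.warn(
            "Option [{}] {} is deprecated; use {} instead.".format(
                section, old_key, key),
            DeprecationWarning)
        return conf.get(section, old_key)
    raise KeyError("No option [{}] {}".format(section, key))


# Demo: the old spelling is still honoured, with a deprecation warning.
conf = ConfigParser()
conf.read_string("[celery]\ncelery_result_backend = db+postgresql://airflow\n")
print(get_with_fallback(conf, "celery", "result_backend"))
```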


[GitHub] kaxil opened a new pull request #3819: [AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

2018-08-29 Thread GitBox
kaxil opened a new pull request #3819: [AIRFLOW-XXX] Fix Broken Link in 
CONTRIBUTING.md
URL: https://github.com/apache/incubator-airflow/pull/3819
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   Fix Broken Dockerfile Link in CONTRIBUTING.md
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[jira] [Commented] (AIRFLOW-2980) Missing operators in the docs

2018-08-29 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/AIRFLOW-2980?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16596820#comment-16596820
 ] 

ASF GitHub Bot commented on AIRFLOW-2980:
-

kaxil opened a new pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing 
API Reference
URL: https://github.com/apache/incubator-airflow/pull/3818
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Some of the operators are missing from the API reference part of the docs 
(HiveOperator, for instance). This PR will force RTD to install all the Airflow 
dependencies, which will then be used to generate the API Reference.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Missing operators in the docs
> -
>
> Key: AIRFLOW-2980
> URL: https://issues.apache.org/jira/browse/AIRFLOW-2980
> Project: Apache Airflow
>  Issue Type: Task
>  Components: docs, Documentation
>Reporter: Kaxil Naik
>Assignee: Kaxil Naik
>Priority: Major
>
> Some of the operators are missing from the API reference
> part of the docs (HiveOperator for instance). 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] kaxil opened a new pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

2018-08-29 Thread GitBox
kaxil opened a new pull request #3818: [AIRFLOW-2980] ReadTheDocs - Fix Missing 
API Reference
URL: https://github.com/apache/incubator-airflow/pull/3818
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow 
Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references 
them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
 - https://issues.apache.org/jira/browse/AIRFLOW-XXX
 - In case you are fixing a typo in the documentation, you can prepend your 
commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI 
changes:
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for 
this extremely good reason:
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I 
have squashed multiple commits if they address the same issue. In addition, my 
commits follow the guidelines from "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)":
 1. Subject is separated from body by a blank line
 1. Subject is limited to 50 characters (not including Jira issue reference)
 1. Subject does not end with a period
 1. Subject uses the imperative mood ("add", not "adding")
 1. Body wraps at 72 characters
 1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes 
how to use it.
 - When adding new operators/hooks/sensors, the autoclass documentation 
generation needs to be added.
   Some of the operators are missing from the API reference part of the docs 
(HiveOperator, for instance). This PR will force RTD to install all the Airflow 
dependencies, which will then be used to generate the API Reference.
   
   ### Code Quality
   
   - [x] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
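
The PR above takes the install-the-real-dependencies route on ReadTheDocs. A common lighter-weight alternative, shown below as a hypothetical docs/conf.py fragment rather than anything from the PR, is to let Sphinx autodoc mock heavy imports so the operator modules can still be imported and rendered; the mocked module names are examples only.

```python
# Hypothetical fragment of a docs/conf.py; module names are examples only.
extensions = ["sphinx.ext.autodoc"]

# Stub out heavy optional dependencies so autodoc can import the modules
# that define operators such as HiveOperator without installing them.
autodoc_mock_imports = [
    "cassandra",       # cassandra-driver, used by the Cassandra hook
    "vertica_python",  # used by the Vertica hook
    "winrm",           # used by the WinRM hook
]
```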


[24/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/hive_to_druid.html
--
diff --git a/_modules/airflow/operators/hive_to_druid.html 
b/_modules/airflow/operators/hive_to_druid.html
new file mode 100644
index 000..e232264
--- /dev/null
+++ b/_modules/airflow/operators/hive_to_druid.html
   [456-line rendered HTML page, markup stripped in archiving: Sphinx page 
chrome (navigation, search, section index) followed by the highlighted source 
of airflow.operators.hive_to_druid: the Apache license header, the hook 
imports, the LOAD_CHECK_INTERVAL and DEFAULT_TARGET_PARTITION_SIZE constants, 
and the HiveToDruidTransfer(BaseOperator) class. Its docstring notes that the 
data is loaded into memory before being pushed to Druid, so the operator suits 
smallish amounts of data, and documents the sql, druid_datasource, ts_dim, 
metric_spec, hive_cli_conn_id, druid_ingest_conn_id, metastore_conn_id, 
hadoop_dependency_coordinates, intervals, and hive_tblproperties parameters; 
template_fields is ('sql', 'intervals') and template_ext is ('.sql',). The 
message is truncated inside __init__.]
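
For orientation, a hedged usage sketch of the operator documented on that page; the task id, SQL, and connection ids below are placeholders rather than values from the commit.

```python
from airflow.operators.hive_to_druid import HiveToDruidTransfer

load_events = HiveToDruidTransfer(
    task_id="hive_to_druid_example",                   # placeholder
    sql="SELECT ts, dim, metric FROM staging.events",  # templated field
    druid_datasource="events",
    ts_dim="ts",
    metric_spec=[{"type": "count", "name": "count"}],
    hive_cli_conn_id="hive_cli_default",
    druid_ingest_conn_id="druid_ingest_default",
    metastore_conn_id="metastore_default",
)
```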

[46/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/cassandra_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/cassandra_hook.html 
b/_modules/airflow/contrib/hooks/cassandra_hook.html
new file mode 100644
index 000..01142f1
--- /dev/null
+++ b/_modules/airflow/contrib/hooks/cassandra_hook.html
   [400-line rendered HTML page, markup stripped in archiving: Sphinx page 
chrome followed by the highlighted source of 
airflow.contrib.hooks.cassandra_hook. The visible fragment covers the imports 
(cassandra.cluster.Cluster, the load-balancing policies, PlainTextAuthProvider, 
BaseHook, LoggingMixin) and the CassandraHook docstring: contact points given 
as a comma-separated hosts field, an optional port, an ssl_options dict passed 
through the connection's extra field as kwargs for ssl.wrap_socket(), and 
load_balancing_policy examples for DCAwareRoundRobinPolicy, 
WhiteListRoundRobinPolicy, and TokenAwarePolicy (RoundRobinPolicy being the 
default). __init__(cassandra_conn_id='cassandra_default') builds conn_config 
from the connection's host, port, and login; the message is truncated there.]
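
For orientation, a hedged sketch of the connection extra payload that docstring describes; the certificate path, data-center name, and counts are placeholders.

```python
# Placeholder values throughout; the shape follows the CassandraHook
# docstring summarised above.
extra = {
    "ssl_options": {
        "ca_certs": "/path/to/ca.pem",
    },
    "load_balancing_policy": "DCAwareRoundRobinPolicy",
    "load_balancing_policy_args": {
        "local_dc": "dc1",              # optional
        "used_hosts_per_remote_dc": 1,  # optional
    },
}
```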

[44/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/gcp_dataproc_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/gcp_dataproc_hook.html 
b/_modules/airflow/contrib/hooks/gcp_dataproc_hook.html
index 4ca7edd..d22dfa5 100644
--- a/_modules/airflow/contrib/hooks/gcp_dataproc_hook.html
+++ b/_modules/airflow/contrib/hooks/gcp_dataproc_hook.html
 [Rendered HTML diff, markup stripped in archiving. Substantive changes to the 
airflow.contrib.hooks.gcp_dataproc_hook page: get_conn() now builds the 
Dataproc client with cache_discovery=False; the await() method is renamed 
wait() because await is a reserved keyword in Python 3.7, with the old name 
kept as a zope.deprecation alias; [docs] anchors are added. The remaining hunks 
are theme-asset churn shared by all pages in this commit: the sphinx_rtd_theme 
link moves from github.com/snide to github.com/rtfd, 
SphinxRtdTheme.StickyNav.enable() becomes SphinxRtdTheme.Navigation.enable(true), 
LANGUAGE:'None' is added to DOCUMENTATION_OPTIONS, and trailing whitespace is 
trimmed from the license header.]
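
A minimal sketch of that get_conn() change, assuming the google-api-python-client build() call shown in the diff; http_authorized stands in for the hook's authorized HTTP object.

```python
# `apiclient` is the legacy alias of googleapiclient, as imported in the diff.
from apiclient.discovery import build


def get_conn(http_authorized, api_version="v1"):
    """Build a Dataproc client without caching the discovery document."""
    return build("dataproc", api_version, http=http_authorized,
                 cache_discovery=False)
```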

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/gcp_mlengine_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/gcp_mlengine_hook.html 
b/_modules/airflow/contrib/hooks/gcp_mlengine_hook.html
index df9ea25..b961738 100644
--- a/_modules/airflow/contrib/hooks/gcp_mlengine_hook.html
+++ b/_modules/airflow/contrib/hooks/gcp_mlengine_hook.html
 [Rendered HTML diff, markup stripped in archiving: the same theme-asset churn 
for the airflow.contrib.hooks.gcp_mlengine_hook page, plus a get_conn() change 
that drops oauth2client's GoogleCredentials.get_application_default() when 
building the ml v1 client; the replacement lines are truncated in the archive.]
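
The await-to-wait rename restated as a runnable sketch. The class below is a stand-in, not the real DataProcHook; the setattr pattern and the zope.deprecation helper mirror what the diff shows.

```python
from zope.deprecation import deprecation


class DataProcHookSketch(object):
    """Stand-in hook; only the rename pattern matters here."""

    def wait(self, operation):
        """Block until the given Dataproc operation completes."""
        print("waiting on", operation)


# `await` cannot appear as an identifier on Python 3.7+, but it is still a
# legal attribute name when set via a string, so old callers keep working:
setattr(
    DataProcHookSketch,
    "await",
    deprecation.deprecated(
        DataProcHookSketch.wait,
        "renamed to wait for Python 3.7 compatibility"),
)
```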

[42/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/vertica_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/vertica_hook.html 
b/_modules/airflow/contrib/hooks/vertica_hook.html
index 355e266..f60fa92 100644
--- a/_modules/airflow/contrib/hooks/vertica_hook.html
+++ b/_modules/airflow/contrib/hooks/vertica_hook.html
 [Rendered HTML diff, markup stripped in archiving: the shared theme-asset 
churn for the airflow.contrib.hooks.vertica_hook page, plus trailing-whitespace 
cleanup in the license header and the VerticaHook docstring; the hook's code is 
otherwise unchanged in the visible hunks.]

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/wasb_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/wasb_hook.html 
b/_modules/airflow/contrib/hooks/wasb_hook.html
index 605fbf3..e98b5cf 100644
--- a/_modules/airflow/contrib/hooks/wasb_hook.html
+++ b/_modules/airflow/contrib/hooks/wasb_hook.html
 [Rendered HTML diff, markup stripped in archiving: the shared theme-asset 
churn for the airflow.contrib.hooks.wasb_hook page; no changes to the hook's 
code are visible in the archived fragment.]

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/winrm_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/winrm_hook.html 

[32/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/executors/celery_executor.html
--
diff --git a/_modules/airflow/executors/celery_executor.html 
b/_modules/airflow/executors/celery_executor.html
index b787b3e..8b79d18 100644
--- a/_modules/airflow/executors/celery_executor.html
+++ b/_modules/airflow/executors/celery_executor.html
 [Rendered HTML diff, markup stripped in archiving. Substantive changes to the 
airflow.executors.celery_executor page: the Celery AsyncResult loop variable 
async is renamed task throughout sync() and end(), since async is a reserved 
keyword in Python 3.7; [docs] anchors are added to start(), execute_async(), 
sync(), and end(); stray leading spaces are removed from two log-format 
strings; the remaining hunks are the shared theme-asset churn.]
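
That rename, restated as a runnable illustration: async is a reserved keyword from Python 3.7 onward, so the old loop no longer even parses. The stand-in class below merely mimics a Celery AsyncResult.

```python
class FakeAsyncResult(object):
    """Minimal stand-in for celery.result.AsyncResult."""
    state = "PENDING"


tasks = {("example_dag", "example_task"): FakeAsyncResult()}

# Old spelling -- a SyntaxError on Python 3.7+, where async is reserved:
#     for key, async in list(tasks.items()):
#         state = async.state
# Renamed spelling from the commit:
for key, task in list(tasks.items()):
    print(key, task.state)
```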

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/executors/local_executor.html
--
diff --git a/_modules/airflow/executors/local_executor.html 

[33/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/sensors/gcs_sensor.html
--
diff --git a/_modules/airflow/contrib/sensors/gcs_sensor.html 
b/_modules/airflow/contrib/sensors/gcs_sensor.html
index aee0eda..12530ff 100644
--- a/_modules/airflow/contrib/sensors/gcs_sensor.html
+++ b/_modules/airflow/contrib/sensors/gcs_sensor.html
 [Rendered HTML diff, markup stripped in archiving. Substantive changes to the 
airflow.contrib.sensors.gcs_sensor page: the constructor docstrings of 
GoogleCloudStorageObjectSensor and GoogleCloudStorageObjectUpdatedSensor 
(covering bucket, object, ts_func, google_cloud_storage_conn_id, and 
delegate_to) move up into the class docstrings, the __init__ signatures are 
reflowed, and poke() gains a [docs] anchor; the rest is the shared theme-asset 
churn. The message is truncated inside the second sensor's __init__.]
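
For orientation, a hedged usage sketch of the first sensor on that page; bucket and object names are placeholders, and the keyword names follow the __init__ signature visible in the diff.

```python
from airflow.contrib.sensors.gcs_sensor import GoogleCloudStorageObjectSensor

wait_for_export = GoogleCloudStorageObjectSensor(
    task_id="wait_for_export",                    # placeholder task id
    bucket="example-bucket",
    object="exports/2018-08-29/data.json",
    google_cloud_conn_id="google_cloud_default",  # name per the diff
)
```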

[43/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/redis_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/redis_hook.html 
b/_modules/airflow/contrib/hooks/redis_hook.html
index 1d0c202..e559ff0 100644
--- a/_modules/airflow/contrib/hooks/redis_hook.html
+++ b/_modules/airflow/contrib/hooks/redis_hook.html
 [Rendered HTML diff, markup stripped in archiving: only the shared theme-asset 
churn for the airflow.contrib.hooks.redis_hook page; no hook-code changes are 
visible.]

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/redshift_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/redshift_hook.html 
b/_modules/airflow/contrib/hooks/redshift_hook.html
index 314fd9e..e587dc2 100644
--- a/_modules/airflow/contrib/hooks/redshift_hook.html
+++ b/_modules/airflow/contrib/hooks/redshift_hook.html
 [Rendered HTML diff, markup stripped in archiving: only the shared theme-asset 
churn for the airflow.contrib.hooks.redshift_hook page; no hook-code changes 
are visible.]

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/segment_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/segment_hook.html 
b/_modules/airflow/contrib/hooks/segment_hook.html
new file mode 100644
index 000..ccb8fd0
--- /dev/null
+++ b/_modules/airflow/contrib/hooks/segment_hook.html
@@ -0,0 +1,310 

[12/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/jquery.js
--
diff --git a/_static/jquery.js b/_static/jquery.js
index f6a6a99..644d35e 100644
--- a/_static/jquery.js
+++ b/_static/jquery.js
 [The bundled _static/jquery.js replaces jQuery v3.1.0 with a newer minified 
build; the archived fragment is mangled minified JavaScript and carries no 
further information.]

[18/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/css/theme.css
--
diff --git a/_static/css/theme.css b/_static/css/theme.css
index c1631d8..03a13df 100644
--- a/_static/css/theme.css
+++ b/_static/css/theme.css
 [The bundled _static/css/theme.css is regenerated for the sphinx_rtd_theme 
upgrade; the archived fragment is mangled minified CSS and carries no further 
information.]

[08/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/faq.html
--
diff --git a/faq.html b/faq.html
index 7ca7156..d2bf58f 100644
--- a/faq.html
+++ b/faq.html
@@ -24,27 +24,19 @@
 
   
 
-  
-  
-
-  
-
-  
-
-  
-
-
-
-
- 
+  
+  
+
+
+
+ 
 
   
   
 
 
 
-
+
 

   
@@ -108,9 +100,9 @@
 Why isn’t my task getting 
scheduled?
 How do I trigger 
tasks based on another task’s failure?
 Why
 are connection passwords still not encrypted in the metadata db after I 
installed airflow[crypto]?
-What’s the deal with start_date?
+What’s the deal with start_date?
 How can I create DAGs 
dynamically?
-What are all 
the airflow run commands in my process list?
+What are all 
the airflow 
run commands in my process list?
 How can my airflow dag run 
faster?
 How can we reduce the 
airflow UI page load time?
 How
 to fix Exception: Global variable explicit_defaults_for_timestamp needs to be 
on (1)?
@@ -129,7 +121,7 @@
 
 
   
-  
+  
 
   
   Airflow
@@ -137,9 +129,10 @@
   
 
 
-  
   
+
 
+
   
 
 
@@ -189,9 +182,9 @@
 Here are some of the common causes:
 
 Does your script “compile”? Can the Airflow engine parse it and find your
 DAG object? To test this, you can run airflow list_dags and
 confirm that your DAG shows up in the list. You can also run
 airflow list_tasks foo_dag_id --tree and confirm that your task
 shows up in the list as expected. If you use the CeleryExecutor, you
 may want to confirm that this works both where the scheduler runs as well
 as where the worker runs.
 
 Does the file containing your DAG contain the string “airflow” and “DAG” somewhere
 in the contents? When searching the DAG directory, Airflow ignores files not containing
 “airflow” and “DAG” in order to prevent the DagBag parsing from importing all python
 files collocated with users’ DAGs.
 
 Is your start_date set properly? The Airflow scheduler triggers the
 task soon after the start_date + schedule_interval is passed.
 
 Is your schedule_interval set properly? The default schedule_interval
 is one day (datetime.timedelta(1)). You must specify a different schedule_interval
 directly on the DAG object you instantiate, not as a default_param, as task instances
 do not override their parent DAG’s schedule_interval (see the sketch below).
 
 Is your start_date beyond where you can see it in the UI? If you
 set your start_date to some time, say 3 months ago, you won’t be able to see
 it in the main view in the UI, but you should be able to see it in
 Menu -> Browse -> Task Instances.
 
 Are the dependencies for the task met? The task instances directly
 upstream from the task need to be in a success state. Also,
 if you have set depends_on_past=True, the previous task instance
 needs to have succeeded (except if it is the first run for that task).
 Also, if wait_for_downstream=True, make sure you understand
 what it means. You can view how these properties are set from the
 Task Instance Details page for your task.
 
 Are the DagRuns you need created and active? A DagRun represents a specific
 execution of an entire DAG and has a state (running, success, failed, …).
 The scheduler creates new DagRuns as it moves forward, but never goes back
 in time to create new ones. The scheduler only evaluates running DagRuns
 to see what task instances it can trigger. Note that clearing task
 instances (from the UI or CLI) does set the state of a DagRun back to
 running. You can bulk view the list of DagRuns and alter states by clicking
 on the schedule tag for a DAG.
 
 Is the concurrency parameter of your DAG reached? concurrency defines
 how many running task instances a DAG is allowed to have, beyond which
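 To make the start_date and schedule_interval points above concrete, here is a
 minimal sketch of a DAG the scheduler will pick up, using the 1.10-era API;
 the DAG id and dates are illustrative only:
 
 from datetime import datetime, timedelta
 
 from airflow import DAG
 from airflow.operators.dummy_operator import DummyOperator
 
 dag = DAG(
     dag_id='example_scheduling',           # hypothetical id
     start_date=datetime(2018, 8, 1),       # a fixed date in the past
     schedule_interval=timedelta(hours=1),  # set on the DAG, not in default_args
 )
 
 noop = DummyOperator(task_id='noop', dag=dag)
 
 The run for a schedule period is only triggered once that period has passed,
 i.e. the 00:00-01:00 run fires shortly after 01:00.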

[04/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/integration.html
--
diff --git a/integration.html b/integration.html
index 782ae0e..326ae1e 100644
--- a/integration.html
+++ b/integration.html
@@ -280,21 +273,21 @@
 Airflow can be set up behind a reverse proxy, with the ability to set its
 endpoint with great flexibility.
 For example, you can configure your reverse proxy so that Airflow is served at:
 https://lab.mycompany.com/myorg/airflow/
 
 To do so, you need to set the following setting in your airflow.cfg:
 base_url = http://my_host/myorg/airflow
 
 Additionally, if you use the Celery Executor, you can get Flower at
 /myorg/flower with:
 flower_url_prefix = /myorg/flower
 
 Your reverse proxy (e.g. nginx) should be configured as follows:
 
 pass the url and http header as is to the Airflow webserver, without any rewrite, for example:
 server {
   listen 80;
   server_name lab.mycompany.com;
 
 rewrite the url for the flower endpoint:
 server {
     listen 80;
     server_name lab.mycompany.com;
 
@@ -352,7 +345,7 @@ field (see connection wasb_default for an 
example).
 
 
 class airflow.contrib.sensors.wasb_sensor.WasbBlobSensor(container_name, blob_name, wasb_conn_id='wasb_default', check_options=None, *args, **kwargs)[source]¶
 Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator
 Waits for a blob to arrive on Azure Blob Storage.
 
 
@@ -369,6 +362,13 @@ field (see connection wasb_default for an 
example).
 
 
 
+
+
+poke(context)[source]¶
+Function that the sensors defined while deriving this class should
+override.
+
+
 
 
 
@@ -377,7 +377,7 @@ field (see connection wasb_default for an 
example).
 
 
 class airflow.contrib.sensors.wasb_sensor.WasbPrefixSensor(container_name, prefix, wasb_conn_id='wasb_default', check_options=None, *args, **kwargs)[source]¶
 Bases: airflow.sensors.base_sensor_operator.BaseSensorOperator
 Waits for blobs matching a prefix to arrive on Azure Blob Storage.
 
 
@@ -394,6 +394,13 @@ field (see connection wasb_default for an 
example).
 
 
 
+
+
+poke(context)[source]¶
+Function that the sensors defined while deriving this class should
+override.
+
+
 
 
 
@@ -402,16 +409,16 @@ field (see connection wasb_default for an 
example).
 
 
 class airflow.contrib.operators.file_to_wasb.FileToWasbOperator(file_path, container_name, blob_name, wasb_conn_id='wasb_default', load_options=None, *args, **kwargs)[source]¶
 Bases: airflow.models.BaseOperator
 Uploads a file to Azure Blob Storage.
 
 
 
 
 Parameters:
 file_path (str) – Path to the file to load. (templated)
 container_name (str) – Name of the container. (templated)
 blob_name (str) – Name of the blob. (templated)
 wasb_conn_id (str) – Reference to the wasb connection.
 load_options (dict) – Optional keyword arguments that
 WasbHook.load_file() takes.
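 A minimal usage sketch of this operator; the path, container, blob name and
 the DAG object are illustrative, and wasb_default must be a configured connection:
 
 from airflow.contrib.operators.file_to_wasb import FileToWasbOperator
 
 upload_report = FileToWasbOperator(
     task_id='upload_report',
     file_path='/tmp/report.csv',      # templated
     container_name='reports',         # templated
     blob_name='daily/report.csv',     # templated
     wasb_conn_id='wasb_default',
     dag=dag,                          # assumes an existing DAG object
 )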
@@ -434,7 +441,7 @@ field (see connection wasb_default for an 
example).
 
 
 class airflow.contrib.hooks.wasb_hook.WasbHook(wasb_conn_id='wasb_default')[source]¶
 Bases: airflow.hooks.base_hook.BaseHook
 Interacts with Azure Blob Storage through the wasb:// protocol.
 Additional options passed in the ‘extra’ field of the connection will be
 passed to the BlockBlobService() constructor. For example, authenticate
 using a SAS token by adding {"sas_token": "YOUR_TOKEN"}.
 @@ -596,6 +603,237 @@ and password (=Storage account key), or login and SAS
 token in the extra field
 (see connection wasb_default for an example).
 
 AzureFileShareHook¶
+
+
+class airflow.contrib.hooks.azure_fileshare_hook.AzureFileShareHook(wasb_conn_id='wasb_default')[source]¶
+Bases: airflow.hooks.base_hook.BaseHook
+Interacts with Azure FileShare Storage.
+Additional options passed in the ‘extra’ field of the connection will be
+passed to the FileService() constructor.
+
+
+
+
+Parameters:wasb_conn_id (str) – Reference 
to the wasb connection.
+
+
+
+
+
+check_for_directory(share_name, directory_name, 
**kwargs)[source]¶
+Check if a directory exists on Azure File Share.
+
+
+
+
+Parameters:
+share_name (str) – Name of the share.
+directory_name (str) – Name of the 

[37/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/kubernetes_pod_operator.html
--
diff --git a/_modules/airflow/contrib/operators/kubernetes_pod_operator.html 
b/_modules/airflow/contrib/operators/kubernetes_pod_operator.html
index 07b5378..4fe38df 100644
--- a/_modules/airflow/contrib/operators/kubernetes_pod_operator.html
+++ b/_modules/airflow/contrib/operators/kubernetes_pod_operator.html
@@ -192,11 +182,15 @@
 from airflow.contrib.kubernetes import kube_client, pod_generator, pod_launcher
 from airflow.contrib.kubernetes.pod import Resources
 from airflow.utils.state import State
+from airflow.contrib.kubernetes.volume_mount import VolumeMount  # noqa
+from airflow.contrib.kubernetes.volume import Volume  # noqa
+from airflow.contrib.kubernetes.secret import Secret  # noqa
 
 template_fields = ('templates_dict',)
 template_ext = tuple()
 ui_color = '#ffefeb'
 
+
 [docs]class KubernetesPodOperator(BaseOperator):
 
 Execute a task in a Kubernetes Pod
@@ -206,12 +200,16 @@
     :type image: str
     :param namespace: the namespace to run within kubernetes
     :type namespace: str
     :param cmds: entrypoint of the container. (templated)
         The docker image's entrypoint is used if this is not provided.
     :type cmds: list of str
     :param arguments: arguments to the entrypoint. (templated)
         The docker image's CMD is used if this is not provided.
     :type arguments: list of str
     :param volume_mounts: volumeMounts for launched pod
     :type volume_mounts: list of VolumeMount
     :param volumes: volumes for launched pod. Includes ConfigMaps and PersistentVolumes
     :type volumes: list of Volume
     :param labels: labels to apply to the Pod
     :type labels: dict
     :param startup_timeout_seconds: timeout in seconds to startup the pod
@@ -219,30 +217,48 @@
     :param name: name of the task you want to run,
         will be used to generate a pod id
     :type name: str
     :param env_vars: Environment variables initialized in the container. (templated)
     :type env_vars: dict
     :param secrets: Kubernetes secrets to inject in the container.
         They can be exposed as environment vars or files in a volume.
     :type secrets: list of Secret
     :param in_cluster: run kubernetes client with in_cluster configuration
     :type in_cluster: bool
     :param cluster_context: context that points to the kubernetes cluster.
         Ignored when in_cluster is True. If None, current-context is used.
     :type cluster_context: string
     :param get_logs: get the stdout of the container as logs of the tasks
     :type get_logs: bool
     :param affinity: A dict containing a group of affinity scheduling rules
     :type affinity: dict
     :param config_file: The path to the Kubernetes config file
     :type config_file: str
     :param xcom_push: If xcom_push is True, the content of the file
         /airflow/xcom/return.json in the container will also be pushed to an
         XCom when the container completes.
     :type xcom_push: bool
 
     template_fields = ('cmds', 'arguments', 'env_vars', 'config_file')
 
     def execute(self, context):
         try:
             client = kube_client.get_kube_client(in_cluster=self.in_cluster,
                                                  cluster_context=self.cluster_context,
                                                  config_file=self.config_file)
             gen = pod_generator.PodGenerator()
 
             for mount in self.volume_mounts:
                 gen.add_mount(mount)
             for volume in self.volumes:
                 gen.add_volume(volume)
 
             pod = gen.make_pod(
                 namespace=self.namespace,
                 image=self.image,
                 pod_id=self.name,
                 cmds=self.cmds,
                 arguments=self.arguments,
                 labels=self.labels,
             )
 
             pod.secrets = self.secrets
@@ -250,9 +266,11 @@
 pod.image_pull_policy = self.image_pull_policy

[03/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/kubernetes.html
--
diff --git a/kubernetes.html b/kubernetes.html
index 62110ac..9d0ca02 100644
--- a/kubernetes.html
+++ b/kubernetes.html
@@ -176,7 +169,7 @@
 
 
 Kubernetes Operator¶
 from airflow.contrib.operators import KubernetesOperator
 from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
 from airflow.contrib.kubernetes.secret import Secret
 
@@ -260,8 +253,8 @@
 
 
 
 class airflow.contrib.operators.kubernetes_pod_operator.KubernetesPodOperator(namespace, image, name, cmds=None, arguments=None, volume_mounts=None, volumes=None, env_vars=None, secrets=None, in_cluster=False, cluster_context=None, labels=None, startup_timeout_seconds=120, get_logs=True, image_pull_policy='IfNotPresent', annotations=None, resources=None, affinity=None, config_file=None, xcom_push=False, *args, **kwargs)[source]¶
 Bases: airflow.models.BaseOperator
 Execute a task in a Kubernetes Pod
 
 
@@ -270,19 +263,28 @@
 Parameters:
 image (str) – Docker image you wish to launch. Defaults to dockerhub.io,
 but fully qualified URLS will point to custom repositories
 cmds (list of str) – entrypoint of the container. (templated)
 The docker image's entrypoint is used if this is not provided.
 arguments (list of str) – arguments to the entrypoint. (templated)
 The docker image's CMD is used if this is not provided.
 volume_mounts (list of VolumeMount) – volumeMounts for launched pod
 volumes (list of Volume) – volumes for launched pod. Includes ConfigMaps and PersistentVolumes
 labels (dict) – labels to apply to the Pod
 startup_timeout_seconds (int) – timeout in seconds to startup the pod
 name (str) – name of the task you want to run,
 will be used to generate a pod id
 env_vars (dict) – Environment variables initialized in the container. (templated)
 secrets (list of Secret) – Kubernetes secrets to inject in the container.
 They can be exposed as environment vars or files in a volume.
 in_cluster (bool) – run kubernetes client with in_cluster configuration
 cluster_context (string) – context that points to the kubernetes cluster.
 Ignored when in_cluster is True. If None, current-context is used.
 get_logs (bool) – get the stdout of the container as logs of the tasks
 affinity (dict) – A dict containing a group of affinity scheduling rules
 config_file (str) – The path to the Kubernetes config file
 xcom_push (bool) – If xcom_push is True, the content of the file
 /airflow/xcom/return.json in the container will also be pushed to an
 XCom when the container completes.
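 Putting the parameters above together, a minimal sketch of launching a pod;
 the image, namespace, context and DAG object are placeholders:
 
 from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
 
 pod_task = KubernetesPodOperator(
     task_id='pod_task',
     name='airflow-test-pod',        # used to generate the pod id
     namespace='default',
     image='python:3.6',
     cmds=['python', '-c'],          # templated
     arguments=["print('hello')"],   # templated
     in_cluster=False,
     cluster_context='minikube',     # ignored when in_cluster=True
     get_logs=True,
     dag=dag,                        # assumes an existing DAG object
 )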
 
 
 


[06/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/howto/executor/use-celery.html
--
diff --git a/howto/executor/use-celery.html b/howto/executor/use-celery.html
index 5305019..d29401f 100644
--- a/howto/executor/use-celery.html
+++ b/howto/executor/use-celery.html
@@ -96,6 +87,7 @@
 How-to Guides
 Setting Configuration Options
 Initializing a Database Backend
+Using Operators
 Managing Connections
 Securing Connections
 Writing Logs
@@ -187,22 +180,22 @@
 
   
 Scaling Out with Celery¶
 CeleryExecutor is one of the ways you can scale out the number of workers. For this
 to work, you need to set up a Celery backend (RabbitMQ, Redis, …) and
 change your airflow.cfg to point the executor parameter to
 CeleryExecutor and provide the related Celery settings.
 For more information about setting up a Celery broker, refer to the
 exhaustive Celery documentation on the topic:
 http://docs.celeryproject.org/en/latest/getting-started/brokers/index.html
 Here are a few imperative requirements for your workers:
 
 airflow needs to be installed, and the CLI needs to be in the path
 Airflow configuration settings should be homogeneous across the cluster
 Operators that are executed on the worker need to have their dependencies
 met in that context. For example, if you use the HiveOperator,
 the hive CLI needs to be installed on that box, or if you use the
 MySqlOperator, the required Python library needs to be available in
 the PYTHONPATH somehow
 The worker needs to have access to its DAGS_FOLDER, and you need to
 synchronize the filesystems by your own means. A common setup would be to
 store your DAGS_FOLDER in a Git repository and sync it across machines using
 Chef, Puppet, Ansible, or whatever you use to configure machines in your
 environment. If all your boxes have a common mount point, having your
 pipelines files shared there should work as well
 
 To kick off a worker, you need to set up Airflow and kick off the worker
 subcommand:
 airflow worker
 
 Your worker should start picking up tasks as soon as they get fired in
 its direction.
 Note that you can also run “Celery Flower”, a web UI built on top of Celery,
 to monitor your workers. You can use the shortcut command airflow flower
 to start a Flower web server.
 Some caveats:
 
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/howto/executor/use-dask.html
--
diff --git a/howto/executor/use-dask.html b/howto/executor/use-dask.html
index d8b2380..bc18ef5 100644
--- a/howto/executor/use-dask.html
+++ b/howto/executor/use-dask.html
[36/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/qubole_check_operator.html
--
diff --git a/_modules/airflow/contrib/operators/qubole_check_operator.html 
b/_modules/airflow/contrib/operators/qubole_check_operator.html
new file mode 100644
index 000..0d54850
--- /dev/null
+++ b/_modules/airflow/contrib/operators/qubole_check_operator.html
@@ -0,0 +1,443 @@
+  Source code for airflow.contrib.operators.qubole_check_operator
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under 
one
+# or more contributor license agreements.  See the NOTICE 
file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this 
file
+# to you under the Apache License, Version 2.0 (the
+# License); you may not use this file except in 
compliance
+# with the License.  You may obtain a copy of the License 
at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in 
writing,
+# software distributed under the License is distributed on 
an
+# AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
+# KIND, either express or implied.  See the License for 
the
+# specific language governing permissions and 
limitations
+# under the License.
+#
+from airflow.contrib.operators.qubole_operator import QuboleOperator
+from airflow.utils.decorators 
import apply_defaults
+from airflow.contrib.hooks.qubole_check_hook import QuboleCheckHook
+from airflow.operators.check_operator import CheckOperator, ValueCheckOperator
+from airflow.exceptions import AirflowException
+
+
+[docs]class QuboleCheckOperator(CheckOperator, QuboleOperator):
+
+    Performs checks against Qubole Commands. ``QuboleCheckOperator`` expects
+    a command that will be executed on QDS.
+    By default, each value on the first row of the result of this Qubole Command
+    is evaluated using python ``bool`` casting. If any of the
+    values return ``False``, the check is failed and errors out.
+
+    Note that Python bool casting evals the following as ``False``:
+
+    * ``False``
+    * ``0``
+    * Empty string (``""``)
+    * Empty list (``[]``)
+    * Empty dictionary or set (``{}``)
+
+    Given a query like ``SELECT COUNT(*) FROM foo``, it will fail only if
+    the count ``== 0``. You can craft much more complex queries that could,
+    for instance, check that the table has the same number of rows as
+    the source table upstream, or that the count of today's partition is
+    greater than yesterday's partition, or that a set of metrics are less
+    than 3 standard deviations from the 7 day average.
+
+    This operator can be used as a data quality check in your pipeline, and
+    depending on where you put it in your DAG, you have the choice to
+    stop the critical path, preventing the publishing of dubious data, or
+    on the side and receive email alerts without stopping the progress of the DAG.
+
+    :param qubole_conn_id: Connection id which consists of qds auth_token
+    :type qubole_conn_id: str
+
+    kwargs:
+
+    Arguments specific to Qubole command can be referred from QuboleOperator docs.
+
+    :results_parser_callable: This is an optional parameter to
+        extend the flexibility of parsing the results of Qubole
+        command to the users. This is a python callable which
+        can hold the logic to parse list of rows returned by Qubole command.
+        By default, only the values on first row are used for performing checks.
+        This callable should return
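 A hedged usage sketch of the check described above; the command type, query,
 connection id and DAG object are illustrative, and any Qubole command kwarg
 accepted by QuboleOperator may be passed:
 
 from airflow.contrib.operators.qubole_check_operator import QuboleCheckOperator
 
 row_count_check = QuboleCheckOperator(
     task_id='row_count_check',
     command_type='hivecmd',             # assumed Qubole command kwarg
     query='SELECT COUNT(*) FROM foo',   # the check fails if the count is 0
     qubole_conn_id='qubole_default',
     dag=dag,                            # assumes an existing DAG object
 )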

[34/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/winrm_operator.html
--
diff --git a/_modules/airflow/contrib/operators/winrm_operator.html 
b/_modules/airflow/contrib/operators/winrm_operator.html
new file mode 100644
index 000..fa546c3
--- /dev/null
+++ b/_modules/airflow/contrib/operators/winrm_operator.html
@@ -0,0 +1,328 @@
+  Source code for airflow.contrib.operators.winrm_operator
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under 
one
+# or more contributor license agreements.  See the NOTICE 
file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this 
file
+# to you under the Apache License, Version 2.0 (the
+# License); you may not use this file except in 
compliance
+# with the License.  You may obtain a copy of the License 
at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in 
writing,
+# software distributed under the License is distributed on 
an
+# AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
+# KIND, either express or implied.  See the License for 
the
+# specific language governing permissions and 
limitations
+# under the License.
+
+from airflow.contrib.hooks.winrm_hook import WinRMHook
+from airflow.exceptions import AirflowException
+from airflow.models import BaseOperator
+from airflow.utils.decorators 
import apply_defaults
+
+
+[docs]class WinRMOperator(BaseOperator):
+
+
+WinRMOperator to execute commands on given remote host 
using the winrm_hook.
+
+:param winrm_hook: predefined ssh_hook to use for remote 
execution
+:type winrm_hook: :class:`WinRMHook`
+:param ssh_conn_id: connection id from airflow 
Connections
+:type ssh_conn_id: str
+:param remote_host: remote host to connect
+:type remote_host: str
+:param command: command to execute on remote host. 
(templated)
+:type command: str
+:param timeout: timeout for executing the command.
+:type timeout: int
+:param do_xcom_push: return the stdout which also get set 
in xcom by airflow platform
+:type do_xcom_push: bool
+
+
+    template_fields = ('command',)
+
+    @apply_defaults
+    def __init__(self,
+                 winrm_hook=None,
+                 ssh_conn_id=None,
+                 remote_host=None,
+                 command=None,
+                 timeout=10,
+                 do_xcom_push=False,
+                 *args,
+                 **kwargs):
+        super(WinRMOperator, self).__init__(*args, **kwargs)
+        self.winrm_hook = winrm_hook
+        self.ssh_conn_id = ssh_conn_id
+        self.remote_host = remote_host
+        self.command = command
+        self.timeout = timeout
+        self.do_xcom_push = do_xcom_push
+
+    def execute(self, context):
+        try:
+            if self.ssh_conn_id and not self.winrm_hook:
+                self.log.info("hook not found, creating...")
+                self.winrm_hook = WinRMHook(ssh_conn_id=self.ssh_conn_id)
+
+            if not self.winrm_hook:
+                raise AirflowException("can not operate without ssh_hook or ssh_conn_id")
+
+            if self.remote_host is not None:
+                self.winrm_hook.remote_host = self.remote_host
+
+            winrm_client = self.winrm_hook.get_conn()
+            self.log.info("Established WinRM connection")
+
+            if not self.command:
+                raise AirflowException("no command specified so nothing to execute here.")
+
+            self.log.info("Starting command: {command} on
[25/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/dummy_operator.html
--
diff --git a/_modules/airflow/operators/dummy_operator.html 
b/_modules/airflow/operators/dummy_operator.html
index e730cbe..5502644 100644
--- a/_modules/airflow/operators/dummy_operator.html
+++ b/_modules/airflow/operators/dummy_operator.html

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/email_operator.html
--
diff --git a/_modules/airflow/operators/email_operator.html 
b/_modules/airflow/operators/email_operator.html
index 6a5980a..d365677 100644
--- a/_modules/airflow/operators/email_operator.html
+++ b/_modules/airflow/operators/email_operator.html
@@ -197,12 +187,12 @@
 
     Sends an email.
 
     :param to: list of emails to send the email to. (templated)
     :type to: list or string (comma or semicolon delimited)
     :param subject: subject line for the email. (templated)
     :type subject: string
     :param html_content: content of the email, html markup
         is allowed. (templated)
     :type html_content: string
     :param files: file names to attach in email
     :type files: list
@@ -246,13 +236,11 @@
     def execute(self, context):
         send_email(self.to, self.subject, self.html_content,
                    files=self.files, cc=self.cc, bcc=self.bcc,
-                   mime_subtype=self.mime_subtype, mine_charset=self.mime_charset)
+                   mime_subtype=self.mime_subtype, mime_charset=self.mime_charset)
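 A minimal usage sketch of the operator with the corrected keyword; the
 addresses, content and DAG object are placeholders:
 
 from airflow.operators.email_operator import EmailOperator
 
 notify = EmailOperator(
     task_id='notify',
     to='team@example.com',
     subject='Run {{ ds }} finished',              # templated
     html_content='<p>All tasks succeeded.</p>',   # templated, html allowed
     dag=dag,                                      # assumes an existing DAG object
 )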
 
 

[01/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
Repository: incubator-airflow-site
Updated Branches:
  refs/heads/asf-site 11437c14a -> 7d4d76286


http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/security.html
--
diff --git a/security.html b/security.html
index 76ef200..6bf00b5 100644
--- a/security.html
+++ b/security.html
@@ -218,9 +211,9 @@ backends or creating your own.
 
 Password¶
 One of the simplest mechanisms for authentication is requiring users to specify a password before logging in.
 Password authentication requires the use of the password subpackage in your requirements file. Password hashing
 uses bcrypt before storing passwords.
 [webserver]
 authenticate = True
 auth_backend = airflow.contrib.auth.backends.password_auth
 
 When password auth is enabled, an initial user credential will need to be created before anyone can login. An initial
 user was not created in the migrations for this authentication backend to prevent default Airflow installations from
 attack. Creating a new user has to be done via a Python REPL on the same machine Airflow is installed.
 # navigate to the airflow installation directory
 $ cd ~/airflow
 $ python
 Python 2.7.9 (default, Feb 10 2015, 03:28:08)
 Type "help", "copyright", "credits" or "license" for more information.
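 The REPL session then continues roughly as follows (a sketch of the documented
 approach; username, email and password are placeholders):
 
 >>> import airflow
 >>> from airflow import models, settings
 >>> from airflow.contrib.auth.backends.password_auth import PasswordUser
 >>> user = PasswordUser(models.User())
 >>> user.username = 'new_user_name'
 >>> user.email = 'new_user_email@example.com'
 >>> user.password = 'set_the_password'
 >>> session = settings.Session()
 >>> session.add(user)
 >>> session.commit()
 >>> session.close()
 >>> exit()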
 
 
 LDAP¶
 To turn on LDAP authentication configure your airflow.cfg as follows. Please note
 that the example uses an encrypted connection to the ldap server, as you probably
 do not want passwords to be readable on the network level.
 It is however possible to configure without encryption if you really want to.
 Additionally, if you are using Active Directory and are not explicitly
 specifying an OU that your users are in,
 you will need to change search_scope to “SUBTREE”.
 Valid search_scope options can be found in the ldap3 Documentation:
 http://ldap3.readthedocs.org/searches.html?highlight=search_scope
 [webserver]
 authenticate = True
 auth_backend = airflow.contrib.auth.backends.ldap_auth
 
 
 Roll your own¶
 Airflow uses flask_login and
 exposes a set of hooks in the airflow.default_login module. You can
 alter the content and make it part of the PYTHONPATH and configure it as a
 backend in airflow.cfg.
 [webserver]
 authenticate = True
 auth_backend = mypackage.auth
 
 Multi-tenancy¶
 You can filter the list of dags in webserver by owner name when authentication
 is turned on by setting webserver:filter_by_owner in your config. With this,
 a user will see only the dags which it is owner of, unless it is a superuser.
 [webserver]
 filter_by_owner = True
 
 
 @@ -324,7 +317,7 @@ host and launch a ticket renewer next to every worker it will most likely work.
 
 Airflow¶
 To enable kerberos you will need to generate a (service) key tab.
 # in the kadmin.local or kadmin shell, create the airflow principal
 kadmin:  addprinc -randkey airflow/fully.qualified.domain.n...@your-realm.com
 
 # Create the airflow keytab file that will contain the airflow principal
 kadmin:  xst -norandkey -k airflow.keytab airflow/fully.qualified.domain.name
 
 Now store this file in a location where the airflow user can read it (chmod 600),
 and then add the following to your airflow.cfg:
 [core]
 security = kerberos
 
 [kerberos]
 …
 
 Launch the ticket renewer by:
 # run ticket renewer
 airflow kerberos
 
 Hadoop¶
 If you want to use impersonation, this needs to be enabled in core-site.xml of your hadoop config:
 property
 
[35/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/snowflake_operator.html
--
diff --git a/_modules/airflow/contrib/operators/snowflake_operator.html 
b/_modules/airflow/contrib/operators/snowflake_operator.html
new file mode 100644
index 000..d324f9a
--- /dev/null
+++ b/_modules/airflow/contrib/operators/snowflake_operator.html
@@ -0,0 +1,285 @@
+  Source code for airflow.contrib.operators.snowflake_operator
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under 
one
+# or more contributor license agreements.  See the NOTICE 
file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this 
file
+# to you under the Apache License, Version 2.0 (the
+# License); you may not use this file except in 
compliance
+# with the License.  You may obtain a copy of the License 
at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in 
writing,
+# software distributed under the License is distributed on 
an
+# AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
+# KIND, either express or implied.  See the License for 
the
+# specific language governing permissions and 
limitations
+# under the License.
+from airflow.contrib.hooks.snowflake_hook import SnowflakeHook
+from airflow.models import BaseOperator
+from airflow.utils.decorators 
import apply_defaults
+
+
+[docs]class SnowflakeOperator(BaseOperator):
+
+    Executes sql code in a Snowflake database
+
+    :param snowflake_conn_id: reference to specific snowflake connection id
+    :type snowflake_conn_id: string
+    :param sql: the sql code to be executed. (templated)
+    :type sql: Can receive a str representing a sql statement,
+        a list of str (sql statements), or reference to a template file.
+        Template references are recognized by str ending in '.sql'
+    :param warehouse: name of warehouse which overwrites the one defined
+        in the connection
+    :type warehouse: string
+    :param database: name of database which overwrites the one defined in the connection
+    :type database: string
+
+
+    template_fields = ('sql',)
+    template_ext = ('.sql',)
+    ui_color = '#ededed'
+
+    @apply_defaults
+    def __init__(
+            self, sql, snowflake_conn_id='snowflake_default', parameters=None,
+            autocommit=True, warehouse=None, database=None, *args, **kwargs):
+        super(SnowflakeOperator, self).__init__(*args, **kwargs)
+        self.snowflake_conn_id = snowflake_conn_id
+        self.sql = sql
+        self.autocommit = autocommit
+        self.parameters = parameters
+        self.warehouse = warehouse
+        self.database = database
+
+    def get_hook(self):
+        return SnowflakeHook(snowflake_conn_id=self.snowflake_conn_id,
+                             warehouse=self.warehouse, database=self.database)
+
+    def execute(self, context):
+        self.log.info('Executing: %s', self.sql)
+        hook = self.get_hook()
+        hook.run(
+            self.sql,
+            autocommit=self.autocommit,
+            parameters=self.parameters)
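 A minimal usage sketch of the operator above; the SQL, warehouse, database and
 DAG object are placeholders:
 
 from airflow.contrib.operators.snowflake_operator import SnowflakeOperator
 
 create_table = SnowflakeOperator(
     task_id='create_table',
     sql='CREATE TABLE IF NOT EXISTS foo (id INT)',
     snowflake_conn_id='snowflake_default',
     warehouse='my_wh',     # overrides the warehouse from the connection
     database='my_db',      # overrides the database from the connection
     dag=dag,               # assumes an existing DAG object
 )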
[02/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/searchindex.js
--
diff --git a/searchindex.js b/searchindex.js
index dd547ce..1a79f0c 100644
--- a/searchindex.js
+++ b/searchindex.js
@@ -1 +1 @@
[38/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/gcs_operator.html
--
diff --git a/_modules/airflow/contrib/operators/gcs_operator.html 
b/_modules/airflow/contrib/operators/gcs_operator.html
index 1d95eae..dcdee63 100644
--- a/_modules/airflow/contrib/operators/gcs_operator.html
+++ b/_modules/airflow/contrib/operators/gcs_operator.html
@@ -203,10 +193,10 @@
 For more information, see Bucket Naming Guidelines:
 https://cloud.google.com/storage/docs/bucketnaming.html#requirements
 
 :param bucket_name: The name of the bucket. (templated)
 :type bucket_name: string
 :param storage_class: This defines how objects in the bucket are stored
     and determines the SLA and the cost of storage (templated). Values include
     - MULTI_REGIONAL
     - REGIONAL
     …
     If this value is not specified when the bucket is created,
     it will default to STANDARD.
 :type storage_class: string
 :param location: The location of the bucket. (templated)
     Object data for objects in the bucket resides in physical storage
     within this region. Defaults to US.
     See: https://developers.google.com/storage/docs/bucket-locations
 :type location: string
 :param project_id: The ID of the GCP Project. (templated)
 :type project_id: string
 :param labels: User-provided labels, in key/value pairs.
 :type labels: dict
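 Read together, the parameters above translate into a sketch like this; the
 bucket, project, labels and DAG object are placeholders:
 
 from airflow.contrib.operators.gcs_operator import GoogleCloudStorageCreateBucketOperator
 
 create_bucket = GoogleCloudStorageCreateBucketOperator(
     task_id='create_bucket',
     bucket_name='my-unique-bucket',   # templated
     storage_class='REGIONAL',         # templated
     location='EU',                    # templated
     project_id='my-project',          # templated
     labels={'env': 'dev'},
     dag=dag,                          # assumes an existing DAG object
 )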
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/gcs_to_bq.html
--
diff --git a/_modules/airflow/contrib/operators/gcs_to_bq.html 
b/_modules/airflow/contrib/operators/gcs_to_bq.html
index db2b756..4dabc6a 100644
--- a/_modules/airflow/contrib/operators/gcs_to_bq.html
+++ b/_modules/airflow/contrib/operators/gcs_to_bq.html
@@ -24,26 +24,17 @@
 
   
 
-  
-  
-
-  
-
-  
-
-  
-
-
-
- 
+  
+  
+
+ 
 
   
   
 
 
 
-
+
 

   
@@ -116,7 +107,7 @@
 
 
   
-  
+  
 
   
   Airflow
@@ -124,9 +115,10 @@
   
 
 
-  
   
+
 
+
   
 
 
@@ -156,8 +148,6 @@
 
   
 
-
-
   
 
   
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 

[05/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/howto/run-with-upstart.html
--
diff --git a/howto/run-with-upstart.html b/howto/run-with-upstart.html
index e8c7b88..add6b47 100644
--- a/howto/run-with-upstart.html
+++ b/howto/run-with-upstart.html
@@ -24,28 +24,19 @@
 
   
 
-  
-  
-
-  
-
-  
-
-  
-
-
-
-
-
- 
+  
+  
+
+
+
+ 
 
   
   
 
 
 
-
+
 

   
@@ -96,6 +87,7 @@
 How-to Guides
 Setting Configuration Options
 Initializing a Database Backend
+Using Operators
 Managing Connections
 Securing Connections
 Writing Logs
@@ -131,7 +123,7 @@
 
 
   
-  
+  
 
   
   Airflow
@@ -139,9 +131,10 @@
   
 
 
-  
   
+
 
+
   
 
 
@@ -188,25 +181,23 @@
   
 Running Airflow with upstart¶
 Airflow can integrate with upstart based systems. Upstart automatically starts all airflow services for which you
 have a corresponding *.conf file in /etc/init upon system boot. On failure, upstart automatically restarts
 the process (until it reaches the re-spawn limit set in a *.conf file).
 You can find sample upstart job files in the scripts/upstart directory. These files have been tested on
 Ubuntu 14.04 LTS. You may have to adjust the start on and stop on stanzas to make it work on other upstart
 systems. Some of the possible options are listed in scripts/upstart/README.
 Modify *.conf files as needed and copy them to the /etc/init directory. It is assumed that airflow will run
 under airflow:airflow. Change setuid and setgid in the *.conf files if you use another user/group.
 You can use initctl to manually start, stop, or view the status of an airflow process that has been
 integrated with upstart:
 initctl airflow-webserver status
 
 
 
 
 

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/howto/secure-connections.html
--
diff --git a/howto/secure-connections.html b/howto/secure-connections.html
index 4a6db31..1b801e0 100644
--- a/howto/secure-connections.html
+++ b/howto/secure-connections.html
@@ -188,39 +181,37 @@
   
 Securing Connections¶
 By default, Airflow will save the passwords for the connection in plain text
 within the metadata database. The crypto package is highly recommended
 during installation. The crypto package does require that your operating
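 Once the crypto package is available, a fernet key for airflow.cfg can be
 generated with a snippet along these lines (standard cryptography API, shown
 here as a sketch):
 
 from cryptography.fernet import Fernet
 
 fernet_key = Fernet.generate_key()
 print(fernet_key.decode())  # paste this value into fernet_key in airflow.cfg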
 

[29/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/hooks/http_hook.html
--
diff --git a/_modules/airflow/hooks/http_hook.html 
b/_modules/airflow/hooks/http_hook.html
index 0df6ddc..46340b7 100644
--- a/_modules/airflow/hooks/http_hook.html
+++ b/_modules/airflow/hooks/http_hook.html
@@ -191,6 +181,7 @@
 from builtins import str
 
 import requests
+import tenacity
 
 from airflow.hooks.base_hook import BaseHook
 from airflow.exceptions import AirflowException
@@ -199,16 +190,31 @@
 [docs]class HttpHook(BaseHook):
 
     Interact with HTTP servers.
+    :param http_conn_id: connection that has the base API url i.e https://www.google.com/
+        and optional authentication credentials. Default headers can also be specified in
+        the Extra field in json format.
+    :type http_conn_id: str
+    :param method: the API method to be called
+    :type method: str
 
 
-    def __init__(self, method='POST', http_conn_id='http_default'):
+    def __init__(
+            self,
+            method='POST',
+            http_conn_id='http_default'
+    ):
         self.http_conn_id = http_conn_id
         self.method = method
+        self.base_url = None
+        self._retry_obj = None
 
-    # headers is required to make it required
-    [docs] def get_conn(self, headers):
+    # headers may be passed through directly or in the extra field in the
+    # connection definition
+    [docs] def get_conn(self, headers=None):
 
         Returns http session for use with requests
+        :param headers: additional headers to be passed through as a dictionary
+        :type headers: dict
 
         conn = self.get_connection(self.http_conn_id)
         session = requests.Session()
@@ -221,9 +227,11 @@
         self.base_url = schema + '://' + conn.host
 
         if conn.port:
-            self.base_url = self.base_url + ':' + str(conn.port) + '/'
+            self.base_url = self.base_url + ':' + str(conn.port)
         if conn.login:
             session.auth = (conn.login, conn.password)
+        if conn.extra:
+            session.headers.update(conn.extra_dejson)
         if headers:
             session.headers.update(headers)
 
@@ -232,6 +240,16 @@
 [docs]    def run(self, endpoint, data=None, headers=None, extra_options=None):
 
         Performs the request
+        :param endpoint: the endpoint to be called i.e. resource/v1/query?
+        :type endpoint: str
+        :param data: payload to be uploaded or request parameters
+        :type data: dict
+        :param headers: additional headers to be passed through as a dictionary
+        :type headers: dict
+        :param extra_options: additional options to be used when executing the request
+            i.e. {'check_response': False} to avoid checking raising exceptions on non
+            2XX or 3XX status codes
+        :type extra_options: dict
 
         extra_options = extra_options or {}
 
@@ -261,43 +279,85 @@
         self.log.info('Sending %s to url: %s', self.method, url)
         return self.run_and_check(session, prepped_request, extra_options)
 
+[docs]
def check_response(self, response):
+
+Checks the status code and raise an AirflowException 
exception on non 2XX or 3XX
+status codes
+:param response: A requests response object
+:type response: requests.response
+
+try:
+response.raise_for_status()
+except requests.exceptions.HTTPError:
+self.log.error('HTTP error: %s', response.reason)
+if self.method not in ['GET', 'HEAD']:
+self.log.error(response.text)
+raise AirflowException(str(response.status_code) + ':' + response.reason)
+
 [docs]
def run_and_check(self, session, prepped_request, extra_options):
 
 Grabs extra options like timeout and actually runs 
the request,
 checking for the result
+:param session: the session to be used to execute the 
request
+:type session: requests.Session

[16/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/fonts/fontawesome-webfont.svg
--
diff --git a/_static/fonts/fontawesome-webfont.svg 
b/_static/fonts/fontawesome-webfont.svg
index 8b66187..855c845 100644
--- a/_static/fonts/fontawesome-webfont.svg
+++ b/_static/fonts/fontawesome-webfont.svg
@@ -1,685 +1,2671 @@
 [FontAwesome webfont SVG regenerated (FontForge build, Mon Oct 24 17:37:40 2016, Copyright Dave Gandy 2016, all rights reserved); glyph outline data stripped by the list archive]

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/fonts/fontawesome-webfont.ttf
--
diff --git a/_static/fonts/fontawesome-webfont.ttf 
b/_static/fonts/fontawesome-webfont.ttf
index f221e50..35acda2 100644
Binary files a/_static/fonts/fontawesome-webfont.ttf and 
b/_static/fonts/fontawesome-webfont.ttf differ




[45/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/discord_webhook_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/discord_webhook_hook.html 
b/_modules/airflow/contrib/hooks/discord_webhook_hook.html
index 1115b8c..b82d631 100644
--- a/_modules/airflow/contrib/hooks/discord_webhook_hook.html
+++ b/_modules/airflow/contrib/hooks/discord_webhook_hook.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -326,7 +314,7 @@
 
 
   
-  Built with Sphinx (http://sphinx-doc.org/) using a theme (https://github.com/snide/sphinx_rtd_theme) provided by Read the Docs (https://readthedocs.org).
+  Built with Sphinx (http://sphinx-doc.org/) using a theme (https://github.com/rtfd/sphinx_rtd_theme) provided by Read the Docs (https://readthedocs.org).
 
 
 
@@ -345,6 +333,7 @@
 var DOCUMENTATION_OPTIONS = {
 URL_ROOT:'../../../../',
 VERSION:'',
+LANGUAGE:'None',
 COLLAPSE_INDEX:false,
 FILE_SUFFIX:'.html',
 HAS_SOURCE:  true,
@@ -357,19 +346,13 @@
 
   
 
-  
-  
-
-  
+  
 
-  
-  
   
   jQuery(function () {
-  SphinxRtdTheme.StickyNav.enable();
+  SphinxRtdTheme.Navigation.enable(true);
   });
-  
-   
+   
 
 
 
\ No newline at end of file

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/hooks/emr_hook.html
--
diff --git a/_modules/airflow/contrib/hooks/emr_hook.html 
b/_modules/airflow/contrib/hooks/emr_hook.html
index 8a5a5c7..7aa2a43 100644
--- a/_modules/airflow/contrib/hooks/emr_hook.html
+++ b/_modules/airflow/contrib/hooks/emr_hook.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -194,7 +184,8 @@
 
 [docs]class EmrHook(AwsHook):
 
-Interact with AWS EMR. emr_conn_id is only necessary for using the create_job_flow method.
+Interact with AWS EMR. emr_conn_id is only necessary for using the
+create_job_flow method.
 
 
 def __init__(self, emr_conn_id=None, *args, **kwargs):
@@ -208,7 +199,8 @@
 [docs]
def create_job_flow(self, job_flow_overrides):
 
 Creates a job flow using the config from the EMR 
connection.
-Keys of the json extra hash may have the arguments of 
the boto3 run_job_flow method.
+Keys of the json extra hash may have the arguments of 
the boto3
+run_job_flow method.
 Overrides for this config may be passed as the 
job_flow_overrides.
 
 

[39/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/ecs_operator.html
--
diff --git a/_modules/airflow/contrib/operators/ecs_operator.html 
b/_modules/airflow/contrib/operators/ecs_operator.html
index 442c5ae..becc9fb 100644
--- a/_modules/airflow/contrib/operators/ecs_operator.html
+++ b/_modules/airflow/contrib/operators/ecs_operator.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -204,13 +194,17 @@
 :type task_definition: str
 :param cluster: the cluster name on EC2 Container 
Service
 :type cluster: str
-:param: overrides: the same parameter that boto3 will 
receive:
+:param: overrides: the same parameter that boto3 will 
receive (templated):
 
http://boto3.readthedocs.org/en/latest/reference/services/ecs.html#ECS.Client.run_task
 :type: overrides: dict
 :param aws_conn_id: connection id of AWS credentials / 
region name. If None,
-credential boto3 strategy will be used 
(http://boto3.readthedocs.io/en/latest/guide/configuration.html).
+credential boto3 strategy will be used
+
(http://boto3.readthedocs.io/en/latest/guide/configuration.html).
 :type aws_conn_id: str
-:param region_name: region name to use in AWS Hook. 
Override the region_name in connection (if provided)
+:param region_name: region name to use in AWS Hook.
+Override the region_name in connection (if 
provided)
+:param launch_type: the launch type on which to run your 
task (EC2 or FARGATE)
+:type: launch_type: str
 
 
 ui_color = '#f0ede4'
@@ -220,7 +214,7 @@
 
 @apply_defaults
 def __init__(self, task_definition, cluster, overrides,
- aws_conn_id=None, region_name=None, **kwargs):
+ aws_conn_id=None, region_name=None, launch_type='EC2', **kwargs):
 super(ECSOperator, self).__init__(**kwargs)
 
 self.aws_conn_id = aws_conn_id
@@ -228,13 +222,14 @@
 self.task_definition = task_definition
 self.cluster = cluster
 self.overrides = overrides
+self.launch_type = launch_type
 
 self.hook = self.get_hook()
 
 def execute(self, context):
 self.log.info(
 'Running ECS Task - Task definition: %s - on cluster %s',
-self.task_definition,self.cluster
+self.task_definition, self.cluster
 )
 self.log.info('ECSOperator overrides: %s', self.overrides)
 
@@ -247,7 +242,8 @@
 cluster=self.cluster,
 taskDefinition=self.task_definition,
 overrides=self.overrides,
-startedBy=self.owner
+startedBy=self.owner,
+launchType=self.launch_type
 )
 
 failures = response['failures']
@@ -282,13 +278,16 @@
 for task in response['tasks']:
 containers = task['containers']
 for container in containers:
-if container.get('lastStatus') == 'STOPPED' and container['exitCode'] != 0:
-raise AirflowException('This task is not in success state {}'.format(task))
+if container.get('lastStatus') == 'STOPPED' and \
+container['exitCode'] != 0:
+raise AirflowException(
+'This task is not in success state {}'.format(task))
 elif container.get('lastStatus') == 'PENDING':
 raise AirflowException('This task is still pending {}'.format(task))
 elif 'error' in container.get('reason', '').lower():
-raise AirflowException('This containers encounter an error during launching : {}'.
-   format(container.get('reason', '').lower()))
+raise AirflowException(
+'This containers encounter an error during launching : {}'.
+format(container.get('reason', '').lower()))
 
 def get_hook(self):
 return AwsHook(
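
A minimal sketch of the new launch_type parameter; task definition,
cluster and region below are illustrative:

    from airflow.contrib.operators.ecs_operator import ECSOperator

    run_task = ECSOperator(
        task_id='run_on_fargate',
        task_definition='my-task-def',   # hypothetical task definition
        cluster='my-cluster',            # hypothetical cluster name
        overrides={},
        aws_conn_id='aws_default',
        region_name='us-east-1',
        launch_type='FARGATE',           # defaults to 'EC2'
    )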

[41/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/bigquery_table_delete_operator.html
--
diff --git 
a/_modules/airflow/contrib/operators/bigquery_table_delete_operator.html 
b/_modules/airflow/contrib/operators/bigquery_table_delete_operator.html
index 94bc9dd..8b3a424 100644
--- a/_modules/airflow/contrib/operators/bigquery_table_delete_operator.html
+++ b/_modules/airflow/contrib/operators/bigquery_table_delete_operator.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -199,7 +189,7 @@
 
 :param deletion_dataset_table: A dotted
 
(project.|project:)dataset.table that indicates 
which table
-will be deleted.
+will be deleted. (templated)
 :type deletion_dataset_table: string
 :param bigquery_conn_id: reference to a specific BigQuery 
hook.
 :type bigquery_conn_id: string
\ No newline at end of file
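
A minimal sketch of the operator with the now-templated table reference;
project and dataset names are illustrative:

    from airflow.contrib.operators.bigquery_table_delete_operator import (
        BigQueryTableDeleteOperator)

    delete_tmp = BigQueryTableDeleteOperator(
        task_id='delete_tmp_table',
        # (project.|project:)dataset.table, rendered per execution date
        deletion_dataset_table='my-project.tmp_dataset.tmp_{{ ds_nodash }}',
    )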

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/contrib/operators/bigquery_to_bigquery.html
--
diff --git a/_modules/airflow/contrib/operators/bigquery_to_bigquery.html 
b/_modules/airflow/contrib/operators/bigquery_to_bigquery.html
index 0abb905..da4edcd 100644
--- a/_modules/airflow/contrib/operators/bigquery_to_bigquery.html
+++ b/_modules/airflow/contrib/operators/bigquery_to_bigquery.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -203,11 +193,12 @@
 
 :param source_project_dataset_tables: One or more
 dotted 
(project:|project.)dataset.table BigQuery tables to use as 
the
-source data. If project is not included, 
project will be the project defined
-in the connection json. Use a list if there are 
multiple source tables.
+source data. If project is not included, 
project will be the
+project defined in the connection json. Use a list if 
there are multiple
+source tables. (templated)
 :type source_project_dataset_tables: list|string
 :param destination_project_dataset_table: The destination 
BigQuery
-table. Format is: 
(project:|project.)dataset.table
+table. Format is: 
(project:|project.)dataset.table (templated)
 :type destination_project_dataset_table: string
 :param write_disposition: The write 

[11/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/js/theme.js
--
diff --git a/_static/js/theme.js b/_static/js/theme.js
index af661a9..62bc0b7 100644
--- a/_static/js/theme.js
+++ b/_static/js/theme.js
@@ -1,169 +1,3 @@
-require=(function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof 
require=="function"&if(!u&)return a(o,!0);if(i)return i(o,!0);var 
f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var 
l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return 
s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof 
require=="function"&for(var o=0;o");
-
-// Add expand links to all parents of nested ul
-$('.wy-menu-vertical ul').not('.simple').siblings('a').each(function 
() {
-var link = $(this);
-expand = $('');
-expand.on('click', function (ev) {
-self.toggleCurrent(link);
-ev.stopPropagation();
-return false;
-});
-link.prepend(expand);
-});
-};
-
-nav.reset = function () {
-// Get anchor from URL and open up nested nav
-var anchor = encodeURI(window.location.hash);
-if (anchor) {
-try {
-var link = $('.wy-menu-vertical')
-.find('[href="' + anchor + '"]');
-// If we didn't find a link, it may be because we clicked on
-// something that is not in the sidebar (eg: when using
-// sphinxcontrib.httpdomain it generates headerlinks but those
-// aren't picked up and placed in the toctree). So let's find
-// the closest header in the document and try with that one.
-if (link.length === 0) {
-  var doc_link = $('.document a[href="' + anchor + '"]');
-  var closest_section = doc_link.closest('div.section');
-  // Try again with the closest section entry.
-  link = $('.wy-menu-vertical')
-.find('[href="#' + closest_section.attr("id") + '"]');
-
-}
-$('.wy-menu-vertical li.toctree-l1 li.current')
-.removeClass('current');
-link.closest('li.toctree-l2').addClass('current');
-link.closest('li.toctree-l3').addClass('current');
-link.closest('li.toctree-l4').addClass('current');
-}
-catch (err) {
-console.log("Error expanding nav for anchor", err);
-}
-}
-};
-
-nav.onScroll = function () {
-this.winScroll = false;
-var newWinPosition = this.win.scrollTop(),
-winBottom = newWinPosition + this.winHeight,
-navPosition = this.navBar.scrollTop(),
-newNavPosition = navPosition + (newWinPosition - this.winPosition);
-if (newWinPosition < 0 || winBottom > this.docHeight) {
-return;
-}
-this.navBar.scrollTop(newNavPosition);
-this.winPosition = newWinPosition;
-};
-
-nav.onResize = function () {
-this.winResize = false;
-this.winHeight = this.win.height();
-this.docHeight = $(document).height();
-};
-
-nav.hashChange = function () {
-this.linkScroll = true;
-this.win.one('hashchange', function () {
-this.linkScroll = false;
-});
-};
-
-nav.toggleCurrent = function (elem) {
-var parent_li = elem.closest('li');
-parent_li.siblings('li.current').removeClass('current');
-parent_li.siblings().find('li.current').removeClass('current');
-parent_li.find('> ul li.current').removeClass('current');
-parent_li.toggleClass('current');
-}
-
-return nav;
-};
-
-module.exports.ThemeNav = ThemeNav();
-
-if (typeof(window) != 'undefined') {
-window.SphinxRtdTheme = { StickyNav: module.exports.ThemeNav };
-}
-
-},{"jquery":"jquery"}]},{},["sphinx-rtd-theme"]);
+/* sphinx_rtd_theme version 0.4.1 | MIT license */
+/* Built 20180727 10:07 */
+require=function n(e,i,t){function o(s,a){if(!i[s]){if(!e[s]){var 
l="function"==typeof require&if(!a&)return l(s,!0);if(r)return 
r(s,!0);var c=new Error("Cannot find module '"+s+"'");throw 
c.code="MODULE_NOT_FOUND",c}var 
u=i[s]={exports:{}};e[s][0].call(u.exports,function(n){var i=e[s][1][n];return 
o(i||n)},u,u.exports,n,e,i,t)}return i[s].exports}for(var r="function"==typeof 
require&,s=0;s"),n("table.docutils.footnote").wrap(""),n("table.docutils.citation").wrap(""),n(".wy-menu-vertical 
ul").not(".simple").siblings("a").each(function(){var i=n(this);expand=n(''),expand.on("click",function(n){return 
e.toggleCurrent(i),n.stopPropagation(),!1}),i.prepend(expand)})},reset:function(){var
 n=encodeURI(window.location.hash)||"#";try{var 

[26/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/bash_operator.html
--
diff --git a/_modules/airflow/operators/bash_operator.html 
b/_modules/airflow/operators/bash_operator.html
index d6d7dc6..da3c883 100644
--- a/_modules/airflow/operators/bash_operator.html
+++ b/_modules/airflow/operators/bash_operator.html
@@ -195,28 +185,18 @@
 from subprocess import Popen, 
STDOUT, PIPE
 from tempfile import gettempdir, NamedTemporaryFile
 
-from airflow import configuration as conf
 from airflow.exceptions import AirflowException
 from airflow.models import BaseOperator
 from airflow.utils.decorators 
import apply_defaults
 from airflow.utils.file import TemporaryDirectory
 
 
-# These variables are required in cases when BashOperator 
tasks use airflow specific code,
-# e.g. they import packages in the airflow context and the 
possibility of impersonation
-# gives not guarantee that these variables are available in 
the impersonated environment.
-# Hence, we need to propagate them in the Bash script used as 
a wrapper of commands in
-# this BashOperator.
-PYTHONPATH_VAR = 'PYTHONPATH'
-AIRFLOW_HOME_VAR = 'AIRFLOW_HOME'
-
-
 [docs]class BashOperator(BaseOperator):
 
 Execute a Bash script, command or set of commands.
 
 :param bash_command: The command, set of commands or 
reference to a
-bash script (must be .sh) to be 
executed.
+bash script (must be .sh) to be executed. 
(templated)
 :type bash_command: string
 :param xcom_push: If xcom_push is True, the last line 
written to stdout
 will also be pushed to an XCom when the bash command 
completes.
@@ -254,18 +234,12 @@
 
 self.log.info('Tmp dir root location: \n %s', gettempdir())
 
-airflow_home_value = conf.get('core', AIRFLOW_HOME_VAR)
-pythonpath_value = os.environ.get(PYTHONPATH_VAR, '')
-
-bash_command = ('export {}={}; '.format(AIRFLOW_HOME_VAR, airflow_home_value) +
-'export {}={}; '.format(PYTHONPATH_VAR, pythonpath_value) +
-self.bash_command)
-self.lineage_data = bash_command
+self.lineage_data = self.bash_command
 
 with TemporaryDirectory(prefix='airflowtmp') as tmp_dir:
 with NamedTemporaryFile(dir=tmp_dir, prefix=self.task_id) as f:
 
-f.write(bytes(bash_command, 'utf_8'))
+f.write(bytes(self.bash_command, 'utf_8'))
 f.flush()
 fname = f.name
 script_location = os.path.abspath(fname)
@@ -281,7 +255,7 @@
 signal.signal(getattr(signal, sig), signal.SIG_DFL)
 os.setsid()
 
-self.log.info('Running command: %s', bash_command)
+self.log.info('Running command: %s', self.bash_command)
 sp = Popen(
 ['bash', fname],
 stdout=PIPE, stderr=STDOUT,
\ No newline at end of file
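
With the change above the command is written to the temporary script
verbatim, so any AIRFLOW_HOME or PYTHONPATH exports a task relied on must
now be explicit. A minimal sketch (command is illustrative):

    from airflow.operators.bash_operator import BashOperator

    t = BashOperator(
        task_id='print_ds',
        # bash_command is templated; export what you need yourself now
        bash_command='export MY_VAR=1; echo "run for {{ ds }}: $MY_VAR"',
        xcom_push=True,  # last line of stdout is pushed to XCom
    )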

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/check_operator.html
--
diff --git a/_modules/airflow/operators/check_operator.html 
b/_modules/airflow/operators/check_operator.html
index 7e367ba..4d3fd89 100644
--- a/_modules/airflow/operators/check_operator.html
+++ b/_modules/airflow/operators/check_operator.html

[07/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/genindex.html
--
diff --git a/genindex.html b/genindex.html
index d9028f8..71dfe50 100644
--- a/genindex.html
+++ b/genindex.html
@@ -194,6 +187,7 @@
  | V
  | W
  | X
+ | Z
  
 
 A
@@ -205,6 +199,10 @@
 
   add_tasks() 
(airflow.models.DAG method)
 
+  aggregate()
 (airflow.contrib.hooks.mongo_hook.MongoHook method)
+
+  airflow.hooks.hive_hooks 
(module)
+
   airflow.macros (module)
 
   airflow.models (module)
@@ -229,6 +227,10 @@
 
   AwsRedshiftClusterSensor
 (class in airflow.contrib.sensors.aws_redshift_cluster_sensor), [1]
 
+  AzureDataLakeHook
 (class in airflow.contrib.hooks.azure_data_lake_hook), [1]
+
+  AzureFileShareHook
 (class in airflow.contrib.hooks.azure_fileshare_hook), [1]
+
   
 
 
@@ -259,10 +261,10 @@
 
   BigQueryIntervalCheckOperator
 (class in airflow.contrib.operators.bigquery_check_operator), [1]
 
-  
-  
   BigQueryOperator
 (class in airflow.contrib.operators.bigquery_operator), [1]
 
+  
+  
   BigQueryTableDeleteOperator
 (class in airflow.contrib.operators.bigquery_table_delete_operator), [1]
 
   BigQueryTableSensor
 (class in airflow.contrib.sensors.bigquery_sensor)
@@ -278,7 +280,13 @@
   build_job()
 
(airflow.contrib.operators.jenkins_job_trigger_operator.JenkinsJobTriggerOperator
 method)
 
   bulk_dump() 
(airflow.hooks.dbapi_hook.DbApiHook method)
+
+  
+(airflow.hooks.mysql_hook.MySqlHook
 method)
+
+(airflow.hooks.postgres_hook.PostgresHook
 method)
 
+  
   bulk_insert_rows()
 (airflow.hooks.oracle_hook.OracleHook method)
 
   bulk_load() 
(airflow.hooks.dbapi_hook.DbApiHook method)
@@ -286,6 +294,8 @@
   
 (airflow.hooks.mysql_hook.MySqlHook
 method)
 
+(airflow.hooks.postgres_hook.PostgresHook
 method)
+
   
   
 
@@ -293,14 +303,34 @@
 C
 
   
+  call() 
(airflow.hooks.zendesk_hook.ZendeskHook method)
+
+  CassandraHook
 (class in airflow.contrib.hooks.cassandra_hook)
+
+  CassandraToGoogleCloudStorageOperator
 (class in airflow.contrib.operators.cassandra_to_gcs)
+
   CeleryExecutor
 (class in airflow.executors.celery_executor)
 
+  Chart (class in 
airflow.models)
+
   check_for_blob()
 (airflow.contrib.hooks.wasb_hook.WasbHook method), [1]
 
   check_for_bucket()
 (airflow.hooks.S3_hook.S3Hook method), [1]
 
+  check_for_directory()
 (airflow.contrib.hooks.azure_fileshare_hook.AzureFileShareHook method), [1]
+
+  check_for_file()
 (airflow.contrib.hooks.azure_data_lake_hook.AzureDataLakeHook method), [1]
+
+  
+(airflow.contrib.hooks.azure_fileshare_hook.AzureFileShareHook
 method), [1]
+
+  
   check_for_key() 
(airflow.hooks.S3_hook.S3Hook method), [1]
 
+  check_for_named_partition()
 (airflow.hooks.hive_hooks.HiveMetastoreHook method)
+
+  check_for_partition()
 (airflow.hooks.hive_hooks.HiveMetastoreHook method)
+
   check_for_path()
 (airflow.hooks.webhdfs_hook.WebHDFSHook method)
 
   check_for_prefix()
 (airflow.contrib.hooks.wasb_hook.WasbHook method), [1]
@@ -311,6 +341,8 @@
   
   check_for_wildcard_key()
 (airflow.hooks.S3_hook.S3Hook method), [1]
 
+  check_response()
 (airflow.hooks.http_hook.HttpHook method)
+
   CheckOperator 
(class in airflow.operators.check_operator)
 
   clear() 
(airflow.models.BaseOperator method), [1]
@@ -333,14 +365,16 @@
   
   closest_ds_partition()
 (in module airflow.macros.hive)
 
+  CloudantHook 
(class in airflow.contrib.hooks.cloudant_hook)
+
   cluster_status()
 (airflow.contrib.hooks.redshift_hook.RedshiftHook method), [1]
 
   collect_dags() 
(airflow.models.DagBag method)
 
-  command() 
(airflow.models.TaskInstance method)
-
   
   
+  command() 
(airflow.models.TaskInstance method)
+
   command_as_list() 
(airflow.models.TaskInstance method)
 
   commit()
 (airflow.contrib.hooks.datastore_hook.DatastoreHook method), [1]
@@ -350,22 +384,40 @@
   Connection (class in 
airflow.models)
 
   construct_api_call_params()
 (airflow.operators.slack_operator.SlackAPIOperator method)
+
+  
+(airflow.operators.slack_operator.SlackAPIPostOperator
 method)
+
+  
+  construct_ingest_query()
 (airflow.operators.hive_to_druid.HiveToDruidTransfer method)
+
+  convert_map_type()
 
(airflow.contrib.operators.cassandra_to_gcs.CassandraToGoogleCloudStorageOperator
 class method)
+
+  convert_tuple_type()
 
(airflow.contrib.operators.cassandra_to_gcs.CassandraToGoogleCloudStorageOperator
 

[09/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/concepts.html
--
diff --git a/concepts.html b/concepts.html
index 3b9d4a3..58f47df 100644
--- a/concepts.html
+++ b/concepts.html
@@ -215,7 +208,7 @@ workflows.
 Core Ideas¶
 
 DAGs¶
-In Airflow, a DAG – or a Directed Acyclic Graph – is a 
collection of all
+In Airflow, a DAG – or a Directed Acyclic Graph – is a 
collection of all
 the tasks you want to run, organized in a way that reflects their relationships
 and dependencies.
 For example, a simple DAG could consist of three tasks: A, B, and C. It 
could
@@ -232,22 +225,22 @@ concerned with what its constituent tasks do; its job is 
to make sure that
 whatever they do happens at the right time, or in the right order, or with the
 right handling of any unexpected issues.
 DAGs are defined in standard Python files that are placed in Airflow’s
-DAG_FOLDER. 
Airflow will execute the code in each file to dynamically build
-the DAG 
objects. You can have as many DAGs as you want, each describing an
+DAG_FOLDER. Airflow will execute the code in each 
file to dynamically build
+the DAG objects. You can have as many DAGs as you want, 
each describing an
 arbitrary number of tasks. In general, each one should correspond to a single
 logical workflow.
 
 Note
 When searching for DAGs, Airflow will only consider files 
where the string
-“airflow” and “DAG” both appear in the contents of the .py file.
+“airflow” and “DAG” both appear in the contents of the .py 
file.
 
 
 Scope¶
-Airflow will load any DAG object it can import from a DAGfile. Critically,
-that means the DAG must appear in globals(). Consider the following two
-DAGs. Only dag_1 will be loaded; the other one only appears in a 
local
+Airflow will load any DAG object it can import from a DAGfile. Critically,
+that means the DAG must appear in globals(). Consider the following 
two
+DAGs. Only dag_1 will be loaded; the other one only appears in a 
local
 scope.
-dag_1 = DAG('this_dag_will_be_discovered')
+dag_1 = DAG('this_dag_will_be_discovered')
 
 def my_function():
 dag_2 = DAG('but_this_dag_will_not')
@@ -256,14 +249,14 @@ scope.
 
 
 Sometimes this can be put to good use. For example, a common pattern with
-SubDagOperator 
is to define the subdag inside a function so that Airflow
+SubDagOperator is to define the subdag inside a 
function so that Airflow
 doesn’t try to load it as a standalone DAG.
 
 
 Default Arguments¶
-If a dictionary of default_args is passed to a DAG, it will apply them to
+If a dictionary of default_args is passed to a DAG, it will apply them to
 any of its operators. This makes it easy to apply a common parameter to many 
operators without having to type it many times.
-default_args = {
+default_args = {
 'start_date': datetime(2016, 1, 1),
 'owner': 'Airflow'
 }
@@ -278,7 +271,7 @@ any of its operators. This makes it easy to apply a common 
parameter to many ope
 Context Manager¶
 Added in Airflow 1.8
 DAGs can be used as context managers to automatically assign new operators 
to that DAG.
-with DAG('my_dag', start_date=datetime(2016, 1, 1)) as dag:
+with DAG('my_dag', start_date=datetime(2016, 1, 1)) as dag:
 op = DummyOperator('op')
 
 op.dag is dag  # True
@@ -287,8 +280,8 @@ any of its operators. This makes it easy to apply a common 
parameter to many ope
 
 
 
-Operators¶
-While DAGs describe how to run a workflow, Operators determine what
+Operators¶
+While DAGs describe how to run a workflow, Operators determine what
 actually gets done.
 An operator describes a single task in a workflow. Operators are usually 
(but
 not always) atomic, meaning they can stand on their own and don’t need to 
share
@@ -302,30 +295,31 @@ Airflow does have a feature for operator 
cross-communication called XCom that is
 described elsewhere in this document.
 Airflow provides operators for many common tasks, including:
 
-BashOperator - executes a bash command
-PythonOperator - calls an arbitrary Python 
function
-EmailOperator - sends an email
-HTTPOperator - sends an HTTP request
-MySqlOperator, SqliteOperator, PostgresOperator, MsSqlOperator, OracleOperator, JdbcOperator, etc. - executes a SQL 
command
-Sensor - 
waits for a certain time, file, database row, S3 key, etc…
+BashOperator - executes a bash command
+PythonOperator - calls an arbitrary Python 
function
+EmailOperator - sends an email
+SimpleHttpOperator - sends an HTTP request
+MySqlOperator, SqliteOperator, PostgresOperator, MsSqlOperator, OracleOperator, JdbcOperator, etc. - executes a 
SQL 

[30/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/hooks/hive_hooks.html
--
diff --git a/_modules/airflow/hooks/hive_hooks.html 
b/_modules/airflow/hooks/hive_hooks.html
new file mode 100644
index 000..ff22edf
--- /dev/null
+++ b/_modules/airflow/hooks/hive_hooks.html
@@ -0,0 +1,1098 @@
+  Source code for airflow.hooks.hive_hooks
+# -*- coding: utf-8 -*-
+#
+# Licensed to the Apache Software Foundation (ASF) under 
one
+# or more contributor license agreements.  See the NOTICE 
file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this 
file
+# to you under the Apache License, Version 2.0 (the
+# License); you may not use this file except in 
compliance
+# with the License.  You may obtain a copy of the License 
at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in 
writing,
+# software distributed under the License is distributed on 
an
+# AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
+# KIND, either express or implied.  See the License for 
the
+# specific language governing permissions and 
limitations
+# under the License.
+
+from __future__ import print_function, unicode_literals
+
+import contextlib
+import os
+
+from six.moves import zip
+from past.builtins import basestring, unicode
+
+import unicodecsv as csv
+import re
+import six
+import subprocess
+import time
+from collections import OrderedDict
+from tempfile import NamedTemporaryFile
+import hmsclient
+
+from airflow import configuration as conf
+from airflow.exceptions import AirflowException
+from airflow.hooks.base_hook 
import BaseHook
+from airflow.utils.helpers 
import as_flattened_list
+from airflow.utils.file import TemporaryDirectory
+from airflow import configuration
+import airflow.security.utils 
as utils
+
+HIVE_QUEUE_PRIORITIES = ['VERY_HIGH', 'HIGH', 'NORMAL', 'LOW', 'VERY_LOW']
+
+
+[docs]class HiveCliHook(BaseHook):
+Simple wrapper around the hive 
CLI.
+
+It also supports the ``beeline``
+a lighter CLI that runs JDBC and is replacing the 
heavier
+traditional CLI. To enable ``beeline``, set the 
use_beeline param in the
+extra field of your connection as in ``{ 
use_beeline: true }``
+
+Note that you can also set default hive CLI parameters 
using the
+``hive_cli_params`` to be used in your connection as 
in
+``{hive_cli_params: -hiveconf 
mapred.job.tracker=some.jobtracker:444}``
+Parameters passed here can be overridden by run_clis 
hive_conf param
+
+The extra connection parameter ``auth`` gets passed as in 
the ``jdbc``
+connection string as is.
+
+:param mapred_queue: queue used by the Hadoop Scheduler 
(Capacity or Fair)
+:type  mapred_queue: string
+:param mapred_queue_priority: priority within the job 
queue.
+Possible settings include: VERY_HIGH, HIGH, NORMAL, 
LOW, VERY_LOW
+:type  mapred_queue_priority: string
+:param mapred_job_name: This name will appear in the 
jobtracker.
+This can make monitoring easier.
+:type  mapred_job_name: string
+
+
+def __init__(
+self,
+hive_cli_conn_id='hive_cli_default',
+run_as=None,
+mapred_queue=None,
+mapred_queue_priority=None,
+mapred_job_name=None):
+conn = self.get_connection(hive_cli_conn_id)
+self.hive_cli_params = conn.extra_dejson.get('hive_cli_params', '')
+self.use_beeline = conn.extra_dejson.get('use_beeline', False)
+self.auth = conn.extra_dejson.get('auth', 'noSasl')
+self.conn = conn
+self.run_as = run_as
+
+if 
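
A minimal sketch of the hook described above; the query is illustrative
and the beeline settings from the docstring are assumed to sit in the
connection's extra JSON:

    from airflow.hooks.hive_hooks import HiveCliHook

    # extra JSON on the connection, e.g.:
    #   {"use_beeline": true, "hive_cli_params": "-hiveconf key=value"}
    hive = HiveCliHook(hive_cli_conn_id='hive_cli_default',
                       mapred_queue='default',
                       mapred_queue_priority='HIGH')
    hive.run_cli('SHOW TABLES;', schema='default')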

[22/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/presto_to_mysql.html
--
diff --git a/_modules/airflow/operators/presto_to_mysql.html 
b/_modules/airflow/operators/presto_to_mysql.html
index 0f27e92..7e86afb 100644
--- a/_modules/airflow/operators/presto_to_mysql.html
+++ b/_modules/airflow/operators/presto_to_mysql.html
@@ -178,15 +168,16 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
 # KIND, either express or implied.  See the License for 
the
 # specific language governing permissions and 
limitations
 # under the License.
+
 from airflow.hooks.presto_hook 
import PrestoHook
 from airflow.hooks.mysql_hook 
import MySqlHook
 from airflow.models import BaseOperator
@@ -199,19 +190,19 @@
 into memory before being pushed to MySQL, so this 
operator should
 be used for smallish amount of data.
 
-:param sql: SQL query to execute against Presto
+:param sql: SQL query to execute against Presto. 
(templated)
 :type sql: str
 :param mysql_table: target MySQL table, use dot notation 
to target a
-specific database
+specific database. (templated)
 :type mysql_table: str
 :param mysql_conn_id: source mysql connection
 :type mysql_conn_id: str
 :param presto_conn_id: source presto connection
 :type presto_conn_id: str
 :param mysql_preoperator: sql statement to run against 
mysql prior to
-import, typically used to truncate or delete in place of the data
-coming in, allowing the task to be idempotent (running the task
-twice won't double load data)
+import, typically used to truncate or delete in place
+of the data coming in, allowing the task to be idempotent (running
+the task twice won't double load data). (templated)
 :type mysql_preoperator: str
 
 
\ No newline at end of file
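
A minimal sketch of the transfer with the now-templated fields; table
names and query are illustrative:

    from airflow.operators.presto_to_mysql import PrestoToMySqlTransfer

    load_counts = PrestoToMySqlTransfer(
        task_id='presto_to_mysql',
        sql="SELECT name, count(*) AS cnt FROM events "
            "WHERE ds = '{{ ds }}' GROUP BY name",
        mysql_table='stats.name_counts',
        # preoperator keeps the task idempotent across re-runs
        mysql_preoperator="DELETE FROM stats.name_counts WHERE ds = '{{ ds }}'",
    )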

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/operators/python_operator.html
--
diff --git a/_modules/airflow/operators/python_operator.html 
b/_modules/airflow/operators/python_operator.html
index fb1e29d..d5ff59b 100644
--- a/_modules/airflow/operators/python_operator.html
+++ b/_modules/airflow/operators/python_operator.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ 

[20/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/sensors/hdfs_sensor.html
--
diff --git a/_modules/airflow/sensors/hdfs_sensor.html 
b/_modules/airflow/sensors/hdfs_sensor.html
index 7747a54..fdfec6d 100644
--- a/_modules/airflow/sensors/hdfs_sensor.html
+++ b/_modules/airflow/sensors/hdfs_sensor.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -266,7 +256,7 @@
 log.debug('HdfsSensor.poke: after ext filter result is %s', result)
 return result
 
-def poke(self, context):
+[docs]
def poke(self, context):
 sb = self.hook(self.hdfs_conn_id).get_conn()
 self.log.info('Poking for file {self.filepath}'.format(**locals()))
 try:
@@ -285,13 +275,11 @@
 except Exception:
 e = sys.exc_info()
-self.log.debug('Caught an exception !: %s', str(e))
-return False
+return False
 
 

\ No newline at end of file
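
A minimal sketch of the sensor; the path and timings are illustrative:

    from airflow.sensors.hdfs_sensor import HdfsSensor

    wait_for_input = HdfsSensor(
        task_id='wait_for_input',
        filepath='/data/incoming/{{ ds_nodash }}/_SUCCESS',
        hdfs_conn_id='hdfs_default',
        poke_interval=60,       # seconds between pokes
        timeout=60 * 60,        # give up after an hour
    )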

http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_modules/airflow/sensors/hive_partition_sensor.html
--
diff --git a/_modules/airflow/sensors/hive_partition_sensor.html 
b/_modules/airflow/sensors/hive_partition_sensor.html
index 5549f84..d70cbb5 100644
--- a/_modules/airflow/sensors/hive_partition_sensor.html
+++ b/_modules/airflow/sensors/hive_partition_sensor.html
@@ -178,9 +168,9 @@
 # to you under the Apache License, Version 2.0 (the
 # License); you may not use this file except in 
compliance
 # with the License.  You may obtain a copy of the License 
at
-# 
+#
 #   http://www.apache.org/licenses/LICENSE-2.0
-# 
+#
 # Unless required by applicable law or agreed to in 
writing,
 # software distributed under the License is distributed on 
an
 # AS IS BASIS, WITHOUT WARRANTIES OR CONDITIONS 
OF ANY
@@ -232,7 +222,7 @@
 self.partition = partition
 self.schema = schema
 
-def poke(self, context):
+[docs]
def poke(self, context):
 if '.' in self.table:
 self.schema, self.table = self.table.split('.')
 self.log.info(
@@ -243,13 +233,11 @@
 self.hook = HiveMetastoreHook(
 metastore_conn_id=self.metastore_conn_id)
 return self.hook.check_for_partition(
-self.schema, self.table, self.partition)
+self.schema, self.table, self.partition)
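
A minimal sketch of the sensor; as poke() above shows, the schema may be
inlined as 'schema.table'. Table and partition spec are illustrative:

    from airflow.sensors.hive_partition_sensor import HivePartitionSensor

    wait_for_partition = HivePartitionSensor(
        task_id='wait_for_partition',
        table='default.events',            # schema inlined, split in poke()
        partition="ds='{{ ds }}'",
        metastore_conn_id='metastore_default',
    )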
 
 


[14/48] incubator-airflow-site git commit: 1.10.0 with Updated Api Reference

2018-08-29 Thread kaxilnaik
http://git-wip-us.apache.org/repos/asf/incubator-airflow-site/blob/7d4d7628/_static/jquery-3.1.0.js
--
diff --git a/_static/jquery-3.1.0.js b/_static/jquery-3.1.0.js
deleted file mode 100644
index f2fc274..000
--- a/_static/jquery-3.1.0.js
+++ /dev/null
@@ -1,10074 +0,0 @@
-/*!
- * jQuery JavaScript Library v3.1.0
- * https://jquery.com/
- * Includes Sizzle.js
- * Copyright jQuery Foundation and other contributors
- * Released under the MIT license
- * Date: 2016-07-07T21:44Z
- */
-[remaining 10,000+ lines of the vendored jQuery 3.1.0 source deleted by this commit omitted]