[AIRFLOW-1933] Fix some typos

Closes #2474 from Philippus/patch-1


Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/ad28decc
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/ad28decc
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/ad28decc

Branch: refs/heads/v1-10-test
Commit: ad28decc74a87f6f9fbff57f200199860a2bfa81
Parents: 3952e05
Author: Philippus Baalman <[email protected]>
Authored: Sun Apr 29 20:08:48 2018 -0700
Committer: r39132 <[email protected]>
Committed: Sun Apr 29 20:08:48 2018 -0700

----------------------------------------------------------------------
 UPDATING.md | 22 +++++++++++-----------
 1 file changed, 11 insertions(+), 11 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/ad28decc/UPDATING.md
----------------------------------------------------------------------
diff --git a/UPDATING.md b/UPDATING.md
index 609c8db..8006876 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -74,17 +74,17 @@ With Airflow 1.9 or lower, there were two connection strings for the Google Clou
 
 ### SSH Hook updates, along with new SSH Operator & SFTP Operator
 
-SSH Hook now uses Paramiko library to create ssh client connection, instead of sub-process based ssh command execution previously (<1.9.0), so this is backward incompatible.
+SSH Hook now uses the Paramiko library to create an ssh client connection, instead of the sub-process based ssh command execution previously (<1.9.0), so this is backward incompatible.
   - update SSHHook constructor
-  - use SSHOperator class in place of SSHExecuteOperator which is removed now. Refer test_ssh_operator.py for usage info.
-  - SFTPOperator is added to perform secure file transfer from serverA to serverB. Refer test_sftp_operator.py.py for usage info.
+  - use SSHOperator class in place of SSHExecuteOperator which is removed now. Refer to test_ssh_operator.py for usage info.
+  - SFTPOperator is added to perform secure file transfer from serverA to serverB. Refer to test_sftp_operator.py.py for usage info.
   - No updates are required if you are using ftpHook, it will continue to work as is.
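
A minimal sketch of a DAG using the new operators; the `ssh_default` connection id, the command and the file paths are illustrative assumptions (see test_ssh_operator.py and test_sftp_operator.py for the actual usage examples referenced above):

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.ssh_operator import SSHOperator
    from airflow.contrib.operators.sftp_operator import SFTPOperator

    dag = DAG('ssh_sftp_example', start_date=datetime(2018, 1, 1), schedule_interval=None)

    run_remote_cmd = SSHOperator(
        task_id='run_remote_cmd',
        ssh_conn_id='ssh_default',           # an SSH connection defined in Admin -> Connections
        command='uptime',
        dag=dag,
    )

    upload_file = SFTPOperator(
        task_id='upload_file',
        ssh_conn_id='ssh_default',
        local_filepath='/tmp/report.csv',    # file on the worker
        remote_filepath='/data/report.csv',  # destination on the remote host
        dag=dag,
    )

    run_remote_cmd >> upload_file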
 
 ### S3Hook switched to use Boto3
 
 The airflow.hooks.S3_hook.S3Hook has been switched to use boto3 instead of the older boto (a.k.a. boto2). This results in a few backwards incompatible changes to the following classes: S3Hook:
  - the constructors no longer accepts `s3_conn_id`. It is now called `aws_conn_id`.
-  - the default conneciton is now "aws_default" instead of "s3_default"
+  - the default connection is now "aws_default" instead of "s3_default"
   - the return type of objects returned by `get_bucket` is now boto3.s3.Bucket
  - the return type of `get_key`, and `get_wildcard_key` is now an boto3.S3.Object.
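
A short sketch against the boto3-backed hook and the new default connection id; the bucket and key names are placeholders:

    from airflow.hooks.S3_hook import S3Hook

    hook = S3Hook(aws_conn_id='aws_default')        # previously s3_conn_id='s3_default'

    bucket = hook.get_bucket('my-bucket')           # now a boto3.s3.Bucket
    obj = hook.get_key('data/file.csv', bucket_name='my-bucket')  # now a boto3 Object
    print(obj.key)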
 
@@ -106,7 +106,7 @@ Once a logger has determined that a message needs to be processed, it is passed
 
 #### Changes in Airflow Logging
 
-Airflow's logging mechanism has been refactored to uses Python’s builtin `logging` module to perform logging of the application. By extending classes with the existing `LoggingMixin`, all the logging will go through a central logger. Also the `BaseHook` and `BaseOperator` already extends this class, so it is easily available to do logging.
+Airflow's logging mechanism has been refactored to use Python’s builtin `logging` module to perform logging of the application. By extending classes with the existing `LoggingMixin`, all the logging will go through a central logger. Also the `BaseHook` and `BaseOperator` already extend this class, so it is easily available to do logging.
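
As a rough illustration of that pattern (the class below is made up), any class can opt into the central logger by extending `LoggingMixin`:

    from airflow.utils.log.logging_mixin import LoggingMixin

    class MyHelper(LoggingMixin):
        def do_work(self):
            # self.log is provided by LoggingMixin and routes through
            # Airflow's central logging configuration
            self.log.info("doing some work")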
 
 The main benefit is easier configuration of the logging by setting a single centralized python file. Disclaimer; there is still some inline configuration, but this will be removed eventually. The new logging class is defined by setting the dotted classpath in your `~/airflow/airflow.cfg` file:
 
@@ -153,7 +153,7 @@ The `file_task_handler` logger has been made more flexible. The default format c
 If you are logging to Google cloud storage, please see the [Google cloud platform documentation](https://airflow.incubator.apache.org/integration.html#gcp-google-cloud-platform) for logging instructions.
 
 If you are using S3, the instructions should be largely the same as the Google cloud platform instructions above. You will need a custom logging config. The `REMOTE_BASE_LOG_FOLDER` configuration key in your airflow config has been removed, therefore you will need to take the following steps:
- - Copy the logging configuration from [`airflow/config_templates/airflow_logging_settings.py`](https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py) and copy it.
+ - Copy the logging configuration from [`airflow/config_templates/airflow_logging_settings.py`](https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py).
  - Place it in a directory inside the Python import path `PYTHONPATH`. If you are using Python 2.7, ensuring that any `__init__.py` files exist so that it is importable.
  - Update the config by setting the path of `REMOTE_BASE_LOG_FOLDER` explicitly in the config. The `REMOTE_BASE_LOG_FOLDER` key is not used anymore.
  - Set the `logging_config_class` to the filename and dict. For example, if you place `custom_logging_config.py` on the base of your pythonpath, you will need to set `logging_config_class = custom_logging_config.LOGGING_CONFIG` in your config as Airflow 1.8.
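
A sketch of one way to wire up these steps; the module name follows the `custom_logging_config.py` example in the text, and instead of pasting the whole template it imports the shipped defaults, which assumes `DEFAULT_LOGGING_CONFIG` is the name exposed by your Airflow version:

    # custom_logging_config.py -- placed on the PYTHONPATH
    # (on Python 2.7, make sure an __init__.py exists alongside it)
    from copy import deepcopy

    from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

    # start from the shipped defaults, then adjust handlers, e.g. point them
    # at your remote log location (REMOTE_BASE_LOG_FOLDER is no longer read)
    LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

    # and in airflow.cfg, under [core]:
    #   logging_config_class = custom_logging_config.LOGGING_CONFIG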
@@ -180,12 +180,12 @@ supported and will be removed entirely in Airflow 2.0
  Note that JSON serialization is stricter than pickling, so if you want to e.g. pass
   raw bytes through XCom you must encode them using an encoding like base64.
   By default pickling is still enabled until Airflow 2.0. To disable it
-  Set enable_xcom_pickling = False in your Airflow config.
+  set enable_xcom_pickling = False in your Airflow config.
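
In recent 1.x releases this setting lives in the `[core]` section of `airflow.cfg`, so the change amounts to:

    [core]
    enable_xcom_pickling = False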
 
 ## Airflow 1.8.1
 
 The Airflow package name was changed from `airflow` to `apache-airflow` during this release. You must uninstall
-previously installed version of Airflow before installing 1.8.1.
+a previously installed version of Airflow before installing 1.8.1.
 
 ## Airflow 1.8
 
@@ -202,12 +202,12 @@ Systemd unit files have been updated. If you use systemd please make sure to upd
 Airflow 1.7.1 has issues with being able to over subscribe to a pool, ie. more slots could be used than were
 available. This is fixed in Airflow 1.8.0, but due to past issue jobs may fail to start although their
 dependencies are met after an upgrade. To workaround either temporarily increase the amount of slots above
-the the amount of queued tasks or use a new pool.
+the amount of queued tasks or use a new pool.
 
 ### Less forgiving scheduler on dynamic start_date
 Using a dynamic start_date (e.g. `start_date = datetime.now()`) is not considered a best practice. The 1.8.0 scheduler
 is less forgiving in this area. If you encounter DAGs not being scheduled you can try using a fixed start_date and
-renaming your dag. The last step is required to make sure you start with a clean slate, otherwise the old schedule can
+renaming your DAG. The last step is required to make sure you start with a clean slate, otherwise the old schedule can
 interfere.
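
A minimal sketch of the recommended pattern, with a pinned `start_date`; the DAG id and schedule are placeholders:

    from datetime import datetime
    from airflow import DAG

    dag = DAG(
        dag_id='my_dag_v2',                  # renamed to start from a clean slate
        start_date=datetime(2016, 1, 1),     # fixed value, identical on every parse
        schedule_interval='@daily',
    )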
 
 ### New and updated scheduler options
@@ -243,7 +243,7 @@ By default the scheduler will fill any missing interval DAG Runs between the las
 This setting changes that behavior to only execute the latest interval. This can also be specified per DAG as
 `catchup = False / True`. Command line backfills will still work.
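
Per DAG that looks roughly like the following (the DAG id and schedule are placeholders; globally the equivalent knob is `catchup_by_default` in `airflow.cfg`):

    from datetime import datetime
    from airflow import DAG

    dag = DAG(
        dag_id='latest_only_example',
        start_date=datetime(2016, 1, 1),
        schedule_interval='@hourly',
        catchup=False,   # only schedule the latest interval, no backfill of gaps
    )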
 
-### Faulty Dags do not show an error in the Web UI
+### Faulty DAGs do not show an error in the Web UI
 
 Due to changes in the way Airflow processes DAGs the Web UI does not show an error when processing a faulty DAG. To
 find processing errors go the `child_process_log_directory` which defaults to `<AIRFLOW_HOME>/scheduler/latest`.
