feng-tao closed pull request #4242: [AIRFLOW-XXX] Correct typos in UPDATING.md 
`master`
URL: https://github.com/apache/incubator-airflow/pull/4242
 
 
   

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/UPDATING.md b/UPDATING.md
index af448cfff8..88dc78c810 100644
--- a/UPDATING.md
+++ b/UPDATING.md
@@ -80,7 +80,7 @@ airflow users --delete --username jondoe
 
 ### StatsD Metrics
 
-The `scheduler_heartbeat` metric has been changed from a gauge to a counter. Each loop of the scheduler will increment the counter by 1. This provides a higher degree of visibility and allows for better integration with Prometheus using the [StatsD Exporter](https://github.com/prometheus/statsd_exporter). Scheduler upness can be determined by graphing and alerting using a rate. If the scheduler goes down, the rate will drop to 0.
+The `scheduler_heartbeat` metric has been changed from a gauge to a counter. Each loop of the scheduler will increment the counter by 1. This provides a higher degree of visibility and allows for better integration with Prometheus using the [StatsD Exporter](https://github.com/prometheus/statsd_exporter). The scheduler's activity status can be determined by graphing and alerting using a rate of change of the counter. If the scheduler goes down, the rate will drop to 0.
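(Editor's note, outside the quoted diff: the rate-based liveness check described in the hunk above can be illustrated with a minimal, self-contained Python sketch. The `rate` helper and sample data below are purely illustrative and are not Airflow or Prometheus APIs; they only show why a monotonic counter's rate drops to zero when the scheduler stops.)

```python
# Sketch: why a monotonically increasing counter exposes liveness.
# The samples simulate scraped values of the `scheduler_heartbeat`
# counter as (timestamp, counter_value) pairs; names are illustrative.

def rate(samples, window):
    """Per-second rate of change over the last `window` samples."""
    if len(samples) < 2:
        return 0.0
    recent = samples[-window:]
    (t0, v0), (t1, v1) = recent[0], recent[-1]
    return (v1 - v0) / (t1 - t0) if t1 > t0 else 0.0

# Scheduler alive: counter increments once per loop (one loop/second here).
alive = [(t, t) for t in range(10)]
# Scheduler dead after t=5: the counter stops moving.
dead = alive[:6] + [(t, 5) for t in range(6, 10)]

assert rate(alive, 5) > 0      # healthy: positive rate
assert rate(dead, 3) == 0.0    # down: rate falls to 0
```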
 
 ### Custom auth backends interface change
 
@@ -110,12 +110,12 @@ should be inside the "Instances" dict)
 
 ### LDAP Auth Backend now requires TLS
 
-Connecting to an LDAP serever over plain text is not supported anymore. The
+Connecting to an LDAP server over plain text is not supported anymore. The
 certificate presented by the LDAP server must be signed by a trusted
-certificiate, or you must provide the `cacert` option under `[ldap]` in the
+certificate, or you must provide the `cacert` option under `[ldap]` in the
 config file.
 
-If you want to use LDAP auth backend without TLS then you will habe to create a
+If you want to use LDAP auth backend without TLS then you will have to create a
 custom-auth backend based on
 
https://github.com/apache/incubator-airflow/blob/1.10.0/airflow/contrib/auth/backends/ldap_auth.py
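(Editor's note, outside the quoted diff: for deployments affected by the TLS requirement in the hunk above, the `cacert` option lives under `[ldap]` in `airflow.cfg`. A hedged sketch; the hostname and certificate path are placeholders, not defaults:)

```ini
[ldap]
uri = ldaps://ldap.example.com:636
cacert = /etc/ssl/certs/ldap_ca.pem
```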
 
@@ -133,7 +133,7 @@ The method name was changed to be compatible with the Python 3.7 async/await key
 
 ### Add a configuration variable(default_dag_run_display_number) to control numbers of dag run for display
 
-Add a configuration variable(default_dag_run_display_number) under webserver section to control num of dag run to show in UI.
+Add a configuration variable(default_dag_run_display_number) under webserver section to control the number of dag runs to show in UI.
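(Editor's note, outside the quoted diff: the option described in the hunk above sits in the `[webserver]` section of `airflow.cfg`. A sketch; the value shown is illustrative:)

```ini
[webserver]
# Number of DAG runs to show per DAG in the UI (value here is illustrative)
default_dag_run_display_number = 25
```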
 
 ### Default executor for SubDagOperator is changed to SequentialExecutor
 
@@ -166,7 +166,7 @@ There are five roles created for Airflow by default: Admin, User, Op, Viewer, an
 - AWS Batch Operator renamed property queue to job_queue to prevent conflict with the internal queue from CeleryExecutor - AIRFLOW-2542
 - Users created and stored in the old users table will not be migrated automatically. FAB's built-in authentication support must be reconfigured.
 - Airflow dag home page is now `/home` (instead of `/admin`).
-- All ModelViews in Flask-AppBuilder follow a different pattern from Flask-Admin. The `/admin` part of the url path will no longer exist. For example: `/admin/connection` becomes `/connection/list`, `/admin/connection/new` becomes `/connection/add`, `/admin/connection/edit` becomes `/connection/edit`, etc.
+- All ModelViews in Flask-AppBuilder follow a different pattern from Flask-Admin. The `/admin` part of the URL path will no longer exist. For example: `/admin/connection` becomes `/connection/list`, `/admin/connection/new` becomes `/connection/add`, `/admin/connection/edit` becomes `/connection/edit`, etc.
 - Due to security concerns, the new webserver will no longer support the features in the `Data Profiling` menu of old UI, including `Ad Hoc Query`, `Charts`, and `Known Events`.
 - HiveServer2Hook.get_results() always returns a list of tuples, even when a single column is queried, as per Python API 2.
 
@@ -249,7 +249,7 @@ SSH Hook now uses the Paramiko library to create an ssh client connection, inste
 
 - update SSHHook constructor
 - use SSHOperator class in place of SSHExecuteOperator which is removed now. Refer to test_ssh_operator.py for usage info.
-- SFTPOperator is added to perform secure file transfer from serverA to serverB. Refer to test_sftp_operator.py.py for usage info.
+- SFTPOperator is added to perform secure file transfer from serverA to serverB. Refer to test_sftp_operator.py for usage info.
 - No updates are required if you are using ftpHook, it will continue to work as is.
 
 ### S3Hook switched to use Boto3
@@ -279,7 +279,7 @@ Once a logger has determined that a message needs to be processed, it is passed
 
 #### Changes in Airflow Logging
 
-Airflow's logging mechanism has been refactored to use Python’s builtin `logging` module to perform logging of the application. By extending classes with the existing `LoggingMixin`, all the logging will go through a central logger. Also the `BaseHook` and `BaseOperator` already extend this class, so it is easily available to do logging.
+Airflow's logging mechanism has been refactored to use Python’s built-in `logging` module to perform logging of the application. By extending classes with the existing `LoggingMixin`, all the logging will go through a central logger. Also the `BaseHook` and `BaseOperator` already extend this class, so it is easily available to do logging.
 
 The main benefit is easier configuration of the logging by setting a single centralized python file. Disclaimer; there is still some inline configuration, but this will be removed eventually. The new logging class is defined by setting the dotted classpath in your `~/airflow/airflow.cfg` file:
 
@@ -439,7 +439,7 @@ If you are using S3, the instructions should be largely the same as the Google c
 - Copy the logging configuration from [`airflow/config_templates/airflow_logging_settings.py`](https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/airflow_local_settings.py).
 - Place it in a directory inside the Python import path `PYTHONPATH`. If you are using Python 2.7, ensuring that any `__init__.py` files exist so that it is importable.
 - Update the config by setting the path of `REMOTE_BASE_LOG_FOLDER` explicitly in the config. The `REMOTE_BASE_LOG_FOLDER` key is not used anymore.
-- Set the `logging_config_class` to the filename and dict. For example, if you place `custom_logging_config.py` on the base of your pythonpath, you will need to set `logging_config_class = custom_logging_config.LOGGING_CONFIG` in your config as Airflow 1.8.
+- Set the `logging_config_class` to the filename and dict. For example, if you place `custom_logging_config.py` on the base of your `PYTHONPATH`, you will need to set `logging_config_class = custom_logging_config.LOGGING_CONFIG` in your config as Airflow 1.8.
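(Editor's note, outside the quoted diff: a minimal sketch of what such a `custom_logging_config.py` might contain. The keys follow Python's `logging.config.dictConfig` schema; the logger name `airflow.task` matches Airflow's convention, but the formatter string and handler names are our own illustration, not Airflow's shipped template:)

```python
# custom_logging_config.py -- illustrative LOGGING_CONFIG for
# `logging_config_class = custom_logging_config.LOGGING_CONFIG`.
# Formatter/handler names and the format string are assumptions.
import logging.config

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow_plain": {
            "format": "[%(asctime)s] %(levelname)s - %(message)s",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "airflow_plain",
        },
    },
    "loggers": {
        "airflow.task": {
            "handlers": ["console"],
            "level": "INFO",
            "propagate": False,
        },
    },
}

# Sanity-check that the dict is well-formed for dictConfig:
logging.config.dictConfig(LOGGING_CONFIG)
```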
 
 ### New Features
 


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
