GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/17517

    [MINOR][DOCS] Replace non-breaking space to normal spaces that breaks rendering markdown

    ## What changes were proposed in this pull request?
    
    It seems several non-breaking spaces were inserted into `.md` files, and they appear to break the markdown rendering, as shown below.
    
    The two characters are different. For example, this can be checked via `python` as below, where the first literal is the non-breaking space:
    
    ```python
    >>> " "
    '\xc2\xa0'
    >>> " "
    ' '
    ```
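    
    For reference, the same distinction can be seen in Python 3, where the two characters compare unequal and the non-breaking space encodes to the UTF-8 bytes shown above. This snippet is only illustrative and is not part of the change:
    
    ```python
    >>> nbsp = "\u00a0"   # NO-BREAK SPACE (U+00A0)
    >>> space = " "       # ordinary ASCII space (U+0020)
    >>> nbsp == space
    False
    >>> nbsp.encode("utf-8")
    b'\xc2\xa0'
    >>> space.encode("utf-8")
    b' '
    ```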
    
    **Before**
    
    ![2017-04-03 12 36 57](https://cloud.githubusercontent.com/assets/6477701/24594654/50a855e6-186a-11e7-94e2-661e56544b0f.png)
    ![2017-04-03 12 37 17](https://cloud.githubusercontent.com/assets/6477701/24594655/50aaba02-186a-11e7-80bb-d34b17a3398a.png)
    
    **After**
    
    ![2017-04-03 12 36 46](https://cloud.githubusercontent.com/assets/6477701/24594657/53c2545c-186a-11e7-9a73-00529afbfd75.png)
    ![2017-04-03 12 36 31](https://cloud.githubusercontent.com/assets/6477701/24594658/53c286c0-186a-11e7-99c9-e66b1f510fe7.png)
    
    ## How was this patch tested?
    
    Manually checked.
    
    
    
    These instances were found via the following `grep` on macOS (the quoted search pattern is a literal non-breaking space):
    
    ```
    grep --include=*.scala --include=*.py --include=*.java --include=*.r --include=*.R --include=*.md -r -I " " .
    ```
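    
    As a cross-platform alternative to the `grep` above, a small Python sketch along the following lines can report the same instances; the extension list and the starting directory here are assumptions for illustration, not part of the PR:
    
    ```python
    import pathlib
    
    NBSP = "\u00a0"  # NO-BREAK SPACE (U+00A0)
    EXTENSIONS = {".scala", ".py", ".java", ".r", ".R", ".md"}
    
    # Walk the repository and report every line containing a non-breaking space.
    for path in pathlib.Path(".").rglob("*"):
        if not path.is_file() or path.suffix not in EXTENSIONS:
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if NBSP in line:
                print(f"{path}:{lineno}: {line}")
    ```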
    
    It seems there are several more instances, as below:
    
    ```
    ./docs/sql-programming-guide.md:        │   ├── ...
    ./docs/sql-programming-guide.md:        │   │
    ./docs/sql-programming-guide.md:        │   ├── country=US
    ./docs/sql-programming-guide.md:        │   │   └── data.parquet
    ./docs/sql-programming-guide.md:        │   ├── country=CN
    ./docs/sql-programming-guide.md:        │   │   └── data.parquet
    ./docs/sql-programming-guide.md:        │   └── ...
    ./docs/sql-programming-guide.md:            ├── ...
    ./docs/sql-programming-guide.md:            │
    ./docs/sql-programming-guide.md:            ├── country=US
    ./docs/sql-programming-guide.md:            │   └── data.parquet
    ./docs/sql-programming-guide.md:            ├── country=CN
    ./docs/sql-programming-guide.md:            │   └── data.parquet
    ./docs/sql-programming-guide.md:            └── ...
    ./sql/core/src/test/README.md:│   ├── *.avdl                  # Testing Avro IDL(s)
    ./sql/core/src/test/README.md:│   └── *.avpr                  # !! NO TOUCH !! Protocol files generated from Avro IDL(s)
    ./sql/core/src/test/README.md:│   ├── gen-avro.sh             # Script used to generate Java code for Avro
    ./sql/core/src/test/README.md:│   └── gen-thrift.sh           # Script used to generate Java code for Thrift
    ```
    
    These seem to have been generated via the `tree` command, which inserts non-breaking spaces. They do not appear to cause any rendering problems, so I did not fix them, to avoid the overhead of replacing them by hand again whenever the output is regenerated via the `tree` command in the future.
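    
    For completeness, the replacement this PR applies to the affected `.md` files amounts to mapping U+00A0 to an ordinary space. A rough sketch of doing that programmatically follows; the file list is a placeholder, and the actual PR edits the files directly:
    
    ```python
    import pathlib
    
    NBSP = "\u00a0"  # NO-BREAK SPACE (U+00A0)
    
    # Placeholder list of affected files; the PR edits the real .md files under docs/.
    for name in ["docs/example.md"]:
        path = pathlib.Path(name)
        text = path.read_text(encoding="utf-8")
        if NBSP in text:
            path.write_text(text.replace(NBSP, " "), encoding="utf-8")
    ```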

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark non-breaking-space

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17517.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #17517
    
----
commit 4b0e56a590a9a73e26e6c5c2cff2a20942fdb908
Author: hyukjinkwon <[email protected]>
Date:   2017-04-03T03:35:10Z

    Replace non-breaking space to normal spaces

----

