[ 
https://issues.apache.org/jira/browse/FLINK-8714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16370075#comment-16370075
 ] 

ASF GitHub Bot commented on FLINK-8714:
---------------------------------------

GitHub user michalklempa opened a pull request:

    https://github.com/apache/flink/pull/5536

    [FLINK-8714][Documentation] Added either charsetName or "utf-8" value in 
examples of readTextFile

    ## What is the purpose of the change
    When a newcomer (like me), goes through the docs, there are several places 
where examples encourage to read the input data using the env.readTextFile() 
method.
    
    This method variant does not take a second argument, the character set (see 
https://ci.apache.org/projects/flink/flink-docs-release-1.4/api/java/org/apache/flink/streaming/api/environment/StreamExecutionEnvironment.html#readTextFile-java.lang.String-).
 According to the Javadoc, this version relies on the fact that "The file will 
be read with the system's default character set."
    
    This pull request fixes the documentation by providing charsetName in the 
examples where the API is described, and "utf-8" as the second argument in the 
programming examples. This should remind readers to specify a charset 
explicitly if they want to avoid non-deterministic behavior that depends on the 
environment.
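
    The difference the docs change is warning about can be illustrated with a 
minimal, plain-JDK sketch (the class name is hypothetical and this snippet is 
not part of the PR; it only demonstrates why implicit charsets are risky):
    
    ```java
    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;
    
    public class CharsetDemo {
        public static void main(String[] args) {
            // "héllo" encoded as UTF-8: 'é' becomes the two bytes 0xC3 0xA9.
            byte[] utf8Bytes = "héllo".getBytes(StandardCharsets.UTF_8);
    
            // Decoding with an explicit charset is deterministic everywhere.
            String explicit = new String(utf8Bytes, StandardCharsets.UTF_8);
            System.out.println(explicit); // héllo
    
            // Decoding with the system default charset depends on the machine.
            String implicit = new String(utf8Bytes, Charset.defaultCharset());
            System.out.println(Charset.defaultCharset() + " -> " + implicit);
    
            // On a latin-1 machine the same bytes come back garbled:
            String latin1 = new String(utf8Bytes, StandardCharsets.ISO_8859_1);
            System.out.println(latin1); // hÃ©llo
        }
    }
    ```
    
    The same reasoning applies to Flink: `env.readTextFile("file:///path/to/file", 
"UTF-8")` decodes the file identically on every machine, while the one-argument 
variant inherits whatever default charset the JVM happens to run with.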
    
    ## Brief change log
    
    ## Verifying this change
    
    This change is a trivial rework of documentation without any test coverage.
    
    ## Does this pull request potentially affect one of the following parts:
    
      - Dependencies (does it add or upgrade a dependency): no
      - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: no
      - The serializers: no
      - The runtime per-record code paths (performance sensitive): no
      - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: no
      - The S3 file system connector: no
    
    ## Documentation
    
      - Does this pull request introduce a new feature? no
      - If yes, how is the feature documented? not applicable


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/michalklempa/flink 
FLINK-8714_readTextFile_charset_version

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/5536.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #5536
    
----
commit 221684da5b564b21c1e0cc99e823c18939c0ca91
Author: Michal Klempa <michal.klempa@...>
Date:   2018-02-20T13:50:30Z

    FLINK-8714 added either env.readTextFile(pathToFile, charsetName) where the 
API is described, or readTextFile(path/to/file, utf-8) where the API is shown 
as an example

----


> Suggest new users to use env.readTextFile method with 2 arguments (using the 
> charset), not to rely on system charset (which varies across environments)
> -------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-8714
>                 URL: https://issues.apache.org/jira/browse/FLINK-8714
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.4.0
>            Reporter: Michal Klempa
>            Priority: Trivial
>              Labels: easyfix, newbie
>
> When a newcomer (like me) goes through the docs, there are several places 
> where examples encourage reading the input data using the 
> {{env.readTextFile()}} method.
>  
> This method variant does not take a second argument, the character set (see 
> [https://ci.apache.org/projects/flink/flink-docs-release-1.4/api/java/org/apache/flink/streaming/api/environment/StreamExecutionEnvironment.html#readTextFile-java.lang.String-]).
>  According to the Javadoc, this version relies on the fact that "The file 
> will be read with the system's default character set."
>  
> This is also the default behavior in Java, e.g. in the 
> {{java.lang.String.getBytes()}} method, where not supplying a charset means 
> using the system locale, or the one the JVM was started with (see 
> [https://stackoverflow.com/questions/64038/setting-java-locale-settings]). 
> There are two ways to set the locale prior to JVM start: -D arguments or the 
> LC_ALL environment variable.
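> A hedged sketch of those two options (the jar name is hypothetical; 
> -Dfile.encoding pins the default charset directly, while LC_ALL sets the 
> locale the JVM derives it from):
> {code:bash}
> # Option 1: force the default charset via a JVM -D argument
> java -Dfile.encoding=UTF-8 -jar my-flink-job.jar
>
> # Option 2: set the locale environment variable before launching the JVM
> export LC_ALL=en_US.UTF-8
> java -jar my-flink-job.jar
> {code}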
>  
> Given that this is something a new Flink user may not know about, and that 
> nobody wants to spend hours hunting an environment-related bug (it works on 
> localhost, but in production the locale is different), I would kindly suggest 
> a change in the documentation: let's migrate the examples to use the 
> two-argument version, {{readTextFile(filePath, charsetName)}}.
>  
> I am open to criticism and suggestions. The occurrences of {{readTextFile}} I 
> was able to grep in the docs are:
> {code:java}
> ./dev/datastream_api.md:- `readTextFile(path)` - Reads text files, i.e. files 
> that respect the `TextInputFormat` specification, line-by-line and returns 
> them as Strings.
> ./dev/datastream_api.md:- `readTextFile(path)` - Reads text files, i.e. files 
> that respect the `TextInputFormat` specification, line-by-line and returns 
> them as Strings.
> ./dev/libs/storm_compatibility.md:DataStream<String> text = 
> env.readTextFile(localFilePath);
> ./dev/cluster_execution.md:    DataSet<String> data = 
> env.readTextFile("hdfs://path/to/file");
> ./dev/batch/index.md:- `readTextFile(path)` / `TextInputFormat` - Reads files 
> line wise and returns them as Strings.
> ./dev/batch/index.md:- `readTextFileWithValue(path)` / `TextValueInputFormat` 
> - Reads files line wise and returns them as
> ./dev/batch/index.md:DataSet<String> localLines = 
> env.readTextFile("file:///path/to/my/textfile");
> ./dev/batch/index.md:DataSet<String> hdfsLines = 
> env.readTextFile("hdfs://nnHost:nnPort/path/to/my/textfile");
> ./dev/batch/index.md:DataSet<String> logs = 
> env.readTextFile("file:///path/with.nested/files")
> ./dev/batch/index.md:- `readTextFile(path)` / `TextInputFormat` - Reads files 
> line wise and returns them as Strings.
> ./dev/batch/index.md:- `readTextFileWithValue(path)` / `TextValueInputFormat` 
> - Reads files line wise and returns them as
> ./dev/batch/index.md:val localLines = 
> env.readTextFile("file:///path/to/my/textfile")
> ./dev/batch/index.md:val hdfsLines = 
> env.readTextFile("hdfs://nnHost:nnPort/path/to/my/textfile")
> ./dev/batch/index.md:env.readTextFile("file:///path/with.nested/files").withParameters(parameters)
> ./dev/batch/index.md:DataSet<String> lines = env.readTextFile(pathToTextFile);
> ./dev/batch/index.md:val lines = env.readTextFile(pathToTextFile)
> ./dev/batch/examples.md:DataSet<String> text = 
> env.readTextFile("/path/to/file");
> ./dev/batch/examples.md:val text = env.readTextFile("/path/to/file")
> ./dev/api_concepts.md:DataStream<String> text = 
> env.readTextFile("file:///path/to/file");
> ./dev/api_concepts.md:val text: DataStream[String] = 
> env.readTextFile("file:///path/to/file")
> ./dev/local_execution.md:    DataSet<String> data = 
> env.readTextFile("file:///path/to/file");
> ./ops/deployment/aws.md:env.readTextFile("s3://<bucket>/<endpoint>");{code}
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
