[ https://issues.apache.org/jira/browse/FLINK-1525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537076#comment-14537076 ]
ASF GitHub Bot commented on FLINK-1525:
---------------------------------------
Github user uce commented on the pull request:
https://github.com/apache/flink/pull/664#issuecomment-100600902
**Feedback w/o code review**
I like the idea very much. +1 for the general approach. Exporting it to the
web interface is great. Thanks for this. :-)
The Hadoop compatibility idea is also very nice. Let's make sure to follow
up on it soon (file a JIRA etc.).
Regarding adding this to all examples: I think we should discuss this; I am
undecided. On the one hand, it will break everything that shows how to run the
examples, because you will now have to specify the parameter names. On the
other hand, I like that people will be exposed to the utility. I'm leaning
towards the second option (adding it) at the moment.
What's currently missing is documentation.
---
In the word count example: is the only point of setting the user config in
the execution config to show the values in the web interface?
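For reference, a minimal sketch of that pattern, assuming an interface along the lines of the ParameterTool utility (class and method names here are illustrative and may differ from what #664 actually proposes): the parameters registered in the execution config are what the web interface displays as the job's user configuration, and rich functions can read them back at runtime.

```java
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.utils.ParameterTool;

public class UserConfigSketch {
    public static void main(String[] args) throws Exception {
        // Parse "--input /path --iterations 10" style arguments.
        ParameterTool params = ParameterTool.fromArgs(args);

        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Registering the parameters as global job parameters is what exposes
        // them in the web interface as the job's user configuration; rich
        // functions can also retrieve them via getRuntimeContext().
        env.getConfig().setGlobalJobParameters(params);

        env.fromElements("to be", "or not to be").print();
    }
}
```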
> Provide utils to pass -D parameters to UDFs
> --------------------------------------------
>
> Key: FLINK-1525
> URL: https://issues.apache.org/jira/browse/FLINK-1525
> Project: Flink
> Issue Type: Improvement
> Components: flink-contrib
> Reporter: Robert Metzger
> Labels: starter
>
> Hadoop users are used to setting job configuration through "-D" on the
> command line.
> Right now, Flink users have to manually parse command line arguments and pass
> them to their functions themselves.
> It would be nice to provide a standard argument parser which takes care of
> this.
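As a rough sketch of the kind of utility the issue asks for (the parser and the way it is handed to the UDF below are purely illustrative, not the API proposed in the pull request): collect "-Dkey=value" pairs from the command line into a serializable map and pass that map to a user function through its constructor.

```java
import java.util.HashMap;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class DParameterSketch {

    /** Hypothetical helper: collects "-Dkey=value" arguments into a map. */
    public static HashMap<String, String> parseDynamicProperties(String[] args) {
        HashMap<String, String> props = new HashMap<>();
        for (String arg : args) {
            if (arg.startsWith("-D") && arg.contains("=")) {
                int eq = arg.indexOf('=');
                props.put(arg.substring(2, eq), arg.substring(eq + 1));
            }
        }
        return props;
    }

    /** UDF that receives the parsed properties through its constructor. */
    public static class PrefixMapper implements MapFunction<String, String> {
        private final HashMap<String, String> props; // serialized along with the function

        public PrefixMapper(HashMap<String, String> props) {
            this.props = props;
        }

        @Override
        public String map(String value) {
            return props.getOrDefault("prefix", "") + value;
        }
    }

    public static void main(String[] args) throws Exception {
        // e.g. bin/flink run ... -Dprefix=word_
        HashMap<String, String> props = parseDynamicProperties(args);

        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<String> words = env.fromElements("hello", "flink");
        words.map(new PrefixMapper(props)).print();
    }
}
```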