[ https://issues.apache.org/jira/browse/SPARK-1779?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13996567#comment-13996567 ]

Erik Erlandson commented on SPARK-1779:
---------------------------------------

I'd like to propose an additional (somewhat orthogonal) alternative to PR-714; 
my branch can be viewed here:

https://github.com/erikerlandson/spark/compare/SPARK-1779-memoryFraction

The basic elements of this proposal are:
*) getInt, getLong, and getDouble accept new optional arguments, minValue and 
maxValue.  If either or both are supplied, the corresponding bounds are 
checked.  A parameter setting that violates either bound results in an 
exception by default.
*) getInt, getLong, getDouble, and getBoolean accept an optional argument, 
defaultBadValue, which defaults to false.  When set to true, a bounds-checking 
failure (or other value failure) logs a warning message and returns the given 
defaultValue instead of throwing.
*) The new arguments are applied to some existing invocations of getDouble, 
e.g. for memoryFraction.
*) Unit tests cover getInt, getLong, getDouble, and getBoolean.
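To make the proposed semantics concrete, here is a minimal sketch in Python 
(for brevity; the actual change is to SparkConf's Scala API).  The class and 
method names below mirror the proposal but are illustrative, not the PR's 
real signatures:

```python
import logging

logger = logging.getLogger("SparkConf")


class SparkConf:
    """Toy stand-in for SparkConf, illustrating the proposed bounds checking."""

    def __init__(self, settings=None):
        self._settings = dict(settings or {})

    def get_double(self, key, default, min_value=None, max_value=None,
                   default_bad_value=False):
        raw = self._settings.get(key)
        if raw is None:
            return default  # unset key: return the caller's default, as today
        try:
            value = float(raw)
        except ValueError:
            value = None
        in_bounds = (value is not None
                     and (min_value is None or value >= min_value)
                     and (max_value is None or value <= max_value))
        if in_bounds:
            return value
        if default_bad_value:
            # proposed lenient mode: warn and fall back to the default
            logger.warning("bad value %r for %s; using default %s",
                           raw, key, default)
            return default
        # proposed strict mode (the default): fail fast
        raise ValueError("value %r for %s violates bounds [%s, %s]"
                         % (raw, key, min_value, max_value))
```

For example, a memoryFraction of 1.5 with bounds [0.0, 1.0] would either raise 
(default) or log a warning and return the supplied default (with 
default_bad_value=true).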

A couple of advantages of this approach:
*) it provides a standardized interface for applying bounds checking to 
SparkConf settings
*) SparkConf owns the checking logic, which helps keep the code DRY

Note, there is also a proposal for more generalized checking predicates in 
SPARK-1781.

Another possibly useful feature I've seen on other projects is making the 
validation logic data driven.  That is, the required properties for each 
configuration parameter reside in a file and are deserialized at startup, or 
alternatively live in some appropriate standardized data structure in the 
code.
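As a rough illustration of the data-driven idea, the constraints could be a 
small table deserialized from a file at startup.  The sketch below uses JSON 
and Python purely for illustration; the key names are real Spark settings but 
the constraint schema is hypothetical:

```python
import json

# Hypothetical constraint table; in practice this could be loaded from a
# file shipped with the code, or defined as a structure in the code itself.
CONSTRAINTS_JSON = """
{
  "spark.storage.memoryFraction": {"min": 0.0, "max": 1.0},
  "spark.shuffle.memoryFraction": {"min": 0.0, "max": 1.0}
}
"""


def validate(settings, constraints):
    """Return a list of human-readable violations for the given settings."""
    errors = []
    for key, rule in constraints.items():
        if key not in settings:
            continue  # only validate parameters the user actually set
        try:
            value = float(settings[key])
        except ValueError:
            errors.append("%s: %r is not a number" % (key, settings[key]))
            continue
        if "min" in rule and value < rule["min"]:
            errors.append("%s: %s is below min %s" % (key, value, rule["min"]))
        if "max" in rule and value > rule["max"]:
            errors.append("%s: %s is above max %s" % (key, value, rule["max"]))
    return errors


constraints = json.loads(CONSTRAINTS_JSON)
```

Validation at startup would then reduce to one pass over the deserialized 
table, rather than per-call-site checks.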


> Warning when spark.storage.memoryFraction is not between 0 and 1
> ----------------------------------------------------------------
>
>                 Key: SPARK-1779
>                 URL: https://issues.apache.org/jira/browse/SPARK-1779
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 0.9.0, 1.0.0
>            Reporter: wangfei
>             Fix For: 1.1.0
>
>
> There should be a warning when memoryFraction is lower than 0 or greater than 
> 1



--
This message was sent by Atlassian JIRA
(v6.2#6252)
