[ https://issues.apache.org/jira/browse/SPARK-7727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14558697#comment-14558697 ]

Cheng Hao commented on SPARK-7727:
----------------------------------

As we probably don't want to change the type of `SQLContext.optimizer`, which is 
supposed to be an `Optimizer`, how about changing `DefaultOptimizer` from an 
`object` to a `class`, so that people could extend it easily? The same goes for 
the `Analyzer` etc.
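
For illustration, a rough sketch of how that could look (`MyRule` is a 
placeholder for a user-defined `Rule[LogicalPlan]`, and the single batch shown 
stands in for the real default list):

{code}
// Sketch only: DefaultOptimizer as a class rather than an object.
class DefaultOptimizer extends Optimizer {
  // The existing default batches, exposed so subclasses can reuse them.
  protected def defaultBatches: Seq[Batch] =
    Batch("ConstantFolding", FixedPoint(100), ConstantFolding) :: Nil
  override protected val batches: Seq[Batch] = defaultBatches
}

// A subclass sees the same path-dependent Batch type, so the two lists
// combine without any conversion.
class MyOptimizer extends DefaultOptimizer {
  override protected val batches: Seq[Batch] =
    defaultBatches ++ Seq(Batch("My Batch", Once, MyRule))
}
{code}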

> Avoid inner classes in RuleExecutor
> -----------------------------------
>
>                 Key: SPARK-7727
>                 URL: https://issues.apache.org/jira/browse/SPARK-7727
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.3.1
>            Reporter: Santiago M. Mola
>              Labels: easyfix, starter
>
> In RuleExecutor, the following classes and objects are defined as inner 
> classes or objects: Strategy, Once, FixedPoint, Batch.
> This does not seem to accomplish anything in this case, but it makes 
> extensibility harder. For example, if I want to define a new Optimizer that 
> uses all the batches from the DefaultOptimizer plus some more, I would write 
> something like:
> {code}
> new Optimizer {
>   override protected val batches: Seq[Batch] =
>     DefaultOptimizer.batches ++ myBatches
> }
> {code}
> But this gives a type error, because Batch is an inner class of RuleExecutor 
> and therefore path-dependent: the batches in DefaultOptimizer have type 
> DefaultOptimizer.Batch, while myBatches must have the new instance's own 
> this.Batch.
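> The underlying Scala behavior, reduced to a minimal example (e.g. in the 
> REPL):
> {code}
> class Outer { class Inner }
> val a = new Outer
> val b = new Outer
> val x: a.Inner = new a.Inner  // fine
> val y: a.Inner = new b.Inner  // type mismatch: b.Inner is not a.Inner
> {code}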
> Workarounds include either copying the list of batches from DefaultOptimizer 
> or using a method like this:
> {code}
>     // Rebuild a DefaultOptimizer.Batch as this executor's own Batch type,
>     // reconstructing the strategy from its iteration count.
>     private def transformBatchType(b: DefaultOptimizer.Batch): Batch = {
>       val strategy = b.strategy.maxIterations match {
>         case 1 => Once
>         case n => FixedPoint(n)
>       }
>       // rules is a varargs parameter, so the Seq must be expanded
>       Batch(b.name, strategy, b.rules: _*)
>     }
> {code}
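> With that helper, the combined batch list can then be written as, e.g.:
> {code}
>     override protected val batches: Seq[Batch] =
>       DefaultOptimizer.batches.map(transformBatchType) ++ myBatches
> {code}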
> However, moving these classes and objects out of RuleExecutor (making them 
> top-level or companion-object members) would solve the problem.
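> One possible shape for that change (a sketch, not a patch; note that Batch 
> has to become generic in the tree type once it leaves RuleExecutor):
> {code}
> object RuleExecutor {
>   abstract class Strategy { def maxIterations: Int }
>   case object Once extends Strategy { val maxIterations = 1 }
>   case class FixedPoint(maxIterations: Int) extends Strategy
>   case class Batch[TreeType <: TreeNode[_]](
>       name: String, strategy: Strategy, rules: Rule[TreeType]*)
> }
> // RuleExecutor itself would then declare:
> //   protected val batches: Seq[RuleExecutor.Batch[TreeType]]
> {code}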


