[ https://issues.apache.org/jira/browse/MAHOUT-1856?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15830810#comment-15830810 ]

ASF GitHub Bot commented on MAHOUT-1856:
----------------------------------------

Github user dlyubimov commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/246#discussion_r96982250
  
    --- Diff: math-scala/src/main/scala/org/apache/mahout/math/algorithms/regression/Regressor.scala ---
    @@ -0,0 +1,33 @@
    +/**
    +  * Licensed to the Apache Software Foundation (ASF) under one
    +  * or more contributor license agreements. See the NOTICE file
    +  * distributed with this work for additional information
    +  * regarding copyright ownership. The ASF licenses this file
    +  * to you under the Apache License, Version 2.0 (the
    +  * "License"); you may not use this file except in compliance
    +  * with the License. You may obtain a copy of the License at
    +  *
    +  * http://www.apache.org/licenses/LICENSE-2.0
    +  *
    +  * Unless required by applicable law or agreed to in writing,
    +  * software distributed under the License is distributed on an
    +  * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    +  * KIND, either express or implied. See the License for the
    +  * specific language governing permissions and limitations
    +  * under the License.
    +  */
    +
    +package org.apache.mahout.math.algorithms.regression
    +
    +import org.apache.mahout.math.algorithms.Model
    +import org.apache.mahout.math.drm.DrmLike
    +
    +/**
    +  * Abstract of Regressors
    +  */
    +abstract class Regressor extends Model {
    +
    +  def fit[Int](drmY: DrmLike[Int], drmX: DrmLike[Int]): Unit
    --- End diff --
    
    I guess if this is abstract enough, we also need to be able to admit
    hyperparameters, which are of course specific to every fitter. In R this is
    trivial (any call can take a bag of arbitrary named arguments), but in Scala
    this may need a bit of thought (if the abstraction really needs to sit that
    high). Otherwise, I guess most Scala kits just create a concrete fit
    signature per implementation.
    
    If the Regressor trait is meant to be common to all possible regression
    algorithms, we either need a way to pass hyperparameters in universally, or
    we should not have a fit abstraction in the regressor trait at all (then
    what, I guess :) ).



> Create a framework for new Mahout Clustering, Classification, and 
> Optimization  Algorithms
> ------------------------------------------------------------------------------------------
>
>                 Key: MAHOUT-1856
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1856
>             Project: Mahout
>          Issue Type: New Feature
>    Affects Versions: 0.12.1
>            Reporter: Andrew Palumbo
>            Assignee: Trevor Grant
>            Priority: Critical
>             Fix For: 0.13.0
>
>
> To ensure that Mahout does not become "a loose bag of algorithms", create 
> basic traits with functions common to each class of algorithm. 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
