2012/2/1 Mathieu Blondel:
> On Wed, Feb 1, 2012 at 10:10 PM, David Warde-Farley wrote:
>
>> I might suggest mean over training examples but sum over output dimensions,
>> if there is more than one.
>
> Currently, Ridge is the only estimator in scikit-learn supporting
> multivariate regression
Sent from my iPod
On 01.02.2012, at 15:43, Mathieu Blondel wrote:
> On Wed, Feb 1, 2012 at 10:10 PM, David Warde-Farley wrote:
>
>> I might suggest mean over training examples but sum over output dimensions,
>> if there is more than one.
>
> Currently, Ridge is the only estimator in scikit-learn supporting
> multivariate regression (it does so in a way which is more efficient
> than solving `n_responses` problems). It would
On Wed, Feb 1, 2012 at 10:10 PM, David Warde-Farley wrote:
> I might suggest mean over training examples but sum over output dimensions,
> if there is more than one.
Currently, Ridge is the only estimator in scikit-learn supporting
multivariate regression (it does so in a way which is more efficient
than solving `n_responses` problems). It would
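For reference, a minimal sketch of "mean over training examples but sum over output dimensions" for a multi-output y, together with Ridge's multivariate fit; the data, shapes, and alpha value below are illustrative only, not taken from the thread:

    import numpy as np
    from sklearn.linear_model import Ridge

    # Toy multivariate-regression data: 50 samples, 3 features, 2 outputs.
    rng = np.random.RandomState(0)
    X = rng.randn(50, 3)
    Y = rng.randn(50, 2)

    # Ridge accepts the 2-d Y directly, fitting all outputs in one solve
    # rather than n_responses separate problems.
    Y_pred = Ridge(alpha=1.0).fit(X, Y).predict(X)

    # Sum the squared residuals over the output dimensions (axis 1),
    # then average over the training examples (axis 0).
    error = ((Y - Y_pred) ** 2).sum(axis=1).mean()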
On 2012-02-01, at 5:10 AM, Mathieu Blondel wrote:
> Hello,
>
> I just realized that the function "mean_square_error" returns
> np.sum((y_true - y_pred) ** 2) instead of np.mean((y_true - y_pred) **
> 2). Hence it is more a cumulated error than a mean error.
>
> I would like to fix this but this will change people's results.
On Wed, Feb 01, 2012 at 07:22:38PM +0900, Mathieu Blondel wrote:
> I will rename the function from "mean_square_error" to
> "mean_squared_error", as this is how Wikipedia calls it anyway. This
> way, we can keep the old one for two releases.
Sounds good. We can add a deprecation warning.
Thanks,
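One possible shape for the rename plus deprecation period (the old name kept around with its historical summing behaviour while warning users); this is only a sketch, not the actual patch that went into scikit-learn:

    import warnings
    import numpy as np

    def mean_squared_error(y_true, y_pred):
        # New name, fixed behaviour: average of the squared residuals.
        return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

    def mean_square_error(y_true, y_pred):
        # Old name kept for two releases, warning that the replacement
        # averages instead of summing.
        warnings.warn("mean_square_error is deprecated; use "
                      "mean_squared_error, which returns the mean rather "
                      "than the sum of the squared errors.",
                      DeprecationWarning)
        return np.sum((np.asarray(y_true) - np.asarray(y_pred)) ** 2)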
On Wed, Feb 1, 2012 at 7:14 PM, Gael Varoquaux wrote:
> But at least with a warning. We can't have such a change silent.
I will rename the function from "mean_square_error" to
"mean_squared_error", as this is how Wikipedia calls it anyway. This
way, we can keep the old one for two releases.
Mat
+1 for fixing the bug eventually, with a warning notifying users of the
change in behavior
A
On Wed, Feb 1, 2012 at 11:10 AM, Mathieu Blondel wrote:
> Hello,
>
> I just realized that the function "mean_square_error" returns
> np.sum((y_true - y_pred) ** 2) instead of np.mean((y_true - y_pred) **
> 2).
On Wed, Feb 01, 2012 at 11:12:33AM +0100, Olivier Grisel wrote:
> > I would like to fix this but this will change people's results.
> +1 for changing and documenting it in whats_new.rst.
But at least with a warning. We can't have such a change silent.
On the other hand, I agree that the current
2012/2/1 Mathieu Blondel:
> Hello,
>
> I just realized that the function "mean_square_error" returns
> np.sum((y_true - y_pred) ** 2) instead of np.mean((y_true - y_pred) **
> 2). Hence it is more a cumulated error than a mean error.
>
> I would like to fix this but this will change people's results.
Hello,
I just realized that the function "mean_square_error" returns
np.sum((y_true - y_pred) ** 2) instead of np.mean((y_true - y_pred) **
2). Hence it is more a cumulated error than a mean error.
I would like to fix this but this will change people's results.
Mathieu
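To make the difference concrete, with made-up numbers (purely illustrative):

    import numpy as np

    y_true = np.array([3.0, -0.5, 2.0, 7.0])
    y_pred = np.array([2.5,  0.0, 2.0, 8.0])

    np.sum((y_true - y_pred) ** 2)   # 1.5   <- what mean_square_error currently returns
    np.mean((y_true - y_pred) ** 2)  # 0.375 <- the actual mean squared error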