Greetings,
Is there any way to force the MLPRegressor to make predictions in the same
value range as the training data? For example, if the training targets range
between -9 and -5, I don't want the predictions to range between -820 and
-800. In fact, sometimes I even get anti-correlated predictions.
You could normalize the outputs (e.g., via min-max scaling). However, I think
the more intuitive way would be to clip the predictions. E.g., if you are
predicting house prices, it probably makes no sense to have a negative
prediction, so you would clip the output at some value > $0.
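To make the min-max-scaling suggestion concrete, here is a minimal sketch (the data and model settings are illustrative, not from the original post): scale the targets to [0, 1] before fitting, then invert the transform on the predictions so they land back in the original range.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Toy data; targets lie roughly in [-9, -5] (values are illustrative)
rng = np.random.RandomState(0)
X = rng.rand(100, 3)
y = -7.0 + 2.0 * np.sin(X.sum(axis=1))

# Scale targets to [0, 1] before fitting
scaler = MinMaxScaler()
y_scaled = scaler.fit_transform(y.reshape(-1, 1)).ravel()

mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X, y_scaled)

# Invert the scaling so predictions are back on the original target scale
y_pred = scaler.inverse_transform(mlp.predict(X).reshape(-1, 1)).ravel()
```

Note that this keeps the predictions on the right scale but does not hard-bound them; combining it with clipping (below in the thread) enforces a strict range.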
PS: -820 an
On 10 September 2017 at 22:03, Sebastian Raschka wrote:
> You could normalize the outputs (e.g., via min-max scaling). However, I
> think the more intuitive way would be to clip the predictions. E.g., say
> you are predicting house prices, it probably makes no sense to have a
> negative prediction, so you would clip the output at some value > $0.
With clipping, I mean thresholding the output, e.g., via something like
min/max(some_constant, actual_output)
or, like in a leaky ReLU:
min/max(some_constant * 0.001, actual_output)
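In NumPy terms, the two thresholding variants above could look like this (the raw prediction values here are made up for illustration):

```python
import numpy as np

# Hypothetical raw model outputs; negative house prices make no sense
raw_preds = np.array([250000.0, -1200.0, 87000.0])

# Hard clip at 0: the min/max(some_constant, actual_output) idea
clipped = np.clip(raw_preds, 0.0, None)

# Leaky variant: keep a small fraction of the raw value instead of a hard 0
leaky = np.where(raw_preds > 0, raw_preds, raw_preds * 0.001)
```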
Alternatively, you could use a sigmoidal function (something like tanh but
with a larger co-domain) as the output unit.
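As a sketch of that last idea: tanh maps into (-1, 1), so rescaling it gives a squashing function whose co-domain is any interval you pick (the [-9, -5] range here just echoes the example from the original question):

```python
import numpy as np

def scaled_tanh(z, low=-9.0, high=-5.0):
    """Map any real input into (low, high) via a rescaled tanh."""
    # tanh is in (-1, 1); shift and scale it to the target co-domain
    return low + (high - low) * (np.tanh(z) + 1.0) / 2.0

out = scaled_tanh(np.array([-100.0, 0.0, 100.0]))
```

Applied as an output unit, this guarantees predictions stay inside (low, high) by construction rather than by post-hoc clipping.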
I think you want to call the radius_neighbors method (check here:
http://scikit-learn.org/stable/modules/generated/sklearn.neighbors.NearestNeighbors.html#sklearn.neighbors.NearestNeighbors.radius_neighbors)
(you're using kneighbors, replace with radius_neighbors)
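A minimal example of the suggested swap (the toy points are made up): radius_neighbors returns, for each query point, all training points within the given radius rather than a fixed k.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [10.0, 10.0]])

nbrs = NearestNeighbors(radius=2.0).fit(X)

# Unlike kneighbors, this returns a variable number of neighbors per query:
# everything within the radius
dists, idxs = nbrs.radius_neighbors(np.array([[0.0, 0.0]]))
```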
~Shane
On 09/10, Martin Lee wrote:
Given your related post on the issue tracker, I think you're trying to
perform clustering. Use DBSCAN, which is a standard approach to clustering
based on neighborhoods within a radius.
On 10 September 2017 at 14:44, Martin Lee wrote:
> nbrs = NearestNeighbors(n_neighbors=10,radius=100.0,metric