Hi Sheila,
I think if you use an odd number of neighbors you can break your ties.
Without a weight function, the probability is just the fraction of votes
from the k nearest neighbors. So the tie at 0.5 means that, for the first
two samples, two neighbors are class 2 and two are class 3, and that kind
of tie is always possible when k is even.
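For example, a quick sketch along these lines should show the difference;
I am just guessing n_neighbors=4 (any even k behaves the same way) and
reusing the iris split from your snippet:

from sklearn import datasets, cross_validation
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()
X_train, X_test, y_train, y_test = cross_validation.train_test_split(
    iris.data, iris.target, test_size=0.4, random_state=0)

# With the default weights='uniform', predict_proba is the vote count per
# class divided by n_neighbors, so an even k can split 2 vs 2 -> 0.5 / 0.5.
clf_even = KNeighborsClassifier(n_neighbors=4).fit(X_train, y_train)
print(clf_even.predict_proba(X_test)[:5])

# An odd k cannot split its votes evenly between two classes.
clf_odd = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(clf_odd.predict_proba(X_test)[:5])

# weights='distance' scales each vote by the inverse distance, which makes
# exact ties very unlikely even with an even k.
clf_weighted = KNeighborsClassifier(n_neighbors=4, weights='distance')
clf_weighted.fit(X_train, y_train)
print(clf_weighted.predict_proba(X_test)[:5])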

Any suggestions about KNeighborsClassifier().predict_proba?
On 3 September 2014 14:57, Sheila the angel wrote:
> I am using KNeighborsClassifier and trying to obtain probabilistic output.
> But for many of the test sets I am getting equal probability for all classes.
>
> >>> X_train, X_test, y_train, y_test = cross_validation.train_test_split(
> ...     iris.data, iris.target, test_size=0.4, random_state=0)

I am using KNeighborsClassifier and trying to obtain probabilistic output.
But for many of the test sets I am getting equal probability for all classes.
>>> X_train, X_test, y_train, y_test = cross_validation.train_test_split(
...     iris.data, iris.target, test_size=0.4, random_state=0)
>>> clf = KNeighbors