2012/9/6 Jaidev Deshpande <[email protected]>:
> I've been playing around with the Perceptron class in scikit-learn. I
> have a theoretical understanding of the perceptron algorithm. In
> sklearn it is subclassed from SGDClassifier, which is very different
> from how I would have expected the perceptron to be implemented (I'd
> have thought it was simply something like this:
> http://en.wikipedia.org/wiki/Perceptron#Learning_algorithm_steps).

The algorithm described *is* SGD. Just look at the
sklearn/linear_model/sgd_fast.pyx code, abridged here with the
Wikipedia steps noted in the comments:

# step 2: loop over the training samples
for i in xrange(n_samples):
    # step 2a: compute the prediction for sample i (sparse dot product)
    p = w.dot(x_data_ptr, x_ind_ptr, xnnz) + intercept

    # step 2b: take a gradient step scaled by the learning rate eta
    update = eta * loss.dloss(p, y)
    w.add(x_data_ptr, x_ind_ptr, xnnz, -update)

Our eta corresponds to Wikipedia's alpha, and loss.dloss is
instantiated to the derivative of the perceptron loss, which plays the
role of Wikipedia's error term d_j - y_j(t).
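
For intuition, here's a minimal pure-Python sketch of the same loop,
using dense NumPy arrays instead of our sparse helpers and labels in
{-1, +1}; the names perceptron_dloss and perceptron_sgd are mine, not
sklearn's:

import numpy as np

def perceptron_dloss(p, y):
    # Derivative of the perceptron loss max(0, -y*p):
    # -y for a misclassified sample, 0 otherwise.
    return -y if p * y <= 0 else 0.0

def perceptron_sgd(X, y, eta=1.0, n_iter=5):
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    intercept = 0.0
    for _ in xrange(n_iter):           # outer loop, see below
        for i in xrange(n_samples):    # step 2
            p = np.dot(w, X[i]) + intercept      # step 2a
            update = eta * perceptron_dloss(p, y[i])
            w -= update * X[i]                   # step 2b
            intercept -= update
    return w, intercept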

The main differences are that our SGD repeats this loop inside an
outer loop n_iter times, and that it offers a bunch of extra bells and
whistles such as an adaptive learning rate.
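
In fact, Perceptron is just a thin wrapper over SGDClassifier. If I
remember the defaults correctly, something like the following two
estimators should learn the same model:

from sklearn.linear_model import Perceptron, SGDClassifier

clf1 = Perceptron(n_iter=5)
clf2 = SGDClassifier(loss="perceptron", penalty=None,
                     learning_rate="constant", eta0=1.0, n_iter=5)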

-- 
Lars Buitinck
Scientific programmer, ILPS
University of Amsterdam
