Well, scikit-learn is even better if you know what you're doing ;)

M.

On Tue, Feb 24, 2015 at 6:56 AM, Sebastian Raschka <se.rasc...@gmail.com>
wrote:

> Thanks,
> So, it's basically a "classic" perceptron rule after all.
>
> Sorry for the distraction; when I saw that it lives in
> linear_model.SGDClassifier, I somehow assumed it would be implemented more
> like:
>
>         for i in range(self.iterations):
>             for xi, yi in zip(X, y):
>                 yi_pred = self.b_ + self.w_.dot(xi)  # real-valued prediction
>                 self.w_ += (yi - yi_pred) * xi
>                 self.b_ += (yi - yi_pred)
>
> In contrast to
>
>         for i in range(self.iterations):
>             for xi, yi in zip(X, y):
>                 yi_pred = np.sign(self.b_ + xi.dot(self.w_))  # thresholded prediction
>                 self.w_ += (yi - yi_pred) * xi
>                 self.b_ += (yi - yi_pred)
>
> But it all makes sense now!
>
> PS: Haha, this reminds me of a phrase I saw in the Pylearn2
> documentation that I found a tad offensive but kind of fitting:
>
> "
>
> [...] while scikit-learn aims to work as a “black box” that can produce
> good results even if the user does not understand the implementation
>
> "
>
>
> Thanks,
> Sebastian
>
> On Feb 23, 2015, at 4:16 PM, Daniel Sullivan <dbsulliva...@gmail.com>
> wrote:
>
> Yes, apologies for the confusion; I was reading the code wrong. yi_pred is
> not real-valued.
>
> On Mon, Feb 23, 2015 at 6:35 PM, Andy <t3k...@gmail.com> wrote:
>
>>  So indeed, in sklearn's perceptron update, yi_pred is in {-1, 1}, not
>> real-valued, right?
>>
>>
>>
>> On 02/23/2015 08:35 AM, Mathieu Blondel wrote:
>>
>>  Rosenblatt's Perceptron is a special case of SGD, see:
>>
>> https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/linear_model/tests/test_perceptron.py
>>
>>  The perceptron loss leads to sparser weight vectors than the hinge loss
>> in the sense that it updates the weight vector less aggressively: it
>> updates only on mistakes, whereas the hinge loss also updates the model
>> when the prediction is correct but not "good enough" (margin below 1).
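The distinction above can be sketched in plain NumPy. This is a minimal illustration, not the sklearn implementation: the data, step size, and epoch count are arbitrary choices. The perceptron criterion fires only when yi * f(xi) <= 0, while the hinge loss also fires whenever the margin falls below 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data with linearly separable labels (arbitrary, for illustration).
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

def count_updates(X, y, loss, eta=1.0, epochs=5):
    """Plain constant-step SGD; count how often the weight vector changes."""
    w, b, updates = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (w @ xi + b)
            # Perceptron loss: update only on outright mistakes (margin <= 0).
            # Hinge loss: update whenever the margin is below 1.
            if (margin <= 0) if loss == "perceptron" else (margin < 1):
                w += eta * yi * xi
                b += eta * yi
                updates += 1
    return updates

perceptron_updates = count_updates(X, y, "perceptron")
hinge_updates = count_updates(X, y, "hinge")
```

On data like this, the hinge criterion triggers far more updates than the perceptron criterion, which is the sense in which the perceptron yields sparser weight vectors.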
>>
>>  Mathieu
>>
>> On Mon, Feb 23, 2015 at 7:14 PM, Sebastian Raschka <se.rasc...@gmail.com>
>> wrote:
>>
>>> Hi all,
>>>
>>> I find the description of the perceptron classifier a bit ambiguous and
>>> was wondering whether it would be worthwhile to clarify it. What do you
>>> think?
>>>
>>>
>>> While browsing through the documentation at
>>> http://scikit-learn.org/stable/modules/linear_model.html I found the
>>> following paragraph about perceptrons:
>>>
>>> > The Perceptron is another simple algorithm suitable for large scale
>>> learning. By default:
>>> >       • It does not require a learning rate.
>>> >       • It is not regularized (penalized).
>>> >       • It updates its model only on mistakes.
>>> > The last characteristic implies that the Perceptron is slightly faster
>>> to train than SGD with the hinge loss and that the resulting models are
>>> sparser.
>>>
>>>
>>> To me, it sounds like the "classic" Rosenblatt perceptron update rule
>>>
>>> weights = weights + eta * (yi - yi_pred) * xi
>>>
>>> where yi_pred = sign(w^T x)     [yi_pred in {-1, 1}]
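For reference, the classic rule described above converges on linearly separable data. A self-contained sketch (the data and variable names are invented for illustration; a margin is enforced so training is guaranteed to terminate):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented, linearly separable data; keep only points with a clear margin
# around the true boundary 2*x0 - x1 = 0 so the perceptron converges.
raw = rng.normal(size=(300, 2))
score = 2 * raw[:, 0] - raw[:, 1]
keep = np.abs(score) > 0.5
X, y = raw[keep], np.where(score[keep] > 0, 1.0, -1.0)

eta, w, b = 1.0, np.zeros(2), 0.0
for epoch in range(1000):                      # upper bound; we break on convergence
    errors = 0
    for xi, yi in zip(X, y):
        yi_pred = 1.0 if w @ xi + b >= 0 else -1.0   # sign(w^T x + b)
        if yi_pred != yi:                            # update only on mistakes
            w += eta * (yi - yi_pred) * xi
            b += eta * (yi - yi_pred)
            errors += 1
    if errors == 0:                            # a clean pass means convergence
        break

accuracy = np.mean(np.where(X @ w + b >= 0, 1.0, -1.0) == y)
```

By the perceptron convergence theorem, separable data with a margin guarantees the loop terminates with zero training errors.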
>>>
>>>
>>> However, when I read the documentation on
>>> http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Perceptron.html#sklearn.linear_model.Perceptron
>>>
>>> > Perceptron and SGDClassifier share the same underlying implementation.
>>> In fact, Perceptron() is equivalent to SGDClassifier(loss=”perceptron”,
>>> eta0=1, learning_rate=”constant”, penalty=None).
>>>
>>> it sounds more like the slightly more modern online learning variant of
>>> gradient descent (i.e. stochastic gradient descent):
>>>
>>> weights = weights + eta * (yi - yi_pred) * xi
>>>
>>> where yi_pred = w^T x     [yi_pred real-valued]
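The documented equivalence can be checked directly (assuming a scikit-learn install; the dataset and hyperparameters here are arbitrary). With matching random state and iteration settings, the two estimators should learn the same weights:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron, SGDClassifier

# Arbitrary toy problem, just to compare the two fitted models.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

shared = dict(max_iter=20, tol=None, shuffle=True, random_state=0)
p = Perceptron(**shared).fit(X, y)
s = SGDClassifier(loss="perceptron", eta0=1.0, learning_rate="constant",
                  penalty=None, **shared).fit(X, y)

# Identical coefficients confirm they share one underlying implementation.
same_coef = (np.allclose(p.coef_, s.coef_)
             and np.allclose(p.intercept_, s.intercept_))
```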
>>>
>>>
>>>
>>> Best,
>>> Sebastian
>>>
>>> ------------------------------------------------------------------------------
>>> Download BIRT iHub F-Type - The Free Enterprise-Grade BIRT Server
>>> from Actuate! Instantly Supercharge Your Business Reports and Dashboards
>>> with Interactivity, Sharing, Native Excel Exports, App Integration & more
>>> Get technology previously reserved for billion-dollar corporations, FREE
>>>
>>> http://pubads.g.doubleclick.net/gampad/clk?id=190641631&iu=/4140/ostg.clktrk
>>> _______________________________________________
>>> Scikit-learn-general mailing list
>>> Scikit-learn-general@lists.sourceforge.net
>>> https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
>>>
>>
>>
>
