> On Dec 2, 2014, at 6:34 AM, Andy <t3k...@gmail.com> wrote:
> 
> Hi Ilya.
> 
> Thanks for your interest in contributing.
> I am not expert in affinity propagation, so it would be great if you could 
> give some details of what the advantage of the method is.
> The reference paper seems to be an arxiv preprint with 88 citations, which 
> would probably not qualify for inclusion in scikit-learn,
> see the FAQ 
> http://scikit-learn.org/dev/faq.html#can-i-add-this-new-algorithm-that-i-or-someone-else-just-published
>  
> <http://scikit-learn.org/dev/faq.html#can-i-add-this-new-algorithm-that-i-or-someone-else-just-published>

Wow, I had not seen this FAQ. "As a rule we only add well-established 
algorithms. A rule of thumb is at least 3 years since publications, 1000+ cites 
and wide use and usefullness." I was intending to contribute a rule learning 
system to scikit-learn, and/or descriptive learning methods. I guess those 
are both right out. I thought scikit-learn would welcome some variety, but 
1000+ cites (sic) and wide use pretty much rules out anything but statistical 
learning. Among symbolic methods there is only one rather mediocre decision 
tree induction method.

Does anyone know of another Python framework that's a little more welcoming?

-Tom

_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general
