Hi Prokopis.
The paper seems a bit old and not that relevant any more (feel free to
prove otherwise).
Looking at the issue tracker and seeing where you can help out is
definitely the most useful thing for us.
Thanks,
Andy
On 08/01/2015 04:33 PM, Prokopis Gryllos wrote:
Hi Andreas,
Thank you for your reply!
The NFL (Nearest Feature Line) method belongs to the k-NN family of
algorithms for classification / regression.
In general, for a query point, NFL works like k-NN, but instead of using
the k nearest feature points to determine the result, it generalizes any
two feature points of the same class by the feature line passing through
them, and classifies the query according to its distance from those lines.
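To make the idea concrete, here is a rough NumPy sketch of the core
computation (only an illustration of the point-to-line distance and the
exhaustive search over same-class pairs; the function names are mine and
this is not a proposal for the actual estimator API):

import numpy as np

def feature_line_distance(q, x1, x2):
    # Distance from query q to the (infinite) feature line through x1, x2.
    d = x2 - x1
    denom = np.dot(d, d)
    if denom == 0.0:
        # Degenerate pair (identical points): fall back to point distance.
        return np.linalg.norm(q - x1)
    t = np.dot(q - x1, d) / denom   # projection parameter along the line
    foot = x1 + t * d               # foot of the perpendicular from q
    return np.linalg.norm(q - foot)

def nfl_predict(q, X, y):
    # Assign q to the class owning the nearest feature line, searching
    # over all pairs of training points that share a class label.
    best_label, best_dist = None, np.inf
    for label in np.unique(y):
        Xc = X[y == label]
        for i in range(len(Xc)):
            for j in range(i + 1, len(Xc)):
                dist = feature_line_distance(q, Xc[i], Xc[j])
                if dist < best_dist:
                    best_dist, best_label = dist, label
    return best_label

Note that the pair search is quadratic in the number of training points
per class, which I expect is the main practical concern on large datasets.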
I am sorry, but I posted on the issue 3 days ago and I didn't get a reply.
I was unable to tell whether the issue had been resolved or was still open.
On Tue, Jul 28, 2015 at 11:24 PM, Andreas Mueller wrote:
You posted on the ML yesterday. This is mostly a volunteer-run project.
I try to respond to everything within a day or so, but I was travelling
for the last few weeks.
If you feel a day is not an acceptable response time, I am afraid you
will not be happy with this project.
On 07/28/2015 12:52 PM:
Hi,
I am new to scikit-learn. I am interested in contributing, either by
introducing some new algorithms or by optimizing existing ones. I have
gone through the issue tracker and found a few interesting topics to work
on, like NCA and NMF. I have posted a question asking what the status of
that work is.
Hi Gryllos.
Before contributing a new feature (which is usually a major undertaking),
it is usually a good idea to get started by working on known issues; have
a look at the issue tracker.
I'm not familiar with the feature line approach. Can you elaborate and
provide a reference?
Please see the
Hi everyone,
I would like to contribute code to the project and I was thinking of
implementing
a nearest feature line approach to the nearest neighbor class.
As suggested in the contributing guidelines, I thought it would be best to
ask you first before I start working on it.
Thanks,