> [...] classifiers get the same data. We
> need to think about how, and whether, we want to support
> passing different representations to the different classifiers. Or is that
> just ``FeatureUnion``?
>
>
> On 12/15/2015 10:22 PM, Dan Shiebler wrote:
Hello,

I have some code and tests written for a StackingClassifier that has an
sklearn-like interface and is compatible with sklearn classifiers. The
classifier contains methods to easily train classifiers on different
transformations of the same data and to train a meta-classifier on the
outputs of those classifiers.
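For readers unfamiliar with stacking, here is a minimal sketch of the idea: fit several base classifiers, then fit a meta-classifier on their predicted probabilities. This is hypothetical illustration code, not the implementation being offered above (class and parameter names are made up).

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin, clone
from sklearn.linear_model import LogisticRegression

class SimpleStacker(BaseEstimator, ClassifierMixin):
    """Minimal stacking sketch: base classifiers feed a meta-classifier."""

    def __init__(self, base_estimators, meta_estimator=None):
        self.base_estimators = base_estimators
        self.meta_estimator = meta_estimator

    def fit(self, X, y):
        # Fit each base classifier on the (same) training data.
        self.base_ = [clone(est).fit(X, y) for est in self.base_estimators]
        # Meta-features: concatenated class-probability outputs.
        Z = np.hstack([est.predict_proba(X) for est in self.base_])
        meta = self.meta_estimator or LogisticRegression()
        self.meta_ = clone(meta).fit(Z, y)
        return self

    def predict(self, X):
        Z = np.hstack([est.predict_proba(X) for est in self.base_])
        return self.meta_.predict(Z)
```

Note that fitting the meta-classifier on in-sample base predictions, as above, overfits; a real implementation would use cross-validated (out-of-fold) predictions instead.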
I don't have any DTW code written, but I could definitely prototype how a
lower-bound callable might be incorporated into sklearn.
On Mon, Dec 7, 2015 at 5:28 PM, Stéfan van der Walt wrote:
> On Mon, Dec 7, 2015 at 6:04 AM, Gael Varoquaux wrote:
> >> hello pandora's box ;)
What about adding the option for users to pass a callable "lower bound"
function to a nearest-neighbor search? Then users could use things like the
LB Keogh lower bound.
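As a sketch of what such a callable could look like, here is LB_Keogh for equal-length series under a Sakoe-Chiba band of half-width `r`, plus a brute-force search that uses it to skip expensive exact-distance computations. The function and parameter names are made up for illustration; this is not a proposed sklearn API.

```python
import numpy as np

def lb_keogh(x, y, r):
    """LB_Keogh lower bound on the DTW distance between equal-length
    series x and y, for a warping band of half-width r."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    total = 0.0
    for i, xi in enumerate(x):
        # Envelope of y over the band around position i.
        window = y[max(0, i - r):min(len(y), i + r + 1)]
        upper, lower = window.max(), window.min()
        if xi > upper:
            total += (xi - upper) ** 2
        elif xi < lower:
            total += (xi - lower) ** 2
    return np.sqrt(total)

def nn_search(query, candidates, true_dist, lower_bound, r=3):
    """Nearest-neighbor search that prunes candidates whose lower bound
    already exceeds the best exact distance seen so far."""
    best, best_d = None, np.inf
    for c in candidates:
        if lower_bound(query, c, r) >= best_d:
            continue  # cannot beat the current best; skip exact distance
        d = true_dist(query, c)
        if d < best_d:
            best, best_d = c, d
    return best, best_d
```

Because LB_Keogh is cheap (one pass over the series) while exact DTW is quadratic, pruning this way is where most of the speedup comes from in practice.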
On Mon, Dec 7, 2015 at 9:04 AM, Gael Varoquaux <gael.varoqu...@normalesup.org> wrote:
> > hello pandora's box ;)
Hello,

I’m not sure if this is the correct place to send this. If it is not, could
you please direct me to the best place? Thank you.

I’d like to add a dynamic time warping metric to
sklearn.neighbors.DistanceMetric.

Dynamic time warping is one of the most widely used distance metrics for
time series, a [...]
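For context, here is a sketch of what the metric computes: the classic O(n·m) dynamic program that aligns two sequences while allowing local stretching and compression of the time axis. This is pure-Python illustration only; an actual DistanceMetric addition would need a Cython implementation to fit sklearn's extension interface.

```python
import numpy as np

def dtw_distance(x, y):
    """Dynamic time warping distance between two 1-D sequences,
    using absolute difference as the local cost."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # x[i-1] repeats
                                 cost[i, j - 1],      # y[j-1] repeats
                                 cost[i - 1, j - 1])  # both advance
    return cost[n, m]
```

Unlike Euclidean distance, DTW tolerates sequences of different lengths and small temporal shifts, e.g. `dtw_distance([0, 0, 1], [0, 1])` is 0 because the leading zeros align to one another.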