I don't have any DTW code written, but I could definitely prototype how a
lower bound callable might get incorporated into sklearn
On Mon, Dec 7, 2015 at 5:28 PM, Stéfan van der Walt wrote:
> On Mon, Dec 7, 2015 at 6:04 AM, Gael Varoquaux wrote:
> >> hello pandora's box ;)
> >> I thought we don't want to have time-series specific code?
On Mon, Dec 7, 2015 at 6:04 AM, Gael Varoquaux wrote:
>> hello pandora's box ;)
>> I thought we don't want to have time-series specific code?
>
I agree. We should strive to make something like this pluggable into
scikit-learn, but not have it inside.
If you are seeking a home for DTW code, w
I would say prototype it and let's see what it implies for the code.
A
What about adding the option for users to pass in a callable "lower bound"
function to a nearest neighbor search? Then users could use things like the LB
Keogh lower bound
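A rough sketch of what such a lower-bound hook could buy (everything below is illustrative, not existing sklearn API): `dtw` restricts the warping path to a Sakoe-Chiba band of half-width `r`, and `lb_keogh` is the envelope-based bound, which by construction never exceeds the banded DTW distance, so it can safely reject candidates before the expensive computation runs.

```python
import numpy as np

def dtw(x, y, r):
    # DTW restricted to a Sakoe-Chiba band of half-width r,
    # squared local costs with a final square root.
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - r), min(m, i + r) + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(np.sqrt(D[n, m]))

def lb_keogh(q, c, r):
    # LB_Keogh: compare the query against the min/max envelope of the
    # candidate. O(n*r) and always <= dtw(q, c, r) for equal lengths.
    total = 0.0
    for i, qi in enumerate(q):
        window = c[max(0, i - r):i + r + 1]
        u, l = window.max(), window.min()
        if qi > u:
            total += (qi - u) ** 2
        elif qi < l:
            total += (qi - l) ** 2
    return float(np.sqrt(total))

def nn_with_lower_bound(query, dataset, r):
    # Linear scan that only pays for a full DTW computation when the
    # cheap lower bound cannot rule the candidate out.
    best_i, best_d = -1, np.inf
    for i, c in enumerate(dataset):
        if lb_keogh(query, c, r) < best_d:
            d = dtw(query, c, r)
            if d < best_d:
                best_i, best_d = i, d
    return best_i, best_d
```

Because the bound never exceeds the banded DTW distance, the pruned scan returns exactly the same neighbor as computing DTW against every candidate; it is just cheaper whenever the bound rejects most of them.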
On 12/07/2015 04:33 AM, Alexandre Gramfort wrote:
> How do you plan to represent variable-length time series? Lists of 1d numpy
> arrays work but would be slow I guess. The ideal representation needs to be
> compatible with grid search and fast.

good point. I was thinking of forcing all time series to have the same length,
or use dtype = object
How do you plan to represent variable-length time series? Lists of 1d numpy
arrays work but would be slow I guess. The ideal representation needs to be
compatible with grid search and fast.
Mathieu
On Mon, Dec 7, 2015 at 10:35 AM, Dan Shiebler wrote:
> Hello,
>
> I'm not sure if this is the correct place to send this.
> In addition, users
> cannot take advantage of the LB Keogh lower bound of dynamic time warping,
> which can dramatically speed up the nearest neighbors search.

can you give more details on how you would use this in our NN code?
if passing a dtw callable is not good enough, that can justify an a
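For reference, a plain DTW callable already works with the brute-force path; a sketch with an unconstrained O(nm) DTW and toy equal-length data:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def dtw(x, y):
    # Unconstrained O(n*m) dynamic time warping between two 1-D series,
    # squared local costs with a final square root.
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(np.sqrt(D[n, m]))

# Equal-length series stored as ordinary rows; each row is one series.
X = np.array([[0.0, 0.0, 1.0, 5.0],
              [0.0, 1.0, 2.0, 2.0],
              [5.0, 6.0, 7.0, 8.0]])

# A callable metric forces pairwise evaluation, so use brute force;
# DTW violates the triangle inequality, so tree indexes are unsafe anyway.
nn = NearestNeighbors(n_neighbors=1, metric=dtw, algorithm="brute").fit(X)
dist, idx = nn.kneighbors([[0.0, 1.0, 2.0, 2.0]])
```

Every candidate pays the full O(nm) cost here, which is exactly the gap a lower-bound hook like LB Keogh would close.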
Hello,
I’m not sure if this is the correct place to send this. If it is not, could
you please direct me to the best place? Thank you.
I’d like to add a dynamic time warping metric to
sklearn.neighbors.DistanceMetric.
Dynamic time warping is one of the most used distance metrics for time
series, a