Searching the mailing list would be the best way to find out this information.

It may be in the contrib packages on github – have you checked?


__________________________________________________________________________________________________________________________________________
Dale T. Smith | Macy's Systems and Technology | IFS eCom CSE Data Science
5985 State Bridge Road, Johns Creek, GA 30097 | [email protected]

From: scikit-learn [mailto:[email protected]] On Behalf Of KevNo
Sent: Friday, November 4, 2016 4:44 PM
To: [email protected]
Subject: [scikit-learn] Recurrent Decision Tree

Just wondering whether recurrent decision trees have been
investigated in scikit-learn previously.

The main interest is in path-dependent (time-series) problems,
where recurrence is often necessary to model the path-dependent state.
In other words, a wrong prediction will affect the subsequent predictions.
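To make the path-dependence concrete: a common baseline (not the paper's method, and the toy data below is made up) is to feed the model's own previous prediction back in as a feature at inference time, so that an early error propagates into every subsequent step:

```python
# Autoregressive rollout with a plain decision tree: at inference time
# each prediction becomes the next step's input, which is exactly why
# a wrong prediction affects all subsequent predictions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(0)
y = np.cumsum(rng.randn(200))              # a toy path-dependent series
X = y[:-1].reshape(-1, 1)                  # feature: the previous value
tree = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X, y[1:])

# Recurrent rollout over 10 future steps
state = float(y[-1])
preds = []
for _ in range(10):
    state = float(tree.predict(np.array([[state]]))[0])
    preds.append(state)
print(len(preds))  # 10 rollout steps
```

Training such a model so that it is robust to its own rollout errors is, as I understand it, the harder problem the paper addresses.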

Here is a research paper on recurrent decision trees,
from Walt Disney Research (!):

https://goo.gl/APGpvM


Any thoughts are welcome.
Thanks
Brookm





[email protected]<mailto:[email protected]> wrote:

Send scikit-learn mailing list submissions to

        [email protected]<mailto:[email protected]>



To subscribe or unsubscribe via the World Wide Web, visit

        https://mail.python.org/mailman/listinfo/scikit-learn

or, via email, send a message with subject or body 'help' to

        [email protected]<mailto:[email protected]>



You can reach the person managing the list at

        [email protected]<mailto:[email protected]>



When replying, please edit your Subject line so it is more specific

than "Re: Contents of scikit-learn digest..."





Today's Topics:

   1. Re: hierarchical clustering (Gael Varoquaux)
   2. Naive Bayes - Multinomial Naive Bayes tf-idf (Marcin Mirończuk)
   3. Re: hierarchical clustering (Jaime Lopez Carvajal)
   4. Re: Naive Bayes - Multinomial Naive Bayes tf-idf (Andy)

----------------------------------------------------------------------



Message: 1
Date: Fri, 4 Nov 2016 10:36:49 +0100
From: Gael Varoquaux <[email protected]>
To: Scikit-learn user and developer mailing list
        <[email protected]>
Subject: Re: [scikit-learn] hierarchical clustering
Message-ID: <[email protected]>
Content-Type: text/plain; charset=us-ascii



> AgglomerativeClustering internally calls scikit-learn's version of
> cut_tree. I would be curious to know whether this is equivalent to
> scipy's fcluster.

It differs in that it enables adding connectivity constraints.
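For illustration, here is a minimal sketch of the two routes on made-up data; the `connectivity` argument in the second is the part scipy's flat-clustering API does not offer:

```python
# scipy route: build the full linkage tree, then cut it into flat
# clusters; sklearn route: same Ward criterion, but merges can be
# restricted to a connectivity graph (here a k-NN graph).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.RandomState(0)
X = rng.rand(30, 2)

# scipy: unconstrained Ward tree, cut into at most 3 flat clusters
Z = linkage(X, method="ward")
labels_scipy = fcluster(Z, t=3, criterion="maxclust")

# scikit-learn: Ward linkage constrained to a 5-nearest-neighbors graph
conn = kneighbors_graph(X, n_neighbors=5, include_self=False)
model = AgglomerativeClustering(n_clusters=3, linkage="ward",
                                connectivity=conn)
labels_sklearn = model.fit_predict(X)
```

Without a `connectivity` argument the two should agree up to label permutation; with one, clusters are forced to be contiguous along the graph.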





------------------------------



Message: 2
Date: Fri, 4 Nov 2016 11:45:39 +0100
From: Marcin Mirończuk <[email protected]>
To: [email protected]
Subject: [scikit-learn] Naive Bayes - Multinomial Naive Bayes tf-idf
Message-ID:
        <CAH6=pucebylz32-yqpeutrryqvn7equiymwcy38vi9_9jr+...@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"



Hi,

In our experiments we use Multinomial Naive Bayes (MNB). The traditional
MNB uses the TF (term-frequency) weights of the words. The documentation at
http://scikit-learn.org/stable/modules/naive_bayes.html describes
Multinomial Naive Bayes as working "... where the data are typically
represented as word vector counts, although tf-idf vectors are also known
to work well in practice". The "word vector counts" case is TF and is well
known. Our question concerns the "tf-idf vectors" case: was the approach of
Rennie et al., "Tackling the Poor Assumptions of Naive Bayes Text
Classifiers", implemented here? The documentation does not cite any
reference for this solution.

Best,
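For context, the "tf-idf vectors" usage the documentation mentions is mechanically just feeding a tf-idf matrix to the same estimator; `MultinomialNB` consumes whatever non-negative feature matrix it is given. A minimal sketch (the toy documents and labels below are made up):

```python
# MultinomialNB fit on plain term counts vs. on tf-idf weights:
# the estimator itself is unchanged, only the input features differ.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["spam spam offer", "meeting tomorrow",
        "free offer spam", "project meeting notes"]
labels = [1, 0, 1, 0]  # 1 = spam-like, 0 = work-like

# Plain term counts (TF)
counts = CountVectorizer().fit_transform(docs)
clf_tf = MultinomialNB().fit(counts, labels)

# tf-idf weights fed to the same estimator
vec = TfidfVectorizer()
clf_tfidf = MultinomialNB().fit(vec.fit_transform(docs), labels)
pred = clf_tfidf.predict(vec.transform(["free spam offer"]))
```

Whether anything beyond this plain substitution (e.g. the Rennie et al. corrections) is implemented is exactly the question above; the sketch only shows what passing "tf-idf vectors" means at the API level.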


_______________________________________________
scikit-learn mailing list
[email protected]
https://mail.python.org/mailman/listinfo/scikit-learn
