Hello,
When considering a linear model where the dependent variable is a function of
lags of the dependent variable (the covariates are, in a sense, autoregressive):
can I take the entire data set for rpart, then split the data into training
and testing sets and validate the model on the testing data?
Attached is the data set which sends rgui.exe into a CPU loop...
here are commands :
library(rpart)
train <- read.csv("traindata.csv", header = TRUE)
y <- as.numeric(train[, 18])
x <- train[, 1:3]
fit <- rpart(y ~ ., data = x)
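One way to answer the question above is a held-out split rather than fitting on all rows. A minimal sketch follows; since traindata.csv is not available, it uses a synthetic autoregressive series as a stand-in (the lag construction, split fraction, and error metric are all illustrative assumptions, not the poster's setup):

```r
library(rpart)

# Synthetic stand-in for traindata.csv: a series explained by its own lags
# (hypothetical data; the original attachment is not available)
set.seed(1)
n <- 500
y <- as.numeric(arima.sim(model = list(ar = 0.7), n = n))
dat <- data.frame(y    = y[4:n],
                  lag1 = y[3:(n - 1)],
                  lag2 = y[2:(n - 2)],
                  lag3 = y[1:(n - 3)])

# Split into training and testing sets instead of fitting on everything
idx   <- sample(nrow(dat), size = floor(0.7 * nrow(dat)))
train <- dat[idx, ]
test  <- dat[-idx, ]

fit  <- rpart(y ~ ., data = train)
pred <- predict(fit, newdata = test)

# Validate on the held-out rows, e.g. with mean squared error
mse <- mean((test$y - pred)^2)
```

Note that with lagged (time-ordered) data a random split leaks future information into training; a chronological split (first 70% train, remainder test) is usually more defensible.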
On Thu, Apr 22, 2010 at 9:25 AM, Terry Therneau thern...@mayo.edu wrote:
--- Begin included message ---
Hello,
I have attempted to email the author of this package without success;
just wondering if anybody else has experienced this.
I am having an issue using rpart on 4000 rows of data with 13 attributes.
I can run the same test on 300 rows of the same data with no issue.
When I run on 4000 rows,
Hello,
Isn't it totally counter-intuitive that if you penalize the error less
the tree finds it?
See:
experience <- as.factor(c(rep("good", 90), rep("bad", 10)))
cancel <- as.factor(c(rep("no", 85), rep("yes", 5),
rep("no", 5), rep("yes", 5)))
foo <- function(i) {
tmp <- rpart(cancel ~ experience,
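The function above is cut off by the archive. It can be completed along these lines; this is a sketch that assumes foo's argument i is the loss assigned to missing a cancellation, passed via rpart's parms loss matrix (the original body is not recoverable from the snippet):

```r
library(rpart)

experience <- as.factor(c(rep("good", 90), rep("bad", 10)))
cancel     <- as.factor(c(rep("no", 85), rep("yes", 5),
                          rep("no", 5),  rep("yes", 5)))

# Hypothetical completion: i is the loss for predicting "no" when the
# truth is "yes" (rows = true class, columns = predicted class,
# classes in level order "no", "yes", zero on the diagonal)
foo <- function(i) {
  tmp <- rpart(cancel ~ experience,
               parms   = list(loss = matrix(c(0, i, 1, 0), nrow = 2)),
               control = rpart.control(minsplit = 2))
  nrow(tmp$frame) > 1   # TRUE if the tree made at least one split
}
```

Running foo over a range of i values is then a direct way to see at which penalty the split on experience appears or disappears, which is the behaviour being debated in this thread.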
Hello,
If you do
my.tree <- rpart(cancel ~ experience)
and then you check
my.tree$frame
you will note that the complexity parameter there is 0.
Check ?rpart.object to get a description of what this output means. But
essentially, you will not be able to break the leaf unless you set a
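The parameter being described is cp, rpart's complexity parameter. A minimal sketch of turning it down so the split survives, reusing the cancel/experience factors from the earlier post (the minsplit/minbucket values are illustrative assumptions):

```r
library(rpart)

experience <- as.factor(c(rep("good", 90), rep("bad", 10)))
cancel     <- as.factor(c(rep("no", 85), rep("yes", 5),
                          rep("no", 5),  rep("yes", 5)))

# With the default cp = 0.01 the candidate split may not improve the fit
# enough to be kept; cp = 0 keeps any split that helps at all
fit0 <- rpart(cancel ~ experience,
              control = rpart.control(cp = 0, minsplit = 2, minbucket = 1))

fit0$frame   # inspect the grown tree; see ?rpart.object for the columns
```

Comparing fit0$frame with my.tree$frame from the default fit shows directly whether lowering cp was what unlocked the split.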
2009/7/27 Robert Smith robertpsmith2...@gmail.com
Hi,
I am using rpart decision trees to analyze customer churn. I am finding
that the decision trees created are not effective because they are not
able to recognize factors that influence churn. I have created an example
situation below. What do I need to do for rpart to build a tree with the
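One common remedy when rpart ignores a rare outcome like churn is to re-weight the classes through the prior. A sketch with made-up churn data, since the poster's example is truncated in the archive (the contract variable, churn rates, and equal prior are all illustrative assumptions):

```r
library(rpart)

# Hypothetical churn data: few churners overall, but churn is more
# likely on monthly contracts
set.seed(42)
n <- 1000
contract <- factor(sample(c("monthly", "annual"), n, replace = TRUE,
                          prob = c(0.3, 0.7)))
p     <- ifelse(contract == "monthly", 0.12, 0.02)
churn <- factor(ifelse(runif(n) < p, "yes", "no"))

# Default priors follow the observed class frequencies, so the rare
# "yes" class contributes little to the splitting criterion; equal
# priors (in level order "no", "yes") make it count more
fit <- rpart(churn ~ contract,
             parms = list(prior = c(0.5, 0.5)))
```

Loss matrices (parms = list(loss = ...)) and lowering cp, both discussed elsewhere in these threads, are alternative knobs for the same problem.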
I have a standard database - HouseVotes84
For example:
Class V1 V2 V3 V4 V5 V6 V7 V8 V9 V10 V11 V12 V13 V14 V15 V16
1 republican n y n y y y n n n y NA y y y n y
2 republican n y n y y y n n n n n y y y n NA
3 democrat NA y y NA
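For reference, HouseVotes84 ships with the mlbench package, and rpart copes with the NA entries shown above without imputation, via surrogate splits. A minimal sketch, assuming mlbench is installed:

```r
library(rpart)
library(mlbench)

data(HouseVotes84)   # Class (party) plus the votes V1..V16, with NAs

# rpart keeps rows with missing predictors and routes them down the
# tree using surrogate splits, so the NAs need no special handling
fit <- rpart(Class ~ ., data = HouseVotes84)

printcp(fit)         # cross-validated error for each subtree size
```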
Grześ wrote:
Gavin Simpson wrote:
Grześ wrote: