Hello,
I would like to plot the results of an LDA analysis, plotting the
discriminant scores with the decision boundaries on them, with rggobi. I
have GGobi already installed on my computer. I have three classes, so
the plot would be LD1 x LD2 plus the decision boundaries. Here is
the code I use
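For what it's worth, here is a base-graphics sketch of the idea (not rggobi-specific), using iris as a stand-in three-class dataset: fit the LDA, classify a fine grid of points in the LD1 x LD2 plane, and draw the boundaries as contour lines.

```r
library(MASS)

fit <- lda(Species ~ ., data = iris)        # three classes -> LD1, LD2
sc  <- as.data.frame(predict(fit)$x)        # discriminant scores
sc$Species <- iris$Species

## refit in the LD1 x LD2 plane so a grid of points can be classified
fit2 <- lda(Species ~ LD1 + LD2, data = sc)
grid <- expand.grid(LD1 = seq(min(sc$LD1), max(sc$LD1), length.out = 200),
                    LD2 = seq(min(sc$LD2), max(sc$LD2), length.out = 200))
cls  <- as.numeric(predict(fit2, grid)$class)

plot(sc$LD1, sc$LD2, col = as.numeric(sc$Species), xlab = "LD1", ylab = "LD2")
contour(unique(grid$LD1), unique(grid$LD2), matrix(cls, 200, 200),
        levels = c(1.5, 2.5), add = TRUE, drawlabels = FALSE)  # boundaries
```

The same scores could of course be handed to rggobi for interactive viewing; the grid-classification trick for the boundaries is independent of the plotting device.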
2007/8/20, Dani Valverde [EMAIL PROTECTED]:
Hello,
I would like to plot the results of an LDA analysis, plotting the
discriminant scores with the decision boundaries on them, with rggobi. I
have GGobi already installed on my computer. I have three classes, so
the plot would be LD1 x LD2 plus the
Hello
I am trying to fit an LDA and an RDA model to the same data, which has two
classes. The problem is that the training error of the LDA model and the
training error of the RDA model with alpha=0 are not the same. To my
understanding they should be. Am I wrong? Can someone explain what
the
dominic senn wrote:
Hello
I am trying to fit an LDA and an RDA model to the same data, which has two
classes. The problem is that the training error of the LDA model and the
training error of the RDA model with alpha=0 are not the same. To my
understanding they should be. Am I wrong?
Dear all,
I am trying to compare several methods for classifying data into groups.
For that purpose I'd like to develop model comparison and selection
using AIC.
In the lda function of the MASS library, the maximum likelihood of the
function is not given in the output and the script is not
On Mon, 6 Aug 2007, [EMAIL PROTECTED] wrote:
I am trying to compare several methods for classifying data into groups.
For that purpose I'd like to develop model comparison and selection
using AIC.
In the lda function of the MASS library, the maximum likelihood of the
function is not given in
Hi,
I am using the function lda() from MASS for finding reduced-dimensional
representations of a dataset. In reading various texts to compare LDA with
Fisher's LDA approach (including Ripley's Modern Applied Statistics with
S-Plus), it is still unclear to me whether or not they produce the
I am using lda for the first time. I am using version 2.3.1 of R.
When I ran the lda I did not get 'Proportion of trace' in the output.
Is there another way to get this or is there a bug in my version?
Sarah Hodgson
__
R-help@stat.math.ethz.ch
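Regarding the 'Proportion of trace' question: that line is produced by print.lda, and it can always be recomputed from the singular values stored in the fitted object. A sketch on iris:

```r
library(MASS)
fit <- lda(Species ~ ., data = iris)

## fit$svd holds the between/within SD ratios on each discriminant;
## the printed 'Proportion of trace' is their squared share
round(fit$svd^2 / sum(fit$svd^2), 4)
```

If print(fit) does not show the line, comparing against this recomputation is a quick way to rule out a broken installation.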
--- Wade Wall [EMAIL PROTECTED] wrote:
Hi all,
I have performed an lda on two groups and have plotted using
plot(x.lda), with x.lda being my lda results. I have forgotten how to
change the labels of the x-axes (they are currently listed as
Group1 and Group 13), and to rescale
Sorry I wasn't clearer. I believe that it was a specialized function,
but it may have been plot().
What I am basically trying to do is alter the y-axis to represent
frequency and change the labels on the plot of the linear
discriminant analysis results. I can't seem to do this with plot(),
Hi all,
I have performed an lda on two groups and have plotted using
plot(x.lda), with x.lda being my lda results. I have forgotten how to
change the labels of the x-axes (they are currently listed as
Group1 and Group 13), and to rescale the y-axis to reflect frequency.
If anyone knows
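One way around the fixed labels of plot.lda is to compute the scores yourself and draw the histograms with hist(), which accepts xlab/main and counts on the y-axis. A sketch on a two-group subset of iris (standing in for the poster's two groups):

```r
library(MASS)
ir2 <- subset(iris, Species != "virginica")
ir2$Species <- droplevels(ir2$Species)

fit <- lda(Species ~ ., data = ir2)
ld1 <- predict(fit)$x[, 1]                  # scores on the single LD

op <- par(mfrow = c(2, 1))
for (g in levels(ir2$Species))
  hist(ld1[ir2$Species == g], freq = TRUE,  # freq = TRUE -> counts, not density
       xlab = "First linear discriminant", main = g)
par(op)
```

Any axis label, title, or y-scale is then under your control, since it is an ordinary hist() call.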
hi,
I am wondering if I could use lda$scaling (i.e. the coefficients) to evaluate
variables' importance if all the x's are normalized before being put into the
model?
thanks.
--
Weiwei Shi, Ph.D
Research Scientist
GeneGO, Inc.
Did you always know?
No, I did not. But I believed...
---Matrix III
Pieter Vermeesch wrote:
I'm trying to do a linear discriminant analysis on a dataset of three
classes (Affinities), using the MASS library:
data.frame2 <- na.omit(data.frame1)
data.ld = lda(AFFINITY ~ ., data.frame2, prior = c(1,1,1)/3)
Error in var(x - group.means[g, ]) : missing
Pieter == Pieter Vermeesch [EMAIL PROTECTED]
on Mon, 16 Oct 2006 19:15:59 +0200 writes:
Pieter I'm trying to do a linear discriminant analysis on a
Pieter dataset of three classes (Affinities), using the
Pieter MASS library:
^^^
No, no!  MASS *package*
Dear Martin and Uwe,
I did indeed have a few -Inf values in my data frame. Few enough that
I didn't notice them when I inspected my data.
Thanks a lot for helping me better understand the MASS *package* :-)
Pieter
On 10/17/06, Martin Maechler [EMAIL PROTECTED] wrote:
Pieter == Pieter
I'm trying to do a linear discriminant analysis on a dataset of three
classes (Affinities), using the MASS library:
data.frame2 <- na.omit(data.frame1)
data.ld = lda(AFFINITY ~ ., data.frame2, prior = c(1,1,1)/3)
Error in var(x - group.means[g, ]) : missing observations in cov/cor
What does
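As the follow-up to this thread revealed, the culprit was a few -Inf values, which na.omit() does not remove (it drops NA and NaN, but infinite values pass through). A check along these lines would have found them (data.frame1 as in the post):

```r
## count non-finite entries per numeric column; -Inf survives na.omit()
num <- sapply(data.frame1, is.numeric)
colSums(!sapply(data.frame1[num], is.finite))

## keep only rows that are finite in every numeric column (and NA-free)
ok <- apply(sapply(data.frame1[num], is.finite), 1, all)
data.frame2 <- data.frame1[ok & complete.cases(data.frame1), ]
```

Running the colSums line before fitting is a cheap sanity check whenever lda complains about missing observations after an na.omit().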
Dear all,
Using the LDA function of the MASS package I can project my data onto one
dimension. It returns the coefficients that lead to the dimension of
maximum separation (e.g. model$scaling); so far so good.
Question: how can I get two or four dimensions? That is, instead of
projecting my data using
Hi list,
I'm looking at the lda function.
I'd like to know how to calculate the values of the discriminant functions for the
original data.
I see that in the result object of lda there is $scaling, a matrix which
transforms observations to discriminant functions, normalized so that within
groups
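If I read predict.lda correctly, the transformation is: centre the data at the prior-weighted grand mean, then multiply by $scaling. A sketch on iris, checked against predict():

```r
library(MASS)
fit <- lda(Species ~ ., data = iris)

X   <- as.matrix(iris[, 1:4])
ctr <- colSums(fit$prior * fit$means)       # prior-weighted grand mean
sc  <- scale(X, center = ctr, scale = FALSE) %*% fit$scaling

all.equal(unname(sc), unname(predict(fit)$x))  # should be TRUE
```

So the discriminant-function values for the original data are exactly predict(fit)$x; the hand computation above just makes the role of $scaling explicit.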
--- Leonardo Lami [EMAIL PROTECTED] wrote:
Hi list,
I'm looking at the lda function.
I'd like to know how to calculate the values of the discriminant functions for the
original data.
I see that in the result object of lda there is $scaling, a matrix which
transforms observations to
Hello,
I'm trying to do a linear discriminant analysis with lda (MASS).
Does the resulting object contain information about the centroids
of the discriminated groups on the discriminant functions?
Thank you in advance for your help!
Leonardo
--
Leonardo Lami
[EMAIL
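The centroids are not stored directly in the lda object, but they follow from $means and $scaling, or equivalently by averaging the discriminant scores within each group. A sketch on iris:

```r
library(MASS)
fit <- lda(Species ~ ., data = iris)

## project the (centred) group means into discriminant space
ctr <- colSums(fit$prior * fit$means)        # prior-weighted grand mean
centroids <- scale(fit$means, center = ctr, scale = FALSE) %*% fit$scaling
centroids

## equivalently, average the scores per group:
aggregate(predict(fit)$x, by = list(Species = iris$Species), FUN = mean)
```

The two computations should agree up to rounding, which is a useful cross-check.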
Friends
Briefly...
In the documentation for lda in MASS it describes the value 'scaling' as
'a matrix which transforms observations into discriminant functions...'.
How?
Verbosely...
I have a matrix of data: 9 independent variables describing
3 classes. About 100 observations in total.
Dear R-helpers,
if I am right, a discriminant analysis can be done with lda.
My questions are:
1. What method to discriminate the groups is used by lda (Fisher's linear
discriminant function, diagonal linear discriminant analysis, likelihood ratio
discriminant rule, ...)?
2. How can I see, which
On Sat, 14 May 2005, K. Steinmann wrote:
if I am right a discriminant analysis can be done with lda.
My questions are:
1. What method to discriminate the groups is used by lda (Fisher's linear
discriminant function, diagonal linear discriminant analysis, likelihood ratio
discriminant rule, ...)?
Hi!!
I was trying to analyze my data with linear discriminant analysis and I
encountered problems with collinearity. So I am planning to do a PCA to
extract orthogonal components and then analyze them using linear
discriminants. I don't know how to relate the orthogonal components back
to my
hi!
this is a question about lda (MASS) in R on a particular dataset.
I'm not a specialist in any of this, but:
First, with the well-known iris dataset, I tried using lda to discriminate
versicolor from the other two classes and I got approx. 70% accuracy
testing on the training set. In iris,
Now, I use my real dataset (900 instances, 21 attributes), whose 2 classes
can be separated with an accuracy of no more than 80% (10xval) with KNN, SVM,
C4.5 and the like.
I think these accuracies are based on cross-validation runs. Whereas
the 80% accuracy you report using LDA is not based on
Hi,
when I perform LDA on some of my datasets I get the error message:
Error in lda.default(x, grouping, ...) : variable(s) 3 appear to be constant
within groups
I assume the reason is the small values of my variable 3 (e.g. 4.530353e-05).
Has someone a suggestion how to solve this problem?
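If it helps: as far as I can tell, lda.default compares each variable's within-group standard deviation with its tol argument (default 1e-4), so a variable measured on a tiny scale can be flagged even though it does vary. Two possible remedies (names dat/group/var3 are hypothetical stand-ins for the poster's data):

```r
library(MASS)

## remedy 1: rescale the offending column so its spread comfortably
## exceeds lda's default tolerance of 1e-4
dat$var3 <- dat$var3 * 1e4

## remedy 2: lower the tolerance instead of rescaling
fit <- lda(group ~ ., data = dat, tol = 1e-8)
```

Rescaling does not change the classification (LDA is equivariant under linear rescaling of the inputs), so remedy 1 is usually the cleaner fix.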
Hi!
I want to determine the relative contribution (or importance) of the independent
variables in a two-class case from the discriminant coefficients given in
obj.lda$scaling. Are the discriminant coefficients standardized, or do I have to
do that? What about other methods like partial F-tests or
Dear Cristoph, David, Torsten and Bjørn-Helge,
I think that Bjørn-Helge has made more explicit what I had in mind (which I
think is close also to what David mentioned). As well, at the very least, not
placing the PCA inside the cross-validation will underestimate the variance
in the
Torsten Hothorn writes:
as long as one does not use the information in the response (the class
variable, in this case) I don't think that one ends up with an
optimistically biased estimate of the error
I would be a little careful, though. The left-out sample in the
LDA-cross-validation, will
Dear all, not really a R question but:
If I want to check the classification accuracy of an LDA with
a previous PCA for dimensionality reduction by means of the LOOCV method:
is it OK to do the PCA on the WHOLE dataset ONCE and then run the LDA
with the CV option set to TRUE (which runs LOOCV)?
--
Dear Cristoph,
I guess you want to assess the error rate of an LDA that has been fitted to a
set of currently existing training data, and that in the future you will get
some new observation(s) for which you want to make a prediction.
Then, I'd say that you want to use the second approach. You
On Wed, 24 Nov 2004, Ramon Diaz-Uriarte wrote:
Dear Cristoph,
I guess you want to assess the error rate of an LDA that has been fitted to a
set of currently existing training data, and that in the future you will get
some new observation(s) for which you want to make a prediction.
Then, I'd
Thank you, Torsten; that's what I thought, as long as one does not use
the 'class label' as a constraint in the dimension reduction, the
procedure is ok. Of course it is computationally more demanding, since
for each new (unknown with respect to the class label) observation one has
to compute a
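For comparison, the stricter variant discussed here (redoing the unsupervised PCA inside each leave-one-out fold) is only a short loop. A sketch on iris, assuming the first two PCs are kept:

```r
library(MASS)

X <- as.matrix(iris[, 1:4]); y <- iris$Species
npc  <- 2
pred <- factor(rep(NA, nrow(X)), levels = levels(y))

for (i in seq_len(nrow(X))) {
  pc  <- prcomp(X[-i, ], scale. = TRUE)           # PCA without sample i
  tr  <- data.frame(pc$x[, 1:npc], y = y[-i])
  fit <- lda(y ~ ., data = tr)
  te  <- predict(pc, X[i, , drop = FALSE])[, 1:npc, drop = FALSE]
  pred[i] <- as.character(predict(fit, as.data.frame(te))$class)
}
mean(pred == y)   # honest LOOCV accuracy, PCA refitted per fold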
Julien Trolet wrote:
Hello,
I used the lda function from the MASS (VR) package and the rda function
from the klaR package.
I wanted to compare the results of these two functions using the same
training set.
Thus, I used the rda function with lambda=1 and gamma=0, which should emulate
the lda
Hello,
I used the lda function from the MASS (VR) package and the rda function
from the klaR package.
I wanted to compare the results of these two functions using the same
training set.
Thus, I used the rda function with lambda=1 and gamma=0, which should emulate
the lda function, and I should
Hi !!
I am trying to analyze some of my data using linear discriminant analysis.
I worked out the following example code in Venables and Ripley
It does not seem to be happy with it.
library(MASS)
library(stats)
data(iris3)
ir <- rbind(iris3[,,1], iris3[,,2], iris3[,,3])
On Tue, 2 Nov 2004, T. Murlidharan Nair wrote:
Hi !!
I am trying to analyze some of my data using linear discriminant analysis.
I worked out the following example code in Venables and Ripley
It does not seem to be happy with it.
What is `it'? If you mean R, which version, and which
T. Murlidharan Nair wrote:
Hi !!
I am trying to analyze some of my data using linear discriminant analysis.
I worked out the following example code in Venables and Ripley
It does not seem to be happy with it.
library(MASS)
library(stats)
data(iris3)
Dear R-helpers,
I have a model created by lda, and I would like to use this
model to make predictions for new or old data. The catch is, I want to
do this without using the predict function, i.e. only using
information directly from the foo.lda object to create my posterior
On Tue, 21 Sep 2004, rob foxall (IFR) wrote:
Dear R-helpers,
I have a model created by lda, and I would like to use this
model to make predictions for new or old data. The catch is, I want to
do this without using the predict function, i.e. only using
information directly from
Hi.
I asked a question about lda() and got some answers. However, one
question remains (which is not independent of the earlier ones):
What output does lda() produce which I can use to compute the
posteriors? I know predict(lda())$posterior will give me precisely the
posteriors, but suppose
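Under the plug-in LDA rule, the posteriors follow from pieces stored in the object: in the space spanned by $scaling the within-group covariance is (by construction) spherical, so the posterior is proportional to prior times exp(-squared distance to the projected class mean / 2). A sketch on iris, checked against predict:

```r
library(MASS)
fit <- lda(Species ~ ., data = iris)

S <- as.matrix(iris[, 1:4]) %*% fit$scaling   # sphered observations
M <- fit$means %*% fit$scaling                # class means in the same space

d2 <- sapply(seq_len(nrow(M)),
             function(k) rowSums(sweep(S, 2, M[k, ])^2))
ll <- -d2 / 2 + rep(log(fit$prior), each = nrow(S))

post <- exp(ll - apply(ll, 1, max))           # stabilise before normalising
post <- post / rowSums(post)

all.equal(unname(post), unname(predict(fit)$posterior))  # should be TRUE
```

Note the grand-mean centring used by predict.lda cancels in the distances, so it can be omitted here.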
I remember doing this some time ago but forgot. Perhaps this might help
you
MASS:::predict.lda
On Tue, 2004-07-13 at 23:56, marzban wrote:
Hi.
I asked a question about lda() and got some answers. However, one
question remains (which is not independent of the earlier ones):
What output
Hello,
For a simple problem with 1 predictor (x) and 2 classes (0 and 1), the
linear discriminant function should be something like
2(mu_0 - mu_1)/var * x + x-independent terms
At 08:45 AM 7/12/2004, marzban wrote:
Hello,
For a simple problem with 1 predictor (x) and 2 classes (0 and 1), the
linear discriminant function should be something like
2(mu_0 - mu_1)/var * x + x-independent terms
where var is the common variance.
Question 1: Why does lda() report only
Perhaps some reading would be helpful. I suggest you look first at the
help file for lda(). Second, I suggest you read Venables and Ripley, MASS,
4th Edition, where lda() is discussed extensively. Third, I suggest you
read Ripley's Pattern Recognition and Neural Networks, where the
I haven't done this in years but I think the `scaling' element in the
list returned by lda is the original data matrix multiplied by the
rotation matrix from the SVD. Taking a look at
getAnywhere(lda.default)
will probably answer your question.
-roger
marzban wrote:
Perhaps some reading would
I am trying to write the following code in R. The code works in S+ and I
am trying to do the same in R.
x=discrim(admit~gpa+gmat,prior=c(uniform),data=data.mm)
I wrote the following in R:
x=lda(admit~gpa+gmat,data=data.mm)
I could not figure out how to write prior=c(uniform) in R. I
I could not figure out how to write prior=c(uniform) in R. I would get
an error every time. I think that it has something to do
with uniform. Do you know what I should use instead of uniform in R? I am
trying to specify a uniform distribution.
try ?runif (random uniform distribution)
There is a help page for lda: please read it for yourself (as the posting
guide requests you to). lda works the same way in R as it works in
S-PLUS: in both it is support software for a book, and the posting guide
also asks you to read that book.
On Sat, 12 Jun 2004, Martin Willett wrote:
Seems rather straightforward to me. The prior=uniform in discrim() says
to use equal prior for each group. You can do the same by explicitly
specifying the priors; e.g.,
x <- lda(admit ~ gpa + gmat, data = data.mm,
         prior = rep(1/nlevels(data.mm$admit), nlevels(data.mm$admit)))
HTH,
Andy
From: Martin Willett
I am
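Note that lda's prior must be a probability vector with one entry per factor level (its length is checked against the number of groups, and it must sum to 1). A sketch using the poster's names (data.mm and admit, which I don't have, so this is illustrative only):

```r
library(MASS)

k <- nlevels(data.mm$admit)                  # number of groups
x <- lda(admit ~ gpa + gmat, data = data.mm,
         prior = rep(1/k, k))                # uniform prior over the groups
```

This is the R equivalent of S-PLUS's prior="uniform" in discrim().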
Hi
I have a data.frame with a grouping variable having the levels
C,
mild AD,
mod AD,
O and
S
since I want to compute a lda only for the two groups 'C' and 'mod AD' I
call lda with data=subset(mydata.pca,GROUP == 'mod AD' | GROUP == 'C')
my.lda <- lda(GROUP ~ Comp.1 + Comp.2 + Comp.3 +
I presume is lda from the uncredited package MASS and you ignored the
advice to ask the maintainer?
The short answer is `don't ignore the warning', and set up a proper data
frame with just the groups you actually want.
As a quick fix, look in lda.default and alter the line that looks like
Frank Gibbons wrote:
Wei Geng,
I asked the same question about six weeks ago, so let me try to answer
it. The source for the entire package 'MASS' is in a single file, I
believe (at least this is true on my Linux setup).
The thread gets boring, but let me correct this belief:
NO! There is
I am new to R. Trying to find out how lda() {in MASS, R 1.8.0, Windows} was
implemented. Does anyone know where to find the lda source code?
Thanks.
Wei
Wei Geng wrote:
I am new to R. Trying to find out how lda() {in MASS, R 1.8.0, Windows} was
implemented. Does anyone know where to find the lda source code?
Thanks.
Here:
http://cran.r-project.org
Hint: MASS is a *package*. You want to view its *source*.
Same with most other R packages.
Consider the following:
library(MASS)
lda
function (x, ...)
UseMethod("lda")
<environment: namespace:MASS>
methods(lda)
[1] lda.data.frame lda.default    lda.formula    lda.matrix
Now type lda.data.frame or lda.default, etc., at a command prompt to
see the corresponding R code.
Is this
Hi Jason, Spencer,
Thanks for the prompt response. The strange thing about MASS is that it's
not in Package Sources as most other R packages are. It seems to come
with the binary R installation. I checked out the Rxx/library/MASS on my
laptop; there is source code (script) for Venables
Wei Geng,
I asked the same question about six weeks ago, so let me try to answer it.
The source for the entire package 'MASS' is in a single file, I believe (at
least this is true on my Linux setup). The exact location of that file
you'll have to determine by searching the directory/folder
With R 1.7.1 on Windows 2000, I got the R source code fine for each of the 4
options. What version of R are you using, and what exactly did you do? I can't
reproduce your error.
hope this helps. spencer graves
Wei Geng wrote:
Hi Jason, Spencer,
Thanks for the prompt response. The strange thing about MASS
Wei Geng wrote:
Does anyone know where to find out lda source code ?
Try typing lda.default at the prompt. That should get you started. Also see:
methods(lda)
as lda.default isn't the only bit of code used in lda()
Alternatively, grab the source from CRAN and read it at your leisure.
HTH
Gav
Wei Geng wrote:
Hi Jason, Spencer,
Thanks for the prompt response. The strange thing about MASS is that it's
not in Package Sources as most other R packages are. It seems to come
with the binary R installation. I checked out the Rxx/library/MASS on my
laptop; there is source code (script) for
You are using a *beta* version of R 1.8.0 and in that version
lda.default is not visible to the user (hidden in a namespace). You can
access it, though, by using the ::: (triple colon) operator, as in
library(MASS)
MASS:::lda.default
Actually, the first library() call is not necessary.
-roger
Hi,
Having dipped my toe into R a few times over the last year or two, in the
last few weeks I've been using it more and more; I'm now a thorough
convert. I've just joined the list, because although it's great, I do have
this problem...
I'm using linear discriminant analysis for binary
Hi dear R-users
I am trying to reproduce the steps involved in an LDA. Concerning the
eigenvectors there is a difference from SPSS. My textbook (Bortz)
says that the matrix of eigenvectors
V
is usually not normalized to length 1, but in such a way that the
following holds (SPSS does
The following satisfies some of your constraints but I don't know if
it satisfies all of them.
Let V = the eigenvectors normalized so that t(V) %*% V = I. Also, let D.5 =
some square root matrix, so t(D.5) %*% D.5 = Derror, and Dm.5 =
solve(D.5) = the inverse of D.5. The Cholesky decomposition
Hi, Christoph:
1. I didn't see in your original email that you wanted V to be
orthogonal, only that it's columns have length 1. You have a solution
satisfying the latter constraint, but not the former.
2. I don't have time now to sort out the details, and I don't have
them on the top
Hello
I have been using R to classify fish into groups using the lda formula and
predict(object,)$class. I am having trouble finding an output that
simultaneously shows me classification function coefficients for predictors
and groups (such as is given in SPSS under Classify: linear discriminant
Dear R-users
How can I get the eigenvalues out of an lda analysis?
thanks a lot
christoph
--
Christoph Lehmann [EMAIL PROTECTED]
On Tue, 3 Jun 2003, Christoph Lehmann wrote:
How can I get the eigenvalues out of an lda analysis?
It uses singular values, not eigenvalues: see ?lda for a description of the
output, and the print method for one way to use them.
--
Brian D. Ripley, [EMAIL PROTECTED]
Professor
the function discrimin of the ade4 package performs discriminant analysis.
Let's compare lda and discrimin (ade4) using the iris data:
with lda I get:
lda1 <- lda(iris[,1:4], iris[,5])
lda1$svd
[1] 48.642644 4.579983
with discrimin:
discrimin1 <- discrimin(dudi.pca(iris[,1:4], scan=F), iris[,5], scan=F)
discrimin1
eigen values: 0.9699 0.222
so where and how is the
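The two sets of numbers can apparently be reconciled: lda stores singular values (between/within standard-deviation ratios on the discriminant variables), while discrimin reports squared canonical correlations. With n = 150 observations and k = 3 groups, the conversion below reproduces the printed values (my reading of the scaling; worth double-checking):

```r
lambda <- lda1$svd^2 * (3 - 1) / (150 - 3)  # eigenvalues of solve(W) %*% B: 32.19, 0.285
lambda / (1 + lambda)                       # squared canonical correlations: 0.9699, 0.2220
```

So discrimin's "eigen values" are lambda/(1 + lambda), i.e. the same information on a 0-1 scale.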
Hi,
it seems that the lda function in the MASS library doesn't give the constant for the
linear discriminant function when we don't use standardized
variables; does anyone know how to obtain the constant in order to construct the
linear discriminant function?
I understand that
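For two groups, the missing constant can be recovered from the fitted object: project the two group means with $scaling, take their midpoint, and shift by the log prior odds divided by the mean difference. A sketch on a two-class subset of iris (the sign convention depends on the direction of $scaling):

```r
library(MASS)
ir2 <- subset(iris, Species != "virginica")
ir2$Species <- droplevels(ir2$Species)

fit <- lda(Species ~ ., data = ir2)
m   <- drop(fit$means %*% fit$scaling)     # the two projected group means
cut <- mean(m) - log(fit$prior[2] / fit$prior[1]) / (m[2] - m[1])

## classify by comparing each discriminant score with the cut-off
## (assumes the second group projects higher, i.e. m[2] > m[1])
sc  <- predict(fit)$x[, 1]
table(manual  = levels(ir2$Species)[(sc > cut) + 1],
      predict = predict(fit)$class)
```

With equal priors the log term vanishes and the constant is just the midpoint of the projected means.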
On Tue, 1 Apr 2003, array chip wrote:
I used the lda function in the MASS library of S-PLUS (R) to do a
linear discriminant analysis and got the linear coefficients, say b1
and b2, for the 2 predictors x1 and x2. I have trouble calculating the
discriminant scores for each observation. I
Hi,
I have recently been working on linear dimension reduction for classification.
There is a research report at
ftp://ftp.stat.math.ethz.ch/Research-Reports/108.html
In this report I discuss nine methods for linear dimension reduction, five
of which are new. Four of the methods do not perform internal
I'm working on a rather interesting consulting problem with a client. A
number of physical variables are measured on a number of cricket bowlers
in the performance of a delivery. An example variable might be a
directional component of angular momentum for a particular joint
measured at a large
R per se does not have an lda() function. Package MASS does, and MASS (the
book) describes it in detail.
If you use a package supporting a book (there are several), do expect to
read the book for the fine details.
On Mon, 10 Feb 2003, Luis Silva wrote:
There are some versions of lda. I would
At 01:10 PM 2/1/2003, Roland Goecke wrote:
Hi,
Is there a simple way to get the discriminant scores, or do I have to
manually multiply the coefficients with the data?
predict.lda will generate an object whose x component contains the scores,
among other things.
Try ?predict.lda