Hi Meir,
It's part of Prof. Ripley's package tree, but is not exported.
library(tree)
ls(asNamespace("tree"))
RSiteSearch("tree.matrix")
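In case it helps a later reader, here is a minimal sketch of getting at unexported objects, using the base stats namespace rather than tree (which may not be installed):

```r
## List what lives in a package's namespace, exported or not
ns <- asNamespace("stats")
head(sort(ls(ns)))

## Retrieve an object directly from the namespace
f <- get("var", envir = ns)
f(c(1, 2, 3))   # 1

## The triple-colon operator also reaches into a namespace
## (var happens to be exported, so :: works for it too)
identical(stats:::var, stats::var)   # TRUE
```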
Regards, Mark.
Meir Preiszler wrote:
Hi,
Does anyone know where such a function can be found?
Thanks
Meir
Hi Jathine,
I hope this can explain the problem a bit more clearly.
Why PCA gives different results on the two different platforms?
What is amazing, Jathine, is how nearly exactly identical the two sets of
results are, not that they begin to differ at the 16th decimal place. To
assuage your
begin to concern you even
more.
Regards, Mark.
Hi Bo,
I can't seem to find the right set of commands to enable me to perform a
regression with cluster-adjusted standard errors.
Frank Harrell's Design package has ?bootcov and ?robcov, which will both do
it.
Regards, Mark.
Bo Cowgill wrote:
I can't seem to find the right set of
Hi Gareth,
If I use the full composition (31 elements or variables), I can get
reasonable separation of my 6 sources.
A word of advice: You need to be exceptionally careful when analyzing
compositional data. Taking compositions puts your data values into a
constrained/bounded space (generally
Aitchison geometry, so I am essentially working in Euclidean space.
Has anyone had experience doing stepwise LDA?? I can't for the life of me
find any help online about where to start.
Thanks
Gareth
Mark Difford wrote:
Hi Gareth,
If I use the full composition (31 elements
Hi Tommaso,
I struggle to understand the discrepancy in df between the anova and lme,
and the fact that the interaction term is not significant in the anova but
significant in lme.
To begin with, why try to compare things that are obviously quite different?
Surely you can see that the
Hi Michael,
It's in the manual:
?plot.summary.rqs
plot(summary(rq(..., tau = c(...))), parm = "x1", ...)
Regards, Mark.
Michael Faye wrote:
Dear all,
I have a question on plotting the coefficients from a series of multivariate
quantile regressions. The following code plots the
Hi Denis,
h <- c("3h30", "6h30", "9h40", "11h25", "14h00",
"15h55", "23h")
I could not figure out how to use chron to import this into times, so
I tried to extract the hours and minutes on my own.
Look at ?strptime for this:
##
strptime("6h30", format = "%Ih%M")
[1] 2008-06-21
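A small sketch of the same idea with a vector of times (the "23h" entry is written here as "23h00", since %M needs the minutes present):

```r
## Parse "h"-separated clock times; strptime returns POSIXlt, dated today
h <- c("3h30", "6h30", "9h40", "23h00")
tm <- strptime(h, format = "%Hh%M")
format(tm, "%H:%M")   # "03:30" "06:30" "09:40" "23:00"

## Hours and minutes are then directly available as list components
tm$hour   # 3 6 9 23
tm$min    # 30 30 40 0
```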
Hi Gundala,
Suppose I have 2 matrices A and B.
And I want to measure how good each of this matrix is.
You really want to be using Robert & Escoufier's RV-coefficient ("A unifying
tool for linear multivariate statistical methods: the RV-coefficient", Appl.
Statist., 1976, 25, 257-265).
Several
Hi Paul,
Duncan has shown you how to do it. There is often a simpler route that is
worth knowing about. Whether it works depends on how the function was coded.
In this case it works:
## Example
par(cex.main = 3)
spineplot (table (tbl$DAY, tbl$SEX), main='TIPS')
par(cex.main = 1.2)
spineplot
Hi Daren,
Can R (out)do Emacs? I think you just need to ?Sweave a little.
Mark.
Daren Tan wrote:
I have a folder full of pngs and jpgs, and would like to consolidate them
into a pdf with appropriate title and labels. Can this be done via R ?
Hi Pavel,
First, annotations should have the same cex-size on each axis. That said,
the way this is implemented is not too cexy (ouch!). You need to plot
without axes, e.g. plot(obj, axes = FALSE), then you add your axes
afterwards using your own specifications.
?axis
Also see ?par (sub 'ann')
Hi Pavel,
And perhaps read the entry for cex.axis a little more carefully. And bear in
mind that labels, main, and sub are distinct, having their own cex.*
settings.
HTH, Mark.
Hi Caroline,
Because POSIXlt is a complicated structure: you are dealing with a list, not
with what you think you are. Maybe this will help you to see more clearly.
strptime("19800604062759", format = "%Y%m%d%H%M%S")
[1] "1980-06-04 06:27:59"
str(strptime("19800604062759", format = "%Y%m%d%H%M%S"))
Hi Caroline,
is.na(strptime("19810329012000", format = "%Y%m%d%H%M%S"))
[1] TRUE
The problem was to do with daylight saving time. I need to specify a
time zone as this time doesn't exist in my operating system's current
time zone. I still think this is odd behaviour though! When you look
at
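A sketch of what is going on: a clock time that falls inside a spring-forward DST jump does not exist in the local zone, so strptime returns NA there; parsing in UTC (which has no daylight saving) always succeeds.

```r
## In a time zone where 1981-03-29 01:20 fell inside the DST jump,
## strptime("19810329012000", format = "%Y%m%d%H%M%S") gives NA.

## UTC has no daylight saving, so the same string always parses:
strptime("19810329012000", format = "%Y%m%d%H%M%S", tz = "UTC")
# "1981-03-29 01:20:00 UTC"
```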
Hi Dylan,
I am curious about how to interpret the table produced by
anova(ols(...)), from the Design package.
Frank will perhaps come in with more detail, but if he doesn't then you can
get an understanding of what's being tested by doing the following on the
saved object from your OLS call
Hi wf
I just cannot believe that R does not have a good command of this.
Curious. I find R's graphical output matchless. Almost without exception I
use postscript and find the controls available under base graphics (?par) or
lattice adequate (to understate). Very occasionally I fiddle with
Hi willemf,
Glad to hear that it helped. Years ago (late-90s) I Linuxed, but have since
been forced into the Windows environment (where, however, I have the great
pleasure of being able to use MiKTeX and LyX, i.e. TeX/LaTeX). I therefore
can't help you further, except to say that I have never
, it's not possible to help further.
Of course, you could send me the data and a script showing how you want it
plotted, and I would send you a PDF in return, showing you what R can do ;).
HTH, Mark.
Hi Daniela,
Spencer (? Graves) is not at home. Seriously, this is a list that many
people read and use. If you wish to elicit a response, then you would be
wise to give a better statement of what your difficulty is.
The function you enquire about is well documented with an example, see
##
Hi Andreas,
It's because you are dealing with binary or floating point calculations, not
just a few apples and oranges, or an abacus (which, by the way, is an
excellent calculating device, and still widely used in some [sophisticated]
parts of the world).
Hi Ptit,
I would like to fit data with the following formula :
y=V*(1+alpha*(x-25))
where y and x are my data, V is a constant and alpha is the slope I'm
looking for.
Priorities first: lm(), or ordinary least-squares regression, is basically a
method for finding the best-fitting straight
Hi Jaap,
With all those packages loading it could take some time, unless it's a known
problem (?). Why don't you do a vanilla start (add switch --vanilla to
startup) and do some simple core-related stuff. Then add packages
one-by-one...
Or: search through the source code of the packages for
Hi Jaap,
Great stuff! As the old adage went, Go well, go
Bye, Mark.
Van Wyk, Jaap wrote:
Thanks, Mark, for the response.
The problem is with SciViews. It is not stable under the latest version of
R.
I found a solution by downloading the latest version of Tinn-R, which
communicates
Hi Ben,
Sorry (still a little out-of-tune), perhaps what you really need to know
about is ?"["
HTH, Mark.
Hi Ben,
If you wouldn't mind, how do I access the individual components inside
coefficients matrix?
What you want to know about is ?attributes
##
attributes(model)
model$coefficients
model$coefficients[1]
model$coefficients[2:4]
model$coefficients[c(1,5)]
HTH, Mark.
ascentnet wrote:
Hi All,
It really comes down to a question of attitude: you either want to learn
something fundamental or core and so bootstrap yourself to a better place
(at least away from where you are), or you don't. As Marc said, Michal seems
to have erected a wall around his thinking.
I don't think it's
Hi Angelo,
Look carefully at package vcd; and at log-linear models (+ glm(...,
family=poisson)). For overdispersion there are more advanced methods.
HTH, Mark.
Angelo Scozzarella wrote:
Hi,
how can I treat data organised in classes and frequencies?
Ex.
class frequency
Hi Robin,
I ... can't get lm to work despite reading the help. I can get it to work
with a single explanatory variable, e.g.
model <- lm(data$admissions ~ data$maxitemp)
I'll answer just the second of your questions. Advice: don't just read the
help file, look at the examples and run them; look
Hi Edna,
Because I am always subsetting, I keep the following function handy
mydata[] <- lapply(mydata, function(x) if (is.factor(x)) x[, drop = TRUE] else x)
This will strip out all factor levels that have been dropped by a previous
subsetting operation. For novice users of R (though I am not
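A toy run of that idiom (data invented here), showing why it is needed:

```r
## Subsetting a data frame keeps empty factor levels around
mydata <- data.frame(f = factor(c("a", "b", "c")), x = 1:3)
sub <- mydata[mydata$f != "c", ]
levels(sub$f)   # still "a" "b" "c"

## Re-index every factor column with drop = TRUE to discard unused levels
sub[] <- lapply(sub, function(x) if (is.factor(x)) x[, drop = TRUE] else x)
levels(sub$f)   # "a" "b"
```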
Hi Kevin,
Can anyone give me a short tutorial on the formula syntax? ... I am sorry
but I could not glean this information from the help page on lm.
You can give yourself a very good tutorial by reading ?formula and Chapter
12 of
Hi Jinsong and Thierry,
(x1 + x2 + x3)^2 will give you the main effects and the interactions.
Although it wasn't specifically requested it is perhaps important to note
that (...)^2 doesn't expand to give _all_ interaction terms, only
interactions to the second order, so the interaction term
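It is easy to check exactly which terms the ^2 shorthand generates:

```r
## terms() shows the expansion of a formula
f <- y ~ (x1 + x2 + x3)^2
attr(terms(f), "term.labels")
# "x1" "x2" "x3" "x1:x2" "x1:x3" "x2:x3"   -- no x1:x2:x3

## ^3 (or x1*x2*x3) adds the third-order interaction as well
attr(terms(y ~ (x1 + x2 + x3)^3), "term.labels")
```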
Hi Murali,
I am interested in plotting my regression analysis results(regression
coefficients and
standard errors obtained through OLS and Tobit models) in the form of
graphs.
plot(obj$lm) will give you a set of diagnostic plots. What you seem to be
after is ?termplot. Also look at John
Hi Ileana,
See this thread:
http://www.nabble.com/R-package-install-td18636993.html
HTH, Mark.
Somesan, Ileana wrote:
Hello,
I want to install the package multiv which is not maintained any
more (found in the archive: multiv_1.1-6.tar.gz from 16 July 2003). I
have installed an older
Hi Kevin,
The documentation indicates that the bw is essentially the sd.
d <- density(rnorm(1000))
Not so. The documentation states the following about bw: "The kernels
are scaled such that this is the standard deviation of the smoothing
kernel...", which is a very different thing.
The
to the negative one-fifth power (= Silverman's ‘rule of
thumb’
But how does that relate to say a Poisson distribution or a two-parameter
distribution like a normal, beta, or binomial distribution?
Thank you.
Kevin
Mark Difford [EMAIL PROTECTED] wrote:
Hi Kevin
, and
then...
HTH, Mark.
rkevinburton wrote:
Sorry I tried WikiPedia and only found:
Wikipedia does not have an article with this exact name.
I will try to find some other sources of information.
Kevin
Mark Difford [EMAIL PROTECTED] wrote:
Hi Kevin,
I still have my original
Hi Chunhao,
I googled the website and I found that there are three ways to perform
repeated-measures ANOVA: aov, lme and lmer.
It's also a good idea to search through the archives.
I use the example that is provided in the above link and I try
Hi Chunhao,
If you carefully read the posting that was referred to you will see that
lme() and not lmer() was used as an example (for using with the multcomp
package). lmer() was only mentioned as an aside... lmer() is S4 and doesn't
work with multcomp, which is S3.
Apropos of specifying random
Hi Miki and Chunhao,
Rusers (Anna, and Mark {thank you guys}) provided me very valuable
information.
Also see Gavin Simpson's posting earlier today: apparently multcomp does now
work with lmer objects (it's gone through phases of not working, then
working: it's still being developed).
Hi Ronaldo,
... lmer p-values
There are two packages that may help you with this and that might work with
the current implementation of lmer(). They are languageR and RLRsim.
HTH, Mark.
Bugzilla from [EMAIL PROTECTED] wrote:
Hi,
I have a modelo like this:
Yvar <- c(0, 0, 0, 0, 1, 0,
Hi Scotty,
Can't give an answer from what you've provided, but one temp. work-around
that might work is to get onto CRAN -- packages and download the packages
you need from your web browser as zip files, then do an Install package(s)
from local zip files... from the Packages menu.
HTH, Mark.
Hi Ullrich,
The model is
RT.aov <- aov(RT ~ Cond + Error(Subj/Cond), WMU3C)
I understand that TukeyHSD only works with an aov object, but that
RT.aov is an aovlist object.
You want to use lme() in package nlme, then glht() in the multcomp package.
This will give you multiplicity adjusted
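A minimal sketch of that route on simulated data (WMU3C is not shown in the thread, so Subj, Cond and RT are invented here; the multcomp step is shown as a comment since that package may not be installed):

```r
library(nlme)
set.seed(1)
## Invented stand-in for WMU3C: 10 subjects, 3 within-subject conditions
dat <- data.frame(Subj = factor(rep(1:10, each = 3)),
                  Cond = factor(rep(c("A", "B", "C"), 10)))
dat$RT <- 300 + 20 * (dat$Cond == "C") +       # condition effect
          rep(rnorm(10, sd = 10), each = 3) +  # subject effect
          rnorm(30, sd = 5)                    # noise

## Random intercept per subject, equivalent to aov's Error(Subj) stratum
RT.lme <- lme(RT ~ Cond, random = ~ 1 | Subj, data = dat)
anova(RT.lme)

## Then, for multiplicity-adjusted pairwise comparisons:
## library(multcomp)
## summary(glht(RT.lme, linfct = mcp(Cond = "Tukey")))
```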
Hi Ullrich,
# what does '~1 | Subj/Cond' mean?
It is equivalent to your aov() error structure [ ... +Error(Subj/Cond) ].
It gives you a set of random intercepts, one for each level of your nesting
structure.
## To get some idea of what's being done it helps to have a continuous
covariate in
Hi Stefan,
Is it possible to combine both PCAs in order to get only one set of
eigenvectors?
Yes there is: statis() in the ade4 package is probably what you want. In
short, it does a k-table analysis that will give you a common
ordination/position. It also shows how each time-set deviates
Daniel,
Yes I am trying to model such data, and i need R to know that Site is
nested within Habitat.
Do I use some kind of command before running the model (like factor() and
so on) or do i
write it in the model formula. If so, how?
You still are not telling the list enough, since
Hi Mao,
I am confused. And, I want to know how to assign a wanted order to factor
levels, intentionally?
You want ?relevel. Although the documentation leads one to think that it can
only be used to set a reference level, with the other levels being moved
down, presently it can in fact be
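A quick illustration of both routes (toy factor invented here):

```r
## Levels default to alphabetical order
f <- factor(c("low", "mid", "high"))
levels(f)   # "high" "low" "mid"

## relevel(): move one level to the front (new reference level)
levels(relevel(f, ref = "mid"))   # "mid" "high" "low"

## factor(..., levels = ): impose any order you like
levels(factor(f, levels = c("low", "mid", "high")))   # "low" "mid" "high"
```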
):
[1] cluster_1.11.9 gamlss_1.7-0 grid_2.6.1 lattice_0.17-2 latticeExtra_0.3-1
[6] rcompgen_0.1-17 tools_2.6.1
Dear List-members,
Hopefully someone will help through my confusion:
In order to get the same coefficients as we get from the following
##
require(MASS)
summary(lm(Gas ~ Insul/Temp - 1, data = whiteside))
..
we need to do the following (if we use model.matrix to
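For the record, the nested formula expands to one intercept and one slope per Insul level, which can be checked directly (whiteside ships with MASS):

```r
library(MASS)   # for the whiteside data
fit1 <- lm(Gas ~ Insul/Temp - 1, data = whiteside)
## Insul/Temp - 1 is shorthand for Insul + Insul:Temp - 1
fit2 <- lm(Gas ~ Insul + Insul:Temp - 1, data = whiteside)
all.equal(fitted(fit1), fitted(fit2))   # TRUE: the same model
coef(fit1)   # separate intercept and Temp slope for Before and After
```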
}
rownames(olscf) <- rownames(coef(obj))
Thanks again for your input.
Regards,
Mark.
Berwin A Turlach wrote:
G'day Mark,
On Wed, 12 Dec 2007 02:05:54 -0800 (PST)
Mark Difford [EMAIL PROTECTED] wrote:
In order to get the same coefficients as we get from the following
[...]
we need
to get me through to the next level.
Best Regards,
Mark.
Gavin Simpson wrote:
On Wed, 2007-12-12 at 06:46 -0800, Mark Difford wrote:
Hi Gavin, Berwin,
Thanks for your detailed replies. I'll make a general reply, if you
don't
mind.
To reiterate, my main point is that if model.matrix
Hi Petr,
Mike Prager has implemented a 4D contour plot in R. You might find this
useful. Find an example and the code at:
http://addictedtor.free.fr/graphiques/graphcode.php?graph=90
HTH,
Mark.
Petr Pikal wrote:
Dear all
I want to display 4 dimensional space by some suitable way. I
Hi Silvia,
What I need is exactly what I get using biplot (pca.object) but for other
axes.
You need to look at ?biplot.prcomp (argument: choices=)
## Try
biplot(prcomp(USArrests), choices=c(1,2)) ## plots ax1 and ax2
biplot(prcomp(USArrests), choices=c(1,3)) ## plots ax1 and ax3
Hi All,
Thanjuvar wrote:
model2 <- lm(lavi ~ age + sex + age*race + diabetes + hypertension, data = tb1)
David wrote:
in the second equation you are only including the interaction of age*race
and the main effect of age, but not the main effect of race, which is what
came out significant in your first
Hi Simon,
Careful, someone is likely to tell you that the bible is Pinheiro, J.C.,
and Bates, D.M. (2000) Mixed-Effects Models in S and S-PLUS, Springer, and
that would be much closer to being correct. Others might mention something
by Searle. Nothing against Crawley, of course. But it usually
Hi Sören,
(1) Is there an easy example, which explains the differences between
pca and pfa? (2) Which R procedure should I use to get what I want?
There are a number of fundamental differences between PCA and FA (Factor
Analysis), which unfortunately are quite widely ignored. FA is
Hi James,
Daniel Chessel's table.value function (also look at table.paint) in ade4
will do the main panel, you will need to build the side panels. It is
brilliantly simple, and becomes especially informative and powerful when
used with a standardized data frame (?scale) and/or the coordinates
Hi Rodrigo,
I would appreciate any suggestion on how can I make a text I've inserted in
a plot show some contrast?
There are quite a few approaches, but why not try legend?
Best, Mark.
Rodrigo Aluizio wrote:
Hi List,
I would appreciate any suggestion on how can I make a text I've
Hi Rodrigo,
I need to rotate on y axis the lines and symbols of constrained and sites
representation.
Easiest is to multiply the axis you want to invert by -1. Something like the
following, where my.cca is the original object and yax = obj[, 2] (xax being
obj[, 1]). Obviously, copying isn't
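The internal structure of a cca object varies by package, so here is the same flip sketched on a plain prcomp biplot, where the pieces are easy to see:

```r
## Reflecting an ordination axis: negate both scores and loadings
## so the two stay consistent with each other
pc <- prcomp(USArrests, scale. = TRUE)
pc.flip <- pc
pc.flip$x[, 2]        <- -pc.flip$x[, 2]         # site/case scores
pc.flip$rotation[, 2] <- -pc.flip$rotation[, 2]  # variable loadings

op <- par(mfrow = c(1, 2))
biplot(pc,      main = "original")
biplot(pc.flip, main = "axis 2 reflected")
par(op)
```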
, Mark.
.
--
From: Mark Difford [EMAIL PROTECTED]
Sent: Tuesday, October 07, 2008 10:34 AM
To: r-help@r-project.org
Subject: Re: [R] Mirror Image on Biplot Graphic
Hi Rodrigo,
Sorry, that will not return a result (I use several different ordination
packages, in most of which
or
text.cca(scores(CAPpotiFTI$biplot), col = "#323232", cex = 0.6, lwd = 2, lty = "dotted")
Doesn't return an error, but it's not a mirror image as expected, it's
nonsense.
--
From: Mark Difford [EMAIL PROTECTED]
Sent: Tuesday, October 07, 2008 12:22 PM
To: r
Hi Rodrigo,
Again an error, as that doesn't touch one of the data structures. You need
to extend the range to include #15, as below:
## This does axis 2
mynew.cca <- my.cca
for (i in c(2:8, 15)) mynew.cca$CCA[[i]][, 2] <- mynew.cca$CCA[[i]][, 2] * -1
Cheers, Mark.
Mark Difford wrote:
Hi
for (i in c(2:8, 15)) CAPpotiFTI$CCA[[i]][, 1] <- CAPpotiFTI$CCA[[i]][, 1] * -1
Error in CAPpotiFTI$CCA[[i]] : index out of bounds
Well, it needs lots of patience...
--
From: Mark Difford [EMAIL PROTECTED]
Sent: Tuesday, October 07, 2008 3:50
for (i in c(2:8, 15)) CAPpotiFTI$CCA[[i]][, 1] <- CAPpotiFTI$CCA[[i]][, 1] * -1
--
From: Mark Difford [EMAIL PROTECTED]
Sent: Tuesday, October 07, 2008 5:17 PM
To: r-help@r-project.org
Subject: Re: [R] Mirror Image on Biplot Graphic
Hi Rodrigo,
Yes it does
Hi Rodrigo,
Here are two options; for each type, the second version gives 2nd order
interactions
## aov
T.aovmod <- aov(response ~ Season + Beach + Line + Error(Block/Strata))
T.aovmod <- aov(response ~ (Season + Beach + Line)^2 + Error(Block/Strata))
## lme
library(nlme)
T.lmemod <- lme(response
Hi losemind,
I understand the resultant lm coefficients for one factor, but when it comes
to the interaction term, I got confused.
Yes, it is possible to lose your mind on this (so perhaps get a real name).
A good friend here is
?dummy.coef
In your case (i.e. treatment contrasts), your
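A minimal dummy.coef illustration with the built-in npk data (not the poster's model, which is not shown in full):

```r
## With treatment contrasts, coef() reports differences from a reference
## level; dummy.coef() re-expresses the same fit with a value per level.
fit <- lm(yield ~ N * P, data = npk)
coef(fit)         # reference-level parameterization
dummy.coef(fit)   # a coefficient for every level, including the reference
```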
Hi losemind,
What's wrong?
What's wrong is probably that you never read the help page for dummy.coef
properly. But that is a wild guess, since I have no idea what your yy is.
And one is strongly inclined to say, Why, oh why?
Your first posting on this subject has your linear model fit, which is
somewhat *counter* to the purpose of using methods that are NOT
least-squares based
Martin Maechler, ETH Zurich
LP 2008/11/13 Mark Difford [EMAIL PROTECTED]
Hi Laura,
I was searching for a way to compute robust R-square in R in order to get
Hi Ann,
I want to delete one category from the dataset ... I have tried the omit
command but it just returns TRUE and FALSE values.
You are leaving the list to guess at what you have tried, and which
functions you are using.
There are several different ways of omitting or dropping
Hi Christina,
How can this happen? How can the p-values from the Tukey become
significant when the lme-model wasn't?
The link below, with an explanation by Prof. Fox is relevant to your
question:
http://www.nabble.com/Strange-results-with-anova.glm()-td13471998.html#a13475563
Another way
Hi Michael,
coinertia(op.dudi, em3.dudi)
Error in paste("COCA", 1:n.axes, sep = "") : element 2 is empty;
Something makes me think that this is not the full error message, or the
correct error message, or that you have something else in your call to
coinertia?
Why? Because you also list
Hi Jason,
lattice, with the help of the latticeExtra package does excellent
business-like 3D bars. With devices like PDF that handle transparency you
can make the facets transparent.
library(lattice)
library(latticeExtra)
?panel.3dbars
## Example from the help file (modified to show alpha
Hi Roland,
But this is obviously not a dagger and it seems the Adobe Symbol font
does not have a dagger.
True, but ... Yoda was here.
plot(0:1, 0:1, type = "n")
text(x = 0.5, y = 0.5, labels = expression("\u2020"))
text(x = 0.4, y = 0.6, labels = expression("\u2021"))
?plotmath, sub Other symbols ... Any
() for yourself.
Mark Difford.
Michael Kubovy wrote:
Dear Friends,
require(cluster)
x <- rbind(cbind(rnorm(10, 0, 0.5), rnorm(10, 0, 0.5)),
cbind(rnorm(15, 5, 0.5), rnorm(15, 5, 0.5)))
plot(pp <- pam(x, 2), which.plots = 1)
How can I extract the coordinates used in the plot
Or maybe by John Chambers as the central person for the development of S
... I'd be more interested in their opinions than another outrage!
Hi Roland,
The Trevor who Eugene Dalt is referring to is Trevor Hastie. Trevor was at
AT&T Bell Labs at the time and worked/has worked very closely with
...
And perhaps I might also have added that one of the strong precepts of this
list is that credit be given where credit is due. Without such
acknowledgement, which of course is founded on a strong principle, the
Open-source community is likely eventually to fall flat on its face...
Regards,
Hi All,
Before these things be set in stone, it should be noted that it would be a
real mistake to have a miscalculated statistical object on R's Homepage.
Imagine if SAS found out!
Fact is, the manner in which the percentage contribution of each PC to the
overall inertia is calculated in the
Indeed. The postings exuded a tabloid-esque level of slimy nastiness.
Indeed, indeed. But I do not feel that that is necessarily the case. Credit
should be given where credit is due. And that, I believe is the issue that
is getting (some) people hot and bothered. Certainly, Trevor Hastie in
. the
botched article.
I think that what some people are waiting for are factual statements from
the parties concerned. Conjecture is, well, little more than conjecture.
Regards, Mark.
Rolf Turner-3 wrote:
On 4/02/2009, at 8:15 PM, Mark Difford wrote:
Indeed. The postings exuded
hard-nosed individuals on this list can be, or have become.
Regards, Mark.
still to wait.
Duncan Murdoch-2 wrote:
On 2/4/2009 3:53 PM, Mark Difford wrote:
Indeed. The postings exuded a tabloid-esque level of slimy
nastiness.
Hi Rolf,
It is good to have clarification, for you wrote "... or Leibniz invented
calculus."
On Thu, Feb 5, 2009 at 11:35 AM, Mark Difford
mark_diff...@yahoo.co.ukwrote:
I think that all appeared on January 8 in Vance's blog posting, with a
comment on it by David M Smith on Jan 9. So those people have -27
days
Then there was no need for vituperative
If you have bug reports for a contributed package please take them up with
the maintainer,
not the list.
Of course, Wacek is right. His observations being made with a customary
needle-like precision. It's that old conundrum about how to have your cake
and still eat it.
Regards to all, Mark.
, is the point.
Peter Dalgaard wrote:
Mark Difford wrote:
It would have been very easy for Mr. Vance to have written:
John M. Chambers, a former Bell Labs researcher who is now a consulting
professor of statistics at Stanford University, was an early champion. At
Bell Labs, Mr. Chambers had
correctly, it seems that Mr.
Vance made an even more botched job of parts of his article than one would
have thought possible. The proverbial curate's egg, it seems.
Regards, Mark.
Duncan Murdoch-2 wrote:
On 2/5/2009 1:05 AM, Mark Difford wrote:
I think that all appeared on January 8
Hi Pele,
I have been trying to output all my results (text, plots, etc) into the
same postscript file...
Surely I missed something? Is it not simply that you are turning off the
postscript device in the wrong place? At the moment you only seem to be
using the postscript device in the
Hi David, Pele:
David Winsemius wrote:
I don't see anywhere that you opened a pdf device. When I try
pdf("test.pdf") and then run your code I get what looks like the
desired output sitting in my working directory:
Pele does open a PDF device (previously it was a postscript device).
Sorry, the message seems to have got botched. Here it is again:
Pele does open a PDF device (previously it was a postscript device). It
looks like what Pele is trying to do is plot the printed results of the
summary of a model + the AIC, together with the acf() and pacf() plots.
As Dieter
Hi Mike,
I'm surprised there were no takers on this query; I thought it would
be an easy answer, particularly where I provided example data set and
code.
The following simplification should help you to answer your own question.
covariate_aov = aov(dv~(covariate+group+iv1+iv2)^2,data=a)
Hi Paul,
Have you ever seen a drawing of the regions of an R plot with the
terminology that is used for parts?
From what I can remember, several documents on CRAN cover this. The one that
springs to mind is Alzola & Harrell's "An Introduction to S and the Hmisc
and Design Libraries", which you
Hi Glen, Andrew,
The PCA is just a singular value decomposition on a sample covariance/...
I believe that Bjørn-Helge Mevik's point was that __if you read the
documentation__ you will see the argument covmat to princomp(). This,
really, is much more straightforward and practical than Andrew's
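A small sketch of that covmat route, using the built-in USArrests data:

```r
## princomp can work from a covariance (or correlation) matrix alone
cv <- cov(USArrests)
pc <- princomp(covmat = cv)
summary(pc)    # variance explained per component
loadings(pc)

## With no data matrix supplied there are, of course, no scores
is.null(pc$scores)   # TRUE
```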
Hi Dylan,
Am I trying to use contrast.Design() for something that it was not
intended for? ...
I think Prof. Harrell's main point had to do with how interactions are
handled. You can also get the kind of contrasts that Patrick was interested
in via multcomp. If we do this using your
Hi Dylan, Chuck,
contrast(l, a = list(f = levels(d$f)[1:3], x = 0),
b = list(f = levels(d$f)[4], x = 0))
There is a subtlety here that needs to be emphasized. Setting the
interacting variable (x) to zero is reasonable in this case, because the
mean value of rnorm(n) is zero. However, in the real world
Hi Dylan, Chuck,
Mark Difford wrote:
Coming to your question [?] about how to generate the kind of contrasts
that Patrick wanted
using contrast.Design. Well, it is not that straightforward, though I may
have missed
something in the documentation to the function. In the past I have
Duncan Murdoch wrote:
There's no real difficulty there: axis takes an mgp arg as well.
Thanks for that. A good bit of practical advice, which I hadn't yet clicked
on. I won't comment on the thinking thing;)
Regards, Mark.
Duncan Murdoch-2 wrote:
On 18/02/2009 7:50 AM, Mark Difford wrote
Hi Chun,
I did do the research and work on it for hours ... I am trying to create a
new variable in my dataset.
Yes, looks like you did. Look at ?interaction, which gives you more
flexibility than ?":".
## Example
diet <- sort(rep(x = c("C", "T"), 4))
vesl <- rep(x = c("A", "P"), 4)
mydata <- data.frame(diet, vesl)
mydata$trt
Hi Simon,
I want to know if yrs (a continuous variable) has a significant unique
effect in the model,
so I fit a simplified model with the main effect ommitted...
[A different approach...] This is not really a sensible question until you
have established that there is no significant
. Do it by comparing nested models (basically as you have done), or
use dropterm() or stepAIC() [both are in MASS].
Regards, Mark.