Hi,

I am trying to superimpose two density plots and am seeing strange behavior (I haven't seen this reported elsewhere). This little snippet makes the problem apparent...

library(lattice)
a = rnorm(10)                      # 10 draws from N(0, 1)
b = rnorm(100, mean=-2, sd=0.5)    # 100 draws from N(-2, sd = 0.5)
densityplot( ~ a + b)              # should overlay one curve per vector

This is supposed to superimpose the two distributions on one plot, as if they were in different "groups". But the resulting density plot is goofy -- it looks like lattice is mixing the two distributions instead of plotting each one separately. In fact, I can show that the distributions are being mixed...

groups = latticeParseFormula(~ a + b, parent.frame(), multiple = TRUE)$groups

If I understand things correctly, the groups vector has length equal to the combined length of the input vectors (110 in this case). Its values are factor levels indicating which input vector each element was drawn from, so there should be 10 "a"s and 100 "b"s. But in fact I get...

sum(groups == "a")
[1] 55

sum(groups == "b")
[1] 55
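For reference, the 10/100 split I expected can be reproduced by building the grouping factor by hand (just a sketch of my expectation, independent of lattice's internals):

```r
a <- rnorm(10)
b <- rnorm(100, mean = -2, sd = 0.5)

# One group label per element of each input vector:
# 10 "a"s followed by 100 "b"s.
expected <- factor(rep(c("a", "b"), times = c(length(a), length(b))))

sum(expected == "a")   # [1] 10
sum(expected == "b")   # [1] 100
```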

So lattice is indeed mixing the vectors for some reason, and that's why I'm seeing the incorrect plots.
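In the meantime, stacking the data by hand and passing an explicit grouping factor should sidestep the formula parsing entirely (a workaround sketch on my part, assuming the groups= argument itself is unaffected):

```r
library(lattice)

a <- rnorm(10)
b <- rnorm(100, mean = -2, sd = 0.5)

# Stack the two samples and label each element with its source vector.
x <- c(a, b)
g <- factor(rep(c("a", "b"), times = c(length(a), length(b))))

# One density curve per level of g, superimposed on a single panel.
densityplot(~ x, groups = g, plot.points = FALSE)
```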

I'm using lattice 0.10-14 on R 2.0.1 for Mac OS X. Details are below...

 platform = powerpc-apple-darwin6.8
 arch = powerpc
 os = darwin6.8
 system = powerpc, darwin6.8
 status =
 major = 2
 minor = 0.1
 year = 2004
 month = 11
 day = 15
 language = R

I won't have time to try R 2.1.0 for a little while. Is this problem fixed in that version? If so, I'll try it sooner.

Thanks in advance for any assistance!

--- Adam

Adam Lyon
Fermi National Accelerator Laboratory
Computing Division / D0 Experiment

______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html