Re: [R] Time-measurement in milliseconds

2005-11-07 Thread Sydler, Dominik
Thanks a lot, that works fine!
Strange that I didn't find this function in the help...

Bye
Dominic 

-Original Message-
From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
Sent: Montag, 7. November 2005 18:27
To: Sydler, Dominik
Cc: r-help@stat.math.ethz.ch
Subject: Re: [R] Time-measurement in milliseconds

Be aware that the measurement will itself take more than a millisecond
from an interpreted language like R.

Please see the help page for Sys.time, and its suggestion of proc.time.
for(i in 1:100) print(proc.time()[3]) suggests this takes 3ms on my box.
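
For the archive, a minimal sketch of that workaround: time many repetitions
with proc.time() and average, since a single evaluation is too fast to
resolve.  The helper name, repetition count and example expression are
invented for illustration.

time.ms <- function(f, reps = 1000) {
    ## proc.time()[3] is elapsed ("wall clock") time in seconds
    start <- proc.time()[3]
    for (i in 1:reps) f()
    elapsed <- proc.time()[3] - start
    1000 * elapsed / reps          # average milliseconds per call
}

time.ms(function() sum(rnorm(1000)))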

On Mon, 7 Nov 2005, Sydler, Dominik wrote:

> Hi there
>
> I'm looking for a time-measurement to measure time-differences in 
> milliseconds.
> On my search, I only found the following:
> - package "base": Sys.time()
>-> only second-accuracy
> - package "R.utils": System$currentTimeMillis()
>-> returns integer of milliseconds, but accuracy is only whole 
> seconds too.
> At the moment I run every bit of code to measure 1000-times to be able
> to calculate time in milliseconds... ;-)
>
> Has anyone a method to get milliseconds?
>
> Thanks for any help.
>

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595



Re: [R] Sort a dataframe

2005-11-07 Thread Abd. Rahman Kassim

Dear Andrew,

Thanks for the response. It works.

Abd. Rahman
- Original Message - 
From: "Andrew Robinson" <[EMAIL PROTECTED]>
To: "Abd. Rahman Kassim" <[EMAIL PROTECTED]>
Cc: "R-Help Discussion" 
Sent: Tuesday, November 08, 2005 12:53 PM
Subject: Re: [R] Sort a dataframe


> Try:
>
> dataframeName <- dataframeName[order(dataframeName$ColumnName),]
>
> Cheers
>
> Andrew
>
> On Wed, Nov 09, 2005 at 12:43:13PM +0800, Abd. Rahman Kassim wrote:
>>
>> Dear All,
>>
>> How can I sort a data frame (using one of the columns)?
>>
>> Thanks for your support.
>>
>> Regards.
>>
>> Abd. Rahman Kassim (PhD)
>> Head Forest Ecology Branch
>> Forest Management & Ecology Program
>> Forestry and Conservation Division
>> Forest Research Institute Malaysia
>> Kepong 52109
>> Selangor, Malaysia
>>
>>
>
> -- 
> Andrew Robinson
> Senior Lecturer in Statistics   Tel: +61-3-8344-9763
> Department of Mathematics and StatisticsFax: +61-3-8344-4599
> University of Melbourne, VIC 3010 Australia
> Email: [EMAIL PROTECTED]Website: 
> http://www.ms.unimelb.edu.au 





Re: [R] Poisson/negbin followed by jackknife

2005-11-07 Thread Prof Brian Ripley
Do you really mean a jackknife or leave-one-out crossvalidation?  They are 
not the same, but the second is often incorrectly called the first.

In either case, I would point you at the book the 'boot' package supports.
See for example its cv.glm function.
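
For the archive, a minimal sketch of leave-one-out cross-validation of a
Poisson GLM with cv.glm() from the boot package; the data set and variable
names are invented for illustration.

library(boot)

fit <- glm(counts ~ x1 + x2, family = poisson, data = mydata)

## K defaults to the number of observations, i.e. leave-one-out CV;
## the default cost is the average squared prediction error.
cv.out <- cv.glm(mydata, fit)
cv.out$delta   # raw and adjusted cross-validation estimates of prediction error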

On Mon, 7 Nov 2005, Jeffrey Stratford wrote:

> Thanks for the help with the hier.part analysis.  All the problems
> stemmed from an import problem which was solved with file.choose().
>
> Now that I have the variables that I'd like to use I need to run some
> GLM models.  I think I have that part under control but I'd like to use
> a jackknife approach to model validation (I was using a hold out sample
> but this seems to have fallen out of favor).
>
> I'd appreciate it if someone could just point me in the right direction
> for the jackknife analysis given a particular distribution, coefficients,
> etc.

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595



Re: [R] Modifying Internal C Files

2005-11-07 Thread Prof Brian Ripley
In your first post you said you called .C.  Here you say you call 
.Internal.  Users cannot add .Internal functions, nor can they be used in 
packages. In any case, The syntax is not as you show, but

 .Internal(pnbeta(q, shape1, shape2, ncp, lower.tail, log.p))

(And to be definite, that calls the C function do_math4, not pnbeta.)

Final comment:  Duncan mentioned

>> double pnbeta2(double x, double a, double b, double lambda,
>>   int lower_tail, int log_p)

but a .C call has to be to a function with all arguments as pointers, so 
there would need to be a wrapper function.

The reason you are finding this difficult is that you are not dealing with 
the C function behind .Internal, but one deeper into the system. The 
simplest thing to do would be to compile R from the sources, then make 
modifications and rebuild.  But in this case I believe Thomas Lumley has 
already modified the R-devel sources and Duncan Murdoch has provided 
Windows binaries of R-devel, so perhaps there is a much simpler route?


On Mon, 7 Nov 2005, Ken Kelley wrote:

> Hi Duncan and others.
>
> Thanks for your insight. Actually, I did change the function name in the
> pnbeta2.c code (I should have said so). When I check the package I get
> no errors. I build the package and all seems well. When I install and
> then load the package in R, I get:
>
> > pnbeta2
> function (x, a, b, lambda, low.tail, log.p)
> {
> res <- .Internal("pnbeta2", as.double(x), as.double(a),
> as.double(b), as.double(lambda), as.integer(low.tail), as.integer(log.p))
> return(list(res))
> }
> 
>
> which seems good, but the function does not work:
>
> pnbeta2(.1, 10, 25, 2, TRUE, FALSE)
> Error in pnbeta2(0.1, 10, 25, 2, TRUE, FALSE) :
> 7 arguments passed to '.Internal' which requires 1
>
> When I try to run the .Internal file directly is perhaps where my
> problems begin:
> > .Internal(pnbeta2(.1, 10, 25, 2, TRUE, FALSE))
> Error in .Internal(pnbeta2(0.1, 10, 25, 2, TRUE, FALSE)) :
> no internal function "pnbeta2"
>
> But, when I do the same thing to the pnbeta internal function (which
> pnbeta2 is identical to at this point) I get the result:
> > .Internal(pnbeta(.1, 10, 25, 2, TRUE, FALSE))
> [1] 0.0006837318
>
> So, I'm at a loss for what is going on in this situation. My pnbeta2
> internal function doesn't seem to be there. What I want to do is simple
> (modify an internal C file), but it is proving to be quite difficult for
> me to implement.
>
> Thanks for any thoughts,
> Ken
>
>
> Duncan Murdoch wrote:
>> On 11/7/2005 5:17 PM, Ken Kelley wrote:
>>
>>> Hi All.
>>> I want to tweak a few minor things inside of internal C code. I have
>>> my Win. XP machine set-up to build packages (including C code), but
>>> I'm having problems getting the package to run correctly. In
>>> particular, I want to modify a few things inside of pnbeta.c (namely
>>> errmax and itrmax), which is what the pbeta() function calls upon when
>>> there is a noncentral parameter. I copied the pnbeta.c C code, changed
>>> its name [to pnbeta2.c], included the nmath.h, dpq.h files, lgamma.c,
>>> and pbeta.c in my src folder (since the .h files were called upon and
>>> the .c files were). I then created an R function that I thought would
>>> call upon the new C code:
>>>
>>> pnbeta2 <- function(x, a, b, lambda, low.tail, log.p)
>>> {
>>> res <- .C("pnbeta2", as.double(x), as.double(a), as.double(b),
>>> as.double(lambda), as.integer(low.tail), as.integer(log.p))
>>> return(list(res))
>>> }
>>>
>>> But after I built the package and loaded it, the function doesn't
>>> work (it isn't even recognized). I have no idea why this is failing.
>>> Any information to help me figure out what I need to do to modify
>>> internal C code generally, or specifically as it applies to this
>>> scenario would be most helpful.
>>
>>
>> You didn't say that you changed the name of the function, only the file
>> that contained it.  If this wasn't an oversight, then you should put
>>
>> double pnbeta2(double x, double a, double b, double lambda,
>>   int lower_tail, int log_p)
>>
>> in the appropriate place in your pnbeta2.c file.
>>
>> The other thing you should do is to add a PACKAGE argument to your .C
>> call, just in case there is already a pnbeta2 function somewhere else
>> (or will be some day).  In fact, if you do this, there should be no need
>> to change the name of the function:  R will look in the package DLL
>> rather than R.dll to find it.  Just make sure that the name in the .C
>> call matches the declared name in the source.
>>
>> Duncan Murdoch
>>
>
> -- 
> Ken Kelley, Ph.D.
> Inquiry Methodology Program
> Indiana University
> 201 North Rose Avenue, Room 4004
> Bloomington, Indiana 47405
> http://www.indiana.edu/~kenkel
>

-- 
Brian D. Ripley,  

Re: [R] Sort a dataframe

2005-11-07 Thread Andrew Robinson
Try: 

dataframeName <- dataframeName[order(dataframeName$ColumnName),]
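
A couple of variations on the same idea, for the archive (the data frame and
column names are invented):

df <- data.frame(State = c("TX", "AL", "TX"), Income = c(30776, 81240, 28035))

df[order(df$Income), ]                      # ascending on one column
df[order(df$Income, decreasing = TRUE), ]   # descending
df[order(df$State, df$Income), ]            # by State, then Income within State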

Cheers

Andrew

On Wed, Nov 09, 2005 at 12:43:13PM +0800, Abd. Rahman Kassim wrote:
> 
> Dear All,
> 
> How can I sort a data frame (using one of the columns)?
> 
> Thanks for your support.
> 
> Regards.
> 
> Abd. Rahman Kassim (PhD)
> Head Forest Ecology Branch
> Forest Management & Ecology Program
> Forestry and Conservation Division
> Forest Research Institute Malaysia
> Kepong 52109
> Selangor, Malaysia
> 
> 

-- 
Andrew Robinson
Senior Lecturer in Statistics   Tel: +61-3-8344-9763
Department of Mathematics and StatisticsFax: +61-3-8344-4599
University of Melbourne, VIC 3010 Australia
Email: [EMAIL PROTECTED]Website: http://www.ms.unimelb.edu.au



[R] Sort a dataframe

2005-11-07 Thread Abd. Rahman Kassim

Dear All,

How can I sort a data frame (using one of the columns)?

Thanks for your support.

Regards.

Abd. Rahman Kassim (PhD)
Head Forest Ecology Branch
Forest Management & Ecology Program
Forestry and Conservation Division
Forest Research Institute Malaysia
Kepong 52109
Selangor, Malaysia




[R] Simple Nesting question/Odd error message

2005-11-07 Thread Jarrett Byrnes
I'm attempting to analyze some survey data comparing multiple docks.  I 
surveyed all of the slips within each dock, but as slips are nested 
within docks, getting multiple samples per slip, and don't really 
represent any meaningful gradient, slip is a random effect.  There are 
also an unequal number of slips at each dock.

I'm having syntactical issues, however.  When I try

dock.lme<-lme(X.open ~ Dock, random=Slip|~Dock, data=my.data)

I get

Error in inherits(object, "reStruct") : Object "Slip" not found

I'm uncertain as to what this means, as Slip most certainly is there 
(yes, I checked).  Any pointers on this error, or on the syntax for this 
kind of model in general, would be appreciated.

And while I'm on it, is there a way to fit models with random effects 
using least squares instead of REML?  Thanks!
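
A minimal sketch of the usual nlme syntax for this kind of model (untested,
using the variable names from the post): the random part is a one-sided
formula with the grouping factors after "|", and method = "ML" switches from
REML to maximum likelihood.

library(nlme)
dock.lme <- lme(X.open ~ Dock, random = ~ 1 | Dock/Slip,
                data = my.data, method = "ML")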

-Jarrett



Re: [R] multidimensional integration not over a multidimensionalrectangle

2005-11-07 Thread Spencer Graves
  The (non)response that I've seen to your question combined with my 
own search suggests that R does not yet have a multidimensional 
numerical integration function of the generality you desire.

  Would you care to tell us more about the problem you are trying to 
solve?  For example, if it can be parameterized as an integral over R^k 
where the integrand is unimodal, you could approximate the log of the 
integrand with a parabola and then do Monte Carlo from the obvious approximation of 
that integral by a multivariate normal.  For each selected point, you 
compute the ratio of your integrand to the approximating normal density. 
  Then compute mean and standard deviation of these ratios.  The mean is 
the evaluation of the integral, and stdev/sqrt(N) estimates the error.  For 
more information, see any good reference on "importance sampling", e.g.:

  Evans and Swartz (2000) Approximating Integrals via Monte Carlo and 
Deterministic Methods (Oxford).
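
  For the archive, a rough sketch of that recipe in two dimensions; the
integrand f, the spread s of the normal approximation and the sample size N
are invented for illustration.

f <- function(x) exp(-sum(x^4))   # example integrand on R^2, unimodal at 0
s <- 1                            # spread of the normal approximation g
N <- 100000

x  <- matrix(rnorm(2 * N, mean = 0, sd = s), ncol = 2)  # draws from g
gx <- dnorm(x[, 1], sd = s) * dnorm(x[, 2], sd = s)     # density of g at the draws
w  <- apply(x, 1, f) / gx                               # importance ratios f/g

mean(w)           # estimate of the integral
sd(w) / sqrt(N)   # estimated Monte Carlo standard error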

  hope this helps.
  spencer graves

Lynette Sun wrote:

> Hi,
> 
> anyone knows about any functions in R can get multidimensional integration
> not over a multidimensional rectangle (not adapt).
> 
> For example, I tried the following function f(x,n)=x^n/n!
> 
> phi.fun<-function(x,n)
> { if (n==1) {
>   x
>   }else{
>   integrate(phi.fun, lower=0, upper=x, n=n-1)$value
>   }
> }
> 
> I could get f(4,2)=4^2/2!=8, but failed in f(4,3)=4^3/3! Thanks
> 
> Best,
> Lynette
> 

-- 
Spencer Graves, PhD
Senior Development Engineer
PDF Solutions, Inc.
333 West San Carlos Street Suite 700
San Jose, CA 95110, USA

[EMAIL PROTECTED]
www.pdf.com 
Tel:  408-938-4420
Fax: 408-280-7915



Re: [R] Modifying Internal C Files

2005-11-07 Thread Ken Kelley
Hi Duncan and others.

Thanks for your insight. Actually, I did change the function name in the 
pnbeta2.c code (I should have said so). When I check the package I get 
no errors. I build the package and all seems well. When I install and 
then load the package in R, I get:

 > pnbeta2
function (x, a, b, lambda, low.tail, log.p)
{
 res <- .Internal("pnbeta2", as.double(x), as.double(a), 
as.double(b), as.double(lambda), as.integer(low.tail), as.integer(log.p))
 return(list(res))
}


which seems good, but the function does not work:

pnbeta2(.1, 10, 25, 2, TRUE, FALSE)
Error in pnbeta2(0.1, 10, 25, 2, TRUE, FALSE) :
 7 arguments passed to '.Internal' which requires 1

When I try to run the .Internal file directly is perhaps where my 
problems begin:
 > .Internal(pnbeta2(.1, 10, 25, 2, TRUE, FALSE))
Error in .Internal(pnbeta2(0.1, 10, 25, 2, TRUE, FALSE)) :
 no internal function "pnbeta2"

But, when I do the same thing to the pnbeta internal function (which 
pnbeta2 is identical to at this point) I get the result:
 > .Internal(pnbeta(.1, 10, 25, 2, TRUE, FALSE))
[1] 0.0006837318

So, I'm at a loss for what is going on in this situation. My pnbeta2 
internal function doesn't seem to be there. What I want to do is simple 
(modify an internal C file), but it is proving to be quite difficult for 
me to implement.

Thanks for any thoughts,
Ken


Duncan Murdoch wrote:
> On 11/7/2005 5:17 PM, Ken Kelley wrote:
> 
>> Hi All.
>> I want to tweak a few minor things inside of internal C code. I have 
>> my Win. XP machine set-up to build packages (including C code), but 
>> I'm having problems getting the package to run correctly. In 
>> particular, I want to modify a few things inside of pnbeta.c (namely 
>> errmax and itrmax), which is what the pbeta() function calls upon when 
>> there is a noncentral parameter. I copied the pnbeta.c C code, changed 
>> its name [to pnbeta2.c], included the nmath.h, dpq.h files, lgamma.c, 
>> and pbeta.c in my src folder (since the .h files were called upon and 
>> the .c files were). I then created an R function that I thought would 
>> call upon the new C code:
>>
>> pnbeta2 <- function(x, a, b, lambda, low.tail, log.p)
>> {
>> res <- .C("pnbeta2", as.double(x), as.double(a), as.double(b), 
>> as.double(lambda), as.integer(low.tail), as.integer(log.p))
>> return(list(res))
>> }
>>
>> But after I built the package and loaded it, the function doesn't 
>> work (it isn't even recognized). I have no idea why this is failing. 
>> Any information to help me figure out what I need to do to modify 
>> internal C code generally, or specifically as it applies to this 
>> scenario would be most helpful.
> 
> 
> You didn't say that you changed the name of the function, only the file 
> that contained it.  If this wasn't an oversight, then you should put
> 
> double pnbeta2(double x, double a, double b, double lambda,
>   int lower_tail, int log_p)
> 
> in the appropriate place in your pnbeta2.c file.
> 
> The other thing you should do is to add a PACKAGE argument to your .C 
> call, just in case there is already a pnbeta2 function somewhere else 
> (or will be some day).  In fact, if you do this, there should be no need 
> to change the name of the function:  R will look in the package DLL 
> rather than R.dll to find it.  Just make sure that the name in the .C 
> call matches the declared name in the source.
> 
> Duncan Murdoch
> 

-- 
Ken Kelley, Ph.D.
Inquiry Methodology Program
Indiana University
201 North Rose Avenue, Room 4004
Bloomington, Indiana 47405
http://www.indiana.edu/~kenkel



Re: [R] writing R shell scripts?

2005-11-07 Thread Duncan Temple Lang


This has been done and I said I would integrate it into R.
It is on my list of things to do relatively soon. I have never
been entirely happy with the way arguments are handled and that is what
has put things on hold. And now that the R startup is a lot quicker,
this is more worthwhile.

Mike Miller wrote:
> I'm new to the list.  I've used R and S-PLUS a bit for about 15 years but 
> am now working to make R my main program for all numerical and statistical 
> computing.  I also use Octave for this kind of work and I recommend it (it 
> is also under the GPL).  Here's my question:  In Octave I can write shell 
> scripts in the Linux/UNIX environment that begin with a line like this...
> 
> #!/usr/local/bin/octave -q
> 
> ...and the remaining lines are octave commands.  Is it possible to do this 
> sort of thing in R using something like this?:
> 
> #!/usr/lib/R/bin/R.bin
> 
> Well, that isn't quite it because I tried it and it didn't work!
> 
> Any advice greatly appreciated.  Thanks in advance.
> 
> Mike
> 

--
Duncan Temple Lang[EMAIL PROTECTED]
Department of Statistics  work:  (530) 752-4782
371 Kerr Hall fax:   (530) 752-7099
One Shields Ave.
University of California at Davis
Davis, CA 95616, USA



Re: [R] reduce levels

2005-11-07 Thread Bill.Venables
sub$mm <- factor(sub$mm)

is the simplest way to change a single factor in this way.  To do a
whole data frame you just need a loop:

drop_unused_levels <- function(data)
    as.data.frame(lapply(data, function(x) if (is.factor(x)) factor(x) else x))

Here's your example again, but with a slightly different idiom...

test <- data.frame(mm = letters[1:3])
sub <- subset(test, mm == "b")

fixedSub <- drop_unused_levels(sub)
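
The effect shows up in the levels:

levels(sub$mm)        # still "a" "b" "c"
levels(fixedSub$mm)   # just "b"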

Bill Venables.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Claus Atzenbeck
Sent: Tuesday, 8 November 2005 9:52 AM
To: R-help
Subject: [R] reduce levels


Hi all:

I have an example that shows my problem:

> test <- data.frame(c("a", "b", "c"))
> colnames(test) <- "mm"
> sub <- subset(test, mm=="b")
> sub$mm
[1] b
Levels: a b c
> levels(sub$mm)
[1] "a" "b" "c"

How can I reduce the levels to exclusively those which are part of the
data frame? That is in the above named example exclusively "b".

Reason: I want to iterate exclusively through those levels that exist
within a subset, but leave away all others.

Thanks for any hint.
Claus



[R] Poisson/negbin followed by jackknife

2005-11-07 Thread Jeffrey Stratford
Folks,

Thanks for the help with the hier.part analysis.  All the problems
stemmed from an import problem which was solved with file.choose().

Now that I have the variables that I'd like to use I need to run some
GLM models.  I think I have that part under control but I'd like to use
a jackknife approach to model validation (I was using a hold out sample
but this seems to have fallen out of favor).  

I'd appreciate it if someone could just point me in the right direction
for the jackknife analysis given a particular distribution, coefficients,
etc.

Thanks,

Jeff


Jeffrey A. Stratford, Ph.D.
Postdoctoral Associate
331 Funchess Hall
Department of Biological Sciences
Auburn University
Auburn, AL 36849
334-329-9198
FAX 334-844-9234
http://www.auburn.edu/~stratja



[R] writing R shell scripts?

2005-11-07 Thread Mike Miller
I'm new to the list.  I've used R and S-PLUS a bit for about 15 years but 
am now working to make R my main program for all numerical and statistical 
computing.  I also use Octave for this kind of work and I recommend it (it 
is also under the GPL).  Here's my question:  In Octave I can write shell 
scripts in the Linux/UNIX environment that begin with a line like this...

#!/usr/local/bin/octave -q

...and the remaining lines are octave commands.  Is it possible to do this 
sort of thing in R using something like this?:

#!/usr/lib/R/bin/R.bin

Well, that isn't quite it because I tried it and it didn't work!

Any advice greatly appreciated.  Thanks in advance.

Mike

-- 
Michael B. Miller, Ph.D.
Assistant Professor
Division of Epidemiology and Community Health
and Institute of Human Genetics
University of Minnesota
http://taxa.epi.umn.edu/~mbmiller/



Re: [R] Help on model selection using AICc

2005-11-07 Thread Spencer Graves
  I don't think AICc is available in a standard script.  A couple of 
years ago, I modified the stepAIC script from the MASS package to use 
AICc.  If and when I get time for this again, I plan to redo it using a 
more complete Bayesian analysis.

  However, if you want to use AICc, you might consider modifying 
stepAIC or downloading my "stepAIC.c" from "www.prodsyse.com".  This 
code was last used under S-Plus 6.2 and would doubtless require some 
effort to make it work under R.

  Before you do either, however, I suggest you review earlier 
discussions on this issue, especially comments by Prof. Ripley, if you 
haven't already.  You can find numerous comments by looking for "AICc", 
"AIC.c", "stepAIC.c", and "Burnham and Anderson" with RSiteSearch.

  Buena Suerte
  Spencer Graves

[EMAIL PROTECTED] wrote:

> Hi,
>I'm fitting poisson regression models to counts of birds in 
> 1x1 km squares using several environmental variables as predictors. 
> I do this in a stepwise way, using the stepAIC function. However the 
> resulting models appear to be overparametrized, since too many 
> variables were included. 
>   I would like to know if there is the possibility of fitting models 
> by steps but using the AICc instead of AIC. Or at least I wonder if it 
> would be possible to save the AIC value and number of parameters of 
> models fitted in each step and to calculate AICc afterward.
>Help on this will be very much appreciated
>German Lopez
> 

-- 
Spencer Graves, PhD
Senior Development Engineer
PDF Solutions, Inc.
333 West San Carlos Street Suite 700
San Jose, CA 95110, USA

[EMAIL PROTECTED]
www.pdf.com 
Tel:  408-938-4420
Fax: 408-280-7915



Re: [R] frequency() source code

2005-11-07 Thread Francisco J. Zagmutt



>From: Duncan Murdoch <[EMAIL PROTECTED]>
>To: bob mccall <[EMAIL PROTECTED]>
>CC: "r-help@stat.math.ethz.ch" 
>Subject: Re: [R] frequency() source code
>Date: Mon, 07 Nov 2005 18:30:25 -0500
>
>On 11/7/2005 6:11 PM, bob mccall wrote:
> >   Greetings:
> >
> > I am looking for the source code for the frequency function. I
> > grepped  the following dirs  but no luck.
> >
> > R-2.2.0/src/appl/*
> > R-2.2.0/src/main/*
> > R-2.2.0/src/nmath/*
> > R-2.2.0/src/library/stats/*
> >
> > Does anybody know the file name??
>
>Here's how I would look for it:
>
>In R, type frequency, and I get:
>
>  > frequency
>function (x, ...)
>UseMethod("frequency")
>
>
>So frequency is a generic function.  That's all the source there is.
>
>Now I'm probably interested in a particular method.  Let's say the
>default one.  So I try
>
>  > frequency.default
>Error: object "frequency.default" not found
>
>Oops, looks like it's hidden in a namespace.  Try again:
>
>  > getAnywhere("frequency.default")
>A single object matching 'frequency.default' was found
>It was found in the following places
>registered S3 method for frequency from namespace stats
>namespace:stats
>with value
>
>function (x, ...)
>if (!is.null(xtsp <- attr(x, "tsp"))) xtsp[3] else 1
>
>
>Now that's probably enough information, but if I really wanted to see
>the source (as opposed to the above deparsed version, which won't have
>any comments in it), then I'd know to look in src/library/stats/R, since
>it's in namespace:stats, and it's R code.  I'd grep or do another search
>of the files there and find it in src/library/stats/R/ts.R (and it turns
>out there aren't any comments after all, but it might be useful to look
>at the other functions in that file anyway).
>
>Now if someone were running from a binary install in Windows, they
>wouldn't have a full copy of the source directories; they need to
>download that separately from the binary builds.

...or, they can look at the source code directly at 
https://svn.r-project.org/R/trunk/src/library/stats/R/ts.R

Cheers

Francisco

>
>Duncan Murdoch
>



[R] reduce levels

2005-11-07 Thread Claus Atzenbeck
Hi all:

I have an example that shows my problem:

> test <- data.frame(c("a", "b", "c"))
> colnames(test) <- "mm"
> sub <- subset(test, mm=="b")
> sub$mm
[1] b
Levels: a b c
> levels(sub$mm)
[1] "a" "b" "c"

How can I reduce the levels to exclusively those which are part of the
data frame? That is, in the above example, exclusively "b".

Reason: I want to iterate exclusively through those levels that exist
within a subset, but leave away all others.

Thanks for any hint.
Claus



Re: [R] OLS variables

2005-11-07 Thread John Fox
Dear Brian,

I like the idea of providing support for raw polynomials in poly() and
polym(), if only for pedagogical reasons.

Regards,
 John


John Fox
Department of Sociology
McMaster University
Hamilton, Ontario
Canada L8S 4M4
905-525-9140x23604
http://socserv.mcmaster.ca/jfox 
 

> -Original Message-
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
> Sent: Monday, November 07, 2005 11:14 AM
> To: John Fox
> Cc: r-help@stat.math.ethz.ch; 'Kjetil Brinchmann halvorsen'
> Subject: RE: [R] OLS variables
> 
> On Mon, 7 Nov 2005, John Fox wrote:
> 
> > Dear Brian,
> >
> > I don't have a strong opinion, but R's interpretation seems more 
> > consistent to me, and as Kjetil points out, one can use polym() to 
> > specify a full-polynomial model. It occurs to me that ^ and 
> ** could 
> > be differentiated in model formulae to provide both.
> 
> However, poly[m] only provide orthogonal polynomials, and I 
> have from time to time considered extending them to provide 
> raw polynomials too.
> Is that a better-supported idea?
> 
> >
> > Regards,
> > John
> >
> > 
> > John Fox
> > Department of Sociology
> > McMaster University
> > Hamilton, Ontario
> > Canada L8S 4M4
> > 905-525-9140x23604
> > http://socserv.mcmaster.ca/jfox
> > 
> >
> >> -Original Message-
> >> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
> >> Sent: Monday, November 07, 2005 4:05 AM
> >> To: Kjetil Brinchmann halvorsen
> >> Cc: John Fox; r-help@stat.math.ethz.ch
> >> Subject: Re: [R] OLS variables
> >>
> >> On Sun, 6 Nov 2005, Kjetil Brinchmann halvorsen wrote:
> >>
> >>> John Fox wrote:
> 
>  I assume that you're using lm() to fit the model, and that
> >> you don't
>  really want *all* of the interactions among 20 predictors:
> >> You'd need
>  quite a lot of data to fit a model with 2^20 terms in it,
> >> and might
>  have trouble interpreting the results.
> 
>  If you know which interactions you're looking for, then why not 
>  specify them directly, as in lm(y ~  x1*x2 + x3*x4*x5 +
> >> etc.)? On the
>  other hand, it you want to include all interactions, say, up to 
>  three-way, and you've put the variables in a data frame,
> >> then lm(y ~ .^3, data=DataFrame) will do it.
> >>>
> >>> This is nice with factors, but with continuous variables,
> >> and need of
> >>> a response-surface type, of model, will not do. For 
> instance, with 
> >>> variables x, y, z in data frame dat
> >>>lm( y ~ (x+z)^2, data=dat )
> >>> gives a model mwith the terms x, z and x*z, not the square terms.
> >>> There is a need for a semi-automatic way to get these, for
> >> instance,
> >>> use poly() or polym() as in:
> >>>
> >>> lm(y ~ polym(x,z,degree=2), data=dat)
> >>
> >> This is an R-S difference (FAQ 3.3.2).  R's formula parser always 
> >> takes
> >> x^2 = x whereas the S one does so only for factors.  This 
> makes sense 
> >> it you interpret `interaction' strictly as in John's 
> description - S 
> >> chose to see an interaction of any two continuous variables as 
> >> multiplication (something which puzzled me when I first 
> encountered 
> >> it, as it was not well documented back in 1991).
> >>
> >> I have often wondered if this difference was thought to be an 
> >> improvement, or if it just a different implementation of the 
> >> Rogers-Wilkinson syntax.
> >> Should we consider changing it?
> 
> -- 
> Brian D. Ripley,  [EMAIL PROTECTED]
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford, Tel:  +44 1865 272861 (self)
> 1 South Parks Road, +44 1865 272866 (PA)
> Oxford OX1 3TG, UKFax:  +44 1865 272595
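
For the archive, a small illustration of the point under discussion (made-up
data; both fits describe the same quadratic surface):

dat <- data.frame(x = rnorm(50), z = rnorm(50))
dat$y <- 1 + dat$x + dat$z + dat$x^2 + rnorm(50)

## (x + z)^2 expands to x + z + x:z only -- no square terms -- so either
## add them explicitly, protected by I(), ...
fit1 <- lm(y ~ x + z + I(x^2) + I(z^2) + x:z, data = dat)

## ... or use orthogonal polynomial terms via polym()
fit2 <- lm(y ~ polym(x, z, degree = 2), data = dat)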



Re: [R] frequency() source code

2005-11-07 Thread Duncan Murdoch
On 11/7/2005 6:11 PM, bob mccall wrote:
>   Greetings:
>  
> I am looking for the source code for the frequency function. I 
> grepped  the following dirs  but no luck. 
> 
> R-2.2.0/src/appl/*
> R-2.2.0/src/main/*
> R-2.2.0/src/nmath/*
> R-2.2.0/src/library/stats/*
> 
> Does anybody know the file name??

Here's how I would look for it:

In R, type frequency, and I get:

 > frequency
function (x, ...)
UseMethod("frequency")


So frequency is a generic function.  That's all the source there is.

Now I'm probably interested in a particular method.  Let's say the 
default one.  So I try

 > frequency.default
Error: object "frequency.default" not found

Oops, looks like it's hidden in a namespace.  Try again:

 > getAnywhere("frequency.default")
A single object matching 'frequency.default' was found
It was found in the following places
   registered S3 method for frequency from namespace stats
   namespace:stats
with value

function (x, ...)
if (!is.null(xtsp <- attr(x, "tsp"))) xtsp[3] else 1


Now that's probably enough information, but if I really wanted to see 
the source (as opposed to the above deparsed version, which won't have 
any comments in it), then I'd know to look in src/library/stats/R, since 
it's in namespace:stats, and it's R code.  I'd grep or do another search 
of the files there and find it in src/library/stats/R/ts.R (and it turns 
out there aren't any comments after all, but it might be useful to look 
at the other functions in that file anyway).

Now if someone were running from a binary install in Windows, they 
wouldn't have a full copy of the source directories; they need to 
download that separately from the binary builds.

Duncan Murdoch



[R] frequency() source code

2005-11-07 Thread bob mccall
  Greetings:
 
I am looking for the source code for the frequency function. I 
grepped  the following dirs  but no luck. 

R-2.2.0/src/appl/*
R-2.2.0/src/main/*
R-2.2.0/src/nmath/*
R-2.2.0/src/library/stats/*

Does anybody know the file name??
Thanks,
Bob






Re: [R] Modifying Internal C Files

2005-11-07 Thread Duncan Murdoch
On 11/7/2005 5:17 PM, Ken Kelley wrote:
> Hi All.
> I want to tweak a few minor things inside of internal C code. I have my 
> Win. XP machine set-up to build packages (including C code), but I'm 
> having problems getting the package to run correctly. In particular, I 
> want to modify a few things inside of pnbeta.c (namely errmax and 
> itrmax), which is what the pbeta() function calls upon when there is a 
> noncentral parameter. I copied the pnbeta.c C code, changed its name [to 
> pnbeta2.c], included the nmath.h, dpq.h files, lgamma.c, and pbeta.c in 
> my src folder (since the .h files were called upon and the .c files 
> were). I then created an R function that I thought would call upon the 
> new C code:
> 
> pnbeta2 <- function(x, a, b, lambda, low.tail, log.p)
> {
> res <- .C("pnbeta2", as.double(x), as.double(a), as.double(b), 
> as.double(lambda), as.integer(low.tail), as.integer(log.p))
> return(list(res))
> }
> 
> But after I built the package and loaded it, the function doesn't 
> work (it isn't even recognized). I have no idea why this is failing. Any 
> information to help me figure out what I need to do to modify internal C 
> code generally, or specifically as it applies to this scenario would be 
> most helpful.

You didn't say that you changed the name of the function, only the file 
that contained it.  If this wasn't an oversight, then you should put

double pnbeta2(double x, double a, double b, double lambda,
  int lower_tail, int log_p)

in the appropriate place in your pnbeta2.c file.

The other thing you should do is to add a PACKAGE argument to your .C 
call, just in case there is already a pnbeta2 function somewhere else 
(or will be some day).  In fact, if you do this, there should be no need 
to change the name of the function:  R will look in the package DLL 
rather than R.dll to find it.  Just make sure that the name in the .C 
call matches the declared name in the source.
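
A minimal sketch of that suggestion applied to the function above (the
package name "mypkg" is invented):

pnbeta2 <- function(x, a, b, lambda, low.tail, log.p)
{
    res <- .C("pnbeta2", as.double(x), as.double(a), as.double(b),
              as.double(lambda), as.integer(low.tail), as.integer(log.p),
              PACKAGE = "mypkg")   # look up the symbol only in this package's DLL
    return(list(res))
}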

Duncan Murdoch



[R] Modifying Internal C Files

2005-11-07 Thread Ken Kelley
Hi All.
I want to tweak a few minor things inside of internal C code. I have my 
Win. XP machine set-up to build packages (including C code), but I'm 
having problems getting the package to run correctly. In particular, I 
want to modify a few things inside of pnbeta.c (namely errmax and 
itrmax), which is what the pbeta() function calls upon when there is a 
noncentral parameter. I copied the pnbeta.c C code, changed its name [to 
pnbeta2.c], included the nmath.h, dpq.h files, lgamma.c, and pbeta.c in 
my src folder (since the .h files were called upon and the .c files 
were). I then created an R function that I thought would call upon the 
new C code:

pnbeta2 <- function(x, a, b, lambda, low.tail, log.p)
{
res <- .C("pnbeta2", as.double(x), as.double(a), as.double(b), 
as.double(lambda), as.integer(low.tail), as.integer(log.p))
return(list(res))
}

But after I built the package and loaded it, the function doesn't 
work (it isn't even recognized). I have no idea why this is failing. Any 
information to help me figure out what I need to do to modify internal C 
code generally, or specifically as it applies to this scenario would be 
most helpful.

Thanks,
Ken

-- 
Ken Kelley, Ph.D.
Inquiry Methodology Program
Indiana University
201 North Rose Avenue, Room 4004
Bloomington, Indiana 47405
http://www.indiana.edu/~kenkel



Re: [R] pdf device and TeXencoding?

2005-11-07 Thread ivo welch

hi:  as usual, I am trying to be brief, and end up being too brief.  let me be 
more specific on [a].

$ cat test.R
luafmfiles <- c("/usr/share/texmf/fonts/afm/yandy/lubright/lbr.afm",
"/usr/share/texmf/fonts/afm/yandy/lubright/lbd.afm",
"/usr/share/texmf/fonts/afm/yandy/lubright/lbi.afm",
"/usr/share/texmf/fonts/afm/yandy/lubright/lbdi.afm",
"/usr/share/texmf/fonts/afm/yandy/lumath/lbms.afm")

grDevices::postscriptFonts(lucida=grDevices::postscriptFont("Lucida", 
metrics=luafmfiles, encoding="TeXtext.enc"));

pdf(file="test.pdf", fonts="lucida", version="1.4");
par(family="lucida");
plot( c(0,1),c(0,1) );
text( 0.5, 0.5, "this is some text" );
dev.off();

$ R CMD BATCH test.R

$ pdffonts test.pdf
name type emb sub uni object ID
  --- --- --- -
Error: Wrong type in font encoding resource differences (array)
Error: Wrong type in font encoding resource differences (array)
Error: Wrong type in font encoding resource differences (array)
Error: Wrong type in font encoding resource differences (array)
ZapfDingbats Type 1   no  no  no   5  0
HelveticaType 1   no  no  no  11  0
Helvetica-Bold   Type 1   no  no  no  12  0
Helvetica-ObliqueType 1   no  no  no  13  0
Helvetica-BoldObliqueType 1   no  no  no  14  0
Symbol   Type 1   no  no  no  15  0
LucidaBright Type 1   no  no  no  16  0
LucidaBright-DemiType 1   no  no  no  17  0
LucidaBright-Italic  Type 1   no  no  no  18  0
LucidaBright-DemiItalic  Type 1   no  no  no  19  0
LucidaNewMath-Symbol Type 1   no  no  no  20  0


$ ps2pdf13 -dPDFSETTINGS=/printer test.pdf a.pdf
Error: /typecheck in --.max--
Operand stack:
   --dict:4/4(L)--   F7   1   --dict:5/5(L)--   --dict:5/5(L)--   
--dict:11/11(ro)(G)--   --nostringval--   --nostringval--   --nostringval--   
256
Execution stack:
   %interp_exit   .runexec2   --nostringval--   --nostringval--   
--nostringval--   2   %stopped_push   --nostringval--   --nostringval--   
--nostringval--   false   1   %stopped_push   1   3   %oparray_pop   1   3   
%oparray_pop   --nostringval--   2   1   1   --nostringval--   
%for_pos_int_continue   --nostringval--   --nostringval--   --nostringval--   
%array_continue   --nostringval--   false   1   %stopped_push   --nostringval-- 
  %loop_continue   --nostringval--   --nostringval--   --nostringval--   
--nostringval--   --nostringval--   --nostringval--
Dictionary stack:
   --dict:1052/1417(ro)(G)--   --dict:0/20(G)--   --dict:73/200(L)--   
--dict:73/200(L)--   --dict:97/127(ro)(G)--   --dict:229/230(ro)(G)--   
--dict:19/24(L)--   --dict:4/6(L)--   --dict:19/20(L)--   --dict:12/13(L)--
Current allocation mode is local
ESP Ghostscript 7.07.1: Unrecoverable error, exit code 1


if I use the postscript device,  this error does not occur.  if I use the 
standard helvetica fonts, this error does not occur.  I believe this error is 
from the textext encoding used in the pdf device.


I am sorry, I know so little about (postscript) fonts.  (I also need to end up 
with all fonts embedded in my figures.)  I was really trying to avoid 
complexity by avoiding "families of fonts".  I really want the most simple 
possible access given that I have exactly one .pfb font---I just want R to 
write a string in this font to a particular location.  [if it could embed the 
font itself, it would be much better, but I believe R cannot embed fonts 
itself; instead it requires an external tool like ghostscript.]  long story, 
short moral:  my desire for simplicity here is not just my stupidity (although 
there is plenty), but other tools (e.g., ghostscript) are sometimes quiet on 
errors or just substitute other fonts without asking ('feature'), and it takes 
a lot of experimenting to get the basics right.  so, the more basic, the better.

regards,

/iaw


Prof Brian Ripley wrote:

> On Mon, 7 Nov 2005, ivo welch wrote:
>
>> [a] I believe that the pdf device does not yet fully support
>> TeXencoding.  (under R-2.2.0, the pdf file created with Textext as
>> font encoding still dies when post-processed by ghostscript.)  are
>> there any workarounds, or are there utilities that would allow a
>> TeXencoded font to be re-encoded/converted into ISOLatin, perhaps,
>> which R could then handle beautifully?
>
> 1) What does `dies' mean?
> 2) See the following R-patched NEWS entry
>
>  o  pdf() was not writing details of the encoding to the file
> correctly.  (Spotted by Alexey Shipunov in Russian encodings.)
>
> so this may well be solved in current R (R-patched/R-devel).
>
>> [b] is there a way 

Re: [R] pdf device and TeXencoding?

2005-11-07 Thread Prof Brian Ripley
On Mon, 7 Nov 2005, ivo welch wrote:

> [a] I believe that the pdf device does not yet fully support 
> TeXencoding.  (under R-2.2.0, the pdf file created with Textext as font 
> encoding still dies when post-processed by ghostscript.)  are there any 
> workarounds, or are there utilities that would allow a TeXencoded font 
> to be re-encoded/converted into ISOLatin, perhaps, which R could then 
> handle beautifully?

1) What does `dies' mean?
2) See the following R-patched NEWS entry

 o  pdf() was not writing details of the encoding to the file
correctly.  (Spotted by Alexey Shipunov in Russian encodings.)

so this may well be solved in current R (R-patched/R-devel).

> [b] is there a way to use an arbitrary postscript font and position it 
> into a ps or pdf graphic (i.e., without it being in the family {5 fonts} 
> that I am using for the main drawing)?  in a weird latex/R mix, my 
> intent is something like

Yes, see par(family=) and (in 2.2.x)  ?postscriptFont to create a family.

>   pdf(file="test.pdf");
>   plot( c(0,1),c(0,1) );
>   test.font = fontis("postscriptfont.pfb");  # may need a tfm 
> specification, too?
>   text( 0.5, 0.5, fontobject("some text", test.font));
>   dev.off();
>
> help would be highly appreciated, as always.

This area is all under development as we allow for CJK fonts.  In R-devel, 
something like

myfam <- Type1Font("test", rep("postscriptfont.afm"), 4)
pdf(file="test.pdf", fonts=myfam)
plot( c(0,1),c(0,1) )
text( 0.5, 0.5, "some text", family=myfam)
dev.off()

will work (and with amendments if this is a TeX-encoded font).


-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595



Re: [R] R-2.2.0: aggregate problem

2005-11-07 Thread Prof Brian Ripley
On Mon, 7 Nov 2005 [EMAIL PROTECTED] wrote:

>> aggregate(as.ts(c(1,2,3,4,5,6,7,8,9,10)),1/2,mean)
> Time Series:
> Start = 1
> End = 9
> Frequency = 0.5
> [1] 1.5 3.5 5.5 7.5 9.5
>> aggregate(as.ts(c(1,2,3,4,5,6,7,8,9,10)),1/5,mean)
> Error in sprintf(gettext(fmt, domain = domain), ...) :
>   use format %f, %e or %g for numeric objects
>>
>
> I must be doing something wrong, but what?

It is rounding error: 1 is just under a multiple of 1/5 (in computer 
representation).  Try

aggregate(as.ts(c(1,2,3,4,5,6,7,8,9,10)),1/5.001,mean)

Clearly it needs an internal fix (and for the reporting error), and 
R-patched now has one.
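
A minimal illustration of the rounding point:

print(1/5, digits = 17)   # 0.20000000000000001 -- slightly above 1/5
1 %% (1/5)                # not 0, but about 0.2, on a typical platform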

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595



Re: [R] Deleting Rows/Columns

2005-11-07 Thread Weiwei Shi
Put a - (minus) before the column index you want to delete,
like this:
> a<-data.frame(c(1,2,3), c(4,5,6), c(7,8,9))
> a
  c.1..2..3. c.4..5..6. c.7..8..9.
1          1          4          7
2          2          5          8
3          3          6          9
> a[,-1]
  c.4..5..6. c.7..8..9.
1          4          7
2          5          8
3          6          9
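
A couple of other idioms for the column case, for the archive (a sketch using
a made-up data frame):

b <- data.frame(x = 1:3, y = 4:6, z = 7:9)
subset(b, select = -y)   # drop column "y" by name (subset() works on columns too)
b$z <- NULL              # remove column "z" from b in place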

HTH

On 11/7/05, Xiaofan Li <[EMAIL PROTECTED]> wrote:
> Sorry to bother the group but I am wondering if there are some official ways
> to delete a row/column, i.e., some functions of dataTable manipulation? For
> rows operation I use subset() but what about columns?
>
>
>
> Any advice is welcome and I will be more than grateful if somebody could
> make a summary on this issue.
>
>
>
> Xiaofan
>
>
>
> -
>
> Xiaofan Li
>
> Cambridge Computational Biology Institute
>
> Department of Applied Mathematics and Theoretical Physics
>
> University of Cambridge, CB3 0WA, United Kingdom
>
> Tel +44 7886 614030, Email [EMAIL PROTECTED]
>
>
>
>
>
>


--
Weiwei Shi, Ph.D

"Did you always know?"
"No, I did not. But I believed..."
---Matrix III



[R] [OTAnn] Feedback

2005-11-07 Thread shenanigans
I was interested in getting feedback from current mail group users.

We have mirrored your mail list in a new application that provides a more 
aggregated and safe environment which utilizes the power of broadband.

Roomity.com v 1.5 is a web 2.01 community webapp. Our newest version adds 
broadcast video and social networking such as favorite authors and an html 
editor.

It's free to join and any feedback would be appreciated.

S.






[R] Deleting Rows/Columns

2005-11-07 Thread Xiaofan Li
Sorry to bother the group but I am wondering if there are some official ways
to delete a row/column, i.e., some functions for data table manipulation? For
row operations I use subset(), but what about columns?

 

Any advice is welcome and I will be more than grateful if somebody could
make a summary on this issue.

 

Xiaofan

 

-

Xiaofan Li

Cambridge Computational Biology Institute

Department of Applied Mathematics and Theoretical Physics

University of Cambridge, CB3 0WA, United Kingdom

Tel +44 7886 614030, Email [EMAIL PROTECTED]

 





Re: [R] slow R start up

2005-11-07 Thread Ted Harding
On 07-Nov-05 Ted Harding wrote:
> 
> On 07-Nov-05 Gabor Grothendieck wrote:
>> Or you could call those other programs from within R.  See
>> ?system
> 
> A possibility which might work, and if so would work as you intend,
> is to use a FIFO ("named pipe") or two to communicate with R.
> 
> In R, see "?connections"; in Linux, see "man mkfifo".
> 
> Then you should be able to start R up so that it reads commands
> from the FIFO, (and optionally writes output to a FIFO which can
> be read by the shell script; though it might be simpler to write
> the output to a regular file).

Here is a bare and thin skeleton, which you may be able to develop.
You'll need two windows for this demo, one for R and one for the
Linux shell, both operating in the same directory.


In the R window, start up R


In the Linux window, execute

mkfifo -m 666 Rin


In R:

Rin<-fifo("Rin",open="T")
online<-TRUE
while(online){source(Rin)}


In Linux:

cat > Rin  << EOF
x<-0.1*(0:10)
plot(x,cos(2*pi*x))
EOF
sleep 10
cat > Rin  << EOF
x<-0.1*(0:20)
plot(x,sin(2*pi*x))
online<-FALSE
EOF

You can then "rm Rin" in Linux.


That is a little script which sends commends to R via the FIFO
to do two (slightly) different things, with a pause between them.

R reads the FIFO so long as "online" is TRUE, and then stops
reading when told that "online" is FALSE. Whenever it has emptied
the FIFO for the moment (as happens at the "sleep 10" in Linux)
it simply waits in the loop for more to come through.

You could add a line in R following the "while" to

close(Rin)

or to quit() so that R closes down at that point.

From that beginning you should be able to develop scripts which
send arbitrary commands to R from Linux.

PS: This is my first venture into the FIFO world where using R
is concerned, so I may have under-exploited the possibilities
of R in this respect. It is simply based on how I've done it
with other software, and this is a very generic way to do it.

Hoping this helps,
Ted.



E-Mail: (Ted Harding) <[EMAIL PROTECTED]>
Fax-to-email: +44 (0)870 094 0861
Date: 07-Nov-05   Time: 21:03:32
-- XFMail --



Re: [R] R seems to "stall" after several hours on a long series of analyses... where to start?

2005-11-07 Thread David L. Van Brunt, Ph.D.
I'll try the memory stats function first, see what I get... I do have "top"
on OS X, so I'll try watching that more closely as well.

Great suggestions here.
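
For the archive, a minimal sketch of the pre-allocation advice Paul gives
below (object names and sizes are invented):

n.rows <- 10000
n.new  <- 5
new.cols <- matrix(NA, nrow = n.rows, ncol = n.new)   # final size, filled with NA
for (j in 1:n.new) {
    new.cols[, j] <- rnorm(n.rows)   # fill each column in place
}
## ...then attach the block once, rather than repeatedly cbind()-ing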

On 11/7/05, Paul Gilbert <[EMAIL PROTECTED]> wrote:
>
>
>
> David L. Van Brunt, Ph.D. wrote:
>
> > Great suggestions, all.
> >
> > I do have a timer in there, and it looks like the time to complete a
> loop is
> > not increasing as it goes. From your comments, I take it that suggests
> there
> > is not a memory leak. I could try scripting the loop from the shell,
> rather
> > than R, to see if that works, but will do that as a last resort as it
> will
> > require a good deal of re-writing (the loop follows some "setup" code
> that
> > builds a pretty large data set... the loop then slaps several new
> columns on
> > a copy of that data set, and analyses that...)
>
> You may find it is better to make a new array for these columns. R tends
> to make copies when you do this sort of thing, with the result that you
> have multiple copies of your original dataset. Also, define the array to
> be the final size, with NA values, rather than appending rows or columns.
> >
> > I'll still try the other platform as well, see if the same problem
> occurs
> > there.
> I'm curious to hear what you find. I doubt you will find a big
> difference, but you will find a big difference on a machine with more
> physical memory.
>
> Paul
> >
> > On 11/7/05, jim holtman <[EMAIL PROTECTED]> wrote:
> >
> >>Here is some code that I use to track the progress of my scripts. This
> >>will print out the total cpu time and the memory that is being used. You
> >>call it with 'my.stats("message")' to print out "message" on the
> console.
> >> Also, have you profiled your code to see where the time is being spent?
> >>Can you break it up into multiple runs so that you can start with a
> "fresh"
> >>version of memory?
> >> ==script===
> >>"my.stats" <- local({
> >># local variables to hold the last times
> >># first two variables are the elasped and CPU times from the last report
> >>lastTime <- lastCPU <- 0
> >>function(text = "stats", reset=F)
> >>{
> >>procTime <- proc.time()[1:3] # get current metrics
> >>if (reset){ # setup to mark timing from this point
> >>lastTime <<- procTime[3]
> >>lastCPU <<- procTime[1] + procTime[2]
> >>} else {
> >>cat(text, "-",sys.call(sys.parent())[[1]], ": <",
> >>round((procTime[1] + procTime[2]) - lastCPU,1),
> >>round(procTime[3] - lastTime,1), ">", procTime,
> >>" : ", round(memory.size()/2.^20., 1.), "MB\n")
> >>invisible(flush.console()) # force a write to the console
> >>}
> >>}
> >>})
> >> = here is some sample output=
> >>
> >>>my.stats(reset=TRUE) # reset counters
> >>>x <- runif(1e6) # generate 1M random numbers
> >>>my.stats('random')
> >>
> >>random - my.stats : < 0.3 31.8 > 96.17 11.7 230474.9 : 69.5 MB
> >>
> >>>y <- x*x+sqrt(x) # just come calculation
> >>>my.stats('calc')
> >>
> >>calc - my.stats : < 0.7 71.2 > 96.52 11.74 230514.3 : 92.4 MB
> >>
> >> You can see that memory is growing. The first number is the CPU time
> and
> >>the second (in <>) is the elapsed time.
> >> HTH
> >>
> >>
> >> On 11/7/05, David L. Van Brunt, Ph.D. <[EMAIL PROTECTED]> wrote:
> >>
> >>
> >>>Not sure where to even start on this I'm hoping there's some
> >>>debugging I
> >>>can do...
> >>>
> >>>I have a loop that cycles through several different data sets (same
> >>>structure, different info), performing randomForest growth and
> >>>predictions... saving out the predictions for later study...
> >>>
> >>>I get about 5 hours in (9%... of the planned iterations.. yikes!) and R
> >>>just
> >>>freezes.
> >>>
> >>>This happens in interactive and batch mode execution (I can see from
> the
> >>>".Rout" file that it gets about 9% through in Batch mode, and about 6%
> >>>if in
> >>>interactive mode... does that suggest memory problems?)
> >>>
> >>>I'm thinking of re-executing this same code on a different platform to
> >>>see
> >>>if that's the issue (currently using OS X)... any other suggestions on
> >>>where
> >>>to look, or what to try to get more information?
> >>>
> >>>Sorry so vague... it's a LOT of code, runs fine without error for many
> >>>iterations, so I didn't think the problem was syntax...
> >>>
> >>>--
> >>>---
> >>>David L. Van Brunt, Ph.D.
> >>>mailto: [EMAIL PROTECTED]
> >>>
> >>>[[alternative HTML version deleted]]
> >>>
> >>>__
> >>>R-help@stat.math.ethz.ch mailing list
> >>>https://stat.ethz.ch/mailman/listinfo/r-help
> >>>PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
> >>>
> >>>
> >>
> >>
> >>
> >>--
> >>Jim Holtman
> >>Cincinnati, OH
> >>+1 513 247 0281
> >>
> >>What is the problem you are trying to solve?
> >
> >
> >
> >
> >
> > --
> > ---
> > David L. Van Brunt, Ph.D.
> > mailto:[EMAIL PROTECTED]
> >
> > [[alternative HTML version deleted]]
> >
> > __

Re: [R] lattice chart: different definitions for series

2005-11-07 Thread Deepayan Sarkar
On 11/7/05, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
>
>
>
> Hi enthusiasts,
>
> Trying to create a single chart in  lattice with different plotting
> definitions for the different series (two series should be drawn with lines
> and the other without them)
>
> I am using a dataset, which includes a grouping variable e.g. clinic with
> three levels, the variable "year" and a continuous variable: "mct".
>
> In the graph the variable "year" is in the x axis, with "mct" represented
> in the y axis.
>
> The diagram should include two line diagrams(representing two of the
> groups) , with the third group represented only with symbols(no lines).
>
> Until now I was using white lines to eliminate the lines drawn in the third
> group, but this solution is not optimal, as the grids are sometimes not
> visible
>
> sp<-list(superpose.symbol=list(pch=c(1,2,1),col=c("blue","red","green")),
>superpose.line=list(col=c("blue","red","white"),lty=c(1,2,)))
>
> ... and then including
>
> print(xyplot(mct~trend.data$year,groups=clinic,
>   scales=list(x=list(at=c(15:pno),labels=per.labels)),
>   main=main.title,
>   sub=sub.title,
>   xlab=x.label,
>   ylab=y.label,
>   xlim=c(pno-12,pno+1),
>   panel=function(x,y,...){panel.grid(h=-1,v=-1,col="grey",lty=2,cex=0.1);
>   panel.superpose(x,y,type="l",lwd=1.8,...);
>   panel.superpose(x,y,type="p",cex=1.8,...))},
>   key=sk,
>   par.settings=sp));
>
> ... was also experimenting, and searching a lot in the WWW for
>
> panel.superpose.2 and type=c("b","b","p"), but without success.

I don't know what experiments you did, but the following seems to work
fine for me:

library(lattice)

mydf <-
data.frame(year = rep(1991:2000, 3),
   mct =
   rnorm(30,
 mean = rep(1:3, each = 10),
 sd = 0.5),
   clinic = gl(3, 10))

xyplot(mct ~ year, mydf, groups = clinic,
   panel = panel.superpose.2,
   type = c("b", "b", "p"))

-Deepayan

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] percent rank by an index key?

2005-11-07 Thread t c

A couple follow up questions:

 

1. Is there any way to modify this so that non-numeric values are ignored? (As 
it is, length seems to "count" the NA values.)



  

2. In order for the cbind function  "x <- cbind(x, do.call("rbind", r))"   to 
work as intended, does the data need to be “Ordered” by State and Year? 

 

e.g. "x <- x[order(x$State,x$Year), ]"

 

Here is some sample data, with non-numeric values included:

 

Year,State,Subject,Income

2000,TX,1,30776

2000,AL,1,81240

2000,TX,2,28035

2000,AL,2,35947

2000,TX,3,42010

2000,AL,3,48830

2000,TX,4,18040

2000,AL,4,77758

2000,TX,5,20771

2000,AL,5,59132

2000,TX,6,46370

2000,AL,6,45573

2000,TX,7,57256

2000,AL,7,83402

2000,TX,8,3780

2000,AL,8,90695

2000,TX,9,51745

2000,AL,9,4105

2000,TX,10,1154

2000,AL,10,96598

2001,TX,1,25767

2001,AL,1,37032

2001,TX,2,39848

2001,AL,2,69029

2001,TX,3,17142

2001,AL,3,92850

2001,TX,4,62939

2001,AL,4,82730

2001,TX,5,30708

2001,AL,5,25339

2001,TX,6,64710

2001,AL,6,44541

2001,TX,7,96699

2001,AL,7,9151

2001,TX,8,57793

2001,AL,8,20981

2001,TX,9,12523

2001,AL,9,36139

2001,TX,10,53553

2001,AL,10,3767

2002,TX,1,55232

2002,AL,1,54655

2002,TX,2,76255

2002,AL,2,53581

2002,TX,3,77030

2002,AL,3,34869

2002,TX,4,98956

2002,AL,4,60332

2002,TX,5,33052

2002,AL,5,12348

2002,TX,6,96057

2002,AL,6,24509

2002,TX,7,66177

2002,AL,7,45952

2002,TX,8,73331

2002,AL,8,35813

2002,TX,9,3014

2002,AL,9,57097

2002,TX,10,83657

2002,AL,10,91640

2003,TX,1,5638

2003,AL,1,17026

2003,TX,2,66902

2003,AL,2,71080

2003,TX,3,88195

2003,AL,3,95415

2003,TX,4,13028

2003,AL,4,49123

2003,TX,5,19867

2003,AL,5,22990

2003,TX,6,67639

2003,AL,6,69435

2003,TX,7,62469

2003,AL,7,59939

2003,TX,8,24874

2003,AL,8,44829

2003,TX,9,77180

2003,AL,9,68488

2003,TX,10,80686

2003,AL,10,72622

2004,TX,1,46854

2004,AL,1,62499

2004,TX,2,20461

2004,AL,2,53834

2004,TX,3,54909

2004,AL,3,69527

2004,TX,4,33066

2004,AL,4,78035

2004,TX,5,23569

2004,AL,5,59757

2004,TX,6,44514

2004,AL,6,41223

2004,TX,7,85665

2004,AL,7,91972

2004,TX,8,30073

2004,AL,8,90642

2004,TX,9,32741

2004,AL,9,97111

2004,TX,10,8093

2004,AL,10,20077

2005,TX,1,48377

2005,AL,1,88216

2005,TX,2,35752

2005,AL,2,74897

2005,TX,3,27772

2005,AL,3,88945

2005,TX,4,86512

2005,AL,4,88422

2005,TX,5,27488

2005,AL,5,21140

2005,TX,6,35777

2005,AL,6,32772

2005,TX,7,77477

2005,AL,7,98282

2005,TX,8,73346

2005,AL,8,38943

2005,TX,9,38947

2005,AL,9,70195

2005,TX,10,23890

2005,AL,10,84020

2000,TX,11,na

2005,AL,11,null

 

 

 

 

 


Sundar Dorai-Raj <[EMAIL PROTECTED]> wrote: 

t c wrote:
> What is the easiest way to calculate a percent rank “by” an index key?
> 
> 
> 
> For example, I have a dataset with 3 fields:
> 
> 
> 
> Year, State, Income ,
> 
> 
> 
> I wish to calculate the rank, by year, by state.
> 
> I also wish to calculate the “percent rank”, where I define percent rank as 
> rank/n.
> 
> 
> 
> (n is the number of numeric data points within each date-state grouping.)
> 
> 
> 
> 
> 
> This is what I am currently doing:
> 
> 
> 
> 1. I create a “group by” field by using the paste function to combine date 
> and state into a field called date_state. I then use the rank function to 
> calculate the rank by date, by state. 
> 
> 
> 
> 2. I then add a field called “one” that I set to 1 if the value in income is 
> numeric and to 0 if it is not.
> 
> 
> 
> 3. I then take an aggregate sum of “one”. This gives me a count (n) for each 
> date-state grouping.
> 
> 
> 
> 
> 
> 4. I next use merge to add this count to the table.
> 
> 
> 
> 5. Finally, I calculate the percent rank.
> 
> 
> 
> Pr<-rank/n
> 
> 
> 
> The merge takes quite a bit of time to process. 
> 
> 
> 
> Is there an easier/more efficient way to calculate the percent rank?
> 

How about using ?by:

set.seed(100)
# fake data set, replace with your own
# "Subject" is just a dummy to produce replicates
x <- expand.grid(Year = 2000:2005,
                 State = c("TX", "AL"),
                 Subject = 1:10)
x$Income <- floor(runif(NROW(x)) * 10)

r <- by(x$Income, x[c("Year", "State")],
        function(x) {
          r <- rank(x)
          n <- length(x)
          cbind(Rank = r, PRank = r/n)
        })
x <- cbind(x, do.call("rbind", r))

HTH,

--sundar




Re: [R] Newbie on functions

2005-11-07 Thread Ted Harding
On 07-Nov-05 Angelo Secchi wrote:
> 
> Hi,
> I'm trying to write a simple function like
> 
> case1 <- function (m, cov, Q, R) {
>   theta <- (acos(R/sqrt(Q^3)))
>   beta  <- (-2)*sqrt(Q)*cos(theta/3)+m[1]/3
>   rho1  <- (-2)*sqrt(Q)*cos((theta+2*pi)/3)+m[1]/3
>   rho2 <- (-2)*sqrt(Q)*cos((theta-2*pi)/3)+m[1]/3
>   stderrb   <-  deltamethod( ~(-2)*sqrt(Q)*cos(theta/3)+x1/3,m,cov)
>   stderrr1  <-  deltamethod( ~(-2)*sqrt(Q)*cos((theta+2*pi)/3)+x1/3, m, cov)
>   stderrr2  <-  deltamethod( ~(-2)*sqrt(Q)*cos((theta-2*pi)/3)+x1/3, m, cov)
>   stderr    <-  c(stderrb, stderrr1, stderrr2)
>   results   <- c(beta,rho1,rho2,stderr)
>   results2  <- t(results)
>   results2
> }
> 
> When I call the function in an IF statement like
> 
> if (Q^3>R^2) results2 <- case1() else print('ciccio')
> 
> I get 
> 
> Error in eval(expr, envir, enclos) : Object "theta" not found
> 
> I do not understand why, any help?

Because when the function 'case1' is called as "case1()" it
has no information on the values of m, cov, Q and R since
it only looks in its argument list for these values (and
not in the environment from which you called it).

It would be different if, in the function definition, you
had assigned default values for arguments not mentioned in
the function call, since it would then use these values
instead, for example

  case1 <- function (m=1, cov=1, Q=1, R=1) {
  ...
  }

but probably you do not want to do anything like that in this case.
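
In other words, the call needs to supply the values explicitly (a sketch,
assuming m, cov, Q and R already exist in the calling environment):

  if (Q^3 > R^2) results2 <- case1(m, cov, Q, R) else print('ciccio')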

Best wishes,
Ted.



E-Mail: (Ted Harding) <[EMAIL PROTECTED]>
Fax-to-email: +44 (0)870 094 0861
Date: 07-Nov-05   Time: 20:14:07
-- XFMail --

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] solve the quadratic equation ax^2+bx+c=0

2005-11-07 Thread Ted Harding
On 05-Nov-05 Yuying Shi wrote:
> If I have matrices as follows:
>> a <- c(1,1,0,0)
>> b <- c(4,4,0,0)
>> c <- c(3,5,5,6)
> How can I use R code to solve the equation ax^2+bx+c=0.
> thanks!
>  yuying shi

Here is a solution, using the more interesting example in an
earlier mail of yours:

  a b c
  1 4 3
  1 4 5
  0 2 5
  0 0 6

  qs <- function(a,b,c){
    a <- as.complex(a); b <- as.complex(b); c <- as.complex(c)
    i2 <- (a!=0); i1 <- ((a==0)&(b!=0))
    solns <- as.complex(rep(NA,length(a)))
    solns <- cbind(solns,solns); colnames(solns) <- c("soln 1","soln 2")
    a2 <- a[i2]; b2 <- b[i2]; c2 <- c[i2]
    solns[i2,1] <- (-b2 + sqrt(b2^2 - 4*a2*c2))/(2*a2)
    solns[i2,2] <- (-b2 - sqrt(b2^2 - 4*a2*c2))/(2*a2)
    b1 <- b[i1]; c1 <- c[i1]
    solns[i1,1] <- (-c1)/b1
    solns
  }

  a<-c(1,1,0,0); b<-c(4,4,2,0); c<-c(3,5,5,6)

  qs(a,b,c)
       soln 1   soln 2
[1,]  -1.0+0i   -3+0i
[2,]  -2.0+1i   -2-1i
[3,]  -2.5+0i      NA
[4,]       NA      NA


Check that a*s^2 + b*s + c = 0:

  s1<-solns[,1]; s2<-solns[,2]

  diag(cbind(a,b,c)%*%rbind(s1^2,s1,c(1,1,1,1)))
[1]  0.00e+00-2.449213e-16i -8.881784e-16-1.776357e-15i
[3]  0.00e+00+0.00e+00i  NA

  diag(cbind(a,b,c)%*%rbind(s2^2,s2,c(1,1,1,1)))
[1]  1.776357e-15-2.204291e-15i -8.881784e-16+1.776357e-15i
[3]  NA  NA

which is as close to 0 as you can expect to get.

Hoping this helps,
Ted.



E-Mail: (Ted Harding) <[EMAIL PROTECTED]>
Fax-to-email: +44 (0)870 094 0861
Date: 07-Nov-05   Time: 20:04:00
-- XFMail --

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] repeated values, nlme, correlation structures

2005-11-07 Thread Patrick Giraudoux
Dear listers,

As an exercise, I am trying to fit a logistic model with nlme.  Blue tit 
pulli (young) were weighed occasionally (for field reasons) over 
time in 17 nestboxes. Individuals were not identified, but their age was 
known. This means that for a given age several measurements were made, 
but individuals could not be identified from one time to the next. This 
makes repeated values for a given age group in each nestbox. The aim is 
to get an acceptable growth curve (weight against age).

Since repeated values cannot be handled with the standard corStruct 
classes of nlme, I have done a first fit with nlme using the mean of 
each group. Comparing several models, the best fit is:

modm0c<-nlme(pds~Asym/(1+exp((xmid-age)/scal)),
fixed=list(Asym~1,xmid~1,scal~1),
random=Asym+xmid~1|nichoir,data=croispulm,
start=list(fixed=c(10,5,2.2)),
method="ML",
corr=corCAR1()
)

with pds = weight, age = mean age of each age group, nichoir = nestbox 
(a factor of 17 levels)

Based on the empirical autocorrelation function of the normalised 
residuals drawn from this model, one can acceptably assume that the 
normalised residuals behave like uncorrelated noise.

Though this could be quite satisfying at first sight, I am quite 
frustrated with starting from the mean weight of each age group, and thus 
not being able to incorporate the variability around the 
mean weight of each age group in the model. My bible is Pinheiro & Bates 
(2000), but I did not find an example that addresses this problem.

Is there an affordable way (i.e. not too complicated for a biologist 
familiar with some general statistics) to handle this in nlme?

Any hint?

Patrick

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] pdf device and TeXencoding?

2005-11-07 Thread ivo welch

Dear R wizards:

[a] I believe that the pdf device does not yet fully support the TeXtext encoding.  
(under R-2.2.0, the pdf file created with TeXtext as the font encoding still dies 
when post-processed by ghostscript.)  Are there any workarounds, or are there 
utilities that would allow a TeXtext-encoded font to be re-encoded/converted into 
ISOLatin, perhaps, which R could then handle beautifully?

[b] Is there a way to use an arbitrary postscript font and position it into a 
ps or pdf graphic (i.e., without it being in the family {5 fonts} that I am 
using for the main drawing)?  In a weird latex/R mix, my intent is something 
like

pdf(file="test.pdf");
plot( c(0,1),c(0,1) );
test.font = fontis("postscriptfont.pfb");  # may need a tfm 
specification, too?
text( 0.5, 0.5, fontobject("some text", test.font));
dev.off();

help would be highly appreciated, as always.

sincerely,

/ivo welch

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Is R thread safe?

2005-11-07 Thread Olaf . Schenk
Dear R-help,

I would like to accelerate my R computation by using parallel OpenMP compilers
(e.g. from Pathscale) on a 2-processor AMD server and I would like to know
whether R is a thread-safe library. The main kernel of the OpenMP
parallelization is a C SEXP function that performs the computational routine in
parallel with:

***
SEXP example(SEXP list, SEXP expr, SEXP rho)
 {
   R_len_t i, n = length(list);
   SEXP ans, alocal;

   omp_lock_t lck;
   PROTECT(ans = allocVector(VECSXP, n));
   ans = allocVector(VECSXP, n);
   omp_init_lock(&lck);
#pragma omp parallel for default(none) private(i, alocal) shared(list,
lck,rho, ans, n, expr)
   for(i = 0; i < n; i++) {

  omp_set_lock(&lck);
 PROTECT(alocal = allocVector(VECSXP, 1));
 alocal = allocVector(VECSXP, 1);
 defineVar(install("x"), VECTOR_ELT(list, i), rho);
  omp_unset_lock(&lck);

  /* do computational kernel in parallel */
  alocal = eval(expr, rho);

  omp_set_lock(&lck);
 SET_VECTOR_ELT(ans, i, alocal);
 UNPROTECT(1);
  omp_unset_lock(&lck);

   }
   setAttrib(ans, R_NamesSymbol, getAttrib(list, R_NamesSymbol));
   UNPROTECT(1);
   return(ans);
}

***

The code works fine using one thread and currently breaks down with 2 threads.
I am using a recent R distribution and the complete R code is compiled with
"-openmp" and the Pathscale compiler suite.

Thanks in advance,
Olaf

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] levelplot and layout

2005-11-07 Thread Deepayan Sarkar
On 11/7/05, Andreas Lehnert <[EMAIL PROTECTED]> wrote:
>
> Dear R,
>
> I know that I cannot use filled.contour() with layout,
> so I tried levelplot. But it looks like I cannot use levelplot also.

Have you looked at help(print.trellis) ?
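
[Not from the original reply -- a minimal sketch of the help(print.trellis)
approach, using the built-in volcano matrix as stand-in data; each plot keeps
its own colorkey:]

  library(lattice)
  p1 <- levelplot(volcano)
  p2 <- levelplot(volcano[, ncol(volcano):1])
  print(p1, split = c(1, 1, 2, 1), more = TRUE)  # left slot of a 2 x 1 layout
  print(p2, split = c(2, 1, 2, 1))               # right slot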

> The only way to get more than one graphic in the window is
> by levelplot(panel=), but there is only one colorkey and I need one for
> every
> graphic.
> (I'm also not very satisfied with levelplot, because I would like to have
> a "soft" graphic like filled.contour)

That's not implemented.

Deepayan

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Newbie on functions

2005-11-07 Thread Angelo Secchi

Hi,
I'm trying to write a simple function like

case1 <- function (m, cov, Q, R) {
  theta <- (acos(R/sqrt(Q^3)))
  beta  <- (-2)*sqrt(Q)*cos(theta/3)+m[1]/3
  rho1  <- (-2)*sqrt(Q)*cos((theta+2*pi)/3)+m[1]/3
  rho2 <- (-2)*sqrt(Q)*cos((theta-2*pi)/3)+m[1]/3
  stderrb   <-  deltamethod( ~(-2)*sqrt(Q)*cos(theta/3)+x1/3,m,cov)
  stderrr1  <-  deltamethod( ~(-2)*sqrt(Q)*cos((theta+2*pi)/3)+x1/3, m, cov)
  stderrr2  <-  deltamethod( ~(-2)*sqrt(Q)*cos((theta-2*pi)/3)+x1/3, m, cov)
  stderr    <-  c(stderrb, stderrr1, stderrr2)
  results   <- c(beta,rho1,rho2,stderr)
  results2  <- t(results)
  results2
}

When I call the function in an IF statement like

if (Q^3>R^2) results2 <- case1() else print('ciccio')

I get 

Error in eval(expr, envir, enclos) : Object "theta" not found

I do not understand why, any help?
Thanks


-- 

 Angelo Secchi PGP Key ID:EA280337

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] someone knows how to title a image.plot() in a layout?

2005-11-07 Thread Francisco J. Zagmutt
What error are you getting?  What did you try?  If you want to get 
meaningful help you need to provide examples of what you did and didn't 
work.

Read the bottom of the message that you just sent and you will notice that 
we ask you to read the posting guide

Cheers

Francisco

>From: "Andreas Lehnert" <[EMAIL PROTECTED]>
>To: r-help@stat.math.ethz.ch
>Subject: [R] someone knows how to title a image.plot() in a layout?
>Date: Mon, 7 Nov 2005 16:53:18 +0100 (MET)
>
>
>Hello R,
>
>I tried to get 4 image.plot() in one layout.
>But when I try to label it with "main" or "sub" there is an error.
>What am I doing wrong?
>
>Please help
>
>Thanks, Andy
>
>--
>
>__
>R-help@stat.math.ethz.ch mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide! 
>http://www.R-project.org/posting-guide.html

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] lattice chart: different definitions for series

2005-11-07 Thread ManuelPerera-Chang




Hi enthusiasts,

Trying to create a single chart in  lattice with different plotting
definitions for the different series (two series should be drawn with lines
and the other without them)

I am using a dataset, which includes a grouping variable e.g. clinic with
three levels, the variable "year" and a continuous variable: "mct".

In the graph the variable "year" is in the x axis, with "mct" represented
in the y axis.

The diagram should include two line diagrams(representing two of the
groups) , with the third group represented only with symbols(no lines).

Until now I was using white lines to eliminate the lines drawn in the third
group, but this solution is not optimal, as the grids are sometimes not
visible

sp<-list(superpose.symbol=list(pch=c(1,2,1),col=c("blue","red","green")),
   superpose.line=list(col=c("blue","red","white"),lty=c(1,2,)))

... and then including

print(xyplot(mct~trend.data$year,groups=clinic,
  scales=list(x=list(at=c(15:pno),labels=per.labels)),
  main=main.title,
  sub=sub.title,
  xlab=x.label,
  ylab=y.label,
  xlim=c(pno-12,pno+1),
  panel=function(x,y,...){panel.grid(h=-1,v=-1,col="grey",lty=2,cex=0.1);
  panel.superpose(x,y,type="l",lwd=1.8,...);
  panel.superpose(x,y,type="p",cex=1.8,...))},
  key=sk,
  par.settings=sp));

... was also experimenting, and searching a lot in the WWW for

panel.superpose.2 and type=c("b","b","p"), but without success.

Many thanks,

Manuel

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Help with downloading clim.pact

2005-11-07 Thread Prof Brian Ripley
On Mon, 7 Nov 2005, Nelson, Gary (FWE) wrote:

> I am wondering if anyone might know why the package clim.pact won't
> install properly.  I have tried many URL sites and the same thing
> happens. I get the error message below.  I also tried downloading the
> ZIP from the CRAN site and extracting the file myself, but an error
> message (something like, "not an archive file") appears.  I operate
> through Windows.

Perhaps you are hitting a download limit?  This works for me from the 
master site.

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] solve the quadratic equation ax^2+bx+c=0

2005-11-07 Thread Christoph Buser
Hi

Have a look at ?polyroot. This might be helpful.
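
For a single quadratic, e.g. a = 1, b = 4, c = 3, note that polyroot() takes
the coefficients in increasing order of degree:

  polyroot(c(3, 4, 1))   # roots of 3 + 4*x + 1*x^2, i.e. -1 and -3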

Regards,

Christoph Buser

--
Christoph Buser <[EMAIL PROTECTED]>
Seminar fuer Statistik, LEO C13
ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
phone: x-41-44-632-4673 fax: 632-1228
http://stat.ethz.ch/~buser/
--


Yuying Shi writes:
 > If I have matrices as follows:
 > > a <- c(1,1,0,0)
 > > b <- c(4,4,0,0)
 > > c <- c(3,5,5,6)
 > How can I use R code to solve the equation ax^2+bx+c=0.
 > thanks!
 >  yuying shi
 > 
 >  [[alternative HTML version deleted]]
 > 
 > __
 > R-help@stat.math.ethz.ch mailing list
 > https://stat.ethz.ch/mailman/listinfo/r-help
 > PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Time-measurement in milliseconds

2005-11-07 Thread Prof Brian Ripley
Be aware that the measurement will itself take more than a millisecond
from an interpreted language like R.

Please see the help page for Sys.time, and its suggestion of proc.time.
for(i in 1:100) print(proc.time()[3]) suggests this takes 3ms on my box.
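
A rough sketch of that approach for timing a block of code in milliseconds
(the object names here are only for illustration):

  t0 <- proc.time()[3]              # elapsed time so far, in seconds
  x <- sum(runif(1e6))              # the code to be timed
  elapsed.ms <- (proc.time()[3] - t0) * 1000
  elapsed.ms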

On Mon, 7 Nov 2005, Sydler, Dominik wrote:

> Hi there
>
> I'm looking for a time-measurement to measure time-differences in
> milliseconds.
> On my search, I only found the following:
> - package "base": Sys.time()
>-> only second-accuracy
> - package "R.utils": System$currentTimeMillis()
>-> returns integer of milliseconds, but accuracy is only whole
> seconds too.
> At the moment I run every bit of code to measure 1000-times to be able
> to calculate time in milliseconds... ;-)
>
> Has anyone a method to get milliseconds?
>
> Thanks for any help.
>
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R (2.2.0), R-DCOM and Delphi

2005-11-07 Thread Earl F. Glynn
"Dieter Menne" <[EMAIL PROTECTED]> wrote in message
news:<[EMAIL PROTECTED]>...
> Jose Claudio Faria  terra.com.br> writes:

> Looks like a missing file in "uses". I compiled it under Delphi 5, and it
looks
> like the variants are no longer included in my "uses".
>
> Try to find out via help file where VarType is declared, add it to "uses".
And
> please tell me about the result when you got it compiled, I will add your
> finding to the archive.

Here's what worked for me:

(1) Install the new (D)COM server (the old one gives the error "Method '~'
of object '~' failed" with R 2.2.0):

 For anybody who wants to try the R(D)COM server alone, I recommend
 http://sunsite.univie.ac.at/rcom/download/RSrv200beta.exe
 It is reasonably stable and will work with R 2.2.0.


(2) In C:\Program Files\R\(D)COM Server\samples\Simple run the simple.exe to
verify the server is working.


(3) Download and unzip:  http://www.menne-biomed.de/download/RDComDelphi.zip

Like Jose Claudio Faria described, a number of warnings and errors will be
seen in Delphi 7 in compiling the unmodified RDCom.dpr project.

To fix the compilation errors, add the "Variants" unit to the uses:

uses
  Windows, Messages, SysUtils,Classes, Dialogs,
STATCONNECTORSRVLib_TLB,Forms, Variants;

I don't remember why Borland made this change.

You can make the Warnings go away by selecting Project | Options | Compiler
Messages and unchecking the boxes for

- Unsafe type
- Unsafe code

These warnings were introduced when Borland made changes to support .NET.
The code should be quite fine for Win32.

Note:  The Delphi source code could be modified to use {$IFDEF} directives
to take care of the differences by compiler version.  See the "Compiler
Versions" section of this page:
http://www.efg2.com/Lab/Library/Delphi/Miscellany/index.html.
Unfortunately, Borland has not been very helpful in promoting the use of a
common "Versions.INC" file or any other mechanism to take care of such
compiler version problems in a uniform way.  The file "Versions.INC" must be
updated with every new release by Borland.


(4) Run the RDCom.exe program. Select the "Run" and "Plot" buttons and see a
wonderful demo of Delphi and R working together.  (Now I can get busy and
use this in other Delphi applications with R as a "backend".)


Thanks for the work on R-DCOM and this great example!

efg
Scientific Programmer
Stowers Institute for Medical Research

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] nlme error message

2005-11-07 Thread Spencer Graves
p.s.  You may also find useful the process I followed to diagnose this 
problem.

  1.  I copied your example into R and confirmed that I could replicate 
the error.

  2.  I read the documentation, invoked debug, and tried different 
things to isolate the problem.  For example, I listed the code for 
corCompSymm.  The documentation led me to "Initialize(cs)", which gave 
me the same error message.

  3.  Ultimately, I found in Pinheiro and Bates (2000) Mixed-Effects 
Models in S and S-PLUS (Springer) an example that looked exactly like 
yours but did NOT produce the same error message to "Initialize".  By 
comparing the example that worked with the superficially identical case 
that didn't, I found the difference.

  hope this helps.
  spencer graves

###
  You need repeated measures for a random effect to make any sense.  I
modified your example as follows, and the error went away.

> mytable$RIL2 <- rep(1:4, 1:4)
> cs2 <- corCompSymm(value=0.5, form=~1|RIL2)
> model2<-lme(mytrait~myloc, data=mytable, random=~1|RIL2,
+ correlation=cs2)

  (I've made similar mistakes and had great difficulty finding the
problem.)

  spencer graves

J.Fu wrote:
> Dear Friends,
> 
> I am seeking for any help on an error message in lme 
> functions. I use mixed model to analyze a data with 
> compound symmetric correlation structure. But I get an 
> error message: "Error in corMatrix.corCompSymm(object) : 
> NA/NaN/Inf in foreign function call (arg 1)". If I change 
> the correlation structure to corAR1, then no error. I have 
> no clue how to solve this problem. I would highly 
> appreciate any help.
> Thanks in advance and looking forward to any help.
> 
> JY
> 
> 
> I attached my data and codes here:
> 
> # data: mytable
>   mytrait myloc RIL
> A1 0.590950330 0 1
> A2 -0.315469846 -1 2
> A3 -0.265690115 0 3
> A4 0.342885046 0 4
> A5 0.007613402 1 5
> A6 0.285997884 0 6
> A7 0.333841975 0 7
> A8 -0.599817735 -1 8
> A9 0.242621036 0 9
> A10 0.518959588 1 10
> 
> cs<-corCompSymm(0.5, form=~1|RIL, fixed=T)
> model<-lme(mytrait~myloc, data=mytable, random=~1|RIL, 
> na.action=na.omit, correlation=cs)
> 
> Error in corMatrix.corCompSymm(object) : NA/NaN/Inf in 
> foreign function call (arg 1)
> 
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

-- 
Spencer Graves, PhD
Senior Development Engineer
PDF Solutions, Inc.
333 West San Carlos Street Suite 700
San Jose, CA 95110, USA

[EMAIL PROTECTED]
www.pdf.com 
Tel:  408-938-4420
Fax: 408-280-7915

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Scale of plots

2005-11-07 Thread Berton Gunter
Please read the docs! ?plot  --> ?plot.default --> ylim
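
For example (made-up data; ylim forces the y axis to run from 0 to 50):

  plot(runif(100, 0, 200), ylim = c(0, 50))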

-- Bert Gunter
Genentech Non-Clinical Statistics
South San Francisco, CA
 
"The business of the statistician is to catalyze the scientific learning
process."  - George E. P. Box
 
 

> -Original Message-
> From: [EMAIL PROTECTED] 
> [mailto:[EMAIL PROTECTED] On Behalf Of Mark Miller
> Sent: Monday, November 07, 2005 7:12 AM
> To: r-help@stat.math.ethz.ch
> Subject: [R] Scale of plots
> 
> I would like to know how I can change the axis to run from 0 
> to 50 say, 
> instead of the 0 to 200 it gives me.
> 
> Many thanks
> Mark Miller
> 
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! 
> http://www.R-project.org/posting-guide.html
>

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Scale of plots

2005-11-07 Thread Wiener, Matthew
See "xlim" and "ylim" in the documentation on the plot command.
Hope this helps,

Matt Wiener
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Mark Miller
Sent: Monday, November 07, 2005 10:12 AM
To: r-help@stat.math.ethz.ch
Subject: [R] Scale of plots


I would like to know how I can change the axis to run from 0 to 50 say, 
instead of the 0 to 200 it gives me.

Many thanks
Mark Miller

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide!
http://www.R-project.org/posting-guide.html

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] OLS variables

2005-11-07 Thread Prof Brian Ripley
On Mon, 7 Nov 2005, John Fox wrote:

> Dear Brian,
>
> I don't have a strong opinion, but R's interpretation seems more consistent
> to me, and as Kjetil points out, one can use polym() to specify a
> full-polynomial model. It occurs to me that ^ and ** could be differentiated
> in model formulae to provide both.

However, poly[m] only provide orthogonal polynomials, and I have from time 
to time considered extending them to provide raw polynomials too.
Is that a better-supported idea?
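
(For reference, the raw second-degree terms can already be written out by hand
in the formula; a sketch using the x, z and dat names from the quoted example
below:)

  lm(y ~ x + z + I(x^2) + I(z^2) + x:z, data = dat)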

>
> Regards,
> John
>
> 
> John Fox
> Department of Sociology
> McMaster University
> Hamilton, Ontario
> Canada L8S 4M4
> 905-525-9140x23604
> http://socserv.mcmaster.ca/jfox
> 
>
>> -Original Message-
>> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
>> Sent: Monday, November 07, 2005 4:05 AM
>> To: Kjetil Brinchmann halvorsen
>> Cc: John Fox; r-help@stat.math.ethz.ch
>> Subject: Re: [R] OLS variables
>>
>> On Sun, 6 Nov 2005, Kjetil Brinchmann halvorsen wrote:
>>
>>> John Fox wrote:

 I assume that you're using lm() to fit the model, and that
>> you don't
 really want *all* of the interactions among 20 predictors:
>> You'd need
 quite a lot of data to fit a model with 2^20 terms in it,
>> and might
 have trouble interpreting the results.

 If you know which interactions you're looking for, then why not
 specify them directly, as in lm(y ~  x1*x2 + x3*x4*x5 +
>> etc.)? On the
 other hand, if you want to include all interactions, say, up to
 three-way, and you've put the variables in a data frame,
>> then lm(y ~ .^3, data=DataFrame) will do it.
>>>
>>> This is nice with factors, but with continuous variables,
>> and the need for
>>> a response-surface type of model, it will not do. For instance, with
>>> variables x, y, z in data frame dat
>>>lm( y ~ (x+z)^2, data=dat )
>>> gives a model with the terms x, z and x*z, not the square terms.
>>> There is a need for a semi-automatic way to get these, for
>> instance,
>>> use poly() or polym() as in:
>>>
>>> lm(y ~ polym(x,z,degree=2), data=dat)
>>
>> This is an R-S difference (FAQ 3.3.2).  R's formula parser
>> always takes
>> x^2 = x whereas the S one does so only for factors.  This
>> makes sense it you interpret `interaction' strictly as in
>> John's description - S chose to see an interaction of any two
>> continuous variables as multiplication (something which
>> puzzled me when I first encountered it, as it was not well
>> documented back in 1991).
>>
>> I have often wondered if this difference was thought to be an
>> improvement, or if it just a different implementation of the
>> Rogers-Wilkinson syntax.
>> Should we consider changing it?

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] OLS variables

2005-11-07 Thread John Fox
Dear Brian,

I don't have a strong opinion, but R's interpretation seems more consistent
to me, and as Kjetil points out, one can use polym() to specify a
full-polynomial model. It occurs to me that ^ and ** could be differentiated
in model formulae to provide both.

Regards,
 John


John Fox
Department of Sociology
McMaster University
Hamilton, Ontario
Canada L8S 4M4
905-525-9140x23604
http://socserv.mcmaster.ca/jfox 
 

> -Original Message-
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED] 
> Sent: Monday, November 07, 2005 4:05 AM
> To: Kjetil Brinchmann halvorsen
> Cc: John Fox; r-help@stat.math.ethz.ch
> Subject: Re: [R] OLS variables
> 
> On Sun, 6 Nov 2005, Kjetil Brinchmann halvorsen wrote:
> 
> > John Fox wrote:
> >>
> >> I assume that you're using lm() to fit the model, and that 
> you don't 
> >> really want *all* of the interactions among 20 predictors: 
> You'd need 
> >> quite a lot of data to fit a model with 2^20 terms in it, 
> and might 
> >> have trouble interpreting the results.
> >>
> >> If you know which interactions you're looking for, then why not 
> >> specify them directly, as in lm(y ~  x1*x2 + x3*x4*x5 + 
> etc.)? On the 
> >> other hand, if you want to include all interactions, say, up to 
> >> three-way, and you've put the variables in a data frame, 
> then lm(y ~ .^3, data=DataFrame) will do it.
> >
> > This is nice with factors, but with continuous variables, 
> and the need for 
> > a response-surface type of model, it will not do. For instance, with 
> > variables x, y, z in data frame dat
> >lm( y ~ (x+z)^2, data=dat )
> > gives a model with the terms x, z and x*z, not the square terms.
> > There is a need for a semi-automatic way to get these, for 
> instance, 
> > use poly() or polym() as in:
> >
> > lm(y ~ polym(x,z,degree=2), data=dat)
> 
> This is an R-S difference (FAQ 3.3.2).  R's formula parser 
> always takes
> x^2 = x whereas the S one does so only for factors.  This 
> makes sense if you interpret `interaction' strictly as in 
> John's description - S chose to see an interaction of any two 
> continuous variables as multiplication (something which 
> puzzled me when I first encountered it, as it was not well 
> documented back in 1991).
> 
> I have often wondered if this difference was thought to be an 
> improvement, or if it just a different implementation of the 
> Rogers-Wilkinson syntax.
> Should we consider changing it?
> 
> -- 
> Brian D. Ripley,  [EMAIL PROTECTED]
> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> University of Oxford, Tel:  +44 1865 272861 (self)
> 1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] someone knows how to title a image.plot() in a layout?

2005-11-07 Thread Andreas Lehnert

Hello R,

I tried to get 4 image.plot() in one layout.
But when I try to label it with "main" or "sub" there is an error.
What am I doing wrong?

Please help

Thanks, Andy

--

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Help with downloading clim.pact

2005-11-07 Thread Nelson, Gary (FWE)

I am wondering if anyone might know why the package clim.pact won't
install properly.  I have tried many URL sites and the same thing
happens. I get the error message below.  I also tried downloading the
ZIP from the CRAN site and extracting the file myself, but an error
message (something like, "not an archive file") appears.  I operate
through Windows.

Thanks.

Gary Nelson.

**
trying URL
'http://cran.cnr.Berkeley.edu/bin/windows/contrib/2.2/clim.pact_2.2-0.zi
p'
Content type 'application/zip' length 5857349 bytes
opened URL
downloaded 2525Kb

Error in gzfile(file, "r") : unable to open connection
In addition: Warning messages:
1: downloaded length 2586064 != reported length 5857349 
2: error 1 in extracting from zip file 
3: cannot open compressed file 'clim.pact/DESCRIPTION' 

> version
 _  
platform i386-pc-mingw32
arch i386   
os   mingw32
system   i386, mingw32  
status  
major2  
minor2.0
year 2005   
month10 
day  06 
svn rev  35749  
language R 



*
Gary A. Nelson, Ph.D
Massachusetts Division of Marine Fisheries
30 Emerson Avenue
Gloucester, MA 01930
Phone: (978) 282-0308 x114
Fax: (617) 727-3337
Email: [EMAIL PROTECTED]

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Use of paste with apply()

2005-11-07 Thread Thomas Lumley
On Sun, 6 Nov 2005, Kjetil Brinchmann halvorsen wrote:

> Why doesn't paste behave in apply as sum?
>

Because sum maps a vector of inputs to a single output, but paste does 
not, unless you use collapse=
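
A small illustration:

  m <- matrix(1:6, nrow = 2)
  apply(m, 1, sum)                     # one value per row
  apply(m, 1, paste, collapse = "-")   # collapse= makes paste return one value too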

-thomas

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R seems to "stall" after several hours on a long series o f analyses... where to start?

2005-11-07 Thread Thomas Lumley
On Mon, 7 Nov 2005, Fowler, Mark wrote:

> You can test if the problem is accumulation in memory registers, which is
> certainly what this sounds like. Just do a loop over a reasonably small
> number of iterations and store or print the time between each iteration.

If you are worried about memory accumulation it would make more sense to 
print memory use statistics after each iteration rather than the time (eg 
gc(,reset=TRUE))
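
A minimal sketch (n.iter and the loop body are placeholders):

  for (i in 1:n.iter) {
    ## ... one iteration of the analysis ...
    cat("iteration", i, "\n")
    print(gc(reset = TRUE))   # current memory use, with the max-used statistics reset
  }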

-thomas



> If
> memory accumulation is the problem, it will run optimally for the first few iterations,
> after which the time will increase noticeably (essentially exponentially),
> until it ultimately freezes up. If this is the problem you may need to switch
> to a For loop approach, or effect the loop as a DOS script to do each
> iteration as a batch job (open/close R each iteration, old bytes can't
> clutter the memory).
>
>
>>  Mark Fowler
>   Population Ecology Division
>>  Bedford Inst of Oceanography
>>  Dept Fisheries & Oceans
>>  Dartmouth NS Canada
>   B2Y 4A2
>   Tel. (902) 426-3529
>   Fax (902) 426-9710
>   Email [EMAIL PROTECTED]
>   Home Tel. (902) 461-0708
>   Home Email [EMAIL PROTECTED]
>
>
> -Original Message-
> From: David L. Van Brunt, Ph.D. [mailto:[EMAIL PROTECTED]
> Sent: November 7, 2005 7:16 AM
> To: r-help@stat.math.ethz.ch
> Subject: [R] R seems to "stall" after several hours on a long series of
> analyses... where to start?
>
> Not sure where to even start on this... I'm hoping there's some debugging I
> can do...
>
> I have a loop that cycles through several different data sets (same
> structure, different info), performing randomForest growth and
> predictions... saving out the predictions for later study...
>
> I get about 5 hours in (9%... of the planned iterations.. yikes!) and R just
> freezes.
>
> This happens in interactive and batch mode execution (I can see from the
> ".Rout" file that it gets about 9% through in Batch mode, and about 6% if in
> interactive mode... does that suggest memory problems?)
>
> I'm thinking of re-executing this same code on a different platform to see
> if that's the issue (currently using OS X)... any other suggestions on where
> to look, or what to try to get more information?
>
> Sorry so vague... it's a LOT of code, runs fine without error for many
> iterations, so I didn't think the problem was syntax...
>
> --
> ---
> David L. Van Brunt, Ph.D.
> mailto:[EMAIL PROTECTED]
>
>   [[alternative HTML version deleted]]
>
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
>
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>

Thomas Lumley   Assoc. Professor, Biostatistics
[EMAIL PROTECTED]   University of Washington, Seattle

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] how to export density_function output?

2005-11-07 Thread Dieter Menne
Alessandro Bigi  agrsci.unibo.it> writes:

> 
> Dear all,
> 
> quite a naive question: I have a data frame and I computed "the kernel 
> density estimate" with density on each column.
> Now I'd like to export in a txt file the density function output for each 
> column, but when I use write.table, I get a message "Error in 
> as.data.frame.default(x[[i]], optional = TRUE) : can't coerce class 
> "density" into a data.frame"

dens = density(c(-20,rep(0,98),20))
str(dens) # If in doubt, write str()
#List of 7
# $ x: num [1:512] -23.1 -23.0 -22.9 -22.8 -22.7 ...
# $ y: num [1:512] 4.46e-05 5.77e-05 7.40e-05 9.45e-05 1.20e-04 ...
# $ bw   : num 1.02
# $ n: int 100
# $ call : language density.default(x = c(-20, rep(0, 98), 20))
# $ data.name: chr "c(-20, rep(0, 98), 20)"
# $ has.na   : logi FALSE
# - attr(*, "class")= chr "density"
# You probably need x and y
write.table(cbind(dens$x,dens$y),file="dens.txt")

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Scale of plots

2005-11-07 Thread Mark Miller
I would like to know how I can change the axis to run from 0 to 50 say, 
instead of the 0 to 200 it gives me.

Many thanks
Mark Miller

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R seems to "stall" after several hours on a long series of analyses... where to start?

2005-11-07 Thread David L. Van Brunt, Ph.D.
Great suggestions, all.

I do have a timer in there, and it looks like the time to complete a loop is
not increasing as it goes. From your comments, I take it that suggests there
is not a memory leak. I could try scripting the loop from the shell, rather
than R, to see if that works, but will do that as a last resort as it will
require a good deal of re-writing (the loop follows some "setup" code that
builds a pretty large data set... the loop then slaps several new columns on
a copy of that data set, and analyses that...)

I'll still try the other platform as well, see if the same problem occurs
there.

On 11/7/05, jim holtman <[EMAIL PROTECTED]> wrote:
>
> Here is some code that I use to track the progress of my scripts. This
> will print out the total cpu time and the memory that is being used. You
> call it with 'my.stats("message")' to print out "message" on the console.
>  Also, have you profiled your code to see where the time is being spent?
> Can you break it up into multiple runs so that you can start with a "fresh"
> version of memory?
>  ==script===
> "my.stats" <- local({
> # local variables to hold the last times
> # first two variables are the elasped and CPU times from the last report
> lastTime <- lastCPU <- 0
> function(text = "stats", reset=F)
> {
> procTime <- proc.time()[1:3] # get current metrics
> if (reset){ # setup to mark timing from this point
> lastTime <<- procTime[3]
> lastCPU <<- procTime[1] + procTime[2]
> } else {
> cat(text, "-",sys.call(sys.parent())[[1]], ": <",
> round((procTime[1] + procTime[2]) - lastCPU,1),
> round(procTime[3] - lastTime,1), ">", procTime,
> " : ", round(memory.size()/2.^20., 1.), "MB\n")
> invisible(flush.console()) # force a write to the console
> }
> }
> })
>  = here is some sample output=
> > my.stats(reset=TRUE) # reset counters
> > x <- runif(1e6) # generate 1M random numbers
> > my.stats('random')
> random - my.stats : < 0.3 31.8 > 96.17 11.7 230474.9 : 69.5 MB
> > y <- x*x+sqrt(x) # just some calculation
> > my.stats('calc')
> calc - my.stats : < 0.7 71.2 > 96.52 11.74 230514.3 : 92.4 MB
> >
>  You can see that memory is growing. The first number is the CPU time and
> the second (in <>) is the elapsed time.
>  HTH
>
>
>  On 11/7/05, David L. Van Brunt, Ph.D. <[EMAIL PROTECTED]> wrote:
>
> > Not sure where to even start on this... I'm hoping there's some
> > debugging I
> > can do...
> >
> > I have a loop that cycles through several different data sets (same
> > structure, different info), performing randomForest growth and
> > predictions... saving out the predictions for later study...
> >
> > I get about 5 hours in (9%... of the planned iterations.. yikes!) and R
> > just
> > freezes.
> >
> > This happens in interactive and batch mode execution (I can see from the
> > ".Rout" file that it gets about 9% through in Batch mode, and about 6%
> > if in
> > interactive mode... does that suggest memory problems?)
> >
> > I'm thinking of re-executing this same code on a different platform to
> > see
> > if that's the issue (currently using OS X)... any other suggestions on
> > where
> > to look, or what to try to get more information?
> >
> > Sorry so vague... it's a LOT of code, runs fine without error for many
> > iterations, so I didn't think the problem was syntax...
> >
> > --
> > ---
> > David L. Van Brunt, Ph.D.
> > mailto: [EMAIL PROTECTED]
> >
> > [[alternative HTML version deleted]]
> >
> > __
> > R-help@stat.math.ethz.ch mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-help
> > PLEASE do read the posting guide! 
> > http://www.R-project.org/posting-guide.html
> >
> >
>
>
>
> --
> Jim Holtman
> Cincinnati, OH
> +1 513 247 0281
>
> What is the problem you are trying to solve?




--
---
David L. Van Brunt, Ph.D.
mailto:[EMAIL PROTECTED]

[[alternative HTML version deleted]]

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] Time-measurement in milliseconds

2005-11-07 Thread Sydler, Dominik
Hi there

I'm looking for a time-measurement to measure time-differences in
milliseconds.
On my search, I only found the following:
 - package "base": Sys.time()
-> only second-accuracy
 - package "R.utils": System$currentTimeMillis()
-> returns integer of milliseconds, but accuracy is only whole
seconds too.
At the moment I run every bit of code to measure 1000-times to be able
to calculate time in milliseconds... ;-)

Has anyone a method to get milliseconds?

Thanks for any help.

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] solve the quadratic equation ax^2+bx+c=0

2005-11-07 Thread Yuying Shi
If I have matrices as follows:
> a <- c(1,1,0,0)
> b <- c(4,4,0,0)
> c <- c(3,5,5,6)
How can I use R code to solve the equation ax^2+bx+c=0.
thanks!
 yuying shi

[[alternative HTML version deleted]]

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] solving a complicated equation

2005-11-07 Thread Andreas Hary
Try something like this:

> g <- function(x) return(p-a*exp(-x^2/2)+b*pnorm(x,0,1,lower.tail=F))
> p <- -0.5
> a <- 1
> b <- 1

> uniroot(g,c(-100,10))
$root
[1] -1.336607

$f.root
[1] 4.738e-06

$iter
[1] 11

$estim.prec
[1] 6.103516e-05


Regards,

Andreas






- Original Message - 
From: "Cunningham Kerry" <[EMAIL PROTECTED]>
To: 
Sent: Sunday, November 06, 2005 10:47 PM
Subject: [R] solving a complicated equation


>I want to solve the following equation for x
>
> p=a*exp(-x^2/2)+b*P(Z>x)
>
> where p,a,b are known, Z is a standard normal
> variable. Clearly there is no analytic form for
> P(Z>x).
>
> I am wondering if any expert could direct one easy way
> on this. Thank you.
>
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! 
> http://www.R-project.org/posting-guide.html
>

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] how to export density_function output?

2005-11-07 Thread Duncan Murdoch
Alessandro Bigi wrote:
> Dear all,
> 
> quite a naive question: I have a data frame and I computed "the kernel 
> density estimate" with density on each column.
> Now I'd like to export in a txt file the density function output for each 
> column, but when I use write.table, I get a message "Error in 
> as.data.frame.default(x[[i]], optional = TRUE) : can't coerce class 
> "density" into a data.frame"
> How should I do this?

When you get errors like that, it's a good idea to look inside the 
object to see what's there.  For example,

 > d <- density(precip, bw = bw)
 > str(d)
List of 7
  $ x: num [1:512] -4.82 -4.65 -4.49 -4.32 -4.16 ...
  $ y: num [1:512] 4.79e-05 5.46e-05 6.19e-05 7.00e-05 7.91e-05 ...
  $ bw   : num 3.94
  $ n: int 70
  $ call : language density.default(x = precip, bw = bw)
  $ data.name: chr "precip"
  $ has.na   : logi FALSE
  - attr(*, "class")= chr "density"

So the result is a list with x and y components giving the density, plus 
other components that describe what is there.  You probably only want 
the y components from various fits.  You can extract them as

results <- data.frame(precip = d$y, ...)

(where the ... has you extracting densities from other objects).

Duncan Murdoch

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Problem defining a system of odes as a C library with lsoda

2005-11-07 Thread Woodrow Setzer
I think the problem is in odesolve (something I thought I'd already fixed).  I 
don't have a 64-bit system to test on, so could you try these changes and let 
me know (offline) if they fix the problem?  There is a type mismatch in 
odesolve; I want to know if that is the cause of your problem.  In any case, 
I'll get an updated version of odesolve on CRAN ASAP.

in mymod.c:
change the line 
mymod(void(* odeparms)(int *, double *))
to
mymod(void(* odeparms)(long int *, double *))

and the line
int N=3;
to 
long int N=3;

Woody

-Original Message-
From: Dylan Childs <[EMAIL PROTECTED]>
Sent: Nov 6, 2005 2:39 PM
To: r-help@stat.math.ethz.ch
Subject: [R] Problem defining a system of odes as a C library with lsoda

I have been trying to make use of the odesolve library on my 
university's Linux grid - currently R version 2.0.1 is installed and 
the system runs 64-bit Scientific Linux based on Redhat.  

... [deleted]

However, the call to 
lsoda fails with the following error:

Error in lsoda(c(1, 0, 0), times, "myderivs", parms, rtol = 1e-04, atol 
= my.atol,  :
Confusion over the length of parms

... [deleted]

Dr. Dylan Z. Childs
Department of Animal and Plant Sciences,
University of Sheffield,
Sheffield,
S10 2TN, UK.

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Woodrow Setzer
National Center for Computational Toxicology
US Environmental Protection Agency
Research Triangle Park, NC 27711

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] how to export density_function output?

2005-11-07 Thread Alessandro Bigi
Dear all,

quite a naive question: I have a data frame and I computed "the kernel 
density estimate" with density on each column.
Now I'd like to export in a txt file the density function output for each 
column, but when I use write.table, I get a message "Error in 
as.data.frame.default(x[[i]], optional = TRUE) : can't coerce class 
"density" into a data.frame"
How should I do this?
Thanks,

Alessandro


-- 

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] Box's M/likelihood ratio test for equal v-c matrices

2005-11-07 Thread Liaw, Andy
Doing RSiteSearch("Box's M test") ought to help:  There's code in 
the 3rd hit.

Andy

> From: Thomas Lotze
> 
> Hopefully a quick question:
> 
> is there a package/routine to perform Box's M (like proc 
> discrim in SAS) 
> or some other test for whether two multivariate samples (presumed 
> multivariate normal) have the same v-c matrix?  I've looked, 
> but can't 
> find it.
> 
> If not, I can certainly code it up myself (and perhaps submit it for 
> inclusion in the mvtnormtest package), but I wanted to check 
> if it exists.
> 
> Many thanks,
> Thomas Lotze
> 
> "Isn't sanity just a one-trick pony anyway?  I mean, all you get
> is that one trick--rational thinking--but when you're good and crazy,
> well, the sky's the limit!"
>   -The Tick
> 
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! 
> http://www.R-project.org/posting-guide.html
> 
>

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] R-2.2.0: aggregate problem

2005-11-07 Thread jfontain
> aggregate(as.ts(c(1,2,3,4,5,6,7,8,9,10)),1/2,mean)
Time Series:
Start = 1
End = 9
Frequency = 0.5
[1] 1.5 3.5 5.5 7.5 9.5
> aggregate(as.ts(c(1,2,3,4,5,6,7,8,9,10)),1/5,mean)
Error in sprintf(gettext(fmt, domain = domain), ...) :
use format %f, %e or %g for numeric objects
>

I must be doing something wrong, but what?

Many thanks for your help,

--
Jean-Luc

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R (2.2.0), R-DCOM and Delphi

2005-11-07 Thread Dieter Menne
Jose Claudio Faria  terra.com.br> writes:

> 
> Dear Dieter,
> 
> Below the error message I got when trying to compile your RDCom.dpr
> 
...
> [Warning] RCom.pas(93): Unsafe code 'String index to var param'
> [Error] RCom.pas(119): Undeclared identifier: 'VarType'
> [Error] RCom.pas(124): Undeclared identifier: 'VarArrayDimCount'
..
> I'm using Delphi 7 under WinXP pro/SP2.

Looks like a missing unit in "uses". I compiled it under Delphi 5, and it looks
like the Variants unit is not included in my "uses" clause (Delphi 5 did not
need it).

Try to find out via the help file which unit declares VarType and add that unit
to "uses". And please tell me the result once you get it compiled; I will add
your finding to the archive.

Dieter

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R seems to "stall" after several hours on a long series o f analyses... where to start?

2005-11-07 Thread Fowler, Mark
You can test whether the problem is accumulation in memory, which is
certainly what this sounds like. Just run a loop over a reasonably small
number of iterations and store or print the time taken by each iteration (a
sketch follows below). If memory is accumulating, it will run optimally for
the first few iterations, after which the time will increase noticeably
(essentially exponentially), until it ultimately freezes up. If this is the
problem, you may need to restructure the loop, or run it as a DOS/shell
script that does each iteration as a separate batch job (opening and closing
R each iteration, so old objects can't clutter the memory).
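
A minimal sketch of that timing check (the loop body is just a placeholder for
your own randomForest fit/predict code):

t.prev <- proc.time()[3]
for (i in 1:20) {
  ## ... one iteration of the fit/predict work goes here ...
  t.now <- proc.time()[3]
  cat("iteration", i, "took", round(t.now - t.prev, 2), "seconds\n")
  t.prev <- t.now
}

Calling rm() on large intermediate objects and gc() at the end of each
iteration sometimes helps as well.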


Mark Fowler
Population Ecology Division
Bedford Inst of Oceanography
Dept Fisheries & Oceans
Dartmouth NS Canada
B2Y 4A2
Tel. (902) 426-3529
Fax (902) 426-9710
Email [EMAIL PROTECTED]
Home Tel. (902) 461-0708
Home Email [EMAIL PROTECTED]


-Original Message-
From: David L. Van Brunt, Ph.D. [mailto:[EMAIL PROTECTED] 
Sent: November 7, 2005 7:16 AM
To: r-help@stat.math.ethz.ch
Subject: [R] R seems to "stall" after several hours on a long series of
analyses... where to start?

Not sure where to even start on this I'm hoping there's some debugging I
can do...

I have a loop that cycles through several different data sets (same
structure, different info), performing randomForest growth and
predictions... saving out the predictions for later study...

I get about 5 hours in (9%... of the planned iterations.. yikes!) and R just
freezes.

This happens in interactive and batch mode execution (I can see from the
".Rout" file that it gets about 9% through in Batch mode, and about 6% if in
interactive mode... does that suggest memory problems?)

I'm thinking of re-executing this same code on a different platform to see
if that's the issue (currently using OS X)... any other suggestions on where
to look, or what to try to get more information?

Sorry so vague... it's a LOT of code, runs fine without error for many
iterations, so I didn't think the problem was syntax...

--
---
David L. Van Brunt, Ph.D.
mailto:[EMAIL PROTECTED]

[[alternative HTML version deleted]]

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide!
http://www.R-project.org/posting-guide.html

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R (2.2.0), R-DCOM and Delphi

2005-11-07 Thread Jose Claudio Faria
Dear Dieter,

First, thank you for your work!
Below is the error message I got when trying to compile your RDCom.dpr:

[Warning] STATCONNECTORCLNTLib_TLB.pas(319): Unsafe type 'EventDispIDs: Pointer'
[Warning] STATCONNECTORCLNTLib_TLB.pas(320): Unsafe type 'LicenseKey: Pointer'
[Warning] STATCONNECTORCLNTLib_TLB.pas(324): Unsafe code '@ operator'
[Warning] STATCONNECTORCLNTLib_TLB.pas(367): Unsafe type 'EventDispIDs: Pointer'
[Warning] STATCONNECTORCLNTLib_TLB.pas(368): Unsafe type 'LicenseKey: Pointer'
[Warning] STATCONNECTORCLNTLib_TLB.pas(372): Unsafe code '@ operator'
[Warning] STATCONNECTORCLNTLib_TLB.pas(374): Unsafe code '@ operator'
[Warning] STATCONNECTORSRVLib_TLB.pas(376): Unsafe type 'LicenseKey: Pointer'
[Warning] STATCONNECTORSRVLib_TLB.pas(379): Unsafe code '@ operator'
[Warning] RCom.pas(93): Unsafe code 'String index to var param'
[Error] RCom.pas(119): Undeclared identifier: 'VarType'
[Error] RCom.pas(124): Undeclared identifier: 'VarArrayDimCount'
[Error] RCom.pas(127): Undeclared identifier: 'VarArrayHighBound'
[Error] RCom.pas(140): Undeclared identifier: 'VarType'
[Error] RCom.pas(145): Undeclared identifier: 'VarArrayDimCount'
[Error] RCom.pas(148): Undeclared identifier: 'VarArrayHighBound'
[Error] RCom.pas(161): Undeclared identifier: 'VarType'
[Error] RCom.pas(166): Undeclared identifier: 'VarArrayDimCount'
[Error] RCom.pas(169): Undeclared identifier: 'VarArrayHighBound'
[Error] RCom.pas(181): Undeclared identifier: 'VarArrayCreate'
[Error] RCom.pas(196): Undeclared identifier: 'VarArrayCreate'
[Error] RCom.pas(211): Undeclared identifier: 'VarArrayCreate'
[Error] RCom.pas(226): Undeclared identifier: 'VarArrayCreate'
[Error] RCom.pas(253): Undeclared identifier: 'VarType'
[Error] RCom.pas(258): Undeclared identifier: 'VarArrayDimCount'
[Error] RCom.pas(261): Undeclared identifier: 'VarArrayHighBound'
[Fatal Error] RDComMain.pas(14): Could not compile used unit 'RCom.pas'

I'm using Delphi 7 under WinXP pro/SP2.
Could you give me a tip?

Regards,
-- 
Jose Claudio Faria
Brasil/Bahia/UESC/DCET
Estatistica Experimental/Prof. Adjunto
mails:
  [EMAIL PROTECTED]
  [EMAIL PROTECTED]
  [EMAIL PROTECTED]
tel: 73-3634.2779

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] R seems to "stall" after several hours on a long series of analyses... where to start?

2005-11-07 Thread Duncan Murdoch
David L. Van Brunt, Ph.D. wrote:
> Not sure where to even start on this I'm hoping there's some debugging I
> can do...
> 
> I have a loop that cycles through several different data sets (same
> structure, different info), performing randomForest growth and
> predictions... saving out the predictions for later study...
> 
> I get about 5 hours in (9%... of the planned iterations.. yikes!) and R just
> freezes.
> 
> This happens in interactive and batch mode execution (I can see from the
> ".Rout" file that it gets about 9% through in Batch mode, and about 6% if in
> interactive mode... does that suggest memory problems?)
> 
> I'm thinking of re-executing this same code on a different platform to see
> if that's the issue (currently using OS X)... any other suggestions on where
> to look, or what to try to get more information?
> 
> Sorry so vague... it's a LOT of code, runs fine without error for many
> iterations, so I didn't think the problem was syntax...

You could try running an external debugger to see whether it appears R 
is stuck in a loop.  I don't know what OS X debuggers are like, but on 
Windows, you can see routine names even without debugging information. 
Recompiling R with debugging info will make the results a lot easier to 
interpret.

Duncan Murdoch

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] R seems to "stall" after several hours on a long series of analyses... where to start?

2005-11-07 Thread David L. Van Brunt, Ph.D.
Not sure where to even start on this I'm hoping there's some debugging I
can do...

I have a loop that cycles through several different data sets (same
structure, different info), performing randomForest growth and
predictions... saving out the predictions for later study...

I get about 5 hours in (9%... of the planned iterations.. yikes!) and R just
freezes.

This happens in interactive and batch mode execution (I can see from the
".Rout" file that it gets about 9% through in Batch mode, and about 6% if in
interactive mode... does that suggest memory problems?)

I'm thinking of re-executing this same code on a different platform to see
if that's the issue (currently using OS X)... any other suggestions on where
to look, or what to try to get more information?

Sorry so vague... it's a LOT of code, runs fine without error for many
iterations, so I didn't think the problem was syntax...

--
---
David L. Van Brunt, Ph.D.
mailto:[EMAIL PROTECTED]

[[alternative HTML version deleted]]

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] levelplot and layout

2005-11-07 Thread Andreas Lehnert

Dear R,

I know that I cannot use filled.contour() with layout(),
so I tried levelplot(). But it looks like I cannot use levelplot() with it
either. The only way I have found to get more than one graphic in the window
is via levelplot(panel=), but then there is only one colorkey, and I need one
for every graphic.
(I'm also not very satisfied with levelplot, because I would like to have a
"soft" graphic like filled.contour.)

Could anyone kindly suggest anything?
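
One approach that is often suggested (sketched here, untested; m1 and m2 are
hypothetical matrices) is to print each lattice object into part of the page,
so that each plot keeps its own colorkey:

library(lattice)
p1 <- levelplot(m1)
p2 <- levelplot(m2)
print(p1, split = c(1, 1, 2, 1), more = TRUE)    # left half of the page
print(p2, split = c(2, 1, 2, 1), more = FALSE)   # right half, finish the page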

Thanks, Andy

-- 
Are you already making calls, or are you still saving money?

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] slow R start up

2005-11-07 Thread Barry Rowlingson
Some helpful person wrote:

> A possibility which might work, and if so would work as you intend,
> is to use a FIFO ("named pipe") or two to communicate with R.

>>>One possible solution may be to send the relevant commands to an
>>>already
>>>running copy of R but I've yet to figure out how to achieve this.

  You could try one of the R server-based solutions:

http://franklin.imgen.bcm.tmc.edu/R.web.servers/

  You could try Rserve, but you'll probably have to write a small C++
wrapper so that you can send a command from the shell, i.e. so you can do
something like:

  echo "x=runif(1000);hist(x)" | RserveShell

from your shell script. It's probably only ten lines of C++.

http://stats.math.uni-augsburg.de/Rserve/doc.shtml

  If you rewrite your script in Python, I may have some Python-Rserve 
bindings working soon!

  Do you need to get any of R's computational results back into your shell?

  Whenever I write shell scripts, once it gets to about 20 lines I think 
to myself "Uh oh, I shoulda written this in perl". And when I've written 
200 lines of perl I find myself thinking "Uh oh, I shoulda written this 
in Python".

Baz

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] OLS variables

2005-11-07 Thread Prof Brian Ripley
On Sun, 6 Nov 2005, Kjetil Brinchmann halvorsen wrote:

> John Fox wrote:
>>
>> I assume that you're using lm() to fit the model, and that you don't really
>> want *all* of the interactions among 20 predictors: You'd need quite a lot
>> of data to fit a model with 2^20 terms in it, and might have trouble
>> interpreting the results.
>>
>> If you know which interactions you're looking for, then why not specify them
>> directly, as in lm(y ~ x1*x2 + x3*x4*x5 + etc.)? On the other hand, if you
>> want to include all interactions, say, up to three-way, and you've put the
>> variables in a data frame, then lm(y ~ .^3, data=DataFrame) will do it.
>
> This is nice with factors, but with continuous variables and the need for a
> response-surface type of model, it will not do. For instance, with
> variables x, y, z in data frame dat
>lm( y ~ (x+z)^2, data=dat )
> gives a model with the terms x, z and x:z, not the square terms.
> There is a need for a semi-automatic way to get these, for instance,
> use poly() or polym() as in:
>
> lm(y ~ polym(x,z,degree=2), data=dat)

This is an R-S difference (FAQ 3.3.2).  R's formula parser always takes
x^2 = x whereas the S one does so only for factors.  This makes sense if
you interpret `interaction' strictly as in John's description - S chose
to see an interaction of any two continuous variables as multiplication
(something which puzzled me when I first encountered it, as it was not
well documented back in 1991).
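
A tiny illustration of the difference (dat being a hypothetical data frame
with columns y, x and z):

lm(y ~ (x + z)^2, data = dat)                      # terms: x, z and x:z only
lm(y ~ x + z + I(x^2) + I(z^2) + x:z, data = dat)  # explicit second-order surface
lm(y ~ polym(x, z, degree = 2), data = dat)        # same model space via polym()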

I have often wondered if this difference was thought to be an improvement, 
or if it is just a different implementation of the Rogers-Wilkinson syntax.
Should we consider changing it?

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK        Fax:  +44 1865 272595

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


Re: [R] slow R start up

2005-11-07 Thread Ted Harding

On 07-Nov-05 Gabor Grothendieck wrote:
> Or you could call those other programs from within R.  See
> ?system

A possibility which might work, and if so would work as you intend,
is to use a FIFO ("named pipe") or two to communicate with R.

In R, see "?connections"; in Linux, see "man mkfifo".

Then you should be able to start R up so that it reads commands
from the FIFO (and optionally writes output to a FIFO which can
be read by the shell script; though it might be simpler to write
the output to a regular file).

This would be the converse of running system commands from within
R using 'system()'.

However, there can be issues with this approach since, in my
experience, you have to ensure that the program reading the
FIFO knows when there is more to come on the FIFO, and also
when to stop reading. In principle it's straightforward,
though, and rather like reading the commands simply from
a file.
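
A minimal sketch of the R side of this (the path /tmp/Rcmds is hypothetical;
create the pipe first with 'mkfifo /tmp/Rcmds' from the shell):

con <- fifo("/tmp/Rcmds", open = "r", blocking = TRUE)
repeat {
  cmd <- readLines(con, n = 1)
  if (length(cmd) == 0 || cmd == "quit") break     # writer closed, or said stop
  eval(parse(text = cmd))                          # run one command line
}
close(con)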

Or you might emulate the whole thing using regular files:
Put R into a loop which reads commands from a file (testing
whether it has changed) which the shell script updates as
needed. But you'd need to be careful that R does not read
a partially updated file.

Best wishes,
Ted.

> On 11/7/05, Stuart Macgregor <[EMAIL PROTECTED]> wrote:
>> Hi,
>>   I want to use R within a unix shell script where I repeatedly open
>> and close R (doing some computation within R whilst it is open). i.e.
>> something like
>>
>> #!/bin/sh
>> R --vanilla << EOF
>> # some R commands
>> EOF
>> # some unix commands
>> R --vanilla << EOF
>> # some R commands
>> EOF
>> # some unix commands
>> ...etc
>>
>> the problem is that since I open and close R many times, the shell
>> script takes a long time to run (R takes ~1 second to start up, even
>> on
>> a fast machine; I've tried various linux installations and versions of
>> R
>> from 1.7 to 2.2). Is there some way of making R start up faster? I've
>> tried altering the memory start options but this makes little
>> difference.
>> Note I have various other programs called from the shell script and
>> don't want to do everything from "within" R.
>> One possible solution may be to send the relevant commands to an
>> already
>> running copy of R but I've yet to figure out how to achieve this.
>>
>> Any suggestions gratefully received.
>>
>> Stuart.
>>
>> __
>> R-help@stat.math.ethz.ch mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide!
>> http://www.R-project.org/posting-guide.html
>>
> 
> __
> R-help@stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html


E-Mail: (Ted Harding) <[EMAIL PROTECTED]>
Fax-to-email: +44 (0)870 094 0861
Date: 07-Nov-05   Time: 08:57:16
-- XFMail --

__
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html