Re: [Rd] package building problem under Windows Vista

2008-04-20 Thread Gabor Grothendieck
There does seem to be some general problem associated with Sweave
and graphics when I try it on my Vista system with
[1] R version 2.7.0 RC (2008-04-17 r45367)

Using tradeCosts-article.Rnw from the tradeCosts package:

setwd(path.to.tradeCosts-article.Rnw)   # placeholder for the directory holding the vignette
Sweave("tradeCosts-article.Rnw")

appears to work properly; however, if we try it from the
command line:

R CMD Sweave tradeCosts-article.Rnw

to simulate what happens when building the package, it
reports in chunk 2 that pdf is masked from grDevices.
Then in chunk 17, when plotting is attempted the first time,
it chokes.

Maybe someone familiar with the relevant internals can explain.
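
One diagnostic I would try (my own sketch, not something from the build
logs) is to ask, from inside a chunk, which pdf the chunk will actually
call:

find("pdf")                        # every attached environment providing a pdf object
environmentName(environment(pdf))  # namespace of the pdf() that would be called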

I have shown chunks 1, 2 and 17 as output from Stangle,
the Rnw file up to the end of chunk 2 and the error log
from Sweave in the sections below.


-
###
### chunk number 1:
###
options(digits = 3, width = 60, scipen = 99)
set.seed(1)

cat.df.without.rownames <- function (d, file = ""){
  stopifnot(is.data.frame(d))
  row.names(d) <- 1:nrow(d)
  x <- NULL
  conn <- textConnection("x", "w", local = TRUE)
  capture.output(print(d), file = conn)
  close(conn)
  cat(substring(x, first = max(nchar(row.names(d))) + 2), sep = "\n",
      file = file)
}


###
### chunk number 2:
###
library(tradeCosts)
data(trade.mar.2007)
head(trade.mar.2007)


###
### chunk number 17:
###
plot(result.batched, "time.series.bps")

-


\documentclass[a4paper]{report}
\usepackage[round]{natbib}

\usepackage{Rnews}
\usepackage{fancyvrb}
\usepackage{Sweave}
\hyphenation{tradeCosts}
\hyphenation{tradeCostsResults}
\hyphenation{decision}

\DefineVerbatimEnvironment{Sinput}{Verbatim}{fontsize=\small,fontshape=sl}
\DefineVerbatimEnvironment{Soutput}{Verbatim}{fontsize=\small}
\DefineVerbatimEnvironment{Scode}{Verbatim}{fontsize=\small,fontshape=sl}

%% \SweaveOpts{prefix.string=graphics/portfolio}

\bibliographystyle{abbrvnat}

\begin{document}
\begin{article}
\title{Trade Costs}
\author{Jeff Enos, David Kane, Arjun Ravi Narayan, Aaron Schwartz,
Daniel Suo and Luyi Zhao}

%%\VignetteIndexEntry{Trade Costs}
%%\VignetteDepends{tradeCosts}

<<echo = FALSE>>=
options(digits = 3, width = 60, scipen = 99)
set.seed(1)

cat.df.without.rownames <- function (d, file = ""){
  stopifnot(is.data.frame(d))
  row.names(d) <- 1:nrow(d)
  x <- NULL
  conn <- textConnection("x", "w", local = TRUE)
  capture.output(print(d), file = conn)
  close(conn)
  cat(substring(x, first = max(nchar(row.names(d))) + 2), sep = "\n",
      file = file)
}
@

\maketitle

\setkeys{Gin}{width=0.95\textwidth}


\section*{Introduction}

Trade costs are the costs a trader must pay to implement a decision to
buy or sell a security. Consider a single trade of a single equity
security. Suppose on the evening of August 1, a trader decides to
purchase 10,000 shares of IBM at \$10, the \emph{decision price} of
the trade.  The next day, the trader's broker buys 10,000 shares in a
rising market and pays \$11 per share, the trade's \emph{execution price}.

How much did it cost to implement this trade?  In the most basic
ex-post analysis, trade costs are calculated by comparing the
execution price of a trade to a benchmark price.\footnote{For an
  in-depth discussion of both ex-ante modeling and ex-post measurement
  of trade costs, see \citet{kissell:glantz}.}  Suppose we
wished to compare the execution price to the price of the security at
the time of the decision in the above example.  Since the trader's
decision occurred at \$10 and the broker paid \$11, the cost of the
trade relative to the decision price was $\$11 - \$10 = \$1$ per
share, or \$10,000 (9.1\% of the total value of the execution).

Measuring costs relative to a trade's decision price captures costs
associated with the delay in the release of a trade into the market
and movements in price after the decision was made but before the
order is completed.  It does not, however, provide a means to
determine whether the broker's execution reflects a fair price. For
example, the price of \$11 would be a poor price if most transactions
in IBM on August 2 occurred at \$10.50.  For this purpose a better
benchmark would be the day's volume-weighted average price, or VWAP.
If VWAP on August 2 was \$10.50 and the trader used this as her
benchmark, then the trade cost would be \$0.50 per share, or \$5,000.

The first version of the \pkg{tradeCosts} package provides a simple
framework for calculating the cost of trades relative to a benchmark
price, such as VWAP or decision price, over multiple periods and basic
reporting and plotting facilities to analyse these costs.
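
The cost arithmetic in that introduction is easy to check directly in R
(my own sketch, not part of the vignette):

shares   <- 10000
decision <- 10      # decision price
exec     <- 11      # execution price
vwap     <- 10.50   # day's volume-weighted average price

(exec - decision) * shares    # 10000: cost relative to the decision price
(exec - decision) / exec      # ~0.0909, i.e. 9.1% of the execution value
(exec - vwap) * shares        # 5000: cost relative to VWAP, $0.50 per share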


Re: [Rd] package building problem under Windows Vista

2008-04-19 Thread Gabor Grothendieck
Note that CRAN is also having a problem with the package
so it's not just you:

http://cran.r-project.org/bin/windows/contrib/2.7/check/tm-check.log

I also have a problem with the vignettes (note
there are two) but as with you this works:

Rcmd build --no-vignettes tradeCosts

I don't have any special environment variables set for temporary
directories and I use C:\Program Files\R\...
for R.  I use Rcmd.bat and sweave.bat from batchfiles found at:
http://batchfiles.googlecode.com
which finds R in the registry so no paths need be set.
I am on Vista but you seem to have SP1 which I don't
have yet.

I can build other packages so there
is probably something that needs fixing in their vignettes.

On Sat, Apr 19, 2008 at 7:52 AM, John Fox [EMAIL PROTECTED] wrote:
 Dear list members,

 I've encountered the following problem trying to build a package under
 Windows Vista (SP1). The problem occurs with both R 2.6.2 and R 2.7.0 RC
 (from which this output was produced). The package builds just fine on my XP
 (SP2) machine. Please see some further comments below.

 -- snip -

 Microsoft Windows [Version 6.0.6001]
 Copyright (c) 2006 Microsoft Corporation.  All rights reserved.

 d:\R-packages>R CMD build tradeCosts
 * checking for file 'tradeCosts/DESCRIPTION' ... OK
 * preparing 'tradeCosts':
 * checking DESCRIPTION meta-information ... OK
 * installing the package to re-build vignettes
 installing R.css in C:/Users/JOHNFO~1/AppData/Local/Temp/Rinst602447586


 -- Making package tradeCosts 
  adding build stamp to DESCRIPTION
  installing NAMESPACE file and metadata
 Error in file(file, "r") : unable to open connection
 Calls: <Anonymous> -> parseNamespaceFile -> parse -> file
 In addition: Warning message:
 In file(file, "r") :
  cannot open file
  'C:/Users/JOHNFO~1/AppData/Local/Temp/Rinst602447586/tradeCosts/NAMESPACE',
  reason 'Permission denied'
 Execution halted
 make[2]: *** [nmspace] Error 1
 make[1]: *** [all] Error 2
 make: *** [pkg-tradeCosts] Error 2
 *** Installation of tradeCosts failed ***

 Removing 'C:/Users/JOHNFO~1/AppData/Local/Temp/Rinst602447586/tradeCosts'
 * creating vignettes ... OK
 * removing junk files
 * checking for LF line-endings in source and make files
 * checking for empty or unneeded directories
 * building 'tradeCosts_0.3-0.tar.gz'


 d:\R-packages>

 -- snip -

 I believe that the error is related to the vignette in the package, since I
 can build packages without a vignette. Clearly there is a file-permission
 problem but: (1) I'm using an account with administrator privileges; (2) R
 is installed into c:\R (and the problem persists even when I install R into
 d:\R); (3) the problem persists when I run the command window and R itself
 as administrator, and when I turn off account controls; (4) the problem
 persists when I reset the environment variables temp and tmp to d:\temp and
 set the permissions on d:\temp so that all groups and users have full
 control over that directory.

 I'm tempted to dump Vista but I've been trying to persist with it since most
 people (e.g., my students) buying new Windows machines will be getting it.
 Although I've read section 2.24 of the R for Windows FAQ, it's quite
 possible that I've missed something of relevance there.

 Any help would be appreciated.

 Thanks in advance,
  John

 
 John Fox, Professor
 Department of Sociology
 McMaster University
 Hamilton, Ontario, Canada L8S 4M4
 905-525-9140x23604
 http://socserv.mcmaster.ca/jfox

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Progress window on updating

2008-04-19 Thread Gabor Grothendieck
I am updating some packages using the packages menu
and noticed that the progress window title says 100% done
all the time.  It works; it's just that the progress
percentage is wrong.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Overriding axis formatting with custom Axis method, Axis.numeric etc

2008-04-07 Thread Gabor Grothendieck
Try this:


Axis.AsIs = AxisFUN = function(x=NULL, at=NULL, ..., side,
labels=TRUE) {
   if (is.null(at)) at = pretty(x)
   axis(at=at, ..., side=side, labels=labels, col="red", lwd=5)
}

plot(I(1:10))

On Mon, Apr 7, 2008 at 7:22 AM, Sklyar, Oleg (MI London)
[EMAIL PROTECTED] wrote:
 Dear list:

 I would like to override the default way R formats plot axes with a
 custom method(s). Obviously I would prefer to define it as general as
 possible avoiding writing a custom method for each individual class
 where possible.

 The plot.default method (and I assume other methods as well) calls
 Axis(...) to produce the axis layout depending on data. In this sense it
 seems reasonable to (re)define Axis methods rather than change plot
 methods (which are impossible to control as there can be too many custom
 ones).

 Now my question is how can I redefine Axis that it is automatically
 called by plot.default and other plot methods? Or which Axis-method
 signatures are already defined that I should redefine?

 What I have succeeded so far was defining Axis.numeric and Axis.myClass,
 which are called by default if I supply data of class numeric or
 myClass. For example, a simple code like

 Axis.numeric = AxisFUN = function(x=NULL, at=NULL, ..., side,
 labels=TRUE) {
if (is.null(at)) at = pretty(x)
 axis(at=at, ..., side=side, labels=labels, col="red", lwd=5)
 }

 run with plot(1:5,1:5) will format both axes red.

 However, if I execute it with plot(1:5), only x-axis plotting is
 intercepted, leaving y at the default formatting although it is also numeric.
 Why, and what should I define to intercept plotting of the y axis, or
 plotting of axes in boxplot etc.? Simply importing and overriding Axis as a
 function does not achieve anything; it wouldn't get called.

 Also I was not able to use S4 methods to redefine Axis. Overriding it
 with the code from above using any of the following signatures also
 didn't work for me - they are simply ignored:

 setGeneric("Axis")

 setMethod("Axis", signature(x="missing", at="numeric"), AxisFUN)
 setMethod("Axis", signature(x="numeric", at="missing"), AxisFUN)

 setMethod("Axis", signature(x="missing", at="ANY"), AxisFUN)
 setMethod("Axis", signature(x="ANY", at="missing"), AxisFUN)
 setMethod("Axis", signature(x="ANY", at="ANY"), AxisFUN)

 Any ideas?

 Thanks,
 Oleg

 Dr Oleg Sklyar
 Technology Group
 Man Investments Ltd
 +44 (0)20 7144 3803
 [EMAIL PROTECTED]



 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [R] How to improve the OPTIM results

2008-04-06 Thread Gabor Grothendieck
On Sun, Apr 6, 2008 at 12:59 PM, Kurt Hornik [EMAIL PROTECTED] wrote:
  Dirk Eddelbuettel writes:

  On 6 April 2008 at 08:37, Spencer Graves wrote:
  | moved to R-devel
  [...]
  |   However, the comment from Hans Borchers (below) raises a related
  | question:  What might be the best way to build a collaboration with
  | existing optimization initiatives, possibly making R a platform of
  | choice for making it easy for users to access and compare alternative
  | optimization methods?

  Are you (used in the plural) aware of Kurt's more recent advances into
  OR?  There are new-ish packages RSymphony, Rglpk and other things at
  CRAN.

 Not only by me: there's also Rcplex and TSP and there are at least three
 more COIN-OR package interfaces in the pipeline.


It would be nice if there were a CRAN Task View to pull all this together.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] __FILE__ for R

2008-04-04 Thread Gabor Grothendieck
Although it's kludgy, this worked previously (and still works) too:

parent.frame(2)$ofile

if called from top level within a sourced file.
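
For example, a script might locate companion files relative to itself like
this (a sketch; the file names are hypothetical and the lookup only works
when the script is run via source()):

## hypothetical contents of script.R, run as source("script.R")
ofile <- parent.frame(2)$ofile            # path of the file being sourced (NULL otherwise)
if (!is.null(ofile))
  source(file.path(dirname(ofile), "helper.R"))   # hypothetical companion file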

On Fri, Apr 4, 2008 at 11:50 AM, hadley wickham [EMAIL PROTECTED] wrote:
 Sorry, that should be:

 FILE <- (function() {
  attr(body(sys.function()), "srcfile")
 })()$filename

 Hadley


 On Fri, Apr 4, 2008 at 10:34 AM, hadley wickham [EMAIL PROTECTED] wrote:
  I've often missed the ability to get the directory of the currently
   running script.  It's actually been possible for a while:
 
   FILE <- (function() {
     attr(body(sys.function()), "srcfile")
   })()
 
   thanks to Duncan's recent changes to file parsing.  This is pretty
   useful for sourcing in files relative to the current script, rather
   than the working directory.
 
   Hadley
 
   --
   http://had.co.nz/
 



 --
 http://had.co.nz/

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R CMD check should check date in description

2008-04-04 Thread Gabor Grothendieck
I think it's somewhat inspiring to be able to get R CMD check to the
point where there are no warnings, so having a situation where a warning
is OK would interfere with that.

On Fri, Apr 4, 2008 at 3:54 PM, hadley wickham [EMAIL PROTECTED] wrote:
   I recently thought about this.  I see several issues.
 
   * How can we determine if it is old?  Relative to the time when the
package was uploaded to a repository?
 
   * Some developers might actually want a different date for a variety of
reasons ...
 
   * What we currently say in R-exts is
 
   The optional `Date' field gives the release date of the current
   version of the package.  It is strongly recommended to use the
    yyyy-mm-dd format conforming to the ISO standard.
 
Many packages do not comply with the latter (but I have some code to
sanitize most of these), and release date may be a moving target.
 
   The best that I could think of is to teach R CMD build to *add* a Date
   field if there was none.

 That sounds like a good solution to me.  Otherwise, maybe just a
 message from R CMD check?  i.e. just like failing the codetools
 checks, it might be perfectly ok, but you should be doing it
 consciously, not by mistake.


 Hadley


 --
 http://had.co.nz/

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-04-01 Thread Gabor Grothendieck
Can I suggest some clarification of the help page for callCC,
plainly stating that it is intended to exit from a deeply nested
set of calls?

On a casual reading I thought the exact same thing as f.jamitsky.

On Tue, Apr 1, 2008 at 6:32 AM, Luke Tierney [EMAIL PROTECTED] wrote:
 No.  First class continuations of the kind provided in scheme can be
 used as a means to implement generators, but downward-only
 continuations as currently provided in R are not sufficient for that.
 This version is intended only as a non-local exit mechanism.

 Best,

 luke

 On Mon, 31 Mar 2008, f.jamitzky wrote:

 
  callcc is similar to the yield keyword in python and c#
  it lets you define e.g. a generator of lists of numbers.
 
 
 
 
  Gabor Grothendieck wrote:
 
  Would anyone like to explain if callCC in R 2.7.0 gives
  anything that on.exit does not already provide?
 
  It seems that the exit condition once defined cannot
  be added to overridden whereas with on.exit multiple
  on.exit's add additional on.exits rather than being ignored.
 
  Is this important?
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 
 
 
 

 --
 Luke Tierney
 Chair, Statistics and Actuarial Science
 Ralph E. Wareham Professor of Mathematical Sciences
 University of Iowa  Phone: 319-335-3386
 Department of Statistics andFax:   319-335-3017
Actuarial Science
 241 Schaeffer Hall  email:  [EMAIL PROTECTED]
 Iowa City, IA 52242 WWW:  http://www.stat.uiowa.edu


 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] data(lh) time serie parameters

2008-03-30 Thread Gabor Grothendieck
On Sun, Mar 30, 2008 at 5:52 AM, Jean lobry
[EMAIL PROTECTED] wrote:
 Dear all,

 I'm confused by the time serie parameters in data(lh) :

 sueoka:~ lobry$ R --vanilla --quiet
  > tsp(lh)
 [1]  1 48  1

 because documentation says:

 QUOTE
 A regular time series giving the luteinizing hormone in blood
 samples at 10 mins intervals from a human female, 48 samples.
 UNQUOTE

 So that I would expect the time serie to end at 480 minutes
 or 8 hours. Shouldn't we have something like:

   tsp(lh) <- c(10, 480, 0.1) # in Minutes

 or

   tsp(lh) <- c(1/6, 8, 6)    # in Hours

It seems they are using 10 minutes as the unit of measurement.
If you wish to change it to hours you might want to use this instead:

   lh.hr <- ts(lh, start = 0, frequency = 6)

so that

   cycle(lh.hr)

starts out at 1.
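
For what it's worth, here is what that re-basing does to the series
attributes (my own check; the end time follows from 48 samples at 6 per
hour starting at 0):

lh.hr <- ts(lh, start = 0, frequency = 6)
tsp(lh.hr)          # c(0, 47/6, 6): 48 samples, 6 per hour
head(cycle(lh.hr))  # 1 2 3 4 5 6 -- cycle starts at 1 as noted above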

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
Would anyone like to explain if callCC in R 2.7.0 gives
anything that on.exit does not already provide?

It seems that the exit condition once defined cannot
be added to or overridden, whereas with on.exit multiple
on.exit's add additional on.exits rather than being ignored.

Is this important?

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
I think the only relationship to that is the name since
it does not appear to allow one to leave a function
in the middle of its processing and re-enter it back
at that point -- which is what would be needed.

On Sun, Mar 30, 2008 at 12:04 PM,  [EMAIL PROTECTED] wrote:

  Would anyone like to explain if callCC in R 2.7.0 gives
  anything that on.exit does not already provide?
 
  It seems that the exit condition once defined cannot
  be added to overridden whereas with on.exit multiple
  on.exit's add additional on.exits rather than being ignored.
 
  Is this important?

 It facilitates a completely different style of programming - see
 http://en.wikipedia.org/wiki/Continuation-passing_style

 --
 http://had.co.nz/


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
Also in trying it out again it seems that it's not like
on.exit but more like return:

F <- function(f) { f(10); print(2); f(20); 3}
callCC(F)

acts the same as:

F <- function() { return(10); print(2); f(20); 3}
F()

and there is no documented way to restart F
at the point it left off so I assume it can't.
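
For the record, here is what I would expect that comparison to show (my own
transcript-style sketch, following the definitions above):

F <- function(f) { f(10); print(2); f(20); 3}
callCC(F)   # returns 10; print(2) and f(20) are never reached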

On Sun, Mar 30, 2008 at 12:34 PM, Gabor Grothendieck
[EMAIL PROTECTED] wrote:
 I think the only relationship to that is the name since
 it does not appear to allow one to leave a function
 in the middle of its processing and re-enter it back
 at that point -- which is what would be needed.


 On Sun, Mar 30, 2008 at 12:04 PM,  [EMAIL PROTECTED] wrote:
 
   Would anyone like to explain if callCC in R 2.7.0 gives
   anything that on.exit does not already provide?
  
   It seems that the exit condition once defined cannot
   be added to overridden whereas with on.exit multiple
   on.exit's add additional on.exits rather than being ignored.
  
   Is this important?
 
  It facilitates a completely different style of programming - see
  http://en.wikipedia.org/wiki/Continuation-passing_style
 
  --
  http://had.co.nz/
 


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
Sorry it should be as follows:

fib <- function(i, a = 0, b = 1) {
  if (i == 0) b else fib(i-1, b, a+b)
}

Now, how do we transform that to use callCC?


On Sun, Mar 30, 2008 at 1:42 PM, Gabor Grothendieck
[EMAIL PROTECTED] wrote:
 OK. Can you show code to implement the tail recursive version of
 fib using callCC in R, say.

 Here it is transformed to tail recursive style:

 fib <- function(i, a = 0, b = 1) {
  if (i == 0) a else fib(i-1, b, a+b)

 Now, how do I add callCC to all this so that the fib call
 presumably does not create a new stack instance?


 On Sun, Mar 30, 2008 at 1:31 PM, Luke Tierney [EMAIL PROTECTED] wrote:
  On Sun, 30 Mar 2008, Gabor Grothendieck wrote:
 
   I think the only relationship to that is the name since
   it does not appear to allow one to leave a function
   in the middle of its processing and re-enter it back
   at that point -- which is what would be needed.
 
  The article conflates basic CPS with having first class continuations
  as in Scheme. The discussion about compilers and tail calls only
  requires downward-only continuations of the kind provided by R's
  current callCC.  The user interface and coroutine discussion requires
  continuations that can be run outside of their creating context.  The
  most sophisticated variant, as provided in Scheme, also allows
  continuations to be run more than once.  I don't think any of the
  examples in the Wikipedia article need that, but there is some
  interesting work on using that to model web browsing behavior.
 
  At any rate, there is plenty of precedent for using callCC as the name
  for the construct here even when the continuation is no longer valid
  outside of the creating callCC call. So the relationship is more than
  just the name.
 
  luke
 
  
   On Sun, Mar 30, 2008 at 12:04 PM,  [EMAIL PROTECTED] wrote:
  
   Would anyone like to explain if callCC in R 2.7.0 gives
   anything that on.exit does not already provide?
  
   It seems that the exit condition once defined cannot
   be added to overridden whereas with on.exit multiple
   on.exit's add additional on.exits rather than being ignored.
  
   Is this important?
  
   It facilitates a completely different style of programming - see
   http://en.wikipedia.org/wiki/Continuation-passing_style
  
   --
   http://had.co.nz/
  
  
 
   __
   R-devel@r-project.org mailing list
   https://stat.ethz.ch/mailman/listinfo/r-devel
  
 
  --
  Luke Tierney
  Chair, Statistics and Actuarial Science
  Ralph E. Wareham Professor of Mathematical Sciences
  University of Iowa  Phone: 319-335-3386
  Department of Statistics andFax:   319-335-3017
 Actuarial Science
  241 Schaeffer Hall  email:  [EMAIL PROTECTED]
  Iowa City, IA 52242 WWW:  http://www.stat.uiowa.edu
 


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
I came across this while googling for continuations and was surprised to
find it in R 2.7.0; since I had not come across it before, I assumed
it had been added just now.

Can you explain how it's intended to be used, with an example
that is more realistic than the one in the example section?

On Sun, Mar 30, 2008 at 1:14 PM, Luke Tierney [EMAIL PROTECTED] wrote:
 On Sun, 30 Mar 2008, Gabor Grothendieck wrote:

  Also in trying it out again it seems that its not like
  on.exit but more like return:

 Yes -- if you can point out what in the documentation ever gave the
 idea it might be like on.exit then we can fix the documentation.

 
  F - function(f) { f(10); print(2); f(20); 3}
  callCC(F)
 
  acts the same as:
 
  F - function() { return(10); print(2); f(20); 3}
  F()
 
  and there is no documented way to restart F
  at the point it left off so I assume it can't.

 The documentation described this as a downward-only version -- that is
 standard terminology for a callCC that produces continuations that are
 no longer valid after the callCC call exits.

 Not sure why your original question was about callCC in 2.7.0 -- I
 believe callCC was added in 2.5.0, the same time as codetools became
 recommended, and hasn't changed since.  It is not a function every R
 user needs to have in their repertoire, but it can be very useful in
 some situations.

 luke

  On Sun, Mar 30, 2008 at 12:34 PM, Gabor Grothendieck
  [EMAIL PROTECTED] wrote:
  I think the only relationship to that is the name since
  it does not appear to allow one to leave a function
  in the middle of its processing and re-enter it back
  at that point -- which is what would be needed.
 
 
  On Sun, Mar 30, 2008 at 12:04 PM,  [EMAIL PROTECTED] wrote:
 
  Would anyone like to explain if callCC in R 2.7.0 gives
  anything that on.exit does not already provide?
 
  It seems that the exit condition once defined cannot
  be added to overridden whereas with on.exit multiple
  on.exit's add additional on.exits rather than being ignored.
 
  Is this important?
 
  It facilitates a completely different style of programming - see
  http://en.wikipedia.org/wiki/Continuation-passing_style
 
  --
  http://had.co.nz/
 
 
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 --
 Luke Tierney
 Chair, Statistics and Actuarial Science
 Ralph E. Wareham Professor of Mathematical Sciences
 University of Iowa  Phone: 319-335-3386
 Department of Statistics andFax:   319-335-3017
Actuarial Science
 241 Schaeffer Hall  email:  [EMAIL PROTECTED]
 Iowa City, IA 52242 WWW:  http://www.stat.uiowa.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
OK. Can you show code to implement the tail recursive version of
fib using callCC in R, say.

Here it is transformed to tail recursive style:

fib <- function(i, a = 0, b = 1) {
  if (i == 0) a else fib(i-1, b, a+b)

Now, how do I add callCC to all this so that the fib call
presumably does not create a new stack instance?

On Sun, Mar 30, 2008 at 1:31 PM, Luke Tierney [EMAIL PROTECTED] wrote:
 On Sun, 30 Mar 2008, Gabor Grothendieck wrote:

  I think the only relationship to that is the name since
  it does not appear to allow one to leave a function
  in the middle of its processing and re-enter it back
  at that point -- which is what would be needed.

 The article conflates basic CPS with having first class continuations
 as in Scheme. The discussion about compilers and tail calls only
 requires downward-only continuations of the kind provided by R's
 current callCC.  The user interface and coroutine discussion requires
 continuations that can be run outside of their creating context.  The
 most sophisticated variant, as provided in Scheme, also allows
 continuations to be run more than once.  I don't think any of the
 examples in the Wikipedia article need that, but there is some
 interesting work on using that to model web browsing behavior.

 At any rate, there is plenty of precedent for using callCC as the name
 for the construct here even when the continuation is no longer valid
 outside of the creating callCC call. So the relationship is more than
 just the name.

 luke

 
  On Sun, Mar 30, 2008 at 12:04 PM,  [EMAIL PROTECTED] wrote:
 
  Would anyone like to explain if callCC in R 2.7.0 gives
  anything that on.exit does not already provide?
 
  It seems that the exit condition once defined cannot
  be added to overridden whereas with on.exit multiple
  on.exit's add additional on.exits rather than being ignored.
 
  Is this important?
 
  It facilitates a completely different style of programming - see
  http://en.wikipedia.org/wiki/Continuation-passing_style
 
  --
  http://had.co.nz/
 
 

  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 --
 Luke Tierney
 Chair, Statistics and Actuarial Science
 Ralph E. Wareham Professor of Mathematical Sciences
 University of Iowa  Phone: 319-335-3386
 Department of Statistics andFax:   319-335-3017
Actuarial Science
 241 Schaeffer Hall  email:  [EMAIL PROTECTED]
 Iowa City, IA 52242 WWW:  http://www.stat.uiowa.edu


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] callCC in 2.7.0

2008-03-30 Thread Gabor Grothendieck
Thanks.  So it's intended to jump straight out of deeply nested
calls without having to manage the unwinding.
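
For instance, the same idea gives a clean way to bail out of nested loops
(my own minimal sketch, not from the thread):

## find the first pair (i, j) with i * j > 50, exiting both loops at once
callCC(function(exit) {
  for (i in 1:10)
    for (j in 1:10)
      if (i * j > 50) exit(c(i, j))
  NULL                 # value returned if no pair qualifies
})
## [1] 6 9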

On Sun, Mar 30, 2008 at 4:22 PM, Luke Tierney [EMAIL PROTECTED] wrote:
 On Sun, 30 Mar 2008, Gabor Grothendieck wrote:

  Sorry it should be as follows:
 
  fib - function(i, a = 0, b = 1) {
   if (i == 0) b else fib(i-1, b, a+b)
  }
 
  Now, how do we transform that to use callCC?
 
 
  On Sun, Mar 30, 2008 at 1:42 PM, Gabor Grothendieck
  [EMAIL PROTECTED] wrote:
  OK. Can you show code to implement the tail recursive version of
  fib using callCC in R, say.
 
  Here it is transformed to tail recursive style:
 
  fib - function(i, a = 0, b = 1) {
   if (i == 0) a else fib(i-1, b, a+b)
 
  Now, how do I add callCC to all this so that the fib call
  presumably does not create a new stack instance?

 What makes you think callCC has anything to contribute here?

 The Wikipedia article Hadley cited says that using CPS effectively is
 difficult without tail call optimization, which R does not have and
 can't, at least not without some restrictions, because the stack is
 available via sys.xyz functions.  It does _not_ say that having
 explicit continuations implies having tail call optimization.

 If you want to, you can rewrite your fib in CPS, something like

 fibCPS <- function(k, i, a = 0, b = 1) {
   if (i == 0) k(b) else fibCPS(k, i-1, b, a+b)
 }

 and you can then use callCC to bridge between the nonCPS and CPS
 world, e.g. with

 fib <- function(i) callCC(function(k) fibCPS(k, i, 0, 1))

 This will grow the stack just like any other recursion in R.  A
 (minor) difference compared to your original fib is that the final
 exit happens in a single jump rather than a sequence of returns.

 The point of adding the current downward-only callCC to R is to
 provide a clean, lexically scoped non-local exit mechanism for exiting
 from complex, possibly recursive function calls. Dylan provides this
 as part of its 'block' construct (also as an exit or continuation
 function); Common Lisp has 'block'/'return-from' constructs that use
 lexically scoped block name symbols.

 For a slightly more realistic example than the ones currently in the
 help page: Here is some code that implements a simple binary tree data
 structure, and a function that visits the nodes of the tree and calls
 a specified function on the value in each node:

 mkTree <- function(value, left, right)
     list(isLeaf = FALSE, value = value, left = left, right = right)
 mkLeaf <- function(value) list(isLeaf = TRUE, value = value)

 visit <- function(node, fun) {
     if (node$isLeaf) fun(node$value)
     else {
         visit(node$left, fun)
         visit(node$right, fun)
         fun(node$value)
     }
 }

 A simple example tree:

 x <- mkTree(1, mkTree(2, mkLeaf(3), mkLeaf(4)),
             mkTree(5, mkLeaf(6), mkLeaf(7)))

 You can use visit() to print out the node values with

  visit(x, print)
 [1] 3
 [1] 4
 [1] 2
 [1] 6
 [1] 7
 [1] 5
 [1] 1
 

 If you want to use a function like visit() to traverse the tree but
 want the traversal to stop when some condition is met then you can
 either rewrite visit to allow for this, making it more complicated, or
 you can use a non-local exit.  That is where callCC comes in.  To
 print all values until you find one equal to 7 you can use

  callCC(function(exit) {
 + fun - function(value) {
 + if (value == 7) exit(NULL)
 + else print(value)
 + }
 + visit(x, fun)
 + })
 [1] 3
 [1] 4
 [1] 2
 [1] 6
 NULL
 

 One can also imagine situations where this might be useful in a
 function passed to an optimizer like optim.  Given that our 'break'
 only breaks out of one loop level, callCC can also be useful for
 breaking out of a set of nested loops.

 I first wrote this particular version of callCC when prototyping the
 tryCatch mechanism in pure R code where one needs a means to jump from
 a point where a condition is signaled to the point where the handler
 is established.  The current tryCatch implementation does things
 differently because it needs to integrate with the error handling at
 the C level. Currently I use callCC in the constantFold function in
 codetools (which is why callCC was added when codetools became
 recommended).  This use is similar to the tree example.


 luke

 
 
  On Sun, Mar 30, 2008 at 1:31 PM, Luke Tierney [EMAIL PROTECTED] wrote:
  On Sun, 30 Mar 2008, Gabor Grothendieck wrote:
 
  I think the only relationship to that is the name since
  it does not appear to allow one to leave a function
  in the middle of its processing and re-enter it back
  at that point -- which is what would be needed.
 
  The article conflates basic CPS with having first class continuations
  as in Scheme. The discussion about compilers and tail calls only
  requires downward-only continuations of the kind provided by R's

Re: [Rd] cpu usage high with windows change dir / winDialogString (PR#11045)

2008-03-29 Thread Gabor Grothendieck
On Sat, Mar 29, 2008 at 12:06 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:
 On 28/03/2008 12:05 PM, [EMAIL PROTECTED] wrote:
  Good afternoon,
 
  This is possibly a windows only bug, definitely of comparatively low
  importance - but for the sake of completeness here we go.  I've
  searched http://bugs.R-project.org/ etc., but can find no mention.
 
  For RGui.exe, the CPU usage goes to 100% for certain dialog boxes for
  the duration that the dialog box is visible, e.g.
 
  * check CPU usage is low
  * On the RGui.exe menu chose File / Change dir...
  * the CPU usage goes to 100%
  * hit OK
  * the CPU usage goes back down again

 What is the bug here?  I'd guess it's not R using the cpu, rather some
 other process hooked to the dialog, but even if it really is R, why is
 this a bug?

Maybe R is sitting in a loop consuming all the resources of the system?

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] cpu usage high with windows change dir / winDialogString (PR#11045)

2008-03-29 Thread Gabor Grothendieck
On Sat, Mar 29, 2008 at 12:32 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:

 On 29/03/2008 12:23 PM, Gabor Grothendieck wrote:
  On Sat, Mar 29, 2008 at 12:06 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:
  On 28/03/2008 12:05 PM, [EMAIL PROTECTED] wrote:
  Good afternoon,
 
  This is possibly a windows only bug, definitely of comparatively low
  importance - but for the sake of completeness here we go.  I've
  searched http://bugs.R-project.org/ etc., but can find no mention.
 
  For RGui.exe, the CPU usage goes to 100% for certain dialog boxes for
  the duration that the dialog box is visible, e.g.
 
  * check CPU usage is low
  * On the RGui.exe menu chose File / Change dir...
  * the CPU usage goes to 100%
  * hit OK
  * the CPU usage goes back down again
  What is the bug here?  I'd guess it's not R using the cpu, rather some
  other process hooked to the dialog, but even if it really is R, why is
  this a bug?
 
  Maybe R is sitting in a loop consuming all the resources of the system?

 It's a SHBrowseForFolder call, with a callback.  If there's a bug, it
 looks to me as though it's in Windows.

 Duncan Murdoch


I wonder if any of the numerous settings in the _browseinfo structure
might be affecting this, such as looking for network folders, and whether
the user's networking setup matters, since the dialog could start
performing network operations to find files over a network.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] cut.Date and cut.POSIXt problem

2008-03-18 Thread Gabor Grothendieck
cut.Date and cut.POSIXt indicate that the breaks argument
can be an integer followed by a space followed by "year", etc.,
but it seems the integer is ignored.

For example, I assume that breaks = "3 months" is supposed
to cut the dates into quarters but, in fact, it cuts them into months as if
the "3" had not been there.

> d <- seq(Sys.Date(), length = 12, by = "month")
> cut(d, "3 months")
 [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
2009-02-01
> cut(as.POSIXct(d), "3 months")
 [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
2009-02-01
> cut(as.POSIXlt(d), "3 months")
 [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
2009-02-01

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] cut.Date and cut.POSIXt problem

2008-03-18 Thread Gabor Grothendieck
I am using

[1] R version 2.6.2 alpha (2008-01-26 r44181)

and also get the same under:

R version 2.7.0 Under development (unstable) (2008-03-13 r44752)

On Tue, Mar 18, 2008 at 2:09 PM, Roger D. Peng [EMAIL PROTECTED] wrote:
 Seems it did work as advertised in R 2.6.0 but is still broken in R-devel.  
 Will
 take a look.

 -roger


 Gabor Grothendieck wrote:
  cut.Date and cut.POSIXt indicate that the breaks argument
  can be an integer followed by a space followed by year, etc.
  but it seems the integer is ignored.
 
  For example, I assume that breaks = 3 months is supposed
  to cut it into quarters but, in fact, it cuts it into months as if
  3 had not been there.
 
  d - seq(Sys.Date(), length = 12, by = month)
  cut(d, 3 months)
   [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
  2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
  Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
  2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
  2009-02-01
  cut(as.POSIXct(d), 3 months)
   [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
  2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
  Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
  2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
  2009-02-01
  cut(as.POSIXlt(d), 3 months)
   [1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
  2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
  Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
  2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
  2009-02-01
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 --
 Roger D. Peng  |  http://www.biostat.jhsph.edu/~rpeng/


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] cut.Date and cut.POSIXt problem

2008-03-18 Thread Gabor Grothendieck
By the way, it would be nice if the breaks argument
accepted the word "quarter" directly, since it's quite a common
break to need when using dates.

On Tue, Mar 18, 2008 at 2:29 PM, Gabor Grothendieck
[EMAIL PROTECTED] wrote:
 I am using

 [1] R version 2.6.2 alpha (2008-01-26 r44181)

 and also get the same under:

 R version 2.7.0 Under development (unstable) (2008-03-13 r44752)


 On Tue, Mar 18, 2008 at 2:09 PM, Roger D. Peng [EMAIL PROTECTED] wrote:
  Seems it did work as advertised in R 2.6.0 but is still broken in R-devel.  
  Will
  take a look.
 
  -roger
 
 
  Gabor Grothendieck wrote:
   cut.Date and cut.POSIXt indicate that the breaks argument
   can be an integer followed by a space followed by year, etc.
   but it seems the integer is ignored.
  
   For example, I assume that breaks = 3 months is supposed
   to cut it into quarters but, in fact, it cuts it into months as if
   3 had not been there.
  
   d - seq(Sys.Date(), length = 12, by = month)
   cut(d, 3 months)
[1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
   2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
   Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
   2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
   2009-02-01
   cut(as.POSIXct(d), 3 months)
[1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
   2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
   Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
   2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
   2009-02-01
   cut(as.POSIXlt(d), 3 months)
[1] 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01 2008-08-01
   2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01 2009-02-01
   Levels: 2008-03-01 2008-04-01 2008-05-01 2008-06-01 2008-07-01
   2008-08-01 2008-09-01 2008-10-01 2008-11-01 2008-12-01 2009-01-01
   2009-02-01
  
   __
   R-devel@r-project.org mailing list
   https://stat.ethz.ch/mailman/listinfo/r-devel
  
 
  --
  Roger D. Peng  |  http://www.biostat.jhsph.edu/~rpeng/
 


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] extract function [ and empty index

2008-03-09 Thread Gabor Grothendieck
Use TRUE.
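
That is, a length-one TRUE recycles to select everything in that dimension,
so it can stand in as an "empty index" object (a small illustration, not
from the original post):

m <- matrix(1:4, 2, 2)
idx <- TRUE                     # object standing in for the missing index
m[1, idx]                       # same as m[1, ]
identical(m[1, idx], m[1, ])    # TRUE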

On Sun, Mar 9, 2008 at 5:05 AM, Laurent Gautier [EMAIL PROTECTED] wrote:
 Dear list,

 I am having a question regarding the extract function [.

 The man page says that one usage with k-dimensional arrays is to
 specify k indices to [, with an empty index indicating that all
 entries in that dimension are selected.

 The question is the following: is there an R object qualifying as an
 empty index ? I understand that the lazy evaluation of parameters
 allows one to
 have genuinely missing parameters, but I would like to have an object
 instead. I understand that one can always have an if/else workaround,
 but I thought should ask, just in case.

 I tried with NULL but with little success, as it appears to give the
 same results
 as an empty vector.
  m = matrix(1, 2, 2)
  m[1, NULL]
 numeric(0)
  m[1, integer(0)]
 numeric(0)

 Since I was at it, I noted that the result obtained with numeric(0)
 definitely makes sense but could as well be seen as challenging the
 concept of an empty index presented in the man page (One could somehow
 expect the presence of an object  meaning everything rather than
 missingness meaning it).


 Thanks,


 Laurent

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [patch] add=TRUE in plot.default()

2008-03-09 Thread Gabor Grothendieck
On Sun, Mar 9, 2008 at 6:27 PM, hadley wickham [EMAIL PROTECTED] wrote:
   Yes.  The ability to plot things on top of each other is important.
   The simplicity created by having a single interface for adding to plots
   outweighs the complexity of yet another parameter.
 
   The add parameter only interacts with other parameters superficially --
   some parameters of plot (like log) are related to the shape of the axes,
   and should be inherited from what is on the plot already.

 But what about when the new data is outside the range of the current
 plot?

plot/lines/points already works that way so this is just an interface issue.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [patch] add=TRUE in plot.default()

2008-03-09 Thread Gabor Grothendieck
It's not a matter of desirable or not -- it's that this is a different point.

Using par(new=) to draw on top of an old graph is completely confusing,
and it's annoying that one has to suddenly switch to lines and points
and cannot consistently use plot.

That remains true whether or not there is auto expansion.

On Sun, Mar 9, 2008 at 7:51 PM, hadley wickham [EMAIL PROTECTED] wrote:
But what about when the new data is outside the range of the current
plot?
 
   plot/lines/points already works that way so this is just an interface 
  issue.

 That may be the way it is, but I don't see how you could argue that
 it's desirable behaviour.


 Hadley


 --
 http://had.co.nz/


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] merging environments

2008-03-07 Thread Gabor Grothendieck
On Fri, Mar 7, 2008 at 3:15 PM, hadley wickham [EMAIL PROTECTED] wrote:
 2008/3/7 Ben Bolker [EMAIL PROTECTED]:
 

 Despite the spirited arguments of various R-core folks
   who feel that mle() doesn't need a data argument, and
   that users would be better off learning to deal with function
   closures, I am *still* trying to make such things work
   in a reasonably smooth fashion ...
 
 Is there a standard idiom for merging environments?
   i.e., suppose a function has an environment that I want
   to preserve, but _add_ the contents of a data list --
   would something like this do it? Is there a less ugly
   way?
 
   x <- 0
   y <- 1
   z <- 2

   f <- function() {
     x + y + z
   }

   f2 <- function(fun, data) {
     L <- ls(pos = environment(fun))
     mapply(assign, names(data), data,
            MoreArgs = list(envir = environment(fun)))
     print(ls(pos = environment(fun)))
   }

   f2(f, list(a = 1))

 I think you're doomed to be ugly if you don't use closures - I think
 any explicit manipulation of environments is worse than the implicit
 manipulation by closures.

 f <- function(data) with(data, x + y + z)
 f2 <- function(fun, data) function() fun(data)

 f2(f, list(x = 10))()

 Although it would be even nicer if you could do:

 f <- function()  x + y + z
 f2 <- function(fun, data) function() with(data, fun())


This last one is close to what you can do with proto and is referred
to as the method of proxies here:
http://r-proto.googlecode.com/files/prototype_approaches.pdf

f <- function() x + y + z
f2 <- function(fun, ...) with(proto(environment(fun), ..., g = fun), g())
f2(f, x = 1, y = 2, z = 3)

The proto call creates an anonymous proto object whose parent is
the parent of fun.  The anonymous proto object contains the ...
arguments to f2 and g.  g is just fun with its environment reset to
the anonymous proto object.  We then call g.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Warnings generated by log2()/log10() are really large/takes a long time to display

2008-02-27 Thread Gabor Grothendieck
On Wed, Feb 27, 2008 at 5:50 AM, Henrik Bengtsson [EMAIL PROTECTED] wrote:
 On Wed, Feb 27, 2008 at 12:56 AM, Prof Brian Ripley
 [EMAIL PROTECTED] wrote:
  On Wed, 27 Feb 2008, Martin Maechler wrote:
 
Thank you Henrik,
   
HenrikB == Henrik Bengtsson [EMAIL PROTECTED]
on Tue, 26 Feb 2008 22:03:24 -0800 writes:
   
{with many superfluous empty statements ( i.e., trailing ; ):
 
   Indeed!

 I like to add a personal touch to the code I'm writing ;)

 Seriously, I added them here as a bait in order to get a chance to say
 that I finally found a good reason for adding the semicolons.  If you
 cut'n'paste code from certain web pages it may happen that
 newlines/carriage returns are not transferred and all code is pasted
 into the same line at the R prompt.  With semicolons you still get a
 valid syntax.  I cannot remember under what conditions this happened -

I have seen that too and many others have as well since in some forums
(not related to R) it's common to indent all source lines by two spaces.  Any
line appearing without indentation must have been wrapped.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Unix-like touch to update modification timestamp of file?

2008-02-27 Thread Gabor Grothendieck
If you only need Windows then this will do it even without RTools:

shell("copy /b /v myfile +,,nul")



On Wed, Feb 27, 2008 at 11:12 AM, Earl F. Glynn
[EMAIL PROTECTED] wrote:
 Henrik Bengtsson [EMAIL PROTECTED] wrote in message
 news:[EMAIL PROTECTED]...


  is it possible to update the modification time stamp of a file using R
  (on file systems supporting it)?

 For a Windows PC, if you have RTools in your path (from
 http://www.murdoch-sutherland.com/Rtools/installer.html), then you should be
 able to use the touch that's in the RTools\bin directory:

 system("touch sample.dat")

 efg

 Earl F. Glynn
 Bioinformatics
 Stowers Institute for Medical Research


 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] how to write dput-able objects

2008-02-25 Thread Gabor Grothendieck
You might want to look at the proto package.  proto objects won't
immediately dput either but it would not be hard to convert them to
restorable character strings because the proto methods normally
have their object as their parent environment, so it's implicit in the
definition.  First define a proto object p with one variable 'a' and one
method 'f' and a child object, q, whose 'a' component overrides the
'a' in its parent.

> library(proto)
> p <- proto(a=1, f = function(.) .$a <- .$a + 1)
> q <- p$proto(a = 2)
> p$as.list()
$a
[1] 1

$f
function(.) .$a <- .$a + 1
<environment: 0x01d6e284>

> name.proto(p)
[1] "p"
> name.proto(p$parent.env())
[1] "R_GlobalEnv"
> q$as.list()
$a
[1] 2

> name.proto(q)
[1] "q"
> name.proto(q$parent.env())
[1] "p"

Note that the strings above have everything
you would need to restore them.  We don't
need the environment since we already know that
f's environment must be p as it belongs to p.

On Mon, Feb 25, 2008 at 12:47 PM, Vadim Organovich
[EMAIL PROTECTED] wrote:
 Hi,

 One way of doing object-oriented programming in R is to use function 
 environment to hold object's data, see for example
 @Article{Rnews:Chambers+Lang:2001a,
  author   = {John M. Chambers and Duncan Temple Lang},
  title= {Object-Oriented Programming in {R}},
  journal  = {R News},
  year= 2001,
  volume   = 1,
  number   = 3,
  pages= {17--19},
  month= {September},
  url= http,
  pdf= Rnews2001-3
 }

 One deficiency of this approach is that dput() does not export all data 
 pertained to the object. For example

 > objfactory <- function(nm) {
 +   list(name = function() nm)
 + }


 > obj <- objfactory('foo')

 > obj$name()
 [1] "foo"
 > dput(obj)
 structure(list(name = function ()
 nm), .Names = "name")
 As one can see the data piece of the obj, nm='foo', is not exported. Is there 
 a way to modify the original approach so that dput() will produce a 
 self-sufficient dump of the object?

 Thanks,
 Vadim

[[alternative HTML version deleted]]

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] how to write dput-able objects

2008-02-25 Thread Gabor Grothendieck
Try something like this.  If ..Name exists in the proto object
it uses that as the name; otherwise, it uses the name.proto()
heuristic to find the name.

library(proto)
dput.proto <- function(x, ...) {
    y <- as.list(x, all = TRUE)
    if (!exists("..Name", x)) y$..Name <- name.proto(x, parent.frame())
    y$.parent.Name <- name.proto(parent.env(x), parent.frame())
    dput(y, ...)
}
p <- proto(a = 1, f = function(.) { .$a <- .$a + 1})
q <- p$proto(a = 2)
dput.proto(p)
dput.proto(q)

Output:

> library(proto)
> p <- proto(a = 1, f = function(.) { .$a <- .$a + 1})
> q <- p$proto(a = 2)
> dput.proto(p)
structure(list(.super = <environment>, .that = <environment>,
    a = 1, f = function (.)
    {
        .$a <- .$a + 1
    }, ..Name = "p", .parent.Name = "R_GlobalEnv"), .Names = c(".super",
".that", "a", "f", "..Name", ".parent.Name"))
> dput.proto(q)
structure(list(.super = <environment>, .that = <environment>,
    a = 2, ..Name = "q", .parent.Name = "p"), .Names = c(".super",
".that", "a", "..Name", ".parent.Name"))



On Mon, Feb 25, 2008 at 2:17 PM, Vadim Organovich
[EMAIL PROTECTED] wrote:
 Thank you Gabor! This is very close indeed to what I need. If dput() were 
 generic one could code dput.proto() and that would be it.

 Anyway, it is so close to what I need that I should be able to hack something
 to make it work for my purposes. Thanks again!

 Vadim

 
 From: Gabor Grothendieck [EMAIL PROTECTED]
 Sent: Monday, February 25, 2008 12:16 PM
 To: Vadim Organovich
 Cc: r-devel@r-project.org
 Subject: Re: [Rd] how to write dput-able objects


 You might want to look at the proto package.  proto objects won't
 immediately dput either but it would not be hard to convert them to
 restorable character strings because the proto methods normally
 have their object as their parent environment so its implicit in the
 definition.  First define a proto object p with one variable 'a' and one
 method 'f' and a child object, q, whose 'a' component overrides the
 'a' in its parent.

  library(proto)
  p - proto(a=1, f = function(.) .$a - .$a + 1)
  q - p$proto(a = 2)
  p$as.list()
 $a
 [1] 1

 $f
 function(.) .$a - .$a + 1
 environment: 0x01d6e284

  name.proto(p)
 [1] p
  name.proto(p$parent.env())
 [1] R_GlobalEnv
  q$as.list()
 $a
 [1] 2

  name.proto(q)
 [1] q
  name.proto(q$parent.env())
 [1] p

 Note that the strings above have everything
 you would need to restore them.  We don't
 need the environment since we already know that
 f's environment must be p as it belongs to p.

 On Mon, Feb 25, 2008 at 12:47 PM, Vadim Organovich
 [EMAIL PROTECTED] wrote:
  Hi,
 
  One way of doing object-oriented programming in R is to use function 
  environment to hold object's data, see for example
  @Article{Rnews:Chambers+Lang:2001a,
   author   = {John M. Chambers and Duncan Temple Lang},
   title= {Object-Oriented Programming in {R}},
   journal  = {R News},
   year= 2001,
   volume   = 1,
   number   = 3,
   pages= {17--19},
   month= {September},
   url= http,
   pdf= Rnews2001-3
  }
 
  One deficiency of this approach is that dput() does not export all data 
  pertained to the object. For example
 
   objfactory - function(nm) {
  +   list(name = function() nm)
  + }
  
  
   obj - objfactory('foo')
  
   obj$name()
  [1] foo
   dput(obj)
  structure(list(name = function ()
  nm), .Names = name)
  As one can see the data piece of the obj, nm='foo', is not exported. Is 
  there a way to modify the original approach so that dput() will produce a 
  self-sufficient dump of the object?
 
  Thanks,
  Vadim
 
 [[alternative HTML version deleted]]
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Aliasing a function

2008-02-23 Thread Gabor Grothendieck
formals can be used like this:

f <- g
formals(f) <- pairlist(a=1, b=2, c=3)
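
Spelled out against Hadley's g below (an illustrative sketch; it assumes the
alias's body refers to the formals directly, as a plain copy of g does):

g <- function(a = 1, b = 2, c = 3) a + b * c
f <- g                                  # alias keeps g's body and environment
formals(f) <- pairlist(a = 10, b = 2, c = 3)
args(f)      # full argument list is visible, unlike function(...) g(...)
f()          # 16, using the new default for a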


On Sat, Feb 23, 2008 at 5:40 PM, hadley wickham [EMAIL PROTECTED] wrote:
 A simple way to alias a function is to do :

 g <- function(a = 1, b = 2, c = 3) a + b * c
 f <- function(...) g(...)

 but formals (etc) is no longer very helpful.  Is there an easy way to
 programmatically create:

 f <- function(a=1, b=2, c=3) g(a=a, b=b, c=c)

 This comes up in ggplot2 where I alias many functions to hide the fact
 that I'm using proto in the background, and I'd like the aliased
 functions to be a little more user friendly in terms of revealing what
 are valid arguments (and also to match up with the rdoc files which I
 am building).

 Any ideas?

 Thanks,

 Hadley

 --
 http://had.co.nz/

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Aliasing a function

2008-02-23 Thread Gabor Grothendieck
I assume he wants to be able to change the
formals although its confusing since the example
uses the same formals in both cases.

On Sat, Feb 23, 2008 at 6:32 PM, John Fox [EMAIL PROTECTED] wrote:
 Dear Hadley,

 Why not just f <- g ?

 Regards,
  John

 
 John Fox, Professor
 Department of Sociology
 McMaster University
 Hamilton, Ontario, Canada L8S 4M4
 905-525-9140x23604
 http://socserv.mcmaster.ca/jfox



  -Original Message-
  From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
  project.org] On Behalf Of hadley wickham
  Sent: February-23-08 5:40 PM
  To: R-devel
  Subject: [Rd] Aliasing a function
 
  A simple way to alias a function is to do :
 
   g <- function(a = 1, b = 2, c = 3) a + b * c
   f <- function(...) g(...)
 
  but formals (etc) is no longer very helpful.  Is there an easy way to
  programmatically create:
 
   f <- function(a=1, b=2, c=3) g(a=a, b=b, c=c)
 
  This comes up in ggplot2 where I alias many functions to hide the fact
  that I'm using proto in the background, and I'd like the aliased
  functions to be a little more user friendly in terms of revealing what
  are valid arguments (and also to match up with the rdoc files which I
  am building).
 
  Any ideas?
 
  Thanks,
 
  Hadley
 
  --
  http://had.co.nz/
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] NEWS and ChangeLog in available.packages()

2008-02-18 Thread Gabor Grothendieck
With the newly available NEWS and ChangeLog links on the CRAN
pages for packages it would be nice if available.packages()
gave information on those.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] NEWS and ChangeLog in available.packages()

2008-02-18 Thread Gabor Grothendieck
On Feb 18, 2008 11:59 AM, Martin Maechler [EMAIL PROTECTED] wrote:
  GaGr == Gabor Grothendieck [EMAIL PROTECTED]
  on Mon, 18 Feb 2008 10:19:05 -0500 writes:

GaGr With the newly available NEWS and ChangeLog links on
GaGr the CRAN pages for packages it would be nice if
GaGr available.packages() gave information on those.

 how would you propose?

 It returns a character matrix, and that probably  should  not be
 changed.


Currently the Depends (and Imports, Contains and Suggests) entries are
single character strings whose comma-separated components are pasted
together into one large string, so maybe there could be a Docs entry whose
substrings list NEWS, ChangeLog and the vignette and demo names, or maybe
separate entries for NEWS/ChangeLog, vignettes and demos.
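
(Not part of the original message: a small sketch of how the existing
comma-separated fields can already be pulled apart; a hypothetical Docs
field could be handled the same way.)

ap <- available.packages()
deps <- strsplit(ap[, "Depends"], ",[[:space:]]*")
deps[1:2]   # each element is a character vector of dependency entries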

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Arithmetic bug? (found when use POSIXct) (PR#10776)

2008-02-17 Thread Gabor Grothendieck
OK.  Good point.

Note that that article was written 4 years ago,
when R was at version 1.9 and POSIXt did not fully support
subseconds.  At that time POSIXct could represent subseconds
internally, but they were not used since POSIXlt did not yet support
them; that support, and the associated digits.secs option, did not come
until R version 2.3, released two years after the article was written.

Try setting the digits.secs option before you run the command
like this:

 options(digits.secs = 3)
 dp <- Sys.time()
 dp - as.POSIXct(format(dp, tz = "GMT"))
Time difference of -5 hours
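
A sketch (not in the original reply) of the underlying cause: without
digits.secs set, format() drops the fractional second, so the round trip
through as.POSIXct() lands on a slightly different instant.

options(digits.secs = NULL, digits = 22)
a <- Sys.time()
b <- as.POSIXct(format(a, tz = "GMT"))   # fractional second is lost here
as.numeric(a) - as.numeric(b)            # off from a whole number of hours by that fraction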

On Feb 17, 2008 3:50 PM,  [EMAIL PROTECTED] wrote:

 Hi Gabo,

  FAQ 7.31 does not apply to this. numeric in R is 64-bit, which is large
  enough to handle this.

  I figured out that the cause is that Sys.time() gives a millisecond value while
  b <- as.POSIXct(format(a, tz = "GMT")) cuts off that part.

 Then of course a-b is not 5 hours!

  Do it again with options(digits = 22) and you will see what actually
  happened there.

  The recommended code from R News is not aware of the fact that POSIXlt can
  represent sub-second time.

 Cheers,

 B

[[alternative HTML version deleted]]

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Arithmetic bug? (found when use POSIXct) (PR#10776)

2008-02-16 Thread Gabor Grothendieck
See FAQ 7.31 or explain what you believe to be the problem.

On Feb 16, 2008 1:05 PM,  [EMAIL PROTECTED] wrote:
 Full_Name: Bo Zhou
 Version: 2.6.1 (2007-11-26)
 OS: Windows XP
 Submission from: (NULL) (207.237.54.242)


 Hi,

 I found an arithmetic problem when I'm doing something with POSIXct
 The code to reproduce it is as follows (This is the recommended way of finding
 out time zone difference on R News 2004-1 Page 32 URL
 http://cran.r-project.org/doc/Rnews/Rnews_2004-1.pdf)

 a=Sys.time()
  b <- as.POSIXct(format(a, tz = "GMT"))
 a-b
 unclass(a)
 unclass(b)
 unclass(a)-unclass(b)
 as.numeric(a)
 as.numeric(b)
 as.numeric(a)-as.numeric(b)

 The result on my machine

  a=Sys.time()
   b <- as.POSIXct(format(a, tz = "GMT"))
  a-b
 Time difference of -4.69 hours
  unclass(a)
 [1] 1203184447
  unclass(b)
 [1] 1203202447
 attr(,"tzone")
 [1] ""
  unclass(a)-unclass(b)
 [1] -17999.89
 attr(,"tzone")
 [1] ""
  as.numeric(a)
 [1] 1203184447
  as.numeric(b)
 [1] 1203202447
  as.numeric(a)-as.numeric(b)
 [1] -17999.89

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] assigning NULLs to elements of a list

2008-02-13 Thread Gabor Grothendieck
But what about by name?

a <- list(a = 1, b = 2, c = 3)

a$b <- NULL
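
For completeness, a sketch (not in the original message) of the name-based
analogue of the x[2] <- list(NULL) idiom quoted below, which stores an
actual NULL instead of deleting the element:

a <- list(a = 1, b = 2, c = 3)
a["b"] <- list(NULL)   # element "b" is kept; its value is now NULL
a$b <- NULL            # element "b" is deleted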


On Feb 13, 2008 9:39 AM, Oleg Sklyar [EMAIL PROTECTED] wrote:
 Hmm, I think the pretty traditional R style does the job...

 a = list(1,2,3)
 a[-2]

 So I really do not see a good reason for doing a[2] = NULL instead of a
 = a[-2]


 Jeffrey J. Hallman wrote:
 From your tone, I gather you don't much like this behavior, and I can see your
  point, as it is not very intuitive that setting a list element to NULL deletes
  any existing element at that index.  But is there a better way to delete an
  element from a list?  Maybe there should be.
 
  Jeff
 
  Prof Brian Ripley [EMAIL PROTECTED] writes:
  I have just came across an (unexpected to me) behaviour of lists when
  assigning NULLs to list elements. I understand that a NULL is a valid R
  object, thus assigning a NULL to a list element should yield exactly the
  same result as assigning any other object. So I was surprised when
  assigning a NULL in fact removed the element from the list. Is this an
  intended behaviour? If so, does anybody know where is it documented and
  what is a good way around?
  Yes, it was apparently intended: R has long done this.
 
   x <- list(a=c(1L,2L), b=matrix(runif(4),2,2), c=LETTERS[1:3])
   x[2] <- list(NULL)
 
  is what I think you are intending.
 
  See e.g. the comment in subassign.c
 
   /* If val is NULL, this is an element deletion */
   /* if there is a match to nlist otherwise x */
   /* is unchanged.  The attributes need adjustment. */
 

 --
 Dr Oleg Sklyar * EBI-EMBL, Cambridge CB10 1SD, UK * +44-1223-494466

 __

 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] lm with factors for which only one level occurs

2008-02-08 Thread Gabor Grothendieck
Consider the following:

 lm(conc ~ Type, CO2, subset = Plant == "Quebec")
Error in `contrasts<-`(`*tmp*`, value = "contr.treatment") :
  contrasts can be applied only to factors with 2 or more levels

Here Type is a factor for which only one level occurs
within the Quebec subset.   This is something that
one would think would be possible to handle in lm.  (Note
that singular.ok = TRUE is the default.)

I came across this when automatically performing lm's
on the levels of a conditioning variable, such as Plant in the above
example, and found I had to unexpectedly special case situations
like the above.

lmList() in lme4 is likely adversely affected by this as well.
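
One possible workaround (a sketch only, not from the original post) is to
drop factor terms with fewer than two observed levels before fitting; the
conc ~ Type + Treatment starting formula below is just an illustration:

co2q <- subset(CO2, Plant == "Quebec")
keep <- sapply(co2q[c("Type", "Treatment")],
               function(v) !is.factor(v) || nlevels(factor(v)) > 1)
rhs  <- if (any(keep)) paste(names(keep)[keep], collapse = " + ") else "1"
fit  <- lm(as.formula(paste("conc ~", rhs)), data = co2q)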

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Choosing CRAN Mirror in Windows GUI

2008-02-07 Thread Gabor Grothendieck
When you choose the CRAN Mirror in the Windows
GUI Packages menu in R version 2.6.2 alpha (2008-01-26 r44181)
the first mirror is highlighted when you open it even
if the mirror had been set.  If the mirror had previously been
set then that previously set mirror should be highlighted rather
than the first mirror.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] a != a*1 != a+0 != +a

2008-02-04 Thread Gabor Grothendieck
I don't think global options are desirable.

I would make your operators work in a single way and if the user
wants a different way he can call whatever functions you provide
directly instead of using the operators.

If you really need two ways with operators I would define a subclass
with operators working in the alternative way and have as.whatever
methods that converts back and forth.

Another possibility is to have the way it works be a special
attribute of the object itself (other than class).
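
A bare-bones sketch of the subclass idea (the class and helper names here
are made up, and trim() and the parent class's Ops method are assumed to be
supplied by the package):

as.trimmed   <- function(x) structure(x, class = c("trimmed_multipol", class(x)))
as.untrimmed <- function(x) structure(x, class = setdiff(class(x), "trimmed_multipol"))
Ops.trimmed_multipol <- function(e1, e2) {
  ans <- NextMethod()     # ordinary (untrimmed) arithmetic from the parent class
  as.trimmed(trim(ans))   # then simplify the result
}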

On Feb 4, 2008 9:25 AM, Robin Hankin [EMAIL PROTECTED] wrote:

 Hi

 I am writing a package for multivariate polynomials ('multipols')
 using S3 methods.

 The package includes a Ops.multipol()  function for the
 arithmetic methods;  I would like
 to define some sort of user-specified Boolean option which, if
 set,  would force results to be simplified as they are produced.

 Call this option trim.  Trimming a multipol results in
 a smaller array that is more manageable.

 Mostly one wants to trim, sometimes not.


 Would options() be a good way to manage this?

 One issue is the behaviour of unary operators + and -.

 If trim is TRUE, then  a   is one thing,  but +a  returns
 trim(a), which might be different.

 Also 1*a would be different from a and a+0



 Does the List consider this to be Good Practice?

 Has anyone got comments?



 --
 Robin Hankin
 Uncertainty Analyst and Neutral Theorist,
 National Oceanography Centre, Southampton
 European Way, Southampton SO14 3ZH, UK
  tel  023-8059-7743

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Pb with lapply()

2008-01-31 Thread Gabor Grothendieck
The problem of promises not being evaluated in
lists has been discussed before.

I think its fixed in the development version of R. See
r44139 on Jan 24, 2008 in http://developer.r-project.org/R.svnlog.2007

On Jan 31, 2008 1:26 PM,  [EMAIL PROTECTED] wrote:
 Hi,

 If needed, lapply() tries to convert its first argument into a list
 before it starts doing something with it:

   lapply
  function (X, FUN, ...)
  {
 FUN <- match.fun(FUN)
 if (!is.vector(X) || is.object(X))
 X <- as.list(X)
.Internal(lapply(X, FUN))
  }

 But in practice, things don't always seem to work as suggested by
 this code (at least to the eyes of a naive user).

 I have defined an as.list method for my S4 class A:

    setClass("A", representation(data="list"))
   [1] "A"
    setMethod("as.list", "A", function(x, ...) x@data)
   Creating a new generic function for "as.list" in ".GlobalEnv"
   [1] "as.list"

 Testing it:

    a <- new("A", data=list(8, 2:0))
   as.list(a)
  [[1]]
  [1] 8

  [[2]]
  [1] 2 1 0

 OK.

 But lapply() doesn't work on 'a':

   lapply(a, typeof)
   Error in as.vector(x, "list") : cannot coerce type 'S4' to vector

 I still have to do the 'as.list(a)' part myself for things to work:

   lapply(as.list(a), typeof)
   [[1]]
   [1] "double"

   [[2]]
   [1] "integer"

 Seems like using force() inside lapply() would solve the problem:

  lapply2 <- function(X, FUN, ...)
   {
 FUN <- match.fun(FUN)
 if (!is.vector(X) || is.object(X))
 X <- force(as.list(X))
.Internal(lapply(X, FUN))
  }

 It works now:

   lapply2(a, typeof)
   [[1]]
   [1] "double"

   [[2]]
   [1] "integer"

 Cheers,
 H.

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Pb with lapply()

2008-01-31 Thread Gabor Grothendieck
I just checked and the
code that previously triggered the promises in lists
bug seems to work now so I guess it is a different
problem.

# code below previously triggered errors but now works
# R 2.6.2 (2008-01-26 r44181) on Vista
f <- function(x) environment()
z <- as.list(f(7))
dput(z)
structure(list(x = 7), .Names = "x")
z[[1]] == 7
force(z[[1]]) == 7


On Jan 31, 2008 2:33 PM,  [EMAIL PROTECTED] wrote:
 Hi Gabor,

 Quoting Gabor Grothendieck [EMAIL PROTECTED]:

  The problem of promises not being evaluated in
  lists has been discussed before.
 
  I think its fixed in the development version of R. See
  r44139 on Jan 24, 2008 in http://developer.r-project.org/R.svnlog.2007
 

 I'm using R-devel r44238 and the problem is still here.

 Cheers,
 H.

  sessionInfo()
 R version 2.7.0 Under development (unstable) (2008-01-29 r44238)
 x86_64-unknown-linux-gnu

 locale:
 LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=en_US.UTF-8;LC_MESSAGES=en_US.UTF-8;LC_PAPER=en_US.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_US.UTF-8;LC_IDENTIFICATION=C

 attached base packages:
 [1] stats graphics  grDevices utils datasets  methods   base





__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Feature request: about lag(), which.min() and cat().

2008-01-31 Thread Gabor Grothendieck
On Jan 31, 2008 8:24 AM, Matthieu Stigler [EMAIL PROTECTED] wrote:
 lag()
 If one wants to construct a time series model, it is often useful to
 have the lags matrix. This is available with embed(ts, k) but the time
 series class disappears. So it would be nice if the argument k of
 lag() also admitted values of length > 1, which would give the same result
 as embed() but keep the class. In his wish list of 1 January 2008 (point
 8), Gabor Grothendieck spoke about a function Lag. Maybe also a function
 lags, if the idea of length(k) > 1 is not good?

It would be nice.  Note that lag.zoo in the zoo package does support this
for zoo objects.

 library(zoo)
 z <- zoo((11:20)^2)
 lag(z, 1:2)
  lag1 lag2
1  144  169
2  169  196
3  196  225
4  225  256
5  256  289
6  289  324
7  324  361
8  361  400
9  400   NA

 # regress z on lag(z, 1:2) using dyn package
 library(dyn)
 dyn$lm(z ~ lag(z, 1:2))

Call:
lm(formula = dyn(z ~ lag(z, 1:2)))

Coefficients:
 (Intercept)  lag(z, 1:2)1  lag(z, 1:2)2
           2             2            -1

# as a workaround one could convert to zoo and back

 x <- ts((11:20)^2)
 as.ts(lag(as.zoo(x), 1:2))
Time Series:
Start = 1
End = 9
Frequency = 1
  lag1 lag2
1  144  169
2  169  196
3  196  225
4  225  256
5  256  289
6  289  324
7  324  361
8  361  400
9  400   NA
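
Another possible workaround (a sketch using only stats) is ts.intersect(),
which builds the lag matrix while keeping the "ts" class:

 x <- ts((11:20)^2)
 lagmat <- ts.intersect(x, lag(x, -1), lag(x, -2))  # multivariate "ts" of x and its lags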


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [wishlist, patch] Removing .bzr/ directory when calling R CMD build (PR#10625)

2008-01-23 Thread Gabor Grothendieck
If you place an .Rbuildignore file in the top level of
your package with this single line as its contents:

[.]bzr$

then the .bzr file will not be included.
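
For example, from R in the package's top-level directory (a sketch; adjust
the path if you are not already in that directory):

writeLines("[.]bzr$", ".Rbuildignore")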


On Jan 23, 2008 1:45 PM,  [EMAIL PROTECTED] wrote:
 Full_Name: Ben Goodrich
 Version: 2.6.1
 OS: Debian
 Submission from: (NULL) (128.103.222.166)


 bzr is another version control system and adds a .bzr folder to the top-level
 directory of a package, similar to .svn and .git for subversion and git
 respectively. However, while R CMD build removes directories called .svn, 
 .git,
 and some others, it does not remove .bzr . As a result, the .bzr folder is
 unnecessarily included in the cleanly built package and there are accurate but
 unnecessary warnings that say certain subdirectories of the .bzr directory are
 empty. I believe the following patch, which is lightly tested with R-devel, is
 sufficient to remove the .bzr folder and prevent these warnings. It does not,
 however, remove the .bzrignore file (if it exists) but perhaps it should be
 augmented to do so.

 --- src/scripts/build.in2008-01-23 11:42:47.0 -0500
 +++ build.in2008-01-23 11:45:02.0 -0500
 @@ -215,6 +215,7 @@
print EXCLUDE "$File::Find::name\n" if(-d $_ && /^\.svn$/);
print EXCLUDE "$File::Find::name\n" if(-d $_ && /^\.arch-ids$/);
print EXCLUDE "$File::Find::name\n" if(-d $_ && /^\.git$/);
 +   print EXCLUDE "$File::Find::name\n" if(-d $_ && /^\.bzr$/);
## Windows DLL resource file
push(@exclude_patterns, "^src/" . $pkgname . "_res\\.rc");
my $filename = $File::Find::name;

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [wishlist, patch] Removing .bzr/ directory when calling R CMD build (PR#10625)

2008-01-23 Thread Gabor Grothendieck
On Jan 23, 2008 2:23 PM, Ben Goodrich [EMAIL PROTECTED] wrote:
 Gabor Grothendieck wrote:
  If you place an .Rbuildignore file in the top level of
   your package with this single line as its contents:
 
  [.]bzr$
 
  then the .bzr file will not be included.

 Thank you for the suggestion. In order to remove both the .bzr/
 directory and the .bzrignore file, it seems to be necessary to specify

 [.]bzr$
 [.]bzrignore

or perhaps

[.]bzr.*

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] seekViewport error

2008-01-23 Thread Gabor Grothendieck
On Jan 23, 2008 9:38 PM, Paul Murrell [EMAIL PROTECTED] wrote:
 Hi


 Gabor Grothendieck wrote:
  Why does the seekViewport at the bottom give an error?


 Because the viewport is popped after GRID.cellGrob.84 is drawn.

 grid.ls() shows the viewport because it recurses down into the legend
 frame grob.  Compare your output with (grid-generated numbering differs)
  ...

   grid.ls(recurs=FALSE, view=TRUE)
 ROOT
   GRID.rect.28
   plot1.toplevel.vp
 plot1.xlab.vp
   plot1.xlab
   1
 plot1.ylab.vp
   plot1.ylab
   1
 plot1.strip.1.1.off.vp
   GRID.segments.29
   1
 plot1.strip.left.1.1.off.vp
   GRID.segments.30
   GRID.text.31
   1
 plot1.panel.1.1.off.vp
   GRID.segments.32
   GRID.text.33
   GRID.segments.34
   1
 plot1.panel.1.1.vp
   GRID.points.35
   GRID.points.36
   GRID.points.37
   1
 plot1.panel.1.1.off.vp
   GRID.rect.38
   1
 plot1.legend.top.vp
   GRID.frame.9
   1
 plot1.
   1
 1

 If you look at what viewports are actually available, via
 current.vpTree(), you'll see that GRID.VP.24 is not there.

 The problem (see also
 https://stat.ethz.ch/pipermail/r-help/2008-January/151655.html) is that
 cellGrobs (children of frame grobs) use their 'vp' component to store
 the viewport that positions them within the parent frame.  This means
 that the viewport is pushed and then popped (as per normal behaviour for
 'vp' components).

 A possible solution that I am currently trialling uses a special
 'cellvp' slot instead so that the cellGrob viewports are pushed and then
 upped.  That way they remain available after the cellGrob has drawn,
 so you can downViewport() to them.

 The disadvantage of this approach is that the viewports no longer appear
 in the grid.ls() listing (because grid.ls() has no way of knowing about
 special components of grobs that contain viewports).  This effect can
 already be seen by the fact that the viewport for the frame grob
 (GRID.frame.70) is not shown in the grid.ls() output.  On the other
 hand, the viewports will be visible via current.vpTree()  ...

Perhaps some convention could be adopted which, if followed, would
let grid.ls know?  If that worked at least for graphics generated from
lattice and ggplot2 that would likely satisfy a significant percentage
of uses.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] experiments with slot functions and possible problems NOTE

2008-01-21 Thread Gabor Grothendieck
If the intention is to place fList's contents in the global environment
then you need to specify that in addtoenv or else it assumes
the parent environment.

 flistA <- list(foo = function () 1:10, bar = function() log(foo()))
 makefun <- function(fList) addtoenv(fList, .GlobalEnv)
 makefun(flistA)
$foo
function() {
   1:10
  }

$bar
function() {
log(foo())
  }

 foo()
 [1]  1  2  3  4  5  6  7  8  9 10
 bar()
 [1] 0.0000000 0.6931472 1.0986123 1.3862944 1.6094379 1.7917595 1.9459101
 [8] 2.0794415 2.1972246 2.3025851

Note that this takes advantage of the fact that in your example flistA was
defined in the global environment in the first place.  Had that not been the
case we would have had to reset the environment of bar so that it could
find foo.

By the way.  What about just attach(flistA) ?
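
That is, something like this sketch (using the flistA from the example
quoted below):

flistA <- list(foo = function() 1:10, bar = function() log(foo()))
attach(flistA)    # foo and bar are now found via the search path
bar()
detach(flistA)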


On Jan 21, 2008 8:30 AM, Thomas Petzoldt [EMAIL PROTECTED] wrote:
 Hello,

 first of all, thanks to LT for \pkg{codeutils}. I agree that it is
 indeed very useful to identify errors and also to encourage re-thinking
 past solutions. My problem:

 I want to compare different sets of related sub-functions which should
 be used alternatively by the same top-level function. Sets of related
 functions should be bound together (as lists) and the workspace should
 be as clean as possible.

 Finally, these functions are to be called by top-level functions that
 work with such sets.

 What's the best way to do this?

 - clutter the workspace with lots of functions?
 OR:
 - ignore notes about possible problems
 OR:
 - a third way?

 Thanks in advance

 Thomas P.



 An example:

 ##=
 ## 1) One possible set of functions
 flistA <- list(
   foo = function() {
1:10
   },
   bar = function() {
 log(foo())
   }
 )

 ## .. we may also have alternative sets,
 ##e.g. flistB, flistC, ... etc

 ## 2) Now we try to construct closures

 ## 2a) non-nested
 makefun1 <- function(flist) {
   with(flist,
 function() foo()
   )
 }

 ## 2b) nested call
 makefun2 <- function(flist) {
   with(flist,
 function() bar()
   )
 }

 ## 2c) or use an alternative way with a special function
 ##  addtoenv, suggested by Gabor Grothendieck some times ago:
 addtoenv <- function(L, p = parent.frame()) {
   for(nm in names(L)) {
 assign(nm, L[[nm]], p)
 environment(p[[nm]]) <- p
   }
   L
 }

 makefun3 <- function(flist) {
   addtoenv(flist)
   function() bar()
 }

 ## 3) now we create the top-level functions
 ##with one particular set of functions
 m1 <- makefun1(flistA)
 m2 <- makefun2(flistA)
 m3 <- makefun3(flistA)

 m1()
 ## this was no problem, trivial

 m2()
 # Error in bar() : could not find function foo

 m3()
 # works, but even in that case we get problems
 # if we do this in a package:

 # * checking R code for possible problems ... NOTE
 # bar: no visible global function definition for 'foo'

 ## tested with R version 2.6.1 and
 ## R 2.7.0 Under development, svn rev. 44061





 --
 Thomas Petzoldt
 Technische Universitaet Dresden
 Institut fuer Hydrobiologie
 01062 Dresden
 GERMANY

 http://tu-dresden.de/Members/thomas.petzoldt

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Great tool

2008-01-21 Thread Gabor Grothendieck
Tracking down the free variables in a function when
reworking old code.
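
For example (a sketch, not from the original message), codetools'
findGlobals() lists them directly:

library(codetools)
f <- function(x) a * x + b                # a and b are free variables here
findGlobals(f, merge = FALSE)$variables   # reports "a" and "b"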

On Jan 21, 2008 12:41 PM, Charles C. Berry [EMAIL PROTECTED] wrote:
 On Sun, 20 Jan 2008, Gabor Grothendieck wrote:

  I agree.  Its incredibly useful.

 OK gentlemen, you have piqued my curiosity.

 Can you give an example or two of situations you encountered in which a
 codetools function was so helpful?

 Chuck



 
  On Jan 20, 2008 11:02 PM, Henrik Bengtsson [EMAIL PROTECTED] wrote:
  Hi,
 
  I just have to drop a note to say that the 'codetools' (and the part of R
  CMD check that use it) is a pleasure to use and saves me from hours of
  troubleshooting.  Each time it finds something I am amazed how
  accurate it is.  Thanks to Luke T. and everyone else involved in
  creating it.
 
  Cheers,
 
  Henrik
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 Charles C. Berry(858) 534-2098
 Dept of Family/Preventive Medicine
 E mailto:[EMAIL PROTECTED]   UC San Diego
 http://famprevmed.ucsd.edu/faculty/cberry/  La Jolla, San Diego 92093-0901




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Updating library

2008-01-19 Thread Gabor Grothendieck
Thanks.  That did it.

On Jan 19, 2008 2:19 AM, Prof Brian Ripley [EMAIL PROTECTED] wrote:
 You need to use 'Run as administrator' for just the session updating
 those packages (assuming you installed R with administrator privileges and
 are running it without: you did not say).

 This is covered in rw-FAQ Q2.24.


 On Fri, 18 Jan 2008, Gabor Grothendieck wrote:

  On Vista my R installation is in
 
   C:\Program Files\R\R-2.6.0
 
  and there is a library here:
 
%userprofile%\Documents\R\win-library\2.6
 
  If I use the GUI Packages menu to update my library it
  works ok _except_ for any packages such as lattice and
  the VR bundle that come with R.  For those I get this (this
  is what I got when I tried to update lattice but I get a similar
  message for the VR bundle as well):
 
  update.packages(ask='graphics')
  --- Please select a CRAN mirror for use in this session ---
  Warning in install.packages(update[instlib == l, Package], l,
  contriburl = contriburl,  :
   'lib = C:/PROGRA~1/R/R-26~1.0/library' is not writable
  Error in install.packages(update[instlib == l, Package], l,
  contriburl = contriburl,  :
   unable to install packages
 
  I am using R-2.6.1 (even though the directory says 2.6.0) since I
  tend to write over the prior directory if its just an upgrade involving
  the last digit of the R version.
 
  R.version.string
  [1] R version 2.6.1 Patched (2007-12-06 r43610)
 
  How should I go about updating lattice and the VR bundle?
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 --
 Brian D. Ripley,  [EMAIL PROTECTED]
 Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
 University of Oxford, Tel:  +44 1865 272861 (self)
 1 South Parks Road, +44 1865 272866 (PA)
 Oxford OX1 3TG, UKFax:  +44 1865 272595


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Updating library

2008-01-19 Thread Gabor Grothendieck
Ideally one could just choose the
 packages | update
menu and it would just work.  Possibilities include updating packages into
the private library if they otherwise fail to install or providing an
error message
on what to do.
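
A sketch of the manual fallback discussed below: install the affected
packages into the private library so that later updates succeed without
administrator rights (the use of R_LIBS_USER here is an assumption about how
the private library is configured):

mylib <- Sys.getenv("R_LIBS_USER")
install.packages(c("lattice", "VR"), lib = mylib)
update.packages(lib.loc = mylib, ask = "graphics")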

On Jan 19, 2008 9:13 AM, Uwe Ligges [EMAIL PROTECTED] wrote:



 Peter Dalgaard wrote:
  Prof Brian Ripley wrote:
  You need to use 'Run as administrator' for just the session updating
  those packages (assuming you installed R with administrator privileges and
  are running it without: you did not say).
 
  This is covered in rw-FAQ Q2.24.
 
 
  I only have WINE to hand here and package installation doesn't quite
  work there, but shouldn't it also work in the  absense of administrative
  privileges to just install lattice and VR into your private win-library
  and update them there?

 Sure, but that works. What Gabor reported is that updating lattice/VR
 and co did not succeed. And it did not, because it was installed in R's
 default library (which can only be updated with administrator privileges
 in his case, we guess).
 Gabor could install lattice/VR into his private library and after that
 updating will succeed.

 Best,
 uwe






 
  -p
 
  On Fri, 18 Jan 2008, Gabor Grothendieck wrote:
 
 
  On Vista my R installation is in
 
   C:\Program Files\R\R-2.6.0
 
  and there is a library here:
 
%userprofile%\Documents\R\win-library\2.6
 
  If I use the GUI Packages menu to update my library it
  works ok _except_ for any packages such as lattice and
  the VR bundle that come with R.  For those I get this (this
  is what I got when I tried to update lattice but I get a similar
  message for the VR bundle as well):
 
 
  update.packages(ask='graphics')
 
  --- Please select a CRAN mirror for use in this session ---
  Warning in install.packages(update[instlib == l, Package], l,
  contriburl = contriburl,  :
   'lib = C:/PROGRA~1/R/R-26~1.0/library' is not writable
  Error in install.packages(update[instlib == l, Package], l,
  contriburl = contriburl,  :
   unable to install packages
 
  I am using R-2.6.1 (even though the directory says 2.6.0) since I
  tend to write over the prior directory if its just an upgrade involving
  the last digit of the R version.
 
 
  R.version.string
 
  [1] R version 2.6.1 Patched (2007-12-06 r43610)
 
  How should I go about updating lattice and the VR bundle?
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 
 
 
 
 

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Updating library

2008-01-18 Thread Gabor Grothendieck
On Vista my R installation is in

  C:\Program Files\R\R-2.6.0

and there is a library here:

   %userprofile%\Documents\R\win-library\2.6

If I use the GUI Packages menu to update my library it
works ok _except_ for any packages such as lattice and
the VR bundle that come with R.  For those I get this (this
is what I got when I tried to update lattice but I get a similar
message for the VR bundle as well):

 update.packages(ask='graphics')
--- Please select a CRAN mirror for use in this session ---
Warning in install.packages(update[instlib == l, Package], l,
contriburl = contriburl,  :
  'lib = C:/PROGRA~1/R/R-26~1.0/library' is not writable
Error in install.packages(update[instlib == l, Package], l,
contriburl = contriburl,  :
  unable to install packages

I am using R-2.6.1 (even though the directory says 2.6.0) since I
tend to write over the prior directory if it's just an upgrade involving
the last digit of the R version.

 R.version.string
[1] R version 2.6.1 Patched (2007-12-06 r43610)

How should I go about updating lattice and the VR bundle?

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] as.function()

2008-01-14 Thread Gabor Grothendieck
On Jan 14, 2008 6:50 AM, Duncan Murdoch [EMAIL PROTECTED] wrote:
 Robin Hankin wrote:
  Hi
 
  [this after some considerable thought as to R-help vs R-devel]
 
 
 
  I want to write a (S3) method for as.function();
  toy example follows.
 
  Given a matrix a, I need to evaluate trace(ax) as a function of
  (matrix) x.
 
  Here's a trace function:
 
  tr <-  function (a)  {
   i <- seq_len(nrow(a))
   return(sum(a[cbind(i, i)]))
  }
 
 
  How do I accomplish the following:
 
 
  a <- crossprod(matrix(rnorm(12),ncol=3))
  class(a) <- "foo"
 
  f <- as.function(a)   # need help to write as.function.foo()
  x <- diag(3)
 
  f(x) #should give tr(ax)
 
  a <- 4
  f(x)   # should still give tr(ax) even though a has been
  reassigned.
 
 
 Brian's answer was what you want.  A less general version is this:

   as.function.foo <- function(x, ...) {
 +function(b) tr(x %*% b)
 + }


This can also be done using the proto package.  p has two
components b and f.  q inherits f from p but has its own b.

library(proto)
p <- proto(b = 1:4, f = function(., x) sum(diag(x %*% .$b)))
q <- p$proto(b = 5:8)
p$f(1:4)
q$f(1:4)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] as.function()

2008-01-14 Thread Gabor Grothendieck
The gsubfn package can do something like that too.  If you
preface a function with fn$ then it will interpret certain formula
arguments as functions.  If all we want is the function itself we
can use force, the identity function, to recover it:

 library(gsubfn)
 fn$force(~ 2*x + 3*y^2)
function (x, y)
2 * x + 3 * y^2

If there are free variables in the formula that you don't want to
include in the argument list the left hand side can be used to
specify the argument list:

 fn$force(x + y ~ 2*x + a*y^2)
function (x, y)
2 * x + a * y^2



On Jan 14, 2008 1:05 PM, Tony Plate [EMAIL PROTECTED] wrote:
 How about this as a version  that automatically constructs the argument
 list (and make into a method for as.function as appropriate)?

 makefun <- function(expr)
 {
    f <- function() {}
    body(f) <- expr
    vars <- all.vars(expr)
    if (length(vars)) {
        args <- alist(x=)[rep(1,length(vars))]
        names(args) <- vars
        formals(f) <- args
    }
    environment(f) <- globalenv()
    return(f)
 }

   makefun(expression(2*x + 3*y^2))
 function (x, y)
 2 * x + 3 * y^2
   makefun(expression(2*x + 3*y^2 - z))
 function (x, y, z)
 2 * x + 3 * y^2 - z
   makefun(expression(p1 + p2))
 function (p1, p2)
 p1 + p2
  

 -- Tony Plate




 Henrique Dallazuanna wrote:
  Try this:
 
  as.function.foo <- function(obj, ...)
  {
  newobj <- function(x, ...){}
  body(newobj) <- obj
  return(newobj)
  }
 
  x <- expression(2*x + 3*x^2)

  foo <- as.function.foo(x)
  foo(2)
 
 
  Hope this help
 
  On 14/01/2008, Robin Hankin [EMAIL PROTECTED] wrote:
 
  Antonio
 
 
  thanks for your help here, but it doesn't answer my question.
 
  Perhaps if I outline my motivation it would help.
 
 
  I want to recreate the ability of
  the polynom package to do the following:
 
 
library(polynom)
  p <- polynomial(1:4)
  p
  1 + 2*x + 3*x^2 + 4*x^3
  MySpecialFunction <- as.function(p)
  MySpecialFunction(1:10)
  [1]   10   49  142  313  586  985 1534 2257 3178 4321
  p <- 4
MySpecialFunction(1:10)
[1]   10   49  142  313  586  985 1534 2257 3178 4321
   
 
 
  See how the user can define object MySpecialFunction,
which outlives short-lived polynomial p.
 
  Unfortunately, I don't see a way to modify as.function.polynomial()
  to do what I want.
 
 
  best wishes
 
 
  rksh
 
 
 
 
 
 
 
 
 
  On 14 Jan 2008, at 08:45, Antonio, Fabio Di Narzo wrote:
 
 
  2008/1/14, Robin Hankin [EMAIL PROTECTED]:
 
  Hi
 
  [this after some considerable thought as to R-help vs R-devel]
 
 
 
  I want to write a (S3) method for as.function();
  toy example follows.
 
  Given a matrix a, I need to evaluate trace(ax) as a function of
  (matrix) x.
 
  Here's a trace function:
 
  tr <-  function (a)  {
  i <- seq_len(nrow(a))
  return(sum(a[cbind(i, i)]))
  }
 
 
  How do I accomplish the following:
 
 
  a <- crossprod(matrix(rnorm(12),ncol=3))
  class(a) <- "foo"
 
  f <- as.function(a)   # need help to write as.function.foo()
  x <- diag(3)
 
  f(x) #should give tr(ax)
 
  What about the following?
 
  as.function.foo <- function(a, ...)
   function(x)
 sum(diag(a*x))
 
  However, I don't see the need for an S3 method. Why don't simply use
  (?):
  mulTraceFun <- function(a)
   function(x)
sum(diag(a*x))
 
  So you also have a more meaningful name than an anonymous
  'as.function'.
 
  HTH,
  Antonio.
 
 
  a <- 4
  f(x)   # should still give tr(ax) even though a has been
  reassigned.
 
  This would'nt work with my proposal, because of lexical scoping.
 
 
 
 
 
  [my real example is very much more complicated than this but
  I need this toy one too and I can't see how to modify
  as.function.polynomial()
  to do what I want]
 
 
 
 
  --
  Robin Hankin
  Uncertainty Analyst and Neutral Theorist,
  National Oceanography Centre, Southampton
  European Way, Southampton SO14 3ZH, UK
   tel  023-8059-7743
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 
 
  --
  Antonio, Fabio Di Narzo
  Ph.D. student at
  Department of Statistical Sciences
  University of Bologna, Italy
 
  --
  Robin Hankin
  Uncertainty Analyst and Neutral Theorist,
  National Oceanography Centre, Southampton
  European Way, Southampton SO14 3ZH, UK
tel  023-8059-7743
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 
 
 
 
 

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R Bugs link broken

2008-01-11 Thread Gabor Grothendieck
With the dash it goes to section 9.1 entitled R Bugs but with
the %20 it goes to the top.  It should go to 9.1; however, it
looks like someone fixed it since I posted.

On Jan 9, 2008 11:54 AM, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 The link to R Bugs in the posting guide

  http://www.r-project.org/posting-guide.html

 is broken. The - that should be in the link:

 http://cran.r-project.org/doc/FAQ/R-FAQ.html#R-Bugs

 is currently %20:

 http://cran.r-project.org/doc/FAQ/R-FAQ.html#R%20Bugs


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R Bugs link broken

2008-01-09 Thread Gabor Grothendieck
The link to R Bugs in the posting guide

  http://www.r-project.org/posting-guide.html

is broken. The - that should be in the link:

http://cran.r-project.org/doc/FAQ/R-FAQ.html#R-Bugs

is currently %20:

http://cran.r-project.org/doc/FAQ/R-FAQ.html#R%20Bugs

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] read.table: aborting based on a time constraint

2008-01-09 Thread Gabor Grothendieck
That was supposed to be file.info()$size

On Jan 9, 2008 3:27 PM, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 Use file.file()$size to find out how large the file is
 and skip files larger than some cutoff.

 On Jan 9, 2008 2:01 PM, Derek Stephen Elmerick [EMAIL PROTECTED] wrote:

  Hello –
 
  I am trying to write code that will read in multiple datasets;
  however, I would like to skip any dataset where the read-in process
  takes longer than some fixed cutoff. A generic version of the function
  is the following:
 
  for(k in  1:number.of.datasets)
  {
X[k]=read.table(…)
  }
 
  The issue is that I cannot find a way to embed logic that will abort
  the read-in process of a specific dataset without manual intervention.
  I scanned the help manual and other postings, but no luck based on my
  search. Any thoughts?
 
  Thanks,
  Derek
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] nls (with SSlogis model and upper limit) never returns (PR#10544)

2008-01-05 Thread Gabor Grothendieck
On my Windows Vista system it returns an answer:

 values <- list(x=10:30, y=c(23.85, 28.805, 28.195, 26.23, 25.005, 20.475,
+ 17.33, 14.97, 11.765, 8.857, 5.3725, 5.16, 4.2105, 2.929, 2.174, 1.25, 1.0255,
+ 0.612, 0.556, 0.4025, 0.173))
 y.max <- max(values$y)
 model <- nls(y ~ SSlogis(x, asym, xmid, scal), data=values, algorithm="port",
+ start=c(asym=y.max, xmid=15, scal=-0.5), upper=c(y.max, Inf, Inf))
 model
Nonlinear regression model
  model:  y ~ SSlogis(x, asym, xmid, scal)
   data:  values
  asym   xmid   scal
28.805 17.268 -2.223
 residual sum-of-squares: 32.02

Algorithm port, convergence message: relative convergence (4)
 R.version.string # Vista
[1] R version 2.6.1 Patched (2007-12-06 r43610)


On Jan 4, 2008 9:45 AM,  [EMAIL PROTECTED] wrote:
 Full_Name: Hendrik Weisser
 Version: 2.6.1
 OS: Linux
 Submission from: (NULL) (139.19.102.218)


 The following computation never finishes and locks R up:

  values <- list(x=10:30, y=c(23.85, 28.805, 28.195, 26.23, 25.005, 20.475,
 17.33, 14.97, 11.765, 8.857, 5.3725, 5.16, 4.2105, 2.929, 2.174, 1.25, 1.0255,
 0.612, 0.556, 0.4025, 0.173))
  y.max <- max(values$y)
  model <- nls(y ~ SSlogis(x, asym, xmid, scal), data=values,
  algorithm="port",
 start=c(asym=y.max, xmid=15, scal=-0.5), upper=c(y.max, Inf, Inf))

 This used to work with R version 2.5.1 patched.
 The problem does _not_ occur if the parameter scal=-0.5 in the nls call is
 changed, e. g. to scal=-0.6 or scal=-0.4.
 Simply calling model - nls(y ~ SSlogis(x, asym, xmid, scal), data=values)
 also works, but this does not use the upper bound for the asym parameter, 
 which
 was the point.

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Typo in DateTimeClasses.Rd

2008-01-03 Thread Gabor Grothendieck
In:

   trunk/src/library/base/man/DateTimeClasses.Rd

this:

  \section{Sub-section Accuracy}{

should be

  \section{Sub-second Accuracy}{

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Wish List

2008-01-01 Thread Gabor Grothendieck
Most of the items on this list have been mentioned before but it
may be useful to see them all together; at any rate, every year
I post my R wish list at the beginning of the year.

High priority items pertain to the foundations of R (promises,
environments) since those form the basis of everything
else and the foundation needs to be looked after first.

The medium items are focused on scripting since with a few additional
features R could work more smoothly with other software.

For the Low priority items we listed the rest.  They are not necessarily
low in terms of desirability but I wanted to focus the high and
medium items on foundations and scripting.

There is also a section at the end focusing on addon packages.
These may be strictly speaking part of R but are widely used.

High

1. Some way of inspecting promises.  It is possible to get
the expression associated with a promise using substitute but
not its environment.  Also need a way to copy a promise without
forcing it.  See:
https://stat.ethz.ch/pipermail/r-devel/2007-September/046966.html

2. Fix bug when promises are stored in lists:

f <- function(x) environment()
as.list(f(0))$x == 0 # gives error.  Should be TRUE.

3. If a package uses LazyLoad: true then R changes the class of
certain top level objects.  This does not occur if Lazyload: false
is used.  For an example see:
https://stat.ethz.ch/pipermail/r-devel/2007-October/047118.html

4. If two environment variables point to the same environment they
cannot have different attributes.  This effectively thwarts subclassing
of environments (contrary to OO principles).

Medium

5. Sweave. A common scanario is spawning a Sweave job from
another program (such as from a program controlling a
web site).  The caller needs to pass some information to the
Sweave program such as the file name of a report to produce.
It's possible to spawn R and have R spawn Sweave but given the
existence of R CMD Sweave it would be nice to be able to just
spawn R CMD Sweave directly.  Features that would help here
would be:

- support --args or some other method of passing arguments
  from R CMD Sweave line to the Sweave script

- have a facility whereby R CMD Sweave can directly generate
  the .pdf file and an argument which allows the caller to
  define the name of the resulting pdf file, e.g. -o.  (With
  automated reports one may need to have many different outputs
  from the same Rnw file so it's important to name them differently.)

- an -x argument similar to Perl/Python/Ruby such that if one calls
  R CMD Sweave -x abc myfile.Rnw then all lines up to the first one
  matching the indicated regexp, abc here, are skipped.  This
  facilitates combining the script with a shell or batch file if the
  previous is not enough.

Thus one could spawn this from their program:
R CMD Sweave --pdf myfile.Rnw -o myfile-123.pdf --args 23
and it would generate a pdf file from myfile.Rnw of the
indicated name passing 23 as arg1 to the R code embedded in the
Sweave file.

See:
https://stat.ethz.ch/pipermail/r-devel/2007-October/047195.html
https://stat.ethz.ch/pipermail/r-help/2007-December/148091.html

6. -x flag on Rscript as in perl/python/ruby.  Useful for combining batch
   and R file into a single file on non-UNIX systems.  It would cause all
   lines until a line starting with #!Rscript to be skipped by the R
   processor.  See:
   https://www.stat.math.ethz.ch/pipermail/r-devel/2007-January/044433.html
   Also see
   http://www.datafocus.com/docs/perl/pod/perlwin32.asp#running_perl_scripts
   since the same considerations as for Perl scripts applies.
   There is also some discussion here:
   https://stat.ethz.ch/pipermail/r-help/2007-November/145279.html
   https://stat.ethz.ch/pipermail/r-help/2007-November/145301.html

Low

7. Define Lag <- function(x, k = 1, ...) lag(x, -k, ...)

   so the user has his choice of which orientation he prefers.
   Many packages could make use of it if it were in the core of R including
   zoo, dyn, dynlm, fame and others.  This would also address comments
   such as in ISSUE 4 on this page which is associated with a popular
   book on time series:
   http://www.stat.pitt.edu/stoffer/tsa2/Rissues.htm

8. On Windows, package build tools should check that Cygwin is in the
correct position on the PATH and issue a meaningful error if not.  If
you get this wrong currently it's quite hard to diagnose unless
you know about it.

9. Implement the R shell() and shell.exec() commands on
non-Windows systems.

10. print.function should be improved to make it obvious how to find what
the user is undoubtedly looking for in both the S3 and S4 cases.
That would address one of the criticisms here:

http://www.stat.columbia.edu/~cook/movabletype/archives/2007/08/they_started_me.html
(The other criticisms at this link are worth addressing too -- ggplot2
and several existing or upcoming books on grid, lattice and ggplot
R graphics are
presumably addressing  the criticism that creating 

[Rd] Predictable grid names

2007-12-26 Thread Gabor Grothendieck
One thing that makes modifying lattice grid objects difficult is that if
you create the same plot twice you get different grid names for
some grid objects:

1. I wonder if there is some possibility in grid to reset the
names after a grid.newpage(), say, so that one can get predictable
names or some other strategy for getting predictable names.
Sort of like a set.seed() for grid except this has nothing to do with
random numbers.

2. grid.ls() is a great improvement over what previously existed but another
thing that would be nice would be to be able to list out more info with
the grid object so one can more easily identify it.  Sort of like ls -l.
And/or maybe a where clause restricting which objects are listed, e.g. just
list grid objects that are dark green.  I have been using the
following to show the
color values (which displays them if they are in the gp list but not
otherwise) but it would be nice to have something more general and
built in.  Or perhaps something already exists that I am not aware of.

library(lattice)
library(grid)
# show tree with names and col values
recurse <- function(x, indent = "") {
  if (!is.null(x$name)) {
    cat(indent, x$name)
    if (!is.null(x$gp)) cat(" col:", x$gp$col)
    cat("\n")
  }
  for (ch in x$children) Recall(ch, indent = paste(indent, "."))
}
xyplot(Sepal.Length ~ Sepal.Width, iris, group = Species, col = 11:13,
  auto.key = TRUE)
gg <- grid.grab()
recurse(gg)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] seekViewport error

2007-12-26 Thread Gabor Grothendieck
Why does the seekViewport at the bottom give an error?

 xyplot(Sepal.Length ~ Sepal.Width, iris, group = Species, col = 11:13,
+   auto.key = TRUE)
 grid.ls(view = TRUE)
ROOT
  GRID.rect.89
  plot1.toplevel.vp
plot1.xlab.vp
  plot1.xlab
  1
plot1.ylab.vp
  plot1.ylab
  1
plot1.strip.1.1.off.vp
  GRID.segments.90
  1
plot1.strip.left.1.1.off.vp
  GRID.segments.91
  GRID.text.92
  1
plot1.panel.1.1.off.vp
  GRID.segments.93
  GRID.text.94
  GRID.segments.95
  1
plot1.panel.1.1.vp
  GRID.points.96
  GRID.points.97
  GRID.points.98
  1
plot1.panel.1.1.off.vp
  GRID.rect.99
  1
plot1.legend.top.vp
  GRID.frame.70
GRID.VP.18
  GRID.cellGrob.72
GRID.rect.71
  1
GRID.VP.19
  GRID.cellGrob.74
GRID.text.73
  1
GRID.VP.20
  GRID.cellGrob.76
GRID.text.75
  1
GRID.VP.21
  GRID.cellGrob.78
GRID.text.77
  1
GRID.VP.22
  GRID.cellGrob.80
GRID.points.79
  1
GRID.VP.23
  GRID.cellGrob.82
GRID.points.81
  1
GRID.VP.24
  GRID.cellGrob.84
GRID.points.83
  1
  1
plot1.
  1
1
 seekViewport("GRID.VP.24")
Error in downViewport.vpPath(vpPathDirect(name), strict, recording =
recording) :
  Viewport 'GRID.VP.24' was not found

 R.version.string # Vista
[1] R version 2.6.1 Patched (2007-12-06 r43610)
 packageDescription("grid")$Version
[1] "2.6.1"
 packageDescription("lattice")$Version
[1] "0.17-2"

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Wrong length of POSIXt vectors (PR#10507)

2007-12-15 Thread Gabor Grothendieck
If it were simply deprecated and then changed then
everyone using it would get a warning during the period
of deprecation so it would
not be so bad.  Given that its current behavior is
not very useful I suspect it's not widely used anyway.
I haven't followed the whole discussion so sorry if these
points have already been made.

On Dec 15, 2007 5:17 PM, Martin Maechler [EMAIL PROTECTED] wrote:
  TP == Tony Plate [EMAIL PROTECTED]
  on Fri, 14 Dec 2007 13:58:30 -0700 writes:


TP Duncan Murdoch wrote:
 On 12/13/2007 1:59 PM, Tony Plate wrote:
 Duncan Murdoch wrote:
 On 12/11/2007 6:20 AM, [EMAIL PROTECTED] wrote:
 Full_Name: Petr Simecek
 Version: 2.5.1, 2.6.1
 OS: Windows XP
 Submission from: (NULL) (195.113.231.2)


 Several times I have experienced that a length of a POSIXt vector
 has not been
 computed right.

 Example:

 tv <- structure(list(sec = c(50, 0, 55, 12, 2, 0, 37, NA, 17, 3, 31
 ), min = c(1L, 10L, 11L, 15L, 16L, 18L, 18L, NA, 20L, 22L, 22L
 ), hour = c(12L, 12L, 12L, 12L, 12L, 12L, 12L, NA, 12L, 12L, 12L),
 mday = c(13L, 13L, 13L, 13L, 13L, 13L, 13L, NA, 13L, 13L, 13L), mon
 = c(5L, 5L, 5L, 5L, 5L, 5L, 5L, NA, 5L, 5L, 5L), year = c(105L,
 105L, 105L, 105L, 105L, 105L, 105L, NA, 105L, 105L, 105L), wday =
 c(1L, 1L, 1L, 1L, 1L, 1L, 1L, NA, 1L, 1L, 1L), yday = c(163L, 163L,
 163L, 163L, 163L, 163L, 163L, NA, 163L, 163L, 163L), isdst = c(1L,
 1L, 1L, 1L, 1L, 1L, 1L, -1L, 1L, 1L, 1L)), .Names = c(sec, min,
 hour, mday, mon, year, wday, yday, isdst
 ), class = c(POSIXt, POSIXlt))

 print(tv)
 # print 11 time points (right)

 length(tv)
 # returns 9 (wrong)

 tv is a list of length 9.  The answer is right, your expectation is
 wrong.
 I have tried that on several computers with/without switching to
 English
 locales, i.e. Sys.setlocale("LC_TIME", "en"). I have searched the
 help pages but I
 cannot imagine how that could be OK.

 See this in ?POSIXt:

 Class 'POSIXlt' is a named list of vectors...

 You could define your own length measurement as

 length.POSIXlt <- function(x) length(x$sec)

 and you'll get the answer you expect, but be aware that length.XXX
 methods are quite rare, and you may surprise some of your users.


 On the other hand, isn't the fact that length() currently always
 returns 9 for POSIXlt objects likely to be a surprise to many users
 of POSIXlt?

 The back of The New S Language says "Easy-to-use facilities allow
 you to organize, store and retrieve all sorts of data. ... S
 functions and data organization make applications easy to write."

 Now, POSIXlt has methods for c() and vector subsetting [ (and many
 other vector-manipulation methods - see methods(class=POSIXlt)).
 Hence, from the point of view of intending to supply easy-to-use
 facilities ... [for] all sorts of data, isn't it a little
 incongruous that length() is not also provided -- as 3 functions (any
 others?) comprise a core set of vector-manipulation functions?

 Would it make sense to have an informal prescription (e.g., in
 R-exts) that a class that implements a vector-like object and
 provides at least of one of functions 'c', '[' and 'length' should
 provide all three?  It would also be easy to describe a test-suite
 that should be included in the 'test' directory of a package
 implementing such a class, that had some tests of the basic
 vector-manipulation functionality, such as:

  # at this point, x0, x1, x3,  x10 should exist, as vectors of the
  # class being tested, of length 0, 1, 3, and 10, and they should
  # contain no duplicate elements
  length(x0)
 [1] 1
  length(c(x0, x1))
 [1] 2
  length(c(x1,x10))
 [1] 11
  all(x3 == x3[seq(len=length(x3))])
 [1] TRUE
  all(x3 == c(x3[1], x3[2], x3[3]))
 [1] TRUE
  length(c(x3[2], x10[5:7]))
 [1] 4
 

 It would also be possible to describe a larger set of vector
 manipulation functions that should be implemented together, including
 e.g., 'rep', 'unique', 'duplicated', '==', 'sort', '[<-', 'is.na',
 head, tail ... (many of which are provided for POSIXlt).

 Or is there some good reason that length() cannot be provided (while
 'c' and '[' can) for some vector-like classes such as POSIXlt?

 What you say sounds good in general, but the devil is in the details.
 Changing the meaning of length(x) for some objects has fairly
 widespread effects.  Are they all positive?  I don't know.

 Adding a prescription like the one you suggest would be good if it's
 easy to implement, but bad if it's already widely violated.  How many
 base or CRAN or Bioconductor packages violate it currently?   Do the
 ones that provide all 3 methods do so in a 

Re: [Rd] creating lagged variables

2007-12-13 Thread Gabor Grothendieck
The problem is the representation.

If we transform it into a zoo time series, z, with one
series per column and one time point per row then we
can just merge the series with its lag.

 DF <- data.frame(id = c(1, 1, 1, 2, 2, 2), time = c(1, 2,
+ 3, 1, 2, 3), value = c(-0.56047565, -0.23017749, 1.55870831,
+ 0.07050839, 0.12928774, 1.71506499))

 library(zoo)
 z <- do.call(merge, by(DF, DF$id, function(x) zoo(x$value, x$time)))
 merge(z, lag(z, -1))
 1.z2.z 1.lag(z, -1) 2.lag(z, -1)
1 -0.5604756 0.07050839   NA   NA
2 -0.2301775 0.12928774   -0.5604756   0.07050839
3  1.5587083 1.71506499   -0.2301775   0.12928774
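
A possible follow-on step (not in the original reply): put the merged
series and its lag back into a plain data frame, one row per time point.

 m <- merge(z, lag(z, -1))
 data.frame(time = index(m), coredata(m), check.names = FALSE)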


On Dec 13, 2007 1:21 PM, Antonio, Fabio Di Narzo
[EMAIL PROTECTED] wrote:
 Hi all.
 I'm looking for robust ways of building lagged variables in a dataset
 with multiple individuals.

 Consider a dataset with variables like the following:
 ##
 set.seed(123)
 d <- data.frame(id = rep(1:2, each=3), time=rep(1:3, 2), value=rnorm(6))
 ##
 d
  id time   value
 1  11 -0.56047565
 2  12 -0.23017749
 3  13  1.55870831
 4  21  0.07050839
 5  22  0.12928774
 6  23  1.71506499

 I want to compute the lagged variable 'value(t-1)', taking subject id
 into account.
 My current effort produced the following:
 ##
 my_lag <- function(dt, varname, timevarname='time', lag=1) {
    vname <- paste(varname, if(lag > 0) '.' else '', lag, sep='')
    timevar <- dt[[timevarname]]
    dt[[vname]] <- dt[[varname]][match(timevar, timevar + lag)]
    dt
 }
 lag_by <- function(dt, idvarname='id', ...)
  do.call(rbind, by(dt, dt[[idvarname]], my_lag, ...))
 ##
 With the previous data I get:

  lag_by(d, varname='value')
id time   value value.1
 1.1  11 -0.56047565  NA
 1.2  12 -0.23017749 -0.56047565
 1.3  13  1.55870831 -0.23017749
 2.4  21  0.07050839  NA
 2.5  22  0.12928774  0.07050839
 2.6  23  1.71506499  0.12928774

 So that seems working. However, I was thinking if there is a
 smarter/cleaner/more robust way to do the job. For instance, with the
 above function I get dataframe rows re-ordering as a side-effect
 (anyway this is of no concern in my current analysis)...
 Any suggestion?

 All the bests,
 Fabio.
 --
 Antonio, Fabio Di Narzo
 Ph.D. student at
 Department of Statistical Sciences
 University of Bologna, Italy

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] List comprehensions for R

2007-12-09 Thread Gabor Grothendieck
That seems quite nice.

Note that there has been some related code posted.  See:
http://tolstoy.newcastle.edu.au/R/help/03b/6406.html
which discusses some R idioms for list comprehensions.

Also the gsubfn package has some functionality in this direction.  We
preface any function with fn$ to allow functions in its arguments
to be specified as formulas.  Its more R-ish than your code and
applies to more than just list comprehensions while your code is
more faithful to list comprehensions.

 library(gsubfn)
 fn$sapply(0:11/11, ~ sin(x))
 [1] 0.00000000 0.09078392 0.18081808 0.26935891 0.35567516 0.43905397
 [7] 0.51880673 0.59427479 0.66483486 0.72990422 0.78894546 0.84147098
 fn$sapply(0:4, y ~ fn$sapply(0:3, x ~ x*y))
     [,1] [,2] [,3] [,4] [,5]
[1,]    0    0    0    0    0
[2,]    0    1    2    3    4
[3,]    0    2    4    6    8
[4,]    0    3    6    9   12
 fn$sapply(0:4, y ~ fn$sapply(0:y, x ~ x*y))
[[1]]
[1] 0

[[2]]
[1] 0 1

[[3]]
[1] 0 2 4

[[4]]
[1] 0 3 6 9

[[5]]
[1]  0  4  8 12 16

 unlist(fn$sapply(1:4, y ~ fn$sapply(1:y, x ~ x*y)))
 [1]  1  2  4  3  6  9  4  8 12 16


On Dec 9, 2007 4:41 PM, David C. Norris
[EMAIL PROTECTED] wrote:
 Below is code that introduces a list comprehension syntax into R,
 allowing expressions like:

   .[ sin(x) ~ x <- (0:11)/11 ]
  [1] 0.00000000 0.09078392 0.18081808 0.26935891 0.35567516 0.43905397
  [7] 0.51880673 0.59427479 0.66483486 0.72990422 0.78894546 0.84147098
   .[ .[x*y ~ x <- 0:3] ~ y <- 0:4]
      [,1] [,2] [,3] [,4] [,5]
 [1,]    0    0    0    0    0
 [2,]    0    1    2    3    4
 [3,]    0    2    4    6    8
 [4,]    0    3    6    9   12
   .[ .[x+y ~ x <- 0:y] ~ y <- 0:4]
 [[1]]
 [1] 0

 [[2]]
 [1] 1 2

 [[3]]
 [1] 2 3 4

 [[4]]
 [1] 3 4 5 6

 [[5]]
 [1] 4 5 6 7 8

   .[ x*y ~ {x <- 1:4; y <- 1:x} ]
  [1]  1  2  4  3  6  9  4  8 12 16

 These constructions are supported by the following code.

 Regards,
 David

 ##
 ## Define syntax for list/vector/array comprehensions
 ##

 . <- structure(NA, class="comprehension")

 comprehend <- function(expr, vars, seqs, comprehension=list()){
  if(length(vars)==0) # base case
    comprehension[[length(comprehension)+1]] <- eval(expr)
  else
    for(elt in eval(seqs[[1]])){
      assign(vars[1], elt, inherits=TRUE)
      comprehension <- comprehend(expr, vars[-1], seqs[-1], comprehension)
    }
  comprehension
 }

 ## Support general syntax like .[{exprs} ~ {generators}]
 "[.comprehension" <- function(x, f){
  f <- substitute(f)
  ## To allow omission of braces around a lone comprehension generator,
  ## as in 'expr ~ var <- seq' we make allowances for two shapes of f:
  ##
  ## (1)    (`<-` (`~` expr
  ##               var)
  ##          seq)
  ## and
  ##
  ## (2)    (`~` expr
  ##          (`{` (`<-` var1 seq1)
  ##               (`<-` var2 seq2)
  ##               ...
  ##               (`<-` varN seqN)))
  ##
  ## In the former case, we set gens <- list(var <- seq), unifying the
  ## treatment of both shapes under the latter, more general one.
  syntax.error <- "Comprehension expects 'expr ~ {x1 <- seq1; ... ; xN <- seqN}'."
  if(!is.call(f) || (f[[1]] != '<-' && f[[1]] != '~'))
    stop(syntax.error)
  if(is(f,'<-')){ # (1)
    lhs <- f[[2]]
    if(!is.call(lhs) || lhs[[1]] != '~')
      stop(syntax.error)
    expr <- lhs[[2]]
    var <- as.character(lhs[[3]])
    seq <- f[[3]]
    gens <- list(call('<-', var, seq))
  } else { # (2)
    expr <- f[[2]]
    gens <- as.list(f[[3]])[-1]
    if(any(lapply(gens, class) != '<-'))
      stop(syntax.error)
  }
  ## Fill list comprehension .LC
  vars <- as.character(lapply(gens, function(g) g[[2]]))
  seqs <- lapply(gens, function(g) g[[3]])
  .LC <- comprehend(expr, vars, seqs)
  ## Provided the result is rectangular, convert it to a vector or array
  ## TODO: Extend to handle .LC structures more than 2-deep.
  if(!length(.LC))
    return(.LC)
  dim1 <- dim(.LC[[1]])
  if(is.null(dim1)){
    lengths <- sapply(.LC, length)
    if(all(lengths == lengths[1])){ # rectangular
      .LC <- unlist(.LC)
      if(lengths[1] > 1) # matrix
        dim(.LC) <- c(lengths[1], length(lengths))
    } else { # ragged
      # leave .LC as a list
    }
  } else { # elements of .LC have dimension
    dim <- c(dim1, length(.LC))
    .LC <- unlist(.LC)
    dim(.LC) <- dim
  }
  .LC
 }

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Building packages

2007-12-07 Thread Gabor Grothendieck
An svn checkout directory can contain a mix of files that
are mirrored in the svn and not mirrored.  In particular, if you
add a new file into your checkout directory it will not automatically
go into the repository on your next commit unless you specifically
place that file under svn control so junk files remain local.

You can exclude files from R CMD build using the .Rbuildignore file.
See the Writing Extensions manual.
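
For example, a hypothetical .Rbuildignore (one Perl-style regular expression
per line, matched against file paths relative to the package root; the
patterns below are just illustrative local junk, not a prescription):

^notes\.txt$
^scratch/
\.log$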

On Dec 7, 2007 11:07 AM, Barry Rowlingson [EMAIL PROTECTED] wrote:
 I've started a new package and I'm trying to work out the best way to do
 it. I'm managing my package source directory with SVN, but R CMD build
 likes to dump things in the inst/doc directory when making vignette PDF
 files. I don't want to keep these in SVN (they aren't strictly
 'source'), so it set me thinking.

 One of the other projects I work with has an out-of-source build system.
 You make a 'build' directory, run a config system (cmake-based) and then
 'make' does everything in the build directory without touching the
 source tree. Very nice and neat. How much work would it take to have
 something similar for building R packages? At present I've just got some
 svn:ignore settings to stop SVN bothering me.

  I also hit the problem of vignettes needing the package to be
 installed before being able to build them, but not being able to install
 the package because the vignettes wouldn't build without the package
 already being installed. The fix is to build with --no-vignettes, then
 install the package, then build with the vignettes enabled. Seems
 kludgy, plus it means that vignettes are always built with the currently
 installed package and not the currently-being-installed package. So I
 install and do a second pass to get it all right again.

  Or am I doing it wrong?

  Once I get smooth running of R package development and SVN I might
 write it up for R-newsletter - there's a couple of other tricks I've had
 to employ...

 Barry
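
For concreteness, the two-pass sequence Barry describes looks roughly like
this (hypothetical package name and version):

R CMD build --no-vignettes mypkg      # tarball without vignette PDFs
R CMD INSTALL mypkg_0.1.tar.gz        # install so the vignette code can run
R CMD build mypkg                     # rebuild with the vignettes included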

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] suggested modification to the 'mle' documentation?

2007-12-07 Thread Gabor Grothendieck
On Dec 7, 2007 8:43 AM, Duncan Murdoch [EMAIL PROTECTED] wrote:
 On 12/7/2007 8:10 AM, Peter Dalgaard wrote:
  Ben Bolker wrote:
At this point I'd just like to advertise the bbmle package
  (on CRAN) for those who respectfully disagree, as I do, with Peter over
  this issue.  I have added a data= argument to my version
  of the function that allows other variables to be passed
  to the objective function.  It seems to me that this is perfectly
  in line with the way that other modeling functions in R
  behave.
 
  This is at least cleaner than abusing the fixed argument. As you know,
  I have reservations, one of which is that it is not a given that I want
  it to behave just like other modeling functions, e.g. a likelihood
  function might refer to more than one data set, and/or data that are not
  structured in the traditional data frame format. The design needs more
  thought than just adding arguments.

 We should allow more general things to be passed as data arguments in
 cases where it makes sense.  For example a list with names or an
 environment would be a reasonable way to pass data that doesn't fit into
 a data frame.

  I still prefer a design based on a plain likelihood function. Then we can
  discuss how to construct such a function so that the data are
  incorporated in a flexible way.  There are many ways to do this, I've
  shown one, here's another:
 
  f <- function(lambda) -sum(dpois(x, lambda, log=T))
  d <- data.frame(x=rpois(1, 12.34))
  environment(f) <- evalq(environment(), d)

 We really need to expand as.environment, so that it can convert data
 frames into environments.  You should be able to say:

 environment(f) <- as.environment(d)

 and get the same result as

 environment(f) <- evalq(environment(), d)

 But I'd prefer to avoid the necessity for users to manipulate the
 environment of a function.  I think the pattern

 model( f, data=d )

 being implemented internally as

 environment(f) <- as.environment(d, parent = environment(f))

 is very nice and general.  It makes things like cross-validation,
 bootstrapping, etc. conceptually cleaner:  keep the same
 formula/function f, but manipulate the data and see what happens.
 It does have problems when d is an environment that already has a
 parent, but I think a reasonable meaning in that case would be to copy
 its contents into a new environment with the new parent set.

 Duncan Murdoch
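
A minimal sketch of the pattern Duncan describes, written only with functions
that already exist (model() here is a hypothetical helper, not part of R):

library(stats4)
model <- function(f, data) {
  # copy the data columns into a new environment chained to f's own
  e <- new.env(parent = environment(f))
  for (nm in names(data)) assign(nm, data[[nm]], envir = e)
  environment(f) <- e
  f
}
f <- function(lambda) -sum(dpois(x, lambda, log = TRUE))
d <- data.frame(x = rpois(100, 12.34))
mle(model(f, d), start = list(lambda = 10))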

Something close to that is already possible in proto, and it's cleaner in proto
since the explicit environment manipulation is unnecessary; it occurs
implicitly:

1. In terms of the data frame d from Peter Dalgaard's post, the code
below is similar to my last post but replaces the explicit
manipulation of f's environment with the creation of the proto object
p on the line marked ###.  That line converts d to an anonymous proto object
containing the components of d, in this case just x, and then
creates a child object p which can access x via delegation/inheritance.

library(proto)
set.seed(1)
f <- function(lambda) -sum(dpois(x, lambda, log=T))
d <- data.frame(x=rpois(100, 12.34))
p <- proto(as.proto(as.list(d)), f = f) ###
mle(p[["f"]], start=list(lambda=10))

2. Or the ### line could be replaced with the following line
which places f and the components of d, in this case just x,
directly into p:

p <- proto(f = f, envir = as.proto(as.list(d)))

again avoiding the explicit reset of environment(f) and the evalq.



  mle(f, start=list(lambda=10))
 
  Call:
  mle(minuslogl = f, start = list(lambda = 10))
 
  Coefficients:
   lambda
  12.3402
 
  It is not at all an unlikely design to have mle() as a generic function
  which works on many kinds of objects, the default method being
  function(object,...) mle(minuslogl(obj)) and minuslogl is an extractor
  function returning (tada!) the negative log likelihood function.
(My version also has a cool formula interface and other
  bells and whistles, and I would love to get feedback from other
  useRs about it.)
 
 cheers
  Ben Bolker
 
 
 
 


 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] suggested modification to the 'mle' documentation?

2007-12-07 Thread Gabor Grothendieck
On Dec 7, 2007 8:10 AM, Peter Dalgaard [EMAIL PROTECTED] wrote:
 Ben Bolker wrote:
At this point I'd just like to advertise the bbmle package
  (on CRAN) for those who respectfully disagree, as I do, with Peter over
  this issue.  I have added a data= argument to my version
  of the function that allows other variables to be passed
  to the objective function.  It seems to me that this is perfectly
  in line with the way that other modeling functions in R
  behave.
 
 This is at least cleaner than abusing the fixed argument. As you know,
 I have reservations, one of which is that it is not a given that I want
 it to behave just like other modeling functions, e.g. a likelihood
 function might refer to more than one data set, and/or data that are not
 structured in the traditional data frame format. The design needs more
 thought than just adding arguments.

 I still prefer a design based on a plain likelihood function. Then we can
 discuss how to construct such a function so that the data are
 incorporated in a flexible way.  There are many ways to do this, I've
 shown one, here's another:

  f <- function(lambda) -sum(dpois(x, lambda, log=T))
  d <- data.frame(x=rpois(1, 12.34))
  environment(f) <- evalq(environment(), d)
  mle(f, start=list(lambda=10))

 Call:
 mle(minuslogl = f, start = list(lambda = 10))

 Coefficients:
  lambda
 12.3402


The explicit environment manipulation is what I was referring to, but
we can simplify it using proto.  Create a proto object to hold
f and x, then pass the f in the proto object (rather than the
original f) to mle.  That works because proto automatically resets
the environment of f when it is added to p, avoiding the evalq.

 set.seed(1)
 library(proto)
 f <- function(lambda) -sum(dpois(x, lambda, log=TRUE))
 p <- proto(f = f, x = rpois(100, 12.34))
 mle(p[["f"]], start = list(lambda = 10))

Call:
mle(minuslogl = p[["f"]], start = list(lambda = 10))

Coefficients:
  lambda
12.46000

 It is not at all an unlikely design to have mle() as a generic function
 which works on many kinds of objects, the default method being
 function(object,...) mle(minuslogl(obj)) and minuslogl is an extractor
 function returning (tada!) the negative log likelihood function.
(My version also has a cool formula interface and other
  bells and whistles, and I would love to get feedback from other
  useRs about it.)
 
 cheers
  Ben Bolker
 
 


 --

   O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
  (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
 ~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] suggested modification to the 'mle' documentation?

2007-12-06 Thread Gabor Grothendieck
The closure only works if you are defining the inner function yourself.
If you are not, then it's yet more work to redefine the environment of
the inner function or find some other workaround.

On Dec 6, 2007 6:01 PM, Peter Dalgaard [EMAIL PROTECTED] wrote:
 Spencer Graves wrote:
  Hello:
 
I wish to again express my appreciation to all who have
  contributed to making R what it is today.
 
At this moment, I'm particularly grateful for whoever modified the
  'mle' code so data no longer need be passed via global variables.  I
  remember struggling with this a couple of years ago, and I only today
  discovered that it is no longer the case.
 
I'd like to suggest that the 'mle' help file be modified to
  advertise this fact, e.g., by adding one of the two examples appearing
  below.
 

 In a word: No!!! That is not the design. A likelihood function is a
 function of its parameters, and the fixed argument is for holding some
 parameters fixed (e.g. during profiling).

 To include data, just make a closure, e.g.

  poissonLike <- function(x., y.){
     function(ymax=15, xhalf=6)
       -sum(stats::dpois(y., lambda=ymax/(1+x./xhalf), log=TRUE))}
  mll <- poissonLike(x, y)
 mle(ll, 


Best Wishes,
Spencer Graves
  
  x <- 0:10
  y <- c(26, 17, 13, 12, 20, 5, 9, 8, 5, 4, 8)
  #  Pass data via function arguments rather than global variables
  ll.5 <- function(ymax=15, xhalf=6, x., y.)
    -sum(stats::dpois(y., lambda=ymax/(1+x./xhalf), log=TRUE))
  (fit.5 <- mle(ll.5, start=list(ymax=15, xhalf=6),
                fixed=list(x.=x, y.=y)))

  ll3 <- function(lymax=log(15), lxhalf=log(6), x., y.)
    -sum(stats::dpois(y.,
                      lambda=exp(lymax)/(1+x./exp(lxhalf)), log=TRUE))
  (fit3 <- mle(ll3, start=list(lymax=0, lxhalf=0),
               fixed=list(x.=x, y.=y)))
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 


 --
   O__   Peter Dalgaard Øster Farimagsgade 5, Entr.B
  c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K
  (*) \(*) -- University of Copenhagen   Denmark  Ph:  (+45) 35327918
 ~~ - ([EMAIL PROTECTED])  FAX: (+45) 35327907


 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] faqs

2007-11-20 Thread Gabor Grothendieck
And one other thing.  In the GNU docs it says that if your NEWS file gets too
big you can create an ONEWS file for the really old stuff.   Is ONEWS supported
too?

On Nov 16, 2007 11:12 PM, Gabor Grothendieck [EMAIL PROTECTED] wrote:

 On Nov 15, 2007 10:51 AM, Gabor Grothendieck [EMAIL PROTECTED] wrote:
  On Nov 15, 2007 9:57 AM, Simon Urbanek [EMAIL PROTECTED] wrote:
  
   On Nov 14, 2007, at 11:55 PM, Gabor Grothendieck wrote:
  
inst/NEWS would have the advantage of consistency with R itself
which also has a NEWS file.
   
  
   I vote for NEWS for reasons above plus because that's what I use in my
   packages already ;).
   IMHO it should be installed automatically if present, i.e. it
   shouldn't be necessary to have it in inst/NEWS, just plain NEWS should
   suffice.
  
 
  I agree that that would be desirable.

 Also it would be nice if the HTML files here:

 http://cran.r-project.org/src/contrib/Descriptions/

 each had a link to the NEWS file so one could get the news on a
 package prior to
 installing it.


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] faqs

2007-11-16 Thread Gabor Grothendieck
On Nov 15, 2007 10:51 AM, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 On Nov 15, 2007 9:57 AM, Simon Urbanek [EMAIL PROTECTED] wrote:
 
  On Nov 14, 2007, at 11:55 PM, Gabor Grothendieck wrote:
 
   inst/NEWS would have the advantage of consistency with R itself
   which also has a NEWS file.
  
 
  I vote for NEWS for reasons above plus because that's what I use in my
  packages already ;).
  IMHO it should be installed automatically if present, i.e. it
  shouldn't be necessary to have it in inst/NEWS, just plain NEWS should
  suffice.
 

 I agree that that would be desirable.

Also it would be nice if the HTML files here:

http://cran.r-project.org/src/contrib/Descriptions/

each had a link to the NEWS file so one could get the news on a
package prior to
installing it.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] library path in Rd link

2007-11-15 Thread Gabor Grothendieck
I think that's right -- it only works on NTFS systems.  This page
refers to it as an NTFS symbolic link:
http://en.wikipedia.org/wiki/NTFS_symbolic_link

On Nov 14, 2007 10:00 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:
 On 14/11/2007 7:44 PM, Gabor Grothendieck wrote:
  On Nov 14, 2007 4:36 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:
  On Unix-alikes, the workaround is to build soft links to all the
  packages in a standard location; but soft links don't work on Windows
  (and we don't want to get into the almost-undocumented hard links that
  exist on some Windows file systems).
 
  Symbolic links are available on Windows Vista:

 Does this work on FAT file systems, e.g. on a USB drive?  It used to be
 that they only worked on NTFS.

 Duncan Murdoch


 
  C:\ mklink /?
  Creates a symbolic link.
 
  MKLINK [[/D] | [/H] | [/J]] Link Target
 
  /D  Creates a directory symbolic link.  Default is a file
  symbolic link.
  /H  Creates a hard link instead of a symbolic link.
  /J  Creates a Directory Junction.
  Link    specifies the new symbolic link name.
  Target  specifies the path (relative or absolute) that the new link
  refers to.



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] faqs

2007-11-15 Thread Gabor Grothendieck
On Nov 15, 2007 9:57 AM, Simon Urbanek [EMAIL PROTECTED] wrote:

 On Nov 14, 2007, at 11:55 PM, Gabor Grothendieck wrote:

  inst/NEWS would have the advantage of consistency with R itself
  which also has a NEWS file.
 

 I vote for NEWS for reasons above plus because that's what I use in my
 packages already ;).
 IMHO it should be installed automatically if present, i.e. it
 shouldn't be necessary to have it in inst/NEWS, just plain NEWS should
 suffice.


I agree that that would be desirable.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] When to use LazyLoad, LazyData and ZipData?

2007-11-14 Thread Gabor Grothendieck
On Nov 14, 2007 7:01 AM, Bjørn-Helge Mevik [EMAIL PROTECTED] wrote:
 Dear developeRs,

 I've searched the documentation, FAQ, and mailing lists, but haven't
 found the answer(*) to the following:

 When should one specify LazyLoad, LazyData, and ZipData?

 And what is the default if they are left unspecified?


 (*)Except that
 1) If the package you are writing uses the methods package, specify
 LazyLoad: yes, and
 2) The optional ZipData field controls whether the automatic Windows
 build will zip up the data directory or not: set this to 'no' if your
 package will not work with a zipped data directory.

There is some information and links regarding LazyLoad on the proto
package home page,
http://r-proto.googlecode.com, in the last section, entitled "Avoiding R
Bugs", in point #1.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] library path in Rd link

2007-11-14 Thread Gabor Grothendieck
On Nov 14, 2007 4:36 PM, Duncan Murdoch [EMAIL PROTECTED] wrote:

 On Unix-alikes, the workaround is to build soft links to all the
 packages in a standard location; but soft links don't work on Windows
 (and we don't want to get into the almost-undocumented hard links that
 exist on some Windows file systems).

Symbolic links are available on Windows Vista:

C:\ mklink /?
Creates a symbolic link.

MKLINK [[/D] | [/H] | [/J]] Link Target

/D  Creates a directory symbolic link.  Default is a file
symbolic link.
/H  Creates a hard link instead of a symbolic link.
/J  Creates a Directory Junction.
Link    specifies the new symbolic link name.
Target  specifies the path (relative or absolute) that the new link
refers to.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] faqs

2007-11-14 Thread Gabor Grothendieck
Another possibility is to just place it at the end of the vignette.
That is where it is in the proto package:

http://cran.r-project.org/doc/vignettes/proto/proto.pdf

Package documentation is already quite scattered and adding
a faq() command would add just one more thing one has
to do.  Here are some places one might look already for
documentation:

package?mypackage
vignette(...)
library(help = mypackage)
?mycommand
home page
WISHLIST file
NEWS file
THANKS file
svn log

I personally would give a much higher priority to a standardized
place to find change information.  Some packages provide it.
Some don't, which is very frustrating since you see a new version
and have no idea what has changed.  If they do document it, you
often have to download the source just to get at the change information.



On Nov 14, 2007 8:22 PM, roger koenker [EMAIL PROTECTED] wrote:
 An extremely modest proposal:

 It would be nice if packages could have a FAQ and if

faq(package.name)

 would produce this faq.  And if, by default

faq()
FAQ()

 would produce the admirable R faq...  Apologies in advance
 if there is already a mechanism like this, but help.search()
 didn't reveal anything.

 url:    www.econ.uiuc.edu/~roger        Roger Koenker
 email:  [EMAIL PROTECTED]               Department of Economics
 vox:    217-333-4558                    University of Illinois
 fax:    217-244-6678                    Champaign, IL 61820
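
A minimal sketch of the accessor proposed above (hypothetical; no such
function exists in base R, and it assumes a package ships a plain-text
file named FAQ in inst/):

faq <- function(package = NULL) {
  if (is.null(package)) return(RShowDoc("FAQ"))   # the main R FAQ
  f <- system.file("FAQ", package = package)
  if (nzchar(f)) file.show(f, title = paste(package, "FAQ"))
  else message("no FAQ file shipped with package ", sQuote(package))
}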

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] How to overload the assignment operator?

2007-11-13 Thread Gabor Grothendieck
Check out the g.data package in case that's what you are looking for.  It
uses promises until the data is actually used.
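
The flavour of the idea, in one line (a sketch only -- hypothetical file
name, and not g.data's actual interface):

delayedAssign("big", read.table("big.dat"))   # nothing is read yet
# the file is only read the first time 'big' is actually used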

On Nov 13, 2007 9:19 AM, Jens Oehlschlägel [EMAIL PROTECTED] wrote:
 Thanks Matthias,

  are you looking for setReplaceMethod?

 So far the package I'm writing has avoided using anything from S4, and the
 implications of touching S4 are not clear to me. The package aims at
 providing an alternative to 'atomic' data stored in RAM, i.e. large atomic
 data stored on disk. I need some advice on how to do this in a maximally
 performant way, which probably means pure S3!?

 Best regards


 Jens Oehlschlägel

 --

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] years() in chron package not working in R 2.6.0 (PR#10415)

2007-11-10 Thread Gabor Grothendieck
In chron, ?years says x must be a dates object, and it's not in the code below.
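
For reference, a sketch of usage that satisfies that requirement (chron's
default date format is month/day/year; the particular date is just an example):

library(chron)
x <- dates("11/09/07")   # a chron dates object
years(x)                 # 2007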

On Nov 9, 2007 3:55 PM,  [EMAIL PROTECTED] wrote:
 I loaded package chron in a newly installed version of R 2.6.0 and the
 years() function would not work.  (This worked in 2.5).

  x <- as.Date("2006-01-01")

  years(x)

 NULL

 sessionInfo()

 R version 2.6.0 (2007-10-03)

 i386-pc-mingw32

 locale:

 LC_COLLATE=English_United States.1252;LC_CTYPE=English_United
 States.1252;LC_MONETARY=English_United
 States.1252;LC_NUMERIC=C;LC_TIME=English_United States.1252

 attached base packages:

 [1] tcltk     stats     graphics  grDevices utils     datasets  methods
 base

 other attached packages:

 [1] fCalendar_260.72  fEcofin_260.72    fUtilities_260.72 zoo_1.4-0
 spatial_7.2-36

 [6] RUnit_0.4.17      MASS_7.2-36       chron_2.3-15

 loaded via a namespace (and not attached):

 [1] grid_2.6.0     lattice_0.16-5 tools_2.6.0

 Thanks.

 John Putz

 Sr. Energy Analyst

 The Energy Authority

 425-460-1137 (o)

 206-910-5229 (c)


[[alternative HTML version deleted]]

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] pairs, par

2007-10-29 Thread Gabor Grothendieck
This hack will disable the on.exit temporarily:

pairs.data.frame <- function(x, ...) {
    on.exit <- function(...) {}
    environment(pairs.default) <- environment()
    pairs.default(x, ...)
}
pairs(iris)
par("usr")
# add points to lower right square
points(1:10/10, 1:10/10, col = "red")


On 10/29/07, Oliver Soong [EMAIL PROTECTED] wrote:
 I dug around in pairs, and I think it has something to do with the
 on.exit(par(opar)) bit:

 f <- function() {
    opar <- par(mfrow = c(2, 2), mar = rep(0.5, 4), oma = rep(4, 4))
    on.exit(par(opar))
    for(i in 1:4) plot(0:1, 0:1)
    par(c("mfg", "omd", "fig", "plt", "usr"))
    print(opar)
 }
 f()
 par(xpd = NA)
 par(c("omd", "fig", "plt", "usr"))
 points(0 - 0.01 * 1:100, 0 - 0.01 * 1:100)
 points(0 - 0.01 * 1:100, 1 + 0.01 * 1:100)
 points(1 + 0.01 * 1:100, 0 - 0.01 * 1:100)
 points(1 + 0.01 * 1:100, 1 + 0.01 * 1:100)

 My guess is that there are 2 sets of graphical parameters, the ones
 stored in par and the ones used by the plotting functions.  Before
 par(opar) gets called, the two are synchronized.  When par(opar) gets
 called, we somehow set new values for par without changing the ones
 used by the plotting functions, and the data used by points becomes
 out of sync with the par information.

 This is reflected in this much simpler example:

 x11()
 par(c("omd", "fig", "plt", "usr"))
 points(0, 0)

 Again, par is defined, but this time the data used by the plotting
 functions has not been set, and an error occurs.

 Thanks for the workaround suggestion.  I guess I can always define a
 new plotting region to force par and the plotting data to
 re-synchronize.  It might be nice if those two didn't go out of sync,
 as I had assumed par would always be reliable.

 Oliver


 On 10/29/07, Tony Plate [EMAIL PROTECTED] wrote:
  I would look into the code for pairs().  Among other things, it sets and
  restores par(mfrow=...).  I suspect this is the relevant issue, not the
  use of pairs().  I would try to figure out what state a graphics device
  is in after resetting par(mfrow).  When I try the following (R 2.6.0
  patched, under Windows), I see a line on the plot, but not in a place
  that corresponds to the axis that were drawn by the 'plot()' command:
 
par(mfrow=c(2,2))
plot(1:2)
par(mfrow=c(1,1))
lines(1:2,1:2)
   
 
  (and if you want to be able to set up a new coordinate system on the
  plotting device to draw on top of the plot left by pairs(), look at
   par(new) and something like plot(0:1, type='n', axes=F, xlab=''))
 
  hope this helps,
 
  Tony Plate
 
  Oliver Soong wrote:
   Hi,
  
   I posted over at R-help, and didn't get a response, but perhaps that
   was the wrong forum for this question.  I'm having some confusion over
   the coordinate system after using pairs.  I'm not interested in the
   content of the actual pairs plot, although the number of pairs seems
   to matter a bit.  I'm purely interested in knowing where subsequent
   points will be plotted on the device.  However, after using pairs, the
   par information (omd, fig, plt, and usr) don't reflect what points
   does.  For example:
  
   pairs(iris[1:5])
   par(xpd = NA)
   points(0 - 0.01 * 1:100, 0 - 0.01 * 1:100)
   points(0 - 0.01 * 1:100, 1 + 0.01 * 1:100)
   points(1 + 0.01 * 1:100, 0 - 0.01 * 1:100)
   points(1 + 0.01 * 1:100, 1 + 0.01 * 1:100)
    par(c("omd", "fig", "plt", "usr"))
  
    The resulting plot shows that the corners of the plot are approximately
   0.05 user coordinate units from the boundaries of the plot region.
   According to par, though, there is a margin around the plotting region
   that is clearly not symmetric and does not correspond to around 0.05
   units.
  
   If we use pairs(iris[1:2]) and repeat the rest, the corners are now
   0.02 user coordinate units.  par provides the same information as
   before.
  
   So:
   1. How do I figure out where coordinates I give to points will display
   on the figure?
   2. More generally (for my own understanding), why does the par
   information not do what I expect?  Do I have some fundamental
   misunderstanding of the arrangement of plotting, figure, display, and
   margin regions within the device?  Is there a bug in pairs and/or par?
  
   I'm using R 2.5.1, and this behavior occurs on a fresh R console.
  
   Thanks!
  
   Oliver
  
  
  
 
 


 --
 Oliver Soong
 Donald Bren School of Environmental Science  Management
 University of California, Santa Barbara
 Santa Barbara, CA 93106-5131
 805-893-7044 (office)
 610-291-9706 (cell)

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] LazyLoad changes the class of objects

2007-10-17 Thread Gabor Grothendieck
On 10/17/07, Henrik Bengtsson [EMAIL PROTECTED] wrote:
 Yes (on the yes), to second Luke.  Here is John Chambers' comment when
 I was bitten by the same bug a while ago:

  http://tolstoy.newcastle.edu.au/R/devel/02b/0524.html

 See also Peter Dalgaard's follow up suggesting to wrap up the
 environment in a list, which will typically be enough.  I've been
 using this trick successfully in the Object class (R.oo package) for
 several years, where I'm putting the environment in the attributes
 list of an object, i.e.  obj <- NA; attr(obj, "..env") <- new.env();
 It turned out at the time that this was slightly faster to access than
 using a list element.

This has all been discussed before but this trick is not sufficient
for defining an environment subclass because it does not respect
inheritance.  The subclass writer must replicate all methods that
act on environments in a subclass for an environment subclass to
have them.  Inheritance is completely broken.

If there were N environment methods the writer of an environment subclass
would have to write N methods to support them all.  On the other hand,
if it worked in a true OO way the subclass writer would not have to write
anything.

Also, suppose a new environment method comes along.  In true OO
the subclass automatically inherits it, but with the trick the subclass
writer needs to write a new method that always mimics the parent.

This is not how OO is supposed to work.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Withdrawing SaveImage

2007-10-17 Thread Gabor Grothendieck
I noticed this under R 2.7.0 NEWS:

o   In package installation, SaveImage: yes is defunct and
lazyloading is attempted instead.

I think it's premature to make SaveImage defunct, especially when:

1. there is a bug that would be triggered in some packages
by automatically using LazyLoad instead of SaveImage:

https://stat.ethz.ch/pipermail/r-devel/2007-October/047118.html

2. only last year it was stated on R-devel that there was no intention to
withdraw SaveImage: yes

http://tolstoy.newcastle.edu.au/R/devel/06/02/4025.html

Having SaveImage: yes automatically invoke LazyLoad is really
tantamount to withdrawing it.

At the very least, making SaveImage defunct should be postponed until the
bug in #1 is fixed and a period of time has elapsed during which both
SaveImage and LazyLoad are available without that bug, so that affected
packages can gradually move over and have the ability to move back to
SaveImage if the move uncovers more R bugs related to this.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Extending deriv3()

2007-10-15 Thread Gabor Grothendieck
If you are modifying it, it would also be nice to add
"{" to the derivative table so one can write this:

f <- function(x) x*x
deriv(body(f), "x", func = TRUE)

Currently, one must do:

deriv(body(f)[[2]], "x", func = TRUE)


On 10/15/07, Prof Brian Ripley [EMAIL PROTECTED] wrote:
 On Mon, 15 Oct 2007, Thomas Yee wrote:

  Hello,
 
  I was wondering if the functions deriv3(), deriv() etc. could be extended
  to handle psigamma() and its special cases (digamma(), trigamma()
  etc.). From the error message it seems that 'psigamma' needs to be
  added to the derivatives table.
  This might be easy since psigamma() has a deriv argument.

 If you look at ?deriv you will see that it only knows about functions *of
 one argument* and operators.  So it would be easy to add digamma(x) and
 psigamma(x) (and I will do so shortly), but it would not be so easy to add
 psigamma(x, deriv).

  Additionally, this error message is also obtained when requesting for
  the Hessian of the gamma and lgamma functions:
 
  d3 = deriv(~  gamma(y), namev="y", hessian= TRUE)
  d3 = deriv(~ lgamma(y), namev="y", hessian= TRUE)
 
  Another class of special functions worth adding are the Bessel functions.

 Well, you can always submit a patch 

 Note that deriv() in R differs from that in S in being done in C and hence
 not being user-extensible.  A long time ago that had an advantage: S's
 deriv could be very slow and take a lot of memory by the standards of the
 early 1990's.  Rather than work on adding yet more special cases it would
 seem better to work on making it user-extensible.

 --
 Brian D. Ripley,  [EMAIL PROTECTED]
 Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
 University of Oxford, Tel:  +44 1865 272861 (self)
 1 South Parks Road, +44 1865 272866 (PA)
 Oxford OX1 3TG, UKFax:  +44 1865 272595

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] LazyLoad changes the class of objects

2007-10-12 Thread Gabor Grothendieck
Consider a package that this DESCRIPTION file:

---
Package: tester
Version: 0.1-0
Date: 2007-10-12
Title: Prototype object-based programming
Author: Gabor Grothendieck
Maintainer: Gabor Grothendieck [EMAIL PROTECTED]
Description: test
LazyLoad: true
Depends: R (>= 2.6.0)
License: GPL2
---

and a single subdirectory R containing tester.R which contains two lines:

---
e <- new.env()
class(e) <- c("x", "environment")
---

Now issue these commands:

 library(tester)
 class(tester::e)
[1] "environment"

 R.version.string # Windows Vista
[1] R version 2.6.0 Patched (2007-10-08 r43124)


Note that the class of e was changed from what we set it to !!!

On the other hand, if we omit LazyLoad: true from the DESCRIPTION file
then it retains its original class.

 # removed LazyLoad: true line from DESCRIPTION and reinstall pkg
 # now it's ok
 library(tester)
 class(tester::e)
[1] "x"           "environment"

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Ryacas check

2007-09-30 Thread Gabor Grothendieck
When I do

   Rcmd check Ryacas

on my Windows Vista system under

   R version 2.6.0 beta (2007-09-23 r42958)

it checks out fine but here:

  http://cran.r-project.org/bin/windows/contrib/checkSummaryWin.html
  http://cran.r-project.org/bin/windows/contrib/2.5/check/Ryacas-check.log

it complains about the XML package even though the XML package
*is* declared in the DESCRIPTION file.

What is wrong?

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Ryacas check

2007-09-30 Thread Gabor Grothendieck
I don't understand what you are suggesting.  I already
have Depends: XML in my DESCRIPTION file and, as indicated
in my original post, don't get any errors or warnings when I run
CHECK on my own system.  It's only CRAN that has them.

On 9/30/07, Duncan Temple Lang [EMAIL PROTECTED] wrote:



 Gabor Grothendieck wrote:
  When I do
 
 Rcmd check Ryacas
 
  on my Windows Vista system under
 
 R version 2.6.0 beta (2007-09-23 r42958)
 
  it checks out fine but here:
 
http://cran.r-project.org/bin/windows/contrib/checkSummaryWin.html
http://cran.r-project.org/bin/windows/contrib/2.5/check/Ryacas-check.log
 
  it complains about the XML package even though the XML package
  *is* declared in the DESCRIPTION file.

 The log says that the XML package refers to a function that cannot be
 found and so the XML package wasn't loaded.


 Grab the latest version of the XML package from
 www.omegahat.org/RSXML/XML_0.93-1.tar.gz.
 It is usually automatically fetched via CRAN, but
 we have to figure out why that is not happening
 currently.


 
  What is wrong?
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] as.Date.numeric

2007-09-29 Thread Gabor Grothendieck
I noticed that R 2.7.0 will have as.Date.numeric with a second,
non-optional origin argument.  Frankly, I would prefer that it default
to the Epoch, since it's a nuisance to specify, but at the very least
I think that .Epoch should be provided as a built-in variable.
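
For reference, defaulting to the Epoch amounts to something like the sketch
below; zoo's actual definition may differ in detail, but this is the idea:

as.Date.numeric <- function(x, origin = "1970-01-01", ...) as.Date(origin, ...) + x
as.Date(10)   # "1970-01-11"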

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] delayedAssign

2007-09-27 Thread Gabor Grothendieck
Thanks for the explanation.

For lists, either (a) promises should be evaluated as they
enter the list, or (b) promises should be evaluated as they exit the
list (i.e. as they are compared, inspected, etc.).  I gather
the intent was (a), but it does not happen that way due to
a bug in R.  Originally I thought (b) would then occur, but
to my surprise it does not occur either, which is why
I feel it's more serious than I had originally thought.

I think it's ok if promises only exist in environments and not in
lists.  Items that would be on my wishlist would be the ability
to do at the R level the two things mentioned previously

https://stat.ethz.ch/pipermail/r-devel/2007-September/046943.html

and thirdly an ability to get the evaluation environment, not just the
expression,
associated with a promise -- substitute only gets the expression.
Originally I thought I would need some or all of these wish items,
then thought not, but am back to the original position again as I use
them more and realize that they are at least important
for debugging (it's very difficult to debug situations involving promises, as
there is no way to inspect the evaluation environment, so you are never sure
which environment a given promise is evaluating in) and possibly
for writing programs as well.

On 9/27/07, Luke Tierney [EMAIL PROTECTED] wrote:
 On Wed, 26 Sep 2007, Gabor Grothendieck wrote:

  I thought that perhaps the behavior in the previous post,
  while inconsistent with the documentation, was not all that
  harmful, but I think it's related to the following, which is a potentially
  serious bug.

 The previous discussion already established that as.list of an
 environment should not return a list with promises in as promises
 should not be visible at the R level.  (Another loophole that needs
 closing is $ for environments). So behavior of results that should not
 exist is undefined and I cannot see how any such behavior is a further
 bug, serious or otherwise.

  z is a list with a single numeric component,
  as the dput output verifies,

 Except it isn't, as print or str verify, which might be a problem if z
 was an input these functions should expect, but it isn't.

  yet we cannot compare its first element
  to 7 without getting an error message.
 
  Later on we see that its because it thinks that z[[1]] is of type promise

 As z[[1]] is in fact of type promise that would seem a fairly
 reasonable thing to think at this point ...

  and even force(z[[1]]) is of type promise.

 which is consistent with what force is documented to do. The
 documentation is quite explicit that force does not do what you seem
 to be expecting.  That documentation is from a time when delay()
 existed to produce promises at the R level, which was a nightmare
 because of all the peculiarities it introduced, which is why it was
 removed.

 force is intended for one thing only -- replacing code like this:

   # I know the following line look really stupid and you will be
   # tempted to remove it for efficiency but DO NOT: it is needed
   # to make sure that the formal argument y is evaluated at this
   # point.
   y - y

 with this:

  force(y)

 which seems much clearer -- at least it suggest you look at the help
 page for force to see what it does.

 At this point promises should only ever exist in bindings in
 environments. If we wanted lazy evaluation constructs more widely
 there are really only two sensible options:

 The Scheme option where a special function delay creates a deferred
 evaluation and another, called force in Scheme, forces the evaluation
 but there is no implicit forcing

 or

 The Haskell option where data structures are created lazily so

 z <- list(f(x))

 would create a list with a deferred evaluation, but any attempt to
 access the value of z would force the evaluation. So printing z,
 for example, would force the evaluation but

    y <- z[[1]]

 would not.

 It is easy enough to create a Delay/Force pair that behaves like
 Scheme's with the tools available in R if that is what you want.
 Haskell, and other fully lazy functional languages, are very
 interesting but very different animals from R. For some reason you
 seem to be expecting some combination of Scheme and Haskell behavior.
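
For example, a minimal sketch of such a Scheme-style pair (hypothetical
Delay/Force names; the deferred value is memoised in a closure):

Delay <- function(expr) {
  expr <- substitute(expr)        # capture the unevaluated expression
  env <- parent.frame()           # remember where to evaluate it
  forced <- FALSE
  val <- NULL
  function() {
    if (!forced) {
      val <<- eval(expr, env)
      forced <<- TRUE
    }
    val
  }
}
Force <- function(p) p()

p <- Delay({ cat("evaluating\n"); pi + 2 })
Force(p)   # evaluates once, printing "evaluating"
Force(p)   # returns the cached value silently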

 Best,

 luke

 
  f <- function(x) environment()
  z <- as.list(f(7))
  dput(z)
  structure(list(x = 7), .Names = "x")
  z[[1]] == 7
  Error in z[[1]] == 7 :
   comparison (1) is possible only for atomic and list types
  force(z[[1]]) == 7
  Error in force(z[[1]]) == 7 :
   comparison (1) is possible only for atomic and list types

  typeof(z)
  [1] "list"
  typeof(z[[1]])
  [1] "promise"
  typeof(force(z[[1]]))
  [1] "promise"
  R.version.string # Vista
  [1] "R version 2.6.0 beta (2007-09-23 r42958)"
 
 
  On 9/19/07, Gabor Grothendieck [EMAIL PROTECTED] wrote:
  The last two lines of example(delayedAssign) give this:
 
  e - (function(x, y = 1, z) environment())(1+2, y, {cat( HO! ); pi+2})
  (le

Re: [Rd] Aggregate factor names

2007-09-27 Thread Gabor Grothendieck
You can do this:

aggregate(iris[-5], iris[5], mean)


On 9/27/07, Mike Lawrence [EMAIL PROTECTED] wrote:
 Hi all,

 A suggestion derived from discussions amongst a number of R users in
 my research group: set the default column names produced by aggregate
 () equal to the names of the objects in the list passed to the 'by'
 object.

 ex. it is annoying to type

 with(
my.data
,aggregate(
my.dv
,list(
one.iv = one.iv
,another.iv = another.iv
,yet.another.iv = yet.another.iv
)
,some.function
)
 )

 to yield a data frame with names = c
 ('one.iv','another.iv','yet.another.iv','x') when this seems more
 economical:

 with(
my.data
,aggregate(
my.dv
,list(
one.iv
,another.iv
,yet.another.iv
)
,some.function
)
 )

 --
 Mike Lawrence
 Graduate Student, Department of Psychology, Dalhousie University

 Website: http://memetic.ca

 Public calendar: http://icalx.com/public/informavore/Public

 The road to wisdom? Well, it's plain and simple to express:
 Err and err and err again, but less and less and less.
- Piet Hein

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Aggregate factor names

2007-09-27 Thread Gabor Grothendieck
You can do this too:

aggregate(iris[-5], iris["Species"], mean)

or this:

with(iris, aggregate(iris[-5], data.frame(Species), mean))

or this:

attach(iris)
aggregate(iris[-5], data.frame(Species), mean)

The point is that you already don't have to write x = x.  The only
reason you are writing it that way is that you are using list instead
of data.frame.  Just use data.frame or appropriate indexing as shown.

On 9/27/07, Mike Lawrence [EMAIL PROTECTED] wrote:
 Understood, but my point is that the naming I suggest should be the
 default. One should not be 'punished' for being explicit in calling
 aggregate.


 On 27-Sep-07, at 1:06 PM, Gabor Grothendieck wrote:

  You can do this:
 
  aggregate(iris[-5], iris[5], mean)
 
 
  On 9/27/07, Mike Lawrence [EMAIL PROTECTED] wrote:
  Hi all,
 
  A suggestion derived from discussions amongst a number of R users in
  my research group: set the default column names produced by aggregate
  () equal to the names of the objects in the list passed to the 'by'
  object.
 
  ex. it is annoying to type
 
  with(
 my.data
 ,aggregate(
 my.dv
 ,list(
 one.iv = one.iv
 ,another.iv = another.iv
 ,yet.another.iv = yet.another.iv
 )
 ,some.function
 )
  )
 
  to yield a data frame with names = c
  ('one.iv','another.iv','yet.another.iv','x') when this seems more
  economical:
 
  with(
 my.data
 ,aggregate(
 my.dv
 ,list(
 one.iv
 ,another.iv
 ,yet.another.iv
 )
 ,some.function
 )
  )
 
  --
  Mike Lawrence
  Graduate Student, Department of Psychology, Dalhousie University
 
  Website: http://memetic.ca
 
  Public calendar: http://icalx.com/public/informavore/Public
 
  The road to wisdom? Well, it's plain and simple to express:
  Err and err and err again, but less and less and less.
 - Piet Hein
 
  __
  R-devel@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-devel
 

 --
 Mike Lawrence
 Graduate Student, Department of Psychology, Dalhousie University

 Website: http://memetic.ca

 Public calendar: http://icalx.com/public/informavore/Public

 The road to wisdom? Well, it's plain and simple to express:
 Err and err and err again, but less and less and less.
- Piet Hein




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] delayedAssign

2007-09-26 Thread Gabor Grothendieck
I thought that perhaps the behavior in the previous post,
while inconsistent with the documentation, was not all that
harmful, but I think it's related to the following, which is a potentially
serious bug.  z is a list with a single numeric component,
as the dput output verifies, yet we cannot compare its first element
to 7 without getting an error message.

Later on we see that it's because it thinks that z[[1]] is of type promise
and even force(z[[1]]) is of type promise.

 f <- function(x) environment()
 z <- as.list(f(7))
 dput(z)
structure(list(x = 7), .Names = "x")
 z[[1]] == 7
Error in z[[1]] == 7 :
  comparison (1) is possible only for atomic and list types
 force(z[[1]]) == 7
Error in force(z[[1]]) == 7 :
  comparison (1) is possible only for atomic and list types

 typeof(z)
[1] "list"
 typeof(z[[1]])
[1] "promise"
 typeof(force(z[[1]]))
[1] "promise"
 R.version.string # Vista
[1] "R version 2.6.0 beta (2007-09-23 r42958)"


On 9/19/07, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 The last two lines of example(delayedAssign) give this:

  e <- (function(x, y = 1, z) environment())(1+2, y, {cat(" HO! "); pi+2})
  (le <- as.list(e)) # evaluates the promises
 $x
 promise: 0x032b31f8
 $y
 promise: 0x032b3230
 $z
 promise: 0x032b3268

 which contrary to the comment appears unevaluated.  Is the comment
 wrong or is it supposed to return an evaluated result but doesn't?

  R.version.string # Vista
 [1] R version 2.6.0 alpha (2007-09-06 r42791)


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Inspecting promises

2007-09-23 Thread Gabor Grothendieck
Is there some way of displaying the expression and evaluation environment
associated with a promise?   I have found the following:

 # first run these two commands to set up example
 e <- new.env()
 delayedAssign("y", x*x, assign.env = e)

 # method 1.  shows expression but not evaluation environment
 str(as.list(e))
List of 1
 $ y: promise to  language x * x

 # method 2. shows expression but not evaluation environment
 substitute(y, e)
x * x

which shows two different ways of displaying the expression
associated with a promise but neither shows the evaluation
environment.  The first technique may actually be a bug in
R based on previous discussion on r-devel.

Is there a way to display both the expression and the evaluation
environment associated with a promise?  It's a bit difficult to debug
code involving promises if you can't inspect the objects you are
working with.

 R.version.string # Vista
[1] R version 2.6.0 beta (2007-09-19 r42914)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] warning upon automatic close of connection

2007-09-21 Thread Gabor Grothendieck
I would like to follow up on the annoying warnings which are generated
when connections are automatically closed.  This is affecting several of
my packages and is quite a nuisance.

R does not give you a message every time it garbage collects, at least
by default.  Perhaps there could be a higher level of warnings that issues
information on garbage collection, closed connections, etc., or perhaps the
user could have control over it, but having it as the default is really
a nuisance, and I hope this warning can be removed.

On 9/12/07, Seth Falcon [EMAIL PROTECTED] wrote:
 Gabor Grothendieck [EMAIL PROTECTED] writes:
  I noticed that under R 2.6.0 there is a warning about closing the connection
  in the code from this post:
  https://stat.ethz.ch/pipermail/r-help/2007-September/140601.html
 
  which is evidently related to the following from the NEWS file:
 
  o Connections will be closed if there is no R object referring to
them.  A warning is issued if this is done, either at garbage
collection or if all the connection slots are in use.
 
  If we use read.table directly it still happens:
 
  # use Lines and Lines2 from cited post
  library(zoo)
  DF1 <- read.table(textConnection(Lines), header = TRUE)
  DF2 <- read.table(textConnection(Lines2), header = TRUE)
  z1 <- zoo(as.matrix(DF1[-1]), as.Date(DF1[,1], "%d/%m/%Y"))
  z2 <- zoo(as.matrix(DF2[-1]), as.Date(DF2[,1], "%d/%m/%Y"))
  both <- merge(z1, z2)
  plot(na.approx(both))
 
  R.version.string # Vista
  [1] R version 2.6.0 alpha (2007-09-06 r42791)
 
  Is this annoying warning really necessary?  I assume we can get rid of
  it by explicitly naming and closing the connections but surely there should
  be a way to avoid the warning without going to those lengths.
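
(For the record, the explicit version looks like this -- a small sketch
reusing Lines from the cited post:)

  tc <- textConnection(Lines)
  DF1 <- read.table(tc, header = TRUE)
  close(tc)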

 Up until the change you mention above it really was necessary to name
 and close all connections.  Short scripts run in fresh R sessions may
 not have had problems with code like you have written above, but
 longer programs or shorter ones run in a long running R session would
 run out of connections.

 Now that connections have weak reference semantics, one can ask
 whether this behavior should be standard and no warning issued.

  I would have thought that read.table opens the connection then it would
  close it itself so no warning would need to be generated.

 In your example, read.table is _not_ opening the connection.  You are
 passing an open connection which has no symbol bound to it:

   foo = ""
   c = textConnection(foo)
   c
       description            class             mode             text
             "foo" "textConnection"              "r"           "text"
            opened         can read        can write
          "opened"            "yes"             "no"

 But I think passing a closed connection would cause the same sort of
 issue.  It seems that there are two notions of closing a connection:
 (i) close as the opposite of open, and (ii) clean up the entire
 connection object.  I haven't looked closely at the code here, so I
 could be wrong, but I'm basing this guess on the following:

  file("foo")
  description       class        mode        text      opened    can read
        "foo"      "file"         "r"      "text"    "closed"       "yes"
    can write
        "yes"
 ## start new R session
 for (i in 1:75) file("foo")
 gc()
 warnings()[1:3]
  gc()
          used (Mb) gc trigger (Mb) max used (Mb)
 Ncells 149603  4.0     350000  9.4   350000  9.4
 Vcells 101924  0.8     786432  6.0   486908  3.8
 There were 50 or more warnings (use warnings() to see the first 50)
  warnings()[1:3]
 $`closing unused connection 76 (foo)`
 NULL

 $`closing unused connection 75 (foo)`
 NULL

 $`closing unused connection 74 (foo)`
 NULL


 --
 Seth Falcon | Computational Biology | Fred Hutchinson Cancer Research Center
 BioC: http://bioconductor.org/
 Blog: http://userprimary.net/user/


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] delayedAssign

2007-09-19 Thread Gabor Grothendieck
The last two lines of example(delayedAssign) give this:

 e <- (function(x, y = 1, z) environment())(1+2, y, {cat(" HO! "); pi+2})
 (le <- as.list(e)) # evaluates the promises
$x
promise: 0x032b31f8
$y
promise: 0x032b3230
$z
promise: 0x032b3268

which contrary to the comment appears unevaluated.  Is the comment
wrong or is it supposed to return an evaluated result but doesn't?

 R.version.string # Vista
[1] R version 2.6.0 alpha (2007-09-06 r42791)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] delayedAssign

2007-09-19 Thread Gabor Grothendieck
Also note that earlier in the same example we have:

 msg <- "old"
 delayedAssign("x", msg)
 msg <- "new!"
 x #- new!
[1] "new!"
 substitute(x) #- msg
x
 R.version.string # Vista
[1] R version 2.6.0 alpha (2007-09-06 r42791)

That is substitute(x) gives x, not msg.

On 9/19/07, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 The last two lines of example(delayedAssign) give this:

  e <- (function(x, y = 1, z) environment())(1+2, y, {cat(" HO! "); pi+2})
  (le <- as.list(e)) # evaluates the promises
 $x
 promise: 0x032b31f8
 $y
 promise: 0x032b3230
 $z
 promise: 0x032b3268

 which contrary to the comment appears unevaluated.  Is the comment
 wrong or is it supposed to return an evaluated result but doesn't?

  R.version.string # Vista
 [1] R version 2.6.0 alpha (2007-09-06 r42791)


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] copying promise

2007-09-19 Thread Gabor Grothendieck
1. Is there some way to copy a promise so that the copy has the same
expression in its promise as the original?  In the following,
y is a promise that we want to copy to z.  We
want z to be a promise based on the expression x, since y is a
promise based on the expression x.  Thus the desired answer to the code
below is z = 2, but it is 1, 1 and y in the next three
examples, so they are not the answer.  See the examples at the end.

2. Is there some way to determine if a variable holds a promise
without evaluating it?

This code relates to question 1.

# example 1
x <- 0
delayedAssign("y", x)
x <- 1
# this forces y which is not what we want
z <- y
x <- 2
z # 1

# example 2
# this connects z to x via y which is not what we want
#  since if y is forced then z takes its value from that, not x
x <- 0
delayedAssign("y", x)
delayedAssign("z", y)
x <- 1
y
x <- 2
z # 1

# example 3
# this attempts to assign the expression underlying promise y to z
# which seems closest in spirit to what we want
# but it does not work as intended
x <- 0
delayedAssign("y", x)
delayedAssign("z", substitute(y))
x <- 1
y
x <- 2
z # y

 R.version.string # Vista
[1] R version 2.6.0 alpha (2007-09-06 r42791)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Date vs date (long)

2007-09-17 Thread Gabor Grothendieck
On 9/17/07, Terry Therneau [EMAIL PROTECTED] wrote:
 Gabor Grothendieck

 as.Date(10)
 You can define as.Date.numeric in your package and then it will work.  zoo
 has done that.

 library(zoo)
 as.Date(10)

  This is also a nice idea.  Although adding to a package is possible, it is
 now very hard to take away, given namespaces.  That is, I can't define my
 own Math.Date to do away with the creation of timespan objects.  Am I
 correct?  Is it also true that adding methods is hard if one uses version 4
 classes?

  The rest of Gabor's comments are workarounds for the problem I raised.
 But I don't want to have to wrap as.numeric around all of my date
 calculations.

You can define as.Date.numeric and Ops.Date, say, using S3 and these
will be added to the whatever is there but won't override the existing
+.Date and -.Date nor would you want them to or else the behavior would
be different depending on whether your package was there or not.  Also
namespaces should not be a problem since zoo uses namespaces and
it defined its own as.Date.numeric.

Try this:

Ops.Date <- function (e1, e2) {
    e <- if (missing(e2)) {
        NextMethod(.Generic)
    }
    else if (any(nchar(.Method) == 0)) {
        NextMethod(.Generic)
    }
    else {
        e1 <- as.numeric(e1)
        e2 <- as.numeric(e2)
        NextMethod(.Generic)
    }
    e
}

Sys.Date() / Sys.Date()
Sys.Date() + as.numeric(Sys.Date())
as.numeric(Sys.Date()) + as.numeric(Sys.Date())

Sys.Date() + Sys.Date() # error since it's intercepted by +.Date

Thus you will have to issue some as.numeric calls, but perhaps not too
many.

However, I think it's better not to implement Ops.Date as above but
just to leave the Date operations the way they are, extend them with
as.Date.numeric as zoo has done, and force the user to use as.numeric
in other cases to make it clear from the code that a conversion is
going on.  I have done a fair amount of Date manipulation and have not
found the as.numeric calls to be onerous.
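
A minimal sketch of that as.Date.numeric route (zoo ships its own version;
this standalone stand-in simply assumes the number counts days since
1970-01-01):

as.Date.numeric <- function(x, origin = "1970-01-01", ...) as.Date(origin, ...) + x

as.Date(10)   # "1970-01-11"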



Re: [Rd] r cmd build

2007-09-16 Thread Gabor Grothendieck
The Writing R Extensions manual says to do an R CMD build for releases
to CRAN.  That's what I have been doing, and that does produce a .tar.gz
file even though I get a message about hhc.exe.  Is that what I should
continue to do and ignore the message, or should I be using one of the
alternatives you mention to create a .tar.gz release file on Vista?

On 9/16/07, Duncan Murdoch [EMAIL PROTECTED] wrote:
 On 15/09/2007 10:27 PM, Gabor Grothendieck wrote:
  On Windows Vista hhc.exe is not available.  One can do this on an
  install:
 
  rcmd install --docs=normal myPackage
 
  to avoid the message about hhc.exe; however,
  rcmd build does not appear to support --docs=normal so one cannot
  do a build without getting a message about hhc.exe (although the build
  still proceeds).

 Are you talking about build --binary?  I recommend using install
 --build instead.  AFAIK a non-binary build doesn't make any use of hhc.

 Another way to do an install or build without getting that message is to
 indicate in src/gnuwin32/MkRules that you don't want to build CHM help.

 Duncan Murdoch




Re: [Rd] r cmd build

2007-09-16 Thread Gabor Grothendieck
Something should be added to the Writing R Extensions manual, since one
gets this message during R CMD build on Vista:

hhc: not found
CHM compile failed: HTML Help Workshop not installed?

and one is not really sure whether the result is OK or not -- it does say
it FAILED.  Even better would be to get rid of the message or to improve it.
Again, this all refers to doing a build on Vista.

On 9/16/07, Uwe Ligges [EMAIL PROTECTED] wrote:


 Gabor Grothendieck wrote:
  The Writing R Extensions manual says to do an R CMD build for releases
  to CRAN.  That's what I have been doing, and that does produce a .tar.gz
  file even though I get a message about hhc.exe.  Is that what I should
  continue to do and ignore the message, or should I be using one of the
  alternatives you mention to create a .tar.gz release file on Vista?

 Well, to produce a tar.gz, it is not required to have hhc.exe.  It might
 happen that vignettes are created and therefore the package is installed
 for this purpose (and hhc is used to produce some help pages --
 temporarily).  When hhc is not found at that place, just ignore it; the
 resulting .tar.gz should really be fine.

 Uwe Ligges

  On 9/16/07, Duncan Murdoch [EMAIL PROTECTED] wrote:
  On 15/09/2007 10:27 PM, Gabor Grothendieck wrote:
  On Windows Vista hhc.exe is not available.  One can do this on an
  install:
 
  rcmd install --docs=normal myPackage
 
  to avoid the message about hhc.exe; however,
  rcmd build does not appear to support --docs=normal so one cannot
  do a build without getting a message about hhc.exe (although the build
  still proceeds).
  Are you talking about build --binary?  I recommend using install
  --build instead.  AFAIK a non-binary build doesn't make any use of hhc.
 
  Another way to do an install or build without getting that message is to
  indicate in src/gnuwin32/MkRules that you don't want to build CHM help.
 
  Duncan Murdoch
 
 




[Rd] r cmd build

2007-09-15 Thread Gabor Grothendieck
On Windows Vista hhc.exe is not available.  One can do this on an
install:

rcmd install --docs=normal myPackage

to avoid the message about hhc.exe; however,
rcmd build does not appear to support --docs=normal so one cannot
do a build without getting a message about hhc.exe (although the build
still proceeds).



Re: [Rd] Building an R GUI using gWidgets and RGtk2

2007-09-14 Thread Gabor Grothendieck
A fourth approach would be the proto package.  It provides a thin
layer over environments that supports the prototype (aka object-based)
style of programming, which is fundamentally different from
class-based programming (although it is powerful enough to encompass
class-based programming).  The gsubfn package uses proto objects
as generalizations of replacement strings that hold state from one
replacement to the next.  An application that may be closer to yours and
that uses proto is ggplot2, a recent grid-based plotting package.  The
home page is at http://r-proto.googlecode.com .
See the paper on prototype programming linked from the home page as well
as the package vignette.
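
As a hedged, GUI-free sketch of the prototype style (the account example is
made up purely for illustration): a child object delegates fields and methods
to its parent and can override them.

library(proto)
account <- proto(balance = 0,
                 deposit = function(., amount) .$balance <- .$balance + amount)
savings <- account$proto(balance = 100)   # child of account, overriding balance
savings$deposit(50)
savings$balance   # 150
account$balance   # 0 -- the parent is untouched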

On 9/14/07, Gerlanc, Daniel [EMAIL PROTECTED] wrote:
 Hello,

 I'm developing a GUI in R that will be used to monitor financial
 portfolio performance.  The GUI will be distributed as an R package.  So
 far, I've decided to use the cairoDevice, RGtk2, gWidgets, and
 gWidgetsRGtk2 packages to develop the GUI.  I am trying to decide what
 the best way to structure the GUI would be.

 I've considered 3 approaches to building the GUI.  The first would be to
 use S4 classes.  I would create a parent gui object that would store
 widgets or containers in slots.  Other more specialized guis for
 different purposes would extend this parent gui object.  The
 difficulty in this approach is R's use of pass-by-value.  Once the gui
 object has been created, changing any of the slots of the gui requires
 returning a new GUI object or saving one off in a higher level
 environment and editing the slots directly.  Editing the slots directly
 would completely bypass the S4 method dispatch.

 Another approach would be more functional.  I would create variables
 that are global within the package or in their own environment and
 define the package function closures within this environment.  This
 could work, but the code could get noisy when calls have to be made to
 distinguish between local variable assignment within the environment of
 the functions and assignment within the namespace of the package.

 The third approach I've been considering is using the R.oo package.  I
 have never used this package before but it appears to provide similar OO
 features to Java.  Because it allows references, it would seem to
 provide the features I'm looking for from both the S4 and functional
 approaches.

 Any comments or suggestions on these different approaches would be
 appreciated.

 Thank you.

 Sincerely,

 Daniel Gerlanc





Re: [Rd] Date vs date

2007-09-14 Thread Gabor Grothendieck
On 9/14/07, Terry Therneau [EMAIL PROTECTED] wrote:
  I wrote the date package long ago, and it has been useful.  In my current
 task of reunifying the R (Tom Lumley) and Splus (me) code trees for survival,
 I'm removing the explicit dependence on 'date' objects from the expected
 survival routines so that they better integrate.  Comparison of 'date' to
 'Date' has raised a couple of questions.

  Clearly Date is more mature -- more options for conversion, better plotting,
 etc (a long list of etc).  I see three things where date is better.  Only the
 last of these really matters, and is the point on which I would like comment.
 (Well, actually I'd like to talk you all into a change, of course).

  1. Since date uses 1/1/1960 as the base, and so does SAS, those of us who
 constantly pass files back and forth between those two packages have a slightly
 easier time.

There are some other programs that use 1/1/70.  See the R Help Desk article
in R News 4/1 that discusses a few origins.


  2. as.date(10) works, as.Date(10) does not.  Sometimes I have done a
 manipulation that the date package does not understand, and I know that the
 result is still of the right type, but the package does not.  However, this is
 fairly rare and I can work around it. (It mostly occurs in processing the rate
 tables for expected survival).

You can define as.Date.numeric in your package and then it will work.  zoo
has done that.

library(zoo)
as.Date(10)

Some other things you can do:

today <- Sys.Date()
Epoch <- today - as.numeric(today)

Epoch + 10  # similar to as.Date(10)
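
A hedged aside on point 1: day counts coming from SAS are relative to
1960-01-01, so shifting the origin converts them directly (the numbers below
are chosen only for illustration).

sas.days <- c(0, 10, 17532)
as.Date("1960-01-01") + sas.days   # "1960-01-01" "1960-01-11" "2008-01-01"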


  3. temp <- as.Date('1990/1/1') - as.date('1953/2/5')
 sqrt(temp)
 Error in Math.difftime(temp3) : sqrtnot defined for difftime objects

  Minor bug: no space before the word 'not'
  Major: this shouldn't fail.

 People will do things with time intervals that you have not thought of.
 Fitting a growth curve that uses a square root, for instance.  I firmly
 believe that the superior behavior in the face of something unexpected is
 to assume that the user knows what they are doing, and return a numeric.
   I recognize that "assume the user knows what they are doing" is an anathema
 to the more zealous OO types, but in designing a class I have found that they
 often know more than me!

   4. Variation on #3 above

  (as.Date('2007-9-14') - as.Date('1953-3-10')) / 365.25
  Time difference of 54.51335 days

No, I am not 54.5 days old.  Both hair color and knee creaking most
 definitely proclaim otherwise, I am sorry to say. Time difference / number
 should be a number.

Note that you can write:

x <- Sys.Date()
y <- x + 1
as.numeric(x - y)
as.numeric(x) - as.numeric(y)
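
The same as.numeric conversion also covers the sqrt() failure in point 3 -- a
minimal sketch, using Date for both operands rather than mixing date and Date:

temp <- as.Date("1990/1/1") - as.Date("1953/2/5")
sqrt(as.numeric(temp))   # about 116.1, the square root of the number of days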


   5. This is only amusing.  I'm not saying that as.Date should necessarily
 work, but the format is certainly not ambiguous.  (Not standard, but not
 ambiguous.)  Not important to fix, not something that date does any better.

  as.Date('09Sep2007')
 Error in fromchar(x) : character string is not in a standard unambiguous 
 format

as.Date("09Sep2007", "%d%b%Y")




Terry Therneau



Re: [Rd] Building an R GUI using gWidgets and RGtk2

2007-09-14 Thread Gabor Grothendieck
On 9/14/07, Gabor Grothendieck [EMAIL PROTECTED] wrote:
 On 9/14/07, Gerlanc, Daniel [EMAIL PROTECTED] wrote:
  Hello,
 
  I'm developing a GUI in R that will be used to monitor financial
  portfolio performance.  The GUI will be distributed as an R package.  So
  far, I've decided to use the cairoDevice, RGtk2, gWidgets, and
  gWidgetsRGtk2 packages to develop the GUI.  I am trying to decide what
  the best way to structure the GUI would be.
 
  I've considered 3 approaches to building the GUI.  The first would be to
  use S4 classes.  I would create a parent gui object that would store
  widgets or containers in slots.  Other more specialized guis for
  different purposes would extend this parent gui object.  The
  difficulty in this approach is R's use of pass-by-value.  Once the gui
  object has been created, changing any of the slots of the gui requires
  returning a new GUI object or saving one off in a higher level
  environment and editing the slots directly.  Editing the slots directly
  would completely bypass the S4 method dispatch.
 
  Another approach would be more functional.  I would create variables
  that are global within the package or in their own environment and
  define the package function closures within this environment.  This
  could work, but the code could get noisy when calls have to be made to
  distinguish between local variable assignment within the environment of
  the functions and assignment within the namespace of the package.
 
  The third approach I've been considering is using the R.oo package.  I
  have never used this package before but it appears to provide similar OO
  features to Java.  Because it allows references, it would seem to
  provide the features I'm looking for from both the S4 and functional
  approaches.
 
  Any comments or suggestions on these different approaches would be
  appreciated.
 
  Thank you.
 
  Sincerely,
 
  Daniel Gerlanc
 A fourth approach would be the proto package.  It provides a thin
 layer over environments that supports the prototype (aka object-based)
 style of programming, which is fundamentally different from
 class-based programming (although it is powerful enough to encompass
 class-based programming).  The gsubfn package uses proto objects
 as generalizations of replacement strings that hold state from one
 replacement to the next.  An application that may be closer to yours and
 that uses proto is ggplot2, a recent grid-based plotting package.  The
 home page is at http://r-proto.googlecode.com .
 See the paper on prototype programming linked from the home page as well
 as the package vignette.

Just to illustrate this further, here is a simple example of gWidgets and
proto.  In this example we create a proto object, p, that corresponds to
an ok/cancel dialogue labelled "Hello" and that prints "Hello" when OK is
pressed.  The components of p are go, msg and handler.  msg is
a character string, go is a method and handler is a function (to be
a method it would have to take the proto object as its first argument).

q is created as a child of p, so q gets components via
delegation from p.  q overrides msg, which was "Hello" in p but is "Bye"
in q.  q acts the same as p except that the label is "Bye" and q prints
"Bye" when OK is pressed.  Note that we pass the proto object to the
handler via the action= argument.  Here we used a dot (.) to denote the
current object, but you could use "this" or "self" or any variable name
you prefer.

library(proto)
library(gWidgets)

p <- proto(go = function(.) {
        w = gwindow()
        g = ggroup(container = w)
        g.i = ggroup(horizontal = FALSE, container = g)
        glabel(.$msg, container = g.i, expand = TRUE)
        g.i.b = ggroup(container = g.i)
        addSpring(g.i.b)
        gbutton("ok", handler = with(., handler), action = ., container = g.i.b)
        gbutton("cancel", handler = function(h, ...) dispose(w),
            container = g.i.b)
    },
    msg = "Hello",
    handler = function(h, ...) {
        cat("\n", h$action$msg, "\n")
        dispose(h$obj)
    }
)
p$go()  # press ok and "Hello" is printed

q <- p$proto(msg = "Bye")  # q is a child of p overriding msg
q$go()  # press ok and "Bye" is printed


