Re: [R] Excel User interface for R

2013-03-05 Thread Barry Rowlingson
On Mon, Mar 4, 2013 at 1:39 PM, Tammy Ma metal_lical...@live.com wrote:

 HI,

 Assume I have the data frame generated from R as the following:

 Product  Price  market_share
       A    100           10%
       B    110            9%
       C    120           20%
       D     90           61%

 What I want is this kind of Excel user interface: by changing the 
 price of certain products, I get the impact on market share as computed by R.
 If I change A's price from 100 to 90, and B's price from 110 to 100, then, based 
 on the calculation in R, I want to get the result in Excel, for example:


 Product  Price  market_share
       A     90           20%
       B    100           12%
       C    120           10%
       D     90           58%


 I want an Excel interface that lets me change the value in an Excel cell and 
 then, based on the calculation in R, exports the market-share impact back 
 into Excel.

 Do you have a recommendation for such a user interface, or something else 
 that would let me achieve this task?

 This is just not how you *think* in R. Formulas in Excel are all
hidden, and stuff happens automatically, and if columns are all
dependent on one another then you are redundantly storing information.

 If I was doing a table like this I would only store the original
information, and then write some functions to produce the spreadsheet
with all the derived columns. I would only change the original
information, and then re-run the function that generates the report.

 Now, I really don't understand in your example how changing the price
of A from 100 to 90 results in a change of market_share from 10% to
20%, so I guess there must be some complex formula going on. I'll
simplify.

 Suppose your data is Product, Price, and Unit Sales:

  d = data.frame(Product=c("A","B","C","D"), Price=c(100,110,120,90), Sales=c(3,5,3,7))
  d
  Product Price Sales
1       A   100     3
2       B   110     5
3       C   120     3
4       D    90     7


 And in Excel you would normally compute market share by value as
100*Price * Sales / SUM(Price*Sales). Then in R:

  d$ShareByValue = 100 * d$Price * d$Sales / sum(d$Price*d$Sales)
  d
  Product Price Sales ShareByValue
1       A   100     3     16.30435
2       B   110     5     29.89130
3       C   120     3     19.56522
4       D    90     7     34.23913

But if the price of Product A was wrong, and you have to change it to
90, then yes, you have to re-run the function:
  d$Price=90
  d$ShareByValue = 100 * d$Price * d$Sales / sum(d$Price*d$Sales)
  d
  Product Price Sales ShareByValue
1       A    90     3         16.7
2       B    90     5         27.8
3       C    90     3         16.7
4       D    90     7         38.9

But now we can get a bit smarter. Let's write a function that computes
the share by value:

 computeShare = function(d){100*d$Price*d$Sales/sum(d$Price*d$Sales)}
  computeShare(d)
[1] 16.7 27.8 16.7 38.9


 So we just have to do:

  d$Share = computeShare(d)

and we can re-use that function on other data (with the same named
columns). If there are any bugs in that function they only need fixing
in one place. Yes, if your data changes then all your columns may need
adjusting, but you just write a script or a function that does it. I
think something like this:

 updateSales = function(d){
  d$Share = computeShare(d)
  d$TotalItemCost = totalItemCost(d)
  d$Tax = taxRate*d$TotalItemCost
  totalTax = sum(d$Tax)
 list(data=d,totalTax=totalTax)
}

is a lot easier to understand than hunting round spreadsheet cells to
find that one formula you typed in wrong...
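
For example, with the d and computeShare() defined above (a minimal sketch of
that workflow):

  d$Price[d$Product == "A"] <- 90   # correct the one wrong input value
  d$Share <- computeShare(d)        # regenerate the derived column
  d                                 # the "report" is now up to date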

Barry

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Mysterious issues with reading text files from R in ArcGIS and Excel

2013-03-05 Thread Kerry
I appreciate the help and suggestions.  I was afraid that this question would 
be considered off topic, but thought I would give it a try to see if anyone 
else gets these results from R output files.  As I do not know what a hex 
editor or hexbin are I guess I will not be able to continue this discussion 
here.  I will take Duncan's suggestion and try R-Geo?
 
~K




 From: Jeff Newmiller jdnew...@dcn.davis.ca.us

Murdoch murdoch.dun...@gmail.com 
Cc: r-help@r-project.org 
Sent: Monday, March 4, 2013 6:29 PM
Subject: Re: [R] Mysterious issues with reading text files from R in ArcGIS and 
Excel

Your description of diagnosis uses non-R software (off topic here). Please 
either describe the difference in the files (you may need a hex editor or the 
hexbin package to detect the differences) or supply the files that behave 
differently (this may require some alternate route than this mailing list if 
there are odd characters at fault).

For what it is worth, TXT is not a clearly-defined format, so this could be 
more effectively addressed by using a more specific format for data exchange.
---
Jeff Newmiller                        The     .       .  Go Live...
DCN:jdnew...@dcn.davis.ca.us        Basics: ##.#.       ##.#.  Live Go...
                                      Live:   OO#.. Dead: OO#..  Playing
Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
/Software/Embedded Controllers)               .OO#.       .OO#.  rocks...1k
---
Sent from my phone. Please excuse my brevity.
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Mysterious issues with reading text files from R in ArcGIS and Excel

2013-03-05 Thread Michael Sumner
I would try write.csv or write.table with row.names = FALSE - just a guess,
and do follow up on R-Sig-Geo if need be.
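
For instance (a minimal sketch; 'mydata' stands in for whatever data frame you
are exporting):

  write.csv(mydata, "out.csv", row.names = FALSE)
  # or, for tab-delimited text:
  write.table(mydata, "out.txt", sep = "\t", row.names = FALSE, quote = FALSE)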

Cheers, Mike

On Tuesday, March 5, 2013, Kerry wrote:

 I appreciate the help and suggestions.  I was afraid that this question
 would be considered off topic, but thought I would give it a try to see
 if anyone else gets these results from R output files.  As I do not know
 what a hex editor or hexbin are I guess I will not be able to continue this
 discussion here.  I will take Duncan's suggestion and try R-Geo?

 ~K



 
  From: Jeff Newmiller jdnew...@dcn.davis.ca.us

 Murdoch murdoch.dun...@gmail.com
 Cc: r-help@r-project.org
 
 Sent: Monday, March 4, 2013 6:29 PM
 Subject: Re: [R] Mysterious issues with reading text files from R in
 ArcGIS and Excel

 Your description of diagnosis uses non-R software (off topic here). Please
 either describe the difference in the files (you may need a hex editor or
 the hexbin package to detect the differences) or supply the files that
 behave differently (this may require some alternate route than this mailing
 list if there are odd characters at fault).

 For what it is worth, TXT is not a clearly-defined format, so this could
 be more effectively addressed by using a more specific format for data
 exchange.
 ---
 Jeff Newmiller                        The     .       .  Go Live...
 DCN:jdnew...@dcn.davis.ca.us        Basics: ##.#.       ##.#.  Live Go...
                                       Live:   OO#.. Dead: OO#..  Playing
 Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
 /Software/Embedded Controllers)               .OO#.       .OO#.  rocks...1k
 ---
 Sent from my phone. Please excuse my brevity.
 [[alternative HTML version deleted]]



-- 
Michael Sumner
Hobart, Australia
e-mail: mdsum...@gmail.com

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Agreement and Consistency of 2D data

2013-03-05 Thread Felix Nensa
Hi,

I have two different imaging modalities (for the identification of areas of
infarcted myocardium) that I need to compare regarding agreement and
consistency.
However, I don't think that methods like Cohen's Kappa, PCC, Bland-Altman
or ICC are sufficient here as there is not just a pairwise but also a
spatial relationship between measured data points. For example if the
results of the two imaging modalities are slightly misaligned their
agreement might be still much better than the mentioned tests might
indicate (at least from a practical point of view). On the other hand the
number of non-infarcted myocardial segments often heavily outweighs the
number of infarcted segments which seems to introduce a bias towards better
agreement in most examinations.

Here is an example data set, where I've calculated the ICC.
What would be the correct way to assess agreement and consistency here?

# modality 1
lge_mtx <- as.matrix(read.table('http://cluster010.ovh.net/~myvideoc/R20130305/lge_mtx.csv'))

# modality 2
pet_mtx <- as.matrix(read.table('http://cluster010.ovh.net/~myvideoc/R20130305/pet_mtx.csv'))

# ICC
# note: the modalities are on inverse scales,
# thus in modality 1, 0 denotes a normal segment and 100 a complete infarction of the segment,
# and vice versa in modality 2
library(irr)
print(icc(cbind(100 - c(pet_mtx), c(lge_mtx)), model="twoway", type="c"))
print(icc(cbind(100 - c(pet_mtx), c(lge_mtx)), model="twoway", type="a"))


If the matrices are mapped to polar plots (also called bullseye plots) one
can visually assess their agreement.
Each matrix row defines one circle starting at 0° with columns equally
mapped to segments of 3.6° width.
The first row defines the outermost circle, the last row defines the inner
circle.
Here are the corresponding plots for the matrices given above (red is
infarction).

http://cluster010.ovh.net/~myvideoc/R20130305/LGE.png
http://cluster010.ovh.net/~myvideoc/R20130305/PET.png
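
For readers who want to reproduce such a plot, here is a minimal sketch of the
mapping just described (plot_bullseye is a hypothetical helper, not from any
package; it assumes a numeric matrix with values in 0-100, rows ordered from
the outermost to the innermost ring):

plot_bullseye <- function(mtx, ramp = colorRampPalette(c("white", "red"))(101)) {
  nr <- nrow(mtx); nc <- ncol(mtx)
  plot(NA, xlim = c(-1, 1), ylim = c(-1, 1), asp = 1,
       axes = FALSE, xlab = "", ylab = "")
  for (i in seq_len(nr)) {                 # row 1 = outermost ring
    r.out <- 1 - (i - 1) / nr
    r.in  <- 1 - i / nr
    for (j in seq_len(nc)) {               # each column = one angular segment
      th <- seq(j - 1, j, length.out = 20) * 2 * pi / nc
      polygon(c(r.out * cos(th), rev(r.in * cos(th))),
              c(r.out * sin(th), rev(r.in * sin(th))),
              col = ramp[round(mtx[i, j]) + 1], border = NA)
    }
  }
}
plot_bullseye(lge_mtx)        # and plot_bullseye(100 - pet_mtx) for the inverse-scaled modality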

I know, this is not a pure R question but more a general statistical one.
Hopefully it is still ok to post it here :-)

Best, Felix

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R function for estimating historical-VaR

2013-03-05 Thread Patrick Burns

You might have a look at:

http://www.portfolioprobe.com/2012/12/17/a-look-at-historical-value-at-risk/

which points to a function for historical VaR.

As Nello said, we really need to know what it is
that you think doesn't work, before we can help
you with what you have.

It probably doesn't really matter, but doing a
full sort is wasteful compared with what 'quantile'
does.
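
For example, a minimal sketch of that idea (hypothetical returns vector;
type = 1 picks an order statistic rather than interpolating):

hist_var <- function(returns, value = 1000, p = 0.01) {
  -quantile(returns, probs = p, type = 1, names = FALSE) * value
}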

If you have further questions and they are finance
oriented as opposed to being about R programming, then
you should post to R-sig-finance (you have to subscribe
before posting).

Pat


On 04/03/2013 16:25, Blaser Nello wrote:

Does it just not work or does it not do the right thing? The reason it doesn't 
work is that you are writing 'T = length(returns) x_foc = vector(length=n) N = 
T-(n+1)' on one line instead of using three lines. However, your description of 
what you want to do also doesn't seem to correspond to the function. Please 
clarify what exactly you want the function to do. You could also write the 
current function as follows.

VaR_foc <- function(returns, value=1000, p=.01, n=300) {
N <- length(returns)-n-1
op <- N*p
unlist(lapply(1:n, function(i) {-sort(returns[i:(N+i)])[op]*value}))
}
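
A quick usage sketch with simulated returns (purely illustrative):

set.seed(1)
r <- rnorm(2300, mean = 0, sd = 0.01)    # stand-in for the y_IBM series
head(VaR_foc(r, value = 1000, p = 0.01, n = 300))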

Nello Blaser


-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
Behalf Of Аскар Нысанов
Sent: Montag, 4. März 2013 14:07
To: R-help@r-project.org
Subject: [R] R function for estimating historical-VaR



Hi everyone!! I am new in R and I want to create a simple R function for 
estimating historical-VaR.  In y_IBM returns, there are 2300 observations. For 
evaluation I take the next 2000 observations, then I abandon the latest 300 
observations. Firstly, I use the window which has the fix length and contains 
the observations from 1 to 2000 to estimate the VaR. At first I  take 2000 obs. 
and reorder these series in ascending order, from smallest return to largest 
return. Each ordered return is assigned an index value (1, 2, ...). At the 99% 
confidence level, the daily VaR under historical simulation method equals the 
return corresponding to the index number calculated as follows:
(1-0.99)*2000 (the number of our window) =20. The return corresponding to index 
20 is the daily historical simulation VaR.
I repeat the first step except the window changes the observations from 2 to 
2001. Such a process provides 300 one-step ahead VaR.
My function is:



VaR_foc <- function (returns, value = 1000, p = 0.01, n=251) { T = 
length(returns) x_foc = vector(length=n) N = T-(n+1)
m=sort(returns[1:N])
op = as.integer(N*p) # p % smallest
for (i in 2:n) {
g= returns[i:(N+i)]
ys = sort(g) # sort returns
x_foc[[1]] = -m[op]*value # VaR number
x_foc[i] = -ys[op]*value
}
return(x_foc)
}
VaR_foc (returns=y_IBM)

But the function doesn't work, can somebody help me wh

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



--
Patrick Burns
pbu...@pburns.seanet.com
twitter: @burnsstat @portfolioprobe
http://www.portfolioprobe.com/blog
http://www.burns-stat.com
(home of:
 'Impatient R'
 'The R Inferno'
 'Tao Te Programming')

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] What package can I use to help me categorize comment responses?

2013-03-05 Thread Milan Bouchet-Valat
On Monday, 4 March 2013 at 20:28, Lopez, Dan wrote:
 Hi,
 
 We have comment questions from a survey that we need to categorize.
 What package and functions can I use in R to help do this?
If you are asking for some kind of clustering based on vocabulary, then
have a look at the tm package and the articles that present it. If you
want an integrated solution with a graphical user interface, then my
package RcmdrPlugin.temis performs hierarchical clustering using Ward's
method and a Chi-squared distance.
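
A minimal sketch of the tm route (the 'comments' character vector is
hypothetical; RcmdrPlugin.temis uses a chi-squared distance, plain Euclidean
distance is used here only for brevity):

library(tm)
docs <- Corpus(VectorSource(comments))
docs <- tm_map(docs, content_transformer(tolower))
docs <- tm_map(docs, removePunctuation)
docs <- tm_map(docs, removeWords, stopwords("en"))
dtm  <- DocumentTermMatrix(docs)
hc   <- hclust(dist(as.matrix(dtm)), method = "ward.D")  # "ward" in older R
plot(hc)   # cut the tree to get candidate categories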


My two cents

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Color palettes for black/white printing

2013-03-05 Thread David Studer
Hi everybody!

Does anyone know a good way to color my images so that
when I print them out on a non-color-printer the colors used
can be distinguished well? As I have many categories I would
not want to assign the colors c("black", "grey", "white") by
hand.

Thank you!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Color palettes for black/white printing

2013-03-05 Thread ONKELINX, Thierry
Have a look at bpy.colors()

Best regards,

ir. Thierry Onkelinx
Instituut voor natuur- en bosonderzoek / Research Institute for Nature and 
Forest
team Biometrie & Kwaliteitszorg / team Biometrics & Quality Assurance
Kliniekstraat 25
1070 Anderlecht
Belgium
+ 32 2 525 02 51
+ 32 54 43 61 85
thierry.onkel...@inbo.be
www.inbo.be

To call in the statistician after the experiment is done may be no more than 
asking him to perform a post-mortem examination: he may be able to say what the 
experiment died of.
~ Sir Ronald Aylmer Fisher

The plural of anecdote is not data.
~ Roger Brinner

The combination of some data and an aching desire for an answer does not ensure 
that a reasonable answer can be extracted from a given body of data.
~ John Tukey


-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On Behalf Of 
David Studer
Sent: Tuesday, 5 March 2013 11:56
To: r-help@r-project.org
Subject: [R] Color palettes for black/white printing

Hi everybody!

Does anyone know a good way to color my images so that when I print them out on 
a non-color-printer the colors used can be distinguished well? As I have many 
categories I would not want to assign the colors c("black", "grey", "white") by 
hand.

Thank you!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
* * * * * * * * * * * * * D I S C L A I M E R * * * * * * * * * * * * *
Dit bericht en eventuele bijlagen geven enkel de visie van de schrijver weer en 
binden het INBO onder geen enkel beding, zolang dit bericht niet bevestigd is 
door een geldig ondertekend document.
The views expressed in this message and any annex are purely those of the 
writer and may not be regarded as stating an official position of INBO, as long 
as the message is not confirmed by a duly signed document.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] how_to_create_a_package?

2013-03-05 Thread Jim Lemon

On 03/05/2013 02:42 PM, Jyoti Sharma wrote:

Hello sir,

I am Jyoti Sharma, and I am working as a project fellow at IGIB Delhi.
I need your help to learn how to create a package, as well as how to
submit that package to CRAN for public use. I have used the
package.skeleton() command but it is not working properly.
So, please suggest the right way as soon as possible.
I'll be very thankful to you.


Hi Jyoti,
Creating packages is not a trivial task. The only ASAP solution that I 
could suggest is to read the first and third sections in "Creating 
packages" ("Package structure" and "Checking and building packages") in 
the "Writing R Extensions" document that comes with R. To see this, invoke:


help.start()

in an R session and it should be the second link down on the left of the 
page that appears.


Jim

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Color palettes for black/white printing

2013-03-05 Thread Michael Sumner
That is bpy.colors() in the contributed sp package on CRAN, not in R itself.
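
For example (requires the sp package):

library(sp)
bpy.colors(10)   # blue-pink-yellow ramp intended to remain distinguishable in greyscale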

On Tuesday, March 5, 2013, ONKELINX, Thierry wrote:

 Have a look at bpy.colors()

 Best regards,

 ir. Thierry Onkelinx
 Instituut voor natuur- en bosonderzoek / Research Institute for Nature and
 Forest
 team Biometrie & Kwaliteitszorg / team Biometrics & Quality Assurance
 Kliniekstraat 25
 1070 Anderlecht
 Belgium
 + 32 2 525 02 51
 + 32 54 43 61 85
 thierry.onkel...@inbo.be
 www.inbo.be

 To call in the statistician after the experiment is done may be no more
 than asking him to perform a post-mortem examination: he may be able to say
 what the experiment died of.
 ~ Sir Ronald Aylmer Fisher

 The plural of anecdote is not data.
 ~ Roger Brinner

 The combination of some data and an aching desire for an answer does not
 ensure that a reasonable answer can be extracted from a given body of data.
 ~ John Tukey


 -Original Message-
 From: r-help-boun...@r-project.org [mailto:
 r-help-boun...@r-project.org] On Behalf Of David Studer
 Sent: Tuesday, 5 March 2013 11:56
 To: r-help@r-project.org
 Subject: [R] Color palettes for black/white printing

 Hi everybody!

 Does anyone know a good way to color my images so that when I print them
 out on a non-color-printer the colors used can be distinguished well? As I
 have many categories I would not want to assign the colors c("black",
 "grey", "white") by hand.

 Thank you!

 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.
 * * * * * * * * * * * * * D I S C L A I M E R * * * * * * * * * * * * *
 Dit bericht en eventuele bijlagen geven enkel de visie van de schrijver
 weer en binden het INBO onder geen enkel beding, zolang dit bericht niet
 bevestigd is door een geldig ondertekend document.
 The views expressed in this message and any annex are purely those of the
 writer and may not be regarded as stating an official position of INBO, as
 long as the message is not confirmed by a duly signed document.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.



-- 
Michael Sumner
Hobart, Australia
e-mail: mdsum...@gmail.com

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Color palettes for black/white printing

2013-03-05 Thread Achim Zeileis

On Tue, 5 Mar 2013, David Studer wrote:


Hi everybody!

Does anyone know a good way to color my images so that
when I print them out on a non-color-printer the colors used
can be distinguished well? As I have many categories I would
not want to assign the colors c("black", "grey", "white") by
hand.


The colorspace package provides a wide range of colors that - when 
printed on a grayscale printer - still preserve an increasing/decreasing 
gray palette.


To explore these, see choose_palette() in colorspace, which opens a GUI, 
lets you play around with the palettes, see them in example displays, and 
also lets you collapse them to gray colors (option: desaturate).


Furthermore, you can explore the effects of different types of color 
blindness (provided that the dichromat package is installed).
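
A minimal non-interactive sketch (the particular palette chosen here is
arbitrary):

library(colorspace)
pal <- sequential_hcl(7)   # luminance varies monotonically along the palette
desaturate(pal)            # preview how a grayscale printer would render it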


Best,
Z


Thank you!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] how_to_create_a_package?

2013-03-05 Thread Rui Barradas

Hello,

Another document that could help is Friedrich Leisch, "Creating R 
Packages: A Tutorial", which can be found at


http://cran.r-project.org/doc/contrib/Leisch-CreatingPackages.pdf


Hope this helps,

Rui Barradas

On 05-03-2013 11:15, Jim Lemon wrote:

On 03/05/2013 02:42 PM, Jyoti Sharma wrote:

Hello sir,

I am Jyoti Sharma, and I am working as a project fellow at IGIB Delhi.
I need your help to learn how to create a package, as well as how to
submit that package to CRAN for public use. I have used the
package.skeleton() command but it is not working properly.
So, please suggest the right way as soon as possible.
I'll be very thankful to you.


Hi Jyoti,
Creating packages is not a trivial task. The only ASAP solution that I
could suggest is to read the first and third sections in "Creating
packages" ("Package structure" and "Checking and building packages") in
the "Writing R Extensions" document that comes with R. To see this, invoke:

help.start()

in an R session and it should be the second link down on the left of the
page that appears.

Jim

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] plotmath: angle brackets

2013-03-05 Thread Donatella Quagli
Dear all,

is it possible to print angle brackets (LaTeX notation: \langle, \rangle)? I 
found that lceil and lfloor
are available, see demo(plotmath). But langle and rangle are not.

I tried to print utf8 characters directly as well without success.

I have also read something about a tikzDevice package. But obviously this is 
not available anymore.

Any suggestions?

Greetings
Donatella

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] survfit plot question

2013-03-05 Thread Andrews, Chris
1. A censored observation
2. It does not relate to either
3. See ?print.survfit  .  Recall also that the mean of a positive random 
variable is the integral from 0 to infinity of the survival function.  The 
truncated mean is the integral from 0 to tau (819 in your case) of the survival 
function.
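
In symbols (restating the point above; $\tau$ is the restriction time):

  E[T] = \int_0^{\infty} S(t)\,dt, \qquad
  \text{restricted mean} = \int_0^{\tau} S(t)\,dt, \quad \tau = 819 \text{ here.}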

Chris


-Original Message-
From: Steve Einbender [mailto:steve.einben...@gmail.com] 
Sent: Monday, March 04, 2013 7:44 AM
To: r-help
Subject: [R] survfit plot question

Hello,
I create a plot from a coxph object called fit.ads4:
plot(survfit(fit.ads4))

plot is located at:
https://www.dropbox.com/s/9jswrzid7mp1u62/survfit%20plot.png

I also create the following survfit statistics:

 print(survfit(fit.ads4), print.rmean=T)
Call: survfit(formula = fit.ads4)

records   n.max n.start  events  *rmean *se(rmean)  median 0.95LCL 0.95UCL
  203.0   100.0   100.0   103.0   486.7       24.4   387.0   340.0   467.0
   * restricted mean with upper limit =  819


Questions:
1.  What is the cross mark in the plot ?
2.  How does the cross mark in the plot relate to either the rmean or the 
median from survfit ?
3.  What is the meaning of the restricted mean ?  The upper limit noted in 
the output is the end of the observation period (i.e., it is always the Stop 
value in the Censored observation)

Thanks for taking the time to review

Steve

[[alternative HTML version deleted]]


**
Electronic Mail is not secure, may not be read every day, and should not be 
used for urgent or sensitive issues 

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Generating 1-bit and 8-bit BMP files using R

2013-03-05 Thread Robert Baer

On 3/5/2013 12:51 AM, Ingo Reinhold wrote:

Hi,

I'm trying to use the data I generate within R to make images in .bmp 
format, to be printed later on.

My first thought was the RImageJ package, but this seems to be discontinued. What I am 
currently doing is generating a matrix of grey values, which needs to be converted into the 
right image format. Is anyone aware of a package, or a reasonably easy way, to 
generate these images using R?
Sorry to hear this about RImageJ.  I did find that old versions are 
still available in the repository archive: 
http://cran.r-project.org/src/contrib/Archive/RImageJ/


Some of the capabilities you might be looking for are available in 
EBImage on Bioconductor, but I don't remember specifically whether the 
.bmp format was supported.
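
If a plain grey-scale BMP is acceptable (rather than the 1-bit/8-bit formats
asked about), base R's bmp() device can write one directly where that device
is available; a minimal sketch with a made-up matrix m of grey values in [0, 1]:

m <- matrix(runif(100 * 100), 100, 100)
bmp("grey.bmp", width = ncol(m), height = nrow(m))
par(mar = rep(0, 4))
image(t(m[nrow(m):1, ]), col = grey(0:255 / 255), axes = FALSE, useRaster = TRUE)
dev.off()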


Rob

Many thanks,

Ingo

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



--

Robert W. Baer, Ph.D.
Professor of Physiology
Kirksville College of Osteopathic Medicine
A. T. Still University of Health Sciences
Kirksville, MO 63501 USA

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R function for estimating historical-VaR

2013-03-05 Thread Joshua Ulrich
Cross-posted, verbatim, on stackoverflow:
http://stackoverflow.com/q/15203347/271616
--
Joshua Ulrich  |  about.me/joshuaulrich
FOSS Trading  |  www.fosstrading.com

R/Finance 2013: Applied Finance with R  | www.RinFinance.com


On Mon, Mar 4, 2013 at 7:07 AM, Аскар  Нысанов nysanas...@mail.ru wrote:


 Hi everyone!! I am new in R and I want to create a simple R function for 
 estimating historical-VaR.  In y_IBM returns, there are 2300 observations. 
 For evaluation I take the next 2000 observations,
 then I abandon the latest 300 observations. Firstly, I use the window which 
 has the fix
 length and contains the observations from 1 to 2000 to estimate the VaR. At 
 first I  take 2000 obs. and reorder these series in ascending order, from 
 smallest return to largest return. Each ordered return is assigned an index 
 value (1, 2, ...). At the 99% confidence level, the daily VaR under 
 historical simulation method equals the return corresponding to the index 
 number calculated as follows:
 (1-0.99)*2000 (the number of our window) =20. The return corresponding to 
 index 20 is the daily historical simulation VaR.
 I repeat the first step except the window changes the observations from 2 to 
 2001. Such a process provides 300 one-step ahead VaR.
 My function is:



 VaR_foc <- function (returns, value = 1000, p = 0.01, n=251) {
 T = length(returns)
 x_foc = vector(length=n)
 N = T-(n+1)
 m=sort(returns[1:N])
 op = as.integer(N*p) # p % smallest
 for (i in 2:n) {
 g= returns[i:(N+i)]
 ys = sort(g) # sort returns
 x_foc[[1]] = -m[op]*value # VaR number
 x_foc[i] = -ys[op]*value
 }
 return(x_foc)
 }
 VaR_foc (returns=y_IBM)

 But the function doesn't work, can somebody help me wh

 [[alternative HTML version deleted]]


 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] How to reference to the `stats` package in academical paper

2013-03-05 Thread Julien Mvdb
The question is in the title.
Then, I would like to know how I should refer to the documentation
regarding the use of each function.

Thanks,

 Julien Mehl Vettori

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Problem with R CMD check and the inconsolata font business

2013-03-05 Thread Matthew Dowle


On 11/3/2011 3:30 PM, Brian Diggs wrote:


Well, I figured it out.  Or at least got it working.  I had to run

initexmf --mkmaps

because apparently there was something wrong with my font mappings.  
I
don't know why; I don't know how.  But it works now.  I think 
installing

the font into the Windows Font directory was not necessary.  I'm
including the solution in case anyone else has this problem.


Many thanks Brian Diggs! I just had the same problem and that fixed it.

Matthew

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] boxplot with frequencies(counts)

2013-03-05 Thread km
Dear All,

I have a table as following
position type count
1   2 100
1   3  51
1   5  64
1   8  81
1   6  32
2   2  41
2   3  85
and so on


Normally I would have a vector of types (2, 3, 4, 5, ...) for each position and
plot them by position.
But now I have counts of these types.
Is there a way to compute a boxplot from this kind of data?
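
One common approach is to expand the counts back into individual observations
(a minimal sketch, assuming a data frame 'tab' with columns position, type and
count as above):

expanded <- tab[rep(seq_len(nrow(tab)), tab$count), c("position", "type")]
boxplot(type ~ position, data = expanded)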

Regards,
KM

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] How to reference to the `stats` package in academical paper

2013-03-05 Thread Jorge I Velez
Dear Julien,

Check

citation('stats')
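
The result can also be converted to BibTeX if the journal workflow needs it:

citation("stats")            # for a base package this points to citing R itself
toBibtex(citation("stats"))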

HTH,
Jorge.-


On Wed, Mar 6, 2013 at 12:05 AM, Julien Mvdb julien.m...@gmail.com wrote:

 The question is in the title.
 Then, I would like to know how I should refer to the documentation
 regarding the use of each functions.

 Thanks,

  Julien Mehl Vettori

 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Fwd: Re: How to reference to the `stats` package in academical paper

2013-03-05 Thread Andrew Koeser

Julien,

I would just try your best given the journal's style guide and wait for 
them to change it. For what it is worth, my last paper was corrected as 
follows.

(In text)...one-way analysis of variance (ANOVA) in R [version 2.14.2 
(R Core Team, 2012)].


(reference section)

R Core Team. 2012. R: A language and environment for statistical 
computing. 11 Oct. 2012. http://www.R-project.org/.


Andrew

On 03/05/2013 07:05 AM, Julien
Mvdb wrote:
 The question is in the title.
 Then, I would like to know how I should refer to the documentation
 regarding the use of each functions.

 Thanks,

   Julien Mehl Vettori

   [[alternative HTML version deleted]]

 __
 R-help@r-project.org  mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guidehttp://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] caret pls model statistics

2013-03-05 Thread Charles Determan Jr
Does anyone know of any literature on the kappa statistic with plsda?  I
have been trying to find papers that used plsda for classification and have
yet to come across this kappa value.  All the papers I come across
typically have R2 as an indicator of model fit.  I want to make sure I
conduct such analysis appropriately, any guidance is appreciated.

Regards,
Charles

On Sun, Mar 3, 2013 at 4:38 PM, Max Kuhn mxk...@gmail.com wrote:

 That the most common formula, but not the only one. See

   Kvålseth, T. (1985). Cautionary note about R^2. American Statistician,
 39(4), 279–285.

 Traditionally, the symbol 'R' is used for the Pearson correlation
 coefficient and one way to calculate R^2 is... R^2.

 Max


 On Sun, Mar 3, 2013 at 3:16 PM, Charles Determan Jr deter...@umn.eduwrote:

 I was under the impression that in PLS analysis, R2 was calculated by 1-
 (Residual sum of squares) / (Sum of squares).  Is this still what you are
 referring to?  I am aware of the linear R2 which is how well two variables
 are correlated but the prior equation seems different to me.  Could you
 explain if this is the same concept?

 Charles


 On Sun, Mar 3, 2013 at 12:46 PM, Max Kuhn mxk...@gmail.com wrote:

  Is there some literature that you make that statement?

 No, but there isn't literature on changing a lightbulb with a duck
 either.

  Are these papers incorrect in using these statistics?

 Definitely, if they convert 3+ categories to integers (but there are
 specialized R^2 metrics for binary classification models). Otherwise, they
 are just using an ill-suited score.

  How would you explain such an R^2 value to someone? R^2 is
 a function of correlation between the two random variables. For two
 classes, one of them is binary. What does it mean?

 Historically, models rooted in computer science (eg neural networks)
 used RMSE or SSE to fit models with binary outcomes and that *can* work
 work well.

 However, I don't think that communicating R^2 is effective. Other
 metrics (e.g. accuracy, Kappa, area under the ROC curve, etc) are designed
 to measure the ability of a model to classify and work well. With 3+
 categories, I tend to use Kappa.
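
A minimal sketch of that advice with caret (Species kept as a factor so that
train() treats the problem as classification and reports Accuracy and Kappa;
tuning settings are left at their defaults):

library(caret)
data(iris)
set.seed(1)
ctrl <- trainControl(method = "cv", number = 10)
plsClass <- train(Species ~ ., data = iris, method = "pls",
                  trControl = ctrl, metric = "Kappa",
                  preProc = c("center", "scale"))
plsClass   # resampled Accuracy and Kappa by number of components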

 Max




 On Sun, Mar 3, 2013 at 10:53 AM, Charles Determan Jr 
 deter...@umn.eduwrote:

 Thank you for your response Max.  Is there some literature that you
 make that statement?  I am confused as I have seen many publications that
 contain R^2 and Q^2 following PLSDA analysis.  The analysis usually is to
 discriminate groups (ie. classification).  Are these papers incorrect in
 using these statistics?

 Regards,
 Charles


 On Sat, Mar 2, 2013 at 10:39 PM, Max Kuhn mxk...@gmail.com wrote:

 Charles,

 You should not be treating the classes as numeric (is virginica
 really three times setosa?). Q^2 and/or R^2 are not appropriate for
 classification.

 Max


 On Sat, Mar 2, 2013 at 5:21 PM, Charles Determan Jr 
 deter...@umn.eduwrote:

 I have discovered one of my errors.  The timematrix was unnecessary and an
 unfortunate habit I brought from another package.  The following provides
 the same R2 values as it should; however, I still don't know how to
 retrieve Q2 values.  Any insight would again be appreciated:

 library(caret)
 library(pls)

 data(iris)

 #needed to convert to numeric in order to do regression
 #I don't fully understand this but if I left as a factor I would get an
 #error following the summary function
 iris$Species=as.numeric(iris$Species)
 inTrain1=createDataPartition(y=iris$Species,
 p=.75,
 list=FALSE)

 training1=iris[inTrain1,]
 testing1=iris[-inTrain1,]

 ctrl1=trainControl(method="cv",
 number=10)

 plsFit2=train(Species~.,
 data=training1,
 method="pls",
 trControl=ctrl1,
 metric="Rsquared",
 preProc=c("scale"))

 data(iris)
 training1=iris[inTrain1,]
 datvars=training1[,1:4]
 dat.sc=scale(datvars)

 pls.dat=plsr(as.numeric(training1$Species)~dat.sc,
 ncomp=3, method="oscorespls", data=training1)

 x=crossval(pls.dat, segments=10)

 summary(x)
 summary(plsFit2)

 Regards,
 Charles

 On Sat, Mar 2, 2013 at 3:55 PM, Charles Determan Jr deter...@umn.edu
 wrote:

  Greetings,
 
  I have been exploring the use of the caret package to conduct some
 plsda
  modeling.  Previously, I have come across methods that result in a
 R2 and
  Q2 for the model.  Using the 'iris' data set, I wanted to see if I
 could
  accomplish this with the caret package.  I use the following code:
 
  library(caret)
  data(iris)
 
  #needed to convert to numeric in order to do regression
  #I don't fully understand this but if I left as a factor I would get an
  #error following the summary function
  iris$Species=as.numeric(iris$Species)
  inTrain1=createDataPartition(y=iris$Species,
  p=.75,
  list=FALSE)
 
  training1=iris[inTrain1,]
  testing1=iris[-inTrain1,]
 
  ctrl1=trainControl(method="cv",
  number=10)
 
  plsFit2=train(Species~.,
  data=training1,
  method="pls",
  trControl=ctrl1,
  metric="Rsquared",
  

Re: [R] chisq.test

2013-03-05 Thread arun
If you wanted to do a t.test
res1 <- do.call(cbind,lapply(seq_len(nrow(m)),function(i) 
do.call(rbind,lapply(split(rbind(m[i,-1],n),1:nrow(rbind(m[i,-1],n))), 
function(x) {x1 <- rbind(x,m[i,-1]); t.test(x1[1,],x1[2,])$p.value}))))
 res2 <- do.call(cbind,lapply(seq_len(ncol(res1)),function(i) 
c(c(tail(res1[seq(1,i,1),i],-1),1),res1[-c(1:i),i])))
 attr(res2,"dimnames") <- NULL
 res2
#  [,1]  [,2]  [,3]  [,4]
#[1,] 1.000 1.000 1.000 0.6027881
#[2,] 1.000 1.000 1.000 0.5790103
#[3,] 1.000 1.000 1.000 1.000
#[4,] 0.6027881 0.6027881 0.5637881 1.000

#here, the first column is testing a2, against a2, a,c,t, second c2, against t, 
#c2, a,c, third c3 against c,t,c3,a, and fourth t2 against a,c,t, and t2.
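
For the chi-squared comparison asked about in the subject, a minimal sketch
along the same lines (chisq_p is a hypothetical helper that drops categories
absent from both rows before testing):

chisq_p <- function(x, y) {
  tab <- rbind(x, y)
  tab <- tab[, colSums(tab) > 0, drop = FALSE]
  chisq.test(tab)$p.value        # expect small-count warnings with these data
}
pvals <- outer(seq_len(nrow(m)), seq_len(nrow(n)),
               Vectorize(function(i, j) chisq_p(unlist(m[i, -1]), n[j, ])))
dimnames(pvals) <- list(rownames(m), rownames(n))
pvals   # rows a2, c2, c3, t2; columns a, c, t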
A.K.







From: Vera Costa veracosta...@gmail.com
To: arun smartpink...@yahoo.com 
Sent: Tuesday, March 5, 2013 9:38 AM
Subject: Re: chisq.test


ok, thank you.

I will test. Thank you very much




From: Vera Costa veracosta...@gmail.com
To: arun smartpink...@yahoo.com
Sent: Tuesday, March 5, 2013 8:23 AM
Subject: Re: chisq.test



Sorry if my explanation isn't good...

I have this tables:

m <- structure(list(id = structure(1:4, .Label = c("a2", "c2", "c3", 
"t2"), class = "factor"), `1` = c(0L, 0L, 0L, 1L), `2` = c(8L, 
8L, 6L, 10L), `3` = c(2L, 2L, 4L, 5L)), .Names = c("id", "1", 
"2", "3"), row.names = c("a2", "c2", "c3", "t2"), class = "data.frame")


n <- structure(c(0, 0, 1, 8, 7, 10, 2, 3, 5), .Dim = c(3L, 3L), .Dimnames = 
list(
    c("a", "c", "t"), c("1", "2", "3")))

and I need to apply a chisq.test between all of them. I need to compare a2 to a, c and 
t, then c2 with a, c and t, then c3 with a, c and t, and so on. 

And the output will be something like this:

             a             b             c
a2          xxx           xxx           xxx
c2          xxx           xxx           xxx
c3          xxx           xxx           xxx
t2          xxx           xxx           xxx

where xxx are the p-values.


It isn't possible?

Vera




__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] survfit plot question

2013-03-05 Thread Terry Therneau



On 03/05/2013 05:00 AM, r-help-requ...@r-project.org wrote:
Hello, I create a plot from a coxph object called fit.ads4: plot(survfit(fit.ads4))
Questions:
1. What is the cross mark in the plot?
2. How does the cross mark in the plot relate to either the rmean or the median from survfit?

Try help(plot.survfit).  Read the description of mark.time.


3.  What is the meaning of the restricted mean ?  The upper limit noted
in the output is the end of the observation period (i.e., it is always
the Stop value in the Censored observation)

help(print.survfit)

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Excel User interface for R

2013-03-05 Thread Richard M. Heiberger
This request is completely satisfied by RExcel.  Please look at
rcom.univie.ac.at for full
information, including download information.  Follow-up should be on the
rcom email list.
You can sign up at the website.

On Mon, Mar 4, 2013 at 8:39 AM, Tammy Ma metal_lical...@live.com wrote:


 HI,

 Assume I have the data frame generated from R as the following:

 Product  Price  market_share
       A    100           10%
       B    110            9%
       C    120           20%
       D     90           61%

 What I want is this kind of Excel user interface: by changing the
 price of certain products, I get the impact on market share as computed by
 R.
 If I change A's price from 100 to 90, and B's price from 110 to 100,
 then, based on the calculation in R, I want to get the result in Excel, for
 example:


 Product  Price  market_share
       A     90           20%
       B    100           12%
       C    120           10%
       D     90           58%


 I want an Excel interface that lets me change the
 value in an Excel cell and then, based on the calculation in R, exports the
 market-share impact back into Excel.

 Do you have a recommendation for such a user interface, or something else
 that would let me achieve this task?

 Thanks.

 Kind regards,
 Lingyi




 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Error message from flexsurvreg

2013-03-05 Thread Brett Close
I'm working on a survival analysis project and, based on previous work of
my colleagues on similar projects, am trying to estimate an AFT model with
a generalized gamma distribution.  When I execute the following code:

flexsurvreg(timesurv ~ cycles + rcycle + cyctime, dist="gengamma")

I get the following error message:

Error in optim(optpars, minusloglik.flexsurv, t = Y[, "time"], dead = Y[,  :
  non-finite finite-difference value [4]

I get an almost identical error (a 3 in the brackets instead of a 4) if I
use dist = "weibull".  But I have no problems when I run survreg with a
Weibull distribution on the same data.

I've seen some messages about similar errors for other functions, but don't
think the responses apply (at least not in a way I understand).  I tried
setting initial values based on the survreg output values and that didn't
help.  Also, I checked and none of the variables are particularly
well-correlated.

Any idea how to deal with this?

Thanks,

-- 
Brett Close

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Help

2013-03-05 Thread Jonson Javier
This is my first time to use R. 

For clarification:

I made an orthogonal array (OA) design for 4 factors with the following levels: one with 4 levels, 
two with 3 levels, and one with 6 levels. 
Using the DoE package, I have generated 72 runs (setting columns="min3"). 
However, the numbers of generalized words of lengths 3 and 4 are still equal to 
2.00e+00 and 4.559372e-33, respectively. 
I am concerned that this will affect my estimates, since my study is only 
interested in obtaining the orthogonal main effects of the factors.

How can I improve my design so that I can be sure of estimating the main effects?

Thank you very much.

Jonson M. Javier 
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] different colors for two wireframe spheres

2013-03-05 Thread David Schellenberger Costa
Dear List,

I have the code below adapted from the lattice-package examples to draw two 
spheres. I would now like to give
both different surface colors, e.g. one red and one blue. 

 ## 3-D surface parametrized on a 2-D grid
 
 library(lattice)   # for wireframe()

 n <- 10

 tx <- matrix(seq(-pi, pi, length.out = 2*n), 2*n, n)
 ty <- matrix(seq(-pi, pi, length.out = n) / 2, 2*n, n, byrow = T)
 xx <- cos(tx) * cos(ty)
 yy <- sin(tx) * cos(ty)
 zz <- sin(ty)
 zzz <- zz
 
 bxx <- xx+5 
 byy <- yy+5
 bzzz <- zzz+5
 
 xx=rbind(xx,rep(NA,n),bxx)
 yy=rbind(yy,rep(NA,n),byy)
 zzz=rbind(zzz,rep(NA,n),bzzz)
 gr <- rbind(matrix(1,n,2*n),rep(NA,n),matrix(2,n,2*n))

 wireframe(zzz ~ xx + yy, groups=gr, col.groups=c("red","blue"))

I tried various parameters such as col.groups and col.regions, but I have the impression 
that the groups argument in the wireframe command can only be used when 
supplying a data argument, which does not allow plotting the spheres, as far as I 
could tell. Is there a solution for this?

Cheers

David Schellenberger
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Zelig package: Coxph model problems

2013-03-05 Thread Stephen Knight

Hi,

I'm having problems with the Zelig package - when using 
the call below, R displays the following message (I'm running R 
i386 2.15.3 for Windows and have updated all the Zelig 
packages):


z.out <- zelig(Surv(psurv2, pcens2) ~ ren_sup3 + age,
data=data_urgent, model="coxph")

** The model coxph is not available with the currently 
loaded packages,

** and is not an official Zelig package.
** The model's name may be a typo.

Any suggestions?

Regards,
Steve Knight

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] generalized gamma distribution

2013-03-05 Thread خاکیفیروز مرضیه
To whom it may concern:

I'm a beginner with R. I have a data set and want to compare the generalized 
gamma distribution with different 
K (gamma shape parameter) for my data; in other words, I want to fit the gengamma 
distribution when K takes different values, for example k = 1, 2, 5, 10, …
Would you please help me with how I could do this?

Best 
Marzieh
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] positioning of R windows

2013-03-05 Thread Glenn Stauffer
I've updated to the latest version of R, but still the problem persists.
Another thing I noticed (but failed to mention before) is that when I
initially open the change working directory dialog box, the little line
labeled Folder: (under the window showing the folder tree) does show the
current folder for a split second before reverting back to the folder
Computer. But the tree always just selects Computer and below that lists
the C drive and the D drive.

Anyway, I am convinced now that it is not an R issue, but rather something
about my computer, specifically. 

Glenn Stauffer 


-Original Message-
From: Duncan Murdoch [mailto:murdoch.dun...@gmail.com] 
Sent: Friday, March 01, 2013 8:25 AM
To: Glenn Stauffer
Cc: 'Prof Brian Ripley'; r-help@r-project.org
Subject: Re: [R] positioning of R windows

On 13-02-28 11:10 AM, Duncan Murdoch wrote:
 On 28/02/2013 11:00 AM, Glenn Stauffer wrote:
 Ahh, I should have known about the MDI and SDI options - choosing SDI 
 lets me do what I want. Thanks.
 On #2, I realized that when the change directory dialog window pops 
 up, if I resize it, R remembers the resizing so that now the entire 
 window is visible. I should have tried that before I posted. No luck 
 though on getting the change directory dialog box to begin at the 
 current working directory the same manner as 'save' or 'open'
 dialog boxes.
 I'll update to the new R version

 In the 2.15.3 release candidate, I see this:

 The first time I open that control, it starts in the current working 
 directory.

 After that, it starts in the last directory chosen in that control. If 
 I've used setwd() in the console to change, it doesn't see the change.

 This might be fixable, but not for tomorrow.

It is now changed so that it starts in the current working directory,
whether that was chosen by dialog or setwd().  This will appear in 3.0.0.

Duncan Murdoch


 Duncan Murdoch


 Thanks,
 Glenn Stauffer

 -Original Message-
 From: r-help-boun...@r-project.org 
 [mailto:r-help-boun...@r-project.org] On Behalf Of Prof Brian Ripley
 Sent: Thursday, February 28, 2013 1:52 AM
 To: r-help@r-project.org
 Subject: Re: [R] positioning of R windows

 On 27/02/2013 22:33, Glenn Stauffer wrote:
 I have 2 (related, I think) questions about positioning of windows 
 within
 R.

 1.   I often work with a second monitor and sometimes like to arrange 1
 or more plot windows on the second monitor, while keeping the 
 console on the primary monitor (so I can see things better). I used 
 to be able to do this (when using Windows XP), but it seems that now 
 (using Windows 7) I can't even move the plot window outside of the 
 parent R window. Is this a Windows
 7 issue, or something I can fix with R preferences?

 Run RGui with --sdi.  I don't believe it was ever intentionally 
 possible to move MDI windows outside the frame.

 2.   When I use the file menu to change directories I noticed 2
 differences from Win XP to Win 7. In Win 7, 1) the bottom of the 
 pop-up window is off the bottom of my computer, and 2) the directory 
 tree defaults to something close to the root, regardless of the 
 current
 working directory.
 In Win XP, the directory tree defaulted to the current working 
 directory, which made it easy to jump up one folder, etc. Is there 
 any way to make this the default behavior?

 Ask Microsoft not to change the behaviour of their common controls API.

 I am using R 2.15.1

 Which is not current: R 2.15.3 will be released tomorrow.  And you 
 are comparing an old OS (Win7) with a very old one (XP): R for 
 Windows was adapted for Win7 and before that, Vista, several years ago.




 Thanks,

 Glenn Stauffer


 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.





__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] positioning of R windows

2013-03-05 Thread Duncan Murdoch

On 05/03/2013 10:22 AM, Glenn Stauffer wrote:

I've updated to the latest version of R,


That's a little ambiguous.  Do you mean 2.15.3?  I didn't put the new 
change into that, because it was too late.  You can try a build of the 
development version, which will be released as 3.0.0 in April, and you 
should see different behaviour.


Duncan Murdoch



  but still the problem persists.
Another thing I noticed (but failed to mention before) is that when I
initially open the change working directory dialog box, the little line
labeled Folder: (under the window showing the folder tree) does show the
current folder for a split second before reverting back to the folder
Computer. But the tree always just selects Computer and below that lists
the C drive and the D drive.

Anyway, I am convinced now that it is not an R issue, but rather something
about my computer, specifically.

Glenn Stauffer


-Original Message-
From: Duncan Murdoch [mailto:murdoch.dun...@gmail.com]
Sent: Friday, March 01, 2013 8:25 AM
To: Glenn Stauffer
Cc: 'Prof Brian Ripley'; r-help@r-project.org
Subject: Re: [R] positioning of R windows

On 13-02-28 11:10 AM, Duncan Murdoch wrote:
 On 28/02/2013 11:00 AM, Glenn Stauffer wrote:
 Ahh, I should have known about the MDI and SDI options - choosing SDI
 lets me do what I want. Thanks.
 On #2, I realized that when the change directory dialog window pops
 up, if I resize it, R remembers the resizing so that now the entire
 window is visible. I should have tried that before I posted. No luck
 though on getting the change directory dialog box to begin at the
 current working directory the same manner as 'save' or 'open'
 dialog boxes.
 I'll update to the new R version

 In the 2.15.3 release candidate, I see this:

 The first time I open that control, it starts in the current working
 directory.

 After that, it starts in the last directory chosen in that control. If
 I've used setwd() in the console to change, it doesn't see the change.

 This might be fixable, but not for tomorrow.

It is now changed so that it starts in the current working directory,
whether that was chosen by dialog or setwd().  This will appear in 3.0.0.

Duncan Murdoch


 Duncan Murdoch


 Thanks,
 Glenn Stauffer

 -Original Message-
 From: r-help-boun...@r-project.org
 [mailto:r-help-boun...@r-project.org] On Behalf Of Prof Brian Ripley
 Sent: Thursday, February 28, 2013 1:52 AM
 To: r-help@r-project.org
 Subject: Re: [R] positioning of R windows

 On 27/02/2013 22:33, Glenn Stauffer wrote:
 I have 2 (related, I think) questions about positioning of windows
 within
 R.

 1.   I often work with a second monitor and sometimes like to arrange 1
 or more plot windows on the second monitor, while keeping the
 console on the primary monitor (so I can see things better). I used
 to be able to do this (when using Windows XP), but it seems that now
 (using Windows 7) I can't even move the plot window outside of the
 parent R window. Is this a Windows
 7 issue, or something I can fix with R preferences?

 Run RGui with --sdi  I don't believe it was ever intentionally
 possible to move MDI windows outside the frame.

 2.   When I use the file menu to change directories I noticed 2
 differences from Win XP to Win 7. In Win 7, 1) the bottom of the
 pop-up window is off the bottom of my computer, and 2) the directory
 tree defaults to something close to the root, regardless of the
 current
 working directory.
 In Win XP, the directory tree defaulted to the current working
 directory, which made it easy to jump up one folder, etc. Is there
 any way to make this the default behavior?

 Ask Microsoft not to change the behaviour of their common controls API.

 I am using R 2.15.1

 Which is not current: R 2.15.3 will be released tomorrow.  And you
 are comparing an old OS (Win7) with a very old one (XP): R for
 Windows was adapted for Win7 and before that, Vista, several years ago.




 Thanks,

 Glenn Stauffer


[[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.








__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] positioning of R windows

2013-03-05 Thread Glenn Stauffer
I'm sorry - yes I meant 2.15.3. But I think it probably does not matter.
Others are not seeing the behavior that I am (with any version), so I don't
think R is the problem. But I will post if I ever get it figured out.

Thanks,
Glenn Stauffer

-Original Message-
From: Duncan Murdoch [mailto:murdoch.dun...@gmail.com] 
Sent: Tuesday, March 05, 2013 10:58 AM
To: Glenn Stauffer
Cc: 'Prof Brian Ripley'; r-help@r-project.org
Subject: Re: [R] positioning of R windows

On 05/03/2013 10:22 AM, Glenn Stauffer wrote:
 I've updated to the latest version of R,

That's a little ambiguous.  Do you mean 2.15.3?  I didn't put the new change
into that, because it was too late.  You can try a build of the development
version, which will be released as 3.0.0 in April, and you should see
different behaviour.

Duncan Murdoch


   but still the problem persists.
 Another thing I noticed (but failed to mention before) is that when I 
 initially open the change working directory dialog box, the little 
 line labeled Folder: (under the window showing the folder tree) does 
 show the current folder for a split second before reverting back to 
 the folder Computer. But the tree always just selects Computer and 
 below that lists the C drive and the D drive.

 Anyway, I am convinced now that it is not an R issue, but rather 
 something about my computer, specifically.

 Glenn Stauffer


 -Original Message-
 From: Duncan Murdoch [mailto:murdoch.dun...@gmail.com]
 Sent: Friday, March 01, 2013 8:25 AM
 To: Glenn Stauffer
 Cc: 'Prof Brian Ripley'; r-help@r-project.org
 Subject: Re: [R] positioning of R windows

 On 13-02-28 11:10 AM, Duncan Murdoch wrote:
  On 28/02/2013 11:00 AM, Glenn Stauffer wrote:
  Ahh, I should have known about the MDI and SDI options - choosing 
  SDI lets me do what I want. Thanks.
  On #2, I realized that when the change directory dialog window pops 
  up, if I resize it, R remembers the resizing so that now the entire 
  window is visible. I should have tried that before I posted. No 
  luck though on getting the change directory dialog box to begin 
  at the current working directory the same manner as 'save' or 'open'
  dialog boxes.
  I'll update to the new R version
 
  In the 2.15.3 release candidate, I see this:
 
  The first time I open that control, it starts in the current working 
  directory.
 
  After that, it starts in the last directory chosen in that control. 
  If I've used setwd() in the console to change, it doesn't see the
change.
 
  This might be fixable, but not for tomorrow.

 It is now changed so that it starts in the current working directory, 
 whether that was chosen by dialog or setwd().  This will appear in 3.0.0.

 Duncan Murdoch

 
  Duncan Murdoch
 
 
  Thanks,
  Glenn Stauffer
 
  -Original Message-
  From: r-help-boun...@r-project.org
  [mailto:r-help-boun...@r-project.org] On Behalf Of Prof Brian 
  Ripley
  Sent: Thursday, February 28, 2013 1:52 AM
  To: r-help@r-project.org
  Subject: Re: [R] positioning of R windows
 
  On 27/02/2013 22:33, Glenn Stauffer wrote:
  I have 2 (related, I think) questions about positioning of windows 
  within
  R.
 
  1.   I often work with a second monitor and sometimes like to
 arrange
  1
  or more plot windows on the second monitor, while keeping the 
  console on the primary monitor (so I can see things better). I 
  used to be able to do this (when using Windows XP), but it seems 
  that now (using Windows 7) I can't even move the plot window 
  outside of the parent R window. Is this a Windows
  7 issue, or something I can fix with R preferences?
 
  Run RGui with --sdi  I don't believe it was ever intentionally 
  possible to move MDI windows outside the frame.
 
  2.   When I use the file menu to change directories I noticed 2
  differences from Win XP to Win 7. In Win 7, 1) the bottom of the 
  pop-up window is off the bottom of my computer, and 2) the 
  directory tree defaults to something close to the root, regardless 
  of the current
  working directory.
  In Win XP, the directory tree defaulted to the current working 
  directory, which made it easy to jump up one folder, etc. Is there 
  any way to make this the default behavior?
 
  Ask Microsoft not to change the behaviour of their common controls API.
 
  I am using R 2.15.1
 
  Which is not current: R 2.15.3 will be released tomorrow.  And you 
  are comparing an old OS (Win7) with a very old one (XP): R for 
  Windows was adapted for Win7 and before that, Vista, several years ago.
 
 
 
 
  Thanks,
 
  Glenn Stauffer
 
 
[[alternative HTML version deleted]]
 
  __
  R-help@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-help
  PLEASE do read the posting guide
  http://www.R-project.org/posting-guide.html
  and provide commented, minimal, self-contained, reproducible code.
 
 
 
 



__
R-help@r-project.org mailing list

Re: [R] plotmath: angle brackets

2013-03-05 Thread Prof Brian Ripley

On 05/03/2013 12:05, Donatella Quagli wrote:

Dear all,

is it possible to print angle brackets (LaTeX notation: \langle, \rangle)? I 
found that lceil and lfloor
are available, see demo(plotmath). But langle and rangle are not.

I tried to print utf8 characters directly as well without success.

I have also read something about a tikzDevice package. But obviously this is 
not available anymore.

Any suggestions?


You can use anything from the Adobe Symbol font, see ?plotmath. 
symbol(0xe1) and symbol(0xf1) look the nearest match.
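
A minimal sketch of that suggestion (not part of the original reply; "\341" and
"\361" are the octal forms of 0xe1 and 0xf1, and the exact rendering depends on
the graphics device and fonts in use):

# angle brackets drawn from the Adobe Symbol font via plotmath
plot(1:10, type = "n")
text(5, 5, expression(symbol("\341") * x[i] * symbol("\361")), cex = 2)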




Greetings
Donatella




--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Reading outdated .Rprofile file

2013-03-05 Thread Simon Kiss
Hi there:
I'm having a weird problem with my startup procedure. R.app is reading an 
unknown .Rprofile file.

First, I'm on a Mac Os 10.6.8 running R.app 2.15.0

On startup
  getwd()
[1] /Users/simon

But: the contents of my .Rprofile file in my home directory when viewed with a 
text editor are:

.First <- function() {
    source("/Users/simon/Documents/R/functions/trim.leading.R")
    source("/Users/simon/Documents/R/functions/trim.trailing.R")
    source("/Users/simon/Documents/R/functions/trim.R")
    source("/Users/simon/Documents/R/functions/pseudor2.R")
    source("/Users/simon/Documents/R/functions/dates.R")
    source("/Users/simon/Documents/R/functions/andersen.R")
    source("/Users/simon/Documents/R/functions/tabfun.R")
    source("/Users/simon/Documents/R/functions/cox_snell.R")
    source("/Users/simon/Documents/R/functions/cor.prob.R")
    source("/Users/simon/Documents/R/functions/kmo.R")
    source("/Users/simon/Documents/R/functions/residual.stats.R")
    source("/Users/simon/Documents/R/functions/missings.plot.R")
}


but then, when I type .First from the command line I get
function () 
{
    source("/Users/simon/Documents/R/functions/sample_size.R")
    source("/Users/simon/Documents/R/functions/pseudor2.R")
    source("/Users/simon/Documents/R/functions/dates.R")
    source("/Users/simon/Documents/R/functions/andersen.R")
    source("/Users/simon/Documents/R/functions/tabfun.R")
    source("/Users/simon/Documents/R/functions/cox_snell.R")
    source("/Users/simon/Documents/R/functions/cor.prob.R")
    source("/Users/simon/Documents/R/functions/kmo.R")
}

Needless to say, I get an error because the file sample.size.R was deleted a 
long time ago.

So how do I get R.app to read the updated .Rprofile file?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Reading outdated .Rprofile file

2013-03-05 Thread David Winsemius
Preamble: There is a dedicated R-Mac-SIG list where this would have been more 
appropriately directed, and follow-ups should be directed there (and the R-help 
address removed so we do not have two simultaneous threads).

On Mar 5, 2013, at 8:38 AM, Simon Kiss wrote:

 Hi there:
 I'm having a weird problem with my startup procedure. R.app is reading an 
 unknown .Rprofile file.
 
 First, I'm on a Mac Os 10.6.8 running R.app 2.15.0

There have been several updates to the R.app that went with R 2.15.0 that 
should not be at all disruptive to acquire. R.app version numbers are not in 
sync with R versions so that doesn't tell us what R.app you have. This is the 
current R.app (based on what is at the r.research.att.com website) : Mac OS X 
GUI rev. 6451

 
 On startup
 getwd()
 [1] /Users/simon
 
 But: the contents of my .Rprofile file in my home directory when viewed with 
 a text editor are:

What is your home directory and does your understanding agree with what you 
see when you use the Preferences panel from within a running version of R.app? 

[inline image attachment: Snapz Pro XScreenSnapz028.png]


I'm also running OSX 10.6.8, but R version 2.15.2 (2012-10-26) Platform: 
x86_64-apple-darwin9.8.0/x86_64 (64-bit). Looking at the Get Info panel for the 
GUI I see I am also somewhat out-of-date: R.app GUI 1.53 (6335 Leopard build 
64-bit).

I noticed that R 2.15.3 is now available. (I had been checking over the last 
several days and this is the first time I noticed that it had appeared.)



 .First <- function() {
     source("/Users/simon/Documents/R/functions/trim.leading.R")
     source("/Users/simon/Documents/R/functions/trim.trailing.R")
     source("/Users/simon/Documents/R/functions/trim.R")
     source("/Users/simon/Documents/R/functions/pseudor2.R")
     source("/Users/simon/Documents/R/functions/dates.R")
     source("/Users/simon/Documents/R/functions/andersen.R")
     source("/Users/simon/Documents/R/functions/tabfun.R")
     source("/Users/simon/Documents/R/functions/cox_snell.R")
     source("/Users/simon/Documents/R/functions/cor.prob.R")
     source("/Users/simon/Documents/R/functions/kmo.R")
     source("/Users/simon/Documents/R/functions/residual.stats.R")
     source("/Users/simon/Documents/R/functions/missings.plot.R")
 }
 
 
 but then, when I type .First from the command line I get
 function () 
 {
     source("/Users/simon/Documents/R/functions/sample_size.R")
     source("/Users/simon/Documents/R/functions/pseudor2.R")
     source("/Users/simon/Documents/R/functions/dates.R")
     source("/Users/simon/Documents/R/functions/andersen.R")
     source("/Users/simon/Documents/R/functions/tabfun.R")
     source("/Users/simon/Documents/R/functions/cox_snell.R")
     source("/Users/simon/Documents/R/functions/cor.prob.R")
     source("/Users/simon/Documents/R/functions/kmo.R")
 }
 
 Needless to say, I get an error because the file sample.size.R was deleted a 
 long time ago.
 
 So how do I get R.app to read the updated .Rprofile file?

Could there be another .Rprofile file? (They are invisible by default in the 
MacOS when browsing with Finder.app.)
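
A small sketch (not from the original reply) of checking which startup files a
session could be reading; the paths below are just the standard defaults
described in ?Startup:

Sys.getenv(c("R_PROFILE_USER", "R_ENVIRON_USER"))  # explicit overrides, if set
normalizePath("~/.Rprofile", mustWork = FALSE)     # user .Rprofile in the home directory
file.exists(file.path(getwd(), ".Rprofile"))       # a .Rprofile in the working directory takes precedence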

-- 

David Winsemius
Alameda, CA, USA

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] permutest

2013-03-05 Thread Sarah Hicks
I'm working with capscale and permutest for the first time, and having
trouble getting statistical analyses for more than one constraining
variable. I've read the documentation, but setting first=FALSE or using
by="axis" doesn't seem to be helping. capscale seems to be fine; I receive
output for more than one constrained axis. What am I doing wrong?

capscale.Nrem.results <- capscale(as.dist(qiime.data$distmat) ~
N + rem + N*rem + Condition(dateFac), factor.frame)
capscale.Nrem.results


               Inertia Proportion Rank
Total         1.454538
Real Total    1.459802  1.00
Conditional   0.117117  0.0802281
Constrained   0.386228  0.2645763
Unconstrained 0.956457  0.655197     22
Imaginary    -0.005264                2
Inertia is squared Unknown distance

Eigenvalues for constrained axes:
   CAP1CAP2CAP3
0.29869 0.05395 0.03359

Eigenvalues for unconstrained axes:
   MDS1MDS2MDS3MDS4MDS5MDS6MDS7MDS8
0.27719 0.13725 0.11048 0.06691 0.05551 0.04940 0.03892 0.03468
(Showed only 8 of all 22 unconstrained eigenvalues)


sig.Nrem <- permutest(capscale.Nrem.results, permutations=999, by="margin",
model="direct", first=FALSE)
sig.Nrem

Permutation test for capscale

Call: capscale(formula = as.dist(qiime.data$distmat) ~ N + rem +
Condition(dateFac) + N:rem, data =
factor.frame)
Permutation test for all constrained eigenvalues
Pseudo-F: 2.961281 (with 3, 22 Degrees of Freedom)
Significance: 0.001
Based on 999 permutations under direct model.

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R-help Digest, Vol 121, Issue 5

2013-03-05 Thread Law, Jason
On R 2.15.2 and ArcGIS 9.3.1, it works for me in ArcCatalog but you have to 
follow the particulars here:

http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?TopicName=Accessing_delimited_text_file_data

For example:

write.table(test, '***.tab', sep = '\t', row.names = F)

The extension .tab and sep = '\t' are required for text files.  Didn't test 
row.names=T but I wouldn't count on that working either.

Jason Law
Statistician
City of Portland, Bureau of Environmental Services
Water Pollution Control Laboratory
6543 N Burlington Avenue
Portland, OR 97203-5452
503-823-1038
jason@portlandoregon.gov

-Original Message-
Date: Mon, 04 Mar 2013 10:48:39 -0500
From: Duncan Murdoch murdoch.dun...@gmail.com
To: Kerry kernichol...@yahoo.com
Cc: r-help@r-project.org r-help@r-project.org
Subject: Re: [R] Mysterious issues with reading text files from R in
ArcGIS and Excel
Message-ID: 5134c257.6020...@gmail.com
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

On 04/03/2013 10:09 AM, Kerry wrote:
 It seems within the last ~3 months I've been having issues with writing text 
 or csv files from an R data frame.  The problem is multifold and it is hard to 
 filter out what is going on and where the problem is.  So, I'm hoping someone 
 else has come across this and may provide insight.

I think you need to provide a simple example for us to try, either by 
putting a small example of one of your files online for us to download, 
or (better) by giving us self-contained code to duplicate the problem.

You might also get better help (especially about ArcGIS) on the 
R-sig-Geo mailing list: https://stat.ethz.ch/mailman/listinfo/r-sig-geo.

Duncan Murdoch




 My current settings for R:
 R version 2.15.2 (2012-10-26)
 Platform: x86_64-w64-mingw32/x64 (64-bit)
 locale:

 [1] LC_COLLATE=Swedish_Sweden.1252  LC_CTYPE=Swedish_Sweden.1252
 LC_MONETARY=Swedish_Sweden.1252 LC_NUMERIC=C
 [5] LC_TIME=Swedish_Sweden.1252

 attached base packages:
 [1] tcltk stats graphics  grDevices utils datasets  methods   base

 other attached packages:
 [1] adehabitat_1.8.11 shapefiles_0.6foreign_0.8-51tkrplot_0.0-23
 ade4_1.5-1

 loaded via a namespace (and not attached):
 [1] tools_2.15.2

 I am using Microsoft Excel 2010 and ArcGIS 10.1sp1 for Desktop

 Basically, no matter what data frame I am working on, when I export it to a 
 text file to be used in Excel or ArcGIS problems arise.  I'm not sure if it is 
 R or these other programs, maybe forums for ArcGIS might be more appropriate, 
 but this problem only occurs when I use tables that have been produced from 
 an R session.

 When I try to open a text file in Excel, either I get an error message stating
 "The file you are trying to open is in a different format than specified by 
 the file extension.  Verify that the file is not corrupted and is from a 
 trusted source."
 Followed by
 "Excel has detected that 'file.txt' is a SYLK file, but cannot load it.  
 Either the file has errors or is not a SYLK file format.  Click OK to open 
 the file in a different format."
 Then the file opens


 Otherwise, the file opens fine the first time through - and looks ok. I 
 can't figure out what I'm doing differently between the two commands of 
 write.table as they are always written the same:
 write.csv(file, file = "D:/mylocations/fileofinterest.csv") or 
 write.table(file, file = "D:/mylocations/fileofinterest.txt")
 Sometimes I will try to add sep = "," or sep = ";" but these don't make a 
 difference (which I didn't figure they would).

 The other program I use is ArcGIS and bringing in a txt file from R is really 
 messing things up as 2 new columns of information are typically added and 
 date/time data is usually lost with txt files, but not with csv files.

 For instance - a text file that looks like this in Excel:
  id   x   ydateR1dmedR1dmean R1error 
 R2error
 1 F07001 1482445 6621768 2007-03-05 10:00:53 2498.2973 2498.2973   FALSE   
 FALSE
 2 F07001 1481274 6619628 2007-03-05 12:00:41  657.1029  657.1029FALSE   
 FALSE
 3 F07001 1481279 6619630 2007-03-05 14:01:12  660.3569  660.3569FALSE   
 FALSE
 4 F07001 1481271 6619700 2007-03-05 16:00:39  620.1397  620.1397FALSE   
 FALSE

   in ArcGIS now looks like this:

 Field1idid_Xid_YxydateR1dmedR1dmean R1errorR2errorOBJECTID *
 1F07001118.0818119.485541e+01514824456621768NA2498.297272498.29727FALSEFALSE1
 2F07001118.0818119.485541e+01514812746619628NA657.102922657.102922FALSEFALSE2
 3F07001118.0818119.485541e+01514812796619630NA660.356911660.356911FALSEFALSE3
 4F07001118.0818119.485541e+01514812716619700NA620.139702620.139702FALSEFALSE4
 5F07001118.0818119.485541e+01514808496620321NA378.186792378.186792FALSEFALSE5

 Where did id_X and id_Y come from?? What are they??
 What happened to the Date column???  Why does the date column show up when I 
 use write.csv but not write.table?

 Thank you for your help.

 ~K
   [[alternative HTML version deleted]]

[R] Simulate binary correlated data

2013-03-05 Thread Marbles
Dear R experts,

I am trying to simulate correlated binary data and have stumbled upon the
following problem:

With the help of binarySimCLF or mvpBinaryEp I have been able to
simulate correlating binary vectors given certain mean values and a desired
correlation. My problem is that these procedures do not allow you to specify
the exact vector for which you want to generate a correlated vector. Is
there anyway to do this?

Maybe I can clarify my question by explaining what my goal is:
I want to generate one Binary Vector (A), generate a correlated binary
Vector (B), then generate a third binary Vector (C) that is correlated to B
so that I can then see the occuring correlations between A and C.

Thank you in advance,
Marbles



--
View this message in context: 
http://r.789695.n4.nabble.com/Simulate-binary-correlated-data-tp4660366.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Is there a serial autocorrelation test for FGLS-FE fitted with PGGLS?

2013-03-05 Thread Tomas Note
Dears, a simple question here: is there any AR1 and AR2 test for FGLS-FE
fitted with the pggls function of plm package?

(one example would be the Baltagi-Wu LBI test)

Thanks for your attention!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] useR! 2013 registration open

2013-03-05 Thread Virgilio Gómez-Rubio
We are happy to inform you that registration for useR! 2013 is now open,
see

http://www.R-project.org/useR-2013

This meeting of the R user community will take place at the University
of Castilla-La Mancha, Albacete, Spain, July 10-12, 2013. Pre-conference
tutorials will be offered on July 9. A list of tutorials is available at

http://www.r-project.org/useR-2013/Tutorials/


Submission of contributed talks and posters is still open. More
information can be found here:

http://www.r-project.org/useR-2013#Call

In addition to regular talks and posters, all participants are invited
to present a Lightning Talk, for which no abstract is required. The
format for Lightning Talks is a 15-slide version of Pecha Kucha.
Registered participants will be contacted to provide an informative
title in due course.

Thanks to Revolution Analytics, Google, R-Studio and Oracle, we have a
number of bursaries available for Ph.D.  students. These will cover
registration fees and, if possible, travel expenses. If you would like
to apply you need to:

- Register for the conference and include Apply for Bursary in the
Comments section of the Registration form
- Submit an abstract using the standard procedure (remember that
deadline is 31 March 2013). Only those applicants who submit an abstract
will be considered.
- Send a summary of your CV, motivation letter (including title of your
abstract) and support letter from your advisor to
user-2...@r-project.org . In the subject of the e-mail state clearly
that it is a bursary application.

Deadline for bursary applications is 15th April 2013.

We look forward to seeing you at useR! 2013 in Albacete!

-- 
Virgilio Gómez Rubio
Departamento de Matemáticas
Escuela de Ingenieros Industriales - Albacete
Avda. España s/n - 02071 Albacete - SPAIN
Tlf: (+34) 967 59 92 00 ext. 8250/8242

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zelig package: Coxph model problems

2013-03-05 Thread Thomas Lumley
I think Zelig uses model="cox.ph" (
https://github.com/zeligdev/ZeligMisc/blob/master/tests/coxph.R)
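
For reference, a hedged sketch of that call using the names from the question;
the model string follows the linked test file and is an assumption, since the
available model names vary across Zelig versions and add-on packages:

library(Zelig)
library(survival)
# hypothetical: swap in whichever model name your Zelig version actually registers
z.out <- zelig(Surv(psurv2, pcens2) ~ ren_sup3 + age,
               data = data_urgent, model = "cox.ph")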

   -thomas

On Wed, Mar 6, 2013 at 2:54 AM, Stephen Knight stephenkni...@doctors.org.uk
 wrote:

 Hi,

 I'm having problems with the Zelig package - when using the below R
 displays the follwing message (I'm running R i386 2.15.3 for Windows and
 have updated all the Zelig packages):

 z.out-zelig(Surv(psurv2, pcens2) ~ ren_sup3 + age,
 data=data_urgent, model=coxph)

 ** The model coxph is not available with the currently loaded packages,
 ** and is not an official Zelig package.
 ** The model's name may be a typo.

 Any suggestions?

 Regards,
 Steve Knight

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide
 http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Thomas Lumley
Professor of Biostatistics
University of Auckland

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] R short course, BYU Summer Institute of Applied Statistics

2013-03-05 Thread Greg Snow
For anyone looking for an intermediate to advanced level short course on R
this summer, I will be presenting at the BYU Summer Institute of Applied
Statistics June 19-21, 2013.

The official page is here:
http://statistics.byu.edu/r-beyond-basics-38th-annual-summer-institute-applied-statistics

This course is intended for those who have worked through and have a fairly
good understanding of An Introduction to R and will go into the more
advanced uses and programming of R beyond the introduction.

This is put on by the Statistics department at Brigham Young University in
Provo, Utah (if you come, plan an extra day or so to enjoy the mountains).

-- 
Gregory (Greg) L. Snow Ph.D.
538...@gmail.com

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] ggplot2: two time series with different dates in a single plot

2013-03-05 Thread Matthijs Daelman
Hi

Using the ggplot2 package, I would like to obtain a plot that contains two
time series that have data points on different dates.

For instance, one data frame looks like:

date1, value1
2010-01-05, 2921.74
2010-01-08, 2703.89
2010-01-14, 3594.21
2010-01-20, 3659.22

The other data frame looks like

date2, value2
2010-01-01, 285.85
2010-01-02, 229.20
2010-01-05, 333.91
2010-01-06, 338.27
2010-01-07, 272.85
2010-01-08, 249.04
2010-01-09, 240.07
2010-01-10, 255.06
2010-01-11, 275.42
2010-01-12, 252.39

I would like to plot these two time series in one and the same plot, with
date on the X axis and value on the Y axis.

And while you're at it: how would you proceed to get a secondary Y axis for
the second dataframe?

Thanks a lot!

Matthijs Daelman

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Error message

2013-03-05 Thread li li
Dear all,
 I got an error message when running the following code.
Can anyone give any suggestions on fixing  this type of error?
 Thank you very much in advance.
Hanna



> integrand <- function(x, rho, a, b, z){
+  x1 <- x[1]
+  x2 <- x[2]
+  Sigma <- matrix(c(1, rho, rho, 1), 2,2)
+  mu <- rep(0,2)
+  f <- pmnorm(c((z-a*x1)/b, (z-a*x2)/b), mu,
Sigma)*dmnorm(c(0,0), mu, diag(2))
+  f
+}

> adaptIntegrate(integrand, lower=rep(-Inf, 2), upper=c(2,2),
+ rho=0.1, a=0.6, b=0.3, z=3,  maxEval=1)
Error in if (any(lower > upper)) stop("lower>upper integration limits") :
  missing value where TRUE/FALSE needed

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] ggplot2: two time series with different dates in a single plot

2013-03-05 Thread Gabor Grothendieck
On Tue, Mar 5, 2013 at 3:30 PM, Matthijs Daelman
matthijs.dael...@gmail.com wrote:
 Hi

 Using the ggplot2 package, I would like to obtain a plot that contains two
 time series that have data points on different dates.

 For instance, one data frame looks like:

 date1, value1
 2010-01-05, 2921.74
 2010-01-08, 2703.89
 2010-01-14, 3594.21
 2010-01-20, 3659.22

 The other data frame looks like

 date2, value2
 2010-01-01, 285.85
 2010-01-02, 229.20
 2010-01-05, 333.91
 2010-01-06, 338.27
 2010-01-07, 272.85
 2010-01-08, 249.04
 2010-01-09, 240.07
 2010-01-10, 255.06
 2010-01-11, 275.42
 2010-01-12, 252.39

 I would like to plot these two time series in one and the same plot, with
 date on the X axis and value on the Y axis.

 And while you're at it: how would you proceed to get a secondary Y axis for
 the second dataframe?


Lines1 <- "date1, value1
2010-01-05, 2921.74
2010-01-08, 2703.89
2010-01-14, 3594.21
2010-01-20, 3659.22"

Lines2 <- "date2, value2
2010-01-01, 285.85
2010-01-02, 229.20
2010-01-05, 333.91
2010-01-06, 338.27
2010-01-07, 272.85
2010-01-08, 249.04
2010-01-09, 240.07
2010-01-10, 255.06
2010-01-11, 275.42
2010-01-12, 252.39"

library(zoo)
library(ggplot2)

# create two zoo time series objects
z1 <- read.zoo(text = Lines1, header = TRUE, sep = ",")
z2 <- read.zoo(text = Lines2, header = TRUE, sep = ",")

# combine them into a single multivariate time series
z <- na.approx(merge(z1, z2))

# single panel
autoplot(z, facet = NULL)

# or, multiple panels with different Y axes
autoplot(z) + facet_free()

Different left and right Y axes are generally frowned upon, but if you want
that anyway look at the examples in ?plot.zoo
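
A base-graphics sketch of that two-axis approach (not from the original reply;
it assumes the z1 and z2 objects built above and simply overlays the second
series with its own right-hand axis):

xr <- range(index(z1), index(z2))               # common date range for both series
opar <- par(mar = c(5, 4, 4, 4) + 0.1)
plot(index(z1), coredata(z1), type = "l", xlim = xr, xlab = "date", ylab = "value1")
par(new = TRUE)                                 # overlay the second series on the same region
plot(index(z2), coredata(z2), type = "l", col = "red", xlim = xr,
     axes = FALSE, xlab = "", ylab = "")
axis(side = 4, col.axis = "red")                # secondary axis on the right
mtext("value2", side = 4, line = 3, col = "red")
par(opar)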

--
Statistics  Software Consulting
GKX Group, GKX Associates Inc.
tel: 1-877-GKX-GROUP
email: ggrothendieck at gmail.com

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] different colors for two wireframe spheres

2013-03-05 Thread David Winsemius

On Mar 5, 2013, at 3:28 AM, David Schellenberger Costa wrote:

 Dear List,
 
 I have the code below adapted from the lattice-package examples to draw two 
 spheres. I would now like to give
 both different surface colors, e.g. one red and one blue. 
 
 ## 3-D surface parametrized on a 2-D grid
 # (requires the lattice package to be loaded for wireframe)

n <- 10

tx <- matrix(seq(-pi, pi, length.out = 2*n), 2*n, n)
ty <- matrix(seq(-pi, pi, length.out = n) / 2, 2*n, n, 
 byrow = T)
xx <- cos(tx) * cos(ty)
yy <- sin(tx) * cos(ty)
zz <- sin(ty)
zzz <- zz

bxx <- xx+5 
byy <- yy+5
bzzz <- zzz+5

xx = rbind(xx, rep(NA,n), bxx)
yy = rbind(yy, rep(NA,n), byy)
zzz = rbind(zzz, rep(NA,n), bzzz)
gr <- rbind(matrix(1,n,2*n), rep(NA,n), matrix(2,n,2*n))

wireframe(zzz ~ xx + yy, 
 groups=gr, col.groups=c("red","blue"))

You might try:

wireframe(zzz ~ xx + yy, groups=gr, drape=TRUE )

Notice that in help(wireframe): Details you find a sentence:

"Note that this feature does not work with groups, subscripts, subset, etc. 
Conditioning variables are also not supported in this case."

 This sentence refers back to the paragraph beginning "For single panel plots, 
wireframe can also plot parametrized 3-D surfaces..." that describes the 
method you are using to draw those spheres.

 
 I tried various parameters as col.groups, col.regions but I have the 
 impression that the groups argument in the wireframe command can only be 
 used when supplying a data argument, which does not allow to plot the spheres 
 as far as I tried. Is there a solution for this?
 
 Cheers
 
 David Schellenberger
 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.

David Winsemius
Alameda, CA, USA

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Simulate binary correlated data

2013-03-05 Thread R. Michael Weylandt
On Tue, Mar 5, 2013 at 7:07 PM, Marbles max-ihm...@gmx.de wrote:
 Dear R experts,

 I am trying to simulate correlated binary data and have stumbled upon the
 following problem:

 With the help of binarySimCLF or mvpBinaryEp I have been able to
 simulate correlating binary vectors given certain mean values and a desired
 correlation. My problem is that these procedures do not allow you to specify
 the exact vector for which you want to generate a correlated vector. Is
 there anyway to do this?

 Maybe I can clarify my question by explaining what my goal is:
 I want to generate one Binary Vector (A), generate a correlated binary
 Vector (B), then generate a third binary Vector (C) that is correlated to B
 so that I can then see the occuring correlations between A and C.

IIRC, knowing the correlation between A & B and the correlation
between B & C is not enough to uniquely specify the correlation
between A & C (or perhaps even to bound it). Therefore I think your
question is ill-defined. Though I might be wrong in the specific case
of binary variates...

Cheers,
MW


 Thank you in advance,
 Marbles



 --
 View this message in context: 
 http://r.789695.n4.nabble.com/Simulate-binary-correlated-data-tp4660366.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Function completely locks up my computer if the input is too big

2013-03-05 Thread Benjamin Caldwell
Dear r-help,


Somewhere in my innocuous function to rotate an object in Cartesian space
I've created a monster that completely locks up my computer (requires a
hard reset every time). I don't know if this is useful description to
anyone - the mouse still responds, but not the keyboard and not windows
explorer.

The script only does this when the input matrix is large, and so my initial
runs didn't catch it as I used a smaller matrix to speed up the test runs.
When I tried an input matrix with a number of dimensions in the same order
of magnitude as the data I want to read in, R and my computer choked. This
was a surprise for me, as I've always been able to break execution in the
past or do other tasks. So I tried it again, and still no dice.

Now I need the function to work as subsequent functions/functionality are
dependent, and I can't see anything on the face of it that would likely
cause the issue.

Any insight on why this happens in general or specifically in my case is
appreciated. Running R 2.15.2, Platform: x86_64-w64-mingw32/x64 (64-bit) on a
windows 7 machine with 4 mb RAM. In the meantime I suppose I'll write a
loop to do this function piece-wise for larger data and see if that helps.

Script is attached and appended below.

Thanks

Ben Caldwell



#compass to polar coordinates

compass2polar <- function(x) {-x+90}


#degrees (polar) to radians

Deg2Rad <- function(x) {(x*pi)/180}


# radians to degrees

Rad2Deg <- function (rad) (rad/pi)*180


# polar to cartesian coordinates - assumes degrees are those from a compass.
# output is a list, x & y, of equal length

Pol2Car <- function(distance, deg) {

 rad <- Deg2Rad(compass2polar(deg))
 rad <- rep(rad, length(distance))

 x <- ifelse(is.na(distance), NA, distance * cos(rad))
 y <- ifelse(is.na(distance), NA, distance * sin(rad))

 x <- round(x, 2)
 y <- round(y, 2)

 cartes <- list(x, y)
 name <- c('x', 'y')
 names(cartes) <- name
 cartes
}





#rotate an object, with assumed origin at 0,0, in any number of degrees

rotate <- function(x, y, tilt){

 d2 <- x^2 + y^2
 rotate.dis <- sqrt(d2)
 or.rad <- atan(x/y)
 or.deg <- Rad2Deg(or.rad)

 n <- length(or.deg)
 for(i in 1:n){
  if(is.na(or.deg[i])==TRUE) {or.deg[i] <- 0}
 }

 # browser()
 tilt.in <- tilt + or.deg

 xy <- Pol2Car(distance=rotate.dis, deg=tilt.in)

 # if(abs(tilt) >= 0) {
 # shift.frame <- cbind(xy$x, xy$y)
 # shift.frame.val <- shift.frame[shift.frame[,2]==min(shift.frame[,2]),]
 # shift.x <- shift.frame.val[1] * -1
 # shift.y <- shift.frame.val[2] * -1
 # x <- xy$x + shift.x
 # y <- xy$y + shift.y
 # }
 # name <- c('x', 'y')
 # xy <- list(x,y)
 # names(xy) <- name
 xy
}


x <- seq(0, 5, .5)
y <- seq(0, 5, .5)
z <- seq(0, 5, .5)

dquad <- expand.grid(x, y, z)
name <- c("y", "x", "z")
names(dquad) <- name

plot(dquad$x, dquad$y, xlim=c(-25,25), ylim=c(-25,25))

#this works fine
rotated <- rotate(dquad$x, dquad$y, 45)

points(rotated$x, rotated$y, col='green')

# profiling of both time and memory
# (template snippet; substitute the call being profiled for myFunction)
Rprof("myFunction.out", memory.profiling=T)
y <- myFunction(x)
Rprof(NULL)
summaryRprof("myFunction.out", memory="both")

#
x <- seq(0, 5, .1)
y <- seq(0, 5, .1)
z <- seq(0, 5, .1)

dquad <- expand.grid(x, y, z)
name <- c("y", "x", "z")
names(dquad) <- name

# running the below locks up my machine (commented out to avoid accidental run)
# rotated <- rotate(dquad$x, dquad$y, 45)
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Reading Wyoming radiosonde data files with RadioSonde package

2013-03-05 Thread Ilan Levy
Hi,
I need to do some analysis on historic daily radiosonde data I download
from the Wyoming Univ. web page (
http://weather.uwyo.edu/upperair/sounding.html).
I am trying to use the RadioSonde package (V 1.3), but the format of the
files from Wyoming don't match what RadioSonde is expecting.

Has anyone used the Radiosonde package on the Wyoming data?
Here is a sample of the Wyoming file format:


40179 Bet Dagan Observations at 00Z 22 Feb 2013

-
   PRES   HGHT   TEMP   DWPT   RELH   MIXR   DRCT   SKNT   THTA   THTE
THTV
hPa m  C  C  %g/kgdeg   knot K  K
 K
-
   70.0  18430  -64.5 265 91  446.1
446.1
   59.6  19417  -60.9 265 83  475.1
475.1
   50.0  20500  -62.5 265 75  495.8
495.8
   46.7  20920  -64.3 501.2
501.2
   38.3  22137  -63.5 532.5
532.5
   33.3  23012  -56.3 573.2
573.2
   30.0  23670  -59.9 580.8
580.8
   28.1  24078  -60.5 590.0
590.0
   20.5  26056  -57.1 656.0
656.0


  Station information and sounding indices

 Station number: 40179
   Observation time: 130222/
   Station latitude: 32.00
  Station longitude: 34.81
  Station elevation: 35.0
 Mean mixed layer potential temperature: 0.00
  Mean mixed layer mixing ratio: 0.00

Here is the code I tried:

filename <- 'D:\\Data\\sounding_test3.txt'
datakey  <- "--"
varkey   <- "PRES"
unitkey  <- "hPa"
sample.sonde - getsonde(filename, datakey, varkey, unitkey)

Error in getsonde(filename, datakey, varkey, unitkey) :
  (getsonde): could not find a unique match for the data string

Thanks,
Ilik

Win7 OS 64-bit, R version 2.13.0.

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Meaning of error message when exporting to MS Excel

2013-03-05 Thread dthomas
Hi,

I’m attempting to export data (split into multiple files from one large
dataset) from R to excel using the excel.link package. The code for export
is as follows:

for(i in practicesNN){
  #Create relevant data for input
  #Separate out all parts of data – PracticeName is removed from example
  #data for privacy reasons
  detailedH <- dataExport2[dataExport1$PracticeName == i &
dataExport1$RISK_LEVEL == 'High',]
  detailedM <- dataExport2[dataExport1$PracticeName == i &
dataExport1$RISK_LEVEL == 'Medium',]
  detailedL <- dataExport2[dataExport1$PracticeName == i &
dataExport1$RISK_LEVEL == 'Low',]
  print(paste(i,2))
  x <- paste(i, ".xls", sep="")
  #Open excel template
  xl.workbook.open("Template.xls")
  #Create practice specific file
  xl.workbook.save(x)
  #Activate detailed High risk sheet
  xl.sheet.activate("High Risk detailed")
  #Update detailed High risk spreadsheet
  xlrc[a1] <- detailedH
  #Activate detailed Medium risk sheet
  xl.sheet.activate("Medium Risk detailed")
  #Update detailed Medium risk spreadsheet
  xlrc[a1] <- detailedM
  #Activate detailed Low risk sheet
  xl.sheet.activate("Low Risk detailed")
  #Update detailed Low risk spreadsheet
  xlrc[a1] <- detailedL
  #Save spreadsheet
  xl.workbook.save(x)
  #Close spreadsheet
  xl.workbook.close(x)
}

I attached an example file of data of the first 8 rows of the first xls. The
columns of the spreadsheet are filled until it gets to column ‘HbA1c_mmol’
which produces the following error:

Error in apply(r.obj[, iter], 1, paste, collapse = "\t") : 
  dim(X) must have a positive length

I removed the offending column and the same occurs when column ‘BMI’ is
encountered. Having searched for similar error messages I have been unable to
deduce the meaning of the error, particularly the ‘apply(r.obj[, iter], 1,
paste, collapse = "\t")’ part. Can anyone explain what the error message
means and how to resolve it?

Many thanks,
Dove ExampleData.csv
http://r.789695.n4.nabble.com/file/n4660378/ExampleData.csv  




--
View this message in context: 
http://r.789695.n4.nabble.com/Meaning-of-error-message-when-exporting-to-MS-Excel-tp4660378.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Simulate binary correlated data

2013-03-05 Thread Marbles
Thank you for your response!

I know that it is not enough to uniquely specify the correlation. That is
why i would like to simulate it so that i can see how the resulting
correlations between A and C are distributed in dependence of the
correlations between AB and BC.



--
View this message in context: 
http://r.789695.n4.nabble.com/Simulate-binary-correlated-data-tp4660366p4660380.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Simulate binary correlated data

2013-03-05 Thread Marbles
Btw: I have done this for non-binary variables already. If anyone is
interested, it looks like this.
I am pretty new to R, so excuse the potentially inelegant code.

# Implement library 
library(ecodist)

# Prepare ACOutcome vector
ACOutcomes = c()

# Set desired variablelength  Number of simulations
VarLength = 1000
NSim = 1

# Set desired range for correlations
rangeAB = c(0.0, 1.0)
rangeBC = c(0.0, 1.0)

# Start n simulation runs
n = 0
while(n < NSim) {n=n+1;

# Set desired correlations between Variables
DesCorAB = runif(1, rangeAB[1], rangeAB[2])
DesCorBC = runif(1, rangeBC[1], rangeBC[2]) 

# Simulate A and B with desired correlation between them
DatasetAB = corgen(len=VarLength, r=DesCorAB, epsilon=0.00)
A = DatasetAB$x
B = DatasetAB$y

# Option of saving correlation between A  B
cor(A, B, method = "pearson")

# Simulate C with desired correlation to B
C = corgen(x=B, r=DesCorBC, epsilon = 0.00)$y

# Calculate correlation between A  C
corAC = cor(A, C, method = "pearson")

# Save correlation AC into vector
ACOutcomes = append(ACOutcomes, corAC)

;}

# Show results: Mean, Minimum, Maximum
hist(ACOutcomes, breaks = 100)
mean(ACOutcomes)
min(ACOutcomes)
max(ACOutcomes)



--
View this message in context: 
http://r.789695.n4.nabble.com/Simulate-binary-correlated-data-tp4660366p4660382.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Questions on implementing logistic regression

2013-03-05 Thread Ivan Li
Hi there,

I am trying to write a tool which involves implementing logistic
regression. With the batch gradient descent method, the convergence is
guaranteed as it is a convex problem. However, I find that with the
stochastic gradient decent method, it typically converges to some random
points (i.e., not very close to the minimum point resulted from the batch
method). I have tried different ways of decreasing the learning rate, and
different starting points of weights. However, the performance (e.g.,
accuracy, precision/recall, ...) are comparable (to the batch method).

I understand that this is possible, since SGD (stochastic gradient descent)
uses an approximation to the real cost at each step. Does it matter? I guess
it does, since otherwise the interpretation of the weights would not make
much sense even if the accuracy is comparable. If it matters, I wonder if you
have some suggestions on how to make it converge or get close to the
global optimal point.
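
For concreteness, a minimal sketch (not part of the original post) of the
batch-gradient update being described, on simulated data; the learning rate
and iteration count are arbitrary illustration values:

set.seed(1)
n <- 500
X <- cbind(1, matrix(rnorm(n * 2), n, 2))        # design matrix with intercept
beta_true <- c(-0.5, 1.5, -2)
y <- rbinom(n, 1, plogis(X %*% beta_true))

sigmoid <- function(z) 1 / (1 + exp(-z))
beta <- rep(0, ncol(X))
alpha <- 0.1                                     # learning rate
for (it in 1:2000) {
  grad <- t(X) %*% (sigmoid(X %*% beta) - y) / n # gradient of the (average) negative log-likelihood
  beta <- beta - alpha * grad
}
cbind(batch = drop(beta), glm = coef(glm(y ~ X - 1, family = binomial)))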



Thanks!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Function completely locks up my computer if the input is too big

2013-03-05 Thread Peter Alspach
Tena koe Benjamin

I haven't looked at you code in detail, but in general ifelse is slow and can 
generally be avoided.  For example,

ben <- 1:10^7
system.time(BEN <- ifelse(ben > 10, NA, -ben))
   user  system elapsed 
   1.31    0.24    1.56 
system.time({BEN1 <- -ben; BEN1[BEN1 < -10] <- NA})
   user  system elapsed 
   0.17    0.03    0.20 
all.equal(BEN, BEN1)
[1] TRUE
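
Applied to the Pol2Car helper from the original post, the same idea removes
both ifelse calls, since NA values in distance already propagate through the
arithmetic (a sketch, reusing Deg2Rad and compass2polar from that post):

Pol2Car <- function(distance, deg) {
  rad <- Deg2Rad(compass2polar(deg))   # NA distances stay NA after cos/sin and *
  list(x = round(distance * cos(rad), 2),
       y = round(distance * sin(rad), 2))
}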

HTH ...

Peter Alspach

-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
Behalf Of Benjamin Caldwell
Sent: Wednesday, 6 March 2013 10:18 a.m.
To: r-help
Subject: [R] Function completely locks up my computer if the input is too big

Dear r-help,


Somewhere in my innocuous function to rotate an object in Cartesian space I've 
created a monster that completely locks up my computer (requires a hard reset 
every time). I don't know if this is useful description to anyone - the mouse 
still responds, but not the keyboard and not windows explorer.

The script only does this when the input matrix is large, and so my initial 
runs didn't catch it as I used a smaller matrix to speed up the test runs.
When I tried an input matrix with a number of dimensions in the same order of 
magnitude as the data I want to read in, R and my computer choked. This was a 
surprise for me, as I've always been able to break execution in the past or do 
other tasks. So i tried it again, and still no dice.

Now I need the function to work as subsequent functions/functionality are 
dependent, and I can't see anything on the face of it that would likely cause 
the issue.

Any insight on why this happens in general or specifically in my case are 
appreciated. Running R 15.2, Platform: x86_64-w64-mingw32/x64 (64-bit) on a 
windows 7 machine with 4 mb RAM. In the meantime I suppose I'll write a loop to 
do this function piece-wise for larger data and see if that helps.

Script is attached and appended below.

Thanks

Ben Caldwell



#compass to polar coordinates

compass2polar - function(x) {-x+90}



#degrees (polar) to radians

Deg2Rad - function(x) {(x*pi)/180}



# radians to degrees

Rad2Deg - function (rad) (rad/pi)*180



# polar to cartesian coordinates - assumes degrees those from a compass.
output is a list, x  y of equal length

Pol2Car - function(distance,deg) {


rad - Deg2Rad(compass2polar(deg))

rad - rep(rad, length(distance))


x - ifelse(is.na(distance), NA, distance * cos(rad))

y - ifelse(is.na(distance), NA, distance * sin(rad))


x-round(x,2)

y-round(y,2)


cartes- list(x,y)

name-c('x','y')

names(cartes)-name

cartes

}





#rotate an object, with assumed origin at 0,0, in any number of degrees

rotate - function(x,y,tilt){ 8


d2 - x^2+y^2

rotate.dis-sqrt(d2)

or.rad - atan(x/y)

or.deg - Rad2Deg(or.rad)


n - length(or.deg)

for(i in 1:n){

if(is.na(or.deg[i])==TRUE) {or.deg[i] - 0}

}

# browser()

tilt.in - tilt + or.deg


xy-Pol2Car (distance=rotate.dis, deg=tilt.in)

 # if(abs(tilt) = 0) {

 # shift.frame - cbind(xy$x, xy$y)

# shift.frame.val - shift.frame[shift.frame[,2]==min(shift.frame[,2]),]

# shift.x- shift.frame.val[1] * -1

# shift.y- shift.frame.val[2] * -1

# x-xy$x + shift.x

# y-xy$y + shift.y

# }

# name - c('x', 'y')

# xy-list(x,y)

# names(xy)-name

 xy

}


x - seq(0,5, .5)

y - seq(0,5, .5)

z - seq(0,5, .5)

dquad-expand.grid(x,y,z)

name-c(y,x,z)

names(dquad)-name


plot(dquad$x, dquad$y, xlim=c(-25,25), ylim=c(-25,25))


#this works fine

rotated-rotate(dquad$x, dquad$y, 45)



points(rotated$x, rotated$y, col='green')


# profiling of both time and memory

Rprof(myFunction.out, memory.profiling=T)

y - myFunction(x)

Rprof(NULL)

summaryRprof(myFunction.out, memory=both)



#

x - seq(0,5, .1)

y - seq(0,5, .1)

z - seq(0,5, .1)

dquad-expand.grid(x,y,z)

name-c(y,x,z)

names(dquad)-name

# running the below locks up my machine (commented out to avoid accidental
run)

# rotated-rotate(dquad$x, dquad$y, 45)

The contents of this e-mail are confidential and may be ...{{dropped:14}}

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Reading Wyoming radiosonde data files with RadioSonde package

2013-03-05 Thread Clint Bowman
Sure looks as if that second --... line is causing 
another attempt at parsing for varkey and unitkey which for 70.0 and 59.6 
just won't make sense to getsonde.


It's a pain but I'd experiment by removing that second ... line.

Clint BowmanINTERNET:   cl...@ecy.wa.gov
Air Quality Modeler INTERNET:   cl...@math.utah.edu
Department of Ecology   VOICE:  (360) 407-6815
PO Box 47600FAX:(360) 407-7534
Olympia, WA 98504-7600

USPS:   PO Box 47600, Olympia, WA 98504-7600
Parcels:300 Desmond Drive, Lacey, WA 98503-1274

On Tue, 5 Mar 2013, Ilan Levy wrote:


Hi,
I need to do some analysis on historic daily radiosonde data I download
from the Wyoming Univ. web page (
http://weather.uwyo.edu/upperair/sounding.html).
I am trying to use the RadioSonde package (V 1.3), but the format of the
files from Wyoming don't match what RadioSonde is expecting.

Has anyone used the Radiosonde package on the Wyoming data?
Here is a sample of the Wyoming file format:


   40179 Bet Dagan Observations at 00Z 22 Feb 2013

-
  PRES   HGHT   TEMP   DWPT   RELH   MIXR   DRCT   SKNT   THTA   THTE
THTV
   hPa m  C  C  %g/kgdeg   knot K  K
K
-
  70.0  18430  -64.5 265 91  446.1
446.1
  59.6  19417  -60.9 265 83  475.1
475.1
  50.0  20500  -62.5 265 75  495.8
495.8
  46.7  20920  -64.3 501.2
501.2
  38.3  22137  -63.5 532.5
532.5
  33.3  23012  -56.3 573.2
573.2
  30.0  23670  -59.9 580.8
580.8
  28.1  24078  -60.5 590.0
590.0
  20.5  26056  -57.1 656.0
656.0


 Station information and sounding indices

Station number: 40179
  Observation time: 130222/
  Station latitude: 32.00
 Station longitude: 34.81
 Station elevation: 35.0
Mean mixed layer potential temperature: 0.00
 Mean mixed layer mixing ratio: 0.00

Here is the code I tried:

filename - 'D:\\Data\\sounding_test3.txt'
datakey  - --
varkey   -PRES
unitkey  - hPa
sample.sonde - getsonde(filename, datakey, varkey, unitkey)

Error in getsonde(filename, datakey, varkey, unitkey) :
 (getsonde): could not find a unique match for the data string

Thanks,
Ilik

Win7 OS 64-bit, R version 2.13.0.

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Function completely locks up my computer if the input is too big

2013-03-05 Thread William Dunlap
I see you have profiling calls in there.  Have you used them?

It is often fruitful to see how the time for a function grows as size of the 
input or output grows.  Have you tried that? 

A concrete suggestion is to change
   for(i in 1:n){
  if(is.na(or.deg[i])==TRUE) {or.deg[i] <- 0}
   }
to
 or.deg[is.na(or.deg)] <- 0

Bill Dunlap
Spotfire, TIBCO Software
wdunlap tibco.com


 -Original Message-
 From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On 
 Behalf
 Of Benjamin Caldwell
 Sent: Tuesday, March 05, 2013 1:18 PM
 To: r-help
 Subject: [R] Function completely locks up my computer if the input is too big
 
 Dear r-help,
 
 
 Somewhere in my innocuous function to rotate an object in Cartesian space
 I've created a monster that completely locks up my computer (requires a
 hard reset every time). I don't know if this is useful description to
 anyone - the mouse still responds, but not the keyboard and not windows
 explorer.
 
 The script only does this when the input matrix is large, and so my initial
 runs didn't catch it as I used a smaller matrix to speed up the test runs.
 When I tried an input matrix with a number of dimensions in the same order
 of magnitude as the data I want to read in, R and my computer choked. This
 was a surprise for me, as I've always been able to break execution in the
 past or do other tasks. So i tried it again, and still no dice.
 
 Now I need the function to work as subsequent functions/functionality are
 dependent, and I can't see anything on the face of it that would likely
 cause the issue.
 
 Any insight on why this happens in general or specifically in my case are
 appreciated. Running R 15.2, Platform: x86_64-w64-mingw32/x64 (64-bit) on a
 windows 7 machine with 4 mb RAM. In the meantime I suppose I'll write a
 loop to do this function piece-wise for larger data and see if that helps.
 
 Script is attached and appended below.
 
 Thanks
 
 Ben Caldwell
 
 
 
 #compass to polar coordinates
 
 compass2polar - function(x) {-x+90}
 
 
 
 #degrees (polar) to radians
 
 Deg2Rad - function(x) {(x*pi)/180}
 
 
 
 # radians to degrees
 
 Rad2Deg - function (rad) (rad/pi)*180
 
 
 
 # polar to cartesian coordinates - assumes degrees those from a compass.
 output is a list, x  y of equal length
 
 Pol2Car - function(distance,deg) {
 
 
 rad - Deg2Rad(compass2polar(deg))
 
 rad - rep(rad, length(distance))
 
 
 x - ifelse(is.na(distance), NA, distance * cos(rad))
 
 y - ifelse(is.na(distance), NA, distance * sin(rad))
 
 
 x-round(x,2)
 
 y-round(y,2)
 
 
 cartes- list(x,y)
 
 name-c('x','y')
 
 names(cartes)-name
 
 cartes
 
 }
 
 
 
 
 
 #rotate an object, with assumed origin at 0,0, in any number of degrees
 
 rotate - function(x,y,tilt){ 8
 
 
 d2 - x^2+y^2
 
 rotate.dis-sqrt(d2)
 
 or.rad - atan(x/y)
 
 or.deg - Rad2Deg(or.rad)
 
 
 n - length(or.deg)
 
 for(i in 1:n){
 
 if(is.na(or.deg[i])==TRUE) {or.deg[i] - 0}
 
 }
 
 # browser()
 
 tilt.in - tilt + or.deg
 
 
 xy-Pol2Car (distance=rotate.dis, deg=tilt.in)
 
  # if(abs(tilt) = 0) {
 
  # shift.frame - cbind(xy$x, xy$y)
 
 # shift.frame.val - shift.frame[shift.frame[,2]==min(shift.frame[,2]),]
 
 # shift.x- shift.frame.val[1] * -1
 
 # shift.y- shift.frame.val[2] * -1
 
 # x-xy$x + shift.x
 
 # y-xy$y + shift.y
 
 # }
 
 # name - c('x', 'y')
 
 # xy-list(x,y)
 
 # names(xy)-name
 
  xy
 
 }
 
 
 x - seq(0,5, .5)
 
 y - seq(0,5, .5)
 
 z - seq(0,5, .5)
 
 dquad-expand.grid(x,y,z)
 
 name-c(y,x,z)
 
 names(dquad)-name
 
 
 plot(dquad$x, dquad$y, xlim=c(-25,25), ylim=c(-25,25))
 
 
 #this works fine
 
 rotated-rotate(dquad$x, dquad$y, 45)
 
 
 
 points(rotated$x, rotated$y, col='green')
 
 
 # profiling of both time and memory
 
 Rprof(myFunction.out, memory.profiling=T)
 
 y - myFunction(x)
 
 Rprof(NULL)
 
 summaryRprof(myFunction.out, memory=both)
 
 
 
 #
 
 x - seq(0,5, .1)
 
 y - seq(0,5, .1)
 
 z - seq(0,5, .1)
 
 dquad-expand.grid(x,y,z)
 
 name-c(y,x,z)
 
 names(dquad)-name
 
 # running the below locks up my machine (commented out to avoid accidental
 run)
 
 # rotated-rotate(dquad$x, dquad$y, 45)

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Issues when using interaction term with a lagged variable

2013-03-05 Thread Richard Asturia
Hi there!

Today I tried to estimate models using both the plm and pgmm functions, with an
interaction between X1 and lag(X2, 1), and I noticed two issues.

Let Y = b_1*X_1 + b_2*X_2 + b_3*X_1*X_2 + e be our model.

1) When using plm, I got different results when I coded the interaction
term as I(X1 * lag(X2, 1)) and when I just saved this multiplication X1 *
lag(X2, 1) in a different variable of the dataset and then used it in the
regression.

2) With pgmm it is not even possible to run a formula which contains I(X1 *
lag(X2, 1)). How can I pass such an interaction?

Thanks in advance for your time!

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] histogram

2013-03-05 Thread arun
HI Elisa,
Just noticed the order of elements in vec1:
You have to replace `vec1`
dat2 <- as.dist(dat1[,-1], upper=FALSE, diag=FALSE)
vec1 <- as.vector(dat2)
head(vec1)
#[1]  5.796656 43.523023 38.193750 44.730182  6.511703  2.904954  # the order is based on columns

# with
dat1 <- read.csv("rightest.csv", sep=",", header=TRUE, check.names=FALSE)
label1 <- c("0-25", "25-50", "50-75")
dat1New <- dat1[,-1]
vec1 <- unlist(lapply(seq_len(nrow(dat1New)), function(i)
    dat1New[i,][1:which(dat1New[i,]==0)-1]))
head(vec1)
#        1         1         2         1         2         3
# 5.796656 43.523023 36.305633 38.193750 31.623020  5.391179   # correct order
dat1[1:4,1:4]
#  St.         1         2         3
#1   1  0.000000  5.796656 43.523023
#2   2  5.796656  0.000000 36.305633
#3   3 43.523023 36.305633  0.000000
#4   4 38.193750 31.623020  5.391179

Name2 <- unlist(lapply(0:123, function(i) if(length(rep(i+1,i)) >= 1)
    paste("(", paste(rep(i+1,i)[1], seq_along(rep(i+1,i)), sep=","), ")", sep="") else NULL))
dat3New <- data.frame(Name2, vec1)
resNew <- t(aggregate(.~Name2, data=dat3New, function(x)
    table(cut(x, breaks=seq(0,75,25), labels=label1))))
colnames(resNew) <- resNew[1,]
resNew1 <- resNew[-1,]
row.names(resNew1) <- gsub("vec1.", "", row.names(resNew1))
Names3 <- apply(resNew1, 1, function(x) paste(names(which(x!=0)), collapse=","))
res2 <- data.frame(Frequency=apply(resNew1, 1, function(x) sum(1*(x!=0))),
    stations=Names3, stringsAsFactors=FALSE)

A.K.






- Original Message -
From: arun smartpink...@yahoo.com
To: eliza botto eliza_bo...@hotmail.com
Cc: 
Sent: Tuesday, March 5, 2013 8:12 AM
Subject: Re: histogram

Dear Elisa,
I already sent you the solution.


 Name2 <- unlist(lapply(0:123, function(i)
    if(length(rep(i+1,i)) >= 1)
    paste("(", paste(rep(i+1,i)[1], seq_along(rep(i+1,i)), sep=","), ")", sep="")
    else NULL))
dat3New <- data.frame(Name2, vec1)
resNew <- t(aggregate(.~Name2, data=dat3New, function(x)
    table(cut(x, breaks=seq(0,75,25), labels=label1))))
colnames(resNew) <- resNew[1,]
resNew1 <- resNew[-1,]
row.names(resNew1) <- gsub("vec1.", "", row.names(resNew1))
Names3 <- apply(resNew1, 1, function(x) paste(names(which(x!=0)), collapse=","))
res2 <- data.frame(Frequency=apply(resNew1, 1, function(x) sum(1*(x!=0))),
    stations=Names3, stringsAsFactors=FALSE)
A.K.


From: eliza botto eliza_bo...@hotmail.com
To: smartpink...@yahoo.com smartpink...@yahoo.com 
Sent: Tuesday, March 5, 2013 7:04 AM
Subject: RE: histogram



Dear Arun,
Extremely sorry for replying to you late. I really wanted to calculate the index
of dat3. It is alright for me, even if the size of the output is really large.
Thanks in advance

Elisa

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] histogram

2013-03-05 Thread arun


Hi,

I guess this is what you wanted.
Attaching a plot from a subset (lstSub)



dat1 <- read.csv("rightest.csv", sep=",", header=TRUE, check.names=FALSE)
label1 <- c("0-25", "25-50", "50-75")
Name1 <- unlist(lapply(0:123, function(i) rep(i+1,i)))
dat1New <- dat1[,-1]
vec1 <- unlist(lapply(seq_len(nrow(dat1New)), function(i)
    dat1New[i,][1:which(dat1New[i,]==0)-1]))
dat3 <- data.frame(Name1, vec1)
dat3$Categ <- as.character(cut(dat3$vec1, breaks=seq(0,75,25), labels=label1))
source("k.txt")
lst1 <- split(dat3, dat3$Name1)
#lstSub <- split(dat3, dat3$Name1)[1:4]

pdf("ElisaNew0_25.pdf")
lapply(lst1, function(x) {
    indx025 <- which(x$Categ=="0-25")
    indx2550 <- which(x$Categ=="25-50")
    indx5075 <- which(x$Categ=="50-75")
    if(length(indx025) >= 1)
    {matplot(k[,indx025], ylim=c(0,5), type="l", col="grey",
        main=paste("range 0-25", "line=", unique(x$Name1), sep=" "), xlab="T", ylab="Q")
     lines(k[,unique(x$Name1)], type="l", col="black")} else NULL})
dev.off()

pdf("ElisaNew25_50New.pdf")
lapply(lst1, function(x) {
    indx025 <- which(x$Categ=="0-25")
    indx2550 <- which(x$Categ=="25-50")
    indx5075 <- which(x$Categ=="50-75")
    if(length(indx2550) >= 1)
    {matplot(k[,indx2550], ylim=c(0,5), type="l", col="grey",
        main=paste("range 25-50", "line=", unique(x$Name1), sep=" "), xlab="T", ylab="Q")
     lines(k[,unique(x$Name1)], type="l", col="black")} else NULL})
dev.off()

pdf("ElisaNew50_75.pdf")
lapply(lst1, function(x) {
    indx025 <- which(x$Categ=="0-25")
    indx2550 <- which(x$Categ=="25-50")
    indx5075 <- which(x$Categ=="50-75")
    if(length(indx5075) >= 1)
    {matplot(k[,indx5075], ylim=c(0,5), type="l", col="grey",
        main=paste("range 50-75", "line=", unique(x$Name1), sep=" "), xlab="T", ylab="Q")
     lines(k[,unique(x$Name1)], type="l", col="black")} else NULL})
dev.off()


A.K.




From: eliza botto eliza_bo...@hotmail.com
To: smartpink...@yahoo.com smartpink...@yahoo.com 
Sent: Tuesday, March 5, 2013 5:04 PM
Subject: RE: histogram



Dear Arun,
Thanks for the update.
Any success with the recently asked question?
thanks


elisa



ElisaNew25_50.pdf
Description: Adobe PDF document
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

[R] Wilcox-Off?

2013-03-05 Thread R. Michael Weylandt
A potentially ridiculous question, but why does R use wilcox (e.g.,
pwilcox or wilcox.test) instead of the full name Wilcoxon? I've
browsed (but not scoured) the help files and Peter Dalgaard's book,
but I'm coming up empty.

Purely for brevity or have I missed something massive?

## Reproducible example ;-)

? wilcox.test

## End Reproducible Example

Cheers,
Michael

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Questions on implementing logistic regression

2013-03-05 Thread Bert Gunter
I may be missing something, but what does this have to do specifically
with R? I believe this is OT here and you need to post elsewhere, e.g.
perhaps on stats.stackexchange.com.

-- Bert

On Tue, Mar 5, 2013 at 1:36 PM, Ivan Li machinelearning2...@gmail.com wrote:
 Hi there,

 I am trying to write a tool which involves implementing logistic
 regression. With the batch gradient descent method, convergence is
 guaranteed as it is a convex problem. However, I find that with the
 stochastic gradient descent method, it typically converges to some random
 points (i.e., not very close to the minimum point resulting from the batch
 method). I have tried different ways of decreasing the learning rate, and
 different starting points for the weights. However, the performance metrics
 (e.g., accuracy, precision/recall, ...) are comparable (to the batch method).

 I understand that this is possible, since SGD (stochastic gradient descent)
 uses an approximation to the real cost at each step. Does it matter? I guess
 it does, since otherwise the interpretation of the weights would not make
 much sense even if the accuracy is comparable. If it matters, I wonder if you
 have some suggestions on how to make it converge or get close to the
 global optimal point.



 Thanks!

 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.



-- 

Bert Gunter
Genentech Nonclinical Biostatistics

Internal Contact Info:
Phone: 467-7374
Website:
http://pharmadevelopment.roche.com/index/pdb/pdb-functional-groups/pdb-biostatistics/pdb-ncb-home.htm

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Questions on implementing logistic regression

2013-03-05 Thread Bert Gunter
Perhaps I should have added: FWIW, R, like essentially all
statistical software, has logistic regression already built in, if I
understand what you mean by the term (which I may not), via glm().
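
For reference, a minimal sketch with made-up names (a 0/1 or two-level factor
response y and predictors x1, x2 in a data frame mydata):

fit <- glm(y ~ x1 + x2, family = binomial, data = mydata)
summary(fit)                            # coefficients are on the log-odds scale
head(predict(fit, type = "response"))   # fitted probabilities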

-- Bert




-- 

Bert Gunter
Genentech Nonclinical Biostatistics

Internal Contact Info:
Phone: 467-7374
Website:
http://pharmadevelopment.roche.com/index/pdb/pdb-functional-groups/pdb-biostatistics/pdb-ncb-home.htm

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Function completely locks up my computer if the input is too big

2013-03-05 Thread Benjamin Caldwell
Hi all,

Thanks for the suggestions. Updating the function as below to break the
problem into chunks seemed to do the trick - perhaps there is a relatively
small limit to the size of a vector that R can work with?

Best

rotate <- function(x, y, tilt, threshold){

    df.main <- data.frame(x, y)

    if(length(x) > threshold){
        l <- round(length(x) / threshold, 0)
        dfchunk <- split(df.main, factor(sort(rank(row.names(df.main)) %% l)))
        n <- length(summary(dfchunk)[,1])
        xy <- vector("list", n)
        for (i in 1:n){
            wk.df <- dfchunk[[i]]
            x <- wk.df$x
            y <- wk.df$y
            d2 <- x^2 + y^2
            rotate.dis <- sqrt(d2)
            or.rad <- atan(x/y)
            or.deg <- Rad2Deg(or.rad)

            or.deg[is.na(or.deg)] <- 0
            tilt.in <- tilt + or.deg
            xy[[i]] <- data.frame(Pol2Car(distance=rotate.dis, deg=tilt.in))
        }
        xy <- do.call(rbind, xy[1:n])
    } else {
        d2 <- x^2 + y^2
        rotate.dis <- sqrt(d2)
        or.rad <- atan(x/y)
        or.deg <- Rad2Deg(or.rad)

        n <- length(or.deg)
        for(i in 1:n){
            if(is.na(or.deg[i])==TRUE) {or.deg[i] <- 0}
        }
        tilt.in <- tilt + or.deg

        xy <- data.frame(Pol2Car(distance=rotate.dis, deg=tilt.in))
    }

    xy
}
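
For what it's worth, a couple of quick checks (not from the thread) that help
tell whether it is vector length or RAM that is the binding constraint:

.Machine$integer.max   # hard limit on vector length in R 2.15 (about 2^31 - 1)
object.size(dquad)     # size of the expand.grid() result
memory.size()          # MB currently used by R (Windows only)
memory.limit()         # MB available to R (Windows only)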

*Ben Caldwell*

Graduate Fellow
University of California, Berkeley
130 Mulford Hall #3114
Berkeley, CA 94720
Office 223 Mulford Hall
(510)859-3358


On Tue, Mar 5, 2013 at 1:44 PM, Peter Alspach 
peter.alsp...@plantandfood.co.nz wrote:

 Tena koe Benjamin

  I haven't looked at your code in detail, but in general ifelse is slow and
  can generally be avoided.  For example,

  > ben <- 1:10^7
  > system.time(BEN <- ifelse(ben > 10, NA, -ben))
     user  system elapsed
     1.31    0.24    1.56
  > system.time({BEN1 <- -ben; BEN1[BEN1 < -10] <- NA})
     user  system elapsed
     0.17    0.03    0.20
  > all.equal(BEN, BEN1)
  [1] TRUE

 HTH ...

 Peter Alspach
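
Applied to the Pol2Car() helper from the script earlier in this thread, that
advice amounts to dropping the ifelse() calls altogether, since NA distances
already propagate through the arithmetic -- roughly (a sketch, not code from
the thread):

Pol2Car2 <- function(distance, deg) {
    rad <- Deg2Rad(compass2polar(deg))
    # distance * cos(rad) is already NA wherever distance is NA
    list(x = round(distance * cos(rad), 2),
         y = round(distance * sin(rad), 2))
}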


Re: [R] Issues when using interaction term with a lagged variable

2013-03-05 Thread Richard Asturia
Actually, problem number 2 is easy to solve: instead of using I(X1 *
lag(X2,1)), one should use X1:lag(X2,1). It works.

Issue number 1 remains, though, and it also affects this solution for
number 2. That is: results are different when one uses X1:lag(X2,1)
within the formula and when one uses a new variable previously created as
X1*lag(X2,1).

Any insights here would be very helpful.
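
For reference, a rough sketch of the two codings being compared, assuming a
pdata.frame pd with made-up columns Y, X1 and X2 (not the poster's data):

library(plm)
pd <- pdata.frame(mydata, index = c("id", "year"))
# interaction written inside the formula:
m1 <- plm(Y ~ X1 + lag(X2, 1) + X1:lag(X2, 1), data = pd, model = "within")
# interaction pre-computed as its own column:
pd$X1lagX2 <- pd$X1 * lag(pd$X2, 1)
m2 <- plm(Y ~ X1 + lag(X2, 1) + X1lagX2, data = pd, model = "within")
# the report above is that the two give different coefficients; comparing
# head(model.matrix(m1)) with head(model.matrix(m2)) shows where they diverge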


2013/3/5 Richard Asturia richard.astu...@gmail.com

 Hi there!

 Today I tried to estimate models using both plm and pgmm functions, with
 an interaction between X1 and lag(X2, 1). And I notice two issues.

 Let Y=b_1 * X_1 + b_2 * X_2 + b_3 * X_1 * x_2 + e be our model.

 1) When using plm, I got different results when I coded the interaction
 term with I(X1 * lag(X2, 1)) and when I just saved this multiplication X1 *
 lag(X2, 1) in a different variable of the dataset and then used it. in the
 regression.

 2) With pgmm it is not even possible to run a formula which contains I(X1
 * lag(X2, 1)). How can I pass such interaction?

 Thanks in advance for your time!


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] plotmath: angle brackets

2013-03-05 Thread Donatella Quagli
Indeed! Thank you!


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Wilcox-Off?

2013-03-05 Thread Peter Ehlers

On 2013-03-05 14:51, R. Michael Weylandt wrote:

A potentially ridiculous question, but why does R use wilcox (e.g.,
pwilcox or wilcox.test) instead of the full name Wilcoxon? I've
browsed (but not scoured) the help files and Peter Dalgaard's book,
but I'm coming up empty.

Purely for brevity or have I missed something massive?

## Reproducible example ;-)

? wilcox.test

## End Reproducible Example

Cheers,
Michael


I would guess that it's the brevity argument;
witness pnorm, pchisq, punif, ppois, pbinom, phyper, ...

But maybe Kurt knows better since he's down as the author
of pwilcox.

Peter Ehlers

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Zelig package: Coxph model problems

2013-03-05 Thread David Winsemius

On Mar 5, 2013, at 5:54 AM, Stephen Knight wrote:

 Hi,
 
 I'm having problems with the Zelig package - when using the below, R displays
 the following message (I'm running R i386 2.15.3 for Windows and have updated
 all the Zelig packages):
 
 z.out <- zelig(Surv(psurv2, pcens2) ~ ren_sup3 + age,
                data=data_urgent, model="coxph")
 
 ** The model coxph is not available with the currently loaded packages,
 ** and is not an official Zelig package.
 ** The model's name may be a typo.

When this question (or at least something like it) was asked last November on
the Zelig mailing list, rolling back to earlier versions was the suggestion:

http://comments.gmane.org/gmane.comp.lang.r.zelig/848

I do not see any mention of coxph or cox.ph as a model specification in the 
current docs for that package.

-- 

David Winsemius
Alameda, CA, USA

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Constrained cubic smoothing spline

2013-03-05 Thread Victor hyk
Hello everone,
   Does anyone know how to force a cubic smoothing spline to pass
through a particular point?
   I found on a website that someone said we can use the cobs package to
force the spline to pass through certain points or to impose shape
constraints (increasing, decreasing). However, that package uses B-splines
and can only do linear and quadratic B-splines.
   In my research, I need to force a cubic smoothing spline to pass through a
point.
   Thanks!

  Victor
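
One rough workaround sometimes used (an approximation, not a hard constraint)
is to give the required point an enormous weight in smooth.spline(), so that
the cubic smoothing spline is pulled essentially through it; a sketch with
made-up data:

set.seed(1)
x  <- runif(50); y <- sin(2*pi*x) + rnorm(50, sd = 0.2)
x0 <- 0.5; y0 <- 0                      # hypothetical point to (almost) hit
fit <- smooth.spline(c(x, x0), c(y, y0), w = c(rep(1, 50), 1e8))
predict(fit, x0)$y                      # very close to y0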

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] About basic logical operators

2013-03-05 Thread Victor hyk
Hello everyone,
  I have a basic question regarding logical operators.
> x <- seq(-1, 1, by=0.02)
> x
  [1] -1.00 -0.98 -0.96 -0.94 -0.92 -0.90 -0.88 -0.86 -0.84 -0.82 -0.80 -0.78
 [13] -0.76 -0.74 -0.72 -0.70 -0.68 -0.66 -0.64 -0.62 -0.60 -0.58 -0.56 -0.54
 [25] -0.52 -0.50 -0.48 -0.46 -0.44 -0.42 -0.40 -0.38 -0.36 -0.34 -0.32 -0.30
 [37] -0.28 -0.26 -0.24 -0.22 -0.20 -0.18 -0.16 -0.14 -0.12 -0.10 -0.08 -0.06
 [49] -0.04 -0.02  0.00  0.02  0.04  0.06  0.08  0.10  0.12  0.14  0.16  0.18
 [61]  0.20  0.22  0.24  0.26  0.28  0.30  0.32  0.34  0.36  0.38  0.40  0.42
 [73]  0.44  0.46  0.48  0.50  0.52  0.54  0.56  0.58  0.60  0.62  0.64  0.66
 [85]  0.68  0.70  0.72  0.74  0.76  0.78  0.80  0.82  0.84  0.86  0.88  0.90
 [97]  0.92  0.94  0.96  0.98  1.00
> x[x <= 0.02]
 [1] -1.00 -0.98 -0.96 -0.94 -0.92 -0.90 -0.88 -0.86 -0.84 -0.82 -0.80 -0.78
[13] -0.76 -0.74 -0.72 -0.70 -0.68 -0.66 -0.64 -0.62 -0.60 -0.58 -0.56 -0.54
[25] -0.52 -0.50 -0.48 -0.46 -0.44 -0.42 -0.40 -0.38 -0.36 -0.34 -0.32 -0.30
[37] -0.28 -0.26 -0.24 -0.22 -0.20 -0.18 -0.16 -0.14 -0.12 -0.10 -0.08 -0.06
[49] -0.04 -0.02  0.00
> x[x < 0.2]
 [1] -1.00 -0.98 -0.96 -0.94 -0.92 -0.90 -0.88 -0.86 -0.84 -0.82 -0.80 -0.78
[13] -0.76 -0.74 -0.72 -0.70 -0.68 -0.66 -0.64 -0.62 -0.60 -0.58 -0.56 -0.54
[25] -0.52 -0.50 -0.48 -0.46 -0.44 -0.42 -0.40 -0.38 -0.36 -0.34 -0.32 -0.30
[37] -0.28 -0.26 -0.24 -0.22 -0.20 -0.18 -0.16 -0.14 -0.12 -0.10 -0.08 -0.06
[49] -0.04 -0.02  0.00  0.02  0.04  0.06  0.08  0.10  0.12  0.14  0.16  0.18
[61]  0.20
 
 Why does x[x <= 0.02] not return 0.02, but x[x < 0.2] return a subsample
with 0.02?
 Can anyone tell me why?
 Thanks!

 Victor

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] About basic logical operators

2013-03-05 Thread Anthony Damico
Not sure if you meant to use both 0.2 and 0.02, but I believe your
unexpected results are a floating-point issue.

Start here
http://r.789695.n4.nabble.com/That-dreaded-floating-point-trap-td3418142.html
and here
http://cran.r-project.org/doc/FAQ/R-FAQ.html#Why-doesn_0027t-R-think-these-numbers-are-equal_003f

and google "floating point R" for more detail  :)
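
The effect shows up directly with the poster's own vector: the element that
prints as 0.02 is not exactly the double 0.02 (behaviour as in the poster's
session):

x <- seq(-1, 1, by = 0.02)
x[52]                  # prints as 0.02 ...
x[52] == 0.02          # ... but compares FALSE here: it sits a hair above 0.02
x[52] - 0.02           # the tiny rounding error
x[x <= 0.02 + 1e-9]    # comparing with a small tolerance includes 0.02 again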





[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] About basic logical operators

2013-03-05 Thread Jeff Newmiller
FAQ 7.31
---
Jeff NewmillerThe .   .  Go Live...
DCN:jdnew...@dcn.davis.ca.usBasics: ##.#.   ##.#.  Live Go...
  Live:   OO#.. Dead: OO#..  Playing
Research Engineer (Solar/BatteriesO.O#.   #.O#.  with
/Software/Embedded Controllers)   .OO#.   .OO#.  rocks...1k
--- 
Sent from my phone. Please excuse my brevity.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] aov() and anova() making faulty F-tests

2013-03-05 Thread PatGauthier
Dear useRs, 

I've just encountered a serious problem involving the F-test being carried
out in aov() and anova(). In the provided example, aov() is not making the
correct F-test for an hypothesis involving the expected mean square (EMS) of
a factor divided by the EMS of another factor (i.e., instead of the error
EMS). 

Here is the example:


            Expected Mean Square         df
M_i         σ2 + 18σ2_M                   1
I_j         σ2 + 6σ2_MI + 12Ф(I)          2
MI_ij       σ2 + 6σ2_MI                   2
ε_(ijk)l    σ2                           30

The clear test for I_j is EMS(I) / EMS(MI) -> F(2,2)

However, observe the following example carried out in R, 

M <- rep(c("M1", "M2"), each = 18)
I <- as.ordered(rep(rep(c(5,10,15), each = 6), 2))
y <- c(44,39,48,40,43,41,27,20,25,21,28,22,35,30,29,34,31,38,
       12,7,6,11,7,12,15,10,12,17,11,13,22,15,27,22,21,19)
dat <- data.frame(M, I, y)
summary(aov(y ~ M*I, data = dat))
            Df Sum Sq Mean Sq F value   Pr(>F)
M            1 3136.0  3136.0  295.85  < 2e-16 ***
I            2  513.7   256.9   24.23 5.45e-07 ***
M:I          2  969.5   484.7   45.73 7.77e-10 ***
Residuals   30  318.0    10.6
---

In this example aov has taken the F-ratio MS(I) / MS(ε) -> F(2,30) =
24.23, with F-crit = qf(0.95,2,3) = 9.55 -- significant.

However, as stated above, the correct F-ratio is MS(I) / MS(MI) -> F(2,2) =
0.53, with F-crit = qf(0.95,2,2) = 19 -- non-significant.

Why is aov() miscalculating the F-ratio, and is there a way to fix this
without prior knowledge of the appropriate test (e.g., EMS(I)/EMS(MI))?

Thanks for your help, 

Patrick





--
View this message in context: 
http://r.789695.n4.nabble.com/aov-and-anova-making-faulty-F-tests-tp4660407.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] CARET and NNET fail to train a model when the input is high dimensional

2013-03-05 Thread James Jong
The following code fails to train a nnet model in a random dataset using
caret:

nR <- 700
nCol <- 2000
myCtrl <- trainControl(method="cv", number=3, preProcOptions=NULL,
    classProbs = TRUE, summaryFunction = twoClassSummary)
trX <- data.frame(replicate(nR, rnorm(nCol)))
trY <- runif(1)*trX[,1]*trX[,2]^2 + runif(1)*trX[,3]/trX[,4]
trY <- as.factor(ifelse(sign(trY) > 0, 'X1', 'X0'))
my.grid <- createGrid(method.name, grid.len, data=trX)
my.model <- train(trX, trY, method=method.name, trace=FALSE, trControl=myCtrl,
    tuneGrid=my.grid, metric="ROC")
print("Done")

The error I get is:
task 2 failed - arguments imply differing number of rows: 1334, 666

However, everything works if I reduce nR to, say 20.

Any thoughts on what may be causing this? Is there a place where I could
report this bug other than this mailing list?

Here is my session info:
 sessionInfo()
R version 2.15.2 (2012-10-26)
Platform: x86_64-unknown-linux-gnu (64-bit)

locale:
[1] C

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] nnet_7.3-5  pROC_1.5.4  caret_5.15-052  foreach_1.4.0
[5] cluster_1.14.3  plyr_1.8reshape2_1.2.2  lattice_0.20-13

loaded via a namespace (and not attached):
[1] codetools_0.2-8 compiler_2.15.2 grid_2.15.2 iterators_1.0.6
[5] stringr_0.6.2   tools_2.15.2

Thanks,

James

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] aov() and anova() making faulty F-tests

2013-03-05 Thread Rolf Turner



Your subject line is patent nonsense.  The aov() and anova() functions
have been around for decades.  If they were doing something wrong
it would have been noticed long since.

You should realize that the fault is in your understanding, not in these
functions.

I cannot really follow your convoluted and messy code, but it would
appear that you want to consider M and I to be random effects.

Where have you informed aov() as to the presence of these
random effects?

cheers,

Rolf Turner
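
For concreteness, one common way to tell aov() about the random factor (a
sketch, assuming M is the random effect; not code from the original exchange)
is an Error() term, which makes I be tested against the M:I stratum:

summary(aov(y ~ I + Error(M/I), data = dat))

# or form the ratio by hand from the fixed-effects fit:
fit  <- aov(y ~ M * I, data = dat)
ms   <- summary(fit)[[1]][["Mean Sq"]]   # MS(M), MS(I), MS(M:I), MS(residual)
Fval <- ms[2] / ms[3]                    # MS(I) / MS(M:I), about 0.53 here
pf(Fval, 2, 2, lower.tail = FALSE)       # p-value on 2 and 2 df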


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Wilcox-Off?

2013-03-05 Thread William Dunlap
 I would guess that it's the brevity argument;
 witness pnorm, pchisq, punif, ppois, pbinom, phyper, ...
 
 But maybe Kurt knows better since he's down as the author
 of pwilcox.

I can only guess it was the brevity argument also.  The [...]wilcox functions
were added to S+ in June 1990, with no comment on the name choice in
the checkin log, and the R authors probably copied the name from S+.

Bill Dunlap
Spotfire, TIBCO Software
wdunlap tibco.com



__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] How to combine conditional argument and logical argument in R to create subset of data...

2013-03-05 Thread HJ YAN
Dear R user

I have data created using code below

b <- matrix(2:21, nrow=4)
b[,1:3] <- NA
b[4,2] <- 5
b[3,1] <- 6

Now the data is

> b
     [,1] [,2] [,3] [,4] [,5]
[1,]   NA   NA   NA   14   18
[2,]   NA   NA   NA   15   19
[3,]    6   NA   NA   16   20
[4,]   NA    5   NA   17   21


I want to keep the rows where column 4 is greater than 15 and the values in
columns 1 & 2 are either greater than 4 or are NA. So I would like to have
my outcome as below...

[3,]    6   NA   NA   16   20
[4,]   NA    5   NA   17   21

I thought something like the code below was going to work, but it only returns
the last row, e.g. NA 5 NA 17 21 ...

bb <- b[which((b[,2]>4 | b[,2]==NA) & (b[,1]>4 | b[,1]==NA) & b[,4]>15),]


Please could anyone help?

Many thanks in advance

HJ

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] How to combine conditional argument and logical argument in R to create subset of data...

2013-03-05 Thread arun
Hi,

 b[b[,4]>15 & (b[,1]>4 | is.na(b[,1])) & (b[,2]>4 | is.na(b[,2])),]
#     [,1] [,2] [,3] [,4] [,5]
#[1,]    6   NA   NA   16   20
#[2,]   NA    5   NA   17   21
A.K.
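
For what it's worth, the reason the ==NA version in the original post selects
too little can be seen directly:

NA == NA               # NA, never TRUE, so b[,2]==NA matches nothing
c(1, NA, 6) > 4        # FALSE NA TRUE -- NAs stay NA in comparisons
is.na(c(1, NA, 6))     # FALSE TRUE FALSE -- the test to use instead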




__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] installing biOps on MacOSX fails

2013-03-05 Thread ishi soichi
Thanks for your answer.
The problem in my case was that the Mac was loading the 32-bit R rather
than R64 (the 64-bit build), while many libraries on this Mac are compiled
for the 64-bit architecture, so naturally the installation did not work at all.

Please make use of my case for future reference.

soichi
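
For anyone hitting the same thing, two quick checks of which build a running
R session actually is:

R.version$arch            # "x86_64" for a 64-bit build, "i386" for 32-bit
.Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit build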


2013/3/4 Prof Brian Ripley rip...@stats.ox.ac.uk

 See the recent discussion on R-sig-mac (the place to ask questions about
 OS X).


 On 04/03/2013 11:00, ishi soichi wrote:

 version.string R version 2.15.2 (2012-10-26)

 I am trying to install biOps on MacOS X 10.8.2

 First, I have tiff, fftw-3, jpeg

 and set paths like

 cd /usr/include
 sudo ln -s /usr/local/include/fftw3.h
 for x in /usr/local/include/j*.h; do sudo ln -s $x; done
 for x in /usr/local/include/tiff*.h; do sudo ln -s $x; done
 cd /usr/lib
 for x in /usr/local/lib/libfftw3.*; do sudo ln -s $x; done
 for x in /usr/local/lib/libjpeg.*; do sudo ln -s $x; done
 for x in /usr/local/lib/libtiff.*; do sudo ln -s $x; done

 then run,

  install.packages("biOps", repos="http://cran.md.tsukuba.ac.jp/",
  type="source")


 but it gives errors like the following.
 Can you tell me why ?  it looks like R cannot find the libraries..


 R does nothing: it is the linker which cannot find the libraries. See the
 discussion on R-sig-mac 


 ** R
 ** data
 ** inst
 ** preparing package for lazy loading
 ** help
 *** installing help indices
 ** building package indices
 ** testing if installed package can be loaded
  Error in dyn.load(file, DLLpath = DLLpath, ...) :
    unable to load shared object
  '/Library/Frameworks/R.framework/Versions/2.15/Resources/library/biOps/libs/i386/biOps.so':
    dlopen(/Library/Frameworks/R.framework/Versions/2.15/Resources/library/biOps/libs/i386/biOps.so, 6): Symbol not found: _TIFFClose
    Referenced from:
  /Library/Frameworks/R.framework/Versions/2.15/Resources/library/biOps/libs/i386/biOps.so
    Expected in: flat namespace
   in
  /Library/Frameworks/R.framework/Versions/2.15/Resources/library/biOps/libs/i386/biOps.so
  Error: loading failed
  Execution halted
  ERROR: loading failed
  * removing
  '/Library/Frameworks/R.framework/Versions/2.15/Resources/library/biOps'

  The downloaded source packages are in
  '/private/var/folders/hk/1clspzcd49d173p3pvpk1f3wgn/T/RtmpFpgBgP/downloaded_packages'
  Warning message:
  In install.packages("biOps", repos = "http://cran.md.tsukuba.ac.jp/", :
    installation of package 'biOps' had non-zero exit status

 library(biOps)

 Error in library(biOps) : there is no package called 'biOps'



 [[alternative HTML version deleted]]

  __
  R-help@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-help
  PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
  and provide commented, minimal, self-contained, reproducible code.



 --
 Brian D. Ripley,  rip...@stats.ox.ac.uk
  Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
  University of Oxford,             Tel:  +44 1865 272861 (self)
  1 South Parks Road,                     +44 1865 272866 (PA)
  Oxford OX1 3TG, UK                Fax:  +44 1865 272595

  __
  R-help@r-project.org mailing list
  https://stat.ethz.ch/mailman/listinfo/r-help
  PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
  and provide commented, minimal, self-contained, reproducible code.


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] corAR(1) with GLS: does it fit a random or a pooling model?

2013-03-05 Thread Tomas Note
Dear all, I am specifying a panel model with the gls function and the corCAR1
and corAR1 correlation structures to correct for serial autocorrelation. But I
have one seemingly trivial question: are models fitted that way random effects
models or pooling models?


Thanks again,

Tomas Notes

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Learning the R way – A Wish

2013-03-05 Thread Andrew Hoerner
Dear Mark--
I've just spent an hour and a half reading chapters from Hadley's book. It
is phenomenal. Thanks for pointing it out to me
   --andrewH


On Mon, Mar 4, 2013 at 9:04 PM, Mark Leeds marklee...@gmail.com wrote:

 Hi Andrew: Not that I've gone through it all yet, but the draft of Hadley's
 book at https://github.com/hadley/devtools/wiki/Introduction has a lot,
 if not all, of the commands you refer to and all of their gory details along
 with many examples. No matter what your budget, given that the book will
 be finished in December 2013, I would print out the current draft (it changes
 frequently, so your draft will become non-current pretty quickly) and make
 a binding (actually I had to make two bindings out of it) and go through
 it slowly. I was doing that for a while and it was quite enlightening until
 I got sidetracked with other things.


 On Mon, Mar 4, 2013 at 6:42 PM, andrewH ahoer...@rprogress.org wrote:

 There is something that I wish I had that I think would help me a lot to
 be a
 better R programmer, that I think would probably help many others as well.
 I put the wish out there in the hopes that someone might think it was
 worth
 doing at some point.

 I wish I had the code of some substantial, widely used package – lm, say –
 heavily annotated and explained at roughly the level of R knowledge of
 someone who has completed an intro statistics course using R and picked up
 some R along the way.  The idea is that you would say what the various
 blocks of code are doing, why the authors chose to do it this way rather
 than some other way, point out coding techniques that save time or memory
 or
 prevent errors relative to alternatives, and generally, to explain what it
 does and point out and explain as many of the smarter features as
 possible.
 Ideally, this would include a description at least at the conceptual level
 if not at the code level of the major C functions that the package calls,
 so
 that you understand at least what is happening at that level, if not the
 nitty-gritty details of coding.

 I imagine this as a piece of annotated code, but maybe it could be a video
 of someone, or some couple of people, scrolling through the code and
 talking
 about it. Or maybe something more like a wiki page, with various people
 contributing explanations for different lines, sections, and practices.

 I am learning R on my own from books and the internet, and I think I would
 learn a lot from a chatty line-by-line description of some substantial
 block
 of code by someone who really knows what he or she is doing – perhaps
 with a
 little feedback from some people who are new about where they get lost in
 the description.

 There are a couple of particular things that I personally would hope to
 get
 out of this.  First, there are lots of instances of good coding practice
 that I think most people pick up from other programmers or by having
 individual bits of code explained to them that are pretty hard to get from
 books and help files.  I think this might be a good way to get at them.

 Second, there are a whole bunch of functions in R that I call
 meta-programming functions – don’t know if they have a more proper name.
 These are things that are intended primarily to act on R language objects
 or
 to control how R objects are evaluated. They include functions like call,
 match.call, parse and deparse, deparen, get, envir, substitute, eval, etc.
 Although I have read the individual documentation for many of these
 command,
 and even used most of them, I don’t think I have any fluency with them, or
 understand well how and when to code with them.  I think reading a
 good-sized hunk of code that uses these functions to do a lot of things
 that
 packages often need to do in the best-practice or standard R way, together
 with comments that describe and explain them would help a lot with that.
 (There is a good smaller-scale example of this in Friedrich Leisch’s
 tutorial on creating R packages).

 These are things I think I probably share with many others. I actually
 have
 an ulterior motive for suggesting lm in particular that is more peculiar
 to
 me, though not unique I am sure. I would like to understand how formulas
 work well enough to use them in my own functions. I do not think there is
 any way to get that from the help documentation. I have been working on a
 piece of code that I suspect is reinventing, but in an awkward and kludgey
 way, a piece of the functionality of formulas. So far as I have been able
 to
 gather, the only place they are really explained in detail is in chapters
  2 & 3 of the White Book, “Statistical Models in S”. Unfortunately, I do not
 have ready access to a major research library and I have way, way outspent
 my book budget. Someday I’ll probably buy a copy, but for the time being,
 I
 am stuck without it. So it would be great to have a piece of code that
 uses
 them explained in detail.

 Warmest regards to all,  andrewH




 --
 View this message 

Re: [R] Learning the R way – A Wish

2013-03-05 Thread Andrew Hoerner
Thanks, David!  That bookfinder.com search is awesome! I checked four sites
and the best price I found for the White Book used was $99 + $4 shipping.
This was a quarter of that. So I just bought it.

The Venables and Ripley book was actually part of my previous
budget-busting splurge. I agree with everything good you have to say. Their
writing is elegant, concise, and surprisingly complete on many topics.
Their discussion of the dot-dot-dot function alone was worth the price of
the book.

But it still didn't help me much with formulas. The real meat of the
formula function is buried in a C function called by lm and the other
packages that use it. It is pretty hard to get at how it really works,
especially since I do not know any C.

Appreciatively, andrewH


On Mon, Mar 4, 2013 at 4:58 PM, David Winsemius dwinsem...@comcast.netwrote:


 On Mar 4, 2013, at 3:42 PM, andrewH wrote:

  There is something that I wish I had that I think would help me a lot to
 be a
  better R programmer, that I think would probably help many others as
 well.
  I put the wish out there in the hopes that someone might think it was
 worth
  doing at some point.
 
  I wish I had the code of some substantial, widely used package – lm, say
 –
  heavily annotated and explained at roughly the level of R knowledge of
  someone who has completed an intro statistics course using R and picked
 up
  some R along the way.  The idea is that you would say what the various
  blocks of code are doing, why the authors chose to do it this way rather
  than some other way, point out coding techniques that save time or
 memory or
  prevent errors relative to alternatives, and generally, to explain what
 it
  does and point out and explain as many of the smarter features as
 possible.
  Ideally, this would include a description at least at the conceptual
 level
  if not at the code level of the major C functions that the package
 calls, so
  that you understand at least what is happening at that level, if not the
  nitty-gritty details of coding.
 
  I imagine this as a piece of annotated code, but maybe it could be a
 video
  of someone, or some couple of people, scrolling through the code and
 talking
  about it. Or maybe something more like a wiki page, with various people
  contributing explanations for different lines, sections, and practices.
 
  I am learning R on my own from books and the internet, and I think I
 would
  learn a lot from a chatty line-by-line description of some substantial
 block
  of code by someone who really knows what he or she is doing – perhaps
 with a
  little feedback from some people who are new about where they get lost in
  the description.
 
  There are a couple of particular things that I personally would hope to
 get
  out of this.  First, there are lots of instances of good coding practice
  that I think most people pick up from other programmers or by having
  individual bits of code explained to them that are pretty hard to get
 from
  books and help files.  I think this might be a good way to get at them.
 
  Second, there are a whole bunch of functions in R that I call
  meta-programming functions – don’t know if they have a more proper name.
  These are things that are intended primarily to act on R language
 objects or
  to control how R objects are evaluated. They include functions like call,
  match.call, parse and deparse, deparen, get, envir, substitute, eval,
 etc.
  Although I have read the individual documentation for many of these
 command,
  and even used most of them, I don’t think I have any fluency with them,
 or
  understand well how and when to code with them.  I think reading a
  good-sized hunk of code that uses these functions to do a lot of things
 that
  packages often need to do in the best-practice or standard R way,
 together
  with comments that describe and explain them would help a lot with that.
  (There is a good smaller-scale example of this in Friedrich Leisch’s
  tutorial on creating R packages).
 
  These are things I think I probably share with many others. I actually
 have
  an ulterior motive for suggesting lm in particular that is more peculiar
 to
  me, though not unique I am sure. I would like to understand how formulas
  work well enough to use them in my own functions. I do not think there is
  any way to get that from the help documentation. I have been working on a
  piece of code that I suspect is reinventing, but in an awkward and
 kludgey
  way, a piece of the functionality of formulas. So far as I have been
 able to
  gather, the only place they are really explained in detail is in
 chapters 2 & 3 of the White Book, “Statistical Models in S”. Unfortunately, I do not
  have ready access to a major research library and I have way, way
 outspent
  my book budget. Someday I’ll probably buy a copy, but for the time
 being, I
  am stuck without it. So it would be great to have a piece of code that
 uses
  them explained in detail.

 Not sure that you have a valid 

[R] lm and Formula tutorial

2013-03-05 Thread Alaios
Dear all,
I was reading the lm and formula manual pages last night, and I have to
admit that I had a tough time understanding their syntax. Is there a simpler
guide for dummies like me to start with?

I would like to thank you in advance for your help

Regards
Alex
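
For orientation, the most common lm() formula idioms look like this (variable
names made up):

lm(y ~ x)              # y modelled on x, intercept included by default
lm(y ~ x + z)          # two predictors
lm(y ~ x * z)          # shorthand for x + z + x:z (main effects plus interaction)
lm(y ~ x + I(x^2))     # I() protects arithmetic inside a formula
lm(y ~ x - 1)          # drop the intercept
lm(log(y) ~ x, data = mydata)   # transformations, with variables taken from mydata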
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Learning the R way – A Wish

2013-03-05 Thread Andrew Hoerner
Thanks, Andrew! I'll put it on my list.
I have not been through much of it yet, but the exercises on count data are
excellent and at least one of them is immediately helpful to a current
project.
With appreciation, andrewH


On Mon, Mar 4, 2013 at 7:28 PM, Andrew Koeser arborkoe...@yahoo.com wrote:

 The book that helped me break into R and more advanced texts was Crawley's
 Statistics: An Introduction with R.  Very light read that assumes no
 prior knowledge with stats or R. I am using it to teach my fellow grad
 students R and all agree it was worth scrimping pennies to get. He also has
 a series of exercises (for free) that may be close to what you need.

 http://www3.imperial.ac.uk/**naturalsciences/research/**statisticsusingrhttp://www3.imperial.ac.uk/naturalsciences/research/statisticsusingr

 Andrew


 On 03/04/2013 05:42 PM, andrewH wrote:

 There is something that I wish I had that I think would help me a lot to be
 a better R programmer, and that I think would probably help many others as
 well. I put the wish out there in the hopes that someone might think it
 worth doing at some point.

 I wish I had the code of some substantial, widely used package – lm, say –
 heavily annotated and explained at roughly the level of R knowledge of
 someone who has completed an intro statistics course using R and picked up
 some R along the way. The idea is that you would say what the various blocks
 of code are doing, why the authors chose to do it this way rather than some
 other way, point out coding techniques that save time or memory or prevent
 errors relative to alternatives, and generally explain what the code does
 and point out as many of its smarter features as possible. Ideally, this
 would include a description, at least at the conceptual level if not at the
 code level, of the major C functions that the package calls, so that you
 understand at least what is happening at that level, if not the nitty-gritty
 details of the coding.

 I imagine this as a piece of annotated code, but maybe it could be a video
 of someone, or a couple of people, scrolling through the code and talking
 about it. Or maybe something more like a wiki page, with various people
 contributing explanations for different lines, sections, and practices.

 I am learning R on my own from books and the internet, and I think I would
 learn a lot from a chatty line-by-line description of some substantial block
 of code by someone who really knows what he or she is doing – perhaps with a
 little feedback from newcomers about where they get lost in the description.

 There are a couple of particular things that I personally would hope to get
 out of this. First, there are lots of instances of good coding practice that
 are pretty hard to get from books and help files, and that I think most
 people pick up from other programmers or by having individual bits of code
 explained to them. I think this might be a good way to get at them.

 Second, there are a whole bunch of functions in R that I call
 meta-programming functions – I don’t know if they have a more proper name.
 These are functions intended primarily to act on R language objects or to
 control how R objects are evaluated. They include functions like call,
 match.call, parse and deparse, deparen, get, envir, substitute, eval, etc.
 Although I have read the individual documentation for many of these
 commands, and have even used most of them, I don’t think I have any fluency
 with them, or understand well how and when to code with them. I think
 reading a good-sized hunk of code that uses these functions to do a lot of
 the things that packages often need to do, in the best-practice or standard
 R way, together with comments that describe and explain them, would help a
 lot with that. (There is a good smaller-scale example of this in Friedrich
 Leisch’s tutorial on creating R packages.)
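
A purely illustrative sketch -- not taken from lm or any other package -- of
how a few of these functions are commonly combined; the toy function and
object names here are made up for the example:

  ## toy function: report how it was called and what it was asked to summarise
  describe <- function(x, digits = 2) {
    cl   <- match.call()             # the call, with arguments matched to their names
    name <- deparse(substitute(x))   # the unevaluated argument, turned into a string
    cat("called as:", deparse(cl), "\n")
    cat("summarising:", name, "\n")
    round(summary(x), digits)
  }

  z <- rnorm(50)
  describe(z)                        # reports that it was asked to summarise 'z'

  ## get() and eval()/parse() make the reverse trip, from a name or a piece
  ## of text back to a value
  get("z")[1:3]                      # fetch an object by its name (a string)
  eval(parse(text = "mean(z)"))      # turn a string into code and evaluate it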

 These are things I think I probably share with many others. I actually have
 an ulterior motive for suggesting lm in particular that is more peculiar to
 me, though not unique, I am sure. I would like to understand how formulas
 work well enough to use them in my own functions. I do not think there is
 any way to get that from the help documentation. I have been working on a
 piece of code that I suspect is reinventing, in an awkward and kludgey way,
 a piece of the functionality of formulas. So far as I have been able to
 gather, the only place they are really explained in detail is in chapters 2
 & 3 of the White Book, “Statistical Models in S”. Unfortunately, I do not
 have ready access to a major research library and I have way, way outspent
 my book budget. Someday I’ll probably buy a copy, but for the time being, I
 am stuck without it. So it would be great to have a piece of code that uses
 them explained in detail.
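
In the meantime, one common pattern for accepting a formula in a user-written
function -- again an illustrative sketch, not the internals of lm -- is to let
model.frame, model.response and model.matrix do the work; the function name
myFit is made up here:

  myFit <- function(formula, data) {
    mf <- model.frame(formula, data)   # just the variables the formula mentions
    y  <- model.response(mf)           # the left-hand side
    X  <- model.matrix(formula, mf)    # right-hand side expanded to a design matrix
    drop(solve(crossprod(X), crossprod(X, y)))   # ordinary least squares, by hand
  }

  d <- data.frame(y = rnorm(20), g = gl(2, 10), x = runif(20))
  all.vars(y ~ g + x)                  # "y" "g" "x": the variables a formula mentions
  myFit(y ~ g + x, data = d)           # compare with coef(lm(y ~ g + x, data = d))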

 Warmest regards to all,  andrewH





Re: [R] Learning the R way – A Wish

2013-03-05 Thread Andrew Hoerner
Dear Patrick--
After the official Core Team's R manuals and the individual function help
pages, I have found The R Inferno to be the single most useful piece of
documentation whenever I have gotten stuck on an R problem. It is the only
introduction that seems to be aware of the ambiguities present in the
official documentation and of some of the ways one can get stuck in traps
of misunderstanding. Plus, it is enjoyably witty.

When I first started using it, I found it ranged from very useful to pretty
frustrating. I did not always understand what the examples you presented
were trying to say. It is still true that I occasionally wish for a little
more discursive explanatory style, but as time goes by I find that I am
increasingly likely to get the point just from the example.

Many thanks, Andrew


On Tue, Mar 5, 2013 at 1:46 AM, Patrick Burns pbu...@pburns.seanet.com wrote:

 Andrew,

 That sounds like a sensible document you propose.
 Perhaps I'll do a few blog posts along that vein -- thanks.

 I presume you know of 'The R Inferno', which does
 a little of what you want.

 Pat




Re: [R] Constrained cubic smoothing spline

2013-03-05 Thread Simon Wood

Victor,

It's a bit clunky, but you can use the 'pcls' function in package 'mgcv'
for this, by adapting the examples in its help file. The examples
themselves deal with inequality constraints imposing monotonicity, but
'pcls' also allows you to impose equality constraints. The examples are
based on penalized cubic regression splines (i.e. cubic splines with
fewer knots than you have data). You could use essentially the same code
for full smoothing splines, but it is somewhat inefficient for that
purpose and will be prohibitively expensive for large datasets: the
operation count is O(np^2), where p is the number of spline coefficients,
so setting p = n can get costly.
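
For what it is worth, here is a rough sketch of the general idea. It is not
the pcls recipe itself: it builds the basis and penalty with mgcv::smoothCon,
handles the single equality constraint with quadprog::solve.QP, and fixes the
smoothing parameter by hand purely for illustration.

  library(mgcv)      # smoothCon() / Predict.matrix() build the basis and penalty
  library(quadprog)  # solve.QP() handles the equality constraint

  set.seed(1)
  x  <- sort(runif(100))
  y  <- sin(2 * pi * x) + rnorm(100, sd = 0.2)
  x0 <- 0.5; y0 <- 0                  # the point the curve must pass through

  sm <- smoothCon(s(x, bs = "cr", k = 20), data = data.frame(x = x), knots = NULL)[[1]]
  X  <- sm$X                          # spline basis evaluated at the data
  S  <- sm$S[[1]]                     # cubic-spline roughness penalty
  lambda <- 0.01                      # illustrative smoothing parameter, fixed by hand

  ## minimise ||y - X b||^2 + lambda * t(b) %*% S %*% b  subject to  c0 %*% b = y0
  Dmat <- crossprod(X) + lambda * S
  dvec <- as.numeric(crossprod(X, y))
  c0   <- Predict.matrix(sm, data.frame(x = x0))   # basis row at the constraint point
  fit  <- solve.QP(Dmat, dvec, Amat = t(c0), bvec = y0, meq = 1)

  xg <- seq(0, 1, length.out = 200)
  fg <- Predict.matrix(sm, data.frame(x = xg)) %*% fit$solution
  plot(x, y); lines(xg, fg); points(x0, y0, pch = 19)  # curve passes through (x0, y0)

In a real analysis you would choose the smoothing parameter properly (for
example, take sp from an unconstrained gam fit, as the pcls examples do)
rather than fixing it by hand.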


Simon




On 06/03/13 01:07, Victor hyk wrote:

Hello everyone,
Does anyone know how to force a cubic smoothing spline to pass through a
particular point?
I found a suggestion on a website that the cobs package can be used to force
a spline to pass through certain points or to impose shape constraints
(increasing, decreasing). However, that package uses B-splines and can only
handle linear and quadratic B-splines.
In my research, I need to force a cubic smoothing spline to pass through a
point.
Thanks!

   Victor




__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.




--
Simon Wood, Mathematical Science, University of Bath BA2 7AY UK
+44 (0)1225 386603   http://people.bath.ac.uk/sw283

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.