Re: [R] statement can take values in row1 and row2

2020-09-02 Thread Jim Lemon
Hi Hesham, I think you are looking for something like this:

  truth <- data.frame(G1 = sample(LETTERS[1:4], 20, TRUE),
                      G2 = sample(LETTERS[1:4], 20, TRUE))
  truth
  truth$G3 <- as.numeric(truth$G1 == truth$G2)
  truth

Note that, like quite a few emails produced with Javascript formatting, there are embedded

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread John via R-help
On Wed, 2 Sep 2020 16:31:53 -0500 David Jones wrote: > Thank you Uwe, John, and Bert - this is very helpful context. > If it helps inform the discussion, to address John and Bert's questions - I actually had less memory free when I originally ran the analyses and saved the workspace, than

Re: [R] statement can take values in row1 and row2

2020-09-02 Thread Bert Gunter
Please re-post in plain text. This is a plain-text list and HTML can get messed up, as here. Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip) On Wed, Sep 2, 2020

[R] statement can take values in row1 and row2

2020-09-02 Thread Hesham A. AL-bukhaiti via R-help
Hello. I have this code:

  # read data, just three columns: the first and second columns are
  # categorical, the third column is a number
  out <- read.csv("outbr.csv")
  truth <- out[, seq(1, 2)]   # truth: about 2000 rows

Some values in row1 can show in row2, and some values in row2

Re: [R] .grb2 Files

2020-09-02 Thread David Winsemius
A very simple search (= "CRAN NOAA .grb2") and a small bit of reading the help files suggest that you might want wgrib2 and rNOMADS: https://rdrr.io/cran/rNOMADS/man/GribInfo.html https://www.cpc.ncep.noaa.gov/products/wesley/wgrib2/ -- David On 9/2/20 5:57 PM, Sarah Goslee wrote: GDAL supports
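A hedged sketch of the rNOMADS route (assumptions: wgrib2 is installed and on the PATH, "gfs_forecast.grb2" is a hypothetical local file, and the return structure is as described in the linked GribInfo help page):

  library(rNOMADS)
  inv <- GribInfo("gfs_forecast.grb2", file.type = "grib2")
  inv$inventory[1:5]   # one entry per GRIB record (variable, level, time)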

Re: [R] .grb2 Files

2020-09-02 Thread Sarah Goslee
GDAL supports GRIB2, so it should be easy using the rgdal and raster packages. Sarah On Wed, Sep 2, 2020 at 8:32 PM Philip wrote: > > Any advice about how to get NOAA .grb2 files into R? > > Thanks. -- Sarah Goslee (she/her) http://www.numberwright.com
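A minimal sketch of that route, assuming a GDAL build with the GRIB driver and a hypothetical local file name:

  library(raster)                   # reads GRIB2 via GDAL's GRIB driver
  r <- brick("gfs_forecast.grb2")   # hypothetical file
  r                                 # one layer per GRIB message
  plot(r[[1]])                      # quick look at the first layer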

[R] .grb2 Files

2020-09-02 Thread Philip
Any advice about how to get NOAA .grb2 files into R? Thanks.

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Jeff Newmiller
You need more RAM to load this file. As the memory was being used in your original file, certain objects (such as numeric columns) were being shared among different higher-level objects (such as data frames). When serialized into the file those optimizations were lost, and now those columns are
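One way to see the effect Jeff describes, assuming the lobstr package is installed (base object.size() does not account for sharing):

  x <- rnorm(1e6)                 # ~8 MB of doubles
  lst1 <- list(x)                 # shares x's memory with lst2 in RAM
  lst2 <- list(x)
  lobstr::obj_size(lst1, lst2)    # ~8 MB total: one shared copy
  save(lst1, lst2, file = "shared.RData")
  # on load(), the sharing is gone: the restored objects hold two
  # separate copies and need roughly twice the memory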

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Leandro Marino
David, if the ".RData" contains more than one object you could (and maybe should) use the SOAR package (from Venables). This package helps you split the objects over multiple RData files. It's useful when you have numerous medium-large objects in the workspace but don't use them at the same
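A minimal sketch of the SOAR workflow (assuming the package is installed; by default it keeps the files under a ./.R_Cache directory):

  library(SOAR)
  big1 <- rnorm(1e7)
  big2 <- rnorm(1e7)
  Store(big1, big2)   # each object is written to its own file and
                      # removed from the workspace
  Objects()           # list what is in the attached cache
  mean(big1)          # objects are lazy-loaded back on first use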

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread David Jones
Thank you Uwe, John, and Bert - this is very helpful context. If it helps inform the discussion, to address John and Bert's questions - I actually had less memory free when I originally ran the analyses and saved the workspace than when I read the data back in later on (I rebooted in an

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Bert Gunter
R experts may give you a detailed explanation, but it is certainly possible that the memory available to R when it wrote the file was different than when it tried to read it, is it not? Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread John via R-help
On Wed, 2 Sep 2020 13:36:43 +0200 Uwe Ligges wrote: > On 02.09.2020 04:44, David Jones wrote: > > I ran a number of analyses in R and saved the workspace, which resulted in a 2GB .RData file. When I try to read the file back into R > Compressed in RData but uncompressed in main

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread Paul Bernal
You are right Jeff, that was a mistake: I was focusing on the square root and talked about taking the square root instead of raising to the 2nd power. This is the example I was following (https://www.youtube.com/watch?v=SaQgA6V8UA4). Of course, I tried fitting the nnet model

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread peter dalgaard
The problem seems to be the fit rather than the predictions. Looks like nnet is happier with data between 0 and 1, witness:

  Fit <- nnet(y / max(y) ~ x, a, size = 5, maxit = 1000,
              linout = TRUE, decay = 0.001)
  plot(y / max(y) ~ x, a)
  lines(fitted(Fit) ~ x, a)

> On 2 Sep 2020, at 16:21 , Paul Bernal wrote:
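A self-contained version of that scaling idea, as a minimal sketch with made-up data (the data frame a below is hypothetical; the original poster's data is not shown):

  library(nnet)
  set.seed(1)
  a <- data.frame(x = seq(0, 10, by = 0.1))
  a$y <- a$x^2 + abs(rnorm(nrow(a), sd = 5))   # made-up, non-negative response
  # rescale the response to [0, 1]; linout = TRUE requests regression
  # (linear output units) rather than classification
  Fit <- nnet(y / max(y) ~ x, data = a, size = 5, maxit = 1000,
              linout = TRUE, decay = 0.001)
  plot(y / max(y) ~ x, a)
  lines(fitted(Fit) ~ x, a)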

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread Jeff Newmiller
Why would you expect raising y_pred to the power 0.5 to "backtransform" a model sqrt(y) ~ x? Wouldn't you raise it to the power 2? Why would you "backtransform" x in such a model if it were never transformed in the first place? Dr Maechler did not suggest that. And why are you mentioning some random
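To make the squaring point concrete, a minimal sketch reusing the made-up data frame a from the sketch above:

  # fit on the square-root scale, then square the predictions to return
  # to the scale of y (raising to 0.5 would undo a square, not a square root)
  sfit <- nnet(sqrt(y) ~ x, data = a, size = 5, maxit = 1000,
               linout = TRUE, decay = 0.001)
  y_pred <- predict(sfit, newdata = a)^2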

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread Paul Bernal
Dear Dr. Martin and Dr. Peter, Hope you are doing well. Thank you for your kind feedback. I also tried fitting the nnet using y ~ x, but the model kept on generating odd predictions. If I understand correctly, from what Dr. Martin said, it would be a good idea to try modeling sqrt(y) ~ x and then

Re: [R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread Uwe Ligges
On 02.09.2020 04:44, David Jones wrote:
> I ran a number of analyses in R and saved the workspace, which
> resulted in a 2GB .RData file. When I try to read the file back
> into R

Compressed in RData but uncompressed in main memory

> later, it won't read into R and provides the error:
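Uwe's point is easy to demonstrate with a highly compressible object; a minimal sketch:

  x <- rep(1, 1e7)                 # 10 million doubles: ~80 MB in RAM
  print(object.size(x), units = "MB")
  save(x, file = "x.RData")        # save() gzip-compresses by default
  file.size("x.RData")             # only a few KB on disk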

[R] augPred and missing data error

2020-09-02 Thread PIKAL Petr
Dear all, I would like to ask whether augPred is able to handle missing values. Here is an example with the data "test" below. I read the augPred documentation, and nothing is mentioned there about a fitted object from data with missing values not being usable in augPred. Maybe it would be worth adding something. Or I just
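One common workaround, as a minimal sketch with made-up data standing in for the poster's "test" (which is not shown here): drop the incomplete rows yourself before fitting, so the fitted object and augPred() both see the same complete data.

  library(nlme)
  test <- data.frame(batch = factor(rep(1:4, each = 10)),
                     time  = rep(1:10, 4),
                     value = rnorm(40))
  test$value[c(3, 17)] <- NA                   # missing responses
  fit <- lme(value ~ time, random = ~ 1 | batch, data = na.omit(test))
  plot(augPred(fit, primary = ~ time, level = 0:1))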

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread Martin Maechler
> peter dalgaard on Wed, 2 Sep 2020 08:41:09 +0200 writes: > Generically, nnet(a$y ~ a$x, a ...) should be nnet(y ~ x, data=a, ...) otherwise predict will go looking for a$x, no matter what is in xnew. > But more importantly, nnet() is a _classifier_,

Re: [R] Odd Results when generating predictions with nnet function

2020-09-02 Thread peter dalgaard
Generically, nnet(a$y ~ a$x, a ...) should be nnet(y ~ x, data=a, ...) otherwise predict will go looking for a$x, no matter what is in xnew. But more importantly, nnet() is a _classifier_, so the LHS should be a class, not a numeric variable. -pd > On 1 Sep 2020, at 22:19 , Paul Bernal
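A minimal sketch of both points with made-up data (not the original poster's): reference columns through data= so predict() can find x in newdata, and give nnet a factor response for classification.

  library(nnet)
  set.seed(1)
  d <- data.frame(x = runif(100))
  d$cls <- factor(ifelse(d$x > 0.5, "hi", "lo"))   # factor LHS: a class
  fit <- nnet(cls ~ x, data = d, size = 2)
  predict(fit, newdata = data.frame(x = c(0.1, 0.9)), type = "class")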

[R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

2020-09-02 Thread David Jones
I ran a number of analyses in R and saved the workspace, which resulted in a 2GB .RData file. When I try to read the file back into R later, it won't read into R and provides the error: "Error: cannot allocate vector of size 37 Kb" This error comes after 1 minute of trying to read things in - I