[R] Memory problem

2007-08-09 Thread Gang Chen
I got a long list of error messages, repeating the following 3 lines, when running the loop at the end of this mail: R(580,0xa000ed88) malloc: *** vm_allocate(size=327680) failed (error code=3) R(580,0xa000ed88) malloc: *** error: can't allocate region R(580,0xa000ed88) malloc: *** set a

Re: [R] Memory problem

2007-08-09 Thread Gang Chen
It seems the problem lies in this line: try(fit.lme <- lme(Beta ~ group*session*difficulty+FTND, random = ~1|Subj, Model), tag <- 1); As lme fails for most iterations in the loop, the 'try' function catches one error message for each failed iteration. But the puzzling part is, why does the
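A minimal sketch of the try() pattern discussed in this thread, assuming the loop runs over voxels; n.vox, beta.mat, and the data frame Model are hypothetical stand-ins for the poster's objects:

  library(nlme)
  for (i in seq_len(n.vox)) {                     # n.vox: assumed number of iterations
    Model$Beta <- beta.mat[, i]                   # hypothetical per-iteration response
    fit.lme <- try(lme(Beta ~ group * session * difficulty + FTND,
                       random = ~ 1 | Subj, data = Model),
                   silent = TRUE)
    if (inherits(fit.lme, "try-error")) { tag <- 1; next }   # flag the failure, keep looping
    # ... extract whatever is needed from fit.lme here ...
  }

With silent = TRUE the error messages are suppressed instead of being printed once per failed fit.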

[R] memory problem

2007-07-14 Thread Li, Xue
Hi, My computer has 2GB of RAM and I also request 2GB of virtual RAM from the C drive, so in total I have 4GB. Before I open R, I also add C:\Program Files\R\R-2.5.0\bin\Rgui.exe --max-mem-size=3000Mb --max-vsize=3000Mb into the target of R by right-clicking the R
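For reference, the same limits can be queried and raised from inside a running session; these helpers are Windows-only (and were removed in R >= 4.2), and a 32-bit Windows process is normally capped at 2 GB of address space no matter how much page file is configured:

  memory.limit()              # current --max-mem-size value, in Mb
  memory.limit(size = 3000)   # request a higher limit (capped by the 2 GB / 3 GB address space)
  memory.size(max = TRUE)     # maximum memory obtained from the OS so far, in Mb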

Re: [R] memory problem --- use sparse matrices

2007-01-09 Thread Zoltan Kmetty
Unfortunately, I have to fill all the cells with numbers..., so I need a better machine, or I have to split the data into smaller parts, which is much slower, but I see I don't have any other alternative. But thanks for your help, because I work with big networks too (1 vertex), and
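For readers who can leave most cells at zero, a sparse representation from the Matrix package avoids allocating the full dense block; this is a generic sketch rather than the poster's code, and it only helps when the matrix really is mostly zeros:

  library(Matrix)
  m <- Matrix(0, nrow = 1e6, ncol = 10, sparse = TRUE)   # dense equivalent would need ~80 MB
  m[1, 2] <- 1.5                                         # only nonzero entries are stored
  object.size(m)                                         # a few KB instead of ~80 MB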

Re: [R] memory problem

2007-01-08 Thread Thomas Lumley
On Sat, 6 Jan 2007, Zoltan Kmetty wrote: Hi! I had a memory problem with R - hope somebody could tell me a solution. I work with very large datasets, but R cannot allocate enough memory to handle these datasets. You haven't said what you want to do with these datasets. -thomas

Re: [R] memory problem --- use sparse matrices

2007-01-08 Thread Martin Maechler
UweL == Uwe Ligges [EMAIL PROTECTED] on Sun, 07 Jan 2007 09:42:08 +0100 writes: UweL> Zoltan Kmetty wrote: Hi! I had a memory problem with R - hope somebody could tell me a solution. I work with very large datasets, but R cannot allocate enough

Re: [R] memory problem

2007-01-07 Thread Uwe Ligges
Zoltan Kmetty wrote: Hi! I had a memory problem with R - hope somebody could tell me a solution. I work with very large datasets, but R cannot allocate enough memory to handle these datasets. I want to work with a matrix with rows = 100 000 000 and columns = 10. I know this is 1 milliard

Re: [R] memory problem

2007-01-07 Thread Bos, Roger
: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Uwe Ligges Sent: Sunday, January 07, 2007 3:42 AM To: Zoltan Kmetty Cc: r-help@stat.math.ethz.ch Subject: Re: [R] memory problem Zoltan Kmetty wrote: Hi! I had some memory problem with R - hope somebody could tell me a solution. I work

[R] memory problem

2007-01-06 Thread Zoltan Kmetty
Hi! I had a memory problem with R - hope somebody could tell me a solution. I work with very large datasets, but R cannot allocate enough memory to handle these datasets. I want to work with a matrix with rows = 100 000 000 and columns = 10. I know this is 1 milliard (10^9) cases, but I thought R could handle
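A quick back-of-the-envelope check of the request above: 100,000,000 rows by 10 columns of doubles is roughly 7.5 GB for a single copy, well beyond what a 32-bit build of R can address:

  rows <- 1e8; cols <- 10
  rows * cols * 8 / 2^30    # ~7.45 GiB for one dense numeric copy, before any working copies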

Re: [R] Memory problem on a linux cluster using a large data set [Broadcast]

2006-12-21 Thread Iris Kolder
Kolder [EMAIL PROTECTED] Cc: r-help@stat.math.ethz.ch; N.C. Onland-moret [EMAIL PROTECTED] Sent: Monday, December 18, 2006 7:48:23 PM Subject: RE: [R] Memory problem on a linux cluster using a large data set [Broadcast] In addition to my off-list reply to Iris (pointing her to an old post

Re: [R] Memory problem on a linux cluster using a large data set [Broadcast]

2006-12-21 Thread Thomas Lumley
On Thu, 21 Dec 2006, Iris Kolder wrote: Thank you all for your help! So with all your suggestions we will try to run it on a computer with a 64-bit processor. But I've been told that the new R versions all work on a 32-bit processor. I read in other posts that only the old R versions

Re: [R] Memory problem on a linux cluster using a large data set [Broadcast]

2006-12-21 Thread Martin Morgan
Section 8 of the Installation and Administration guide says that on 64-bit architectures the 'size of a block of memory allocated is limited to 2^32-1 (8 GB) bytes'. The wording 'a block of memory' here is important, because this sets a limit on a single allocation rather than the memory consumed

[R] Memory problem on a linux cluster using a large data set

2006-12-18 Thread Iris Kolder
Hello, I have a large data set of 320,000 rows and 1000 columns. All the data has the values 0, 1, 2. I wrote a script to remove all the rows with more than 46 missing values. This works perfectly on a smaller dataset, but the problem arises when I try to run it on the larger data set: I get an error
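The row filter described here can be written in one vectorised step; a minimal sketch, assuming the data sit in a hypothetical matrix geno with NA marking the missing genotypes:

  keep <- rowSums(is.na(geno)) <= 46        # drop rows with more than 46 missing values
  geno <- geno[keep, , drop = FALSE]
  # storing the 0/1/2 codes as integer halves the footprint relative to double:
  # 320000 * 1000 * 4 / 2^30 is ~1.2 GiB per copy
  storage.mode(geno) <- "integer"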

Re: [R] Memory problem on a linux cluster using a large data set

2006-12-18 Thread Martin Morgan
Iris -- I hope the following helps; I think you have too much data for a 32-bit machine. Martin Iris Kolder [EMAIL PROTECTED] writes: Hello, I have a large data set of 320,000 rows and 1000 columns. All the data has the values 0, 1, 2. It seems like a single copy of this data set will be at

Re: [R] Memory problem on a linux cluster using a large data set [Broadcast]

2006-12-18 Thread Liaw, Andy
In addition to my off-list reply to Iris (pointing her to an old post of mine that detailed the memory requirement of RF in R), she might consider the following: - Use larger nodesize - Use sampsize to control the size of bootstrap samples Both of these have the effect of reducing sizes of trees
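A hedged sketch of those two randomForest() arguments; geno and pheno are hypothetical stand-ins for the poster's predictors and response, and the values shown are illustrative only:

  library(randomForest)
  rf <- randomForest(x = geno, y = pheno, ntree = 500,
                     nodesize = 50,      # larger terminal nodes => shallower, smaller trees
                     sampsize = 10000)   # smaller bootstrap samples => smaller trees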

Re: [R] memory problem [cluster]

2006-12-05 Thread Martin Maechler
Roger == Roger Bivand [EMAIL PROTECTED] on Sat, 2 Dec 2006 22:11:12 +0100 (CET) writes: Roger> On Sat, 2 Dec 2006, Dylan Beaudette wrote: Hi Stephano, Roger> Looks like you used my example verbatim Roger> (http://casoilresource.lawr.ucdavis.edu/drupal/node/221) Roger> :)

Re: [R] memory problem [cluster]

2006-12-02 Thread Dylan Beaudette
Hi Stephano, Looks like you used my example verbatim (http://casoilresource.lawr.ucdavis.edu/drupal/node/221) :) While my approach has not *yet* been published, the original source [4] by Roger Bivand certainly has. Just a reminder. That said, I would highly recommend reading up on the

Re: [R] memory problem [cluster]

2006-12-02 Thread Roger Bivand
On Sat, 2 Dec 2006, Dylan Beaudette wrote: Hi Stephano, Looks like you used my example verbatim (http://casoilresource.lawr.ucdavis.edu/drupal/node/221) :) From exchanges on R-sig-geo, I believe the original questioner is feeding NAs to clara, and the error message in clara() is overrunning

[R] memory problem

2006-12-01 Thread Massimo Di Stefano
Hi all, frustrated by this error: today I bought a 1 GB memory module for my laptop, so it now has 1.28 GB instead of the old 512 MB, but I get the same error :-( What can I do? Repeating it for a small area (about 20x20 km at res=20m) works fine! Do you have any suggestions? Is there a method to look

[R] Memory problem

2006-03-02 Thread Mahdi Osman
Hi list, I am analysing a large dataset using random coefficient (nlme) and fixed effects (lm) models. I have a problem with my R version 2.2.1 due to memory allocation difficulties. When I try to expand the memory I get the following error message. R

[R] memory problem in handling large dataset

2005-10-27 Thread Weiwei Shi
Dear Listers: I have a question on handling a large dataset. I searched R-Search and I hope I can get more information on my specific case. First, my dataset has 1.7 billion observations and 350 variables, of which 300 are float and 50 are integer. My system has 8 GB memory, a 64-bit CPU, Linux

Re: [R] memory problem in handling large dataset

2005-10-27 Thread Berton Gunter
is to catalyze the scientific learning process. - George E. P. Box -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Weiwei Shi Sent: Thursday, October 27, 2005 9:28 AM To: r-help Subject: [R] memory problem in handling large dataset Dear

Re: [R] memory problem in handling large dataset

2005-10-27 Thread Liaw, Andy
the scientific learning process. - George E. P. Box -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Weiwei Shi Sent: Thursday, October 27, 2005 9:28 AM To: r-help Subject: [R] memory problem in handling large dataset Dear Listers

Re: [R] memory problem in handling large dataset

2005-10-27 Thread Weiwei Shi
Hi, Jim: Thanks for the calculation. I think you won't mind if I cc the reply to r-help too so that I can get more info. I assume you use 4 bytes for integer and 8 bytes for float, so 300x8+50x4=2600 bytes for each observation, right? I wish I could have 500x8 G memory :) just kidding..
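Continuing that arithmetic, a single in-memory copy of the full dataset would need on the order of 4 TiB, which is why sampling or out-of-memory tools are the only realistic options here:

  bytes.per.obs <- 300 * 8 + 50 * 4   # 2600 bytes, as above
  bytes.per.obs * 1.7e9 / 2^40        # ~4 TiB for one copy of all observations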

Re: [R] memory problem in handling large dataset

2005-10-27 Thread Weiwei Shi
Of Weiwei Shi Sent: Thursday, October 27, 2005 9:28 AM To: r-help Subject: [R] memory problem in handling large dataset Dear Listers: I have a question on handling large dataset. I searched R-Search and I hope I can get more information as to my specific case. First, my

Re: [R] memory problem in handling large dataset

2005-10-27 Thread Søren Højsgaard
of the statistician is to catalyze the scientific learning process. - George E. P. Box -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Weiwei Shi Sent: Thursday, October 27, 2005 9:28 AM To: r-help Subject: [R] memory problem in handling large

[R] memory problem

2005-07-14 Thread Ginters
I'm a beginner in R and, therefore, I don't know how serious my trouble is. After running a script: t <- c(14598417794, 649693); data = data.frame(read.spss("C:\\Ginters\\Kalibracija\\cal_data.sav")); Xs = as.matrix(data[,1:2])

Re: [R] memory problem

2005-07-14 Thread Duncan Murdoch
On 7/14/2005 7:19 AM, Ginters wrote: I'm a beginner in R and, therefore, I don't know how serious my trouble is. After running a script: t <- c(14598417794, 649693); data = data.frame(read.spss("C:\\Ginters\\Kalibracija\\cal_data.sav")); Xs = as.matrix(data[,1:2])

Re: [R] memory problem

2005-07-14 Thread Thomas Lumley
On Thu, 14 Jul 2005, Duncan Murdoch wrote: On 7/14/2005 7:19 AM, Ginters wrote: Why does memory need so much (1.6 GB) space? How can I enlarge it? Is it possible to move part of the memory used onto the hard drive? Or is the trouble only with my script? This sounds like a problem with the

RE: [R] memory problem with mac os X

2005-03-01 Thread Huntsinger, Reid
@stat.math.ethz.ch Subject: [R] memory problem with mac os X Dear list, I am using R 2.0.1 on a dual-processor 2.5GHz G5 with 2 GB RAM (Mac OS X 10.3.8). I'm trying to calculate an object of type dist. I am getting the following memory error: *** malloc: vm_allocate(size=1295929344) failed (error code=3

[R] memory problem with mac os X

2005-02-28 Thread Edouard Henrion
Dear list, I am using R 2.0.1 on a dual-processor 2.5GHz G5 with 2 GB RAM (Mac OS X 10.3.8). I'm trying to calculate an object of type dist. I am getting the following memory error: *** malloc: vm_allocate(size=1295929344) failed (error code=3) *** malloc[25960]: error: Can't allocate region

[R] memory problem with package mix

2005-02-15 Thread Delphine . Gille
Hello, I think we have a memory problem with em.mix. We have done: library(mix) Manq <- read.table("C:/.../file.txt") attach(Manq) Manq V1 V2 V3 V4 .V27 1 1 1 1 1... 2 1 NA 3 6 3 1 2 6 2 ... ... 300 2 NA 6 2... Essaimanq

RE: [R] memory problem with package mix

2005-02-15 Thread Ted Harding
On 15-Feb-05 [EMAIL PROTECTED] wrote: Hello, I think we have a memory problem with em.mix. We have done: library(mix) Manq <- read.table("C:/.../file.txt") attach(Manq) Manq V1 V2 V3 V4 .V27 1 1 1 1 1... 2 1 NA 3 6 3 1 2 6 2 ... ... 300 2 NA 6

[R] Memory problem ... Again

2005-01-03 Thread Tae-Hoon Chung
Happy new year to all; A few days ago, I posted a similar problem. At that time, I found out that our R program had been compiled 32-bit, not 64-bit. So the R program has been re-installed as 64-bit to run the same job, reading in 150 Affymetrix U133A v2 CEL files and performing dChip

RE: [R] Memory problem ... Again

2005-01-03 Thread Liaw, Andy
Have you checked whether there are limits set? What does `ulimit -a' say? Do you know how much memory the R process was using when the error occurred? We've had R jobs using upwards of 13GB on a box with 16GB of RAM (SLES8 on dual Opterons) and never had problems. Andy From: Tae-Hoon Chung

Re: [R] Memory problem ... Again = False Alarm !!!

2005-01-03 Thread Tae-Hoon Chung
Thanks Peter and Andy; I just found it was not due to a memory problem. It was a false alarm ... the 64-bit compiled program works fine! On 1/3/05 3:39 PM, Peter Dalgaard [EMAIL PROTECTED] wrote: Tae-Hoon Chung [EMAIL PROTECTED] writes: Happy new year to all; A few days ago, I posted similar

[R] Memory problem with jpeg() and wide jpegs

2004-12-20 Thread michael watson \(IAH-C\)
Hi, I have been creating very, very long jpeg images for the last two weeks using jpeg(). All of a sudden, and I mean that, it's stopped working - I've not changed a thing! The error message I get is: jpeg("out.jpg", width=5, height=480, quality=100) Error in devga(paste("jpeg:", quality, ":",

Re: [R] Memory problem with jpeg() and wide jpegs

2004-12-20 Thread Uwe Ligges
michael watson (IAH-C) wrote: Hi, I have been creating very, very long jpeg images for the last two weeks using jpeg(). All of a sudden, and I mean that, it's stopped working - I've not changed a thing! The error message I get is: jpeg("out.jpg", width=5, height=480, quality=100) Error in

Re: [R] Memory problem with jpeg() and wide jpegs

2004-12-20 Thread Prof Brian Ripley
It is also several times greater than the limit of human perception, being several feet long at printing resolutions that need a magnifying glass to see. This is Windows and the limit is in the graphics card: mine is able to do this but I suspect you need a 128Mb card (that jpeg is of itself

RE: [R] Memory problem with jpeg() and wide jpegs

2004-12-20 Thread michael watson \(IAH-C\)
: Mon 12/20/2004 5:37 PM To: Uwe Ligges Cc: michael watson (IAH-C); [EMAIL PROTECTED] Subject:Re: [R] Memory problem with jpeg() and wide jpegs It is also several times greater than the limit of human perception, being several feet long at printing resolutions that need a magnifying

[R] memory problem

2004-12-08 Thread Samuel Kemp
Hi, I am trying to run a very computationally expensive procedure in R 2.0.0, and the process always gets killed after approx. 8 minutes. This procedure calls some of my own C++ code - in case this code was causing a memory leak, I unload and then reload the .so file every time; however, I

RE: [R] memory problem

2004-12-08 Thread Huntsinger, Reid
PROTECTED] Subject: [R] memory problem Hi, I am trying to run a very computationally expensive procedure in R-2.0.0. and the process always gets killed after approx 8 minutes. This procedure calls some of my own C++ code - in case it was this code causing a memory leak I unload and then reload the .so

Re: [R] memory problem

2004-12-08 Thread Samuel Kemp
that there are no limits on memory allocation set by the administrator. Reid Huntsinger -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Samuel Kemp Sent: Wednesday, December 08, 2004 6:29 AM To: [EMAIL PROTECTED] Subject: [R] memory problem Hi, I am trying to run a very

RE: [R] memory problem

2004-12-08 Thread Huntsinger, Reid
-Original Message- From: Samuel Kemp [mailto:[EMAIL PROTECTED] Sent: Wednesday, December 08, 2004 11:46 AM To: Huntsinger, Reid Cc: [EMAIL PROTECTED] Subject: Re: [R] memory problem Thanks. Here is some more information. My platform is a Linux desktop. The C++ code implements a Gamma

[R] Memory Problem???

2004-09-20 Thread Perez Martin, Agustin
DeaR useRs: I am working with 1 files at the same time. These files are in some lists. When I begin to operate, the memory size grows but never exceeds the computer's RAM, and suddenly R reports: Error: cannot allocate vector of size 23 Kb. Does somebody know what I can do?

Re: [R] Memory Problem???

2004-09-20 Thread Uwe Ligges
Perez Martin, Agustin wrote: DeaR useRs: I am working with 1 files at the same time. These files are in some lists. When I begin to operate, the memory size grows but never exceeds the computer's RAM, and suddenly R reports: Error: cannot allocate vector of size 23 Kb Read the
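Independently of the FAQ pointer, a few generic ways to see where the memory has gone before that tiny allocation fails (the object name in the last line is hypothetical):

  gc()                                   # current usage, and trigger a collection
  sort(sapply(ls(), function(nm) object.size(get(nm))), decreasing = TRUE)[1:10]
  rm(some.large.object); gc()            # free what is no longer needed, then collect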

[R] memory problem under windows

2004-09-14 Thread Christoph Lehmann
I have (still) some memory problems when trying to allocate a huge array: WinXP Pro, with 2GB RAM. I start R by calling: Rgui.exe --max-mem-size=2Gb (as pointed out in the R for Windows FAQ) R.Version(): i386-pc-mingw32, 9.1, 21.6.2004 ## and here the problem x.dim <- 46 y.dim <- 58 slices <- 40
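The snippet truncates before the fourth dimension of the array, so the sketch below uses an assumed n.vols; estimating the allocation before requesting it makes the 2 GB Windows ceiling easy to anticipate:

  x.dim <- 46; y.dim <- 58; slices <- 40
  n.vols <- 1000                                   # assumed; the real value is cut off above
  prod(x.dim, y.dim, slices, n.vols) * 8 / 2^20    # ~814 MiB of doubles for a single copy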

Re: [R] memory problem under windows

2004-09-14 Thread James W. MacDonald
Christoph Lehmann wrote: I have (still) some memory problems, when trying to allocate a huge array: WinXP pro, with 2G RAM I start R by calling: Rgui.exe --max-mem-size=2Gb (as pointed out in R for windows FAQ) Not sure that it actually says to use 2Gb there. You might try

Re: [R] memory problem under windows

2004-09-14 Thread Prof Brian Ripley
Did you read the *rest* of what the rw-FAQ says? Be aware though that Windows has (in most versions) a maximum amount of user virtual memory of 2Gb, and parts of this can be reserved by processes but not used. The version of the memory manager used from R 1.9.0 allocates large objects in

[R] memory problem?

2004-07-12 Thread Tianyu Tom Wang
Hi everyone, I'm running R 1.9.1 on RedHat Linux. I'm trying to read in a matrix file with dimensions 13956 by 858. I realize this is pretty huge, though I think the amount of memory I have should be able to handle it. R reads the entire file and tells me Read in 11974247 values. This is
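For a purely numeric file of that size, two lower-overhead ways to get it into a matrix (the file name is a placeholder; the snippet does not show which reader was actually used):

  x <- matrix(scan("matrix.txt", what = double()),
              nrow = 13956, ncol = 858, byrow = TRUE)   # scan() avoids data.frame overhead
  # or keep read.table() but stop it from guessing column types:
  x <- read.table("matrix.txt", colClasses = rep("numeric", 858), nrows = 13956)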

Re: [R] memory problem?

2004-07-12 Thread Duncan Murdoch
On Mon, 12 Jul 2004 21:53:34 -0400 (EDT), Tianyu Tom Wang [EMAIL PROTECTED] wrote: Hi everyone, I'm running R1.9.1 on RedHat Linux. I'm trying to read in a matrix file with 13956 by 858 dimensions. I realize this is pretty huge, though I think the amount of memory I have should be able to

Re: [R] memory problem?

2004-07-12 Thread Roger D. Peng
I doubt this is a memory problem, considering that R reported that it read in the data! What exactly were the commands that you used to read in the data? -roger Tianyu Tom Wang wrote: Hi everyone, I'm running R1.9.1 on RedHat Linux. I'm trying to read in a matrix file with 13956 by 858

Re: [R] memory problem

2004-03-09 Thread James MacDonald
How many chips you can read is a function of how much RAM you have and what chip it is. On a unix/linux box you will be able to read in and process 143 of the HG-u95aV2 chips if you have about 2 Gb RAM. For the larger U133A chips (RAE/MOE are about the same size), you will probably need almost

Re: [R] memory problem

2004-03-08 Thread Peter Dalgaard
Joshi, Nina (NIH/NCI) [EMAIL PROTECTED] writes: I am trying to load 143 Affymetrix chips into R on the NIH Nimbus server. I can load 10 chips without a problem; however, when I try to load 143 I receive an error message: cannot create a vector of 523263 KB. I have expanded the

Re: [R] memory problem for R --Summary

2004-02-02 Thread Yun-Fang Juan
[EMAIL PROTECTED] Cc: [EMAIL PROTECTED] Sent: Friday, January 30, 2004 11:44 AM Subject: RE: [R] memory problem for R You still have not read the posting guide, have you? See more below. From: Yun-Fang Juan [...] I tried 10% sample and it turned out the matrix became singular after I

Re: [R] memory problem for R

2004-01-30 Thread Prof Brian Ripley
- Original Message - From: Yun-Fang Juan [EMAIL PROTECTED] To: [EMAIL PROTECTED] Sent: Thursday, January 29, 2004 7:03 PM Subject: [R] memory problem for R Hi, I try to use lm to fit a linear model with 600k rows and 70 attributes. But I can't even load the data into the R

Re: [R] memory problem for R

2004-01-30 Thread Spencer Graves
: Thursday, January 29, 2004 7:03 PM Subject: [R] memory problem for R Hi, I try to use lm to fit a linear model with 600k rows and 70 attributes. But I can't even load the data into the R environment. The error message says the vector memory is used up. Is there anyone having experience with large

Re: [R] memory problem for R

2004-01-30 Thread Yun-Fang Juan
: [R] memory problem for R Hi, I try to use lm to fit a linear model with 600k rows and 70 attributes. But I can't even load the data into the R environment. The error message says the vector memory is used up. Is there anyone having experience with large datasets in R? (I bet

[R] memory problem for R

2004-01-29 Thread Yun-Fang Juan
Hi, I am trying to use lm to fit a linear model with 600k rows and 70 attributes, but I can't even load the data into the R environment. The error message says the vector memory is used up. Does anyone have experience with large datasets in R? (I bet) Please advise. Thanks, Yun-Fang

RE: [R] memory problem for R

2004-01-29 Thread Liaw, Andy
Have you read the posting guide for R-help? You need to tell us more: what hardware/OS/version of R are you using? A rough calculation of the storage needed: 6e5 * 70 * 8 / 1024^2 gives 320.4346, so you need 320+ MB of RAM just to store the data as a matrix of doubles in R. You need enough RAM to

RE: [R] memory problem in exporting data frame

2003-09-09 Thread Henrik Bengtsson
Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Thomas W Blackwell Sent: den 9 september 2003 01:28 To: array chip Cc: [EMAIL PROTECTED] Subject: Re: [R] memory problem in exporting data frame Simplest is to save your workspace using save.image(), then delete

RE: [R] memory problem in exporting data frame

2003-09-09 Thread array chip
Cc: [EMAIL PROTECTED] Subject: Re: [R] memory problem in exporting data frame Simplest is to save your workspace using save.image(), then delete a bunch of large objects other than the data frame that you want to export, and run write.table() again, now that you've made

Re: [R] memory problem in exporting data frame

2003-09-09 Thread Peter Dalgaard BSA
array chip [EMAIL PROTECTED] writes: Hi all, Thanks for all the suggestions. I was able to get the data frame out by first deleting some other large objects in the workspace, then converting the data frame into a matrix with as.matrix(), splitting the matrix into 4 blocks, and finally using
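A minimal sketch of that block-wise export, with big.df and the output file name as hypothetical placeholders:

  m <- as.matrix(big.df)
  rm(big.df); gc()                                   # drop the data-frame copy before writing
  blocks <- split(seq_len(nrow(m)), cut(seq_len(nrow(m)), 4, labels = FALSE))
  for (b in blocks)
    write.table(m[b, , drop = FALSE], "out.txt",
                append = TRUE, col.names = FALSE, quote = FALSE)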

RE: [R] memory problem in exporting data frame

2003-09-09 Thread Henrik Bengtsson
-Original Message- From: array chip [mailto:[EMAIL PROTECTED] Sent: den 9 september 2003 19:04 To: Henrik Bengtsson; 'Thomas W Blackwell'; Patrick Burns Cc: [EMAIL PROTECTED] Subject: RE: [R] memory problem in exporting data frame Hi all, Thanks for all the suggestions. I

Re: [R] memory problem in exporting data frame

2003-09-08 Thread Patrick Burns
I had a similar problem not long ago. My solution was to look at the definition of write.table and essentially do it by hand. The key steps are to create a matrix of characters that includes the dimnames (if desired), and then use writeLines to put that into a file. My machine has 1G as well

Re: [R] memory problem in exporting data frame

2003-09-08 Thread array chip
Patrick, Thanks for the suggestion. Do you mean you need to change each row of the data frame into a text string using something like paste(data[1,], collapse='\t') and then output the resulting character vector to a file using writeLines? It does not seem to work with my data, mainly because my

Re: [R] memory problem in exporting data frame

2003-09-08 Thread Patrick Burns
Yes, you have the operation precisely right. What happens if you coerce your data frame to a matrix: data.mat <- as.matrix(data) and then do the paste and writeLines? Pat array chip wrote: Patrick, Thanks for the suggestion. Do you mean you need to change each row of the data frame into a
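A sketch of the paste()/writeLines() route being discussed, writing row by row so the full character matrix never has to exist at once; data follows the thread, while the output file name and the connection handling are added here:

  data.mat <- as.matrix(data)               # coerce the data frame first, as suggested above
  con <- file("out.txt", open = "w")
  for (i in seq_len(nrow(data.mat)))
    writeLines(paste(data.mat[i, ], collapse = "\t"), con)
  close(con)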

[R] Memory-problem?

2003-08-11 Thread Unternährer Thomas, uth
Hi, I have a big problem with my R-script. It seems to be a memory problem, but I'm not sure. My script: test.window <- function(stat, some arguments){ several ifs and ifs in ifs (if(){...if(){...}}) } ... for (ii in 1 : length(data)){ ## data is a vector of length 2500 stat <-

Re: [R] Memory-problem?

2003-08-11 Thread Ko-Kang Kevin Wang
On Mon, 11 Aug 2003, Unternährer Thomas, uth wrote: test.window <- function(stat, some arguments){ several ifs and ifs in ifs (if(){...if(){...}}) } ... for (ii in 1 : length(data)){ ## data is a vector of length 2500 stat <- test.window( some arguments ) ## there are 15