Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Kjetil Halvorsen
see below.

2010/11/21 Uwe Ligges lig...@statistik.tu-dortmund.de:


 On 21.11.2010 18:13, Kjetil Halvorsen wrote:

 ?save.image

 And at this point it has been running with one cpu at 100% for over an
 hour!


 It's OK to take an hour (due to memory-to-disc I/O) if it uses swap space
 heavily. A factor of 60 is not much, given memory is faster than hard discs by
 orders of magnitude.

 Uwe

It takes much more than an hour! I started a new process with the
problem yesterday around 18.00 and had to kill it this morning around
09.00. That's more than 15 hours.

Kjetil




Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Mike Marchywka






 Date: Mon, 22 Nov 2010 12:03:54 -0300
 From: kjetilbrinchmannhalvor...@gmail.com
 To: lig...@statistik.tu-dortmund.de
 CC: r-help@r-project.org
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...

 see below.

 2010/11/21 Uwe Ligges :
 
 
  On 21.11.2010 18:13, Kjetil Halvorsen wrote:

  ?save.image
 
  And at this point it has been running with one cpu at 100% for over an
  hour!
 
 
  It's OK to take an hour (due to memory-to-disc I/O) if it uses swap space
  heavily. A factor of 60 is not much, given memory is faster than hard discs by
  orders of magnitude.
 
  Uwe

 It takes much more than an hour! I started a new process with the
 problem yesterday around 18.00 and had to kill it this morning around
 09.00. That's more than 15 hours.


Again, see if you can run it under gdb, or at least look at what
tools you have to determine page faults. My brain has been corrupted
with 'dohs, but in Task Manager CPU usage drops when page faults or
lock starvation start, etc. A blocking thread should yield, IIRC. Waiting
for it to die a natural death may not be practical.

I just posted something on this after following another's suggestion but
it should be easy for you to get developer tools, execute gdb,
point it at R and then break a few times. Debuggers don't speed anything
up but presumably it gets into its limit cycle ( infinite futile loop )
within a short time. Also sometimes you get these loops due to memory corruption
with native code etc etc so confusing results may take a few different 
approaches
to figure out. 

Turning on profiling will at best destroy any memory coherence and, worse,
add to VM thrashing. At least try to determine if you are faulting all over.
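
For a rough in-R check (actual page-fault counts need OS tools such as top), you can at
least see how much memory R itself is holding against the 2 GB of physical RAM. A sketch
using only base R; the exact calls are illustrative, not a prescription:

gc()                     # Ncells/Vcells in use and their max-used high-water marks (Mb)
sum(gc()[, 2])           # rough total Mb currently held by R objects
head(sort(sapply(ls(), function(nm) object.size(get(nm))), decreasing = TRUE))  # largest objects, in bytes

If that total is already close to physical RAM, swap is almost certainly involved.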



 Kjetil
 



Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Kjetil Halvorsen
see below.

On Mon, Nov 22, 2010 at 12:13 PM, Mike Marchywka marchy...@hotmail.com wrote:





 
 Date: Mon, 22 Nov 2010 12:03:54 -0300
 From: kjetilbrinchmannhalvor...@gmail.com
 To: lig...@statistik.tu-dortmund.de
 CC: r-help@r-project.org
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...

 see below.

 2010/11/21 Uwe Ligges :
 
 
  On 21.11.2010 18:13, Kjetil Halvorsen wrote:

  ?save.image
 
  And at this point it has been running with one cpu at 100% for over an
  hour!
 
 
  It's OK to take an hour (due to memory-to-disc I/O) if it uses swap space
  heavily. A factor of 60 is not much, given memory is faster than hard discs by
  orders of magnitude.
 
  Uwe

 It takes much more than an hour! I started a new process with the
 problem yesterday around 18.00 and had to kill it this morning around
 09.00. That's more than 15 hours.


 Again, see if you can run it under gdb, or at least look at what
 tools you have to determine page faults. My brain has been corrupted
 with 'dohs, but in Task Manager CPU usage drops when page faults or
 lock starvation start, etc. A blocking thread should yield, IIRC. Waiting
 for it to die a natural death may not be practical.

Thanks. Will try. Really, I tried yesterday to run R under gdb within
emacs, but it didn't work out. What I did (in emacs 23) was typing
Ctrl-u M-x R
and then entering the option
--debugger=gdb

It starts, but the Ctrl-C signal does not have any effect!

Kjetil


 I just posted something on this after following another's suggestion but
 it should be easy for you to get developer tools, execute gdb,
 point it at R and then break a few times. Debuggers don't speed anything
 up but presumably it gets into its limit cycle ( infinite futile loop )
 within a short time. Also sometimes you get these loops due to memory 
 corruption
 with native code etc etc so confusing results may take a few different 
 approaches
 to figure out.

 Turning on profiling will at best destroy any memory coherence and, worse,
 add to VM thrashing. At least try to determine if you are faulting all over.



 Kjetil
 




Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Mike Marchywka









 Date: Mon, 22 Nov 2010 12:41:06 -0300
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
 From: kjetilbrinchmannhalvor...@gmail.com
 To: marchy...@hotmail.com
 CC: lig...@statistik.tu-dortmund.de; r-help@r-project.org

 see below.

 On Mon, Nov 22, 2010 at 12:13 PM, Mike Marchywka wrote:
 
 
 Thanks. Will try. Really, I tried yesterday to run R under gdb within
 emacs, but it didn't work out. What I did (in emacs 23) was typing
 Ctrl-u M-x R
 and then enter the option
 --debugger=gdb

 It starts, but the Ctrl-C signal does not have any effect!

 Kjetil

I rarely use gdb, but it did seem to work with R when I executed gdb from
a cygwin window, and IIRC Ctrl-C worked fine as it broke into the debugger.
I guess you could try that: start gdb and attach, or invoke R from gdb.


  


Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Kjetil Halvorsen
see  below.

On Mon, Nov 22, 2010 at 12:57 PM, Mike Marchywka marchy...@hotmail.com wrote:








 
 Date: Mon, 22 Nov 2010 12:41:06 -0300
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
 From: kjetilbrinchmannhalvor...@gmail.com
 To: marchy...@hotmail.com
 CC: lig...@statistik.tu-dortmund.de; r-help@r-project.org

 see below.

 On Mon, Nov 22, 2010 at 12:13 PM, Mike Marchywka wrote:
 
 
 Thanks. Will try. Really, I tried yesterday to run R under gdb within
 emacs, but it didn't work out. What I did (in emacs 23) was typing
 Ctrl-u M-x R
 and then enter the option
 --debugger=gdb

 It starts, but the Ctrl-C signal does not have any effect!

 Kjetil

 I rarely use gdb, but it did seem to work with R when I executed gdb from
 a cygwin window, and IIRC Ctrl-C worked fine as it broke into the debugger.
 I guess you could try that: start gdb and attach, or invoke R from gdb.



OK, thanks. I started R with
R --debugger=gdb
in a shell, outside emacs. Then it works.

I did some unsystematic sampling with Ctrl-C. Most of the time it was stuck
in memory.c, apparently doing garbage collection.
Other files which occurred were unique.c and duplicate.c.
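
The same suspicion can be cross-checked from inside R, without gdb: gcinfo() prints a line
for every collection and gc() reports the current footprint. A minimal sketch only, with
summaryRprof() standing in for whatever command is hanging:

gcinfo(TRUE)         # report every garbage collection as it happens
summaryRprof()       # the slow call; watch how often GC fires while it runs
gcinfo(FALSE)
gc()                 # current and maximum Ncells/Vcells usage in Mb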

kjetil






Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-22 Thread Mike Marchywka









 Date: Mon, 22 Nov 2010 19:59:04 -0300
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
 From: kjetilbrinchmannhalvor...@gmail.com
 To: marchy...@hotmail.com
 CC: lig...@statistik.tu-dortmund.de; r-help@r-project.org

 see below.

 On Mon, Nov 22, 2010 at 12:57 PM, Mike Marchywka wrote:
 
 
 
 
 
 
 
 
  
  Date: Mon, 22 Nov 2010 12:41:06 -0300
  Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
  From: kjetilbrinchmannhalvor...@gmail.com
  To: marchy...@hotmail.com
  CC: lig...@statistik.tu-dortmund.de; r-help@r-project.org
 
  see below.
 
  On Mon, Nov 22, 2010 at 12:13 PM, Mike Marchywka wrote:
  
  
  Thanks. Will try. Really, I tried yesterday to run R under gdb within
  emacs, but it didn't work out. What I did (in emacs 23) was typing
  Ctrl-u M-x R
  and then enter the option
  --debugger=gdb
 
  It starts, but the Ctrl-C signal does not have any effect!
 
  Kjetil
 
  I rarely use gdb, but it did seem to work with R when I executed gdb from
  a cygwin window, and IIRC Ctrl-C worked fine as it broke into the debugger.
  I guess you could try that: start gdb and attach, or invoke R from gdb.
 
 

 OK, thanks. I started R with
 R --debugger=gdb
 in a shell, outside emacs. Then it works.

 I did some unsystematic sampling with Ctrl-C. Most of the time it was stuck
 in memory.c, apparently doing garbage collection.
 Other files which occurred were unique.c and duplicate.c.


You may want to try the R-devel list for better help now, but
presumably you can get symbols somewhere and a readable
stack trace. I guess floundering with memory management
would be consistent with high CPU usage, since as far as the OS
is concerned the process is runnable. In Java you see stuff like
this with lots of temp objects being created. I guess if it
is gc, making lots of garbage and then needing a big contiguous
area could slow things down a lot.
Once you are pretty sure you stopped it in a hotspot, you can
try stepping in and out of things and see if anything looks odd.

I guess one other exploratory thing to try, which may or may not
work in R with your problem, is to get a snapshot of the memory and then use a
utility like strings to see if there is any indication of what is going on.
If objects are annotated at all, something may jump out, but it is hard to know.
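
To put the lots-of-temp-objects point in R terms, here is a toy pattern (nothing to do with
anacor itself) where nearly all the time goes to allocation and collection because the
vector is regrown on every iteration:

gcinfo(TRUE)                       # print a message at each garbage collection
x <- numeric(0)
for (i in 1:50000) x <- c(x, i)    # each c() copies the whole vector, so GC fires constantly
gcinfo(FALSE)
# preallocating instead (x <- numeric(50000); x[i] <- i inside the loop) avoids the churn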


 kjetil





Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-21 Thread Uwe Ligges



On 21.11.2010 01:30, Kjetil Halvorsen wrote:

see below.

2010/11/20 Uwe Liggeslig...@statistik.tu-dortmund.de:



On 19.11.2010 21:43, Kjetil Halvorsen wrote:


This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

I did some moderately large computation (including svd of a 560x50
matrix),
running a few minutes, and R memory increasing to about 900MB on this
2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
Then doing summaryRprof().
Then doing
?summaryRprof
and then the computer running with one of two cores at 100% for more
than an hour!

What's happening?


We do not know. What about sending a reproducible example?


I will try. But how do I send this info


Well, just send what you typed to get into that state ...

Uwe


when I have to kill
the R-process from outside?

kjetil



Best,
Uwe



(running R from within emacs-ess)
Kjetil







Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-21 Thread Uwe Ligges
OK, trying it on an 8Gb Windows machine with R-2.12.0 64-bit, it runs
within less than 2 minutes in 5Gb of RAM.


That means your machine is probably swapping heavily and is therefore 
extremely slow.


Nevertheless, this seems to be unrelated to summaryRprof(). The
anacor() call is roughly equally fast with or without Rprof() calls
around it.
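
The comparison can be reproduced along these lines (Wna is the matrix from Kjetil's
message; absolute timings will of course differ between machines):

library(anacor)
system.time(anacor(Wna, ndim = 3))            # without profiling

Rprof("Rprof.out", memory.profiling = TRUE)   # the same call with profiling active
system.time(anacor(Wna, ndim = 3))
Rprof(NULL)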


Best wishes,
Uwe




On 21.11.2010 02:38, Kjetil Halvorsen wrote:

OK. I will try to give a reproducible example.
The code I give refers to a 72x72 matrix Wna, which is given at the
end of this message.
This matrix contains NA's on the diagonal. I try a correspondence
analysis on this matrix,
with package anacor, which supports correspondence analysis of
matrices with NA's.


library(anacor)

Loading required package: scatterplot3d
Loading required package: fda
Loading required package: splines
Loading required package: zoo
Loading required package: colorspace
Loading required package: car
Loading required package: MASS
Loading required package: nnet
Loading required package: survival


Rprof(file="Rprof.out", append=FALSE, memory.profiling=TRUE)
sessionInfo()

R version 2.12.0 (2010-10-15)
Platform: x86_64-unknown-linux-gnu (64-bit)

locale:
  [1] LC_CTYPE=en_US.utf8   LC_NUMERIC=C
  [3] LC_TIME=en_US.utf8LC_COLLATE=en_US.utf8
  [5] LC_MONETARY=C LC_MESSAGES=en_US.utf8
  [7] LC_PAPER=en_US.utf8   LC_NAME=C
  [9] LC_ADDRESS=C  LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

attached base packages:
[1] splines   stats graphics  grDevices utils datasets  methods
[8] base

other attached packages:
[1] anacor_1.0-1         car_2.0-6            survival_2.35-8
[4] nnet_7.3-1           MASS_7.3-8           colorspace_1.0-1
[7] fda_2.2.5            zoo_1.6-4            scatterplot3d_0.3-31

loaded via a namespace (and not attached):
[1] fortunes_1.4-0  grid_2.12.0 lattice_0.19-13 tools_2.12.0





green.ana <- anacor(Wna, ndim=3)


I will start this command in a moment; it runs for over an hour, and
memory grows to
multiple GB (if there are too many other programs running, the process
gets killed!).
This laptop has amd64, debian squeeze, 2Gb ram, 4Gb swap. I leave it
running tonight.

After finishing this, giving some simple commands, like ls() or
?Rprof, leads to the problem described originally. Will post more info
tomorrow.

Kjetil



On Sat, Nov 20, 2010 at 9:30 PM, Kjetil Halvorsen
kjetilbrinchmannhalvor...@gmail.com  wrote:

see below.

2010/11/20 Uwe Liggeslig...@statistik.tu-dortmund.de:



On 19.11.2010 21:43, Kjetil Halvorsen wrote:


This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

I did some moderately large computation (including svd of a 560x50
matrix),
running a few minutes, and R memory increasing to about 900MB on this
2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
Then doing summaryRprof().
Then doing
?summaryRprof
and then the computer running with one of two cores at 100% for more
than an hour!

What's happening?


We do not know. What about sending a reproducible example?


I will try. But how do I send this info when I have to kill
the R-process from outside?




dput(Wna)

structure(c(NA, 0, 0, 0, 0.34, 0.114285714, 0.125, 0, 0.138461538,
0, 0, 0, 0.085714286, 0.26667, 0, 0.35, 0, 0, 0, 0.1, 0,
0, 0, 0, 0, 0, 0, 0, 0.12, 0, 0, 0.090909091, 0, 0, 0.44, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.4, 0.142857143, 0.525,
0.2, 0, 0, 0, 0, 0, 0, 0, 0, 0.225, 0.65, 0, 0, 0, 0, 0, 0, 0,
0, 0, NA, 0, 0, 0, 0.085714286, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.454545455, 0.08, 0, 0, 0, 0, 0, 0.25, 0,
0, 0, 0.714285714, 0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0, 0.18,
0.054545455, 0, 0, 0, 0.1, 0, 0.542857143, 0, 0.114285714, 0,
0, 0.25, 0.16, 0, 0, 0, 0, 0, 0, 0, 0.046153846, 0.2, 0, 0, 0,
0, 0, 0, 0, NA, 0, 0.1, 0, 0.125, 0, 0, 0, 0.085714286, 0, 0,
0.16667, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.08, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.257142857, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.114285714,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.1, NA, 0, 0.114285714, 0, 0, 0,
0, 0, 0, 0, 0, 0.1, 0, 0, 0, 0, 0.3, 0, 0.285714286,
0, 0, 0.145454545, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0.12, 0.125, 0, 0, 0.1, 0, 0.145454545, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.075, 0, 0.1, 0.2, 0.7, 0, 0, 0, 0, 0, 0,
0, 0, 0.79333, 0, 0.325, 0, NA, 0, 0, 0, 0, 0, 0, 0, 0.114285714,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.090909091,
0, 0, 0.14, 0, 0, 0, 0.185714286, 0, 0, 0, 0, 0, 0, 0, 0.054545455,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.625, 0, 0, 0,
0, 0, 0, 0, 0, 0.40667, 0, 0, 0, 0.5, 0.08, NA, 0,
0, 0.123076923, 0.9, 0.085714286, 0, 0.571428571, 0, 0.7,
0, 0.5, 0.25, 0, 0.1, 0, 0.571428571, 0, 0, 0.072727273,
0.5, 0.1, 0.24, 0, 0, 0, 0, 0.125, 0, 0.32, 0.3,
0, 0, 0.057142857, 0.25, 0, 0, 0.685714286, 0.26667, 0, 0.272727273,
0, 0, 

Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-20 Thread Uwe Ligges



On 19.11.2010 21:43, Kjetil Halvorsen wrote:

This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

I did some moderately large computation (including svd of a 560x50 matrix),
running a few minutes, and R memory increasing to about 900MB on this
2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
Then doing summaryRprof().
Then doing
?summaryRprof
and then the computer running with one of two cores at 100% for more
than an hour!

What's happening?


We do not know. What about sending a reproducible example?

Best,
Uwe



(running R from within emacs-ess)
Kjetil





Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-20 Thread Mike Marchywka





 Date: Sat, 20 Nov 2010 18:45:42 +0100
 From: lig...@statistik.tu-dortmund.de
 To: kjetilbrinchmannhalvor...@gmail.com
 CC: r-help@r-project.org
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...



 On 19.11.2010 21:43, Kjetil Halvorsen wrote:
  This is very strange. (Debian squeeze, R 2.12.0 compiled from source)
 
  I did some moderately large computation (including svd of a 560x50 matrix),
  running a few minutes, and R memory increasing to about 900MB on this
  2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
  Then doing summaryRprof().
  Then doing
  ?summaryRprof
  and then the computer running with one of two cores at 100% for more
  than an hour!
 
  What's happening?

 We do not know. What about sending a reproducible example?

 
Use the tools designed for this: on 'dohs it would be Task Manager;
I think there is top or something on Linux, but I think apropos
can give you some candidates. Usually when things go from reasonable
to infinite time due to small changes in size, it has nothing
to do with algorithm order but rather with VM.
 
 

 Best,
 Uwe


  (running R from within emacs-ess)
  Kjetil
 

   


Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-20 Thread Kjetil Halvorsen
see below.

2010/11/20 Uwe Ligges lig...@statistik.tu-dortmund.de:


 On 19.11.2010 21:43, Kjetil Halvorsen wrote:

 This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

 I did some moderately large computation (including svd of a 560x50
 matrix),
 running a few minutes, and R memory increasing to about 900MB on this
 2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
 Then doing summaryRprof().
 Then doing
 ?summaryRprof
 and then the computer running with one of two cores at 100% for more
 than an hour!

 What's happening?

 We do not know. What about sending a reproducible example?

I will try. But how do I send this info when I have to kill
the R-process from outside?

kjetil


 Best,
 Uwe


 (running R from within emacs-ess)
 Kjetil





Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-20 Thread Kjetil Halvorsen
OK. I will try to give a reproducible example.
The code I give refers to a 72x72 matrix Wna, which is given at the
end of this message.
This matrix contains NA's on the diagonal. I try a correspondence
analysis on this matrix,
with package anacor, which supports correspondence analysis of
matrices with NA's.

 library(anacor)
Loading required package: scatterplot3d
Loading required package: fda
Loading required package: splines
Loading required package: zoo
Loading required package: colorspace
Loading required package: car
Loading required package: MASS
Loading required package: nnet
Loading required package: survival

 Rprof(file="Rprof.out", append=FALSE, memory.profiling=TRUE)
 sessionInfo()
R version 2.12.0 (2010-10-15)
Platform: x86_64-unknown-linux-gnu (64-bit)

locale:
 [1] LC_CTYPE=en_US.utf8   LC_NUMERIC=C
 [3] LC_TIME=en_US.utf8LC_COLLATE=en_US.utf8
 [5] LC_MONETARY=C LC_MESSAGES=en_US.utf8
 [7] LC_PAPER=en_US.utf8   LC_NAME=C
 [9] LC_ADDRESS=C  LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

attached base packages:
[1] splines   stats graphics  grDevices utils datasets  methods
[8] base

other attached packages:
[1] anacor_1.0-1         car_2.0-6            survival_2.35-8
[4] nnet_7.3-1           MASS_7.3-8           colorspace_1.0-1
[7] fda_2.2.5            zoo_1.6-4            scatterplot3d_0.3-31

loaded via a namespace (and not attached):
[1] fortunes_1.4-0  grid_2.12.0 lattice_0.19-13 tools_2.12.0


 green.ana <- anacor(Wna, ndim=3)

I will start this command in a moment; it runs for over an hour, and
memory grows to
multiple GB (if there are too many other programs running, the process
gets killed!).
This laptop has amd64, debian squeeze, 2Gb ram, 4Gb swap. I leave it
running tonight.

After finishing this, giving some simple commands, like ls() or
?Rprof, leads to the problem described originally. Will post more info
tomorrow.

Kjetil



On Sat, Nov 20, 2010 at 9:30 PM, Kjetil Halvorsen
kjetilbrinchmannhalvor...@gmail.com wrote:
 see below.

 2010/11/20 Uwe Ligges lig...@statistik.tu-dortmund.de:


 On 19.11.2010 21:43, Kjetil Halvorsen wrote:

 This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

 I did some moderately large computation (including svd of a 560x50
 matrix),
 running a few minutes, and R memory increasing to about 900MB on this
 2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
 Then doing summaryRprof().
 Then doing
 ?summaryRprof
 and then the computer running with one of two cores at 100% for more
 than an hour!

 What's happening?

 We do not know. What about sending a reproducible example?

 I will try. But how do I send this info when I have to kill
 the R-process from outside?


 dput(Wna)
structure(c(NA, 0, 0, 0, 0.34, 0.114285714, 0.125, 0, 0.138461538,
0, 0, 0, 0.085714286, 0.26667, 0, 0.35, 0, 0, 0, 0.1, 0,
0, 0, 0, 0, 0, 0, 0, 0.12, 0, 0, 0.090909091, 0, 0, 0.44, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.4, 0.142857143, 0.525,
0.2, 0, 0, 0, 0, 0, 0, 0, 0, 0.225, 0.65, 0, 0, 0, 0, 0, 0, 0,
0, 0, NA, 0, 0, 0, 0.085714286, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.454545455, 0.08, 0, 0, 0, 0, 0, 0.25, 0,
0, 0, 0.714285714, 0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0, 0.18,
0.054545455, 0, 0, 0, 0.1, 0, 0.542857143, 0, 0.114285714, 0,
0, 0.25, 0.16, 0, 0, 0, 0, 0, 0, 0, 0.046153846, 0.2, 0, 0, 0,
0, 0, 0, 0, NA, 0, 0.1, 0, 0.125, 0, 0, 0, 0.085714286, 0, 0,
0.16667, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.08, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.257142857, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0.1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.114285714,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.1, NA, 0, 0.114285714, 0, 0, 0,
0, 0, 0, 0, 0, 0.1, 0, 0, 0, 0, 0.3, 0, 0.285714286,
0, 0, 0.145454545, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0.12, 0.125, 0, 0, 0.1, 0, 0.145454545, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.075, 0, 0.1, 0.2, 0.7, 0, 0, 0, 0, 0, 0,
0, 0, 0.79333, 0, 0.325, 0, NA, 0, 0, 0, 0, 0, 0, 0, 0.114285714,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.090909091,
0, 0, 0.14, 0, 0, 0, 0.185714286, 0, 0, 0, 0, 0, 0, 0, 0.054545455,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.625, 0, 0, 0,
0, 0, 0, 0, 0, 0.40667, 0, 0, 0, 0.5, 0.08, NA, 0,
0, 0.123076923, 0.9, 0.085714286, 0, 0.571428571, 0, 0.7,
0, 0.5, 0.25, 0, 0.1, 0, 0.571428571, 0, 0, 0.072727273,
0.5, 0.1, 0.24, 0, 0, 0, 0, 0.125, 0, 0.32, 0.3,
0, 0, 0.057142857, 0.25, 0, 0, 0.685714286, 0.26667, 0, 0.272727273,
0, 0, 0.16, 0.2, 0.6, 0, 0.1, 0, 0, 0, 0, 0.4, 0.76667,
0.8, 0, 0.296428571, 0, 0.15, 0.371428571, 0.061538462, 0.2,
0, 0.12, 0.28333, 1, 0, 0, 0, 0.25, 0, 0.16, 0, NA, 0, 0.153846154,
0, 0.342857143, 0.6, 0, 0.1, 0, 0.15, 0, 0, 0.35, 0,
0, 0, 0, 0.06, 0.290909091, 0, 0.2, 0, 0, 0.2, 0.45,
0, 0, 0, 0, 0.06667, 0, 0.56, 0.171428571, 0.075, 0, 0, 0.371428571,
0.1, 0.08, 0, 0, 

Re: [R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-20 Thread Mike Marchywka








 Date: Sat, 20 Nov 2010 21:30:38 -0300
 From: kjetilbrinchmannhalvor...@gmail.com
 To: lig...@statistik.tu-dortmund.de
 CC: r-help@r-project.org
 Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...

 see below.

 2010/11/20 Uwe Ligges :
 
 
  On 19.11.2010 21:43, Kjetil Halvorsen wrote:
 
  This is very strange. (Debian squeeze, R 2.12.0 compiled from source)
 
  I did some moderately large computation (including svd of a 560x50
  matrix),
  running a few minutes, and R memory increasing to about 900MB on this
  2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
  Then doing summaryRprof().
  Then doing
  ?summaryRprof
  and then the computer running with one of two cores at 100% for more
  than an hour!
 
  What's happening?
 
  We do not know. What about sending a reproducible example?

 I will try. But how do I send this info when I have to kill
 the R-process from outside?


Can you run it in gdb? Just break a few times and see if the stack
trace is informative. Usually in a tight loop you only need to
sample a few times to find the offender.
 
The question still remains whether you are using the other tools to
isolate some issues. Usually once memory is getting tight, you
end up doing VM. I'm not entirely sure what you are doing here:
you recognize that memory is limiting and want to see what objects
are using it, which is obviously a good approach. However, once you
instrument things, it tends to distort and slow things down.
This is especially true of less gross memory profiling, such
as issues with low-level cache hits (probably not relevant here,
but something to know about).
Something like gdb, or sampling an unmodified program, may be
more informative, depending on exactly how the R memory profiling
is implemented.
 
I haven't profiled on Linux lately, or much at all, but in 'dohs the
Task Manager CPU usage drops when you start page faulting. It is
possible much of the speed issue is due to that, and anything that
adds to memory usage could really slow things down.
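
If the instrumentation itself is a worry, the R profiler's own overhead can at least be
reduced; a sketch of the relevant knob only, with the svd call from the original post
standing in for the real computation:

Rprof("Rprof.out", interval = 0.1)           # sample every 0.1 s instead of the 0.02 s default
x <- svd(matrix(rnorm(560 * 50), 560, 50))   # stand-in workload
Rprof(NULL)                                  # stop the profiler before summarising
summaryRprof("Rprof.out")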
 
 
 
 kjetil

 
  Best,
  Uwe
 
 
  (running R from within emacs-ess)
  Kjetil
 
 

   


[R] ?summaryRprof running at 100% cpu for one hour ...

2010-11-19 Thread Kjetil Halvorsen
This is very strange. (Debian squeeze, R 2.12.0 compiled from source)

I did some moderately large computation (including svd of a 560x50 matrix),
running a few minutes, and R memory increasing to about 900MB on this
2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
Then doing summaryRprof().
Then doing
?summaryRprof
and then the computer running with one of two cores at 100% for more
than an hour!

What's happening?
(running R from within emacs-ess)
Kjetil
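
For reference, the complete profiling cycle from ?Rprof, written out as a minimal sketch
(the svd call stands in for the real computation):

Rprof("Rprof.out", memory.profiling = TRUE)   # start writing samples to Rprof.out
x <- svd(matrix(rnorm(560 * 50), 560, 50))    # the computation being profiled
Rprof(NULL)                                   # stop profiling; until this is called, every later command is profiled too
summaryRprof("Rprof.out", memory = "both")    # per-function timings plus memory statistics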

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.