Comments below:
> On 13 Dec 2018, at 14:37, Martin Maechler <maech...@stat.math.ethz.ch> wrote:
> 
>>>>>> Peter Dalgaard 
>>>>>>    on Wed, 12 Dec 2018 22:12:34 +0100 writes:
> 
>> I don't think there has been anything mentioned about slowdowns of that 
>> magnitude, but it's been 3.5 years since 3.1.3.
>> Would it be possible to narrow down what kind of code has become slow? 
> 

I explored a bit using Rprof() but made no progress; I can include the Rprof 
output if that would help. I will explore a bit more to see whether I can 
narrow it down, but so far everything I have tried (cutting out various chunks 
of the simulation) has left the speed discrepancy intact. This makes me think 
it is something very general, quite possibly at a fairly low level (though 
that is a guess).
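
For concreteness, the profiling I tried was roughly of this form (a minimal 
sketch; the output file name and the sampling interval are just placeholders):

Rprof("sim.prof", interval = 0.01)          # sample the call stack every 10 ms
source("simulationR-R.R")                   # the simulation script quoted below
Rprof(NULL)                                 # stop profiling
head(summaryRprof("sim.prof")$by.self, 20)  # functions with the most self time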

>> Since the OS version is different, I assume the first timing is historical 
>> and not easily redone, but if it is now using like 70 times as long as 
>> before, chances are that it is spending 69/70 of the time in the same few 
>> places.

The timing was done the same day. I have partitioned my hard drive and have two 
systems available, with a different version of R installed in each partition 
(part of the reason for this baroque structure is to be able to use the older, 
faster R). But before going down the “it’s the two systems” road: I noticed 
this issue when a student ran the code on a brand-new MacBook with R 3.5.1. I 
couldn’t believe how long it took, so I tried it on my (old-ish) machine with 
R 3.1.3, before I did the partitioning. Mine was 70 times faster.


> 
>> One generic, frequent cause of grief with simulations is to hold onto the 
>> fitted models in their entirety, including model frames etc., causing massive 
>> memory build-up.


The code is pretty simple, so I doubt it is that, unless there was some major 
change in memory management between 3.1 and 3.5.
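
For what it is worth, my understanding of the pattern being described is 
roughly the following (a hypothetical sketch with made-up names, not code 
from my script):

fits <- vector("list", 1000)
for (i in seq_len(1000)) {
    d   <- data.frame(x = rnorm(100), y = rnorm(100))
    fit <- lm(y ~ x, data = d)
    fits[[i]] <- fit           # keeping the whole fit retains the model frame and data
    ## fits[[i]] <- coef(fit)  # keeping only the coefficients avoids the build-up
}

The commented alternative is, I assume, the kind of fix meant: store only the 
quantities actually needed from each fit.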

> 
>> -pd
> 
> If the  simulationR-R.R  script is basically reproducible
> (i.e. does not use data or other resources that only exist on
> your computer), it would  probably be useful if you "donated"
> it to the R project by making it publicly available.  Some of
> us do have many old R versions still running, and could quickly
> try and see…
> 

I would be very happy to do that. What is the best way to do so? (It only uses 
the igraph package.)

Thank you both for your help.

> Martin Maechler (not a Mac user though)
> 
>>> On 12 Dec 2018, at 17:39 , Cowan, R (MERIT) 
>>> <r.co...@maastrichtuniversity.nl> wrote:
>>> 
>>> I am running a small simulation, and getting very different run times when 
>>> I use different versions of R. 
>>> Two set-ups using the same machine (MacBook Pro 2013 vintage)
>>> 
>>> 1. R version 3.1.3  running on system OS X 10.9.5
>>> 
>>>> system.time(source("simulationR-R.R"))
>>> 
>>> user  system elapsed 
>>> 3.890   0.061   3.965 
>>> 
>>> Compared to
>>> 
>>> 2. R version 3.5.1  running on system OS X 10.12.6
>>> 
>>>> system.time(source("simulationR-R.R"))
>>> 
>>> user  system elapsed 
>>> 277.924   2.087 280.841 
>>> 
>>> The source code is identical. This is a pretty big difference running the 
>>> same code on the same hardware.
>>> Before submitting the code, is this a known issue?
>>> 
>>> 
>>> Thanks,
>>> Robin Cowan
> 
>> -- 
>> Peter Dalgaard, Professor,
>> Center for Statistics, Copenhagen Business School
>> Solbjerg Plads 3, 2000 Frederiksberg, Denmark
>> Phone: (+45)38153501
>> Office: A 4.23
>> Email: pd....@cbs.dk  Priv: pda...@gmail.com
> 
> 
> 

_______________________________________________
R-SIG-Mac mailing list
R-SIG-Mac@r-project.org
https://stat.ethz.ch/mailman/listinfo/r-sig-mac
