I posted this question on R-help, where it was suggested that I might get a better response on R-devel. So far I have received no response. The original post is here: https://stat.ethz.ch/pipermail/r-help/2020-February/465700.html
My apologies for cross-posting; I am aware it is impolite, and I should have posted on R-devel in the first place, but I wasn't sure. Here is my question again:

I am currently working through Advanced R by H. Wickham and came across the `lobstr::obj_size()` function, which appears to calculate the size of an object while taking into account whether the same object is referenced multiple times, e.g.

x <- runif(1e6)
y <- list(x, x, x)

lobstr::obj_size(y)
# 8,000,128 B

# versus:
object.size(y)
# 24000224 bytes

The "Details" section of `?object.size` says: [...] but does not detect if elements of a list are shared [...].

My questions are: (1) Is the result of `obj_size()` the "correct" one when it comes to the actual size used in memory? (2) If so, why wouldn't `object.size()` be updated to reflect the more precise calculation that `obj_size()` performs? There are probably valid reasons for this, and any insight would be greatly appreciated.
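For what it's worth, the sharing itself can be made visible with `lobstr::ref()`, which prints the memory address of each element; the following is only a small illustrative sketch and assumes the lobstr package is installed:

library(lobstr)

x <- runif(1e6)       # one double vector of roughly 8 MB
y <- list(x, x, x)    # three references to the same vector, not three copies

ref(y)                # all three list elements report the same address
obj_size(y)           # counts the shared vector once:   8,000,128 B
object.size(y)        # counts each element separately:  24000224 bytes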