Hi,
After I create a long list e.g. with
x <- vector(mode="list", length=3e9)
many bad things start to happen; for example, some operations fail with a
spurious error message:
gc()
# Error in gc() :
#   long vectors not supported yet: /home/hpages/src/R-3.3.2/src/main/memory.c:1137
Hi James,
On 11/14/2016 06:38 PM, James Collins wrote:
I have attempted to remove several .tar.gz files from the LOBSTAHS package
directory using a series of cherry-picked commits. (I had successfully
removed these .tar files a few months ago, but somehow they just reappeared
in the directory after I invoked git svn rebase during a recent attempt to
Martin, thanks for the good news, and sorry for wasting your (and others')
time by not doing my homework and querying Bugzilla first (lesson
learned!).
I have tested the new implementation from R-devel and observe a semantic
difference when playing with the parameters:
# Test script 1
g <-
Hi Ramon, Diego,
Thanks for bringing this to our attention. This week we'll update R
to 3.3.2 on the machines building release and that will trigger
re-installation of all the packages on these machines. For the devel
builds, the issue will also go away when we update R, which we'll do
in 4 or 5
Thanks Frederick.
Mark, if you have any examples to share, they would also be gratefully
received.
Paul
On 14/11/16 14:53, frede...@ofb.net wrote:
Hi Paul,
OK I tried not to make the examples too fancy.
Please let me know what you think. They should probably be amended to
support the
Hi Diego,
On Mon, 14-11-2016, at 12:25, Diego Diez wrote:
> Hi Ramon,
>
> My experience with this issue is that it requires reinstalling the
> depending packages from source (but not ggplot2 itself).
Makes sense (on my machines, I just reinstalled both, for the sake of
Function 'do_mapply' in mapply.c has the following fragment:
for (int i = 0; i < longest; i++) {
Variable 'longest' is declared as R_xlen_t, so its value can be larger than
the maximum int. In the fragment, when 'longest' is larger than the maximum
int and 'i' reaches the maximum int, i++ overflows, which is undefined
behaviour for a signed int.
I don't know if this is a bug per se, but it is undesired behavior in
read.dcf. read.dcf takes a file argument and passes it to gzfile if
it's a character:
if (is.character(file)) {
file <- gzfile(file)
on.exit(close(file))
}
This gzfile connection is passed to readLines
Fixed in ensembldb
thanks Martin!
> On 14 Nov 2016, at 11:37, Martin Morgan wrote:
>
> An interesting R-help thread (starting at
> https://stat.ethz.ch/pipermail/r-help/2016-November/443123.html) points out
> that return() is treated as a function by the R
Hi Ramon,
My experience with this issue is that it requires reinstalling the
depending packages from source (but not ggplot2 itself).
Alternatively, this should be fixed (I think) once all dependent
packages are rebuilt with the newest ggplot2 installed. See this
related post in the ggtree
Dear All,
At least some packages (partial list in links below) are failing checks in
both release and devel; the error logs for Linux, Windows, and Mac all show
"Error: processing vignette 'OncoSimulR.Rmd' failed with diagnostics:
[GeomTextRepel/PositionQuasirandom/etc] was built with an
Hi BiocDevel:
I got an error when I tried to install several packages from Bioconductor
(rtracklayer, SummarizedExperiments), so I removed BiocInstaller and
reinstalled it; now the error is gone. But I got a warning which previously
happened when I failed to install the rtracklayer package. Because this
An interesting R-help thread (starting at
https://stat.ethz.ch/pipermail/r-help/2016-November/443123.html) points
out that return() is treated as a function by the R parser. This has
some surprising consequences when it is used without parentheses, for
instance
f0 = function(i)
return (i
> nospam@altfeld-im de
> on Sun, 13 Nov 2016 13:11:38 +0100 writes:
> Dear R friends, to allow post-mortem debugging in my
> Rscript-based batch jobs I use
> tryCatch( , error = function(e) {
> dump.frames(to.file = TRUE) })
> to write
> Gábor Csárdi
> on Sun, 13 Nov 2016 20:49:57 + writes:
> Using dup() before fdopen() (and calling fclose() on the connection
> when it is closed) indeed fixes the memory leak.
>
Thank you, Gábor!
Yes I can confirm that this fixes the memory leak.
I'm