For many years, when my R process starts up I've been automatically setting my preferred default plot colors, basically like so:
    my.colors <- c("black", "red", "gold", "sky blue", "green", "blue",
                   "orange", "grey", "hot pink", "brown", "sea green",
                   "cyan", "purple", "tomato1")
    require("grDevices")
    palette(my.colors)

That worked reliably in all 2.x versions of R, regardless of whether my R session was interactive or not, or whether my Linux, ssh, and screen environment had X-Windows properly set up or not. It Just Worked.

However, now in R 3.1.0 Patched (2014-04-15 r65398, on Linux), depending on whether or not I have a good X-Windows connection, it can fail like so:

    Error in .External2(C_X11, d$display, d$width, d$height, d$pointsize, :
      unable to start device X11

Simply wrapping the palette() call in try() of course keeps that error from breaking the rest of my R startup. However, occasionally the call to palette() will hang for perhaps a minute, unexpectedly locking up my R process until it finishes whatever it was doing.

But all I want to do here is set my default colors to the length-14 vector above, which seems like something that SHOULD be simple... Is there some way for me to do that reliably WITHOUT invoking all this extra X11 device machinery?

The relevant C code appears to be "palette" in "src/library/grDevices/src/colors.c" and "do_dotcallgr" (for .Call.graphics) in "src/main/dotcode.c", but I don't understand what part triggers the additional complex behavior, nor how I should avoid it.

Any advice on how I should handle this robustly? (Thanks!)

-- 
Andrew Piskorski <a...@piskorski.com>

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
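[Editor's note: a minimal sketch of the try()-wrapped startup snippet described above, with one additional, unverified idea: opening a headless pdf(NULL) device first. The assumption (mine, not confirmed by this thread) is that giving palette() a current non-interactive device keeps it from ever trying to start X11; the palette itself is a session-wide setting, so it should persist after the null device is closed.]

```r
my.colors <- c("black", "red", "gold", "sky blue", "green", "blue",
               "orange", "grey", "hot pink", "brown", "sea green",
               "cyan", "purple", "tomato1")
require("grDevices")

pdf(NULL)                             # headless null device; writes no file
                                      # (assumption: palette() records on it
                                      #  instead of starting X11)
try(palette(my.colors), silent = TRUE)  # don't let a failure break startup
dev.off()                             # close the null device; the palette
                                      # setting is global and persists
```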