On Sat, Oct 27, 2012 at 02:49:16PM +0200, Marco wrote:
> 2012-10-26 Andre Klärner:
> > (pure IMHO) Well, if I remember right, VidMode is as ancient as X11
> > itself. As RANDR was patched in later and allowed mixing physical
> > outputs into one "protocol screen", there still needed to be a way to
> > calibrate each output for itself. So I guess one has to either
> > create an ICC profile that does the inversion for each monitor and
> > use one of the screen calibration utilities to load it (probably
> > not xcalib ;) or port xcalib's invert function to a single tool
> > that runs against RANDR (maybe redshift might be a good example).
>
> I think this is overkill, since it's not essential functionality, it
> just makes reading easier. I probably wouldn't bother writing a tool
> for that.
Well, since I had some spare time, I read the xcalib and redshift sources
side by side and found this in the redshift code:
for (int i = 0; i < state->crtc_count; i++) {
    r = randr_set_temperature_for_crtc(state, i, temp, brightness, gamma);
    if (r < 0) return -1;
}
So redshift iterates via RANDR over all CRTCs that compose the X11
display and sets the temperature on each of them.
xcalib instead does the following:
if (!XF86VidModeSetGammaRamp (dpy, screen, ramp_size, r_ramp, g_ramp, b_ramp))
    warning ("Unable to calibrate display");
So xcalib sets its modified version of the gamma ramp on the whole
display, and the X11 server decides which CRTC it will hit.
I also saw that there are many other conditionals for FGLRX in the xcalib
source. I guess someone using xcalib with an AMD card might see different
behaviour. If someone here is running an AMD dual-head setup, I'd like to
hear whether it works on both screens.
Maybe one should file a bug against xcalib that it doesn't work on RANDR
setups.
Regards, Andre