Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale

2014-08-08  Jani Nikula
On Tue, 29 Jul 2014, Daniel Vetter dan...@ffwll.ch wrote:
> On Tue, Jul 29, 2014 at 06:14:16AM -0400, Anders Kaseorg wrote:
> > On Tue, 29 Jul 2014, Hans de Goede wrote:
> > > I've been thinking a bit about this, and I believe that the right
> > > answer here is to do the linear to logarithmic mapping in

Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale

2014-07-29  Hans de Goede
Hi,

On 07/22/2014 06:32 AM, Anders Kaseorg wrote:
> [1.] One line summary of the problem:
> Native backlight regressed from logarithmic to linear scale
>
> [2.] Full description of the problem/report:
> With the new default of video.use_native_backlight=1 (commit
> v3.16-rc1~30^2~2^3), my
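
As background for the report above, a small user-space sketch (written for
illustration, not taken from the thread) that lists the interfaces under
/sys/class/backlight together with their "type" and "max_brightness"
attributes. When the native driver is in use, the active device is typically
intel_backlight with type "raw", while the ACPI interface registers as type
"firmware"; the program assumes only those standard sysfs attributes.

/*
 * list-backlights.c: illustrative sketch only, not code from this thread.
 * Prints each interface under /sys/class/backlight together with its
 * "type" (raw, firmware or platform) and "max_brightness" attributes.
 */
#include <dirent.h>
#include <stdio.h>
#include <string.h>

static int read_line(const char *base, const char *dev, const char *attr,
                     char *buf, size_t len)
{
        char path[512];
        FILE *f;

        snprintf(path, sizeof(path), "%s/%s/%s", base, dev, attr);
        f = fopen(path, "r");
        if (!f)
                return -1;
        if (!fgets(buf, (int)len, f)) {
                fclose(f);
                return -1;
        }
        fclose(f);
        buf[strcspn(buf, "\n")] = '\0';   /* strip trailing newline */
        return 0;
}

int main(void)
{
        const char *base = "/sys/class/backlight";
        char type[64], max[64];
        struct dirent *de;
        DIR *dir = opendir(base);

        if (!dir) {
                perror(base);
                return 1;
        }
        while ((de = readdir(dir)) != NULL) {
                if (de->d_name[0] == '.')
                        continue;
                if (read_line(base, de->d_name, "type", type, sizeof(type)))
                        continue;
                if (read_line(base, de->d_name, "max_brightness",
                              max, sizeof(max)))
                        continue;
                printf("%-20s type=%-10s max_brightness=%s\n",
                       de->d_name, type, max);
        }
        closedir(dir);
        return 0;
}

Building it with something like cc -O2 -o list-backlights list-backlights.c
and running it shows which interface the new default ends up driving on a
given machine.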

Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale

2014-07-29  Anders Kaseorg
On Tue, 29 Jul 2014, Hans de Goede wrote:
> I've been thinking a bit about this, and I believe that the right answer
> here is to do the linear to logarithmic mapping in user-space. The intel
> backlight interface has a type of raw, clearly signalling to userspace
> that it is a raw untranslated
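
To make the user-space proposal concrete, a minimal sketch (not from the
thread; the hard-coded intel_backlight paths and the exponential curve are
illustrative assumptions) of mapping a perceptual 0-100 level onto the raw,
linear brightness range, so that equal input steps give roughly equal
perceived brightness steps:

/*
 * set-backlight.c: illustrative sketch only, not code from this thread.
 * Maps a perceptual brightness level (0..100) onto the raw, linear
 * intel_backlight range using an exponential curve.
 * The sysfs paths below are assumptions for a typical Intel laptop.
 */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

static const char *max_path =
        "/sys/class/backlight/intel_backlight/max_brightness";
static const char *set_path =
        "/sys/class/backlight/intel_backlight/brightness";

static long read_long(const char *path)
{
        FILE *f = fopen(path, "r");
        long v = -1;

        if (!f)
                return -1;
        if (fscanf(f, "%ld", &v) != 1)
                v = -1;
        fclose(f);
        return v;
}

int main(int argc, char **argv)
{
        double percent, raw;
        long max;
        FILE *f;

        if (argc != 2) {
                fprintf(stderr, "usage: %s <percent 0-100>\n", argv[0]);
                return 1;
        }
        percent = atof(argv[1]);
        if (percent < 0.0)
                percent = 0.0;
        if (percent > 100.0)
                percent = 100.0;

        max = read_long(max_path);
        if (max <= 0)
                return 1;

        /*
         * raw = max^(percent/100): 0 maps to 1, 100 maps to max, and the
         * curve in between is exponential rather than linear.
         */
        raw = pow((double)max, percent / 100.0);

        f = fopen(set_path, "w");
        if (!f) {
                perror(set_path);
                return 1;
        }
        fprintf(f, "%ld\n", lround(raw));
        fclose(f);
        return 0;
}

Something like cc -O2 -o set-backlight set-backlight.c -lm builds it, and
./set-backlight 50 then needs write access to the sysfs node. A desktop
environment could apply the same kind of curve itself before writing to the
raw interface, which is what the messages in this thread argue for.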

Re: [Intel-gfx] PROBLEM: Native backlight regressed from logarithmic to linear scale

2014-07-29  Daniel Vetter
On Tue, Jul 29, 2014 at 06:14:16AM -0400, Anders Kaseorg wrote:
> On Tue, 29 Jul 2014, Hans de Goede wrote:
> > I've been thinking a bit about this, and I believe that the right
> > answer here is to do the linear to logarithmic mapping in user-space.
> > The intel backlight interface has a type of