On Sat, Feb 15, 2014 at 10:07:22PM +0200, Siarhei Siamashka wrote:
> 
> This seems to be a common sense. The native screen resolution
> is normally expected to be the highest resolution supported by
> a monitor. So the default EDID mode is likely to have the pixel
> clock configured exactly on the highest border of the allowed
> pixel clock frequencies range. Hence rounding the pixel clock
> up may easily drive it outside of the supported range.

(completely taking this thing out of context, as i haven't fully read up 
on this thread, yet. EDID just caught my eye.)

Sorry, but you are wrong here.

CRTs have a preferred resolution which is a function of the highest 
vertical refresh rate and the highest resolution: namely, the mode 
that was kindest on the eyes and that showed off the CRT at its best. 
Preferred really does mean preferred; it only came to mean native 
once panels became common.

Yes, CRTs are not often used anymore, but i always keep some around, as 
they are the only devices where you have some chance of seeing what 
really is going on with the display engine. And no, digital outputs do 
not rule CRTs out: there is one CRT which does DVI-D, the iiyama 
ha901d, and i have owned it since 2001, before i was even into graphics 
drivers. I also believe an HDMI-equipped CRT-based HDTV set might have 
been produced at one time.

It is also conceivable that, at reduced vertical refresh rates, a 
panel, or rather its controller, could do the scaling for you. I think 
this might end up happening in the 4k world now: panel controllers in 
cheap devices might end up downsampling for you.

So please, do not assume the preferred resolution is the maximum 
resolution. You have to search the modes lists for that.
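That search can be sketched as follows (hypothetical `struct mode` and field names, a simplified stand-in for the real DRM structures): walk the list and keep the mode with the largest pixel area. The preferred flag is carried separately and may well point at a different entry.

```c
#include <stddef.h>

/* Hypothetical, simplified mode description -- not the real
 * drm_display_mode. */
struct mode {
	int hdisplay;
	int vdisplay;
	int preferred; /* set when EDID flags this mode as preferred */
};

/* Return the index of the mode with the largest pixel area. Note that
 * this is independent of which mode is flagged preferred. */
static size_t highest_resolution(const struct mode *modes, size_t count)
{
	size_t best = 0;

	for (size_t i = 1; i < count; i++) {
		long area = (long)modes[i].hdisplay * modes[i].vdisplay;
		long best_area =
			(long)modes[best].hdisplay * modes[best].vdisplay;

		if (area > best_area)
			best = i;
	}
	return best;
}
```

On a CRT, the first (preferred) entry in such a list is often not the one this search returns, which is exactly the point.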

As for pixel clocks: EDID blocks provide pixel clock ranges. EDID fails 
to specify which technology is behind the monitor, or whether reduced 
blanking is possible, but it does specify all the information necessary 
for driving a CRT within all of its limits, which is only natural given 
when this standard was created.
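As a sketch of where those ranges live: in EDID 1.3, the monitor range limits display descriptor (tag 0xFD) carries the vertical and horizontal rate limits and the maximum pixel clock in 10MHz units. The layout below follows my reading of that spec and deliberately ignores the extended offset flags that EDID 1.4 added in byte 4:

```c
#include <stdint.h>

/* Sketch of reading a monitor range limits descriptor out of an
 * 18-byte EDID display descriptor, per EDID 1.3. EDID 1.4 offset
 * flags (byte 4) are deliberately ignored here. */
struct range_limits {
	int min_vrefresh_hz;
	int max_vrefresh_hz;
	int min_hfreq_khz;
	int max_hfreq_khz;
	int max_pixclk_khz;
};

static int parse_range_limits(const uint8_t desc[18],
			      struct range_limits *out)
{
	/* A display descriptor starts with a zero pixel clock,
	 * followed by the descriptor tag. */
	if (desc[0] || desc[1] || desc[2] || desc[3] != 0xFD)
		return -1;

	out->min_vrefresh_hz = desc[5];
	out->max_vrefresh_hz = desc[6];
	out->min_hfreq_khz = desc[7];
	out->max_hfreq_khz = desc[8];
	out->max_pixclk_khz = desc[9] * 10000; /* 10MHz units -> kHz */

	return 0;
}
```

A driver then has everything it needs to refuse a dotclock that falls outside the advertised range, rather than rounding it up past the limit.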

As for the dotclocks themselves: in my KMS driver, where i use the 
available video PLL for dotclocks, i get a pretty decent approximation. 
Even with the main PLL's frequency moving in 3MHz steps, the divider 
makes up for (most of) the rest in an almost satisfactory fashion. It 
will be a bit finicky to get the heuristics for dual display, and thus 
dual dotclocks, right. But that is not an unsolvable problem.
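The PLL-plus-divider approximation can be sketched as a brute-force search. The PLL range and divider limits below are illustrative assumptions, not the actual hardware constraints:

```c
#include <stdlib.h>

/* Sketch of dotclock approximation: a parent PLL tunable in 3MHz
 * steps feeding an integer post-divider. Range and divider limits
 * are made up for illustration. */
struct pll_setting {
	long pll_khz;
	int divider;
	long out_khz;
};

static struct pll_setting best_dotclock(long target_khz)
{
	const long pll_min = 27000, pll_max = 381000, step = 3000;
	struct pll_setting best = { 0, 0, 0 };
	long best_err = -1;

	for (long pll = pll_min; pll <= pll_max; pll += step) {
		for (int div = 1; div <= 16; div++) {
			long out = pll / div;
			long err = labs(out - target_khz);

			if (best_err < 0 || err < best_err) {
				best_err = err;
				best = (struct pll_setting){ pll, div, out };
			}
		}
	}
	return best;
}
```

With a divider in play, many common dotclocks come out exact despite the coarse 3MHz PLL steps: 65MHz is 195MHz/3, 148.5MHz is 297MHz/2. The dual-display case adds the constraint that one PLL frequency must serve two dividers at once, which is where the heuristics get finicky.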

Now that i am out of my post-FOSDEM dip, i hope to start coding big-time 
again in the next few days, and get you guys a workable first pass at a 
KMS driver.

-- The guy who added full EDID handling to X back in H1 2006, and who 
had to fight to get EDID used at all, with a quirks list, as the usual 
suspects swore by a monitor database.

-- The guy who actually digs out PLLs instead of approaching stability 
walls from another angle, like ATI suggested, and like some ATI and 
redhat employees slavishly implemented, back in the RadeonHD days.

-- 
You received this message because you are subscribed to the Google Groups 
"linux-sunxi" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/groups/opt_out.
