On 3/22/06, Hans Kristian Rosbach <[EMAIL PROTECTED]> wrote:

> If I had only thought a little more I would have remembered that line
> length is crucial, thanks for reminding me.
>
> Regarding the last bit being correct or not in the digital domain, that
> is actually what I meant. Do we really care if it gets set to the wrong
> value if it's for example less than 10% of the time? Not having it at
> all would possibly be wrong 50% of the time. Would it not be better?

We killed the temporal dithering discussion on the basis that people
will notice the flickering.  In one sense, this would be worse, since
the flickering would be random, although its amplitude would only be
1/2 or 1/4 as large.
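(For readers outside the earlier thread: the temporal-dithering idea was to
fake extra intensity precision by alternating the LSB between frames so the
time-average approaches the fractional target.  A minimal sketch of that idea,
using simple first-order error feedback -- function name and structure are
mine, not anything proposed on the list:)

```python
def temporal_dither(target, n_frames):
    """Emit one integer level per frame so the running average of the
    emitted levels tracks the fractional `target` (error-feedback
    dithering).  The per-frame output toggles by 1 LSB, which is
    exactly the visible flicker discussed above."""
    acc = 0.0
    frames = []
    for _ in range(n_frames):
        acc += target            # accumulate the fractional target
        level = int(acc)         # emit the integer part this frame
        acc -= level             # carry the residual error forward
        frames.append(level)
    return frames

# e.g. a target of 0.25 emits a 1 every fourth frame:
frames = temporal_dither(0.25, 8)
print(frames)                    # [0, 0, 0, 1, 0, 0, 0, 1]
```

The averaging only works if the eye integrates the alternation, which is
the crux of the objection: the alternation itself is the flicker.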

I just don't think it would be an improvement to have an unstable
image over simply not having the precision in the first place.

Also, for it to be right 90% of the time, it would have to be on the
trailing edge of a transition.  If the video controller and DAC are
running off the same clock, then jitter will be the same for both,
meaning we'd always be on that edge in the same place.  Someone who
understands this better should address it, but I have a feeling that
it's a bad idea.  And I'm not even sure we could do it, even if we
wanted to.
_______________________________________________
Open-graphics mailing list
[email protected]
http://lists.duskglow.com/mailman/listinfo/open-graphics
List service provided by Duskglow Consulting, LLC (www.duskglow.com)
