On Tue, 10 May 2011 00:25:08 -0500
DRC <dcomman...@users.sourceforge.net> wrote:

> I have to admit that I really don't see a clean solution to the dynamic
> quality adjustment problem at the moment.  That doesn't mean that we
> can't implement a crude mechanism based on setting a pre-defined JPEG
> quality corresponding to a particular range of updates/second, but such
> a mechanism wouldn't really correspond to application frames/second in
> most cases.

After more thinking I understand why there is no easy ideal solution.
I have written some code based on the UPS (updates/second), with an objective
of 20 UPS, that dynamically adapts the quality. It works pretty well... when
FPS = UPS. It gives awful results on e.g. OpenOffice or Gedit (they are, by
the way, relevant test cases, because the automatic quality adjustment
algorithm must be able to detect that the link is idle when running those
apps, in order to avoid setting the quality to 10 + grayscale, as my
prototype currently does).

I feel that updates per second might not work after all, because it makes too
many assumptions about the behavior of the application, and it did not take
much testing to notice that those assumptions don't hold in the wild.

TigerVNC computes a "max bandwidth" for the link. Perhaps we can try the
following approach:

(with a 2-second sampling interval or so)
- dynamically evaluate BWmax = maximal bandwidth over the last minute or so
  (size / wait time, measured when we know we expect a rectangle)
- evaluate BW = average bandwidth (size / total interval time)

- if BW <= 10% of BWmax
        then the link is idle
                (dramatically) increase quality (as a special case, this is
                your ALR)
- if BW >= 95% of BWmax
        then the link is highly loaded
                decrease quality (until 90% is reached)
- else
        try to increase quality slowly until we are in the 90%-95% band
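The control loop above could be sketched roughly as follows. This is only an
illustration of the thresholds described in this mail, not VirtualGL or
TigerVNC code: the class name, the per-interval inputs (bytes sent, time
spent waiting on rectangle sends), and the exact quality step sizes are all
my own assumptions.

```python
# Hypothetical sketch of the bandwidth-based quality controller.
# All names and step sizes are assumptions for illustration only.

class QualityController:
    """Adjusts JPEG quality from the ratio of average to peak bandwidth."""

    IDLE_FRACTION = 0.10    # below this fraction of BWmax: link is idle
    LOADED_FRACTION = 0.95  # above this fraction: link is highly loaded
    TARGET_FRACTION = 0.90  # lower edge of the 90%-95% comfort band

    def __init__(self, quality=80):
        self.quality = quality  # JPEG quality, 1..100
        self.bw_max = 0.0       # peak bandwidth observed so far (bytes/s)

    def sample(self, bytes_sent, busy_time, interval):
        """Called once per ~2-second sampling interval.

        bytes_sent: bytes transmitted during the interval
        busy_time:  seconds spent actually sending (we expected a rectangle)
        interval:   total length of the sampling interval in seconds
        """
        # BWmax: size / wait time, measured only while a send was pending.
        if busy_time > 0:
            self.bw_max = max(self.bw_max, bytes_sent / busy_time)
        if self.bw_max == 0:
            return self.quality  # nothing measured yet

        # BW: average bandwidth over the whole interval.
        ratio = (bytes_sent / interval) / self.bw_max

        if ratio <= self.IDLE_FRACTION:
            # Link is idle: (dramatically) increase quality -- the ALR case.
            self.quality = min(100, self.quality + 20)
        elif ratio >= self.LOADED_FRACTION:
            # Link is highly loaded: back off toward the 90% band.
            self.quality = max(1, self.quality - 10)
        elif ratio < self.TARGET_FRACTION:
            # Below the comfort band: creep quality upward slowly.
            self.quality = min(100, self.quality + 2)
        return self.quality
```

The server would feed this one sample per interval and apply the returned
quality to the next batch of rectangles; an EWMA over BWmax would probably be
needed in practice, since a single peak sample is noisy.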

This makes one huge assumption: that the bottleneck is the network. In the
case I am interested in, it most definitely is.

Advantages:

- idle applications are properly detected, to the point of possibly doing ALR
- an overloaded link is efficiently detected and quality is decreased

Disadvantages:

- relies on a BWmax which isn't always well estimated (though in some cases,
  as a workaround, it could be input by the user)
- does not consume 100% of the link's BW

Is there anything obviously wrong with this "bandwidth-based" approach? I
think UPS had too many pitfalls, but assuming BWmax can be computed
correctly, I feel this could work. Obviously the actual thresholds have to be
tweaked. (TigerVNC does something much simpler that doesn't really work,
because it's not truly a *dynamic* quality algorithm.)

-- 
Greetings, 
A. Huillet


_______________________________________________
VirtualGL-Devel mailing list
VirtualGL-Devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/virtualgl-devel
