On Wed, Apr 05, 2000 at 04:40:37PM +1000, Ian Boreham wrote:
> From my understanding of JPEG (which is not expert), I would have thought
> that although there might be a small loss of quality on subsequent cycles,
> due to rounding-type errors, it would be nowhere near the initial loss
> caused by discarding high-frequency, low-magnitude coefficient data.
> Discarding this data effectively sets those coefficients to zero, so
> subsequent cycles would roughly be discarding zero again.

You are correct, but only when the same implementation (and the same
quantization tables) is used for every cycle. A different codec can
introduce different rounding errors, so although this will often work
IN PRACTICE, it is not something to be relied upon.
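A toy sketch makes the point. This is not a real JPEG codec, just scalar
quantization with integer rounding (the function names are mine): once a
value has been rounded to a multiple of the step size, re-quantizing with
the SAME step is a no-op, but a different step re-rounds and loses more.

```python
# Toy model of JPEG-style lossy quantization (not a real codec).

def encode(coeffs, q):
    """Quantize: round each coefficient to the nearest multiple of q (lossy)."""
    return [round(c / q) for c in coeffs]

def decode(quantized, q):
    """Dequantize: map indices back to coefficient values (lossless)."""
    return [i * q for i in quantized]

coeffs = [137, 42, -7, 3, 1, 0, 0, 0]

# The first cycle discards information...
once = decode(encode(coeffs, 8), 8)
# ...but a second cycle with the SAME step changes nothing, because
# every value is already an exact multiple of q.
twice = decode(encode(once, 8), 8)
assert once == twice

# A different step size (a stand-in for a different implementation)
# re-rounds the data and introduces fresh loss.
other = decode(encode(once, 10), 10)
assert once != other
```

Real codecs add DCT/IDCT rounding and 8-bit clamping on top of this, which
is where the cross-implementation drift comes from.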

> Has anyone tried cycling and differencing to see the effect?

Yes. Marc is right only in principle, not in practice.

In tests on real photo data (which is the only thing you should be
shoving through a JPEG codec anyway) I found that ~1% of tiles were
damaged during the second cycle, and far fewer, if any, were changed
by subsequent cycles. This was tested with Red Hat 6.2's libjpeg
package, Gimp defaults, quality 75.
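The cycling-and-differencing experiment is easy to reproduce. Here is a
rough sketch using Pillow's bindings to libjpeg; a synthetic gradient
stands in for real photo data, and quality 75 mirrors the Gimp default
(both are my choices, adjust to taste):

```python
import io
from PIL import Image

# A small synthetic greyscale gradient as a stand-in for photo data.
img = Image.new("L", (64, 64))
img.putdata([(x + y) % 256 for y in range(64) for x in range(64)])

def jpeg_cycle(im, quality=75):
    """One compress/decompress round trip through libjpeg."""
    buf = io.BytesIO()
    im.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("L")

diffs = []
prev = img
for cycle in range(1, 6):
    cur = jpeg_cycle(prev)
    # Worst per-pixel change introduced by this cycle.
    delta = max(abs(a - b) for a, b in zip(prev.getdata(), cur.getdata()))
    diffs.append(delta)
    print(f"cycle {cycle}: max pixel delta = {delta}")
    prev = cur
```

On typical libjpeg builds the first cycle shows a large delta and later
cycles settle to zero or near zero, consistent with the tile counts above.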

I suspect that choosing a floating-point DCT implementation (which might
be faster on modern Intel hardware) would increase the damage from
subsequent cycles, but I've never tested that. For my purposes it is
enough that guessing the Q factor is a win, though not one we can
easily automate.

I have said all this before. Is there a problem with the list, or
is Marc ignoring everything I say?

