Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2023-01-03 Thread Michael Niedermayer
On Tue, Jan 03, 2023 at 12:05:22AM +0100, Clément Bœsch wrote:
> On Mon, Jan 02, 2023 at 10:57:33PM +0100, Michael Niedermayer wrote:
> [...]
> > > So I did a lot of experiments, and the explanation for the desaturated
> > > output at low number of colors can be found at the end of this article:
> > > http://blog.pkh.me/p/39-improving-color-quantization-heuristics.html
> > 
> > Interesting, and it's impressive how much research you did here.
> > I hope this will get applied.
> 
> Thanks. I was actually planning to push in the next 12 hours or so, unless
> there is an objection.
> 
> > Also I hope a bit that it will get extended to include clustering as in
> > ELBG, because it seems a bit odd to have this sort of alternative filters
> > neither does all 
> 
> Yeah at some point we probably want to group the clustering and vector
> quantization logics in a common module. But there are a lot of questions
> API-wise with regard to its relationship with perceptual and other color
> systems.
> 
> > > I still think it's acceptable to lean toward desaturated colors when
> > > reducing the number of colors, but you may disagree.
> > 
> > I think a key aspect of desaturation has not been mentioned.
> > That is mixing; I mean dithering is some sort of mixing, in the sense of
> > an artist mixing several pigments/dyes/colors.
> > If you have black and white, a 50% mixture gives 50% gray. Other ratios
> > would give us all values between white and black, though with dithering
> > some ratios work better: 50% looks good, while ratios very close to
> > 0% and 100% (but not exactly 0% and 100%) look bad, with a few highly
> > visible black or white pixels in a sea of the opposing color.
> > 
> > Now this results in at least 2 things.
> > 1. We should be able to improve color quantization by this.
> >  If we have colors A and B, the (A+B)/2 point is basically free: its dither
> >  pattern looks good on any high-resolution display, and if we consider such
> >  points (there are more of course, like maybe (A+B+C+D)/4) we can cover more
> >  output colors with a smaller palette.
> 
> That's interesting. Basically you'd free certain slots of the palette if
> you detect that this particular color is at the midpoint of two others in
> the palette? And so you could use that slot for another tint…
> 
> Yeah I don't know what to do with this information; it doesn't look
> trivial to implement.

If we simplify the problem a bit and, instead of considering 3D, just look at
1D, then, for example, to represent all colors between 0 and 128 either as a
solid color or as a 50:50 mix of 2 colors, we only need these 19 values:
{0,2,6,8,10,16,28,40,52,64,76,88,100,112,118,120,122,126,128}

If we have fewer colors in 1D, we can cover the following:

Full n= 4 cover=9 avg=2.67 {0,2,6,8}
Full n= 5 cover=13 avg=3.769231 {0,2,6,10,12}
Full n= 6 cover=17 avg=4.823529 {0,2,6,10,14,16}
Full n= 7 cover=21 avg=5.857143 {0,2,4,10,16,18,20}
Full n= 8 cover=27 avg=6.89 {0,2,6,8,18,20,24,26}
Full n= 9 cover=33 avg= 9.969697 {0,2,4,10,16,22,28,30,32}
Full n=10 cover=41 avg=10.878049 {0,2,6,8,18,22,32,34,38,40}
Full n=11 cover=45 avg=10.64 {0,2,6,8,18,22,26,36,38,42,44}
Full n=12 cover=55 avg=14.727273 {0,2,6,8,18,22,32,36,46,48,52,54}
Full n=13 cover=65 avg=18.415385 {0,2,6,8,18,22,32,42,46,56,58,62,64}
Full n=14 cover=73 avg=18.931507 {0,2,6,8,18,22,32,40,50,54,64,66,70,72}
Full n=15 cover=81 avg=25.172840 {0,2,6,8,10,16,28,40,52,64,70,72,74,78,80}
Full n=16 cover=93 avg=30.193548 {0,2,6,8,10,16,28,40,52,64,76,82,84,86,90,92}
Full n=17 cover=105 avg=35.209524 {0,2,6,8,10,16,28,40,52,64,76,88,94,96,98,102,104}
Full n=18 cover=111 avg=33.261261 {0,2,6,10,14,18,20,42,44,66,68,90,92,96,100,104,108,110}
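
For reference, a tiny standalone checker for these coverage numbers (throwaway
code just for this mail, not filter code; a value counts as reachable if it is
a palette entry or the exact 50:50 midpoint of two entries):

#include <stdio.h>

#define MAXV 256

static int coverage(const int *pal, int n)
{
    int reachable[MAXV] = { 0 };
    int i, j, k;

    /* mark every palette entry and every integer midpoint (a+b)/2 */
    for (i = 0; i < n; i++)
        for (j = i; j < n; j++)
            if (((pal[i] + pal[j]) & 1) == 0)
                reachable[(pal[i] + pal[j]) / 2] = 1;
    /* count how many consecutive values starting from 0 are reachable */
    for (k = 0; k < MAXV && reachable[k]; k++)
        ;
    return k;
}

int main(void)
{
    static const int p6[]  = { 0, 2, 6, 10, 14, 16 };
    static const int p19[] = { 0, 2, 6, 8, 10, 16, 28, 40, 52, 64,
                               76, 88, 100, 112, 118, 120, 122, 126, 128 };

    printf("n=6  covers 0..%d\n", coverage(p6,  6)  - 1);  /* expect 0..16  */
    printf("n=19 covers 0..%d\n", coverage(p19, 19) - 1);  /* expect 0..128 */
    return 0;
}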

So what about 3D?
If we have a 3D cube and we build a palette from the 1D list above for
6 colors, we end up with a 6x6x6 palette of 216 colors where, with the 50:50
mixes, we cover the full 17x17x17 = 4913 cube, every point of it that is. The
same holds for the 6x6x7 (17x17x21 = 6069) case. This might be an interesting
alternative to the 332 palette, or maybe it could be used as a starting point
for an optimized palette, removing unused colors and adding new ones through
some clustering.
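
And a brute-force check of the 6x6x6 claim along the same lines (again
throwaway code, not meant for the filters; it builds the product palette and
checks every point of the 17x17x17 cube against all 50:50 pairs):

#include <stdio.h>

int main(void)
{
    static const int axis[6] = { 0, 2, 6, 10, 14, 16 };
    int pal[216][3], np = 0, reach[17][17][17] = { 0 };
    int i, j, r, g, b, missing = 0;

    /* build the 6x6x6 product palette */
    for (r = 0; r < 6; r++)
        for (g = 0; g < 6; g++)
            for (b = 0; b < 6; b++) {
                pal[np][0] = axis[r];
                pal[np][1] = axis[g];
                pal[np][2] = axis[b];
                np++;
            }
    /* mark every palette color and every 50:50 midpoint of two colors */
    for (i = 0; i < np; i++)
        for (j = i; j < np; j++) {
            int s0 = pal[i][0] + pal[j][0];
            int s1 = pal[i][1] + pal[j][1];
            int s2 = pal[i][2] + pal[j][2];
            if (!(s0 & 1) && !(s1 & 1) && !(s2 & 1))
                reach[s0 / 2][s1 / 2][s2 / 2] = 1;
        }
    for (r = 0; r <= 16; r++)
        for (g = 0; g <= 16; g++)
            for (b = 0; b <= 16; b++)
                missing += !reach[r][g][b];
    printf("missing points in the 17x17x17 cube: %d\n", missing); /* expect 0 */
    return 0;
}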

I didn't try any of this on an actual image. It's not as simple as just using
that palette, as the dither algorithm also needs to be redesigned to actually
use the right color pairs. Dithering would generally use the closest color,
and that's not true here for many pairs.

Also, gamma needs to be handled correctly for all this, because mixing white
and black pixels will only look like 50% gray when gamma is handled correctly.
As I didn't try it, I have no idea how bad it would look. I was primarily
interested in the numeric/math aspect behind this, which is why I played a bit
with these numbers.
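
To illustrate the gamma point with the usual sRGB transfer function (again
only a sketch, nothing FFmpeg-specific): a 50:50 spatial mix of two sRGB
values adds up in linear light, so the equivalent solid color is the linear
average encoded back to sRGB, not the naive sRGB average.

#include <math.h>
#include <stdio.h>

static double srgb_to_linear(double s)
{
    return s <= 0.04045 ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4);
}

static double linear_to_srgb(double l)
{
    return l <= 0.0031308 ? l * 12.92 : 1.055 * pow(l, 1.0 / 2.4) - 0.055;
}

int main(void)
{
    double a = 0.0, b = 1.0;  /* black and white, sRGB in 0..1 */
    double mix = linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2);

    /* prints roughly 0.735, i.e. 8-bit ~188, not 128 */
    printf("50:50 black/white dither is equivalent to sRGB %.3f (~%d/255)\n",
           mix, (int)lround(mix * 255));
    return 0;
}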

thx

[...]
-- 
Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2023-01-02 Thread Clément Bœsch
On Mon, Jan 02, 2023 at 10:57:33PM +0100, Michael Niedermayer wrote:
[...]
> > So I did a lot of experiments, and the explanation for the desaturated
> > output at low number of colors can be found at the end of this article:
> > http://blog.pkh.me/p/39-improving-color-quantization-heuristics.html
> 
> Interesting, and it's impressive how much research you did here.
> I hope this will get applied.

Thanks. I was actually planning to push in the next 12 hours or so, unless
there is an objection.

> Also I hope a bit that it will get extended to include clustering as in
> ELBG, because it seems a bit odd to have this sort of alternative filters
> neither does all 

Yeah at some point we probably want to group the clustering and vector
quantization logics in a common module. But there are a lot of questions
API-wise with regard to its relationship with perceptual and other color systems.

> > I still think it's acceptable to lean toward desaturated colors when
> > reducing the number of colors, but you may disagree.
> 
> I think a key aspect of desaturation has not been mentioned.
> That is mixing; I mean dithering is some sort of mixing, in the sense of
> an artist mixing several pigments/dyes/colors.
> If you have black and white, a 50% mixture gives 50% gray. Other ratios
> would give us all values between white and black, though with dithering
> some ratios work better: 50% looks good, while ratios very close to
> 0% and 100% (but not exactly 0% and 100%) look bad, with a few highly
> visible black or white pixels in a sea of the opposing color.
> 
> Now this results in at least 2 things.
> 1. We should be able to improve color quantization by this.
>  If we have colors A and B, the (A+B)/2 point is basically free: its dither
>  pattern looks good on any high-resolution display, and if we consider such
>  points (there are more of course, like maybe (A+B+C+D)/4) we can cover more
>  output colors with a smaller palette.

That's interesting. Basically you'd free certain slots of the palette if
you detect that this particular color is at the midpoint of two others in
the palette? And so you could use that slot for another tint…

Yeah I don't know what to do with this information; it doesn't look
trivial to implement.

> 2. Desaturation happens in dithered images because colors are simply not
>  representable, the same way an artist can't paint 100% white if the
>  brightest color she has is 80% white. She can't mix that with anything to
>  make it brighter. An algorithm which would ensure that the colors of the
>  palette form a convex hull around all the colors of the input would ensure
>  all colors are representable, and no desaturation should happen. It may of
>  course look bad, I don't know; a convex hull likely is not the global
>  optimum from a perceptual POV. But one only needs 8 colors to guarantee
>  all colors are representable with dithering.

I feel like a cheap hack would be to create a filter such as
"palettesource" which generates a palette using OkLCh (same as OkLab but in
cylindrical form, where the hue is an angle) to design such a palette. That's
easy to do, and you could immediately test it by feeding it to paletteuse.

>  Another way to maybe see this is that if you have 1 color, the best place
>  is the one where it minimizes the distance to all. But as more points are
>  added, average points between them become usable in a dithered image, so
>  the interior starts filling up while the perimeter and outside are harder
>  to represent.
>  One could also say that with 2 colors all points on the line joining
>  them can be represented, so the distance to that line could be minimized;
>  but since not really all points on that line form pleasing dither patterns,
>  I am hesitant about this representation. It can be extended to a triangle
>  and so forth with more points, though.
>  
> Now I hope I have not given you any ideas that make you spend more months on
> this if you don't enjoy it :) But I find the whole thing a bit interesting myself.

Heh, yeah I'm already onto another crazy topic currently so you'll have to
do it on your own :)

BTW it was rightfully pointed out to me that in addition to the box and
the axis selections, there is a 3rd aspect to study: the median cut
itself.

There is likely something better to do here that would use the values
themselves instead of just a cut at the median of the set, specifically if
there are large gaps in the values. For example, [1,2,3,6,7,8,231,255]
(assuming weights of 1) would be cut into [1,2,3,6] [7,8,231,255], whereas
[1,2,3,6,7,8] [231,255] would probably be much more appropriate.
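
Something naive along those lines would be to cut at the largest gap instead
of at the median; here is a throwaway sketch on the example above
(illustrative only, not the actual palettegen box code):

#include <stdio.h>

/* return the index of the first element of the right half */
static int cut_at_largest_gap(const int *v, int n)
{
    int i, best = 1;
    for (i = 1; i < n; i++)
        if (v[i] - v[i - 1] > v[best] - v[best - 1])
            best = i;
    return best;
}

int main(void)
{
    static const int v[] = { 1, 2, 3, 6, 7, 8, 231, 255 };
    int n = 8, cut = cut_at_largest_gap(v, n), i;

    printf("median cut:      [1,2,3,6] [7,8,231,255]\n");
    printf("largest-gap cut:");
    for (i = 0; i < n; i++)
        printf("%s%d", i == 0 ? " [" : i == cut ? "] [" : ",", v[i]);
    printf("]\n");  /* expected: [1,2,3,6,7,8] [231,255] */
    return 0;
}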

It might help address the bias toward L* for low numbers of colors,
where these irregularities are particularly common (and tend to smooth out
over cuts, because typically [7,8,231,255] is likely to be cut again soon
due to its variance).

I feel like it might not be that hard to actually improve the low color
count case by trying out some alternatives. But there are many cut
approaches which need to be measured.

I'm happy to provide 

Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2023-01-02 Thread Michael Niedermayer
Hi

On Sat, Dec 31, 2022 at 01:11:54PM +0100, Clément Bœsch wrote:
> On Sun, Nov 06, 2022 at 06:30:22PM +0100, Michael Niedermayer wrote:
> > On Sun, Nov 06, 2022 at 06:09:41PM +0100, Michael Niedermayer wrote:
> > > On Sat, Nov 05, 2022 at 04:26:02PM +0100, Clément Bœsch wrote:
> > > > Hi,
> > > > 
> > > > This patchset essentially fixes a few core problems in these filters and
> > > > switches to a perceptual model.
> > > > 
> > > > I've generated a report for each key commit on this (temporary) page:
> > > > http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add 
> > > > some lazy
> > > > loading of the images but I'm not sure it's actually working as 
> > > > expected).
> > > 
> > > I just looked at file00 at 16 and 64 colors with dither, and they look
> > > different; some areas look better before and some better afterwards.
> > 
> > I looked at more of the 16 color cases with dither
> > (16 colors, as I assumed fewer colors would magnify any issues):
> > file 01, IMHO current looks better than last (variance per axis)
> > file 02, IMHO current looks better than last (variance per axis)
> > file 03, IMHO VPA looks better, but both are really quite off in terms of
> >  color; that's not the color of the original image.
> > file 04, VPA is not good; that's not the correct color
> > 
> > It seems the last (variance per axis) is more pale and loses color.
> 
> So I did a lot of experiments, and the explanation for the desaturated
> output at low number of colors can be found at the end of this article:
> http://blog.pkh.me/p/39-improving-color-quantization-heuristics.html

Interesting, and it's impressive how much research you did here.
I hope this will get applied. Also I hope a bit that it will get
extended to include clustering as in ELBG, because it seems a bit odd
to have this sort of alternative filters neither does all 


> 
> I still think it's acceptable to lean toward desaturated colors when
> reducing the number of colors, but you may disagree.

I think a key aspect of desaturation has not been mentioned.
That is mixing; I mean dithering is some sort of mixing, in the sense of
an artist mixing several pigments/dyes/colors.
If you have black and white, a 50% mixture gives 50% gray. Other ratios
would give us all values between white and black, though with dithering
some ratios work better: 50% looks good, while ratios very close to
0% and 100% (but not exactly 0% and 100%) look bad, with a few highly
visible black or white pixels in a sea of the opposing color.

Now this results in at least 2 things.
1. We should be able to improve color quantization by this.
 If we have colors A and B, the (A+B)/2 point is basically free: its dither
 pattern looks good on any high-resolution display, and if we consider such
 points (there are more of course, like maybe (A+B+C+D)/4) we can cover more
 output colors with a smaller palette.
 
2. Desaturation happens in dithered images because colors are simply not
 representable, the same way an artist can't paint 100% white if the
 brightest color she has is 80% white. She can't mix that with anything to
 make it brighter. An algorithm which would ensure that the colors of the
 palette form a convex hull around all the colors of the input would ensure
 all colors are representable, and no desaturation should happen. It may of
 course look bad, I don't know; a convex hull likely is not the global
 optimum from a perceptual POV. But one only needs 8 colors to guarantee
 all colors are representable with dithering.
 Another way to maybe see this is that if you have 1 color, the best place
 is the one where it minimizes the distance to all. But as more points are
 added, average points between them become usable in a dithered image, so
 the interior starts filling up while the perimeter and outside are harder
 to represent.
 One could also say that with 2 colors all points on the line joining
 them can be represented, so the distance to that line could be minimized;
 but since not really all points on that line form pleasing dither patterns,
 I am hesitant about this representation. It can be extended to a triangle
 and so forth with more points, though.
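
Coming back to the 8-color guarantee: one trivial way to get such a hull, as a
throwaway sketch (toy code, not the actual filters; the per-channel min/max
bounding box is a superset of the convex hull, so its 8 corners make every
input color reachable by mixing):

#include <stdint.h>
#include <stdio.h>

static void hull_corners(const uint8_t (*rgb)[3], int n, uint8_t corners[8][3])
{
    uint8_t lo[3] = { 255, 255, 255 }, hi[3] = { 0, 0, 0 };
    int i, c;

    /* per-channel bounding box of the input colors */
    for (i = 0; i < n; i++)
        for (c = 0; c < 3; c++) {
            if (rgb[i][c] < lo[c]) lo[c] = rgb[i][c];
            if (rgb[i][c] > hi[c]) hi[c] = rgb[i][c];
        }
    /* the 8 corners of that box, as guaranteed palette entries */
    for (i = 0; i < 8; i++)
        for (c = 0; c < 3; c++)
            corners[i][c] = (i >> c) & 1 ? hi[c] : lo[c];
}

int main(void)
{
    static const uint8_t pixels[][3] = { {10,200,30}, {250,40,60}, {90,90,220} };
    uint8_t corners[8][3];
    int i;

    hull_corners(pixels, 3, corners);
    for (i = 0; i < 8; i++)
        printf("corner %d: %3d %3d %3d\n",
               i, corners[i][0], corners[i][1], corners[i][2]);
    return 0;
}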
 
Now I hope I have not given you any ideas that make you spend more months on
this if you don't enjoy it :) But I find the whole thing a bit interesting myself.
 
[...]

thx
-- 
Michael GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

Any man who breaks a law that conscience tells him is unjust and willingly 
accepts the penalty by staying in jail in order to arouse the conscience of 
the community on the injustice of the law is at that moment expressing the 
very highest respect for law. - Martin Luther King Jr




Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-12-31 Thread Clément Bœsch
On Sun, Nov 06, 2022 at 06:30:22PM +0100, Michael Niedermayer wrote:
> On Sun, Nov 06, 2022 at 06:09:41PM +0100, Michael Niedermayer wrote:
> > On Sat, Nov 05, 2022 at 04:26:02PM +0100, Clément Bœsch wrote:
> > > Hi,
> > > 
> > > This patchset essentially fixes a few core problems in these filters and
> > > switches to a perceptual model.
> > > 
> > > I've generated a report for each key commit on this (temporary) page:
> > > http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add some 
> > > lazy
> > > loading of the images but I'm not sure it's actually working as expected).
> > 
> > I just looked at file00 at 16 and 64 colors with dither, and they look
> > different; some areas look better before and some better afterwards.
> 
> I looked at more of the 16 color cases with dither
> (16 colors, as I assumed fewer colors would magnify any issues):
> file 01, IMHO current looks better than last (variance per axis)
> file 02, IMHO current looks better than last (variance per axis)
> file 03, IMHO VPA looks better, but both are really quite off in terms of
>  color; that's not the color of the original image.
> file 04, VPA is not good; that's not the correct color
> 
> It seems the last (variance per axis) is more pale and loses color.

So I did a lot of experiments, and the explanation for the desaturated
output at low number of colors can be found at the end of this article:
http://blog.pkh.me/p/39-improving-color-quantization-heuristics.html

I still think it's acceptable to lean toward desaturated colors when
reducing the number of colors, but you may disagree.

Regards,

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-12-27 Thread Clément Bœsch
On Tue, Nov 08, 2022 at 10:37:59PM +, Soft Works wrote:
[...]
> For completeness, I'm also including the recent comparison, but it 
> seems you're already on track in this regard.

If you remove the alpha from the input image you'll see that it performs
pretty much as well. You can check with the last results.

Please note though that it's expected for elbg or even pngquant to get
better results, because they rely on clustering algorithms AFAIK,
which are slower (but more effective). It's a trade-off.

[...]
> Then I'd have a question about your file07 example. Is this the 
> original file or did I mix something up?
> 
> http://big.pkh.me/pal/output/0-current/file07/cfg00/0-ref.png
> 
> I'm wondering because the image is full of weird artifacts at the
> edges of the green (and other) leaves.
> 

I'm assuming this was the image with 1M+ colors; sorry I removed the file,
but yeah given the URL it was the reference file with weird artifacts
(it's a synthetic sample after all).

[...]
> PS: I'd be curious what you think about the elbg image...

If you want to look at elbg vs palette filters, make sure you disable
dithering (elbg doesn't have any), and make sure to use the sample without
transparency. You'll see that they perform mostly the same.

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-08 Thread Soft Works


> -Original Message-
> From: ffmpeg-devel  On Behalf Of
> Clément Bœsch
> Sent: Tuesday, November 8, 2022 10:08 PM
> To: FFmpeg development discussions and patches  de...@ffmpeg.org>
> Subject: Re: [FFmpeg-devel] Rework color quantization in
> palette{gen,use}
> 
> On Sun, Nov 06, 2022 at 07:46:38PM +, Soft Works wrote:
> >
> >
> > > -Original Message-
> > > From: ffmpeg-devel  On Behalf Of
> > > Clément Bœsch
> > > Sent: Saturday, November 5, 2022 4:26 PM
> > > To: ffmpeg-devel@ffmpeg.org
> > > Subject: [FFmpeg-devel] Rework color quantization in
> palette{gen,use}
> > >
> > > Hi,
> > >
> > > This patchset essentially fixes a few core problems in these
> filters
> > > and
> > > switches to a perceptual model.
> > >
> > > I've generated a report for each key commit on this (temporary)
> page:
> > > http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to
> add
> > > some lazy
> > > loading of the images but I'm not sure it's actually working as
> > > expected).
> >
> > Comparing the results for the known and simple "rainbow O" example
> reveals
> > that the proposed implementation seems to be even inferior to the
> current
> > code and even farther away from what is possible to achieve:
> >
> > https://gist.github.com/softworkz/e310e3c84a338f98977d70b09e3e3f4f
> 
> The pngquant file on this page has 373 unique colors, and the
> transparency
> is fake (the checkerboard is opaque white & grey). I think there is a
> mistake here.

Hi Clement,

I'm sorry about the confusion. The files in both Gists were created
in the same way: I opened the result image in Photoshop, set the view
size to 400%, then took a screenshot and pasted it into the Gist.
The reason I did it that way was that GitHub seemed to do its own
image "optimization" and I wanted to rule out any such effects and
just let others see what I see.

I couldn't find the original result from pngquant, but I have attached
the result from the elbg filter, which is of almost the same quality.

For completeness, I'm also including the recent comparison, but it 
seems you're already on track in this regard.


> WRT the regression after the patch, I confirm that there is a problem
> related to the dithering. If you try with dither=none or even
> dither=bayer, you'll observe that the colors are much better. I will
> update the results page at some point to include that file.

That would be great. Maybe you could also find another "simple" example,
one with large-scale gradients rather than being as strongly colored
as the others?


Then I'd have a question about your file07 example. Is this the 
original file or did I mix something up?

http://big.pkh.me/pal/output/0-current/file07/cfg00/0-ref.png

I'm wondering because the image is full of weird artifacts at the
edges of the green (and other) leaves.


> Now indeed the sierra dithering (and probably the other of the same
> type)
> are somehow spreading way too strongly, it's unclear to me yet but
> that
> might be a bug I introduced. I'll investigate, thanks.

Yup, okay, thanks.

PS: I'd be curious what you think about the elbg image...

Thanks,
softworkz






Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-08 Thread Clément Bœsch
On Sun, Nov 06, 2022 at 08:19:24AM -0500, Ronald S. Bultje wrote:
> Hi,
> 
> On Sat, Nov 5, 2022 at 2:54 PM Clément Bœsch  wrote:
> 
> > On Sat, Nov 05, 2022 at 04:44:39PM +0100, Paul B Mahol wrote:
> > [...]
> > > > Finally, I do believe a lot of other color filters could at least
> > benefit
> > > > from
> > > > fixing their gamma handling (I know I'm guilty of this in various other
> > > > filters).
> > >
> > > gamma handling depends not on pixel format but on metadata present in
> > frame.
> >
> > Right, as suggested by Ronald on IRC, maybe it would have been appropriate
> > to use the vf colorspace code to honor the transfer functions.
> >
> > That being said, this involves quite a substantial refactoring. Is this
> > considered blocking?
> >
> 
> Not for me. I'd like a big fat FIXME and I (or you, or anyone) can look
> into this at some point in the future.
> 

I will likely add a warning, or even error out if the input is not sRGB, to
limit the damage.

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-08 Thread Clément Bœsch
On Sun, Nov 06, 2022 at 06:30:22PM +0100, Michael Niedermayer wrote:
> On Sun, Nov 06, 2022 at 06:09:41PM +0100, Michael Niedermayer wrote:
> > On Sat, Nov 05, 2022 at 04:26:02PM +0100, Clément Bœsch wrote:
> > > Hi,
> > > 
> > > This patchset essentially fixes a few core problems in these filters and
> > > switches to a perceptual model.
> > > 
> > > I've generated a report for each key commit on this (temporary) page:
> > > http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add some 
> > > lazy
> > > loading of the images but I'm not sure it's actually working as expected).
> > 
> > I just looked at file00 at 16 and 64 colors with dither, and they look
> > different; some areas look better before and some better afterwards.
> 
> I looked at more of the 16 color cases with dither
> (16 colors, as I assumed fewer colors would magnify any issues):
> file 01, IMHO current looks better than last (variance per axis)
> file 02, IMHO current looks better than last (variance per axis)
> file 03, IMHO VPA looks better, but both are really quite off in terms of
>  color; that's not the color of the original image.
> file 04, VPA is not good; that's not the correct color
> 
> It seems the last (variance per axis) is more pale and loses color.
> 

You're right, the variance per axis change is not always very good; I
might dismiss it entirely.

It also makes me question the use of the variance entirely when splitting
the boxes. I need to investigate whether choosing the box with a simpler
heuristic (something naive like picking the box with the highest volume)
wouldn't actually improve things.
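
Roughly something like this (a toy comparison of the two heuristics; the Box
struct here is made up for the example and is not the filter's):

#include <stdio.h>

typedef struct {
    int    len[3];  /* extent of the box along each axis */
    double var[3];  /* color variance along each axis */
} Box;

static int pick_by_variance(const Box *b, int n)
{
    int i, best = 0;
    double best_v = -1.0;
    for (i = 0; i < n; i++) {
        double v = b[i].var[0] + b[i].var[1] + b[i].var[2];
        if (v > best_v) { best_v = v; best = i; }
    }
    return best;
}

static int pick_by_volume(const Box *b, int n)
{
    int i, best = 0;
    long best_v = -1;
    for (i = 0; i < n; i++) {
        long v = (long)b[i].len[0] * b[i].len[1] * b[i].len[2];
        if (v > best_v) { best_v = v; best = i; }
    }
    return best;
}

int main(void)
{
    Box boxes[2] = {
        { { 40, 40, 40 }, { 10.0, 12.0,  9.0 } }, /* big but fairly uniform    */
        { { 10, 10, 10 }, { 80.0, 70.0, 60.0 } }, /* small but very spread out */
    };
    printf("variance picks box %d, volume picks box %d\n",
           pick_by_variance(boxes, 2), pick_by_volume(boxes, 2));
    return 0;
}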

I'll investigate and share the results.

Thanks for looking deeply into that!

> > Have you done any double blind comparison?

Nope, I probably should, but I'm not sure I have the energy to set up such
a thing yet.

Regards,

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-08 Thread Clément Bœsch
On Sun, Nov 06, 2022 at 07:46:38PM +, Soft Works wrote:
> 
> 
> > -Original Message-
> > From: ffmpeg-devel  On Behalf Of
> > Clément Bœsch
> > Sent: Saturday, November 5, 2022 4:26 PM
> > To: ffmpeg-devel@ffmpeg.org
> > Subject: [FFmpeg-devel] Rework color quantization in palette{gen,use}
> > 
> > Hi,
> > 
> > This patchset essentially fixes a few core problems in these filters
> > and
> > switches to a perceptual model.
> > 
> > I've generated a report for each key commit on this (temporary) page:
> > http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add
> > some lazy
> > loading of the images but I'm not sure it's actually working as
> > expected).
> 
> Comparing the results for the known and simple "rainbow O" example reveals
> that the proposed implementation seems to be even inferior to the current 
> code and even farther away from what is possible to achieve:
> 
> https://gist.github.com/softworkz/e310e3c84a338f98977d70b09e3e3f4f

The pngquant file on this page has 373 unique colors, and the transparency
is fake (the checkerboard is opaque white & grey). I think there is a
mistake here.

WRT the regression after the patch, I confirm that there is a problem
related to the dithering. If you try with dither=none or even
dither=bayer, you'll observe that the colors are much better. I will
update the results page at some point to include that file.

Now indeed the sierra dithering (and probably the others of the same type)
is somehow spreading way too strongly; it's unclear to me yet, but that
might be a bug I introduced. I'll investigate, thanks.

Regards,

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-06 Thread Soft Works


> -Original Message-
> From: ffmpeg-devel  On Behalf Of
> Clément Bœsch
> Sent: Saturday, November 5, 2022 4:26 PM
> To: ffmpeg-devel@ffmpeg.org
> Subject: [FFmpeg-devel] Rework color quantization in palette{gen,use}
> 
> Hi,
> 
> This patchset essentially fixes a few core problems in these filters
> and
> switches to a perceptual model.
> 
> I've generated a report for each key commit on this (temporary) page:
> http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add
> some lazy
> loading of the images but I'm not sure it's actually working as
> expected).

Comparing the results for the known and simple "rainbow O" example reveals
that the proposed implementation seems to be even inferior to the current 
code and even farther away from what is possible to achieve:

https://gist.github.com/softworkz/e310e3c84a338f98977d70b09e3e3f4f

Regards,
softworkz


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-06 Thread Ronald S. Bultje
Hi,

On Sat, Nov 5, 2022 at 2:54 PM Clément Bœsch  wrote:

> On Sat, Nov 05, 2022 at 04:44:39PM +0100, Paul B Mahol wrote:
> [...]
> > > Finally, I do believe a lot of other color filters could at least
> benefit
> > > from
> > > fixing their gamma handling (I know I'm guilty of this in various other
> > > filters).
> >
> > gamma handling depends not on pixel format but on metadata present in
> frame.
>
> Right, as suggested by Ronald on IRC, maybe it would have been appropriate
> to use the vf colorspace code to honor the transfer functions.
>
> That being said, this involves quite a substantial refactoring. Is this
> considered blocking?
>

Not for me. I'd like a big fat FIXME and I (or you, or anyone) can look
into this at some point in the future.

Ronald


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-05 Thread Soft Works


> -Original Message-
> From: ffmpeg-devel  On Behalf Of
> Clément Bœsch
> Sent: Saturday, November 5, 2022 4:26 PM
> To: ffmpeg-devel@ffmpeg.org
> Subject: [FFmpeg-devel] Rework color quantization in palette{gen,use}
> 
> Hi,
> 
> This patchset essentially fixes a few core problems in these filters
> and
> switches to a perceptual model.
> 
> I've generated a report for each key commit on this (temporary) page:
> http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add
> some lazy
> loading of the images but I'm not sure it's actually working as
> expected).
> 
> It is easy for me to add specific samples and re-run the whole thing,
> so feel
> free to suggest one.

The "rainbow Q" image would be nice to see in comparison.

Thanks,
softworkz


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-05 Thread Clément Bœsch
On Sat, Nov 05, 2022 at 04:44:39PM +0100, Paul B Mahol wrote:
[...]
> > Finally, I do believe a lot of other color filters could at least benefit
> > from
> > fixing their gamma handling (I know I'm guilty of this in various other
> > filters).
> 
> gamma handling depends not on pixel format but on metadata present in frame.

Right, as suggested by Ronald on IRC, maybe it would have been appropriate
to use the vf colorspace code to honor the transfer functions.

That being said, this involves quite a substantial refactoring. Is this
considered blocking?

More specifically: do we have a lot of situations in which the RGB pixel
format is not sRGB?

-- 
Clément B.


Re: [FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-05 Thread Paul B Mahol
On 11/5/22, Clément Bœsch  wrote:
> Hi,
>
> This patchset essentially fixes a few core problems in these filters and
> switches to a perceptual model.
>
> I've generated a report for each key commit on this (temporary) page:
> http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add some
> lazy
> loading of the images but I'm not sure it's actually working as expected).
>
> It is easy for me to add specific samples and re-run the whole thing, so
> feel
> free to suggest one.
>
> A summary on a few important observed differences can be found on the page,
> but
> I'm duplicating it here for the record:
>
>   - Current: current state on master
>   - Paletteuse Perceptual
>   + same palette but better selection: instead of rgb triplet distance,
> it uses a
> colorspace designed for uniform perceptual color differences (OkLab)
>   + overall impact not that visible immediately, but it will make sure
> the
> palette is used the best way possible, meaning improvements to
> palettegen
> will be honored
>   + observations (with no dither):
>   * file02 (rooftops) in max_colors=8 or 16: sky pattern is more
> stable
>   * file06 (parrot) in max_colors=8: better color for the parrot
> beak
>   * overall seems to favor slightly brighter colors in the currently
> offered palette
>   - Palettegen Linear Average
>   + sRGB colors are gamma encoded, averaging them naively is incorrect,
> we
> need to do that in linear space
>   + observations (with no dither):
>   * file00 (colorful drawing) in max_colors=8: contrast and color
> skin
> look better
>   * file07 (abstract flower) in max_color=128 or 256: this picture
> composed of 1M different colors in the source is now more
> balanced
> (better spreading of the colors)
>   - Palettegen Perceptual
>   + similar to the paletteuse perceptual, we use OkLab for the color
> distance
>   + observations (with no dither):
>   * file07 (abstract flower): in max_colors=128 or 256 we can see
> the
> picture offering a much finer grain.
>   - Palettegen Variance per axis
>   + When deciding on spliting a box along an axis, instead of picking
> the
> longest one, we pick the one with the most color variance
>   + Not that much impact
>
>   Overall, the most brutal change is probably in file07 between current and
> last,
>   256 colors no dither in particular.
>
> Finally, I do believe a lot of other color filters could at least benefit
> from
> fixing their gamma handling (I know I'm guilty of this in various other
> filters).

gamma handling depends not on pixel format but on metadata present in frame.

>
> Regards,
>
> --
> Clément B.
>


[FFmpeg-devel] Rework color quantization in palette{gen,use}

2022-11-05 Thread Clément Bœsch
Hi,

This patchset essentially fixes a few core problems in these filters and
switches to a perceptual model.

I've generated a report for each key commit on this (temporary) page:
http://big.pkh.me/pal/ (warning: heavy page, ~500M; I did try to add some lazy
loading of the images but I'm not sure it's actually working as expected).

It is easy for me to add specific samples and re-run the whole thing, so feel
free to suggest one.

A summary of a few important observed differences can be found on the page,
but I'm duplicating it here for the record:

  - Current: current state on master
  - Paletteuse Perceptual
      + same palette but better selection: instead of an RGB triplet distance,
        it uses a colorspace designed for uniform perceptual color differences
        (OkLab)
      + overall impact not that visible immediately, but it will make sure the
        palette is used the best way possible, meaning improvements to
        palettegen will be honored
      + observations (with no dither):
          * file02 (rooftops) in max_colors=8 or 16: sky pattern is more stable
          * file06 (parrot) in max_colors=8: better color for the parrot beak
          * overall seems to favor slightly brighter colors in the currently
            offered palette
  - Palettegen Linear Average
      + sRGB colors are gamma encoded, so averaging them naively is incorrect;
        we need to do that in linear space
      + observations (with no dither):
          * file00 (colorful drawing) in max_colors=8: contrast and skin color
            look better
          * file07 (abstract flower) in max_colors=128 or 256: this picture,
            composed of 1M different colors in the source, is now more balanced
            (better spreading of the colors)
  - Palettegen Perceptual
      + similar to the paletteuse perceptual change, we use OkLab for the color
        distance
      + observations (with no dither):
          * file07 (abstract flower): in max_colors=128 or 256 we can see the
            picture offering a much finer grain
  - Palettegen Variance per axis
      + when deciding on splitting a box along an axis, instead of picking the
        longest one, we pick the one with the most color variance
      + not that much impact

  Overall, the most brutal change is probably in file07 between current and
  last, 256 colors no dither in particular.

Finally, I do believe a lot of other color filters could at least benefit from
fixing their gamma handling (I know I'm guilty of this in various other
filters).

Regards,

-- 
Clément B.
