Re: Studying Lossy Image Compression Efficiency

2018-02-26 Thread Daniel Holbert
Ah, I missed that Mike had replied -- it sounds like archive.org's
Wayback Machine is the easier way to get at the study, as compared to
bothering Josh. :)

On 2/26/18 9:26 AM, Daniel Holbert wrote:
> The people.mozilla.org site was a general-purpose webserver for Mozilla
> folks, and it was decommissioned entirely over the past few years.  So,
> that's why the study link there is broken.
> 
> You'd have to ask Josh (CC'd) if he has reposted (or could repost) the
> study docs somewhere else.
> 
> ~Daniel
> 
> On 2/24/18 9:51 AM, audioscaven...@gmail.com wrote:
>> On Thursday, October 17, 2013 at 10:50:49 AM UTC-4, Josh Aas wrote:
>>> Blog post is here:
>>>
>>> https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/
>>>
>>> Study is here:
>>>
>>> http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
>>
>> Hi,
>> The link to the study is broken.
>> Is HEVC/BPG another abandoned project?
>> ___
>> dev-platform mailing list
>> dev-platform@lists.mozilla.org
>> https://lists.mozilla.org/listinfo/dev-platform
>>
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2018-02-26 Thread Daniel Holbert
The people.mozilla.org site was a general-purpose webserver for Mozilla
folks, and it was decommissioned entirely over the past few years.  So,
that's why the study link there is broken.

You'd have to ask Josh (CC'd) if he has reposted (or could repost) the
study docs somewhere else.

~Daniel

On 2/24/18 9:51 AM, audioscaven...@gmail.com wrote:
> On Thursday, October 17, 2013 at 10:50:49 AM UTC-4, Josh Aas wrote:
>> Blog post is here:
>>
>> https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/
>>
>> Study is here:
>>
>> http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
> 
> Hi,
> The link to the study is broken.
> Is HEVC/BPG another abandoned project?
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform
> 
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2018-02-24 Thread mhoye



On 2018-02-24 12:51 PM, audioscaven...@gmail.com wrote:

> On Thursday, October 17, 2013 at 10:50:49 AM UTC-4, Josh Aas wrote:
>> Blog post is here:
>>
>> https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/
>>
>> Study is here:
>>
>> http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
>
> Hi,
> The link to the study is broken.


Josh works over at Let's Encrypt now, and people.mozilla.org got 
decommed some time ago. You can still see that page over at the Wayback
Machine.


It looks like the repo it links to - 
https://github.com/bdaehlie/lossy-compression-test - hasn't seen any 
motion in some time.


- mhoye


___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2018-02-24 Thread audioscavenger
On Thursday, October 17, 2013 at 10:50:49 AM UTC-4, Josh Aas wrote:
> Blog post is here:
> 
> https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/
> 
> Study is here:
> 
> http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/

Hi,
The link to the study is broken.
Is HEVC/BPG another abandoned project?
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-11-28 Thread songofapollo
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

It would help if you would use much more distinct colors in your graphs of the 
results. It can be very hard to keep track of which is which. You used two 
shades of red/purple, and three shades of blue/green/teal. That's a bizarre 
decision for graphs meant to be easily understood.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-09-16 Thread jnoring
On Tuesday, July 15, 2014 8:34:35 AM UTC-6, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Could you post the command lines used for the various encoders?  Also, for 
mozjpeg, if you use arithmetic encoding instead of Huffman encoding, what is
the effect?

I know arithmetic encoding isn't supported by a lot of browsers, but neither
are most of the formats being tested in the study.  So it seems appropriate to
consider.
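
(For anyone who wants to try it locally: if I'm reading the libjpeg-derived tools
right, both cjpeg and jpegtran take an -arithmetic switch, e.g.

  cjpeg -quality 75 -arithmetic -outfile out_arith.jpg input.bmp
  jpegtran -arithmetic -copy none -outfile out_arith.jpg input.jpg

with decoder support being the catch, as noted above.)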
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-09-16 Thread jnoring
On Tuesday, July 15, 2014 1:38:00 PM UTC-6, stone...@gmail.com wrote:
 Would be nice if you guys just implemented JPEG2000.  It's 2014.

Based on what data?  

 Not only would you get a lot more than a 5% encoding boost, but you'd get 
 much higher quality images to boot.

Based on what data?

 If you had implemented it in 2014, everyone would support it today.  If you 
 don't implement it today, we'll wait another 15 years tuning a 25 year old 
 image algorithm while better things are available.

Just because something is new doesn't automatically imply it's better.  I've 
seen conflicting data on whether or not JPEG2000 outperforms JPEG.  And on some 
basic level, that last statement is also pretty fickle since encoder maturity 
is a huge factor in quality.

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-31 Thread janus
On Thursday, July 24, 2014 at 11:59:58 PM UTC+2, Josh Aas wrote:
>> I selected 10,000 random JPEGs that we were caching for customers and ran
>> them through mozjpeg 2.0 via jpegtran. Some interesting facts:
>
> With mozjpeg you probably want to re-encode with cjpeg rather than jpegtran.
> We added support for JPEG input to cjpeg in mozjpeg to make this possible.
> I'm not sure, but I don't think jpegtran takes advantage of much of the work
> we've done to improve compression.

Hi Josh
You write that we should re-encode with cjpeg rather than just optimize with 
jpegtran, but what settings would you use for this, if the purpose is just to 
optimize, and not actually change the format, quality and so on, in any way?

I tried with cjpeg -quality 100 -optimize -progressive but this seems to give 
me much bigger files.

I am hoping to optimize images uploaded for websites, which have already had
the quality setting changed to fit their purpose, so I am just interested in
optimizing the images losslessly, which seems like a similar case to John's.

And one other thing:
I've been testing an early version of jpegtran from mozjpeg, but after upgrading
to 2.0 my test files seem to grow by a few KB after being optimized.
Was there an error in the older versions that deleted a bit too much data, or
did the algorithm change for the worse in 2.0?
I am using jpegtran -optimize -progressive -copy none with both versions.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
On Tuesday, July 15, 2014 3:15:13 PM UTC-5, perez@gmail.com wrote:

 #1 Would it be possible to have the same algorithm that is applied to webP to 
 be applied to JPEG?

I'm not sure. WebP was created much later than JPEGs, so I'd think/hope they're 
already using some equivalent to trellis quantization.

 #2 There are some JPEG services that perceptually change the image, without 
 any noticeable artifacts. Have you tried something like that?

I'm not really sure what this means, but you can experiment with re-encoding 
with mozjpeg and find a level that saves on file size, but at which you can't 
tell the difference between the source and the re-encoded image.
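
For example, a rough sketch of that kind of sweep (using mozjpeg's cjpeg, which
accepts JPEG input as noted elsewhere in this thread; the flags are the standard
cjpeg ones and the quality values are arbitrary):

  for q in 90 85 80 75 70; do
    cjpeg -quality $q -outfile recoded_q$q.jpg source.jpg
  done

then compare each recoded_q*.jpg against the source and keep the smallest one
you can't tell apart.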
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
 Are there any plans to integrate into other tools, specifically imagemagick?
 
 Or would you leave that up to others?

For now we're going to stay focused on improving compression in mozjpeg's 
library. I think a larger improved toolchain for optimizing JPEGs would be 
great, but it's probably outside the scope of the mozjpeg project.

 While you state that you now also accept JPEG for re-compression, this
 usually involves loss of quality in the process.

Options for improving re-compression are very limited if you're not willing to 
accept any quality loss. That said, our 'jpgcrush' feature does reduce size 
significantly for progressive JPEGs without harming quality.

 Does mozjpeg have a preferred input format (for best quality/performance)?

Not really. It's probably best to input JPEG if your source image is JPEG, 
otherwise I'd probably recommend converting to BMP for use with cjpeg.
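
As a minimal sketch of that second path (assuming ImageMagick for the conversion
and the standard cjpeg flags):

  convert input.png input.bmp
  cjpeg -quality 85 -outfile output.jpg input.bmp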
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
On Friday, July 18, 2014 10:05:19 AM UTC-5, j...@cloudflare.com wrote:

 I selected 10,000 random JPEGs that we were caching for customers and ran 
 them through mozjpeg 2.0 via jpegtran. Some interesting facts:

With mozjpeg you probably want to re-encode with cjpeg rather than jpegtran. We 
added support for JPEG input to cjpeg in mozjpeg to make this possible. I'm not 
sure, but I don't think jpegtran takes advantage of much of the work we've done 
to improve compression.
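
Concretely, instead of the lossless jpegtran pass that would be something like
(a sketch, with an arbitrary quality value):

  cjpeg -quality 80 -outfile recompressed.jpg original.jpg

which is a lossy re-encode, so the quality setting is whatever loss you're
willing to accept.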

 We will continue to work with mozjpeg 2.0 experimentally with the hope that 
 run time can be brought closer to what we had before as the compression looks 
 good.

We haven't spent as much time as we'd like on run-time optimization; we've
really been focused on compression wins. We hope to spend more time on run-time
performance in the future.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-21 Thread Gabriele Svelto
On 19/07/2014 22:40, Ralph Giles wrote:
 Probably not for Firefox OS, if you mean mozjpeg. Not necessarily
 because it uses hardware, but because mozjpeg is about spending more CPU
 power to compress images. It's more something you'd use server-side or
 in creating apps. The phone uses libjpeg-turbo for image decoding, which
 is fast, just not as good at compression.

It might be useful in Firefox OS development: we routinely re-compress
PNG assets in FxOS but we never tried re-compressing our JPEG assets
(which are mostly wallpapers IIRC).

 Gabriele



___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-21 Thread Bryan Stillwell
One option that I haven't seen compared is the combination of JPEG w/ packJPG 
(http://packjpg.encode.ru/?page_id=17).  packJPG can further compress JPEG 
images another 20%+ and still reproduce the original bit-for-bit.

More details on how this is done can be found here:

http://mattmahoney.net/dc/dce.html#Section_616

To me it seems that JPEG+packJPG could be competitive with or exceed HEVC-MSP on
bits/pixel.
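
(If I remember the packJPG command line correctly, the round trip is roughly:

  packjpg photo.jpg     # produces photo.pjg, a smaller lossless repack
  packjpg photo.pjg     # restores the original photo.jpg bit-for-bit

with the tool inferring direction from the file extension; treat the exact
invocation as an assumption.)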

Bryan
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-19 Thread Caspy7
Would this code be a candidate for use in Firefox OS or does most of that 
happen in the hardware?
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-19 Thread Ralph Giles
On 2014-07-19 1:14 PM, Caspy7 wrote:

 Would this code be a candidate for use in Firefox OS or does most of that 
 happen in the hardware?

Probably not for Firefox OS, if you mean mozjpeg. Not necessarily
because it uses hardware, but because mozjpeg is about spending more CPU
power to compress images. It's more something you'd use server-side or
in creating apps. The phone uses libjpeg-turbo for image decoding, which
is fast, just not as good at compression.

 -r
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-18 Thread jgc
On Tuesday, July 15, 2014 3:34:35 PM UTC+1, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Josh,

I work for CloudFlare on many things but recently on image compression. We have 
a product called Polish that recompresses images for our customers 
automatically. As we are in the process of rolling out a new version I looked 
at mozjpeg 2.0.

I selected 10,000 random JPEGs that we were caching for customers and ran them 
through mozjpeg 2.0 via jpegtran. Some interesting facts:

1. 691 files were not compressed further. This compares with 3,471 that 
libjpeg-turbo did not compress further.

2. Of the files that did compress further, the average compression was about 3%.

3. Run time was about 1.7x the libjpeg-turbo time.

4. I've put together a small chart showing the distribution of compression that 
we saw. It's here: 
https://twitter.com/jgrahamc/status/490114514667327488/photo/1

We will continue to work with mozjpeg 2.0 experimentally with the hope that run 
time can be brought closer to what we had before as the compression looks good.
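
(For anyone who wants to run a similar experiment, a minimal sketch of such a
batch pass; the flags here are just the usual lossless-optimization ones, not
necessarily what we ran:

  for f in corpus/*.jpg; do
    jpegtran -optimize -progressive -copy none -outfile "$f.moz" "$f"
  done

recording the before/after sizes per file gives a distribution like the one
charted above.)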

John.

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-16 Thread renesd
Cool

Re: decoding.

I'm replying to this note: 

1. We're fans of libjpeg-turbo - it powers JPEG decoding in Firefox because 
its focus is on being fast, and that isn't going to change any time soon. The 
mozjpeg project focuses solely on encoding, and we trade some CPU cycles for 
smaller file sizes. We recommend using libjpeg-turbo for a standard JPEG 
library and any decoding tasks. Use mozjpeg when creating JPEGs for the Web.


Why not use hardware for JPEG decoding? It uses less memory and battery, as well as
being quicker, and it's available on many devices these days. Why use the CPU
to first convert a small amount of data into a big amount of data when that's not
needed by most hardware? Not only that, but you probably store the original
JPEG data in cache as well! The fastest decoder is the one that does nothing.
Just let the dedicated JPEG decoding hardware, or the GPU, do it.

All this talk of decoding performance seems kind of silly when JPEG decoding
performance could be improved massively this way.


best,
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Josh Aas
Study is here:

http://people.mozilla.org/~josh/lossy_compressed_image_study_july_2014/

Blog post is here:

https://blog.mozilla.org/research/2014/07/15/mozilla-advances-jpeg-encoding-with-mozjpeg-2-0/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread lange . fabian
Hello Josh,

thank you and all involved for your efforts to make the web faster.
Are there any plans to integrate into other tools, specifically imagemagick?
Or would you leave that up to others?

With all the options available for image processing one can end up building
quite a complex chain of tools and commands to produce the best output.
While you state that you now also accept JPEG for re-compression, this usually
involves loss of quality in the process.
Does mozjpeg have a preferred input format (for best quality/performance)?

Best regards
Fabian
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread stonecypher
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Would be nice if you guys just implemented JPEG2000.  It's 2014.

Not only would you get a lot more than a 5% encoding boost, but you'd get much 
higher quality images to boot.

But nobody supports JPEG2000 and we want to target something everyone can see!

If you had implemented it in 2014, everyone would support it today.  If you 
don't implement it today, we'll wait another 15 years tuning a 25 year old 
image algorithm while better things are available.

Similarly there's a reason that people are still hacking video into JPEGs and 
using animated GIFs.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread john
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Would be nice if you guys just implemented JPEG2000.  It's 2014.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread perez . m . marc
On Tuesday, July 15, 2014 10:34:35 AM UTC-4, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

#1 Would it be possible to have the same algorithm that is applied to WebP also
be applied to JPEG?

#2 There are some JPEG services that perceptually change the image, without any 
noticeable artifacts. Have you tried something like that?
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Chris Peterson

On 7/15/14 12:38 PM, stonecyp...@gmail.com wrote:

On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:

This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image Formats 
Study and the Mozilla Research blog post entitled Mozilla Advances JPEG Encoding 
with mozjpeg 2.0.


Would be nice if you guys just implemented JPEG2000.  It's 2014.

Not only would you get a lot more than a 5% encoding boost, but you'd get much 
higher quality images to boot.

But nobody supports JPEG2000 and we want to target something everyone can see!

If you had implemented it in 2014, everyone would support it today.  If you 
don't implement it today, we'll wait another 15 years tuning a 25 year old 
image algorithm while better things are available.

Similarly there's a reason that people are still hacking video into JPEGs and 
using animated GIFs.


Do Chrome and IE support JPEG2000? I can't find a clear answer online. 
The WONTFIX'd Firefox bug [1] says IE and WebKit/Blink browsers support 
JPEG2000 (but WebKit's support is only on OS X).



chris

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=36351

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Masatoshi Kimura
On 7/15/14 12:38 PM, stonecyp...@gmail.com wrote:
 Similarly there's a reason that people are still hacking video into
 JPEGs and using animated GIFs.

People are using animated GIFs, but the animated GIFs people are using may
not actually be animated GIFs [1].

(2014/07/16 5:43), Chris Peterson wrote:
 Do Chrome and IE support JPEG2000? I can't find a clear answer online.
 The WONTFIX'd Firefox bug [1] says IE and WebKit/Blink browsers support
 JPEG2000 (but WebKit's support is only on OS X).

No, IE does not support JPEG2000, but IE9+ supports JPEG XR. Chrome supports
neither, but it does support WebP [2].

[1] http://techcrunch.com/2014/06/19/gasp-twitter-gifs-arent-actually-gifs/
[2] http://xkcd.com/927/

-- 
vyv03...@nifty.ne.jp
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2014-05-09 Thread e . blackbrook
On Saturday, October 19, 2013 12:14:40 PM UTC-4, stephan...@gmail.com wrote:

 Of course, you can throw a bunch of images to some naive observers with a 
 nice web interface, but what about their screens differences? what about 
 their light conditions differences? how do you validate people for the test 
 (vision acuity, color blindness)? 

Is the goal to find the best results for the actual audience and conditions of 
usage of the web, with its naive observers of varying visual acuity, varying 
light conditions and equipment, or to find the best results for some rarified 
laboratory setup? If a format were to prove superior in the lab test but prove
not significantly different in the messy real-world test, then I think we could
conclude it wasn't worth implementing. And vice versa.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2014-03-07 Thread Jeff Muizelaar

On Feb 23, 2014, at 5:17 PM, evacc...@gmail.com wrote:

 On Monday, October 21, 2013 8:54:24 AM UTC-6, tric...@accusoft.com wrote:
 - I suppose that the final lossless step used for JPEGs was the usual 
 Huffman encoding and not arithmetic coding, have you considered testing the 
 later one independently?
 
 
 
 Uninteresting since nobody uses it - except a couple of compression gurus, 
 the AC coding option is pretty much unused in the field.
 
 Nobody uses it because there's no browser support, but that doesn't change 
 the fact that it's overwhelmingly better.  And if you're going to compare 
 JPEG to a bunch of codecs with horrible support in the real world, it seems 
 pretty unfair to hold JPEG only to features that are broadly supported.  
 Also, last I looked, the FF team refused to add support for JPEGs with 
 arithmetic encoding, even though the patent issues have long since expired 
 and it's already supported by libjpeg.
 
 IMO, it's silly not to let JPEG use optimal settings for a test like this, 
 because promulgating an entirely new standard (as opposed to improving an 
 existing one) is much more difficult.

Perhaps it's easier. However, the point was to see if new image formats
were sufficiently better than what we have now to be worth adding support for,
not to compare image formats to see which one is best.

 I would also like to see the raw libjpeg settings used; were you using float? 
  Were the files optimized?

These are easy questions for you to answer by reading the source yourself.

-Jeff
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2014-02-23 Thread evaccaro
On Monday, October 21, 2013 8:54:24 AM UTC-6, tric...@accusoft.com wrote:
  - I suppose that the final lossless step used for JPEGs was the usual 
  Huffman encoding and not arithmetic coding, have you considered testing the 
  later one independently?
 
 
 
 Uninteresting since nobody uses it - except a couple of compression gurus, 
 the AC coding option is pretty much unused in the field.

Nobody uses it because there's no browser support, but that doesn't change the 
fact that it's overwhelmingly better.  And if you're going to compare JPEG to a 
bunch of codecs with horrible support in the real world, it seems pretty unfair 
to hold JPEG only to features that are broadly supported.  Also, last I looked, 
the FF team refused to add support for JPEGs with arithmetic encoding, even 
though the patent issues have long since expired and it's already supported by 
libjpeg.

IMO, it's silly not to let JPEG use optimal settings for a test like this, 
because promulgating an entirely new standard (as opposed to improving an 
existing one) is much more difficult.

I would also like to see the raw libjpeg settings used; were you using the float
DCT? Were the files optimized?
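
(For context, the settings in question are ordinary cjpeg switches, e.g.

  cjpeg -quality 75 -dct float -optimize -outfile out.jpg input.bmp

where -dct float selects the floating-point DCT and -optimize enables optimized
Huffman tables; whether the study used them is exactly the question.)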
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-27 Thread palsto1aa
About the methodology of using identical colorspace conversion for all formats, 
the study asserts
 and manual visual spot checking did not suggest the conversion
 had a large effect on perceptual quality

I think this claim should be examined more carefully.

Take this image, for example: https://i.imgur.com/3pgvjFl.png
WebP quality 100, decoded to PNG: https://i.imgur.com/O6KKOZy.png
JPEG q99, 4:2:0 subsampling: https://i.imgur.com/jqdMv0d.jpg

WebP (in lossy mode) can't code it without visible banding. Even if you set the
quality to 100, the 256 -> 220 level range crush destroys enough information that
even if nothing were lost in the later stages it'd still show significant
problems.

How visible this is depends on the particular screen or viewing environment. I 
notice it immediately on the three devices I have at hand, but I guess on some 
screens it might not be so obvious. Here are the same 3 files with their 
contrast enhanced for easy differentiation:

Original: https://i.imgur.com/zXQ4Z5D.png
WebP: https://i.imgur.com/NBm9abp.png
JPEG: https://i.imgur.com/ASU94A7.png

Then there's the issue of only supporting 4:2:0, which is terrible for 
synthetic images (screen captures, renders, diagrams) and shows in natural 
images too. 4:4:4 subsampling is used a lot in practice, for example Photoshop 
uses it automatically for the upper half of its quality scale when saving 
JPEGs. At high qualities it's often better to turn chroma subsampling off even 
at the expense of slightly higher quantization.
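
(For comparison's sake, with the JPEG tool chain that's just a flag away, e.g.
with ImageMagick:

  convert input.png -quality 90 -sampling-factor 4:4:4 out444.jpg
  convert input.png -quality 90 -sampling-factor 4:2:0 out420.jpg

or cjpeg's -sample 1x1 versus -sample 2x2; exact flag spellings assumed from the
standard tools.)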

I suspect these two issues are the reason WebP hits a quality wall in some 
JPEG/JXR/J2K comparisons done in RGB space.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-26 Thread geeta . b . arora
On Thursday, October 17, 2013 7:48:16 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for the Mozilla Research blog post entitled 
 Studying Lossy Image Compression Efficiency, and the related study.

A few queries regarding the study's methodology:

1.) The compression_test.py code converts the input PNG image to YUV data via the
following command (for the Lenna image, for instance):
convert png:Lenna.png -sampling-factor 4:2:0 -depth 8 Lenna.png.yuv

I'm not sure what the default colorspace used by ImageMagick's convert command is.
It seems, as per ImageMagick's documentation, that this will produce YUV data (luma
range [0..255]) and not digital YCbCr (luma range [16..235]), unless '-colorspace
Rec601Luma' is specified. If the YUV intermediate data produced above is not
YCbCr, I'm not sure that YUV data is a valid input for a WebP-like encoder. Can we
verify that the correct YCbCr data is generated from the convert command
(above)?
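
A quick sanity check is to look at the range of the Y plane in the raw output,
e.g. in Python (a sketch; assumes the 512x512 Lenna image and a planar 4:2:0
layout with the Y plane first):

  import numpy as np
  w, h = 512, 512
  data = np.fromfile("Lenna.png.yuv", dtype=np.uint8)
  y = data[:w * h]            # Y plane of a planar 4:2:0 file
  print(y.min(), y.max())     # roughly 16..235 suggests studio-range YCbCr, 0..255 full range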
 
2.) How would the quality scores (Y-SSIM etc.) look if we skip generating this
intermediate (YUV) data and encode (at some lossy quality) the source (RGB
colorspace) PNG image directly to JPEG & other codecs (WebP etc.) via
ImageMagick's convert command, then convert these images (JPEG, WebP etc.) back to
PNG format (without the intermediate YUV step) and evaluate the quality scores
(Y-SSIM, RGB-SSIM etc.) on the source and re-converted PNG files?

3.) JPEG compression being equal to or better than HEVC (on the Y-SSIM quality
score) at higher qualities for the Tecnick image set looks a little suspicious to me.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-22 Thread Yoav Weiss
I have a couple of points which IMO are missing from the discussion.

# JPEG's missing features & alpha channel capabilities in particular

Arguably, one of the biggest gains from WebP/JPEG-XR support is the ability to 
send real life photos with an alpha channel.

Last time I checked, about 60% of all PNG image traffic (so about ~9% of all 
Web traffic, according to HTTPArchive.org) is PNGs of color type 6, so 24 bit 
lossless images with an alpha channel. A large part of these PNGs are real-life 
images that would've been better represented with a lossy format, but since 
they require an alpha channel, authors have no choice but to use lossless PNG. 
(I have no data on *how many* of these PNGs are real life photos, but I hope 
I'll have some soon).

This is a part of Web traffic that would make enormous gains from an 
alpha-channel capable format, such as WebP or JPEG-XR (Don't know if HEVC-MSP 
has an alpha channel ATM), yet this is completely left out of the research. I 
think this point should be addressed.

# Implementability in open-source

HEVC seemed to be the winner of the study, with best scores across most 
measurements. Yet, I'm not sure an HEVC based format is something Mozilla can 
implement, since it's most likely to be patent encumbered. 

If this is not the case, it should be stated loud and clear, and if it is, HEVC 
should probably be in the research as a point of reference (which other formats 
should aspire to beat), rather than as a contender.


___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-22 Thread pornelski
On Tuesday, 22 October 2013 08:12:08 UTC+1, Yoav Weiss  wrote:
 
 This is a part of Web traffic that would make enormous gains from an 
 alpha-channel capable format, such as WebP or JPEG-XR (Don't know if HEVC-MSP 
 has an alpha channel ATM), yet this is completely left out of the research. I 
 think this point should be addressed.

If this is researched I'd love to see how it compares to lossy PNG from
pngquant2 (http://pngquant.org) and the blurizer tool
(https://github.com/pornel/mediancut-posterizer/tree/blurizer).

IMHO these tools already take PNG from a "too big to use" to a "good enough" level.
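
(Typical invocation, in case anyone wants to try: I believe the pngquant 2
quality range flag looks like

  pngquant --quality=65-80 image.png

which writes image-fs8.png / image-or8.png next to the original.)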
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-22 Thread Marcos Caceres



On Tuesday, October 22, 2013 at 10:15 AM, pornel...@gmail.com wrote:

 On Tuesday, 22 October 2013 08:12:08 UTC+1, Yoav Weiss wrote:
 
  This is a part of Web traffic that would make enormous gains from an 
  alpha-channel capable format, such as WebP or JPEG-XR (Don't know if 
  HEVC-MSP has an alpha channel ATM), yet this is completely left out of the 
  research. I think this point should be addressed.

I strongly agree with this. This is the killer feature that makes people want these
new formats (apart from the byte savings), and it's kinda weird that it was not
part of the study.

 If this is researched I'd love to see how it compares to lossy PNG from 
 pngquant2 http://pngquant.org and blurizer tool 
 https://github.com/pornel/mediancut-posterizer/tree/blurizer
That would be great.  
-- 
Marcos Caceres



___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread stephanepechard
 I have a couple of fundamental issues with how you're calculating 3 of the 4 
 metrics (all but RGB-SSIM, which I didn't think too much about)

You are right about it; the methodology is not clear on this point.

 First, am I correct in my reading of your methodology that for all metrics, 
 you encode a color image (4:2:0) and use that encoded filesize?

All metrics are computed on *decoded* data, whether RGB or YUV. Maybe the authors
could publish a flow chart of their methodology; that would ease the discussion.

 Fortunately, the solution is easy - for greyscale metrics, simply convert to 
 greyscale before encoding, not after. Or, if that's what you're already 
 doing, make it clear.

As I already said, it's not really a good idea to use luma-only metrics to
assess color images, as coders tend to blend colors as the quality index
decreases.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread Henri Sivonen
On Fri, Oct 18, 2013 at 1:08 AM,  cryo...@free.fr wrote:
 Which leads to think that doing some blinded experiment (real people 
 evaluating the images) to compare compressed images has still some value.

I think it would be worthwhile to do two experiments with real people
evaluating the images:
 1) For a given file size with artifacts visible, which format
produces the least terrible artifacts?
 2) Which format gives the smallest file size with a level of
artifacts that is so mild that people don't notice the artifacts?

My limited experience suggests that the ranking of the formats could
be different for those two different questions. Also, my understanding
is that the quality metric algorithms are foremost about answering
question #1 while question #2 is often more important for Web
designers.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
http://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread Henri Sivonen
On Fri, Oct 18, 2013 at 5:16 PM,  ch...@improbable.org wrote:
 I think JP2 support could potentially be very interesting because it would 
 make responsive images almost trivial without requiring separate files (i.e. 
 srcset could simply specify a byte-range for each size image) but the 
 toolchain support needs some serious attention.

Are there now JPEG 2000 encoders that make images such that if you
want to decode an image at a quarter of the full size in terms of number
of pixels (both dimensions halved), it is sufficient to use the first
quarter of the file length?

Last I tried, which was years ago, in order to decode a quarter-sized
image in terms of number of pixels with quality comparable to the
full-size image in terms of visible artifacts, it was necessary to
consume half of the file length. That is, in order to use the image
with both dimensions halved, it was necessary to load twice as many
bytes as would have been necessary if there was a separate pre-scaled
file available. Having to transfer twice as much data does not seem
like a good trade-off in order to avoid creating separate files for
responsive images.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
http://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread Chris Adams
It's not as simple as reading n% of the bit-stream – the image needs
to be encoded using tiles so a tile-aware decoder can simply read only
the necessary levels. This is very popular in the library community
because it allows a site like e.g. http://chroniclingamerica.loc.gov/
to serve tiles for a deep-zoom viewer without having to decode a full
600 DPI scan. This is in common usage, but not with open-source
software, because the venerable libjasper doesn't support it (and is
excruciatingly slow); however, the newer OpenJPEG added support for it, so
it's now possible without relying on a licensed codec.

As far as transfer efficiency goes, it's slightly more overhead with
the tile wrappers, but not enough to come anywhere close to cancelling
out the compression win from using JP2 instead of JPEG. For those of us
running servers, it's also frequently a win for cache efficiency
versus separate images – particularly if a CDN miss means you have to
go back to the origin and your stack allows streaming the cached
initial portion of the image while doing byte-range requests for the
other half.
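
(With OpenJPEG that is roughly, flag names taken from opj_compress and to be
treated as a sketch:

  opj_compress -i scan.tif -o scan.jp2 -t 1024,1024 -n 7 -r 20,10,5

i.e. 1024x1024 tiles, 7 resolution levels and three layers at those compression
ratios; a tile- and resolution-aware decoder can then pull only the bytes it
needs.)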

Chris
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread trichter
There are probably a couple of issues here:

 - Why didn't you include JPEG 2000?

This is the first one. However, I would also include various settings of the 
codecs involved. There is quite a bit one can do. For example, the overlap 
settings for XR or visual weighting for JPEG 2000, or subsampling for JPEG.

 - Correct me if I'm wrong but JPEG-XR native color space is not Y'CbCr this 
 means that this format had to perform an extra (possibly lossy) color space 
 conversion.

The question is whether PSNR was measured in YCbCr space or RGB space. The JPEG
committee measures in RGB, the MPEG committee in YUV.

 - I suppose that the final lossless step used for JPEGs was the usual Huffman 
 encoding and not arithmetic coding, have you considered testing the later one 
 independently?

Uninteresting since nobody uses it - except a couple of compression gurus, the 
AC coding option is pretty much unused in the field.

 - The image set is some what biased toward outdoor photographic images and 
 highly contrasted artificial black and white ones, what about fractal 
 renderings, operating systems and 2D/3D games screen-shots, blurry, out of 
 frame or night shots?

That depends very much on the use case you have. For artificial images, I would
suggest not using JPEG & friends in the first place, since they depend on natural
scene statistics.

Anyhow: Here is the JPEG online test which lets you select (many) parameters 
and measure (many) curves, as much as you want:


http://jpegonline.rus.uni-stuttgart.de/index.py

This is a cut-down version of the JPEG-internal tests, though using essentially 
the same tools.

Greetings,

Thomas
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread trichter

 Are there now JPEG 2000 encoders that make images such that if you
 want to decode an image in quarter of the full-size in terms of number
 of pixels (both dimensions halved), it is sufficient to use the first
 quarter of the file length?

Yes, certainly. It's just a matter of the progression mode. Set resolution as the
slowest-varying progression variable, and off you go.
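
(With OpenJPEG's opj_compress, for instance, that means asking for a
resolution-major progression order, something like

  opj_compress -i in.png -o out.jp2 -n 6 -p RLCP

where -p RLCP puts Resolution ahead of Layer/Component/Position; exact option
names assumed from the OpenJPEG CLI.)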


___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread trichter

 I think it would be worthwhile to do two experiments with real people
 evaluating the images:
  1) For a given file size with artifacts visible, which format
 produces the least terrible artifacts?
  2) Which format gives the smallest file size with a level of
 artifacts that is so mild that people don't notice the artifacts?

Such studies are called subjective tests, and they have been performed by 
many people (not by me, though, since I don't have a vision lab, i.e. a 
well-calibrated environment). Yes, the outcome of such tests is of course 
task-dependent, and dependent on the method you choose for the test.

There is probably a good study by the EPFL from, IIRC, 2011, published at the 
SPIE, Applications of Digital Image Processing, and many many others.

Outcome is more or less that JPEG 2000 and JPEG XR are on par for a given set
of options (which I don't remember off the top of my head) when evaluating
quality by MOS scores.

This specific test did not attempt to measure the detectability of defects
(which I would call a near-threshold test), but rather a scoring of the
badness of defects (thus, above threshold).

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-21 Thread battlebottle8
On Monday, October 21, 2013 4:05:36 PM UTC+1, tric...@accusoft.com wrote:
 There is probably a good study by the EPFL from, IIRC, 2011, published at the 
 SPIE, Applications of Digital Image Processing, and many many others.
 
 Outcome is more or less that JPEG 2000 and JPEG XR are on par for a given set 
 of options (which I don't remember off my head) when evaluating quality by 
 MOS-scores.

Any idea where we might be able to find the published results of these tests? I 
for one would be very interested in seeing them.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-20 Thread Yoav Weiss
On Saturday, October 19, 2013 1:12:14 AM UTC+2, Ralph Giles wrote:
 On 2013-10-18 1:57 AM, Yoav Weiss wrote:
  Would you consider a large sample of lossless Web images (real-life images
  served as PNG24, even though it'd be wiser to serve them as JPEGs) to be
  unbiased enough to run this research against? I believe such a sample would
  better represent Web images.

 Do you have such a sample?

Assuming Mozilla would consider such a sample valid, I can get such a sample 
using data from httparchive.org.

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-20 Thread danbrigade
I have a couple of fundamental issues with how you're calculating 3 of the 4 
metrics (all but RGB-SSIM, which I didn't think too much about)

First, am I correct in my reading of your methodology that for all metrics, you 
encode a color image (4:2:0) and use that encoded filesize? If so, then all the 
results from greyscale metrics are invalid, as the filesize would include 
chroma, but the metric only measures luma. An encoder could spend 0 bits on 
chroma and get a better score than an encoder that spent more bits on chroma 
than luma.

Second, for Y-SSIM and IW-SSIM only, it appears you encode as color, then 
afterwards convert both the original image and encoded image to greyscale and 
calculate SSIM between those two images. This is fundamentally wrong - the 
original converted to greyscale was not the image the codec encoded, so you're 
not measuring the distortion of the codec. It looks like PSNR-HVS-M is 
calculated from the YUV fed into the encoder, which is how Y-SSIM and IW-SSIM 
should be calculated as well.

Fortunately, the solution is easy - for greyscale metrics, simply convert to 
greyscale before encoding, not after. Or, if that's what you're already doing, 
make it clear.
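
(Concretely, for the luma-only metrics that would look something like this, a
sketch using ImageMagick and cjpeg:

  convert original.png -colorspace Gray original_gray.pgm
  cjpeg -quality 75 -grayscale -outfile encoded_gray.jpg original_gray.pgm

and then the SSIM variants get computed between original_gray.pgm and the
decoded encoded_gray.jpg, so the file size being scored never contained any
chroma.)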
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-19 Thread battlebottle8
On Saturday, October 19, 2013 12:12:14 AM UTC+1, Ralph Giles wrote:
 On 2013-10-18 1:57 AM, Yoav Weiss wrote:
 Do you have such a sample?

For what it's worth here's an image I made quite awhile ago showing the results 
of my own blind subjective comparison between codecs: 
http://www.filedropper.com/lossy

The image shows the original lossless image alongside a JPEG, JPEG-XR, JPEG2000 
and Web-P version of the image all of which have been compressed to 7.5kb. I 
used the LEADTOOLS compression suite for all images except the Web-P one, where
I used Google's libwebp. I'll be *very* clear here that I don't consider this
image very good proof of how good each codec is; clearly the JPEG compressed
image could be optimized more. The lossy compressed images are ordered as JPEG,
JPEG-XR, JPEG2000, Web-P with respect to the results I personally came to about
their performance, Web-P being the best and JPEG being the worst. I did this
comparison at every quality level and using many different image sources and
found the subjective results were the same. The difference between Web-P,
JPEG2000 and JPEG-XR can at times be hard to call, as it felt like I was deciding
which compression artifacts bothered me most personally rather than which image
felt closest to the original. What was consistent, however, was that all the
modern codecs seemed clearly superior to JPEG, or at best appeared the same as
JPEG at higher compression qualities, but certainly never worse. What I'm saying
is that based on my own experience I'd be shocked if anyone could go through a
subjective blind test like this and feel that JPEG was performing better at any
quality level or with any images.

I'd also agree with the points brought up by lept...@gmail.com. I think the actual
feature set supported by the current range of web image formats is quite lacking.
It's common on the web for web and game developers to compress photographic
images as PNGs because they need transparency. Animated GIFs are also popular
for compressing short live-action video clips, something the format is terribly
inadequate for. Both JPEG-XR and Web-P include transparency + alpha support.
Only Web-P supports animation, though I believe animation could be added to
JPEG-XR easily (http://ajxr.codeplex.com/). The extra color formats supported in
JPEG-XR could one day be useful on the web too.

Although the benefit of better compression performance in web image formats 
would have obvious speed benefits, I think the consequences of having such a 
limited feature set in the current range of supported image formats on the web 
is holding web developers back far more than file size issues.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-19 Thread Jeff Muizelaar


- Original Message -
 On Saturday, October 19, 2013 12:12:14 AM UTC+1, Ralph Giles wrote:
  On 2013-10-18 1:57 AM, Yoav Weiss wrote:
  Do you have such a sample?
 
 For what it's worth here's an image I made quite awhile ago showing the
 results of my own blind subjective comparison between codecs:
 http://www.filedropper.com/lossy

I agree that in this comparison JPEG is clearly the worst. However, the bitrate 
that you are using here is well below the target for which JPEG is designed to 
be used and the quality of all of the image formats is lower than would be 
acceptable for nearly all purposes. This makes these results much less 
interesting than at quality levels typically used on the web.

-Jeff

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-19 Thread battlebottle8
On Saturday, October 19, 2013 12:30:15 PM UTC+1, Jeff Muizelaar wrote:
 - Original Message -
  On Saturday, October 19, 2013 12:12:14 AM UTC+1, Ralph Giles wrote:
   On 2013-10-18 1:57 AM, Yoav Weiss wrote:
   Do you have such a sample?

  For what it's worth here's an image I made quite awhile ago showing the
  results of my own blind subjective comparison between codecs:
  http://www.filedropper.com/lossy

 I agree that in this comparison JPEG is clearly the worst. However, the
 bitrate that you are using here is well below the target for which JPEG is
 designed to be used and the quality of all of the image formats is lower than
 would be acceptable for nearly all purposes. This makes these results much
 less interesting than at quality levels typically used on the web.

 -Jeff

I completely agree. This is why I don't want this image to be considered as
good proof of which codec is superior. I had another image exactly like this
where I had comparisons around 35kb, which was a much more realistic quality
level for all the codecs but where the differences in visual quality loss were
still noticeable; unfortunately I seem to have lost it. My own subjective
findings were that the quality of each codec was in roughly the same order, with
JPEG always being identifiably the worst until the file size rose to the point
where it was impossible to easily tell the difference between any of the lossy
compressed images. I think if this test image were compressed on a typical web
page it would typically be an 80kb JPEG or so, and at this level of quality it's
difficult for a human to differentiate between a WebP or JPEG-XR image of the
same size, or even the original lossless image for that matter in some cases. In
this respect subjective blind testing by humans fails, but my point is I never
observed anything like JPEG performing better than JPEG-XR or WebP, and I'm
very surprised to see that some algorithms are reporting that JPEG outperforms
WebP and JPEG-XR by a wide margin at some quality settings, in stark contrast
to what other algorithms report (RGB-SSIM) and to what my own and, I believe,
other subjective blind tests would report.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-19 Thread stephanepechard
I'll just talk about the quality evaluation aspects of this study, as it is a 
field I know quite well (PhD on the topic, even if in video specifically). 

 I think the most important kind of comparison to do is a subjective blind 
 test with real people. This is of course produces less accurate results, but 
 more meaningful ones.

I don't get how more meaningful results may be less accurate... Running
subjective quality tests is not as trivial as it sounds, at least if you want to
get meaningful results, as you say. Of course, you can throw a bunch of images at
some naive observers with a nice web interface, but what about the differences in
their screens? What about the differences in their lighting conditions? How do
you validate people for the test (visual acuity, color blindness)? I've run more
than 600 test sessions with around 200 different observers. Each one of them was
tested before the session, and a normalized (ITU-R BT.500) room was dedicated to
the process. I don't want to brag, I just mean it's a complicated matter, and not
as sexy as it sounds :-)

In this study, you used several objective quality criteria (Y-SSIM, RGB-SSIM,
IW-SSIM, PSNR-HVS-M). You say yourself: "It's unclear which algorithm is best
in terms of human visual perception, so we tested with four of the most
respected algorithms." Still, the ultimate goal of your test is to compare
different degrading systems (photography coders here) at equivalent *perceived*
quality. As your graphs show, they don't produce very consistent results
(especially RGB-SSIM). SSIM-based metrics are structural, which means they
evaluate how the structure of the image differs from one version to the other.
They are therefore very dependent on the content of the picture. Y-SSIM and
IW-SSIM are only applied to the luma channel, which is not optimal in your case,
as image coders tend to blend colors. Still, IW-SSIM is the best performer in [1]
(but it was the subject of that study), so why not. Your results with RGB-SSIM are
very different from the others, disqualifying it for me. Plus, averaging SSIM
over the R, G and B channels makes no sense for the human visual system.
PSNR-HVS-M has the advantage of taking a CSF into account to weight its PSNR, but
it was designed on artificial artefacts, so you don't know how it performs on
compression artefacts. None of these metrics use the human visual system
at their heart. At best, they apply some HVS filter to PSNR or SSIM. For a more
HVS-related metric, which tends to perform well (over 0.92 in correlation), look
at [2] (from the lab I worked in). The code is a bit old now though, but an R
package seems to be available.

You cite [1], in which they compare 5 algorithms (PSNR, IW-PSNR, SSIM, MS-SSIM, 
and IW-SSIM) over 6 subject-rated independent image databases (LIVE database, 
Cornell A57 database, IVC database, Toyama database, TID2008 database, and CSIQ 
database). These databases contain images and subjective quality evaluations 
obtained in normalized (i.e. repeatable) conditions. Most of them use JPEG and 
JPEG2000 compression, but not the other formats you want to test. The LIVE
database is known not to be diverse enough, resulting in high correlations in
most studies (which is one reason why other databases emerged). If you want to
take your study further, consider using some of these data to start with.

Finally, be careful when you compute averages of values; did you check their
distribution first?

Stéphane Péchard

[1] https://ece.uwaterloo.ca/~z70wang/research/iwssim/
[2] http://www.irccyn.ec-nantes.fr/~autrusse/Komparator/index.html
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-18 Thread Yoav Weiss
On Thursday, October 17, 2013 4:48:16 PM UTC+2, Josh Aas wrote:
 This is the discussion thread for the Mozilla Research blog post entitled 
 Studying Lossy Image Compression Efficiency, and the related study.

Thank you for publishing this research!

While I like the methodology used a lot, I find the image sample used extremely 
small to accurately represent images on today's Web (or tomorrow's Web for that 
matter).

I understand that one of the reasons you used artificial benchmarks instead of 
real-life Web images is to avoid the bias of images that already went through 
JPEG compression.

Would you consider a large sample of lossless Web images (real-life images 
served as PNG24, even though it'd be wiser to serve them as JPEGs) to be 
unbiased enough to run this research against? I believe such a sample would 
better represent Web images.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-18 Thread battlebottle8
Very interesting study. I’m shocked to see WebP and JPEG-XR perform so poorly 
on so many of the tests. Do they really perform *that* much *worse* than JPEG? 
It seems hard to imagine. I've done my own tests on JPEG, WebP and JPEG-XR by
blindly comparing files of the same size and deciding subjectively which one I
thought looked closest to the uncompressed version. The conclusions I came to
were, I thought, very close to the RGB-SSIM tests, which showed WebP best, with
JPEG-XR much better than JPEG but significantly behind WebP, and JPEG much
worse than all. This seemed consistent to me at all encoding qualities with
many kinds of images, just as the RGB-SSIM tests show. It seems very curious that
Y-SSIM, IW-SSIM and PSNR-HVS-M all show JPEG-XR and WebP both dipping below
JPEG quality at the same file sizes. I'd be very interested in seeing the
images for which those comparisons determine that JPEG-XR and WebP are doing a
worse job than JPEG.

I think the most important kind of comparison to do is a subjective blind test 
with real people. This of course produces less accurate results, but more
meaningful ones. It doesn't really matter if a certain algorithm determines a
certain codec produces less lossy images than another codec if actual humans 
looking at the compressed images don’t tend to feel the same way. All that 
matters in the end is if a codec does a good job of keeping the details that 
the human compressing and viewing the image thinks are important, not what 
various algorithms testing image quality think are important.

Although it's outside the scope of this study, I wonder what interest Mozilla
is taking in getting image formats with more features supported on the Web? Lossy
+ transparency seems like a particularly desirable one for games and certainly 
for web developers in general. RGB565 colour format support sounds like it 
could be useful for optimized WebGL applications.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-17 Thread Josh Aas
Blog post is here:

https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/

Study is here:

http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-17 Thread cryopng
Thank you for publishing this study, here are my first questions:
- Why didn't you include JPEG 2000?

- Correct me if I'm wrong, but JPEG-XR's native color space is not Y'CbCr; this
means that this format had to perform an extra (possibly lossy) color space
conversion.

- I suppose that the final lossless step used for JPEGs was the usual Huffman
encoding and not arithmetic coding; have you considered testing the latter one
independently?

- The image set is somewhat biased toward outdoor photographic images and
highly contrasted artificial black-and-white ones; what about fractal
renderings, operating system and 2D/3D game screenshots, blurry, out-of-frame
or night shots?

- I've found only two cats and not a single human face in the Tecnick image
set, and no fancy à la Instagram filters; this can't seriously be representative
of web images. A larger image corpus would be welcome.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-17 Thread Leman Bennett (Omega X)

On 10/17/2013 9:48 AM, Josh Aas wrote:

This is the discussion thread for the Mozilla Research blog post entitled Studying 
Lossy Image Compression Efficiency, and the related study.




HEVC-MSP did really well. It's unfortunate that Mozilla could not use it
in any capacity since it's tied to the encumbered MPEG HEVC standard.


Also, I didn't know that someone was working on a JPEG-XR FOSS encoder. 
I wonder how it compares to the Microsoft reference encoder.

--
==
~Omega X
MozillaZine Nightly Tester
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-17 Thread Josh Aas
On Thursday, October 17, 2013 12:50:12 PM UTC-5, cry...@free.fr wrote:
 Thank you for publishing this study, here are my first questions:
 
 - Why didn't you include JPEG 2000?

We couldn't test everything; we picked a small set of the formats that we hear
the most about and that seem interesting. We're not opposed to including JPEG
2000 in future testing, particularly if we see more evidence that it's
competitive.

 - The image set is some what biased toward outdoor photographic images and 
 highly contrasted artificial black and white ones, what about fractal 
 renderings, operating systems and 2D/3D games screen-shots, blurry, out of 
 frame or night shots?
 
 - I've found only two cats and not a single human face in the Tecnick image 
 set, no fancy à la Instagram filters, this can't be seriously representative 
 of web images, a larger image corpus would be welcome.

We considered improving the image sets in some of the ways you suggest, we just
didn't get to it this time. Trying to be thorough and accurate with these kinds
of studies is more work than it seems like it'll be; we couldn't do everything.
We'll try to do better with image sets in future work. I still think this set
produces meaningful results.

Thanks for the feedback. Maybe Tim, Gregory, or Jeff can respond to some of 
your other questions.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency

2013-10-17 Thread cryopng
HDR-VDP-2 is a relatively recent metric that produces predictions for difference
visibility and quality degradation:
http://sourceforge.net/apps/mediawiki/hdrvdp/index.php?title=Main_Page
It could be interesting to add this metric in future studies.

Rafał Mantiuk (the guy behind HDR-VDP-2) also worked on this paper: "New
Measurements Reveal Weaknesses of Image Quality Metrics in Evaluating Graphics
Artifacts" http://www.mpi-inf.mpg.de/resources/hdr/iqm-evaluation/

Which leads one to think that doing a blinded experiment (real people evaluating
the images) to compare compressed images still has some value. It could be fun
to conduct such an experiment, presenting 2 or 3 versions of the same image
compressed with different methods and asking a wide panel (it could be open to
anyone on the web) to pick their favorite one.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform