Re: Studying Lossy Image Compression Efficiency, July 2014

2014-11-28 Thread songofapollo
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

It would help if you would use much more distinct colors in your graphs of the 
results. It can be very hard to keep track of which is which. You used two 
shades of red/purple, and three shades of blue/green/teal. That's a bizarre 
decision for graphs meant to be easily understood.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-09-16 Thread jnoring
On Tuesday, July 15, 2014 8:34:35 AM UTC-6, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Could you post the command lines used for the various encoders?  Also, for 
mozjpeg, if you use arithmetic encoding instead of huffman encoding, what is 
the effect?

I know arithmetic encoding isn't supported by a lot of browsers, but neither 
are most of the formats being tested in the study. So it seems appropriate to 
consider.
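
For what it's worth, a rough sketch of the comparison I have in mind, assuming 
a cjpeg/jpegtran build with arithmetic coding enabled (file names are 
placeholders):

  # Huffman (optimized) vs. arithmetic coding on the same input
  jpegtran -optimize -copy none photo.jpg > photo-huffman.jpg
  jpegtran -arithmetic -copy none photo.jpg > photo-arith.jpg
  ls -l photo-huffman.jpg photo-arith.jpg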


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-09-16 Thread jnoring
On Tuesday, July 15, 2014 1:38:00 PM UTC-6, stone...@gmail.com wrote:
 Would be nice if you guys just implemented JPEG2000.  It's 2014.

Based on what data?  

 Not only would you get a lot more than a 5% encoding boost, but you'd get 
 much higher quality images to boot.

Based on what data?

 If you had implemented it in 2014, everyone would support it today.  If you 
 don't implement it today, we'll wait another 15 years tuning a 25 year old 
 image algorithm while better things are available.

Just because something is new doesn't automatically imply it's better.  I've 
seen conflicting data on whether or not JPEG2000 outperforms JPEG.  And on some 
basic level, that last statement is also pretty shaky, since encoder maturity 
is a huge factor in quality.



Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-31 Thread janus
Den torsdag den 24. juli 2014 23.59.58 UTC+2 skrev Josh Aas:
 
  I selected 10,000 random JPEGs that we were caching for customers and ran 
  them through mozjpeg 2.0 via jpegtran. Some interesting facts:
 
 
 With mozjpeg you probably want to re-encode with cjpeg rather than jpegtran. 
 We added support for JPEG input to cjpeg in mozjpeg to make this possible. 
 I'm not sure, but I don't think jpegtran takes advantage of much of the work 
 we've done to improve compression.
 
 

Hi Josh
You write that we should re-encode with cjpeg rather than just optimize with 
jpegtran, but what settings would you use for this, if the purpose is just to 
optimize, and not actually change the format, quality and so on, in any way?

I tried with cjpeg -quality 100 -optimize -progressive but this seems to give 
me much bigger files.

I am hoping to optimize images uploaded for websites, which have already had 
their quality setting chosen to fit their purpose, so I am only interested in 
optimizing the images losslessly, which seems like a similar case to John's.
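
For context, a rough sketch of the two routes being discussed here, i.e. 
lossless restructuring with jpegtran versus a full (lossy) re-encode with 
mozjpeg's cjpeg, which accepts JPEG input; file names are placeholders:

  # lossless: only restructures the entropy coding, pixels are untouched
  jpegtran -optimize -progressive -copy none in.jpg > out-lossless.jpg
  # lossy: full re-encode, usually smaller but not pixel-identical
  cjpeg -quality 85 in.jpg > out-reencoded.jpg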

And one other thing:
I have been testing an early version of jpegtran from mozjpeg, but after 
upgrading to 2.0 my test files seem to grow by a few KB after being optimized.
Was there an error in the older versions that deleted a bit too much data, or 
did the algorithm change for the worse in 2.0?
I am using jpegtran -optimize -progressive -copy none with both versions.


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
On Tuesday, July 15, 2014 3:15:13 PM UTC-5, perez@gmail.com wrote:

 #1 Would it be possible to have the same algorithm that is applied to WebP 
 be applied to JPEG?

I'm not sure. WebP was created much later than JPEGs, so I'd think/hope they're 
already using some equivalent to trellis quantization.

 #2 There are some JPEG services that perceptually change the image, without 
 any noticeable artifacts. Have you tried something like that?

I'm not really sure what this means, but you can experiment with re-encoding 
with mozjpeg and find a level that saves on file size, but at which you can't 
tell the difference between the source and the re-encoded image.
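
Something like the following rough sketch, with cjpeg from mozjpeg on the PATH 
and input.jpg standing in for the source image (the quality values are only 
illustrative):

  # re-encode at several quality settings, then compare sizes and appearance
  for q in 95 90 85 80 75; do
    cjpeg -quality $q input.jpg > input-q$q.jpg
  done
  ls -l input.jpg input-q*.jpg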


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
 Are there any plans to integrate into other tools, specifically imagemagick?
 
 Or would you leave that up to others?

For now we're going to stay focused on improving compression in mozjpeg's 
library. I think a larger improved toolchain for optimizing JPEGs would be 
great, but it's probably outside the scope of the mozjpeg project.

 While you state that you now also accept JPEG input for re-compression, this 
 usually involves a loss of quality in the process.

Options for improving re-compression are very limited if you're not willing to 
accept any quality loss. That said, our 'jpgcrush' feature does reduce size 
significantly for progressive JPEGs without harming quality.

 Does mozjpeg have a preferred input format (for best quality/performance)?

Not really. It's probably best to input JPEG if your source image is JPEG, 
otherwise I'd probably recommend converting to BMP for use with cjpeg.
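
For instance, a minimal sketch assuming ImageMagick's convert is available 
(source.png and out.jpg are placeholder names):

  # convert the lossless source to BMP, then encode it with mozjpeg's cjpeg
  convert source.png source.bmp
  cjpeg -quality 85 source.bmp > out.jpg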


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-24 Thread Josh Aas
On Friday, July 18, 2014 10:05:19 AM UTC-5, j...@cloudflare.com wrote:

 I selected 10,000 random JPEGs that we were caching for customers and ran 
 them through mozjpeg 2.0 via jpegtran. Some interesting facts:

With mozjpeg you probably want to re-encode with cjpeg rather than jpegtran. We 
added support for JPEG input to cjpeg in mozjpeg to make this possible. I'm not 
sure, but I don't think jpegtran takes advantage of much of the work we've done 
to improve compression.
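
A rough sketch of that workflow with mozjpeg 2.0's cjpeg, which accepts JPEG 
input directly (the quality value is only illustrative):

  cjpeg -quality 85 input.jpg > output.jpg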

 We will continue to work with mozjpeg 2.0 experimentally with the hope that 
 run time can be brought closer to what we had before as the compression looks 
 good.

We haven't spent as much time as we'd like on run-time optimization; we've 
really been focused on compression wins. We hope to spend more time on run-time 
performance in the future.


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-21 Thread Gabriele Svelto
On 19/07/2014 22:40, Ralph Giles wrote:
 Probably not for Firefox OS, if you mean mozjpeg. Not necessarily
 because it uses hardware, but because mozjpeg is about spending more cpu
 power to compress images. It's more something you'd use server-side or
 in creating apps. The phone uses libjpeg-turbo for image decoding, which
 is fast, just not as good at compression.

It might be useful in Firefox OS development: we routinely re-compress
PNG assets in FxOS but we never tried re-compressing our JPEG assets
(which are mostly wallpapers IIRC).

 Gabriele





Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-21 Thread Bryan Stillwell
One option that I haven't seen compared is the combination of JPEG w/ packJPG 
(http://packjpg.encode.ru/?page_id=17).  packJPG can further compress JPEG 
images another 20%+ and still reproduce the original bit-for-bit.
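
A rough sketch of the idea, assuming packJPG's command-line tool is invoked 
with just a filename (which is how I recall it working; names are placeholders):

  packJPG photo.jpg   # writes photo.pjg, typically ~20% smaller
  packJPG photo.pjg   # restores photo.jpg bit-for-bit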

More details on how this is done can be found here:

http://mattmahoney.net/dc/dce.html#Section_616

To me it seems that JPEG+packJPG could be competitive with, or even exceed, 
HEVC-MSP on bits/pixel.

Bryan


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-19 Thread Caspy7
Would this code be a candidate for use in Firefox OS or does most of that 
happen in the hardware?


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-19 Thread Ralph Giles
On 2014-07-19 1:14 PM, Caspy7 wrote:

 Would this code be a candidate for use in Firefox OS or does most of that 
 happen in the hardware?

Probably not for Firefox OS, if you mean mozjpeg. Not necessarily
because it uses hardware, but because mozjpeg is about spending more cpu
power to compress images. It's more something you'd use server-side or
in creating apps. The phone uses libjpeg-turbo for image decoding, which
is fast, just not as good at compression.

 -r


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-18 Thread jgc
On Tuesday, July 15, 2014 3:34:35 PM UTC+1, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Josh,

I work for CloudFlare on many things but recently on image compression. We have 
a product called Polish that recompresses images for our customers 
automatically. As we are in the process of rolling out a new version I looked 
at mozjpeg 2.0.

I selected 10,000 random JPEGs that we were caching for customers and ran them 
through mozjpeg 2.0 via jpegtran. Some interesting facts:

1. 691 files were not compressed further. This compares with 3,471 that 
libjpeg-turbo did not compress further.

2. Of the files that were compressed further, the average compression was about 3%.

3. Run time was about 1.7x the libjpeg-turbo time.

4. I've put together a small chart showing the distribution of compression that 
we saw. It's here: 
https://twitter.com/jgrahamc/status/490114514667327488/photo/1

We will continue to work with mozjpeg 2.0 experimentally with the hope that run 
time can be brought closer to what we had before as the compression looks good.

John.



Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-16 Thread renesd
Cool

Re: decoding.

I'm replying to this note: 

1. We're fans of libjpeg-turbo - it powers JPEG decoding in Firefox because 
its focus is on being fast, and that isn't going to change any time soon. The 
mozjpeg project focuses solely on encoding, and we trade some CPU cycles for 
smaller file sizes. We recommend using libjpeg-turbo for a standard JPEG 
library and any decoding tasks. Use mozjpeg when creating JPEGs for the Web.


Why not use hardware for JPEG? It uses less memory and battery, as well as 
being quicker. It's available on many devices these days too. Why use the CPU 
to first convert a small amount of data into a big amount of data when it's not 
needed by most hardware? Not only that, but you probably store the original 
JPEG data in cache as well! The fastest decoder is the one that does nothing. 
Just let the dedicated JPEG decoding hardware, or the GPU, do it.

All this talk about decoding performance seems kind of silly, considering that 
JPEG decoding performance could be improved massively.


best,


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Josh Aas
Study is here:

http://people.mozilla.org/~josh/lossy_compressed_image_study_july_2014/

Blog post is here:

https://blog.mozilla.org/research/2014/07/15/mozilla-advances-jpeg-encoding-with-mozjpeg-2-0/


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread lange . fabian
Hello Josh,

thank you and all involved for your efforts to make the web faster.
Are there any plans to integrate into other tools, specifically imagemagick?
Or would you leave that up to others?

With all the options available for image processing, one can end up building 
quite a complex chain of tools and commands to produce the best output.
While you state that you now also accept JPEG input for re-compression, this 
usually involves a loss of quality in the process.
Does mozjpeg have a preferred input format (for best quality/performance)?

Best regards
Fabian


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread stonecypher
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Would be nice if you guys just implemented JPEG2000.  It's 2014.

Not only would you get a lot more than a 5% encoding boost, but you'd get much 
higher quality images to boot.

But nobody supports JPEG2000 and we want to target something everyone can see!

If you had implemented it in 2014, everyone would support it today.  If you 
don't implement it today, we'll wait another 15 years tuning a 25 year old 
image algorithm while better things are available.

Similarly there's a reason that people are still hacking video into JPEGs and 
using animated GIFs.


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread john
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

Would be nice if you guys just implemented JPEG2000.  It's 2014.


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread perez . m . marc
On Tuesday, July 15, 2014 10:34:35 AM UTC-4, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image 
 Formats Study and the Mozilla Research blog post entitled Mozilla Advances 
 JPEG Encoding with mozjpeg 2.0.

#1 Would it be possible to have the same algorithm that is applied to WebP 
be applied to JPEG?

#2 There are some JPEG services that perceptually change the image, without any 
noticeable artifacts. Have you tried something like that?


Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Chris Peterson

On 7/15/14 12:38 PM, stonecyp...@gmail.com wrote:

On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:

This is the discussion thread for Mozilla's July 2014 Lossy Compressed Image Formats 
Study and the Mozilla Research blog post entitled Mozilla Advances JPEG Encoding 
with mozjpeg 2.0.


Would be nice if you guys just implemented JPEG2000.  It's 2014.

Not only would you get a lot more than a 5% encoding boost, but you'd get much 
higher quality images to boot.

But nobody supports JPEG2000 and we want to target something everyone can see!

If you had implemented it in 2014, everyone would support it today.  If you 
don't implement it today, we'll wait another 15 years tuning a 25 year old 
image algorithm while better things are available.

Similarly there's a reason that people are still hacking video into JPEGs and 
using animated GIFs.


Do Chrome and IE support JPEG2000? I can't find a clear answer online. 
The WONTFIX'd Firefox bug [1] says IE and WebKit/Blink browsers support 
JPEG2000 (but WebKit's support is only on OS X).



chris

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=36351



Re: Studying Lossy Image Compression Efficiency, July 2014

2014-07-15 Thread Masatoshi Kimura
On 7/15/14 12:38 PM, stonecyp...@gmail.com wrote:
 Similarly there's a reason that people are still hacking video into
 JPEGs and using animated GIFs.

People are using animated GIFs, but the animated GIFs people are using may
not actually be animated GIFs [1].

(2014/07/16 5:43), Chris Peterson wrote:
 Do Chrome and IE support JPEG2000? I can't find a clear answer online.
 The WONTFIX'd Firefox bug [1] says IE and WebKit/Blink browsers support
 JPEG2000 (but WebKit's support is only on OS X).

No, IE does not support JPEG2000, but IE9+ supports JPEG XR. Chrome supports
neither, but it does support WebP [2].

[1] http://techcrunch.com/2014/06/19/gasp-twitter-gifs-arent-actually-gifs/
[2] http://xkcd.com/927/

-- 
vyv03...@nifty.ne.jp