Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Noel Gordon
On Sun, Jun 1, 2014 at 8:58 AM, Glenn Maynard gl...@zewt.org wrote:


 But again, image decoding *can't* be done efficiently in script:
 platform-independent code with performance competitive with native SIMD
 assembly is a thing of myth.  (People have been trying unsuccessfully to do
 that since day one of MMX, so it's irrelevant until the day it actually
 happens.) Anyhow, I think I'll stop helping to derail this thread and
 return to the subject.


I believe that a spec-conforming canvas implementation must support PNG,
so a PNG encoder/decoder is required. If others want to replace their
native libs (libpng, libjpeg_turbo, and so on) with JS implementations of
same, well that's up to them. It won't be happening in Chrome anytime soon;
several factors come to mind: speed, memory use, and security. But I
agree, let's return to the subject :)

Noel, if you're still around, I'd suggest fleshing out your suggestion by
 providing some real-world benchmarks that compare the PNG compression rates
 against the relative time it takes to compress.  If spending 10x the
 compression time gains you a 50% improvement in compression, that's a lot
 more compelling than if it only gains you 10%.  I don't know what the
 numbers are myself.


For the test case attached, and https://codereview.chromium.org/290893002

compression 0.0, time 0.230500 ms, toDataURL length 2122
compression 0.1, time 0.209900 ms, toDataURL length 1854
compression 0.2, time 0.215200 ms, toDataURL length 1850
compression 0.3, time 0.231100 ms, toDataURL length 1774
compression 0.4, time 0.518100 ms, toDataURL length 1498
compression 0.5, time 0.532000 ms, toDataURL length 1494
compression 0.6, time 0.612600 ms, toDataURL length 1474
compression 0.7, time 0.727750 ms, toDataURL length 1470
compression 0.8, time 1.511150 ms, toDataURL length 1334
compression 0.9, time 3.138100 ms, toDataURL length 1298
compression 1.0, time 3.182050 ms, toDataURL length 1298

I'd be careful using compression rates / encoding times as figures of merit
though -- those depend on the source material (the input to the PNG
encoder). Given incompressible source material, PNG encoding cannot gain
compression at all.

The question (for me) is whether developers should be allowed to control
the compression using a pre-existing API. The browser has a default
compression value; it's a compromise that ... surprise, surprise ...
doesn't always meet developer expectations [1].

~noel

[1] https://bugs.webkit.org/show_bug.cgi?id=54256




Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Justin Novosad
On Sat, May 31, 2014 at 8:44 AM, Robert O'Callahan rob...@ocallahan.org
wrote:

 On Sat, May 31, 2014 at 3:44 AM, Justin Novosad ju...@google.com wrote:

 My point is, we need a proper litmus test for the "just do it in script"
 argument because, let's be honest, a lot of new features being added to
 the Web platform could be scripted efficiently, and that does not
 necessarily make them bad features.


 Which ones?


The examples I had in mind when I wrote that were Path2D and HitRegions.





Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Justin Novosad
On Sat, May 31, 2014 at 1:46 PM, Glenn Maynard gl...@zewt.org wrote:

 On Fri, May 30, 2014 at 1:25 PM, Justin Novosad ju...@google.com wrote:

 I think this proposal falls short of enshrining.  The cost of adding this
 feature is minuscule.


 I don't think the cost is ever really minuscule.


https://codereview.chromium.org/290893002





 True, you'd never want to use toDataURL with a compression operation
 that takes many seconds (or even tenths of a second) to complete, and data
 URLs don't make sense for large images in the first place.  It makes sense
 for toBlob(), though, and having the arguments to toBlob and toDataURL be
 different seems like gratuitous inconsistency.


 Yes, toBlob is async, but it can still be polyfilled.


 (I'm not sure how this replies to what I said--this feature makes more
 sense for toBlob than toDataURL, but I wouldn't add it to toBlob and not
 toDataURL.)


What I meant is that I agree that adding the compression argument to toBlob
answers the need for an async API (being synchronous was one of the
criticisms of the original proposal, which only mentioned toDataURL).
 However, this does not address the other criticism that we should not add
features to toDataURL (and by extension to toBlob) because the new
functionality could be implemented more or less efficiently in JS.


 --
 Glenn Maynard




Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Rik Cabanier
On Mon, Jun 2, 2014 at 10:05 AM, Justin Novosad ju...@google.com wrote:

 On Sat, May 31, 2014 at 8:44 AM, Robert O'Callahan rob...@ocallahan.org
 wrote:

  On Sat, May 31, 2014 at 3:44 AM, Justin Novosad ju...@google.com
 wrote:
 
  My point is, we need a proper litmus test for the "just do it in script"
  argument because, let's be honest, a lot of new features being added to
  the Web platform could be scripted efficiently, and that does not
  necessarily make them bad features.
 
 
  Which ones?
 

 The examples I had in mind when I wrote that were Path2D


Crossing the JS boundary is still an issue, so implementing this in pure JS
would be too slow.
Path2D is only there to minimize DOM calls.


 and HitRegions.


I agree that most of hit regions can be implemented using JS.
The reason for hit regions is a11y, and people felt that a feature that
just does accessibility would end up unused or unimplemented.


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Rik Cabanier
On Mon, Jun 2, 2014 at 10:16 AM, Justin Novosad ju...@google.com wrote:

 On Sat, May 31, 2014 at 1:46 PM, Glenn Maynard gl...@zewt.org wrote:

  On Fri, May 30, 2014 at 1:25 PM, Justin Novosad ju...@google.com
 wrote:
 
  I think this proposal falls short of enshrining.  The cost of adding this
  feature is minuscule.
 
 
  I don't think the cost is ever really minuscule.
 

 https://codereview.chromium.org/290893002


That's implementation cost to you :-)
Now we need to convince the other vendors. Do they want it, want more, want
it in a different way?
Then it needs to be documented. How can authors discover that this is
supported? How can it be polyfilled?


  True, you'd never want to use toDataURL with a compression operation
  that takes many seconds (or even tenths of a second) to complete, and
 data
  URLs don't make sense for large images in the first place.  It makes
 sense
  for toBlob(), though, and having the arguments to toBlob and toDataURL
 be
  different seems like gratuitous inconsistency.
 
 
  Yes, toBlob is async, but it can still be polyfilled.
 
 
  (I'm not sure how this replies to what I said--this feature makes more
  sense for toBlob than toDataURL, but I wouldn't add it to toBlob and not
  toDataURL.)
 

 What I meant is that I agree that adding the compression argument to toBlob
 answers the need for an async API (being synchronous was one of the
 criticisms of the original proposal, which only mentioned toDataURL).
  However, this does not address the other criticism that we should not add
 features to toDataURL (and by extension to toBlob) because the new
 functionality could be implemented more or less efficiently in JS.


  --
  Glenn Maynard
 
 



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-06-02 Thread Glenn Maynard
On Mon, Jun 2, 2014 at 12:49 PM, Rik Cabanier caban...@gmail.com wrote:

 That's implementation cost to you :-)

Now we need to convince the other vendors. Do they want it, want more, want
 it in a different way?
 Then it needs to be documented. How can authors discover that this is
 supported? How can it be polyfilled?


Polyfill isn't really an issue, since this is just a browser hint.  We
definitely need a way to feature test option arguments, but we should start
another thread for that.
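Until such a thread produces something, an ad-hoc test is possible. This is only a sketch: `supportsPngCompressionArg` is a hypothetical helper, the `canvas` parameter is any object with a `toDataURL(type, arg)` method, and the stub below stands in for a real `<canvas>` element:

```javascript
// Hypothetical feature test: if the PNG compression argument is honored,
// the two extremes should (usually) produce different data URLs. A UA that
// silently ignores the hint yields identical output for both calls.
function supportsPngCompressionArg(canvas) {
  const fastest = canvas.toDataURL("image/png", 0.0);
  const smallest = canvas.toDataURL("image/png", 1.0);
  return fastest !== smallest;
}

// Stub standing in for today's browsers, which ignore the extra argument:
const legacyCanvas = { toDataURL: (type) => "data:image/png;base64,AAAA" };
console.log(supportsPngCompressionArg(legacyCanvas)); // false
```

Note that this can report a false negative when both extremes happen to compress identically, which is one reason a dedicated feature-testing mechanism would be better.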

This needs a bit more guidance in the spec as far as what different numbers
mean.  A quality number of 0-1 with JPEG is fairly well-understood--you
won't always get the same result, but nobody interprets 1 as spend 90
seconds trying as hard as you possibly can to make the image smaller.
 There's no common understanding for PNG compression levels, and there's a
wide variety of ways you can try harder to compress a PNG, with wildly
different space/time tradeoffs.  In order of cost:

- Does 0 mean "output a PNG as quickly as possible, even if it results in
zero compression"?
- What number means "be quick, but don't turn off compression entirely"?
- What number means "use a reasonable tradeoff", e.g. the default today?
- What number means "prefer smaller file sizes, but I'm expecting on the
order of 25% extra time cost, not 1500%"?
- Does 1 mean "spend two minutes if you want, make the image as small as
you can"?  (pngcrush does this, and Photoshop in some versions does
this--which is incredibly annoying, by the way.)

If there's no guidance given at all, 0 might mean either of the first
two, and 1 might mean either of the last two.

My suggestion is an enum, with three values: "fast", "normal", "small",
with non-normative spec guidance suggesting that "fast" means "make the
compression faster if possible at the cost of file size, but don't go
overboard and turn compression off entirely", and "small" means "spend a
bit more time if it helps create a smaller file, but don't go overboard and
spend 15x as long".  If we want to support the other two, they can be added
later (e.g. "uncompressed" and "crush").  Since this is only a hint,
implementations can choose which ones to implement; if the choice isn't
known, fall back on the default.

A normative requirement for all PNG compression is that it should always
round-trip the RGBA value for each pixel.  That means that--regardless of
this option--a UA can use paletted output only if the image's colors fit in
a palette, and it prohibits things like clamping pixels with a zero alpha to
#00, which is probably one strategy for improving compression (but if
you're compressing non-image data, like helper textures for WebGL, you
don't want that).
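The WebGL case is concrete. A sketch of why a hypothetical "zero out invisible pixels" optimization fails the round-trip requirement (the clamp function is an illustration, not any real encoder's behavior):

```javascript
// Two RGBA pixels that composite identically (alpha 0) but carry different
// payloads in their color channels, e.g. a WebGL helper texture.
const pixelA = [255, 0, 0, 0]; // "red", fully transparent
const pixelB = [0, 0, 255, 0]; // "blue", fully transparent

// Hypothetical lossy "optimization": clamp invisible pixels to all zeros.
function clampTransparent(px) {
  return px[3] === 0 ? [0, 0, 0, 0] : px.slice();
}

// The pixels were distinct, but after clamping they are indistinguishable,
// so the encoder can no longer round-trip the original RGBA values.
console.log(JSON.stringify(clampTransparent(pixelA)) ===
            JSON.stringify(clampTransparent(pixelB))); // true
```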

On Mon, Jun 2, 2014 at 1:23 PM, Nils Dagsson Moskopp 
n...@dieweltistgarnichtso.net wrote:

 As an author, I do not see why I should ever want to give a browser
 losslessly encoding an image any quality argument other than
 „maximum speed“ or „minimum size“ – on a cursory look, anything else
 would probably not be interoperable. Also, is 0.5 the default value?


Image compression is uninteroperable from the start, in the sense that
each UA can always come up with different output files.  This feature (and
the JPEG quality level feature) doesn't make it worse.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-31 Thread Robert O'Callahan
On Sat, May 31, 2014 at 3:44 AM, Justin Novosad ju...@google.com wrote:

 My point is, we need a proper litmus test for the "just do it in script"
 argument because, let's be honest, a lot of new features being added to
 the Web platform could be scripted efficiently, and that does not
 necessarily make them bad features.


Which ones?

Rob
-- 
Jtehsauts  tshaei dS,o n Wohfy  Mdaon  yhoaus  eanuttehrotraiitny  eovni
le atrhtohu gthot sf oirng iyvoeu rs ihnesa.rt sS?o  Whhei csha iids  teoa
stiheer :p atroa lsyazye,d  'mYaonu,r  sGients  uapr,e  tfaokreg iyvoeunr,
'm aotr  atnod  sgaoy ,h o'mGee.t  uTph eann dt hwea lmka'n?  gBoutt  uIp
waanndt  wyeonut  thoo mken.o w


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-31 Thread Glenn Maynard
On Fri, May 30, 2014 at 1:25 PM, Justin Novosad ju...@google.com wrote:

 I think this proposal falls short of enshrining.  The cost of adding this
 feature is minuscule.


I don't think the cost is ever really minuscule.




 True, you'd never want to use toDataURL with a compression operation that
 takes many seconds (or even tenths of a second) to complete, and data URLs
 don't make sense for large images in the first place.  It makes sense for
 toBlob(), though, and having the arguments to toBlob and toDataURL be
 different seems like gratuitous inconsistency.


 Yes, toBlob is async, but it can still be polyfilled.


(I'm not sure how this replies to what I said--this feature makes more
sense for toBlob than toDataURL, but I wouldn't add it to toBlob and not
toDataURL.)


On Sat, May 31, 2014 at 7:44 AM, Robert O'Callahan rob...@ocallahan.org
wrote:

 On Sat, May 31, 2014 at 3:44 AM, Justin Novosad ju...@google.com wrote:

 My point is, we need a proper litmus test for the "just do it in script"
 argument because, let's be honest, a lot of new features being added to
 the Web platform could be scripted efficiently, and that does not
 necessarily make them bad features.


 Which ones?


The ones that are used so frequently that providing a standard API for them
benefits everyone, by avoiding the fragmentation of everyone rolling their
own.  For example, URL parsing and manipulation, and lots of DOM interfaces
like element.closest(), element.hidden and element.classList.  (Cookies are
another one that should be in this category; document.cookie isn't a sane
API without a wrapper.)

This isn't one of those, though.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-31 Thread Rik Cabanier
On Sat, May 31, 2014 at 10:46 AM, Glenn Maynard gl...@zewt.org wrote:

 On Fri, May 30, 2014 at 1:25 PM, Justin Novosad ju...@google.com wrote:

  I think this proposal falls short of enshrining.  The cost of adding this
  feature is minuscule.
 

 I don't think the cost is ever really minuscule.


 
 
  True, you'd never want to use toDataURL with a compression operation
 that
  takes many seconds (or even tenths of a second) to complete, and data
 URLs
  don't make sense for large images in the first place.  It makes sense
 for
  toBlob(), though, and having the arguments to toBlob and toDataURL be
  different seems like gratuitous inconsistency.
 
 
  Yes, toBlob is async, but it can still be polyfilled.
 

 (I'm not sure how this replies to what I said--this feature makes more
 sense for toBlob than toDataURL, but I wouldn't add it to toBlob and not
 toDataURL.)


 On Sat, May 31, 2014 at 7:44 AM, Robert O'Callahan rob...@ocallahan.org
 wrote:

  On Sat, May 31, 2014 at 3:44 AM, Justin Novosad ju...@google.com
 wrote:
 
  My point is, we need a proper litmus test for the "just do it in script"
  argument because, let's be honest, a lot of new features being added to
  the Web platform could be scripted efficiently, and that does not
  necessarily make them bad features.
 
 
  Which ones?
 

 The ones that are used so frequently that providing a standard API for them
 benefits everyone, by avoiding the fragmentation of everyone rolling their
 own.  For example, URL parsing and manipulation, and lots of DOM interfaces
 like element.closest(), element.hidden and element.classList.  (Cookies are
 another one that should be in this category; document.cookie isn't a sane
 API without a wrapper.)

 This isn't one of those, though.


roc was asking which NEW feature is being added that can be done in script.


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-31 Thread Glenn Maynard
On Sat, May 31, 2014 at 4:00 PM, Rik Cabanier caban...@gmail.com wrote:

 roc was asking which NEW feature is being added that can be done in
 script.


He asked which new features have already been added that can be done
efficiently in script.  Element.closest() was added less than a week ago.

But again, image decoding *can't* be done efficiently in script:
platform-independent code with performance competitive with native SIMD
assembly is a thing of myth.  (People have been trying unsuccessfully to do
that since day one of MMX, so it's irrelevant until the day it actually
happens.)  Anyhow, I think I'll stop helping to derail this thread and
return to the subject.

Noel, if you're still around, I'd suggest fleshing out your suggestion by
providing some real-world benchmarks that compare the PNG compression rates
against the relative time it takes to compress.  If spending 10x the
compression time gains you a 50% improvement in compression, that's a lot
more compelling than if it only gains you 10%.  I don't know what the
numbers are myself.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-31 Thread Rik Cabanier
On Sat, May 31, 2014 at 4:06 PM, andreas@gmail.com wrote:

 Does SIMD support in JS change this equation?


Glenn is asking how much more compression there is to gain from this extra
parameter and how much extra processing it requires.

He's not asking how long it would take to do it in JavaScript. I would be
interested though :-)
PNG likely won't gain as much from SIMD as JPEG does.


  On May 31, 2014, at 18:58, Glenn Maynard gl...@zewt.org wrote:
 
  On Sat, May 31, 2014 at 4:00 PM, Rik Cabanier caban...@gmail.com
 wrote:
 
  roc was asking which NEW feature is being added that can be done in
  script.
 
  He asked which new features have already been added that can be done
  efficiently in script.  Element.closest() was added less than a week ago.
 
  But again, image decoding *can't* be done efficiently in script:
  platform-independent code with performance competitive with native SIMD
  assembly is a thing of myth.  (People have been trying unsuccessfully to
 do
  that since day one of MMX, so it's irrelevant until the day it actually
  happens.)  Anyhow, I think I'll stop helping to derail this thread and
  return to the subject.
 
  Noel, if you're still around, I'd suggest fleshing out your suggestion by
  providing some real-world benchmarks that compare the PNG compression
 rates
  against the relative time it takes to compress.  If spending 10x the
  compression time gains you a 50% improvement in compression, that's a lot
  more compelling than if it only gains you 10%.  I don't know what the
  numbers are myself.
 
  --
  Glenn Maynard



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-30 Thread Justin Novosad
Backtracking here.

The "just do it in script" argument saddens me quite a bit. :-(

I don't agree that it is okay to be in a state where web apps have to
depend on script libraries that duplicate the functionality of existing Web
APIs. I mean, we put a lot of effort into avoiding introducing
non-orthogonal APIs in order to keep the platform lean. In that sense it is
hypocritical to keep web APIs in a state that forces web developers to use
scripts that are non-orthogonal to web APIs.  The browser has a png
encoder, and it is exposed in the API.  So why should web developers be
forced to provide their own scripted codec implementation?!

I understand that we should not add features to the Web platform that can
be implemented efficiently in client-side code using existing APIs. But
where do we draw the line? An extreme interpretation of that argument would
be to stop adding any new features in CanvasRenderingContext2D because
almost anything can be polyfilled on top of putImageData/getImageData with
an efficient asm.js (or something else) implementation.  In fact, why do we
continue to implement any rendering features? Let's stop adding features to
DOM and CSS, because we could just have JS libraries that dump pixels into
canvases! Pwshh (mind blown)

My point is, we need a proper litmus test for the "just do it in script"
argument because, let's be honest, a lot of new features being added to
the Web platform could be scripted efficiently, and that does not
necessarily make them bad features.

Also, there are plenty of browser/OS/HW combinations for which it is
unreasonable to expect a scripted implementation of a codec to rival the
performance of a native implementation.  For example, browsers are not
required to support asm.js (which is kind of the point of it). More
generally speaking, asm.js or any other script performance boosting
technology, may not support the latest processing technology hotness that
may be used in browser implementations (SIMD instructions that aren't
mapped by the script compiler, CUDA, ASICs, PPUs, who knows...)

   -Justin



On Thu, May 29, 2014 at 8:54 PM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 5:34 PM, Nils Dagsson Moskopp 
 n...@dieweltistgarnichtso.net wrote:

   and time it takes to compress.
 
  What benefit does it give then if the result is the same perceptually?
 

 Time it takes to compress.  There's a big difference between waiting one
 second for a quick save and 60 seconds for a high-compression final export.


 On Thu, May 29, 2014 at 7:31 PM, Kornel Lesiński kor...@geekhood.net
 wrote:

  I don't think it's a no-brainer. There are several ways it could be
  interpreted:
 

 The API is a no-brainer.  That doesn't mean it should be done carelessly.
  That said, how it's implemented is an implementation detail, just like the
 JPEG quality parameter, though it should probably be required to never use
 lossy compression (strictly speaking this may not actually be required
 today...).

 FYI, I don't plan to spend much time arguing for this feature.  My main
 issue is with the "just do it in script" argument.  It would probably help
 for people more strongly interested in this to show a comparison of
 resulting file sizes and the relative amount of time it takes to compress
 them.

 --
 Glenn Maynard



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-30 Thread Rik Cabanier
On Fri, May 30, 2014 at 8:44 AM, Justin Novosad ju...@google.com wrote:

 Backtracking here.

 The "just do it in script" argument saddens me quite a bit. :-(

 I don't agree that it is okay to be in a state where web apps have to
 depend on script libraries that duplicate the functionality of existing Web
 APIs. I mean, we put a lot of effort into avoiding introducing
 non-orthogonal APIs in order to keep the platform lean. In that sense it is
 hypocritical to keep web APIs in a state that forces web developers to use
 scripts that are non-orthogonal to web APIs.  The browser has a png
 encoder, and it is exposed in the API.  So why should web developers be
 forced to provide their own scripted codec implementation?!

 I understand that we should not add features to the Web platform that can
 be implemented efficiently in client-side code using existing APIs. But
 where do we draw the line? An extreme interpretation of that argument would
 be to stop adding any new features in CanvasRenderingContext2D because
 almost anything can be polyfilled on top of putImageData/getImageData with
 an efficient asm.js (or something else) implementation.  In fact, why do we
 continue to implement any rendering features? Let's stop adding features to
 DOM and CSS, because we could just have JS libraries that dump pixels into
 canvases! Pwshh (mind blown)

 My point is, we need a proper litmus test for the "just do it in script"
 argument because, let's be honest, a lot of new features being added to
 the Web platform could be scripted efficiently, and that does not
 necessarily make them bad features.


Yes, we need to weigh the cost of implementing new features natively
against the cost of doing them in script. If it's a feature that is not
often requested and it can be done almost as efficiently in script (and
asynchronously!), I believe it should not be added to the platform.

When canvas was created, JS interpreters were slow so the decision to do it
natively was clear; that decision still makes sense today.
However, if in the future someone writes a complete canvas implementation
on top of WebGL and it is just as fast, memory efficient and reliable, we
should just freeze the current spec and tell people to use that library.


 Also, there are plenty of browser/OS/HW combinations for which it is
 unreasonable to expect a scripted implementation of a codec to rival the
 performance of a native implementation.  For example, browsers are not
 required to support asm.js (which is kind of the point of it). More
 generally speaking, asm.js or any other script performance boosting
 technology, may not support the latest processing technology hotness that
 may be used in browser implementations (SIMD instructions that aren't
 mapped by the script compiler, CUDA, ASICs, PPUs, who knows...)


Do you know of any browser that is not interested in making its JavaScript
interpreter faster and compatible with asm.js?
Note that we're talking about a new feature here so the argument that
asm.js is too slow in old browsers doesn't count :-)


  On Thu, May 29, 2014 at 8:54 PM, Glenn Maynard gl...@zewt.org wrote:

  On Thu, May 29, 2014 at 5:34 PM, Nils Dagsson Moskopp 
  n...@dieweltistgarnichtso.net wrote:
 
and time it takes to compress.
  
   What benefit does it give then if the result is the same perceptually?
  
 
  Time it takes to compress.  There's a big difference between waiting one
  second for a quick save and 60 seconds for a high-compression final
 export.
 
 
  On Thu, May 29, 2014 at 7:31 PM, Kornel Lesiński kor...@geekhood.net
  wrote:
 
   I don't think it's a no-brainer. There are several ways it could be
   interpreted:
  
 
  The API is a no-brainer.  That doesn't mean it should be done carelessly.
   That said, how it's implemented is an implementation detail, just like
 the
  JPEG quality parameter, though it should probably be required to never
 use
  lossy compression (strictly speaking this may not actually be required
  today...).
 
  FYI, I don't plan to spend much time arguing for this feature.  My main
  issue is with the "just do it in script" argument.  It would probably
 help
  for people more strongly interested in this to show a comparison of
  resulting file sizes and the relative amount of time it takes to compress
  them.
 
  --
  Glenn Maynard
 



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-30 Thread Anne van Kesteren
On Fri, May 30, 2014 at 5:44 PM, Justin Novosad ju...@google.com wrote:
 The "just do it in script" argument saddens me quite a bit. :-(

Agreed, however for this particular case, I'm not sure it makes much
sense to further enshrine a synchronous API for serializing an image.


-- 
http://annevankesteren.nl/


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-30 Thread Glenn Maynard
On Fri, May 30, 2014 at 12:46 PM, Anne van Kesteren ann...@annevk.nl
wrote:

 On Fri, May 30, 2014 at 5:44 PM, Justin Novosad ju...@google.com wrote:
  The "just do it in script" argument saddens me quite a bit. :-(

 Agreed, however for this particular case, I'm not sure it makes much
 sense to further enshrine a synchronous API for serializing an image.


True, you'd never want to use toDataURL with a compression operation that
takes many seconds (or even tenths of a second) to complete, and data URLs
don't make sense for large images in the first place.  It makes sense for
toBlob(), though, and having the arguments to toBlob and toDataURL be
different seems like gratuitous inconsistency.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-30 Thread Justin Novosad
On Fri, May 30, 2014 at 1:59 PM, Glenn Maynard gl...@zewt.org wrote:

 On Fri, May 30, 2014 at 12:46 PM, Anne van Kesteren ann...@annevk.nl
 wrote:

 On Fri, May 30, 2014 at 5:44 PM, Justin Novosad ju...@google.com wrote:
  The "just do it in script" argument saddens me quite a bit. :-(

 Agreed, however for this particular case, I'm not sure it makes much
 sense to further enshrine a synchronous API for serializing an image.


 I think this proposal falls short of enshrining.  The cost of adding this
feature is minuscule.


 True, you'd never want to use toDataURL with a compression operation that
  takes many seconds (or even tenths of a second) to complete, and data URLs
 don't make sense for large images in the first place.  It makes sense for
 toBlob(), though, and having the arguments to toBlob and toDataURL be
 different seems like gratuitous inconsistency.


Yes, toBlob is async, but it can still be polyfilled.


 --
 Glenn Maynard




Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Wed, May 28, 2014 at 10:36 PM, Noel Gordon noel.gor...@gmail.com wrote:

 canvas.toDataURL supports an optional quality argument for the
 “image/jpeg” mime type to control image compression. Developers have no
 control over “image/png” compression.

 “image/png” is a lossless image compression format and the proposal is to
 allow developers some control over the compression process. For example, a
 developer might request maximum compression once their art work is complete
 to minimize the encoded image size for transmission or storage. Encoding
 speed might be more important while creating the work, and less compression
 (faster encoding) could be requested in that case.

 An optional toDataURL parameter on [0.0 ... 1.0], similar to the optional
 quality argument used for image/jpeg, could be defined for “image/png” to
 control compression:

canvas.toDataURL(“image/png”, [compression-control-value]);

 The default value, and how the browser controls the image encoder to gain
 more compression with increasing values, would be internal implementation
 details of the browser.


This has been requested before, e.g.
http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
The conclusion was that this can be accomplished using JavaScript. There
are JS libraries that can compress images, and performance is very good
these days.

If you're worried about blocking the main thread, you can use workers to do
offline processing.


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Glenn Maynard
On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com wrote:

 This has been requested before. ie

http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
 The conclusion was that this can be accomplished using JavaScript. There
 are JS libraries that can compress images and performance is very good
 these days.


This is a nonsensical conclusion.  People shouldn't have to pull in a PNG
compressor and deflate code when a PNG compression API already exists on
the platform.  This is an argument against adding toDataURL at all, which
is a decision that's already been made.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Justin Novosad
On Thu, May 29, 2014 at 9:59 AM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com wrote:

  This has been requested before. ie
 
 http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
  The conclusion was that this can be accomplished using JavaScript. There
  are JS libraries that can compress images and performance is very good
  these days.
 

 This is a nonsensical conclusion.  People shouldn't have to pull in a PNG
 compressor and deflate code when a PNG compression API already exists on
 the platform.  This is an argument against adding toDataURL at all, which
 is a decision that's already been made.

 +1
I would add that the very existence of such libraries, despite the fact
that the platform provides a competing API, proves that the API is not what
it should be.

Also, an encoder written in JavaScript cannot produce color-managed results
because we do not have any APIs that expose color profiles. I am guessing
that PNG encoders written in JS probably assume that the data returned by
getImageData is in sRGB, which is often not the case.  toDataURL, on the
other hand, could encode into the PNG a color profile that expresses the
canvas backing store's color space. I know current implementations of
toDataURL don't do that, but we could and should.

   -Justin


 --
 Glenn Maynard



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 7:45 AM, Justin Novosad ju...@google.com wrote:

 On Thu, May 29, 2014 at 9:59 AM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com wrote:

  This has been requested before. ie
 
 http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
  The conclusion was that this can be accomplished using JavaScript. There
  are JS libraries that can compress images and performance is very good
  these days.
 

 This is a nonsensical conclusion.  People shouldn't have to pull in a PNG
 compressor and deflate code when a PNG compression API already exists on
 the platform.  This is an argument against adding toDataURL at all, which
 is a decision that's already been made.

 +1
 I would add that the fact that such libraries even exist despite the fact
 that the platform provides a competing API proves that the API is not what
 it should be.

 Also, an encoder written in JavaScript cannot produce color-managed
 results because we do not have any APIs that expose color profiles. I am
 guessing that png encoders written in JS probably assume that data returned
 by getImageData is in sRGB, which is often not the case.  toDataURL, on the
 other hand, has the possibility of encoding into the png, a color profile
 that expresses the canvas backing store's color space. I know current
 implementations of toDataURL don't do that, but we could and should.


I'm not sure we want to bake the device's color profile into the output
bitmap by default, because on re-import it will then go through color
management and its pixels will look different from the unmanaged canvas
ones.


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 6:59 AM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com wrote:

 This has been requested before. ie

 http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
 The conclusion was that this can be accomplished using JavaScript. There
 are JS libraries that can compress images and performance is very good
 these days.


 This is a nonsensical conclusion.  People shouldn't have to pull in a PNG
 compressor and deflate code when a PNG compression API already exists on
 the platform.  This is an argument against adding toDataURL at all, which
 is a decision that's already been made.


If performance is good, why would this not be acceptable?
It seems this would be a fragmented solution, as file formats and features
would be added to browser engines at different stages. Would there be a way
to feature-test whether the optional arguments are supported?


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Justin Novosad
On Thu, May 29, 2014 at 11:21 AM, Rik Cabanier caban...@gmail.com wrote:




 On Thu, May 29, 2014 at 7:45 AM, Justin Novosad ju...@google.com wrote:

 On Thu, May 29, 2014 at 9:59 AM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com
 wrote:

  This has been requested before. ie
 
 http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
  The conclusion was that this can be accomplished using JavaScript.
 There
  are JS libraries that can compress images and performance is very good
  these days.
 

 This is a nonsensical conclusion.  People shouldn't have to pull in a PNG
 compressor and deflate code when a PNG compression API already exists on
 the platform.  This is an argument against adding toDataURL at all, which
 is a decision that's already been made.

 +1
 I would add that the fact that such libraries even exist despite the fact
 that the platform provides a competing API proves that the API is not what
 it should be.

 Also, an encoder written in JavaScript cannot produce color-managed
 results because we do not have any APIs that expose color profiles. I am
 guessing that png encoders written in JS probably assume that data returned
 by getImageData is in sRGB, which is often not the case.  toDataURL, on the
 other hand, has the possibility of encoding into the png, a color profile
 that expresses the canvas backing store's color space. I know current
 implementations of toDataURL don't do that, but we could and should.


 I'm not sure if we want to bake in the device's color profile into the
 output bitmap by default because on re-import it will then go through color
 management and its pixels will look different from the unmanaged canvas
 ones.


I think you meant "encode" rather than "bake in" in the above sentence.
Correct?  Currently, the non-color-managed output of toDataURL has the
display profile baked in.

Take the following code:

var ctx = canvas.getContext('2d');
var image = new Image();
image.onload = function() { ctx.drawImage(image, 0, 0); };
image.src = canvas.toDataURL('image/png');

Under a non-color-managed implementation, the above code will not modify
the content of the canvas in any way, because there are no color space
conversions since the PNG is not color managed... All is good.  If
toDataURL encoded a color profile, the behavior would remain unchanged
because the color correction applied during the image decode would do
nothing (converting to and from the same color space). Again, all is good.

However, if the data URL were sent over the network to be decoded on a
different machine, then you are screwed with a non-color-managed PNG,
because the sender's display color profile is baked into the image but
there is no color profile metadata to allow the receiver to bring the
image into a known color space.

   -Justin


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 8:50 AM, Justin Novosad ju...@google.com wrote:




 On Thu, May 29, 2014 at 11:21 AM, Rik Cabanier caban...@gmail.com wrote:




 On Thu, May 29, 2014 at 7:45 AM, Justin Novosad ju...@google.com wrote:

 On Thu, May 29, 2014 at 9:59 AM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 1:32 AM, Rik Cabanier caban...@gmail.com
 wrote:

  This has been requested before. ie
 
 http://lists.whatwg.org/pipermail/help-whatwg.org/2013-May/001209.html
  The conclusion was that this can be accomplished using JavaScript.
 There
  are JS libraries that can compress images and performance is very good
  these days.
 

 This is a nonsensical conclusion.  People shouldn't have to pull in a
 PNG
 compressor and deflate code when a PNG compression API already exists on
 the platform.  This is an argument against adding toDataURL at all,
 which
 is a decision that's already been made.

 +1
 I would add that the fact that such libraries even exist despite the
 fact that the platform provides a competing API proves that the API is not
 what it should be.

 Also, an encoder written in JavaScript cannot produce color-managed
 results because we do not have any APIs that expose color profiles. I am
 guessing that png encoders written in JS probably assume that data returned
 by getImageData is in sRGB, which is often not the case.  toDataURL, on the
 other hand, has the possibility of encoding into the png, a color profile
 that expresses the canvas backing store's color space. I know current
 implementations of toDataURL don't do that, but we could and should.


 I'm not sure if we want to bake in the device's color profile into the
 output bitmap by default because on re-import it will then go through color
 management and its pixels will look different from the unmanaged canvas
 ones.


 I think you meant encode rather than bake in that above sentence.
 Correct?  Currently, the non-color managed output of toDataURL has the
 display profile baked in.

 Take the following code:

 var ctx = canvas.getContext('2d');
 var image = new Image();
 image.onload = function() { ctx.drawImage(image, 0, 0); };
 image.src = canvas.toDataURL('image/png');

 Under a non color managed implementation, the above code will not modify
 the content of the canvas in any way because there are no color space
 conversions since the png is not color managed... All is good.  If
 toDataURL encoded a color profile, the behavior would remain unchanged
 because the color correction applied during the image decode would do
 nothing (converting to and from the same color space). Again, all is good.


That's right.
The values of pixels on the canvas are the same on every machine, but we
have many different types of monitors. The PNGs that are generated should
all be identical pixel-wise, but their attached profiles might be different.


 However, if the data URL was to be sent over the network to be decoded on
 a different machine, then you are screwed with a non-color managed png,
 because the sender's display's color profile is baked-in to the image but
 there is no color profile meta data to allow the receiver to bring the
 image into a known color space.


You are screwed either way :-)
I think authors are going to be surprised that pixels will end up
different. (Imagine taking screenshots of achievements in a game that are
put in an online gallery.)
If you put the profile in and it is different from sRGB, the PNG will look
different from the original canvas, because you now go through an
intermediate sRGB space, which warps the color range.

As an example, I did a toDataURL of this codepen example:
http://codepen.io/Boshnik/pen/vFbgw
I then opened it up in Photoshop, attached my monitor profile and wrote a
small script that does a difference on them:
http://cabanier.github.io/BlendExamples/images.htm

If you run Windows, you will see that there's content in the canvas output.
For some reason, Mac doesn't do any color conversion in any browser.
Even on my own system, there's a difference because of the sRGB conversion:
http://cabanier.github.io/BlendExamples/screenshot.png


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 12:17 PM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 10:29 AM, Rik Cabanier caban...@gmail.com wrote:

 If performance is good, why would this not be acceptable?


  I don't know why we'd provide an API to compress PNGs, then tell people
 to use a script reimplementation if they want to set a common option.

 As far as performance, I'm not sure about PNG, but there's no way that a
 JS compressor would compete with native for JPEG.  Assembly (MMX, SSE)
 optimization gives a significant performance improvement over C, so I doubt
 JS will ever be in the running.  (
 http://www.libjpeg-turbo.org/About/Performance)


MMX and SSE are being addressed using asm.js.
We're also just dealing with screenshots here. I doubt people are going to
do toDataURL at 60fps.




 It seems that this would be a fragmented solution as file formats and
 features would be added at different stages to browser engines. Would there
 be a way to feature test that the optional arguments are supported?


 No more than any other new feature.  I don't know if feature testing for
 dictionary arguments has been solved yet (it's come up before), but if not
 that's something that needs to be figured out in general.



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 1:33 PM, Rik Cabanier caban...@gmail.com wrote:




 On Thu, May 29, 2014 at 12:17 PM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 10:29 AM, Rik Cabanier caban...@gmail.com
 wrote:

 If performance is good, why would this not be acceptable?


  I don't know why we'd provide an API to compress PNGs, then tell people
 to use a script reimplementation if they want to set a common option.

 As far as performance, I'm not sure about PNG, but there's no way that a
 JS compressor would compete with native for JPEG.  Assembly (MMX, SSE)
 optimization gives a significant performance improvement over C, so I doubt
 JS will ever be in the running.  (
 http://www.libjpeg-turbo.org/About/Performance)


 MMX, SSE is being addressed using asm.js.
 We're also just dealing with screenshots here. I doubt people are going to
 do toDataURL at 60fps.


Here's a link to an experiment:
http://multimedia.cx/eggs/playing-with-emscripten-and-asm-js/



  It seems that this would be a fragmented solution as file formats and
 features would be added at different stages to browser engines. Would there
 be a way to feature test that the optional arguments are supported?


 No more than any other new feature.  I don't know if feature testing for
 dictionary arguments has been solved yet (it's come up before), but if not
 that's something that needs to be figured out in general.






Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Glenn Maynard
On Thu, May 29, 2014 at 3:33 PM, Rik Cabanier caban...@gmail.com wrote:

 MMX, SSE is being addressed using asm.js.


Assembly language is inherently incompatible with the Web.

We already have an API for compressing images, and compression level is
an ordinary input to image compressors, yet you're arguing that rather than
add the option to the API we have, we should require people to bundle their
own image compressors and write MMX assembly on the Web to make it fast.
 Sorry if I think that's a bizarre argument...

We're also just dealing with screenshots here. I doubt people are going to
 do toDataURL at 60fps.


(I hope we can all see more use cases than just screenshots.)

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Boris Zbarsky

On 5/29/14, 5:13 PM, Glenn Maynard wrote:

Assembly language is inherently incompatible with the Web.


A SIMD API, however is not.  Under the hood, it can be implemented in 
terms of MMX, SSE, NEON, or just by forgetting about the SIMD bit and 
pretending like you have separate operations.  In particular, you could 
have a SIMD API that desugars to plain JS as the default implementation 
in browsers but that JITs can recognize and vectorize as they desire. 
This sort of API will happen, for sure.


No opinion on the PNG encoder thing.

-Boris


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Glenn Maynard
On Thu, May 29, 2014 at 4:21 PM, Boris Zbarsky bzbar...@mit.edu wrote:

 On 5/29/14, 5:13 PM, Glenn Maynard wrote:

 Assembly language is inherently incompatible with the Web.


 A SIMD API, however is not.  Under the hood, it can be implemented in
 terms of MMX, SSE, NEON, or just by forgetting about the SIMD bit and
 pretending like you have separate operations.  In particular, you could
 have a SIMD API that desugars to plain JS as the default implementation in
 browsers but that JITs can recognize and vectorize as they desire. This
 sort of API will happen, for sure.


I doubt it, at least with performance competitive with native assembly.  We
certainly shouldn't delay features while we hope for it.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Rik Cabanier
On Thu, May 29, 2014 at 2:28 PM, Glenn Maynard gl...@zewt.org wrote:

 On Thu, May 29, 2014 at 4:21 PM, Boris Zbarsky bzbar...@mit.edu wrote:

  On 5/29/14, 5:13 PM, Glenn Maynard wrote:
 
  Assembly language is inherently incompatible with the Web.
 
 
  A SIMD API, however is not.  Under the hood, it can be implemented in
  terms of MMX, SSE, NEON, or just by forgetting about the SIMD bit and
  pretending like you have separate operations.  In particular, you could
  have a SIMD API that desugars to plain JS as the default implementation
 in
  browsers but that JITs can recognize and vectorize as they desire. This
  sort of API will happen, for sure.
 

 I doubt it, at least with performance competitive with native assembly.  We
 certainly shouldn't delay features while we hope for it.


You don't need to hope for it. The future is already here:
http://www.j15r.com/blog/2014/05/23/Box2d_2014_Update
asm.js will be fast on all modern browsers before this feature would ship.
As an author, I'd certainly prefer the most flexible solution that works
everywhere.


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Glenn Maynard
On Thu, May 29, 2014 at 4:50 PM, Rik Cabanier caban...@gmail.com wrote:

 You don't need to hope for it. The future is already here:
 http://www.j15r.com/blog/2014/05/23/Box2d_2014_Update
 asm.js will be fast on all modern browsers before this feature would ship.
 As an author, I'd certainly prefer the most flexible solution that works
 everywhere.


I don't have the time to read all of this, but it doesn't seem to have
anything to do with SIMD instruction sets (which are notoriously difficult
to generalize).

Anyway, this has derailed the thread.  We have an API for compression
already.  It already supports a compression level argument for JPEG.
 Having an equivalent argument for PNG is a no-brainer.  The only
difference from JPEG is that it should be described as the compression
level rather than the quality level, since with PNG it has no effect on
quality, only on the file size and the time it takes to compress.

-- 
Glenn Maynard


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Nils Dagsson Moskopp
Glenn Maynard gl...@zewt.org writes:

 We have an API for compression already.  It already supports a
 compression level argument for JPEG.  Having an equivalent argument
 for PNG is a no-brainer.  The only difference to JPEG is that it
 should be described as the compression level rather than quality
 level, since with PNG it has no effect on quality, only the file size
 and time it takes to compress.

What benefit does it give, then, if the result is perceptually the same?

-- 
Nils Dagsson Moskopp // erlehmann
http://dieweltistgarnichtso.net


Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Kornel Lesiński
On 29.05.2014, at 23:19, Glenn Maynard gl...@zewt.org wrote:
 
 Anyway, this has derailed the thread.  We have an API for compression
 already.  It already supports a compression level argument for JPEG.
 Having an equivalent argument for PNG is a no-brainer.  The only
 difference to JPEG is that it should be described as the compression
 level rather than quality level, since with PNG it has no effect on
 quality, only the file size and time it takes to compress.

I don't think it's a no-brainer. There are several ways it could be interpreted:


1. As zlib's compression level

However, this has marginal utility, because these days even the maximum level,
even on mobile devices, is reasonably fast. A lower level would be useful only
for very large images on very slow devices, but UAs can have a good heuristic
for ensuring reasonable compression time without any input from the page's
author.

I expect the exponential increase in computing power to make this setting
completely irrelevant by the time it's implemented in most browsers.


2. Enable brute-force search for best combinations of zlib's compression level, 
memory level and window size 

OptiPNG and pngcrush show that maximum settings in zlib don't always give
the smallest file, and the best compression is obtained by trying hundreds
of combinations of zlib parameters.

If browsers choose this approach for a high compression level, it will be a
couple of *orders of magnitude* slower than the first option. If different
vendors don't agree on the order of magnitude of time it takes to compress
an image, such a parameter could be unusable.


3. Compression parameters in other gzip implementations

For example, the Zopfli compressor produces files smaller than zlib, but is
much, much slower. Instead of a 1-9 scale, it takes a number of iterations
as the compression level.

And it can even use a totally different approach to the compression level:
I've modified Zopfli[1] to make it aim for constant processing time on any
machine. Faster machines will just produce smaller files. Browsers could use
this approach to ensure every PNG is compressed in under 0.5s or so, or the
compression level parameter could be the number of seconds to spend on the
compression.


And that's just for lossless PNG. It's possible to encode standard PNG in a
*lossy* fashion (http://pngmini.com/lossypng.html), and there are a few ways
to do it:

Images can be converted to PNG-8 (vector quantization is a form of lossy
compression), and then the compression level could be interpreted as the
number of unique colors or the mean square error of the quantized image (the
latter option is used by http://pngquant.org). This generally makes files
3-4 times smaller, but has a limit on the maximum quality that can be
achieved.

For higher quality, it's possible to make truecolor PNG lossy by taking
advantage of the fact that PNG filters are predictors. Instead of writing
all pixels as they are in the input image, the encoder can replace some
pixels with values matching the filters' prediction. This simplifies the
data and generally halves the file size (and costs almost no extra CPU
time). The threshold used to choose between source and predicted values for
pixels acts similarly to JPEG's quality level.

So there are multiple ways such a parameter can be interpreted, and it can
result in wildly different visual quality, file size, and time taken to
compress the image.

-- 
regards, Kornel

[1] https://github.com/pornel/zopfli



Re: [whatwg] Proposal: toDataURL “image/png” compression control

2014-05-29 Thread Glenn Maynard
On Thu, May 29, 2014 at 5:34 PM, Nils Dagsson Moskopp 
n...@dieweltistgarnichtso.net wrote:

  and time it takes to compress.

 What benefit does it give then if the result is the same perceptually?


Time it takes to compress.  There's a big difference between waiting one
second for a quick save and 60 seconds for a high-compression final export.


On Thu, May 29, 2014 at 7:31 PM, Kornel Lesiński kor...@geekhood.net
wrote:

 I don't think it's a no-brainer. There are several ways it could be
 interpreted:


The API is a no-brainer.  That doesn't mean it should be done carelessly.
 That said, how it's implemented is an implementation detail, just like the
JPEG quality parameter, though it should probably be required to never use
lossy compression (strictly speaking this may not actually be required
today...).

FYI, I don't plan to spend much time arguing for this feature.  My main
issue is with the "just do it in script" argument.  It would probably help
for people more strongly interested in this to show a comparison of
resulting file sizes and the relative amount of time it takes to compress
them.

-- 
Glenn Maynard