On Friday, 17 March 2017 at 16:46:23 CET, Jean-Luc Lacroix wrote:
> "Google has developed and open-sourced a new JPEG algorithm that reduces
> file size by about 35 percent"
> 
> https://arstechnica.com/information-technology/2017/03/google-jpeg-guetzli-encoder-file-size/
> 
> Would DT benefit from it?
> 
Did you pick up the bit about it needing 300 MB of memory per Mpix of the original image?
For me that would mean at least 7 GB per image, just for the compression...
(That 300 MB/Mpix figure is mentioned in the GitHub repository here:
https://github.com/google/guetzli/ , linked from the article you cited.)
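The 7 GB figure follows directly from that 300 MB/Mpix number; a quick back-of-the-envelope check, assuming a 24 Mpix raw file (my assumption for illustration, not a figure from the Guetzli docs):

```python
# Rough memory estimate based on Guetzli's reported ~300 MB per megapixel.
# The 24 Mpix image size is an assumption for illustration; adjust for your camera.
mb_per_mpix = 300
megapixels = 24
required_gb = mb_per_mpix * megapixels / 1024
print(f"~{required_gb:.1f} GB needed to encode one {megapixels} Mpix image")
# prints "~7.0 GB needed to encode one 24 Mpix image"
```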

In addition, it seems to be slower than current JPEG encoders, and I thought the colour
quality in the "eye" example was worse than in the traditional JPEG.

I'd suggest waiting a little bit longer.

Remco

____________________________________________________________________________
darktable user mailing list
to unsubscribe send a mail to [email protected]