Changing a crucial element of the fifth most popular website on the
Internet may be a good example for an article about [[PITA]] :D

While I'm open to new formats/technologies, I think a fundamental
prerequisite is widespread support among browsers/clients. JS-only
support would severely degrade the user experience for high-latency users
and destroy it for non-JS users.

Also, we currently do not have enough resources to pioneer new formats and,
furthermore, we are bound, per our mission, to serve our content in the
most accessible way.

Vito

2017-12-04 23:09 GMT+01:00 Brian Wolff <[email protected]>:

> An encode latency of 7 seconds and a decode latency of 1 second aren't what
> I would call "very fast". (The decode latency measurement probably isn't
> realistic, as decoding and re-encoding to PNG is different from just
> displaying in the browser. Then again, all these measurements are going to
> vary by file size, not to mention on low-power devices like phones.)
>
> If it indeed takes 1 second to decode, that's probably more time than the
> space savings would gain, time-wise, on a moderate-speed internet
> connection.
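This trade-off can be sketched numerically. The 708 KByte saving and the ~1 second decode time are the figures reported later in this thread; the connection speeds are assumed examples, not measurements:

```python
# Back-of-the-envelope: does a smaller download pay for a slower decode?
# Figures from the thread: ~708 KB saved, ~1 s client-side decode.
# The connection speeds below are assumed examples.

def break_even(bytes_saved: int, decode_seconds: float, bps: float) -> float:
    """Net time saved (positive) or lost (negative) per page view."""
    transfer_saved = bytes_saved * 8 / bps  # seconds of download avoided
    return transfer_saved - decode_seconds

saved = 708 * 1024  # bytes saved in the example above

for label, bps in [("2 Mbit/s (slow)", 2e6),
                   ("10 Mbit/s (moderate)", 10e6),
                   ("50 Mbit/s (fast)", 50e6)]:
    net = break_even(saved, 1.0, bps)
    print(f"{label}: net {net:+.2f} s per view")
```

On the assumed moderate and fast links, the one-second decode indeed outweighs the transfer saving; only the slow link comes out ahead, which is the point being made here.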
>
> And time is only one metric. Often image encoding is limited by memory.
>
>
> Anyways, I'd be opposed to this unless the bandwidth saving was extreme.
> Wikipedia is not the place to be a testing ground for experimental image
> formats that browsers aren't even supporting.
>
>
> --
> bawolff
>
> P.S. Lossless rotation/cropping of JPEGs is actually very common due to
> Rotatebot
>
> On Monday, December 4, 2017, Ruben Kelevra <[email protected]> wrote:
> > Hey Thiemo,
> >
> > On 04.12.2017 10:43, Thiemo Kreuz wrote:
> >> I consider myself an image file format nerd, so thanks a lot for
> >> sharing this! FLIF was new to me.
> > Don't mention it! :)
> >
> >> I would like to share two important notes:
> >>
> >> 1. Unfortunately the flif.info website does not say a word about the
> >> CPU resources their current implementation burns when converting,
> >> let's say, a PNG to FLIF.
> > Well, currently it's single-core and not optimized at all. You may know
> > CABAC encoding from x264/x265, so it's slow, but not dead slow.
> >
> > The website doesn't mention it because it depends heavily on the source
> > material as well as on an encoder setting named "effort".
> >
> > Currently, the default effort is 60. I tried 100 a lot, but there's
> > nearly nothing to gain, so I assume we always want to use the default. :)
> >
> > PNG encoding of today's featured picture[1] at a medium web image size,
> > 1,280×868 pixels, opened in GIMP, converted to sRGB and exported with
> > maximum compression and no optional checkboxes ticked, takes GIMP
> > 3 seconds to write to the hard disk.
> >
> > Transcoding this PNG to FLIF on my machine ([email protected]; 12
> > GB RAM) takes:
> > real 0m7,405s
> > user 0m7,320s
> > sys 0m0,053s
> >
> > Decoding the file back to PNG on the same machine (with FLIF) takes:
> > real 0m1,077s
> > user 0m1,067s
> > sys 0m0,007s
> >
> > For comparison: in this case we save 708 KByte compared to the PNG, and
> > the PNG re-exported from the FLIF is just 14 KByte bigger than the one
> > created by GIMP.
> >
> >> It's pretty important to realize that CPU
> >> resources are even more valuable than storage space and network
> >> bandwidth. Sure, it's not really possible to come up with an exact
> >> threshold. But if, let's say, converting a PNG to FLIF saves 100 KiB,
> >> but takes a minute, this minute will never pay off.
> > So my point was more: get rid of this bloody "download a JPG, do some
> > stuff & upload a JPG that has been locally re-saved three times" cycle
> > being called an improvement.
> >
> >> If you follow the discussions on Wikimedia Commons, you will find this
> >> argument quite often: Downloading PNGs, optimizing them, and uploading
> >> them again is practically never worth it. Think of all the resources
> >> that are burned to do this: CPU time, download and upload, database
> >> storage and time, disk storage for the new revision, and not to forget
> >> the user doing all this.
> > Yep, but enabling FLIF for new files would do nothing to the old data at
> > all. I don't want to start a discussion about converting petabytes of
> > data, but all new revisions should be done in FLIF, if you ask me.
> >> Sure, your suggestion avoids a lot of this. But the CPUs involved will
> >> experience heavy load, both on the server as well as clients that need
> >> to recode FLIF files via a JavaScript library first.
> > FLIF is very fast to decode via JavaScript, and in general, as the
> > example above shows, it takes just 1 second to decode and re-encode a
> > medium-size image as PNG with just one thread on a pretty outdated
> > notebook, with an unoptimized decoder and encoder. :)
> >
> > Try adding a FLIF to a website and test whether the page loads any
> > slower with the FLIF ... at the small image sizes you get in articles,
> > the performance impact is negligible and comparable to loading a font
> > file in the browser.
> >> 2. Lossy file formats like JPEG should never be converted to lossless
> >> formats. This will actually decrease quality (over time). The problem
> >> is that the image data will still contain the exact same JPEG
> >> artifacts, but the fact that it was a JPEG (and how it was encoded) is
> >> lost.
> > Yes, but those images should never be saved as JPG in the first place.
> > Even on Android, RAW photography is becoming a thing. FLIF just started
> > to get rudimentary RAW capabilities, which means you can convert the
> > camera's RAW file to a FLIF and upload it without any loss in quality.
> >
> >> No tool specialized in squeezing the maximum quality out of
> >> lossy JPEGs can work anymore. And there are a lot of super-awesome
> >> tools like this. Not only tools like JPEGCROP and such that can cut
> >> and even combine JPEGs without actually recoding them.
> > Well, if you really want to start a discussion about lossless rotation
> > of JPG files, let me guess how many uploads are rotated losslessly...
> > 0.0002%?
> >
> >> There are also "JPEG repair" filters that know how to reverse the JPEG
> >> encoding algorithm and can squeeze out a tiny little bit of extra
> >> information regular JPEG decoders can't see.
> > Great! Just recommend this for new uploads where the source material is
> > JPG: let people convert it with such a tool to PNG and then to FLIF, or
> > ask the FLIF maintainers whether they want to add this as an import
> > filter for FLIF itself! :)
> >
> > But your argument was: "Think of all the resources that are burned
> > to do this: CPU time, download and upload, database storage and time,
> > disk storage for the new revision, and not to forget the user doing all
> > this."
> >
> > Which fits perfectly here too. On a 20-megapixel picture, small JPG
> > artifacts are no issue at all, but users who download the JPG and
> > upload it again after doing some needed work on it, like cropping or
> > color enhancement, are a problem. Each version gains more artifacts,
> > and we get a constant loss of quality with each revision...
> >
> > I don't think we should have accepted JPGs in the first place. :)
> >
> >> This said, I'm all for adding FLIF to the list of allowed file formats!
> > Wonderful, I hope I get some feedback from you on the things I pointed
> > out! :)
> > If you want to leave a note on the GIMP/eog enhancement requests, feel
> > free to do so:
> > https://bugzilla.gnome.org/buglist.cgi?quicksearch=flif
> >
> > [1] File:Umschreibung by Olafur Eliasson, Munich, December 2016 -04.jpg
> >
> > Best regards
> >
> >
> > Ruben
> >
> > _______________________________________________
> > Wikitech-l mailing list
> > [email protected]
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
