So I committed *basic* derivative code support for oggHandler in r54550
(more solid support on the way).

Based on input from the w...@home thread, here are the updated target
qualities, expressed via the Firefogg API to ffmpeg2theora.

Also, j^ was kind enough to run these settings on some sample input
files at http://firefogg.org/j/encoding_samples/, so you can check them
out there.

We want to target 400px wide for the "web stream" to be consistent with
archive.org, which encodes mostly to 400x300 (although their 16:9
material can be up to 530 wide) ...

Updated the MediaWiki Firefogg integration and the stand-alone encoder
app with these default transcode settings in r54552 & r54554. (Should be
pushed out to http://firefogg.org/make shortly ... or can be run @home
with a trunk checkout at:
/js2/mwEmbed/example_usage/Firefogg_Make_Advanced.html)

Anyway, on to the settings:

$wgDerivativeSettings[ WikiAtHome::ENC_SAVE_BANDWITH ] =
        array(
            'maxSize'      => '200',
            'videoBitrate' => '164',
            'audioBitrate' => '32',
            'samplerate'   => '22050',
            'framerate'    => '15',
            'channels'     => '1',
            'noUpscaling'  => 'true'
        );
$wgDerivativeSettings[ WikiAtHome::ENC_WEB_STREAM ] =
        array(
            'maxSize'      => '400',
            'videoBitrate' => '544',
            'audioBitrate' => '96',
            'noUpscaling'  => 'true'
        );
$wgDerivativeSettings[ WikiAtHome::ENC_HQ_STREAM ] =
        array(
            'maxSize'      => '1080',
            'videoQuality' => 6,
            'audioQuality' => 3,
            'noUpscaling'  => 'true'
        );
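For a rough sense of what the two bitrate-capped profiles above cost on the wire, here's a quick back-of-the-envelope sketch (my assumption: videoBitrate/audioBitrate are kbit/s, and this ignores Ogg container overhead, so real files will run slightly larger):

```python
# Rough download cost per minute for the two bitrate-capped profiles above.
# Assumes videoBitrate / audioBitrate are in kbit/s (1 kbit = 1000 bits)
# and ignores Ogg container overhead.
profiles = {
    'ENC_SAVE_BANDWITH': {'videoBitrate': 164, 'audioBitrate': 32},
    'ENC_WEB_STREAM':    {'videoBitrate': 544, 'audioBitrate': 96},
}

mb_per_minute = {}
for name, p in profiles.items():
    total_kbps = p['videoBitrate'] + p['audioBitrate']
    mb_per_minute[name] = total_kbps * 60 / 8 / 1000  # kbit/s -> MB per minute
    print(f"{name}: {total_kbps} kbit/s total, ~{mb_per_minute[name]:.2f} MB/min")
```

So the save-bandwidth profile is roughly 1.5 MB per minute of video, and the web stream roughly 4.8 MB per minute.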

--michael


Brion Vibber wrote:
> On 8/3/09 9:56 PM, Gregory Maxwell wrote:
> [snip]
>   
>> Based on 'what other people do' I'd say the low should be in the
>> 200kbit-300kbit/sec range.  Perhaps taking the high up to a megabit?
>>
>> There are also a lot of very short videos on Wikipedia where the whole
>> thing could reasonably be buffered prior to playback.
>>
>>
>> Something I don't have an answer for is what resolutions to use. The
>> low should fit on mobile device screens.
>>     
>
> At the moment the defaults we're using for Firefogg uploads are 400px 
> width (e.g., 400x300 or 400x225 for the most common aspect ratios) 
> targeting a 400kbps bitrate. IMO at 400kbps at this size things don't 
> look particularly good; I'd prefer a smaller size/bitrate for 'low' and 
> higher size/bitrate for "medium" qual.
>
>
>  From sources I'm googling up, looks like YouTube is using 320x240 for 
> low-res, 480x360 h.264 @ 512kbps+128kbps audio for higher-qual, with 
> 720p h.264 @ 1024kbps+232kbps audio available for some HD videos.
>
> http://www.squidoo.com/youtubehd
>
> These seem like pretty reasonable numbers to target; offhand I'm not 
> sure of the bitrates used for the low-res version, but I think that's 
> with older Flash codecs anyway, so not as directly comparable.
>
> Also, might we want different standard sizes for 4:3 vs 16:9 material?
>
> Perhaps we should wrangle up some source material and run some test 
> compressions to get a better idea what this'll look like in practice...
>
>   
>> Normally I'd suggest setting
>> the size based on the content: Low motion detail oriented video should
>> get higher resolutions than high motion scenes without important
>> details. Doubling the number of derivatives in order to have a large
>> and small setting on a per article basis is probably not acceptable.
>> :(
>>     
>
> Yeah, that's way tougher to deal with... Potentially we could allow some 
> per-file tweaks of bitrates or something, but that might be a world of 
> pain. :)
>
>   
>> As an aside— downsampled video needs some makeup sharpening like
>> downsampled stills will. I'll work on getting something in
>> ffmpeg2theora to do this.
>>     
>
> Woohoo!
>
>   
>> There is also the option of decimating the frame-rate. Going from
>> 30fps to 15fps can make a decent improvement for bitrate vs visual
>> quality but it can make some kinds of video look jerky. (Dropping the
>> frame rate would also be helpful for any CPU starved devices)
>>     
>
> 15fps looks like crap IMO, but yeah for low-bitrate it can help a lot. 
> We may wish to consider that source material may have varying frame 
> rates, most likely to be:
>
> 15fps - crappy low-res stuff found on internet :)
> 24fps / 23.98 fps - film-sourced
> 25fps - PAL non-interlaced
> 30fps / 29.97 fps - NTSC non-interlaced or many computer-generated vids
> 50fps - PAL interlaced or PAL-compat HD native
> 60fps / 59.94fps - NTSC interlaced or HD native
>
> And of course those 50 and 60fps items might be encoded with or without 
> interlacing. :)
>
> Do we want to normalize everything to a standard rate, or maybe just cut 
> 50/60 to 25/30?
>
> (This also loses motion data, but not as badly as decimation to 15fps!)
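The "cut 50/60 to 25/30" rule above could be sketched like this (a hypothetical helper, not anything in the codebase; it just halves doubled rates and leaves everything else alone):

```python
# Sketch of the rate-normalization rule discussed above: halve the
# interlaced/HD-native rates (50 and 60 fps families) down to 25/30,
# and pass every other common source rate through untouched.
def target_framerate(source_fps):
    """Return the derivative's frame rate for a given source rate."""
    if source_fps > 30.0:
        return source_fps / 2.0
    return source_fps

print(target_framerate(60.0))   # -> 30.0
print(target_framerate(24.0))   # -> 24.0
```

This keeps film and PAL/NTSC progressive material untouched while avoiding the jerkiness of decimating everything to 15fps.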
>
>   
>> This brings me to an interesting point about instant gratification:
>> Ogg was intended from day one to be a streaming format. This has
>> pluses and minuses, but one thing we should take advantage of is that
>> it's completely valid and well supported by most software to start
>> playing a file *as soon* as the encoder has started writing it. (If
>> software can't handle this it also can't handle icecast streams).
>> This means that so long as the transcode process is at least realtime
>> the transcodes could be immediately available.   This would, however,
>> require that the derivative(s) be written to an accessible location.
>> (and you will likely have to arrange so that a content-length: is not
>> sent for the incomplete file).
>>     
>
> Ooooh, good points all. :D Tricky but not impossible to implement.
>
> -- brion
>
> _______________________________________________
> Wikitech-l mailing list
> [email protected]
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

