Very impressive, amazing progress for a part-time project!
That's interesting that iOS supports M-JPEG; I had not heard that before.
Per M-JPEG in the Wikipedia app ... They have the WKWebView web view in iOS 8 and above,
no? So in theory we could run the JS engine against the same subset of iOS devices
at
This is a fair assessment of the challenges / divergent code bases.
In terms of a path forward, I think it's worth highlighting how the Kaltura
player normally integrates with other stand-alone entity providers nowadays. We
normally integrate via a media proxy library that basically normalizes
Amazing work. Added a bug to integrate this into the TMH player.
https://bugzilla.wikimedia.org/show_bug.cgi?id=61823
I can't imagine anyone being against Flash to deliver free formats!
—michael
On Feb 23, 2014, at 5:45 PM, Brion Vibber bvib...@wikimedia.org wrote:
In case anybody's interested but
On 05/30/2013 06:28 PM, Ryan Kaldari wrote:
OK, I decided to be slightly bold. I changed the modal video threshold
on en.wiki from 200px to 800px. This means all video thumbnails that
are 800px or smaller will open a modal player when you click on the
thumbnail. If there are no complaints from
On 04/11/2013 10:48 AM, Quim Gil wrote:
I'm just trying to be consistent: a GSoC project can't force the
agenda of a Wikimedia project.
I am also conservative when it comes to managing GSoC students' expectations.
These bug reports have been open for years, and I don't want to
guarantee to a GSoC
Yes, all that changed is that we added support for audio derivatives. We have
not enabled MP3 or AAC. The same code can be used for FLAC, Ogg, or
whatever we configure.
On Feb 3, 2013 2:33 AM, Yuvi Panda yuvipa...@gmail.com wrote:
Just to be sure that I'm reading this right - nothing actually changed
+correct content-type this time ;) Note this has already been merged,
but it is still worth mentioning for visibility.
On 2/1/13 12:10 PM, Michael Dale wrote:
We are about to merge in support for audio derivatives to Timed Media
Handler (TMH). The big value here, I think, is encoding to AAC or MP3
On 12/13/2012 12:38 PM, Brion Vibber wrote:
It's much, MUCH easier for us to flip the H.264 switch... there are
ideological reasons we might not want to, but we're going to have to put
the effort into making those player apps if we want all our data accessible
to everyone.
+1, it's non-trivial
On 12/13/2012 04:56 PM, Brion Vibber wrote:
On Thu, Dec 13, 2012 at 10:38 AM, Brion Vibber bvib...@wikimedia.org wrote:
On Wed, Dec 12, 2012 at 2:50 PM, Rob Lanphier ro...@wikimedia.org wrote:
I was able to play the WebM file of the locomotive on the front page
of
As Brion points out, we get much better coverage. I enabled H.264
locally and ran through a set of Android, iOS, and desktop browsers I had
available at the time:
http://www.mediawiki.org/wiki/Extension:TimedMediaHandler/Platform_testing
Pro H.264:
* No one is proposing turning off WebM, an
On 10/28/12 11:41 PM, Rob Lanphier wrote:
Hi everyone,
Assuming we get the last blocking bugs fixed tomorrow, then we should
be able to go onto Commons on Wednesday, so that's our current plan.
Let us know if there are issues with this.
Thanks!
Rob
Thanks for the update Rob.
I did not see
On 06/18/2012 04:52 PM, Brion Vibber wrote:
On Mon, Jun 18, 2012 at 4:44 PM, David Gerard dger...@gmail.com wrote:
On 19 June 2012 00:30, Brion Vibber br...@pobox.com wrote:
warning: patent politics question may lead to offtopic bikeshedding
Additionally there's the question of adding H.264
You will want to put it into a jobQueue. You can take a look at the Timed
Media Handler extension for how post-upload, processor-intensive
transformations can be handled.
--michael
On 05/04/2012 04:58 AM, emw wrote:
Hi all,
For a MediaWiki extension I'm working on (see
Thanks Brion (and Erik) for bringing chunked uploading closer to
fruition. +Jan, can you help out with documenting the API? I will take a
pass at it as well when I get a chance ;)
--michael
On 04/17/2012 03:37 PM, Brion Vibber wrote:
I've started adding some documentation on chunked
On 03/20/2012 03:15 AM, David Gerard wrote:
We should definitely be able to ingest H.264. (This has been on the
wishlist forever and is a much harder problem than it sounds.)
Once TMH is deployed, practically speaking, upload to YouTube then
import to Commons will probably be the easiest
On 03/19/2012 06:24 PM, Brion Vibber wrote:
In theory we can produce a configuration with TimedMediaHandler to produce
both H.264 and Theora/WebM transcodes, bringing Commons media to life for
mobile users and Apple and Microsoft browser users.
What do we think about this? What are the pros and
There was the Add Media Wizard project from a while back (that sounds
similar to what you describe):
http://www.mediawiki.org/wiki/Extension:Add_Media_Wizard
I wanted to take a look at integrating Upload Wizard into it post-TMH
deployment, and/or something new could be built as a gadget as
In terms of a db schema friendly to chunks, we would probably want
another table, and to associate it with a stashed file.
Russ was discussing adding support for appending files within the Swift
object store application logic, so we may not have to be concerned with
storing chunk references in the
Thanks for this thread. Please do commit fixes for 1.17. If it is not obvious
already, I have really only been targeting trunk. While the extension has
been around since before 1.16, it would be very complicated to restore
the custom resource loader it was using before there was a resource
loader in
I recommend using the static binaries hosted on Firefogg, or if you want
to compile it yourself, using the build tools provided there:
http://firefogg.org/nightly/
Also, I would suggest you take a look at TimedMediaHandler as an
alternative to oggHandler; it has a lot more features, such as WebM,
On 07/06/2011 03:04 PM, Brion Vibber wrote:
Some of you may have found that ResourceLoader's bundled minified
JavaScript loads can be a bit frustrating when syntax errors creep into your
JavaScript code -- not only are the line numbers reported in your browser of
limited help, but a broken
For the TimedMediaHandler I was adding more fine-grained control over
background processes [1] and ran into a Unix issue around getting both a
pid and exit status for a given background shell command.
Essentially, with a background task I can get the pid or the exit status
but can't seem to get
On 06/14/2011 05:41 PM, Platonides wrote:
Do you want the command to be run asynchronously or not?
If you expect the status code to be returned by wfShellExec(), then the
process will obviously have finished and there's no need for the PID.
OTOH if you launch it as a background task, you will
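One shell-level workaround for the pid-vs-exit-status dilemma above (a sketch, not the actual TMH code): background a subshell that writes the command's own exit status to a file, so the caller gets the pid immediately from $! and can collect the status later.

```shell
# Sketch: get both the pid and, later, the exit status of a background command.
# 'false' stands in for the real encode command; the status file path is illustrative.
status_file=$(mktemp)
( false; echo $? > "$status_file" ) &
pid=$!                      # pid of the wrapper subshell, available immediately
wait "$pid"                 # a non-blocking caller could instead poll the status file
status=$(cat "$status_file")
echo "pid=$pid exit status=$status"
rm -f "$status_file"
```

Here wait blocks only for the wrapper subshell; while the job runs, the caller can check liveness with kill -0 "$pid" and read the status file once it appears.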
On 06/04/2011 06:43 PM, David Gerard wrote:
A question that wasn't clear from reading the bug: why is reading a
file format (WebM) blocked on the entire Timed Media Handler?
It would be complicated to support WebM without an improved player and
transcoding support. All the IE users for example
On 06/01/2011 08:28 AM, Chad wrote:
I don't think 'revert in 72 hours if it's unreviewed' is a good idea. It
just discourages people from contributing to areas in which we only
have one reviewer looking at code.
I *do* think we should enforce a 48hr 'revert if broken' rule. If you
can't be
Sorry for the re-post (having trouble with the wikitech-l list post
email migration :(
I would also be interested in discussing this in Berlin or otherwise ;)
I can offer some notes about video embedding inline:
On 04/29/2011 03:30 PM, Brion Vibber wrote:
Enhanced media player goodies like
On 04/15/2011 12:07 PM, Brion Vibber wrote:
Unexercised code is dangerous code that will break when you least expect it;
we need to get code into use fast, where it won't sit idle until we push it
live with a thousand other things we've forgotten about.
Translate wiki deserves major props for
Very cool. Especially given the development trajectory of Ace to become
the Eclipse of web IDEs, there will be a lot of interesting
possibilities, as we could develop our own MediaWiki-centric plugins for
the platform.
I can't help but think about where this is ideally headed ;)
A Gitorious-type
I had a bit of a documentation challenge approaching the problem of
writing PHPUnit tests for extensions, mostly because many of the
extensions do this very differently and the manual did not have any
recommendations.
It appears many extensions have custom bootstrapping code (somewhat hacky
path
On 04/04/2011 02:20 PM, Platonides wrote:
Michael Dale wrote:
Eventually it would be ideal to be able to 'just test your extension'
from the core bootstrapper (i.e. dynamically generate our suite.xml and
namespace the registration of extension tests) ... but for now, at least
not having to wait
On 04/02/2011 04:08 PM, Ryan Kaldari wrote:
2. Creating The Complete Idiot's Guide to Writing MediaWiki Extensions
and The Complete Idiot's Guide to Writing MediaWiki Gadgets (in jQuery)
+1 ... Beyond the guide, we could win a lot by centralising some of the
scripts and libraries on
On 04/03/2011 10:56 AM, Brion Vibber wrote:
Harder, but very interesting in the medium to long-term:
We would do well to survey and analyse other gadget, widget, and
add-on systems and communities that exist in web platforms. Not to
say that Wikipedia's needs are the same, just that there are
On 03/24/2011 04:45 AM, Joseph Roberts wrote:
Actually, looking through OggHandler, I do think that developing a
separate entity may work well.
I'm not quite sure what is wanted by the general public and would like
to do what is wanted by the majority, not just what would be easiest or
even
http://blogs.msdn.com/b/ie/archive/2011/03/16/html5-video-update-webm-for-ie9.aspx
It appears to be a little rough around the edges, but should bode well
for Wikimedia video support as IE 9 starts to be pushed out to Windows
machines, and ideally we won't have IE 7 and 8 for as long as we have had
On 02/18/2011 01:01 PM, Roan Kattouw wrote:
2011/2/18 Philip Tzou philip@gmail.com:
jQuery's ajax method provides a better way to load a JavaScript file, and it can
detect when the script has been loaded and execute the callback function. I
think we can implement it in our mw.loader.load.
On 01/22/2011 01:15 PM, Bryan Tong Minh wrote:
Handling metadata separately from wikitext provides two main
advantages: it is much more user friendly, and it allows us to
properly validate and parse data.
This assumes wikitext is simply a formatting language; really it's a data
storage,
On 01/21/2011 08:21 AM, Chad wrote:
While I happen to think the licensing issue is rather bogus and
doesn't really affect us, I'm glad to see it resolved. It outperforms
our current solution and keeps the same behavior. Plus as a bonus,
the vertical line smushing is configurable so if we want
On 01/21/2011 02:45 AM, Alex Brollo wrote:
The interest of the Wikisource project in a formal and standardized set of book
metadata (I presume from Dublin Core) in a database table is obvious.
Some preliminary tests on it.source suggest that templates and Labeled
Section Transclusion
On 01/20/2011 05:00 PM, Platonides wrote:
I would have probably gone by the page_props route, passing the metadata
from the wikitext to the tables via a parser function.
I would also say it's probably best to pass metadata from the wikitext to
the tables via a parser function. Similar to
As mentioned in the bug, it would be nice to have configurable support
for the Closure Compiler as well ;) (I assume the Apache license is
compatible?)
Has anyone done any tests to see if there are any compatibility issues
with SIMPLE_OPTIMIZATIONS with a Google Closure minification hook?
On 01/03/2011 02:22 PM, Brion Vibber wrote:
Since ApiSVGProxy serves SVG files directly out on the local domain as their
regular content type, it potentially has some of the same safety concerns as
img_auth.php and local hosting of upload files. If that's a concern
preventing rollout, would
On 01/04/2011 09:57 AM, Roan Kattouw wrote:
The separate img_auth.php entry point is needed on wikis where reading
is restricted (private wikis), and img_auth.php will check for read
permissions before it outputs the file. The difference between the
proxy I wrote and img_auth.php is that
On 01/04/2011 01:12 PM, Neil Kandalgaonkar wrote:
We've narrowed it down to two systems that are being tested right now,
MogileFS and OpenStack. OpenStack has more built-in stuff to support
authentication. MogileFS is used in many systems that have an
authentication layer, but it seems you
Looking over the thread, there are lots of good ideas. It's really
important to have some plan towards cleaning up abstractions between
structured data, procedures in representation, visual
representation, and tools for participation.
But I think it's correct to identify the social aspects of the
On 11/29/2010 07:56 AM, Roan Kattouw wrote:
2010/11/29 Jan Paul Posma jp.po...@gmail.com:
Full interview videos will be available on Wikimedia Commons somewhere next
month. They are in Dutch, though.
Michael, can we subtitle those with mwEmbed magic?
Roan Kattouw (Catrope)
We can. We can
The code is not spread across many files; it's a single mw.Parser.js
file. It's being used in my gadget and Neil's upload wizard. I agree
the parser is not the ideal parser: it's not feature-complete, it's not
very optimised, and it was hacked together quickly. But it passes all
the tests and
On 10/25/2010 12:02 PM, Erik Moeller wrote:
Hello all,
for some types of resources, it's desirable to upload source files
(whether it's Blender, COLLADA, Scribus, EDL, or some other format),
so that others can more easily remix and process them. Currently, as
far as I know, there's no way to
This is getting a little out of hand? People are going to spend more
time talking about the potential for minification errors or
sub-optimisation cost criteria than we will ever actually spend running into
real minification errors or any real readability issues. Reasonable
efforts are being made to
On 09/09/2010 10:56 AM, Trevor Parscal wrote:
We would need to vary on that cookie, but yes, this seems like a cool idea.
- Trevor
Previously when we had this conversation, I liked the idea of setting a
user preference.
On 09/07/2010 01:39 AM, Tim Starling wrote:
I think it's ironic that this style arises in JavaScript, given that
it's a high-level language and relatively easy to understand, and that
you could make a technical case in favour of terseness. C has an
equally effective minification technique
On 09/08/2010 06:28 AM, Roan Kattouw wrote:
I don't believe we should necessarily support retrieval of arbitrary
wiki pages this way, but that's not needed for Gadgets: there's a
gadgets definition list listing all available gadgets, so Gadgets can
simply register each gadget as a module
On 09/08/2010 11:25 AM, Roan Kattouw wrote:
It's defined on a MediaWiki: page, which is accessed by the server to
generate the Gadgets tab in Special:Preferences. There is sufficient
server-side knowledge about gadgets to implement them as modules,
although I guess we might as well save
On 07/29/2010 10:15 AM, Bryan Tong Minh wrote:
Hi,
I have been working on getting asynchronous upload from url to work
properly[1]. A problem that I encountered was that I need to store
data across requests. Normally I would use $_SESSION, but this data
should also be available to job
On 07/20/2010 10:24 PM, Tim Starling wrote:
The problem is just that increasing the limits in our main Squid and
Apache pool would create DoS vulnerabilities, including the prospect
of accidental DoS. We could offer this service via another domain
name, with a specially-configured webserver,
More important than file_metadata and page asset metadata sharing the
same db table backend, it's important that you can query and export all
the properties in the same way.
Within SMW you already have some special properties like pagelinks,
langlinks, category properties, etc., that are not
Robb Shecter wrote:
Consider this true
scenario: I want to write a MediaWiki API client for editors;
something like the Wordpress Dashboard. Really give editors a modern
web experience. I'd want to do this as a Rails app: I could build it
quickly and find lots of collaborators via
Aryeh Gregor wrote:
On Thu, May 20, 2010 at 3:20 PM, Michael Dale md...@wikimedia.org wrote:
I like the idea of a user preference. This way you don't constantly have
to add debug to the URL, it's easy to test across pages, and it is more
difficult for many people to accidentally invoke
Helder Geovane wrote:
I would support a URL flag to avoid minification and/or avoid
script grouping,
as suggested by Michael Dale, or even a user preference to
enable/disable minification in a more permanent way (so we don't need to
change the URL on each test: we just disable
The script-loader has a few modes of operation.
You can run it in raw file mode (i.e. $wgEnableScriptLoader = false).
This will load all your JavaScript files directly, only doing PHP
requests to get the messages. In this mode PHP does not touch any JS or
CSS file you're developing.
Once ready
If you have been following the svn commits you may have noticed a bit of
activity on the js2 front.
I wanted to send a quick heads up that describes what is going on and
invite people to try things out, and give feedback.
== Demos ==
The js2 extension and associated extensions are running on
withJS is a special parameter in MediaWiki:Common.js that lets people
preview MediaWiki-namespace user scripts. It's been on Commons for ages
and en.wikipedia for a few weeks.
peace,
--michael
Chad wrote:
On Fri, Mar 12, 2010 at 1:12 PM, Michael Dale md...@wikimedia.org wrote:
Guillaume
Should we define some sort of convention for extensions to develop
Selenium tests? I.e. perhaps some of the tests should live in a testing
folder of each extension, or in /trunk/testing/extensionName, so that
it's easier to modularize the testing suite?
--michael
Roan Kattouw wrote:
bhagya -
Also just added Michael Shynar ( shmichael ) from Kaltura who is doing
some add-media-wizard work.
--michael
Tim Starling wrote:
Bawolff: various Wikinews-related extensions
Jonathan Williford: extensions developed for http://neurov.is/on
Ning Hu: Semantic NotifyMe
Rob Lanphier and Conrad
I wanted to do a quick mention of the updated mwEmbed gadget and subtitle
support on the email list(s), in case people have missed its mention on
the village pump [1] or other venues.
* On commons the Commons:Timed_Text and Template:Closed_cap have
been started.
* oggHandler has been patched to
that are probably
have proxies set up).. So the gzipping PHP proxy of JS requests is
worthwhile.
--michael
Aryeh Gregor wrote:
On Wed, Sep 30, 2009 at 3:32 PM, Michael Dale md...@wikimedia.org wrote:
Has anyone done any scalability studies into minimal php @readfile
script vs
Aryeh Gregor wrote:
Also remember the possibility that sysops will want to include these
scripts (conditionally or unconditionally) from MediaWiki:Common.js or
such. Look at the top of
http://en.wikipedia.org/wiki/MediaWiki:Common.js, which imports
specific scripts only on
My attachment did not make it into the JS2 design thread... and that
thread is in summary mode, so here is a new post around the HTML output
question. Which of the following constructions is easier to read and
understand? Is there some tab delimitation format we should use to make
the jQuery
[snip]
What I think we have here is that $('#cat') is expensive, and run
inside a loop in dojBuild.
You can build and append in the jQuery version and it only shaves 10 ms;
i.e. the following still incurs the jQuery HTML-building function call costs:
function dojBuild(){
var o ='';
Tim Starling wrote:
Michael Dale wrote:
That is part of the idea of centrally hosting reusable client-side
components, so we control the jQuery version and plugin set. So a
new version won't come along until it's been tested and
integrated.
You can't host every client-side
~ d'oh ~ Disregard previous; bad keystroke sent rather than saved to draft.
Tim Starling wrote:
Michael Dale wrote:
That is part of the idea of centrally hosting reusable client-side
components, so we control the jQuery version and plugin set. So a
new version won't come along until it's
We have js2AddOnloadHook, which gives you jQuery in noConflict mode as the
$j variable. The idea behind using a different name is to separate
jQuery-based code from the older non-jQuery-based code... but if taking a
more iterative approach we could replace the addOnloadHook function.
--michael
Daniel
thanks for the constructive response :) ... comments inline
Tim Starling wrote:
I agree we should move things into a global object, i.e. $j, and all our
components/features should extend that object (like jQuery plugins).
That is the direction we are already going.
I think it would be
~some comments inline~
Tim Starling wrote:
[snip]
I started off working on fixing the coding style and the most glaring
errors from the JS2 branch, but I soon decided that I shouldn't be
putting so much effort into that when a lot of the code would have to
be deleted or rewritten from
I would add that I am of course open to reorganization and would happily
discuss why any given decision was made ... be it trade-offs with other
ways of doing things or lack of time to do it differently/better.
I would also add that not all the legacy support and Metavid-based code has
been
Yea, I was using the wrong version of ffmpeg2theora locally ;) ... Thanks
for the reminder; updated our ffmpeg2theora encode command in r55042 ...
an update to Firefogg should support the --buf-delay argument shortly as
well.
--michael
Gregory Maxwell wrote:
On Fri, Aug 7, 2009 at 5:29 PM,
So I committed ~basic~ derivative code support for oggHandler in r54550
(more solid support is on the way).
Based on input from the w...@home thread, here are updated target
qualities expressed via the Firefogg API to ffmpeg2theora.
Also, j^ was kind enough to run these settings on some sample input
.
peace,
--michael
Tisza Gergő wrote:
Michael Dale mdale at wikimedia.org writes:
* We are not Google. Google lost, what, ~470 million~ last year on
YouTube ... (and that's with $240 million in advertising), so a total cost
of $711 million [1]
How much of that is related
Yea, it would have to be opt-in. We would have to have controls over how much
bandwidth is sent out... We could encourage people to enable it by sending
out the higher bit-rate / quality version ~by default~ for those that
opt in.
--michael
Ryan Lane wrote:
On Mon, Aug 3, 2009 at 1:57 PM, Michael
Wikimedia on slow / developing country connections)
peace,
michael
Brion Vibber wrote:
On 7/31/09 6:51 PM, Michael Dale wrote:
Want to point out the working prototype of the w...@home extension.
Presently it focuses on a system for transcoding uploaded media to free
formats, but will also
Two quick points:
1) You don't have to re-upload the whole video, just the sha1 or some
sort of hash of the assigned chunk.
2) It should be relatively straightforward to catch abuse via user
ids assigned to each chunk uploaded. But checking the sha1 a few times from
other random clients that are
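The hash check in point 1 can be sketched in a few lines of shell (illustrative only; chunk_hash and the sample bytes are made-up names, not the w@home API):

```shell
# Illustrative sketch: a client reports its chunk's hash instead of re-uploading it,
# and a verifier recomputes the hash over the same bytes to spot-check the result.
chunk_hash() { printf '%s' "$1" | sha1sum | awk '{print $1}'; }

client_digest=$(chunk_hash "encoded-chunk-bytes")
verifier_digest=$(chunk_hash "encoded-chunk-bytes")

if [ "$client_digest" = "$verifier_digest" ]; then
  echo "chunk verified"
else
  echo "chunk mismatch"
fi
```

In practice the verifier would fetch the chunk bytes itself; the point is that a 40-character digest travels instead of the whole derivative.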
Let's see...
* All these tools will be needed for flattening sequences anyway. In
that case CPU costs are really, really high, like 1/5 real-time or lower,
and the number of computations needed explodes much faster, as every
stable edit necessitates a new flattening of some portion of the
sequence.
Some notes:
* ~It's mostly an API~. We can run it internally if that is more
cost-efficient. (Will do a command-line client shortly.) ... (As
mentioned earlier, the present code was hacked together quickly; it's just
a prototype. I will generalize things to work better as internal jobs,
and I
place in the browser,
since there is no other easy way to guarantee a WYSIWYG flat
representation of browser-edited sequences)
peace,
--michael
Mike.lifeguard wrote:
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1
BTW, whose idea was this extension? I know Michael Dale is writing
Want to point out the working prototype of the w...@home extension.
Presently it focuses on a system for transcoding uploaded media to free
formats, but will also be used for flattening sequences and maybe
other things in the future ;)
It's still rough around the edges ... it presently
Gregory Maxwell wrote:
On Fri, Jul 31, 2009 at 9:51 PM, Michael Dale md...@wikimedia.org wrote:
the transcode job into $wgChunkDuration-length encoding jobs (each
piece is uploaded then reassembled on the server; that way big
transcoding jobs can be distributed to as many clients that
This is really a Foundation / Wikimedia community question ... I will
do a short email to foundation-l summarizing the technical discussion.
Not that foundation-l has historically been the best way to build
consensus, but maybe someone else can summarize that discussion and give
us a ball-park
Tell the users to complain to Apple? .. Bring up anti-competitive
lawsuits against Apple? Buy a mobile device that is less locked down?
There is no easy solution when the platform is a walled garden. There
are two paths towards supporting HTML5 video on mobile platforms.
1) Getting things
We need to inform people that the quality of experience can be
substantially improved if they use a browser that supports free formats.
Wikimedia only distributes content in free formats, because if you have
to pay for a license to view, edit, or publish ~free content~ then the
content is not
I think if the playback system is Java in ~any browser~ we should
~softly~ inform people to get a browser with native support if they
want a high-quality video playback experience.
The Cortado applet is awesome ... but the startup time of the Java VM is
painful compared to other user experiences.
Also, it should be noted that a simple patch for oggHandler to output video
and use the mv_embed library is in the works; see:
https://bugzilla.wikimedia.org/show_bug.cgi?id=18869
you can see it in action a few places like
http://metavid.org/wiki/File:FolgersCoffe_512kb.1496.ogv
Also note my ~soft~ push
I would quickly add that the script-loader / new-upload branch also
supports minification, along with associating unique ids, grouping, and gzipping.
So all your MediaWiki page includes are tied to their version numbers
and can be cached forever, without 304 requests by the client or a _shift_
reload to get
Correct me if I am wrong, but that's how we presently update JS and CSS:
we have $wgStyleVersion, and when that gets updated we send out fresh
pages with HTML pointing to JS with $wgStyleVersion appended.
The difference in the context of the script-loader is that we would read the
version from the
Aryeh Gregor wrote:
Any given image is not included on every single page on the wiki.
Purging a few thousand pages from Squid on an image reupload (should
be rare for such a heavily-used image) is okay. Purging every single
page on the wiki is not.
Yea .. we are just talking about adding
I am definitely not opposed to adding that functionality, as I have
mentioned in the past;
see thread:
http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg00888.html
You should take a look at the work Mike Baynton did back in Summer of
Code '07.
The issue that we have is both the
As you may know, I have been working on Firefogg integration with
MediaWiki. As you may also know, the mwEmbed library is being designed to
support embedding of these interfaces in arbitrary external contexts. I
wanted to quickly highlight a useful stand-alone usage example of the
library:
I would like to request categorization for the media projects to the bug
tracker. To get a brief idea of the components getting packaged into the
new-upload branch check out:
http://www.mediawiki.org/wiki/Media_Projects_Overview
I think the large scope of code and the fact that MwEmbed can be
The new-upload branch, which includes a good set of new features, is available
here:
http://svn.wikimedia.org/svnroot/mediawiki/branches/new-upload/phase3/
Major Additions:
* action=upload added to the API
* Supports new upload interfaces (dependent on mv_embed / jQuery libs)
** Supports upload over
Roan Kattouw wrote:
The problem here seems to be that thumbnail generation times vary a
lot, based on format and size of the original image. It could be 10 ms
for one image and 10 s for another, who knows.
Yea, again, if we only issue the big resize operation on initial upload
with a memory
Aryeh Gregor wrote:
I'm not clear on why we don't just make the daemon synchronously
return a result the way ImageMagick effectively does. Given the level
of reuse of thumbnails, it seems unlikely that the latency is a
significant concern -- virtually no requests will ever actually wait
on
script onload).
Number 2 might be usable as well.
In any case, changing all MW and extension code to work for #2 or #3 might
be a hard thing.
Thank you,
Sergey
--
Sergey Chernyshev
http://www.sergeychernyshev.com/
On Wed, Apr 22, 2009 at 1:21 PM, Michael Dale md...@wikimedia.org