Amazing work. Added a bug to integrate this into the TMH player.
https://bugzilla.wikimedia.org/show_bug.cgi?id=61823
I can't imagine anyone being against using Flash to deliver free formats!
—michael
On Feb 23, 2014, at 5:45 PM, Brion Vibber bvib...@wikimedia.org wrote:
In case anybody's interested but
As Brion points out, we get much better coverage. I enabled h.264
locally and ran through a set of Android, iOS and desktop browsers I had
available at the time:
http://www.mediawiki.org/wiki/Extension:TimedMediaHandler/Platform_testing
Pro h.264:
* No one is proposing turning off webm, an
On 12/13/2012 12:38 PM, Brion Vibber wrote:
It's much, MUCH easier for us to flip the H.264 switch... there are
ideological reasons we might not want to, but we're going to have to put
the effort into making those player apps if we want all our data accessible
to everyone.
+1, it's non-trivial
On 12/13/2012 04:56 PM, Brion Vibber wrote:
On Thu, Dec 13, 2012 at 10:38 AM, Brion Vibber bvib...@wikimedia.orgwrote:
On Wed, Dec 12, 2012 at 2:50 PM, Rob Lanphier ro...@wikimedia.org wrote:
I was able to play the WebM file of the locomotive on the front page
of
+correct content-type this time ;) Note this has already been merged,
but still worth mentioning for visibility.
On 2/1/13 12:10 PM, Michael Dale wrote:
We are about to merge in support for audio derivatives to Timed Media
Handler (TMH). The big value here, I think, is encoding to AAC or MP3
Yes, all that changed is we added support for audio derivatives. We have
not enabled mp3 or AAC. The same code can be used for flac, ogg, or
whatever we configure.
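Roughly, something like this in LocalSettings.php (a sketch only; the
exact key names may differ by TMH version):

// $wgEnabledAudioTranscodeSet lists the audio derivatives TMH produces.
$wgEnabledAudioTranscodeSet = array(
    'ogg',   // Vorbis derivative (free format)
    // 'mp3', // not enabled on Wikimedia sites
    // 'aac', // not enabled on Wikimedia sites
);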
On Feb 3, 2013 2:33 AM, Yuvi Panda yuvipa...@gmail.com wrote:
Just to be sure that I'm reading this right - nothing actually changed
On 04/11/2013 10:48 AM, Quim Gil wrote:
I'm just trying to be consistent: a GSOC project can't force the
agenda of a Wikimedia project.
I'm also conservative when it comes to managing GSOC students' expectations.
These bug reports have been open for years, and I don't want to
guarantee to a GSOC
On 05/30/2013 06:28 PM, Ryan Kaldari wrote:
OK, I decided to be slightly bold. I changed the modal video threshold
on en.wiki from 200px to 800px. This means all video thumbnails that
are 800px or smaller will open a modal player when you click on the
thumbnail. If there are no complaints from
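For reference, the setting in question is (I believe) TMH's
$wgMinimumVideoPlayerSize, so the change amounts to roughly:

// Thumbnails at or below this width open a modal player on click;
// larger embeds get an inline player.
$wgMinimumVideoPlayerSize = 800; // previously 200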
, 2008 at 6:37 PM, Michael Dale md...@wikimedia.org wrote:
The debug switch would modify the HTML output to point at the individual
files with a GET seed, i.e. myscript.js?<?php echo date() ?> or something
of that nature, bypassing the script loader altogether. The bulk of extra
content is comments, code
great :) this will greatly benefit both the add_media_wizard and the
in-browser theora video transcoder / uploader :)
--michael
Bryan Tong Minh wrote:
On Thu, Jan 8, 2009 at 3:19 PM, Roan Kattouw roan.katt...@home.nl wrote:
Chad schreef:
I think the thing in the way of this is a
Once we ship the firefogg extension support for uploading videos,
commons should request that users select the highest quality source
video footage available, i.e. the HD video their camera captured or DV
original edited footage from their local computer, and then commons will
supply the
Can we look into enabling $wgAllowCopyUploads on Wikimedia projects?
This will let Wikimedia work a lot better with external archives
especially around large video files that are cumbersome for users to
download and then upload over our POST upload interface on home Internet
connections. In
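Concretely, the ask is just the core setting plus the matching user
right (the group used here is only an example):

$wgAllowCopyUploads = true;
$wgGroupPermissions['autoconfirmed']['upload_by_url'] = true;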
While the upload API is under development / stabilization ... I hacked
in basic firefogg upload support to the add_media_wizard ... Since
firefogg works over POST you can use it with the existing upload
interface (without good error handling) if you first download the
browser extension
Mike Baynton asked about some server-side transcoding code he has worked
on; this seems appropriate for wikitech-l so I have cc'ed it here.
The current direction is to encourage in-browser client side
transcoding. This offloads the costs of server side transcoding and
maximizes quality letting
Gregory Maxwell wrote:
This does
client side transcoding, but as far as the user can tell it's all done
by the server, except no long transmission time for his 14 GB DV
movie. (although, perhaps a long transcoding time. :) )
At some talks here at FOMS (Foundations of Open Media Software)
oops, bad url for add_media_wizard, try:
importScriptURI('http://mvbox2.cse.ucsc.edu/w/extensions/MetavidWiki/skins/add_media_wizard.js');
--michael
Michael Dale wrote:
While the upload API is under development / stabilization ... I hacked
in basic firefogg upload support
good points
I don't think it would be a _bad_ idea to support server-side
transcoding; it of course gives more flexibility to have the original file
and then lets us target different output formats in the future. It would let
us support camera video uploads etc.
But there are logistical issues.
Reviving the $wgAllowCopyUploads request ... The thread ended here:
http://lists.wikimedia.org/pipermail/wikitech-l/2009-January/040942.html
Any updates on this, or ideas on how we could support client-initiated
importing of media assets over HTTP?
--michael
with MediaWiki ...
So the question is, why re-invent the wheel?
Thanks,
GerardM
2009/2/3 Michael Dale md...@wikimedia.org
We really need a wikidata type site. We ran into similar issues with
structured data between government data wikis. Yaron hacked up a
(relatively simple
So I was running into the problem of localizing the messages for the
add_media_wizard / mv_embed associated libraries. So I have taken a
first pass at writing the script server (that I had previously
described): http://tinyurl.com/ae44vd Below is a description of how it
works. The code is in
... I was looking at the commons upload form; the JavaScript load profile
is like a long waterfall, or rather a steep river :(
http://metavid.org/promo/round_trips.png
Would be much nicer to do 1 or 2 requests instead of 27 ... true... a lot
are gadgets and whatnot.. but even on a not-logged-in Main page on the
nightlies it's 5 scripts at a time. So you still end up doing
a few round trips when you have a high script count.
--michael
Aryeh Gregor wrote:
On Mon, Feb 23, 2009 at 2:35 PM, Michael Dale md...@wikimedia.org wrote:
... I was looking at the commons upload form; the JavaScript load profile is like
Sergey Chernyshev wrote:
Yes, of course - I checked it out and that's why I quoted it in my original
email.
My brief overview made me feel that it wasn't enough.
I just didn't want this to be only in the context of localization, as
performance is more related to overall user experience than to
In addition to being able to handle large files without an ugly manual
download+reupload, the upload-by-URL functionality is also needed for
future-facing work Michael Dale is working on to allow an on-wiki media
picker to fetch freely-licensed files from Flickr, Archive.org, and
other places.
We
The add media Wizard is in testing; see blog post:
http://metavid.org/blog/2009/03/27/add-media-wizard-and-firefogg-on-test-wikimediaorg/
If no one objects (or has any blocker bugs that I have missed) I will
add the gadget option for firefogg / add media wizard to commons
shortly. (for wider
as some have pointed out it's test.wikipedia.org (not test.wikimedia.org)
Michael Dale wrote:
The add media Wizard is in testing; see blog post:
http://metavid.org/blog/2009/03/27/add-media-wizard-and-firefogg-on-test-wikimediaorg/
If no one objects (or has any blocker bugs that I have
I am wondering if anyone has some contributor browser-client
information handy, or can point me to some dataset that I could query,
to get the following information:
1) What is the wikipedia client browser usage percentage distribution? ( I
recall that being published recently but I have misplaced
/blog/2009/03/27/add-media-wizard-and-firefogg-on-test-wikimediaorg/
peace,
michael
Brion Vibber wrote:
Just a heads-up --
Michael Dale is working on some cleanup of how the various JavaScript
bits are loaded by the skins to centralize some of the currently
horridly spread-out code
--
Sergey Chernyshev
http://www.sergeychernyshev.com/
On Wed, Apr 15, 2009 at 5:29 PM, Michael Dale md...@wikimedia.org
wrote:
These changes will probably result in some minor adjustments to existing
skins. (I will try not to completely break compatibility cuz I know
there are many
Sergey
--
Sergey Chernyshev
http://www.sergeychernyshev.com/
On Wed, Apr 15, 2009 at 5:29 PM, Michael Dale md...@wikimedia.org wrote:
These changes will probably result in some minor adjustments to existing
skins. (I will try not to completely break compatibility cuz I know
individually requested. If we really want all the style sheets grouped I
can bump that on the priority list to right after the upload api stuff
that I have to finish up ;)
--michael
Brion Vibber wrote:
Just a heads-up --
Michael Dale is working on some cleanup of how the various JavaScript
So I was thinking instead of just packaging in jQuery I should package
in / move the entire mv_embed folder into the skins directory .. it does
not make sense to maintain two branches reinventing what mv_embed.js
provides in the process of refactoring other core mediaWiki javascript.
We will
Aryeh Gregor wrote:
I'm not clear on why we don't just make the daemon synchronously
return a result the way ImageMagick effectively does. Given the level
of reuse of thumbnails, it seems unlikely that the latency is a
significant concern -- virtually no requests will ever actually wait
on
script onload).
Number 2 might be usable as well.
In any case changing all MW and Extensions code to work for #2 or #3 might
be a hard thing.
Thank you,
Sergey
--
Sergey Chernyshev
http://www.sergeychernyshev.com/
On Wed, Apr 22, 2009 at 1:21 PM, Michael Dale md...@wikimedia.org
Roan Kattouw wrote:
The problem here seems to be that thumbnail generation times vary a
lot, based on format and size of the original image. It could be 10 ms
for one image and 10 s for another, who knows.
yea again if we only issue the big resize operation on initial upload
with a memory
The new-upload branch, which includes a good set of new features, is
available here:
http://svn.wikimedia.org/svnroot/mediawiki/branches/new-upload/phase3/
Major Additions:
* action=upload added to the api
* Supports New upload Interfaces (dependent on mv_embed / jQuery libs )
** supports upload over
I would like to request categorization for the media projects in the bug
tracker. To get a brief idea of the components getting packaged into the
new-upload branch check out:
http://www.mediawiki.org/wiki/Media_Projects_Overview
I think the large scope of code and the fact that MwEmbed can be
As you may know I have been working on firefogg integration with
mediaWiki. As you may also know the mwEmbed library is being designed to
support embedding of these interfaces in arbitrary external contexts. I
wanted to quickly highlight a useful stand alone usage example of the
library:
I am definitely not opposed to adding in that functionality as I have
mentioned in the past:
see thread:
http://www.mail-archive.com/wikitech-l@lists.wikimedia.org/msg00888.html
You should take a look at the work Mike Baynton did back in Summer of
Code '07.
The issue that we have is both the
I would quickly add that the script-loader / new-upload branch also
supports minification, along with associating unique ids, grouping, and
gzipping. So all your mediaWiki page includes are tied to their version
numbers and can be cached forever, without 304 requests by the client or a
_shift_ reload to get
correct me if I am wrong but that's how we presently update js and css..
we have $wgStyleVersion, and when that gets updated we send out fresh
pages with html pointing to js with $wgStyleVersion appended.
The difference in the context of the script-loader is we would read the
version from the
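To make the difference concrete, an illustrative sketch (not the actual
script-loader code) of versioned includes that can be cached forever:

<?php
$wgStyleVersion = 207; // bumped whenever js/css changes
$src = "/skins/common/wikibits.js?{$wgStyleVersion}";
echo "<script type=\"text/javascript\" src=\"{$src}\"></script>\n";
// the versioned response itself can then be served with
// Cache-Control: public, max-age=31536000 (no 304 round trips)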
Aryeh Gregor wrote:
Any given image is not included on every single page on the wiki.
Purging a few thousand pages from Squid on an image reupload (should
be rare for such a heavily-used image) is okay. Purging every single
page on the wiki is not.
yea .. we are just talking about adding
I think if the playback system is java in ~any browser~ we should
~softly~ inform people to get a browser with native support if they
want a high quality video playback experience.
The cortado applet is awesome ... but startup time of the java vm is
painful compared to other user experiences
It should also be noted that a simple patch for oggHandler to output video
and use the mv_embed library is in the works; see:
https://bugzilla.wikimedia.org/show_bug.cgi?id=18869
you can see it in action a few places like
http://metavid.org/wiki/File:FolgersCoffe_512kb.1496.ogv
Also note my ~soft~ push
We need to inform people that the quality of experience can be
substantially improved if they use a browser that supports free formats.
Wikimedia only distributes content in free formats because if you have
to pay for a license to view, edit or publish ~free content~ then the
content is not
This is really a foundation / wikimedia community question. ... I will
do a short email to foundation-l summarizing the technical discussion.
Not that foundation-l has historically been the best way to build
consensus but maybe someone else can summarize that discussion and give
us a ball-park
Tell the users to complain to Apple? .. Bring up anti-competitive
lawsuits against Apple? Buy a mobile device that is less locked down?
There is no easy solution when the platform is a walled garden. There
are two paths towards supporting html5 video on mobile platforms.
1) getting things
Want to point out the working prototype of the w...@home extension.
Presently it focuses on a system for transcoding uploaded media to free
formats, but will also be used for flattening sequences and maybe
other things in the future ;)
It's still rough around the edges ... it presently
Gregory Maxwell wrote:
On Fri, Jul 31, 2009 at 9:51 PM, Michael Dalemd...@wikimedia.org wrote:
the transcode job into $wgChunkDuration-length encoding jobs. ( each
piece is uploaded then reassembled on the server. that way big
transcoding jobs can be distributed to as many clients that
Some notes:
* ~it's mostly an api~. We can run it internally if that is more cost
efficient. ( will do a command line client shortly ) ... (as mentioned
earlier the present code was hacked together quickly; it's just a
prototype. I will generalize things to work better as internal jobs, and I
place in the browser
since there is no other easy way to guarantee wysiwyg flat
representation of browser edited sequences )
peace,
--michael
Mike.lifeguard wrote:
BTW, whose idea was this extension? I know Michael Dale is writing
two quick points.
1) you don't have to re-upload the whole video, just the sha1 or some
sort of hash of the assigned chunk.
2) should be relatively straightforward to catch abuse via user ids
assigned to each chunk uploaded. But checking the sha1 a few times from
other random clients that are
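A rough sketch of that spot-check (a hypothetical helper, not the actual
w...@home code): compare the reported sha1 against hashes of the same
chunk redone by a few random other clients before accepting the result.

<?php
function verifyChunk( $reportedSha1, $spotCheckSha1s ) {
    foreach ( $spotCheckSha1s as $otherSha1 ) {
        if ( $reportedSha1 !== $otherSha1 ) {
            return false; // mismatch: flag both clients for review
        }
    }
    return true;
}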
Let's see...
* all these tools will be needed for flattening sequences anyway. In
that case CPU costs are really, really high, like 1/5 real-time or lower,
and the amount of computation needed explodes much faster as every
stable edit necessitates a new flattening of some portion of the
sequence.
peace,
--michael
Tisza Gergő wrote:
Michael Dale mdale at wikimedia.org writes:
* We are not Google. Google lost, what, like ~470 million~ last year on
youtube ... (and that's with $240 million in advertising), so a total cost
of $711 million [1]
How much of that is related
yea, it would have to be opt-in. Would have to have controls over how much
bandwidth is sent out... We could encourage people to enable it by sending
out the higher bit-rate / quality version ~by default~ for those that
opt in.
--michael
Ryan Lane wrote:
On Mon, Aug 3, 2009 at 1:57 PM, Michael
Wikimedia on slow / developing country connections)
peace,
michael
Brion Vibber wrote:
On 7/31/09 6:51 PM, Michael Dale wrote:
Want to point out the working prototype of the w...@home extension.
Presently it focuses on a system for transcoding uploaded media to free
formats, but will also
So I committed ~basic~ derivative code support for oggHandler in r54550
(more solid support on the way).
Based on input from the w...@home thread, here are updated target
qualities expressed via the firefogg api to ffmpeg2theora.
Also j^ was kind enough to run these settings on some sample input
yea, I was using the wrong version of ffmpeg2theora locally ;)... Thanks
for the reminder, updated our ffmpeg2theora encode command in r55042 ...
an update to firefogg should support the --buf-delay argument shortly as
well.
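For a sense of the shape of those settings (illustrative only, not the
exact r55042 arrays): firefogg-style keys that get mapped onto
ffmpeg2theora arguments, including the new --buf-delay:

$derivativeSettings['360p'] = array(
    'maxSize'      => '640x360',
    'videoBitrate' => '512', // kbit/s
    'audioBitrate' => '48',  // kbit/s
    'framerate'    => '25',
    'bufDelay'     => '256', // passed through as --buf-delay
);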
--michael
Gregory Maxwell wrote:
On Fri, Aug 7, 2009 at 5:29 PM,
I would add that I am of course open to reorganization and would happily
discuss why any given decision was made ... be it trade-offs with other
ways of doing things or lack of time to do it differently / better.
I would also add that not all the legacy support and metavid-based code has
been
~some comments inline~
Tim Starling wrote:
[snip]
I started off working on fixing the coding style and the most glaring
errors from the JS2 branch, but I soon decided that I shouldn't be
putting so much effort into that when a lot of the code would have to
be deleted or rewritten from
thanks for the constructive response :) ... comments inline
Tim Starling wrote:
I agree we should move things into a global object, i.e. $j, and all our
components / features should extend that object (like jquery plugins).
That is the direction we are already going.
I think it would be
My attachment did not make it into the JS2 design thread... and that
thread is in summary mode so here is a new post around the html output
question. Which of the following constructions is easier to read and
understand? Is there some tab-delimitation format we should use to make
the jquery
[snip]
what I think we have here is that $('#cat') is expensive, and run
inside a loop in dojBuild.
you can build and append in the jquery version and it only shaves 10ms,
i.e. the following still incurs the jquery html-building function call costs:
function dojBuild(){
    var o = '';
Tim Starling wrote:
Michael Dale wrote:
That is part of the idea of centrally hosting reusable client-side
components so we control the jquery version and plugin set. So a
new version won't come along until it's been tested and
integrated.
You can't host every client-side
~ d'oh ~ Disregard previous; a bad keystroke sent rather than saved to draft.
Tim Starling wrote:
Michael Dale wrote:
That is part of the idea of centrally hosting reusable client-side
components so we control the jquery version and plugin set. So a
new version won't come along until it's
we have js2AddOnloadHook that gives you jquery in noConflict mode as the
$j variable. The idea behind using a different name is to separate
jquery-based code from the older non-jquery-based code... but if taking a
more iterative approach we could replace the addOnloadHook function.
--michael
Daniel
Aryeh Gregor wrote:
Also remember the possibility that sysops will want to include these
scripts (conditionally or unconditionally) from MediaWiki:Common.js or
such. Look at the top of
http://en.wikipedia.org/wiki/MediaWiki:Common.js, which imports
specific scripts only on
that are probably
have proxies set up).. So the gzipping php proxy of js requests is
worthwhile.
--michael
Aryeh Gregor wrote:
On Wed, Sep 30, 2009 at 3:32 PM, Michael Dale md...@wikimedia.org wrote:
Has anyone done any scalability studies into a minimal php @readfile
script vs
Wanted to do a quick mention of the updated mwEmbed gadget and subtitles
support on the email list(s) in case people have missed its mention on
the village pump [1] or other venues.
* On commons the Commons:Timed_Text and Template:Closed_cap have
been started.
* oggHandler has been patched to
Also just added Michael Shynar ( shmichael ) from Kaltura who is doing
some add-media-wizard work.
--michael
Tim Starling wrote:
Bawolff: various Wikinews-related extensions
Jonathan Williford: extensions developed for http://neurov.is/on
Ning Hu: Semantic NotifyMe
Rob Lanphier and Conrad
Should we define some sort of convention for extensions to develop
selenium tests? i.e. perhaps some of the tests should live in a testing
folder of each extension, or in /trunk/testing/extensionName? So that
it's easier to modularize the testing suite?
--michael
Roan Kattouw wrote:
bhagya -
withJS is a special parameter in MediaWiki:Common.js that lets people
preview mediaWiki namespace user scripts. It's been on commons for ages
and en.wikipedia for a few weeks.
peace,
--michael
Chad wrote:
On Fri, Mar 12, 2010 at 1:12 PM, Michael Dale md...@wikimedia.org wrote:
Guillaume
If you have been following the svn commits you may have noticed a bit of
activity on the js2 front.
I wanted to send a quick heads up that describes what is going on and
invite people to try things out, and give feedback.
== Demos ==
The js2 extension and associated extensions are running on
The script-loader has a few modes of operation.
You can run it in raw file mode ( ie $wgEnableScriptLoader = false ).
This will load all your javascript files directly, only doing php
requests to get the messages. In this mode php does not touch any js or
css file you're developing.
Once ready
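In LocalSettings.php the two modes look roughly like this (a sketch; the
script-loader lives in the js2 / new-upload work, not core trunk):

// Development: raw file mode. js/css are served untouched and php only
// answers message requests, so edits show up immediately.
$wgEnableScriptLoader = false;

// Production: grouped mode. Requests are combined, minified and gzipped.
// $wgEnableScriptLoader = true;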
Helder Geovane wrote:
I would support a url flag to avoid minification and/or avoid
script-grouping,
as suggested by Michael Dale, or even to have a user preference to
enable/disable minification in a more permanent way (so we don't need to
change the url on each test: we just disable
Aryeh Gregor wrote:
On Thu, May 20, 2010 at 3:20 PM, Michael Dale md...@wikimedia.org wrote:
I like the idea of a user preference. This way you don't constantly have
to add debug to the url, it's easy to test across pages, and it is more
difficult for many people to accidentally invoke
Robb Shecter wrote:
Consider this true
scenario: I want to write a MediaWiki API client for editors;
something like the Wordpress Dashboard. Really give editors a modern
web experience. I'd want to do this as a Rails app: I could build it
quickly and find lots of collaborators via
More important than file_metadata and page asset metadata working with
the same db table backend, it's important that you can query/export all
the properties in the same way.
Within SMW you already have some special properties like pagelinks,
langlinks, category properties etc, that are not
On 03/19/2012 06:24 PM, Brion Vibber wrote:
In theory we can produce a configuration with TimedMediaHandler to produce
both H.264 and Theora/WebM transcodes, bringing Commons media to life for
mobile users and Apple and Microsoft browser users.
What do we think about this? What are the pros and
On 03/20/2012 03:15 AM, David Gerard wrote:
We should definitely be able to ingest H.264. (This has been on the
wishlist forever and is a much harder problem than it sounds.)
Once TMH is deployed, practically speaking .. upload to youtube -
import to commons .. will probably be the easiest
Thanks Brion ( and Erik ) for bringing chunk uploading closer to
fruition. +Jan, can you help out with documenting the api? I will take a
pass at it as well when I get a chance ;)
--michael
On 04/17/2012 03:37 PM, Brion Vibber wrote:
I've started adding some documentation on chunked
You will want to put it into a jobQueue; you can take a look at the Timed
Media Handler extension for how post-upload, processor-intensive
transformations can be handled.
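Very roughly (the class and job names here are illustrative, not TMH's
actual ones):

<?php
class RenderHeavyThingJob extends Job {
    public function __construct( $title, $params ) {
        parent::__construct( 'renderHeavyThing', $title, $params );
    }
    public function run() {
        // ...do the processor-intensive transformation here...
        return true;
    }
}
$wgJobClasses['renderHeavyThing'] = 'RenderHeavyThingJob';

// at upload time, just enqueue and let the job runner pick it up:
$title = Title::newFromText( 'File:Example.svg' );
$job = new RenderHeavyThingJob( $title, array( 'file' => 'Example.svg' ) );
$job->insert();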
--michael
On 05/04/2012 04:58 AM, emw wrote:
Hi all,
For a MediaWiki extension I'm working on (see
On 06/18/2012 04:52 PM, Brion Vibber wrote:
On Mon, Jun 18, 2012 at 4:44 PM, David Gerard dger...@gmail.com wrote:
On 19 June 2012 00:30, Brion Vibber br...@pobox.com wrote:
warning: patent politics question may lead to offtopic bikeshedding
Additionally there's the question of adding H.264
On 10/28/12 11:41 PM, Rob Lanphier wrote:
Hi everyone,
Assuming we get the last blocking bugs fixed tomorrow, then we should
be able to go onto Commons on Wednesday, so that's our current plan.
Let us know if there are issues with this.
Thanks!
Rob
Thanks for the update, Rob.
I did not see
On 07/20/2010 10:24 PM, Tim Starling wrote:
The problem is just that increasing the limits in our main Squid and
Apache pool would create DoS vulnerabilities, including the prospect
of accidental DoS. We could offer this service via another domain
name, with a specially-configured webserver,
On 07/29/2010 10:15 AM, Bryan Tong Minh wrote:
Hi,
I have been working on getting asynchronous upload from url to work
properly[1]. A problem that I encountered was that I need to store
data across requests. Normally I would use $_SESSION, but this data
should also be available to job
On 09/07/2010 01:39 AM, Tim Starling wrote:
I think it's ironic that this style arises in JavaScript, given that
it's a high-level language and relatively easy to understand, and that
you could make a technical case in favour of terseness. C has an
equally effective minification technique
On 09/08/2010 06:28 AM, Roan Kattouw wrote:
I don't believe we should necessarily support retrieval of arbitrary
wiki pages this way, but that's not needed for Gadgets: there's a
gadgets definition list listing all available gadgets, so Gadgets can
simply register each gadget as a module
On 09/08/2010 11:25 AM, Roan Kattouw wrote:
It's defined on a MediaWiki: page, which is accessed by the server to
generate the Gadgets tab in Special:Preferences. There is sufficient
server-side knowledge about gadgets to implement them as modules,
although I guess we might as well save
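Roughly, each gadget would then register as a module (a sketch; the path
and dependency shown are only examples):

$wgResourceModules['gadget.exampleTool'] = array(
    'scripts'      => 'gadgets/exampleTool.js',
    'styles'       => 'gadgets/exampleTool.css',
    'dependencies' => array( 'mediawiki.util' ),
);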
On 09/09/2010 10:56 AM, Trevor Parscal wrote:
We would need to vary on that cookie, but yes, this seems like a cool idea.
- Trevor
Previously when we had this conversation, I liked the idea of setting a
user preference.
This is getting a little out of hand. People are going to spend more
time talking about the potential for minification errors or
sub-optimisation cost criteria than we will ever actually spend running
into real minification errors or any real readability issue. Reasonable
efforts are being made to
On 10/25/2010 12:02 PM, Erik Moeller wrote:
Hello all,
for some types of resources, it's desirable to upload source files
(whether it's Blender, COLLADA, Scribus, EDL, or some other format),
so that others can more easily remix and process them. Currently, as
far as I know, there's no way to
The code is not spread across many files.. it's a single mw.Parser.js
file. It's being used in my gadget and Neil's upload wizard. I agree
the parser is not the ideal parser: it's not feature-complete, it is not
very optimised, and it was hacked together quickly. But it passes all
the tests and
On 11/29/2010 07:56 AM, Roan Kattouw wrote:
2010/11/29 Jan Paul Posma jp.po...@gmail.com:
Full interview videos will be available on Wikimedia Commons somewhere next
month. They are in Dutch, though.
Michael, can we subtitle those with mwEmbed magic?
Roan Kattouw (Catrope)
We can. We can
Looking over the thread, there are lots of good ideas. It's really
important to have some plan towards cleaning up abstractions between
structured data, procedures in representation, visual
representation and tools for participation.
But I think it's correct to identify the social aspects of the
On 01/03/2011 02:22 PM, Brion Vibber wrote:
Since ApiSVGProxy serves SVG files directly out on the local domain as their
regular content type, it potentially has some of the same safety concerns as
img_auth.php and local hosting of upload files. If that's a concern
preventing rollout, would
On 01/04/2011 09:57 AM, Roan Kattouw wrote:
The separate img_auth.php entry point is needed on wikis where reading
is restricted (private wiis), and img_auth.php will check for read
permissions before it outputs the file. The difference between the
proxy I wrote and img_auth.php is that
On 01/04/2011 01:12 PM, Neil Kandalgaonkar wrote:
We've narrowed it down to two systems that are being tested right now,
MogileFS and OpenStack. OpenStack has more built-in stuff to support
authentication. MogileFS is used in many systems that have an
authentication layer, but it seems you
On 01/20/2011 05:00 PM, Platonides wrote:
I would have probably gone by the page_props route, passing the metadata
from the wikitext to the tables via a parser function.
I would also say it's probably best to pass metadata from the wikitext to
the tables via a parser function. Similar to
As mentioned in the bug, it would be nice to have configurable support
for the closure-compiler as well ;) ( I assume Apache licence is
compatible? )
Has anyone done any tests to see if there are any compatibility issues
with SIMPLE_OPTIMIZATIONS with a google closure minification hook?
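If it helps the discussion, the hook could be as simple as shelling out
to the compiler jar (purely illustrative; this is not an existing
MediaWiki hook):

<?php
function closureMinify( $jsPath, $outPath ) {
    $cmd = 'java -jar compiler.jar'
        . ' --compilation_level SIMPLE_OPTIMIZATIONS'
        . ' --js ' . escapeshellarg( $jsPath )
        . ' --js_output_file ' . escapeshellarg( $outPath );
    exec( $cmd, $output, $status );
    return $status === 0; // 0 exit status means the minified file was written
}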