Brion Vibber wrote:
On Mon, Oct 31, 2011 at 4:56 PM, Sam Reed <re...@wikimedia.org> wrote:
http://commons.wikimedia.org/wiki/File:Localisation-team-showcase-20111031.ogv
Needs a license, description etc ;)
Awesome, thanks! I've redirected the original file link to the
On 30/10/11 16:47, Happy Melon wrote:
I think that would make more sense as a tag extension (parse doesn't
look like a good name, what about wikidemo?).
@Happy Melon: I think he wants a function which shows both the parsed
wikitext and the original source.
He intends to *build* such a
selects you may enjoy
http://toolserver.org/~platonides/mwtools/ExpandSelect.php
</spam>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Platonides wrote:
Jeremy Baron wrote:
Seems pretty clearly broken; we should fix the way we quote. (And maybe
it won't break anyone, because anyone using the syntax that triggers
this would already be broken because of it.) Maybe we need backticks
instead of quotes? I can test when I'm less sleepy.
Simone Locci wrote:
Thanks for the help, but here
https://commons.wikimedia.org/wiki/Commons:First_steps/Upload_form I read
that to upload content users must be logged in, so how can I manage the
upload without a login?
You can't. You need to create an account in Wikimedia Commons [1] in
order
Roan Kattouw wrote:
I wasn't around back then (I've been around for quite a while, but not
for quite that long)
I wasn't either, but his name rang a bell in my head. He was an old, old
contributor. Is this the same one? Reading Alolita's mail cleared it up :)
Welcome to the new, enterprisey Wikimedia,
Erik Moeller wrote:
Quick update -- we now have more than 2,000 sign-ups. I figure that's
a good base to work with (or if it isn't, there are bigger problems),
so I've turned down the banner to about 20% and will disable it soon.
Expect the frantic sign-up rate on MW.org to drop.
You may
Ryan Lane wrote:
https://labsconsole.wikimedia.org/wiki/Main_Page
Just a suggestion ;)
- Ryan
1) This is the first time it is mentioned in this mailing list.
1b) Not even mentioned in the Server Admin Log.
2) It has a funny concept of "you have an account"
3) Public IPs are private
4)
The trick was using svn switch:
svn checkout http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3
cd phase3
svn switch http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions
extensions/ --depth immediates
cd extensions/
svn checkout
Simone wrote:
I am lost among the sites. When I upload a picture, where should it go?
To *.wikipedia.org or to commons.wikimedia.org?
Images should go to Wikimedia Commons ( http://commons.wikimedia.org )
Remember to specify the license.
I didn't understand...
Another thing is that to get the token
On 23/10/11 12:45, Daniel Friesen wrote:
The general standard is GitWeb; it's even bundled with Gerrit, so it will
be the default available when git is set up:
https://gerrit.wikimedia.org/r/gitweb?p=operations%2Fpuppet.git
A problem with that kind of interface (not restricted to git) is how
Russell Nelson wrote:
Is there a way to tell Sqlite to be pickier about what it will accept? If it
bombs out on anything that falls outside the union of (all supported
database SQL), then just passing our tests should be sufficient, right?
No. And even if it didn't have its own laxity (e.g.
Using virtual machines is too big an overhead compared to just coding it
right, and it still would not protect against e.g. JavaScript injection.
Looking into LilyPond exception, I don't see any big problem:
- It relies on Math variables for storing the files in the same folder
(it was made
Neil Harris wrote:
Linux provides the setrlimit() system call for this purpose -- you could
either call it as a wrapper around lilypond, or hack it into a de-fanged
version of Lilypond.
If you're going to be running an auxiliary rendering process or
special-use server anyway, a few moments
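For illustration, Python exposes the same call through its resource module, so a wrapper like the one described could be sketched as follows (the command name and limit values are made-up examples, not anything from the thread):

```python
import resource
import subprocess

def run_limited(cmd, cpu_seconds=30, mem_bytes=512 * 1024 * 1024):
    """Run cmd with CPU and address-space rlimits applied in the child only."""
    def set_limits():
        # Runs after fork() but before exec(): limits the child, not us.
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
        resource.setrlimit(resource.RLIMIT_AS, (mem_bytes, mem_bytes))
    return subprocess.run(cmd, preexec_fn=set_limits)
```

A runaway renderer then gets SIGXCPU or allocation failures instead of taking the server down, and the rendering binary itself needs no patching.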
On 22/10/11 01:24, Tomasz Finc wrote:
Here is a snippet from an email that Erik sent three days ago
explaining Greg's role.
subject: Heads up: Online coding challenge
The overall project is being coordinated by Greg DeKoenigsberg,
formerly Senior Community Architect at Red Hat. If you'd like
Rob Lanphier wrote:
He has been working in our community
for quite some time, going by the name Ashar Voultoiz on mailing
lists, and by hashar on IRC. Now that he's working as a
contractor for WMF, he's decided to let everyone know the Superman
behind the Clark Kent alter ego :)
That was
Greg DeKoenigsberg wrote:
No. You'd sound like an ideal tester hitting a use case we didn't even
consider. Will look into it.
--g
I don't want to sound harsh, Greg, but... who are you? It seems this is
the first time you have posted here, yet your email is written as if you
were in charge of this.
Daniel Werner wrote:
3. It's difficult with units. Layers can have different units like
degrees, km, miles... I didn't look into how it works if markers with
units are defined in display_marker when there are layers with different
units. Perhaps we should allow coordinates with units like
Russell Nelson wrote:
This is needed because some code asks for a path to the file without
having a File. For example SpecialUndelete::showFile(). Basically
encapsulation punch-through. This is a clean-up. If you disagree, of
course feel free to propose something else.
They should be changed
Dmitriy Sintsov wrote:
For Firefox there is WebDeveloper extension
https://addons.mozilla.org/ru/firefox/addon/web-developer/
which has menu to disable browser caching. In IE8 / IE9 there is menu to
disable caching when you run debugger via F12.
I don't seem to have that option? In which menu
How does it work? What is it good for?
Can we, for example, add a task to Jenkins which runs php -l on all files?
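Such a Jenkins job could just shell out to php -l over the tree. As a rough sketch (the tree layout and failure handling are assumptions, not anything Jenkins-specific):

```python
import subprocess
from pathlib import Path

def lint_php(root):
    """Run `php -l` on every .php file under root; return the failing paths."""
    failures = []
    for path in sorted(Path(root).rglob("*.php")):
        result = subprocess.run(["php", "-l", str(path)],
                                capture_output=True, text=True)
        if result.returncode != 0:
            failures.append(str(path))
    return failures
```

The job would then fail the build whenever the returned list is non-empty.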
Applies to:
Windows Internet Explorer 9
Microsoft Internet Explorer 4.01 Service Pack 1
Microsoft Internet Explorer 6.0
Microsoft Internet Explorer 6.0 Service Pack 1
Windows Internet Explorer 7
Windows Internet Explorer 8
How is it that this isn't fixed in newer versions?
Welcome Leslie!
Hope you get comfortable here.
Ashar Voultoiz wrote:
That might be me, or at least I have the same concern about logs,
especially when bisecting changes.
Much like we moved extensions out of core, we might want to have
translations moved out too. We do not really need 0-day translations for
day-to-day MW hacking.
What about
Brion Vibber wrote:
You can check out extensions as separate repositories directly into
subfolders within core's 'extensions' dir for a ready-to-run system. But,
you *do* need to do either manually or scripted iteration over them to pull
updates or commit across repos. Git's submodules might
Brion Vibber wrote:
On Wed, Oct 5, 2011 at 7:18 AM, Platonides <platoni...@gmail.com> wrote:
There are 615 extensions in trunk/extensions
With the new setup, would someone who has a checkout of everything
need to open 616 connections (network latency, ssh authentication, etc.)
whenever he
Daniel Friesen wrote:
;) And when all else fails, rsync a bunch of --bare repos and do your
pull on the local hd.
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Sorry? Note that only a small subset of those with commit access have
full ssh to run commands. And a
Arthur Richards wrote:
2) Another user was attempting to move a page on Meta when she saw this:
PHP fatal error in
/usr/local/apache/common-local/php-1.18/includes/GlobalFunctions.php line
1197:
Object of class AFPData could not be converted to string
AFPData is a class used by the
Yaron Koren wrote:
(...)
Neither of those cases apply here - the Ad Manager code was
well-written, and it works. If you're curious, you can see for
yourself the kinds of fixes and changes that were made to the code
after it was checked in - all minor stuff, the only major thing being
that the
jida...@jidanni.org wrote:
Fellows,
This is Google's cache of http://en.wikipedia.org/wiki/Devo. It is a
snapshot of the page as it appeared on 28 Sep 2011 09:22:50 GMT. The
current page could have changed in the meantime. Learn more ...
Like why is it so much faster than the real
jida...@jidanni.org wrote:
On some of the Wikipedia sites, there are some messages near the top of
each page. These messages change every 10 or so seconds.
The problem is the number of lines in each of the changing messages is
not the same.
This causes the entire page to jerk up and down the
Merlijn van Deen wrote:
Hello to both the wikitech and pywikipedia lists -- please keep both
informed when replying. Thanks.
A few days ago, we - the pywikipedia developers - received alarming
reports of interwiki bots removing content from pages. This does not
seem to happen often, and we
Roan Kattouw wrote:
On Mon, Sep 26, 2011 at 4:59 PM, Ariel T. Glenn <ar...@wikimedia.org> wrote:
On Mon, 26-09-2011 at 08:47 +0200, melvin_mm wrote:
mysql> select max(rev_id) from revision;
+-------------+
| max(rev_id) |
+-------------+
|   452476647 |
+-------------+
I thought InnoDB had a count, too. Don't worry, I'm happy with Wikipedia
servers not crashing. :)
So, doing things the slow way:
mysql> SELECT COUNT(*) FROM revision;
+-----------+
| COUNT(*)  |
+-----------+
| 416988781 |
+-----------+
1 row in set (1 hour 36 min 52.58 sec)
Result from about
Niklas Laxström wrote:
On 24 September 2011 00:49, Platonides wrote:
Daniel Friesen wrote:
Does svn actually update disconnected svn directories that just happen
to be in a subdirectory? If it does do that then I do admit we might
want to provide a handy script to batch upgrade... well
K. Peachey wrote:
I'm no SVN user, so I'm emailing instead... As 1.18 users may have
noticed (a common place would be CodeReview, for example), there is new
styling for <pre>.
In r87173[1] the layout was changed; based on consensus in review
that was reverted, and then 1.18 was rebranched at
Has the svn:externals problem been solved?
What flow would people currently using sparse checkouts need to follow?
There have been patches on the git mailing list for sparse clones, but
they hadn't been applied, last time I checked.
Daniel Friesen wrote:
Does svn actually update disconnected svn directories that just happen
to be in a subdirectory? If it does do that then I do admit we might
want to provide a handy script to batch upgrade... well, actually
extension wise I've been thinking of that for extensions for
Daniel Friesen wrote:
And I'd like to point out that every one of the extensions currently
listed inside that area provides config to use different namespace
numbers. So you should have no conflict worries.
Maybe we should provide a way to generate namespaces for extensions.
The simplest way,
Brion Vibber wrote:
2) Checksums would be of fairly obvious benefit to verifying text storage
integrity within MediaWiki's own databases (though perhaps best sitting on
or keyed to the text table...?) Default installs tend to use simple
plain-text or gzipped storage, but big installs like
Domas Mituzas wrote:
* When reverting, do a select count(*) where md5=? and then do something
more advanced when more than one match is found
finally "we don't need an index on it" becomes "we need an index on it", and
storage efficiency becomes much more interesting (binary packing yay ;-)
Just try to not produce conflicts for people when updating. And keep
subversion history, obviously.
I'm not sure if moving in two steps would be good (svn mv, but first
commit the additions, then the deletes), as it would allow for an
intermediate step, where you could move things in that
Chad wrote:
For those of us who do not know...what the heck is a Grawp attack?
Does it involve generating hash collisions?
-Chad
It's the name of a wikipedia vandal.
http://en.wikipedia.org/wiki/User:Grawp
Roan Kattouw wrote:
On Fri, Sep 16, 2011 at 6:48 PM, Thomas Gries <m...@tgries.de> wrote:
Was there a certain reason to choose base 36?
Why not recoding to base 62 and saving 3 bytes per checksum ?
I don't know, this was way, way before my time. But then, why use base
62 if you can use base
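The claimed saving checks out: an MD5 hash is 128 bits, which takes 25 base-36 digits but only 22 base-62 digits. A quick sketch of the arithmetic:

```python
import math

def digits_needed(bits, base):
    """How many base-`base` digits a `bits`-bit value can require."""
    return math.ceil(bits / math.log2(base))

# MD5 is 128 bits.
assert digits_needed(128, 36) == 25
assert digits_needed(128, 62) == 22  # three characters shorter
```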
Siebrand Mazeland wrote:
https://bugzilla.wikimedia.org/show_bug.cgi?id=17865 -- Mismatched input
syntax for Cite error messages
* Some discussion. Cite is seen as scary! No one to take this yet.
Please give this some TLC.
Cite is not /that/ scary. What's TLC?
I don't see the problem with
bawolff wrote:
btw, could refreshImageMetadata.php be run on wmf wikis some time
after the deployment? 1.18 contains lots of new code related to Exif
support (and related image metadata stuff). The maintenance script
just regenerates img_metadata based on the source image.
-bawolff
Do they
bawolff wrote:
That's concerning. They work for me perfectly fine and seem to work on
cruise control. What's the error they're failing with?
Thanks,
Bawolff
It may just be that I don't have something properly configured locally.
This is a run of make databaseless on trunk.
There were 5
Thomas Morton wrote:
Regression error? This was raised a little while ago and supposed to have
been fixed :)
https://bugzilla.wikimedia.org/show_bug.cgi?id=30261
http://lists.wikimedia.org/pipermail/wikitech-l/2011-August/054538.html
Tom
Looks fixed to me (redirects to archive.org)
Harry Burt wrote:
Hey wikitech-l,
I finally got around to reading the August engineering report, which, as
ever, is a very useful read. However, one item did stick out to me:
[Visual editor] Ian Baker
http://www.mediawiki.org/wiki/User:Raindrift investigated
and started to work on a chat
That is good news (tm) for Aaron, the WMF, and MediaWiki.
Congratulations, Aaron
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Bjoern Hoehrmann wrote:
I think OpenID is a great way to consolidate identities across
multiple wikis, and to identify contributors semi-formally, however,
if OpenID providers are easy for spammers to infiltrate, I'll have to
rethink the whole approach.
(Spammers can easily be their own
Great gem, Hashar!
On 28/08/11 17:49, Ashar Voultoiz wrote:
On 28/08/11 03:20, Brion Vibber wrote:
What that really needs is test cases!
Not sure how easy to wrangle though...
OCaml has an implementation for unit testing. I have spent a bit of time
on it and eventually gave up since:
- I lack OCaml skill.
Amir E. Aharoni wrote:
I suppose that most readers read the current versions of pages and not
the previous revisions.
Are there precise statistics about it? What percentage of readers
bother to look at the histories and the previous revisions?
There was a graphic about parser cache hits on
Welcome Jeremy!
Ashar Voultoiz wrote:
Hello,
During the Berlin hack-a-ton (which was an awesome event), I have added
a quick hack to MediaWiki which is correctly marked as fixme. The
revision I am requesting comments for is r87992:
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/87992
Interesting. I
Do you already know Special:Allmessages?
Jay Ashworth wrote:
Do you already know Special:Allmessages?
Does Special:Allmessages provide the search capability about which I asked,
and I just don't realize it? Because I didn't see any way to do it from there.
Cheers,
-- jr 'IE: yes :-)' a
People usually set it to 5000 and use the
Finn Årup Nielsen wrote:
On Wed, 2011-08-10 at 17:39 +0200, Daniel Friesen wrote:
MediaWiki does not permit this because allowing random people to create
pages and have them returned to a user with a text/html or other
mimetype creates XSS vectors and ways of distributing malware.
Yes, I
John Elliot wrote:
That was an HTML4/XHTML1 rule that's been removed. An empty <ul></ul> is
valid HTML5.
I wasn't sure if the empty <ul> was valid HTML5, or if the validator
wasn't strict enough about it yet. In any event, I'm happy to take your
word for it. If XHTML support is being deprecated, I
Maarten Dammers wrote:
Hi Priyank,
On 5-8-2011 21:37, Platonides wrote:
priyank bagrecha wrote:
Is it possible to get a list of licenses for all the images on
wikipedia programmatically? Just the licenses, and a count of how many
images have which particular license.
Well, more or less
John Elliot wrote:
Hi there.
I've made some modifications to MediaWiki 1.17.0 that others might be
interested in. I'd be flattered if some or all of them made it into the
official MediaWiki release.
Firstly, I've added links to the W3C HTML validation service hosted by
MIT. These show up
John Elliot wrote:
On 10/08/2011 6:53 AM, Daniel Friesen wrote:
Please don't edit DefaultSettings.php; Your $wgFooterIcons change could
have been done in LocalSettings.php without causing trouble for yourself
when you upgrade.
Ah, didn't realise LocalSettings.php was the right place to do my
Ashar Voultoiz wrote:
Are there any instructions somewhere to disable them easily? Just in
case of trouble.
The svn book explains it. I think it was just
chmod -x hook_file
Brion Vibber wrote:
On Sat, Aug 6, 2011 at 12:42 PM, K. Peachey <p858sn...@gmail.com> wrote:
Filed in bugzilla: https://bugzilla.wikimedia.org/show_bug.cgi?id=30261
This is an unfortunate consequence of pushing this site out of Wikimedia's
hosting in 2006; the offsite mirror apparently didn't
priyank bagrecha wrote:
Is it possible to get a list of licenses for all the images on
wikipedia programmatically? Just the licenses, and a count of how many
images have which particular license.
Well, more or less. The licenses are noted with templates, and the
templates also add categories,
As I was the primary objector in that revision, I think the idea of doing
the same within an extension is suitable as an immediate solution. Should
be very trivial to implement/review/deploy and would keep core clean.
-Chad
Doing it in an extension seems appropriate. I would still create a
K. Peachey wrote:
1. Write up a proposal somewhere (Eg: in your userspace on wiki then
share the link around or post it to the mailing list)
2. Apply for commit access (link to the proposal/patches you have done
etc etc, we do have the ability to give access to certain sections of
the SVN
Ashar Voultoiz wrote:
On 29/07/11 02:05, MZMcBride wrote:
snip
Sometimes the only way people can get their code reviewed is to commit it.
In a BRANCH! :-)
Although, you will have to find out someone to review your changes, but
at least it saves you the hassle of being reverted on sight.
PS: sorry for TLDR-ness... :-(
It was worth reading!
You make very good points.
John Du Hart wrote:
Earlier this month, the PHP developer team moved to start soft-deprecating
the mysql.so extension via documentation, with an intent to make it fully
E_DEPRECATED in a later release. [1] Therefore, I thought it would be
appropriate to start a small discussion as to how this should be
Magnus Manske wrote:
Just saw this:
http://shader.kaist.edu/sslshader/
As we move to offer real https access, maybe this could help keeping
the CPU cost down?
Our servers don't have a GPU, so that would need a hardware upgrade.
A funky merge like this (or a general delete/undelete) can create a new page
id; essentially the revisions from the old page ended up getting migrated
from one page to another, and that other page now has the title of the old
page.
-- brion
Another case is when the old page gets deleted and
Dmitriy Sintsov wrote:
Currently I use mw.loader.load(), which can be replaced with
importScriptURI(). However, I don't want to go a step back and then move
again to 1.17/1.18. I've managed to postpone the visual editor thing; now
I'll try to adapt a few other extensions. I wanted RTE because it handles
Chad wrote:
I disagree here. Silence should not be taken for assent. Perhaps we
should formalize the approval process for RfCs. I agree with the
rest of your message in that we absolutely need to get more eyes on
RfCs...and I'm open to suggestions on how to do so.
-Chad
They are core
Jelle Zijlstra wrote:
2011/7/26 Platonides
The requester clarified the request today:
So for example if the user removes the email address, or replaces it
with another but doesn't verify it, the user rights associated with
their user group are automatically suspended, until
The requester clarified the request today:
So for example if the user removes the email address, or replaces it
with another but doesn't verify it, the user rights associated with
their user group are automatically suspended, until a verified email
address is provided,
+1 I completely agree.
Dantman is right in that this has to be accompanied by prompt reviews.
The first problem is, people have to read the RFC. Searching the archive
for 'RFC', I don't find it announced on the mailing list.
An RFC could be just text, have some sample code, or be a full
implementation
Aaron Swartz is a wikipedian, too [1].
1- http://en.wikipedia.org/wiki/User:AaronSw
Welcome both of you.
Daniel, I think you are going to be assigned bug 14890 :)
Congratulations!
Brion Vibber wrote:
On Sat, Jul 9, 2011 at 2:17 PM, Mark A. Hershberger wrote:
* How can they write parser tests and unit tests to try out their
code?
'Look at what other extensions do' is the best answer until somebody writes
up some details, which would be super nice. :D
'Cite' is
Ashar Voultoiz wrote:
On 15/07/11 21:06, Chad wrote:
snip
Got a bunch of new ones created this week:
* Matthew April - Matthew.JA - UserMergeAndDelete extension
* Jeremy - jlemley - New Favorites extension
* Justin Ryan - justizin - Wikia developer
* Owen Davis - owen - Wikia developer
*
Roan Kattouw wrote:
On Wikimedia wikis, the rate limit for editing is 8 edits per minute
for anonymous (logged-out) and new users. So waiting 3 seconds between
edits will result in almost 20 edits per minute, that's way too much.
If you're operating a bot, you should get an account for it and
Emmanuel Engelhart wrote:
Hi
Titles should be stored in the page table with the first letter uppercased.
http://en.wikipedia.org/wiki/Wikipedia:Naming_conventions_%28technical_restrictions%29#Lower_case_first_letter
Unfortunately, it seems that we have XML dumps (and consequently
mwdumper
Ryan Kaldari wrote:
I'm not sure who would be in charge of this, but I think it would be
useful if the WMF were a liaison member of the Unicode Consortium:
http://unicode.org/consortium/memblogo.html
This body makes all sorts of important decisions about the Unicode
standard—decisions that
Alex Brollo wrote:
I ran into a number of problems when trying to replicate the localurl:
parser function with Python's urllib.quote(), with a number of annoying bugs.
Is this correct and complete?
"/wiki/" + urllib.quote(name-of-page.replace(" ", "_"), "/!,;:.-_")
where name-of-page is UTF-8 encoded.
Thanks! I
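For what it's worth, a Python 3 sketch of such a localurl replica; the set of characters left unescaped is an assumption based on reading MediaWiki's wfUrlencode() and may not match every version:

```python
from urllib.parse import quote

def localurl(title, script_path="/wiki/"):
    """Approximate MediaWiki's {{localurl:}} for a UTF-8 title."""
    # MediaWiki stores titles with spaces turned into underscores.
    t = title.replace(" ", "_")
    # Characters assumed to stay unescaped, per wfUrlencode(); verify
    # against the MediaWiki version you are targeting.
    return script_path + quote(t, safe=";:@$!*(),/")
```

For example, localurl("San Francisco") gives /wiki/San_Francisco, while characters outside the safe set are percent-encoded.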
Ashar Voultoiz wrote:
On 24/06/11 08:33, Thomas Gries wrote:
When I think that a revision committed by myself, r90650 (marked as new),
is fully obsolete and
already replaced by my other commit r90684 (marked as fixed)
I would say, if it is reverted, mark it reverted :)
Or maybe resolved, if
Congratulations, to all of you!
that pool.
And then adding the constraint of a second project with a second
language will narrow the pool even more.
We're looking for the orphan communities that have a lot of editors but
little connection to English and Meta.
I have added a small script at
http://www.toolserver.org/~platonides/activeusers/activeusers.php to
show active users per project and language.
Requirements for appearing there are more than 500 edits (total) and at
least one action (usually an edit) in the last month (since May 16, data
Jelle Zijlstra wrote:
You might also get better results when you don't limit yourself to recent
contributions. For example, I contributed heavily to the Dutch Wikipedia a
few years ago, and now contribute heavily to the English. I don't appear in
Platonides's list, because I hardly edit nl: at
Thomas Morton wrote:
Or look for actives on one wiki.. and then cross check those names with all
the other wikis for the same names with over, say, 300 edits (at any time).
Tom
The edit count already looks at the full total; for the last month,
only one action is needed.
Alec Conroy wrote:
Is there an easy way to run this:
For each of the 86,000 'active users':
Store a list for their edit counts on each project they've edited
That's actually a fairly small dataset, and it would get us all the
data we want. I've been a developer before, but never
Tomasz Finc wrote:
I'm really eager to fix the latter. We currently have good
documentation from Roan and others at
http://wikitech.wikimedia.org/view/How_to_deploy_code and we'll have a
much better ecosystem when het deploy is out. For those of you that
actively deploy code .. do we have
http://wikitech.wikimedia.org/view/How_to_deploy_code mentions a couple
of pitfalls. Given that it is not in svn, I'm providing here a couple of
simple snippets to fix them. Can someone apply them?
PITFALL #1: The path argument has to be relative to the php-1.17
directory, not to the current
Michael Dale wrote:
For the TimedMediaHandler I was adding more fine-grained control over
background processes [1] and ran into a unix issue around getting both a
pid and exit status for a given background shell command.
Essentially with a background task I can get the pid or the exit status
Ahmed Kamal wrote:
Hi everyone,
Thanks for contacting us, Ahmed.
This is Ahmed, I work for Ubuntu helping the cloud community. Ubuntu has
been working on some hot technology, that aims to be apt-get for the
cloud! Basically Install and manage large scale cloud deployments of
web
We could provide a minified MediaWiki version.
Brandon Harris wrote:
You will obey:
http://commons.wikimedia.org/wiki/File:Domas-the-Giant-has-a-Posse.png
OMG that's cool.
But that Wikipedia book should have been about MySQL :)