On 2013-08-21 22:54, Brad Jorsch (Anomie) wrote:
On Wed, Aug 21, 2013 at 4:13 PM, Mathieu Stumpf
psychosl...@culture-libre.org wrote:
But no, it doesn't; I still generate a randomly ordered wiki table. So,
what did I miss?
Two things:
1. table.sort() only sorts the sequence part of
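The point being truncated above is a well-known Lua behavior: table.sort() orders only the array (sequence) part of a table, so entries stored under string keys are never sorted, and a table built entirely with string keys has nothing for it to sort. A minimal sketch (assuming a standard Scribunto/Lua 5.1 environment; the wikitable-building step is left as a comment):

```lua
local t = { "b", "a", "c", x = "ignored" }  -- array part: "b","a","c"; hash part: x
table.sort(t)  -- sorts only t[1]..t[3]; t.x is untouched

-- A table keyed only by strings has no sequence part, so this is a no-op:
local langs = { fr = "French", de = "German" }
table.sort(langs)  -- #langs == 0, nothing is sorted

-- To get a stable order, collect the keys into a sequence and sort that:
local keys = {}
for k in pairs(langs) do keys[#keys + 1] = k end
table.sort(keys)
for _, k in ipairs(keys) do
    -- emit one wikitable row per key here, in sorted key order
end
```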
On 21 August 2013 13:56, Rob Lanphier ro...@wikimedia.org wrote:
Hi everyone,
After assessing the current readiness (or lack thereof) of our HTTPS
code, we've decided to postpone the deployment for a week. We have a
number of things that we'd like to get cleaner resolution on:
* Use of
On Sat, Aug 17, 2013 at 05:55:36PM -0400, Sumana Harihareswara wrote:
I suggest that we also update either
https://meta.wikimedia.org/wiki/HTTPS or a hub page on
http://wikitech.wikimedia.org/ or
https://www.mediawiki.org/wiki/Security_auditing_and_response with
up-to-date plans, to make it
Dear semantic wiki users and developers,
We are very happy to announce that early bird registration to the 8th
Semantic MediaWiki Conference is now open [2]!
Important facts reminder:
--
* Dates: October 28th to October 30th 2013 (Monday to Wednesday)
* Location: AO
On Aug 20, 2013, at 2:31 AM, Tyler Romeo tylerro...@gmail.com wrote:
As long as the change does not inhibit extensions from hooking in and using
other CSS pre-processors, I don't see any issue with using LESS in core.
However if and when we adopt LESS support in core, which only happens if
I sent this to the Editor Engagement list, but maybe here there are more
people interested.
On 08/20/2013 06:39 AM, Quim Gil wrote:
Upon account creation or in your user profile: Send me important email
updates
You have seen this feature in many collaborative sites, but not in
Wikimedia sites.
On 08/01/2013 03:08 AM, Jiang BIAN bianji...@google.com wrote:
Hi,
I noticed some pages we crawled containing error messages like this:
&lt;div id="mw-content-text" lang="zh-CN" dir="ltr" class="mw-content-ltr"&gt;&lt;p
class="error"&gt;Failed to render property P373:
Wikibase\LanguageWithConversion::factory: given
On Fri, Aug 23, 2013 at 7:06 AM, Sumana Harihareswara suma...@wikimedia.org
wrote:
On 08/01/2013 03:08 AM, Jiang BIAN bianji...@google.com wrote:
Hi,
I noticed some pages we crawled containing error messages like this:
&lt;div id="mw-content-text" lang="zh-CN" dir="ltr"
class="mw-content-ltr"&gt;&lt;p
Forwarding to the Wikidata tech list in case this makes a future
Wiktionary collaboration easier.
Original Message
Subject: [Wiki-research-l] Java-based Wiktionary Library (JWKTL) 1.0.0
released as open source software (Wiki-research-l Digest, Vol 96, Issue 22)
Date: Tue, 20
We are actually crawling the HTML via bot, so the bug is not actually fixed
for non-logged-in users, right?
Could you share the bug's link?
Thanks
On Thu, Aug 22, 2013 at 4:38 PM, Liangent liang...@gmail.com wrote:
On Fri, Aug 23, 2013 at 7:06 AM, Sumana Harihareswara
suma...@wikimedia.org
On Fri, Aug 23, 2013 at 8:13 AM, Jiang BIAN bianji...@google.com wrote:
We are actually crawling the HTML via bot, so the bug is not actually fixed
for non-logged-in users, right?
I can't think of a good way to fix the problem from this angle besides
waiting for the old cached pages to expire, unless
Thanks for the link. But I think this targets the language-variant-related
fix.
We actually observed stale cache in a wider range, see the bug entry:
https://bugzilla.wikimedia.org/show_bug.cgi?id=46014
On Thu, Aug 22, 2013 at 5:26 PM, Liangent liang...@gmail.com wrote:
On Fri, Aug 23,
On Fri, Aug 23, 2013 at 8:33 AM, Jiang BIAN bianji...@google.com wrote:
Thanks for the link. But I think this targets the language-variant-related
fix.
This is the root cause of the behavior you mentioned. (It only happens /
happened on zhwiki and perhaps some other wikis with
The Swedish Wikipedia now has more than 1.5 million
articles, compared to 600,000 in January 2013 and
500,000 in September 2012. This is due to the creation
by a bot of many articles on animal and plant species.
The Swedish Wikipedia community has discussed the
matter thoroughly, and there is
On 23/08/13 10:48, Lars Aronsson wrote:
But it is not obvious how a bug report or feature
request should be written. A naive approach would be
to ask for a random article that wasn't created by a
bot, but this is not to the point.
That was my solution when this issue came up on the English
On 08/23/2013 03:57 AM, Tim Starling wrote:
An approximation would be to select, say, 100 articles from the
database using page_random, then calculate a weight for each of those
100 articles using complex criteria, then do a weighted random
selection from those 100 articles.
Interesting. An
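The approximation Tim describes — sample a batch of pages via page_random, score each with arbitrarily complex criteria, then draw one in proportion to its score — is a standard weighted random selection. A sketch of the selection step (hypothetical `weight` function; Lua used for illustration, not the actual MediaWiki implementation):

```lua
-- candidates: e.g. 100 pages picked from the database using page_random
-- weight(page): any complex criterion, e.g. returning 0 for bot-created pages
local function weightedPick(candidates, weight)
    local total, w = 0, {}
    for i, page in ipairs(candidates) do
        w[i] = weight(page)
        total = total + w[i]
    end
    if total == 0 then return nil end  -- nothing eligible in this sample
    local r = math.random() * total    -- uniform point in [0, total)
    for i, page in ipairs(candidates) do
        r = r - w[i]
        if r < 0 then return page end  -- page i chosen with probability w[i]/total
    end
end
```

Pages with weight 0 (say, bot-created articles) can never be returned, which is what makes this a workable "random non-bot article" approximation.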
Just add all the non-bot articles to a category and use
Special:RandomInCategory. ;-)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l