User Raymond changed the status of MediaWiki.r104323.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104323
Commit summary:
about time I added myself :)
___
MediaWiki-CodeReview mailing list
User Raymond changed the status of MediaWiki.r104325.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104325
Commit summary:
added action message
User Raymond changed the status of MediaWiki.r104328.
Old Status: new
New Status: deferred
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104328
Commit summary:
added missing action message
The Article class was a problem we've been trying to fix.
Article was basically three things rolled into one:
1) A page generation class that when run would fill up OutputPage
2) A context that references a Title, of course using the global $wg
context for the rest at the time
3) A glorified
User Dantman posted a comment on MediaWiki.r104334.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104334#c26583
Commit summary:
* Use WikiPage instead of Article in Skin and SkinTemplate
* Added $context parameter to Action::factory() to allow callers passing a
WikiPage object
User Reedy changed the status of MediaWiki.r104313.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104313
Commit summary:
Follow-up r104312, tab to space
User Reedy changed the status of MediaWiki.r104332.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104332
Commit summary:
Fix a syntax error caused by using the same variable name for two different
things.
User Reedy posted a comment on MediaWiki.r104274.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104274#c26584
Commit summary:
Implement path routing code.
- Makes extending paths with extensions simpler.
- Should fix bug 32621 by parsing paths based on pattern weight rather than
User Jack Phoenix changed the status of MediaWiki.r104335.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104335
Commit summary:
Update my userinfo file
User Duplicatebug posted a comment on MediaWiki.r104343.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104343#c26585
Commit summary:
Bug 32673: Keep the username in the input field if not existing
We could also add on LogEventsList.php line 870
Hello!
I don't know if the subject of this question belongs to the scope of this
group. Anyway, I will be pleased if I find an answer to my question.
I'm writing some Java code to perform NLP tasks on texts using
Wikipedia. What can I do to extract the first paragraph of a
User Duplicatebug posted a comment on MediaWiki.r104318.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104318#c26586
Commit summary:
Bug 32603 - limit option is missing on Special:BlockList
Okay so essentially this is a workaround for a 1.18 release blocker, and is not
a real
do you want this from all articles or just one?
see here
http://stackoverflow.com/questions/627594/is-there-a-wikipedia-api
mike
On Sun, Nov 27, 2011 at 6:02 PM, Khalida BEN SIDI AHMED
send.to.khal...@gmail.com wrote:
Hello!
I don't know if the subject of this question belongs to the scope of this
User Platonides changed the status of MediaWiki.r104322.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104322
Commit summary:
Followup r104318, add the new class to the Autoloader
I have already read the responses given in this post.
I want to extract the first paragraph (or the first sentence) for a
list of 100 articles.
I could not use JWPL because I don't have enough hard disk space to create
the DB. I tried to use JSoup but I need examples.
User Platonides changed the status of MediaWiki.r104337.
Old Status: new
New Status: ok
User Platonides also posted a comment on MediaWiki.r104337.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104337#c26587
Commit summary:
Change getUrl to getFullUrl (due to r98745), so it
look,
for 100 articles, just create a list of them, and export them as xml.
or use the book creator.
http://en.wikipedia.org/wiki/Help:Books
also there is a json api to pull single articles.
http://www.barattalo.it/2010/08/29/php-bot-to-get-wikipedia-definitions/
mike
On Sun, Nov 27, 2011 at
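For a fixed list of titles, both suggestions above come down to building the right request URLs. A minimal sketch (Python here for brevity, though the original poster works in Java; the HTTP call and response parsing are omitted, and the API parameters shown fetch raw wikitext rather than rendered HTML):

```python
from urllib.parse import quote, urlencode

def export_url(title):
    # Special:Export returns the page's wikitext wrapped in XML;
    # its form interface also accepts a whole list of titles at once.
    return ("http://en.wikipedia.org/wiki/Special:Export/"
            + quote(title.replace(" ", "_")))

def api_content_url(title):
    # The JSON API: action=query with prop=revisions and rvprop=content
    # fetches the current wikitext of a single article.
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "rvprop": "content",
        "titles": title,
    }
    return "http://en.wikipedia.org/w/api.php?" + urlencode(params)
```

Feeding either URL to any HTTP client then gives one XML or JSON document per article.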
User Johnduhart posted a comment on MediaWiki.r104318.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104318#c26588
Commit summary:
Bug 32603 - limit option is missing on Special:BlockList
Okay so essentially this is a workaround for a 1.18 release blocker, and is not
a real
http://code.google.com/p/jwpl/ this also looks good
JWPL (Java Wikipedia Library) is a free, Java-based application
programming interface that allows access to all information contained
in Wikipedia.
have not tried that, but you said you wanted to do it in Java.
mike
On Sun, Nov 27, 2011 at
I'm developing my project in Java. I'm not a good PHP developer.
JWPL first needs to create a database whose size is 158 GB. For the RAM, at
least 2 GB is necessary. I have neither a big hard disk nor that much
RAM. In addition, creating such a big database just to extract the
first
User Johnduhart changed the status of MediaWiki.r104349.
Old Status: new
New Status: ok
User Johnduhart also posted a comment on MediaWiki.r104349.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104349#c26589
Commit summary:
follow-up r104347: doesn't seem to change anything,
The list of the articles I will need is not known from the beginning.
During my project, I will find a list of 50 words. I try to find
definitions for them in Wikipedia. After that I will extract the hypernym of
each word. I will have a new list for which I then retrieve the respective
have you seen wordnet, http://en.wikipedia.org/wiki/WordNet or
wiktionary?
in any case you can call the export routine as you need it, even for
single articles. I think it would not cause too much load on the
server.
anyway, good luck
mike
On Sun, Nov 27, 2011 at 6:36 PM, Khalida BEN SIDI AHMED
The words I use belong to a specialized domain (the oil and gas industry),
so WordNet and even Wiktionary are not useful enough (they are
general-purpose corpora).
Thank you very much indeed.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
User MaxSem changed the status of MediaWiki.r94762.
Old Status: fixme
New Status: new
User MaxSem also posted a comment on MediaWiki.r94762.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/94762#c26590
Commit summary:
Started rewriting WikiHiero. So far, only parser tag works
* Khalida BEN SIDI AHMED wrote:
JWPL first needs to create a database whose size is 158 GB. For the RAM, at
least 2 GB is necessary. I have neither a big hard disk nor that much
RAM. In addition, creating such a big database just to extract the
first sentence of each article seems to me
User Happy-melon changed the status of MediaWiki.r89020.
Old Status: resolved
New Status: fixme
User Happy-melon also posted a comment on MediaWiki.r89020.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/89020#c26591
Commit summary:
Add support for detecting number of wikis a
Thank you Hoehrmann. I will try to apply the options you've mentioned.
However, if someone can help me in using JSoup, their ideas are welcome.
If you are focused on jsoup, then it's best to ask on their mailing
list: http://jsoup.org/discussion
On Sun, Nov 27, 2011 at 7:51 PM, Khalida BEN SIDI AHMED
send.to.khal...@gmail.com wrote:
Thank you Hoehrmann. I will try to apply the options you've mentioned.
However, if someone can help me in
Hello!
In the HTML code of a Wikipedia article, how can one recognise the
*first* sentence of this article?
* Khalida BEN SIDI AHMED wrote:
In the HTML code of a Wikipedia article, how can one recognise the
*first* sentence of this article?
It's not marked up and probably differs among language versions. On the
English version the first `p` child of a `mw-content-ltr` element is a
good bet, as I pointed out
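That heuristic can be sketched with Python's stdlib parser for self-containment (the asker would port this to Java with JSoup, roughly `doc.select("div.mw-content-ltr p").first()`; the class name and nesting are the assumptions Bjoern notes, and real article HTML varies):

```python
from html.parser import HTMLParser

class FirstParagraph(HTMLParser):
    """Collect the text of the first <p> inside the mw-content-ltr container."""
    def __init__(self):
        super().__init__()
        self.in_content = False  # inside the content container?
        self.in_p = False        # inside the target <p>?
        self.done = False        # already captured the first <p>?
        self.text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "mw-content-ltr" in (attrs.get("class") or ""):
            self.in_content = True
        elif tag == "p" and self.in_content and not self.done:
            self.in_p = True

    def handle_endtag(self, tag):
        if tag == "p" and self.in_p:
            self.in_p = False
            self.done = True  # ignore every later paragraph

    def handle_data(self, data):
        if self.in_p:
            self.text.append(data)

def first_paragraph(html):
    parser = FirstParagraph()
    parser.feed(html)
    return "".join(parser.text).strip()
```

On a snippet like `<div class="mw-content-ltr"><p>Intro.</p><p>More.</p></div>` this yields only the intro paragraph's text.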
User SPQRobin changed the status of MediaWiki.r104333.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104333
Commit summary:
Consistency tweak for r104325
User SPQRobin changed the status of MediaWiki.r104361.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104361
Commit summary:
Register new message key from r104325
Thank you very much. That's exactly what I wanted to know.
2011/11/27 Bjoern Hoehrmann derhoe...@gmx.net
* Khalida BEN SIDI AHMED wrote:
In the HTML code of a Wikipedia article, how can one recognise the
*first* sentence of this article?
It's not marked up and probably differs among language
User Brion VIBBER changed the status of MediaWiki.r104312.
Old Status: new
New Status: resolved
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104312
Commit summary:
(bug 32630) Add image/vnd.adobe.photoshop as alias for image/x-photoshop
User Reedy changed the status of MediaWiki.r89020.
Old Status: fixme
New Status: resolved
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/89020
Commit summary:
Add support for detecting number of wikis a user is blocked on, and preventing
users from voting if they are blocked on
Bjoern Hoehrmann wrote:
* Khalida BEN SIDI AHMED wrote:
In the HTML code of a Wikipedia article, how can one recognise the
*first* sentence of this article?
It's not marked up and probably differs among language versions. On the
English version the first `p` child of a `mw-content-ltr` element is
So... this seems to have snuck back in a month ago:
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/101021
https://bugzilla.wikimedia.org/show_bug.cgi?id=21860
Have we resolved the deployment questions on how to actually do the change?
Just want to make sure ops has plenty of warning
User G.Hagedorn posted a comment on MediaWiki.r86105.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/86105#c26592
Commit summary:
Useful extension by Milan allowing files to be attached to pages
Comment:
Please start documenting this extension. The documentation (
https://en.wikipedia.org/w/index.php?title=Sharon_Aguilar&action=history
Request: GET
http://en.wikipedia.org/w/index.php?title=Sharon_Aguilar&action=history, from
208.80.152.70 via sq66.wikimedia.org (squid/2.7.STABLE9) to ()
Error: ERR_CANNOT_FORWARD, errno (11) Resource temporarily unavailable
I am also getting this from en.m.wikipedia.org:
Error 503 Service Unavailable
Service Unavailable
Guru Meditation:
XID: 1592365530
Varnish cache server
On android, verizon 4g, default browser, 10+ minutes
On Nov 27, 2011 5:18 PM, William Allen Simpson
william.allen.simp...@gmail.com wrote:
We had a site outage of about 30 mins, caused by a major issue,
potentially hardware-related, with a database server, which blocked
all MediaWiki application servers (and thereby rendered most of our
sites unusable). Should be fixed now; we'll prepare a more
comprehensive incident analysis soon.
User Krinkle changed the status of MediaWiki.r102753.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102753
Commit summary:
Escape the messages.
As per
https://www.mediawiki.org/wiki/User:Krinkle/Extension_review/WebFonts#HTML_escaping
Hello,
This is the answer that was given to my question:
http://stackoverflow.com/questions/8286786/wikipedia-first-paragraph
It works perfectly; the code may be useful for you.
Truly yours
Khalida Ben Sidi Ahmed
It appears that we were actually taken down by the reddit community, after
a link to the fundraising stats page was posted under Brandon's IAMA there.
sq71.wikimedia.org 943326197 2011-11-27T22:51:09.075 62032 109.125.42.71
TCP_MISS/200 1035 GET
User Krinkle changed the status of MediaWiki.r102771.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102771
Commit summary:
While resetting font-family , remove font-family attribute instead of setting
to None.
As reported at
User Krinkle changed the status of MediaWiki.r102770.
Old Status: new
New Status: fixme
User Krinkle also posted a comment on MediaWiki.r102770.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102770#c26593
Commit summary:
Create ext.webfonts.init and ext.webfonts.core modules,
User Johnduhart changed the status of MediaWiki.r104379.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104379
Commit summary:
added varnent
User Johnduhart posted a comment on MediaWiki.r104381.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104381#c26594
Commit summary:
initial extension commit
Comment:
+# Sidebar settings
+$wgAddThisSBServ1 = 'compact';
+$wgAddThisSBServ1set = '';
+$wgAddThisSBServ2
User Platonides posted a comment on MediaWiki.r104343.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104343#c26595
Commit summary:
Bug 32673: Keep the username in the input field if not existing
We could also add on LogEventsList.php line 870
User Varnent posted a comment on MediaWiki.r104381.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104381#c26596
Commit summary:
initial extension commit
Comment:
yes! thank you for the suggestion - over the evolution of the code that
obvious step slipped past me :) will
User Krinkle changed the status of MediaWiki.r102456.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102456
Commit summary:
[RL2] Followup r102455: forgot a global
User Krinkle changed the status of MediaWiki.r102782.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102782
Commit summary:
[RL2] Remove obsolete TODO comment: all the data we need is in
$wgResourceLoaderSources which is already exposed, so we
User Krinkle changed the status of MediaWiki.r102783.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102783
Commit summary:
[RL2] Properly fix the live hack from r99889
User Krinkle changed the status of MediaWiki.r102412.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102412
Commit summary:
[RL2] Set expiry times for cached gadget data. Expiry times are determined by
the repo and set to either zero or
User Krinkle changed the status of MediaWiki.r102797.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102797
Commit summary:
[RL2] Remove long-resolved FIXME comment
User Johnduhart posted a comment on MediaWiki.r104384.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104384#c26597
Commit summary:
switched to arrays in response to r104381#c26594 - ty Johnduhart
Comment:
Much better, however still room for improvement. How about instead of
User Krinkle changed the status of MediaWiki.r102859.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102859
Commit summary:
[RL2] Followup r102857: wpgadget has been changed to gadgetpref as well
User Reedy changed the status of MediaWiki.r104380.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104380
Commit summary:
initial extension commit
On 28/11/11 08:29, Brion Vibber wrote:
So... this seems to have snuck back in a month ago:
https://www.mediawiki.org/wiki/Special:Code/MediaWiki/101021
https://bugzilla.wikimedia.org/show_bug.cgi?id=21860
I don't think it really snuck, Rob has been talking about it for a
while, see e.g.
User Tim Starling changed the status of MediaWiki.r94762.
Old Status: new
New Status: resolved
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/94762
Commit summary:
Started rewriting WikiHiero. So far, only parser tag works while the maint
functions are broken (however, they're
MediaWiki Bugzilla Report for November 21, 2011 - November 28, 2011
Status changes this week
Bugs NEW : 411
Bugs ASSIGNED : 9
Bugs REOPENED : 61
Bugs RESOLVED
Nice one, a vandal topped the bug-resolving list...
Regards,
Hydriz
Date: Mon, 28 Nov 2011 03:00:01 +
To: wikitech-l@lists.wikimedia.org
From: repor...@kaulen.wikimedia.org
Subject: [Wikitech-l] Bugzilla Weekly Report
MediaWiki Bugzilla Report for November 21, 2011 - November 28,
I have no idea about the schema changes, but choosing a digest for
detection of identity reverts is pretty simple. The really difficult
part is to choose a locality-sensitive hash or fingerprint that works
for very similar revisions with a lot of content.
I would propose that the digest is stored
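The identity-revert half really is that simple: fingerprint each revision's text and compare against earlier fingerprints. A sketch (SHA-1 is my arbitrary choice of digest here; the locality-sensitive case for near-identical revisions is the open problem noted above and is not attempted):

```python
import hashlib

def revision_digest(text):
    # Exact-content fingerprint: identical wikitext -> identical digest.
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

def find_identity_reverts(history):
    """Return indices of revisions whose content exactly matches an earlier one."""
    seen = {}      # digest -> index of first revision with that content
    reverts = []
    for i, text in enumerate(history):
        d = revision_digest(text)
        if d in seen:
            reverts.append(i)  # revision i restores revision seen[d] byte-for-byte
        else:
            seen[d] = i
    return reverts

history = ["good text", "vandalized text", "good text"]
# revision 2 restores revision 0 exactly, so it is an identity revert
assert find_identity_reverts(history) == [2]
```

Storing one digest per revision row makes this an index lookup instead of a full-text comparison.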
User Tim Starling changed the status of MediaWiki.r104316.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104316
Commit summary:
Revert r104309, does not work.
User Tim Starling changed the status of MediaWiki.r104307.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104307
Commit summary:
Rename the cryptic variable names in jquery.ucompare
Make the CSS file look less like it was typed on a keyboard
User Tim Starling changed the status of MediaWiki.r103658.
Old Status: new
New Status: fixme
User Tim Starling also posted a comment on MediaWiki.r103658.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/103658#c26599
Commit summary:
add click handler to switch instantly between
User Tim Starling posted a comment on MediaWiki.r99911.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/99911#c26600
Commit summary:
Per bug 28135, disable $wgMaxImageArea check when transforming using a hook,
and enable the check for non IM scalers.
Comment:
You know, it would
User Johnduhart posted a comment on MediaWiki.r104388.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104388#c26601
Commit summary:
improvement in use of arrays per feedback to r104384 - ty Johnduhart
Comment:
+ foreach ( $wgAddThisHServ as $n => $a ) {
+
User MaxSem posted a comment on MediaWiki.r100182.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/100182#c26602
Commit summary:
Follow up to r82136;
Comment:
Bug 32652.
User Santhosh.thottingal changed the status of MediaWiki.r102770.
Old Status: fixme
New Status: new
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/102770
Commit summary:
Create ext.webfonts.init and ext.webfonts.core modules, set 'position' = 'top'
for both of these modules so
On Sun, Nov 27, 2011 at 7:06 PM, Hydriz Wikipedia ad...@wikisorg.tk wrote:
Nice one, a vandal topped the bug-resolving list...
Remember, kids: cheaters never win!
-- brion
Hi everyone,
Here's an attempt at putting together a more sophisticated (though not
necessarily correct) goal for 1.19:
https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Agte_lJNpi-OdDJvZVBmYUtZbm1zYVRCYTFRZkxuVFE&hl=en_US#gid=5
The goal was not to make something 100% accurate based on
User Hashar posted a comment on MediaWiki.r100588.
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/100588#c26603
Commit summary:
using real identity for hashar
Converting my pseudonym to use my real identity instead:
Ashar Voultoiz - Antoine Musso
Comment:
Thanks for spotting
User Raymond changed the status of MediaWiki.r104404.
Old Status: new
New Status: ok
Full URL: http://www.mediawiki.org/wiki/Special:Code/MediaWiki/104404
Commit summary:
using real identity for hashar
Most other occurrences were changed by r100588 already.
Thanks to Reach Out to the Truth
Dear Ben Sidi Ahmed,
DBpedia might have all the data you need, already extracted (also in
over 80 languages):
http://wiki.dbpedia.org/Downloads37
Here are the first two sentences of each article in a structured format:
http://downloads.dbpedia.org/3.7/en/short_abstracts_en.nt.bz2
Here is the first
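The .nt files are line-oriented N-Triples, so the abstracts can be pulled out with a streaming one-line-at-a-time parse, no RDF toolkit needed. A hedged sketch (the sample line in the test is illustrative, not copied from the dump, and the exact predicate URI is an assumption):

```python
import re

# One triple per line: <subject> <predicate> "literal"@en .
NT_LINE = re.compile(r'^<([^>]+)>\s+<[^>]+>\s+"(.*)"@en\s+\.$')

def parse_abstract(line):
    """Extract (article title, abstract) from one N-Triples line, or None."""
    m = NT_LINE.match(line.strip())
    if not m:
        return None
    resource, abstract = m.groups()
    # The DBpedia resource URI ends with the underscored article title.
    title = resource.rsplit("/", 1)[1].replace("_", " ")
    return title, abstract
```

Iterating the (bz2-decompressed) file through `parse_abstract` and keeping only the 100 wanted titles avoids JWPL's 158 GB database entirely.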
User Valhallasw changed the status of pywikipedia.r9775.
Old Status: new
New Status: fixme
User Valhallasw also posted a comment on pywikipedia.r9775.
Full URL: http://www.mediawiki.org/wiki/Special:Code/pywikipedia/9775#c26604
Commit summary:
Remove DEPRECATED spam.
Comment:
Seriously? If a