Re: [Wikitech-l] Dynamic search results as category pages

2014-01-31 Thread Gerard Meijssen
Hoi,
In Reasonator we now support calendars [1]. It shows you things like
people who were born or died on that date and events under way on that date.

Play with it and notice that longer periods are also supported. Some longer
periods may take some time, but we have a progress bar for you. It
supports any of the languages that we support. It uses Wikidata information,
so if your favourite person or event is not mentioned, just add the details
in Wikidata.
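
For anyone wondering what that Wikidata information looks like: the calendar
entries are ordinary claims such as date of birth (P569) and date of death
(P570). Below is a minimal sketch of reading those claims through the
wbgetentities API with Python's requests library; it is only an illustration
(the item ID is an arbitrary example), not how Reasonator itself is built.

import requests

API = "https://www.wikidata.org/w/api.php"

def birth_and_death(item_id):
    """Return the raw P569/P570 time strings for one item, if present."""
    data = requests.get(API, params={
        "action": "wbgetentities",
        "ids": item_id,
        "props": "claims",
        "format": "json",
    }).json()
    claims = data["entities"][item_id]["claims"]

    def first_time(prop):
        # Handling of "no value"/"unknown value" snaks is omitted for brevity.
        if prop not in claims:
            return None
        return claims[prop][0]["mainsnak"]["datavalue"]["value"]["time"]

    return first_time("P569"), first_time("P570")

# Q937 (Albert Einstein) is used purely as an example item ID.
print(birth_and_death("Q937"))
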
Thanks,
   GerardM

PS three cheers for Magnus


[1]


On 30 January 2014 19:50, Chad innocentkil...@gmail.com wrote:

 On Sun, Jan 26, 2014 at 1:10 AM, Lydia Pintscher 
 lydia.pintsc...@wikimedia.de wrote:

  On Fri, Jan 24, 2014 at 10:53 AM, Yuri Astrakhan
  yastrak...@wikimedia.org wrote:
   Hi, I am thinking of implementing a
  
   #CATQUERY query
  
   magic keyword for the category pages.
  
   When this keyword is present, the category page would execute a query
   against the search backend instead of normal category behavior and show
   results as if those pages were actually marked with this category.
  
   For example, this would allow Greek Philosophers category page to be
   quickly redefined as
   a cross-section of the greeks & philosophers categories:
  
   #CATQUERY incategory:Greek incategory:Philosopher
  
   Obviously the community will be able to define much more elaborate
  queries,
   including the ordering (will be supported by the new search backend)
 
  In the future this will be possible using Wikidata and queries. My
  dream is that categories will to a very large extent go away then and
  be replaced with better Wikidata-based tools. I'd _really_ like us not
  to introduce more incentives to use categories like this one. Also
  it'd be quite a source of confusion to have both.
 
 
 Indeed. I think it would be very shortsighted to introduce any sort of new
 syntax for this right now, as proposed.

 There are lots of exciting query opportunities coming with Wikidata, as Lydia
 says (which will very likely end up using Elasticsearch in some manner).

 On Sun, Jan 26, 2014 at 10:21 AM, Brian Wolff bawo...@gmail.com wrote:
 
  Category intersection proposals seem to appear on this list about once
  every 2 years as far as I remember. Usually they don't succeed because
  nobody ends up writing any efficient code to implement it.
 

 Yep, we've been here before. Because category intersection isn't easy with
 the schema we've got. However, it's incredibly trivial to do in
 Elasticsearch.
 So I imagine if we invest more effort in helping Wikidata expand its syntax
 and uses we'll likely end up with what Yuri's wanting--if maybe in a
 different
 form.

 I'll also note we've already solved category intersection in Cirrus ;-)


 https://en.wikipedia.org/w/index.php?search=incategory%3A%221941+births%22+incategory%3A%221995+deaths%22&title=Special%3ASearch


 -Chad

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Language Extension Bundle 2014.01 release

2014-01-31 Thread Kartik Mistry
Hello,

I would like to announce the release of MediaWiki Language Extension
Bundle 2014.01. This bundle is compatible with MediaWiki 1.22.2 and
MediaWiki 1.21.5 releases.

* Download: 
https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2014.01.tar.bz2
* sha256sum: 2dc673ba0bbc43a3d69237c15600171e85d3c56d9ff520e5bdb3eda0f2fdc74a

Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://bugzilla.wikimedia.org
* Talk with us at: #mediawiki-i18n @ Freenode

Release notes for each extension are below.

-- Kartik Mistry

== Babel, CLDR, CleanChanges and LocalisationUpdate ==
* Only localisation updates.

== Translate ==
=== Noteworthy changes ===
* Bug 45695: Added plural support for Android file format.
* Bug 46831: Fixed an HTML validation error which caused a broken layout,
especially in the Monobook skin, when a language is blacklisted.
* Bug 59199: Fixed a regression from the previous MLEB release: message
groups without changes were listed on Special:ManageMessageGroups.
* Bug 59241: The MediaWiki plural syntax checker no longer produces
false-positive warnings.
* Bug 60128: Fixed the statistics bar for subgroups in the message group
selector so that it shows correct statistics.
* Bug 60198: Preserve whitespace in review mode. Previously, whitespace was
not preserved and new lines were displayed as single spaces, which was
garbling the messages.
* Fixed a regression caused by the Alt shortcuts introduced in the previous
MLEB release: special characters can again be accessed with AltGr.
* Translation editor now reacts better to page scrolling and resizing.
* Set the text direction and language of insertables to the source messages' values.
* Special:SupportedLanguages is now faster.
* Translate can be installed via Composer now.

== UniversalLanguageSelector ==
=== Noteworthy changes ===
* Bug 59175: When clicking a region on the map in ULS, scroll only
the list of languages and not the entire page.
* Bug 59239: Fixed an alignment issue with the ULS personal toolbar trigger.
* Bug 59958: Fixed an issue where web fonts were sometimes loaded for
elements that use the Autonym font.

=== Fonts ===
* Added new font RailwaysSans.
* Updated AbyssinicaSIL font to latest upstream (1.500) version.
* Fixed the name of Scheherazade so it properly targets the locally installed system font.

-- 
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Job queue and Redis?

2014-01-31 Thread Jamie Thingelstad
I’m pretty sure that WMF is using Redis for the MediaWiki job queue. I see some 
sparse documentation on setting it up. I already have redis on my machines and 
would like to move to this to make life easier on my database server, but I’m 
wondering if this is fully baked and ready for use. Are there folks outside of 
WMF using Redis for their job queue?

--
Jamie Thingelstad
ja...@thingelstad.com
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue and Redis?

2014-01-31 Thread Dan Andreescu

 I'm pretty sure that WMF is using Redis for the MediaWiki job queue. I see
 some sparse documentation on setting it up. I already have redis on my
 machines and would like to move to this to make life easier on my database
 server, but I'm wondering if this is fully baked and ready for use. Are
 there folks outside of WMF using Redis for their job queue?


Well, I'm not outside WMF, but with Wikimetrics [1], we use Redis as the
result backend for our Celery queue. It works great, and aside from me getting
the configuration wrong once, everything has been very solid. I'd be happy to
answer any specific questions if you have them.


[1] https://git.wikimedia.org/summary/analytics%2Fwikimetrics.git
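
For anyone who wants to try the same setup: pointing Celery at Redis for both
the broker and the result backend is a few lines of configuration. A minimal
sketch follows; the redis:// URLs and the task are placeholders, not
Wikimetrics' actual configuration.

from celery import Celery

# Redis as both the message broker and the result backend (placeholder URLs).
app = Celery(
    "tasks",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

@app.task
def add(x, y):
    return x + y

if __name__ == "__main__":
    # Requires a running worker, e.g.: celery -A tasks worker
    result = add.delay(2, 3)
    print(result.get(timeout=10))  # the value is fetched from the Redis backend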
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue and Redis?

2014-01-31 Thread Brian Wolff
On Jan 31, 2014 4:52 PM, Dan Andreescu dandree...@wikimedia.org wrote:

 
  I'm pretty sure that WMF is using Redis for the MediaWiki job queue. I
see
  some sparse documentation on setting it up. I already have redis on my
  machines and would like to move to this to make life easier on my
database
  server, but I'm wondering if this is fully baked and ready for use. Are
  there folks outside of WMF using Redis for their job queue?
 

 Well, I'm not outside WMF, but with Wikimetrics [1], we use Redis as a
 Result Backend for our Celery queue.  It works great and besides me
getting
 the configuration wrong once, everything is very solid.  I'd be happy to
 answer any specific questions if you have them.


 [1] https://git.wikimedia.org/summary/analytics%2Fwikimetrics.git

I think he meant whether the PHP code MediaWiki uses to interact with Redis
for the job queue is stable, not Redis itself.

--bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue and Redis?

2014-01-31 Thread Chad
On Fri, Jan 31, 2014 at 5:21 AM, Jamie Thingelstad ja...@thingelstad.com wrote:

 I’m pretty sure that WMF is using Redis for the MediaWiki job queue. I see
 some sparse documentation on setting it up. I already have redis on my
 machines and would like to move to this to make life easier on my database
 server, but I’m wondering if this is fully baked and ready for use.


Yes, your assumptions are correct. I'd say it's fully production ready
now--we use it to run all the job queues at WMF and we're doing
thousands of jobs inserted & processed an hour. If the documentation is
a bit lacking, maybe Aaron can be prodded into expanding it :p
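
For anyone trying the switch themselves, one crude way to check that the queue
keeps draining afterwards, independent of which backend stores the jobs, is the
approximate job counter the API already exposes. A small sketch; the api.php
URL is a placeholder for your own wiki.

import requests

API = "https://example.org/w/api.php"  # placeholder; point this at your wiki

def pending_jobs():
    # siprop=statistics includes an approximate count of queued jobs.
    data = requests.get(API, params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "statistics",
        "format": "json",
    }).json()
    return data["query"]["statistics"]["jobs"]

print("jobs waiting:", pending_jobs())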


 Are there folks outside of WMF using Redis for their job queue?


Dunno. Maybe Translatewiki? Wikia?

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dynamic search results as category pages

2014-01-31 Thread Dan Garry
Hey Yuri,

This is a really cool idea. It's almost like set theory for categories! You
can define certain base categories (populated by manually adding pages to them
using the existing category syntax), then define other categories based on
set-theoretic operations on those base categories. Certainly, if we could go
back in time and build categories from the ground up again, I think this
approach would be far superior.

This feature is sufficiently complex from a product standpoint that I think
the best thing to do here is to wait and see what happens with
CirrusSearch's integration with Wikidata. Many cool features like this may
end up being made obsolete by that integration. If not, we can revisit this
then.

Thanks,
Dan






On 24 January 2014 01:53, Yuri Astrakhan yastrak...@wikimedia.org wrote:

 Hi, I am thinking of implementing a

 #CATQUERY query

 magic keyword for the category pages.

 When this keyword is present, the category page would execute a query
 against the search backend instead of normal category behavior and show
 results as if those pages were actually marked with this category.

 For example, this would allow Greek Philosophers category page to be
 quickly redefined as
 a cross-section of the greeks & philosophers categories:

 #CATQUERY incategory:Greek incategory:Philosopher

 Obviously the community will be able to define much more elaborate queries,
 including the ordering (will be supported by the new search backend)




-- 
Dan Garry
Associate Product Manager for Platform
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dynamic search results as category pages

2014-01-31 Thread Chad
On Thu, Jan 30, 2014 at 10:50 AM, Chad innocentkil...@gmail.com wrote:

 I'll also note we've already solved category intersection in Cirrus ;-)


 https://en.wikipedia.org/w/index.php?search=incategory%3A%221941+births%22+incategory%3A%221995+deaths%22&title=Special%3ASearch



That should work in both the old and new search, actually.

It just goes to show you how easy this is in the Lucene world.
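
For scripts, the same intersection can be run through the action API
(list=search) instead of the Special:Search URL; a minimal sketch in Python,
assuming the standard api.php endpoint on enwiki.

import requests

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "search",
    "srsearch": 'incategory:"1941 births" incategory:"1995 deaths"',
    "srlimit": 10,
    "format": "json",
}

# Same incategory: intersection as the Special:Search URL quoted above.
for hit in requests.get(API, params=params).json()["query"]["search"]:
    print(hit["title"])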

-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to retrieve the page, execute some time expensive operation and edit the page ONLY if it wasn't changed meanwhile

2014-01-31 Thread Mark Holmquist
On Wed, Jan 22, 2014 at 12:41:30PM +0100, Petr Bena wrote:
 1) Retrieve the page - page must not be changed starts NOW
 2) Do something what requires user input, possibly may last few minutes
 3) Save the page ONLY if it wasn't changed, if it was, go back to step 1

The way we should *cough* do this is If-Unmodified-Since.

http://restpatterns.org/HTTP_Headers/If-Unmodified-Since

It should be passed a value that matches a Last-Modified value that we
get from the previous fetch page contents API call.

But, I'm not convinced we could do this with our current API setup...
I think there are too many overlaps between things. Last-Modified may
apply, in this case, to the text of the page you're fetching, or, since
you can batch requests, it may apply to the image information you're
fetching, or the post count of the Flow board you're fetching, which
may cock up your estimate. And how the API is supposed to know what part
of the API call is constrained by the If-Unmodified-Since header is beyond
me. Then again, you could apply it to all of them, and just pass the
date of your last request as the value.
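
Client-side, the pattern itself is simple; a rough sketch with Python's
requests against a hypothetical endpoint that honours the headers (which, as
noted, our API does not do today).

import requests

URL = "https://example.org/page/Foo"  # hypothetical resource honouring the headers

# 1) Fetch the page and remember when the server says it last changed.
resp = requests.get(URL)
last_modified = resp.headers["Last-Modified"]
text = resp.text

# 2) ... the time-expensive work on `text` happens here ...

# 3) Save only if nothing changed in the meantime.
save = requests.put(URL, data=text,
                    headers={"If-Unmodified-Since": last_modified})
if save.status_code == 412:
    # 412 Precondition Failed: someone else edited the page; go back to step 1.
    print("Page changed underneath us, retrying from step 1")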

I'd be interested to see this happen, but I suspect it would take a lot
of digging. :)

-- 
Mark Holmquist
Software Engineer, Multimedia
Wikimedia Foundation
mtrac...@member.fsf.org
https://wikimediafoundation.org/wiki/User:MHolmquist

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to retrieve the page, execute some time expensive operation and edit the page ONLY if it wasn't changed meanwhile

2014-01-31 Thread Tyler Romeo
On Fri, Jan 31, 2014 at 8:38 PM, Mark Holmquist mtrac...@member.fsf.org wrote:

 http://restpatterns.org/HTTP_Headers/If-Unmodified-Since

 It should be passed a value that matches a Last-Modified value that we
 get from the previous fetch page contents API call.


Unfortunately, MediaWiki refuses to obey the HTTP spec, which is why we
don't have ETag support either.

--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l