Re: [Wikitech-l] RfC update: LESS stylesheet support in core

2013-09-19 Thread Dan Andreescu
 - Has http://learnboost.github.io/stylus/ been considered? I've heard that
 it's a good compromise between sass and less (but I haven't played with it
 myself to see if it really lets you do more compass-like things).


I was just writing a message about Stylus [0] so I'm glad you brought it
up.  Limn [1] uses Stylus and we've been pretty happy with it.  I read the
RFC carefully and it seems the two big reasons to pick LESS over
Stylus/SASS are popularity and support in PHP.  The reason to pick
Stylus/SASS over LESS is a more elegant syntax and a slight edge in
features.

*PHP support* - Stylus does have PHP support [2], but it's nowhere near as
mature as the LESS support.

*Popularity* - does matter; one of the long comment threads on the RFC is
from a potential contributor who is concerned that LESS makes it harder to
contribute.  I mostly agree with Jon's and Steven's arguments that LESS is
pretty easy to learn.  However, I have also heard about a year's worth of
complaints about Limn being written in Coco instead of pure Javascript.  I
personally think going from CSS to LESS is just as mentally taxing as going
from Javascript to Coco, but judging by the feedback I've received, I'm in
the minority.  I'd be cautious here.  You can upcompile CSS into LESS, sure,
but if a contributor has to understand a complex LESS codebase full of
mixins and abstractions while debugging the generated CSS in the browser,
they're right to point out that this requires effort.  And this effort is
only increased for more elegant languages like Stylus.

*Syntax* - Stylus and SASS definitely have cleaner, simpler syntax.  Stylus
aims to be the cleanest of the three but it definitely smells like that SNL
skit about the number of razor blades.  They have 4 blades?!  Fine, we'll
make one with *5* BLADES!!!  What I'm referring to here is that Stylus has
optional colons and tries to be as much like Python as possible.

*Features* - The interesting thing about the feature comparisons out there
is that all of them seem to be outdated.  For example this write-up [3]
highlights that @media queries can be nested in SASS (same is true for
Stylus).  But the LESS people implemented that as well (Feb 2013).  This
said, it does seem that Stylus and SASS are leading the pack in terms of
new features.  Introspection [4] is a very cool one in Stylus that I'm not
sure you can do in LESS.


I think the decision's pretty much been made to go with LESS, and I agree
with it.  I think it strikes the better balance between making it easy for
people to contribute and DRY-ing up our codebase.  But in the future, if we
love the migration to LESS but wish it had more features and more
DRY-ness, we should revisit Stylus.


[0] - http://learnboost.github.io/stylus/
[1] - https://github.com/wikimedia/limn/tree/develop/css
[2] - https://github.com/AustP/Stylus.php
[3] - http://css-tricks.com/sass-vs-less/
[4] - http://learnboost.github.io/stylus/docs/introspection.html
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RfC update: LESS stylesheet support in core

2013-09-20 Thread Dan Andreescu
On Fri, Sep 20, 2013 at 12:53 PM, C. Scott Ananian
canan...@wikimedia.org wrote:

 On Fri, Sep 20, 2013 at 2:35 PM, Ori Livneh o...@wikimedia.org wrote:

  I personally think it'd be unfortunate for this current effort to
 collapse
  over such considerations, but I'm obviously biased.
 

 Oh, I certainly agree.  For my part, I'm satisfied that the
 LESS/Sass/stylus issues have been adequately thought through (maybe some of
 this can make it back into the RfC).  The
 http://leafo.net/lessphp/docs/#custom_functions stuff looks very
 promising,
 it probably should be explicitly mentioned in any LESS for MW docs we
 write.  I look forward to seeing the @import guidelines as well.
  --scott


Heartily agree as well.  I alluded to this in my longer answer.  Basically
Stylus/SASS do seem to be slightly ahead of LESS but it's a vanishing
difference and meaningLESS over the long term.

 The biggest gains to be had from using a CSS
 preprocessor tend to come from the most
 basic features

This I think is a most astute point from Ori.  It's why I made the analogy
to Coco.  I don't and never will use any of the complicated crazy Coco
constructs.  But writing "class LineNode extends TimeseriesNode" instead of
all the JS boilerplate for classes and inheritance is good.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSoC 2013 summary: Incremental dumps

2013-10-09 Thread Dan Andreescu
This seems like a great project Petr, congratulations and I look forward to
seeing it deployed.


On Mon, Oct 7, 2013 at 1:38 PM, Petr Onderka gsv...@gmail.com wrote:

 Hi,

 during the summer I've worked on making dumps of page and revision
 information for Wikimedia wikis incremental [1].
 This includes both server (faster updating of dumps) and client
 (download only changes since last dump) sides.

 The project was successful, though there remain some issues that have
 to be fixed before this goes into production [2].

 I've had fun working on this, and I plan to continue with that, as time
 permits.
 I would like to thank to my mentors, to Tyler Romeo, and especially to
 Ariel T. Glenn for being there for me.

 Petr Onderka
 [[User:Svick]]

 [1]: https://www.mediawiki.org/wiki/User:Svick/Incremental_dumps
 [2]: https://bugzilla.wikimedia.org/show_bug.cgi?id=54633

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Inspecting page performance with mw.loader.inspect()

2013-10-11 Thread Dan Andreescu
 
  Sorry to pick on this example in particular but I was surprised to see
   so much code for the Universal Language selector (ULS) - especially as
   a single language speaker I don't ever use any of them - and I am thus
   being penalised.
 
 
  We, the small minority of humankind who speak other languages, apologize
  for the inconvenience and promise to learn English as soon as possible.
 
 Hah. But I hope, comedy aside, my point holds. We should all be getting into
 the habit of loading things as and when needed rather than all at the
 beginning.


I know close to nothing about this, but I'm kind of interested in finding
out.  Would it be possible to $('little language toothed wheel
thing').on('click', load something like jquery.uls.data)?  That alone
would already save 37.13KB.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] mw.inspect: new CSS report

2013-10-24 Thread Dan Andreescu
That's really cool Ori.  Have people looked into why jQuery UI gets loaded
by default?  It seems to not be used in any obvious way.


On Tue, Oct 22, 2013 at 7:07 PM, Ori Livneh o...@wikimedia.org wrote:

 Running mw.loader.inspect('css') in a JavaScript console will now
 report CSS stats for each active ResourceLoader style module,
 including the total count of selectors and the percentage of those
 that match some node in the current DOM.

 ---
 Ori Livneh
 o...@wikimedia.org

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code Climate for Ruby and JavaScript

2013-10-28 Thread Dan Andreescu
Thanks Zeljko, I added our wikimetrics project and I like how simple it is
to work with.


On Fri, Oct 25, 2013 at 11:50 AM, James Forrester
jforres...@wikimedia.org wrote:

 On 25 October 2013 04:38, Željko Filipin zfili...@wikimedia.org wrote:

  I have set up Code Climate[1] for a few WMF repositories that have Ruby
  code[2] but looks like it created JavaScript reports[3] (instead of Ruby)
  for a few repositories[4][5].
 
  There are a few interesting Ruby reports[6][7][8].
 
  Željko
  --
  1: https://codeclimate.com/
 

 ​[Snip]​


  5:
 
 https://codeclimate.com/github/wikimedia/mediawiki-extensions-VisualEditor
 

 ​This is interesting - thanks! A nice hit-list to consider for VE technical
 debt (even if a bunch of it is third-party libraries that can't be excluded
 yet).

 J.
 --
 James D. Forrester
 Product Manager, VisualEditor
 Wikimedia Foundation, Inc.

 jforres...@wikimedia.org | @jdforrester

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] $wgRedactedFunctionArguments

2013-10-29 Thread Dan Andreescu
 I don't think the idea here was to ever make the stack traces *safe*,
 just to redact the most obvious things to reduce the risk if someone
 carelessly posts a stack trace publicly.

 Personally, I think the Java model as exemplified in
 https://gerrit.wikimedia.org/r/#/c/92334/ PS3 goes too far in the
 other direction. In this case, an option to log unredacted traces that
 I could enable on my local test wiki would be useful.


I think Ori's original point stands though.  Configuration could be used to
redact fully, or not at all, for local debugging purposes.  But a blacklist
of what to redact is bad for all the reasons blacklists are bad for
security in general.
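
To make that concrete, here is a rough sketch of the whitelist approach (in
Python, with invented names; MediaWiki's actual implementation is PHP and
differs): redact every argument by default and only let through values whose
names are explicitly marked safe, so an omission fails closed rather than open.

# Illustrative sketch only; argument names are invented for the example.
SAFE_ARG_NAMES = {'title', 'namespace', 'action'}

def redact_args(frame_args):
    """Return a copy of a stack frame's arguments with unsafe values masked."""
    return {
        name: (value if name in SAFE_ARG_NAMES else '[REDACTED]')
        for name, value in frame_args.items()
    }

# A blacklist ("redact password and token") fails open as soon as a new
# sensitive argument appears; the whitelist above fails closed.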
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OAuth Devlopment Training

2013-12-20 Thread Dan Andreescu
And if anyone's curious, the session helped me get the OAuth identify call
implemented in Wikimetrics: https://gerrit.wikimedia.org/r/#/c/102618/.  I
had to hack the unmaintained Flask-OAuth module quite a bit, so eventually I
might move to rauth.  But it seems to work, and it makes me feel warmer and
fuzzier about using OAuth as pseudo-authentication.


On Fri, Dec 20, 2013 at 3:30 PM, Chris Steipp cste...@wikimedia.org wrote:

 Unfortunately I wasn't able to get the recording to work, but everything we
 discussed is here:

 https://www.mediawiki.org/wiki/OAuth/For_Developers

 The notes and examples should be able to get most people started
 integrating their applications. But if you have trouble, ping me via email
 or irc, and I can help you get started.


 On Tue, Dec 17, 2013 at 5:44 PM, Chris Steipp cste...@wikimedia.org
 wrote:

  Just a reminder that I'll be running a training tomorrow for any
  developers interested in OAuth at 11am PST / 19:00 UTC. If you're still
  interested, let me know and I'll add you to the hangout invite list. For
  everyone who already responded, I'll send you the link in a bit.
 
  Several people asked if we would record it, so I'm planning to do that.
  I'll send out the link afterward.
 
 
  On Tue, Dec 10, 2013 at 4:47 PM, Chris Steipp cste...@wikimedia.org
 wrote:
 
  Hi all,
 
  For any developers who have been thinking about connecting their
  application to MediaWiki, but haven't gotten around to diving in, I'm
 going
  to have a short training/workshop session next week. I'll give a brief
  intro to using the version of OAuth that we're running, and walk through
  some quick demos in php and go. After that, I'm happy to walk any
 developer
  through getting their app connected, if anyone is struggling with a
  particular issue.
 
  It will be Wed, Dec 18th at 11am PST (1900 UTC). Please let me know if
  you're interested. We'll probably use a hangout for the session, but if
  that's not an option for anyone we can use a voice call and etherpad.
  Either way I'll probably send out invites individually.
 
 
 
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] npm publish

2013-12-30 Thread Dan Andreescu
We needed to write an npm module for mediawiki oauth, and we're about to
publish it to the npm registry.  I just wanted to check whether we had a
wikimedia account on https://npmjs.org/.  If not, do we want one?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] npm publish

2014-01-10 Thread Dan Andreescu
I forgot to say thank you!  Thanks :)

It seems that the npmjs community leans towards individuals registering
accounts, so I'll publish under my own username for now.  Mostly because
I've already registered, and I wouldn't have a valid generic Wikimedia email
address to use other than my own anyway.


On Tue, Dec 31, 2013 at 10:52 AM, Antoine Musso hashar+...@free.fr wrote:

 Le 31/12/13 16:26, Max Semenik a écrit :
  On 31.12.2013, 15:54 Antoine wrote:
 
  I have
  put the credentials on fenari in /home/wikipedia/doc.  You might want to
  do something similar.
 
  Move to some host that will not die soon?

 Hopefully they will be migrated out of fenari just like we had them
 moved from good old zwinger :]


 --
 Antoine hashar Musso



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] npm publish

2014-01-10 Thread Dan Andreescu

 (parsoid should also be an npm module at some point.  maybe VE should
 be as well, although 'bower' or another client-side packaging solution
  might be more appropriate.)


I agree.  By the way, the passport-mediawiki-oauth module is published and
Limn is using it.  Feel free to report bugs to me, but everything seems
peachy as of now.

https://npmjs.org/package/passport-mediawiki-oauth

https://github.com/wikimedia/limn/pull/89
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to collaborate when writing OAuth applications?

2014-01-20 Thread Dan Andreescu
Hi Cristian,

We did basically the same thing with Wikimetrics.  For development, we just
registered a consumer that redirects to localhost:5000 and committed the
consumer key and secret.  Here's the relevant config file:

https://git.wikimedia.org/blob/analytics%2Fwikimetrics.git/f0aa046c401f0726ac93dc6a5424cd4a33ef86f1/wikimetrics%2Fconfig%2Fweb_config.yaml

Let me know if you have any questions.  The production configuration is
just hidden and kept secret on the wikimetrics server.
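
In case it helps, the pattern looks roughly like this (a sketch; the path,
environment variable, and key names are illustrative rather than Wikimetrics'
actual ones): the committed file carries the localhost development consumer,
and production points the app at a private file that never enters the
repository.

import os
import yaml  # PyYAML

# Sketch only: names are illustrative.  The committed YAML holds the consumer
# registered with a localhost:5000 callback; production overrides the path.
config_path = os.environ.get(
    'WIKIMETRICS_WEB_CONFIG',                # assumed environment variable
    'wikimetrics/config/web_config.yaml',    # committed, dev-only credentials
)
with open(config_path) as f:
    config = yaml.safe_load(f)

consumer_key = config['CONSUMER_KEY']        # assumed key name
consumer_secret = config['CONSUMER_SECRET']  # assumed key name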


On Mon, Jan 20, 2014 at 6:52 PM, Cristian Consonni
kikkocrist...@gmail.com wrote:

 (sorry, I forgot the references)

 2014/1/21 Cristian Consonni kikkocrist...@gmail.com:
  I am writing a little application to make edits on (Italian) Wikipedia
  using OAuth. I use in particular flask_mwoauth[1].
 [...]
  As you can imagine from its name this tool was born with the idea of
  helping OSM user to tag objects in OSM with the corresponding
  wikipedia=* tag (more info about the wikipedia tag in OSM here[1]).
  The source code (GPL) is here:
  https://github.com/simone-f/wikipedia-tags-in-osm
 
  The idea is based on a tool by User:Kolossos from de.wiki[2] which
  does something similar
 [...]

 C

 [1] https://github.com/valhallasw/flask-mwoauth
 [1bis] http://wiki.openstreetmap.org/wiki/Wikipedia
 [2]
 http://wiki.openstreetmap.org/wiki/JOSM/Plugins/RemoteControl/Add-tags#Index.php


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How to collaborate when writing OAuth applications?

2014-01-21 Thread Dan Andreescu

 Another question is: i would like to drop my first test-app
 consumer. How can I do it?


I'm not sure, but I would like to drop a consumer as well.  Last time I
asked it was not yet possible.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC cluster summary: HTML templating

2014-01-22 Thread Dan Andreescu

 When writing very complex multi-function Special Pages (almost all of our 
 internal tools are built as special pages) it gets kind of unwieldy with the 
 special page class that just has a single execute() function and the 
 redundant boilerplate to define ajax functions etc.  Since most of our front 
 end is javascript now and we sometimes want templates/html or json data from 
 the same controllers, we have a 1:1 mapping between methods and templates and 
 every controller method is automatically an ajax function that can return 
 json or html.  The front end can call ANY back end method and ask for json 
 data or the html with the data applied to it.  When the Controller gets “too 
 unwieldy” (the threshold for this depends on the developer) we generally 
 refactor them into a single Controller and a set of Helper classes so that we 
 retain the same external end points, just moving the code around.  
 
 Here’s an example of that:
 
 On every content page, there’s a right rail module that shows the latest 
 photos uploaded to the wiki:
 
 http://fallout.wikia.com/wiki/Nikola_Tesla_and_You_(Fallout_3)
 
 on the back end, that’s entirely self contained in this 
 LatestPhotosController which is dropped into the skin with a render() 
 function.  However, the data that it generates can be used in other places:
 
 http://fallout.wikia.com/wikia.php?controller=LatestPhotos   (call 
 controller, combine data with html and return it)
  http://fallout.wikia.com/wikia.php?controller=LatestPhotos&format=json (call
 controller, just return the data that would have been in the template)
 
 The default method is executeIndex() and the default template is 
 controller_Index.  Here’s the controller code:
 
 https://github.com/Wikia/app/blob/dev/skins/oasis/modules/LatestPhotosController.class.php#L21
 
 And the template:
 
 https://github.com/Wikia/app/blob/dev/skins/oasis/modules/templates/LatestPhotos_Index.php
 
 Hope that helps provide a bit more context for how this is actually used in 
 the application.  
 

This is very cool, Owen.  Once we have a templating engine picked out, 
conventions like this make life easier.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Job queue and Redis?

2014-01-31 Thread Dan Andreescu

 I'm pretty sure that WMF is using Redis for the MediaWiki job queue. I see
 some sparse documentation on setting it up. I already have redis on my
 machines and would like to move to this to make life easier on my database
 server, but I'm wondering if this is fully baked and ready for use. Are
 there folks outside of WMF using Redis for their job queue?


Well, I'm not outside WMF, but with Wikimetrics [1], we use Redis as a
Result Backend for our Celery queue.  It works great; aside from me getting
the configuration wrong once, everything has been very solid.  I'd be happy to
answer any specific questions if you have them.


[1] https://git.wikimedia.org/summary/analytics%2Fwikimetrics.git
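
For anyone curious what that wiring looks like, a minimal sketch (the
connection URLs and the example task are illustrative; Wikimetrics' real
broker/backend split may differ):

from celery import Celery

# Redis as the Celery result backend (and, in this sketch, the broker too).
app = Celery(
    'wikimetrics',
    broker='redis://localhost:6379/0',   # where tasks are queued
    backend='redis://localhost:6379/1',  # where results are stored
)

@app.task
def run_report(cohort_id):
    # ... compute metrics for the cohort ...
    return {'cohort': cohort_id, 'status': 'done'}

# run_report.delay(42).get() later pulls the stored result back out of Redis.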
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] InstantClick

2014-02-10 Thread Dan Andreescu

 Hi,

 Today, I heard about a JavaScript library called InstantClick
 (http://instantclick.io/). Basically, it's based on the principle that
 latency is responsible for a lot of the Web's slowness. It also
 considers that there are about 250ms between hovering over and
 clicking on a link. Therefore, it starts pre-loading the page on
 hover, and then switches to it via AJAX when the user clicks the link.
 It can also do this on mousedown only, which causes no additional
 server load and still provides a performance boost, according to its
 website, similarly to Rails' turbolinks functionality.

 Is there any chance this could work on MediaWiki?

 Regards,
 -Kudu.


This is pretty neat.  Some a/b tests on mobile would help us understand
just how much extra data this would cause people to consume vs. how much
time it saves them.

On desktop, I bet a combination of this with something like cursor
proximity based events [1] would be interesting to look into.


[1] https://github.com/padolsey/jquery.fn/tree/master/proximity-event
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal for Zürich Hackathon - getting close to a production-like Vagrant instance

2014-02-11 Thread Dan Andreescu

   A few of us have been discussing how awesome it would be to use
  MediaWiki-Vagrant[1] to create a portable production-like environment.
  This
  would give Mediawiki engineers a common ground from which to develop
 core
  code and/or extensions, and by coming close to mimicking Wikimedia's
  production environment, would hopefully reduce the amount of friction
  around getting features out to production. Also, this is something that
 we
  would be able to pre-package this for new engineers - imagine handing a
  new
  MediaWiki engineer a USB stick with an up-to-date MediaWiki-Vagrant
  instance that closely mimics production at say, a future hackathon.
 
  We started chatting about what it would take to get us there, and
  identified some initial steps that we'd like to tackle at the Zürich
  Hackathon - namely, turning a few puppetized production services into
  roles
  that we could use in MediaWiki-Vagrant.
 
  We've created a corresponding 'topic' on the Hackathon's topic page[2]
 to
  describe what we'd like to achieve at the Hackathon. Please review,
  comment, and certainly add yourself as an 'interested person' if this
  catches your fancy and you plan to attend the hackathon.


I am keeping my options open for the hackathon but wanted to support this
effort enthusiastically.  And to mention that the analytics team loves
mediawiki-vagrant and tries to get most of its projects integrated with it
as roles.

One thing I'd be interested in hearing is how Linux folks are installing
Vagrant / VirtualBox.  I've not been able to vagrant up my
mediawiki-vagrant since I upgraded to Ubuntu 13.10 and I've heard of at
least two other people with the same experience.   Maybe 14.04 will just
solve all that, but it's worth looking into so we can post any gotchas in
the install instructions.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal for Zürich Hackathon - getting close to a production-like Vagrant instance

2014-02-11 Thread Dan Andreescu
On Tue, Feb 11, 2014 at 2:51 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:

 On Tue, Feb 11, 2014 at 2:33 PM, Dan Andreescu dandree...@wikimedia.org
 wrote:

  One thing I'd be interested in is to hear how Linux folks are installing
  vagrant / virtualbox.  I've not been able to vagrant up my
  mediawiki-vagrant since I upgraded to Ubuntu 13.10 and I've heard of at
  least two other people with the same experience.   Maybe 14.04 will just
  solve all that, but it's worth looking into so we can post any gotchas in
  the install instructions.
 

 I can't comment about Ubuntu, but using Debian Sid I was finally able to
 get it to work this morning.

 All I had to do was apt-get install virtualbox-dkms and vagrant. Versions
 here are 4.3.2-dfsg-1 for virtualbox, 1.4.3-1 for vagrant, and 3.12.9-1 for
 the kernel.


Thanks Brad.  Those versions didn't work for me so I finally stopped being
lazy and looked into it.  It was user error (well, sort of).  VT-x was
disabled in my BIOS.  Added a note in the README:
https://gerrit.wikimedia.org/r/#/c/112800/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Action Script API client library?

2014-02-11 Thread Dan Andreescu
On Tue, Feb 11, 2014 at 4:18 PM, Brandon Harris bhar...@wikimedia.org wrote:


 Both ActionScript and JavaScript are ECMAScript languages and are
 thus pretty similar.  I last did AS coding about 4 years ago but I don't
 think the language has changed significantly since then.

 I'd be surprised if there was an ActionScript library for
 MediaWiki, so your best bet is JavaScript.


When I used ActionScript it was fairly easy to communicate back and forth
with JavaScript.  I'd use the JavaScript API and talk to it from
ActionScript.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bingle/Bugello broken post-Bugzilla upgrade

2014-02-13 Thread Dan Andreescu
Bingle is actually a Python tool: https://github.com/awjrichards/bingle

Arthur, sorry, I spent a couple of minutes brainstorming and came up empty.
Keep us updated and I'll take a more serious look if the problem persists.


On Thu, Feb 13, 2014 at 3:58 PM, Daniel Zahn dz...@wikimedia.org wrote:

 Arthur,

 i think i know the issue here. zirconium is indeed a shared host, so it
 runs several misc. web services
 using https on a single IP, so we rely on clients speaking SNI to get the
 correct virtual host.

 java 6 and IE on XP are among the few clients who don't.

 I think your applications are java and don't speak SNI, so they are getting
 the first virtual host, which is planet.

 this can be fixed by either:

 use Java 7  which should support SNI .. see f.e.

 http://stackoverflow.com/questions/12361090/server-name-indication-sni-on-java

 quote: on Java 7, use

 new URL("https://cmbntr.sni.velox.ch/").openStream()

 until HTTPCLIENT-1119 is fixed


 or i can cheat by changing the order Apache loads the site configs, f.e. i
 could make it


 sites-enabled/001-Bugzilla , ./002-Planet etc. Then those clients who don't
 speak SNI get Bugzilla (but the Planet users don't get their planet, but
 Bugzilla seems more important.


 or we would have to get an extra IP address just for Bugzilla

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bingle/Bugello broken post-Bugzilla upgrade

2014-02-14 Thread Dan Andreescu

 python-requests bundles its own cacert list (although the ubuntu .deb
 version might use the central certificate store - not sure about that),
 which might be outdated. Some older cacert lists have issues with RapidSSL
 certificates (this is an issue with the list bundled with httplib2, for
 example).


We use pywikibot's httplib2, which solves the issue Merlijn mentions here.
For requests though, the last commit message on their cacert.pem file
makes me lean towards your original thought that requests should be updated:

https://github.com/kennethreitz/requests/blob/master/requests/cacert.pem
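
In the meantime, one workaround (assuming you control the calling code) is to
skip the bundled cacert.pem and point requests at the operating system's CA
store; the path below is the Debian/Ubuntu location.

import requests

# Verify against the system CA bundle instead of requests' bundled cacert.pem.
response = requests.get(
    'https://bugzilla.wikimedia.org/',
    verify='/etc/ssl/certs/ca-certificates.crt',
)
print(response.status_code)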
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Non-Violent Communication

2014-02-18 Thread Dan Andreescu

 I assume your good faith, and I foresee its consequences. You couldn't
 employ your NVC skills because you were, quote, in a hurry, end quote. That
 means, NVC just doesn't work when it's needed. I don't think everyone here
 has a lot of spare time to mix original thoughts with a dump of meaningless
 requests and pardons. You want to share how you feel? I don't think it's
 the right place to do this. Don't ask to ask, just ask, and so on.


I think this and other responses to non-violent communication make a lot of
sense.  They're in line with the old quote "First they ignore you, then
they laugh at you, then they fight you, then you win."  But this process
takes years and we seem to be at the laugh and fight stage.

I think violence is a particularly efficient way of getting what you want.
 "Assume good faith" is just a way to apologize in advance for employing
violence.  And honestly, I come from a culture where violence is a totally
acceptable form of communication, and I'm a violent communicator.  I creep
myself out when I try to not be violent, but I recognize that much harmony
would result from adopting the principles of NVC.  Anyway I don't have any
opinion on either side of this discussion, just wanted to point out that
the responses are to be expected.  And to say to Derric thank you, your
post was not in vain and it did not turn me off to the subject.  On the
contrary, it made me admire that more people are willing to try it.




 On Tue, Feb 18, 2014 at 5:38 PM, Derric Atzrott 
 datzr...@alizeepathology.com wrote:

  Question for Derric: why didn't you formulate your suggestion using
  NVC?
 
  I was excited and in a hurry.  In retrospect I really think that I should
  have.
 
  After reading some of the replies I felt rather disappointed and
  frustrated, and even a little sad as I didn't feel my need for
  understanding was met.
 
  In the future I will try to take a little more time writing emails to the
  list.  I'm sorry to anyone who felt offended by it or felt that my email
  was, well, violent.  That was not my intention at all.  I just began
 myself
  looking into and trying to practice NVC in the past six months or so,
 and I
  am, as of now, still not terribly great at it.
 
  Again, I want to express my apologies, and I really hope that I didn't
  turn anyone off to the subject.  I guess all I was really trying to say
 in
  that email is that when conversation on this list gets heated, I feel
  frustrated because my needs for calm and community are not met.  I end up
  not wanting to participate because I don’t think that I will be heard or
  understood.  I would like to request that people onlist look into
  strategies to help everyone get along, whether that is AGF, or NVC, or
  something else, does not matter as much to me.  I suggested NVC because
 it
  has been a very useful tool for me in the past.
 
  Thank you,
  Derric Atzrott
 
 
 



 --
 Regards,
 Павел Селіцкас/Pavel Selitskas
 Wizardist @ Wikimedia projects

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Minor Bingle improvements - users please update your code

2014-02-21 Thread Dan Andreescu
thanks Arthur!


On Thu, Feb 20, 2014 at 8:19 PM, Arthur Richards aricha...@wikimedia.org wrote:

 I had some free time today and made some minor improvements to Bingle to
 take care of two longstanding, really annoying issues:
 https://github.com/awjrichards/bingle/issues/10
 https://github.com/awjrichards/bingle/issues/11
 (Same issues also reported via BZ:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=57830)

 The changes should improve the formatting of bug descriptions/comments in
 Mingle. Bingle users, please consider git pull'ing for the latest :)

 --
 Arthur Richards
 Software Engineer, Mobile
 [[User:Awjrichards]]
 IRC: awjr
 +1-415-839-6885 x6687
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Zurich Hackathon: Creating a map namespace

2014-03-05 Thread Dan Andreescu
This is awesome.  I have a decent amount of experience with different
technologies that we could use to build maps and present them both
statically through our cache layers and dynamically for editors.  Let's
kick this project's butt in Zurich.


On Wed, Mar 5, 2014 at 3:54 PM, Jon Robson jdlrob...@gmail.com wrote:

 This may be extremely ambitious, but I'm keen to kick off development
 around the creation of a map namespace during the Zurich hackathon.

 The goal would be to setup an editable map namespace that could be
 used for a variety of things, one of which would be adding a map view
 to the Special:Nearby page provided via the mobile site. The goal is a
 proof of concept not necessarily anything production ready (but that
 would be great if we could get to that point!)

 Please let me know if you would also be interested on hacking such a
 thing -
 https://www.mediawiki.org/wiki/Z%C3%BCrich_Hackathon_2014/Geo_Namespace
 - or if doing so would be a terrible idea (but if you have to go down
 that route please provide constructive reasoning on what would be a
 less terrible idea)

 Excited to hack on cool things in Zurich!

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] About the bug 46453 - Uzbek: Change date and decimal separators

2014-03-13 Thread Dan Andreescu

 Hi,
 I'm having some trouble when trying to upload the fix to gerrit. I followed
 the instructions in [1]. But I get the bellow error when trying to run
 git review -s

 Error.

 Problems encountered installing commit-msg hook
 The following command failed with exit code 1
 scp  gerrit.wikimedia.org:hooks/commit-msg .git/hooks/commit-msg
 ---
 .git/hooks/commit-msg: No such file or directory
 ---


 [1]
 http://www.mediawiki.org/wiki/Gerrit/Tutorial#Make_and_commit_your_change


I'm certainly no fan of gerrit, and I hope the gerrit-patch-uploader that
Sumana mentioned helps you.  But if not, the error you're having above
sounds like you are running git review -s outside your repository's
directory.  So you might try:

git clone repository-address folder-to-clone-into
cd folder-to-clone-into
git review -s
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] IRC Office Hour on Project Management Tools Review: Friday 28th, 17:00UTC

2014-03-25 Thread Dan Andreescu
looking forward to attending this, thanks for organizing


 If you'd like to learn more about Phabricator's project management and
 roadmap functionality, I invite you to test it live on the test
 instance at http://fab.wmflabs.org/ . Alternatively, you can also
 peruse http://phabricator.org/ and http://phabricator.org/tour/ . I
 strongly encourage everyone to do so before the IRC discussion, so we
 can all have a more informed chat about Phabricator and the other
 options currently under consideration.


I've taken a look but I think it would be awesome if we could set up
repositories and do a mock code review.  I'd love to see how that
integrates with tasks (so we can compare it with gerrit + Bugzilla, for
example).
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] IRC Office Hour on Project Management Tools Review: Friday 28th, 17:00UTC

2014-03-26 Thread Dan Andreescu
Ha!  I had added wikimetrics but it wasn't working at first.  It's working
now so that's great, I'll take a deeper look.


On Tue, Mar 25, 2014 at 12:35 PM, Gilles Dubuc gil...@wikimedia.org wrote:

 You can add external repos, people have already added a bunch:
 http://fab.wmflabs.org/diffusion/ This allows you to comment/raise issue
 on
 commits that have already been pushed. Example:
 http://fab.wmflabs.org/rMMV49bc5edd9384ecc22a05a22a88bc70cd2439c5b3

 And I'm pretty sure the phabricator command line tool (arcanist) should
 just work to upload diffs for review:
 https://secure.phabricator.com/book/phabricator/article/arcanist_diff/


 On Tue, Mar 25, 2014 at 5:39 PM, Dan Andreescu dandree...@wikimedia.org
 wrote:

  looking forward to attending this, thanks for organizing
 
 
   If you'd like to learn more about Phabricator's project management and
   roadmap functionality, I invite you to test it live on the test
   instance at http://fab.wmflabs.org/ . Alternatively, you can also
   peruse http://phabricator.org/ and http://phabricator.org/tour/ . I
   strongly encourage everyone to do so before the IRC discussion, so we
   can all have a more informed chat about Phabricator and the other
   options currently under consideration.
  
 
  I've taken a look but I think it would be awesome if we could set up
  repositories and do a mock code review.  I'd love to see how that
  integrates with tasks (so we can compare with gerrit - bugzilla for
  example).
 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CentralAuth questions

2014-03-27 Thread Dan Andreescu
Any links to documentation on consuming data from the CentralAuth databases
are welcome.  We searched a bit and found mostly installation instructions.


On Thu, Mar 27, 2014 at 4:20 PM, Teresa Cho tcho...@gmail.com wrote:

 Hi all,

 I'm trying to add a feature to Wikimetrics that will allow users to create
 a cohort with a username and find all accounts across wikis. I want to use
 the CentralAuth database, because as far as I can tell, it stores the
 global username and all the local usernames. However, I don't see where it
 connects the globalusers to the localusers. Is it just the username?

 Does the username have to be the same across local wikis and you query the
 localuser table with what you think is the global username? If that's the
 case, I suppose I don't need to look at the global table.

 Thanks,
 Teresa
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] CentralAuth questions

2014-03-27 Thread Dan Andreescu
Thank you very much for the reply Max, +1 beer for next time we meet.


On Thu, Mar 27, 2014 at 5:10 PM, MZMcBride z...@mzmcbride.com wrote:

 Teresa Cho wrote:
 I'm trying to add a feature to Wikimetrics that will allow users to
 create a cohort with a username and find all accounts across wikis. I
 want to use the CentralAuth database, because as far as I can tell, it
 stores the global username and all the local usernames. However, I don't
 see where it connects the globalusers to the localusers. Is it just the
 username?
 
 Does the username have to be the same across local wikis and you query
 the localuser table with what you think is the global username? If that's
 the case, I suppose I don't need to look at the global table.

 Hi.

 Broadly, I think the answer for you're looking for is no: CentralAuth
 accounts (global user accounts that match to local user accounts) are not
 fully unified on Wikimedia wikis. It's a long-term goal, but it's a
 disruptive change to make, so it's taken a while. :-)

 It sounds like you want programmatically retrieve the info from:
 https://www.mediawiki.org/wiki/Special:CentralAuth/Jimbo_Wales.

 If so, I'd recommend the MediaWiki Web API
 (https://www.mediawiki.org/w/api.php) for this. Perhaps the
 globaluserinfo API module?
 https://www.mediawiki.org/w/api.php?action=query&meta=globaluserinfo&guiuser=Jimbo+Wales&guiprop=groups|merged|unattached
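
 For example, a minimal sketch of calling that module from Python with the
 requests library (the username is illustrative):

 import requests

 # Fetch the Special:CentralAuth information programmatically.
 resp = requests.get('https://www.mediawiki.org/w/api.php', params={
     'action': 'query',
     'meta': 'globaluserinfo',
     'guiuser': 'Jimbo Wales',
     'guiprop': 'groups|merged|unattached',
     'format': 'json',
 })
 info = resp.json()['query']['globaluserinfo']
 for account in info.get('merged', []):
     print(account)  # one entry per attached local wiki account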

 If you must directly query the MediaWiki database using SQL, you'll likely
 need to read through the source code of the CentralAuth MediaWiki
 extension to figure out exactly what the PHP and SQL is doing with the
 underlying data. The source code of the CentralAuth MediaWiki extension
 can be found here:
 https://git.wikimedia.org/tree/mediawiki%2Fextensions%2FCentralAuth.git.
 You'll likely want to read through central-auth.sql in particular.

 Dan Andreescu wrote:
 Any links to documentation on consuming data from the CentralAuth
 databases are welcome.  We searched a bit and found mostly installation
 instructions.

 Well, very generally you (or your program) probably shouldn't be querying
 the databases directly, but if you can provide more specific information
 about where you looked, we can probably add some redirects for future
 w[ao]nderers.

 For general clarity, while I used www.mediawiki.org in the examples in
 this e-mail, because the Web API is retrieving global (wiki farm-wide)
 data, the equivalent URL paths should work on other Wikimedia wikis such
 as en.wikipedia.org or meta.wikimedia.org.

 MZMcBride




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki on Google App Engine

2014-04-07 Thread Dan Andreescu
I've done things like port Mako templates to Google App Engine.  As
mentioned by Jeremy Baron, the common problem with porting anything of
sufficient complexity is that you're not allowed to write files to the
disk.  To get Mako to work, since it caches compiled templates to disk, I
patched it to write the templates to memcache instead.  This worked well
enough, but I didn't need persistent files.  So you'd probably have to
redirect MediaWiki to write files somewhere else more permanent.  Google
Drive seems like a decent place but last time I tried, integrating Drive
with App Engine was silly hard.  This might have changed since I haven't
tried for a few years.
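
The general shape of that workaround, as a rough sketch rather than the actual
Mako patch, is to park anything you would have written to disk in App Engine's
memcache instead:

from google.appengine.api import memcache

# Sketch: cache a computed artifact in memcache instead of writing it to the
# (read-only) App Engine filesystem.
def cached(key, compute, ttl=3600):
    """Return a cached value, computing and storing it on a miss."""
    value = memcache.get(key)
    if value is None:
        value = compute()
        memcache.add(key, value, time=ttl)
    return value

# e.g. html = cached('tmpl:main_page', lambda: template.render(), ttl=600)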


On Sun, Apr 6, 2014 at 1:44 AM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:

 Never tried, but the topic did interest me enough awhile back to write a
 detailed tracking bug for it:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=55475

 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]

 On 2014-04-05, 1:25 PM, Denny Vrandečić wrote:
  Did anyone manage to get MediaWiki running on Google App Engine? I am a
 bit
  dense, it seems, and would appreciate a few pointers.
 
  Cheers,
  Denny
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RfC on Product Management Tools and Development Toolchain

2014-04-15 Thread Dan Andreescu
Andre,

It's my understanding that the current Phabricator instance is temporary.
 Indeed, it includes mostly jokes and throw-away testing tasks, comments,
boards, etc.  Could we stand up an instance to which we would potentially
migrate?  We would have to reserve task ids 1-10 to allow us to port
from Bugzilla, but do we have any other blockers?

I had volunteered my team to try it out for one of our projects, but I've
been hesitating until we have a blessed version.

Dan


On Mon, Apr 14, 2014 at 8:02 PM, Andre Klapper aklap...@wikimedia.org wrote:

 Hi,

 as previously announced [1], we've been facilitating a collective review
 of Wikimedia's current product management tools and development
 toolchain.

 The most popular idea at the moment is to consolidate Wikimedia's
 product management and infrastructure tools (such as Bugzilla, Gerrit,
 RT, Mingle, Trello) into all-in-one Phabricator. We have therefore put
 together a Request for comment to bring this up for wider discussion.

 This discussion affects anyone who deals with bug reports, feature
 requests and code changes in Wikimedia, so it's critical that you test
 Phabricator for your own use and make your voice heard in the RFC:

   https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator

 We're compiling a list of Frequently asked questions at
 https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator/FAQ ;
 You're welcome to add more and help answer them :)

 We'll host a few IRC discussions while the RFC is running to help answer
 questions, etc. Our tentative times and dates are at

 https://www.mediawiki.org/wiki/Talk:Requests_for_comment/Phabricator#IRC_discussions

 Thank you for your input!

 Guillaume and Andre

 [1] http://lists.wikimedia.org/pipermail/wikitech-l/2014-March/074896.html

 --
 Andre Klapper | Wikimedia Bugwrangler
 http://blogs.gnome.org/aklapper/


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Help! Phabricator and our code review process

2014-05-06 Thread Dan Andreescu
On Mon, May 5, 2014 at 10:24 PM, Matthew Flaschen
mflasc...@wikimedia.org wrote:

 On 05/02/2014 03:56 PM, C. Scott Ananian wrote:

 [greg-g] cscott: James_F crazy idea here: can some teams use it for
 real (I think growth is, kinda?) and export/import to a future real
 instance?
 frontend...


 No, we're not using it for real currently.  We (Growth) have talked about
 potentially being an early adopter, but have not committed to this yet.

 Matt Flaschen


Likewise for Analytics
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Transcluding non-text content as HTML on wikitext pages

2014-05-14 Thread Dan Andreescu

  Can you outline how RL modules would be handled in the transclusion
  scenario?

 The current patch does not really address that problem, I'm afraid. I can
 think
 of two solutions:

  * Create a SyntheticHtmlContent class that would hold meta info about
 modules
 etc, just like ParserOutput - perhaps it would just contain a ParserOutput
  object.  And an equivalent SyntheticWikitextContent class, perhaps. That
 would
 allow us to pass such meta-info around as needed.

 * Move the entire logic for HTML based transclusion into the wikitext
 parser,
 where it can just call getParserOutput() on the respective Content object.
 We
 would then no longer need the generic infrastructure for HTML transclusion.
 Maybe that would be a better solution in the end.

 Hm... yes, I should make an alternative patch using that approach, so we
 can
 compare.


Thanks a lot Daniel, I'm happy to help test / try out any solutions you
want to experiment with.  I've moved my work to gerrit:
https://gerrit.wikimedia.org/r/#/admin/projects/mediawiki/extensions/Limn and
the last commit (with a lot of help from Matt F.) may be ready for you
to use as a use case.  Let me know if it'd be helpful to install this
somewhere in labs.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Maps-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-14 Thread Dan Andreescu
Thanks for starting this Jon, the end result is going to be awesome.  So
here's how I see things, it's roughly along the lines of what you've been
saying:

Server-side rendering and scaling is important.  This is one of the main
reasons I picked Vega [1] for my hack.  The same visualization grammar can
be used to generate png or svg [2].  I see the approach to visualization as
being similar to Parsoid:

* user creates a visualization (with a visual editor) and saves
* Vega parses that server side and generates an image, then refreshes the
caches accordingly
* a transcluded visualization renders the image from cache as a link to the
interactive version
* when the link is clicked, something like MediaViewer shows the
interactive visualization
* alternatively, we can allow editors to show the interactive version in
the transclusion itself, but that has performance implications until
browser caches are filled.

Now a little bit about where I see the Map namespace.  A visualization in
the world of Vega has three parts:

* Data (one or more sets of data, can be geojson, topojson, tsv, csv,
layers from OSM [3], OHM [4], etc.)
* Transformations on that data (scales, normalization, etc.)
* Marks (visual representations)

Transformations and Marks can be written by hand or by a visual editor that
introspects the specification to show what's possible.  Data to me is the
tricky part.  We may need to restrict Vega to only consume open data that's
hosted and curated by us, and that could be done in a few different ways:

* Namespaces like the Maps namespace that enables awesome collaborative
editing of geojson
* Datasets in WikiData using an alternative data model
* File namespace serving raw data from Commons (where people are familiar
with take down notices and have the infrastructure to deal with that)

But yes, I do see the Maps namespace as one of the sources of data that we
could visualize with Vega.  And recent developments in Vega make me feel
that it's really a solid choice for generic visualization.  We have
interactivity, headless mode, a seemingly clear path to a visual editor via
introspection of the grammar specification, and pretty much everything I
can think of needing from such a tool.

For the short term, I think further exploration of the Map namespace is
great, but I think generic visualization work could go into the
Visualization namespace.  My suggestion for a name for this namespace may
seem a bit obscure.  It's a word that means to illuminate: Limn [5].
 There's an old project by that name of which I'm not very fond (despite
writing some of it myself), but I've always thought the word was beautiful
and fitting.  To echo what Antoine was saying earlier, we should illuminate the
world's knowledge with beautiful visualizations.


[1] https://github.com/trifacta/vega
[2] https://github.com/trifacta/vega/wiki/Headless-Mode
[3] OSM - Open Street Maps http://wiki.openstreetmap.org/wiki/Main_Page
[4] OHM - Open Historical Maps
http://wiki.openstreetmap.org/wiki/Open_Historical_Map
[5] Limn - depict or describe in painting or words:
https://github.com/wikimedia/mediawiki-extensions-Limn


On Wed, May 14, 2014 at 9:43 AM, Jon Robson jrob...@wikimedia.org wrote:

 Tim I completely agree. This is something we need to setup.
 Patches very much welcomed! :-)



 On Wed, May 14, 2014 at 7:51 AM, Tim Alder t...@alder-digital.de wrote:
  I think the most important feature is to create on serverside a
  thumbnail for each map by using something like http://phantomjs.org/
  This thumbnails should than be in the big WMF caches. The map would
  become interactively only in the case a user click on it.
  This would reduce the numbers of request for loading a page and JS
  overhead and it would increase the stability of the system.
  Without this feature I afraid to never see the extension live in
 Wikipedia.
 
  Other nice features you can see at umap.openstreetmap.fr:
  *Choosing different backgrounds
  *POIs with interactive descriptions
  *Geometry import from OSM (WIWOSM)
  *different layers
  *...
 
  Greeting Tim alias Kolossos
 
 
  Am 14.05.2014 00:34, schrieb Jon Robson:
  During the Zurich hackathon, DJ Hartman, Aude and I knocked up a
  generic maps prototype extension [1]. We have noticed that many maps
  like extensions keep popping up and believed it was time we
  standardised on one that all these extensions could use so we share
  data better.
 
  We took a look at all the existing use cases and tried to imagine what
  such an extension would look like that wouldn't be too tied into a
  specific use case.
 
  The extension we came up with was a map extension that introduces a
  Map namespace where data for the map is stored in raw GeoJSON and can
  be edited via a JavaScript map editor interface. It also allows the
  inclusion of maps in wiki articles via a map template.
 
  Dan Andreescu also created a similar visualisation namespace which may
  want to be folded into this as a map could be seen

Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-15 Thread Dan Andreescu

 2014-05-14 19:43 GMT+03:00 Dan Andreescu dandree...@wikimedia.org:
  For the short term, I think further exploration of the Map namespace is
  great, but I think generic visualization work could go into the
  Visualization namespace.  My suggestion for a name for this namespace may
  seem a bit obscure.  It's a word that means to illuminate: Limn [5].

 3) It's a word that is difficult to translate.


I'm open to alternatives, but my original choice (WikiViz) was taken
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-15 Thread Dan Andreescu
I like Visual:, any +1s? +2s?

The only downside might be a slight conflict with VisualEditor


On Thu, May 15, 2014 at 11:20 AM, dan-nl dan.entous.wikime...@gmail.com wrote:

 PictureIt:
 Envision:
 Imagine:

 On May 15, 2014, at 17:06 , Jon Robson jdlrob...@gmail.com wrote:

  Visual: ?
  On 14 May 2014 10:44, Derk-Jan Hartman d.j.hartman+wmf...@gmail.com
  wrote:
 
  PS, i'm building an instance that is running this extension.
 
  On Wed, May 14, 2014 at 12:34 AM, Jon Robson jrob...@wikimedia.org
  wrote:
  During the Zurich hackathon, DJ Hartman, Aude and I knocked up a
  generic maps prototype extension [1]. We have noticed that many maps
  like extensions keep popping up and believed it was time we
  standardised on one that all these extensions could use so we share
  data better.
 
  We took a look at all the existing use cases and tried to imagine what
  such an extension would look like that wouldn't be too tied into a
  specific use case.
 
  The extension we came up with was a map extension that introduces a
  Map namespace where data for the map is stored in raw GeoJSON and can
  be edited via a JavaScript map editor interface. It also allows the
  inclusion of maps in wiki articles via a map template.
 
  Dan Andreescu also created a similar visualisation namespace which may
  want to be folded into this as a map could be seen as a visualisation.
  I invite Dan to comment on this with further details :-)!
 
  I'd be interested in people's thoughts around this extension. In
  particular I'd be interested in the answer to the question For my
  usecase A what would the WikiMaps extension have to support for me to
  use it.
 
  Thanks for your involvement in this discussion. Let's finally get a
  maps extension up on a wikimedia box!
  Jon
 
  [1] https://github.com/jdlrobson/WikiMaps
 



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-15 Thread Dan Andreescu
By the way, the Vega work is deployed to this wiki:

http://analytics-wiki.wmflabs.org/mediawiki/Main_Page#Visualization

Jon/DJ/Aude, if you want, I can add a more generic proxy name for that wiki
and put your stuff on there too.  It's in the analytics project in labs, but
I'm happy to give anyone rights to it.


On Thu, May 15, 2014 at 1:35 PM, Dan Andreescu dandree...@wikimedia.org wrote:

 I like Visual:, any +1s? +2s?

 The only downside might be a slight conflict with VisualEditor


 On Thu, May 15, 2014 at 11:20 AM, dan-nl 
 dan.entous.wikime...@gmail.comwrote:

 PictureIt:
 Envision:
 Imagine:

 On May 15, 2014, at 17:06 , Jon Robson jdlrob...@gmail.com wrote:

  Visual: ?
  On 14 May 2014 10:44, Derk-Jan Hartman d.j.hartman+wmf...@gmail.com
  wrote:
 
  PS, i'm building an instance that is running this extension.
 
  On Wed, May 14, 2014 at 12:34 AM, Jon Robson jrob...@wikimedia.org
  wrote:
  During the Zurich hackathon, DJ Hartman, Aude and I knocked up a
  generic maps prototype extension [1]. We have noticed that many maps
  like extensions keep popping up and believed it was time we
  standardised on one that all these extensions could use so we share
  data better.
 
  We took a look at all the existing use cases and tried to imagine what
  such an extension would look like that wouldn't be too tied into a
  specific use case.
 
  The extension we came up with was a map extension that introduces a
  Map namespace where data for the map is stored in raw GeoJSON and can
  be edited via a JavaScript map editor interface. It also allows the
  inclusion of maps in wiki articles via a map template.
 
  Dan Andreescu also created a similar visualisation namespace which may
  want to be folded into this as a map could be seen as a visualisation.
  I invite Dan to comment on this with further details :-)!
 
  I'd be interested in people's thoughts around this extension. In
  particular I'd be interested in the answer to the question For my
  usecase A what would the WikiMaps extension have to support for me to
  use it.
 
  Thanks for your involvement in this discussion. Let's finally get a
  maps extension up on a wikimedia box!
  Jon
 
  [1] https://github.com/jdlrobson/WikiMaps
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Maps-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-20 Thread Dan Andreescu

 FYI, Limn is also sort of taken in the MediaWiki/WMF namespace:
 https://www.mediawiki.org/wiki/Analytics/Limn


Yep, but that Limn is dying slowly.  If we do this new extension properly,
it will take its place.  As Erik said, this is not staffed for success
right now but it would address many shared needs.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-20 Thread Dan Andreescu

 Overall this is very exciting work with lots of potential future
  applications. I don't think it's resourced for success yet, but let's
  figure out where it should sit in our roadmap since it would address
  many shared needs if done right.
 

 True! It would be nice to talk about it during Wikimania, from my side I
 started a community consultation here:

 https://meta.wikimedia.org/wiki/Requests_for_comment/How_to_deal_with_open_datasets


Thanks for the interest Micru, and especially that RFC.  I am not scheduled
to attend Wikimania but I could fly myself there if you think this topic
will receive attention.  (Also, I don't know how Wikimania works :))
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!

2014-05-20 Thread Dan Andreescu

 Sorry, but Limn sounds pretty pretentious, and it is hard to pronounce.


no problem at all, you're not hurting my feelings :)  Sounds like people
don't like the name, so let's drop it.  The top contender right now is
Visual: and I'll ping the Visual Editor folks to see if they mind.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new terms of use/paid contributions - do they apply to mediawiki

2014-06-16 Thread Dan Andreescu
cc-ing Luis as I think this consequence, whether intended or not, would
interest him


On Mon, Jun 16, 2014 at 4:25 PM, Brian Wolff bawo...@gmail.com wrote:

 Hi,

 So from what I understand, there's now been an amendment to WMF's
 terms of use to require disclosure of paid contributions [1]. Its a
 little unclear how this applies to MediaWiki as a project, but a
 literal reading of the policy makes it seem like MediaWiki is
 included.

 * MediaWiki is arguably a project of the Wikimedia foundation. The
 foundation's website says as much [2]
 * A commit/patchset certainly seems like a contribution.

 Thus the new policy would require anyone submitting code to us to
 declare who they work for. Personally this seems both unnecessary to
 me, as well as unlikely to be followed. For example, I see no reason
 why some person who uses our software should have to declare who they
 work for when they upstream a bug fix, etc.

 I would suggest we follow commons' lead [3], and declare that we do
 not have disclosure requirements for people giving us code.

 --bawolff

 [1]
 https://wikimediafoundation.org/w/index.php?title=Terms_of_Usediff=0oldid=90463
 [2] https://wikimediafoundation.org/wiki/Our_projects
 [3]
 https://commons.wikimedia.org/wiki/Commons:Requests_for_comment/Alternative_paid_contribution_disclosure_policy

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new terms of use/paid contributions - do they apply to mediawiki

2014-06-16 Thread Dan Andreescu

 The terms of use is basically a clickwrap
 https://en.wikipedia.org/wiki/Clickwrap agreement, right? When you make
 a
 contribution, you see by clicking you accept blah blah blah somewhere,
 once you click knowing that, you made a sort of contract from a legal point
 of view. So the terms of use only applies to you if you do actually see
 that piece of text, which does not seem to be present on mediawiki.org,
 nor
 on gerrit.wikimedia.org, and obviously not in the git interface when you
 are pushing a patch.


I agree with this reading, but I also think people like Bartosz will be
legitimately confused.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating progress; Knockout Components curly brace syntax

2014-07-08 Thread Dan Andreescu
On Tue, Jul 8, 2014 at 10:23 AM, Chad innocentkil...@gmail.com wrote:

 On Tue, Jul 8, 2014 at 7:16 AM, Tyler Romeo tylerro...@gmail.com wrote:

  I just want to be clear that any sort of template syntax that resembles
 or
  can be confused with wikitext is not something that we can or should
 allow
  in core. If MobileFrontend and whatnot want to use this syntax, so be it.
 

 We can absolutely allow it, so that's not strictly true. Whether we should
 is another matter. I just want to be clear that this is your opinion.


I'm not sure I follow; it seems to me Knockout syntax and wikitext are
different and have different purposes.  Can you elaborate a bit more,
Tyler?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcement: Matthew Flaschen joins Wikimedia as Features Engineer

2012-12-07 Thread Dan Andreescu
Welcome Matt!  Philadelphia  \o/


On Fri, Dec 7, 2012 at 5:02 PM, James Forrester jforres...@wikimedia.orgwrote:

 On 7 December 2012 13:06, Terry Chay tc...@wikimedia.org wrote:

  Please join me in a belated welcome of Matthew Flaschen to the Wikimedia
  Foundation. :-)
 

 Welcome aboard, Matt! Excited to be working alongside you.

 J.
 --
 James D. Forrester
 Product Manager, VisualEditor
 Wikimedia Foundation, Inc.

 jforres...@wikimedia.org | @jdforrester
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] DevOps/Continuous Deployment discussion?

2012-12-26 Thread Dan Andreescu
Yes, I am seeking a Continuous Deployment solution for Limn.  I would be
glad to participate in a centralization effort.


On Wed, Dec 26, 2012 at 12:38 PM, David Gerard dger...@gmail.com wrote:

 On 26 December 2012 17:31, Chris McMahon cmcma...@wikimedia.org wrote:

  Is it time to start such a discussion?  Or is this premature?


 Everyone has to first read and understand (possibly with shuddering)
 https://twitter.com/DEVOPS_BORAT


 - d.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] The ultimate bikeshed: typos in commit messages

2013-01-19 Thread Dan Andreescu
 Hey,

 Thanks to all for voicing your opinions.

 My observations:

 * The community is split on this topic
 * Both camps have people that strongly defend their respective views
 * The above two points are unlikely to change
 * People are getting upset due to -1 for things that are by some
considered
 nitpicking
 * People are getting upset because others are not willing to fix their -1
 remarks since they consider them to be nitpicks
 * Commits that are otherwise fine are getting stuck because of this

 Loose from how you choose to interpret and deal with those, let's try to
be
 tolerant of each other, keep and eye on the bigger picture, and work
 towards what is really important as a community.

 Cheers

+2
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-01 Thread Dan Andreescu
Hi Bináris,

The following manual seems to be the most actively maintained guide for
getting MediaWiki installed on Windows:

http://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Windows

If you run into any problems, I'd suggest adding them to the manual along
with any resolutions you or others come up with.  Good luck!

Dan


On Fri, Feb 1, 2013 at 8:13 PM, Bináris wikipo...@gmail.com wrote:

 Hi folks,

 does anyone have experiences with integrating a MediaWiki installation into
 Small Business Server 2008 environment? The task is to install a wiki and
 authenticate logged-in users automatically. Preferably with real name
 instead of username.
 Is there a manual for this somewhere?
 I would like a company install for handbooks and howtos, but I have to
 persuade our sysop.

 --
 Bináris
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Integrating MediaWiki into MS SBS

2013-02-05 Thread Dan Andreescu
Devils and Sausages aside, I think sysadmins have a hard job that probably
shouldn't be made harder if possible:

http://xkcd.com/705/

Binaris, if you have access to a machine, you can set up a Bitnami
MediaWiki installation.  It's as simple as downloading the executable and
running it.  Bitnami is an open source project that bundles popular open
source software and takes the headaches out of configuring, etc.  Here's
the MediaWiki install:

http://bitnami.org/stack/mediawiki

Good luck.


On Tue, Feb 5, 2013 at 7:45 AM, Ariel T. Glenn ar...@wikimedia.org wrote:

 Στις 05-02-2013, ημέρα Τρι, και ώρα 07:21 -0500, ο/η Chad έγραψε:
  On Tue, Feb 5, 2013 at 7:06 AM, Marco Fleckinger
  marco.fleckin...@wikipedia.at wrote:
   The farmer doesn't want to eat anything he doesn't know. I don't know
 this
   sentence's popularity in Hungary (AFAIK?), but in German it's quite
 famous.
  
 
  I think the rough equivalent in English is nobody wants to see how a
  sausage is made. :)
 
 That's exactly the opposite meaning... I think you want better the
 devil we know than the devil we don't.

 Ariel


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How can we help Corporations use MW? (Was Re: Comparisons to Confluence)

2013-02-07 Thread Dan Andreescu
On 02/06/2013 10:00 PM, Tim Starling wrote:

  For corporate adoption, the main thing MediaWiki needs is not some
  particular feature. It needs to be supported. It needs an organisation
  with people who will care if corporate users are screwed over by a
  change. It needs community management, so that the features needed by
  corporate users will be discoverable and well-maintained, rather than
  developed privately, over and over. And it needs the smallest nudge of
  promotion, on top of what Wikipedia fans are doing for it. Say, a
  nice-looking website aimed at this user base.


I was on the other side of this, albeit a while back.  We had to decide
between MediaWiki and Confluence to power Disney's ParentPedia:
The main reason we chose Confluence was lack of


 Totally agreed.

 I (along with a few other hardy volunteers) have been helping MW users
 at [[mw:Project:Support desk]] and it seems clear that the focus most
 developers have on the WMF use case has really made MW less usable for
 other people.

 One of my clients had an older (1.11) MediaWiki installation that they
 are using to share information with their distributors world-wide.
 Their first attempt to get the system to do what they wanted was a flop
 since the Java developer they had working on the system really didn't
 know that much about MW.  I was able to get the system upgraded to 1.19
 and adapt MW to their infrastructure using hooks, ResourceLoader, and
 pages they could update in the MediaWiki namespace.

 So, yes, I think MediaWiki has a lot to offer corporate users, but we
 haven't really made that clear or shown them how to do a lot of things
 they want to do.

 Tim has it right when he says MediaWiki needs community management, so
 that the features needed by corporate users will be discoverable and
 well-maintained, rather than developed privately, over and over.

 We've discussed this sort of thing over and over, but I think we're
 actually beginning to make some headway now thanks especially to work by
 Mariya Miteva.


 --
 http://hexmode.com/

 There is no path to peace. Peace is the path.
-- Mahatma Gandhi, Non-Violence in Peace and War

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How can we help Corporations use MW? (Was Re: Comparisons to Confluence)

2013-02-07 Thread Dan Andreescu
Sorry about that - Gmail reload glitch

On 02/06/2013 10:00 PM, Tim Starling wrote:

  For corporate adoption, the main thing MediaWiki needs is not some
  particular feature. It needs to be supported. It needs an organisation
  with people who will care if corporate users are screwed over by a
  change. It needs community management, so that the features needed by
  corporate users will be discoverable and well-maintained, rather than
  developed privately, over and over. And it needs the smallest nudge of
  promotion, on top of what Wikipedia fans are doing for it. Say, a
  nice-looking website aimed at this user base.


I was on the other side of this, albeit a while back.  We had to decide
between MediaWiki and Confluence to power Disney's ParentPedia (which has
since been abandoned):

http://www.theregister.co.uk/2007/03/13/disney_eisner/
http://family.go.com/parenting/

The main reasons we chose Confluence:

* An easier to understand API.  This seems to not be a problem any more:
http://www.mediawiki.org/wiki/API:Client_code
* Easier setup on Windows:
http://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Windows, possibly
made easier now by Bitnami: http://bitnami.org/stack/mediawiki

Do we have a place where people can talk through problems with corporate
adoption?  I suspect Tim's correct about the general case but that focusing
on specifics would drive adoption.

Dan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] QUnit testing in Jenkins

2013-03-05 Thread Dan Andreescu
 From console[1]:

 02:50:21 Testing

http://localhost:9412/mediawiki-core-28a705a9f648da310ff2a4fca9d013bf147f3d1f/index.php?title=Special:JavaScriptTest/qunitExceptionthrown
 by test.module1: expected
 02:50:21 Error: expected
 02:50:24 OK
 02:50:24  832 assertions passed (5350ms)
 02:50:24
 02:50:24 Done, without errors.

 This is strange. The console says there was a problem: Exception thrown
by
 test.module1: expected and Error: expected
 But then it says everything is fine: Done, without errors.

 Željko
 --
 [1]
https://integration.mediawiki.org/ci/job/mediawiki-core-qunit/3/console

I'm not sure how to navigate to the source that defines that test but I
suspect it's just using an expected exception test:
http://docs.jquery.com/QUnit/raises
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-08 Thread Dan Andreescu
 ... Then, if a developer is not willing to learn
 Gerrit, its code is probably not worth the effort of us integrating
 github/gerrit.  That will just add some more poor quality code to your
 review queues.


That seems like a pretty big assumption, and likely to be wrong.  The
simpler the code review process, the happier people will be to submit
patches.  Quality seems independent of that, and more likely linked to
the ease of validating patches (linting, unit test requirements, good style
guides, etc).  But that's just a guess.  If deemed interesting, I would be
glad to help quantify patch quality and analyze what helps to improve it.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Github/Gerrit mirroring

2013-03-09 Thread Dan Andreescu
  [...]

  The point I'm trying to make is that the problems with Gerrit are not
  problems with Gerrit, but actually problems with Git itself. If you can't
  handle the basics of Gerrit, it's because you don't know how to use Git.
  And at that point I don't see how GitHub is going to make things that
 much
  easier anyway.

 ACK.  Also I think, if you have mastered the MediaWiki code-
 base to a degree where you can submit a patch, its actual
 submission will be the least problem you've encountered :-).


I disagree and I have a very simple counter-point.  Gerrit makes it
basically impossible to work in a git-flow style (
http://nvie.com/posts/a-successful-git-branching-model/).  From what I
understand, rebasing and good branching strategies like git-flow simply
don't get along.

I agree with Steven and Ryan about leaving the migrate to GitHub
discussion closed.  Gerrit is open source and it meets all of our needs.
 But it does create barriers to entry as shown above, in Jon's point, and
in Matthew Bowker's experience (which I'm sure is not atypical).  So I
think we should be open to valid criticism and find out how we can lower
the barrier to entry.  As for the argument that this will lower code quality, I
think the burden of proof is on those making that assumption.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Try out MediaWiki Vagrant

2013-03-28 Thread Dan Andreescu
This is awesome, I've used it as a total MediaWiki noob to poke around at
the internals.  One suggestion I'd have is to include a script that
populates some sample data in the db (pages, users, edits, etc.).  Does
anyone have such a thing, should we dump a particularly active test setup
from somewhere, or would we need to make something new?


On Thu, Mar 28, 2013 at 2:18 AM, Ori Livneh o...@wikimedia.org wrote:

 Vagrant is a command-line tool for automatically provisioning virtual
 machines according to scripted specifications. The mediawiki-vagrant
 project bundles together specifications for quickly and easily provisioning
 a virtual machine running MediaWiki, suitable for development work.

 I announced it a few months ago, when I had something nominally useful to
 share. Some people found it useful, but it was something I had cobbled
 together in a hacky way as I was learning Vagrant, and I wasn't very happy
 with the end result. In the intervening months, I got a lot of useful
 suggestions, and became more proficient in writing Vagrant and Puppet
 configurations, so I decided to do things over.

 There's a lot more work to do (better profiling tools!), but I think it
 works quite well now, so I would appreciate some testing and feedback from
 others.

 Installation instructions are available here:
 http://www.mediawiki.org/wiki/Vagrant

 I hope you check it out, and that you find it useful. Feedback would be
 much appreciated.

 I'd also like to publicly thank Yurik for testing this extensively on
 Windows and providing detailed logs when things broke. Thanks, Yurik!

 --
 Ori Livneh



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Support for 3D content

2013-04-19 Thread Dan Andreescu
Sounds like NicoV started work again to try to address those issues.  We
should take a look at the Amsterdam Hackathon or Wikimania.


On Fri, Apr 19, 2013 at 10:54 AM, Eugene Zelenko
eugene.zele...@gmail.comwrote:

 Hi!

 Extension and viewer for Chemical Markup Language were created long
 time ago. However it's still not reviewed for security issues to be
 included on WMF projects. See
 https://bugzilla.wikimedia.org/show_bug.cgi?id=16491.

 On Fri, Apr 19, 2013 at 6:03 AM, Mathieu Stumpf
 psychosl...@culture-libre.org wrote:
  Hi,
 
  Reading the 2012-13 Plan, I see that multimedia is one the key
 activities
  for Mediawiki. So I was wondering if there was already any plan to
 integrate
  3D model viewers, which would be for example very interesting for anatomy
  articles, or simply 3D maths objects.

 Eugene.

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is there something like a gerrit/git download statistics overview page?

2013-05-07 Thread Dan Andreescu
I can't speak for gerrit but it seems to me github tried, failed, and
abandoned capturing these statistics:
http://stackoverflow.com/questions/6198194/how-to-see-count-of-project-downloads-on-github


On Tue, May 7, 2013 at 3:29 AM, Thomas Gries m...@tgries.de wrote:

 Hi,

 perhaps I am too naive that it could be implemented with a git-based repo,
 but is there a kind of download, clone, or access statistics available
 for core and extensions ?

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Welcome, Nik Everett

2013-05-31 Thread Dan Andreescu
Awesome, welcome Nik!  And see you June 17th, a few more of us remoteys
will be in the office :)

Dan


On Fri, May 31, 2013 at 6:02 PM, Matthew Flaschen
mflasc...@wikimedia.orgwrote:

 On 05/31/2013 05:48 PM, Rob Lanphier wrote:
  Hi everyone
 
  It is my pleasure to introduce Nik Everett as a new Senior Software
  Engineer specializing in Search, working remotely from his home in
  Raleigh, North Carolina.  Nik joins us from Lulu, a company founded by
  Bob Young[1] to enable anyone to publish a book (dealing with print,
  distribution, and order fulfillment issues).  Nik was responsible for
  a number of DevOps-related tasks at Lulu, mainly centered around test
  automation, build infrastructure, and database performance, but also
  including maintenance and support of their Solr infrastructure.  He
  also founded and chaired Lulu’s architecture review board, which is
  similar to a group we’re just now starting up here[2]

 Welcome!

 Matt Flaschen

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikimedia-l] Updates on VE data analysis

2013-07-26 Thread Dan Andreescu
 Note that if you are interested in VE edits per user type instead,
 http://ee-dashboard.wmflabs.org/graphs/enwiki_ve_hourly_perc_by_user_type
 already offers a smoothing option (click the spanner icon on the top
 left), which can be set to 24 hours or more.

The 24 hour smoothing can also be set as the default in a new graph if
that's useful to see.  That way the viewer wouldn't have to toggle it every
time.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] OAuth

2013-08-20 Thread Dan Andreescu
This is highly anticipated on my part and awesome.  I will integrate it
into wikimetrics asap.

Dan


On Tue, Aug 20, 2013 at 9:15 PM, Chris Steipp cste...@wikimedia.org wrote:

 As mentioned earlier this week, we deployed an initial version of the OAuth
 extension to the test wikis yesterday. I wanted to follow up with a few
 more details about the extension that we deployed (although if you're just
 curious about OAuth in general, I recommend starting at oauth.net, or
 https://www.mediawiki.org/wiki/Auth_systems/OAuth):

 * Use it: https://www.mediawiki.org/wiki/Extension:OAuth#Using_OAuthshould
 get you started towards using OAuth in your application.

 * Demo: Anomie setup a excellent initial app (I think counts as our first
 official, approved consumer) here
 https://tools.wmflabs.org/oauth-hello-world/. Feel free to try it out, so
 you can get a feel for the user experience as a user!

 * Timeline: We're hoping to get some use this week, and deploy to the rest
 of the WMF wikis next week if we don't encounter any issues.

 * Bugs: Please open bugzilla tickets for any issues you find, or
 enhancement requests--

 https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki%20extensionscomponent=OAuth


 And some other details for the curious:

 * Yes, you can use this on your own wiki right now! It's meant to be used
 in a single or shared environment, so the defaults will work on a
 standalone wiki. Input and patches are welcome, if you have any issues
 setting this up on your own wiki.

 * TLS: Since a few of you seem to care about https... The extension
 currently implements OAuth 1.0a, which is designed to be used without https
 (except to deliver the shared secret to the app owner, when the app is
 registered). So calls to the API don't need to use https.

 * Logging: All edits are tagged with the consumer's id (CID), so you can
 see when OAuth was used to contribute an edit.

 Enjoy!
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing initial version of gerrit-stats

2012-09-15 Thread Dan Andreescu

 do anything and it cannot be undone. After deleting and adding one
 metric I had two metrics with same color.


I tried to do this and wasn't successful, could you please provide the
steps that you took when you found the issue?  Thank you.

Dan
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Initial stab at responsive images for high-density displays

2012-09-18 Thread Dan Andreescu
DPI started to go a little crazy when mobile devices were introduced.  This
is a good history:

http://www.html5rocks.com/en/mobile/high-dpi/

And you can see that iDevices have screens with varied ppi:

http://stackoverflow.com/questions/1365112/what-dpi-resolution-is-used-for-an-iphone-app

As for Daniel's desire for bliss, I think the html5rocks article does a
good job of showing how to use srcset and build safe fallbacks for it.  I
feel like that's the right approach because as browsers mature we can just
turn off the fallback.

On Tue, Sep 18, 2012 at 9:23 AM, Antoine Musso hashar+...@free.fr wrote:

 Le 18/09/12 09:31, Brion Vibber a écrit :
  More recently, tablets and a few laptops are bringing 1.5x and 2.0x
 density
  displays too, such as the new Retina iPad and MacBook Pro.

 Please excuse my noobiness, but what 1.5x / 2.0x densities are referring
 to? IIRC most computers used 72dpi and Microsoft has used 96dpi.


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] First open tech meeting with video broadcast - 10/18

2012-10-08 Thread Dan Andreescu
I'm using Google Hangouts on Ubuntu and I can help anyone that's having a
problem.  It works out of the box on Windows and OS X as far as I know.  On
Ubuntu, I found it easier to set up with Chrome than with Chromium but it's
possible with both and other browsers.

As for the profile requirement, it does require you to upgrade your
google account to a google plus account.  Most of us have google accounts
through our wikimedia.org google mail and the required pieces of
information for a google plus upgrade are gender and birthday, and those
aren't shared.  You can see mine is pretty blank:

https://plus.google.com/u/1/108031902869881812253/posts/p/pub?hl=en

If anyone has enough trouble that they think setting up a wiki for this
would be useful, I'd be happy to assist.

Dan

On Sun, Oct 7, 2012 at 4:38 PM, Platonides platoni...@gmail.com wrote:

 On 06/10/12 03:08, MZMcBride wrote:
  Erik Moeller wrote:
  Anyone will be able to join the meeting via IRC to participate and
  watch the YouTube stream. Sign up if you want, or just join the
  meeting when it happens.
 
  Hi.
 
  Thanks for working on this. It sounds neat. :-)
 
  I think the current meetings page is desperately missing instructions for
  joining the meeting. For example, are participants going to need a Gmail
  account prior to the meeting so that they can use Google Hangout? Does
 using
  Google Hangout require any advance preparation such as installing a
 plugin?
  Which IRC channel should meeting participants join? (And of course
 depending
  on who you're targeting, there are a whole host of other questions like
  what is IRC? and how can I easily join an IRC discussion [without an
 IRC
  client]?, though I'm assuming that since this is a tech meeting, maybe
 this
  isn't as necessary.)
 
  I'd offer to help with the documentation/instructions, but unfortunately
 I
  don't know most of the answers. Any help you could give would be great.
 
  MZMcBride

 Last time I tried, Google Hangout required not only a Google account but
 also Google+. You then needed to install a propietary plugin that
 refused to work / be installed. That's my experience with Google Hangout.

 Connecting instead to a YouTube stream (RTSP I guess?) while reading on
 irc looks promising.



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Reminder: Tech chat tomorrow

2012-10-29 Thread Dan Andreescu
+1 JS libraries beyond jQuery

On Mon, Oct 29, 2012 at 4:19 PM, Erik Moeller e...@wikimedia.org wrote:

 Reminder that we'll have one of our weekly tech chats tomorrow, with
 video from the WMF office and other Hangout participants broadcast via
 YouTube:

 https://www.mediawiki.org/wiki/Meetings/2012-10-30

 As a special topic, I've asked David Schoonover to moderate a
 discussion about JS libraries beyond jQuery if there's sufficient
 interest.

 Cheers,
 Erik
 --
 Erik Möller
 VP of Engineering and Product Development, Wikimedia Foundation

 Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Collaborative code-editing tools (long-distance pair programming)

2012-11-08 Thread Dan Andreescu
+1 on getting Collide up and running.  It's open source and looks like it
already does project management and syntax highlighting.


On Thu, Nov 8, 2012 at 10:45 AM, Mark Holmquist mtrac...@member.fsf.orgwrote:

 FWIW there is a way to add in syntax highlighting, and I could probably
 create a new instance for that. There was also chatter on the Etherpad
 channel yesterday about writing plugins for compiling and running
 programs on the backend of the server.


 Additionally, I suppose, we could write a plugin for enabling a grouping
 of pads into projects, which would make it easier to have multiple files
 open at once.

 I think the major problem is that any files on the etherpad server will
 need to be downloaded or copy/pasted before you can actually run them,
 which may or may not be ideal. But again, there may be a solution in the
 plugin API.

 (backend of the server - sorry, I just woke up)


 --
 Mark Holmquist
 Software Engineer, Wikimedia Foundation
 mtrac...@member.fsf.org
 http://marktraceur.info

 __**_
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/**mailman/listinfo/wikitech-lhttps://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] [Labs-l] Announcement: Yuvi Panda joins Ops

2014-11-07 Thread Dan Andreescu
 At the same time I find it sad that he moves to the USA because he was
 invaluable because he was available when the other Labs people were not
 around. As such he provided a much needed service exactly because he lives
 in India.


Oh that's a common misunderstanding which arises from the assumptions that
Yuvi sleeps and Yuvi is human.  In reality, Yuvi is always up on all time
zones so everyone thinks he helps their time zone.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums Notes for 2014-12-17

2014-12-17 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2014-12-17
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes for 2015-02-04

2015-02-04 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-02-04
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes

2015-01-14 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-01-14

note: no meeting next week as All Staff is happening.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes 2015-02-11

2015-02-11 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-02-11

If anyone with better wiki kung fu than me could take a look and figure out
why the table of contents is not rendering on that page like it does on all
the other [1] ones, I would appreciate it.

[1] https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-02-04
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes, 2015-02-18

2015-02-18 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-02-18
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes since December 24th (2014-12-24, 2014-12-31, 2015-01-07)

2015-01-07 Thread Dan Andreescu
Here are notes from the last three scrum of scrums meetings:

https://www.mediawiki.org/wiki/Scrum_of_scrums/2014-12-24

https://www.mediawiki.org/wiki/Scrum_of_scrums/2014-12-31

https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-01-07


Special thanks! to the Wikidata team for putting up their update for
today's meeting.

Happy New Year everyone :)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes, 2015-04-15

2015-04-15 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2015-04-15
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wmfall] Transitioning wikistats pageview reports to use new pageview definition

2015-11-11 Thread Dan Andreescu
The old definition was never put down clearly in words, but I think it's
easiest to understand from this SQL query that generates
pagecounts-all-sites (which was what wikistats used until this switch):
https://github.com/wikimedia/analytics-refinery/blob/master/oozie/pagecounts-all-sites/load/insert_hourly_pagecounts.hql#L46

Strainu, I know you can read that, but if others are curious I'll try to
put it in words.
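
To give a rough idea of the shape of it: the old definition basically groups
raw request lines by project and page title, per hour, and sums the hits,
without trying to filter out spider or bot traffic.  Here is a very small
illustrative sketch of that idea in Python; the field names and the status
filter below are assumptions for the example only, not the actual definition,
which lives in the HQL linked above:

    from collections import Counter

    def hourly_pagecounts(requests):
        """Illustration only: aggregate raw request records into hourly
        (project, page_title) counts, pagecounts-all-sites style.
        The real rules live in the HQL query linked above."""
        counts = Counter()
        for r in requests:
            # Each record is assumed to look like:
            # {"hour": "2015110105", "project": "en",
            #  "page_title": "Main_Page", "http_status": 200}
            if r.get("http_status") not in (200, 304):
                continue  # hypothetical status filter, for the example only
            counts[(r["hour"], r["project"], r["page_title"])] += 1
        return counts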

On Wednesday, November 11, 2015, Strainu  wrote:

> Hi Nuria,
>
> What is the "old" definition? For some wikis the difference is from
> simple to double.
>
> Thanks,
>   Strainu
>
> 2015-11-11 2:00 GMT+02:00 Nuria Ruiz >:
> > Hello!
> >
> > The analytics team wishes to announce that we have finally transitioned
> > several of the pageview reports in stats.wikimedia.org  to the new
> pageview
> > definition [1]. This means that we should no longer have two conflicting
> > sources of pageview numbers.
> >
> > While we are not not fully done transitioning pageview reports we feel
> this
> > is an important enough milestone that warrants some communication. BIG
> > Thanks to Erik Z. for his work on this project.
> >
> > Please take a look at a report using the new definition (a banner is
> > present when report has been updated)
> > http://stats.wikimedia.org/EN/TablesPageViewsMonthlyCombined.htm
> >
> > Thanks,
> >
> > Nuria
> >
> >
> > [1] https://meta.wikimedia.org/wiki/Research:Page_view
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org 
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org 
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Pageview API

2015-11-16 Thread Dan Andreescu
Dear Data Enthusiasts,


In collaboration with the Services team, the analytics team wishes to
announce a public Pageview API.  For an example of what kind of UIs someone
could build with it, check out this excellent demo (code).


The API can tell you how many times a wiki article or project is viewed
over a certain period.  You can break that down by views from web crawlers
or humans, and by desktop, mobile site, or mobile app.  And you can find
the 1000 most viewed articles on any project, on any given day or month
that we have data for.  We currently have data back through October and we
will be able to go back to May 2015 when the loading jobs are all done.
For more information, take a look at the user docs.
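
If you want to script against it, here is a minimal sketch of calling the
per-article endpoint with Python and the requests library.  The URL layout
below follows the user docs as I read them, so please double-check the docs
before relying on it:

    import requests

    # Daily views of one article, all access methods and all agents,
    # over a date range (timestamps are YYYYMMDD).
    url = (
        "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
        "en.wikipedia/all-access/all-agents/Albert_Einstein/daily/20151101/20151130"
    )
    resp = requests.get(url, headers={"User-Agent": "pageview-example/0.1"})
    resp.raise_for_status()

    for item in resp.json().get("items", []):
        print(item["timestamp"], item["views"])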


After many requests from the community, we were really happy to finally
make this our top priority and get it done.  Huge thanks to Gabriel, Marko,
Petr, and Eric from Services, Alexandros and all of Ops really, Henrik for
maintaining stats.grok, and, of course, the many community members who have
been so patient with us all this time.


The Research team’s Article Recommender tool already uses the API to rank
pages and determine relative importance.  Wiki Education Foundation’s
dashboard is going to be using it to count how many times an article has
been viewed since a student edited it.  And there are other grand plans for
this data like “article finder”, which will find low-rated articles with a
lot of pageviews; this can be used by editors looking for high-impact work.
Join the fun, we’re happy to help get you started and listen to your ideas.
Also, if you find bugs or want to suggest improvements, please create a
task in Phabricator and tag it with #Analytics-Backlog.


So what’s next?  We can think of too many directions to go into, for
pageview data and Wikimedia project data, in general.  We need to work with
you to make a great plan for the next few quarters.  Please chime in here
 with your needs.


Team Analytics
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] [Analytics] Pageview API

2015-11-18 Thread Dan Andreescu
>
> It does! The API framework as a whole is at
> https://github.com/wikimedia/restbase and you want
>
> https://github.com/wikimedia/restbase/blob/master/specs/analytics/v1/pageviews.yaml
> and https://github.com/wikimedia/restbase/blob/master/mods/pageviews.js
> for the pageviews-specific modules.


And Chris, you're welcome to #wikimedia-analytics to ask any questions you
might have.  I'm known as milimetric around those parts :)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Pageview API

2015-11-18 Thread Dan Andreescu
Quick general reminder.  Please tag tasks with "Analytics-Backlog" instead
of "Analytics" for now.  We need to clean that up, but we just haven't
gotten around to it.

On Wed, Nov 18, 2015 at 9:05 AM, Dan Andreescu <dandree...@wikimedia.org>
wrote:

> Nice work on the API!
>>
>> I wrote a basic consumer of this API at
>> http://codepen.io/Krinkle/full/wKOMMN#wikimdia-pageviews
>
>
> Cool!  Check out dv.wikipedia.org though, some of the RTL is messing with
> your (N views) parens.
>
> The only hurdle I found is that the 'articles' property is itself
>> nested/double encoded JSON, instead of a plain object. This was somewhat
>> unexpected and makes the API harder to use.
>>
>
> Right, for sure.  The data had to be stuffed that way to save space in
> Cassandra.  So we could parse it and reshape the response in RESTBase, and
> that seems like a good idea and probably wouldn't hurt performance too
> much.  Do you think it's worth the breaking change to the format?  I'll
> post on the bug that MZ filed.
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Pageview API

2015-11-18 Thread Dan Andreescu
>
> Nice work on the API!
>
> I wrote a basic consumer of this API at
> http://codepen.io/Krinkle/full/wKOMMN#wikimdia-pageviews


Cool!  Check out dv.wikipedia.org though, some of the RTL is messing with
your (N views) parens.

The only hurdle I found is that the 'articles' property is itself
> nested/double encoded JSON, instead of a plain object. This was somewhat
> unexpected and makes the API harder to use.
>

Right, for sure.  The data had to be stuffed that way to save space in
Cassandra.  So we could parse it and reshape the response in RESTBase, and
that seems like a good idea and probably wouldn't hurt performance too
much.  Do you think it's worth the breaking change to the format?  I'll
post on the bug that MZ filed.
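
In the meantime, a consumer can decode the field itself.  Here is a small
sketch of the workaround in Python, assuming the top endpoint returns an
"items" list whose "articles" value is the JSON-encoded string described
above; names here follow my reading of the docs, so treat them as
assumptions:

    import json
    import requests

    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/top/"
           "en.wikipedia/all-access/2015/11/01")
    data = requests.get(url, headers={"User-Agent": "example/0.1"}).json()

    for item in data.get("items", []):
        articles = item["articles"]
        # Work around the double encoding: the value may be a JSON string
        # rather than a plain list of objects.
        if isinstance(articles, str):
            articles = json.loads(articles)
        for entry in articles[:10]:
            print(entry.get("rank"), entry.get("article"), entry.get("views"))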
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Analytics] Pageview API

2015-11-18 Thread Dan Andreescu
>
> Finally! I waited so many years for a formal tool like that. Thank you!
>

Itzik, I remember your requests for this type of data from a long long time
ago, when I was just starting at the foundation!!  You and the many others
with similar requests are the people we were thanking in the announcement :)

> And demo on wmflabs is great, but it will be great to add an option to export
> the data to a CSV file. Also, the data are only from the beginning of October,
> any chance we can load past data as well?
>

So, I agree with what Madhu said that you could file a Phabricator ticket
for this.  But for now, we're not looking to build a UI for it that is
production ready.  The demo was meant just to show that it's very simple to
do so.  It took one of our engineers only a few days and under 300 lines of
code to get this done.  We'd like to be patient and see if the community at
large has an interest in running something like stats.grok.se now that
heavy lifting of data is no longer required, and performance is guaranteed
by us within reasonable expectations.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Analytics] [Data Release] [Data Deprecation] [Analytics Dumps]

2016-03-23 Thread Dan Andreescu
On Wed, Mar 23, 2016 at 1:06 PM, Federico Leva (Nemo) <nemow...@gmail.com>
wrote:

> Dan Andreescu, 23/03/2016 15:58:
>
>>
>> *Clean-up:* Analytics data on dumps was crammed into /other with
>> unrelated datasets.  We made a new page to receive current and future
>> datasets [3] and linked to it from /other and /.  Please let us know if
>> anything there looks confusing or opaque and I'll be happy to clarify.
>>
>
> I assume the old URLs will redirect to the new ones, right?
>

Good question, we didn't change any old URLs actually, so if you're trying
to get to other/pagecounts-ez, other/pagecounts-raw and all that, they're
all still there, just linked-to from /analytics.  We did it this way
because we figured people had scripts that depended on those URLs.  We
thought about moving and symlinking but it's unlikely that we'll
ever be able to delete the other/** location.

So mainly we just have a new page where we can do a better job of focusing
on the analytics datasets.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Data Release] [Data Deprecation] [Analytics Dumps]

2016-03-23 Thread Dan Andreescu
cc-ing our friends in research and wikitech (sorry I forgot initially)

We're happy to announce a few improvements to Analytics data releases on
> dumps.wikimedia.org:
>
> * We are releasing a new dataset, an estimate of Unique Devices accessing
> our projects [1]
> * We are officially making available a better Pageviews dataset [2]
> * We are deprecating two older pageview statistics datasets
> * We moved Analytics data from /other to /analytics [3]
>
> Details follow:
>
>
> *Unique Devices:* Since 2009, the Wikimedia Foundation used comScore to
> report data about unique web visitors.  In January 2016, however, we
> decided to stop reporting comScore numbers [4] because of certain
> limitations in the methodology, these limitations translated into
> misreported mobile usage. We are now ready to replace comscore numbers with
> the Unique Devices Dataset [5][1]. While unique devices does not equal
> unique visitors, it is a good proxy for that metric, meaning that a major
> increase in the number of unique devices is likely to come from an increase
> in distinct users. We understand that counting uniques raises fairly big
> privacy concerns and we use a very private conscious way to count unique
> devices, it does not include any cookie by which your browser history can
> be tracked [6].
>
> We invite you to explore this new dataset and hope it’s helpful for the
> Wikimedia community in better understanding our projects. This data can
> help measure the reach of Wikimedia projects on the web.
>
> *Pageviews:* This [2] is the best quality data available for counting the
> number of pageviews our projects receive at the article and project level.
> We've upgraded from pagecounts-raw to pagecounts-all-sites, and now to
> pageviews, in order to filter out more spider traffic and measure something
> closer to what we think is a real user viewing content.  A short history
> might be useful:
>
> * pagecounts-raw: was maintained by Domas Mituzas originally and taken
> over by the analytics team.  It was and still is the most used dataset,
> though it has some majore problems.  It does not count access to the mobile
> site, it does not filter out spider or bot traffic, and it suffers from
> unknown loss due to logging infrastructure limitations.
> * pagecounts-all-sites: uses the same pageview definition as
> pagecounts-raw, and so also does not filter out spider or bot traffic.  But
> it does include access to mobile and zero sites, and is built on a more
> reliable logging infrastructure.
> * pagecounts-ez: is derived from the best data available at the time.
> So until December 2015, it was based on pagecounts-raw and
> pagecounts-all-sites, and now it's based on pageviews.  This dataset is
> great because it compresses very large files without losing any
> information, still providing hourly page and project level statistics.
>
> So the new dataset, pageviews, is what's behind our pageview API and is
> now available in static files for bulk download back to May 2015.  But the
> multiple ways to download pageview data are confusing for consumers, so
> we're keeping only pageviews and pagecounts-ez and deprecating the other
> two.  If you'd like to read more about the current pageview definition,
> details are on the research page [7].
>
> *Deprecating:* We are deprecating the pagecounts-raw and
> pagecounts-all-sites datasets in May 2016 (discussion here:
> https://phabricator.wikimedia.org/T130656 ).  This data suffers from many
> artifacts, lack of mobile data, and/or infrastructure problems, and so is
> not comparable to the new way we track pageviews.  It will remain here
> because we have historical data that may be useful, but it will not be
> maintained or updated beyond May 2016.
>
> *Clean-up:* Analytics data on dumps was crammed into /other with
> unrelated datasets.  We made a new page to receive current and future
> datasets [3] and linked to it from /other and /.  Please let us know if
> anything there looks confusing or opaque and I'll be happy to clarify.
>
>
> [1] http://dumps.wikimedia.org/other/unique_devices
> [2] http://dumps.wikimedia.org/other/pageviews
> [3] http://dumps.wikimedia.org/analytics/
> [4] https://meta.wikimedia.org/wiki/ComScore/Announcement
> [5] https://meta.wikimedia.org/wiki/Research:Unique_Devices
> [6]
> https://meta.wikimedia.org/wiki/Research:Unique_Devices#How_do_we_count_unique_devices.3F
> [7] https://meta.wikimedia.org/wiki/Research:Page_view
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dumps.wm.o access will be https only

2016-04-01 Thread Dan Andreescu
Cross-posting this announcement since people on Analytics-l tend to use
dumps a lot

Basically, http access is being redirected to https starting April 4th.

On Fri, Apr 1, 2016 at 7:03 AM, Ariel Glenn WMF  wrote:

> We plan to make this change on April 4 (this coming Monday), redirecting
> plain http access to https.
>
> A reminder that our dumps can also be found on our mirror sites, for those
> who may have restricted https access.
>
> Ariel Glenn
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dumps.wm.o access will be https only

2016-04-01 Thread Dan Andreescu
:) I would like to think that April Fools would be a little more extreme
than this

Petr, the data people are downloading from dumps is not sensitive itself,
but a user's private information (such as User Agent, IP, etc.) is
vulnerable over plain http.  So the move to https protects that from
snoopers like the NSA.  I don't have much to do with this change, and I
consider it beneficial, but does it cause any problems?

On Fri, Apr 1, 2016 at 10:24 AM, Huib Laurens  wrote:

> Aprils fool?
>
> On Friday, 1 April 2016, Petr Bena  wrote:
>
> > Can you give us some justification for this change? It's not like when
> > downloading dumps you would actually leak some sensitive data...
> >
> > On Fri, Apr 1, 2016 at 1:03 PM, Ariel Glenn WMF  > > wrote:
> > > We plan to make this change on April 4 (this coming Monday),
> redirecting
> > > plain http access to https.
> > >
> > > A reminder that our dumps can also be found on our mirror sites, for
> > those
> > > who may have restricted https access.
> > >
> > > Ariel Glenn
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org 
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org 
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Met vriendelijke groet,
>
> Huib Laurens
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Scrum of Scrums notes, 2016-04-13

2016-04-13 Thread Dan Andreescu
https://www.mediawiki.org/wiki/Scrum_of_scrums/2016-04-13

note: it looks like some of the wiki markup on etherpad wasn't proper, so
weird things happened like Collaboration is nested under Discovery.  It's
not a sign of a re-org, just that people should go clean that up.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikistats 2.0 Design Preview

2017-02-07 Thread Dan Andreescu
Hello, and apologies for cross-posting. We have designed a set of
wireframes that reflect the Wikistats community's priorities [1].  We’re
now looking for feedback on the design, and we’d love your input.  We have
key questions that touch on different sections, and links for feedback at
mediawiki.org [2] (if you prefer email, just reply to this message).
Please comment by Monday, February 13th so we can include your thoughts and
iterate. Thank you very much!

Your Wikistats 2.0 Team

[1]
https://www.mediawiki.org/wiki/Analytics/Wikistats/DumpReports/Future_per_report
[2]
https://www.mediawiki.org/wiki/Wikistats_2.0_Design_Project/RequestforFeedback/Round1
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Wikistats 2.0 Prototype

2017-04-21 Thread Dan Andreescu
Hello!  We've built an interactive prototype of the next version of
Wikistats [1] based on community priorities and feedback.  We'd love your
input on the visual design and look & feel.  We have some follow-up
questions but mostly an open discussion here [2].  Please comment by
*Monday, May 1st* so we can include your feedback in the first release of
Wikistats 2.0.  Thank you very much!


Your Wikistats 2.0 Team


[1] https://analytics-prototype.wmflabs.org/

[2]
https://www.mediawiki.org/wiki/Wikistats_2.0_Design_Project/RequestforFeedback/Round2


p.s. apologies for cross-posting
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Can we drop revision hashes (rev_sha1)?

2017-09-18 Thread Dan Andreescu
So, as things stand, rev_sha1 in the database is used for:

1. the XML dumps process and all the researchers depending on the XML dumps
(probably just for revert detection)
2. revert detection for libraries like python-mwreverts [1]
3. revert detection in mediawiki history reconstruction processes in Hadoop
(Wikistats 2.0)
4. revert detection in Wikistats 1.0
5. revert detection for tools that run on labs, like Wikimetrics
?. I think Aaron also uses rev_sha1 in ORES, but I can't seem to find the
latest code for that service

If you think about this list above as a flow of data, you'll see that
rev_sha1 is replicated to xml, labs databases, hadoop, ML models, etc.  So
removing it and adding it back downstream from the main mediawiki database
somewhere, like in XML, cuts off the other places that need it.  That means
it must be available either in the mediawiki database or in some other
central database which all those other consumers can pull from.

I defer to your expertise when you say it's expensive to keep in the db,
and I can see how that would get much worse with MCR.  I'm sure we can
figure something out, though.  Right now it seems like our options are, as
others have pointed out:

* compute async and store in DB or somewhere else that's central and easy
to access from all the branches I mentioned
* update how we detect reverts and keep a revert database with good
references to wiki_db, rev_id so it can be brought back in context.

Personally, I would love to get better revert detection; using sha1 exact
matches doesn't really get to the heart of the issue.  Important phenomena
like revert wars, bullying, and stalking are hiding behind bad revert
detection.  I'm happy to brainstorm ways we can use Analytics
infrastructure to do this.  We definitely have the tools necessary, but not
so much the manpower.  That said, please don't strip out rev_sha1 until
we've accounted for all its "data customers".

So, put another way, I think it's totally fine if we say ok everyone, from
date XYZ, you will no longer have rev_sha1 in the database, but if you want
to know whether an edit reverts a previous edit or a series of edits, go
*HERE*.  That's fine.  And just for context, here's how we do our revert
detection in Hadoop (it's pretty fancy) [2].
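
To make the sha1-matching approach concrete, here's a minimal sketch in
Python of what "identity revert" detection looks like.  This is
illustrative only: the field names are made up, and real tools like
python-mwreverts handle ordering, revert windows, and self-reverts.

def find_identity_reverts(revisions):
    # revisions: iterable of (rev_id, rev_sha1) tuples for one page, in
    # timestamp order.  Yields (reverting_rev_id, reverted_to_rev_id).
    seen = {}  # sha1 -> earliest rev_id with that exact content
    for rev_id, sha1 in revisions:
        if sha1 in seen:
            # Content is byte-identical to an earlier revision: an exact
            # "identity" revert.  This misses partial and manual reverts,
            # which is the weakness I'm talking about above.
            yield rev_id, seen[sha1]
        else:
            seen[sha1] = rev_id

# Example: find_identity_reverts([(1, "aaa"), (2, "bbb"), (3, "aaa")])
# yields (3, 1).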


[1] https://github.com/mediawiki-utilities/python-mwreverts
[2]
https://github.com/wikimedia/analytics-refinery-source/blob/1d38b8e4acfd10dc811279826ffdff236e8b0f2d/refinery-job/src/main/scala/org/wikimedia/analytics/refinery/job/mediawikihistory/denormalized/DenormalizedRevisionsBuilder.scala#L174-L317

On Mon, Sep 18, 2017 at 9:19 AM, Daniel Kinzler  wrote:

> Am 16.09.2017 um 01:22 schrieb Matthew Flaschen:
> > On 09/15/2017 06:51 AM, Daniel Kinzler wrote:
> >> Also, I believe Roan is currently looking for a better mechanism for
> tracking
> >> all kinds of reverts directly.
> >
> > Let's see if we want to use rev_sha1 for that better solution (a way to
> track
> > reverts within MW itself) before we drop it.
>
>
> The problem is that if we don't drop is, we have to *introduce* it for the
> new
> content table for MCR. I'd like to avoid that.
>
> I guess we can define the field and just null it, but... well. I'd like to
> avoid
> that.
>
>
> --
> Daniel Kinzler
> Principal Platform Engineer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit login oddities

2018-02-13 Thread Dan Andreescu
If you're like me and you go to clear your cookies and find two called
GerritAccount, that's probably what's causing the problem.  I looked at the
dates and deleted the earlier-dated one.  That seems to work and shows you
as logged in without having to log in again.

On Tue, Feb 13, 2018 at 1:27 PM, Chad  wrote:

> On Tue, Feb 13, 2018 at 7:39 AM Derk-Jan Hartman <
> d.j.hartman+wmf...@gmail.com> wrote:
>
> > Ah, this explains why i wasn't able to login... Really confusing..
> > Can't we set a varnish/apache/nginx/whatever rule or something to
> > overwrite older now broken cookie settings ?
> >
> > DJ
>
>
> >
>
> Got a fix in the works for this. It's affecting more people than I
> initially thought!
>
> -Chad
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikistats 2.0 - Now with Maps!

2018-02-15 Thread Dan Andreescu
יגאל חיטרון‎, the -- means "Unknown", perhaps we should name it that.

That's a good idea, Yuri, we'll do that.

On Wed, Feb 14, 2018 at 9:24 PM, Yuri Astrakhan 
wrote:

> Nuria, well done, looks awesome!  I think you want to set zoom limits -
> right now you can zoom in and zoom out almost infinitely. Congrats!
>
> On Wed, Feb 14, 2018 at 8:47 PM, Michael Schönitzer <
> michael.schoenit...@wikimedia.de> wrote:
>
> > here a screenshot:
> >
> > There's a second anomaly: "SX" is also listed…
> >
> > 2018-02-15 1:35 GMT+01:00 יגאל חיטרון :
> >
> > > Sorry, I can't. I opened the link you gave on hewiki. Changed from map
> > view
> > > to table view. I get a list of countries - US, France, Spain, --,
> Japan,
> > > and so on. It's a link, and clicking it opens unexisting wiki article
> > with
> > > the same name.
> > > Igal
> > >
> > >
> > > On Feb 15, 2018 02:23, "Nuria Ruiz"  wrote:
> > >
> > > > Hello,
> > > >
> > > > >Hello and thank you.
> > > > >What do you mean in country named "--"?
> > > >
> > > > Not sure what you are asking, maybe a screenshot would help?
> > > >
> > > > On Wed, Feb 14, 2018 at 3:04 PM, יגאל חיטרון 
> > wrote:
> > > >
> > > > > Hello and thank you.
> > > > > What do you mean in country named "--"?
> > > > > Igal (User:IKhitron)
> > > > >
> > > > >
> > > > > On Feb 15, 2018 00:15, "Nuria Ruiz"  wrote:
> > > > >
> > > > > Hello from Analytics team:
> > > > >
> > > > > Just a brief note to announce that Wikistats 2.0 includes data
> about
> > > > > pageviews per project per country for the current month.
> > > > >
> > > > > Take a look, pageviews for Spanish Wikipedia this current month:
> > > > > https://stats.wikimedia.org/v2/#/es.wikipedia.org/reading/
> > > > > pageviews-by-country
> > > > >
> > > > > Data is also available programmatically via APIs:
> > > > >
> > > > > https://wikitech.wikimedia.org/wiki/Analytics/AQS/
> > > > > Pageviews#Pageviews_split_by_country
> > > > >
> > > > > We will be deploying small UI tweaks during this week but please
> > > explore
> > > > > and let us know what you think.
> > > > >
> > > > > Thanks,
> > > > >
> > > > > Nuria
> > > > > ___
> > > > > Wikitech-l mailing list
> > > > > Wikitech-l@lists.wikimedia.org
> > > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > > > ___
> > > > > Wikitech-l mailing list
> > > > > Wikitech-l@lists.wikimedia.org
> > > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > Michael F. Schönitzer
> >
> >
> >
> > Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
> > Tel. (030) 219 158 26-0
> > http://wikimedia.de
> >
> > Stellen Sie sich eine Welt vor, in der jeder Mensch an der Menge allen
> > Wissens frei teilhaben kann. Helfen Sie uns dabei!
> > http://spenden.wikimedia.de/
> >
> > Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
> > Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter
> > der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
> > Körperschaften I Berlin, Steuernummer 27/681/51985.
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikimetrics] Sunsetting Wikimetrics

2019-02-25 Thread Dan Andreescu
Federico, we're talking about *Wikimetrics*, which is a very different
project intended to run stats on groups of users (cohorts) for a specific
set of standard metrics.  This tool was mostly developed for and used by
grantmaking and program teams at WMF and affiliates around the world, but
its use is too low to justify the effort it would take to maintain it.
Better tools are taking its place, and we're working on some of the
infrastructure that will support those better tools.

I think you may have been talking about *Wikistats*, which I agree would've
been very nice to keep maintaining.  I think ultimately we don't have too
many of the skills that made Erik successful at what he did.  Perl was just
one of them, and not a major part in my opinion.
If I'm wrong and you're indeed talking about Wikimetrics, do please
elaborate as we're just starting to make this decision.

On Mon, Feb 25, 2019 at 5:21 PM Federico Leva (Nemo) 
wrote:

> Marcel Ruiz Forns, 22/02/19 21:01:
> > Wikimetrics  development has been frozen
> > since 2017, and we Analytics consider it's time to sunset this tool.
>
> This is sad, for sure. As someone who managed to patch and run the
> original WikiStats scripts despite a limited knowledge of Perl, I think
> it would have been more reasonable than often assumed to continue their
> maintenance. However it's clear that we're well past the time when such
> a decision might have been possible, and it's good to see a growing
> interest in feature parity for the new WikiStats.
>
> I think the purposes of this list can be served by the analytics list
> without too much sweat.
>
> Federico
>
> ___
> Wikimetrics mailing list
> wikimetr...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikimetrics
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] [Data Release] Active Editors by country

2019-11-07 Thread Dan Andreescu
Today we are releasing a new dataset meant to help us understand the impact
of grants and programs on editing.  This data was requested several years
ago, and we took a long time to bring in the privacy and security experts
whose help we needed to release it.  With that work done, you can download
the data here: https://dumps.wikimedia.org/other/geoeditors/ and read about
it here:
https://wikitech.wikimedia.org/wiki/Analytics/Data_Lake/Edits/Geoeditors/Public

You can send questions or comments on this thread or on the discussion page.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki API pageview issue

2020-03-02 Thread Dan Andreescu
There are two hard problems here.  One is historical page titles.  You can
get those from our new dataset (docs here:
https://dumps.wikimedia.org/other/mediawiki_history/readme.html) by
downloading the months you're interested in from
https://dumps.wikimedia.org/other/mediawiki_history/2020-01/enwiki/, and
looking at the history of the pages you're interested in [1].  As others
have mentioned, page histories can sometimes be very complicated; do let us
know if we didn't get it right for the pages you're interested in.  We
worked really hard at vetting the data, but there may be exceptions left
unaccounted for.

The second problem is historical redirects.  Sadly, there is no historical
information about redirect status in the databases, only whether or not the
page is a redirect right now.  To find historical information, we have to
parse the wikitext itself, which is why the answers above are complicated.
We are starting to do this but don't yet have the compute power.
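
As a rough illustration of what that parsing involves (a sketch only, not
our production code; it assumes the English #REDIRECT magic word, and
localized magic words plus other edge cases make the real job harder):

import re

# Sketch only: matches the English "#REDIRECT [[Target]]" form at the start
# of the wikitext.  Real parsing also has to handle localized magic words,
# comments, and other edge cases.
REDIRECT_RE = re.compile(r'^\s*#REDIRECT\s*\[\[([^\]\|#]+)', re.IGNORECASE)

def redirect_target(wikitext):
    # Return the redirect target title, or None if the text isn't a redirect.
    match = REDIRECT_RE.match(wikitext or '')
    return match.group(1).strip() if match else None

# redirect_target("#REDIRECT [[Example target]]") -> "Example target"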

To clarify something from above, the flow of data is like this:

0. Historical aggregate data from 2007-2015, kept for reference but uses a
slightly different counter so not directly comparable
1. Webrequest log flowing in through Kafka
--> pageviews found in the log
--> aggregate data simplified and pushed to the dumps referenced by
bawolff
--> aggregate data loaded into the Pageview API (a part of AQS
referenced by Gergo)
--> mediawiki API queries this to respond to action API queries
about pageviews
--> wmflabs pageviews tool does some crazy sophisticated stuff
on top of the API
2. Wikitext dumps
--> processed and loaded into Hadoop
--> [FUTURE] parsed for content like historical redirects and
published as an API or set of dumps files

[1] As a quick intro, each line is an "event" in this wiki, performed on a
particular "entity" in {page, user, revision}.  The first three fields are
wiki, entity, and event type, so in your case you'd be interested in
looking for lines starting with enwiki--->page--->move ... .  Each line has
the page id, the title of the page as of today, and the title of the page
as of the timestamp on that line, so this way you can collect all titles
for a particular page id or page title.

(if this is useful maybe I should put it on the Phab task about historical
redirects)
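
And, if it helps, here's a very rough sketch in Python of the kind of scan
I mean.  The first three fields are as described above, but the column
positions I use for the page id and titles are placeholders, so check the
readme for the real layout (and the compression of the files you
downloaded) before relying on this.

import bz2

# Placeholders: look up the real column positions in the readme.
PAGE_ID_COL = 3            # hypothetical index of the page id column
TITLE_AS_OF_EVENT_COL = 4  # hypothetical: title as of the event's timestamp
TITLE_TODAY_COL = 5        # hypothetical: title as of today

def titles_for_page(path, wiki, page_id):
    # Scan one downloaded mediawiki_history file (tab-separated; adjust
    # bz2.open if your file uses a different compression) for page move
    # events and collect every title the page has had.
    titles = set()
    with bz2.open(path, 'rt', encoding='utf-8') as f:
        for line in f:
            fields = line.rstrip('\n').split('\t')
            if len(fields) <= TITLE_TODAY_COL:
                continue
            if (fields[0] == wiki and fields[1] == 'page'
                    and fields[2] == 'move'
                    and fields[PAGE_ID_COL] == str(page_id)):
                titles.add(fields[TITLE_AS_OF_EVENT_COL])
                titles.add(fields[TITLE_TODAY_COL])
    return titles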

On Mon, Feb 24, 2020 at 9:50 PM bawolff  wrote:

> On Tue, Feb 25, 2020 at 1:27 AM MusikAnimal  wrote:
>
> > Unfortunately there's no proper log of redirect changes (I recently
> filed <
> > https://phabricator.wikimedia.org/T240065> for this). There are change
> > tags
> >  that identify redirect
> changes
> > -- "mw-new-redirect" and "mw-changed-redirect-target", specifically --
> but
> > I am not sure if this is easily searchable via the action API. Someone on
> > this list might know.
> >
>
> You can do
>
> https://en.wikipedia.org/w/api.php?action=query=2019%E2%80%9320%20Wuhan%20coronavirus%20outbreak=revisions=timestamp|tags|ids|content=max=mw-new-redirect=2=main
> 
> or
>
> https://en.wikipedia.org/w/api.php?action=query=2019%E2%80%9320%20Wuhan%20coronavirus%20outbreak=revisions=timestamp|tags|ids|content=max=mw-changed-redirect-target=2=main
> 
> (You cannot do both in one query, you can only specify one tag at a time).
> Furthermore, it looks like given a revision id, you would have to determine
> where it redirects yourself, which is unfortunate. I suppose you could look
> at
>
> https://en.wikipedia.org/w/api.php?action=parse=941491141=2=text|links
> (taking the oldid as the revid from the other query) and either try and
> parse the html, or just assume if there is only one main namespace link,
> that that is the right one.
>
> Also keep in mind, really old revisions won't have those tags.
>
> --
> Brian
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom meeting 2020-10-14

2020-10-14 Thread Dan Andreescu
This is the weekly TechCom board review, usually in preparation for our
meeting, a few days late this week because I forgot.  Apologies.

Activity since Monday 2020-10-06 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox:

   - T263904 : Are traits part
   of the stable interface?
   - T239742 : "Should npm
   packages maintained by Wikimedia be scoped or unscoped?"
  - Both have been sitting in the inbox for a month or so, Tim first
  pointed this out a few weeks ago but we haven't triaged them yet.

Committee board activity (none)

New RFCs (none)

Phase progression (none)

IRC meeting request (none)

Other RFC activity:


   - T262946:  Bump Firefox
   version in basic support to 3.6 or newer
   - Some agreement between Volker and Timo to keep the scope at just the
  Firefox version bump and move this RFC forward.  I think Volker
is welcome
  to move this back to P4 unless I missed some other ongoing discussion?
   - T259771:  RFC: Drop support
   for database upgrade older than two LTS releases
   - Discussion about upgrading from 1.16 (this humble reader says
  upgrading from a version that's 10 years old should not be a priority)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-10-14

2020-10-14 Thread Dan Andreescu
As for the minutes of our meeting today, Grant joined and we discussed
the Proposal
for a Technical Decision Making Process
<https://docs.google.com/document/d/1LOAi3RjVeWY-QRdOjnfHCchnM8bSh_D3ug9GhnQ6Nf0/edit#>.
Specifically, we went over Scope, Transition, Resourcing, and a bit about
the Forum and Timeboxing.  The next steps are to establish a more concrete
transition plan, from the current Tech Com process to the new process.  And
Kate will be reaching out to team managers to begin establishing the Forum.

On Wed, Oct 14, 2020 at 9:58 PM Dan Andreescu 
wrote:

> This is the weekly TechCom board review, usually in preparation for our
> meeting, a few days late this week because I forgot.  Apologies.
>
> Activity since Monday 2020-10-06 on the following boards:
>
> https://phabricator.wikimedia.org/tag/techcom/
> https://phabricator.wikimedia.org/tag/techcom-rfc/
>
> Committee inbox:
>
>- T263904 <https://phabricator.wikimedia.org/T263904>: Are traits part
>of the stable interface?
>- T239742 <https://phabricator.wikimedia.org/T239742>: "Should npm
>packages maintained by Wikimedia be scoped or unscoped?"
>   - Both have been sitting in the inbox for a month or so, Tim first
>   pointed this out a few weeks ago but we haven't triaged them yet.
>
> Committee board activity (none)
>
> New RFCs (none)
>
> Phase progression (none)
>
> IRC meeting request (none)
>
> Other RFC activity:
>
>
>- T262946: <https://phabricator.wikimedia.org/T262946> Bump Firefox
>version in basic support to 3.6 or newer
>- Some agreement between Volker and Timo to keep the scope at just the
>   Firefox version bump and move this RFC forward.  I think Volker is 
> welcome
>   to move this back to P4 unless I missed some other ongoing discussion?
>- T259771: <https://phabricator.wikimedia.org/T259771> RFC: Drop
>support for database upgrade older than two LTS releases
>- Discussion about upgrading from 1.16 (this humble reader says
>   upgrading from a version that's 10 years old should not be a priority)
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Update on abstract schema and schema changes

2020-10-11 Thread Dan Andreescu
I hadn’t followed along with this work and this summary is amazing, thank
you so much!

On Sat, Oct 10, 2020 at 21:14 Amir Sarabadani  wrote:

> Hello,
> It has been a while since I gave an update on the state of abstracting
> schema and schema changes in mediawiki
> . So here's a really long one.
>
> So far around half of the mediawiki core tables have been migrated to
> abstract schema (plus lots of extensions like Wikibase, Babel, Linter,
> BetaFeatures, etc.). Special thanks to Tgr for reviewing most of the
> patches and Sam Reed and James Forrester for doing the extensions.
>
> With the growing number of schemas being abstracted, this is going to
> affect your development if you work on schema and schema changes in core or
> any of the extensions. So If you do, please read Manual:Schema changes
>  in mediawiki.org
>
> You might think that abstraction is just migrating SQL to JSON but it's
> much more, we are making the database schema of mediawiki much more
> consistent, We are basically addressing several long standing issues like
> T164898  and T42626
>  as well.
>
> *Improvement aspects*
>
> First aspect is drifts between different DBMSes. Sqlite schema is being
> produced by regex replacement (this code
> )
> which is less than great but at least it comes from one place. For
> Postgres, its schema and MySQL/Sqlite has drifted so drastically, that
> fixing it so far required 76 schema changes fixing issues ranging from
> missing indexes to missing PKs, extra AUTO_INCREMENT where it shouldn't be,
> missing DEFAULT values, drifting data types and much more.  You can follow
> the fixes of Postgres in here .
>
> The second aspect is the inconsistency in the schema itself. How do we
> model strings? VARCHAR? VARBINARY()? VARCHAR() BINARY? (all three are
> different things). You'd be surprised how inconsistent our MySQL is. So
> far, we are migrating all VARCHAR() BINARY fields to VARBINARY() (so far
> ten schema changes).
>
> Another inconsistency is timestamps. In MySQL, around half of them are
> BINARY(14) and the other half VARBINARY(14) (but in Postgres all are
> TIMESTAMPTZ), there is even a ticket
>  about it. It makes sense to
> migrate all of them to BINARY(14) but not all timestamps are 14 characters,
> e.g. expiry fields accept "infinity" as value and it's a valid timestamp in
> Postgres ¯\_(ツ)_/¯ When you turn an expiry field to BINARY(14), "infinity"
> becomes "  infinity" and as the result mediawiki doesn't recognize it
> as infinity ("infinity" != "  infinity"). There are several ways to
> move forward handling expiry fields, you can follow the discussion in this
> gerrit patch .
>
> Another fun aspect: Booleans. MySQL doesn't have boolean, it translates
> them to TINYINT(1) but other DBMSes don't have TINYINT, they have SMALLINT
> and BOOL though (and we mostly use SMALLINT for them), we decided to go
> with SMALLINT for these cases (which is different than what Doctrine DBAL
> does, it uses BOOL, so we introduced our own custom type for booleans).
>
> Last but not least: ENUMs. MySQL and Postgres support that but Sqlite
> doesn't. Doctrine DBAL doesn't support ENUM at all (as it's an
> anti-pattern) while core has eight fields that are ENUM. There's an RFC to
> discourage using it in general. Feel free to comment on it.
> 
>
> A miscellaneous note: The directories that hold the archive of sql patches
> of schema change are exploding (some of the sql patches are even orphan but
> we can't find them because there are so many of them). So I started a RFC
> to clean that mess up: Drop support for database upgrade older than two
> LTS releases 
>
> *What's next?*
>
>-  We continue to migrate more tables, hopefully we will get two third
>of them by the end of the year (fingers crossed). You can follow the
>progress in its ticket .
>-  We will support abstract schema changes, really soon, like in a
>couple of weeks. Basically you start a json file containing snapshots of
>before and after of a table and then a maintenance script will produce the
>needed sql patches for you for different schemas. This will increase the
>developer productivity drastically, since 1- Schema change sql files become
>more reliable and consistent and less prone to errors like adding the
>column to the wrong table in some DBMSes
>
> 

Re: [Wikitech-l] pageview dumps for 2020-09?

2020-10-06 Thread Dan Andreescu
Hi Gabriel,

My team maintains this dataset.  We have been working on a new version
which will deprecate all other pagecount and pageview datasets.  Release of
this new dataset is imminent.  When we saw that the old job had failed, we
couldn't fix the problem quickly, and we have been thinking it may not be
worth the effort since the new data is coming out soon.

However, if you have an immediate need, do let us know.  In other words, if
you can wait a week or two and tweak your scripts to work with the new
data, that would probably be best.  Otherwise, let us know and we can look
at the failure more closely.

On Tue, Oct 6, 2020 at 3:37 PM Gabriel Altay 
wrote:

> Apologies if this is the wrong list, but it appears that the page views
> dumps for september were stalled on the 24th
>
> https://dumps.wikimedia.org/other/pagecounts-ez/merged/2020/2020-09/
> 
>
> and the full month dump (pagecounts-2020-09-views-ge-5-totals.bz2) was not
> generated,
> https://dumps.wikimedia.org/other/pagecounts-ez/merged/
>
> does anyone know what the situation is?
>
> best,
> -Gabriel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Tech Com Board Grooming 2020-08-26

2020-08-26 Thread Dan Andreescu
Activity since Monday 2020-08-26.

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox: none

Committee board activity:

   - Timo updated T253461 Liberate the @ for AtEase
   

New RFCs:

   - Created / closed / opened / closed all since last week: T261133 Ban IP
   edits on pt.wiki .  Basically
   this was a policy discussion, not a technical RFC

Phase progression: none

IRC meeting request: none

Other RFC activity:

   - Discussion spawned from T240884 (evaluate user-provided regex)
    to T260330 (the PHP service
   Tim's building)  around
   encoding parameters, security, and more
   - Product Data Engineering team shows interest in resourcing RFC: Better
   interface for generating metrics in MediaWiki
    (that would be an
   interesting centralization of all things "metrics", whether system level or
   high level)
   - Development starts on RFC: Parsoid Extension API
   



NOTE: The technical committee is hoping for wider and less formal
participation, starting with this message.  Feel free to chime in, we'll
start a meta-thread about this soon.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l]  Wikimedia production errors help

2020-09-16 Thread Dan Andreescu
>
> For example, of the 30 odd backend errors reported in June, 14 were still
> open a month later in July [1], and 12 were still open – three months later
> – in September. The majority of these haven't even yet been triaged,
> assigned assigned or otherwise acknowledged. And meanwhile we've got more
> (non-JavaScript) stuff from July, August and September adding pressure. We
> have to do better.
>
> -- Timo
>

This feels like it needs some higher level coordination.  Like perhaps
managers getting together and deciding production issues are a priority and
diverting resources dynamically to address them.  Building an awesome new
feature will have a lot less impact if the users are hurting from growing
disrepair.  It seems to me like if individual contributors and maintainers
could have solved this problem, they would have by now.  I'm a little
worried that the only viable solution right now seems like heroes stepping
up to fix these bugs.

Concretely, I think expanding something like the Core Platform Team's
clinic duty might work.  Does anyone have a very rough idea of the time it
would take to tackle 293 (wow we went up by a dozen since this thread
started) tasks?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-11-25

2020-11-25 Thread Dan Andreescu
Quick reminder that the IRC meeting is starting in 10 minutes.  We will
collaborate on:

RFC: Provide mechanism for configuration sets for development and tests
https://phabricator.wikimedia.org/T267928

On Wed, Nov 25, 2020 at 4:00 PM Krinkle  wrote:

> This is the weekly TechCom board review in preparation of our meeting on
> Wednesday. If there are additional topics for TechCom to review, please let
> us know by replying to this email. However, please keep discussion about
> individual RFCs to the Phabricator tickets.
>
> Activity since Monday 2020-11-02 on the following boards:
>
> https://phabricator.wikimedia.org/tag/techcom/
> https://phabricator.wikimedia.org/tag/techcom-rfc/
>
> Committee inbox:
>
>- T268328: Automatically index extensions in Codesearch
>
>   - Daniel is raising that people effectively use Codesearch to guide
>   deprecation efforts under the Stable Interface policy. As such, we should
>   define what inclusion criteria it has (or should have), and simply or
>   document how to implement that in practice through adding and removing
>   repositories from its index (esp those not hosted by Wikimedia).
>- T267085 : Clarify
>deprecation of method overrides
>   - A question about the stable interface policy.
>
> Committee board activity:
>
>- T175745 : Do not
>overwrite edits when conflicting with self
>   - Some renewed interest on this question about how MW should handle
>   when e.g. someone starts editing the same page from multiple tabs and 
> then
>   submits those edits.
>- T227776 : General
>ParserCache service class
>   - Addshore asking for an update.
>
> New RFCs:
>
>- T268326: RFC: Amendment to the Stable interface policy (Nov 2020)
>
>   - Proposal by Daniel, to:
>  - … fill some gaps (e.g. traits, and member fields).
>  - … allow for removal without (released) deprecation if it is
>  unused in code we know about and is considered "maintained". Input 
> welcome.
>
> Phase progression:
>
>- T266866 RFC : Bump Basic
>browser support to require TLS 1.2 for MediaWiki core
>   - Ed lists which Web APIs and other browser capabilities would
>   become safe to use in the base layer (HTML/CSS), as well as some JS
>   features that will automatically become available to Grade A.
>   - Ed confirmed TLS 1.2 mapping to browser names/versions.
>   - Moved to Phase 3: Explore.
>- T260330 RFC: PHP microservice for containerized shell
>
>   - Moved to Last Call last week, until 2 December (next week).
>   - Tim answered and added a section to clarify the backwards
>   compatible nature of the PHP interface in core, for third-parties that
>   would not or have not installed Shellbox.
>- T259771: RFC: Drop support for database upgrade older than two LTS
>
>   - Last week's concerns about detection and failure prevention have
>   been answered by Amir.
>   - The Platform Engineering Team has filled the ownership gap for
>   this policy.
>   - Moved to Phase 4: Tune.
>
> IRC meeting request:
>
>
>- Later today (Wed 25 Nov), this RFC will be discussed in
>#wikimedia-office on Freenode IRC:
>RFC: Provide mechanism for configuration sets for development and tests
>https://phabricator.wikimedia.org/T267928
>
>
> Other RFC activity:
>
>- T263841 RFC : Expand API
>title generator to support other generated data
>   - Rescoped from potential software change to policy update.
>   - Awaiting resourcing from core API steward to confirm support,
>   risk, compatibility as proposed.
>- T250406 RFC: Hybrid extension management
>
>   - Conversation about what we would need to commit to for WMF
>   software, and seeking placing and approval of said resourcing.
>- T119173: RFC: Discourage use of MySQL ENUM type
>
>   - Next step is for the consensus to be turned into concrete wording
>   for the policy.
>- T40010: RFC: Re-evaluate librsvg as SVG renderer for WMF wikis
>
>   - Some general clarifications, and statistics from production.
>
>
> -- Timo
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org

Re: [Wikitech-l] TechCom meeting 2020-12-02

2020-12-03 Thread Dan Andreescu
>
>
>- Create WikiTeq group on Gerrit
> is now more clear (see note
> from Daniel  to
> discuss, which we should do here)
>
> Committee board activity:
>
> The process for adding a "trusted organization" isn't clear, but since the
> group is not maintaining any extensions deployed by wmf, we can probably
> just go ahead and add them.
>
+1

>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] TechCom meeting 2020-12-02

2020-11-30 Thread Dan Andreescu
This is the weekly TechCom board review. If there are additional topics for
TechCom to review, please let us know by replying to this email. However,
please keep discussion about individual RFCs to the Phabricator tickets.
This week, the meeting is async, so we'll be replying to the email as
well, *feel
free to join in*!

Activity since Wednesday 2020-11-25 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox:

   - Automatically index extensions in Codesearch
    is still in the inbox, no
   activity since last week's discussion
   - Create WikiTeq group on Gerrit
    is now more clear (see note
   from Daniel  to
   discuss, which we should do here)

Committee board activity:

   - General ParserCache service class for large "current" page-derived data
    was declined by Daniel, see
   his reasoning there

New RFCs: (none)

Phase progression: (none)

IRC meeting request: (none)

Other RFC activity:

   - RFC: Re-evaluate librsvg as SVG renderer for WMF wikis
    saw more conversation
   - RFC: Discourage use of MySQL's ENUM type
    Amir points out the need to
   overhaul all db documentation (I agree, could we have a doc sprint about
   this?) including policies
   - RFC: Amendment to the Stable interface policy, November 2020
    has a suggestion/question
   about clarifying default status and deprecation procedure
   - RFC: Store WikibaseQualityConstraint check data in persistent storage
    WMDE calls for help from
   TechCom and the Platform team to collaborate here.  See Adam Shorland's
   comment about putting it through the new Technical Decision Making Process
   - RFC: Expand API title generator to support other generated data
    Carly Bogen is asking for
   clarity on the need for a steward.  Timo's comment
    on the task indeed
   seem to contradict his note from last week's grooming email.  Timo, would
   you please clarify.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

