Re: [Wikitech-l] [Wmfall] VisualEditor on Wikipedia now faster with RESTBase
Fantastic work! :) VisualEditor is becoming really zippy -- which had been one of the top concerns in user feedback in the past. Congratulations to everyone involved. Erik -- Erik Möller VP of Product Strategy, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Starting conversion of LiquidThreads to Flow at mediawiki.org
On Mon, Mar 16, 2015 at 11:21 PM, Kevin Wayne Williams kwwilli...@kwwilliams.com wrote:
> There doesn't seem to be any particular user demand to adopt Flow, so there's no reason to believe it will gain any more traction than LQT ever did.

There was significant community interest and momentum behind LQT, including various votes to enable it [1], and there is significant interest in Flow now [2]. The main thing that prevented LQT from wider adoption was not lack of community interest; it was our decision to put the project on hold due to both major architectural concerns and resource constraints at the time. We've committed to providing an upgrade path, and this is our follow-through on that commitment.

Our main objective in Flow development is to solve progressively more challenging collaboration/conversation use cases well, and to demonstrate positive impact at increasing scale, with the goal of providing a better experience for new and experienced editors alike. We recognize that we still have a long way to go, but we can already demonstrate that the system does one thing well, which is to make the process of using talk pages much more understandable for new users: https://www.mediawiki.org/wiki/Flow/Moderated_Testing,_November,_2014:_talk_pages_and_Flow

We're also seeing, as Nick pointed out, that users in a mentorship use case are more likely to follow up with their mentors. This is a pretty big deal -- quantitative research shows that this type of mentorship improves engagement and retention of new users. [3] So mentorship is an obvious early-stage use case, even if the rest of a community functions through traditional talk pages. Village pump type pages that are fairly distinct from article talk pages are another obvious use case where a more forum-like system can relatively quickly outperform the talk-page-based approach, which is rife with edit conflicts and other annoyances.
We are trialing the first such use case in Catalan with lots of community participation.

As for inconsistency and fragmentation of mediawiki.org: if anything, the conversion of LQT pages on mediawiki.org will create greater consistency, as we're already using Flow on Beta Features talk pages ( https://www.mediawiki.org/wiki/Talk:Content_translation is a nice example of a feedback page with lots of continuous and substantive comments from experienced users).

Flow may not serve major use cases on English Wikipedia today, or tomorrow; that's okay. Smaller projects are often happy to adopt technologies that may not yet meet the expectations of a large, mature community like en.wp, because they may be more concerned with the experience of new users than with the risks or inconveniences associated with features in earlier stages of development. (I am not dismissing either risks or inconveniences in saying so, as the requirements do of course differ at different scales.) We, in turn, remain committed to building tools that serve users well, continuously improving, and continuously demonstrating value through data and qualitative validation. [4]

This is but a small step, but it's an important one.

Erik

[1] https://phabricator.wikimedia.org/search/query/radjv9rJZNLU/#R
[2] https://www.mediawiki.org/wiki/Flow/Rollout
[3] http://arxiv.org/pdf/1409.1496v1.pdf
[4] What we learn is summarized in our quarterly reviews, most recently: https://upload.wikimedia.org/wikipedia/commons/5/5f/Collaboration_Q3_2014-15_WMF_Quarterly_Review.pdf

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] [Engineering] Wikimedia REST content API is now available in beta
On Tue, Mar 10, 2015 at 3:29 PM, James Forrester jforres...@wikimedia.org wrote:
> Congratulations, Services team, and all those who've helped you get to this point. This is a huge milestone and I'm so happy we've reached it. It'll be hugely valuable for Mobile Web, Mobile Apps, VisualEditor and dozens of other projects. Thank you!

Well said. Very excited about the possibilities -- and great to see Swagger in action, as well. :)

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] [Wikitech-ambassadors] Global user pages deployed to all wikis
Thanks for all _your_ work seeing this through to the finish line as well, Kunal. This is a great first step towards better user profile support, and it brings all Wikimedia wikis closer together.

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
[Wikitech-l] Use of hreflang in the head
Hi folks,

MediaWiki emits the hreflang attribute on language links, but only as part of the links in the body, and not in the head as recommended by Google [1]. The result of this is that Google (and possibly other search engines) doesn't interpret the hreflang attribute for purposes of prioritizing search results in the user's own language. From a contact at Google we asked about this:

> we currently don't use those annotations on the links, we need to see the hreflang link-elements in the head in order to understand that connection. The important parts there are that we need to have them in the head, we need to have them confirmed from the other versions (so DE needs to refer to EN, and EN to DE -- it can't be one-sided), and it needs to be between the canonical URLs. (...) I imagine if you just added the cross-links as you have them in the sidebar as link-elements to the head then you'd be covered.

This of course would add some additional payload to pages with lots of language links, but could help avoid results like [2], where the English-language version of an article is #1 and the Indonesian one makes no appearance at all. Results vary greatly and it's hard to say how big a problem this is, but even if it boosts discoverability of content in the user's language by only 10% or so, that would still be a pretty big win for local content.

I'm curious if folks see any downside, other than the additional page payload, in adding this information to the page header. Given the time it takes for the index to be updated, we should be careful about any potential negative consequences.

Thanks,
Erik

[1] https://support.google.com/webmasters/answer/189077?hl=en
[2] https://www.google.co.id/?gws_rd=ssl#q=edison

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
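To make the Google guidance above concrete, here is a minimal sketch of what the emitted head elements would look like. The page-to-URL mapping is invented illustration data, not actual MediaWiki langlinks output, and the helper function is hypothetical:

```python
# Hypothetical sketch: emit bidirectional hreflang <link> elements for the
# document head, per the guidance quoted above. The langlinks mapping below
# is example data, not output of any real MediaWiki API.
langlinks = {
    "en": "https://en.wikipedia.org/wiki/Thomas_Edison",
    "de": "https://de.wikipedia.org/wiki/Thomas_Alva_Edison",
    "id": "https://id.wikipedia.org/wiki/Thomas_Alva_Edison",
}

def hreflang_head_links(links):
    """Return <link rel="alternate"> tags, one per language version.

    Every version is listed, so each language's head can refer to all the
    others -- the annotations must be mutual and must point at canonical URLs.
    """
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(links.items())
    )

print(hreflang_head_links(langlinks))
```

The payload concern mentioned above is visible here: an article with 200 language links would gain 200 such lines in its head.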
Re: [Wikitech-l] Phabricator search
On Thu, Feb 5, 2015 at 7:50 PM, MZMcBride z...@mzmcbride.com wrote:
> Perhaps if Titan/Wikidata Query Service development is on hold, Nik could investigate this?

Not really on hold -- just looking at alternatives to Titan. But this is a pretty critical issue for all our dev workflows, so we would really appreciate help from our search gurus in resolving it.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js
On Wed, Feb 4, 2015 at 8:41 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
> Regarding general-purpose APIs vs. mobile: I think mobile is in some ways a special case as their content transformation needs are closely coupled with the way the apps are presenting the content. Additionally, at least until SPDY is deployed there is a strong performance incentive to bundle information in a single response tailored to the app's needs.

A notion of schemas that declare a specific set of transformations to be applied/not applied might help avoid overcomplicating things early on, while addressing different transformation needs even within the growing number of mobile use cases (Android app alpha/beta/stable, iOS app alpha/beta/stable, mobile web alpha/beta/stable, third-party apps), and potentially making code reusable for desktop needs down the road. Since the number of schemas would be limited, and specifying the correct schema would result in a single response, performance could be optimized for each use case.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js
On Tue, Feb 3, 2015 at 5:46 PM, Dan Garry dga...@wikimedia.org wrote:
> To address these challenges, we are considering performing some or all of these tasks in a service developed by the Mobile Apps Team with help from Services. This service will hit the APIs we currently hit on the client, aggregate the content we need on the server side, perform transforms we're currently doing on the client on the server instead, and serve the full response to the user via RESTBase. In addition to providing a public API end point, RESTBase would help with common tasks like monitoring, caching and authorisation.

Using https://phabricator.wikimedia.org/T87824 as a reference point for what you're talking about: I think you will generally find agreement that moving client-side transformations that only live in the app to server-side code that enables access by multiple consumers and caching is a good idea. If there are reasons not to do this, now'd be a good time to speak up.

If not, then I think one thing to keep in mind is how to organize the transformation code in a manner that doesn't just become a server-side hodgepodge still only useful to one consumer, to avoid some of the pitfalls Brian mentions. Say you want to reformat infoboxes on the mobile web, but not do all the other stuff the mobile app does. Can you just get that specific transformation? Are some transformations dependent on others? Or say we want to make a change only for the output that gets fed into the PDF generator, but not for any other outputs. Can we do that? Or, a more pressing concern for the app team itself: what about the alpha, beta and stable versions of the apps -- how would those get more/less experimental versions of the output? Or languages -- are there cases where we apply a transformation only in one language, but not another? Do we need a way to register schemas so we can easily get a certain set of inter-dependent transformations, like mobile app stable, desktop web, etc.?
Or are these all just API/service parameters? Just some early questions as we're thinking this through.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
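The registered-schemas idea from the questions above can be pictured roughly as follows. This is only a sketch of the concept; every transformation name and schema name here is invented for illustration and is not part of any actual RESTBase or mobile-apps API:

```python
# Hypothetical sketch: "schemas" as named bundles of named transformations,
# so each consumer (mobile app stable, PDF generator, ...) requests one
# schema and gets exactly its pipeline. All names below are invented.

def strip_references(html):
    # Example transform: drop reference markers for print-style output.
    return html.replace("<ref/>", "")

def reformat_infoboxes(html):
    return html  # placeholder transform

def remove_audio_players(html):
    return html  # placeholder transform

TRANSFORMS = {
    "strip_references": strip_references,
    "reformat_infoboxes": reformat_infoboxes,
    "remove_audio_players": remove_audio_players,
}

# A schema is an ordered pipeline, so inter-dependent transformations run
# in a defined sequence, and each consumer's schema can evolve independently.
SCHEMAS = {
    "mobile-app-stable": ["reformat_infoboxes", "remove_audio_players"],
    "mobile-web": ["reformat_infoboxes"],
    "pdf-generator": ["strip_references"],
}

def apply_schema(schema_name, html):
    for name in SCHEMAS[schema_name]:
        html = TRANSFORMS[name](html)
    return html
```

In this framing, "just get that specific transformation" means requesting a narrow schema, and alpha/beta/stable variants would simply be distinct schema entries.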
[Wikitech-l] DevSummit appreciation
Just a quick note that I really appreciated everyone's help in making the summit come together. As always, we'll be doing lots of second-guessing of everything we did and didn't do, and how we want to use future time together. Before we go into that, I'd like to thank the event team and _everyone_ who worked to and beyond the point of exhaustion to organize the event, support attendees, plan sessions, facilitate conversations, and negotiate sometimes difficult terrain. Thank you. :)

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Brion's role change within WMF
Hooray! :-) You'll do great things, as always. Looking forward to your focused leadership in this area.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] The future of shared hosting
On Fri, Jan 16, 2015 at 1:14 PM, Ryan Lane rlan...@gmail.com wrote:
>> On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:
>>> What you're forgetting is that WMF abandoned MediaWiki as an Open Source project quite a while ago (at least 2 years ago).
>>
>> {{citation needed}}
>
> There was a WMF engineering meeting where it was announced internally. I was the only one that spoke against it. I can't give a citation to it because it was never announced outside of WMF, but soon after that, third-party support was moved to a grant-funded org, which is the current status quo.

I think the confusion between third-party support and an open source project is unhelpful. We're obviously an open source project with lots of contributors who aren't paid (and many of them are motivated by Wikimedia's mission); it's just that the project puts primary emphasis on the Wikimedia mission, and only secondary emphasis on third-party needs. I don't think it's a dirty secret that we moved to a model of contracting out support for third parties -- there was even an RFP for it. ;-)

Whether this is the best model, or whether it's time to think about alternatives, is always up for debate. We just have a legitimate tension between needing to focus as an org on what donors are supporting us for, vs. potentially very tangentially related needs (PostgreSQL support for some enterprise wiki installation).

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Fwd: No more Architecture Committee?
On Thu, Jan 15, 2015 at 10:57 PM, Tim Starling tstarl...@wikimedia.org wrote:
> Sorry to labour the point, but the way to go about this at present is pretty straightforward, and it doesn't involve the architecture committee. You just convince the management (Damon, Erik, etc.) that it is a good thing to do, get yourself appointed head of the MediaWiki 2.0 team, hire a bunch of people who agree with your outlook, get existing engineers transferred to your team.

Yeah, there's a lot of truth to that (though there are also a lot of opportunities to circumvent organizational structure). WMF is a hierarchical org that operates internally through pretty conventional decision-making structures. On the flip side, the org has increasingly pushed to give individuals a very high degree of latitude in pushing and promoting projects, and leading them to a conclusion (hence the project leads on the top priorities and such).

Within the organizational pattern we use, the way to complement Damon's and my roles is usually with a CTO who has deep technical experience, commitment, and ongoing day-to-day involvement writing code, leading projects, and driving architectural change. That person may have some of the most senior engineers in the org reporting to them, and has a serious seat at the table in driving projects that satisfy highly technical concerns. I'm supportive of such a role (if properly defined and socialized), because in the model we're operating in, it seems one of the best ways to complement the team structure.

I'm also supportive of experimenting with less conventional models, like creating more official representation for an Architecture Committee in management decisions. Gravity is going to pull us more towards the conventional ways of doing things (it always does), so if we want to promote a different idea, we need to start articulating and refining it.
Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Fwd: No more Architecture Committee?
On Thu, Jan 15, 2015 at 9:31 PM, Ori Livneh o...@wikimedia.org wrote:
>> On the leadership front, let me throw out a hypothetical: should we have MediaWiki 2.0, where we start with an empty repository and build up? If so, who makes that decision? If not, what is our alternative vision? Who is going to define it? Is what we have good enough?
>
> Let's throw out that hypothetical, because it's too grotesque even as a conversation starter.

And for the record, I agree with this. Full rewrites suck.

> The model I do think we should consider is Python 3. Python 3 did not jettison the Python 2 codebase. The intent behind the major version change was to open up a parallel development track in which it was permissible to break backward-compatibility in the name of making a substantial contribution to the coherence, elegance and utility of the language.

This is a more interesting model -- especially if done in parallel with radical experiments creating new workspaces for content. Imagine 1) a version of MediaWiki where a lot of stuff is ripped out ruthlessly and new features are added more quickly, 2) which powers a site that exists for creating, say, article drafts that can be imported into Wikipedia. That's the kind of thing I get very excited about.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] [Ops] All non-api traffic is now served by HHVM
This is fantastic -- kudos to everyone for pushing to get this across the finish line. Making editing faster (and improving general site responsiveness) is one of the most obvious things we can do that serves every single contributor to our projects. We've still got lots more that we can do in this area, but HHVM is a huge, huge win.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
[Wikitech-l] Phabricator outage due to network issues, 11/29
As noted in the server admin log [1], Phabricator is currently down due to a network outage impacting one of our racks in the Ashburn data center. We're investigating and will aim to restore service ASAP.

Erik

[1] https://wikitech.wikimedia.org/wiki/Server_Admin_Log

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Revision metadata as a service?
On Wed, Nov 5, 2014 at 11:21 AM, Gabriel Wicke gwi...@wikimedia.org wrote:
> What are the indexing requirements for this metadata? If fast access by specific properties is needed

Most typically, I'm guessing you'd do stuff on a per-revision basis to show quality indicators and such on page histories or article pages via opt-in gadgets. Querying the entire corpus for articles with certain characteristics would be valuable though, especially for applications like offline exports.

I just saw https://meta.wikimedia.org/wiki/Grants:IEG/Revision_scoring_as_a_service -- I wasn't even aware of that when I wrote the email. There's definitely a lot of interest in a generic solution to this problem.

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
[Wikitech-l] Revision metadata as a service?
Hi folks,

there are many projects which have an interest in generating and querying metadata for specific revisions:

- community efforts to annotate the quality of specific articles
- researchers analyzing revision contents (e.g. to derive quality heuristics, perform citation analysis, etc.)
- application developers wanting to display/use such metadata

We've gotten inquiries e.g. from WikiProjects and researchers over the years. I'm wondering if a lightweight service that satisfies the following requirements might be a good idea:

- community-created schemas (similar to the EventLogging schemas on meta)
- basic per-user authentication/authorization
- basic namespacing (e.g. WikiProject Medicine:Quality refers to a specific schema + specific permissions)

If such a service existed, community members, researchers and occasionally WMF itself could create their own tools/gadgets that use this service, perhaps with a lightweight global approval process.

If this seems like a good idea, I'd be curious about implementation strategies -- are we blocked on something like SOA Auth [1] to implement this as a standalone service? My sense is that you'd want to pull this out of MediaWiki for maximum flexibility and simplicity.

Thanks,
Erik

[1] https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
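The three requirements above fit together roughly like this. A rough sketch only: the schema key, field names, user names and revision id below are all hypothetical illustration data, and no actual WMF service works this way:

```python
# Hypothetical sketch of the proposed service's core idea: namespaced,
# community-defined schemas plus basic per-user authorization for writes.
# Every concrete name below is invented for illustration.

schemas = {}      # "Namespace:Name" -> (allowed field names, allowed users)
annotations = {}  # (schema key, revision id) -> metadata dict

def register_schema(key, fields, allowed_users):
    """A key like "WikiProject Medicine:Quality" bundles a schema with
    the permissions attached to that namespace."""
    schemas[key] = (set(fields), set(allowed_users))

def annotate(user, key, rev_id, metadata):
    """Store per-revision metadata, enforcing both schema and permissions."""
    fields, allowed = schemas[key]
    if user not in allowed:
        raise PermissionError("%s may not write to %s" % (user, key))
    if not set(metadata) <= fields:
        raise ValueError("metadata does not match schema %s" % key)
    annotations[(key, rev_id)] = metadata

register_schema("WikiProject Medicine:Quality",
                fields=["assessment", "citations_checked"],
                allowed_users=["ExampleEditor"])
annotate("ExampleEditor", "WikiProject Medicine:Quality", 654321,
         {"assessment": "B", "citations_checked": True})
```

A gadget would then read annotations per revision id for page histories, while researchers could scan the whole store -- the two access patterns mentioned in the follow-up thread.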
Re: [Wikitech-l] Looking for project status updates
On Wed, Oct 22, 2014 at 11:56 PM, Pine W wiki.p...@gmail.com wrote:
> It's been a while since I've noticed a status update from the Growth team through the EE list or this list. The Meta page implies that the Growth team disbanded on October 3. Is that true, and if so, does WMF still have a single person leading tech-based growth initiatives?

Yes, the Growth team was disbanded, and the engineers working on this team are now supporting Mobile (Rob Moen, Sam Smith) and Flow (Matt Flaschen). We'll be updating wiki pages as we go, but help is welcome. Terry wrote a piratey blog post about his departure here: http://terrychay.com/article/fair-winds-and-godspeed-me-hearties.shtml

> Also, who is PMming Winter?

Winter is a way for the UX team to prototype ideas, not a product. It's a useful way for us to experiment with ideas like the right rail concept for moving some structured info out of the main content area, and a fixed header. We're discussing how we want to explore some of these ideas further.

My bias right now is to look at this through a mobile-first lens as much as possible, i.e. for completely new UI ideas to be validated on mobile before moving them all the way to desktop scale. This helps ensure that we're getting UI/UX patterns right for multiple device categories and capabilities from the start, rather than handling mobile as an afterthought. However, the current focus of the mobile web team is to increase contributions. To keep a high velocity of experiments, we're not currently doing anything major re: the above.

> It would be helpful to have a unified high-level overview of the status, relationships, plans, strategic goals, and contacts for projects like:

Our focus has been to shift to and fully adopt the new quarterly prioritization that Lila has pushed for, and to get that piece right before everything else.
This has included project leads for each priority, and more systematic resourcing/trade-off conversations to ensure that every project lead has the support they need. This page will continue to get further attention to flesh out metrics and deliverables: https://www.mediawiki.org/wiki/Wikimedia_Engineering/2014-15_Goals/Q2

Teams that are not mapped out here are being asked to revise their (remaining) Q2 roadmap on the overall goals page, here: https://www.mediawiki.org/wiki/Wikimedia_Engineering/2014-15_Goals

The overall process is still running late and there's lots of context to be fleshed out -- since we're changing the cadence and nature of planning in pretty fundamental ways, it's not running like clockwork just yet. In addition, I am working on aligning the goal-setting, prioritization and review/reporting processes. This will likely mean a shift away from monthly status updates to quarterly ones (while over time, I'd like to have more user-centric updates like the VisualEditor newsletter and more consistent updates to Tech News).

Erik

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] Tor and Anonymous Users (I know, we've had this discussion a million times)
On Tue, Sep 30, 2014 at 2:33 PM, Federico Leva (Nemo) nemow...@gmail.com wrote:
>> There must be a way that we can allow users to work from Tor.
>
> RESOLVED FIXED http://meta.wikimedia.org/wiki/NOP

Not quite; if your _only_ means of access is Tor and you have no prior editing history to point to (which may be the situation if you're in a country where Internet access is heavily censored/monitored), this process is currently quite restrictive in terms of actually granting global exemptions, as previously demonstrated. [1]

We've had this conversation a few times and I'd love to see creative approaches to a trial/pilot, with data driving future decisions. But given that the global exemption process is entirely a community (steward) process, it's not clear to me that WMF can/should do very much here directly. I also don't think it's really a technical problem first and foremost. It clearly is the kind of problem where people do like to _look_ for clever technical fixes, which is why it's a recurring topic on this list.

As a social problem, I stick with my original suggestion [2]: relax the global exemption rules a bit, monitor globally exempt accounts for abuse and constructive activity, and try to determine whether the cost/benefit ratio of relaxed rules is worth it. This could be done as a time-limited trial (say 30 days), and requires no new technology. If the cost/benefit ratio actually is worse, there are many non-technical ways to raise the barrier while still having a clearer path to success for sufficiently motivated people than exists today (say, the well-worn tool all bureaucracies use to manage intake: "fill out this form").

As Derric pointed out, as a policy issue it's a bit OT here, though it requires people who understand the full technical complexity to make a cogent case for a pilot on Meta and elsewhere. IOW -- I think many of the people who've been talking on this list about this issue share the right end goal, but this is the wrong target audience.
Erik

[1] https://lists.wikimedia.org/pipermail/wikitech-l/2014-January/074049.html
[2] https://lists.wikimedia.org/pipermail/wikitech-l/2014-January/074070.html

-- 
Erik Möller
VP of Product Strategy, Wikimedia Foundation
Re: [Wikitech-l] [Wikitech-ambassadors] Invitation to beta-test HHVM
Ori and... Aaron Schulz, Alexandros Kosiaris, Brad Jorsch, Brandon Black, Brett Simmers, Bryan Davis, Chad Horohoe, Chris Steipp, Erik Bernhardson, Faidon Liambotis, Filippo Giunchedi, Giuseppe Lavagetto, Greg Grossmeier, Jack McBarn, Katie Filbert, Kunal Mehta, Mark Bergsma, Max Semenik, Niklas Laxström, Rob Lanphier, and Tim Starling... amazing work, everyone. This is a huge milestone and it's wonderful to see the finish line come closer. Making and keeping our sites fast is hugely important, and all evidence so far suggests that HHVM will be one of the biggest gains ever. :)

Please do help spread the word and give it a spin as we iron out remaining issues.

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Disabling JS support in additional browsers
On Sat, Aug 23, 2014 at 11:22 AM, Erik Moeller e...@wikimedia.org wrote:
> The IE6 disable patch is in prod now. I've tested on a few wikis and have not noticed any issues -- if anything, IE6 actually feels usable now, when before it kept throwing errors or was just slowing to a crawl. If there are no objections, I'll do the same with IE7 after wmf19 lands on mw.org (to give us a little bit more time for any issues to be reported).

This is ready to be merged unless folks have final concerns: https://gerrit.wikimedia.org/r/#/c/157774/

IE7 is pretty badly broken in prod (and has been for a long time), so AFAICT this will be a clear net improvement.

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Disabling JS support in additional browsers
On Wed, Aug 6, 2014 at 11:52 AM, Erik Moeller e...@wikimedia.org wrote:
> == Microsoft Internet Explorer 7.x ==
> Last release in series: April 2009
> - Browsing: Most pages work fine (some styling issues), but pages with audio files cause JavaScript errors (problem in TMH).
> - Editing: Throws a JS error immediately (problem in RefToolbar)
> Neither of these errors occurs in IE8.
> Security vulnerabilities: Secunia reports 15 out of 87 vulnerabilities as unpatched, with the most serious one rated as moderately critical (the same as IE6, while the most serious IE8 vulnerability is rated less critical).
> Usage: 1%
> Recommendation: Add to blacklist

The IE6 disable patch is in prod now. I've tested on a few wikis and have not noticed any issues -- if anything, IE6 actually feels usable now, when before it kept throwing errors or was just slowing to a crawl. If there are no objections, I'll do the same with IE7 after wmf19 lands on mw.org (to give us a little bit more time for any issues to be reported).

Erik
Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you
Hi folks,

Admins are currently given broad leeway to customize the user experience for all users, including the addition of site-wide JS, CSS, etc. These are important capabilities of the wiki that have been used for many clearly beneficial purposes. In the long run, we will want to apply a code review process to these changes, as with any other deployed code, but for now the system works as it is and we have no intent to remove this capability.

However, we've clarified in a number of venues that use of the MediaWiki: namespace to disable site features is unacceptable. If such a conflict arises, we're prepared to revoke permissions if required. This protection level provides an additional path to manage these situations by preventing edits to the relevant pages (we're happy to help apply any urgent edits) until a particular situation has calmed down.

Thanks,
Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] [Engineering] Migrating test.wikipedia.org to HHVM
This is fantastic progress, and really promising data. Huge kudos, guys :)

Erik

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Bikeshedding a good name for the api.php API
On Wed, Aug 6, 2014 at 10:15 PM, James Forrester jdforres...@gmail.com wrote:
> Yes, this is sensible. Let's certainly not call it the MediaWiki API given how many are planned. Core seems a reasonable qualifier, though, no?

Seems like the content API and a lot of other proposed interfaces are by definition outside the core. So why not "MW core API", or just "core API" for short?

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Policy on browser support
On Mon, Aug 4, 2014 at 5:36 PM, Derric Atzrott datzr...@alizeepathology.com wrote:
>> I would like to make a case for moving more browsers into the grade C category.
>
>> Yes please. As a project that must live the test of time I think we should be focusing our energy on building for future browsers. Our main goal is to provide people knowledge, which can be done without JavaScript. Older browsers hold us back in my opinion.
>
> I would also support this, for similar, but slightly different reasons. I agree that we need to make sure that the project stands the test of time, and for that reason I think we need to make Grade C a first-class citizen.

OK. I've submitted a change to do this for MSIE6 for now. [1] It got merged quickly, but please update it if I missed anything (I'll add release notes now). Provided there are no concerns with this, I'll send a note to wikitech-ambassadors@ soon, drafted here: https://www.mediawiki.org/wiki/User:Eloquence/MSIE6

This case seems very obvious given the unpatched vulnerabilities and lack of official support; let's discuss additional browsers/categories of browsers on a case-by-case basis.

Timo, I also noticed that the definition of grade C in startup.js was inconsistent with what's on the mediawiki.org page, so I updated it accordingly.

Thanks,
Erik

[1] https://gerrit.wikimedia.org/r/#/c/152072/

-- 
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Disabling JS support in additional browsers
Following up on disabling JavaScript support for IE6 [1], here is some additional research on other browsers. I'd appreciate it if people with experience testing/developing for/with these browsers would jump in with additional observations. I think we should hold off on adding other browsers to the blacklist until the IE6 change has been rolled out, which may expose unanticipated consequences (it already exposed that Common.js causes errors in blacklisted browsers, which should be fixed once [2] is reviewed and merged). As a reminder, the current blacklist is in resources/src/startup.js. As a quick test, I tested basic browsing/editing operation on English Wikipedia with various browsers. Negative results don't necessarily indicate that we should disable JS support for these browsers, but they do indicate the quality of testing that currently occurs for those browsers. Based on a combination of test results, unpatched vulnerabilities and usage share, an initial recommendation for each browser follows. Note that due to the heavy customization through gadgets/site scripts, there are often site-specific issues which may not be uncovered through naive testing.

== Microsoft Internet Explorer 7.x ==
Last release in series: April 2009
- Browsing: Most pages work fine (some styling issues), but pages with audio files cause JavaScript errors (problem in TMH).
- Editing: Throws a JS error immediately (problem in RefToolbar).
Neither of these errors occurs in IE8.
Security vulnerabilities: Secunia reports 15 out of 87 vulnerabilities as unpatched, with the most serious one rated moderately critical (the same as IE6, while the most serious IE8 vulnerability is rated less critical).
Usage: 1%
Recommendation: Add to blacklist

== Opera 8.x ==
Last release in series: September 2005
Browsing/editing: Works fine, but all JS fails due to a script execution error (which at least doesn't cause a pop-up).
Security: Secunia reports 0 unpatched vulnerabilities (out of 26).
Usage: 0.25%
Recommendation: Add to blacklist

== Opera 10.x-12.x ==
Last release in series: April 2014
Browsing/editing: Works fine, including advanced features like MediaViewer (except for 10.x)
Security: No unpatched vulnerabilities in the 12.x series according to Secunia; 2 unpatched vulnerabilities in 11.x (less critical) and 1 unpatched vulnerability in 10.x (moderately critical)
Usage: 1%
Recommendation: Maintain basic JS support, but monitor the situation re: 10.x and add that series to the blacklist if the maintenance cost is too high

== Firefox 3.6.* ==
Last release in series: March 2012
Browsing/editing: Works fine (MediaViewer disables itself)
Security: 0 unpatched vulnerabilities according to Secunia
Recommendation: Maintain basic JS support

== Firefox 3.5.* ==
Last release in series: April 2011
Browsing/editing: Works fine (MediaViewer disables itself)
Security: 0 unpatched vulnerabilities according to Secunia
Recommendation: Maintain basic JS support

== Safari 4.x ==
Last release in series: November 2010
Browsing/editing: Works fine
Security: 1 unpatched highly critical vulnerability according to Secunia (exposure of sensitive information)
Recommendation: Maintain basic JS support, but monitor

== Safari 3.x ==
Last release in series: May 2009
Browsing/editing: Completely messed up; it looks like CSS doesn't get loaded at all
Security: 2 unpatched vulnerabilities, highly critical
Usage share: Usage reports for Safari in [3] are broken; all Safari versions are reported as 0.0. However, [4] suggests that Safari 3 usage is negligible/non-existent.
Recommendation: The styling issue may be worth investigating in case it affects other browsers and/or is JS-caused. Otherwise it can probably be safely ignored.
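For readers unfamiliar with the blacklist mechanism in resources/src/startup.js: the basic idea is a startup-time user-agent check that decides whether to load any JavaScript at all. The sketch below illustrates that idea only; the patterns, function name, and startUp() hook are assumptions for this example, not MediaWiki's actual code.

```javascript
// Illustrative sketch of a startup-time user-agent blacklist.
// The patterns and names here are hypothetical, not the real startup.js.
var blacklist = [
	/MSIE [0-7]\./,      // IE 7 and below
	/Opera[ \/][0-8]\./  // Opera 8 and below
];

function isBlacklisted( ua ) {
	// Plain for loop on purpose: this code must itself run on the
	// oldest browsers, before any feature detection has happened.
	for ( var i = 0; i < blacklist.length; i++ ) {
		if ( blacklist[ i ].test( ua ) ) {
			return true;
		}
	}
	return false;
}

// Blacklisted browsers get the no-JS (Grade C) experience: the module
// loader is simply never started for them.
if ( typeof navigator !== 'undefined' && !isBlacklisted( navigator.userAgent ) ) {
	// startUp(); // hypothetical hook: begin loading ResourceLoader modules
}
```

The payoff of doing this in one tiny inline script is that unsupported browsers never download or execute the main JS payload, so they can't hit errors in it.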
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2014-August/077952.html [2] https://gerrit.wikimedia.org/r/#/c/152122/ [3] http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm [4] http://stackoverflow.com/questions/12655363/what-is-the-most-old-safari-version-which-is-used-so-far-by-users -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Policy on browser support
On Sun, Jul 27, 2014 at 2:25 AM, Krinkle krinklem...@gmail.com wrote: Since Grade B never ended up being recognised in any way by the software, I've kept that out. And the previously undocumented Grade C represents browsers we are interested in supporting due to their traffic but only via the non-javascript mode. Thanks, Timo - brilliant work as always. I would like to make a case for moving more browsers into the grade C category. Because JavaScript executes client-side, providing predictable user experiences at a reasonable maintenance burden inherently depends on clients being kept up to date. By shifting more browsers into JavaScript-less mode, we accomplish the following:
- Improve performance for low-end users, since these JS engines are often slow.
- Significantly reduce the maintenance burden for modern user experience code.
- Reduce risks of security exploits targeting users running an outdated environment.
We could then also begin to treat the JavaScript-less experience as a first-class citizen in support and testing, which helps us serve low-end users better. If we take the grade C category seriously, we need to think more about testing these workflows and ensuring we give the best possible experience to those users. IMO this should be based on both usage share and maintenance burden, using developer judgment. IE6-7 would be obvious candidates -- single-platform browsers, high maintenance burden to test for, etc. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Wikitech-ambassadors] Deprecating print-on-demand functionality
On Fri, Jul 11, 2014 at 8:45 AM, Luca Martinelli martinellil...@gmail.com wrote: so the Book Creator will still be active, maybe under another name, maybe with another engine, but still active? Same name and functionality, just the Order a printed book feature will disappear. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Deprecating print-on-demand functionality
Since 2008, we've offered a small feature to order printed books compiled from Wikipedia articles. This is done in partnership with a company called PediaPress. They've sold about 15K books over that time period, not enough to break even, and the support/maintenance burden for the service is no longer worth it for them. We'll disable this feature in the coming weeks. We'd only continue to offer it if there's 1) strong community interest in maintaining it, and 2) a partner who steps up to provide the service. We'll continue to provide PDF downloads (soon with a new rendering engine). Thanks, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Mantle - coding sharing between Flow and MobileFrontend
On Thu, Jul 3, 2014 at 9:09 AM, Jon Robson jrob...@wikimedia.org wrote: == The future == Mantle is only a short term measure. The hope is that all the code that goes here will eventually go into core. We hold the code here to exactly the same high standards that we hold core to, we are just able to more freely experiment and iterate. I hope Mantle doesn't exist in a year and instead we have a healthy frontend architecture that Mantle has helped grow. Like Kaldari, I think it's perfectly fine to have this experimental approach served by a separate extension until an implementation strategy for core has been agreed upon. However, I would recommend pursuing a shorter term resolution so we actually ensure that we work together on a single codebase, rather than diverging further. Can we shoot, _aspirationally_, for settling on a standard template/widget approach in core within the next quarter, and definitely within the next two quarters? I.e. dis-Mantle sooner rather than later? Thanks, Jon, for posting the details of this approach publicly. I've asked Tomasz to facilitate ongoing conversation about this internally, as well. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] First _draft_ goals for WMF engineering/product
As an update on the goals process for WMF engineering, we've begun fleshing out the top priorities for the first quarter. Going forward, we'll aim to call out the top priorities for each quarter as we approach it, to create more shared visibility into the most urgent and high-impact projects we're working on. I've decided for now to use a division between User-Impacting Changes and Cross-Functional Platform and Process Improvements. The intent of calling out both areas is to ensure that important organizational priorities don't fall off our collective radar. At the management level, the intent is for us to pay special attention to the priorities called out in this manner, and this may also impact our willingness to request help from across the organization if necessary to support these priorities, at least in Q1. I've merged the current draft into the goals document, here: https://www.mediawiki.org/wiki/Wikimedia_Engineering/2014-15_Goals#Top_departmental_priorities_for_Q1_.28July_-_September_2014.29 Once again, this is a draft and marked as such. The Impact column will include links to relevant metrics once those are a bit more solid; if you look further down in the document you'll see that these are being refined and tweaked in multiple areas right now. A little bit of rationale for some items that may surprise you: - I've decided to list HHVM as the top priority in both categories. This is because a) it's a very complex undertaking from an engineering perspective and requires significant coordination between development and operations, b) it's probably the biggest change to how code gets executed in production since we adopted PHP in the first place, c) the expected performance benefits for many uncached logged-in user operations are very significant (I defer to the team to quantify before throwing out estimates). This is also indicative of the importance we're attaching to site performance.
There's no question that performance is directly correlated with user engagement, and it's appropriate that we spend significant effort in this area. - We're elevating SUL finalisation ( https://www.mediawiki.org/wiki/SUL_finalisation ) to a top priority, and I've classified it as user-impacting. This is because it's on the critical path for making it easier to develop cross-site functionality (as long as we have to deal with the edge case of detached accounts, certain features that work across wikis are just trickier to implement), and one of those long term issues of technical debt we've been kicking down the road for years. It's also a pretty complex project -- if it goes wrong and we mess up our account database, we're in big trouble. So we want to make sure we have lots of eyeballs on this from a technical and community management perspective. We may not completely wrap up in Q1 since we need to give users whose accounts are affected significant warning time, which is just elapsed time we can't shorten. - Front-end code standardization is called out as a top priority. We really need to dig ourselves out of the mess of having disjointed templating systems, widget libraries, and JS frameworks across our codebase if we want to increase development velocity and UX consistency. I'm prepared to sacrifice short term development velocity on other projects in order to make this happen. - The content API that Gabriel is working on ( https://www.mediawiki.org/wiki/Requests_for_comment/Content_API ) is called out as a top priority. This is because the Parsoid output (for which the content API will be a high performance front-end) is now getting to the point where it's starting to become plausible to increasingly use it not just for VisualEditor, but also for views as well. 
The potential payoff is performance benefits across the board: for logged-in users in general by consistently relying on fast, cached output; for users loading VisualEditor by giving them most of the payload required to edit already in read mode; for users saving through VisualEditor by potentially turning the wikitext transformation into a post-save asynchronous process and thereby making saves near-instantaneous. Moreover, it will put us on the long-term path towards possibly using HTML5 as MediaWiki's native format, supporting HTML5-only wikis, and more. And it will be valuable for third-party re-use and re-processing of Wikimedia content for a multitude of use cases. Last but not least, it's also a great use case for a service-oriented architecture, including REST APIs and good/clean API documentation. In short, this is a big deal, and it has lots and lots of architectural implications -- so raising the visibility on this is intended to get more people to actually think through what all of this means for the future of MediaWiki. Other elements of the prioritization shouldn't be surprising: Phabricator is a big deal, and it's coming; Mobile (including new contributory features) and VE (including a really awesome new citations experience we're
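The post-save asynchronous wikitext transformation described above can be sketched as: persist the editor's HTML immediately (so the save appears instant), and queue the expensive HTML-to-wikitext serialization as a background job. All names here (saveRevision, jobQueue, serializeToWikitext) and the toy serializer are hypothetical; in reality Parsoid would perform the transformation.

```javascript
// Hypothetical sketch of a post-save asynchronous serialization pipeline.
// Names and the toy serializer are illustrative, not real MediaWiki code.

var jobQueue = [];

function serializeToWikitext( html ) {
	// Stand-in for the real Parsoid HTML -> wikitext transformation.
	return html.replace( /<b>(.*?)<\/b>/g, "'''$1'''" );
}

function saveRevision( store, title, html ) {
	// 1. Persist the HTML right away; the user sees the save complete.
	store[ title ] = { html: html, wikitext: null };
	// 2. Defer the expensive wikitext conversion to a background job.
	jobQueue.push( function () {
		store[ title ].wikitext = serializeToWikitext( html );
	} );
}

function runJobs() {
	// A job runner would drain this queue outside the save request.
	while ( jobQueue.length ) {
		jobQueue.shift()();
	}
}
```

The design choice this illustrates: the user-facing request only does the cheap write, while the derived representation is reconciled later, which is what makes "near-instantaneous" saves plausible.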
Re: [Wikitech-l] MediaWiki Front-End Standardization Chat
As a reminder, this is in 65 minutes. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] MediaWiki Front-End Standardization Chat
Raw logs here: https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-06-25-17.30.log.html Next steps: 1) Trevor, Roan, Timo, Kaldari and others will refine the proposal at https://www.mediawiki.org/wiki/Requests_for_comment/Redo_skin_framework as a concrete step to develop a standard UI framework for MediaWiki Core. 2) The proposal on the table is to implement this new skin framework, port existing skins in MW core, and port it to mobile as a skin to ensure that we're developing a multi-device, responsive design framework. 3) We'll reconvene with a dedicated IRC session about the refined RFC, and then seek to create technical alignment and determine the exact allocation of development effort beyond the team above. This will be in about two weeks. - - - - - The aforementioned work will be coordinated on-wiki and via Bugzilla. This proposal will only address parts of UX standardization. It does not currently focus on look and feel itself, though it would lay the groundwork for more CSS standardization as well. Other aspects - such as more consistent browser support expectations - need to be resolved independently. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] MediaWiki Front-End Standardization Chat
Hi folks, We're hosting a conversation about standardization and continued development of front-end libraries in MW core on 6/25, 5:30 PM UTC, #wikimedia-office. This is driven by a recognized need for teams at WMF to work more effectively on user-facing features and reduce duplication of efforts and inconsistencies across extensions. We're willing to take a bit of a hit on the short term velocity of feature development to build a more robust, consistent and developer-friendly platform. Timo Tijhof, Trevor Parscal and Roan Kattouw have proposed a systematic effort to improve MW core's front-end libraries, building on existing efforts (see https://www.mediawiki.org/wiki/UX_standardization for a messy but reasonably comprehensive overview of some of the inconsistencies and wheel duplication we need to solve). This will be done in partnership with other interested front-end engineers across the org and the community. We'll try to come up with a clear scope of work, such as: - having Mobile and VisualEditor depend on the same front-end libraries in MW core and use them effectively - eliminating dependencies on jQuery UI from all WMF-deployed code, to be replaced with a MediaWiki-native look and feel - creating a proper living style guide and UX standardization pipeline in partnership with the WMF UX team. This conversation is just a first step to ensure this effort has visibility from the start, and major architectural changes will go through the usual public conversations. Thanks, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Announcement: Elliot Eggleston joins Wikimedia as Features Engineer
On Fri, Jun 13, 2014 at 6:06 PM, Pine W wiki.p...@gmail.com wrote: Voodoo? Secret account in the Caymans set up by Fundraising Tech? Wikishares? If you want in on the Wikishares, it would be a nice test case for Matt Walker's new PDF generator. :) -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] LiquidThreads - how do we kill it?
On Mon, Jun 9, 2014 at 11:14 PM, Pine W wiki.p...@gmail.com wrote: FWIW, for me as a power user who watches many discussions simultaneously on multiple wikis, a unified watchlist and more refined tools for watchlist management are among the features at the top of my development wish list. *nod* The watchlist is an awesome tool, and there's so much more we could do with it. :) I like the theme Danny's proposed for the first quarter work on Flow: Never miss a message. When Flow begins to deliver on that promise, I think power users will start seeing some of the advantages such a system affords. Please don't hesitate to note requests/suggestions on the Talk:Flow page here: https://www.mediawiki.org/wiki/Talk:Flow And please comment on the overall roadmap here: https://www.mediawiki.org/wiki/Talk:Wikimedia_Engineering/2014-15_Goals Thanks :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] LiquidThreads - how do we kill it?
On Tue, Jun 10, 2014 at 1:09 AM, Thomas Gries m...@tgries.de wrote: Watchlist and (fine-granular definable) E-Mail-Notifications are very important - for my daily work. LiquidThreads and Echo (if you opt in to mail) offer that (using the MediaWiki UserMailer functions). Does Flow also offer E-Mail-Notifications? It already has basic notifications, yes. Enable the overly literally labeled Flow email notification type in your user preferences, leave a comment in the sandbox, get the sock puppets out and watch the fireworks: https://www.mediawiki.org/wiki/Special:Preferences#mw-prefsection-echo https://www.mediawiki.org/wiki/Talk:Sandbox -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] First _draft_ goals for WMF engineering/product
Hi all, We've got the first DRAFT (sorry for shouting, but can't hurt to emphasize :)) of the annual goals for the engineering/product department up on mediawiki.org. We're now mid-point in the process, and will finalize through June. https://www.mediawiki.org/wiki/Wikimedia_Engineering/2014-15_Goals Note that at this point in the process, teams have flagged inter-dependencies, but they've not necessarily been taken into account across the board, i.e. team A may say We depend on X from team B and team B may not have sufficiently accounted for X in its goals. :P Identifying common themes, shared dependencies, and counteracting silo tendencies is the main focus of the coming weeks. We may also add whole new sections for cross-functional efforts not currently reflected (e.g. UX standardization). Site performance will likely get its own section as well. My own focus will be on fleshing out the overall narrative, aligning around organization-wide objectives, and helping to manage scope. As far as quantitative targets are concerned, we will aim to set them where we have solid baselines and some prior experience to work with (a good example is Wikipedia Zero, where we now have lots of data to build targets from). Otherwise, though, our goal should be to _obtain_ metrics that we want to track and build targets from. This, in itself, is a goal that needs to be reflected, including expectations e.g. from Analytics. Like last year, these goals won't be set in stone. At least on a quarterly basis, we'll update them to reflect what we're learning. Some areas (e.g. scary new features like Flow) are more likely to be significantly revised than others. With this in mind: Please leave any comments/questions on the talk page (not here). Collectively we're smarter than on our own, so we do appreciate honest feedback: - What are our blind spots? Obvious, really high priority things we're not paying sufficient attention to? - Where are we taking on too much? 
Which projects/goals make no sense to you and require a stronger rationale, if they're to be undertaken at all? - Which projects are a Big Deal from a community perspective, or from an architecture perspective, and need to be carefully coordinated? These are all conversations we'll have in coming weeks, but public feedback is very helpful and may trigger conversations that otherwise wouldn't happen. Please also help to carry this conversation into the wikis in coming weeks. Again, this won't be the only opportunity to influence, and I'll be thinking more about how the quarterly review process can also account for community feedback. Warmly, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] LiquidThreads - how do we kill it?
On Mon, Jun 9, 2014 at 12:12 PM, Risker risker...@gmail.com wrote: This. Nobody, but nobody, asked the WMF to create this sort of system, and it is a rather quixotic goal given that each project has its own set of workflows. Hey Anne, We're of course pretty familiar with many of the highly specialized workflows that exist across wikis, and have had lots of conversations about how/when we could improve such workflows. Brandon originally titled the system Flow because of precisely that reason - the idea that Flow would provide building blocks through which workflows can be created, much the same way that an ordinary wiki page provides a very flexible mechanism by which users can create their own workflows. To keep the project manageable, however, I recommend a more staged approach: Solving for discrete use cases that can reasonably be solved with a new user experience, testing/validating whether the new user experience is in fact superior to the old one, and iterating from there. In this process we need to be wary of UX fragmentation -- but I think this is reasonably manageable as long as we're careful how we're staging use cases (e.g. I don't think it's unreasonable for a page like the Teahouse to have a different UX than an ordinary article talk page). Danny knows that I'm worried about the user talk page use case (called out in the goals) in that respect, because it represents a possible major fragmentation (old user talk vs. new user talk). My recommendation so far has been to target use cases where there exists local consensus to support them _and_ fragmentation of the user experience can be avoided. Beyond just managing scope, I think it's important to recognize that wiki-based workflows like AfD and RFCs are built around what a wiki allows you to do. If it's easy to tag comments/threads in such a way that they show up on relevant noticeboards, this may enable completely new workflows that are significantly simpler. 
So I think we need to be careful when modeling a new user experience not just to copy the old one, but to improve on it. We may find that users actually like some of the capabilities a new tool creates, just like Echo's completely new, never-asked-for mention notifications became very popular, very quickly. :) It's absolutely true that Flow is a risky project, but it's not true that it's designed to solve problems nobody's asked to be solved. You yourself quoted some of the issues with talk pages. And did you attend Jimmy's talk at Wikimania (I think 2012) about how difficult it is to perform simple tasks in the wiki precisely because every workflow is wiki-based? I absolutely want to make sure that WMF solves real problems and not just imagined ones -- but you'll need to allow for people in WMF to have reasonable debates (with each other, and with the community) about what the solutions could look like, and to try different approaches. Meanwhile, core tasks like SUL finalisation are languishing on the back burner, to the detriment of several other projects and products. Yes, I know it's on the roadmap for Mediawiki Core in Q1 starting in July - but it will not be the priority product for any of the three people tasked to it. It's indeed a complex migration that needs to be done carefully, as any issues would be difficult to resolve/debug and could cause massive angst. And it needs to involve people with deep Wikimedia subject matter expertise, and folks from the MediaWiki core team. Engineers aren't fungible -- we have to schedule a project like that with the folks who're best suited and most interested in working on it, and that's what we're doing. But we've scheduled it, and we'll take that commitment seriously, so please do hold our feet to the fire if we try to kick the can down the road.
Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] LiquidThreads - how do we kill it?
Dear Anne, Thank you for the thoughtful critique. There were four problems with talk/discussion pages that users across multiple communities over multiple years have identified: - Automatic signatures for posts/edits - More efficient method for indenting that is not dependent on arcane wikicode knowledge - More graceful handling of edit conflicts - Ability to watchlist an individual thread or section of a page Indeed, Flow is designed to address these issues, as well as others: - Overall reduced user experience latency (time spent loading pages, positioning the cursor, typing characters, etc.). - Support for cross-wiki discussions, to reduce fragmentation of conversations. - Better tools for organizing (e.g. tagging) and searching conversations. - Making it easy to see what's new/changed, irrespective of watchlisting. - Simple ways to participate in relevant conversations on mobile devices. The architecture reflects these combined needs. While it is possible to incrementally hack away at a single wikitext representation of an entire discussion, logically, without clear boundaries for each comment, any kind of functionality that operates at the comment-level is difficult or impossible to build. With that said, we will likely experiment with improvements to the existing talk page model as well, just to see how far we can push it. The mobile apps team is interested in implementing talk page support in the apps, and since Flow is still some way out, that may be a good case study to see what we can do on the basis of the existing wikitext conversational model. Flow is intended to be a system that is effective both for new and experienced users. Everyone working on the system understands that it cannot succeed if it doesn't serve the needs of experienced users. This is why we're intentionally designing it in the context of a gradually increasing number of real-world use cases. 
As an example, the Teahouse would be an interesting real-world use case where both new and experienced users interact. Longer term, high-volume pages like Village Pumps may make for good use cases. We can measure and compare the characteristics of conversations in such venues over time. At the most basic level, how many new and experienced users participate? At the more granular level, how long does it take users to perform tasks? Only if Flow compares well to other methods of organizing talk pages should it be used. But I've increasingly been getting the impression that there aren't a lot of WMF staff who actually like wikis, let alone developing Mediawiki core. The WMF technical staff is a pretty healthy mix of people with prior Wikimedia editing experience, people who became Wikimedians on the job, people who contributed to MediaWiki before, people who've not contributed to MediaWiki but who've been part of other open source efforts, and people who're completely new to both Wikimedia and open source. I'll let them speak for whether they like wikis or MediaWiki core, but I think a love/hate relationship with both is neither uncommon nor unwarranted. ;-) Meanwhile, Flow does automatic signatures, but its current design actually makes indenting much more problematic from a reader perspective than the current indenting structure. Flow stores the comments as a structured tree, so it's possible to build different interfaces for it. I personally don't have a strong opinion in the threaded vs. flat vs. hybrid debate -- I think we should pick the system that proves effective and that users will adopt and embrace. When I originally proposed the (now doomed) LQT effort nearly 10 years ago ( https://meta.wikimedia.org/w/index.php?title=LiquidThreads&oldid=100760 ), I defaulted to a thread view, because that's what I was familiar with from the forums that I used. Many very successful forums use either approach.
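The point about comments being stored as a structured tree is what makes the threaded vs. flat debate a pure presentation question: the same data can back either view. A minimal sketch (the data shape and function names are illustrative, not Flow's actual model):

```javascript
// Sketch: comments stored as a tree (via parent pointers) can be
// rendered either flat/chronological or threaded. Illustrative only.

var comments = [
	{ id: 1, parent: null, time: 1, text: 'First post' },
	{ id: 2, parent: 1,    time: 2, text: 'Reply to first' },
	{ id: 3, parent: null, time: 3, text: 'Second post' },
	{ id: 4, parent: 1,    time: 4, text: 'Later reply to first' }
];

// Flat view: strictly chronological, so the newest comments are
// trivially easy to find, as in a mailing list.
function renderFlat( list ) {
	return list.slice().sort( function ( a, b ) {
		return a.time - b.time;
	} ).map( function ( c ) {
		return c.text;
	} );
}

// Threaded view: depth-first walk, replies indented under their parent.
function renderThreaded( list, parent, depth ) {
	var out = [];
	list.filter( function ( c ) {
		return c.parent === parent;
	} ).forEach( function ( c ) {
		out.push( new Array( depth + 1 ).join( '  ' ) + c.text );
		out = out.concat( renderThreaded( list, c.id, depth + 1 ) );
	} );
	return out;
}
```

Note how the "later reply" sorts last in the flat view but moves up under its parent in the threaded view; that tension is exactly the trade-off discussed here.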
A strong argument in favor of a flat, chronological view is that it is just that -- it lets you easily see the most recent comments. Structure is then supplemented by quoting comments, as I'm doing in this email, which my mail client renders as part of a flat, chronologically ordered conversation. In a long thread that spans multiple screens, quickly noticing the comments that have been added after a certain date is otherwise difficult. LQT's implementation actually tries to solve for this -- LQT does deep threading, and you can track the history of an entire thread, with highlighting of when comments have been added: https://www.mediawiki.org/w/index.php?title=Thread:Project:Support_desk/help_to_restore_an_abandoned_wiki&lqt_method=thread_history Flow, which displays threads with a limited depth, currently shows the history as a meta-structure, which then gives you access to the individual comments. https://www.mediawiki.org/w/index.php?title=Talk:Flow&workflow=rvzy5wgxmdmgbfqa&action=history Neither user experience feels efficient (nor does mentally parsing a wikitext diff). It may be possible to
Re: [Wikitech-l] 404 errors
It's being investigated, see #wikimedia-operations on irc.freenode.net. Erik On Thu, May 29, 2014 at 1:34 PM, ENWP Pine deyntest...@hotmail.com wrote: Hi, I'm getting some 404 errors consistently when trying to load some English Wikipedia articles. Other pages load ok. Did something break? Pine ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Output from Zurich Hackathon - yet another maps extension!
On Thu, May 15, 2014 at 8:06 AM, Jon Robson jdlrob...@gmail.com wrote: "Visual:"? The mild conflict with "VisualEditor:" aside, that seems reasonable. "View:" might also work if it's not used for something else. Either way, it's going to be a bit tricky to translate. It might be good to design it with Commons in mind from the start, so that a Visual:/View: page can be loaded from a foreign repository as well as the local one. A qLabel style translation approach (using Wikidata) might be helpful to consider as well [1]. Overall this is very exciting work with lots of potential future applications. I don't think it's resourced for success yet, but let's figure out where it should sit in our roadmap since it would address many shared needs if done right. Erik [1] http://google-opensource.blogspot.com/2014/04/qlabel-multilingual-content-without.html -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Tech Talk: Unit testing for MediaWiki projects
As a reminder, this is happening at 12 PM PDT / 19:00 UTC tomorrow (Tuesday): https://plus.google.com/events/cae6ng1m9o4mhdbpo10u5v05bvg We're going to talk about various strategies for automated testing and improvements to our continuous integration infrastructure. Antoine 'hashar' Musso has offered to give an overview, roughly along these lines:
- quick overview of the infrastructure (Zuul/Jenkins, the slaves, the myriad of jobs and how they are maintained)
- MediaWiki testing frameworks and tools (phpunit, qunit, browser tests, beta cluster)
- current concerns in what we test, which should provide enough material for the open discussion part:
  - lack of cross-repository tests and how to handle dependencies
  - repositories that are barely tested yet critical
  - mw/core tests mixing unit and integration tests
  - lack of mocking
  - very thin code coverage
This will be followed by an open conversation about improvement strategies. The session is scheduled to take about an hour total. Hope to see you there :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
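The "lack of mocking" concern above is about tests that hit real dependencies (databases, HTTP services) when they could run in isolation. As a generic illustration only (this is not MediaWiki code; the function and client are hypothetical, and MediaWiki's own tests are PHPUnit-based), here is what mocking out a dependency looks like in Python's standard library:

```python
import unittest
from unittest.mock import Mock

# Hypothetical code under test: depends on an injected HTTP client,
# so a unit test can substitute a mock instead of a real network call.
def fetch_title(http_client, page_id):
    """Return the 'title' field from an API response for a page."""
    response = http_client.get(f"/page/{page_id}")
    return response["title"]

class FetchTitleTest(unittest.TestCase):
    def test_returns_title_without_network(self):
        # The mock stands in for the real client: no network, fully isolated.
        client = Mock()
        client.get.return_value = {"title": "Main Page"}
        self.assertEqual(fetch_title(client, 42), "Main Page")
        # Verify the dependency was called exactly as expected.
        client.get.assert_called_once_with("/page/42")
```

The same pattern turns an integration test (real service, slow, flaky) into a unit test (mocked dependency, fast, deterministic), which is the distinction the session agenda draws.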
Re: [Wikitech-l] Another Wikipedia design concept
On Tue, Apr 15, 2014 at 8:47 AM, Nathan nawr...@gmail.com wrote: "In the comment thread at the bottom someone gave him a heads up about the fonts controversy, hopefully he doesn't get totally discouraged from MediaWiki design studies after reading it ;)" I actually think it's interesting that he independently picked a mixed serif/sans-serif approach (going a step further than we did and turning all headlines serif). This is also true for other redesigns such as the recent "more readable Wikipedia" design: http://weare1910.com/sites/default/files/project/wikipedia_new_desktop_full.jpg Of course there are lots of issues with it - it's always a lot easier when you don't have to design with reality as a constraint :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Proposed body font stack for Latin
Thank you Erwin for always moving things forward. Much appreciated. :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Free fonts and Windows users
On Thu, Apr 10, 2014 at 7:39 AM, Derk-Jan Hartman d.j.hartman+wmf...@gmail.com wrote: "So for me, the question is not how can we apply pretty serif fonts to headers, the question is what can we do short term and long term to make that happen." It would be good if we could focus the conversation as much on concrete bugs and issues as possible. My understanding is that there are three separate major issues:
* Serif may not be a good choice for certain languages, no matter what font stack you use, because serif connotes different things in different scripts, and the meaning that's attached to it in Latin script may not necessarily translate well to other languages. I don't think that's an argument against serifs, and this doesn't negate the reasoning explained in [1] when it comes to Latin-script wikis. Rather, it argues for per-language improvements. Disabling serif for certain languages is currently handled by local overrides, which is not ideal for obvious reasons. The only way I can see to properly resolve that is to explicitly vary the font stack based on the content language. Does that make sense? If so, what's the best way to accomplish it?
* As a serif font, Georgia uses old-style numerals (whether users get Georgia depends on their locally installed stack). Some readers aren't used to old-style numerals, but they are really designed to flow with Latin script, so they look especially odd with other scripts. Since, as established in the first point, the serif specification per se may not make sense for certain languages, this issue could be resolved in one fell swoop by specifying sans-serif for certain languages/scripts.
* The explicitly specified sans-serif stack needs to be further optimized, and ideally should prioritize free/libre fonts first (as the serif stack does).
Until then, there's disagreement about whether the current sans-serif stack represents an appropriate place from which to improve (the UX team is arguing that it does because the increased specificity reduces the risk of bad defaults; others argue that it doesn't because it violates software freedom principles).
* Because of the above issues and possibly others, some people feel that either reverting to the previous state, or generally leaving fonts to the browser/OS while specifying broad family choices (serif / sans-serif), would be preferable. The UX team disagrees with that, as they feel that we can achieve a good result for the reader with higher predictability by making specific font recommendations.
Are those the main issues? Am I misrepresenting anything or forgetting something / additional major issues? Thanks, Erik [1] https://www.mediawiki.org/wiki/Typography_refresh#Why_are_we_using_serif_fonts_for_the_headings.3F -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
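The first issue above ("explicitly vary the font stack based on the content language") can be sketched concretely. This is an illustrative sketch only, not MediaWiki's actual implementation: the stacks, the override table, and the function name are all hypothetical, chosen to show free/libre fonts listed before non-free fallbacks and a sans-serif default for languages where serif carries the wrong connotation.

```python
# Illustrative font stacks (hypothetical, not the deployed CSS):
# free/libre fonts first, then common non-free fallbacks, then the
# generic family keyword.
SERIF_STACK = '"Linux Libertine", Georgia, Times, serif'
SANS_STACK = '"Nimbus Sans L", "Liberation Sans", Arial, sans-serif'

# Hypothetical override table: content languages whose headings should
# stay sans-serif rather than inheriting the Latin-script serif choice.
SANS_ONLY_LANGS = {"ja", "ko", "zh"}

def heading_font_stack(content_lang: str) -> str:
    """Return the CSS font-family value for headings in a given language."""
    if content_lang in SANS_ONLY_LANGS:
        return SANS_STACK
    return SERIF_STACK
```

The design point is that the variation happens server-side on the content language of the wiki, replacing the per-wiki local CSS overrides the message describes as "not ideal."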
Re: [Wikitech-l] Free fonts and Windows users
On Tue, Apr 8, 2014 at 10:59 AM, Martijn Hoekstra martijnhoeks...@gmail.com wrote: "So, the font stack changes with regards to the status quo now change nothing for Windows users, changes Helvetica -> Helvetica neue for Mac users and changes Arial, DejaVu Sans or Arimo for possibly something else, amongst which Nimbus Sans L, maybe, maybe not." Actually, it's a bit more complicated. All users get serif fonts for headings, which they didn't before and which is probably the biggest visual before/after difference. The serif fonts still prioritize free/libre fonts over non-free ones. The body fonts prioritized free/libre fonts on deployment, but for Windows users without ClearType/anti-aliasing, those render very poorly, so they were disabled shortly after deployment. This is now causing people to be upset because the initial agreement to never prioritize non-free fonts is no longer maintained for the body. Odder's patch would revert to sans-serif as a generic classification for the body, but doesn't touch the font specification for the headers (yet). The commit summary is a bit misleading in that regard. There's some additional discussion about Georgia as a font choice due to its use of text figures (AKA old-style numerals), which some people find look odd in headings with numbers, especially in non-Latin scripts where old-style numerals may not be commonly encountered. Due to this, some are arguing for also changing the style for headings to serif (_not_ sans-serif) as a generic classification, or removing Georgia from the stack. That particular issue hasn't been discussed in detail yet, as far as I can see. I think the differences of opinion here are not worth a holy war. Prioritizing a non-free font before free ones for the _body_ with a clear FIXME indicating that this is not a desirable state is IMO only marginally different from reverting to sans-serif until we have a free/libre font that _can_ be prioritized for the body.
So I think either outcome should be OK for the short term, and we should focus on the longer term question of a good font stack for the body that prioritizes free/libre fonts. Let's not polarize each other too much. All the arguments I've heard have been fundamentally reasonable and rational, not just "Change is evil." Some people hate the serifs per se, but that's a smaller discussion compared to these conversations, which are about substantial things that can be reasoned about. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Free fonts and Windows users
Just a note that Brandon just commented on the patchset: "We discussed this patch today during our weekly design team meeting and how to move forward. At this point in time we are leaning towards +2'ing this but we want to have a bit of discussion internally before doing so. We'll have something at the end of the day, so we're asking that no one merge it until then." I think that's a completely reasonable request. Thanks to everyone who's been speaking up in support of positive changes to our typography while also being constructive and critical as appropriate. I think one of the cool things about Wikimedia is that we're willing to try and experiment even at the risk of causing breakage and disruption, but the flip side of that is listening to each other and optimizing things together towards the right outcome. We've got tons of smart people around the world helping with this, which is awesome. As always, I have confidence in our collective ability to figure things out :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Implementation JSON based localisation format for MediaWiki nearly completed
On Tue, Apr 1, 2014 at 9:22 PM, Jon Robson jdlrob...@gmail.com wrote: "This is epic. Thanks a bunch Siebrand!" Agreed - really exciting to see this come to fruition! :) Kudos to Siebrand & everyone involved. I'm sure there will be bumps along the road but it's clearly a big architectural step forward. It's also nice to see how the RFC process was used for this. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Fwd: Language Engineering team changes
FYI. -- Forwarded message -- From: Erik Moeller e...@wikimedia.org Date: Fri, Feb 21, 2014 at 2:35 PM Subject: Language Engineering team changes To: All Wikimedia Foundation Staff Hi folks. After some internal conversations, we've implemented the following changes on the Language Engineering team: - Amir Aharoni is currently the Acting Product Manager; - Runa Bhattacharjee is the ScrumMaster. Siebrand Mazeland is no longer part of the Wikimedia Language Engineering team. We're maintaining a part-time contract relationship with Siebrand to support MediaWiki developers with i18n/l10n-related questions, to provide continuous review of MediaWiki changes from an i18n/l10n perspective, and to support translatewiki.net requirements which intersect with Wikimedia Foundation priorities and ongoing localization updates. I want to thank Siebrand for serving as the team's Product Manager since July 2011, and for continuing to partner with us on i18n/l10n issues and on translatewiki.net, which is an essential part of the Wikimedia localization process. Thanks, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Fwd: Update on WMF Director of TechOps
FYI. -- Forwarded message -- From: Erik Moeller e...@wikimedia.org Date: Fri, Feb 21, 2014 at 2:40 PM Subject: Update on WMF Director of TechOps To: All Wikimedia Foundation Staff Hi folks, in consultation with Faidon and Mark, we've decided not to immediately post the Director of TechOps job. Instead, Mark Bergsma will continue to fill the role indefinitely. This is an exploration - he wants to see whether he can balance compelling technical work with the managerial side of the role in a manner that makes him happy and serves the organization's needs. It makes a lot of sense to me. Hiring leadership roles is always high-risk in our complex environment, and with the VPE/ED hires around the corner, there's already a lot of risk to deal with. So I'm grateful Mark's willing to continue to step in, and perhaps he will actually enjoy the hybrid tech/management role in the long run. :) Given that the team is heavily distributed, it's less of an issue that he is not based in SF. To ensure that the responsibilities are manageable and to support the team going forward, we're preparing a Technical Project Manager role that will also include some procurement/vendor negotiations responsibilities. The person will need to have ops/infrastructure experience to be able to hit the ground running. Internal applicants will be welcome, of course. On a side note, Engineering/Product is also hiring an additional admin staff member to support the department's growing administrative support needs, which should also help with the overall workload. Thanks again to Mark and the whole team for making this work. :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] .wiki gTLD
We've been in discussions with Top Level Design, both to look into potentially appropriate uses (e.g. URL shorteners) and to prevent squatting of WMF trademarks. James points out that now there's .foundation there's some additional potential for mischief :P. Damn TLDs sprouting like mushrooms .. -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] [Design] Should MediaWiki CSS prefer non-free fonts?
On Mon, Feb 17, 2014 at 4:42 PM, Rob Lanphier ro...@wikimedia.org wrote: "tl;dr: His stack still lists HelveticaNeue as the first font, but proposes Arimo as a web font which may well look better on MS Windows. Arimo ships with ChromeOS." So, what would be the downside of listing a font like Arimo for sans-serif and Libertine for serif first in the stack? While not affecting the reader experience for a significant number of users, it would still be a symbolic expression of a preference for freely licensed fonts, and a conscious choice of a beautiful font for readers that have installed it. There may be practical and aesthetic arguments against these or other specific free fonts; if so, it would be good to hear those arguments spelled out. I do agree that if Helvetica Neue is only installed on Macs and costs $30 for everyone, it's a pretty idiosyncratic choice as a primary font to specify. :-) Surely if the font requires downloading on the majority of platforms anyway, we may as well specify a free one before the non-free one. As for webfonts, given the ULS experience I'd be very leery of the performance impact, both in terms of delivering the font and any unintended re-rendering flashes. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] [Design] Should MediaWiki CSS prefer non-free fonts?
On Mon, Feb 17, 2014 at 7:19 PM, Steven Walling swall...@wikimedia.org wrote: "We basically tried the equivalent of this (placing relatively free fonts unknown on most platforms first) which Kaldari talked about previously. Ultimately that kind of declaration is useless for the vast majority of users and we got very specific negative feedback about it on the Talk page. (..) These fonts are ignored by most systems when placed first or when placed later in the stack. Systems match the first font they recognize, so using something they don't recognize or putting it later is largely just a feel-good measure." Thanks Steven et al. It's clear from https://gerrit.wikimedia.org/r/#/c/108155/ that everyone involved is trying to do the right thing. :) I agree with Rob's follow-up question here -- https://www.mediawiki.org/w/index.php?title=Talk:Typography_refresh&diff=908614&oldid=908502 i.e. we should document our assessment of freely licensed fonts and any associated design or rendering issues. Even if specifying alternative fonts in the stack _is_ largely symbolic, to the extent that we can express our values through our choices here without negative side effects, we should. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] MeetBot now at your disposal in #wikimedia-office
Tim, this is great - thanks so much for getting it spun up, should be very helpful for office hours and such. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Welcome to Ken Snider, Wikimedia Operations
Hi all, I'm sorry to update you that Ken will be leaving WMF. He's agreed to provide transitional support through February and March, and Mark Bergsma will be Acting Director of TechOps starting today, sharing some of the work with Faidon. Thanks to both of them for stepping up. We'll be re-posting the role shortly. Thanks to Ken for his service to the team and the organization. If you have any questions, feel free to ping me, Mark or Faidon on IRC. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)
On Mon, Jan 13, 2014 at 9:10 AM, Chris Steipp cste...@wikimedia.org wrote: "To satisfy Applebaum's request, there needs to be a mechanism whereby someone can edit even if *all of their communications with Wikipedia, including the initial contact* are coming over Tor or equivalent. Blinded, costly-to-create handles (minted by Wikipedia itself) are one possible way to achieve that; if there are concrete reasons why that will not work for Wikipedia, the people designing these schemes would like to know about them." "This should be possible, according to https://meta.wikimedia.org/wiki/NOP, which Nemo also posted. The user sends an email to the stewards (using tor to access email service of their choice). Account is created, and user can edit Wikimedia wikis. Or is there still a step that is missing?" I tested the existing process by creating a new riseup.net email account via Tor, then requesting account creation and a global exemption via stewa...@wikimedia.org. My account creation request was granted, but for exemption purposes, I was requested to go through the process for any specific wiki I want to edit. In fact, the account was created on Meta, but not exempted there. The reason I gave is as follows: "My reason for editing through Tor is that I would like to write about sensitive issues (e.g. government surveillance practices) and prefer not to be identified when doing so. I have some prior editing experience, but would rather not disclose further information about it to avoid any correlation of identities." This seems like a valid reason for a global exemption to me, so I'm not sure the current global policy is sufficient. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Tampa datacenter issues
Hi all, We had a fibre cut of our connection to our Tampa DC this morning. ETA of a fix is still pending, but the cuts have been located and crews are being dispatched. Meanwhile public traffic is being rerouted via the public Internet, so most services should be reachable. Tampa is our secondary DC which we're decommissioning, so there was no impact on our main sites. Impacted temporarily were: inbound Wikimedia.org email, Bugzilla, Labs. Fundraising is still impacted, but shouldn't be (we're debugging). Thanks to the ops team for their quick response. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)
On Fri, Jan 17, 2014 at 11:08 AM, Marc A. Pelletier m...@uberbox.org wrote: "The problem isn't straight up vandalism (IPBE is no help there -- the account'd get swiftly blocked) but socking. POV warriors know how to misuse proxies and anonymity to multiply their consensus, and having IPBE and editing through any sort of anonimizing proxy (including TOR) defeats what little means checkuser have to curb socking." I understand. Wikimedia's current abuse prevention strategies rely on limits to user privacy being maintained, and any technical solution that attempts to broaden access for Tor users is unlikely to be successful at any significant scale unless this changes, no matter how clever a solution it is. The Board or global community could decide that protecting users' right to anonymity is more important than having abuse prevention tools relying on IP disclosure, but in the absence of such a Board-level decision or community-wide vote, I don't think the situation relative to Tor users will change. My personal view is that we should transition away from tools relying on IP disclosure, given the global state of Internet surveillance and censorship which makes tools like Tor necessary. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Tor exemption process (was: Re: Jake requests enabling access and edit access to Wikipedia via TOR)
On Fri, Jan 17, 2014 at 1:38 PM, Risker risker...@gmail.com wrote: "End of the day, though, absent blocking problematic IP addresses and ranges (which really can't be done unless the person blocking actually knows the IP address or range), the socks and spammers just keep coming. This problem isn't unique to WMF projects, and I don't believe anyone has come up with a solution that allows open/unregistered editing without also using IP information for blocking or limiting access." I'm not arguing for open editing from Tor. I do think it would be nice if global exemptions could in fact be obtained reasonably easily by emailing stewa...@wikimedia.org. While it's true that such requests could be misused, the following are also true:
- We regulate the influx of requests and the exemptions we grant. This means that we can use wait periods, interview questions, and other mechanisms to avoid it turning into a free-for-all. This is effectively the same mechanism riseup.net uses to grant anonymous email addresses.
- We know all the accounts that we have granted global exemptions to and therefore can investigate behavior _across the group_ of Tor users fairly easily, or even subsets of that group such as exemptions granted in a certain time window, by a certain user, etc.
It would allow a motivated person to reset their identity and go undetected provided they avoid the kind of articles and behaviors they got in trouble over in the first place. It's not clear to me that the consequences would be particularly severe or unmanageable beyond that. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] [Ops] Status update on new Collections PDF Renderer
Thanks, Matt, for the detailed update, as well as for your leadership throughout the project, and thanks to everyone who's helped with the effort so far. :-) As Matt outlined, we're going to keep moving on critical path issues til January and will do a second sprint then to get things ready for production. Currently we're targeting January 6-January 17 for the second sprint. Will keep you posted. All best, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Wikipedia Issue
We were dealing with cascading site issues due to excessive database queries, and are still investigating the root cause, but the site should be recovered by now. -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Re-implementing PDF support
Hi folks, for a long time we've relied on the mwlib libraries by PediaPress to generate PDFs on Wikimedia sites. These have served us well (we generate 200K PDFs/day), but they architecturally pre-date a lot of important developments in MediaWiki, and actually re-implement the MediaWiki parser (!) in Python. The occasion of moving the entire PDF service to a new data-center has given us reason to re-think the architecture and come up with a minimally viable alternative that we can support long term. Most likely, we'll end up using Parsoid's HTML5 output, transform it to add required bits like licensing info and prettify it, and then render it to PDF via phantomjs, but we're still looking at various rendering options. Thanks to Matt Walker, C. Scott Ananian, Max Semenik, Brad Jorsch and Jeff Green for joining the effort, and thanks to the PediaPress folks for giving background as needed. Ideally we'd like to continue to support printed book generation via PediaPress' web service, while completely replacing the rendering tech stack on the WMF side of things (still using the Collection extension to manage books). We may need to deprecate some output formats - more on that as we go. We've got the collection-alt-renderer project set up on Labs (thanks Andrew) and can hopefully get a plan to our ops team soon as to how the new setup could work. If you want to peek - work channel is #mediawiki-pdfhack on FreeNode. 
Live notes here: http://etherpad.wikimedia.org/p/pdfhack
Stuff will be consolidated here: https://www.mediawiki.org/wiki/PDF_rendering
Some early experiments with different rendering strategies here: https://github.com/cscott/pdf-research
Some improvements to Collection extension underway: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/Collection,n,z
More soon, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
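The pipeline described above (Parsoid HTML5 in, PDF out) includes a step that transforms the HTML "to add required bits like licensing info" before handing it to a renderer such as phantomjs. Here is a minimal sketch of that transformation step under stated assumptions: the function name, the footer markup, and the `</body>` injection point are all illustrative, not the actual Collection/PDF-service implementation.

```python
# Illustrative footer markup; the real service would generate proper
# per-article attribution, not this fixed string.
LICENSE_FOOTER = (
    '<footer class="license">Text is available under the '
    'Creative Commons Attribution-ShareAlike License.</footer>'
)

def add_license_footer(parsoid_html: str) -> str:
    """Insert a licensing footer just before </body>, or append if absent.

    Sketch only: a production version would parse the DOM rather than do
    string replacement, but this shows where the licensing step sits
    between fetching Parsoid output and rendering it to PDF.
    """
    marker = "</body>"
    if marker in parsoid_html:
        return parsoid_html.replace(marker, LICENSE_FOOTER + marker, 1)
    return parsoid_html + LICENSE_FOOTER
```

The enriched HTML would then be written to disk and passed to the chosen renderer; keeping the transformation as a pure function makes it easy to test independently of the rendering backend.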
Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community
On Fri, Nov 8, 2013 at 9:00 AM, Bryan Davis bd...@wikimedia.org wrote: "I think that picking the title Senior Software Engineer II may be underselling the value of this highest tier to the outside world. In my recent job search I saw a bit of the tech ladder side of the org chart for several companies. Most of the ladders I saw had a title of Principal Engineer for the top level of non-management engineers." That's totally fair, and I like Principal as an alternative. It has strong leadership implications, but leadership can take many forms, and the criteria for a promotion from Senior to Principal could make that clear. -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community
On Thu, Nov 7, 2013 at 1:15 AM, Faidon Liambotis fai...@wikimedia.org wrote: Faidon, great questions. "The architect title, besides the job description that you described, is also a seniority level within the WMF's engineering department. Other organizations do e.g. sr./staff/sr. staff and/or numeric levels, we do associate / (blank) / sr. / (lead) architect. At least that's my understanding of what was presented during the last all-staff and documented on officewiki." On mediawiki.org: https://www.mediawiki.org/wiki/Wikimedia_Engineering/Careers "What would happen to this seniority level, if any of the options you presented were to be adopted? You seem to hint that there would be a mapping with option D (salary increases) but it's not clear to me how direct of a mapping or a prerequisite would that be." Let me try to respond to this and your other comments in one go. Folks who don't care about WMF internals should stop reading at this point. This stuff doesn't matter to everyone; if it doesn't matter to you, that's OK. :) There are four main salary band levels we work with in engineering: entry-level, mid-level, senior level, and director/architect level. Each of these bands is pretty wide, i.e. tens of thousands of dollars for an SF-based hire between the lowest and the highest point. There's a lot of room for progression within a given band, and it's OK for folks to live outside a given band, which tends to make this somewhat less urgent in practical terms. That said, one of the fundamental principles I believe in is that it should be possible to progress to the highest salary band on either the development or the management side. It seems that based on the discussion, nobody's particularly in favor of a broad community process regarding architecture roles, so some of the intricacies of progression tied in any way to such a process may be moot.
What might have some degree of traction, based on the discussion, is to have some blessed delegation coming from the original triumvirate of architects. In practice, I could see this tie into the career progression at WMF in two main ways: 1) We continue to (rarely but sometimes) use the Architect title as the highest salary band in engineering, and promote people into it based on a track record of continued architectural leadership, as proven in a do-ocracy framework like the one suggested by Brion. 2) We don't award Architect as a job title beyond the original triumvirate, but we _do_ introduce a Senior Software Engineer II (same band as the Architect band), and would define some criteria for that, among which proven architectural leadership could be one. We can choose to still recognize any continued membership in something like a core maintainers group etc. in a person's role, but that's decoupled from the salary band and can change, consistent with the idea that it should be OK for an architect to spend time doing other fun important things, rather than being locked into one set of responsibilities forever. I think the second is more consistent with the tenor of the discussion here so far, because in the first case, the coupling between job titles and responsibilities in our community might be too tight to maintain flexibility and openness. It would also recognize that technical leadership doesn't _just_ mean taking on broad architectural responsibilities. So for example, development of unique and mission-critical domain expertise might be another way to progress into Sr. II. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community
On Wed, Nov 6, 2013 at 7:13 AM, Brion Vibber bvib...@wikimedia.org wrote: * It makes sense to have a handful of folks as a core review planning group. * However, I would consider avoiding using the term Architect for its members as it's easily conflated with existing WMF job titles. I think job titles are pretty unreliable indicators at the best of times, and of course can be wildly inconsistent across companies. Yeah, that makes sense to me. How do you propose that core review planning group be composed? You say a handful of folks, do you mean that literally, or are you talking about a comprehensive maintainers list like the one at https://www.mediawiki.org/wiki/Developers/Maintainers ? If it's a significantly smaller subset, perhaps the current architects should appoint some folks as lieutenants, either Linux-style or on an as-needed basis? As such, I'd recommend a slightly more formal role for additional lead reviewers or module owners in the code review RFC processes. Would that be the same as the core review planning group you refer to above? -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Architectural leadership in Wikimedia's technical community
tl;dr: I’d appreciate thoughts from the Wikimedia technical community at large on whether the designation of individual technical contributors as architects should be meaningful, and if so, how to expand it beyond the original triumvirate (Brion, Tim, Mark), e.g. by transitioning to a community-driven process for recognizing architects. Hi all, in March 2011 and June 2011, Brion Vibber, Mark Bergsma and Tim Starling were announced as Lead Software Architect, Lead Operations Architect and Lead Platform Architect of the Wikimedia Foundation, respectively. Together, these three individuals laid much of the foundation of Wikimedia’s technical infrastructure, from MediaWiki itself to our caching and load balancing setup. So it was a logical step to recognize their immense contributions, and to entrust them with continued high-level stewardship in Wikimedia’s technical ecosystem. Since then, WMF's engineering organization has grown pretty dramatically. We've also seen increased engagement in the Wikimedia technical community from other organizations. Wikimedia Germany is probably most notable among them with the Wikidata project, and Wikia has partnered directly on VisualEditor development and is generally striving to increase visibility of its open source modifications to MediaWiki. At WMF, this has increasingly raised the question of how the architecture of Wikimedia’s technical infrastructure can be evolved at this new, larger scale, and how we can bring more voices into that conversation. I've shared this note with the architects ahead of time and taken some initial feedback into account. Rob Lanphier has taken a lead role in giving the RFC process a kick in the pants as a solid, asynchronous, transparent process for organizing and resolving technical proposals.
Brion, Tim and Mark are explicitly listed as the three individuals who can close an RFC (interpreting or helping reach consensus, or making an informed decision where there’s a lack of community participation), and have helped clear the RFC backlog and evolve the architecture guidelines. In addition, Rob is organizing the MediaWiki architecture summit in January, where we can talk about some of the most pressing or contentious architectural questions in person. However, Brion, Tim and Mark are not infinitely scalable, nor are they immortal (except in our hearts). They can’t be in every conversation, know every part of Wikimedia’s technical ecosystem, review every RFC, etc. We also have many other deeply talented technical contributors, including some who have many years of experience in our technical context specifically -- not just at WMF. Beyond just making technical decisions, architectural leadership creates opportunities for mentorship, modeling and nurturing the kind of behavior we want to foster in our technical community. So how should this role evolve going forward? Some possible paths (you know I like to present options ;-): Option A: We change nothing and don't promote any new people into architect roles for a while. I truly don’t think this is an option for much longer -- we need to find better ways to encourage some of our other capable technical contributors to feel ownership over Wikimedia’s technical direction, and fill gaps in architectural leadership today. That said, it would be possible to make the RFC process more egalitarian and to reduce the emphasis on formalized technical leadership. Option B: WMF handles it as it sees fit. This basically means WMF gets to decide who to designate as Architects and at what point, which would mostly leave this decision in the hands of managers. This is a very WMF-centric view of the world, but it’s of course the way most organizations operate. Option C: We get rid of the special role of architects. 
I personally don’t favor this option either, because I think recognizing the most sane and experienced voices in our technical community and according them some real leadership influence over Wikimedia’s technical direction is important (and a useful counterbalance to pointy-haired folks like yours truly ;-). Option D: We come up with some kind of open process for designating/confirming folks as architects, according to some well-defined criteria (including minimum participation in the RFC process, well-defined domain expertise in certain areas, a track record of constructive engagement, etc.). Organizations like WMF can choose to recognize this role as they see fit (likely according salary increases to individuals who demonstrate successful architectural leadership), but it’s a technical leadership role that’s awarded by Wikimedia’s larger technical community, similar to +2 status. Each of these would need to be unpacked and further developed. For option D in particular, I think it’s important to recognize that the level of impact beyond WMF for technical decisions varies significantly. While parts of WMF’s overall operating infrastructure are of interest to third parties, decisions that
Re: [Wikitech-l] Architectural leadership in Wikimedia's technical community
On Tue, Nov 5, 2013 at 6:21 PM, Chad innocentkil...@gmail.com wrote: I think I can respond to pretty much the whole idea here. I think titles are pretty much a WMF-thing and shouldn't have any bearing on MediaWiki :\ Just to be clear on how they currently do, in the relatively recently drafted (and still draft status) architecture guidelines: An RFC is a request for review of a proposal or idea. RFCs are reviewed by the community of MediaWiki developers. Final decisions on RFC status are made by the WMF architects (Mark Bergsma, Tim Starling, Brion Vibber). https://www.mediawiki.org/wiki/Architecture_guidelines This is of course a process we could change, and rely more on informal recognition of technical leadership than formalized roles or titles. -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Visual Editor is trashing every article on French Wiki
This was due to a broken deployment of Parsoid, the new MediaWiki parser used by VisualEditor. A new library dependency defaulted to iso8859-1 instead of utf-8, which caused character munging to occur. Gabriel is working on a postmortem and we'll share this shortly with recommendations on how to avoid such an incident in future. We're very sorry for the breakage. Affected edits likely have already been reverted - but if there's anything in addition we can do, let us know. The team is still actively investigating the issue on #mediawiki-visualeditor on irc.freenode.net, which is the best place to reach them. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
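The failure mode described above (a dependency decoding UTF-8 bytes as ISO-8859-1) is a well-known class of bug. A minimal Python sketch of the resulting character munging, purely illustrative and not Parsoid's actual code, might look like this:

```python
# Hypothetical illustration of the iso8859-1 vs. utf-8 bug class described
# above -- not Parsoid's actual code. Wikitext is stored as UTF-8 bytes;
# a library that defaults to ISO-8859-1 (Latin-1) decodes each byte of a
# multi-byte UTF-8 sequence as a separate character, producing mojibake.

original = "café – résumé"             # article text with non-ASCII characters
utf8_bytes = original.encode("utf-8")  # the bytes as actually stored/sent

# The faulty default: decoding those bytes as Latin-1 instead of UTF-8.
munged = utf8_bytes.decode("iso8859-1")

print(repr(munged))  # 'café' comes out as 'cafÃ©', and so on

# As long as nothing re-encodes lossily along the way, the damage is
# mechanically reversible:
assert munged.encode("iso8859-1").decode("utf-8") == original
# But once the munged text is saved back into the article, subsequent
# edits build on corrupted content -- which is why affected edits had
# to be reverted rather than simply re-rendered.
```

Making the expected encoding explicit at every decode site (rather than relying on a library default) is the usual defense against this class of incident.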
Re: [Wikitech-l] Should MediaWiki CSS prefer non-free fonts?
On Mon, Oct 28, 2013 at 7:12 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote: Where I come from, beta does mean this is the direction we're intending to go in, subject to testing and feedback before it's made an official release. That's right. There are two questions here: - Do these style/typography changes represent an improvement for most users, without significantly disadvantaging others? - If so, how should they be implemented? Making the change available as a beta feature helps us get input on both questions. This thread is an example of exactly the kind of feedback that we'd rather get during the beta stage than when something's been made default. Prioritizing freely licensed fonts while also explicitly naming the preferred non-free fonts seems like an easy fix. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Optimizing the deployment train schedule
Hi folks, after speaking to a few folks, I'd like to check in on the WMF deployment train schedule overall, and see if there are ways to optimize it. (Note: In the below I refer to test wikis vs. production wikis, generously including mediawiki.org as a test wiki. I realize that our test wikis, with the exception of Labs wikis, run on the production cluster.)

== Current practice ==

* On Thursdays we increase the release counter, and deploy the latest release to test wikis and the previous one to Wikipedias.
* On Mondays we deploy the latest release to non-Wikipedias.

== Problems with this approach ==

* We only have bits of Thursday and all of Friday to resolve issues that are surfaced in the test wikis prior to the Monday rollout to the first production wikis.
* Having two stages of release also increases the cognitive load on developers in understanding when their code hits production wikis, which arguably increases the risk of negative impact of a deploy going unnoticed.

== Advantages of this approach ==

* Commons serves just about enough traffic to sometimes act as a useful canary for performance/scaling issues that will later appear in production.
* Developers have some post-deployment time to fix issues highly specific to the non-Wikipedia wikis (e.g. extensions/gadgets only deployed there) rather than being distracted by firefighting on Wikipedia.

== Some options ==

Option A: Change nothing. I've not heard from enough folks to see if the problems above are widely perceived to _be_ problems. If the consensus is that current practice, for now, is the best possible approach, obviously we should stick with it.

Option B: No Monday deploy. This would mean we'd have to improve our testing process to catch issues affecting the non-Wikipedia wikis before they hit production.
I personally think getting rid of the Monday deploy could create some _desirable_ pain that would act as a forcing function to improve pre-release test practices, rather than using production wikis to test. At the same time, we'd have a full week to work out the kinks we find in testing before they hit any production wiki, and could have a more systematic process of backing out changes if needed prior to deployment. Option C: Shift Monday deploys to Tuesday. This would at least give us an additional work day to fix issues that have occurred in testing before they hit prod. I personally don't think this goes far enough, but might be a useful tweak to make if option B seems too problematic. Are there other ways to optimize / issues I'm missing or misrepresenting above? Thanks, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Multimedia] Welcome Gergő Tisza!
Welcome on board, Gergő -- really looking forward to making images, video and other media in our projects ... sparkle. ;-) -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [EE] Bug 35306: Global (to a wiki farm or family) message delivery (thoughts)
On Fri, Oct 4, 2013 at 9:34 AM, Brion Vibber bvib...@wikimedia.org wrote: Can someone summarize this thread? As far as I can tell someone has invented a requirement that all features be blessed by the WMF Features team, and I'm pretty sure that can't be right. Can it? Of course not. I think Terry's mostly concerned that there's clear ownership and maintainership for a new extension going forward, and that it's properly reviewed before it goes out the door. He's overstating, but he's coming from a reasonable place of caution. I like the checklist process in https://www.mediawiki.org/wiki/Review_queue (irrespective of the exact steps) because it is agnostic as to who does the work required to get something out the door. That said, it's a given that WMF does get the blame when things go wrong, especially on a large scale, and as operator of the sites we do have a role in making sure we're not causing harm, incurring unreasonable technical debt, or going against WMF's goals. As for MassMessage, I looked at and played with it and there were definitely issues with just pushing it out the door. As originally planned, it would enable any admin anywhere to post bulk messages to any wiki from any other wiki using a bot account created by the extension. This raises policy and auditing questions beyond what EdwardsBot is doing. There's consensus for a simpler deployment to start with, with Meta acting as a place for coordinating cross-wiki messages. That seems reasonable to me, and I definitely look forward to seeing how well this works in practice. -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Bug 35306: Global (to a wiki farm or family) message delivery (thoughts)
On Thu, Oct 3, 2013 at 6:50 PM, Terry Chay tc...@wikimedia.org wrote: We still eventually want to reach the point where the criteria is not the amalgam of rules above but a simpler one based on intent, expertise-sharing and consensus-building: If any engineering department or community developer in collaboration with the other core competencies (engineering, product, design, and community) are willing to commit to ongoing maintenance of a feature, then no one group should block it. I'd actually say that this paragraph more accurately describes both historical and current practice than your preceding enumeration, except that certain competencies (product/design) have historically been under-represented in the dev process and should be more consistently looped in now that we have more folks who wear those hats (both within WMF and beyond). Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Engineering] Roadmap updates - Sept 6th, 2013
On Sat, Sep 7, 2013 at 2:01 PM, Steven Walling swall...@wikimedia.org wrote: Weekly deployment plans/notes This monthly roadmap spreadsheet/wiki page Quarterly plans, as represented in https://www.mediawiki.org/wiki/Wikimedia_Engineering/2013-14_Goals and other places Yearly/annual plans This is too much. Agreed, let's rationalize this a bit. Part of the point of the Roadmap with monthly breakdowns was to create an overview where people can more easily anticipate e.g. architectural conversations and inter-team sync-ups that need to happen. It's not working for that, because it's not part of anyone's workflow. Like you, I'd argue in favor of scrapping the roadmap as it exists today, but I think we should do a better job making the Deployments page more informative. For example, now that we're working towards a proper beta mode on mobile _and_ desktop, it would IMO be useful to summarize which features are currently in beta and planned to enter production soon. We still have too many situations where people are caught by surprise by a deployment, both internally and externally. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] HTTPS enabled for all logged-in users
On Wed, Aug 28, 2013 at 3:19 PM, Tyler Romeo tylerro...@gmail.com wrote: After many months of struggle, WMF takes one big step towards a more secure Wikipedia. Good job everybody! Agreed - fantastic to see this out the door :-). Thanks to everyone who made it happen. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] HTTPS for logged in users on Wednesday August 21st
On Tue, Aug 20, 2013 at 12:46 PM, George William Herbert george.herb...@gmail.com wrote: The change must be delayed until people geographically / nationally denied HTTPS can log in again. Tim's working on a patch that should make this possible: https://gerrit.wikimedia.org/r/#/c/80166/ The plan of record right now is to not make the switch til we have that merged and tested. We may still be able to make the launch window tomorrow - RobLa will make the final call on that. Ideally I'd like to see the language-based blacklisting removed if the GeoIP-based solution works. In general, though, I'd prefer for WMF to move away from what could be characterized as appeasement and towards actively resisting censorship and monitoring. So I'd argue in favor of a deadline for this approach, and alignment of resources and alliances to take active measures against censorship and monitoring. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] HTTPS for logged in users on Wednesday August 21st
On Tue, Aug 20, 2013 at 9:20 PM, Risker risker...@gmail.com wrote: The mandatory use of HTTPS outside of a limited number of countries where we know the editors will be blocked is not what I am talking about. No, but the point is that there's no apolitical choice here. Actively suppressing a standard, long overdue security measure in order to ensure that a country's censorship practices do not interfere with editing and access to Wikipedia is a political choice. Not doing so in full awareness of the consequences is a political choice. We cannot claim ignorance or neutrality either way. The real question is what political choices serve Wikimedia's mission best in the short and long term, and I agree that that's a discussion that extends beyond the technical dimension, so is somewhat OT here. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Mon, Jul 22, 2013 at 8:44 PM, Tim Starling tstarl...@wikimedia.org wrote: Newcomers with the VisualEditor were ~43% less likely to save a single edit than editors with the wikitext editor (χ² = 279.4, p < 0.001), meaning that Visual Editor presented nearly a 2:1 increase in editing difficulty. For the record, this datapoint included in the draft (!) analysis was due to faulty instrumentation. The correct numbers show only a marginally significant difference between VisualEditor and wikitext for an edit within 72 hours [1], with the caveats already given in my earlier response. Erik [1] https://meta.wikimedia.org/wiki/Research:VisualEditor%27s_effect_on_newly_registered_editors/Results#Editing_ease -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Mon, Jul 22, 2013 at 6:35 PM, James Forrester jforres...@wikimedia.org wrote: It would imply that Wikimedia thinks preference bloat is an appropriate way forward for expenditure of donor funds. This would be a lie. Each added preference adds to the complexity of our software - so increasing the cost and slowness of development and testing, and the difficulty of user support. I want to elaborate on this point a bit, because some of the complexity cost may have gotten lost in the discussion so far. It is true that providing a mechanism to hide all evidence of VisualEditor, as it currently exists, from the user interface entirely is utterly trivial, from a technical standpoint. However, it is important to note that VisualEditor is not purely a means of editing pages, but will also provide, in future, - a mechanism for quickly performing simple metadata manipulation (e.g. categories); - a subset of rich-text editing functionality for edit summaries, log entries, etc.; - a default interface for posting or replying to comments (in Flow); - etc. On the first point, right now, we're approaching categories and similar page metadata from the point of view of the editing surface as an entrypoint. This makes sense if you simply try to map all aspects of markup (which is inherently positional, even where it carries no positional value like categories) into an editing interface. VE is at least providing a Page Settings dialog that gets rid of the positional context for categories, etc. However, from a user's standpoint, it still doesn't make a ton of sense to do it that way. If I just want to add a category, I shouldn't have to invoke an editing surface at all. Similarly, if I want to turn a page into a redirect, I shouldn't have to edit the page at all. As most of you know, some gadgets like HotCat already operate on a similar principle. 
The VisualEditor team is going to revisit some of these types of edit operations from the standpoint of "what's the fastest and most intuitive way to perform this operation" rather than "how do we integrate this with the editing interface". So, when a user has disabled VisualEditor, should those affordances then also disappear, if they happen to be provided through the VisualEditor MediaWiki extension? Should VE be hidden from view in contexts where it could be safely and speedily initialized, on new content without the complexity of existing pages? As VisualEditor becomes more pervasive in the user experience, the complexity of maintaining a preference in a consistent and non-confusing manner will go up, and the cost of having users who could otherwise successfully use VE not see it will increase as well. Users who hate VE for editing articles with templates might not hate it for writing comments, but if they have that preference set, they might never see it for the latter use case. This is one other reason why we think it's preferable to focus on ensuring that the user experience _with VE present_ is minimally disruptive, rather than creating a preference that completely hides VE from view, and could in future be potentially misleading and/or harmful to the user experience. In other words, as we add VE in other contexts, we'll also want to make sure that source mode is easily accessible in all those contexts, and that there is always a default fallback on browsers where VE can't be used. I'm not saying that we can't find a compromise here - just that there's more long term complexity than one might see immediately. One compromise I could imagine is to offer a preference for the preferred _default_ editor, and honor it consistently (in the labeling of the tabs, in whatever mode gets primary presence in contexts where we can't offer a choice, etc.).
Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Tue, Jul 23, 2013 at 12:05 AM, Tim Starling tstarl...@wikimedia.org wrote: I tried editing [[Argentina]] on my laptop just now, it took 45 seconds of CPU time and 51 seconds of wall clock time before the percentage CPU usage began to drop. It's pretty slow. Yes, that's why I said performance on long pages can absolutely be prohibitively poor, and I would qualify this 150K document as such. :P About 30 seconds in Chrome on this system until I can start making formatting changes, BTW. For comparison, I'd also suggest copying the document into another rich-text editing environment and observing performance characteristics. Google Docs, which is generally regarded as state-of-the-art in this regard, took about 40 seconds (with a tab crashed warning) when attempting to paste this entire article in before becoming responsive (it is significantly more responsive than VE, although still sluggish, once the document is active). It also throws a warning about too many images. Point being, it's a legitimately hard problem. And, to be fair, the equivalent of performing document-level operations within wikitext (loading the whole page and previewing your changes before saving) isn't exactly lightning-fast. An AJAX live-preview on that page takes about 12 seconds to generate. Is there any estimate as to how much development time it might take to improve performance by an order of magnitude or so, as seems to be required? I'm not sure that goal is fully attainable, but I'd suggest folks from the VE team weigh in with some of their thoughts on performance strategies. As I understand it, one of the near term improvements is to target selective activation of the editing surface (in a manner that's transparent to the user) which could reduce CPU and memory footprint quite significantly for operations that don't span the entire document. 
Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Mon, Jul 22, 2013 at 9:51 AM, Tyler Romeo tylerro...@gmail.com wrote: Seriously, though, I understand why the VE team might want to force everybody to use VE That's a misrepresentation of the facts. We're not talking about forcing people to use VE. We're talking about whether there should be a preference to hide all aspects of VE from the user interface. The default behavior is that both modes coexist. Nobody is forced to use VE, and it adds minimal JavaScript footprint if it is not used. The default experience clearly still has room for improvement for users who prefer wikitext. In my view, in order of priority: 1) Section editing behavior - the hybrid link display is mildly annoying, and doesn't work for touch interfaces; 2) Inconsistent labels across namespaces - Edit source label should probably be consistently used; 3) Further tweaks to tab loading to minimize any delay in rendering. If these issues are addressed now, the only one that remains for users preferring wikitext is getting used to the presence of a new tab in the interface, which I do think is a reasonable change to not offer a specific preference for. (I'm not talking about VE edit bugs/problematic edits in this context as that's a separate issue.) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Remove 'visualeditor-enable' from $wgHiddenPrefs
On Mon, Jul 22, 2013 at 8:44 PM, Tim Starling tstarl...@wikimedia.org wrote: and the results from Aaron Halfaker's study [2] As noted at the top of the page, the analysis is still in progress. Importantly, there were many confounding variables in the test, some of which are already documented. This includes users being assigned to the test group that received VisualEditor whose browser did not properly support it (it would have literally just not worked if the user attempted to edit); these issues were fixed later. See https://meta.wikimedia.org/wiki/Research:VisualEditor%27s_effect_on_newly_registered_editors/Results#Limitations for some of these issues, but like I said, analysis is still in progress and we'll need to see what conclusions can actually be drawn from the data. A proponent of source editing would claim that the steep learning curve is justified by the end results. A visual editor is easier for new users, but perhaps less convenient for power users. So Aaron Halfaker's study took its measurements at the point in the learning curve where you would expect the benefit of VE to be most clear: the first edit. Actually, as noted in the draft, because the test group was assigned at the point of account creation, we're not taking into account any prior experience using wikitext as an IP editor. 59% of respondents in the 2011 editor survey stated that they had edited as IPs prior to making an account, so we should assume that this is not an insignificant proportion: https://meta.wikimedia.org/wiki/Research:Wikipedia_Editors_Survey_November_2011 Round-trip bugs If you have, like I have, spent hours looking at VisualEditor diffs, you'll know that these are relatively rare at this point. The bug category of round-trip bugs is sometimes used for issues that aren't accurately described this way, e.g. users typing wikitext into VisualEditor, having difficulty updating a template parameter, or accidentally deleting content (sometimes due to bugs in VE). 
Perhaps the main problem is performance. Perhaps new users are especially likely to quit on the first edit because they don't want to wait 25-30 seconds for the interface to load (the time reported in [3]). Performance is a very common complaint for established users also. You're quoting a user test from June 10 which was performed on the following page, which I've temporarily undeleted: https://www.mediawiki.org/wiki/Golden-Crowned_Sparrow Editing this page in Firefox on a 6-year-old system only slightly faster than the tester's specs today takes about 5 seconds to initialize. In Chrome it takes about 3 seconds, in the ballpark of reloading the page into the source editor. Note that Gabriel put major caching improvements into production around June 7, which may not have been in effect for this user / this page yet. Still, I think that the hypothesis that any actual negative impact of VE on new users is due to performance issues is very supportable. Performance is the single biggest issue overall for VE right now, and performance on long pages can absolutely be prohibitively poor. Improving it is the highest priority for the team. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] [Wmfall] Announcement: C. Scott Ananian joins Wikimedia as Senior Features Engineer
Scott - I'm really glad that you've joined WMF engineering! The work on Parsoid is groundbreaking. It will open the door to collaboration at a scale not seen before. And it will require contributors of your level of experience to pull it off. Thanks for coming on board -- I look forward to working with you. And thanks to Terry for investing in growing this mission-critical team. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation ___ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
[Wikitech-l] Welcome to Ken Snider, Wikimedia Operations
Hello all, I’m delighted to announce that Ken Snider is joining the Wikimedia operations team. He will start as an international contractor working remotely from Toronto, Canada on June 10, and will be visiting SF in the week of June 17. We’re currently in the process of seeking work authorization in the United States for the Director of TechOps position. CT has graciously agreed to support the ops leadership transition full-time through June, and part-time through July. We’ll be starting the handover while Ken is working remotely. A bit more about Ken: Ken was apparently genetically predisposed to become a sysadmin, since he joined one of Canada’s first large ISPs, Primus, straight out of school in 1997 and helped build their infrastructure until 2001. He then joined a startup called OpenCOLA in 2001, which was co-founded by Cory Doctorow and developed early P2P precursors to tools like BitTorrent and Steam. It’s best known today for the development of an open source (GPL’d) cola recipe which is still in use (more than 150,000 cans sold, if Wikipedia is to be believed). Ken got involved in one of Cory’s pet projects, BoingBoing.net, which some of you may have heard of ;-), and has been their sysadmin since 2003. After a stint from 2001-2005 at DataWire, Ken became Director of Tech Ops at Federated Media, a role he held from 2005-2012. Federated Media is an ad network that was founded to support high-traffic blogs and sites that want to stay independent of large publishers, with a network that supports more than 1B requests/day. One of the unusual challenges at FM was that the company grew through acquisitions of various blogging and publishing networks. This led to the challenge of integrating very heterogeneous operations and engineering infrastructure, including multiple geographically distributed ops teams and data-center locations. As DTO, Ken led these efforts, such as OS standardization, development of a unified deployment infrastructure, etc.
Ken also ensured that the operations group partnered effectively with the various engineering teams developing site features and enhancements. I want to again take this opportunity to thank CT Woo for his tireless operations leadership since December 2010. I’d also like to thank everyone who’s participated in the Director of TechOps search process. Please join me in welcoming Ken to the Wikimedia Foundation and the community. :-) All best, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
[Wikitech-l] Fwd: Engineering/Product Goals for 2013-14
FYI, for the sake of transparency: WMF engineering is kicking off its department-level goalsetting process and now is a good time to follow along. Some commitments have already been made for the purpose of the Annual Plan 2013-14 through team-level deliberation and are reflected as such, and some commitments are pre-existing (e.g. we'll need to continue to support Wikidata development on the WMF side; we're planning to ramp down the Tampa data-center), but a lot of details are open to being negotiated now, so feel free to add your thoughts on the talk page as well. And in general, as noted below, we're aiming for a more flexible planning process that reflects a general desire to be able to continuously adapt our objectives to changing circumstances and opportunities. So, very little is set in stone or closed to being revisited through the course of the year. Cheers, Erik -- Forwarded message -- From: Erik Moeller e...@wikimedia.org Date: Mon, Jun 3, 2013 at 3:56 PM Subject: Engineering/Product Goals for 2013-14 To: WMF Engineering/Product Dear all, as those of you who’ve worked on individual goals have seen, we’re only looking for focus areas and individual professional development goals through that part of the process. We’re aiming to separately develop a single goals document for all of engineering/product which breaks down planned team activity. It will live here: https://www.mediawiki.org/wiki/Wikimedia_Engineering/2013-14_Goals Important difference from last year: The template is much simpler. The idea is to provide a quarterly breakdown of planned activities plus a summary of the team and any dependencies on the rest of the organization. The reason for the simplicity is that we want to more explicitly iterate on this document through the year, so keeping it lightweight keeps the cost of change low. Your team can do this through quarterly review/planning cycles if you’re already following that model.
The only area where iteration is harder is the specific commitments that we put in the Annual Plan. These apply to Mobile, Editor Engagement (E2 & E3) and Visual Editor, and the respective teams are already aware of the Annual Plan commitments, which are reasonably high level. If we do end up needing to change any of them, we’ll need to notify the Board of such changes. This also means that at this point you’re only making your best guess as to what you’re going to work on through the year, and you shouldn’t panic about the level of precision. Obviously some facts are known well ahead of time (e.g. ramping down the Tampa DC) while others are much harder to pin down (e.g. what exciting mobile feature will we be working on in April 2014). It’s the responsibility of product managers (where applicable), technical leads and engineering managers to organize the development of this document through June/July. I’d like to have a complete first version no later than end of July. This should give us plenty of time. I hope the lightweight approach and the built-in assumption of continuous iteration will make this feel minimally burdensome and more like part of your normal day-to-day work. Let me know if you have any questions :) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] showing videos and images in modal viewers within articles
Yes, better support for display of images through a modal viewer would be great. I'm not sure a modal parameter that has to be explicitly set for files is the best approach - I would recommend optimizing the default experience when a user clicks an image or video. It's not clear that the current behavior based on a pixel threshold is actually desirable as the default behavior. (On a side note, the TMH behavior should be improved to actually play the video immediately, not require a second click to play in modal view.) Magnus Manske explored an alternative approach pretty extensively in response to the October 2011 Coding Challenge, which is worth taking a look at: https://www.mediawiki.org/wiki/User:Magnus_Manske/wikipic Cheers, Erik
[Wikitech-l] Fwd: Jared Zimmerman joins Wikimedia Foundation as Director of UX
FYI -- Forwarded message -- From: Erik Moeller e...@wikimedia.org Date: Mon, May 6, 2013 at 9:44 AM Subject: Jared Zimmerman joins Wikimedia Foundation as Director of UX To: wikimediaannounc...@lists.wikimedia.org Hi folks, It’s my great pleasure to announce that today, Jared Zimmerman will start as Wikimedia Foundation’s Director of User Experience. As UX Director, Jared will lead the design team and have a hands-on role on the team, contributing his own design work. It’s still a small team (Brandon, Vibha, May, and Pau), but we expect to hire an additional 3-4 designers in the coming 12-18 months. Prior to Wikimedia, Jared was Principal Interaction Designer at Autodesk, where he worked with engineers, visual artists, and user experience researchers to create new software solutions for architecture and design professionals, with an emphasis on AutoCAD for Mac and soon-to-be-released online design collaboration tools. Jared has led cross-disciplinary design teams in his roles at Autodesk, Ammunition Group, and iconmobile, including creative direction. At Autodesk, he was part of the transition to agile development and helped his design teams apply those principles to their work. During his time there he worked with design management to establish designers as product owners in the scrum process, to further integrate them into the development process from start to finish, as well as teaching his team best practices for use of agile design tools. Jared has degrees in Graphic Design (BGD) and Fine Art Photography (BFA) from the Rhode Island School of Design. His photography has been featured in publications such as Travel + Leisure Magazine, ZonaRetiro, and Huffington Post. In addition to starting in his new job, Jared is also planning his wedding in July to his fiancée Shannon.
[1] In his spare time Jared is wrapping up a remodel to their home, working on his first iPhone app, experimental cooking, photographing the bay area and abroad [2], and answering questions on Quora. [3] I look forward to Jared’s leadership in helping elevate a delightful, consistent and efficient user experience into a key measure of success for our work. Please join me in welcoming Jared! :-) Erik [1] http://shannonbadiee.com/ [2] http://www.flickr.com/photos/spoinknet/ [3] https://www.quora.com/Jared-Zimmerman/answers -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
Re: [Wikitech-l] Proposal: Let's move to a one-week deploy cycle
On Mon, May 6, 2013 at 7:20 PM, MZMcBride z...@mzmcbride.com wrote: The reason I ask about a distinction is that there have been a lot of changes to Wikimedia wikis lately and likely more to come, as the Wikimedia Foundation has gotten larger and has more dedicated tech resources. Overall, this is great. But big new features come with big changes, and these changes sometimes need a bit of breathing room. I've read a lot of pushback lately against rapid changes (usurping usernames, getting rid of the new message indicator, etc.). A lot of this seems mostly outside the scope of how often to deploy (and I don't want to shift the focus of this thread), but it gets confusing (to me, at least) to make a distinction between new code/features on Wikimedia wikis and how often to branch MediaWiki core/extensions. A lot of this could potentially be addressed in a consistent manner across wikis if we applied the alpha-beta-prod (or just beta-prod for starters) channel model that's used on the Wikimedia mobile sites. Then features (whether in core or extensions) could be flagged for alpha or beta readiness, and users would only get them if they've decided to opt into either of those channels. We could still flip the switch from beta to prod, but that decision could be decoupled from the weekly deployment cycle. This would likely be done for feature changes which have significant user-facing impact, and where segregation into on and off modes is possible (not always the case). We may want to consider at least putting some such scaffolding for beta/prod desktop modes into place before shifting to weekly deployments, although if that holds up this change significantly, I'd be in favor of making the shift first and then iterating. Right now we have lots of individual experimental prefs, some dark launch URL parameters (useNew=1 for the account creation forms etc.), some changes that are announced widely but then rolled out immediately (section edit link change), etc.
What would be the disadvantage of having a single "I'd like the latest and greatest changes once they come in" preference for our users? The main disadvantage I see is that we'd need to temporarily retain two codepaths for significant user-facing changes, potentially increasing code complexity a fair bit, but perhaps reducing post-launch cost in return. And we'd need to consider more carefully if/when to make the beta/prod switch -- not necessarily a bad thing. ;-) Have there been any negative experiences with this model on the mobile sites? Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
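To make the channel idea concrete, here is a minimal sketch of the alpha/beta/prod model described above — features are flagged with the least-stable channel they're ready for, and users opt into a channel. All feature and channel names here are invented for illustration; this is not actual MediaWiki configuration.

```python
# Channels ordered from least to most stable.
CHANNEL_ORDER = {"alpha": 0, "beta": 1, "prod": 2}

# Each feature is flagged with the channel it is currently ready for
# (hypothetical feature names, loosely echoing the examples in the thread).
FEATURE_READINESS = {
    "new-account-creation-form": "beta",
    "section-edit-link-change": "prod",
    "experimental-editor": "alpha",
}

def feature_enabled(feature: str, user_channel: str) -> bool:
    """A user sees a feature if their opt-in channel is at least as
    experimental as the channel the feature is flagged for."""
    readiness = FEATURE_READINESS.get(feature, "prod")
    return CHANNEL_ORDER[user_channel] <= CHANNEL_ORDER[readiness]

# A prod user only sees fully launched features; a beta user also sees
# beta-flagged ones; an alpha user sees everything.
assert feature_enabled("section-edit-link-change", "prod")
assert not feature_enabled("new-account-creation-form", "prod")
assert feature_enabled("new-account-creation-form", "beta")
assert feature_enabled("experimental-editor", "alpha")
```

Flipping a feature from beta to prod is then just a one-line change to its readiness flag, decoupled from the deployment cycle — which is the decoupling the thread is after.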
Re: [Wikitech-l] Support for multiple content languages in MW core
On Tue, Apr 23, 2013 at 9:08 PM, Brian Wolff bawo...@gmail.com wrote: We already have the page lang support. Hi Brian, what do you mean by that? AFAICT there's no existing designated place in the schema for associating a content language with a specific page. Thanks, Erik
Re: [Wikitech-l] Support for multiple content languages in MW core
On Tue, Apr 23, 2013 at 10:00 PM, MZMcBride z...@mzmcbride.com wrote: I'm not sure I'd call what you're proposing a major architectural undertaking, though perhaps I'm defining a much narrower problem scope. Yeah. A lot depends on whether or not we want language to be a first-class citizen at the same level as a namespace throughout MediaWiki, for an installation that contains multiple languages. So for example, should various special pages that currently offer namespace filters also offer language filters? Should page uniqueness be constrained by title, namespace and language, as opposed to title and namespace as it is today? One could make the case that not offering a lot of filtering by language is OK for multilingual wikis, since one of the conscious choices when setting up a wiki that way is that languages are precisely not going to be segregated, and the boundaries between language content are going to be fairly fluid compared with, say, the setup used for Wikipedia. I do think it's worth talking about the user experience benefits of either approach, but clearly a fair bit could be achieved by just improving the user experience around the most basic interactions in navigation and page creation. Still, at a most basic level, it'd be nice to have at least a standard approach for title disambiguation, so folks don't have to manually figure out how to distinguish the Spanish Portada from the Catalan Portada every time that type of issue arises. The common approach of just picking an English word or a language suffix has its own issues, so perhaps the software could intelligently follow a standard disambiguation convention, e.g. adding a suffix but only if required. Erik
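As an illustration of the "add a suffix, but only if required" convention suggested above, here is a minimal sketch. The function name and the slash-suffix scheme are hypothetical, not actual MediaWiki behavior.

```python
def disambiguated_title(title: str, lang: str, existing: set) -> str:
    """Return the plain title if it is free; otherwise append a language
    suffix, so e.g. the Spanish and Catalan 'Portada' can coexist."""
    if title not in existing:
        return title
    return f"{title}/{lang}"

# Suppose the Spanish main page already holds the bare title.
existing_pages = {"Portada"}

# The first page to claim a title keeps it unsuffixed...
assert disambiguated_title("Hauptseite", "de", existing_pages) == "Hauptseite"
# ...while a colliding title gets a language suffix automatically.
assert disambiguated_title("Portada", "ca", existing_pages) == "Portada/ca"
```

The point of the sketch is that the convention is applied by the software, consistently, rather than improvised by each editor who hits a collision.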
[Wikitech-l] Support for multiple content languages in MW core
Hi folks, I'd like to start a broader conversation about language support in MW core, and the potential need to re-think some pretty fundamental design decisions in MediaWiki if we want to move past the point of diminishing returns in some language-related improvements. In a nutshell, is it time to make MW aware of multiple content languages in a single wiki? If so, how would we go about it? Hypothesis: Because support for multiple languages existing in a single wiki is mostly handled through JS hacks, templates, and manual markup added to the content (such as divs indicating language direction), we are providing an opaque, confusing and often inconsistent user experience in our multilingual wikis, which is a major impediment for growth of non-English content in those wikis, and participation by contributors who are not English speakers. Categories have long been called out as one of the biggest factors, and they certainly are (since Commons categories are largely in English, they by definition exclude folks who don't speak the language), but I'd like to focus on the non-category parts of the problem for the purposes of this conversation. Support for the hypothesis (please correct misconceptions or errors): 1) There's no consistent method by which multiple language editions of the same page are surfaced for selection by the user. Different wikis use different templates (often multiple variants and layouts in a single wiki), different positioning, different rules, etc., leading to inconsistent user experience. Consistency is offered by language headers generated by the Translate extension, but these are used for managing translations, while multilingual content existing in the same wiki may often not take the form of 1:1 translations.
Moreover, language headers have to be manually updated/maintained; consider the user-friendliness of something like the +/- link in the language header on a page like https://commons.wikimedia.org/wiki/Commons:Kooperationen which leads to: https://commons.wikimedia.org/w/index.php?title=Template:Lang-Partnerships&action=edit Chances are that a lot of people who'd have the ability to provide a version (not necessarily a translation) of the page in a given language will give up even on the process of doing so correctly. 2) There's no consistent method by which page name conflicts (which may often occur in similar languages) are resolved, and users have to manually disambiguate. 3) There are basic UX issues in the language selection tools offered today. For example, after changing the language on Commons to German, I will see the page I'm on (say English) with a German user interface, even if there's an actual German content version of the page available. This is because these language selection tools have no awareness of the existence of content in relevant languages. 4) In order to ensure that content is rendered correctly irrespective of the UI language set, we require content authors to manually add divs around RTL content, even if that's all the page contains. 5) It's impossible to restrict searches to a specific language. It's impossible to restrict recent changes and similar tools to a specific language. I'll stop there - I'm sure you can think of other issues with the current approach. For third party users, the effort of replicating something like the semi-acceptable Commons or Meta user experience is pretty significant, as well, due to the large number of templates and local hacks employed. This is a very tricky set of architectural issues to solve well, and it would be easy to make the user experience worse by solving it poorly.
Still, as we grow our bench strength to take on hard problems, I want to raise the temperature of this problem a bit again, especially from the standpoint of future platform engineering improvements. Would it make sense to add a language property to pages, so it can be used to solve a lot of the above issues, and provide appropriate and consistent user experience built on them? (Keeping in mind that some pages would be multilingual and would need to be identified as such.) If so, this seems like a major architectural undertaking that should only be taken on as a partnership between domain experts (site and platform architecture, language engineering, Visual Editor/Parsoid, etc.). I'm not suggesting this should be done in the very near term, but I'd like to at least start talking about it, hear if I'm completely off base (and if there are simpler ways to improve on current state), and explore where it could fit in our longer term agenda. Relevant existing code: * https://www.mediawiki.org/wiki/Extension:Translate - awesome for page and message translation, but I'm not clear that it can help for the other multilingual content scenarios and problems * Others: https://www.mediawiki.org/wiki/Category:Internationalization_extensions Thanks, Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation
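To make the proposed language property concrete, here is a minimal sketch of a page model where language is part of a page's identity — so uniqueness can be keyed on (namespace, title, language) and language filtering (issue 5 above) becomes trivial. The class and field names are hypothetical, not the actual MediaWiki schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Page:
    namespace: int
    title: str
    lang: str  # the proposed per-page content language property

pages = [
    Page(0, "Portada", "es"),
    Page(0, "Portada", "ca"),  # same title, distinct because lang differs
    Page(0, "Hauptseite", "de"),
]

# Uniqueness constrained by (namespace, title, lang) instead of just
# (namespace, title): the two 'Portada' pages no longer collide.
assert len({(p.namespace, p.title, p.lang) for p in pages}) == 3

# A language filter for search or recent-changes-style listings:
def filter_by_lang(pages, lang):
    return [p for p in pages if p.lang == lang]

assert [p.title for p in filter_by_lang(pages, "ca")] == ["Portada"]
```

Pages that are genuinely multilingual would still need special handling (e.g. a sentinel language value), which is the caveat raised in the mail above.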
Re: [Wikitech-l] Proposal: Wikitech contributors
On Wed, Apr 3, 2013 at 3:06 PM, Steven Walling steven.wall...@gmail.com wrote: The best way to approach a project like this is not to propose an up-front migration of an entire wiki to a new piece of software, just to prototype a few new features. I think the potential migration of content to wikitech and the potential use of certain MW extensions to improve the user experience are legitimately separate issues. If the migration is merited, it is likely merited irrespective of whether we use SocialProfile, LQT, SMW, SMF, etc. We could perform a large migration of content without using any of them, or we could experiment with these extensions without/before migrating any content. I'm ambivalent about the migration of content. I'm not very fond of the current division between dev-ops contributors (wikitech) and everything else (mediawiki.org), which reinforces barriers between the two worlds. Those are the barriers that Labs was designed to tear down, empowering technical contributors to prototype their changes easily, and to get them ready for large-scale usage on Wikimedia or other production sites. Having all technical contributors directed to wikitech.wikimedia.org would address that - it would introduce them to a magical world of dev-ops unicorns and PHP rainbows at the same time. And having mediawiki.org more clearly dedicated to the product would allow it to shine more brightly in its own sunflower-y colors. At the same time, the amount of wiki-ping-pong we're playing with technically interested users could very well increase significantly as a result. Right now, wikitech.wikimedia.org is relatively quiet, with changes typically either being made by the Wikimedia ops team or by Labs users. It simply stands to reason that if we distribute a lot of content from a large wiki to a much smaller one, the number of times that you'll have to go back and forth between the two to find what you're looking for will increase. API docs? Over here. Status update? Over there. 
Extension installation docs? Over here. Specs related to the same extension? Over there. Ping, pong. Ping, pong. The divisions may seem logical to us, but for the confused technical contributor, things could easily get a lot worse. If feasible, I would at the end of the day still argue in favor of a single consolidated technical wiki. I realize calling that wiki mediawiki.org is not ideal, but beyond the domain name, a lot can be done to provide reasonable divisions (namespaces, navigation, etc.), so that MediaWiki the product and other Wikimedia technical projects and processes are clearly distinct. Let's also not forget that we'll have future potential for new MediaWiki-related technical contribution that actually would fit very nicely under the mediawiki.org umbrella (e.g. Lua script repository, gadget repository). If we do go forward with the migration to wikitech.wikimedia.org, I would argue in favor of largely depleting mediawiki.org of content except for clearly necessary end user documentation and support pages, to minimize ping pong effects. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Re: [Wikitech-l] [Wmfall] Yuri Astrakhan & Adam Baso join Mobile department partner team
On Mon, Mar 18, 2013 at 10:29 AM, Tomasz Finc tf...@wikimedia.org wrote: I'm pleased to announce that the mobile department has two new staff members. Yuri Astrakhan & Adam Baso join as sr. software developers on the mobile partner team. Welcome on board, guys. Really looking forward to the next steps with WP Zero. :-) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Re: [Wikitech-l] Github/Gerrit mirroring
On Thu, Mar 14, 2013 at 7:46 PM, Juliusz Gonera jgon...@wikimedia.org wrote: I wouldn't be that optimistic, maybe it would slightly increase. Having an account is one of the factors but I wouldn't underestimate user friendliness. The first time I tried to find the URL to clone a repo in gerrit it took me probably around a minute. On GitHub it probably took me 5 seconds. And I wouldn't be too quick to celebrate the increased vendor lock-in of a large percentage of the open source community into an ecosystem of partially proprietary tools and services (the GitHub engine itself, the official GitHub applications, etc.). Gerrit and other open source git repo management and code review tools are one of the best hopes for the development of a viable alternative. Unlike GitHub, Gerrit can be improved by its users over time, and the issues that frustrate and annoy us about it _can_ be fixed (and indeed, many have been). Yes to better pull request management from GitHub. But let's stop complaining about Gerrit, and instead get both functionality and UX issues into their bug tracker, and help get them fixed. Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Re: [Wikitech-l] QUnit testing in Jenkins
On Mon, Mar 4, 2013 at 8:07 PM, Ori Livneh o...@wikimedia.org wrote: Today I sprinted to pick up QUnit testing in Jenkins and get it stabilised and deployed. This is fantastic. Thanks, Timo. Indeed - this is a great milestone. Thanks for all your work getting this out the door, Timo! :-) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Re: [Wikitech-l] Reminder about the best way to link to bugs in commits
On Sun, Mar 3, 2013 at 6:06 PM, Brian Wolff bawo...@gmail.com wrote: Personally I prefer it in the first line. Second to a good one-line summary of what was done, the bug number is the next most important thing. It allows one to see the context the commit was made in. Having it in the first line allows one to find it easily and have it displayed in various one-line log formats (including in Gerrit when you get a list of commits). Yeah, that's my perspective as a user of this info as well. Having the bug numbers visible in Gerrit's list views is pretty handy for me (while I doubt I'd personally use the Bug:# search much, which is not to say it's not useful). That said, most of the time, the bug's also in the topic, so it's not a huge deal, and I promise this will be my last response in this thread. :P Although .. perhaps in some magical future the bug # could be displayed and clickable in a separate list view column if the Bug: field is set? ;-) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
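For concreteness, the two placements under discussion look roughly like this (summary and bug number invented for illustration):

```text
Fix section edit link behavior (bug 12345)   <- first-line placement: visible
                                                in one-line log/list views

Longer description of the change goes here.

Bug: 12345                                   <- footer placement: a machine-
                                                parseable field for indexing
```

The thread's suggestion is essentially to get both: keep the number in the summary line for humans skimming logs, and let the Bug: footer drive search and any future clickable column.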
Re: [Wikitech-l] Reminder about the best way to link to bugs in commits
On Fri, Mar 1, 2013 at 2:20 PM, Jon Robson jdlrob...@gmail.com wrote: I actually prefer bug numbers in the header. +1, also useful for release notes. Could the footer line be auto-generated for indexing purposes? Yay for bikeshed topics ;-) -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
Re: [Wikitech-l] Welcome, Munagala Ramanath (Ram)
On Tue, Jan 15, 2013 at 12:08 PM, Rob Lanphier ro...@wikimedia.org wrote: started yesterday as a Senior Software Engineer in our Platform Engineering group (MediaWiki Core, specifically). Welcome on-board, Ram :-). Looking forward to your efforts on search, which is in desperate need of love and attention. ;) Erik -- Erik Möller VP of Engineering and Product Development, Wikimedia Foundation Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate