Re: [Wikitech-l] MediaWiki 2.0 (was Fwd: No more Architecture Committee?)

2015-01-16 Thread Ori Livneh
On Thu, Jan 15, 2015 at 11:51 PM, Daniel Friesen dan...@nadir-seen-fire.com
 wrote:

 Any such discussion should probably start with this:
 https://www.mediawiki.org/wiki/MediaWiki_2.0


The page is an outline of a plan. It sounds pretty good to me. Whatever
happened to that initiative? Did it just fizzle out, or were there
substantive disagreements to work out?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Antoine Musso
Le 16/01/2015 05:26, Chad a écrit :
 Agreed. Although it means we'll never be able to use 2.0 because
 2.0 from 1.x has total rewrite implications even if it's not true.
 
 I've been saying for over a year now we should just drop the 1. from
 the 1.x.y release versions. So the next release would be 25.0, 26.0,
 etc etc.

That would be on par with semver. But since we are more or less
continuously releasing, I would even go so far as to use year/month as
the version, i.e. 2015.01.
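Antoine's year/month suggestion can be sketched as follows; a minimal illustration (not an actual MediaWiki release tool) showing that date-based versions still sort correctly:

```python
# A sketch of the year/month idea: derive a YYYY.MM version string and
# show that such versions still order correctly. Illustration only.
from datetime import date

def calver(d):
    """Return a YYYY.MM version string for a release cut on date d."""
    return f"{d.year:04d}.{d.month:02d}"

def sort_key(v):
    """Sort key so '2014.12' < '2015.01' < '2015.06' numerically."""
    year, month = v.split(".")
    return (int(year), int(month))

releases = ["2015.01", "2014.12", "2015.06"]
print(calver(date(2015, 1, 16)))
print(sorted(releases, key=sort_key))  # oldest to newest
```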

We are getting off-topic, though.

-- 
Antoine hashar Musso



Re: [Wikitech-l] Fun with code coverage

2015-01-16 Thread Antoine Musso
Le 15/01/2015 18:31, Legoktm a écrit :
 On 01/14/2015 04:57 PM, James Douglas wrote:
 I'd love to hear your thoughts and learn about your related experiences.
 What are your favorite code coverage tools and services?
 
 PHPUnit has a useful code coverage tool and there are reports running
 for core[1] and some extensions[2]. In my experience it's extremely CPU
 intensive and slow, so it's not something that is convenient to run
 before every commit.
 
 [1] https://integration.wikimedia.org/cover/mediawiki-core/master/php/
 [2] https://tools.wmflabs.org/coverage/

Hello,

The core coverage job is run using Zend and xdebug.  It would probably
be faster using HHVM and XHProf.  PHPUnit has a listener plugin for
XHProf at
https://github.com/sebastianbergmann/phpunit-testlistener-xhprof
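Whichever backend collects it (xdebug, XHProf), a line-coverage job records the same thing: which lines ran and which did not. As a language-neutral sketch of that idea only (not the actual Jenkins/PHPUnit setup), Python's stdlib `trace` module can show it:

```python
# Sketch of what a line-coverage tool records: per-line hit counts.
# Illustration only; not the MediaWiki coverage job itself.
import trace

def classify(n):
    if n < 0:
        return "negative"   # untaken branch: this line stays uncovered
    return "non-negative"

tracer = trace.Trace(count=True, trace=False)
tracer.runfunc(classify, 5)

# counts maps (filename, lineno) -> hit count; lines with no entry did
# not execute, which is what a coverage report highlights as uncovered.
executed = {lineno for (_f, lineno), _c in tracer.results().counts.items()}
print(len(executed), "distinct lines executed")
```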

If someone can craft an entry point in mediawiki/core's composer.json
and add the relevant dependencies, that would make it easy for me to
migrate the Jenkins job to it.
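A sketch of what such a composer.json entry point might look like; the script name, dev-dependency names, and constraints below are guesses for illustration, not verified Packagist identifiers:

```json
{
    "require-dev": {
        "phpunit/phpunit": "*",
        "sebastianbergmann/phpunit-testlistener-xhprof": "*"
    },
    "scripts": {
        "coverage": "phpunit --coverage-html coverage/"
    }
}
```

With something like this in place, the Jenkins job could just run `composer coverage` instead of hand-assembling the PHPUnit invocation.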

cheers,

-- 
Antoine hashar Musso



Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread S Page

 One of the bigger questions I have about the potential shift to requiring
 services is the fate of shared hosting deployments of MediaWiki.


Seems like an opportunity: deploy your MediaWiki-Vagrant instance to
cloud infrastructure cheaply.  It's not $2/month, but Digital Ocean can
host a 1GB VM for $10/month. Maybe https://atlas.hashicorp.com/ will
host Vagrant boxes more cheaply.

Docker allows many Linux containers on the same physical system, which
should lead to lower pricing. Some MediaWiki Dockerfiles install Parsoid
and Node.js as well as MediaWiki and PHP, e.g. CannyComputing's
https://github.com/CannyComputing/frontend-mediawiki , or you'd have a
separate container for each service. I can't figure out the pricing.

--
=S Page  WMF Tech writer

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread David Gerard
On 16 January 2015 at 07:38, Chad innocentkil...@gmail.com wrote:

 These days I'm not convinced it's our job to support every possible
 scale of wiki install. There's several simpler and smaller wiki solutions
 for people who can't do more than FTP a few files to a folder.


In this case the problem is leaving users and their wiki content
unsupported, because they won't move while it works, even as it becomes
a Swiss cheese of security holes.  Their content is important to them.

This is the part of the mission that involves everyone else producing
the sum of human knowledge. They used our software; if we're abandoning
them, then don't pretend there's a viable alternative for them. You know
there isn't.


- d.


Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Mark A. Hershberger
Ori Livneh o...@wikimedia.org writes:

 The model I do think we should consider is Python 3. Python 3 did not
 jettison the Python 2 codebase. The intent behind the major version change
 was to open up a parallel development track in which it was permissible to
 break backward-compatibility in the name of making a substantial
 contribution to the coherence, elegance and utility of the language.

I like the idea, but this makes it sound like we have some commitment
in the current codebase to backwards compatibility.

Currently, though, just as RobLa points out that there is no clear
vision for the future, there is no clear mandate to support interfaces,
or what we usually call backwards compatibility.

So, yes, let's have a parallel MW 2.0 development track that will allow
developers to try out new things.  But let it be accompanied by a MW
1.0 track that makes stability a priority.

Now, the question is: what will Wikipedia run, MW 2.0 or MW 1.0?  And,
if the WMF devs focus on MW 2.0 (my sense is that is where they will
want to be), then how do those of us with more conservative clients keep
MW 1.0 viable?

Mark.

-- 
Mark A. Hershberger
NicheWork LLC
717-271-1084


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Marc A. Pelletier

On 15-01-16 02:48 AM, Max Semenik wrote:

Can't agree on this, as the number covers [...]


An extra data point here: I think I can reasonably consider myself an
atypical user at the tail end of the sysadmin curve, and yet the
principal reason why I had delayed installing VisualEditor on a
MediaWiki I administer is Parsoid.


Not because it's unusually complicated to deploy and install - it's
about as trivial as possible, under Ubuntu at least, given that there is
a package - but because it changes a number of fundamental assumptions
about what a MediaWiki install /is/ and what maintaining it involves.


A service-based architecture has implications for reliability and
security, and - no matter how well we manage those implications -
services add moving parts and complexity.  They increase the monitoring
burden, and they increase the specialization and knowledge necessary to
understand and debug issues.


I'm not saying any of those things are /bad/ or insurmountable, but they
modify the landscape in pretty drastic ways, and this /will/ impact the
vast majority of users.


And I can honestly say that if the intent is to maintain a parallel
working implementation in pure PHP then - with my reuser hat on - I
don't believe a word of it.  :-)  I happen to have been a long-time
Postgres user, and yet when 1.23 rolled around and I wanted to start
playing around with VE, I found that my only realistic option was to
migrate to MySQL: Postgres support had always been sporadic and
not-quite-there, and the increased number of issues that came with the
increased complexity moved it squarely into not-maintained-enough
territory.


So yeah; there *is* a very significant impact.

-- Marc



Re: [Wikitech-l] No more Architecture Committee?

2015-01-16 Thread Federico Leva (Nemo)

 [...] Perhaps people
 who get most of their news from this mailing list are [...]

Why does this matter? What other sources are there?

It seems you're saying there is a divide between the main development 
community (on this list) and some other unspecified group which 
communicates by some other unspecified means. If so, what you need is 
not a dictator or an architect or a process, it's a reunification.


Whoever has knowledge of such a divide can help with simple steps, in 
particular by publishing a map outlining this divide on a 
mediawiki.org page.


 [...] the vast majority
 of the WMF staff developers, even if they were unpopular here on this
 list?  Given our staff/not ratio [...]

See above.

 [...] It could be that the general pessimism about the direction
 of MediaWiki (or lack thereof) is not shared out here. [...]

See above.

 have their own ideas about
 what things should be priorities, but have no expectation that those
 ideas will be considered for resourcing.

The solution here seems rather straightforward, if not simple: ensure
the architecture committee actually feels empowered.  The WMF could
promise not to invest resources in something that the committee has
officially declared a bad idea, for instance.


Nemo


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Mark A. Hershberger
Trevor Parscal tpars...@wikimedia.org writes:

 95% is pretty extreme.

Not according to WikiApiary.  The largest host for wikis, even before
Wikimedia, is DreamHost, followed closely by GoDaddy and other shared
hosters such as 1&1.[1]

Yes, Amazon and Linode are also near the top of the list, but from
experience, I can tell you that most of those aren't using anything but
PHP, and, if you're lucky, the Math and Scribunto extensions.

Even the most asked-for service of the SOA -- Parsoid for VisualEditor
-- is deployed almost (but not quite) exclusively on WMF wikis.[2]

Developers within an organisation like the WMF don't have to worry about
what it takes to stand up and maintain a wiki -- they have Operations
for that.

Most people running wikis based on MediaWiki don't have a budget or a
staff exclusively for their wiki.  They don't have the resources or
knowledge to run a dedicated server.

There are things that can be done to address this -- making it easier to
set up an SOA-using wiki on your chosen virtual machine provider -- but
they need dedicated resources.

This probably means a really dedicated (group of) volunteer(s), or a new
focus by the WMF on non-WMF use of MediaWiki.  This seems to be the
direct result of the decisions made by developers who've implemented
some very compelling features without (as RobLa seemed to say recently)
any overall architectural direction.

Mark.



Footnotes: 
[1]  https://wikiapiary.com/wiki/Host:Hosts

[2]  https://wikiapiary.com/wiki/Extension:VisualEditor

-- 
Mark A. Hershberger
NicheWork LLC
717-271-1084


Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Brian Wolff
On Jan 16, 2015 9:21 AM, Mark A. Hershberger m...@nichework.com wrote:

 Ori Livneh o...@wikimedia.org writes:

  The model I do think we should consider is Python 3. Python 3 did not
  jettison the Python 2 codebase. The intent behind the major version
change
  was to open up a parallel development track in which it was permissible
to
  break backward-compatibility in the name of making a substantial
  contribution to the coherence, elegance and utility of the language.

 I like the idea, but this makes it sound like we have some commitment
 in the current codebase to backwards compatibility.

 Currently, though, just as Robla points out that there is no clear
 vision for the future, there is no clear mandate to support interfaces,
 or what we usually call backwards compatibility.

 So, yes, let's have a parallel MW 2.0 development track that will allow
 developers to try out new things.  But let that be accompanied with a MW
 1.0 track so that makes stability a priority.

 Now, the question is: what will Wikipedia run: MW 2.0 or MW 1.0?  And,
 if they focus on MW 2.0 (My sense is that is where the WMF devs will
 want to be), then how do those of us with more conservative clients keep
 MW 1.0 viable?

 Mark.

 --
 Mark A. Hershberger
 NicheWork LLC
 717-271-1084



This seems like a solution in search of a problem. Does anyone actually
have anything they want that is difficult to do currently and requires a
mass compat break? Proposing to rewrite MediaWiki because we can,
without even a notion of what we would want to do differently, seems
silly.

--bawolff

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brad Jorsch (Anomie)
On Fri, Jan 16, 2015 at 2:49 AM, Stas Malyshev smalys...@wikimedia.org
wrote:

 However, with clear API architecture we could maybe have
 alternatives - i.e. be able to have the same service performed by a
 superpowered cluster or by a PHP implementation on the URL on the same
 host.


The problem there is that the PHP implementation is likely to code-rot
unless we create really good unit tests that actually run against it.

It would be less bad if the focus were on librarization and daemonization
(and improvement) of the existing PHP code so both methods could share a
majority of the code base, rather than rewriting things from scratch in an
entirely different language.
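Brad's librarization point can be sketched like this: one shared implementation, exposed both in-process (shared hosting) and behind a daemon (the cluster), so the fallback is the same code rather than a parallel rewrite. All names here (`transform`, `Backend`, and so on) are invented for illustration, and the "daemon" path is simulated locally to keep the sketch self-contained:

```python
# One librarized core, two deployment modes sharing it. Illustration only.
from abc import ABC, abstractmethod

def transform(text: str) -> str:
    # The single shared library implementation both backends reuse.
    return text.upper()

class Backend(ABC):
    @abstractmethod
    def run(self, text: str) -> str: ...

class InProcessBackend(Backend):
    # Shared-hosting path: call the library directly, no extra services.
    def run(self, text: str) -> str:
        return transform(text)

class DaemonBackend(Backend):
    # Cluster path: in reality an HTTP/RPC round trip to a service that
    # wraps the *same* transform(); simulated here so the sketch runs.
    def run(self, text: str) -> str:
        return transform(text)

for backend in (InProcessBackend(), DaemonBackend()):
    assert backend.run("wiki") == "WIKI"
print("both backends share one implementation")
```

Because both backends call the same library function, unit tests exercising `transform()` cover the shared-hosting path and the cluster path at once, which is exactly the defence against code-rot described above.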

and maybe reduced feature set.


That becomes a hazard when other stuff starts to depend on the non-reduced
feature set.


-- 
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brian Wolff
On Jan 16, 2015 11:07 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:

 On Fri, Jan 16, 2015 at 2:49 AM, Stas Malyshev smalys...@wikimedia.org
 wrote:

  However, with clear API architecture we could maybe have
  alternatives - i.e. be able to have the same service performed by a
  superpowered cluster or by a PHP implementation on the URL on the same
  host.


 The problem there is that the PHP implementation is likely to code-rot,
 unless we create really good unit tests that actually run for it.

 It would be less bad if the focus were on librarization and daemonization
 (and improvement) of the existing PHP code so both methods could share a
 majority of the code base, rather than rewriting things from scratch in an
 entirely different language.

 and maybe reduced feature set.


 That becomes a hazard when other stuff starts to depend on the non-reduced
 feature set.


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation


I like this approach. After all, it worked well enough for image
thumbnailing and the job queue.

Writing things in different languages just reminds me of what a mess math
support used to be.

--bawolff

Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Mark A. Hershberger
Brian Wolff bawo...@gmail.com writes:

 Does anyone actually have anything they want that is difficult to do
 currently and requires a mass compat break?

I haven't heard of any mass compat break items.  But I have seen
developers who aren't worried about compatibility or continuity of
features, since they only have to worry about the WMF cluster: issues
of compatibility are handled there, or the features simply aren't used
by the WMF.


-- 
Mark A. Hershberger
NicheWork LLC
717-271-1084


[Wikitech-l] [CfP] Linked Data Quality #LDQ2015 Call for Papers

2015-01-16 Thread Dimitris Kontokostas

LDQ 2015 CALL FOR PAPERS

2nd Workshop on Linked Data Quality
co-located with ESWC 2015, Portorož, Slovenia

May 31 or June 1, 2015
http://ldq.semanticmultimedia.org/

Important Dates
 * Submission of research papers: March 6, 2015
 * Notification of paper acceptance: April 3, 2015
 * Submission of camera-ready papers: April 17, 2015

Since the start of the Linked Open Data (LOD) Cloud, we have seen an 
unprecedented volume of structured data published on the web, in most 
cases as RDF and Linked (Open) Data. The integration across this LOD 
Cloud, however, is hampered by the ‘publish first, refine later’ 
philosophy. This is due to various quality problems existing in the 
published data such as incompleteness, inconsistency, 
incomprehensibility, etc. These problems affect every application 
domain, be it scientific (e.g., life science, environment), 
governmental, or industrial applications.


We see linked datasets originating from crowdsourced content like 
Wikipedia and OpenStreetMap such as DBpedia and LinkedGeoData and also 
from highly curated sources e.g. from the library domain. Quality is 
defined as “fitness for use”, thus DBpedia currently can be appropriate 
for a simple end-user application but could never be used in the medical 
domain for treatment decisions. However, quality is key to the success
of the data web, and poor quality is a major barrier for further
industry adoption.


Despite the quality in Linked Data being an essential concept, few 
efforts are currently available to standardize how data quality tracking 
and assurance should be implemented. Particularly in Linked Data, 
ensuring data quality is a challenge as it involves a set of 
autonomously evolving data sources. Additionally, detecting the quality 
of datasets available and making the information explicit is yet another 
challenge. This includes the (semi-)automatic identification of 
problems. Moreover, none of the current approaches uses the assessment 
to ultimately improve the quality of the underlying dataset.


The goal of the Workshop on Linked Data Quality is to raise the 
awareness of quality issues in Linked Data and to promote approaches to 
assess, monitor, maintain and improve Linked Data quality.


The workshop topics include, but are not limited to:
 * Concepts
    - Quality modeling vocabularies
 * Quality assessment
    - Methodologies
    - Frameworks for quality testing and evaluation
    - Inconsistency detection
    - Tools/Data validators
 * Quality improvement
    - Refinement techniques for Linked Datasets
    - Linked Data cleansing
    - Error correction
    - Tools
 * Quality of ontologies
 * Reputation and trustworthiness of web resources
 * Best practices for Linked Data management
 * User experience, empirical studies

Submission guidelines
We seek novel technical research papers in the context of Linked Data 
Quality, with a length of up to 8 pages for long papers and 4 pages for 
short papers. Papers should be submitted in PDF format. Other 
supplementary formats (e.g. HTML) are also accepted, but a PDF version 
is required. 
Paper submissions must be formatted in the style of the Springer 
Publications format for Lecture Notes in Computer Science (LNCS). Please 
submit your paper via EasyChair at 
https://easychair.org/conferences/?conf=ldq2015. Submissions that do not 
comply with the formatting of LNCS or that exceed the page limit will be 
rejected without review. We note that the author list does not need to 
be anonymized, as we do not have a double-blind review process in place. 
Submissions will be peer reviewed by three independent reviewers. 
Accepted papers have to be presented at the workshop.


Important Dates
All deadlines are, unless otherwise stated, at 23:59 Hawaii time.
 * Submission of research papers: March 6, 2015
 * Notification of paper acceptance: April 3, 2015
 * Submission of camera-ready papers: April 17, 2015
 * Workshop date: May 31 or June 1, 2015 (half-day)

Organizing Committee
 * Anisa Rula – University of Milano-Bicocca, IT
 * Amrapali Zaveri – AKSW, University of Leipzig, DE
 * Magnus Knuth – Hasso Plattner Institute, University of Potsdam, DE
 * Dimitris Kontokostas – AKSW, University of Leipzig, DE

Program Committee
 * Maribel Acosta – Karlsruhe Institute of Technology, AIFB, DE
 * Mathieu d’Aquin – Knowledge Media Institute, The Open University, UK
 * Volha Bryl – University of Mannheim, DE
 * Ioannis Chrysakis – ICS FORTH, GR
 * Jeremy Debattista – University of Bonn, Fraunhofer IAIS, DE
 * Stefan Dietze – L3S, DE
 * Suzanne Embury – University of Manchester, UK
 * Christian Fürber – Information Quality Institute GmbH, DE
 * Jose Emilio Labra Gayo – University of Oviedo, ES
 * Markus Graube – Technische Universität Dresden, DE
 * Maristella Matera – Politecnico di Milano, IT
 * John McCrae – CITEC, University of Bielefeld, DE
 * Felix Naumann – Hasso Plattner Institute, DE
 * Matteo Palmonari – University of Milan-Bicocca, IT
 * Heiko Paulheim – University of Mannheim, DE
 

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Antoine Musso
Le 16/01/2015 07:38, Bryan Davis a écrit :
 There has been a lot of talk over the last year or so of how and when
 to move MediaWiki to a service oriented architecture [0]. So much so
 that it is actually one of the marquee topics at the upcoming
 Developer Summit.
snip

Hello,

Moving to a service-oriented architecture will certainly ease Wikimedia
maintenance, or at least delegate maintenance authority to independent
sub-teams, which would be rather nice.

The monolithic approach of MediaWiki, with all services / classes being
tightly coupled, makes the platform rather hard to build upon.  It
certainly lacks clear contracts between components.


Can the SOA model be supported on shared hosts or by third parties?
Surely. But someone has to handle the integration and release work to
make it easy to install and configure all the components.  Is Wikimedia
going to handle it? I doubt it.


The underlying choice is:

a) Wikimedia keeps being slowed down by the claim that MediaWiki is
suitable for others, or

b) Wikimedia reclaims MediaWiki as a wiki engine made to create and
distribute free knowledge, geared toward supporting Wikipedia and the
other WMF projects.


I think the WMF should reclaim it, take decisions, and stop pretending
we support third-party installations.  That would mean going as far as
dropping PostgreSQL support, since we have no use for it.


So what we might end up with:

- Wikimedia uses the SOA MediaWiki, with split components maintained by
staff and Wikimedia volunteer devs.  Code which is of no use for the
cluster is dropped, which would surely ease maintainability.  We can
then reduce MediaWiki to a very thin middleware and eventually rewrite
whatever little code is left.


- The old PHP-based MediaWiki is forked and given to the community to
maintain.  The WMF is no longer involved in it, and the PHP-only project
lives its own life.  That project could remove a lot of the rather
complicated code that suits mostly Wikimedia, making MediaWiki simpler
and easier to adjust for small installations.


So, is it time to fork, with both intents? I think so.


-- 
Antoine hashar Musso



Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Tyler Romeo
Before I talk about the architecture committee, let me just say that I think
the idea of MediaWiki 2.0 is off-topic and should be moved to a different
thread.

/ontopic

At this point I’d like to contribute a little bit of my experience from working
on the Architecture Committee at Chase Auto Finance. At JPMC our committee
was perhaps the most orthodox definition of an architecture committee: our
authority was enforced. Yes, we were still responsible to the company and the
business, and we could not simply veto new features, but nonetheless it was
considered an absolute requirement that before any dev team even began
_thinking_ about how to implement their new project, they submit a proposal
to the architecture committee detailing how the new feature / project would
affect other projects, etc.

And I think this is a pretty important concept. As Tim has mentioned, right now
we don’t have an architecture committee. What we have is a group of five people
that sometimes have IRC meetings and discuss stuff. Sometimes a #agreed is
slapped on top, for whatever that is worth. But going through the committee is
not a requirement. If Erik or whoever you're working for gives a green light,
you can go
right past it. Hell, even if I myself were to do some work on my own and submit
a patch directly to Gerrit, it could go through without ever seeing the 
committee.

If we are really worried about the future of MediaWiki and its technical debt 
(which,
in my opinion, we should be, considering it is growing exponentially by the 
day),
then a formal organization needs to be established. This committee should report
to the WMF as usual, but it should also have an authority of its own. New 
features
cannot be implemented without the approval of this committee. Period. +2ing a
new feature that was not approved could result in you losing your +2, etc.

However, maybe this isn’t the best thing to do? As other people mentioned,
consensus is a really cool thing, and sometimes it’s just better for the 
community
to decide as whole. But I can tell you right now it’s not going to work. And the
reason is this: sometimes the community becomes divided. Sometimes we can’t
come to a consensus, and other times we do but the consensus is in opposition
with what the WMF wants. In all of these scenarios where we cannot make up
our minds, what is going to happen (or rather, what has already been happening),
is that somebody in the WMF, or maybe just somebody in general, is going to
be WP:BOLD and make the decision themselves. Then somebody who maybe
agrees with them is going to +2 it, and the rest of the community will silently
shrug and move on.

There is a place for consensus and discussion, but other times you really just
need a group of people whose job (or, if you’re not hired by the WMF, 
responsibility)
it is to make software design decisions. And if the community as a whole ever
disagrees with the committee’s decision, maybe we have a process for overriding
the committee (or a process for impeaching the committee members). It’s similar
to the English Wikipedia: why do they have an arbitration committee, whose
members are elected, in a community that doesn’t like voting and prefers 
consensus?

Of course, there is some discussion to be had on the exact definition and
process.

* The Big Question(tm): Do we want the committee in the first place?
* Who will be on this committee?
* What processes are there for people to be added or removed?
* What is the process for submitting ideas to the committee?
* Is the committee split into sub-committees, where each sub-committee is in 
charge
  of a different topic or area of expertise?
* Are sub-committees broken up based on generic concepts, e.g., databases, UX,
  or are they broken up based on areas of MediaWiki, e.g., Parser, FileBackend?
* Is the main committee required to approve everything, or can sub-committees
  approve on their own unless they feel the need to defer to the main committee?
* Can people who are not on the committee itself be in sub-committees?
* If so how are the memberships of subcommittees selected?
* What is the committee’s authority and what are its responsibilities to the 
WMF?
* How does the committee interact with other organizations, like the MediaWiki
  User Group, the on-site Wikipedia community, etc.?
* When do aforementioned other organizations get a say in the process?

As mentioned, MediaWiki is not an open source community at the moment. It is
an engineering organization that just happens to have open source contributors.
And it is because of this that our technical debt has been increasing. (Like the
age-old joke that we should rewrite MediaWiki in node.js, but look, for some
reason we are actually using it in parts of our new products!)

Anyway, that’s really the end of my rant. In the end, I do think a committee is
necessary for the future of MediaWiki as an open source software project, but it
is something that should be created with a lot of thought and 

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread David Gerard
On 16 January 2015 at 16:09, Antoine Musso hashar+...@free.fr wrote:

 So what we might end up with:
 - Wikimedia using the SOA MediaWiki with split components maintained by
 staff and the Wikimedia volunteers devs.  Code which is of no use for
 the cluster is dropped which would surely ease maintainability.  We can
 then reduce MediaWiki to a very thin middleware and eventually rewrite
 whatever few code is left.
 - The old MediaWiki PHP based is forked and given to the community to
 maintain.  WMF is no more involved in it and the PHP-only project live
 it's on life.  That project could be made to remove a lot of the rather
 complicated code that suits mostly Wikimedia, making MediaWiki simpler
 and easier to adjust for small installations.
 So, is it time to fork both intent?  I think so.



This is not a great idea because it makes WMF wikis unforkable in
practical terms. The data is worthless without being able to run an
instance of the software. This will functionally proprietise all WMF
wikis, whatever the licence statement.


- d.


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Subramanya Sastry

On 01/16/2015 01:49 AM, Stas Malyshev wrote:
I think we're trying to fulfill a bit of a contradictory requirement 
here - running on the same software both the site of the size of 
*.wikipedia.org and a 1-visit-a-week-maybe $2/month shared hosting 
install. I think it would be increasingly hard to be committed to both 
equally. 


This has always been my question independent of the specific choices 
made or being made to address this. So, some answers could be:


* Yes, it is possible; for that we need to continue with a monolithic 
install, and, if VisualEditor/Flow, etc. are needed, then things like 
Parsoid would have to become part of core to continue to support WMF's 
product offerings.


* It is difficult to support installs equally well at the extremities of 
scaling / performance requirements; for WMF's scale and features (new 
ones being developed and conceived on an ongoing basis), we need a 
different architecture where there are independent services that can be 
developed and maintained independently, but, perhaps with packaging 
solutions, a large subset of existing wiki installs can still be 
supported and developed.


* It is not really possible to do both and there should be a fork / 
split of the mediawiki codebase, and each should go their own way.


Maybe I am setting up strawmen to articulate my point of view, which is 
the middle solution, simply because I don't believe that, with the 
continuing demand for and development of new features, you can make the 
same codebase work both for WMF's expanding feature set and for a lot of 
small wikis that are content with simple wikitext-based wikis without 
any of the additional complexity.


Anyway, independent of the specific outlines above, I think the question 
that should inform a discussion is: is it really possible for the same 
codebase to support both WMF's requirements (I say WMF because it is 
leading the development of new features / products), with new and 
evolving products / features, and the countless small wikis out there 
that have exactly one or two requirements of the wiki?


I am happy to be disabused of the importance of this question.

Subbu.



Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Chad
On Fri Jan 16 2015 at 8:15:45 AM Tyler Romeo tylerro...@gmail.com wrote:

 As mentioned, MediaWiki is not an open source community at the moment. It
 is
 an engineering organization that just happens to have open source
 contributors.


It's hard to have a community when we keep hiring everyone out of it.

-Chad

[Wikitech-l] MediaWiki 2.0 (was: No more Architecture Committee?)

2015-01-16 Thread James Forrester
[Moving threads for on-topic-ness.]

On 16 January 2015 at 07:01, Brian Wolff bawo...@gmail.com wrote:

 Does anyone actually have
 anything they want that is difficult to do currently and requires a mass
 compat break?


Sure.

Three quick examples of things on the horizon (I'm not particularly saying
we'd actually do these for Wikimedia's use, but if you're going to ask for
straw man arguments… :-)):

   - Get rid of wikitext on the server-side.
      - HTML storage only. Remove MWParser from the codebase. All
        extensions that hook into wikitext (so, almost all of them?) will
        need to be re-written.
   - Real-time collaborative editing.
      - Huge semantic change to the concept of a 'revision'; we'd probably
        need to re-structure the table from scratch. Breaking change for
        many tools in core and elsewhere.
   - Replace local PHP hooks with proper services interfaces instead.
      - Loads of opportunities for improvements here (anti-spam tools 'as a
        service', Wordpress style; pre-flighting saves; ), but again, pretty
        much everything will need re-writing; this would likely be
        progressive, happening one at a time to areas where it's
        useful/wanted/needed, but it's still a huge breaking change for
        many extensions.



 Proposing to rewrite mediawiki because we can without even a
 notion of what we would want to do differently seems silly.


Oh, absolutely. I think RobLa's point was that it's unclear who feels
empowered to make that decision (rather than the pitch). I don't. I don't
think RobLa does. Clearly the Architecture Committee don't.

J.
-- 
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Greg Grossmeier
quote name=David Gerard date=2015-01-16 time=16:27:22 +
 This is not a great idea because it makes WMF wikis unforkable in
 practical terms. The data is worthless without being able to run an
 instance of the software. This will functionally proprietise all WMF
 wikis, whatever the licence statement.

"Unforkable" as a binary is not true. "Not easy to fork without
implementing all of the pieces carefully" is more accurate. But it's
still possible, especially since all of the config/puppet code that is
used to do it is open and Freely licensed.

It ain't easy hosting a site this large with this amount of traffic
(thanks Ops!) so it's not surprising that forking it wouldn't be easy
either.

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread David Gerard
On 16 January 2015 at 17:10, Greg Grossmeier g...@wikimedia.org wrote:
 quote name=David Gerard date=2015-01-16 time=16:27:22 +

 This is not a great idea because it makes WMF wikis unforkable in
 practical terms. The data is worthless without being able to run an
 instance of the software. This will functionally proprietise all WMF
 wikis, whatever the licence statement.

 "Unforkable" as a binary is not true.


That's why I said functionally.


 Not easy to fork without
 implementing all of the pieces carefully is more accurate. But it's
 still possible, especially since all of the config/puppet code that is
 used to do it is open and Freely licensed.


A licence that was the GPL with the additional requirement of
attaching 1kg of gold to each copy would be technically free as well,
but would present difficulties in practice.

What I'm saying is that "technically" is insufficient.


 It ain't easy hosting a site this large with this amount of traffic
 (thanks Ops!) so it's not surprising that forking it wouldn't be easy
 either.


However, at present low-traffic copies are pretty easy.


- d.


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 4:29 AM, David Gerard dger...@gmail.com wrote:

 On 16 January 2015 at 07:38, Chad innocentkil...@gmail.com wrote:

  These days I'm not convinced it's our job to support every possible
  scale of wiki install. There's several simpler and smaller wiki solutions
  for people who can't do more than FTP a few files to a folder.


 In this case the problem is leaving users and their wiki content
 unsupported. Because they won't move while it works, even as it
 becomes a Swiss cheese of security holes. Because their content is
 important to them.

 This is the part of the mission that involves everyone else producing
 the sum of human knowledge. They used our software, if we're
 abandoning them then don't pretend there's a viable alternative for
 them. You know there isn't.


What you're forgetting is that WMF abandoned MediaWiki as an Open Source
project quite a while ago (at least 2 years ago). There's a separate org
that gets a grant from WMF to handle third party use, and it's funded just
well enough to keep the lights on.

Take a look at the current state of MediaWiki on the internet. I'd be
surprised if less than 99% of the MediaWiki wikis in existence are out of
date. Most are probably running a version from years ago. The level of
effort required to upgrade MediaWiki and its extensions that don't list
compatibility with core versions is past the skill level of most people
that use the software. Even places with a dedicated ops team find MediaWiki
difficult to keep up to date. Hell, I find it difficult and I worked for
WMF on the ops team and have been a MediaWiki dev since 1.3.

I don't think adding a couple more services is going to drastically alter
the current situation.

- Ryan

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 8:27 AM, David Gerard dger...@gmail.com wrote:

 On 16 January 2015 at 16:09, Antoine Musso hashar+...@free.fr wrote:

  So what we might end up with:
  - Wikimedia using the SOA MediaWiki with split components maintained by
  staff and the Wikimedia volunteer devs.  Code which is of no use for
  the cluster is dropped, which would surely ease maintainability.  We can
  then reduce MediaWiki to a very thin middleware and eventually rewrite
  whatever little code is left.
  - The old PHP-based MediaWiki is forked and given to the community to
  maintain.  WMF is no longer involved in it and the PHP-only project lives
  its own life.  That project could be made to remove a lot of the rather
  complicated code that suits mostly Wikimedia, making MediaWiki simpler
  and easier to adjust for small installations.
  So, is it time to fork both intents?  I think so.



 This is not a great idea because it makes WMF wikis unforkable in
 practical terms. The data is worthless without being able to run an
 instance of the software. This will functionally proprietise all WMF
 wikis, whatever the licence statement.


So far all of the new services are open source. If you want to work on
forking Wikimedia projects you can even start a Labs project to work on it.
I don't think this is a concern here.

- Ryan

Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Tyler Romeo
On January 16, 2015 at 11:59:14, Chad (innocentkil...@gmail.com) wrote:
It's hard to have a community when we keep hiring everyone out of it.
We should use this as an advertising point. “Come contribute to MediaWiki, 
there’s a pretty high chance we’ll hire you.” :P

-- 
Tyler Romeo
0x405D34A7C86B42DF



Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Brion Vibber
Rob, thanks for starting this discussion!

We definitely should be thinking about what kind of structures make sense;
the arch committee was a good spike solution which I think we've
determined doesn't quite work as it is, but the core idea of getting some
active people to talk to each other regularly and go over stuff is a good
impulse we should not discard!

I'm currently in the process of disentangling myself from other projects
(we've got some awesome stuff coming up in the mobile apps!) to concentrate
more on MediaWiki core, so ideas about modifying or adding to that
structure are very welcome now.


In general, I feel a sense of urgency that seems lacking in the status
 quo.  We've made progress over the past couple of years, but it
 doesn't feel like our progress is entirely up to the task.  We have a
 collection of *many* instances of individual or small team excellence
 that are sadly the mere sum of their parts.  My intuition is that we
 lose out on multiplicative effects as we fail to engage the wider
 group in our activities, and as we lack engineering-level
 orchestration.  Team-level pride in fantastic work drifts into
 project-level despair, as many excellent engineers fail to grasp how
 to make big changes outside of their limited domains.


^ This.

I think it's getting time to plan more aggressively for the future and work
towards our goals with a shared vision. That's a tall order, of course, but
I think we can make some ideas go around...

-- brion


On Thu, Jan 15, 2015 at 8:04 PM, Rob Lanphier ro...@wikimedia.org wrote:

 (Alright...let's try this again!)

 Hi everyone,

 The current MediaWiki Architecture Committee[1] has its roots in a
 2013 Amsterdam Hackathon session[2], where we had a pair of sessions
 to try to establish our architectural guidelines[3].  It was there
 that we agreed that it would be good to revive our then moribund
 process for reviewing RFCs[4].  Since no one there really knew whose
 job it was to review these things, I believe I said "how about we
 start with everyone with 'architect' in their title at WMF?", which
 was met with uncomfortable shrugging that I interpreted as
 "consensus!", and no one corrected me.  Thus Brion Vibber, Mark
 Bergsma, and Tim Starling became the founding members of the Arch
 Committee.

 Subsequent to that meeting, I pretended to proceed as though a
 decision was made.  However, over the past year and half since then,
 there's been much more uncomfortable shrugging.  Even Brion, Mark, and
 Tim have not seemed entirely comfortable with the idea.  It was widely
 acknowledged that the group was heavily biased toward the lower parts
 of our server software stack.  The committee agreed to add Roan
 Kattouw and Daniel Kinzler to the group as a means of providing a
 wider perspective, with the added bonus of adding at least one person
 who isn't a WMF employee.

 So, here we are today.  I believe no one would dispute the credentials
 of every member of the group.  Brion, Tim, and Mark have an extremely
 long history with the project, being employees #1, #2, and #3 of the
 WMF respectively, and all having contributed massively to the success
 of Wikipedia and to MediaWiki as general purpose wiki software.  In
 most open source projects, one of them would probably be BDFL[5].
 Roan and Daniel are more recent, but only in relative terms, and
 also have very significant contributions to their name.  All have the
 widespread respect of pretty much everyone in the MediaWiki developer
 community.

 Additionally, I hear quite a bit of relief that the previously
 moribund RFC process is doing much better now.  Things are moving, and
 if you know how to work the process and aren't proposing anything too
 wild, you can get an RFC approved pretty quickly.  The committee has
 made a lap through the entire backlog of RFCs.

 Still, the uncomfortable shrugging continues.  The group is broader,
 but still lacks the breadth, particularly in front end and in the
 development of newer services such as Parsoid and RESTBase.  This
 aspect is pretty obviously something that can be fixed.  Another
 problem is that the scope of the group isn't clear to everyone.  Is
 this group responsible for leading, or merely responsible for
 reviewing big ideas from others to ensure continuity and sanity?  How
 big does an idea need to be before an RFC needs to be written (as
 opposed to just dropping a patch in Gerrit)?  Defining the scope of
 the group is also a fixable problem.

 However, I don't sense much of a desire to fix things.  The dominant
 meme that I hear is that we should go back to the day before
 uncomfortable shrugging led to a committee becoming BDFL.  What I
 fear, though, is that we will develop a system lacking in conceptual
 integrity[6], as individual warring fiefdoms emerge.  It's quite
 simple to argue this is already happening.

 So, where does that leave us?  Do we need a BDFL?  If so, who should
 pick?  Should it be someone in the project?  Should 

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brion Vibber
IMHO, the cases traditionally handled by tiny one-off wikis on shared
hosting would be better served by migrating to a dedicated wiki hosting
service which will actually update the software for security issues and new
features.

I believe WMF should concentrate on the large-scale hosting case, with
Vagrant and similar VM solutions for easy dev & one-off setups where people
are willing to maintain them themselves.

This should include better support for creating and maintaining large farms
of small independent wikis.


We should also either spin off an affordable for-pay no-ads wiki hosting
company, or let people create one (or multiple) and direct people to them.

-- brion

On Thu, Jan 15, 2015 at 10:38 PM, Bryan Davis bd...@wikimedia.org wrote:

 There has been a lot of talk over the last year or so of how and when
 to move MediaWiki to a service oriented architecture [0]. So much so
 that it is actually one of the a marquee topics at the upcoming
 Developer Summit.

 I think the problem statement for the services RFC is dead on in
 describing issues that MediaWiki and the Wikimedia Foundation face
 today with the current monolithic MediaWiki implementation. We have
 organic entanglements between subsystems that make reasoning about the
 code base difficult. We have a growing need for API access to data and
 computations that have historically been only presented via generated
 HTML. We have cross-team and cross-project communication issues that
 lead to siloed implementations and duplication of effort.
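A minimal sketch of what that API access looks like in practice (the action=parse module and the en.wikipedia.org api.php endpoint are real MediaWiki API features; the api_url helper is purely illustrative glue, not part of any codebase discussed here):

```python
from urllib.parse import urlencode

def api_url(base, **params):
    """Build a MediaWiki action API request URL (illustrative helper)."""
    params.setdefault("format", "json")  # ask for machine-readable output
    return base + "?" + urlencode(sorted(params.items()))

# Ask the API for the rendered HTML of a page, instead of scraping the
# generated HTML out of the skin.
url = api_url("https://en.wikipedia.org/w/api.php",
              action="parse", page="MediaWiki", prop="text")
print(url)
# → https://en.wikipedia.org/w/api.php?action=parse&format=json&page=MediaWiki&prop=text
```

Fetching that URL returns the parsed page as JSON, which is the kind of access historically only available by screen-scraping.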

 The solution to these issues proposed in the RFC is to create
 independent services (eg Parsoid, RESTBase) to implement features that
 were previously handled by the core MediaWiki application. Thus far
 Parsoid is only required if a wiki wants to use VisualEditor. There
 has been discussion however of it being required in some future
 version of MediaWiki where HTML is the canonical representation of
 articles {{citation needed}}. This particular future may or may not be
 far off on the calendar, but there are other services that have been
 proposed (storage service, REST content API) that are likely to appear
 in production use at least for the Foundation projects within the next
 year.

 One of the bigger questions I have about the potential shift to
 requiring services is the fate of shared hosting deployments of
 MediaWiki. What will happen to the numerous MediaWiki installs on
 shared hosting providers like 1and1, Dreamhost or GoDaddy when running
 MediaWiki requires multiple node.js/java/hack/python stand alone
 processes to function? Is the MediaWiki community making a conscious
 decision to abandon these customers? If so should we start looking for
 a suitable replacement that can be recommended and possibly develop
 tools to ease the migration away from MediaWiki to another monolithic
 wiki application? If not, how are we going to ensure that pure PHP
 alternate implementations get equal testing and feature development if
 they are not actively used on the Foundation's project wikis?


 [0]:
 https://www.mediawiki.org/wiki/Requests_for_comment/Services_and_narrow_interfaces

 Bryan
 --
 Bryan Davis                Wikimedia Foundation    bd...@wikimedia.org
 [[m:User:BDavis_(WMF)]]    Sr Software Engineer    Boise, ID USA
 irc: bd808                 v: 415.839.6885 x6855


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Schmidt


 On Jan 16, 2015, at 11:27 AM, Ryan Lane rlan...@gmail.com wrote:
 
 On Fri, Jan 16, 2015 at 4:29 AM, David Gerard dger...@gmail.com wrote:
 
 On 16 January 2015 at 07:38, Chad innocentkil...@gmail.com wrote:
 
 These days I'm not convinced it's our job to support every possible
 scale of wiki install. There's several simpler and smaller wiki solutions
 for people who can't do more than FTP a few files to a folder.
 
 
 In this case the problem is leaving users and their wiki content
 unsupported. Because they won't move while it works, even as it
 becomes a Swiss cheese of security holes. Because their content is
 important to them.
 
 This is the part of the mission that involves everyone else producing
 the sum of human knowledge. They used our software, if we're
 abandoning them then don't pretend there's a viable alternative for
 them. You know there isn't.
 What you're forgetting is that WMF abandoned MediaWiki as an Open Source
 project quite a while ago (at least 2 years ago). There's a separate org
 that gets a grant from WMF to handle third party use, and it's funded just
 well enough to keep the lights on.
 
 Take a look at the current state of MediaWiki on the internet. I'd be
 surprised if less than 99% of the MediaWiki wikis in existence are out of
 date. Most are probably running a version from years ago. The level of
 effort required to upgrade MediaWiki and its extensions that don't list
 compatibility with core versions is past the skill level of most people
 that use the software. Even places with a dedicated ops team find MediaWiki
 difficult to keep up to date. Hell, I find it difficult and I worked for
 WMF on the ops team and have been a MediaWiki dev since 1.3.
 
 I don't think adding a couple more services is going to drastically alter
 the current situation.
 
 - Ryan

This sounds like a problem we need to fix, rather than making it worse. If 
most wikis are not up to date then we should work on making it easier to keep 
up to date, not making it harder. Any SOA approach is sadly DOA for any shared 
host; even if the user has shell access or other means of launching processes 
(so excluding almost every free host in existence here), there is no guarantee 
that a particular environment such as node.js or Python is installed or at the 
correct version (and similarly no guarantee for a host to install them for 
you). Such a move, while possibly ideal for the WMF, would indeed make running 
on shared hosting nearly if not entirely impossible. The net effect is that 
people will keep their existing MW installs at their current version or even 
install old versions of the software so they can look like Wikipedia or 
whatnot, rife with unpatched security vulnerabilities. I cannot fathom that the 
WMF would be ok with leaving third-party users in such a situation.
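As a concrete (and entirely hypothetical) illustration of the environment problem above: a pre-flight check an admin might run over SSH on a shared host, before attempting a service-based install, just to see which runtimes are available at all. The list of commands is an assumption about what such a stack might need:

```shell
# Hypothetical pre-flight check on a shared host: report which of the
# runtimes a service-based MediaWiki stack might need are installed.
for cmd in php node python3 java; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: MISSING"
  fi
done
```

On a typical shared host only php is likely to report found, which is exactly the point being made above.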

Moving advanced things into services may make sense, but it should not come at 
the expense of making the core worthless for third parties; sensible refactors 
could still be done on core while leaving it as pure PHP. Services, if 
implemented, should be entirely optional (as an example, Parsoid is not 
required to have a base working wiki. I would like to stick to that model for 
any future service as well).

Regards,
Ryan Schmidt




Re: [Wikitech-l] Fwd: No more Architecture Committee?

2015-01-16 Thread Trevor Parscal
It's really exciting to see people let go of unnecessary authority,
dismantle bureaucracy and resist building empires. I applaud the restraint
the committee is using here. I think it speaks volumes about who they are
as leaders.

As Brion suggests, it's important to retain the idea of getting some active
people to talk to each other regularly and go over stuff. I look forward
to supporting grass-roots efforts to achieve just that.

- Trevor

On Fri, Jan 16, 2015 at 10:01 AM, Brion Vibber bvib...@wikimedia.org
wrote:

 Rob, thanks for starting this discussion!

 We definitely should be thinking about what kind of structures make sense;
 the arch committee was a good spike solution which I think we've
 determined doesn't quite work as it is, but the core idea of getting some
 active people to talk to each other regularly and go over stuff is a good
 impulse we should not discard!

 I'm currently in the process of disentangling myself from other projects
 (we've got some awesome stuff coming up in the mobile apps!) to concentrate
 more on MediaWiki core, so ideas about modifying or adding to that
 structure are very welcome now.


 In general, I feel a sense of urgency that seems lacking in the status
  quo.  We've made progress over the past couple of years, but it
  doesn't feel like our progress is entirely up to the task.  We have a
  collection of *many* instances of individual or small team excellence
  that are sadly the mere sum of their parts.  My intuition is that we
  lose out on multiplicative effects as we fail to engage the wider
  group in our activities, and as we lack engineering-level
  orchestration.  Team-level pride in fantastic work drifts into
  project-level despair, as many excellent engineers fail to grasp how
  to make big changes outside of their limited domains.
 

 ^ This.

 I think it's getting time to plan more aggressively for the future and work
 towards our goals with a shared vision. That's a tall order, of course, but
 I think we can make some ideas go around...

 -- brion


 On Thu, Jan 15, 2015 at 8:04 PM, Rob Lanphier ro...@wikimedia.org wrote:

  (Alright...let's try this again!)
 
  Hi everyone,
 
  The current MediaWiki Architecture Committee[1] has its roots in a
  2013 Amsterdam Hackathon session[2], where we had a pair of sessions
  to try to establish our architectural guidelines[3].  It was there
  that we agreed that it would be good to revive our then moribund
  process for reviewing RFCs[4].  Since no one there really knew whose
  job it was to review these things, I believe I said "how about we
  start with everyone with 'architect' in their title at WMF?", which
  was met with uncomfortable shrugging that I interpreted as
  "consensus!", and no one corrected me.  Thus Brion Vibber, Mark
  Bergsma, and Tim Starling became the founding members of the Arch
  Committee.
 
  Subsequent to that meeting, I pretended to proceed as though a
  decision was made.  However, over the past year and half since then,
  there's been much more uncomfortable shrugging.  Even Brion, Mark, and
  Tim have not seemed entirely comfortable with the idea.  It was widely
  acknowledged that the group was heavily biased toward the lower parts
  of our server software stack.  The committee agreed to add Roan
  Kattouw and Daniel Kinzler to the group as a means of providing a
  wider perspective, with the added bonus of adding at least one person
  who isn't a WMF employee.
 
  So, here we are today.  I believe no one would dispute the credentials
  of every member of the group.  Brion, Tim, and Mark have an extremely
  long history with the project, being employees #1, #2, and #3 of the
  WMF respectively, and all having contributed massively to the success
  of Wikipedia and to MediaWiki as general purpose wiki software.  In
  most open source projects, one of them would probably be BDFL[5].
  Roan and Daniel are more recent, but only in relative terms, and
  also have very significant contributions to their name.  All have the
  widespread respect of pretty much everyone in the MediaWiki developer
  community.
 
  Additionally, I hear quite a bit of relief that the previously
  moribund RFC process is doing much better now.  Things are moving, and
  if you know how to work the process and aren't proposing anything too
  wild, you can get an RFC approved pretty quickly.  The committee has
  made a lap through the entire backlog of RFCs.
 
  Still, the uncomfortable shrugging continues.  The group is broader,
  but still lacks the breadth, particularly in front end and in the
  development of newer services such as Parsoid and RESTBase.  This
  aspect is pretty obviously something that can be fixed.  Another
  problem is that the scope of the group isn't clear to everyone.  Is
  this group responsible for leading, or merely responsible for
  reviewing big ideas from others to ensure continuity and sanity?  How
  big does an idea need to be before an RFC needs to be written (as
  opposed 

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 10:21 AM, Ryan Schmidt skizz...@gmail.com wrote:


 This sounds like a problem we need to fix, rather than making it worse.
 If most wikis are not up to date then we should work on making it easier
 to keep up to date, not making it harder. Any SOA approach is sadly DOA for
 any shared host; even if the user has shell access or other means of
 launching processes (so excluding almost every free host in existence
 here), there is no guarantee that a particular environment such as node.js
 or Python is installed or at the correct version (and similarly no
 guarantee for a host to install them for you). Such a move, while possibly
 ideal for the WMF, would indeed make running on shared hosting nearly if
 not entirely impossible. The net effect is that people will keep their
 existing MW installs at their current version or even install old versions
 of the software so they can look like Wikipedia or whatnot, rife with
 unpatched security vulnerabilities. I cannot fathom that the WMF would be
 ok with leaving third-party users in such a situation.


You'd be wrong. WMF has been ok with it since they dropped efforts for
supporting third parties (and knew that most MW installs were insecure well
before that). It would be great if the problem was solved, but people have
been asking for it for over 10 years now and it hasn't happened. I don't
expect it to happen any time soon. There's only a couple substantial MW
users and neither of them care about third party usage.


 Moving advanced things into services may make sense, but it should not
 come at the expense of making the core worthless for third parties;
 sensible refactors could still be done on core while leaving it as pure
 PHP. Services, if implemented, should be entirely optional (as an example,
 Parsoid is not required to have a base working wiki. I would like to stick
 to that model for any future service as well).


I agree with other people that MW as-is should be forked. Until this
happens it'll never get proper third party support.

- Ryan

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Trevor Parscal
I challenge the very foundation of arguments based on this 95% statistic.

The 95% statistic is bogus, not because it's inaccurate, but because it's
misleading.

The number of MediaWiki instances out there is meaningless. The value we
get from 3rd party use correlates much more strongly with the number of
users, not the number of instances.

Consider the experience of using a MediaWiki instance running on a shared
host. MediaWiki doesn't actually perform well enough in this environment to
support a significant amount of usage. It's simply not possible that shared
host installs can achieve enough usage to be relevant.

- Trevor

On Fri, Jan 16, 2015 at 10:21 AM, Ryan Schmidt skizz...@gmail.com wrote:



  On Jan 16, 2015, at 11:27 AM, Ryan Lane rlan...@gmail.com wrote:
 
  On Fri, Jan 16, 2015 at 4:29 AM, David Gerard dger...@gmail.com
 wrote:
 
  On 16 January 2015 at 07:38, Chad innocentkil...@gmail.com wrote:
 
  These days I'm not convinced it's our job to support every possible
  scale of wiki install. There's several simpler and smaller wiki
 solutions
  for people who can't do more than FTP a few files to a folder.
 
 
  In this case the problem is leaving users and their wiki content
  unsupported. Because they won't move while it works, even as it
  becomes a Swiss cheese of security holes. Because their content is
  important to them.
 
  This is the part of the mission that involves everyone else producing
  the sum of human knowledge. They used our software, if we're
  abandoning them then don't pretend there's a viable alternative for
  them. You know there isn't.
  What you're forgetting is that WMF abandoned MediaWiki as an Open Source
  project quite a while ago (at least 2 years ago). There's a separate org
  that gets a grant from WMF to handle third party use, and it's funded
 just
  well enough to keep the lights on.
 
  Take a look at the current state of MediaWiki on the internet. I'd be
  surprised if less than 99% of the MediaWiki wikis in existence are out of
  date. Most are probably running a version from years ago. The level of
  effort required to upgrade MediaWiki and its extensions that don't list
  compatibility with core versions is past the skill level of most people
  that use the software. Even places with a dedicated ops team find
 MediaWiki
  difficult to keep up to date. Hell, I find it difficult and I worked for
  WMF on the ops team and have been a MediaWiki dev since 1.3.
 
  I don't think adding a couple more services is going to drastically alter
  the current situation.
 
  - Ryan

 This sounds like a problem we need to fix, rather than making it worse.
  If most wikis are not up to date then we should work on making it easier
 to keep up to date, not making it harder. Any SOA approach is sadly DOA for
 any shared host; even if the user has shell access or other means of
 launching processes (so excluding almost every free host in existence
 here), there is no guarantee that a particular environment such as node.js
 or Python is installed or at the correct version (and similarly no
 guarantee for a host to install them for you). Such a move, while possibly
 ideal for the WMF, would indeed make running on shared hosting nearly if
 not entirely impossible. The net effect is that people will keep their
 existing MW installs at their current version or even install old versions
 of the software so they can look like Wikipedia or whatnot, rife with
 unpatched security vulnerabilities. I cannot fathom that the WMF would be
 ok with leaving third-party users in such a situation.

 Moving advanced things into services may make sense, but it should not
 come at the expense of making the core worthless for third parties;
 sensible refactors could still be done on core while leaving it as pure
 PHP. Services, if implemented, should be entirely optional (as an example,
 Parsoid is not required to have a base working wiki. I would like to stick
 to that model for any future service as well).

 Regards,
 Ryan Schmidt




Re: [Wikitech-l] MediaWiki 2.0 (was: No more Architecture Committee?)

2015-01-16 Thread Dmitriy Sintsov
Why is wikitext so much disliked? It's more compact to type than HTML, and
it's a templating language, which HTML is not. Otherwise something like
Handlebars (which is weaker than wikitext) would have to be used, or
something like web components and custom tags. But why remove a nice thing
(wikitext), which saves a lot of keystrokes, and break the whole codebase?
Strange. Especially since there's the more progressive Parsoid. It looks
like the whole of MW could be rewritten in Node.js, not Python.

On Fri, Jan 16, 2015 at 8:04 PM, James Forrester jforres...@wikimedia.org
wrote:

 [Moving threads for on-topic-ness.]

 On 16 January 2015 at 07:01, Brian Wolff bawo...@gmail.com wrote:

  Does anyone actually have
  anything they want that is difficult to do currently and requires a mass
  compat break?


 Sure.

 Three quick examples of things on the horizon (I'm not particularly saying
 we'd actually do these for Wikimedia's use, but if you're going to ask for
 straw man arguments… :-)):

- Get rid of wikitext on the server-side.
   - HTML storage only. Remove MWParser from the codebase. All extensions
   that hook into wikitext (so, almost all of them?) will need to be
   re-written.
- Real-time collaborative editing.
   - Huge semantic change to the concept of a 'revision'; we'd probably need
   to re-structure the table from scratch. Breaking change for many tools in
   core and elsewhere.
- Replace local PHP hooks with proper services interfaces instead.
   - Loads of opportunities for improvements here (anti-spam tools 'as a
   service', WordPress style; pre-flighting saves; …), but again, pretty
   much everything will need re-writing; this would likely be progressive,
   happening one area at a time where it's useful/wanted/needed, but it's
   still a huge breaking change for many extensions.
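That last item (hooks as services) can be illustrated with a small sketch. The interface, names, and blocklist below are invented for the example and are not any real MediaWiki API; in practice each check would be an HTTP call to an external service rather than an in-process function:

```python
from dataclasses import dataclass

@dataclass
class EditEvent:
    title: str
    text: str
    user: str

class PreSaveService:
    """Contract a remote pre-save service would implement (hypothetical)."""
    def check(self, event: EditEvent) -> tuple:
        raise NotImplementedError

class SpamFilterService(PreSaveService):
    """Stand-in for an anti-spam tool running 'as a service'."""
    BLOCKLIST = ("buy cheap pills",)

    def check(self, event):
        # Reject the edit if any blocklisted phrase appears in the text.
        for phrase in self.BLOCKLIST:
            if phrase in event.text.lower():
                return (False, "spam phrase: " + phrase)
        return (True, "ok")

def run_pre_save(services, event):
    """Dispatch the edit to every registered service; any veto blocks the
    save. A real deployment would make network calls, not local ones."""
    for svc in services:
        ok, reason = svc.check(event)
        if not ok:
            return (False, reason)
    return (True, "ok")

services = [SpamFilterService()]
print(run_pre_save(services, EditEvent("Test", "Buy cheap pills now!", "Anon")))
# → (False, 'spam phrase: buy cheap pills')
```

The point of the sketch is that the service boundary replaces the PHP hook registry: extensions would register endpoints rather than callbacks, which is exactly why it is a breaking change for them.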



  Proposing to rewrite mediawiki because we can without even a
  notion of what we would want to do differently seems silly.
 

 Oh, absolutely. I think RobLa's point was that it's unclear who feels
 empowered to make that decision (rather than the pitch). I don't. I don't
 think RobLa does. Clearly the Architecture Committee don't.

 J.
 --
 James D. Forrester
 Product Manager, Editing
 Wikimedia Foundation, Inc.

 jforres...@wikimedia.org | @jdforrester

[Wikitech-l] New feature in development: collections

2015-01-16 Thread Jon Katz
Apologies for cross-listing this--I initially (and erroneously) only sent
it internally

--
Hi,
For those of you I haven't yet met, I am a new product manager in SF,
working with the mobile web team (I look forward to meeting you)!

I just wanted to let you all know about a new project we are starting for
Wikipedia's mobile website. The project has been dubbed Collections, and
our pilot will let users create and share collections of articles.  Here
are some ways that this project is exploring new ways to move our mission
forward:

   - *New ways to contribute:* through curation, Wikipedia readers who are
   not interested in traditional editing can have meaningful, creative
   interactions with our content
   - *Personal:* gives users a way to make content more relevant for them.
   Peter's list of most important philosophers is not subject to consensus
   or editing by others. For the time being, a list will only be accessible
   via a shared URL.
   - *Shareable:* this project will experiment with the ability to use
   Wikipedia to share one's perspective with others and, in doing so,
   encourage new users to engage
   - *[Future] Browseable:* because each list is not exhaustive, there is a
   possibility of using popular lists to promote meaningful content. It's
   nice to know the full list of statisticians
   https://en.wikipedia.org/wiki/List_of_statisticians is there, but I
   might want to find a subset that has been picked out by a human for one
   reason or another.


Though the pilot is targeting and supporting readers, we think there are
also potential use cases for editors that we could explore in the future.
One can easily imagine lists for editors to track or share their
contributions, such as "Articles that I want to write/expand during 2015"
or "Articles I created in 2013".

A fair amount of thought has gone into why we are launching this particular
project and how we might approach it, but we only started exploratory
development work last week.  Here is the team:

Jon Robson
Rob Moen
Joaquin Hernandez
Moiz Syed
[me]

*Ask:*
[I know that this project overlaps with several existing features in terms
of raw functionality, so if you have any relevant experience with either
the editing or technical support of lists, the Collection extension (books),
or watchlists, and are interested in sharing your experience, please feel
free to reach out to me directly.  We very much want to learn from previous
efforts.]

Also, please reach out to me or anyone else on the team if you have any
questions or concerns about the feature, team, etc.

Best,

Jon

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Isarra Yos

On 16/01/15 19:05, Trevor Parscal wrote:

I challenge the very foundation of arguments based on this 95% statistic.

The 95% statistic is bogus, not because it's inaccurate, but because it's
misleading.

The number of MediaWiki instances out there is meaningless. The value we
get from 3rd party use correlates much more strongly with the number of
users, not the number of instances.

Consider the experience of using a MediaWiki instance running on a shared
host. MediaWiki doesn't actually perform well enough in this environment to
support a significant amount of usage. It's simply not possible that shared
host installs can achieve enough usage to be relevant.


The 95% statistic may not be meaningful, but neither is this. The number 
of users involved does not reflect the importance of the information 
presented any more than that a project exists at all.


The mission of the Wikimedia Foundation is to empower and engage people 
around the world to collect and develop educational content under a free 
license or in the public domain, and to disseminate it effectively and 
globally.


This applies whether a project has several thousand active users 
building a general encyclopedia of common topics, or a team of ten 
working on a collection of bird names, or even one single individual 
putting together a chronicle of their family history. MediaWiki is a 
platform that empowers all of these, and to turn your back on them 
directly contradicts Wikimedia's overall mission.


-I


Re: [Wikitech-l] New feature in development: collections

2015-01-16 Thread Federico Leva (Nemo)
Hi and welcome. Thanks for sharing. Do you already have a mediawiki.org 
page for this idea?


Please choose another name. Collection is taken. 
https://www.mediawiki.org/wiki/Extension:Collection


Nemo


Re: [Wikitech-l] New feature in development: collections

2015-01-16 Thread Adam Wight
Great to hear you're taking on this work!

From my wishlist, I'd like to see support for cross-wiki collections (T19168
https://phabricator.wikimedia.org/T19168), this could be used to make
really cool multimodal or bilingual materials...

Thanks,
Adam

On Fri, Jan 16, 2015 at 12:16 PM, Federico Leva (Nemo) nemow...@gmail.com
wrote:

 Hi and welcome. Thanks for sharing. Do you already have a mediawiki.org
 page for this idea?

 Please choose another name. Collection is taken.
 https://www.mediawiki.org/wiki/Extension:Collection

 Nemo



Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brion Vibber
On Fri, Jan 16, 2015 at 12:09 PM, Isarra Yos zhoris...@gmail.com wrote:

The 95% statistic may not be meaningful, but neither is this. The number of
 users involved does not reflect the importance of the information presented
 any more than that a project exists at all.

 The mission of the Wikimedia Foundation is to empower and engage people
 around the world to collect and develop educational content under a free
 license or in the public domain, and to disseminate it effectively and
 globally.

 This applies whether a project has several thousand active users building
 a general encyclopedia of common topics, or a team of ten working on a
 collection of bird names, or even one single individual putting together a
 chronicle of their family history. MediaWiki is a platform that empowers
 all of these, and to turn your back on them directly contradicts
 Wikimedia's overall mission.


I believe the claim is being made that many of those people with tiny
one-off self-hosted projects are *not* being well-served right now by a
MediaWiki instance that's out of date with security holes and no new
features.

Here's where I tend to make the comparison to WordPress:


WordPress is open source and available for self-hosting (including on
shared hosts with PHP and MySQL), but a lot of people find it's a pain to
maintain -- even with the HUGE usability and upgrade improvements that
WordPress folks have made in the last few years.

I personally maintain my own WordPress blog, and pretty much just
occasionally make a security update and otherwise hardly touch it. Plenty
of other people do this too, with varying degrees of success staying up to
date and fixing occasional operational problems.

But lots of people also use WordPress's hosted service as well. No need to
manually install security upgrades, maintain your own vhost, etc.

In our comparison, we have MediaWiki for self-hosters, and we have hosted
projects for Wikimedia, but if you want a hosted wiki that's not covered in
ads your choices are vague and poorly documented.


I would argue that many small projects would be better served by good wiki
farm hosting than trying to maintain their own sites.

Sure, that's not for everyone -- but I think the people who are best served
by running their own servers have a lot of overlap with the people who
would be capable of maintaining multiple services on a virtual machine.

-- brion

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Isarra Yos

On 16/01/15 20:28, Brion Vibber wrote:

On Fri, Jan 16, 2015 at 12:09 PM, Isarra Yos zhoris...@gmail.com wrote:

The 95% statistic may not be meaningful, but neither is this. The number of

users involved does not reflect the importance of the information presented
any more than that a project exists at all.

The mission of the Wikimedia Foundation is to empower and engage people
around the world to collect and develop educational content under a free
license or in the public domain, and to disseminate it effectively and
globally.

This applies whether a project has several thousand active users building
a general encyclopedia of common topics, or a team of ten working on a
collection of bird names, or even one single individual putting together a
chronicle of their family history. MediaWiki is a platform that empowers
all of these, and to turn your back on them directly contradicts
Wikimedia's overall mission.


I believe the claim is being made that many of those people with tiny
one-off self-hosted projects are *not* being well-served right now by a
MediaWiki instance that's out of date with security holes and no new
features.

Here's where I tend to make the comparison to WordPress:


WordPress is open source and available for self-hosting (including on
shared hosts with PHP and MySQL), but a lot of people find it's a pain to
maintain -- even with the HUGE usability and upgrade improvements that
WordPress folks have made in the last few years.

I personally maintain my own WordPress blog, and pretty much just
occasionally make a security update and otherwise hardly touch it. Plenty
of other people do this too, with varying degrees of success staying up to
date and fixing occasional operational problems.

But lots of people also use WordPress's hosted service as well. No need to
manually install security upgrades, maintain your own vhost, etc.

In our comparison, we have MediaWiki for self-hosters, and we have hosted
projects for Wikimedia, but if you want a hosted wiki that's not covered in
ads your choices are vague and poorly documented.


I would argue that many small projects would be better served by good wiki
farm hosting than trying to maintain their own sites.

Sure, that's not for everyone -- but I think the people who are best served
by running their own servers have a lot of overlap with the people who
would be capable of maintaining multiple services on a virtual machine.

-- brion


Comparing it to WordPress is a false equivalence, however. There, 
WordPress is both generic host and developer/maintainer, but here 
Wikimedia, the developer/maintainer, only hosts its own projects. Not 
others. No other host will be able to achieve the same scope and 
resources as Wikimedia, and even if what a host has is /good/, they 
still will not have anywhere near the same upstream clout.


And this is exactly the problem that I think many of us are objecting 
to. As third-party users, even when we are hosters ourselves, we have 
little clout, few resources, and yet somehow we are expected to stand on 
our own? How?


One of MediaWiki's strongest features is its flexibility, too, and that 
is something that wiki hosting services, even relatively good ones, 
often cannot effectively support due to limited scope. Even wanting 
fairly common things like SMW narrows options down considerably, and if 
you're building your own extensions in order to deal with a particular 
kind of data specific to your own research? Then what?


In many cases when folks run their own wikis, this is why - they either 
need something specific, or want to maintain some form of control over 
their own content/communities that isn't necessarily afforded otherwise. 
Otherwise, they wouldn't be bothering at all, because even as is it is 
considerable extra work to do this.


-I


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brad Jorsch (Anomie)
On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:

 What you're forgetting is that WMF abandoned MediaWiki as an Open Source
 project quite a while ago (at least 2 years ago).


{{citation needed}}

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 1:05 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:

 On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:

  What you're forgetting is that WMF abandoned MediaWiki as an Open Source
  project quite a while ago (at least 2 years ago).


 {{citation needed}}


There was a WMF engineering meeting where it was announced internally. I
was the only one that spoke against it. I can't give a citation to it
because it was never announced outside of WMF, but soon after that third
party support was moved to a grant funded org, which is the current status
quo.

Don't want to air dirty laundry, but you asked ;).

- Ryan

[Wikitech-l] (Reduced) Deployment schedule for the next two weeks (weeks of Jan 19th and 26th)

2015-01-16 Thread Greg Grossmeier
The next two weeks will hopefully be quiet, deployment-wise. This is due
to two weeks of in-person events that will take the majority of the
attention of WMF and community engineers.


== Week of January 19th ==
Summary:
* No MediaWiki train deploy due to WMF All Hands
** WMF All Hands is all day Wed and Thurs
* No deploys Mon/Wed/Thurs/Fri
* SWAT/etc on Tues OK


== Week of January 26th ==
Summary:
* No MediaWiki train deploy due to MediaWiki Dev Summit
** MediaWiki Dev Summit is Mon and Tues all day
* SWAT all week as normal
* No other deploys Mon/Tuesday


Thanks and as always, questions and comments welcome,

Greg

-- 
| Greg Grossmeier            GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg           A18D 1138 8E47 FAC8 1C7D |



Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Erik Moeller
On Fri, Jan 16, 2015 at 1:14 PM, Ryan Lane rlan...@gmail.com wrote:

  On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:
 
   What you're forgetting is that WMF abandoned MediaWiki as an Open Source
   project quite a while ago (at least 2 years ago).
 
 
  {{citation needed}}
 

 There was a WMF engineering meeting where it was announced internally. I
 was the only one that spoke against it. I can't give a citation to it
 because it was never announced outside of WMF, but soon after that third
 party support was moved to a grant funded org, which is the current status
 quo.

I think the confusion between third party support and an open
source project is unhelpful. We're obviously an open source project
with lots of contributors who aren't paid (and many of them are
motivated by Wikimedia's mission), it's just that the project puts
primary emphasis on the Wikimedia mission, and only secondary emphasis
on third party needs. I don't think it's a dirty secret that we moved
to a model of contracting out support to third parties -- there was
even an RFP for it. ;-)

Whether this is the best model, or whether it's time to think about
alternatives, is always up for debate. We just have a legitimate
tension between needing to focus as an org on what donors are
supporting us for, vs. potentially very tangentially related needs
(PostgreSQL support for some enterprise wiki installation).

Erik
-- 
Erik Möller
VP of Product  Strategy, Wikimedia Foundation


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Brad Jorsch (Anomie)
On Fri, Jan 16, 2015 at 4:14 PM, Ryan Lane rlan...@gmail.com wrote:

 On Fri, Jan 16, 2015 at 1:05 PM, Brad Jorsch (Anomie) 
 bjor...@wikimedia.org
  wrote:

  On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:
 
   What you're forgetting is that WMF abandoned MediaWiki as an Open
 Source
   project quite a while ago (at least 2 years ago).
 
 
  {{citation needed}}
 

 There was a WMF engineering meeting where it was announced internally. I
 was the only one that spoke against it. I can't give a citation to it
 because it was never announced outside of WMF, but soon after that third
 party support was moved to a grant funded org, which is the current status
 quo.


So this was never publicly announced and has had no visible effect, to the
point that the latest version of all the code is still publicly available
under a free license and volunteers are still encouraged to contribute?

Or is this conflating having someone else release tarballs with
abandoning MediaWiki as an open source project? That's an extremely big
stretch.

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 1:20 PM, Erik Moeller e...@wikimedia.org wrote:


 I think the confusion between third party support and an open
 source project is unhelpful. We're obviously an open source project
 with lots of contributors who aren't paid (and many of them are
 motivated by Wikimedia's mission), it's just that the project puts
 primary emphasis on the Wikimedia mission, and only secondary emphasis
 on third party needs. I don't think it's a dirty secret that we moved
 to a model of contracting out support to third parties -- there was
 even an RFP for it. ;-)


There's open source in technicality and open source in spirit. MediaWiki is
obviously the former. It's hard to call MediaWiki open source in spirit
when the code is dumped over the wall by its primary maintainer. Third
parties get security fixes, but little to no emphasis is put into making it
usable for them. Projects that involve making the software usable for third
parties are either rejected or delegated to WMF projects that never get
resourced because they aren't beneficial to WMF (effectively licking all
those cookies). MediaWiki for third parties is effectively abandonware
since no one is maintaining it for that purpose.

All of this is to say: Wikimedia Foundation shouldn't consider third party
usage because it doesn't as-is. I don't understand why there's a
conversation about it. It's putting constraints on its own architecture
model because of a third party community it doesn't support.

- Ryan

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Ryan Lane
On Fri, Jan 16, 2015 at 1:23 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
 wrote:

 So this was never publicly announced and has had no visible effect, to the
 point that the latest version of all the code is still publicly available
 under a free license and volunteers are still encouraged to contribute?


At the time there were projects about to start for improving the installer,
making extension management better, and a few other third-party things.
They were dropped to narrow scope. In practice, third-party support was
the only thing dropped from scope that year. So, maybe not a visible effect,
because it maintained the status quo, but there was definitely an effect.

- Ryan

Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Isarra Yos

On 16/01/15 21:20, Erik Moeller wrote:

On Fri, Jan 16, 2015 at 1:14 PM, Ryan Lane rlan...@gmail.com wrote:

On Fri, Jan 16, 2015 at 12:27 PM, Ryan Lane rlan...@gmail.com wrote:


What you're forgetting is that WMF abandoned MediaWiki as an Open Source
project quite a while ago (at least 2 years ago).


{{citation needed}}


There was a WMF engineering meeting where it was announced internally. I
was the only one that spoke against it. I can't give a citation to it
because it was never announced outside of WMF, but soon after that third
party support was moved to a grant funded org, which is the current status
quo.

I think the confusion between third party support and an open
source project is unhelpful. We're obviously an open source project
with lots of contributors who aren't paid (and many of them are
motivated by Wikimedia's mission), it's just that the project puts
primary emphasis on the Wikimedia mission, and only secondary emphasis
on third party needs. I don't think it's a dirty secret that we moved
to a model of contracting out support to third parties -- there was
even an RFP for it. ;-)


According to the text of the RfP, it was requesting proposals for 
managing the releases themselves. Those are nothing more than the very 
end result of the development process, and a part that only faces a 
subset of users. The process and content itself is a much larger problem 
that cannot so easily be shuffled away, which was, I thought, the entire 
reason why the WMF was specifically focussing on that instead. Unlike 
the releases themselves, this part is important to all of us, not just some.



Whether this is the best model, or whether it's time to think about
alternatives, is always up for debate. We just have a legitimate
tension between needing to focus as an org on what donors are
supporting us for, vs. potentially very tangentially related needs
(PostgreSQL support for some enterprise wiki installation).

Erik


This tension applies to all of the non-Wikipedia projects, not just 
MediaWiki. Would you say that Commons gathering up images for use on 
Wikisource is also something that shouldn't be done, as this is likewise 
not a part of the movement that donors tend to know about?


-I


Re: [Wikitech-l] The future of shared hosting

2015-01-16 Thread Stas Malyshev
Hi!

 The problem there is that the PHP implementation is likely to code-rot,
 unless we create really good unit tests that actually run for it.

With a service architecture, there should be a test suite running against
the service, verifying compliance with the API definition. So if the PHP
implementation passes that, it should be fine.
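Such a compliance suite can be sketched as contract checks run against any implementation of the service API, whatever language it is written in. The key-value interface below is invented for the example; it is not any real MediaWiki service definition:

```python
def run_compliance_suite(service):
    """Run API-contract checks against any object implementing the
    (hypothetical) key-value storage interface; return failed check names."""
    failures = []

    def check(name, cond):
        if not cond:
            failures.append(name)

    # Contract: get() returns what put() stored.
    service.put("k", "v")
    check("get-after-put", service.get("k") == "v")
    # Contract: delete() removes the key; get() then returns None.
    service.delete("k")
    check("get-after-delete", service.get("k") is None)
    return failures

class PhpLikeDictStore:
    """Stand-in for e.g. a pure-PHP fallback implementation of the API."""
    def __init__(self):
        self._d = {}
    def put(self, k, v):
        self._d[k] = v
    def get(self, k):
        return self._d.get(k)
    def delete(self, k):
        self._d.pop(k, None)

print(run_compliance_suite(PhpLikeDictStore()))  # → [] (no violations)
```

Because the suite exercises only the public contract, the same checks can run against the PHP fallback and the scaled-up service alike, which is what keeps the fallback from rotting.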

 It would be less bad if the focus were on librarization and daemonization
 (and improvement) of the existing PHP code so both methods could share a
 majority of the code base, rather than rewriting things from scratch in an
 entirely different language.

I think language is secondary here to other concerns. I.e. if the scale
requirements require implementing something that is best written in
language X, then requiring it to be all in PHP may just not be practical
for the scale.

 That becomes a hazard when other stuff starts to depend on the non-reduced
 feature set.

True, but I think there could still be minimal API and extended API.
E.g. a search engine could support just word-match search or advanced
expressions, or a database may be transactional with high levels of
reliability or non-transactional storage with minimal guarantees. Of
course, that means some functions would not be available everywhere -
but I don't see a good way to avoid it when the requirements diverge.
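The minimal-vs-extended split can be made explicit by having each backend advertise its capabilities, so callers can fall back when a feature is absent. A sketch under invented names (not a real MediaWiki interface):

```python
class SearchService:
    """Minimal contract every backend must support (sketch)."""
    CAPABILITIES = {"word-match"}

    def search(self, docs, word):
        # Baseline: exact word match only.
        return [d for d in docs if word in d.split()]

class AdvancedSearchService(SearchService):
    """Extended backend adding phrase queries on top of the minimal API."""
    CAPABILITIES = SearchService.CAPABILITIES | {"phrase"}

    def search_phrase(self, docs, phrase):
        return [d for d in docs if phrase in d]

def supports(service, capability):
    """Callers probe for extended features instead of assuming them."""
    return capability in service.CAPABILITIES

docs = ["big red fox", "red wine"]
basic = SearchService()
print(supports(basic, "phrase"))  # → False: caller falls back to word match
```

The functions that exist only in the extended API are then unavailable on some installs by design, which matches the point above: when requirements diverge, the divergence should at least be queryable.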

-- 
Stas Malyshev
smalys...@wikimedia.org


Re: [Wikitech-l] Working around composer? (Fatal error: Class 'Cdb\Reader' not found)

2015-01-16 Thread Legoktm
On 01/13/2015 09:08 AM, Bryan Davis wrote:
 On Tue, Jan 13, 2015 at 7:40 AM, Tyler Romeo tylerro...@gmail.com wrote:
 I know we just added some new maintenance scripts for checking things with 
 composer. I’m sure it wouldn’t be that bad having update.php check first and 
 tell the user to run “composer install” before doing update.php.
 
 
 Kunal made the new checkComposerLockUpToDate.php maintenance script
 to validate $IP/vendor against the $IP/composer.json file. An end user
 could either add this to their typical workflow before running
 update.php or we could try to find a reasonable way to integrate the
 check it performs into the update script. Checking for external
 dependencies isn't the same thing at all as updating a database schema
 so I'd lean towards suggesting that the new script be used separately.

I tried getting update.php to run checkComposerLockUpToDate.php but it
was messy and didn't really work well, so I uploaded a patch[1] that
adjusts the intro text to recommend that you run
checkComposerLockUpToDate.php before running update.php.

[1] https://gerrit.wikimedia.org/r/#/c/185592/
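The check amounts to comparing composer.json's declared requirements against what composer.lock recorded at install time. A much-simplified illustration of that idea (not the actual maintenance script, which also compares version constraints):

```python
import json

def lock_out_of_date(composer_json: str, composer_lock: str):
    """Return required packages missing from the lock file (simplified;
    platform packages like 'php' are skipped, as Composer itself does)."""
    required = json.loads(composer_json).get("require", {})
    locked = {p["name"] for p in json.loads(composer_lock).get("packages", [])}
    return [name for name in required if name not in locked and name != "php"]

cj = '{"require": {"php": ">=5.3", "cdb/cdb": "1.0.0"}}'
cl = '{"packages": []}'
print(lock_out_of_date(cj, cl))  # → ['cdb/cdb']
```

If the returned list is non-empty, the fix is to run `composer install` (or `composer update`) before update.php, which is exactly the workflow the patch's intro text recommends.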

-- Legoktm


[Wikitech-l] +2 for TTO in mediawiki/

2015-01-16 Thread Legoktm
Hi!

I have nominated TTO for +2 in mediawiki/ repositories:
https://www.mediawiki.org/wiki/Gerrit/Project_ownership#TTO_for_.2B2_in_mediawiki.2F.
Please comment :)

Additionally, there are some other requests on that page that could use
some comments.

Thanks,
-- Legoktm
