[Wikitech-l] Re: "Known languages" or similar?

2024-01-08 Thread Daniel Kinzler

On 08.01.2024 at 07:54, Strainu wrote:
I'm trying to add a "translate" link to [[:ro:Template:Ill-wd]] (which 
indicates a subject by its Wikidata id) and I need to determine the original 
language. Is there a way to determine whether the current user prefers/knows 
some languages besides the wiki's own language? I know I can use the interface 
language, but for the vast majority of users that's identical to the content 
language.


MediaWiki core does not support this, but the Babel extension does. It offers an 
API, too: https://www.mediawiki.org/wiki/Extension:Babel
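For example, something along these lines could query a user's declared language levels (a minimal, untested sketch; the meta=babel module and its babuser parameter are assumptions based on the extension's documentation, and the user name is hypothetical):

```php
<?php
// Sketch: fetch a user's self-declared Babel language proficiencies via
// the Action API module provided by Extension:Babel. Verify the module
// name and parameters against your wiki's API sandbox before relying on it.
$user = 'Example'; // hypothetical user name
$url = 'https://www.mediawiki.org/w/api.php?action=query&meta=babel'
	. '&babuser=' . rawurlencode( $user ) . '&format=json';
$data = json_decode( file_get_contents( $url ), true );

// Roughly expected shape: [ 'query' => [ 'babel' => [ 'de' => 'N', 'en' => '3' ] ] ]
foreach ( $data['query']['babel'] ?? [] as $lang => $level ) {
	echo "$lang: $level\n";
}
```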


--

Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Word embeddings / vector search

2023-05-09 Thread Daniel Kinzler
ces substantial new challenges in keeping the embeddings
up-to-date :)

Hope that helps.

Best,
Isaac

On Tue, May 9, 2023 at 2:10 PM Dan Andreescu  
wrote:

I encourage you to reach out to the search team, they're lovely folks
and even better engineers.

On Tue, May 9, 2023 at 1:53 PM Lars Aronsson  wrote:

On 2023-05-09 09:27, Thiemo Kreuz wrote:
> I'm curious what the actual question is. The basic concepts have been
> studied for about 60 years, and have been in use for about 20 to 30 years.

Sorry to hear that you're so negative. It's quite obvious that this is not
currently used in Wikipedia, but it is presented everywhere as a novelty,
not as something that has been around for 20 or 30 years.

> https://www.elastic.co/de/blog/introducing-approximate-nearest-neighbor-search-in-elasticsearch-8-0
> https://en.wikipedia.org/wiki/Special:Version
> https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2023-2024/Draft/Product_%26_Technology#Objectives
> https://wikitech.wikimedia.org/wiki/Search_Platform/Contact#Office_Hours


Thanks! This answers my question. It's particularly interesting to read
the talk page to the plan. Part of the problem is that "word embedding"
and "vector search" are not mentioned there, but a vector search could
have found the "ML-enabled natural language search" that is mentioned.
If and when this is tried, we will need to evaluate how well it works for
various languages.


-- 
   Lars Aronsson (l...@aronsson.se, user:LA2)
   Linköping, Sweden




-- 
Isaac Johnson (he/him/his) -- Senior Research Scientist -- Wikimedia Foundation



--
Amir (he/him)




--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: Deprecation: directly invoking maintenance scripts

2023-03-03 Thread Daniel Kinzler
; (which is written in Python):

https://gerrit.wikimedia.org/r/c/integration/quibble/+/875981/6/quibble/mediawiki/maintenance.py


Antoine "hashar" Musso
Wikimedia Release Engineering




--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Deprecation: directly invoking maintenance scripts

2023-01-09 Thread Daniel Kinzler
TLDR: Invoking maintenance scripts directly will be deprecated in MW 1.40; use 
maintenance/run.php instead. This affects anyone managing a MediaWiki 
installation, for development, testing, or production use.


Until now, MediaWiki maintenance scripts have been handled as standalone PHP scripts 
- for instance, to run the script that outputs the MediaWiki version, you would use:

php maintenance/version.php


Starting with MediaWiki 1.40, this is deprecated. The preferred way to run 
maintenance scripts is now by name, using the maintenance runner:

php maintenance/run.php version


Similarly, the preferred way to run the updater is now:

php maintenance/run.php update


The script to run can also be specified using the full path of the script file, 
or the full PHP class name of a subclass of the Maintenance class. For more 
details, run

php maintenance/run.php --help



Rationale and History:

Treating maintenance scripts as standalone PHP scripts requires some boilerplate 
code to be present at the top and at the bottom of every file. This is error 
prone and makes it difficult to update the maintenance framework. But more 
importantly,
for this boilerplate to work, the location of the MediaWiki installation has to 
be known relative to the maintenance script, which is not reliably possible for 
scripts defined in extensions.


A similar problem arises if the maintenance script needs a base class other than 
the default Maintenance class: since the class is loaded before MediaWiki is 
initialized, the autoloader is not yet in place, and the file containing the 
base class needs to be included explicitly.


These and similar issues can be avoided by creating a wrapper script that loads 
and executes the actual maintenance class. This way, the maintenance wrapper can 
initialize MediaWiki before passing control to the script.


I proposed creating such a wrapper in an RFC in 2018 (T99268)[^1], which was 
approved in 2019. However, implementing the proposal proved challenging, and 
soon stalled. I picked it up again as a side project after working on 
overhauling the configuration and bootstrapping code in early 2022: with the 
introduction of SettingsBuilder, it became much simpler to create a 
MaintenanceRunner class, because it was no longer necessary to juggle global 
variables.


Several bits and pieces got reviewed and merged over the course of 2022 (shout 
out to Amir, Tim, Timo, and everyone who contributed). Now the runner is ready, 
and we should stop calling maintenance scripts directly.


For now, existing maintenance scripts will function both ways[^2]: when called 
using the runner, or directly. However, newly created maintenance scripts should 
not be expected to be callable as standalone scripts, so it's best to change all 
callers to use the wrapper.


This should now work for nearly all[^2] cases, though there are still a couple 
of rough edges to be smoothed out. If you are running MediaWiki 1.40, please try 
the new mechanism, and report any issues on Phabricator.


Thanks,
Daniel

[^1]: https://phabricator.wikimedia.org/T99268
[^2]: with the exception of very old-school scripts that do not use the 
Maintenance base class and rely on CommandLineInc.php instead.


--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: Feedback wanted: PHPCS in a static types world

2022-11-15 Thread Daniel Kinzler

On 10.11.2022 at 03:08, Tim Starling wrote:
Clutter, because it's redundant to add a return type declaration when the 
return type is already in the doc comment. If we stop requiring doc comments 
as you propose, then fine, add a return type declaration to methods with no 
doc comment. But if there is a doc comment, an additional return type 
declaration just pads out the file for no reason. 


I agree that we shouldn't have redundant doc tags and return type declarations. 
I would suggest that all methods should have a return type declaration, but 
should not have a @return doc tag unless there is additional info to be 
provided. For example, when the declared return type is array, we may still 
want to document @return string[] or @return array.
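A quick illustrative sketch of that convention (the function names here are made up):

```php
<?php
// Native return type only: a @return tag would be redundant.
function getUserCount(): int {
	return 42;
}

// Native return type plus a doc tag, because "array" alone loses
// information: the tag records the element type.
/** @return string[] user names */
function getUserNames(): array {
	return [ 'Alice', 'Bob' ];
}
```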



The performance impact is measurable for hot functions. In gerrit 820244 
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/820244> I removed parameter 
type declarations from a private method for a benchmark improvement of 2%.


This raises an interesting issue, one that has bitten me before: How do we know 
that a given method is "hot"? Maybe we should establish a @hot or @performance 
tag to indicate that a given method should be optimized for speed. Is it 
possible to make phpcs smart enough that it would apply different rules if a 
method is marked as @hot? In that case, perhaps the use of type hints for 
parameters and the return type should be discouraged rather than encouraged.


--

Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: Feedback wanted: PHPCS in a static types world

2022-10-30 Thread Daniel Kinzler
ssary for actual documentation beyond types, then the type is generally 
still included (and Phan will check that it matches the static type), but when 
no further documentation is needed (see proposition 1 above), then the @var, 
@param, etc. doc comment can be omitted.


Note that depending on the PHP version, not all types can be losslessly 
represented as PHP static types yet (e.g. union types and mixed both need to 
wait for PHP 8.0, null and false for PHP 8.2); in such cases, doc comments can 
remain necessary.
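For instance, a union type can only be expressed natively from PHP 8.0 on (a sketch; the function names are made up):

```php
<?php
// Before PHP 8.0, a union type can only live in the doc comment:
/** @return string|false the value, or false if missing */
function lookupCompat( array $map, string $key ) {
	return $map[$key] ?? false;
}

// From PHP 8.0 on, the same union can be declared natively, and a doc
// comment is only needed for the extra prose:
function lookupNative( array $map, string $key ): string|false {
	return $map[$key] ?? false;
}
```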


Conclusion: *We should update our PHPCS ruleset to require fewer doc 
comments.* Exact rules are probably to be decided, depending on how much work 
we’re willing to put into the sniff implementations (e.g. is it feasible to 
require /** @param */ doc comments only if a parameter has no static type?), 
but generally, I argue that we want code such as the following to be allowed 
by our standard PHPCS ruleset:


class CreditsAction extends FormlessAction {

    private LinkRenderer $linkRenderer;
    private UserFactory $userFactory;

    /** Convert a Message to a MessageValue */
    public function convertMessage( Message $m ): MessageValue {

When doc comments are still necessary or at least beneficial because the type 
alone isn’t enough information, it’s up to humans to decide this while writing 
the code or point it out during code review.


What do people think about this? :)

PS: In PHP 8, we could abbreviate some of this code even more using 
constructor property promotion:


class CreditsAction extends FormlessAction {

    public function __construct(
        Page $page,
        IContextSource $context,
        private LinkRenderer $linkRenderer,
        private UserFactory $userFactory
    ) {
        parent::__construct( $page, $context );
    }


(Again, I’m not saying that all code should look like this – but I think we 
have plenty of existing code that effectively carries no additional 
information in its documentation, and which could be converted into this form 
without losing anything.)


Cheers,
Lucas

--
Lucas Werkmeister (he/er)
Software Engineer

Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30-577 11 62-0
https://wikimedia.de

Imagine a world in which every single human being can freely share in the sum 
of all knowledge. Help us to achieve our vision!

https://spenden.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V. 
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg 
under number 23855 B. Recognized as a charitable organization by the Finanzamt 
für Körperschaften I Berlin, tax number 27/029/42207.




--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-06 Thread Daniel Kinzler

On 06.10.2022 at 13:38, Kate Chapman wrote:

Hi Daniel,

Thanks for the suggestion on possible additions to the recruitment process.

I'm curious about the perception of TechCom being more community oriented than 
the new process.


Oh I didn't mean to imply that. I was just sharing my experience that asking for 
nominations worked better than asking for volunteers -- for staff members. 
TechCom was in theory open to the community, but that never materialized. In 
that regard, TechCom didn't work any better than TDF.


Regarding the public TechCom meetings. There aren't a bunch of secret 
Technical Decision Forum meetings going on now that people aren't being 
invited to. Is the desire to have an IRC meeting at some point in the decision 
making process to gain input? Or are there other ways people think would be 
better for contributing?


The IRC meetings helped to raise and focus attention, but they were frantic and 
participation was very ad-hoc. I personally wouldn't want them back.


With TDF, feedback gathering happens mostly on Google Docs. Perhaps that could 
be done on Phabricator instead, or on wiki pages; I think that would be more 
accessible to the general public. Currently, TDF process tracking is on Phabricator, 
discussion is on Google Docs, and publication is on wiki. It might work better 
to have everything in a single public place.


--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: TDF is looking for community representatives

2022-10-06 Thread Daniel Kinzler

On 06.10.2022 at 08:52, Linh Nguyen wrote:

Kunal,
I hear you but we only have 3 people who actually put the effort into 
applying for the position.  We are appointing people who are at least trying 
to help.  If you want to help in the process please feel free to put your name 
on the list.


The original mail doesn't really make it clear what impact one might have by 
joining, or what would be expected of a member. Asking people to click a link 
for details loses most of the audience already.


One thing that has worked pretty well in the past when we were looking for 
people to join TechCom was to ask for nominations, rather than volunteers. We'd 
then reach out to the people who were nominated, and asked them if they were 
interested. Self-nominations were of course also fine.


Another thing that might work is to directly approach active volunteer 
contributors to production code. There really aren't so many really active ones. 
Ten, maybe.


--
Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Breaking change to the DBPrimaryPos interface

2022-08-31 Thread Daniel Kinzler
This is an announcement of a breaking change without deprecation, per the Stable 
Interface Policy <https://www.mediawiki.org/wiki/Stable_interface_policy>.


If you have any objections, concerns or questions, please respond to this email 
as soon as possible.


*What will change?* We are adding two methods to the DBPrimaryPos interface 
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/828058>. Any classes 
implementing that interface will break if they do not implement these methods. 
Other usage of the class, such as calling methods, is unaffected. The change 
will be part of the 1.39 release.


*Why is the change needed? *We need a plain data representation of the 
DBPrimaryPos objects, so they can be stored in a cache without relying on native 
PHP serialization of objects. Native PHP serialization should be avoided because 
it is brittle and insecure 
<https://www.mediawiki.org/wiki/Development_policy#Implementation%20policies>. 
Issues may arise particularly when the class in question changes or PHP itself 
is updated. The problem that prompted this change is T316601 
<https://phabricator.wikimedia.org/T316601>.
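To illustrate the pattern (a self-contained sketch only; the class and method names below are made up, not the actual interface change):

```php
<?php
// Instead of serialize()/unserialize(), the object converts itself to
// and from a plain associative array. The array survives class renames
// and property changes far better than a native-serialized blob.
class ReplicationPosition {
	public function __construct(
		private string $binlogFile,
		private int $offset
	) {
	}

	public function toArray(): array {
		return [ 'file' => $this->binlogFile, 'pos' => $this->offset ];
	}

	public static function newFromArray( array $data ): self {
		return new self( $data['file'], $data['pos'] );
	}
}

// Only plain data goes into the cache:
$cached = ( new ReplicationPosition( 'db1-bin.000042', 1337 ) )->toArray();
$restored = ReplicationPosition::newFromArray( $cached );
```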


*Why is deprecation not feasible? *There is no backwards compatible way in PHP 
to add methods to an interface, nor is there a way to warn classes that are 
lacking the new methods. For this reason, it is generally recommended to provide 
a base class rather than expecting extensions to implement an interface 
directly. In this case, however, there is no base class, and the interface is 
marked as "stable to implement". But there appear to be no classes in extensions 
that implement the interface in question, so it seems reasonable to just add the 
methods.


--

Daniel Kinzler
Principal Software Engineer, Platform Engineering
Wikimedia Foundation

[Wikitech-l] Re: ClassCrawler – extremely fast and structured code search engine

2022-02-11 Thread Daniel Kinzler

On 11.02.22 at 09:15, Kunal Mehta wrote:
Seems like what you're asking for is 
<https://docs.gitlab.com/ee/user/project/code_intelligence.html>, right?


AFAICT that functionality is available in self-hosted GitLab, it just requires 
someone writing a LSIF implementation for PHP, as it's not already listed on 
<https://lsif.dev/#implementations-server>.


Oh nice! I did a few minutes of digging on LSIF and Sourcegraph, and it does 
sound quite good! Sourcegraph provides a search backend, a navigation 
frontend, an integration API, and plugins for GitLab as well as Phabricator, and even 
browser extensions. We could integrate it with codesearch as well, via its API. 
And LSIF is an open format for representing the kind of info we need.


Peter, what do you think of targeting LSIF instead of MongoDB?

I mean, just as an experiment. We still need to look closely at whether LSIF really 
covers our needs. https://code.visualstudio.com/blogs/2019/02/19/lsif says: 
"Same as LSP, LSIF doesn't contain any program symbol information nor does the 
LSIF define any symbol semantics (for example, what makes the definition of a 
symbol or whether a method overrides another method). The LSIF therefore doesn't 
define a symbol database, which is consistent with the LSP approach."


That's actually quite a bummer. The most critical kind of search after "where is 
this called" is "what overrides this method"... Do I understand correctly that 
LSIF doesn't do that?

PS: Sourcegraph's licensing model is a bit confusing though, seems like it's 
Apache for the core, and "open but not free" for some extra bits.


--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Re: ClassCrawler – extremely fast and structured code search engine

2022-02-11 Thread Daniel Kinzler

On 07.02.22 at 08:43, Giuseppe Lavagetto wrote:
Ok, why do you think symbol search can't be integrated in the current 
codesearch? That's what Amir was proposing. Sadly I don't think much of the 
current code of ClassCrawler can be reused for that goal, and it's a pity.


It could be integrated with the UI, but would require a very different backend.

I very much like Kunal's suggestion about LSIF. I'll reply to his mail.

--

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Re: ClassCrawler – extremely fast and structured code search engine

2022-02-05 Thread Daniel Kinzler

On 05.02.22 at 21:38, Amir Sarabadani wrote:
Codesearch has been working fine in the past couple of years. There is a new 
frontend being built and I hope we can deploy it soon to provide a better user 
experience and I personally don't see a value in re-implementing codesearch. 
Especially using non-open source software.


While I agree with several points that have been raised, in particular about 
licensing and building on top of existing tools, I'd like to point out that the 
idea is not to re-implement codesearch, but to overcome some of its limitations. 
What we use codesearch for most is finding usages of methods (and sometimes 
classes). This works fine if the method name is fairly unique. But if the method 
name is generic, or you are moving a method from one class to another and you 
want to find callers of the old method, but not the new method, then regular 
expressions just don't cut it.


Basically, I'd want codesearch to allow me to do the kind of "find callers" 
search that IDEs like PhpStorm support. Sure, I could do it in the IDE, but I 
can't link to that from a ticket, and I'd have to make sure I have exactly the 
right set of extensions installed (and updated).


A tool very much like codesearch, but based not on regular expressions but 
rather on symbols and their relationships, would be very valuable to me. The 
question of how exactly it should be built is of course open.


--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Breaking Change: Extensions Initialization via Composer Autoloading

2021-11-23 Thread Daniel Kinzler

Hi all!

This is a heads up for extensions that expect to be installed via composer:

Starting with the release of 1.38, extensions MUST NOT access any MediaWiki 
functions, variables, or constants in code that is executed by the composer 
autoloader <https://getcomposer.org/doc/04-schema.md#files>. The earliest point 
in time in which extensions MAY access SOME MediaWiki functions, variables, or 
constants is the registration callbacks 
<https://www.mediawiki.org/wiki/Manual:Extension.json/Schema#callback>, though 
restrictions apply 
<https://www.mediawiki.org/wiki/Manual:Extension_registration/Limitations> there 
as well.


Note that managing extension installation via composer will continue to work as 
before for now (though support for this is not well defined, see T250406 
<https://phabricator.wikimedia.org/T250406>). Support is removed only for 
interacting with functions, variables, or constants defined by MediaWiki before 
extension.json is loaded. Doing this has long been known to cause problems, see 
the discussion on T249573 <https://phabricator.wikimedia.org/T249573> and T61872 
<https://phabricator.wikimedia.org/T61872>.


The immediate reason to remove support now is that we are working on allowing 
MediaWiki to load configuration from JSON and YAML files 
<https://phabricator.wikimedia.org/T292402> instead of setting global variables 
in PHP files. Exploration in this area has shown the need to have external 
libraries available early on during the initialization process, so we can use 
them to process settings files. This makes it necessary to change the order of 
initialization so that the auto-loading component of composer is executed before 
we set up global state with default settings etc. Because of this, code that is 
executed by the autoloader cannot have access to things that get initialized 
later on.


This change has already been made to MediaWiki core as deployed on WMF sites. We 
found out the hard way <https://phabricator.wikimedia.org/T295883> that there 
are extensions that presently rely on accessing MediaWiki settings from code 
that runs during autoloader initialization. After some discussion, it was 
decided that it is time to remove support for this practice, since it has been 
frowned upon for a long time, and there seems to be no good way to resolve this 
chicken-and-egg problem in the initialization sequence.


Please reply to this email if you have any concerns about this change.

--
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Re: Selenium tests with two instances of Mediawiki

2021-07-02 Thread Daniel Kinzler
On 02.07.21 at 12:31, Antoine Musso wrote:
>
> Maybe some of those tests could be written via PHPUnit? I don't know how to
> simulate two MediaWiki instances in there though :-\

I have long wanted that, and I have recently worked on making this possible as a
personal side project. However, the devil is in the details...

Have a look at https://phabricator.wikimedia.org/T261848 if you are interested.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Request for comments: A new way to do permission checks

2021-01-28 Thread Daniel Kinzler
(now sharing beyond the WMF tech department)

Dear colleagues and MediaWiki contributors,

On behalf of the Platform Engineering Team, I am delighted to invite you to the
MediaWiki Authority interface[1][2] evaluation during the Platform Engineering
Office Hours[3] on Feb 04 1700 UTC. Dress Formal. Or not ;)

Before we commit to using Authority for permission checking throughout the
codebase, we want to make sure that we didn't miss anything. So if you are
working on code that needs to check user permissions, please join the PET office
hour and give us your feedback!

During the meeting, we will present the new Authority interface, which defines a
standard for checking user permissions, blocks, throttles, etc, and allows us to
easily make the relevant context for permission checks available where it is
needed.
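To give a rough idea of the shape (an illustrative, self-contained sketch only; see the real Authority.php linked at [1] for the actual method set):

```php
<?php
// Illustrative mini-version of the idea: callers ask an Authority
// object whether an action is permitted, instead of consulting global
// state. Method names loosely echo the real interface; this is a
// sketch, not MediaWiki code.
interface Authority {
	/** Cheap, possibly cached check, e.g. for deciding whether to show a link. */
	public function probablyCan( string $action, string $page ): bool;

	/** Definitive check, performed right before executing the action. */
	public function definitelyCan( string $action, string $page ): bool;
}

class SimpleAuthority implements Authority {
	/** @param string[] $grants actions this user may perform */
	public function __construct( private array $grants ) {
	}

	public function probablyCan( string $action, string $page ): bool {
		return in_array( $action, $this->grants, true );
	}

	public function definitelyCan( string $action, string $page ): bool {
		// A real implementation would also check blocks, throttles, etc.
		return in_array( $action, $this->grants, true );
	}
}

$performer = new SimpleAuthority( [ 'read', 'edit' ] );
var_dump( $performer->probablyCan( 'edit', 'Main_Page' ) ); // bool(true)
```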


We are going to explain the design and demonstrate its application in various
areas using exploratorypatches. We would love to hear your opinion on the
approach we’re taking. Ifyou contribute to the MediaWiki or extension codebase,
you will likely have to use the new interface if it’s accepted, and this is your
opportunity to raise concerns and objections before the interface is finalized.


The meeting during Platform Engineering Office Hours will loosely correspond to
the second step of the brand new Technical Decision Process[4], for which we made
a "Decision Statement Overview"[5]. After the initial meeting, you will have two
weeks for feedback on the ticket[2]. At the end of the two-week period we may
schedule a followup meeting, depending on the feedback we receive.


Cheers. PET.


 1. https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/core/+/refs/heads/master/includes/Permissions/Authority.php
 2. https://phabricator.wikimedia.org/T231930
 3. https://meet.google.com/pjo-xtxv-oea
 4. https://www.mediawiki.org/wiki/Technical_Decision_Making_Process#2_Technical_Decision_Forum
 5. https://docs.google.com/document/d/1RT3mWt57RkGJdeV5kVH_eoVOBu-97w7sYheepKt6DgM/edit


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] Final TechCom Digest, 2020-01-20

2021-01-26 Thread Daniel Kinzler
Hello all!

This is the final TechCom digest. With the new Technical Decision Making Process
<https://www.mediawiki.org/wiki/Technical_Decision_Making_Process> in place, we
are spinning down the RFC process and shutting down the committee. A big Thank
You to all committee members past and present for their time and dedication!

On a closing note, two RFCs have been approved after Last Call, both of which
I'm personally very happy to see:

*Stable interface policy amendment <https://phabricator.wikimedia.org/T268326>*:
The policy was amended to include a definition of the "MediaWiki Ecosystem" of
extensions to be considered when deprecating obsolete code. The deprecation
process was overhauled to allow for a clear timeline from soft deprecation via
hard deprecation to removal.

*Drop support for upgrading from old releases (pre 1.31)
<https://phabricator.wikimedia.org/T259771>*: this frees us up to remove about a
thousand database patch files only needed for upgrading from very old
systems. Upgrading from old versions of MediaWiki will still be possible, but
will have to be performed in multiple steps.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] TechCom digest 2021-01-06

2021-01-08 Thread Daniel Kinzler
Hi all!

Here's the summary of Wednesday's TechCom meeting. Due to the holidays, there
wasn't much activity, but there are a few Last Calls to close, and a couple of
new ones.


Present: Daniel K, Tim S, Timo T, Kate C.


RFC: Expand API title generator to support other generated data

  * T263841 <https://phabricator.wikimedia.org/T263841>: RFC: Expand API title
    generator to support other generated data
  * At end of last call, Accepted (pending finalization per mail, due to lack of
    quorum in the meeting)


RFC: Introduce PageIdentity to be used instead of WikiPage

  * T208776 <https://phabricator.wikimedia.org/T208776>: RFC: Introduce
    PageIdentity to be used instead of WikiPage
  * Some discussion on names since last meeting
  * TT has outstanding concerns about migration and getWikiId()
  * Keep open for another week.


RFC: Drop support for older database upgrades

  * https://phabricator.wikimedia.org/T259771
  * Move to last call


RFC: Amendment to the Stable interface policy (November 2020)

  * https://phabricator.wikimedia.org/T268326
  * Move to last call


Amend RFC process to allow for three last calls

  * https://www.mediawiki.org/wiki/Requests_for_comment
  * TechCom is seeking to immediately amend the process such that three RFCs can
    be on last call (rather than just two). Today's meeting did not have quorum
    attendance, so this motion will be decided on asynchronously, and adopted by
    next week if consensus is reached.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-16

2020-12-17 Thread Daniel Kinzler
Ariel just pointed out that the original mail had the wrong link for the RFC
ticket. Sorry about that! The correct link to the RFC about PageIdentity is:
https://phabricator.wikimedia.org/T208776

-- 

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-16

2020-12-17 Thread Daniel Kinzler

Am 17.12.20 um 14:25 schrieb Daniel Kinzler:
> My original proposal was to change the signature of getId() to getId( $wikiId
> = false ), to assert that the PageIdentity actually belongs to the wiki the
> caller expects.


I made a patch exploring that option:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/650126.
<https://gerrit.wikimedia.org/r/c/mediawiki/core/+/650126>

It's much simpler, but it relies on runtime assertions rather than type hints.
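To illustrate the idea behind the runtime-assertion variant, here is a minimal sketch (Python for brevity; the real patch is PHP, and the class and method names below are illustrative, not the actual MediaWiki API):

```python
class PageIdentity:
    """Sketch of a page identity that knows which wiki it belongs to."""

    LOCAL = False  # sentinel meaning "the local wiki"

    def __init__(self, page_id, wiki_id=LOCAL):
        self._page_id = page_id
        self._wiki_id = wiki_id

    def get_id(self, wiki_id=LOCAL):
        # Runtime assertion instead of a type hint: the caller states which
        # wiki it expects, and we fail fast on a cross-wiki mix-up rather
        # than silently returning an ID that is meaningless in that context.
        if wiki_id != self._wiki_id:
            raise ValueError(
                f"Expected wiki {wiki_id!r}, but this page belongs to "
                f"{self._wiki_id!r}"
            )
        return self._page_id


local_page = PageIdentity(42)
print(local_page.get_id())           # 42 -- local wiki, as expected

remote_page = PageIdentity(7, "dewiki")
print(remote_page.get_id("dewiki"))  # 7
# remote_page.get_id() would raise ValueError: wrong wiki
```

The trade-off discussed above is visible here: nothing in the signature prevents passing the wrong wiki ID, so mistakes only surface at runtime.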

-- 

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-16

2020-12-17 Thread Daniel Kinzler
ed for clarifications. Lucas responded with a comment
> and Lydia offered to have it explained in a call.
>   * T208776 <https://phabricator.wikimedia.org/T208776>: RFC: Introduce
> PageIdentity to be used instead of WikiPage.
>   o Krinkle and Daniel discuss implementation details in context of
> what's best for type safety, clarity and migration.
>
>
>   -Niklas
> _______
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org <mailto:Wikitech-l@lists.wikimedia.org>
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-16

2020-12-16 Thread Daniel Kinzler
>   * T208776 <https://phabricator.wikimedia.org/T208776>: RFC: Introduce
> PageIdentity to be used instead of WikiPage.
>   o Krinkle and Daniel discuss implementation details in context of what's
> best for type safety, clarity and migration.
>
I hope we can move this to Last Call today. It would be useful to have agreement
on this before we start experimenting with introducing Authority.
<https://phabricator.wikimedia.org/T262296>

On an unrelated note, Tgr is requesting clarification about merge rights for
repos that are used for things like tools on labs:
https://www.mediawiki.org/wiki/Topic:Vzovpwgjev621dbq


-- 

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] TechCom digest 2020-12-08

2020-12-08 Thread Daniel Kinzler
This is the weekly TechCom board review in preparation of our meeting on
Wednesday. If there are additional topics for TechCom to review, please let us
know by replying to this email. However, please keep discussion about individual
RFCs to the Phabricator tickets.

Activity since Monday -MM-DD on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox:

  * Remove legacy ajax interface T42787
<https://phabricator.wikimedia.org/T42787>
  o Very old task. We should just do it.

Committee board activity:

  * Create WikiTeq group on Gerrit T267213
<https://phabricator.wikimedia.org/T267213>
  o We should just add WikiTeq to the policy page

New RFCs: none

Phase progression:

  * PHP microservice for containerized shell execution (aka ShellBox) T260330
<https://phabricator.wikimedia.org/T260330>
  o Last call should have ended last week. Some discussion during last call
period. New comment asking why this should be written in PHP.
  o Code is already merged, though not used yet.
  o Can probably be approved.
  * Amendment to the Stable interface policy T268326
<https://phabricator.wikimedia.org/T268326>
  o Question about new requirement to wait at least three months between hard
deprecation and removal.
  o Moved to phase 4
  * PageIdentity T208776 <https://phabricator.wikimedia.org/T208776>
  o Thiemo (WMDE) asking about some details
  o Daniel would like this to go on last call.

Other RFC activity:

  * Expand API title generator to support other generated data T263841
<https://phabricator.wikimedia.org/T263841>
  o Tim says this is fine and should go ahead
  o Do we need a last call?
  * Provide mechanism for defining and utilizing configuration sets for local
development and browser / API-testing tests T267928
<https://phabricator.wikimedia.org/T267928>
  o Continued discussion after IRC meeting
  o Adam Wight emphasizes the need to have config controlled per-request,
rather than changing config on disk.
  * Store WikibaseQualityConstraint check data in persistent storage T214362
<https://phabricator.wikimedia.org/T214362>
  o WMDE would like this to move forward.  Resourcing is not really clear.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-02

2020-12-03 Thread Daniel Kinzler
Am 30.11.20 um 22:00 schrieb Dan Andreescu:
> Automatically index extensions in Codesearch
> <https://phabricator.wikimedia.org/T268328> is still in the inbox, no activity
> since last week's discussion

I have abandoned this in favor of relying on the list of 3rd party extensions
maintained by the MediaWIki Stakeholder group, see
https://github.com/MWStake/nonwmf-extensions/


>   * Create WikiTeq group on Gerrit <https://phabricator.wikimedia.org/T267213>
> is now more clear (see note from Daniel
> <https://phabricator.wikimedia.org/T267213#6653536> to discuss, which we
> should do here)
>
> Committee board activity:
>
The process for adding a "trusted organization" isn't clear, but since the group
is not maintaining any extensions deployed by the WMF, we can probably just go ahead
and add them.


>   * General ParserCache service class for large "current" page-derived data
> <https://phabricator.wikimedia.org/T227776> was declined by Daniel, see
> his reasoning there
>
I'd like to note that "declined" here means that I have withdrawn my own 
proposal.


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-12-02

2020-12-03 Thread Daniel Kinzler
Hi all!

I'd like to mention one additional topic: I have been touching up my proposal
for the introduction of PageIdentity
<https://phabricator.wikimedia.org/T208776>, as a lightweight  alternative to
Title and WikiPage. I would like to propose for it to go on last call. It has
been sitting in the "tune" phase for a while, and it seems pretty mature to me
by now.

You can find the proposed code on gerrit, see
Iaed4871e0d32c67d4fb13e487625527f6a21e9c5
<https://gerrit.wikimedia.org/r/q/Iaed4871e0d32c67d4fb13e487625527f6a21e9c5>.
There is a chain of follow-up patches (some incomplete) that demonstrate how the
new interface is intended to be used, and how it can improve existing code.

Am 30.11.20 um 22:00 schrieb Dan Andreescu:
>
> This is the weekly TechCom board review. If there are additional topics for
> TechCom to review, please let us know by replying to this email. However,
> please keep discussion about individual RFCs to the Phabricator tickets.  This
> week, the meeting is async, so we'll be replying to the email as well, *feel
> free to join in*!
>
> Activity since Wednesday 2020-11-25 on the following boards:
>
> https://phabricator.wikimedia.org/tag/techcom/
> https://phabricator.wikimedia.org/tag/techcom-rfc/
>
> Committee inbox:
>
>   * Automatically index extensions in Codesearch
> <https://phabricator.wikimedia.org/T268328> is still in the inbox, no
> activity since last week's discussion
>   * Create WikiTeq group on Gerrit <https://phabricator.wikimedia.org/T267213>
> is now more clear (see note from Daniel
> <https://phabricator.wikimedia.org/T267213#6653536> to discuss, which we
> should do here)
>
> Committee board activity:
>
>   * General ParserCache service class for large "current" page-derived data
> <https://phabricator.wikimedia.org/T227776> was declined by Daniel, see
> his reasoning there
>
> New RFCs: (none)
>
> Phase progression: (none)
>
> IRC meeting request: (none)
>
> Other RFC activity:
>
>   * RFC: Re-evaluate librsvg as SVG renderer for WMF wikis
> <https://phabricator.wikimedia.org/T40010:> saw more conversation
>   * RFC: Discourage use of MySQL's ENUM type
> <https://phabricator.wikimedia.org/T119173> Amir points out the need to
> overhaul all db documentation (I agree, could we have a doc sprint about
> this?) including policies
>   * RFC: Amendment to the Stable interface policy, November 2020
> <https://phabricator.wikimedia.org/T268326> has a suggestion/question
> about clarifying default status and deprecation procedure
>   * RFC: Store WikibaseQualityConstraint check data in persistent storage
> <https://phabricator.wikimedia.org/T214362> WMDE calls for help from
> TechCom and the Platform team to collaborate here.  See Adam Shorland's
> comment about putting it through the new Technical Decision Making Process
>   * RFC: Expand API title generator to support other generated data
> <https://phabricator.wikimedia.org/T263841> Carly Bogen is asking for
> clarity on the need for a steward.  Timo's comment
> <https://phabricator.wikimedia.org/T263841#6617970> on the task indeed
> seem to contradict his note from last week's grooming email.  Timo, would
> you please clarify.
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Proposed updates to the deprecation process

2020-12-02 Thread Daniel Kinzler
Hi all!

In August, I wrote to this list to discuss when and how breaking changes can
be made without deprecation
<https://lists.wikimedia.org/pipermail/wikitech-l/2020-August/093761.html>. The
proposal I made at the time was admittedly rather radical, but it led to a good
discussion and flushed out a number of pain points and interesting ideas. Based
on this discussion and other feedback, I have drafted an update to the Stable
Interface Policy. You can find it here:

User:DKinzler_(WMF)/Stable_interface_policy
<https://www.mediawiki.org/wiki/User:DKinzler_(WMF)/Stable_interface_policy>
(diff
<https://www.mediawiki.org/w/index.php?title=User:DKinzler_(WMF)/Stable_interface_policy&type=revision&diff=4258989&oldid=4242722&diffmode=source>)

The draft has entered the RFC process
<https://phabricator.wikimedia.org/T268326>, and I intended to move it through
swiftly, so it can be adopted soon. If you have any thoughts or feedback, please
reply to this email, or put it on the phab task.

I would like to highlight a few of the changes that I am proposing:

  * Define the idea of a MediaWiki "ecosystem", and state that extensions in
this ecosystem should receive support in dealing with upcoming breaking
changes. The ecosystem includes actively maintained extensions hosted by
Wikimedia or listed by the MediaWiki Stakeholder Group.
  * Clarify that removal without deprecation is only ok for code that has never
been used elsewhere.
  * Allow "hard deprecation by announcement" when it is not possible to emit
deprecation warnings. This was previously stated as an exception from the
deprecation policy, rather than a mode of deprecation.
  * Clarify that individuals or teams who deprecate code commit to removing
usages of that code asap at least in code maintained by Wikimedia, and
should support 3rd parties in this removal from code in the ecosystem.
  * Remove the recommendation that hard-deprecated code should be kept for two
releases. This is replaced by a requirement to keep it for one release, and
three months on master.
  * Clarify that deprecation and removal of deprecated code should not be
backported to release candidates, and should ideally happen right after a
release, rather than shortly before a release.

The intent is to streamline the deprecation process, ensuring that deprecated
code becomes unused quickly, without causing too much of a disturbance.

Besides this, the proposal contains a number of other additions and
clarifications, as described on the phab ticket and visible in the diff.

I'm looking forward to hearing your thoughts and ideas!

-- 

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] TechCom RFC review meeting: mechanism for overriding configuration for browser tests

2020-11-20 Thread Daniel Kinzler
Kosta recently filed an RFC about a way for browser tests (and Mocha tests) to
trigger different configurations on the backend:
https://phabricator.wikimedia.org/T267928. 
<https://phabricator.wikimedia.org/T267928>

This kind of mechanism would give end-to-end tests a way to easily test
different setups, but also to mock out unnecessary parts of the backend, load
fixtures, or protect against the influence of extensions.

We will be discussing the proposal next Wednesday, November 25, at 22:00 UTC
(2pm EST, 23:00 CET). As always, the meeting will take place in
#wikimedia-office channel on Freenode.

If you are interested, please join, or comment on the ticket.

-- daniel

Am 18.11.20 um 23:46 schrieb Krinkle:
> The minutes from TechCom's triage meeting on 18 November 2020.
>
> Present: Tim S, Daniel K, Timo T.
>
>
> New RFC: Provide mechanism for overriding configuration for browser tests
>
>   * https://phabricator.wikimedia.org/T267928
>   * TT: High time we resource this. Some previous research on this when we
> transitioned browser tests from Ruby to Node/WebdriverIO. At the time, we
> wanted to keep the ability to run the same tests against local+CI+beta,
> which made this rather difficult.
>   * Moved to P2.
>
>
> RFC: Discourage use of MySQL's ENUM type
>
>   * https://phabricator.wikimedia.org/T119173
>   * DK: yes discourage by default
>   * TS: Jaime mentioned that ENUM's sort differently from text, but also said
> we shouldn't ban it outright.
>   * TT: as proposed sounds right. generally there are better solutions, but as
> justified optimization specific high-scale uses could be allowed.
>
>
> RFC: Drop support for database upgrade older than two LTS releases
>
>   * https://phabricator.wikimedia.org/T259771
>   * TT: principally seems fine, not aware of concerns. we'd want to make sure
> we cover the failure scenarios, e.g. not just soft documentation, but
> actually programmatically detected and prevent disaster. I'll comment 
> on-task.
>   * DK: Platform team as stakeholder for ..?
>   * TT: I guess potential veto in terms of what the minimum support should be,
> and if okay with trailing/dropping, then how long it has to be.
>
>
> RFC: Expiring watch list entries
>
>   * https://phabricator.wikimedia.org/T262946
>   * Last Call ended. Approved.
>
>
> RFC: Shellbox microservice for MediaWiki
>
>   * https://phabricator.wikimedia.org/T260330
>   * TT: Worth noting that it is an optional service. The current logic remains
> the same as before and Shell-exec call API also remains compatible. The
> library can effectively now be put into a container and MW configured to
> use that rather than calling directly.
>   * Put on Last Call until 2 December.
>
>
> Next week IRC office hours
>
> No IRC discussion scheduled for next week.
>
>
> You can also find our meeting minutes at
> https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes
>
> If you prefer you can subscribe to our newsletter here
> https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar
>
> -- Timo
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom topics 2020-11-04 (fixed)

2020-11-03 Thread Daniel Kinzler
Am 02.11.20 um 19:24 schrieb Daniel Kinzler:
>
> [Re-posting with fixed links. Thanks for pointing this out Cormac!]
>
> This is the weekly TechCom board review.  Remember that there is no meeting on
> Wednesday, any discussion should happen via email. For individual RFCs, please
> keep discussion to the Phabricator tickets.
>
That's another issue I wanted to raise: Platform Engineering is working on
switching ParserCache to JSON. For that, we have to make sure extensions only
put JSON-Serializable data into ParserOutput objects, via setProperty() and
setExtensionData(). We are currently trying to figure out how to best do that
for TemplateData.

TemplateData already uses JSON serialization, but then compresses the JSON
output, to make the data fit into the page_props table. This results in binary
data in ParserOutput, which we can't directly put into JSON. There are several
solutions under discussion, e.g.:

* Don't write the data to page_props, treat it as extension data in
ParserOutput. Compression would become unnecessary. However, batch loading of
the data becomes much slower, since each ParserOutput needs to be loaded from
ParserCache. Would it be too slow?

* Apply compression for page_props, but not for the data in ParserOutput. We
would have to introduce some kind of serialization mechanism into PageProps and
LinksUpdate. Do we want to encourage this use of page_props?

* Introduce a dedicated database table for templatedata. Cleaner, but schema
changes and data migration take a long time.

* Put templatedata into the BlobStore, and just the address into page_props.
Makes loading slower, maybe even slower than the solution that relies on
ParserCache.

* Convert TemplateData to MCR. This is the cleanest solution, but would require
us to create an editing interface for templatedata, and migrate out existing
data from wikitext. This is a long term perspective.

To unblock migration of ParserCache to JSON, we need at least a temporary
solution that can be implemented quickly. A somewhat hacky solution I can see 
is:

* detect binary page properties and apply base64 encoding to them when
serializing ParserOutput to JSON. This is possible because page properties can
only be scalar values. So we can convert to something like { _encoding_: "base64",
data: "34c892ur3d40" }, and recognize the structure when decoding. This wouldn't
work for data set with setTemplateData, since that could already be an arbitrary
structure.
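A rough sketch of that encode/decode round trip (Python for illustration; the real code would live in the PHP serialization layer, and only the { _encoding_: "base64", data: ... } shape is taken from the proposal above):

```python
import base64


def encode_prop(value):
    """Wrap binary scalar values so the result is JSON-safe."""
    if isinstance(value, bytes):
        return {
            "_encoding_": "base64",
            "data": base64.b64encode(value).decode("ascii"),
        }
    return value


def decode_prop(value):
    """Recognize the wrapper structure and restore the original bytes."""
    if isinstance(value, dict) and value.get("_encoding_") == "base64":
        return base64.b64decode(value["data"])
    return value


# e.g. gzip-compressed TemplateData JSON, which is binary and not JSON-safe
compressed = b"\x1f\x8b\x08..."
wrapped = encode_prop(compressed)
assert decode_prop(wrapped) == compressed

# Ordinary scalar page properties pass through untouched.
assert encode_prop("plain string") == "plain string"
```

As the paragraph above notes, this only works because page properties are guaranteed to be scalars; arbitrary extension-data structures could collide with the wrapper shape.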

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom topics 2020-11-04 (fixed)

2020-11-03 Thread Daniel Kinzler
Am 02.11.20 um 19:24 schrieb Daniel Kinzler:
> T262946 <https://phabricator.wikimedia.org/T262946> *"Bump Firefox version in
> basic support to 3.6 or newer"*: last call ending on Wednesday, November 4.
> Some comments, no objections.
>
Since we are not having a meeting on Wednesday, I guess we should try and get
quorum to approve by mail.

I'm in favor.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] TechCom topics 2020-11-04 (fixed)

2020-11-02 Thread Daniel Kinzler
[Re-posting with fixed links. Thanks for pointing this out Cormac!]

This is the weekly TechCom board review.  Remember that there is no meeting on
Wednesday; any discussion should happen via email. For individual RFCs, please
keep discussion to the Phabricator tickets.


Activity since Monday 2020-10-26 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/


https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee board activity:

  *

T175745 <https://phabricator.wikimedia.org/T175745> *"overwrite edits when
conflicting with self"* has once again come up while working on EditPage.
There no longer seems to be any reason for this behavior. I think it does
more harm than good. We should just remove it.

RFCs:

Phase progression:

  * T266866 <https://phabricator.wikimedia.org/T266866> *"Bump basic supported
browsers (grade C) to require TLS 1.2"*: newly filed, lively discussion.
Phase 1 for now.



  *

T263841 <https://phabricator.wikimedia.org/T263841> *"Expand API title
generator to support other generated data"*: dropped back to phase 2 because
resourcing is unclear.

  * T262946 <https://phabricator.wikimedia.org/T262946> *"Bump Firefox version
in basic support to 3.6 or newer"*: last call ending on Wednesday, November
4. Some comments, no objections.


Other RFC activity:

  * T250406 <https://phabricator.wikimedia.org/T250406> *"Hybrid extension
management"*: Asked for clarification on expectations for WMF to publish
extensions to packagist. Resourcing is being discussed in the platform team.

Cheers,
Daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] TechCom topics 2020-11-04

2020-11-02 Thread Daniel Kinzler
This is the weekly TechCom board review.  Remember that there is no meeting on
Wednesday; any discussion should happen via email. For individual RFCs, please
keep discussion to the Phabricator tickets.


Activity since Monday 2020-10-26 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/


https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee board activity:

  *

T175745 <https://phabricator.wikimedia.org/T175745> *"overwrite edits when
conflicting with self"* has once again come up while working on EditPage.
There no longer seems to be any reason for this behavior. I think it does
more harm than good. We should just remove it.

RFCs:

Phase progression:

  * T266866 <https://phabricator.wikimedia.org/T266866> *"Bump basic supported
browsers (grade C) to require TLS 1.2"*: newly filed, lively discussion.
Phase 1 for now.



  *

T263841 <https://phabricator.wikimedia.org/T263841> *"Expand API title
generator to support other generated data"*: dropped back to phase 2 because
resourcing is unclear.

  * T262946 <https://phabricator.wikimedia.org/T262946> *"Bump Firefox version
in basic support to 3.6 or newer"*: last call ending on Wednesday, November
4. Some comments, no objections.


Other RFC activity:

  * T250406 <https://phabricator.wikimedia.org/T250406> *"Hybrid extension
management"*: Asked for clarification on expectations for WMF to publish
extensions to packagist. Resourcing is being discussed in the platform team.

Cheers,
Daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] MediaWiki logo

2020-10-27 Thread Daniel Kinzler
Am 27.10.20 um 01:28 schrieb Denny Vrandečić:
> Amir, thank you!

Yes! Thank you! That was quite a ride!


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-10-14

2020-10-15 Thread Daniel Kinzler
Am 15.10.20 um 03:58 schrieb Dan Andreescu:
> Committee inbox:
>
>   * T263904 <https://phabricator.wikimedia.org/T263904>: Are traits part of
> the stable interface?
>
I'm working on an update for the Stable Interface Policy
<https://www.mediawiki.org/wiki/User:DKinzler_(WMF)/Stable_interface_policy>.
Besides some minor tweaks to the wording, the main changes are:

  * added a section on "stable for use" for traits.
  * overhauled the deprecation process, to follow up on the discussion on
breaking changes without deprecation we had here on wikitech-l a couple of
months ago. To give a very brief summary, the idea is that hard deprecation
should happen as soon as possible after soft deprecation, and removal should
happen right after the release that contained the hard deprecation. It also
shifts responsibility for updating calling code to the person or group that
drives the deprecation.

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-10-14

2020-10-15 Thread Daniel Kinzler
Am 15.10.20 um 03:58 schrieb Dan Andreescu:
> Committee inbox:
>
>   * T263904 <https://phabricator.wikimedia.org/T263904>: Are traits part of
> the stable interface?
>
I'm working on an update for the Stable Interface Policy. Besides some minor
tweaks to the wording, the main changes are:

  * added a section on "stable for use" for traits.
  * overhauled the deprecation process, to follow up on the discussion on
breaking changes without deprecation we had here on wikitech-l a couple of
months ago. To give a very brief summary, the idea is that hard deprecation
should happen as soon as possible after soft deprecation, and removal should
happen right after the release that contained the hard deprecation. It also
shifts responsibility for updating calling code to the person or group that
drives the deprecation.

Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-10-06

2020-10-07 Thread Daniel Kinzler
Another thing for attention by TechCom:

We need a robust and consistent way to determine whether a response should be
(publicly) cacheable. We Seem to fail in both directions currently, in some edge
cases. On a related note, we need more control over which responses are allowed
to contain which cookies.

See discussion on https://phabricator.wikimedia.org/T264631
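To make the idea concrete, here is a hedged sketch of such a cacheability check (Python; the cookie names and response shape are invented for illustration and are not MediaWiki's actual logic):

```python
# Hypothetical rule: a response may hit a shared (public) cache only if it
# is a successful, safe request that sets no user-specific cookies.
PRIVATE_COOKIE_PREFIXES = ("session", "Token", "UserID")  # illustrative names


def is_publicly_cacheable(response):
    # Anything carrying user-specific cookies must never land in a
    # shared cache, or one user's session could leak to another.
    for name in response.get("set_cookies", []):
        if name.startswith(PRIVATE_COOKIE_PREFIXES):
            return False
    # Only safe, successful requests qualify.
    return response.get("method") == "GET" and response.get("status") == 200


anon_view = {"method": "GET", "status": 200, "set_cookies": []}
print(is_publicly_cacheable(anon_view))   # True

logged_in = {"method": "GET", "status": 200, "set_cookies": ["sessionId"]}
print(is_publicly_cacheable(logged_in))   # False
```

The point of centralizing a predicate like this is exactly the consistency problem described above: today the decision is scattered, so some edge cases fail in both directions.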


Am 07.10.20 um 10:30 schrieb Daniel Kinzler:
>
> There is something I came across that I'd like to briefly discuss
> during the meeting:
>
> It would be nice if the updater could change default settings. This way, we
> could make some behavior the default for new wikis, while keeping old behavior
> for existing wikis. I envision a DefaultOverrides.php file that the installer
> would append to. It would be in .gitignore, but I'm not sure where it should
> be located.
>
> Case in point: Wikis that share the user table should also share the actor
> table. But we haven't been doing that so far, and wikis that now already have
> a shared user table but per-wiki actor tables are rather tricky to migrate. So
> we could add the actor table to wgSharedTables per default, but tell the
> installer to override that for wikis that already have out-of-sync actor
> tables. See T243276#6519078. 
> <https://phabricator.wikimedia.org/T243276#6519078>
>
> Am 06.10.20 um 12:18 schrieb Giuseppe Lavagetto:
>>
>> This is the weekly TechCom board review in preparation of our meeting on
>> Wednesday. If there are additional topics for TechCom to review, please let
>> us know by replying to this email. However, please keep discussion about
>> individual RFCs to the Phabricator tickets.
>>
>> Activity since Monday 2020-09-28 on the following boards:
>>
>> https://phabricator.wikimedia.org/tag/techcom/
>>
>> https://phabricator.wikimedia.org/tag/techcom-rfc/
>>
>> Committee inbox:
>>
>>  *
>>
>> T264334 <https://phabricator.wikimedia.org/T264334>: Could the registered
>> module manifest be removed from the client?
>>
>>  o
>>
>> New task about the possibility of removing the huge module registry
>> from the js sent to the client. The idea is being discussed.
>>
>> Committee board activity: Nothing to report, besides inbox
>>
>> New RFCs: none.
>>
>> Phase progression:
>>
>>  *
>>
>> T262946 <https://phabricator.wikimedia.org/T262946>: Bump Firefox version
>> in basic support to 3.6 or newer
>>
>>  o
>>
>> Moves to P3 (explore)
>>
>>  o
>>
>> It is pointed out that we’ve dropped support in production for TLS
>> 1.0/1.1 in January, so de facto only Firefox 27+ is able to connect
>> to the wikimedia sites
>>
>>  o
>>
>> In light of that, it’s suggested that we might bump the minimum
>> supported versions of browsers further.
>>
>> IRC meeting request: none
>>
>> Other RFC activity:
>>
>>  *
>>
>> T260714 <https://phabricator.wikimedia.org/T260714>: Parsoid Extension 
>> API. 
>>
>>  o
>>
>> Last call to be approved, that will end on October 7 (tomorrow)
>>
>>  *
>>
>> T487 <https://phabricator.wikimedia.org/T487>: RfC: Associated 
>> namespaces.
>>
>>  o
>>
>> On last call to be declined, there is some opposition to the
>> opportunity of marking it as declined on phabricator. Last call
>> should end on October 7 (tomorrow)
>>
>>  *
>>
>> T263841 <https://phabricator.wikimedia.org/T263841>: RFC: Expand API
>> title generator to support other generated data.
>>
>>  o
>>
>> Erik asks if this is going to be generally applied to all generators
>> or not.
>>
>> Cheers,
>> Giuseppe
>> -- 
>> Giuseppe Lavagetto
>> Principal Site Reliability Engineer, Wikimedia Foundation
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> -- 
> Daniel Kinzler
> Principal Software Engineer, Core Platform
> Wikimedia Foundation

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] TechCom meeting 2020-10-06

2020-10-07 Thread Daniel Kinzler
There is something I came across that I'd like to briefly discuss during
the meeting:

It would be nice if the updater could change default settings. This way, we
could make some behavior the default for new wikis, while keeping old behavior
for existing wikis. I envision a DefaultOverrides.php file that the installer
would append to. It would be in .gitignore, but I'm not sure where it should be
located.

Case in point: Wikis that share the user table should also share the actor
table. But we haven't been doing that so far, and wikis that now already have a
shared user table but per-wiki actor tables are rather tricky to migrate. So we
could add the actor table to wgSharedTables per default, but tell the installer
to override that for wikis that already have out-of-sync actor tables. See
T243276#6519078. <https://phabricator.wikimedia.org/T243276#6519078>
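A minimal sketch of the layering this would imply (Python for illustration; DefaultOverrides.php does not exist, and the setting names shown are hypothetical, not actual MediaWiki configuration):

```python
# Settings would be applied in three layers: shipped defaults, then
# overrides frozen at install/upgrade time, then per-wiki configuration.
shipped_defaults = {"shared_tables": ["user", "actor"]}  # new-wiki behavior

# The updater would append to a DefaultOverrides.php-style file for
# existing wikis whose actor tables are already out of sync:
install_time_overrides = {"shared_tables": ["user"]}     # keep old behavior

local_settings = {}                                      # per-wiki config

# Later layers win over earlier ones.
config = {**shipped_defaults, **install_time_overrides, **local_settings}
print(config["shared_tables"])  # ['user'] -- existing wiki keeps old behavior
```

The key property is that new installs get the new default, while upgraded wikis keep the behavior their data already depends on.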

Am 06.10.20 um 12:18 schrieb Giuseppe Lavagetto:
>
> This is the weekly TechCom board review in preparation of our meeting on
> Wednesday. If there are additional topics for TechCom to review, please let us
> know by replying to this email. However, please keep discussion about
> individual RFCs to the Phabricator tickets.
>
> Activity since Monday 2020-09-28 on the following boards:
>
> https://phabricator.wikimedia.org/tag/techcom/
>
> https://phabricator.wikimedia.org/tag/techcom-rfc/
>
> Committee inbox:
>
>  *
>
> T264334 <https://phabricator.wikimedia.org/T264334>: Could the registered
> module manifest be removed from the client?
>
>  o
>
> New task about the possibility of removing the huge module registry
> from the js sent to the client. The idea is being discussed.
>
> Committee board activity: Nothing to report, besides inbox
>
> New RFCs: none.
>
> Phase progression:
>
>  *
>
> T262946 <https://phabricator.wikimedia.org/T262946>: Bump Firefox version
> in basic support to 3.6 or newer
>
>  o
>
> Moves to P3 (explore)
>
>  o
>
> It is pointed out that we’ve dropped support in production for TLS
> 1.0/1.1 in january, so de facto only Firefox 27+ is able to connect to
> the wikimedia sites
>
>  o
>
> In light of that, it’s suggested that we might bump the minimum
> supported versions of browsers further.
>
> IRC meeting request: none
>
> Other RFC activity:
>
>  *
>
> T260714 <https://phabricator.wikimedia.org/T260714>: Parsoid Extension 
> API. 
>
>  o
>
> Last call to be approved, that will end on October 7 (tomorrow)
>
>  *
>
> T487 <https://phabricator.wikimedia.org/T487>: RfC: Associated namespaces.
>
>  o
>
> On last call to be declined, there is some opposition to the
> opportunity of marking it as declined on phabricator. Last call should
> end on October 7 (tomorrow)
>
>  *
>
> T263841 <https://phabricator.wikimedia.org/T263841>: RFC: Expand API title
> generator to support other generated data.
>
>  o
>
> Erik asks if this is going to be generally applied to all generators
> or not.
>
> Cheers,
> Giuseppe
> -- 
> Giuseppe Lavagetto
> Principal Site Reliability Engineer, Wikimedia Foundation
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] TechCom meeting 2020-09-23

2020-09-23 Thread Daniel Kinzler
Thanks Tim! And thanks to Timo for going through the old RFCs. We haven't done
that in a while.

Besides the RFCs, I'd like to talk about the GitLab consultation in our meeting
tomorrow: https://www.mediawiki.org/wiki/GitLab_consultation

TechCom is listed under "consulted", so I suppose we should form an opinion.

On 23.09.20 at 06:37, Tim Starling wrote:
> Two tasks have been sitting in there for multiple weeks

Ah, I guess that's why they didn't show up under "activity". One is the Parsoid
Extension API RFC mentioned below. The other one is T239742
<https://phabricator.wikimedia.org/T239742>: "Should npm packages maintained by
Wikimedia be scoped or unscoped?"

>   * T157402 <https://phabricator.wikimedia.org/T157402> Provide a reliable way
> to pass information between hook handlers, "hooked" objects
>   o An RFC that was stalled since 2019, closed "declined"
>
Probably obsolete due to the new hook system.
>
>   * T487 <https://phabricator.wikimedia.org/T487> Associated namespaces
> <https://phabricator.wikimedia.org/T487>
>   o Timo asks if it can be merged with something.
>
I think we can close this, the intended use cases are covered by MCR.
>
>   * T114662 <https://phabricator.wikimedia.org/T114662>Per-language URLs for
> multilingual wiki pages
>   o Timo closed due to lack of owner
>
I'd like to  come back to this eventually. But I don't see anyone driving it
right now.
>
>   * T193690 <https://phabricator.wikimedia.org/T193690>How should we fix the
> undeletion system?
>   o Timo moved to P1 and stalled
>
This does need fixing, but is entangled with UX and producty questions. Nothing
is going to happen unless the community pushes on it. Which will probably only
happen once something breaks badly.
>
>   * T196950 <https://phabricator.wikimedia.org/T196950>Pages do not have
> stable identifiers
>   o Timo closed due to lack of owner
>
Page IDs are "mostly" stable these days, right?
>
>   * T260714 <https://phabricator.wikimedia.org/T260714> Parsoid Extension API
>   o Prior to last week's meeting, Subbu described the consultation work
> that has been done.
>   o Suggest last call
>
Yea, let's.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Allow HTML email

2020-09-23 Thread Daniel Kinzler
Thanks Tim.

This prompted me to finally switch to composing mails in HTML by default, like
it's the 21st century.

On 23.09.20 at 06:45, Tim Starling wrote:
> OK done, and it seems to be working.
>
-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] RFC discussion today: PHP microservice for containerized shell execution

2020-09-16 Thread Daniel Kinzler
Hi all!

This is a quick reminder that TechCom is hosting a meeting on IRC about the
following RFC:

"PHP microservice for containerized shell execution"
<https://phabricator.wikimedia.org/T260330>

You can join us at 21:00 UTC (23:00 CEST, 2pm PDT)
in the #wikimedia-office channel on freenode.

Problem
- For security, we need better isolation of external binaries from MediaWiki.
- If we run MediaWiki itself under Kubernetes, the resulting container should be
  as small as possible, so it should ideally exclude unnecessary binaries.
- It's difficult to deploy bleeding-edge versions of external binaries when they
  necessarily share an OS with MediaWiki.

Proposal
- Have a PHP microservice, accessible via HTTP, which takes POSTed inputs,
  writes them to the container's filesystem as temporary files, runs a shell
  command, and responds with gathered output files.
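A very rough sketch of that request flow in plain PHP is below. All names here
are made up for illustration; the actual service design is what is under
discussion on the task. Input validation and resource limits are elided, since
the isolation is meant to come from the container itself.

```php
<?php
// Hypothetical request handler for the shell-execution microservice.

// 1. Write the POSTed input files into a fresh scratch directory.
$workDir = sys_get_temp_dir() . '/shellexec-' . bin2hex( random_bytes( 8 ) );
mkdir( $workDir, 0700 );
foreach ( $_FILES as $name => $file ) {
    move_uploaded_file( $file['tmp_name'], $workDir . '/' . basename( $name ) );
}

// 2. Run the requested shell command inside that directory.
$command = (string)( $_POST['command'] ?? '' );
$stdout = shell_exec( 'cd ' . escapeshellarg( $workDir ) . ' && ' . $command );

// 3. Respond with stdout and the gathered output files.
$files = [];
foreach ( glob( $workDir . '/*' ) as $path ) {
    $files[ basename( $path ) ] = base64_encode( file_get_contents( $path ) );
}
header( 'Content-Type: application/json' );
echo json_encode( [ 'stdout' => $stdout, 'files' => $files ] );
```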

Tim has been working on this for a couple of weeks, and has been updating the
task in a steady monologue. Perhaps in the meeting today, we can get more eyes
on the nitty gritty of the proposal.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] TechCom meeting 2020-09-16

2020-09-16 Thread Daniel Kinzler
This is the weekly TechCom board review in preparation of our meeting on
Wednesday. If there are additional topics for TechCom to review, please let us
know by replying to this email. However, please keep discussion about individual
RFCs to the Phabricator tickets.

Activity since Monday 2020-09-07 on the following boards:

https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

IRC meeting request:
* Public discussion TODAY: "PHP microservice for containerized shell execution"
  Join us at 21:00 UTC (23:00 CEST, 2pm PDT) in the #wikimedia-office channel
  on freenode. <https://phabricator.wikimedia.org/T260330>.


Other RFC activity:
* "Parsoid Extension API": Subbu documented status of outreach with
  various stakeholders. <https://phabricator.wikimedia.org/T260714>

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] TechCom weekly digest 2020-09-02

2020-09-04 Thread Daniel Kinzler
Hi all,

Here are the minutes from this week's TechCom meeting:

=== New RFC: Challenges of developing with npm dependencies ===
* The “frontend build step” RFC is overly broad, and has drifted over time, see
  https://phabricator.wikimedia.org/T199004
* Dan will draft a new RFC focused on some of the issues we encounter in the
  Analytics team when developing Node.js services. Specifically around code
  review, CI/deploy, and dependency auditing.

=== Parsoid extension API ===
* https://phabricator.wikimedia.org/T260714
* TT: this is a major milestone RFC, the new parser will replace the old one.
  Hooks and methods not mentioned here will likely be deprecated/removed. I
  believe that’s well understood, and the Parsoid team did great outreach
  over the past year. This should be detailed and summarised on the RFC, or
  linked to.

=== Session expiry ===
* Pywikibot users have been experiencing CSRF token errors, see
  https://phabricator.wikimedia.org/T261050
* This may be due to the fact that we have become more strict about expiring web
  sessions after a given time since we moved session storage to the Kask
  service.
* It is currently unclear from the documentation how long a CSRF token is
  valid. This should be made clearer.

=== Removal of unused code without deprecation ===
* Daniel proposed a change to the stable interface policy on wikitech-l, in a
  thread titled “Making breaking changes without deprecation?”
* Reactions are mixed, more feedback welcome.
* The input from the mailing list will be used to update the current policy


You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>

If you prefer you can subscribe to our newsletter here
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Making breaking changes without deprecation?

2020-09-01 Thread Daniel Kinzler
Hi Arthur!

We were indeed thinking of different scenarios. I was thinking of someone who
runs a wiki with a couple of one-off private extensions running, and now wants
to update. They may well test that everything is still working with the new
version of MediaWiki, but I think they would be unlikely to test with
development settings enabled. The upgrade guide doesn't mention this, and even
if it did, I doubt many people would remember to enable it. So they won't notice
deprecations until the code is removed.

I understand your scenario to refer to an extension developer explicitly testing
whether their extension is working with the next release by trying it in their
development environment. They would see the deprecation warnings, and address
them. But in that scenario, would it be so much worse to see fatal errors
instead of deprecation warnings?

This is not meant to be a loaded question. I'm trying to understand what the
practical consequences would be. Fatal errors are of course less nice, but in a
testing environment, not a real problem, right? I suppose deprecation warnings
can provide better information than fatal errors would, but one can also find
this information in the release notes, once it is clear what to look for.

Also note that this would only affect private extensions. Public extensions
would receive support up front, and early removal of the obsolete code would be
blocked until all known extensions are fixed.

Thank you for your thoughts!
-- daniel


On 31.08.20 at 20:54, Arthur Smith wrote:
> Hmm, maybe we're talking past one another here? I'm assuming a developer of an
> extension who is interested in testing a new release - if we have a version 
> that
> has things deprecated vs completely removed, that allows a quick check to see 
> if
> the deprecated code affects them without going back into their own code (which
> may have been developed partly by somebody else so  just reading release notes
> wouldn't clue them in that there might be a problem).

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Making breaking changes without deprecation?

2020-08-31 Thread Daniel Kinzler
On 31.08.20 at 18:52, Arthur Smith wrote:
> So the alpha
> release would have to be tested in a separate environment, with 
> development
> warnings enabled, and someone actually looking at the log. Typically, 
> people
> only look at logs after things break.
> 
> 
> Is that true? I thought deprecation warnings appeared directly when viewing a
> page that used the deprecated code - that was my recent experience of this 
> with
> the WikiPage/Revision stuff that is deprecated in 1.35 - I was experimenting
> with an extension (in development mode) that hadn't fixed that issue, and the
> warnings appeared right there on every page.

Yes, in development mode ($wgDevelopmentWarnings = true), deprecation warnings
are visible.
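For reference, a minimal LocalSettings.php fragment for a development
environment that surfaces such warnings. This is a sketch: $wgDevelopmentWarnings
is the setting that matters here, the rest are common optional conveniences.

```php
// Development settings only; do not enable these on a production wiki.
$wgDevelopmentWarnings = true;    // surface deprecation (wfDeprecated) notices
$wgShowExceptionDetails = true;   // show full exception details and traces
error_reporting( E_ALL );
ini_set( 'display_errors', '1' );
```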

But very commonly, people don't actively work on this "hidden code" any more.
They wrote it once, it's working, and they will not look at it again until it
breaks. I'm not blaming them, that's what I do for "one off" code.

If they are actively developing features, then sure. But then they are likely to
read release notes, or test against master.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] Grant merge rights on core to Martin Urbanec

2020-08-31 Thread Daniel Kinzler
This is a request for granting merge privileges on the mediawiki group
(MediaWiki core and all extensions) per the gerrit privilege policy.
You can find the relevant ticket at <https://phabricator.wikimedia.org/T261656>.

Martin Urbanec is a long term contributor, both on the wikis and to the code
base. As a Steward, he has the trust of the community. His contributions in code
include improvements to extensions like OATHAuth and CentralAuth. He is also
helping with decoupling classes in core to improve code health, and provides
configuration patches on behalf of the community.

Martin has been particularly helpful in investigating and fixing a recent
security issue (T260485). Working with him on that, I was surprised to find out
that he doesn't have +2 rights. From what I have seen, it seems empowering him
to merge patches is long overdue.

Gerrit: https://gerrit.wikimedia.org/r/q/owner:martin.urbanec%2540wikimedia.cz
Wikipedia: https://cs.wikipedia.org/wiki/Wikipedista:Martin_Urbanec

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] TechCom Board Review 2020-08-31

2020-08-31 Thread Daniel Kinzler
Hi all!

This is the weekly TechCom board review  in preparation of our meeting on
Wednesday. If there are additional topics for TechCom to review, please let us
know by replying to this email. However, please keep discussion about individual
RFCs to the Phabricator tickets.

Activity since Monday 2020-08-26 on the following boards:
https://phabricator.wikimedia.org/tag/techcom/
https://phabricator.wikimedia.org/tag/techcom-rfc/

Committee inbox: none

Committee board activity:
* Added "Parsoid Extension API" (T260714), see below.

New RFCs: none

Phase progression: none

IRC meeting request: none

Other RFC activity:
* Drop support for database upgrade older than two LTS releases:
  <https://phabricator.wikimedia.org/T259771>
  - Leaderboard suggests to support 3 LTS releases
  - Bawolff says that upgrades from versions as old as 1.16 are common enough,
but that we should drop support for anything older than 1.6.
* Parsoid Extension API:
  <https://phabricator.wikimedia.org/T260714>
  - Subbu wrote in response to last week's digest email that they are looking for
feedback "not from users, but TechCom" and "if TechCom is happy with it, it
can go to Last Call."
  - DISCUSS: should we move this to last call?


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Making breaking changes without deprecation?

2020-08-31 Thread Daniel Kinzler
On 28.08.20 at 21:47, Physikerwelt wrote:
> I appreciate your argument, however, I think the deprecation policy
> will be used in good faith. Fast deprecations are really helpful for
> code that is not been used. If one expects that a feature is used in
> hidden code probably people will not depreciate it too fast,
> especially if there is a lot of visible code to refactor.
Hi Moritz!

I think you are touching on the core of the issue: We need to figure out how
much we care about "hidden usages" in code that is not shared back to the
community.

For a long time, the answer has been "very much", so we worked as if MediaWiki
was a framework developed for other people's use, providing a maximum of
backwards compatibility. However, this comes with a very real cost in terms of
development speed and code complexity.

The other extreme would be saying "not at all". Then we wouldn't need release
notes. Maybe we wouldn't even need releases. That would be rather harsh.

Perhaps it would be helpful to be more specific about what code we are talking
about. I think that for code we release as standalone libraries, we should
ensure compliance with the principles of semantic versioning, and avoid
inconvenience for 3rd party users.

However, for MediaWiki core, I have come around to thinking that we should not
allow ourselves to be held back too much by the needs of "hidden usages". We
really need to modernize the codebase, and that means breaking changes. Dragging
along a compatibility layer means we cannot benefit from the changes we have
made until we can drop that layer. So I'd rather that be months, not years, as
it has been in the past.

So, for core, I think we should only care about usage in non-public code "a
little". In exchange, we should better support updating 3rd-party code that is
public.


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation




Re: [Wikitech-l] Making breaking changes without deprecation?

2020-08-31 Thread Daniel Kinzler
On 28.08.20 at 17:51, Arthur Smith wrote:
> Would it be feasible to put the deprecation notices in an early release
> candidate, then encourage third party extension creators to try the release
> candidate with deprecation notices so they'll see where there are problems
> in their code, and what they have to do to be ready for the final release
> where deprecated features are removed?

What you are suggesting sounds like an interesting option to consider - please
let me know if I understand your idea correctly:

When code has become obsolete, and we have removed all known usages, we should
not remove the old code immediately, but we can tag it for removal *before* the
next release (rather than after, per the current policy). The obsolete
functionality would remain intact (but emitting warnings) in some kind of
alpha-release (even before the "release candidates").

Is that what you have in mind?

What I am wondering is - when people try the alpha release, how would they even
notice the deprecation warnings? These warnings are disabled by default,
because they would flood the log files on a production site. So the alpha
release would have to be tested in a separate environment, with development
warnings enabled, and someone actually looking at the log. Typically, people
only look at logs after things break.

But if the pre-release is tested in a development environment, what's the
advantage of a deprecation warning over a hard error? The only difference I see
is the reported log level and type of exception.  I'm not sure that's worth the
effort.

The same question also arises for the existing long deprecation period. My
impression is that the people who should benefit from the long deprecation
either notice right away and quickly fix their code (so they don't need the long
deprecation), or they don't notice until things break (so they don't need the
long deprecation either).

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Making breaking changes without deprecation?

2020-08-28 Thread Daniel Kinzler
Hi Greg, thanks for your reply!

On 28.08.20 at 18:26, Greg Rundlett (freephile) wrote:
> I like the idea of streamlining deprecation and avoiding the cost of
> maintaining obsolete code. I also **want** to publish my code on Gerrit.

Just a quick clarification: while the current policy only considers code to be
part of the "ecosystem" if it's on gerrit, what I proposed in my mail would mean
that the extension could be hosted anywhere, as long as it is public and has a
page on mediawiki.org.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] TechCom weekly digest 2020-08-26

2020-08-28 Thread Daniel Kinzler
Dear Wikitech-l,

As TechCom (Wikimedia Technical Committee) we triage and review our Phabricator
boards, #techcom and #techcom-rfc, for activity. We also have a weekly meeting
in which the board triage and other topics are discussed. A short summary of
this meeting is published to mediawiki.org and on Wikitech-l, as part of
TechCom Radar.

In an effort to encourage wider and less formal participation through Wikitech-l
(and to make our process more asynchronous) we'll also write to Wikitech-l as
part of the board triage going forward. You may already have seen the first
board grooming mail here a couple of days ago. We are also expanding the TechCom
Radar to (once again) include our meeting minutes. Please feel free to join the
conversation!


Here are the minutes from our meeting on Wednesday:

Present: Daniel K, Tim S, Giuseppe L, Timo T, Dan A, Roan K.

== New link service ==
*
https://www.mediawiki.org/wiki/Growth/Personalized_first_day/Structured_tasks#Prioritizing_%22add_a_link%22
* Giuseppe: Showing suggested links for editors using ML. How do we store ML
  models in production, update them?
* Dan: isn’t this the purview of Chris’s ML infrastructure team?
* Giuseppe: long-term, yes, but not directly their responsibility right now. The
  general architectural issue is: we want a sustainable way to generate ML
  models, and ship them to applications using them. This process should be
  streamlined - the current pattern of slapping the models in git-lfs is not
  sustainable long-term.
* Roan: explains about the project, a collaboration with Research to suggest
  links to add to articles.  Docs/write-ups are emerging (and more focused on
  UX)
* Daniel: Create an RFC?

== Upcoming datacenter switchover ==
* https://wikitech.wikimedia.org/wiki/Switch_Datacenter
* Next week, Tue 1 Sept, 14:00 UTC.

== RFC: New API for Parsoid extensions ==
* https://phabricator.wikimedia.org/T260714
* Daniel: Should it go on last call?
* Tim: no urgency to closing this. The review they’re asking for hasn’t been
  done yet.
* Timo: They’ve done a fair bit of research and outreach offline prior to the
  RFC. They’re looking for explicit approval from would-be users of their new
  API for parser extensions. Once that completes, they should ask for Last Call.

== Shell execute microservice ==
* https://phabricator.wikimedia.org/T260330
* Tim: New initiative being started by Platform Engineering.
  Will turn into an RFC.
* Tim: Affects every shell execution, can be used in small/medium wiki instances
  that aren’t hosted by WMF, seems cross-cutting, affects over 30 extensions.

You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>

If you prefer you can subscribe to our newsletter here
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



[Wikitech-l] Making breaking changes without deprecation?

2020-08-28 Thread Daniel Kinzler
that they will
get patches and support, but they may see their code broken if they do not
follow up.


Now, my proposal hinges on the idea that we somehow know all relevant code that
needs fixing. How can that work?

When TechCom introduced the idea of the "MediaWiki ecosystem" into the policy,
our reasoning was that we want to support primarily extension developers who
contribute their extensions back to the ecosystem, by making them available to
the public. We found it fair to say that if people develop extensions solely for
their own use, it is up to them to read the release notes. We do not need to go
out of our way to protect them from changes to the code base.

Effectively, with the proposed change to the policy, maintainers of public
extensions will get more support keeping their extensions compatible, while
maintainers of private extensions will receive less consideration.

It seems desirable and fair to me to allow for "fast track" removal of obsolete
code, but only if we create a clear process for making an extension "official".
How exactly would an extension developer make sure that we know their extension,
and consider it part of the ecosystem? In practice, "known code" is code
accessible via codesearch[5]. But how does one get an extension into the
codesearch index? There is currently no clear process for this.

Ideally, it would be sufficient to:
* create a page on mediawiki.org using the {{Extension}} infobox,
* setting the status to "stable" (and maybe "beta"),
* and linking to a public git repository.

It should be simple enough to create a script that feeds these repos into
codesearch. A quick look at Category:Extensions_by_status category tells me that
there are about a thousand such extensions.


So, my question to you is: do you support the change I am proposing to the
policy? If not, why not? And if you do, why do you think it's helpful?


-- daniel

PS: This proposal has not yet been vetted with TechCom, it's just my personal
take. It will become an RFC if needed. This is intended to start a conversation.


[1] https://www.mediawiki.org/wiki/Stable_interface_policy
[2] https://www.mediawiki.org/wiki/Topic:Vrwr9aloe6y1bi2v
[3] https://phabricator.wikimedia.org/T193613
[4] https://phabricator.wikimedia.org/T255803
[5] https://codesearch.wmcloud.org/search/

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation



Re: [Wikitech-l] Service wiring conventions in extensions

2020-07-15 Thread Daniel Kinzler
Hi Niklas!

> Do you see any downsides of using code like below instead?
> 
> 'Translate:TranslatablePageParser' => function (): TranslatablePageParser {
> $services = TranslateServices::getInstance();
> return new TranslatablePageParser(
> $services->getParsingPlaceholderFactory() );
> },

The only downside is that it uses global state. In a future in which we someday
have multiple instances of MediaWikiServices, one for the local wiki and one
for each "sibling" wiki we want to access, this wouldn't work.

But then, this future is probably still pretty far off, and this would be easy
to change. On the other hand, a typo in
$services->get( 'Translate:ParsingPlaceholderFactory' ) is spotted quickly, and
the lack of auto completion in the wiring file (and only in the wiring file)
isn't so terrible, is it?

A cleaner solution would be for TranslateServices to not implement
ContainerInterface directly, but extend ServiceContainer. That way, it would
have its own wiring, instead of contributing to the wiring in MediaWikiServices.
And the first parameter passed to instantiator callbacks would be the
TranslateServices, not MediaWikiServices. If you also want MediaWikiServices to
be passed in as well as the second parameter, you can supply it to the
constructor of ServiceContainer via the $extraInstantiationParams parameter.

By the way, you can manage your singleton of TranslateServices as a static
variable, but if you want to prepare for the bright and shiny future, you can
also make TranslateServices a service in MediaWikiServices :)
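Sketched in code, the cleaner variant might look roughly like this. Class and
service names follow the thread; treat the details, in particular the
constructor plumbing, as illustrative rather than a drop-in implementation.

```php
<?php
// Sketch: TranslateServices extends ServiceContainer and owns its wiring,
// instead of registering its services in MediaWikiServices.

use MediaWiki\MediaWikiServices;
use Wikimedia\Services\ServiceContainer;

class TranslateServices extends ServiceContainer {
    public function __construct( MediaWikiServices $coreServices ) {
        // Instantiators receive ( TranslateServices, MediaWikiServices ).
        parent::__construct( [ $coreServices ] );
        $this->applyWiring( [
            'ParsingPlaceholderFactory' => static function () {
                return new ParsingPlaceholderFactory();
            },
            'TranslatablePageParser' => static function ( TranslateServices $services ) {
                return new TranslatablePageParser(
                    $services->get( 'ParsingPlaceholderFactory' )
                );
            },
        ] );
    }
}
```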

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] Hard deprecation of the Revision class

2020-07-07 Thread Daniel Kinzler
It's great to see this finally happening!

A big THANK YOU to DannyS712, Petr, and everyone involved! The Revision class
had been "soft" deprecated ever since we introduced RevisionStore and
RevisionRecord in 1.31. Getting the necessary work scheduled for removing
hundreds of usages proved difficult. Now DannyS712 just dove in and tackled the
beast. Thank you again! It's great to see volunteers take on tasks like this.
Open Source rocks, and our community is awesome!

-- daniel

On 06.07.20 at 22:18, Petr Pchelko wrote:
> Hey all,
> 
> TLDR: your extension CI might break due to hard deprecation of Revision
> class.
> 
> Today a Gerrit change[1] has been merged, hard deprecating MediaWiki core
> Revision class.
> Loads of work has been done to prepare for this moment, mostly by a
> volunteer developer DannyS712 with code review
> support from almost every corner of the MediaWiki developer universe. You
> can judge the amount of work by the number
> of subtasks in the tracking ticket[2]
> 
> A lot of effort has been done to update all the WMF deployed extensions and
> some non-deployed ones to the new code,
> In accordance with stable interface policy deprecation section[3]. However
> due to the gravity of a change, we might have
> missed some corner cases. Missed usages will not cause problems in
> production since the only consequence of using hard
> deprecated code is a log message, but some CI tests might start failing.
> If you find your tests failing, please replace all the usages of the
> Revision class according to 1.35 release notes,
> and your tests should start passing again. In case you’re having troubles
> doing it, please reach out to core platform team
> and we will try to help.
> 
> This is a huge milestone in decoupling MediaWiki core! We understand that
> this might cause some inconveniences, apologize
> for them and are hoping we can help with resolving any issues.
> 
> Cheers.
> Petr Pchelko
> 
> 1. https://gerrit.wikimedia.org/r/c/mediawiki/core/+/608845
> 2. https://phabricator.wikimedia.org/T246284
> 3. https://www.mediawiki.org/wiki/Stable_interface_policy#Deprecation
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] TechCom Radar 2020-06-24

2020-06-26 Thread Daniel Kinzler
Hi all,

Here are the minutes from this week's TechCom meeting:

Last Call: Amendment to the Stable interface policy (T255803)
* Ticket: https://phabricator.wikimedia.org/T255803
* No big change in substance from the policy adopted in March
* Draft policy: 
https://www.mediawiki.org/wiki/User:Krinkle/Stable_interface_policy
* Description of:
https://www.mediawiki.org/wiki/User:Krinkle/Stable_interface_policy/Changes
* The new policy is on Last Call until July 8th. If you have any concerns about
the draft, please comment on the ticket. If no concerns remain unaddressed by
July 8, the draft will be adopted as the new policy.

Approved: Drop support for IE 8 (T248061)
* https://phabricator.wikimedia.org/T248061
* Support for IE 8 will be maintained for the upcoming 1.35 release
* Code needed for IE 8 support can be dropped from master (and Wikimedia sites)
after 1.35 has been branched in July.

No office hours scheduled for next week.

You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>

See also the TechCom RFC board
<https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.

If you prefer you can subscribe to our newsletter here
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] [Ops] Backport and Config changes window (name change)

2020-06-17 Thread Daniel Kinzler
"Beam" sounds great!

Alternatively, how about the more mundane "fast track" or "express deploy",
which fits with the image of the "train"?

Am 17.06.20 um 16:25 schrieb Aron Manning:
> On Wed, 17 Jun 2020 at 16:14, Jeremy Baron  wrote:
> 
>> or BCC or BACC window.
>>
> 
> BACC sounds good to me: it's easy to pronounce. BCC already has a meaning.
> 
> Other suggestion:
> This is the "fast travel" route in comparison to the regular weekly trains,
> therefore the short name could play with that idea.
> "Beam" would be quite expressive, IMO:
> 
> "Beam up the patches, Scotty!"
> Excuse my nerding... ;-)
> 
> Demian (aka. Aron)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Today: TechCom office hour about RFC process

2020-04-08 Thread Daniel Kinzler
Hi all!

TechCom is hosting an office hour to talk about the new and improved RFC process
[1] (and anything else you may want to discuss) today.

If you are interested, please join us at 21:00 UTC (2pm PDT, 23:00 CEST) on IRC
in the #wikimedia-office channel on freenode.

[1] https://www.mediawiki.org/wiki/Requests_for_comment/Process

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-05 Thread Daniel Kinzler
On 05.04.20 23:18, Petr Bena wrote:
> But still, I am curious what is the recommended approach for someone
> who wants to develop their application "properly" in a way that it's
> backward compatible? First query the MediaWiki version via API, and
> based on that decide how to call APIs? I can't think of any other way.

Yes, that's indeed the only way if the client is to be entirely generic. Though
deprecated API behavior tends to stay supported for quite a while, much longer
than deprecated PHP methods. So checking the MediaWiki version first would only
be needed if you wanted your code to work with a wide range of MediaWiki 
versions.

Also, most clients are not totally agnostic to the wiki they run against. E.g.
pywikibot has per-wiki configuration, which could easily include the wiki
version. So no query would be necessary. Similarly, JS code running on the site
has access to the version from JS variables.
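
For clients that do want to branch on the server version, here is a minimal
sketch in Python. The generator string is the one reported by
action=query&meta=siteinfo; the assumption that rvslots is available from
MediaWiki 1.32 onward is mine, not stated in this thread:

```python
def parse_mw_version(generator):
    # 'generator' is the string reported by action=query&meta=siteinfo,
    # e.g. "MediaWiki 1.34.2" or "MediaWiki 1.35.0-wmf.5"
    _, version = generator.split(" ", 1)
    return tuple(int(p) for p in version.split("-")[0].split(".")[:2])

def revision_query_params(mw_version):
    params = {"action": "query", "prop": "revisions",
              "rvprop": "content", "format": "json"}
    if mw_version >= (1, 32):
        # Ask for the slot-aware output format, avoiding the
        # deprecation warning about the legacy format.
        params["rvslots"] = "*"
    return params

print(revision_query_params(parse_mw_version("MediaWiki 1.34.2")))
```

A client with per-wiki configuration (like pywikibot) could store the parsed
version tuple once instead of querying it on every request.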

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] What is "revision slot" and why is it necessary to specify it now?

2020-04-05 Thread Daniel Kinzler
Thanks for bringing this up.

I have made some updates to the documentation on mediawiki.org to make clear
that MCR is no longer a work in progress. While support is not complete in all parts of
MediaWiki, the storage layer has been implemented and the database has been
migrated to the new system.
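
As a practical aside for API client authors: with rvslots=* the revision
content moves under a per-slot key. A sketch of handling both output shapes
(the sample dicts below are trimmed approximations of the API's JSON output,
not verbatim responses):

```python
def extract_content(revision, slot="main"):
    """Return the text of a revision, accepting either API output format."""
    if "slots" in revision:
        # New slot-aware format, returned when rvslots=* is specified:
        # content lives under a named slot, usually 'main'.
        return revision["slots"][slot]["content"]
    # Legacy format: content sits directly on the revision object.
    return revision["content"]

new_style = {"slots": {"main": {"contentmodel": "wikitext",
                                "content": "Hello"}}}
legacy = {"contentmodel": "wikitext", "content": "Hello"}
assert extract_content(new_style) == extract_content(legacy) == "Hello"
```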

On 05.04.20 22:17, Petr Bena wrote:
> Hello,
> 
> https://www.mediawiki.org/wiki/API:Revisions mentions "revision slots"
> multiple times, but fails to explain what it is?
> 
> I noticed that some queries that were running fine in the past now
> have warning: "Because "rvslots" was not specified, a legacy format
> has been used for the output. This format is deprecated, and in the
> future the new format will always be used."
> 
> Which is interesting, but not very descriptive. What are revision
> slots? Why are they needed? What is difference between "legacy format"
> and "new format"? If rvslots are so important, why they aren't part of
> some of the examples?
> 
> So far I googled this: https://www.mediawiki.org/wiki/Manual:Slot it
> seems like some new "work-in-progress" feature, that isn't very useful
> yet, as there is only 1 slot in existence.
> 
> Is it necessary to pay attention to this? Can I simply silence this
> warning by providing parameter rvslots=*? Or should I pay it some
> deeper attention?
> 
> Thanks
> 
> _______
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] Unable to run MobileFrontEnd Extension

2020-03-24 Thread Daniel Kinzler
Sounds like you are using an incompatible version of the extension. Make sure
you install a snapshot that matches the version of MediaWiki you run. Pick the
appropriate release on the download page:
<https://www.mediawiki.org/wiki/Special:ExtensionDistributor/MobileFrontend>

Most extensions don't stay backwards compatible with older versions of MediaWiki
(including the latest release). See
<https://www.mediawiki.org/wiki/Compatibility#mediawiki_extensions>

HTH

On 24.03.20 10:05, Egbe Eugene wrote:
> Hi All,
> After Installing the MobileFrontEnd extension, I am unable to run it due to
> some error.
> 
> *Fatal error: Uncaught Error: Call to undefined method
> MediaWiki\MediaWikiServices::getContentHandlerFactory() in
> /Library/WebServer/Documents/myweb/otherprojects/mw/core/extensions/MobileFrontend/includes/MobileFrontendEditorHooks.php*
> 
> I am not sure what is missing and how can I make this run.
> 
> Thanks
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] Last Call: Updated Stable Interface Policy / Deprecation Policy

2020-03-12 Thread Daniel Kinzler
Hi all!

Per yesterday's TechCom meeting, the last call ended without any concerns being
raised. If you work on extensions or MediaWiki core, please read through the
changes outlined below, and have a look at the policy page at
<https://www.mediawiki.org/wiki/Stable_interface_policy>.

Thanks!

On 28.02.20 11:38, Daniel Kinzler wrote:
> Hi all!
> 
> TL;DR: A new policy defining stable interfaces for use by extensions, and the
> deprecation process that must be followed when changing such stable 
> interfaces,
> is now in the Last Call period. If no concerns remain unaddressed by March 11,
> the new policy will be adopted as official policy. The policy will apply to
> MediaWiki 1.35 and later.
> 
> The new policy is designed to better protect extensions from breakage when
> things change in core, by being more restrictive about what extensions can
> safely do.
> 
> Draft document: https://www.mediawiki.org/wiki/Stable_interface_policy
> RFC ticket: https://phabricator.wikimedia.org/T193613
> 
> Please comment there. Read below for a summary of changes.
> 
> 
> Long version:
> 
> TechCom has been working on a Stable Interface Policy for MediaWiki's PHP code
> for a while[1]. Previously, the definition of the stable interface that can be
> used by extensions was part of the Deprecation Policy[2]. The new policy[3] is
> much more detailed and explicit about what extensions can expect to remain
> stable and follow the deprecation process, and which things can change without
> notice.
> 
> The introduction of the new policy is driven by problems we found while trying
> to refactor core code with the aim to reduce coupling. For instance, when
> following the Dependency Injection pattern, the constructor of service classes
> is technically public, but is not stable for use outside the module.
> 
> The solution is to be more explicit about different kinds of stability (e.g.
> whether a method is stable for callers, or can also be safely overwritten).
> 
> To allow us more freedom to restructure the source code, the new policy is 
> more
> restrictive with respect to what extensions can expect to be able to do safely
> (e.g. subclass core classes). This is balanced by improved guarantees in
> previously unspecified cases (e.g. stability of protected methods).
> 
> The new policy specifies different kinds of stability, establishes defaults 
> for
> different parts of the code, and defines new annotations to be used in the
> documentation. Once the policy has been adopted, these annotations are soon to
> be added to the code.
> 
> This will hopefully allow us to more quickly improve the structure and quality
> of core code, and reduce the risk of breaking extensions in the future.
> 
> 
> Summary of the new policy:
> 
> For extension authors:
> 
> * It's generally safe to call public methods, and to access public fields in
> classes defined by MediaWiki core, unless these methods are documented to be
> unsafe (e.g. annotated as @deprecated, @unstable, or @internal).
> 
> * It's generally unsafe to extend (subclass) classes or implement interfaces
> defined by MediaWiki core, unless that class or interface was marked as 
> @stable
> for subclassing or @stable for implementation, respectively. In particular, 
> the
> constructor signature may change without notice, and abstract methods may be
> added to interfaces.
> 
> * It's generally unsafe to directly instantiate (using new) classes defined by
> MediaWiki core, unless that class is marked as @newable.
> 
> * It's generally unsafe to rely on global variables from MediaWiki core. Use
> methods such as MediaWikiServices::getInstance() or
> MediaWikiServices::getMainConfig() instead.
> 
> 
> When changing existing code:
> 
> * Keep public methods and hook signatures backwards compatible for callers.
> Follow the deprecation process when removing them.
> 
> * Keep constructor signatures backwards compatible if the constructor was 
> marked
> @stable for calling.
> 
> * Ensure compatibility of method signatures for code that overrides them if 
> they
> are marked @stable for overriding.
> 
> * Do not add abstract methods to classes or interfaces marked as @stable for
> subclassing or @stable for implementation.
> 
> 
> When defining extension points:
> 
> * When defining hooks, keep the signature minimal, and expose narrow 
> interfaces,
> ideally only pure value objects.
> 
> * When defining an interface to be implemented by extensions, provide a base
> class, and mark it as @stable for subclassing.
> 
> * Discourage extensions from directly implementing interfaces by marking them 
> as
> @unstable for implementation. If direct implementation is to be allowed, mark
> the interface @stable for implementation.

[Wikitech-l] Last Call: Updated Stable Interface Policy / Deprecation Policy

2020-02-28 Thread Daniel Kinzler
Hi all!

TL;DR: A new policy defining stable interfaces for use by extensions, and the
deprecation process that must be followed when changing such stable interfaces,
is now in the Last Call period. If no concerns remain unaddressed by March 11,
the new policy will be adopted as official policy. The policy will apply to
MediaWiki 1.35 and later.

The new policy is designed to better protect extensions from breakage when
things change in core, by being more restrictive about what extensions can
safely do.

Draft document: https://www.mediawiki.org/wiki/Stable_interface_policy
RFC ticket: https://phabricator.wikimedia.org/T193613

Please comment there. Read below for a summary of changes.


Long version:

TechCom has been working on a Stable Interface Policy for MediaWiki's PHP code
for a while[1]. Previously, the definition of the stable interface that can be
used by extensions was part of the Deprecation Policy[2]. The new policy[3] is
much more detailed and explicit about what extensions can expect to remain
stable and follow the deprecation process, and which things can change without
notice.

The introduction of the new policy is driven by problems we found while trying
to refactor core code with the aim to reduce coupling. For instance, when
following the Dependency Injection pattern, the constructor of service classes
is technically public, but is not stable for use outside the module.

The solution is to be more explicit about different kinds of stability (e.g.
whether a method is stable for callers, or can also be safely overwritten).

To allow us more freedom to restructure the source code, the new policy is more
restrictive with respect to what extensions can expect to be able to do safely
(e.g. subclass core classes). This is balanced by improved guarantees in
previously unspecified cases (e.g. stability of protected methods).

The new policy specifies different kinds of stability, establishes defaults for
different parts of the code, and defines new annotations to be used in the
documentation. Once the policy has been adopted, these annotations are soon to
be added to the code.

This will hopefully allow us to more quickly improve the structure and quality
of core code, and reduce the risk of breaking extensions in the future.


Summary of the new policy:

For extension authors:

* It's generally safe to call public methods, and to access public fields in
classes defined by MediaWiki core, unless these methods are documented to be
unsafe (e.g. annotated as @deprecated, @unstable, or @internal).

* It's generally unsafe to extend (subclass) classes or implement interfaces
defined by MediaWiki core, unless that class or interface was marked as @stable
for subclassing or @stable for implementation, respectively. In particular, the
constructor signature may change without notice, and abstract methods may be
added to interfaces.

* It's generally unsafe to directly instantiate (using new) classes defined by
MediaWiki core, unless that class is marked as @newable.

* It's generally unsafe to rely on global variables from MediaWiki core. Use
methods such as MediaWikiServices::getInstance() or
MediaWikiServices::getMainConfig() instead.


When changing existing code:

* Keep public methods and hook signatures backwards compatible for callers.
Follow the deprecation process when removing them.

* Keep constructor signatures backwards compatible if the constructor was marked
@stable for calling.

* Ensure compatibility of method signatures for code that overrides them if they
are marked @stable for overriding.

* Do not add abstract methods to classes or interfaces marked as @stable for
subclassing or @stable for implementation.


When defining extension points:

* When defining hooks, keep the signature minimal, and expose narrow interfaces,
ideally only pure value objects.

* When defining an interface to be implemented by extensions, provide a base
class, and mark it as @stable for subclassing.

* Discourage extensions from directly implementing interfaces by marking them as
@unstable for implementation. If direct implementation is to be allowed, mark
the interface @stable for implementation.


Notable changes from the 1.34 policy:

* Public methods are per default considered stable only for calling, not for
overriding.

* Constructors are considered unstable per default.

* Classes and interfaces are considered unstable for subclassing and
implementation, unless documented otherwise.

* Code not used in a public repository that is part of the Wikimedia ecosystem
may be changed or removed without deprecation.

[1] https://phabricator.wikimedia.org/T193613
[2] https://www.mediawiki.org/wiki/Deprecation_policy
[3] https://www.mediawiki.org/wiki/Stable_interface_policy

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] realtime notifications disabled in Phabricator

2019-11-28 Thread Daniel Kinzler
On 28.11.19 02:50, Daniel Zahn wrote:
> Your voices have been heard. We'll have a meeting next week to talk about
> how to reactivate it.

Thank you!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] realtime notifications disabled in Phabricator

2019-11-26 Thread Daniel Kinzler
While I don't care much about the "this task was updated" popups, I do miss the
immediate updates of workboards. It enables two use cases previously missing:

* have a board open in a tab to keep track of progress, without the need to
reload it every time you look at it

* work on a board with multiple people remotely, with everyone having an up to
date view of the board, avoiding interference and confusion. This was REALLY
helpful.

Can we get this back, please?

On 26.11.19 02:10, Daniel Zahn wrote:
> Ah. Yes, this sounds like the relatively new feature from:
> 
> https://secure.phabricator.com/T13357
> https://secure.phabricator.com/T4900
> 
> It is affected then.
> 
> On Mon, Nov 25, 2019 at 6:30 PM David Barratt 
> wrote:
> 
>> I think they mean that if you have the board open and someone else moves a
>> task, it will update on your end in realtime.
>>
>> I imagine this *would* be affected by the change.
>>
>> On Mon, Nov 25, 2019 at 6:28 PM Daniel Zahn  wrote:
>>
>>> On Mon, Nov 25, 2019 at 11:21 AM Sebastian Berlin <
>>> sebastian.ber...@wikimedia.se> wrote:
>>>
>>>> Is that also the reason why tasks don't update in realtime on the
>>>> workboards anymore?
>>>>
>>>
>>> If you mean tasks moving from column to column in workboards, no that is
>>> unrelated and done by the
>>> Phabricator maintenance bot by Amir Sarabadani, afaik.
>>>
>>> --
>>> Daniel Zahn 
>>> Operations Engineer
>>> ___
>>> Wikitech-l mailing list
>>> Wikitech-l@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> 
> 
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] [MCR] Heads up: testwiki no longer writing to legacy fields in the revision table (rev_text_id, etc)

2019-10-30 Thread Daniel Kinzler
Hi all!

This is a quick heads up that we just switched testwiki (test.wikipedia.org) to
$wgMultiContentRevisionSchemaMigrationStage = SCHEMA_COMPAT_NEW [1]. We will do
the same for all WMF wikis soon[2], and make this the default for new installes
of MW 1.34[3].

In concrete terms, this means that the following fields will no longer be
populated: rev_text_id, rev_content_model, rev_content_format, as well as
ar_text_id, ar_content_model, ar_content_format. All code should now use the
slots table to join against the content table to find the relevant information.
This should already be the case for all code in core as well as all extensions
maintained, deployed or bundled by wmf. The unused fields will be removed
eventually.
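
For illustration, the new lookup path can be sketched against a toy schema.
Column names are simplified from the actual slots/content/slot_roles tables,
and the data is invented, so treat this as a shape of the query rather than a
drop-in replacement:

```python
import sqlite3

# Toy versions of the MCR tables, populated with one revision (id 42)
# whose 'main' slot points at one content row.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE slot_roles (role_id INTEGER, role_name TEXT);
    CREATE TABLE content (content_id INTEGER, content_model INTEGER,
                          content_address TEXT);
    CREATE TABLE slots (slot_revision_id INTEGER, slot_role_id INTEGER,
                        slot_content_id INTEGER);
    INSERT INTO slot_roles VALUES (1, 'main');
    INSERT INTO content VALUES (10, 1, 'tt:123');
    INSERT INTO slots VALUES (42, 1, 10);
""")

# Instead of reading rev_text_id and friends off the revision row,
# resolve the revision's main-slot content via slots -> content.
row = db.execute("""
    SELECT c.content_address, c.content_model
    FROM slots s
    JOIN content c ON c.content_id = s.slot_content_id
    JOIN slot_roles r ON r.role_id = s.slot_role_id
    WHERE s.slot_revision_id = ? AND r.role_name = 'main'
""", (42,)).fetchone()
print(row)  # -> ('tt:123', 1)
```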

If you notice any oddities on testwiki that might be related to this change,
please file a ticket on Phabricator and tag #Core_Platform_Team.

Cheers,
Daniel

[1] https://phabricator.wikimedia.org/T198558
[2] https://phabricator.wikimedia.org/T198312
[3] https://phabricator.wikimedia.org/T231673

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] Quibble 0.0.36 released

2019-10-08 Thread Daniel Kinzler
woot!

Or, in other words: supercalifragilisticexpialidocious!

On 08.10.19 23:00, Antoine Musso wrote:
> Hello,
> 
> I have cut a new version of Quibble — version 0.0.36. This will roll out
> to all CI jobs this week.
> 
> Two new features are now present:
> 
> * Quibble sets up a wiki against which to run tests. It can now override
> configuration set by the MediaWiki installer, which we have done for two
> items:
> ** This wiki now allows file uploading ($wgEnableUploads = true), which
> unblocks people writing integration tests for uploading. – T190829 and
> T199939
> ** This wiki now has $wgSecretKey set (to the value
> 'supercalifragilisticexpialidocious'), which unblocks people writing
> integration tests for the job queue. – T230340
> * After each test stage, Quibble will report how long it took.
> 
> The sole bug fix is that errors were previously ignored when processing
> git clones in parallel. Any errors are now properly raised. T233143
> 
> We also now generate and publish a changelog! This means that you can
> discover more minor new features and miscellaneous changes via:
> 
> 
>   https://doc.wikimedia.org/quibble/changelog.html
> 
> 
> I will progressively upgrade the CI jobs over the next few days. The new
> Docker images are:
> 
> docker-registry.discovery.wmnet/releng/quibble-coverage:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-fresnel:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch-bundle:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch-php70:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch-php71:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch-php72:0.0.36
> docker-registry.discovery.wmnet/releng/quibble-stretch-php73:0.0.36
> 
> Thank you to Adam Wight, Daniel Kinzler, James Forrester, and Kosta Harlan.
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] TechCom Radar 2019-09-18

2019-09-26 Thread Daniel Kinzler
Hi Máté,

Sorry for the late reply!

On 20.09.19 14:13, Máté Szabó wrote:
> Hi Daniel,
> 
> Thank you for the update. I was wondering, how does the new Front-end 
> Architecture Working Group relate to the existing Front-end Standards 
> Group?[1] Does it replace the previous group or do they operate independently?

The Frontend Architecture Working Group is independent of the Front-end
Standards Group, but has substantial overlap: Roan, Timo, Santhosh, and Moriel
are in both groups.

While the long term goals of both groups overlap, the Front-end Standards Group
has focused on standardizing libraries and best practices for the client side,
across teams. It's a permanent institution. The Front-end Architecture Working
Group is intended to exist for a limited time, and drive specific changes to the
architecture that will empower us to modernize the front-end.

I hope this clarifies the relationship of the two groups.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] Up next: RFC discussion about versioning of the new core REST API.

2019-09-25 Thread Daniel Kinzler
This was already announced in the last TechCom Radar email, but in case you have
missed it:

In an hour, we will be talking about the versioning mechanism for the new REST
API in MediaWiki core: https://phabricator.wikimedia.org/T232485

You can join the discussion on the #wikimedia-office channel.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] mediawiki.api.edit

2019-09-25 Thread Daniel Kinzler
> I'd assume https://www.mediawiki.org/wiki/Deprecation_policy#Removal
> applies: "Code MUST emit hard deprecation notices for at least one major 
> MediaWiki version before being removed."

Formally, this policy only applies to PHP code. The Scope section says:
"This proposed policy applies to the PHP API of the MediaWiki core
(mediawiki/core.git) codebase. It explicitly does not apply to the api.php API,
client-side JavaScript, HTML output, or database schemas."

I don't know if we even have a policy for the removal of JS modules. It would
certainly be a good idea to have one, so we don't randomly break custom JS 
code...


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] TechCom Radar 2019-09-18

2019-09-20 Thread Daniel Kinzler
Hi All,

Here are the minutes from this week's TechCom meeting:


* RFC Last Call until 2019-10-02: Drop IE6 and IE7 basic compatibility and
security support. This means we can
remove code that works around issues with IE6 and IE7. If no concerns remain
unaddressed by October 2nd, the RFC will be approved as proposed.

* Discussed: Kickoff of the Front-end Architecture Working Group (FAWG). The
purpose of the Front-end Architecture Working Group is to propose an architecture
for a more modern front-end to MediaWiki, enabling a richer user experience.
Efforts to modernize the user interface have often struggled with limitations
imposed by MediaWiki core and the overall system architecture. FAWG is
an effort by the Product and Technology departments to develop a mid-term plan
to overcome these limitations.

* Discussed: Section headings with clickable anchor. This is about a "link to this
section" feature pegged for deployment on WMF wikis. The RFC is to make sure
that there are no performance issues with this change.

* IRC discussion next week: Core REST API namespace and version. This is about versioning strategies
for public APIs. As always, the meeting will take place on Freenode in the
#wikimedia-office channel, at 21:00 UTC (2pm PDT, 23:00 CEST).


You can also find our meeting minutes at
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Minutes>

See also the TechCom RFC board
<https://phabricator.wikimedia.org/tag/mediawiki-rfcs/>.

If you prefer you can subscribe to our newsletter here
<https://www.mediawiki.org/wiki/Newsletter:TechCom_Radar>

Cheers,
Daniel


Re: [Wikitech-l] Startup module size and ResourceLoader registry overhead

2019-09-19 Thread Daniel Kinzler
My friend @wetterfrosch points out on twitter that reducing our daily bandwidth
usage by 4.3TB reduces CO2 emissions by 30 tons PER DAY. That's roughly
equivalent to flying 20 people from Berlin to San Francisco and back. Every day.

Thank you all!

Sources:
https://twitter.com/wetterfrosch/status/1174675018795638785
https://alistapart.com/article/sustainable-web-design/
https://en.wikipedia.org/wiki/Environmental_impact_of_transport
https://www.carbonfootprint.com/calculator.aspx
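
The figures quoted below are internally consistent, which is a nice sanity
check. Rough arithmetic only, using the ~19 bytes/module, 30 GB/day, 137
modules, and 4 TB/day numbers from Amir's mail:

```python
# ~19 bytes of compressed registry overhead per module, ~30 GB/day of extra
# traffic per module -> implied number of startup-module requests per day:
bytes_per_module = 19
extra_traffic_per_module = 30e9  # 30 GB/day, in bytes
requests_per_day = extra_traffic_per_module / bytes_per_module
print(f"{requests_per_day:.2e} requests/day")  # ~1.6e9

# Cross-check against the WikibaseClient cleanup: dropping 137 modules saved
# ~4 TB/day, i.e. this many bytes per module per request:
implied_bytes = 4e12 / (137 * requests_per_day)
print(f"{implied_bytes:.1f} bytes/module")  # ~18.5, close to the 19 above
```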

On 16.09.19 23:10, Amir Sarabadani wrote:
> Hello,
> Startup module, is the Javascript code [1] that is being served at almost
> every request to Wikimedia as part of  (non-blocking though) to load
> other RL modules. It does important things for example it checks if the
> requested modules already exist with the given hash in the local storage
> cache.
> 
> Because of the given reasons, the startup module needs and includes list of
> all registered modules with their version hash in the code. As the result
> if you register a module, even if you don't load it, it adds a rather small
> overhead (around 19 bytes after minification and compression) to every
> request to any wiki the code is enabled. So if you register a new module in
> core or one of the extensions that is deployed in all wikis (CX,
> WikibaseClient, etc.), it's going to be added to every page request,
> meaning 30 GBs more traffic every day to our users. Even if you don't use
> it anywhere.
> 
> If you're adding a feature, 30GB/day is acceptable but sometimes developers
> use Resource loader for dependency management or class loading (yours truly
> used to do that) and introduce 20 modules instead of one and each one
> causing an extra 600 GB/day. The big overhead for our users is bad for
> three reasons: 1- RL has to handle the dependency management, making every
> page view slightly slower and worse UX for our users 2- The extra network
> is bad with places without access to broadband connection, where Wikipedia
> is the most likely place that people learn and grow [2] 3- The scale we are
> talking is so massive (petabytes a year) that It has environmental impact.
> 
> Let me give you an example, a couple of weeks ago we dropped 137 modules
> from WikibaseClient. After deployment, it dropped 4TB/day from our network
> (= 1.5 PB/year) and synthetic data shows, in an average case we drop 40 ms
> from every response time [3].
> 
> We now have a dashboard to track size of size of RL registry [4] plus a
> weekly metrics for changes [5][6]
> 
> If you're looking for ways to help. I wrote a tool [7] to do some graph
> analysis and it gives list of extensions that has modules that can be
> merged. The extensions that according to the analysis (that can have false
> positives) can get better are TimedMediaHandler, PageTriage, Graph,
> RevisionSlider, CodeMirror, Citoid, TemplateData, TwoColConflict,
> Collection, CentralNotice, AdvancedSearch, 3D, MobileFrontend and many more
> including some bits and pieces in core. I put the graphs of modules that
> can be merged at [8] and I invite you to have fun with those modules.
> Modules can be merged using package modules [9]
> 
> Most of the is work done by the performance team [10] and volunteers and
> developers in lots of teams. I joined the party later as volunteer/WMDE
> staff and I'm sharing mostly the results and writing long emails. Big kudos
> to Krinkle, James Forrester, Santhosh Thottingal, Jon Robson, Alaa Sarhaan,
> Alexandros Kosiaris, and so many others that helped and I forgot.
> 
> [1] An example from English Wikipedia:
> https://en.wikipedia.org/w/load.php?lang=en=startup=scripts=1=vector=true
> [2] https://arxiv.org/abs/1812.00474
> [3] https://phabricator.wikimedia.org/T203696#5387672
> [4] https://grafana.wikimedia.org/d/BvWJlaDWk/startup-module-size?orgId=1
> [5] https://gist.github.com/Krinkle/f76229f512fead79fb4868824b5bee07
> [6]
> https://docs.google.com/document/d/1SESOADAH9phJTeLo4lqipAjYUMaLpGsQTAUqdgyZb4U
> [7] https://phabricator.wikimedia.org/T232728
> [8] https://phabricator.wikimedia.org/T233048
> [9] https://www.mediawiki.org/wiki/ResourceLoader/Package_modules
> [10] https://phabricator.wikimedia.org/T202154
> 
> Best
> 

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


Re: [Wikitech-l] content_models table only contains wikitext content model on fresh MW 1.32.3 install

2019-09-19 Thread Daniel Kinzler
Hi Tom!

The snippet looks fine at a glance, though I wonder why you are not just using
maintenance/edit.php.

On 19.09.19 14:17, Tom Schulze wrote:
> I import pages
> using a custom maintenance script which reads a files' content from the file
> system and saves it to the mediawiki db using:
> 
> $title = Title::newFromText('Widget:MyWidget');
> $wikiPage = new WikiPage( $title );
> $newContent = ContentHandler::makeContent( $contentFromFile, $title );
> $wikiPage->doEditContent( $newContent );
> 
> In the MW Class reference
> <https://doc.wikimedia.org/mediawiki-core/master/php/classContentHandler.html#a2f403e52fb305523b0812f37de41622d>
> it says  "If [the modelId parameter for ContentHandler::makeContent() is] not
> provided, $title->getContentModel() is used." I assume, that it checks the
> namespace among others and uses javascript for Widgets? Because in my case 
> it's
> a widget that causes the error. The extension is installed prior to the
> importation and the namespace 'Widget' exists.

So what should happen is that Title::getContentModel() decides that the default
model for the Widget namespace should be javascript (based on an entry in
$wgNamespaceContentModels made by the extension), and return the string
"javascript".

When recording the model of the content in the content table, that string gets
normalized by creating an entry in the content_models table, if no such entry
exists yet for "javascript", generating a unique integer ID (in your case, this
appears to be 2). This integer gets recorded in content.content_model_id.

When reading the page's content later, the model name associated with 2 is
looked up in the content_models table (actually, in a cached version of that
table), returning "javascript". This however fails in your case.

The question is: since the number 2 was generated by an auto-increment key when
inserting into content_models, why is the row now missing from the table? How
can that be?

> Is there something wrong with the snippet?

Not in an obvious way.

The only explanation I have is that the edit actually fails for some reason, and
the database transaction gets rolled back. This would result in a situation
where the row for "javascript" is not in content_models, but it's still in the
cached version of that table (in APC memory or memcached or whatever you have
your object cache set to).

So perhaps you retry after the initial failure. Since the cached table has an
entry for "javascript", MediaWiki will just use that, and not write to the table
again. Your edit succeeds - but now you have the number 2 in
content.content_model_id, but no row for 2 in the content_models table. You can
still read the page as long as you have the cached version of the content_models
table in memory - but as soon as the cache expires, things blow up.
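That failure mode can be sketched in a few lines. The following is a toy model in Python (used here only so the sketch is runnable; the class and method names are invented, not MediaWiki's): a name-to-id table with a write-through cache, where a rolled-back insert strands an id in the cache.

```python
# Toy model of a name <-> id lookup table with a write-through cache,
# illustrating how a rolled-back transaction can strand an id in the cache.

class ModelNameTable:
    def __init__(self):
        self.table = {}   # simulated content_models rows: name -> id
        self.cache = {}   # simulated in-process cache of those rows
        self.next_id = 1

    def acquire_id(self, name):
        if name in self.cache:
            return self.cache[name]            # cache hit: no DB write happens
        if name in self.table:
            self.cache[name] = self.table[name]
            return self.table[name]
        new_id = self.next_id                  # INSERT inside the edit's transaction
        self.next_id += 1
        self.table[name] = new_id
        self.cache[name] = new_id              # cache is populated right away
        return new_id

    def rollback(self, name):
        # The edit failed: the INSERT is rolled back, but the cache entry
        # written alongside it survives.
        self.table.pop(name, None)

    def get_name(self, model_id):
        for name, i in self.table.items():     # cache expired: table only
            if i == model_id:
                return name
        raise KeyError("Failed to access name using id = %d" % model_id)

store = ModelNameTable()
store.acquire_id("wikitext")              # id 1
js_id = store.acquire_id("javascript")    # id 2; the edit then fails...
store.rollback("javascript")              # ...and the transaction is rolled back
store.acquire_id("javascript")            # retry: served from cache, no re-insert
store.cache.clear()                       # cache expires; id 2 now dangles
```

Once the cache is gone, resolving id 2 raises the same kind of "failed to access name" error quoted in the original report.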

As I said in my earlier response, I'm working on a patch to avoid this
situation, see <https://gerrit.wikimedia.org/r/c/mediawiki/core/+/514245>.

However, I'm still not 100% sure that what I described above is what actually
happened. Did you have some kind of failure when you first tried to import the
widget (or any javascript, such as MediaWiki:common.js?)

If you didn't, I'm back to having no clue as to what might be causing this
problem. Which of course would not be good at all :)

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] content_models table only contains wikitext content model on fresh MW 1.32.3 install

2019-09-18 Thread Daniel Kinzler
Am 18.09.19 um 16:27 schrieb Tom Schulze:
> Am I missing something that the core content models are not in the
> initial database table 'content_models'? Is this table not filled during
> the installation procedure?

No. The table is filled on demand, when the first revision using that model is
written to the database.

> Is it actually bad advice to insert the rows manually?

Generally, yes. But adding things to that table should not do any harm.
Modifying things would, however.

> I then install extensions, run update.php, and import a couple pages,
> templates, etc. When setting up Cargo's _pageData table using the
> setCargoPageData.php script at a later stage I get the following error
> (see backtrace at the very bottom).
>
> Failed to access name from content_models using id = 2

This implies that *something* got the ID 2 for that content model already from
somewhere. That ID can really only come from an insert to the content_models
table - if it's not in the table, that's an indication that it was somehow lost.

There have been reports of similar problems with the slots table. Please add
your experience to the ticket here:

https://phabricator.wikimedia.org/T224949

There is a patch up that should safeguard against my best guess at the cause of
this. If you can provide additional insights as to exactly how this may happen,
please do!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Declaring methods final in classes

2019-08-29 Thread Daniel Kinzler
Am 29.08.19 um 15:31 schrieb Adam Wight:
> This is a solution I'd love to see explored.  Subclassing is now considered
> harmful, and it would be possible to refactor existing cases to use
> interfaces, traits, composition, etc.

I wouldn't say subclassing is considered harmful in all cases. In fact, for
extension points, subclassing is preferred over directly implementing
interfaces. But subclassing should be used very carefully. It's the most tight
form of coupling. It should be avoided if an alternative is readily available.

But subclassing across module boundaries should be restricted to classes
explicitly documented to act as extension points. If we could enforce this
automatically, that would be excellent.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Declaring methods final in classes

2019-08-29 Thread Daniel Kinzler
> As such, to perhaps help with the conversation, I'd like to have a
> practical example we can look at and compare potential solutions. Perhaps
> from WANObjectCache, or perhaps with something else.

Simetrical can speak on the concrete use case, but I'd like to give my thoughts
on mocking WANObjectCache in general: when writing a unit test, the class under
test should ideally be tested in isolation - that is, any of its dependencies
should be mocked. If the dependency is inconsequential or there is a trivial
implementation available, we don't have to be too strict about this. But complex
dependencies should be avoided in unit tests, otherwise it's not a unit test but
an integration test.

WANObjectCache is 2600 lines of complex code. When testing something that has a
WANObjectCache injected, we should inject a fake WANObjectCache, to preserve the
level of isolation that makes the test a unit test. As long as WANObjectCache's
most important methods, like get(), are final, this is impossible.

I agree by the way that overly specific mocks go against the idea of a unit
test. The test should test the contract, not the implementation. The mock should
provide the minimum functionality needed by the code, it shouldn't assert things
like "get() is called only once" - unless that is indeed part of the contract.

Narrow interfaces help with that. If we had for instance a cache interface that
defined just the get() and set() methods, and that's all the code needs, then we
can just provide a mock for that interface, and we wouldn't have to worry about
WANObjectCache or its final methods at all.
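A sketch of that idea in Python rather than PHP, just to keep it self-contained (all class names here are illustrative, not MediaWiki code): the code under test depends only on a narrow get/set contract, so a generic mock suffices, and no real cache implementation is involved.

```python
from unittest.mock import MagicMock

class PageCounter:
    """Code under test: depends only on a narrow get()/set() cache contract."""
    def __init__(self, cache):
        self.cache = cache

    def hits(self, page):
        count = (self.cache.get(page) or 0) + 1
        self.cache.set(page, count)
        return count

# The mock provides the minimum behaviour the contract requires...
cache = MagicMock()
cache.get.return_value = 41
counter = PageCounter(cache)
result = counter.hits("Main_Page")

# ...and we assert on the contract, not the implementation.
assert result == 42
cache.set.assert_called_with("Main_Page", 42)
```

Because the dependency is an interface with two methods, the test never has to subclass or otherwise defeat a large concrete class, final methods or not.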

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Declaring methods final in classes

2019-08-28 Thread Daniel Kinzler
>> What benefits does it have to bind to a specific implementation that is
> not
>> guaranteed to stay as it is?
> 
> If used properly, the final keyword should be for immutable
> implementations. Otherwise, it could be a questionable use case.

I think all the current uses of "final" in MW core are questionable, then...

>> If somebody is volunteering to do the necessary work in the CI
> infrastructure, fine.
>>
>> To me it just seems like just removing the final modifier is the easier
> and
>> cheaper solution, and doesn't have any big downside.
> 
> It depends on how hard is it to set up the library. Ideally, we only have
> to find a place where to enable it, nothing more. And I also think it won't
> harm existing tests in any way. It doesn't even require a CI change. The
> only possible drawback is performance / opcache integration pointed out
> by @simetrical. Let me try to upload a test change and see how it goes,
> tonight or tomorrow.

Cool, thank you for looking into this!

Sorry if I sounded very negative. I'm trying to be pragmatic and avoid extra
effort where it is not needed. I appreciate your help with evaluating this!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Declaring methods final in classes

2019-08-28 Thread Daniel Kinzler
Am 28.08.19 um 16:48 schrieb Daimona:
>> Subclassing should be very limited anyway, and even more limited across
> module
>> boundaries
> 
> I agree, but it doesn't offer a strict guarantee.

Do we need a strict guarantee more than we need unit tests?

>> which could even be enforced via static analysis.
> 
> Why not just use final, then?

Because it makes it impossible to write unit tests.

Maybe not impossible with the tool you pointed to. If that thing works, it
becomes: it requires effort to set up the CI infrastructure to allow this to
work, and we don't know who is going to do that, or when.

>> Method contracts should be enforced by compliance tests. When following
> these principles, making
>> methods and classes final has little benefit.
> 
> Ideally, yes. But I don't think our codebase has enough tests for that.

That's what we are trying to fix, and final stuff is making it hard.

>> Preventing mocking is however a pretty massive cost.
> 
> Definitely yes. But making methods mockable while also keeping the
> capability to use finals is even better IMHO.

If that can be made to work, sure. I'm just saying that an inability to mock far
outweights the potential benefits of declaring things as final.

> In theory, for sure. But I believe there are lots of occurrences in our
> code where static methods are not pure functions.

Which indeed is one of the things we are currently trying to fix, because static
code can't be mocked.

>> with these contracts. Final methods make callers rely on a specific
>> implementation, which may still end up changing anyway.
> 
> Two sides of a coin, I think. Each of them has its benefits and its
> drawbacks, I'd say.

What benefits does it have to bind to a specific implementation that is not
guaranteed to stay as it is?

>> If I understand correctly, this would break as soon as the mock object
>> hits a type hint or instanceof check. That won't fly.
> 
> No, that's only what happens with mockery. The tool I found just strips
> 'final' keywords from the PHP code - I believe, I still haven't looked at
> the implementation.

If somebody is volunteering to do the necessary work in the CI infrastructure, 
fine.

To me it just seems like just removing the final modifier is the easier and
cheaper solution, and doesn't have any big downside.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Declaring methods final in classes

2019-08-28 Thread Daniel Kinzler
I see no good use for final methods or classes. Or rather: I see a very limited
benefit and a pretty massive cost.

Subclassing should be very limited anyway, and even more limited across module
boundaries, which could even be enforced via static analysis. Method contracts
should be enforced by compliance tests. When following these principles, making
methods and classes final has little benefit. Preventing mocking is however a
pretty massive cost.

I'd just remove the final markers. But maybe I'm overlooking something?...

Am 28.08.19 um 10:38 schrieb Daimona:
>> I don't like these limitations either, but testing is an integral part
>> of development, and we need to code in a way that facilitates testing.
> 
> This is especially true for e.g. static methods, but here we'd be
> renouncing to a possibly useful feature.

Static code should only be used for pure functions. That's a very limited use 
case.

>> Why do methods ever "have" to be final?
> 
> If you want to make sure that any subclass won't ever change the
> implementation of a method, and thus all callers know what to expect from
> calling a final method.
> I see finals as a sort of safeguard to help write better code, like e.g.
> typehints.

This should be done by documenting contracts, and having tests ensure compliance
with these contracts. Final methods make callers rely on a specific
implementation, which may still end up changing anyway.

> 
>> That would be a nice solution if it works well. If someone wants to
>> volunteer to try to get it working, then we won't need to have this
>> discussion. But until someone does, the question remains.
> 
> IMHO this would be a perfect compromise. I've filed T231419 for that, and I
> also think that before discussing any further, we should try to see if we
> can install that tool.

If I understand correctly, this would break as soon as the mock object hits a
type hint or instanceof check. That won't fly.


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] CR+2 on MediaWiki for Aryeh Gregor (aka Simetrical)

2019-08-22 Thread Daniel Kinzler

I propose for Aryeh Gregor to be granted the right to approve changes against
MediaWiki core and extensions, by adding him to the mediawiki LDAP group.

Please voice support or concerns on
<https://phabricator.wikimedia.org/T230979>

Aryeh has been contributing to MediaWiki on and off for a decade or so, both as
a volunteer and a contractor. He joined gerrit only a year ago, but he had
commit access already back in the day when MediaWiki was still managed on SVN.
In the time I have been working with Aryeh, he has shown excellent skill and
good judgement. Granting him merge rights would allow him to contribute not only
by writing code, but also by approving changes made by others.

Aryeh commented on the ticket to note the following:

> My older contributions are available via git log --author simetrical. I
> couldn't find a webpage that would display the results of the search that I
> could link to. Note that that includes things from the SVN era that I only
> committed and didn't author, so it's inflated (probably only slightly). But if
> we leave that aside, according to git shortlog -sne, I have 1052 commits 
> (after
> updating .mailmap to merge the three different e-mail addresses I've used).
> Surprisingly, this still makes me the #19 all-time contributor by number of
> commits. I think I always tended to break up my commits more than a lot of
> people, though.


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] IRC session on improving TechCom's RFC process tonight

2019-08-21 Thread Daniel Kinzler
Hi all!

Tonight, TechCom will host a discussion about improving the RFC process on
#wikimedia-office, at 21:00 UTC (23:00 CEST, 2pm PDT).

This session is not about approving a specific proposal, but rather an
opportunity to explore options and gather ideas. We have been collecting
thoughts on this for a while, see <https://phabricator.wikimedia.org/T216308>.

If you have suggestions on how to improve the process, please join us!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom Radar 2019-07-30

2019-08-13 Thread Daniel Kinzler
Hi All,

Here are the minutes from this week's TechCom meeting (and sorry for the delay):

* RFC approved after last call: Heredoc arguments for templates (aka "hygienic"
or "long" arguments). Implementation will probably have to wait until Parsoid
has been ported to PHP, so we don't have to implement it three times.


* RFC approved after last call: Abstract schemas and schema changes. This allows
database updates to be written in a form agnostic to the underlying RDBMS, using
the DBAL library. We will be dropping support for Oracle and MSSQL for core to
make this feasible. The maintainers of the Oracle and MSSQL backends have
expressed interest in adding support back as extensions.


* RFC approved after last call: Create a proper command-line runner for
MediaWiki maintenance tasks. Implementation is pending priorization by the Core
Platform Team. .

* RFC under discussion: Set explicit PHP support target for MediaWiki.


* The TechCom meeting on August 14 will be a public meeting held as a session at
the Wikimania Hackathon, at 15:00 Sweden time (6am in California).

* There is no RFC discussion scheduled, and no last calls ongoing.

You can also find our meeting minutes at


See also the TechCom RFC board
.

If you prefer you can subscribe to our newsletter here


Cheers,
Daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] TechCom Radar 2019-07-24

2019-07-26 Thread Daniel Kinzler
Uhh...

> Thanks,
> Kate

That what happens when you copy emails, I guess :)
That mail was of course by me, not by Kate...


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom Radar 2019-07-24

2019-07-26 Thread Daniel Kinzler
Hi All,

Here are the minutes from this week's TechCom meeting:

Two RFCs are going on Last Call until August 7:

* "Heredoc" arguments for templates 
* Abstract schemas and schema changes 


If no objections remain unaddressed by August 7, these RFCs will be approved as
proposed and amended. If you care about these RFCs, please comment on phabricator
-- in support, or raising concerns. The Last Call period is not just for raising
objections, but also for confirming consensus.


We had an IRC meeting about "Merge Extension:Theme into core". There was good
progress made on the mechanics of adding support for theme based LESS variables
to Resource Loader. The discussion about actually merging the extension into
core is ongoing on the ticket. 

Other topics under active discussion:
* strategy for handling PHP interface changes

* future of relational datastore at WMF after the upgrade of MariaDB 10.1

* Store WikibaseQualityConstraint check data in persistent storage


You can also find our meeting minutes at


See also the TechCom RFC board
.

If you prefer you can subscribe to our newsletter here


Thanks,
Kate

--
Kate Chapman
Senior Program Manager, Core Platform
Wikimedia Foundation
kchap...@wikimedia.org

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reminder: Code Health Office Hour - Cyclic Dependencies

2019-07-08 Thread Daniel Kinzler
Am 08.07.19 um 20:21 schrieb Daniel Kinzler:
> Tomorrow (Tuesday), the code health group is hosting an office hour on Meet at
> 9am PDT (16:00 UTC, 18:00 CEST). You should see an invite in your calendar.

Of course, only WMF staff would see an invite. Sorry about that. Anyone should
be able to join using this link: <http://meet.google.com/civ-ooos-dpf>.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Reminder: Code Health Office Hour - Cyclic Dependencies

2019-07-08 Thread Daniel Kinzler
Tomorrow (Tuesday), the code health group is hosting an office hour on Meet at
9am PDT (16:00 UTC, 18:00 CEST). You should see an invite in your calendar.

The main topic will be cyclic dependencies - how they creep in, why they are
bad, and how to avoid and fix them. I'll be around to talk about some things the
core platform team and TechCom are looking into to address this problem.

If you are interested in improving the structure of MediaWiki core specifically
and in keeping code maintainable generally, please join!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] TechCom Radar 2019-05-22

2019-05-24 Thread Daniel Kinzler
Am 24.05.19 um 17:22 schrieb Kate Chapman:
> * IRC Meeting in #wikimedia-office to discuss this IRC on Wednesday
> May 29 at 21:00 UTC/23:00 CEST/14:00 PDT:
> <https://phabricator.wikimedia.org/T221177> Route Handler Interface
> RFC.

Since it may not be obvious from the title: this RFC is about defining the
extension interface for the new REST router component for MediaWiki core. The
routes themselves (the new public API) are not in scope of this RFC, but the
question how extensions would register and handle routes is.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom office hour (hackathon edition) today!

2019-05-17 Thread Daniel Kinzler
Hi all!

Today, TechCom will hold its second ask-us-anything office hour - not on IRC,
but at the hackathon in Prague!

If you are not at the hackathon, you can join via Google Meet (no google account
required):
<https://meet.google.com/gqn-wosn-dog>.

The office hour will start at 15:00 UTC (17:00 CEST, 8am PDT).

Ask us what you have always wanted to know, tell us what you have always wanted
to tell us, discuss the topic you have always wanted top discuss.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Finding and fixing cyclic dependencies

2019-05-13 Thread Daniel Kinzler
Hey Antoine!

Am 13.05.19 um 20:59 schrieb Antoine Musso:
> Hello,
> 
> A few months ago, the code health group enquired about the Php metrics static
> analysis tool [1] and it has been rather straightforward to run it 
> automatically
> and publish its report:
> 
> https://doc.wikimedia.org/mediawiki-core/master/phpmetrics/
> 
> The report has a list of potential coupling and also show relation between
> objects.  That might be a complement.

Yea, that's what I used for my initial analysis for the session on decoupling at
TechConf in Portland :)

phpmetrics and similar tools are useful, but they do not analyze the transitive
dependencies (at least not sufficiently, for my use case). That is, they can
tell me whether two classes are coupled, but not which classes to
decouple to resolve clusters of tightly coupled code. And they can't measure how
"tangled" the codebase is overall, just how "good" or "bad" a given class is (or
all classes are, on average).

> As for your tool, I am pretty sure we can easily run it automatically and
> publish it next to the PHP Metrics report?   The addition to CI has been 
> rather
> straightforward:
> 
> A container:
> https://gerrit.wikimedia.org/r/#/c/integration/config/+/469689/
> 
> The Jenkins job:
> https://gerrit.wikimedia.org/r/#/c/integration/config/+/469690/

Ruprecht comes with a few dependencies that may make containerization less
straight forward. Not terribly hard, but somewhat annoying. E.g. it needs python
1 *and* 3, it needs graph-tool which has to be installed as a debian package
from a non-standard repo, etc.

If you feel like looking into that, I'd of course be happy ;)

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Finding and fixing cyclic dependencies

2019-05-13 Thread Daniel Kinzler
Am 13.05.19 um 12:48 schrieb Amir Sarabadani:
> I wonder how many points you get for nerd snipping a software engineer :)

One, as it's rather easy ;)

> Wonderful work Daniel, kudos!

Thank you for your help!

> In the given list of bad dependencies, I find User class depending on Skin 
> class
> one of the most problematic ones.

It's nasty on the level of the dependency graph, but it's a trivial dependency,
and easily resolved using a trait or utility method:
  Skin::normalizeKey( $wgDefaultSkin );

Want to give it a go?

Thinking this one step further, all logic related to user options should be
factored out of the User class anyway. Filed as
<https://phabricator.wikimedia.org/T223099>.

> Also, ApiQuery class depending on all of its
> subclasses is one of the biggest issue making all of API query modules act as
> giant big monolith. That can (hopefully) easily addressed and untangle
> significant portion of the 40%, a rather low hanging fruit.

Yea... regarding these dependencies, see <https://github.com/mihaeu/dephpend/issues/47>.

Cheers,
Daniel

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Finding and fixing cyclic dependencies

2019-05-10 Thread Daniel Kinzler
Hi all!

I invite you to try out "Project Ruprecht"[1][2], a tool that measures the
"tangledness" of PHP code, and provides you with a "naughty list" of things to 
fix.

For now, you will have to install this locally. I hope however to soon have this
run automatically against core on a regular basis, perhaps by integrating it
with SonarQube. Maybe some day we can also integrate it with CI, to generate a
warning when a new cyclic dependency is about to be introduced.

So that was the tl;dr. Now for some context, history, and shout-outs. And some
actual real world science, too!

For a while now, I have been talking about the how much of a problem cyclic
dependencies are in MediaWiki core: When two components (classes, namespaces,
libraries, whatever) depend on each other, directly or indirectly, this means
that one cannot be used without the other, nor can it be tested, understood, or
modified without also considering the other. So, in effect, they behave as *one*
component, not two. Applied to MW core, this means that roughly half of our 1600
classes effectively behave like a single giant class. This makes the code rather
hard to deal with.

To fix this, I have been looking for tools that let me identify "tangles" of
classes that depend on each other, and metrics' to measure the progress of
"untangling" the code. However, the classic code quality metrics focus on
"local" properties of the code, so they can't tell us much about the progress of
untangling. And the tools I found that would detect cyclic dependencies in PHP
code would all choke on MediaWiki core: they would try to list all detected
cycles - which, by the super-exponential nature of possible paths through a
graph, would be millions and millions. So, the tools would choke and die. That
approach isn't practical for us.

Two discoveries allowed me to come up with a working solution:
First, I decided to leave the PHP world and turned towards graph analysis tools
built for large data sets. Python's graph-tool did the trick. It's build on top
of boost and numpy, and it's *fast*. It crunched through the 7500 or so class
dependencies in MW core in a split second, and told me that we have 14 "tangles"
(non-trivial strongly connected components), and that 43% of our classes are in
these tangles, with 40% being part of one big tangle that is essentially our
monolith manifest. So now I had a metric to work with: the number of classes in
tangles.

That was great, but still didn't tell me where to start. Graph-tool was still
not fast enough to deal with millions of cycles, and even if it had been, that
data wouldn't be very useful. I needed some smart heuristics. Luckily, I
(totally unintentionally, promise!) nerd sniped[5] Amir Sarabadani one evening
at the WMDE office by telling him about this problem. The next day, he told me
that he had been digging into the problem all night, and he had found a paper
that sounded relevant, and it also came with working code: "Breaking Cycles in
Noisy Hierarchies"[3] by J. Sun, D. Ajwani, P.K. Nicholson, A. Sala, and S.
Parthasarathy. I played with the code a bit, and yes! It spat out a list of 290
or so dependencies[4] that it thought were bad - and I agree for a good number
of them. It's not a clean working list, but it gives a very good idea of where
to start looking.

I find it quite fascinating that this works so well for cleaning up a codebase.
After all, the heuristic wasn't designed for this - it was designed for fixing
messy ontologies. Indeed, one of their test data sets was (English language)
Wikipedia's category system! I'd love to see what it does with Wikidata's
subclass hierarchy :)

But I suppose it makes sense - dependencies in software are conceptually a lot
like an ontology, and the same strategies of stratification and abstraction
apply. And the same difficulties, too - it's easy enough to spot a problematic
cycle, but often hard to say where it should be cut. And how to cut it - often,
the solution is not to just remove the dependency, but to introduce a new
abstraction that allows the relationship to exist without a cycle. I'd love to
see the research continue in that direction!

So, a big shout out to the researchers, and to Amir who found the paper!

I hope my ramblings have made you curious to play with Ruprecht, and see what it
has to say about other code bases. There's also another feature to play with
which I haven't discussed here: detection of risky classes using the Page Rank
algorithm. Fun!
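
For the curious, here is a rough illustration of that Page Rank idea (plain
power iteration over a toy dependency graph; this is a sketch, not Ruprecht's
actual implementation, and the class names are made up):

```python
# PageRank over a class dependency graph: drawing "X depends on Y" as an
# edge X -> Y, rank flows towards the most-depended-on classes, so a high
# rank marks a class that is "risky" to change.

def pagerank(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iterations):
        new = {v: (1 - damping) / n for v in nodes}
        for v in nodes:
            out = graph[v]
            if not out:
                # dangling node: spread its rank evenly over all nodes
                for w in nodes:
                    new[w] += damping * rank[v] / n
            else:
                for w in out:
                    new[w] += damping * rank[v] / len(out)
        rank = new
    return rank

deps = {
    "Title": [],
    "User": ["Title"],
    "WikiPage": ["Title", "User"],
    "Api": ["WikiPage"],
}
ranks = pagerank(deps)
riskiest = max(ranks, key=ranks.get)
```

In this toy graph "Title" ends up riskiest, since everything leans on it
directly or indirectly.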

Cheers,
Daniel

[1] https://phabricator.wikimedia.org/diffusion/MTDA/repository/master/
[2]
https://gerrit.wikimedia.org/r/admin/projects/mediawiki/tools/dependency-analysis
[3] https://github.com/zhenv5/breaking_cycles_in_noisy_hierarchies
[4] https://phabricator.wikimedia.org/P8513
[5] https://xkcd.com/356/

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation


[Wikitech-l] New policy: Wikimedia Engineering Architecture Principles

2019-05-06 Thread Daniel Kinzler
Hi all!

Today, TechCom[1] has approved a new policy: the Wikimedia Engineering
Architecture Principles [2]. We have been discussing the Architecture Principles
for about a year, by email, on mediawiki.org, at the hackathon, at the Technical
Conference, and on phabricator. I'm very happy that after a final round through
the RFC process[3] and three weeks of Last Call, we could now approve them as
policy.

The architecture principles guide all Wikimedia engineering endeavors. They are
derived from the Wikimedia movement's strategic direction and the Wikimedia
Foundation's product strategy as well as established best practices of the
software industry. They are informed by past experience as well as present needs
and constraints, and are expected to evolve when these needs and constraints
change.

The architecture principles are intended to guide engineering decisions on all
levels, from detailed core review to high level RFCs. People with merge rights
on software in production on WMF servers, as well as people responsible for
technical decision making and planning for such systems, are expected to know
and apply these principles.

If you haven't read them yet, please do so now!

Regards,
Daniel

[1] https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee
[2] https://www.mediawiki.org/wiki/Wikimedia_Engineering_Architecture_Principles
[3] https://phabricator.wikimedia.org/T220657

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom Radar 2019-04-24

2019-04-26 Thread Daniel Kinzler
Hi All,

Here are the minutes from this week's TechCom meeting:

* Last Call extended until May 1st: Establish Architecture Principles as a policy:
<https://www.mediawiki.org/wiki/Wikimedia_Engineering_Architecture_Principles>

* The first TechCom office hour went pretty well. We had a nice chat, mostly
about TechCom operations and the RFC process. Log:


* The next TechCom office hour is planned to be an in-person session (and video
feed) at the Hackathon in Prague. The exact time is still to be determined.

* No RFC meeting on May 1st (Labour Day)

* Manual:Coding conventions/PHP was updated to reflect T161647 (Deprecate using
PHP serialization inside MediaWiki), which had been approved two years ago.


You can also find our meeting minutes at


See also the TechCom RFC board
.

If you prefer you can subscribe to our newsletter here


-- daniel

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New: TechCom office hour on April 17.

2019-04-17 Thread Daniel Kinzler
Just a quick reminder that this is in four hours and a bit.

I'm really curious to see how this goes!

Am 11.04.19 um 22:35 schrieb Daniel Kinzler:
> Hi all!
> 
> On April 17, TechCom will hold an ask-us-anything office hour on IRC. In the
> past, we have held discussions about RFCs in this slot. In the future, we want
> to focus RFC discussions more on Phabricator - we will still have IRC
> meetings about RFCs sometimes, but we want to try a more open format with
> the office hours, to give room to ask TechCom questions or bring up recent
> discussions and concerns outside the RFC process.
> 
> The first TechCom office hour will be held on #wikimedia-office next week, 
> April
> 17, 2pm PDT (21:00 UTC, 23:00 CEST). Come and ask us anything, or tell us 
> about
> your ideas and concerns.
> 


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] TechCom Radar 2019-04-10

2019-04-12 Thread Daniel Kinzler
Am 12.04.19 um 05:18 schrieb David Sharpe:
> I think there was an extra text paste affecting this portion of the document:
> 
> “Motivation: Motivation: These principles aim at the /equity/ goal set by
> the /Movement Strategy Outcomes/ and reflected in the Wikimedia
> Foundation's /Goals and Priorities/: /"Modernize our product experience/",
> particularly by /"integrating content from Commons, Wikidata, Wikisource 
> and
> other projects into Wikipedia”/."

Double motivation :)

Fixed, thank you.


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] TechCom Radar 2019-04-10

2019-04-11 Thread Daniel Kinzler
Hi All,

Here are the minutes from this week's TechCom meeting:

* Older task turned into RFC for discussion: Create temporary accounts for
anonymous editors. 

* Updated: Re-evaluate librsvg as SVG renderer on Wikimedia wikis: resvg is now
available in Debian unstable, perhaps experiments with thumbor could start.


* Last Call until April 24: Establish Architecture Principles as a policy


* New request for guidance: Introduce PageIdentity to be used instead of Title.


* First TechCom office hour on IRC next week, April 17, 2pm PDT (21:00 UTC,
23:00 CEST). We'll be on #wikimedia-office, come and ask us anything!

You can also find our meeting minutes at


See also the TechCom RFC board
.

If you prefer you can subscribe to our newsletter here


Thanks,
Kate

--
Kate Chapman
Senior Program Manager, Core Platform
Wikimedia Foundation
kchap...@wikimedia.org

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New: TechCom office hour on April 17.

2019-04-11 Thread Daniel Kinzler
Hi all!

On April 17, TechCom will hold an ask-us-anything office hour on IRC. In the
past, we have held discussions about RFCs in this slot. In the future, we want
to focus RFC discussions more on Phabricator - we will still have IRC
meetings about RFCs sometimes, but we want to try a more open format with
the office hours, to give room to ask TechCom questions or bring up recent
discussions and concerns outside the RFC process.

The first TechCom office hour will be held on #wikimedia-office next week, April
17, 2pm PDT (21:00 UTC, 23:00 CEST). Come and ask us anything, or tell us about
your ideas and concerns.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Errata: Last Call for comments on the Architecture Principles

2019-04-11 Thread Daniel Kinzler
> You can find the draft at
> <https://www.mediawiki.org/wiki/Wikimedia_Engineering_Architecture_Principles>.
>
> You can find past discussion on the talk page, but please put any new comments
> on the phabricator ticket where we track the last call, as required by the RFC
> process: <https://phabricator.wikimedia.org/T194911>.

Great, I linked to the wrong ticket. Here is the correct one:
<https://phabricator.wikimedia.org/T220657>.

At least I linked the correct draft ;)


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Last Call for comments on the Architecture Principles

2019-04-11 Thread Daniel Kinzler
After a year of drafting and discussions, TechCom is putting its proposed
Architecture Principles on last call, to be adopted as official policy. This
policy will apply to all engineering endeavors of the Wikimedia foundation, and
is intended to guide all engineering decisions, great and small.

This is a big deal, but it's not intended to be a revolution. The Architecture
Principles mostly codify things that have long been established practice, or
have long been agreed on to be desirable. Not by everybody, but by rough
consensus. We did our best to provide opportunities for input and feedback, and
tried hard to incorporate and consolidate.

The Architecture Principles are a statement of the values of the engineers of
Wikimedia, as represented by TechCom. They mean a great deal to us. We hope you
will find them useful when evaluating ideas and plans. They are intended as a
guide that ensures coherence, a tool that helps us build better software 
together.

You can find the draft at
<https://www.mediawiki.org/wiki/Wikimedia_Engineering_Architecture_Principles>.

You can find past discussion on the talk page, but please put any new comments
on the phabricator ticket where we track the last call, as required by the RFC
process: <https://phabricator.wikimedia.org/T194911>.

The last call is open until April 24. If no major issues are raised and remain
unresolved by that time, the draft will be accepted as policy.

But keep in mind that the principles are not intended to be set in stone. They
are derived from the strategic goals and product priorities, from past
experience and from current constraints. When these things change, the
principles should change.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Uploading new versions of other people's patches to gerrit

2019-04-09 Thread Daniel Kinzler
Am 09.04.19 um 11:21 schrieb Bartosz Dziewoński:
> In the meantime, you can use the traditional Git way of generating patch files
> and sending them by email (or perhaps, in a modern twist, via Phabricator).

Waa! what? really?

Can we get this back at least for people who have +2?


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Draft: Best practices for software design

2019-03-27 Thread Daniel Kinzler
Hi all!

Please comment on the draft of a best practices guideline for software design:
<https://www.mediawiki.org/wiki/User:DKinzler_(WMF)/Software_Design_Practices>.

I have been thinking of writing down some of my (personal, so far) golden rules
that have emerged in my work on MediaWiki over the years. So here it is. My plan
is to gather some feedback, then turn this into an RFC, and hopefully have some
future version adopted as an official guideline.

Please let me know about things you agree with, or disagree with, and stuff you
find just odd or cryptic.

Thanks!

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New Gerrit privilege policy

2019-03-18 Thread Daniel Kinzler
Am 16.03.19 um 18:40 schrieb Zppix:
> So your basically telling me, I can’t decide who gets the power to +2 on for 
> example a toolforge tool I actively am the primary maintainer of? Instead it 
> has to be requested.

The relevant part of the policy reads:

| If there is a consensus of trusted developers on the Phabricator task, any of
| the Gerrit administrators can resolve the request. The task must remain open
| for at least a week, to allow interested developers to comment. Additional
| time should be allowed if the request is open during travel or holiday
| periods.

So, if you are the one trusted developer on the project, and nobody else cares,
+2 will generally be given after a week.

The rationale for requiring a request, instead of allowing the owners of git
repos to grant permissions themselves, is twofold:

1) limiting the impact of compromised volunteer accounts. A compromised account
that can give +2 rights is more problematic than a compromised account that has
+2 rights. It's much easier for WMF to ensure the security of staff accounts.

2) allowing the developer community to raise objections against individuals they
have had bad experiences with in the past. The person asking you for +2 rights
may seem nice enough to you, but giving others an opportunity to chime in seems
prudent.

Both are potential issues for granting maintainer rights on a toolforge project
as well. And perhaps the policy for that should also be revised - perhaps there
should be a distinction between "high impact" and "regular" tools. But that is
beyond the scope of this policy, and this discussion.

> I do not disagree with a lot of the changes to technical policies, but with 
> this change it seems to restrict ability to scale projects. 

That would be the case if the above requests were not handled in a timely
manner. If this should be the case, please complain loudly so we can fix it.

Or do you think one week is an unreasonably long time?

> I also do believe that this change should of be taken under RfC or some sort 
> of consensus-gaining measure. I respect the intentions, but I absolutely 
> think the change needs reverted then voted on by the technical community. 

It did go though the RFC process, including an IRC discussion and a last call
period. Do you have a suggestion for how and where we could have publicized this
more, to gather more feedback?

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What would you like to see in gerrit?

2019-02-08 Thread Daniel Kinzler
Oh. Oooohhh. Thank you James, you just made my life so much easier!


Am 08.02.19 um 14:42 schrieb James Forrester:
> On Fri, 8 Feb 2019 at 11:35, Daniel Kinzler  wrote:
> 
>> Am 28.01.19 um 19:25 schrieb Paladox via Wikitech-l:
>>> Hi, what would you like to see in gerrit or improved?
>>
>> * I would like to be able to set a cut-off date for my dashboard. E.g.
>> "show
>> only things touched in the last 2 weeks". Ideally, I would be able to make
>> things sticky by starring them, so this would turn into "show only things
>> touched in the last 2 weeks OR starred by me". This would remove a lot of
>> clutter.
>>
> 
> I gave up on gerrit's personal dashboard years ago, and instead have pinned
> browser tabs with specific gerrit searches, which are very powerful.
> 
> https://gerrit.wikimedia.org/r/q/is:open+(reviewer:self+OR+assignee:self)+-is:wip+(-age:2w+OR+is:starred)
> will give you "open non-WIP things for which I'm a reviewer/assignee, and
> which either I've starred or have been touched within the past two weeks",
> which I think is what you're looking for.
> 
> However, yes, it'd be nice to be able to customise the personal dashboard
> queries.
> 
> J.
> 


-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What would you like to see in gerrit?

2019-02-08 Thread Daniel Kinzler
Am 28.01.19 um 19:25 schrieb Paladox via Wikitech-l:
> Hi, what would you like to see in gerrit or improved? 

* I would like to be able to set a cut-off date for my dashboard. E.g. "show
only things touched in the last 2 weeks". Ideally, I would be able to make
things sticky by starring them, so this would turn into "show only things
touched in the last 2 weeks OR starred by me". This would remove a lot of 
clutter.


* wikibugs should ignore activity on changes that are in WIP state.

* Conflating comment/response with "request review" is confusing. Using "request
review" to get out of WIP mode is impossible to discover. Representing WIP as an
on-or-off state in the UI would make things much simpler.

* a "go to next unresolved comment" navigation button. It's a bit unclear
whether this should include unresolved comments in older change sets. Doing so
without notice would be confusing. Maybe a "go to unresolved
comments in PSn?" prompt when there are no more unresolved comments further
"down" in the current change set would work.

* clicking on the name of a repo in a change should take me to a place where I
can browse that repo. It currently takes me to a list of tasks on that project,
which is quite useless. Same for the target branch.

* git review: submitting a chain of commits with git review should not override
the topic for all changes that get updated. In fact, git review should never
override the topic. It should use the local branch name as a default, but not
force it.

* git review: a nice shortcut for "rebase on change number ". Same as the
rebase button in gerrit, but allowing me to resolve conflicts locally.

-- 
Daniel Kinzler
Principal Software Engineer, Core Platform
Wikimedia Foundation

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
