Hi,

On 2020-08-28 02:18, Daniel Kinzler wrote:
> tl;dr: the key question is:
> 
>     Can we shorten or even entirely skip the deprecation process,
>     if we have removed all usages of the obsolete code from public
>     extensions?

I think going down this road would be a mistake, mostly because it
hides the real problem of providing stable, long-term APIs. Core is the
platform that extensions are built on top of. If core is constantly
changing, it becomes really hard to make stable extensions. I think the
mentality should be "how do we make this more stable for developers
without adding too much burden on core devs?" rather than figuring out
how core devs can break stuff as fast as possible.

If MediaWiki, plus the extensions someone wants to use, isn't stable,
sysadmins are either going to stay on an older version, or stop using
MediaWiki. We don't want either of those.

A few years ago it was fully possible to write a basic parser tag/hook
extension that would've worked fine with no changes from ~1.17
(post-ResourceLoader) to ~1.31 (pre-MediaWikiServices-ification). Is
that still possible? If not, can we make it possible? Most extension
developers aren't full time MediaWiki devs, and we shouldn't make that a
requirement, even implicitly.
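
The surface such an extension touches is tiny. Here's a minimal sketch
(the <shout> tag, the class name, and its behavior are invented for
illustration; Parser::setHook() and the ParserFirstCallInit hook are
the actual long-stable core APIs this kind of extension relies on):

```php
<?php
// Hypothetical tag extension: <shout>text</shout> renders text uppercased.
class ShoutTag {
	// In a real extension this is wired up via the ParserFirstCallInit hook.
	public static function onParserFirstCallInit( $parser ) {
		// Register <shout> so the parser calls our render callback.
		$parser->setHook( 'shout', [ self::class, 'render' ] );
		return true;
	}

	// Tag callback: uppercase the content, then escape it for HTML output.
	public static function render( $input, array $args, $parser, $frame ) {
		return htmlspecialchars( strtoupper( $input ) );
	}
}
```

That's one hook registration and one parser method. Keeping that
surface stable for a decade shouldn't be asking much.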

When we change an API, why is it impossible to keep the old functions
around for a year? In some complicated cases sure, it's not fully
possible. But in most cases I think it is. We have to write the
back-compat code anyways for deployment purposes, so keeping it around
isn't really extra work.
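
The pattern is cheap to express. A minimal sketch in Python rather than
PHP, with invented names (`get_revision_text`/`fetch_text` are not real
MediaWiki functions): the old entry point becomes a thin wrapper that
emits a deprecation warning and delegates to the new API, so existing
callers keep working for a release or two while being pointed at the
replacement.

```python
import warnings

def fetch_text(rev_id: int) -> str:
    """New API: the replacement implementation."""
    return f"text of revision {rev_id}"

def get_revision_text(rev_id: int) -> str:
    """Old API, kept for back-compat: warn, then delegate.

    This is the "hard deprecation" step: callers still get the old
    behavior, but see a warning naming the replacement.
    """
    warnings.warn(
        "get_revision_text() is deprecated; use fetch_text()",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not this shim
    )
    return fetch_text(rev_id)
```

The wrapper is the same code we'd write for a staged deployment anyway;
deleting it a year later is one trivial commit.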

Broadly I think it's important to remember that every single deprecation
has a cost larger than the actual rote code change: developers have to
update their mental model to use the new code, which is probably the
most expensive cost we have to pay.

I also think that before setting out to speed up the deprecation
policy, we should check whether the current one works well. The impression I'm
getting is that more and more people are sticking to LTS releases
because extensions, skins, etc. are breaking every upgrade and it's just
less work to fix it every 2-3 years rather than every 6 months.

> So, let's dive in. On the one hand, the new (and old) policy states:
> 
>     Code MUST emit hard deprecation notices for at least one major
>     MediaWiki version before being removed. It is RECOMMENDED to emit
>     hard deprecation notices for at least two major MediaWiki
>     versions. EXCEPTIONS to this are listed in the section "Removal
>     without deprecation" below.
> 
> This means that code that starts to emit a deprecation warning in
> version N can only be removed in version N+1, better even N+2. This
> effectively recommends that obsolete code be kept around for at least
> half a year, with a preference for a full year and more. However, we
> now have this exception in place:
> 
>     The deprecation process may be bypassed for code that is unused
>     within the MediaWiki ecosystem. The ecosystem is defined to
>     consist of all actively maintained code residing in repositories
>     owned by the Wikimedia foundation, and can be searched using the
>     code search tool.
> 
> When TechCom added this section[3][4], we were thinking of the case where a
> method becomes obsolete, but is unused.

I would note that the original deprecation policy I wrote had this
clause in it: "Extension developers are encouraged to mirror their code
into Wikimedia's Gerrit/Phabricator/Github to make it easier for core
developers to identify usage patterns. Extensions that are open source
will be given more consideration than those that core developers cannot
see." That explicitly said open source extensions are favored, but
didn't leave private ones out entirely.

I wasn't involved in the drafting of this new exception, but it has
several flaws that immediately jump out to me:

1) "actively maintained" is a trap: code can be passively maintained
because it's small enough to be stable, but that doesn't mean it should
be excluded.
2) I would hope there are no repositories that are solely "owned" by the
Wikimedia Foundation. Defining the "MediaWiki ecosystem" as code owned
by the WMF is mind-boggling.

> However, what does this mean for obsolete code that *is* used? Can we
> just go ahead and remove the usages, and then remove the code without
> deprecation? That seems to be the logical consequence.
> 
> The result is a much tighter timeline from soft deprecation to
> removal, reducing the amount of deprecated code we have to drag along
> and keep functional. This would be helpful particularly when code was
> refactored to remove undesirable dependencies, since the dependency
> will not actually go away until the deprecated code has been removed.
> 
> So, if we put in the work to remove usages, can we skip the
> deprecation process? After all, if the code is truly unused, this
> would not do any harm, right? And being able to make breaking changes
> without the need to wait a year for them to become effective would
> greatly improve the speed at which we can modernize the code base.
> 
> However, even skipping soft deprecation and going directly to hard
> deprecation of the construction of the Revision class raised
> concerns, see for instance
> <https://www.mail-archive.com/[email protected]/msg92871.html>.

The Revision class is (was) probably the second or third most used class
after Title and User. That seems like a bad target to use as a case
study for speeding up deprecation. Those cases should go slower rather
than faster.

> The key concern is that we can only know about usages in repositories
> in our "ecosystem", a concept introduced into the policy by the
> section quoted above. I will go into the implications of this further
> below. But first, let me propose a change to the policy, to clarify
> when deprecation is or is not needed.
> 
> I propose that the policy should read:
> 
>     Obsolete code MAY be removed without deprecation if it is unused (or
>     appropriately gated) by any code in the MediaWiki ecosystem. Such
>     removal must be recorded in the release notes as a breaking change
>     without deprecation, and must be announced on the appropriate
>     mailing lists.
> 
>     Obsolete code that is still used within the ecosystem MAY be
>     removed if it has been emitting deprecation warnings in AT LEAST
>     one major version release, and a best effort has been made to
>     remove any remaining usages in the MediaWiki ecosystem. Obsolete
>     code SHOULD be removed when it has been emitting deprecation
>     warnings for two releases, even if it is still used.
> 
> And further:
> 
>     The person, team, or organization that deprecates code SHOULD
>     drive the removal of usages in a timely manner. For code not under
>     the control of this person, team, or organization, appropriate
>     changes SHOULD be proposed to the maintainers, and guidance SHOULD
>     be provided when needed.
> 
> Compared to the old process, this puts more focus on removing usages
> of obsolete code. Previously, we'd often just wait and hope that
> usages of deprecated methods would vanish eventually. Which may take
> a long time, we still have code in MediaWiki that was deprecated in
> 1.24. Of course, every now and then someone fixes a bunch of usages
> of deprecated code, but this is a sporadic occurrence, not designed
> into the process.

In the past we were able to take advantage of Google Code-In students,
who usually made a ton of progress in this area, but that's not really
a possibility anymore :(

> With the change I am proposing, whoever deprecates a function also
> commits to removing usages of it asap. For extension developers, this
> means that they will get patches and support, but they may see their
> code broken if they do not follow up.

This used to be common practice before the 1.29 Deprecation policy came
into effect, but it was removed because people (including myself)
thought it was too burdensome. You want to make a change to core to fix
something, and now you're on the hook for updating 50 extensions too? No
thanks, we'll just hack around it in our extension and move on. I don't
think we should go back to that model.

> Now, my proposal hinges on the idea that we somehow know all relevant
> code that needs fixing. How can that work?
> 
> When TechCom introduced the idea of the "MediaWiki ecosystem" into
> the policy, our reasoning was that we want to support primarily
> extension developers who contribute their extensions back to the
> ecosystem, by making them available to the public. We found it fair
> to say that if people develop extensions solely for their own use, it
> is up to them to read the release notes. We do not need to go out of
> our way to protect them from changes to the code base.
> 
> Effectively, with the proposed change to the policy, maintainers of
> public extensions will get more support keeping their extensions
> compatible, while maintainers of private extensions will receive less
> consideration.

As written, maintainers of private extensions get *no* consideration.

And when we say private, it's not always intentionally private. Consider
LocalSettings.php hacks or someone making live server hacks and
forgetting to commit.

> It seems desirable and fair to me to allow for "fast track" removal
> of obsolete code, but only if we create a clear process for making an
> extension "official". How exactly would an extension developer make
> sure that we know their extension, and consider it part of the
> ecosystem? In practice, "known code" is code accessible via
> codesearch[5]. But how does one get an extension into the codesearch
> index? There is currently no clear process for this.

On <https://www.mediawiki.org/wiki/Codesearch#Included_repositories>:
"Additional repositories can be added upon request". Please let me know
(or edit the page) if something isn't clear about that.

> Ideally, it would be sufficient to:
> * create a page on mediawiki.org using the {{Extension}} infobox,
> * setting the status to "stable" (and maybe "beta"),
> * and linking to a public git repository.
> 
> It should be simple enough to create a script that feeds these repos
> into codesearch. A quick look at the Category:Extensions_by_status
> category tells me that there are about a thousand such extensions.

And how do we determine if the code is still being maintained or used?
That's already starting to become a problem:
<https://phabricator.wikimedia.org/T241320>. I suspect if you wrote said
script, the quality and usability of search results would drop drastically.

This is also a big step towards centralization (you only get support if
your extension is listed on mediawiki.org), which I think most of us
are opposed to philosophically.

-- Legoktm

_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
