Re: [Wikitech-l] Proposal for removal without deprecation of MagicWord::clearCache

2018-07-31 Thread Bart Humphries
I favor deprecating functions. I'm sure you're familiar with

Have a great day,
Bart Humphries
(909)529-BART(2278)

On Tue, Jul 31, 2018 at 3:16 PM, Alex Winkler 

> So to give my perspective as an "outsider":
> My day-to-day job is to write MediaWiki extensions for a wiki farm called
> Liquipedia. We mostly hop between LTS versions, so even deprecation for two
> versions doesn't necessarily mean much to us here. Generally I just install
> a new MediaWiki version on dev and then test all of our extensions there
> until they work, without bothering the live version of our wikis.
> The talk here seems to be mostly about people who "blindly" upgrade their
> wiki, and I agree there will always be issues with those, no matter the
> deprecation method. If you don't update to every version, you might never
> see a deprecation notice, as the next version you upgrade to is not
> necessarily the next version of MediaWiki released. And even if you do
> update to every version, you might still not be able to fix your own
> extensions, or you might not even check for deprecation notices (mind that
> they only show up with the right dev settings enabled).
> I totally agree on directly removing functions that have been functionless
> for years, like the one we had recently that hadn't worked since sometime
> in 2009, no question about that, but I'm not so sure about working functions.
> There is a lot of consideration that goes into writing MediaWiki extensions,
> and there might be good reasons not to host a MediaWiki extension on
> Wikimedia git repositories. I have seen very specialized MediaWiki
> extensions that would not work without outside code (extensions allowing
> login against outside user databases and extensions querying certain
> internal APIs come to mind), so I don't think surveying only Wikimedia-hosted
> repositories is a good indicator of real-world usage. Generally I think
> deprecation periods that are supposed to actually be helpful should always
> span at least one LTS release, as those reach a bigger audience.
> Deprecation-wise, I personally feel it only makes a difference if
> deprecations always survive the next LTS release, as most less-regular wiki
> owners are likely to jump from one LTS version to the next. If deprecations
> don't wait for an LTS release, then deprecation is somewhat pointless from
> my point of view, since these policies are mostly in place for third-party
> wikis.
> I hope this point of view is helpful to you. If there are any further
> questions regarding our own workflow, feel free to ask and I'll try to
> answer to the best of my ability.
> Alex "FO-nTTaX" Winkler
> Head of Liquipedia Development
> -
> Am 31.07.2018 um 14:53 schrieb Aryeh Gregor:
>> On Tue, Jul 31, 2018 at 3:46 PM, Stephan Gambke 
>> wrote:
>>> I agree that there is a trade-off to be done.
>>> I disagree about expecting code to be put where it is visible to core
>>> developers. I do appreciate that you go and look for where the
>>> functionality that you are working on is used outside core. But
>>> ultimately MW is publishing a public interface and it is unreasonable
>>> to expect all consumers of that interface to "register" and publish
>>> their code as well.
>> We don't, but if they don't publish it we have to guess at what will
>> or won't break it, which means there's a higher chance that we will
>> break it.
>>> The difference is that in that case the test would have run through,
>>> throwing just a notice and the call to the deprecated function could
>>> have been removed another day.
>> Hard-deprecation fails the test completely, right?
>>> Of course, if they did not want to deal
>>> with breakages, that maintainer should not have pulled MW master in
>>> the first place and should have just worked on whatever MW version
>>> they left off of a few months ago when they last worked on their
>>> extension.
>> Correct, we do not and should not go out of our way to make life easy
>> for people running unstable MediaWiki.  It's not meant for production
>> unless you're following development closely and are prepared to deal
>> with breakage.
>>> True. I don't know how other people do it, but I usually only scan the
>>> release notes for big changes and otherwise rely on deprecation notes
>>> to pop up. Doing otherwise would require me to maintain a register of
>>> all MW core functions called from my extensions and cross-checkin
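
That register-and-cross-check chore could in principle be mechanized. A naive
sketch (function and variable names are invented for illustration; real
tooling would use static analysis such as phan rather than a regex grep,
which misses dynamic and aliased calls):

```python
import re

def find_deprecated_calls(source, deprecated):
    """Naively scan extension source text for calls to deprecated core
    functions.  Returns {name: [line numbers where a call appears]}."""
    hits = {}
    for lineno, line in enumerate(source.splitlines(), 1):
        for name in deprecated:
            # Match the bare name followed by an opening parenthesis.
            if re.search(r"\b%s\s*\(" % re.escape(name), line):
                hits.setdefault(name, []).append(lineno)
    return hits

extension_src = (
    "$mw = new MagicWord();\n"
    "$mw->clearCache();\n"
    "wfGetDB( DB_REPLICA );\n"
)
print(find_deprecated_calls(extension_src, ["clearCache"]))
```

Run against each extension whenever the target core version's release notes
list newly deprecated functions, this gives a crude changelog-to-codebase
diff without maintaining the register by hand.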

Re: [Wikitech-l] Please comment on the draft consultation for splitting the admin role

2018-06-11 Thread Bart Humphries
"I remember a situation when I posted a fix for a script in the
MediaWiki namespace as an {{edit request}}, and a well-meaning administrator
tried to "improve" my line of code and forgot a comma, breaking all
JavaScript for all logged-in as well as not logged-in Wikipedia editors and
readers for some painful minutes"

Everyone makes mistakes.  I presume that under this revised proposal, that
administrator would still have had JS edit permission, and might have still
made the mistake.  I mean, they apparently knew JS well enough to have been
able to pass whatever test would have been required to get that permission
added to their account.

Perhaps we need a real test environment of some sort, so that changes like
that must be run on the test server for X [time period] before being pushed
to live?  Changes can't happen on live until there's some sort of consensus
that the test code actually works as run -- any additional changes reset
the test time period counter before it can be pushed to live.
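
To make the reset-the-counter rule concrete, here is a toy model of such a
staging gate (the seven-day soak period and all class and method names are
invented for illustration, not part of any existing MediaWiki feature):

```python
import datetime as dt

SOAK_PERIOD = dt.timedelta(days=7)  # the "X time period" from the proposal

class StagedChange:
    """A change must sit unmodified on the test wiki for SOAK_PERIOD
    before it may be pushed live; any new edit resets the clock."""

    def __init__(self, content, now):
        self.content = content
        self.staged_at = now

    def amend(self, content, now):
        """Apply a further edit on the test wiki, resetting the soak timer."""
        self.content = content
        self.staged_at = now

    def deployable(self, now):
        """True once the change has soaked untouched for the full period."""
        return now - self.staged_at >= SOAK_PERIOD

t0 = dt.datetime(2018, 6, 1)
change = StagedChange("fixed gadget JS", t0)
print(change.deployable(t0 + dt.timedelta(days=8)))
```

The interesting property is in `amend`: a last-minute "improvement" like the
forgotten comma above would restart the waiting period instead of going
straight to every reader.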

Bart Humphries

On Mon, Jun 11, 2018 at 7:40 AM, Thiemo Kreuz 

> > Is there any historical evidence that sysops being able to edit JS / CSS
> > caused some serious issues?
> Oh yes, this happens more often than I feel it needs to. I remember a
> situation when I posted a fix for a script in the MediaWiki:…
> namespace as an {{edit request}}, and a well-meaning administrator
> tried to "improve" my line of code and forgot a comma, breaking all
> JavaScript for all logged-in as well as not logged-in Wikipedia
> editors and readers for some painful minutes.
> I believe such incidents can be avoided with clearer roles that are
> visible to everybody. A separate "tech admin" role would also allow
> volunteers to apply for exactly that, and not be asked why they don't
> do enough "administrator actions" with their privileges.
> Sure, this is anecdotal evidence. Please forgive me, but I currently
> don't have the time to find the pages documenting these situations.
> Best
> Thiemo
> ___
> Wikitech-l mailing list

Re: [Wikitech-l] Download a wiki?

2018-05-18 Thread Bart Humphries
Great, thanks!

I have a convention this weekend, so it'll probably be Monday
evening/Tuesday before I can really do anything else with that dump.

On Fri, May 18, 2018, 3:32 PM Federico Leva (Nemo) 

> You're in luck: just now I was looking for test cases for the new
> version of
> I've made a couple changes and for tomorrow you should see a new XML dump
> at
> (there's also,
> not mine )
> Federico

[Wikitech-l] Download a wiki?

2018-05-18 Thread Bart Humphries
We have a very old wiki which has basically never been updated for the past
decade and which was proving stubbornly resistant to updating several years
ago.  And now the owner of the server has drifted away, but we do still
have control over the domain name itself.  The best way that we can think
of to update everything is to scrape all of the pages/files, add them to a
brand new updated wiki on a new server, then point the domain to that new
server.  Yes, user accounts will be broken, but we feel that this is the
most feasible solution unless someone else has another idea.

However, there are a lot of pages on -- which is the wiki I'm
talking about.  Any suggestions for how to automate this scraping process?
I can scrape the HTML off every page, but what I really want is to get the
wikitext off of every page.
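
For what it's worth, MediaWiki's built-in export machinery is usually easier
than scraping HTML: Special:Export (or the API's
`api.php?action=query&prop=revisions&rvprop=content`) hands back the raw
wikitext. A minimal sketch of parsing such an export dump, assuming only the
standard export XML shape (the schema-namespace URI varies by MediaWiki
version, so the code deliberately ignores it):

```python
import xml.etree.ElementTree as ET

def wikitext_from_export(xml_dump):
    """Parse a Special:Export XML dump and return {page title: wikitext}.

    Tag names are compared with their XML namespace stripped, so the same
    code handles any export schema version.
    """
    def local(tag):  # drop the "{namespace}" prefix ElementTree adds
        return tag.rsplit("}", 1)[-1]

    pages = {}
    for page in ET.fromstring(xml_dump).iter():
        if local(page.tag) != "page":
            continue
        title, text = None, ""
        for el in page.iter():
            if local(el.tag) == "title":
                title = el.text
            elif local(el.tag) == "text":
                text = el.text or ""  # empty pages export as a bare tag
        if title is not None:
            pages[title] = text
    return pages
```

The page list itself can come from Special:AllPages or the API's
`list=allpages`; feed each batch of titles to Special:Export and run the
result through the function above before importing into the new wiki.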

Bart Humphries