[Wikitech-l] Re: [Toolforge][GRID SHUTDOWN] Toolforge Grid Engine has been shutdown

2024-03-15 Thread Bryan Davis
On Fri, Mar 15, 2024 at 2:40 AM Magnus Manske via Wikitech-l
 wrote:
>
> As someone who has been migrating a lot of tools, and who has been at times 
> upset^W frustrated at some of the proverbial devils of the details, I want to 
> thank Bryan, and everyone involved, for the sustained effort to keep 
> toolforge going into the future, and congratulations on a job well done.

Thank you very much Magnus for those words of support. As you very
well know yourself, the folks who build things to help others are more
likely to hear about frustrations when their efforts have fallen short
than they are to hear praise when they are succeeding. Hearing from
you and others that we are on the correct path goes a long way towards
keeping up the energy to continue.

In that spirit, I would like to name a few folks specifically who have
chopped more wood and carried more water in the final push to the
deadline than even I had hoped they would. David Caro, Taavi Väänänen,
and Seyram Komla Sapaty all put in great amounts of time and effort
over the last few months to make this project as successful as
possible.

David went above and beyond to try to ensure that Toolforge had
the technical capabilities needed to allow even less widely used
runtime languages to migrate. Go stalk him a bit on Phabricator to see
the patience and attention he used to help specific tools make it
across, and be sure to check out the many "My first Buildpack"
tutorials that he helped publish.

Taavi carried on a Toolforge admin tradition of finding ways to use
Toolforge itself to provide new features for others by making a
reusable custom image for running Pywikibot scripts using Build
Service. He also spent quite a lot of extra time helping folks
one-on-one with questions via IRC and Phabricator.

Komla used every means he could think of to try to contact folks
whose tools were in need of attention--Phabricator tickets, Talk page
messages, direct emails, tracking down SUL accounts for Developer
accounts that were not responsive. He also kept the rest of us on the
Toolforge admin team informed of counts of tools remaining and trends
in feedback to consider. He should have had the honor of sending the
final shutdown announcement yesterday, but all 3 of his different ISPs
were disrupted by undersea cable cuts![0]

I would actually like to name one more person, Nicholas Skaggs, as
having been critical to the final steps in converting from Grid Engine
to Kubernetes. Nicholas was the manager of the Wikimedia Cloud
Services team from June 2020 through February 2024. Nicholas had
extreme faith in the abilities of the team and the larger technical
community. He urged us to complete the difficult work needed for the
migration rather than putting it off yet again with "just one more"
upgrade of the Grid exec and control nodes. I and others will miss his
leadership at the Foundation and wish him well in the next phase of
his career.

[0]: https://blog.cloudflare.com/undersea-cable-failures-cause-internet-disruptions-across-africa-march-14-2024

Bryan
--
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] [Toolforge][GRID SHUTDOWN] Toolforge Grid Engine has been shutdown

2024-03-14 Thread Bryan Davis
As of 2024-03-14T11:02 UTC the Toolforge Grid Engine service has been
shutdown.[0][1]

This shutdown is the culmination of a final migration process from
Grid Engine to Kubernetes that started in late 2022.[2] Arturo
wrote a blog post in 2022 that gives a detailed explanation of why we
chose to take on the final shutdown project at that time.[3] The roots
of this change go back much further, however, to at least August 2015
when Yuvi Panda posted to the labs-l list about looking for more
modern alternatives to the Grid Engine platform.[4]

Some tools have been lost and a few technical volunteers have been
upset as many of us have striven to meet a vision of a more secure,
performant, and maintainable platform for running the many critical
tools hosted by the Toolforge project. I am deeply sorry to each of
you who have been frustrated by this change, but today I stand to
celebrate the collective work and accomplishment of the many humans
who have helped imagine, design, implement, test, document, maintain,
and use the Kubernetes deployment and support systems in Toolforge.

Thank you to the past and present members of the Wikimedia Cloud
Services team. Thank you to the past and present technical volunteers
acting as Toolforge admins. Thank you to the many, many Toolforge tool
maintainers who use the platform, ask for new capabilities, and help
each other make ever better software for the Wikimedia movement. Thank
you to the folks who will keep moving the Toolforge project and
other technical spaces in the Wikimedia movement forward for many,
many years to come.


[0]: https://sal.toolforge.org/log/DrOgPI4BGiVuUzOd9I1b
[1]: https://wikitech.wikimedia.org/wiki/Obsolete:Toolforge/Grid
[2]: https://wikitech.wikimedia.org/wiki/News/Toolforge_Grid_Engine_deprecation#Timeline
[3]: https://techblog.wikimedia.org/2022/03/14/toolforge-and-grid-engine/
[4]: https://lists.wikimedia.org/pipermail/labs-l/2015-August/003955.html

Bryan, on behalf of the Toolforge administrators
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


[Wikitech-l] Re: remote access for vps

2024-03-04 Thread Bryan Davis
On Mon, Mar 4, 2024 at 8:02 PM Tim Moody  wrote:
>
> re: [3] and [4], in order to support the creation a ZIM of MDWiki, I need a 
> list of all of its redirects, and I find the fastest means of obtaining this 
> is to query the database directly. This is the only info taken directly from 
> the database.

You are probably already aware of this, but there is an Action API
endpoint to enumerate redirects:
* https://www.mediawiki.org/wiki/API:Allredirects
* https://mdwiki.org/wiki/Special:ApiSandbox#action=query&format=json&list=allredirects

Direct database access is likely faster than enumerating results 500
at a time via the Action API, but using the API may reduce the
complexity and fragility of your operational deployment.
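For illustration, a minimal continuation loop over the allredirects
module might look like this (a sketch, not production code; the
`/w/api.php` endpoint path and the User-Agent string are assumptions
you would adjust for your tool):

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen


def fetch_api(params):
    """Fetch one page of Action API results from mdwiki.org."""
    url = "https://mdwiki.org/w/api.php?" + urlencode(params)
    req = Request(url, headers={"User-Agent": "example-tool/0.1"})
    with urlopen(req) as resp:
        return json.load(resp)


def all_redirects(fetch=fetch_api):
    """Yield redirect page titles, following API continuation."""
    params = {
        "action": "query",
        "format": "json",
        "list": "allredirects",
        "arlimit": "max",  # up to 500 per request for normal users
    }
    while True:
        data = fetch(params)
        for page in data.get("query", {}).get("allredirects", []):
            yield page["title"]
        cont = data.get("continue")
        if cont is None:
            return
        params.update(cont)  # carry arcontinue into the next request
```

The `fetch` parameter is injected so the pagination logic can be
exercised without network access.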

Bryan
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Re: remote access for vps

2024-03-04 Thread Bryan Davis
On Mon, Mar 4, 2024 at 3:53 PM Tim Moody  wrote:
>
> The mysql host ISP does not wish to open a port to an entire gateway, so I 
> have two choices, install tailscale or request a floating IP. Is tailscale 
> permissible and possible on vps?

If there is a client with an OSI approved Open Source license, then
yes it would be permissible. If not, then likely no per the TOU. [0]
It does appear that https://github.com/tailscale/tailscale is licensed
under a 3-clause BSD license, so that is helpful. [1]

It does not at this point sound like your use case would be a
violation of the prohibition on network proxying [2] as the VPN would
be for traffic originating from your Cloud VPS instance and not a
generally open proxy for others or inbound traffic.

I wonder if there is an XY problem [3] here as well. What is the
underlying use case that requires you to connect to an off-premise
database to operate a Wikimedia focused tool in Cloud VPS? Is this
somehow related to your new project to generate ZIM files of MDWiki?
[4]

[0]: https://wikitech.wikimedia.org/wiki/Wikitech:Cloud_Services_Terms_of_use#4.3_Open_Source_and_proprietary_software
[1]: https://github.com/tailscale/tailscale/blob/main/LICENSE
[2]: https://wikitech.wikimedia.org/wiki/Wikitech:Cloud_Services_Terms_of_use#4.5_Using_WMCS_as_a_network_proxy
[3]: https://en.wikipedia.org/wiki/XY_problem
[4]: https://phabricator.wikimedia.org/T358023

Bryan
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Re: remote access for vps

2024-03-04 Thread Bryan Davis
On Mon, Mar 4, 2024 at 2:16 PM Bryan Davis  wrote:
>
> On Mon, Mar 4, 2024 at 12:43 PM Tim Moody  wrote:
> >
> > I am trying to reach a remote mysql database from a vps using python. The 
> > destination server has created an opening in the firewall for the request, 
> > but I am still getting network unreachable errors. I gave the ip of the 
> > proxy for the vps, obtained with ping, as the expected ip. I now tried 
> > accessing a web site I control from the same vps also in python using 
> > requests.get, and in the web server log I see the request coming from what 
> > looks to be a gateway (xxx.xxx.xxx.1, rather than the proxy 
> > xxx.xxx.xxx.nnn) Is this to be expected and can I rely on the address for 
> > such requests?
>
> Please, please, please do not expose MySQL/MariaDB to the general
> internet. Instead I would suggest that you use an ssh tunnel to
> connect your workstation with the remote instance. See
> https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#Connecting_to_the_database_replicas_from_your_own_computer
> for how this can be done in a specific case. For your case really just
> the target host (login.toolforge.org -> your instance) and database
> server should need to change.

I completely misread the direction of your connection. I apologize.

Yes, outbound connections from a Cloud VPS instance to the internet
will show at the remote end as coming from the shared outbound NAT
gateway IP. The only exception is if the Cloud VPS instance has a
"floating IP" that gives it a direct route to the Internet.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Re: remote access for vps

2024-03-04 Thread Bryan Davis
On Mon, Mar 4, 2024 at 12:43 PM Tim Moody  wrote:
>
> I am trying to reach a remote mysql database from a vps using python. The 
> destination server has created an opening in the firewall for the request, 
> but I am still getting network unreachable errors. I gave the ip of the proxy 
> for the vps, obtained with ping, as the expected ip. I now tried accessing a 
> web site I control from the same vps also in python using requests.get, and 
> in the web server log I see the request coming from what looks to be a 
> gateway (xxx.xxx.xxx.1, rather than the proxy xxx.xxx.xxx.nnn) Is this to be 
> expected and can I rely on the address for such requests?

Please, please, please do not expose MySQL/MariaDB to the general
internet. Instead I would suggest that you use an ssh tunnel to
connect your workstation with the remote instance. See
https://wikitech.wikimedia.org/wiki/Help:Toolforge/Database#Connecting_to_the_database_replicas_from_your_own_computer
for how this can be done in a specific case. For your case really just
the target host (login.toolforge.org -> your instance) and database
server should need to change.
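As a concrete sketch of that tunneling pattern (the instance and
database hostnames here are placeholders for your own setup, not real
endpoints):

```shell
# Forward local port 3306 through your Cloud VPS instance to the
# database server; -N means "no remote command, just forward".
ssh -N -L 3306:database.example.org:3306 \
    your-instance.your-project.eqiad1.wikimedia.cloud

# In another terminal, connect through the tunnel as if the
# database were running locally:
mysql --host=127.0.0.1 --port=3306 --user=youruser -p
```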

Bryan
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Open question about commit message linter rules and GitLab's merge requests

2023-11-28 Thread Bryan Davis
Recently I completed initial work to enable commit-message-validator
[0] to work with GitLab CI and GitLab merge requests. For those who
are unaware, commit-message-validator is a linter tool designed to
enforce Wikimedia's commit message guidelines. [1]

Soon after the feature was available, James Forrester added it to the
test suite for abstract-wiki/wikifunctions/function-orchestrator [2]
and found the first issue with the integration that I had not
anticipated which he helpfully filed as T351253. [3]

In my estimation, the problem comes down to a question of whether or
not we should prioritize making commit message footers read nicely in
GitLab's merge request interface, where they are rendered as
GitLab-flavored markdown. James' team has developed a convention of
appending a backslash (\) to footer lines so that they render as
individual lines when processed as markdown. This in turn leads to
commit-message-validator rejecting some footers, most obviously "Bug:
T" footers, for having unwanted characters (the trailing " \").
Reasonable people can disagree on the "best" solution here, but I
think it is likely that as a group we can reach consensus on what the
proper behavior of the commit-message-validator tool should be. The
most obvious options are:
* Change nothing in commit-message-validator and suggest folks live
with markdown rendering artifacts in GitLab merge request
descriptions.
* Change commit-message-validator to allow trailing " \" data for
commit message footers in GitLab repos.
* Change commit-message-validator to allow users (typically a CI
process) to configure allow/disallow of trailing " \" data for commit
message footers
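For context, the kind of footer check involved looks roughly like this
(a simplified sketch, not commit-message-validator's actual code; the
regex and the gitlab_style flag are illustrative stand-ins):

```python
import re

# Simplified stand-in for the real footer rules: "Bug: T<number>",
# optionally tolerating GitLab's trailing " \" line-break convention.
BUG_FOOTER = re.compile(r"^Bug: T\d+$")


def footer_ok(line, gitlab_style=False):
    """Return True if a commit message footer line passes the check."""
    if gitlab_style and line.endswith(" \\"):
        line = line[:-2]  # drop the trailing " \" before validating
    return bool(BUG_FOOTER.match(line))
```

The third option above amounts to exposing something like the
gitlab_style toggle as per-repo configuration.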

Adding support for per-repo rules configuration would be a first for
commit-message-validator. Until now it has provided a single
opinionated ruleset based on its interpretation of the commit message
guidelines. [1]

Folks who actually care about these minutiae (how git commit message
footers look in and out of GitLab markdown rendering) are encouraged
to think carefully and provide their opinions and supporting data on
T351253. [3]

[0]: https://www.mediawiki.org/wiki/Commit-message-validator
[1]: https://www.mediawiki.org/wiki/Gerrit/Commit_message_guidelines
[2]: https://gitlab.wikimedia.org/repos/abstract-wiki/wikifunctions/function-orchestrator/-/commit/dd9b43212fbc884c78e2729c78fac04d6eb6ad87
[3]: https://phabricator.wikimedia.org/T351253

Bryan
-- 
Bryan Davis  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


[Wikitech-l] Re: send mail in python

2023-05-16 Thread Bryan Davis
On Tue, May 16, 2023 at 7:07 AM Tim Moody  wrote:
>
> Thanks. That got me closer. I was not using ssl. When I took your code and 
> tried to connect to localhost I got
>
> STARTTLS extension not supported by server

*nod* The local relay is bound to localhost, so I guess we never
worried about adding transport security. I would hope that things
would work by just leaving out the starttls() call and related ssl
setup. I have not tested that myself at this point however.

> when I connected to mx-out03.wmcloud.org I got
>
> (220, b'TLS go ahead')
>
> But when I then do
>
>  server.sendmail("nore...@wikimedia.org", "t...@tim.com",
> ... """Subject: smtplib example from Cloud VPS
> ...   ...
> ...   ... Hello world.
> ...   ... ---
> ...   ... Bryan""")
>
> I get SSL: WRONG_VERSION_NUMBER

I saw that in one of my own tests as well. I don't have a good
explanation of what triggered it.

> In the console /usr/sbin/sendmail -v t...@tim.com 
> works just fine, so maybe I should just use subproc.

If shelling out to the sendmail client works for your use case, go for
it! Simpler is nicer in most things. :)

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Re: send mail in python

2023-05-15 Thread Bryan Davis
On Mon, May 15, 2023 at 2:57 PM Tim Moody  wrote:
>
> I'd like to send an email from a python3 process on a wmcs VPS to report 
> errors.
>
> I looked at https://wikitech.wikimedia.org/wiki/Help:Email_in_Cloud_VPS but 
> could use some help.
>
> sudo echo "Subject: sendmail test2" | /usr/sbin/sendmail -v  works.

The `sudo` here does nothing useful. It is bound to the `echo`
invocation and not the `sendmail` one.
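In other words, the redirection is set up before sudo runs; if
elevated rights were actually needed, the whole pipeline consumer
would have to run under sudo (the recipient address here is a
placeholder):

```shell
# Wrong: sudo elevates only `echo`; sendmail runs as the normal user.
sudo echo "Subject: sendmail test2" | /usr/sbin/sendmail -v user@example.org

# If elevation were required, sudo would have to wrap sendmail itself:
echo "Subject: sendmail test2" | sudo /usr/sbin/sendmail -v user@example.org

# Though in this case sendmail needs no elevated rights at all:
echo "Subject: sendmail test2" | /usr/sbin/sendmail -v user@example.org
```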

> When I try to send the equivalent from python smtplib I get a 221 error 
> message.

221 is the SMTP status code for closing connection/goodbye. This isn't
an error per se; rather, it means that the SMTP server has decided to
end the session.

* Are there other status codes you see from your attempted python code
prior to the 221?
* What SMTP server are you connecting to?
* Is the python code available somewhere for review?

Here is a quick example of sending email using Python 3.9 and smtplib
from inside Toolforge:

  $ become bd808-test -- webservice python3.9 shell -- python3.9
  >>> import smtplib
  >>> import ssl
  >>> context = ssl.create_default_context()
  >>> server = smtplib.SMTP("mail.tools.wmcloud.org", 587)
  >>> server.starttls(context=context)
  (220, b'TLS go ahead')
  >>> server.sendmail("bd808-test.maintain...@toolforge.org",
"bd...@wikimedia.org", """Subject: smtplib example from Toolforge
  ...
  ... Hello world.
  ... ---
  ... Bryan""")
  {}
  >>> server.quit()
  (221, b'mail.tools.wmcloud.org closing connection')

Things would typically look similar from a Cloud VPS project. The
major change would be to use mx-out03.wmcloud.org or
mx-out04.wmcloud.org as your outbound SMTP service. There is a bit
more complication in using TLS as well due to the x509 certificate
being a bit of a mess (bad subject and expired):

  $ ssh devportal-demo01.devportal.eqiad1.wikimedia.cloud
  $ python3.9
  >>> import smtplib
  >>> import ssl
  >>> context = ssl.create_default_context()
  >>> context.check_hostname = False
  >>> context.verify_mode = ssl.CERT_NONE
  >>> server = smtplib.SMTP("mx-out03.wmcloud.org", 25)
  >>> server.starttls(context=context)
  (220, b'TLS go ahead')
  >>> server.sendmail("bd...@wikimedia.org", "bd...@wikimedia.org",
"""Subject: smtplib example from Cloud VPS
  ...
  ... Hello world.
  ... ---
  ... Bryan""")
  {}
  >>> server.close()

I hope that helps a bit.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

[Wikitech-l] Re: ClassCrawler – extremely fast and structured code search engine

2022-02-04 Thread Bryan Davis
On Fri, Feb 4, 2022 at 9:40 AM  wrote:
>
> saves to MongoDB

This is problematic from the point of view of shared use within the
Wikimedia movement. MongoDB is a source available product under the
non-free SSPL license [0]. This license was invented by MongoDB and
submitted to the Open Source Initiative (OSI) for OSI approval and
then later withdrawn [1]. The OSI now has a page explaining why this
license is not likely to ever be given OSI approval [2].

This is all esoteric intellectual-property management to many people,
but the Wikimedia Cloud Services environment (Cloud VPS and Toolforge)
Terms of Use [3] require that software installed in these
environments be licensed under an OSI approved license. Thus MongoDB,
modern versions of Elasticsearch, and other SSPL licensed software are
not allowed. Even if the SSPL were OSI approved, it would still be
problematic in Cloud VPS & Toolforge, as the main point of the license
is to restrict cloud service providers from offering SSPL licensed
software as a service to their clients.

[0]: https://en.wikipedia.org/wiki/Server_Side_Public_License
[1]: https://lists.opensource.org/pipermail/license-review_lists.opensource.org/2019-March/003989.html
[2]: https://opensource.org/node/1099
[3]: https://wikitech.wikimedia.org/wiki/Wikitech:Cloud_Services_Terms_of_use#What_uses_of_Cloud_Services_do_we_not_like?

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


[Wikitech-l] Re: Developer Portal status update

2021-12-22 Thread Bryan Davis
On Wed, Dec 22, 2021 at 12:52 PM Amir E. Aharoni
 wrote:
>
> Will the content of this portal be translatable? How will the translation 
> process work?

Yes, the portal is being designed with localization in mind. Our
intent is to register as a translatewiki.net (TWN) project to allow
translations of the English source content. We expect the translation
workflow to look much like the workflow for any software project at
TWN using GNU PO files as the translation dictionary. Our messages
will be using Markdown syntax rather than MediaWiki syntax for things
like links, bold, and italic. This is something that we will need to
document at TWN or elsewhere for translators.

We evaluated an existing plugin for translation of mkdocs projects in
our initial proof of concept work. The evaluation convinced us that
things could be made to work, but that we would need to create our own
plugin with changes intended to make it easier to integrate with TWN.
The current plugin work is tracked in
<https://phabricator.wikimedia.org/T297168> and
<https://gerrit.wikimedia.org/r/c/wikimedia/developer-portal/+/747214/>.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


[Wikitech-l] Re: Goto for microoptimisation

2021-07-31 Thread Bryan Davis
 micro-optimisation.
>
> When performance is not a concern, abstractions can be introduced which 
> restructure the code so that it flows in a more conventional way. I 
> understand that you might do a double-take when you see "goto" in a function. 
> Unfamiliarity slows down comprehension. That's why I'm suggesting that it 
> only be used when there is a performance justification.

I am in agreement with Tim. I do not think that any of us should adopt
goto as a commonly used tool. I do however think there are situations
where a goto and comments actually do produce understandable code
which also conforms to a business requirement of keeping wall clock
execution time as small as possible.

Now I guess my question to the group is, how can we describe this
nuance as a replacement for statements in current coding conventions
like "Do not use the goto() syntax introduced in 5.3. PHP may have
introduced the feature, but that does not mean we should use it." [2]?
Could it be as simple as stating the bias more like "The use of `goto`
should be exceedingly rare, always accompanied by comments explaining
why it is used (likely for performance), and the author should be
prepared for others to challenge the usage"?

[0]: https://dl.acm.org/doi/10.1145/362929.362947
[1]: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD02xx/EWD215.html
[2]: https://www.mediawiki.org/wiki/Manual:Coding_conventions/PHP#Other

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


[Wikitech-l] Re: Proposal: new structure for MediaWiki RELEASE-NOTES

2021-07-20 Thread Bryan Davis
On Tue, Jul 20, 2021 at 5:22 AM Antoine Musso  wrote:
>
> Le 20/07/2021 à 09:48, Riccardo Coccioli a écrit :
>
> On Tue, Jul 20, 2021 at 3:38 AM Petr Pchelko  wrote:
>>
>> Alternative solutions
>>
>> We could write a custom merge driver for RELEASE-NOTES which always puts 
>> ‘ours’ before ’theirs’,
>> but I’m not sure the result will justify the investment.
>
>
> Probably overkill for MediaWiki but I'd like to mention the way that Python 
> developers (CPython) manages their release notes, in case it might be useful.
> The TL;DR is that each change has a unique valid reStructuredText file in a 
> specific directory and then there is a tool to merge all changes when a 
> release is made.
>
> The full process from a contributor point of view is described in [1].
> The tool used to both generate the change files and merge them into a release 
> file is [2].
>
> [1] 
> https://devguide.python.org/committing/#updating-news-and-what-s-new-in-python
> [2] https://pypi.org/project/blurb/
>
> OpenStack has a similar tool: reno. The doc has an overview of the 
> requirements: https://docs.openstack.org/reno/latest/user/design.html and the 
> usage doc for quick glance: 
> https://docs.openstack.org/reno/latest/user/usage.html
>
> There is an a 30 mins presentation https://www.youtube.com/watch?v=tEOGJ_h0Lx0
>
> Short of having to introduce a Python tool to the developers, maybe the PHP 
> ecosystem has a similar tool?  Or we can reach out to other high traffic 
> projects and see how they are managing their changelog and maybe forge a 
> common tool.

https://github.com/Automattic/jetpack-changelogger is a similar tool
to reno that is pure php and composer installable. Some of you may
recognize the primary author too if you check the git history there.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

Re: [Wikitech-l] Stuck/Missing Grid Job for tools.william-avery-bot

2021-03-26 Thread Bryan Davis
On Fri, Mar 26, 2021 at 3:27 PM William Avery  wrote:
>
> Hi,
>
> I got the email below telling me that my cron job running as 
> william-avery-bot had throw an error, and I noticed that the Grid job that it 
> kicks off hasn't run since.
>
> I tried deleting the job using the instructions at 
> https://wikitech.wikimedia.org/wiki/Help:Toolforge/Grid#Stopping_jobs_with_%E2%80%98qdel%E2%80%99_and_%E2%80%98jstop%E2%80%99
>  but it appeared "stuck".

I have "force deleted" your job using my Toolforge admin rights.

  $ sudo qdel -f 749
  root forced the deletion of job 749

The Toolforge grid engine had numerous problems yesterday which led to
the scheduler losing track of the state of many jobs. Brooke did
several rounds of looking for these and cleaning the queue state, but
obviously yours was not cleaned up in that process. Thank you for your
report, and I hope you can get your tool back into its proper working
state.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Fatal exception when calling replaceVariables() from parser hook, after upgrade to MW 1.34

2021-01-18 Thread Bryan Davis
On Mon, Jan 18, 2021 at 7:46 AM Mark Clements (HappyDog)
 wrote:
>
> Thanks, Máté,  That's helpful.
>
> What is the most appropriate way to reach-out to the Parsing Team?  I had
> assumed it would be via this list, but no-one else has responded to my
> question, so I guess not.
>
> The method that I am using is based on what I was told to do for the
> previous parser implementation.  It would be useful if there were some
> migration documentation to help extension developers migrate to the new
> implementation.  Is that on the roadmap, somewhere?

See <https://www.mediawiki.org/wiki/Parsoid/Extension_API> and
<https://www.mediawiki.org/wiki/Talk:Parsoid/Extension_API>. There is
also a bit of explanation about the changes at
<https://lists.wikimedia.org/pipermail/wikitech-l/2020-September/093827.html>
along with a link to a tech talk that Subbu gave on the topic.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808



Re: [Wikitech-l] Do you use MediaWiki-Vagrant to develop or test Wikimedia software?

2020-10-09 Thread Bryan Davis
On Fri, Oct 9, 2020 at 11:53 AM Bryan Davis  wrote:
>
> I'm hoping to gather a bit of information on the current use of
> MediaWiki-Vagrant by folks in the Wikimedia movement. I have started a
> slowvote poll in Phabricator at
> <https://phabricator.wikimedia.org/V24> that asks one question: "Do
> you personally use MediaWiki-Vagrant to develop or test software for
> the benefit of the Wikimedia movement?"
>
> I would really appreciate folks taking a couple minutes to click
> through to the poll and selecting one of the possible responses:
>
> * Yes, I use it daily
> * Yes, I use it occasionally
> * No, I used to use it but have switched to another dev environment
> * No, I tried it but it never worked well
> * No, I have never used it
>
> Data collected in the poll will be used to help me and others decide
> if it is worth putting in additional work to revitalize the group
> maintaining and improving MediaWiki-Vagrant.

It has come to my attention that your Phabricator user account must be
a member of the "Trusted-Contributors" group [0] in Phabricator to see
or respond to a slowvote poll. If you get an "Access Denied:
Restricted Application" response from
<https://phabricator.wikimedia.org/V24> this is the cause.

Trusted-Contributors [0] is a self-organizing group, which means that
anyone who is currently in the group can add additional people that
they also trust. If you need an invite to the group, try asking in one
of the Freenode IRC channels for technical contributors like
#wikimedia-tech, #wikimedia-dev, or #mediawiki.

[0]: https://phabricator.wikimedia.org/project/profile/3104/

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808



[Wikitech-l] Do you use MediaWiki-Vagrant to develop or test Wikimedia software?

2020-10-09 Thread Bryan Davis
I'm hoping to gather a bit of information on the current use of
MediaWiki-Vagrant by folks in the Wikimedia movement. I have started a
slowvote poll in Phabricator at
<https://phabricator.wikimedia.org/V24> that asks one question: "Do
you personally use MediaWiki-Vagrant to develop or test software for
the benefit of the Wikimedia movement?"

I would really appreciate folks taking a couple minutes to click
through to the poll and selecting one of the possible responses:

* Yes, I use it daily
* Yes, I use it occasionally
* No, I used to use it but have switched to another dev environment
* No, I tried it but it never worked well
* No, I have never used it

Data collected in the poll will be used to help me and others decide
if it is worth putting in additional work to revitalize the group
maintaining and improving MediaWiki-Vagrant.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808



Re: [Wikitech-l] [MediaWiki-l] Announcing MediaWiki 1.35.0

2020-09-30 Thread Bryan Davis
On Wed, Sep 30, 2020 at 2:51 AM Krabina Bernhard  wrote:
>
> This is great, thank you!
>
> As an LTS user, does anybody know about an overview what has changed since 
> 1.31 LTS?
>
> Would be a great help to have some information about differences/new 
> features/breaking changes between LTS versions and maybe also specific 
> upgrade instructions.

From https://www.mediawiki.org/wiki/Release_notes you can find these links:

* https://www.mediawiki.org/wiki/Release_notes/1.32
* https://www.mediawiki.org/wiki/Release_notes/1.33
* https://www.mediawiki.org/wiki/Release_notes/1.34
* https://www.mediawiki.org/wiki/Release_notes/1.35

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808



Re: [Wikitech-l] Help with Mailman spam filter

2020-08-27 Thread Bryan Davis
On Thu, Aug 27, 2020 at 2:45 AM Léa Lacroix  wrote:
>
> Hello all,
>
> I hope that this is the right list to ask about Mailman's admin options, if
> not, let me know where I should ask :)
>
> I'm looking for someone to help me with the config of a mailing-list, to
> reactivate the semi-automated spam filter.
>
> I'm one of the admins on the Wikidata mailing-list. A while ago, to prevent
> my newsletter to be considered as spam because it included too many links,
> I changed something in the config of the mailing-list, I don't remember
> exactly what,  but it was about not filtering spam for the admins anymore,
> and sending each single spam email (or emails from non-registered users) to
> the ML owners. Since then we end up with tons of emails in our mailboxes
> everyday. I tried to reactivate this "pre-filter" feature, but I can't
> remember where it is and how to do it right.
>
> Does anyone have a clue about how to fix this? Thanks in advance :)

https://lists.wikimedia.org/mailman/listinfo/listadmins may be a
better list for questions about configuring a mailman list.

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808



Re: [Wikitech-l] Gerrit v3.2.2 is live [was: Re: Gerrit upgrade on Saturday, 27th of June]

2020-07-01 Thread Bryan Davis
On Wed, Jul 1, 2020 at 8:28 AM Thiemo Kreuz  wrote:
>
> 2. I would love to apply some minor user CSS to the Gerrit UI, just as
> I did with the old version:
> https://meta.wikimedia.org/wiki/User:Thiemo_Kreuz_(WMDE)/userContent.css.
> Unfortunately the new UI heavily uses "shadow DOM", which means no
> matter what CSS rules I introduce, the Gerrit UI just ignores it.
> There must be a way. Please help.

I wasted my time so you don't have to! I too wanted to apply some
visual changes to the Gerrit 3.x UI. I ended up with what I know is a
horribly inefficient, but working, GreaseMonkey script that scans
the DOM and injects CSS into each shadow DOM object it finds. You can
find my code for this at
<https://github.com/bd808/userscripts/blob/gh-pages/wmfgerrit.user.js>.

Basically the code I put together walks the initial DOM checking every
element to see if it has a shadow DOM component. If it does, then the
script will 1) inject my style sheet into that DOM, 2) check every
element in that DOM for more shadow DOM nodes to "fix", and finally 3)
set up a MutationObserver to be alerted to any new DOM insertions and
check them for shadow DOM presence. I'm sure there are more elegant
implementations (my javascript fu is not the best), but this one is
working for now and does not seem to cause any significant load issues
in my browser.
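The recursive walk described above can be sketched like this. This is a simplified stand-in, not the linked userscript itself: plain objects with `children`/`shadowRoot` fields play the role of DOM elements, and `inject` is whatever callback adds your stylesheet to a shadow root.

```javascript
// Sketch of the shadow-DOM walk described above. Assumption: nodes are
// DOM-element-like objects exposing `children` and (optionally) `shadowRoot`;
// `inject` is a callback that adds the user stylesheet to a shadow root.
function injectIntoShadowRoots(node, inject) {
  let injected = 0;
  if (node.shadowRoot) {
    inject(node.shadowRoot); // 1) inject the stylesheet into this shadow DOM
    injected += 1;
    for (const child of node.shadowRoot.children || []) {
      injected += injectIntoShadowRoots(child, inject); // 2) look for nested shadow DOMs
    }
  }
  for (const child of node.children || []) {
    injected += injectIntoShadowRoots(child, inject);
  }
  return injected;
}
// Step 3) in a real browser: attach a MutationObserver that re-runs this
// walk on newly inserted nodes; omitted here because it needs a live DOM.
```

In the actual userscript the walk starts from the document root and `inject` appends a `<style>` element to each shadow root it visits.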

Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


Re: [Wikitech-l] [Ops] Backport and Config changes window (name change)

2020-06-09 Thread Bryan Davis
On Tue, Jun 9, 2020 at 3:33 PM AntiCompositeNumber
 wrote:
>
> We could go the SuSa route and call it the BACON window, for BAckports
> and CONfigs


$ scap say "Scappy approves of BACON as the window name"
 
/                                                \
|  Scappy approves of BACON as the window name   |
\                                                /
 
\
 \
  \
   ___ 
 ⎛   ⎛ ,
  \  //==--'
 _//|,.·//==--'
_OO≣=-  ︶ ᴹw ⎞_§ __  ___\ ___\ ,\__ \/ __ \
   (∞)_, )  ( |  __/__  \/ /__ / /_/ / /_/ /
 ¨--¨|| |- (  / __\/ \___/ \__^_/  .__/
 ««_/  «_/ jgs/bd808    /_/


Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


Re: [Wikitech-l] Fwd: [Wikimedia-l] Etherpad upgrade and a new skin

2020-05-20 Thread Bryan Davis
On Wed, May 20, 2020 at 12:36 AM Pine W  wrote:
>
> Forwarding an announcement. Feedback can go to Alexandros, who may or
> may not be subscribed to a mailing list where you see this email, so
> if you have something to say on list you may also want to include
> Alexandros' email address in the to: field. (By the way, on one
> previous occasion I saw a small number of highly hostile messages
> being directed at WMF staff regarding a UI change that I thought was
> reasonable. If you are a community member and something about this UI
> change upsets you, please vent to me first and then after you have
> finished venting you can send a calmer version of your message to
> Alexandros.)

I appreciate the offer made here by Pine to give you a way to
initially calm down, but if you do feel the need to vent about a UI
change done by an unaffiliated upstream project, please DO NOT vent
even calmly at Alexandros or anyone else on the Wikimedia Foundation's
SRE team or employed by the Wikimedia Foundation generally. Please
open your nearest window and share your complaints with the clouds in
the sky. They will be happy for the attention.


Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


Re: [Wikitech-l] How to get number of articles or related metrics for indic wiki's ?

2020-05-08 Thread Bryan Davis
On Fri, May 8, 2020 at 2:03 PM Shrinivasan T  wrote:
>
> Hello all,
>
>  I am planning to build a grafana dashboard using Prometheus for the counts
> of all indic wiki articles.

The dashboards you are thinking of may actually already exist.
Discovery of tools is a real problem in our community, and one I hope
to be able to work on more in the coming months.

Please take a look at
<https://stats.wikimedia.org/#/ta.wikipedia.org/content/pages-to-date/normal|line|2-year|~total|monthly>
as an example of the data that the Wikimedia Foundation's Analytics
team publishes to help folks keep track of trends across the Wikimedia
movement's project wikis. More information on the "Wikistats 2"
project can be found at
<https://wikitech.wikimedia.org/wiki/Analytics/Systems/Wikistats_2>
including information on how you can contribute to this project.

> Have to get all the counts and write a custom exporter.
>
>
> Planning for a dashboard showing counts for articles in all indic languages.
>
> Another dashboard to show counts for all wiki projects for selected
> language.
>
> Have few queries.
>
> 1. How to get the number of articles in a wiki for example, tamil wikipedia
> ?  Any api is there to get numbers?

Basic information on article counts can be fetched from each wiki
using the Action API's action=query&meta=siteinfo endpoint. See
<https://www.mediawiki.org/wiki/API:Siteinfo> for more information
about this API.

See 
<https://ta.wikipedia.org/wiki/%E0%AE%9A%E0%AE%BF%E0%AE%B1%E0%AE%AA%E0%AF%8D%E0%AE%AA%E0%AF%81:ApiSandbox#action=query&format=json&meta=siteinfo&siprop=statistics>
for an example usage on tawiki.
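Put together as code, the statistics request for tawiki looks roughly like this. The JSON shape follows the Siteinfo documentation, but the counts in the sample response are made-up illustrative values, not real numbers:

```javascript
// Build the siteinfo statistics request for tawiki.
const params = new URLSearchParams({
  action: 'query',
  format: 'json',
  meta: 'siteinfo',
  siprop: 'statistics',
});
const url = 'https://ta.wikipedia.org/w/api.php?' + params.toString();

// The response carries the counts under query.statistics.
// Illustrative (made-up) values only:
const sample = { query: { statistics: { pages: 450000, articles: 130000 } } };
const articleCount = sample.query.statistics.articles;
```

Fetching `url` and reading `query.statistics.articles` from the live response gives the current article count for the wiki.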

The Wikistats 2 project actually pulls its data from a public API as
well! The dashboard I linked above fetches data from
<https://wikimedia.org/api/rest_v1/metrics/edited-pages/aggregate/ta.wikipedia.org/all-editor-types/all-page-types/all-activity-levels/monthly/2018033100/2020050800>.
This is part of what is known as the "Wikimedia REST API". See
<https://wikimedia.org/api/rest_v1/#/Edited%20pages%20data> for more
information on this API collection.
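Because that REST URL is just a series of path segments, it is easy to assemble for any project. A sketch, using the segment names exactly as they appear in the URL above:

```javascript
// Assemble a monthly edited-pages aggregate URL for a given project and
// YYYYMMDDHH timestamp range (segment order as in the example URL above).
function editedPagesUrl(project, start, end) {
  return [
    'https://wikimedia.org/api/rest_v1/metrics/edited-pages/aggregate',
    project,
    'all-editor-types',
    'all-page-types',
    'all-activity-levels',
    'monthly',
    start,
    end,
  ].join('/');
}
```

Calling `editedPagesUrl('ta.wikipedia.org', '2018033100', '2020050800')` reproduces the dashboard URL quoted above; swapping the first argument gives the same metric for any other project wiki.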


> 2. Can we run a sparql query from our own server?
>
> 3. Once these dashboards are built, can we host custom exporter, Prometheus
> and grafana in tool server or any wiki cloud server? Whom to contact for
> hosting these ?

Toolforge is probably not a great place to host a Prometheus server
simply because the local disk that you would have available to store
the data sets would be hosted on the shared NFS server which provides
$HOME directories for Toolforge maintainers and their tools.

A Cloud VPS project would be capable of hosting the general software
described. See <https://wikitech.wikimedia.org/wiki/Help:Cloud_VPS_project>
for more information about what a Cloud VPS project is and how you
might apply to create one for your project.

Please be aware that a request to create the project described in this
email would likely receive a response encouraging you to collaborate
with the Wikistats 2 project to achieve your goals rather than making
a new project.

> Will do these in remote Hackathon this weekend.

I hope my answers here don't spoil your hackathon! Maybe try playing
around with Wikistats 2 and the APIs it uses and think of ways that
you could either add new features to Wikistats 2 or make a tool that
uses data from the same APIs that would be helpful to the Indic
language community.


Bryan
-- 
Bryan Davis  Technical Engagement  Wikimedia Foundation
Principal Software Engineer   Boise, ID USA
[[m:User:BDavis_(WMF)]]  irc: bd808


Re: [Wikitech-l] Coolest Tool Award 2019: Call for Nominations

2019-07-16 Thread Bryan Davis
On Tue, Jul 16, 2019 at 4:32 PM Brian Wolff  wrote:
>
> To clarify, does this mean tool in the sense of "tool"server? (Aka coolest
> thing hosted using cloud services) or is it more general including gadgets,
> standalone apps or any other piece of technology in the wikimedia ecosystem
> that's "cool"?

Any Wikimedia related software is eligible. Gadgets, Lua modules, user
scripts, bots no matter where they normally run, web tools no matter
where they are hosted, desktop tools, mobile apps, ...

We have a huge ecosystem of things that people build and use to make
working on the Wikimedia projects easier and the Coolest Tool Academy
wants to hear about all of them. You could honestly even nominate your
favorite GNU Linux distribution if you can explain how it makes life
as a Wikimedian better.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Dealing with composer dependencies in early MediaWiki initialization

2019-06-25 Thread Bryan Davis
On Tue, Jun 25, 2019 at 8:21 PM Kunal Mehta  wrote:
>
> I see 3 main ways to move forward:
>
> 1. Move vendor/autoload.php earlier in Setup.php, potentially breaking
> extensions that still rely on composer autoloading for initialization.
> 2. Set $wgServer = false or something in DefaultSettings.php, and then
> fill it in later in Setup.php *after* the composer autoloader has been
> loaded, potentially breaking anyone relying on the value of $wgServer
> in LocalSettings.php.
> 3. (status quo) not librarize code that runs before composer
> autoloader initialization. :(

There may be more entanglements here than I'm seeing, but I think that
there may be an option 4: add code in WebRequest to replace the use of
IP::splitHostAndPort() and IP::combineHostAndPort().

IP::combineHostAndPort() is trivial, and I think that
splitHostAndPort() could be replaced with a semi-clever call to
parse_url() that looked something like:

  $parts = parse_url( 'fake://' . $_SERVER[$varName] );
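The same trick works in other URL parsers too. Here is the idea sketched in JavaScript, with the WHATWG `URL` class standing in for PHP's `parse_url()`; this is an illustration of the approach, not the actual MediaWiki change:

```javascript
// Split "host:port" the way the parse_url() trick does: prepend a bogus
// scheme so the parser accepts a bare host[:port] string, then read the
// parsed hostname and port back out.
function splitHostAndPort(hostPort) {
  const u = new URL('fake://' + hostPort);
  return [u.hostname, u.port === '' ? null : Number(u.port)];
}
```

The bogus `fake://` scheme is the whole point: without it the parser would treat `en.wikipedia.org:8080` as a scheme plus opaque path instead of a host and port.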

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Gerrit HTTP Token Auth re-enabled

2019-06-24 Thread Bryan Davis
On Mon, Jun 24, 2019 at 3:53 PM Tyler Cipriani  wrote:
>
> Hi all!
>
> tl;dr: Gerrit HTTP token auth has been re-enabled. To use it you'll need to
> generate a token via your preferences page[0].
>
> Gerrit HTTP token auth was disabled in mid-march due to concerns about its
> implementation[1].
>
> Thanks to the work of Paladox and Gerrit upstream in Gerrit 2.15.14[2] we've
> re-enabled HTTP token authentication.
>
> I previously removed all HTTP auth tokens, so in order to use HTTP token auth
> you'll need to generate a fresh token via your preference page[0]
>
> Your Lowly Gerrit Fiddler,
> -- Tyler
>
> [0]. <https://gerrit.wikimedia.org/r/#/settings/http-password>
> [1]. <https://phabricator.wikimedia.org/T218750>
> [2]. <https://www.gerritcodereview.com/2.15.html#21514>

Thank you for the update Tyler and thank you to everyone who worked to
clear the security concerns with the feature.

I do not use it often, but being able to push patches to Gerrit from
an untrusted location (like a project local Puppet master in a Cloud
VPS project) with this workflow is pretty nice:
* Generate a fresh password at
https://gerrit.wikimedia.org/r/#/settings/http-password
* Git push to gerrit over https with username/password auth
* Regenerate a password at
https://gerrit.wikimedia.org/r/#/settings/http-password to invalidate
the password that was exposed to the untrusted instance/network

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] 1.34.0-wmf.6 status and cawikinews outage

2019-05-22 Thread Bryan Davis
On Wed, May 22, 2019 at 10:35 AM Andre Klapper  wrote:
>
> On Wed, 2019-05-22 at 17:29 +0100, RhinosF1 Wikipedia wrote:
> > what's on Group 1
>
> See the dropdown on https://tools.wmflabs.org/versions/

The TL;DR answer for what is in each group is basically:
* Group 0 == mediawiki.org, "testing" wikis, and "closed" wikis
* Group 1 == wikis that are not wikipedias or in group 0
* Group 2 == wikipedias

There are some more subtle distinctions, but that's the general setup
of the groups.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Fetching edit notices via API

2019-04-28 Thread Bryan Davis
On Sun, Apr 28, 2019 at 8:02 PM Bryan Davis  wrote:
>
> On Sun, Apr 28, 2019 at 5:02 PM Huji Lee  wrote:
> >
> > Hi all,
> > Is there an API call that would allow you to retrieve the edit notice for a
> > page?
>
> I did not find a specific Action API that does this, but the edit
> notices themselves are found at predictable URLs per
> <https://www.mediawiki.org/wiki/Manual:Interface/Edit_notice>:
> * MediaWiki:Editnotice-N (where N is the namespace id)
> * MediaWiki:Editnotice-N-T (where T is the page title, with any '/' in the
> title replaced with '-')
>
> You could also find all editnotice messages for a wiki using a prefix
> search generator:

<https://www.mediawiki.org/wiki/Special:ApiSandbox#action=query&format=json&formatversion=2&prop=revisions&rvprop=content&rvslots=main&generator=prefixsearch&gpssearch=Editnotice-&gpsnamespace=8&gpslimit=max>
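The query parameters behind that sandbox link can be assembled like this. A sketch: the `gps*` names are the prefixsearch generator's parameters, and `gpsnamespace=8` selects the MediaWiki namespace where these messages live:

```javascript
// Action API query: generator=prefixsearch over the MediaWiki namespace (8)
// for pages starting with "Editnotice-", fetching each page's wikitext.
const params = new URLSearchParams({
  action: 'query',
  format: 'json',
  formatversion: '2',
  prop: 'revisions',
  rvprop: 'content',
  rvslots: 'main',
  generator: 'prefixsearch',
  gpssearch: 'Editnotice-',
  gpsnamespace: '8',  // the MediaWiki: namespace
  gpslimit: 'max',
});
const url = 'https://www.mediawiki.org/w/api.php?' + params.toString();
```

Fetching `url` returns every editnotice message on the wiki along with its current wikitext.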

(sorry for the early post on the prior message)

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Fetching edit notices via API

2019-04-28 Thread Bryan Davis
On Sun, Apr 28, 2019 at 5:02 PM Huji Lee  wrote:
>
> Hi all,
> Is there an API call that would allow you to retrieve the edit notice for a
> page?

I did not find a specific Action API that does this, but the edit
notices themselves are found at predictable URLs per
<https://www.mediawiki.org/wiki/Manual:Interface/Edit_notice>:
* MediaWiki:Editnotice-N (where N is the namespace id)
* MediaWiki:Editnotice-N-T (where T is the page title, with any '/' in the
title replaced with '-')

You could also find all editnotice messages for a wiki using a prefix
search generator:

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


[Wikitech-l] Change to Wikitech logins: Username now case-sensitive

2019-04-15 Thread Bryan Davis
A change was deployed to the Wikitech config 2019-04-15T23:16 UTC
which prevents users from logging into the wiki with a username that
differs in case from the 'cn' value for their developer account.

This change is not expected to cause problems for most users, but
there may be some people who have historically entered a username with
mismatched case (for example "bryandavis" instead of "BryanDavis") and
relied on MediaWiki and the LdapAuthentication plugin figuring things
out. This will no longer happen automatically. These users will need
to update their password managers (or brains if they are not using a
password manager) to supply the username with correct casing.

The "wrongpassword" error message on Wikitech has been updated with a
local override to help people discover this problem. See
<https://phabricator.wikimedia.org/T165795> for more details.

Bryan, on behalf of the Cloud Services team
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Deployment-prep (aka 'Beta') services will be unreliable this week

2018-11-20 Thread Bryan Davis
On Tue, Nov 20, 2018 at 4:41 PM Mukunda Modell  wrote:
>
> Despite each of our efforts, there is still a configuration error
> preventing MediaWiki from working in beta. I've been stuck for several
> hours now but I'm hopeful that this is the last major issue and I'm sure
> it's simple for someone who understands MediaWiki internals better than I
> do. Unfortunately I'm completely stumped so I could really use some help
> from someone who understands the configuration of MediaWiki session storage
> and the underlying object cache.
>
> The problem is described in https://phabricator.wikimedia.org/T210030  so I
> won't repeat it here. I'll simply appeal for those of you who know
> something about how "BagOStuff" is configured, please take a look at
> T210030 and point me in the right direction.

The beta cluster wikis are working again. It turns out that there was
some confusion when moving/removing servers because of implementation
drift between our production clusters and the beta cluster.

Before the move to the new region we had both "memc*" and "redis*"
servers in the beta cluster project. The "memc*" servers are the
equivalent of our production "mc*" servers. In production the "mc*"
servers run both memcached and redis services. In the beta cluster our
"memc*" servers were only providing memcached and the configuration
relied on the "redis*" servers for session storage. The "redis*"
servers were removed while migrating virtual machines to the eqiad1-r
region under the assumption that they were legacy servers from the
time when we used redis as storage for the job queue. The fix was to
setup the "memc*" servers with both memcached for arbitrary data
caching and redis for session storage. If you are interested in the
gory details see notes left on
https://phabricator.wikimedia.org/T210030.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] [Wmfall] Datacenter Switchover recap

2018-09-12 Thread Bryan Davis
On Wed, Sep 12, 2018 at 11:16 AM, Alexandros Kosiaris
 wrote:
> Hello all,
>
> Today we've successfully migrated our wikis (MediaWiki and associated
> services)
> from our primary data center (eqiad) to our secondary (codfw), an exercise
> we've done for the 3rd year in a row. During the most critical part of the
> switch today, the wikis were in read-only mode for a duration of 7 and a
> half minutes - a significant improvement from last year.

Everyone involved worked hard to make this happen, but I'd like to
give a special shout out to Giuseppe Lavagetto for taking the time to
follow up on a VisualEditor problem that affected Wikitech
(<https://phabricator.wikimedia.org/T163438>). We noticed during the
April 2017 switchover that the client side code for VE was failing to
communicate with the backend component while the wikis were being
served from the Dallas datacenter. We guessed that this was a
configuration error of some sort, but did not take the time to debug
in depth. When the issue reoccurred during the current datacenter
switch, Giuseppe took a deep dive into the code and configuration,
identified the configuration difference that triggered the problem,
and made a patch for the Parsoid backend that fixes Wikitech.

Wikitech is a low volume wiki for both edits and reads, and for
various historical and technical reasons is different from all other
wikis that we host. Keeping it available for reading is important to
our technical teams because it hosts many of the troubleshooting
playbooks that we use to diagnose and correct operational problems on
the rest of the wikis. Taking the time to work on an editing bug that
only impacted edits done using VisualEditor is awesome, but not the
sort of thing I would normally expect to be worked on promptly. For
me, Giuseppe's work on this bug is a sign that that he cares about the
small details, and also that the rest of the switchover went well
giving him the time to investigate lower impact edge cases like this.


Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] Phabricator monthly statistics - 2018-07

2018-08-01 Thread Bryan Davis
On Wed, Aug 1, 2018 at 2:03 PM, Pine W  wrote:
> Hi Gergo,
>
> I understand that it might take 2 hours to complete a priority X task that
> has been open for 2 years, but depending on the definition of "high"
> priority, it seems to me that the median high priority task should be open
> for fewer than 2 years.
>
> Maybe this is a complex enough topic that it would be better discussed
> during one of the regular technology office hours. Do you have a suggestion
> about which office hour would be most appropriate, if you think that an
> office hour would be a good venue for a discussion?

It is also not completely obvious, but useful to remember, that
phabricator.wikimedia.org is a shared service used by many
organizations, teams, and individuals participating in the Wikimedia
movement's technical spaces. This in turn means that there is no
canonical workflow, no single 'owner' of determining process and
procedure, and no simple way to measure trends.

Any patterns that any of us think we see in global aggregate numbers
such as those in this report should be taken with a whole handful of
salt rather than just a pinch. :) Think of this report the same way
you would think of a report by GitLab, BitBucket, or GitHub about
activity across all of their hosted projects and tracking boards.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] How to make oauth authentication with wikipedia?

2018-07-29 Thread Bryan Davis
On Sun, Jul 29, 2018 at 12:37 AM rupert THURNER
 wrote:
>
> if one takes an example, lke https://tools.wmflabs.org/video2commons/, is
> this implemented like it should? is there any difference from "any"
> application or applications on the tools server? am looking at the code
> here currently:
> https://github.com/toolforge/video2commons/blob/master/video2commons/frontend/app.py
> the "dologin" method.

Yes, there is a major difference between a web application like the
video2commons tool and a device native application like an Android
app. That difference is that in a web application secret data can be
kept on the web server side that is not visible to the end user. This
allows the OAuth application secret to be used in signing requests to
the Wikimedia servers without exposing that secret to anyone who is
looking at the source code of the web application. This separation is
not possible when the application is running on end-user controlled
devices as a phone or desktop application does.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Technical EngagementBoise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] dumps.wikimedia.org web and rsync services down

2018-06-28 Thread Bryan Davis
On Thu, Jun 28, 2018 at 12:46 PM, Bryan Davis  wrote:
> The https://dumps.wikimedia.org web interface for downloading various
> dump files is currently offline. The rsync service for external
> mirroring is as well. Local network NFS consumers may or may not be
> working depending on which server the consumer is attached to.
>
> This unexpected outage is the result of hardware issues following a
> short planned maintenance. We are currently investigating the root
> cause of the outage and will post additional updates as they become
> available. Thanks for your patience.

All dumps.wikimedia.org services (web download, rsync mirroring, and
NFS mounts) should be back to normal working status. If you are
interested in the details of what went wrong to cause this incident
you can read <https://phabricator.wikimedia.org/T196651> and the
followup tasks that are created for it.

Thank you again for your patience and understanding.


Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Cloud Services  Boise, ID USA
irc: bd808v:415.839.6885 x6855


[Wikitech-l] dumps.wikimedia.org web and rsync services down

2018-06-28 Thread Bryan Davis
The https://dumps.wikimedia.org web interface for downloading various
dump files is currently offline. The rsync service for external
mirroring is as well. Local network NFS consumers may or may not be
working depending on which server the consumer is attached to.

This unexpected outage is the result of hardware issues following a
short planned maintenance. We are currently investigating the root
cause of the outage and will post additional updates as they become
available. Thanks for your patience.

Bryan
-- 
Bryan Davis  Wikimedia Foundation
[[m:User:BDavis_(WMF)]] Manager, Cloud Services  Boise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] [Proposal] Change terminology used for wikitech/LDAP/Toolforge/Labs accounts

2018-03-23 Thread Bryan Davis
On Fri, Feb 23, 2018 at 1:57 PM, Bryan Davis <bd...@wikimedia.org> wrote:
> TL;DR see <https://phabricator.wikimedia.org/T179461> for a proposed
> naming change.

Quiddity closed the proposal as accepted today! I have been making
edits on Wikitech to reflect the new "Wikimedia developer account"
term. I also have proposed a page move and content update for
<https://www.mediawiki.org/wiki/Developer_access>. If anyone wants to
help update content on mediawiki.org and other wikis that would be
very much appreciated.

Bryan
-- 
Bryan Davis  Wikimedia Foundation<bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]] Manager, Cloud Services  Boise, ID USA
irc: bd808v:415.839.6885 x6855


[Wikitech-l] [Proposal] Change terminology used for wikitech/LDAP/Toolforge/Labs accounts

2018-02-23 Thread Bryan Davis
TL;DR see <https://phabricator.wikimedia.org/T179461> for a proposed
naming change.

I proposed in Phabricator a while ago [0] that we adopt the term
"Wikimedia developer account" across our wikis and other documentation
for the LDAP-backed user accounts that are created using
wikitech.wikimedia.org.

These same sign-on credentials are used for:
* wikitech.wikimedia.org
* Gerrit
* Phabricator (optionally)
* toolsadmin.wikimedia.org
* horizon.wikimedia.org
* ssh-based "shell access" to Toolforge and Cloud VPS servers
* a variety of web services providing access to operational and
analytics data related to the Wikimedia services

There are plans [1][2] in various stages of progress to change things
about how developer accounts are created and managed. Breaking
assumptions in our documentation about the back-end storage system
(LDAP) and the front-end management interface (wikitech) will help
make these changes less disruptive. It also helps remove another
lingering "labs" reference that the Cloud Services rebranding efforts
have been trying to address.

This change probably has almost no impact on existing technical
community members. It is really just targeted at making things a bit
less confusing for newcomers.

If you have thoughts or concerns about this proposal, please describe
them on the Phabricator task [0]. I'm proposing that on or about March
23rd (4 weeks from the date of this message) the comments on the task
will be evaluated for consensus and an approve/deny decision.


[0]: https://phabricator.wikimedia.org/T179461
[1]: https://phabricator.wikimedia.org/T161859
[2]: https://phabricator.wikimedia.org/T179463

Thanks,
Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] MediaWiki-Vagrant's master branch migrated to Debian Stretch

2018-01-15 Thread Bryan Davis
I have just merged the "stretch-migration" feature branch to master
for MediaWiki-Vagrant.

Major changes:
* Debian Stretch (Debian 9) base image
* Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
* Database is MariaDB 10.1
* Puppet 4

Once you update your local MediaWiki-Vagrant clone to 59e3b49c or
later you will need to create a new VM based on the Debian Stretch
base image in order to use `vagrant provision`. Upgrading your local
VM may be as easy as using `vagrant destroy` to delete the current VM
followed by `vagrant up` to make a new one. Note that this will *not*
save the contents of any local wikis in the VM. You will need to
manually back up and restore the databases or export and import pages
you have created.

See <https://phabricator.wikimedia.org/T181353> for more information
and a few known open bugs.


I have also created a jessie-compat branch that can be used by users
who are not ready to destroy their current Jessie based virtual
machines and start over with Stretch. A simple `git checkout
jessie-compat` should be all that is needed to switch your local
MediaWiki-Vagrant clone to the new compatibility branch. This branch
will probably receive few updates, so you are encouraged to create new
Stretch based VMs soon.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [Help Needed] MediaWiki-Vagrant: Debian Stretch testing and migration

2018-01-04 Thread Bryan Davis
On Mon, Dec 18, 2017 at 10:07 PM, Bryan Davis <bd...@wikimedia.org> wrote:
> Work has begun to upgrade the base distribution for MediaWiki-Vagrant
> to Debian Stretch [0]. The Wikimedia production cluster is preparing
> for a similar upgrade [1] which will in part allow the Wikimedia wikis
> to migrate to PHP 7 [2].
>
> == What's new in the stretch-migration branch? ==
> * Debian Stretch (Debian 9) base image
> * Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
> * Database is MariaDB 10.1
> * Provisioning via Puppet 4
>
> Setting up a basic wiki (no roles enabled) seems to work pretty well.
> Additional roles need testing and may require updated Puppet manifests
> (Puppet syntax updates, erb syntax updates, package name changes,
> additional packages). Help is needed to test roles, file bugs, and
> submit patches. With some help I think we can be ready to switch to
> Stretch as the default base image in early/mid January.
>
> == Testing the Stretch base image and Puppet profiles ==
>
> It's recommended to test with a fresh MediaWiki-Vagrant checkout so if
> things go badly you can easily switch back to your original install
> and keep working.
>
>   $ git clone --recursive
> https://gerrit.wikimedia.org/r/mediawiki/vagrant mwv-stretch
>   $ cd mwv-stretch
>   $ git checkout stretch-migration
>   $ ./setup.sh
>   $ vagrant up
>
> You can run `vagrant roles list -e -1` to get a nice list of the roles
> you have enabled on your normal Trusty VM install to copy over to your
> Stretch testing VM. This one-liner might even do it for you:
>
>   $ cd mwv-stretch
>   $ vagrant roles enable $(cd ../vagrant; vagrant roles list -e -1)
>   $ vagrant provision
>
>
> [0]: https://phabricator.wikimedia.org/T181353
> [1]: https://phabricator.wikimedia.org/T174431
> [2]: https://phabricator.wikimedia.org/T176370

Work has gone pretty well on the stretch-migration branch. I'd like to
merge the branch to master and make it the default experience on or
around 2018-01-18. Having a major feature branch like this alive for
more than a month makes integrating the changes more difficult. It
also makes things harder for people who are making improvements on the
master branch.

It would be great to have some more users testing it out to make sure
the roles they rely on for day-to-day MediaWiki-Vagrant work are
functional there. Special thanks to Stas, Gilles, Hashar, and Željko for
the bug reports and patches that they have already supplied.


Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] [Help Needed] MediaWiki-Vagrant: Debian Stretch testing and migration

2017-12-18 Thread Bryan Davis
Work has begun to upgrade the base distribution for MediaWiki-Vagrant
to Debian Stretch [0]. The Wikimedia production cluster is preparing
for a similar upgrade [1] which will in part allow the Wikimedia wikis
to migrate to PHP 7 [2].

== What's new in the stretch-migration branch? ==
* Debian Stretch (Debian 9) base image
* Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
* Database is MariaDB 10.1
* Provisioning via Puppet 4

Setting up a basic wiki (no roles enabled) seems to work pretty well.
Additional roles need testing and may require updated Puppet manifests
(Puppet syntax updates, erb syntax updates, package name changes,
additional packages). Help is needed to test roles, file bugs, and
submit patches. With some help I think we can be ready to switch to
Stretch as the default base image in early/mid January.

== Testing the Stretch base image and Puppet profiles ==

It's recommended to test with a fresh MediaWiki-Vagrant checkout so if
things go badly you can easily switch back to your original install
and keep working.

  $ git clone --recursive
https://gerrit.wikimedia.org/r/mediawiki/vagrant mwv-stretch
  $ cd mwv-stretch
  $ git checkout stretch-migration
  $ ./setup.sh
  $ vagrant up

You can run `vagrant roles list -e -1` to get a nice list of the roles
you have enabled on your normal Trusty VM install to copy over to your
Stretch testing VM. This one-liner might even do it for you:

  $ cd mwv-stretch
  $ vagrant roles enable $(cd ../vagrant; vagrant roles list -e -1)
  $ vagrant provision


[0]: https://phabricator.wikimedia.org/T181353
[1]: https://phabricator.wikimedia.org/T174431
[2]: https://phabricator.wikimedia.org/T176370

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] 2017-12-13 Scrum of Scrums meeting notes

2017-12-14 Thread Bryan Davis
On Thu, Dec 14, 2017 at 4:30 PM, Grace Gellerman
<ggeller...@wikimedia.org> wrote:
> https://www.mediawiki.org/wiki/Scrum_of_scrums/2017-12-13

I missed getting the Cloud Services update into the etherpad, but it
was pretty boring:

* Team offsite and KubeCon/CloudNativeCon last week in Austin

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [AI] New ORES FAQ

2017-11-22 Thread Bryan Davis
(dropped ai@ from the CC)

On Tue, Nov 21, 2017 at 6:49 PM, Gergo Tisza <gti...@wikimedia.org> wrote:
> On Tue, Nov 21, 2017 at 4:20 PM, Aaron Halfaker <ahalfa...@wikimedia.org>
> wrote:
>
>> I thought Wikitech makes sense for a wikimedia-specific initiative.
>>
>
> A documentation SIG is not really Wikimedia-specific though (or did you
> mean "ORES documentation SIG" specifically?).

Sarah is working on technical writing projects related specifically to
ORES and Cloud Services this quarter (and hopefully for many more
quarters to come). This work is part of the "technical community
building" program from the Technology department's annual plan [0].

Sarah can jump in to say more, but my understanding is that the
initial work of the SIGDOCS is intended to be focused pretty narrowly
on ORES, Cloud VPS, and Toolforge. That certainly does not mean that
we do not have other documentation gaps (we do) or that others are
barred from organizing to work on things like MediaWiki API docs (they
are not). It does mean however that we are making choices about focus
with the hope that this will help make more actionable plans about
which docs to work on first and what sort of work to do.

> Also, most Wikimedia-specific technology initiatives are on mw.org (all the
> Audiences projects, for example).

Where the tracking page for SIGDOCS lives is probably the least
important issue related to organizing the group, but I think your
point is well taken.

> Due to being managed differently, wikitech is not a great work environment
> IMO. There is no unified login, no page translation support, no structured
> discussion support, no pageview metrics, the registration process is tied
> to creating LDAP and shell accounts... There are plans to improve this
> eventually [1], but between that and the audience differences, I'd stay
> away for now.

All of this is true, but then also very false when you learn that
improving the Cloud VPS and Toolforge docs is at least 50% of the
project to be undertaken.


I'd like to second Gergo's initial enthusiasm about Sarah stepping up
to help organize work on documentation. She has a great background in
technical writing as an individual contributor and experience teaching
writing-related topics that I hope we can all leverage to get better
at the craft of making clear and useful documentation.


[0] 
https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Annual_Plan/2017-2018/Draft/Programs/Technology#Program_4:_Technical_community_building
> [1] https://phabricator.wikimedia.org/T123425

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Thoughts for handy features

2017-11-17 Thread Bryan Davis
On Thu, Nov 16, 2017 at 11:15 PM, John Elliot V <j...@jj5.net> wrote:
> 2. For duplicated content it would be handy if you could define a bunch
> of "variables" down the bottom of a page and then reference them from
> elsewhere. I am aware of templates, but those are overkill and difficult
> to maintain per my use case (my use case is documenting the "purpose" of
> a computer, I duplicate this in various places, but don't want to
> maintain templates for that).

This sounds like something that you could do with
<https://www.mediawiki.org/wiki/Extension:Cite>. Citations are really
just footnotes and each can be named when defined and then reused by
reference at other places in the same article. There is a example of
this at 
<https://www.mediawiki.org/wiki/Extension:Cite#Multiple_uses_of_the_same_footnote>.
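For the computer-purpose use case described above, a named footnote can be defined once and reused by name; a minimal wikitext sketch, where the page text and the reference name "purpose" are invented for illustration:

```wikitext
This host's purpose: primary file server.<ref name="purpose">Primary
file server for the internal lab network.</ref>

<!-- Elsewhere on the same page, reuse the note by name alone: -->
Nightly backups also run on this host.<ref name="purpose" />

== Notes ==
<references />
```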

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Try out the commit message validator

2017-11-07 Thread Bryan Davis
On Mon, Nov 6, 2017 at 11:59 PM, Niklas Laxström
<niklas.laxst...@gmail.com> wrote:
> 2017-11-07 7:58 GMT+02:00 Kunal Mehta <lego...@member.fsf.org>:
>>
>> But sometimes people aren't familiar with the guidelines, or more likely
>> we make a typo somewhere. Here's where the commit-message-validator[2]
>> comes in handy!
>>
>
> Does the tool check for typos? Typos in commit messages get merged all the
> time, including my own. I would like to avoid that :)

We could probably add checks for some common ones if someone compiled a list.

Running a full spell check would be difficult because of the number of
false positives there would be based on a "normal" dictionary. Commit
messages often contain technical jargon (maybe something to try and
avoid) and snippets of code (e.g. class names like
TemplatesOnThisPageFormatter) that would not be in any traditional
dictionary that we could count on being on the local host.
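The compiled-list idea could be sketched roughly as follows; the typo map and the CamelCase-skipping heuristic are invented for illustration and are not part of commit-message-validator:

```python
# Hypothetical sketch: flag only known common typos in a commit message,
# skipping code-like CamelCase tokens to avoid false positives on jargon.
import re

# Hand-curated list of frequent misspellings (illustrative only).
COMMON_TYPOS = {"teh": "the", "recieve": "receive", "seperate": "separate"}

def check_typos(message: str) -> list[str]:
    warnings = []
    for word in re.findall(r"[A-Za-z]+", message):
        if any(c.isupper() for c in word[1:]):
            continue  # looks like a class name such as TemplatesOnThisPageFormatter
        fix = COMMON_TYPOS.get(word.lower())
        if fix:
            warnings.append(f"'{word}' -> '{fix}'")
    return warnings

print(check_typos("Fix teh TemplatesOnThisPageFormatter output"))
# → ["'teh' -> 'the'"]
```

A real check would want a much larger list and perhaps per-repo additions, but the skip rule shows how jargon could be excluded without a full dictionary.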

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] 2017-09-27 Scrum of Scrums meeting notes

2017-09-28 Thread Bryan Davis
On Thu, Sep 28, 2017 at 10:53 AM, Federico Leva (Nemo)
<nemow...@gmail.com> wrote:
>> Turning off OCG. Investigating using chromium for printing.
>
> The first sentence is unclear. I think it means "We would like to turn off
> OCG" (i.e. "replace OCG"), where the second sentence indicates a step to
> that purpose. Please confirm.

The best description I can find in Phabricator for "Turning off OCG"
is <https://phabricator.wikimedia.org/T150871>. There is also
<https://www.mediawiki.org/wiki/Reading/Web/PDF_Functionality>.

There was also a blurb in Tech News
(<https://meta.wikimedia.org/wiki/Tech/News/2017/37>):

> You can't use OCG to create PDFs after 1 October. This is because
> of technical problems. You can use Electron instead. Most PDFs are
> already created with Electron. Electron will get missing features
> before 1 October. You can create books but they will not have all
> planned features until November or December. You can read more on
> mediawiki.org.

Hopefully someone working on the project can give pointers to the
chromium investigation.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Previous mediawiki version test wiki

2017-09-21 Thread Bryan Davis
On Thu, Sep 21, 2017 at 2:52 PM, יגאל חיטרון <khit...@gmail.com> wrote:
> Very well, thank you. And I thought it would not be a problem at all. But
> I'll try at least.
> -
> Hello, people! I believe you should think about the possibility of creating
> a new test wiki that would be a snapshot of deployments with a one-week
> delay. So, every Thursday, at the moment group 2 moves from version 4 to
> version 5, this wiki would move from version 3 to version 4. It should
> follow group 2, because you'll not create three new wikis, of course, so it
> would help the most wikis. Users of the other groups would wait a day or two.
> Why do I think it is helpful? When filing a Phabricator task about some new
> bug, you always want to know: is it really new, or did I just not pay
> attention to it before? And when I do know it's a new bug, I can open both
> versions at the same time and compare the behaviour for this bug, and also
> compare the console results - what exactly changed in the HTML, the CSS,
> and the reactions to JS commands. IMHO it can be a powerful tool for better
> descriptions of new bugs in Phabricator - not for developers, who remember
> all the changes in the last deployment, but for the users who find the bugs.
> Even if it is heavy to maintain, I think the benefit is much greater. Could
> you think about this, please?
> Thank you in advance,
> Igal (User:IKhitron)

This use case is exactly the reason that group0 (testwiki, test2wiki,
testwikidatawiki, mediawikiwiki) and group1 (non-Wikipedia wikis) get
code before group2 (all wikipedias). The order of operations is just
reversed from what you are asking for. Group0 gets the new release on
Tuesday, then group1 on Wednesday, and finally group2 on Thursday.

Why in this order? We want people to test and find bugs *before* they
hit the larger wikis.
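The weekly order described above, as a tiny sketch (group contents paraphrased from this message):

```python
# The Wikimedia "deployment train": each group of wikis receives the
# new branch one day after the previous group, so bugs surface on
# smaller wikis before reaching all Wikipedias.
TRAIN = {
    "Tuesday": "group0 (test wikis, mediawiki.org)",
    "Wednesday": "group1 (non-Wikipedia wikis)",
    "Thursday": "group2 (all Wikipedias)",
}

for day, group in TRAIN.items():
    print(f"{day}: {group}")
```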

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] HHVM vs. Zend divergence

2017-09-19 Thread Bryan Davis
On Tue, Sep 19, 2017 at 9:21 AM, C. Scott Ananian
<canan...@wikimedia.org> wrote:
> There are other big users of HHVM -- do we know what other members of the
> larger community are doing?  We've heard that Phabricator intends to follow
> PHP 7.  Etsy also shifted to HHVM, do we know what their plans are?

Etsy 'experimented' with HHVM [0] and then eventually switched to PHP 7
as their primary runtime. The blog posts about this are a little
scattered, but Rasmus spoke about it [1] and Etsy started the phan
project [2].

For what it's worth, my opinion is that PHP is an actual FLOSS
software project with years of history and core contributions from
Zend who make their living with PHP. HHVM is a well funded internal
project from Facebook that has experimented with FLOSS but ultimately
is controlled by the internal needs of Facebook. For me the choice
here is obviously to back the community driven FLOSS project and help
them continue to thrive.

[0]: https://codeascraft.com/2015/04/06/experimenting-with-hhvm-at-etsy/
[1]: https://codeascraft.com/speakers/rasmus-lerdorf-deploying-php-7/
[2]: https://github.com/phan/phan

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] Introducing the Cloud Services Team: What we do, and how we can help you (blog post)

2017-09-11 Thread Bryan Davis
A blog post about the Wikimedia Cloud Services team and the products
they help maintain is live on the Wikimedia blog:
<https://blog.wikimedia.org/2017/09/11/introducing-wikimedia-cloud-services/>

The post talks a bit about why we formed the Wikimedia Cloud Services
team and what the purpose of the product rebranding we have been
working on is. It also gives a shout out to a very small number of the
Toolforge tools and Cloud VPS projects that the Wikimedia technical
community make. I wish I could have named them all, but there are just
too many!

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Recommending Firefox to users using legacy browsers?

2017-09-04 Thread Bryan Davis
On Mon, Sep 4, 2017 at 6:08 PM, Risker <risker...@gmail.com> wrote:
> Gonna be honest...after using Firefox almost exclusively for the last 10
> years whenever I had a choice, I'm ready to give up on it. I don't expect
> all the bells and whistles (and privacy compromises) of the big commercial
> browsers, but Firefox has decided to take a path that is actively awful.
> It's not just awful on Wikipedia (where I know logged-in users with lots of
> preferences and scripts are always going to be slow), it is awful on every
> website I go to, and it crashes on a multiple-times-a-day basis.  It does
> this on all three of my computers.  I've been trying to stay loyal and look
> at the bigger "free knowledge" bit...but I have had six crashes today and
> I'm done.  I hear this a lot from people I know outside of Wikimedia, and
> I've been told its unreliability is why several companies have decided
> against adding it (or have removed it) as an acceptable alternate browser.
>
> So no, I do not think it would be a good idea for anyone, let alone the
> Wikimedia Foundation, to advocate on behalf of this software.

As long as we are going on anecdotal evidence, I run Firefox ESR
52.3.0 on an OSX laptop all day every day and cannot remember the
last crash I had. I do shut down the browser every evening, which may
or may not avoid serious memory leaks. In my past experience, Firefox
crashes were almost always correlated with buggy user-installed,
community-developed extensions.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] Labs and Tool Labs being renamed

2017-07-12 Thread Bryan Davis
TL;DR:
* "Tool Labs" is being renamed to "Toolforge"
* The name for our OpenStack cluster is changing from "Labs" to "Cloud VPS"
* The preferred term for projects such as Toolforge and Beta Cluster
running on Cloud VPS is "VPS projects"
* "Data Services" is a new collective name for the databases, dumps,
and other curated data sets managed by the Cloud Services team
* "Wiki replicas" is the new name for the private-information-redacted
copies of Wikimedia's production wiki databases
* No domain name changes are scheduled at this time, but we control
wikimediacloud.org, wmcloud.org, and toolforge.org
* The Cloud Services logo will still be the unicorn rampant on a green
field surrounded by the red & blue bars of the Wikimedia Community
logo
* Toolforge and Cloud VPS will have distinct images to represent them
on wikitech and in other web contexts

In February when the formation of the Cloud Services team was
announced [0] there was a foreshadowing of more branding changes to
come:

> This new team will soon begin working on rebranding efforts intended
> to reduce confusion about the products they maintain. This refocus
> and re-branding will take time to execute, but the team is looking
> forward to the challenge.

In May we announced a consultation period on a straw man proposal [1]
for the rebranding efforts [2][3]. Discussions that followed both on
and off wiki were used to refine the initial proposal [4]. During the
hackathon in Vienna the team started to make changes on Wikitech
reflecting both the new naming and the new way that we are trying to
think about the large suite of services that are offered. Starting
this month, the changes that are planned [5] are becoming more visible
in Phabricator and other locations.

It may come as a surprise to many of you on this list, but many
people, even very active movement participants, do not know what Labs
and Tool Labs are and how they work. The fact that the Wikimedia
Foundation and volunteers collaborate to offer a public cloud
computing service that is available for use by anyone who can show a
reasonable benefit to the movement is a surprise to many. When we made
the internal pitch at the Foundation to form the Cloud Services team,
the core of our arguments were the "Labs labs labs" problem [6] and
this larger lack of awareness for our Labs OpenStack cluster and the
Tool Labs shared hosting/platform as a service product.

The use of the term 'labs' in regards to multiple related-but-distinct
products, and the natural tendency to shorten often used names, leads
to ambiguity and confusion. Additionally the term 'labs' itself
commonly refers to 'experimental projects' when applied to software;
the OpenStack cloud and the tools hosting environments maintained by
WMCS have been viable customer facing projects for a long time. Both
environments host projects with varying levels of maturity, but the
collective group of projects should not be considered experimental or
inconsequential.

[0]: https://lists.wikimedia.org/pipermail/labs-l/2017-February/004918.html
[1]: https://en.wikipedia.org/wiki/Straw_man_proposal
[2]: https://lists.wikimedia.org/pipermail/labs-l/2017-May/005002.html
[3]: https://lists.wikimedia.org/pipermail/wikitech-l/2017-May/088184.html
[4]: 
https://wikitech.wikimedia.org/wiki/User:BryanDavis/Rebranding_Cloud_Services_products
[5]: https://phabricator.wikimedia.org/T168480
[6]: https://wikitech.wikimedia.org/wiki/Labs_labs_labs

Bryan (on behalf of the Wikimedia Cloud Services team)
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Author of wikipedia images

2017-07-11 Thread Bryan Davis
On Tue, Jul 11, 2017 at 10:40 PM, Trung Dinh <t...@fb.com> wrote:
> Hi everyone,
>
> How can we programmatically (or from the data dump) get the author of the 
> main image shown in a Wikipedia article. For example, the image for Barrack 
> Obama page is 
> https://en.wikipedia.org/wiki/Barack_Obama#/media/File:President_Barack_Obama.jpg
>  and the author of the image is Pete 
> Souza<https://en.wikipedia.org/wiki/Pete_Souza>.

Try action=query&prop=imageinfo:
<https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&prop=imageinfo&titles=File%3APresident_Barack_Obama.jpg&iilimit=2&iiprop=user%7Ccanonicaltitle%7Curl%7Ccommonmetadata&iiextmetadatamultilang=1>

This is what MultimediaViewer uses behind the scenes to describe the
image when shown in lightbox mode:
<https://phabricator.wikimedia.org/diffusion/EMMV/browse/master/resources/mmv/provider/mmv.provider.ImageInfo.js>
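A minimal standard-library sketch of composing such an imageinfo query; the parameter set mirrors the query described above, and fetching the resulting URL (not done here) returns JSON that includes the uploading user:

```python
# Build an imageinfo API request URL for the English Wikipedia.
# The iiprop values request the uploader, canonical title, and file URL.
from urllib.parse import urlencode

params = {
    "action": "query",
    "format": "json",
    "prop": "imageinfo",
    "titles": "File:President Barack Obama.jpg",
    "iiprop": "user|canonicaltitle|url",
}
url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

Fetching `url` with any HTTP client returns the page/imageinfo structure that MultimediaViewer consumes; "Pete Souza" appears in the "user" field for this file.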

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] How does a build process look like for a mediawiki extension repository?

2017-06-15 Thread Bryan Davis
On Thu, Jun 15, 2017 at 10:58 AM, Jon Robson <jdlrob...@gmail.com> wrote:
>
> What's the best wiki page to get an overview of how deployment to the beta
> cluster/production works? I'd like to tinker with these and see if I can
> get one of those steps running npm jobs.

The beta cluster process is to update the repos and then run
https://wikitech.wikimedia.org/wiki/Wikimedia_binaries#scap_sync. The
composer packages are not managed by that process. Instead composer
managed assets used in beta cluster and production are manually
curated in the mediawiki/vendor.git repo via gerrit patches.

Bryan
-- 
Bryan Davis  Wikimedia Foundation<bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]] Manager, Cloud Services  Boise, ID USA
irc: bd808v:415.839.6885 x6855


Re: [Wikitech-l] How does a build process look like for a mediawiki extension repository?

2017-06-07 Thread Bryan Davis
…deployers of MediaWiki and local developers. 3rd
party users are the most important because this is the largest number
of people who will be impacted by tooling changes. In an ideal world
all or most of the changes could be hidden by changes to
ExtensionDistributor or similar tooling that makes it easy to create a
download and run tarball.

One of the awesome features of working on a PHP codebase is the quick
cycle of making a change and seeing it live in your test environment.
Today that is mostly a matter of saving an edit and hitting refresh in
a browser. It would be sad to lose that, so the build system that is
devised should also provide a path that allows a git clone to be a
viable wiki. This runtime doesn't need to be the best that the wiki
could be, however. It's usually OK if a local dev environment needs to
do a bit more work than a prod deployment would to gather l10n
resources and do other dynamic processes that would be expected to be
baked into artifacts for a production deployment.

$0.02 USD,
Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [Labs-l] Fwd: [Affiliates] Recognition of the Wikimedia Tool Developers Group

2017-05-23 Thread Bryan Davis
On Mon, May 22, 2017 at 6:43 AM, Pine W <wiki.p...@gmail.com> wrote:
> -- Forwarded message --
> Date: Sun, May 21, 2017 at 5:41 PM
> Subject: [Affiliates] Recognition of the Wikimedia Tool Developers Group
>
> Hi everyone!
>
> I'm very happy to announce that the Affiliations Committee has recognized
> the Wikimedia Tool Developers Group [1] as a Wikimedia User Group.  The
> group is an international cooperative of developers who create freely-licensed
> tools for improving the Wikimedia projects.
>
> Please join me in congratulating the members of this new user group!
> (...snip...)
> [1] https://meta.wikimedia.org/wiki/Wikimedia_Tool_Developers_Group

It's really awesome for me personally to see communities like this
form on their own. I'd like to extend an invitation to the new group
to collaborate with the Cloud Services team [2] and the Tool Labs
standards committee [3] in their program work. Efforts to make it
easier for people to develop, maintain, and publicize technical
contributions to the Wikimedia movement are important work for our
collective future.


[2]: https://www.mediawiki.org/wiki/Wikimedia_Cloud_Services_team
[3]: 
https://wikitech.wikimedia.org/wiki/Help:Tool_Labs/Tool_Labs_standards_committee

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Global language preference

2017-05-10 Thread Bryan Davis
On Wed, May 10, 2017 at 10:16 AM, Derk-Jan Hartman
<d.j.hartman+wmf...@gmail.com> wrote:
> We had some discussions about this at the devsummit, since it was on one of
> the wish lists..
> https://phabricator.wikimedia.org/T16950
>
> At that time I suggested it would probably be easier and less risky, to
> start of with a tools project, that allows you to apply a setting to all
> our properties. Saves you a lot of the effort of figuring out how to
> 'position' this in the UI, and what to do when people want exceptions to
> global defaults etc..

Kunal has a rough start on such a tool at
<http://tools.wmflabs.org/globalprefs/>. The source looks to be at
<https://github.com/legoktm/globalprefs>. Maybe some folks would like
to collaborate on expanding that tool to make a nicer working proof of
concept?

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Manager, Cloud Services  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Introducing the MediaWiki Platform Team!

2017-04-02 Thread Bryan Davis
On Sun, Apr 2, 2017 at 5:23 PM, Victoria Coleman <vcole...@wikimedia.org> wrote:
> I am pleased to share with you the news about the formation of the latest 
> team in the Technology department, the MediaWiki Platform team![1]
>
> (snip)
>
> I am thrilled that Tim Starling has agreed to lead the team, reporting 
> directly to me. He will be joined by Brion Vibber, Kunal Mehta and Brad 
> Jorsch.  The team officially launches on Monday April 3, and will complete 
> the hiring and onboarding of additional team members in the coming months. 
> Their initial workplan will include core support for multi content revisions 
> for the Structured Data on Commons project and will be discussed in more 
> detail during the upcoming consultation for the Wikimedia Foundation 
> 2017-2018 annual plan.
>
> (snip)
>
> [1] https://www.mediawiki.org/wiki/MediaWiki_Platform_team 
> <https://www.mediawiki.org/wiki/MediaWiki_Platform_team>

I'm really happy to see this happen. I'd like to give my personal
thanks to Victoria for recognizing the need and working to find a way
to form this team. I'd also like to show my appreciation to Tim for
taking on the team lead/manager role in this group.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] [Breaking Change] MediaWiki-Vagrant now defaults to Debian Jessie

2017-03-13 Thread Bryan Davis
The 'master' branch of MediaWiki-Vagrant will now provision and
maintain Debian Jessie based VMs. The next time you fetch
mediawiki/vagrant.git changes to your laptop or Labs VM and try to run
`vagrant up` or `vagrant provision`, it will complain that your
Vagrant-managed VM is not running the correct base operating system.

There are two ways to deal with this:

1) Follow the instructions given to delete and recreate your VM. This
is the most awesome long term thing to do, but may be annoying in the
short term. If you have heavily customized the wikis running in your
VM, it is up to you to figure out how to back things up before you
destroy your current VM and then restore the changes after you build a
new Jessie-based VM.

2) Switch your git checkout to the 'trusty-compat' branch of
mediawiki/vagrant.git. This trades short term efficiency for long term
pain. The trusty-compat branch is not going away any time soon, but it
will drift out of sync with Puppet changes on the master branch.

See <https://phabricator.wikimedia.org/T136429> for known issues with
the Jessie conversion. The only two I'm aware of at this time are
related to fundraising (T154264) and an NFS permissions mapping
problem when installing ChangeProp on a VM with OSX as the host
operating system and NFS shares enabled for Vagrant (T158617).

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Introduction

2017-03-07 Thread Bryan Davis
On Tue, Mar 7, 2017 at 2:41 PM, Egbe Eugene <agboreug...@gmail.com> wrote:
> Hi all,
>
> I am Egbe Eugene. I come from Cameroon and i am quite new to the movement
> as a developer. My thought towards contributing to the community is driven
> by the impact which the foundation could bring to my country and Africa and
> also what i will do in order that the movement be heard around the ends of
> my country.
>
> I will also wish to enhance my engineering skills while contributing to the
> movement by working on the foundation's projects.
>
> Hope to get more help to get started soonest.

Welcome Egbe! We can always use more help from technical contributors,
and it's especially nice to see people coming into the technical side
of the movement to hone skills that they can put to use elsewhere in
their lives as well. The
https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker wiki
page is a good place to start looking for ways to contribute.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Results of the Developer Wishlist are in

2017-02-15 Thread Bryan Davis
On Wed, Feb 15, 2017 at 1:06 PM, Gergo Tisza <gti...@wikimedia.org> wrote:
> On Wed, Feb 15, 2017 at 8:31 AM, Jon Robson <jdlrob...@gmail.com> wrote:
>
>> I'd be interested to see how the priorities change. I'm interested
>> specifically in what pain points newcomers to our code might experience.
>
>
> We discussed running the wishlist in a more survey-like way (use Google
> Forms, ask a bunch of demographics/experience questions) but that would
> have been more complex to set up and we did not have the bandwidth.
>
> I have an old vote analysis project [1] that I could dust off but I'm not
> sure how well (lack of) wiki activity correlates with being a newcomer, and
> we don't really have any other indicators. (In theory Phabricator could be
> used to link wiki and gerrit identity, but that data is probably not
> public.)
>
>
> [1] https://github.com/tgr/wikimedia-rfc-stats

There is an API that can be used to look up a Phabricator account based
on a Wikimedia username
(<https://phabricator.wikimedia.org/conduit/method/user.mediawikiquery/>).
We built this for use by https://toolsadmin.wikimedia.org/ (Striker).
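As a rough sketch of how a tool might prepare a call to that Conduit method — note the `names` parameter name and response shape are assumptions on my part, so verify them against the method page linked above before relying on them:

```python
# Hedged sketch: build the form-encoded request body Conduit methods expect.
# "api.token" is the usual Conduit auth field; the "names" parameter for
# user.mediawikiquery is an assumption -- check the linked method page.
import json
import urllib.parse

def build_lookup_body(api_token, wiki_usernames):
    """Encode a user.mediawikiquery request body (nothing is sent here)."""
    return urllib.parse.urlencode({
        "api.token": api_token,
        "params": json.dumps({"names": wiki_usernames}),  # assumed param name
    })

body = build_lookup_body("api-example-token", ["BryanDavis"])
# To actually call it (untested sketch):
# urllib.request.urlopen(
#     "https://phabricator.wikimedia.org/api/user.mediawikiquery",
#     data=body.encode())
```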

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] MediaWiki-Vagrant needs testers for Debian Jessie changes

2017-02-14 Thread Bryan Davis
TL;DR See <https://phabricator.wikimedia.org/T136429> and try the new
jessie-migration branch.


The Wikimedia Foundation has been working on converting the majority
of its production server farm from Ubuntu to Debian for quite some
time. In August of 2016 that project advanced to the point of actually
converting the 252 servers running MediaWiki [0]. Today that work has
largely been done and the vast majority of MediaWiki is running on
Jessie.

Now it's time (past time really) for MediaWiki-Vagrant to catch up. We
have done this sort of thing once before with the switch from using
Ubuntu Precise to Ubuntu Trusty as our base image [1] in mid-2014. We
had fewer users then and only one supported virtual machine type. The
switch from Trusty to Jessie is also slightly more complicated because
the init system used is changing as well. This means that we have more
Puppet code changes to make than last time and more people who will be
impacted by the change.

For a little over a month several of us have been working on the
jessie-migration branch to get ready for the change over. I think we
are at the point that we could change over, but I want to have a
period of testing by a wider audience to see if we can iron out a few
more common issues before forcing everyone to either upgrade or
explicitly pin their MediaWiki-Vagrant virtual machines to the git tag
for the last Trusty compatible build.


== Testing the Jessie base image and Puppet profiles ==

It's recommended to test with a fresh MediaWiki-Vagrant checkout so if
things go badly you can easily switch back to your original install
and keep working.

$ git clone --recursive
https://gerrit.wikimedia.org/r/mediawiki/vagrant mwv-jessie
$ cd mwv-jessie
$ git checkout jessie-migration
$ ./setup.sh
$ vagrant up

You can run `vagrant roles list -e -1` to get a nice list of the roles
you have enabled on your normal Trusty VM install to copy over to your
Jessie testing VM. This one-liner might even do it for you:

$ cd mwv-jessie
$ vagrant roles enable $(cd ../vagrant; vagrant roles list -e -1)
$ vagrant provision


Give things a try and report issues as children of the tracking task
for this migration [2]. Barring major issues affecting many people, I
would like to merge the jessie-migration branch with the master branch
the week of March 13th.


[0]: https://phabricator.wikimedia.org/T143536
[1]: 
https://phabricator.wikimedia.org/rMWVAdc73e2bee9dff1e0755d15cfe1376ee2dc6e141d
[2]: https://phabricator.wikimedia.org/T136429

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] private key compromise OCSP declarations per RFC 5280

2017-01-30 Thread Bryan Davis
On Mon, Jan 30, 2017 at 10:06 AM, James Salsman <jsals...@gmail.com> wrote:
> I have been informed off-list that the answer to my question is no,
> and asked to open a phabricator task to allow for fail-over alternate
> certificate utilization in the case of revocations via OCSP or
> revocation list-based revocation.
>
> I am strongly in favor of doing so, but I don't know how to categorize
> such a task or the group to assign it to. Any ideas?

The HTTPS tag (<https://phabricator.wikimedia.org/project/profile/162/>)
and the Traffic component
(<https://phabricator.wikimedia.org/project/profile/1201/>) would both
seem reasonable.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] MVVM/Single-State solution for our UIs?

2017-01-30 Thread Bryan Davis
On Mon, Jan 30, 2017 at 10:17 AM, Isarra Yos <zhoris...@gmail.com> wrote:
> Er, what does this mean? What is MVVM?

A bit of software pattern jargon:
https://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93viewmodel
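
For readers new to the pattern, here is a toy illustration of the viewmodel idea (deliberately not tied to Vue, React, or OOUI): the "view" subscribes to state changes instead of manipulating the DOM directly.

```python
# Toy MVVM-ish binding: the "view" is just a callback that re-renders
# whenever the viewmodel's state changes. Names here are illustrative.
class ViewModel:
    def __init__(self, state):
        self.state = state
        self._listeners = []

    def bind(self, listener):
        self._listeners.append(listener)

    def set(self, key, value):
        self.state[key] = value
        for listener in self._listeners:
            listener(self.state)

rendered = []
vm = ViewModel({"x_mode": "view"})
# "if element x is in edit mode, set element y to saving mode"
vm.bind(lambda s: rendered.append(
    "y: saving" if s["x_mode"] == "edit" else "y: idle"))
vm.set("x_mode", "edit")
```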

> On 30/01/17 13:57, Jan Dittrich wrote:
>>
>> Hello Wikitext-l,
>>
>> ---
>> TL;DR: The Wikidata team is considering to use a MVVM/Single-State
>> solution
>> for Wikidata’s UI. What are requirements and concerns would be important
>> to
>> consider?
>> ---
>>
>> Wikidata’s current UI is built on jQuery UI. Since jQueryUI shall be faded
>> out, we are looking at possible future frameworks or paradigms to build
>> our
>> UI on. Our needs are:
>>
>> - Having a sustainable foundation
>> - Being able to handle complex state dependencies (simplest are like: "if
>> element x is in edit mode, set element y to saving mode")
>> - A solution that is easy to learn for beginners and easy to read and
>> reason about for our engineers.
>>
>>
>> State management and data/event propagation goes beyond of what OOUI can
>> provide, as far as I (Jan) know. So an obvious candidate was looking into
>> MVVM solutions of which the most well known is the React library.
>>
>> We had a deeper look at Vue.js which is known for having a large
>> community,
>> too, but being easier to understand and not using an additional patent
>> clause in its licensing.
>>
>>
>> We see the following possible advantages:
>>
>> - Better modularization
>> - understandability of our code, in particular reasoning about event- and
>> data-flow
>> - better separation of concerns and testability for:
>> -- HTML templates
>> -- Component interactivity
>> -- Data manipulation
>> -- connection to backend-API
>>
>>
>> - If we use a well documented framework, learning to contribute is much
>> easier compared to software for which there is only auto-generated
>> code-level-docs
>>
>>
>> Here are some answers to obvious questions:
>>
>> 1) Does using a MVVM mean we need to write mixed JS/CSS/HTML in a new
>> syntax? (aka JSX)? -> No, it is possible, but for most frameworks (Vue,
>> too) normal HTML templates are used
>>
>> 2) Does that mean that people coming from Object oriented languages will
>> need to learn a whole new paradigm – reactive, pure-functional
>> programming?
>> -> While there are some elements of functional programming used in
>> react-like-frameworks, I would (subjectively) say that few additional,
>> totally new knowledge is needed and most can be covered by "take
>> parameters, work with them, return values; don't manipulate non-local
>> values"
>>
>> 3) How does DOM access work? Does this mean no jQuery?
>>
>>
>> -> DOM can be still be directly accessed. Libraries like jQuery can still
>> be reused (even if they might not be necessary in many points any more).
>> However, to change data or dom persistently, you need to tell the library
>> (which is not unusual, afaic)
>>
>>
>> There are also some other concerns:
>>
>> - Should we introduce a new dependency like a framework as Vue?
>> - What would be the process of introducing such a dependency (if we agree
>> on one)?
>> - Can we agree on this (or another?) paradigm for managing complex UIs, so
>> that it is not a Wikidata-only solution, but could be used by other
>> Wikimedia projects in the future, too?
>> - How will this work with OOUIjs? OOUI seems to be mainly responsible for
>> creating DOM elements and this actions are usually owned by the MVVM
>> framework. One can use hooks to use libraries like OOUI and such, but it
>> feels like having the same functionality twice. A possible solution would
>> be using OOUI styles and markup but leaving DOM creation to the framework.
>>
>>
>> Do you think using Vue (or a similar framework) is an option for us? What
>> are requirements and concerns which would be important?

I don't know if it would meet all of your requirements, but I would
suggest at least sitting down with some of the folks who develop and
maintain OOjs UI (<https://www.mediawiki.org/wiki/OOjs_UI>) and having
them help you evaluate the pros and cons of using the same view
abstraction layer that is used for Visual Editor and Flow, and which
is already available as a core component in MediaWiki.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] More useful Puppet docs for MediaWiki-Vagrant

2017-01-10 Thread Bryan Davis
On Tue, Jan 10, 2017 at 3:12 PM, Brion Vibber <bvib...@wikimedia.org> wrote:
> Nice! I like that by expanding the 'source' option I can see dependencies
> as well as what it's actually doing in terms of config etc. The textual
> descriptions of the roles are mostly pretty vague so far; should we expand
> them for instance to describe what will be installed and how to go about
> tweaking if tweaking is needed, or just leave that to the source view?

It would be the guy who filed Bug:1 sending this. :) Documentation
levels certainly vary. It would be nice to work on making them a bit
better. Patches welcome.


Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] More useful Puppet docs for MediaWiki-Vagrant

2017-01-10 Thread Bryan Davis
Hashar did some magic and replaced our use of `puppet doc` with yard
which is now generating prettier and more useful documentation of the
roles and other Puppet components in the MediaWiki-Vagrant project. Go
check them out if you are interested:
<https://doc.wikimedia.org/mediawiki-vagrant/>. Try searching for
"role::" to get the list of all roles.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [Engineering] Scap 3.4.0-1 is live

2016-11-28 Thread Bryan Davis
On Mon, Nov 28, 2016 at 7:25 PM, Tyler Cipriani <tcipri...@wikimedia.org> wrote:
>
> * Scap3 (non-mediawiki) deploys will now announce deploys in IRC -- you
>  can specify a message for IRC via:
>
>scap deploy 'A message for the SAL'
>
> * Scap lockfile errors now show you (a) who has the lockfile and (b) their
>  deploy message. The output you'll see if another person is deploying
>  looks like:
>
>sync-file failed:  Failed to acquire lock 
> "/var/lock/scap"; owner is "thcipriani"; reason is "scap 3.4 sync file"

These two changes in particular are awesome! The lack of automatic
!log messages for trebuchet and scap3 always bugged me. Thanks for all
the work folks.
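
The lockfile behavior described above can be mimicked with a small sketch; this illustrates the idea of storing the owner and reason inside the lock, and is not scap's actual implementation.

```python
# Toy lock: the lock file records who holds it and why, so a second
# deployer gets a useful error instead of a bare "lock exists".
import json
import os
import tempfile

class LockHeld(Exception):
    pass

def acquire(path, owner, reason):
    try:
        # O_EXCL makes creation atomic: fails if the lock already exists.
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        with open(path) as f:
            held = json.load(f)
        raise LockHeld(
            'Failed to acquire lock "%s"; owner is "%s"; reason is "%s"'
            % (path, held["owner"], held["reason"]))
    with os.fdopen(fd, "w") as f:
        json.dump({"owner": owner, "reason": reason}, f)

lock = os.path.join(tempfile.mkdtemp(), "scap.lock")
acquire(lock, "thcipriani", "scap 3.4 sync file")
try:
    acquire(lock, "someone-else", "sync-file wmf-config")
    message = ""
except LockHeld as err:
    message = str(err)
```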

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Tech Talk: Using Kibana4 to read logs at Wikimedia

2016-11-14 Thread Bryan Davis
On Mon, Nov 14, 2016 at 11:51 AM, Rachel Farrand <rfarr...@wikimedia.org> wrote:
> If you missed this talk and would like to view the recording, here is the
> link: *https://www.youtube.com/watch?v=woS587VIfHI
> <https://www.youtube.com/watch?v=woS587VIfHI>*
> It has been released under a creative commons license and will be uploaded
> to commons.
>
> If you have any questions please get in touch with Bryan. Email:
> bd...@wikimedia.org

The slides and video are now on commons as well:
* 
https://commons.wikimedia.org/wiki/File:Using_Kibana4_to_read_logs_at_Wikimedia_Tech_Talk_2016-11-14.pdf
* 
https://commons.wikimedia.org/wiki/File:Tech_Talk-_Using_Kibana4_to_read_logs_at_Wikimedia.webm

Having the slides open may be handy when you watch the section at the
end of the video where I talk about the X-Wikimedia-Debug header and
browser extension. I shared the wrong browser window from my desktop
and didn't notice that the slides weren't being displayed on the
video. :/

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Automatic gerrit authentication and retrieval of reviews

2016-10-21 Thread Bryan Davis
On Fri, Oct 21, 2016 at 4:38 PM, Strainu <strain...@gmail.com> wrote:
> 2016-10-22 1:16 GMT+03:00 Gergo Tisza <gti...@wikimedia.org>:
>> Are you worried that the users
>> are going to give positive reviews to themselves to bias the scores?
>
> Authentication is used only to ensure they don't claim somebody else's
> submissions (say, Gerrit Patch Uploader's :) ). Yes, this could
> probably be detected manually, but we're trying to go with an
> automated workflow where manual interventions are at a minimum.
>
>> Can you better explain what you are after?
>
> I'm simply trying to make it easy for the users. In the current
> version of the tool, they login with the github account and the rest
> happens "magically": the tool retrieves their pull requests and scores
> them according to a predefined set of criteria - no need for user
> input of any kind. I just want the same workflow for patches submitted
> to gerrit and I needed a way to authenticate the users and match the
> information I have from the OAuth endpoint with reviews from gerrit.

Today there is no accessible mapping between Wikimedia unified
accounts (the account you use on en.wikipedia.org as an example) and
Wikitech/Gerrit accounts. As Alex pointed out earlier in the thread
there is some work being done to unify these systems, but that
unification is quite far off currently.

There is however a one to one mapping between a Wikitech username and
Gerrit username. My Wikitech username is "BryanDavis" and so is my
Gerrit username
(<https://gerrit.wikimedia.org/r/#/q/owner:BryanDavis>). If the
mapping is not an identity mapping, then it would still be
contained in the LDAP directory that any Labs project or Tool Labs
tool can query. The "cn" LDAP attribute is a user's Wikitech username,
so you can search for a Wikitech user's LDAP record with something
like `ldapsearch -xLLL cn=BryanDavis` from a command line or a similar
query using an LDAP library. I am unsure if Gerrit uses the "cn" or
"sn" attribute of the same record as the account's login name. For
many records in our LDAP directory it would not matter as the values
are the same, but I ran across some records where the two values
differ when I was deploying https://labsadmin.wikimedia.org/.
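
To illustrate the cn/sn comparison, here is a small sketch that parses the LDIF-style output `ldapsearch -xLLL` prints; the sample record below is invented for illustration and is not a real directory entry.

```python
# Parse a made-up LDIF record, like `ldapsearch -xLLL cn=BryanDavis`
# would print, then compare the cn and sn attributes.
SAMPLE_LDIF = """\
dn: uid=bd808,ou=people,dc=wikimedia,dc=org
cn: BryanDavis
sn: BryanDavis
"""

def first_attrs(ldif):
    """Collect the first value seen for each attribute name."""
    attrs = {}
    for line in ldif.splitlines():
        if ": " in line:
            name, value = line.split(": ", 1)
            attrs.setdefault(name, value)
    return attrs

record = first_attrs(SAMPLE_LDIF)
# For many records the two values match, per the message above.
cn_matches_sn = record["cn"] == record["sn"]
```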

OAuth was recently re-enabled on the Wikitech server, so you would
need to register your OAuth consumer there
(<https://wikitech.wikimedia.org/wiki/Special:OAuthConsumerRegistration>)
and interact with wikitech.wikimedia.org in your client code.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Gerrit screen size

2016-09-25 Thread Bryan Davis
On Sun, Sep 25, 2016 at 6:41 AM, Tim Starling <tstarl...@wikimedia.org> wrote:
> On 25/09/16 21:09, Bináris wrote:
>> Hi,
>>
>> I try to familiarize myself with Gerrit which is not a good example for
>> user-friendly interface.
>> I noticed a letter B in the upper right corner of the screen, and I
>> suspected it could be a portion of my login name. So I looked at it in HTML
>> source, and it was. I pushed my mouse on it and I got another half window
>> as attached.
>>
>> So did somebody perhaps wire the size of a 25" monitor into page rendering?
>> My computer is a Samsung notebook.
>
> In T38471 I complained that the old version was too wide at 1163px
> (for my dashboard on a random day). Now the new version is 1520px. I'm
> not sure if the Gerrit folks are serious or are trolling us. Perhaps
> it is a tactic to encourage UI code contributions?

Sadly I don't think that's the case as the upstream has moved on to
building yet another UI layer to replace the one that we are currently
using [0].

I hacked the heck out of the CSS to make things fit in my ~1024px
preferred browser width [1]. There is a greasemonkey script that will
apply this css [2]. Perhaps some other folks could test this css out
and see if it would be a useful change to add to the stylesheet
overrides already in use by the WMF's deployment.

[0]: https://gerrit.googlesource.com/gerrit/+/master/polygerrit-ui/
[1]: https://github.com/bd808/userscripts/blob/gh-pages/wmfgerrit.user.css
[2]: http://bd808.com/userscripts/wmfgerrit.user.js

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Debian and Ubuntu packages for MediaWiki now available

2016-09-22 Thread Bryan Davis
On Thu, Sep 22, 2016 at 12:44 AM, Legoktm <legoktm.wikipe...@gmail.com> wrote:
> Hi,
>
> I'm very excited to announce that Debian and Ubuntu packages for
> MediaWiki are now available. These packages will follow the MediaWiki
> LTS schedule and currently contain 1.27.1. If you've always wanted to
> "sudo apt install mediawiki", then this is for you :)
>
> For Debian users, you can get the mediawiki package for Jessie from the
> official Debian repositories using jessie-backports, and it will be
> included in the upcoming Stretch release.
>
> Ubuntu users will need to enable a PPA for Xenial or Trusty to set up
> the package.
>
> Instructions, links, and help can all be found at
> <https://www.mediawiki.org/wiki/User:Legoktm/Packages>. Please let me
> know if you have any questions or feedback.
>
> Finally, thanks to Luke Faraone, Faidon Liambotis, Moritz Mühlenhoff,
> Max Semenik, Chad Horohoe, Antoine Musso, and all of the beta testers of
> the package for making this a reality.

Congratulations Legoktm! This is huge!

For those who haven't been following this, Legoktm started working
seriously on this project during the Wikimania 2015 hackathon in
Mexico City. He has kept the process moving over these many months and
worked through lots and lots of blocking issues that would have
stopped most people. For me this is just one more example of why
Legoktm is awesome and deserving of public praise. :)

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


[Wikitech-l] [Discussion needed] Process for taking over abandoned tools

2016-09-21 Thread Bryan Davis
In 2015, a phabricator task [0] and RfC discussion on meta [1] were
started to create a process for determining when a tool has been
abandoned by its original maintainer(s) and how to hand control of
the tool over to interested volunteers. The process stalled out
without resolution.

Our on-wiki communities are still highly dependent on volunteer
developed tools and vulnerable to disruption when the original
developers move on. I have drafted two straw dog [2] policies that
attempt to define fair and workable solutions to the general problem.
The proposals take two different but compatible approaches to solving
the problem of abandonment. The Tool Labs developer community could
choose to adopt either or both policies as protection for the
communities that they serve.

The first policy describes a *right to fork* for all Tool Labs hosted
software. This policy clarifies the existing Tool Labs Open Source and
Open Data requirements and defines a process for requesting access to
code and data that are not already published publicly.

The second policy is a more aggressive *abandoned tool policy* that
describes a process for adding new maintainers to a tool account
(adoption) with a future possibility of removing the original
maintainers (usurpation). This policy is based primarily on the
discussions that happened on Meta in 2015.

Both policies propose creating a new committee of volunteers to
evaluate requests and perform cleanup of sensitive data in the tool
before providing the source code or direct access to the tool account.
This provision is key to actually implementing both proposals. Paid
administration and management does not scale any better than paid
editing. To continue to grow and thrive, the Tool Labs developer
community needs to become more active in enforcing and expanding their
own policies. Membership in the committees would require signing the
Wikimedia Foundation's Volunteer NDA [3] to ensure that sensitive data
is handled appropriately. If both policies are adopted, the two
committees should be collapsed into a single group with authority to
handle both types of requests.

The straw dog policies are posted on Wikitech:
* https://wikitech.wikimedia.org/wiki/Help:Tool_Labs/Right_to_fork_policy
* https://wikitech.wikimedia.org/wiki/Help:Tool_Labs/Abandoned_tool_policy

Discussion of the particulars of each proposal should happen on their
associated talk pages. As an example it would be appropriate to debate
whether the 14 day non-functional waiting period is too short or too
long on the Abandoned tool policy talk page. Discussion of the process
in general can happen on Meta [1].

I would like discussion to remain open through *2016-10-12* (3 weeks
from date of posting). Following the discussion period I hope to call
for an approval vote of some sort to make the policies official.
Wikitech and Tool Labs do not currently have well defined policies for
establishing consensus, but I'm sure we can collectively come up with
something reasonable.


[0]: https://phabricator.wikimedia.org/T87730
[1]: https://meta.wikimedia.org/wiki/Requests_for_comment/Abandoned_Labs_tools
[2]: https://en.wikipedia.org/wiki/Straw_man_proposal
[3]: https://wikitech.wikimedia.org/wiki/Volunteer_NDA

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] Congratulations to all GSoC students (passed)

2016-09-02 Thread Bryan Davis
On Fri, Sep 2, 2016 at 1:01 PM, Alangi Derick <alangider...@gmail.com> wrote:
> Hi,
>
> I wish to use this opportunity to congratulate all the 6 students
> (including me) that made it to the end of the GSoC program 2016 in the
> Wikimedia foundation. It was not easy but we made it to the end. It was a
> great time working with the Wikimedia Foundation and I had a very great
> experience.

Congratulations to everyone. I hope that all of you find FLOSS
projects to continue to contribute to.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [Ops] deployment-prep using valid certs for HTTPS

2016-08-02 Thread Bryan Davis
On Tue, Aug 2, 2016 at 3:51 AM, Alex Monk <a...@wikimedia.org> wrote:
> Hi all,
>
> With some help from Brandon, I've changed deployment-prep to use Let's
> Encrypt instead of the self-signed cert I added last year (to get HTTPS
> working - albeit improperly-signed - instead of nothing, and nginx/puppet
> working on the Varnish instances again).
> It should now behave much more like production - TLS redirects are enabled
> in Varnish, and you shouldn't have to ignore cert warnings to use it now.
> Details for HTTPS in deployment-prep are spread out over various tickets,
> but the main one now is https://phabricator.wikimedia.org/T50501
> The puppetisation still needs some work, but it's cherry-picked on
> deployment-puppetmaster and seems to be working reliably.
>
> Pages with images may need to be null-edited to make MediaWiki generate
> HTTPS URLs for them so browsers don't block the images.
> Please let me know if you find any beta.wmflabs.org domains that aren't
> covered by the cert or aren't redirecting HTTP to HTTPS in Varnish.

This is really cool and another recent example of Alex grinding out
the steps to close a long standing feature wish for the beta cluster.
Thanks!

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] BetaFeatures grafana dashboard

2016-07-29 Thread Bryan Davis
On Fri, Jul 29, 2016 at 3:54 AM, Addshore <addshorew...@gmail.com> wrote:
>
> Recently WMDE started rolling out the RevisionSlider beta feature to a
> handful of wikis.
>
> We wanted to track how many people were using the feature as well as how
> many people were enabling / disabling it per day.
>
> Getting this data for all beta features is no more difficult than getting
> it for the RevisonSlider only.
> Thus we now have this dashboard, Enjoy!
>
> https://grafana.wikimedia.org/dashboard/db/betafeatures
>
> Any comments / suggestions are very welcome.

Thanks for working on this kind of data collection and visualization Adam.

Bryan
-- 
Bryan Davis  Wikimedia Foundation  <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer  Boise, ID USA
irc: bd808  v:415.839.6885 x6855


Re: [Wikitech-l] [Ops] Canary Deploys for MediaWiki

2016-07-25 Thread Bryan Davis
On Mon, Jul 25, 2016 at 3:07 PM, Alex Monk <kren...@gmail.com> wrote:
> On 25 July 2016 at 21:54, Roan Kattouw <roan.katt...@gmail.com> wrote:
>>
>> Note to deployers: when syncing certain config changes (e.g. adding a new
>> variable) that touch both InitialiseSettings and CommonSettings, you will
>> now need to use sync-dir wmf-config, because individual sync-files will
>> likely fail if the intermediate state throws notices/errors.
>>
>> (It was a good idea to do this before, but it'll be more strongly enforced
>> now.)
>
> If the intermediate state throws notices/errors, wouldn't it be a better
> idea to sync-file in the correct order to prevent such notices/errors?

I think Alex is "more right" here. If you are introducing a new $wmgX
var you really should always sync-file the changed InitialiseSettings
file first and then the CommonSettings that uses it. There's no really
good reason to spew a bunch of "undefined X" warnings and there is no
guarantee with sync-dir that the files will be sent in the proper
order.
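The ordering rule can be sketched as a tiny helper (a hypothetical illustration, not part of scap itself): files that define $wmg* variables must be synced before the files that consume them.

```python
# Hypothetical sketch of the sync ordering rule: InitialiseSettings.php,
# which defines $wmg* variables, must land before CommonSettings.php,
# which uses them. The paths mirror wmf-config; the helper is invented.
SYNC_FIRST = ["wmf-config/InitialiseSettings.php",
              "wmf-config/CommonSettings.php"]

def sync_order(changed_files):
    """Return changed files sorted so definition files precede consumers."""
    rank = {path: i for i, path in enumerate(SYNC_FIRST)}
    # Unknown files sort after the known definition/consumer pair.
    return sorted(changed_files, key=lambda f: rank.get(f, len(SYNC_FIRST)))

print(sync_order(["wmf-config/CommonSettings.php",
                  "wmf-config/InitialiseSettings.php"]))
```

Syncing in this order means the intermediate server state never uses an undefined variable.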

Bryan

Re: [Wikitech-l] [Ops] Help evaluate upgrade path for logstash.wikimedia.org

2016-06-24 Thread Bryan Davis
On Thu, Jun 23, 2016 at 7:47 PM, Erik Bernhardson
<ebernhard...@wikimedia.org> wrote:
> That url should have been: https://kibana4.wmflabs.org/
>
> On Thu, Jun 23, 2016 at 6:26 PM, Erik Bernhardson
> <ebernhard...@wikimedia.org> wrote:
>>
>> The elasticsearch servers that run logstash.wikimedia.org will soon be
>> upgraded from 1.7 to 2.3. As part of this upgrade we also need to upgrade
>> kibana (the web interface that runs on logstash.wikimedia.org). The current
>> version uses elasticsearch features that have been long deprecated (since
>> 1.4) and as of 2.0 have been removed. I've setup a test environment in the
>> beta cluster running this new version of elasticsearch along with the new
>> version of kibana.
>>
>> Be warned, kibana 4 is a complete rewrite. It needs to be evaluated for
>> our purposes and we need to make sure it does what we need. I've recreated
>> most of the dashboards that were linked from the homepage of
>> logstash-beta.wmflabs.org to help with the evaluation. The first few took a
>> short while, but due to the way kibana re-uses visualizations it became a
>> good bit faster/easier after that.
>>
>> The main feature i've noticed missing from our old dashboards is the
>> trends panel. This is the part that told you how the volume of messages
>> compared to the previous hour/day/week/etc. There is a long standing github
>> issue about replacing this functionality but it has been stalled out. I
>> don't personally use that part of the visualization much, but perhaps you
>> do?
>>
>> Please try it out, and give some feedback. https://kibana.wmflabs.org/

Thanks for working on this Erik!

Kibana4 is a complete rewrite from Kibana3 (and I think the 5th or 6th
ground-up rewrite of Kibana in its history). Elastic.co seems to have
high hopes for this iteration providing a platform for better
visualizations and performance. As a rewrite there are some feature
regressions that haven't been worked out upstream yet. We (well I)
didn't upgrade us to Kibana4 for over a year due to a lack of support
for pinning the dates shown in the UI to a particular timezone [0].
Thankfully that regression has been fixed in the version that Erik is
demoing in the beta cluster. I haven't found any other regressions
that are worthy of blocking yet, and that's good because we really do
have to upgrade in order to have a sane state of services in
production.

There are a few issues that I have noticed that others may find as well:

* No support for "markers" on a time histogram [1]. These were used in
Kibana3 to show when scap changed files on the servers. The markers
weren't highly reliable, so it's not the end of the world that this is
missing for now.

* Date picker works in local browser TZ instead of UTC [2]. Honestly I
think this bug or a variant of it exists in our Kibana3 install as
well. Working with dates is a pain in just about every language. :/

There are some neat new features too, like permalinks to individual
log events and quite a few new visualization types [3].

[0]: https://github.com/elastic/kibana/issues/1600
[1]: https://github.com/elastic/kibana/issues/2706
[2]: https://github.com/elastic/kibana/issues/5370
[3]: https://www.elastic.co/guide/en/kibana/4.0/whats-new.html

Bryan

Re: [Wikitech-l] [Labs-l] Fall 2015 Tool Labs user survey data published

2016-05-05 Thread Bryan Davis
On Wed, May 4, 2016 at 8:35 PM, Pine W <wiki.p...@gmail.com> wrote:
>
> Hi Bryan,
>
> Thanks for the report. With this information in hand, what follow up is
> planned?

The primary purpose of the study was to set a baseline of qualitative
("how are we doing") responses. These will be used to gauge
increase/decrease in satisfaction in comparison to future surveys.
Tool Labs staff get regular feedback from people who are either
extremely happy or extremely unhappy with services, but a broad survey
like this is helpful to determine the general satisfaction of those
who are not motivated to provide feedback by a single incident. No
specific timetable has been set yet, but I expect to see another
survey run between October and December of this year.

Data from this survey, especially the free form comments, has already
been shaping the quarterly planning for the Labs techops team and my
more recent Tool Labs support initiative. I would say that a focus on
making Tool Labs easier to use, especially by individuals who are not
experts in the Unix command-line, is a general theme that is being
pursued as we also seek to replace aging services like Sun Grid Engine
and increase overall stability of the platform.

One of the things that stood out the most for me personally in the
feedback was a general need for tutorials and other task focused
technical documentation. This is an area that I would like to
encourage the existing Tool Labs developers to help with. There are
some ideas of things to write in Phabricator [0][1]. Generally we are
lacking "big picture" documentation. We have a lot of documentation
on solving particular problems but even those documents can be
difficult to find and understand as they are mostly written with an
assumption of familiarity with the services and terminology used in
Tool Labs.

[0]: https://phabricator.wikimedia.org/T101659
[1]: https://phabricator.wikimedia.org/T123425

Bryan

Re: [Wikitech-l] Fall 2015 Tool Labs user survey data published

2016-05-04 Thread Bryan Davis
On Wed, May 4, 2016 at 7:57 PM, Bryan Davis <bd...@wikimedia.org> wrote:
> [0]: https://meta.wikimedia.org/wiki/Research:Annual_Tool_Labs_Survey

Apologies for that abrupt initial message, that was a great example of
hitting the wrong key in a mail client. :)

Between 2015-09-25 and 2015-10-08, the Wikimedia Foundation ran a
direct response user survey of registered Tool Labs users. 106 users
responded to the survey.

Based on responses to demographic questions, the average[1] respondent:
* Has used Tool Labs for 1-3 years
* Developed & maintains 1-3 tools
* Spends an hour or less a week using Tool Labs
* Programs using PHP and/or Python
* Does the majority of their work locally
* Uses source control
* Was not a developer or maintainer on Toolserver

[1]: "Average" here means a range of responses covering 50% or more of
responses to the question. This summarization is coarse, but useful as
a broad generalization. Detailed demographic response data will be
made available on wiki.
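The definition in [1] — the smallest run of adjacent answer options whose responses cover at least half the total — can be computed mechanically. A sketch with made-up counts (the real response data is on meta):

```python
def coarse_average(counts):
    """Smallest run of adjacent options covering >= 50% of responses.

    `counts` maps answer options (in their natural order) to response
    counts. This is one plausible reading of the summarization rule.
    """
    options = list(counts)
    total = sum(counts.values())
    best = None
    for i in range(len(options)):
        running = 0
        for j in range(i, len(options)):
            running += counts[options[j]]
            if 2 * running >= total:  # this run covers at least half
                if best is None or (j - i) < len(best) - 1:
                    best = options[i:j + 1]
                break
    return best

# Invented example counts for "years using Tool Labs":
print(coarse_average({"<1": 20, "1-3": 60, "3-5": 16, ">5": 10}))
```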

Qualitative questions:
64% agree that services have high reliability (up time).
69% agree that it is easy to write code and have it running on Tool Labs.
67% agree that they feel they are supported by the Tool Labs team when
they contact them via labs-l mailing list, #wikimedia-labs IRC
channel, or phabricator.
53% agree that they receive useful information via labs-announce /
labs-l mailing lists.
52% disagree that documentation is easy-to-find.
71% find the support they receive when using Tool Labs as good or
better than the support they received when using Toolserver.
50% disagree that Tool Labs documentation is comprehensive.
50% agree that Tool Labs documentation is clear.

Service usage:
45% use LabsDB often.
60% use webservices often.
54% use cronjobs often.
75% never use redis.
41% never use continuous jobs.

The survey included several free form response sections. Survey
participants were told that we would only publicly share their
responses or survey results in aggregate or anonymized form. The
freeform responses include comments broadly falling into these
categories:

Documentation (33 comments)
Stability and performance (18 comments)
Version control and Deployment (14 comments)
Logs, Metrics, and Monitoring (12 comments)
Package management (10 comments)
SGE (8 comments)
Database (7 comments)
Account/Tool creation (5 comments)
SSH (5 comments)
Other (11 comments)

Additional details are available on meta [2].

[2]: https://meta.wikimedia.org/wiki/Research:Annual_Tool_Labs_Survey

Bryan

[Wikitech-l] Fall 2015 Tool Labs user survey data published

2016-05-04 Thread Bryan Davis
[0]: https://meta.wikimedia.org/wiki/Research:Annual_Tool_Labs_Survey

Re: [Wikitech-l] Docs, use of, and admin privileges for wikimedia github project?

2016-04-26 Thread Bryan Davis
On Tue, Apr 26, 2016 at 6:20 PM, Chad <innocentkil...@gmail.com> wrote:
> On Tue, Apr 26, 2016 at 5:20 PM Alex Monk <kren...@gmail.com> wrote:
>
>> On 27 April 2016 at 01:15, Bryan Davis <bd...@wikimedia.org> wrote:
>>
>> > The Wikimedia GitHub project gives me two things in one place that I
>> > don't get elsewhere:
>> > * Find a repo based on some partial name I remember it probably has
>> > using the "Find a repository..." filtering at
>> > https://github.com/wikimedia/
>>
>> https://gerrit.wikimedia.org/r/#/admin/projects/?filter= lets you do this
>>
>>
> As does https://phabricator.wikimedia.org/diffusion/query/advanced/

Neither of which is as usable as GitHub, in my personal opinion.
Gerrit will lead you into a twisty maze of dead ends if you are trying
to get to a repo browser. The Diffusion search is several clicks deep
in the UI and is not a type-ahead filtering system. I could learn to
live with the Diffusion UX if I didn't have GitHub as a more familiar
and polished interface.

Bryan

Re: [Wikitech-l] Docs, use of, and admin privileges for wikimedia github project?

2016-04-26 Thread Bryan Davis
On Tue, Apr 26, 2016 at 5:24 PM, Stas Malyshev <smalys...@wikimedia.org> wrote:
> Hi!
>
>> On 2016-04-25 19:01, Chad wrote:
>>> Honestly, I'm not entirely convinced that "mirror everything" is all that
>>> useful. It mostly results in a ton of unused repos cluttering up lists.
>>
>> I, for one, appreciate it. GitHub's interface is unfortunately a lot
>> more convenient than any of the repository viewers we host ourselves. :(
>
> Same here. Especially if I need to browse code and/or to communicate to
> somebody about it - especially somebody outside MediaWiki community
> (which means probably not trained with tools like gerrit but most
> probably having some familiarity with github because who doesn't have it
> these days?) - I almost always use github. If we have something better,
> I'd use it - but right now neither gerrit, nor gitblit, nor phabricator's
> code browser are superior to what github offers.

The Wikimedia GitHub project gives me two things in one place that I
don't get elsewhere:
* Find a repo based on some partial name I remember it probably has
using the "Find a repository..." filtering at
https://github.com/wikimedia/
* Search all the things at once using https://github.com/search?q=org:wikimedia

> Now, having something like wikimedia.github.io would be an excellent
> idea. If somebody would do the design, loading up repos list and
> displaying them with a nice structure - given that we actually have
> pretty structured names already, so we could start from that - should
> not be super-hard?

github.io pages are static html from the gh-pages branch of some
repository. They can use javascript and connect to the github api
however, so it might be possible to build a useful browser interface
that filtered based on the '-'-separated name components. You could at
least set up really broad categories like "Extensions", "Skins",
"Operations", "...". The "other" bucket of that interface would still
be pretty big however since we have been creating a lot of
"un-namespaced" repos in the last year or two. That change started
happening when the migration off of Gerrit was deemed an eventuality
as the organization of repos in Gerrit is not very friendly to other
hosting platforms.
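The bucketing idea above can be sketched in a few lines (categories and rules invented for illustration; a real github.io page would do this in JavaScript against the GitHub API):

```python
def bucket_repo(name):
    """Map a '-'-separated repo name to a broad category (assumed scheme)."""
    parts = name.split("-")
    if "extensions" in parts:
        return "Extensions"
    if "skins" in parts:
        return "Skins"
    if parts[0] == "operations":
        return "Operations"
    # Un-namespaced repos fall into the catch-all bucket.
    return "Other"

repos = ["mediawiki-extensions-Echo", "mediawiki-skins-Vector",
         "operations-puppet", "pywikibot"]
for repo in repos:
    print(repo, "->", bucket_repo(repo))
```

As the email notes, the "Other" bucket would be large given how many recent repos have no name prefix at all.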

Bryan

Re: [Wikitech-l] Docs, use of, and admin privileges for wikimedia github project?

2016-04-25 Thread Bryan Davis
On Mon, Apr 25, 2016 at 9:19 AM, Brion Vibber <bvib...@wikimedia.org> wrote:
> There seems to be extremely little documentation on Wikimedia's GitHub
> project https://github.com/wikimedia ... I can only find
> https://www.mediawiki.org/wiki/Gerrit/GitHub which mostly says we mirror a
> bunch of stuff from gerrit. And I know we have continuous integration of
> some kind set up for some projects, but it doesn't seem to be well
> documented in a place I could find.
>
> There are also some repos that are mirrors of gerrit, and other repos that
> are primary repos, and it's a bit unclear what's what. More importantly,
> when folks have repos that they've been running on GitHub already and want
> to move into the wikimedia project (rather than switch to gerrit), what's
> the procedure? I'm an admin/owner so I can manually import people's repos
> but I'm not sure whether I'm supposed to... :)

The really brief procedure Timo came up with during the Librarization
project is documented on wiki [0] but probably kind of buried in the
details about developing new libraries.

> We also have a lot of admins, which I wonder is necessary:
> https://github.com/orgs/wikimedia/people?query=role%3Aowner
> Do we do any security review / removal of old accounts, or have a procedure
> for adding new admins?

Not that I am aware of. Rights there tend to work a lot like getting
elevated rights on mediawiki.org: the rights are handed out by
existing admins when somebody asks for something that will be easily
solved by giving them rights. I think there was some amount of cleanup
done a few months ago that got admins to either add 2fa to their
github account or be removed.


[0]: 
https://www.mediawiki.org/wiki/Manual:Developing_libraries#Transferring_an_existing_GitHub_repo_to_Wikimedia

Bryan

Re: [Wikitech-l] MOOC Extension

2016-04-20 Thread Bryan Davis
On Wed, Apr 20, 2016 at 8:26 AM, René Pickhardt
<r.pickha...@googlemail.com> wrote:
> Hey Chad,
>
> I understand this but I also understand the documentation (
> https://www.mediawiki.org/wiki/%2B2 ) in a way that it says that Sebastian
> and me should not review our own code. That is where my question was
> aiming. Because obviously even with early code review we do not need every
> time we make a git commit to exchange our work in our team a code review.

Waiting for peer code review on each patch can be very difficult when
working on a small team and doing new development. Self-merges are
generally ok when you are building something new from scratch that
isn't deployed into WMF production yet. All of your code will
eventually need to get a full security and performance review before
being approved for WMF production use. Find a workflow that works for
you and your collaborators and don't get too hung up on the guidance
that we give for production deployed code yet.

Bryan

[Wikitech-l] Tool Labs and my new job at WMF

2016-04-15 Thread Bryan Davis
I recently transferred from the Reading Infrastructure team to the
Community Tech team [0]. The move happened because I want to spend
more of my time working with the developers who build tools and bots
to help the Wikimedia communities. I've been thinking about needs of
the Tool Labs developers for a while, and in November I finally wrote
up a proposal about a job focused on this work [1]. I was ready for a
lengthy discussion with management to defend my ideas about this need,
but to my surprise the feedback I got instead was mostly "it's about
time" and "when can you start?". My draft position proposal is now
"official" and posted on meta [2]. This project will be my major focus
on the Community Tech team, but I will also be helping out with code
review, deployments, and other things that the rest of the team is
working on.

People watching wikitech, labs-l, and Phabricator may have noticed
that I have been poking at various things since January like a
redesign of the wikitech main page [3], a new namespace for tool
documentation [4], and generally being more active in discussing
problems and possible solutions. Now that I am working on these issues
full time I want to start talking about bigger issues. I have drafted
a "vision" document on meta [5] describing some of the larger issues
with Tool Labs (and Labs and wikitech) that are making things harder
than they could be. This vision comes with a straw dog project roadmap
that I think we could work towards. This is not a set in stone
timeline, but rather a very high level description of a series of
projects that I believe would move Tool Labs towards being an easier
environment for collaborative FLOSS projects to thrive in. I will
continue to refine these project ideas and create Phabricator tasks to
track them, but before I dive too deeply into that I would like to
solicit input on both the problems and the general solution roadmap.
The project page is on meta rather than wikitech to make it easier for
existing Wikimedians who aren't wikitech users to participate. The
talk page is open for comments [6] and I look forward to hearing about
problems and solutions that I have not yet imagined. I hope that as
the various sub-projects solidify some of you will join me in getting
the work done.


[0]: 
https://wikimediafoundation.org/w/index.php?title=Template:Staff_and_contractors&diff=105580&oldid=105576
[1]: 
https://www.mediawiki.org/wiki/User:BDavis_%28WMF%29/Projects/Tool_Labs_support
[2]: https://meta.wikimedia.org/wiki/Community_Tech/Tool_Labs_support
[3]: https://wikitech.wikimedia.org/wiki/Main_Page
[4]: https://wikitech.wikimedia.org/wiki/Category:Tool_Labs_tools
[5]: 
https://meta.wikimedia.org/wiki/Community_Tech/Tool_Labs_support/Tool_Labs_vision
[6]: 
https://meta.wikimedia.org/wiki/Talk:Community_Tech/Tool_Labs_support/Tool_Labs_vision

Bryan

Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?

2016-02-12 Thread Bryan Davis
On Fri, Feb 12, 2016 at 12:26 PM, Legoktm <legoktm.wikipe...@gmail.com> wrote:
> I think you're going to end up in rebase hell regardless, so we should
> rip off the bandaid quickly and get it over with, and use the automated
> tools we have to our advantage.
>
> So, if we're voting, I'm PRO.

+2

Bryan

Re: [Wikitech-l] Achievements in Wikimedia (6 months / Half year) anniversary

2016-02-06 Thread Bryan Davis
On Sat, Feb 6, 2016 at 5:27 AM, Alangi Derick <alangider...@gmail.com> wrote:
> Hi Everyone(Devs),
>
> Just want to share with you my achievements in WMF for the past 6
> months i have been here. My stay here is so awesome that even myself
> is amazed about it, i have done so far the following things in this
> organisation and i really see it as a motivation to do more..
>
> - Fixed more than(>) 20 bugs for the org in various extensions and in
> the mediawiki 
> core(https://gerrit.wikimedia.org/r/#/q/owner:D3r1ck01+status:merged,n,z).
>
> - Mentored GCI #2016 that just ended some few weeks ago.
>
> - Authored one extension (Mailgun extension) for WM
>
> - Help in one way or the other to make sure that little problem on IRC
> are solved by new comers in the organisation and also doing the same
> thing to other new comers the way other great guys here did to me when
> i just came in.
>
> I feel so proud of myself and i want to do more and more contributions
> to the organisation. Please i will like you all to guide me so that i
> can stay on the right path and make the Foundation(movement) to be a
> better one. Your comments are highly appreciated. Thanks to all
> members of the WMF dev team :)

Looks like you are off to a pretty good start! Thanks for contributing
to the projects. I'm especially glad to hear that you are helping us
scale by answering questions that come up on irc. It's easy to lose
track of how a few quick answers or even just acting as a rubber duck
[0] for someone who is stuck can really make a difference.

[0]: https://en.wikipedia.org/wiki/Rubber_duck_debugging

Bryan

[Wikitech-l] Help test SessionManager on WMF beta cluster

2016-02-04 Thread Bryan Davis
One of the conclusions from the recent SessionManager rollout failure [0] was:

"we should have recruited and coordinated testing by developers and
users inside and outside of the WMF while the code was only on the
beta testing cluster"

SessionManager is back on the WMF beta cluster [1] now after being
briefly removed for the 1.27.0-wmf.12 release cycle, so an
announcement seems in order. The beta cluster implements a SUL
authenticated wiki farm that is completely separate from the Wikimedia
production SUL system. Helping test SessionManager there would involve
logging in, logging out, creating new user accounts and generally
wandering around the wikis doing things you would normally do in
production while keeping an eye out for session related issues.

If you spot something (or just think you spotted something) file a
Phabricator task with as many details as you can provide and tag it
with the #reading-infrastructure-team project. For session related
issues getting traces of the headers and cookies used in the requests
that are failing is most helpful. You can also poke around in the
logging interface [2] to try and find associated error messages.

If you find other bugs, report them in Phabricator too. :)

Also please remember NOT TO USE passwords in the beta cluster that
match the passwords you use anywhere else on the planet!


[0]: 
https://wikitech.wikimedia.org/wiki/Incident_documentation/20160123-SessionManagerRolloutFailure
[1]: http://deployment.wikimedia.beta.wmflabs.org/wiki/Main_Page
[2]: https://logstash-beta.wmflabs.org/#/dashboard/elasticsearch/default

Bryan

[Wikitech-l] Recent authentication session instability (was Re: Update from the Wikimedia Performance Team)

2016-02-04 Thread Bryan Davis
On Thu, Feb 4, 2016 at 8:20 AM, MZMcBride <z...@mzmcbride.com> wrote:
> Federico Leva (Nemo) wrote:
>>Login pretty much never does what I expect nowadays, but I'm not sure my
>>expectations are correct so I can't identify actual bugs.
>
> There are various open tasks in Phabricator about user sessions currently,
> such as <https://phabricator.wikimedia.org/T124440>. Being unexpectedly
> logged out lately has been a bit annoying, though I don't know if it's
> related to the Performance team or some other team.

The origin of the unexpected logouts falls on the AuthManager project
and specifically the SessionManager component that rolled out in
1.27.0-wmf.11 [0]. We had various issues related to the session
handling changes including a bug that was overloading the storage
capacity of the Redis servers that store session data [1] and two
other issues which required rolling the wikis back to 1.27.0-wmf.10
[2][3].

Both rollbacks were accompanied by a run of the
"resetGlobalUserTokens.php" maintenance script which updates each
user's CentralAuth records in such a way that their authentication
session will be considered invalid the next time it is used on a wiki.
This was done out of an abundance of caution concerning
possible issues with sessions that had been issued by the
SessionManager software. The reset script is not fast [4], so session
invalidation has slowly worked its way across the CentralAuth user
table.

Part of the enhancements that are being applied in order to bring
SessionManager back to production with 1.27.0-wmf.13 is a new config
setting that gives us a nearly instant switch we can throw to
invalidate all active sessions. This setting is actually included
in 1.27.0-wmf.12, but the configuration on the Wikimedia cluster has
not been changed to make use of it yet. Invalidating all user sessions
is certainly not something we plan to do for fun, but there have been
in the past (and likely will be in the future) software and
configuration issues that necessitate the use of that heavy hammer
approach.
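The "heavy hammer" switch can be pictured as a cutoff timestamp checked against every session on every request (the setting name and mechanics here are assumed for illustration; the real MediaWiki configuration works differently):

```python
# Assumed config knob: sessions issued before this epoch second are
# rejected; None means the switch is not thrown. Invented for
# illustration -- not the actual MediaWiki setting.
SESSION_CUTOFF = None

def session_is_valid(issued_at, cutoff=None):
    """Reject any session issued before the configured cutoff."""
    cutoff = SESSION_CUTOFF if cutoff is None else cutoff
    return cutoff is None or issued_at >= cutoff

print(session_is_valid(1000, cutoff=2000))  # issued before cutoff: False
print(session_is_valid(3000, cutoff=2000))  # issued after cutoff: True
```

Flipping one config value then invalidates everything at once, instead of slowly rewriting per-user tokens the way the reset script does.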


[0]: https://phabricator.wikimedia.org/T123451
[1]: https://phabricator.wikimedia.org/T125267
[2]: 
https://wikitech.wikimedia.org/wiki/Incident_documentation/20160123-SessionManagerRolloutFailure
[3]: https://tools.wmflabs.org/sal/log/AVKZtfQXW8txF7J0uNE2
[4]: https://phabricator.wikimedia.org/T124861

Bryan

Re: [Wikitech-l] [Labs-l] New [[Main Page]] for Wikitech

2016-01-29 Thread Bryan Davis
On Fri, Jan 29, 2016 at 9:28 AM, Alex Monk <kren...@gmail.com> wrote:
> Nice. What is needed to get other labs projects listed on the front page
> like tools?

An edit? :) I'd love to see portals created for Beta Cluster and
deployers for example. I'm sure there are some other really useful
projects in Labs that could use highlighting as well. Tool Labs ended
up with top billing based on my presumption that it is the most widely
used Labs project and one of the top reasons that a new viewer who
isn't already familiar with finding things on wikitech would end up
there.

I need to kill my table formatting and replace it with something that
is actually mobile friendly too. The page I made looks horrible on my
phone. :(

Bryan

[Wikitech-l] New [[Main Page]] for Wikitech

2016-01-28 Thread Bryan Davis
I've been working on a little redesign project for the Main Page on
wikitech [0] and three key sub pages it points to since 2016-01-01 in
my User space. Tonight I decided that although it is far from perfect
it is better enough. I hope that some of you like it better than the
old page and that none of you hate it with a fiery passion that
compels you to revert it rather than helping me make it better.

[0]: https://wikitech.wikimedia.org/wiki/Main_Page

Bryan

[Wikitech-l] Giving some attention to the wikitech.wikimedia.org wiki

2016-01-12 Thread Bryan Davis
(cross-posted to labs-l and wikitech-l)

I've been thinking quite a bit about Tool Labs developers over the
past couple of months. Last week I was in San Francisco for the
Wikimedia Developer Summit and had an opportunity to talk to a number
of people about some of my ideas. One of them that got a pretty good
reception was a few proposed changes to the wikitech.wikimedia.org
wiki.

I've started "Make wikitech more friendly for the multiple audiences
it supports" phab task [0] and started adding blocker tasks for
various ideas that have come up in the discussions. One is to enable a
"Portal" namespace on wikitech [1] to be used to make landing pages
for different audiences. Another is to add a "Tool" namespace [2] that
the various Tool Labs projects could use to document their projects. A
third is to disable searching "Nova resource" pages by default [3].

The wikitech wiki doesn't currently have what anyone would consider an
active on-wiki community to seek consensus from on topics like this so
I am reaching out to what I hope are relevant mailing lists to gather
feedback. Discussion of why any of these ideas is a great or horrible
idea is welcome here on list or possibly even better on the
phabricator tasks themselves. Again since we don't have a strong
social norm for when consensus has been reached for this wiki I'm
going to arbitrarily suggest that discussion remain open until
2016-01-20 after which time I will take further steps to have the new
namespaces enabled (or go back to the drawing board if the concept is
rejected).

[0]: https://phabricator.wikimedia.org/T123425
[1]: https://phabricator.wikimedia.org/T123427
[2]: https://phabricator.wikimedia.org/T123429
[3]: https://phabricator.wikimedia.org/T122993

Bryan

Re: [Wikitech-l] Trouble while setting up mediawiki-vagrant

2016-01-07 Thread Bryan Davis
On Thu, Jan 7, 2016 at 1:32 PM, Devang Gaur <devang.o...@gmail.com> wrote:
> Thanks Merlijn! That worked.
>
> Now , I am interested in Wiktionary and Wikidata and want to
> contribute to them . I've setup mediawiki-vagrant successfully
> and running it. Now, how do I set up and run Wiktionary
> and Wikidata on my local machine? I didn't see any
> clear guidelines for that . I think , one probably needs
> to utilize the memory dumps of the projects .

There is a wikidata role for MediaWiki-Vagrant that you can enable:
`vagrant roles enable wikidata && vagrant provision`. The role only
includes the wikidata software and not any dump data.
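For reference, the full sequence from the root of a MediaWiki-Vagrant checkout would look something like this (a sketch based on the `vagrant roles` subcommands MediaWiki-Vagrant provides; provisioning time will vary):

```shell
# From the root of your mediawiki-vagrant checkout:
vagrant roles enable wikidata  # turn on the Wikidata role
vagrant provision              # run Puppet to install and configure it
vagrant roles list             # optional: review roles; enabled ones are marked
```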

> Where can I find these dumps ? I'll
> probably write an article on the contributors section
> too after discovering the path myself .

Dumps are available from https://dumps.wikimedia.org/backup-index.html
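The dump files themselves follow a predictable URL layout of `<wiki>/<YYYYMMDD>/<wiki>-<YYYYMMDD>-<file>`, so a script can construct download links directly. A small sketch (the wiki name, date, and helper name here are hypothetical examples, not part of any official API):

```python
def dump_url(wiki: str, date: str,
             suffix: str = "pages-articles.xml.bz2") -> str:
    """Build a dumps.wikimedia.org download URL for a given wiki and dump date."""
    return f"https://dumps.wikimedia.org/{wiki}/{date}/{wiki}-{date}-{suffix}"

print(dump_url("wikidatawiki", "20160101"))
# → https://dumps.wikimedia.org/wikidatawiki/20160101/wikidatawiki-20160101-pages-articles.xml.bz2
```

Whether a given dump date actually exists still needs to be checked against the backup index page above.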

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] How to get started?

2015-12-12 Thread Bryan Davis
On Sat, Dec 12, 2015 at 12:57 PM, Devang Gaur <devang.o...@gmail.com> wrote:
> Hi there, I have some experience with PHP and Javascript and am also
> learning Node.js . I would like to contribute to the community with code.
> Need suggestions ,instructions or documentation to get started.Anybody?
> Thanks.

Welcome Devang. There are some suggestions for getting started in
contributing to MediaWiki and related software at
<https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker>.
Come back to this mailing list or the #mediawiki or #mediawiki-dev IRC
channels on Freenode with specific questions when you get stuck.

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] WMF Survey: Please help us understand third-party use of Wikipedia's content

2015-12-03 Thread Bryan Davis
On Mon, Nov 16, 2015 at 1:29 PM, Sylvia Ventura <svent...@wikimedia.org> wrote:
> Dear Wikipedia data/API user,
>
> The WMF’s Engineering, Product and Partnerships teams are conducting a
> short survey to help us understand how organizations are pulling and using
> data from our projects. This information will inform future features and
> improvements to our data tools and APIs.
>
> We would appreciate a few minutes of your time. The link to the Survey
> below will take you to a Google Form - there is no need to sign up to fill
> out the survey. The survey should take no more than 10 minutes.
>
> https://docs.google.com/forms/d/1yUrHzyLABN419RCDbzepjoRWCbaWYV4wbtbKPa95C4o/viewform?usp=send_form
>
> Thank you for your input and feedback!

I heard from Sylvia this week that we got a grand total of ONE
response from the mailing list postings of this survey. If any of you
saw this request the first time around and thought you might have
useful input, but just didn't get around to filling out the survey, I
would encourage you to take 10 minutes and do so now.

The questions are very high level and open ended, but primarily seek
an overview of how the dumps, the Action API, IRC edit notifications,
RCStream, and the other tools and services that the Wikimedia
Foundation or others provide for off-wiki access to Wikimedia
community-created and curated content are used. One of the hoped-for
outcomes of this and related surveys is guidance for the Wikimedia
Foundation on areas that deserve increased focus in the future.

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] Peer-to-peer sharing of the content of Wikipedia through WebRTC

2015-11-30 Thread Bryan Davis
On Mon, Nov 30, 2015 at 4:03 PM, Brian Wolff <bawo...@gmail.com> wrote:
>
> If we wanted to address such a situation, it sounds like it would be
> less complex to just setup a varnish box (With access to the HTCP
> cache clear packets), on that campus.

This is an idea I've casually thought about but never put any real
work towards. It would be pretty neat to have something similar to the
Netflix Open Connect appliance [0] available for Wikimedia projects.

[0]: https://openconnect.netflix.com/

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] MW support for Composer equivalent for JavaScript packages

2015-11-06 Thread Bryan Davis
ng you did not depend on. But that could be dealt with by either
>>> adding it as a dep yourself or running dedupe.
>>>
>>> 
>>>
>>> The bad news is that while more and more libraries are moving to npm the
>>> underlying reason for many being on npm is for the many users using
>>> tools like browserify and webpack. So the assumption of many libraries
>>> is that when you use npm for client-side libraries is you are using it
>>> in a node-style/CommonJS way where require is available other npm
>>> dependencies can be used through it and you're not including things in
>>> the traditional way of libraries being run one after another on the page
>>> in the global scope like in ResourceLoader.
>>>
>>> It's probably going to get harder and harder to reconcile deps shared
>>> between extensions and/or use library code itself without having node
>>> installed on servers. For that matter half the advancements in the css
>>> space probably won't be making their way to php either.
>>>
>>> Though I do have some hints to things that might open up some
>>> possibilities.
>>>
>>> Firstly browserify can build standalone versions of libraries. It's
>>> possible you could make tweaks to that pattern and have a development
>>> build step that would convert npm packages into modules built to run in
>>> a new type of ResourceLoader that would probably be a hybrid of the
>>> current RL and npm. However you would need to take care to exclude
>>> things or else you'll end up with a script that will duplicate the
>>> source of a bunch of other modules. You also have to be wary of handling
>>> the cases where npm intentionally duplicates a library.
>>>
>>> This actually may be even more difficult than it sounds due to npm
>>> packages that require modules inside a package rather than just the main
>>> module. You might need to go even deeper into browserify, substituting
>>> one of the last steps so it outputs a different format with objects that
>>> have metadata on the individual modules intended for use in a special
>>> loader instead of as a plain script tag.
>>>
>>> Current RL on its own just won't work for this. Globals don't work well
>>> in the module pattern and so the ability to get a result from require()
>>> and allow npm modules to require() others are necessary. Circular
>>> requires also need consideration.
>>>
>>> Whether you want a RL api that has a bridge to the npm ecosystem or want
>>> to go all-in to the modern js landscape and build a more
>>> CommonJS/node/npm-like RL api that loads things according to MW's use
>>> cases will be the important bike-shedding.
>>>
>>> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>>>
>>> On 2015-11-05 8:32 AM, Trevor Parscal wrote:
>>>> The flat approach to NPM is a game changer for us, and a Bower killer.
>>>>
>>>> Timo? Had a lot of insight at the time, I'd like to see him be invoked in
>>>> this decision. Any thoughts Timo?
>>>>
>>>> - Trevor
>>>>
>>>> On Thursday, November 5, 2015, Jon Robson <jrob...@wikimedia.org> wrote:
>>>>
>>>>> It's been a year now, over 3-6 months. Volker can this be given a
>>>>> priority in the next frontend standards meeting. I think the lack of
>>>>> any standard is extremely damaging to the project.
>>>>>
>>>>> On Wed, Jul 22, 2015 at 4:21 PM, Bryan Davis <bd...@wikimedia.org
>>>>> <javascript:;>> wrote:
>>>>>> On Wed, Jul 22, 2015 at 12:24 PM, Daniel Werner
>>>>>> <daniel.a.r.wer...@gmail.com <javascript:;>> wrote:
>>>>>>> Hey,
>>>>>>>
>>>>>>> I just wanted to check in about the status of enabling JavaScript
>>>>> package
>>>>>>> management usage in MediaWiki. I am basically talking about an
>>>>> equivalent
>>>>>>> for JS to what we have with Composer for PHP.
>>>>>>>
>>>>>>> Real-world example:
>>>>>>>   The "data-values/value-view" package[0] is defining
>>>>>>> "jquery.event.special.eachchange.js":
>>>>>>> ValueView/lib/jquery.event/jquery.event.special.eachchange.js
>>>>>>>
>>>>>>>   Now, recently I needed the same func

Re: [Wikitech-l] Random rant

2015-10-28 Thread Bryan Davis
On Wed, Oct 28, 2015 at 10:10 AM, Aaron Halfaker
<ahalfa...@wikimedia.org> wrote:
>3. Is there a public conversation about this transition that I can
>participate in?

Yes! https://meta.wikimedia.org/wiki/Requests_for_comment/OAuth_handover


Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] Would anyone be interested in a tech talk about how to make a mw skin?

2015-09-28 Thread Bryan Davis
On Mon, Sep 28, 2015 at 3:00 PM, Isarra Yos <zhoris...@gmail.com> wrote:
> Jack Phoenix and I were considering doing a tech talk about how to make a
> MediaWiki skin. Would there be any interest in this? What would you folks
> want to see from such a talk?

I'd love to get an overview on creating a new skin. I would be
especially interested in the "tales from the real world" aspects of
working with RL, what is and isn't possible without implementing
hooks, and lessons learned from past skins the two of you have worked
on.

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] Interested in working on a WikiWidget for algorithm visualization

2015-09-11 Thread Bryan Davis
On Fri, Sep 11, 2015 at 3:52 PM, Daniel Moisset <dmois...@machinalis.com> wrote:
> Hi,
> I'm one of the developers of thewalnut.io, a platform for authoring and
> sharing algorithm visualizations. While working on a visualization of
> Langton's ant, I ran into the WikiWidget at
> https://es.wikipedia.org/wiki/Hormiga_de_Langton and started looking at how
> you guys are doing these, given that it has some similarities to our work.
>
> My idea is that by creating a wikiwidget that can somehow integrate the
> content built in the walnut, assuming it gets incorporated into Wikipedia,
> it would allow many people to easily create and add interactive
> visualizations for algorithms into wikipedia articles with much less work
> than now where each wikiwidget for each algorithm needs to be created from
> scratch[1]. Instead of just a few wikiwidgets like now it could be
> something commonplace in CS related articles. I also see this as an
> opportunity to get people from visualization communities closer to the
> wikipedia community. And I also see this as an improved experience for
> visitors, given that the walnut gives the flexibility to users of modifying
> and creating alternate versions of algorithms and visualizations.
>
> So, in short, I'm sending this here to see if the community shares this
> interest on moving forward into this direction; specifically if there's
> interest on this (I'm more than happy about doing the development work
> myself). I'm new here so I'm not even sure if this is the right mailing
> list, feel free to redirect me if I'm at the wrong place :)
>
> As a bit of context, I'd like to clarify a few things that I assume might
> be relevant for you... I'm not at all an active member in the
> mediawiki/wikipedia development community, and a very minor wikipedia
> contributor; I'm mostly just a heavy wikipedia user. But I've been an
> active participant in many open source project and free software
> communities for more than 15 years, so I am quite aware that my invitation
> will raise many concerns about the dangers of integrating an external app
> that you don't control. I know because I'd have those concerns if I was in
> your place. With that in the open I'd like to hear about what we can do to
> ease those concerns; if you like the general idea, we're really open to
> explore possibilities like releasing code, data, licensing terms on
> content, or whatever you think is needed to make this possible.
>
> Thanks for your time,
>
>D.
>
> [1] Just so you have a reference, having an interactive  animation about
> the langton's ant example took me a bit about 40 minutes; 50 in total if
> you count the second visualization I made for density of visits when a
> friend asked "are some locations more important than others?"

The high level and non-negotiable needs for integration with Wikipedia
or the sister projects would be:
* Freely licensed client (OSI approved license)
* Offline animated image rendering solution for user-agents that
can't/won't use the client
* Ability to host freely licensed "models" on Commons
* No client interaction with servers not managed by the Wikimedia
Foundation on behalf of the larger Wikimedia movement.

Ideally the authoring tools would also be freely licensed and capable
of being integrated with MediaWiki under the same hosting terms listed
for the client.

The bar is lower for creating an extension that would allow
non-Wikimedia wikis powered by MediaWiki to integrate with your hosted
service. See <https://www.mediawiki.org/wiki/Manual:Developing_extensions>
and <https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker>
for how you and your team could get started in that direction.

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855


Re: [Wikitech-l] Tech Talk: ELK: Elasticsearch, Logstash and Kibana at Wikimedia

2015-08-20 Thread Bryan Davis
On Mon, Aug 10, 2015 at 1:47 PM, Rachel Farrand <rfarr...@wikimedia.org> wrote:
> Please join for the following tech talk:
>
> *Tech Talk:* ELK: Elasticsearch, Logstash and Kibana at Wikimedia
> *Presenter:* Bryan Davis
> *Date:* August 20, 2015
> *Time:* 17:30 UTC

Slides: https://commons.wikimedia.org/wiki/File:ELK_Tech_Talk_2015-08-20.pdf
Video: 
https://commons.wikimedia.org/wiki/File:Elasticsearch,_Logstash_and_Kibana_at_Wikimedia.webm

Thanks to everyone who watched and participated.

Bryan
-- 
Bryan Davis              Wikimedia Foundation    <bd...@wikimedia.org>
[[m:User:BDavis_(WMF)]]  Sr Software Engineer    Boise, ID USA
irc: bd808                                       v:415.839.6885 x6855

