Re: [Wikitech-l] [ Writing a MediaWiki extension for deployment ]

2015-07-06 Thread Tyler Romeo
For the record, Extension:TwoFactorAuthentication is deprecated and in the 
process of being merged into Extension:OATHAuth.

Also, rather than making a brand-new extension, I'd recommend just adding Latch 
support to OATHAuth! :) I'm completely open to supporting new 2FA methods, 
and giving wiki administrators options in what they want to use.
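
(For anyone wondering what a TOTP-style 2FA method boils down to under the hood, 
here is a rough PHP sketch. To be clear, this is not OATHAuth's actual code; the 
function names and the raw binary secret are assumptions I made to keep the 
example short.)

<?php
// Rough, illustrative TOTP check (RFC 6238) -- not OATHAuth's real code.
// Assumes the shared secret is raw binary (real implementations usually
// store it base32-encoded).
function totpCode( $secret, $timestamp, $step = 30, $digits = 6 ) {
    $counter = (int)floor( $timestamp / $step );
    $bytes = pack( 'N2', 0, $counter ); // 64-bit big-endian counter, high word 0
    $hash = hash_hmac( 'sha1', $bytes, $secret, true );
    $offset = ord( $hash[19] ) & 0x0f; // dynamic truncation per RFC 4226
    $value = ( ( ord( $hash[$offset] ) & 0x7f ) << 24 )
        | ( ord( $hash[$offset + 1] ) << 16 )
        | ( ord( $hash[$offset + 2] ) << 8 )
        | ord( $hash[$offset + 3] );
    return str_pad( $value % pow( 10, $digits ), $digits, '0', STR_PAD_LEFT );
}

function totpVerify( $secret, $userCode, $window = 1 ) {
    // Accept neighbouring 30-second windows to tolerate clock drift.
    // hash_equals() needs PHP >= 5.6.
    for ( $i = -$window; $i <= $window; $i++ ) {
        if ( hash_equals( totpCode( $secret, time() + $i * 30 ), $userCode ) ) {
            return true;
        }
    }
    return false;
}

Latch, as I understand it, works differently (it is a server-side authorization 
toggle rather than a one-time code), so a Latch method would mostly be an HTTP 
call to their API rather than the arithmetic above.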

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

On July 6, 2015 at 09:29:50, Pine W (wiki.p...@gmail.com) wrote:

I for one would be interested in moving that extension out of beta and
having an option in site configuration to require some or all users to have
two factor authentication enabled.

Pine
On Jul 6, 2015 9:20 AM, "Alex Monk"  wrote:

> We use https://www.mediawiki.org/wiki/Extension:OATHAuth on
> wikitech.wikimedia.org
> https://www.mediawiki.org/wiki/Extension:TwoFactorAuthentication also
> exists
>
> On 6 July 2015 at 17:14, Paula  wrote:
>
> > Hello,
> > I'm writing an extension for MediaWiki so users can use a second factor
> > authentication in their MediaWiki accounts.
> >
> > I would like to know if there is any project on this topic, or if there
> is
> > already someone working on something like this.
> >
> > I'm planning on working in a MediaWiki extension that uses Latch (
> > https://latch.elevenpaths.com/www/service.html ) as a second factor.
> >
> > This is my first time collaborating on a project this size so any advice
> > will be very welcome.
> >
> >
> >
> > Thank you for your time.
> >
> > Cheers,
> > Paula.
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Min php version

2015-07-20 Thread Tyler Romeo
Just as a counter-argument (and, to be clear, I do support raising our minimum 
version): just because PHP has EOL'ed a version does not mean that distributions 
(esp. Debian, Ubuntu) stop providing additional support and security updates for 
it.

If I remember correctly from the last time we had this discussion, it will still 
be a couple more months before PHP 5.3 is no longer supported by most major 
distros, and it will be a while before 5.4 is no longer supported.

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Min php version

2015-07-21 Thread Tyler Romeo
One thing I forgot to mention: while you're considering Debian and Ubuntu 
support, make sure to also take into account MediaWiki support.

Even if we upgrade our minimum PHP version now, older versions of MediaWiki 
with the 5.3 requirement will still be supported and receive security updates. 
So the only difference will be that people running Debian oldstable will be 
locked into our older version and not be able to upgrade to bleeding edge 
MediaWiki, which they probably won't do anyway considering they haven't even 
upgraded their Debian. :P

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Code of conduct

2015-08-08 Thread Tyler Romeo
On Sun, Aug 9, 2015 at 2:43 AM, Moriel Schottlender 
wrote:

> Am I missing something?


When an employee of the WMF starts a new topic on the mailing list with the
words "we are" and "binding", it comes off with the wrong connotation,
especially considering the "we" was clarified to mean the "Wikimedia
technical community".

It sounds more like "you've already decided to do this" rather than "we
thought this might be a good idea and want everybody else's input". You
should naturally expect people to speak out against something when you make
it sound like the decision has already been made.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Code of conduct

2015-08-22 Thread Tyler Romeo
On Sat, Aug 22, 2015 at 2:40 PM, Brian Wolff  wrote:

> For example, MediaWiki is intentionally GPLv2, not GPLv3.


MediaWiki is not intentionally GPLv2. It merely is v2 now and the community
cannot come to a consensus on whether to change it, thus it remains in its
current stagnant state.

And this is exactly what rupert is talking about: policies and other
important documents and decisions like this are very rarely brought into
the modern age, either because the WMF does not have the resources to update
them or because the community cannot come to a decision on what updates to
make.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Engineering] Code of conduct

2015-08-23 Thread Tyler Romeo
On Sun, Aug 23, 2015 at 2:24 PM, Oliver Keyes  wrote:

> "bad things
> are happening, why don't we do X" is to literally google "why isn't
> [the obviously simple thing I thought of] a good idea?", and see what
> smart people have already written
>

Ironically, I tried to play devil's advocate by searching "why isn't a code
of conduct a good idea", and pretty much every result was an article on how
codes of conduct *are* a good idea.

That aside, I think most of the people in this thread are already well aware
that "admins" is a very generic term, and that even so, under most
definitions of the word, "admins" would probably not be a good enforcing
body for a code of conduct.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The "other" developers at the Wikimedia Developer Summit

2015-09-15 Thread Tyler Romeo
On Tue, Sep 15, 2015 at 12:43 PM, Dan Garry  wrote:

> However, you're right that one would not expect such people to propose
> architecture changes in MediaWiki; that's not really their domain. However,
> to me that does not seem necessary in order to attend the developer summit.
> Their perspectives as people involved in the development of MediaWiki and
> Wikimedia projects would be welcome.
>

I think this is the key point. Remember that this is the "developer summit"
and not an "architecture summit", and as such anything MediaWiki-related is
up for grabs. And if the topic of templates comes up, specifically what we
need to fix in templates or what new features we should focus on next, who
better to have there for that than the users of templates themselves?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing the launch of Maps

2015-09-17 Thread Tyler Romeo
I feel like there's an argument to be made that the images are a derivative
work of the data itself, so it may be best to just stick with OSM's license
rather than run into any possible issues. (Also, of course, I am biased and
prefer copyleft licenses since they support free information.)


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Thu, Sep 17, 2015 at 3:49 PM, Ryan Kaldari 
wrote:

> And in case that isn't decided, I would suggest CC0 so that we don't have
> to link to a big explanation from each tile rendering :)
>
> On Thu, Sep 17, 2015 at 12:40 PM, Ryan Kaldari 
> wrote:
>
> > What is the license for the map tiles, i.e. the cartography rather than
> > the data? Open Street Maps uses CC-BY-SA, although they don't really make
> > it clear who is supposed to be attributed. It looks like our tiles are
> > different though.
> >
> > On Thu, Sep 17, 2015 at 12:26 PM, Tomasz Finc 
> wrote:
> >
> >> The Discovery Department has launched an experimental tile and static
> maps
> >> service available at https://maps.wikimedia.org.
> >>
> >> Using this service you can browse and embed map tiles into your own
> tools
> using OpenStreetMap data. Currently, we handle traffic from *.wmflabs.org
> and *.wikivoyage.org (referrer header must be either missing or set to
> >> these values) but we would like to open it up to Wikipedia traffic if we
> >> see enough use. Our hope is that this service fits the needs of the
> >> numerous maps developers and tool authors who have asked for a WMF
> hosted
> >> tile service with an initial focus on WikiVoyage.
> >>
> >> We'd love for you to try our new service, experiment writing tools using
> >> our tiles, and giving us feedback <
> >> https://www.mediawiki.org/wiki/Talk:Maps> .
> >> If you've built a tool using OpenStreetMap-based imagery then using our
> >> service is a simple drop-in replacement.
> >>
> >> Getting started is as easy as
> >> https://www.mediawiki.org/wiki/Maps#Getting_Started
> >>
> >> How can you help?
> >>
> >> * Adapt your labs tool to use this service - for example, use Leaflet js
> >> library and point it to https://maps.wikimedia.org
> >> * File bugs in Phabricator
> >> <https://phabricator.wikimedia.org/tag/discovery-maps-sprint/>
> >> * Provide us feedback to help guide future features
> >> <https://www.mediawiki.org/wiki/Talk:Maps>
> >> * Improve our map style <https://github.com/kartotherian/osm-bright.tm2>
> >> * Improve our data extraction
> >> <https://github.com/kartotherian/osm-bright.tm2source>
> >>
> >> Based on usage and your feedback, the Discovery team
> >> <https://www.mediawiki.org/wiki/Discovery> will decide how to proceed.
> >>
> >> We could add more data sources (both vector and raster), work on
> >> additional
> >> services such as static maps or geosearch, work on supporting all
> >> languages, switch to client-side WebGL rendering, etc. Please help us
> >> decide what is most important.
> >>
> >> https://www.mediawiki.org/wiki/Maps has more about the project and
> >> related
> >> Maps work.
> >>
> >> == In Depth ==
> >>
> >> Tiles are served from https://maps.wikimedia.org, but can only be
> >> accessed
> >> from any subdomains of *.wmflabs.org and *.wikivoyage.org.
> Kartotherian
> >> can produce tiles as images (png), and as raw vector data (PBF Mapbox
> >> format or json):
> >>
> >> .../{source}/{zoom}/{x}/{y}[@{scale}x].{format}
> >>
> >> Additionally, Kartotherian can produce snapshot (static) images of any
> >> location, scaling, and zoom level with
> >>
> >> .../{source},{zoom},{lat},{lon},{width}x{height}[@{scale}x].{format}.
> >>
> >> For example, to get an image centered at 42,-3.14, at zoom level 4, size
> >> 800x600, use
> >> https://maps.wikimedia.org/img/osm-intl,4,42,-3.14,800x600.png
> >> (copy/paste the link, or else it might not work due to referrer
> >> restriction).
> >>
> >> Do note that the static feature is highly experimental right now.
> >>
> >> We would like to thank WMF Ops (especially Alex Kosiaris, Brandon Black,
> >> and Jaime Crespo), services team, OSM community and engineers, and the
> >> Mapnik and Mapbox teams. The project would not have completed so fast
> >> without you.
> >>
> >> Thank You
> >>
> >> --tomasz
> >> ___
> >> Wikitech-l mailing list
> >> Wikitech-l@lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> >
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Getting mediawiki core to pass codesniffer standard

2015-09-25 Thread Tyler Romeo
On Fri, Sep 25, 2015 at 2:08 PM, Legoktm 
wrote:

> I've tried out a different approach in
> <https://gerrit.wikimedia.org/r/#/c/241085/>, by disabling all the
> failing rules. This will let us make the rules that do pass voting (e.g.
> no closing ?> tags), and we can selectively enable failing rules instead
> of trying to make giant patches and hope no one introduces regressions
> while it's still non-voting.
>

Very much agree with this post. I have worked on phpcs compliance on other
projects at my company, and we have found it vastly easier to stage it in over
time; that way, at the very least, additional errors are not accidentally
introduced.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] AWS usage

2015-10-07 Thread Tyler Romeo
On Wed, Oct 7, 2015 at 11:27 AM, Greg Grossmeier  wrote:

> As in, they would not be experiencing any outage because they are not
> using Travis. The very topic of this thread.
>

Nowhere in this thread is the uptime of Travis or AWS mentioned as far as I
see, so it's understandable to be confused as to why we're "lucky" to not
be using Travis.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GitLab CI (was RFC: Optional Travis Integration for Jenkins)

2015-10-07 Thread Tyler Romeo
I hate to say it, but this is exactly why we should prefer copyleft
licenses. GitLab began as completely FOSS, but later split into CE and EE,
which they were able to do because they used MIT.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Wed, Oct 7, 2015 at 11:26 AM, Greg Grossmeier  wrote:

> Thanks Brian for your thoughts.
>
> Only commenting on one small part:
>
> 
> > *Pros*
> >
> >- FLOSS
>
> ...ish. Not really. "Open Core".
>
> See: https://en.wikipedia.org/wiki/GitLab#History
>
> A view on why Open Core isn't healthy for FLOSS communities:
> http://ebb.org/bkuhn/blog/2009/10/16/open-core-shareware.html
>
>
> Greg
>
> --
> | Greg GrossmeierGPG: B2FA 27B1 F7EB D327 6B8E |
> | identi.ca: @gregA18D 1138 8E47 FAC8 1C7D |
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GitLab CI (was RFC: Optional Travis Integration for Jenkins)

2015-10-07 Thread Tyler Romeo
On Wed, Oct 7, 2015 at 12:34 PM, Brian Gerstle 
wrote:

> Thanks for the correction & link, Greg, will check that out.  Is this
> different than GitHub's approach?  Is it better or worse?
>

GitHub is entirely proprietary. So technically yes, GitLab's approach is a
little better, since at least some core part of the software remains open.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GitLab licensing (was Re: GitLab CI)

2015-10-07 Thread Tyler Romeo
That's a really good point, and something I hadn't thought about.

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: Greg Grossmeier 
Reply: Wikimedia developers 
Date: October 7, 2015 at 13:19:55
To: wikitech-l@lists.wikimedia.org 
Subject:  [Wikitech-l] GitLab licensing (was Re: GitLab CI)  


> On Wed, Oct 7, 2015 at 12:34 PM, Brian Gerstle 
> wrote:
>  
> > Thanks for the correction & link, Greg, will check that out. Is this
> > different than GitHub's approach? Is it better or worse?
> >
>  
> GitHub is entirely proprietary. So technically yes, GitLab's approach is a
> little better, since at least some core part of the software remains open.

With my non-work hat on[0], it's not really a matter of "better" or
"worse". GitHub doesn't mince words; they are proprietary.

Gitlab, however, has tremendously annoyed the FLOSS community[1] by
doing a "bait and switch" type move. Promoting their openness but
also keeping more than just the "secret sauce" locked up.

See the things which they keep proprietary:
https://about.gitlab.com/features/#compare
Basic things like "git hooks". :/

Not a great example working relationship with FLOSS partners.


Greg

[0] Sending with my personal email address but via my wmf gmail account
because I'm not subscribed to wikitech-l with my personal email.

[1] At least those who I talk with, including those in prominent FLOSS
community positions, who were hoping for gitlab to be great.

--  
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] JetBrains IDE licenses for MW developers - php, js/node, python, C/C++, puppet, ruby, ...

2015-10-21 Thread Tyler Romeo
On Thu, Oct 22, 2015 at 12:14 AM, Ricordisamoa  wrote:

> How come the FOSS community hasn't managed to develop a suitable
> alternative yet?!?


If I had the time, I would definitely put together some sort of
.dir-locals.el for MediaWiki, so that we could make better use of Emacs,
since it has a bunch of IDE-like functionality, even if it's not as powerful
as IDEA.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Short license blocks

2015-10-27 Thread Tyler Romeo
IANAL, but the GPL explicitly prescribes adding the header to every source
file to "most effectively state the exclusion of warranty".

If the legal guys can give the OK that we don't necessarily need that, then
we can definitely remove the notices and replace them with something
smaller.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Tue, Oct 27, 2015 at 5:44 AM, Gergo Tisza  wrote:

> In a recent blog post ( http://esr.ibiblio.org/?p=6867 ) ESR writes:
>
> High on my list of Things That Annoy Me When I Hack is sourcefiles that
> > contain huge blobs of license text at the top. That is valuable territory
> > which should be occupied by a header comment explaining the code, not a
> > boatload of boilerplate that I’ve seen hundreds of times before.
>
>
> ...and then goes on to explain using SPDX identifiers to refer to licenses,
> which would look something like this:
>
> /* Copyright 2015 by XYZ
>  * SPDX-License-Identifier: GPL-2.0+
>  */
>
> Any objections to making that the new standard / replacing existing blocks
> with this? It would make the PHP files a little more readable.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Short license blocks

2015-10-27 Thread Tyler Romeo
Are you suggesting adopting the short license blocks, or the MIT license?
Because I'm not sure how the licenses of extensions would affect the
license headers in core.
On Oct 27, 2015 12:43, "Ryan Kaldari"  wrote:

> 
> I totally support switching to license identifiers instead of headers,
> provided that we also switch our licensing from GPL to MIT or BSD ;)
> 
>
> On a serious note, we do have a fair number of extensions that are MIT
> Licensed and could go ahead and adopt this (
> https://www.mediawiki.org/wiki/Category:MIT_licensed_extensions).
>
>
> On Tue, Oct 27, 2015 at 3:44 AM, Gergo Tisza  wrote:
>
> > In a recent blog post ( http://esr.ibiblio.org/?p=6867 ) ESR writes:
> >
> > High on my list of Things That Annoy Me When I Hack is sourcefiles that
> > > contain huge blobs of license text at the top. That is valuable
> territory
> > > which should be occupied by a header comment explaining the code, not a
> > > boatload of boilerplate that I’ve seen hundreds of times before.
> >
> >
> > ...and then goes on to explain using SPDX identifiers to refer to
> licenses,
> > which would look something like this:
> >
> > /* Copyright 2015 by XYZ
> >  * SPDX-License-Identifier: GPL-2.0+
> >  */
> >
> > Any objections to making that the new standard / replacing existing
> blocks
> > with this? It would make the PHP files a little more readable.
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Short license blocks

2015-10-27 Thread Tyler Romeo
The Apache license, which is also permissive, has a similar recommended
file header.

I'd say we just standardize on having the warranty disclaimer and license
notice in every file. It's an easy way to make sure somebody reading
the file can tell the license without having to maintain
comprehensive authorship information in every file.
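
Concretely, the sort of per-file notice I have in mind is the block most
MediaWiki PHP files already open with -- a short license notice plus the
warranty disclaimer -- along these lines (roughly the standard GNU-recommended
wording):

<?php
/**
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 */
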
On Oct 27, 2015 14:17, "Ryan Kaldari"  wrote:

> I was saying that we could go ahead and make this the standard for non-GPL
> MediaWiki code (basically, the few MIT licensed extensions). I'm not sure
> if the advantage of doing that would outweigh the disadvantage of having a
> non-standard standard though.
>
> On Tue, Oct 27, 2015 at 11:06 AM, Tyler Romeo 
> wrote:
>
> > Are you saying adopting the short license blocks? Or the MIT license?
> > Because I'm not sure how the licenses of extensions would affect the
> > license headers in core.
> > On Oct 27, 2015 12:43, "Ryan Kaldari"  wrote:
> >
> > > 
> > > I totally support switching to license identifiers instead of headers,
> > > provided that we also switch our licensing from GPL to MIT or BSD ;)
> > > 
> > >
> > > On a serious note, we do have a fair number of extensions that are MIT
> > > Licensed and could go ahead and adopt this (
> > > https://www.mediawiki.org/wiki/Category:MIT_licensed_extensions).
> > >
> > >
> > > On Tue, Oct 27, 2015 at 3:44 AM, Gergo Tisza 
> > wrote:
> > >
> > > > In a recent blog post ( http://esr.ibiblio.org/?p=6867 ) ESR writes:
> > > >
> > > > High on my list of Things That Annoy Me When I Hack is sourcefiles
> that
> > > > > contain huge blobs of license text at the top. That is valuable
> > > territory
> > > > > which should be occupied by a header comment explaining the code,
> > not a
> > > > > boatload of boilerplate that I’ve seen hundreds of times before.
> > > >
> > > >
> > > > ...and then goes on to explain using SPDX identifiers to refer to
> > > licenses,
> > > > which would look something like this:
> > > >
> > > > /* Copyright 2015 by XYZ
> > > >  * SPDX-License-Identifier: GPL-2.0+
> > > >  */
> > > >
> > > > Any objections to making that the new standard / replacing existing
> > > blocks
> > > > with this? It would make the PHP files a little more readable.
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Short license blocks

2015-10-27 Thread Tyler Romeo
> Who are "the legal guys"? Do they assume liability for future
> claims with regard to that matter so that authors are
> effectively indemnified?

That's a very good point. Sometimes I forget that MediaWiki
does not use a CLA (which is another discussion).

In that case I take back what I said, and maintain that I'd
like the license header to be kept.

Regards,
-- 
Tyler Romeo 
https://parent5446.nyc
0x405D34A7C86B42DF


signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Short license blocks

2015-10-27 Thread Tyler Romeo
On Tue, Oct 27, 2015 at 9:19 PM, Platonides  wrote:

> Regarding "keeping the big header is important", I don't think anyone
> barely into CS  on this century can not know what the GPL is (and not
> figure out in 5 minutes).
>
> An excerpt like this would be perfectly fine imho:
> «This MediaWiki file is licensed under the terms of GPL 2 or later, as
> published in http://www.gnu.org/licenses/gpl2 See the COPYING file for
> details.»
>

It has nothing to do with whether or not they know the GPL. As the FLC link
explains, it is about clearly conveying that the code is indeed copyrighted
under a certain license, such that somebody who violates the license cannot
claim innocence or otherwise try to push some of the blame onto us.

That said, I really don't think bikeshedding over reducing a license header
from 4 lines of text to 2 makes much of a difference. I'd much rather
use the format advised by GNU itself, considering they did write the
license.

(As for the whole warranty disclaimer, which takes up its own 4 lines, I do
not know enough to argue for or against whether it is necessary, since
software liability is a separate legal device from copyright.)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`

2015-11-05 Thread Tyler Romeo
Using VMs does not take any "technical enlightenment". There are 
containerization tools that are relatively simple to use in small, non-scaling 
environments, and learning them would take just as much time as learning how to 
use npm.

(Note, I'm not arguing against this solution, which I think is pretty cool. I 
just wanted to push back on the notion that npm is somehow far easier to use 
than any other solution.)

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: Yongmin Hong 
Reply: Wikimedia developers 
Date: November 5, 2015 at 23:36:07
To: Wikimedia developers 
Subject:  Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`  

On Nov 6, 2015 at 8:26 AM, "Ryan Lane"  wrote:
>
> Is this simply to support hosted providers? npm is one of the worst package
> managers around. This really seems like a case where thin docker images and
> docker-compose really shines. It's easy to handle from the packer side,
> it's incredibly simple from the user side, and it doesn't require
> reinventing the world to distribute things.
>
> If this is the kind of stuff we're doing to support hosted providers, it
> seems it's really time to stop supporting hosted providers. It's $5/month
> to have a proper VM on digital ocean. There's even cheaper solutions
> around. Hosted providers at this point aren't cheaper. At best they're
> slightly easier to use, but MediaWiki is seriously handicapping itself to
> support this use-case.
>
>

Please remember, not everyone is technically enlightened enough to use VMs.
--
revi
https://revi.me
-- Sent from Android --

On Thu, Nov 5, 2015 at 1:47 PM, C. Scott Ananian 
wrote:

> Architecturally it may be desirable to factor our codebase into multiple
> independent services with clear APIs, but small wikis would clearly like a
> "single server" installation with all of the services running under one
> roof, as it were. Some options previously proposed have involved VM
> containers that bundle PHP, Node, MediaWiki and all required services into
> a preconfigured full system image. (T87774
> <https://phabricator.wikimedia.org/T87774>)
>
> This summit topic/RFC proposes an alternative: tightly integrating
PHP/HHVM
> with a persistent server process running under node.js. The central
service
> bundles together multiple independent services, written in either PHP or
> JavaScript, and coordinates their configurations. Running a
> wiki-with-services can be done on a shared node.js host like Heroku.
>
> This is not intended as a production configuration for large wikis -- in
> those cases having separate server farms for PHP, PHP services, and
> JavaScript services is best: that independence is indeed the reason why
> refactoring into services is desirable. But integrating the services into
a
> single process allows for hassle-free configuration and maintenance of
> small wikis.
>
> A proof-of-concept has been built. The node package php-embed
> <https://www.npmjs.com/package/php-embed> embeds PHP 5.6.14 into a node.js
> (>= 2.4.0) process, with bidirectional property and method access between
> PHP and node. The package mediawiki-express
> <https://www.npmjs.com/package/mediawiki-express> uses this to embed
> MediaWiki into an express.js <http://expressjs.com/> HTTP server. (Other
> HTTP server frameworks could equally well be used.) A hook in the `
> LocalSettings.php` allows you to configure the mediawiki instance in
> JavaScript.
>
> A bit of further hacking would allow you to fully configure the MediaWiki
> instance (in either PHP or JavaScript) and to dispatch to Parsoid (running
> in the same process).
>
> *SUMMIT GOALS / FOR DISCUSSION*
>
>
> - Determine whether this technology (or something similar) might be an
> acceptable alternative for small sites which are currently using shared
> hosting. See T113210 <https://phabricator.wikimedia.org/T113210> for
> related discussion.
> - Identify and address tec

Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`

2015-11-06 Thread Tyler Romeo
I would very, *very* much prefer not to have MediaWiki core extensions written 
in JavaScript. Even beyond my criticisms of JavaScript as a language, I feel 
like that just introduces unnecessary complexity. The purpose of this wrapper 
is to combine separate micro-services that would otherwise be run in separate 
VMs / servers / etc. so that the whole thing can easily be run in a hosting setup.

Otherwise, I'm interested in what implications this will have, especially for 
making MediaWiki easier to install and use, which would be awesome.

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: C. Scott Ananian 
Reply: Wikimedia developers 
Date: November 6, 2015 at 14:14:13
To: Wikimedia developers 
Subject:  Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`  

Let's not let this discussion sidetrack into "shared hosting vs VMs (vs
docker?)" --- there's another phabricator ticket and summit topic for that (
https://phabricator.wikimedia.org/T87774 and
https://phabricator.wikimedia.org/T113210.

I'd prefer to have discussion in *this* particular task/thread concentrate
on:

* Hey, we can have JavaScript and PHP in the same packaging system. What
cool things might that enable?

* Hey, we can have JavaScript and PHP running together in the same server.
Perhaps some persistence-related issues with PHP can be made easier?

* Hey, we can actually write *extensions for mediawiki-core* in JavaScript
(or CoffeeScript, or...) now. Or run PHP code inside Parsoid. How could
we use that? (Could it grow developer communities?)

* How are parser extensions (like, say, WikiHiero, but there are lots of
them) going to be managed in the long term? There are three separate
codebases to hook right now. An extension like  might eventually
need to hook the image thumbnail service, too. Do we have a plan?

And the pro/anti-npm and pro/anti-docker and pro/anti-VM discussion can go
into one of those other tasks. Thanks.

--scott
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`

2015-11-06 Thread Tyler Romeo
That's a pretty good point. Despite my comments, I'll definitely keep an open 
mind, and am interested in what people might propose.

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: C. Scott Ananian 
Reply: C. Scott Ananian 
Date: November 6, 2015 at 15:01:59
To: Tyler Romeo 
CC: Wikimedia developers 
Subject:  Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`  

Tyler: I hear you.  I'm not sure it's a good idea, either -- especially not for 
core extensions used in production.

But it does perhaps allow some expansion of our developer community on the 
fringes, and makes writing extensions possible for a larger set of people?  And 
perhaps there are some cool things written in JavaScript which the extended 
community could more easily hook up to MediaWiki using `php-embed`.

I'm not sure that there are.  I'm just opening up the discussion to see if 
anyone pipes up with, "oh, yeah, I've always wanted to do XYZ!".

Greg: I agree re: premature stifling of discussion.  I'm just saying that 
"high-level" conversation is already happening elsewhere, and it's more 
productive there.  I started *this* particular thread trying to elicit 
discussion more narrowly focused on the thing I've just built.
  --scott

On Fri, Nov 6, 2015 at 2:30 PM, Tyler Romeo  wrote:
I would very, *very* much prefer to not have MediaWiki core extensions written 
in JavaScript. Even beyond my criticisms of JavaScript as a language, I feel 
like that just unnecessarily introduces complexity. The purpose of this wrapper 
is to combine separate micro-services that would otherwise be run in separate 
VMs / servers / etc. so that it can easily be run in a hosting setup.

Otherwise, I'm interested in what implications this will have, especially for 
making MediaWiki easier to install and use, which would be awesome.

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: C. Scott Ananian 
Reply: Wikimedia developers 
Date: November 6, 2015 at 14:14:13
To: Wikimedia developers 
Subject:  Re: [Wikitech-l] [RFC/Summit] `npm install mediawiki-express`

Let's not let this discussion sidetrack into "shared hosting vs VMs (vs
docker?)" --- there's another phabricator ticket and summit topic for that (
https://phabricator.wikimedia.org/T87774 and
https://phabricator.wikimedia.org/T113210.

I'd prefer to have discussion in *this* particular task/thread concentrate
on:

* Hey, we can have JavaScript and PHP in the same packaging system. What
cool things might that enable?

* Hey, we can have JavaScript and PHP running together in the same server.
Perhaps some persistence-related issues with PHP can be made easier?

* Hey, we can actually write *extensions for mediawiki-core* in JavaScript
(or CoffeeScript, or...) now. Or run PHP code inside Parsoid. How could
we use that? (Could it grow developer communities?)

* How are parser extensions (like, say, WikiHiero, but there are lots of
them) going to be managed in the long term? There are three separate
codebases to hook right now. An extension like  might eventually
need to hook the image thumbnail service, too. Do we have a plan?

And the pro/anti-npm and pro/anti-docker and pro/anti-VM discussion can go
into one of those other tasks. Thanks.

--scott
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



--
(http://cscott.net)

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Fw: Shutting down persona.org in November 2016

2016-01-12 Thread Tyler Romeo
So...

I stopped development on the Persona extension a while ago when Mozilla dropped 
official support for it, but with persona.org shutting down, I am considering 
deprecating the extension entirely, especially given the work that would 
be required to port it to AuthManager.

But I figured I'd give a sort of last call: is there anybody here who knows if 
the extension is being used (despite being marked as experimental) and thinks I 
should still maintain some sort of support for it?

-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: Ryan Kelly 
Reply: dev-ident...@lists.mozilla.org 
Date: January 11, 2016 at 20:09:31
To: dev-ident...@lists.mozilla.org , 
persona-noti...@mozilla.org 
Subject:  Shutting down persona.org in November 2016  


Hi Everyone,


When the Mozilla Identity team transitioned Persona to community
ownership, we committed resources to operational and security support
throughout 2014 [1], and renewed that commitment for 2015 [2]. Due to
low, declining usage, we are reallocating the project’s dedicated,
ongoing resources and will shut down the persona.org services that we run.

Persona.org and related domains will be taken offline on November 30th,
2016.

If you run a website that relies on Persona, you will need to implement
an alternative login solution for your users. We have assembled a wiki
page with additional information and guidelines for migration [3], but
here are the important things you need to know:

Between now and November 30th, 2016, Mozilla will continue to support
the Persona service at a maintenance level:

* Security issues will be resolved in a timely manner and the services
will be kept online, but we do not expect to develop or deploy any
new features.

* Support will continue to be available on the dev-identity mailing
list [4] and in the #services-dev IRC channel [5].

* All websites that rely on persona.org will need to migrate to
another means of authentication during this period.

Beginning on November 30th, 2016, the Persona service hosted by Mozilla
will be decommissioned:

* All services hosted on the persona.org domain will be shut down.

* Mozilla will retain control of the persona.org domain and will
not transfer it to a third party.

* Since the privacy of user data is of utmost importance to Mozilla,
we will destroy all user data stored on the persona.org servers,
and will not transfer it to third parties.

We intentionally designed Persona to expose email addresses rather than
opaque identifiers, which should ease the transition to other systems
that provide verified email addresses. You can find guidelines on
alternative login solutions on the wiki [3] and we will continue to
update them over the coming year. We strongly encourage affected teams
to openly discuss and blog about their migrations on the dev-identity
mailing list so that others can learn from their experience.

Thank you for your support and involvement with Persona. If you have any
questions, please post them to the dev-identity mailing list.


Ryan


[1]
http://identity.mozilla.com/post/78873831485/transitioning-persona-to-community-ownership
[2] https://groups.google.com/forum/#!topic/mozilla.dev.identity/rPIm7GxOeNU
[3]
https://wiki.mozilla.org/Identity/Persona_Shutdown_Guidelines_for_Reliers
[4] https://lists.mozilla.org/listinfo/dev-identity
[5] http://irc.mozilla.org/#services-dev
___
Persona-notices mailing list
persona-noti...@mozilla.org
https://mail.mozilla.org/listinfo/persona-notices


signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?

2016-02-12 Thread Tyler Romeo
I think it's also pertinent to note that we are finally reaching a state where 
PHP-CS can become voting, which was only achieved after a lot of hard work to make 
our codebase consistent.

If we make these changes gradually, we're basically throwing away all of the 
work that was just done.
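
For reference, the migration being debated is just PHP 5.4's short array syntax. 
The snippet below is made up purely for illustration; the point is only the 
array() vs. [] notation:

// Current style, dating from the PHP 5.3 requirement:
$query = array( 'limit' => 10, 'tags' => array( 'new', 'minor' ) );

// Style a mass conversion (and a voting PHP-CS sniff) would enforce:
$query = [ 'limit' => 10, 'tags' => [ 'new', 'minor' ] ];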

Regards,
-- 
Tyler Romeo
https://parent5446.nyc
0x405D34A7C86B42DF

From: Joaquin Oltra Hernandez 
Reply: Wikimedia developers 
Date: February 12, 2016 at 14:31:54
To: Wikimedia developers 
Subject:  Re: [Wikitech-l] Mass migration to new syntax - PRO or CON?  

PRO from me too.

Doing it gradually is just going to make the codebase inconsistent, and
tooling can help point patches to the old style to migrate to the new one.

I'd rather do it quickly than have the inconsistency bleed through months
or years.

On Fri, Feb 12, 2016 at 8:28 PM, Alex Monk  wrote:

> PRO from me, for all the reasons mentioned by legoktm
>
> On 12 February 2016 at 19:26, Legoktm  wrote:
>
> > Hi,
> >
> > On 02/12/2016 07:27 AM, Daniel Kinzler wrote:
> > > Now that we target PHP 5.5, some people are itching to make use of some
> > new
> > > language features, like the new array syntax, e.g.
> > > <https://gerrit.wikimedia.org/r/#/c/269745/>.
> > >
> > > Mass changes like this, or similar changes relating to coding style,
> > tend to
> > > lead to controversy. I want to make sure we have a discussion about
> this
> > here,
> > > to avoid having the argument over and over on any such patch.
> > >
> > > Please give a quick PRO or CON response as a basis for discussion.
> > >
> > > In essence, the discussion boils down to two conflicting positions:
> > >
> > > PRO: do mass migration to the new syntax, style, or whatever, as soon
> as
> > > possible. This way, the codebase is in a consistent form, and that form
> > is the
> > > one we agreed is the best for readability. Doing changes like this is
> > > gratifying, because it's low hanging fruit: it's easy to do, and has
> > large
> > > visible impact (well ok, visible in the source).
> >
> > I'll offer an alternative, which is to convert all of them at once using
> > PHPCS and then enforce that all new patches use [] arrays. You then only
> > have one commit which changes everything, not hundreds you have to go
> > through while git blaming or looking in git log.
> >
> > > CON: don't do mass migration to new syntax, only start using new styles
> > and
> > > features when touching the respective bit of code anyway. The argument
> > is here
> > > that touching many lines of code, even if it's just for whitespace
> > changes,
> > > causes merge conflicts when doing backports and when rebasing patches.
> > E.g. if
> > > we touch half the files in the codebase to change to the new array
> > syntax, who
> > > is going to manually rebase the couple of hundred patches we have open?
> >
> > There's no need to do it manually. Just tell people to run the phpcs
> > autofixer before they rebase, and the result should be identical to
> > what's already there. And we can have PHPCS run in the other direction
> > for backports ([] -> array()).
> >
> > But if we don't do that, people are going to start converting things
> > manually whenever they work on the code, and you'll still end up with
> > hundreds of open patches needing rebase, except it can't be done
> > automatically anymore.
> >
> > > My personal vote is CON. No rebase hell please! Changing to the syntax
> > doesn't
> > > buy us anything.
> >
> > Consistency buys us a lot. New developers won't be confused on whether
> > to use [] or array(). It makes entry easier for people coming from other
> > languages where [] is used for lists.
> >
> > I think you're going to end up in rebase hell regardless, so we should
> > rip off the bandaid quickly and get it over with, and use the automated
> > tools we have to our advantage.
> >
> > So, if we're voting, I'm PRO.
> >
> > -- Legoktm
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC: Add Content-Security-Policy header to MediaWiki

2016-05-22 Thread Tyler Romeo
First, as I expected, this proposal is to use CSP with the "unsafe-eval"
option enabled for both style-src and script-src. This means the JavaScript
eval() function can be used freely, and inline CSS via the style attribute
can still be used. Combined with the "default-src *" policy, which I
mention later in this email, this means an attacker can just use XHR to
download an external script and eval() it, thus defeating the entire point
of CSP: to prevent XSS attacks.

I understand the reason behind this is that we store scripts in
localStorage as a cache and eval() them upon page load (which is perhaps the
worst abuse of browser technology I have ever seen). Disallowing inline
scripts, inline styles, and eval()ed code is one half of the two-sided
CSP coin, and dropping it keeps open a pretty large attack vector. I
don't believe it's appropriate to sacrifice security for a quick kludge
that is used to paper over the performance hole opened by the large
fragmentation of our JavaScript codebase.

Second, the proposal is to use the nonce-$RANDOM attribute for inline
scripts, but to cache the nonce for non-logged-in users. Does this mean
that the same nonce will be delivered to multiple pages and/or users?
Because that is a violation of the CSP spec, which says "If a server
delivers a nonce-source expression as part of a police, the server MUST
generate a unique value each time it transmits a policy." [0] Thus we
cannot use nonces in this way. Are these scripts whose contents we know
beforehand? Because then we can just use the hash-source policy.

(Not to mention that nonce-source and hash-source policies are part of CSP
2, which is not supported in the latest IE, Safari, or Opera Mini. What is
the plan to support these browsers? Fall back to unsafe-inline? Possibly
use a solution similar to what Dropbox used for their CSP deployment [1].)
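
To make the nonce requirement concrete: a compliant deployment has to mint a
fresh nonce for every response and echo it in both the header and the inline
script tags, roughly like the sketch below -- which is only my illustration,
not what the RFC proposes to implement:

<?php
// Illustrative per-request CSP nonce; not the RFC's implementation.
// random_bytes() is PHP 7; openssl_random_pseudo_bytes() is the 5.x fallback.
$nonce = base64_encode( random_bytes( 16 ) );

header(
    "Content-Security-Policy: default-src 'self'; "
    . "script-src 'self' 'nonce-$nonce'; "
    . "style-src 'self'"
);

// The same nonce must appear on each whitelisted inline script in this response.
echo "<script nonce=\"$nonce\">/* inline bootstrap */</script>\n";

Caching that header (e.g. in the HTTP cache for anonymous page views) would hand
the same nonce to many responses, which is exactly what the quoted MUST forbids.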

Finally, as stage 1 hints at and as I mentioned above, there are a lot more
directives than just style-src and script-src, such as font-src, media-src,
frame-src, manifest-src, etc. Why are these not addressed, and instead just
left to the default policy of "default-src *"? Do we allow iframes on Wikipedia
pointing at arbitrary domains? At the very least, a scheme-source policy
can be used to enforce HTTPS.

[0] https://w3c.github.io/webappsec-csp/
[1]
https://blogs.dropbox.com/tech/2015/09/unsafe-inline-and-nonce-deployment/


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Mon, May 23, 2016 at 1:18 AM, Pine W  wrote:

> With the disclaimer that I'm not a security engineer and that I understand
> only parts of this proposal, in general this strikes me as a good idea. It
> seems to me that trying to develop a comprehensive list of what tools /
> scripts this proposal would likely break, how important those breaks are,
> and who could fix them and when, would help with developing a roadmap
> toward implementing this proposal with appropriate mitigation and
> communication.
>
> It seems to me that this is the kind of project for which product community
> liasons are well suited to help with developing and implementing a rollout
> plan. Is there any chance of getting a CL to help with this project?
>
> Thanks for the initiative,
>
> Pine
>
> Pine
> On May 22, 2016 18:18, "Brian Wolff"  wrote:
>
> > So the RFC process page says I should email wikitech-l to propose an RFC,
> > thus:
> >
> > Content-Security-Policy (CSP) header is a header that disables certain
> > javascript features that are commonly used to exploit XSS attacks, in
> > order to mitigate the risks of XSS. I think we could massively benefit
> > from using this technology - XSS attacks probably being the most
> > common security issue in MediaWiki. The downside is that it would
> > break compatibility with older user scripts.
> >
> > Please see the full text of my proposal at
> >
> https://www.mediawiki.org/wiki/Requests_for_comment/Content-Security-Policy
> >
> > The associated phabricator ticket is:
> > https://phabricator.wikimedia.org/T135963
> >
> > I'd appreciate any comments anyone might have.
> >
> > Thanks,
> > Brian
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Update on WMF account compromises

2016-11-17 Thread Tyler Romeo
On Thu, Nov 17, 2016 at 12:28 PM, Pine W  wrote:

> 1. If you don't trust that strength testing site (which is fine), choose
> another. I did a couple of quick checks on that site; while it's entirely
> possible that I missed something, it appeared to me that the site was not
> sending passwords over the Internet, whether in the clear or encrypted. The
> use of HTTP or HTTPS is irrelevant if the data isn't getting sent out in
> the first place.
>

Or use a password manager that has a built-in local password strength tool, so
that you don't risk being MITMed by an HTTP site.

In general, as mentioned, you should simply not enter your password on any
website other than the site the password belongs to. At my full-time job,
employees have a Chrome extension that notices when you type your password on
any other website (even if it's not in a text box), and if you do, you're
required to reset it.

*-- *
Regards,

*Tyler Romeo*
0x405d34a7c86b42df
https://parent5446.nyc
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Changes in colors of user interface

2016-12-09 Thread Tyler Romeo
May I just say: if the time ever comes when MediaWiki developers have to
submit an RfC, coordinate with local wikis in numerous different languages,
and wait weeks for community feedback and consensus before making minor UI
color changes, then all development will have ground to a halt. It is that
kind of bikeshedding and red tape that often makes enterprise software
development so cumbersome and loathed.

Honestly, I don't even think this change needs to be announced. If wikis
are overriding their own colors using common.css, then it is their
responsibility to make sure their custom, unsupported code keeps up with
the base MediaWiki software.

*-- *
Regards,

*Tyler Romeo*
0x405d34a7c86b42df
https://parent5446.nyc

On Fri, Dec 9, 2016 at 9:32 PM, Steven Walling 
wrote:

> Pine, please chill for once.
> On Fri, Dec 9, 2016 at 5:40 PM Legoktm 
> wrote:
>
> > Hi,
> >
> > On 12/09/2016 05:25 PM, Pine W wrote:
> > > While I appreciate the attempt to be funny, I am not amused in this
> case.
> > > Surprise UI changes could, for example, result in thousands of dollars'
> > > worth of instructional videos becoming instantly out of sync with the
> > > real-world user experience. Also, users who may have spent considerable
> > > time perfecting their templates may be surprised to find that they need
> > to
> > > make changes to keep the templates in sync with other changes to the
> UI.
> > > The problem isn't so much that the UI was changed, as that it was
> changed
> > > without notice and consultation. This isn't funny.
> >
> > Well, I was specifically responding to Strainu's comment that "No change
> > is to small to be disliked by one or more people!" which seemed to be in
> > jest too.
> >
> > But I think you're significantly over-exaggerating the costs of this UI
> > change, and changes in general. MediaWiki's UI changes literally every
> > day when localization messages are updated, reworded, or added. Special
> > pages are re-organized to be made more intuitive, toolbars re-arranged,
> > etc. If you're making instructional videos, colors *barely* changing
> > seem like the least of your problems in terms of becoming out of date.
> > The VisualEditor project has a script that automatically generates
> > localized screenshots of the user interface so the user manual stays up
> > to date, I don't know if any solution has been worked out for videos yet.
> >
> > -- Legoktm
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Communication about proposed and planned UI changes

2016-12-15 Thread Tyler Romeo
On Thu, Dec 15, 2016 at 6:51 PM, Pine W  wrote:

> My proposal would be that proposed UI changes which affect large
> proportions of the user base should be announced 3 months in advance.
> This would provide plenty of opportunity for discussion,
> synchronization, and testing of proposed changes.
>

A much finer definition is needed here. "Proposed UI changes which affect
large proportions of the user base", to my mind, includes basically any UI
change that affects any public user page, e.g., public Special Pages,
article pages, etc. It does not include any specification about the
significance of the change. (I understand this is intentional based on your
replies in the previous thread.)

I am still of the opinion that small UI changes, regardless of how many
users will be affected by the change, should not require a three-month
holding pattern while they are discussed. The amount of developer time and
productivity lost far outweighs the other consequences. Something like
minor shade changes to existing colors, slight movements or alignments of
specific containers, minor padding or margin adjustments, etc., is not
significant, even if the change is on an article page, which would affect
millions of users. Something like a major color shade change, or a major
refactoring of the design of a popular page, is of course another story.

My motivation for this is simple: I don't think software UI changes should
ever be based on community consensus, nor do I think Wikipedia users (or
rather, the subset of users that take interest in a Tech News announcement
or the like) are a suitable group of individuals for making decisions
regarding software development. Hell, not even the entire software
development team of MediaWiki (volunteer or WMF) should be making such a
decision. There's a reason why we have a UI standards group, composed of
experts who know at least a thing or two about UI design. Does that mean
these changes should occur opaquely without any communication? No, but the
opposite extreme of forcing a three-month comment period is possibly even
worse.

I do understand that there is a significant business requirement when it
comes to announcing significant changes, i.e., those beyond what I consider
a minor change. Be it community backlash and
a resulting decline in community engagement by users who consider the
volunteer software development team to be their mortal enemy, or
instructional videos that will become outdated and need funding to recreate
(which I don't view as a blocker for UI changes, but merely a consideration
that should be acknowledged before making cost decisions of UI changes),
there are some valid reasons why community notice should be given for a
change. But it certainly should not be for *every* change. There should be
a case-by-case decision of whether the given change would have a valid
requirement for prior notice.

*-- *
Regards,

*Tyler Romeo*
0x405d34a7c86b42df
https://parent5446.nyc
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] dnschain

2014-04-29 Thread Tyler Romeo
This isn't really relevant to MediaWiki, and the proposal is so ridiculous
I can only assume it is some sort of joke project.

For others seeing this thread, I found all the good quotes for you:

> DNSChain "stops the NSA"

> .dns is a meta-TLD because unlike traditional TLDs, it is not meant to
globally resolve to a specific IP [...] you cannot register a meta-TLD
because you already own them!

I think ICANN might take issue with that. (Also, a good read of RFC 3686 is
necessary here.)

> // hijack and record all HTTPS communications to this site
> function do_TLS_MITM(connection) {
> if (
> // let's not get caught by "pinning", shall we?
> isPinnedSite(connection.website, connection.userAgent)
> // never hijack those EFF nuisances, they're annoying
> || isOnBlacklist(connection.ip)
> // hijack only 5% of connections to avoid detection
> || randomIntBetween(1, 100) > 5
> )
> {
> return false;
> }
> return mitm_and_store_in_database(connection);
> }

I'd *love* to see the implementation of "mitm_and_store_in_database".

Also, fun to note that the entire application is written in CoffeeScript.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, Apr 30, 2014 at 1:41 AM, James Salsman  wrote:

> Would someone please review this DNS proposal for secure HTTPS?
>
> https://github.com/okTurtles/dnschain
> http://okturtles.com/other/dnschain_okturtles_overview.pdf
> http://okturtles.com/
>
> It is new but it appears to be the most correct secure DNS solution for
> HTTPS security at present. Thank you.
>
> Best regards,
> James Salsman
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] recent changes stream

2014-05-04 Thread Tyler Romeo
Just wondering, but has any performance testing been done on different
socket.io implementations? IIRC, Python is pretty good, so I definitely
approve, but I'm wondering if there are other implementations that are more
performant (specifically, servers that have better parallelism and no GIL).

For example, Erlang with Cowboy is supposed to be a good socket.io
implementation, and it is truly parallel, but I've never worked with it, so
I cannot say for sure.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sun, May 4, 2014 at 10:23 PM, Ori Livneh  wrote:

> Hi,
>
> Gerrit change Id819246a9 proposes an implementation for a recent changes
> stream broadcast via socket.io, an abstraction layer over WebSockets that
> also provides long polling as a fallback for older browsers. Comment on <
> https://gerrit.wikimedia.org/r/#/c/131040/> or the mailing list.
>
> Thanks,
> Ori
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Help! Phabricator and our code review process

2014-05-05 Thread Tyler Romeo
OK, so I'm sorry if this information is duplicated anywhere, but between
the Project Management Tools review page, the Phabricator RFC, the various
sub-pages of the RFC, and the content on the Phabricator instance itself,
it would take me at least a couple of hours to organize my thoughts. So
I'll just ask directly:

Phabricator still does not work directly with Git, right? Or has that been
implemented since I last checked? If not, what is the planned workaround
for Phabricator? The default workflow is to use arcanist to merge the code
into Git directly. Does that handle merge conflicts? What is the rebase
process?

It's not that I'm opposed to the new system. I'm just confused as to what
the new workflow would actually be.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Mon, May 5, 2014 at 12:02 PM, Quim Gil  wrote:

> Hi, please check this draft plan for the next steps in the Phabricator RfC
> at
>
> https://www.mediawiki.org/wiki/Requests_for_comment/Phabricator/Plan
>
> This aims to be a starting point for the next round of discussion to be
> held online and at the Wikimedia hackathon in Zürich this weekend. Edits,
> questions, and feedback welcome.
>
>
> On Friday, May 2, 2014, C. Scott Ananian  wrote:
>
> >
> > [cscott] James_F: I'm arguing for a middle path. devote *some*
> > resources, implement *some* interoperability, decide at *some later*
> > point when we have a more functional instance.
> >
>
> This is basically the same as "Decide now on a plan identifying the the
> blockers, commit resources to fix them, proceed with the plan unless we get
> stuck with a blocker." We have identified blockers, but we are not seeing
> any that could not be solved with some work (from the very active upstream
> and/or ourselves).
>
> We need a RfC approval to go confidently from http://fab.wmflabs.org to a
> production-like Wikimedia Phabricator. If that happens, the Platform
> Engineering team will commit resources to plan, migrate, and maintain the
> Phabricator instance that will deprecate five tools or more.
>
> The Labs instance has been setup and is being fine-tuned basically on a
> volunteering basis, which tells a lot about Phabricator's simplicity of
> administration and maintenance. As it is now, it is good enough to run
> simple projects with a short term deadline e.g.
>
> Chemical Markup for Wikimedia Commons
> http://fab.wmflabs.org/project/view/26/ (a GSoC project -- hint, hint)
>
> Analytics-EEVS
> http://fab.wmflabs.org/project/board/15/
>
> Please play with it and provide feedback. Other contributors critic with
> Phabricator are doing this, and it is being extremely helpful for
> everybody.
>
>
> --
> Quim Gil
> Engineering Community Manager @ Wikimedia Foundation
> http://www.mediawiki.org/wiki/User:Qgil
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Affiliation in username

2014-05-07 Thread Tyler Romeo
One interesting idea might be what Reddit does:

For a moderator of a subreddit, whenever they make a post it just appears
normally. However, after posting they can choose to "officiate" it. All
that does is highlight their username a different color and indicates they
are acting in their position as moderator rather than a regular user.

This idea could be applied to edits in core, and maybe posts in Flow. WMF
employees in a special user group could make an edit, and then press a
button on the history page to highlight that as an "official" edit.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Wed, May 7, 2014 at 4:22 PM, Jared Zimmerman <
jared.zimmer...@wikimedia.org> wrote:

> Affiliations change, and user names are quite difficult to change, this
> sounds like something that would be good for a structured profile, not for
> a user name.
>
>
>
> *Jared Zimmerman * \\  Director of User Experience \\ Wikimedia Foundation
>
> M : +1 415 609 4043 |   :  @JaredZimmerman<
> https://twitter.com/JaredZimmerman>
>
>
>
> On Sat, Apr 19, 2014 at 4:17 PM, Gryllida  wrote:
>
> > On a second thought, do we want to add an optional "affiliation" field to
> > the signup form, so the affiliation goes at the end of username in
> braces?
> >
> > - DGarry (WMF)
> > - Fred (DesignSolutionsInc)
> > - David (MIT)
> > - ...
> >
> > So the signup form would look like this:
> >
> >  -
> > | |
> > | [ Username preview in large green font ]|
> > | |
> > | Username:   |
> > |  ___|
> > | Password:   |
> > |  ___|
> > | Password 2: |
> > |  ___|
> > | Email (optional):   |
> > |  ___|
> > | Affiliation (optional; if your editing is related to work): |
> > |  ___|
> > | |
> >  -
> >
> > I.e.
> >
> >  -
> > | |
> > | [ "Gryllida (FOO)" in large green font ]|
> > | |
> > | Username:   |
> > |  _Gryllida__|
> > | Password:   |
> > |  ___|
> > | Password 2: |
> > |  ___|
> > | Email (optional):   |
> > |  ___|
> > | Affiliation (optional; if your editing is related to work): |
> > |  _FOO___|
> > | |
> >  -
> >
> > Gryllida.
> >
> >
> > On Sun, 23 Feb 2014, at 1:25, rupert THURNER wrote:
> > > hi,
> > >
> > > could wmf please extend the mediawiki software in the following way:
> > > 1. it should knows "groups"
> > > 2. allow users to store an arbitrary number of groups with their
> profile
> > > 3. allow to select one of the "group"s joined to an edit when saving
> > > 4. add a checkbox "COI" to an edit, meaning "potential conflict of
> > interest"
> > > 5. display and filter edits marked with COI in a different color in
> > history
> > > views
> > > 6. display and filter edits done for a group in a different color in
> > > history views
> > > 7. allow members of a group to receive notifications done on the group
> > page,
> >or when a group is mentioned in an edit/comment/talk page

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-16 Thread Tyler Romeo
I feel like the ideal situation would be to:

1) Only allow Phabricator login with a Wikimedia account; and
2) When logging into Wikimedia, allow login with Google, GitHub, etc.

Unfortunately, fulfilling that situation means deploying the OpenID
extension, which is definitely not ready yet.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Fri, May 16, 2014 at 1:47 PM, C. Scott Ananian wrote:

> On Fri, May 16, 2014 at 9:23 AM, Quim Gil  wrote:
> > On Friday, May 16, 2014, Petr Bena  wrote:
> >
> >> Yes. Support as many providers as possible, google at least, I
> >> basically don't even want to use any more web services with own login
> >> unless I have to. single login FTW
> >
> > I wonder why a user without a Wikimedia account or a GitHub account would
> > need to login to Wikimedia Phabricator.
>
> To report or comment on a bug?
>
> To anonymously report an issue?
>  --scott
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Login to Wikimedia Phabricator with a GitHub/Google/etc account?

2014-05-17 Thread Tyler Romeo
On Sat, May 17, 2014 at 2:26 PM, Steven Walling wrote:

> Obviously with Google and Facebook as options we don't
> stand to gain a lot in terms of technical contributions.
>

This isn't necessarily true. I know that I personally would prefer to be
able to log in with my Google account, because it's what I use for
everything.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What should be the recommended / supported way to do skins? (A proposal.)

2014-05-20 Thread Tyler Romeo
On Tue, May 20, 2014 at 5:25 PM, Bartosz Dziewoński wrote:

> tl;dr Let's start putting all skins files in a single directory, and let's
> use a grown-up structure with one class per file + separate init code for
> them. Okay?


I wouldn't consider this change to be truly revolutionary. What would
really be a great restructuring of the skinning system is if I could make a
skin by just writing a couple of HTML templates (probably using Gabriel's
DOM-based templating language), throw them in a directory and then tell
MediaWiki where the directory is.

However, staying on the topic that you brought up, I do agree with keeping
skins in a separate directory than extensions. It implies a bit of control
on our part concerning how skins need to be structured, whereas if they
were extensions, we cannot place any requirements on the structure.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Square bounding boxes + typesafe enums

2014-05-21 Thread Tyler Romeo
On Wed, May 21, 2014 at 12:48 PM, Brion Vibber wrote:

> From my read, SplEnum is a bit janky -- you still have to define integer
> constants and may have to pass them in.
>

Also, SplEnum is not built-in. It's a PECL extension, so we'd have to
require users to install it alongside MediaWiki.
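
(For anyone curious, you can get most of the benefit in plain PHP without the
extension. A rough, untested sketch -- the class and constant names here are
made up for illustration -- of the usual hand-rolled typesafe enum pattern:

final class BoundingBoxMode {
    public static $SQUARE;
    public static $FREE;

    private $name;

    private function __construct( $name ) {
        $this->name = $name;
    }

    public function getName() {
        return $this->name;
    }

    // Build the fixed set of instances once at load time.
    public static function init() {
        self::$SQUARE = new self( 'square' );
        self::$FREE = new self( 'free' );
    }
}
BoundingBoxMode::init();

// Callers can then type-hint against the enum class:
function renderThumbnail( BoundingBoxMode $mode ) { /* ... */ }

Not as nice as a native enum, but it needs no PECL extension.)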

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adding external libraries to core (SwiftMailer)

2014-05-27 Thread Tyler Romeo
This may be true, but like I mentioned during the IRC RFC discussion on 
third-party
components, it’s a case-by-case decision, and I think in this case, SwiftMailer 
is
a fairly reliable library and definitely much better than what we have 
implemented
now.
-- 
Tyler Romeo
0xC86B42DF

From: Tony Thomas 01tonytho...@gmail.com
Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
Date: May 27, 2014 at 12:13:16
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject:  Re: [Wikitech-l] Adding external libraries to core (SwiftMailer)  

So, if we can have the swiftmailer repo added, this would be excellent. It
might get worse
otherwise, as after this shift, MW can 'only' send emails through swift. A
hard dependency,
as you quoted earlier.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] deleting config directory confusion

2014-05-31 Thread Tyler Romeo
delete it optionally for purpose X?
You can use the config script to perform database upgrades later on if you 
upgrade MediaWiki versions. Most people just use the command line update.php 
instead, but for some people this is not an option.
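
For reference, the usual command-line route is just this, run from the wiki's
root directory:

    php maintenance/update.php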

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] new auth/debug/Zero RfCs, + adding more office hours

2014-06-02 Thread Tyler Romeo
* https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication
"With many more entry points and the need for inter-service
authentication, a service-oriented architecture requires a stronger
authentication system."
This is literally the same thing as AuthStack except more generic and without 
any code or plan for implementation.

Also, on the same note, I was told previously that the AuthStack RFC was too 
big and needed to be split up, because I tackled both 
authentication/authorization *and* session management and logout in the same 
proposal. Since this RFC has the same problem, it should be split up 
accordingly.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
I would also like to express my disappointment at third party users being
thrown under the bus once again. Several people have been putting a lot of
effort into supporting third party users, so it really saddens me that this
is dismissed as an irrelevance by some so easily.
Third party users were not thrown under the bus. Unfortunately, the solution 
you are looking for in terms of extension installation is not yet available 
with the current tools. That is just the unfortunate truth. We are 
not going to misuse libraries and hack together MediaWiki just so extension 
installation can be *slightly* easier.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-02 Thread Tyler Romeo
This sort of behaviour towards non-WMF extension developers is
interesting and if your objective is to alienate (as with the attitude
above) volunteer developers then you're on the right path.

Some people forget that users not always choose MW because of its
software but because it provides some extensions that people outside
of the WMF cluster find useful.
Considering I *am* a non-WMF extension developer I don’t see how your argument 
is relevant.

And as I literally just said in my previous, the goal was not to disadvantage 
third-party extension developers. Composer is simply not meant to be used in 
the way it was shoe-horned into MediaWiki. I’m not going to re-explain this 
every time because it is in multiple places on Bugzilla and in this mailing 
list.

-- 
Tyler Romeo
0xC86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-04 Thread Tyler Romeo
Strawman. I happen to have discussed the situation and the approach with
the main people behind Composer in person, as well as gone over details
with contributors on IRC. They did not seem to share your opinion.
Since we’re throwing out logical fallacies: argumentum ab auctoritate.

Like I’ve *already explained*…

Here is the problem: we need composer.json and composer.lock to be version 
controlled so that MediaWiki core can manage its own dependencies rather than 
using git submodules or hard-copying source trees into the repository, which 
everybody agrees are not sustainable solutions. However, third parties want to 
be able to install extensions via Composer, which, while not the purpose of 
Composer, is technically feasible. These two ideas conflict.

Here is the solution: rather than using Composer as a package management system 
when, in reality, it is a dependency management system, we use Composer to 
properly maintain core, and then do one of the following: 1) implement our own 
extension installation system from scratch, or 2) change and/or extend Composer 
so that it supports a plugin system. I personally recommend the latter, and 
there are upstream bug reports open concerning it.
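
Concretely, "core managing its own dependencies" just means a version-controlled
composer.json along these lines (the package names and version constraints here
are purely illustrative, not a proposal for what core should actually require):

{
    "require": {
        "php": ">=5.3.2",
        "psr/log": "1.0.*"
    },
    "require-dev": {
        "phpunit/phpunit": "3.7.*"
    }
}

The matching composer.lock pins the exact resolved versions, which is what makes
the dependency tree reproducible.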

-- 
Tyler Romeo
0xC86B42DF

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using Composer to manage libraries for mediawiki/core on Jenkins and Foundation cluster

2014-06-04 Thread Tyler Romeo
As a temporary workaround, maybe we could create
extensions/composer-example.json which could be used for extension
registration by running composer in that directory?
Yes. That would work. All we would need to do is add the following into 
Setup.php:

include "$IP/extensions/vendor/autoload.php";

(Obviously we’d also have a file_exists check…)
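
So the Setup.php hunk would be roughly (untested sketch):

// Load any libraries that extensions pulled in via Composer, if present.
if ( file_exists( "$IP/extensions/vendor/autoload.php" ) ) {
    include "$IP/extensions/vendor/autoload.php";
}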

-- 
Tyler Romeo
0xC86B42DF

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Hardening WP/WM against traffic analysis (take two)

2014-06-05 Thread Tyler Romeo
On Thu, Jun 5, 2014 at 4:50 PM, David Gerard  wrote:

> Or, indeed, MediaWiki tarball version itself.


MediaWiki is a web application. As amazing as it would be for Wikipedia to
be secure against traffic analysis, we are not going to introduce
presentation-layer logic into an application-layer product.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] LiquidThreads - how do we kill it?

2014-06-09 Thread Tyler Romeo
On Mon, Jun 9, 2014 at 2:20 PM, James Forrester 
wrote:

> You think people want dual inheritance for comments? Seriously?
> That's… (to be polite) completely insane as a discussion system from a
> user perspective, and perhaps more importantly for your argument,
> completely not something that happens right now in MediaWiki
> conversations.
>

To be fair, this is something that happens on long talk page discussions.
It is not correct to blanket-classify all talk page conversations as
directed trees. When discussions get really long, it just becomes a blob of
text of people talking back and forth with each other. It is massively
disorganized, but it will not work to simply force the discussion into a
tree structure.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
On Wed, Jun 11, 2014 at 4:28 AM, Antoine Musso  wrote:

> - depends on upstream to incorporate our patches
> - might not be used as-is on Wikimedia cluster
>

I would just like to point out that you can override where Composer finds
packages using the composer.json file. In other words, if we wanted to
force Composer to use WMF's git repositories to check out dependencies,
that is entirely possible. In the end, Composer uses git to check out the
packages anyway.

https://getcomposer.org/doc/04-schema.md#repositories
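
For illustration, a fragment like this in composer.json (the URL is made up)
registers a WMF-hosted git repository that Composer will check for the matching
package before falling back to Packagist:

{
    "repositories": [
        {
            "type": "vcs",
            "url": "https://gerrit.wikimedia.org/r/mediawiki/libs/example-library"
        }
    ]
}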

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
OK, now that I have some more time, I will expand on what I’ve already said in 
the Gerrit patch.

== Introduction ==

The WMF does not have that many developers. And it’s not that they haven’t 
hired enough or anything like that. It’s just they do not have that many people 
in general. As of last year, they had under 200 employees, total. I work at an 
enterprise software company, and we have 200 developers alone on the app/dev 
team, let alone the additional people (including myself) in the architecture 
group, in the business analyst team, in management, etc. To think that under 
200 people can develop a 400,000+ line codebase is insane.

That’s why the WMF gets help from the open source community. Not only because 
people love to work on open source projects that benefit the world and 
humanity, but because such projects *need* volunteers because they simply do 
not have the billion-dollar operating revenue necessary to develop an 
enterprise product.

With that said, it is absolutely necessary that the WMF get as much help as it 
can. Right now it has a lot of help: numerous volunteer developers working on 
core, working on translatewiki, etc. However, it’s not easy convincing people 
to join an open source software project. I’m sure Quim and Sumana can attest to 
this. What we can do, however, is “export” the work.

== Third-party libraries ==

=== How they can help? ===

Third-party libraries, i.e., other open-source software projects, are helpful 
in that:

1) They have existing communities of interested volunteer (and sometimes even 
paid) developers.
2) They are structured in an abstract manner, meaning they can be used inside 
MediaWiki without having extensive knowledge of the library’s internals.
3) They help to remove possibly hundreds if not thousands of lines of code from 
MediaWiki core that does not need to be there.

The premise is: why reinvent the wheel? Or, to be more specific, why reinvent 
the wheel by carving a square stone rock? There are libraries out there (such 
as SwiftMailer, which I’ll get to) that do certain tasks that MediaWiki needs 
to do. However, they are ten times as comprehensive as MediaWiki’s make-shift 
implementations, and are maintained a lot more.

=== The qualifications ===

Obviously, we cannot just start throwing in third-party libraries willy-nilly. 
There are major implications, specifically:

* Possible security issues introduced by the library
* Having to submit patches upstream when bugs are discovered

These are real issues. However, if the third-party library we are thinking of 
including is: 1) popular, i.e., used by other projects, especially commercial 
and/or commercially-backed projects; 2) uses a sane and reliable code review 
process and release process; and 3) is stable and unlikely to change 
drastically, then the library is reliable, and we do not really have to worry 
about issues.

One argument that people seem to be making is that since the code did not go 
through *our* review process, the code might be vulnerable or insecure. That is 
a fallacy. To say that code going through our own review process is more secure 
than others is ridiculous. Doing a security review of code requires security 
knowledge, and those with security knowledge (i.e., Chris) are not reviewing 
every line of code that goes into core.

Long story short, we do need to evaluate what libraries we put in, but if they 
are reliable it will not be a problem.

== SwiftMailer ==

Now we get to SwiftMailer. Some are calling it a new shiny toy, and like Daniel 
said, this is a bit insulting to anybody who has even looked at the 
UserMailer.php code.

With SwiftMailer, the only mail code that will be in MediaWiki will be what is 
necessary to get the e-mail address of a user. Everything else is handled 
inside the library. We no longer have to maintain mail-sending code at all. 
Putting PEAR aside, considering the major annoyances in using PHP’s mail() 
function, including differences in handling between operating systems, it is a 
big success if we do not have to deal with the mail() function inside MediaWiki.
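
To give a sense of how little would be left in core, sending a message through
SwiftMailer looks roughly like this (illustrative only -- the transport, host,
and addresses are placeholders, not a proposed UserMailer replacement):

// Pick a transport; Swift_MailTransport wraps PHP's mail() if SMTP is not wanted.
$transport = Swift_SmtpTransport::newInstance( 'smtp.example.org', 25 );
$mailer = Swift_Mailer::newInstance( $transport );

$message = Swift_Message::newInstance( 'Watchlist notification' )
    ->setFrom( array( 'wiki@example.org' => 'Example Wiki' ) )
    ->setTo( array( $user->getEmail() => $user->getName() ) )
    ->setBody( 'Someone edited a page on your watchlist.' );

$mailer->send( $message );

Everything below that call -- headers, encoding, transport quirks -- stays the
library's problem.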

-- 
Tyler Romeo
0xC86B42DF

From: Tyler Romeo tylerro...@gmail.com
Reply: Tyler Romeo tylerro...@gmail.com
Date: June 11, 2014 at 7:21:46
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject:  Re: [Wikitech-l] Making a plain MW core git clone not be installable  


On Wed, Jun 11, 2014 at 4:28 AM, Antoine Musso  wrote:
- depends on upstream to incorporate our patches
- might not be used as-is on Wikimedia cluster

I would just like to point out that you can override where Composer finds 
packages using the composer.json file. In other words, if we wanted to force 
Composer to use WMF's git repositories to check out dependencies, that is 
entirely possible. In the end, Composer uses git to check out the packages 
anyway.

https://getcomposer.org/doc/04-schema.md#repositories

-- 
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Com

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
On Wed, Jun 11, 2014 at 10:42 AM, Alex Brollo  wrote:

> I'm not a developer so it's perfectly normal that I can't understand
> anything about your talk; nevertheless, please, remember of KISS principle
> when building any installing tools for poor, final users. I'm waiting for
> something like "pip install core".
>

Just to clarify on the implications of using Composer, specifically:

1) If you are using MediaWiki through tarball releases (or rather, anything
other than using git), this will *not* affect you. Tarballs will ship with
the dependencies pre-loaded, so nothing changes.

2) If you are using MediaWiki through git, the extra step you will need to
do to install MediaWiki is:

> curl -sS https://getcomposer.org/installer | php
> ./composer.phar install

And hopefully, if the user has enough knowledge to clone a git repository,
they should understand the command line and how to run these two basic
commands. (As a side note, you can install composer globally by moving
composer.phar to /usr/local/bin/composer, which makes things simpler.)

Of course, all of this will be thoroughly documented.

Additionally, it is also possible to just integrate Composer into the
MediaWiki installer, and have the normal install process load the
dependencies on its own. Of course, this assumes the web server has write
permissions to the install directory.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
On Wed, Jun 11, 2014 at 10:56 AM, Brad Jorsch (Anomie) <
bjor...@wikimedia.org> wrote:

> ... That's just awful.


How so?

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
On Wed, Jun 11, 2014 at 11:05 AM, Zack Weinberg  wrote:

> Well, it makes *me* wince because you're directing people to pull code
> over the network and feed it straight to the PHP interpreter, probably
> as root, without inspecting it first.  And the site is happy to send
> it to you via plain HTTP, which means a one-character typo gives an
> active attacker a chance to pwn your entire installation.
>

It's over HTTPS. As long as you trust that getcomposer.org is the domain
you are looking for, this is really no different than installing via a
package manager.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
Thanks Zack for actually explaining the reasoning to me, rather than trying to 
insult my intelligence and then use it as an argument against the proposal.
-- 
Tyler Romeo
0xC86B42DF

From: Zack Weinberg za...@cmu.edu
Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
Date: June 11, 2014 at 11:47:34
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject:  Re: [Wikitech-l] Making a plain MW core git clone not be installable  

Nothing stops you from installing it over insecure HTTP. (I filed
https://github.com/composer/composer/issues/3047 for that.)

But this is bad practice even with HTTPS; you're relying on
*transport* integrity/authenticity to secure *document* authenticity.
Yeah, we do that all the time on today's Web, but software
installation is (I don't think this is hyperbole) more
security-critical than anything else and should be held to higher
standards. In this case, there should be an independently verifiable
(i.e. not tied to the TLS PKI) PGP signature on the installer and
people should be instructed to check that before executing it.

Note that Git submodules do this for you automatically, because the
revision hash is unforgeable.

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making a plain MW core git clone not be installable

2014-06-11 Thread Tyler Romeo
On Wed, Jun 11, 2014 at 7:20 PM, Isarra Yos  wrote:

> Problem with composer is it doesn't actually work for all use cases (farms
> in particular can be problematic, even if the wmf did get it working), so
> depending on it isn't really a good idea unless there is a sane fallback.
> What folks have mentioned as alternatives so far (submodules in core etc)
> do not sound sane.


Which use cases are you referring to exactly? (I'm curious as to how
composer does not work with farms.)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Which mailer should we use? (was: Making a plain MW core git clone not be installable)

2014-06-12 Thread Tyler Romeo
My point back then was not that it doesn’t support mail(), but that it doesn’t 
support extending the framework with additional, new transport mechanisms.

However, if people really have a concern about SwiftMailer, I’ll take another 
look at PHPMailer and see if I can put together a more comprehensive 
comparison. (I’ll post again here in a day or so.)
-- 
Tyler Romeo
0xC86B42DF

From: Daniel Friesen dan...@nadir-seen-fire.com
Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
Date: June 12, 2014 at 3:10:56
To: Wikimedia developers wikitech-l@lists.wikimedia.org, Tim Starling 
tstarl...@wikimedia.org
Subject:  [Wikitech-l] Which mailer should we use? (was: Making a plain MW core 
git clone not be installable)  

Which doing a bit of digging myself appears to be partially incorrect.
Looking at the code PHPMailer does support mail() and sendmail in
addition to SMTP. So it basically supports the exact same list of email
sending methods as Swift Mailer's transports do.

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tell my favorite conference about your Wikimedia tech

2014-06-13 Thread Tyler Romeo
I’ve always wanted to submit a cool MediaWiki talk to these conferences, but I 
have no idea what I’d talk about (or whether I’m even experienced enough to 
talk about anything at a conference). Are there any guidelines on what would 
make a good talk?
-- 
Tyler Romeo
0xC86B42DF

From: Luis Villa lvi...@wikimedia.org
Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
Date: June 13, 2014 at 10:05:14
To: Wikimedia developers wikitech-l@lists.wikimedia.org
Subject:  Re: [Wikitech-l] Tell my favorite conference about your Wikimedia 
tech  

On Thu, Jun 12, 2014 at 9:56 PM, Roan Kattouw 
wrote:

> On Jun 12, 2014 4:08 PM, "Luis Villa"  wrote:
> > I got a balloon ride the year I spoke. It was trumped the next year by a
> > helicopter ride. Definitely an amazing trip.
> >
> Wow, when was that?! I got no such thing :)
>
> In 2012 they did take all the speakers a show (and dinner) at a staged Gold
> Rush town (which is the *the* tourist attraction in Ballarat), in 2014 it
> was just a dinner at the local marina.
>

'05 for balloons over lovely Canberra, '06 for helicopters over Sydney
harbor. (I only went in '05, heard about '06 later.) Maybe they've toned it
down a little, but still a great conference!

Luis
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Ensure that user is logged in

2014-06-19 Thread Tyler Romeo
On Thu, Jun 19, 2014 at 9:23 PM, Brian Wolff  wrote:

> What about the content-length header? I believe that's included with
> POST requests even when using application/x-www-form-urlencoded form.
>

I suggest people in this thread read section 4.4 of RFC 2616.

There is no situation in which an HTTP request can be "cut off" such that
part of the query goes missing. Assuming you are using a compliant web
server, the Content-Length header is basically required for request bodies
(unless you're doing chunked encoding or something).

Try right now submitting a POST request without a Content-Length header to
Wikipedia. It results in a 400. Additionally, if you send an incorrect
header, it will either wait until timeout if it's too long or give 400 if
it's too short.

Like Daniel said, the only situation that is possible is some sort of
buffer issue, where there is a client-side application logic error that
causes the incorrect request to be sent. But this is not really MediaWiki's
problem, and clients should be using the Content-MD5 header (or whatever
the bastardized MediaWiki version is, since MW doesn't actually support
HTTP) if they really want to ensure data integrity.
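
For anyone following along, this is what the framing looks like on the wire. A
minimal form-encoded POST (field names and values are only an illustration):

POST /w/index.php HTTP/1.1
Host: en.wikipedia.org
Content-Type: application/x-www-form-urlencoded
Content-Length: 33

wpName=Example&wpPassword=secret1

The server reads exactly Content-Length bytes of body; if fewer arrive, it waits
and eventually times the request out rather than parsing a half-received body as
if it were complete.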

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What namespaces should we use when we use them

2014-06-25 Thread Tyler Romeo
I don’t think there’s any need to impose a naming requirement for extension
namespaces. Some vendors may choose to use their own namespace scheme.

In the end, as long as the namespace chosen is PSR-4 compliant, it should not
matter.
-- 
Tyler Romeo
0xC86B42DF

From: Antoine Musso 
Reply: Wikimedia developers >
Date: June 25, 2014 at 4:16:38
To: wikitech-l@lists.wikimedia.org >
Subject:  Re: [Wikitech-l] What namespaces should we use when we use them  

Le 24/06/2014 23:30, Bryan Davis a écrit :
> I suggested MediaWiki\Extensions\OAuth because it seems like the most
> natural naming to me. PSR-4 [2] is very opinionated about namespaces.
> It requires a top-level (or vendor) namespace. I chose "MediaWiki"
> because ... MediaWiki. Beyond that sub-namespaces are optional, but
> the file system path from the base directory to the php file must
> match. Assuming that $IP is the base directory led me to suggest
> MediaWiki\Extensions\OAuth.

Hello,

Looks good to me =)

--  
Antoine "hashar" Musso


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Bug Bounty Program

2014-06-25 Thread Tyler Romeo
Hey everybody,

So today at the iSEC Partners security open forum I heard a talk from Zane 
Lackey,
the former security lead for Etsy, concerning the effectiveness of bug bounties.

He made two points:

1) Bug bounties are unlikely to cause harm, especially for Wikipedia, which I 
asked
him about, because the mere popularity of our service means we are already being
scanned, pentested, etc. With a bounty program, there will be incentive for 
people to
report those bugs rather than pastebin them.

2) Even without a monetary reward, which I imagine WMF would not be able to 
supply,
crackers are motivated simply by the “hall of fame”, or being able to be 
recognized for
their efforts.

Therefore, I thought it may be beneficial to take that over to Wikipedia and 
start our own
bug bounty program. Most likely, it would be strictly a hall-of-fame-style
structure where
people would be recognized for submitting bug reports (maybe we could even use 
the
OpenBadges extension *wink* *wink*). It would help by increasing the number of 
bugs
(both security and non-security) that are found and reported to us.

Any thoughts? (Of course, Chris would have to approve of this program before we 
even
consider it.)

-- 
Tyler Romeo
0xC86B42DF


signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Bug Bounty Program

2014-06-26 Thread Tyler Romeo
OK, so really the process that we need here is:

1) Get more people on the security team via NDA and whatnot (sign me up, by the 
way, obviously)
2) Develop a triage system to quickly investigate and handle invalid and 
duplicate bugs
3) Determine when and how we’re going to do the program
4) Do it.

-- 
Tyler Romeo
0xC86B42DF

From: Brian Wolff 
Reply: Wikimedia developers >
Date: June 26, 2014 at 0:34:54
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] MediaWiki Bug Bounty Program  

On 6/26/14, Chris Steipp  wrote:
> On Wed, Jun 25, 2014 at 5:49 PM, Alex Monk  wrote:
>> Chris, why don't we leave privacy policy compliance to the users posting
>> on
>> the bug? Wikimedia personal user data shouldn't be going to the security
>> product.
>
> There are a few cases where there may be legitimate private data in a
> security bug ("look, sql injection, and here are some rows from the
> user table!", "Hey, this was supposed to be suppressed, and I can see
> it", "This user circumvented the block on this IP"). But there might
> be ways to flag or categorize a report as also including private data?
> Someone with more bugzilla experience would need to comment.
>
>> Why does WMF get the right to control by access to MediaWiki security bugs
>> anyway? Could we not simply host MediaWiki stuff externally? Perhaps on
>> the
>> servers of any other major MediaWiki user.
>
> This certainly could be done. That "other major MediaWiki user" would
> have to be someone everyone trusts, and preferably with a strong track
> record of being able to keep their infrastructure secure. If there's a
> legitimate proposal to try it, let's definitely discuss.
>

Personally I'd prefer that MediaWiki related support software stay
hosted by WMF (at least for the foreseeable future). WMF just seems
like the logical people to host it, and I don't see any harm in
MediaWiki being a "Wikimedia project" in a similar sense as wikipedia
is a Wikimedia project. What I would like to see though is a mediawiki
world where WMF is not special. What I mean by that is that being a
WMF employee/contractor wouldn't get you any special treatment -
trusted people would get special access where needed because they're
trusted and have demonstrated their competence. A WMF staffer would
have to go through the same procedure as anyone else would have to to
get any sort of special access. Much of the people who have special
access would still be WMF employees, since WMF employs most senior
developers, but it wouldn't be "you're a wmf employee = here's access
to everything even if you don't need it", "you're not a WMF employee =
have to jump through a million hoops plus sign something in blood plus
bribe someone to get access to things that would be extremely helpful
to your work".

--bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Bug Bounty Program

2014-06-26 Thread Tyler Romeo
I’ll be frank. I care a lot more about the security of MediaWiki as a software 
product,
as well as the security of its customers (both WMF and third-party) than I do 
about
some made-up notion of “open access” to security bugs.

I think it makes complete sense to have people with access to security bugs 
sign an
agreement saying they will not release said bugs to the public until they have 
been
fixed, released, and announced properly.
-- 
Tyler Romeo
0xC86B42DF

From: MZMcBride 
Reply: Wikimedia developers >
Date: June 26, 2014 at 9:44:25
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] MediaWiki Bug Bounty Program  

Any process that involves volunteers signing non-public, indefinite vows
of secrecy and silence are antithetical to Wikimedia's values and mission.
This isn't a cult. Our bedrock principles are open access and transparency.

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Bug Bounty Program

2014-06-28 Thread Tyler Romeo
On Fri, Jun 27, 2014 at 9:06 AM, Antoine Musso  wrote:

> Doesn't WMF have a plan to provide badges in MediaWiki itself? Kind of
> Wikiloves which let you distribute barn pages on talk pages but a bit
> more robust?
>

Well we made an OpenBadges extension for Facebook OpenAcademy, but it's in
an infant state, and needs more attention (including from myself) if we
actually plan on using it more extensively.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikimedia email list settings

2014-07-06 Thread Tyler Romeo
That's weird, because I use gmail and I always get copies of my own
messages when I send them.

-- 
Tyler Romeo
On Jul 6, 2014 12:38 AM, "Pine W"  wrote:

> Thank you. Groan @ gmail.
>
> Pine
>
>
> On Sat, Jul 5, 2014 at 12:54 PM, Jeremy Baron 
> wrote:
>
> > On Jul 5, 2014 3:27 PM, "Bináris"  wrote:
> > > Fix= don't use gmail. It decides for you. They are like Microsoft and
> > know
> > > what you need better than yourself. :-)
> >
> > More details:
> >
> > https://productforums.google.com/d/topic/gmail/fkbKPajx1Dw/discussion
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating progress; Knockout Components & curly brace syntax

2014-07-08 Thread Tyler Romeo
I just want to be clear that any sort of template syntax that resembles or can 
be confused with wikitext is not something that we can or should allow in core. 
If MobileFrontend and whatnot want to use this syntax, so be it.

-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Gabriel Wicke 
Reply: Wikimedia developers >
Date: July 7, 2014 at 21:28:33
To: Wikimedia developers >
Subject:  [Wikitech-l] HTML templating progress; Knockout Components & curly 
brace syntax  

== Curly braces ==
The other thing they're adding is text interpolation so you can do
{{name}} and people don't have to be sad about Knockout's HTML
attribute syntax. This can simplify the migration for existing handlebars
compatible templates as used by Mobile & Flow.

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] HTML templating progress; Knockout Components & curly brace syntax

2014-07-08 Thread Tyler Romeo
In general, the syntax is different, which is why Knockout is a great solution. 
But using curly braces for variable insertion is similar to wikisyntax’s 
template transclusion. That alone is not really enough to cause a problem, but 
if we proceed in that general direction it could lead to issues.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Dan Andreescu 
Reply: Wikimedia developers >
Date: July 8, 2014 at 10:27:08
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] HTML templating progress; Knockout Components & 
curly brace syntax  

I'm not sure I follow, it seems to me knockout syntax and wikitext are
different, and have different purposes. Can you elaborate a bit more,
Tyler?

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] (no subject)

2014-07-10 Thread Tyler Romeo
On Thu, Jul 10, 2014 at 5:59 AM, Gerard Meijssen 
wrote:

> I use GMAIL and I see it plenty fine.


I also use Gmail, and it says the only reason it wasn't sent to spam was
because I have a filter sending all wikitech emails to my inbox.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anonymous editors & IP addresses

2014-07-11 Thread Tyler Romeo
I agree that it’s a double standard, but looking at the bright side, it becomes 
a big encouragement to anonymous users to register and log in. The Account 
Creation Experience Team (or whoever the hell is in charge of that) can correct 
me, but I would imagine that we would see a big drop in registered accounts if 
IPs were hashed.

Also, it’d be really annoying to have hashes as usernames, so we’d have to 
think of an alternative scheme that makes things more readable.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Gilles Dubuc 
Reply: Wikimedia developers >
Date: July 11, 2014 at 9:34:18
To: Wikimedia developers >
Subject:  [Wikitech-l] Anonymous editors & IP addresses  

This interesting bot showed up on hackernews today:
https://news.ycombinator.com/item?id=8018284

While in this instance the access to anonymous' editors IP addresses is
definitely useful in terms of identifying edits with probable conflict of
interest, it makes me wonder what the history is behind the fact that
anonymous editors are identified by their IP addresses on WMF-hosted wikis.

IP addresses are closely guarded for registered users, why wouldn't
anonymous users be identified by a hash of their IP address in order to
protect their privacy as well? The exact same functionality of being able
to see all edits by a given anonymous IP would still exist, the IP itself
just wouldn't be publicly available, protected with the same access rights
as registered users'.

The "use case" that makes me think of that is someone living in a
totalitarian regime making a sensitive edit and forgetting that they're
logged out. Or just being unaware that being anonymous on the wiki doesn't
mean that their local authorities can figure out who they are based on IP
address and time. Understanding that they're somewhat protected when logged
in and not when logged out requires a certain level of technical
understanding. The easy way out of this argument is to state that these
users should be using Tor or something similar. But I still wonder why we
have this double standard of protecting registered users' privacy in
regards to IP addresses and not applying the same for anonymous users, when
simple hashing would do the job.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

signature.asc
Description: Message signed with OpenPGP using AMPGpg
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Anonymous editors & IP addresses

2014-07-11 Thread Tyler Romeo
As a quick implementation note, we would not be using a hash for the IP address.

Most likely, we would encrypt the IP with AES or something using a 
configuration-based secret key. That way checkusers can still reverse the hash 
back into normal IP addresses without having to store the mapping in the 
database.
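
Very roughly, an untested sketch (the $wgAnonIpSecretKey name and the cipher
choice are invented for illustration; ECB is what keeps the mapping
deterministic, which is exactly the property this use case needs):

function encodeAnonIp( $ip ) {
    global $wgAnonIpSecretKey;
    // Deterministic: the same IP always yields the same token, so
    // "contributions by this IP" still groups correctly.
    $raw = openssl_encrypt( $ip, 'aes-128-ecb', $wgAnonIpSecretKey, OPENSSL_RAW_DATA );
    return base64_encode( $raw );
}

function decodeAnonIp( $token ) {
    global $wgAnonIpSecretKey;
    // Only someone holding the key (e.g. a checkuser tool) can reverse it.
    return openssl_decrypt( base64_decode( $token ), 'aes-128-ecb',
        $wgAnonIpSecretKey, OPENSSL_RAW_DATA );
}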

-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Gilles Dubuc 
Reply: Wikimedia developers >
Date: July 11, 2014 at 10:59:55
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] Anonymous editors & IP addresses  

With hashing, a given IP would always give the same hash. So this
uniqueness property would remain for people with stable IPs.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Not logged in" page

2014-07-15 Thread Tyler Romeo
https://gerrit.wikimedia.org/r/146515
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Jon Robson 
Reply: Wikimedia developers >
Date: July 15, 2014 at 14:32:19
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] "Not logged in" page  

Yes someone please submit a patch and I will help us get a fix for this merged.

On Tue, Jul 15, 2014 at 6:10 AM, Bartosz Dziewoński  wrote:
> On Tue, 15 Jul 2014 13:00:04 +0200, David Cuenca  wrote:
>
>> Sometimes while not logged in I try to access my Watchlist and then the
>> "Not logged in" page appears. It is quite useless since then you have to
>> click again to go to the "Log in" page. Wouldn't it be better to just
>> redirect to the "log in" page instead of showing that dumb "Not logged in"
>> page?
>>
>> Or perhaps I am missing some hidden use or reasoning behind the existence
>> of that page...?
>
>
> There is some old discussion on
> <https://bugzilla.wikimedia.org/show_bug.cgi?id=15484>, which asks for the
> same thing.
>
>
> --
> Matma Rex
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



--  
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] "Not logged in" page

2014-07-15 Thread Tyler Romeo
Yes. Actually, that is existing functionality. Special:UserLogin will redirect
users if the returnto (and optionally returntoquery) parameters are specified.
(The only exception is if a hook intervenes.)
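
For example, a bookmark like

    https://en.wikipedia.org/wiki/Special:UserLogin?returnto=Special:Watchlist

drops the user back on their watchlist as soon as the login succeeds.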
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Risker 
Reply: Wikimedia developers >
Date: July 15, 2014 at 15:04:18
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] "Not logged in" page  

Would this return the user to the page that was originally requested, and
from which they were diverted?

I ask because a lot of people bookmark their watchlist as the easiest way
to wind up where they want to be. Getting a log-in screen first if they're
not already logged in would work fine, but then they'd want to be sent back
to the watchlist.

Risker/Anne


On 15 July 2014 14:49, Tyler Romeo  wrote:

> https://gerrit.wikimedia.org/r/146515
> --
> Tyler Romeo
> 0x405D34A7C86B42DF
>
> From: Jon Robson 
> Reply: Wikimedia developers >
> Date: July 15, 2014 at 14:32:19
> To: Wikimedia developers >
> Subject: Re: [Wikitech-l] "Not logged in" page
>
> Yes someone please submit a patch and I will help us get a fix for this
> merged.
>
> On Tue, Jul 15, 2014 at 6:10 AM, Bartosz Dziewoński 
> wrote:
> > On Tue, 15 Jul 2014 13:00:04 +0200, David Cuenca 
> wrote:
> >
> >> Sometimes while not logged in I try to access my Watchlist and then the
> >> "Not logged in" page appears. It is quite useless since then you have to
> >> click again to go to the "Log in" page. Wouldn't it be better to just
> >> redirect to the "log in" page instead of showing that dumb "Not logged
> in"
> >> page?
> >>
> >> Or perhaps I am missing some hidden use or reasoning behind the
> existence
> >> of that page...?
> >
> >
> > There is some old discussion on
> > <https://bugzilla.wikimedia.org/show_bug.cgi?id=15484>, which asks for
> the
> > same thing.
> >
> >
> > --
> > Matma Rex
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
>
> --
> Jon Robson
> * http://jonrobson.me.uk
> * https://www.facebook.com/jonrobson
> * @rakugojon
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] logging out on one device logs user out everywhere

2014-07-16 Thread Tyler Romeo
On Wed, Jul 16, 2014 at 12:50 AM, Jasper Deng 
wrote:

> I'd prefer that we did Google's system that, in addition to allowing a
> separate "sign out all other sessions" option, also allows users to monitor
> which IP addresses their account was accessed from (which however would be
> akin to self CheckUser and might run afoul of our privacy policy).
>

https://www.mediawiki.org/wiki/Extension:SecureSessions

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] logging out on one device logs user out everywhere

2014-07-21 Thread Tyler Romeo
Just to be clear, this is a CentralAuth issue. MediaWiki core already has logout
localized by session, and I have an extension, SecureSessions, with a feature
that shows all your logged-in sessions and lets you log them out individually.

It is CentralAuth that does global logout.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Jon Robson 
Reply: Wikimedia developers >
Date: July 21, 2014 at 14:35:54
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] logging out on one device logs user out everywhere  

It seems like there is agreement on an approach

As I understand it:
* special button that when clicked logs you out everywhere
* default behaviour is just to log you out on current device

Does anyone want to own this and help move it forward? I've got too
many things on my plate right now, but it's been bothering me for many
years.

Although I don't have time/energy to do all of this, I'm happy to help
out grabbing people to code review any patches, unblock any
disagreements.

/me hopes someone puts their hand up


On Wed, Jul 16, 2014 at 4:06 AM, Tyler Romeo  wrote:
> On Wed, Jul 16, 2014 at 12:50 AM, Jasper Deng 
> wrote:
>
>> I'd prefer that we did Google's system that, in addition to allowing a
>> separate "sign out all other sessions" option, also allows users to monitor
>> which IP addresses their account was accessed from (which however would be
>> akin to self CheckUser and might run afoul of our privacy policy).
>>
>
> https://www.mediawiki.org/wiki/Extension:SecureSessions
>
> *-- *
> *Tyler Romeo*
> Stevens Institute of Technology, Class of 2016
> Major in Computer Science
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l



--  
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Reviewing a couple of TorBlock patches

2014-07-23 Thread Tyler Romeo
FYI, I’d be willing to maintain the extension, but it’d be best if there were 
one other person as well, since it is WMF-deployed and thus I cannot merge my own 
patches.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Quim Gil 
Reply: Wikimedia developers >
Date: July 23, 2014 at 7:57:07
To: Wikimedia developers >
Subject:  [Wikitech-l] Reviewing a couple of TorBlock patches  

According to our algorithm (*), TorBlock currently has the worst track record
for reviewing code contributions -- even after Tim gave a -1 to one of the
three open patches last week (thanks!). There are two patches from Tyler
that haven't received any feedback at all since August 2013.

https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/TorBlock,n,z


Your help reviewing these patches is welcome.

It is not surprising that this extension has no maintainer listed at
https://www.mediawiki.org/wiki/Developers/Maintainers (someone suggested
Tim in that table, he disagrees and edited accordingly).

Also, maybe someone is interested in maintaining this extension? Only
eleven patches submitted in the last 15 months.

(*) http://korma.wmflabs.org/browser/gerrit_review_queue.html -- the
algorithm happens to be buggy these days, but apparently equally buggy for
all repos, which still results in some kind of justice.


--  
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki now supports PBKDF2 and Bcrypt

2014-07-28 Thread Tyler Romeo
Hi everybody,

I was on the brink of celebrating the one-year anniversary of a patch I 
submitted being open, but today it was finally merged!

https://gerrit.wikimedia.org/r/77645

The old User::comparePasswords() and User::crypt() functions have been replaced 
with a new password hashing API. This means MediaWiki now natively supports 
Bcrypt and PBKDF2 as replacement password hashing algorithms. Furthermore, the 
system allows seamless transitioning, meaning users’ password hashes will be 
updated automatically the next time they log in.

This means that MD5 is almost out the door, which is a big win (a follow up 
patch, https://gerrit.wikimedia.org/r/149658, changes the default to PBKDF2, 
which would mean any wiki that upgrades to 1.24 would automatically switch away 
from MD5).
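
For third-party wikis that want to switch before the default changes, a minimal
LocalSettings.php sketch (assuming the 'pbkdf2' and 'bcrypt' type names used by
the new API) would be:

    // Hash new passwords with PBKDF2; existing MD5-style hashes are upgraded
    // automatically the next time the user logs in.
    $wgPasswordDefault = 'pbkdf2';
    // Or pick bcrypt instead, optionally tuning its work factor:
    // $wgPasswordDefault = 'bcrypt';
    // $wgPasswordConfig['bcrypt']['cost'] = 10;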

I’d like to thank Aaron Schulz, Chris Steipp, Krinkle, and many others who 
helped get this through.

-- 
Tyler Romeo
0x405D34A7C86B42DF
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki now supports PBKDF2 and Bcrypt

2014-07-28 Thread Tyler Romeo
On Mon, Jul 28, 2014 at 5:24 PM, Pine W  wrote:

> Thank you. Out of curiosity, why bcrypt and not scrypt? There is debate in
> the security community about which is better so my comment isn't intended
> as criticism. I'm just interested in the thinking behind this decision.
>

It is a matter of stability in PHP. Bcrypt has built-in support in PHP, as
does PBKDF2, whereas scrypt requires an extension. It should be noted,
however, that the patch that was merged implements an extensible password
API, so it would be trivial to implement scrypt support if we wanted to.
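
As a standalone illustration of that built-in support (plain PHP 5.5+, not the
MediaWiki wrapper itself):

    <?php
    // Bcrypt via PHP's native password hashing API.
    $hash = password_hash( 'hunter2', PASSWORD_BCRYPT, array( 'cost' => 10 ) );
    var_dump( password_verify( 'hunter2', $hash ) ); // bool(true)

    // PBKDF2 via the hash extension.
    $salt = openssl_random_pseudo_bytes( 16 );
    $derived = hash_pbkdf2( 'sha256', 'hunter2', $salt, 10000, 32, true );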

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikeshedding a good name for "the api.php API"

2014-08-06 Thread Tyler Romeo
Definitely agree with this. It’s the only API that is part of core, so 
“MediaWiki API” makes sense.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Bartosz Dziewoński 
Reply: Wikimedia developers >
Date: August 6, 2014 at 9:52:34
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] Bikeshedding a good name for "the api.php API"  

How about just "the MediaWiki API"? That's the only proper external API 
core MediaWiki has, as far as I'm aware.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] News about stolen Internet credentials; reducing Wikimedia reliance on usernames and passwords

2014-08-06 Thread Tyler Romeo
In terms of external authentication, we need Extension:OpenID to catch up to 
the OpenID standard in order to do that.

In terms of two-factor, I have like eight patches for Extension:OATHAuth 
attempting to make it production-worthy.

https://gerrit.wikimedia.org/r/132783
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: svetlana 
Reply: Wikimedia developers >
Date: August 6, 2014 at 7:57:12
To: wikitech-l@lists.wikimedia.org >
Subject:  Re: [Wikitech-l] News about stolen Internet credentials; reducing 
Wikimedia reliance on usernames and passwords  

On Wed, 6 Aug 2014, at 21:49, Andre Klapper wrote:
> On Tue, 2014-08-05 at 22:05 -0700, Pine W wrote:
> > After reading this [1] I am wondering if Wikimedia should start taking
> > steps to reduce reliance on usernames and passwords.
>  
> What "steps" do you refer to, or is this intentionally vague?
> Disallowing usernames and logins?
> Two-step authentication/verification?
> Something else?
>  
> andre

from what i could read and parse:
use less of external things like skype and google accounts
so that there is only 1 username for everything

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikeshedding a good name for "the api.php API"

2014-08-07 Thread Tyler Romeo
On Thu, Aug 7, 2014 at 7:42 AM, Legoktm  wrote:

> I like "api.php" too, given that we refer to the old one as "query.php".


We should just go back to query.php and get rid of api.php so we can avoid
all this confusion. /s

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] News about stolen Internet credentials; reducing Wikimedia reliance on usernames and passwords

2014-08-07 Thread Tyler Romeo
On Thu, Aug 7, 2014 at 6:01 AM, Brian Wolff  wrote:

> I think TLS has a feature where the client can also provide a
> certificate, in order to use certificates to authenticate users. I've
> never heard of a site actually using it.
>

Indeed.

https://www.mediawiki.org/wiki/Extension:SSLClientAuthentication
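
A rough sketch of the underlying mechanism on the PHP side (not the extension's
actual code), assuming the web server, e.g. Apache's mod_ssl, is configured to
request client certificates and export the standard SSL_* variables:

    <?php
    // Set by the server only when the client presented a certificate that
    // verifies against a CA the server trusts.
    if ( isset( $_SERVER['SSL_CLIENT_VERIFY'] )
        && $_SERVER['SSL_CLIENT_VERIFY'] === 'SUCCESS'
    ) {
        // Map the certificate's subject DN to a wiki account (illustrative).
        $subjectDn = $_SERVER['SSL_CLIENT_S_DN'];
        // ... look up or create the corresponding user here ...
    }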

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you

2014-08-10 Thread Tyler Romeo
Wow this is pretty depressing, although in today's age I cannot say I'm
surprised. Corporations have always been about controlling their consumers,
and it was really only a matter of time before the WMF fell into that as
well. I wonder whether there's any legitimate justification for all of
this, or whether it's just a repeat of the VisualEditor fiasco, aka, "the
WMF knows best" kind of thing.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sun, Aug 10, 2014 at 3:19 PM, Siebrand Mazeland 
wrote:

>
> > Op 10 aug. 2014 om 20:12 heeft Ricordisamoa <
> ricordisa...@openmailbox.org> het volgende geschreven:
> >
> > I'd really like to hear Jimbo's opinion on the
> matter
>
> You should really watch Jimmy's speech at the Wikimania closing session.
> You might be surprised.
>
> Cheers!
>
> --
> Siebrand
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you

2014-08-11 Thread Tyler Romeo
On Mon, Aug 11, 2014 at 12:24 PM, Tim Landscheidt 
wrote:

> {{cn}}


More like the attendees were those who could afford to go and/or were given
scholarships.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you

2014-08-11 Thread Tyler Romeo
So I'd just like to make one point, I think this "superprotect" right has
proper technical implications and use cases. In other words, while you guys
may disagree with how it is currently being used (e.g., the MediaViewer
drama and whatnot), I think it is a good idea, and I am actually surprised
such a protection level has not already been enacted.

There are many legitimate cases (e.g., office actions and copyright-related
issues) where I could see the superprotect level coming in handy. There are
some cases where the WMF simply cannot afford (usually b/c of legal
reasons) to trust the community, even if they're 99.9% sure nothing will
happen. Sometimes all it takes is one rogue admin to trigger a lawsuit.

With that said, it's obviously a political matter as to what the proper
uses of this new protection level are, but I do think the existence of the
level itself is appropriate.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science


On Sun, Aug 10, 2014 at 9:19 AM, K. Peachey  wrote:

> Lets all welcome the new overlord Erik.
>
> Add a new protection level called "superprotect"
> Assigned to nobody by default. Requested by Erik Möller for the purposes
> of protecting pages such that sysop permissions are not sufficient to
>
>
> edit them.
> Change-Id: Idfa211257dbacc7623d42393257de1525ff01e9e
> <
> https://gerrit.wikimedia.org/r/#q,Idfa211257dbacc7623d42393257de1525ff01e9e,n,z
> >
>
> https://gerrit.wikimedia.org/r/#/c/153302/
>
>
>
> Someone clearly can't take criticism of their projects well.
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you

2014-08-11 Thread Tyler Romeo
No, it is not. Unless we are continuing to discuss the technical implications 
of this change, I suggest this be moved to wikimedia-l.

Also, can we please stop using Wikipedia templates in this thread (e.g., 
{{cn}})? Not everybody here is a Wikipedia editor and knows what they mean. I’d 
prefer not to have to look up templates online just to figure it out.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Lewis Cawte 
Reply: Wikimedia developers >
Date: August 11, 2014 at 13:56:11
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] Superprotect user right, Comming to a wiki near you  

Is wikitech-l really the best place for this discussion?

-- Lewis Cawte (Lcawte)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wikimedia-l] Superprotect user right, Comming to a wiki near you

2014-08-11 Thread Tyler Romeo
On Mon, Aug 11, 2014 at 6:54 PM, John Mark Vandenberg 
wrote:

> Was this functionality was ever supported by MediaWiki core?


Of course this is supported by MediaWiki core, although I cannot attest as
to whether it was previously implemented on WMF wikis.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] PHP 5.3 EOL and MediaWiki 1.24

2014-08-14 Thread Tyler Romeo
Hi everybody,

So we had a discussion about this a while ago, but just recently PHP put
out the final 5.3 release. [0] Back in the previous thread concerning PHP
5.3, there seemed to be general agreement toward upping our PHP minimum
requirements for the 1.24 release of MediaWiki.

Here are the stats:
* Soon Ubuntu (trusty) and Debian (jessie) releases will be running PHP 5.5
and 5.6, respectively.
* MediaWiki 1.23 ends support in May 2017
* PHP 5.4 is estimated to be supported until 2015
* Ubuntu 12.04 LTS (PHP 5.3) ends support on April 26, 2017
* Debian Squeeze (PHP 5.3) ends support in February 2016
* Debian Wheezy (PHP 5.4) *might* be supported until May 2018, depending on
the feedback received from the Squeeze LTS trial

The results of this timeline are that when MediaWiki 1.23 ends support,
most supported distros will be on PHP 5.5 or higher (yes, not 5.4, but 5.5).

With that in mind, I'd like to move to raise our PHP minimum version.
Unless there are people that disagree, it is no longer a question of
whether to raise it or not, but rather what we want to raise it to. I
believe we have a couple of options:

1) Raise to PHP 5.5 for MW 1.24
2) Raise to PHP 5.4 for MW 1.24, and then when a release with support past
2018 is made, go to 5.5.

Option 2 takes advantage of the possible gap in coverage we will face
should Debian Wheezy receive LTS support.

Thoughts?

[0] http://php.net/archive/2014.php?050329#id2014-08-14-1

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHP 5.3 EOL and MediaWiki 1.24

2014-08-14 Thread Tyler Romeo
Ah, I was not aware WMF was still on 5.3. I guess we’ll revisit this in a few 
months or something then.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Brad Jorsch (Anomie) 
Reply: Wikimedia developers >
Date: August 14, 2014 at 15:13:03
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] PHP 5.3 EOL and MediaWiki 1.24  

This.

At the least, any change to the supported version of PHP isn't going to
happen until the WMF cluster gets updated, and the decision must be
informed by what version the WMF cluster gets updated to (which may be HHVM
rather than Zend).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for "the api.php API"

2014-08-15 Thread Tyler Romeo
Agreed with Aaron. When these proposed additional APIs are actually 
implemented, then we can start arguing about what to call them.

I know that I personally will continue to call the API the “core web API” or 
sometimes just the “web API”, if it is clear based on the context in which I am 
talking.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Aaron Halfaker 
Reply: Wikimedia developers >
Date: August 15, 2014 at 11:05:13
To: Wikimedia developers >
Cc: MediaWiki API announcements & discussion 
>
Subject:  Re: [Wikitech-l] [Mediawiki-api] Bikeshedding a good name for "the 
api.php API"  

As a heavy user, I generally just refer to the things api.php does as "the
API", or "MediaWiki's web API" when I'm feeling verbose.

I'd be confused about the "action API" since I generally use it to "read"
which isn't really "action" -- even though it corresponds to "action=query"

As for "the proposed REST API", I don't think that proposed things should
affect the naming scheme of things we already know and love.

Also, I think that all bike sheds should match the color of the house to
(1) denote whose bike shed it is and (2) help tie the yard together like
furniture in a living room.


On Fri, Aug 15, 2014 at 3:50 PM, Sumana Harihareswara  wrote:

> I like "action API".
>
> Sumana Harihareswara
> Senior Technical Writer
> Wikimedia Foundation
>
>
> On Thu, Aug 14, 2014 at 5:06 PM, Brad Jorsch (Anomie) <
> bjor...@wikimedia.org
> > wrote:
>
> > Summing up, it seems like "action API" and "api.php" are the two
> > contenders.
> >
> > "api.php" is least likely to be confused with anything (only its own
> entry
> > point file). But as a name it's somewhat awkward.
> >
> > "action API" might be confused with the Action class and its subclasses,
> > although that doesn't seem like a big deal.
> >
> >
> > As for the rest:
> >
> > Just "API" is already causing confusion. Although it'll certainly
> continue
> > to be used in many contexts.
> >
> > "MediaWiki API", "Web API", and "MediaWiki web API" are liable to be
> > confused with the proposed REST API, which is also supposed to be
> > web-accessible and will theoretically part of MediaWiki (even though I'd
> > guess it's probably going to be implemented as an -oid). "MediaWiki web
> > APIs" may well grow to encompass the api.php action API, the REST API,
> and
> > maybe even stuff like Parsoid.
> >
> > "MediaWiki API" and "Core API" are liable to be confused with the various
> > hooks and PHP classes used by extensions.
> >
> > "JSON API" wouldn't be accurate for well into the future, and would
> likely
> > be confused with other JSON-returning APIs such as Parsoid and maybe
> REST.
> >
> > "Classic API" makes it sound like there's a full replacement.
> >
> > All the code name suggestions would be making things less clear, not
> more.
> > If it had started out with a code name there would be historical inertia,
> > but using a code name now would just be silly.
> >
> >
> > ___
> > Mediawiki-api mailing list
> > mediawiki-...@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
> >
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Configuration related things that happened at Wikimania

2014-08-15 Thread Tyler Romeo
It is probably considered a blocker for the GUI config RFC, since the 
Configuration stuff is needed in order to implement non-global configuration 
formats. With this new config system, configuring MediaWiki is likely to become 
a lot easier in the not-so-distant future.
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Sumana Harihareswara 
Reply: Wikimedia developers >
Date: August 15, 2014 at 11:19:21
To: Wikimedia developers >
Subject:  Re: [Wikitech-l] Configuration related things that happened at 
Wikimania  

Is this work related to
https://www.mediawiki.org/wiki/Requests_for_comment/Extension_registration
,
https://www.mediawiki.org/wiki/Requests_for_comment/Graphical_configuration_interface
, or any other Request for Comment, by the way?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [QA] Prettier Jenkins results in Gerrit

2014-08-20 Thread Tyler Romeo
Yeah just noticed this as well. An awesome change indeed; much easier to read 
and interpret. Thanks!
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: James Forrester 
Reply: Wikimedia developers >
Date: August 20, 2014 at 17:22:42
To: QA (software quality assurance) for Wikimedia projects. 
>
Cc: Wikimedia developers >
Subject:  Re: [Wikitech-l] [QA] Prettier Jenkins results in Gerrit  

On 20 August 2014 13:02, Antoine Musso  wrote:

> Hello,
>
> The tests results being reported to Gerrit are now much nicer. The
> first ever example is https://gerrit.wikimedia.org/r/#/c/155341/
>
>
> James E. Blair from Openstack found a nice trick to inject HTML into
> Gerrit comments. Christian Aistleitner kindly reviewed and tested the
> regex, and further improved the craziness.
>
> Daniel Zahn deployed the change on the spot a few minutes ago and we now
> have slightly nicer and more readable test results being reported.
>
> \O/
>
> Ref:
> https://bugzilla.wikimedia.org/66095


​Lovely! Thanks all.​

J.
--  
James D. Forrester
Product Manager, Editing
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Using magic functions (e.g. __get) to deprecate class members

2014-08-22 Thread Tyler Romeo
My opinion on this matter is that if you were using a variable prefixed with 
“m”, which is clearly one of our conventions for declaring variables private, 
you are asking for trouble. Just recently when the password hashing API patch 
was merged, it caused problems with CentralAuth since it tried accessing 
mPassword directly.

In cases like this, I don’t think there’s any need for a deprecation period, 
because you are playing with fire in the first place.

Obviously, for other variables that don’t start with “m” and are not documented 
as @private or @protected, then we need the grace period.
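
For readers who haven't seen the patch, the shim under discussion looks roughly
like this (a simplified sketch, not the actual change):

    class EditPage {
        protected $mTokenOk;

        // Transitional accessor: old code reading $editPage->mTokenOk keeps
        // working but emits a deprecation warning until the grace period ends.
        public function __get( $name ) {
            if ( $name === 'mTokenOk' ) {
                wfDeprecated( 'EditPage::$mTokenOk', '1.24' );
                return $this->mTokenOk;
            }
            trigger_error( "Undefined property: $name", E_USER_NOTICE );
            return null;
        }
    }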
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Stephan Gambke 
Reply: Wikimedia developers >
Date: August 22, 2014 at 18:33:24
To: Wikimedia developers >
Subject:  [Wikitech-l] Using magic functions (e.g. __get) to deprecate class 
members  

TL;DR: Please review [1]


I was asked to discuss the topic on the mailing list, so here goes.

Since some time Siebrand is making an effort to improve code quality by
making it phpcs-strict compliant [0]. This involves explicitly declaring
the visibility of class members.

Alas, intended or not, in very many cases he not only explicitly declared
the class variable's visibility, but also changed it from the implicit
'public' to an explicit 'protected' or 'private', thus introducing
a major API change without a proper deprecation period. Apparently this
was not noticed (or at least not challenged) by the reviewers. I checked
a few of his commits and they were all merged within hours.

Now, to be clear about that, I appreciate both changes - the phpcs
compliance as well as the more limited accessibility of class variables.
This was long overdue. However, to introduce something like this a
proper deprecation period is in my opinion essential, in particular
considering the extent of the changes. I do believe that Siebrand
checked that the variables he declared protected or private were not
used by extensions. However, he missed one (EditPage::mTokenOk) which
subsequently resulted in a bug in an extension (SemanticForms). Given
the number of the now newly hidden variables I am quite sure, that there
are other cases. If only because many extensions are not hosted on WMF
servers, so they can not be checked.

To keep the changes and still be able to properly deprecate the direct
access to the member variables I submitted a patch [1] that makes use of
PHP magic functions [2]. They provide access to the class members and
issue a deprecation warning. The intention is to keep these functions
for the customary two releases [3], i.e. until 1.26, and then remove them.

This patch was shot down for three reasons:

* it duplicates code
* there is no tests
* our code base barely use the magic methods and I am pretty sure Tim
Starling commented a while back about them being nasty.


When I pointed out that these functions are not re-entrant and thus
Tim's warning [4] did not apply, the answer was "I have merely copy pasted
Tim comment without even attempting to understand what it means". I was
subsequently asked to present this to the mailing list, which I do with
this mail.

I would appreciate comments on the patch (preferably constructive) that
would allow us not to revert Siebrand's changes and still properly
deprecate formerly public class members.

Thanks.

Stephan



[0]
https://gerrit.wikimedia.org/r/#/q/status:merged+project:mediawiki/core+branch:master+topic:phpcs,n,z
[1] https://gerrit.wikimedia.org/r/#/c/151370/
[2]
http://php.net/manual/en/language.oop5.overloading.php#language.oop5.overloading.members
[3] https://www.mediawiki.org/wiki/Deprecation
[4] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/85288#c17504

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The future of skins

2014-08-26 Thread Tyler Romeo
On Tue, Aug 26, 2014 at 8:21 PM, Trevor Parscal 
wrote:

> Thanks for summarizing the meeting Jon.
>
> So, let's get Twig/Swig into core then, eh? :)
>

Please yes. Twig is really powerful, pretty fast, and is supported by a
major company.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RfC Discussions Today/Next week

2014-09-03 Thread Tyler Romeo
Well unfortunately I will not be able to attend either of these meetings 
(mainly since one of them was today). If somebody could make sure to mention 
third-party libraries on my behalf at the September 10th meeting, I would 
greatly appreciate it. (It would really be a facepalm if MediaWiki implemented 
its own ORM….again.)
-- 
Tyler Romeo
0x405D34A7C86B42DF

From: Rachel Farrand 
Reply: Wikimedia developers >
Date: September 3, 2014 at 14:08:22
To: Wikimedia developers >, Development and 
Operations engineers (WMF only) >
Subject:  [Wikitech-l] RfC Discussions Today/Next week  

Hello,
If you are interested in joining today's RfC discussion,
the Architecture Committee will be discussing the following RfCs:

* SOA Authentication (Chris Steipp)
https://www.mediawiki.org/wiki/Requests_for_comment/SOA_Authentication
* Server-side Javascript error logging (Gergő Tisza)
https://www.mediawiki.org/wiki/Requests_for_comment/Server-side_Javascript_error_logging

Wednesday, 03 September 2014, in #wikimedia-office on Freenode IRC,
21:00-22:00 UTC
*http://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+Meeting&iso=20140903T14&p1=224&ah=1
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+Meeting&iso=20140903T14&p1=224&ah=1>*


*Proposed agenda for Sep 10: *
* Data Mapper (Andrew Green)
https://www.mediawiki.org/wiki/Requests_for_comment/Data_mapper
* Dependency injection (Andrew Green)
https://www.mediawiki.org/wiki/Requests_for_comment/Dependency_injection

Wednesday, 10 September 2014, in #wikimedia-office on Freenode IRC,
21:00-22:00 UTC
*http://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+Meeting+Sep+10&iso=20140910T21&p1=1440&ah=1
<http://www.timeanddate.com/worldclock/fixedtime.html?msg=RFC+Meeting+Sep+10&iso=20140910T21&p1=1440&ah=1>*

Thanks!
Rachel
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] I'm leaving the Wikimedia Foundation

2014-09-12 Thread Tyler Romeo
:'( Going to miss you Sumana! (Hopefully we can meet up in the city one
day.)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Fri, Sep 12, 2014 at 2:08 PM, Jeroen De Dauw 
wrote:

> Hey Sumana,
>
> Thanks for writing that up and the bits of helpful advice you gave me over
> the years.
>
> And like Micru, I have to agree with this part:
>
> And I'd like to [...] exclude destructive communication from my life (yes,
> > there's some amount of burnout on toxic people and entitlement).
> >
>
> This is such a tragic waste on many levels. It's very hard to make real
> progress towards an organizations goals and have fun in doing so, if the
> environment contains to much of this. Even if every single person involved
> is putting effort towards those goals. You are definitely not the first
> person to leave the WMF (in part) because of this.
>
> Cheers
>
> --
> Jeroen De Dauw - http://www.bn2vs.com
> Software craftsmanship advocate
> Evil software architect at Wikimedia Germany
> ~=[,,_,,]:3
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] the three phases of patch review

2014-09-24 Thread Tyler Romeo
Great article.

As further reading for anybody who is interested, there is a book called “Best 
Kept Secrets of Peer Code Review”. It addresses the various methods of code 
review and how to best tune it for your project. (It’s available online 
somewhere.)

-- 
Tyler Romeo
0x405D34A7C86B42DF

On September 24, 2014 at 18:03:03, Sumana Harihareswara (suma...@wikimedia.org) 
wrote:

I just read
http://sarah.thesharps.us/2014/09/01/the-gentle-art-of-patch-review/ and it
made a lot of sense to me as a way to speed up the first response a new
patch gets.

"Instead of putting off reviewing first-time contributions and thoroughly
reviewing everything in the contribution at once, I propose a three-phase
review process for maintainers:

1. Is the idea behind the contribution sound?
2. Is the contribution architected correctly?
3. Is the contribution polished?"

The post author, a Linux kernel developer, goes into more detail in the
post; it's worth reading even if you decide this approach isn't your style.

(Reminder: https://www.mediawiki.org/wiki/Gerrit/Code_review/Getting_reviews
and https://www.mediawiki.org/wiki/Gerrit/Code_review are worth
re-skimming.)

Sumana Harihareswara
Senior Technical Writer
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor and Anonymous Users (I know, we've had this discussion a million times)

2014-09-30 Thread Tyler Romeo
I still believe that Nymble is the way to go here. It is the only solution that
successfully allows negotiation of a secure collateral that can still be
blacklisted after abuse has occurred.

Although, as mentioned, it is all about the collateral: making the user
provide something that requires work to obtain.


*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science

On Tue, Sep 30, 2014 at 3:40 PM, Risker  wrote:

> Okay, so I have to ask.  What is this obsession with enabling TOR editing?
>
> Stewards are having to routinely disable significant IP ranges because of
> spamming/vandalism/obvious paid editing/etc through anonymizing proxies,
> open proxies, and VPNs - so I'm not really seeing a positive advantage in
> enabling an editing vector that would be as useful to block as the old AOL
> IPs.[1]  If the advocates of enabling TOR were all willing to come play
> whack-a-mole - and keep doing it, day in and day out, for years - there
> might be something to be said for it.  But it would be a terrible waste of
> a lot of talent, and I'm pretty sure none of you are all that interested in
> devoting your volunteer time that way.
>
> We know what the "technical" solution would be here: to turn the
> on/off switch to "on".  Enabling TOR from a technical perspective is
> simple. Don't forget, while you're at it, to address the unregistered
> editing attribution conundrum that has always been the significant
> secondary issue.
>
> I'd encourage all of you to focus on technical ways to prevent
> abusive/inappropriate editing from all types of anonymizing edit platforms,
> including VPNs, sites like Anonymouse, etc.  TOR is but
> one editing vector that is similarly problematic, and it would boggle the
> minds of most users to discover that developers are more interested in
> enabling another of these vectors rather than thinking about how to prevent
> problems from the ones that are currently not systemically shut down.
>
> Risker/Anne
>
>
> [1] Historical note - back in the day, AOL used to reassign IPs with every
> new link accessed through the internet (i.e., new IP every time someone
> went to a new Wikipedia page).  It was impossible to block AOL vandals.
> This resulted in most of the known AOL IP ranges being blocked, since there
> was no other way to address the problem.
>
>
>
> On 30 September 2014 14:52, Brian Wolff  wrote:
>
> > On 9/30/14, Derric Atzrott  wrote:
> > > Alright, this is a long email, and it acts to basically summarise all
> of
> > the
> > > discussions that have already happened on this topic.  I'll be posting
> a
> > > copy
> > > of it to Mediawiki.org as well so that it will be easier to find out
> > about
> > > what has already been proposed in the future.
> > >
> > > There is a policy side to this, Meta has the "No open proxies" policy,
> > which
> > > would need to be changed, but I doubt that such policies will be
> changed
> > > unless those of us on this list can come up with a good way to allow
> Tor
> > > users
> > > to edit.  If we can come up with a way that solves most of the problems
> > the
> > > community has, then I think there is a good chance that this policy can
> > be
> > > changed.
> >
> >
> > I'd like to add an idea I've been thinking about to make TOR more
> > acceptable.
> >
> > A big part of the problem is that there are hundreds (thousands?) of
> > exit nodes, so if someone is being bad, they just have to wait 5
> > minutes to get a new one, making it very hard to block them.
> >
> > So what we could do, is map all tor connections to appear (To MW) as
> > if they are coming from a few private IP addresses. This way its easy
> > to block temporarily (in case of a whole slew of vandalism comes in),
> > the political decision on whether to block or not becomes a local
> > problem (The best kind of solution to a problem is the type that makes
> > it somebody else's problem ;) I would personally hope that admins
> > would only give short term block to such an address during waves of
> > vandalism, but ultimately it would be up to them.
> >
> > To be explicit, the potential idea is as follows:
> >  *User access via tor
> > *MediaWiki sees its a tor request
> > *Try to do limited browser fingerprinting, to perhaps mitigate the
> > affect of an unclued user not using tor browser being bad ruining it
> > for everyone. Say take a hash of the user-agent and various accept
> > headers, and turn it into a number between 1 and 16.
> 
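
A minimal sketch of that last bucketing step (an illustration only, not code
from the thread; the headers, hash, and private-range mapping are arbitrary
choices):

    <?php
    // Collapse Tor traffic into one of 16 pseudo-addresses keyed off a coarse
    // browser fingerprint, so a block hits a bucket rather than one exit node.
    $ua = isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : '';
    $al = isset( $_SERVER['HTTP_ACCEPT_LANGUAGE'] ) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
    $bucket = ( hexdec( substr( md5( "$ua|$al" ), 0, 6 ) ) % 16 ) + 1;
    // MediaWiki would then see a reserved private address for this bucket.
    $pseudoIp = "10.66.0.$bucket";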

Re: [Wikitech-l] Tor exit node rejects port 443 of WM, but it is disabled for editing

2014-10-02 Thread Tyler Romeo
Well you can also edit from port 80 (unless we deployed site-wide SSL without 
me knowing).

Has a bug been filed for this? This sounds like an Extension:TorBlock issue 
(and is also probably my fault in some manner assuming this is a regression, 
which it may or may not be).

-- 
Tyler Romeo
0x405D34A7C86B42DF

On October 2, 2014 at 7:33:54, Rudolf Dovičín (rudolf.dovi...@gmail.com) wrote:

HTTPS, i.e. port 443, is the ONLY way to edit Wikimedia projects,
so I think such servers (those that reject port 443 of Wikimedia servers)
should be excluded from the public proxy blacklist of Wikimedia.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor exit node rejects port 443 of WM, but it is disabled for editing

2014-10-02 Thread Tyler Romeo
No, they are not usually allowed, but technically there should not be a problem 
with it, since if the exit policy does not allow traffic over port 80 or port 
443, Tor users cannot edit Wikipedia anyway. (The result is that only the 
operator can edit from that IP address.)

Also, the exit policy is verifiable via the node directory.

-- 
Tyler Romeo
0x405D34A7C86B42DF

On October 2, 2014 at 8:09:08, Derric Atzrott (datzr...@alizeepathology.com) 
wrote:

Are Tor exit relays that block access to Wikimedia in their rejection
policies usually allowed to edit Wikimedia projects?
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] RFC meeting this week

2014-10-07 Thread Tyler Romeo
If at all possible, could you discuss Extension registration second? I would 
really appreciate it. I have something until 21:30 UTC, so I will only be able 
to be online for the second half of the meeting.

-- 
Tyler Romeo
0x405D34A7C86B42DF

On October 6, 2014 at 22:08:47, Tim Starling (tstarl...@wikimedia.org) wrote:

The next RFC meeting will discuss the following RFCs:

* Extension registration
<https://www.mediawiki.org/wiki/Requests_for_comment/Extension_registration>

* Change LESS compilation library
<https://www.mediawiki.org/wiki/Requests_for_comment/Change_LESS_compilation_library>

The meeting will be on the IRC channel #wikimedia-office on
irc.freenode.org at the following time:

* UTC: Wednesday 21:00
* US PDT: Wednesday 14:00
* Europe CEST: Wednesday 23:00
* Australia AEST: Thursday 07:00

-- Tim Starling


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Tor and Anonymous Users (I know, we've had this discussion a million times)

2014-10-12 Thread Tyler Romeo
On Sun, Oct 12, 2014 at 3:24 AM, Arlo Breault 
wrote:

> Proposal:


Unless there is further discussion to be had on a new *technical* solution
to Tor users, this is the wrong mailing list to be making these proposals.
At the very least take it to the main wikimedia list, or on-wiki, where
this is a lot more relevant.

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] MediaWiki Technical Debt

2014-10-23 Thread Tyler Romeo
Found this today: https://twitter.com/symfony_en/status/525222757567827968

It's probably a useless statistic, but I still found it amusing. Good to
know we still have less technical debt than WordPress. ;)

*-- *
*Tyler Romeo*
Stevens Institute of Technology, Class of 2016
Major in Computer Science
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki Technical Debt

2014-10-23 Thread Tyler Romeo
Much agreed on this. I tried out SensioLabsInsight once or twice, and overall 
it has never given me anything useful that PhpStorm’s built-in inspections did 
not already catch.

-- 
Tyler Romeo
0x405D34A7C86B42DF

On October 23, 2014 at 11:17:22, Jeroen De Dauw (jeroended...@gmail.com) wrote:

It's based on SensioLabsInsight, which I personally quite dislike.
ScrutinizerCI is hands down better at spotting issues (way to many false
positives on SensioLabsInsight).
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Fwd: Unsolicited digital currency donations

2015-01-09 Thread Tyler Romeo
Link for those interested:
http://prime4commit.com/projects/208

I really dislike these kinds of sites. It’d be one thing if it were just 
donations, but developers can claim their tips and basically get paid for each 
commit, and gamifying our volunteer community like that rubs me the wrong way. 
Then again, it’s just my opinion.

-- 
Tyler Romeo
0x405D34A7C86B42DF

On January 9, 2015 at 03:10:35, Federico Leva (Nemo) (nemow...@gmail.com) wrote:

A website garnering registrations by "donating" a few cents for each merged  
commit. They started a while ago, but it looks like it's expanding. Just FYI  
in case you've not received one yet.

Nemo

 Messaggio inoltrato 
Oggetto: You received a tip for your commit
Data: Fri, 09 Jan 2015 07:36:52 +


Hello, Federico Leva!

You were tipped 0.11 XPM for your commit on Project wikimedia/mediawiki.
Please, log in and tell us your primecoin address to get it.

Your current balance is 0.11 XPM. If you don't enter a primecoin address
your tips will be returned to the project in 30 days.

Sign In

Thanks for contributing to Open Source!


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
