Re: [Wikitech-l] .wiki gTLD

2014-02-22 Thread Gryllida
That is a tough one, since most project names /start/ with wiki. pedia.wiki 
just sounds awkward, as do many others.

On Fri, 21 Feb 2014, at 5:56, Derric Atzrott wrote:
 ICANN just delegated the gTLD .WIKI yesterday.  It's being managed by Top Level
 Design, LLC.  I'm not entirely sure what that means for all of us exactly, but I
 suspect that the WMF is going to want to at least register Wikipedia.wiki and
 Wikimedia.wiki once the gTLD is open for registration.
 
 Some of the new gTLDs are already opening up for registration.  .sexy and
 .tattoo will be opening for registration on 25 February.
 
 It looks like if we want to get .wiki domains we will be getting them sometime
 in May or June during the sunrise period.[1]
 
 ICANN also has a full list of new gTLDs that they have approved.[2]
 
 Thank you,
 Derric Atzrott
 Computer Specialist
 Alizee Pathology
 
 [1]: http://www.namejet.com/Pages/newtlds/tld/WIKI
 [2]: http://newgtlds.icann.org/en/program-status/delegated-strings
 
 

Re: [Wikitech-l] .wiki gTLD

2014-02-22 Thread addshorewiki
en.wiki
data.wiki
meta.wiki
media.wiki
en.books.wiki
en.voyage.wiki

Most of them sound rather plausible.

Addshore
 That is a tough one, since most project names /start/ with wiki.
pedia.wiki just sounds awkward, as do many others.


Re: [Wikitech-l] .wiki gTLD

2014-02-22 Thread Manuel Schneider
On 22.02.2014 at 10:37, Gryllida wrote:
 That is a tough one, since most project names /start/ with wiki. 
 pedia.wiki just sounds awkward, as do many others.

Right, let's get .pedia delegated to the WMF. Also a way to make money ;-)

SCNR.

-- 
Manuel Schneider - Chief Information Officer
Wikimedia CH - Verein zur Förderung Freien Wissens
Lausanne, +41 (21) 340 66 22 - www.wikimedia.ch


Re: [Wikitech-l] .wiki gTLD

2014-02-22 Thread Anthony
I wouldn't think any of those, other than perhaps media.wiki, would
implicate a WMF trademark. As for MediaWiki, the WMF does claim a trademark
on that.


On Sat, Feb 22, 2014 at 5:17 AM, addshorewiki addshorew...@gmail.com wrote:

 en.wiki
 data.wiki
 meta.wiki
 media.wiki
 en.books.wiki
 en.voyage.wiki

 Most of them sound rather plausible.

 Addshore


[Wikitech-l] Making it possible to specify MediaWiki compatibility in extensions

2014-02-22 Thread Jeroen De Dauw
Hey,

As you are probably aware, it has been possible for some time now to
install Composer-compatible MediaWiki extensions via Composer.

Markus Glaser recently wrote an RFC titled "Extension management with
Composer" [0]. This RFC mentioned that it is not possible for extensions to
specify which version of MediaWiki they are compatible with. After
discussing the problem with some people from the Composer community, I
created a commit that addresses this pain point [1]. It's been sitting on
Gerrit getting stale, so some input there would be appreciated.

[0]
https://www.mediawiki.org/wiki/Requests_for_comment/Extension_management_with_Composer
[1] https://gerrit.wikimedia.org/r/#/c/105092/

For your convenience, a copy of the commit message:

~~

Make it possible for extensions to specify which version of MediaWiki

they support via Composer.

This change allows extensions to specify they depend on a specific
version or version range of MediaWiki. This is done by adding the
package mediawiki/mediawiki in their composer.json require section.

As MediaWiki itself is not a Composer package and is quite far away
from becoming one, a workaround was needed, which is provided by
this commit.

It works as follows. When composer install or composer update
is run, a Composer hook is invoked. This hook programmatically
indicates the root package provides MediaWiki, as it indeed does
when extensions are installed into MediaWiki. The package link
of type provides includes the MediaWiki version, which is read
from DefaultSettings.php.

This functionality has been tested and confirmed to work. One needs
a recent Composer version for it to have an effect. The upcoming
Composer alpha8 release will suffice. See
https://github.com/composer/composer/issues/2520

Tests are included. Composer independent tests will run always,
while the Composer specific ones are skipped when Composer is
not installed.

People that already have a composer.json file in their MediaWiki
root directory will need to make the same additions there as this
commit makes to composer-json.example. If this is not done, the
new behaviour will not work for them (though no existing behaviour
will break). The change to the json file has been made in such a
way to minimize the likelihood that any future modifications there
will be needed.

Thanks go to @beausimensen (Sculpin) and @seldaek (Composer) for
their support.

~~
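
For illustration, here is roughly what the require section of an
extension's composer.json could look like under this scheme (the extension
name and version constraint below are made up; only the mediawiki/mediawiki
package name comes from the change itself):

    {
        "name": "example/some-extension",
        "require": {
            "php": ">=5.3.2",
            "mediawiki/mediawiki": ">=1.22"
        }
    }

The intent is that composer install or composer update, run in the
MediaWiki root, then refuses to proceed if the core version read from
DefaultSettings.php falls outside the stated range.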

I also wrote up a little blog post on the topic:
http://www.bn2vs.com/blog/2014/02/15/mediawiki-extensions-to-define-their-mediawiki-compatibility/

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--

[Wikitech-l] Unlicking the cookie: ExtensionStatus extension

2014-02-22 Thread Moriel Schottlender
Hello community,

tl;dr:

I'd love to see if there's anyone in the community who can help me take
charge of ExtensionStatus development, or take it over completely.


Longer version:

About 9 months ago, just before I started Google Summer of Code 2013, I
developed ExtensionStatus, which checks the current status of all
extensions installed in a given wiki and compares them against either their
git branch info (if git is available) or data scraped from the gerrit
website. It gives sysadmins an at-a-glance dashboard so they can see what
needs upgrading.
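
For anyone curious, the git half of that check boils down to something like
the following rough sketch (written in Python purely for illustration; it
is not code from the extension, which is a PHP MediaWiki extension, and the
extensions path is an assumption):

    import os
    import subprocess

    EXT_DIR = '/var/www/wiki/extensions'  # assumed location of the extensions directory

    def git(path, *args):
        # Run a git command inside the given checkout and return its trimmed output.
        return subprocess.check_output(('git',) + args, cwd=path,
                                       universal_newlines=True).strip()

    for name in sorted(os.listdir(EXT_DIR)):
        path = os.path.join(EXT_DIR, name)
        if not os.path.isdir(os.path.join(path, '.git')):
            continue  # not installed from git, nothing to compare against
        branch = git(path, 'rev-parse', '--abbrev-ref', 'HEAD')
        last_commit = git(path, 'log', '-1', '--format=%ci')
        try:
            # Commits the upstream branch has that we do not (needs a prior 'git fetch').
            behind = git(path, 'rev-list', '--count', 'HEAD..@{upstream}')
        except subprocess.CalledProcessError:
            behind = '?'
        print('%-30s %-12s last commit %s, behind by %s'
              % (name, branch, last_commit, behind))

The gerrit-scraping fallback for non-git installs is the messier half of
the problem.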

There was a bit of interest, so I went on and started some testing, which
seemed to go well for a while.

But then I started GSoC, and the project I worked on kept me quite busy.
After the summer I went back to full-time grad school and a part-time job,
both of which kept me from developing the extension beyond its extremely
experimental state.

In the last couple of days I've started seeing renewed interest in the
extension, and I think it could be worth a second look. I'd love to
continue working on it, but with grad school and work my time is really
limited, and I don't think I can take it on alone.

Considering all of that, I thought I'd officially "unlick the cookie" and
see if there is interest from the community in taking it on, either fully
or partially.


Here's the github repo:
https://github.com/mooeypoo/MediaWiki-ExtensionStatus

And here's the extension page:
https://www.mediawiki.org/wiki/Extension:ExtensionStatus

There's also an empty gerrit repo that is awaiting setup:
https://git.wikimedia.org/summary/mediawiki%2Fextensions%2FExtensionStatus
(I didn't really have enough experience with gerrit back then to set it up
properly, and now I'm not sure I'd have the time to develop, review, and
approve code on my own.)

So, does anyone want to join in and help revisit the code?

Cheers,

Moriel
(aka mooeypoo)

-- 
No trees were harmed in the creation of this post.
But billions of electrons, photons, and electromagnetic waves were terribly
inconvenienced during its transmission!

Re: [Wikitech-l] Drop support for PHP 5.3

2014-02-22 Thread Markus Glaser
 We should strongly consider ensuring that the latest stable releases of 
 Ubuntu and probably RHEL (or maybe fedora) can run MediaWiki.
 - Ryan
Kind of agree ;)

I'd like to see the next MediaWiki LTS version (1.23) support PHP 5.3.
MW 1.23 LTS has a scheduled release date at the end of April (we might add a
week or two for safety). After that, no problem from my side (release
management) with dropping PHP 5.3 support.

It might have been said before (tl;dr the whole thread): a vast majority of
3rd-party MW users still relies on PHP 5.3. This is not a reason to block
future improvements, but it is a fact to consider carefully. If we can agree
on 1.23 LTS supporting PHP 5.3, imho, that'll do the trick.

Best,
Markus (mglaser)

Re: [Wikitech-l] Unlicking the cookie: ExtensionStatus extension

2014-02-22 Thread Jeroen De Dauw
Hey,

Checking such extension status is essentially solved by dependency
management, and dependency management is solved by various tools; the one
relevant to MediaWiki and other PHP projects is Composer [0]. In fact this
capability already exists for several extensions, though there is no nice
GUI for it yet. An example of such version checking can be seen at [1],
which checks whether the versions required for a component are up to date
with the ones it provides. In that case the thing being checked for
out-of-date dependencies is a component itself, though the exact same code
can work for a wiki. Some relevant pages on the existing support are [2, 3, 4].
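
As a small, hypothetical illustration of how much of the information is
already on disk: the requirements live in composer.json and the installed
versions in composer.lock, so a bare-bones "required vs. installed" report
for a wiki takes only a few lines. The sketch below (not taken from any
existing tool) only reads the two files; it does not evaluate version
constraints or query Packagist for newer releases, which is where services
like [1] come in.

    import json

    # Assumed to be run from the MediaWiki root directory, next to composer.json.
    with open('composer.json') as f:
        required = json.load(f).get('require', {})
    with open('composer.lock') as f:
        locked = dict((p['name'], p['version'])
                      for p in json.load(f).get('packages', []))

    for package, constraint in sorted(required.items()):
        if package == 'php' or package.startswith('ext-'):
            continue  # platform requirements carry no locked version
        installed = locked.get(package, 'not installed')
        print('%-45s requires %-12s locked at %s'
              % (package, constraint, installed))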

Implementing the ideas I outlined in [4] is not terribly difficult and will
result in very powerful capabilities. Or rather, it will make them a lot
more visible and accessible, which will then likely result in wider
adoption of the underlying paradigm (dependency management) amongst
extension authors. I estimate that building a working prototype can take as
little as two days, as indeed most of the groundwork has already been done
[5].

That is in contrast to scraping things off gerrit, or relying on the info on
MediaWiki.org being comprehensive and up to date, combined with building
one's own solution. I anticipate such a solution would be far less powerful,
much harder to maintain, rather brittle, not interoperable at all, and
harder to create in the first place.

[0] http://getcomposer.org/
[1] https://www.versioneye.com/php/mediawiki:semantic-mediawiki
[2] https://www.mediawiki.org/wiki/Composer
[3]
http://www.bn2vs.com/blog/2014/02/15/introduction-to-composer-for-mediawiki-developers/
[4] http://sourceforge.net/mailarchive/message.php?msg_id=31680734
[5] If you think I'm full of shit, throw me 2k USD and I will prove you
wrong

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--

[Wikitech-l] How to ask for a python package into Labs

2014-02-22 Thread Alex Brollo
I'd need the internetarchive Python package on Labs:
https://pypi.python.org/pypi/internetarchive , a Python client for the
Internet Archive. I can't find how to request that it be installed on Labs.
Can you help me?

It's an interesting package: it can be integrated into a pywikibot and used
to manage both MediaWiki pages and Internet Archive items, reading and
editing metadata and uploading new items/pages. Tpt has encouraged me to go
ahead with this.
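
For anyone unfamiliar with the package, this is roughly the level it
operates at (a minimal sketch; the identifier, file name, and metadata
values are invented, the helper names follow the package's documentation,
so check the docs for the version actually installed):

    from internetarchive import get_item

    item = get_item('some_item_identifier')   # fetch an existing Internet Archive item
    print(item.metadata.get('title'))         # read its metadata

    # Edit a metadata field (requires Internet Archive credentials to be configured).
    item.modify_metadata({'source': 'https://it.wikisource.org/'})

    # Upload a new file to the item, attaching some metadata.
    item.upload('page-scan.djvu', metadata={'mediatype': 'texts'})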

Alex brollo

Re: [Wikitech-l] How to ask for a python package into Labs

2014-02-22 Thread K. Peachey
Bugzilla.

