[Wikitech-l] Re: Fwd: Re: Code coverage for JavaScript testing?

2021-10-28 Thread planetenxin

Hi folks, many thanks for your hints and helpful links!

On 28.10.2021 at 02:29, Jon Robson wrote:

I recommend using Jest <https://mediawiki.org/wiki/Vue.js/Testing> for unit 
testing with code coverage.

In various Vue.js extensions including NearbyPages 
<https://github.com/wikimedia/mediawiki-extensions-NearbyPages> we use Jest 
which is recommended by the Vue.js migration team.

An alternative approach is to use QUnit. In MobileFrontend and Popups we use a 
command line version of QUnit. It’s not tied to Webpack in any way despite the title 
(it would work with packageFiles these days). 
(https://mediawiki.org/wiki/User:Jdlrobson/Developing_with_Webpack_and_ResourceLoader#Writing_unit_tests_and_getting_code_coverage_reports)

When using command-line unit testing you'll need @wikimedia/mw-node-qunit 
<https://github.com/wikimedia/mw-node-qunit> to provide a mock MediaWiki 
environment. Despite the name, this library can also be used with Jest (see NearbyPages for 
an example).

Hope this is helpful!

On Wed, Oct 27, 2021 at 7:51 AM planetenxin <planeten...@web.de> wrote:


Hi Greg,

Actually, I'm more interested in creating coverage reports on a local 
dev box in the context of extension development / local CI (checking the 
coverage of newly created JS tests). I found info about running JS tests but 
little to nothing about coverage. Maybe I missed something.

/Alexander

On 27.10.2021 at 01:37, Greg Grossmeier wrote:
 > On Tue, Oct 26, 2021 at 2:31 AM planetenxin <planeten...@web.de> wrote:
 >
 >     Is there a generic approach to getting coverage reports for 
the JavaScript parts of MW and MW extensions?
 >
 >
 > Is https://doc.wikimedia.org/cover/ helpful in your case?
 >
 > --
 > | Greg Grossmeier              GPG: B2FA 27B1 F7EB D327 6B8E |
 > | Dir. Engineering Productivity     A18D 1138 8E47 FAC8 1C7D |
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/






--

semantic::core go!
Free Enterprise Class MediaWiki Distribution
https://semantic.wiki/de/core

[Wikitech-l] Fwd: Re: Code coverage for JavaScript testing?

2021-10-27 Thread planetenxin


Hi Greg,

Actually, I'm more interested in creating coverage reports on a local dev 
box in the context of extension development / local CI (checking the coverage 
of newly created JS tests). I found info about running JS tests but little 
to nothing about coverage. Maybe I missed something.

/Alexander

On 27.10.2021 at 01:37, Greg Grossmeier wrote:

On Tue, Oct 26, 2021 at 2:31 AM planetenxin <planeten...@web.de> wrote:

Is there a generic approach, how to get some coverage reports for the 
JavaScript parts of MW and MW extensions?


Is https://doc.wikimedia.org/cover/ helpful in your case?

--
| Greg Grossmeier              GPG: B2FA 27B1 F7EB D327 6B8E |
| Dir. Engineering Productivity     A18D 1138 8E47 FAC8 1C7D |


[Wikitech-l] Code coverage for JavaScript testing?

2021-10-26 Thread planetenxin
[1] describes how to do QUnit testing but does not mention whether / how to get 
coverage reports (e.g. with the help of Istanbul). While searching for an 
answer, I found this post [2] about testing challenges for Extension:Popups and 
the solutions implemented for that specific extension.

Is there a generic approach to getting coverage reports for the 
JavaScript parts of MW and MW extensions?

/Alexander

[1] https://www.mediawiki.org/wiki/Manual:JavaScript_unit_testing
[2] 
https://phabricator.wikimedia.org/phame/post/view/96/fast_and_isolated_js_unit_tests/
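One way to get the Istanbul reports asked about above is to run the command-line QUnit tests under nyc, Istanbul's command-line client. The test glob and version ranges below are illustrative, not taken from any particular extension:

```json
{
  "scripts": {
    "test": "qunit 'tests/qunit/**/*.test.js'",
    "coverage": "nyc --reporter=text --reporter=html qunit 'tests/qunit/**/*.test.js'"
  },
  "devDependencies": {
    "nyc": "^15.1.0",
    "qunit": "^2.19.0"
  }
}
```

With a package.json like this, `npm run coverage` prints a text summary and writes an HTML report to the default `coverage/` directory.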


[Wikitech-l] Installing mediawiki/graph-viz fails with Authentication required (gerrit.wikimedia.org)

2021-10-13 Thread planetenxin
An attempt to install mediawiki/graph-viz via composer fails with:

 Installing mediawiki/graph-viz (3.1.0): Authentication required 
(gerrit.wikimedia.org):
  Username:

Anyone else having this issue?

> cat composer.local.json
{
    "require": {
        "mediawiki/graph-viz": "3.1.0"
    }
}

> composer update --no-dev


Re: [Wikitech-l] Gerrit outage

2019-03-19 Thread planetenxin
On 19.03.2019 at 12:21, Andre Klapper wrote:
> planetenxin: Sorry for my previous message, was not meant to be rude.

No worries. Hope Gerrit is back up soon. :-)

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit outage

2019-03-19 Thread planetenxin
Gerrit seems to be offline again.

> On 16 March 2019, Wikimedia Foundation staff observed suspicious activity
> associated with Gerrit and as a precautionary step has taken Gerrit offline
> pending investigation.


Re: [Wikitech-l] [Semediawiki-user] Reminder: SMWCon Fall 2018 - December 12th to 14th, Regensburg, Germany

2018-12-11 Thread planetenxin
On 11.12.2018 at 01:15, Evans, Richard K. (GRC-H000) wrote:
> Is there any option to attend remotely?
> 
> /Rich


We plan to have live streaming via YouTube, plus a recording. Stay tuned and 
watch 

https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2018

to get notified. Hopefully we get the setup working (multi-camera with 
picture-in-picture plus a complex audio setup, phew). Keep your fingers crossed.

/Alexander


[Wikitech-l] Reminder: SMWCon Fall 2018 - December 12th to 14th, Regensburg, Germany

2018-11-27 Thread planetenxin
Dear users, developers and all people interested in semantic wikis,

this is a reminder to get your tickets for SMWCon 2018 - the 15th Semantic 
MediaWiki Conference:


Dates:   December 12th to December 14th 2018 (Wednesday to Friday).
Location:TechBase Regensburg, Franz-Mayer-Straße 1, 93053 Regensburg, 
Germany 
Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2018
Ticket Shop: https://en.xing-events.com/SMWCon_Fall_2018


Conference Day 1 "Business Solutions"

Keynote: Sustainable technical documentation
Werner Fischer, Knowledge Transfer at Thomas-Krenn.AG

Semantic MediaWiki Overview
Ben Fletcher

Data Cockpit - SMW as GDPR compliance tool
Bernhard Krabina, Consultant at KDZ - Centre for Public Administration 
Research

GDPR: Simplify managing "Data Catalogue" and "Records of Processing 
Activities"
Alexander Gesinn, Managing Director & Partner at gesinn.it GmbH & Co. KG

Running an SMW-based integrated management system in highly regulated 
environments
Franz Borrmann, Managing Director at iUS Institut für Umwelttechnologien und 
Strahlenschutz GmbH

Semantic BPMN - Business Process Management the wiki way
Alexander Gesinn, Managing Director & Partner at gesinn.it GmbH & Co. KG

From "Word" to Wiki - Quality Management at Handelslehranstalt Hameln
Bernd Strahler, Headmaster at Handelslehranstalt Hameln

Housing for people with disabilities
Ad Strack van Schijndel

wikifab - explore, make, share
Clément Flipo, Product Manager & Co-Founder at Dokit.io


Conference Day 2 "Education, Research, Science / SMW News / Lightning Talks"

How human perception and classification influence semantic knowledge, and 
vice versa
Marc van Hoof, Researcher, "School for Mental Health and Neuroscience, 
Maastricht University" and "Medical Humanities, Amsterdam UMC"

Use SMW to cohere business knowledge scattered across information sources — 
and integrate on search
Lex Sulzer, Knowledge Management Solutions Architect at dataspects GmbH

Chameleon 2 Skin
Stephan Gambke, Qualification Monitoring Officer, European Space Agency

WS Form
Viktor Schelling, wikibase

SRF Mermaid: Gantt Charts from Semantic Data
Sebastian Schmid, Software Engineer at gesinn.it GmbH & Co. KG

SimpleGraph Semantic MediaWiki Module
Wolfgang Fahl, Founder and owner of BITPlan, Germany

What's new in Semantic MediaWiki 3.0
Karsten Hoffmeyer, Founder and proprietor of WikiHoster.net, Germany

About MediaWiki evolution - information from the recent Wikimedia Technical 
Conference
Karsten Hoffmeyer, Founder and proprietor of WikiHoster.net, Germany

How to successfully lose readers
Sabine Melnicki, Consultant at WikiAhoi

The Embassy of Good Science
Marc van Hoof, Researcher, "School for Mental Health and Neuroscience, 
Maastricht University" and "Medical Humanities, Amsterdam UMC"

A Citizen Science Approach for an SMW-based eHumanities Project
Cornelia Veja, Julian Hocker

Implementation and Application of an interface between project planning 
software and SMW-based documentation
Maarten Becker, iUS Institut für Umwelttechnologien und Strahlenschutz GmbH


Sponsors:
* ArchiXL [0]
* Digitale Gründerinitiative Oberpfalz [1]
* Wikibase Solutions [2]

Organizer:
* gesinn.it GmbH & Co. KG [3].


Looking forward to meeting you in Regensburg!

/Alexander

[0] https://www.archixl.nl/
[1] https://www.digitale-oberpfalz.de/
[2] https://www.wikibase.nl/
[3] https://semantic.wiki


[Wikitech-l] SMWCon Fall 2018 - December 12th to 14th, Regensburg, Germany

2018-09-28 Thread planetenxin
Dear users, developers and all people interested in semantic wikis,

We are happy to announce SMWCon Fall 2018 - the 15th Semantic MediaWiki 
Conference:

Dates:   December 12th to December 14th 2018 (Wednesday to Friday).
Location:TechBase Regensburg, Franz-Mayer-Straße 1, 93053 
Regensburg, Germany 
Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2018
Ticket Shop: https://en.xing-events.com/SMWCon_Fall_2018

SMWCon Fall 2018 will be supported by gesinn.it GmbH & Co. KG [0].

Due to a challenging conference room situation, SMWCon Fall will take place 
very late this year.
What's good about it: participants will have the opportunity to enjoy winter 
atmosphere (including "Christkindlmarkt") in Regensburg :-)

This time, we start Wednesday with two conference days (including a short 
introduction to SMW) followed by a Tutorial / Hackathon Day on Friday.

TICKET SHOP IS ALREADY OPEN

Contributing to the conference: If you want to present your work at the 
conference, please go to the conference page and add your talk there.
To create an attractive program, we will later ask you for further 
information about your proposals.


Looking forward to meeting you in Regensburg!

/Alexander Gesinn
on behalf of the gesinn.it organization team


[0] http://gesinn.it, http://semantic.wiki


[Wikitech-l] setting $wgDefaultUserOptions['language'] = 'de'; fails

2016-02-22 Thread planetenxin
Hi,

for technical reasons (Semantic MediaWiki), I need to set a wiki's (1.25.x)
$wgLanguageCode to English but want the users' default language
set to German.

 $wgLanguageCode = 'en';
 $wgDefaultUserOptions['language'] = 'de';

The $wgDefaultUserOptions['language'] = 'de'; setting seems to be ignored:
new users still get the default 'en'.

Did I miss something?

Many thanks,


Re: [Wikitech-l] git.wikimedia.org down?

2015-11-27 Thread planetenxin
... okay, but how do I download a specific commit ID as a tar.gz from
Phabricator Diffusion, as shown in my example?

On 27.11.2015 at 11:59, Jaime Crespo wrote:
> Please see my comments on: <
> https://phabricator.wikimedia.org/T119701#1834962>
> 
> And the related ticket: <https://phabricator.wikimedia.org/T83702>
> 
> On Fri, Nov 27, 2015 at 11:06 AM, planetenxin <planeten...@web.de> wrote:
> 
>> Since yesterday we could not reach git.wikimedia.org any more.
>>
>> A call like:
>>
>>
>> https://git.wikimedia.org/zip/?r=mediawiki/extensions/AdminLinks.git=2619ed9beede0017f50ed08b20f6ea3a5200a838=gz
>>
>> fails with a timeout.
>>
>> Ping is working.
>>


-- 

semantic::apps by gesinn.it
Business Applications with Semantic Mediawiki.
http://semantic-apps.com


[Wikitech-l] git.wikimedia.org down?

2015-11-27 Thread planetenxin
Since yesterday we have not been able to reach git.wikimedia.org.

A call like:

https://git.wikimedia.org/zip/?r=mediawiki/extensions/AdminLinks.git=2619ed9beede0017f50ed08b20f6ea3a5200a838=gz

fails with a timeout.

Ping is working.


Re: [Wikitech-l] Automatic Transfer SVN Extension to GIT

2015-07-07 Thread planetenxin

Hi Chad,

here?

https://www.mediawiki.org/wiki/Gerrit/New_repositories#Step_4:_Request_space_for_your_extension


On 07.07.2015 at 18:14, Chad wrote:

Please file a request on MW.org for a new repo like usual.




[Wikitech-l] Automatic Transfer SVN Extension to GIT

2015-07-07 Thread planetenxin

Hi list,

is there a procedure / best practice for automatically transferring old SVN-style 
extensions to Git?


Is this something the maintainer of an extension needs to do?

An example would be: https://www.mediawiki.org/wiki/Extension:DateDiff

/planetenxin


[Wikitech-l] API mustposttoken / notoken error

2015-06-15 Thread planetenxin


After an upgrade from 1.23.x to 1.25.1 I get a 'mustposttoken' error 
when I try to edit a page via the API:


{"code":"mustposttoken","info":"The 'token' parameter was found in the 
query string, but must be in the POST body"}


When I remove the token parameter from the query string and add it to the 
POST body, I get:


{"code":"notoken","info":"The token parameter must be set"}

Any ideas?

/Planetenxin


Re: [Wikitech-l] API mustposttoken / notoken error [SOLVED]

2015-06-15 Thread planetenxin

On 15.06.2015 at 22:30, Brad Jorsch (Anomie) wrote:

Token=**%2B%5C


Try it with a lowercase t.


Hey Brad,

lowercase t did solve the issue. Thanks a lot!

\o/
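For later readers: the request shape that resolved this thread (the CSRF token sent in the POST body, with a lowercase parameter name) can be sketched as follows. The endpoint URL, page title, and token value are placeholders:

```javascript
// Sketch of the fix discussed above: put the token in the POST body,
// with a lowercase "token" parameter name (not "Token").
function buildEditRequest(apiUrl, { title, text, summary, token }) {
  // The query string only carries the action and output format;
  // everything else, including the token, goes in the POST body.
  const url = apiUrl + '?action=edit&format=json';
  const body = new URLSearchParams({
    title,
    text,
    summary,
    token, // lowercase "token", in the body -- not the query string
  }).toString();
  return {
    url,
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body,
  };
}

// Example with a placeholder token (note the trailing "+\" that real
// MediaWiki CSRF tokens carry, which must be form-encoded as %2B%5C):
const req = buildEditRequest('http://localhost:8080/wiki01/api.php', {
  title: 'Sandbox',
  text: '{{Organization_01_00}}',
  summary: 'test edit',
  token: 'abc123+\\',
});
console.log(req.body.includes('token=abc123%2B%5C')); // true
```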


Re: [Wikitech-l] API mustposttoken / notoken error

2015-06-15 Thread planetenxin

That's what I get when sniffing using Fiddler:

POST 
http://localhost:8080/wiki01/api.php?action=edit&title=Organization%3ALotnamkix+-+United+States+-+Epping&text=%7B%7BOrganization_01_00%0D%0A%7COrganization+Name%3DLotnamkix%0D%0A%7CStreet%3D96+Calef+Highway%0D%0A%7CZIP+Code%3D03042-2224%0D%0A%7CCity%3DEpping%0D%0A%7CCountry%3DUnited+States%0D%0A%7D%7D&summary=added+by+Kettle+import&format=json 
HTTP/1.1
Cookie: wikidb01_session=**; path=/; 
domain=http://localhost:8080/wiki01/api.php; HttpOnly

User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)
Content-Type: application/x-www-form-urlencoded
Host: localhost:8080
Connection: Keep-Alive
Content-Length: 52

Token=**%2B%5C




Some characters in cookies and tokens have been obfuscated.

On 15.06.2015 at 21:24, Brad Jorsch (Anomie) wrote:

Sanity checks: You're actually doing a POST, not a GET with a body? Your
Content-Type and Content-Length headers on the POST are correct?

After checking that, it would be helpful if you could capture the whole
request being POSTed for further review. Obfuscate cookies and tokens, of
course.







[Wikitech-l] Git Proxy Error

2013-07-21 Thread planetenxin

Hi folks,

I'm constantly getting the following error on https://git.wikimedia.org/:

Proxy Error

The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET /.

Reason: Error reading from remote server

/Alexander


Re: [Wikitech-l] Corporate needs are different (RE: How can we help Corporations use MW?)

2013-02-08 Thread planetenxin

Hi Dan,

thanks a lot for the insights to the vistaprint MediaWiki ecosystem.

Did you give Semantic MediaWiki a try?

/Alexander

On 07.02.2013 at 22:31, Daniel Barrett wrote:

Vistaprint (www.vistaprint.com) has a hugely successful MediaWiki system 
internally. 150,000+ topics, 1000+ active users across several continents, five 
years of history, and a fully supported team of developers to create 
extensions. (We are looking into open-sourcing some of them.)

The main requests from our corporate users are:

0. WYSIWYG editor. No surprise here.

1. A desire for a department to have their own space on the wiki. I'm not talking about access 
control, but (1) customized look & feel, and (2) ability to narrow searches to find articles only within that 
space.  The closest related concept in MediaWiki is the namespace, which can have its own CSS styling, and you 
can search within a namespace using Lucene with the syntax "NamespaceName:SearchString".  However, this 
is not a pleasant solution, because it's cumbersome to precede every article title with "NamespaceName:" 
when you create, link, and search.

If the *concept* of namespaces could be decoupled from its title syntax, this 
would be a big win for us. So a namespace would be a first-class property of an 
article (like it is in the database), and not a prefix of the article title (at 
the UI level).  I've been thinking about writing an extension that provides 
this kind of UI when creating articles, searching for them, linking, etc.

Some way to search within categories reliably would also be a huge win.  Lucene provides 
"incategory:" but it misses all articles with transcluded category tags.

2. Hierarchy. Departments want not only their own space, they want subspaces beneath it. For 
example, a "Human Resources" wiki area with sub-areas of "Payroll", "Benefits", and "Recruiting".  I realize 
Confluence supports this... but we decided against Confluence because you have to choose an article's area when you 
create it (at least when we evaluated Confluence years ago). This is a mental barrier to creating an article, if you 
don't know where you want to put it yet.  MediaWiki is so much better in this regard -- if you want an article, just 
make it, and don't worry where it goes since the main namespace is flat.

I've been thinking about writing an extension that superimposes a hierarchy on 
existing namespaces, and what the implications would be for the rest of the 
MediaWiki UI. It's an interesting problem. Anyone tried it?

3. Tools for organizing large groups of articles. Categories and namespaces are great, and the DPL extension helps a lot. But 
when (say) the Legal department creates 700 articles that all begin with the words "Legal department" (e.g., 
"Legal department policies", "Legal department meeting 2012-07-01", "Legal department lunch", 
etc.), suddenly the AJAX auto-suggest search box becomes a real pain for finding Legal department articles. This is SO COMMON in 
a corporate environment with many departments, as people try to game the search box by titling all their articles with 
"Legal department"... until suddenly it doesn't scale and they're stuck. I'd like to see tools for easily retitling and 
recategorizing large numbers of articles at once.

4. Integration with popular corporate tools like MS Office, MS Exchange, etc. 
We've spent thousands of hours doing this: for example, an extension that 
embeds an Excel spreadsheet in a wiki page (read-only, using a $10,000 
commercial Excel-to-HTML translator as a back-end), and we're looking at 
embedding Exchange calendars in wiki pages next.

5. Corporate reorganizations and article titles. In any company, the names and relationships of departments change. 
What do you do when 10,000 wiki links refer to the old department name?  Sure, you can move the article 
"Finance department" to "Global Finance department" and let redirects handle the rest: now your 
links work. But they still have the old department name, and global search-and-replace is truly scary when wikitext 
might get altered by accident. Also, there's the category called "Finance department". You can't rename 
categories easily. I know you can do it with Pywikipedia, but it's slow and risky (e.g., Pywikipedia used to have a 
bug that killed <noinclude> tags around categories it changed). Categories should be fully first-class so 
renames are as simple as article title changes.

Hope this was insightful/educational...
DanB





Re: [Wikitech-l] Tentative specs for MW releases

2012-10-16 Thread planetenxin
I like the idea of LTS releases, which are very useful in enterprise 
environments with a focus on stability and maintenance.


We typically build our MediaWiki Enterprise stacks on Ubuntu Server LTS...

/Alexander

On 15.10.2012 at 03:26, Mark A. Hershberger wrote:

I said I would lay out my thoughts regarding MW releases this weekend,
so here goes.

First: I want to provide a regular schedule so users know what to
expect, but something that a volunteer (me, for now) can achieve.

Second: I want to provide something that Linux distributors can
incorporate into their distributions.

To fulfill the first point, I think a release twice a year -- like
Ubuntu releases -- makes a lot of sense.  This schedule also works for
Linux distributors like Ubuntu, Fedora, and OpenSuSE

Since I started out using Debian (which has now adopted a 2 year freeze
cycle), I think it also makes sense to provide LTS support.  Platonides
and I (but mostly Platonides) have been working with the Debian
developers to get 1.19 into Wheezy which was frozen in June.

With that in mind, here is what I propose:

  1.18.0 | Security updates till 1.20
  1.19.x | April 2012 (LTS)
  1.20.0 | October 2012
  1.21.0 | April 2013 (Start in May)
  1.22.0 | October 2013 (Start in September)
  1.23.0 | April 2014 (LTS)
  1.24.0 | October 2014
  1.25.0 | April 2015
  1.26.0 | October 2015
  1.27.0 | April 2016 (LTS)

LTS releases will receive updates until (at least) the next LTS release.  This
means security updates, but also other updates that don't require schema
changes, if people are interested in providing them.  Since a couple of
people have put the 1.20.0 milestone on a handful of bugs, I'm assuming
now that they think those are worth merging to the 1.20 series.  I'd
like to get the fixes backported to 1.19 as well, if possible.

Well, that's pretty much what I was thinking.  How does this sound to
you guys?




