Re: [Wikitech-l] Map internationalization launched everywhere, AND embedded maps now live on 276 Wikipedias

2018-05-10 Thread mathieu stumpf guntz

Great, thank you! :)


On 10/05/2018 at 00:32, Joe Matazzoni wrote:

As of today, interactive (Kartographer) maps no longer display in the language 
of the territory mapped; instead, you’ll read them in the content language of 
the wiki where they appear—or in the language their authors specify (subject to 
availability of multilingual data). In addition, mapframe, the feature that 
automatically embeds dynamic maps right on a wiki page, is now live on most 
Wikipedias that lacked the feature. (Not included in the mapframe launch are 
nine Wikipedias [1] that use the stricter version of Flagged Revisions).

If you’re new to mapframe, this Kartographer help page [2] shows how to get 
started putting dynamic maps on your pages. If you’d like to read more about 
map internationalization: this Special Update [3] explains the feature and its 
limitations; this post [4] and this one [5] describe the uses of the new 
parameter, lang="xx", which lets you specify a map’s language. And here are 
some example maps [6] to illustrate the new capabilities.
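For instance, a mapframe embed that specifies its label language might look like the following wikitext. This is a hedged sketch: the coordinates, dimensions and caption are made-up illustration values, and lang is the new parameter described above (see the help page [2] for the full attribute list):

```
<mapframe latitude="48.8566" longitude="2.3522" zoom="10"
    width="400" height="300" lang="fr" text="Paris">
</mapframe>
```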

These features could not have been created without the generous programming 
contributions and advice of our many map-loving volunteers, including Yurik, 
Framawiki, Naveenpf, TheDJ, Milu92, Astirlin, Evad37, Pigsonthewing, Mike Peel, 
Eran Roz, Gareth and Abbe98. My apologies to anyone I’ve missed.

The Map Improvements 2018 [7] project wraps up at the end of June, so please 
give internationalized maps and mapframe a try soon and give us your feedback 
on the project talk page [8]. We’re listening.

[1] https://phabricator.wikimedia.org/T191583
[2] https://www.mediawiki.org/wiki/Help:Extension:Kartographer
[3] 
https://www.mediawiki.org/wiki/Map_improvements_2018#April_18,_2018,_Special_Update_on_Map_Internationalization
[4] 
https://www.mediawiki.org/wiki/Map_improvements_2018#April_25,_2018,_You_can_now_try_out_internationalization_(on_testwiki)
[5] 
https://www.mediawiki.org/wiki/Map_improvements_2018#April_26,_2018:_OSM_name_data_quirks_and_the_uses_of_lang=%E2%80%9Clocal%E2%80%9D
[6] https://test2.wikipedia.org/wiki/Map_internationalization_examples
[7] https://www.mediawiki.org/wiki/Map_improvements_2018
[8] https://www.mediawiki.org/wiki/Talk:Map_improvements_2018
--

Joe Matazzoni
Product Manager, Collaboration
Wikimedia Foundation, San Francisco

"Imagine a world in which every single human being can freely share in the sum of 
all knowledge."




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l




Re: [Wikitech-l] Announcing the Core Platform team

2018-04-12 Thread mathieu stumpf guntz

Thank you Victoria for this announcement.

For those who, like me, didn't know about the platform evolution 
cross-departmental program, it might be interesting to have a look at 
the dedicated wiki [1]. I haven't read it all yet, and there might be 
more relevant resources out there, to which any link would be warmly 
welcome.


Cheers

[1] https://wikifarm.wmflabs.org/platformevolution/index.php/Main_Page


On 10/04/2018 at 19:57, Victoria Coleman wrote:

Hi everyone,

We are very pleased to be making three big announcements:

1. The Platform Evolution (PE) cross-departmental program (CDP) is approved in 
the annual plan for next fiscal year.

This PE CDP was developed with input from staff across our organization in the 
Audiences and Technology Working Group. Together, these staff worked to 
identify some of our most pressing issues while opening healthy discussions 
between both departments and WMDE.

Along with the other Technology programs funded in the annual plan, funding of 
this program represents a renewed commitment by the Foundation to the long term 
health of the technology that is key to supporting our mission, staff and 
communities.


2. We are creating the Core Platform team, a new converged platform and 
services team to be the focus of the Platform Evolution CDP.


We are doing this in order to better support the PE CDP and begin the hard work 
of re-architecting our technology stack into a more sustainable and flexible 
platform, in support of the Wikimedia movement strategic direction. The 
MediaWiki Platform team and the Services Platform team include some of the 
most senior technologists in our community. Their skill sets and experience are 
vitally important for the success of the CDP so they will be moving into the 
new Core Platform team. We want to thank Tim Starling and Marko Obrovac for 
their hard work, leadership and dedication which has brought us to this point. 
They and their teams are doing incredibly important work to sustain our 
software stack day in day out while also looking into the future and guiding 
the reengineering of our platform to support the mission for the years ahead.


3. Corey Floyd will be joining the Technology Department to lead the Core 
Platform team.

In addition to day to day management, Corey will operate in the program 
management capacity for the PE CDP. Corey was instrumental in formulating the 
PE CDP, assembling the program through extensive needs analysis, synthesis, and 
collaboration with Foundation team members and WMDE. He brings a proven track 
record in clarifying stakeholder needs and translating them into amazing 
products.

As many of you know, Corey started at the Foundation as an iOS engineer, was 
promoted to manage the iOS and Android native apps engineers who worked 
tirelessly to evolve the open source apps into award winners beloved by 
millions of our users, and has been operating in an
engineering product owner capacity for the Infrastructure team within 
Audiences-Readers. He is known for his work ethic, thought leadership, real 
world experience, and collegial spirit.

Audiences will be working to backfill Corey’s duties in Apps engineering 
management and Reading Infrastructure product ownership, and work is already 
underway to close these gaps in conjunction with Corey’s cutover to Technology.

We’re happy to make the Platform Evolution CDP official. And please join me in 
welcoming Corey to Technology, where he’ll transition on July 1, 2018. We’ll be 
sharing more updates about the team and the PE CDP in the coming weeks.


We are incredibly excited!


Victoria and Toby

Re: [Wikitech-l] You can now translate Phabricator to your language

2018-03-02 Thread mathieu stumpf guntz
Oops, sorry, now that I have read the whole email, I realize my 
suggestion was in fact already implemented.


Sorry for the noise.


On 03/03/2018 at 07:33, mathieu stumpf guntz wrote:


Hi,

Thank you Niklas for this great news, I'm sure I wasn't alone in finding 
this a major blocker to wider adoption.


That said, it's a rather large single translation project, around 17,000 
messages. I think it would be wise to split it into several virtual 
submodules on translatewiki.net, using the likelihood that an average 
user will encounter the message as the criterion. Surely "login" is more 
important than "Aborted due to file upload failure. You can use $1 to 
skip binary uploads." in order of translation priority. :)


Cheers


On 02/03/2018 at 16:05, Niklas Laxström wrote:

It's now possible to translate Phabricator in translatewiki.net thanks
to the Phabricator developers, Mukunda Modell, and many others who
participated in https://phabricator.wikimedia.org/T225

We are currently in an experimental phase, where these translations
are only used in https://phabricator.wikimedia.org, but the plan is to
propose these translations to Phabricator upstream.

A few languages are already available and the language setting can be
changed at https://phabricator.wikimedia.org/settings

You can help our multilingual user base by translating Phabricator.
You can find some more info about translating this project at
https://translatewiki.net/wiki/Translating:Phabricator, and you can
start translating directly at
https://translatewiki.net/w/i.php?title=Special:Translate=phabricator

The whole Phabricator project is large, with over 17,000 strings to
translate. Some simple and familiar strings can be found under the
"Maniphest" and "Project" sub-groups. "Maniphest" includes strings for
creating and searching tasks, and "Project" includes strings for
managing project workboards with columns. Here are the direct links to
them
*https://translatewiki.net/w/i.php?title=Special:Translate=phabricator-phabricator-maniphest
*https://translatewiki.net/w/i.php?title=Special:Translate=phabricator-phabricator-project

   -Niklas





___
Translators-l mailing list
translator...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/translators-l



Re: [Wikitech-l] You can now translate Phabricator to your language

2018-03-02 Thread mathieu stumpf guntz

Hi,

Thank you Niklas for this great news, I'm sure I wasn't alone in finding 
this a major blocker to wider adoption.


That said, it's a rather large single translation project, around 17,000 
messages. I think it would be wise to split it into several virtual 
submodules on translatewiki.net, using the likelihood that an average 
user will encounter the message as the criterion. Surely "login" is more 
important than "Aborted due to file upload failure. You can use $1 to 
skip binary uploads." in order of translation priority. :)


Cheers


On 02/03/2018 at 16:05, Niklas Laxström wrote:

It's now possible to translate Phabricator in translatewiki.net thanks
to the Phabricator developers, Mukunda Modell, and many others who
participated in https://phabricator.wikimedia.org/T225

We are currently in an experimental phase, where these translations
are only used in https://phabricator.wikimedia.org, but the plan is to
propose these translations to Phabricator upstream.

A few languages are already available and the language setting can be
changed at https://phabricator.wikimedia.org/settings

You can help our multilingual user base by translating Phabricator.
You can find some more info about translating this project at
https://translatewiki.net/wiki/Translating:Phabricator, and you can
start translating directly at
https://translatewiki.net/w/i.php?title=Special:Translate=phabricator

The whole Phabricator project is large, with over 17,000 strings to
translate. Some simple and familiar strings can be found under the
"Maniphest" and "Project" sub-groups. "Maniphest" includes strings for
creating and searching tasks, and "Project" includes strings for
managing project workboards with columns. Here are the direct links to
them
* 
https://translatewiki.net/w/i.php?title=Special:Translate=phabricator-phabricator-maniphest
* 
https://translatewiki.net/w/i.php?title=Special:Translate=phabricator-phabricator-project

   -Niklas


Re: [Wikitech-l] [Wikimedia-l] What's making you happy this week? (Week of 18 February 2018)

2018-02-27 Thread mathieu stumpf guntz
What's making me happy this week is joining the "Telegrafo" discussion 
for ELISo  and I also just 
found Six Degrees of Wikipedia 
.



On 18/02/2018 at 23:12, Pine W wrote:

What's making me happy this week is Isarra's persistence in working on the
Timeless skin. Timeless is based on Winter. [0] [1]

For anyone who would like to try Timeless, it's available in Preferences
under Appearance / Skin.

What's making you happy this week?

Pine
( https://meta.wikimedia.org/wiki/User:Pine )

[0] https://www.mediawiki.org/wiki/Skin:Timeless
[1] https://www.mediawiki.org/wiki/Winter
___
Wikimedia-l mailing list, guidelines at: 
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and 
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: wikimedi...@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l, 




Re: [Wikitech-l] Wikistats 2.0 - Now with Maps!

2018-02-22 Thread mathieu stumpf guntz

Hi Nuria,

Thank you for the report, and congratulations to everyone involved in 
releasing this tool. It would be nice to also have maps of editors and 
active editors.


By the way, is there somewhere I could find the total number of active 
editors across all Wikisources? Surely I could sum it through the API 
(provided it exposes such data for each language), but it would be nice 
if everybody had straightforward access to this kind of cross-language 
data. I think there really are use cases for it; for instance, I'm 
looking for the number of active Wikisourcerers to estimate how many 
attendees we might expect at the next Wikisource conference.
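Lacking a built-in cross-language view, that sum can be computed by hand against the Wikistats 2 AQS REST API. The sketch below is a minimal illustration under stated assumptions: it uses the editors/aggregate endpoint as documented for Wikistats 2, the JSON response shape is assumed from the public docs, and the list of Wikisource domains shown in the usage note is illustrative, not exhaustive:

```python
import json
import urllib.request

# Wikistats 2 AQS base URL for aggregate editor counts.
AQS = "https://wikimedia.org/api/rest_v1/metrics/editors/aggregate"

def editors_url(project, start, end, activity="all-activity-levels"):
    """Build the monthly editors URL for one project
    (registered users, content pages)."""
    return f"{AQS}/{project}/user/content/{activity}/monthly/{start}/{end}"

def total_editors(projects, start, end):
    """Sum monthly editor counts across projects (one HTTP call each).

    Caveat: this counts a person once per wiki, so editors active on
    several Wikisources are counted several times; the API cannot
    deduplicate them.
    """
    total = 0
    for project in projects:
        with urllib.request.urlopen(editors_url(project, start, end)) as resp:
            data = json.load(resp)
        # Assumed response shape: items -> results -> {"editors": N}.
        for item in data.get("items", []):
            for result in item.get("results", []):
                total += result.get("editors", 0)
    return total
```

Usage would look like `total_editors(["fr.wikisource.org", "en.wikisource.org"], "20180101", "20180201")`, extended to the full list of Wikisource domains.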


Cheers.


On 14/02/2018 at 23:15, Nuria Ruiz wrote:

Hello from Analytics team:

Just a brief note to announce that Wikistats 2.0 includes data about
pageviews per project per country for the current month.

Take a look, pageviews for Spanish Wikipedia this current month:
https://stats.wikimedia.org/v2/#/es.wikipedia.org/reading/pageviews-by-country

Data is also available programmatically via APIs:

https://wikitech.wikimedia.org/wiki/Analytics/AQS/Pageviews#Pageviews_split_by_country
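As a sketch of that programmatic access: the snippet below assumes the pageviews top-by-country AQS endpoint and its documented URL shape; the response-parsing keys are assumptions based on the public docs, not verified against a live call:

```python
import json
import urllib.request

# AQS endpoint for monthly pageview totals split by country.
AQS_TOP = "https://wikimedia.org/api/rest_v1/metrics/pageviews/top-by-country"

def top_by_country_url(project, year, month, access="all-access"):
    """Build the URL for one project and month (month is zero-padded)."""
    return f"{AQS_TOP}/{project}/{access}/{year}/{month:02d}"

def fetch_top_countries(project, year, month):
    """Return the ranked per-country list for one month (network call)."""
    with urllib.request.urlopen(top_by_country_url(project, year, month)) as resp:
        data = json.load(resp)
    # Assumed shape: items[0]["countries"] is a ranked list of dicts.
    return data["items"][0]["countries"]
```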

We will be deploying small UI tweaks during this week but please explore
and let us know what you think.

Thanks,

Nuria

Re: [Wikitech-l] Gathering User Inputs for WikiCV Project

2017-12-18 Thread mathieu stumpf guntz

Hi Megha,

First, sorry, I don't really have an idea of what I would like to see 
appear in such a CV generation tool, as I don't personally feel any need 
for such an automated tool.


But I'm interested to know what motivated the launch of this project. 
Certainly there are people looking for such a tool; I would simply like 
to read the discussions that led to this project, if any are available.


I would also like to know the scope of this project, if one has been 
defined. For example, will it be possible to generate a CV for an IP 
address or a range of IPs? And in the same vein, will it be possible to 
generate a CV from several accounts (for example, to include bot 
contributions)? Did the project's inception consider possible privacy 
side effects, to make sure it won't conflict with the privacy policy [1]?


My best wishes to you in realizing this project in a fashion relevant to 
your target users. :)


[1] https://meta.wikimedia.org/wiki/Privacy_policy


Le 17/12/2017 à 19:50, Megha Sharma a écrit :

Hi all,

I'm an Outreachy intern and as a part of
the internship, I'm working on a project, WikiCV.

Through this project we (my mentors - Gergő Tisza and Stephen LaPorte and
I) want to create a contribution summarizing tool which (unlike the
existing ones that focus on statistics and are hard to interpret for
someone not familiar with Wikipedia editing) highlights contributions in an
easy-to-understand manner.

I'm writing this to gather inputs from prospective users of this tool, that
is all the Wikipedia editors!

Basically I want to understand what all things would you like to see in a
tool like this? Or what all would you write in your Wikipedia CV?

You can give in your inputs through mail (I'd be more than happy to get one
:)) or fill out this form. My work
is largely dependent on your inputs, so please pour in your comments/views.
Your help will be quite appreciated!

Eagerly waiting for your inputs :)
Thanks,
Megha Sharma

Re: [Wikitech-l] Wikistats gets a facelift - Alpha Launch of Wikistats 2

2017-12-18 Thread mathieu stumpf guntz



On 15/12/2017 at 23:28, Nuria Ruiz wrote:

> > Is showing "all projects for a specific language" somewhere in the
> > remaining roadmap?
> Let me make sure I understand: Showing metrics in a language-centric
> fashion rather than project-centric fashion?
Well, I mean, when you select any project, you *have* to select a 
language, so the UX is already language-centric, and users will most 
likely expect to find this approach in the "All project families" case 
as well.


On the other hand, it would indeed also be interesting to offer 
statistics for a project regardless of the language version.


And to make things even more interesting, being able to select several 
projects/versions and compare their progression would be wonderful.


Finally, I couldn't find how to filter the Wikisource main domain (which 
does still host some content) from the user interface, although it does 
provide a result when you go directly to 
https://stats.wikimedia.org/v2/#/wikisource.org

> No, it is not something we are considering for the near future (not that
> it is not doable, it is just not a priority). Our future work will be
> around these general areas: "pageviews per project per country" (soon),
> "mobile layouts" (in some months), "localization" (mid term) and
> invisible work in the backend that helps us process this much data in
> less time (all the time, basically).

Per country might be interesting, but having more accurate geographical 
information than the largest administrative regions would be even better.


Cheers,

Re: [Wikitech-l] Wikistats gets a facelift - Alpha Launch of Wikistats 2

2017-12-15 Thread mathieu stumpf guntz

Yes, this is really already in great shape, congratulations and thank you.

Is showing "all projects for a specific language" somewhere in the 
remaining roadmap?



On 14/12/2017 at 21:41, zppix e wrote:

Great work, I love the design. Can't wait for the finished product!

--
Zppix
Volunteer Wikimedia Developer
Volunteer Wikimedia GCI2017 Mentor
enwp.org/User:Zppix
**Note: I do not work for Wikimedia Foundation, or any of its chapters.**


On Dec 14, 2017, at 1:17 PM, Jonathan Morgan wrote:

This is fabulous! Thank you, Erik Zachte, Analytics team, and everyone else
involved in this project for giving us the powerful, usable stats dashboard
we deserve :)

- J

On Thu, Dec 14, 2017 at 5:10 AM, Niharika Kohli wrote:


This is awesome. Great job A-team!

On Thu, Dec 14, 2017 at 12:12 PM, Victoria Coleman wrote:

Hello from Analytics Team!

We are happy to announce the Alpha release of Wikistats 2. Wikistats has 
been redesigned for architectural simplicity, faster data processing, 
and a more dynamic and interactive user experience. First goal is to 
match the numbers of the current system, and to provide the most 
important reports, as decided by the Wikistats community (see survey) 
[1]. Over time, we will continue to migrate reports and add new ones 
that you find useful. We can also analyze the data in new and 
interesting ways, and look forward to hearing your feedback and 
suggestions. [2]

You can go directly to Spanish Wikipedia
https://stats.wikimedia.org/v2/#/es.wikipedia.org

or browse all projects
https://stats.wikimedia.org/v2/#/all-projects

The new site comes with a whole new set of APIs, similar to our existing 
Pageview API but with edit data. You can start using them today, they 
are documented here:

https://wikitech.wikimedia.org/wiki/Analytics/AQS/Wikistats


FAQ:

Why is this an alpha?
There are features that we feel a full-fledged product should have that 
are still missing, such as localization. The data-processing pipeline 
for the new Wikistats has been rebuilt from scratch (it uses 
distributed-computing tools such as Hadoop) and we want to see how it is 
used before calling it final. Also while we aim to update data monthly, 
it will happen a few days after the month rolls because of the amount of 
data to move and compute.

How about comparing data between two wikis?
You can do it with two tabs but we are aware this UI might not solve all 
use cases for the most advanced Wikistats users. We aim to tackle those 
in the future.

How do I file bugs?
Use the handy link in the footer:
https://phabricator.wikimedia.org/maniphest/task/edit/?title=Wikistats%20Bug=Analytics-Wikistats,Analytics

How do I comment on design?
The consultation on design already happened but we are still watching 
the talk page:
https://www.mediawiki.org/wiki/Wikistats_2.0_Design_Project/RequestforFeedback/Round2


[1]
https://www.mediawiki.org/wiki/Analytics/Wikistats/DumpReports/Future_per_report

[2] https://wikitech.wikimedia.org/wiki/Talk:Analytics/Systems/Wikistats





--
Niharika
Software Engineer
Community Tech
Wikimedia Foundation




--
Jonathan T. Morgan
Senior Design Researcher
Wikimedia Foundation
User:Jmorgan (WMF) 



[Wikitech-l] Fwd: Re: [Wikidata] Imperative programming in Lua, do we really want it?

2017-12-06 Thread mathieu stumpf guntz

Following your message Jeroen, there it also is on Wikitech-l now.

 Forwarded message 
Subject: Re: [Wikidata] Imperative programming in Lua, do we really want it?
Date: Wed, 6 Dec 2017 23:53:17 +0100
From: Jeroen De Dauw
Reply-To: Discussion list for the Wikidata project.
To: Discussion list for the Wikidata project.





Hey,

While I am not up to speed with the Lua surrounding Wikidata or 
MediaWiki, I support the call for avoiding overly imperative code where 
possible.


Most Lua code I have seen in the past (which has nothing to do with 
MediaWiki) was very imperative, procedural and stateful. Those are 
things you want to avoid if you want your code to be maintainable, easy 
to understand and testable. Since Lua supports OO and functional styles, 
the language is not an excuse for throwing well-established software 
development practices out of the window.


If the code is currently procedural, I would recommend establishing that 
new code should not be procedural and have automated tests unless there 
is very good reason to make an exception. If some of this code is 
written by people not familiar with software development, it is also 
important to create good examples for them and provide guidance so they 
do not unknowingly copy and adopt poor practices/styles.
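To make that contrast concrete, here is a small made-up Lua sketch (not taken from any real module) of a stateful style versus a stateless, easily testable one:

```lua
-- Stateful, imperative style: output depends on hidden accumulated
-- state, so tests must reset the table between cases.
local builder = { parts = {} }

function builder.add(item)
    table.insert(builder.parts, tostring(item))
end

function builder.render()
    return table.concat(builder.parts, ", ")
end

-- Stateless, functional style: same input always gives the same
-- output, which makes the function trivially testable in isolation.
local function renderList(items)
    local parts = {}
    for _, item in ipairs(items) do
        parts[#parts + 1] = tostring(item)
    end
    return table.concat(parts, ", ")
end

-- renderList({ "a", "b" }) returns "a, b" no matter what ran before.
```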


John, perhaps you can link the code that caused you to start this thread 
so that there is something more concrete to discuss?


(This is just my personal opinion, not some official statement from 
Wikimedia Deutschland)


PS: I just noticed this is the Wikidata mailing list and not the 
Wikidata-tech one :(


Cheers

--
Jeroen De Dauw | https://entropywins.wtf |https://keybase.io/jeroendedauw
Software craftsmanship advocate | Developer at Wikimedia Germany
~=[,,_,,]:3

On 6 December 2017 at 23:31, John Erling Blad wrote:


   With the current Lua environment we have ended up with an imperative
   programming style in the modules. That invites stateful objects,
   which do not create easily testable libraries.

   Do we have some ideas on how to avoid this, or is it simply the way
   things are in Lua? I would really like functional programming with
   chainable calls, but others might want something different?

   John

   ___
   Wikidata mailing list
   wikid...@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikidata



Re: [Wikitech-l] FLIF for Wikimedia

2017-12-06 Thread mathieu stumpf guntz

On 06/12/2017 at 14:27, Thiemo Kreuz wrote:

> Sure. Go and encourage people to upload RAW. That's very much welcome.
Is it? I thought that Commons didn't include any RAW file format in its 
list of authorized file types.


Re: [Wikitech-l] Proposal for a developer support channel

2017-11-24 Thread mathieu stumpf guntz



On 19/11/2017 at 04:33, Brian Wolff wrote:

Neither project:support_desk nor project:current_issues is really meant for
that purpose - support desk is mainly for user and (external) sysadmin
support. And current_issues is the village pump of mediawiki.org (the
website not the software)

Honestly, I kind of think that lqt was better than flow for support desk.
As much as lqt sucked at least search sort of worked.

Although the bigger problem probably is that project:support desk is
protected so new users arent allowed to ask questions(!)

On the subject of search, I do think that
https://lists.wikimedia.org/robots.txt is ridiculous. At least for
technical lists we should let Google in, and the privacy concern is silly
as there are mirrors that are indexed.
The Foundation is not responsible for the possibly illegal behaviour of 
external parties. It's not silly to disallow copyrighted content on 
Commons just because you can find it on popular websites which publish 
it illegally.




--
bawolff

On Saturday, November 18, 2017, Sam Wilson wrote:

Hear hear to being able to properly search past conversations.

I know it's not the fashionably geeky thing to say, but I must admit that
I always find mailing lists to be incredibly annoying, compared to
forums. Not only is searching completely separate from reading, even
browsing old topics is another interface again (assuming one hasn't been
subscribed forever and kept every old message). Then, when you do manage
to find an old message, there's no way to reply to it (short of copying
and pasting and losing context).

Maybe https://www.mediawiki.org/wiki/Project:Support_desk (and its
sibling https://www.mediawiki.org/wiki/Project:Current_issues ?) is the
best place to ask questions about the software, its development, and
other things. If so, let's make that fact much more well advertised!
(Although, I think Flow is brilliant, when it's for discussing a wiki
page — because the topic is already set (effectively by the title of the
page it's attached to). When it's trying to be a host to multiple
unrelated topics, it becomes pretty annoying to use.)

On Sun, 19 Nov 2017, at 05:57 AM, Niharika Kohli wrote:

I'd like to add that having Discourse will provide the one thing IRC
channels and mailing lists fail to - search capabilities. If you hangout
on
the #mediawiki IRC channel, you have probably noticed that we get a lot
of
repeat questions all the time. This would save everyone time and effort.

Not to mention ease of use. Discourse is way more usable than IRC or
mailing lists. Usability is the main reason there are so many questions
about MediaWiki asked on Stackoverflow instead:
https://stackoverflow.com/unanswered/tagged/mediawiki
,
https://stackoverflow.com/questions/tagged/mediawiki-api?sort=newest
,
https://stackoverflow.com/questions/tagged/mediawiki-extensions...

I'd personally hope we can stop asking developers to go to IRC or mailing
lists eventually and use Discourse/something else as a discussion forum
for
support.

On Sat, Nov 18, 2017 at 1:31 PM, Quim Gil wrote:


Hi, I have expanded
https://www.mediawiki.org/wiki/Discourse#One_place_to_seek_developer_support

https://www.mediawiki.org/wiki/Project:Support_desk is the only channel
whose main purpose is to provide support. The volunteers maintaining it
are the ones to decide about its future. There is no rush for any
decisions there. First we need to run a successful pilot.

The rest of channels (like this mailing list) were created for something
else. If these channels stop receiving questions from new developers,
they will continue doing whatever they do now.

> I'd like to understand how adding a venue will improve matters.

For new developers arriving to our shores, being able to ask a first
question about any topic in one place with a familiar UI is a big
improvement over having to figure out a disseminated landscape of wiki
Talk pages, mailing lists and IRC channels (especially if they are not
used to any of these environments). The reason to propose this new space
is them, not us.

On Sat, Nov 18, 2017 at 5:01 PM, MZMcBride wrote:


Brian Wolff wrote:
> On Friday, November 17, 2017, Quim Gil wrote:
> > The Technical Collaboration team proposes the creation of a developer
> > support channel focusing on newcomers, as part of our Onboarding New
> > Developer program. We are proposing to create a site based on Discourse
> > (starting with a pilot in discourse-mediawiki.wmflabs.org) and to point
> > the many existing scattered channels there.
>
> What does pointing existing channels to Discourse mean exactly? Are you
> planning to shut down any existing channels? If so, which ones?

Excellent questions. I'd like to know the answers as well.

I raised a similar point at

Re: [Wikitech-l] Proposal for a developer support channel

2017-11-24 Thread mathieu stumpf guntz

Hi again,

Will the published content be under a free license? That might seem 
obvious, but it's not something guaranteed with IRC or mailing lists.


Legislately,
mathieu


On 19/11/2017 at 01:45, Quim Gil wrote:

On Sat, Nov 18, 2017 at 11:04 PM, Max Semenik wrote:


Who's gonna maintain this installation?


The current status is explained at
https://www.mediawiki.org/wiki/Discourse#Maintenance

This is a proposal coming from the Technical Collaboration team and we have
more or less everything we need to run the pilot. The draft plan already
says that in the mid term (and before moving to production) we need to
clarify what is the involvement of the Wikimedia Cloud Services team (who
also organizes developer support activities) and Operations. These
conversations are just starting with the publication of the draft plan.




Re: [Wikitech-l] Proposal for a developer support channel

2017-11-24 Thread mathieu stumpf guntz

Hi Quim,

Does it already have SUL support?

Quickly,
mathieu


On 19/11/2017 at 01:45, Quim Gil wrote:

On Sat, Nov 18, 2017 at 11:04 PM, Max Semenik wrote:


Who's gonna maintain this installation?


The current status is explained at
https://www.mediawiki.org/wiki/Discourse#Maintenance

This is a proposal coming from the Technical Collaboration team and we have
more or less everything we need to run the pilot. The draft plan already
says that in the mid term (and before moving to production) we need to
clarify what is the involvement of the Wikimedia Cloud Services team (who
also organizes developer support activities) and Operations. These
conversations are just starting with the publication of the draft plan.




Re: [Wikitech-l] Simple overview image about how MW loads resources in clients

2017-11-08 Thread mathieu stumpf guntz



On 08/11/2017 at 14:32, MZMcBride wrote:

Nick Wilson (Quiddity) wrote:
We could switch to Module: or Module:. :-)
lol, Unicode definitely includes an insane number of WTF symbols.
Maybe I should try to re-implement libcaca in Lua within Module:.

Now, for the case of Module:Bananas, I think that Module:Fanciful would
be fine in some cases. In other cases, like an introduction manual, things
like "MyFirstModule" or "UsernameFirstModule" might do the trick. It
also adds the implicit information that camel case is the usual way to write
module names. Moreover, one might even argue that "UsernameFirstModule" could
easily be dynamically generated, creating an awesome custom user
experience which fosters engagement of developers and finally makes the
world a really wonderful place to live in (hmm).

Using CamelCase for module names may be specific to particular wikis. The
English Wikipedia doesn't seem to do this as much:
.
Oh, well, the main point was that you might use "Module:|{{#USERNAME:}}|
first module" in the documentation, or whatever the local convention (if
any) might be. Although one might argue that, from a cache perspective,
that might not look like a very attractive idea. But, actually, I have
no idea if there is a feature which would allow a cache-light substitution
for just some stuff like the user name in the content page.


I guess there are caches for stuff like the username in the user menu
though. To be honest, the longest conversation I ever had about caching so
far was probably with a former colleague who was always complaining
about how much he hates Varnish. So more hints/links on cache management
in the Wikimedia environment would be welcome. I only know there are
some servers in Amsterdam (if that's still up-to-date information) for
serving Europe in read-only mode. I can't remember right now if there is an
equivalent server in Asia, which would definitely make sense. Also we
should soon have some backup on the moon, if I understood correctly:
https://meta.wikimedia.org/wiki/Wikipedia_to_the_Moon


Sorry for this barely on-topic message. Here are some completely 
irrelevant Unicode symbols to seek your forgiveness: ☫⚛⏿࿋


Re: [Wikitech-l] Simple overview image about how MW loads resources in clients

2017-11-08 Thread mathieu stumpf guntz

Ok, thank you everybody for shedding some light on this. :)

Maybe this kind of meme should be documented in some central place, and 
maybe referenced in 
https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker


The fact that banana is friendly, yeah sure, I'm not aware of any banana 
that willingly assaulted someone. The fact that it is innocuous, well, 
really, I think banana is not the least sexually connotated example 
one might find. Personally I don't really have a problem with 
that, although one might argue that for gender parity it would be fair 
to also use other examples.


Now, for the case of Module:Bananas, I think that Module:Fanciful would 
be fine in some cases. In other cases, like an introduction manual, things 
like "MyFirstModule" or "UsernameFirstModule" might do the trick. It 
also adds the implicit information that camel case is the usual way to write 
module names. Moreover, one might even argue that "UsernameFirstModule" could 
easily be dynamically generated, creating an awesome custom user 
experience which fosters engagement of developers and finally makes the 
world a really wonderful place to live in (hmm).


ĝis baldaŭ

On 08/11/2017 at 08:07, Nick Wilson (Quiddity) wrote:

On Tue, Nov 7, 2017 at 10:34 PM, mathieu stumpf guntz
<psychosl...@culture-libre.org> wrote:

Seriously, what is the fantasy with banana? Where does it come from?


If you mean the "?modules=banana|phone" in the image, that is (I
assume) a reference to a long-lasting meme based on kids using
bananas as imaginary phone handsets. See (1969!)
https://www.youtube.com/watch?v=OAX6YPCwJsM and
https://youtu.be/51ZhEjB_KvU , and
http://knowyourmeme.com/memes/bananaphone for more recent iterations.

If you mean https://www.mediawiki.org/wiki/Module:Bananas I would
guess "banana" was chosen as a
https://en.wikipedia.org/wiki/Placeholder_name because it's friendly
and innocuous, and is not ambiguous as "Example" might be.

If you mean https://www.mediawiki.org/wiki/Banana-checker I don't know
the story behind that one!
https://www.mediawiki.org/wiki/Naming_things is hard.



Otherwise, thank you for sharing this information and links.


Agreed!  I've added it to a collection I've been making at
https://wikitech.wikimedia.org/wiki/User:Quiddity/How_does_it_all_work#Images_found_elsewhere
(additions welcome)

quiddity


Re: [Wikitech-l] Simple overview image about how MW loads resources in clients

2017-11-07 Thread mathieu stumpf guntz

Seriously, what is the fantasy with banana? Where does it come from?

Otherwise, thank you for sharing this information and links.

See you soon

On 06/11/2017 at 19:39, Joaquin Oltra Hernandez wrote:

Hi,

We were having a session where we talked about resource loading, code entry
points for the front-end, and how things work on MediaWiki, and we came up
with a small pic to explain the lifecycle for people newer to MediaWiki.

Maybe it could help some people get a better grasp of where files are
coming from and why the load.php URLs are the way they are.

Please, forgive any missing details, and if there is something very wrong
I'd love to correct it, please let me know.

Also to clarify, "Magic" is used as "Dynamic, runtime based, dependent on
the state of your code/client cache/server state & extensions" to shorten
things and in a humorous key.

Links:

- Phab: https://phabricator.wikimedia.org/M232
- Imgur: https://i.imgur.com/DYLqtQf.png

Re: [Wikitech-l] Print styles reminder

2017-10-30 Thread mathieu stumpf guntz

Hey everyone,

I just checked it out, as I did have some pages heavily relying on tables 
which were displaying almost nothing in the "print pdf" version. For 
example Recherche:Lexèmes français relatifs aux structures. 
Now at first glance it looks really neat, and in any case it's a really 
great improvement.


So congratulations and many thanks to all the people involved in this. :)

Happily,
someone (probably)


On 05/10/2017 at 21:32, Chris Koerner wrote:

Hello,
As mentioned a few months ago there has been a significant update to
the design and layout of printed articles.[0] [1] This impacts both
PDF's created from the "Download as PDF" feature and from printing
directly from the web browser.

The feedback and discussion have been incredibly helpful. The team is
now ready to enable these styles as default across all projects. The
deployment is scheduled for Monday, October 9th. If you have used the
feature recently, not much has changed. If you haven't used it
recently we encourage you to check out the new styles. A few features:

* New layout to reduce paper usage
* Clear printing of tables and infoboxes
* Better headings
* Project-specific branding

For more information, please visit the project page on MediaWiki.org.
Comments and feedback can be left on the talk page there.

[0] https://lists.wikimedia.org/pipermail/wikitech-l/2017-August/088565.html
[1] 
https://www.mediawiki.org/wiki/Reading/Web/Projects/Print_Styles#Desktop_Printing

Yours,
Chris Koerner
Community Liaison
Wikimedia Foundation

___
Wikitech-ambassadors mailing list
wikitech-ambassad...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors



[Wikitech-l] Editing a module in the wiki editor leads to overloading the processor

2017-10-30 Thread mathieu stumpf guntz

Hi,

As I feel I haven't gathered enough information to create a Phabricator 
ticket, I'm sending this here in order to obtain some guidance about what 
would be interesting to provide and how to get it. I can say that most 
of the time I'm currently under Fedora 26 with Firefox 56, but that's 
probably not very relevant information.


So the problem is: when I edit a module, sometimes, but not 
systematically, the window at some point becomes slow. There is a 
large latency between typing and the display of the resulting action. And 
indeed, for some reason, one processor core is around 100% due to a 
Firefox thread.


As a side note, I would be interested in any advice to ease my module 
editing process, as direct in-wiki editing and testing quickly becomes 
frustrating. When I edit basic wiki text, I often use "It's All Text", 
which enables editing in my usual text editor. But even this option won't 
work with the code editing widget (the same problem occurs with the 
"syntax highlight" extension).


Cheers


Re: [Wikitech-l] What does "p." stand for in the Scribunto reference manual and many modules

2017-09-30 Thread mathieu stumpf guntz



On 29/09/2017 at 16:16, Brad Jorsch (Anomie) wrote:

On Fri, Sep 29, 2017 at 7:05 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Well, it's nothing functionally important, but I was wondering what the
"p" initial was for, as it's used everywhere as the returned value by
modules.


"package", I believe.

At least it seems relevant, thank you.

  As noted, nothing actually requires that the variable
be named "p", that's just the convention. For that matter, nothing requires
the variable even exist. You could have a module return the method table
directly, like

return {
 hello = function () return "Hello, world!" end
}

and that would be valid. But for a long module that might get hard to read,
and wouldn't allow one method to call another if necessary.
Yes, I was aware of that, but thank you for taking the time to explain it 
clearly again.

On Fri, Sep 29, 2017 at 7:40 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Where? In the reference manual, it's `p` which is used. Should it be
updated to `_module`?

But I'm afraid that could lead to confusion for beginners, since the
console requires using `p`.
What about aliasing `p` with `_module`, and even `m`, in the Scribunto
console?


Let's not overcomplicate everything. If someone would rather use "_module"
or "m" in the console, they can always enter "_module = p" or the like
before starting their testing.
That's right, and now that I have a word to match with the initial, 
I'm fine with p.
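To make the convention concrete for readers skimming the archive, here is a minimal sketch of the usual module skeleton (a generic illustration, not any particular wiki's module):

```lua
-- "p" is just the conventional name for the package table a module returns.
-- Keeping a named table (rather than returning an anonymous one) lets one
-- exported function call another.
local p = {}

function p.hello()
  return "Hello, world!"
end

function p.shout()
  -- one exported function calling another via the shared table
  return string.upper(p.hello())
end

-- A real module would end with:  return p
print(p.hello())  -- Hello, world!
print(p.shout())  -- HELLO, WORLD!
```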


Re: [Wikitech-l] What does "p." stand for in the Scribunto reference manual and many modules

2017-09-29 Thread mathieu stumpf guntz



On 29/09/2017 at 13:25, Danny B. wrote:

Just a small technical note:

As long as "p" is not forced by the software, you are welcome to use
any kind of name for the module within the module code, though the export
to the console will still remain under "p".

We actually use "_module" instead, exactly because of the lack of (common)
knowledge of what "p" stands for, as well as its overall zero
descriptiveness.
Where? In the reference manual, it's `p` which is used. Should it be 
updated to `_module`?


But I'm afraid that could lead to confusion for beginners, since the 
console requires using `p`.
What about aliasing `p` with `_module`, and even `m`, in the Scribunto 
console?





Kind regards


Danny B.

------ Original e-mail --
From: mathieu stumpf guntz <psychosl...@culture-libre.org>
To: Wikimedia developers <wikitech-l@lists.wikimedia.org>
Date: 29. 9. 2017 13:06:37
Subject: [Wikitech-l] What does "p." stand for in the Scribunto reference
manual and many modules
"Hello,

Well, it's nothing functionally important, but I was wondering what the
"p" initial was for, as it's used everywhere as the returned value by
modules. I recall I already asked that somewhere, but I can't remember
whether I received a meaningful answer, sorry.

Kind regards


[Wikitech-l] What does "p." stand for in the Scribunto reference manual and many modules

2017-09-29 Thread mathieu stumpf guntz

Hello,

Well, it's nothing functionally important, but I was wondering what the 
"p" initial was for, as it's used everywhere as the returned value by 
modules. I recall I already asked that somewhere, but I can't remember 
whether I received a meaningful answer, sorry.


Kind regards


Re: [Wikitech-l] Is it possible to change the locale of a scribunto module and have identifiers with locale characters

2017-09-29 Thread mathieu stumpf guntz



On 28/09/2017 at 16:25, Brad Jorsch (Anomie) wrote:

On Thu, Sep 28, 2017 at 5:19 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


According to lua wiki <http://lua-users.org/wiki/Lua
Locales%20In%20Lua%205.1>, in Lua 5.1 "identifiers [are] locale
dependent", and from the reference manual, which states that "[the
documentation is] derived from the Lua 5.1 reference manual <
http://www.lua.org/manual/5.1/index.html>", I guess that Scribunto is
still derived from Lua 5.1.


That's correct.
Ok, thank you, I think that's a very important point. It appears to me 
that Lua developers have a rather "we don't care about backward 
compatibility" approach, so later versions can have significant 
incompatibilities.

By the way, is there an official policy or similar document regarding 
Scribunto's evolution?



So, what I would like is being able to set the locale for a module and use
identifiers with locale characters. But `os.setlocale` isn't accessible in
scribunto modules.


Allowing os.setlocale would very likely cause problems on threaded
webservers where one thread's locale change stomps on another's. It might
even cause trouble for subsequent requests on non-threaded servers if the
locale doesn't get reset, or for other code running during the same request
(e.g. see T107128 <https://phabricator.wikimedia.org/T107128>).
Ok, thank you. I guessed that each Scribunto process was heavily 
sandboxed, especially as everything seems to be done to prevent passing 
information between successive invocations of the same module. I hadn't 
thought of possible side effects on PHP execution as explained in the 
ticket. Do we have some nice (or even ugly) diagram of the PHP/Scribunto 
execution process, so I could have a clearer representation of what's 
happening when I fetch a webpage of a MediaWiki article with some 
Scribunto invocation?


For sanity's sake, on Wikimedia wikis we use C.UTF-8 as the OS-level
locale. This doesn't affect much since MediaWiki usually uses its own i18n
mechanisms instead of using the locale.


Well, ok. I mean that doesn't seem to be a problem for strings, all the 
more so since the mw library provides specific helpers around the topic.


But that's not the concern I was writing about. That is, I can't use 
Unicode identifiers as in `local plâtrière = préamorçage()`. When I see 
UTF-8 somewhere, I would expect no problem using any glyph. So are my 
expectations misguided, or is there something wrong with the way C.UTF-8 
is handled somewhere in the software stack?
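To illustrate the distinction being discussed, here is a sketch assuming stock Lua semantics (outside Scribunto): identifiers must be ASCII under the C/C.UTF-8 locale, while string literals can carry arbitrary UTF-8 bytes.

```lua
-- Compiling an identifier containing a multi-byte UTF-8 letter fails...
local compile = loadstring or load  -- loadstring in Lua 5.1, load in 5.2+
local chunk, err = compile("local plâtrière = 1")
print(chunk == nil)  -- true: syntax error, 'â' cannot appear in a name

-- ...but UTF-8 inside a string literal is just bytes and compiles fine.
local ok = compile("return 'plâtrière'")
print(ok ~= nil)     -- true
print(ok())          -- plâtrière
```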


[Wikitech-l] Using a transcluded string valued with something like "param1|param2|…" as a parameter list for another template [Was: Re: Passing to {{ping}} a list of users stored in a template on Meta]

2017-09-28 Thread mathieu stumpf guntz



On 24/09/2017 at 19:31, bawolff wrote:

Why not just make a template containing {{ping|first
user|second user|third user|...}}

Your issue is almost certainly that the pipes aren't being tokenized
as argument separators when they come from a transcluded template.
(It's the same reason that {{!}} works in tables, except in reverse).

Alternatively, {{ping|{{subst::Wiktionary/Tremendous Wiktionary User
Group/affiliates would probably work.
I encountered a similar problem in another template I'm trying to write. 
Indeed, `subst` works as expected, but I would like to keep the call as is,
so that when saving, the list won't be expanded but will stay dynamic. 
So, using the same example, something like


{{ping|{{param::Wiktionary/Tremendous Wiktionary User Group/affiliates

Is there already a way to do that?

The other option I see is that the called template or module should 
directly return the whole call. Actually, with frame:expandTemplate 
or frame:preprocess it does work for the case I was asking this 
question about.
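For reference, the splitting half of that approach can be sketched in plain Lua; the `frame` calls exist only inside Scribunto and are shown as comments, and the page and template names there are placeholders:

```lua
-- Split a pipe-separated user list into a positional argument table.
local function split_pipes(s)
  local args = {}
  for item in string.gmatch(s, "[^|]+") do
    args[#args + 1] = item
  end
  return args
end

-- Inside a real Scribunto module one might then do (untested sketch):
--   local list = frame:preprocess("{{:Wiktionary/Tremendous Wiktionary User Group/affiliates}}")
--   return frame:expandTemplate{ title = "Reply to", args = split_pipes(list) }

local users = split_pipes("Amqui|Ariel1024|Aryamanarora")
print(#users, users[1])  -- 3	Amqui
```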




--
Brian

On Sun, Sep 24, 2017 at 3:41 PM, mathieu stumpf guntz
<psychosl...@culture-libre.org> wrote:

I'm trying to solve the problem exposed here:
https://meta.wikimedia.org/wiki/User_talk:Psychoslave#Template:Participants
(in French)

In a nutshell, the goal is to be able to ping a group of users on Meta,
so ideally you just type something like {{ping|my_group}}.
But as ping and the underlying "Module:Reply_to" are expecting one user per
argument,
my current idea is to make something like {{ping|{{:/some_group/members
so instead of
{{ping|Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn}}

I can just write
 {{ping|{{:Wiktionary/Tremendous Wiktionary User Group/affiliates for
example

but even if I put verbatim in a template:
Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn
in the page

and call ping with this template as argument, it will end up with "Error
in Template:Reply to: Input contains forbidden characters."

This message is generated by the module Reply_to but I can't change it since
it's protected.

So I'm looking for a way to bypass whatever generates this error, or more
broadly any idea to resolve the exposed problem.


[Wikitech-l] Is it possible to change the locale of a scribunto module and have identifiers with locale characters

2017-09-28 Thread mathieu stumpf guntz

Hello everybody,

According to the lua wiki, in Lua 5.1 "identifiers [are] locale 
dependent", and from the reference manual, which states that "[the 
documentation is] derived from the Lua 5.1 reference manual", I guess that 
Scribunto is still derived from Lua 5.1.


So, what I would like is to be able to set the locale for a module and 
use identifiers with locale characters. But `os.setlocale` isn't 
accessible in Scribunto modules.


Might I have some information about the reasons for disabling it, and 
some feedback on the possibility of enabling it?


See you soon


[Wikitech-l] Is there a way to gather contextual clues about the specific template call which led to a scribunto module execution?

2017-09-26 Thread mathieu stumpf guntz

Here is what I mean with a simple use case

```

== Section 1 ==

Some text.

Hello, we are in {{section}}, leading paragraph is 
{{paragraph|1|{{section.



== Section 2 ==

Some other text.

Hello, we are in {{section}}, leading paragraph is 
{{paragraph|1|{{section.


```

And each call should generate respectively something like

   Hello, we are in Section 1, leading paragraph is "Some text.".

   Hello, we are in Section 2, leading paragraph is "Some other text.".

So, basically the idea is to let the module infer parameters from the 
calling context, rather than

always making them explicit.

I didn't find anything in the manual that appeared relevant to me for 
such a use case.


I'm also welcoming any suggestion to make things in a totally other way. :)


[Wikitech-l] Looking for advice for structuring data module with redundant entries

2017-09-26 Thread mathieu stumpf guntz

Hi,

So in Lua, as far as I can tell, there is no way to directly express 
something like `a = {b=1, c=a.b}`.
Instead one might use functions, and call the table with the instance 
method call operator (colon), like

`a = {b=1, c=function(s) return s.b end}; a.b == a:c() -- true`.

A first minor problem is that it introduces asymmetry in form. A 
solution which would allow calling `a.b` and `a.c` the same way would be 
more interesting for this use case.

Moreover, `a.c = 2` should lead to a state where `a.b == a.c and a.b == 2`
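One way to get that behaviour is to route both keys to the same underlying field via metatables. This is a sketch with a hypothetical `with_aliases` helper, not an existing library:

```lua
-- with_aliases returns a proxy table where some keys are aliases of
-- others: reads and writes on an alias are redirected to the canonical
-- field, so both spellings stay in sync.
local function with_aliases(data, aliases)
  return setmetatable({}, {
    __index = function(_, k) return data[aliases[k] or k] end,
    __newindex = function(_, k, v) data[aliases[k] or k] = v end,
  })
end

local a = with_aliases({ b = 1 }, { c = "b" })  -- a.c is an alias of a.b
print(a.b, a.c)         -- symmetric read syntax, same value
a.c = 2                 -- writing through the alias...
print(a.b == a.c, a.b)  -- ...updates the shared field
```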

More context: the goal is to store lexicological information in data 
modules.
Entries might share one or more pieces of descriptive material. The 
important point which led to this consideration is that the data module 
should be modifiable from outside the module, without requiring a huge 
integrity constraint at each description change.

But maybe I'm not taking the simplest approach here, so feel free to 
suggest a totally different solution.


Kind regards


Re: [Wikitech-l] Scribunto, failure in attempt to concatenate an array

2017-09-25 Thread mathieu stumpf guntz
Thank you Anomie, it helped me a lot. I'm sorry I missed the information 
in the manual.



On 25/09/2017 at 22:21, Brad Jorsch (Anomie) wrote:

On Mon, Sep 25, 2017 at 4:04 PM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi, I have some trouble trying to concatenate an array stored in a
Scribunto data module.

 mw.logObject(p.data().voir_aussi)
 table#1 {
   metatable = table#2
   "NU",
   "nú",
   "nụ",
   "nư",
   "nữ",
   "ñu",
   "ňu",
   ".nu",
   "nu!",
 }
 =table.concat(p.data().voir_aussi) == '' -- true



This is mentioned in the reference manual:[1]

- The table actually returned by mw.loadData() has metamethods that
provide read-only access to the table returned by the module. Since it does
not contain the data directly, pairs() and ipairs() will work but other
methods, including #value, next(), and the functions in the Table
library, will not work correctly.

You'll have to copy the data into a real table before concatenating, or
concatenate manually. The former is likely faster if the table will have
many elements in it.

[1]:
https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#mw.loadData
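A minimal sketch of the copy approach. The read-only proxy is simulated here with a plain `__index` metatable (the real mw.loadData proxy differs in details), just to show why the Table library sees nothing and how copying fixes it:

```lua
-- Copy a proxied array into a real table using regular indexing, which
-- goes through __index; table.concat then works on the plain copy.
local function to_plain_array(data)
  local copy, i = {}, 1
  while data[i] ~= nil do
    copy[i] = data[i]
    i = i + 1
  end
  return copy
end

-- Stand-in for the read-only proxy: holds no data itself.
local raw = { "NU", ".nu", "nu!" }
local proxy = setmetatable({}, { __index = function(_, k) return raw[k] end })

print(table.concat(proxy) == "")            -- true: concat can't see the data
print(table.concat(to_plain_array(proxy)))  -- NU.nunu!
```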





[Wikitech-l] Scribunto, failure in attempt to concatenate an array

2017-09-25 Thread mathieu stumpf guntz
Hi, I have some trouble trying to concatenate an array stored in a 
Scribunto data module.


    mw.logObject(p.data().voir_aussi)
    table#1 {
  metatable = table#2
  "NU",
  "nú",
  "nụ",
  "nư",
  "nữ",
  "ñu",
  "ňu",
  ".nu",
  "nu!",
    }
    =table.concat(p.data().voir_aussi) == '' -- true


If that can help, here is the related pages:

https://fr.wiktionary.org/wiki/Module:Description
https://fr.wiktionary.org/w/index.php?title=Module:Description/data/nu


Re: [Wikitech-l] Passing to {{ping}} a list of users stored in a template on Meta

2017-09-25 Thread mathieu stumpf guntz



On 24/09/2017 at 19:31, bawolff wrote:

Why not just make a template containing {{ping|first
user|second user|third user|...}}

Your issue is almost certainly that the pipes aren't being tokenized
as argument separators when they come from a transcluded template.
(It's the same reason that {{!}} works in tables, except in reverse).

Alternatively, {{ping|{{subst::Wiktionary/Tremendous Wiktionary User
Group/affiliates would probably work.

Great, it looks like it works, thank you.

My idea was that the parameter list should be generated from a bullet 
list written/included in "Wiktionary/Tremendous Wiktionary User
Group". If you know a template which already does that well, or have any 
suggestion regarding the implementation, please let me know.





--
Brian

On Sun, Sep 24, 2017 at 3:41 PM, mathieu stumpf guntz
<psychosl...@culture-libre.org> wrote:

I'm trying to solve the problem exposed here:
https://meta.wikimedia.org/wiki/User_talk:Psychoslave#Template:Participants
(in French)

In a nutshell, the goal is to be able to ping a group of users on Meta,
so ideally you just type something like {{ping|my_group}}.
But as ping and the underlying "Module:Reply_to" are expecting one user per
argument,
my current idea is to make something like {{ping|{{:/some_group/members
so instead of
{{ping|Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn}}

I can just write
 {{ping|{{:Wiktionary/Tremendous Wiktionary User Group/affiliates for
example

but even if I put verbatim in a template:
Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn
in the page

and call ping with this template as argument, it will end up with a "Error
in Template:Reply to: Input contains forbidden characters."

This message is generated by the module Reply_to but I can't change it since
it's protected.

So I'm looking for a way to bypass whatever generates this error, or more
broadly any idea to resolve the exposed problem.


Re: [Wikitech-l] Passing to {{ping}} a list of users stored in a template on Meta

2017-09-24 Thread mathieu stumpf guntz
I didn't check one by one, but calling the ping template directly with 
all the parameters works, while calling ping with a template whose 
transclusion provides the same string, i.e. "Amqui|Ariel1024|…", doesn't. 
So maybe it's the vertical bar "|" which is not interpreted as a 
parameter separator in this case, and the whole string is passed as a 
single parameter.



On 24/09/2017 at 17:59, יגאל חיטרון wrote:

Hi. Is there any chance for really forbidden characters? For example, did
you try to check the input characters one by one? Or, maybe the last user
name came with end of line, as part of template transclusion, and this will
kill the mw.title.new/1 function.
Igal (User:IKhitron)


2017-09-24 18:41 GMT+03:00 mathieu stumpf guntz <
psychosl...@culture-libre.org>:


I'm trying to solve the problem exposed here:
https://meta.wikimedia.org/wiki/User_talk:Psychoslave#Templa
te:Participants (in French)

In a nutshell, the goal is to be able to ping a group of users on Meta,
so ideally you just type something like {{ping|my_group}}.
But as ping and the underlying "Module:Reply_to" are expecting one user
per argument,
my current idea is to make something like {{ping|{{:/some_group/members}}}}
so instead of
{{ping|Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn}}

I can just write
 {{ping|{{:Wiktionary/Tremendous Wiktionary User Group/affiliates
for example

but even if I put verbatim in a template:
Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn in the page

and call ping with this template as argument, it will end up with a
"Error in Template:Reply to: Input contains forbidden characters."

This message is generated by the module Reply_to but I can't change it
since it's protected.

So I'm looking for a way to bypass whatever generates this error, or more
broadly any idea to resolve the exposed problem.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


[Wikitech-l] Passing to {{ping}} a list of users stored in a template on Meta

2017-09-24 Thread mathieu stumpf guntz
I'm trying to solve the problem described here: 
https://meta.wikimedia.org/wiki/User_talk:Psychoslave#Template:Participants 
(in French)


In a nutshell, the goal is to be able to ping a group of users on Meta:
ideally, you would just type something like {{ping|my_group}}.
But as ping and the underlying "Module:Reply_to" expect one user
per argument,

my current idea is to make something like {{ping|{{:/some_group/members}}}}
so instead of
{{ping|Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn}}

I can just write
    {{ping|{{:Wiktionary/Tremendous Wiktionary User Group/affiliates}}}}
for example


but even if I put verbatim in a template:
Amqui|Ariel1024|Aryamanarora|Benoît_Prieur|Daniel_Kinzler_(WMDE)|Delarouvraie|Epantaleo|Ernest-Mtl|GastelEtzwane|JackPotte|Jberkel|Jitrixis|Kimdime|LA2|LaMèreVeille|Lydia_Pintscher_(WMDE)|Lyokoï|M0tty|Malaysiaboy|Marcmiquel|Micru|Nattes_à_chat|Nemo_bis|Noé|Otourly|Pamputt|psychoslave|Rich_Farmbrough|Rodelar|Satdeep_Gill|Sebleouf|Shavtay|Stalinjeet|S_The_Singer|TAKASUGI_Shinji|TaronjaSatsuma|Thibaut120094|Thiemo_Mättig_(WMDE)|tpt|Trizek_(WMF)|VIGNERON|Vive_la_Rosière|Xabier_Cañas|Xenophôn 
in the page


and call ping with this template as its argument, it will end up with a 
"Error in Template:Reply to: Input contains forbidden characters."


This message is generated by the module Reply_to but I can't change it 
since it's protected.


So I'm looking for a way to bypass whatever generates this error, or more 
broadly any idea that resolves the problem described above.
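
One possible workaround, sketched here as untested Scribunto Lua (the module name Module:PingGroup and the group-page layout are hypothetical): let a module read the member page itself, split it on the pipes, and expand {{ping}} with one user per argument, so that Module:Reply_to never receives a raw pipe-separated string.

```lua
-- Hypothetical Module:PingGroup -- a sketch, not tested on Meta.
-- Reads a page containing "Name1|Name2|..." and expands {{ping}} with
-- one user per argument, so Module:Reply_to never sees a raw "|".
local p = {}

function p.ping( frame )
    local pageName = frame.args[1]  -- e.g. the group's members page
    local content = mw.title.new( pageName ):getContent() or ''
    -- split on literal "|" (third argument disables Lua patterns)
    local users = mw.text.split( mw.text.trim( content ), '|', true )
    return frame:expandTemplate{ title = 'ping', args = users }
end

return p
```

It could then be invoked as {{#invoke:PingGroup|ping|Wiktionary/Tremendous Wiktionary User Group/affiliates}}, assuming that page contains only the pipe-separated names.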


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mapping Hiragana and Katakana

2017-09-21 Thread mathieu stumpf guntz



On 20/09/2017 at 03:40, Trey Jones wrote:


Anyway, would it be a big deal to show the transliterated results
with less weight in ranking? 



Doing any special weighting would be more difficult, but they would 
already be naturally ranked lower for not being exact matches. (You 
can see this at work if you compare the results for /resume, resumé,/ 
and /résumé/ on English Wikipedia, for example.)

Interesting to know. Thank you.


Actually, add an option button in advanced search in any case, and
just limit the discussion to whether it should be opt-in or opt-out.


There are longer term plans for revamping advanced search 
capabilities, so if we want to go that route, it's doable, but it 
would definitely be on hold for a while. Options that have been 
mentioned include a special case keyword like "kana:オオカミ", or a more 
generic keyword like "phonetic:オオカミ" that was smart enough to know 
what to do with kana, but might do something different with other 
characters... but that's all at the vague ideation stage right now.
Well, I would expect "phonetic:" to bind to something like IPA, but 
the concept of a keyword is interesting.


Thanks!


Trey Jones
Sr. Software Engineer, Search Platform
Wikimedia Foundation

On Tue, Sep 19, 2017 at 8:29 PM, mathieu stumpf guntz 
<psychosl...@culture-libre.org> wrote:




On 19/09/2017 at 23:47, Trey Jones wrote:

We recently got a suggestion via Phabricator[1] to automatically map
between hiragana and katakana when searching on English Wikipedia and other
wiki projects. As an always-on feature, this isn't difficult to implement,
but major commercial search engines (Google.jp, Bing, Yahoo Japan,
DuckDuckGo, Goo) don't do that. They give different results when searching
for hiragana/katakana forms (for example, オオカミ/おおかみ "wolf"). They also give
different *numbers* of results, seeming to indicate that it's not just
re-ordering the same results (say, so that results in the same script are
ranked higher).[2] I want to know what they know that I don't!

Does anyone have any thoughts on whether this would be useful (seems that
it would) and whether it would cause any problems (it must, or otherwise
all the other search engines would do it, right?).

Well, maybe. Or not. Look how DuckDuckGo continues to only offer a
"country" option to filter *languages*. Now both might be
complementary, but personally I'm generally more interested in the
latter, all the more when I'm using a language which has no country
using it as an official language. :)

Anyway, would it be a big deal to show the transliterated results
with less weight in ranking? Actually, add an option button in
advanced search in any case, and just limit the discussion to
whether it should be opt-in or opt-out.


Any idea why it might be different between a Japanese-language wiki and a
non-Japanese-language wiki? We often are more aggressive in matching
between characters that are not native to a given language--for example,
accents on Latin characters are generally ignored on English-language
wikis. So it might make sense to merge hiragana and katakana on
English-language wikis but not Japanese-language wikis.
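
For reference, the hiragana-to-katakana mapping itself is mechanical: the two scripts occupy parallel Unicode blocks a fixed 0x60 codepoints apart. A minimal Lua 5.3 sketch (Scribunto's Lua 5.1 would need mw.ustring instead of the utf8 library):

```lua
-- Hiragana -> katakana is a fixed +0x60 codepoint shift
-- (U+3041..U+3096 -> U+30A1..U+30F6); other characters pass through.
local function hiraganaToKatakana( s )
    local out = {}
    for _, cp in utf8.codes( s ) do
        if cp >= 0x3041 and cp <= 0x3096 then
            cp = cp + 0x60
        end
        out[#out + 1] = utf8.char( cp )
    end
    return table.concat( out )
end

print( hiraganaToKatakana( "おおかみ" ) )  -- オオカミ
```

The open question in the thread is not the mapping but whether and where the search index should apply it.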

Thanks very much for any suggestions or information!
—Trey


どういたしました。(You're welcome.)

[1] https://phabricator.wikimedia.org/T176197
[2] Details of my tests at https://phabricator.wikimedia.org/T173650#3580309

Trey Jones
Sr. Software Engineer, Search Platform
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l






Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-20 Thread mathieu stumpf guntz



On 20/09/2017 at 14:56, Brad Jorsch (Anomie) wrote:

On Tue, Sep 19, 2017 at 8:12 PM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Well, actually, depending on what you mean by "have page parses make
edits to the wiki", I'm not sure what I'm looking for falls under this
umbrella.

What I would like is a way to do something like

local data = mw.loadData( 'Module:Name/data/entry' )
-- do some stuff with `data`
mw.saveData( 'Module:Name/data/entry', data)

That's it.


And that, like all Scribunto modules, would run during the parse of the
page.
OK. I probably lack knowledge about the page generation pipeline, so if 
there is information you think I should be aware of to understand the 
problems it might generate, please provide it; links are an appropriate 
answer as far as I'm concerned.



If you do think it's a horribly awkward, awfully disgusting idea, I would
be interested to know more technical details about what problems it might
lead to (not the overall verdict of nightmarish hell on earth that I'm
obviously targeting).


Confusion for users when purging a page results in edits somewhere else.
How would purging a page lead to such a result? In my proposal, 
templates with an updating parameter would never be recorded in 
the wikitext; they would be removed through an in-place template replacement.

Misattribution of the resulting edits.
That sounds more like a point of specification that an implementation 
must respect than a technical impossibility; or maybe I'm missing something.

Opportunities for vandals to misuse
it.
Sure, but vandals can misuse anything, including bots, which in this 
regard offer far more latitude.

  Performance issues.

Well, it's hard to gauge from a mere specification, isn't it?

  And T67258.

Should be discussed on the ticket itself.




Your initial idea of somehow hooking into the editor (whether that's the
wikitext editor or VE) with JavaScript to allow humans to make edits to the
data module while editing another page was much better.

I hadn't even thought about JS, actually. For the wikitext removal of the
updating parameter, I had in mind some in-place template substitution.
Does JavaScript allow changing another page on the wiki, especially a
data module, at the point when the user edits/saves an article?


That's what your original message sounded like you were talking about. A
JavaScript gadget of whatever sort could make edits by using the action API.
Ok, then that might be an avenue worth exploring for me. I guess 
https://www.mediawiki.org/wiki/Manual:Interface/JavaScript is a good 
starting point but any documentation link is welcome.








___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Mapping Hiragana and Katakana

2017-09-19 Thread mathieu stumpf guntz



On 19/09/2017 at 23:47, Trey Jones wrote:

We recently got a suggestion via Phabricator[1] to automatically map
between hiragana and katakana when searching on English Wikipedia and other
wiki projects. As an always-on feature, this isn't difficult to implement,
but major commercial search engines (Google.jp, Bing, Yahoo Japan,
DuckDuckGo, Goo) don't do that. They give different results when searching
for hiragana/katakana forms (for example, オオカミ/おおかみ "wolf"). They also give
different *numbers* of results, seeming to indicate that it's not just
re-ordering the same results (say, so that results in the same script are
ranked higher).[2] I want to know what they know that I don't!

Does anyone have any thoughts on whether this would be useful (seems that
it would) and whether it would cause any problems (it must, or otherwise
all the other search engines would do it, right?).

Well, maybe. Or not. Look how DuckDuckGo continues to only offer a
"country" option to filter *languages*. Now both might be complementary,
but personally I'm generally more interested in the latter, all the
more when

I'm using a language which has no country using it as an official language. :)

Anyway, would it be a big deal to show the transliterated results with less
weight in ranking? Actually, add an option button in advanced search in any
case, and just limit the discussion to whether it should be opt-in or opt-out.



Any idea why it might be different between a Japanese-language wiki and a
non-Japanese-language wiki? We often are more aggressive in matching
between characters that are not native to a given language--for example,
accents on Latin characters are generally ignored on English-language
wikis. So it might make sense to merge hiragana and katakana on
English-language wikis but not Japanese-language wikis.

Thanks very much for any suggestions or information!
—Trey


どういたしました。(You're welcome.)


[1] https://phabricator.wikimedia.org/T176197
[2] Details of my tests at https://phabricator.wikimedia.org/T173650#3580309

Trey Jones
Sr. Software Engineer, Search Platform
Wikimedia Foundation
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-19 Thread mathieu stumpf guntz



On 19/09/2017 at 15:29, Brad Jorsch (Anomie) wrote:

On Tue, Sep 19, 2017 at 2:48 AM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

But having ability to write a limited amount of bytes in a single data

module per script call, and possibly others safeguard limits, wouldn't be
that risky, would it?


It would break T67258 <https://phabricator.wikimedia.org/T67258>. I also
think it's probably a very bad idea to be trying to have page parses make
edits to the wiki.

Well, actually, depending on what you mean by "have page parses make
edits to the wiki", I'm not sure what I'm looking for falls under this 
umbrella.


What I would like is a way to do something like

local data = mw.loadData( 'Module:Name/data/entry' )
-- do some stuff with `data`
mw.saveData( 'Module:Name/data/entry', data)

That's it.






If it's not, please provide me some feedback on the proposal to add such
a function; and if I should document such a proposal elsewhere, please let
me know.


You're free to file a task in Phabricator, but it will be closed as
Declined. There are too many potential issues there for far too little
benefit.
If you do think it's a horribly awkward, awfully disgusting idea, I 
would be interested to know more technical details about what problems it 
might lead to (not the overall verdict of nightmarish hell on earth that 
I'm obviously targeting).


Your initial idea of somehow hooking into the editor (whether that's the
wikitext editor or VE) with JavaScript to allow humans to make edits to the
data module while editing another page was much better.
I hadn't even thought about JS, actually. For the wikitext removal of the 
updating parameter, I had in mind some in-place template substitution.
Does JavaScript allow changing another page on the wiki, especially a 
data module, at the point when the user edits/saves an article?







___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-19 Thread mathieu stumpf guntz



On 19/09/2017 at 00:38, mathieu stumpf guntz wrote:
Well, I have investigated a bit, and so far the only reference where I 
found a description of saving/retrieving data from a Scribunto module 
is SemanticScribunto.
Hum, it was a bit late when I wrote that; of course, *loading* data from 
a Scribunto module is no big deal. The real issue in my way is 
the ability to *save* data. As far as I know, a module currently has no 
way to save data. No doubt it's a good thing that modules can't make 
arbitrary changes to any page on the wiki. But would having the ability 
to write a limited number of bytes to a single data module per script 
call, possibly with other safeguard limits, be that risky?


So if this is something already possible, then please let me know.

If it's not, please provide me some feedback on the proposal to add 
such a function; and if I should document such a proposal elsewhere, 
please let me know.


Kind regards,
mathieu




https://github.com/SemanticMediaWiki/SemanticScribunto
https://upload.wikimedia.org/wikipedia/mediawiki/7/75/EMWCon_Spring_2017_-_Introducing_SemanticScribunto_Extension.pdf 



It's built on Semantic MediaWiki; well, I don't know much about that. 
It all seems interesting, although far larger than what I was 
looking for, at least to begin with. What I would like is a way to 
quickly prototype and discard some trials. This extension seems fine, 
but it requires installing an extension, so it would be more 
difficult to put on an existing wiki where I could get feedback on 
prototypes. Maybe the path of least resistance would be to install a 
MediaWiki instance on Toolforge…



On 17/09/2017 at 14:05, mathieu stumpf guntz wrote:

Saluton ĉiuj kundisvolvantoj (hello, fellow developers),

I think the subject summarizes it all; here are more details on 
what I'm trying to do and what I'm looking for.


# Context

You might skip this section if you are not interested in contextual 
verbiage. If you would like to react to anything stated in this 
section, please change the email subject to reflect that.


So I'm currently meditating ways to improve factorization of 
knowledge stored in Wiktionary.


I'm taking a multi-approach experimentation there. On the one hand, I 
just began a Wikiversity project 
<https://fr.wikiversity.org/wiki/Recherche:Recueil_lexicologique_%C3%A0_l%E2%80%99usage_des_Wiktionnaires> 
(in French) to establish a specification of how a DBMS should be 
structured to be useful for Wiktionaries. It mainly emerged from my 
point of view that the current data model 
<https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model> 
proposed for the Wikidata for Wiktionary project 
<https://www.wikidata.org/wiki/Wikidata:Wiktionary> does not fit the 
needs of Wiktionary contributors. I did make some alternative 
proposals 
<https://www.mediawiki.org/wiki/Extension_talk:WikibaseLexeme/Data_Model>, 
and tried to gather first feedback from the French Wiktionary 
<https://fr.wiktionary.org/wiki/Wiktionnaire:Wikid%C3%A9mie/septembre_2017#Vers_la_conception_d.E2.80.99une_base_de_donn.C3.A9e_relationnelle_con.C3.A7u_pour_servir_de_support_aux_Wiktionnaires> 
on this model too, which led me to create the Wikiversity research 
project, because I was pointed to the lack of a "specify the 
needs extensively before you model" step.


Now, on the other hand, I'm also trying to factorize some data within 
the Wiktionary with currently available tools. One driving topic for 
that is fixing the gender gap 
<https://fr.wiktionary.org/wiki/Discussion_Projet:Parit%C3%A9_des_genres>, 
and more broadly the inflection-form gap. That is, a feminine form will 
generally be summarized in a laconic "feminine form of *some-term*", 
rather than being treated as an entry of its own. That's all the 
more problematic in cases where a word only shares a subset of 
relevant definitions depending on which gender (or inflection-form) it 
applies to.


# What I'm trying to do

I am trying to factorize data which pertains to several 
inflection-forms. This way each form can use it to build a 
stand-alone article about a term. The current approach tends to 
gather everything under a single lemma, although some statements 
only pertain to some specific forms.


So far I have experimented with transclusion of subpages to share 
definitions, examples and so on between inflection-forms. From 
a consultation point of view it works; but from an editing point of 
view, it's anything but fine.


What I would think interesting is to store this data in a Scribunto 
data module (at least for now), and enable users to change it while 
editing a lexical entry article. That might be, when using the 
visual editor, through something like the template popup. Wikitext editors 
will probably be skilled enough to edit the relevant module, but for 
the sake of convenience, it might be interesting to allow giving a 
parameter to the template, which would at publishing ti

Re: [Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-18 Thread mathieu stumpf guntz
Well, I have investigated a bit, and so far the only reference where I 
found a description of saving/retrieving data from a Scribunto module is 
SemanticScribunto.


https://github.com/SemanticMediaWiki/SemanticScribunto
https://upload.wikimedia.org/wikipedia/mediawiki/7/75/EMWCon_Spring_2017_-_Introducing_SemanticScribunto_Extension.pdf

It's built on Semantic MediaWiki; well, I don't know much about that. 
It all seems interesting, although far larger than what I was 
looking for, at least to begin with. What I would like is a way to 
quickly prototype and discard some trials. This extension seems fine, but it 
requires installing an extension, so it would be more difficult to 
put on an existing wiki where I could get feedback on prototypes. Maybe 
the path of least resistance would be to install a MediaWiki instance on 
Toolforge…



On 17/09/2017 at 14:05, mathieu stumpf guntz wrote:

Saluton ĉiuj kundisvolvantoj (hello, fellow developers),

I think the subject summarizes it all; here are more details on what 
I'm trying to do and what I'm looking for.


# Context

You might skip this section if you are not interested in contextual 
verbiage. If you would like to react to anything stated in this 
section, please change the email subject to reflect that.


So I'm currently meditating ways to improve factorization of knowledge 
stored in Wiktionary.


I'm taking a multi-approach experimentation there. On the one hand, I 
just began a Wikiversity project 
<https://fr.wikiversity.org/wiki/Recherche:Recueil_lexicologique_%C3%A0_l%E2%80%99usage_des_Wiktionnaires> 
(in French) to establish a specification of how a DBMS should be 
structured to be useful for Wiktionaries. It mainly emerged from my 
point of view that the current data model 
<https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model> 
proposed for the Wikidata for Wiktionary project 
<https://www.wikidata.org/wiki/Wikidata:Wiktionary> does not fit the needs 
of Wiktionary contributors. I did make some alternative proposals 
<https://www.mediawiki.org/wiki/Extension_talk:WikibaseLexeme/Data_Model>, 
and tried to gather first feedback from the French Wiktionary 
<https://fr.wiktionary.org/wiki/Wiktionnaire:Wikid%C3%A9mie/septembre_2017#Vers_la_conception_d.E2.80.99une_base_de_donn.C3.A9e_relationnelle_con.C3.A7u_pour_servir_de_support_aux_Wiktionnaires> 
on this model too, which led me to create the Wikiversity research 
project, because I was pointed to the lack of a "specify the 
needs extensively before you model" step.


Now, on the other hand, I'm also trying to factorize some data within 
the Wiktionary with currently available tools. One driving topic for 
that is fixing the gender gap 
<https://fr.wiktionary.org/wiki/Discussion_Projet:Parit%C3%A9_des_genres>, 
and more broadly the inflection-form gap. That is, a feminine form will 
generally be summarized in a laconic "feminine form of *some-term*", 
rather than being treated as an entry of its own. That's all the more 
problematic in cases where a word only shares a subset of relevant 
definitions depending on which gender (or inflection-form) it applies to.


# What I'm trying to do

I am trying to factorize data which pertains to several 
inflection-forms. This way each form can use it to build a stand-alone 
article about a term. The current approach tends to gather 
everything under a single lemma, although some statements only 
pertain to some specific forms.


So far I have experimented with transclusion of subpages to share 
definitions, examples and so on between inflection-forms. From a 
consultation point of view it works; but from an editing point of 
view, it's anything but fine.


What I would think interesting is to store this data in a Scribunto 
data module (at least for now), and enable users to change it while 
editing a lexical entry article. That might be, when using the visual 
editor, through something like the template popup. Wikitext editors will 
probably be skilled enough to edit the relevant module, but for the 
sake of convenience, it might be interesting to allow giving a 
parameter to the template, which would at publishing time modify the data 
module and remove the parameter from the generated wikitext.


Let's take an example to make it a bit clearer: the French 
pair "contributeur/contributrice". In both articles, I would like the 
definition to be generated by transclusion with something 
like 
{{definition|vocable=contributrice|lang=French|gloss=contributor}}. 
Note that this template might, by default, take into account the name 
of the calling page, thus avoiding the "vocable" parameter. Also, lang 
would be required in "contributrice", as this is a vocable which 
exists at least in French and Italian; but it would not be required 
in "contributeur", nor "contributore". Finally, gloss is a string whose 
purpose is to distinguish a given term in case of homonymy. When

Re: [Wikitech-l] Content Translation office hour and online meeting on September 20, 2017 (Wednesday) at 1300 UTC

2017-09-18 Thread mathieu stumpf guntz

Hello Runa,

Thank you for the information.


On 18/09/2017 at 11:06, Runa Bhattacharjee wrote:


This session is going to be an online discussion over Google
Hangouts/Youtube with a simultaneous IRC conversation. Due to the
limitation of Google Hangouts, only a limited number of participation slots
are available. Hence, do please let us know in advance if you would like to
join in the Hangout. The IRC channel will be open for interactions during
the session.
Did you consider using another solution, such as the FLOSS 
Jitsi?

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Is it possible to edit scribunto data module content through template edit popup of visual editor?

2017-09-17 Thread mathieu stumpf guntz

Saluton ĉiuj kundisvolvantoj (hello, fellow developers),

I think the subject summarizes it all; here are more details on what 
I'm trying to do and what I'm looking for.


# Context

You might skip this section if you are not interested in contextual 
verbiage. If you would like to react to anything stated in this section, 
please change the email subject to reflect that.


So I'm currently meditating ways to improve factorization of knowledge 
stored in Wiktionary.


I'm taking a multi-approach experimentation there. On the one hand, I 
just began a Wikiversity project 
<https://fr.wikiversity.org/wiki/Recherche:Recueil_lexicologique_%C3%A0_l%E2%80%99usage_des_Wiktionnaires> 
(in French) to establish a specification of how a DBMS should be 
structured to be useful for Wiktionaries. It mainly emerged from my 
point of view that the current data model 
<https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/Data_Model> 
proposed for the Wikidata for Wiktionary project 
<https://www.wikidata.org/wiki/Wikidata:Wiktionary> does not fit the needs 
of Wiktionary contributors. I did make some alternative proposals 
<https://www.mediawiki.org/wiki/Extension_talk:WikibaseLexeme/Data_Model>, 
and tried to gather first feedback from the French Wiktionary 
<https://fr.wiktionary.org/wiki/Wiktionnaire:Wikid%C3%A9mie/septembre_2017#Vers_la_conception_d.E2.80.99une_base_de_donn.C3.A9e_relationnelle_con.C3.A7u_pour_servir_de_support_aux_Wiktionnaires> 
on this model too, which led me to create the Wikiversity research 
project, because I was pointed to the lack of a "specify the 
needs extensively before you model" step.


Now, on the other hand, I'm also trying to factorize some data within the 
Wiktionary with currently available tools. One driving topic for that is 
fixing the gender gap 
<https://fr.wiktionary.org/wiki/Discussion_Projet:Parit%C3%A9_des_genres>, 
and more broadly the inflection-form gap. That is, a feminine form will 
generally be summarized in a laconic "feminine form of *some-term*", 
rather than being treated as an entry of its own. That's all the more 
problematic in cases where a word only shares a subset of relevant 
definitions depending on which gender (or inflection-form) it applies to.


# What I'm trying to do

I am trying to factorize data which pertains to several 
inflection-forms. This way each form can use it to build a stand-alone 
article about a term. The current approach tends to gather 
everything under a single lemma, although some statements only 
pertain to some specific forms.


So far I have experimented with transclusion of subpages to share 
definitions, examples and so on between inflection-forms. From a 
consultation point of view it works; but from an editing point of view, 
it's anything but fine.


What I would think interesting is to store this data in a Scribunto 
data module (at least for now), and enable users to change it while 
editing a lexical entry article. That might be, when using the visual 
editor, through something like the template popup. Wikitext editors will 
probably be skilled enough to edit the relevant module, but for the sake 
of convenience, it might be interesting to allow giving a parameter to 
the template, which would at publishing time modify the data module and 
remove the parameter from the generated wikitext.


Let's take an example to make it a bit clearer: the French pair 
"contributeur/contributrice". In both articles, I would like the 
definition to be generated by transclusion with something like 
{{definition|vocable=contributrice|lang=French|gloss=contributor}}. Note 
that this template might, by default, take into account the name of the 
calling page, thus avoiding the "vocable" parameter. Also, lang 
would be required in "contributrice", as this is a vocable which exists 
at least in French and Italian; but it would not be required in 
"contributeur", nor "contributore". Finally, gloss is a string whose 
purpose is to distinguish a given term in case of homonymy. When no 
homonym exists, it might be skipped. So in "contributeur", one might 
simply use {{definition}}, but in "contributrice", one should at least 
use {{definition|lang=French}}. Now, that's for the purely consultative 
side of the data.


On the backend side, my idea would be to store this data, at least for 
now, in a Scribunto data module. So for example, 
"Module:Vocable/contributrice" might store all descriptive data 
about this vocable. I haven't yet thought about the exact structure of 
what would be stored in this kind of module, but the idea is that the 
misc. templates such as *definition*, *example*, and so on would serve 
as an interface to these modules, so most contributors would not need to 
care about this structure.
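
To make the backend idea concrete, here is one hypothetical shape such a data module could take (the module name comes from the message above, but every field name is invented for illustration; the message deliberately leaves the exact structure open):

```lua
-- Hypothetical Module:Vocable/contributrice -- illustrative structure only.
-- The misc. templates ({{definition}}, {{example}}, ...) would read from
-- a table like this instead of hard-coding text in each article.
return {
    lang = 'French',
    lemma = 'contributeur',
    gender = 'feminine',
    definitions = {
        {
            gloss = 'contributor',
            value = 'A person who contributes',
            examples = {
                'Elle est une contributrice active du Wiktionnaire.',
            },
        },
    },
}
```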


So, precisely, in the case of {{definition}}, one should be able to 
wikitext-edit the "contributeur" article and write something like 
{{definition|value=A person who contributes}}. And on publish, it would 
store the given value in the appropriate module and change

[Wikitech-l] Should we plan anything regarding Mozilla’s "Talk" open-source commenting platform

2017-09-13 Thread mathieu stumpf guntz

Hi,

I'm just discovering the blog post "Mozilla and the Washington Post Are 
Reinventing Online Comments" 
and thought it might be worth discussing its relevance for the Wikimedia 
movement. For those who didn't follow the 2017 Movement strategy, 
we now also have wikicomment as a 
solution related to this topic.


Kind regards,
mathieu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Multilingual maps: how to pick each feature language?

2017-02-10 Thread mathieu stumpf guntz



On 10/02/2017 at 02:31, Yuri Astrakhan wrote:

TLDR: if browsing a map for French wiki, and a city only has a Russian and
Chinese name, which one should be shown? Should the city name have
different rules from a store or a street name? ...
Well, a third option might be to engage users to contribute a 
localization. For that we might propose a specific guide, and advice 
from professional (or skilled) topographers would be a big plus here, I guess.


Making automated transliteration suggestions from Russian to French is 
practicable, especially if a "dummy" Cyrillic-to-Roman-script mapping (plus 
possibly some nominative/adjectival suffix) is considered fine. Now if 
you want to go deeper and transpose the meaning of the toponym (which 
surely most locals would never know), it's a challenge of another scale.


Regarding transliteration from Chinese, except if it comes with some 
romanization like pinyin, it seems hard to make any automatic suggestion 
(but a call for contribution is always possible :).




I have been hacking to add unlimited multilingual support to Wikipedia
maps, and have language fallback question:  given a list of arbitrary
languages for each map feature, what is the best choice for a given
language?

I know Mediawiki has language fallbacks, but they are very simple (e.g. for
"ru", if "ru" is not there, try "en").

Some things to consider:
* Unlike Wikipedia, where readers go for "meaning", in maps we mostly need
"readability".
* Alphabets: Latin alphabet is probably the most universally understood,
followed by...? Per target language?
I have no idea, but if you have sources on this topic, I would be 
interested in feedback.

* Politics: places like Crimea tend to have both Russian and Ukrainian
names defined, but if drawing map in Ukrainian, and some feature has
Russian and English names, but not Ukrainian, should it be shown with the
Russian or Ukrainian name?
* When viewing a map of China in English, should Chinese (local) name be
shown together with the English name? Should it be shown for all types of
features (city name, street name, name of the church, ...?)
Well, at least as an option, that might be interesting. More generally, 
you might like to have something like "Localized name (names used by 
natives)". The text between parentheses might indeed have several items. 
In the case of China, you would probably at least want the ideograms and 
the pinyin as stated by the official government; plus, as far as I know, 
there are many languages used in China, so you'll probably have names 
which follow local usage more closely. Other cases which come to my mind 
are Amerindian traditional toponyms, and toponyms in local languages of 
France (for example, /Strossburi/ and /Strossburch/ for Strasbourg), as 
they even appear on the municipality entrance signs.




Thanks!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Monthly page view stats that can now be queried via Pageview API.

2017-01-28 Thread mathieu stumpf guntz

Yes, that was the kind of link I was looking for, thank you. :)


On 27/01/2017 at 18:46, Marcel Ruiz Forns wrote:

Hi Mathieu,

As far as I know, there is no direct access to count of pageviews directly

on pages, is there?

Not sure if that is what you are imagining, but there is one link under the
"View History" tab, called "Page view statistics" (External links section).
It brings you to the Pageviews Analysis Tool, already showing stats for
that particular page for the last 20 days.

Cheers!

On Fri, Jan 27, 2017 at 5:26 PM, mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi Nuria,

As far as I know, there is no direct access to a count of pageviews directly
on pages, is there? What about adding one, alongside the "This page was last
modified on" message at the bottom of the page?

Accountably,
Mathieu

On 25/01/2017 at 05:18, Nuria Ruiz wrote:


Hello!

The Analytics team would like to announce that the Pageview API is able to
return monthly pageview stats as of this week.


For example, the request below will get you a monthly count of pageviews
for de.wikipedia's article Barack_Obama for the year 2016 (note the 'monthly'
parameter in the URL):

http://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/de.wikipedia/all-access/all-agents/Barack_Obama/monthly/2016010100/2016123100


More documentation and examples can be found on our quickstart guide:
https://wikitech.wikimedia.org/wiki/Analytics/PageviewAPI#Quick_Start
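For scripting, the request can be assembled from its path segments. A minimal sketch (string assembly only, no network call; the parameter order follows the example URL above):

```python
# Build a per-article monthly Pageview API URL from its path segments.
BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article"

def monthly_url(project, article, start, end,
                access="all-access", agent="all-agents"):
    # Timestamps use the YYYYMMDDHH format shown in the example above.
    return "/".join([BASE, project, access, agent,
                     article, "monthly", start, end])

print(monthly_url("de.wikipedia", "Barack_Obama", "2016010100", "2016123100"))
```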

Thanks,

Nuria
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Monthly page view stats that can now be queried via Pageview API.

2017-01-27 Thread mathieu stumpf guntz

Hi Nuria,

As far as I know, there is no direct access to a count of pageviews 
directly on pages, is there? What about adding one, alongside the "This page 
was last modified on" message at the bottom of the page?


Accountably,
Mathieu

On 25/01/2017 at 05:18, Nuria Ruiz wrote:

Hello!

The Analytics team would like to announce that the Pageview API is able to
return monthly pageview stats as of this week.


For example, the request below will get you a monthly count of pageviews
for de.wikipedia's article Barack_Obama for the year 2016 (note the 'monthly'
parameter in the URL):

http://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/de.wikipedia/all-access/all-agents/Barack_Obama/monthly/2016010100/2016123100


More documentation and examples can be found on our quickstart guide:
https://wikitech.wikimedia.org/wiki/Analytics/PageviewAPI#Quick_Start

Thanks,

Nuria
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Now live: Shared structured data

2016-12-30 Thread mathieu stumpf guntz


As it's, to my mind, a very interesting topic, I searched a bit more.

https://www.w3.org/International/articles/article-text-size.en
which quotes 
http://www-01.ibm.com/software/globalization/guidelines/a3.html


According to these, for English source strings that are over 70 
characters, you might expect a 130% average expansion. So, with an 
admittedly very loose inference, the 400-character limit for all languages is 
equivalent to a 307-character limit for English. Would you say it 
would seem OK to have a 307-character limit there?
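Spelled out, that loose inference is just one division (assuming translated text averages 130% of the English length):

```python
# If translated strings average 130% of the English length, a
# language-independent 400-char cap gives English an effective budget of:
LIMIT = 400
EXPANSION = 1.30  # IBM's average for source strings over 70 chars

effective_english_limit = int(LIMIT / EXPANSION)
print(effective_english_limit)  # 307
```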



On 29/12/2016 at 12:11, mathieu stumpf guntz wrote:



On 28/12/2016 at 23:08, Yuri Astrakhan wrote:

The 400 char limit is to be in sync with Wikidata, which has the same
limitation. The origin of this limit is to encourage storage of "values"
rather than full strings (sentences).
Well, that's probably not the best constraint for a glossary then. To 
my mind, a 400-char limit regardless of the language is rather 
surprising. Surely you can tell much more with a set of 400 ideograms 
than with, well, whichever language happens to have the longest 
average sentence length (any idea?). Also, at least for some 
translation pairs, there is a tendency for translations to be longer 
than the original[1].


[1] http://www.sid.ir/en/VEWSSID/J_pdf/53001320130303.pdf

  Also, it discourages storage of wiki
markup.
What about disallowing it explicitly? You might even enforce that with 
a quick parse that prevents saving, or simply show a reminder when 
such a string is detected, to avoid blocking users in legitimate corner 
cases.
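Such a reminder-style check could be as simple as a regular expression over a few common wikitext tokens. The patterns below are illustrative; false positives and negatives are exactly why a soft warning beats a hard reject:

```python
import re

# Flag strings that look like wikitext: links, templates, bold/italic
# quote markers, or HTML-ish tags. A warning trigger, not a validator.
MARKUP_HINTS = re.compile(r"\[\[|\{\{|'''|''|<\w+[^>]*>")

def looks_like_wikitext(value):
    return bool(MARKUP_HINTS.search(value))

print(looks_like_wikitext("[[Moscow|Moscow city]]"))        # True
print(looks_like_wikitext("A plain glossary definition."))  # False
```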




On Wed, Dec 28, 2016, 16:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

Thank you Yuri. Is there some rational explanation behind these limits? I
understand limits motivated by performance concerns, and 2 MB already seems
very large for the intended glossaries. But 400 chars might be problematic
for some definitions I guess, especially since translations can lead to
varying length needs.


On 25/12/2016 at 17:03, Yuri Astrakhan wrote:

Hi Mathieu, yes, I think you can totally build up this glossary in a
dataset. Just remember that each string can be no longer than 400 chars,
and the total size under 2 MB.

On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi Yuri,

Seems very interesting. Am I wrong in thinking this could help to create
a multilingual glossary as drafted in
https://phabricator.wikimedia.org/T150263#2860014 ?


On 22/12/2016 at 20:30, Yuri Astrakhan wrote:

Gift season! We have launched structured data on Commons, available from
all wikis.

TLDR; One data store. Use everywhere. Upload table data to Commons, with
localization, and use it to create wiki tables, lists, or use directly in
graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
per-state GDP map demo, and select multiple years. More demos at the bottom.

US Map state highlight
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>

Data can now be stored as *.tab and *.map pages in the data namespace on
Commons. That data may contain localization, so a table cell could be in
multiple languages. And that data is accessible from any wikis, by Lua
scripts, Graphs, and Maps.

Lua lets you generate wiki tables from the data by filtering, converting,
mixing, and formatting the raw data. Lua also lets you generate lists. Or
any wiki markup.

Graphs can use both .tab and .map directly to visualize the data and let
users interact with it. The GDP demo above uses a map from Commons, and
colors each segment with the data based on a data table.

Kartographer can use the .map data as an extra layer on top of the base
map. This way we can show endangered species' habitat.
== Demo ==
* Raw data example
<https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
* Interactive Weather data
<https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history> 


* Same data in Weather template
<https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
* Interactive GDP map
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight> 


* Endangered Jemez Mountains salamander - habitat
<https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0> 


* Population history
<https://en.wikipedia.org/wiki/Template:Graph:Population_history>
* Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>

== Getting started ==
* Try creating a page at data:Sandbox/.tab on Commons. Don't forget
the .tab extension, or it won't work.
* Try using some data with the Line chart graph template
A thorough guide is needed, help is welcome!

== Documentation links ==
* Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
* Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
If you find a bug, create a Phabricator ticket with the #tabular-data tag, or

comment on the docu

Re: [Wikitech-l] Now live: Shared structured data

2016-12-29 Thread mathieu stumpf guntz



On 28/12/2016 at 23:08, Yuri Astrakhan wrote:

The 400 char limit is to be in sync with Wikidata, which has the same
limitation. The origin of this limit is to encourage storage of "values"
rather than full strings (sentences).
Well, that's probably not the best constraint for a glossary then. To 
my mind, a 400-char limit regardless of the language is rather surprising. 
Surely you can tell much more with a set of 400 ideograms than with, 
well, whichever language happens to have the longest average sentence 
length (any idea?). Also, at least for some translation pairs, there is 
a tendency for translations to be longer than the original[1].


[1] http://www.sid.ir/en/VEWSSID/J_pdf/53001320130303.pdf

  Also, it discourages storage of wiki
markup.
What about disallowing it explicitly? You might even enforce that with a 
quick parse that prevents saving, or simply show a reminder when 
such a string is detected, to avoid blocking users in legitimate corner cases.




On Wed, Dec 28, 2016, 16:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Thank you Yuri. Is there some rational explanation behind these limits? I
understand limits motivated by performance concerns, and 2 MB already seems
very large for the intended glossaries. But 400 chars might be problematic
for some definitions I guess, especially since translations can lead to
varying length needs.


On 25/12/2016 at 17:03, Yuri Astrakhan wrote:

Hi Mathieu, yes, I think you can totally build up this glossary in a
dataset. Just remember that each string can be no longer than 400 chars,
and the total size under 2 MB.

On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi Yuri,

Seems very interesting. Am I wrong in thinking this could help to create
a multilingual glossary as drafted in
https://phabricator.wikimedia.org/T150263#2860014 ?


On 22/12/2016 at 20:30, Yuri Astrakhan wrote:

Gift season! We have launched structured data on Commons, available from
all wikis.

TLDR; One data store. Use everywhere. Upload table data to Commons, with
localization, and use it to create wiki tables, lists, or use directly in
graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
per-state GDP map demo, and select multiple years. More demos at the bottom.

US Map state highlight
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>

Data can now be stored as *.tab and *.map pages in the data namespace on
Commons. That data may contain localization, so a table cell could be in
multiple languages. And that data is accessible from any wikis, by Lua
scripts, Graphs, and Maps.

Lua lets you generate wiki tables from the data by filtering, converting,
mixing, and formatting the raw data. Lua also lets you generate lists. Or
any wiki markup.

Graphs can use both .tab and .map directly to visualize the data and let
users interact with it. The GDP demo above uses a map from Commons, and
colors each segment with the data based on a data table.

Kartographer can use the .map data as an extra layer on top of the base
map. This way we can show endangered species' habitat.

== Demo ==
* Raw data example
<https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
* Interactive Weather data
<https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
* Same data in Weather template
<https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
* Interactive GDP map
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
* Endangered Jemez Mountains salamander - habitat
<https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
* Population history
<https://en.wikipedia.org/wiki/Template:Graph:Population_history>
* Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>

== Getting started ==
* Try creating a page at data:Sandbox/.tab on Commons. Don't forget
the .tab extension, or it won't work.
* Try using some data with the Line chart graph template
A thorough guide is needed, help is welcome!

== Documentation links ==
* Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
* Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
If you find a bug, create a Phabricator ticket with the #tabular-data tag, or
comment on the documentation talk pages.

== FAQ ==
* Relation to Wikidata:  Wikidata is about "facts" (small pieces of
information). Structured data is about "blobs" - large amounts of data like
the historical weather or the outline of the state of New York.

== TODOs ==
* Add a nice "table editor" - editing JSON by hand is cruel. T134618
* "What links here" should track data usage across wikis. Will allow
quicker auto-refresh of the pages too. T153966
* Support data redirects. T153598
* Mega epic: Support external data feeds.

Re: [Wikitech-l] Now live: Shared structured data

2016-12-28 Thread mathieu stumpf guntz
Thank you Yuri. Is there some rational explanation behind these limits? I 
understand limits motivated by performance concerns, and 2 MB already seems 
very large for the intended glossaries. But 400 chars might be problematic 
for some definitions I guess, especially since translations can lead to 
varying length needs.



On 25/12/2016 at 17:03, Yuri Astrakhan wrote:

Hi Mathieu, yes, I think you can totally build up this glossary in a
dataset. Just remember that each string can be no longer than 400 chars,
and the total size under 2 MB.

On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi Yuri,

Seems very interesting. Am I wrong in thinking this could help to create
a multilingual glossary as drafted in
https://phabricator.wikimedia.org/T150263#2860014 ?


On 22/12/2016 at 20:30, Yuri Astrakhan wrote:

Gift season! We have launched structured data on Commons, available from
all wikis.

TLDR; One data store. Use everywhere. Upload table data to Commons, with
localization, and use it to create wiki tables, lists, or use directly in
graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
per-state GDP map demo, and select multiple years. More demos at the bottom.

US Map state highlight
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>

Data can now be stored as *.tab and *.map pages in the data namespace on
Commons. That data may contain localization, so a table cell could be in
multiple languages. And that data is accessible from any wikis, by Lua
scripts, Graphs, and Maps.

Lua lets you generate wiki tables from the data by filtering, converting,
mixing, and formatting the raw data. Lua also lets you generate lists. Or
any wiki markup.

Graphs can use both .tab and .map directly to visualize the data and let
users interact with it. The GDP demo above uses a map from Commons, and
colors each segment with the data based on a data table.

Kartographer can use the .map data as an extra layer
on top of the base map. This way we can show endangered species' habitat.

== Demo ==
* Raw data example
<https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
* Interactive Weather data
<https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
* Same data in Weather template
<https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
* Interactive GDP map
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
* Endangered Jemez Mountains salamander - habitat
<https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
* Population history
<https://en.wikipedia.org/wiki/Template:Graph:Population_history>
* Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>

== Getting started ==
* Try creating a page at data:Sandbox/.tab on Commons. Don't forget
the .tab extension, or it won't work.
* Try using some data with the Line chart graph template
A thorough guide is needed, help is welcome!

== Documentation links ==
* Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
* Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
If you find a bug, create a Phabricator ticket with the #tabular-data tag, or
comment on the documentation talk pages.

== FAQ ==
* Relation to Wikidata:  Wikidata is about "facts" (small pieces of
information). Structured data is about "blobs" - large amounts of data like
the historical weather or the outline of the state of New York.

== TODOs ==
* Add a nice "table editor" - editing JSON by hand is cruel. T134618
* "What links here" should track data usage across wikis. Will allow
quicker auto-refresh of the pages too. T153966
* Support data redirects. T153598
* Mega epic: Support external data feeds.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Now live: Shared structured data

2016-12-25 Thread mathieu stumpf guntz

Hi Yuri,

Seems very interesting. Am I wrong in thinking this could help to create 
a multilingual glossary as drafted in 
https://phabricator.wikimedia.org/T150263#2860014 ?



On 22/12/2016 at 20:30, Yuri Astrakhan wrote:

Gift season! We have launched structured data on Commons, available from
all wikis.

TLDR; One data store. Use everywhere. Upload table data to Commons, with
localization, and use it to create wiki tables, lists, or use directly in
graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
per-state GDP map demo, and select multiple years. More demos at the bottom.
US Map state highlight


Data can now be stored as *.tab and *.map pages in the data namespace on
Commons. That data may contain localization, so a table cell could be in
multiple languages. And that data is accessible from any wikis, by Lua
scripts, Graphs, and Maps.

Lua lets you generate wiki tables from the data by filtering, converting,
mixing, and formatting the raw data. Lua also lets you generate lists. Or
any wiki markup.

Graphs can use both .tab and .map directly to visualize the data and let
users interact with it. The GDP demo above uses a map from Commons, and
colors each segment with the data based on a data table.

Kartographer can use the .map data as an extra layer
on top of the base map. This way we can show endangered species' habitat.

== Demo ==
* Raw data example

* Interactive Weather data

* Same data in Weather template

* Interactive GDP map

* Endangered Jemez Mountains salamander - habitat

* Population history

* Line chart 

== Getting started ==
* Try creating a page at data:Sandbox/.tab on Commons. Don't forget
the .tab extension, or it won't work.
* Try using some data with the Line chart graph template
A thorough guide is needed, help is welcome!
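As a sketch of what such a page contains, here is a minimal Data:*.tab JSON body. The field names follow my reading of the tabular-data help page, so verify against the documentation links before saving:

```json
{
    "license": "CC0-1.0",
    "description": { "en": "Example: monthly rainfall" },
    "schema": {
        "fields": [
            { "name": "month",   "type": "string", "title": { "en": "Month" } },
            { "name": "rain_mm", "type": "number", "title": { "en": "Rain (mm)" } }
        ]
    },
    "data": [
        [ "January", 31.5 ],
        [ "February", 28.0 ]
    ]
}
```

The per-language objects under "description" and "title" are where the localization mentioned above lives: a cell or label can carry one value per language code.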

== Documentation links ==
* Tabular help 
* Map help 
If you find a bug, create Phabricator ticket with #tabular-data tag, or
comment on the documentation talk pages.

== FAQ ==
* Relation to Wikidata:  Wikidata is about "facts" (small pieces of
information). Structured data is about "blobs" - large amounts of data like
the historical weather or the outline of the state of New York.

== TODOs ==
* Add a nice "table editor" - editing JSON by hand is cruel. T134618
* "What links here" should track data usage across wikis. Will allow
quicker auto-refresh of the pages too. T153966
* Support data redirects. T153598
* Mega epic: Support external data feeds.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Offering internationalized programming facilities within WM enviroment

2016-12-23 Thread mathieu stumpf guntz



On 22/12/2016 at 19:30, Chad wrote:

On Thu, Dec 22, 2016 at 10:42 AM mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


* mediawiki.util get renamed to mediawiki.outil : there is no problem
with that, isn't it?
* then changed to alors : there is no problem with that, isn't it?



Yes, that is a problem. As pointed out by others: it makes grepping for
code when doing refactoring basically impossible as I'd have to know
a list of every possible translation of "util." That's not a good use of
developer time at all.
Well, then I would say that it's more like projecting a tool useful in 
one situation onto another, however similar, where it is no longer that 
useful. What you want to find isn't the lexeme (or even any accidentally 
matching string), but the seme (or sememe). In most programming languages 
there is probably a strong enough correlation between lexeme and seme(me) 
to make simple regular-expression matching useful in many cases. But even 
there, a grep approach can quickly show its limits (especially with false 
positives, in my experience). Having a tool which lets you perform 
transformations based on an AST is often far more accurate and flexible.
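The difference is easy to demonstrate with Python's own ast module (used here purely as an illustration of the grep-vs-AST point): a substring search over source text also hits accidental matches inside longer identifiers, while an AST walk only matches real identifiers:

```python
import ast

source = "util = 1\nresult = compute_util_ratio()\nprint(util)"

# grep-style substring search: also hits the false positive inside
# "compute_util_ratio".
grep_hits = [line for line in source.splitlines() if "util" in line]

# AST walk: only nodes that really are the identifier "util".
ast_hits = [node.lineno for node in ast.walk(ast.parse(source))
            if isinstance(node, ast.Name) and node.id == "util"]

print(len(grep_hits))  # 3 (every line contains the substring)
print(ast_hits)        # [1, 3]
```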



I mean these ideas have merit on the face of them, but I'm totally in the
"this is nice but probably not worth the maintenance burden" camp along
with Gergo.
Well, I do understand the argument, and it does sound reasonable 
to me. It's more that I wouldn't place the cursor of maintenance-burden 
tolerance at the same point, especially when I feel it might have a large 
impact on (language) diversity.


Also, my understanding is that the main concern here is about letting 
developers with advanced skills easily go and help on miscellaneous wikis. 
And as I understand it, this shouldn't be such a big deal with a central 
repository where the most common problems are (hopefully) already 
factored in a way which eases maintenance. To my mind, it's compatible 
with having more local, specific problems solved, and possibly some local 
wrappers of centralized code, in a way which pleases the local community 
(which might just as well prefer not to use code localization after all).



T150417 was declined, and for good reasons I think.

Well, as said, I'm not in fundamental disagreement with that.

I think ideas like Babylscript are more useful for a project where you've
got a lot of developers in a *single* non-English language. For example,
a team of Italians who would rather work in Italian than English. In our
case, we've got *many* languages and being able to use things across
wikis/languages is important.
To my mind, having a lot of developers across many languages doesn't 
exclude also having developers working in a single non-English language 
as part of the former, does it?




-Chad
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l




Re: [Wikitech-l] Offering internationalized programming facilities within WM enviroment

2016-12-22 Thread mathieu stumpf guntz



On 21/12/2016 at 22:42, Gergo Tisza wrote:

I sympathize with the goal but accessibility benefits would be far
outweighed by maintenance costs.

Maybe. Or maybe not. I can't judge very objectively without metrics, can I?

  We regularly use grep to find code which is
about to be deprecated;
Well, that's fine, and surely having localized versions of code would 
fall into your grep process, wouldn't it?

wikis copy gadgets from each other;
Yeah, sure, that's a problem, and having a centralized gadget repository 
is a saner way to go, which happens to be number 1 in the 2016 Community 
Wishlist Survey results :)

more
experienced developers are sometimes asked to help a wiki where the local
maintainers have less experience. We have a whole global user group for
people who go from wiki to wiki and fix things.
Well, I share your concerns, and I don't pretend to have a perfect 
out-of-the-box solution which makes the best balance between technical 
maintainability, technical skill dissemination, linguistic 
diversity/accessibility and so on.


And that's not even taking into account the implementation costs, which
would probably be massive if it includes stuff like localizing
Lua/Javascript language keywords.
Well, that's why the thread is about internationalization, not 
localization. Just as there are tools to translate all the non-executable 
content Wikimedia hosts. Or, more generally, the way Wikimedia 
provides the technical facilities, but not the human resources, for 
such wide-reaching projects built on this infrastructure.

The Wikimedia community collectively has
a lot of experience with web development but not a whole lot of experience
with programming language compiler development.
Once again, I don't have metrics about that (but I admit that, to my mind, 
your statements seem relevant).




___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l



Re: [Wikitech-l] Offering internationalized programming facilities within WM enviroment

2016-12-22 Thread mathieu stumpf guntz
Hi Antoine, thank you for sharing your feedback and clarifying the whole 
topic.



On 21/12/2016 at 15:13, Antoine Musso wrote:


A few issues:

For both Lua modules and JavaScript gadgets, we would have to make the
related MediaWiki components have their whole API translated.
So that in a gadget, instead of doing:
Well, API translation is a separate issue from programming-language 
translation. And actually it's a far easier topic, especially 
regarding scripting languages, which rely mainly on references. Anyone can 
easily propose a Scribunto module which makes a relexicalised wrapping of 
"mw" right now, as long as it doesn't use non-ASCII characters. You 
could actually provide translations right inside modules, with something 
like `mw.lang.fr.chargeur = mw.loader`, to build on your example below. 
To my mind, the biggest issue with that kind of approach is that the added 
reference layer can make the script harder to debug.

mw.chargeur.utilisant( ['mediawiki.outil', 'mediawiki.annonce'] ).alors(
    fonction () {
        fonction horloge() {
            mw.outil.ajoutFSC( '#dateTUC e { fontes-taille: 120%; }' );
        }
        $( horloge() );
    } );

(Example taken from mediawiki.org Gadget-UTCLiveClock.js)

You can see a few renames, some could be done, others are quite challenging:

* mediawiki.util get renamed to mediawiki.outil
* then changed to alors
* CSS translated to FSC which has absolutely no sense :D
* the HTML element A (anchor) renamed to E (encre)
* A CSS identifier that is changing from #utcdate to #dateTUC
* The invalid CSS 'fontes-taille'
As said, the idea is to provide facilities for internationalization, not 
a localization of everything. Whether some community would like to go 
through a more or less deep localization process should be up to them. 
Moreover, to my mind, using the default keywords should always remain 
possible.


As for each of the above point :

* mediawiki.util get renamed to mediawiki.outil : there is no problem 
with that, isn't it?

* then changed to alors : there is no problem with that, isn't it?
* CSS translated to FSC which has absolutely no sense :D : well it is 
just as meaningful as CSS, whether you know the meaning of the acronym 
or not, isn't it?

* the HTML element A (anchor) renamed to E (encre):

   Actually, this is definitely not the way you would localize that.
   First, an anchor is translated as "ancre" in French; "encre"
   translates to "ink". But that is really the smallest problem with such
   a way of localizing. You should assume that your API users a) know how
   CSS identifiers work, and so you won't translate such a parameter, or b)
   don't know a thing about CSS identifiers, and provide an API which
   abstracts that for them so that they never provide such a parameter.

* A CSS identifier that is changing from #utcdate to #dateTUC

   The same applies here. And while, to my mind, it doesn't make sense to
   provide that kind of identifier translation facility, you would
   translate that with #dateUTC (see Temps universel coordonné).

* The invalid CSS 'fontes-taille'

   Indeed, you should rather have something like
   mon_ancre = méthode_de_récupération_de_mon_ancre();
   mon_ancre.taille_fonte_de_caractère("120%");

   and your user shouldn't care whether this will produce HTML/CSS,
   PostScript, PDF, or whatever digital layout format is out there. :)



etc...

Again, I appreciate the idea, but I don't think it is technically doable
or worth pursuing.  Not to mention that it will be very challenging to
debug whenever some code has a bug and the issue is referred to more
knowledgeable developers who happen to not know French or Esperanto.
Well, I do agree that it makes debugging potentially more difficult, but to 
my mind the problem is the added layer.


As for the achievability of such a project, it's definitely doable. You 
already pointed to Babylscript; I have yet to make progress on this 
project, but you can already play with mallupa 
 and lua-i18n 
 (see the relexicalisation 
 branch).

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Offering internationalized programming facilities within WM enviroment

2016-12-21 Thread mathieu stumpf guntz

Hello everybody,

Following the suggestion of Andre Klapper, I'm turning to this 
set of lists to see if it can attract more feedback on the topic of 
internationalized programming facilities within the WM environment.


As described more extensively in the ticket, the idea is to implement 
internationalization facilities (if they don't exist yet) in compilers 
used in the WM infrastructure, enable contributors to localize them 
(possibly through Translatewiki), and then let them use localized 
versions if they wish.


Please let me know if you need more details or if you have any questions. 
You can answer on the list or on Phabricator, as you wish.


Kind regards,
Mathieu

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Update on WMF account compromises

2016-11-16 Thread mathieu stumpf guntz
Well, I didn't mistype my password lately, but presumably we have a limit 
on successive failed connection attempts, haven't we? So even 
relatively weak passwords are far less of a big deal; the problem seemed to 
be that many passwords were known to attackers due to external leaks.



On 16/11/2016 at 22:41, Brad Jorsch (Anomie) wrote:

On Wed, Nov 16, 2016 at 3:19 PM, Thomas Morton wrote:





Re: [Wikitech-l] FYI: Microsoft Connect - Streamed live Developer Event

2016-11-15 Thread mathieu stumpf guntz

Hi Piotr,

Despite the fact that this is a list with many developers, how is this
information relevant on a Wikimedia mailing list? Is there any focus on
how these products may be useful in the Wikimedia environment? Will
there be free pedagogical material we may include on Wikibooks,
Wikiversity or Commons as an outcome of this session?


If such is not the case, that's not information I wish to receive
through this channel. Others might have a different opinion, and they are
welcome to bring feedback, but hopefully this won't make too much noise
on this mailing list, and any flamewar will be contained somewhere else.


Kind regards,
mathieu


On 15/11/2016 at 14:37, Piotr Miazga wrote:

Hi all,

At November 16th - 18th Microsoft hosts Connect - Developer virtual event.
This event is streamed live and it's for free.

Link to event: https://connectevent.microsoft.com/

Speakers will share latest innovations from Microsoft software. There will
be many interactive Q sessions with Microsoft engineering teams,
customers and partners. Third day is an online full-day courses on
developing and deploying web and data applications or cross-platform mobile
apps.


Piotr

Re: [Wikitech-l] 2016W43 ArchCom-RFC meeting: Allow HTML in SVG?

2016-10-28 Thread mathieu stumpf guntz

Thank you Brian and Stephen, I think that should help me. :)


On 27/10/2016 at 20:02, Brian Wolff wrote:

On Thursday, October 27, 2016, Mathieu Stumpf Guntz <
psychosl...@culture-libre.org> wrote:


On 27/10/2016 at 19:22, Brian Wolff wrote:

On Thursday, October 27, 2016, Mathieu Stumpf Guntz <
psychosl...@culture-libre.org> wrote:

Hi,

Somewhat off topic, but could someone provide me with some links related to
translatable SVG?

Kind regards



Do you have anything in particular you want to know about translatable
SVGs? Are you looking more for a help page on how to make them, for the
relevant part of the SVG spec, or for technical information about MW's
implementation?

I'm looking for help pages to make them and use them within the
Wikimedia projects.


https://commons.wikimedia.org/wiki/Commons:Translation_possible/Learn_more#Multiple_translations_within_one_SVG_file
is probably the best help page for that.

--
bawolff

Re: [Wikitech-l] 2016W43 ArchCom-RFC meeting: Allow HTML in SVG?

2016-10-27 Thread Mathieu Stumpf Guntz


On 27/10/2016 at 19:22, Brian Wolff wrote:
> On Thursday, October 27, 2016, Mathieu Stumpf Guntz <
> psychosl...@culture-libre.org> wrote:
>> Hi,
>>
>> Somewhat off topic, but could someone provide me with some links related to
>> translatable SVG?
>>
>> Kind regards
>>
>>
> Do you have anything in particular you want to know about translatable
> SVGs? Are you looking more for a help page on how to make them, for the
> relevant part of the SVG spec, or for technical information about MW's
> implementation?
I'm looking for help pages to make them and use them within the
Wikimedia projects. 

>
> --
> bawolff



Re: [Wikitech-l] 2016W43 ArchCom-RFC meeting: Allow HTML in SVG?

2016-10-27 Thread Mathieu Stumpf Guntz
Hi,

Somewhat off topic, but could someone provide me with some links related to
translatable SVG?

Kind regards


On 26/10/2016 at 03:26, Gabriel Wicke wrote:
> See also https://phabricator.wikimedia.org/T96461, which discusses using
> https://github.com/cure53/DOMPurify, and Parsoid's Token-based sanitizer.
>
> On Tue, Oct 25, 2016 at 6:12 PM, Legoktm 
> wrote:
>
>> Hi,
>>
>> On 10/25/2016 03:14 PM, Rob Lanphier wrote:
>>> 3.  Should we turn our SVG validation code into a proper library?
>> Yes! This is . :)
>>
>> -- Legoktm
>>



Re: [Wikitech-l] Introducing Srishti Sethi, our new Developer Advocate

2016-10-18 Thread mathieu stumpf guntz
That's wonderful, congratulations Srishti! I hope that you will be able
to help both the community and yourself flourish in your work on
Wikimedia technical community engagement. :)



On 12/10/2016 at 08:52, Quim Gil wrote:

Hello everybody,

The Technical Collaboration team is very happy to introduce you to Srishti
Sethi , our new
Developer Advocate. Srishti will lead our efforts to engage volunteer
developers in Wikimedia software projects and to grow the Wikimedia
technical community. In her first assignments, she will help promote
Community Wishlist projects, the Wikimedia Developer Summit, and
Wikimedia’s participation in Google Code-in. Srishti just relocated from
Boston to San Francisco, and she will work at the Wikimedia Foundation
office.

Srishti originally hails from Rajasthan, India. She recently finished her
Masters in Media Arts and Sciences from the MIT Media Lab, where she was
exploring how online platforms for learning could be made more peer-led,
engaging, participatory and accessible to diverse populations. At the Media
Lab, Srishti contributed to the design, development, and research of
Unhangout (http://unhangout.media.mit.edu), a platform for running
large-scale un-conference style events online.

Before joining MIT, she was working as a software developer with a startup
organization in India. As an undergrad, she was involved with the open
source community GNOME and its educational project GCompris.

In her spare time, Srishti likes to play ping-pong, do long bike trips,
take photographs and make masala chai for friends. :)

Please join us in welcoming Srishti!




Re: [Wikitech-l] Looking for WikiDev Summit main topics and volunteers

2016-10-04 Thread Mathieu Stumpf Guntz
Hi Quim and everyone on the list,

One of my spare-time projects is to "translate" programming languages into
Esperanto. I already achieved a JavaScript translation within the
Babylscript project. Currently I'm working on a translation of Lua. I chose
Lua because I wish I were able to code in (somewhat) plain Esperanto when
writing Scribunto modules on Esperanto Wikimedia projects.

So, I would be happy to hear the opinions of other developers about such a
project, and even more importantly, what steps you would see as necessary
to make it possible. Would that be an interesting project for a Wikimedia
Developer Summit?

Kind regards,
mathieu


On 08/09/2016 at 12:08, Quim Gil wrote:
> Hi, this is a request from the organizers of the Wikimedia Developer Summit
> 2016 (San Francisco, January 9-11).
>
> We are looking for candidates to become main topics of the next Summit.
> Ideally complex topics with a high user impact (direct or indirect) and
> ramifications in multiple technical areas. Deciding a few main topics
> beforehand will help us inviting the people needing to be involved,
> especially non-WMF contributors requiring travel sponsorship.
>
> We are also looking for volunteers who want to get involved in the
> organization of the Summit, or want to provide feedback to improve our
> plans.
>
> If you have proposals for main topics and/or want to volunteer, please
> reply.
>
> Context: https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit_2017 &
> https://lists.wikimedia.org/pipermail/wikitech-l/2016-September/086476.html
>


Re: [Wikitech-l] The Revision Scoring weekly update

2016-07-15 Thread mathieu stumpf guntz

Hey, it looks like it works with the right URL ^^

Admittedly, I had manually changed the "en" to "fr" rather than using 
the "in other languages" toolbox. /o\


Thank you Aaron.


On 13/07/2016 at 17:15, Aaron Halfaker wrote:

Looks like the frwiki version is at
https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Label, not
https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Labels :)

FWIW, the frwiki version is working for me just fine.  Let me know if it
still doesn't work at the appropriate URL.  Then, I think that Amir might
be right about conflicting gadgets.

On Wed, Jul 13, 2016 at 10:09 AM, Amir Ladsgroup <ladsgr...@gmail.com>
wrote:


Hey,
You probably enabled a broken gadget, which keeps wikilabels from working
properly (we had this issue on fawiki a while ago). I tested it and it was
okay on frwiki. First, try disabling your gadgets on frwiki, and if that
fixes it, try to find out which gadget is broken (e.g. check the
console log). If that was not the case, please open a phabricator bug and
include the console log. I will investigate it.

Best

On Wed, Jul 13, 2016 at 7:15 PM mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:


Hi Amir,

Could you give me more information on the wikilabels campaigns for
French WP? Is there a way to contribute to this? I did follow the
instructions on the documentation page
<https://meta.wikimedia.org/wiki/Wiki_labels> to install the JS on meta
<https://meta.wikimedia.org/wiki/User:Psychoslave/global.js>. Then when
I go to the English Wikipedia
<https://en.wikipedia.org/wiki/Wikipedia:Labels>, it works™. But on the
French version <https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Labels> it
doesn't.

What did I miss? ^^


On 11/07/2016 at 21:40, Amir Ladsgroup wrote:

Hey,
This is the 12th weekly update from the revision scoring team that we have
sent to this mailing list.

New developments:

- We deployed ORES review tool in Russian and Portuguese Wikipedia! [1] [2]

- Basic support for English Wiktionary and Czech Wikipedia is live in
labs. [3] Production will follow soon.

- Damaging model for Polish Wikipedia is ready and will soon be available
(then we can enable ORES review tool as well) [4]

- ORES extension now highlights the whole row for Enhanced recent
changes too. It will be live in two weeks [5]

- We built a prototype of a small web app to show progress of wikilabels
campaigns at https://tools.wmflabs.org/dexbot/tools/wikilabels_stats.php.
Feedback is welcome [6]

Maintenance and robustness:

- If a web node goes down in labs, we get Icinga warnings [7]

- We re-launched wikilabels campaigns for French and Azeri Wikipedia [8] [9]

- We are starting to use tsv2json to load the whole data when adding new
campaigns in wikilabels [10]

- "Hide/Show good edits" will soon be changed to "Hide/Show probably
good edits" in recent changes and watchlists to reflect ORES scores more
accurately [11]

1. https://phabricator.wikimedia.org/T139541
2. https://phabricator.wikimedia.org/T139692
3. https://phabricator.wikimedia.org/T139789
4. https://phabricator.wikimedia.org/T139207
5. https://phabricator.wikimedia.org/T139924
6. https://phabricator.wikimedia.org/T139874
7. https://phabricator.wikimedia.org/T134782
8. https://phabricator.wikimedia.org/T107730
9. https://phabricator.wikimedia.org/T139875
10. https://phabricator.wikimedia.org/T139939
11. https://phabricator.wikimedia.org/T139754

Sincerely,
Amir from the Revision Scoring team

Re: [Wikitech-l] The Revision Scoring weekly update

2016-07-13 Thread mathieu stumpf guntz

Hi Amir,

Could you give me more information on the wikilabels campaigns for
French WP? Is there a way to contribute to this? I did follow the
instructions on the documentation page to install the JS on meta. Then
when I go to the English Wikipedia, it works™. But on the French version
it doesn't.


What did I miss? ^^


On 11/07/2016 at 21:40, Amir Ladsgroup wrote:

Hey,
This is the 12th weekly update from the revision scoring team that we have sent
to this mailing list.

New developments:

- We deployed ORES review tool in Russian and Portuguese Wikipedia! [1]
[2]


- Basic support for English Wiktionary and Czech Wikipedia is live in
labs.[3]  Soon production will follow.


- Damaging model for Polish Wikipedia is ready and will soon be available
(then we can enable ORES review tool as well) [4]


- ORES extension now highlights the whole row for Enhanced recent
changes too. It will be live in two weeks [5]


- We built prototype of a small web app to show progress of wikilabels
campaigns at https://tools.wmflabs.org/dexbot/tools/wikilabels_stats.php.
Feedback is welcome [6]


Maintenance and robustness:

- If a web node goes down in labs, we get Icinga warnings [7]


- We re-launched wikilabels campaigns for French and Azeri Wikipedia [8]
[9]


- We are starting to use tsv2json to load the whole data when adding new
campaigns in wikilabels [10]


- "Hide/Show good edits" soon will be changed to "Hide/Show probably
good edits" in recent changes and watchlists to reflect ORES scores more
accurately [11]


1. https://phabricator.wikimedia.org/T139541
2. https://phabricator.wikimedia.org/T139692
3. https://phabricator.wikimedia.org/T139789
4. https://phabricator.wikimedia.org/T139207
5. https://phabricator.wikimedia.org/T139924
6. https://phabricator.wikimedia.org/T139874
7. https://phabricator.wikimedia.org/T134782
8. https://phabricator.wikimedia.org/T107730
9. https://phabricator.wikimedia.org/T139875
10. https://phabricator.wikimedia.org/T139939
11. https://phabricator.wikimedia.org/T139754

Sincerely,
Amir from the Revision Scoring team

[Wikitech-l] Is there a way to visualize where editors of a specific page come from?

2014-04-13 Thread Mathieu Stumpf
Hello,

Looking at articles on Sri Lanka in different languages, I wondered
where they were edited from. Is there a way to see that?

Kind regards,
Mathieu


Re: [Wikitech-l] Is there a way to visualize where editors of a specific page come from?

2014-04-13 Thread Mathieu Stumpf
On Sunday 13 April 2014 at 20:15 +0200, Federico Leva (Nemo) wrote:
 Recent discussion on the topic:
 * http://lists.wikimedia.org/pipermail/analytics/2013-August/thread.html#857
 * http://thread.gmane.org/gmane.org.wikimedia.analytics/103

Ok, thank you Nemo. I suppose the privacy pros outweigh the
statistical-knowledge cons.

Kind regards,
mathieu

 

Re: [Wikitech-l] Lowering limitation of the score extension

2013-12-18 Thread Mathieu Stumpf
On Tuesday 17 December 2013 at 10:36 -0700, Brian Wolff wrote:
 I'm not very familiar with lilypond, so there is probably a better
 way, but I was able to generate that fret board with code like:
 
 <score raw>
 \markup {\fret-diagram #"s:0.75;6-x;5-x;4-o;3-2;2-3;1-2;"}
 \score {
   \new Score { c1 }
   \midi { }
 }
 \header { tagline = ##f}
 </score>
 
 See 
 https://en.wikipedia.org/w/index.php?title=Wikipedia:Sandbox&oldid=586519066

Oh great, thank you. Do you have any idea how one may make a
landscape-oriented diagram with the \markup function? Or more generally,
how do you translate this kind of code:

  \new FretBoards {
  \once \override FretBoard
#'(fret-diagram-details fret-count) = #3
\override FretBoard
#'(fret-diagram-details orientation) = #'landscape
\override FretBoard
#'(fret-diagram-details barre-type) = #'none
  \override FretBoard
#'(fret-diagram-details finger-code) = #'below-string
\override FretBoards.FretBoard #'size = #'2
\override FretBoard
  #'(fret-diagram-details finger-code) = #'in-dot
 …

to apply it to the \new Score { c1 } you are using ?

 
 
 --bawolff
 

[Wikitech-l] Lowering limitation of the score extension

2013-12-17 Thread Mathieu Stumpf
Hello,

I would like to use the score extension to generate guitar chords
diagrams, but writing something like 
<score>
\fret-diagram #s:0.75;6-x;5-x;4-o;3-2;2-3;1-2; 
</score>

I get this message error:
Processing `.../file.ly'
Parsing...
.../file.ly:12:0: error: unknown escaped string: `\fret'

\fret-diagram #s:0.75;6-x;5-x;4-o;3-2;2-3;1-2; 
.../file.ly:12:0: error: syntax error, unexpected STRING

\fret-diagram #s:0.75;6-x;5-x;4-o;3-2;2-3;1-2; 
.../file.ly:21:24: error: syntax error, unexpected '}'

}error: failed files: .../file.ly

exited with status: 1

The code works just fine when I compile it with my local LilyPond. This
functionality would be really interesting, since with a little Scribunto
module it would allow generating chord diagrams on the fly, and even a
chord book.

I would also like to go further with colored diagrams showing a set of
intervals on the neck. It would be great if the MediaWiki environment
allowed this.

kind regards


Re: [Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-02 Thread Mathieu Stumpf
Thanks everybody for all your answers; I think I should be able to
achieve my goal using your advice.

On Sunday 1 December 2013 at 21:26 +0100, Merlijn van Deen wrote:
 On 1 December 2013 11:38, Amir Ladsgroup ladsgr...@gmail.com wrote:
 
  2- use readlines:
  import codecs
  site = wikipedia.getSite()
  f = codecs.open("file.txt", "r", "utf-8")
  for line in f.readlines():
      line = line.replace("\n", "").replace("\r", "")
      name = line.split(":")[0]  # or any way you like to get the title
      page = wikipedia.Page(site, name)
      # do whatever you like with the page
 
 
 You can read the file per-line using 'for line in f' -- this will just read
 the current line in memory. Cleaning up the rest a bit results in:
 
 import codecs, wikipedia  # or pywikibot if you are using core
 site = wikipedia.getSite()  # or pywikibot.Site() if you are using core
 f = codecs.open("file.txt", "r", "utf-8")
 for line in f:
     line = line.strip()
     name, definition = line.split(":", 1)
     page = wikipedia.Page(site, name)
     page.put(definition)  # probably something else, though.
 
 
 If you need some more assistance, I'd suggest joining #pywikipediabot on
 irc.freenode.net -- it's typically quicker than e-mail :-)
 
 Merlijn

[Wikitech-l] [Pywikipediabot] Using the content of a file as input for articles

2013-12-01 Thread Mathieu Stumpf
Hello,

I want to add Esperanto words to fr.wiktionary using as input a file
where each line has the format "word:the fine definition". So I copied
basic.py and started hacking it to achieve my goal.

Now, it seems the -file argument expects a file where each line is
formatted as [[Article name]]. Of course I can just create a second
input file and read both in parallel, feeding the genFactory with the
former and using the latter to build the wiktionary entry. But maybe you
could give me a hint on how I can write a generator that feeds a
pagegenerators.GeneratorFactory() without creating a mirror file and
without loading the whole file into main memory.
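
For what it's worth, a lazy generator over the input file avoids both the
mirror file and loading everything into memory. Here is a minimal sketch;
the wikipedia/pywikibot calls in the comments are only indicative of how
it would plug into a bot script, not part of the runnable code:

```python
import codecs


def entry_generator(path):
    """Yield (title, definition) pairs lazily from a 'word:definition' file."""
    with codecs.open(path, "r", "utf-8") as f:
        for line in f:  # reads one line at a time, never the whole file
            line = line.strip()
            if not line or ":" not in line:
                continue  # skip blank or malformed lines
            title, definition = line.split(":", 1)
            yield title, definition
            # In a pywikibot script one would typically do something like:
            #   page = wikipedia.Page(site, title); page.put(definition)
```

Wrapping each title into a Page object inside the consuming loop, rather
than pre-building a [[Article name]] file, gives the same lazy behaviour
as the built-in page generators without any intermediate file.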

Kind regards,
Mathieu 


Re: [Wikitech-l] Scope problem with scribunto happening when invoking but not in debug console

2013-08-22 Thread Mathieu Stumpf

On 2013-08-21 22:54, Brad Jorsch (Anomie) wrote:

On Wed, Aug 21, 2013 at 4:13 PM, Mathieu Stumpf 
psychosl...@culture-libre.org wrote:

But no, it doesn't, I still generate a randomly ordered wiki table. So,
what did I miss?



Two things:
1. table.sort() only sorts the sequence part of the table. So only the
elements 1-9 in your table, not 0 or A-E.
2. pairs() doesn't process the elements in any particular order; you can
see this by printing out the keys as you iterate through using pairs().
You'd need to use ipairs instead (which again only does the sequence part
of the table).


So as I understand it, I will have to create a new table indexed with
integers using pairs, then sort this table, before using ipairs on it. It
looks to me like something of really common use, so maybe integrating a
function into the mw.* library would be interesting. On the other hand,
maybe I'm just not accustomed to the idiomatic way to build Lua code and
one may suggest that I architecture my code in another way.
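
The two-step dance described above (copy the hash into an integer-indexed
table with pairs(), table.sort() it, then walk it with ipairs()) has a
direct analogue in Python, shown here only to illustrate the pattern:

```python
def sorted_rows(mapping):
    # pairs()      -> mapping.items() copies key/value pairs out of the hash
    # table.sort() -> list.sort() with a key function (or a comparator)
    # ipairs()     -> iterating the resulting list visits pairs in order
    rows = list(mapping.items())
    rows.sort(key=lambda kv: kv[0])
    return rows


print(sorted_rows({"3": "tri", "1": "unu", "2": "du"}))
```

Calling sorted_rows on the data hash yields the pairs in key order, which
is exactly what a wiki-table generator needs before emitting rows.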

--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Scope problem with scribunto happening when invoking but not in debug console

2013-08-21 Thread Mathieu Stumpf
Ok, now I almost have what I want: I'm already able to dynamically
generate wiki tables from Lua arrays, and to specify which cell should be a
title, have a specific style, and col/row-span. :)

Now my data module[1] uses a hash, and what I would like is to sort my
array before I generate my wiki table. So, reading some documentation, I
naively thought that something like the following code, which I put in [2],
would work:

function compare(a,b)
  return a[1] < b[1]
end

…
dekuma = ciferaro.dekuma
table.sort(dekuma, compare)
…

But no, it doesn't, I still generate a randomly ordered wiki table. So,
what did I miss?

[1] http://fr.wikiversity.org/wiki/Module:Fabeleblo/ciferaro
[2] http://fr.wikiversity.org/wiki/Module:Fabeleblo

On Monday 19 August 2013 at 19:31 +0200, Mathieu Stumpf wrote:
 On Monday 19 August 2013 at 13:08 -0400, Brad Jorsch (Anomie) wrote:
  On Mon, Aug 19, 2013 at 10:37 AM, Mathieu Stumpf 
  psychosl...@culture-libre.org wrote:
  
   But while editing the module, calling the function returns the expected
   string.
  
  
  Are you sure it's returning the expected string? When I call your
  p.datumtestu() function in the debug console, it returns this:
  
  {| class=wikitable alternance centre
  |-
  |- Datumo1
  |-
  |- Datumo2
  |-
  |- Datumo3
  |-
  |-
  |- Datumo4
  |}
  
  That looks far from correct to me; at the least, the Datumo lines should
  begin with | rather than |-.
  
  
 
 You are right, but that's another problem. Of course it could be a
 related side effect, but I think that on this point it's just some
 intertwined iterations where I messed up the treatment. I think I
 can manage that on my own, but having a working testing page would help
 me do so. Now, as I don't have a clue how to debug the lack of result for
 the template invocation, maybe I'll try to debug that and hopefully this
 will unexpectedly make my scope problem go away.
 

[Wikitech-l] Scope problem with scribunto happening when invoking but not in debug console

2013-08-19 Thread Mathieu Stumpf

Hello,

Maybe I missed some scope subtlety, but I'm stuck with a problem which
occurs when I call my function with a template invocation, but which
returns the expected result when I call it from the debug console.


I'm trying to code utilities to convert Lua tables to wikisyntax
tables. So far I have coded something that should enable me to add titles,
data cells and rows to my Lua table and return the wikitable, or at
least it should. Indeed, my test page[1] returns an empty row instead of
my expected filled rows. But while editing the module, calling the
function returns the expected string. Do you have any idea what's
wrong?


[1] https://fr.wikiversity.org/wiki/Module:Vikitablo/testu

--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Scope problem with scribunto happening when invoking but not in debug console

2013-08-19 Thread Mathieu Stumpf
On Monday 19 August 2013 at 13:08 -0400, Brad Jorsch (Anomie) wrote:
 On Mon, Aug 19, 2013 at 10:37 AM, Mathieu Stumpf 
 psychosl...@culture-libre.org wrote:
 
  But while editing the module, calling the function returns the expected
  string.
 
 
 Are you sure it's returning the expected string? When I call your
 p.datumtestu() function in the debug console, it returns this:
 
 {| class=wikitable alternance centre
 |-
 |- Datumo1
 |-
 |- Datumo2
 |-
 |- Datumo3
 |-
 |-
 |- Datumo4
 |}
 
 That looks far from correct to me; at the least, the Datumo lines should
 begin with | rather than |-.
 
 

You are right, but that's another problem. Of course it could be a
related side effect, but I think that on this point it's just some
intertwined iterations where I messed up the treatment. I think I
can manage that on my own, but having a working testing page would help
me do so. Now, as I don't have a clue how to debug the lack of result for
the template invocation, maybe I'll try to debug that and hopefully this
will unexpectedly make my scope problem go away.


Re: [Wikitech-l] Improving the scribunto development experience

2013-08-16 Thread Mathieu Stumpf
I just found [1] but it doesn't seem to give an explanation of mw.*
module installation. Also, I began to use Koneki[2] to code Lua.

[1] https://test2.wikipedia.org/wiki/Help:Lua_development_environment
[2] http://www.eclipse.org/koneki/ldt/


On Thursday 15 August 2013 at 18:14 +0400, Yury Katkov wrote:
 Indeed there is such need. In fact MediaWiki template language itself
 requires an IDE: WikEd is not enough.
 
 -
 Yury Katkov, WikiVote
 
 
 
 On Thu, Aug 15, 2013 at 6:10 PM, Mathieu Stumpf 
 psychosl...@culture-libre.org wrote:
 
  Hello,
 
  Having some fun with the Scribunto possibilities, I also found some
  drawbacks in the UX. Having to scroll to switch between the code
  panel and the debug panel is prohibitive. Making a better web IDE is
  possible; take a look at [1] for example: at one glance you have the
  documentation (exercise specifications in the case of the previous
  site), the code doing the actual job, the code testing it, and the
  debug/result console.

  Now, one may prefer to use one's favorite editor/IDE anyway, in which case
  one may wonder how to get and install the Scribunto libraries in
  order to test the code on the local box. That should be documented
  somewhere, as well as how to make a git/mediawiki bridge to ease the
  whole process. If such documentation already exists, please point me
  there.
 
  Ok, that was my thought of the day. :P
 
  [1] http://www.codingame.com/cg/
 

[Wikitech-l] Improving the scribunto development experience

2013-08-15 Thread Mathieu Stumpf
Hello,

Having some fun with the Scribunto possibilities, I also found some
drawbacks in the UX. Having to scroll to switch between the code
panel and the debug panel is prohibitive. Making a better web IDE is
possible; take a look at [1] for example: at one glance you have the
documentation (exercise specifications in the case of the previous
site), the code doing the actual job, the code testing it, and the
debug/result console.

Now, one may prefer to use one's favorite editor/IDE anyway, in which case
one may wonder how to get and install the Scribunto libraries in
order to test the code on the local box. That should be documented
somewhere, as well as how to make a git/mediawiki bridge to ease the
whole process. If such documentation already exists, please point me
there.

Ok, that was my thought of the day. :P

[1] http://www.codingame.com/cg/


Re: [Wikitech-l] GMail sending lots of WIkimedia mail to spam again

2013-08-05 Thread Mathieu Stumpf
On Monday 5 August 2013 at 23:01 +0530, Yuvi Panda wrote:
 All emails to labs-l always end up in spam for me (I've a special rule
 that picks them out of spam, and GMail still warns me).
 
 /end-data-point
 
 
Bad mail provider, change mail provider. ;)


Re: [Wikitech-l] [Wiktionary-l] Listing missing words of wiktionnaries

2013-07-30 Thread Mathieu Stumpf

On 2013-07-26 20:26, Amgine wrote:
The request is to create a web-based text corpus[1] from which to derive
frequencies and then compare with existing wiktionaries. Not a light
undertaking, but one which has been proposed and implemented previously
(e.g. Connel's Gutenberg project[2]).

Generically speaking, someone would need to determine the appropriate
size of the corpus sample, its temporal currency, and the method of
creating and maintaining it. This isn't easy to do, and having no
strictures results in unwieldy and mostly irrelevant products like
Google's n-grams[3] (on the other hand, if someone can figure out how to
filter n-grams usefully it would mean we don't have to build our own.)


Actually, I think it would be interesting to have a trend history of
word usage over the centuries (current trends would also be interesting,
but probably harder to implement). Wikisource could be used to achieve
that.




Amgine

[1] https://en.wikipedia.org/wiki/Linguistic_corpus
[2] https://en.wiktionary.org/wiki/User:Connel_MacKenzie/Gutenberg
[3] http://storage.googleapis.com/books/ngrams/books/datasetsv2.html


On 26/07/13 09:18, Lars Aronsson wrote:

On 07/23/2013 11:23 AM, Mathieu Stumpf wrote:

Here is what I would like to do: generating reports which give, for
a given language, a list of words which are used on the web with a
number evaluating their occurrences, but which are not in a given
wiktionary.

How would you recommend implementing that within the Wikimedia
infrastructure?


Some years back, I undertook to add entries for
Swedish words in the English Wiktionary. You can
follow my diary at http://en.wiktionary.org/wiki/User:LA2

Among the things I did was to extract a list of all
Swedish words that already had entries. The best
way was to use CatScan to list entries in categories
for Swedish words. Even if there is a page called
"men", this doesn't mean the Swedish word "men"
has an entry, because it could be the English word
"men" that is in that page.

Then I extracted all words from some known texts,
e.g. novels, the Bible, government reports, and the
Swedish Wikipedia, counting the number of
occurrences of each word. Case significance is
a bit tricky. There should not be an entry for
lower-case "stockholm", so you can't just convert
everything to lower case. But if a sentence begins
with a capital letter, that word should not have
a capitalized entry. Another tricky issue is
abbreviations, which should keep the period,
for example "i.e." rather than "i" and "e". But
the period that ends a sentence should be removed.
When splitting a text into words, I decided to keep
all periods and initial capital letters, even if this
leads to some false words.
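The splitting heuristic described above (keep abbreviation periods and initial capitals, drop the sentence-final period) could be sketched roughly like this in Python; this is an illustrative approximation, not the script actually used:

```python
def tokenize(text):
    """Split text into words, keeping periods inside abbreviations
    (e.g. "i.e.") but dropping a lone trailing period that most
    likely ends the sentence. Initial capitals are kept as-is."""
    words = []
    for raw in text.split():
        # Strip surrounding punctuation other than periods.
        w = raw.strip(',;:!?"()')
        # Keep a trailing period only when the word already contains
        # another period (abbreviation heuristic).
        if w.endswith('.') and '.' not in w[:-1]:
            w = w[:-1]
        if w:
            words.append(w)
    return words
```

As noted, any such heuristic produces some false words; the point is only to make the trade-offs explicit.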

When you have word frequency statistics for a text,
and a list of existing entries from Wiktionary, you
can compute the coverage, and I wrote a little
script for this. I found that English Wiktionary already
had Swedish entries covering 72% of the words in the
Bible, and when I started to add entries for the most
common of the missing words, I was able to increase
this to 87% in just a single month (September 2010).
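The coverage computation described here is small enough to sketch; the following Python is a hypothetical reconstruction (names are illustrative, not the actual script):

```python
from collections import Counter

def coverage(text_words, dictionary_entries):
    """Percentage of word occurrences in a text that are covered
    by entries in a given dictionary."""
    freq = Counter(text_words)
    covered = sum(n for word, n in freq.items() if word in dictionary_entries)
    total = sum(freq.values())
    return 100.0 * covered / total if total else 0.0
```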

Many of the common words that were missing when
I started were adverbs such as "thereof" and "herein",
which occur frequently in any text but are not very
exciting to write entries about. This statistics-based
approach gave me a reason to add those entries.

It is interesting to contrast a given text with a given
dictionary in this way. The Swedish entries in the
English Wiktionary are a different dictionary from the
Swedish entries in the German or Danish Wiktionary.
The kinds of words found in the Bible are different
from those found in Wikipedia or in legal texts.
There is not a single, universal text corpus that we
can aim to cover. Google has released its ngram
dataset. I'm not sure if it covers Swedish, but even
if it does, it must differ from the corpus frequencies
published by the Swedish Academy.

It is relatively easy to extract a list of existing entries
from Wiktionary. But preparing a given text corpus
for frequency and coverage analysis takes more work.



___
Wiktionary-l mailing list
wiktionar...@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wiktionary-l


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Listing missing words of wiktionnaries

2013-07-24 Thread Mathieu Stumpf

Le 2013-07-23 18:20, Jean-Frédéric a écrit :

Hello,

Here is what I would like to do: generating reports which give, for a
given language, a list of words which are used on the web with a number
evaluating their occurrences, but which are not in a given wiktionary.

How would you recommend implementing that within the Wikimedia
infrastructure?


Related: the French Wiktionary folks did that using a Wikisource dump
(I’ll agree that fr.wikisource is a tiny subset of « the web » ;)

See http://tools.wmflabs.org/dicompte/

Hope that helps,

Hope that helps,


Thanks to everybody for all the feedback.


--
Association Culture-Libre
http://www.culture-libre.org/


[Wikitech-l] Listing missing words of wiktionnaries

2013-07-23 Thread Mathieu Stumpf

Hello,

Here is what I would like to do: generating reports which give, for a 
given language, a list of words which are used on the web with a number 
evaluating their occurrences, but which are not in a given wiktionary.


How would you recommend implementing that within the Wikimedia 
infrastructure?
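Assuming the word lists have already been extracted (from a web corpus on one side and a Wiktionary dump on the other), the report itself reduces to a frequency count minus the existing entries. A minimal Python sketch, with purely illustrative names:

```python
from collections import Counter

def missing_words_report(corpus_words, wiktionary_entries, top=10):
    """Most frequent corpus words lacking a wiktionary entry,
    with their occurrence counts, most frequent first."""
    freq = Counter(corpus_words)
    missing = [(w, n) for w, n in freq.most_common()
               if w not in wiktionary_entries]
    return missing[:top]
```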


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-06-03 Thread Mathieu Stumpf

Le 2013-06-02 16:38, David Cuenca a écrit :

On Sun, Jun 2, 2013 at 7:21 AM, Mathieu Stumpf 
psychosl...@culture-libre.org wrote:


Interesting, but why make a central code repository only for
Wikisource?



 There are several reasons for this.
- trying to support all projects at once might be too much for a grantee
to do in 3 months.
- it is an experience to learn from, and it will have to be decided if
it can be applied broadly or substituted for some better method.
- wikisource has a central location that can be used for this purpose;
other projects don't have it specifically. It could be meta, but that
should be discussed.


Oh, I wasn't aware of this Wikisource specificity. Is that related to a 
central DjVu repository?




Micru


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Architecture Guidelines: Writing Testable Code

2013-06-03 Thread Mathieu Stumpf

Le 2013-06-03 16:20, Jeroen De Dauw a écrit :

Absolutist statements are typically wrong.


Please, don't miss an opportunity to provide people with the joy of 
reading paradoxes: Absolutist statements are *always* wrong.


There, fixed that for you. ;)


(sorry, I couldn't resist…)


Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-06-02 Thread Mathieu Stumpf
Hey,

Did you add this useful documentation on Meta? If yes, please send me
the link so I can add it to my to-read/to-do list.


Le vendredi 31 mai 2013 à 15:14 -0700, Ori Livneh a écrit :
 Hey,
 
 The new version of git-review released today (1.22) includes a patch I
 wrote that makes it possible to work against a single 'origin' remote. This
 amounts to a workaround for git-review's tendency to frighten you into
 thinking you're about to submit more patches than the ones you are working
 on. It makes git-review more pleasant to work with, in my opinion.
 
 To enable this behavior, you first need to upgrade to the latest version of
 git-review, by running pip install -U git-review. Then you need to create
 a configuration file: either /etc/git-review/git-review.conf (system-wide)
 or ~/.config/git-review/git-review.conf (user-specific).
 
 The file should contain these two lines:
 
 [gerrit]
 defaultremote = origin
 
 Once you've made the change, any new Gerrit repos you clone using an
 authenticated URI will just work.
 
 You'll need to perform an additional step to migrate existing repositories.
 In each repository, run the following commands:
 
   git remote set-url origin $(git config --get remote.gerrit.url)
   git remote rm gerrit
   git review -s
 
 Hope you find this useful.



Re: [Wikitech-l] Centralized Lua modules for Wikisource (OPW mentor needed)

2013-06-02 Thread Mathieu Stumpf
Le vendredi 31 mai 2013 à 11:15 -0400, David Cuenca a écrit :
 Hi all,
 
 After a talk with Brad Jorsch during the Hackathon (thanks again Brad for
 your patience), it became clear to me that Lua modules can be localized
 either by using system messages or by getting the project language code
 (mw.getContentLanguage().getCode()) and then switching the message. This
 second option is less integrated with the translation system, but can serve
 as intermediate step to get things running.

Interesting, but why make a central code repository only for
Wikisource?

 
 For Wikisource it would be nice to have a central repository (sitting on
 wikisource.org) of localized Lua modules and associated templates. The
 documentation could be translated using Extension:Translate. These modules,
 templates and associated documentation would be then synchronized with all
 the language wikisources that subscribe to an opt-in list. Users would be
 then advised to modify the central module, thus all language versions would
 benefit of the improvements. This could be the first experiment of having a
 centralized repository of modules.
 
 What do you think of this? Would be anyone available to mentor an Outreach
 Program for Women project?
 
 Thanks,
 David Cuenca --Micru



Re: [Wikitech-l] showing videos and images in modal viewers within articles

2013-05-30 Thread Mathieu Stumpf

Le 2013-05-30 06:21, Erik Moeller a écrit :

Yes, better support for display of images through a modal viewer would
be great. I'm not sure a modal parameter that has to be explicitly
set for files is the best approach - I would recommend optimizing the
default experience when a user clicks an image or video. It's not
clear that the current behavior based on a pixel threshold is actually
desirable as the default behavior. (On a side note, the TMH behavior
should be improved to actually play the video immediately, not require
a second click to play in modal view.)


You'll probably need to pop up a menu on mouseover of the media, so the 
user can open the modal window, open the Commons page, and so on. Also, 
you may want to integrate next/previous media navigation in your modal.




Magnus Manske explored an alternative approach pretty extensively in
response to the October 2011 Coding Challenge, which is worth taking a
look at:
https://www.mediawiki.org/wiki/User:Magnus_Manske/wikipic


Oh well, I suppose he already wrote all these obvious things then :P



Cheers,
Erik



--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] APIStrat conference, San Francisco, October 23, 24, 25

2013-05-08 Thread Mathieu Stumpf
Le mercredi 08 mai 2013 à 09:09 -0400, Marc A. Pelletier a écrit :
 On 05/07/2013 05:33 PM, Juliusz Gonera wrote:
  Seems interesting. Is there anyone who'd be interested in giving a talk
  or participating in a panel there?
 
 Given that it's very close to me, and that I have quite a bit of
 experience with the API, this is something I could do.
 
 I actually had an API talk planned for WM 2012 that I never got a chance
 to present, I could dust it off and see how well it fits in their scope.
  Do you know when their CFP opens?
 
 -- Marc

Will the event be recorded and broadcast? This may be an interesting
document to add on Meta, where efforts are being made to make developer
involvement more attractive.

 
 



Re: [Wikitech-l] Git for idiots

2013-05-08 Thread Mathieu Stumpf
Le mercredi 08 mai 2013 à 18:34 +0200, Petr Bena a écrit :
 Hi,
 
 Long time ago when I started learning with git I decided to create a
 simple guide (basically I was just taking some notes of what is
 needed). I never thought that it could be useful to anyone so I never
 announced it anywhere. However I got some feedback to it, so I decided
 to inform you too.
 
 The basic idea is to create a TOTALLY SIMPLE guide that git
 illiterates like me can understand and thanks to which they would find
 out how to do stuff in wikimedia git / gerrit.
 
 Link is here: www.mediawiki.org/wiki/User:Petrb/Git_for_idiots

Interesting idea. Probably you should also add a part for those who
don't know svn, just like you did with "everything is ok/wrong". I
didn't read the whole thing, so maybe in practice it doesn't matter, but
if you want something accessible to total newbies, you shouldn't expect
them to know svn. Also you may have more contributors filling in {{empty}}
sections than if you say "I don't know Windows, look at this and move along".

Apart from that, nice job.


Re: [Wikitech-l] Starter kit ?

2013-05-06 Thread Mathieu Stumpf
Reading [1], I'm wondering if we couldn't have a wiki version of this 
page that could be updated along the road. For example, the text talks 
about MySQL but doesn't mention the MariaDB migration. By the way, 
the two "The Architecture of Open Source Applications" books may find 
their place on Wikisource, don't you think? But Wikisource isn't the 
place for books you want to update; maybe Wikibooks would be a good 
place, what do you think?


[1] http://www.aosabook.org/en/mediawiki.html

Le 2013-04-15 20:09, Quim Gil a écrit :

Hello Mathieu, welcome!

These days we are getting many new potential contributors thanks to
Google Summer of Code and Outreach Program for Women. We want to know
what can we do better for you!

On 04/13/2013 07:45 AM, Mathieu Stumpf wrote:

https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
https://www.mediawiki.org/wiki/How_to_contribute

are two good places to begin.


Thank you, it seems to be exactly what I need. :)


This is very good to hear! Still, we welcome improvements to this
Starter kit (well put).

Just recently we have started a selection of essential resources for
new contributors, see

https://www.mediawiki.org/wiki/Category:New_contributors

And we are discussing many other ideas at

http://www.mediawiki.org/wiki/Project:New_contributors


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Starter kit ?

2013-05-06 Thread Mathieu Stumpf

Le 2013-05-06 15:34, Guillaume Paumier a écrit :

Hi,

On Mon, May 6, 2013 at 2:28 PM, Mathieu Stumpf
psychosl...@culture-libre.org wrote:

Reading [1], I'm wondering if we couldn't have a wiki version of this
page that could be updated along the road. For example, the text talks
about MySQL but doesn't mention the MariaDB migration.

[1] http://www.aosabook.org/en/mediawiki.html

That chapter was actually written on mediawiki.org, as part of this 
project:

https://www.mediawiki.org/wiki/MediaWiki_architecture_document

The content of the chapter lives in 2 wiki pages:
* https://www.mediawiki.org/wiki/MediaWiki_history
* https://www.mediawiki.org/wiki/Manual:MediaWiki_architecture

You're encouraged to update what needs updating :)


Ok, thank you for the information. It may be interesting to add this 
link in [1], and maybe it would be more relevant to use these links on 
[2] instead of giving a link to [1]; what do you think?


[2] 
https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker#MediaWiki




Re: [Wikitech-l] Starter kit ?

2013-05-06 Thread Mathieu Stumpf
Le lundi 06 mai 2013 à 09:14 -0700, Quim Gil a écrit :
 Only to honor the subject and initial motivation of this thread:
 
 Recently we started drafting an actual Starter Kit, as Mathieu requested:
 
 https://www.mediawiki.org/wiki/Project:New_contributors/Starter_kit
 
 Edits and questions in the discussion page are welcome. In fact with a 
 bit of focus from a few people during a few days we could get something 
 pretty decent.
 
 Mathieu and other newcomers: your contribution is especially valuable 
 since you still remember how it was when you landed here for the first time.

Glad to hear that I could be useful there. I can't promise feedback
before this weekend though; I already have a full schedule.



Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-29 Thread Mathieu Stumpf

Le 2013-04-26 17:17, Gerard Meijssen a écrit :

Hoi,
Why use such an old browser, and why should anybody care if it works?
The current version of Firefox is 20.0.1.
Thanks,
 Gerard


Because that's the version installed on one of the workstations I use, 
for which I don't have the privileges required to install anything 
else. We should care because we care about all potential contributors, 
and we don't want to discriminate based on gender, sexual 
orientation, or the browser available to them.



On 26 April 2013 14:52, Mathieu Stumpf psychosl...@culture-libre.org wrote:

Le 2013-04-26 14:19, Bartosz Dziewoński a écrit :

It's currently disabled on IE9 and Opera. You can test it on them by
using the ?vewhitelist=1 parameter (but don't expect much). I'm
currently working on Opera support myself (as a volunteer).


I'm using Firefox 8.0.1. It works with the parameter workaround. By the
way, is there a roadmap somewhere for this specific extension? It looks
like there's no feature yet to edit tables, and I would be interested to
know what's planned on this subject.


2013/4/26, Mathieu Stumpf psychosl...@culture-libre.org:

Le 2013-04-25 19:09, James Forrester a écrit :

On 18 April 2013 17:32, James Forrester jforres...@wikimedia.org wrote:

TL;DR: VisualEditor will be deployed on 14 new Wikipedias next week as an
opt-in alpha. Your assistance is requested to inform your wikis about this
and help get the software translated.


This is now done (for de, nl, fr, it, ru, es, sv, pl, ja, ar, he, hi,
ko, and zh). Grateful for feedback, bug reports and suggestions of how
we can improve the VisualEditor for you.


I opted in, but I can't see anything other than the classic edit
option. I tried on several pages I never visited before and also
refreshed my browser cache, but still no new option. Is there a URL param
I could try to see if at least it works this way?


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-26 Thread Mathieu Stumpf

Le 2013-04-25 19:09, James Forrester a écrit :

On 18 April 2013 17:32, James Forrester jforres...@wikimedia.org wrote:

TL;DR: VisualEditor will be deployed on 14 new Wikipedias next week as an
opt-in alpha. Your assistance is requested to inform your wikis about this
and help get the software translated.


This is now done (for de, nl, fr, it, ru, es, sv, pl, ja, ar, he, hi,
ko, and zh). Grateful for feedback, bug reports and suggestions of how
we can improve the VisualEditor for you.


In the French version, the label of the checkbox to activate it has two 
internal links to non-existent pages (red links). The wiki-equivalent 
text is:


   Activer VisualEditor (seulement dans les espaces de noms 
[[Wikipédia:Main namespace|principal]] et [[Espace de nom 
utilisateur|utilisateur]])


Otherwise, I haven't tested it yet, but thank you. :)



Yours,
--
James D. Forrester
Product Manager, VisualEditor
Wikimedia Foundation, Inc.

jforres...@wikimedia.org | @jdforrester



--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-26 Thread Mathieu Stumpf

Le 2013-04-25 19:09, James Forrester a écrit :

On 18 April 2013 17:32, James Forrester jforres...@wikimedia.org wrote:

TL;DR: VisualEditor will be deployed on 14 new Wikipedias next week as an
opt-in alpha. Your assistance is requested to inform your wikis about this
and help get the software translated.


This is now done (for de, nl, fr, it, ru, es, sv, pl, ja, ar, he, hi,
ko, and zh). Grateful for feedback, bug reports and suggestions of how
we can improve the VisualEditor for you.


I opted in, but I can't see anything other than the classic edit 
option. I tried on several pages I never visited before and also 
refreshed my browser cache, but still no new option. Is there a URL param 
I could try to see if at least it works this way?


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Deploying alpha of VisualEditor to non-English Wikipedias

2013-04-26 Thread Mathieu Stumpf

Le 2013-04-26 14:19, Bartosz Dziewoński a écrit :

It's currently disabled on IE9 and Opera. You can test it on them by
using the ?vewhitelist=1 parameter (but don't expect much). I'm
currently working on Opera support myself (as a volunteer).


I'm using Firefox 8.0.1. It works with the parameter workaround. By the 
way, is there a roadmap somewhere for this specific extension? It looks 
like there's no feature yet to edit tables, and I would be interested to 
know what's planned on this subject.




2013/4/26, Mathieu Stumpf psychosl...@culture-libre.org:

Le 2013-04-25 19:09, James Forrester a écrit :

On 18 April 2013 17:32, James Forrester jforres...@wikimedia.org wrote:

TL;DR: VisualEditor will be deployed on 14 new Wikipedias next week as an
opt-in alpha. Your assistance is requested to inform your wikis about this
and help get the software translated.


This is now done (for de, nl, fr, it, ru, es, sv, pl, ja, ar, he, hi,
ko, and zh). Grateful for feedback, bug reports and suggestions of how
we can improve the VisualEditor for you.


I opted in, but I can't see anything other than the classic edit
option. I tried on several pages I never visited before and also
refreshed my browser cache, but still no new option. Is there a URL param
I could try to see if at least it works this way?

--
Association Culture-Libre
http://www.culture-libre.org/



--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Zuul (gerrit/jenkins) upgraded

2013-04-25 Thread Mathieu Stumpf

Le 2013-04-24 19:44, Antoine Musso a écrit :

Hello,

This morning was very quiet so I have been bold and upgraded our Zuul
installation.


Excuse me, but what is Zuul?

--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] RFC on possible interproject link interfaces

2013-04-24 Thread Mathieu Stumpf

Le 2013-04-24 03:53, David Cuenca a écrit :

Dear all,

I have started a new RFC with some proposals for the interproject links,
and you can add more if you want.

https://meta.wikimedia.org/wiki/Requests_for_comment/Interproject_links_interface

It has been a long standing issue and one of the most voted enhancements
in Bugzilla:
https://bugzilla.wikimedia.org/show_bug.cgi?id=708


Very interesting. I'm working on a template[1] that I want to propose to 
the French Wikipedia community to address the following issue: some 
articles take for granted that the reader will know certain 
non-trivial concepts. My idea would be to give the reader an opportunity 
to quickly check that they don't lack such required knowledge, 
through a box which would list each concept with the relevant Wikipedia 
article, but also the relevant Wikiversity course if the reader would 
like to follow such structured material on this topic.


For now I have to finish the template so I can go to the community with 
a working example and let them decide on something other than a mere 
textual mockup. I know not everybody will be enthusiastic 
about this idea, especially given that (at least on the French chapter) 
Wikiversity courses are not equal in quality, but to my mind this is a 
vicious circle: poor quality → low visibility → few contributors → poor 
quality.


[1] https://fr.wikipedia.org/wiki/Mod%C3%A8le:Pr%C3%A9alable



Having the sister-projects templates at the bottom of the page is also
one of the reasons why sister projects have been so hidden from the
eyes of the big public, and now with Wikidata the issue of
maintainability can be addressed as well (similar problem as with
interlanguage links).

Micru


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] Tutorial on Semantic MediaWiki in Russian

2013-04-20 Thread Mathieu Stumpf
Le samedi 20 avril 2013 à 05:36 +0400, Yury Katkov a écrit :
 Any ideas where to publish the English version? I was quite surprised when
 I found out that there is no collective blog in english internet as our
 Habrahabr.
 
 I was thinking about IBM Developerworks but maybe they need something
 closer to the programming. Can I try to propose the article to
 http://blog.wikimedia.org/c/technology/ ?

Would the Meta wiki be inappropriate?


Re: [Wikitech-l] Making inter-language links shorter

2013-04-19 Thread Mathieu Stumpf

Le 2013-04-19 12:05, Lars Aronsson a écrit :

On 04/18/2013 06:50 PM, Pau Giner wrote:

As multilingual content grows, interlanguage links become longer on
Wikipedia articles. Articles such as Barack Obama or Sun have more than
200 links, and that becomes a problem for users that often switch among
several languages.


For how many users is this a problem? How do you estimate
this? I think it's good that the user is a little overwhelmed
with how many languages are available.
The use of Geo-IP
will be interpreted as a political tool that tries to enforce
certain languages in certain geographic areas, which would
be contrary to the mission of the Wikimedia Foundation,
that says all the world's knowledge in your own language.


What a delight to see someone who is more paranoid than myself on 
this topic. ;)




If a user finds it offensive that an article is available in some
languages that she somehow dislikes, let her login and
select which ones should be hidden. The burden should be
on that user. Wikipedia should provide knowledge, not
hide it.


On the other hand, clutter can be an effective way to hide or obstruct 
access to information.


--
Association Culture-Libre
http://www.culture-libre.org/


[Wikitech-l] Works on annotations

2013-04-19 Thread Mathieu Stumpf
I think I read some things on annotations in the VisualEditor, but I 
can't find them again. Do you have relevant information on the subject? I 
like to use templates like {{why|blabla}} and {{according to 
who|blabla}}, but at the same time I can see that, while they make a 
call for contributions to knowledgeable people, they can quickly clutter the 
text with a lot of underlined sentences. So it may be interesting to 
have an option to hide/show this kind of query.


--
Association Culture-Libre
http://www.culture-libre.org/


[Wikitech-l] Support for 3D content

2013-04-19 Thread Mathieu Stumpf

Hi,

Reading the 2012-13 Plan, I see that multimedia is one of the key 
activities for MediaWiki. So I was wondering if there was already any 
plan to integrate 3D model viewers, which would for example be very 
interesting for anatomy articles, or simply 3D maths objects.


MediaGoblin[1] shows that we already have all the free-software stack to 
do that; however, I don't have enough data to estimate the development 
cost of that kind of feature for Wikimedia.



[1] https://en.wikipedia.org/wiki/GNU_MediaGoblin


--
Association Culture-Libre
http://www.culture-libre.org/


Re: [Wikitech-l] [Design] Making inter-language links shorter

2013-04-19 Thread Mathieu Stumpf
Also, did you think of the accessibility issues in your solution? Here I
especially think of people with visual disabilities, for whom JS often
means no way to get the content, while a long list of hyperlinks is
manageable.

Le vendredi 19 avril 2013 à 20:19 +0200, Pau Giner a écrit :
 Thanks for all the feedback!
 
 
 I'll try to respond below to some of the issues raised:
 
 
 Which is the problem?
 
 
 As it has been mentioned, one of the most effective ways of hiding
 something is to surround it in the middle of a long list. This
 produces two problems:
   * Lack of discoverability. users may be not aware that the
 content is available in their language (which goes against our
 goal of providing access to knowledge). Speakers of small
 languages that access the English Wikipedia because it has
 more content, are forced to make an effort each time to check
 if each article is also available in their language.
   * Problems for multi-lingual exploration. It is hard to switch
 between multiple language versions since the users has to look
 for his languages each time in the whole list.
 
 
 The fact that some Wikipedias adjust the order of the languages, the
 existence of user scripts and an Opera extension to solve the issue,
 is an indicator of the existence of such problem. 
 
 
 
 We support lots of languages (+300) but users are normally interested
 in a small (1-8) subset of those. We need to make these subset easily
 discoverable for our users, and providing them in the middle of a list
 with 200 items is not the best way to do it in my opinion.
 
 
 Possible cultural and value problems
 
 
 As was commented, the multilingual nature of Wikipedia is a strongly
 held value. However, it is currently hard to know how many languages
 an article is available in, since you need to count the links.
 With the proposed approach we provide a number, which helps to
 communicate that. So I think we are not going against that value.
 
 
 I think that concerns about the imposition of languages per region are
 not a big issue when previous user choices and the browser Accept-Language
 header are given higher priority than Geo-IP. Users just
 need to select their language once and it will appear in the
 short list from then on. These concerns are more relevant to
 the current situation, where some wikis put certain languages on top
 regardless of user choices (for some users this works 100% of the time;
 for others it fails 100% of the time).
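 To make the priority order described above concrete (previous user
 choices first, then the browser's Accept-Language header, then a
 Geo-IP guess), here is a minimal sketch. The function name, arguments,
 and the limit of 8 are purely illustrative assumptions, not part of
 any actual implementation:

 ```python
 def shortlist(previous_choices, accept_languages, geoip_languages, limit=8):
     """Merge language sources in priority order, dropping duplicates.

     previous_choices: languages the user picked before (highest priority)
     accept_languages: parsed from the browser's Accept-Language header
     geoip_languages:  guessed from the request's Geo-IP lookup (lowest)
     """
     seen, result = set(), []
     for lang in [*previous_choices, *accept_languages, *geoip_languages]:
         if lang not in seen:
             seen.add(lang)
             result.append(lang)
         if len(result) == limit:
             break
     return result

 # A returning Belarusian user browsing from Poland with an English browser:
 print(shortlist(["be"], ["en", "ru"], ["pl"]))  # ['be', 'en', 'ru', 'pl']
 ```

 Because earlier sources win, a one-time manual choice permanently
 outranks the Geo-IP guess, which is the behaviour argued for above.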
 
 
 
 I also don't think that we should prioritise the need to hide
 languages that users somehow dislike over making it easy to access
 the languages that the user wants. In any case, the former is not
 supported with the current approach either.
 
 
 Why to hide?
 
 
 I understand the problems mentioned about when language links were
 initially hidden in Vector, since users were required to take an
 additional step to get to the same long list of links we currently
 have. With the proposed approach, the extra step is only taken in
 exceptional cases (e.g., a user in a foreign country accessing from a
 public PC), it is taken only once (not for each language change), and
 aids such as search are provided to make it really quick.
 
 
 The reordering alternative has some problems compared with the
 proposed approach. For example, when a language does not appear on
 top, it is hard to determine whether the current article is not
 available in that language or is simply somewhere in the middle of the
 list. In addition, with reordering, you cannot rely on alphabetical
 order (while you can present the short list alphabetically).
 
 
 
 
 Considering size and quality of the article
 
 
 It can be a factor to consider, since communicating that an article has
 good versions in other languages is a good thing. But I think it is a
 low-priority one, since I find it hard to imagine a user selecting a
 language she does not understand (otherwise it would already be in
 the short list) just because the article is of good quality. In any
 case, users normally speak 1-8 languages, so even in a relatively
 short list there is still room for other criteria.
 
 
 The way to access more
 
 
 We chose the "..." so that the label could work across languages, and
 so that you can go back to your language if you arrive by accident at a
 foreign Wikipedia (or you are an advanced user curious to check whether
 web fonts were enabled on the Javanese Wikipedia). However, the visual
 execution still needs some work to make it more touch-friendly, among
 other things.
 
 
 Details on the language list UI
 
 
 The interactive prototype was created to communicate the main idea and
 most details are still lacking. 
 
 
 Thanks for pointing out the layout and navigation problems. The layout
 of the list is one of the aspects that needs more work: currently we
 group the languages by script to help users recognise their familiar
 scripts, and the algorithm also makes some 

Re: [Wikitech-l] Tutorial on Semantic MediaWiki in Russian

2013-04-19 Thread Mathieu Stumpf
Le vendredi 19 avril 2013 à 21:02 +0300, Paul Selitskas a écrit :
 That is a nice starter guide! Thank you, Yury!

I'll definitely be interested in the English version. Well, to be
honest, I would prefer some magical recipe for instantly learning
Russian, but I doubt anyone can provide such a thing. ;)

 
 On Fri, Apr 19, 2013 at 7:35 PM, Yury Katkov katkov.ju...@gmail.com wrote:
  Hi everyone!
 
  In Semantic MediaWiki we have great but very long documentation. I've tried
  to write a small introduction to SMW. It's only in Russian now (I know that
  a number of Russian MediaWikians read this mailing list), but I'm thinking
  of writing something similar in English (maybe for IBM DeveloperWorks).
 
  http://habrahabr.ru/post/173877/
 
  If you don't know Russian you can enjoy the pictures anyway ;)
 
  Cheers,
  -
  Yury Katkov, WikiVote
 
 
 



Re: [Wikitech-l] WMF Engineering Roadmap Update - 20130417

2013-04-18 Thread Mathieu Stumpf

Le 2013-04-17 20:06, Greg Grossmeier a écrit :
Hello and welcome to the latest edition of the WMF Engineering Roadmap
Update.

Highlights include:
* Mobile resubmitted to AppStore yesterday, still waiting on the Apple
  Geniuses to approve ;)
* QA is turning all the tests running against betalabs green (they were
  red from unexpected config differences); they're close.
* Echo will be rolling out on en, de, and fr wiki next week on Thursday
  (the 25th)

On all MediaWiki projects, or just on some of them, like Wikipedia?



Full roadmap at:

https://docs.google.com/a/wikimedia.org/spreadsheet/ccc?key=0Aoizbfxc5g6KdEkza0xkQnJlM0o0TXlwQXhDOUFvYnc#gid=0


Would it be possible to use a free software solution? As far as I know, 
that is not the case for the Google spreadsheet. Depending on the 
features needed, EtherCalc[1] could be used instead of the current solution. 
Other free/libre culture advocates, like the French Framasoft[2] network, 
are already using it; see [3].


[1] https://ethercalc.org/
[2] https://fr.wikipedia.org/wiki/Framasoft (Sorry, no English 
translation yet)

[3] http://framacalc.org/_start


--
Association Culture-Libre
http://www.culture-libre.org/

