Re: [Wikitech-l] Equivalence for using Template:Cite_Web with templates surrounding Wikicode

2019-12-31 Thread Eran Rosenthal
Hi Eugene,
Assuming it is a local installation, or a wiki other than the English
Wikipedia where citations aren't yet configured, you can follow the
English Wikipedia setup.

E.g. copy
https://en.wikipedia.org/wiki/MediaWiki:Citoid-template-type-map.json
and
https://en.wikipedia.org/wiki/MediaWiki:Visualeditor-cite-tool-definition.json
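For reference, the first page (Citoid-template-type-map.json) maps Citoid/Zotero item types to the names of local citation templates. Its shape is roughly as below; the template names shown are the enwiki ones, so adjust them to whatever citation templates actually exist on your wiki:

```json
{
    "webpage": "Cite web",
    "journalArticle": "Cite journal",
    "newspaperArticle": "Cite news",
    "bookSection": "Cite book"
}
```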

If you are doing it on one of the Wikimedia projects and not a local
installation, it would be great to update:
https://phabricator.wikimedia.org/T127354


On Tue, Dec 31, 2019 at 4:13 PM Jaime Crespo  wrote:

> Hi,
>
> On Tue, Dec 31, 2019 at 12:42 PM Egbe Eugene 
> wrote:
>
> > Hi All,
> >
> > I am implementing template
> https://en.wikipedia.org/wiki/Template:Cite_web
> > in
> > VE and after using the template code[1], I get results shown below
> >
>
> Sorry, I may not be understanding what you are trying to accomplish, I am
> guessing reimplementing Cite_web functionality on your own local mediawiki
> installation?
> I don't know much about visual editor or references-related code, someone
> else may be able to help you here better, but I know it is implemented on
> enwiki:
>
> https://phab.wmfusercontent.org/file/data/wz3njgh6k6iwrugvap6f/PHID-FILE-n6el7aiazixa4mwb2rq2/Screenshot_20191231_150008.png
>
> https://en.wikipedia.org/w/index.php?title=User:JCrespo_(WMF)/sandbox&type=revision&diff=933365112&oldid=926096219&diffmode=visual
>
> So my only suggestion is to look at how enwiki has it configured and try to
> adapt it to your needs.
>
> Another suggestion is to use Phaste:
> https://phabricator.wikimedia.org/paste/edit/form/14/ for code snippets,
> as
> it will prevent vandals from editing it + allow comments.
>
> Sorry I cannot be of more help,
> --
> Jaime Crespo
> 
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit Vandalism

2019-03-18 Thread Eran Rosenthal
I couldn't catch anyone on IRC, so this is also tracked in:
https://phabricator.wikimedia.org/T218636

On Tue, Mar 19, 2019 at 7:40 AM Jay prakash <0freerunn...@gmail.com> wrote:

> Hi please stop User Mill on Gerrit. He/She vandalised my and my Tech's
> patches.
>
> User:Jayprakash12345
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Community-Engineering gaps as defined in configuration

2019-03-09 Thread Eran Rosenthal
Hi,
Wiki communities can ask to override their default configurations
(following consensus). The reasons to override may vary:

   1. *customization* for community to align to community specific policy
   (example: special namespaces / upload policy/user groups rights defined per
   project and language version)
   2. *technical dispute* where community and engineering don't agree
   (example:  VE as single tab for enwiki[1], disabling description tagline in
   enwiki[2] etc).

Technical disputes are problematic - if the product is not good enough,
engineering and the community should ideally come up with a solution for
how to address the issues. We can define a plan to fix the product
(disable until task X is fixed), enable/disable the feature at the USER
level if it is a matter of choice (not at the site level), etc.
Anyway, we should try to avoid letting specific communities override
defaults long-term if there is no community-specific reason to override
the configuration. This creates a jungle of configurations, inconsistent
UI across languages, and more complex maintenance.

This is a call for action for the wiki communities and engineering:

   - Engineering - consider aligning all wikis to a similar configuration
   when an override is not a community customization but a "technical
   dispute"
   - Communities - consider whether these configuration overrides are old
   and can be removed

Examples of issues found in wmf-config:

   - VisualEditor tabs: wmgVisualEditorUseSingleEditTab is set on 60 wikis
   ("Deploying slowly with community advanced notice"?);
   wmgVisualEditorSingleEditTabSecondaryEditor - enwiki overrides the
   default editor
      - and more generally, enabling VE deployments:
      https://www.mediawiki.org/wiki/VisualEditor/Rollouts
   - Patrolling
      - wgUseRCPatrol - disabled by default, but ~80 wikis enable it.
      Should it be enabled for all wikis? (If not - what is missing to get
      it deployed on more wikis? Are there alternative features for it
      used on other wikis?)
      - wgUseNPPatrol/wgUseFilePatrol - a few wikis override it. Do we
      really need to override it?
   - wgImportSources
      - Each wiki defines an arbitrary list of wikis it imports from. We
      should get rid of it and allow imports between any Wikimedia sites.
      See https://phabricator.wikimedia.org/T17583
   - wmgUseSandboxLink
      - Enabled on 80 wikis. Why not enable it everywhere?
   - wmgUseWikiLove - enabled on ~50 wikis including enwiki. Is there a
   reason not to enable it on all wikis?
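As a rough illustration of how one might quantify such gaps, the sketch below counts how many wikis diverge from a setting's default. The data shape is invented for illustration - it is not the actual wmf-config format:

```javascript
// Invented data shape, for illustration only: given a default value and a
// map of per-wiki values for one setting, count how many wikis diverge.
function countOverrides( defaultValue, perWiki ) {
    return Object.values( perWiki ).filter( ( v ) => v !== defaultValue ).length;
}

// e.g. wgUseRCPatrol is disabled by default but enabled on ~80 wikis
const wgUseRCPatrol = { enwiki: true, dewiki: true, hewiki: true, frwiki: false };
console.log( countOverrides( false, wgUseRCPatrol ) ); // → 3
```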

Thanks,
Eran

[1] https://phabricator.wikimedia.org/T128478
[2] https://phabricator.wikimedia.org/T161805
[3] https://phabricator.wikimedia.org/T214678
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Javascript function does not work

2019-02-12 Thread Eran Rosenthal
Hi Igal,
We are trying to avoid using jquery.ui in favor of OOUI, to get a more
modern interface. The jQuery dialog module is NOT loaded by default (if it
works for you on some wiki, you likely enabled a gadget that relies on it,
or it was enabled as a default gadget).
Not everyone likes OOUI, and it may be harder to do simple stuff with it,
but it provides some benefits when building a larger infrastructure of
dialogs.

So the short answer is to wrap your call:
mw.loader.using( 'jquery.ui.dialog', function () {
    $( '<div>' ).dialog( /* ... */ ); // 1 parameter
} );

The better way to do it: Follow
https://www.mediawiki.org/wiki/OOUI/Windows/Dialogs

Eran




‪On Tue, Feb 12, 2019 at 9:45 PM ‫יגאל חיטרון‬‎ 
wrote:‬

> So, nobody knows what happened? If so, I should create a phab ticket.
> Igal
> On Feb 8, 2019 16:51, "יגאל חיטרון"  wrote:
>
> > Works on hewiki.
> > Does not on enwiki, eswiki, ruwiki.
> > Thanks,
> > Igal
> >
> > On Fri, Feb 8, 2019 at 16:50, Alex Monk <kren...@gmail.com> wrote:
> >
> >> Is the jquery.ui.dialog module loaded in RL?
> >> Where is this failing and where is it fine?
> >>
> >> ‪On Fri, 8 Feb 2019 at 14:47, ‫יגאל חיטרון‬‎ 
> >> wrote:‬
> >>
> >> > Hi. Some script stopped working. Looks like the function
> >> >
> >> > $( '<div>' ).dialog( ... ) // 1 parameter
> >> >
> >> > does not work any more. But it still works on different wiki. Was
> there
> >> > some breaking change? Thank you.
> >> > Igal (User:IKhitron)
> >> > ___
> >> > Wikitech-l mailing list
> >> > Wikitech-l@lists.wikimedia.org
> >> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >> ___
> >> Wikitech-l mailing list
> >> Wikitech-l@lists.wikimedia.org
> >> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The mw.ext construct in lua modules

2019-02-04 Thread Eran Rosenthal
> What is the problem with the ".ext" part?
1. It adds unnecessary complexity, both in the extension (which needs to
init mw.ext if it doesn't exist) and, more importantly, in its usage when
the Lua extension is invoked (longer names).
   (There is a very small risk of name collision: mw.ModuleA and mw.ModuleB
are unlikely to clash as different extensions, and mw.ModuleA and mw.FUNC
are unlikely to clash because function names are usually verbs and
extension names are usually nouns.)
2. In practice the convention is to not use mw.ext - based on most of the
existing Lua code (e.g. wikibase).

> What is the benefit of moving existing code that is so heavily used?
Consistency and alignment to a single code convention (2). [Backward
compatibility can be kept with mw.wikibase = mw.ext.wikibase plus a
deprecation notice.]
If we believe ext is a good convention, we should drive to align to it,
and at least allow existing usages to align to that convention. (Google
counts 3K usages - if we don't fix it now, it will be much harder to fix
later.)
If we don't believe it is a good convention, we shouldn't impose it on new
Lua extensions.

Thanks,
Eran





On Mon, Feb 4, 2019 at 4:13 PM Thiemo Kreuz 
wrote:

> > […] I think mw.ext.EXTNAME should be avoided […]
>
> Can I ask to provide arguments that help others understand this
> opinion better? What is the problem with the ".ext" part?
>
> > […] or we should reject this proposal and open phab ticket to wikibase
> to change mw.wikibase to mw.ext.wikibase everywhere […]
>
> How is this an unavoidable consequence of deciding on a standard new
> code should follow? What is the benefit of moving existing code that
> is so heavily used?
>
> Kind regards
> Thiemo
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The mw.ext construct in lua modules

2019-02-02 Thread Eran Rosenthal
I agree with John: I think mw.ext.EXTNAME should be avoided, and we should
prefer mw.EXTNAME, which is clear and simple and feels very native.
This is already the way it is used in Wikibase (mw.wikibase.FUNCNAME),
which I believe is the most heavily used extension exposing a Lua
interface.

To conclude:
1. We should either accept this convention (dropping the "ext") and update
https://www.mediawiki.org/wiki/Extension:Scribunto/Example_extension
2. or reject this proposal and open a Phabricator ticket for Wikibase to
change mw.wikibase to mw.ext.wikibase everywhere (probably keeping the
former for backward compatibility)

Thanks,
Eran



On Fri, Jan 25, 2019 at 9:48 PM John Erling Blad  wrote:

> There are several extensions that diverge on the naming scheme. Some
> of them even being referenced as using the scheme, while not providing
> lua libs at all. It is a bit weird.
>
> On Fri, Jan 25, 2019 at 7:09 PM Kunal Mehta 
> wrote:
> >
> > -BEGIN PGP SIGNED MESSAGE-
> > Hash: SHA512
> >
> > Hi,
> >
> > On 1/24/19 11:33 PM, Thiemo Kreuz wrote:
> > > Is there a question assigned with this long email? Is this a call
> > > for feedback?
> >
> > I think this is probably related to/coming from
> > .
> >
> > - -- Legoktm
> > -BEGIN PGP SIGNATURE-
> >
> > iQIzBAEBCgAdFiEE2MtZ8F27ngU4xIGd8QX4EBsFJpsFAlxLUJcACgkQ8QX4EBsF
> > JpuIGRAAtHXuDQqmJK+fqKiMYrzRE7aXkX/pis7z7F5nncPWfHpaMFKFMHeAu4/d
> > PHvpJqifXi5LwCV/YSAugmZJaQ1FFn2u+/ZA9sXAAR0JBvHnY/A5unmfXkzpteEP
> > eUSCtexL5vjyjVo+Yd/ixbg06FS9Jc/6dxECxb6/A84gjHHQxA9drK4bkLZRvGPj
> > 2oInMsB37iBj5/Q/ShO8Km2hz7HJ/zNyW5ljFTYwTKzNiPBcGdswMLu4vj0ALfIF
> > OHwUeHj+M6i5UqnP0HiRBSHeFWo9it6RSXEd+lfVNbn46VJZ3zkNUFDqkWeJOWgs
> > o3N781lCdRbcn/P3V+k2CkQhVqjGPb/MgxUyQAreup8fcwBcDiDkj7wNnnUETVuS
> > EYg3Fc/xlrjIKYO54LSU5kHphEhCxAHdbxol8X8mNPQ3IHGQpyJCCSX6+qSGM/0+
> > CYtNh+ktJSyghmdUv2QOvjSkObTKL2HV9yLD3a/3qqO+Pekn9mnoNax/Splr0bV2
> > OkK9KMBEd73+/r+6hmhQoJdESOjLofyzoT9ohR3xWlJSfH8XOAWkphbuu87Dp0k1
> > KNjue1eP0KY5bO4+64hnqbCpeVpJiaQjkw+uCTmLz7u7tBME1rt7D+3D0PizXENN
> > NNkLc4XNl4ouKti3Yhkx0P4TAy/QIDR15M0eSSikHJI8PehqnRU=
> > =V+xr
> > -END PGP SIGNATURE-
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article geotags missing in geo_tags table for some WP languages?

2018-11-08 Thread Eran Rosenthal
List of wikis by geo_tags count may help to identify wikis that aren't
using geo tags:

abwiki 0
adywiki 0
akwiki 0
arcwiki 0
aywiki 0
bgwiki 0
biwiki 0
bjnwiki 0
bmwiki 0
bowiki 0
bpywiki 0
brwiki 0
bugwiki 0
bxrwiki 0
chrwiki 0
chywiki 0
crhwiki 0
crwiki 0
csbwiki 0
cuwiki 0
diqwiki 0
dzwiki 0
eewiki 0
emlwiki 0
extwiki 0
ffwiki 0
fjwiki 0
fywiki 0
gagwiki 0
ganwiki 0
gdwiki 0
glkwiki 0
gotwiki 0
gvwiki 0
hawiki 0
hawwiki 0
hsbwiki 0
iawiki 0
iewiki 0
igwiki 0
ikwiki 0
inhwiki 0
iuwiki 0
jamwiki 0
jbowiki 0
kaawiki 0
kbdwiki 0
kbpwiki 0
kgwiki 0
kiwiki 0
klwiki 0
koiwiki 0
krcwiki 0
kshwiki 0
kswiki 0
kvwiki 0
ladwiki 0
lbewiki 0
lfnwiki 0
lijwiki 0
liwiki 0
lrcwiki 0
ltgwiki 0
mdfwiki 0
mhrwiki 0
miwiki 0
mrjwiki 0
mtwiki 0
mznwiki 0
nahwiki 0
napwiki 0
nawiki 0
novwiki 0
nrmwiki 0
nvwiki 0
omwiki 0
oswiki 0
pamwiki 0
papwiki 0
pcdwiki 0
pdcwiki 0
piwiki 0
pmswiki 0
pntwiki 0
quwiki 0
rmwiki 0
rmywiki 0
rnwiki 0
rupwiki 0
rwwiki 0
scnwiki 0
sewiki 0
sgwiki 0
skwiki 0
srnwiki 0
stqwiki 0
swwiki 0
szlwiki 0
tcywiki 0
tiwiki 0
towiki 0
tpiwiki 0
tumwiki 0
twwiki 0
udmwiki 0
ugwiki 0
vecwiki 0
vepwiki 0
vewiki 0
vlswiki 0
vowiki 0
wawiki 0
wowiki 0
wuuwiki 0
xalwiki 0
xmfwiki 0
yiwiki 0
yowiki 0
zawiki 0
zeawiki 0
classicalwiki 0
nanwiki 0
zuwiki 0
amwiki 1
oldwiki 1
frrwiki 1
gorwiki 1
kwwiki 1
nywiki 1
sowiki 1
tywiki 1
anwiki 2
lgwiki 2
cowiki 3
smwiki 3
smgwiki 4
stwiki 5
lnwiki 6
cdowiki 7
iowiki 8
kuwiki 8
tetwiki 8
lbwiki 9
kawiki 10
sswiki 13
eowiki 14
snwiki 14
atjwiki 15
chwiki 15
pagwiki 18
gnwiki 21
zamwiki 23
hrwiki 37
mkwiki 37
hakwiki 38
satwiki 39
xhwiki 40
angwiki 45
fowiki 50
furwiki 55
tyvwiki 63
pihwiki 66
bmswiki 67
dinwiki 70
mwlwiki 74
tkwiki 86
gomwiki 99
lowiki 138
frpwiki 139
hifwiki 140
htwiki 186
tnwiki 229
tarawiki 249
kabwiki 282
vrowiki 287
mnwiki 295
sawiki 336
dtywiki 338
ruewiki 374
sdwiki 401
avwiki 462
bclwiki 481
dvwiki 499
minwiki 511
idwiki 571
olowiki 575
mrwiki 580
suwiki 589
nlwiki 628
dsbwiki 639
kmwiki 655
sahwiki 691
aswiki 759
lezwiki 761
iswiki 961
pnbwiki 1172
scwiki 1286
gawiki 1290
lawiki 1354
arzwiki 1387
bhwiki 1533
siwiki 1560
pswiki 1621
yuewiki 1726
lmowiki 1788
warwiki 1820
pflwiki 1882
acewiki 1934
newwiki 1984
nsowiki 2394
myvwiki 2660
orwiki 3181
knwiki 3262
ckbwiki 3666
tgwiki 3846
maiwiki 5152
newiki 5179
tlwiki 5305
ndswiki 5808
pawiki 6390
tawiki 7747
mlwiki 8229
tewiki 8424
alswiki 9300
cvwiki 9821
afwiki 11025
jvwiki 11062
tswiki 12077
barwiki 12224
bnwiki 13776
slwiki 14608
astwiki 14778
mswiki 15047
simplewiki 15836
hiwiki 15933
kywiki 16201
ttwiki 16624
azwiki 17440
sqwiki 18621
scowiki 19495
mywiki 19593
ilowiki 19769
guwiki 20287
cywiki 21966
lvwiki 24097
glwiki 24254
bawiki 25354
hewiki 27228
azbwiki 29506
thwiki 29889
etwiki 31017
elwiki 33093
urwiki 33521
bswiki 34825
mgwiki 34926
ocwiki 37370
trwiki 43747
nnwiki 47846
ltwiki 54916
bewiki 57123
fiwiki 59629
kowiki 63841
uzwiki 67233
kkwiki 71017
cewiki 77008
nowiki 101437
hywiki 105532
viwiki 110624
dawiki 112547
jawiki 115929
huwiki 136378
euwiki 142636
ptwiki 144496
cswiki 154191
rowiki 181206
fawiki 185066
nlwiki 219759
shwiki 263092
ukwiki 305328
cawiki 307912
zhwiki 310981
itwiki 361703
srwiki 374054
arwiki 381995
plwiki 387716
eswiki 474918
ruwiki 480509
frwiki 665610
dewiki 1348793
enwiki 1969331
svwiki 3027477
cebwiki 5763981

On Thu, Nov 8, 2018 at 5:23 PM Martin Dittus  wrote:

> Thank you Strainu and Bartosz, this was very useful.
>
> As far as I can tell, idwiki editors simply don't use the GeoData
> geotagging conventions. Instead, people specify article location as
> infobox latd/longd properties, which on enwiki (and likely others) is
> being deprecated in favour of GeoData tags. While this older method
> allows a map to be displayed on the page, the coordinates are not
> imported by the GeoData extension, and as a result none of these
> geotagged pages show up in API geo lookups, or in the data dumps.
>
> See also:
> https://en.wikipedia.org/wiki/Wikipedia:Coordinates_in_infoboxes
>
> I'm now pondering if there's a quick way to assess for which wikis this
> is the case... I may report back if I find a simple 

Re: [Wikitech-l] The annual Community Wishlist Survey now open for proposals

2018-10-30 Thread Eran Rosenthal
Please avoid "not feasible" answers as much as possible; instead, direct
proposals that are too broad toward smaller, feasible solutions. Such
broad proposals usually have higher impact and importance compared to
smaller tasks.

A good example of such an answer, by Danny H, is
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Miscellaneous/Wikimedia_Maps_Improvements

Some ideas that may be hard to fully achieve, and where Community Tech may
need to help with proper scoping:
1.
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Wikidata/Edit_Wikidata_from_another_project
2.
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Archive/Centralized_templates_and_modules
3.
https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019/Reading/Improve_graphs_and_interactive_content
All three examples are very good ideas IMO. They may not be fully
feasible, but I'm sure they are all partially feasible if properly scoped.






On Tue, Oct 30, 2018 at 1:36 PM Johan Jönsson 
wrote:

> Hey everyone,
>
> The Community Wishlist Survey is now open, and you can post proposals
> for projects that you would like the Wikimedia Foundation's Community
> Tech team to work on:
>
> https://meta.wikimedia.org/wiki/Community_Wishlist_Survey_2019
>
> The Community Tech team builds features and makes changes that active
> Wikimedia contributors want, and the Wishlist Survey sets the team's
> agenda for the next year.
>
> The Wishlist Survey starts with a two-week proposal period, when
> contributors from all Wikimedia projects are invited to post, discuss
> and improve proposals. After that, there's a two-week voting period,
> when everyone can post support-votes on the proposals.
>
> You can post technical proposals until 11 November.
>
> You can vote on proposals from 16 November to 30 November.
>
> The Community Tech team is responsible for addressing the top ten
> wishes on the list. If they, after investigating it, find that
> something isn't feasible, they need to explain why to the community.
> The Wishlist can also be used by volunteer developers and other teams,
> who want to find projects to work on that the community really wants.
>
> //Johan Jönsson
> --
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] How much data can we plan to upload on Wikidata infrastructure? [Was: Fwd: Re: [wikidata] [glam] [Toulouse] Projet de partenariat CNES]

2018-10-15 Thread Eran Rosenthal
I'm not sure Wikidata/Wikibase is the right platform for real-time/near
real-time data or for such raw data from sensors.

Depending on the usage, I think the following should be considered:
* What resolution (time resolution/spatial resolution) is really needed
for readers? (Probably averaging/downsampling is required anyway.)
* Consider whether Wikibase is the right platform for it, or maybe Commons
(see Help:Map_Data and Help:Tabular_Data on Commons)
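On the averaging/down-sampling point, a minimal sketch of what that means in practice (plain JavaScript; the numbers are invented):

```javascript
// Minimal sketch: down-sample a time series by averaging fixed-size
// buckets, e.g. reducing per-minute sensor readings to per-hour values.
function downsample( values, bucketSize ) {
    const out = [];
    for ( let i = 0; i < values.length; i += bucketSize ) {
        const bucket = values.slice( i, i + bucketSize );
        out.push( bucket.reduce( ( a, b ) => a + b, 0 ) / bucket.length );
    }
    return out;
}

// Raw satellite/sensor feeds are far too fine-grained for readers;
// averaging 60 one-minute samples into one hourly value cuts volume 60x.
console.log( downsample( [ 1, 2, 3, 4, 5, 6 ], 3 ) ); // → [ 2, 5 ]
```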

Regards,
Eran






On Mon, Oct 15, 2018 at 12:21 PM Mathieu Lovato Stumpf Guntz <
psychosl...@culture-libre.org> wrote:

> Hello,
>
> This might not be the most appropriate channel for this question, but I
> didn't have a better idea, so please let me know if you have a better
> suggestion for future requests.
>
> If you read French you can read the thread below, but basically, to give
> some context to my question: we are looking at a possible partnership
> with space agencies to feed the Wikimedia world with data. Depending on
> what we ask for and what agreement we reach, the volume they could
> provide might be really huge - one example given was 1 GB/minute for a
> single satellite.
>
> So my question is: how much data should we aim at collecting, and
> depending on the volume, what transfer process should we use?
>
> Cheers
>
>
>
>  Forwarded message 
> Subject: Re: [wikidata] [glam] [Toulouse] CNES partnership project
> Date: Sun, 8 Apr 2018 13:06:00 +0200
> From: Sébastien Dinot 
> Reply-to: Sébastien Dinot 
> To: Xavier Cailleau 
> Cc: g...@lists.wikimedia.fr, toulo...@lists.wikimedia.fr,
> Liste OSM Toulouse , c...@listes.openstreetmap.fr,
> wikid...@lists.wikimedia.fr, pa...@lists.wikimedia.fr
>
>
>
> Sébastien Dinot wrote:
> > I should be able to free up half a day:
>
> It is probably worth mentioning that I have known about the Wikipedia
> project for a very long time, but my contributions to it are quite
> modest (a few article fixes and a few photos), because one cannot be on
> all fronts at once (I have been a free software activist since 1998 and
> an open data activist since 2009, but mostly within the scope useful to
> cartography).
>
> Besides, I don't know much about climate topics and may lack relevance
> on the subject.
>
> Consequently, I can meet your contacts and probably be useful through
> my knowledge of CNES and of licensing, but it seems necessary to me to
> be accompanied by someone who knows Wikipedia and its sister projects
> much better than I do.
>
> What are the Foundation's goals? Obtaining data samples to illustrate
> articles, global coverage of Europe or of all land masses, long time
> series? What volume of data is it reasonable to envisage? (In the space
> sector, the data volumes produced are impressive: to my knowledge, a
> single Sentinel-2 satellite transmits 12 GB of raw data every 12
> minutes.)
>
> Sébastien
>
> --
> Sébastien Dinot, sebastien.di...@free.fr
> http://sebastien.dinot.free.fr/
> Don't get a taste for free software - you won't be able to do without it!
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Map internationalization launched everywhere, AND embedded maps now live on 276 Wikipedias

2018-05-10 Thread Eran Rosenthal
 Kudos. Great work!


On Thu, May 10, 2018 at 2:11 AM, Yuri Astrakhan 
wrote:

> Joe, thanks for heeding community's wish to have better maps, and
> delivering, all within the short span of ~3.5 months. Very impressive
> achievement!  How can the community encourage WMF to do more map work
> beyond the June deadline, similar to how foundation continues to work on
> improving search, Visual Editor, and other tools?  Thanks!
>
> On Thu, May 10, 2018 at 1:33 AM Joe Matazzoni 
> wrote:
>
> > As of today, interactive (Kartographer) maps no longer display in the
> > language of the territory mapped; instead, you’ll read them in the
> content
> > language of the wiki where they appear—or in the language their authors
> > specify (subject to availability of multilingual data). In addition,
> > mapframe, the feature that automatically embeds dynamic maps right on a
> > wiki page, is now live on most Wikipedias that lacked the feature. (Not
> > included in the mapframe launch are nine Wikipedias [1] that use the
> > stricter version of Flagged Revisions).
> >
> > If you you’re new to mapframe, this Kartographer help page [2] shows how
> > to get started putting dynamic maps on your pages.  If you’d like to read
> > more about map internationalization: this Special Update [3] explains the
> > feature and its limiations; this post [4] and this one [5] describe the
> > uses of the new parameter, lang=”xx”, which  lets you specify a map’s
> > language. And here are some example maps [6] to illustrate the new
> > capabilities.
> >
> > These features could not have been created without the generous
> > programming contributions and advice of our many map-loving volunteers,
> > including Yurik, Framawiki, Naveenpf, TheDJ, Milu92, Astirlin, Evad37,
> > Pigsonthewing, Mike Peel, Eran Roz,  Gareth and Abbe98. My apologies to
> > anyone I’ve missed.
> >
> > The Map Improvements 2018 [7] project wraps up at the end of June, so
> > please give internationalized maps and mapframe a try soon and give us
> your
> > feedback on the project talk page [8]. We’re listening.
> >
> > [1] https://phabricator.wikimedia.org/T191583
> > [2] https://www.mediawiki.org/wiki/Help:Extension:Kartographer
> > [3]
> > https://www.mediawiki.org/wiki/Map_improvements_2018#April_18,_2018,_Special_Update_on_Map_Internationalization
> > [4]
> > https://www.mediawiki.org/wiki/Map_improvements_2018#April_25,_2018,_You_can_now_try_out_internationalization_(on_testwiki)
> > [5]
> > https://www.mediawiki.org/wiki/Map_improvements_2018#April_26,_2018:_OSM_name_data_quirks_and_the_uses_of_lang=%E2%80%9Clocal%E2%80%9D
> > [6] https://test2.wikipedia.org/wiki/Map_internationalization_examples
> > [7] https://www.mediawiki.org/wiki/Map_improvements_2018
> > [8] https://www.mediawiki.org/wiki/Talk:Map_improvements_2018
> > _
> >
> > Joe Matazzoni
> > Product Manager, Collaboration
> > Wikimedia Foundation, San Francisco
> >
> > "Imagine a world in which every single human being can freely share in
> the
> > sum of all knowledge."
> >
> >
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GSoC 2018 Introduction: Hagar Shilo

2018-05-03 Thread Eran Rosenthal
Good luck / בהצלחה!


On Thu, May 3, 2018 at 7:39 PM, Amir E. Aharoni <
amir.ahar...@mail.huji.ac.il> wrote:

> Welcome / ברוכה הבאה!
>
> On Thu, May 3, 2018 at 19:27, Hagar Shilo <hagarshi...@mail.tau.ac.il> wrote:
>
> > Hi All,
> >
> > My name is Hagar Shilo. I'm a web developer and a student at Tel Aviv
> > University, Israel.
> >
> > This summer I will be working on a user search menu and user filters for
> > Wikipedia's "Recent changes" section. Here is the workplan:
> > https://phabricator.wikimedia.org/T190714
> >
> > My mentors are Moriel and Roan.
> >
> > I am looking forward to becoming a Wikimedia developer and an open source
> > contributor.
> >
> > Cheers,
> > Hagar
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code review for T15673

2018-05-02 Thread Eran Rosenthal
Done. Thanks for Lucas and Brian Wolff for reviewing the patch.


On Wed, May 2, 2018 at 1:18 PM, Eran Rosenthal <eranro...@gmail.com> wrote:

> Hi,
> A long-standing issue with references is that they don't natively support
> setting text direction, which is a widely requested feature in RTL wikis.
>
> Can someone with +2 permissions please review the gerrit item in
> https://gerrit.wikimedia.org/r/#/c/7738/ ?
>
>
> Thanks,
> Eran
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Code review for T15673

2018-05-02 Thread Eran Rosenthal
Hi,
A long-standing issue with references is that they don't natively support
setting text direction, which is a widely requested feature in RTL wikis.

Can someone with +2 permissions please review the gerrit item in
https://gerrit.wikimedia.org/r/#/c/7738/ ?


Thanks,
Eran
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] What ways are there to include user-edited JavaScript in a wiki page? (threat model: crypto miners)

2018-03-15 Thread Eran Rosenthal
Lego already wrote a script to verify that no external resources are
loaded: https://phabricator.wikimedia.org/T71519
I think there is a Jenkins job running it on a regular basis.
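The core idea of such a check can be sketched in a few lines: scan site JavaScript for URLs pointing outside the Wikimedia domains. Note the allowlist below is illustrative and much shorter than whatever the real script uses:

```javascript
// Illustrative allowlist - the real check covers many more Wikimedia domains.
const ALLOWED = [ 'wikipedia.org', 'wikimedia.org', 'mediawiki.org', 'wikidata.org' ];

// Return every URL in a script's source whose host is not on the allowlist.
function findExternalUrls( source ) {
    const urls = source.match( /https?:\/\/[^\s"']+/g ) || [];
    return urls.filter( ( url ) => {
        const host = new URL( url ).hostname;
        return !ALLOWED.some( ( d ) => host === d || host.endsWith( '.' + d ) );
    } );
}

// Flags only the coinhive.example URL; the enwiki one is allowed.
console.log( findExternalUrls(
    'importScript("https://en.wikipedia.org/x.js"); var m = "https://coinhive.example/miner.js";'
) );
```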

On Thu, Mar 15, 2018 at 6:30 AM, MZMcBride  wrote:

> David Gerard wrote:
> >What ways are there to include user-edited JavaScript in a wiki page?
> >
> >[...]
> >
> >You can't see it now, but it was someone including a JavaScript
> >cryptocurrency miner in common.js!
> >
> >Obviously this is not going to be a common thing, and common.js is
> >closely watched. (The above edit was reverted in 7 minutes, and the
> >user banned.)
> >
> >But what are the ways to get user-edited JavaScript running on a
> >MediaWiki, outside one's own personal usage? And what permissions are
> >needed? I ask with threats like this in mind.
>
> There's an old post of mine that documents some of the ways to inject
> site-wide JavaScript:
>  >
>
> I believe, as Brian notes in this thread, that most methods require having
> the "editinterface" user right so that you can edit wiki pages in the
> "MediaWiki" namespace. By default, this user right is assigned to the
> "sysop" user group, but if you search through
>  for the string
> "editinterface", you can see that on specific wikis such as fawiki, this
> user right has been assigned to additional user groups.
>
> Jon Robson wrote:
> >It has always made me a little uneasy that there are wiki pages where
> >JavaScript could potentially be injected into my page without my approval.
> >To be honest if I had the option I would disable all site and user scripts
> >for my account.
>
> You could file a Phabricator task about this. We already specifically
> exempt certain pages, such as Special:UserLogin and Special:Preferences,
> from injecting custom JavaScript. We could potentially add a user
> preference to do what you're suggesting.
>
> That said, you're currently executing thousands upon thousands of lines of
> code on your computer that you've never read or verified. If you're a
> standard computer user, you visit hundreds of Web sites per year that each
> execute thousands of lines of untrusted scripts that you've never read or
> verified. Of all the places you're likely to run into trouble, Wikimedia
> wikis are, in many ways, some of the safest. Given all of this code, your
> computer, as well as mine, are vulnerable to dozens of very real attacks
> at any time. And yet we soldier on without too much panic or worry.
>
> >Has this sort of thing happened before?
>
> Salon.com recently prompted users with ad blocking software installed to
> voluntarily mine cryptocurrency: .
> This situation on fa.wikipedia.org was obviously involuntary. I don't know
> of any similar incidents. We have had wiki administrators inadvertently
> inject scripts with privacy issues, such as Google Analytics. These
> scripts have generally been promptly removed when noticed. On the other
> hand, pages such as  have been loading the
> same problematic scripts (Google Analytics and JavaScript from
> ajax.googleapis.com) for years and nobody seems to have cared enough yet.
>
> >Can we be sure there isn't a gadget, interface page that has this sort of
> >code lurking inside? Do we have any detection measures in place?
>
> A much surer bet is that at least some gadgets and other site-wide
> JavaScript have privacy issues and potentially security issues. It would
> be shocking if, across the hundreds of Wikimedia wikis, none of them did.
>
> I think in the past Timo and maybe Alex Monk have done some surveying of
> public Wikimedia wikis using a browser or browser emulator to check if
> there are network requests being made to non-Wikimedia domains. As Lucas
> noted in this thread already, there are also tasks such as
>  that could be worked on, if
> there's sufficient interest.
>
> MZMcBride
>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikidata vandalism dashboard (for Wikipedians)

2018-01-29 Thread Eran Rosenthal
Good idea, and thanks for sharing.

Ideas for improvement:
1. Missing reference - the list by itself is not sufficient to determine
whether an edit is bad or not, as there is no reference value (a Qid is not
meaningful on its own).
It would be nice to have another column with a "reference" value to
compare against - either a different language (en or a fallback language) or
the previous value.
2. Summaries such as " /* wbsetlabel-add:1|he */ " or " /* wbsetlabel-set:1|he */ "
should be formatted into a human-readable form, removed, or replaced with an icon.
3. (dreaming) How awful would it be to move this into MediaWiki itself (e.g.
implement ChangesListStringOptionsFilterGroup /
ChangesListStringOptionsFilter [T176515]),
doing regex queries on the non-indexed summary column of recentchanges?
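
Idea 2 amounts to a little string parsing. A rough sketch of how such autocomments could be humanized (the ACTIONS mapping and the function name are invented for illustration, and only summaries in the format shown above are handled):

```javascript
// Map Wikibase autocomment action names to readable labels.
// The action names come from the summaries quoted above; the
// English wording here is my own.
const ACTIONS = {
  'wbsetlabel-add': 'Added label',
  'wbsetlabel-set': 'Changed label',
  'wbsetdescription-add': 'Added description',
  'wbsetdescription-set': 'Changed description',
};

function humanizeSummary(summary) {
  // Matches e.g. "/* wbsetlabel-add:1|he */ some text"
  const m = summary.match(/\/\*\s*([\w-]+):\d+\|([\w-]+)\s*\*\/\s*(.*)/);
  if (!m) {
    return summary; // not an autocomment; leave untouched
  }
  const [, action, lang, rest] = m;
  const label = ACTIONS[action] || action;
  return rest ? `${label} [${lang}]: ${rest}` : `${label} [${lang}]`;
}
```

So `/* wbsetlabel-add:1|he */ hello` becomes "Added label [he]: hello", which is much friendlier in a patrol list than the raw magic comment.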

On Mon, Jan 29, 2018 at 6:48 PM, Dayllan Maza  wrote:

> This is great. Thank you for sharing
>
> On Mon, Jan 29, 2018 at 7:07 AM Amir Ladsgroup 
> wrote:
>
> > It's tracked in: https://github.com/Ladsgroup/Vandalism-dashboard/issues/2
> >
> > On Sun, Jan 28, 2018 at 11:26 PM Magnus Manske <
> > magnusman...@googlemail.com>
> > wrote:
> >
> > > Quick note: Looks great, but "Changes in descriptions" is always on,
> even
> > > after clicked off...
> > >
> > > On Sun, Jan 28, 2018 at 5:54 PM Amir Ladsgroup 
> > > wrote:
> > >
> > > > Hello,
> > > > People usually ask me how they can patrol edits that affect their
> > > Wikipedia
> > > > or their language. The proper way to do so is by using watchlist and
> > > > recentchanges (with "Show Wikidata edits" option enabled) in
> Wikipedias
> > > but
> > > > sometimes it shows too many unrelated changes.
> > > >
> > > > Also, it would be good to patrol edits for languages you know, because
> > > > the descriptions are shown and editable in the Wikipedia app, making it
> > > > vulnerable to vandalism (much vandalism in this area goes unnoticed for
> > > > a while and sometimes gets fixed by another reader, which is suboptimal).
> > > >
> > > > So Lucas [1] and I had a pet project to let you see unpatrolled edits
> > > > related to a language in Wikidata. It has some basic integration with
> > > > ORES, and if you see a good edit and mark it as patrolled, it goes away
> > > > from this list. What I usually do is check this page twice a day for the
> > > > Persian language, which, given its size, is enough.
> > > >
> > > > It's in https://tools.wmflabs.org/wdvd/index.php and the source code is
> > > > in https://github.com/Ladsgroup/Vandalism-dashboard and you can report
> > > > issues/bug/feature requests in
> > > > https://github.com/Ladsgroup/Vandalism-dashboard/issues
> > > >
> > > > Please spread the word and any feedback about this tool is very
> welcome
> > > :)
> > > >
> > > > [1]: https://www.wikidata.org/wiki/User:Lucas_Werkmeister_(WMDE)
> > > >
> > > > 
> > > > Best
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal regarding the handling of imported usernames

2017-11-30 Thread Eran Rosenthal
I suggested it in T20209#3535024 back in August; thanks, Brad, for taking
care of it :)

Just to add a side note regarding user=0 with a non-IP user_text value - I
saw it was quite common in the Wikidata recentchanges table a few months ago
with rc_type=5 (RC_EXTERNAL), though I can't see such rows anymore.






On Thu, Nov 30, 2017 at 7:31 PM, Brad Jorsch (Anomie)  wrote:

> The proposal was approved by TechCom, the code has been merged, and it's
> live now on the Beta Cluster. I'm running the maintenance script now.
> Please test things there and report any bugs you encounter, either by
> replying to this message or by filing it in Phabricator and adding me as a
> subscriber. Assuming no major errors turn up that can't be quickly fixed,
> I'll probably start running the maintenance script on the production wikis
> the week of December 11 (and perhaps on mediawiki.org and testwiki the
> week
> before).
>
> If you're curious as to what the history of an existing imported page might
> look like after the maintenance script is run, see
> https://commons.wikimedia.beta.wmflabs.org/wiki/Template:Documentation?action=history
> for an example.
>
> On Tue, Oct 31, 2017 at 10:52 AM, Brad Jorsch (Anomie) <
> bjor...@wikimedia.org> wrote:
>
> > Handling of usernames in imported edits in MediaWiki has long been weird
> > (T9240[1] was filed in 2006!).
> >
> > If the local user doesn't exist, we get a strange row in the revision
> > table where rev_user_text refers to a valid name while rev_user is 0,
> > which typically indicates an IP edit. Someone can later create the name,
> > but
> > rev_user remains 0, so depending on which field a tool looks at the
> > revision may or may not be considered to actually belong to the
> > newly-created user.
> >
> > If the local user does exist when the import is done, the edit is
> > attributed to that user regardless of whether it's actually the same
> > user.
> > See T179246[2] for an example where imported edits got attributed to the
> > wrong account in pre-SUL times.
> >
> > In Gerrit change 386625[3] I propose to change that.
> >
> >- If revisions are imported using the "Upload XML data" method, it
> >will be required to fill in a new field to indicate the source of the
> >edits, which is intended to be interpreted as an interwiki prefix.
> >- If revisions are imported using the "Import from another wiki"
> >method, the specified source wiki will be used as the source.
> >- During the import, any usernames that don't exist locally (and can't
> >be auto-created via CentralAuth[4]) will be imported as an
> >otherwise-invalid name, e.g. an edit by User:Example from source 'en'
> >would be imported as "en>Example".[5]
> >- There will be a checkbox on Special:Import to specify whether the
> >same should be done for usernames that do exist locally (or can be created)
> >or whether those edits should be attributed to the existing/autocreated
> >local user.
> >- On history pages, log pages, and the like, these usernames will be
> >displayed as interwiki links, much as might be generated by wikitext like
> >"[[:en:User:Example|en>Example]]". No parenthesized 'tool' links (talk,
> >block, and so on) will be generated for these rows.
> >- On WMF wikis, we'll run a maintenance script to clean up the
> >existing rows with valid usernames and rev_user = 0. The current plan
> >there is to attribute these edits to existing SUL users where possible
> >and to prefix them with a generic prefix otherwise, but we could as
> >easily prefix them all.
> >   - Unfortunately it's impossible to retroactively determine the
> >   actual source of old imports automatically, or to automatically do
> >   anything about imports that were misattributed to a different local
> >   user in pre-SUL times (e.g. T179246[2]).
> >   - The same will be done for CentralAuth's global suppression
> >   blocks. In this case, on WMF wikis we can safely point them all
> >   at Meta.
> >
> > If you have comments on this proposal, please reply here or on
> > https://gerrit.wikimedia.org/r/#/c/386625/.
> >
> >
> > Background: The upcoming actor table changes[6] require some change to
> > the handling of these imported names because we can't have separate
> > attribution to "Example as a non-registered user" and "Example as a
> > registered user"
> > with the new schema. The options we've identified are:
> >
> >1. This proposal, or something much like it.
> >2. All the existing rows with rev_user = 0 would have to be attributed
> >to the existing local user (if any), and in the future when a new user
> >is created any existing edits attributed to that name will be
> >automatically attributed to that new account.
> >3. All the existing rows with rev_user = 0 and an existing local user
> >would have to be re-attributed to different *valid* usernames,
> >   

Re: [Wikitech-l] Previous mediawiki version test wiki

2017-09-22 Thread Eran Rosenthal
If you don't have access to an old MediaWiki version (whether it is group2,
your own wiki, or the test3wiki suggested above),
and you suspect there is a regression of something that was working in the
past,
it is useful to indicate that in the bug description, and the maintainer of
that feature can check it out
(either from indications in the git log with relevant commit messages, or by
reverting to older code).

You can also go over the commit log using the web interface:
https://github.com/wikimedia/mediawiki/commits/master
and as Niharika said, there are a lot of new changes :)
While this is static (compared to a running MediaWiki instance), it has its
own advantages:
you can find out who the owner is (blame), related bugs (usually "Bug: X"
in the commit message), etc.



On Fri, Sep 22, 2017 at 8:27 AM, Niharika Kohli 
wrote:

> >
> > When filing a phab task with some new bug,
> > you always want to know - is it really new, or I just did not pay
> attention
> > to it before?
>
> What's the purpose of this information? If it's a bug, new or not, a ticket
> needs to be filed.
>
> And when I do know it's a new bug, I can open both versions
> > in the same time, and compare the behaviour for this bug. And also,
> compare
> > the console results - what exactly changed in html, in css, in js
> commands
> > reactions.
>
> I agree that information will save some developer time but at the same time
> this information is not so easy to gather. This is helpful when the users
> have some working knowledge of how developer tools work and how to compare
> file changes. Usually in each version there are a lot of new changes. Often
> it's not easy for developers even to find out what could be causing the
> bug.
>
> I can easily imagine such a wiki quickly falling into disuse.
>
>
> On Thu, Sep 21, 2017 at 7:29 PM, יגאל חיטרון  wrote:
>
> > It can work. But another Monday. I mean, if Tue-Wed-Thu there is a
> > deployment of version 5, a day before, Mon there is a deployment of
> > version 4, so starting from tomorrow, group 0 will get a way  to see
> > both version, exactly from the beginning, but not until the end, for 6
> > days, group 1 for 5 days, and group 2 for 4 days. And from Monday to the
> > deployment, 1-2-3 days, there will not be use of this. I'll be very glad
> if
> > it will be decided to do this, and if so, it will be a good thing to add
> to
> > the text of how to report a bug in phabricator help, something about, you
> > can check if it is a regression, the last version "falt", by comparing
> with
> > this new wiki. I can thing about many dozens of tasks I wrote and read
> > where this information could be useful, if added at the first place. Hope
> > you decide this indeed. Thank you very much,
> > Igal
> >
> >
> > On Sep 22, 2017 05:17, "Chad"  wrote:
> >
> > > No non-emergency deployments on Fridays, Saturdays or Sundays.
> > > Monday could work.
> > >
> > > -Chad
> > >
> > > ‪On Thu, Sep 21, 2017 at 7:15 PM ‫יגאל חיטרון‬‎ 
> > > wrote:‬
> > >
> > > > I glad you say so. What about Friday?
> > > > Igal
> > > >
> > > >
> > > > On Sep 22, 2017 05:07, "Chad"  wrote:
> > > >
> > > > > It wouldn't be hard to do at all, technically. I imagine it'd be
> > > > something
> > > > > like a test3wiki.
> > > > >
> > > > > Main thing to know is when do we cycle off of the old version? When
> > the
> > > > > version goes out on Tuesdays? That day's already pretty loaded for
> > > > software
> > > > > moving about...
> > > > >
> > > > > -Chad
> > > > >
> > > > > On Thu, Sep 21, 2017 at 1:25 PM Brian Wolff 
> > wrote:
> > > > >
> > > > > > Making your case here is probably best. The release engineering
> > team
> > > > are
> > > > > > the people you probably have to convince, although of course
> anyone
> > > > could
> > > > > > potentially create such a wiki, in an unofficial way.
> > > > > >
> > > > > > Keep in mind that keeping an older version of the software
> running
> > > does
> > > > > > introduce a maintinance burden, so you will probably have to
> > convince
> > > > > > people that it would be regularly useful and not just useful this
> > one
> > > > > time.
> > > > > >
> > > > > > --
> > > > > > bawolff
> > > > > >
> > > > > > On Thursday, September 21, 2017, יגאל חיטרון 
> > > > wrote:
> > > > > > > Thank you. Sorry to hear this. Is there some place I can
> suggest
> > > this
> > > > > and
> > > > > > > explain why do I think it can be very helpful?
> > > > > > > Igal
> > > > > > >
> > > > > > > On Sep 21, 2017 22:12, "Brian Wolff" 
> wrote:
> > > > > > >
> > > > > > >> On Thursday, September 21, 2017, יגאל חיטרון <
> > > > khit...@post.bgu.ac.il>
> > > > > > >> wrote:
> > > > > > >> > Hi. Sometimes after the week deployment I need to compare
> the
> > > new
> > > > > > version
> > > > > > >> > with the previous one, in some aspect. Is there a test wiki
> > 

[Wikitech-l] Orphan unbreak now

2017-08-17 Thread Eran Rosenthal
There are thousands of pages generating Lua errors when accessing Wikidata
with Lua (T170039), and it affects most/all wikis. The issue has been going
on for a few weeks since it was first reported.

No root cause has been identified yet, but the best current guess (AFAIK) is
LuaSandbox (T171166).

[Please don't take it personally!]
1. What is the ETA for getting a fix deployed to production?
2. IMO it looks like there was a bit of miscommunication (between WMF &
WMDE?) in handling this case. For future cases, it would be nice to better
communicate the urgency of blocker tasks, especially when they involve other
organizations.

Thanks,
Eran
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] catching errors in local JavaScript

2017-01-18 Thread Eran Rosenthal
Two years ago I wrote a PhantomJS-based script to catch all JS errors on all
wikis:
https://github.com/eranroz/wiki-js-error-log

See also:
https://phabricator.wikimedia.org/T71519
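
The client-side half of such monitoring is essentially a window.onerror hook that forwards errors to a collector. A minimal sketch; the installer helper and the report-payload shape are my own assumptions, not what the repository above or Sentry actually uses:

```javascript
// Install an error collector on a window-like target and forward a
// small report object to the given callback (which would POST it to
// a logging endpoint in a real deployment).
function installErrorLogger(target, report) {
  const previous = target.onerror;
  target.onerror = function (message, source, line, column, error) {
    report({
      message: String(message),
      source: source,
      line: line,
      column: column,
      stack: error && error.stack ? error.stack : null,
    });
    // Chain to any pre-existing handler, and return false so the
    // browser's default console logging still runs.
    if (typeof previous === 'function') {
      previous.apply(this, arguments);
    }
    return false;
  };
}
```

In a browser you would call `installErrorLogger(window, sendToCollector)` from site-wide JavaScript; the collector then aggregates errors per wiki.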




On Wed, Jan 18, 2017 at 6:37 PM, Jeremy Baron  wrote:

> On Jan 18, 2017 11:26, "Amir E. Aharoni" 
> wrote:
>
> Remind me please, were there ever any efforts to get client-side JavaScript
> errors monitored centrally?
>
>
> I think you're looking for
> https://phabricator.wikimedia.org/project/profile/976/ aka
> https://phabricator.wikimedia.org/tag/sentry/
>
> -Jeremy
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Arbitrary Wikidata querying

2016-12-15 Thread Eran Rosenthal
TL;DR: The ONLY practical solution today is to use Lua.
This sucks, but it works and scales well [in the WP sense] - hewiki uses it
heavily in infoboxes, to show lists of actors in movies, musical band
members, etc.

Long version:
Actually, specifically for a list of presidents you don't need a bot.
Here is how to do it in Lua, more or less (in pseudo-code):
local countryEntity = mw.wikibase.getEntity('Q30') --note: you can get
the country from a property of the current entity to be generic
local presidents = countryEntity:getBestStatements('P6') --note: you
can get this as a parameter
local output = ''
for i, statement in ipairs(presidents) do
local value = statement.mainsnak and statement.mainsnak.datavalue
-- parse it to the desired output...
end

(A real-world usage example:
https://he.wikipedia.org/wiki/%D7%99%D7%97%D7%99%D7%93%D7%94:PropertyLink?uselang=en
in the function getProperty)

Why this is good:
1. It is the only practical way to query Wikidata from Wikipedia. [Bots
aren't practical: 1. they are less accessible to common users; 2. some use
cases only require re-running the query and updating every 4-5 years, when
the list of governors changes.]
2. It is generic enough to work for different countries and different lists.
3. Users can easily use it with syntax such as
{{#invoke:LuaModule|listOf|Q30|P6}} or via templates, and are unaware of the
implementation.

Why it sucks:
1. Because it is ugly Lua code.
2. It just moves the problem to Wikidata [you have to maintain Q30.P6 using
bots/humans instead of queries].
3. It is limited to simple lists (you can't have a list of Republican
presidents, because that requires additional filters and you don't want to
create a new property for it).
4. Internationalization - what if the yi Wikipedia wants to create a list of
governors of some small country where there are no yi labels for the
presidents? The list would be partially in yi and partially in en - is this
the desired behavior? Or it could show only presidents who have a label in
yi - but this would give partial data - is that the desired behavior?
[Probably the correct solution is to show the fallback labels in en, but add
some tracking category for pages requiring label translation, or [translate
me] links.]

On Fri, Dec 16, 2016 at 7:35 AM, Stas Malyshev 
wrote:

> Hi!
>
> > Sure, but I'm not really worried about potential false positives. I'm
> > worried that we're building a giant write-only data store.
>
> Fortunately, we are not doing that.
>
> >> Unless you're talking about pulling a small set of values, in which case
> >> Lua/templates are probably the best venue.
> >
> > I'm not sure what small means here. We have about 46 U.S. Presidents, is
> > that small enough? Which Lua functions and templates could I use?
>
> No, list of presidents is not small enough. Lua right now can fetch
> specific data from specific item. Which is OK if you know the item and
> what you're getting (e.g. infoboxes, etc.) but not good for lists of
> items, especially with complicated conditions. That use case currently
> needs external tools - like bots.
>
> > Wikidata began in October 2012. I thought it might take till 2014 or even
> > 2015 to get querying capability into a usable state, but we're now
> looking
>
> Please do not confuse your particular use case with querying not be
> usable at all. It is definitely usable and being used by many people for
> many things. Generating lists directly from wiki template is not
> supported yet, and we're working on it. I'm sorry that your use case is
> not supported and you're feeling disappointed. But we do have query
> capability and it can be used and is being used for many other things.
>
> Of course, contributions - in any form, query development, code
> development, design, frontend, backend, data contributions, etc. - are
> always welcome.
>
> > to even contribute to it when it feels like putting data into a giant
> > system that you can't really get back out. I love Magnus and I have a ton
>
> Again, this is not correct - you can read data back out and there are
> several ways you can use query functionality for it right now. The way
> you want to do it is not supported - yet - but there are many other
> ways. Which we are constantly improving. But we can't do everything at
> once. Please be patient, please contribute with what you can, and we'll
> get there.
> --
> Stas Malyshev
> smalys...@wikimedia.org
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Arbitrary Wikidata querying

2016-12-11 Thread Eran Rosenthal
Currently it is only possible with Lua.
The documentation is at:
https://www.mediawiki.org/wiki/Extension:Wikibase_Client/Lua

It is quite ugly to write such a module (no cool SPARQL...), but it works,
and you can expose it through a nice interface to be used in wiki pages.

On Sun, Dec 11, 2016 at 6:17 AM, Gergo Tisza  wrote:

> On Sat, Dec 10, 2016 at 5:30 PM, MZMcBride  wrote:
>
> > A more advanced form of this Wikidata querying would be dynamically
> > generating a list of presidents of the United States by finding every
> > Wikidata item where position held includes "President of the United
> > States". Is this currently possible on-wiki or via wikitext?
> >
>
> Not directly, but there are bots which can emulate it, such as Listeria by
> Magnus:
> http://magnusmanske.de/wordpress/?p=301
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [discovery] Weekly update for the week starting 2016-09-05

2016-09-10 Thread Eran Rosenthal
Thanks, Deborah, for the update.

Just to mention another interesting feature (implemented but not yet
evaluated/activated): dcausse has implemented the ability to also show
search results based on DEFAULTSORT (T134978).
E.g. when you search for Putin (and not Vladimir Putin) you will get a
suggestion for Vladimir Putin (even if there is no redirect from Putin), as
its DEFAULTSORT is "Putin, Vladimir".
This feature may have high impact once (and if) it is activated.


On Sat, Sep 10, 2016 at 1:44 AM, Deborah Tankersley <
dtankers...@wikimedia.org> wrote:

> Hello,
>
> Here is the week's update from the Discovery department - enjoy the read
> and your weekend!
>
> == Discussions ==
> * Trey completed the analysis for optimizing language identification for
> the Dutch Wikipedia (nlwiki). The results were good (F0.5 = 82.3%) but not
> great. The small proportions of queries in the Romance languages and in
> German led to many more false positives than true positives and so they had
> to be excluded. Future work on improving confidence may help. [1]
> * We could use help translating (via translatewiki) the relevant "showing
> results from" messages into Dutch. We'll need English, Chinese, Arabic,
> Korean, Greek, Hebrew, Japanese, and Russian translations. [2]
> * The Analysis team had a discussion on how to use better wording for
> phrases like "users were 1.07 times more likely to do X" and decided on
> using phrases similar to "we can expect 2-9 more sessions to click on a
> search result when they have the new feature" [3]
> * The Search team wrapped up research into the ElasticSearch instabilities
> on the eqiad search cluster that occurred on Aug 6, 2016; nothing
> conclusive was found. [4]
>
>
> == Events and News ==
>
> === Interactive ===
> *  has been enabled on all wikis (announced via email to
> wikitech-l) [5]
> * Geoshapes data service is now integrated into all maps [6]
>
> === Search ===
> * Turned off BM25 A/B test, awaiting analysis [7]
> * Pushed into production a change that implemented ascii-folding for
> French [8]
> * Improved balance of nodes across rows for ElasticSearch eqiad cluster [9]
>
> === Portal ===
> * Currently blocked on this check-in to gerrit [10]
>
>
> == Other Noteworthy Stuff' ==
> * Our elasticsearch clusters now have "row aware shard allocation". This
> means that we can theoretically lose one row of servers in our datacenter
> and still serve search traffic. [11]
> * The Search team sent out a request for comment article that was posted
> to various Village Pumps asking for it to be translated. [12]
> ** This was in reference to the cross-wiki search results new
> functionality and design articles on MediaWiki. [13], [14]
>
>
> == Did you know? ==
> * A study came out yesterday showing that giraffes are actually four
> distinct species, rather than one (article and BBC report). [15], [16]
> ** Of course, the English and German Wikipedia pages on giraffes have
> already been updated! [17], [18]
>
>
> [1] https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/TextCat_Optimization_for_plwiki_arwiki_zhwiki_and_nlwiki
> [2] https://phabricator.wikimedia.org/T143354
> [3] https://phabricator.wikimedia.org/T140187
> [4] https://phabricator.wikimedia.org/T142506
> [5] https://lists.wikimedia.org/pipermail/wikitech-l/2016-September/086490.html
> [6] https://www.mediawiki.org/wiki/Help:Extension:Kartographer#GeoShapes_external_data
> [7] https://phabricator.wikimedia.org/T143588
> [8] https://phabricator.wikimedia.org/T144429
> [9] https://phabricator.wikimedia.org/T143685
> [10] https://gerrit.wikimedia.org/r/#/c/306241/
> [11] https://phabricator.wikimedia.org/T143571
> [12] https://meta.wikimedia.org/wiki/User:DTankersley_(WMF)/translation_request_for_cross-wiki_search_results
> [13] https://www.mediawiki.org/wiki/Cross-wiki_Search_Result_Improvements
> [14] https://www.mediawiki.org/wiki/Cross-wiki_Search_Result_Improvements/Design
> [15] http://www.cell.com/current-biology/fulltext/S0960-9822(16)30787-4
> [16] http://www.bbc.com/news/science-environment-37311716
> [17] https://en.wikipedia.org/wiki/Giraffe
> [18] https://de.wikipedia.org/wiki/Giraffe
>
>
> 
>
>
> The full update, and archive of past updates, can be found on
> Mediawiki.org:
> https://www.mediawiki.org/wiki/Discovery/Status_updates
>
>
> Interested in getting involved? See tasks marked as Easy or volunteer
> needed in Phabricator:
> [1] https://phabricator.wikimedia.org/maniphest/query/qW51XhCCd8.7/#R
> [2] https://phabricator.wikimedia.org/maniphest/query/5KEPuEJh9TPS/#R
>
>
> Cheers!
>
> --
> Deb Tankersley
> Product Manager, Discovery
> IRC: debt
> Wikimedia Foundation
>
> ___
> discovery mailing list
> discov...@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/discovery
>
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org

Re: [Wikitech-l] Opening up MediaWiki dev summit in January?

2016-09-06 Thread Eran Rosenthal
>
> But there is a problem: we have a capacity limit of 200 people.

In hackathons (either the Wikimedia hackathon or the Wikimania hackathon
days) there is not always a hall large enough for all the devs, and people
may sit in different rooms.
So the capacity limit can be softened a bit - this could be a simultaneous
event in multiple locations, where the main part takes place in SF but
Wikimedia chapters organize smaller-scale events at the same time (it could
even be one room + pizza + beer).

> We could set the goal of selecting (top down) a small number of product
> challenges

One possible goal: citations.
Citation support in MW is very hacky - based on hacks EVERYWHERE, from
Parsoid, VE, and ContentTranslation (tech) to templates/modules (where every
wiki has its own version, or some version imported from enwiki...).

I can imagine rewriting Extension:Cite from scratch (Extension:CiteV2)
with more structured-data support (similar in spirit to Brion's idea
from Wikimania Mexico) - then the Wikidata support plus importing/generating
bibliographic data in Wikidata (or another Wikibase repo?) takes place in
Berlin, where there is a strong pywikibot/WD community, while the
Parsoid + VE + core/extension support for Cite takes place in SF.

On Tue, Sep 6, 2016 at 9:14 AM, Quim Gil  wrote:

> Thank you for starting this conversation, Brion!
>
> Let me share the point where Rachel Farrand (Summit organizer) and I
> (Summit budget owner) find ourselves, after some conversations.
>
> GOALS
>
> First we need to define the goals of the Summit, then we can talk about the
> target audiences and the structure of the event that will help achieving
> these goals. The Summit and its goal have been a moving target over the
> years, as you can deduce from the many changes of names & goals. [0]
>
> Widening the audience was a main goal last year. This is why we renamed it
> to Wikimedia (not MediaWiki) Developer Summit, and we invited developers of
> tools, templates, bots, mobile apps, the MediaWiki Stakeholders Group, and
> also non-Wikimedia users of our APIs. It was a half-backed thought that
> received half-backed support that unsurprisingly brought half-backed
> results.
>
> Still, even if we would have done better, "widening the audience" is not a
> goal per se. What should we widen the audience for? Here is an idea.
>
> What if the Summit would be product driven, with architecture and the rest
> following that drive. All we are here to offer better products to our
> users. All the technical discussions make more sense when there is a clear
> product vision to be either supported or contested with reality checks.
>
> We have a Wikimedia Foundation Product department and also a Community
> Wishlist where the communities push for product improvements. We could set
> the goal of selecting (top down) a small number of product challenges and
> invite whoever needs to be involved to push them forward. Then we can leave
> plenty of free space for other topics that participants want to push
> (bottom up).
>
> That "we" should be representative and effective in order to define a list
> of goals in a few days (we need to open registration asap). It should be
> possible to get a short list from the Product and Technology departments,
> the Community Tech team (representing the Community Wishlist) and the
> Architecture Committee. Then again these product goals cannot be too
> surprising, since they are supposed to be prominent in discussions and
> plans already now.
>
>
> AUDIENCE
>
> If the Summit will focus on product goals, then it is evident that software
> architects and core developers will not be enough to achieve it. Product
> managers, UX designers, researchers, [add other roles here], and maybe even
> selected users/editors must be invited too in order to push the selected
> product improvements forward.
>
> But there is a problem: we have a capacity limit of 200 people. The
> Foundation alone could basically fill the event if we don't set limits, The
> Summit is immediately followed by the Wikimedia Foundation AllHands annual
> meeting. The Summit is actually the successor of Tech Days, an AllHands for
> all people who worked in tech at the Foundation.
>
> We do have some travel sponsorship budget for volunteers, and I believe we
> could get more participants among non-Wikimedia users of Wikimedia APIs and
> MediaWiki if we really want to target them. However, we simply cannot go
> for a big outreach while keeping an expectation of general attendance from
> Foundation's Product and Technology departments.
>
> Maybe we should go back to the invitation-only model with the capacity
> limit of 200 people in mind, and the representation of target audiences we
> want to get. For instance, we could set priorities on those directly
> involved in the product improvements selected (and that means that we need
> to select them asap) and define a % limit for Foundation participants.
>
> Basically, we would need to 

Re: [Wikitech-l] Let's make parsoid i18n great again

2016-04-01 Thread Eran Rosenthal
I rebased https://gerrit.wikimedia.org/r/#/c/247914/ and added some
documentation.
It seems that Jenkins got drunk :)






On Fri, Apr 1, 2016 at 10:34 PM, C. Scott Ananian 
wrote:

> On Fri, Apr 1, 2016 at 3:31 PM, Siebrand Mazeland 
> wrote:
>
> > Please give me some time. It's way past beer 'o clock at the Hackathon.
> > Tomorrow's another day (JST - Jerusalem Standard Time).
> >
>
> Sure, no worries.  I'm waiting for Eran to rebase and document anyway. ;)
>
> Enjoy your beer!
>  --scott
>
> --
> (http://cscott.net)
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Let's make parsoid i18n great again

2016-03-31 Thread Eran Rosenthal
Usage statistics link is broken. Correct one:
https://phabricator.wikimedia.org/T116020#1738654

On Thu, Mar 31, 2016 at 10:42 PM, Eran Rosenthal <eranro...@gmail.com>
wrote:

> TL;DR: Parsoid isn't i18n friendly and uses English keywords instead of
> localized ones.[1] Is it a bug or a feature? Please voice your opinion!
>
> Longer version:
> For some funny reason, Parsoid reads arrays from "right to left"[1];
> that is, it uses the LAST alias of a magic word rather than the first
> one[2].
> One of the reasons for this is that in English the shorter "thumb" is
> preferred over the longer "thumbnail". However, instead of fixing
> MessagesEn.php to define "thumb" as the first option, Parsoid uses the last
> option.
> This choice results in all other wikis using the English alias (which
> appears last in the magic words) rather than the localized one - so Parsoid
> isn't i18n friendly.
>
> However, there are different POVs regarding the correct solution:
> 1. Use English aliases in all projects - these are the most used aliases
> [and one of the reasons is people copying code from enwiki or using biased
> tools such as Parsoid].
> 2. Use localized aliases - keep the article content and syntax in the same
> language. This is especially important for non-Latin languages with a
> different alphabet.
> And there is a consensus that English is a bad choice for RTL languages, as
> it causes mixed-directional content, which should be avoided. So if we go
> with choice 1, RTL languages should be an exception.
>
> I believe there is a cultural point of view here, and I would like to hear
> what you think (especially non-RTL and non-English speakers): Would you
> prefer mini (German), vignette (French), miniaturadeimagen (Spanish),
> or мини (Russian) instead of thumb (for example)?
>
> I did some dump-mining to get the usage statistics:
>
> https://phab.wmfusercontent.org/file/data/bskxfupspqo64dnnkdr7/PHID-FILE-v4rf4qpq5zhm5nv5qvs4/ipjn7e7pktmycw42/img_magic_words.out
> And based on this I wrote a Python script to suggest a reordering of the
> aliases by usage[3], so if choice 2 is selected, we can merge[2] and all
> languages will use the preferred choice.
>
>
> [1] https://phabricator.wikimedia.org/T53852
> [2] https://gerrit.wikimedia.org/r/#/c/244254/3/lib/wts.LinkHandler.js
> [3] https://gerrit.wikimedia.org/r/#/c/247914
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Let's make parsoid i18n great again

2016-03-31 Thread Eran Rosenthal
TL;DR: Parsoid isn't i18n friendly and uses English keywords instead of
localized ones.[1] Is it a bug or a feature? Please voice your opinion!

Longer version:
For some funny reason, Parsoid reads arrays from "right to left"[1];
that is, it uses the LAST alias of a magic word rather than the first
one[2].
One of the reasons for this is that in English the shorter "thumb" is
preferred over the longer "thumbnail". However, instead of fixing
MessagesEn.php to define "thumb" as the first option, Parsoid uses the last
option.
This choice results in all other wikis using the English alias (which
appears last in the magic words) rather than the localized one - so Parsoid
isn't i18n friendly.

However, there are different POVs regarding the correct solution:
1. Use English aliases in all projects - these are the most used aliases
[and one of the reasons is people copying code from enwiki or using biased
tools such as Parsoid].
2. Use localized aliases - keep the article content and syntax in the same
language. This is especially important for non-Latin languages with a
different alphabet.
And there is a consensus that English is a bad choice for RTL languages, as
it causes mixed-directional content, which should be avoided. So if we go
with choice 1, RTL languages should be an exception.

I believe there is a cultural point of view here, and I would like to hear
what you think (especially non-RTL and non-English speakers): Would you
prefer mini (German), vignette (French), miniaturadeimagen (Spanish), or
мини (Russian) instead of thumb (for example)?

I did some dump-mining to get the usage statistics:
https://phab.wmfusercontent.org/file/data/bskxfupspqo64dnnkdr7/PHID-FILE-v4rf4qpq5zhm5nv5qvs4/ipjn7e7pktmycw42/img_magic_words.out
And based on this I wrote a Python script to suggest a reordering of the
aliases by usage[3], so if choice 2 is selected, we can merge[2] and all
languages will use the preferred choice.
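The reordering idea can be sketched in a few lines — a hypothetical simplification of the script referenced above, with invented alias lists and usage counts for illustration:

```python
# Sketch: reorder a magic word's aliases so the most-used one comes
# first. Alias lists and usage counts below are invented examples;
# the real numbers come from dump mining.

def reorder_aliases(aliases, usage_counts):
    """Sort aliases by descending usage; ties keep the original order."""
    return sorted(aliases,
                  key=lambda a: (-usage_counts.get(a, 0), aliases.index(a)))

# German 'img_thumbnail' aliases with made-up counts:
aliases = ['thumb', 'thumbnail', 'mini', 'miniatur']
usage = {'mini': 12000, 'thumb': 3000, 'miniatur': 800}
print(reorder_aliases(aliases, usage))
# ['mini', 'thumb', 'miniatur', 'thumbnail']
```

With such an ordering merged into the messages files, a serializer that picks the first alias would emit the localized form.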


[1] https://phabricator.wikimedia.org/T53852
[2] https://gerrit.wikimedia.org/r/#/c/244254/3/lib/wts.LinkHandler.js
[3] https://gerrit.wikimedia.org/r/#/c/247914
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] VisualEditor roadmap - extensibility within MediaWiki?

2016-01-22 Thread Eran Rosenthal
Good question (with no good answer), and I think it points to the weakest
part of VE documentation - it doesn't have enough code examples or
high-level documentation.
(the best answer you can get may be
https://doc.wikimedia.org/VisualEditor/master/#!/api/mw.libs.ve-method-addPlugin
)


My advice is to AVOID relying on this documentation alone - it is mostly
micro-level documentation, and it is not enough.
You should prefer to grep the codebase for some working examples and
adapt them.
Here is the Graph extension as an example:
http://git.wikimedia.org/tree/mediawiki%2Fextensions%2FGraph.git/master/modules%2Fve-graph

and an old (possibly outdated) but good intro and high-level documentation
for VE:
https://www.mediawiki.org/wiki/VisualEditor/API






On Thu, Jan 21, 2016 at 10:49 PM, Daniel Barrett  wrote:

> Thank you! Which class on this page is the best starting point for
> learning to write a plugin?
>
> https://doc.wikimedia.org/VisualEditor/master/
>
> DanB
>
>
> 
> From: Wikitech-l [mailto:wikitech-l-boun...@lists.wikimedia.org] On
> Behalf Of Trevor Parscal
> Sent: Thursday, January 21, 2016 11:28 AM
> To: Wikimedia developers
> Subject: Re: [Wikitech-l] VisualEditor roadmap - extensibility within
> MediaWiki?
>
> VisualEditor is very extendable by design. You can do pretty much anything
> you want with a plugin, and we've demonstrated this with many existing
> plugins that provide all sorts of interesting features.
>
> The APIs for adding features to VisualEditor, while perhaps not as well
> documented as we'd like them to be, have existed for years and are now
> quite stable.
>
> We have seen extensions such as math, graph and score be integrated into
> VisualEditor by developers who are relatively new to the code base.
> However, direct communication with the team was still important to those
> efforts.
>
> The documentation that does exist is generated from code comments, and the
> VisualEditor code base is particularly well documented. There was
> a supplemental documentation effort for OOjs UI this time last year, and I
> think that worked out pretty well. This may be something we can do in the
> next six months, but there are not yet any concrete plans to do so.
>
> Ed Sanders is a good person to be in touch with, along with others on the
> VisualEditor team, who are easily reached on IRC. See the MediaWiki page on
> VisualEditor for details.
>
> - Trevor
>
> On Thursday, January 21, 2016, Daniel Barrett  wrote:
>
> > I was looking through the VisualEditor roadmap (
> > https://www.mediawiki.org/wiki/VisualEditor/Roadmap) and did not notice
> > anything about third-party MediaWiki extensions for the editor. Did I
> > miss it?
> >
> > I do see plans for "non-Mediawiki" extensions (under "Release for
> > third-party non-MediaWiki users"), and also for Mediawiki admins to
> > "easily install and use VisualEditor" (under "Release for third-party
> > MediaWiki users"), but nothing about extending it within MediaWiki. For
> > example, adding a button or menu item to insert a particular parser tag.
> >
> > Is this by design?
> >
> > I did notice "Non-template transclusions" on the roadmap, which looks
> like
> > a way to insert parser tags & parser functions if you already know their
> > name (the way template transclusions work right now). That will be a big
> > help. However, for (say) inserting a given parser tag, it would be great
> if
> > we could easily add a button or menu item for it.
> >
> > Thank you very much for any info.
> > DanB
> >
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org 
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Cross-wiki search API?

2015-09-11 Thread Eran Rosenthal
See also related bug:
https://phabricator.wikimedia.org/T71489

On Fri, Sep 11, 2015 at 12:45 PM, Magnus Manske  wrote:

> I seem to remember that all Wikimedia wikis now share a single search
> index, and per-wiki searches are filtered through a tag for the respective
> wiki.
>
> If that is indeed the case, is there an API to search all wikis, but
> omitting that tag?
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Min php version

2015-07-22 Thread Eran Rosenthal
Side note: how come PHP 5.3.3 support was broken accidentally? Doesn't the
Jenkins script validate compatibility with the minimum PHP version? :)

On Wed, Jul 22, 2015 at 9:36 PM, Krinkle krinklem...@gmail.com wrote:


  On 20 Jul 2015, at 22:42, Legoktm legoktm.wikipe...@gmail.com wrote:
 
  OTOH, if we never bump our version requirements, there's less incentive
  for hosting providers to upgrade their PHP. [1] has some interesting
  arguments regarding this.
 
  [1] http://blog.ircmaxell.com/2014/12/on-php-version-requirements.html
 
 

 Indeed. Providers that don't already provide newer PHP options will
 certainly start doing so when major software requires it.


  On 21 Jul 2015, at 07:12, bawolff bawolff...@gmail.com wrote:
 
 
 https://wikiapiary.com/w/index.php?title=Special:SearchByPropertylimit=500offset=0property=Has+PHP+Versionvalue=5.3.3
  is also something to keep in mind
 


 Yes, but also keep in mind that many of those wikis likely run in hosting
 environments that already support newer PHP versions. But customers won't
 change their settings until they have to. And providers can't change
 customers proactively without risking site breakage or damaging customer
 relations.

 I had the same with my third-party wikis. Until recently they ran on PHP
 5.3. Then at some point I realised my provider had a simple Select PHP
 Version page in the control panel. I switched them all to PHP 5.6 that day
 and also enabled opcache. Site performance improved greatly.


  On 19 Jul 2015, at 07:15, Bryan Davis bd...@wikimedia.org wrote:
 
  Some WMF production hosts are still on PHP 5.3.10 so as Tim pointed
  out last spring [0] we shouldn't drop 5.3 support until after the
  entirety of the WMF server fleet are all switched over to HHVM or at
  least a newer version of PHP5. [..]
 
  [0]: http://www.gossamer-threads.com/lists/wiki/wikitech/436441#436441
 

 Yeah, in the case of Wikimedia, master is near-immediate production, so
 let's postpone this until right after Wikimedia's migration is complete.

 Third parties can stick to using the LTS or the current stable version
 as needed for up to several years more without issue.

 -- Krinkle




 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] global cleanup of nowiki

2015-06-30 Thread Eran Rosenthal
Recently a little bird told me the main roundtrip quality target was
achieved for Parsoid, with 99.95% of pages roundtripping cleanly.
Given this, I would expect we can use Parsoid to clean up
its own (previous) mess, based on the many bug fixes made over time.
Even if it can't, the process of doing this with Parsoid is a kind of
verification of the bug fixes. (Can we get to a 99.95% clean
roundtrip rate for such cases?)
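The roundtrip figure can be measured mechanically: serialize each parsed page back to wikitext and count how many come back byte-identical. A minimal sketch — the roundtrip() argument below is a stand-in for a real wikitext → DOM → wikitext pass through Parsoid:

```python
# Sketch: clean-roundtrip percentage over a set of pages. The
# roundtrip callable stands in for running Parsoid itself.

def clean_roundtrip_pct(pages, roundtrip):
    """Percentage of pages whose wikitext survives the roundtrip unchanged."""
    if not pages:
        return 100.0
    clean = sum(1 for text in pages if roundtrip(text) == text)
    return 100.0 * clean / len(pages)

# Toy stand-in: pretend the roundtrip drops a spurious empty <nowiki/>.
pages = ["plain text", "a <nowiki/>b", "[[Link]]s"]
pct = clean_roundtrip_pct(pages, lambda t: t.replace("<nowiki/>", ""))
print(round(pct, 1))  # 2 of 3 pages unchanged -> 66.7
```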

Sometimes Parsoid doesn't have to deal with its own mess, but in this
case maybe it is a good idea to attach a maintenance script to a bug fix
to repair previous issues
(similar to the requirement of attaching a unit test), rather than writing
bots that work on specific wikis: the problems arise on many wikis,
and it takes other devs' time to understand the bug and come up with
their own magic regex to fix the issue, which may not be fully compatible
with the fix.





On Tue, Jun 30, 2015 at 11:55 PM, Nicolas Vervelle nverve...@gmail.com
wrote:

 On Tue, Jun 30, 2015 at 10:31 PM, C. Scott Ananian canan...@wikimedia.org
 
 wrote:

  On Mon, Jun 22, 2015 at 11:14 AM, Nicolas Vervelle nverve...@gmail.com
  wrote:
 
  - Second, I'm not a big fan of VE changing wikitext in parts not
  modified by the user: experience shows that it messes the diffs, and
   makes
  watching what VE is doing a lot more difficult. It has been
 requested
  several times that VE doesn't start modifying wikitext in places not
  modified by the user.
  
 
  In case it wasn't clear, this is already the case.  Parsoid/VE uses
  selective serialization to avoid touching unmodified content.  This
  feature has been present since the beginning.
 

 Yes, I'm aware of that, but I was answering this because it was suggested
 previously in the discussion to use VE to do the cleanup...

 Nico
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Implementing features for the sake of implementing features

2015-06-17 Thread Eran Rosenthal
Sometimes programmers waste their time implementing
non-useful/non-important features, but sometimes such features aren't just
not useful, but harmful.[1; you are more than welcome to convince me
otherwise]

Developers, designers, and project managers should always ask
themselves why we need a feature before going off to implement it, and
they should be able to convince others of the motivation for it.

I think that sometimes we fail… (and sometimes even in the postmortem
step[1]). The question is: how can the Wikimedia Foundation improve the
process?

For large changes there are already reviews and RFCs on mediawiki.org, but
for smaller ones this step is sometimes missed (and sometimes even a
Phabricator ticket is). Should every feature be associated with a
Phabricator task?[2]


Eranroz

-

[1] https://phabricator.wikimedia.org/T100691 /
https://gerrit.wikimedia.org/r/#/c/95723/ defeature __NOEDITSECTION__ from VE

[2]
https://www.mediawiki.org/wiki/VisualEditor/2015_Process_Review#Recommendation:_Maintain_Goals.2C_Epics.2C_and_Tasks_in_alignment_in_Phabricator
.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Attracting new talent to our projects

2014-12-31 Thread Eran Rosenthal
Just to mention (Yuvi Panda idea):
http://www.gossamer-threads.com/lists/wiki/wikitech/348122

On Thu, Jan 1, 2015 at 2:25 AM, Jeremy Baron jer...@tuxmachine.com wrote:

 On Dec 31, 2014 7:22 PM, Gabriel Wicke gwi...@wikimedia.org wrote:
  Perhaps some fun HTTP headers
  
 http://royal.pingdom.com/2012/08/15/fun-and-unusual-http-response-headers/
 

 See also https://phabricator.wikimedia.org/T70982

 -Jeremy
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Our CAPTCHA is very unfriendly

2014-11-08 Thread Eran Rosenthal
+1 for disabling CAPTCHAs (at least in the signup form).

Anyway, sysops already have enough tools for treating abuse by spam bots
using AbuseFilter (e.g. with a rate filter).


On Sat, Nov 8, 2014 at 12:19 AM, MZMcBride z...@mzmcbride.com wrote:

 Tim Starling wrote:
 On 07/11/14 19:17, svetlana wrote:
  I would suggest to ask on a village pump and alter the configuration
 per local consencus.
 
 We tried that before and the answer was OMG no, even though nobody
 bothered to look at the logs. It turns out that the captcha we were
 using was broken from the outset -- trivially solvable with OCR -- but
 apparently that didn't affect anyone's opinion of whether or not it
 should be enabled.
 
 According to reports from non-WMF users of ConfirmEdit, FancyCaptcha
 has little effect on spam rate. See the table here for a summary:
 
 https://www.mediawiki.org/wiki/Extension:ConfirmEdit
 
 Anyway, sure, let's ask on some more village pumps.

 I think that's unfair.

 Wikis have a serious spam problem. People associate CAPTCHAs with spam
 prevention. On the English Wikipedia, one of the actions that results in
 the user being required to successfully enter a CAPTCHA is adding an(y?)
 external link to a page as a newly registered user. This, of course, in
 addition to the CAPTCHA presented when registering an account (consider
 that many new account creations only come about as the result of the
 requirement for an account to make a new page on the English Wikipedia).

 Why not disable the extension for a week and see what happens? If you're
 wrong and there's a marked increase in wiki spam (account creations and
 edits), then you can help devise a better solution than a CAPTCHA.
 CAPTCHAs are clearly not a sustainable fix to the underlying problems they
 were implemented to address, or so I think you're saying. If you're right
 and the CAPTCHA is simply ineffective and unneeded, we've eliminated some
 code from production and we can move on. In other words: what exactly is
 stopping you from disabling the extension?

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] what should i publish a gadget extension?

2013-12-15 Thread Eran Rosenthal
There is no conventional process for gadget publishing, but you may
propose it to the English Wikipedia at:
https://en.wikipedia.org/wiki/Wikipedia:Gadget/proposals
It is a community decision whether to adopt it or not (please read
https://en.wikipedia.org/wiki/Wikipedia:Gadget first).
Other such pages exist for various projects/languages, and there are
different policies.

Another option*, if the target audience isn't Wikimedia projects but any
MediaWiki installation, is
https://www.mediawiki.org/wiki/Extension:Gadgets/Scripts
(*Disclaimer: I don't know if anyone actually uses this list or reviews it...)




On Sun, Dec 15, 2013 at 11:42 AM, Sen kik...@gmail.com wrote:

 I made a gadget called hotkeyedit, which adds some hotkeys to the wiki
 textbox: Ctrl+1-6 make headings, Ctrl+S switch the lish.. Ctrl+[
 adds a category. I think if I publish it, maybe some others will like
 it. I know how to publish an extension, but how does it work for a gadget?


 Regards
 Sen

 From SLboat(http://see.sl088.com)


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Re-evaluating MediaWiki ResourceLoader's debug mode

2013-12-12 Thread Eran Rosenthal
Another point to look into if someone wants to refactor RL debug=true:
when loading VE with debug=true, hundreds of modules are loaded (in
separate requests) and it takes a long time to load.
It would be nice to have an option of debug=0.5 :)
in which all the resources (or at least the VE resources) would be loaded
in one request,
but without minified code and with all the comments included.






On Wed, Dec 11, 2013 at 8:46 PM, Jon Robson jdlrob...@gmail.com wrote:

 +1000 to what Max says. It really is kinda obvious to anyone who needs to
 know how to get into debug mode and if not there are wiki pages and if not
 it's easy enough to find out if you care enough.

 That said debug mode could definitely be improved and I'm glad you brought
 this topic up Max. A few things of the top of my head I'd like to see:

 * RTL working in debug mode
 * The code in ResourceLoader really could do with a good refactor - there
 are far too many different code paths and it would be good if we could
 simplify this to get them as close as possible. When I've worked with RL in
 the past to design my own modules which involves files I've had a lot of
 headaches trying to get things to work in both debug mode and non-debug
 mode (JavaScript templates [1] being one concrete example) - and even then
 the result wasn't quite was I was hoping for in that debug mode uses
 load.php urls to inject JavaScript before the file that needs it.

 [1]

 https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FMobileFrontend.git/1f3c57137afae1d0f8ac602b62dccc741893d670/includes%2Fmodules%2FMFResourceLoaderModule.php
 On 11 Dec 2013 08:33, Max Semenik maxsem.w...@gmail.com wrote:

  On 11.12.2013, 19:36 Brian wrote:
 
  
   As everybody else already said, less bandwidth is a good thing for
 most
   people, obfuscation is OK when the source is available elsewhere, and
   debug=true is not hard for developers to find.
  
 
   I'd actually disagree with the assertion that debug=true is easy to
   find, particularly for people who aren't active developers. Some
   random on the internet who just wants to see what our js looks like
   (out of curiosity or whatever) is not going to be able to find
   debug=true.
 
  If they look at the URL it will be pretty obvious because all of them
  have debug=false as first parameter.
 
  --
  Best regards,
Max Semenik ([[User:MaxSem]])
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Inspecting page performance with mw.loader.inspect()

2013-10-10 Thread Eran Rosenthal
Nice feature, thanks!

1. I tried to use it in ?debug=1 mode, and it seems to give a size of 0 for
many modules.
2. It would be nice if it also gave details about dependent modules
(inclusive size vs. exclusive size).
For example, when using VE with ?debug=1 and inspecting the net panel, it
looks like a DoS attack with hundreds of requests,
and having both the inclusive and exclusive size would allow developers to
understand the net effect of loading a module.
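The inclusive/exclusive split could be computed from the module dependency graph. A sketch, assuming each module reports its own (exclusive) byte size and its direct dependencies — the module names and sizes here are invented, not real ResourceLoader data:

```python
# Sketch: inclusive size = a module's own size plus the sizes of all
# of its transitive dependencies, each counted once. The registry
# below is a hypothetical example.

def inclusive_size(module, sizes, deps, seen=None):
    """Own size plus transitive dependency sizes, counting each module once."""
    if seen is None:
        seen = set()
    if module in seen:
        return 0
    seen.add(module)
    return sizes[module] + sum(
        inclusive_size(dep, sizes, deps, seen) for dep in deps.get(module, []))

sizes = {'ext.visualEditor.core': 500_000, 'oojs-ui': 300_000, 'oojs': 50_000}
deps = {'ext.visualEditor.core': ['oojs-ui'], 'oojs-ui': ['oojs']}

print(inclusive_size('ext.visualEditor.core', sizes, deps))  # 850000
```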

Eranroz



On Fri, Oct 11, 2013 at 3:19 AM, Ori Livneh o...@wikimedia.org wrote:

 If you know how to use your browser's JavaScript console, you can now
 get an ordered list of all ResourceLoader modules that are loaded on the
 page, sorted by the total size of each module's JavaScript and CSS assets
 -- simply run mw.loader.inspect();.

 It works best in newer versions of Chromium / Chrome, where it takes
 advantage of the availability of console.table(). It looks like this:
 http://i.imgur.com/zGymrsF.png

 In the course of testing this feature yesterday, Matt Flaschen spotted and
 fixed redundant module loading in TimedMediaHandler (see bug 0). That's
 pretty cool, no?

 Do remember that size != performance, though -- just because a module is
 small does not mean that it is performant. (The reverse is also true.) This
 tool also does not account for factors like gzip compression. So no burning
 developers at the stake, please :) But curious poking is definitely
 encouraged.

 Thanks to Timo  Roan for helping this along.

 ---
 Ori Livneh
 o...@wikimedia.org
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Add tags when saving an edit ?

2013-08-16 Thread Eran Rosenthal
You can define a convention for your edit summaries, and ask sysops to tag
them with AbuseFilter.
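As a sketch of what such a convention might look like, a filter could simply match a fixed summary prefix. The prefix and matching logic below are invented for illustration (shown in Python rather than AbuseFilter's own rule syntax):

```python
import re

# Hypothetical convention: edits that should be tagged use a summary
# starting with "[[WP:MYBOT|MyBot]]: " — the prefix is made up here.
SUMMARY_CONVENTION = re.compile(r"^\[\[WP:MYBOT\|MyBot\]\]: ")

def should_tag(summary):
    """True when the edit summary follows the agreed convention."""
    return bool(SUMMARY_CONVENTION.match(summary))

print(should_tag("[[WP:MYBOT|MyBot]]: fixing dates"))  # True
print(should_tag("fixing dates"))                      # False
```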


On Fri, Aug 16, 2013 at 9:30 PM, Bartosz Dziewoński matma@gmail.com wrote:

 No, but there's a bug about this[1] and there was a now-abandoned patch[2].

 [1] https://bugzilla.wikimedia.org/show_bug.cgi?id=18670
 [2] https://gerrit.wikimedia.org/r/#/c/64650/

 --
 Matma Rex


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] VE/Migration guide for gadgets developers

2013-07-22 Thread Eran Rosenthal
Hi,
When ResourceLoader was deployed to production (or even before), there
were migration and development guides for gadget/extension developers:

   - http://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
   - http://www.mediawiki.org/wiki/ResourceLoader/Developing_with_ResourceLoader

Such guides allowed easier adoption of ResourceLoader. We need something
similar for the visual editor:

   - Migration - what are the recommended steps to make a gadget/extension
   VE-adapted? [with answers to questions such as: how to get the underlying
   model - instead of $('#wpTextbox1').val() - what this model is, and what
   modifications to the underlying model are supported/to be avoided by
   gadgets/user scripts]
   - Development with VE: guides explaining common editor UI
   customization, and the recommended API for it (for example: adding custom
   toolbar buttons).

Eran
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Disambiguator extension deployed to all WMF wikis (action required)

2013-07-10 Thread Eran Rosenthal
Nice extension :)

You may use generator to enjoy this new property. For example to check
whether there is a disambig link from SOME_TITLE
en.wikipedia.org/w/api.php?action=querygenerator=linkstitles=SOME_TITLEprop=pagepropsppprop=disambiguationgpllimit=500

(without this extension it was possible only with external tools such as
dablinks: http://toolserver.org/~dispenser/view/Dablinks)
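For each link target, the query returns a pageprops object containing a disambiguation key when the target is a disambiguation page. A sketch of picking those out — the JSON below is a hand-abbreviated example response, not real API output for any particular page:

```python
# Sketch: extracting disambiguation link targets from an
# action=query & generator=links & prop=pageprops & ppprop=disambiguation
# response. The sample response is hand-written for illustration.

def disambig_links(api_response):
    """Titles of linked pages carrying the 'disambiguation' page prop."""
    pages = api_response.get('query', {}).get('pages', {})
    return sorted(page['title'] for page in pages.values()
                  if 'disambiguation' in page.get('pageprops', {}))

response = {'query': {'pages': {
    '123': {'title': 'Mercury', 'pageprops': {'disambiguation': ''}},
    '456': {'title': 'Venus (planet)', 'pageprops': {}},
    '789': {'title': 'Earth'},  # no pageprops key at all
}}}
print(disambig_links(response))  # ['Mercury']
```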



On Wed, Jul 10, 2013 at 9:00 AM, MZMcBride z...@mzmcbride.com wrote:

 Nicolas Vervelle wrote:
 Has the API been modified so that we can ask it if a page is a
 disambiguation page ?

 Looks like it.

 Starting point:
 https://en.wikipedia.org/w/api.php

 List of available property names:
 https://en.wikipedia.org/w/api.php?action=query&list=pagepropnames&ppnlimit=100

 Look up properties of a particular title:
 https://en.wikipedia.org/w/api.php?action=query&prop=pageprops&titles=Madonna
 <pageprops disambiguation="" wikibase_item="q1564372" />

 https://en.wikipedia.org/wiki/Special:PagesWithProp can look up pages by
 property name. I'm not sure if there's an equivalent module for the Web
 API yet.

 The Web API has disambiguation-related query pages as well (including
 Special:DisambiguationPageLinks, which I'm only now learning exists).

 MZMcBride



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Search documentation

2013-06-17 Thread Eran Rosenthal
I think that from the user's perspective, such help pages are not very
useful, since most users don't read help pages.
Some features (the important ones) should be hinted at in the search form
itself.
On hewiki we modified [[MediaWiki:Search-summary]][1] to include a
small expandable table with hints for features (e.g. intitle/incategory).
Eran

[1]
http://he.wikipedia.org/wiki/%D7%9E%D7%93%D7%99%D7%94_%D7%95%D7%99%D7%A7%D7%99:Search-summary?uselang=en



On Tue, Jun 18, 2013 at 3:40 AM, Nikolas Everett never...@wikimedia.org wrote:

 One of our goals while building this has been to make something reasonably
 easy to install by folks outside of WMF.  I've added some notes about this
 to the page.  I'd certainly love to hear ways that'd make it simpler to
 use.

 Nik


 On Mon, Jun 17, 2013 at 8:23 PM, Brian Wolff bawo...@gmail.com wrote:

  Just as a note, MediaWiki default (aka crappy) search is very
  different from the lucene stuff used by Wikimedia. Lucene search is
  rather difficult to set up, so most third party wikis do not use it.
 
  --bawolff
 
 
  On 6/17/13, Nikolas Everett never...@wikimedia.org wrote:
   I'm not sure about http://www.mediawiki.org/wiki/Help:Searching but
   https://en.wikipedia.org/wiki/Help:Searching has lots of things we're
  going
   to have to add to our list.  My guess is
   http://www.mediawiki.org/wiki/Help:Searching is simply out of date.
  
   Nik
  
  
   On Mon, Jun 17, 2013 at 4:33 PM, Chris McMahon
   cmcma...@wikimedia.orgwrote:
  
   On Mon, Jun 17, 2013 at 1:28 PM, S Page sp...@wikimedia.org wrote:
  

* enwiki says Hello dolly in quotes gives different results, mw
   directly
contradicts this. Even on my local wiki, quotes make a difference.
   
* enwiki disagrees with itself what a dash in front of a word does.
   
  
   I did some research a few weeks ago on the current state of Search and
   there are a number of discrepancies between the documentation and
 actual
   behavior.  Some of them have BZ tickets, like
   https://bugzilla.wikimedia.org/show_bug.cgi?id=44238
   -Chris
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] State of MediaWiki's render action (parameter to index.php)

2013-05-20 Thread Eran Rosenthal
To include a page within an iframe, sometimes it is reasonable to use
?printable=yes.
For example, there is a gadget on hewiki that shows the Wikidata entry in an
iframe (within a dialog) with ?printable=yes to allow users to edit
properties.
(The benefit of printable=yes is that the JS is loaded, but there is no
sidebar - which is nice for this purpose.)


On Mon, May 20, 2013 at 11:17 PM, Daniel Friesen dan...@nadir-seen-fire.com
 wrote:

 On Mon, 20 May 2013 12:55:24 -0700, Thomas Gries m...@tgries.de wrote:

  Am 20.05.2013 18:11, schrieb Tyler Romeo:

 I'm confused as to what the point of action=render is. How is it
 different
 from using the API?


 I do use it (action=render) to render the content div of a mediawiki
 A in a different web page B,
 in the context of the authenticated user, in an iframe.

 The different web page B has no additional logic; in particular, I cannot
 add javascript or jquery, so action=render appears to be an easy
 solution for my problem.
 Unfortunately, and this is a drawback, action=render does not currently
 apply the mediawiki A stylesheets.


 action=render should NOT be used for this purpose.
 action=render does not return a valid html page. It's meant to be read by
 something that does additional processing (which could use the api instead)
 and embedded inside another html document.

 I also don't see how your technique will logically work at all, since the
 moment the user clicks a link they lose action=render.
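For comparison, the two URL shapes under discussion look roughly like this
on a stock MediaWiki install (a sketch; paths assume the default script
layout):

```javascript
// A page's rendered HTML fragment (what action=render returns):
function renderUrl(baseUrl, title) {
  return baseUrl + '/index.php?title=' + encodeURIComponent(title) +
      '&action=render';
}

// The API equivalent: action=parse returns the same rendered HTML,
// wrapped in JSON, for a consumer that does further processing.
function apiParseUrl(baseUrl, title) {
  return baseUrl + '/api.php?action=parse&page=' +
      encodeURIComponent(title) + '&prop=text&format=json';
}
```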


 --
 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Global watchlist and watchlist wishlist

2013-05-06 Thread Eran Rosenthal

 What happened to last years gsoc project in this area? Are there people
 working on getting it merged?

Patchsets for the watchlist have been waiting for review for too long.
The GSOC project: https://gerrit.wikimedia.org/r/#/c/16419/
other patchsets waiting for review:
https://gerrit.wikimedia.org/r/#/c/38743/
https://gerrit.wikimedia.org/r/#/c/53964/ +
https://gerrit.wikimedia.org/r/#/c/53968/

Is there anyone who is responsible for reviewing patches for watchlist?




On Mon, May 6, 2013 at 9:11 PM, Brian Wolff bawo...@gmail.com wrote:

 On 2013-05-06 1:31 PM, Quim Gil q...@wikimedia.org wrote:
 
  On 05/06/2013 05:41 AM, Guillaume Paumier wrote:
 
  A few people have started to organize the various bug reports about
  watchlists, but there is still much to do before we have a clear 
  prioritized vision of what the watchlist feature should become.
 
 
  fwiw I had already suggested a Bug Day focusing on the Watchlist feature.
 If this is considered useful Andre could schedule it whenever appropriate.
 
 
  Therefore, if a few developers could declare their interest in
  tackling the watchlist issue in the foreseeable future, it would help
  arouse interest and enthusiasm from users, and motivate them to
  organize user research in order to design a better watchlist feature.
 
  I don't think we need a formal pledge or commitment; a simple
  declaration of interest would imho be enough to get started. The
  specifics can be ironed out later.
 
 
  Sounds like an entry to
 http://www.mediawiki.org/wiki/Mentorship_programs/Possible_projects#Raw_projects
 might help as soon as there is a broad idea of what needs to be done.

 What happened to last years gsoc project in this area? Are there people
 working on getting it merged?

 -bawolff

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Socializing changes

2013-04-06 Thread Eran Rosenthal
In hewiki we had a discussion in the village pump before the phase II
deployment, and there is a simple bureaucratic policy for converting
templates to use the new {{#property}} feature:
a discussion in Wikipedia:Village pump/templates for each template
conversion, before actually adding {{#property}} to it.
(This village pump for templates already existed, to eliminate
unnecessary extra parameters being added without discussion.)
This way we can check that each conversion is technically ok, uses the
correct properties from wikidata, and doesn't add unnecessary parameters.

We don't use the new features yet - in real-world cases it isn't
just {{#property}}, and to enjoy the powerful features of wikidata we must
use the WikibaseClient Lua api -
but we are in the final phase of testing {{Taxobox}} (with no parameters at
all. Really cool!)

I would like to thank Lydia and the wikidata team - you are doing a great
job.



On Sat, Apr 6, 2013 at 6:40 PM, legoktm legoktm.wikipe...@gmail.com wrote:

 On Sat, Apr 6, 2013 at 10:28 AM, MZMcBride z...@mzmcbride.com wrote:

  Risker wrote:
  Lydia, could you please point me to the discussion on *English
 Wikipedia*
  where the community indicated an interest in deploying this software?
  Infoboxes and sourcing to another website completely outside the control
  of English Wikipedia is a rather big issue, and I would expect to see a
  Request for Comment with at least 200-300 participants.
 
  I think the issue we're seeing here is that changes, particularly large
  changes, often aren't socialized well.
 
  It probably doesn't help to target the English Wikipedia first, of
 course,
  given that it's often annoyingly exceptional. Wikidata seems like a large
  enough change that I agree that a bit more socialization might be nice.
  There are over 700 wikis on which to possibly deploy Wikidata, in theory.
 

 Wikidata phase 2 is already live on 11 different wikis. (
 http://blog.wikimedia.de/2013/03/27/you-can-have-all-the-data/)

 
  MZMcBride
 
 
  -- Legoktm
 http://enwp.org/User:Legoktm


 

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code review statistics and trends

2012-08-23 Thread Eran Rosenthal
A real waste of time for a reviewer is checking for code conventions,
which can be done automatically.

Some static code analysis tools that automatically check conventions
(spaces, missing messages['qqq'], ...)
would help reduce the effort needed for review, allowing the human code
reviewer to fully concentrate on real issues,
and the committer to get an almost immediate response on these small
annoying convention issues.
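As an illustration of the kind of convention check that could run
automatically, here is a sketch (hypothetical, not an existing Jenkins
plugin) that flags i18n messages lacking 'qqq' documentation, given the
messages as a plain object keyed by language code:

```javascript
// Report message keys present in 'en' but missing from 'qqq'
// (i.e. messages that have no documentation for translators).
function missingQqq(messages) {
  var en = messages.en || {};
  var qqq = messages.qqq || {};
  var missing = [];
  for (var key in en) {
    if (!Object.prototype.hasOwnProperty.call(qqq, key)) {
      missing.push(key);
    }
  }
  return missing;
}
```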

Is there a code analysis plugin that can be setup with jenkins?

Eran Roz

On Thu, Aug 23, 2012 at 11:44 AM, Niklas Laxström niklas.laxst...@gmail.com
 wrote:

 On 23 August 2012 04:37, Rob Lanphier ro...@wikimedia.org wrote:

  How is the process working for everyone?  Is stuff getting reviewed
  quickly enough?  How has it been for the reviewers out there?

 I'm doing a lot of code review[1], but I feel like it's wasting my
 time by being inefficient. I'm still looking forward to diffs in
 emails and/or one page diff view in Gerrit. Especially re-reviewing
 big patches is just plain annoying when you have to go over all files
 just to find the changes.

 My dashboard is mostly useless, it's full of -1 or +1 patches waiting
 for an action by the submitter. Or +0 patches I gave +1 or -1 before
 new patchset was submitted - they are indistinguishable from totally
 new patches. It would also be nice to have different queues for things
 like GSoC, Translate, sprint related, i18n review requested.

 It has also happened that my inline comments have been ignored because
 a new patchset was submitted without addressing them, and subsequent
 reviewers didn't notice the comments anymore.

 My own patches have been reviewed quickly, but that's because sprint
 related patches are reviewed by other team members and non-sprint
 related patches usually by Siebrand. There have been some exception
 for patches that need review for someone outside of our team.

 [1] Since there are no statistics for this, I have no idea whether I'm
 doing more or less than average, but I'm definitely spending a lot of
 time on it. It's mainly the positive feedback I get that makes it
 rewarding.

   -Niklas

 --
 Niklas Laxström


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l