[Wikitech-l] Re: Feature idea: data structure to improve translation capabilities

2021-07-05 Thread petr . kadlec
Hi,

On Sun, Jul 4, 2021 at 2:49 AM Wolter HV  wrote:

> Thanks, this is interesting too, though this project doesn't seem to
> decouple meanings from words, so automatic translations don't work with it
> (as far as I could see from my short snoop-around.)
>

I’m not sure what you mean by “decouple meanings from words”. Sure,
lexemes themselves do not have DefinedMeaning entries like OmegaWiki does,
but note this is a part of Wikidata, and lexeme senses are linked with
main-namespace Wikidata items using the “item for this sense” (P5137)
property. See e.g. https://www.wikidata.org/wiki/Lexeme:L10984 linking
the “point
of entry to an enclosed space” sense to https://www.wikidata.org/wiki/Q53060
and also to the corresponding lexemes in other language(s), in this case,
https://www.wikidata.org/wiki/Lexeme:L406305#S1
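Those P5137 links make the sense–item mapping machine-traversable. A minimal
sketch of following them (in Python; the JSON shape mirrors what
wbgetentities returns for a lexeme, but the sample record below is
hand-trimmed for illustration, not fetched live):

```python
# Extract the "item for this sense" (P5137) links from a lexeme's senses.
# The structure mirrors wbgetentities JSON for a lexeme, reduced to the
# fields needed here; the values are illustrative.
sample_lexeme = {
    "id": "L10984",
    "senses": [
        {
            "id": "L10984-S1",
            "glosses": {"en": {"language": "en",
                               "value": "point of entry to an enclosed space"}},
            "claims": {
                "P5137": [
                    {"mainsnak": {"datavalue": {"value": {"id": "Q53060"}}}}
                ]
            },
        }
    ],
}

def items_for_senses(lexeme):
    """Map each sense id to the Q-ids of its P5137 claims."""
    result = {}
    for sense in lexeme.get("senses", []):
        claims = sense.get("claims", {}).get("P5137", [])
        result[sense["id"]] = [
            c["mainsnak"]["datavalue"]["value"]["id"] for c in claims
        ]
    return result

print(items_for_senses(sample_lexeme))  # {'L10984-S1': ['Q53060']}
```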

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]
___
Wikitech-l mailing list -- wikitech-l@lists.wikimedia.org
To unsubscribe send an email to wikitech-l-le...@lists.wikimedia.org
https://lists.wikimedia.org/postorius/lists/wikitech-l.lists.wikimedia.org/

[Wikitech-l] Re: Feature idea: data structure to improve translation capabilities

2021-06-23 Thread petr . kadlec
Hi!

On Wed, Jun 23, 2021 at 11:05 AM Wolter HV  wrote:

> [2021-06-20 17:43 +0100] John:
> > Off hand isn’t this something that wikidata was setup to handle?
>
> I'm not sure, but I don't see the functionality currently being there in
> Wiktionary.  Is this something currently under development?
>

Yeah, for something like fifteen years, I guess… :-) See e.g. OmegaWiki
(formerly known as WiktionaryZ).

The modern incarnation of a machine-readable dictionary is the
Lexicographical Data project on Wikidata. It is a nice project, definitely
go take a look at it, but it is not really an evolution/improvement of
Wiktionary but rather a fresh start. (Among other reasons because of the
license incompatibility of Wiktionary’s CC-BY-SA with Wikidata’s CC-0.) See
https://www.wikidata.org/wiki/Wikidata:Lexicographical_data

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] How are value node IRIs generated?

2020-12-09 Thread petr . kadlec
On Wed, Dec 9, 2020 at 6:17 PM Baskauf, Steven James <
steve.bask...@vanderbilt.edu> wrote:

> When performing a SPARQL query at the WD Query Service (example:
> https://w.wiki/ptp), these value nodes are identified by an IRI such as
> wdv: 742521f02b14bf1a6cbf7d4bc599eb77 (
> http://www.wikidata.org/value/742521f02b14bf1a6cbf7d4bc599eb77). The
> local name part of this IRI seems to be a hash of something. However, when
> I compare the hash values from the snak JSON returned from the API for the
> same value node (see
> https://gist.github.com/baskaufs/8c86bc5ceaae19e31fde88a2880cf0e9 for the
> example), the hash associated with the value node
> (35976d7cb070b06a2dec1482aaca2982df3fedd4 in this case) does not have any
> relationship to the local name part of the IRI for that value node.
>

https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Value_representation
states:
> Full values are represented as nodes having prefix wdv: and the local
name being the hash of the value contents (e.g.
wdv:382603eaa501e15688076291fc47ae54). There is no guarantee of the value
of the hash except for the fact that different values will be represented
by different hashes, and same value mentioned in different places will have
the same hash.
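In practice that guarantee is all client code should rely on: treat the
wdv: local name as an opaque key, e.g. for deduplicating full values across
statements. A small sketch (Python; the statement/hash pairs are made up
for illustration):

```python
from collections import defaultdict

# Statements pointing at full-value nodes; the wdv: hashes are opaque
# identifiers -- equal hash means equal value, nothing more.
statements = [
    ("Q1:P569", "wdv:742521f02b14bf1a6cbf7d4bc599eb77"),
    ("Q2:P570", "wdv:382603eaa501e15688076291fc47ae54"),
    ("Q3:P569", "wdv:742521f02b14bf1a6cbf7d4bc599eb77"),  # same value as Q1:P569
]

by_value = defaultdict(list)
for statement, value_node in statements:
    by_value[value_node].append(statement)

# Statements sharing a hash share the same underlying value.
shared = {v: s for v, s in by_value.items() if len(s) > 1}
print(shared)  # {'wdv:742521f02b14bf1a6cbf7d4bc599eb77': ['Q1:P569', 'Q3:P569']}
```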

But hopefully somebody will know more than me.

-- [[ cs:User:Mormegil | Petr Kadlec ]]
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] List of all articles about cities?

2020-08-12 Thread petr . kadlec
On Wed, Aug 12, 2020 at 9:18 PM Thomas Güttler Lists <
guettl...@thomas-guettler.de> wrote:

> Do you have an idea how I can get a list of articles which are about a
> city?
>

Since this is a huge list anyway, you don’t really want a list of _all_
of them, and there are many options for approaching it. For a start, you can
try https://w.wiki/ZQd as an example.
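The shortened link hides the query text; a query in the same spirit (a
sketch assuming “city” means an instance of Q515 or a subclass, limited to
items with an English Wikipedia article — not necessarily the exact query
behind the link) could be built like this:

```python
# A SPARQL query for the Wikidata Query Service listing items that are
# an instance of (a subclass of) city (Q515) and have an English
# Wikipedia article. A sketch in the spirit of the linked example.
query = """
SELECT ?city ?cityLabel ?article WHERE {
  ?city wdt:P31/wdt:P279* wd:Q515 .     # instance of a (subclass of) city
  ?article schema:about ?city ;
           schema:isPartOf <https://en.wikipedia.org/> .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 100
"""

# The query would be POSTed to https://query.wikidata.org/sparql;
# here we only show its construction.
print(query.strip().splitlines()[0])  # SELECT ?city ?cityLabel ?article WHERE {
```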

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] [Wikitech-ambassadors] Translations on hold until further notice

2018-09-27 Thread petr . kadlec
‪On Thu, Sep 27, 2018 at 1:43 PM ‫יגאל חיטרון‬‎  wrote:‬

> Hi. If so, I think that the next issue of the Tech news should recommend to
> fix the existing translations locally, instead of translatewiki.com.
>

Ermm, isn’t this _exactly_ what we want to avoid (and why the original
message was sent in the first place)? Local translations will not be
updated if necessary, being stuck basically forever once set (and will not
be propagated to other wikis, but that is the lesser problem; once
TranslateWiki starts working again, that will be fixed, while the local
translations won’t).

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Finding namespaces

2016-08-31 Thread Petr Kadlec
Hi,

On Wed, Aug 31, 2016 at 2:10 PM, Bináris <wikipo...@gmail.com> wrote:

> > > Additional question: where can I see available aliases as a user?
> > >
> >
> > https://meta.wikimedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespacealiases
>
> That's useful, but not complete again. Are there somewhere default aliases?
>

https://hu.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces|namespacealiases

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] IPv6 block range size

2015-12-22 Thread Petr Kadlec
On Tue, Dec 22, 2015 at 3:27 PM, Bináris <wikipo...@gmail.com> wrote:

> All right, but what is the actual value for WMF wikis, and where is this
> information available?
>

The configuration of WMF wikis is publicly available at
https://noc.wikimedia.org/conf/ (especially
https://noc.wikimedia.org/conf/CommonSettings.php.txt and
https://noc.wikimedia.org/conf/InitialiseSettings.php.txt) – IIANM, there
is no $wgBlockCIDRLimit override there, therefore it is left at the
MediaWiki default of /16 for IPv4 and /19 for IPv6.
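The limit just caps how wide a range a single block may cover; a sketch of
the check (Python; the defaults are the MediaWiki ones mentioned above, the
function name is mine):

```python
import ipaddress

# MediaWiki's $wgBlockCIDRLimit defaults: blocks may not be wider than
# /16 for IPv4 or /19 for IPv6. A smaller prefix length means a wider range.
BLOCK_CIDR_LIMIT = {4: 16, 6: 19}

def block_range_allowed(cidr: str) -> bool:
    """Return True if the range is no wider than the configured limit."""
    network = ipaddress.ip_network(cidr, strict=False)
    return network.prefixlen >= BLOCK_CIDR_LIMIT[network.version]

print(block_range_allowed("192.0.2.0/16"))   # True  (at the IPv4 limit)
print(block_range_allowed("192.0.2.0/15"))   # False (wider than /16)
print(block_range_allowed("2001:db8::/19"))  # True  (at the IPv6 limit)
print(block_range_allowed("2001:db8::/18"))  # False (wider than /19)
```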

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Difference in stats shown at gom.wikipedia.org and stats.wikimedia.org

2015-10-07 Thread Petr Kadlec
2015-10-07 15:09 GMT+02:00 రహ్మానుద్దీన్ షేక్ <nani1o...@gmail.com>:

> gom.wikipedia.org (Konkani Wikipedia) shows 678 as article count (Using
> magic word {{NUMBEROFARTICLES}}) whereas stats
> <https://stats.wikimedia.org/EN/TablesWikipediaGOM.htm> show 2.3K as
> article count.
> Why is this difference? Though there are articles at gom wikipedia, still
> the count is shown to be less. What should be done to update the magic
> word?
>

…as I have already explained to you in August. <
https://lists.wikimedia.org/pipermail/wikitech-l/2015-August/082777.html>

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Difference in stats shown at gom.wikipedia.org and stats.wikimedia.org

2015-10-07 Thread Petr Kadlec
Hi,

2015-10-07 15:09 GMT+02:00 రహ్మానుద్దీన్ షేక్ <nani1o...@gmail.com>:

> gom.wikipedia.org (Konkani Wikipedia) shows 678 as article count (Using
> magic word {{NUMBEROFARTICLES}}) whereas stats
> <https://stats.wikimedia.org/EN/TablesWikipediaGOM.htm> show 2.3K as
> article count.
> Why is this difference? Though there are articles at gom wikipedia, still
> the count is shown to be less. What should be done to update the magic
> word?
>

gomwiki has 2358 pages in the main namespace which are not redirects. That
is what stats.wikimedia.org shows (I believe). But only 678 of those
articles contain at least one internal link, which is what the
{{NUMBEROFARTICLES}} magic word shows.

See https://www.mediawiki.org/wiki/Manual:Article_count
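A sketch of the difference between the two counts (Python; the page records
are illustrative stand-ins, and the “contains an internal link” test is
simplified to a substring check):

```python
# Countable article under the "link" counting method: main namespace,
# not a redirect, contains at least one internal link. The sample
# records are illustrative, not real gomwiki pages.
pages = [
    {"ns": 0, "redirect": False, "text": "A town in [[Goa]]."},
    {"ns": 0, "redirect": False, "text": "A stub with no links yet."},
    {"ns": 0, "redirect": True,  "text": "#REDIRECT [[Goa]]"},
    {"ns": 1, "redirect": False, "text": "Talk page with [[links]]."},
]

def is_countable(page):
    return page["ns"] == 0 and not page["redirect"] and "[[" in page["text"]

main_ns_non_redirects = [p for p in pages if p["ns"] == 0 and not p["redirect"]]
countable = [p for p in pages if is_countable(p)]

print(len(main_ns_non_redirects))  # 2 -> what stats.wikimedia.org counts
print(len(countable))              # 1 -> what {{NUMBEROFARTICLES}} counts
```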

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] renaming Wikimedia domains

2015-08-26 Thread Petr Kadlec
On Wed, Aug 26, 2015 at 9:53 AM, Antoine Musso hashar+...@free.fr wrote:

> It would surely consume a lot of engineering time to come up with a
> proper migration plan and actually conduct them.  I am not sure it is
> worth the time and money unfortunately.


As noted by Eric Lippert, “If you're going to make a
backward-compatibility-breaking change, no time is better than now; things
will be worse in the future.” [1]

If “we” are not going to invest the time and money _now_, we are probably
_never_ going to do that, so if that is the case, let’s state it
explicitly, close the relevant bugs as WONTFIX and stop giving the
impression that the renaming could happen sometime later.

-- [[cs:User:Mormegil | Petr Kadlec]]

[1] http://www.informit.com/articles/article.aspx?p=2425867

Re: [Wikitech-l] Difference in article count at gom wikipedia

2015-08-14 Thread Petr Kadlec
Hi,

2015-08-14 9:19 GMT+02:00 రహ్మానుద్దీన్ షేక్ nani1o...@gmail.com:

> How different is this number {{NUMBEROFARTICLES}} from  Select count(*)
> from page where page_namespace=0 and page_is_redirect=0;


See https://www.mediawiki.org/wiki/Manual:Article_count


> While {{NUMBEROFARTICLES}} gives a count of 558 on gom.wikipedia.org,
> while
> the sql query http://quarry.wmflabs.org/query/4738 says its 2253 as of
> now. (14 August 12:47 pm IST).
> Why there is such difference (huge one)?
> Is this  a bug? Should this be dealt at phabricator?


No, this might very well be correct. Articles such as
https://gom.wikipedia.org/wiki/%E0%A4%A7%E0%A5%81%E0%A4%95%E0%A5%87%E0%A4%82
do not count, as they do not contain any internal link.

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Geolocation and WikiVoyage

2015-07-22 Thread Petr Kadlec
Hi,

On Wed, Jul 22, 2015 at 5:10 PM, Sylvain Arnouts sylvain.arno...@wanadoo.fr wrote:

> Actually, it's the same for other languages, but the API has a very
> specific way to manage languages in request.
>
> To have the datas in different languages, we have to call the langlinks
> props in the url, and then do llprop=url to have the url of the page for
> each languages. This is the final request I send to the API :
> $wikivoyageRequest =
> "https://en.wikivoyage.org/w/api.php?action=query&format=json" //Base
> ."&prop=coordinates|info|langlinks" //Props list
> ."&inprop=url" //Properties dedicated to url
> ."&llprop=url" //Properties dedicated to langlinks
> ."&generator=geosearch&ggscoord=$user_latitude|$user_longitude&ggsradius=1&ggslimit=50";
> //Properties dedicated to geosearch
>
> The answer contains every link for every language, so I guess it solves my
> problem.


Well, that depends on what exactly is your problem. The query you have
shown here is “find all articles within the radius on the English
Wikivoyage, and give me the versions of those English articles in other
languages, if they exist”. However, if you try e.g.
<https://en.wikivoyage.org/w/api.php?action=query&prop=coordinates|info&inprop=url&generator=geosearch&ggscoord=46.016667|-1.175&ggsradius=1&ggslimit=50>,
you'll find nothing, even though there is an article about Île d'Aix (lying
on those coordinates) on the French and German Wikivoyages, which you
_will_ find using
<https://de.wikivoyage.org/w/api.php?action=query&prop=coordinates|info&inprop=url&generator=geosearch&ggscoord=46.016667|-1.175&ggsradius=1&ggslimit=50>
(and you would find it also on French Wikivoyage, if it had the
{{#coordinates}} I wrote about in the previous message).

In other words, you are searching _only_ within English Wikivoyage articles
with the query you posted above. Whether that is enough for you, I cannot
decide.
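Since each search is scoped to one wiki, a client that wants multi-language
results has to issue the same geosearch per language edition. A sketch of
building those requests (Python; the function name and defaults are mine):

```python
from urllib.parse import urlencode

# Build the geosearch generator query for one Wikivoyage language
# edition. The search only covers that edition's own articles, so each
# language must be queried separately.
def geosearch_url(lang, lat, lon, radius_m=10000, limit=50):
    params = {
        "action": "query",
        "format": "json",
        "prop": "coordinates|info|langlinks",
        "inprop": "url",
        "llprop": "url",
        "generator": "geosearch",
        "ggscoord": f"{lat}|{lon}",
        "ggsradius": radius_m,
        "ggslimit": limit,
    }
    return f"https://{lang}.wikivoyage.org/w/api.php?" + urlencode(params)

# One request per language edition (the Île d'Aix coordinates from above).
urls = [geosearch_url(lang, 46.016667, -1.175) for lang in ("en", "de", "fr")]
print(urls[1].startswith("https://de.wikivoyage.org/w/api.php?"))  # True
```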

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Geolocation and WikiVoyage

2015-07-22 Thread Petr Kadlec
On Wed, Jul 22, 2015 at 9:09 AM, Sylvain Arnouts sylvain.arno...@wanadoo.fr wrote:

> But I still have some issues. I'm trying to use it in different languages.
> It works in english, but I'd like to have results from the french version
> of Wikivoyage. I don't know why, it doesn't work by calling
> https://fr.wikivoyage.org/w/api.php? ... same request. I've no answer,
> even if the page actually exists.


For this feature to work, the page obviously needs to know its coordinates.
The English page does
<https://en.wikivoyage.org/w/api.php?action=query&prop=coordinates&titles=Lille>
while the French page does not
<https://fr.wikivoyage.org/w/api.php?action=query&prop=coordinates&titles=Lille>.
The coordinates are stored into the page properties using the
{{#coordinates:}} parser function, see
<https://www.mediawiki.org/wiki/Extension:GeoData#Usage>. On English
Wikivoyage, this function is called using the {{Geo}} template (see
<https://en.wikivoyage.org/w/index.php?title=Template:Geo&action=edit>). On
the French Wikivoyage, the function should probably be called in {{Info
Ville}} or {{Geo}}, I guess.

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Geolocation and WikiVoyage

2015-07-22 Thread Petr Kadlec
Hi,

On Wed, Jul 22, 2015 at 7:13 PM, Sylvain Arnouts sylvain.arno...@wanadoo.fr wrote:

> You're right, I didn't think about it. So to have all result, I have to
> call all API on each Wikivoyage version ?


Well, either that, or you might want to utilize Wikidata, which should
basically contain everything. For instance, using
<https://wdq.wmflabs.org/api?q=around[625,46.016667,-1.175,10]> you’ll get
the list of all Wikidata items in 10km radius around Île d'Aix. You might
want to limit the query to Wikidata items with a link to a Wikivoyage
article in English, German or French with
<https://wdq.wmflabs.org/api?q=around[625,46.016667,-1.175,10]%20AND%20link[enwikivoyage,dewikivoyage,frwikivoyage]>
(add more languages to the list if so desired). I don’t know if mapping
those entity IDs to links could be done in a single step (@Magnus?), or you
need to take the result and in another step call something like
<https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q214568|Q292111&props=sitelinks/urls&sitefilter=enwikivoyage|dewikivoyage|frwikivoyage>.
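The second step — mapping entity IDs to sitelinks — can be sketched as URL
construction (Python; the IDs and site filter are the ones from the example
above):

```python
from urllib.parse import urlencode

# Ask wbgetentities for the Wikivoyage sitelinks (as URLs) of a batch of
# Wikidata items, e.g. those returned by the radius query above.
def sitelinks_url(entity_ids, sites):
    params = {
        "action": "wbgetentities",
        "format": "json",
        "ids": "|".join(entity_ids),
        "props": "sitelinks/urls",
        "sitefilter": "|".join(sites),
    }
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

url = sitelinks_url(
    ["Q214568", "Q292111"],
    ["enwikivoyage", "dewikivoyage", "frwikivoyage"],
)
print("action=wbgetentities" in url)  # True
```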

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] statistics about frequent section titles

2015-07-13 Thread Petr Kadlec
Hi,

On Sat, Jul 11, 2015 at 12:29 PM, Amir E. Aharoni amir.ahar...@mail.huji.ac.il wrote:

> Did anybody ever try to collect statistics of the most frequent section
> titles in each language and project?


I know Danny B. collects a list of used section titles with usage count on
the Czech Wiktionary (when trying to fix non-standard captions) at
https://cs.wiktionary.org/wiki/User:Danny_B./Datamining/Nadpisy

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Search engine indexing of userspace - has something changed?

2015-07-07 Thread Petr Kadlec
Hi

On Mon, Jul 6, 2015 at 7:35 PM, Jon Robson jdlrob...@gmail.com wrote:

> Looking closely this meta tag is present in user pages:
> <meta name="robots" content="noindex,follow">
>
> but not present in that particular sub page [1].
> I haven't had time to investigate further but please raise a phabricator
> task.
>
> [1]
> https://en.m.wikipedia.org/wiki/User:Nobleeagle/India_as_an_emerging_superpower


You are not looking into userspace. The user page was moved into the
mainspace a few days ago and has been a redirect since. [2] The mobile
view seems not to indicate at all that you were redirected.

-- [[cs:User:Mormegil | Petr Kadlec]]

[2]
https://en.wikipedia.org/w/index.php?title=User:Nobleeagle/India_as_an_emerging_superpower&action=history

[Wikitech-l] Foreign commits create mess in Phabricator

2015-03-04 Thread Petr Kadlec
Hi,

somebody probably already noticed this, but I didn’t see any discussion
around, so just to be sure:

I have received notifications about commits being assigned to old bugs in
Phabricator by epriestley. See for instance T4123 [1] – “epriestley added a
commit: Unknown Object (Commit)”.

It seems to me the problem is that somehow the Wikimedia instance of
Phabricator and the Phabricator instance of Phabricator (erm… the Phacility
instance? secure.phabricator.org, I mean) got
crossed/synchronized/whatever. Therefore their commit D7601 [2], marked as
fixing _their_ bug T4123 [3] somehow modified _our_ bug T4123. [1] It might
have been helped by the fact Evan Priestley has the same “epriestley”
account at both installations.

While adding nonexistent commits makes a bit of a mess, it’s not a huge
problem, but some of these commits changed state of the underlying task as
well, which is worse. See epriestley’s feed [4]. I guess those would need
to be reverted?

-- [[cs:User:Mormegil | Petr Kadlec]]

   [1]: https://phabricator.wikimedia.org/T4123
   [2]: https://secure.phabricator.com/D7601
   [3]: https://secure.phabricator.com/T4123
   [4]: https://phabricator.wikimedia.org/p/epriestley/feed/

Re: [Wikitech-l] Permanently deleting change tags available by default

2015-02-06 Thread Petr Kadlec
On Fri, Feb 6, 2015 at 3:31 AM, Brad Jorsch (Anomie) bjor...@wikimedia.org wrote:

> On Thu, Feb 5, 2015 at 9:04 PM, Jackmcbarn jackmcb...@gmail.com wrote:
>
>> Well, technically the revision goes away, but the log entries contain
>> enough information to determine exactly what was in it, sufficient to
>> reconstruct it.
>
> It'll tell you where it redirected to, but not whatever additional content
> (if any) was on the page besides the redirect.


And who created the redirect and when. (But I guess this is off-topic here.
Is there a Phabricator bug for this?)

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] broken server-side SVG rendering of a VisualEditor icon

2014-11-17 Thread Petr Kadlec
On Mon, Nov 17, 2014 at 12:43 PM, Amir E. Aharoni amir.ahar...@mail.huji.ac.il wrote:

> I guess that either something is broken in the SVG file or in MediaWiki's
> server-side SVG to PNG rendering.


Well, that is definitely not the only SVG file with problematic rendering.
See https://commons.wikimedia.org/wiki/Category:SVG_compatibility_issues
and
https://commons.wikimedia.org/wiki/Category:Pictures_showing_a_librsvg_bug

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] [Wmfall] Marielle Volz joins Wikimedia as Software Developer

2014-10-29 Thread Petr Kadlec
On Mon, Oct 27, 2014 at 8:12 PM, Roan Kattouw rkatt...@wikimedia.org wrote:

> To put that into perspective: she's working on a feature in VE that
> will let you paste in a URL to, say, a New York Times article, and
> will then automatically generate and insert a {{cite news}} template
> for you with all the right information (title of the article, author,
> date, etc.).


Interesting and useful project! I wonder – will this work only on enwiki,
or will it work in other languages and for other news websites around the
world?

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] [Wmfall] Marielle Volz joins Wikimedia as Software Developer

2014-10-29 Thread Petr Kadlec
Hi,

On Wed, Oct 29, 2014 at 4:22 PM, Marielle Volz marielle.v...@gmail.com wrote:

> […]
> So the answer is that this is being designed to work on any mediawiki
> installation in any language, by configuring it with a) a special
> Mediawiki namespace message, and b) a map in each template's
> TemplateData.
> […]


Thank you for the detailed answer, and I am looking forward to using the
feature!

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] apihighlimits and bot flags

2014-09-19 Thread Petr Kadlec
Hi,

On Fri, Sep 19, 2014 at 9:47 AM, Jonas Öberg jo...@shuttleworthfoundation.org wrote:

> According to EugeneZelenko who tried to grant this right, it could not
> be granted through the normal interface. Question then: is
> apihighlimits included in the bot flag, or how can the apihighlimits
> right be granted?


Yes, all members of the bot group have the apihighlimits permission. See
https://commons.wikimedia.org/wiki/Special:ListGroupRights
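The same check can be done programmatically against
meta=siteinfo&siprop=usergroups; a sketch (Python; the response below is
hand-trimmed to the relevant shape, and the rights lists are illustrative,
not the full Commons configuration):

```python
# Trimmed shape of action=query&meta=siteinfo&siprop=usergroups output;
# the rights lists here are illustrative, not the full configuration.
siteinfo = {
    "query": {
        "usergroups": [
            {"name": "user", "rights": ["read", "edit"]},
            {"name": "bot", "rights": ["bot", "apihighlimits", "noratelimit"]},
        ]
    }
}

def groups_with_right(info, right):
    """Return the names of groups whose rights include the given right."""
    return [g["name"] for g in info["query"]["usergroups"]
            if right in g["rights"]]

print(groups_with_right(siteinfo, "apihighlimits"))  # ['bot']
```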

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Realtime common upload plugin - wordpress

2014-09-01 Thread Petr Kadlec
Hi,

On Thu, Aug 28, 2014 at 1:41 PM, Nkansah Rexford seanmav...@gmail.com wrote:

> Is there any way to display images uploaded into a particular category on
> Wikimedia Commons to be displayed in realtime on a WordPress site?
>
> Like a plugin that follows a particular category on commons and displays
> any images that are added to it? Kinda a feeder?


Well, there is Magnus’ CatFood
<https://tools.wmflabs.org/catfood/catfood.php>, which creates an RSS feed
from newly added items in a specified category. And I guess there could be
a WordPress plugin which displays the content of an RSS feed?
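Consuming such a feed is a few lines in any language; a sketch (Python; the
sample feed is illustrative, not real CatFood output):

```python
import xml.etree.ElementTree as ET

# Parse an RSS feed (such as a per-category feed) and list item titles
# and links; the sample feed below is illustrative.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Category feed</title>
  <item><title>File:Example1.jpg</title>
    <link>https://commons.wikimedia.org/wiki/File:Example1.jpg</link></item>
  <item><title>File:Example2.jpg</title>
    <link>https://commons.wikimedia.org/wiki/File:Example2.jpg</link></item>
</channel></rss>"""

def feed_items(rss_text):
    """Return (title, link) pairs for every item in the feed."""
    root = ET.fromstring(rss_text)
    return [
        (item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

for title, link in feed_items(SAMPLE_RSS):
    print(title, link)
```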

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] (no subject)

2014-07-10 Thread Petr Kadlec
On Thu, Jul 10, 2014 at 2:32 PM, Gerard Meijssen gerard.meijs...@gmail.com wrote:

> Hoi,
> no filters for me..
> Thanks,
>
> GerardM


Congratulations, we are all happy to hear that. Just FYI: The subjectless
e-mail fell into the spam folder for me at Gmail, too. And I guess we could
all agree sending e-mails with no subject to a large mailing list is just
not a great idea anyway. So… do we need to spend any more e-mails debating
this?

Thanks.
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Separating Special:MyLanguage from Extension:Collection

2014-06-19 Thread Petr Kadlec
On Thu, Jun 19, 2014 at 1:37 PM, Seb35 seb35wikipe...@gmail.com wrote:

> Adding Special:MyLanguage on the foundationwiki will not achieve the
> feature you want: this special page redirects to the user language (of the
> user preferences), and nobody in the public has an account on this wiki, so
> everyone will be redirected to the English page (try on Meta with a private
> tab after setting the browser default language to 'de' or whatever).


I almost pointed out the same until I read Matthew’s mail, where he writes

> I don't know why Kaldari wants it. But Fundraising can use it for our thank
> you page. We would like to give a redirect like
> wikimediafoundation.org/wiki/Special:MyLanguage/Thank_You/fi and have it go
> to the Finnish page, or English if that doesn't exist.


And so I learned about another feature of Special:MyLanguage. However,
you’re right that detecting the user’s language from some other source than
just (local!) user preferences might be even better.

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] New tabs for images

2014-06-04 Thread Petr Kadlec
On Sun, Jun 1, 2014 at 1:17 PM, bawolff bawolff...@gmail.com wrote:

> On Jun 1, 2014 5:12 AM, ENWP Pine deyntest...@hotmail.com wrote:
>>
>> So here I am working late night / early morning trying to get the
>> Signpost published, and I see something new at the top of image pages on
>> English Wikipedia such as https://en.wikipedia.org/wiki/File:Wikidata.png
>>
>> We now have View on Wikimedia Commons and Add local description tabs.
>> Cool!
>
> I agree. Thank you This,_that_and_the_other for making that feature happen.


Except that now (with Media Viewer deployed), the local file description
page is no longer accessible, so this new feature is practically useless.

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] MediaWiki template page edit by IP?

2014-04-24 Thread Petr Kadlec
On Thu, Apr 24, 2014 at 10:01 AM, Thomas Gries m...@tgries.de wrote:

> Do we really have the template Namespace on MediaWiki open for edits by
> anons?


Yes. https://www.mediawiki.org/wiki/Special:ListGroupRights


> In my view, this should be changed.


Why? Do you know any WMF project where the whole Template namespace is
semiprotected?

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Missing $wgUseVFormUserLogin ??

2014-04-23 Thread Petr Kadlec
Hi,

On Wed, Apr 23, 2014 at 5:05 AM, Daniel Renfro dren...@vistaprint.com wrote:

> Whilst upgrading to 1.22.5 from 1.21.8 I came across $wgUseVFormUserLogin
> [1]. From what I can tell this global directive is completely missing from
> core. The docs on mw.org say it's in 1.22+, but I can't find any record
> of this setting. I also searched the mailing list archives (both
> mediawiki-l and wikitech-l) and found nothing.


* https://bugzilla.wikimedia.org/show_bug.cgi?id=46333 (Make new user
login/account creation the only version after testing and MW namespace
cleanup)
* https://gerrit.wikimedia.org/r/#/c/67233/


> I would like to disable the new login form (for legacy reasons, not
> because I don't think it's sexy.) Is there some other way?


I would say not anymore.

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Kategória:Pages with script errors

2014-01-02 Thread Petr Kadlec
On Thu, Jan 2, 2014 at 11:36 PM, Bináris wikipo...@gmail.com wrote:

> Thank you, it works. (It was hopeless for me to find in
> special:Allmessages.)


Try going to translatewiki.net and writing the searched text into the
search box on the main page:
http://translatewiki.net/w/i.php?title=Special%3ASearchTranslations&query=Pages+with+script+errors


> Now, another question:
> Is there a collection of such message keys of automatic categories?
> Is there a list of all automatic categories?


I don’t know of any such list, but it would be useful. In fact, it would be
useful to have a common mechanism of handling such automatic categories in
MediaWiki, ensuring that e.g. all such categories would be automatically
created on a wiki / displayed in a special way / tagged with a template /
something like that. Otherwise, those automatic categories need to be
created ad hoc whenever someone spots a red category somewhere.

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Slippy map embedded in geohack page not working in HTTPS mode

2013-08-30 Thread Petr Kadlec
On Fri, Aug 30, 2013 at 3:17 PM, Brion Vibber bvib...@wikimedia.org wrote:

> What browser are you using? Works for me in Firefox 23.0.1 and Chrome
> 29.0.1547.62 on OS X 10.8.


Does not work for me on Firefox 23.0 on Windows with

Blocked loading mixed active content
http://toolserver.org/~dschwen/wma/iframe.html?10_10_700_500_en_5_en&globe=Earth&lang=en&page=Foo&client=GeoHack

Source:
https://en.wikipedia.org/w/index.php?title=MediaWiki:GeoHack.js&action=raw&ctype=text/javascript
Line: 42

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Slippy map embedded in geohack page not working in HTTPS mode

2013-08-30 Thread Petr Kadlec
On Fri, Aug 30, 2013 at 4:26 PM, Marc A. Pelletier m...@uberbox.org wrote:

> On 08/30/2013 10:21 AM, Brion Vibber wrote:
>> I definitely recommend fixing the geohack page to work properly over
>> SSL...
>
> Given that the actual webserver fully allows https, I expect the only
> issue is protocol specified in some constructed URLs and should be
> fairly simple to fix.  The listed maintainers of the Tool Lab's geohack
> are Magnus Manske and Kolossos; Perhaps poking one of them?


Isn’t the only problem the hardcoded http: URL for the map iframe in
https://en.wikipedia.org/wiki/MediaWiki:GeoHack.js, fixable by any enwiki
sysop?

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Feature proposal: backups while editing articles

2013-04-30 Thread Petr Kadlec
On 30 April 2013 15:46, Paul Selitskas p.selits...@gmail.com wrote:

> There is another approach: storing the backups at the server side.
> Some people ( :] ) suggest that it's quite cheap and may be
> reasonable. Anyhow, we cannot allow per-keypress backup update due to
> requests latency. And this would be a disaster to process a huge bunch
> of some way useless requests simultaneously, unless dedicated servers
> are run to serve swiftly and obediently.


https://www.mediawiki.org/wiki/Extension:Drafts ? (IIRC, this was for some
time considered to be used at (all?) WMF sites, but then somehow abandoned
(?). I definitely recall seeing it working somewhere – at testwp or
possibly mediawiki.org. Oh, now I see
https://bugzilla.wikimedia.org/show_bug.cgi?id=37992)

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Zuul (gerrit/jenkins) upgraded

2013-04-25 Thread Petr Kadlec
On 25 April 2013 15:18, Mathieu Stumpf psychosl...@culture-libre.org wrote:

> Excuse me, but what is Zuul?


https://wikitech.wikimedia.org/wiki/Zuul →
https://www.mediawiki.org/wiki/Continuous_integration/Zuul

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] No cascade protection on Mediawiki namespace

2013-04-20 Thread Petr Kadlec
On 20 April 2013 07:08, Techman224 techman...@techman224.ca wrote:

> Right now the MediaWiki namespace is protected from editing. However, if
> you add a template to a Mediawiki message and the template is unprotected,
> any user could edit the message by editing the template, creating a
> backdoor.


Ummm... Don't do that, then? Sometimes you _want_ to include pieces of text
editable by more than just sysops (say, by autoconfirmed users), so you use
a template from a MediaWiki message. (Cf. e.g.
https://cs.wikipedia.org/wiki/MediaWiki:Recentchangestext and
https://cs.wikipedia.org/wiki/%C5%A0ablona:Ozn%C3%A1men%C3%ADRC.) Or, you
do not want to do that, then why are you using an unprotected template?
Either transclude another page in MediaWiki namespace as a template
({{MediaWiki:Something}}), or make sure you protect the used template (with
possible cascade).

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Article on API Characteristics

2013-04-18 Thread Petr Kadlec
On 17 April 2013 22:33, Brian Wolff bawo...@gmail.com wrote:

 My understanding is it's not really possible to do this in PHP in a way
 that would actually be of use to anyone. See
 https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1


Still, supporting this in a way “that wouldn’t be of use”, i.e. send the
100 status immediately instead of 417 would probably make it a tiny bit
easier for clients. However, this is not a bug/problem/feature-request for
MediaWiki, but for Squid. It seems Apache+PHP would handle this correctly,
but Squid rejects such requests. There is a configuration variable doing
exactly what Svick is proposing
(http://www.squid-cache.org/Doc/config/ignore_expect_100/), but I agree
turning it on would not be a good idea. And FYI: Squid 3.2 seems to support
100-continue somehow, but not sure how much. 
http://wiki.squid-cache.org/Features/HTTP11

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Same wikicode - different outcomes

2013-02-13 Thread Petr Kadlec
Hi,

On 13 February 2013 14:35, Tuszynski, Jaroslaw W. 
jaroslaw.w.tuszyn...@saic.com wrote:

 working in the predictable way. Some of them do not work properly and
 issues can be fixed by changing order of arguments in the template. For
 example http://commons.wikimedia.org/wiki/Creator:Auguste_Angellier does
 not show the content of the name field and template is in


The problem here was that the “| default=” line is not what it looks like:
instead of a space, there was a literal no-break space character (U+00A0)
between the pipe separator and the “default” keyword.

I have fixed that in https://commons.wikimedia.org/wiki/?diff=90571366

-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Portuguese Wikipedia village pump and crowdsourcing ideas

2013-01-25 Thread Petr Kadlec
Hi,

On 24 January 2013 18:34, Everton Zanella Alvarenga 
ezalvare...@wikimedia.org wrote:

 1) My first question is if someone could help me to create a  RSS feed
 of threads of the Portuguese Wikipedia village pump (= /esplanada/).
 The feed would contain the title and link to the thread, the thread
 body and the author. Our /esplanada/ has one subpage per thread, which
 makes easy to watch what I want.


If you don’t want to implement your own software to generate the RSS feed,
I guess you could use something like
http://pt.wikipedia.org/w/index.php?title=Especial%3AP%C3%A1ginas_novas&namespace=4&tagfilter=&username=
and if you would create an “abuse filter” marking all new subpages of the
village pump with a special tag, you could even filter this using the new
tag, and be practically done with it (it might not be exactly the format
you requested, but the data is there).
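If you do end up generating the feed yourself, the rendering step is trivial once you have the thread records; a minimal Python sketch (the `title` and `user` fields are assumptions about what you would extract from the API response, and the channel title is just an example):

```python
import xml.sax.saxutils as su

def rss_from_threads(threads, wiki="https://pt.wikipedia.org"):
    # Render village-pump thread records as a minimal RSS 2.0 feed.
    items = []
    for t in threads:
        link = wiki + "/wiki/" + su.escape(t["title"].replace(" ", "_"))
        items.append(
            "<item><title>%s</title><link>%s</link><author>%s</author></item>"
            % (su.escape(t["title"]), link, su.escape(t["user"]))
        )
    return ('<rss version="2.0"><channel><title>Esplanada</title>%s</channel></rss>'
            % "".join(items))

feed = rss_from_threads([{"title": "Wikipédia:Esplanada/Example", "user": "Alice"}])
```

The records themselves would come from a query for new pages in the village-pump namespace.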

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]

Re: [Wikitech-l] Limiting storage/generation of thumbnails without loss of functionality

2013-01-23 Thread Petr Kadlec
On 23 January 2013 13:24, Georgiy Tugai georgiy.tu...@gmail.com wrote:

 However, this is also an avenue for denial of service - someone could
 create many links to different images with non-standard sizes,
 intentionally or unintentionally, and therefore overload computational
 (temporarily) and storage resources on the server.


Note that this has been debated at least a few times in the past. See e.g.
the thread at
http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/63701/

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] a source repo for Extension:MathJax

2012-11-13 Thread Petr Kadlec
On 13 November 2012 15:47, Chad innocentkil...@gmail.com wrote:

 I'm not entirely sure I understand. If it's customization for the
 extension,
 shouldn't it go in the Math[0] extension?


Note that the Math extension and the MathJax extension are two completely
(?) unrelated MediaWiki extensions. See
https://www.mediawiki.org/wiki/Extension:MathJax

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Pretty Timestamps

2012-10-25 Thread Petr Kadlec
On 25 October 2012 07:07, Tyler Romeo tylerro...@gmail.com wrote:
 So recently https://gerrit.wikimedia.org/r/15746 was merged. It implements
 a pretty timestamp function. Yet it was somehow completely ignored that we
 actually have an MWTimestamp class made specifically for timestamp objects
 in MediaWiki.

…and its getHumanTimestamp is implemented in an unlocalizable way, see
http://translatewiki.net/wiki/Thread:Support/ago_%28%22$1_ago%22%29_cannot_work

Replace the implementation of getHumanTimestamp with a call to
Language::prettyTimestamp() and we are done, aren’t we? The
language-specific user-friendly formatting belongs to the language
class, anyway. And I am not sure what should have prettyTimestamp()
done differently – is wfTimestamp deprecated in favor of MWTimestamp,
or what?

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] How to display the title with underscores?

2012-07-16 Thread Petr Kadlec
On 16 July 2012 10:41, wangfeng wangfeng wangfeng.v1.1...@gmail.com wrote:
 When I edit a page titled "a_b_c", the title is always displayed as "a b c".
 I have tried to add {{DISPLAYTITLE:"a_b_c"}} in the page but it doesn't
 work.
 Have I made some mistake?

Drop the quotes. Use just

{{DISPLAYTITLE:a_b_c}}

--[[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] How to display the title with underscores?

2012-07-16 Thread Petr Kadlec
On 16 July 2012 10:54, wangfeng wangfeng wangfeng.v1.1...@gmail.com wrote:
 Thank you for your reply.
 But it still doesn't work.

Well, this should work. See https://test.wikipedia.org/wiki/A_B_C

 Any other possible mistakes?

Well, does DISPLAYTITLE work at all – do you have $wgAllowDisplayTitle
enabled? (Is this at a Wikimedia wiki, or somewhere else?)

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Sorry if inconvenient : SVG-to-png rendering on Wikimedia

2012-06-27 Thread Petr Kadlec
On 27 June 2012 13:33, Achim Flammenkamp ac...@uni-bielefeld.de wrote:
 2) I wonder why the SVG-graphic developers use such an improper(?)
 rendering philosophy. All these artifacts on the Iran flag would have been
 avoided if the rendering were divided up logically into two steps: firstly,
 render the SVG code to the size given in this SVG code (or an integer
 multiple for large final sizes) to a pixel (discrete) map.

Technical remark: The width/height specified in the SVG file is a
generic “length” value [1], which can be in other units than pixels
[2] and also can take non-integral values. [3]

-- [[cs:User:Mormegil | Petr Kadlec]]

[1]: http://www.w3.org/TR/SVG/struct.html#SVGElementWidthAttribute
[2]: http://www.w3.org/TR/SVG/types.html#DataTypeLength
[3]: http://www.w3.org/TR/SVG/types.html#DataTypeNumber


Re: [Wikitech-l] User ID 0 on simple

2012-05-04 Thread Petr Kadlec
On 4 May 2012 15:43, Bryan Tong Minh bryan.tongm...@gmail.com wrote:
 A user will have id 0, but an associated user name, when a page is
 imported via Special:Import, and if said user does not exist locally.

See also https://bugzilla.wikimedia.org/show_bug.cgi?id=7240

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] How to get a uploaded files path

2012-02-24 Thread Petr Kadlec
On 24 February 2012 15:09, Johannes Weberhofer
jweberho...@weberhofer.at wrote:
 can someone give me an idea, how I can determine a uploaded file's real
 path, when I only have the filename? I have written some code but there
 could also be a API call available.

You mean like 
https://commons.wikimedia.org/wiki/Special:FilePath/File:Example.jpg
?

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Git migration - instructions for Sysops

2012-02-15 Thread Petr Kadlec
On 16 February 2012 02:28, Jeremy Baron jer...@tuxmachine.com wrote:
 Subversion puts .svn in nested subdirectories, git doesn't.

Not anymore. Since Subversion 1.7, there is only one .svn directory in
the root of the working copy.

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Some questions about #switch

2012-02-01 Thread Petr Kadlec
On 1 February 2012 16:02, Antoine Musso hashar+...@free.fr wrote:
 Should we create an extension that would let users create key = value
 stores?

We already have that… See e.g. [[w:Category:Political party colour templates]].

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Wikimedia Trusted XFF List

2012-01-20 Thread Petr Kadlec
On 20 January 2012 13:22, Petr Bena benap...@gmail.com wrote:
 It shouldn't be kept in sync with trunk. There is a branched version in
 1.18wmf1 we use on cluster atm.

Well, OK, you’re right. Then again, shouldn’t
http://www.wikimedia.org/trusted-xff.html be kept in sync with
http://svn.wikimedia.org/viewvc/mediawiki/branches/wmf/1.18wmf1/extensions/TrustedXFF/trusted-hosts.txt?view=co
…? (Cause, currently, it obviously isn’t.)

-- [[cs:User:Mormegil | Petr Kadlec]]


[Wikitech-l] Wikimedia Trusted XFF List

2012-01-19 Thread Petr Kadlec
Hi,

shouldn’t http://www.wikimedia.org/trusted-xff.html be kept in sync
with 
http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/TrustedXFF/trusted-hosts.txt?view=co
(or deleted/redirected)? Or is there another current valid list of the
trusted XFF hosts?

Regards,
-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] snapshot of categories content and template usage

2012-01-16 Thread Petr Kadlec
Hi,

On 15 January 2012 12:26, Piotr Jagielski piotr.jagiel...@op.pl wrote:
 Is there any easier option? Is there a dump of categorization and
 template usage kept somewhere?

Yes, there is. There are raw SQL dumps of the templatelinks and
categorylinks tables available in the dumps next to the XML dump (look
for ...-categorylinks.sql.gz and ...-templatelinks.sql.gz). These do
not have a stable clearly defined format (like the XML dumps), but
they are quite a good choice for your needs, I guess. (See also
https://www.mediawiki.org/wiki/Manual:Database_layout.)
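One way to consume those dumps without loading them into a database server is to scan the INSERT statements directly; a rough Python sketch (it only reads the first two columns of each tuple and assumes the simple quoting seen in typical rows, so treat it as a starting point rather than a robust SQL parser):

```python
import re

# Match the opening of one tuple: "(<number>,'<string>'", allowing
# backslash-escaped characters inside the quoted string.
ROW = re.compile(r"\((\d+),'((?:[^'\\]|\\.)*)'")

def category_pairs(insert_line):
    # Extract (cl_from, cl_to) pairs from one INSERT INTO `categorylinks`
    # statement; the remaining columns of each tuple are ignored.
    return [(int(m.group(1)), m.group(2)) for m in ROW.finditer(insert_line)]

# Hypothetical sample line in the shape of a ...-categorylinks.sql.gz dump.
line = ("INSERT INTO `categorylinks` VALUES "
        "(12,'Physics','Albert Einstein','2009-01-01'),"
        "(34,'People','Einstein, Albert','2009-01-02');")
pairs = category_pairs(line)
```

For real dumps you would read the .sql.gz file with the gzip module and apply this line by line.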

HTH,
-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Mediawiki and a standard template library

2011-08-25 Thread Petr Kadlec
On 25 August 2011 08:28, Andre Engels andreeng...@gmail.com wrote:
 On Sat, Aug 20, 2011 at 11:31 PM, go moko gom...@yahoo.com wrote:
 So, if I can express some simple opinion, perhaps what would be useful is a
 means to easily export/import a template and all those which are necessary
 for it to work properly.
 I don't know if such a tool exists, but could it be a (partial) solution?

 This should not be too hard to program - what is needed is that one
 exports/imports not only the template but also all templates included in it.
 There is already functionality to get this list - they are shown on the edit
 page. Thus, all that is needed is some kind of interface to export this
 page and all templates directly or indirectly included in it.

You mean like… checking the “Include templates” box on Special:Export?

But the problem is more difficult than that. MediaWiki has no chance
of knowing which templates are “necessary for it to work properly”, it
can only detect those that are _actually used_ in a specific use case
(as you say, those which are shown on the edit page), which is just a
subset of those required in general.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Mediawiki and a standard template library

2011-08-25 Thread Petr Kadlec
On 25 August 2011 10:47, Andre Engels andreeng...@gmail.com wrote:
 But the problem is more difficult than that. MediaWiki has no chance
 of knowing which templates are “necessary for it to work properly”, it
 can only detect those that are _actually used_ in a specific use case
 (as you say, those which are shown on the edit page), which is just a
 subset of those required in general.


 No, but it does at least mean that exporting and importing any single
 template should be possible in a very short time indeed.

Well… no, generally not. Check for instance {{flagicon}}. There is no
way Special:Export could recognize it is required to export all 2556
(!) templates in [[Special:Prefixindex/Template:Country data]].

Of course, this does help for “simpler” template systems.

-- Petr Kadlec / Mormegil


Re: [Wikitech-l] find deleted pages

2011-07-16 Thread Petr Kadlec
On 15 July 2011 20:21, priyank bagrecha bagi.priy...@gmail.com wrote:
 is there a way to find using the api whether a certain page is deleted or not?

What do you mean by “is deleted”? You can ask whether a certain page
exists at the moment with something like

http://en.wikipedia.org/w/api.php?action=query&titles=X
http://en.wikipedia.org/w/api.php?action=query&titles=Htfhfhsrtr

And you can ask whether a certain page has been deleted sometime in
the past with something like

http://en.wikipedia.org/w/api.php?action=query&list=logevents&letype=delete&letitle=X
http://en.wikipedia.org/w/api.php?action=query&list=logevents&letype=delete&letitle=Htfhfhsrtr
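If you are scripting this, the query string is easy to assemble with a URL library; a small Python sketch of the deletion-log query (the added format=json parameter is what you would want for machine processing):

```python
from urllib.parse import urlencode

def deletion_log_url(title, api="https://en.wikipedia.org/w/api.php"):
    # Build the query asking whether `title` was ever deleted (letype=delete).
    params = {
        "action": "query",
        "list": "logevents",
        "letype": "delete",
        "letitle": title,
        "format": "json",
    }
    return api + "?" + urlencode(params)

url = deletion_log_url("Htfhfhsrtr")
```

Fetching `url` and checking whether the returned logevents list is empty then answers the question.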

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Manual (book) Guide for MediaWiki developers

2011-06-02 Thread Petr Kadlec
On 2 June 2011 07:41, Stevie i...@webserver-management.de wrote:
 how can I print out the complete already collected book?

http://www.mediawiki.org/wiki/Special:Collection/render_collection/?colltitle=User:Wikinaut/Books/MediaWiki_Developer's_Guide&writer=rl

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Manual (book) Guide for MediaWiki developers

2011-06-02 Thread Petr Kadlec
On 2 June 2011 22:12, Stevie i...@webserver-management.de wrote:
 where can I find this link in MediaWiki?

I don’t think there is a simple way for that, currently. Normally,
this is done using the {{saved book}} template (see e.g.
http://en.wikipedia.org/wiki/Template:Saved_book), but the
MediaWiki.org one does not have the relevant links
(http://www.mediawiki.org/wiki/Template:Saved_book). I guess some
universal support could be added to the Collection extension…

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Open call for Parser bugs

2011-04-07 Thread Petr Kadlec
On 6 April 2011 23:21, Mark A. Hershberger mhershber...@wikimedia.org wrote:
 This week, I'm going to focus on Parser-related bugs.  There are
 currently 10 bugs with the “triage” keyword applied.  A bug
 triage meeting needs about 30 bugs, so I have room for about 20 more
 right now.  I'll be adding to the list before Monday, but this is your
 chance to get WMF's developers talking about YOUR favorite parser bug by
 adding the “triage” keyword.

This is generally a nice idea, of course, but in the specific case of
parser bugs, we already have a keyword for that; I guess you could get
your ~30 bugs just by
https://bugzilla.wikimedia.org/buglist.cgi?keywords=parser&bug_severity=blocker&bug_severity=critical&bug_severity=major&bug_severity=normal&bug_severity=minor&bug_severity=trivial&resolution=---
;-)

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Removing test suites from trunk

2010-12-08 Thread Petr Kadlec
On 7 December 2010 19:43, Niklas Laxström niklas.laxst...@gmail.com wrote:
 On 7 December 2010 20:08, Petr Kadlec petr.kad...@gmail.com wrote:
 Well, the hint to use sparse checkouts applies this way, too… You can
 omit the testing directories from your checkout, and even subsequent
 SVN updates will skip them. After checkout, just call
 svn up --set-depth exclude maintenance/tests
 and you should be set.

 svn up --set-depth exclude maintenance/tests
 svn: 'exclude' is not a valid depth; try 'empty', 'files',
 'immediates', or 'infinity'

 There is no mention of 'exclude' even in the docs of development version of 
 svn.

The exclude option is new in 1.6, see
http://subversion.apache.org/docs/release-notes/1.6.html#sparse-directory-exclusion
and 
http://blogs.open.collab.net/svn/2009/03/sparse-directories-now-with-exclusion.html.
In 1.5, you need to use --set-depth empty, I guess.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Removing test suites from trunk

2010-12-07 Thread Petr Kadlec
On 7 December 2010 14:59,  jida...@jidanni.org wrote:
 Me too. Please move all the expert testing stuff to a different place
 where the experts can use an expert command to fetch it, and stop
 bloating us regular WikiSysops' SVN updates. Thanks.

Well, the hint to use sparse checkouts applies this way, too… You can
omit the testing directories from your checkout, and even subsequent
SVN updates will skip them. After checkout, just call
svn up --set-depth exclude maintenance/tests
and you should be set.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Removing test suites from trunk

2010-12-06 Thread Petr Kadlec
On 6 December 2010 08:47, Ashar Voultoiz hashar+...@free.fr wrote:
  - I want to be able to commit a bug fix + associated tests in one
 revision without having to fetch the whole phase3 tree.

You don’t have to. Subversion supports (since 1.5?) sparse checkouts
http://svnbook.red-bean.com/en/1.5/svn.advanced.sparsedirs.html,
allowing you to have only selected parts of a directory checked out in
your working copy, which is a feature suitable exactly for scenarios
like this one.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] How to generate link to section in JavaScript?

2010-08-24 Thread Petr Kadlec
On 24 August 2010 10:43, lampak llam...@gmail.com wrote:
 How can I generate in JavaScript such a link which would be identical to
 the one generated by MediaWiki? Has somebody written such a function? Or
 at least, do you know where it is done in MediaWiki php code?

See Parser.php, function guessSectionNameFromWikiText; most of the
work is done in Sanitizer::escapeId.
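As a rough illustration of what those functions produce (this Python sketch approximates the legacy anchor encoding; it is not a faithful port of Sanitizer::escapeId):

```python
from urllib.parse import quote

def guess_anchor(heading):
    # Approximate MediaWiki's legacy section-anchor encoding: spaces
    # become underscores, everything else is percent-encoded (UTF-8)
    # with '.' used in place of '%'.
    s = heading.strip().replace(" ", "_")
    return quote(s, safe="_:").replace("%", ".")

anchor = guess_anchor("bażant królewski")
```

So a heading "bażant królewski" yields the anchor "ba.C5.BCant_kr.C3.B3lewski", which is the form these old-style section links use.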

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] How to generate link to section in JavaScript?

2010-08-24 Thread Petr Kadlec
On 24 August 2010 15:09, Brad Jorsch b-jor...@northwestern.edu wrote:
 Sometimes it may be easiest to just ask the API.

Yep, that is a possibility:

http://en.wikipedia.org/w/api.php?action=parse&text={{anchorencode:ba%C5%BCant%20kr%C3%B3lewski}}

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] (no subject)

2010-01-23 Thread Petr Kadlec
On 23 January 2010 07:21, 李琴 q...@ica.stc.sh.cn wrote:
 I changed the $wgRCLinkLimits = array( 50, 100, 250, 500 );
 to $wgRCLinkLimits = array( 500, 1000, 5000, 1 ); and 'rclimit'   =
 1.

Just to be sure: do not _change_ the line in DefaultSettings.php!
Override the setting by adding the second quoted command into your
LocalSettings.php. Otherwise, the change will be overwritten every
time you update to a newer version of MediaWiki.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Data Processing

2009-09-25 Thread Petr Kadlec
2009/9/25 vanessa lee q...@ica.stc.sh.cn:
 What form are articles  stored in database?

Raw wiki text, plus many tables containing metadata. See
http://www.mediawiki.org/wiki/Manual:Database_layout

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Data Processing

2009-09-25 Thread Petr Kadlec
2009/9/25 李琴 q...@ica.stc.sh.cn:
 If I just edit a page's section, then what will be saved in the text
 table: the entire page's text, or just the edited section's text?

The entire page text, which is the result of merging the previous page
text with the changes you’ve made to the section.
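The splice itself can be pictured with a toy Python function (a simplification: MediaWiki tracks sections by index rather than heading text, and handles many edge cases this regex ignores):

```python
import re

def replace_section(page_text, heading, new_section_text):
    # Splice the edited section back into the full page text before
    # saving: match from the given heading up to the next same-level
    # heading (or end of page) and substitute the new body.
    pattern = re.compile(
        r"(^==\s*%s\s*==\n).*?(?=^==[^=]|\Z)" % re.escape(heading),
        re.M | re.S)
    return pattern.sub(lambda m: m.group(1) + new_section_text, page_text)

page = "Intro.\n== History ==\nOld text.\n== Usage ==\nStays.\n"
saved = replace_section(page, "History", "New text.\n")
```

The result stored in the text table is the whole merged page, not the section alone.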

 Why are the text table's fields named old_id, old_text, old_flags? What does
 the “old” mean?

See http://www.mediawiki.org/wiki/Manual:Text_table

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Data Processing

2009-09-25 Thread Petr Kadlec
2009/9/25 李琴 q...@ica.stc.sh.cn:
 The entire page text has been stored in the text table. But the recent
 changes page just shows the edited text.
 Then, how is that text stored?

It is not stored. It is evaluated during every diff view by comparing
(diffing) the two revisions (see phase3/includes/diff/*.php). Note you
can view differences between two non-consecutive versions (indeed, you
can compare two revisions not even belonging to the same page, e.g.
http://en.wikipedia.org/wiki/?diff=12345&oldid=67890&diffonly=1).
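The principle can be pictured with Python's difflib (an analogy only; MediaWiki has its own diff engine in includes/diff/): the two stored revision texts are all that is needed, and nothing diff-specific is saved.

```python
import difflib

old = "The Moon is a satellite.\nIt orbits the Earth.\n"
new = "The Moon is a natural satellite.\nIt orbits the Earth.\n"

# The diff view is computed on demand from the two full revision texts.
diff = list(difflib.unified_diff(old.splitlines(), new.splitlines(),
                                 lineterm=""))
```

Any two revisions can be compared this way, consecutive or not, which is why the diff is never stored.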

 I want to see the content (BLOB) of the old_text field in the text table.
 What should I do?

It depends on the configuration of your wiki. The “text” table might
contain the wikitext directly in the “old_text” column (in the source
text form, or as a serialized PHP object, see
phase3/includes/HistoryBlob.php), or the “old_text” column is only a
pointer to where/how the text is really stored, see e.g.
http://www.mediawiki.org/wiki/Manual:External_Storage

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Any accepted use cases that block bug 1310?

2009-09-20 Thread Petr Kadlec
2009/9/20 David Gerard dger...@gmail.com:
 What's the actual use case for nested refs? (Do you have an example
 page or two where they're useful and there's no way to do it right
 without nesting the refs?)

“There’s no way”? Well, obviously, there is always a way to do it
differently… But nested refs can be useful especially with grouped
references [1], when a footnote can refer to a source (or, more
generally, refs from one group can refer to another group).

Note that there is a workaround for this problem, explained at the
enwp help page. [2]

-- [[cs:User:Mormegil | Petr Kadlec]]

[1] http://www.mediawiki.org/wiki/Extension:Cite/Cite.php#Grouped_references
[2] http://en.wikipedia.org/wiki/Wikipedia:Footnotes#Known_bugs


Re: [Wikitech-l] How sort key works?

2009-08-11 Thread Petr Kadlec
2009/8/11 Helder Geovane Gomes de Lima heldergeov...@gmail.com:
 2009/8/10 Aryeh Gregor
 We could break ties by appending the page title to custom sort keys,
 if this is a problem.

 I think it would be good! =)

I don’t (at least not in the way it is expressed). If you want to use
the page title as a tiebreaker, then add it as a new column to the
index (before the page_id), not (as I read the original sentence) by
appending the title to the sort key.

Otherwise, you’ll have to separate the sort key from the title with
some control character under U+0020 (to ensure correct ordering of
different-length sort keys – you need a separator which sorts before
any valid character), which would be messy.
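A tiny Python demonstration of the point: plain concatenation misorders different-length sort keys, while a separator below U+0020 (or the title as a separate column) keeps the key order intact.

```python
rows = [("ABC", "A"), ("AB", "Z")]   # (custom sortkey, page title)

naive = sorted(rows, key=lambda r: r[0] + r[1])          # append title directly
sep   = sorted(rows, key=lambda r: r[0] + "\x01" + r[1]) # separator below U+0020
cols  = sorted(rows)                                     # title as second column

# naive is wrong: "ABCA" < "ABZ", so the "ABC" key jumps ahead of "AB";
# the other two variants keep "AB" first, as the sortkeys demand.
```

This is exactly why the title belongs in a separate index column rather than glued onto the key.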

But still, I don’t see the point in doing that. You don’t want a page
called “Aaa” to come after a page called “Abc” when you set their
sortkeys both to the same value? Don’t do that then. Set the sortkey
accordingly to what you want.

(OBTW, a different thing is that category paging is probably buggy in
this tiebreaking aspect – even though the index is correctly defined
to be unique, the page_id column is not included in the from= paging
parameter. But this bug will probably appear only in extreme cases,
like 300 articles with an identical sortkey.)

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] removing rc_ip

2009-06-17 Thread Petr Kadlec
2009/6/17 John Doe phoenixoverr...@gmail.com:
 please note
 that the checkuser table is created by an extension and not by
 MediaWiki, thus you cannot depend on extensions that may or may not be
 installed on all wikis. It may be that way on WMF wikis, but MediaWiki
 is not just used on WMF wikis. What you want done cannot be done
 until CU is merged into the trunk

The point is that in core, the hook would be implemented using rc_ip.
The CheckUser extension could then override the hook to use
cu_changes, which enables any wiki with the CheckUser extension to set
$wgPutIPinRC = false without any loss of functionality.

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] More aggressive DEFAULTSORT

2009-05-12 Thread Petr Kadlec
2009/5/12 Lars Aronsson l...@aronsson.se:
 Petr Kadlec wrote:
 It is exactly this attitude of “oh, it's so difficult” that has
 blocked bug 164 from being fixed.

Well, not really. Bug 164 would be fixed almost completely for
Czech-language wikis by using database features designed for exactly
this problem. [1] But, I guess you know the situation.

 I doubt it will get fixed in
 the coming year. But the more aggressive use of DEFAULTSORT could
 be applied within a year.  I know this is not the right or most
 advanced solution, it's just an improvement that *can be made*.

More aggressive use of DEFAULTSORT could be applied right now, and
some people already do that. But, to ask people to create strange
codes (like Šácholanotvaré → SŠahωolanotvare [2]) to enforce proper
sorting… I can’t believe this would be that much improvement… (Not to
say that users would make many errors with such a complicated system.)

If Swedish sorting rules are simple enough that removing all
whitespace and punctuation and converting to lower case would solve
most of the problems, I would say that such feature would not be too
difficult to implement right into MediaWiki (into LanguageSv.php),
writing those DEFAULTSORT codes explicitly into every article would be
nonsense, IMHO. (So, go ahead with it, I won’t stop you or anything,
I’m just trying to say that this is not really a solution for Czech
language.)
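For the record, the normalization you describe is only a few lines; a Python sketch of the idea (hypothetical, not what any LanguageSv.php actually does, and, as said, insufficient for Czech):

```python
import string

def simple_sortkey(title):
    # Drop whitespace and punctuation, lowercase the rest; the kind of
    # normalization that would suffice for the Swedish rules described.
    drop = set(string.whitespace) | set(string.punctuation)
    return "".join(ch for ch in title.lower() if ch not in drop)

key1 = simple_sortkey("Moonbow")
key2 = simple_sortkey("Moon illusion")
```

With this key, "Moonbow" sorts before "Moon illusion", since the space no longer takes part in the comparison.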

-- [[cs:User:Mormegil | Petr Kadlec]]

[1] http://dev.mysql.com/doc/refman/4.1/en/charset-collation-effect.html
[2] Really! http://jdem.cz/beqz6 (And that is only a partial solution,
implementing only about a half of the Czech sorting rules.)


Re: [Wikitech-l] More aggressive DEFAULTSORT

2009-05-11 Thread Petr Kadlec
2009/5/11 Lars Aronsson l...@aronsson.se:
 Category sorting in MediaWiki has always been done wrong.
 Categories are not sorted alphabetically, but in Unicode order.

Sure thing. See https://bugzilla.wikimedia.org/show_bug.cgi?id=164
with 95 votes…

 Another example of broken sorting is when whitespace is compared
 to letters.  In ASCII and Unicode, whitespace (position 32) sorts
 ahead of all printable characters.  This means “Moon illusion” sorts
 ahead of “Moonbow” in http://en.wikipedia.org/wiki/Category:Moon
 because the whitespace before “illusion” is compared to the “b” in
 “Moonbow”.  I'm not sure if this is correct in English, but in
 Swedish it is wrong; “bow” should sort before “illusion”, regardless
 of the whitespace.

In Czech, this is correct, Czech collation works on individual words.
As you see, the rules are language-specific.

 There is a way to avoid all such problems, namely by a more
 aggressive use of DEFAULTSORT that removes from sorting all upper
 case letters (except the initial one), all whitespace and all
 commas.

The problem is much more difficult than that (see the linked bug).
Commas, case sensitivity and whitespace are a trivial problem in
comparison with non-ASCII letters.

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] Dealing with Large Files when attempting a wikipedia database download.

2009-04-15 Thread Petr Kadlec
2009/4/14 Platonides platoni...@gmail.com:
 IMHO the benefits of separated files are similar to the disadvantages. A
 side benefit of it would be that hashes would be split, too. If
 you were unlucky, knowing that 'something' (perhaps just a bit) on the
 150GB you downloaded is wrong, is not that helpful.
 So having hashes for file sections on the big ones, even if not
 'standard', would be an improvement.

For that, something like Parchive would probably be better…

-- [[cs:User:Mormegil | Petr Kadlec]]


Re: [Wikitech-l] On extension SVN revisions in Special:Version

2009-03-27 Thread Petr Kadlec
2009/3/27 Daniel Kinzler dan...@brightbyte.de:
 Indeed. It would be much nicer if it could report the *directory's* revision
 number. Does SVN have a keyword substitution for this? Does SVN allow
 scripts/plugins for this type of thing? That would by quite useful.

SVN does not have any such keyword, and the reason is the same as why
doing this with a post-commit hook is not a good idea. See
http://subversion.tigris.org/faq.html#version-value-in-source and a
note at 
http://svnbook.red-bean.com/nightly/en/svn.reposadmin.create.html#svn.reposadmin.create.hooks

The proper way, IMO, is to do that with the sync script, or something similar.

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Getting the list of Page Titles and Redirects of Wikipedia

2009-03-18 Thread Petr Kadlec
2009/3/18 O. O. olson...@yahoo.com:
> This is fine, but where can I find information on custom namespaces,
> i.e. those that lie above 100?

In $wgExtraNamespaces (see
http://www.mediawiki.org/wiki/Manual:Using_custom_namespaces)
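A minimal declaration in LocalSettings.php looks like this (the namespace name and constants are example values, not anything from the thread; custom namespaces use an even ID of 100 or above, with the odd ID reserved for the talk namespace):

```php
// LocalSettings.php -- example custom namespace pair
define( 'NS_PORTAL', 100 );
define( 'NS_PORTAL_TALK', 101 );
$wgExtraNamespaces[NS_PORTAL] = 'Portal';
$wgExtraNamespaces[NS_PORTAL_TALK] = 'Portal_talk';
```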

-- [[cs:User:Mormegil | Petr Kadlec]]



Re: [Wikitech-l] Article blaming

2009-01-26 Thread Petr Kadlec
> Is the code available and have I missed it? Do we have any other
> implementation?

I tried to do something similar (two examples are at
http://mormegil.info/wp/blame/AIM.htm and
http://mormegil.info/wp/blame/AFC_Ajax.htm); the code is nothing
secret, even though it is not too clean, and there is no rocket
science in it [you have been warned:
https://opensvn.csie.org/traccgi/MWTools/browser/MWTools/trunk/PageBlame].

The biggest problem I see with such tools is that they are IMHO
unusable for any copyright-related purposes.  My tool works by diffing
the article revisions and tracking who was the last author of every
word.  Even though you can be much smarter than that, I don't believe
you would be able to track all copyright-relevant contributions that
way.  As an example, consider using the tool on an article that was
created by:
As an example, consider using that tool on an article that was created
by:
1. Importing an article with all its history from the English
Wikipedia to some other-language wiki.
2. Translating it into the local language (for more fun, imagine a
language using a different script, e.g. Russian, or even Chinese)

There is IMHO no way the blame tool could track copyright properly
through the translation (which it has to, copyright-wise). And even in
the general case, I believe such tracking would be an AI-hard task
(often even a human is unable to do it properly…). Of course, such
blame tools are great for many reasons (which is why I wrote them),
but I think the current context (license change, attribution, etc.)
does not fit them at all.
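The word-by-word attribution described above can be sketched with difflib; the revision texts and author names are invented for illustration, and this inherits exactly the limitation discussed: a translation replaces every word, so all earlier attribution is lost even though the earlier authors' copyright survives.

```python
import difflib

def blame_words(revisions):
    """Attribute each word of the latest revision to the last author
    who touched it, by diffing successive revisions word-by-word.
    `revisions` is a list of (author, text) pairs, oldest first."""
    authors, words = [], []
    for author, text in revisions:
        new_words = text.split()
        sm = difflib.SequenceMatcher(a=words, b=new_words)
        new_authors = []
        for op, i1, i2, j1, j2 in sm.get_opcodes():
            if op == "equal":
                # unchanged words keep their previous attribution
                new_authors.extend(authors[i1:i2])
            else:
                # inserted/replaced words are credited to this author
                new_authors.extend([author] * (j2 - j1))
        authors, words = new_authors, new_words
    return list(zip(words, authors))

revs = [("Alice", "the moon is bright"),
        ("Bob", "the moon is very bright")]
print(blame_words(revs))
# [('the', 'Alice'), ('moon', 'Alice'), ('is', 'Alice'),
#  ('very', 'Bob'), ('bright', 'Alice')]
```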

-- [[cs:User:Mormegil | Petr Kadlec]]
