Re: [Wikitech-l] Does anyone know about bulk loads on Wikibase/Wikidata?

2021-04-14 Thread Yuri Astrakhan
I have been doing it from OpenStreetMap data using a Python script.

First it generates about 100 GB of zipped TTL files, imports them with a
specialized Blazegraph tool, and afterwards uses direct SPARQL statements
to do batch updates when new data becomes available.

https://github.com/Sophox/sophox/tree/main/osm2rdf
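
For the batch-update step, here is a rough sketch in Python (the endpoint
URL and the triple being rewritten are illustrative, not the actual osm2rdf
statements):

    import requests

    # Blazegraph implements the standard SPARQL 1.1 Update protocol: POST
    # the statement in an "update" form field to the namespace's endpoint.
    ENDPOINT = "http://localhost:9999/bigdata/namespace/wdq/sparql"

    update = """
    PREFIX osmnode: <https://www.openstreetmap.org/node/>
    PREFIX osmm: <https://www.openstreetmap.org/meta/>
    DELETE { osmnode:123 osmm:version ?v }
    INSERT { osmnode:123 osmm:version 42 }
    WHERE  { osmnode:123 osmm:version ?v }
    """

    resp = requests.post(ENDPOINT, data={"update": update})
    resp.raise_for_status()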


On Wed, Apr 14, 2021, 5:07 PM Henry Rosales  wrote:

> Hi all!
>
> I hope you are doing very well! I’m working on constructing an RDF
> dataset, which I’m ingesting into Wikibase to leverage its ecosystem:
> visualization, SPARQL endpoint, a script for dump creation, etc. For
> ingestion, I’m using the API (explained in
> https://www.wikidata.org/w/api.php), and it is going well, except for the
> time performance.
>
> I was exploring the Quick Statements module of Wikibase, which allows
> ingestion through the UI. However, I need a way to ingest data
> automatically. I wonder if Wikibase/Wikidata has any script for bulk loads.
>
> Any comment or suggestion will be welcome.
>
> Best Regards,
> Henry Rosales
>


Re: [Wikitech-l] Community Tech: New Format for 2020 Wishlist Survey

2019-10-04 Thread Yuri Astrakhan
Ilana, restricting the wishlist to non-Wikipedia projects this year is very sad news.

For many years, the wishlist survey was the best way for the community to
talk back to the foundation, and to try to influence its direction. WMF
mostly ignored these wishes, yet it was still a place to express, discuss,
aggregate, and vote on what the community needed. A big thank-you is due to
the tiny Community Tech team that tackled the top 10 items, but that's just
~3% of the foundation's employees.

WMF has been steadily separating itself from the community and losing
credibility as a guiding force.  Take a look at the last election -- almost
every candidate said "no" to the question of whether WMF is capable of
deciding/delivering on the direction [1].  In **every** single conversation
I have had with community members, people expressed doubts about the
movement strategy project, in some cases even treating it as a joke.

This is a huge problem, and restricting the wishlist kills the last
effective feedback mechanism the community had.  Now WMF is fully in
control of itself, with nearly no checks & balances from the people who
created it.

I still believe that if WMF makes it a priority to align most of its
quarterly/yearly goals with the community wishlist (not just the top 10
positions), we could return to effective community governance.  Otherwise
WMF risks mirroring the Red Cross Haiti story [2] -- hundreds of millions
of dollars donated, and very few buildings actually built.

With great respect to all the people who made Wikis what they are today,
--[[User:Yurik]]

[1]
https://meta.wikimedia.org/wiki/Affiliate-selected_Board_seats/2019/Questions#Do_you_believe_the_Wikimedia_Foundation_in_its_present_form_is_the_right_vehicle_for_the_delivery_of_the_strategic_direction?_If_so_why,_and_if_not,_what_might_replace_it?

[2]
https://en.wikipedia.org/wiki/American_Red_Cross#Disaster_preparedness_and_response

On Fri, Oct 4, 2019 at 5:18 PM Ilana Fried  wrote:

> Hello, everyone!
>
> My name is Ilana, and I'm the product manager for the Community Tech team.
> We’re excited to share an update on the Community Tech 2020 Wishlist
> Survey. This will
> be our fifth annual Community Wishlist Survey, and for this year, we’ve
> decided to take a different approach. In the past, we've invited people to
> write proposals for any features or fixes that they'd like to see, and the
> Community Tech team has addressed the top ten wishes with the most support
> votes. This year, we're just going to focus on the *non-Wikipedia content
> projects* (i.e. Wikibooks, Wiktionary, Wikiquote, Commons, Wikisource,
> Wikiversity, Wikispecies, Wikidata, Wikivoyage, and Wikinews), and we're
> only going to address the top five wishes from this survey. This is a big
> departure from the typical process. In the following year (2021), we’ll
> probably return to the traditional structure.
>
> So, why this change? We’ve been following the same format for years — and,
> generally, it has lots of benefits. We build great tools, provide useful
> improvements, and have an impact on diverse communities. However, the
> nature of the format tends to prioritize the largest project (Wikipedia).
> This makes it harder to serve smaller projects, and many of their wishes
> never make it onto the wishlist. As a community-focused team, we want to
> support *all* projects. Thus, for 2020, we want to shine a light on
> non-Wikipedia projects.
>
> Furthermore, we’ll be accepting five wishes. Over the years, we’ve taken on
> larger wishes (like Global Preferences or Who Wrote That),
> which are awesome projects. At the same time, they tend to be lengthy
> endeavors, requiring extra time for research and development. When we
> looked at the 2019 wishlist, there were still many unresolved wishes.
> Meanwhile, we wanted to make room for the new 2020 wishes. For this reason,
> we’ve decided to take on a shortened list, so we can address as many wishes
> (new and remaining 2019 wishes)
> as possible.
>
> Overall, we look forward to this year’s survey. We worked with lots of
> folks (engineering, product management, and others) to think about how we
> could support underserved projects, all while preserving the dynamic and
> open nature of the wishlist. *Please let us know your thoughts* related
> to this change. In addition, we’ll begin thinking about the guidelines for
> this new process, so *we want your feedback* (on
> what sorts of processes/rules we may want to consider). Thank you, and
> we’re very curious to see the wishes in November!
>
>
> Thanks,
>
> Ilana Fried
>
> Product 

Re: [Wikitech-l] For title normalization, what characters are converted to uppercase ?

2019-08-03 Thread Yuri Astrakhan
Hi Nico, if possible, could your tool use the MW API to normalize titles?
It's a very quick API call, you can batch multiple titles at once, and it
will save you a lot of grief over incompatibilities.
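
Something like this (a sketch against the standard Action API; frwiki
picked to match your example):

    import requests

    # action=query reports title normalization without fetching content;
    # dozens of titles can be passed at once, separated by "|".
    resp = requests.get(
        "https://fr.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "titles": "ɽ|latin letter r with tail",
            "format": "json",
            "formatversion": "2",
        },
        headers={"User-Agent": "title-normalization-check/0.1"},
    )
    for n in resp.json()["query"].get("normalized", []):
        print(n["from"], "->", n["to"])

The second title comes back with its first letter capitalized, while "ɽ"
gets no "normalized" entry on frwiki - matching what you observed.
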
--Yuri

On Sat, Aug 3, 2019 at 10:57 AM Nicolas Vervelle 
wrote:

> Hello,
>
> On most wikis, MediaWiki is configured to convert the first letter of a
> title to uppercase, but apparently it's not converting every Unicode
> character: for example, on frwiki ɽ is a different article than Ɽ, even
> though the second character is the uppercase version of the first one in
> Unicode.
>
> So, what characters are actually converted to uppercase by the title
> normalization ?
>
> I need to know this information to stop reporting some false positives in
> WPCleaner.
>
> Thanks, Nico

Re: [Wikitech-l] Self Introduction - Xin Chen

2019-05-08 Thread Yuri Astrakhan
Xin Chen, welcome and thank you for tackling this!

If anyone wants to track it, the primary ticket for this work is
https://phabricator.wikimedia.org/T165118

Thanks!
--Yurik




On Wed, May 8, 2019 at 10:58 PM Xin Chen  wrote:

> Hi, everyone!
> My name is Xin Chen. I'm a GSoC student working with the Vega team and
> being mentored by Dominik Moritz and Yuri Astrakhan. I'll submit patches to
> add support for Vega 5 and Vega-Lite because the current Vega 2 is
> outdated.
> I will be very happy for the community to review my work and help me over
> the next several months. I hope we can work together happily, too.
>
> Sincerely yours,
> Xin Chen

Re: [Wikitech-l] New Gerrit privilege policy

2019-03-16 Thread Yuri Astrakhan
I agree that while possibly made with good intentions, policy changes like
these should go through dev community discussion + vote, not be handed down
from a CTO.  Wikipedia as a movement started that way, and many people
participated in it because of its transparency and community-driven
process. Even though there is now a large split between "community" and
"WMF staff who get +2 automatically", we should try to keep the original
values.

On Sat, Mar 16, 2019 at 3:33 PM Zppix  wrote:

> Andre, from what I'm gathering from this thread that's not what I
> understood, so I retract the part of my last email about Toolforge;
> however, my point that this policy change should have been put to a
> community vote/consensus is valid.
>
> --
> Devin “Zppix” CCENT
> Volunteer Wikimedia Developer
> Africa Wikimedia Developers Member and Mentor
> Volunteer Mozilla Support Team Member (SUMO)
> Quora.com Partner Program Member
> enwp.org/User:Zppix
> **Note: I do not work for Wikimedia Foundation, or any of its chapters. I
> also do not work for Mozilla, or any of its projects. **
>
> > On Mar 16, 2019, at 1:13 PM, Andre Klapper 
> wrote:
> >
> >> On Sat, 2019-03-16 at 12:40 -0500, Zppix wrote:
> >> So you're basically telling me I can’t decide who gets the power to +2
> >> on, for example, a Toolforge tool I am actively the primary maintainer
> >> of? Instead it has to be requested. I do not disagree with a lot of
> >> the changes to technical policies, but this change seems to restrict
> >> the ability to scale projects. I also believe that this change should
> >> have been taken through an RfC or some sort of consensus-gaining
> >> measure.  I respect the intentions, but I absolutely think the change
> >> needs to be reverted and then voted on by the technical community.
> >
> > Did you read https://www.mediawiki.org/wiki/Gerrit/Privilege_policy ?
> >
> > It says "For extensions (and other projects) not deployed to the
> > Wikimedia cluster, the code review policy is up to the maintainer or
> > author of the extension."
> >
> > andre
> > --
> > Andre Klapper | Bugwrangler / Developer Advocate
> > https://blogs.gnome.org/aklapper/
> >
> >
> >

Re: [Wikitech-l] problematic use of "Declined" in Phabricator

2018-10-02 Thread Yuri Astrakhan
I think we are misusing the term "priority" here.  Priority for whom?
Setting something to "lowest" priority implies that users do not care about
the item; "unbreak now" implies users need it fixed right away. "We have no
resources" does not mean it's not needed, it just means WMF does not view
it as a priority, which might align with the users' needs, but frequently
doesn't.

I suggest we use dashboard columns for the planning activities, while
keeping the tasks themselves fully under the "requester's" control. E.g.
let the community decide what is more important, but use dashboards for
team work planning.  This way, if a volunteer developer wants to
contribute, they would look for the highest-demanded bugs that don't have
an active status in any team.

On Tue, Oct 2, 2018 at 7:32 PM Stas Malyshev 
wrote:

> Hi!
>
> On 10/2/18 9:57 AM, Brion Vibber wrote:
> > *nods* I think the root problem is that the phabricator task system does
> > double duty as both an *issue reporting system* for users and a *task
> > tracker* for devs.
>
> This is probably the real root cause. But I don't think we are going to
> make the split anytime soon, even if we wanted to (which is not
> certain), and this mode of operation is very common in many organizations.
>
> Realizing this, I think we need some mode of explicitly saying "we do
> not have any means to do it now or in near-term future, but we don't
> reject it completely and if we ever have resources or ways to do this,
> we might revisit this".
>
> We kinda sorta have this with "Stalled" status and "Need volunteer" tag,
> but we might want to get this status more prominent and distinguish
> "TODO" items outside of any planning cycle and the ones that are part of
> the ongoing development. And document it in the lifecycle document.
>
> --
> Stas Malyshev
> smalys...@wikimedia.org
>

Re: [Wikitech-l] problematic use of "Declined" in Phabricator

2018-10-02 Thread Yuri Astrakhan
I agree, which raises the question of why so many legitimate map-related
user requests were recently closed as Declined, with a comment that there
are no resources to work on them.

On Tue, Oct 2, 2018, 17:51 Joe Matazzoni  wrote:

> I agree with Amir’s understanding. "Declined” is basically for ideas whose
> proper timing is never.  Valid ideas that we just aren’t going to work on
> any time soon should go in a backlog or freezer or some such, where they
> can wait until some future project or other development makes them
> relevant (at least theoretically).
>
> All of which does raise a slightly different question: I am much less
> clear on what the exact difference is between “Invalid” and “Declined.”
> Thoughts?
>
> Best,
> Joe
> _
>
> Joe Matazzoni
> Product Manager, Collaboration
> Wikimedia Foundation, San Francisco
> mobile 202.744.7910
> jmatazz...@wikimedia.org
>
> "Imagine a world in which every single human being can freely share in the
> sum of all knowledge."
>
>
>
>
> > On Oct 2, 2018, at 9:31 AM, Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
> >
> > Hi,
> >
> > I sometimes see WMF developers and product managers marking tasks as
> > "Declined" with comments such as these:
> > * "No resources for it in (team name)"
> > * "We won't have the resources to work on this anytime soon."
> > * "I do not plan to work on this any time soon."
> >
> > Can we perhaps agree that the "Declined" status shouldn't be used like
> > this?
> >
> > "Declined" should be valid when:
> > * The component is no longer maintained (this is often done as
> > mass-declining).
> > * A product manager, a developer, or any other sensible stakeholder
> > thinks that doing the task as proposed is a bad idea. There are also
> > variants of this:
> > * The person who filed the task misunderstood what the software
> > component is supposed to do and had wrong expectations.
> > * The person who filed the task identified a real problem, but another
> > task proposes a better solution.
> >
> > It's quite possible that some people will disagree with the decision to
> > mark a particular task as "Declined", but the reasons above are
> > legitimate explanations.
> >
> > However, if the task suggests a valid idea, but the reason for declining
> > is that a team or a person doesn't plan to work on it because of lack of
> > resources or different near-term priorities, it's quite problematic to
> > mark it as Declined.
> >
> > It's possible to reopen tasks, of course, but nevertheless "Declined"
> > gives a somewhat permanent feeling, and may cause good ideas to get lost.
> >
> > So can we perhaps decide that such tasks should just remain Open? Maybe
> > with a Lowest priority, maybe in something like a "Freezer" or "Long
> > term" or "Volunteer needed" column on a project workboard, but
> > nevertheless Open?
> >
> > --
> > Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
> > http://aharoni.wordpress.com
> > ‪“We're living in pieces,
> > I want to live in peace.” – T. Moore‬

Re: [Wikitech-l] Map internationalization launched everywhere, AND embedded maps now live on 276 Wikipedias

2018-05-09 Thread Yuri Astrakhan
Joe, thanks for heeding community's wish to have better maps, and
delivering, all within the short span of ~3.5 months. Very impressive
achievement!  How can the community encourage WMF to do more map work
beyond the June deadline, similar to how foundation continues to work on
improving search, Visual Editor, and other tools?  Thanks!

On Thu, May 10, 2018 at 1:33 AM Joe Matazzoni 
wrote:

> As of today, interactive (Kartographer) maps no longer display in the
> language of the territory mapped; instead, you’ll read them in the content
> language of the wiki where they appear—or in the language their authors
> specify (subject to availability of multilingual data). In addition,
> mapframe, the feature that automatically embeds dynamic maps right on a
> wiki page, is now live on most Wikipedias that lacked the feature. (Not
> included in the mapframe launch are nine Wikipedias [1] that use the
> stricter version of Flagged Revisions).
>
> If you’re new to mapframe, this Kartographer help page [2] shows how to
> get started putting dynamic maps on your pages.  If you’d like to read
> more about map internationalization: this Special Update [3] explains the
> feature and its limitations; this post [4] and this one [5] describe the
> uses of the new parameter, lang=”xx”, which lets you specify a map’s
> language. And here are some example maps [6] to illustrate the new
> capabilities.
>
> These features could not have been created without the generous
> programming contributions and advice of our many map-loving volunteers,
> including Yurik, Framawiki, Naveenpf, TheDJ, Milu92, Astirlin, Evad37,
> Pigsonthewing, Mike Peel, Eran Roz,  Gareth and Abbe98. My apologies to
> anyone I’ve missed.
>
> The Map Improvements 2018 [7] project wraps up at the end of June, so
> please give internationalized maps and mapframe a try soon and give us your
> feedback on the project talk page [8]. We’re listening.
>
> [1] https://phabricator.wikimedia.org/T191583
> [2] https://www.mediawiki.org/wiki/Help:Extension:Kartographer
> [3]
> https://www.mediawiki.org/wiki/Map_improvements_2018#April_18,_2018,_Special_Update_on_Map_Internationalization
> [4]
> https://www.mediawiki.org/wiki/Map_improvements_2018#April_25,_2018,_You_can_now_try_out_internationalization_(on_testwiki)
> [5]
> https://www.mediawiki.org/wiki/Map_improvements_2018#April_26,_2018:_OSM_name_data_quirks_and_the_uses_of_lang=%E2%80%9Clocal%E2%80%9D
> [6] https://test2.wikipedia.org/wiki/Map_internationalization_examples
> [7] https://www.mediawiki.org/wiki/Map_improvements_2018
> [8] https://www.mediawiki.org/wiki/Talk:Map_improvements_2018
> _
>
> Joe Matazzoni
> Product Manager, Collaboration
> Wikimedia Foundation, San Francisco
>
> "Imagine a world in which every single human being can freely share in the
> sum of all knowledge."
>
>
>
>

Re: [Wikitech-l] Adding a category to json page

2018-05-07 Thread Yuri Astrakhan
It was requested a long time ago, but I don't think anyone is currently
working on anything dataset-related. See

https://phabricator.wikimedia.org/T155290

On Mon, May 7, 2018 at 4:13 PM יגאל חיטרון
wrote:

> Hello. Is there a way to add a category to a json contentmodel page? Thank
> you.
> Igal (User:IKhitron)

Re: [Wikitech-l] Wikistats 2.0 - Now with Maps!

2018-02-14 Thread Yuri Astrakhan
Nuria, well done, looks awesome!  I think you want to set zoom limits -
right now you can zoom in and zoom out almost infinitely. Congrats!

On Wed, Feb 14, 2018 at 8:47 PM, Michael Schönitzer <
michael.schoenit...@wikimedia.de> wrote:

> here is a screenshot:
>
> There's a second anomaly: "SX" is also listed…
>
> 2018-02-15 1:35 GMT+01:00 יגאל חיטרון :
>
> > Sorry, I can't. I opened the link you gave on hewiki. Changed from map
> > view to table view. I get a list of countries - US, France, Spain, --,
> > Japan, and so on. It's a link, and clicking it opens a nonexistent wiki
> > article with the same name.
> > Igal
> >
> >
> > On Feb 15, 2018 02:23, "Nuria Ruiz"  wrote:
> >
> > > Hello,
> > >
> > > >Hello and thank you.
> > > >What do you mean in country named "--"?
> > >
> > > Not sure what you are asking, maybe a screenshot would help?
> > >
> > > On Wed, Feb 14, 2018 at 3:04 PM, יגאל חיטרון 
> wrote:
> > >
> > > > Hello and thank you.
> > > > What do you mean in country named "--"?
> > > > Igal (User:IKhitron)
> > > >
> > > >
> > > > On Feb 15, 2018 00:15, "Nuria Ruiz"  wrote:
> > > >
> > > > Hello from Analytics team:
> > > >
> > > > Just a brief note to announce that Wikistats 2.0 includes data about
> > > > pageviews per project per country for the current month.
> > > >
> > > > Take a look, pageviews for Spanish Wikipedia this current month:
> > > > https://stats.wikimedia.org/v2/#/es.wikipedia.org/reading/
> > > > pageviews-by-country
> > > >
> > > > Data is also available programmatically via APIs:
> > > >
> > > > https://wikitech.wikimedia.org/wiki/Analytics/AQS/
> > > > Pageviews#Pageviews_split_by_country
> > > >
> > > > We will be deploying small UI tweaks during this week, but please
> > > > explore and let us know what you think.
> > > >
> > > > Thanks,
> > > >
> > > > Nuria
> >
>
>
>
> --
> Michael F. Schönitzer
>
>
>
> Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
> Tel. (030) 219 158 26-0
> http://wikimedia.de
>
> Imagine a world in which every single human being can freely share in the
> sum of all knowledge. Help us achieve this!
> http://spenden.wikimedia.de/
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
> (Society for the Promotion of Free Knowledge). Registered in the register
> of associations of the Berlin-Charlottenburg district court under number
> 23855 B. Recognized as charitable by the Berlin I Tax Office for
> Corporations, tax number 27/681/51985.
>

[Wikitech-l] Multilingual maps: how to pick each feature language?

2017-02-09 Thread Yuri Astrakhan
TLDR: if browsing a map on the French wiki, and a city only has a Russian
and a Chinese name, which one should be shown? Should the city name have
different rules from a store or a street name? ...

I have been hacking to add unlimited multilingual support to Wikipedia
maps, and have a language fallback question:  given a list of arbitrary
languages for each map feature, what is the best choice for a given
language?

I know MediaWiki has language fallbacks, but they are very simple (e.g. for
"ru", if "ru" is not there, try "en").

Some things to consider:
* Unlike Wikipedia, where readers go for "meaning", in maps we mostly need
"readability".
* Alphabets: the Latin alphabet is probably the most universally
understood, followed by...? Per target language?
* Politics: places like Crimea tend to have both Russian and Ukrainian
names defined, but if drawing a map in Ukrainian, and some feature has
Russian and English names, but not Ukrainian, should it be shown with the
Russian or English name?
* When viewing a map of China in English, should the Chinese (local) name
be shown together with the English name? Should it be shown for all types
of features (city name, street name, name of a church, ...)?

Thanks!

Re: [Wikitech-l] +2 request for yurik in mediawiki and maps-dev

2017-01-25 Thread Yuri Astrakhan
How is that different from becoming disgruntled while still working at WMF?
Or becoming disgruntled without ever joining? Or stolen credentials?

On Wed, Jan 25, 2017, 16:01 Kevin Smith  wrote:

On Wed, Jan 25, 2017 at 12:55 PM, Alex Monk  wrote:

> You don't appear to be a developer. Even if you were, Brian said 'similar
> situation' - i.e., where someone had been granted rights before becoming
> WMF staff.
>

I am a developer, but not a Wikimedia developer (yet?). That's why I'm not
exactly clear on the potential risks of giving +2 rights to someone who
intends harm.

We are talking about exactly the same situation: Someone was a productive
volunteer, then staff, then no longer staff. 99+% of the time, they should
retain their rights, or get them back shortly after leaving. But there may
be cases where someone would be disgruntled, and if rights are automatic in
all cases, such a person could do damage. I'm just suggesting that there
should be a quick and simple check to be sure that's not the case. Being
without rights for a couple of days seems like a small price to pay for
the safety of the project.

Kevin

Re: [Wikitech-l] +2 request for yurik in mediawiki and maps-dev

2017-01-24 Thread Yuri Astrakhan
Thanks Legoktm!

For the reference, I was granted these rights as a volunteer, before
joining WMF: https://lists.gt.net/wiki/wikitech/335950

On Tue, Jan 24, 2017 at 10:44 PM, Legoktm 
wrote:

> Hi,
>
> After speaking with Yurik, I've filed
>  on his behalf to restore his
> membership in the mediawiki and maps-dev groups.
>
> I would appreciate guidance in whether these rights can be summarily
> granted since he used to have them, or if it needs to go through the
> full process.
>
> -- Legoktm
>

[Wikitech-l] Map replaces GeoHack on ruwiki

2017-01-20 Thread Yuri Astrakhan
Russian Wikipedia just replaced all of their map links in the upper right
corner (geohack) with <maplink> - the Kartographer extension!  Moreover,
when clicking the link, it also shows the location outline, if that object
exists in OpenStreetMap, using the corresponding Wikidata ID.  My deepest
respect to my former Interactive Team colleagues and volunteers who have
made it possible!  (This was community wishlist #21)

Example - city of Salzburg (click coordinates in the upper right corner, or
in the infobox):
https://ru.wikipedia.org/wiki/%D0%97%D0%B0%D0%BB%D1%8C%D1%86%D0%B1%D1%83%D1%80%D0%B3

[Wikitech-l] Localizable cross-wiki templates

2017-01-06 Thread Yuri Astrakhan
TLDR: Templates can now use datasets on Commons for all their localizable
messages, making it possible to copy/paste templates between wikis without
any changes.

If you look at the Graph:Lines template [1], there is a small link under
each graph that points to the data source. That link is automatically
localized on enwiki and ruwiki because the actual messages are stored in a
Commons dataset, while the template itself is identical on both wikis.

The magic happens in Module:TNT (hoping it won't blow up):

*{{#invoke:TNT | msg | DatasetDict.tab | message-key | param1 | param2 |
... }}*
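
For example, a template could emit its localized "data source" link with a
call like this (the dictionary page is the one linked below; the message
key and parameter are made up for illustration):

    {{#invoke:TNT | msg | Original/Template:Graph:Lines.tab | source-link | Weather/New_York_City.tab }}

TNT looks up the message key as a row in the dataset, picks the translation
matching the wiki's content language, and substitutes the parameters into
the MediaWiki-style $1, $2 placeholders.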

I would love to support template doc pages too, but I'm not sure how well
VE and other tools will work with the 

This is an experimental thing - I'm just hacking on it in my free time, and
would love to get your feedback on the viability of this approach. Thanks!

Links:
enwiki: https://en.wikipedia.org/wiki/Template:Graph:Lines
ruwiki: https://ru.wikipedia.org/wiki/Template:Graph:Lines
TNT: https://commons.wikimedia.org/wiki/Module:TNT
Dictionary:
https://commons.wikimedia.org/wiki/Data:Original/Template:Graph:Lines.tab

Re: [Wikitech-l] Now live: Shared structured data

2016-12-28 Thread Yuri Astrakhan
The 400 char limit is there to stay in sync with Wikidata, which has the
same limitation. The origin of this limit is to encourage storage of
"values" rather than full strings (sentences). It also discourages storage
of wiki markup.
On Wed, Dec 28, 2016, 16:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Thank you Yuri. Is there some rational explanation behind these limits? I
> understand the limit over performance concerns, and 2MB seems already
> very large for the intended glossaries. But 400 chars might be problematic
> for some definitions I guess, especially since translations can lead to
> varying length needs.
>
>
> On 25/12/2016 at 17:03, Yuri Astrakhan wrote:
> > Hi Mathieu, yes, I think you can totally build up this glossary in a
> > dataset. Just remember that each string can be no longer then 400 chars,
> > and total size under 2mb.
> >
> > On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
> > psychosl...@culture-libre.org> wrote:
> >
> >> Hi Yuri,
> >>
> >> Seems very interesting. Am I wrong thinking this could help to create a
> >> multi-lingual glossary as drafted in
> >> https://phabricator.wikimedia.org/T150263#2860014 ?
> >>
> >>
> >> Le 22/12/2016 à 20:30, Yuri Astrakhan a écrit :
> >>> Gift season! We have launched structured data on Commons, available
> from
> >>> all wikis.
> >>>
> >>> TLDR; One data store. Use everywhere. Upload table data to Commons,
> with
> >>> localization, and use it to create wiki tables, lists, or use directly
> in
> >>> graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
> >>> per-state GDP map demo, and select multiple years. More demos at the
> >> bottom.
> >>> US Map state highlight
> >>> <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
> >>>
> >>> Data can now be stored as *.tab and *.map pages in the data namespace
> on
> >>> Commons. That data may contain localization, so a table cell could be
> in
> >>> multiple languages. And that data is accessible from any wikis, by Lua
> >>> scripts, Graphs, and Maps.
> >>>
> >>> Lua lets you generate wiki tables from the data by filtering,
> converting,
> >>> mixing, and formatting the raw data. Lua also lets you generate lists.
> Or
> >>> any wiki markup.
> >>>
> >>> Graphs can use both .tab and .map directly to visualize the data and
> let
> >>> users interact with it. The GDP demo above uses a map from Commons, and
> >>> colors each segment with the data based on a data table.
> >>>
> >>> Kartographer (<maplink>/<mapframe>) can use the .map data as an extra
> >>> layer
> >>> on top of the base map. This way we can show endangered species'
> habitat.
> >>>
> >>> == Demo ==
> >>> * Raw data example
> >>> <https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
> >>> * Interactive Weather data
> >>> <https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
> >>> * Same data in Weather template
> >>> <https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
> >>> * Interactive GDP map
> >>> <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
> >>> * Endangered Jemez Mountains salamander - habitat
> >>> <https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
> >>> * Population history
> >>> <https://en.wikipedia.org/wiki/Template:Graph:Population_history>
> >>> * Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>
> >>>
> >>> == Getting started ==
> >>> * Try creating a page at data:Sandbox/<name>.tab on Commons. Don't
> >>> forget the .tab extension, or it won't work.
> >>> * Try using some data with the Line chart graph template
> >>> A thorough guide is needed, help is welcome!
> >>>
> >>> == Documentation links ==
> >>> * Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
> >>> * Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
> >>> If you find a bug, create Phabricator ticket with #tabular-data tag, or
> >>> comment on the documentation talk pages.
> >>>
> >>> == FAQ ==
> >>> * Relation to Wikidata:  Wikidata is about "facts" (small pieces of
> >>> in

Re: [Wikitech-l] Now live: Shared structured data

2016-12-25 Thread Yuri Astrakhan
Hi Mathieu, yes, I think you can totally build up this glossary in a
dataset. Just remember that each string can be no longer than 400 chars,
and the total size under 2MB.

On Sun, Dec 25, 2016, 10:45 mathieu stumpf guntz <
psychosl...@culture-libre.org> wrote:

> Hi Yuri,
>
> Seems very interesting. Am I wrong thinking this could help to create a
> multi-lingual glossary as drafted in
> https://phabricator.wikimedia.org/T150263#2860014 ?
>
>
> On 22/12/2016 at 20:30, Yuri Astrakhan wrote:
> > Gift season! We have launched structured data on Commons, available from
> > all wikis.
> >
> > TLDR; One data store. Use everywhere. Upload table data to Commons, with
> > localization, and use it to create wiki tables, lists, or use directly in
> > graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
> > per-state GDP map demo, and select multiple years. More demos at the
> bottom.
> > US Map state highlight
> > <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
> >
> > Data can now be stored as *.tab and *.map pages in the data namespace on
> > Commons. That data may contain localization, so a table cell could be in
> > multiple languages. And that data is accessible from any wikis, by Lua
> > scripts, Graphs, and Maps.
> >
> > Lua lets you generate wiki tables from the data by filtering, converting,
> > mixing, and formatting the raw data. Lua also lets you generate lists. Or
> > any wiki markup.
> >
> > Graphs can use both .tab and .map directly to visualize the data and let
> > users interact with it. The GDP demo above uses a map from Commons, and
> > colors each segment with the data based on a data table.
> >
> > Kartographer (<maplink>/<mapframe>) can use the .map data as an extra
> > layer
> > on top of the base map. This way we can show endangered species' habitat.
> >
> > == Demo ==
> > * Raw data example
> > <https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
> > * Interactive Weather data
> > <https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
> > * Same data in Weather template
> > <https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
> > * Interactive GDP map
> > <https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
> > * Endangered Jemez Mountains salamander - habitat
> > <https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
> > * Population history
> > <https://en.wikipedia.org/wiki/Template:Graph:Population_history>
> > * Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>
> >
> > == Getting started ==
> > * Try creating a page at data:Sandbox/<name>.tab on Commons. Don't forget
> > the .tab extension, or it won't work.
> > * Try using some data with the Line chart graph template
> > A thorough guide is needed, help is welcome!
> >
> > == Documentation links ==
> > * Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
> > * Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
> > If you find a bug, create Phabricator ticket with #tabular-data tag, or
> > comment on the documentation talk pages.
> >
> > == FAQ ==
> > * Relation to Wikidata:  Wikidata is about "facts" (small pieces of
> > information). Structured data is about "blobs" - large amounts of data
> like
> > the historical weather or the outline of the state of New York.
> >
> > == TODOs ==
> > * Add a nice "table editor" - editing JSON by hand is cruel. T134618
> > * "What links here" should track data usage across wikis. Will allow
> > quicker auto-refresh of the pages too. T153966
> > * Support data redirects. T153598
> > * Mega epic: Support external data feeds.

Re: [Wikitech-l] [discovery] Now live: Shared structured data

2016-12-22 Thread Yuri Astrakhan
Svetlana, thanks for the suggestion. I think we should create a portal
similar to the Structured Data one, and put some examples there.  Deciding
on the name is difficult :)   "Commons Datasets" does sound good.

There has been a very prolonged discussion on where to host this feature -
https://meta.wikimedia.org/wiki/User:Yurik/Storing_data. Wikidata would
have been a good choice, but users expect all the data there to be in the
public domain, and we may add more licensing choices later.

> An inline example with English commentary -- straight on the first page
> about this new technology without making users click links -- could be
> nice. The text you typed up does not seem to be on a wiki page, so I am
> unable to edit it...

Which page are you referring to?

On Thu, Dec 22, 2016 at 4:03 PM Svetlana Tkachenko 
wrote:

> Hello,
>
> Maybe 'commons store' or 'commons datasets' could work? I would suggest
> that the name reflect the fact that the datasets are shared
> ('common') and are not on Wikidata.
>
> If I may ask, why is it in commons.wikimedia.org/wiki/Data:* and not at
> Meta (like Global user pages) or Wikidata (like structured data about
> lots of things)?
>
> An inline example with English commentary -- straight on the first page
> about this new technology without making users click links -- could be
> nice. The text you typed up does not seem to be on a wiki page, so I am
> unable to edit it...
>
> Svetlana.
>

Re: [Wikitech-l] Now live: Shared structured data

2016-12-22 Thread Yuri Astrakhan
Micru, thanks, I think Datasets sounds like a good name too!

On Thu, Dec 22, 2016 at 2:44 PM David Cuenca Tudela <dacu...@gmail.com>
wrote:

> On Thu, Dec 22, 2016 at 8:38 PM, Brad Jorsch (Anomie) <
> bjor...@wikimedia.org
> > wrote:
>
> > On Thu, Dec 22, 2016 at 2:30 PM, Yuri Astrakhan <
> yastrak...@wikimedia.org>
> > wrote:
> >
> > > Gift season! We have launched structured data on Commons, available
> from
> > > all wikis.
> > >
> >
> > I was momentarily excited, then I read a little farther and discovered
> this
> > isn't about https://commons.wikimedia.org/wiki/Commons:Structured_data.
> >
>
> Same here, I think it needs a better name...
>
> What about calling it datasets or structured datasets?
>
> Cheers,
> Micru

Re: [Wikitech-l] Now live: Shared structured data

2016-12-22 Thread Yuri Astrakhan
Yes, there seems to have been a bit of a naming collision.  Tabular data and
map data have been jointly known as structured data, but there is also the
Structured Data project, which IMO should be called the Structured Metadata
project :)  Naming suggestions are welcome!

P.S. Brad, I'm sorry tabular and map data did not excite you :(

On Thu, Dec 22, 2016 at 2:38 PM Brad Jorsch (Anomie) <bjor...@wikimedia.org>
wrote:

> On Thu, Dec 22, 2016 at 2:30 PM, Yuri Astrakhan <yastrak...@wikimedia.org>
> wrote:
>
> > Gift season! We have launched structured data on Commons, available from
> > all wikis.
> >
>
> I was momentarily excited, then I read a little farther and discovered this
> isn't about https://commons.wikimedia.org/wiki/Commons:Structured_data.
>
>
> --
> Brad Jorsch (Anomie)
> Senior Software Engineer
> Wikimedia Foundation

[Wikitech-l] Now live: Shared structured data

2016-12-22 Thread Yuri Astrakhan
Gift season! We have launched structured data on Commons, available from
all wikis.

TLDR; One data store. Use everywhere. Upload table data to Commons, with
localization, and use it to create wiki tables, lists, or use directly in
graphs. Works for GeoJSON maps too. Must be licensed as CC0. Try this
per-state GDP map demo, and select multiple years. More demos at the bottom.
US Map state highlight
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>

Data can now be stored as *.tab and *.map pages in the data namespace on
Commons. That data may contain localization, so a table cell could be in
multiple languages. And that data is accessible from any wikis, by Lua
scripts, Graphs, and Maps.

Lua lets you generate wiki tables from the data by filtering, converting,
mixing, and formatting the raw data. Lua also lets you generate lists. Or
any wiki markup.

Graphs can use both .tab and .map directly to visualize the data and let
users interact with it. The GDP demo above uses a map from Commons, and
colors each segment with the data based on a data table.

Kartographer (<maplink>/<mapframe>) can use the .map data as an extra layer
on top of the base map. This way we can show endangered species' habitat.

== Demo ==
* Raw data example
<https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab>
* Interactive Weather data
<https://en.wikipedia.org/wiki/Template:Graph:Weather_monthly_history>
* Same data in Weather template
<https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo>
* Interactive GDP map
<https://en.wikipedia.org/wiki/Template:Graph:US_Map_state_highlight>
* Endangered Jemez Mountains salamander - habitat
<https://en.wikipedia.org/wiki/Jemez_Mountains_salamander#/maplink/0>
* Population history
<https://en.wikipedia.org/wiki/Template:Graph:Population_history>
* Line chart <https://en.wikipedia.org/wiki/Template:Graph:Lines>

== Getting started ==
* Try creating a page at data:Sandbox/<name>.tab on Commons. Don't forget
the .tab extension, or it won't work. (A minimal example page body is
sketched below.)
* Try using some data with the Line chart graph template
A thorough guide is needed, help is welcome!
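
To make the first step concrete, here is roughly what a minimal .tab page
body looks like (field names and values made up; see the help pages below
for the authoritative format):

    {
        "license": "CC0-1.0",
        "description": { "en": "Sandbox example" },
        "schema": {
            "fields": [
                { "name": "year", "type": "number" },
                { "name": "city", "type": "localized" },
                { "name": "population", "type": "number" }
            ]
        },
        "data": [
            [ 2010, { "en": "New York", "ru": "Нью-Йорк" }, 8175133 ],
            [ 2015, { "en": "New York", "ru": "Нью-Йорк" }, 8550405 ]
        ]
    }

A "localized" cell is an object keyed by language code - that's how a
single table cell can hold multiple languages.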

== Documentation links ==
* Tabular help <https://www.mediawiki.org/wiki/Help:Tabular_Data>
* Map help <https://www.mediawiki.org/wiki/Help:Map_Data>
If you find a bug, create Phabricator ticket with #tabular-data tag, or
comment on the documentation talk pages.

== FAQ ==
* Relation to Wikidata:  Wikidata is about "facts" (small pieces of
information). Structured data is about "blobs" - large amounts of data like
the historical weather or the outline of the state of New York.

== TODOs ==
* Add a nice "table editor" - editing JSON by hand is cruel. T134618
* "What links here" should track data usage across wikis. Will allow
quicker auto-refresh of the pages too. T153966
* Support data redirects. T153598
* Mega epic: Support external data feeds.

Re: [Wikitech-l] Arbitrary Wikidata querying

2016-12-10 Thread Yuri Astrakhan
AFAIK, you can query data from Wikidata, but you cannot put it into a page,
unless it's a graph. Graphs can do it -
https://www.mediawiki.org/wiki/Extension:Graph/Demo/Sparql
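
For your second question - listing every item whose "position held" (P39)
includes "President of the United States" (Q11696) - a WDQS query along
these lines would do it (double-check the property/item IDs):

    # All items holding the position, with English labels.
    SELECT ?president ?presidentLabel WHERE {
      ?president wdt:P39 wd:Q11696 .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }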

As of last Thursday, you can also create a table in the Commons Data namespace,
and make a simple Lua script on your favorite wiki to pull that data in and
render it. Since Wikidata is accessible from Lua, you could pull useful
information about each president. I am not sure about the efficiency
aspects here.

WeatherDemo <https://en.wikipedia.org/wiki/User:Yurik/WeatherDemo> pulls
data from the Commons page Data:Weather/New_York_City.tab
<https://commons.wikimedia.org/wiki/Data:Weather/New_York_City.tab> and
formats it using an enwiki module.  I'm still working on
some fun demos for a big presentation.
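
If you want that data outside a wiki, the stored page text is plain JSON,
so a sketch like this works (standard Action API; the "schema"/"data" keys
follow the tabular-data format, if I recall the docs correctly):

    import json
    import requests

    resp = requests.get(
        "https://commons.wikimedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "revisions",
            "rvprop": "content",
            "rvslots": "main",
            "titles": "Data:Weather/New_York_City.tab",
            "format": "json",
            "formatversion": "2",
        },
        headers={"User-Agent": "tab-reader-demo/0.1"},
    )
    page = resp.json()["query"]["pages"][0]
    dataset = json.loads(page["revisions"][0]["slots"]["main"]["content"])

    # Rows are positional arrays; zip them with the declared field names.
    fields = [f["name"] for f in dataset["schema"]["fields"]]
    for row in dataset["data"][:3]:
        print(dict(zip(fields, row)))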


On Sat, Dec 10, 2016 at 8:30 PM MZMcBride  wrote:

> Hi.
>
> If I wanted to make a page on the English Wikipedia using wikitext called
> "List of United States presidents" that dynamically embeds information
> from  and
>  and other similar items, is this
> currently possible? I consider this to be arbitrary Wikidata querying, but
> if that's not the correct term, please let me know what to call it.
>
> A more advanced form of this Wikidata querying would be dynamically
> generating a list of presidents of the United States by finding every
> Wikidata item where position held includes "President of the United
> States". Is this currently possible on-wiki or via wikitext?
>
> If either of these querying capabilities are possible, how do I do them?
> I don't understand how to query Wikidata in a useful way and I find this
> frustrating. Since 2012, we've been putting a lot of data into Wikidata,
> but I want to programmatically extract some of this data and use it in my
> Wikipedia editing. How do I do this?
>
> If these querying capabilities are not currently possible, when might they
> be? I understand that cache invalidation is difficult and that this will
> need a sensible editing user interface, but I don't care about all of
> that, I just want to be able to query data out of this large data store.
>
> MZMcBride
>
>
>

[Wikitech-l] Localizable data for Graphs and Templates on Commons

2016-11-09 Thread Yuri Astrakhan
Following the localizable maps example, here is a structured tabular data
example that also supports localization and shared data, and can be used
directly from graphs or from Lua scripts on any wiki. Note that the graph
itself is on the English wiki (labs), but the data comes from Commons. Feel
free to add translations.

English:
https://en.wikipedia.beta.wmflabs.org/wiki/Dimpvis_-_Fertility_vs_Life_Expectancy
Russian:
https://en.wikipedia.beta.wmflabs.org/wiki/Dimpvis_-_Fertility_vs_Life_Expectancy?uselang=ru



On Mon, Nov 7, 2016 at 11:46 PM Yuri Astrakhan <yastrak...@wikimedia.org>
wrote:

> I would like to show one of the projects that Interactive team has been
> hacking on: localizable maps data (GeoJSON), stored on Commons, and usable
> from multiple wikis. I hope we can get it polished and enabled in
> production soon enough - so far, lab's beta cluster only:
>
> https://en.wikipedia.beta.wmflabs.org/wiki/Maplink-page
>

[Wikitech-l] Shared localized maps data demo

2016-11-07 Thread Yuri Astrakhan
I would like to show one of the projects that the Interactive team has been
hacking on: localizable maps data (GeoJSON), stored on Commons, and usable
from multiple wikis. I hope we can get it polished and enabled in
production soon enough - so far, it's on labs' beta cluster only:

https://en.wikipedia.beta.wmflabs.org/wiki/Maplink-page

[Wikitech-l] Wikipedia gets Map links and Geoshapes service

2016-09-08 Thread Yuri Astrakhan
Dear community, this week we enabled <maplink> support on all Wikipedia and
sister projects.  This means that now an article can have a link to a map,
and that map may contain highlighted regions and popups with information.
[1],[3]

Our next step is to add an informational sidebar to the map, similar to
what is being shown on the "geohack" page (map link in the upper right
corner of most location articles).  Check out the proposed screenshots [2]

We now also have a geoshapes service. So if the OpenStreetMap community has
defined a region and assigned it a Wikidata ID, you can draw it on the map
with that ID. Or you can use the Wikidata Query Service (via the SPARQL
language) to query for those IDs and draw them on the map, coloring them
and adding popup information. [3] Geoshapes can also be used in graphs. [4]
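
For example, a page can now link to a map that pulls a region's outline by
its Wikidata ID (the ID and styling here are illustrative; see [1] for the
exact syntax):

    <maplink zoom="5" latitude="42.9" longitude="-75.5" text="New York">
    {
      "type": "ExternalData",
      "service": "geoshape",
      "ids": "Q1384",
      "properties": { "title": "New York", "fill": "#0066ff" }
    }
    </maplink>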


[1] Maps docs: https://www.mediawiki.org/wiki/Help:Extension:Kartographer
[2] sidebar prototype: https://phabricator.wikimedia.org/T131907#2616275
[3] US Governors
https://en.wikipedia.org/wiki/User:Yurik/maplink#/maplink/0
[4] Graph geoshapes
https://www.mediawiki.org/wiki/Template:Graph:Country_with_regions_and_capitals

Re: [Wikitech-l] to be enabled tomorrow

2016-09-02 Thread Yuri Astrakhan
We tried to deploy it yesterday, but ran into some issues, so will try
again next week. Please keep an eye on
https://phabricator.wikimedia.org/T144062

On Fri, Sep 2, 2016, 12:03 Strainu <strain...@gmail.com> wrote:

> Hi Yuri,
>
> This does not seem to have happened. Is there a new ETA for this?
>
> Thanks,
>Andrei
>
> 2016-08-31 23:19 GMT+03:00 Yuri Astrakhan <yastrak...@wikimedia.org>:
> > Seems we have no more blockers for the <maplink> tag on all Wikipedias.
> > We will enable it tomorrow, Sept 1st. The <maplink> tag allows editors
> > to add a link to a popup map, complete with extra GeoJSON overlay data.
> >
> > See help page for instructions:
> > https://www.mediawiki.org/wiki/Help:Extension:Kartographer#.3Cmaplink.3E
> >
> > Submit bugs to phabricator:
> >
> https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?tags=Kartographer,maps

[Wikitech-l] to be enabled tomorrow

2016-08-31 Thread Yuri Astrakhan
Seems we have no more blockers for the <maplink> tag on all Wikipedias. We
will enable it tomorrow, Sept 1st. The <maplink> tag allows editors to add
a link to a popup map, complete with extra GeoJSON overlay data.

See help page for instructions:
https://www.mediawiki.org/wiki/Help:Extension:Kartographer#.3Cmaplink.3E

Submit bugs to phabricator:
https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?tags=Kartographer,maps

Re: [Wikitech-l] 2016W24 ArchCom RFC meeting (2016-06-15)

2016-06-15 Thread Yuri Astrakhan
Thank you Pine for your support! I do hope we can get this sorted out and
deployed :)

On Wed, Jun 15, 2016 at 10:30 AM, Pine W  wrote:

> I can't attend this meeting, but want to continue to voice support for
> improving MediaWiki's (primarily VE's) and Commons' ability to accept,
> manipulate, and display spreadsheets and their kin.
>
> Thanks for working on this.
>
> Pine
> On Jun 15, 2016 00:11, "Rob Lanphier"  wrote:
>
> > Hi everyone,
> >
> > We're holding another ArchCom-RFC meeting this week to follow up on
> > the thread Yuri started about T120452.
> >
> > The ArchCom status page is finally up-to-date with last week's info:
> > 
> >
> > (and it's quoted below)
> >
> > Links!:
> > This week's meeting: 
> > RFC for this week's meeting: 
> > Subject of the RFC: Allow tabular datasets on Commons (or some similar
> > central repository) (CSV, TSV, JSON, XML)
> > Location: #wikimedia-office IRC channel
> > Time: 2016-06-15 Wednesday 21:00 UTC (2pm PDT, 23:00 CEST)
> >
> > Meetbot looks forward to scribing your presence!
> >
> > Rob
> >
> >
> > 
> > Source of [[mw:Architecture_committee/Status]], where "phab:" links
> > are pointers to https://phabricator.wikimedia.org
> >
> > This page provides status update for [[Requests for
> > comment|ArchCom-RFCs]], with an emphasis on ArchCom team member.  As
> > of this writing on 2016-04-29, this update is an experiment discussed
> > [[Topic:T2zctt083izvx07l|weekly ArchCom update discussion on the
> > "ArchCom/Team practices" talk page]].
> >
> > = Recent RFC meetings =
> > *ArchCom Planning meeting 2016W23: 2016-06-08: [[Phab:E202]] (E156/10)
> > **Notes: [[Architecture committee/2016-06-08]]
> > *ArchCom-RFC office hour 2016W23: 2016-06-08: [[Phab:E203]] (E66/38)
> > ** [[Phab:T89331|T89331 Replace Tidy in MW parser with HTML 5
> > parse/reserialize]]
> >
> > = Upcoming RFC meetings =
> > *ArchCom Planning meeting 2016W24: 2016-06-15: [[Phab:E212]] (E156/11)
> > **Notes: [[Architecture committee/2016-06-15]]
> > *ArchCom-RFC office hour 2016W24: 2016-06-15: [[Phab:E213]] (E66/39)
> > ** [[Phab:T120452|T120452]]: Allow tabular datasets on Commons (or
> > some similar central repository) (CSV, TSV, JSON, XML)
> > *** See also: comments in T124569 and T134426
> >
> > = Entering Final Comment Period =
> > * None.
> >
> > = Recently Approved =
> > * none
> >
> >  RFC inbox 
> > * [[phab:tag/archcom-rfc/|ArchCom RFC board]]:
> > ** [[Phab:T124569|T124569 RFC: Data namespace blob storage on
> wikidata.org
> > ]]
> >
> > = Shepherd status =
> > * Brion
> > ** [[Phab:T107595|T107595]] Multi-content revisions is interesting,
> > needed for various things in multimedia land
> > *** Meeting happened earlier this week; notes are on the ticket
> > ** T66214 - predictable thumb URLs
> > *** Break this out into:
> >  Define set of core & extensible media file options for Handler
> > extensions
> >  Predictable thumb URLs
> >  Improve InstantCommons perf by reducing need to run thumbnail URL
> > lookups
> >  Iframe-based rich media embedding for InstantCommons
> > ** plan to write up new RfCs for:
> > *** In-browser SVG rendering (pick up existing bug & mailing list notes)
> > *** iframe+CSP-isolated JS widgets for rich content
> >  & extend that to InstantCommons via embedding
> > *** iframe+CSP-isolated JS gadgets for UI plugins
> >  Build these out from ideas from hackathon project T131436
> > * Daniel
> > ** Software Quality working group - will follow up on earlier
> > proposal.  Will talk to people at Wikimania
> > ** Working on Multi Content Rev Spec with Brion
> > ** T113034 [[phab:T113034|RFC: Overhaul Interwiki map, unify with
> > Sites and WikiMap]]: checking in with Adam
> > ** T89733 (approved, with Stas driving implementation)
> > * Gabriel
> > ** Looking into content composition working group, possibly kick-off
> > at Wikimania
> > ** Discussing Multi Content Rev / RB interaction with Daniel;
> > follow-up at Wikimania
> > * Roan
> > ** T108655 [[phab:T108655|RFC: Standardise JavaScript interfaces]]: I
> > need to start the second part, but the recent comments have me
> > confused. I'll need to talk to Timo and figure out what the subject of
> > part two should be.
> > * RobLa
> > ** Working with [[User:DPatrick (WMF)|DPatrick]] on [[Wikimedia
> > Security Team]] issues in an attempt to be useful there.
> > ** T123753 [[phab:T123753|Establish retrospective reports for Security
> > and Performance incidents]]
> > *** In scope for this group?
> > ** Forming ArchCom-affiliated working groups
> > *** RFCs
> >  T124504 [[phab:T124504|Transition WikiDev '16 working areas into
> > working groups]] and
> >  T123606 [[Phab:T123606|RFC: Implement ArchCom-affiliated working
> > groups (process inspired by Rust's 

Re: [Wikitech-l] Enabling shared tabular data pages

2016-06-06 Thread Yuri Astrakhan
Rob, thanks for your offer to help! Always welcome :)

By discussion and positive feedback I meant Facebook and Commons comments,
and a very old and elaborate phab ticket discussion:
* https://phabricator.wikimedia.org/T120452
*
https://commons.wikimedia.org/wiki/Commons:Village_pump/Proposals#Tabular_data_storage_for_Commons.21
* https://www.facebook.com/groups/wikipediaweekly/permalink/997545366959961/

I do not think this feature will immediately have a huge uptake. It will
simplify graph design, as it will be possible to store data outside of the
graph. It will also allow data from tables and lists to be moved into
separate wiki pages. The work of moving existing data into these pages
might not be very fast. In short, the data will only be accessible from Lua
and graphs, pages will be limited to 2MB each, and editing the JSON will
require some technical skills until better tools are created.


On Mon, Jun 6, 2016 at 9:14 PM, Rob Lanphier <ro...@wikimedia.org> wrote:

> On Mon, Jun 6, 2016 at 6:40 AM, Yuri Astrakhan <yastrak...@wikimedia.org>
> wrote:
>
> > Daniel, I agree about the data/api versioning. I was mostly talking about
> > features and capabilities. For example, we could spend the next year
> > developing a visual table editor, implement support for unlimited table
> > sizes, provide import/export from other table formats, introduce
> elaborate
> > schema validation, and many other cool features. And after that year
> > realize that users don't need this whole thing at all, or need something
> > similar but very different.  Or we could release one small, well defined,
> > stable subset of that functionality, get feedback, and move forward.
> >
>
> Hi Yuri,
>
> I think one thing that would be helpful for me (and I suspect many people
> who want to help) is some more specifics about this statement from your
> original email: "We have had some good feedback for the new shared tabular
> data feature, and we are getting ready to deploy it in production."  Which
> "we" are you referring to, and by "getting ready to deploy it in
> production", does that mean it's about to be usable where someone could
> upload gigabytes of production data in this format to Commons by the end of
> the week?  Is there a more measured plan published somewhere?
>
> This all sounds very cool, but also an area where we could accidentally
> accrue a crushing load of technical debt without fully realizing it (per
> Daniel's comment).  I'll confess to being ignorant on everything that's
> been going on, and I'm wondering now how desperately I should study your
> documentation to make up for it (and how important it is to drop other work
> to make time for this).
>
> Rob
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Enabling shared tabular data pages

2016-06-06 Thread Yuri Astrakhan
Daniel, thanks, inline:

The structure looks sane and future-proof to me, but since it's
> all-in-one-blob,
> it'll be hard to scale it to more than a few ten thousand lines or so. I
> like
> this model, but if you want to go beyond that (DO we want to go beyond
> that?!)
> you will need a different approach, which may be incompatible.
>

We do *eventually* want to go beyond that towards large data. We had this
discussion with Brion, see here:
*  https://phabricator.wikimedia.org/T120452#2224764

I do not think my approach is a blocker for larger datasets, because you
can add a simple SQL-like interface capable of reading data from these pages
and from large backend databases. The 2MB page limit will prevent page data
from growing too large. Also, larger datasets are a different target, one that
we should approach when we are ready.

One thing that should be specified very rigorously from the start are the
> supported data types, along with their exact syntax and semantics. Your
> example
> has string, number, boolean, and localized. So:
>
> * what's the length limit for string?
>
Good question. Do you have a limit for Wikidata labels and other string
values?

> * what's the range and precision of number? Is it the same as for JSON?
>
For now, same as JSON.

> * does boolean only accept JSON primitives, or also strings?
>
true/false only, no strings

> * what language codes are valid for localized? Is language fallback
> applied for
> display?
>
Same rules as for wiki language codes (but without validation against the
actual list). Automatic fallback is already implemented, using the Language
class.  If everything else fails and there is no English value, it takes the
first available language (unlike Language, which stops at English and fails
otherwise).
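
To make this concrete, here is a rough sketch of what a small Data:*.tab
page could look like (field names and exact layout are illustrative, not
the final format):

    {
        "license": "CC0-1.0",
        "description": { "en": "Largest cities", "de": "Größte Städte" },
        "schema": {
            "fields": [
                { "name": "city", "type": "string", "title": { "en": "City" } },
                { "name": "population", "type": "number" },
                { "name": "capital", "type": "boolean" }
            ]
        },
        "data": [
            [ "Berlin", 3520031, true ],
            [ "Hamburg", 1787408, false ]
        ]
    }

The "description" and "title" values are the "localized" type discussed
above - a map of language code to string, with the fallback rules applied
at render time.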


> You write in your proposal "Hard to define types like Wikidata ID,
> datetime, and
> URL could be stored as a string until we can reuse Wikidata's type system".
> Well, what's keeping you from using it now? DataValue and friends are
> standalone
> composer modules, you can find them on github.

I was told by the Wikidata team at the Jerusalem hackathon that the
JavaScript code is too entangled, and I won't be able to reuse it for
non-Wikidata stuff.  I will be very happy to adapt it if possible. Yet, I
do not think this is a requirement for the first release.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Enabling shared tabular data pages

2016-06-06 Thread Yuri Astrakhan
Daniel, I agree about the data/api versioning. I was mostly talking about
features and capabilities. For example, we could spend the next year
developing a visual table editor, implement support for unlimited table
sizes, provide import/export from other table formats, introduce elaborate
schema validation, and many other cool features. And after that year
realize that users don't need this whole thing at all, or need something
similar but very different.  Or we could release one small, well defined,
stable subset of that functionality, get feedback, and move forward.

Do you have any thoughts about the proposed data structure?

On Mon, Jun 6, 2016 at 4:09 PM, Daniel Kinzler <daniel.kinz...@wikimedia.de>
wrote:

> Am 04.06.2016 um 18:47 schrieb Yuri Astrakhan:
> > In line with the "release early, release often", we will not have any
> > elaborate data editing interface beyond the raw JSON code editor for the
> > first release.
>
> A word of caution about this strategy: this is great for user facing
> things, but
> it really sucks if you are creating artifacts, such as page revisions. You
> will
> have to stay compatible with your very first data format, and the second,
> and
> the third, etc, forever. Similarly, once you have an ecosystem of tools
> that
> rely on your API and data model, changing it becomes rather troublesome.
>
> So, for anything that is supposed to offer a stable API, or creates
> persistent
> data, "release early, release often" is not a good strategy in my
> experience. A
> lot of pain lies this way. Remember: wikitext syntax was once a "let's
> just make
> it work, we will fix it later" hack...
>
> -- daniel
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Enabling shared tabular data pages

2016-06-04 Thread Yuri Astrakhan
We have had some good feedback for the new shared tabular data feature, and
we are getting ready to deploy it in production. It would be amazing if you
can give it a final look-over to see if there are any blockers left.

The first stage will be to enable  Data:*.tab  pages on Commons, and allow
all other wikis direct access to it via Lua code and Graph extension. All
data at this point must be licensed under CC0. More licensing options are
still under discussion, and can be easily added later.

In line with the "release early, release often", we will not have any
elaborate data editing interface beyond the raw JSON code editor for the
first release. Our initial target audience is the more experienced users
who will evaluate and test the new technology. Once the underlying tech is
stable and proven, we will work on making it more accessible to the
general audience.

Links:
* Task: https://phabricator.wikimedia.org/T134426
* Demo: http://data.wmflabs.org
* Technical: https://www.mediawiki.org/wiki/Extension:JsonConfig/Tabular
* Discussion:
https://commons.wikimedia.org/wiki/Commons:Village_pump/Proposals#Tabular_data_storage_for_Commons.21
* Facebook:
https://www.facebook.com/groups/wikipediaweekly/permalink/997545366959961/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] JetBrains licenses for Volunteers and Staff

2016-05-10 Thread Yuri Astrakhan
Hi Pine,

They are not upgrades; they are full one-year renewable free licenses for
the powerful IDEs with PHP, JS, Puppet, Ruby, Python, C++, ... support, donated to
help with wiki development.

You basically need to be a volunteer or a staff developer working on Wiki
projects with some history of code contributions.  ReSharper is a great
product and it is part of the license, but I am not sure we have any .NET
code (I think we used to run bits of mono code somewhere, but that was a
while ago). Have you submitted any patches? In what languages?

Thanks!
On May 10, 2016 08:48, "Pine W" <wiki.p...@gmail.com> wrote:

> Hi Yuri,
>
> ReSharper interests me. This email refers to license upgrades. What is
> involved in initially qualifying for a license?
>
> Thanks,
>
> Pine
> On Apr 18, 2016 09:44, "Yuri Astrakhan" <yastrak...@wikimedia.org> wrote:
>
> > PhpStorm, IntelliJ IDEA, ReSharper and other JetBrains users, we just
> > received free upgraded licenses for all of their products.
> >
> > Check your account at https://account.jetbrains.com/licenses and login
> > with
> > that account inside your application to automatically use that license.
> If
> > you don't see the license, contact me or Sam Reed, and we will add you
> > right away.  On IRC: yurik or reedy, or via email.  We could also use a
> few
> > more admins for these licenses. You will no longer need to copy/paste any
> > licenses manually.
> >
> > IDEA:  use for PHP, JavaScript, Puppets, Ruby, and other web-related
> > technologies
> > https://www.jetbrains.com/idea/
> >
> > ReSharper: for C# and Microsoft Visual Studio C++
> > https://www.jetbrains.com/resharper/
> >
> > CLion: for C++
> > https://www.jetbrains.com/clion/
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] JetBrains licenses for Volunteers and Staff

2016-04-18 Thread Yuri Astrakhan
PhpStorm, IntelliJ IDEA, ReSharper and other JetBrains users, we just
received free upgraded licenses for all of their products.

Check your account at https://account.jetbrains.com/licenses and login with
that account inside your application to automatically use that license. If
you don't see the license, contact me or Sam Reed, and we will add you
right away.  On IRC: yurik or reedy, or via email.  We could also use a few
more admins for these licenses. You will no longer need to copy/paste any
licenses manually.

IDEA:  use for PHP, JavaScript, Puppets, Ruby, and other web-related
technologies
https://www.jetbrains.com/idea/

ReSharper: for C# and Microsoft Visual Studio C++
https://www.jetbrains.com/resharper/

CLion: for C++
https://www.jetbrains.com/clion/
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] X-Wikimedia-Debug, your new secret side-kick

2016-03-31 Thread Yuri Astrakhan
Isn't there a recommendation not to use the X- prefix for any new headers?
On Mar 31, 2016 12:23 PM, "Jaime Crespo"  wrote:

> On Thu, Mar 31, 2016 at 3:32 AM, Ori Livneh  wrote:
>
> > Cool? Cool.
>
> Definitely Cool.
>
> --
> Jaime Crespo
> 
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] T43327: Add page views graph(s) to MediaWiki's info action for Wikimedia wikis

2016-03-10 Thread Yuri Astrakhan
he.wiki has already enabled it by inserting Graph:PageViews template into
the info template, but yours does look cleaner. The graph will show only if
you did not modify your interface language in the Hebrew wiki, or if you
are not logged in.

https://he.wikipedia.org/w/index.php?title=%D7%A2%D7%9E%D7%95%D7%93_%D7%A8%D7%90%D7%A9%D7%99&action=info

One thing that concerns me though - you seem to get all the Vega and d3
code and graph data during the page load, yet you only show the graph on
click. That seems to take a lot of extra time and is very wasteful. Also, it
doesn't work for older browsers.
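
Deferring the heavy libraries until the click should be cheap with
ResourceLoader. A rough sketch (the module and render function names
below are made up, not the extension's actual API):

    // Load Vega/d3 only when the user opens the graph;
    // 'ext.graph.vega' and renderPageViewGraph() are illustrative names.
    $( '.pageview-graph-toggle' ).one( 'click', function () {
        var $container = $( this );
        mw.loader.using( 'ext.graph.vega', function () {
            renderPageViewGraph( $container );
        } );
    } );

That way the first paint stays fast, and only readers who actually open
the graph pay for the download.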

On Fri, Mar 11, 2016 at 4:33 AM, Legoktm 
wrote:

> Hi!
>
> (I meant to send this out earlier but got distracted by other things :()
>
> At the Wikimedia developer summit, Addshore and I started working on a
> MediaWiki extension to resolve T43327[1], which asked for page view
> graphs to be added to action=info. The "WikimediaPageViewInfo"[2]
> extension has passed a security review, and should be deployed soon™.
>
> If you want to try it out, there's a test wiki[3] which uses the
> pageview data for en.wp.
>
> And for updates on the deployment status, please follow T125917[4].
>
> [1] https://phabricator.wikimedia.org/T43327
> [2] https://www.mediawiki.org/wiki/Extension:WikimediaPageViewInfo
> [3] http://bf-wmpageview.wmflabs.org/wiki/Taylor_Swift?action=info
> [4] https://phabricator.wikimedia.org/T125917
>
> -- Legoktm
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Data visualization for wiki

2016-02-04 Thread Yuri Astrakhan
Ivo, there are two aspects of building graphs for Wiki:
* defining a "type" or "style" of a graph
* instantiating a graph with the specific data

Defining a new graph is a relatively complex programming task - one has to
program how graph would look like using Vega syntax. Instantiating is much
easier - one has to simply provide the needed data and a few parameters
controlling the graph.  Graph:Chart is a good example
which has been copied to many wikis.  The graph is generated via a complex
Lua script, but that complexity is hidden from all the graph users.

Frederic Bolduc (CCed) has done a good job building an experimental Visual
Editor plugin for graphs, but that plugin attempts to do both tasks -
define the new graph type and allow for users to instantiate it, and we do
not have enough resources to support both aspects well.  It would be ideal
if you helped with that extension to change it into a template editor +
galery picker. This way community may build the graphs it needs, and lists
them in a "gallery of good graphs". Visual Editor would than show a "list
of recommended graphs" as a set of icons. Editors will be able to quickly
insert a graph template and provide data for it.

On Fri, Feb 5, 2016 at 2:10 AM, Ivo Kruusamägi 
wrote:

> So, some questions:
>
> * how important could that in-wiki image editing capability be?
> * has anything been done so far?
>
> I'd like to find some topic suitable for a university student to work on. So
> here it is: free workforce for wiki. Act now to get that! Offer some
> topics. PHP related tasks preferred.
>
> Ivo
>
> 2016-02-04 14:20 GMT+02:00 Ivo Kruusamägi :
>
> > OK, this Graph extension seems rather advanced and even though not
> exactly
> > what I had in mind, somewhat similar. So may I conclude it would be
> > pointless to go into that direction? Or are there some things, that may
> > need someone to work with them? Like (making random example): building
> very
> > simple Visual Editor add-on to enable fast creation of some more simple
> > graphs (like pie charts from the enormous variety of possibilities
> > )?
> >
> > Could I get more information about need for image editors? *I'd like to
> > find something that might be suitable topic for bachelor thesis* and
> > wandering around alone inside the topic not yet known to me may not prove
> > to be fruitful.
> >
> > As for this Commons topic. Yes, something similar to Wikidata Game, but
> > directed towards Commons and not Wikidata. More in line with Estonian
> sites
> > Ajapaik  (for adding geolocation information
> > to old images), sift.pics (for determining image types) and
> > postkaart.ajapaik.ee (for digitalizing text in old postcards).
> >
> > Ivo
> >
> > 2016-02-04 5:00 GMT+02:00 MZMcBride :
> >
> >> Ivo Kruusamägi wrote:
> >> >What I'd like to see is some development, that would make it possible
> for
> >> >user to create visualizations inside MediaWiki. Something so easy that
> a
> >> >child could do that. Like this . Workflow example:
> >> 1)
> >> >user selects sth like Create Data Visualization, 2) has some selections
> >> >about cart type, colors, etc, 3) place to write down text (title, axes,
> >> >description) and 4) a table to fill in with data (values + their text
> >> >labels). That could then be saved as one revision. After that every
> other
> >> >user could edit this graph with the same selections and data tables
> just
> >> >like users edit articles and edit history is saved and easy to compare.
> >> >
> >> >Image files like this
> >> > or that
> >> ><
> >>
> https://commons.wikimedia.org/wiki/File:Tree_map_exports_2010_Estonia.svg
> >> > are ridiculous and fixes like that
> >> > are not that
> >> flexible,
> >> >pretty and easy to use as what we need. So lets move forward.
> >>
> >> Are you familiar with the "Graph" MediaWiki extension? Here's a demo:
> >> . This MediaWiki
> >> extension is deployed to Wikimedia wikis, including all the Wikipedias.
> >>
> >> >There are plenty of GPL licensed solutions that could be integrated
> with
> >> >MediaWiki. But I can't be only one thinking about this.
> >>
> >> You're not. :-)
> >>
> >> >So what should I know about that topic so that this work could really
> be
> >> >useful? I.e. how to avoid reinventing the wheel (like building
> something
> >> >already in development) and how to be sure that it could be easily
> >> >incorporated later? Who would be the perfect people to talk about this
> >> >topic?
> >>
> >> This mailing list is great for general discussion. Or we have a
> >> Phabricator installation at  where
> we
> >> track bugs 

Re: [Wikitech-l] New tutorial for interactive graphics on Wiki

2016-01-10 Thread Yuri Astrakhan
Florian, thank you!  Any help with the tutorial is greatly appreciated :)

On Sun, Jan 10, 2016 at 11:36 PM, Florian Schmidt <
florian.schmidt.wel...@t-online.de> wrote:

> Hi,
>
> I was free and edited the page:
> https://www.mediawiki.org/w/index.php?title=Extension%3AGraph%2FInteractive_Graph_Tutorial&type=revision&diff=2013042&oldid=2008358
>
> Hope that's a better wording, if not: feel free to edit it ;)
>
> Best,
> Florian
>
> -Original-Nachricht-
> Betreff: Re: [Wikitech-l] New tutorial for interactive graphics on Wiki
> Datum: 2016-01-10T20:21:40+0100
> Von: "Purodha Blissenbach" 
> An: "wikitech-l@lists.wikimedia.org" 
>
> Hi,
>
> in the 1st translation unit, there is a "klick here" type of link.
> There is a section explaining why that is a no-go:
> https://www.mediawiki.org/wiki/Localisation#Use_meaningful_link_anchors
>
> Purodha
>
> On 08.01.2016 18:49, Johan Jönsson wrote:
> > On Thu, Dec 31, 2015 at 11:13 AM, Szymon Grabarczuk wrote:
> >> Great! now, I think it should be marked for translation.
> >
> > Agreed. And done.
> >
> > //Johan Jönsson
> > --
> >
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New tutorial for interactive graphics on Wiki

2016-01-01 Thread Yuri Astrakhan
We could also use it for non-maps - basically the slider is a dynamic
parameter - and we could have multiple such parameters, and show data as
any kind of a picture - a chart, map, or even a state of animation.

On Fri, Jan 1, 2016 at 10:47 PM, Pine W <wiki.p...@gmail.com> wrote:

> I've played with this more, and I like how it works. I can imagine people
> using this map with other data sets like HDI, GDP per capita, inflation,
> Wikipedia readership, and WMF fundraising (:
>
> Pine
>
> On Thu, Dec 31, 2015 at 2:13 AM, Szymon Grabarczuk <
> tar.locesil...@gmail.com
> > wrote:
>
> > Great! now, I think it should be marked for translation.
> >
> > On 31 December 2015 at 08:38, Pine W <wiki.p...@gmail.com> wrote:
> >
> > > The tutorial looks nice! I will take a look when I am less distracted
> by
> > > finance reports. Thank you for working on this.
> > >
> > > Pine
> > > On Dec 30, 2015 22:37, "Yuri Astrakhan" <yastrak...@wikimedia.org>
> > wrote:
> > >
> > > > I just finished writing a tutorial on how to build interactive Vega
> > > graphs
> > > > for Wikipedia. And yes, we could build video games this way too :)
> > > >
> > > >
> > >
> >
> https://www.mediawiki.org/wiki/Extension:Graph/Interactive_Graph_Tutorial
> > > > ___
> > > > Wikitech-l mailing list
> > > > Wikitech-l@lists.wikimedia.org
> > > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > >
> >
> >
> >
> > --
> > *Szymon Grabarczuk*
> >
> > Free Knowledge Advocacy Group EU
> > Head of Research & Development Group, Wikimedia Polska
> > pl.wikimedia.org/wiki/User:Tar_Lócesilion
> <http://pl.wikimedia.org/wiki/User:Tar_L%C3%B3cesilion>
> > <http://pl.wikimedia.org/wiki/User:Tar_L%C3%B3cesilion>
> > <http://pl.wikimedia.org/wiki/User:Tar_L%C3%B3cesilion>
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New tutorial for interactive graphics on Wiki

2015-12-30 Thread Yuri Astrakhan
Well, it already supports geo projections, with mouse-draggable map
spinning [1]... We could implement some 3d models if we can hide the
out-of-sight polygons :)

[1] http://vega.github.io/vega-editor/?mode=vega&spec=map-params
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] New tutorial for interactive graphics on Wiki

2015-12-30 Thread Yuri Astrakhan
I just finished writing a tutorial on how to build interactive Vega graphs
for Wikipedia. And yes, we could build video games this way too :)

https://www.mediawiki.org/wiki/Extension:Graph/Interactive_Graph_Tutorial
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Sharing JS code between NodeJS and browser

2015-12-18 Thread Yuri Astrakhan
For JS gurus - what is the best way to share JavaScript library code
between the NodeJS and browser?  The shared library will need to use
$.extend()-like functionality, URL parsing and reconstructing, and logging.

How should the code be organized? What are the recommended tools to
automate the process?

Thanks!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Sharing JS code between NodeJS and browser

2015-12-18 Thread Yuri Astrakhan
Trevor and Daniel, thanks for your reply.

How would you structure the code that is to be shared?  Should it be a
separate NPM package, referenced from the extension package.json via git
url, and have a small file in the extension's lib/ dir with a oneliner -
 "require('...')"  that browserify could pick up? And have a script command
in package.json to build that file?

There are many implementations of URL parsing - from mw.Url to
require('url').  They differ slightly, but that is not the main problem -
how should the shared code use the right one?  Should they be passed in as
function params? Should there be some "lib init" that sets the lib's
global functions?
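
To make the question concrete, the layout I am currently leaning towards
(just a sketch - every name below is invented) is a plain CommonJS module
that browserify wraps for the browser, with the environment-specific bits
injected once at init time:

    // shared/index.js - plain CommonJS; runs in Node as-is, and in the
    // browser via:  browserify --standalone sharedLib shared/index.js
    'use strict';

    var services = {};

    // "lib init": the host passes in its own URL parser and logger, so
    // the shared code never chooses between mw.Url and require('url')
    function init( env ) {
        services.parseUrl = env.parseUrl;
        services.log = env.log;
    }

    // minimal $.extend()-like merge, to avoid a jQuery dependency
    function extend( target, source ) {
        Object.keys( source ).forEach( function ( key ) {
            target[ key ] = source[ key ];
        } );
        return target;
    }

    module.exports = { init: init, extend: extend };

But I am not sure injection is better than shimming, hence the question.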

On Fri, Dec 18, 2015 at 8:22 PM, Daniel Friesen 
wrote:

> On 2015-12-18 8:41 AM, Trevor Parscal wrote:
> > Be careful when doing this with NPM modules, as their contents are
> subject
> > to change, and only their index file is configured and trying to
> > automatically know the paths and inclusion order is more of a mad science
> > than an art. Your best bet would be hard-coding and using very specific
> > versions in your package.json to protect from unexpected changes dues to
> > subtle NPM module version differences.
> npm modules are modules, not scripts. Even if you knew their execution
> order you can't just "include" them in any order at all.
>
> Using browserify to build a standalone script with a UMD that exposes a
> global might be the best bet.
> Though you'll have to be wary of requiring a module that pulls in a
> bunch of dependencies.
>
> > As for needing some $.extend, URL parsing and reconstructing, and
> logging.
> > I'm assuming you mean taking client code to the server where jQuery and
> > common browser functionally might not be present. There are many NPM
> > modules that provide shims for these things, including full-on
> server-side
> > jQuery. Usually the differences are subtle if any. Members of the Paraoid
> > team are very familiar with this space and are probably good people to
> talk
> > to about this.
> URL - Built in to node. Browserify comes with a browser compatible
> version of the module for when you require('url') it.
>
> extend:
> - lodash.merge (as part of lodash or the dedicated 'lodash.merge' module)
> - Some people use xtend or extend
> - There is also a new native Object.assign built into ES6, which you
> could include a simple polyfill for
>
> Logging varies very much by what you need. So there's winston, debug, and
> a bunch of others.
>
> ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Wmfall] Community Wishlist Survey: Top 10 wishes!

2015-12-16 Thread Yuri Astrakhan
>
> #3. Central global repository for templates, gadgets and Lua modules  (87)
>

I would love to participate in this - I feel it would bring different
language communities much closer together.  And great for Graphs & Maps.


> #7. Pageview Stats tool  (70)
>

We could even do it on-wiki like here - it shows
how to draw a graph from the pageviews API, and this graph extension can add
HTML controls to it.

Awesome list, thanks for getting more community involvement!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Interactive graphs & charts are now live

2015-12-14 Thread Yuri Astrakhan
The <graph> tag has just been upgraded to Vega 2.0, adding interactivity
support.  See live examples here.

Vega 1.0 is still available, but it is obsolete.  Please migrate all your
graphs to the new system so we can turn 1.0 off. See how to migrate:
https://github.com/vega/vega/wiki/Upgrading-to-2.0

Unless the graph definition contains a "version" value, the default is
still 1. After I tag all existing graphs, I will switch the default to
version 2.

See Vega main page for more: http://vega.github.io/vega/
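
For reference, a minimal graph that already opts into the new version
could look like this (the data and the spec are purely illustrative):

    <graph>
    {
      "version": 2, "width": 300, "height": 120,
      "data": [ { "name": "table",
        "values": [ { "x": 1, "y": 3 }, { "x": 2, "y": 7 }, { "x": 3, "y": 5 } ] } ],
      "scales": [
        { "name": "x", "type": "linear", "range": "width",
          "domain": { "data": "table", "field": "x" } },
        { "name": "y", "type": "linear", "range": "height",
          "domain": { "data": "table", "field": "y" } }
      ],
      "marks": [
        { "type": "line", "from": { "data": "table" },
          "properties": { "enter": {
            "x": { "scale": "x", "field": "x" },
            "y": { "scale": "y", "field": "y" },
            "stroke": { "value": "steelblue" } } } }
      ]
    }
    </graph>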
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Appreciation thread, 2015

2015-12-07 Thread Yuri Astrakhan
* JGirault, TheDJ, Milimetric, and Ferdbold for their hard work on graphs
* MaxSem, Akosiaris, bblack and johannesk_wmde for the maps

Everyone - for helping )

On Tue, Dec 8, 2015 at 2:38 AM, Isarra Yos  wrote:

> Perhaps a thanks to the people we don't think about, who we often never
> even work with. Volunteers and third-party developers who contribute to and
> maintain repositories not part of the big tickets. People who take the time
> to file bugs, and triage them, and sort things out. The ops who keep things
> running, even when on a plane. Those who comment on proposals, finding
> problems, expressing support. Liaisons, developers, designers who take the
> time to reach out and talk to the people affected by what they do, and take
> the time to listen.
>
> And also a thanks to everyone on this list for keeping things fairly
> sensible and productive and not turning it into a massive drama fest every
> tuesday.
>
>
> On 05/12/15 08:25, Legoktm wrote:
>
>> Hi,
>>
>> It's been a while since we had one of these, and 2015 has been a very
>> busy year! :-)
>>
>> I'll quote Sumana from last time:
>>
>> How about a little email thread for us to say nice things about each
>>> other?  Rules: be kind, thank someone, and say why you're thanking
>>> them.
>>> [If you don't get thanked and you feel mopey, email me and I'll
>>> comfort you. :-)]
>>>
>>
>> I'll start off with thank yous to:
>> * Addshore, for his awesome work on CatWatch and help with recent
>> ExtensionDistributor improvements
>> * YuviPanda, for coming up with unique and useful tools for Wikimedians,
>> and then following through and building them
>> * S Page, for consistently nagging me and other developers to improve
>> documentation
>>
>> -- Legoktm
>>
>> ___
>> Wikitech-l mailing list
>> Wikitech-l@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Flushing cached parser output after preview

2015-11-28 Thread Yuri Astrakhan
https://phabricator.wikimedia.org/T119779
The Graph extension generates different HTML output depending on the isPreview
parser option, but if a user previews a page and saves it right after without
any changes, the parser reuses the previous output. Is there a way to force
the parser to regenerate on save? Thanks!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Vision for Wikipedia

2015-11-26 Thread Yuri Astrakhan
I would like to share my vision for Wikipedia's future, as well as some
steps that could help us get there. I hope you find it useful. Please share
your feedback and ideas.

Vision: https://meta.wikimedia.org/wiki/User:Yurik/I_Dream_of_Content
Implementation:
https://meta.wikimedia.org/wiki/User:Yurik/From_Dream_to_Reality

Thanks!

P.S. The first link was previously shared on a different mailing list.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] JetBrains IDE licenses for MW developers - php, js/node, python, C/C++, puppet, ruby, ...

2015-10-19 Thread Yuri Astrakhan
Developers, we now have licenses for JetBrains' IDEA Ultimate
(JavaScript, PHP, Python, ...) and CLion (C/C++).  The IDE supports debugging in
Vagrant and on-the-fly static code analysis.

If you are a volunteer developer, and want to use an IDE, contact me for a
license. Those with access to the office wiki may register directly at the
licenses page.

Please try it for a bit before getting a license -- just in case you don't
like it.

P.S. This is only for those who want JetBrains' IDE. If you are happy with
Vim, good for you. If you like Atom or Sublime, great. If notepad is the
best text editor of all times, all the power to you.  Bikeshedding should
go into a separate thread ))
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Deprecate $wgEnableAPI?

2015-10-13 Thread Yuri Astrakhan
+1
On Oct 14, 2015 02:49, "Erik Bernhardson" 
wrote:

> +∞. This switch can only make things more complicated, and I highly doubt
> developers are taking this into consideration when writing new features. It
> seems like a 'please break everything' switch.
>
> On Tue, Oct 13, 2015 at 3:49 PM, Max Semenik 
> wrote:
>
> > Hey, I've created https://phabricator.wikimedia.org/T115414 to remove
> > $wgEnableAPI. Quoting the bug:
> >
> > By now, MediaWiki is severely crippled without the API, with most of
> > interesting extensions relying on the API, it becomes more and more
> > problematic to disable it. Now it's probably fair to say that wikis that
> do
> > that are just shooting their own feet off for no good reason. Therefore,
> I
> > propose to officially declare that MW does not support working without
> the
> > API and remove this very setting.
> >
> > Please state your opinions on this.
> >
> > --
> > Best regards,
> > Max Semenik ([[User:MaxSem]])
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing the launch of Maps

2015-09-18 Thread Yuri Astrakhan
Please file a phabricator task, and add maps and operations on it
On Sep 19, 2015 3:21 AM, "Yongmin Hong"  wrote:

> On Sep 18, 2015, 4:27 AM, "Tomasz Finc" wrote:
> >
> > The Discovery Department has launched an experimental tile and static
> maps
> > service available at https://maps.wikimedia.org.
> >
> > Using this service you can browse and embed map tiles into your own tools
> > using OpenStreetMap data. Currently, we handle traffic from *.wmflabs.org
> > and *.wikivoyage.org (referrer header must be either missing or set to
> > these values) but we would like to open it up to Wikipedia traffic if we
> > see enough use. Our hope is that this service fits the needs of the
> > numerous maps developers and tool authors who have asked for a WMF hosted
> > tile service with an initial focus on WikiVoyage.
>
> It is sad that incubator.wikimedia.org is not included. (Wikivoyages not
> ready for their own wikis reside on this domain under the prefix Wy/**, so
> the Korean one becomes [[Wy/ko]].)
>
> - snip -
>
> --
> revi
> https://revi.me
> -- Sent from Android --
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Announcing the launch of Maps

2015-09-17 Thread Yuri Astrakhan
The STYLE of the map is in https://github.com/kartotherian/osm-bright.tm2
which is a fork from https://github.com/mapbox/mapbox-studio-osm-bright.tm2
-- released by Mapbox under a 3 clause BSD license

(IANAL).

On Thu, Sep 17, 2015 at 11:21 PM, Max Semenik  wrote:

> The underlying data is ODbL which is properly attributed per OSM
> guidelines. The license for tiles themselves is a different issue, OSM
> decided to CC them, we're not claiming any additional copyright so far,
> until legal tells us in https://phabricator.wikimedia.org/T111224
>
> On Thu, Sep 17, 2015 at 12:40 PM, Ryan Kaldari 
> wrote:
>
> > What is the license for the map tiles, i.e. the cartography rather than
> the
> > data? Open Street Maps uses CC-BY-SA, although they don't really make it
> > clear who is supposed to be attributed. It looks like our tiles are
> > different though.
> >
> > On Thu, Sep 17, 2015 at 12:26 PM, Tomasz Finc 
> wrote:
> >
> > > The Discovery Department has launched an experimental tile and static
> > maps
> > > service available at https://maps.wikimedia.org.
> > >
> > > Using this service you can browse and embed map tiles into your own
> tools
> > > using OpenStreetMap data. Currently, we handle traffic from *.wmflabs.org
> > > and *.wikivoyage.org (referrer header must be either missing or set to
> > > these values) but we would like to open it up to Wikipedia traffic if
> we
> > > see enough use. Our hope is that this service fits the needs of the
> > > numerous maps developers and tool authors who have asked for a WMF
> hosted
> > > tile service with an initial focus on WikiVoyage.
> > >
> > > We'd love for you to try our new service, experiment writing tools
> using
> > > our tiles, and giving us feedback <
> > > https://www.mediawiki.org/wiki/Talk:Maps> .
> > > If you've built a tool using OpenStreetMap-based imagery then using our
> > > service is a simple drop-in replacement.
> > >
> > > Getting started is as easy as
> > > https://www.mediawiki.org/wiki/Maps#Getting_Started
> > >
> > > How can you help?
> > >
> > > * Adapt your labs tool to use this service - for example, use Leaflet
> js
> > > library and point it to https://maps.wikimedia.org
> > > * File bugs in Phabricator
> > > 
> > > * Provide us feedback to help guide future features
> > > 
> > > * Improve our map style <
> https://github.com/kartotherian/osm-bright.tm2>
> > > * Improve our data extraction
> > > 
> > >
> > > Based on usage and your feedback, the Discovery team
> > >  will decide how to proceed.
> > >
> > > We could add more data sources (both vector and raster), work on
> > additional
> > > services such as static maps or geosearch, work on supporting all
> > > languages, switch to client-side WebGL rendering, etc. Please help us
> > > decide what is most important.
> > >
> > > https://www.mediawiki.org/wiki/Maps has more about the project and
> > related
> > > Maps work.
> > >
> > > == In Depth ==
> > >
> > > Tiles are served from https://maps.wikimedia.org, but can only be
> > accessed
> > from any subdomains of *.wmflabs.org and *.wikivoyage.org.
> > Kartotherian
> > > can produce tiles as images (png), and as raw vector data (PBF Mapbox
> > > format or json):
> > >
> > > .../{source}/{zoom}/{x}/{y}[@{scale}x].{format}
> > >
> > > Additionally, Kartotherian can produce snapshot (static) images of any
> > > location, scaling, and zoom level with
> > >
> > > .../{source},{zoom},{lat},{lon},{width}x{height}[@{scale}x].{format}.
> > >
> > > For example, to get an image centered at 42,-3.14, at zoom level 4,
> size
> > > 800x600, use
> > > https://maps.wikimedia.org/img/osm-intl,4,42,-3.14,800x600.png
> > > (copy/paste the link, or else it might not work due to referrer
> > > restriction).
> > >
> > > Do note that the static feature is highly experimental right now.
> > >
> > > We would like to thank WMF Ops (especially Alex Kosiaris, Brandon
> Black,
> > > and Jaime Crespo), services team, OSM community and engineers, and the
> > > Mapnik and Mapbox teams. The project would not have completed so fast
> > > without you.
> > >
> > > Thank You
> > >
> > > --tomasz
> > > ___
> > > Wikitech-l mailing list
> > > Wikitech-l@lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> > ___
> > Wikitech-l mailing list
> > Wikitech-l@lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
>
>
>
> --
> Best regards,
> Max Semenik ([[User:MaxSem]])
> ___
> Wikitech-l mailing list
> 

Re: [Wikitech-l] Announcing the launch of Maps

2015-09-17 Thread Yuri Astrakhan
The bracketing was introduced to reduce the DDoS attack surface and
increase the cache hit ratio. This means that we could in theory do all the
magic of bracketing in Varnish (to keep the cache hit rate high), or we
could force the clients to do the logic.  The problem with the Varnish approach
is that it puts business-level logic into Varnish's brittle C-style
configs.
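
To illustrate the client-side option, the bracketing itself is only a few
lines (the scale list below is an assumption - the authoritative set
lives in the server config):

    // Pick the smallest server-rendered scale factor >= the device's,
    // falling back to the largest one for very dense displays.
    var AVAILABLE_SCALES = [ 1, 1.3, 1.5, 2, 2.6, 3 ]; // assumed set

    function bracketScale( deviceScale ) {
        for ( var i = 0; i < AVAILABLE_SCALES.length; i++ ) {
            if ( AVAILABLE_SCALES[ i ] >= deviceScale ) {
                return AVAILABLE_SCALES[ i ];
            }
        }
        return AVAILABLE_SCALES[ AVAILABLE_SCALES.length - 1 ];
    }

    // e.g. bracketScale( window.devicePixelRatio || 1 ) -> 1.5 on a 1.4x screen

Doing the same in Varnish is possible, but it buries the list in VCL,
which is exactly the business-logic leak I would rather avoid.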

On Thu, Sep 17, 2015 at 11:58 PM, Brion Vibber 
wrote:

> Awesome sauce! Congrats to everyone who's been working on this.
>
>
> On Thu, Sep 17, 2015 at 12:26 PM, Tomasz Finc  wrote:
>
> > == In Depth ==
> >
> > Tiles are served from https://maps.wikimedia.org, but can only be
> accessed
> > from any subdomains of *.wmflabs.org and *.wikivoyage.org.
> Kartotherian
> > can produce tiles as images (png), and as raw vector data (PBF Mapbox
> > format or json):
> >
> > .../{source}/{zoom}/{x}/{y}[@{scale}x].{format}
> >
>
> Just a note -- the scale factor is for high-density displays, which are now
> standard on most phones and increasingly common on laptops (1.5x is common
> on small Windows laptops with 1080p displays). However the available scale
> factors are hardcoded, so you can't just take an arbitrary device scale
> factor (window.devicePixelRatio in HTML5) and feed it in, as it might not
> match and you'll get HTTP errors out.
>
> I've got a provisional patch in for the example front-end which manually
> brackets to the available settings:
> https://github.com/kartotherian/kartotherian/pull/13
>
> It'd be awesome to either expose the available scaling factors as a list
> via JSON API, or move the bracketing logic into the server side so you can
> send a query with your actual scale factor and get back one that will work.
>
> (If available scaling factors depend on the layer source being viewed, this
> might be more complicated.)
>
>
> > We would like to thank WMF Ops (especially Alex Kosiaris, Brandon Black,
> > and Jaime Crespo), services team, OSM community and engineers, and the
> > Mapnik and Mapbox teams. The project would not have completed so fast
> > without you.
> >
>
> Seconded. :)
>
> -- brion
> ___
> Wikitech-l mailing list
> Wikitech-l@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Geohack tools

2015-08-07 Thread Yuri Astrakhan
Pine, you are right. That list is not very useful, and instead should look
something like this:

https://en.wikivoyage.org/wiki/Salzburg#Get_around

The bad news is that the tile service it uses is hosted on labs, which
means it cannot scale to regular Wikipedia usage levels. Plus there
might be a potential policy problem there - labs content loading by default on
every page visit without the user's consent.

The good news is that we are very close to launching a full-blown WMF
production-hosted tile service, based on the wonderful data from OSM.

See general info and some ideas people have proposed -
https://www.mediawiki.org/wiki/Maps  (feel free to add more)


On Fri, Aug 7, 2015 at 11:23 PM, Gergo Tisza gti...@wikimedia.org wrote:

 On Thu, Aug 6, 2015 at 11:01 PM, Pine W wiki.p...@gmail.com wrote:

  I just now realized how powerful these tools are when I started clicking
  around.
 
 
 
 https://tools.wmflabs.org/geohack/geohack.php?pagename=File%3AWhite-cheeked_Starling_perching_on_a_rock.jpgparams=34.610576_N_135.540542_E_globe:Earth_class:object_language=en
 
  Is there any chance of integrating some of these tools more directly onto
  Wikipedia pages, and into mobile web/mobile apps?
 

 There is an ongoing project
 https://phabricator.wikimedia.org/project/profile/1127/ to integrate
 maps
 into MediaWiki; that's probably the best place to catalog the use cases of
 Geohack and decide which can and need to be supported by the new tools.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-29 Thread Yuri Astrakhan
Jonathan, in theory, if you never use query continuation, you don't need to
do anything with your code - you just need to tell the community that the
bot is ok to use the new continuation system. This way no one will block the
bot just in case.  That is much better than using rawcontinue, because
that flag will keep telling us someone needs the old system.

On Mon, Jun 29, 2015 at 8:01 PM, Jonathan Morgan jmor...@wikimedia.org
wrote:

 I run GrantsBot, which is listed here.

 I've updated all GrantsBot API requests to use rawcontinue=1. But as I read
 through this thread, it's not clear to me that that's the problem. I can't
 find a single instance in my code where I'm actually continuing a query.
 Does this breaking change only apply be an issue if you were using
 querycontinue in the first place?

 I'm sure this has been covered elsewhere or I'm missing something obvious,
 but I'd be grateful for specific confirmation :)

 Best,
 Jonathan

 On Wed, Jun 10, 2015 at 12:01 PM, Brad Jorsch (Anomie) 
 bjor...@wikimedia.org wrote:

  On Wed, Jun 3, 2015 at 12:43 PM, Brad Jorsch (Anomie) 
  bjor...@wikimedia.org
   wrote:
 
   On Wed, Jun 3, 2015 at 7:29 AM, John Mark Vandenberg jay...@gmail.com
 
   wrote:
  
   If possible, could you compile a list of bots affected at a lower
   threshold - maybe 1,000.  That will give us a better idea of the scale
   of bots operators that will be affected when this lands - currently in
   one months time.
  
  
   I already have the list of *accounts* affected: there are 510 with
  between
   1000 and 10000 hits. Of those, 454 do not contain bot (case
   insensitively), so they might be human users with user scripts, or AWB
 if
   that's not fixed (someone please check!), or the like. For comparison,
 in
   the over-10000 group there were 30 such that I filtered out.
  
   I'll want to check with Legal to make sure the additional release of
   account names is still compliant with the privacy policy (I'm almost
 but
   not entirely sure it would be ok).
  
 
  Legal recommended we only post the list of bots, not the human accounts.
  These are:
 
  AHbot
  AsuraBot
  Autobot
  BattyBot
  Bibcode_Bot
  Bottuzzu
  ChenzwBot
  Cydebot
  DickensBot
  DrTrigonBot
  DSisyphBot
  DumbBOT
  DYKHousekeepingBot
  DYKUpdateBot
  FBot
  GiftBot
  GrantsBot
  HangsnaBot
  HangsnaBot2
  ImageRemovalBot
  InceptionBot
  JackBot
  JBot
  Jimmy-bot
  Kenrick95Bot
  KrBot
  KrinkleBot
  LivingBot
  MalafayaBot
  MaraBot
  MauroBot
  MBHbot
  Mr.Z-bot
  NowCommons-Sichtbot
  Olafbot
  PereBot
  PseudoBot
  QianBot
  Rainbot
  Reports_bot
  RFF-Bot
  Salebot
  Sanjeev_bot
  SemperBlottoBot
  SergoBot
  SHBot
  Steenthbot
  TurkászBot
  UWCTransferBot
  VlsergeyBot
  VriuBot
  YiFeiBot
  Yobot
  ZacheBot
  Zlobot
 
  Note this list is still from May 23–29; a bot appearing in this list may
  have been updated since then.
 
 
  --
  Brad Jorsch (Anomie)
  Software Engineer
  Wikimedia Foundation
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Jonathan T. Morgan
 Senior Design Researcher
 Wikimedia Foundation
 User:Jmorgan (WMF) https://meta.wikimedia.org/wiki/User:Jmorgan_(WMF)
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-17 Thread Yuri Astrakhan
On Wed, Jun 17, 2015 at 7:44 PM, John Mark Vandenberg jay...@gmail.com
wrote:


 The API currently emits a warning if a query continuation mode isn't
 selected.

 I guess on July 1 the API could emit an error, and not return any query
 data.
 Then the data isnt going to cause weird behaviour - it will break,
 properly.


Hmm, this is actually an interesting idea - would it make sense to error on
missing continue or rawcontinue for all action=query for about a month
or two, so that everyone notices it right away and gets updated, and then
resume with the new behavior?
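
For anyone still updating a client, the new mode really is just a blind
merge loop. A sketch in browser JavaScript (the endpoint and the query
parameters are only an example):

    // New-style continuation: start with continue=, then merge whatever
    // comes back in response.continue into the request and repeat.
    async function queryAll( baseParams ) {
        var params = Object.assign( { format: 'json', continue: '' }, baseParams );
        var pages = [];
        for ( ;; ) {
            var url = 'https://en.wikipedia.org/w/api.php?' +
                new URLSearchParams( params ).toString();
            var res = await ( await fetch( url ) ).json();
            pages.push( res.query );
            if ( !res.continue ) {
                return pages;
            }
            // no module-specific logic - just merge and ask again
            Object.assign( params, res.continue );
        }
    }

    // e.g. queryAll( { action: 'query', list: 'allpages', aplimit: 'max' } )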
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] [Mediawiki-api-announce] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-04 Thread Yuri Astrakhan
Would it make sense to introduce a per-user-agent API blacklist?  This way
we could prevent accidental usage of the broken tools.

On Thu, Jun 4, 2015 at 4:19 PM, Petr Bena benap...@gmail.com wrote:

 * 3.1.5 not 3.15, sorry for confusion :P

 On Thu, Jun 4, 2015 at 3:19 PM, Petr Bena benap...@gmail.com wrote:
  Out of curiosity, this may also break all versions of huggle that are
  older than 3.15, which was released sometime around November 2014.
 
  I strongly recommend you to upgrade if you are using such an ancient
  version. I don't know what is going to happen, but it may break.
  Unless someone patches the legacy huggle version, it will probably
  remain defunct.
 
 
  On Wed, Jun 3, 2015 at 11:23 PM, Maarten Dammers maar...@mdammers.nl
 wrote:
  Hi,
 
  Brad Jorsch (Anomie) schreef op 3-6-2015 om 18:43:
 
  not an announcement
 
  Could you please stop using the announce list for chatter? Probably
 better
  to put the announce list on moderation.
 
  Maarten
 
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-03 Thread Yuri Astrakhan
MZ:  I feel that former MediaWiki Web API maintainers should actively pay
attention to which mailing lists they're posting to. ;-)  I doubt you
intended to send this message to mediawiki-api-announce.
Mz, I don't think I ever spent much time maintaining it :))  But yes, good
point, reply all is evil at times :)

MZ: re why a minimalistic lib - for most APIs out there, people tend to write
a reference implementation - code that explains to would-be lib authors how
to use the API. It sets up preferred usage patterns and guidelines. We never had
that for the API. This resulted in a large number of API libs that vary in
how they use the API. Pywikibot is a powerful platform, but is too big for many
use cases (as discussed in Lyon). Hence the need.

SW:  Most operators are volunteers and don't have time to change the code
every month because there is a change in the API.
* I might agree about every month, but we are talking about a feature that
has been out for 2 years...

 The old one was imho perfect.
* Most API users imho would laugh at this statement. See the roadmap
https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap which
came as a result of analysing numerous bugs & wishes filed against the API,
such as returning {} instead of [] for empty values, inconsistencies, the
silly '*' value for text, and many other problems that have accumulated
over the years. The API might work for you, but it does not support
multilingual error messaging, it overcomplicates the process of writing
JavaScript clients, and it does not cache easily. In short, lots of problems.

 Was the new API coded by WMF or by volunteers?
* I wrote the original API as a volunteer (took about a year in 2005/6). I
recently coded the continuation as a volunteer, and Brad has improved it as
a WMF employee. I later became a full-time WMF employee as well, but my
project was Zero, not the API. As a volunteer, over 2 years ago I wrote a huge API
improvement roadmap
https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap that Brad
picked up and greatly extended, and he is driving it forward with amazing
improvements. From what I know, Brad has now been officially moved into the
position of working on the API. In other words, the employee vs volunteer
line is very fuzzy. No point in splitting it that way.

TLDR;

In short, we need to make improvements if we want to move forward in
turning the API into a platform, with flexible JavaScript-driven interfaces
such as Visual Editor. To allow creative uses that go beyond AWB, that
support complex interactions, and not just the way for bots to make edits.
Unfortunately, that means that despite our best efforts, very infrequently,
some bots must be updated.

If the bot author cannot add a simple one-line change rawcontinue=1 once
in two years because of their busy personal life, I don't think that person
should be making automatic edits to Wikipedia - because sometimes bots make
mistakes that require substantially more time to fix. I would not
trust Wikipedia edits to a bot runner under such circumstances.  If the bot
runner is not a programmer, they should get the latest version of their
bot. If there is no up-to-date code because no one is maintaining it, it
again should not be accessing Wikipedia - we sometimes discover security
bugs that require fixing, a bot may call the wiki too often, or the content
structure may change - e.g. the introduction of Wikidata for interwiki
links required all interwiki bots to be updated.

Wikipedia is a living, developing ecosystem, and it does require updates from
all parties accessing it. People accessing Wikipedia from older
browsers discover that they can no longer do everything they used to many
years ago - because we now use newer browser features, and fall back into
basic mode for older browsers. Please participate in the evolution of the
project.


On Wed, Jun 3, 2015 at 5:22 PM, Steinsplitter Wiki 
steinsplitter-w...@live.com wrote:

 Most operators are volunteers and don't have time to change the code every
 month because there is a change in the API. Because of this devs should
 keep the API backward-compatible.
 Also wondering why we need this new API. The old one was imho perfect.

 Was the new API coded by WMF or by volunteers?

  I feel that bot operators should actively pay attention to the technical
  aspects of the community and the mailing lists.
 Sorry, I disagree. Bot operators are volunteers and not paid staffers.
 Most of them have a job and a real life.

 -- Steinsplitter

  Date: Wed, 3 Jun 2015 14:50:48 +0300
  From: yastrak...@wikimedia.org
  To: wikitech-l@lists.wikimedia.org
  CC: mediawiki-api-annou...@lists.wikimedia.org
  Subject: Re: [Wikitech-l] API BREAKING CHANGE: Default continuation mode
 for action=query will change at the end of this month
 
  I feel that bot operators should actively pay attention to the technical
  aspects of the community and the mailing lists. So, the bot operator who
  never updates their software, doesn't pay 

Re: [Wikitech-l] [Mediawiki-api] API BREAKING CHANGE: Default continuation mode for action=query will change at the end of this month

2015-06-03 Thread Yuri Astrakhan
I would like to point out that it might be a good idea to add
formatversion=1 for anyone who wants to lock the current formatting in
place.

On Wed, Jun 3, 2015 at 8:13 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:

 On Wed, Jun 3, 2015 at 10:04 AM, Brian Gerstle bgers...@wikimedia.org
 wrote:

 My question is: why does the default behavior need to change?  Wouldn't
 continuing with the default behavior allow people to continue using the
 rawcontinue behavior for as long as we want to support it—without making
 any changes?


 The decision to change the default came out of the same concerns that led
 to the improved action=help output and some of the other work I've been
 doing lately: We want to lower the barriers for using our API, which means
 that the default shouldn't be something user-hostile.

 The raw continuation is deceptively simple: it looks straightforward, but
 if you're using it with a generator, multiple prop modules, and meta or
 list modules, your client code has to know when to ignore the returned
 continuation for the generator, when to remove a module from prop and then
 when to re-add it, and when to remove the meta or list modules. I wouldn't
 be that surprised to learn that more people have it wrong than correct if
 their code supports using prop modules with generators.

 The new continuation actually is simple: you send the equivalent of
 array_merge( $originalParams, $continueParams ) and it just works.


 Yes, some of the same could be said for making format=json&formatversion=2
 the default. In this case the formatversion=1 output is just annoying
 rather than actually hostile (although representing boolean true as a
 falsey string comes close), so at this time there's no plan to make that
 breaking change.


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation

 ___
 Mediawiki-api mailing list
 mediawiki-...@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/mediawiki-api


___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Replace [Tracking bugs] with [Projects] in Phabricator

2015-05-11 Thread Yuri Astrakhan
I would like to propose that we remove all tracking bugs, and instead use
Phabricator projects (e.g. with an Umbrella icon). Some of the benefits:

* Discoverable - projects are more intuitive and easier to add to a bug
with the auto-complete
* Manageable - within the project, tasks can be broken down into columns by
categories/priorities specific to the project
* Searchable - easier to search for tasks that belong to multiple projects
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GRAPH extension is now live everywhere!

2015-05-06 Thread Yuri Astrakhan
Aleksey, please file a bug report in phabricator with the URL to the page
containing your non-working graph - I need to see your graph before I can
figure out why it is not working.

Bawolff, very cool API usage!  I just wish we allowed API access from inside
Lua -- so much creative stuff could be done!
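
As a sketch of the API-as-data-source idea quoted below, the numbers behind
such a pie graph could be fetched like this (the category and wiki here are
just examples; the query modules used are standard API ones):

    import requests

    API = "https://commons.wikimedia.org/w/api.php"

    def subcategory_sizes(category):
        params = {
            "action": "query", "format": "json",
            "generator": "categorymembers",
            "gcmtitle": category, "gcmtype": "subcat", "gcmlimit": "50",
            "prop": "categoryinfo",  # per-category page/file/subcat counts
        }
        pages = requests.get(API, params=params).json()["query"]["pages"]
        return {p["title"]: p.get("categoryinfo", {}).get("pages", 0)
                for p in pages.values()}

    print(subcategory_sizes("Category:Media needing categories"))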

On Wed, May 6, 2015 at 6:32 AM, Brian Wolff bawo...@gmail.com wrote:

 On 5/5/15, Yuri Astrakhan yastrak...@wikimedia.org wrote:
   Starting today, editors can use the *graph* tag to include complex graphs
   and maps inside articles.
  
   *Demo:* https://www.mediawiki.org/wiki/Extension:Graph/Demo
   *Vega's demo:* http://trifacta.github.io/vega/editor/?spec=scatter_matrix
   *Extension info:* https://www.mediawiki.org/wiki/Extension:Graph
   *Vega's docs:* https://github.com/trifacta/vega/wiki
   *Bug reports:* https://phabricator.wikimedia.org/ - project tag #graph
  
   The graph tag supports template parameter expansion. There is also a Graphoid
   service to convert graphs into images. Currently, Graphoid is used in case
   the browser does not support modern JavaScript, but I plan to use it for
   all anonymous users - downloading the large JS code needed to render graphs
   is significantly slower than showing an image.
 
  Potential future growth (developers needed!):
  * Documentation and better tutorials
  * Visualize as you type - show changes in graph while editing its code
  * Visual Editor's plugin
  * Animation https://github.com/trifacta/vega/wiki/Interaction-Scenarios
 
 
  Project history: Exactly one year ago, Dan Andreescu (milimetric) and Jon
  Robson demoed Vega visualization grammar 
 https://trifacta.github.io/vega/
  usage in MediaWiki. The project stayed dormant for almost half a year,
  until Zero team decided it was a good solution to do on-wiki graphs. The
  project was rewritten, and gained many new features, such as template
  parameters. Yet, doing graphs just for Zero portal seemed silly. Wider
  audience meant that we now had to support older browsers, thus Graphoid
  service was born.
 
  This project could not have happened without the help from Dan Andreescu,
  Brion Vibber, Timo Tijhof, Chris Steipp, Max Semenik,  Marko Obrovac,
  Alexandros Kosiaris, Jon Robson, Gabriel Wicke, and others who have
 helped
  me develop,  test, instrument, and deploy Graph extension and Graphoid
  service. I also would like to thank the Vega team for making this amazing
  library.
 
  --Yurik
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l


 Hmm cool.

 One of the interesting things, is you can use the API as a data
 source. For example, here is a pie graph of how images on commons
 needing categories are divided up

 https://commons.wikimedia.org/w/index.php?title=Commons:Sandbox&oldid=159978060
 (One could even make that more general and have a template, which
 given a cat name, would give a pie graph of how the subcategories are
 divided in terms of number).

 --bawolff

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GRAPH extension is now live everywhere!

2015-05-06 Thread Yuri Astrakhan
Jon, graph is designed to support template parameter expansions, and I
found it most convenient to place each graph on a separate template page.

David, re VE - https://phabricator.wikimedia.org/T93585

On Wed, May 6, 2015 at 2:27 PM, David Gerard dger...@gmail.com wrote:

 Is there a facility to use this in VE?

 On 6 May 2015 at 12:25, Jon Robson jdlrob...@gmail.com wrote:
  I think this is great but I'm still super super concerned about the
 support
  for Embedded directly with graph. I'm concerned as if used this way
 we
  risk making wikitext even more like code and more difficult for others to
  edit. Also having it inside the page makes it really difficult to
  extract/encourage remixing of the data...
 
  On Wed, May 6, 2015 at 4:32 AM, Brian Wolff bawo...@gmail.com wrote:
 
  On 5/5/15, Yuri Astrakhan yastrak...@wikimedia.org wrote:
   Starting today, editors can use *graph* tag to include complex
 graphs
  and
   maps inside articles.
  
   *Demo:* https://www.mediawiki.org/wiki/Extension:Graph/Demo
   *Vega's demo:*
  http://trifacta.github.io/vega/editor/?spec=scatter_matrix
   *Extension info:* https://www.mediawiki.org/wiki/Extension:Graph
   *Vega's docs:* https://github.com/trifacta/vega/wiki
   *Bug reports:* https://phabricator.wikimedia.org/ - project tag
 #graph
  
    The graph tag supports template parameter expansion. There is also a
 Graphoid
   service to convert graphs into images. Currently, Graphoid is used in
  case
   the browser does not support modern JavaScript, but I plan to use it
 for
   all anonymous users - downloading large JS code needed to render
 graphs
  is
   significantly slower than showing an image.
  
   Potential future growth (developers needed!):
   * Documentation and better tutorials
   * Visualize as you type - show changes in graph while editing its code
   * Visual Editor's plugin
   * Animation 
 https://github.com/trifacta/vega/wiki/Interaction-Scenarios
  
  
   Project history: Exactly one year ago, Dan Andreescu (milimetric) and
 Jon
   Robson demoed Vega visualization grammar 
  https://trifacta.github.io/vega/
   usage in MediaWiki. The project stayed dormant for almost half a year,
   until Zero team decided it was a good solution to do on-wiki graphs.
 The
   project was rewritten, and gained many new features, such as template
   parameters. Yet, doing graphs just for Zero portal seemed silly. Wider
   audience meant that we now had to support older browsers, thus
 Graphoid
   service was born.
  
   This project could not have happened without the help from Dan
 Andreescu,
   Brion Vibber, Timo Tijhof, Chris Steipp, Max Semenik,  Marko Obrovac,
   Alexandros Kosiaris, Jon Robson, Gabriel Wicke, and others who have
  helped
   me develop,  test, instrument, and deploy Graph extension and Graphoid
   service. I also would like to thank the Vega team for making this
 amazing
   library.
  
   --Yurik
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
  Hmm cool.
 
  One of the interesting things, is you can use the API as a data
  source. For example, here is a pie graph of how images on commons
  needing categories are divided up
 
 
  https://commons.wikimedia.org/w/index.php?title=Commons:Sandbox&oldid=159978060
  (One could even make that more general and have a template, which
  given a cat name, would give a pie graph of how the subcategories are
  divided in terms of number).
 
  --bawolff
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
 
 
  --
  Jon Robson
  * http://jonrobson.me.uk
  * https://www.facebook.com/jonrobson
  * @rakugojon
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] GRAPH extension is now live everywhere!

2015-05-06 Thread Yuri Astrakhan
Sorry, I meant do NOT save when trying the dynamic graph preview.

If anyone is interested, or as a student project - a simple sandbox
implementation task: https://phabricator.wikimedia.org/T98312

On Wed, May 6, 2015 at 10:28 PM, Yuri Astrakhan yastrak...@wikimedia.org
wrote:

 You currently can use dynamic graph-as-you-type in mediawiki.org for
 Graph testing:
 * navigate to any Graph namespace graph, e.g.
 https://www.mediawiki.org/wiki/Graph:Sample
 * Go to edit source
 * Click Preview
 Now any change you make to the definition is dynamically reflected in the
 graph.  Please do save. I would love for this to eventually be done in the
 VE.

 Eugene, this is an excellent idea, but graphs are no different from other
 templates. I think this should be solved for all templates, to allow
 template sharing between projects.


 On Wed, May 6, 2015 at 8:24 PM, Eugene Zelenko eugene.zele...@gmail.com
 wrote:

 Hi, Yuri!

 I think it will be reasonable to allow transclusion of graphs (or their
 parts, for example without text labels) from Commons. This will allow
 graphs to be shared between projects.

 Eugene.

 On Wed, May 6, 2015 at 5:51 AM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:
  Jon, graph is designed to support template parameter expansions, and I
   found it most convenient to place each graph on a separate template page.
 
  David, re VE - https://phabricator.wikimedia.org/T93585
 
  On Wed, May 6, 2015 at 2:27 PM, David Gerard dger...@gmail.com wrote:
 
  Is there a facility to use this in VE?
 
  On 6 May 2015 at 12:25, Jon Robson jdlrob...@gmail.com wrote:
   I think this is great but I'm still super super concerned about the
  support
   for Embedded directly with graph. I'm concerned as if used this
 way
  we
   risk making wikitext even more like code and more difficult for
 others to
   edit. Also having it inside the page makes it really difficult to
   extract/encourage remixing of the data...
  
   On Wed, May 6, 2015 at 4:32 AM, Brian Wolff bawo...@gmail.com
 wrote:
  
   On 5/5/15, Yuri Astrakhan yastrak...@wikimedia.org wrote:
Starting today, editors can use *graph* tag to include complex
  graphs
   and
maps inside articles.
   
*Demo:* https://www.mediawiki.org/wiki/Extension:Graph/Demo
*Vega's demo:*
   http://trifacta.github.io/vega/editor/?spec=scatter_matrix
*Extension info:* https://www.mediawiki.org/wiki/Extension:Graph
*Vega's docs:* https://github.com/trifacta/vega/wiki
*Bug reports:* https://phabricator.wikimedia.org/ - project tag
  #graph
   
 The graph tag supports template parameter expansion. There is also a
  Graphoid
service to convert graphs into images. Currently, Graphoid is
 used in
   case
the browser does not support modern JavaScript, but I plan to use
 it
  for
all anonymous users - downloading large JS code needed to render
  graphs
   is
significantly slower than showing an image.
   
Potential future growth (developers needed!):
* Documentation and better tutorials
* Visualize as you type - show changes in graph while editing its
 code
* Visual Editor's plugin
* Animation 
  https://github.com/trifacta/vega/wiki/Interaction-Scenarios
   
   
Project history: Exactly one year ago, Dan Andreescu (milimetric)
 and
  Jon
Robson demoed Vega visualization grammar 
   https://trifacta.github.io/vega/
usage in MediaWiki. The project stayed dormant for almost half a
 year,
until Zero team decided it was a good solution to do on-wiki
 graphs.
  The
project was rewritten, and gained many new features, such as
 template
parameters. Yet, doing graphs just for Zero portal seemed silly.
 Wider
audience meant that we now had to support older browsers, thus
  Graphoid
service was born.
   
This project could not have happened without the help from Dan
  Andreescu,
Brion Vibber, Timo Tijhof, Chris Steipp, Max Semenik,  Marko
 Obrovac,
Alexandros Kosiaris, Jon Robson, Gabriel Wicke, and others who
 have
   helped
me develop,  test, instrument, and deploy Graph extension and
 Graphoid
service. I also would like to thank the Vega team for making this
  amazing
library.
   
--Yurik
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
   Hmm cool.
  
   One of the interesting things, is you can use the API as a data
   source. For example, here is a pie graph of how images on commons
   needing categories are divided up
  
  
 
  https://commons.wikimedia.org/w/index.php?title=Commons:Sandbox&oldid=159978060
   (One could even make that more general and have a template, which
   given a cat name, would give a pie graph of how the subcategories
 are
   divided in terms of number).
  
   --bawolff
  
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org

Re: [Wikitech-l] GRAPH extension is now live everywhere!

2015-05-06 Thread Yuri Astrakhan
You currently can use dynamic graph-as-you-type in mediawiki.org for
Graph testing:
* navigate to any Graph namespace graph, e.g.
https://www.mediawiki.org/wiki/Graph:Sample
* Go to edit source
* Click Preview
Now any change you make to the definition is dynamically reflected in the
graph.  Please do save. I would love for this to eventually be done in the
VE.

Eugene, this is an excellent idea, but graphs are no different from other
templates. I think this should be solved for all templates, to allow
template sharing between projects.


On Wed, May 6, 2015 at 8:24 PM, Eugene Zelenko eugene.zele...@gmail.com
wrote:

 Hi, Yuri!

 I think it will be reasonable to allow transclusion of graphs (or their
 parts, for example without text labels) from Commons. This will allow
 graphs to be shared between projects.

 Eugene.

 On Wed, May 6, 2015 at 5:51 AM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:
  Jon, graph is designed to support template parameter expansions, and I
   found it most convenient to place each graph on a separate template page.
 
  David, re VE - https://phabricator.wikimedia.org/T93585
 
  On Wed, May 6, 2015 at 2:27 PM, David Gerard dger...@gmail.com wrote:
 
  Is there a facility to use this in VE?
 
  On 6 May 2015 at 12:25, Jon Robson jdlrob...@gmail.com wrote:
   I think this is great but I'm still super super concerned about the
  support
   for Embedded directly with graph. I'm concerned as if used this
 way
  we
   risk making wikitext even more like code and more difficult for
 others to
   edit. Also having it inside the page makes it really difficult to
   extract/encourage remixing of the data...
  
   On Wed, May 6, 2015 at 4:32 AM, Brian Wolff bawo...@gmail.com
 wrote:
  
   On 5/5/15, Yuri Astrakhan yastrak...@wikimedia.org wrote:
Starting today, editors can use *graph* tag to include complex
  graphs
   and
maps inside articles.
   
*Demo:* https://www.mediawiki.org/wiki/Extension:Graph/Demo
*Vega's demo:*
   http://trifacta.github.io/vega/editor/?spec=scatter_matrix
*Extension info:* https://www.mediawiki.org/wiki/Extension:Graph
*Vega's docs:* https://github.com/trifacta/vega/wiki
*Bug reports:* https://phabricator.wikimedia.org/ - project tag
  #graph
   
 The graph tag supports template parameter expansion. There is also a
  Graphoid
service to convert graphs into images. Currently, Graphoid is used
 in
   case
the browser does not support modern JavaScript, but I plan to use
 it
  for
all anonymous users - downloading large JS code needed to render
  graphs
   is
significantly slower than showing an image.
   
Potential future growth (developers needed!):
* Documentation and better tutorials
* Visualize as you type - show changes in graph while editing its
 code
* Visual Editor's plugin
* Animation 
  https://github.com/trifacta/vega/wiki/Interaction-Scenarios
   
   
Project history: Exactly one year ago, Dan Andreescu (milimetric)
 and
  Jon
Robson demoed Vega visualization grammar 
   https://trifacta.github.io/vega/
usage in MediaWiki. The project stayed dormant for almost half a
 year,
until Zero team decided it was a good solution to do on-wiki
 graphs.
  The
project was rewritten, and gained many new features, such as
 template
parameters. Yet, doing graphs just for Zero portal seemed silly.
 Wider
audience meant that we now had to support older browsers, thus
  Graphoid
service was born.
   
This project could not have happened without the help from Dan
  Andreescu,
Brion Vibber, Timo Tijhof, Chris Steipp, Max Semenik,  Marko
 Obrovac,
Alexandros Kosiaris, Jon Robson, Gabriel Wicke, and others who have
   helped
me develop,  test, instrument, and deploy Graph extension and
 Graphoid
service. I also would like to thank the Vega team for making this
  amazing
library.
   
--Yurik
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
   Hmm cool.
  
   One of the interesting things, is you can use the API as a data
   source. For example, here is a pie graph of how images on commons
   needing categories are divided up
  
  
 
  https://commons.wikimedia.org/w/index.php?title=Commons:Sandbox&oldid=159978060
   (One could even make that more general and have a template, which
   given a cat name, would give a pie graph of how the subcategories are
   divided in terms of number).
  
   --bawolff
  
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
  
  
   --
   Jon Robson
   * http://jonrobson.me.uk
   * https://www.facebook.com/jonrobson
   * @rakugojon
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman

Re: [Wikitech-l] Reminder: Breaking change to API continuation planned for 1.26

2015-04-20 Thread Yuri Astrakhan
I still think that we should provide simple API clients for JS, PHP, and
Python. The JS version should support both the browser & node.js. The libs
should handle the most rudimentary API functioning like logins, warnings, and
continuation, in the way that API devs feel is best, but nothing specific
to any of the modules (e.g. it should not have a separate function to get a
list of all pages).
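
A minimal sketch of what such a shared lib could look like in Python (the
names are hypothetical, not an existing package; login and continuation
handling are omitted for brevity):

    import requests

    class MWClient:
        def __init__(self, api_url):
            self.api_url = api_url
            self.session = requests.Session()

        def get(self, **params):
            # rudimentary plumbing only - format, errors, warnings -
            # nothing specific to any one module
            params.setdefault("format", "json")
            result = self.session.get(self.api_url, params=params).json()
            if "error" in result:
                raise RuntimeError(result["error"].get("info", "API error"))
            for module, warning in result.get("warnings", {}).items():
                print("API warning (%s): %s" % (module, warning))
            return result

    client = MWClient("https://en.wikipedia.org/w/api.php")
    siteinfo = client.get(action="query", meta="siteinfo")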

On Mon, Apr 20, 2015 at 8:38 PM, Jon Robson jdlrob...@gmail.com wrote:

 Is there a phab task for that.. ? :-)


 On Mon, Apr 20, 2015 at 10:21 AM, Brad Jorsch (Anomie)
 bjor...@wikimedia.org wrote:
  On Mon, Apr 20, 2015 at 1:19 PM, Jon Robson jdlrob...@gmail.com wrote:
 
  I use mw.api, so I suspect that it handles deprecation notices - does it
  not? If not, why not?
 
 
  Because no one coded it for that framework yet?
 
 
  --
  Brad Jorsch (Anomie)
  Software Engineer
  Wikimedia Foundation
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 --
 Jon Robson
 * http://jonrobson.me.uk
 * https://www.facebook.com/jonrobson
 * @rakugojon

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Investigating building an apps content service using RESTBase and Node.js

2015-02-05 Thread Yuri Astrakhan
Tim, I like Varnish's vcl flexibility, but not the debugging aspects.
Still, +1, but could you elaborate on how you see:

* Services communicating with each other - via varnish as well, or directly?
* Do you see the routing varnish layer as non-caching, only forwarding requests
to the second-tier service-specific varnish instance that will actually
perform caching?
* Will there be any problems separating services into separate, non
mutually trusted clusters?
* Service management - varnish would only indicate the overall health of the
service. What about service-specific real-time monitored values (if
needed), logs, controlling service instances? Any thoughts on generalizing
that infrastructure?

Lastly, semi-related - my service will also need a node.js lib to access the
MW api. How should we manage common libs like that?

Thanks!
On Feb 6, 2015 4:15 AM, Tim Starling tstarl...@wikimedia.org wrote:

 On 04/02/15 16:59, Marko Obrovac wrote:
  For v1, however, we plan to
  provide only logical separation (to a certain extent) via modules which
 can
  be dynamically loaded/unloaded from RESTBase. In return, RESTBase will
  provide them with routing, monitoring, caching and authorisation out of
 the
  box.

 We already have routing, monitoring and caching in Varnish. That seems
 like a good place to implement further service routing to me.

 
 https://git.wikimedia.org/blob/operations%2Fpuppet/4a7f5ce62d9cdd1ace20ca6c489cbdb538503750/templates%2Fvarnish%2Fmisc.inc.vcl.erb
 

 It's simple to configure, has excellent performance and scalability,
 and monitoring and a distributed logging system are already implemented.

 It doesn't have authorisation, but I thought that was going to be in a
 separate service from RESTBase anyway.

 -- Tim Starling


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] SOA in .NET, or Microsoft is going open source MIT style

2015-02-04 Thread Yuri Astrakhan
flame war ahead

For those not addicted to slashdot, see here:
http://news.slashdot.org/story/15/02/04/0332238/microsoft-open-sources-coreclr-the-net-execution-engine

Licensed under MIT
(https://github.com/dotnet/coreclr/blob/master/LICENSE.TXT), plus an
additional patents promise
(https://github.com/dotnet/coreclr/blob/master/PATENTS.TXT).

If Microsoft continues in the open-source direction as before, I
suspect we just might benefit from some good-quality components as
individual services on a completely open source stack.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving our code review efficiency

2015-01-29 Thread Yuri Astrakhan
How about a simple script that creates a phabricator task after a few days (a
week?) of patch inactivity, asking someone to review that patch. It would
support assignment, let managers see each dev's review queue, and prevent
patches from falling through the cracks.

Obviously this won't be needed after we move gerrit to phabricator.
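
A rough sketch of such a script in Python, assuming Gerrit's REST change
search and Phabricator's maniphest.createtask Conduit method (the token value
and any de-duplication logic are left out):

    import json
    import requests

    GERRIT = "https://gerrit.wikimedia.org/r"
    PHAB = "https://phabricator.wikimedia.org/api/maniphest.createtask"
    PHAB_TOKEN = "api-..."  # hypothetical Conduit API token

    # open changes that nobody has touched for a week
    resp = requests.get(GERRIT + "/changes/",
                        params={"q": "status:open age:1week", "n": "25"})
    # Gerrit prefixes its JSON responses with )]}' to prevent XSSI
    changes = json.loads(resp.text.split("\n", 1)[1])

    for change in changes:
        requests.post(PHAB, data={
            "api.token": PHAB_TOKEN,
            "title": "Review stalled patch: " + change["subject"],
            "description": "%s/#/c/%d/" % (GERRIT, change["_number"]),
        })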

On Thu, Jan 29, 2015 at 1:44 PM, James Douglas jdoug...@wikimedia.org
wrote:

 This is a situation where disciplined testing can come in really handy.

 If I submit a patch, and the patch passes the tests that have been
 specified for the feature it implements (or the bug it fixes), and the code
 coverage is sufficiently high, then a reviewer has a running start in terms
 of confidence in the correctness and completeness of the patch.

 Practically speaking, this doesn't necessarily rely on the rest of the
 project already having a very high level of code coverage; as long as there
 are tests laid out for the feature in question, and the patch makes those
 tests pass,
 it gives the code reviewer a real shot in the arm.

 On Thu, Jan 29, 2015 at 1:14 PM, Jon Robson jdlrob...@gmail.com wrote:

  Thanks for kicking off the conversation Brad :-)
 
  Just mean at the moment. I hacked this together and I'm more than happy to
  iterate on this and improve the reporting.
 
  On the subject of patch abandonment: Personally I think we should be
  abandoning inactive patches. They cause unnecessary confusion to
  someone coming into a new extension wanting to help out. We may want
  to change the name of 'abandon' to 'remove from code review queue', as
  abandon sounds rather nasty and that's not at all what it actually
  does - any abandoned patch can be restored at any time if necessary.
 
 
  On Thu, Jan 29, 2015 at 1:11 PM, Brad Jorsch (Anomie)
  bjor...@wikimedia.org wrote:
   On Thu, Jan 29, 2015 at 12:56 PM, Jon Robson jdlrob...@gmail.com
  wrote:
  
   The average time for code to go from submitted to merged appears to be
   29 days over a dataset of 524 patches, excluding all that were written
   by the L10n bot. There is a patchset there that has been _open_ for
   766 days - if you look at it, it was uploaded on Dec 23, 2012 12:23 PM,
   is -1ed by me, and needs a rebase.
  
  
   Mean or median?
  
   I recall talk a while back about someone else (Quim, I think?) doing
 this
   same sort of analysis, and considering the same issues over patches
 that
   seem to have been abandoned by their author and so on, which led to
   discussions of whether we should go around abandoning patches that have
   been -1ed for a long time, etc. Without proper consideration of those
  sorts
   of issues, the statistics don't seem particularly useful.
  
  
   --
   Brad Jorsch (Anomie)
   Software Engineer
   Wikimedia Foundation
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 
 
  --
  Jon Robson
  * http://jonrobson.me.uk
  * https://www.facebook.com/jonrobson
  * @rakugojon
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Improving our code review efficiency

2015-01-29 Thread Yuri Astrakhan
Brion, I would love to use gerrit more fully (that is, until we finally
migrate! :)), but gerrit, to my knowledge, does not differentiate between a
CC (review if you want to) and a TO (I want you to +2). Having multiple cooks
means some patches don't get merged at all.  I feel each patch should be
assigned to a person who will +2 it.  This does not preclude others from
+2ing it, but it designates one person responsible for the answer.

On Thu, Jan 29, 2015 at 2:33 PM, Brion Vibber bvib...@wikimedia.org wrote:

 I'd like us to start by using the review request system already in gerrit
 more fully.

 Personally I've got a bunch of incoming reviews in my queue where I'm not
 sure the current status of them or if it's wise to land them. :)

 First step is probably to go through the existing old patches in
 everybody's queues and either do the review, abandon the patch, or trim
 down reviewers who aren't familiar with the code area.

 Rejected patches should be abandoned to get them out of the queues.

 Then we should go through unassigned patches more aggressively...

 We also need to make sure we have default reviewers for modules, which will
 be relevant also to triaging bug reports.

 -- brion
 On Jan 29, 2015 2:03 PM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:

  How about a simple script to create a phabricator task after a few days
 (a
  week?) of a patch inactivity to review that patch. It will allow assign
  to, allow managers to see each dev's review queue, and will prevent
  patches to fall through the cracks.
 
  Obviously this won't be needed after we move gerrit to phabricator.
 
  On Thu, Jan 29, 2015 at 1:44 PM, James Douglas jdoug...@wikimedia.org
  wrote:
 
   This is a situation where disciplined testing can come in really handy.
  
   If I submit a patch, and the patch passes the tests that have been
   specified for the feature it implements (or the bug it fixes), and the
  code
   coverage is sufficiently high, then a reviewer has a running start in
  terms
   of confidence in the correctness and completeness of the patch.
  
   Practically speaking, this doesn't necessarily rely on rest of the
  project
   already having a very level of code coverage; as long as there are
 tests
   laid out for the feature in question, and the patch makes those tests
  pass,
   it gives the code reviewer a real shot in the arm.
  
   On Thu, Jan 29, 2015 at 1:14 PM, Jon Robson jdlrob...@gmail.com
 wrote:
  
Thanks for kicking off the conversation Brad :-)
   
Just mean at the moment. I hacked together and I'm more than happy to
iterate on this and improve the reporting.
   
On the subject of patch abandonment: Personally I think we should be
abandoning inactive patches. They cause unnecessary confusion to
someone coming into a new extension wanting to help out. We may want
to change the name to 'abandon' to 'remove from code review queue' as
abandon sounds rather nasty and that's not at all what it actually
does - any abandoned patch can be restored at any time if necessary.
   
   
On Thu, Jan 29, 2015 at 1:11 PM, Brad Jorsch (Anomie)
bjor...@wikimedia.org wrote:
 On Thu, Jan 29, 2015 at 12:56 PM, Jon Robson jdlrob...@gmail.com
wrote:

 The average time for code to go from submitted to merged appears
 to
  be
 29 days over a dataset of 524 patches, excluding all that were
  written
 by the L10n bot. There is a patchset there that has been _open_
 for
 766 days - if you look at it it was uploaded on Dec 23, 2012 12:23
  PM
 is -1ed by me and needs a rebase.


 Mean or median?

 I recall talk a while back about someone else (Quim, I think?)
 doing
   this
 same sort of analysis, and considering the same issues over patches
   that
 seem to have been abandoned by their author and so on, which led to
 discussions of whether we should go around abandoning patches that
  have
 been -1ed for a long time, etc. Without proper consideration of
 those
sorts
 of issues, the statistics don't seem particularly useful.


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   
   
   
--
Jon Robson
* http://jonrobson.me.uk
* https://www.facebook.com/jonrobson
* @rakugojon
   
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
   
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo

Re: [Wikitech-l] From Node.js to Go

2015-01-29 Thread Yuri Astrakhan
Language fragmentation is always fun, but, as with any new language, my
concerns lie in the environment - are there enough tools to make the
advertised benefits worth it, does it have a decent IDE with smart code
completion, refactoring, and a good debugger? Does it have a
packaging/dependency system? How extensive are the standard library and
user-contributed packages? How well does it play with code written in
other languages? The list could go on.  In short - we can always try new
things as a small service ))  And yes, Rust also sounds interesting.
On Jan 29, 2015 7:22 PM, Ori Livneh o...@wikimedia.org wrote:

 (Sorry, this was meant for wikitech-l.)

 On Thu, Jan 29, 2015 at 7:20 PM, Ori Livneh o...@wikimedia.org wrote:

  We should do the same, IMO.
  http://bowery.io/posts/Nodejs-to-Golang-Bowery/
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] JSON and Scribunto

2014-11-04 Thread Yuri Astrakhan
Disclaimer: obviously, as the developer of JsonConfig & Graph, I am strongly
in favor of JSON capabilities in Lua. That being said, would the new Lua
functions be any more in danger of misuse than the powerful raw HTML
generation library, or the capability of pulling the content of many pages in
a list (both of which are already available in Lua), e.g.

    for _, page in ipairs(list_of_pages) do
        mw.title.new(page):getContent()
    end

I feel that most powerful functions could be abused, and at the end of the
day, there is no such thing as a perfect idiot-proof way, simply because
there are a lot of brilliant idiots. So instead of prohibiting a function
that is clearly demanded by many developers, let us implement it and rely
on 1) performance tuning once we see a problem, 2) usage/code review, 3)
documenting recommended usage patterns.

On Tue, Nov 4, 2014 at 3:45 PM, Brad Jorsch (Anomie) bjor...@wikimedia.org
wrote:

 There's a long-standing request for Scribunto to provide library functions
 for JSON encoding and decoding.

 The advantage of this would be improved interoperability with the growing
 number of extensions that use JSON (e.g. JsonConfig, Graph).

 The disadvantages include:
 * People may store data in JSON blobs that must be parsed where a module
 using mw.loadData would be more appropriate.
 * People may write templates that attempt to bypass the normal MediaWiki
 parameter handling mechanism in favor of passing a JSON blob, which would
 likely lead to page wikitext that is harder for end users to understand.

 So, let's discuss it: do the advantages outweigh the potential
 disadvantages? Are there additional advantages or disadvantages not yet
 mentioned?


 --
 Brad Jorsch (Anomie)
 Software Engineer
 Wikimedia Foundation
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Graphs demo

2014-08-15 Thread Yuri Astrakhan
Graph extension https://www.mediawiki.org/wiki/Extension:Graph is now
live on http://graphtest.wmflabs.org/wiki/Main_Page

Hopefully the main page will stay somewhat presentable :)
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Bikeshedding a good name for the api.php API

2014-08-06 Thread Yuri Astrakhan
API vs REST/CONTENT API? If we end up exposing the REST API via the same
entry point, there is no reason to even call it anything else. If we have a
separate entry point (why?), we could call it the REST API or the CONTENT API,
specifying that it is mostly for rendered content as opposed to internal
database data.
On Aug 6, 2014 1:04 PM, Petr Bena benap...@gmail.com wrote:

 The Chosen One's API. In short: Tchopi :P

 Do we really need to call it somehow? When you say api, 99% of
 people who know mediawiki a bit will go for api.php. Special naming
 should be used just for the other weird APIs that nobody is ever
 going to use anyway.

 Btw, why do we need to have them in secondary php files / entry points?

 On Wed, Aug 6, 2014 at 5:23 PM, Tyler Romeo tylerro...@gmail.com wrote:
  Definitely agree with this. It’s the only API that is part of core, so
 “MediaWiki API” makes sense.
  --
  Tyler Romeo
  0x405D34A7C86B42DF
 
  From: Bartosz Dziewoński matma@gmail.com
  Reply: Wikimedia developers wikitech-l@lists.wikimedia.org
  Date: August 6, 2014 at 9:52:34
  To: Wikimedia developers wikitech-l@lists.wikimedia.org
  Subject:  Re: [Wikitech-l] Bikeshedding a good name for the api.php API
 
  How about just the MediaWiki API? That's the only proper external API
  core MediaWiki has, as far as I'm aware.
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Do we have a universal font in production?

2014-07-29 Thread Yuri Astrakhan
I'm trying to render an image which uses characters from all of the
languages supported by WP. Is there a single font deployed on production
servers that includes all scripts? Any simple font would do, preferably TTF
arial-style.
Thanks!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Do we have a universal font in production?

2014-07-29 Thread Yuri Astrakhan
Thanks Federico, I used /usr/share/fonts/truetype/ttf-dejavu/DejaVuSans.ttf
but didn't see FreeSerif. DejaVuSans doesn't seem to render Hindi. Is there
a font for that?


On Wed, Jul 30, 2014 at 2:12 AM, Federico Leva (Nemo) nemow...@gmail.com
wrote:

 No such font exists. You can try DejaVu Sans or FreeSerif for best
 coverage.

 Nemo

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Attention: Vagrant XDebug has changed!

2014-07-09 Thread Yuri Astrakhan
If you have been using remote debugging in Vagrant, make sure you run
vagrant enable-role zend. Otherwise your debugger will no longer receive any
callbacks from Vagrant. This is due to the recent change to HHVM as the default.

There have been some ideas to allow both zend & hhvm to coexist at the same
time on the same vagrant on different ports and be debuggable via the same
xdebug interface, but that hasn't been done yet, nor is it known if it's
possible.

I learnt about it the hard way :(
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Wikipedia App Reboots, HTTPS, and Wikipedia Zero (was Re: [WikimediaMobile] Wikimedia Commons mobile photo uploader app updated on iOS and Android)

2014-03-12 Thread Yuri Astrakhan
Thanks Adam for driving it and getting it done :)


On Wed, Mar 12, 2014 at 4:44 PM, Adam Baso ab...@wikimedia.org wrote:

 Okay, HTTPS contributory features are now introduced on the Wikipedia Zero
 mobile web experience for operators that zero-rate HTTPS.

 Thanks Brandon Black and Yuri Astrakhan for help on the final pieces!


 On Thu, Mar 6, 2014 at 2:51 PM, Adam Baso ab...@wikimedia.org wrote:

  Another note in case you missed it earlier. If you're looking in general to
  test the Wikipedia app reboot, at the moment the Android APK can be
  downloaded from
 
 
 https://releases.wikimedia.org/mobile/android/apps-android-wikipedia-sprint25.apk
  and bugs can be filed via Bugzilla. The iOS build is currently internal
  due
  to installation limits, although simulator and debugging stuff can be
 done
  on the latest beta of Xcode.
 
  I also forgot to mention my peer Yuri's great work! The guy knuckled down
  to considerably revise Varnish scripts, reviewed and helped me improve
  code, and offered really good advice on API-app interaction. Thanks Yuri!
 
  -Adam
 
 
  On Wed, Mar 5, 2014 at 4:16 PM, Adam Baso ab...@wikimedia.org wrote:
 
   I realized I should be clear that the rebooted apps I mention are
 the
   future Wikipedia mobile apps mentioned earlier in the thread. Sorry if
  any
   confusion.
  
   -Adam
  
  
   On Wed, Mar 5, 2014 at 11:43 AM, Adam Baso ab...@wikimedia.org
 wrote:
  
   +mobile-l
  
   Greetings. Rupert, an update!
  
   The rebooted Android (Android 2.3+) and iOS (iOS 6+) apps will have
   Wikipedia Zero flourishes built into them, making it possible for the
  user
   to know whether the app access is free of data usage charges. The
  rebooted
   apps are tentatively slated for store submission at the end of the
  month.
   The flourishes will hinge on each operator's zero-rating of HTTPS.
  
   Likewise, HTTPS contributory features are about to be introduced on
 the
   Wikipedia Zero mobile web experience as well for operators that
  zero-rate
   HTTPS.
  
   WMF is starting the work with partner operators to add support for
   zero-rating of HTTPS. There will be, at least, technical hurdles
   (networking equipment architecture varies) in this transition, but
 it's
   underway! Indeed, we have some carriers that have noted support for
  HTTPS
   zero-rating already.
  
   I'm very much grateful to Brion, Yuvi, and Monte for their assistance
   while I added code to the Android and iOS platforms, and am happy to
  get to
   work with them more while putting final touches in place this month.
  Props
   to Faidon, Mark, and Brandon in Ops Engineering as well on helping us
   overcome some rather non-trivial hurdles in order to retain good
   performance and maintainability while adding HTTPS support.
  
   -Adam
  
  
   On Mon, Aug 26, 2013 at 3:34 PM, Brion Vibber bvib...@wikimedia.org
  wrote:
  
   On Mon, Aug 26, 2013 at 8:19 AM, Adam Baso ab...@wikimedia.org
  wrote:
  
Rupert, I saw your question regarding Wikipedia Zero. Wikipedia
 Zero
  is
currently targeted for the mobile web, but I'll take this question
   back to
the business team as to whether we'd be able to support zero-rating
  of
   apps
traffic at some point in the future, at least in locales where
  moderate
bandwidth is available.
   
  
   I think that once the zero-rating is switched to support HTTPS by
 using
   IP-based instead of Deep Packet Inspection-based HTTP sniffing, ISP
   partners wouldn't actually be able to distinguish between mobile web
  and
   mobile apps content unless we actively choose to make them use
 separate
   IPs
   and domain names.
  
   Especially if, as we think we're going to, the future Wikipedia
 mobile
   app
   will consist mostly of native code widgets and modules that plug into
  the
   web site embedded in a web control... it'll be loading mostly the
 same
   web
   pages from the same servers, but running a different mix of
 JavaScript.
  
   -- brion
   ___
   Wikitech-l mailing list
   Wikitech-l@lists.wikimedia.org
   https://lists.wikimedia.org/mailman/listinfo/wikitech-l
  
  
  
  
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Dynamic search results as category pages

2014-01-25 Thread Yuri Astrakhan
Of course - I am actually modeling this feature on Semantic MW. It's just
that I doubt WMF will enable SMW on the cluster any time soon :)


On Sat, Jan 25, 2014 at 4:13 PM, Jamie Thingelstad ja...@thingelstad.com wrote:

 You could do something exactly like this very, very easily using Semantic
 MediaWiki and Concepts.
 Jamie Thingelstad
 ja...@thingelstad.com
 mobile: 612-810-3699
 find me on AIM Twitter Facebook LinkedIn

 On Jan 24, 2014, at 9:55 AM, Brian Wolff bawo...@gmail.com wrote:

  On Jan 24, 2014 1:54 AM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:
 
  Hi, I am thinking of implementing a
 
  #CATQUERY query
 
  magic keyword for the category pages.
 
  When this keyword is present, the category page would execute a query
  against the search backend instead of normal category behavior and show
  results as if those pages were actually marked with this category.
 
  For example, this would allow Greek Philosophers category page to be
  quickly redefined as
  a cross-section of the greeks & philosophers categories:
 
  #CATQUERY incategory:Greek incategory:Philosopher
 
  Obviously the community will be able to define much more elaborate
  queries,
  including the ordering (will be supported by the new search backend)
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
  I like the idea in principle, but think the syntax could use
 bikeshedding ;)
 
  If we use this as category pages, I'm a little worried that people could
 get
  confused and try to add [[category:Greek philosophers]] to a page, and
  expect it to work. We would need good error handling in that situation
 
  including the ordering (will be supported by the new search backend)
 
  Cool. I didn't realize search would support this. That's a pretty big deal
  since people expect their categories alphabetized.
 
  Another cool project would be to expand intersection/Dynamic Page List
  (Wikimedia) to be able to use search as a different backend (however,
 that
  extension would need quite a bit of refactoring to get there)
 
  -bawolff
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Dynamic search results as category pages

2014-01-24 Thread Yuri Astrakhan
Hi, I am thinking of implementing a

#CATQUERY query

magic keyword for the category pages.

When this keyword is present, the category page would execute a query
against the search backend instead of normal category behavior and show
results as if those pages were actually marked with this category.

For example, this would allow Greek Philosophers category page to be
quickly redefined as
a cross-section of the greeks & philosophers categories:

#CATQUERY incategory:Greek incategory:Philosopher

Obviously the community will be able to define much more elaborate queries,
including the ordering (will be supported by the new search backend)
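
For a sense of what such a category page would run behind the scenes, here
is a rough Python sketch of the equivalent search-backend query (#CATQUERY
itself does not exist yet; incategory: is an existing search keyword):

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    params = {
        "action": "query", "format": "json",
        "list": "search",
        "srsearch": "incategory:Greek incategory:Philosopher",
        "srlimit": "50",
    }
    for hit in requests.get(API, params=params).json()["query"]["search"]:
        print(hit["title"])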
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Redirecting obsolete mobile. and wap. domains

2013-12-05 Thread Yuri Astrakhan
This morning Faidon deployed a patch
(https://gerrit.wikimedia.org/r/#/c/99394/1/templates/varnish/mobile-frontend.inc.vcl.erb)
to redirect the wap. and mobile. subdomains (see also this
patch, https://gerrit.wikimedia.org/r/#/c/98058, for the apache-side
handling), but later, due to some concerns, the change was reverted.

Are there any reasons for us to keep the current wap & mobile subdomain
handling, or is everyone ok with obsoleting them this way?

Thx
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Thoughts of the better landing page

2013-11-04 Thread Yuri Astrakhan
Hi, I have been researching how to implement m.wikipedia.org landing page.
It should show the same output as en.m.wikipedia.org/wiki/Special:Zero page,
except that it should be localized based on the carrier's default language.

Possible solutions:

* VirtualHost - map to a docroot that contains the same mediawiki as
en.m.wikipedia.org. There should be a Rewrite rule that allows only ^/$ ->
wiki/Special:Zero; all other paths will return 404.

* Magic hack - something similar to extract2.php that will first load zero
configuration, configure the language to use, then somehow execute the
whole mediawiki rendering for the special page, and lastly output the
rendered content.

Also, it might be necessary to set the proper MW_LANG before loading, but that
means some other code needs to access memcached and make an API call.

I think #1 is better, please advise on how to switch the language and if we
should put up any defenses (like removing headers, rejecting non-GET
requests, etc).

Thanks.


On Tue, Nov 5, 2013 at 1:58 AM, Brion Vibber bvib...@wikimedia.org wrote:

 Generally I agree that #1 is the way to go -- extract2.php is a nasty hack
 and I'd hate for us to have to replicate it. :D

 re: changing the language something within Zero extension should be
 able to do that if you just need to change the _display_ language. If you
 have to target it to a specific wiki I'd recommend doing that in the
 rewrite rules if possible. But that don't sound easy to do. ;)

 -- brion


 On Mon, Nov 4, 2013 at 1:50 PM, Yuri Astrakhan 
 yastrak...@wikimedia.org wrote:

 Brion,

 I was thinking of how to do the new landing page better.

  Needed:  m.wikipedia.org   should show the same output as
 en.m.wikipedia.org/wiki/Special:Zero page, except that it should be
 localized based on the carrier's default language.

 Possible solutions:

 * VirtualHost - map to docroot that contains the same mediawiki as
 en.m.wikipedia.org. There should be a Rewrite rule that allows only ^/$
  -> wiki/Special:Zero; but all other paths will return 404. Also, I need
 some magic code that dynamically changes the current language based on
 the carrier's configuration.

 * Magic hack - something similar to extract2.php that will first load
 zero configuration, configure the language to use, then somehow execute the
 whole mediawiki rendering for the special page, and lastly output the
 rendered content.

 I think #1 is better, please advise on how to switch the language and if
 we should put up any defenses (like removing headers, rejecting non-GET
 requests, etc).

 Thanks.



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The summary of new zero architecture proposal

2013-09-25 Thread Yuri Astrakhan
The zero ESI change is now live, but is enabled only when the X-FORCE-ESI
header is set to 1.  Also, it won't work until varnish enables ESI
support for zero requests (see the doc:
https://www.varnish-cache.org/docs/3.0/tutorial/esi.html)

I propose the following deployment steps:

1) set beresp.do_esi = true for all requests with X-CS header. (should not
be enabled for javascript or any other non-html resources)
This will let us measure ESI impact when HTML has no esi:include tag.

2) Enable X-FORCE-ESI header for one or more smaller carriers while
monitoring the impact on varnish load

3) Enable ESI configuration flag for all zero partners

4) Remove cache variance on X-CS header


On Tue, Jun 18, 2013 at 12:06 PM, Yuri Astrakhan
yastrak...@wikimedia.org wrote:

 Hi Mark,

 On Tue, Jun 18, 2013 at 11:58 AM, Mark Bergsma m...@wikimedia.org wrote:

 
  * All non-local links always point to a redirector. On javascript
 capable
  devices, it will load carrier configuration and replace the link with
 local
  confirmation dialog box or direct link. Without javascript, redirector
 will
  either silently 301-redirect or show confirmation HTML. Links to images
 on
  ZERO.wiki and all external links are done in similar way.

 For M, you only want to do this when it's a zero carrier I guess? If not,
 just a straight link?

 Correct - two variants for M -- with ESI banner + redirect links, and
 without ESI + direct links.


   * The banner is an ESI link to */w/api.php?action=zero&banner=250-99* -
  returns HTML div blob of the banner. (Not sure if banner ID should be
  part of the URL)
 

 I'm wondering, is there any HTML difference between M & isZeroCarrier ==
 TRUE and ZERO? Links maybe? Can we make those protocol relative perhaps?
 We might be able to kill the cache differences for the domain completely,
 while still supporting both URLs externally.


Yes - M+Carrier has images, ZERO has redirect links to images.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] MediaWiki-Vagrant: new features

2013-07-30 Thread Yuri Astrakhan
Awesome!!! Thanks Ori!!!

Can we have varnish through a role please? We will have to do a lot of dev &
test work for it soon.

The default varnish config in   /etc/varnish/default.vcl could have this at
the very top of the file. Leave all comments there.

backend default {
    .host = "127.0.0.1";
    .port = "80";
}

start with:

sudo varnishd -f /etc/varnish/default.vcl -s malloc,1G -T 127.0.0.1:2000 -a
0.0.0.0:8080

This way varnish will only work if accessed through port 8080 (which needs to
be mapped to the outside), while port 80 will continue to be served uncached.
The admin port 2000 doesn't have to be patched to the outside (it is rarely
used and can be accessed through ssh).

P.S. Note that the varnish config should not be deleted on every vagrant up -
there could be a lot of changes to that file during development. Maybe it
would make sense to have an include command of some sort, the same way as the
settings.d/ files for vagrant.

Thanks!




On Tue, Jul 30, 2013 at 4:58 AM, Antoine Musso hashar+...@free.fr wrote:

 On 30/07/13 06:39, Ori Livneh wrote:
  There's a short, 1-minute screencast up at
  http://ascii.io/a/4428demonstrating usage. Check it out. I'll wait.

 I am sorry to report you are definitely too awesome.  Thank you for all
 your work on that front \O/

 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] PHPUnit tests fail from Jenkins but work fine locally

2013-06-28 Thread Yuri Astrakhan
I once had a similar issue due to the difference between how Jenkins (which
uses sqlite) and a mariadb setup behave. That issue is still pending, and
needs fixing in the way we set up the sqlite database schema --
https://gerrit.wikimedia.org/r/#/c/48098/

Other than that, I have only seen issues related to some leftover state from
previous tests. Running tests independently passes, but running them all
together fails.

On Fri, Jun 28, 2013 at 1:50 PM, Arthur Richards aricha...@wikimedia.org wrote:

 Mobile web is trying to merge https://gerrit.wikimedia.org/r/#/c/69585/ but
 PHPUnit tests are failing when Jenkins executes them.

 What's weird is that we've executed PHPUnit tests on our various local
 machines - with no failures. We've been scratching our heads trying to
 figure out what might be causing the inconsistency.

 Anyone have any ideas what might be causing this?

 --
 Arthur Richards
 Software Engineer, Mobile
 [[User:Awjrichards]]
 IRC: awjr
 +1-415-839-6885 x6687
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] The summary of new zero architecture proposal

2013-06-18 Thread Yuri Astrakhan
Hi Mark,

On Tue, Jun 18, 2013 at 11:58 AM, Mark Bergsma m...@wikimedia.org wrote:

 
  * All non-local links always point to a redirector. On javascript capable
  devices, it will load carrier configuration and replace the link with
 local
  confirmation dialog box or direct link. Without javascript, redirector
 will
  either silently 301-redirect or show confirmation HTML. Links to images
 on
  ZERO.wiki and all external links are done in similar way.

 For M, you only want to do this when it's a zero carrier I guess? If not,
 just a straight link?

 Correct - two variants for M -- with ESI banner + redirect links, and
without ESI + direct links.


  * The banner is an ESI link to */w/api.php?action=zero&banner=250-99* -
  returns HTML div blob of the banner. (Not sure if banner ID should be
  part of the URL)
 

 I'm wondering, is there any HTML difference between M & isZeroCarrier ==
 TRUE and ZERO? Links maybe? Can we make those protocol relative perhaps?
 We might be able to kill the cache differences for the domain completely,
 while still supporting both URLs externally.


Yes - M+Carrier has images, ZERO has redirect links to images.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] The summary of new zero architecture proposal

2013-06-14 Thread Yuri Astrakhan
Based on many ideas that were put forth, I would like to seek comments on
this ZERO design. This HTML will be rendered for both M and ZERO subdomains
if varnish detects that the request is coming from a zero partner. M and ZERO
will be identical except for the images - ZERO substitutes images with
links to the File:xxx namespace through a redirector.

* All non-local links always point to a redirector. On javascript capable
devices, it will load carrier configuration and replace the link with local
confirmation dialog box or direct link. Without javascript, redirector will
either silently 301-redirect or show confirmation HTML. Links to images on
ZERO.wiki and all external links are done in similar way.

* The banner is an ESI link to */w/api.php?action=zero&banner=250-99* - it
returns an HTML div blob of the banner. (Not sure if the banner ID should be
part of the URL.)

Expected cache fragmentation for each wiki page:
* per subdomain (M|ZERO)
* if M - per isZeroCarrier (TRUE|FALSE). if ZERO - always TRUE.
3 variants is much better than one per carrier ID * 2 per subdomain.

P.S.
Redirector is a Special:Zero page, but if speed is an issue, it could be an
API call (which seems to load much faster). The API call would redirect to
the target, redirect to the special page for confirmation rendering, or
output the HTML itself (no skin support, but avoids an extra redirect). Might
not be worth it, as javascript will be available on most of our target
platforms now or soon.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-06-05 Thread Yuri Astrakhan
I wonder if origin should be made the default setting for gerrit - after
all, every new git clone automatically uses origin. The fewer "surprise!"
moments devs have, the more productive we become.


On Wed, Jun 5, 2013 at 6:42 PM, Juliusz Gonera jgon...@wikimedia.org wrote:

 Thank you!



 On 05/31/2013 03:14 PM, Ori Livneh wrote:

 Hey,

 The new version of git-review released today (1.22) includes a patch I
 wrote that makes it possible to work against a single 'origin' remote.
 This
 amounts to a workaround for git-review's tendency to frighten you into
 thinking you're about to submit more patches than the ones you are working
 on. It makes git-review more pleasant to work with, in my opinion.

 To enable this behavior, you first need to upgrade to the latest version
 of
 git-review, by running pip install -U git-review. Then you need to
 create
 a configuration file: either /etc/git-review/git-review.conf
 (system-wide)
 or ~/.config/git-review/git-review.conf (user-specific).

 The file should contain these two lines:

 [gerrit]
 defaultremote = origin

 Once you've made the change, any new Gerrit repos you clone using an
 authenticated URI will just work.

 You'll need to perform an additional step to migrate existing
 repositories.
 In each repository, run the following commands:

git remote set-url origin $(git config --get remote.gerrit.url)
git remote rm gerrit
git review -s

 Hope you find this useful.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] New git-review lets you configure 'origin' as the gerrit remote

2013-06-05 Thread Yuri Astrakhan
Daniel, I am not saying it will solve everyone's usage scenarios, but I
think one would agree that for the majority of developers, origin is
set by default. Even gerrit's own "navigate to gerrit.wikimedia.org and
copy the git clone command from the project page" flow uses origin, not -o
gerrit. So let's try to make it easy for the common case first, without
requiring magic git remote rename or -o 


On Wed, Jun 5, 2013 at 6:51 PM, Daniel Friesen
dan...@nadir-seen-fire.com wrote:

 Not if they fork from our github mirror and their origin is their own repo.

 Personally my origin is the github remote I push my unfinished branches of
 code too. I use 'core' for the actual gerrit remote.
 And I didn't even fork. I did this before we had a github mirror.

 --
 ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]


 On Wed, 05 Jun 2013 15:45:32 -0700, Yuri Astrakhan 
 yastrak...@wikimedia.org wrote:

  I wonder if origin should be made the default setting for gerrit - after
  all, every new git clone automatically uses origin. The fewer "surprise!"
  moments devs have, the more productive we become.



 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Yuri Astrakhan
This is awesome news! Just to throw some magic dust into the gears - has
anyone succeeded in running it under Windows? Alternatively, I wouldn't
mind having a vagrant VM that includes a GUI & a browser so that no host
interaction is needed.


On Wed, May 29, 2013 at 9:37 AM, Željko Filipin zfili...@wikimedia.org wrote:

 On Mon, May 13, 2013 at 3:20 PM, Ori Livneh ori.liv...@gmail.com wrote:

  If you're interested in using MW-V to document and automate your
  development setup, get in touch -- I'd be happy to help. I'm 'ori-l'
  on IRC, and I'll be at the Amsterdam Hackathon later this month too.
 

 I just wanted to say thanks to Ori. Mediawiki-Vagrant is now ready to run
 Selenium tests[1].

 Željko
 --
 1:

 https://github.com/wikimedia/mediawiki-vagrant/tree/master/puppet/modules/browsertests
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Quo Vadis, Vagrant

2013-05-31 Thread Yuri Astrakhan
I so wish there was a way not to mess with files under git's control. Can
that option be exposed through the local user config file?
On May 31, 2013 6:59 PM, Ori Livneh o...@wikimedia.org wrote:

 On Fri, May 31, 2013 at 9:51 AM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:

  This is awesome news! Just to throw some magic dust into the gears - has
  anyone succeeded in running it under Windows? Alternatively, I wouldn't
  mind having a vagrant VM that includes a GUI & a browser so that no host
  interaction is needed.
 

 The tests run on Firefox / X11 on the guest VM. To invoke them from a
 'vagrant ssh' session, though, you need an X display server. Graphical
 Linuxes have all the prerequisites installed, and on OS X you just need to
 install XQuartz. On Windows, people have good things to say about NoMachine
 (http://www.nomachine.com/download-client-windows.php) but I haven't
 tested
 it myself and while the client is free as in beer, it is not open source.

 There is another alternative that you'll probably find simpler: edit your
 Vagrantfile and find these two lines:

 # To boot the VM in graphical mode, uncomment the following line:
 # vb.gui = true

 Uncomment the second line and do a vagrant halt / vagrant up. You'll then
 be able to run the browser tests on an X display server running on the
 guest VM.
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] ZERO architecture

2013-05-30 Thread Yuri Astrakhan
Zero is rapidly growing, but its architecture is falling behind, and needs
to be revised.

*== Zero Partner Requirements ==*
* Support both smart phones (full JavaScript support) and feature phones
(limited HTML/no js support)
* Show carrier-specific, language-specific banner
* Allow carriers to select what features are zero-rated: images and list of
languages
* Ask for user confirmation when navigating away from zero (images,
non-zero languages, external sites)

*== Technical Requirements ==*
* increase Varnish cache hits / minimize cache fragmentation
* Set up and configure new partners without code changes
* Use partner-supplied IP ranges as a preferred alternative to the geo-ip
database for the fundraising & analytics teams

*== Current state ==*
Zero domain requests set X-Subdomain=ZERO, and treat the request as
mobile. The backend uses the X-Subdomain and X-CS headers to customize the
result. The cache is heavily fragmented due to its variance on both of these
headers, in addition to the variance set by the MobileFrontend extension and
MediaWiki core.

*== Proposals ==*
In order to reduce Zero-caused fragmentation, we propose to shrink from one
bucket per carrier (X-CS) to three general buckets:
* smart phones bucket -- banner and site modifications are done on the
client in javascript
* feature phones -- HTML only, the banner is inserted by the ESI
** for carriers with free images
** for carriers without free images

*=== Varnish logic ===*
* Parse User-Agent to distinguish between desktop / mobile / feature phone:
X-Device-Type=desktop|mobile|legacy
* Use IP -> X-CS lookup (under development by OPs) to convert client's IP
into X-CS header
* If X-CS && X-Device-Type == 'legacy': Use IP -> X-Images lookup (same
lookup plugin, different database file) to determine if carrier allows
images
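
To make the bucketing concrete, here is a rough sketch of the classification
in PHP (illustration only -- the real logic would live in Varnish VCL, and
the user-agent patterns below are assumptions, not actual detection rules):

<?php
// Illustration only: bucket a request into the three variants above.
// The UA patterns are placeholders, not production rules.
function classifyDevice( $userAgent ) {
    // Feature phones: limited HTML, little or no JavaScript.
    if ( preg_match( '/Opera Mini|MIDP|NetFront/i', $userAgent ) ) {
        return 'legacy';
    }
    // Smart phones: full JavaScript, banner handled client-side.
    if ( preg_match( '/Mobile|Android|iPhone/i', $userAgent ) ) {
        return 'mobile';
    }
    return 'desktop';
}

echo classifyDevice( 'Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X)' );
// prints: mobile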

Since each carrier has its own list of free languages, language links on
feature phones will point to origin, which will either silently redirect
or ask for confirmation.

*=== ZERO vs M ===*
Even though I think zero. and m. subdomains should both go the way of the
dodo to make each article have just one canonical location (no more
linking & Google issues), this won't happen until we are fully migrated
to Varnish and make some mobile code changes (and possibly other changes
that I am not aware of).

At the same time, we should try to get rid of ZERO wherever possible.
There are two technical differences between m & zero: zero shows a link to
the image instead of the actual image, and a big red zero warning is shown
if the carrier is not detected. There is also an organizational difference --
some carriers only whitelist zero, some only m, and some both zero & m
subdomains.

This spreadsheet
(https://docs.google.com/spreadsheet/ccc?key=0As-T7jJ1slQGdDd3WjhkVmk5SWc5OEdDZVdLYlA2M2c&usp=sharing)
shows all configurations we allow, and how we handle them at the moment.
Each case requires different handling, so proposals are listed there.

In general, we should manipulate image links in M for carriers who don't
allow them, and always redirect ZERO to M unless M is not whitelisted, in
which case convince the carrier to change their whitelist.
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Code style: overuse of Html::element()

2013-05-13 Thread Yuri Astrakhan
On one hand, I prefer to have properly formatted code, but on the other,
most systems I have worked with have a very high cost of string
concatenation, and I have a strong suspicion PHP is guilty of that too.
Constructing HTML one element/value at a time might prove to be one of the
bigger perf bottlenecks.

From my personal experience: I once worked on a networking lib for a
proprietary protocol and noticed that there were a lot of logging calls of
the form Log("value1=" + value1 + " value2=" + value2 ...). After I
switched to the form Log("value1={0}, value2={1}", value1, value2), the
code became an order of magnitude faster, because the logging framework
deferred concatenation until the last moment, after it knew that logging
was needed, and the actual concatenation was done for the whole complex
string with values, not one substring at a time.
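
To illustrate the same idea in PHP - a minimal sketch with made-up names
(DeferredLogger is not an existing API, just the pattern):

<?php
// Made-up class, for illustration only: defer string formatting until we
// know the message will actually be written.
class DeferredLogger {
    private $enabled;

    public function __construct( $enabled ) {
        $this->enabled = $enabled;
    }

    // Eager style: the caller pays for concatenation even when disabled.
    public function logEager( $message ) {
        if ( $this->enabled ) {
            fwrite( STDERR, $message . "\n" );
        }
    }

    // Deferred style: vsprintf() runs only if logging is enabled.
    public function logDeferred( $format /* , $arg1, $arg2, ... */ ) {
        if ( $this->enabled ) {
            $args = array_slice( func_get_args(), 1 );
            fwrite( STDERR, vsprintf( $format, $args ) . "\n" );
        }
    }
}

$log = new DeferredLogger( false );
$value1 = 42;
$value2 = 'foo';
$log->logEager( 'value1=' . $value1 . ' value2=' . $value2 );  // concatenates anyway
$log->logDeferred( 'value1=%s value2=%s', $value1, $value2 );  // skips formatting entirely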


On Mon, May 13, 2013 at 6:10 PM, Antoine Musso hashar+...@free.fr wrote:

 On 13/05/13 19:26, Max Semenik wrote:
  Hi, I've seen recently a lot of code like this:
 
  $html = Html::openElement( 'div', array( 'class' => 'foo' ) )
      . Html::rawElement( 'p', array(),
          Html::element( 'span', array( 'id' => $somePotentiallyUnsafeId ),
              $somePotentiallyUnsafeText
          )
      )
      . Html::closeElement( 'div' );
 
  IMO, cruft like this makes things harder to read and adds additional
  performance overhead.

 Html is just like our Xml class: it lets us raise the probability that
 the resulting code will be valid. It is also a good way to make sure the
 content is properly escaped, though in the example above that could lead
 to some mistakes due to all the nested calls.

 As for the performance overhead, it surely exists, but it is most probably
 negligible unless the methods are in a heavily used code path.


 Ideally we would use templates to generate all of that. That would let us
 extract the view logic out of the PHP code.
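
 For illustration, a sketch of what that might look like with a trivial
 placeholder template (renderTemplate() is a made-up helper, not an
 existing MediaWiki function):

 <?php
 // Made-up helper: substitute escaped values into a static HTML template.
 function renderTemplate( $template, array $vars ) {
     $replacements = array();
     foreach ( $vars as $name => $value ) {
         $replacements['{{' . $name . '}}'] = htmlspecialchars( $value );
     }
     return strtr( $template, $replacements );
 }

 $somePotentiallyUnsafeId = 'id-from-user-input';
 $somePotentiallyUnsafeText = 'text <from> user input';
 $html = renderTemplate(
     '<div class="foo"><p><span id="{{id}}">{{text}}</span></p></div>',
     array( 'id' => $somePotentiallyUnsafeId, 'text' => $somePotentiallyUnsafeText )
 );
 echo $html;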


 --
 Antoine hashar Musso


 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Purging parts of varnish cache

2013-05-09 Thread Yuri Astrakhan
ESI for older browsers, and JavaScript otherwise are still on the table,
but that's orthogonal to the varnish cache purge -- if we use ESI, the
banner will be cached and will need to be purged. If we use JavaScript, the
JSON configuration blob will need to be purged. The only difference is the
actual purging expression.
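
For illustration, one shape the purging expression could take, issued from
PHP over HTTP -- a sketch under assumptions: the host name is made up, and
it presumes VCL on the Varnish side that turns such BAN requests into bans
keyed on the cached objects' X-CS header (no such handler exists yet):

<?php
// Hypothetical: ask a Varnish frontend to invalidate every object cached
// with X-CS: 250-99. Requires matching (assumed) BAN support in the VCL.
$ch = curl_init( 'http://varnish-frontend.example.org/' );
curl_setopt_array( $ch, array(
    CURLOPT_CUSTOMREQUEST  => 'BAN',
    CURLOPT_HTTPHEADER     => array( 'X-Ban-Expression: obj.http.X-CS == 250-99' ),
    CURLOPT_RETURNTRANSFER => true,
) );
$response = curl_exec( $ch );
curl_close( $ch );
echo $response === false ? "ban failed\n" : "ban sent\n";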


On Thu, May 9, 2013 at 2:14 PM, Arthur Richards aricha...@wikimedia.org wrote:

 Previously we had spoken about implementing partial page caching (ESI) for
 Zero so we could remove X-CS variance for article content and use it only
 for partner banners. Is this still being pursued?


 On Wed, May 8, 2013 at 10:24 PM, Yuri Astrakhan yastrak...@wikimedia.org
 wrote:

  Hi, when we edit Zero configuration, it would be very beneficial to flush
  any cached pages in varnish that are related to the change.
 
  For example, if I edit Beeline's banner settings, any objects with the
  header X-CS=250-99 should be purged, hopefully without any additional
  manual interaction. Without this purge, the cache will be stale for the
  next 30 days for the most common articles.
 
  Now, according to the
  http://giantdorks.org/alain/exploring-methods-to-purge-varnish-cache/,
  varnish has an extensive matching support, and the author provides some
  PHP-based code to perform the cache flushing.  What would we need to
  implement secure, automated partial varnish flushing in production?
 
  Thanks!
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l




 --
 Arthur Richards
 Software Engineer, Mobile
 [[User:Awjrichards]]
 IRC: awjr
 +1-415-839-6885 x6687
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Purging parts of varnish cache

2013-05-08 Thread Yuri Astrakhan
Hi, when we edit Zero configuration, it would be very beneficial to flush
any cached pages in varnish that are related to the change.

For example, if I edit Beeline's banner settings, any objects with the
header X-CS=250-99 should be purged, hopefully without any additional
manual interaction. Without this purge, the cache will be stale for the
next 30 days for the most common articles.

Now, according to the
http://giantdorks.org/alain/exploring-methods-to-purge-varnish-cache/,
varnish has an extensive matching support, and the author provides some
PHP-based code to perform the cache flushing.  What would we need to
implement secure, automated partial varnish flushing in production?

Thanks!
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Making inter-language links shorter

2013-04-18 Thread Yuri Astrakhan
There are a few things (IMO) that should be done to langlist ordering:

* Group by alphabet
People who understand the latin alphabet should get a list of all
latin-script languages listed/sorted together. Cyrillic is a separate group,
and so are various Asian and Middle-Eastern scripts. I have seen other sites
do this (e.g. Google, but I can't quickly locate an example right now).
Having all languages bunched up together makes going through them extremely
painful - one has to skip all the scripts not understood.

* Each wiki site has different ordering requirements - like Hebrew and
Hungarian wikis want English as the first link, or 'nn' uses 'no','sv','da'
before all others. See interwiki_putfirst in pywiki
(http://svn.wikimedia.org/svnroot/pywikipedia/trunk/pywikipedia/families/wikipedia_family.py)

* Lastly, but IMO most importantly, we should honor user settings or
browser settings. If my browser sends *Accept-Language:
en-US,en;q=0.8,ru;q=0.6*, it would be good to show English & Russian at the
top, followed by others (a parsing sketch follows below).

All this can (and should) be done in javascript, without affecting servers.
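
As a rough sketch of that last point (written here in PHP for brevity; the
client-side javascript version would be equivalent):

<?php
// Sketch: turn an Accept-Language header into an ordered preference list,
// so 'en-US,en;q=0.8,ru;q=0.6' yields en-us, en, ru.
function parseAcceptLanguage( $header ) {
    $prefs = array();
    foreach ( explode( ',', $header ) as $part ) {
        $bits = explode( ';', trim( $part ) );
        $lang = strtolower( trim( $bits[0] ) );
        $q = 1.0;  // a missing q parameter means q=1
        if ( isset( $bits[1] ) && preg_match( '/q=([0-9.]+)/', $bits[1], $m ) ) {
            $q = (float)$m[1];
        }
        if ( $lang !== '' ) {
            $prefs[$lang] = $q;
        }
    }
    arsort( $prefs );  // highest q-value first
    return array_keys( $prefs );
}

print_r( parseAcceptLanguage( 'en-US,en;q=0.8,ru;q=0.6' ) );
// Array ( [0] => en-us [1] => en [2] => ru )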

And for historical reasons: bug 2867
(https://bugzilla.wikimedia.org/show_bug.cgi?id=2867)...
I filed it in 2005; it has over 60 votes (highest count in Bugzilla if I'm
not mistaken)...

--Yuri


On Thu, Apr 18, 2013 at 9:57 PM, Brion Vibber br...@pobox.com wrote:

 I was traditionally in favor of keeping the full language list visible,
 but it's just too damn big in many cases and is hard to search through
 on any device. On touch devices it's difficult to pick a correct item from
 the list as all the links are adjacent (though if you zoom it's ok).

 Definitely we need something improved, and if we're going to improve it we
 need to do it for the default or we're failing to serve 99% of our
 readers...

 I'm not sure about the current demo; one thing that bugs me is that there's
 a very small tap/click target for getting the full language list call-out.
 Clicking on Language just hides/shows the short list; it doesn't do
 anything else. Clicking the settings gear icon next to Languages brings up
 a call-out with language-related settings, none of which help you get to
 another language version of the wiki.


 On the mobile site we've collapsed the whole thing to an "Other languages"
 section or button (depending on if you're in beta mode) at the bottom of
 the article, and this seems to have gotten good usability responses from
 mobile users.



 On Thu, Apr 18, 2013 at 12:47 PM, David Gerard dger...@gmail.com wrote:

  On 18 April 2013 20:43, David Gerard dger...@gmail.com wrote:
   On 18 April 2013 17:50, Pau Giner pgi...@wikimedia.org wrote:
 
   Please let me know if you see any possible concern with this approach.
 
   My first thought is of how upset people were when the first version of
   Vector hid the language links by default. I would suggest being sure
   there will be little or no similar objection.
 
 
  (hit send too soon, sorry)
 
  A simple solution that would avoid a similar reaction is: do not do
  this by default - make it only for logged-in users who want it that
  way.
 
  Possibly for default users, you could put the heuristically-calculated
  likely preferred languages at the top. But keeping the rest of the
  list below, right there on display, will (I predict) be favoured, as
  advertising the many languages of Wikipedia is a strongly-held value
  of many Wikimedians.
 
 
  - d.
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Article on API Characteristics

2013-04-16 Thread Yuri Astrakhan
Thanks Tyler, some points are very interesting.


On Wed, Apr 17, 2013 at 5:50 AM, Tyler Romeo tylerro...@gmail.com wrote:

 Found this interesting article on designing an API, for what it's worth.
 Thought some people may find it interesting.

 http://mathieu.fenniak.net/the-api-checklist/
 *-- *
 *Tyler Romeo*
 Stevens Institute of Technology, Class of 2015
 Major in Computer Science
 www.whizkidztech.com | tylerro...@gmail.com
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

[Wikitech-l] Adding messages to cache for wfMessage() at runtime

2013-04-08 Thread Yuri Astrakhan
Hi, I need to use the localization framework - wfMessage() - to translate my
own custom messages. I have a message dictionary in the form of
{languageCode => message, ...}, and am looking for a way to inject those
messages into the message cache. The actual messages dictionary is stored
as a custom setting as part of a json blob, and cannot be added to the
regular $message['xx'] = '...'; for translatewiki.  Thanks!

Usage scenario:

// Array is generated from an external source and will look like this:
$messageMap = array( 'en' => 'aaa', 'fr' => 'b {{SITENAME}} b', ... );

// Some magic method to add the entire message map into the message cache
messageCache->Add( 'custom-key1', $messageMap );

// Process the message
// Note that in case 'fr' is not defined, it should fall back using default
// resolution rules.
$result = wfMessage( 'custom-key1' )->inLanguage( 'fr' )->text();
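
One direction I'm considering - a sketch only, assuming the core
MessagesPreLoad hook behaves as I remember (the message title arrives with a
/langcode suffix for non-default languages; worth verifying):

<?php
// Assumed-hook sketch: supply message text at runtime. $wgZeroMessages is
// a made-up global holding the {languageCode => message} maps.
$wgZeroMessages = array(
    'custom-key1' => array( 'en' => 'aaa', 'fr' => 'b {{SITENAME}} b' ),
);

$wgHooks['MessagesPreLoad'][] = function ( $title, &$message ) {
    global $wgZeroMessages;
    // e.g. 'Custom-key1/fr'; no suffix means the wiki's content language.
    $parts = explode( '/', lcfirst( $title ), 2 );
    $key = $parts[0];
    $lang = isset( $parts[1] ) ? $parts[1] : 'en';
    if ( isset( $wgZeroMessages[$key][$lang] ) ) {
        $message = $wgZeroMessages[$key][$lang];
        return false;  // message supplied, stop further lookup
    }
    return true;  // not ours, continue normal resolution
};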
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Adding messages to cache for wfMessage() at runtime

2013-04-08 Thread Yuri Astrakhan
 So, we used to have a $messagecache->add(), but it was removed
 after a very very long deprecation.


Just when I was about to use it :) I'm basically looking for this functionality:

* Figure out which localized string to use for language X, in case there is
no message in X. Something like "use Russian if the Belarusian message is
not available" (see the sketch below). I remember this was present in the
pywiki framework, and suspect mediawiki has it somewhere.

* Expand {{templates}}/PLURAL/GENDER/...
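
On the first point, a minimal self-contained sketch of what I'm after (the
fallback chains are hardcoded here for illustration; mediawiki would derive
them from its own language data):

<?php
// Sketch: resolve a message from a {languageCode => message} map via a
// fallback chain, e.g. be -> ru -> en.
function resolveMessage( array $messageMap, $langCode ) {
    $fallbacks = array( 'be' => array( 'ru' ) );  // illustrative only
    $chain = array( $langCode );
    if ( isset( $fallbacks[$langCode] ) ) {
        $chain = array_merge( $chain, $fallbacks[$langCode] );
    }
    $chain[] = 'en';  // final fallback
    foreach ( array_unique( $chain ) as $code ) {
        if ( isset( $messageMap[$code] ) ) {
            return $messageMap[$code];
        }
    }
    return null;
}

$messageMap = array( 'en' => 'aaa', 'ru' => 'bbb' );
echo resolveMessage( $messageMap, 'be' );  // prints bbb (fell back to ru)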
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Proposal: Wikitech contributors

2013-04-04 Thread Yuri Astrakhan
I think we should consolidate wikitech and mediawiki. If wikipedia.org can
fit all the world's knowledge, how come we can't fit all our technical
know-how on one site? The fragmentation is unnecessary and arbitrary; it
confuses most newcomers and those not paying attention to the proper
separation between them.

The site should highlight and share the technology platform used at
Wikimedia Foundation.
*We need a site for everyone to look at, participate in, and replicate our
technology.*
Replication could be anything from the most basic MW installation to a
complete replica of our cluster.

If we are not happy with the URL, let's come up with another one, or decide
that wikitech is the preferred name, and redirect everything there. The
main page would have mediawiki docs, labs, server info and stats,
hackathons, apis, RFCs, and everything else related. Any non-wiki
information like server health could be a sub-domain of this site, possibly
with extra permissions. Let's create and share a knowledge platform, not
just software.

/end-rant :)


On Thu, Apr 4, 2013 at 3:24 AM, Quim Gil q...@wikimedia.org wrote:

 Thank you for the vivid discussion about the potential future roles of
 http://mediawiki.org and http://wikitech.wikimedia.org

 I have updated
 http://www.mediawiki.org/wiki/Technical_communications/Dev_wiki_consolidation#Proposed_solution
 accordingly.


 On 04/03/2013 03:51 PM, Steven Walling wrote:

 who's going to do the work of
 migrating all our project documentation over?


 guillom, myself and whoever else wants to help.

  http://www.mediawiki.org/wiki/Technical_communications/Dev_wiki_consolidation#Steps


 --
 Quim Gil
 Technical Contributor Coordinator @ Wikimedia Foundation
  http://www.mediawiki.org/wiki/User:Qgil

 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Gerrit actively discourages discussion

2013-04-03 Thread Yuri Astrakhan
Jon, do you think it might make sense to hide all those jenkins-bot
comments? They are mostly noise; on the rare occasion compilation fails,
it's ok to click to expand just that comment. Or maybe just expand the last
jenkins-bot comment, but hide all the previous ones.


On Wed, Apr 3, 2013 at 1:53 AM, Yuri Astrakhan yastrak...@wikimedia.org wrote:

 This is awesome!!! Thanks Jon :) And no, I didn't look at the code, too
 scared :)


 On Wed, Apr 3, 2013 at 1:14 AM, Jon Robson jdlrob...@gmail.com wrote:

 I feel very dirty having done this but I made a chrome extension that
 autoexpands all comments and adds a comment count next to unexpanded older
 patchsets.

 The code's horrible but it works - feel free to try it out:

 https://github.com/jdlrobson/gerrit-be-nice-to-me

 I suspect the best long term solution might be to rewrite the interface.
 The fact it is so ajaxy makes enhancements hard unless someone actually
 wants to get to know and work on the gerrit codebase themselves...


 On Tue, Apr 2, 2013 at 7:54 PM, MZMcBride z...@mzmcbride.com wrote:

  Christian Aistleitner wrote:
  while I do not want to discourage discussion of those items on
  wikitech-l, I nevertheless filed them in bugzilla, so we can keep
  track of the issues / solutions.
 
  This is wonderful. Thank you for filing these bugs, Christian. I'll
 take a
  look at them now.
 
  MZMcBride
 
 
 
  ___
  Wikitech-l mailing list
  Wikitech-l@lists.wikimedia.org
  https://lists.wikimedia.org/mailman/listinfo/wikitech-l
 



 --
 Jon Robson
 http://jonrobson.me.uk
 @rakugojon
 ___
 Wikitech-l mailing list
 Wikitech-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikitech-l



___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
