Re: [Wikitech-l] [Engineering] Wikimedia REST content API is now available in beta

2015-03-11 Thread Hay (Husky)
This is awesome. Congratulations to Gabriel and the rest of the team.
I surely hope this will provide a stable platform for getting
Wikimedia content onto even more platforms.

-- Hay

On Wed, Mar 11, 2015 at 12:20 PM, Marc Ordinas i Llopis
 wrote:
> Congratulations! After all the hard work that has gone into this, it's
> great to see it up and running. Besides the improvements it will allow in
> existing projects, I can't wait to see the new things it will enable.
>
> -- Marc
>
> On Tue, Mar 10, 2015 at 11:23 PM, Gabriel Wicke 
> wrote:
>
>> Hello all,
>>
>> I am happy to announce the beta release of the Wikimedia REST Content API
>> at
>>
>> https://rest.wikimedia.org/
>>
>> Each domain has its own API documentation, which is auto-generated from
>> Swagger API specs. For example, here is the link for the English Wikipedia:
>>
>> https://rest.wikimedia.org/en.wikipedia.org/v1/?doc
>>
>> At present, this API provides convenient and low-latency access to article
>> HTML, page metadata and content conversions between HTML and wikitext.
>> After extensive testing we are confident that these endpoints are ready for
>> production use, but have marked them as 'unstable' until we have also
>> validated this with production users. You can start writing applications
>> that depend on it now, if you aren't afraid of possible minor changes
>> before transitioning to 'stable' status. For the definition of the terms
>> ‘stable’ and ‘unstable’ see https://www.mediawiki.org/wiki/API_versioning
>> .
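As a quick illustration of how these endpoints can be used, here is a minimal
sketch in JavaScript (Node.js 18+ with its built-in fetch). The /page/html/{title}
path is an assumption based on the kind of routes listed in the per-domain ?doc
page, so check that documentation before relying on it:

    // Minimal sketch: fetch the HTML of one article from the REST content API.
    // Assumes Node.js 18+ (global fetch). The /page/html/{title} path is an
    // assumption; verify it against the auto-generated docs at .../v1/?doc.
    const BASE = 'https://rest.wikimedia.org/en.wikipedia.org/v1';

    async function fetchArticleHtml(title) {
      const url = BASE + '/page/html/' + encodeURIComponent(title);
      const res = await fetch(url, {
        // A descriptive User-Agent is good practice when calling Wikimedia APIs.
        headers: { 'User-Agent': 'rest-content-api-demo/0.1 (you@example.org)' }
      });
      if (!res.ok) {
        throw new Error('Request failed: ' + res.status + ' ' + res.statusText);
      }
      return res.text(); // the article HTML as a string
    }

    fetchArticleHtml('Zurich')
      .then(html => console.log(html.slice(0, 200)))
      .catch(err => console.error(err));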
>>
>> While general and not specific to VisualEditor, the selection of endpoints
>> reflects this release's focus on speeding up VisualEditor. By storing
>> private Parsoid round-trip information separately, we were able to reduce
>> the HTML size by about 40%. This in turn reduces network transfer and
>> processing times, which will make loading and saving with VisualEditor
>> faster. We are also switching from a cache to actual storage, which will
>> eliminate slow VisualEditor loads caused by cache misses. Other users of
>> Parsoid HTML like Flow, HTML dumps, the OCG PDF renderer or Content
>> translation will benefit similarly.
>>
>> But, we are not done yet. In the medium term, we plan to further reduce
>> the HTML size by separating out all read-write metadata. This should allow
>> us to use Parsoid HTML with its semantic markup
>>  directly for
>> both views and editing without increasing the HTML size over the current
>> output. Combined with performance work in VisualEditor, this has the
>> potential to make switching to visual editing instantaneous and free of any
>> scrolling.
>>
>> We are also investigating a sub-page-level edit API for
>> micro-contributions and very fast VisualEditor saves. HTML saves don't
>> necessarily have to wait for the page to re-render from wikitext, which
>> means that we can potentially make them faster than wikitext saves. For
>> this to work we'll need to minimize network transfer and processing time on
>> both client and server.
>>
>> More generally, this API is intended to be the beginning of a
>> multi-purpose content API. Its implementation (RESTBase
>> ) is driven by a declarative
>> Swagger API specification, which helps to make it straightforward to extend
>> the API with new entry points. The same API spec is also used to
>> auto-generate the aforementioned sandbox environment, complete with handy
>> "try it" buttons. So, please give it a try and let us know what you think!
>>
>> This API is currently unmetered; we recommend that users issue no more
>> than 200 requests per second, and we may introduce rate limits if that
>> becomes necessary.
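To put that figure in perspective: 200 requests per second works out to a
minimum gap of 5 ms between requests, so a batch script can stay far below the
recommendation simply by sleeping between calls. A rough, illustrative sketch
(JavaScript on Node.js 18+; the 50 ms gap is an arbitrary conservative choice):

    // Illustrative throttle: space requests out so a batch job stays well under
    // the recommended 200 requests/second (5 ms minimum gap; 50 ms used here).
    const DELAY_MS = 50;
    const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

    async function fetchSequentially(urls) {
      const bodies = [];
      for (const url of urls) {
        const res = await fetch(url);   // Node.js 18+ global fetch
        bodies.push(await res.text());
        await sleep(DELAY_MS);          // wait before the next request
      }
      return bodies;
    }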
>>
>> I also want to use this opportunity to thank all contributors who made
>> this possible:
>>
>> - Marko Obrovac, Eric Evans, James Douglas and Hardik Juneja on the
>> Services team worked hard to build RESTBase, and to make it as extensible
>> and clean as it is now.
>>
>> - Filippo Giunchedi, Alex Kosiaris, Andrew Otto, Faidon Liambotis, Rob
>> Halsell and Mark Bergsma helped to procure and set up the Cassandra storage
>> cluster backing this API.
>>
>> - The Parsoid team with Subbu Sastry, Arlo Breault, C. Scott Ananian and
>> Marc Ordinas i Llopis is solving the extremely difficult task of converting
>> between wikitext and HTML, and built a new API that lets us retrieve and
>> pass in metadata separately.
>>
>> - On the MediaWiki core team, Brad Jorsch quickly created a minimal
>> authorization API that will let us support private wikis, and Aaron Schulz,
>> Alex Monk and Ori Livneh built and extended the VirtualRestService that
>> lets VisualEditor and MediaWiki in general easily access external services.
>>
>> We welcome your feedback here:
>> https://www.mediawiki.org/wiki/Talk:RESTBase - and in Phabricator
>> 
>> .
>>
>> Sincerely --
>>
>> Gabriel Wicke

Re: [Wikitech-l] Volunteers for Wikimania mobile app?

2012-06-02 Thread Hay (Husky)
A full-featured multi-platform app would be pretty hard to pull off on
such short notice (especially if you want to get it past the app review
process on iOS). A nice mobile website would be a better option IMHO.

Let me know if you need any help on that.

-- Hay

On Sat, Jun 2, 2012 at 10:45 AM, Finne Boonen  wrote:
> http://m.fosdem.org/ is the one used by FOSDEM, which works quite
> well and works on most devices.
>
> henna
>
> On Sat, Jun 2, 2012 at 9:58 AM, Tomasz Finc  wrote:
>
>> Certainly take a look at the official Wikipedia and Wiktionary apps on
>> GitHub to see how we've built them.
>>
>> https://github.com/wikimedia
>>
>> Stop by in #wikimedia-mobile if you need any help.
>>
>> --tomasz
>>
>>
>> On Fri, Jun 1, 2012 at 11:34 PM, Gregory Varnum
>>  wrote:
>> > Greetings,
>> >
>> > As someone involved with Wikimania programming - I'm wondering if there
>> are volunteers interested in developing a Wikimania mobile app.  It may not
>> be feasible given the short timeline, but I've seen a number of conference
>> apps coming out over the past week and figured it should at least be
>> pondered.  :)
>> >
>> > Essentially the idea would be to include schedule, local info, maps,
>> etc.  Things that can be pulled from the Wikimania 2012 wiki.  Perhaps
>> something PhoneGap based?
>> >
>> > Any thoughts or interest?
>> >
>> > -greg aka varnent
>
>
>
> --
> "Maybe you knew early on that your track went from point A to B, but unlike
> you I wasn't given a map at birth!" Alyssa, "Chasing Amy"


[Wikitech-l] PhotoCommons Wordpress plugin

2011-03-01 Thread Hay (Husky)
Hi everyone,
Some of you might be aware of the hackathon that was held in
Amsterdam in honor of Wikipedia's 10th birthday in early January.

At that event Krinkle and I hacked together a WordPress plugin that
makes it easy to search for Wikimedia Commons pictures and include
them in your blog posts. It's far from production-ready, but we'd
like to give you a sneak peek and ask for your input, thoughts and of
course Bugzilla tickets and patches ;)
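For the curious: the search itself boils down to a query against the Commons
web API. The sketch below is not the plugin's actual code, just a hypothetical
JavaScript illustration of the kind of request involved, with parameter choices
of my own:

    // Rough, hypothetical sketch of a Commons file search like the one the
    // plugin performs. Not the plugin's actual code; parameters are illustrative.
    async function searchCommons(term, limit = 10) {
      const params = new URLSearchParams({
        action: 'query',
        list: 'search',
        srsearch: term,
        srnamespace: '6',        // the File: namespace, i.e. images and other media
        srlimit: String(limit),
        format: 'json'
      });
      const res = await fetch('https://commons.wikimedia.org/w/api.php?' + params);
      const data = await res.json();
      return data.query.search.map(hit => hit.title);   // e.g. "File:Example.jpg"
    }

    searchCommons('windmill').then(titles => console.log(titles));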

Installation instructions and a link to the download can be found here:

http://www.mediawiki.org/wiki/PhotoCommons

Let us know what you think, and feel free to tweet and blog about it,
of course using the plugin to find a freely licensed image for your
blog post ;)

Thanks,
-- Hay / Husky



[Wikitech-l] Firesheep

2010-10-25 Thread Hay (Husky)
Has anyone seen this?

http://codebutler.com/firesheep

A new Firefox plugin that makes it trivially easy to hijack cookies
from any website that handles logins over plain HTTP when you're on an
unencrypted wireless network. Wikipedia isn't among the sites in the
standard installation (lots of other sites, such as Facebook and
Twitter, are). We use HTTP login by default, so I guess we're
vulnerable as well (please say so if we're using some other kind of
defensive mechanism I'm not aware of). Might it be a good idea to make
HTTPS the standard for login? Gmail has been doing this since April
this year.

-- Hay



Re: [Wikitech-l] VP8 freed!

2010-05-19 Thread Hay (Husky)
http://x264dev.multimedia.cx/?p=377

Apparently the codec itself isn't as good as H.264, and patent problems
are still likely. It's better than Theora, though.

-- Hay

On Wed, May 19, 2010 at 8:31 PM, papyromancer
 wrote:
> There's a session this afternoon at Google I/O that will focus on this:
> http://code.google.com/events/io/2010/sessions/webm-open-video-playback-html5.html
>
> I'm not sure if it's going to be streamed live, but I bet it'll hit
> youtube soon, I'll follow up with a link.
>
> --Drew
>
> On Wed, May 19, 2010 at 5:53 PM, David Gerard  wrote:
>> http://www.webmproject.org/
>> http://openvideoalliance.org/2010/05/google-frees-vp8-codec-for-html5-the-webm-project/?l=en
>> http://www.h-online.com/open/news/item/Google-open-source-VP8-as-part-of-the-WebM-Project-1003772.html
>>
>> Container will be .webm, a modified version of Matroska. Audio is Ogg
>> Vorbis.
>>
>> YouTube is serving up .webm *right now*. Flash will also include .webm.
>>
>> Comment from WMF already, that WMF is happy to host any free codec.
>> Encoders are available at the project home page.
>>
>>
>> - d.
>>



Re: [Wikitech-l] Wikipedia iPhone app official page?

2009-08-29 Thread Hay (Husky)
On Sat, Aug 29, 2009 at 3:07 PM, Dmitriy Sintsov wrote:
> Some local coder told me that GIT is slower and consumes much more RAM
> on some operations than SVN.
> I can't confirm that, though, because I never used GIT and still rarely
> use SVN. But, be warned.
> Dmitriy

I'm not a Git expert, but AFAIK Git was designed from the ground up by
Linus Torvalds to be *very fast*, even on large operations. The Linux
kernel project is managed with it, so I guess it can't be *that* slow :)

Chad wrote:
> It's _ok_. The command line usage is pretty solid and I haven't encountered
> any issues there. The GUI interface sucks and is a complete waste of time.
There aren't any really good GUIs for SVN on Mac or Linux either.
Windows users are spoiled by getting a great app like TortoiseSVN for
free. I guess most Mac and Linux SVN users simply learn to use the
command line, which isn't a bad thing at all.

-- Hay



Re: [Wikitech-l] Wikipedia iPhone app official page?

2009-08-28 Thread Hay (Husky)
As far as I know, the current trend is more towards Git than towards
Bazaar. Bazaar isn't a bad system, but Git seems to have more traction
at the moment.

-- Hay

On Fri, Aug 28, 2009 at 11:15 PM, Bryan Tong
Minh wrote:
> On Fri, Aug 28, 2009 at 10:29 PM, Chad wrote:
>> Blasphemy!
>>
>> /me goes and sits on the SVN server and refuses to leave
>>
> We could take a look at Bazaar. It has pretty good SVN integration. You
> can create a (centralized) checkout of an SVN repo and when you commit
> into that centralized checkout from your local Bazaar branch it should
> commit into the master SVN as well (as I understand it). This would
> allow peaceful concurrent use of a centralized and decentralized
> version control system (although Bazaar itself supports centralized
> use as well). Plus it works natively on Windows without icky POSIX
> emulation layers.
>
> I have not had time myself to look into the details yet, so I don't
> know if what I wrote actually works, but it looks promising from what
> I read.
>
>
> Bryan
>



Re: [Wikitech-l] Wikipedia iPhone app official page?

2009-08-28 Thread Hay (Husky)
I think it would be a good start to spice up the page and to add a link
to it from the app description on iTunes too. The description has a line
saying 'come and help us if you're good in HTML5/JS!', but no link to
follow and no pointer to more information (maybe there should be one
somewhere in the app itself too?).

Also, why is this hosted externally on GitHub and not in the main SVN repo?

Right now it's pretty high in the top 25 of the App Store, but it has a
very low rating because it's obviously still in early beta and lacks
many of the features found in the existing native apps.

-- Hay

On Thu, Aug 27, 2009 at 8:15 PM, Strainu wrote:
> On Thu, Aug 27, 2009 at 9:06 PM, David Gerard wrote:
>> Is there an official page for the iPhone app, other than the iTunes
>> store link? Some sort of "about" page, a link to the source, etc.?
>>
>>
>> - d.
>>
>
> See http://www.mediawiki.org/wiki/Wikipedia_iPhone_app for some info...
>
> Strainu
>



Re: [Wikitech-l] On templates and programming languages

2009-07-01 Thread Hay (Husky)
On Wed, Jul 1, 2009 at 5:26 PM, William Allen
Simpson wrote:
> William Allen Simpson wrote:
>> I run Firefox with JS off by default for all wikimedia sites, because of
>> serious problems in the not so recent past!
>>
> s/recent/distant/
I'm sorry that you seem to have had such bad experiences with JavaScript.
Still, I don't think your comments really hold in today's world.
Take a look at 'web 2.0'-style applications such as Gmail or Google
Maps. Stuff like that would simply be impossible in a web browser
without depending on proprietary technology such as Flash. Recent
effort in all modern web browsers (including IE) has gone largely into
optimizing JavaScript engines. Whether you like it or not, JavaScript
is here to stay.

Of course, this debate shouldn't really be about what people like or
dislike in a certain programming language. It should be about what the
best option is for MediaWiki template programming. A small scripting
language serves that goal best, which leaves us with Lua and
JavaScript. Lua is pretty cool too, but it isn't as well known as
JavaScript, and as far as I know the two are pretty similar in most
respects.
-- Hay



Re: [Wikitech-l] On templates and programming languages

2009-07-01 Thread Hay (Husky)
JavaScript might have gotten a bad name in the past because of 14-year-olds
who used it to display 'Welcome to my website!' alerts on their
GeoCities homepages, but that reputation is really unfair. JavaScript is
a very flexible and dynamic language that can be written very elegantly.

I urge everyone who still thinks JavaScript is a toy language to read
Douglas Crockford's excellent article:

http://javascript.crockford.com/javascript.html

-- Hay

On Wed, Jul 1, 2009 at 10:35 AM, Gregory Maxwell wrote:
> On Wed, Jul 1, 2009 at 3:50 AM, William Allen
> Simpson wrote:
>> Javascript, OMG don't go there.
>
> Don't be so quick to dismiss JavaScript.  If we were making a scorecard
> it would likely meet most of the checkboxes:
>
> * Availability of reliable, battle-tested sandboxes (and probably the only
> option discussed other than x-in-JVM meeting this criterion)
> * Availability of fast execution engines
> * Widely known by the existing technical userbase   (JS beats the
> other options hands down here)
> * Already used by many Mediawiki developers
> * Doesn't inflate the number of languages used in the operation of the site
> * Possibility of reuse between server-executed and client-executed
> (Only JS of the named options meets this criteria)
> * Can easily write clear and readable code
> * Modern high level language features (dynamic arrays, hash tables, etc)
>
> There may exist great reasons why another language is a better choice,
> but JS is far from the first thing that should be eliminated.
>
> Python is a fine language but it fails all the criteria I listed above
> except the last two.
>


Re: [Wikitech-l] On templates and programming languages

2009-06-30 Thread Hay (Husky)
I would opt for Javascript.

PHP and Python are intended for large and complex applications and
come with a huge standard library that people would probably expect to
be available. Security concerns are a problem too, so a subset would
probably be necessary. So, in essence, you'd get a stripped-down
language that isn't really useful for templates.

Making our own language, either by 'fixing' the template language or
by inventing something new, would only mean introducing a language
that is specific to our own platform and that nobody outside of
MediaWiki development knows.

XSLT is not meant to be written or read by humans. It's a
Turing-complete language stuffed into horrendous XML statements. Let's
not go down that road.

That leaves us with Lua and JavaScript, which are both small and
efficient languages meant for tasks like this. Remember, I'm talking
about 'core' JavaScript here, without all the DOM methods and other
browser stuff. If you strip all that out (take a look at the Core
JavaScript 1.5 Reference at Mozilla:
https://developer.mozilla.org/en/Core_JavaScript_1.5_Reference) you
get a pretty nice and simple language that isn't very large. Both
would require a new parser and/or interpreters installed on the
server side. Compared to the disadvantages of the other options, that
seems like a pretty small price for a great win.
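To make that concrete, here is a purely hypothetical sketch of what a template
written in nothing but core JavaScript might look like: a plain function that
takes the template parameters and returns wikitext as a string, using no DOM
and no host objects. The function name, parameter names and output format are
all made up for illustration.

    // Hypothetical example of a template as plain 'core' JavaScript: only
    // language features are used (strings, arrays, functions), no DOM objects.
    function personInfobox(params) {
      var rows = [];
      if (params.name) {
        rows.push('! Name\n| ' + params.name);
      }
      if (params.born) {
        rows.push('! Born\n| ' + params.born);
      }
      // Return the infobox as a wikitext table string.
      return '{| class="infobox"\n|-\n' + rows.join('\n|-\n') + '\n|}';
    }

    // e.g. personInfobox({ name: 'Ada Lovelace', born: '1815' })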

JavaScript is a widely understood and widely implemented language, with
lots of effort going into making it even faster in modern browsers.
Every Wikipedia user has an implementation of it in their browser and
can start experimenting without installing a compiler or a web server.
Many people program in JavaScript, so there is a huge pool of people
who could start programming MediaWiki templates. And it's already
closely tied to the web, so we wouldn't have to invent new ways of
dealing with web-specific stuff.

So, let's choose JavaScript as our new template programming language.

Regards,
-- Hay

On Tue, Jun 30, 2009 at 6:16 PM, Brion Vibber wrote:
> As many folks have noted, our current templating system works ok for
> simple things, but doesn't scale well -- even moderately complex
> conditionals or text-munging will quickly turn your template source into
> what appears to be line noise.
>
> And we all thought Perl was bad! ;)
>
> There's been talk of Lua as an embedded templating language for a while,
> and there's even an extension implementation.
>
> One advantage of Lua over other languages is that its implementation is
> optimized for use as an embedded language, and it looks kind of pretty.
>
> An _inherent_ disadvantage is that it's a fairly rarely-used language,
> so still requires special learning on potential template programmers' part.
>
> An _implementation_ disadvantage is that it currently is dependent on an
> external Lua binary installation -- something that probably won't be
> present on third-party installs, meaning Lua templates couldn't be
> easily copied to non-Wikimedia wikis.
>
>
> There are perhaps three primary alternative contenders that don't
> involve making up our own scripting language (something I'd dearly like
> to avoid):
>
> * PHP
>
> Advantage: Lots of webbish people have some experience with PHP or can
> easily find references.
>
> Advantage: we're pretty much guaranteed to have a PHP interpreter
> available. :)
>
> Disadvantage: PHP is difficult to lock down for secure execution.
>
>
> * JavaScript
>
> Advantage: Even more folks have been exposed to JavaScript programming,
> including Wikipedia power-users.
>
> Disadvantage: Server-side interpreter not guaranteed to be present. Like
> Lua, would either restrict our portability or would require an
> interpreter reimplementation. :P
>
>
> * Python
>
> Advantage: A Python interpreter will be present on most web servers,
> though not necessarily all. (Windows-based servers especially.)
>
> Wash: Python is probably better known than Lua, but not as well as PHP
> or JS.
>
> Disadvantage: Like PHP, Python is difficult to lock down securely.
>
>
> Any thoughts? Does anybody happen to have a PHP implementation of a Lua
> or JavaScript interpreter? ;)
>
> -- brion
>



Re: [Wikitech-l] Skin & JS cleanup and jQuery

2009-04-15 Thread Hay (Husky)
That's great news. I've been programming in JavaScript quite a lot over
the last few years, and I think I would probably have gone insane if I
hadn't discovered jQuery. Especially for complex and intricate HTML
selections, it's pretty amazing what you can do with it.

Also, the fact that animations are built into the core could make it a
bit simpler to do visually interesting things, which might make the UI
more intuitive.
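To give one made-up example of the kind of selection I mean (the selector and
class name are invented, and it assumes jQuery is already loaded on the page):
finding every infobox row that contains an external link, tagging it and fading
it in is a single chained statement, where the plain-DOM version takes a lot
more code.

    // Illustrative only; the selector and class name are made up, and jQuery
    // is assumed to be loaded. Find infobox rows that contain an external
    // link, tag them with a class, and fade them in.
    $('#content table.infobox tr:has(a.external)')
      .addClass('has-external-link')
      .fadeIn('slow');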

-- Hay

On Wed, Apr 15, 2009 at 11:45 PM, Sergey Chernyshev
 wrote:
> Guys,
>
> It's great to see that you're working in this direction - I've been thinking
> about working on this for a while, but didn't have the guts to undertake such
> an ambitious project alone ;)
>
> Do you have a working instance of ScriptLoader anywhere so I can aim some
> performance tools at it?
>
> Thank you,
>
>        Sergey
>
>
> --
> Sergey Chernyshev
> http://www.sergeychernyshev.com/
>
>
> On Wed, Apr 15, 2009 at 5:29 PM, Michael Dale  wrote:
>
>> These changes will probably result in some minor adjustments to existing
>> skins. (I will try not to completely break compatibility, because I know
>> there are many custom skins out in the wild that it would be no fun to
>> break once they update MediaWiki.)
>>
>> This consolidation of includes _may_ result in _some_ un-updated
>> skins referencing the same files twice, which I think most browsers
>> generally handle okay.
>>
>> Enabling $wgEnableScriptLoader will not work so well with skins that
>> have not been updated. Should have a patch soon. More about the ScriptLoader:
>> http://www.mediawiki.org/wiki/ScriptLoader
>> (We will most likely ship with $wgEnableScriptLoader off by default.)
>>
>> I am also very excited about jQuery making its way into core. Things
>> like the add_media_wizard are much easier to put together with jQuery's
>> nifty abstractions and msg system. More about add media wizard:
>>
>> http://metavid.org/blog/2009/03/27/add-media-wizard-and-firefogg-on-test-wikimediaorg/
>>
>> peace,
>> michael
>>
>>
>> Brion Vibber wrote:
>> > Just a heads-up --
>> >
>> > Michael Dale is working on some cleanup of how the various JavaScript
>> > bits are loaded by the skins to centralize some of the currently
>> > horridly spread-out code and make it easier to integrate in a
>> > centralized loader so we can serve more JS together in a single
>> > compressed request.
>> >
>> > Unless there's a strong objection I'd be very happy for this to also
>> > include loading up the jQuery core library as a standard component.
>> >
>> > The minified jQuery core is 19k gzipped, and can simplify other JS code
>> > significantly so we can likely chop down wikibits.js, mwsuggest.js, and
>> > the site-customized Monobook.js files by a large margin for a net
>> savings.
>> >
>> > If you've done browser-side JavaScript development without jQuery and
>> > wanted to kill yourself, I highly recommend you try jQuery -- it's
>> > so nice. :)
>> >
>> > -- brion
>> >


Re: [Wikitech-l] A proposal to de-table Wikipedia infoboxes

2009-03-03 Thread Hay (Husky)
Maybe it would be an idea to have some kind of fancy extension that
loads the CSS classes a template needs only on the articles that
actually use it? Actually, I'm not sure if it's in their job
description, but something that makes tables and infoboxes a lot
simpler sounds like a task for the usability team.

-- Hay

On Tue, Mar 3, 2009 at 3:29 PM, Aryeh Gregor
 wrote:
> On Tue, Mar 3, 2009 at 7:42 AM, Hay (Husky)  wrote:
>> I don't know if making such an infobox that does not support IE6 and
>> IE7 is a good idea.
>
> It doesn't even support Firefox 2 . . . inline-block wasn't
> implemented in Gecko until 1.9 (Firefox 3).
>
> Also: "It should be fairly easy to do so, as the HTML code is
> generated by templates."  Has he *looked* at the templates?  :)
>
>
> The major reason why inline style is used on Wikipedia is, of course,
> because ordinary editors don't have the ability to use stylesheets.
> And while admins do, they can only effectively add markup to *all*
> pages at once, regardless of whether they contain the exact infobox in
> question.  An awful lot of the provided CSS is nation-box-specific,
> and so useless in 99.99% of Wikipedia's articles.  (Literally: there
> are about 2.7 million articles, and I'm pretty sure there are less
> than 270 recognized nations.)  But all that CSS would have to be
> served with all of them.


Re: [Wikitech-l] A proposal to de-table Wikipedia infoboxes

2009-03-03 Thread Hay (Husky)
I don't know if making an infobox that does not support IE6 and
IE7 is a good idea. If you took out all the inline style attributes
and replaced them with classes defined in a general stylesheet, that
alone would already remove a lot of the cruft in the original code.

-- Hay

On Tue, Mar 3, 2009 at 1:30 PM, David Gerard  wrote:
> By Hakon Wium Lie of Opera:
>
> http://www.princexml.com/howcome/2009/wikipedia/infobox/
>
> What is the likelihood of making as much as possible CSS? How to make
> infoboxes degrade gracefully for non-CSS browsers and IE users?
>
>
> - d.
>



Re: [Wikitech-l] Norwegian Websites Declare War on Internet Explorer 6

2009-02-20 Thread Hay (Husky)
On Fri, Feb 20, 2009 at 3:50 PM, Aryeh Gregor
 wrote:
> Wikimedia's goal is not to better humanity in some unspecified way.
> It's to disseminate free knowledge.  Pestering users who probably
> can't fix the problem does nothing to advance that goal.  If we're
> going to try moralizing our users, why don't we go ahead and nag our
> users to ditch IE entirely and switch to Firefox?  IE7 is pretty
> bug-ridden too.  Or hey, why not try getting them all to switch to
> Linux?
Exactly. Also, in many ways upgrading to IE7 doesn't solve much at
all (apart from some very obvious CSS bugs), because it's still riddled
with bugs and incorrect implementations of W3C specs, especially in its
JavaScript engine. We just have to live with the fact that web
development means spending 30% of your time writing hacks for IE6 and
IE7 (and probably for IE8 too).

-- Hay



Re: [Wikitech-l] Norwegian Websites Declare War on Internet Explorer 6

2009-02-20 Thread Hay (Husky)
On Fri, Feb 20, 2009 at 3:19 PM, Stephen Bain  wrote:
> If some new major feature is added that a given old browser doesn't
> support, a banner message could be displayed akin to the one Brion put
> in place recently for mobile users not using the mobile gateway.
Hmm... so any time you visit Wikipedia from a work PC running IE6 that
you have no control over, you'd see an ugly banner at the top urging
you to upgrade your browser, even though that's not something you can
do? Seems pretty unfriendly to me...

-- Hay



Re: [Wikitech-l] Norwegian Websites Declare War on Internet Explorer 6

2009-02-20 Thread Hay (Husky)
Unfortunately, IE6 (and IE7 as well) is a problem that all websites
have to live with. IE6 is still used by about 34% of all web users
(according to the latest statistics from thecounter.com), so banning
those users, or not paying attention to problems they might have with
certain site elements, is pretty bad, especially considering that our
mission is to deliver our content to everyone, regardless of platform
or browser. Testing on those browsers is probably not done enough,
simply because most developers and editors use something other than IE.

-- Hay

On Fri, Feb 20, 2009 at 11:50 AM, Thomas Dalton  wrote:
> 2009/2/20 Aryeh Gregor :
>> On Fri, Feb 20, 2009 at 5:19 AM, Thomas Dalton  
>> wrote:
>>> There are different levels of support. We should certainly make sure
>>> things fail gracefully for IE6, but a new feature not working on IE6
>>> shouldn't be a reason not to implement it for everyone else. (I
>>> believe that is pretty much the current policy already.)
>>
>> It depends on the type of feature.  For instance, when implementing
>> different icons for certain filetypes on external links in Monobook, I
>> used the [$=foo] CSS selector, knowing it would fail in IE6, because
>> it's not going to hurt anyone if it does.  On the other hand, it would
>> still be unacceptable at this point for a significant new feature not
>> to work in IE6.  It still has 20% of the market, by some figures I've
>> seen.
>
> I can't see any significant new features causing a problem that
> wouldn't be dealt with by the "fail gracefully" condition. As long as
> adding the feature doesn't make things worse for IE6 users (so they
> can still read and edit the sites), then there isn't a big problem. Of
> course, if you can cater to IE6 easily, then there is no reason not
> to.
>



Re: [Wikitech-l] image backup and info on wikitech wiki

2008-12-16 Thread Hay (Husky)
I think any initiative for a new backup or download option for Commons
files should be welcomed. If the whole Commons collection is indeed
located on only two servers in the whole world, that gives me a
slightly scary feeling...

-- Hay

On Tue, Dec 16, 2008 at 11:48 AM, Daniel Kinzler  wrote:
> grin wrote:
>> Even if it is, the entry on image backups is highly nonspecific and
>> contains a big warning about being seriously out of date.
>
> I'm not a server admin and I don't know the exact details, but as far as I 
> know,
> the information there is correct in so far as there are two storage servers 
> for
> media files in tampa, one replicating the other. I think the second one is 
> even
> located in a different data center, though in the same building. What exactly
> the status of automatic replication between these servers is, I do not know.
>
> What I do know is that the German chapter plans an off-site backup for media
> files in Amsterdam. We will order the hardware we need for that this year
> still.
> I don't know how long it will take to set up a working replication process, 
> but
> I hope that it will not be too long. Having an off-site backup seems a good
> idea, the next hurricane isn't that far away.
>
> We should also think about providing reasonable media bundles for download in
> some way. My current idea is to create bundles of all media files used or
> contained in a given category (maybe even including two or three levels of
> subcategories). Such bundles would have to be created by an off-line job, i
> suppose. And maybe requesting them should be limited to admins, to avoid 
> flooding.
>
> -- daniel
>
>
>
