to me...). Thus you can use the api with a callback
parameter to get around the same-origin policy.
Obviously CORS is a much nicer solution.
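The callback trick described above can be sketched roughly as follows. The API's `callback` parameter is real MediaWiki behaviour; the helper names (`buildJsonpUrl`, `loadViaJsonp`) are made up here for illustration, not part of any library.

```javascript
// Sketch of JSONP against a MediaWiki API endpoint. The server wraps its
// JSON response in a call to the named callback, and since <script> tags
// are not subject to the same-origin policy, this works cross-origin.
function buildJsonpUrl(apiBase, params, callbackName) {
  const query = new URLSearchParams({ ...params, format: 'json', callback: callbackName });
  return apiBase + '?' + query.toString();
}

function loadViaJsonp(apiBase, params, onData) {
  // Register a throwaway global callback, then inject a <script> tag
  // pointing at the API. Only meaningful in a browser environment.
  const cbName = 'jsonpCb' + Math.floor(Math.random() * 1e9);
  window[cbName] = function (data) {
    delete window[cbName];
    script.remove();
    onData(data);
  };
  const script = document.createElement('script');
  script.src = buildJsonpUrl(apiBase, params, cbName);
  document.head.appendChild(script);
}
```

CORS avoids this global-callback and script-injection machinery entirely, which is part of why it is the nicer solution.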
-bawolff
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
tween Wikimedia sites
However, I think the main issue is that people want to do more things
with metadata than just retrieving the information with js. I suppose
it all boils down to use-cases. I'll admit I'm not overly familiar
with Wikisource's metadata use case.
--
-bawolff
g file), and try each command out while logged in as the
apache user, and see
which one fails.
[1] http://www.mediawiki.org/wiki/How_to_debug#Logging
-bawolff
rements, such as sandbox-ability, performance
concerns, ease of implementing resource limits, etc. If I recall
correctly, Lua came out the clear winner.
-bawolff
n a
single wiki, I really believe one should get permission from that wiki
first.
That of course has a downside - what if the foundation hires X people to
develop a feature, and enwikipedia rejects it? After all, features aren't
free, and that would represent a wasted investment. I thi
f people don't want to use it, they don't have to.
I do believe consensus should be sought when enabling extensions like
MoodBar and whatnot on enwikipedia,
but this is nothing like that situation.
--bawolff
ho can tell us more about it?
> This would be very interesting to get our synching code optimized.
>
> It still wouldn't help us with the global identifiers, though, but it
>
> would be good to know more about it.
>
I've tried to add a brie
t. Furthermore, it feels like this problem has
gotten worse with time. (On the flip side, there is an even more
pronounced problem with the "community" treating us as service
providers instead of colleagues - so it goes both ways.)
--
-bawolff
taking a
chill pill. Additionally I'd like to add that mistakes happen; we all
make them. The important thing is to realize we've made a mistake,
mitigate the effect of the mistake, and document the mistakes so that
others don't make them. Yelling
was for (joke, but only sort of.
There are enough open bugs in need of love that verifying a RESOLVED
WORKSFORME bug really works for us doesn't seem that useful)
-bawolff
ng per db,
per site (Wikipedia, Wiktionary, etc), or global interwikis that act
on all sites.
The feature is a bit WMF-specific, but it does seem to support
different levels of interwiki lists.
Furthermore, I imagine (but don't know, so let's see how fast I get
corrected ;) that the cdb database
.
> >
>
> I officially redact my bogus claim that it "should" be 4 spaces.
> However, we certainly shouldn't default to 8!
I use 8 :P
(Seriously though, everything looks so crunched up with 4 space tab
width... Makes me claustrophobic!)
-bawolff
sign
that whatever is replacing the deprecated function isn't a sufficient
replacement.
--
-Bawolff
On Thu, Aug 2, 2012 at 7:18 PM, Julien Dorra
wrote:
[snip]
>
> * Question for you all: do you have an example of this "consumer mode"
> behavior on the software part of Wikimedia? How have you dealt with it in
> the past?
>
> * Bawolff a question just for you, could
; than an entity meant to serve their interests. As a result they
start to feel that MediaWiki is a product that is being developed
"for" them (in a similar way how something like facebook or google is
developed "for" its users) rather than "by" them (or by "their"
community). Maybe there was always such setiment, and I just never
noticed it in earlier times, but I find such sentiment disturbing. I
think in order to best reach out to new contributors, we need to start
at home so to speak.
Cheers, and once again best of luck,
--Bawolff
e with that?).
If there are no concrete plans to have Squid start caching logged-in
views, and only logged-in users have a language preference, it
doesn't seem to make a lot of sense to split the Squid cache based on
language, as that would catch 0% of the hits.
If there is concern that the parser cache would be too diluted by this,
why not cache it via the normal $wgMemc->set
cache? We certainly cache all sorts of stuff there, much of it not even
very expensive to generate.
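The idea of falling back to a general object cache can be illustrated with a toy sketch. This is not MediaWiki's actual $wgMemc API (which is PHP); it is just the shape of the idea, with invented names, in JavaScript:

```javascript
// Minimal key/value cache with per-entry expiry, illustrating storing
// per-language rendered output in a general object cache instead of
// splitting the dedicated parser cache.
class ObjectCache {
  constructor() { this.store = new Map(); }
  set(key, value, ttlSeconds) {
    this.store.set(key, { value, expires: Date.now() + ttlSeconds * 1000 });
  }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expires) { this.store.delete(key); return null; }
    return entry.value;
  }
}

// Key the cached rendering by page and user language, mirroring how a
// cache key might be split per language (key format is invented).
function cacheKey(pageId, lang) {
  return `page-rendering:${pageId}:${lang}`;
}
```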
--
-bawolff
arballs. Then we have to make sure to validate the
contents, and communicate to people that the tarball is only for
uploaded djvu files). [Of course until 5 minutes ago I'd never heard
of an indirect djvu file, so I could be misunderstanding]
-bawolff
ertain padding around the subject.
ImageMagick would definitely be good since it's already in use. I know
the netpbm programs can also programmatically crop things, but ImageMagick
is definitely the best choice if possible.
-bawolff
ed about auto-detection failing, we could do better
auto-detection in the install script (make an AJAX request to some
script /mw-config/path_info.php/test and see if it can properly
extract the path info).
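A hedged sketch of that probe. The /mw-config/path_info.php/test URL comes from the message above; the JSON response shape ({ pathInfo: "/test" }) and the function name are invented for illustration, and the fetch implementation is injected so the logic is testable:

```javascript
// Probe whether the server passes extra PATH_INFO through to PHP by
// requesting a test script with "/test" appended and checking whether
// the (assumed) echoed JSON contains it.
async function serverSupportsPathInfo(baseUrl, fetchImpl) {
  try {
    const res = await fetchImpl(baseUrl + '/mw-config/path_info.php/test');
    if (!res.ok) return false;
    const data = await res.json();
    return data.pathInfo === '/test';
  } catch (e) {
    // Network error or non-JSON response: treat as "not supported".
    return false;
  }
}
```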
-bawolff
ssary, as we
currently do not support per-language tailoring of the collation. All
languages get sorted the same at the moment (there are bugs in
Bugzilla to change this, and really it should be changed, but such
per-language support has yet to be implemented). However, even if it
were fixed, it's un
>
> We plan to start recruiting for a new Bugmeister soon. I'll post more
> details when I have them.
>
> In the meantime, please join me in thanking Mark for his service to
> the Wikimedia movement.
>
> Rob
:'(
You will be very deeply missed. I wish you the best o
used on the page.
In particular, doing ?action=purge does *not* update the categories
that are used on a page.
https://bugzilla.wikimedia.org/show_bug.cgi?id=28876 is a bug for
making the category change happen quickly. It's on my list of bugs that
I think would be nice if they were fixed.
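For reference, a purge can be requested through the API as sketched below. action=purge and the titles parameter are real API features; the helper name is invented, and, per the point above, a purge re-renders the page without refreshing its category membership links.

```javascript
// Build a POST request for the MediaWiki API's action=purge.
// Returned as a plain description object rather than sent, so the
// construction can be shown without assuming a network environment.
function buildPurgeRequest(apiBase, titles) {
  const body = new URLSearchParams({
    action: 'purge',
    titles: titles.join('|'), // API convention: multiple titles joined by |
    format: 'json'
  });
  return { url: apiBase, method: 'POST', body: body.toString() };
}
```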
Che
sting ticker works on a
static list of pages that isn't updated, instead of doing some
auto-update magic with ajax). See
https://en.wikinews.org/wiki/Template:Ticker and
https://en.wikinews.org/wiki/User:Bawolff/sandbox/ticker .
Cheers,
-bawolff
ty printer isn't all that pretty at
the moment).
> For a stable API, that's far too fast of a deprecation path. We don't even
> remove core functions that fast (or shouldn't). I'd suggest throwing some
> kinds of warnings in the output for at least one release (1.20
ry
browser ever made (including browsers not yet released) do that? Are
we sure there aren't certain weird situations (aka bugs) where the
browser would not normalize something, etc.?
Things should either be escaped totally (so we know it's safe), or not
escaped at all (so we know it's dangerous
nsions)
*Roughly how popular that hook is
*A lot of places to find examples of how to use such a hook (useful if
you're a newbie)
Of course for things like FlaggedRevs with 2 million hooks, that
starts to become impractical.
-bawolff
o convince Wikipedia et al that
social networking buttons are a good idea.
Social networking button type extensions are very simple (it's a linked
image, that's it) and thus trivial to create. The issue has always
been that whenever asked, the Wikipedians say no (and it's not like
they o
even the rsvg bug would probably have many files
that aren't visibly affected)
A message on the commons village pump whenever rsvg is updated would
be a good idea imho (since they are generally the folks who deal with
svgs and should know if something potentially changed, and people like
kno
sed, is that
you can request it via api (I suppose it's copied over during file
reverts as well). I don't think anyone uses that field on archived
images really. (maybe one day bug 26741 will be fixed and this would
be less of a concern).
Anyhow, I do believe it would be awesome
e is
interested, although it's kind of fallen out of favour recently
[disclaimer: I'm one of the authors of the enwikinews js ticker hack
thing].
-bawolff
e - I'd imagine that's where most people go.
Thanks,
-bawolff
On Wed, Nov 23, 2011 at 2:09 PM, Philip Chang wrote:
> Hi bawolff,
>
> Please understand that automatic re-direction to the mobile site will not
> "break" the intended site.
>
> Most content of th
ooked for pattern and treat the main page as a
normal page in that case instead of totally and utterly breaking if
the admin didn't customize the front page properly?
-bawolff
complexity of the
> system to protect against their outage will be more likely cause harm than
> the outages themselves. Instead, just announce it on the blog before and
> apologize to anybody affected afterwards.
I really feel that having a site/central notice for planned
ma
nd of comparison utility to help
> confirm/deny a photo or article is derived from another source.
That could certainly be a technical challenge, and not a trivial one.
However at the end of the day we can just get a human to compare.
> If there's
ssign you their copyright (imho).
My understanding is we allow people to commit extensions under
whatever OSI approved license strikes their fancy, and that if you
commit to someone else's extension, then you also release your commit
under that license. This always struck me as common sense.
's called from some parser hook or something). I would guess (but
haven't checked) that the upload method calls $wgParser->parse
somewhere along the line, probably when it's creating the image
description page. The Babel extension had a very similar issue with
auto-creating categories. See th
on different wikis (I wrote one at enwikinews -
http://en.wikinews.org/wiki/Template:Social_bookmarks . Commons used
to have one with stockphoto.js. Not sure if they still use it, several
other wikis do their own thing) The issue has always been if people
actually want it, which I bel
on-wheels type vandal, who at worst tricked an admin into doing
a delete of a page with a very high number of revisions, making the
server kittens cry for a moment. There's no indication he has "mad
hacker skillz" in any way or form (and given the tone of that
y work for me perfectly fine and seem to work on
cruise control. What's the error they're failing with?
Thanks,
Bawolff
s of new code related to Exif
support (and related image metadata stuff). The maintenance script
just regenerates img_metadata based on the source image.
-bawolff
ementById( 'mp-itn' );
Which I assume is a reference to an ID used on the English
Wikipedia's main page.
I personally feel (without actually looking at the issues involved, so
take this with a grain of salt) that it's bad to hardcode such
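A minimal defensive version of that hardcoded lookup might look like this (the function name is invented; the point is simply to check the element exists before using it):

```javascript
// Look up a main-page section by ID, tolerating wikis whose main page
// doesn't copy the English Wikipedia's markup (e.g. no 'mp-itn' element).
function getMainPageSection(doc, id) {
  const el = doc.getElementById(id);
  if (!el) {
    // Element absent on this wiki's main page: degrade gracefully
    // instead of throwing later when the caller dereferences null.
    return null;
  }
  return el;
}
```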
>I've been waiting for this but unfortunately I can't be at that time. Would
>there be a log about the result?
#wikimedia-dev is always logged at
http://prototype.wikimedia.org/logs/%23wikimedia-dev/
-bawolff
On Wed, Jun 1, 2011 at 3:02 PM, Brion Vibber wrote:
> On Wed, Jun 1, 2011 at 1:53 PM, bawolff wrote:
>>
>> As a volunteer person, I'm fine if code I commit is reverted based on
>> it sucking, being too complicated, being too ugly, etc provided there
>> is act
in the last 3 days, since that is something essentially out
of my control.
As it stands, trivial one-line changes aren't even reviewed in 72 hours.
-bawolff
e shorter links (
http://en.wikinews.org/wiki/?curid=1234 style) for Twitter.
No data is passed to Facebook. It's just an image that's hyperlinked, in essence.
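For concreteness, the quoted short-link style built as a helper. Looking a page up by its numeric page ID via ?curid= is real MediaWiki behaviour; the function name is invented:

```javascript
// Build a short MediaWiki link of the ?curid= form quoted above,
// which resolves a page by its numeric page ID.
function curidLink(siteBase, pageId) {
  return `${siteBase}/wiki/?curid=${pageId}`;
}
```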
-bawolff
o get rid of the parser cached version of it).
So I'm not sure what the deal with that was, but it seems to be a one-time
issue.
(Or, also entirely possible, it fixed itself while I was looking at it,
and I just think purging the page had something to do with it, where
really it's all a big coincidence.)
er projects have specialized stuff that should be in
php extensions.
The issue is that at the end of the day it is _significantly_ easier to
write a js hack than
to manage to get a php extension written, reviewed and deployed.
-bawolff
still in
many small projects [usually with the project code field set to the
wrong project]. I've even seen that code in wikis that were created
after said toolserver tool stopped working). On the other hand they
probably won't complain, as the js is already fairly broken ;)
-bawolff
ites (other than
enwikipedia)
tend not to really know what they're doing in terms of customizing JS;
thus there is a lot of really sketchy js in MediaWiki:Common.js that
gets copied from site to site and messed around with until it works.
Several of the smaller wikis already have broken JS. I imagine that
many
es, ideally when querying this information from the api - we'd
want to be able to do things like get the author's name in language X,
falling back to the original language if unavailable. Get the author's
name in all available languages, etc.
-bawolff
ll option where all the source files get tar'ed together on
the server side for an easy download.
-bawolff
ry.
>
> Is someone interested in reviving it, or we can delete it right away?
>
> --
> Max Semenik ([[User:MaxSem]])
This is off-topic... But that extension's name sounds oddly familiar.
Did it used to be enabled on wikimedia (en wikinews specifically) a
long time ago?
-bawol
ng and replace the script with malicious js.
However if someone actually has the ability to do that, they could
already do that with the geoip lookup. Thus I don't see how doing the
importScriptURI reduces security.
-bawolff
e, why not just do the whole
variant thing? For example, Serbian has both a Latin and a Cyrillic
version, and the user can select which one they want. Then everyone
wins (?)
-bawolff
know php yet, but I have tryed it anyway, it gives me syntax mistake
> [...]
You could also try http://www.mediawiki.org/wiki/Extension:CommentPages
-bawolff
On Mon, Jul 5, 2010 at 9:35 PM, Lars Aronsson wrote:
> On 07/06/2010 01:04 AM, bawolff wrote:
>>
>> Note, Wikibooks actually does have DynamicPageList.
>
> Is this used on Wikibooks as a way to limit categories to one
> book, i.e. does each book have a category, that i
nly, but it would be great into wikibooks and wikisource too to
> produce good, updated lists by "virtual categories intersection".
>
> Alex
Note, Wikibooks actually does have DynamicPageList.
Cheers,
Bawolff
rsion of images.
Cheers,
bawolff
On Tue, Jun 1, 2010 at 1:23 AM, Ilmari Karonen wrote:
> bawolff wrote:
>>
>> However, if I do implement a new table as part of this, it will
>> probably use page_ids to identify the image - I don't see any reason
>> to arti
e) .
>
> Roan Kattouw (Catrope)
>
Ah, I was thinking that looked like something too perfectly fitted to
the situation to be true. However I still think it may be a good
approach to generally keep the metadata as a serialized php blob, and
have another table, similar looking to page_pr
Hi Alex. That's actually on my list of things to do if I have time. Building a
metadata editor for files on the wiki (probably in the form of an
extension) would be in phase 2 of my project. (In my project proposal
it was on the list of things to do if I have extra time.)
Cheers,
bawolff
On Mon, May 31
ing up on the page_props table the other day,
and I believe that in the commit implementing that table, bug 8298 was
given as an example of something cool the table could be used to
implement.
However, if I do implement a new table as part of thi
On Fri, May 28, 2010 at 10:12 AM, church.of.emacs.ml
wrote:
> Hi bawolff,
>
> thanks for your work.
> I'm not very happy about the name "metadata" for the table. As far as I
> understand it, this is about file metadata. "metadata" suggests it
> con
ues. However doing a metadata table like this does
leave the possibility open for people to do such intersection things
on the toolserver or in a DPL-like extension.
I'd love to get some feedback on this. Is this a reasonable approach
for me
= also
not cause significant problems (well, making uselang persist)?
(Although then again, I'd imagine that would not be much different
than the use-a-cookie solution in terms of caching.)
-bawolff
>
> --
> sylv...@chicoree.fr
> http://www.chicoree.fr
I was involved with a wiki that had similar needs. I made a small
little extension to whitelist a namespace -
http://www.mediawiki.org/wiki/Extension:Whitelist_Namespaces Perhaps
it might be useful to you.
Cheers,
Bawolff