You should test with tc. :)
http://lartc.org/manpages/tc.html
We definitely need more info on these scenarios; we don't know anything
about the effects of our software development on people without broadband.
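For instance, a rough sketch with netem plus tbf (interface name and
numbers are made up; double-check the syntax against the lartc docs above):

  # add 300 ms of latency, then shape to 512 kbit/s (values are illustrative)
  tc qdisc add dev eth0 root handle 1: netem delay 300ms
  tc qdisc add dev eth0 parent 1: handle 2: tbf rate 512kbit burst 16kbit latency 400ms
  # undo when done testing
  tc qdisc del dev eth0 root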
Nemo
___
James Forrester, 12/05/2013 19:56:
On 12 May 2013 10:40, Federico Leva (Nemo) wrote:
James Forrester, 12/05/2013 19:20:
Sorry for the disruption to everyone's plans. If you have any questions,
please do ask.
Will you still make the lists and send out the notifications so that
people can
Thanks Chris, I've added a link. However, the new page only links some
technical design documents, from which I don't get simple answers to
questions such as «will it be read-only?» ([[OAuth]] proposed it to be so).
Nemo
___
David, for a previous discussion on renames see
https://bugzilla.wikimedia.org/show_bug.cgi?id=34490
John, for VisualEditor see https://bugzilla.wikimedia.org/show_bug.cgi?id=49728
Nemo
___
Thanks, is there any information on this OAuth deployment planned for
August? The last substantial update about WMF plans on
https://www.mediawiki.org/wiki/OAuth was about 14 months ago.
Nemo
___
Dumb question: so, would this be an alternative to Etherpad lite
integration with MediaWiki via the EtherEditor extension?
Nemo
___
AFAIK the first experiment was ClickTracking for the usability
initiative in 2009-10:
https://www.mediawiki.org/wiki/Extension:ClickTracking
https://usability.wikimedia.org/wiki/ClickTracking
We've waited for its data for years; I suppose we can assume it will
never be shared. It had some
As all templates requiring it are listed in
[[MediaWiki:Disambiguationspage]], why not run a global bot? It's surely
easier than trying to find 800 editors.
Nemo
___
We have several tools and bots using chunked upload via the API for
files over 100 MB. Daniel, please report back if you encounter
https://bugzilla.wikimedia.org/show_bug.cgi?id=36587; it seemed fixed
at some point, but Fastily (the developer of one of those tools) reopened it.
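For reference, a minimal sketch of what those tools do with the action
API (wiki URL, chunk size and the login/token steps are placeholders; the
real protocol is documented at https://www.mediawiki.org/wiki/API:Upload):

  import os
  import requests

  API = "https://commons.wikimedia.org/w/api.php"  # placeholder wiki
  CHUNK_SIZE = 5 * 1024 * 1024  # 5 MB per chunk (arbitrary choice)

  def chunked_upload(session, token, path, filename):
      # `session` is a logged-in requests.Session, `token` a CSRF/edit token.
      filesize = os.path.getsize(path)
      filekey, offset = None, 0
      with open(path, "rb") as f:
          while offset < filesize:
              chunk = f.read(CHUNK_SIZE)
              data = {"action": "upload", "format": "json", "stash": 1,
                      "filename": filename, "filesize": filesize,
                      "offset": offset, "token": token}
              if filekey:
                  data["filekey"] = filekey  # continue the stashed upload
              r = session.post(API, data=data,
                               files={"chunk": (filename, chunk)}).json()
              filekey = r["upload"]["filekey"]
              offset += len(chunk)
      # All chunks stashed: publish the file with one final request.
      return session.post(API, data={"action": "upload", "format": "json",
                                     "filename": filename, "filekey": filekey,
                                     "token": token}).json()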
Nemo
I've linked the new policy from [[mw:Bugzilla]]. (In general, some sort
of announcement when policies are enacted would be nice.)
I don't remember hearing of any such new policy coming, and the only
mention of rights removal was James_F on IRC saying he was no longer an
admin but was still able to
https://bugzilla.wikimedia.org/49604 was fixed, so those who want to
help newbies can do so from
https://en.wikipedia.org/w/index.php?limit=500&tagfilter=visualeditor&title=Special%3AContributions&contribs=newbie&target=&namespace=8&nsInvert=1&tagfilter=visualeditor&year=2013&month=-1
Nemo
The true (old) MediaWiki documentation on search is still on Meta; it
needs to be rewritten on mediawiki.org (PD and up to date):
https://meta.wikimedia.org/wiki/Help:Searching
Nobody reads the help pages, sure. That's why they should be linked from
the special pages etc. as some extensions
No, DXR doesn't help the CSE. I also doubt it will help restore
full-text search on our end, but we'll see.
https://bugzilla.wikimedia.org/show_bug.cgi?id=49674
On the contrary, gitblit is more robust and faster than gitweb, so it
allows crawling by search engines. It was crawled very
I disagree with the last three messages: the visual editor is bound to
be painful when enabled; if the team has established that they're at a
stage where they are done with the knowns in some areas but need more
testing in the field (and feedback) to discover more, they must be
allowed to,
The fact that there are known issues doesn't mean that finding new,
unknown issues will slow down the work on the known ones; it's up to the
team to decide what sort and what amount of feedback they'll be able to
or need to process (and to adjust if they were wrong).
Gradually enabling a feature is not
James Forrester, 12/05/2013 19:56:
James Forrester, 12/05/2013 19:20:
Sorry for the disruption to everyone's plans. If you have any questions,
please do ask.
Will you still make the lists and send out the notifications so that
people can start planning?
Yes.
Any news on the
shi zhao, 05/06/2013 22:07:
Why ULS deployment all wikis?
Does
https://www.mediawiki.org/wiki/Universal_Language_Selector/Design#Rationale
answer your question?
Nemo
___
I don't know what you mean by «similar program», but of course the
Germans have all their local groups, courses, workshops & co. to train
the workforce. For instance https://de.wikipedia.org/wiki/Wikipedia:WikiCon
Nemo
___
James Forrester, 12/05/2013 19:20:
Sorry for the disruption to everyone's plans. If you have any questions,
please do ask.
Will you still make the lists and send out the notifications so that
people can start planning?
Nemo
___
James Forrester, 12/05/2013 19:56:
On 12 May 2013 10:40, Federico Leva (Nemo) nemow...@gmail.com wrote:
James Forrester, 12/05/2013 19:20:
Sorry for the disruption to everyone's plans. If you have any questions,
please do ask.
Will you still make the lists and send out the notifications
Some weeks ago I wget-warc'ed wikitech-old so that stuff like the logs
(and other stuff permanently lost) could still be visible in the Wayback
Machine some day. I uploaded it to
https://archive.org/details/wikitech.wikimedia.org-2013-03-26
I don't actually remember if I had completed the mirror
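In case anyone wants to repeat this, wget's WARC support is roughly all
it takes (a sketch, assuming wget >= 1.14):

  wget --mirror --page-requisites \
       --warc-file=wikitech.wikimedia.org-2013-03-26 \
       http://wikitech.wikimedia.org/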
As I said, the logs are lost. Also users and their groups, of course,
like everything that is not in XML dumps – except the uploads, which
were recovered.
There are also some bugs in the import that sometimes make the history
hard to read, so having a fallback is nice.
Nemo
I think this is one of the most amazing achievements the ULS will enable
for us; thanks, Pau.
Pau Giner, 18/04/2013 18:50:
As part of the future plans for the Universal Language Selector, we were
considering:
* Show only a short list of the relevant languages for the user based
on geo-IP,
Help is needed submitting (or rejecting) backports to 1.19 and 1.20 of
the patches where requested:
https://bugzilla.wikimedia.org/buglist.cgi?cmdtype=dorem&remaction=run&namedcmd=Backport_Stable%3F&sharer_id=15390&list_id=193887
37209 – Highest – blo… – LinkCache doesn't currently know about this
Sumana Harihareswara, 08/04/2013 14:36:
Nemo, I have sympathy for you here -- it took a while for me to get
used to the use of socialize in the way the Wikimedia communities use
it.
Not the communities; the WMF.
I think socializing is more than just communication implies,
though;
By the way, it would be lovely not to give communication a name that
sounds like nationalization... sometimes I see people coming to
communities saying they're «socializing» something and it feels weird. ;-)
(Especially as it's false good news.)
https://en.wiktionary.org/wiki/socialize
(transitive) To
Are we going to do something about the ~200 (?) estimated bug fixes
without a release note, and other missing release notes?
https://www.mediawiki.org/wiki/Thread:Manual_talk:Coding_conventions/Release_notes_and_bug_fixes
Nemo
___
Greg, «Can we actually start thinking about #5?» Fine with me. As step
#5 was about what you have to do, as in communicate (from what I
understand), and you ask us not to start with the other steps, can I
summarise that the answer to the question «Who is responsible for
communicating changes in
Seb35, 24/03/2013 21:19:
I created some time ago a template on meta for a glossary and applied it
to very basic terms [1], mainly with translation in mind. Another idea is
to use the translate extension on [[meta:Glossary]] to uniformize the
presentation across languages and to use the
Thanks Greg for the overview. Steps 0–4 are fine, but there are a couple
of premises I question, which trigger a couple of considerations/conjectures
that may be of use.
First, let's not call the wmfXX subpages «release notes». They are just
lists of commits. As you noted, they are also created – by design –
Guillaume Paumier, 22/03/2013 14:27:
* Status quo: We keep the current glossaries as they are, even if they
overlap and duplicate work. We'll manage.
Ugly.
* Wikidata: If Wikidata could be used to host terms and definitions
(in various languages), and wikis could pull this data using
Only a handful of wikis have restrictive captcha settings (plus pt.wiki,
which is in permanent emergency mode).
https://meta.wikimedia.org/wiki/Newly_registered_user
For them you could request the confirmed flag at
https://meta.wikimedia.org/wiki/SRP
Personally I found it easier to do the required 10, 50 or
Quim, you seem to be answering the question of how one communicates
changes, but the question of this thread is who is responsible for
doing so. It's quite a difference.
Usually volunteers know the communities better and have fewer problems
with the «how» than others, but that's not the point.
Partly related: to be fair, Aaron asked for comments about release notes
and announcements some months ago (although in that case for schema
changes), but there were none.
http://lists.wikimedia.org/pipermail/wikitech-l/2012-November/064630.html
Nemo
___
There's slow-parse.log, but it's private unless a solution is found for
https://gerrit.wikimedia.org/r/#/c/49678/
https://wikitech.wikimedia.org/wiki/Logs
Nemo
___
IIRC EditWatchlist used to be more robust before the recent
restructuring: https://bugzilla.wikimedia.org/show_bug.cgi?id=39510
Nemo
___
Does «all of the content» include the archive (deleted pages)?
Nemo
___
Mark, your assumptions about LQT are blatantly wrong; in other words,
it's just wishful thinking.
Tests are done as usual by translatewiki.net on the latest code, then some
volunteers take care of the worst problems: usually it's TWN staff, but a
few days ago Krenair submitted fixes for a
Wonderful!
So, now that we have a release manager for MediaWiki, is the release
manager the person who writes/approves the policy for what sort of
things are worth a 1.x.y release and how they're tracked on bugzilla,
and will Chris be able to make and push tarballs for those even if
they're
It's also annoying that while the toolbar (normal or advanced) loads, I
can't type in the header (for section=new) or the edit area, at least on
Firefox (*): is this the same problem?
Nemo
(*) Might also be a recent regression:
https://bugzilla.mozilla.org/show_bug.cgi?id=795232
Thank god when it even loads completely. :)
https://bugzilla.wikimedia.org/show_bug.cgi?id=44188
Nemo
___
Lars Aronsson, 14/02/2013 04:02:
On 02/14/2013 02:56 AM, Erik Zachte wrote:
Lars,
I think you are overdoing it.
The reports are not nonsense, but have over time become more
inaccurate than some other stats we present.
Actually, if the reports had mentioned 'pages served' rather
than
For ages there has been http://vs.aka-online.de/cgi-bin/wppagehiststat.pl
I've never understood why en.wiki links an obscure tool from
[[MediaWiki:histlegend]] rather than this one; maybe it's to avoid DoS'ing
it for the benefit of non-en.wiki projects.
Nemo
We now have some real-life data:
https://bugzilla.wikimedia.org/show_bug.cgi?id=15434#c59
Bug 15434 is the most-reported bug in Wikimedia products (16 duplicates),
but ops should now be able to go ahead and fix it, it seems.
Nemo
___
Luke, «we do not know how many humans are being turned away by the
difficulty»: actually we sort of do; that paper tells us this as well. It's
where their study came from, and it gives recommendations on which captcha
techniques are best for balancing efficacy with difficulty for humans.
We don't seem
Luke, sorry for reiterating, but «brainstorm solutions to a problem
before we measure the extent of the problem» is wrong: it's already been
measured by others; see the other posts...
Nemo
___
And to be more explicit, quoting the paper in case someone really has
doubts: «we deem a captcha scheme broken when the attacker is able to
reach a precision of at least 1%». With our FancyCaptcha we are/were at
25% precision for attackers, so yes, it's officially broken, and it's
been so
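To spell the arithmetic out (mine, not the paper's): precision here is
the share of automated answers that are correct, so 25% means about one
bot attempt in four passes the captcha, i.e. 25 times the 1% threshold
the authors use to declare a scheme broken.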
Congratulations Patrick!
Are you now the right person to ask for decisions/updates on stale
performance/platformeng bugs?
https://bugzilla.wikimedia.org/buglist.cgi?keywords=performance%2C%20platformeng&keywords_type=anywords&resolution=---&query_format=advanced&list_id=172284
Or questions
Matt, let's be clearer then: what you describe is ok ONLY for en.wiki.
ALL the other wikis have a different system.
Thanks,
Nemo
___
I rather think that devs' time would be best spent ensuring that our
tools against Tor users and open proxies are effective and reliable.
Huge amounts of volunteers' time are spent combating abuse of them, with
inadequate tools.
See for instance:
Just FYI, TWN staff has seen no help at all in the last few months, so
everything in
http://lists.wikimedia.org/pipermail/wikitech-l/2012-August/062960.html
is still valid, and the https://bugzilla.wikimedia.org/38638 dependencies are
longer than ever.
The small good news is that now good devs
Writing to village pumps is sometimes useless: not everybody looks at
them, so you still rely on someone to forward the message to its actual
consumers.
If there's a specific affected page, as in this case, you're lucky,
because you can write on its talk page. On big wikis people necessarily use
Yes, the «as already suggested» referred to the translatable message
points, not the part before it.
Matt, there's no such thing as «the appropriate village pump» for
everything everywhere. You used as an example
[[w:en:Wikipedia:MediaWiki messages]]; too bad that only nl.wiki has
something
That data is hardly useful: it doesn't explain what it refers to and,
even when it does, it seems wrong. Compare e.g.
https://www.ohloh.net/p/mediawiki/contributors?query=&sort=commits
Also,
https://bugzilla.wikimedia.org/weekly-bug-summary.cgi?tops=10&days=10 proves
they're not talking about the
For a few years now we have had several query [special] pages, also
called «maintenance reports» in the list of special pages, which are
never updated for performance reasons: 6 on all wikis and 6 more only on
en.wiki. https://bugzilla.wikimedia.org/show_bug.cgi?id=39667#c6
A proposal is to run
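For context, these reports are normally refreshed by the
updateSpecialPages.php maintenance script; a hypothetical run for a
single report (the --only value is just an example):

  php maintenance/updateSpecialPages.php --only=Wantedpages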
In the meanwhile, in such cases, it's useful to add the domain to
https://meta.wikimedia.org/wiki/Live_mirrors and also to the relevant
Internal wiki page
https://meta.wikimedia.org/w/index.php?diff=747627&oldid=441648 (or
whatever page replaced it).
Nemo
TL;DR: Please add feedback to
https://www.mediawiki.org/wiki/Suggestions_for_extensions_to_be_integrated
Now that MediaWiki 1.20 has been released, a discussion (opened by hexmode)
is ongoing on how to make our release notes better, so that users
understand what improvements there are and why
Robla wrote:
I'm sure all of those hooks have a corresponding extension that needs
the new hook.
Right! (Kudos to iAlex for creating all those pages.)
However, most extension pages don't have updated lists of used hooks, so
WhatLinksHere doesn't help.
I suspect it might be easier (!) to check
Andre, the answer is in this one-line comment by Brion:
https://bugzilla.wikimedia.org/show_bug.cgi?id=16614#c3
None of these bugs should be closed unless both the reported occurrence
and the underlying problem are fixed. The alternative is that it's
proven to be an occasional problem which
Thanks Mark for opening this discussion, it's very useful.
Indeed, by reading https://www.mediawiki.org/wiki/MediaWiki_1.20 one
would think that nothing has been done in 1.20, except perhaps some
localisation work. :p
I also agree that https://www.mediawiki.org/wiki/MediaWiki_1.18 is a
good
Sue Gardner, 08/11/2012 07:03:
I kind of have the sense that people are considering this a done
deal. [...]
So to be super-clear: None of this is a done deal at this moment. Lots
of conversations are happening in various places, and it's all good.
That's why Erik made the pre-announcement
Thank you, Erik. Before (or rather than) commenting, I have a single
question below; the rest of the email is just a premise+addendum to it. ;-)
Terry Chay, 07/11/2012 21:04:
You aren't the only one. It turns out we use a lot of industry
terminology, without realizing that we are
See previous discussion at https://meta.wikimedia.org/wiki/Test_wikis
It would be nicer to have answers to the general problem rather than
constant ad hoc solutions which create confusion.
Nemo
___
Thanks for working on this. I would have thought it obvious that Ohloh
should point to gerrit by default rather than GitHub, which is (or when
it is) only a mirror; why are you even considering the contrary?
333 extensions are still on SVN; those need to be added to a project as
well.[1]
Thanks for your reply.
Mmm, are they going to be moved to gerrit? Adding (hundreds of) repos
is just as tedious as updating them again whenever they move.
They're still being sorted out, but most of them are quite dead and
won't be moved I guess. A better thread to ask might be
Thank you, looks like I've just been unlucky and «several» was a severe
overestimation. :)
I'll report other duplicates on your community metrics wiki subpage as
soon as I can, but I still don't know what to do with
duplicate/unclaimed commit IDs.
Can you also create the -extensions-svn project
Can this be brought back/centralised on the appropriate existing channel
(#wikimedia-dev or #wikimedia-office), as was decided for the metrics
meeting?
Thanks,
Nemo
___
WereSpielChequers, 15/10/2012 09:56:
60 edits a minute sounds high, and probably faster than most of these
sessions run at, but not if it is, as I suspect, calculated every few
seconds.
It's not, as far as I can see. This is how it works:
https://www.mediawiki.org/wiki/Manual:$wgRateLimits
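The manual boils down to entries like $wgRateLimits['edit']['user'] =
array( 8, 60 );, i.e. at most 8 edits per 60 seconds. As far as I can
tell, the check behaves like a counter with an expiry, not something
recalculated every few seconds; a toy Python model of that reading
(MediaWiki's real logic lives in User::pingLimiter(); this is only a
sketch):

  import time

  class PingLimiter:
      # Toy model: at most max_actions per period seconds, tracked as a
      # single counter that expires together with its window.
      def __init__(self, max_actions, period):
          self.max_actions, self.period = max_actions, period
          self.count, self.window_end = 0, 0.0

      def allow(self):
          now = time.time()
          if now >= self.window_end:  # previous window expired: reset
              self.count, self.window_end = 0, now + self.period
          self.count += 1
          return self.count <= self.max_actions  # throttle past the cap

  edit_limit = PingLimiter(8, 60)  # mirrors array( 8, 60 ) above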
https://meta.wikimedia.org/wiki/Importer has more info.
Nemo
___
Hello Andre, I think you should move the inventory you posted on en.wiki
to a subpage of https://www.mediawiki.org/wiki/Bugzilla so that it can
be edited and discussed on talk.
The «Current situation» section seems almost complete; on the contrary,
the «Proposed pages» section is completely unclear to me, you
I think this is supposed to be the page summarizing the issue and the
path forward: https://www.mediawiki.org/wiki/Files_and_licenses_concept
Nemo
___
Just reiterating that discussing individual preferences (not to mention
touching them) is a very bad idea before we have good criteria of some
sort to evaluate them.
Usage stats are a requirement, but such criteria are not only about them,
because some preferences may be of small harm for many
Just a quick note that, as announced in August
http://lists.wikimedia.org/pipermail/mediawiki-l/2012-August/039678.html,
translatewiki.net yesterday dropped support for about 260
extensions/message groups.
People still using code from SVN will no longer have the localisation
updates they might
Eventually we'll probably bump ?all to a stricter ~all aka
SoftFail, which tells the receiving side that only mail coming from
the listed subnets is valid. Most ISPs will route 'other' mail to a
spam folder based on SoftFail.
I guess this means that people will no longer be able to successfully
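For concreteness, the whole difference between the two policies is one
qualifier character in the domain's TXT record (addresses made up):

  example.org. IN TXT "v=spf1 ip4:203.0.113.0/24 ?all"  ; ?all = Neutral
  example.org. IN TXT "v=spf1 ip4:203.0.113.0/24 ~all"  ; ~all = SoftFail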
«Autopratica» is actually not a valid word; or rather, it's a neologism
from a new/fringe theory, possibly grammatical thanks to the
productivity of «auto-» but slightly confusing due to the jargon meaning
of «pratica» here.
That said, the reason is surely in that link label, which is the only
use
Diederik van Liere, 14/09/2012 19:18:
The Analytics Team is happy to announce the first version of
gerrit-stats. Gerrit-stats keeps track of the backlog of code review for
individual Git repositories.
\o/
4) Will you add metrics for individual committers?
Right now, the unit of analysis is a
Shouldn't you be using ZIM, and aren't dumpHTML and siblings The Right
Way to do it? See also http://openzim.org/Build_your_ZIM_file
Nemo
___
Tomasz Ganicz, 27/08/2012 13:27:
Detailed information regarding Wikimedia Foundation's servers and
network can be found here:
http://wikitech.wikimedia.org/
Which is completely useless for any kind of high-level information, in
particular [[Server roles]], which is dead and has no
Daniel, interesting experience.
The case of requested configuration changes is rather simple though:
they require consensus, so the place you were looking for is linked in
comment 0. For the same reason, there's usually no need to tell the
community; the reporter will take care of that.
Siebrand Mazeland (WMF), 20/08/2012 19:17:
You are invited to the Localisation team development demo on Tuesday 21
August 2012 at 15:00 UTC (other time zones: 08:00 PDT, 17:00 CEST,
20:30 IST). This meeting will take 40 minutes at most. In this meeting
the Localisation team will present its
About colleagues vs. customers: I don't think it can be considered a
misunderstanding by the community; it's largely due to what the WMF
really wants.
The WMF, as the article puts it, doesn't [necessarily] want to work
better with the existing community (= colleagues) by providing what's
felt
I doubt fixing this requires rewriting mailman. It only requires dummy
messages to be reinserted where they've been deleted and the archives to
be rebuilt after this, just as if the correct procedure had been
followed from the start.
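With Mailman 2 that rebuild is essentially one command once the mbox has
been fixed up (paths are the usual defaults, not checked):

  # regenerate the pipermail archive from the edited raw mbox
  bin/arch --wipe wikitech-l \
      /var/lib/mailman/archives/private/wikitech-l.mbox/wikitech-l.mbox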
This, by the way, is by some orders of magnitude easier and
Tyler is right. So let's be positive, read that «what's your plan» as
a generic you, a question to the audience/community, and get the issue
fixed all together. I made my proposal/question/suggestion; it's the
best I can do.
Alternatively, of course we could as well spend our energies in
«This is like the 27637862487 time that links have been broken due to
the exact same action.» I can bear hyperboles, but this is a bit
excessive, and it looks like just another attempt to make this flame
bigger: I hope the reality is closer to two or three times in the past
couple of years, and
Thanks Daniel. I don't understand: how can a message need to be removed
completely? I can't imagine anything which couldn't just be redacted by
leaving at least the message's skeleton, as demanded by
https://wikitech.wikimedia.org/view/Remove_a_message_from_mailing_list_archive
Nemo
I suppose the change is ok, but some old users with shared usernames
which are not unified will be forced to use a second username, when
currently they are not, and they're not disturbing each other, because
each of them is active only in a single language.
Old users are mostly inactive nowadays
Let me guess: you're using Chrome/Chromium over HTTPS? Not much to do
besides changing browser.
https://code.google.com/p/chromium/issues/detail?id=109555
Nemo
___
«Our most prolific reviewers seem to prefer Gerrit»{{citation needed}}
Do you mean the reviewers who are most prolific right now, those who
were most prolific before the Git conversion, or a mix of the two?
And where are the stats? I know the analytics team is working on
gerrit-stats but not
Ehm, I know that I'll sound like a broken record, but look at the
WikiCAPTCHA proposal: it's just a proposal, but it could address the
problem just by fetching books from the relevant Wikisource.
Links in: https://www.mediawiki.org/wiki/CAPTCHA
Nemo
Sumana Harihareswara, 24/07/2012 22:47:
Ohloh would sure be a nice resource - I'm not sure how to get it fixed
exactly, but please feel free to poke around, tell Ohloh where our new
repository is, and try to get it fixed. Sorry, it's a low priority for
me right now, but you have my
Practically, though, the difference is minimal. The ambassadors are
just the tech-savvy non-English-only interested users who spread the
word in the non-English Wikimedia world, which is exactly what happens
with any English-only announcement list.
The two differences I see are that
Just in case someone thinks otherwise, none of my doubts has been
addressed. The point is what Helder stressed, and a lot of work is
needed to reduce confusion.
Nemo
___
Strainu: «Actually, you shouldn't AGF.»
Why not? http://meatballwiki.org/wiki/AssumeStupidityNotMalice
«If I am to be the bad guy for something to change, so be it. I will
live with that.»
Thank you for proving you're only playing the martyr.
Nemo
Now that the 1.19 deploy has been completed, it seems a good idea to
discuss our test wikis.
I see the following problems:
* No inventory of test wikis exists. There are many, and they don't follow
any pattern or rule of any sort (see appendix).
* It's unclear what purpose those wikis have, in
I don't think it's ever executed on small wikis either. We have
5-year-old ghost entries in the Wanted* special pages on it.quote.
They're like family friends now, but still ghosts.
Nemo
___
There are several maintenance scripts that have been waiting to be run
on many wikis for years:
https://bugzilla.wikimedia.org/showdependencytree.cgi?id=29782&hide_resolved=1
The biggest offenders are:
* https://bugzilla.wikimedia.org/show_bug.cgi?id=16112 which breaks tons
of special pages;
*
Thanks again!
Opt-in for the sidebar seems better to me. The best would be to make it
very easy to add, as with the «* SEARCH» keyword, which allows customising
the position of the search bar in Monobook, as sketched below.
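Something like this in [[MediaWiki:Sidebar]], where the position of the
magic word sets where the box appears (sketch from memory; check
Manual:Interface/Sidebar):

  * navigation
  ** mainpage|mainpage-description
  ** recentchanges-url|recentchanges
  * SEARCH
  * TOOLBOX
  * LANGUAGES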
Nemo
___
Does anyone have any stats on how far short we are of that
goal? As in, what fraction of accounts on all wikis are still part
of the 'messy' part of SUL rather than the 'clean' part?
--HM
What's messy and what's not? Real username conflicts seem pretty rare
nowadays and it's not really a
We sort of use IA's data already, because many Wikisource texts are
OCR'ed on IA. If we manage to use OCR improvements within DjVu, it
shouldn't be too difficult to reupload such DjVu in their items and then
they could do what they want with them.
OCRs generally work by finding lines of text
Guillaume Paumier, 24/08/2011 16:36:
The AbuseFilter extension for MediaWiki, which helps prevent vandalism
on wikis, will be globally enabled on all Wikimedia projects later
today.
More information is available at