Noman wrote:
Hi,
I've installed MediaWiki for a wiki project.
We now have 4 sections on the main page, like the ones on the Wikipedia main page.
As on Wikipedia, these 4 boxes are tables and update on a date
criterion.
What I want to do now is to add some kind of navigation bar, like a drop
Jeroen De Dauw wrote:
Would it not be enough to hash all extensions on the distributor side, and
to check the hash sum on the client side using https for the connection?
I guess this would suffice for ensuring integrity, but what about the other
distribution meta-data? Where to get it from,
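The integrity scheme sketched above (hash every extension on the distributor side, verify the hash on the client side, and rely on HTTPS for the transport) can be illustrated with a minimal Python sketch. The function names are mine, not from any real distributor tooling, and SHA-256 is assumed as the hash; the thread does not specify one.

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path, expected_hexdigest):
    """Return True iff the downloaded file matches the published hash sum.

    The expected digest would come from the distributor over HTTPS,
    which is what binds the hash to a trusted source.
    """
    return sha256_of_file(path) == expected_hexdigest
```

This only ensures integrity of the tarball itself; as the thread notes, the rest of the distribution metadata still needs a trusted source.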
Павел Петроченко wrote:
Hi guys,
At the moment we are discussing the opportunity to create a full-scale
true WYSIWYG client for MediaWiki. We currently have a technology
that should allow us to implement it with good quality and quite fast.
Unfortunately we are not sure
if there is a
Roan Kattouw wrote:
2010/8/2 Aryeh Gregor:
That I don't know. I don't know if descriptions of the Usability
Initiative's studies are all public, or what. Maybe one of them could
fill us in.
There are videos around, yes, but I'm not sure we have reports.
Digging around on usabilitywiki
Aryeh Gregor wrote:
Look, this is just not a useful solution, period. It would be
extremely ineffective. If you extended the permitted staleness level
so much that it would be moderately effective, it would be useless,
because you'd be seeing hours- or days-old articles. On the other
hand,
Edward Z. Yang wrote:
We've noticed several things:
- When Wordpress 3.0 came out, we received several support tickets
asking us when we would be pushing an upgrade, and asked us if
anything bad would happen if they went ahead and upgraded their
install themselves. We
Roan Kattouw wrote:
One easy hack to reduce this problem is just to only provide a few
options for stub threshold, as we do with thumbnail size. Although
this is only useful if we cache pages with nonzero stub threshold...
why don't we do that? Too much fragmentation due to the excessive
Roan Kattouw wrote:
2010/8/1 Platonides:
Aryeh, can you do some statistics about the frequency of the different
stub thresholds? Perhaps restricted to people who edited this year, to
discard unused accounts.
He can't, but I can. I ran a couple of queries and put the result at
http
Edward Z. Yang wrote:
There's probably some interesting knowledge on looking how they patched
it, but I don't know how to easily extract it.
A good starting point would probably be the most edited files.
Cheers,
Edward
I'm open for any data :)
My guess is that the most edited files are the
Chad wrote:
On Fri, Jul 30, 2010 at 3:57 PM, Platonides platoni...@gmail.com wrote:
Bryan Tong Minh wrote:
Also, in places where no memcached or equivalent is available (i.e.
CACHE_NONE), this will not work.
Then you could be using the objectcache table in the database.
No, that's
Aryeh Gregor wrote:
On Thu, Jul 22, 2010 at 4:02 PM, David Gerard dger...@gmail.com wrote:
This is a perennial proposal. It's an idea I like, as it puts control
in the hands of the viewer rather than third parties. All it requires
is someone to code something that passes muster as being
Aryeh Gregor wrote:
On Fri, Jul 23, 2010 at 8:00 AM, Platonides platoni...@gmail.com wrote:
You would need to reparse on edit (which changes categories) all pages
including the image. Even if the image comes from commons or another
ForeignRepo.
Not as easy, I think, but this is a long wanted
Aryeh Gregor wrote:
On Fri, Jul 23, 2010 at 12:41 PM, Alex Kozak ako...@creativecommons.org
wrote:
Ah, ok my apologies. Maybe you could add a note describing some of the
things that would break to
http://www.mediawiki.org/wiki/Manual:$wgUrlProtocols?
I've corrected some inaccuracies on
Aryeh Gregor wrote:
This is C-oriented, but the application to MediaWiki is fairly clear.
Extensions will invariably make function calls back and forth to core
code, and share data structures (= objects). This conventional
understanding is reflected in MediaWiki's README file, which has
Tim Starling wrote:
The problem is just that increasing the limits in our main Squid and
Apache pool would create DoS vulnerabilities, including the prospect
of accidental DoS. We could offer this service via another domain
name, with a specially-configured webserver, and a higher level of
Tim Starling wrote:
There's still quite a lot of work to do to get the new installer ready
for 1.17. I think we should focus on that, and avoid expanding the
scope of the project until we've reached that milestone.
There are the issues discussed here:
Aryeh Gregor wrote:
As I discussed with a few others at Wikimania, it'd be nice to take
this one step further and allow multiple people to sign off on a
revision, possibly with various types of sign-off, like:
* I read the diff and it looks good
* I tested this and seems to work
* I reviewed
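The multiple sign-off types proposed above could be modelled as a small data structure. This is only an illustrative sketch of the idea; the type names and fields are hypothetical, not anything from MediaWiki's actual revision-review code.

```python
from dataclasses import dataclass
from enum import Enum

class SignOffType(Enum):
    """Kinds of sign-off a reviewer might attach to a revision (names invented)."""
    READ_DIFF = "read-diff"   # "I read the diff and it looks good"
    TESTED = "tested"         # "I tested this and it seems to work"
    REVIEWED = "reviewed"     # full review

@dataclass(frozen=True)
class SignOff:
    revision_id: int
    reviewer: str
    kind: SignOffType

# Several people can sign off on the same revision with different types.
signoffs = [
    SignOff(12345, "alice", SignOffType.READ_DIFF),
    SignOff(12345, "bob", SignOffType.TESTED),
]
kinds_for_rev = {s.kind for s in signoffs if s.revision_id == 12345}
```

The point of the enum is that a revision's review status becomes the *set* of sign-off kinds it has accumulated, rather than a single boolean.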
Roan Kattouw wrote:
2010/6/13 Platonides platoni...@gmail.com:
Do we need *both* values?
It could simply contain http://foo.com/etc (API) or
mysql://localhost:3306/abc (dbname)
I don't like using one field for two different things like that, but
besides that, it'd be nice to have the API URL
Victor Vasiliev wrote:
Are there any public Bugzilla and Subversion dumps? I'd like to download
them and experiment with BZ/RM, but I need real data for it (and don't
really like the idea of downloading it manually).
--vvv
There are subversion dumps at http://svn.wikimedia.org/dumps/
I am
Roan Kattouw wrote:
2010/6/13 Chad innocentkil...@gmail.com:
1) iw_trans - I don't think this needs to become more than a
boolean like it is. If we allow transwiki inclusion, we'll have
to use a DB or API connection. Since a DB connection will
always be preferable to an HTTP request to the
Helder Geovane wrote:
Currently it is possible to define the sortkeys using
{{DEFAULTSORT:Sortkey}}
and in lots of places this sortkey coincides with the value of magic
words like {{PAGENAME}} and {{SUBPAGENAME}}, so we don't need to
update them manually. The (annoying) exception is when the
Bence Damokos wrote:
Hi,
It might just be my browser's strange cookie handling, but it seems to me
that a number of WMF programme wikis don't log me in automatically,
nowadays.
Is it possible to enable automatic login on some of the special project
wikis that have been set up lately?
Ryan Chan wrote:
Hello all,
I remember that in the old days, UTF-8 strings were stored as varbinary; is
there a reason to change to varchar(255) binary?
Also, what is the default server/connection/client character set settings now?
Thanks.
MediaWiki supports both ways. Wikipedia still uses the
Ryan Chan wrote:
Hello,
On Sun, Jun 6, 2010 at 11:12 PM, Platonides platoni...@gmail.com wrote:
MediaWiki supports both ways. Wikipedia still uses the MySQL 4
compatible options, and since MySQL chars only support the BMP, it isn't
likely to change.
It all depends on what you choose
Rob Lanphier wrote:
A full test pass with all of the different configurations isn't going to be
possible, so some help with testing the different configurations would be
wonderful. We'll have a fast fallback plan in place should we accidentally
break the other wikis, but obviously it'd be
Aryeh Gregor wrote:
On Fri, Jun 4, 2010 at 3:39 PM, Chad innocentkil...@gmail.com wrote:
This isn't necessarily hard. If there's a specific area in the HTML
we can inject them, we could easily add a {{#icon}} parser function
or similar that could affect these sorts of icons (and kill the need
Tisane wrote:
A modified version is already there:
http://www.mediawiki.org/wiki/Manual:Wikibot Hopefully at some point we can
come up with a more distinctive name than Wikibot.
Now there is some discussion, by the way, as to whether bot maintenance
should be moved to a different SVN
Daniel Friesen wrote:
^_^ hackish isn't that bad in some sense. I'm currently experimenting
with some farm code that works completely outside of MediaWiki rather
than as an extension sitting inside of it. Using a sandbox it can get
access to the MediaWiki install and extract info from it in
Robert Ullmann wrote:
I've looked at this a bit more. There are more serious problems.
Apparently, no-one converted the 5.0 titles in the wiki to 5.1 when
normalization was turned on; there are pages that can't be accessed.
(!) for example, try this (Malayalam for fish):
Since you are storing the image metadata in the DB, try to make
the schema able to store metadata coming from the page, so it can be
used to implement bug 8298 or extensions like ImageFilter.
___
Wikitech-l mailing list
Why do you need to use CSS absolute positioning?
Since it's output from an extension, you could place it in an appropriate
place outside the bodyContent.
Peter17 wrote:
I didn't set $wgUploadPath. Just $wgUseInstantCommons = true; The
image URLs are actually transformed to remote URLs:
I work on my own local wiki, whose address is
http://localhost/mediawiki/, and I am transcluding
{{mediawikiwiki::User:Peter17}} which contains
Conrad Irwin wrote:
Wouldn't removing 1.6 from the main page solve the problem for most
newcomers? Only those who go down to the PHP 4 section of the
downloads need ever know it exists and thus get the impression that it
is an older version. Once they're no longer newcomers, we can hope
that
Aryeh Gregor wrote:
On Mon, May 24, 2010 at 8:27 PM, Q overlo...@gmail.com wrote:
I would suggest not going the shared-database route unless the
code can be fixed so that shared databases actually work with all of the
DB backends.
I don't see why it shouldn't be easy to get it
church.of.emacs.ml wrote:
However, you'd have to worry that each distant wiki uses only a fair
amount of the home wiki server's resources. E.g. set a limit on
inclusions (that limit would have to be on the home-wiki-server side)
and disallow infinite loops (they're always fun).
Infinite loops
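The two safeguards mentioned above for remote transclusion (a per-request inclusion limit enforced on the home wiki's side, plus refusing infinite loops) can be sketched together. This is my own illustrative structure, not MediaWiki's actual transclusion code.

```python
class InclusionGuard:
    """Track remote-transclusion usage for one request: a fixed inclusion
    budget, plus cycle detection on the stack of templates being expanded."""

    def __init__(self, max_inclusions=100):
        self.max_inclusions = max_inclusions
        self.count = 0    # total inclusions performed so far
        self.stack = []   # titles currently being expanded (for loop detection)

    def enter(self, title):
        if self.count >= self.max_inclusions:
            raise RuntimeError("inclusion limit exceeded")
        if title in self.stack:
            # A title already on the expansion stack means a transclusion loop.
            raise RuntimeError("transclusion loop: " + " -> ".join(self.stack + [title]))
        self.count += 1
        self.stack.append(title)

    def leave(self):
        self.stack.pop()
```

The expander would call `enter()` before fetching each remote template and `leave()` after expanding it; the budget is cumulative, while the stack only reflects the current nesting.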
for remote users, it
can be later refined to add more backends.
Anyway, I don't think an API request would be cacheable by the Squids, so it
would be directly passed to an application server.
On Tue, May 25, 2010 at 9:22 AM, Platonides platoni...@gmail.com wrote:
He can internally call the api from
http://www.mediawiki.org/wiki/User:Peter17/Reasonably_efficient_interwiki_transclusion#Good_points
Seems it doesn't work so well. It was inadvertently broken for wikitext
transclusions when the interwiki points to the nice url. See
'wgEnableScaryTranscluding and Templates/Images?' thread at
We should probably normalise to 5.1 on all wikis.
I can view the 5.0 characters but not the 5.1 ones, though.
But would
someone tell me where in the server code this is done? I have not been
able to find it. Then I can understand a bit better, possibly just fix
it in the bot code somehow, or
As far as I know, Wikipedia users can change their status by becoming a helper
or admin, or by joining other groups. Since when has user-group data like this
been shown, and on what date was it collected? What should I do if I want
data showing the changes of user status?
You'd need to check the
Conrad Irwin wrote:
Just as with image captchas, you'd need to introduce noise into it.
If you are working from known constituents, you can use
cross-correlation to ignore noise pretty effectively (I believe it's
what humans do). The choice then is either to make the noise sound
like the
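The cross-correlation attack described above (sliding a known constituent sound across the noisy captcha audio and looking for the correlation peak) can be demonstrated with a small self-contained sketch. The signal here is synthetic: a sine "clip" buried in Gaussian noise, standing in for a known captcha constituent.

```python
import math
import random

def cross_correlate(signal, token):
    """Sliding dot product of a known constituent against a noisy signal."""
    n = len(token)
    return [sum(s * t for s, t in zip(signal[i:i + n], token))
            for i in range(len(signal) - n + 1)]

rng = random.Random(0)
token = [math.sin(2 * math.pi * 5 * i / 200) for i in range(200)]  # known clip
signal = [rng.gauss(0, 0.5) for _ in range(1000)]                  # background noise
offset = 400
for i, t in enumerate(token):                                      # bury the clip in noise
    signal[offset + i] += t

corr = cross_correlate(signal, token)
detected = max(range(len(corr)), key=corr.__getitem__)             # correlation peak
```

Despite noise with half the clip's amplitude, the peak of `corr` lands at (or within a sample or two of) the true offset, which is why additive noise alone is a weak defence when the attacker knows the constituent sounds.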
We should start a page at mediawiki.org listing the Pros and Cons of
each option.
Church of Emacs:
I'm still not sure whether RevisionDelete is aimed at replacing the old
deletion schema completely, including the archive table. Could anyone
comment on that please?
It's the logical evolution
Chad wrote:
On Tue, May 18, 2010 at 1:39 PM, Platonides wrote:
We should start a page at mediawiki.org listing the Pros and Cons of
each option.
Put it as a subpage of [[RFC]].
-Chad
I started it at
http://www.mediawiki.org/wiki/Requests_for_comment/Page_deletion
Please, collaborate
On Mon, May 17, 2010 at 6:46 PM, Bryan Tong Minh wrote:
On Mon, May 17, 2010 at 6:41 PM, church.of.emacs.ml
church.of.emacs...@googlemail.com wrote:
Special:Undelete doesn't use a pager like in the page history, where
you can browse the revisions. Instead, all revisions are displayed – and
Tim Starling wrote:
Audio CAPTCHAs, like visual CAPTCHAs, are not accessible for all
people and do not conform to W3C accessibility guidelines. What's
more, they're easier to crack than visual CAPTCHAs due to their
one-dimensional nature. This is especially true if you use a public
source
emijrp wrote:
Hi all;
Solving a captcha during registration is mandatory. Can this be replaced with
a sound captcha for visually impaired people? It is a suggestion for the
usability project too. Thanks.
Regards,
emijrp
That's an old bug
https://bugzilla.wikimedia.org/show_bug.cgi?id=4845
vyznev wrote:
I'd suspect some script is trying to do an API query using a very long URL,
and the fact that this only happens when JS is enabled lends support to
this.
I don't see any long URL requested on enwiki.
I would guess that the Javascript associated with Vector is using more
Daniel Friesen wrote:
I've been experimenting with a MW related project after all this time away.
No-one is in IRC so I had two minor questions I needed clarified.
Perhaps not at the time you sent the email. You should try at an
appropriate US time (around evening in Europe).
What's the
Alex Brollo wrote:
On 11 May 2010 17:23, Makelesi Kora-Gonelevu makele...@gmail.com wrote:
Hi, I seem to have a problem with my wiki. Every time I bold or italicise a
text
and save it, it keeps adding more ' ' '. Has anyone ever come across this
problem?
Do you have magic quotes or another
Svip wrote:
On 12 May 2010 13:33, Platonides platoni...@gmail.com wrote:
Alex Brollo wrote:
This question gives me the opportunity to ask the experts about
server load. Is it really so much harder for the server to manage HTML tags like
<b>, </b>, <i>, </i> instead of the usual wiki markup
Tomasz Finc wrote:
I think that would be a nice status update. If someone can write the code...
I'll happily dig up where this report lives.
--tomasz
The report is generated from isidore:/home/reporter/bugzilla_report.php
The script lives at svn-private/wmf/reports
It probably isn't using
Svip wrote:
On 12 May 2010 13:57, Platonides platoni...@gmail.com wrote:
You could use <span style="font-weight: bold">, but what's the point of
that?
Because it would be correct HTML.
Using <b> and <i> where what you really want is to make it bold and
italic isn't deprecated.
Incorrect
Roan Kattouw wrote:
2010/5/12 Platonides platoni...@gmail.com:
$n = Sql_query( "SELECT (SELECT COUNT(*) FROM code_rev WHERE
cr_repo_id=1 AND cr_timestamp = '$epoch' $extra AND cr_path LIKE
'/trunk/phase3%') + (SELECT COUNT(*) FROM code_rev WHERE cr_repo_id=1
AND cr_timestamp = '$epoch
Could CodeReview data easily be merged here?
It would be cool to have (for phase3) the number of new revisions,
how many revisions have been marked new since the latest release, and the number
of fixmes...
That would be a few simple SQL queries, so I don't think there would
be issues with it (although
Aryeh Gregor wrote:
On Mon, Apr 26, 2010 at 4:02 AM, Dmitriy Sintsov ques...@rambler.ru wrote:
Wouldn't it be enough just to define an entity?
http://www.criticism.com/dita/dtd2.html#section-ENTITIES
I used such a definition for nbsp once in an XSL sheet. I don't know how well
it works alone in XML.
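For the entity question above: a document can declare an entity like nbsp itself in an internal DTD subset, and a standard XML parser will expand it with no external DTD involved. A minimal check with Python's stdlib parser (expat, via ElementTree):

```python
import xml.etree.ElementTree as ET

# The internal DTD subset declares &nbsp; as U+00A0;
# expat expands internal general entities like this by default.
doc = ('<?xml version="1.0"?>\n'
       '<!DOCTYPE p [ <!ENTITY nbsp "&#160;"> ]>\n'
       '<p>a&nbsp;b</p>')
root = ET.fromstring(doc)
text = root.text  # the entity has been expanded to a literal no-break space
```

Without the `<!ENTITY>` declaration, the same parser would reject `&nbsp;` as an undefined entity, which is exactly why HTML-style entities need declaring when used "alone in XML".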
You don't need to specify all parameters:
{{man|bash}} Open bash(1)
{{man|passwd|section=5}} Documentation about /etc/passwd
{{man|read|section=2|os=bsd}} Open the man page of the syscall read(2)
{{man|apt-get|section=2|os=debian|version=5}} Use the man page from Debian 5
(I know, I know,
Aran Dunkley wrote:
Hi I'm wondering if anyone can help with this multibyte character
corruption:
http://aqes.organicdesign.tv/Categor%C3%ADa:Arquitecto
The site was moved from a shared host to a dedicated server, but now
it's not rendering the multibyte characters properly in the
Aran Dunkley wrote:
$wgDBmysql5 is set to false, the show create table for page gives this
on both the original and the new server:
mwiki_page | CREATE TABLE `mwiki_page` (
`page_id` int(8) unsigned NOT NULL AUTO_INCREMENT,
`page_namespace` int(11) NOT NULL DEFAULT '0',
Aryeh Gregor wrote:
Anyway, this list is used mainly by MediaWiki developers and Wikimedia
sysadmins, not third-party/corporate users. You could try asking at
the mwusers.com forum, or some place like that, for an answer from
someone who's actually in a similar situation to you.
In fact,
Aryeh Gregor wrote:
On Sat, Apr 10, 2010 at 8:20 AM, Roan Kattouw roan.katt...@gmail.com wrote:
There's no config var for that like there is for the skin, cache and
upload paths, but you could always use symlinks for the
/languages/messages/ directory (provided you're not on Windows).
You could express those relationships using Semantic MediaWiki.
Aryeh Gregor wrote:
On Mon, Mar 29, 2010 at 12:46 PM, Chad wrote:
What if it was written as an extension and moved to /extensions?
Then we get the benefit of decoupling Math from the core software,
What benefit is this? It's not realistically decoupled from the core
software unless it
Damon Wang wrote:
Option (2) is the most maintainable and feasible option, and it's
precisely the one that cannot be done in PHP. As far as I know, PHP has
no parser-generator package. (Please, please let me know if that's
incorrect so I can stop embarrassing myself and get on with writing a
Roan Kattouw wrote:
2010/3/26 Amir E. Aharoni amir.ahar...@mail.huji.ac.il:
Thanks.
Note, however, that if you add JS snippets, it may make it usable to me, but
probably not quite usable to many people who don't want to learn to edit
their vector.js. I've been editing Wikipedia for 5 years
The new Message system should allow passing a Language class
representing the language that is to be used.
This way, by passing $wgContLang instead of the default $wgLang, you could get
the message in the content language.
Bryan Tong Minh wrote:
On Tue, Mar 30, 2010 at 2:13 PM, Platonides wrote:
Changing to Python will also break things for people who compiled math, updated
without reading the release notes, and don't have Python.
While this is of course possible, how big is the chance that somebody
will have OCaml
RobH will be installing a new secure gateway (gilman) shortly. It was
racked some hours ago.
Jean-Marc van Leerdam wrote:
Regardless of whether you succeed in hiding the 'view source' tab, how
will you counter URL manipulation by the user?
AFAIK anyone can change the URL to
...index.php?title=Pagename&action=edit
and then get presented with the 'view source' results (or the edit
Chad wrote:
He's not asking about rights changes for user accounts. He wanted
to look at the historical values of $wgGroupPermissions.
Like was said earlier in the thread, the only way to find that is to do some
digging through Bugzilla and who knows where else to find discussions
Aryeh Gregor wrote:
As long as the worst that could happen on a large majority of
installations is DoS, I don't think we should be afraid to rewrite the
code just because *maybe* it would be less secure. We should
obviously check over the new code carefully, but I wouldn't say it's
any more
Python is a nice language. PHP (portability) or C/C++ (speed) would be
better but Python is preferable to OCaml.
You mention ANTLR; something like that could be good because it should
allow generating the same parser in a different language without much
effort (probably you won't have
Happy-melon wrote:
I took it to mean that he wanted to split the math parsing out as a
**MediaWiki** extension, implementing math as a parser tag hook in the
usual way. Which is definitely highly desirable.
--HM
Making it a MediaWiki extension is of course desirable (moving texvc out
of
Thomas Dalton wrote:
How about an anti-flood filter than stops emails that are identical to
an email sent within the last hour? That ought to stop people
auto-responding to themselves, at least. I think we can live with one
auto-response per person (which is all most clients send, anyway,
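The anti-flood filter proposed above (drop emails identical to one sent within the last hour, which breaks auto-responder loops while still allowing one auto-response per person) could look like this. The class and its interface are my own sketch, not the mailing-list software's actual filter.

```python
import hashlib
import time

class FloodFilter:
    """Drop messages identical to one the same sender sent within the window."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.seen = {}  # (sender, body-hash) -> time of last accepted send

    def allow(self, sender, body, now=None):
        """Return True if the message should go through, False to drop it."""
        now = time.time() if now is None else now
        key = (sender, hashlib.sha256(body.encode("utf-8")).hexdigest())
        last = self.seen.get(key)
        if last is not None and now - last < self.window:
            # Identical message inside the window: likely an auto-responder loop.
            return False
        self.seen[key] = now
        return True
```

Keying on (sender, body-hash) means the first auto-response still gets delivered; only the repeats within the hour are suppressed.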
Zeyi wrote:
Hi,
Firstly, congratulations on this! As I know, it has taken a long time!
May I ask a small question: what is the difference between the current dump and
the history dump? I know the current one only includes current edits, and the history
one has all edits, as the introduction said.
You have
Tisza Gergő wrote:
Hi,
is there a MediaWiki feature or external tool to get a live feed of Commons
file
uploads? (By live I mean something that can be used to show a realtime
slideshow
of new images. I vaguely remember someone saying that the WMF office has a
screen with such a
Dmitriy Sintsov wrote:
Casey Brown:
You can change the language of the Commons interface in your
preferences, instead of using the ?uselang= parameter all the time.
As other people correctly pointed out, it is not convenient to select
the language manually in the user's options (especially for
K. Peachey wrote:
The reason I suggested a separate IRC channel (since I wrote about it
at night, my crazed ramblings probably didn't make much sense), or a
communication venue in general (for example, people have pointed out a
mailing list), is because, as someone pointed out (someone, Dmitriy i
Tomasz Finc wrote:
Brian J Mingus wrote:
On Wed, Mar 10, 2010 at 8:54 PM, Tomasz Finctf...@wikimedia.org
mailto:tf...@wikimedia.org wrote:
Yup, that's the one. If you have a fast upload pipe then I'm more than
happy to set up space for it. Otherwise it should be arriving in our
Roan Kattouw wrote:
2010/3/10 Conrad Irwin:
This should work the same way as extension parser-tests, with the tests
in the extension's directory, included using a config variable.
Otherwise, extensions are split over two places, so there's not such an
easy way to just tarball them, and it's
Roan Kattouw wrote:
2010/3/10 Platonides platoni...@gmail.com:
Shouldn't that server have the extensions checked out and installed on a
local MediaWiki before running them?
(I am assuming it's a single server testing against localhost)
The point is it's not a single server testing them
Makelesi Kora-Gonelevu wrote:
I know how to include tables but I'm not sure how to include them in the
template.
Just write the table in the template. It's the same syntax.
Chris Lewis wrote:
I hope I am emailing this to the right group.
It is.
My concern was about MediaWiki and its limitations, as well as its outdated
methods. As someone who runs a wiki, I've gone through a lot of frustrations.
Maybe you should list your frustrations? It may be a problem on
Danese Cooper wrote:
Would we not just run an LDAP server? It's an open standard, after all.
How do you envision using a contact database?
D
It's nice to see you here Danese :)
fl wrote:
No, they can't. As far as I am aware, MediaWiki is released under the
GNU General Public License[1], which stipulates, among other things, the
requirement to release a program's source code to the public and to
release any derived changes under the same license[2].
If the WMF were
Huib Laurens wrote:
Hello,
I was trying to figure out how to work with Central Auth but the page
on the wiki is kind of small and not clear.
I have four wikis, all running on the same server with MediaWiki
1.16alpha and MySQL.
All four wikis have their own database:
main
intern
llam
Siebrand Mazeland wrote:
Would also appreciate information about any extension in this list that may
be obsolete on MediaWiki trunk; this will allow us to tag it in trunk and
remove it after branching the next MediaWiki version.
How are we dealing with obsolete extensions? Just deleting the
Aryeh Gregor wrote:
On Tue, Feb 9, 2010 at 8:35 PM, Trevor Parscal tpars...@wikimedia.org wrote:
Merging Monobook and Modern is actually a good point for one of my other
ideas, which is to have themes for skins. In other words, same HTML
generation, different CSS. Then Monobook and Modern
Trevor Parscal wrote:
I've been very busy with deployments, so I've been waiting to talk here,
but I think that there are a lot of considerations for a skinning system
that I could help shed some light on when I get some time (maybe next
week?). The Usability Initiative's last hard deadline
Roan Kattouw wrote:
2010/2/8 Jack Phoenix:
Hi all,
MediaWiki's skin system has been quite a mess for some while. I decided to
try rewriting it
AWESOME! Someone finally took up this task :D
and you can see the end result on [1]. It's quite a big
patch, which is why I want to encourage all
Domas Mituzas wrote:
However, that article is just rumour. I think it's more likely they made
some APC-like cache/optimizer than a compiler.
http://www.facebook.com/note.php?note_id=280583813919&id=9445547199&ref=nf
And the conclusion that
it will be open source is questionable. In such
Nimish Gautam wrote:
Hey all,
I really want to turn on Google Chrome Frame support for the MediaWiki
projects:
http://code.google.com/chrome/chromeframe/
Basically, it would just involve us putting in a meta tag on our pages
that would trigger an IE plugin Google wrote, assuming the
David Gerard wrote:
http://www.sdtimes.com/blog/post/2010/01/30/Facebook-rewrites-PHP-runtime.aspx
Facebook is written in PHP. Unlike Wikimedia, they have money to pay
people to speed it up. Looks like they're introducing a PHP compiler.
Open source. Might be of interest to WMF and
李琴 wrote:
I turned off all my extensions, but it still has that error
:(
vanessa
What's the wikitext of that Infobox_housi(ing?) template / the pages
containing it?
Try finding the wikitext that produces the error.
It's probably that search is truncating the wikitext, but still
李琴 wrote:
Hi all,
I search for an article in my local wiki, and sometimes it shows the error:
Inner error
Preprocessor_DOM::preprocessToObj generated invalid XML
Why does this happen? How can I solve this problem?
Thanks :-)
vanessa
This shouldn't happen. Do you use any search
李琴 wrote:
Hi all,
I have built a local wiki. Now I want its data to stay consistent
with Wikipedia, and one task I have to do is to get the update data from
Wikipedia.
I get the URLs by analyzing the RSS
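An alternative to analyzing the RSS feed for this kind of mirroring is to poll the MediaWiki API's recentchanges list. A sketch of building such a query URL (the module and parameter names are the standard API ones, but treat the exact set here as an assumption, not a complete recipe):

```python
from urllib.parse import urlencode

def recent_changes_url(api_base, limit=50, rcstart=None):
    """Build a MediaWiki API query listing recent changes.

    Polling this periodically, resuming from the last timestamp seen,
    is usually more robust than parsing the RSS feed.
    """
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|timestamp",
        "rclimit": limit,
        "format": "json",
    }
    if rcstart:
        params["rcstart"] = rcstart  # resume point from the previous poll
    return api_base + "?" + urlencode(params)

url = recent_changes_url("https://en.wikipedia.org/w/api.php", limit=10)
```

Each poll records the newest timestamp it saw and passes it as `rcstart` next time, so no change is fetched twice.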
Let me see if I have understood it.
You wrote an extension we don't know anything about.
You then changed the wiki settings in a specified way.
The extension now doesn't work for normal users.
and you want us to fix the extension for you just from that description?
Unforgettableid wrote:
A) Did I go too far when I did all the research I described above? Do you
yourself often use the Range Contributions tool[4] for looking at vandals'
ISPs'
contributions?
B) What do you think are the chances that the same person made both the
first[1] and the
Tisza Gero wrote:
Huji huji.huji at gmail.com writes:
Just because some page is watched doesn't mean somebody is really caring
about it. A better feature could be to list pages which are not being
actively watched; an active watcher could be defined as someone who has
made a login/edit in the
Aryeh Gregor wrote:
Why do we hide Special:UnwatchedPages from regular users? Unwatched
pages are something that people should know about so they can be sure
to watch them. If no one is actively watching a page, it's more
likely that vandalism will stick around. Yes, vandals and trolls
Brianna Laugher wrote:
All better now.
Can someone confirm for me, that basically the way to add yourself to
the CC list for a bug is just view the bug and hit commit? I think
there used to be more steps involved.
Brianna
Yes. Thanks to the magic of "Add me to CC list" being checked.
It
Philip wrote:
Certainly, but if wiki editors are *able* to do it by hand, then IMHO
microdata is much less error-prone.
Manu Sporny wrote:
I don't think that the best approach for Wikipedia is to allow direct
Microdata or RDFa markup. There are already many templates in use at
Wikipedia via