Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On 21 January 2013 07:56, Andre Klapper aklap...@wikimedia.org wrote:
> "at all" implies that some spambots are blocked at least?

Yes, but to count as successful it would have to block approximately all, I'd think. I mean, you could redefine something that doesn't block all spambots but does hamper a significant proportion of humans as "successful", but it would be a redefinition.

- d.

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
Not sure about enwiki, but from my experience with hosting smaller wikis, CAPTCHAs are pretty useless (reCAPTCHA, FancyCaptcha, some custom ones). The spambots keep on flooding through. I've found it's much more effective to just use the AbuseFilter.

-- Chris
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On 2013-01-21 3:56 AM, Andre Klapper aklap...@wikimedia.org wrote:
> On Mon, 2013-01-21 at 07:48 +, David Gerard wrote:
>> On 21 January 2013 05:13, Victor Vasiliev vasi...@gmail.com wrote:
>>> On 01/20/2013 04:22 PM, David Gerard wrote:
>>>> The MediaWiki captcha is literally worse than useless: it doesn't keep spambots out, and it does keep some humans out.
>>> I don't see how the spambot statement is true. Do you have evidence for it?
>> That spambots get through at all.
> Evidence is not provided by simply repeating the statement. :)

Does http://elie.im/publication/text-based-captcha-strengths-and-weaknessescount as evidence? (Copied and pasted from the mailing list archives)

Sure, captchas do prevent some limited attacks - they make it more effort than a 5-minute perl script. Most spammers are more sophisticated than that.

-bawolff
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On 21 January 2013 08:43, Bawolff Bawolff bawo...@gmail.com wrote:
> Does http://elie.im/publication/text-based-captcha-strengths-and-weaknessescount as evidence? (Copied and pasted from the mailing list archives)

404 :-)

Correct link: http://elie.im/publication/text-based-captcha-strengths-and-weaknesses

- d.
Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+
Hi Krinkle!

Thanks for the clarification!

> mediawiki.searchSuggest does NOT use jquery.ui.autocomplete.

Yes, I know. I never said anything like this. Maybe this was a misunderstanding. I said I wanted to replace mediawiki.searchSuggest with my own implementation _based on_ jquery.ui.autocomplete. As jquery.ui.autocomplete is delivered together with MW and can easily be made available on the client side thanks to ResourceLoader and its dependency management, it was a good choice for me.

> Instead you'd register your own module with its own name and make sure it is loaded instead of some other module. The method used can differ based on the feature at hand, but handling this at the module level is not the way to go.

Yes, I understand this. And I think this was my problem. I did not find a proper way to make sure it is loaded instead of some other module, because the 'mediawiki.searchSuggest' module is added by OutputPage and there seems to be no way (but changing the current user's settings, as Matt suggested) to avoid that.

> I think in the case of SearchSuggest it should probably be loaded by the Skin [...]

This would probably be a good idea as it may allow a more specific server side hook. :)

So, again, thanks for your explanations. They were very helpful and I think I now have a better understanding of the concepts of modules.

-- Robert
[Wikitech-l] Adding h2s to the TOC
Hey,

I have a custom action in which I construct HTML to display by rendering some user-provided wikitext and some programmatically built chunks of HTML. It looks like this:

$html = 'some html';
$html .= $this->getOutput()->parse( 'wiki text' );
$html .= 'more html';
$html .= 'even more html';

These last two segments of HTML start with an h2, which I would like to show up in the same TOC as rendered by the parsed wiki text. I tried some things but did not find a way to make this work, so I'm looking for some pointers.

Note: this HTML construction is happening inside a function that should return HTML and not have side effects (so its result can be cached). It is thus not possible to do anything in the direction of $this->getOutput()->addHTML().

Cheers

--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
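One possible direction, sketched under the assumption that the parser only registers TOC entries for headings it sees during a single parse call (so headings injected later as raw HTML are invisible to it): emit the extra headings as wikitext and parse everything in one go. The section names below are placeholders, not anything from the actual code:

```php
// Hypothetical sketch: concatenate the wikitext first, so the parser sees
// every heading in one pass and can register it in the TOC.
// 'More stuff' and 'Even more stuff' are invented section names.
$wikitext = 'wiki text';
$wikitext .= "\n== More stuff ==\n";
$wikitext .= "\n== Even more stuff ==\n";

$html = 'some html';
$html .= $this->getOutput()->parse( $wikitext );
// Raw HTML segments that need no TOC entry can still be appended afterwards:
$html .= 'more html';
```

Whether the resulting TOC lands where you want it depends on where the parser places it within the parsed fragment, so this is only a starting point.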
Re: [Wikitech-l] Problem to get an article's content in MW 1.20.2
Hi,

thanks for your suggestions, in the end I found it :-) It was a closed database connection that prevented the Title::newFrom...() methods from getting the articleId etc.

--- the evil part ---
$dbReadOnly2 = wfGetDB( DB_SLAVE );
# do some stuff
$dbReadOnly2->close();
# any later call that needs database access will fail
---

It is because wfGetDB(...) will return the database object as a *reference*, not as a copy (see the PHP code of wfGetDB() in includes/GlobalFunctions.php), and as it is a reference, when some other function closes the database connection there is silence ... at least for the db connection.

Andreas

2013/1/16 Andreas Plank andreas.pl...@web.de:
> 2013/1/16 Krenair kren...@gmail.com:
>> You could try RequestContext::getMain() to get the context source object.
>>
>> Alex Monk
>>
>> On 16/01/13 16:56, Andreas Plank wrote:
>>> I tried many functions in various classes: RequestContext, WikiPage, Title, Article ... but it is always the same: I do not get the raw page content. I believe these functions should work, but somehow they do not get the right incoming data. In an earlier post I printed the Title object and it is almost empty; I wonder about that too, might be related? Maybe it has nothing to do with these functions but with the database? However, browsing and editing those pages runs smoothly, but my application does not get the related page data. I wonder too, because with MW 1.18 it did work fine.
>>> If I switch on $wgDebugDumpSql there are a lot of queries and also a page query: SELECT page_id, page_namespace, page_title, page_restrictions, page_counter, page_is_redirect, page_is_new, page_random, page_touched, page_latest, page_len FROM `page` WHERE page_namespace = 'X' LIMIT N ... So it seems the right query from a quick look.
>>> Has anybody an idea how I can trace and debug the data flow until my app asserts "the page does not exist"?
>>> Andreas
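The gotcha generalizes beyond MediaWiki: PHP objects are passed by handle, so two variables obtained from the same factory can point at the same underlying object, and closing it through one variable closes it for everyone. A minimal self-contained illustration of that behavior; the Connection class is a stand-in written for this example, not MediaWiki code:

```php
<?php
// Stand-in connection object. wfGetDB() behaves analogously in that
// repeated calls hand back the same shared object, not fresh copies.
class Connection {
    public $open = true;
    public function close() { $this->open = false; }
}

function getDB() {
    static $conn = null;          // one shared instance, like a DB factory
    if ( $conn === null ) {
        $conn = new Connection();
    }
    return $conn;
}

$dbA = getDB();
$dbB = getDB();     // same object as $dbA, not a copy
$dbB->close();      // closing "our" handle...
var_export( $dbA->open );   // ...closes everyone's handle
```

So a `close()` on a handle you were given is visible to every other piece of code holding that handle, which matches the silent failures described above.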
Re: [Wikitech-l] Google Summer of Code 2013
On Mon, Jan 21, 2013 at 12:28 AM, Sébastien Santoro dereck...@espace-win.org wrote:
> I concur and offer to document that. Something like this text could be used for this purpose.
>
> == Tips ==
>
> === Push to Gerrit to show your code. In code review we trust. ===
>
> MediaWiki uses a continuous integration model. Code is first peer-reviewed: other developers provide feedback about your code, approve it or recommend improvements. Jenkins tests run too, to ensure your code doesn't break anything. When your change is ready, it's merged into the master branch of our code repository.
>
> Follow this workflow. Push your code to Gerrit when you want to show it. Add your mentor as a reviewer. Others will join the conversation on a regular basis. You'll learn a lot from the other reviewers' feedback. And the greatest bonus? Your code will be merged on a continuous basis. You will directly be able to see your code live and in production. This is what we call continuous integration.

A few GSoC admins (including me) wrote these some time ago:

* http://google-opensource.blogspot.de/2011/03/dos-and-donts-of-google-summer-of-code.html
* http://google-opensource.blogspot.de/2011/04/dos-and-donts-of-google-summer-of-code.html
* http://google-opensource.blogspot.de/2011/04/dos-and-donts-of-google-summer-of-code_21.html

Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V. Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
And to be more explicit, quoting the paper in case someone really has doubts: «we deem a captcha scheme broken when the attacker is able to reach a precision of at least 1%». With our FancyCaptcha we are/were at 25% precision for attackers, so yes, it's officially broken, and it has been for more than a year now: http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/56387 So surely spammers have got better in the meanwhile.

Nemo
[Wikitech-l] creating the context-aware sidebar
Hi guys!

I'm searching for a good sidebar extension for a MediaWiki project; can anyone recommend something nice? I've tried SidebarEx, NavBlocks, CustomSidebar.

Requirements:
1) full MediaWiki markup support;
2) show different content for different groups of users;
3) show different content for different categories. Or maybe using sidebar calls inside templates that will then be invoked on the pages, creating sidebars that are specific to the given page;
4) nice appearance;
5) ideally a live project that is supported by developers.

Thanks in advance!

Cheers,
-
Yury Katkov, WikiVote
[Wikitech-l] talk about MediaWiki groups in Berlin
Heya :)

On the 4th of February Quim will be at the Wikimedia Germany office to introduce MediaWiki groups. See https://www.mediawiki.org/wiki/Groups for more info about the groups.

We'll meet at 18:30 in the office at Obentrautstr. 72, Berlin. Quim will talk and answer questions for about 1 hour and then we'll move on to Brauhaus Lemke for some food and drinks.

If you're going to attend, please let me know soon so I can plan better. I'd also be delighted if you could forward this to other people who might be interested.

I hope to see many of you there.

Cheers
Lydia

--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Re: [Wikitech-l] creating the context-aware sidebar
On Mon, Jan 21, 2013 at 6:07 AM, Yury Katkov katkov.ju...@gmail.com wrote:
> Hi guys!
> I'm searching the good sidebar extension for MediaWiki project, can anyone recommend something nice? I've tried SidebarEx, NavBlocks, CustomSidebar

https://www.mediawiki.org/wiki/Extension:DynamicSidebar

Maintained by myself for use on labsconsole.

> Requirements:
> 1) Full MediaWiki markup support;

It supports the same level of markup as the normal sidebar.

> 2) show different content for different groups of users;

It handles this.

> 3) show different content for different categories. Or maybe using the sidebar calls inside the Templates that will then be invoked on the pages, creating sidebars that are speciafic for the given page;

It does not support sidebars specific to a given page. It would be necessary to disable sidebar caching to do so. Feel free to add this, but make it optional via a config option if you do so.

> 4) nice appearance.

It looks and acts like the normal sidebar.

> 5) ideally live project that is supported by developers

I maintain it and am happy to take changes in Gerrit.

- Ryan
[Wikitech-l] New 1.20.2 install didn't prompt for license or private mode
Is that stuff supposed to go after one chooses "no, ask me more questions"? 'Cause I was not asked *any* more questions.

On a related story: if I give the installer a table prefix for which the tables already exist, what will it do? I have 3 wikis in the same DB, and I therefore cannot simply drop the DB... and you apparently can't use wildcards in DROP TABLE.

Cheers,
-- jr 'Bobby; where are ya'? a

--
Jay R. Ashworth                  Baylink                       j...@baylink.com
Designer                     The Things I Think                       RFC 2100
Ashworth Associates       http://baylink.pitas.com         2000 Land Rover DII
St Petersburg FL USA               #natog                      +1 727 647 1274
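On the wildcard-DROP point: MySQL indeed has no wildcard form of DROP TABLE, but you can generate one statement per table from a listing (e.g. the output of SHOW TABLES). A self-contained sketch of just the string handling; the table names and the `wiki1_` prefix are invented for the example, and in practice $allTables would come from the database:

```php
<?php
// Build DROP TABLE statements for one wiki's prefix without touching
// the other wikis sharing the same database. Names here are examples.
$prefix = 'wiki1_';
$allTables = array( 'wiki1_page', 'wiki1_revision', 'wiki2_page', 'wiki3_user' );

$drops = array();
foreach ( $allTables as $t ) {
    // strpos() === 0 means the table name starts with the prefix
    if ( strpos( $t, $prefix ) === 0 ) {
        $drops[] = "DROP TABLE `$t`;";
    }
}
echo implode( "\n", $drops ), "\n";
```

Review the generated statements before feeding them back to the server, since a careless prefix match (say, `wiki1` also matching `wiki10_`) would hit the wrong wiki.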
[Wikitech-l] On code review
There's a useful blog post on code review at Mozilla by Mozilla developer David Humphrey on his blog: http://vocamus.net/dave/?p=1569. I like his breakdown of different types of code reviews. It seems like at Mozilla there is a lot of room for the patch submitter to indicate to reviewers what sort of review is needed for a particular patch, ranging from requests for manual testing and careful scrutiny all the way to what Humphrey calls "catechism reviews", in which the committer uses a review request to announce her intent and solicit a basic sanity-check.

Officially, such reviews do not exist at the WMF, because we are all infallibly meticulous and diligent about testing every branch of every code change. But unofficially they do, of course. It'd be nice if such reviews were formally sanctioned (with whatever qualifications). I'm interested to hear other people's thoughts.

-- Ori Livneh
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On Mon, Jan 21, 2013 at 3:00 AM, David Gerard dger...@gmail.com wrote:
> I mean, you could redefine something that doesn't block all spambots but does hamper a significant proportion of humans as "successful", but it would be a redefinition.

It's not a definition, it's a judgment. And whether or not it's a correct judgment depends on how many spambots are blocked, and how many productive individuals are hampered, among other things. After all, reverting spam hampers people too.
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
Given that there are algorithms that can solve our captcha, presumably it is mostly preventing the lazy and those that don't have enough knowledge to use those algorithms. I would guess that text on an image without any blurring or manipulation would be just as hard for those sorts of people to break. (Obviously that's a rather large guess.) As a compromise, maybe we should have straight text-in-image captchas.

-bawolff

On 2013-01-21 7:40 PM, Anthony wikim...@inbox.org wrote:
> On Mon, Jan 21, 2013 at 3:00 AM, David Gerard dger...@gmail.com wrote:
>> I mean, you could redefine something that doesn't block all spambots but does hamper a significant proportion of humans as "successful", but it would be a redefinition.
> It's not a definition, it's a judgment. And whether or not it's a correct judgment depends on how many spambots are blocked, and how many productive individuals are hampered, among other things. After all, reverting spam hampers people too.
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On 01/21/2013 03:00 AM, David Gerard wrote:
> On 21 January 2013 07:56, Andre Klapper aklap...@wikimedia.org wrote:
>> "at all" implies that some spambots are blocked at least?
> Yes, but to count as successful it would have to block approximately all, I'd think.

That's dubious. Blocking all spambots is not the goal of any CAPTCHA. The goal is to significantly decrease spam, and thus to save human spam-fighters time.

Matt Flaschen
[Wikitech-l] Fwd: [Wikimedia-l] Wikimania 2013 scholarship now accepting application
This call is equally valid for MediaWiki / Wikimedia tech contributors:

-------- Original Message --------
Subject: [Wikimedia-l] Wikimania 2013 scholarship now accepting application
Date: Tue, 22 Jan 2013 13:03:47 +0800
From: Simon Shek simon.s...@wikimedia.hk
Reply-To: Wikimedia Mailing List wikimedi...@lists.wikimedia.org
To: wikimedi...@lists.wikimedia.org

Hi all,

Scholarship applications for Wikimania 2013 in Hong Kong are now being accepted. The application window is one month (through 22 February).

A Wikimania 2013 scholarship is an award given to an individual to enable them to attend Wikimania in Hong Kong from 7-11 August, 2013. Both types of scholarships will be available this year. Partial scholarships will cover travel expenses to Wikimania, capped at 50% of the estimated air fare from your nearest international airport according to [[wm2013:Getting to Hong Kong]]. Full scholarships will cover round-trip travel, dorm accommodations as arranged by the Wikimania Team, and registration for Wikimania 2013.

Applicants will be rated on the following four dimensions:
1. Activity within Wikimedia (on-wiki and off-wiki) - 50%
2. Activity outside of Wikimedia and other free knowledge/software projects - 15%
3. Interest in Wikimania and the Wikimedia movement - 25%
4. Fluency of English language - 10%

To learn more about Wikimania 2013 scholarships, please visit https://wikimania2013.wikimedia.org/wiki/Scholarships

To apply for a scholarship, you can fill out the application form here: https://scholarship.wikimedia.hk

If you have any question, email us at wikimania-shcolars...@wikimedia.org .

Good luck!

Simon Shek
Community coordinator - Wikimania 2013 / Wikimedia Hong Kong
wikimedia.hk

___
Wikimedia-l mailing list
wikimedi...@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+
On 21.01.2013 09:48, Robert Vogel wrote:
> Hi Krinkle!
> Thanks for the clarification!
>> mediawiki.searchSuggest does NOT use jquery.ui.autocomplete.
> Yes, I know. I never said anything like this. Maybe this was a misunderstanding. I said I wanted to replace mediawiki.searchSuggest with my own implementation _based on_ jquery.ui.autocomplete. As jquery.ui.autocomplete is delivered together with MW and can easily be made available on the client side thanks to ResourceLoader and its dependency management it was a good choice for me.
>> Instead you'd register your own module with its own name and make sure it is loaded instead of some other module. The method used can differ based on the feature at hand, but handling this at the module level is not the way to go.
> Yes I understand this. And I think this was my problem. I did not find a proper way to make sure it is loaded instead of some other module, because the 'mediawiki.searchSuggest' module is added by OutputPage and there seems to be no way (but changing the current user's settings as Matt suggested) to avoid that.
>> I think in the case of SearchSuggest it should probably be loaded by the Skin [...]
> This would probably be a good idea as it may allow a more specific server side hook. :)
> So, again, thanks for your explanations. They were very helpful and I think I now have a better understanding of the concepts of modules.
> -- Robert

I ran into big problems when using jquery.ui.autocomplete. Yes, I _do_ know how to use it and have years of bad experience with it. _It is difficult to customise and creates more overhead than it claims to save_, especially when you need to modify the suggestion output according to your needs, and my advice for MediaWiki is _not_ to go this way. My advice: stick with the current version. Major problems you may run into are [1-3]:

- customising suggestion output (when you want to use HTML, when you want to highlight something in the suggestions): very difficult, lots of overhead
- when leaving the tab and returning, the suggestion list disappears. Retriggering onfocus is almost impossible without negative side-effects.
Tom

[1] https://forum.jquery.com/topic/jquery-ui-autocomplete-bug-autocomplete-suggestions-input-line-is-not-empty-disappear-when-returning-from-another-browser-tab
[2] https://forum.jquery.com/topic/autocomplete-ie7-backspace-issue#1473702558270
[3] bugs.jqueryui.com/ticket/8832
Re: [Wikitech-l] Html comments into raw wiki code: can they be wrapped into parsed html?
I tried to build a template which wraps template parameters into data- attributes. First results were encouraging; then I found something logical but unexpected, crushing the whole idea.

I wrote into the code of an infobox-like template something like this:

<span data-author="{{{author}}}" data-birthdate="{{{birthDate}}}"></span>

and I was very happy to see that the HTML output had my data wrapped into such span tags. But I was testing my code with "clean" templates, i.e. templates which have no wikicode in parameter values (as usually occurs on it.wikisource). As soon as I tested my idea on another project (Commons), I found that any wikicode (template call, parameter, link) present in the value of an infobox parameter breaks the stuff, since it is parsed and expanded by the parser with unpredictable results.

So... I ask you again: is there any sound reason (i.e. safety-related, or server-load-related) why HTML comments wrapped in the raw page wikicode should not be sent back into the HTML rendering as-they-are?

Alex brollo
Re: [Wikitech-l] Why are we still using captchas on WMF sites?
On 22 January 2013 04:28, Matthew Flaschen mflasc...@wikimedia.org wrote:
> On 01/21/2013 03:00 AM, David Gerard wrote:
>> Yes, but to count as successful it would have to block approximately all, I'd think.
> That's dubious. Blocking all spambots is not the goal of any CAPTCHA. The goal is to significantly decrease spam, and thus to save human spam-fighters time.

Per the previous comments in this thread, anything over 1% precision for the attacker should be regarded as failure, and our FancyCaptcha was at 25% a year ago. So yeah, approximately all; and our captcha is well known to actually suck.

- s.
Re: [Wikitech-l] Disable module mediawiki.searchSuggest in MW 1.20+
Hi Thomas,

thanks for sharing your experiences. The issues you mention can truly be problematic. I will keep them in mind. But for my use case jquery.ui.autocomplete does work quite well.

> [...] and my advice for MediaWiki is, _not_ to go this way.

I don't think there were considerations about changing MediaWiki's built-in suggestion feature. That's not what this discussion is about. It's just about a little (very specific) extension I'm working on. :)

Robert