Re: [Wikitech-l] [Xmldatadumps-l] HTML wikipedia dumps: Could you please provide them, or make public the code for interpreting templates?

2012-10-03 Thread Roberto Flores
Could we have an HTML dump for X amount of money?
Something like a paid feature.

Include the CSS, of course.
Also, leave the  tags as they are, as those have to be processed by
third-party libraries.

2012/9/17 Pablo N. Mendes 

>
> I also think the HTML dumps would be super useful!
>
> Cheers
> Pablo
> On Sep 17, 2012 8:05 PM, "James L"  wrote:
>
>>   I’m all for continuing the HTML wiki dumps that were once done (*2007
>> was the last*?). Why were these discontinued? They would be more useful
>> than the so-called “XML”.
>>
>> There is no complete solution for processing dumps; the XML is most
>> certainly not XML in its lowest form, and it IS DEFINITELY a moving target!
>>
>> Regards,
>>
>>  *From:* Roberto Flores 
>> *Sent:* Sunday, September 09, 2012 8:07 PM
>> *To:* Wikimedia developers 
>> *Cc:* Wikipedia Xmldatadumps-l 
>> *Subject:* Re: [Xmldatadumps-l] [Wikitech-l] HTML wikipedia dumps: Could
>> you please provide them, or make public the code for interpreting templates?
>>
>> Allow me to reply to each point:
>>
>> (By the way, my offline app is called WikiGear Offline:)
>> http://itunes.apple.com/us/app/wikigear-offline/id453614487?mt=8
>>
>> > Templates are dumped just like all other pages are...
>>
>> Yes, but that's only a text description of what the template does.
>> Code must be written to actually process them into HTML.
>> There are tens of thousands of them, and some I can't even program myself
>> (e.g., Wiktionary's conjugation templates).
>> If they were already pre-processed into HTML inside the articles'
>> contents, that would solve all of my problems.
>>
>> > What purpose would the dump serve? You don't want to keep the full dump
>> > on the device.
>>
>> I made an indexing program that selects only content articles (namespaces
>> included) and compresses it all to a reasonable size (e.g. about 7 GB for
>> the English Wikipedia).
>>
>> > How would this template API function? What does import mean?
>>
>> By this I mean a set of functions, written in some programming language,
>> to which I could send the template wiki markup and receive HTML to
>> display.
>>
>> Wikipedia does this whenever a page is requested, but I don't know the
>> exact mechanism through which it's performed.
>> Maybe you just need to make that code publicly available, and I'll try to
>> make it work with my application somehow.
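For what it's worth, the "template API" asked about here already exists in the form of MediaWiki's action API: the `parse` module accepts raw wikitext and returns template-expanded HTML. A minimal browser-side sketch (the sample wikitext and the target element id are invented for illustration):

```javascript
// Build an api.php query string for the "parse" module, which expands
// templates server-side and returns the rendered HTML for raw wikitext.
function buildParseRequest( apiBase, wikitext ) {
	var params = {
			action: 'parse',
			format: 'json',
			text: wikitext
		},
		pairs = [],
		key;
	for ( key in params ) {
		pairs.push( encodeURIComponent( key ) + '=' + encodeURIComponent( params[ key ] ) );
	}
	return apiBase + '?' + pairs.join( '&' );
}

// In a browser with jQuery, an offline app could then fetch the expanded
// HTML like this (the '#article' element id is invented):
//
//   $.getJSON(
//       buildParseRequest( 'https://en.wikipedia.org/w/api.php', '{{reflist}}' ),
//       function ( data ) {
//           $( '#article' ).html( data.parse.text['*'] );
//       }
//   );
```

This only covers rendering one snippet at a time over the network, so it does not replace a pre-rendered HTML dump for offline use, but it shows the expansion mechanism the servers apply on every page view.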
>>
>>
>> 2012/9/9 Jeremy Baron 
>>
>>> On Sun, Sep 9, 2012 at 6:34 PM, Roberto Flores  wrote:
>>> > I have developed an offline Wikipedia, Wikibooks, Wiktionary, etc. app for
>>> > the iPhone, which does a somewhat decent job at interpreting the wiki
>>> > markup into HTML.
>>> > However, there are too many templates for me to program (not to mention,
>>> > it's a moving target).
>>> > Without converting these templates, many articles are simply unreadable and
>>> > useless.
>>>
>>> Templates are dumped just like all other pages are. Have you found
>>> them in the dumps? which dump are you looking at right now?
>>>
>>> > Could you please provide HTML dumps (I mean, with the templates
>>> > pre-processed into HTML, everything else the same as now) every 3 or 4
>>> > months?
>>>
>>> 3 or 4 month frequency seems unlikely to be useful to many people.
>>> Otherwise no comment.
>>>
>>> > Or alternatively, could you make the template API available so I could
>>> > import it in my program?
>>>
>>> How would this template API function? What does import mean?
>>>
>>> -Jeremy
>>>
>>>
>>
>>  --
>> ___
>> Xmldatadumps-l mailing list
>> xmldatadump...@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l
>>
>>
>>
>>
___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] Call to eliminate sajax

2012-10-03 Thread MZMcBride
Daniel Friesen wrote:
> sajax is an ancient ajax library and part of our legacy code. And it
> still gets to sneak a license note into our README.
> 
> It's probably about time that we start making sure that code is ready for
> the day it disappears. Just as code shouldn't be relying on bits and
> pieces of wikibits.
> 
> Currently the only parts of core that depend on sajax are the legacy
> mwsuggest and upload.js.
> mwsuggest is going to disappear when Krinkle's work making simplesearch's
> suggestions work universally is finished.
> I'm not sure what's going on with upload.js.
> 
> The real problem however is extensions. For some reason it appears that we
> STILL have extensions depending on sajax. And I'm not talking about
> ancient extensions on the wiki or in svn. I only did an ack through stuff
> that's currently in git.

Can you please file a tracking bug for removing sajax?

MZMcBride





Re: [Wikitech-l] Call to eliminate sajax

2012-10-03 Thread S Page
Informative e-mail threads are not documentation.
I added https://www.mediawiki.org/wiki/Manual:Ajax#Deprecated_functionality

There seems to be no mention of AjaxDispatcher on mediawiki.org,
which I guess is good? If it's obsolete, someone needs to add a
comment to includes/AjaxDispatcher.php.
-- 
=S Page  software engineer on E3



Re: [Wikitech-l] Github replication

2012-10-03 Thread Brion Vibber
On Wed, Oct 3, 2012 at 10:29 AM, Chad  wrote:

> Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye
> on Github and make sure patches get into Gerrit, let me know and I'll add
> you to the group on Github.
>

Crap, I think I just volunteered. ;)

-- brion


Re: [Wikitech-l] Welcome Željko Filipin, QA Engineer

2012-10-03 Thread Rob Moen
I'm very glad you are joining us, Željko. Welcome!

On Oct 2, 2012, at 8:47 PM, Krinkle wrote:

> On Oct 2, 2012, at 4:25 PM, Chris McMahon  wrote:
> 
>> I am pleased to announce that Željko Filipin joins WMF this week as QA
>> Engineer.
> 
> Welcome Željko!
> 
> For the last 1.5 years, hashar and I have set up the current integration
> environment. I'm also in CET ("Krinkle" on freenode).
> 
> Hashar did most of the backend with PHPUnit and Jenkins; I'm occupied with
> browsers and unit testing therein (QUnit/TestSwarm/BrowserStack/...).
> 
> Looking forward to working with you!
> 
> -- 
> Timo "Krinkle" Tijhof
> 
> 




Re: [Wikitech-l] Github replication

2012-10-03 Thread Chad
On Wed, Oct 3, 2012 at 1:13 PM, Brion Vibber  wrote:
> On Wed, Oct 3, 2012 at 10:10 AM, Chad  wrote:
>
>> On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso  wrote:
>> > Can we please disable "Pull requests" until we agree on a workflow to
>> > review those or have them automatically sent to Gerrit?
>> >
>>
>> There is no way to do that that I've found.
>>
>
> My recommendation would be to leave pull requests active and, when we see
> things come in, manually import them to gerrit and close out the pull
> requests.
>
> Perfect? No, but probably a better way than refusing to take them until we
> figure out a magic automatic gateway. :)
>

Yeah, that sounds sane. Anyone who wants to volunteer to keep an eye
on Github and make sure patches get into Gerrit, let me know and I'll add
you to the group on Github.

-Chad



Re: [Wikitech-l] Call to eliminate sajax

2012-10-03 Thread Krinkle
On Oct 3, 2012, at 7:18 PM, "Daniel Friesen"  wrote:

> The real problem however is extensions. For some reason it appears that we 
> STILL have extensions depending on sajax. And I'm not talking about ancient 
> extensions on the wiki or in svn. I only did an ack through stuff that's 
> currently in git.
> 
> So I welcome anyone who is interested in going through extension code and 
> eliminating the use of sajax in favor of jQuery.ajax and RL.
> 
> 

Also note that in various cases these are not just frontend legacy problems,
but backend ones as well: namely, AjaxDispatcher.

Invoked through index.php?action=ajax&rs=efFooBar&rsargs[]=param&rsargs[]=param.

Though blindly replacing sajax would allow us to remove it from core, it would
be very much worth it to give these extensions a good look and update them in
general (to make them use API modules and ResourceLoader modules, and to follow
current conventions for front-end code with mw and jQuery).
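To make that concrete, here is what one legacy call and its jQuery-era counterpart might look like side by side; `buildDispatcherUrl` is a helper written only for this sketch, and the API module name in the last comment is invented:

```javascript
// Legacy style: sajax_do_call( 'OnlineStatus::Ajax', [ 'get' ], cb ) ends up
// requesting index.php?action=ajax&rs=OnlineStatus%3A%3AAjax&rsargs[]=get.
// This helper reproduces that AjaxDispatcher URL shape.
function buildDispatcherUrl( scriptPath, func, args ) {
	var url = scriptPath + '?action=ajax&rs=' + encodeURIComponent( func ),
		i;
	for ( i = 0; i < args.length; i++ ) {
		url += '&rsargs[]=' + encodeURIComponent( args[ i ] );
	}
	return url;
}

// A drop-in jQuery replacement for the sajax call itself:
//
//   $.get( buildDispatcherUrl( mw.config.get( 'wgScript' ),
//       'OnlineStatus::Ajax', [ 'get' ] ), callback );
//
// though, as noted above, the better fix is a proper API module, e.g.
// (module name invented):
//
//   $.getJSON( mw.util.wikiScript( 'api' ),
//       { action: 'onlinestatus', format: 'json' }, callback );
```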

-- Krinkle




Re: [Wikitech-l] Call to eliminate sajax

2012-10-03 Thread Bryan Tong Minh
On Wed, Oct 3, 2012 at 7:18 PM, Daniel Friesen  wrote:
> core/skins/common/upload.js
> 99: if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
> 124:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
> 133:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
> 141:sajax_do_call( 'SpecialUpload::ajaxGetExistsWarning', [this.nameToCheck],
> 287:var req = sajax_init_object();
>


https://bugzilla.wikimedia.org/show_bug.cgi?id=31946



[Wikitech-l] Call to eliminate sajax

2012-10-03 Thread Daniel Friesen
sajax is an ancient ajax library and part of our legacy code. And it
still gets to sneak a license note into our README.


It's probably about time that we start making sure that code is ready for  
the day it disappears. Just as code shouldn't be relying on bits and  
pieces of wikibits.


Currently the only parts of core that depend on sajax are the legacy  
mwsuggest and upload.js.
mwsuggest is going to disappear when Krinkle's work making simplesearch's  
suggestions work universally is finished.

I'm not sure what's going on with upload.js.

The real problem however is extensions. For some reason it appears that we  
STILL have extensions depending on sajax. And I'm not talking about  
ancient extensions on the wiki or in svn. I only did an ack through stuff  
that's currently in git.


So I welcome anyone who is interested in going through extension code and  
eliminating the use of sajax in favor of jQuery.ajax and RL.



core/skins/common/ajax.js
3:window.sajax_debug_mode = false;
4:window.sajax_request_type = 'GET';
7: * if sajax_debug_mode is true, this function outputs given the message into
8: * the element with id = sajax_debug; if no such element exists in the document,

11:window.sajax_debug = function(text) {
12: if (!sajax_debug_mode) return false;
14: var e = document.getElementById( 'sajax_debug' );
18: e.className = 'sajax_debug';
19: e.id = 'sajax_debug';
41:window.sajax_init_object = function() {
42: sajax_debug( 'sajax_init_object() called..' );
61: sajax_debug( 'Could not create connection object.' );
77: *sajax_do_call( 'doFoo', [1, 2, 3], document.getElementById( 'showFoo' ) );

83:window.sajax_do_call = function(func_name, args, target) {
88: if ( sajax_request_type == 'GET' ) {
105:x = sajax_init_object();
112:x.open( sajax_request_type, uri, true );
119:if ( sajax_request_type == 'POST' ) {
130:		sajax_debug( 'received (' + x.status + ' ' + x.statusText + ') ' + x.responseText );
153:			alert( 'bad target for sajax_do_call: not a function or object: ' + target );

157:sajax_debug( func_name + ' uri = ' + uri + ' / post = ' + post_data );
159:sajax_debug( func_name + ' waiting..' );
169:var request = sajax_init_object();

core/skins/common/mwsuggest.js
489:var xmlhttp = sajax_init_object();

core/skins/common/upload.js
99: if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
124:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
133:if ( !ajaxUploadDestCheck || !sajax_init_object() ) return;
141:		sajax_do_call( 'SpecialUpload::ajaxGetExistsWarning', [this.nameToCheck],

287:var req = sajax_init_object();

extensions/CommunityVoice/Resources/CommunityVoice.js
100:var oldRequestType = sajax_request_type;
102:sajax_request_type = "POST";
104:sajax_do_call(
114:sajax_request_type = oldRequestType;

extensions/DonationInterface/modules/validate_input.js
14: sajax_do_call( 'efPayflowGatewayCheckSession', [], checkSession );

extensions/Drafts/Drafts.js
76: var oldRequestType = sajax_request_type;
78: sajax_request_type = 'POST';
80: sajax_do_call(
98: sajax_request_type = oldRequestType;

extensions/OnlineStatus/OnlineStatus.js
35: sajax_do_call( 'OnlineStatus::Ajax', ['get'], function( x ){
71: sajax_do_call( 'OnlineStatus::Ajax', ['set', status], function( x ){

extensions/ReaderFeedback/readerfeedback.js
36:  /*extern sajax_init_object, sajax_do_call */
92:	sajax_do_call( "ReaderFeedbackPage::AjaxReview", args, wgAjaxFeedback.processResult );


extensions/SecurePoll/resources/SecurePoll.js
95:	sajax_do_call( 'wfSecurePollStrike', [ action, id, reason ], processResult );


extensions/SemanticForms/includes/SF_FormUtils.php
450:function FCK_sajax(func_name, args, target) {
451:sajax_request_type = 'POST' ;
452:sajax_do_call(func_name, args, function (x) {
716:sajax_request_type = 'POST' ;
718:		sajax_do_call('wfSajaxWikiToHTML', [SRCtextarea.value], function ( result ){

736:if (!oFCKeditor.ready) return false;//sajax_do_call in action
754:sajax_request_type = 'GET' ;
755:			sajax_do_call( 'wfSajaxToggleFCKeditor', ['hide'], function(){} ) ;		//remember closing in session


extensions/SemanticForms/libs/SF_ajax_form_preview.js
43: var aj = sajax_init_object();
44: var aj2 = sajax_init_object();
70:	// if (!oFCKeditor.ready) return false;//sajax_do_call in action - what do we do?


extensions/SemanticForms/libs/SF_autoedit.js
35: sajax_request_type = 'POST';
37:		sajax_do_call( 'SFAutoeditAPI::handleAutoEdit', data, function( ajaxHeader ){


extensions/SemanticForms/libs/SF_submit.js
55: sajax_request_type = 'POST';
58:			sajax_do_call( 'SFAutoeditAPI::handleAutoEdit', new Array(c

Re: [Wikitech-l] Github replication

2012-10-03 Thread Brion Vibber
On Wed, Oct 3, 2012 at 10:10 AM, Chad  wrote:

> On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso  wrote:
> > Can we please disable "Pull requests" until we agree on a workflow to
> > review those or have them automatically sent to Gerrit?
> >
>
> There is no way to do that that I've found.
>

My recommendation would be to leave pull requests active and, when we see
things come in, manually import them to gerrit and close out the pull
requests.

Perfect? No, but probably a better way than refusing to take them until we
figure out a magic automatic gateway. :)

-- brion


Re: [Wikitech-l] Github replication

2012-10-03 Thread Chad
On Wed, Oct 3, 2012 at 1:00 PM, Antoine Musso  wrote:
> Le 03/10/12 18:27, Chad a écrit :
>> Just letting everyone know: mediawiki/core is now replicating from
>> gerrit to github.
>>
>> https://github.com/mediawiki/core
>>
>> Next step: extensions!
>
> Well done!
>
>
> Can we please disable "Pull requests" until we agree on a workflow to
> review those or have them automatically sent to Gerrit?
>

There is no way to do that that I've found.

-Chad


Re: [Wikitech-l] Github replication

2012-10-03 Thread Antoine Musso
Le 03/10/12 18:27, Chad a écrit :
> Just letting everyone know: mediawiki/core is now replicating from
> gerrit to github.
> 
> https://github.com/mediawiki/core
> 
> Next step: extensions!

Well done!


Can we please disable "Pull requests" until we agree on a workflow to
review those or have them automatically sent to Gerrit?

Thanks!

-- 
Antoine "hashar" Musso




[Wikitech-l] Learning Git/Gerrit? - 3 Oct 2012 17:30 UTC

2012-10-03 Thread Marcin Cieslak
Hello, 

Our scheduled Git+Gerrit session starts in about 40 minutes.

Everything will happen via SIP audioconference and SSH connection.

Please make sure your SIP and SSH clients work!

More information on the setup:

 https://www.mediawiki.org/wiki/Git/Workshop

I am already available on SIP as well as on IRC
(#git-gerrit on Freenode) if you would like
to test your setup.

See you soon!

Marcin Cieślak
(saper)



Re: [Wikitech-l] Github replication

2012-10-03 Thread Chad
On Wed, Oct 3, 2012 at 12:36 PM, Yuvi Panda  wrote:
> On Wed, Oct 3, 2012 at 9:57 PM, Chad  wrote:
>> Just letting everyone know: mediawiki/core is now replicating from
>> gerrit to github.
>
>
> Sweeet!
>
> Any plans for pull-request integration?
>

Yes!

https://bugzilla.wikimedia.org/35497

A bit harder than pushing out, but definitely on the roadmap.

-Chad



Re: [Wikitech-l] Github replication

2012-10-03 Thread Chad
On Wed, Oct 3, 2012 at 12:36 PM, Mark Holmquist  wrote:
> On 12-10-03 09:27 AM, Chad wrote:
>>
>> Hi everyone,
>>
>> Just letting everyone know: mediawiki/core is now replicating from
>> gerrit to github.
>>
>> https://github.com/mediawiki/core
>>
>> Next step: extensions!
>
>
> Hi Chad,
>
> Will all extensions be replicated?

Yes.

> Are we also looking to replicate to,
> e.g., Gitorious?

No plans yet, but a lot of the heavy lifting re: replication has
been done, so this wouldn't be impossible.

> I'm sure there are docs for this decision, but I haven't
> seen them--do you have them handy?
>

Just bugzilla requests for it [0], [1]. I can't remember when
the original decision was made, but this has been a goal for
some time.

-Chad

[0] https://bugzilla.wikimedia.org/35429
[1] https://bugzilla.wikimedia.org/35497
[2] https://bugzilla.wikimedia.org/38196



Re: [Wikitech-l] Github replication

2012-10-03 Thread Yuvi Panda
On Wed, Oct 3, 2012 at 9:57 PM, Chad  wrote:
> Just letting everyone know: mediawiki/core is now replicating from
> gerrit to github.


Sweeet!

Any plans for pull-request integration?

-- 
Yuvi Panda T
http://yuvi.in/blog



Re: [Wikitech-l] Github replication

2012-10-03 Thread Mark Holmquist

On 12-10-03 09:27 AM, Chad wrote:

Hi everyone,

Just letting everyone know: mediawiki/core is now replicating from
gerrit to github.

https://github.com/mediawiki/core

Next step: extensions!


Hi Chad,

Will all extensions be replicated? Are we also looking to replicate to, 
e.g., Gitorious? I'm sure there are docs for this decision, but I 
haven't seen them--do you have them handy?


Thanks,

--
Mark Holmquist
Software Engineer, Wikimedia Foundation
mtrac...@member.fsf.org
http://marktraceur.info



Re: [Wikitech-l] Github replication

2012-10-03 Thread Mike Dupont
On Wed, Oct 3, 2012 at 6:27 PM, Chad  wrote:
> Hi everyone,
>
> Just letting everyone know: mediawiki/core is now replicating from
> gerrit to github.
>
> https://github.com/mediawiki/core

that is great news.
mike



Re: [Wikitech-l] Github replication

2012-10-03 Thread Siebrand Mazeland (WMF)
On Oct 3, 2012, at 12:27 PM, Chad  wrote:
> Just letting everyone know: mediawiki/core is now replicating from
> gerrit to github.
>
> https://github.com/mediawiki/core
>
> Next step: extensions!

Yay. Finally we're allowing the world to fix our code :).

Can has Github->Gerrit merge?!



Re: [Wikitech-l] Github replication

2012-10-03 Thread Andrew Otto
Awesome!  I have a repo I'd love to try this with right now.  I'll find you on 
IRC…


On Oct 3, 2012, at 12:27 PM, Chad  wrote:

> Hi everyone,
> 
> Just letting everyone know: mediawiki/core is now replicating from
> gerrit to github.
> 
> https://github.com/mediawiki/core
> 
> Next step: extensions!
> 
> -Chad
> 




[Wikitech-l] Github replication

2012-10-03 Thread Chad
Hi everyone,

Just letting everyone know: mediawiki/core is now replicating from
gerrit to github.

https://github.com/mediawiki/core

Next step: extensions!

-Chad



Re: [Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3

2012-10-03 Thread Jeremy Baron
On Wed, Oct 3, 2012 at 4:08 PM, Artur Fijałkowski  wrote:
> 2012/10/3 Jeff Green :
>> As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please let
>> me know ASAP if you discover any issues with mail sent from a @wikimedia.org
>> address.
>
> Is allowing ALL IPs in all WMF ranges really needed by anyone?
>
> It would be much better if there were only a finite number of
> designated SMTP servers, and all other machines sent mail via
> those servers, not directly onto the public internet.

It's closer to the status quo (and I've not heard people complain
about spam from our blocks but maybe I just don't know) and therefore
less work to make it happen. Being perfect can be deferred to a later
date.

-Jeremy


Re: [Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3

2012-10-03 Thread Artur Fijałkowski
2012/10/3 Jeff Green :
> As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please let
> me know ASAP if you discover any issues with mail sent from a @wikimedia.org
> address.

Is allowing ALL IPs in all WMF ranges really needed by anyone?

It would be much better if there were only a finite number of
designated SMTP servers, and all other machines sent mail via
those servers, not directly onto the public internet.

AJF/WarX



[Wikitech-l] Wikimedia engineering September 2012 report

2012-10-03 Thread Guillaume Paumier
Hi,

The report covering Wikimedia engineering activities in September 2012
is now available.

Wiki version: 
https://www.mediawiki.org/wiki/Wikimedia_engineering_report/2012/September
Blog version: 
https://blog.wikimedia.org/2012/10/03/engineering-september-2012-report/

--
Guillaume Paumier



[Wikitech-l] DEPLOYED TODAY: SPF (email spoof prevention feature) test-rollout Weds 10/3

2012-10-03 Thread Jeff Green
As of ~11:15AM EDT SPF is deployed for the domain wikimedia.org. Please 
let me know ASAP if you discover any issues with mail sent from a 
@wikimedia.org address.


Thanks!
jg

Jeff Green
Operations Engineer, Special Projects
Wikimedia Foundation
149 New Montgomery Street, 3rd Floor
San Francisco, CA 94105
 415-839-6885 x6807
 jgr...@wikimedia.org

P.S. Ops folks, rollback is simply a matter of reverting the wikimedia.org 
zone file and running authdns-update. I set the TTL to 10 min just in 
case.


-- Forwarded message --
Date: Fri, 28 Sep 2012 11:00:08 -0700 (PDT)
From: Jeff Green 
Reply-To: Wikimedia developers 
To: wmf...@lists.wikimedia.org, wikimedi...@lists.wikimedia.org,
wikitech-l@lists.wikimedia.org
Subject: [Wikitech-l] SPF (email spoof prevention feature) test-rollout Weds
10/5

I'm planning to deploy Sender Policy Framework (SPF) for the wikimedia.org 
domain on Weds October 5. SPF is a framework for validating outgoing mail, 
which gives the receiving side useful information for spam filtering. The main 
goal is to cause spoofed @wikimedia.org mail to be correctly identified as 
such. It should also improve our odds of getting fundraiser mailings into 
inboxes rather than spam folders.


The change should not be noticeable, but the most likely problem would be 
legitimate @wikimedia.org mail being treated as spam. If you hear of this 
happening please let me know.


Technical details are below for anyone interested . . .

Thanks,
jg

Jeff Green
Operations Engineer, Special Projects
Wikimedia Foundation
149 New Montgomery Street, 3rd Floor
San Francisco, CA 94105
 jgr...@wikimedia.org

. . . . . . .

SPF overview http://en.wikipedia.org/wiki/Sender_Policy_Framework

The October 8 change will be simply a matter of adding a TXT record to the 
wikimedia.org DNS zone:


wikimedia.org IN TXT "v=spf1 ip4:91.198.174.0/24 ip4:208.80.152.0/22 
ip6:2620:0:860::/46 include:_spf.google.com ip4:74.121.51.111 ?all"


The record is a list of subnets that we identify as senders (all WMF subnets,
Google Apps, and the fundraiser mailhouse). The "?all" is a "neutral"
policy--it doesn't state either way how mail should be handled.


Eventually we'll probably bump "?all" to a stricter "~all" aka SoftFail, which 
tells the receiving side that only mail coming from the listed subnets is 
valid. Most ISPs will route 'other' mail to a spam folder based on SoftFail.
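For reference, the eventual SoftFail record would differ from the one quoted above only in that final qualifier; a sketch with the same subnet list and "?all" swapped for "~all":

```
wikimedia.org IN TXT "v=spf1 ip4:91.198.174.0/24 ip4:208.80.152.0/22 ip6:2620:0:860::/46 include:_spf.google.com ip4:74.121.51.111 ~all"
```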


Please bug me with any questions/comments!




Re: [Wikitech-l] Using mediawiki from within the Social networks?

2012-10-03 Thread Mike Dupont
On Wed, Oct 3, 2012 at 1:03 PM, Yury Katkov  wrote:
> hmmm, much like the blips on Google Wave?
like line comments in GitHub:
https://github.com/h4ck3rm1k3/wikiteam/commit/4da7f7f4a813b53be13bff7e29a1e5325bb68a30#L0R58

We need a way to track and rate comments; then we can resolve them
with changes after people have stormed over them.


-- 
James Michael DuPont
Member of Free Libre Open Source Software Kosova http://flossk.org
Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com
Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3
Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3



Re: [Wikitech-l] Using mediawiki from within the Social networks?

2012-10-03 Thread Yury Katkov
hmmm, much like the blips on Google Wave?
-
Yury Katkov



On Wed, Oct 3, 2012 at 12:50 PM, Mike Dupont  wrote:
> On Wed, Oct 3, 2012 at 7:43 AM, Yury Katkov  wrote:
>> I'm not sure that '''editing''' can be made easier with the help of a
>> social network client. Any ideas on that? Any ideas on what else can
>> be made more engaging with the power of social networks?
>
> Well, what if people could click on a bit of text and comment on it? They
> could suggest in that comment that the text be replaced.
>
> mike
>
>
> --
> James Michael DuPont
> Member of Free Libre Open Source Software Kosova http://flossk.org
> Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com
> Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
> Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3
> Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3
>



Re: [Wikitech-l] Using mediawiki from within the Social networks?

2012-10-03 Thread Mike Dupont
On Wed, Oct 3, 2012 at 7:43 AM, Yury Katkov  wrote:
> I'm not sure that '''editing''' can be made easier with the help of a
> social network client. Any ideas on that? Any ideas on what else can
> be made more engaging with the power of social networks?

Well, what if people could click on a bit of text and comment on it? They
could suggest in that comment that the text be replaced.

mike


-- 
James Michael DuPont
Member of Free Libre Open Source Software Kosova http://flossk.org
Saving wikipedia(tm) articles from deletion http://SpeedyDeletion.wikia.com
Contributor FOSM, the CC-BY-SA map of the world http://fosm.org
Mozilla Rep https://reps.mozilla.org/u/h4ck3rm1k3
Free Software Foundation Europe Fellow http://fsfe.org/support/?h4ck3rm1k3
