On Mon, Oct 25, 2010 at 11:23 PM, Ashar Voultoiz hashar+...@free.fr wrote:
On 25/10/10 23:26, George Herbert wrote:
I for one only use secure.wikimedia.org; I would like to urge as a
general course that the Foundation switch to a HTTPS by default
strategy...
HTTPS means full encryption, that
George Herbert wrote:
The current WMF situation is becoming quaint - pros use
secure.wikimedia.org, amateurs don't realize what they're exposing.
By professional standards, we're not keeping up with professional
industry expectations. It's not nuclear bomb secrets (cough) or
missile designs
On Mon, Oct 25, 2010 at 11:59 PM, MZMcBride z...@mzmcbride.com wrote:
George Herbert wrote:
The current WMF situation is becoming quaint - pros use
secure.wikimedia.org, amateurs don't realize what they're exposing.
By professional standards, we're not keeping up with professional
industry
On 10/26/2010 08:59 AM, MZMcBride wrote:
As Aryeh notes, even those who act in an editing role (rather than in simply
a reader role) don't generally have valuable accounts. The pros you're
talking about are free to use secure.wikimedia.org (which is already set up
and has been for quite some
On Tue, Oct 26, 2010 at 6:24 PM, George Herbert
george.herb...@gmail.com wrote:
..
But I would prefer to move towards a logged-in user by default goes to
secure connection model. That would include making secure a
multi-system, fully redundantly supported part of the environment, or
On 26.10.2010 09:36, Nikola Smolenski wrote:
On 10/26/2010 08:59 AM, MZMcBride wrote:
As Aryeh notes, even those who act in an editing role (rather than in simply
a reader role) don't generally have valuable accounts. The pros you're
talking about are free to use secure.wikimedia.org (which is
There is no real massive load caused by https at runtime. There is, however,
a significant chunk of developer and sysadmin time needed to implement this
and make it work.
For now, at least, the only optimisations that should be considered are
those that make it easier all round.
Conrad
On 26 Oct
2010/10/25 Jan Paul Posma jp.po...@gmail.com
Hi all,
As presented last Saturday at the Hack-A-Ton, I've committed a new version
of the InlineEditor extension. [1] This is an implementation of the
sentence-level editing demo posted a few months ago.
Very interesting! Obviously I'll not see
Robert Rohde wrote:
Many of the things done for the statistical analysis of database dumps
should be suitable for parallelization (e.g. break the dump into
chunks, process the chunks in parallel and sum the results). You
could talk to Erik Zachte. I don't know if his code has already been
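The split-process-sum pattern Robert describes can be sketched as a small map/reduce over chunk files. This is an illustrative assumption, not Erik Zachte's actual code: the per-chunk statistic here is just a line count, standing in for whatever the real analysis computes.

```python
# Hypothetical sketch of the pattern described above: break the dump into
# chunks, process the chunks in parallel, and sum the results.
from multiprocessing import Pool

def count_lines(chunk_path):
    """Per-chunk statistic; a stand-in for the real per-chunk analysis."""
    with open(chunk_path, "rb") as f:
        return sum(1 for _ in f)

def parallel_stats(chunk_paths, workers=4):
    """Map count_lines over the chunks in parallel, then reduce by summing."""
    with Pool(workers) as pool:
        partials = pool.map(count_lines, chunk_paths)
    return sum(partials)
```

Any per-chunk function that returns a summable partial result fits this shape; the reduction step stays a plain sum.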
Develop a new bot framework (maybe interwiki processing to start with) for a
high-performance GPU cluster (nvidia or AMD), similar to what BOINC-based
projects do. Nvidia is more popular, while AMD has more cores for the same
price
:)
Regards,
Jyothis.
http://www.Jyothis.net
On Tue, 26-10-2010 at 16:25 +0200, Platonides wrote:
Robert Rohde wrote:
Many of the things done for the statistical analysis of database dumps
should be suitable for parallelization (e.g. break the dump into
chunks, process the chunks in parallel and sum the results).
On Tue, Oct 26, 2010 at 2:23 AM, Ashar Voultoiz hashar+...@free.fr wrote:
HTTPS means full encryption, that is either :
- a ton of CPU cycles : those are wasted cycles for something else.
- SSL ASIC : costly, especially given our gets/bandwidth levels
HTTPS uses very few CPU cycles by
Good afternoon,
In r75437, r75438[0][1] I moved the old installer to old-index.php
and moved the new to index.php. At this stage in the process,
I don't see us backing this out before we branch 1.17. I really
want people to test it out and report any major breakages [2].
This has been a long
2010/10/26 Erik Moeller e...@wikimedia.org:
A few quick notes:
And, sorry for duplicating stuff from the known issues list.
--
Erik Möller
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
I am on ALL of these things, actually. I have fixes for most of them
pending.
On 10/26/10 10:41 AM, Erik Moeller wrote:
2010/10/26 Chad innocentkil...@gmail.com:
Good afternoon,
In r75437, r75438[0][1] I moved the old installer to old-index.php
and moved the new to index.php. At
2010/10/26 Brandon Harris bhar...@wikimedia.org:
I am on ALL of these things, actually. I have fixes for most of them
pending.
Awesome :-)
--
Erik Möller
Deputy Director, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
Aryeh Gregor Simetrical+wikilist at gmail.com writes:
To clarify, the subject needs to 1) be reasonably doable in a short
timeframe, 2) not build on top of something that's already too
optimized. It should probably either be a new project; or an effort
to parallelize something that already
On 24/10/10 17:42, Aryeh Gregor wrote:
This term I'm taking a course in high-performance computing
http://cs.nyu.edu/courses/fall10/G22.2945-001/index.html, and I have
to pick a topic for a final project. According to the assignment
On Tue, Oct 26, 2010 at 10:00 AM, Chad innocentkil...@gmail.com wrote:
This has been a long development process for almost 2 years
now, and I'd like to thank Max, Mark H., Jure, Jeroen, Roan
and Siebrand for their invaluable help in working on this. And
especially thanks to Tim for starting
On Tue, Oct 26, 2010 at 8:25 AM, Ariel T. Glenn ar...@wikimedia.org wrote:
On Tue, 26-10-2010 at 16:25 +0200, Platonides wrote:
Robert Rohde wrote:
Many of the things done for the statistical analysis of database dumps
should be suitable for parallelization (e.g. break the
Ariel T. Glenn wrote:
If one were clever (and I have some code that would enable one to be
clever), one could seek to some point in the (bzip2-compressed) file and
uncompress from there before processing. Running a bunch of jobs each
decompressing only their small piece then becomes feasible.
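The seek-and-decompress idea Ariel describes can be sketched with Python's `bz2` module, assuming a multistream bzip2 file (concatenated bz2 streams, as in the `-multistream` dump layout) and a known byte offset for the stream of interest; Ariel's actual code for finding seek points is not shown here.

```python
# Hedged sketch, assuming a multistream bzip2 file and a known stream offset:
# seek to the offset and decompress only that one stream.
import bz2

def read_stream_at(path, offset):
    """Decompress the single bz2 stream that starts at byte `offset`."""
    decomp = bz2.BZ2Decompressor()
    out = []
    with open(path, "rb") as f:
        f.seek(offset)
        while not decomp.eof:
            chunk = f.read(64 * 1024)
            if not chunk:
                break
            out.append(decomp.decompress(chunk))
    return b"".join(out)
```

With a list of stream offsets, each parallel job gets one offset and decompresses only its small piece, which is exactly what makes the bunch-of-jobs approach feasible.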
On Wed, 27-10-2010 at 00:05 +0200, Ángel González wrote:
Ariel T. Glenn wrote:
If one were clever (and I have some code that would enable one to be
clever), one could seek to some point in the (bzip2-compressed) file and
uncompress from there before processing. Running a
@2010-10-26 03:45, Erik Moeller:
2010/10/25 Brion Vibber br...@pobox.com:
In all cases we have the worry that if we allow uploading those funky
formats, we'll either a) end up with malicious files or b) end up with lazy
people using and uploading non-free editing formats when we'd prefer them
After the recent discussions on openness and clarity, several people have
asked what is contained within RT and been given answers like "it's staff
stuff".
So what is stored in it that can't be within either the staff or internal
wiki where it must be private
On Tue, Oct 26, 2010 at 6:50 AM, Max Semenik maxsem.w...@gmail.com wrote:
Instead of amassing social constructs around technical deficiency, I
propose to fix bug 24230 [1] by implementing proper checking for JAR
format. Also, we need to check all contents with antivirus and
disallow certain
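The JAR check Max proposes can be sketched as follows. This is a hypothetical illustration, not the patch attached to bug 24230: Java locates a JAR's end-of-central-directory record by scanning backwards from the end of the file, so a file whose tail contains that record could be loaded as an archive even if it also parses as an image.

```python
# Hedged sketch of a JAR/ZIP format check: look for the ZIP
# end-of-central-directory signature in the tail of the file, where
# Java's archive reader would search for it.
EOCD_MAGIC = b"PK\x05\x06"

def looks_like_zip(path, tail_size=65536 + 22):
    """True if the file's tail contains a ZIP end-of-central-directory record.

    The tail window covers the maximum ZIP comment length (65535 bytes)
    plus the fixed 22-byte record itself.
    """
    with open(path, "rb") as f:
        f.seek(0, 2)
        size = f.tell()
        f.seek(max(0, size - tail_size))
        return EOCD_MAGIC in f.read()
```

An upload handler would reject any file for which this returns True unless a ZIP-based format was explicitly expected.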