Re: Is fast-matrix.cpantesters.org dead?

2021-01-29 Thread Neil Bowers via cpan-workers
The fast matrix is back up and running again.

Just confirming here, though hopefully you already found out on other channels 
...


FOSDEM devroom on package managers

2017-11-15 Thread Neil Bowers
Next year’s FOSDEM is going to have a devroom on package managers:

https://lists.fosdem.org/pipermail/fosdem/2017-October/002630.html 


It would be great to have a talk on the “CPAN ecosystem” in this devroom. 
David, are you going to FOSDEM? :-)

A talk on CPAN Testers would also be good, given that, as far as I’m aware, no 
other language has anything quite like the CPAN Testers family.

DEADLINE for submissions is 1st December.

Neil



Re: Open source archives hosting malicious software packages

2017-09-22 Thread Neil Bowers
First cut at a script to check new CPAN packages:
https://github.com/neilb/cpan-watcher 


At the moment it just flags:
 - package names that are confusable with packages in other dists
 - package names which don’t come under the expected main package name

The first time you run it, it will grab the current CPAN Index. When you next 
run it (eg tomorrow) it will grab the index again, and then check new packages. 
It expects $HOME/cpan-watcher to exist.
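
Roughly, the “fetch the index and diff it against the previous run” step looks 
like this (a sketch, not the actual cpan-watcher code; the packages.txt 
snapshot filename is made up):

    use strict;
    use warnings;
    use HTTP::Tiny;
    use IO::Uncompress::Gunzip qw(gunzip $GunzipError);

    my $dir  = "$ENV{HOME}/cpan-watcher";
    my $prev = "$dir/packages.txt";

    my $res = HTTP::Tiny->new->get(
        'https://www.cpan.org/modules/02packages.details.txt.gz');
    die "failed to fetch index\n" unless $res->{success};
    gunzip(\$res->{content} => \my $index)
        or die "gunzip failed: $GunzipError\n";

    # the index is a header block, a blank line, then one package per line
    my (undef, $body) = split /\n\n/, $index, 2;
    my %current = map { (split ' ')[0] => 1 } split /\n/, $body;

    # anything indexed today but not in the previous snapshot is "new"
    if (-f $prev) {
        open my $fh, '<', $prev or die $!;
        chomp(my @seen = <$fh>);
        my %previous;
        @previous{@seen} = ();
        print "new package $_\n"
            for grep { !exists $previous{$_} } sort keys %current;
    }

    # save today's package list for the next run
    open my $out, '>', $prev or die $!;
    print {$out} "$_\n" for sort keys %current;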

And the output today:

new package Dancer2::Logger::LogAny (dist Dancer2-Logger-LogAny) is confusable 
with package Dancer::Logger::LogAny (dist Dancer-Logger-LogAny)
new package Device::Chip::AD5691R is in dist Device-Chip-AnalogConverters, but 
doesn't match expected namespace (Device::Chip::AnalogConverters)
new package Lab::Moose::Connection::USB is in dist Lab-Measurement, but doesn't 
match expected namespace (Lab::Measurement)
new package Lab::Moose::Connection::VXI11 is in dist Lab-Measurement, but 
doesn't match expected namespace (Lab::Measurement)
new package Lab::Moose::Instrument::ZI_MFIA is in dist Lab-Measurement, but 
doesn't match expected namespace (Lab::Measurement)

I’m going to have this in a crontab, running once a day.

Neil



Re: Open source archives hosting malicious software packages

2017-09-21 Thread Neil Bowers
> Would anyone know of any prior art for detection of "short edit distances"?  
> (Perhaps even already on CPAN?)

As David & Zefram pointed out, Levenshtein is the classic algorithm for this, 
but there are plenty of others; in the SEE ALSO for Text::Levenshtein I’ve 
listed at least some of the ones I know of on CPAN:
https://metacpan.org/pod/Text::Levenshtein#SEE-ALSO

A better algorithm for this purpose is the Damerau-Levenshtein edit distance:
Classic Levenshtein counts the number of insertions, deletions, and 
substitutions needed to get from one string to the other. Comparing 
"Algorithm::SVM" and "Algorithm::VSM” gives an edit distance of 2.
The Damerau variant adds transpositions of adjacent characters. This results in 
an edit distance of 1 for the example above, which is how my script found it.

I used Text::Levenshtein::Damerau::XS, because it’s quicker. That’s how I found 
the examples I gave yesterday.
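
To make the difference concrete, here’s a tiny comparison (assuming the 
distance() and xs_edistance() functions exported by Text::Levenshtein and 
Text::Levenshtein::Damerau::XS respectively):

    use strict;
    use warnings;
    use Text::Levenshtein qw(distance);
    use Text::Levenshtein::Damerau::XS qw(xs_edistance);

    my ($pkg1, $pkg2) = ('Algorithm::SVM', 'Algorithm::VSM');

    # classic Levenshtein: two substitutions (S->V and V->S)
    print "Levenshtein:         ", distance($pkg1, $pkg2), "\n";     # 2

    # Damerau variant: one transposition of adjacent characters
    print "Damerau-Levenshtein: ", xs_edistance($pkg1, $pkg2), "\n"; # 1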

I’ll tweak my script to not worry about packages in the same distribution (eg 
Acme::Flat::GV and Acme::Flat::HV). Then I just need to get a list of new 
packages each day, and I’m just about there :-)

Neil



Re: Open source archives hosting malicious software packages

2017-09-20 Thread Neil Bowers
>> http://www.theregister.co.uk/2017/09/15/pretend_python_packages_prey_on_poor_typing/
>> Would CPAN be subject to the same problem as described in the article above?
> 
> Yes.
> 
> DBI::Class, for example, could be a typo for DBIx::Class or a
> misremembered Class::DBI, and there's nothing stopping anyone from
> uploading a DBI::Class package that does all kinds of dodgy stuff.

There are plenty of confusable (small edit distance) pairs of module names on 
CPAN.

For example:
 - Algorithm::SVM and Algorithm::VSM
 - AI::POS and AI::PSO
Both pairs are from different dists. This is more likely with short acronyms.

One thing we could do is have a tool that looks at newly registered package 
names and alerts the PAUSE admins to any that are a short edit distance from an 
existing package name.
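
The check itself could be as simple as something like this (a sketch; the 
function name and the threshold parameter are made up):

    use strict;
    use warnings;
    use Text::Levenshtein qw(distance);

    # return any existing packages within $max_distance edits of the new name
    sub confusable_with {
        my ($new_package, $max_distance, @existing) = @_;
        return grep {
            $_ ne $new_package && distance($new_package, $_) <= $max_distance
        } @existing;
    }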

Neil


Re: Renaming the "QA Hackathon"?

2016-04-25 Thread Neil Bowers
Hi Salve,

> Since I'm the guy that actually named the QA hackathon originally, I'll take 
> the liberty to share my thoughts on the matter. I hope You don't mind. :)
> 
> "The Perl QA Hackathon" was originally named after the IRC channel: "The 
> #perl-qa hackathon". If you guys want to change the name (which I think you 
> are completely entitled to do, and probably is a good idea), maybe it's worth 
> considering the IRC channel roots of the name?

We had a discussion on the name at the QAH. I’ll write up a summary at some 
point this week.

> I'm looking forward to hear where this discussion goes, and where the 10th 
> anniversiary will be next year. (On that note, would you guys be interested 
> in seeing a QAH in Oslo again?)

In a group discussion the possibility of Oslo was mentioned; a number of people 
were positive, and there were no dissenting opinions :-)

Neil



Renaming the "QA Hackathon"?

2016-04-09 Thread Neil Bowers
I’ve added a topic to the wiki page for “topics for discussion” at the QAH:

Should we rename this event?

Eg to “Perl QA Workshop”, or something like that.

There’s a well-established definition for “hackathon” these days, and the QAH 
is not one of those. As a result when talking to potential sponsors, we have to 
be careful to define what the event is, how it works, and the attitude towards 
the output(s). I’ve had plenty of discussions explaining “no, not that kind of 
hackathon”.

I.e. people who aren’t already familiar with the QAH hear “4-day … hackathon” and 
think something along the lines of:

So you’re going to get together and lash things up in a frenzy, in teams 
competing against each other.

Uh, no.

Getting the number of sponsors we have requires contacting an awful lot more 
companies and organisations, and I wonder how many of them skim the request and think:

So they want some money to get together for a hackathon?

We don’t support hackathons.

And then don’t bother replying. Or just reply with a “no”, which obviously we 
have to respect.

On the flip side, it’s an established name, and the event is being held for the 
9th time this year.

I’m not saying “we must change the name!”, but I think we should consider it.

Neil



Re: Thoughts on Kik and NPM and implications for CPAN

2016-03-24 Thread Neil Bowers
>> PAUSE doesn’t (currently) know the river position, but if it published
>> a feed of deletion-schedulings, then some third-party agent could
>> monitor the feed and check for dists that are on river. I think those
>> are the dists that should be alerted to modules@ […] Obviously the
>> issue here is DarkPAN: a dist might not have any CPAN dependents, but
>> may be used plenty out in the big bad world. That’s a separate problem
>> :-)
> 
> I don’t think so. Plack::Middleware::Rewrite is used by a ton of people
> and klaxons certainly ought to ring if I ever opened up that namespace.
> The number of on-CPAN dependents is just 3 though.

The key word in what I said was *any*. I think even 3 dependents should trigger 
the klaxon. Having any favourites on CPAN should prompt the klaxon as well: I use 
favourites as a proxy for “has dependents” both in the adoption list and when 
weighting dists for the PRC, and it seems to work.

I still think we need a service where you can say “I’m using this dist”. I 
think I’ll add that feature to the dashboard, which I’ll be working on at the 
QAH.

Neil



Re: Thoughts on Kik and NPM and implications for CPAN

2016-03-24 Thread Neil Bowers
> However, we (the CPAN community) can do a lot of things after that to 
> mitigate any damage. I wholeheartedly agree with transferring namespace 
> permissions to something that the PAUSE admins control, so any random joe 
> cannot claim the namespace and upload whatever he likes into it (this is an 
> attack vector we must keep closed).  We also need to be able to act quickly 
> to publish something in its place so installations pulling directly from the 
> CPAN do not break.  I would suggest an email alert go out to the modules@ 
> list (or another list, should this prove too noisy) providing notification 
> that an indexed module is being deleted and de-indexed.

I’ve got no idea what the monthly volume of deletions is, but I think there are 
two main cases:

1. dists that aren’t used by anything else
2. dists that are somewhere on the river

It would be nice to have a feed of all dists scheduled for deletion, as soon as 
they’re scheduled, with additional alerting if the dist is upriver at all.
PAUSE doesn’t (currently) know the river position, but if it published a feed 
of deletion-schedulings, then some third-party agent could monitor the feed and 
check for dists that are on river. I think those are the dists that should be 
alerted to modules@

Even if all deletions go to modules@, it would still be handy if that 
notification mentioned river position. Maybe PAUSE could publish an hourly list 
of files that are currently scheduled for deletion, similar to various other 
files it generates?

Obviously the issue here is DarkPAN: a dist might not have any CPAN dependents, 
but may be used plenty out in the big bad world. That’s a separate problem :-)

Neil



Re: Found rare bug in Pod::Simple

2016-03-06 Thread Neil Bowers
>> There are two CPAN Testers fails:
>>  
>> http://www.cpantesters.org/cpan/report/4ddcddb1-6c58-1014-bec3-a1032b7077ee 
>>  
>> http://www.cpantesters.org/cpan/report/39970866-dd9c-11e5-a3ee-89603848fe5a 
> 
> Do you have any thoughts on why these occurred on these particular OS/Perl 
> version combinations?
> 
>> Basically the problem is that
>>  - the pod directory has both perlpodstyle and perlpodstyle.pod in it 
>> (how come?!)
> 
> Is it possible that the testers have files left over from previous test runs?

Not the foggiest. That was one of the reasons I included cpan-workers on this, 
because I’m curious how this came about.

I’ve emailed the two people who produced these two fails, asking them to look 
in the relevant directory and check my theory (just because I can reproduce the 
error doesn’t mean I’ve reproduced how they hit it, though it seems likely I 
have).

Somehow when installing perl, they ended up with “perlpodstyle” as well as 
“perlpodstyle.pod” in their pod directory. And given what the test does, it 
looks like it was only for that one pod file. One was on MacOSX and one on 
Windows, with different versions of perl.

Odd.

> Also, could I ask what repository/branch you're working from?

I forked Marc’s repo for Pod-Simple, since `corelist --upstream Pod::Simple` 
says it’s upstream cpan.

> If I read your analysis correctly, survey() -- or, more precisely 
> _make_search_callback() -- already handles '.plx'.  It's find() that fails to 
> handle '.pl’.

Yup, I miswrote in my summary.

> If that's correct, then why not make both functions handle exactly the same 
> set of file extensions?  Whether that should be specified explicitly or by 
> regex -- I have no preference.  But we could then abstract out the formula 
> for scanning extensions into a function in a single location.

I’ve been resisting the urge to refactor, and just make minimal changes, but 
you’re right, in this case I should do a teeny bit more.

Cheers,
Neil




Found rare bug in Pod::Simple

2016-03-05 Thread Neil Bowers
Hi Marc, & CPAN Workers,

I’ve been looking into the final two CPAN Testers fails, and have finally got 
to the bottom of them.
The failing test is search50.t, and the problem is where it does the following:

- call survey() to get a hash of name => path
- for each name, call find() and check it returns the same path

There are two CPAN Testers fails:


http://www.cpantesters.org/cpan/report/4ddcddb1-6c58-1014-bec3-a1032b7077ee 


http://www.cpantesters.org/cpan/report/39970866-dd9c-11e5-a3ee-89603848fe5a 


Basically the problem is that

- the pod directory has both perlpodstyle and perlpodstyle.pod in it 
(how come?!)
- the survey() method doesn’t even find perlpodstyle, it just finds 
perlpodstyle.pod
- the find() method finds both and it finds perlpodstyle first
- so the test failed

With this understanding I reproduced the failing test.

The problem comes down to two lines of code. In _make_search_callback(), which 
is invoked by survey(), we have:

unless( m/^[-_a-zA-Z0-9]+\.(?:pod|pm|plx?)\z/is ) {

So it’s only considering files that end in .pod, .pm, .pl, or .plx

But in find() we have:

foreach my $ext ('', '.pod', '.pm', '.pl') { # possible extensions

As you can see, it first checks for no extension. Also note that it’s not 
checking for the ‘.plx’ extension, which survey handles. I’ve never come across 
anyone using the .plx extension, but I guess for a while maybe people did:

http://www.perlmonks.org/?node_id=336713 


I think that the best fix is to make the survey() pattern match files with no 
extensions, and also files with the .plx extension.
This means that survey will scan a number of extra files, but since it also 
scans them looking for pod, it shouldn’t return any false positives. It would 
also mean that survey now finds scripts that have pod in them, which it 
currently misses.
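
For illustration, making the extension part of the survey() pattern optional 
would look something like this (a sketch of the idea, not a tested patch):

unless( m/^[-_a-zA-Z0-9]+(?:\.(?:pod|pm|plx?))?\z/is ) {

so that an extensionless file like “perlpodstyle” is picked up as well.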

Do you agree with the proposed fix, and are there any gotchas I might be 
missing?

Cheers,
Neil



Working on Pod-Simple

2016-03-05 Thread Neil Bowers
A few weeks ago I mailed that I was planning to work on improving things in 
modules at the head of the river. The first of these is Pod::Simple. I’m slowly 
working towards a PR with various changes, and doing developer releases (with 
Marc’s permission). I’ll outline the changes here, to give people a chance to 
comment, since the module has so many dists downstream.

Min perl version
I’ve set min perl version to 5.006. It was previously not set in metadata, and 
all modules bar one had “require 5”. But the ChangeLog had discussion of 
changes that explicitly said anything earlier than 5.6 was no longer supported. 
If you look at CPAN Testers, you can see there are passes back to 5.6.2.
http://matrix.cpantesters.org/?dist=Pod-Simple%203.32

One module in the dist requires 5.008: Pod::Simple::TranscodeSmart. This is 
used by Pod::Simple::Transcode, which falls back to using 
Pod::Simple::TranscodeDumb if it can’t load Pod::Simple::TranscodeSmart, which 
is why the dist passes tests on 5.6.2.
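
The fallback is the usual “try the better module, fall back if it won’t load” 
pattern, something along these lines (an illustration of the pattern, 
paraphrased rather than the actual Pod::Simple::Transcode source; the package 
name here is made up):

    package My::Transcode::Sketch;
    use strict;
    use warnings;
    our @ISA;
    BEGIN {
        if (eval { require Pod::Simple::TranscodeSmart; 1 }) {
            @ISA = ('Pod::Simple::TranscodeSmart');   # needs perl 5.008+
        }
        else {
            require Pod::Simple::TranscodeDumb;       # works on older perls
            @ISA = ('Pod::Simple::TranscodeDumb');
        }
    }
    1;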

So I set it to 5.006; another option would have been 5.006002, since that’s as 
far back as the CPAN Testers evidence goes.

Dropped “use vars”
Dropped all uses of “use vars”, and replaced with “our” as appropriate.

Add “use warnings”
I added “use warnings” to all modules, and resolved the resulting warnings. 
Some things required appropriately scoped “no warnings …”
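
For illustration, the combined effect of those two changes on a typical module 
preamble (a sketch, not an actual hunk from the PR):

    # before
    use vars qw($VERSION @ISA);

    # after
    use warnings;
    our ($VERSION, @ISA);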

Fixed CPAN Testers fails
As you can see from the CPAN Testers matrix link above, the current stable 
release has various failures. I’ve fixed most of them, and my current dev 
release has one particular failing test, which I’m hoping to sort out this 
weekend:
http://matrix.cpantesters.org/?dist=Pod-Simple 

This is one of my main goals: to get this green across the board.

Neil



Re: Looking for prior art on conventions for dep-listing

2016-03-01 Thread Neil Bowers
> cpanm had --scan-deps, though it's now listed as deprecated.
> 
> And CPAN has plenty of these sorts of things, eg. Perl::PrereqScanner

App::Midgen and the midgen script were designed to determine and list prereqs 
of different types, in the formats expected by various tools:

https://metacpan.org/release/App-Midgen



addressing kwalitee and other quality issues at the head of the CPAN River

2016-01-28 Thread Neil Bowers
I’ve been looking at CPAN distributions that have 10k+ downstream dependent 
distributions. There are currently just 45 such distributions:

http://neilb.org/2016/01/26/river-head-quality.html

I think that in general these heavyweight dists should be good examples for 
people to look at. Sometimes there will be special reasons why they’re not 
following all best practices, no doubt, but in general I reckon they should.

But before I start sending pull requests and blead patches for core modules, 
and since I know that opinions on kwalitee vary, I figured I should raise the 
topic somewhere.

For example, picking one module, base, I would look at:
 - adding license to the doc and dist metadata
 - adding min perl version
 - add a basic README
 - use warnings
 - add links to parent and superclass in SEE ALSO
 - change mentions of fields and parent in the first para to L<fields> and 
L<parent>
 - update doc to include discussion of RT#28580, and suggest it’s closed 
(https://rt.cpan.org/Public/Bug/Display.html?id=28580)
 - RT#98621 can be closed (https://rt.cpan.org/Public/Bug/Display.html?id=98621)
 - For RT#98360 see below
 - I’m not familiar enough with fields for RT#68763 
(https://rt.cpan.org/Public/Bug/Display.html?id=68763)
 - Similarly for RT#28399

Is there an agreed policy for whether blead upstream modules should have the 
perl git repo in the dist metadata or not? Some do and some don’t. Personally I 
think it’s of questionable value, since I can’t submit a PR, but seeing the perl 
repo URL does at least tell me that it’s blead upstream.

Neil



The PRC and kwalitee

2015-12-24 Thread Neil Bowers
Given an email I had off-list, I’ll clarify something related to the PR 
challenge (PRC):

Through the year I had the occasional email from *authors* whose distributions 
had been assigned, and who got a PR that addressed kwalitee fails and nothing 
else. They weren’t happy with these PRs.

Recently, I sent a questionnaire to all authors who’d had at least one 
distribution assigned in the PRC. I got quite a few more comments from authors 
saying that they didn’t want to get kwalitee PRs.

On the flip-side, some *participants* got assignments where they said “the only 
thing I can think to do is kwalitee improvements, which I don’t want to do, so 
please can I have a different assignment”.

Originally I didn’t plan to run the PRC in 2016, but enough people have asked 
to do it again next year that I’m now going to, but with some changes.

In particular I’m going to email all authors with a repo and get them to 
opt-in, rather than the system for 2015, which was opt-out.

Neil



Re: CPAN River - water quality metric

2015-12-24 Thread Neil Bowers
> CPANdeps (http://deps.cpantesters.org) has been providing useful
> information on water quality. It might be enough to make a better or
> opinionated presentation of it for the upriver authors. IMHO META
> files and min version specification depends more on when a
> distribution is released and don't well fit for water quality metrics.

I’m not convinced on min version either, but am leaning towards including it, 
if we can come up with a definition that’s practical and useful.

I think “has a META.yml or META.json” is worth keeping in, as there are a 
number of benefits to having one, and I suspect there’s at least some 
correlation between dists that don’t have a META file and dists that haven’t 
listed all prereqs (eg in the Makefile.PL).

That said, I’m really just experimenting here, trying to find things that are 
useful indicators for whether a dist is good to rely on.

Neil



Re: CPAN River - water quality metric

2015-12-23 Thread Neil Bowers
> I thought the "min perl version" is a tough metric without considering what 
> version of Perl it will actually run on.  I would refine that metric to 
> "declared min perl version >= actual perl version required".  Figuring out 
> the latter could perhaps be done via CPAN Testers -- if all of 5.6 fails, 
> then we know it's 5.8 or better. But if there is at least one 5.6 pass, 
> then it works on 5.6. And if it works on 5.6, I think omission of a 
> minimum perl version is no big deal.

I nearly didn’t include the “min perl version” in this, as there’s no easy 
clean definition. As you say, using it in conjunction with CPAN Testers results 
might produce something usable. The thing that prompted me to include it was 
the part of my talk about the fact that the OSes and versions of perl supported 
by your dist is the intersection of those your code supports and those 
supported by all of your upstream dependencies. Once we’re past the important 
events of the next few days[*], I’ll have more time to spend on this.

> I don't want to see go down the Kwalitee route where people put a minimum 
> perl version of "5" or something just to get a better water quality score.

Indeed.

> Generally, I think some subset of the core Kwalitee metrics and some 
> adaptation of your adoption criteria (e.g. time since any release by author) 
> would be a place to look for "water quality" metrics.  I do think you need to 
> find a way to distinguish what water quality is trying to measure distinct 
> from Kwalitee.

Agreed.

Part of my motivation for a separate and simpler measure was the feedback I had 
on the PR challenge, where some authors weren’t happy to get PRs that addressed 
failing CPANTS metrics. In general the message I got was “I agreed with some 
parts of kwalitee, but not other bits”. I’m hoping we can identify a small set 
of metrics that are hard to argue with when considering distributions that 
start moving up river.

Neil

[*] for example, taking my son to Star Wars tomorrow :-)



Re: CPAN River - water quality metric

2015-12-23 Thread Neil Bowers
> You could try collecting up a bunch of these different metrics and then run a 
> regression analysis against the graph wise recursive downstream dep count for 
> everything on CPAN and see which metrics fall out in the real world.

I might have a dabble at this, perhaps roping in help from someone more 
mathematically, er rigorous, than me.

> 
> So many times we come up with arbitrary scoring systems that don't actually 
> match to the real things that happen in the wild.

Having played with various scoring metrics, the one I use for CPAN Testers 
seems to be pretty reliable, and good for this purpose. A CPAN Testers fail for 
one of your upstream dependencies could indicate someone unable to install your 
dist.

The other measure that worked well for the adoption criteria is the bug 
scoring: basically, have multiple bugs (not wishlist items) been raised since 
the last release, and was that last release more than N months ago? That 
combination indicates that people are using the dist, but that there doesn’t 
appear to be an engaged maintainer.

Neil



CPAN River - water quality metric

2015-12-22 Thread Neil Bowers
At the London Perl Workshop I gave a talk on the CPAN River, and how 
development and release practices should mature as a dist moves up river. This 
was prompted by the discussions we had at Berlin earlier this year.

Writing the talk prompted a bunch of ideas, one of which is having a “water 
quality” metric, which gives some indication of whether a dist is a good one to 
rely on (needs a better name). I’ve come up with a first definition, and 
calculated the metric for the different stages of the river:

http://neilb.org/2015/12/22/cpan-river-water-quality.html

Any thoughts on what factors should be included in such a metric? I think it 
should really include factors that it would be hard for anyone to argue with. 
Currently the individual factors are:

 - not having too many CPAN Testers fails
 - having a META.json or META.yml file
 - specifying the min perl version required for the dist
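
A toy sketch of how those three factors might be combined into a score (the 
field names and failure threshold here are made up, just to show the shape):

    # score a dist from 0 to 3 based on the factors above
    sub water_quality {
        my ($dist) = @_;    # hashref of precomputed facts about the dist
        my $score = 0;
        $score++ if $dist->{cpantesters_fail_rate} <= 0.02;  # "not too many fails"
        $score++ if $dist->{has_meta_file};                  # META.json or META.yml
        $score++ if $dist->{declares_min_perl};              # min perl version set
        return $score;
    }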

Cheers,
Neil

At some point I’ll share the slides from my talk, but SlideShare doesn’t handle 
Keynote presentations, and the PowerPoint exported from Keynote is broken 
(neither PowerPoint nor SlideShare can handle it!).



Re: Measuring water quality of the CPAN river

2015-05-11 Thread Neil Bowers
 On 11 May 2015, at 01:47, Kent Fredric kentfred...@gmail.com wrote:
 So the quality of a dist could be measured indirectly by the failure rate of 
 its dependents.

This was kind of the basis of the “River Smoker” idea that Tux and I discussed 
late on the last day of the QAH:

http://neilb.org/2015/04/24/cpan-downstream-testing.html

 Or as an analogy, we have 2 sampling points in the river 100m apart.
 
 If we sample at point 1, we can't sense the fecal matter because it enters 
 the river downstream of the sampling point.
 
 But the fecal matter is sampled at point 2, so by conjecture, either point 2 
 created it, or it entered between points 1 and 2.
 
 Sampling across the river at point 2 helps infer where the probable fecal 
 matter is.

Sort of a CPAN bisect, or rather a CPAN Testers bisect: look at 2 or more CPAN 
Testers fails where the only difference is an upriver version number.

Neil



Measuring water quality of the CPAN river

2015-05-10 Thread Neil Bowers
One of the goals of the CPAN River model is to get us to focus on cleaning up 
the river, starting at the headwaters first. To that end, I’ve been thinking 
about how we might measure “river quality”, and have written up two ideas so 
far:

http://neilb.org/2015/05/11/measuring-cpan-quality.html

These are pretty simple, and have some problems, so I’m hoping someone might 
come up with something more useful.

Here are the 6 dists in the most upstream category that have >= 2% failures on 
CPAN Testers:

podlators
IO
File-Path
IPC-Cmd
XSLoader
Locale-Maketext-Simple

I think I saw that File::Path was worked on at the NY hackathon last weekend, 
so maybe that will drop off this list soon ...

Neil



Re: Documenting best practices and the state of ToolChain guidelines using CPAN and POD

2015-05-06 Thread Neil Bowers
 I’ve parked it for the moment, because Gabor has said he’s working on a CPAN 
 notification system that he’d like to add this feature to.
 
 Neil, it seems to me it is important to clarify if Gabor intends for his 
 system to be fully and unconditionally open akin to metacpan, or is intended 
 as a freemium service akin to stratopan. In case of the latter I do not 
 consider this plan a viable way forward.

Gabor says “free”!

Neil




Re: Documenting best practices and the state of ToolChain guidelines using CPAN and POD

2015-05-06 Thread Neil Bowers
 In that vein, we need some sort of Canon set of documentations, written and 
 maintained by toolchain themselves, articulating how things /should/ be done 
 as far as toolchain are concerned, without any sort of requirement that 
 people adhere to it, unless they want to make toolchain happy.

+N

 As such, I propose a very rudimentary idea:
 
 Toolchain
 
 This is the top namespace 
 
 Toolchain::Standards

Please please please, let’s not put this on CPAN. There are enough abuses of 
CPAN already. It’s a comprehensive archive of Perl, not everything in any way 
related to Perl. Plus I wouldn’t want to constrain this sort of documentation 
to pod, and how it’s presented on MetaCPAN / search.cpan.org, which is what 
we’d effectively be talking about.

If there were a canonical source of information related to toolchain etc, then 
plenty of things on CPAN would link to it in their SEE ALSO sections, but it 
really doesn’t have to be *on* CPAN.

I’m no great fan of wikis, but I’ve often thought it surprising that there isn’t a 
centralised wiki for Perl knowledge, a Perlipedia, if you will. It doesn’t have 
to be a wiki. It could be done via a github repo / github pages (yes, I did 
note your comment about markdown, but markdown is preferable to pod rendered 
via MetaCPAN, IMHO :-) The advantage of a wiki is that it makes it very easy to 
contribute.

There’s one domain that’s woefully under-used where this could live: perl.com

This isn’t a fully thought-out response, but I wanted to (a) offer support for 
the concept, and (b) plead that it not be done via CPAN.

Neil