Gabor Szabo wrote:
> On 9/24/06, David Golden <[EMAIL PROTECTED]> wrote:
>> (As posted to Perl Monks, but probably a good question for this list
>> as well)
>>
>> As I've been developing CPAN::Reporter, I've been thinking about what
>> information I wished I had when examining failure reports on CPAN
>> Testers. Test::Reporter already includes the results of "perl -V". As
>> part of CPAN::Reporter, I include a list of prerequisites specified
>> and the module versions loaded (and I believe that CPANPLUS does this
>> as well).
> 
> I think the above (and maybe the rest of the details) should be included
> in the success reports as well.
> Sometimes there seem to be two reports from the same source: a failure
> and then a success. It would be very useful to be able to see what has
> changed in the system that let the tests pass.
> Even without the duplicates this might help understand what is a good
> vs. bad environment for the module.
> 
> Thank you for giving CPAN::Reporter to the world.
> 
> Gabor
> 
> 

Late to the thread. Sorry.

Personally, one of the things I've always wished for in tester reports is
the actual test output itself. Specifically, I'd like to know which tests
were skipped because test-related prerequisites aren't installed. Just
because 30 people submitted success reports doesn't mean they've run all
the tests.

If I have a set of tests that are almost always skipped because
something isn't installed, maybe I would look for alternate ways to run
those tests.

Maybe just a list of the skipped tests would suffice.
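For what it's worth, this is the usual Test::More SKIP-block idiom I mean --
tests silently skipped when an optional prerequisite is missing. The module
name (DBI) is just an illustration; any optional dependency works the same way:

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Probe for an optional prerequisite (DBI is only an example here).
my $have_dbi = eval { require DBI; 1 };

SKIP: {
    # If the prerequisite is absent, both tests are skipped -- and the
    # report still counts as a PASS, which is exactly the problem.
    skip "DBI not installed; skipping database tests", 2 unless $have_dbi;

    ok( defined $DBI::VERSION, "DBI loaded" );
    ok( $DBI::VERSION > 0,     "DBI version looks sane" );
}
```

On a machine without DBI this prints "ok 1 # skip ..." lines and the whole
test file passes, so a success report tells you nothing about those tests
unless the skip reasons are captured in the report.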

-=Chris
