> On Feb 19, 2021, at 5:17 PM, Miller, Edward S. via vmsperl <vmsperl@perl.org> 
> wrote:
> 
> As I stated previously in this thread, our PERL installation (Alpha VMS 8.3) 
> was apparently successful
> and I had my previous questions answered, thank you.

I have updated README.vms based on your comments and some other things I 
noticed reading through it:

<https://github.com/Perl/perl5/commit/f1bf079f5ffcef6c6cc3eed62b84949770fa03e9>

> I have an additional question, which is mostly academic so it's fine if I 
> don't get an answer.
> Also I have some observations/suggestions related to the test procedures.
> 
> My question:  if I were to do a new install on another VMS system, could I 
> skip the MMK TEST step?
> Does this step do anything that is a prerequisite for the final MMK INSTALL 
> step?  Does this step ever
> fail in such a way as to make the INSTALL step impossible?

As far as I know, nothing would prevent an installation without running the 
tests.  If all your systems are on the same version of VMS with a similar 
configuration (same networking set-up, quotas, volume settings, logical names, 
etc.), you're unlikely to learn anything new by running the test suite on a 
different system.
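For what it's worth, a build-and-install sequence that skips the test step 
would look roughly like this (a sketch, assuming MMK is installed and you 
accept whatever answers you gave to the configuration questions):

    $ @configure.com        ! answer the configuration prompts
    $ mmk                   ! build perl and the bundled extensions
    $ mmk install           ! install; "mmk test" has been skipped entirely

The test step only exercises the build in place; it doesn't generate anything 
the install step depends on.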

If you are on v8.4 or later, the easiest way to get a semi-recent Perl is to 
just install one of the PCSI kits from here:

<https://sourceforge.net/projects/vmsperlkit/files/>

That project also includes procedures for building PCSI kits, so if you have 
more than a handful of systems, it might well be easier to build your own kit 
and install it everywhere than to build on every system.
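Installing one of those kits is a one-step PCSI operation, something like the 
following (the device and directory are placeholders for wherever you put the 
downloaded kit):

    $ product install perl /source=device:[directory]

PCSI then handles placing the files and registering the product, so nothing 
needs to be built on the target system.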

If you have a VSI version of VMS, they provide a kit built with the same 
kitting procedures, but last time I checked it was a fairly old version of Perl 
(5.20?).

> Some observations about the test procedures.
> 
> At the end of the MMK TEST job the following message appears:
>     ### Since not all tests were successful, you may want to run some of
>     ### them individually and examine any diagnostic messages they produce.
>     ### See the INSTALL document's section on "make test".
>     ### You have a good chance to get more information by running
>     ###   ./perl harness
>     ### in the 't' directory since most (>=80%) of the tests succeeded.
> 
> I have performed both of these operations.  The recommendation to run the 
> HARNESS test is not
> mentioned in the README.VMS file.  I'd suggest some mention in README.VMS 
> about HARNESS and the
> motivation for running it as well (in addition to MMK TEST).  

There is some discussion of harness in [.pod]perlhack.pod, and there is nothing 
really VMS-specific about it.  The default test driver is intended to have 
fewer requirements and can still run some tests when, for example, dynamic 
loading of extensions isn't working.  It also aborts a test script on the first 
failure rather than running the rest of its tests.  Harness has more 
requirements (notably the Test::Harness module) and runs every test script to 
completion.  I believe it also reports more detail that may help explain why a 
test failed.
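In case it's useful to anyone following along, running harness on VMS looks 
roughly like this (a sketch; it assumes you are sitting at the top of the build 
tree and that the build produced PERL.EXE there):

    $ set default [.t]          ! the tests expect to run from [.t]
    $ mcr [-]perl.exe harness   ! run the freshly built perl, not an installed one

The important detail is invoking the perl you just built rather than any 
previously installed one, so the results reflect the new build.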

> My IMPRESSION is that the information
> these tests provide is not likely to be actionable by the unknowledgable 
> bloke who is just trying to
> install PERL to satisfy a user request.

Right.  The assumption is really that anyone building Perl from source is ready 
and willing to debug Perl and C code if any tests fail.  I have done a ton of 
that for a couple of decades but can never quite keep up with new tests that 
sometimes fail, often for reasons having nothing to do with what's being 
tested.  That said, I only see about 8 tests failing on my system, not 58.

> My experience running HARNESS:
>    It reported 64 failed tests (where MMK TEST reported 58).  Most were the 
> same, but 8
>    failures were unique to HARNESS, 2 were unique to MMK TEST.
> 
>           Tests which fail with MMK_TEST but not HARNESS:
>             io/socketpair.t  (explicitly skipped by the HARNESS job)
>             op/lexsub.t      (not mentioned by the HARNESS job)
> 
>           Tests which fail with HARNESS but not MMK_TEST:
>             ../lib/File/Copy.t
>             ../lib/diagnostics.t
>             io/data.t
>             op/gv.t
>             op/lex.t
>             op/sigdispatch.t
>             run/switchC.t
>             run/switchx.t
>             (None of the above HARNESS failures were mentioned as being 
> tested in the MMK_TEST job.)
> 
>    I ran the 8 tests uniquely reported as failed by HARNESS thru the procedure
>              $ @[.vms]test .EXE "" -"v"   <testname>
> 
>    That procedure found NO failures in any of those failed HARNESS tests.
>    It appears that failure is in the eye of the beholder.

There is no way to know what makes the difference without debugging in detail 
and identifying whether the problem is the test scripts, the Perl operators 
used by the test scripts, the test drivers, the local system environment, etc.  
In particular, without detailed test output indicating what was expected and 
what the test actually got, I can't even guess at what might not be working 
correctly.

________________________________________
Craig A. Berry

"... getting out of a sonnet is much more
 difficult than getting in."
                 Brad Leithauser
