"As to re-testing after recompile, if the resulting OBJ is the same, sure, no 
need. When can you expect that? Probably rarely, but that's only because there 
are often dates present in the output. So ignoring changes due to compile-date, 
changes in macros could affect things, so assume none of them. That brings you 
to something like application of a PTF. So are you compiling with the same PTF 
level of the compiler as before?"

In this discussion, the above arose from this: "Any program change requires 
full regression testing, including "just a recompile"."

I'm not sure what you mean by "changes in macros" in this context, so I'll 
assume "copybooks".

By "just a recompile" I assumed "with no changes to anything". The reason I 
assumed that is because there are sites where, if program A CALLs program B, 
when program B is changed, you have to recompile program A. Seriously.

If you change a data-name in a copybook, and recompile the program, the source 
is different but the object is expected to be the same. If it is expected to be 
the same, and can be proven to be the same, why do a regression test?
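
Proving "the object is the same" is not quite a plain byte-for-byte compare, 
because of the compile dates mentioned in the quote above. A minimal sketch, 
assuming the two object decks have been pulled down as binary files (the file 
names and the off-platform Python are mine, purely illustrative): it reports 
exactly where the decks differ, so you can judge whether the only differences 
are the compiler-inserted date/time stamps.

#!/usr/bin/env python3
# Sketch: compare two object decks byte-for-byte and report where they
# differ, so a human can confirm the only differences are the known
# compiler-inserted date/time fields. File names are hypothetical.
import sys
from itertools import groupby

def differing_ranges(a: bytes, b: bytes):
    """Yield (start, end) offsets of contiguous runs of differing bytes."""
    limit = min(len(a), len(b))
    diffs = [i for i in range(limit) if a[i] != b[i]]
    # Collapse consecutive offsets into (start, end) ranges.
    for _, grp in groupby(enumerate(diffs), key=lambda t: t[1] - t[0]):
        run = [offset for _, offset in grp]
        yield run[0], run[-1]

def main(old_path: str, new_path: str) -> int:
    old = open(old_path, "rb").read()
    new = open(new_path, "rb").read()
    if old == new:
        print("Object decks are identical.")
        return 0
    if len(old) != len(new):
        print(f"Lengths differ: {len(old)} vs {len(new)} bytes.")
    for start, end in differing_ranges(old, new):
        print(f"Bytes {start}-{end} differ: "
              f"{old[start:end+1].hex()} -> {new[start:end+1].hex()}")
    return 1

if __name__ == "__main__":
    sys.exit(main(sys.argv[1], sys.argv[2]))

If the report shows nothing but the expected time-stamp fields, the recompile 
produced an equivalent object; anything else, and the "just a recompile" 
assumption has already failed.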

I don't think anyone would count a PTF which actually affected a program as 
"just a recompile". It is not the process that matters, it is the expectation 
that the object is identical.

For thousands of different Mainframe sites there are tens (at least) of ways 
that things are done. Not all of them are good (which is subjective), as 
effective as they could be, or efficient. You can be sure there will be 
various rules, old wives' tales, rumour, cant and "it's the way we've always 
done it" which underpin these things.
