On 1/8/2013 11:12 AM, Farley, Peter x23353 wrote:
I have usually found that the SuperC utility (ASMFSUPC actually, the HLASM 
Toolkit enhanced version) does a very creditable job of finding differences in 
text or report files, and is quite useful in regression testing batch 
application changes.

However, I just discovered that the compare process can "lose its place" 
comparing text files with a significant number of changes.  I have an application change 
that inserts an additional 4 lines of text after every 6th line of the original text, with the 
differences and inserts scattered over all the sections of a report.  There are 115,400 
new records and 115,400 changed lines in total, arranged such that there are four changed 
lines immediately preceding each of the sets of four inserted lines.  The original file 
has 268,716 lines and the new file has 389,323 lines.  There are heading lines for each 
page as well, so I used the DPLINE option to tell SuperC to ignore the header lines and 
the entirely blank lines.
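For anyone wanting to reproduce the shape of the input, here is a small hypothetical sketch of the pattern (scaled way down, with made-up line text) using Python's difflib as a stand-in comparator. A sequence-based compare should report small interleaved replace/insert groups for this pattern rather than one huge delete/insert glob:

```python
import difflib

# Hypothetical, scaled-down reconstruction of the pattern from the post:
# after every 6th original line, 4 new lines are inserted, and the 4 lines
# immediately preceding each insert group are changed.
original = [f"detail line {i}" for i in range(1, 31)]  # 30 unique lines

new = []
for i, line in enumerate(original, start=1):
    if i % 6 in (2, 3, 4, 5):          # the 4 lines before each insert group
        new.append(line + " (changed)")
    else:
        new.append(line)
    if i % 6 == 0:                     # 4 inserted lines after every 6th line
        new.extend(f"inserted line {i}.{k}" for k in range(1, 5))

# With unique lines, SequenceMatcher keeps its alignment and reports
# alternating small 'replace' and 'insert' blocks, not one giant glob.
sm = difflib.SequenceMatcher(a=original, b=new)
opcodes = [tag for tag, *_ in sm.get_opcodes()]
print(opcodes)
```

This is only an illustration of the expected grouping, not a claim about SuperC's internal algorithm; the point is that the change pattern itself is regular enough that a comparator should be able to track it.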

However, after 13,467 lines of the original file and 20,154 lines of the new 
file, SuperC seems to stop recognizing changed/inserted lines in groups and 
starts reporting huge globs of deletes and inserts (6,439 deletes followed by 
10,778 inserts the first time it lost its place).

Has anyone else seen this behavior?  Is there anything I can do to help SuperC "keep 
its place" and report the actual changes instead of globs of inserts and deletes?

TIA for your help with this problem.

Peter

Peter,

I've seen this behavior myself, even on smaller files. It stops recognizing that lines are the same and just reports huge blocks of inserts and deletes, even though the lines are identical. SUPERC support was unimpressed; I got the WAD ("working as designed") response.

Regards,
Tom Conley

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN