On Thu, 28 Jan 2021, at 07:58, David Crayford wrote:
> I think your the one missing the point. I can't remember the last time I 
> had to write a macro as I can do the things I need just using commands.

I used Xedit (with macros I wrote in EXEC or EXEC 2) for a few years in the
1980s, then moved to an MVS site and ISPF edit and macros written in CLIST
and REXX.

At home I use Kedit, because for me it's more useful to be able to write 
macros for that in KEXX (essentially REXX) than it is to use, eg, SPFLite 
but have to write macros (with a familiar ISPF edit command set) in the 
version of BASIC that it supports.


A typical macro I have gets used when I download a telephone bill.
The data arrives here as a table of numbers called, call durations, dates,
etc.

The macro does a certain amount of syntax checking, so that it will fail 
sensibly if the phone company change the format of their data file.  It
then reformats it, grouping calls per destination, summing the costs for 
each destination, and also inserting textual descriptions of the numbers 
called (ie people's names, company names etc).

I do not think a macro recording and playback approach would work. 
There is a lot of logic in the Kexx/Rexx aspect of the macro, apart from 
the editor commands that get issued.


My longest macro is just over 12,500 lines long.  Its job was to read two 
files that contained a list of all the radio & tv programmes downloadable
from the BBC (a few years ago, in the early days of BBC iPlayer - the lists
having been created by a Perl program written by someone else). There
were typically about 3,500 radio programmes and 1,800 TV programmes
listed, and the lists changed every few hours.

The files contained programme series & episode names, & descriptions
of the contents of the programmes.  Data was not column-aligned.

The macro looked for programmes I might want to download, that I 
already knew about, so that I'd find out about new episodes of things
in current series, and new series in due course.  It also kept track of  
which episodes I'd already downloaded & didn't want to redownload.

But as well as that it looked for programmes I didn't know about, on 
topics that interest me, or presented by people whose programmes I
generally like.  It excluded programmes on topics I don't care about 
and featuring presenters I don't like.

Essentially it did many instances of:
  - find all lines containing some pattern 
  - exclude any of those that contained many other things
  - note the results of that overall group of commands
  - reset so the whole file was visible again

and at the end, listed what it had found through the whole process. It 
could also tell me (if I asked it to) why a particular programme had been
identified (ie which of the many searches had actually yielded it).

I also wrote a whole set of menu-driven editor commands which used
Kexx macros to manipulate the contents of this macro, because it had 
to be edited a lot.

Typical parts of the code (this is a simplified example) looked like

   call srch "\Doctor Finlay\"
   call hsepprog "Doctor Finlay: The Further Adventures of a Black Bag"
   call send

   call prog "Doctor Finlay: The Further Adventures of a Black Bag"
   call omit "\Series 1|\  &  \|1. The Catch|\"
   call omit "\Series 1|\  &  \|2. The Fever|\"
   call pend 

The lines between "call srch" and "call send" (ie the end of a search 
definition) stored parameters in stems for a search that would later 
look for "\Doctor Finlay\" anywhere in the file, but ignore any of those 
finds if they contained "Doctor Finlay: The Further Adventures of a Black 
Bag", because that ("call hsepprog") was the name of a programme that was
handled separately.  The macro did check that things that were stated
to be handled separately were actually handled separately.

The lines between a "call prog" and "call pend" also stored parms for 
a future search, but that search would only look at the parts of the 
data that listed programme names (so eg would ignore the free text
descriptions of episodes).  The "call omit" lines would make sure I'd
not get told when episodes I'd heard were retransmitted.

When the macro was run, all those function calls set up stems full of 
parms for searches and excludes.  The syntax of the arguments to the
functions was checked, as were the relationships between eg "call 
hsepprog" and whether there actually was a "call prog" that specified 
the same programme.  The macro also made sure that after a "call
srch" there were only calls of functions which made sense in a "srch"
body, followed by "call send".  Likewise there were restrictions on the
functions I allowed between "call prog/pend". 
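In outline, that kind of bookkeeping can be sketched in plain REXX like 
this (routine and stem names are illustrative only, not the actual 
macro's; the main code would initialise nsrch = 0 and inbody = 0 first):

```rexx
srch:                             /* open a new search definition      */
   if inbody then call die 'srch before previous send'
   nsrch = nsrch + 1
   pat.nsrch = arg(1)             /* pattern to find anywhere in file  */
   sep.nsrch.0 = 0                /* names handled separately          */
   inbody = 1
   return

hsepprog:                         /* finds containing this are ignored */
   if \inbody then call die 'hsepprog outside a srch/send body'
   k = sep.nsrch.0 + 1
   sep.nsrch.k = arg(1)
   sep.nsrch.0 = k
   return

send:                             /* close the definition              */
   if \inbody then call die 'send without a matching srch'
   inbody = 0
   return

die:
   say 'definition error:' arg(1)
   exit 8
```

A checking pass at the end of setup can then walk the stems and verify, 
eg, that every name given to hsepprog also appeared in some "call prog" 
definition.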

After that the process loop was run.  Some of the searches used 
regexes, so regex expressions had to be constructed from the
plain-text arguments in the search strings, escaping some
characters as required.
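The escaping step is the usual one - something along these lines (a 
generic REXX sketch, not the macro's actual routine):

```rexx
/* Escape regex metacharacters in a literal string so it can be       */
/* embedded safely in a larger regular expression.                    */
esc: procedure
   parse arg text
   meta = '\.[]()*+?^$|{}'
   out = ''
   do i = 1 to length(text)
      c = substr(text, i, 1)
      if pos(c, meta) > 0 then out = out || '\'
      out = out || c
   end
   return out
```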

The macro would have been a lot smaller if I'd placed only the 
search and results display logic in it, and had it also read other
external files full of search definitions.  But then, I'd have had to 
write a parser for validating that data, whereas to some extent 
the approach I took here allowed me to use all those function
calls to do some of the enforcement of the structure of the 
definitions.  It also meant that when I had to adjust the code, I
had the code and the search definition data it manipulated all 
in one file.                                                                    
                          

-- 
Jeremy Nicoll - my opinions are my own.

----------------------------------------------------------------------
For IBM-MAIN subscribe / signoff / archive access instructions,
send email to [email protected] with the message: INFO IBM-MAIN
