The 'standard' way is to build a makefile for the compiles and run
make.
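Once a makefile exists, the whole batch can be kicked off in the background with its output captured, and make's -j option caps how many compiles run at once. A rough sketch (the -j value and log-file name are just placeholders):

```shell
# Run the whole build in the background, surviving logout (nohup),
# with at most 2 compiles running at a time (-j 2).
# All output goes to build.log; adjust names for your site.
nohup make -j 2 > build.log 2>&1 &
```

The -j limit is per-invocation rather than system-wide, so it only throttles your own batch, not everyone's.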

Another suggestion, though I'm new at this, is to code a quick Rexx
(Regina) script. You should be able to do the compiles, check return
codes, and e-mail yourself the results. At least, this is the kind of
thing I do on VM when I want to run a batch of compiles. With Linux you
should be able to run it in the background (if the script is 'compile',
use the command 'compile &').
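A plain shell script can play the same role as the Rexx one: run each compile in turn, check the return code, and mail yourself a summary. A minimal sketch, assuming cc and a working mail command; the source list, compiler flags, and log name are placeholders:

```shell
#!/bin/sh
# compile: run a batch of compiles one at a time, log results,
# then mail a summary. Adjust SRCS, flags, and address for your site.
SRCS="prog1.c prog2.c prog3.c"
LOG=compile.log
: > "$LOG"                       # start with an empty log

for src in $SRCS; do
    if cc -c "$src" >> "$LOG" 2>&1; then
        echo "OK   $src" >> "$LOG"
    else
        echo "FAIL $src (rc=$?)" >> "$LOG"
    fi
done

# mail(1) is common but not universal; mailx is an alternative
mail -s "compile results" "$USER" < "$LOG"
```

Invoked as 'compile &', this runs the compiles sequentially in the background, so one batch never floods the machine the way fifty simultaneous nohup jobs would.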

"McKown, John" wrote:
>
> OK, so I have a corrupted mindset, coming from MVS <grin>. But suppose that
> I want to compile a LOT of programs. In MVS, I code up some JCL and submit
> it to run later. When it completes, I get a notify to my TSO id and look at
> the output in SDSF. I repeat this for however many compiles that I want to
> do. Perhaps doing the submissions over a period of time. How do I do that in
> Linux (or any UNIX)? In VM/CMS, I remember a CMSBATCH virtual machine which
> worked a bit like the MVS initiator. The best that I can think of to do in
> Linux is:
>
> nohup compiler and switches 1>stdout.unique.qualifier
> 2>stderr.unique.qualifier &
>
> This would run my compile "in the background" so to speak (or at least not
> tie up my terminal). I could do this any number of times. But this would
> have all the compiles running at the same time. So now I'm impacting
> performance for others (even if I "nice" the compiles). Now I have 50
> programmers all doing the same. My machine is a mess. Is there an equivalent
> to an initiator where people can "submit" work to be done (compiles, shell
> scripts, whatever) and the system will schedule it and the sysadmin can
> control it (i.e., only do 5 at a time and let the others wait)?
>
> Or am I worrying about nothing, since Linux developers don't do things this
> way anyway? I.e., queuing up 20 compiles while going to lunch, surfing the
> web, and generally schmoozing around? The same question applies to testing
> programs. Perhaps there just isn't any "batch" type processing?
>
> --
> John McKown
