You could code up a shell/perl script called "make" which takes the arguments
meant for the real make, sticks them in an array or something, and calls the
real make for each item in the array, five at a time or whatever, until it's
done.
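Something along those lines, as a very rough sketch (the script name, the path
to the real make, the output file names, and the limit of 5 are just
placeholders):

#!/bin/sh
# batchmake: call the real make once per argument, LIMIT jobs at a time
REALMAKE=/usr/bin/make       # wherever the real make lives
LIMIT=5
count=0

for target in "$@"
do
    $REALMAKE "$target" > "make.$target.out" 2>&1 &
    count=`expr $count + 1`
    if [ "$count" -ge "$LIMIT" ]; then
        wait                 # let this batch finish before starting more
        count=0
    fi
done
wait                         # pick up whatever is left in the last batch

Each compile's output lands in its own make.<target>.out file, so you can go
look at it afterwards.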
Hrm, maybe I just ported JES to Linux....
Uh Oh.
Jay Brenneman
"McKown, John"
<[EMAIL PROTECTED] To: [EMAIL PROTECTED]
tr.com> cc:
Sent by: Linux on Subject: "batch" compiles
390 Port
<[EMAIL PROTECTED]
IST.EDU>
03/20/2003 08:56
AM
Please respond to
Linux on 390 Port
OK, so I have a corrupted mindset, coming from MVS <grin>. But suppose that
I want to compile a LOT of programs. In MVS, I code up some JCL and submit
it to run later. When it completes, I get a notify to my TSO id and look at
the output in SDSF. I repeat this for however many compiles that I want to
do. Perhaps doing the submissions over a period of time. How do I do that
in
Linux (or any UNIX)? In VM/CMS, I remember a CMSBATCH virtual machine which
worked a bit like the MVS initiator. The best that I can think of to do in
Linux is:
nohup compiler and switches 1>stdout.unique.qualifier 2>stderr.unique.qualifier &
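For example (compiler name and file names made up, with "nice" added as
mentioned below):

nohup nice cc -o myprog myprog.c 1>stdout.myprog 2>stderr.myprog &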
This would run my compile "in the background" so to speak (or at least not
tie up my terminal). I could do this any number of times. But this would
have all the compiles running at the same time. So now I'm impacting
performance for others (even if I "nice" the compiles). Now I have 50
programmers all doing the same. My machine is a mess. Is there an
equivalent to an initiator where people can "submit" work to be done
(compiles, shell scripts, whatever) and the system will schedule it and the
sysadmin can control it (i.e. only do 5 at a time, let the others wait)?
Or am I worrying about nothing since Linux developers don't do things this
way anyway? I.e. queuing up 20 compiles while going to lunch and surfing
the web and generally schmoozing around? The same question applies to
testing programs. Perhaps there just isn't any "batch" type processing?
--
John McKown