Re: feature request: parallel builds feature

2013-05-02 Thread Jim Michaels
OK




>
> From: Paul Smith 
>To: Jim Michaels  
>Cc: "bug-make@gnu.org"  
>Sent: Thursday, May 2, 2013 4:41 AM
>Subject: Re: feature request: parallel builds feature
> 
>
>On Wed, 2013-05-01 at 20:38 -0700, Jim Michaels wrote:
>
>>  again, problem solved with what I proposed. think. separate shell
>> window for each job.
>
>You can do that today by just writing your recipes such that they start
>a screen session or xterm or whatever.  Those tools allocate and manage
>their own PTYs and so each has its own "stdin".
>
>You haven't provided any use case description, at a level _above_ the
>implementation.  Sure, the manual documents a restriction but not every
>restriction needs to be lifted.  We only do that work if there's a real
>need for it that can't be met more easily in a different way.
>
>So we need to understand at a higher level what problem you're trying to
>solve.  Then maybe there's a good way to do it with existing make
>capabilities, maybe the best way is using capabilities available outside
>of make in conjunction with make, and maybe the best way is to enhance
>make.  We'll have to see.
>
>
>
>


Re: feature request: parallel builds feature

2013-05-02 Thread Paul Smith
On Wed, 2013-05-01 at 20:38 -0700, Jim Michaels wrote:

>  again, problem solved with what I proposed. think. separate shell
> window for each job.

You can do that today by just writing your recipes such that they start
a screen session or xterm or whatever.  Those tools allocate and manage
their own PTYs and so each has its own "stdin".
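For illustration, a minimal sketch of that approach (not from the original
message; it assumes an X11 environment with xterm on PATH, and the target
names are invented):

    # sketch only: each recipe opens its own terminal, so the interactive
    # command gets its own PTY/stdin even under -j
    interactive: ask-name ask-color
    ask-name ask-color: ; xterm -T $@ -e sh -c 'printf "%s? " "$$0"; read v; echo "$$v" > "$$0".answer' $@

Running "make -j2 interactive" opens two windows, each with its own prompt.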

You haven't provided any use case description, at a level _above_ the
implementation.  Sure, the manual documents a restriction but not every
restriction needs to be lifted.  We only do that work if there's a real
need for it that can't be met more easily in a different way.

So we need to understand at a higher level what problem you're trying to
solve.  Then maybe there's a good way to do it with existing make
capabilities, maybe the best way is using capabilities available outside
of make in conjunction with make, and maybe the best way is to enhance
make.  We'll have to see.





Re: feature request: parallel builds feature

2013-05-01 Thread Jim Michaels


 again, problem solved with what I proposed. think. separate shell window for 
each job.




>
> From: Paul Smith 
>To: Jim Michaels  
>Cc: "bug-make@gnu.org"  
>Sent: Tuesday, April 30, 2013 11:23 PM
>Subject: Re: feature request: parallel builds feature
> 
>
>On Tue, 2013-04-30 at 17:20 -0700, Jim Michaels wrote:
>> I wasn't digressing.  I was explaining the point.  the concept I am
>> trying to present as a solution to the problem of making parallel
>> stdin for --jobs in gnu make (which currently doesn't work and is I
>> guess single-threaded) is to make a separate terminal or command shell
>> for each job, such as via a generated batch file or shell script.
>> 
>> this is as simple as I can make it.
>
>You need to give a concrete example of the problem you're trying to
>solve.  When the manual discusses stdin it means that only one job at a
>time can read from the make program's stdin.
>
>Multithreading won't help because there is only one input to read from,
>regardless of how many threads do the reading.
>
>What this limitation is discussing is if there were a makefile like
>this:
>
>    read: read1 read2
>    read1 read2: ; @echo $@: enter a word: ; read word ; echo $@: $$word
>
>Both of these two targets read from stdin.  If you run them serially, it
>works:
>
>    $ make
>   read1: enter a word:
>   fooblatz
>   read1: fooblatz
>   read2: enter a word:
>   barflatz
>   read2: barflatz
>
>If you run in parallel then both the read1 and read2 targets run at the
>same time and both want input from stdin, at the same time.  There's no
>way this can work: when you typed a word how could you know or specify
>which read operation got it?  So make arbitrarily chooses one of the
>jobs to get the input and the others have their stdin closed.
>
>But of course, this doesn't impact in any way rules like this:
>
>    read: read1 read2
>
>    read1 read2: ; @word=`cat $@.input`; echo $@: $$word
>
>Now if you have files like read1.input and read2.input, those will be
>read inside these rules and behave properly.
>
>> I have learned that on a machine with 12 threads and 64GB of memory,
>> you can have 50+ jobs running.
>
>This depends very much on what those jobs are doing.  Obviously you CAN
>run as many jobs as you want.  However I've never heard of being able to
>get more than #CPUs plus a few jobs running at the same time without
>making the build _slower_.  At some point the kernel will simply thrash
>trying to keep all those jobs running at the same time, if they
>seriously outnumber the cores available to run them on.
>
>
>


Re: feature request: parallel builds feature

2013-04-30 Thread Paul Smith
On Tue, 2013-04-30 at 17:20 -0700, Jim Michaels wrote:
> I wasn't digressing.  I was explaining the point.  the concept I am
> trying to present as a solution to the problem of making parallel
> stdin for --jobs in gnu make (which currently doesn't work and is I
> guess single-threaded) is to make a separate terminal or command shell
> for each job, such as via a generated batch file or shell script.
> 
> this is as simple as I can make it.

You need to give a concrete example of the problem you're trying to
solve.  When the manual discusses stdin it means that only one job at a
time can read from the make program's stdin.

Multithreading won't help because there is only one input to read from,
regardless of how many threads do the reading.

What this limitation is discussing is if there were a makefile like
this:

    read: read1 read2
    read1 read2: ; @echo $@: enter a word: ; read word ; echo $@: $$word

Both of these two targets read from stdin.  If you run them serially, it
works:

    $ make
    read1: enter a word:
    fooblatz
    read1: fooblatz
    read2: enter a word:
    barflatz
    read2: barflatz

If you run in parallel then both the read1 and read2 targets run at the
same time and both want input from stdin, at the same time.  There's no
way this can work: when you typed a word how could you know or specify
which read operation got it?  So make arbitrarily chooses one of the
jobs to get the input and the others have their stdin closed.

But of course, this doesn't impact in any way rules like this:

    read: read1 read2

    read1 read2: ; @word=`cat $@.input`; echo $@: $$word

Now if you have files like read1.input and read2.input, those will be
read inside these rules and behave properly.

> I have learned that on a machine with 12 threads and 64GB of memory,
> you can have 50+ jobs running.

This depends very much on what those jobs are doing.  Obviously you CAN
run as many jobs as you want.  However I've never heard of being able to
get more than #CPUs plus a few jobs running at the same time without
making the build _slower_.  At some point the kernel will simply thrash
trying to keep all those jobs running at the same time, if they
seriously outnumber the cores available to run them on.
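A common rule of thumb, shown here as an illustrative sketch rather than
something stated in this thread (it assumes GNU make and a system providing
the nproc utility), is to cap the job count at roughly the CPU count, with an
optional load-average limit:

    # sketch: limit parallel jobs to the CPU count, and stop starting new
    # jobs when the load average climbs past that number
    make -j"$(nproc)" -l"$(nproc)"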




Re: feature request: parallel builds feature

2013-04-30 Thread Jim Michaels


You only have to read the documentation to know that it doesn't support
parallel stdin. You would know that if you had followed the thread. Here is
the documentation; read around the middle:
http://www.gnu.org/software/make/manual/html_node/Parallel.html




>
> From: Howard Chu 
>To: Jim Michaels ; "psm...@gnu.org"  
>Cc: "bug-make@gnu.org"  
>Sent: Tuesday, April 30, 2013 6:55 PM
>Subject: Re: feature request: parallel builds feature
> 
>
>Jim Michaels wrote:
>>
>> I wasn't digressing.  I was explaining the point.  the concept I am trying to
>> present as a solution to the problem of making parallel stdin for --jobs in
>> gnu make (which currently doesn't work and is I guess single-threaded) is to
>> make a separate terminal or command shell for each job, such as via a
>> generated batch file or shell script.
>>
>> this is as simple as I can make it.
>
>Who said stdin was a problem? Fundamentally the jobs spawned by make are batch 
>jobs - they should not be requesting input from stdin in the first place.
>
>Semantically the effect of running parallel make must be identical to running 
>serial make. You cannot guarantee this to be true if jobs are reading from 
>stdin because stdin's supply of data is inherently serial but the order it 
>gets read is non-deterministic in a parallel build.
>
>If the jobs you're spawning from make require input from stdin you need to 
>rewrite those jobs.
>
>> at the end of the shell script, you can put in whatever you like, such as
>> synchronization stuff saying "I am done" by creating a semaphore file, a
>> flag file.
>>
>> the problem would then be porting BASH to windows and other platforms in
>> order to handle --jobs.
>
>bash has already been ported to Windows.
>>
>> I have learned that on a machine with 12 threads and 64GB of memory, you can
>> have 50+ jobs running.
>>
>>
>>     
>>--
>>     *From:* Paul Smith 
>>     *To:* Jim Michaels 
>>     *Cc:* bug-make@gnu.org
>>     *Sent:* Monday, April 22, 2013 10:56 AM
>>     *Subject:* Re: feature request: parallel builds feature
>>
>>     On Mon, 2013-04-22 at 00:42 -0700, Jim Michaels wrote:
>>      > it currently has a problem with stdin, because at this point there is
>>      > only one of those, only 1 of them gets it, and the others starve. so
>>      > if your build needs stdin or creates files from the commandline using
>>      > heredocs, you can't use it (check first!). you will get an error. gnu
>>      > has not yet figured out a solution yet (I have, multiple shells IF you
>>      > can control them... probably can't without some work creating batch
>>      > files for the jobs). so there is a solution. even with a batch file,
>>      > make would need some sort of way of reporting back error conditions. I
>>      > think there are ways of managing that with files via presence-detect,
>>      > sort of like semaphores. they should get cleared when the job ends, or
>>      > when a switch is given to clear the state for that session if the
>>      > session was broken with ctrl-c. well, I suppose a ctrl-c handler
>>      > should still kill those terminals or cmd shells and clear up those
>>      > files.
>>      > what do you think?
>>      > if a terminal is opened, it should be created without a window. some
>>      > OS's have that option. some don't, like freeDOS, which would not have
>>      > the ability to do terminals, parallel shell windows, or even the
>>      > --jobs feature (but that's implementation-dependent).
>>
>>     Please keep the mailing list CC'd.  Thanks.
>>
>>     I'm afraid I still don't understand what you're asking for here.  You'll
>>     need to back up and provide a description of your needs in a clear,
>>     orderly way without digressions.
>>
>>     Yes, it's true that GNU make only provides its stdin to one job at a
>>     time and which job gets it is essentially random.  In order to address
>>     this we'd need to see a specific use-case or requirement, but my
>>     suspicion is that all such possible use-cases are better solved by a
>>     change of process at a level above what make can provide.
>>
>>
>>
>>
>>
>>
>>
>
>
>-- 
>   -- Howard Chu
>   CTO, Symas Corp.          http://www.symas.com
>   Director, Highland Sun    http://highlandsun.com/hyc/
>   Chief Architect, OpenLDAP  http://www.openldap.org/project/
>
>


Re: feature request: parallel builds feature

2013-04-30 Thread Howard Chu

Jim Michaels wrote:

What if you, in your makefile, are creating files from scratch using echo,
based on system configuration information?
I know I have to do that in order to create XML manifest files for resources
to compile and link in via the resource compiler for Windows builds.


echo writes to stdout. That has nothing to do with stdin. Looks to me like you 
have no F'ing idea what you're talking about.
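To make that concrete (an illustrative sketch, not part of the original
message; the file name and contents are invented): a rule whose recipe only
redirects echo output into a file never reads stdin, so it is unaffected by
the parallel-stdin restriction.

    # sketch: the generated file comes from stdout redirection, not stdin,
    # so a rule like this runs fine under make -jN
    config.h: ; echo '#define CONFIGURED 1' > $@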


--
  -- Howard Chu
  CTO, Symas Corp.   http://www.symas.com
  Director, Highland Sun http://highlandsun.com/hyc/
  Chief Architect, OpenLDAP  http://www.openldap.org/project/



Re: feature request: parallel builds feature

2013-04-30 Thread Jim Michaels
What if you, in your makefile, are creating files from scratch using echo,
based on system configuration information? I know I have to do that in order
to create XML manifest files for resources to compile and link in via the
resource compiler for Windows builds.

I can give you a sample of some of the lines I have to convert from .cmd batch
files, as they are now, to makefiles, which is what the build system will use
once I complete the migration.


    echo -creating 32-bit 9x+ manifest...  don't change this particular block or nothing will work.
    echo ^>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest

rem it works with description on XP, but I don't know about other platforms, so I am leaving it out.
rem echo ^%manifest_apptitle%^>>32\%outfile%.%extension%.manifest

    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
rem it works on XP, but I don't know about other platforms, so I am leaving it out.
rem echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest
    echo ^>>32\%outfile%.%extension%.manifest

    echo -creating 64-bit 9x+ manifest...  don't change this particular block or nothing will work.
    echo ^>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest

rem it works with description on XP, but I don't know about other platforms, so I am leaving it out.
rem echo ^%manifest_apptitle%^>>64\%outfile%.%extension%.manifest

    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
rem it works on XP, but I don't know about other platforms, so I am leaving it out.
rem echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest
    echo ^>>64\%outfile%.%extension%.manifest



There is more; there are if statements involved, etc.
Currently, there is no manifest tool in the compiler set for mingw-w64 or
mingw, and none is planned.
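For what it's worth, a rough sketch of what one of those blocks might look
like after migration to make (illustrative only: OUTFILE and EXTENSION stand
in for the batch %outfile% and %extension% variables, the echoed XML lines are
placeholders rather than the real manifest content, and an sh-style shell is
assumed rather than cmd.exe):

    OUTFILE   = myapp
    EXTENSION = exe
    # sketch: build the 32-bit manifest by redirecting echo output, much as
    # the batch file does; assumes the 32/ directory already exists
    32/$(OUTFILE).$(EXTENSION).manifest: ; { echo '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>'; echo '<!-- remaining manifest elements go here -->'; } > $@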





>____________
> From: Howard Chu 
>To: Jim Michaels ; "psm...@gnu.org"  
>Cc: "bug-make@gnu.org"  
>Sent: Tuesday, April 30, 2013 6:55 PM
>Subject: Re: feature request: parallel builds feature
> 
>
>Jim Michaels wrote:
>>
>> I wasn't digressing.  I was explaining the point.  the concept I am trying to
>> present as a solution to the problem of making parallel stdin for --jobs in
>> gnu make (which currently doesn't work and is I guess single-threaded) is to
>> make a separate terminal or command shell for each job, such as via a
>> generated batch file or shell script.
>>
>> this is as simple as I can make it.
>
>Who said stdin was a problem? Fundamentally the jobs spawned by make are batch 
>jobs - they should not be requesting input from stdin in the first place.
>
>Semantically the effect of running parallel make must be identical to running 
>serial make. You cannot guarantee this to be true if jobs are reading from 
>stdin because stdin's supply of data is inherently serial but the order it 
>gets read is non-deterministic in a parallel build.
>
>If the jobs you're spawning from make require input from stdin you need to 
>rewrite those jobs.
>
>> at the end of the shell script, you can put in whatever you like, such as
>> synchronization stuff saying "I am done" by creating a semaphore file, a
>> flag file.
>>
>> the problem would then be porting BASH to windows and other platforms in
>> order to handle --jobs.
>
>bash has already been ported to Windows.
>>
>> I have learned that on a machine with 12 threads and 64GB of memory, you can
>> have 50+ jobs running.
>>
>>
>>     
>>--
>>     *From:* Paul Smith 
>>     *To:* Jim Michaels 
>>     *Cc:* bug-make@gnu.org
>>     *Sent:* Monday, April 22, 2013 10:56 AM
>>     *Subject:* Re: fe

Re: feature request: parallel builds feature

2013-04-30 Thread Howard Chu

Jim Michaels wrote:


I wasn't digressing.  I was explaining the point.  the concept I am trying to
present as a solution to the problem of making parallel stdin for --jobs in
gnu make (which currently doesn't work and is I guess single-threaded) is to
make a separate terminal or command shell for each job, such as via a
generated batch file or shell script.

this is as simple as I can make it.


Who said stdin was a problem? Fundamentally the jobs spawned by make are batch 
jobs - they should not be requesting input from stdin in the first place.


Semantically the effect of running parallel make must be identical to running 
serial make. You cannot guarantee this to be true if jobs are reading from 
stdin because stdin's supply of data is inherently serial but the order it 
gets read is non-deterministic in a parallel build.


If the jobs you're spawning from make require input from stdin you need to 
rewrite those jobs.



at the end of the shell script, you can put in whatever you like, such as
synchronization stuff saying "I am done" by creating a semaphore file, a flag
file.

the problem would then be porting BASH to windows and other platforms in order
to handle --jobs.


bash has already been ported to Windows.


I have learned that on a machine with 12 threads and 64GB of memory, you can
have 50+ jobs running.



--
*From:* Paul Smith 
*To:* Jim Michaels 
*Cc:* bug-make@gnu.org
*Sent:* Monday, April 22, 2013 10:56 AM
*Subject:* Re: feature request: parallel builds feature

On Mon, 2013-04-22 at 00:42 -0700, Jim Michaels wrote:
 > it currently has a problem with stdin, because at this point there is
 > only one of those, only 1 of them gets it, and the others starve. so
 > if your build needs stdin or creates files from the commandline using
 > heredocs, you can't use it (check first!). you will get an error. gnu
 > has not yet figured out a solution yet (I have, multiple shells IF you
 > can control them... probably can't without some work creating batch
 > files for the jobs). so there is a solution. even with a batch file,
 > make would need some sort of way of reporting back error conditions. I
 > think there are ways of managing that with files via presence-detect,
 > sort of like semaphores. they should get cleared when the job ends, or
 > when a switch is given to clear the state for that session if the
 > session was broken with ctrl-c. well, I suppose a ctrl-c handler
 > should still kill those terminals or cmd shells and clear up those
 > files.
 > what do you think?
 > if a terminal is opened, it should be created without a window. some
 > OS's have that option. some don't, like freeDOS, which would not have
 > the ability to do terminals, parallel shell windows, or even the
 > --jobs feature (but that's implementation-dependent).

Please keep the mailing list CC'd.  Thanks.

I'm afraid I still don't understand what you're asking for here.  You'll
need to back up and provide a description of your needs in a clear,
orderly way without digressions.

Yes, it's true that GNU make only provides its stdin to one job at a
time and which job gets it is essentially random.  In order to address
this we'd need to see a specific use-case or requirement, but my
suspicion is that all such possible use-cases are better solved by a
change of process at a level above what make can provide.










--
  -- Howard Chu
  CTO, Symas Corp.   http://www.symas.com
  Director, Highland Sun http://highlandsun.com/hyc/
  Chief Architect, OpenLDAP  http://www.openldap.org/project/



Re: feature request: parallel builds feature

2013-04-30 Thread Jim Michaels

I wasn't digressing.  I was explaining the point.  the concept I am trying to 
present as a solution to the problem of making parallel stdin for --jobs in gnu 
make (which currently doesn't work and is I guess single-threaded) is to make a 
separate terminal or command shell for each job, such as via a generated batch 
file or shell script.

this is as simple as I can make it.

at the end of the shell script, you can put in whatever you like, such as 
synchronization stuff saying "I am done" by creating a semaphore file, a flag 
file.
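A sketch of that flag-file idea in plain make terms (illustrative only; the
target names and build scripts are invented): each job touches a stamp file
when it succeeds, and a clean-up rule removes leftover stamps after an
interrupted run.

    # sketch: each job leaves an "I am done" stamp file when it succeeds
    all: build1.done build2.done
    build1.done: ; ./build1.sh && touch $@
    build2.done: ; ./build2.sh && touch $@
    # clear leftover stamps by hand after a run broken with ctrl-c
    clean-stamps: ; rm -f build1.done build2.done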

the problem would then be porting BASH to windows and other platforms in order 
to handle --jobs.

I have learned that on a machine with 12 threads and 64GB of memory, you can 
have 50+ jobs running.





>
> From: Paul Smith 
>To: Jim Michaels  
>Cc: bug-make@gnu.org 
>Sent: Monday, April 22, 2013 10:56 AM
>Subject: Re: feature request: parallel builds feature
> 
>
>On Mon, 2013-04-22 at 00:42 -0700, Jim Michaels wrote:
>> it currently has a problem with stdin, because at this point there is
>> only one of those, only 1 of them gets it, and the others starve. so
>> if your build needs stdin or creates files from the commandline using
>> heredocs, you can't use it (check first!). you will get an error. gnu
>> has not yet figured out a solution yet (I have, multiple shells IF you
>> can control them... probably can't without some work creating batch
>> files for the jobs). so there is a solution. even with a batch file,
>> make would need some sort of way of reporting back error conditions. I
>> think there are ways of managing that with files via presence-detect,
>> sort of like semaphores. they should get cleared when the job ends, or
>> when a switch is given to clear the state for that session if the
>> session was broken with ctrl-c. well, I suppose a ctrl-c handler
>> should still kill those terminals or cmd shells and clear up those
>> files.
>> what do you think?
>> if a terminal is opened, it should be created without a window. some
>> OS's have that option. some don't, like freeDOS, which would not have
>> the ability to do terminals, parallel shell windows, or even the
>> --jobs feature (but that's implementation-dependent).
>
>Please keep the mailing list CC'd.  Thanks.
>
>I'm afraid I still don't understand what you're asking for here.  You'll
>need to back up and provide a description of your needs in a clear,
>orderly way without digressions.
>
>Yes, it's true that GNU make only provides its stdin to one job at a
>time and which job gets it is essentially random.  In order to address
>this we'd need to see a specific use-case or requirement, but my
>suspicion is that all such possible use-cases are better solved by a
>change of process at a level above what make can provide.
>
>
>
>


Re: feature request: parallel builds feature

2013-04-22 Thread Paul Smith
On Mon, 2013-04-22 at 00:42 -0700, Jim Michaels wrote:
> it currently has a problem with stdin, because at this point there is
> only one of those, only 1 of them gets it, and the others starve. so
> if your build needs stdin or creates files from the commandline using
> heredocs, you can't use it (check first!). you will get an error. gnu
> has not yet figured out a solution yet (I have, multiple shells IF you
> can control them... probably can't without some work creating batch
> files for the jobs). so there is a solution. even with a batch file,
> make would need some sort of way of reporting back error conditions. I
> think there are ways of managing that with files via presence-detect,
> sort of like semaphores. they should get cleared when the job ends, or
> when a switch is given to clear the state for that session if the
> session was broken with ctrl-c. well, I suppose a ctrl-c handler
> should still kill those terminals or cmd shells and clear up those
> files.
> what do you think?
> if a terminal is opened, it should be created without a window. some
> OS's have that option. some don't, like freeDOS, which would not have
> the ability to do terminals, parallel shell windows, or even the
> --jobs feature (but that's implementation-dependent).

Please keep the mailing list CC'd.  Thanks.

I'm afraid I still don't understand what you're asking for here.  You'll
need to back up and provide a description of your needs in a clear,
orderly way without digressions.

Yes, it's true that GNU make only provides its stdin to one job at a
time and which job gets it is essentially random.  In order to address
this we'd need to see a specific use-case or requirement, but my
suspicion is that all such possible use-cases are better solved by a
change of process at a level above what make can provide.





Re: feature request: parallel builds feature

2013-04-16 Thread Paul Smith
On Tue, 2013-04-16 at 01:34 -0700, Jim Michaels wrote:
> I have been toying with this idea of parallel builds to gain project
> compile speed (reducing time to a fraction) for quite a while.

Can you explain the difference between what you're suggesting and the
existing --jobs (-j) feature available in GNU make?





feature request: parallel builds feature

2013-04-16 Thread Jim Michaels
Feature request: parallelize make builds.
Current problem: make is serial in nature. There is room for making it
series-parallel.


I have been toying with this idea of parallel builds to gain project compile 
speed (reducing time to a fraction) for quite a while.

Compiles seem to spend more time on CPU and memory than they do on disk for
large monolithic files, but for the typical small files on most projects I
should think they would be more disk bound.

I have 2 machines I could test with (one 32-bit with 2 threads, 38GB VM and
3GB RAM; one 64-bit with 12 threads, 64GB memory and 64GB VM), both with about
50+ projects whose builds I could parallelize. But it would take some manual
typing work to do this kind of testing if you want it done (I may do it
anyway; I want to parallelize my builds somehow to save time and make use of
this nice fast processor). Right now I am transferring the files to my new
machine, so it could be a month before I get anywhere, or less. The problem
with my current build system is that it is batch files and I don't use make. I
would have to convert the build systems for all of my 50+ projects, and also
redo my build system somehow by making some sort of mingw make template, and I
am no make expert. I am just putting this idea out there for someone to grab
onto and implement. Should I just feed it to the GNU folks via a bug report?
On to the idea...

If you want to compile anything big without losing hair, you should start
compiling individual items in parallel where possible. In fact, within that,
the compilers themselves should if possible be multithreaded, where you can
either set a limit on the number of threads (as long as it doesn't go over
what the system has available) or auto-detect the number of threads by
default. Although compilation does seem to be very much a serial thing, from
what little I remember of my compiler class from 20 years ago...
One large-scale example: CPU-based HPC machines with EMC disk arrays would
benefit from this kind of feature. It would make provision for parallel builds
for a given project, which brings me to my next point.

I am starting to parallelize my compiles BECAUSE IT MAKES THINGS FASTER, up to
a limit which is probably some combination of disk speed and CPU threads.
This will afford more speed since most processors are multithreaded/multicore
and Windows treats each hardware thread like a CPU. Of course, this is not
limited to Windows; you can bring this to the Mac and to Linux also, on any
platform that has a C++ compiler that compiles lots of individual files.
This will make things disk bound very quickly, however, since usually these
files reside on a single disk. This is where RAID comes in very handy and
provides the extra jump in speed, and where EMC or even small-scale RAID boxes
for personal use, or 19" RAID racks for work use, can come in. You can make
build servers work even better this way by using the processors more
efficiently. This is also where cloud build services can do things like sell
you a certain number of threads for faster build time.


An idea I have is that you could have a fixed pool of worker threads assigned
as compile job engines, each with its own spooler, and you would need to sync
up the jobs internally when a COMPILESERIAL command finishes: for a
COMPILESERIAL job list, such as a list of .o/.obj files you want made from .c
files, the COMPILESERIAL command finishes only once they are all done.
Instead of the usual compile rule (.c.o: $(CC) etc., I think it was), you
would write COMPILESERIAL 12 THREADS for a 12-threaded CPU, or COMPILESERIAL
AUTO THREADS, and then your list of .cpp files, what file extension you want
them turned into (such as .obj), and what command you want to use to do it.
Maybe this would have to take up several lines of make.
Or something like that.

think of it like a glorified make.
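For comparison, a sketch of how existing GNU make syntax already expresses
roughly this (the file names are invented; the pattern rule and the -j option
are standard GNU make):

    # sketch: one pattern rule describes every .cpp -> .o compile; running
    # "make -j12 app" lets make schedule up to 12 of those compiles at once
    OBJS = main.o util.o gui.o
    app: $(OBJS) ; $(CXX) -o $@ $(OBJS)
    %.o: %.cpp ; $(CXX) -c -o $@ $<

That is essentially what the existing -j/--jobs option provides, which is
presumably why Paul asks elsewhere in the thread how this proposal differs
from -j.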

At some point I would hope we might even see a parallel build system in place,
once software developers begin to think in terms of parallel builds instead of
serial builds; it would cut build time to a fraction, but you have to be
careful HOW you do it.
Some things just have to be done serially. Those would just be regular make
commands.


I am not sure if I am providing this idea to the right person or not. maybe I 
should be going to Intel or to GNU or to Microsoft or Apple or all of the 
above. but I didn't really want one vendor hogging all of the benefits. so I 
thought I would bring the idea to you. should you want to bring the 
specification of a parallel build system into the language, I would appreciate 
this (because I could certainly use it!).

And for us software developers, it would reduce our compile times.


Note that on Windows machines, there is WaitForMultipleObjects() in the Win32
API to handle the issue of waiting for [process, window, thread, whatever]
HANDLEs without using while+for loops and some sort of conditional.

per