I am wondering whether it would be a good idea to let GCC generate AI data about
the best optimization approach for the user's computer.

Yes. The GCC team would provide source code for some example programs, and GCC
would compile these sources with various optimization settings and record the
average run time of each variant. Of course, it would only run these
compilations when the computer has plenty of free resources.

Why? To let the user select --best-optimization-speed-for-my-machine, or
another optimization criterion, based on the collected AI data.

What do you think?

Another idea is to allow adding feature points to a program. The program would
report which features the user actually uses. In the next step, the linker
could optimize based on this: compile into the executable only the
functionality the user uses, put the remaining functions into shared objects,
and inject special code into the executable to load those shared objects on
demand.

I am not very good at programming, but I have had crazy ideas about a lot of
things.

Best regards,
Lach Sławomir.

