And another strength of Nim is its strictness and performance.

I never cared about performance; it was the least important part of 
programming, almost never needed in practice. Think about it: you spin up a 
0.5 GB virtual machine just to read this text. Back in 1997, a full PC had 
8 MB of memory, and a whole 3D game demo fit into a couple hundred kilobytes. 
This is like burning a full aircraft load of fuel just to heat a cup of 
coffee, and nobody cared in the slightest, because computational power cost 
pennies and was available in abundance.

So, until this day, there has been very, very little need for high computing 
power (although systems programmers would like to think otherwise).

But today, thanks to AI, it's different. In the next 5, maximum 10 years, 
there will be no more conventional programming; it will be replaced by AI and 
some form of declarative plain-English programming, where with one paragraph 
you can say "write a mobile Dota in sci-fi, fantasy, and steampunk settings, 
with a WoW-like art style, and publish it to Google Play and the App Store", 
and in 5 seconds it will all be done.

And AI is very hungry for computing power; it's pretty much its only limiting 
factor. The more computing power you can utilise, the smarter the AI and the 
better the results. So there's a huge incentive for very efficient 
computation.

And Nim really can shine here. But it shouldn't be "5% more efficient GC" or 
similar; nobody cares about marginal 10% performance wins. It should be 
game-changing wins, huge wins, like running computations 1000x in parallel on 
GPUs and doing it 50% more efficiently.
