Nim is in the line of classic programming languages: FORTRAN, Pascal, C. Nim 
represents a minimal (if not THE minimal) closure over C's memory model.

  * FORTRAN did not use a stack for variables at all, because all variables were 
placed in static (global) memory. Recursion did not work, of course, but the 
memory requirements could be precisely precalculated. The stack was only needed 
for return addresses. Therefore, "stackless FORTRAN" is almost a pleonasm. The 
memory footprint of FORTRAN was minimal, and this was a requirement: the 
original IBM 360 mainframe (a "huge" machine at an almost unimaginable price) 
provided 32 to 64 KBytes of RAM. Why so little? RAM was costly: roughly $400 
per KByte in the late '60s. Still, a typical FORTRAN program would work well 
with a stack capacity of 64 bytes and a total RAM footprint (program + data) of 
less than 16 KB.
  * C placed variables on the stack, so more "dynamic" RAM was needed, but 
otherwise it followed FORTRAN's terse memory footprint well. This paved the 
way for C's success. Moreover, C reflected the addressing modes of the PDP-11 
(the workhorse of scientific labs in the '70s) nicely; hence the autoincrement 
operator and friends. However, C did not provide automatic heap management.
  * Now Nim kicks in and offers the `seq` exactly for that. With `seq`, 
stack-bounded lifetime can safely be expanded into the heap. No further 
machinery is needed, no tracing GC at all. Furthermore, parameters passed by 
reference are not considered mutable unless explicitly declared with `var`. 
Therefore, Nim offers safety and convenience at the same time but remains in 
line with C.
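A minimal sketch of both points (the proc names are mine): a `seq` lives on the heap while its lifetime stays bound to the owning scope, and mutation through a parameter requires an explicit `var`.

```nim
# `makeSquares` allocates on the heap, but the seq's lifetime is bound
# to its owning scope -- no manual free, no tracing GC (e.g. --mm:arc).
proc makeSquares(n: int): seq[int] =
  result = newSeq[int](n)        # heap storage, scope-bound lifetime
  for i in 0 ..< n:
    result[i] = i * i

proc total(xs: openArray[int]): int =
  # passed by reference under the hood, yet immutable here: without
  # `var`, the callee cannot modify the caller's data
  for x in xs:
    result += x

proc double(xs: var seq[int]) =
  # mutation requires an explicit `var` parameter
  for x in xs.mitems:
    x *= 2

var s = makeSquares(4)           # @[0, 1, 4, 9]
echo total(s)                    # 14
double(s)
echo s                           # @[0, 2, 8, 18]
```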
  * So, the microcontroller people should immediately jump on the bandwagon. 
They are the true '60s programmers of our times, since there is so little 
RAM: 4 K, 8 K ... (though the processor is way faster than an IBM 360 
mainframe). Whether and why they don't jump, I'll come to later.
  * PLs in computer science (TensorFlow reloaded?): evaluation of types, type 
safety, compiler construction, etc. Show me a PL that allows you to write 
compilers more easily than Nim. Regarding both convenience and efficiency 
(speed), Nim is at the top. To promote this further, it would be helpful if a 
Nim macro could work together with a redefined lexer/parser. Until now, macros 
can only deal with Nim-parsed code, so the Nim parser is in the way.
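To illustrate the current situation, a small sketch: a macro receives its argument already parsed by the Nim parser, as a `NimNode` tree; it cannot plug in its own lexer or grammar.

```nim
import std/macros

# The macro sees the parser's AST (an Infix tree here), not raw tokens.
macro showAst(e: untyped): untyped =
  echo e.treeRepr          # printed at compile time
  result = e               # pass the expression through unchanged

let x = showAst(1 + 2 * 3)
echo x                     # 7
```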
  * Resource management "safety", stackless Nim and stateful Nim. Once 
multithreading is considered, all of these things come into play. Within 
real-time applications, coroutines are almost unavoidable. Now, state-bound 
transitions need to be considered, along with potential pitfalls such as data 
races. Example: a digital altimeter in an airplane. If you move a switch on 
its front panel, you want a state update immediately and accordingly. The 
state update must not depend on the previous state of the altimeter. And there 
is no time for error messages ("Please reboot..."). Almost ironically, we have 
to reconsider the old FORTRAN style with a defined memory layout for that.
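A hypothetical altimeter sketch (all types and names are mine): the new state is a total function of the switch position alone, never of the previous state, and the whole state has a fixed, FORTRAN-style memory layout with no heap and no error path.

```nim
type
  SwitchPos = enum spAltitude, spVerticalSpeed, spPressure
  Altimeter = object
    mode: SwitchPos        # the entire state, statically sized

proc onSwitch(a: var Altimeter, pos: SwitchPos) =
  a.mode = pos             # immediate, deterministic transition

var alt = Altimeter(mode: spAltitude)
onSwitch(alt, spPressure)
echo alt.mode              # spPressure, regardless of the prior mode
```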
  * Dynamic (duck-typed) languages. They seem to be the worst invention ever, 
but Objective-C, being a thin extension of C, made its way through the decades. 
(C++ tried to be smarter and failed...) Updating methods at runtime is 
certainly not in the focus of Nim. However, now in the 2020s, compilers are 
pretty good at extracting static (precompiled) functionality from such code. 
JavaScript/Node.js is the proof of concept: JavaScript was born as a pure 
scripting language and made an astonishing career, very similar to Objective-C 
(now Swift).
  * So, the main selling point for Nim is Core-Nim. No fancy concepts, no type 
analysis of generic functions at compile time (they are checked at 
instantiation instead, and this works surprisingly well!). Core-Nim could be 
promoted more heavily though; again, I will come to this later.
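A sketch of that instantiation-time checking (the proc name is mine): the body of a generic proc is type-checked only when it is instantiated with a concrete `T`, not when it is defined.

```nim
proc biggest[T](a, b: T): T =
  if a < b: b else: a      # `<` must exist for the concrete T

echo biggest(3, 7)         # 7 (T = int)
echo biggest("ab", "cd")   # "cd" (T = string, which also has `<`)
# For a T without `<`, the error appears only at the call site,
# not at the definition of `biggest`.
```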
  * Memory safety is a concern. Nim's ARC is excellent, but Rust is a mighty 
competitor. Rust has an important application (Firefox) and the Mozilla 
foundation behind it. But the selling point for Rust is not a "killer" 
application. The question is: will Rust keep its promises? How will Rust 
evolve as stateful (continuation-based) Rust? Are its "lifetimes" 
type-theoretically complete? What can be done differently? Nim already has an 
(almost invisible) borrow checker; should Nim expand it and (hopefully not) 
introduce lifetime parameters? What else needs to be done?
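A sketch of that almost invisible borrow checking (the type and proc are mine): a `lent T` return hands out a borrowed view into the argument, with no user-visible lifetime annotations; the compiler rejects uses that would outlive the owner.

```nim
type Box = object
  data: seq[int]

proc first(b: Box): lent int =
  b.data[0]                # borrowed, not copied

let b = Box(data: @[10, 20, 30])
echo first(b)              # 10, valid while `b` is alive
```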
  * This last point is a question of ongoing research. Nim should be found 
more frequently in science: more scientific papers in which Nim plays a role. 
Existing programming languages (Haskell and Scala come to mind) serve as 
toolboxes for type reasoning, linear types (or quantitative types; there is 
even a theory for those), continuation models, etc. Nim could claim a place 
of its own here.
  * Forget about waiting for a "killer app", the next new webserver and so on. 
A killer app does not rely on the language; it could, in principle and in 
reality, always be rewritten in another PL. The PL market is highly 
fragmented.