On Wednesday, 30 August 2017 at 01:30:30 UTC, Pradeep Gowda wrote:
I'm referring to this thread about Crystal --
https://lobste.rs/s/dyitr0/its_fun_program_crystal_is_it_scalable
Crystal is strongly typed, but overwhelmingly uses type
inference, rather than explicit types. Because Crystal aims to
be spiritually—and frequently literally—compatible with Ruby,
that’s a problem: to accomplish that, Crystal relies on
sometimes-nullable types with implicit structure and implicit
unions, such that, frequently, the only way to even begin type
inference is to load the entire program’s AST into RAM all at
once and then start your massive type inference pass. What
you’re seeing in this thread is how a “simple” fix to error
reporting in a YAML parser hit that problem, causing Crystal
to use far too much RAM and OOM.
How does D compare in this regard, especially in cases where
`auto` storage class specifiers are used liberally throughout
the code base?
D supports separate compilation by design. That is, the compiler
does not need to load into memory all of the source files behind
all of the object files that get linked into the final executable;
each module can be compiled on its own, against only the
declarations of the modules it imports. Liberal use of `auto`
doesn't change this, because the type of an `auto` declaration is
inferred locally from its initializer, not from a whole-program
analysis.
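
A minimal sketch of what that looks like in practice (the module
names, file names, and contents here are invented purely for
illustration):

```d
// point.d -- one module, compilable on its own
module point;

struct Point
{
    double x;
    double y;
}

Point midpoint(Point a, Point b)
{
    // `auto` is resolved right here from the initializer's type (double);
    // the compiler never needs to see any other module to infer it.
    auto mx = (a.x + b.x) / 2.0;
    auto my = (a.y + b.y) / 2.0;
    return Point(mx, my);
}
```

```d
// app.d -- a separate module that only needs point's declarations
module app;

import std.stdio : writeln;
import point : Point, midpoint;

void main()
{
    auto m = midpoint(Point(0, 0), Point(4, 2));
    writeln(m); // Point(2, 1)
}
```

```
dmd -c point.d              # point.o, produced without ever reading app.d
dmd -c app.d                # app.o, needs only point's declarations, not its object code
dmd app.o point.o -ofapp    # link the separately compiled objects
```

Roughly speaking, each `dmd -c` invocation only needs the module
being compiled plus the modules it imports in memory, rather than
the whole program at once, which is what keeps memory use bounded
on large code bases.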