> For instance, why introduce {.pure.} with enums to require full
> qualification? If the compiler does not respect the pragma, there are syntax
> errors in the code. Typing and naming should be enough to avoid enum name
> clashes.
That wouldn't work with Nim's disambiguation rules. Think of enum values as
nullary procs and you can immediately see why unqualified names would be
ambiguous.
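A minimal sketch of the clash in question (the enum names here are made up): two enums share member names, and `{.pure.}` forces qualification so there is nothing for overload resolution to guess at.

```nim
type
  Color {.pure.} = enum
    red, green, blue
  Signal {.pure.} = enum
    red, amber, green

# Both enums declare `red` and `green`; with {.pure.} every use must be
# written qualified, so the overlap is harmless:
echo ord(Color.red)     # 0
echo ord(Signal.green)  # 2
```

Without the pragma, a bare `red` behaves like a call to one of two nullary procs with identical signatures, which is exactly the ambiguity described above.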
> Another example with {.procvar.} used to qualify a proc that can be passed to
> a procedural variable. This should be the default or the compiler should
> deduce it from analyze of the source, shouldn't it?
`.procvar` is a whole topic on its own and will eventually disappear from the
language.
> Or the {.global.} to qualify a global variable or {.borrow.} to borrow code
> from overloaded proc.
It's not clear what you propose here.
> An important set of pragmas is used for foreign functions interfaces,
> importing headers or injecting code. Why not define a domain specific
> sublanguage to manage this?
You would need to map such a DSL to builtin capabilities anyway; there is no
way around it. We would still have to document those builtins, and then you
would argue the language is bloated by them.
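To make the point concrete, here is a minimal FFI sketch: the "sublanguage" already exists as a handful of pragmas, each mapping straight onto a compiler builtin. `strlen` from `<string.h>` is the only real external name here; the Nim-side proc name is arbitrary.

```nim
# importc binds to the C symbol, header emits the #include;
# a dedicated DSL would bottom out in exactly these capabilities.
proc c_strlen(s: cstring): csize_t
  {.importc: "strlen", header: "<string.h>".}

echo c_strlen("hello")  # 5
```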
> But this large amount of semantic pragmas is breaking the simplicity of the
> language.
Agreed, but there is no systems programming language out there without a
pragma-like annotation system:
* D has `@annotations`.
* C/C++ have `__underscored_Identifiers__`, and C++ added
  `[[even_a_new_syntax_for_these]]`.
* Rust has `#[attribute(value)]`; C# has `[Attribute("value")]`.
* Go has special comments such as `//go:nowritebarrier`.
"Simple" GNU C has
[these extensions](https://gcc.gnu.org/onlinedocs/gcc/C-Extensions.html).
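For comparison, Nim's counterparts to the annotations listed above are ordinary pragmas in one uniform syntax. A small sketch (the proc names are invented; `inline` and `noreturn` are real Nim pragmas):

```nim
proc fastPath(x: int): int {.inline.} =   # cf. GNU C __attribute__((always_inline))
  x + 1

proc fatal(msg: string) {.noreturn.} =    # cf. C++ [[noreturn]]
  raise newException(ValueError, msg)

echo fastPath(41)  # 42
```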
> Am I misunderstanding how pragmas are used in Nim?
No, you understand them just fine.
> Is there a model logic behind all these pragmas?
Well, you gave a nice taxonomy for them yourself.
> And will these "syntax" pragmas be replaced by language keywords before
> version 1.0?
Introducing new keywords breaks code, so most are here to stay.