Hi,
I have been thinking about Domain Specific Languages (DSLs) lately from
different perspectives and concluded that we could "easily" make Julia an
awesome home for DSLs.
Why would Julia benefit from a good infrastructure for DSLs?
*1.)*
Rapid prototyping of language concepts.
I was considering Mauro's trait implementation lately, and I didn't want to
use it because it just looked a little cumbersome (no offense, it's just a
little difficult with only macros!).
If he had been able to implement the prototype with a DSL first, he could have
created a syntax prototype for it right away, and things could have looked more
concise and inviting. A rough sketch of what such a trait construct could
expand to in plain Julia is shown below.
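To make this a bit more concrete, here is a hypothetical sketch (in current
Julia syntax) of the kind of plain-Julia code a trait DSL would have to
generate anyway; all names (IterSpeed, Fast, Slow, speed_hint, @trait) are
made up for illustration:

# Trait types: one abstract "trait" with two possible values.
abstract type IterSpeed end
struct Fast <: IterSpeed end
struct Slow <: IterSpeed end

# A hypothetical DSL line like `@trait Fast Array` could expand to:
IterSpeed(::Type{<:Array}) = Fast()
IterSpeed(::Type)          = Slow()   # default for everything else

# Dispatch on the trait value instead of the concrete type:
speed_hint(x) = speed_hint(IterSpeed(typeof(x)), x)
speed_hint(::Fast, x) = "fast path"
speed_hint(::Slow, x) = "generic fallback"

speed_hint([1, 2, 3])  # "fast path"
speed_hint("hello")    # "generic fallback"

A DSL would only have to emit definitions like these from a nicer surface
syntax.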
*2.)*
Writing in different languages inside Julia, for example Assembler, OpenCL
kernels, OpenGL shaders, etc...
function foo(a::Float32, b::Float32)
    @DSL Assembler(a, b)"""
    push RBP
    mov RBP, RSP
    vaddss XMM0, XMM0, XMM1
    pop RBP
    ret
    """
end
I know you might ask why this is a good idea, as LLVM should be tuned to
emit the best native code.
But let's assume that it isn't! Then one person can already prototype what
the code emitted by LLVM should look like, and another person, who probably
knows LLVM a lot better, can make the appropriate changes to LLVM/Julia.
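As a side note, something conceptually close already exists for inline LLVM
IR via Base.llvmcall (not assembler, and the exact signature may differ
between Julia versions); the snippet below assumes a current Julia and is
only meant as a sketch of the idea:

# Inline LLVM IR, compiled and inlined into the surrounding Julia function.
# %0 and %1 are the two Float32 arguments; %3 is the first unnamed SSA value.
function add(a::Float32, b::Float32)
    Base.llvmcall("""
        %3 = fadd float %0, %1
        ret float %3
        """, Float32, Tuple{Float32, Float32}, a, b)
end

add(1f0, 2f0)  # 3.0f0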
*3.)*
Classic use cases like specialized UI languages, first-order logic, other
mathematical constructs, scripting, etc...
*4.)*
Stealing nice syntactic concepts from other languages, to see if they give
Julia any advantages, without a big hassle.
function isspecificperson(person)
    @DSL Scala"""
    person match {
        case Person("Hans","Meyer",7) => "Found: Hans Meyer"
        case Person("Heinz","Mustermann",28) => "Found: Heinz Mustermann"
        case _ => "Unknown Person"
    }
    """
end
*Possible disadvantages:*
If widely used, code will get harder to read, as you need to learn all the
crazy DSLs users are creating.
But this will happen anyway, and it's probably better to do it in an
orderly fashion ;)
*Implementation Sketch:*
macro DSL(name, text)
    tokens = dsltokenizer(DSLTokens{name}(), text)::DSLTokens{name}
    dslast = generate_ast(tokens)::AST{name}
    return dsl(dslast)::AST{:Julia}
end

# Default implementations:
dsltokenizer(::DSLTokens, text) = ... # default, probably with Jake Bolewski's Lexer?!
generate_ast(tokens::DSLTokens) = ...
dsl(ast::AST) = ...

# Depending on the complexity of your DSL, override any of these stages to
# implement your own DSL; otherwise use the defaults.
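To make the staged idea above concrete, here is a self-contained toy version
that runs in current Julia, using a string macro and a made-up
reverse-Polish-notation "DSL"; all names here (tokenize_rpn, rpn_to_expr,
@rpn_str) are invented for this sketch and have nothing to do with the
proposed @DSL machinery or Jake's Lexer:

# Stage 1: tokenize the DSL source.
tokenize_rpn(text::AbstractString) = split(text)

# Stage 2: turn the token stream into a Julia expression tree.
function rpn_to_expr(tokens)
    stack = Any[]
    for t in tokens
        if t in ("+", "-", "*", "/")
            b = pop!(stack); a = pop!(stack)
            push!(stack, Expr(:call, Symbol(t), a, b))
        else
            push!(stack, Meta.parse(t))  # number literals or variable names
        end
    end
    length(stack) == 1 || error("malformed RPN expression")
    return stack[1]
end

# Stage 3: the string macro splices the generated Julia AST into the caller.
macro rpn_str(text)
    return esc(rpn_to_expr(tokenize_rpn(text)))
end

rpn"3 4 + 2 *"   # (3 + 4) * 2 == 14
x = 5
rpn"x 1 +"       # x + 1 == 6

The real thing would of course dispatch on the DSL name and hook into the
default tokenizer/AST stages as sketched above, but the basic "string in,
Julia AST out" mechanism is the same.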
It seems like Jake Bolewski has already implemented a lot to make this work.
It would probably also be nice to integrate OpenCL kernel code like this ;)
My hope would also be to pair this with metadata on the different stages,
to make it very easy to supply correct syntax highlighting/correction for
the different DSLs.
Otherwise, creating the AST and tokenizing things needs to be done twice:
once for the system and once for an IDE.
Or are there currently simple ways in Julia to determine where tokens are
in a string, what scopes there are, and what kind of symbol println is in
"println("1234")"?
I haven't found them yet. Most of the things you would currently need to
implement for this would be redundant with "parse" and would then be
fragile with respect to changes in the language.
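For reference, a current Julia offers Meta.parse (plain parse in older
versions), which yields the Expr tree but, as far as I can tell, no token
positions or scope information, which is exactly the gap described above:

# Parsing gives the expression tree, but not where the tokens were
# in the source string or what scope/role they have.
ex = Meta.parse("println(\"1234\")")
dump(ex)
# Expr
#   head: Symbol call
#   args: Array{Any}((2,))
#     1: Symbol println
#     2: String "1234"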
Well, these are just some thoughts I recently had; feel free to evaluate
this and/or judge whether this is something we really want!
I won't implement this anytime soon, but maybe someone searching for a
bachelor's/master's thesis will come to the rescue?
Who knows...
Cheers,
Simon