On Jan 25, 2017, at 7:27 AM, Chris Eidhof <[email protected]> wrote:
> I wonder if built-in grammars (that's what Perl calls them) would work only 
> for things that are backed by string literals, or if it's worth the 
> time/effort to make them work for other kinds of data as well. For example, 
> what if you write a grammar to tokenize (yielding some sequence of `Token`s), 
> and then want to parse those `Token`s? Or what if you want to parse other 
> kinds of data? Or should we try to make the 80% case work (only provide 
> grammar/regex literals for Strings) to avoid complexity?

I don’t have a strong opinion on this matter.  I can definitely see the 
elegance of being able to pattern match non-string data with the regex 
features.  Certainly things like parsing fixed packet formats coming off a 
network seem like good candidates for this sort of thing.
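To make the packet idea concrete, here's a minimal sketch of what matching a fixed header against raw bytes might look like today, written by hand (the magic byte, field layout, and names are made up for illustration, not a proposed design):

```swift
// Hypothetical fixed packet format: 1 magic byte (0x7E), 1 version byte,
// then a 2-byte big-endian payload length.
struct PacketHeader {
    let version: UInt8
    let payloadLength: UInt16
}

func parseHeader(_ bytes: [UInt8]) -> PacketHeader? {
    // Reject anything too short or without the expected magic byte.
    guard bytes.count >= 4, bytes[0] == 0x7E else { return nil }
    let length = UInt16(bytes[2]) << 8 | UInt16(bytes[3])
    return PacketHeader(version: bytes[1], payloadLength: length)
}

let header = parseHeader([0x7E, 0x01, 0x00, 0x2A])
// header?.version == 1, header?.payloadLength == 42
```

A byte-level grammar/pattern feature would presumably let you express the `guard`s and field extraction declaratively instead of by hand.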

That said, it isn’t clear to me that this would be widely-used enough to be 
worth the complexity cost.  If it just drops into the existing model (e.g. the 
string model works on sequences of bytes, so this just falls out of it) then 
that would be great.  If it requires massive complexity for little gain, then 
probably not.  We can see when it comes time to actually design and build this 
functionality out and lazily evaluate the decision based on what we know then.

> I think it's worth looking at parser combinators.

Yep, I’m a fan, they are definitely very nice in many cases!
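For readers unfamiliar with the idea, a parser combinator library builds big parsers by composing small ones. A minimal sketch (all names here are illustrative, not any particular library's API):

```swift
// A parser consumes a prefix of the input and returns a value, or nil on failure.
struct Parser<A> {
    let run: (inout Substring) -> A?
}

// Matches a single expected character.
func char(_ c: Character) -> Parser<Character> {
    Parser { input in
        guard input.first == c else { return nil }
        input.removeFirst()
        return c
    }
}

// Matches a run of decimal digits and converts it to an Int.
let digits = Parser<Int> { input in
    let prefix = input.prefix(while: { $0.isNumber })
    guard let n = Int(prefix) else { return nil }
    input.removeFirst(prefix.count)
    return n
}

// Sequences two parsers, backtracking if the second one fails.
func zip<A, B>(_ a: Parser<A>, _ b: Parser<B>) -> Parser<(A, B)> {
    Parser { input in
        let original = input
        guard let x = a.run(&input) else { return nil }
        guard let y = b.run(&input) else { input = original; return nil }
        return (x, y)
    }
}

// Parses "12x34" into (12, ("x", 34)).
var input: Substring = "12x34"
let pair = zip(digits, zip(char("x"), digits)).run(&input)
```

The appeal for a built-in grammar feature is that combinators already compose over arbitrary element types, not just strings.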

> To me it seems like there's a lot of (exciting) work to be done to get this 
> right :).

Totally.  Let's start by getting the essential bones of the String design right 
:-)

-Chris

_______________________________________________
swift-evolution mailing list
[email protected]
https://lists.swift.org/mailman/listinfo/swift-evolution