On Tue, Feb 28, 2023, 6:49 PM B 9 <[email protected]> wrote:

>
>
> On Tue, Feb 28, 2023 at 4:55 PM [email protected] <[email protected]>
> wrote:
>
>> Thanks all!
>>
>> At some point I’ll look into adding Tokenization directly into GitHub.
>>
>
> Awesome. It looks like compiling and running a C program may be trivial in
> the YAML file:
>
> - uses: actions/checkout@v3
>
> - run: |
>     make
>     ./tokenize FOO.DO
>
>
> By the way, you may be able to use a Python lexer, such as ply
> <https://www.dabeaz.com/ply/ply.html>, to create a Python program from my
> flex source code. However, I suspect that will be more work than it's
> worth.
>
>

Parser systems are less work than they're worth. But lexer systems, not so
much.

Modern languages have advanced regular expression systems which are equal
in power to a lexer. Might as well just use a big regex to lex your tokens.
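A minimal sketch of that big-regex approach in Python, using one alternation of
named groups (the token names and the sample input are made up for
illustration, not taken from the flex source):

```python
import re

# One big regex: each named group is a token type. Order matters --
# earlier alternatives win, just like rule order in a flex spec.
TOKEN_RE = re.compile(r"""
    (?P<NUMBER>\d+)
  | (?P<IDENT>[A-Za-z_]\w*)
  | (?P<OP>[+\-*/=])
  | (?P<SKIP>\s+)
""", re.VERBOSE)

def tokenize(text):
    """Yield (token_type, lexeme) pairs, skipping whitespace."""
    pos = 0
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            raise SyntaxError(f"unexpected character {text[pos]!r} at {pos}")
        pos = m.end()
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("x = 40 + 2")))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '40'), ('OP', '+'), ('NUMBER', '2')]
```

`m.lastgroup` tells you which named alternative matched, which is what makes a
single regex usable as a whole lexer.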

-- John.
