Re: [Haskell-cafe] Tokenizing and Parsec

2010-01-12 Thread Stephen Tetley
2010/1/12 Günther Schmidt gue.schm...@web.de: [Snip...] I need to write my own parsec-token-parsers to parse this token stream in a context-sensitive way. Uhm, how do I do that then? Hi Günther, get the Parsec manual from Daan Leijen's home page, then see the section '2.11 Advanced: Separate
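For readers finding this thread later: the "separate scanner" approach means running Parsec over a list of your own tokens instead of over Chars. Below is a minimal sketch of what that can look like with Parsec 3's tokenPrim. The Tok type, the satisfyTok helper, and the tiny assignment grammar are hypothetical stand-ins, since the original posts do not show Günther's actual token type.

module TokenStreamParser where

import Text.Parsec
import Text.Parsec.Pos (incSourceColumn)

-- Hypothetical token type; the thread does not show the real one.
data Tok
  = TIdent String
  | TNumber Integer
  | TSymbol Char
  deriving (Show, Eq)

-- A Parsec parser whose input is a list of our own tokens, not Chars.
type TokParser a = Parsec [Tok] () a

-- Accept a token when the supplied test returns Just a result.
-- Positions are faked (column + 1 per token) because this sketch
-- does not carry source positions inside the tokens themselves.
satisfyTok :: (Tok -> Maybe a) -> TokParser a
satisfyTok test = tokenPrim show nextPos test
  where
    nextPos pos _ _ = incSourceColumn pos 1

identifier :: TokParser String
identifier = satisfyTok $ \t -> case t of
  TIdent s -> Just s
  _        -> Nothing

number :: TokParser Integer
number = satisfyTok $ \t -> case t of
  TNumber n -> Just n
  _         -> Nothing

symbol :: Char -> TokParser ()
symbol c = satisfyTok $ \t -> case t of
  TSymbol c' | c' == c -> Just ()
  _                    -> Nothing

-- A tiny grammar over the token stream: ident '=' number.
assignment :: TokParser (String, Integer)
assignment = do
  name <- identifier
  symbol '='
  val  <- number
  return (name, val)

main :: IO ()
main = print (parse assignment "tokens"
                [TIdent "x", TSymbol '=', TNumber 42])

In a real lexer you would attach a SourcePos to each token during the first pass and return it from the position function, so that error messages point into the original file rather than at token counts.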

Re: [Haskell-cafe] Tokenizing and Parsec

2010-01-12 Thread Magnus Therning
2010/1/12 Günther Schmidt gue.schm...@web.de: Hi all, I've used Parsec to tokenize data from a text file. It was actually quite easy; everything is correctly identified. So now I have a list/stream of self-defined Tokens, and now I'm stuck, because now I need to write my own

Re: [Haskell-cafe] Tokenizing and Parsec

2010-01-12 Thread Khudyakov Alexey
In a message dated 12 January 2010 03:35:10, Günther Schmidt wrote: Hi all, I've used Parsec to tokenize data from a text file. It was actually quite easy; everything is correctly identified. So now I have a list/stream of self-defined Tokens, and now I'm stuck, because now I need to write my

[Haskell-cafe] Tokenizing and Parsec

2010-01-11 Thread Günther Schmidt
Hi all, I've used Parsec to tokenize data from a text file. It was actually quite easy; everything is correctly identified. So now I have a list/stream of self-defined Tokens, and now I'm stuck, because now I need to write my own parsec-token-parsers to parse this token stream in a
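As a concrete reference for the setup described above, here is a rough sketch of the first pass: a small Parsec lexer that turns a String into a list of self-defined tokens. The Tok type and the tiny token grammar are invented for illustration, since the original post does not include the actual code.

module Tokenize where

import Text.Parsec
import Text.Parsec.String (Parser)

-- Hypothetical token type; the post does not show the real one.
data Tok
  = TIdent String
  | TNumber Integer
  | TSymbol Char
  deriving (Show, Eq)

-- One token: an identifier, a number, or a single punctuation char.
oneTok :: Parser Tok
oneTok =  (TIdent <$> many1 letter)
      <|> (TNumber . read <$> many1 digit)
      <|> (TSymbol <$> oneOf "=+-*/();")

-- The whole first pass: a String in, a list of tokens out.
tokenize :: Parser [Tok]
tokenize = spaces *> many (oneTok <* spaces) <* eof

main :: IO ()
main = print (parse tokenize "input" "x = 42")

The result of this pass is the [Tok] value that the second, context-sensitive parser then consumes.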

Re: [Haskell-cafe] Tokenizing and Parsec

2010-01-11 Thread Uwe Hollerbach
Hi Günther, you could write functions that pattern-match on various sequences of tokens in a list; for example, have a look at the file Evaluator.hs in my scheme interpreter haskeem. Or you could build up more complex data structures entirely within Parsec, and for this I would point you
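A minimal sketch of the first suggestion, assuming a hypothetical Tok type: pattern-match directly on the token list without going back through Parsec. The Stmt type and the "ident '=' number ';'" shape are invented for illustration and are not taken from haskeem's Evaluator.hs.

module MatchTokens where

-- Hypothetical token and result types, invented for illustration.
data Tok = TIdent String | TNumber Integer | TSymbol Char
  deriving (Show, Eq)

data Stmt = Assign String Integer
  deriving Show

-- Recognise an "ident '=' number ';'" prefix by pattern matching
-- directly on the token list; anything else is reported as an error.
parseStmts :: [Tok] -> Either String [Stmt]
parseStmts [] = Right []
parseStmts (TIdent name : TSymbol '=' : TNumber n : TSymbol ';' : rest) =
  fmap (Assign name n :) (parseStmts rest)
parseStmts ts = Left ("unexpected tokens: " ++ show (take 4 ts))

main :: IO ()
main = print (parseStmts
  [TIdent "x", TSymbol '=', TNumber 1, TSymbol ';'])

This style works well when the grammar over the tokens is small and fixed; for anything with real nesting or backtracking, running Parsec over the token stream (as discussed earlier in the thread) scales better.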