Hi,

      Recently I was developing a parser for text files in a particular 
format. The files are huge, so reading one in whole made the program hang. 
I then read the file in chunks and passed each chunk to ply, but this broke 
the parser because tokens sometimes fall across chunk boundaries. This led 
me to add some code to ply's lex and yacc modules so that they load the 
next file chunk once a threshold (number of bytes) is reached. Is this how 
it is normally done? Is there a better way to do this?

Thanks,
Saravanan
