The FastAtRead class is a straightforward integration of Vitalije's @file 
read code.

The contrast between Vitalije's code and my own legacy code is embarrassing.

In this post, I'm going to say just a few words about why that code, 
fast_at.scan_lines, is so good.

1. It uses brilliantly simple data structures.

The local stack var in fast_at.scan_lines is a stack of (gnx, indent) 
tuples. This stack is a brilliantly compact representation of the context, 
that is, the state of the scan. The sentinels corresponding to nodes, 
section references, and @others push items onto the stack. The sentinels 
ending these constructs pop the stack. Nothing else needs to be done!
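The push/pop idea can be sketched as follows. This is a minimal, invented 
example, not Leo's actual code: the sentinel syntax and the scan function 
are simplified stand-ins, and only node sentinels are handled.

```python
def scan(lines):
    # Hypothetical sketch of the (gnx, indent) stack idea; not Leo's code.
    # The top of the stack is always the current context.
    stack = [("root-gnx", 0)]          # invented root entry
    bodies = {"root-gnx": []}          # gnx -> list of body lines
    for line in lines:
        s = line.strip()
        if s.startswith("#@+node:"):   # opening sentinel: push a new context
            gnx = s[len("#@+node:"):]
            indent = len(line) - len(line.lstrip())
            stack.append((gnx, indent))
            bodies.setdefault(gnx, [])
        elif s.startswith("#@-node"):  # closing sentinel: pop the context
            stack.pop()
        else:                          # ordinary line: append to current body
            gnx, indent = stack[-1]
            bodies[gnx].append(line[indent:])
    return bodies
```

Restoring the previous context after a node, section reference, or @others 
ends is a single stack.pop(); no bookkeeping is scattered across parallel 
data structures.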

The old code represents state as a horror show of parallel stacks. 
Switching context is so difficult that it must be encapsulated in helper 
functions.

2. It doesn't call helper functions.

All time-critical code resides directly in fast_at.scan_lines, organized by 
section references.

3. It handles each input line very quickly.

The main loop is: for i, line in enumerate(lines[start:]):

This is considerably faster than computing line indices "by hand", despite 
creating the new list lines[start:].  But, doh, that slice is created only 
once per file, and so has essentially no effect on performance.
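The slice-then-enumerate pattern looks like this. A tiny illustration, not 
Leo code; the lines and the start value are made up.

```python
# Suppose the first line was handled separately before the main loop.
lines = ["@first", "line 1", "line 2", "line 3"]
start = 1

collected = []
for i, line in enumerate(lines[start:]):
    # i counts from 0 within the slice; i + start is the original index.
    collected.append((i + start, line))

# lines[start:] is built once, up front. After that, each iteration is a
# plain traversal: no per-line index arithmetic or bounds checking.
```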

The code uses regex patterns to determine which lines must be treated 
specially. My only real contribution to the code is realizing that an 
initial test can quickly determine that most lines (about 75% of them) are 
not special and can simply be appended to the current list of body lines.
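That fast-path idea can be sketched like so. Again a hedged, simplified 
example: the sentinel prefix, pattern names, and classify function are 
invented for illustration and are not Leo's actual API.

```python
import re

# Invented patterns standing in for the real sentinel regexes.
node_pat = re.compile(r'^\s*#@\+node:')
others_pat = re.compile(r'^\s*#@\+others')

def classify(line, body):
    # Fast path: most lines contain no sentinel prefix at all, so a cheap
    # substring test rejects them before any regex runs.
    if '#@' not in line:
        body.append(line)
        return 'plain'
    if node_pat.match(line):
        return 'node'
    if others_pat.match(line):
        return 'others'
    body.append(line)          # '#@' appeared but matched no sentinel
    return 'plain'
```

A single `in` test on a str is much cheaper than a regex match, so running 
it first pays off whenever the common case is an ordinary body line.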

*Summary*

This code is worth very careful study for anyone interested in speedy code.

It's fast because it does very little.

Edward
