On 11 June 2012 08:51, Tom Harris <celephi...@gmail.com> wrote:
> Greetings,
>
> I have a class that implements the iterator protocol, and tokenises a
> string into a series of tokens. As well as the token, it keeps track of
> some information such as line number, source file, etc.
>
>     for token in Tokeniser():
>         do_stuff(token)
>
> What I want is to be able to wrap the tokeniser to add functionality to
> the base parser without subclassing, e.g.
>
>     for token in processor(Tokeniser()):
>         do_stuff(token)
>
> Sort of Decorator pattern, so that I can chain more processors, but I
> cannot think how to implement it. Any clues for me?
Maybe I've misunderstood. Is this what you're looking for?

    def processor(tokens):
        for token in tokens:
            yield func(token)

> Thanks
>
> TomH
>
> --
> http://mail.python.org/mailman/listinfo/python-list
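To spell that out a bit: because a generator function that takes an iterable and yields tokens is itself iterable, processors compose by simple nesting, which gives the Decorator-style chaining the original post asks about. Here is a minimal self-contained sketch; the `tokenise`, `upper_case`, and `drop_short` names are hypothetical stand-ins for the real Tokeniser and processors.

```python
def tokenise(text):
    """Stand-in for the Tokeniser class: yields raw tokens."""
    for word in text.split():
        yield word

def upper_case(tokens):
    """A processor that transforms each token it receives."""
    for token in tokens:
        yield token.upper()

def drop_short(tokens, min_len=3):
    """A processor that filters tokens instead of transforming them."""
    for token in tokens:
        if len(token) >= min_len:
            yield token

# Processors chain by wrapping, just like the Decorator pattern:
pipeline = upper_case(drop_short(tokenise("a quick brown fox")))
print(list(pipeline))  # ['QUICK', 'BROWN', 'FOX']
```

Each stage only pulls tokens from the stage beneath it on demand, so the whole chain stays lazy and you can stack as many processors as you like.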