On Apr 21, 4:54 pm, [EMAIL PROTECTED] wrote:
> On Apr 21, 5:58 am, Dustan <[EMAIL PROTECTED]> wrote:
>
> > From my searches here, there is no equivalent to java's
> > StringTokenizer in python, which seems like a real shame to me.
> >
> > However, str.split() works just as well, except for the fact that it
> > creates it all at one go. I suggest an itersplit be introduced for
> > lazy evaluation, if you don't want to take up resources, and it could
> > be used just like java's StringTokenizer.
> >
> > Comments?
>
> If your delimiter is a non-empty string, you can use an iterator like:
>
> def it(S, sub):
>     start = 0
>     sublen = len(sub)
>     while True:
>         idx = S.find(sub, start)
>         if idx == -1:
>             yield S[start:]
>             raise StopIteration
>         else:
>             yield S[start:idx]
>             start = idx + sublen
>
> target_string = 'abcabcabc'
> for subs in it(target_string, 'b'):
>     print subs
Thanks. Well, now I know it can be implemented in a reasonably efficient manner in pure Python (i.e. without creating side-effect strings that aren't of any use, as concatenation would). That's what I was mainly concerned about. I still feel it could be a builtin (seriously, the world wouldn't end if it were, and nor would Python), but this'll work. That's my last word on the subject.

> For something more complex, you may be able to use re.finditer.
>
> --
> Hope this helps,
> Steven
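
For what it's worth, here is a rough sketch of what the re.finditer approach could look like; the name itersplit and the sample separator pattern are just placeholders picked for illustration, not anything from the standard library:

import re

def itersplit(s, sep):
    # Lazily yield the pieces of s between matches of the regex sep,
    # instead of building the whole list up front like s.split() does.
    pos = 0
    for m in re.finditer(sep, s):
        yield s[pos:m.start()]
        pos = m.end()
    yield s[pos:]   # the tail after the last separator

for piece in itersplit('one, two,three', r',\s*'):
    print piece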