[EMAIL PROTECTED] wrote:
We still don't know the business specifics of the original poster to
know whether this is at all useful to him, but assuming it will be to
others down the road, the next logical questions are:

1. How can we generalize this so one handler can be used to feed lines
to another handler for processing?

2. Can we further generalize it to use other chunk types (words, items,
tokens)?

3. Once we solve #1 and #2, should we request an addition to the engine
for this?  If you think this is fast now, wait till you see what the
engine can do with it.  It'll be like life before and after the split
and combine commands.
>
> There is also the issue of buffer size, which can significantly
> reduce the number of file reads (depending on block size, file size,
> number of words or lines to be parsed, etc.). There are different
> optimizations to consider for this, usually on a case-by-case
> basis. It's hard to generalize.

Good thought. Yes, if we could generalize such a handler, it should take a param for bufferSize.

> Can you expand on 1 and 2? I'm not sure what you mean.
>
> Do you mean
>
> read from file for a word -- or for a line
>  or something like that?

Exactly.

Maybe such a handler could look like this:

  ReadBuffered pFileName, pChunkType, pCallbackMessage, pBufferSize

Your routine calls ReadBuffered, and the ReadBuffered command sends the callback message back to you with the next chunk as its param.
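To make the shape of that handler concrete, here is a rough sketch in Python (the real thing would be written in Transcript; the name read_buffered, the chunk-type values, and the defaults are all my assumptions, following the hypothetical ReadBuffered signature above):

```python
def read_buffered(file_name, chunk_type, callback, buffer_size=8192):
    """Read file_name in buffer_size pieces and hand each complete
    chunk (a line or a word, per chunk_type) to callback, one at a
    time. This plays the role of the proposed ReadBuffered handler,
    with callback standing in for the callback message."""
    separator = "\n" if chunk_type == "line" else " "
    leftover = ""
    with open(file_name, "r") as f:
        while True:
            block = f.read(buffer_size)
            if not block:
                break
            leftover += block
            # Hand off every complete chunk; keep the trailing partial
            # chunk in leftover until the next read completes it.
            pieces = leftover.split(separator)
            leftover = pieces.pop()
            for piece in pieces:
                callback(piece)
    if leftover:
        callback(leftover)  # final chunk with no trailing separator
```

The caller would then do something like read_buffered("data.txt", "line", handleChunk), and handleChunk fires once per line regardless of how the buffer boundaries fall.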

Here's a challenge: can we implement callbacks in a way that doesn't use "send"? It's not prohibitively slow, but if we're doing this for speed it'd be nice if we could add the convenience of a separate handler without sacrificing any milliseconds we don't need to.
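One send-free approach (my suggestion, not something settled in the thread) is to stop dispatching one message per chunk and instead hand the caller a whole batch of chunks per buffer read, letting the caller loop over the batch directly. Sketched in Python as a generator, again with hypothetical names:

```python
def read_buffered_batches(file_name, chunk_type, buffer_size=8192):
    """Send-free variant: instead of one callback message per chunk,
    yield one batch (a list) of complete chunks per buffer read. The
    caller iterates over each batch in its own loop, so there is no
    per-chunk dispatch overhead."""
    separator = "\n" if chunk_type == "line" else " "
    leftover = ""
    with open(file_name, "r") as f:
        while True:
            block = f.read(buffer_size)
            if not block:
                break
            # Split the buffer into chunks; the last piece may be a
            # partial chunk, so carry it over to the next read.
            pieces = (leftover + block).split(separator)
            leftover = pieces.pop()
            if pieces:
                yield pieces
    if leftover:
        yield [leftover]  # final chunk with no trailing separator
```

The trade-off is memory for speed: each batch lives in the caller's hands at once, much like working with a variable after split rather than requesting items one by one.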

--
 Richard Gaskin
 Fourth World Media Corporation
 ___________________________________________________________
 [EMAIL PROTECTED]       http://www.FourthWorld.com
_______________________________________________
use-revolution mailing list
[EMAIL PROTECTED]
http://lists.runrev.com/mailman/listinfo/use-revolution
