Hi
One of the features my Node.js application provides is the ability
to search through log files, which can be huge. My first approach was to read
the files line by line, run a regular expression match, and whenever a
match is found, emit an event with the matched data, something like:
self.emit('match', {indx: i, file: fileName, line: line});
This has the benefit of transmitting small amounts of data, which gets
appended to the browser in real time, which looks cool as well, I guess.
But then I switched to capturing all matched records and transmitting all
the matched data at the end of the search, something like:
if (matched)
  matches.push({....});
....
self.emit("done", matches);
Now with this approach there is only one event that gets fired, at the end,
and all the matched data is returned at once. This seems like it would cause
less network traffic, but it takes the realtime effect away. I'm also
concerned that if there are a lot of matches and the matched data gets too
big, could there be problems transmitting that much data in one go?
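The second version, boiled down, looks roughly like this. The collectMatches name is made up, and for brevity it takes the lines as an array, whereas the real code would still stream them off disk:

```javascript
'use strict';

// Sketch of the batch approach: same per-line regex scan, but hits are
// buffered into an array and sent back in a single 'done' event.
function collectMatches(lines, fileName, pattern) {
  const matches = [];
  lines.forEach((line, idx) => {
    if (pattern.test(line)) {
      matches.push({ indx: idx + 1, file: fileName, line: line });
    }
  });
  // one big payload at the end: self.emit("done", matches);
  return matches;
}
```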
So which approach is better: multiple events with small chunks of result
data, or one event at the end that sends a lot of data back to the browser?
Thanks
--
Job Board: http://jobs.nodejs.org/
Posting guidelines:
https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google
Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/nodejs?hl=en