Possible bottlenecks:
- bandwidth between the DB and the application
- bandwidth between the app and the client
- both of them
- memory consumption in the app may force excessive swapping
You're saying rows, but you use MongoDB, which is document-based. Do you mean documents?

Possible solutions:
- use streaming where possible
- use paging where streaming is not supported
- reduce the data fetched; MongoDB supports limiting the returned fields

Disclaimer: all of this is guesswork without knowledge of the infrastructure and code.

On Friday, September 27, 2013 11:31:54 UTC+2, Luke Han wrote:
>
> Hi Experts,
> We are building a data feed REST service with Node.js and
> MongoDB/Express. The server performs very well if the query result is small,
> but it hangs when the client queries a large dataset, such as
> 1M rows (already using gzip compression). Is this caused by Node.js's
> single-threaded design?
> I would like to consult you for any ideas on how to handle this.
> Any comments are welcome :)
> Thank you very much.
>
> Luke
>
--
Job Board: http://jobs.nodejs.org/
Posting guidelines: https://github.com/joyent/node/wiki/Mailing-List-Posting-Guidelines
You received this message because you are subscribed to the Google Groups "nodejs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/nodejs?hl=en
