I am writing code that downloads a file from server "A" to my server and streams it to the client at the same time, in real time. The full file is never stored on my server, only chunks. Here is the code:
downloader.getLink(link, cookies[acc], function (err, location) {
  if (err) {
    return res.end(JSON.stringify(err));
  }
  downloader.downloadLink(location, cookies[acc], function (err, response) {
    if (err) {
      return res.end(JSON.stringify(err));
    }
    // Forward the upstream status/headers and stream the body straight
    // through to the client without buffering the whole file.
    res.writeHead(200, response.headers);
    response.pipe(res);
  });
});
As far as I can see, nothing blocks in this code, since response is just a plain http response stream... The problem is that this way I can only stream 6 files at the same time. The server is nowhere near its limits (CPU 10%, memory 10%) and it is a single core. After 6 files I only get the loading page, and the stream doesn't start until one of the others has completed.

This is not a limitation on the first server I am downloading the files from, because with my browser, for example, I can download as many as I want. Am I doing something wrong, or is this some limitation in Node/Ubuntu 14.04 that I can change?
I already changed agent.maxSockets and also passed agent: false (roughly as sketched below).
I already tried using hyperquest.
I doubled the ulimit, and it is still the same.
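For reference, this is roughly how I set the agent options (simplified; the real requests go through the downloader module, so the host, path and headers below are just placeholders):

var http = require('http');

// Attempt 1: raise the limit of the shared global agent before any
// requests are made.
http.globalAgent.maxSockets = Infinity;

// Attempt 2: opt out of agent pooling entirely for the download request,
// so every request gets its own socket.
var req = http.request({
  host: 'example.com',               // placeholder for the real file host
  path: '/some/file',                // placeholder path
  headers: { Cookie: cookies[acc] }, // same cookie handling as above
  agent: false
}, function (response) {
  res.writeHead(200, response.headers);
  response.pipe(res);
});
req.end();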
I don't know where to go from here and am hoping somebody can help me.
Thanks