Thanks a lot, Niko. FYI, the application is that we want to compress print drivers prior to distributing them. Before we do so, we want to determine which is more favourable: everything in one archive, or individual archives. I saw your note about re-using connections; I'll deal with it by setting up an HTTP ping and adding the HTTP ping RTT to the measurement of each file.
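Something like the following is what I have in mind (just a sketch using curl's --write-out timings; the host name and paths are placeholders for our setup):

    # Baseline RTT: time to set up a fresh TCP connection (placeholder host).
    rtt=$(curl -s -o /dev/null -w '%{time_connect}' http://printserver/)

    # Fetch each file and add the baseline back onto its total time, to
    # compensate for curl re-using the connection after the first fetch.
    curl -s -o /dev/null -w '%{url_effective} %{time_total}\n' \
         'http://printserver/pdf[1-2].pdf' |
      awk -v rtt="$rtt" '{ printf "%s %.3f\n", $1, $2 + rtt }'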
Love the tools!

R

Niko Tyni wrote:
>On Fri, Dec 02, 2005 at 02:18:32PM +1100, Rob de Jong wrote:
>
>>I'd like to measure the time taken to fetch multiple URLs with curl,
>>e.g. urlformat = http://%host%/pdf[1-2].pdf
>>My problem is that SmokePing does not seem to handle the fact that curl
>>is fetching multiple URLs.
>>It just grabs the average Total Time of the second URL in this case.
>>Is there a way I can achieve this?
>
>Hi,
>
>I'm not quite sure if this is a good idea... the curl documentation
>states that curl will re-use the same TCP connection when it can, so
>you'll get kind of skewed results. What are you trying to measure
>anyway? It seems to me that one file would be enough to measure
>webserver responsiveness, and if you're trying to measure the bandwidth
>of the connection (to see if it's saturated or something), you'd
>probably be better off with big ICMP packets.
>
>That said, here's a patch that makes multiple URLs (whether in the
>[]/{} notation or with separate arguments) possible. It needs
>the 'redirect' patch I sent earlier to be applied first.
>
>Cheers,
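PS. Regarding the big-ICMP suggestion: if we go that route, I suppose a probe stanza along these lines would do it (a sketch only; the fping path and packet size are guesses for our environment):

    *** Probes ***

    + FPing
    binary = /usr/sbin/fping
    packetsize = 1400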
