One possible clue: your client may not be able to generate requests any faster, or your client-side configuration may be wrong.

Best Regards,
Boian Jordanov
SNE
Orbitel - Next Generation Telecom
tel. +359 2 4004 723
tel. +359 2 4004 002




On Oct 17, 2007, at 9:17 AM, Alan DeKok wrote:

Apostolos Pantsiopoulos wrote:
Well, yes, that has been my main concern, I must admit, because I have
seen so many replies on the mailing list "urging" people to make the
backend DB faster (and concentrating on that aspect alone when the
server performs poorly).

  There are many factors to consider in tuning a system.  A RADIUS
server all by itself can handle 5k requests/s, if it doesn't access DBs
or any files.  A stand-alone DB client can do 1000's of reads/s all by
itself.

  The combination of the two does NOT necessarily get the best of
both... i.e. 1000's of reads/s through RADIUS.  Interaction effects mean
that the maximum throughput is LESS than the maximum throughput of each
piece in isolation.
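A back-of-envelope sketch of that interaction effect (an illustration, not measured data), using the 5k requests/s and 1000 reads/s figures above and assuming each request passes serially through both stages:

```python
# If every request must traverse each stage serially, the per-request
# latencies add, so the combined rate is the harmonic combination of the
# standalone rates -- always lower than the slowest stage alone.

def combined_rate(rates):
    """Max throughput (req/s) when each request passes through every
    stage serially; rates are the standalone throughputs of each stage."""
    return 1.0 / sum(1.0 / r for r in rates)

radius_alone = 5000.0  # req/s, figure quoted above
db_alone = 1000.0      # reads/s, figure quoted above

print(combined_rate([radius_alone, db_alone]))  # ~833 req/s, below both
```

So even before counting any Perl overhead, chaining the two stages caps throughput below what the DB client manages on its own.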

  Find out what else is stopping the server from processing requests.
Is there ANYTHING you have configured other than your Perl script?  If
so, that may be the issue.

I'll re-check it.

Run "cachegrind" to see where all of the CPU time is spent. It won't
count sleeping (or waiting for network activity), so the times may be
somewhat misleading.  But it may help.
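For reference, one way to invoke it (my own sketch, assuming valgrind is installed and the server was built with debug symbols; "radiusd -X" runs in the foreground, single-threaded, which suits the profiler):

```shell
# Run the server in debug mode under the cachegrind tool
valgrind --tool=cachegrind radiusd -X

# After stopping the server, annotate the profile it wrote
# (the output file name includes the process ID)
cg_annotate cachegrind.out.<pid>
```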

  Alan DeKok.
-
List info/subscribe/unsubscribe? See http://www.freeradius.org/list/users.html
