Hi Mike,

Thanks for that.... The config will be changed according to your advice.
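
For the record, the plan on the accounting server is something along these
lines, if I am reading the reference manual correctly (the client address and
secret below are only placeholders):

    <Client 10.0.0.1>
        Secret      xxxxxxxx
        # Catch only genuine duplicates, not NAS or proxy retransmissions
        DupInterval 2
    </Client>

Anything that still slips through will be cleaned up when the accounting
files are post-processed (see the sketch after the included message).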

Regards,
Paul

In his message, Mike McCauley wrote:
> Hi Christophe,
> 
> My advice is to reduce the DupInterval to something like 2 seconds. It is
> really only intended to catch genuine duplicate packets (i.e. packets sent
> along duplicate parallel network paths, or from some other pathological
> network problem). It's really not supposed to catch _retransmissions_ by the
> NAS. As you have found, when it starts to catch _retransmissions_ (as
> opposed to duplicates), you start to have problems.
> 
> Hope that helps.
> Cheers.
> 
> On Feb 9,  8:35pm, Christophe Wolfhugel wrote:
> > Subject: (RADIATOR) Question regarding DupInterval
> > From the documentation, DupInterval is applied to the client, i.e. the
> > host sending the request, possibly an intermediate proxy.
> >
> > From reading the Client.pm source code I see the following:
> >
> >     $self->{RecentIdentifiers}->{$nas_id . $code}[$p->identifier]
> >
> > $nas_id there is supposed to be the NAS-IP-Address or, if that is not
> > available, the NAS-Identifier, which is possibly not the proxy. Only if
> > neither of these attributes is present will $nas_id contain the IP
> > address of the Client. The $code identifies the type of request, so on
> > a standard setup that gives a history of 256 packets for each kind of
> > request.
> >
> > If my understanding is correct, this is somewhat different from what
> > the documentation and the comment at the beginning of Client.pm say.
> >
> > Now to my particular situation: I have a central accounting Radius
> > server which gets all accounting packets from the proxies. Whenever this
> > machine gets really bogged down (or just runs out of CPU), the proxies
> > start retransmitting, and then the NASes also start retransmitting (via
> > a different proxy). By having a really high DupInterval (19) on this
> > accounting Radius I reduce the number of duplicate records in the
> > accounting files on that machine, but my clients won't get their
> > Accounting-Accept because Radiator believes the retransmission comes
> > from the same client and treats it as a duplicate.
> >
> > I would conclude that my design is wrong, that I should reduce the
> > DupInterval on the accounting Radius a lot, and that I should have the
> > scripts that handle the accounting files eliminate the duplicates. Can
> > someone more knowledgeable confirm that this is the way I should go?
> >
> > --
> > Christophe Wolfhugel  -+-  [EMAIL PROTECTED]  -+-  France Telecom Oleane
> >
> >-- End of excerpt from Christophe Wolfhugel
> 
> 
> 
> -- 
> Mike McCauley                               [EMAIL PROTECTED]
> Open System Consultants Pty. Ltd            Unix, Perl, Motif, C++, WWW
> 24 Bateman St Hampton, VIC 3188 Australia   http://www.open.com.au
> Phone +61 3 9598-0985                       Fax   +61 3 9598-0955
> 
> Radiator: the most portable, flexible and configurable RADIUS server 
> anywhere. SQL, proxy, DBM, files, LDAP, NIS+, password, NT, Emerald, 
> Platypus, Freeside, TACACS+, PAM, external, etc etc on Unix, Win95/8, 
> NT, Rhapsody
End of included message.
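
On Christophe's last point, cleaning the remaining duplicates out of the
accounting files in the post-processing scripts would look roughly like the
sketch below. The flat "Attribute = value" record layout and the choice of
key attributes (NAS-IP-Address, Acct-Session-Id, Acct-Status-Type) are
assumptions about a typical detail file, so adjust to taste:

    #!/usr/bin/perl
    # Rough sketch: drop duplicate accounting records from a detail file.
    # Records are assumed to be separated by blank lines and to contain
    # "Attribute = value" lines, one attribute per line.
    use strict;

    my %seen;               # keys of the records already written
    local $/ = "\n\n";      # read one blank-line-separated record at a time

    while (my $record = <>) {
        my %attr;
        while ($record =~ /^\s*([\w-]+)\s*=\s*"?([^"\n]*)"?\s*$/mg) {
            $attr{$1} = $2;
        }
        # Two records with the same key are treated as duplicates.
        my $key = join '|', map { defined $attr{$_} ? $attr{$_} : '' }
            qw(NAS-IP-Address Acct-Session-Id Acct-Status-Type);
        next if $seen{$key}++;   # already written once, skip it
        print $record;
    }

Run over a detail file before it is fed to the billing scripts, the first
copy of each record is kept and later copies are dropped.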

Paul Rolland, [EMAIL PROTECTED]
France Telecom Oleane / Technical Direction / Director

--

Please no MIME, I don't read it
Please no HTML, I'm not a navigator

"I hate monday morning" - Garfield   "I hate * morning" - Me

===
Archive at http://www.thesite.com.au/~radiator/
To unsubscribe, email '[EMAIL PROTECTED]' with
'unsubscribe radiator' in the body of the message.
