Arran Cudbard-Bell wrote:
> This is pretty much a non-issue. Just have the detail file writer start
> a new file every minute/hour, then the number of repeated entries is
> very small. It's only when you have it start a new file every day, or
> use one monolithic detail file, that you run into problems.
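For reference, rotation like that is usually done through the detail module's filename expansion. A sketch (the exact path and attribute names depend on your raddb layout; verify against your server's detail module docs before using):

```
# mods-available/detail (sketch) -- appending :%H to the usual
# detail-%Y%m%d name starts a new detail file every hour.
detail {
    filename = ${radacctdir}/%{Client-IP-Address}/detail-%Y%m%d:%H
}
```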
Having hourly detail files is recommended, and it does help. But for a site doing 100 acct/s, that still means ~360K packets in each hourly file.

Alan DeKok.

-
List info/subscribe/unsubscribe? See http://www.freeradius.org/list/users.html
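The back-of-the-envelope math above is just rate times rotation interval. A minimal sketch (the 100 acct/s rate is the example figure from this thread, not a recommendation):

```python
# Approximate detail-file size (in packets) for a given rotation interval.
# ACCT_PER_SEC is the hypothetical accounting rate from the thread above.
ACCT_PER_SEC = 100

def packets_per_file(rotation_seconds: int, rate: int = ACCT_PER_SEC) -> int:
    """Accounting packets accumulated in one detail file before rotation."""
    return rate * rotation_seconds

if __name__ == "__main__":
    for label, secs in [("minute", 60), ("hour", 3600), ("day", 86400)]:
        print(f"rotate per {label}: ~{packets_per_file(secs):,} packets/file")
```

At 100 acct/s, per-minute rotation keeps files to ~6K packets, hourly to ~360K, and daily balloons to ~8.6M, which is why minute/hour rotation keeps the repeated-entry problem small.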

