I don't think this has been mentioned...
Some servers refuse perfectly good requests that arrive with certain UA
strings (e.g., Wikipedia accessed from a special-purpose client built with
Python libraries that send a built-in default UA string; some servers also
reject 'wget'). Of course the Python client can 'lie' with trivial effort
and use the same UA string that an 'approved' browser would use; the
results then flow back just as if the genuine browser had asked (a minimal
sketch follows at the end of this message). I can't remember whether
Wikipedia refused anything that didn't appear to come from a well-known
browser, or whether it was sensitive specifically to the Python case.

Regards,
Elwyn

Sent from my ASUS Pad

Karl Dubost <[email protected]> wrote:

>Nicolas Mailhot [2013-09-16T12:34]:
>> User-Agent is invaluable for filtering out pathologic web clients in a
>> network without bothering legitimate users.
>
>* At which level do you need to filter out? (proxy, server, …)
>* Which type of clients? (bot, browser, etc.?)
>* Why do you need to filter out?
>* What are you looking for in the UA string?
>
>/me is interested to know. I'm kind of collecting how the UA string is
>used out in the wild in good and bad ways.
>
>--
>Karl Dubost
>http://www.la-grange.net/karl/
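P.S. For the curious, the sketch I promised above, assuming Python 3's
urllib (the target URL and the browser UA string are illustrative only,
not any particular server's policy):

    import urllib.error
    import urllib.request

    URL = "https://en.wikipedia.org/wiki/HTTP"  # example target

    # First attempt: the library's built-in default UA
    # ("Python-urllib/3.x"), which some servers refuse outright.
    try:
        with urllib.request.urlopen(URL) as resp:
            print("default UA:", resp.status)
    except urllib.error.HTTPError as e:
        print("default UA refused:", e.code)

    # Second attempt: the same request carrying a mainstream browser's
    # UA string. At the HTTP level it is indistinguishable from the
    # real browser, so the response comes back as if the browser asked.
    browser_ua = ("Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
                  "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36")
    req = urllib.request.Request(URL, headers={"User-Agent": browser_ua})
    try:
        with urllib.request.urlopen(req) as resp:
            print("browser UA:", resp.status)
    except urllib.error.HTTPError as e:
        print("browser UA refused:", e.code)

The point being that the only difference between the two requests is a
header the client is free to set to anything at all.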
