Like you, that's what I thought initially, until the report from the security scanning application hit me.
For each simulated attack (including null characters and other special characters), our *.do URLs were showing errors and exceptions in all their glory, right on the web page.
Take a look at the web.xml <error-page> tag. Might help out with that a bit.
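Something along these lines, for instance (the /error.jsp location is just a placeholder for whatever generic error page you want shown):

    <!-- catch-all mappings so stack traces never reach the browser -->
    <error-page>
        <exception-type>java.lang.Throwable</exception-type>
        <location>/error.jsp</location>
    </error-page>
    <error-page>
        <error-code>500</error-code>
        <location>/error.jsp</location>
    </error-page>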
[snip]
And what I liked about my SafeValidatorForm was this: once the form's values get past it, I'm in la-la-land, and so are the other developers in the entire team, whose capabilities range from you-know-where to a full 10.
Its gatekeeping was something like the Java security policy checks: no code changes, if I'm allowed to stretch things a bit in my favor. Maybe it's my style, but I prefer hooks and filters rather than enhanced capabilities.
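Stripped down, and with the character checks and message key made up for this example, the gatekeeping looks roughly like this (the real class does a bit more):

    import java.util.Enumeration;
    import javax.servlet.http.HttpServletRequest;
    import org.apache.struts.action.ActionErrors;
    import org.apache.struts.action.ActionMapping;
    import org.apache.struts.action.ActionMessage;
    import org.apache.struts.validator.ValidatorForm;

    /**
     * Every form bean in the app extends this instead of ValidatorForm.
     * It screens all request parameters for dangerous characters before
     * the normal validation.xml rules ever run.
     */
    public class SafeValidatorForm extends ValidatorForm {

        public ActionErrors validate(ActionMapping mapping,
                                     HttpServletRequest request) {
            ActionErrors errors = new ActionErrors();
            Enumeration names = request.getParameterNames();
            while (names.hasMoreElements()) {
                String name = (String) names.nextElement();
                String[] values = request.getParameterValues(name);
                for (int i = 0; values != null && i < values.length; i++) {
                    if (containsUnsafeCharacters(values[i])) {
                        // "errors.invalid.input" is just an example message key
                        errors.add(name,
                                new ActionMessage("errors.invalid.input", name));
                    }
                }
            }
            if (!errors.isEmpty()) {
                return errors;  // reject before anything else sees the values
            }
            return super.validate(mapping, request);  // then the usual rules
        }

        private boolean containsUnsafeCharacters(String value) {
            if (value == null) {
                return false;
            }
            for (int i = 0; i < value.length(); i++) {
                char c = value.charAt(i);
                // null characters, markup, etc. -- example list only
                if (c == '\0' || c == '<' || c == '>') {
                    return true;
                }
            }
            return false;
        }
    }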
I guess what you are getting at is that you don't have to worry about a programmer forgetting to put in the call to the validation stuff (for example, forgetting to put our fictional invalidCharacters validation on a form field in validation.xml or something).
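Something like this entry, I mean (form and field names made up here, and invalidCharacters is the fictional validator from before):

    <form name="loginForm">
        <field property="username"
               depends="required,invalidCharacters">
            <arg0 key="loginForm.username"/>
        </field>
    </form>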
Just hope they don't break your programming policies and extend ValidatorForm (or just use DynaValidatorForm!) instead of extending SafeValidatorForm :)
And with that single class, I reduced the scanning report down to one heavenly page for all the applications. OK, truthfully ;-), to get down to one page we also replaced <!-- with <%-- and took out the JavaScript validation that Struts does provide :-( since these scanners treat any JavaScript as a potential problem.
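To spell out the comment change: an HTML comment is shipped to the browser, while a JSP comment is stripped on the server, so the scanner never sees it.

    <!-- this HTML comment ends up in the page source the scanner reads -->
    <%-- this JSP comment never leaves the server --%>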
That seems very strict... no JavaScript at all? I can see it flagging some JavaScript as bad, but that is ridiculous IMHO. I feel for ya...
Matt