On Monday 27 September 2010, Rich Bowen wrote:
> As much as we dislike .htaccess files, they certainly seem to be a
> necessary evil. You can disable them (although not at compile time),
> but this is simply not an option for many folks. Educating them about
> when it's better to use the main config rather than .htaccess files is
> the job of documentation, and we've done a somewhat poor job of that.
> Perhaps that can be improved. The largest problem here is the HUGE
> number of third-party sites peddling bad advice, and I honestly have no
> idea how to address that problem.
We could disable it by default in 2.4. But there would be lots of
screaming that we disabled a "security feature" in the default
configuration :-/

> It's been frequently suggested that .htaccess files could be improved
> via some kind of "cache and only reload if the timestamp has changed"
> mechanism, but in my benchmarking, simply stat'ing a .htaccess file
> (even in cases when there's no file there to begin with) accounts for
> an awful lot of the performance hit of "AllowOverride All", so I don't
> know whether this would really be a solution.

Was that benchmarking done back in the 1.3 days? My experience is that
parsing .htaccess is *a lot* more heavyweight in 2.x than it used to be
in 1.3. We once upgraded a web server from 1.3 to 2.0 that had lots of
Redirects in the /.htaccess, and performance dropped to 50% of what it
had been. After we rearranged the redirects into a RewriteMap file,
which is only re-read when it changes, performance went back up (to
slightly faster than it had been with 1.3).

> For the most part, folks who need to use .htaccess files are not in a
> position to really do much in the way of performance tuning, and vice
> versa.
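
For what it's worth, "disable it by default" would just mean shipping
the stock config with per-directory overrides off, roughly like this
(the directory path is only illustrative):

    # Stock httpd.conf: no .htaccess processing at all, so httpd never
    # has to stat() or parse .htaccess files on each request.
    <Directory "/var/www/htdocs">
        AllowOverride None
    </Directory>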

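And the RewriteMap arrangement I described was roughly the following (a
sketch, not our actual config; the paths, the map name, and the map
entries are made up for illustration):

    # Main server / vhost config (RewriteMap cannot go in .htaccess).
    # A txt: map is cached in memory and only re-read when the map
    # file's mtime changes, so editing the redirects stays cheap.
    RewriteEngine On
    RewriteMap redirects txt:/etc/apache2/redirects.map

    # Redirect any URL-path that has an entry in the map; everything
    # else falls through untouched.
    RewriteCond ${redirects:$1|NONE} !=NONE
    RewriteRule ^(.*)$ ${redirects:$1} [R=301,L]

    # /etc/apache2/redirects.map, one "old-path new-target" pair per line:
    #   /old/page.html      /new/page.html
    #   /old/contact.html   https://example.org/contact/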