[email protected] writes:

> But in cases where you *are* recursively downloading and using
> --page-requisites, it would be polite to otherwise obey the robots
> exclusion standard by default.  Which you can't do if you have to use
> -e robots=off to ensure all requisites are downloaded.
It seems a good idea to handle this case: when -r and --page-requisites are used together, wget shouldn't obey the robots exclusion directives for the page requisites.  Until then, the quoted workaround applies; see the example below.
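
A minimal sketch of the current workaround, using the flags discussed above (the URL is just a hypothetical example):

    # -e robots=off disables robots exclusion handling for the whole run
    wget -e robots=off -r --page-requisites http://example.com/page.html

Note that -e robots=off turns off robots handling for the entire recursive download, which is broader than exempting only the page requisites as proposed here.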
Thanks,
Giuseppe