On Wed, Jun 7, 2017 at 3:50 PM, Warren Young <war...@etr-usa.com> wrote:
> On Jun 7, 2017, at 7:42 AM, Johan Kuuse <jo...@kuu.se> wrote:
>>
>> 2. I want to validate the web pages: validate the HTML, check for
>> broken links, etc., using for example the W3C validation tools.
>
> If you’re using something like curl or wget to pull the web pages, there’s 
> typically a way to set up a “cookie jar” so that you can log in with one HTTP 
> request, then make the remaining HTTP requests as that user, with the HTTP 
> client automatically sending the necessary session cookie.
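
For reference, that cookie-jar setup with curl would look roughly like
this (the login URL and form field names below are placeholders, not
necessarily Fossil's actual ones):

# log in once, saving the session cookie to a jar
curl -c cookies.txt -d 'u=alice&p=secret' http://localhost:8080/login
# later requests send the saved cookie back automatically
curl -b cookies.txt http://localhost:8080/timeline > timeline.html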


Thanks for the suggestion, but I wanted to avoid both a running web
server and the cookie-jar setup.
"fossil test-http" made my day: it reads a raw HTTP request on stdin
and writes the complete HTTP response to stdout, no server needed.
This grabs the HTML (including the HTTP response header) of every
built-in web page, using the output of 'fossil help -w' as the list
of page names:

# dump every built-in web page (HTTP response header included) into fhtml/
mkdir -p fhtml
for w in $(fossil help -w); do
  printf 'GET %s HTTP/1.0\n\n' "$w" | fossil test-http > "fhtml/$w.html"
done

BR,
Johan