Ken Stephens wrote:
> Rod
>
> wget -rv <site>
>
> -r recursive
> -v verbose
>
> That will get the website, but not the configuration and certificate files.
> Those must be deduced from the site.
>
> Regards,
> Ken
>
> On Fri, Jan 12, 2018 at 10:59 AM, Roderick Anderson <[email protected]>
> wrote:
>
>> We recently lost our webmaster for the local user group.  He died suddenly
>> in the night.  No warning at all.
>>
>> We didn't have a backup plan in place so I'm researching how to at least
>> get a static copy of the web site.
>>
>> The domain name is under one person's name, and the DNS is under my control,
>> but the site is actually hosted on the webmaster's personal account
>> somewhere else.
>>
>> I'm thinking wget but open to other suggestions.
>>
>>
>> When I taught Intro to Computers at the local community college, I used to
>> ask my students, "How do you describe someone who doesn't do regular
>> backups?"  "Really sorry!"
>>    Now I need to add "doesn't have a disaster plan?"  Yup, the same.
>>
>>
>> TIA,
>> Rod
Guessing here ... I know that if you view the site/page in a browser, you can
show the source, and you can print (to file) both of those views.
Is it possible that one or both of those might help someone deduce the unknown
configuration and/or certificate information?  A couple of commands that might
help with both jobs are sketched below.
Hope that helps, or at least teaches me something.
Regards
Fred James
