Have you considered grepping through the files for the hardcoded portion of
the URL that is the same across the board, and changing it to the location
of the dev box?  It might take a *nix box an hour or so, but it should
catch nearly all of them.  The only true way to be sure is to search all
the code line by line.  If it is that important that you don't hit the
prod box, I would spend the time and effort to do so.
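Something like the sketch below is what I have in mind.  The hostnames
(www.prod-site.com, dev.internal) are placeholders -- swap in your real
prod and dev values.  It builds a tiny demo file so you can see it work:

```shell
# Hypothetical hostnames -- substitute your real prod and dev values.
PROD='www.prod-site.com'
DEV='dev.internal'

# Demo tree so the commands below have something to chew on.
mkdir -p site
printf '<a href="http://%s/login.php">Login</a>\n' "$PROD" > site/index.html

# List every file under the site root that still references prod,
# then rewrite the hardcoded host in place (keeping .bak backups).
grep -rl "$PROD" site | while read -r f; do
    sed -i.bak "s|$PROD|$DEV|g" "$f"
done

# Verify nothing slipped through; no hits here means a clean sweep.
grep -rn "$PROD" site
```

As a second safety net, the /etc/hosts idea from your mail works too: an
entry on the scanning machine pointing the prod hostname at the dev box's
IP (e.g. `10.0.0.5  www.prod-site.com`) forces that name to resolve
locally, so even a link the sweep missed never leaves for prod.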

Also, a year or so back Paul and Larry talked about a way to clone a site,
pull it down to a local machine, and run all the pages locally, but I
can't recall exactly what they did.  Sorry, hopefully Bugbear, Paul, or
Larry can recall that one to help you.  My old age and all, I don't recall
as well as I used to.  What were we discussing anyway :-)

- Robert
(arch3angel)

On Wed, Oct 31, 2012 at 12:15 PM, Patrick Laverty <[email protected]> wrote:

> Ok, newbie here...
>
> I was asked to scan a web site that we were told is vulnerable. So I'm
> copying the site over to my Dev server and each time I manually click
> on links, I see it sends my request to production. I went through the
> .htaccess file and changed everything to point to my Dev server. It
> still goes to prod. I dig in a little further and sure enough, most of
> the links in the hundreds of pages are hardcoded to the prod site.
>
> What's the safest way to get around this? Set the /etc/hosts file on
> my scanning machine to point to my Dev server? I want to make 100%
> sure that my scan never hits the production server.
>
> Suggestions?
>
> Thank you.
> _______________________________________________
> Pauldotcom mailing list
> [email protected]
> http://mail.pauldotcom.com/cgi-bin/mailman/listinfo/pauldotcom
> Main Web Site: http://pauldotcom.com
>