1. When your Web site contains a blog (which may not be an issue for
Geoff's friend's Web site), the local-copy upload method is not
feasible unless it is designed to skip the blog part.

2. The local copy upload method does not alert you when vandalism has
actually occurred.
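A minimal sketch of how such an alert could work (the file names and the
idea of keeping a "baseline" directory of known-good copies are my own
illustrative assumptions, not anything agreed in this thread): each live
page is fetched and diffed against the last known-good local copy, so a
non-empty diff both reveals injected links and tells you that vandalism
has actually occurred.

```python
#!/usr/bin/env python3
"""Sketch: fetch a page and diff it against a known-good local copy.
The baseline-directory layout and URLs are illustrative assumptions."""
import difflib
import pathlib
import urllib.request

def diff_pages(saved_lines, live_lines, name="page"):
    """Unified diff between the saved copy and the live page."""
    return list(difflib.unified_diff(saved_lines, live_lines,
                                     fromfile=name + " (saved)",
                                     tofile=name + " (live)",
                                     lineterm=""))

def check_page(url, baseline_file):
    """Download url and report lines that differ from the local copy."""
    with urllib.request.urlopen(url) as resp:
        live = resp.read().decode("utf-8", errors="replace").splitlines()
    saved = pathlib.Path(baseline_file).read_text(encoding="utf-8").splitlines()
    return diff_pages(saved, live, name=str(baseline_file))

if __name__ == "__main__":
    # In-memory demonstration; a real run would call check_page()
    # for every page of the site (e.g. from a cron job).
    saved = ["<html>", "<body>News</body>", "</html>"]
    live = ["<html>", "<body>News</body>",
            '<a href="http://bad.example/">injected</a>', "</html>"]
    for line in diff_pages(saved, live, name="index.html"):
        print(line)
```

An empty diff means the page is unchanged; anything else can be mailed
to the site owner as an alert.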

On Mon, 2008-01-28 at 09:48 +0200, Shahar Dag wrote:
> Hi
> 
> I would prefer to maintain a local copy of the web + once a day (using cron) 
> to upload it to the web server
> (or even better, maintain a SVN server that hold the local copy of the web)
> 
> Shahar
> ----- Original Message ----- 
> From: "Omer Zak" <[EMAIL PROTECTED]>
> To: "linux-il" <linux-il@cs.huji.ac.il>
> Sent: Monday, January 28, 2008 9:15 AM
> Subject: Re: Finding porn links in hacked web pages
> 
> 
> > The method which I use is to:
> > 1. Perform periodic backup of the entire Web site, including SQL dumps
> > of any databases driving it.
> > 2. Download the backup files to PC.
> > 3. Open them (into a subdirectory and import into a new DB instance,
> > respectively).
> > 4. Run 'diff' between the opened files and the previous backup.
> >
> > For regular files, use 'diff'.  For DB comparison of two MySQL DBs, I
> > use a Python script, which I wrote.
> >                                           --- Omer
> >
> > On Mon, 2008-01-28 at 09:03 +0200, Geoffrey S. Mendelson wrote:
> >> Yesterday my wife went to a perfectly normal web page and after
> >> a few seconds a porn page replaced it.
> >>
> >> I looked at the HTML page source and found that at the bottom of the
> >> page were hundreds of links, which did not belong there. I called
> >> the publisher of the page, and he determined that his server had been
> >> "hacked" and the links added.
> >>
> >> He is not technically inclined at all, and does not have the ability
> >> to check his pages without going to each one in a browser and looking
> >> at the page source. He has thousands of pages and runs the site as
> >> a Jewish news site, with no income.
> >>
> >> I was thinking that I could write a program that scans each of his
> >> web pages using wget or lynx to download them, but don't want to
> >> start writing code if it has been already done.
> >>
> >> Any suggestions?
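The scanner Geoffrey asks about could start as small as this sketch: parse
each downloaded page and flag links whose host is not the site's own.  The
allowed-hosts set and the sample HTML are my own illustrative assumptions.

```python
#!/usr/bin/env python3
"""Sketch: flag links on a page that point outside the site's own
domain.  ALLOWED_HOSTS is an illustrative assumption."""
from html.parser import HTMLParser
from urllib.parse import urlparse

ALLOWED_HOSTS = {"", "example.com", "www.example.com"}  # "" = relative links

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag seen while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def suspicious_links(html_text):
    """Return hrefs whose host is not in ALLOWED_HOSTS."""
    parser = LinkCollector()
    parser.feed(html_text)
    return [link for link in parser.links
            if urlparse(link).netloc.lower() not in ALLOWED_HOSTS]

if __name__ == "__main__":
    page = ('<p><a href="/news.html">news</a>'
            '<a href="http://spam.example/x">x</a></p>')
    print(suspicious_links(page))  # only the off-site link is reported
```

Pages fetched with wget (or read from a local copy) can be fed through
suspicious_links(); thousands of pages take seconds, and only pages with
off-site links need a human look.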
-- 
MS-Windows is the Pal-Kal of the PC world.
My own blog is at http://www.zak.co.il/tddpirate/

My opinions, as expressed in this E-mail message, are mine alone.
They do not represent the official policy of any organization with which
I may be affiliated in any way.
WARNING TO SPAMMERS:  at http://www.zak.co.il/spamwarning.html

