maybe you should work with CVS to manage your website, and then just back up 
the CVS repository
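
If the repository lives on a machine you control, something like this in a nightly cron job might do it (a rough sketch; /var/cvs and /backups are just guesses, point them at your actual paths):

```shell
#!/bin/sh
# Nightly CVS repository backup -- a sketch, not tested against your setup.
CVSROOT_DIR=/var/cvs      # guess: where the repository lives
BACKUP_DIR=/backups       # guess: where the tarballs should go
STAMP=`date +%Y%m%d`

# A CVS repository is just files on disk, so a tarball of the whole
# CVSROOT directory is a complete backup of every revision.
tar czf "$BACKUP_DIR/cvsroot-$STAMP.tar.gz" \
    -C "`dirname $CVSROOT_DIR`" "`basename $CVSROOT_DIR`"
```

That sidesteps the FTP permission problem entirely, since the backup runs on the server side.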

On Saturday 14 June 2003 15:56, Shawn wrote:
> Hi ya.
>
> I've just put together a script file that will call wget to back up one of
> the websites I manage, and built a cron job to run this every night.  The
> problem I'm having is in accessing a specific ftp folder.
>
> The host the site is with has assigned our user account a default
> directory, so when I log in my path is something like
> "ftp.mysite.com/html".  There is a database file that is one level above
> this "ftp.mysite.com/database" (assuming I got the path right).  The
> database folder has restricted access so that web users cannot download the
> database file, or even see the folder. When I use a tool like SmartFTP, I
> can access both folders without having to log in.  However, when I open up a
> command line ftp session on my linux box, I cannot connect to the database
> folder - I get permission denied, even though I'm logged in with the
> correct user account.  Any thoughts on this?
>
> The command I'm feeding into wget right now to back up the html files is
> this:
>
> wget -N -r -i source_files.txt
>
> My source_files.txt contains these lines:
>
> ftp://username:[EMAIL PROTECTED]/
> ftp://username:[EMAIL PROTECTED]/database
>
> The second line fails.
>
> The problem is connecting to the database folder so I can back up the
> database file.  Once I'm able to do this, I can apply it to wget, and
> back up the site properly.  I guess an alternative would be to use an FTP
> script instead of wget, but the connection problem still exists for the
> database folder.  If it helps any, the host is running Serv-U FTP-Server
> v2.5m
>
> Thanks for any suggestions/tips.  It feels like I've missed something small
> somewhere, but I'm not seeing it right now.....
>
> Shawn
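
For what it's worth: wget treats an FTP path as relative to the login 
directory, and if I remember the manual right you can force an absolute 
path by URL-encoding the leading slash as %2F. So, as an untested guess, a 
line like this in source_files.txt might reach the sibling folder:

```
ftp://username:[EMAIL PROTECTED]/%2Fdatabase/
```

That asks the server for /database rather than html/database, but it only 
helps if the server actually grants your account read access there.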
