I'm no Linux guru, but I think this would involve using cron to run several
scripts, say once or twice per week.
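For instance, a crontab entry along these lines would kick the whole thing
off at 4 AM every Monday and Thursday (the times and the script path are
just examples, adjust to taste):

```
# min hour day month weekday  command
0     4    *   *     1,4      /home/glen/getfiles.sh
```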
One script would connect to your ISP.
Then the next would run FTP to download the files you need. (You'd need some
kind of program to run the transfer non-interactively. I *remember* seeing
one on freshmeat.net a couple of months ago, I just can't remember what it
was called.)
And then a third script to disconnect, or just let the connection time out.
These scripts could also be rolled into one; otherwise, just tell cron to run
them one after another.
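Putting the three steps together, one script might look something like this.
It's only a sketch: "myisp", "www.example.com", the login "glen", and the
file paths are placeholders for your own setup, and it assumes wget is
installed (wget can fetch FTP URLs non-interactively, which covers the job
of that freshmeat tool I couldn't remember). The commands are prefixed with
echo as a dry run, so you can see what it would do; drop the echoes to make
it actually dial and download.

```shell
#!/bin/sh
# getfiles.sh -- dry-run sketch; names and paths are placeholders.

REMOTE=ftp://glen@www.example.com   # wget can read the password from ~/.netrc
FILES="public_html/index.html logs/access_log"
DESTDIR=$HOME/mirror

# 1. Bring the PPP link up (whatever your working auto-dial setup uses):
echo pppd call myisp

# 2. Fetch each file non-interactively into $DESTDIR:
mkdir -p "$DESTDIR"
for f in $FILES; do
    echo wget -P "$DESTDIR" "$REMOTE/$f"
done

# 3. Hang up instead of waiting for the idle timeout:
echo killall pppd
```

If you'd rather keep the connect/download/disconnect steps as three separate
scripts, cron can run them back to back just as easily.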
Hope this helps.
Sean Conway
----- Original Message -----
From: Glen Lee Edwards <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Sunday, June 27, 1999 11:32 AM
Subject: Downloading remote files
> I have a www server located a thousand miles from me that has a couple
> domains I own on it. My personal PC (Linux only) is dial up to a local
> ISP which is NOT my domain server. My job keeps me away from the house
> for sometimes up to 2+ weeks at a time. I need some kind of automated
> system that will allow my PC to dial up my ISP, and then download certain
> files from my domain server. I already have the auto dial-in working.
>
> My question is, what do I have to do so my PC will access these remote
> files and download them. BTW, I don't know if the server is running any
> kind of remote file server program. Both computers are running Red Hat.
>
> Glen
>
> Glen Lee Edwards
> [EMAIL PROTECTED]
>
> "Linux, giving you the freedom to make the choice."