I have a website on a UNIX server which includes static pages and pages 
generated dynamically by Perl scripts that pull content from flat-file 
databases.

Until now, the site was not officially online, which meant that all editing 
could be done directly on the original files on the server.
Now it's public, which of course means I should perform all editing 
offline. I'm pondering the most efficient way to do that, and how to use 
Perl for what would basically amount to copying files back and forth, 
changing directory parameters in the process and comparing file versions 
(by modification date and time, presumably). This applies to all file 
types involved - html, pl, cgi.
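The compare-by-date-and-time step could be done with the core File::Find 
and File::Copy modules. Here is a minimal sketch, assuming you want to 
copy back only files whose modification time is newer in the working 
copy; the paths and the sub name are placeholders, not anything from an 
existing module:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Copy qw(copy);
use File::Basename qw(dirname);
use File::Path qw(make_path);
use File::Spec;

# Copy every file that is newer in $work than in $live, preserving
# the relative directory layout.  Returns the relative paths copied.
sub sync_newer {
    my ($work, $live) = @_;
    my @copied;
    find(sub {
        return unless -f $_;
        my $rel  = File::Spec->abs2rel($File::Find::name, $work);
        my $dest = File::Spec->catfile($live, $rel);
        my $src_mtime = (stat $File::Find::name)[9];
        my $dst_mtime = -e $dest ? (stat $dest)[9] : 0;
        if ($src_mtime > $dst_mtime) {
            make_path(dirname($dest));   # create missing subdirectories
            copy($File::Find::name, $dest) or die "copy $rel: $!";
            push @copied, $rel;
        }
    }, $work);
    return @copied;
}
```

You would call it as, say, sync_newer('/home/birgit/site-work', 
'/usr/local/www/site') - both paths being whatever your real layout is. 
Note that comparing by mtime alone will miss edits made directly on the 
live copy, so it only works if you discipline yourself to edit in one 
place.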

Two possibilities:
- copy everything to a desktop machine, changing all directory 
parameters in the process to reflect local settings, then work on the 
files locally. When done, copy all edited files back to the server, 
resetting the directory parameters.

- copy everything to a second directory on the server, change all 
directory parameters to reflect the new settings, do all editing there, 
and once everything is done, copy all edited files back to the original 
directory, resetting the directory parameters.
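In either scheme, the "changing directory parameters" step can use 
Perl's in-place editing (the $^I variable, equivalent to perl -pi on the 
command line). A minimal sketch, assuming your scripts hard-code a 
single document-root prefix - the sub name and the example paths are 
made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Replace one directory prefix with another throughout a file,
# editing in place and keeping a .bak backup of the original.
sub retarget_file {
    my ($file, $old_prefix, $new_prefix) = @_;
    local ($^I, @ARGV) = ('.bak', $file);   # enable in-place editing
    while (<>) {
        s/\Q$old_prefix\E/$new_prefix/g;    # \Q quotes the slashes
        print;                              # goes to the rewritten file
    }
}
```

You would run it over every .pl/.cgi/.html file in the copied tree, 
e.g. retarget_file($f, '/usr/local/www/site', '/home/birgit/site-work'), 
and with the prefixes swapped when copying back. It is safer still to 
keep the prefix in one configuration file so only that file ever needs 
rewriting.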

I was wondering whether there are any modules for this particular purpose, 
or whether anyone has done something similar and can point me to some 
relevant resources.

Thanks,

Birgit Kellner





-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]