Apologies for the HTML posting - it is necessary to get the command
scripting to display correctly.  All of the commands in the script are
one-liners.

I have a situation I am trying to deal with via automation, i.e. cron.
Briefly, the story is this - my boss is paranoid about security.  A
while ago we ran an external server with an internal network
connection, secured using iptables, very strong passwords and
"deny all except local IP addresses" in hosts.allow.  In fact this group
reviewed my security last year.  There has never been a successful
penetration, but despite this the boss demanded that I disconnect the
server from the internal network so that it could only be accessed from
"outside".  We are only talking about web access here.  After a
horrendous couple of months when everything slowed to a crawl, as our
internal internet bandwidth is pretty much maxed out all of the time, he
asked me to come up with a solution which allows the staff at home to
access the server at night; during the working day, though, he wants
absolute speed.  He still will not allow the internal connection,
however.  This server is on a separate dedicated internet connection, BTW.

We have a spare server so I have set this up on the internal network and
have written some scripts which effectively shut down the internal
network, connect to the external network and allow the external server
to pull or push the data via the temporarily opened joint connection.
It is a lot of work but it does work to a point.

What is happening is that when I run the scripts via cron, the job does
not complete properly, i.e. the syncing is incomplete in both directions.
When I invoke the scripts manually from the command line they do
complete properly.  The minor scripts work properly via cron - these are
the ones which change permissions, create and delete some files, and
open and close the ports.  The most crucial ones, the pull/push scripts,
do not work properly.  One of these is below.  The biggest sync job is
the first one in the script, which must transfer 11 GB of data, but when
run via cron it sends precisely 25 MB.  As said earlier, when run from
the CLI it performs correctly.  It is as if the cron instance just races
through the script, and I cannot understand why it should do this.  I
have resorted to putting full paths to the commands into the script.  It
is impractical (I think) to run this script as separate "syncs" as the
completion time of the main one is hard to judge.  This needs to be
foolproof in case I am not there to intervene.
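To see exactly where the cron run diverges from a CLI run, a minimal
logging sketch could be added at the top of the script (the log path is
just an example, not what I actually use):

```shell
#!/bin/sh
# Sketch: send everything the script prints (stdout and stderr) to a
# log file, with timestamps, so a cron run can be compared line by line
# against a manual run.  /tmp/moodle-sync.log is an example path only.
LOG=/tmp/moodle-sync.log
exec >>"$LOG" 2>&1
echo "sync started: $(date)"
# ... the touch/chown/rsync commands would go here ...
echo "sync finished: $(date)"
```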

#!/bin/sh
/bin/touch /var/www/moodledata/1/maintenance.html  ## Flag to make the site unavailable
/bin/chown apache: /var/www/moodledata/1/maintenance.html  ## Just changes the permissions
/usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/www/moodledata/ [email protected]:/var/www/moodledata/ ; expect password ; send "mypassword\n" ; interact'  ## This one invariably does not complete
/usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/www/html/moodle/ [email protected]:/var/www/html/moodle/ ; expect password ; send "mypassword\n" ; interact'
/usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/lib/mysql/moodle/ [email protected]:/var/lib/mysql/moodle/ ; expect password ; send "mypassword\n" ; interact'

The inside server allows a root connection, and the cron job is run as
root, as is the CLI job.  The major sync jobs are run on the outside
server and, depending on the time of day, the outside web site is either
open or closed for business.  I cannot simply move this outside server
inside and then outside again, as it hosts sites for other people and so
must always be available.

Any comments would be appreciated.

Regards,

Rick

-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html