Searching for "works from command line but not via cron" shows that you're
not alone :)
Usually this stems from the fact that cron doesn't execute your various
shell initialisation scripts - ~/.profile, ~/.bashrc, ~/.bash_profile,
~/.login, etc. The most common consequence is that $PATH, when the script
runs under cron, isn't set to what you expect based on what you see when
you run it by hand. However, all sorts of other things could be different:
your ~/.bash_profile might set a non-standard umask, or set up aliases
that don't exist when the script runs under cron.
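
One quick way to see the difference is to diff the environment cron gives
you against your login shell's. A sketch, using `env -i` to simulate
cron's near-empty environment (the file paths are illustrative):

```shell
# Simulate cron's stripped-down environment and compare it with the login
# shell's; the differences (PATH especially) are the usual suspects.
env -i /bin/sh -c 'env | sort' > /tmp/cron-like-env.txt
env | sort > /tmp/shell-env.txt
diff /tmp/cron-like-env.txt /tmp/shell-env.txt || true  # a non-empty diff is expected
```

(You can also add a temporary `* * * * * env > /tmp/cron-env.txt` crontab
entry to capture the real thing.)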

Changing the shebang line to "#!/bin/sh -x" will cause each line to be
echoed as it runs; note that the trace goes to stderr, so if you change the
cronjob to append ">>/var/log/cronlog 2>&1" or similar you should get some
more information about what cron is actually running, and what the rsync
commands are doing.
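
For example, the crontab line might look like this (the script path is
assumed); the `2>&1` matters because the `-x` trace goes to stderr:

```shell
# Assumed crontab entry -- adjust the path and schedule to taste:
#   0 2 * * * /root/bin/push-sync.sh >>/var/log/cronlog 2>&1
# You can also turn tracing on inside the script rather than the shebang:
set -x                     # echo each command before it runs
echo "PATH is: $PATH"      # worth logging: this is what cron actually gave you
set +x
```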

A random guess, aside from the things I've mentioned above: the shebang
line stipulates /bin/sh, and I'd bet you're not using /bin/sh as your
interactive shell. On some systems /bin/sh is bash; on other systems it's
something different - maybe dash, maybe ash, maybe bash running in
sh-compatibility mode - and each accepts a slightly different dialect.
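
It's easy to check what /bin/sh really is on each box; a quick probe
(works on typical Linux systems):

```shell
# On many Linux systems /bin/sh is a symlink -- see where it points:
ls -l /bin/sh
# Ask the shell itself: non-bash shells leave $BASH_VERSION unset,
# which is itself the answer.
/bin/sh -c 'echo "BASH_VERSION=${BASH_VERSION:-not bash}"'
```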

Another random guess: if the cronjob runs as a user who doesn't have
192.168.0.20 in their ~/.ssh/known_hosts file, ssh will stop and ask
whether you really want to connect - and under cron there's no terminal to
answer it. It's possible that when you do it by hand you have a
pre-populated ~/.ssh/known_hosts, but the cronjob runs as a different user
who doesn't have that.
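
If that's the culprit, you can pre-seed known_hosts non-interactively. A
sketch (the IP is from your post; run the keyscan once as whichever user
the cron job runs under):

```shell
# Make sure the directory exists with the conventional strict permissions:
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Grab the host key once, so ssh never stops to ask about it:
#   ssh-keyscan 192.168.0.20 >> ~/.ssh/known_hosts
# For unattended jobs, make ssh fail loudly instead of prompting:
#   ssh -o BatchMode=yes [email protected] true
```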

On Sun, May 10, 2009 at 2:14 PM, Rick Phillips <[email protected]> wrote:

> Apologies for the HTML posting - it is necessary to get the command
> scripting to show more correctly.  All of the commands in the script are
> one liners.
>
> I have a situation I am trying to deal with via automation, i.e. cron.
> The story to be brief is this - my boss is paranoid about security.  A
> little time ago we ran an external server with an internal network
> connection controlled securely using iptables, very strong passwords and
> "deny all except local ip addresses" in hosts allow.  In fact this group
> reviewed my security last year.  There has never been a successful
> penetration but despite this, the boss demanded that I disconnect the
> server from the internal network so that it could only be accessed from
> "outside".  We are only talking about web access here.  After an
> horrendous couple of months when everything slowed down to hell as our
> internal internet bandwidth is pretty much maxed out all of the time, he
> asked me to come up with a solution which allows the staff at home to
> access the server at night but during the working day, he wants absolute
> speed.  He still will not allow the internal connection however.  This
> server is on a separate dedicated internet connection BTW.
>
> We have a spare server so I have set this up on the internal network and
> have written some scripts which effectively shut down the internal
> network, connect to the external network and allow the external server
> to pull or push the data via the temporarily opened joint connection.
> It is a lot of work but it does work to a point.
>
> What is happening is that when I run the scripts via cron, the job does
> not complete properly i.e. the syncing is incomplete in both directions.
> When I invoke the scripts manually from the command line they do
> complete properly.  The minor scripts work properly via cron - these are
> the ones which change permissions and create and delete some files and
> open and close the ports.  The most crucial ones do not work properly
> and these are the pull/push scripts.  One of these is below.  The
> biggest sync job is the first one in the script which must transfer 11GB
> of data but when run via cron sends precisely 25MB.  As said earlier, when
> run from the CLI it performs correctly.  It's as if the cron instance
> just races through the script and I can't understand why it should do
> this.  I have resorted to putting full paths into the script for the
> commands.  It is impractical (I think) to run this script as separate
> "syncs" as the completion time of the main one is hard to judge.  This
> needs to be foolproof in case I am not there to intervene.
>
> #!/bin/sh
> /bin/touch /var/www/moodledata/1/maintenance.html  ## Flag to make the
> site unavailable
> /bin/chown apache: /var/www/moodledata/1/maintenance.html  ## Just
> changes the permissions
> /usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/www/moodledata/
> [email protected]:/var/www/moodledata/ ; expect password ; send
> "mypassword\n" ; interact'  ## This one invariably does not complete
> /usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/www/html/moodle/
> [email protected]:/var/www/html/moodle/ ; expect password ; send
> "mypassword\n" ; interact'
> /usr/bin/expect -c 'spawn /usr/bin/rsync -vrux /var/lib/mysql/moodle/
> [email protected]:/var/lib/mysql/moodle/ ; expect password ; send
> "mypassword\n" ; interact'
>
> The inside server allows a root connection and the cron job is run as
> root as is the CLI job.  The major sync jobs are run on the outside
> server and depending on the time of day, the outside web site is either
> open or closed for business.  I cannot just simply put this outside
> server inside and then outside, as this outside server hosts for
> other people and so must always be available.
>
> Any comments would be appreciated.
>
> Regards,
>
> Rick
>
> --
> SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
> Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
>