Issue #16121 has been updated by Chris Henry.
Stefan -
Ahh thank you for the explanation. That was the difference between my original
manifests and this new test one - I did have other cron jobs for the www-data
user.
I am now getting my original results using the test manifest - that is, the
www-data user's job never gets removed and duplicate jobs keep getting created
for the root user:
original manifest - apply cronjob as www-data
[user@p-HOSTNAME tmp]$ sudo puppet apply -v -d ./test.pp
debug: Creating default schedules
debug: Failed to load library 'rubygems' for feature 'rubygems'
debug: Puppet::Type::User::ProviderUser_role_add: file rolemod does not exist
debug: Puppet::Type::User::ProviderLdap: true value when expecting false
debug: Puppet::Type::User::ProviderDirectoryservice: file /usr/bin/dscl does
not exist
debug: Puppet::Type::User::ProviderPw: file pw does not exist
debug: Failed to load library 'ldap' for feature 'ldap'
debug: /File[/var/lib/puppet/state/last_run_summary.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/clientbucket]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/state/graphs]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/lib]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/certs]: Autorequiring File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/crl.pem]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state/state.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/client_yaml]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/public_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/certs/ca.pem]: Autorequiring
File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/ssl/private_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/facts]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/public_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/public_keys]
debug: /File[/var/lib/puppet/ssl/certs/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/state/resources.txt]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/certificate_requests]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/private_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/private_keys]
debug: /File[/var/lib/puppet/state/last_run_report.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/private]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/client_data]: Autorequiring File[/var/lib/puppet]
debug: Finishing transaction 70318302517420
debug: Loaded state in 0.02 seconds
debug: Loaded state in 0.02 seconds
info: Applying configuration version '1346103833'
debug: /Schedule[daily]: Skipping device resources because running on a host
debug: /Schedule[monthly]: Skipping device resources because running on a host
debug: /Schedule[hourly]: Skipping device resources because running on a host
debug: Prefetching crontab resources for cron
debug: /Schedule[never]: Skipping device resources because running on a host
debug: /Schedule[weekly]: Skipping device resources because running on a host
notice: /Stage[main]//Cron[s3_logger_supervisord]/ensure: created
debug: Flushing cron provider target www-data
debug: /Stage[main]//Cron[s3_logger_supervisord]: The container Class[Main]
will propagate my refresh event
debug: /Schedule[puppet]: Skipping device resources because running on a host
debug: Class[Main]: The container Stage[main] will propagate my refresh event
debug: Finishing transaction 70318301685280
debug: Storing state
debug: Stored state in 0.29 seconds
notice: Finished catalog run in 0.49 seconds
debug: Finishing transaction 70318302764000
debug: Received report to process from p-HOSTNAME.use01.plat.priv
debug: Processing report from p-HOSTNAME.use01.plat.priv with processor
Puppet::Reports::Store
--
verify crontabs
[user@p-HOSTNAME tmp]$ sudo -u www-data crontab -l |grep -A 2
"s3_logger_supervisord$"
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
[user@p-HOSTNAME tmp]$ sudo -u root crontab -l |grep -A 2
"s3_logger_supervisord$"
--
change manifest cron job from www-data user -> root
[user@p-HOSTNAME tmp]$ sudo vim test.pp
cron {
    "s3_logger_supervisord":
        command => "/var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p /var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s 'secret_key' -k 'key'",
        user    => root,
        minute  => 40,
        hour    => 23;
    "s3_logger_supervisord_cleanup":
        command => "/bin/find /var/has/log/s3_logrotate -name '${hostname}_supervisord.*' -mtime +7 -exec rm {} \\;",
        user    => root,
        minute  => 35,
        hour    => 23;
    "test_job":
        ensure => absent,
        user   => www-data;
}
"test.pp" 18L, 597C written
--
reapply test manifest
[user@p-HOSTNAME tmp]$ sudo puppet apply -v -d ./test.pp
debug: Creating default schedules
debug: Failed to load library 'rubygems' for feature 'rubygems'
debug: Puppet::Type::User::ProviderUser_role_add: file rolemod does not exist
debug: Puppet::Type::User::ProviderLdap: true value when expecting false
debug: Puppet::Type::User::ProviderDirectoryservice: file /usr/bin/dscl does
not exist
debug: Puppet::Type::User::ProviderPw: file pw does not exist
debug: Failed to load library 'ldap' for feature 'ldap'
debug: /File[/var/lib/puppet/client_data]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/state/resources.txt]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/clientbucket]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/private_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/private_keys]
debug: /File[/var/lib/puppet/state/last_run_report.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/state/graphs]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/private_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/crl.pem]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state/state.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/public_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/certs/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/ssl/certs]: Autorequiring File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/client_yaml]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/facts]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/public_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/public_keys]
debug: /File[/var/lib/puppet/ssl]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/private]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state/last_run_summary.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/certs/ca.pem]: Autorequiring
File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/lib]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/certificate_requests]: Autorequiring
File[/var/lib/puppet/ssl]
debug: Finishing transaction 69888778679020
debug: Loaded state in 0.02 seconds
debug: Loaded state in 0.02 seconds
info: Applying configuration version '1346103868'
debug: /Schedule[daily]: Skipping device resources because running on a host
debug: /Schedule[monthly]: Skipping device resources because running on a host
debug: /Schedule[hourly]: Skipping device resources because running on a host
debug: Prefetching crontab resources for cron
debug: /Schedule[never]: Skipping device resources because running on a host
debug: /Schedule[weekly]: Skipping device resources because running on a host
notice: /Stage[main]//Cron[s3_logger_supervisord]/user: user changed 'www-data'
to 'root'
notice: /Stage[main]//Cron[s3_logger_supervisord]/target: target changed
'www-data' to 'root'
debug: Flushing cron provider target root
debug: /Stage[main]//Cron[s3_logger_supervisord]: The container Class[Main]
will propagate my refresh event
debug: /Stage[main]//Cron[s3_logger_supervisord]: The container Class[Main]
will propagate my refresh event
debug: /Schedule[puppet]: Skipping device resources because running on a host
debug: Class[Main]: The container Stage[main] will propagate my refresh event
debug: Finishing transaction 69888777846420
debug: Storing state
debug: Stored state in 0.28 seconds
notice: Finished catalog run in 0.43 seconds
debug: Finishing transaction 69888778911560
debug: Received report to process from p-HOSTNAME.use01.plat.priv
debug: Processing report from p-HOSTNAME.use01.plat.priv with processor
Puppet::Reports::Store
--
verify cron jobs (the job now exists for both www-data and root)
[user@p-HOSTNAME tmp]$ sudo -u www-data crontab -l |grep -A 2
"s3_logger_supervisord$"
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
[user@p-HOSTNAME tmp]$ sudo -u root crontab -l |grep -A 2
"s3_logger_supervisord$"
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
--
apply manifest a second time
[user@p-HOSTNAME tmp]$ sudo puppet apply -v -d ./test.pp
debug: Creating default schedules
debug: Failed to load library 'rubygems' for feature 'rubygems'
debug: Puppet::Type::User::ProviderUser_role_add: file rolemod does not exist
debug: Puppet::Type::User::ProviderLdap: true value when expecting false
debug: Puppet::Type::User::ProviderDirectoryservice: file /usr/bin/dscl does
not exist
debug: Puppet::Type::User::ProviderPw: file pw does not exist
debug: Failed to load library 'ldap' for feature 'ldap'
debug: /File[/var/lib/puppet/client_yaml]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/crl.pem]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/private_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/private_keys]
debug: /File[/var/lib/puppet/ssl/certs/ca.pem]: Autorequiring
File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/ssl/private]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/client_data]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/state/last_run_report.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/state/resources.txt]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/state/graphs]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/state/last_run_summary.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/ssl/certificate_requests]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/state/state.yaml]: Autorequiring
File[/var/lib/puppet/state]
debug: /File[/var/lib/puppet/facts]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/state]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/public_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/private_keys]: Autorequiring
File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/ssl/public_keys/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/public_keys]
debug: /File[/var/lib/puppet/clientbucket]: Autorequiring File[/var/lib/puppet]
debug: /File[/var/lib/puppet/ssl/certs/p-HOSTNAME.use01.plat.priv.pem]:
Autorequiring File[/var/lib/puppet/ssl/certs]
debug: /File[/var/lib/puppet/ssl/certs]: Autorequiring File[/var/lib/puppet/ssl]
debug: /File[/var/lib/puppet/lib]: Autorequiring File[/var/lib/puppet]
debug: Finishing transaction 69923343454980
debug: Loaded state in 0.02 seconds
debug: Loaded state in 0.02 seconds
info: Applying configuration version '1346103885'
debug: /Schedule[daily]: Skipping device resources because running on a host
debug: /Schedule[monthly]: Skipping device resources because running on a host
debug: /Schedule[hourly]: Skipping device resources because running on a host
debug: Prefetching crontab resources for cron
debug: /Schedule[never]: Skipping device resources because running on a host
debug: /Schedule[weekly]: Skipping device resources because running on a host
notice: /Stage[main]//Cron[s3_logger_supervisord]/user: user changed 'www-data'
to 'root'
notice: /Stage[main]//Cron[s3_logger_supervisord]/target: target changed
'www-data' to 'root'
debug: Flushing cron provider target root
debug: /Stage[main]//Cron[s3_logger_supervisord]: The container Class[Main]
will propagate my refresh event
debug: /Stage[main]//Cron[s3_logger_supervisord]: The container Class[Main]
will propagate my refresh event
debug: /Schedule[puppet]: Skipping device resources because running on a host
debug: Class[Main]: The container Stage[main] will propagate my refresh event
debug: Finishing transaction 69923342626060
debug: Storing state
debug: Stored state in 0.29 seconds
notice: Finished catalog run in 0.50 seconds
debug: Finishing transaction 69923343685540
debug: Received report to process from p-HOSTNAME.use01.plat.priv
debug: Processing report from p-HOSTNAME.use01.plat.priv with processor
Puppet::Reports::Store
--
verify cron jobs - 2 exist for root now
[user@p-HOSTNAME tmp]$ sudo -u www-data crontab -l |grep -A 2
"s3_logger_supervisord$"
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
[user@p-HOSTNAME tmp]$ sudo -u root crontab -l |grep -A 2
"s3_logger_supervisord$"
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key' -k 'key'
[user@p-HOSTNAME tmp]$
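The duplication above can also be spotted mechanically by counting the
"# Puppet Name:" markers that Puppet writes into each crontab. A minimal
sketch (using an inline sample dump in place of the real
`sudo crontab -l -u root` output):

```shell
# Sample crontab dump standing in for `sudo crontab -l -u root`
# (commands truncated; this is illustrative data, not the real crontab).
crontab_dump='# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket
# Puppet Name: s3_logger_supervisord_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -mtime +7 -exec rm {} \;'

# List any Puppet-managed job name that appears more than once:
# grep the marker lines, count identical ones, keep counts above 1.
printf '%s\n' "$crontab_dump" \
  | grep '^# Puppet Name:' \
  | sort | uniq -c \
  | awk '$1 > 1'
```

Run against the real crontab this would flag s3_logger_supervisord with a
count matching the number of puppetd runs since the user change.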
----------------------------------------
Bug #16121: Cron user change results in duplicate entries on target user
https://projects.puppetlabs.com/issues/16121#change-69937
Author: Chris Henry
Status: Needs More Information
Priority: Normal
Assignee: Chris Henry
Category: cron
Target version:
Affected Puppet version:
Keywords:
Branch:
Running puppetmaster 2.7.13 and puppetd 2.7.13 on CentOS 6.
We have a Puppet module that installs some scripts and creates a cron job to
pick up gzipped logs and upload them to S3. I mistakenly created the cron job
as the user 'www-data' initially, but later found out that the supervisord
daemon logs as 'root', so I changed the user of the Puppet cron job from
'www-data' to 'root'.
On a puppetd run the client successfully detects the change and returns a
notice that it is changing users for the cron job, but the end result is that
the cron job is not removed for the www-data user and a duplicate job is
created in the root user's crontab.
I didn't notice this for about a day, but when I looked I saw that there were
many duplicate entries in the root user's crontab for this job, presumably one
for each puppetd run.
The only crontab jobs on this server are managed by Puppet; no manual edits or
jobs have ever been made.
As you can see in the output below, every client run results in the cron job
still existing for the www-data user and another entry being generated in the
root user's crontab:
<pre>
client puppetd run:
[user@HOSTNAME ~]$ date
Sat Aug 25 07:52:20 UTC 2012
[user@HOSTNAME ~]$ sudo puppetd -t
info: Caching catalog for HOSTNAME
info: Applying configuration version '1345880642'
notice:
/Stage[main]/S3_logrotate::Supervisord/Cron[s3_logger_supervisord]/user: user
changed 'www-data' to 'root'
notice:
/Stage[main]/S3_logrotate::Supervisord/Cron[s3_logger_supervisord]/target:
target changed 'www-data' to 'root'
notice: Finished catalog run in 31.34 seconds
[user@HOSTNAME ~]$
---------------
'www-data' crontab after run:
[user@HOSTNAME ~]$ date
Sat Aug 25 07:53:36 UTC 2012
[user@HOSTNAME ~]$ sudo -u www-data crontab -l
# HEADER: This file was autogenerated at Fri Aug 24 20:09:06 +0000 2012 by
puppet.
# HEADER: While it can still be managed manually, it is definitely not
recommended.
# HEADER: Note particularly that the comments starting with 'Puppet Name' should
# HEADER: not be deleted, as doing so could cause duplicate cron jobs.
# Puppet Name: s3_logger_crond
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/crond -l /var/has/log/s3_logrotate/HOSTNAME_crond -s
secret_key_here -k key_here
# Puppet Name: s3_logger_nginx
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/nginx -l /var/has/log/s3_logrotate/HOSTNAME_nginx -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_api
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/data/log/api -l /var/has/log/s3_logrotate/HOSTNAME_api -s 'secret_key_here' -k
'key_here'
[user@HOSTNAME ~]$
------------------
'root' crontab after run:
[user@HOSTNAME ~]$ date
Sat Aug 25 07:54:15 UTC 2012
[user@HOSTNAME ~]$ sudo -u root crontab -l
# HEADER: This file was autogenerated at Sat Aug 25 07:53:08 +0000 2012 by
puppet.
# HEADER: While it can still be managed manually, it is definitely not
recommended.
# HEADER: Note particularly that the comments starting with 'Puppet Name' should
# HEADER: not be deleted, as doing so could cause duplicate cron jobs.
# Puppet Name: puppet_clientbucket_cleanup
15 1 * * * /usr/bin/find /var/lib/puppet/clientbucket/ -type f -mtime +14 -exec
rm {} \;
# Puppet Name: s3_logger_php-fpm_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -name 'HOSTNAME_php-fpm.*'
-mtime +7 -exec rm {} \;
# Puppet Name: s3_logger_nginx_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -name 'HOSTNAME_nginx.*' -mtime
+7 -exec rm {} \;
# Puppet Name: s3_logger_crond_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -name 'HOSTNAME_crond.*' -mtime
+7 -exec rm {} \;
# Puppet Name: s3_logger_supervisord_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -name 'HOSTNAME_supervisord.*'
-mtime +7 -exec rm {} \;
# Puppet Name: s3_logger_api_cleanup
35 23 * * * /bin/find /var/has/log/s3_logrotate -name 'HOSTNAME_api.*' -mtime
+7 -exec rm {} \;
# Puppet Name: s3_logger_php-fpm
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/php-fpm -l /var/has/log/s3_logrotate/HOSTNAME_php-fpm -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name -p
/var/has/log/supervisord -l /var/has/log/s3_logrotate/HOSTNAME_supervisord -s
'secret_key_here' -k 'key_here'
[user@HOSTNAME ~]$
------------------
manifest:
class s3_logrotate::supervisord ($secret, $key, $bucket, $path='/var/has/log/supervisord') {
    include s3_logrotate
    Cron {
        require => Class["s3_logrotate"]
    }
    cron {
        "s3_logger_supervisord":
            command => "/var/has/s3_logrotate/bin/s3_logrotate.py -b $bucket -p $path -l /var/has/log/s3_logrotate/${hostname}_supervisord -s '${secret}' -k '${key}'",
            user    => root, # this used to be www-data but supervisord logs as root
            minute  => 40,
            hour    => 23;
        "s3_logger_supervisord_cleanup":
            command => "/bin/find /var/has/log/s3_logrotate -name '${hostname}_supervisord.*' -mtime +7 -exec rm {} \\;",
            user    => root,
            minute  => 35,
            hour    => 23;
    }
}
</pre>
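Until the provider bug is fixed, one possible cleanup (a sketch, not part of
the original report) is to dedupe the crontab dump on the "# Puppet Name:"
marker, keeping only the first copy of each job, and then feed the result back
through crontab. This assumes, as in the output above, that each job's command
lines sit directly under their marker and that duplicates share the same name:

```shell
# Simulated duplicated crontab; in practice: sudo crontab -l -u root > crontab.txt
cat > crontab.txt <<'EOF'
# HEADER: This file was autogenerated by puppet.
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name
# Puppet Name: s3_logger_supervisord
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name
# Puppet Name: s3_logger_api
40 23 * * * /var/has/s3_logrotate/bin/s3_logrotate.py -b bucket_name
EOF

# Each "# Puppet Name:" marker toggles keep: 1 the first time that exact
# marker is seen, 0 on repeats; following lines inherit the flag. Lines
# before any marker (the HEADER comments) are always kept.
awk 'BEGIN { keep = 1 }
     /^# Puppet Name:/ { keep = !seen[$0]++ }
     keep' crontab.txt > crontab.deduped
cat crontab.deduped
# Review crontab.deduped, then apply with:
#   sudo crontab -u root - < crontab.deduped
```

Reviewing the deduped file before loading it is deliberate: `crontab -`
replaces the whole crontab, so a mistake here would drop unmanaged entries too.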
--
You have received this notification because you have either subscribed to it,
or are involved in it.
To change your notification preferences, please click here:
http://projects.puppetlabs.com/my/account
--
You received this message because you are subscribed to the Google Groups
"Puppet Bugs" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/puppet-bugs?hl=en.