Ottomata has uploaded a new change for review.
https://gerrit.wikimedia.org/r/90574
Change subject: Piping stderr into logfile for kraken jobs
......................................................................
Piping stderr into logfile for kraken jobs
Change-Id: If259d2406f9771b383d0eab85e8077c87fa3f24b
---
M manifests/role/analytics/kraken.pp
1 file changed, 2 insertions(+), 2 deletions(-)
git pull ssh://gerrit.wikimedia.org:29418/operations/puppet
refs/changes/74/90574/1
diff --git a/manifests/role/analytics/kraken.pp b/manifests/role/analytics/kraken.pp
index d4c0c88..2b7cf7b 100644
--- a/manifests/role/analytics/kraken.pp
+++ b/manifests/role/analytics/kraken.pp
@@ -62,7 +62,7 @@
# cron job to download any missing pagecount files from
# dumps.wikimedia.org and store them into HDFS.
cron { 'kraken-import-hourly-pagecounts':
- command => "${script} --start ${start_date} ${datadir} >> ${log_file}",
+ command => "${script} --start ${start_date} ${datadir} 2>&1 | /usr/bin/tee -a ${log_file}",
user => 'hdfs',
minute => 5,
require => Exec["${script}-exists"],
@@ -97,7 +97,7 @@
# TODO: Add -o '--auxpath /path/to/hive-serdes-1.0-SNAPSHOT.jar'
# once we know where it will be deployed.
cron { 'kraken-create-external-hive-partitions':
- command => "${script} --database ${database} ${datadir} >> ${log_file}",
+ command => "${script} --database ${database} ${datadir} 2>&1 | /usr/bin/tee -a ${log_file}",
user => 'hdfs',
minute => 15,
require => Exec["${script}-exists"],
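
The behavioral difference between the old `>> ${log_file}` form and the new `2>&1 | /usr/bin/tee -a ${log_file}` form can be sketched with a small shell snippet (a throwaway log file and dummy echo commands, not from the patch itself):

```shell
#!/bin/sh
# Sketch of the redirection change, using a temporary log file.
log_file=$(mktemp)

# Old form: '>> log' appends stdout only; stderr escapes the log
# (under cron it would end up in the job's mail instead).
{ echo "normal output"; echo "an error" >&2; } >> "$log_file" 2>/dev/null

# New form: '2>&1' merges stderr into stdout, and 'tee -a' appends the
# combined stream to the log while also passing it through.
{ echo "normal output"; echo "an error" >&2; } 2>&1 | tee -a "$log_file" >/dev/null

grep -c "an error" "$log_file"   # stderr line is captured only by the new form
rm -f "$log_file"
```

One side effect worth noting: with the pipe in place, the cron job's exit status is that of `tee`, not of `${script}`, so a failing script no longer fails the job.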
--
To view, visit https://gerrit.wikimedia.org/r/90574
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings
Gerrit-MessageType: newchange
Gerrit-Change-Id: If259d2406f9771b383d0eab85e8077c87fa3f24b
Gerrit-PatchSet: 1
Gerrit-Project: operations/puppet
Gerrit-Branch: production
Gerrit-Owner: Ottomata <[email protected]>
_______________________________________________
MediaWiki-commits mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits