[MediaWiki-commits] [Gerrit] operations/dumps[master]: gzip compress output from api jobs and abstracts dumps

2017-11-20 Thread ArielGlenn (Code Review)
ArielGlenn has submitted this change and it was merged. ( 
https://gerrit.wikimedia.org/r/392455 )

Change subject: gzip compress output from api jobs and abstracts dumps
..


gzip compress output from api jobs and abstracts dumps

We don't really care if the output is large or small, just do it
all the time, regardless of the job.

Bug: T178046
Change-Id: I4be42637b4f0ee5d99aa0ef0bf3eeec97979e8d0
---
M xmldumps-backup/dumps/apijobs.py
M xmldumps-backup/dumps/xmljobs.py
M xmldumps-backup/xmlabstracts.py
3 files changed, 5 insertions(+), 5 deletions(-)

Approvals:
  ArielGlenn: Looks good to me, approved
  jenkins-bot: Verified



diff --git a/xmldumps-backup/dumps/apijobs.py b/xmldumps-backup/dumps/apijobs.py
index ad31c55..3991add 100644
--- a/xmldumps-backup/dumps/apijobs.py
+++ b/xmldumps-backup/dumps/apijobs.py
@@ -17,7 +17,7 @@
         return "json"
 
     def get_file_ext(self):
-        return ""
+        return "gz"
 
     def run(self, runner):
         retries = 0
@@ -53,5 +53,5 @@
         properties = '|'.join(self._properties)
         api_url = "{baseurl}?action=query&meta=siteinfo&siprop={props}&format=json"
         url = api_url.format(baseurl=base_url, props=properties)
-        command = [["/usr/bin/curl", "-s", url]]
+        command = [["/usr/bin/curl", "-s", url], [runner.wiki.config.gzip]]
         return command
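For context, a nested command list such as `[["/usr/bin/curl", "-s", url], [runner.wiki.config.gzip]]` typically denotes a shell-style pipeline (curl's stdout fed into gzip's stdin). The dumps runner's actual execution code is not shown in this diff; the sketch below is a hypothetical, minimal way to run such a list with `subprocess`, assuming that pipe semantics:

```python
import subprocess

def run_pipeline(commands, outfile):
    """Run a list of argv lists as a shell-style pipeline, e.g.
    [["/usr/bin/curl", "-s", url], ["gzip"]], writing the final
    stage's stdout to outfile. Returns True if all stages exit 0."""
    procs = []
    with open(outfile, "wb") as out:
        prev_stdout = None
        for i, argv in enumerate(commands):
            last = (i == len(commands) - 1)
            proc = subprocess.Popen(
                argv,
                stdin=prev_stdout,
                stdout=out if last else subprocess.PIPE)
            if prev_stdout is not None:
                # close the parent's copy so SIGPIPE reaches earlier stages
                prev_stdout.close()
            prev_stdout = proc.stdout
            procs.append(proc)
        for proc in procs:
            proc.wait()
    return all(proc.returncode == 0 for proc in procs)
```

With the change above, the final pipeline stage is gzip, so the file written (now with a "gz" extension) is compressed regardless of its size.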
diff --git a/xmldumps-backup/dumps/xmljobs.py b/xmldumps-backup/dumps/xmljobs.py
index 8a999ec..3b6ea86 100644
--- a/xmldumps-backup/dumps/xmljobs.py
+++ b/xmldumps-backup/dumps/xmljobs.py
@@ -268,7 +268,7 @@
         return "xml"
 
    def get_file_ext(self):
-        return ""
+        return "gz"
 
     def get_variant_from_dumpname(self, dumpname):
         fields = dumpname.split("-")
diff --git a/xmldumps-backup/xmlabstracts.py b/xmldumps-backup/xmlabstracts.py
index 28abd31..c39f748 100644
--- a/xmldumps-backup/xmlabstracts.py
+++ b/xmldumps-backup/xmlabstracts.py
@@ -12,7 +12,7 @@
 import getopt
 from dumps.WikiDump import Config
 from dumps.utils import MultiVersion
-from xmlstreams import do_xml_stream, catit
+from xmlstreams import do_xml_stream, gzippit
 
 
 def do_abstractsbackup(wikidb, output_files, variants,
@@ -35,7 +35,7 @@
         if dryrun:
             outfiles[filetype]['compr'] = None
         else:
-            outfiles[filetype]['compr'] = catit(outfiles[filetype]['name'])
+            outfiles[filetype]['compr'] = gzippit(outfiles[filetype]['name'])
 
     script_command = MultiVersion.mw_script_as_array(wikiconf,
                                                      "dumpBackup.php")

-- 
To view, visit https://gerrit.wikimedia.org/r/392455
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: merged
Gerrit-Change-Id: I4be42637b4f0ee5d99aa0ef0bf3eeec97979e8d0
Gerrit-PatchSet: 1
Gerrit-Project: operations/dumps
Gerrit-Branch: master
Gerrit-Owner: ArielGlenn 
Gerrit-Reviewer: ArielGlenn 
Gerrit-Reviewer: jenkins-bot <>

___
MediaWiki-commits mailing list
MediaWiki-commits@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-commits


[MediaWiki-commits] [Gerrit] operations/dumps[master]: gzip compress output from api jobs and abstracts dumps

2017-11-20 Thread ArielGlenn (Code Review)
ArielGlenn has uploaded a new change for review. ( 
https://gerrit.wikimedia.org/r/392455 )

Change subject: gzip compress output from api jobs and abstracts dumps
..

gzip compress output from api jobs and abstracts dumps

We don't really care if the output is large or small, just do it
all the time, regardless of the job.

Bug:T178046
Change-Id: I4be42637b4f0ee5d99aa0ef0bf3eeec97979e8d0
---
M xmldumps-backup/dumps/apijobs.py
M xmldumps-backup/dumps/xmljobs.py
M xmldumps-backup/xmlabstracts.py
3 files changed, 5 insertions(+), 5 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/operations/dumps 
refs/changes/55/392455/1

Gerrit-MessageType: newchange
Gerrit-Change-Id: I4be42637b4f0ee5d99aa0ef0bf3eeec97979e8d0
Gerrit-PatchSet: 1
Gerrit-Project: operations/dumps
Gerrit-Branch: master
Gerrit-Owner: ArielGlenn 
