** Description changed:

  [Impact]

  It would be nice to collect:

  ceph osd pool autoscale-status
  ceph balancer status

  https://docs.ceph.com/docs/master/rados/operations/placement-groups/
  VIEWING PG SCALING RECOMMENDATIONS
  You can view each pool, its relative utilization, and any suggested
  changes to the PG count with this command:
  ceph osd pool autoscale-status

  https://docs.ceph.com/docs/mimic/mgr/balancer/
  STATUS
  The current status of the balancer can be checked at any time with:
  ceph balancer status

  [Test Case]

  * Install the latest sosreport found in -updates
- * Run sosreport -o ceph (version 3.X and/or 4.X) or sos report -o ceph (4.X only)
+ * Run sosreport -o ceph (version 3.X and/or 4.X) or sos report -o ceph (4.X only)
  * Look at the content inside /path_to_sosreport/sos_commands/ceph/
- * Make sure the 2 new commands are found there.
+ * Make sure the 2 new commands are found there.
+ * There will be 3 additional files, as the autoscale-status is also
+   captured in JSON format.

  [Regression Potential]

+ This patch adds two commands to the collected command output. Potential
+ regressions would include a command typo or a code typo. A command typo
+ would result in a failed command, for which sos captures the command's
+ error output. A code typo would raise an exception in the ceph plug-in,
+ halting further ceph data capture.

  [Other Info]

  [Original Description]

  It would be nice to collect:

  ceph osd pool autoscale-status
  ceph balancer status

  Upstream report: https://github.com/sosreport/sos/issues/2211
  Upstream commit: https://github.com/sosreport/sos/commit/52f4661e2b594134b98e2967b02cc860d7963fef
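The change described above amounts to registering two extra commands with the sos ceph plugin's collection list. The sketch below illustrates the mechanism only: the `Plugin` base class here is a simplified stand-in for sos's real plugin framework (the method name `add_cmd_output` follows sos conventions, but the scaffolding and the exact JSON-variant command are illustrative assumptions, not the upstream patch verbatim).

```python
# Illustrative sketch of how a sos plugin registers commands to collect.
# The Plugin class below is NOT the real sos base class; it is a minimal
# stand-in that only records which commands the plugin asks for.

class Plugin:
    """Simplified stand-in for sos's plugin base class."""

    def __init__(self):
        self.collected_cmds = []

    def add_cmd_output(self, cmds):
        # In real sos, each command is executed and its stdout (or, if the
        # command fails, its error output) is saved under
        # sos_commands/<plugin>/ in the report archive.
        self.collected_cmds.extend(cmds)


class Ceph(Plugin):
    """Illustrative ceph plugin enabling the two new status commands."""

    def setup(self):
        self.add_cmd_output([
            "ceph balancer status",
            "ceph osd pool autoscale-status",
            # Hypothetical JSON variant, to mirror the test-case note that
            # autoscale-status is also captured in JSON format.
            "ceph osd pool autoscale-status --format json",
        ])


plugin = Ceph()
plugin.setup()
print(plugin.collected_cmds)
```

With the commands registered this way, a failed `ceph` invocation (e.g. on a non-ceph host) produces a captured error file rather than aborting the plugin, which is the low-regression behavior the [Regression Potential] section relies on.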
--
https://bugs.launchpad.net/bugs/1893109

Title:
  [plugin][ceph] collect ceph balancer and pg-autoscale status
