I have been testing the proposed package for noble against the test plan
and I'm happy to report that it passes validation! See the steps I took
below.

---

Install Terraform and LXD:

    sudo snap install terraform --classic
    sudo snap install lxd
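
A quick check that both commands are available before going further:

    terraform version
    lxd --version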

Make sure you're in the `lxd` group:

    if ! getent group lxd | grep "$USER"; then
        sudo usermod -aG lxd "$USER"
        newgrp lxd
    fi

If you've not already set up LXD, do so now:

    lxd init --minimal

Start up Landscape Server, enable autoregistration, and get the
registration info (account name, root URL, and registration key).
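
If you don't already have a server to point at, one option is a throwaway
self-hosted server via the quickstart package; this is only a sketch, the
PPA name is an assumption, so check the current Landscape self-hosted docs
for your release. Autoregistration and the registration key are then
configured in the Landscape web UI after first login.

    # One option only; any reachable Landscape Server works for this test.
    # The PPA name below is an assumption -- verify it against the
    # Landscape self-hosted documentation for your release.
    sudo add-apt-repository -y ppa:landscape/self-hosted-beta
    sudo apt update
    sudo apt install -y landscape-server-quickstart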

Extract the benchmarking patch from the original test plan into a file
called `benchmark_patch.py`; it adds CPU profiling to the package
reporter:

    def main():
        original_function = '''def _compute_packages_changes(self):'''

        replacement_function = '''def _compute_packages_changes(self):
            import cProfile
            import pstats
            from datetime import datetime
            import psutil

            profile = cProfile.Profile()
            process = psutil.Process()
            start_cpu_times = process.cpu_times()
            profile.enable()

            result = self.compute_packages_change_inner()

            end_cpu_times = process.cpu_times()

            profile.disable()

            user_time = end_cpu_times.user - start_cpu_times.user
            system_time = end_cpu_times.system - start_cpu_times.system
            total_cpu_time = user_time + system_time

            output_path = "/var/lib/landscape/client/result.txt"
            with open(output_path, "a") as fp:
                now = datetime.now()
                fp.write(f"\\n--------- Run on: {now.strftime('%Y-%m-%d 
%H:%M:%S')} ---------\\n\\n")
                stats = pstats.Stats(profile, stream=fp)
                stats.strip_dirs().sort_stats("cumulative").print_stats(10)
                fp.write(f"CPU Time: {total_cpu_time}s\\n")
            return result

        def compute_packages_change_inner(self):'''

        file_path = '/usr/lib/python3/dist-packages/landscape/client/package/reporter.py'

        try:
            with open(file_path, 'r') as f:
                content = f.read()
            
            modified_content = content.replace(original_function, replacement_function)
            
            with open(file_path, 'w') as f:
                f.write(modified_content)

        except Exception as e:
            print(f"Error adding benchmarking: {str(e)}")

    if __name__ == "__main__":
        main()
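
The Terraform config below copies this script into the instances and runs
it via cloud-init, but a quick local check that the file at least parses
costs nothing:

    python3 -m py_compile benchmark_patch.py && echo "benchmark_patch.py parses OK"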

Create a file called `main.tf` and paste the following, replacing the
server and Pro token details with your own. This will create two noble
instances, one with the proposed version of Landscape Client and one with
the version currently in the noble archive. It will copy the
`benchmark_patch.py` file to both instances and run it to add the
profiling to the package reporter:

    module "landscape-client" {
        source             = "jansdhillon/landscape-client/lxd"
        pro_token          = "xxxx" # replace with real pro token
        account_name       = "onward"
        landscape_root_url = "10.1.77.207"
        registration_key   = "key"
        image_alias        = "noble"

        instances = [
            {
                image_alias = "noble"
                client_config = {
                    computer_title           = "noble-lp2099283-proposed-validator"
                    ping_interval            = 10
                    exchange_interval        = 10
                    urgent_exchange_interval = 10
                }

                landscape_client_package = "landscape-
client=24.02-0ubuntu5.6"

                files = [
                    {
                        content     = <<-EOT
                            Package: landscape-client landscape-common
                            Pin: release a=noble-proposed
                            Pin-Priority: 600
                            EOT
                        target_path = "/etc/apt/preferences.d/proposed"
                    },
                    {
                        source_path = "./benchmark_patch.py"
                        target_path = "/tmp/benchmark_patch.py"
                    }
                ]

                additional_cloud_init = <<-EOT
                            #cloud-config
                            apt:
                                sources:
                                    noble-proposed:
                                        source: "deb http://archive.ubuntu.com/ubuntu noble-proposed main restricted universe multiverse"

                            package_upgrade: true
                            packages:
                                - python3-psutil
                            runcmd:
                                - python3 /tmp/benchmark_patch.py && sleep 60
                            EOT
            },

            {
                image_alias = "noble"
                client_config = {
                    computer_title           = "noble-lp2099283-control-validator"
                    ping_interval            = 10
                    exchange_interval        = 10
                    urgent_exchange_interval = 10
                }

                landscape_client_package = "landscape-
client=24.02-0ubuntu5.3"

                files = [
                    {
                        source_path = "./benchmark_patch.py"
                        target_path = "/tmp/benchmark_patch.py"
                    }
                ]

                additional_cloud_init = <<-EOT
                            #cloud-config
                            package_upgrade: true
                            packages:
                                - python3-psutil
                            runcmd:
                                - python3 /tmp/benchmark_patch.py && sleep 60
                            EOT
            },
        ]
    }

Initialize and apply the terraform plan:

    terraform init && terraform apply -auto-approve
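
A few optional checks while that runs (or if the apply returns before
cloud-init has fully finished): confirm the instances are up, wait for
cloud-init to complete, and verify the proposed instance really picked up
the 5.6 package. The instance names match the `computer_title`s above.

    lxc list
    lxc exec noble-lp2099283-proposed-validator -- cloud-init status --wait
    lxc exec noble-lp2099283-proposed-validator -- apt-cache policy landscape-client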

Once both machines have registered and run their benchmarks (i.e. once
the apply has finished), view the results:

    printf "\nnoble-lp2099283-proposed-validator: \n"
    lxc exec noble-lp2099283-proposed-validator -- cat /var/lib/landscape/client/result.txt

    printf "\nnoble-lp2099283-control-validator (currently in archives): \n"
    lxc exec noble-lp2099283-control-validator -- cat /var/lib/landscape/client/result.txt

Example output:

    noble-lp2099283-proposed-validator:

    --------- Run on: 2025-09-16 22:01:56 ---------

            3511260 function calls (3508798 primitive calls) in 3.790 seconds

    Ordered by: cumulative time
    List reduced from 207 to 10 due to restriction <10>

    ncalls  tottime  percall  cumtime  percall filename:lineno(function)
            1    0.295    0.295    3.790    3.790 reporter.py:687(compute_packages_change_inner)
    105578    0.098    0.000    2.487    0.000 store.py:151(get_hash_id)
    129488    0.186    0.000    2.378    0.000 store.py:20(inner)
    129482    0.116    0.000    2.092    0.000 store.py:52(get_hash_id)
    129488    1.789    0.000    1.789    0.000 {method 'execute' of 'sqlite3.Cursor' objects}
    187540    0.075    0.000    0.643    0.000 facade.py:489(is_package_installed)
    194572    0.057    0.000    0.532    0.000 package.py:429(__eq__)
    194622    0.235    0.000    0.475    0.000 package.py:400(_cmp)
            1    0.020    0.020    0.302    0.302 facade.py:183(get_locked_packages)
    129482    0.188    0.000    0.188    0.000 {method 'fetchone' of 'sqlite3.Cursor' objects}


    CPU Time: 3.410000000000001s

    noble-lp2099283-control-validator (currently in archives):

    --------- Run on: 2025-09-16 22:02:16 ---------

            4135491 function calls (4133029 primitive calls) in 95.125 seconds

    Ordered by: cumulative time
    List reduced from 210 to 10 due to restriction <10>

    ncalls  tottime  percall  cumtime  percall filename:lineno(function)
            1    0.945    0.945   95.124   95.124 reporter.py:687(compute_packages_change_inner)
    182874    0.820    0.000   85.087    0.000 package.py:675(origins)
    209252    0.686    0.000   84.167    0.000 package.py:299(__init__)
    209252   83.481    0.000   83.481    0.000 {method 'find_index' of 'apt_pkg.SourceList' objects}
    100624    0.289    0.000    6.061    0.000 store.py:151(get_hash_id)
    119580    0.493    0.000    5.722    0.000 store.py:20(inner)
    119574    0.321    0.000    4.900    0.000 store.py:52(get_hash_id)
    119580    4.061    0.000    4.061    0.000 {method 'execute' of 'sqlite3.Cursor' objects}
    182586    0.247    0.000    1.981    0.000 facade.py:489(is_package_installed)
    188478    0.145    0.000    1.634    0.000 package.py:429(__eq__)


    CPU Time: 86.21000000000001s

These findings validate those in the original test plan, showing a
performance improvement of roughly 25x in the proposed version in this
sample run (about 3.4 s of CPU time versus about 86.2 s).
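
For a quick side-by-side without scrolling through the full profiles,
pulling just the CPU Time lines from both instances works:

    for i in noble-lp2099283-proposed-validator noble-lp2099283-control-validator; do
        printf "%s: " "$i"
        lxc exec "$i" -- grep "CPU Time" /var/lib/landscape/client/result.txt
    done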

Additionally, I was able to install a package, remove a package, and
upgrade all packages on the instance running the proposed version, all
through Landscape Server.
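
Those operations are driven from the Landscape web UI (or API), so there
is nothing to script on the client side; to confirm a remotely requested
install actually landed, checking dpkg inside the instance is enough
(`hello` below is just an example package name):

    # 'hello' is a stand-in for whatever package you installed from Landscape
    lxc exec noble-lp2099283-proposed-validator -- dpkg -s hello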

Cleanup:

    terraform destroy -auto-approve
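
If the destroy is interrupted or the state gets out of sync, the instances
can also be removed directly:

    lxc delete -f noble-lp2099283-proposed-validator noble-lp2099283-control-validator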

TF module used: https://github.com/jansdhillon/terraform-lxd-landscape-client/tree/main/examples/benchmark


** Tags removed: verification-needed verification-needed-noble
** Tags added: verification-done-noble
