On Tue, 15 Jul 2025 at 18:49, Fabiano Rosas <faro...@suse.de> wrote:
> @@ -57,11 +57,9 @@ static const gchar *format_time_str(uint64_t us)
>      const char *units[] = {"us", "ms", "sec"};
>      int index = 0;
>
> -    while (us > 1000) {
> +    while (us > 1000 && index + 1 < ARRAY_SIZE(units)) {
>          us /= 1000;
> -        if (++index >= (sizeof(units) - 1)) {
> -            break;
> -        }
> +        index++;
>      }
>
>      return g_strdup_printf("%"PRIu64" %s", us, units[index]);

* This loop is rather confusing.

* Is the while loop stepping the value up one unit per iteration with:
us /= 1000 ?  ie. for second-scale inputs, index mostly ends up at 2 =
"sec", except for the range us = 1000000 - 1000999, where us / 1000 =>
1000 stops the while loop and it'd return the string "1000 ms".
===
#define USEC_PER_SEC  (1000 * 1000)

    n /= USEC_PER_SEC;

    return g_strdup_printf("%"PRIu64" sec", n);
===

* Does the above simplification look right? It would always return
whole seconds as:  "<n> sec"


Thank you.
---
  - Prasad

