On Sat, Jan 25, 2025 at 03:57:14PM -0700, Simon Glass wrote:
> Hi Tom,
>
> On Sat, 25 Jan 2025 at 15:53, Tom Rini <[email protected]> wrote:
> >
> > On Sat, Jan 25, 2025 at 03:50:02PM -0700, Simon Glass wrote:
> > > Hi Tom,
> > >
> > > On Sat, 25 Jan 2025 at 15:01, Tom Rini <[email protected]> wrote:
> > > >
> > > > On Sat, Jan 25, 2025 at 02:36:52PM -0700, Simon Glass wrote:
> > > > > Hi Tom,
> > > > >
> > > > > On Sat, 25 Jan 2025 at 11:30, Tom Rini <[email protected]> wrote:
> > > > > >
> > > > > > On Sat, Jan 25, 2025 at 10:14:05AM -0700, Simon Glass wrote:
> > > > > > > Hi Tom,
> > > > > > >
> > > > > > > On Fri, 24 Jan 2025 at 12:21, Tom Rini <[email protected]> wrote:
> > > > > > > >
> > > > > > > > On Fri, Jan 24, 2025 at 11:56:04AM -0700, Simon Glass wrote:
> > > > > > > >
> > > > > > > > > Execution time varies widely with the existing tests. Provide a summary
> > > > > > > > > of the time taken for each test, along with a histogram.
> > > > > > > > >
> > > > > > > > > Example:
> > > > > > > > >
> > > > > > > > > Duration : Number of tests
> > > > > > > > > ======== : ========================================
> > > > > > > > >     <1ms : 1
> > > > > > > > >     <8ms : 1
> > > > > > > > >    <20ms : # 20
> > > > > > > > >    <30ms : ######## 127
> > > > > > > > >    <50ms : ######################################## 582
> > > > > > > > >    <75ms : ####### 102
> > > > > > > > >   <100ms : ## 39
> > > > > > > > >   <200ms : ##### 86
> > > > > > > > >   <300ms : # 29
> > > > > > > > >   <500ms : ## 42
> > > > > > > > >   <750ms : # 16
> > > > > > > > >    <1.0s : # 15
> > > > > > > > >    <2.0s : # 23
> > > > > > > > >    <3.0s : 13
> > > > > > > > >    <5.0s : 9
> > > > > > > > >    <7.5s : 1
> > > > > > > > >   <10.0s : 6
> > > > > > > > >   <20.0s : 12
> > > > > > > > >
> > > > > > > > > Signed-off-by: Simon Glass <[email protected]>
> > > > > > > > > ---
> > > > > > > > >
> > > > > > > > >  test/py/conftest.py | 77 +++++++++++++++++++++++++++++++++++++++++++++
> > > > > > > > >  1 file changed, 77 insertions(+)
> > > > > > > >
> > > > > > > > Whitespace issues aside,
> > > > > > >
> > > > > > > Do you mean the blank lines? pylint wants those for top-level
> > > > > > > items.
> > > > > >
> > > > > > That's weird but OK.
> > > > > >
> > > > > > > > we should probably not do this every time, just
> > > > > > > > when requested.
> > > > > > >
> > > > > > > I'd like to see it on each run, actually. It doesn't take any
> > > > > > > time to calculate and it lets us see where the time is going.
> > > > > >
> > > > > > We already print the full time, and I think knowing what the tests
> > > > > > are that take so long, so that we can see if there's anything to do
> > > > > > about it, would be more helpful. The general feedback about our
> > > > > > pytests today is they're already too verbose in the normal case.
> > > > >
> > > > > This is what is shown in the full output (although with CI you have to
> > > > > download the files to see it):
> > > > >
> > > > > [-] Section: Timing Report
> > > >
> > > > [snip]
> > > >
> > > > This is handy as a one-off, and maybe hints at a few tests to look at
> > > > more, but perhaps it's mostly "is what it is" tests (and not just the
> > > > sleep test).
> > >
> > > Perhaps, but I would hope we could speed some of these up, or reduce
> > > the number of variations.
> >
> > Looking at it, all I see is a sample that yes, we should see if there's
> > some way to speed up output from U-Boot to python. None of the "long"
> > tests look otherwise out of line for what they're doing.
> >
> > > > > TIME: SINCE-SECTION: 0:00:00.004227
> > > > >
> > > > > This is just the summary which is visible in CI:
> > > > >
> > > > > [-] Section: Timing Summary
> > > > > TIME: NOW: 2025/01/25 14:28:57.730179
> > > > > TIME: SINCE-PREV: 0:00:00.004234
> > > > > TIME: SINCE-START: 0:02:00.307984
> > > > > TIME: SINCE-SECTION: 0:00:00.000041
> > > > > Duration : Number of tests
> > > > > ======== : ========================================
> > > > >     <4ms : 1
> > > > >     <8ms : 1
> > > > >    <10ms : 3
> > > > >    <20ms : #### 47
> > > > >    <30ms : ######################################## 377
> > > > >    <40ms : ############################# 278
> > > > >    <50ms : ####### 69
> > > > >    <75ms : #### 47
> > > > >   <100ms : # 18
> > > > >   <200ms : ### 29
> > > > >   <300ms : # 15
> > > > >   <400ms : # 13
> > > > >   <500ms : 8
> > > > >   <750ms : 6
> > > > >    <1.0s : 2
> > > > >    <2.0s : 9
> > > > >    <3.0s : 3
> > > > >    <4.0s : 4
> > > > >    <5.0s : 2
> > > > >    <7.5s : 2
> > > > >   <20.0s : 1
> > > > >
> > > > > So what do you think would make most sense?
> > > >
> > > > By default, neither of these, and only when requested. Looking over at
> > > > one of your runs, no, I don't want to waste more output space. If you
> > > > must do something, log it off to examine later and not add to the main
> > > > output.
> > >
> > > OK, I'll just put it in my tree for now.
> >
> > Sigh.
> >
> > > BTW what do you mean by 'by default'? What other way of running would
> > > there be? Are you thinking of a CI flag?
> >
> > I mean passing some flag to pytest, like you do when you want more
> > verbose output, or less output. I don't see this as something that adds
> > value to run every time.
>
> OK, we could do that. But I really like seeing the info in gitlab.
> Perhaps we could have it appear just for the one 'sandbox' test?

OK, that's a reasonable compromise.

--
Tom
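[Editor's note: for readers following the thread, the mechanism under discussion can be sketched roughly as below. This is a hypothetical illustration, not the actual 77-line patch to test/py/conftest.py; all names here (duration_histogram, test_durations, --timing-summary) are invented for the sketch, and the bucket bounds simply mirror the report shown above.]

```python
def duration_histogram(durations, bar_width=40):
    """Bucket test durations (in seconds) and render an ASCII histogram.

    Returns a list of report lines like '   <30ms : ######## 127'.
    Durations at or beyond the last bound are ignored in this sketch.
    """
    # Bucket upper bounds in seconds, mirroring the thread's report
    bounds = [0.001, 0.008, 0.02, 0.03, 0.05, 0.075, 0.1, 0.2, 0.3,
              0.5, 0.75, 1.0, 2.0, 3.0, 5.0, 7.5, 10.0, 20.0]
    counts = [0] * len(bounds)
    for dur in durations:
        for i, bound in enumerate(bounds):
            if dur < bound:
                counts[i] += 1
                break
    biggest = max(counts) or 1  # avoid division by zero with no data
    lines = ['Duration : Number of tests',
             '======== : ' + '=' * bar_width]
    for bound, count in zip(bounds, counts):
        if not count:
            continue  # skip empty buckets, as the sample output does
        label = f'<{bound * 1000:.0f}ms' if bound < 1 else f'<{bound:.1f}s'
        bar = '#' * (bar_width * count // biggest)
        lines.append(f'{label:>8} : {bar} {count}')
    return lines


# In a pytest conftest.py this could plausibly be wired up via standard
# hooks, gated behind a flag as Tom suggests (hypothetical wiring):
#
#   test_durations = []
#
#   def pytest_addoption(parser):
#       parser.addoption('--timing-summary', action='store_true',
#                        help='print a histogram of test durations')
#
#   def pytest_runtest_logreport(report):
#       if report.when == 'call':
#           test_durations.append(report.duration)
#
#   def pytest_terminal_summary(terminalreporter):
#       if terminalreporter.config.getoption('--timing-summary'):
#           for line in duration_histogram(test_durations):
#               terminalreporter.write_line(line)

if __name__ == '__main__':
    print('\n'.join(duration_histogram([0.025, 0.04, 0.04, 0.04, 1.5])))
```

This keeps the histogram out of the default output (per the compromise above) while still letting CI opt in for a single configuration such as the sandbox run.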