Hi,

On Thu, Nov 17, 2022 at 04:57:36PM +0000, Richard Purdie wrote:
> On Thu, 2022-11-17 at 17:39 +0200, Mikko Rapeli wrote:
> > Hi,
> > 
> > On Thu, Nov 17, 2022 at 03:17:43PM +0000, Richard Purdie wrote:
> > > On Thu, 2022-11-17 at 09:12 +0200, Mikko Rapeli wrote:
> > > > Many runtime tests would need customization for different
> > > > machines and images. Currently some tests like parselogs.py are hard
> > > > coding machine specific exceptions into the test itself. I think these
> > > > machine specific exceptions fit better as image specific ones, since a
> > > > single machine config can generate multiple images which behave
> > > > differently. Thus create a "testimage_data.json" file format which image
> > > > recipes can deploy. This is then used by tests like parselogs.py to find
> > > > the image specific exception list.
> > > > 
> > > > Same approach would fit other runtime tests too. For example systemd
> > > > tests could include a test case which checks that an image specific
> > > > list of services is running.
> > > > 
> > > > I don't know how this data storage would be used with SDK or selftests,
> > > > but maybe it could work there too with some small tweaks.
> > > > 
> > > > Mikko Rapeli (2):
> > > >   oeqa: add utils/data.py with get_data() function
> > > >   oeqa parselogs.py: use get_data() to fetch image specific error list
> > > > 
> > > >  meta/lib/oeqa/runtime/cases/parselogs.py | 17 +++++++---
> > > >  meta/lib/oeqa/utils/data.py              | 41 ++++++++++++++++++++++++
> > > >  2 files changed, 54 insertions(+), 4 deletions(-)
> > > >  create mode 100644 meta/lib/oeqa/utils/data.py
> > > 
> > > This patch looks like it is one side of the equation, i.e. importing
> > > the data into the tests. How does the data get into the deploy
> > > directory in the first place? I assume there are other patches which do
> > > that?
> > 
> > Patches in other layers do that, yes.

Note to self and anyone else interested in this: it is rather
tricky to get SRC_URI and do_deploy() working in image recipes.
Something like this will do it, though:

SUMMARY = "Test image"
LICENSE = "MIT"

SRC_URI = "file://testimage_data.json"

inherit deploy

# re-enable SRC_URI handling, it's disabled in image.bbclass
python __anonymous() {
    d.delVarFlag("do_fetch", "noexec")
    d.delVarFlag("do_unpack", "noexec")
}
...
do_deploy() {
    # deploy the JSON data used to customize oeqa tests
    install -d "${DEPLOYDIR}"
    install -m 0644 "${WORKDIR}/testimage_data.json" "${DEPLOYDIR}"
}
# do_unpack is added as a dependency so that do_fetch and do_unpack,
# which are disabled by image.bbclass, actually run before do_deploy.
addtask deploy before do_build after do_rootfs do_unpack
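
On the test side, a helper could then read that deployed file back. A
minimal sketch of what such a get_data()-style function might look like
(the function name, file lookup, and key/default handling here are my
assumptions for illustration, not the actual patch):

```python
import json
import os

def get_data(deploy_dir, key, default=None):
    """Return the value for 'key' from testimage_data.json in deploy_dir.

    Falls back to 'default' if the file or the key is missing, so tests
    still work for images that do not deploy any customization data.
    """
    path = os.path.join(deploy_dir, "testimage_data.json")
    if not os.path.exists(path):
        return default
    with open(path) as f:
        data = json.load(f)
    return data.get(key, default)
```

A test would call this with the image deploy directory and a key such as
an error-ignore list, using the default when no file was deployed.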

> > > We have a bit of contention with two approaches to data management in
> > > OEQA. One is where the runtime tests are directly run against an image,
> > > in which case the datastore is available. You could therefore have
> > > markup in the recipe as normal variables and access them directly in
> > > the tests.
> > 
> > My use case is running tests right after build, but I would like to export
> > them to execute later as well.
> 
> When you execute later, are you going to use testexport or will the
> metadata still be available? As I mentioned, removing testexport would
> be desirable for a number of reasons but I suspect there are people who
> might want it.

I was planning to use testexport and also make sure all images and other
things needed for running tests are in the output of a build.

> > > The second is the "testexport" approach where the tests are run without
> > > the main metadata. I know Ross and I would like to see testexport
> > > dropped as it complicates things and is a pain.
> > > 
> > > This new file "feels" a lot like more extensions in the testexport
> > > direction and I'm not sure we need to do that. Could we handle this
> > > with more markup in the image recipe?
> > 
> > For simple variables this would do but how about a long list of strings
> > like poky/meta/lib/oeqa/runtime/cases/parselogs.py:
> > 
> > common_errors = [
> >     "(WW) warning, (EE) error, (NI) not implemented, (??) unknown.",
> >     "dma timeout",
> >     "can\'t add hid device:",
> >     "usbhid: probe of ",
> >     "_OSC failed (AE_ERROR)",
> >     "_OSC failed (AE_SUPPORT)",
> >     "AE_ALREADY_EXISTS",
> >     "ACPI _OSC request failed (AE_SUPPORT)",
> >     "can\'t disable ASPM",
> >     "Failed to load module \"vesa\"",
> >     "Failed to load module vesa",
> >     "Failed to load module \"modesetting\"",
> >     "Failed to load module modesetting",
> >     "Failed to load module \"glx\"",
> >     "Failed to load module \"fbdev\"",
> >     "Failed to load module fbdev",
> >     "Failed to load module glx"
> > ]
> > 
> > Embed json into a bitbake variable? Or embed directly as python code?
> 
> I've wondered if we could add some new syntax to bitbake to support
> this somehow, does anyone have any ideas to propose?
> 
> I'd wondered about both python data and/or json format (at which point
> someone will want yaml :/).

This sounds pretty far-fetched currently. JSON files are quite simple
to work with in python, so I'd just stick to that. If this approach is ok,
I could update the testimage.bbclass documentation with these details.
I really want to re-use the tests and the infrastructure for running them,
but I need to customize various details.
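
For example, parselogs-style code could extend its built-in list with
image-specific entries from the deployed JSON. A rough sketch (the key
name "parselogs_ignore_errors" is my assumption for illustration):

```python
import json

# Built-in defaults, as parselogs.py has today (abbreviated).
common_errors = [
    "dma timeout",
    "usbhid: probe of ",
]

# In a real test this string would come from the deployed
# testimage_data.json file rather than being inlined here.
image_data = json.loads('{"parselogs_ignore_errors": ["my-board-specific warning"]}')

# Image-specific entries simply extend the common list.
errors = common_errors + image_data.get("parselogs_ignore_errors", [])
```

That keeps the common exceptions in the test itself while letting each
image recipe contribute its own additions.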

Cheers,

-Mikko
-=-=-=-=-=-=-=-=-=-=-=-
Links: You receive all messages sent to this group.
View/Reply Online (#173462): 
https://lists.openembedded.org/g/openembedded-core/message/173462
Mute This Topic: https://lists.openembedded.org/mt/95085492/21656
Group Owner: [email protected]
Unsubscribe: https://lists.openembedded.org/g/openembedded-core/unsub [[email protected]]
-=-=-=-=-=-=-=-=-=-=-=-