Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package product-composer for 
openSUSE:Factory checked in at 2025-05-20 09:31:35
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/product-composer (Old)
 and      /work/SRC/openSUSE:Factory/.product-composer.new.30101 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "product-composer"

Tue May 20 09:31:35 2025 rev:36 rq:1278080 version:0.5.16

Changes:
--------
--- /work/SRC/openSUSE:Factory/product-composer/product-composer.changes        2025-05-12 16:55:29.473306160 +0200
+++ /work/SRC/openSUSE:Factory/.product-composer.new.30101/product-composer.changes     2025-05-20 09:31:44.053299490 +0200
@@ -1,0 +2,21 @@
+Fri May 16 13:28:58 UTC 2025 - Adrian Schröter <adr...@suse.de>
+
+- update to version 0.5.16:
+  * added package EULA support
+  * agama: do not take the ISO metadata from the agama ISO
+  * code cleanup and refactoring
+  * build description files are now validated.
+  * the verify command now checks all flavors by default.
+
+-------------------------------------------------------------------
+Tue May 13 12:34:06 UTC 2025 - Adrian Schröter <adr...@suse.de>
+
+- update to version 0.5.15:
+  * fix generation of gpg-pubkey content tags
+  * Do not error out in updateinfo_packages_only mode if packages are not found
+  * Set BUILD_DIR before calling the sbom generator
+  * Handle build_options in flavors differently:
+    add them to the global set instead of replacing it.
+  * Fix handover of multiple --build-option CLI parameters
+
+-------------------------------------------------------------------
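
The --build-option fix mentioned above changes the argparse setup from a
plain nargs='+' option to action='append' with nargs='+', which yields a
list of lists that the build step then flattens. A minimal standalone
sketch of the resulting behavior (not the actual CLI wiring):

    # Each --build-option occurrence is appended as its own list.
    from argparse import ArgumentParser

    parser = ArgumentParser()
    parser.add_argument('--build-option', action='append', nargs='+', default=[])
    args = parser.parse_args(['--build-option', 'a', 'b', '--build-option', 'c'])
    # args.build_option == [['a', 'b'], ['c']] -- flatten before use:
    options = [opt for group in args.build_option for opt in group]
    assert options == ['a', 'b', 'c']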

Old:
----
  _service
  product-composer-0.5.14.obscpio
  product-composer.obsinfo

New:
----
  _scmsync.obsinfo
  build.specials.obscpio
  product-composer.obscpio

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ product-composer.spec ++++++
--- /var/tmp/diff_new_pack.udBrNn/_old  2025-05-20 09:31:46.329393140 +0200
+++ /var/tmp/diff_new_pack.udBrNn/_new  2025-05-20 09:31:46.337393469 +0200
@@ -23,12 +23,13 @@
 %endif
 
 Name:           product-composer
-Version:        0.5.14
+Version:        0.5.16
 Release:        0
 Summary:        Product Composer
 License:        GPL-2.0-or-later
 Group:          Development/Tools/Building
 URL:            https://github.com/openSUSE/product-composer
+#!CreateArchive: product-composer
 Source:         %name-%{version}.tar.xz
 # Should become a build option
 Patch1:         sle-15-defaults.patch

++++++ _scmsync.obsinfo ++++++
mtime: 1747402289
commit: fed5fbb26fa9c09ab6729daf6640a03d7723158f7ac9954c016c1bc9eb55bf00
url: https://src.opensuse.org/tools/product-composer
revision: devel

++++++ product-composer-0.5.14.obscpio -> product-composer.obscpio ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/.git new/product-composer/.git
--- old/product-composer-0.5.14/.git    1970-01-01 01:00:00.000000000 +0100
+++ new/product-composer/.git   2025-05-16 15:31:50.000000000 +0200
@@ -0,0 +1 @@
+gitdir: ../.git/modules/product-composer
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/.github/workflows/tests.yaml new/product-composer/.github/workflows/tests.yaml
--- old/product-composer-0.5.14/.github/workflows/tests.yaml    2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/.github/workflows/tests.yaml   2025-05-16 15:31:50.000000000 +0200
@@ -10,7 +10,7 @@
 
 jobs:
   unit:
-    name: "unit"
+    name: "basic"
     runs-on: 'ubuntu-latest'
     strategy:
       fail-fast: false
@@ -26,12 +26,13 @@
         run: |
             zypper -n modifyrepo --disable repo-openh264 || :
             zypper -n --gpg-auto-import-keys refresh
-            zypper -n install python3 python3-pip python3-pydantic python3-pytest python3-rpm python3-setuptools python3-solv python3-PyYAML
+            zypper -n install python3 python3-pip python3-pydantic python3-pytest python3-rpm python3-setuptools python3-solv python3-PyYAML python3-schema
 
       - uses: actions/checkout@v4
 
-      - name: 'Run unit tests'
+      - name: 'Run basic example verification'
         run: |
           pip3 config set global.break-system-packages 1
           pip3 install --no-dependencies -e .
-          pytest tests
+          productcomposer verify examples/ftp.productcompose
+#          pytest tests
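
The replaced CI step boils down to running the new verify command against
the shipped example description. A minimal reproduction from Python,
assuming product-composer is installed and the working directory is a
checkout:

    # Run the same check as the new CI step: validate the example build
    # description (all flavors are checked, since none is selected).
    import subprocess

    result = subprocess.run(
        ['productcomposer', 'verify', 'examples/ftp.productcompose'],
        capture_output=True, text=True,
    )
    print(result.stdout)
    result.check_returncode()  # a non-zero exit means validation failed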
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/README.rst new/product-composer/README.rst
--- old/product-composer-0.5.14/README.rst      2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/README.rst     2025-05-16 15:31:50.000000000 +0200
@@ -5,7 +5,8 @@
 repositories inside of Open Build Service based on a larger pool
 of packages.
 
-It is starting as small as possible, just enough for ALP products atm.
+It is used by any SLFO-based product during product creation and
+also during maintenance.
 
 Currently it supports:
  - processing based on a list of rpm package names
@@ -13,9 +14,8 @@
  - it can either just take a single rpm of a given name or all of them
  - it can post process updateinfo data
  - post processing like rpm meta data generation
-
-Not yet implemented:
- - create bootable iso files
+ - modify pre-generated installer images to put a package set for
+   offline installation on them
 
 Development
 ===========
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/examples/ftp.productcompose new/product-composer/examples/ftp.productcompose
--- old/product-composer-0.5.14/examples/ftp.productcompose     2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/examples/ftp.productcompose    2025-05-16 15:31:50.000000000 +0200
@@ -24,10 +24,10 @@
   # free: false
 
 iso:
-  publisher:
-  volume_id:
-#  tree: drop
-#  base: agama-installer
+  publisher: 'Iggy'
+  volume_id: 'Pop'
+#  tree: 'drop'
+#  base: 'agama-installer'
 
 build_options:
 ### For maintenance, otherwise only "the best" version of each package is picked:
@@ -40,8 +40,8 @@
 # - updateinfo_packages_only
 # - base_skip_packages
 
-#installcheck:
-# - ignore_errors
+installcheck:
+ - ignore_errors
 
 # Enable collection of source and debug packages. Either "include" it
 # on main medium, "drop" it or "split" it away on extra medium.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/pyproject.toml new/product-composer/pyproject.toml
--- old/product-composer-0.5.14/pyproject.toml  2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/pyproject.toml 2025-05-16 15:31:50.000000000 +0200
@@ -12,6 +12,7 @@
     "zstandard",
     "pydantic<2",
     "pyyaml",
+    "schema",
 ]
 dynamic = ["version", "readme"]
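
The new "schema" dependency backs the build-description validation added
in cli.py below. A reduced sketch of how that library is used there (a
cut-down schema for illustration, not the full compose_schema):

    # Validate a dict against a declarative schema; validate() raises
    # SchemaError when a key or value type does not match.
    from schema import Schema, Optional, SchemaError

    iso_schema = Schema({
        Optional('publisher'): str,
        Optional('volume_id'): str,
    })

    try:
        iso_schema.validate({'publisher': 'Iggy', 'volume_id': 'Pop'})
        print('valid')
    except SchemaError as se:
        print(f'invalid: {se}')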
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/src/productcomposer/api/parse.py new/product-composer/src/productcomposer/api/parse.py
--- old/product-composer-0.5.14/src/productcomposer/api/parse.py        2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/src/productcomposer/api/parse.py       2025-05-16 15:31:50.000000000 +0200
@@ -10,4 +10,4 @@
     :param name: name to use in greeting
     """
     logger.debug("executing hello command")
-    return f"Hello, parser!"
+    return "Hello, parser!"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/src/productcomposer/cli.py new/product-composer/src/productcomposer/cli.py
--- old/product-composer-0.5.14/src/productcomposer/cli.py      2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/src/productcomposer/cli.py     2025-05-16 15:31:50.000000000 +0200
@@ -7,10 +7,12 @@
 import shutil
 import subprocess
 import gettext
+import glob
 from datetime import datetime
 from argparse import ArgumentParser
 from xml.etree import ElementTree as ET
 
+from schema import Schema, And, Or, Optional, SchemaError
 import yaml
 
 from .core.logger import logger
@@ -25,6 +27,8 @@
 
 
 ET_ENCODING = "unicode"
+ISO_PREPARER = "Product Composer - http://www.github.com/openSUSE/product-composer"
+DEFAULT_EULADIR = "/usr/share/doc/packages/eulas"
 
 
 tree_report = {}        # hashed via file name
@@ -34,14 +38,118 @@
 
 # global db for supportstatus
 supportstatus = {}
+# global db for eulas
+eulas = {}
 # per package override via supportstatus.txt file
 supportstatus_override = {}
 # debug aka verbose
 verbose_level = 0
 
+compose_schema_iso = Schema({
+    Optional('publisher'): str,
+    Optional('volume_id'): str,
+    Optional('tree'): str,
+    Optional('base'): str,
+})
+compose_schema_packageset = Schema({
+    Optional('name'): str,
+    Optional('supportstatus'): str,
+    Optional('flavors'): [str],
+    Optional('architectures'): [str],
+    Optional('add'): [str],
+    Optional('sub'): [str],
+    Optional('intersect'): [str],
+    Optional('packages'): Or(None, [str]),
+})
+compose_schema_scc_cpe = Schema({
+    'cpe': str,
+    Optional('online'): bool,
+})
+compose_schema_scc = Schema({
+    Optional('description'): str,
+    Optional('family'): str,
+    Optional('product-class'): str,
+    Optional('free'): bool,
+    Optional('predecessors'): [compose_schema_scc_cpe],
+    Optional('shortname'): str,
+    Optional('base-products'): [compose_schema_scc_cpe],
+    Optional('root-products'): [compose_schema_scc_cpe],
+    Optional('recommended-for'): [compose_schema_scc_cpe],
+    Optional('migration-extra-for'): [compose_schema_scc_cpe],
+})
+compose_schema_build_option = Schema(
+    Or(
+        'add_slsa_provenance',
+        'base_skip_packages',
+        'block_updates_under_embargo',
+        'hide_flavor_in_product_directory_name',
+        'ignore_missing_packages',
+        'skip_updateinfos',
+        'take_all_available_versions',
+        'updateinfo_packages_only',
+    )
+)
+compose_schema_source_and_debug = Schema(
+    Or(
+        'drop',
+        'include',
+        'split',
+    )
+)
+compose_schema_repodata = Schema(
+    Or(
+        'all',
+        'split',
+    )
+)
+compose_schema_flavor = Schema({
+    Optional('architectures'): [str],
+    Optional('name'): str,
+    Optional('version'): str,
+    Optional('update'): str,
+    Optional('edition'): str,
+    Optional('product-type'): str,
+    Optional('product_directory_name'): str,
+    Optional('repodata'): compose_schema_repodata,
+    Optional('summary'): str,
+    Optional('debug'): compose_schema_source_and_debug,
+    Optional('source'): compose_schema_source_and_debug,
+    Optional('build_options'): Or(None, [compose_schema_build_option]),
+    Optional('scc'): compose_schema_scc,
+    Optional('iso'): compose_schema_iso,
+})
+
+compose_schema = Schema({
+    'product_compose_schema': str,
+    'vendor': str,
+    'name': str,
+    'version': str,
+    Optional('update'): str,
+    'product-type': str,
+    'summary': str,
+    Optional('bcntsynctag'): str,
+    Optional('milestone'): str,
+    Optional('scc'): compose_schema_scc,
+    Optional('iso'): compose_schema_iso,
+    Optional('installcheck'): Or(None, ['ignore_errors']),
+    Optional('build_options'): Or(None, [compose_schema_build_option]),
+    Optional('architectures'): [str],
+
+    Optional('product_directory_name'): str,
+    Optional('set_updateinfo_from'): str,
+    Optional('set_updateinfo_id_prefix'): str,
+    Optional('block_updates_under_embargo'): str,
+    Optional('debug'): compose_schema_source_and_debug,
+    Optional('source'): compose_schema_source_and_debug,
+    Optional('repodata'): compose_schema_repodata,
+
+    Optional('flavors'): {str: compose_schema_flavor},
+    Optional('packagesets'): [compose_schema_packageset],
+    Optional('unpack'): [str],
+})
 
 def main(argv=None) -> int:
-    """ Execute the application CLI.
+    """Execute the application CLI.
 
     :param argv: argument list to parse (sys.argv by default)
     :return: exit status
@@ -60,7 +168,7 @@
     build_parser.set_defaults(func=build)
 
     # Generic options
-    for cmd_parser in [verify_parser, build_parser]:
+    for cmd_parser in (verify_parser, build_parser):
         cmd_parser.add_argument('-f', '--flavor', help='Build a given flavor')
         cmd_parser.add_argument('-v', '--verbose', action='store_true',  help='Enable verbose output')
         cmd_parser.add_argument('--reposdir', action='store',  help='Take packages from this directory')
@@ -69,9 +177,10 @@
     # build command options
     build_parser.add_argument('-r', '--release', default=None,  help='Define a build release counter')
     build_parser.add_argument('--disturl', default=None,  help='Define a disturl')
-    build_parser.add_argument('--build-option', nargs='+', default=[],  help='Set a build option')
+    build_parser.add_argument('--build-option', action='append', nargs='+', default=[],  help='Set a build option')
     build_parser.add_argument('--vcs', default=None,  help='Define a source repository identifier')
     build_parser.add_argument('--clean', action='store_true',  help='Remove existing output directory first')
+    build_parser.add_argument('--euladir', default=DEFAULT_EULADIR, help='Directory containing EULA data')
     build_parser.add_argument('out',  help='Directory to write the result')
 
     # parse and check
@@ -110,6 +219,8 @@
 
 def build(args):
     flavor = None
+    global verbose_level
+
     if args.flavor:
         f = args.flavor.split('.')
         if f[0] != '':
@@ -118,15 +229,16 @@
         verbose_level = 1
 
     if not args.out:
-        # No subcommand was specified.
-        print("No output directory given")
-        parser.print_help()
-        die(None)
+        die("No output directory given")
 
     yml = parse_yaml(args.filename, flavor)
 
-    for option in args.build_option:
-        yml['build_options'].append(option)
+    for arg in args.build_option:
+        for option in arg:
+            yml['build_options'].append(option)
+
+    if 'architectures' not in yml or not yml['architectures']:
+        die(f'No architecture defined for flavor {flavor}')
 
     directory = os.getcwd()
     if args.filename.startswith('/'):
@@ -137,22 +249,25 @@
     if os.path.isfile(supportstatus_fn):
         parse_supportstatus(supportstatus_fn)
 
+    if args.euladir and os.path.isdir(args.euladir):
+        parse_eulas(args.euladir)
+
     pool = Pool()
-    note(f"scanning: {reposdir}")
+    note(f"Scanning: {reposdir}")
     pool.scan(reposdir)
 
-    # clean up black listed packages
+    # clean up blacklisted packages
     for u in sorted(pool.lookup_all_updateinfos()):
         for update in u.root.findall('update'):
             if not update.find('blocked_in_product'):
-                 continue
+                continue
 
             parent = update.findall('pkglist')[0].findall('collection')[0]
             for pkgentry in parent.findall('package'):
                 name = pkgentry.get('name')
                 epoch = pkgentry.get('epoch')
                 version = pkgentry.get('version')
-                pool.remove_rpms(None, name, '=', epoch, version)
+                pool.remove_rpms(None, name, '=', epoch, version, None)
 
     if args.clean and os.path.exists(args.out):
         shutil.rmtree(args.out)
@@ -163,50 +278,83 @@
 
 
 def verify(args):
-    parse_yaml(args.filename, args.flavor)
+    yml = parse_yaml(args.filename, args.flavor)
+    if args.flavor is None and 'flavors' in yml:
+        for flavor in yml['flavors']:
+            yml = parse_yaml(args.filename, flavor)
+            if 'architectures' not in yml or not yml['architectures']:
+                die(f'No architecture defined for flavor {flavor}')
+    elif 'architectures' not in yml or not yml['architectures']:
+        die('No architecture defined and no flavor.')
 
 
-def parse_yaml(filename, flavor):
 
+def parse_yaml(filename, flavor):
     with open(filename, 'r') as file:
         yml = yaml.safe_load(file)
 
+    # we may not allow this in the future anymore, but for now convert these from float to str
+    if 'product_compose_schema' in yml:
+        yml['product_compose_schema'] = str(yml['product_compose_schema'])
+    if 'version' in yml:
+        yml['version'] = str(yml['version'])
+
     if 'product_compose_schema' not in yml:
         die('missing product composer schema')
-    if yml['product_compose_schema'] != 0 and yml['product_compose_schema'] != 0.1 and yml['product_compose_schema'] != 0.2:
-        die(f"Unsupported product composer schema: {yml['product_compose_schema']}")
+    if yml['product_compose_schema'] not in ('0.1', '0.2'):
+        die(f'Unsupported product composer schema: {yml["product_compose_schema"]}')
+
+    try:
+        compose_schema.validate(yml)
+        note(f"Configuration is valid for flavor: {flavor}")
+    except SchemaError as se:
+        warn(f"YAML syntax is invalid for flavor: {flavor}")
+        raise se
 
     if 'flavors' not in yml:
         yml['flavors'] = []
 
+    if 'build_options' not in yml or yml['build_options'] is None:
+        yml['build_options'] = []
+
     if flavor:
         if flavor not in yml['flavors']:
-            die("Flavor not found: " + flavor)
+            die('Flavor not found: ' + flavor)
         f = yml['flavors'][flavor]
         # overwrite global values from flavor overwrites
-        for tag in ['architectures', 'name', 'summary', 'version', 'update', 'edition',
-                    'product-type', 'product_directory_name',
-                    'build_options', 'source', 'debug', 'repodata']:
+        for tag in (
+            'architectures',
+            'name',
+            'summary',
+            'version',
+            'update',
+            'edition',
+            'product-type',
+            'product_directory_name',
+            'source',
+            'debug',
+            'repodata',
+        ):
             if tag in f:
                 yml[tag] = f[tag]
+
+        # Add additional build_options instead of replacing the globally defined set.
+        if 'build_options' in f:
+            for option in f['build_options']:
+                yml['build_options'].append(option)
+
         if 'iso' in f:
-            if not 'iso' in yml:
+            if 'iso' not in yml:
                 yml['iso'] = {}
-            for tag in ['volume_id', 'publisher', 'tree', 'base']:
+            for tag in ('volume_id', 'publisher', 'tree', 'base'):
                 if tag in f['iso']:
                     yml['iso'][tag] = f['iso'][tag]
 
-    if 'architectures' not in yml or not yml['architectures']:
-        die("No architecture defined. Maybe wrong flavor?")
-
-    if 'build_options' not in yml or yml['build_options'] is None:
-        yml['build_options'] = []
-
     if 'installcheck' in yml and yml['installcheck'] is None:
         yml['installcheck'] = []
 
     # FIXME: validate strings, eg. right set of chars
-    
+
     return yml
 
 
@@ -217,26 +365,39 @@
             supportstatus_override[a[0]] = a[1]
 
 
+def parse_eulas(euladir):
+    note(f"Reading eula data from {euladir}")
+    for dirpath, dirs, files in os.walk(euladir):
+        for filename in files:
+            if filename.startswith('.'):
+                continue
+            pkgname = filename.removesuffix('.en')
+            with open(os.path.join(dirpath, filename), encoding="utf-8") as f:
+                eulas[pkgname] = f.read()
+
+
 def get_product_dir(yml, flavor, release):
-    name = yml['name'] + "-" + str(yml['version'])
+    name = f'{yml["name"]}-{yml["version"]}'
     if 'product_directory_name' in yml:
         # manual override
         name = yml['product_directory_name']
-    if flavor and not 'hide_flavor_in_product_directory_name' in yml['build_options']:
-        name += "-" + flavor
+    if flavor and 'hide_flavor_in_product_directory_name' not in yml['build_options']:
+        name += f'-{flavor}'
     if yml['architectures']:
         visible_archs = yml['architectures']
         if 'local' in visible_archs:
             visible_archs.remove('local')
         name += "-" + "-".join(visible_archs)
     if release:
-        name += "-Build" + str(release)
+        name += f'-Build{release}'
     if '/' in name:
         die("Illegal product name")
     return name
 
 
-def run_helper(args, cwd=None, fatal=True, stdout=None, stdin=None, failmsg=None):
+def run_helper(args, cwd=None, fatal=True, stdout=None, stdin=None, failmsg=None, verbose=False):
+    if verbose:
+        note(f'Calling {args}')
     if stdout is None:
         stdout = subprocess.PIPE
     if stdin is None:
@@ -264,6 +425,71 @@
         args = [ 'sha256sum', filename.split('/')[-1] ]
         run_helper(args, cwd=("/"+os.path.join(*filename.split('/')[:-1])), stdout=sha_file, failmsg="create .sha256 file")
 
+def create_iso(outdir, yml, pool, flavor, workdir, application_id):
+    verbose = True if verbose_level > 0 else False
+    isoconf = yml['iso']
+    args = ['/usr/bin/mkisofs', '-quiet', '-p', ISO_PREPARER]
+    args += ['-r', '-pad', '-f', '-J', '-joliet-long']
+    if 'publisher' in isoconf and isoconf['publisher'] is not None:
+        args += ['-publisher', isoconf['publisher']]
+    if 'volume_id' in isoconf and isoconf['volume_id'] is not None:
+        args += ['-V', isoconf['volume_id']]
+    args += ['-A', application_id]
+    args += ['-o', workdir + '.iso', workdir]
+    run_helper(args, cwd=outdir, failmsg="create iso file", verbose=verbose)
+    # simple tag media call ... we may add options for padding or triggering media check later
+    args = [ 'tagmedia' , '--digest' , 'sha256', workdir + '.iso' ]
+    run_helper(args, cwd=outdir, failmsg="tagmedia iso file", verbose=verbose)
+    # creating .sha256 for iso file
+    create_sha256_for(workdir + ".iso")
+
+def create_agama_iso(outdir, yml, pool, flavor, workdir, application_id, arch):
+    verbose = True if verbose_level > 0 else False
+    isoconf = yml['iso']
+    base = isoconf['base']
+    if verbose:
+        note(f"Looking for baseiso-{base} rpm on {arch}")
+    agama = pool.lookup_rpm(arch, f"baseiso-{base}")
+    if not agama:
+        die(f"Base iso in baseiso-{base} rpm was not found")
+    baseisodir = f"{outdir}/baseiso"
+    os.mkdir(baseisodir)
+    args = ['unrpm', '-q', agama.location]
+    run_helper(args, cwd=baseisodir, failmsg=f"extract {agama.location}", verbose=verbose)
+    files = glob.glob(f"usr/libexec/base-isos/{base}*.iso", root_dir=baseisodir)
+    if not files:
+        die(f"Base iso {base} not found in {agama}")
+    if len(files) > 1:
+        die(f"Multiple base isos for {base} found in {agama}")
+    agamaiso = f"{baseisodir}/{files[0]}"
+    if verbose:
+        note(f"Found base iso image {agamaiso}")
+
+    # create new iso
+    tempdir = f"{outdir}/mksusecd"
+    os.mkdir(tempdir)
+    if 'base_skip_packages' not in yml['build_options']:
+        args = ['cp', '-al', workdir, f"{tempdir}/install"]
+        run_helper(args, failmsg="add tree to agama image")
+    args = ['mksusecd', agamaiso, tempdir, '--create', workdir + '.install.iso']
+    # mksusecd would take the volume_id, publisher, application_id, preparer from the agama iso
+    args += ['--preparer', ISO_PREPARER]
+    if 'publisher' in isoconf and isoconf['publisher'] is not None:
+        args += ['--vendor', isoconf['publisher']]
+    if 'volume_id' in isoconf and isoconf['volume_id'] is not None:
+        args += ['--volume', isoconf['volume_id']]
+    args += ['--application', application_id]
+    run_helper(args, failmsg="add tree to agama image", verbose=verbose)
+    # mksusecd already did a tagmedia call with a sha256 digest
+    # cleanup directories
+    shutil.rmtree(tempdir)
+    shutil.rmtree(baseisodir)
+    # just for the bootable image, signature is not yet applied, so ignore that error
+    run_helper(['verifymedia', workdir + '.install.iso', '--ignore', 'ISO is signed'], fatal=False, failmsg="verify install.iso")
+    # creating .sha256 for iso file
+    create_sha256_for(workdir + '.install.iso')
+
+
 def create_tree(outdir, product_base_dir, yml, pool, flavor, vcs=None, disturl=None):
     if not os.path.exists(outdir):
         os.mkdir(outdir)
@@ -272,7 +498,7 @@
     if not os.path.exists(maindir):
         os.mkdir(maindir)
 
-    workdirectories = [ maindir ]
+    workdirectories = [maindir]
     debugdir = sourcedir = None
     if "source" in yml:
         if yml['source'] == 'split':
@@ -320,12 +546,11 @@
         args = ['gpg', '--no-keyring', '--no-default-keyring', '--with-colons',
               '--import-options', 'show-only', '--import', '--fingerprint']
         out = run_helper(args, stdin=open(f'{maindir}/{file}', 'rb'),
-                         failmsg="Finger printing of gpg file")
+                         failmsg="get fingerprint of gpg file")
         for line in out.splitlines():
-            if not str(line).startswith("b'fpr:"):
-                continue
-
-            default_content.append(str(line).split(':')[9])
+            if line.startswith("fpr:"):
+                content = f"{file}?fpr={line.split(':')[9]}"
+                default_content.append(content)
 
     note("Create rpm-md data")
     run_createrepo(maindir, yml, content=default_content, repos=repos)
@@ -394,39 +619,40 @@
            args.append(find_primary(maindir + subdir))
            if debugdir:
                args.append(find_primary(debugdir + subdir))
-           run_helper(args, fatal=(not 'ignore_errors' in yml['installcheck']), failmsg="run installcheck validation")
+           run_helper(args, fatal=('ignore_errors' not in yml['installcheck']), failmsg="run installcheck validation")
 
     if 'skip_updateinfos' not in yml['build_options']:
         create_updateinfo_xml(maindir, yml, pool, flavor, debugdir, sourcedir)
 
     # Add License File and create extra .license directory
-    licensefilename = '/license.tar'
-    if os.path.exists(maindir + '/license-' + yml['name'] + '.tar') or os.path.exists(maindir + '/license-' + yml['name'] + '.tar.gz'):
-        licensefilename = '/license-' + yml['name'] + '.tar'
-    if os.path.exists(maindir + licensefilename + '.gz'):
-        run_helper(['gzip', '-d', maindir + licensefilename + '.gz'],
-                   failmsg="Uncompress of license.tar.gz failed")
-    if os.path.exists(maindir + licensefilename):
-        note("Setup .license directory")
-        licensedir = maindir + ".license"
-        if not os.path.exists(licensedir):
-            os.mkdir(licensedir)
-        args = ['tar', 'xf', maindir + licensefilename, '-C', licensedir]
-        output = run_helper(args, failmsg="extract license tar ball")
-        if not os.path.exists(licensedir + "/license.txt"):
-            die("No license.txt extracted", details=output)
-
-        mr = ModifyrepoWrapper(
-            file=maindir + licensefilename,
-            directory=os.path.join(maindir, "repodata"),
-        )
-        mr.run_cmd()
-        os.unlink(maindir + licensefilename)
-        # meta package may bring a second file or expanded symlink, so we need to clean up
-        if os.path.exists(maindir + '/license.tar'):
-            os.unlink(maindir + '/license.tar')
-        if os.path.exists(maindir + '/license.tar.gz'):
-            os.unlink(maindir + '/license.tar.gz')
+    if yml.get('iso', {}).get('tree') != 'drop':
+      licensefilename = '/license.tar'
+      if os.path.exists(maindir + '/license-' + yml['name'] + '.tar') or os.path.exists(maindir + '/license-' + yml['name'] + '.tar.gz'):
+          licensefilename = '/license-' + yml['name'] + '.tar'
+      if os.path.exists(maindir + licensefilename + '.gz'):
+          run_helper(['gzip', '-d', maindir + licensefilename + '.gz'],
+                     failmsg="uncompress license.tar.gz")
+      if os.path.exists(maindir + licensefilename):
+          note("Setup .license directory")
+          licensedir = maindir + ".license"
+          if not os.path.exists(licensedir):
+              os.mkdir(licensedir)
+          args = ['tar', 'xf', maindir + licensefilename, '-C', licensedir]
+          output = run_helper(args, failmsg="extract license tar ball")
+          if not os.path.exists(licensedir + "/license.txt"):
+              die("No license.txt extracted", details=output)
+
+          mr = ModifyrepoWrapper(
+              file=maindir + licensefilename,
+              directory=os.path.join(maindir, "repodata"),
+          )
+          mr.run_cmd()
+          os.unlink(maindir + licensefilename)
+          # meta package may bring a second file or expanded symlink, so we need to clean up
+          if os.path.exists(maindir + '/license.tar'):
+              os.unlink(maindir + '/license.tar')
+          if os.path.exists(maindir + '/license.tar.gz'):
+              os.unlink(maindir + '/license.tar.gz')
 
     for repodatadir in repodatadirectories:
         # detached signature
@@ -443,68 +669,18 @@
             args = ['/usr/lib/build/signdummy', '-d', workdir + '/CHECKSUMS']
             run_helper(args, failmsg="create detached signature for CHECKSUMS")
 
+        application_id = product_base_dir
         # When using the baseiso feature, the primary media should be
         # the base iso, with the packages added.
         # Other medias/workdirs would then be generated as usual, as
         # presumably you wouldn't need a bootable iso for source and
         # debuginfo packages.
         if workdir == maindir and 'base' in yml.get('iso', {}):
-            note("Export main tree into agama iso file")
-            if verbose_level > 0:
-                note(f"Looking for baseiso-{yml['iso']['base']} rpm on {yml['architectures'][0]}")
-            agama = pool.lookup_rpm(yml['architectures'][0], f"baseiso-{yml['iso']['base']}")
-            if agama is None:
-                die(f"Base iso in baseiso-{yml['iso']['base']} rpm was not found")
-            if verbose_level > 0:
-                note(f"Found {agama.location}")
-            baseisodir = f"{outdir}/baseiso"
-            os.mkdir(baseisodir)
-            args = ['unrpm', '-q', agama.location]
-            run_helper(args, cwd=baseisodir, failmsg=f"Failing unpacking {agama.location}")
-            import glob
-            files = glob.glob(f"{baseisodir}/usr/libexec/base-isos/{yml['iso']['base']}*.iso", recursive=True)
-            if len(files) < 1:
-                die(f"Base iso {yml['iso']['base']} not found in {agama.location}")
-            if len(files) > 1:
-                die(f"Multiple base isos for {yml['iso']['base']} are found in {agama.location}")
-            agamaiso = files[0]
-            if verbose_level > 0:
-                note(f"Found base iso image {agamaiso}")
-
-            # create new iso
-            tempdir = f"{outdir}/mksusecd"
-            os.mkdir(tempdir)
-            if not 'base_skip_packages' in yml['build_options']:
-                args = ['cp', '-al', workdir, f"{tempdir}/install"]
-                run_helper(args, failmsg="Adding tree to agama image")
-            args = ['mksusecd', agamaiso, tempdir, '--create', workdir + '.install.iso']
-            if verbose_level > 0:
-                print("Calling: ", args)
-            run_helper(args, failmsg="Adding tree to agama image")
-            # just for the bootable image, signature is not yet applied, so ignore that error
-            run_helper(['verifymedia', workdir + '.install.iso', '--ignore', 'ISO is signed'], fatal=False, failmsg="Verification of install.iso")
-            # creating .sha256 for iso file
-            create_sha256_for(workdir + '.install.iso')
-            # cleanup
-            shutil.rmtree(tempdir)
-            shutil.rmtree(baseisodir)
+            agama_arch = yml['architectures'][0]
+            note(f"Export main tree into agama iso file for {agama_arch}")
+            create_agama_iso(outdir, yml, pool, flavor, workdir, application_id, agama_arch)
         elif 'iso' in yml:
-            note("Create iso files")
-            application_id = re.sub(r'^.*/', '', maindir)
-            args = ['/usr/bin/mkisofs', '-quiet', '-p', 'Product Composer - http://www.github.com/openSUSE/product-composer']
-            args += ['-r', '-pad', '-f', '-J', '-joliet-long']
-            if 'publisher' in yml['iso'] and yml['iso']['publisher'] is not None:
-                args += ['-publisher', yml['iso']['publisher']]
-            if 'volume_id' in yml['iso'] and yml['iso']['volume_id'] is not None:
-                args += ['-V', yml['iso']['volume_id']]
-                args += ['-V', yml['iso']['volume_id']]
-            args += ['-A', application_id]
-            args += ['-o', workdir + '.iso', workdir]
-            run_helper(args, cwd=outdir, failmsg="create iso file")
-            # simple tag media call ... we may add options for padding or triggering media check later
-            args = [ 'tagmedia' , '--digest' , 'sha256', workdir + '.iso' ]
-            run_helper(args, cwd=outdir, failmsg="tagmedia iso file")
-            # creating .sha256 for iso file
-            create_sha256_for(workdir + ".iso")
+            create_iso(outdir, yml, pool, flavor, workdir, application_id)
 
         # cleanup
         if yml.get('iso', {}).get('tree') == 'drop':
@@ -520,7 +696,7 @@
     # Pro: SBOM formats are constantly changing, so we don't need to always adapt all distributions for that
     if os.path.exists("/.build/generate_sbom"):
         # unfortunately, it is not executable by default
-        generate_sbom_call = ["perl", "-I", "/.build", "/.build/generate_sbom"]
+        generate_sbom_call = ['env', 'BUILD_DIR=/.build', 'perl', '/.build/generate_sbom']
 
     if generate_sbom_call:
         spdx_distro = f"{yml['name']}-{yml['version']}"
@@ -573,19 +749,22 @@
             for root, dirnames, filenames in os.walk(maindir + '/' + subdir):
                 for name in filenames:
                     relname = os.path.relpath(root + '/' + name, maindir)
-                    run_helper([chksums_tool, relname], cwd=maindir, stdout=chksums_file)
+                    run_helper(
+                        [chksums_tool, relname], cwd=maindir, stdout=chksums_file
+                    )
+
 
 # create a fake package entry from an updateinfo package spec
 
 
 def create_updateinfo_package(pkgentry):
     entry = Package()
-    for tag in 'name', 'epoch', 'version', 'release', 'arch':
+    for tag in ('name', 'epoch', 'version', 'release', 'arch'):
         setattr(entry, tag, pkgentry.get(tag))
     return entry
 
+
 def generate_du_data(pkg, maxdepth):
-    dirs = pkg.get_directories()
     seen = set()
     dudata_size = {}
     dudata_count = {}
@@ -622,6 +801,7 @@
         dudata.append((dir, size, dudata_count[dir]))
     return dudata
 
+
 # Get supported translations based on installed packages
 def get_package_translation_languages():
     i18ndir = '/usr/share/locale/en_US/LC_MESSAGES'
@@ -699,6 +879,11 @@
                 for duitem in dudata:
                     ET.SubElement(dirselement, 'dir', {'name': duitem[0], 'size': str(duitem[1]), 'count': str(duitem[2])})
 
+        # add eula
+        eula = eulas.get(name)
+        if eula:
+            ET.SubElement(package, 'eula').text = eula
+
         # get summary/description/category of the package
         summary = pkg.find(f'{ns}summary').text
         description = pkg.find(f'{ns}description').text
@@ -710,7 +895,8 @@
             isummary = i18ntrans[lang].gettext(summary)
             idescription = i18ntrans[lang].gettext(description)
             icategory = i18ntrans[lang].gettext(category) if category is not None else None
-            if isummary == summary and idescription == description and icategory == category:
+            ieula = eulas.get(name + '.' + lang, eula) if eula is not None else None
+            if isummary == summary and idescription == description and icategory == category and ieula == eula:
                 continue
             if lang not in susedatas:
                 susedatas[lang] = ET.Element('susedata')
@@ -724,6 +910,8 @@
                 ET.SubElement(ipackage, 'description', {'lang': lang}).text = idescription
             if icategory != category:
                 ET.SubElement(ipackage, 'category', {'lang': lang}).text = icategory
+            if ieula != eula:
+                ET.SubElement(ipackage, 'eula', {'lang': lang}).text = ieula
 
     # write all susedata files
     for lang, susedata in sorted(susedatas.items()):
@@ -753,7 +941,7 @@
     # build the union of the package sets for all requested architectures
     main_pkgset = PkgSet('main')
     for arch in yml['architectures']:
-        pkgset = main_pkgset.add(create_package_set(yml, arch, flavor, 'main', pool=pool))
+        main_pkgset.add(create_package_set(yml, arch, flavor, 'main', pool=pool))
     main_pkgset_names = main_pkgset.names()
 
     uitemp = None
@@ -794,7 +982,11 @@
                             die("shutting down due to block_updates_under_embargo flag")
 
                 # clean internal attributes
-                for internal_attributes in ['supportstatus', 'superseded_by', 'embargo_date']:
+                for internal_attributes in (
+                    'supportstatus',
+                    'superseded_by',
+                    'embargo_date',
+                ):
                     pkgentry.attrib.pop(internal_attributes, None)
 
                 # check if we have files for the entry
@@ -853,16 +1045,10 @@
 
         os.unlink(rpmdir + '/updateinfo.xml')
 
-    if missing_package and not 'ignore_missing_packages' in yml['build_options']:
-        die('Abort due to missing packages')
-
+    if missing_package and 'ignore_missing_packages' not in yml['build_options']:
+        die('Abort due to missing packages for updateinfo')
 
 def run_createrepo(rpmdir, yml, content=[], repos=[]):
-    product_name = product_summary = yml['name']
-    if 'summary' in yml:
-        product_summary = yml['summary']
-    product_summary += " " + str(yml['version'])
-
     product_type = '/o'
     if 'product-type' in yml:
         if yml['product-type'] == 'base':
@@ -872,7 +1058,7 @@
         else:
             die('Undefined product-type')
     cr = CreaterepoWrapper(directory=".")
-    cr.distro = product_summary
+    cr.distro = f"{yml.get('summary', yml['name'])} {yml['version']}"
     cr.cpeid = f"cpe:{product_type}:{yml['vendor']}:{yml['name']}:{yml['version']}"
     if 'update' in yml:
         cr.cpeid = cr.cpeid + f":{yml['update']}"
@@ -881,7 +1067,7 @@
     elif 'edition' in yml:
         cr.cpeid = cr.cpeid + f"::{yml['edition']}"
     cr.repos = repos
-# cr.split = True
+    # cr.split = True
     # cr.baseurl = "media://"
     cr.content = content
     cr.excludes = ["boot"]
@@ -920,32 +1106,8 @@
                 continue
             unpack_one_meta_rpm(rpmdir, rpm, medium)
 
-    if missing_package and not 'ignore_missing_packages' in yml['build_options']:
-        die('Abort due to missing packages')
-
-
-def create_package_set_compat(yml, arch, flavor, setname):
-    if setname == 'main':
-        oldname = 'packages'
-    elif setname == 'unpack':
-        oldname = 'unpack_packages'
-    else:
-        return None
-    if oldname not in yml:
-        return PkgSet(setname) if setname == 'unpack' else None
-    pkgset = PkgSet(setname)
-    for entry in list(yml[oldname]):
-        if type(entry) == dict:
-            if 'flavors' in entry:
-                if flavor is None or flavor not in entry['flavors']:
-                    continue
-            if 'architectures' in entry:
-                if arch not in entry['architectures']:
-                    continue
-            pkgset.add_specs(entry['packages'])
-        else:
-            pkgset.add_specs([str(entry)])
-    return pkgset
+    if missing_package and 'ignore_missing_packages' not in yml['build_options']:
+        die('Abort due to missing meta packages')
 
 
 def create_package_set_all(setname, pool, arch):
@@ -956,13 +1118,8 @@
 
     return pkgset
 
-def create_package_set(yml, arch, flavor, setname, pool=None):
-    if 'packagesets' not in yml:
-        pkgset = create_package_set_compat(yml, arch, flavor, setname)
-        if pkgset is None:
-            die(f'package set {setname} is not defined')
-        return pkgset
 
+def create_package_set(yml, arch, flavor, setname, pool=None):
     pkgsets = {}
     for entry in list(yml['packagesets']):
         name = entry['name'] if 'name' in entry else 'main'
@@ -992,7 +1149,7 @@
                 if oname == name or oname not in pkgsets:
                     die(f'package set {oname} does not exist')
                 if pkgsets[oname] is None:
-                    pkgsets[oname] = PkgSet(oname)      # instantiate
+                    pkgsets[oname] = PkgSet(oname)  # instantiate
                 if setop == 'add':
                     pkgset.add(pkgsets[oname])
                 elif setop == 'sub':
@@ -1005,7 +1162,7 @@
     if setname not in pkgsets:
         die(f'package set {setname} is not defined')
     if pkgsets[setname] is None:
-        pkgsets[setname] = PkgSet(setname)      # instantiate
+        pkgsets[setname] = PkgSet(setname)  # instantiate
     return pkgsets[setname]
 
 
@@ -1021,16 +1178,16 @@
     if 'updateinfo_packages_only' in yml['build_options']:
         if not pool.updateinfos:
             die("filtering for updates enabled, but no updateinfo found")
+        if singlemode:
+            die("filtering for updates enabled, but take_all_available_versions is not set")
 
         referenced_update_rpms = {}
         for u in sorted(pool.lookup_all_updateinfos()):
             for update in u.root.findall('update'):
                 parent = update.findall('pkglist')[0].findall('collection')[0]
-
                 for pkgentry in parent.findall('package'):
                     referenced_update_rpms[pkgentry.get('src')] = 1
 
-
     main_pkgset = create_package_set(yml, arch, flavor, 'main', pool=pool)
 
     missing_package = None
@@ -1042,6 +1199,8 @@
             rpms = pool.lookup_all_rpms(arch, sel.name, sel.op, sel.epoch, sel.version, sel.release)
 
         if not rpms:
+            if referenced_update_rpms is not None:
+                continue
             warn(f"package {sel} not found for {arch}")
             missing_package = True
             continue
@@ -1049,7 +1208,7 @@
         for rpm in rpms:
             if referenced_update_rpms is not None:
                 if (rpm.arch + '/' + rpm.canonfilename) not in referenced_update_rpms:
-                    print(f"No update for {rpm}")
+                    note(f"No update for {rpm}")
                     continue
 
             link_entry_into_dir(rpm, rpmdir, add_slsa=add_slsa)
@@ -1082,7 +1241,7 @@
                 if drpm:
                     link_entry_into_dir(drpm, debugdir, add_slsa=add_slsa)
 
-    if missing_package and not 'ignore_missing_packages' in yml['build_options']:
+    if missing_package and 'ignore_missing_packages' not in yml['build_options']:
         die('Abort due to missing packages')
 
 
@@ -1128,7 +1287,16 @@
             continue
         binary = ET.SubElement(root, 'binary')
         binary.text = 'obs://' + entry.origin
-        for tag in 'name', 'epoch', 'version', 'release', 'arch', 'buildtime', 'disturl', 'license':
+        for tag in (
+            'name',
+            'epoch',
+            'version',
+            'release',
+            'arch',
+            'buildtime',
+            'disturl',
+            'license',
+        ):
             val = getattr(entry, tag, None)
             if val is None or val == '':
                 continue
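
A note on the EULA plumbing added to cli.py above: parse_eulas() keys the
global eulas dict by file name with a trailing '.en' stripped, and the
susedata generation then looks up per-language texts under '<name>.<lang>'.
A small runnable model of that convention (hypothetical package names):

    # foo.en -> eulas['foo'] (default text), foo.de -> eulas['foo.de']
    eulas = {'foo': 'English EULA', 'foo.de': 'Deutsche EULA'}

    name, lang = 'foo', 'de'
    eula = eulas.get(name)
    # the per-language lookup falls back to the default text, as in cli.py:
    ieula = eulas.get(name + '.' + lang, eula) if eula is not None else None
    assert ieula == 'Deutsche EULA'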
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/src/productcomposer/core/PkgSet.py new/product-composer/src/productcomposer/core/PkgSet.py
--- old/product-composer-0.5.14/src/productcomposer/core/PkgSet.py      2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/src/productcomposer/core/PkgSet.py     2025-05-16 15:31:50.000000000 +0200
@@ -51,11 +51,11 @@
             if name not in otherbyname:
                 pkgs.append(sel)
                 continue
-            for osel in otherbyname[name]:
+            for other_sel in otherbyname[name]:
                 if sel is not None:
-                    sel = sel.sub(osel)
+                    sel = sel.sub(other_sel)
             if sel is not None:
-                pkgs.append(p)
+                pkgs.append(sel)
         self.pkgs = pkgs
         self.byname = None
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/product-composer-0.5.14/src/productcomposer/core/Pool.py new/product-composer/src/productcomposer/core/Pool.py
--- old/product-composer-0.5.14/src/productcomposer/core/Pool.py        2025-05-08 12:37:30.000000000 +0200
+++ new/product-composer/src/productcomposer/core/Pool.py       2025-05-16 15:31:50.000000000 +0200
@@ -57,7 +57,7 @@
     def lookup_all_updateinfos(self):
         return self.updateinfos.values()
 
-    def remove_rpms(self, arch, name, epoch=None, version=None, release=None):
+    def remove_rpms(self, arch, name, op=None, epoch=None, version=None, release=None):
         if name not in self.rpms:
             return
         self.rpms[name] = [rpm for rpm in self.rpms[name] if not rpm.matches(arch, name, op, epoch, version, release)]
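
Context for the remove_rpms() change: the updated caller in cli.py passes an
explicit match operator, so the signature gains op ahead of epoch. Before
this, the call site's '=' landed in the epoch slot and shifted every later
argument by one. A positional-argument sketch (a stand-in function for
illustration, not the real Pool class):

    # The corrected call site lines up with the new signature.
    def remove_rpms(arch, name, op=None, epoch=None, version=None, release=None):
        return (arch, name, op, epoch, version, release)

    assert remove_rpms(None, 'foo', '=', '0', '1.0', None) == \
        (None, 'foo', '=', '0', '1.0', None)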
