Compare commits

...

52 Commits

Author SHA1 Message Date
soksanichenko c539c58f93 - Chronological order of changelog is fixed 2022-11-04 11:12:02 +02:00
soksanichenko b68051e221 Merge remote-tracking branch 'origin/master' into aln8
# Conflicts:
#	pungi.spec
#	pungi/phases/pkgset/pkgsets.py
#	pungi/phases/pkgset/sources/source_koji.py
#	pungi/wrappers/kojiwrapper.py
#	setup.py
#	tests/test_extra_isos_phase.py
#	tests/test_koji_wrapper.py
#	tests/test_pkgset_pkgsets.py
#	tests/test_pkgset_source_koji.py
2022-11-03 23:22:46 +02:00
soksanichenko 750499eda1 - The unittests are fixed 2022-10-19 14:10:48 +03:00
soksanichenko d999960235 - bump the dependency version 2022-10-19 13:00:32 +03:00
soksanichenko 6edece449d - changelog
- bump version
2022-10-19 04:40:39 +03:00
Stepan Oksanichenko dd22d94a9e Merge pull request 'Replace list of cr.packages by cr.PackageIterator' (#6) from package_iterator into aln8
Reviewed-on: #6
2022-10-19 01:38:44 +00:00
soksanichenko b157a1825a Do not lose a module from koji if we have more than one arch (e.g. x86_64 + i686) 2022-10-19 04:33:34 +03:00
soksanichenko fd298d4f17 Replace list of cr.packages by cr.PackageIterator 2022-10-18 22:53:50 +03:00
soksanichenko f21ed6f607 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-04 20:16:23 +03:00
Stepan Oksanichenko cfe6ec3f4e Merge pull request 'ALBS-334: Make the ability of Pungi to give module_defaults from remote sources' (#4) from ALBS-334 into aln8
Reviewed-on: #4
2022-05-04 17:05:45 +00:00
soksanichenko e6c6f74176 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-03 18:18:17 +03:00
soksanichenko 8676941655 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-02 02:25:32 +03:00
soksanichenko 5f74175c33 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-01 03:41:40 +03:00
soksanichenko 1e18e8995d ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-01 03:32:01 +03:00
soksanichenko 38ea822260 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-30 00:27:31 +03:00
soksanichenko 34eb45c7ec ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:39:51 +03:00
soksanichenko 7422d1e045 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:33:28 +03:00
soksanichenko 97801e772e ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:25:59 +03:00
soksanichenko dff346eedb - Unit tests are fixed 2022-04-28 16:44:47 +03:00
soksanichenko de53dd0bbd - Unit tests are fixed 2022-04-28 16:30:03 +03:00
soksanichenko 88121619bc ALBS-226: Patch pungi/lorax for building AL9
- The module_defaults dir can be empty, but pungi detects the
  empty folder while copying and raises an exception in this case
2022-03-18 22:37:57 +02:00
soksanichenko 0484426e0c ALBS-97: Build AlmaLinux PPC64le repos and ISOs with pungi
- Changelog
- Version is bumped

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I933925b7a27a5e1b642020e060f59212fdc6ebf4
2021-12-30 12:42:34 +02:00
soksanichenko b9d86b90e1 ALBS-97: Build AlmaLinux PPC64le repos and ISOs with pungi
- Scripts `create_packages_json` & `gather_modules` can process lzma compressed yaml files
- Script `create_packages_json` can use repodata where packages have a different
  arch than the one passed to the script

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Ia9a3bacfa4344f0cf33b9f416649fd4a5f8d3c37
2021-12-28 16:08:04 +02:00
soksanichenko 58a16e5688 - The version is bumped
- The changelog is updated
- The test `create_packages_json` is fixed

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I173013da990eb296e58ca8f3555a05913ca1c852
2021-12-20 14:11:17 +02:00
soksanichenko f2ed64d952 ALBS-66: Prepare Jenkins jobs for building distribution of AlmaLinux 8.5
- Script `create_packages_json` can duplicate packages with the
  same version across different variants

Change-Id: I3c79ad06c4c22442423c12d5fa06baf82d663a3f
2021-11-10 15:29:59 +02:00
stepan_oksanichenko b2c49dcaf6 - The version is bumped
- The changelog is updated

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Iadbf3d7223db85a58ba82f41597de27dbfffe1ca
2021-06-18 14:47:09 +03:00
stepan_oksanichenko 14dd6a195f LNX-326: Add the ability to include any package by mask in packages.json to the generator
- The reference packages should be replaced only by newer reference packages
- The non-reference packages should be replaced by packages of both types

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I881bd4e58527ae219ef6e1adbc6332b3b05933c1
2021-06-18 14:23:42 +03:00
stepan_oksanichenko 084321dd97 LNX-326: Add the ability to include any package by mask in packages.json to the generator
- The ability is added
- Also the generator includes only the latest-by-version packages in packages.json
- The generator has a key `--is-reference` for each repo. This key marks a repo as a reference.
  A reference repo is used as the main source of packages. A non-reference repo is used as a source
  of packages which don't exist in the reference repos.
- All cases are covered by the unittest

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I2f80ba4fbfce27fb9a30500ae46c0b8a2f2aabcd
2021-06-15 17:42:12 +03:00
stepan_oksanichenko 941d6b064a LNX-318: Modify build scripts for building CloudLinux OS 8.4
- [Fixed] The script `create_packages_json` selected the first
          encountered package from a variant, but it should select the
          package with the highest version

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I36268f2a493897fc11e787c040066d2d501a1c81
2021-06-04 12:36:03 +03:00
stepan_oksanichenko aaeee7132d - It's bumped version
- It's added changelog

@BS-TARGET-CL8

Change-Id: I51eef1eb45ba54d034e6bed46d99b0470f4e9221
2021-05-25 21:28:47 +03:00
stepan_oksanichenko cc4d99441c LNX-108: Add multiarch support to pungi
@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Ibfd540454941922d790ae4e56cc0992c0c85635d
2021-05-24 18:07:11 +03:00
stepan_oksanichenko a435eeed06 - it's added changelog
@BS-NOBUILD

Change-Id: I3a0a0377f9c1cefabf52c33fbc0d19ab0e4fe4f1
2021-04-29 17:15:17 +03:00
stepan_oksanichenko b9f554bf39 LNX-311: Add ability to productmd set a main variant while dumping TreeInfo
@BS-NOBUILD
@BS-TARGET-CL8
@BS-LINKED-608ab56299ce8ac801a396c5  # python3-productmd

Change-Id: Id86d627ae8ae0b9a73b5ce6531c20538f3d040b1
2021-04-29 17:01:49 +03:00
stepan_oksanichenko ebf028ca3b LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- The modules from the parsed output of FUS should have a stream
  with dashes replaced by underscores

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: If36d3d0a1ef8010bf85a4a0218b9838e0888453c
2021-04-27 13:39:09 +03:00
stepan_oksanichenko 305103a38e LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- Some modules can be absent in the koji env but present in variants.xml,
  and Pungi will fail in this case. So we must filter those modules out of the
  expected modules list using the list from the pungi build config

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I22c15c42868412e34fd554030130bd7c3e25b8ef
2021-04-23 13:03:05 +03:00
stepan_oksanichenko 01bce26275 LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- The script `gather_modules` should replace `-` with `_`
  in module streams, as pungi itself does

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Iea05b70afbf80f3ccd20ad4943c9d86c7ed7aa90
2021-04-22 13:40:48 +03:00
soksanichenko 4d763514c1 - Version is bumped
- Changelog is added

Change-Id: I440b44f12c4a1aa41619acd3ba5ca354dc71b419
2021-02-24 17:42:22 +02:00
Danylo Kuropiatnyk 41381df6a5 LU-2202: Start unittests during installation or build of pungi
* added section with tests and pytest module to requires
IMPORTANT - build.sh script is commented out
* added pyfakefs dependency
* fixed a small mock_open issue for the runroot test
* bumped version

@BS-TARGET-CL8

Change-Id: I036db225646875eb610736cd26f473850a78447c
2021-02-23 07:55:36 -05:00
soksanichenko 02686d7bdf LU-2186 .treeinfo file in AlmaLinux public kickstart repo should contain AppStream variant
- We are modifying the existing repo's .treeinfo:
-- Take info about included variants from the iso's .treeinfo and put it into the repo's .treeinfo

Change-Id: I29bf655d90994e8a1bda40ad04568dd7364f5dca
2021-02-23 06:48:15 -05:00
soksanichenko 2e48c9a56f LU-2195 Change path to sources and iso when generating repositories
- We should add images to the compose if they will be used only as a netinstall image,
  e.g. *-boot.iso.
- And we shouldn't add them if the images will be modified in the `extra_isos` phase,
  e.g. *-minimal.iso

Change-Id: I9095cfd87414ecca46b1213553589731c82dd2e2
2021-02-22 13:23:48 +02:00
soksanichenko b3a8c3f28a - Version is bumped
- Changelog is added

Change-Id: Ib1366f1fe2639037db99b8e939537bb63801058e
2021-02-11 14:50:12 +02:00
soksanichenko 5434d24027 LU-2133: Prepare CI for iso builds of CLOSS 8
@BS-TARGET-CL8
@BS-NOBUILD

- Added a script which can collect packages/modules
  from remote repos (including BS repos) and merge them
  into one local repo with the right repodata (including
  modules.yaml.gz)
- The script `create_packages_json` can use regexps for the list of excluded packages

Change-Id: I1365b712460959db6bb451d1199d640bff6ffe5e
2021-02-09 10:47:46 +02:00
soksanichenko 3b5501b4bf LNX-133: Create a server for building nightly builds of AlmaLinux
- Added the key argument '--json-output-path' to the script `pungi-generate-package-json`

Change-Id: Ic18fa2708cc4913002023828b3be018d4907de25
2021-01-28 14:03:40 +02:00
soksanichenko cea8d92906 Bump version for setup.py
Change-Id: I980e9ebb728c3a88597c987d585e1b5937499e81
2021-01-28 00:06:40 +02:00
soksanichenko 1a29de435e - It's added changelog
- Its bumped version

Change-Id: I4c7b8d9c64da3379a24d93837657cec2686a8511
2021-01-27 23:47:39 +02:00
soksanichenko 69ed7699e8 LNX-133: Create a server for building nightly builds of AlmaLinux
- Added the dependency `python3-dataclasses` to the spec

Change-Id: Id6b6f33ca6621ddc1408d9ab51e278801e4dd0a2
2021-01-27 07:47:07 -05:00
Stepan Oksanichenko 103c3dc608 LNX-133: Create a server for building nightly builds of AlmaLinux
- Script `pungi-gather-modules` can find valid *modules.yaml.gz in the repo dirs by itself

@BS-LINKED-5ffda6156f44affc6c5ea239  # pungi & dependencies
@BS-TARGET-CL8

Change-Id: I3cddc0cf41ea1087183e23de39126a52c69bc9ac
2021-01-25 16:17:35 +02:00
Stepan Oksanichenko 94ad7603b8 LNX-104: Create gather_prepopulate file generator for Pungi
- Added a tool which can generate a json like `centos-packages.json` using repodata from completed repos.

@BS-LINKED-5ffda6156f44affc6c5ea239  # pungi & dependencies
@BS-TARGET-CL8

Change-Id: Ib0466a1d8e06feb855e81fb7160fe170e2e82e04
2021-01-25 16:17:34 +02:00
oshyshatskyi 903db91c0f LNX-102: Patch pungi tool to use local koji mock
Instead of koji.mbox, use a local koji-like wrapper.

@BS-LINKED-5ff8b8cb6f44affc6c5e9a7a
@BS-TARGET-CL8

Change-Id: I82a2bc8bc71ae06240656898f3df71bb28bcb9e9
2021-01-25 16:17:33 +02:00
oshyshatskyi 552343fffe LNX-102: Add tool that gathers directory for all rpms
Tool that finds all available rpm files in a directory
and creates a special tree for pungi:
 # ls /mnt/koji/
   i686/  noarch/  x86_64/

Change-Id: Ibcf2d23c46411ad89477058f4d56e07ca117f0d1
2021-01-25 16:17:33 +02:00
oshyshatskyi 5806217041 LNX-102: Add tool that collects information about modules
Add a special tool that gathers the given modules.tar.gz files
and collects information about modules into two dirs:
 - module_defaults
 - modules

 The first one is used by pungi during the createrepo phase, and
 the second one is used by koji mock to get the list of
 available modules and their versions.

Change-Id: I50a095a5f3bafa7e7a1effc2c0d4a2fc52ba603b
2021-01-25 16:17:33 +02:00
Andrew Lukoshko 67eacf8483 LNX-103 Update .spec file for AlmaLinux
New binaries added to pungi rpm:
pungi-gather-rpms
pungi-gather-modules

Change-Id: Idb25dffb10d50fa9f566c99d714d32df962b6f52
2021-01-25 16:17:32 +02:00
46 changed files with 3741 additions and 290 deletions

File diff suppressed because it is too large

View File

@ -794,6 +794,14 @@ def make_schema():
"type": "string",
"enum": ["lorax", "buildinstall"],
},
# In the `buildinstall` phase we should add to the compose only the
# images that will be used as netinstall images
"netinstall_variants": {
"$ref": "#/definitions/list_of_strings",
"default": [
"BaseOS",
],
},
"buildinstall_topdir": {"type": "string"},
"buildinstall_kickstart": {"$ref": "#/definitions/str_or_scm_dict"},
"buildinstall_use_guestmount": {"type": "boolean", "default": True},

View File

@ -533,7 +533,14 @@ def link_boot_iso(compose, arch, variant, can_fail):
img.volume_id = iso.get_volume_id(new_boot_iso_path)
except RuntimeError:
pass
compose.im.add(variant.uid, arch, img)
# In this phase we should add to the compose only the images that
# will be used as netinstall images.
# At this step lorax generates the environment
# for creating isos and creates them.
# At the `extra_isos` step we overwrite the unneeded `boot Minimal` iso with a
# new iso. It already contains the necessary packages from the included variants.
if variant.uid in compose.conf['netinstall_variants']:
compose.im.add(variant.uid, arch, img)
compose.log_info("[DONE ] %s" % msg)

View File

@ -420,6 +420,12 @@ def get_iso_contents(
original_treeinfo,
os.path.join(extra_files_dir, ".treeinfo"),
)
tweak_repo_treeinfo(
compose,
include_variants,
original_treeinfo,
original_treeinfo,
)
# Add extra files specific for the ISO
files.update(
@ -431,6 +437,45 @@ def get_iso_contents(
return gp
def tweak_repo_treeinfo(compose, include_variants, source_file, dest_file):
"""
The method adds the given variants to the .treeinfo file of a variant. It takes
the variants which are described
by the option `extra_isos -> include_variants`.
"""
ti = productmd.treeinfo.TreeInfo()
ti.load(source_file)
main_variant = next(iter(ti.variants))
for variant_uid in include_variants:
variant = compose.all_variants[variant_uid]
var = productmd.treeinfo.Variant(ti)
var.id = variant.id
var.uid = variant.uid
var.name = variant.name
var.type = variant.type
ti.variants.add(var)
for variant_id in ti.variants:
var = ti.variants[variant_id]
if variant_id == main_variant:
var.paths.packages = 'Packages'
var.paths.repository = '.'
else:
var.paths.packages = os.path.join(
'../../..',
var.uid,
var.arch,
'os/Packages',
)
var.paths.repository = os.path.join(
'../../..',
var.uid,
var.arch,
'os',
)
ti.dump(dest_file, main_variant=main_variant)
def tweak_treeinfo(compose, include_variants, source_file, dest_file):
ti = load_and_tweak_treeinfo(source_file)
for variant_uid in include_variants:
@ -446,7 +491,6 @@ def tweak_treeinfo(compose, include_variants, source_file, dest_file):
var = ti.variants[variant_id]
var.paths.packages = os.path.join(var.uid, "Packages")
var.paths.repository = var.uid
ti.dump(dest_file)
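For illustration, a small sketch of the relative paths `tweak_repo_treeinfo` writes, assuming a hypothetical non-main variant `AppStream` on `x86_64` (the main variant keeps `Packages` and `.`):

import os

os.path.join('../../..', 'AppStream', 'x86_64', 'os/Packages')
# -> '../../../AppStream/x86_64/os/Packages'
os.path.join('../../..', 'AppStream', 'x86_64', 'os')
# -> '../../../AppStream/x86_64/os'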

View File

@ -23,12 +23,18 @@ from itertools import groupby
from kobo.rpmlib import parse_nvra
from kobo.shortcuts import force_list
from typing import (
Dict,
AnyStr,
List,
Tuple,
Set,
)
import pungi.wrappers.kojiwrapper
from pungi.wrappers.comps import CompsWrapper
from pungi.wrappers.mbs import MBSWrapper
import pungi.phases.pkgset.pkgsets
from pungi.arch import getBaseArch
from pungi.util import (
retry,
get_arch_variant_data,
@ -217,7 +223,7 @@ def _add_module_to_variant(
"""
Adds module defined by Koji build info to variant.
:param Variant variant: Variant to add the module to.
:param variant: Variant to add the module to.
:param int: build id
:param bool add_to_variant_modules: Adds the modules also to
variant.modules.
@ -230,18 +236,13 @@ def _add_module_to_variant(
if archive["btype"] != "module":
# Skip non module archives
continue
typedir = koji_wrapper.koji_module.pathinfo.typedir(build, archive["btype"])
filename = archive["filename"]
file_path = os.path.join(typedir, filename)
try:
# If there are two dots, the arch is in the middle. MBS uploads
# files with actual architecture in the filename, but Pungi deals
# in basearch. This assumes that each arch in the build maps to a
# unique basearch.
_, arch, _ = filename.split(".")
filename = "modulemd.%s.txt" % getBaseArch(arch)
except ValueError:
pass
file_path = os.path.join(
koji_wrapper.koji_module.pathinfo.topdir,
'modules',
build['arch'],
build['extra']['typeinfo']['module']['content_koji_tag']
)
mmds[filename] = file_path
if len(mmds) <= 1:
@ -507,16 +508,15 @@ def filter_by_whitelist(compose, module_builds, input_modules, expected_modules)
info.get("context"),
)
nvr_patterns.add((pattern, spec["name"]))
modules_to_keep = []
for mb in module_builds:
for mb in sorted(module_builds, key=lambda i: i['name']):
# Split release from the build into version and context
ver, ctx = mb["release"].split(".")
# Values in `mb` are from Koji build. There's nvr and name, version and
# release. The input pattern specifies modular name, stream, version
# and context.
for (n, s, v, c), spec in nvr_patterns:
for (n, s, v, c), spec in sorted(nvr_patterns):
if (
# We always have a name and stream...
mb["name"] == n
@ -528,11 +528,49 @@ def filter_by_whitelist(compose, module_builds, input_modules, expected_modules)
):
modules_to_keep.append(mb)
expected_modules.discard(spec)
break
return modules_to_keep
def _filter_expected_modules(
variant_name: AnyStr,
variant_arches: List[AnyStr],
expected_modules: Set[AnyStr],
filtered_modules: List[Tuple[AnyStr, Dict[AnyStr, List[AnyStr]]]],
) -> set:
"""
The function filters out all modules which are listed in the Pungi config.
Those modules can be absent in the koji env, so we must remove them from
the expected modules list, otherwise Pungi will fail
"""
for variant_regexp, filters_dict in filtered_modules:
for arch, modules in filters_dict.items():
arch = '.*' if arch == '*' else arch
variant_regexp = '.*' if variant_regexp == '*' else variant_regexp
modules = ['.*' if module == '*' else module for module in modules]
cond1 = re.findall(
variant_regexp,
variant_name,
)
cond2 = any(
re.findall(
arch,
variant_arch,
) for variant_arch in variant_arches
)
if cond1 and cond2:
expected_modules = {
expected_module for expected_module in expected_modules if
not any(
re.findall(
filtered_module,
expected_module,
) for filtered_module in modules
)
}
return expected_modules
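A hypothetical call to `_filter_expected_modules`, with made-up variant, arch and module names; each `filter_modules` entry follows the `(variant_regexp, {arch: [module_patterns]})` shape the loop above iterates over:

expected = _filter_expected_modules(
    variant_name='AppStream',
    variant_arches=['x86_64'],
    expected_modules={'javapackages-runtime:201801', 'llvm-toolset:rhel8'},
    filtered_modules=[('AppStream', {'*': ['javapackages-runtime.*']})],
)
# expected == {'llvm-toolset:rhel8'}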
def _get_modules_from_koji_tags(
compose,
koji_wrapper,
@ -546,10 +584,10 @@ def _get_modules_from_koji_tags(
Loads modules for given `variant` from Koji, adds them to
the `variant` and also to `variant_tags` dict.
:param Compose compose: Compose for which the modules are found.
:param compose: Compose for which the modules are found.
:param KojiWrapper koji_wrapper: Koji wrapper.
:param dict event_id: Koji event ID.
:param Variant variant: Variant with modules to find.
:param variant: Variant with modules to find.
:param dict variant_tags: Dict populated by this method. Key is `variant`
and value is list of Koji tags to get the RPMs from.
:param list exclude_module_ns: Module name:stream which will be excluded.
@ -560,7 +598,13 @@ def _get_modules_from_koji_tags(
]
# Get set of configured module names for this variant. If nothing is
# configured, the set is empty.
expected_modules = set(spec["name"] for spec in variant.get_modules())
expected_modules = []
for spec in variant.get_modules():
name, stream = spec['name'].split(':')
expected_modules.append(
':'.join((name, stream.replace('-', '_')))
)
expected_modules = set(expected_modules)
# Find out all modules in every variant and add their Koji tags
# to variant and variant_tags list.
koji_proxy = koji_wrapper.koji_proxy
@ -660,7 +704,12 @@ def _get_modules_from_koji_tags(
# needed in createrepo phase where metadata is exposed by
# productmd
variant.module_uid_to_koji_tag[nsvc] = module_tag
expected_modules = _filter_expected_modules(
variant_name=variant.name,
variant_arches=variant.arches,
expected_modules=expected_modules,
filtered_modules=compose.conf['filter_modules'],
)
if expected_modules:
# There are some module names that were listed in configuration and not
# found in any tag...
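The stream normalization added above mirrors the underscore replacement pungi applies elsewhere: only the stream part of a `name:stream` spec gets its dashes replaced. A tiny sketch with an invented module spec:

spec_name = 'ruby:2.5-rc'  # 'name:stream' as read from the variant config
name, stream = spec_name.split(':')
':'.join((name, stream.replace('-', '_')))
# -> 'ruby:2.5_rc'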

View File

@ -94,7 +94,7 @@ class Runroot(kobo.log.LoggingBase):
log_file = os.path.join(log_dir, "program.log")
try:
with open(log_file) as f:
for line in f:
for line in f.readlines():
if "losetup: cannot find an unused loop device" in line:
return True
if re.match("losetup: .* failed to set up loop device", line):

View File

@ -0,0 +1,434 @@
# coding=utf-8
import argparse
import os
import subprocess
import tempfile
from shutil import rmtree
from typing import AnyStr, List, Dict, Optional
import createrepo_c as cr
import requests
import yaml
from dataclasses import dataclass, field
from .create_packages_json import PackagesGenerator, RepoInfo
@dataclass
class ExtraRepoInfo(RepoInfo):
modules: List[AnyStr] = field(default_factory=list)
packages: List[AnyStr] = field(default_factory=list)
is_remote: bool = True
class CreateExtraRepo(PackagesGenerator):
def __init__(
self,
repos: List[ExtraRepoInfo],
bs_auth_token: AnyStr,
local_repository_path: AnyStr,
clear_target_repo: bool = True,
):
self.repos = [] # type: List[ExtraRepoInfo]
super().__init__(repos, [], [])
self.auth_headers = {
'Authorization': f'Bearer {bs_auth_token}',
}
# modules data of modules.yaml.gz from an existing local repo
self.local_modules_data = []
self.local_repository_path = local_repository_path
# path to modules.yaml, which is generated by the class
self.default_modules_yaml_path = os.path.join(
local_repository_path,
'modules.yaml',
)
if clear_target_repo:
if os.path.exists(self.local_repository_path):
rmtree(self.local_repository_path)
os.makedirs(self.local_repository_path, exist_ok=True)
else:
self._read_local_modules_yaml()
def _read_local_modules_yaml(self):
"""
Read modules data from an existing local repo
"""
repomd_file_path = os.path.join(
self.local_repository_path,
'repodata',
'repomd.xml',
)
repomd_object = self._parse_repomd(repomd_file_path)
for repomd_record in repomd_object.records:
if repomd_record.type != 'modules':
continue
modules_yaml_path = os.path.join(
self.local_repository_path,
repomd_record.location_href,
)
self.local_modules_data = list(self._parse_modules_file(
modules_yaml_path,
))
break
def _dump_local_modules_yaml(self):
"""
Dump merged modules data to a local repo
"""
if self.local_modules_data:
with open(self.default_modules_yaml_path, 'w') as yaml_file:
yaml.dump_all(
self.local_modules_data,
yaml_file,
)
@staticmethod
def get_repo_info_from_bs_repo(
auth_token: AnyStr,
build_id: AnyStr,
arch: AnyStr,
packages: Optional[List[AnyStr]] = None,
modules: Optional[List[AnyStr]] = None,
) -> List[ExtraRepoInfo]:
"""
Get info about a BS repo and save it to
an object of class ExtraRepoInfo
:param auth_token: Auth token to Build System
:param build_id: ID of a build from BS
:param arch: an architecture of repo which will be used
:param packages: list of names of packages which will be put into a
local repo from a BS repo
:param modules: list of names of modules which will be put into a
local repo from a BS repo
:return: list of ExtraRepoInfo with info about the BS repos
"""
bs_url = 'https://build.cloudlinux.com'
api_uri = 'api/v1'
bs_repo_suffix = 'build_repos'
repos_info = []
# get the full info about a BS repo
repo_request = requests.get(
url=os.path.join(
bs_url,
api_uri,
'builds',
build_id,
),
headers={
'Authorization': f'Bearer {auth_token}',
},
)
repo_request.raise_for_status()
result = repo_request.json()
for build_platform in result['build_platforms']:
platform_name = build_platform['name']
for architecture in build_platform['architectures']:
# skip repo with unsuitable architecture
if architecture != arch:
continue
repo_info = ExtraRepoInfo(
path=os.path.join(
bs_url,
bs_repo_suffix,
build_id,
platform_name,
),
folder=architecture,
name=f'{build_id}-{platform_name}-{architecture}',
arch=architecture,
is_remote=True,
packages=packages,
modules=modules,
)
repos_info.append(repo_info)
return repos_info
def _create_local_extra_repo(self):
"""
Call `createrepo_c <path_to_repo>` to create a local repo
"""
subprocess.call(
f'createrepo_c {self.local_repository_path}',
shell=True,
)
# remove an unnecessary temporary modules.yaml
if os.path.exists(self.default_modules_yaml_path):
os.remove(self.default_modules_yaml_path)
def get_remote_file_content(
self,
file_url: AnyStr,
) -> AnyStr:
"""
Get content from a remote file and write it to a temp file
:param file_url: url of a remote file
:return: path to a temp file
"""
file_request = requests.get(
url=file_url,
# for the case when we get a file from BS
headers=self.auth_headers,
)
file_request.raise_for_status()
with tempfile.NamedTemporaryFile(delete=False) as file_stream:
file_stream.write(file_request.content)
return file_stream.name
def _download_rpm_to_local_repo(
self,
package_location: AnyStr,
repo_info: ExtraRepoInfo,
) -> None:
"""
Download a rpm package from a remote repo and save it to a local repo
:param package_location: relative uri of a package in a remote repo
:param repo_info: info about a remote repo which contains a specific
rpm package
"""
rpm_package_remote_path = os.path.join(
repo_info.path,
repo_info.folder,
package_location,
)
rpm_package_local_path = os.path.join(
self.local_repository_path,
os.path.basename(package_location),
)
rpm_request = requests.get(
url=rpm_package_remote_path,
headers=self.auth_headers,
)
rpm_request.raise_for_status()
with open(rpm_package_local_path, 'wb') as rpm_file:
rpm_file.write(rpm_request.content)
def _download_packages(
self,
packages: Dict[AnyStr, cr.Package],
repo_info: ExtraRepoInfo
):
"""
Download all defined packages from a remote repo
:param packages: information about all packages (including
modularity) in a remote repo
:param repo_info: information about a remote repo
"""
for package in packages.values():
package_name = package.name
# Skip the current package from a remote repo if we defined
# the packages list and the current package doesn't belong to it
if repo_info.packages and \
package_name not in repo_info.packages:
continue
self._download_rpm_to_local_repo(
package_location=package.location_href,
repo_info=repo_info,
)
def _download_modules(
self,
modules_data: List[Dict],
repo_info: ExtraRepoInfo,
packages: Dict[AnyStr, cr.Package]
):
"""
Download all defined modularity packages and their data from
a remote repo
:param modules_data: information about all modules in a remote repo
:param repo_info: information about a remote repo
:param packages: information about all packages (including
modularity) in a remote repo
"""
for module in modules_data:
module_data = module['data']
# Skip the current module from a remote repo if we defined
# the modules list and the current module doesn't belong to it
if repo_info.modules and \
module_data['name'] not in repo_info.modules:
continue
# we should add info about a module if the local repodata
# doesn't have it
if module not in self.local_modules_data:
self.local_modules_data.append(module)
# just skip a module's record if it doesn't have rpm artifacts
if module['document'] != 'modulemd' or \
'artifacts' not in module_data or \
'rpms' not in module_data['artifacts']:
continue
for rpm in module['data']['artifacts']['rpms']:
# Empty repo_info.packages means that we will download
# all packages from repo including
# the modularity packages
if not repo_info.packages:
break
# skip a rpm if it doesn't belong to a processed repo
if rpm not in packages:
continue
self._download_rpm_to_local_repo(
package_location=packages[rpm].location_href,
repo_info=repo_info,
)
def create_extra_repo(self):
"""
1. Get the specific (or all) packages/modules from the remote repos
2. Save them to a local repo
3. Save info about the modules to the local repo
4. Call `createrepo_c`, which creates a local repo
with the right repodata
"""
for repo_info in self.repos:
packages = {} # type: Dict[AnyStr, cr.Package]
repomd_records = self._get_repomd_records(
repo_info=repo_info,
)
repomd_records_dict = {} # type: Dict[str, str]
self._download_repomd_records(
repo_info=repo_info,
repomd_records=repomd_records,
repomd_records_dict=repomd_records_dict,
)
packages_iterator = cr.PackageIterator(
primary_path=repomd_records_dict['primary'],
filelists_path=repomd_records_dict['filelists'],
other_path=repomd_records_dict['other'],
warningcb=self._warning_callback,
)
# parse the repodata (including modules.yaml.gz)
modules_data = self._parse_module_repomd_record(
repo_info=repo_info,
repomd_records=repomd_records,
)
# convert the packages dict to more usable form
# for future checking that a rpm from the module's artifacts
# belongs to a processed repository
packages = {
f'{package.name}-{package.epoch}:{package.version}-'
f'{package.release}.{package.arch}':
package for package in packages_iterator
}
self._download_modules(
modules_data=modules_data,
repo_info=repo_info,
packages=packages,
)
self._download_packages(
packages=packages,
repo_info=repo_info,
)
self._dump_local_modules_yaml()
self._create_local_extra_repo()
def create_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'--bs-auth-token',
help='Auth token for Build System',
required=True,
)
parser.add_argument(
'--local-repo-path',
help='Path to a local repo. E.g. /var/repo/test_repo',
required=True,
)
parser.add_argument(
'--clear-local-repo',
help='Clear a local repo before creating a new one',
action='store_true',
default=False,
)
parser.add_argument(
'--repo',
action='append',
help='Path to a folder with repofolders or build id. E.g. '
'"http://koji.cloudlinux.com/mirrors/rhel_mirror" or '
'"601809b3c2f5b0e458b14cd3"',
required=True,
)
parser.add_argument(
'--repo-folder',
action='append',
help='A folder which contains the repodata folder. E.g. "baseos-stream"',
required=True,
)
parser.add_argument(
'--repo-arch',
action='append',
help='What architecture packages a repository contains. E.g. "x86_64"',
required=True,
)
parser.add_argument(
'--packages',
action='append',
type=str,
default=[],
help='A list of package names which we want to download to the local '
'extra repo. All packages will be downloaded if the param is empty',
required=True,
)
parser.add_argument(
'--modules',
action='append',
type=str,
default=[],
help='A list of module names which we want to download to the local '
'extra repo. All modules will be downloaded if the param is empty',
required=True,
)
return parser
def cli_main():
args = create_parser().parse_args()
repos_info = []
for repo, repo_folder, repo_arch, packages, modules in zip(
args.repo,
args.repo_folder,
args.repo_arch,
args.packages,
args.modules,
):
modules = modules.split()
packages = packages.split()
if repo.startswith('http://'):
repos_info.append(
ExtraRepoInfo(
path=repo,
folder=repo_folder,
name=repo_folder,
arch=repo_arch,
modules=modules,
packages=packages,
)
)
else:
repos_info.extend(
CreateExtraRepo.get_repo_info_from_bs_repo(
auth_token=args.bs_auth_token,
build_id=repo,
arch=repo_arch,
modules=modules,
packages=packages,
)
)
cer = CreateExtraRepo(
repos=repos_info,
bs_auth_token=args.bs_auth_token,
local_repository_path=args.local_repo_path,
clear_target_repo=args.clear_local_repo,
)
cer.create_extra_repo()
if __name__ == '__main__':
cli_main()

View File

@ -0,0 +1,528 @@
# coding=utf-8
"""
The tool allows generating packages.json. This file is used by pungi
as the parameter `gather_prepopulate`.
Sample of using repodata files is taken from
https://github.com/rpm-software-management/createrepo_c/blob/master/examples/python/repodata_parsing.py
"""
import argparse
import gzip
import json
import lzma
import os
import re
import tempfile
from collections import defaultdict
from typing import AnyStr, Dict, List, Optional, Any, Iterator
import binascii
import createrepo_c as cr
import dnf.subject
import hawkey
import requests
import rpm
import yaml
from createrepo_c import Package, PackageIterator
from dataclasses import dataclass
def _is_compressed_file(first_two_bytes: bytes, initial_bytes: bytes):
return binascii.hexlify(first_two_bytes) == initial_bytes
def is_gzip_file(first_two_bytes):
return _is_compressed_file(
first_two_bytes=first_two_bytes,
initial_bytes=b'1f8b',
)
def is_xz_file(first_two_bytes):
return _is_compressed_file(
first_two_bytes=first_two_bytes,
initial_bytes=b'fd37',
)
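Both helpers compare the first two bytes of a file against a format's magic number: gzip files start with 1f 8b and xz files with fd 37 (the start of fd 37 7a 58 5a 00). A quick sanity check:

assert is_gzip_file(b'\x1f\x8b')   # gzip magic
assert is_xz_file(b'\xfd\x37')     # xz magic
assert not is_gzip_file(b'PK')     # e.g. a zip archive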
@dataclass
class RepoInfo:
# path to a directory with repo directories. E.g. '/var/repos' contains
# 'appstream', 'baseos', etc.
# Or 'http://koji.cloudlinux.com/mirrors/rhel_mirror' if you are
# using remote repo
path: AnyStr
# name of folder with a repodata folder. E.g. 'baseos', 'appstream', etc
folder: AnyStr
# name of repo. E.g. 'BaseOS', 'AppStream', etc
name: AnyStr
# architecture of repo. E.g. 'x86_64', 'i686', etc
arch: AnyStr
# Is a repo remote or local
is_remote: bool
# Is this a reference repository (usually it's a RHEL repo)?
# The layout of packages from such a repository will be taken as the example.
# Only the layout of a specific package (which doesn't exist
# in a reference repository) will be taken from a non-reference repository
is_reference: bool = False
class PackagesGenerator:
def __init__(
self,
repos: List[RepoInfo],
excluded_packages: List[AnyStr],
included_packages: List[AnyStr],
):
self.repos = repos
self.excluded_packages = excluded_packages
self.included_packages = included_packages
self.tmp_files = []
def __del__(self):
for tmp_file in self.tmp_files:
if os.path.exists(tmp_file):
os.remove(tmp_file)
@staticmethod
def _warning_callback(warning_type, message):
"""
Warning callback for createrepo_c parsing functions
"""
print(f'Warning message: "{message}"; warning type: "{warning_type}"')
return True
@staticmethod
def get_remote_file_content(file_url: AnyStr) -> AnyStr:
"""
Get content from a remote file and write it to a temp file
:param file_url: url of a remote file
:return: path to a temp file
"""
file_request = requests.get(
url=file_url,
)
file_request.raise_for_status()
with tempfile.NamedTemporaryFile(delete=False) as file_stream:
file_stream.write(file_request.content)
return file_stream.name
@staticmethod
def _parse_repomd(repomd_file_path: AnyStr) -> cr.Repomd:
"""
Parse the file repomd.xml and create a Repomd object
:param repomd_file_path: path to local repomd.xml
"""
return cr.Repomd(repomd_file_path)
def _parse_primary_file(
self,
primary_file_path: AnyStr,
packages: Dict[AnyStr, cr.Package],
) -> None:
"""
Parse primary.xml.gz, take package info from it and put it into
the packages dict
:param primary_file_path: path to local primary.xml.gz
:param packages: dictionary which will contain info about packages
from repository
"""
cr.xml_parse_primary(
path=primary_file_path,
pkgcb=lambda pkg: packages.update({
pkg.pkgId: pkg,
}),
do_files=False,
warningcb=self._warning_callback,
)
def _parse_filelists_file(
self,
filelists_file_path: AnyStr,
packages: Dict[AnyStr, cr.Package],
) -> None:
"""
Parse filelists.xml.gz, take package info from it and put it into
the packages dict
:param filelists_file_path: path to local filelists.xml.gz
:param packages: dictionary which will contain info about packages
from repository
"""
cr.xml_parse_filelists(
path=filelists_file_path,
newpkgcb=lambda pkg_id, name, arch: packages.get(
pkg_id,
None,
),
warningcb=self._warning_callback,
)
def _parse_other_file(
self,
other_file_path: AnyStr,
packages: Dict[AnyStr, cr.Package],
) -> None:
"""
Parse other.xml.gz, take package info from it and put it into
the packages dict
:param other_file_path: path to local other.xml.gz
:param packages: dictionary which will contain info about packages
from repository
"""
cr.xml_parse_other(
path=other_file_path,
newpkgcb=lambda pkg_id, name, arch: packages.get(
pkg_id,
None,
),
warningcb=self._warning_callback,
)
@classmethod
def _parse_modules_file(
cls,
modules_file_path: AnyStr,
) -> Iterator[Any]:
"""
Parse modules.yaml.gz and return the parsed data
:param modules_file_path: path to local modules.yaml.gz
:return: list of dicts, one for each module in a repo
"""
with open(modules_file_path, 'rb') as modules_file:
data = modules_file.read()
if is_gzip_file(data[:2]):
data = gzip.decompress(data)
elif is_xz_file(data[:2]):
data = lzma.decompress(data)
return yaml.load_all(
data,
Loader=yaml.BaseLoader,
)
def _get_repomd_records(
self,
repo_info: RepoInfo,
) -> List[cr.RepomdRecord]:
"""
Get and parse the file repomd.xml, then extract the repomd records from it
:param repo_info: structure which contains info about a current repo
:return: list with repomd records
"""
repomd_file_path = os.path.join(
repo_info.path,
repo_info.folder,
'repodata',
'repomd.xml',
)
if repo_info.is_remote:
    repomd_file_path = self.get_remote_file_content(repomd_file_path)
repomd_object = self._parse_repomd(repomd_file_path)
if repo_info.is_remote:
os.remove(repomd_file_path)
return repomd_object.records
def _download_repomd_records(
self,
repo_info: RepoInfo,
repomd_records: List[cr.RepomdRecord],
repomd_records_dict: Dict[str, str],
):
"""
Download repomd records
:param repo_info: structure which contains info about a current repo
:param repomd_records: list with repomd records
:param repomd_records_dict: dict with paths to repodata files
"""
for repomd_record in repomd_records:
if repomd_record.type not in (
'primary',
'filelists',
'other',
):
continue
repomd_record_file_path = os.path.join(
repo_info.path,
repo_info.folder,
repomd_record.location_href,
)
if repo_info.is_remote:
repomd_record_file_path = self.get_remote_file_content(
repomd_record_file_path)
self.tmp_files.append(repomd_record_file_path)
repomd_records_dict[repomd_record.type] = repomd_record_file_path
def _parse_module_repomd_record(
self,
repo_info: RepoInfo,
repomd_records: List[cr.RepomdRecord],
) -> List[Dict]:
"""
Parse the modules repomd record and return the modules data
:param repo_info: structure which contains info about a current repo
:param repomd_records: list with repomd records
:return: list of dicts with parsed modules data
"""
for repomd_record in repomd_records:
if repomd_record.type != 'modules':
continue
repomd_record_file_path = os.path.join(
repo_info.path,
repo_info.folder,
repomd_record.location_href,
)
if repo_info.is_remote:
repomd_record_file_path = self.get_remote_file_content(
repomd_record_file_path)
self.tmp_files.append(repomd_record_file_path)
return list(self._parse_modules_file(
repomd_record_file_path,
))
@staticmethod
def compare_pkgs_version(package_1: Package, package_2: Package) -> int:
version_tuple_1 = (
package_1.epoch,
package_1.version,
package_1.release,
)
version_tuple_2 = (
package_2.epoch,
package_2.version,
package_2.release,
)
return rpm.labelCompare(version_tuple_1, version_tuple_2)
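`rpm.labelCompare` compares two (epoch, version, release) triples and returns 1, 0 or -1; createrepo_c exposes all three fields as strings. A short sketch with illustrative values:

import rpm

rpm.labelCompare(('0', '1.2.3', '2.el8'), ('0', '1.2.3', '1.el8'))  # -> 1
rpm.labelCompare(('0', '1.2.3', '1.el8'), ('0', '1.2.3', '1.el8'))  # -> 0
rpm.labelCompare(('0', '1.2.3', '1.el8'), ('1', '1.0', '1.el8'))    # -> -1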
def generate_packages_json(
self
) -> Dict[AnyStr, Dict[AnyStr, Dict[AnyStr, List[AnyStr]]]]:
"""
Generate packages.json
"""
packages_json = defaultdict(
lambda: defaultdict(
lambda: defaultdict(
list,
)
)
)
all_packages = defaultdict(lambda: {'variants': list()})
for repo_info in self.repos:
repo_arches = [
repo_info.arch,
'noarch',
]
if repo_info.arch == 'x86_64':
repo_arches.extend([
'i686',
'i386',
])
repomd_records = self._get_repomd_records(
repo_info=repo_info,
)
repomd_records_dict = {} # type: Dict[str, str]
self._download_repomd_records(
repo_info=repo_info,
repomd_records=repomd_records,
repomd_records_dict=repomd_records_dict,
)
packages_iterator = PackageIterator(
primary_path=repomd_records_dict['primary'],
filelists_path=repomd_records_dict['filelists'],
other_path=repomd_records_dict['other'],
warningcb=self._warning_callback,
)
for package in packages_iterator:
if package.arch not in repo_arches:
package_arch = repo_info.arch
else:
package_arch = package.arch
package_key = f'{package.name}.{package_arch}'
if 'module' in package.release and not any(
re.search(included_package, package.name)
for included_package in self.included_packages
):
# Even a module package will be added to packages.json if
# it is present in the list of included packages
continue
if package_key not in all_packages:
all_packages[package_key]['variants'].append(
repo_info.name
)
all_packages[package_key]['arch'] = repo_info.arch
all_packages[package_key]['package'] = package
all_packages[package_key]['type'] = repo_info.is_reference
# replace an older package if the stored one is not a reference one, or
# if the newer package is from a repo with the same reference status
elif (not all_packages[package_key]['type'] or
all_packages[package_key]['type'] ==
repo_info.is_reference) and \
self.compare_pkgs_version(
package,
all_packages[package_key]['package']
) > 0:
all_packages[package_key]['variants'] = [repo_info.name]
all_packages[package_key]['arch'] = repo_info.arch
all_packages[package_key]['package'] = package
elif self.compare_pkgs_version(
package,
all_packages[package_key]['package']
) == 0:
all_packages[package_key]['variants'].append(
repo_info.name
)
for package_dict in all_packages.values():
repo_arches = [
package_dict['arch'],
'noarch',
]
if package_dict['arch'] == 'x86_64':
repo_arches.extend([
'i686',
'i386',
])
for variant in package_dict['variants']:
repo_arch = package_dict['arch']
package = package_dict['package']
package_name = package.name
if package.arch not in repo_arches:
package_arch = package_dict['arch']
else:
package_arch = package.arch
if any(re.search(excluded_package, package_name)
for excluded_package in self.excluded_packages):
continue
src_package_name = dnf.subject.Subject(
package.rpm_sourcerpm,
).get_nevra_possibilities(
forms=hawkey.FORM_NEVRA,
)
if len(src_package_name) > 1:
# We should stop the utility if we can't get the exact name of the srpm
raise ValueError(
'We can\'t get exact name of srpm '
f'by its NEVRA "{package.rpm_sourcerpm}"'
)
else:
src_package_name = src_package_name[0].name
pkgs_list = packages_json[variant][
repo_arch][src_package_name]
added_pkg = f'{package_name}.{package_arch}'
if added_pkg not in pkgs_list:
pkgs_list.append(added_pkg)
return packages_json
def create_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'--repo-path',
action='append',
help='Path to a folder with repofolders. E.g. "/var/repos" or '
'"http://koji.cloudlinux.com/mirrors/rhel_mirror"',
required=True,
)
parser.add_argument(
'--repo-folder',
action='append',
help='A folder which contains the repodata folder. E.g. "baseos-stream"',
required=True,
)
parser.add_argument(
'--repo-arch',
action='append',
help='What architecture packages a repository contains. E.g. "x86_64"',
required=True,
)
parser.add_argument(
'--repo-name',
action='append',
help='Name of a repository. E.g. "AppStream"',
required=True,
)
parser.add_argument(
'--is-remote',
action='append',
type=str,
help='Whether a repository is remote or local',
choices=['yes', 'no'],
required=True,
)
parser.add_argument(
'--is-reference',
action='append',
type=str,
help='Whether a repository is used as a reference for packages layout',
choices=['yes', 'no'],
required=True,
)
parser.add_argument(
'--excluded-packages',
nargs='+',
type=str,
default=[],
help='A list of packages globally excluded from the generated json. '
'All list elements should be separated by spaces',
required=False,
)
parser.add_argument(
'--included-packages',
nargs='+',
type=str,
default=[],
help='A list of packages globally included in the generated json. '
'All list elements should be separated by spaces',
required=False,
)
parser.add_argument(
'--json-output-path',
type=str,
help='Full path to output json file',
required=True,
)
return parser
def cli_main():
args = create_parser().parse_args()
repos = []
for repo_path, repo_folder, repo_name, \
repo_arch, is_remote, is_reference in zip(
args.repo_path,
args.repo_folder,
args.repo_name,
args.repo_arch,
args.is_remote,
args.is_reference,
):
repos.append(RepoInfo(
path=repo_path,
folder=repo_folder,
name=repo_name,
arch=repo_arch,
is_remote=True if is_remote == 'yes' else False,
is_reference=True if is_reference == 'yes' else False
))
pg = PackagesGenerator(
repos=repos,
excluded_packages=args.excluded_packages,
included_packages=args.included_packages,
)
result = pg.generate_packages_json()
with open(args.json_output_path, 'w') as packages_file:
json.dump(
result,
packages_file,
indent=4,
sort_keys=True,
)
if __name__ == '__main__':
cli_main()
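For orientation, a hypothetical fragment of the generated packages.json: variant -> arch -> source package name -> list of 'name.arch' binary packages (all names below are invented):

{
    "AppStream": {
        "x86_64": {
            "389-ds-base": [
                "389-ds-base.x86_64",
                "389-ds-base-libs.x86_64"
            ]
        }
    }
}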

View File

@ -0,0 +1,241 @@
import gzip
import lzma
import os
from argparse import ArgumentParser, FileType
from io import BytesIO
from pathlib import Path
from typing import List, AnyStr, Iterable, Union, Optional
import logging
from urllib.parse import urljoin
import yaml
import createrepo_c as cr
from typing.io import BinaryIO
from .create_packages_json import PackagesGenerator, is_gzip_file, is_xz_file
EMPTY_FILE = '.empty'
def read_modules_yaml(modules_yaml_path: Union[str, Path]) -> BytesIO:
with open(modules_yaml_path, 'rb') as fp:
return BytesIO(fp.read())
def grep_list_of_modules_yaml(repos_path: AnyStr) -> Iterable[BytesIO]:
"""
Find all valid *modules.yaml.gz files in the repos
:param repos_path: path to a directory which contains repo dirs
:return: iterable object of content from *modules.yaml.*
"""
return (
read_modules_yaml_from_specific_repo(repo_path=path.parent)
for path in Path(repos_path).rglob('repodata')
)
def _is_remote(path: str):
return any(str(path).startswith(protocol)
for protocol in ('http', 'https'))
def read_modules_yaml_from_specific_repo(
repo_path: Union[str, Path]
) -> Optional[BytesIO]:
"""
Read modules_yaml from a specific repo (remote or local)
:param repo_path: path/url to a specific repo
(final dir should contain dir `repodata`)
:return: BytesIO with the content of modules.yaml, or None if the repo has none
"""
if _is_remote(repo_path):
repomd_url = urljoin(
repo_path + '/',
'repodata/repomd.xml',
)
repomd_file_path = PackagesGenerator.get_remote_file_content(
file_url=repomd_url
)
else:
repomd_file_path = os.path.join(
repo_path,
'repodata/repomd.xml',
)
repomd_obj = cr.Repomd(str(repomd_file_path))
for record in repomd_obj.records:
if record.type != 'modules':
continue
else:
if _is_remote(repo_path):
modules_yaml_url = urljoin(
repo_path + '/',
record.location_href,
)
modules_yaml_path = PackagesGenerator.get_remote_file_content(
file_url=modules_yaml_url
)
else:
modules_yaml_path = os.path.join(
repo_path,
record.location_href,
)
return read_modules_yaml(modules_yaml_path=modules_yaml_path)
else:
return None
def _should_grep_defaults(
document_type: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
) -> bool:
xor_flag = grep_only_modules_data == grep_only_modules_defaults_data
if document_type == 'modulemd' and (xor_flag or grep_only_modules_data):
return True
return False
def _should_grep_modules(
document_type: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
) -> bool:
xor_flag = grep_only_modules_data == grep_only_modules_defaults_data
if document_type == 'modulemd-defaults' and \
(xor_flag or grep_only_modules_defaults_data):
return True
return False
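The two `--get-only-*` flags come from a mutually exclusive argument group, so at most one can be True; `xor_flag` is therefore True exactly when neither flag is given, in which case both document types are collected. A small truth sketch:

# grep_only_modules_data  grep_only_modules_defaults_data  collected documents
# False                   False                            modulemd and modulemd-defaults
# True                    False                            modulemd only
# False                   True                             modulemd-defaults only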
def collect_modules(
modules_paths: List[BinaryIO],
target_dir: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
):
"""
Read the given modules.yaml.gz files and export modules
and modulemd files from them.
"""
xor_flag = grep_only_modules_defaults_data is grep_only_modules_data
modules_path = os.path.join(target_dir, 'modules')
module_defaults_path = os.path.join(target_dir, 'module_defaults')
if grep_only_modules_data or xor_flag:
os.makedirs(modules_path, exist_ok=True)
if grep_only_modules_defaults_data or xor_flag:
os.makedirs(module_defaults_path, exist_ok=True)
# The module_defaults dir can be empty, but pungi detects the
# empty folder while copying and raises an exception in this case
Path(os.path.join(module_defaults_path, EMPTY_FILE)).touch()
for module_file in modules_paths:
data = module_file.read()
if is_gzip_file(data[:2]):
data = gzip.decompress(data)
elif is_xz_file(data[:2]):
data = lzma.decompress(data)
documents = yaml.load_all(data, Loader=yaml.BaseLoader)
for doc in documents:
path = None
if _should_grep_modules(
doc['document'],
grep_only_modules_data,
grep_only_modules_defaults_data,
):
name = f"{doc['data']['module']}.yaml"
path = os.path.join(module_defaults_path, name)
logging.info('Found %s module defaults', name)
elif _should_grep_defaults(
doc['document'],
grep_only_modules_data,
grep_only_modules_defaults_data,
):
# pungi.phases.pkgset.sources.source_koji.get_koji_modules
stream = doc['data']['stream'].replace('-', '_')
doc_data = doc['data']
name = f"{doc_data['name']}-{stream}-" \
f"{doc_data['version']}.{doc_data['context']}"
arch_dir = os.path.join(
modules_path,
doc_data['arch']
)
os.makedirs(arch_dir, exist_ok=True)
path = os.path.join(
arch_dir,
name,
)
logging.info('Found module %s', name)
if 'artifacts' not in doc['data']:
logging.warning(
'RPM %s does not have explicit list of artifacts',
name
)
if path is not None:
with open(path, 'w') as f:
yaml.dump(doc, f, default_flow_style=False)
def cli_main():
parser = ArgumentParser()
content_type_group = parser.add_mutually_exclusive_group(required=False)
content_type_group.add_argument(
'--get-only-modules-data',
action='store_true',
help='Parse and get only modules data',
)
content_type_group.add_argument(
'--get-only-modules-defaults-data',
action='store_true',
help='Parse and get only modules_defaults data',
)
path_group = parser.add_mutually_exclusive_group(required=True)
path_group.add_argument(
'-p', '--path',
type=FileType('rb'), nargs='+',
help='Path to modules.yaml.gz file. '
'You may pass multiple files by passing -p path1 path2'
)
path_group.add_argument(
'-rp', '--repo-path',
required=False,
type=str,
default=None,
help='Path to a directory which contains repodirs. E.g. /var/repos'
)
path_group.add_argument(
'-rd', '--repodata-paths',
required=False,
type=str,
nargs='+',
default=[],
help='Paths/urls to directories that contain a `repodata` directory',
)
parser.add_argument('-t', '--target', required=True)
namespace = parser.parse_args()
if namespace.repodata_paths:
modules = []
for repodata_path in namespace.repodata_paths:
modules.append(read_modules_yaml_from_specific_repo(
repodata_path,
))
elif namespace.path is not None:
modules = namespace.path
else:
modules = grep_list_of_modules_yaml(namespace.repo_path)
modules = list(filter(lambda i: i is not None, modules))
collect_modules(
modules,
namespace.target,
namespace.get_only_modules_data,
namespace.get_only_modules_defaults_data,
)
if __name__ == '__main__':
cli_main()

View File

@ -0,0 +1,75 @@
from argparse import ArgumentParser
import os
from typing import List
from attr import dataclass
from productmd.common import parse_nvra
@dataclass
class Package:
nvra: str
path: str
def search_rpms(top_dir) -> List[Package]:
"""
Search for all *.rpm files recursively
in the given top directory
Returns:
    list: list of Package objects
"""
rpms = []
for root, dirs, files in os.walk(top_dir):
path = root.split(os.sep)
for file in files:
if not file.endswith('.rpm'):
continue
nvra, _ = os.path.splitext(file)
rpms.append(
Package(nvra=nvra, path=os.path.join('/', *path, file))
)
return rpms
def copy_rpms(packages: List[Package], target_top_dir: str):
"""
Search synced repos for rpms and prepare a
koji-like structure for pungi.
Instead of repos, use the following structure:
# ls /mnt/koji/
i686/ noarch/ x86_64/
Returns:
Nothing:
"""
for package in packages:
info = parse_nvra(package.nvra)
target_arch_dir = os.path.join(target_top_dir, info['arch'])
os.makedirs(target_arch_dir, exist_ok=True)
target_file = os.path.join(target_arch_dir, os.path.basename(package.path))
if not os.path.exists(target_file):
try:
os.link(package.path, target_file)
except OSError:
# hardlink failed, try symlinking
os.symlink(package.path, target_file)
def cli_main():
parser = ArgumentParser()
parser.add_argument('-p', '--path', required=True)
parser.add_argument('-t', '--target', required=True)
namespace = parser.parse_args()
rpms = search_rpms(namespace.path)
copy_rpms(rpms, namespace.target)
if __name__ == '__main__':
cli_main()
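A hypothetical end-to-end use of the two helpers (both paths are examples):

# collect every *.rpm under the synced repos...
rpms = search_rpms('/mnt/repos')
# ...and hardlink (or symlink) them into a koji-like tree, one dir per arch
copy_rpms(rpms, '/mnt/koji')
# /mnt/koji now contains e.g. i686/ noarch/ x86_64/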

View File

@ -305,6 +305,8 @@ class CompsWrapper(object):
append_common_info(doc, group_node, group, force_description=True)
append_bool(doc, group_node, "default", group.default)
append_bool(doc, group_node, "uservisible", group.uservisible)
if group.display_order is not None:
append(doc, group_node, "display_order", str(group.display_order))
if group.lang_only:
append(doc, group_node, "langonly", group.lang_only)

View File

@ -88,5 +88,12 @@ def parse_output(output):
packages.add((name, arch, frozenset(flags)))
else:
name, arch = nevra.rsplit(".", 1)
modules.add(name.split(":", 1)[1])
# replace dashes with underscores in the stream of a module's nevra
# the source of the name looks like
# module:llvm-toolset:rhel8:8040020210411062713:9f9e2e7e.x86_64
name = ':'.join(
item.replace('-', '_') if i == 1 else item for
i, item in enumerate(name.split(':')[1:])
)
modules.add(name)
return packages, modules
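A worked example of the transformation above, with an invented nevra; only the stream field (index 1 once the leading 'module' prefix is dropped) gets its dashes replaced, while the module name keeps its own:

nevra = 'module:perl-DBI:rhel-8:8010020190000000000:abcd1234.x86_64'
name, arch = nevra.rsplit('.', 1)
name = ':'.join(
    item.replace('-', '_') if i == 1 else item
    for i, item in enumerate(name.split(':')[1:])
)
# name == 'perl-DBI:rhel_8:8010020190000000000:abcd1234'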

pungi/wrappers/kojimock.py Normal file
View File

@ -0,0 +1,314 @@
import os
import time
from pathlib import Path
from attr import dataclass
from kobo.rpmlib import parse_nvra
from pungi.module_util import Modulemd
# just a random value which we don't
# use in the mock currently;
# originally builds are filtered by this value
# to get a consistent snapshot of tags and packages
from pungi.scripts.gather_rpms import search_rpms
LAST_EVENT_ID = 999999
# the last event time is not important, but the build
# time should be less than it
LAST_EVENT_TIME = time.time()
BUILD_TIME = 0
# virtual build that collects all
# packages built for some arch
RELEASE_BUILD_ID = 15270
# tag that should have all packages available
ALL_PACKAGES_TAG = 'dist-c8-compose'
# tag that should have all modules available
ALL_MODULES_TAG = 'dist-c8-module-compose'
@dataclass
class Module:
build_id: int
name: str
nvr: str
stream: str
version: str
context: str
arch: str
class KojiMock:
"""
Class that acts like real koji (for some needed methods)
but uses local storage as data source
"""
def __init__(self, packages_dir, modules_dir):
self._modules = self._gather_modules(modules_dir)
self._modules_dir = modules_dir
self._packages_dir = packages_dir
@staticmethod
def _gather_modules(modules_dir):
modules = {}
for index, (f, arch) in enumerate(
(sub_path.name, sub_path.parent.name)
for path in Path(modules_dir).glob('*')
for sub_path in path.iterdir()
):
parsed = parse_nvra(f)
modules[index] = Module(
name=parsed['name'],
nvr=f,
version=parsed['release'],
context=parsed['arch'],
stream=parsed['version'],
build_id=index,
arch=arch,
)
return modules
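A sketch of the on-disk layout `_gather_modules` expects and how the `parse_nvra` fields map onto Module (the module file name is hypothetical):

# <modules_dir>/<arch>/<module file>, e.g.
#   /mnt/koji/modules/x86_64/perl-5.24-8010020190529084201.3af8e029
parsed = parse_nvra('perl-5.24-8010020190529084201.3af8e029')
# parsed['name']    -> 'perl'                 -> Module.name
# parsed['version'] -> '5.24'                 -> Module.stream
# parsed['release'] -> '8010020190529084201'  -> Module.version
# parsed['arch']    -> '3af8e029'             -> Module.context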
@staticmethod
def getLastEvent(*args, **kwargs):
return {'id': LAST_EVENT_ID, 'ts': LAST_EVENT_TIME}
def listTagged(self, tag_name, *args, **kwargs):
"""
Returns a list of virtual 'builds' that contain packages for the given tag.
There are two kinds of tags: modular and distributive.
For now, only one kind, the modular one, is needed here.
"""
if tag_name != ALL_MODULES_TAG:
raise ValueError("I don't know what tag is %s" % tag_name)
builds = []
for module in self._modules.values():
builds.append({
'build_id': module.build_id,
'owner_name': 'centos',
'package_name': module.name,
'nvr': module.nvr,
'version': module.stream,
'release': '%s.%s' % (module.version, module.context),
'name': module.name,
'id': module.build_id,
'tag_name': tag_name,
# Following fields are currently not
# used but returned by real koji
# left them here just for reference
#
# 'task_id': None,
# 'state': 1,
# 'start_time': '2020-12-23 16:43:59',
# 'creation_event_id': 309485,
# 'creation_time': '2020-12-23 17:05:33.553748',
# 'epoch': None, 'tag_id': 533,
# 'completion_time': '2020-12-23 17:05:23',
# 'volume_id': 0,
# 'package_id': 3221,
# 'owner_id': 11,
# 'volume_name': 'DEFAULT',
})
return builds
@staticmethod
def getFullInheritance(*args, **kwargs):
"""
Unneeded because we use local storage.
"""
return []
def getBuild(self, build_id, *args, **kwargs):
"""
Used to get information about build
(used in pungi only for modules currently)
"""
module = self._modules[build_id]
result = {
'id': build_id,
'name': module.name,
'version': module.stream,
'release': '%s.%s' % (module.version, module.context),
'completion_ts': BUILD_TIME,
'state': 'COMPLETE',
'arch': module.arch,
'extra': {
'typeinfo': {
'module': {
'stream': module.stream,
'version': module.version,
'name': module.name,
'context': module.context,
'content_koji_tag': '-'.join([
module.name,
module.stream,
module.version
]) + '.' + module.context
}
}
}
}
return result
def listArchives(self, build_id, *args, **kwargs):
"""
Originally lists artifacts for a build; in pungi it is used
only to get the list of modulemd files for a module
"""
module = self._modules[build_id]
return [
{
'build_id': module.build_id,
'filename': f'modulemd.{module.arch}.txt',
'btype': 'module'
},
# no one ever uses this file,
# but it must be present because pungi ignores builds
# with len(files) <= 1
{
'build_id': module.build_id,
'filename': 'modulemd.txt',
'btype': 'module'
}
]
def listTaggedRPMS(self, tag_name, *args, **kwargs):
"""
Get information about packages that are tagged by a tag.
There are two kinds of tags: per-module and per-distr.
"""
if tag_name == ALL_PACKAGES_TAG:
builds, packages = self._get_release_packages()
else:
builds, packages = self._get_module_packages(tag_name)
return [
packages,
builds
]
def _get_release_packages(self):
"""
Search packages dir and keep only
packages that are non-modular.
This is quite the way how real koji works:
- modular packages are tagged by module-* tag
- all other packages are tagged with dist* tag
"""
packages = []
# get all rpms in folder
rpms = search_rpms(self._packages_dir)
all_rpms = [package.path for package in rpms]
# get nvras for modular packages
nvras = set()
for module in self._modules.values():
path = os.path.join(
self._modules_dir,
module.arch,
module.nvr,
)
            with open(path) as f:
                info = Modulemd.ModuleStream.read_string(f.read(), strict=True)
for package in info.get_rpm_artifacts():
data = parse_nvra(package)
nvras.add((data['name'], data['version'], data['release'], data['arch']))
        # and remove modular packages from the global list
        # (rpm[:-4] strips the trailing '.rpm' before parsing the NVRA)
        for rpm in all_rpms[:]:
            data = parse_nvra(os.path.basename(rpm[:-4]))
            if (data['name'], data['version'], data['release'], data['arch']) in nvras:
                all_rpms.remove(rpm)
for rpm in all_rpms:
info = parse_nvra(os.path.basename(rpm))
packages.append({
"build_id": RELEASE_BUILD_ID,
"name": info['name'],
"extra": None,
"arch": info['arch'],
"epoch": info['epoch'] or None,
"version": info['version'],
"metadata_only": False,
"release": info['release'],
# not used currently
# "id": 262555,
# "size": 0
})
builds = []
return builds, packages
def _get_module_packages(self, tag_name):
"""
Get list of builds for module and given module tag name.
"""
        module = self._get_module_by_name(tag_name)
        if module is None:
            raise ValueError('Module %s not found' % tag_name)
path = os.path.join(
self._modules_dir,
module.arch,
tag_name,
)
builds = [
{
"build_id": module.build_id,
"package_name": module.name,
"nvr": module.nvr,
"tag_name": module.nvr,
"version": module.stream,
"release": module.version,
"id": module.build_id,
"name": module.name,
"volume_name": "DEFAULT",
                # The following fields are currently not
                # used but are returned by real koji;
                # left here just for reference
#
# "owner_name": "mbox-mbs-backend",
# "task_id": 195937,
# "state": 1,
# "start_time": "2020-12-22 19:20:12.504578",
# "creation_event_id": 306731,
# "creation_time": "2020-12-22 19:20:12.504578",
# "epoch": None,
# "tag_id": 1192,
# "completion_time": "2020-12-22 19:34:34.716615",
# "volume_id": 0,
# "package_id": 104,
# "owner_id": 6,
}
]
packages = []
if os.path.exists(path):
            with open(path) as f:
                info = Modulemd.ModuleStream.read_string(f.read(), strict=True)
for art in info.get_rpm_artifacts():
data = parse_nvra(art)
packages.append({
"build_id": module.build_id,
"name": data['name'],
"extra": None,
"arch": data['arch'],
"epoch": data['epoch'] or None,
"version": data['version'],
"metadata_only": False,
"release": data['release'],
"id": 262555,
"size": 0
})
else:
            raise RuntimeError('Unable to find module file at %s' % path)
return builds, packages
    def _get_module_by_name(self, tag_name):
        for module in self._modules.values():
            if module.nvr == tag_name:
                return module
        return None
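# Usage sketch (illustrative only, not part of the module): KojiWrapper below
# constructs the mock against a local koji-like layout, where <topdir> holds
# the release rpms and <topdir>/modules/<arch>/<nvr> holds the modulemd files:
#
#   koji_session = KojiMock('/mnt/koji', '/mnt/koji/modules')
#   for build in koji_session.listTagged(ALL_MODULES_TAG):
#       rpms, _ = koji_session.listTaggedRPMS(build['nvr'])  # per-module rpms
#   release_rpms, _ = koji_session.listTaggedRPMS(ALL_PACKAGES_TAG)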

View File

@@ -26,6 +26,7 @@ import six
from six.moves import configparser, shlex_quote
import six.moves.xmlrpc_client as xmlrpclib
from .kojimock import KojiMock
from .. import util
from ..arch_utils import getBaseArch
@@ -36,7 +37,7 @@ KOJI_BUILD_DELETED = koji.BUILD_STATES["DELETED"]
class KojiWrapper(object):
lock = threading.Lock()
def __init__(self, compose):
def __init__(self, compose, real_koji=False):
self.compose = compose
try:
self.profile = self.compose.conf["koji_profile"]
@@ -61,9 +62,14 @@ class KojiWrapper(object):
value = getattr(self.koji_module.config, key, None)
if value is not None:
session_opts[key] = value
self.koji_proxy = koji.ClientSession(
self.koji_module.config.server, session_opts
)
if real_koji:
self.koji_proxy = koji.ClientSession(
self.koji_module.config.server, session_opts
)
else:
self.koji_proxy = KojiMock(
packages_dir=self.koji_module.config.topdir,
modules_dir=os.path.join(self.koji_module.config.topdir, 'modules'))
# This retry should be removed once https://pagure.io/koji/issue/3170 is
# fixed and released.
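# Selection sketch (illustrative, not part of the change): with the new flag
# the wrapper defaults to local storage and only contacts a hub on request:
#
#   KojiWrapper(compose)                  # koji_proxy is a KojiMock
#   KojiWrapper(compose, real_koji=True)  # koji_proxy is a koji.ClientSession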

View File

@@ -47,6 +47,10 @@ setup(
"pungi-gather = pungi.scripts.pungi_gather:cli_main",
"pungi-config-dump = pungi.scripts.config_dump:cli_main",
"pungi-config-validate = pungi.scripts.config_validate:cli_main",
"pungi-gather-modules = pungi.scripts.gather_modules:cli_main",
"pungi-gather-rpms = pungi.scripts.gather_rpms:cli_main",
"pungi-generate-packages-json = pungi.scripts.create_packages_json:cli_main", # noqa: E501
"pungi-create-extra-repo = pungi.scripts.create_extra_repo:cli_main"
]
},
scripts=["contrib/yum-dnf-compare/pungi-compare-depsolving"],
@@ -66,5 +70,5 @@ setup(
"dogpile.cache",
],
extras_require={':python_version=="2.7"': ["enum34", "lockfile"]},
tests_require=["mock", "pytest", "pytest-cov"],
tests_require=["mock", "pytest", "pytest-cov", "pyfakefs"],
)

View File

@@ -0,0 +1,36 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1612479076</revision>
<data type="primary">
<checksum type="sha256">08941fae6bdb14f3b22bfad38b9d7dcb685a9df58fe8f515a3a0b2fe1af903bb</checksum>
<open-checksum type="sha256">2a15e618f049a883d360ccbf3e764b30640255f47dc526c633b1722fe23cbcbc</open-checksum>
<location href="repodata/08941fae6bdb14f3b22bfad38b9d7dcb685a9df58fe8f515a3a0b2fe1af903bb-primary.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>1240</size>
<open-size>3888</open-size>
</data>
<data type="filelists">
<checksum type="sha256">e37a0b4a63b2b245dca1727195300cd3961f80aebc82ae7b9849dbf7482f5d0f</checksum>
<open-checksum type="sha256">b1782bc4207a5b7c3e64115d5a1d001802e8d363f022ea165df7cdab6f14651c</open-checksum>
<location href="repodata/e37a0b4a63b2b245dca1727195300cd3961f80aebc82ae7b9849dbf7482f5d0f-filelists.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>439</size>
<open-size>1295</open-size>
</data>
<data type="other">
<checksum type="sha256">92992176bce71dcde9e4b6ad1442e7b5c7f3de9b7f019a2cd27d042ab38ea2b1</checksum>
<open-checksum type="sha256">3b847919691ad32279b13463de6c08f1f8b32f51e87b7d8d7e95a3ec2f46ef51</open-checksum>
<location href="repodata/92992176bce71dcde9e4b6ad1442e7b5c7f3de9b7f019a2cd27d042ab38ea2b1-other.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>630</size>
<open-size>1911</open-size>
</data>
<data type="modules">
<checksum type="sha256">e7a671401f8e207e4cd3b90b4ac92d621f84a34dc9026f57c3f427fbed444c57</checksum>
<open-checksum type="sha256">d59fee86c18018cc18bb7325aa74aa0abf923c64d29a4ec45e08dcd01a0c3966</open-checksum>
<location href="repodata/e7a671401f8e207e4cd3b90b4ac92d621f84a34dc9026f57c3f427fbed444c57-modules.yaml.gz"/>
<timestamp>1612479075</timestamp>
<size>920</size>
<open-size>3308</open-size>
</data>
</repomd>
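A repomd.xml like the fixture above is just an index of the repodata files. As a minimal stand-alone sketch (pungi itself reads repodata through createrepo_c, so this is illustrative only), the modules record can be located with the standard library:

    import xml.etree.ElementTree as ET

    NS = {'repo': 'http://linux.duke.edu/metadata/repo'}
    root = ET.parse('repodata/repomd.xml').getroot()
    for data in root.findall('repo:data', NS):
        if data.get('type') == 'modules':
            # href is the repo-relative path to the compressed modules.yaml
            print(data.find('repo:location', NS).get('href'))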

View File

@@ -0,0 +1,55 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1666177486</revision>
<data type="primary">
<checksum type="sha256">89cb9cc1181635c9147864a7076d91fb81072641d481cd202832a2d257453576</checksum>
<open-checksum type="sha256">07255d9856f7531b52a6459f6fc7701c6d93c6d6c29d1382d83afcc53f13494a</open-checksum>
<location href="repodata/89cb9cc1181635c9147864a7076d91fb81072641d481cd202832a2d257453576-primary.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>1387</size>
<open-size>6528</open-size>
</data>
<data type="filelists">
<checksum type="sha256">f69ca03957574729fd5150335b0d87afddcfb37a97aed5b06272212854f1773d</checksum>
<open-checksum type="sha256">c2e1e674d7d48bccaa16cae0a5f70cb55ef4cd7352b4d9d4fdaa619075d07dbc</open-checksum>
<location href="repodata/f69ca03957574729fd5150335b0d87afddcfb37a97aed5b06272212854f1773d-filelists.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>1252</size>
<open-size>5594</open-size>
</data>
<data type="other">
<checksum type="sha256">b3827bd6c9ea67ffa3912002515c64e4d9fe5c4dacbf7c46b0d8768b7abbb84f</checksum>
<open-checksum type="sha256">9ce24c526239e349d023c577b2ae3872c8b0f1888aed1fb24b9b9aa12063fdf3</open-checksum>
<location href="repodata/b3827bd6c9ea67ffa3912002515c64e4d9fe5c4dacbf7c46b0d8768b7abbb84f-other.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>999</size>
<open-size>6320</open-size>
</data>
<data type="primary_db">
<checksum type="sha256">ab8df35061dfa0285069b843f24a7076e31266d9a8abe8282340bcb936aa61d7</checksum>
<open-checksum type="sha256">2bce9554ce4496cef34b5cd69f186f7f3143c7cabae8fa384fc5c9eeab326f7f</open-checksum>
<location href="repodata/ab8df35061dfa0285069b843f24a7076e31266d9a8abe8282340bcb936aa61d7-primary.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>3558</size>
<open-size>106496</open-size>
<database_version>10</database_version>
</data>
<data type="filelists_db">
<checksum type="sha256">8bcf6d40db4e922934ac47e8ac7fb8d15bdacf579af8c819d2134ed54d30550b</checksum>
<open-checksum type="sha256">f7001d1df7f5f7e4898919b15710bea8ed9711ce42faf68e22b757e63169b1fb</open-checksum>
<location href="repodata/8bcf6d40db4e922934ac47e8ac7fb8d15bdacf579af8c819d2134ed54d30550b-filelists.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>2360</size>
<open-size>28672</open-size>
<database_version>10</database_version>
</data>
<data type="other_db">
<checksum type="sha256">01b82e9eb7ee9151f283c6e761ae450de18ed2d64b5e32de88689eaf95216a80</checksum>
<open-checksum type="sha256">07f5b9750af1e440d37ca216e719dd288149e79e9132f2fdccb6f73b2e5dd541</open-checksum>
<location href="repodata/01b82e9eb7ee9151f283c6e761ae450de18ed2d64b5e32de88689eaf95216a80-other.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>2196</size>
<open-size>32768</open-size>
<database_version>10</database_version>
</data>
</repomd>

View File

@@ -0,0 +1,55 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1666177500</revision>
<data type="primary">
<checksum type="sha256">a1d342aa7cef3a2034fc3f9d6ee02d63572780bc76e61749a57e50b6b3ca9869</checksum>
<open-checksum type="sha256">a9e3eae447dd44282d7d96db5f15f049b757925397adb752f4df982176bab7e0</open-checksum>
<location href="repodata/a1d342aa7cef3a2034fc3f9d6ee02d63572780bc76e61749a57e50b6b3ca9869-primary.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>3501</size>
<open-size>37296</open-size>
</data>
<data type="filelists">
<checksum type="sha256">6778922d5853d20f213ae7702699a76f1e87e55d6bfb5e4ac6a117d904d47b3c</checksum>
<open-checksum type="sha256">e30b666d9d88a70de69a08f45e6696bcd600c45485d856bd0213395d7da7bd49</open-checksum>
<location href="repodata/6778922d5853d20f213ae7702699a76f1e87e55d6bfb5e4ac6a117d904d47b3c-filelists.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>27624</size>
<open-size>318187</open-size>
</data>
<data type="other">
<checksum type="sha256">5a60d79d8bce6a805f4fdb22fd891524359dce8ccc665c0b54e7299e79debe84</checksum>
<open-checksum type="sha256">b18138f4a3de45714e578fb1f30b7ec54fdcdaf1a22585891625b6af0894388e</open-checksum>
<location href="repodata/5a60d79d8bce6a805f4fdb22fd891524359dce8ccc665c0b54e7299e79debe84-other.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>1876</size>
<open-size>28701</open-size>
</data>
<data type="primary_db">
<checksum type="sha256">c27bc2ce947173aba305041552c3c6d8db71442c1a2e5dcaf35ff750fe0469fc</checksum>
<open-checksum type="sha256">586e1af8934229925adb9e746ae5ced119859dfd97f4e3237399bb36a7d7f071</open-checksum>
<location href="repodata/c27bc2ce947173aba305041552c3c6d8db71442c1a2e5dcaf35ff750fe0469fc-primary.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>11528</size>
<open-size>126976</open-size>
<database_version>10</database_version>
</data>
<data type="filelists_db">
<checksum type="sha256">ed350865982e7a1e45b144839b56eac888e5d8f680571dd2cd06b37dc83e0fd8</checksum>
<open-checksum type="sha256">697903989d0f77de2d44a2b603e75c9b4ca23b3795eb136d175caf5666ce6459</open-checksum>
<location href="repodata/ed350865982e7a1e45b144839b56eac888e5d8f680571dd2cd06b37dc83e0fd8-filelists.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>20440</size>
<open-size>163840</open-size>
<database_version>10</database_version>
</data>
<data type="other_db">
<checksum type="sha256">35eff699131e0976429144c6f4514d21568177dc64bb4091c3ff62f76b293725</checksum>
<open-checksum type="sha256">3bd999a1bdf300df836a4607b7b75f845d8e1432e3e4e1ab6f0c7cc8a853db39</open-checksum>
<location href="repodata/35eff699131e0976429144c6f4514d21568177dc64bb4091c3ff62f76b293725-other.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>4471</size>
<open-size>49152</open-size>
<database_version>10</database_version>
</data>
</repomd>

View File

@@ -0,0 +1,58 @@
[checksums]
images/boot.iso = sha256:fc8a4be604b6425746f12fa706116eb940f93358f036b8fbbe518b516cb6870c
[general]
; WARNING.0 = This section provides compatibility with pre-productmd treeinfos.
; WARNING.1 = Read productmd documentation for details about new format.
arch = x86_64
family = Test
name = Test 1.0
packagedir = Packages
platforms = x86_64,xen
repository = .
timestamp = 1531881582
variant = Server
variants = Client,Server
version = 1.0
[header]
type = productmd.treeinfo
version = 1.2
[images-x86_64]
boot.iso = images/boot.iso
[images-xen]
initrd = images/pxeboot/initrd.img
kernel = images/pxeboot/vmlinuz
[release]
name = Test
short = T
version = 1.0
[stage2]
mainimage = images/install.img
[tree]
arch = x86_64
build_timestamp = 1531881582
platforms = x86_64,xen
variants = Client,Server
[variant-Client]
id = Client
name = Client
packages = ../../../Client/x86_64/os/Packages
repository = ../../../Client/x86_64/os
type = variant
uid = Client
[variant-Server]
id = Server
name = Server
packages = Packages
repository = .
type = variant
uid = Server
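The .treeinfo fixture above is plain INI, so its variant layout can be inspected with the standard library; a small sketch (illustrative only -- pungi manipulates treeinfo through productmd):

    import configparser

    ti = configparser.ConfigParser()
    ti.read('.treeinfo')
    # each variant declares the repository path that the extra_isos
    # tests below exercise via tweak_repo_treeinfo
    for uid in ti.get('tree', 'variants').split(','):
        print(uid, ti.get('variant-%s' % uid, 'repository'))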

View File

@@ -7,7 +7,7 @@ import shutil
import tempfile
from collections import defaultdict
import mock
from unittest import mock
import six
from kobo.rpmlib import parse_nvr

View File

@@ -1,15 +1,15 @@
# -*- coding: utf-8 -*-
try:
import unittest2 as unittest
except ImportError:
import unittest
import mock
from unittest import mock
import six
from copy import copy
from six.moves import StringIO
from ddt import ddt, data
import os
@@ -2005,6 +2005,7 @@ class BuildinstallThreadTestCase(PungiTestCase):
self.assertEqual(ret, None)
@ddt
class TestSymlinkIso(PungiTestCase):
def setUp(self):
super(TestSymlinkIso, self).setUp()
@@ -2020,8 +2021,13 @@ class TestSymlinkIso(PungiTestCase):
@mock.patch("pungi.phases.buildinstall.get_file_size")
@mock.patch("pungi.phases.buildinstall.iso")
@mock.patch("pungi.phases.buildinstall.run")
def test_hardlink(self, run, iso, get_file_size, get_mtime, ImageCls):
self.compose.conf = {"buildinstall_symlink": False, "disc_types": {}}
@data(['Server'], ['BaseOS'])
def test_hardlink(self, netinstall_variants, run, iso, get_file_size, get_mtime, ImageCls):
self.compose.conf = {
"buildinstall_symlink": False,
"disc_types": {},
"netinstall_variants": netinstall_variants,
}
get_file_size.return_value = 1024
get_mtime.return_value = 13579
@@ -2071,9 +2077,14 @@ class TestSymlinkIso(PungiTestCase):
self.assertEqual(image.bootable, True)
self.assertEqual(image.implant_md5, iso.get_implanted_md5.return_value)
self.assertEqual(image.can_fail, False)
self.assertEqual(
self.compose.im.add.mock_calls, [mock.call("Server", "x86_64", image)]
)
if 'Server' in netinstall_variants:
self.assertEqual(
self.compose.im.add.mock_calls, [mock.call("Server", "x86_64", image)]
)
else:
self.assertEqual(
self.compose.im.add.mock_calls, []
)
@mock.patch("pungi.phases.buildinstall.Image")
@mock.patch("pungi.phases.buildinstall.get_mtime")
@@ -2086,6 +2097,7 @@ class TestSymlinkIso(PungiTestCase):
self.compose.conf = {
"buildinstall_symlink": False,
"disc_types": {"boot": "netinst"},
"netinstall_variants": ['Server'],
}
get_file_size.return_value = 1024
get_mtime.return_value = 13579

View File

@@ -0,0 +1,221 @@
# coding=utf-8
import os
from unittest import TestCase, mock, main
import yaml
from pungi.scripts.create_extra_repo import CreateExtraRepo, ExtraRepoInfo
FOLDER_WITH_TEST_DATA = os.path.join(
os.path.dirname(
os.path.abspath(__file__)
),
'data/test_create_extra_repo/',
)
TEST_MODULE_INFO = yaml.load("""
---
document: modulemd
version: 2
data:
name: perl-App-cpanminus
stream: 1.7044
version: 8030020210126085450
context: 3a33b840
arch: x86_64
summary: Get, unpack, build and install CPAN modules
description: >
This is a CPAN client that requires zero configuration, and stands alone but it's
maintainable and extensible with plug-ins and friendly to shell scripting.
license:
module:
- MIT
content:
- (GPL+ or Artistic) and GPLv2+
- ASL 2.0
- GPL+ or Artistic
dependencies:
- buildrequires:
perl: [5.30]
platform: [el8.3.0]
requires:
perl: [5.30]
perl-YAML: []
platform: [el8]
references:
community: https://metacpan.org/release/App-cpanminus
profiles:
common:
description: App-cpanminus distribution
rpms:
- perl-App-cpanminus
api:
rpms:
- perl-App-cpanminus
filter:
rpms:
- perl-CPAN-DistnameInfo-dummy
- perl-Test-Deep
buildopts:
rpms:
macros: >
%_without_perl_CPAN_Meta_Check_enables_extra_test 1
components:
rpms:
perl-App-cpanminus:
rationale: The API.
ref: perl-App-cpanminus-1.7044-5.module+el8.2.0+4278+abcfa81a.src.rpm
buildorder: 1
arches: [i686, x86_64]
perl-CPAN-DistnameInfo:
rationale: Run-time dependency.
ref: stream-0.12-rhel-8.3.0
arches: [i686, x86_64]
perl-CPAN-Meta-Check:
rationale: Run-time dependency.
ref: perl-CPAN-Meta-Check-0.014-6.module+el8.2.0+4278+abcfa81a.src.rpm
buildorder: 1
arches: [i686, x86_64]
perl-File-pushd:
rationale: Run-time dependency.
ref: perl-File-pushd-1.014-6.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Module-CPANfile:
rationale: Run-time dependency.
ref: perl-Module-CPANfile-1.1002-7.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Parse-PMFile:
rationale: Run-time dependency.
ref: perl-Parse-PMFile-0.41-7.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-String-ShellQuote:
rationale: Run-time dependency.
ref: perl-String-ShellQuote-1.04-24.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Test-Deep:
rationale: Build-time dependency.
ref: stream-1.127-rhel-8.3.0
arches: [i686, x86_64]
artifacts:
rpms:
- perl-App-cpanminus-0:1.7044-5.module_el8.3.0+2027+c8990d1d.noarch
- perl-App-cpanminus-0:1.7044-5.module_el8.3.0+2027+c8990d1d.src
- perl-CPAN-Meta-Check-0:0.014-6.module_el8.3.0+2027+c8990d1d.noarch
- perl-CPAN-Meta-Check-0:0.014-6.module_el8.3.0+2027+c8990d1d.src
- perl-File-pushd-0:1.014-6.module_el8.3.0+2027+c8990d1d.noarch
- perl-File-pushd-0:1.014-6.module_el8.3.0+2027+c8990d1d.src
- perl-Module-CPANfile-0:1.1002-7.module_el8.3.0+2027+c8990d1d.noarch
- perl-Module-CPANfile-0:1.1002-7.module_el8.3.0+2027+c8990d1d.src
- perl-Parse-PMFile-0:0.41-7.module_el8.3.0+2027+c8990d1d.noarch
- perl-Parse-PMFile-0:0.41-7.module_el8.3.0+2027+c8990d1d.src
- perl-String-ShellQuote-0:1.04-24.module_el8.3.0+2027+c8990d1d.noarch
- perl-String-ShellQuote-0:1.04-24.module_el8.3.0+2027+c8990d1d.src
...
""", Loader=yaml.BaseLoader)
TEST_REPO_INFO = ExtraRepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo',
name='TestRepo',
arch='x86_64',
is_remote=False,
packages=[],
modules=[],
)
BS_BUILD_INFO = {
'build_platforms': [
{
'architectures': ['non_fake_arch', 'fake_arch'],
'name': 'fake_platform'
}
]
}
class TestCreateExtraRepo(TestCase):
maxDiff = None
def test_01_get_repo_info_from_bs_repo(self):
auth_token = 'fake_auth_token'
build_id = 'fake_build_id'
arch = 'fake_arch'
packages = ['fake_package1', 'fake_package2']
modules = ['fake_module1', 'fake_module2']
request_object = mock.Mock()
request_object.raise_for_status = lambda: True
request_object.json = lambda: BS_BUILD_INFO
with mock.patch(
'pungi.scripts.create_extra_repo.requests.get',
return_value=request_object,
) as mock_request_get:
repos_info = CreateExtraRepo.get_repo_info_from_bs_repo(
auth_token=auth_token,
build_id=build_id,
arch=arch,
packages=packages,
modules=modules,
)
self.assertEqual(
[
ExtraRepoInfo(
path='https://build.cloudlinux.com/'
f'build_repos/{build_id}/fake_platform',
folder=arch,
name=f'{build_id}-fake_platform-{arch}',
arch=arch,
is_remote=True,
packages=packages,
modules=modules,
)
],
repos_info,
)
mock_request_get.assert_called_once_with(
url=f'https://build.cloudlinux.com/api/v1/builds/{build_id}',
headers={
'Authorization': f'Bearer {auth_token}',
}
)
def test_02_create_extra_repo(self):
with mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._read_local_modules_yaml',
return_value=[],
) as mock__read_local_modules_yaml, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._download_rpm_to_local_repo',
) as mock__download_rpm_to_local_repo, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._dump_local_modules_yaml'
) as mock__dump_local_modules_yaml, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._create_local_extra_repo'
) as mock__create_local_extra_repo:
cer = CreateExtraRepo(
repos=[TEST_REPO_INFO],
bs_auth_token='fake_auth_token',
local_repository_path='/path/to/local/repo',
clear_target_repo=False,
)
mock__read_local_modules_yaml.assert_called_once_with()
cer.create_extra_repo()
mock__download_rpm_to_local_repo.assert_called_once_with(
package_location='perl-App-cpanminus-1.7044-5.'
'module_el8.3.0+2027+c8990d1d.noarch.rpm',
repo_info=TEST_REPO_INFO,
)
mock__dump_local_modules_yaml.assert_called_once_with()
mock__create_local_extra_repo.assert_called_once_with()
self.assertEqual(
[TEST_MODULE_INFO],
cer.local_modules_data,
)
if __name__ == '__main__':
main()

View File

@@ -0,0 +1,97 @@
# coding=utf-8
import os
from collections import defaultdict
from unittest import TestCase, mock, main
from pungi.scripts.create_packages_json import PackagesGenerator, RepoInfo
FOLDER_WITH_TEST_DATA = os.path.join(
os.path.dirname(
os.path.abspath(__file__)
),
'data/test_create_packages_json/',
)
test_repo_info = RepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo',
name='TestRepo',
arch='x86_64',
is_remote=False,
is_reference=True,
)
test_repo_info_2 = RepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo_2',
name='TestRepo2',
arch='x86_64',
is_remote=False,
is_reference=True,
)
class TestPackagesJson(TestCase):
def test_01_get_remote_file_content(self):
"""
        Test getting the content of a remote file
"""
request_object = mock.Mock()
request_object.raise_for_status = lambda: True
request_object.content = b'TestContent'
with mock.patch(
'pungi.scripts.create_packages_json.requests.get',
return_value=request_object,
) as mock_requests_get, mock.patch(
'pungi.scripts.create_packages_json.tempfile.NamedTemporaryFile',
) as mock_tempfile:
mock_tempfile.return_value.__enter__.return_value.name = 'tmpfile'
file_name = PackagesGenerator.get_remote_file_content(
file_url='fakeurl')
mock_requests_get.assert_called_once_with(url='fakeurl')
mock_tempfile.assert_called_once_with(delete=False)
mock_tempfile.return_value.__enter__().\
write.assert_called_once_with(b'TestContent')
self.assertEqual(
file_name,
'tmpfile',
)
def test_02_generate_additional_packages(self):
pg = PackagesGenerator(
repos=[
test_repo_info,
test_repo_info_2,
],
excluded_packages=['zziplib-utils'],
included_packages=['vim-file*'],
)
test_packages = defaultdict(
lambda: defaultdict(
lambda: defaultdict(
list,
)
)
)
test_packages['TestRepo']['x86_64']['zziplib'] = \
[
'zziplib.i686',
'zziplib.x86_64',
]
test_packages['TestRepo2']['x86_64']['vim'] = \
[
'vim-X11.i686',
'vim-common.i686',
'vim-enhanced.i686',
'vim-filesystem.noarch',
]
result = pg.generate_packages_json()
self.assertEqual(
test_packages,
result,
)
if __name__ == '__main__':
main()

View File

@@ -2,6 +2,8 @@
import logging
import mock
from typing import AnyStr, List
from unittest import mock
import six
import os
@@ -614,6 +616,7 @@ class GetExtraFilesTest(helpers.PungiTestCase):
)
@mock.patch("pungi.phases.extra_isos.tweak_repo_treeinfo")
@mock.patch("pungi.phases.extra_isos.tweak_treeinfo")
@mock.patch("pungi.wrappers.iso.write_graft_points")
@mock.patch("pungi.wrappers.iso.get_graft_points")
@@ -623,7 +626,7 @@ class GetIsoContentsTest(helpers.PungiTestCase):
self.compose = helpers.DummyCompose(self.topdir, {})
self.variant = self.compose.variants["Server"]
def test_non_bootable_binary(self, ggp, wgp, tt):
def test_non_bootable_binary(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/x86_64/os/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/x86_64/os/repodata": {
@@ -693,7 +696,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_inherit_extra_files(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def test_inherit_extra_files(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/x86_64/os/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/x86_64/os/repodata": {
@@ -767,7 +778,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_source(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def test_source(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/source/tree/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/source/tree/repodata": {
@@ -837,7 +856,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_bootable(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='source/tree',
)
def test_bootable(self, ggp, wgp, tt, trt):
self.compose.conf["buildinstall_method"] = "lorax"
bi_dir = os.path.join(self.topdir, "work/x86_64/buildinstall/Server")
@@ -939,6 +966,42 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def _tweak_repo_treeinfo_call_list_checker(
self,
trt_mock: mock.Mock,
main_variant: AnyStr,
addon_variants: List[AnyStr],
sub_path: AnyStr) -> None:
"""
Check correct call to tweak_repo_treeinfo
"""
path_to_treeinfo = os.path.join(
self.topdir,
'compose',
main_variant,
sub_path,
'.treeinfo',
)
self.assertEqual(
trt_mock.call_args_list,
[
mock.call(
self.compose,
addon_variants,
path_to_treeinfo,
path_to_treeinfo,
)
]
)
class GetFilenameTest(helpers.PungiTestCase):
def test_use_original_name(self):
@@ -1016,6 +1079,15 @@ class TweakTreeinfoTest(helpers.PungiTestCase):
self.assertFilesEqual(output, expected)
def test_repo_tweak(self):
compose = helpers.DummyCompose(self.topdir, {})
input = os.path.join(helpers.FIXTURE_DIR, "extraiso.treeinfo")
output = os.path.join(self.topdir, "actual-treeinfo")
expected = os.path.join(helpers.FIXTURE_DIR, "extraiso-tweaked-expected.treeinfo")
extra_isos.tweak_repo_treeinfo(compose, ["Client"], input, output)
self.assertFilesEqual(output, expected)
class PrepareMetadataTest(helpers.PungiTestCase):
@mock.patch("pungi.metadata.create_media_repo")

View File

@@ -153,7 +153,10 @@ class TestParseOutput(unittest.TestCase):
self.assertEqual(modules, set())
def test_extracts_modules(self):
touch(self.file, "module:mod:master:20181003:cafebeef.x86_64@repo-0\n")
touch(
self.file,
"module:mod:master-1:20181003:cafebeef.x86_64@repo-0\n"
)
packages, modules = fus.parse_output(self.file)
self.assertEqual(packages, set())
self.assertEqual(modules, set(["mod:master:20181003:cafebeef"]))
self.assertEqual(modules, set(["mod:master_1:20181003:cafebeef"]))

View File

@@ -0,0 +1,124 @@
# -*- coding: utf-8 -*-
import gzip
import os
from io import StringIO
import yaml
from pungi.scripts.gather_modules import collect_modules, EMPTY_FILE
import unittest
from pyfakefs.fake_filesystem_unittest import TestCase
MARIADB_MODULE = yaml.load("""
---
document: modulemd
version: 2
data:
name: mariadb-devel
stream: 10.3-1
version: 8010020200108182321
context: cdc1202b
arch: x86_64
summary: MariaDB Module
description: >-
MariaDB is a community developed branch of MySQL.
components:
rpms:
Judy:
rationale: MariaDB dependency for OQgraph computation engine
ref: a3583b33f939e74a530f2a1dff0552dff2c8ea73
buildorder: 4
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- Judy-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
- Judy-debuginfo-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
""", Loader=yaml.BaseLoader)
JAVAPACKAGES_TOOLS_MODULE = yaml.load("""
---
document: modulemd
version: 2
data:
name: javapackages-tools
stream: 201801
version: 8000020190628172923
context: b07bea58
arch: x86_64
summary: Tools and macros for Java packaging support
description: >-
Java Packages Tools is a collection of tools that make it easier to build RPM
packages containing software running on Java platform.
components:
rpms:
ant:
rationale: "Runtime dependency of ant-contrib"
ref: 2eaf095676540e2805ee7e8c7f6f78285c428fdc
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.noarch
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.src
""", Loader=yaml.BaseLoader)
ANT_DEFAULTS = yaml.load("""
data:
module: ant
profiles:
'1.10':
- common
stream: '1.10'
document: modulemd-defaults
version: '1'
""", Loader=yaml.BaseLoader)
PATH_TO_KOJI = '/path/to/koji'
MODULES_YAML_GZ = 'modules.yaml.gz'
class TestModulesYamlParser(TestCase):
maxDiff = None
def setUp(self):
self.setUpPyfakefs()
def _prepare_test_data(self):
"""
Create modules.yaml.gz with some test data
"""
os.makedirs(PATH_TO_KOJI)
modules_gz_path = os.path.join(PATH_TO_KOJI, MODULES_YAML_GZ)
        # dump modules into a compressed file, as in generic rpm repos
io = StringIO()
yaml.dump_all([MARIADB_MODULE, JAVAPACKAGES_TOOLS_MODULE, ANT_DEFAULTS], io)
with open(os.path.join(PATH_TO_KOJI, MODULES_YAML_GZ), 'wb') as f:
f.write(gzip.compress(io.getvalue().encode()))
return modules_gz_path
def test_export_modules(self):
modules_gz_path = self._prepare_test_data()
paths = [open(modules_gz_path, 'rb')]
collect_modules(paths, PATH_TO_KOJI)
# check directory structure matches expected
self.assertEqual([MODULES_YAML_GZ, 'modules', 'module_defaults'], os.listdir(PATH_TO_KOJI))
self.assertEqual(['mariadb-devel-10.3_1-8010020200108182321.cdc1202b',
'javapackages-tools-201801-8000020190628172923.b07bea58'],
os.listdir(os.path.join(PATH_TO_KOJI, 'modules/x86_64')))
self.assertEqual([EMPTY_FILE, 'ant.yaml'],
os.listdir(os.path.join(PATH_TO_KOJI, 'module_defaults')))
        # check that modules were exported; BaseLoader matches how the
        # expected constants above were parsed (everything stays a string)
        self.assertEqual(MARIADB_MODULE, yaml.load(
            open(os.path.join(PATH_TO_KOJI, 'modules/x86_64', 'mariadb-devel-10.3_1-8010020200108182321.cdc1202b')),
            Loader=yaml.BaseLoader))
        self.assertEqual(JAVAPACKAGES_TOOLS_MODULE, yaml.load(
            open(os.path.join(PATH_TO_KOJI, 'modules/x86_64', 'javapackages-tools-201801-8000020190628172923.b07bea58')),
            Loader=yaml.BaseLoader))
        # check that defaults were copied
        self.assertEqual(ANT_DEFAULTS, yaml.load(
            open(os.path.join(PATH_TO_KOJI, 'module_defaults', 'ant.yaml')),
            Loader=yaml.BaseLoader))
if __name__ == '__main__':
unittest.main()

tests/test_gather_rpms.py Normal file
View File

@@ -0,0 +1,105 @@
# -*- coding: utf-8 -*-
import os
import unittest
from pathlib import Path
from pyfakefs.fake_filesystem_unittest import TestCase
from pungi.scripts.gather_rpms import search_rpms, copy_rpms, Package
PATH_TO_REPOS = '/path/to/repos'
MODULES_YAML_GZ = 'modules.yaml.gz'
class TestGatherRpms(TestCase):
maxDiff = None
FILES_TO_CREATE = [
'powertools/Packages/libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686.rpm',
'powertools/Packages/libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'powertools/Packages/xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch.rpm',
'appstream/Packages/bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch.rpm',
'appstream/Packages/OpenEXR-devel-2.2.0-11.el8.i686.rpm',
'appstream/Packages/mingw-binutils-generic-2.30-1.el8.x86_64.rpm',
'appstream/Packages/somenonrpm',
]
def setUp(self):
self.setUpPyfakefs()
os.makedirs(PATH_TO_REPOS)
for filepath in self.FILES_TO_CREATE:
os.makedirs(os.path.join(PATH_TO_REPOS, os.path.dirname(filepath)), exist_ok=True)
open(os.path.join(PATH_TO_REPOS, filepath), 'w').close()
def test_gather_rpms(self):
self.assertEqual(
[Package(nvra='libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686.rpm'),
Package(nvra='libgit2-devel-0.26.8-2.el8.x86_64',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'libgit2-devel-0.26.8-2.el8.x86_64.rpm'),
Package(nvra='xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch.rpm'),
Package(nvra='bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch',
path='/path/to/repos/appstream/Packages/'
'bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch.rpm'),
Package(nvra='OpenEXR-devel-2.2.0-11.el8.i686',
path=f'{PATH_TO_REPOS}/appstream/Packages/'
f'OpenEXR-devel-2.2.0-11.el8.i686.rpm'),
Package(nvra='mingw-binutils-generic-2.30-1.el8.x86_64',
path=f'{PATH_TO_REPOS}/appstream/Packages/'
f'mingw-binutils-generic-2.30-1.el8.x86_64.rpm')],
search_rpms(PATH_TO_REPOS)
)
def test_copy_rpms(self):
target_path = Path('/mnt/koji')
packages = [
Package(nvra='libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686.rpm'),
Package(nvra='libgit2-devel-0.26.8-2.el8.x86_64',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'libgit2-devel-0.26.8-2.el8.x86_64.rpm'),
Package(nvra='xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch',
path=f'{PATH_TO_REPOS}/powertools/Packages/'
f'xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch.rpm'),
Package(nvra='bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch',
path='/path/to/repos/appstream/Packages/'
'bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch.rpm'),
Package(nvra='OpenEXR-devel-2.2.0-11.el8.i686',
path=f'{PATH_TO_REPOS}/appstream/Packages/'
f'OpenEXR-devel-2.2.0-11.el8.i686.rpm'),
Package(nvra='mingw-binutils-generic-2.30-1.el8.x86_64',
path=f'{PATH_TO_REPOS}/appstream/Packages/'
f'mingw-binutils-generic-2.30-1.el8.x86_64.rpm')
]
copy_rpms(packages, target_path)
self.assertCountEqual([
'xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch.rpm',
'bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch.rpm'
], os.listdir(target_path / 'noarch'))
self.assertCountEqual([
'libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'mingw-binutils-generic-2.30-1.el8.x86_64.rpm'
], os.listdir(target_path / 'x86_64'))
self.assertCountEqual([
'libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686.rpm',
'OpenEXR-devel-2.2.0-11.el8.i686.rpm'
], os.listdir(target_path / 'i686'))
self.assertCountEqual([
'i686', 'x86_64', 'noarch'
], os.listdir(target_path))
if __name__ == '__main__':
unittest.main()

View File

@@ -0,0 +1,358 @@
# -*- coding: utf-8 -*-
import os
import ddt
import unittest
from pyfakefs.fake_filesystem_unittest import TestCase
from pungi.wrappers.kojimock import KojiMock, RELEASE_BUILD_ID
PATH_TO_REPOS = '/path/to/repos'
MODULES_YAML_GZ = 'modules.yaml.gz'
@ddt.ddt
class TestLocalKojiMock(TestCase):
maxDiff = None
FILES_TO_CREATE = [
# modular package that should be excluded from global list
'powertools/Packages/ant-1.10.5-1.module_el8.0.0+30+832da3a1.noarch.rpm',
# packages that should be gathered
'powertools/Packages/libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'appstream/Packages/OpenEXR-devel-2.2.0-11.el8.i686.rpm',
'appstream/Packages/mingw-binutils-generic-2.30-1.el8.x86_64.rpm',
# non-rpm
'appstream/Packages/somenonrpm',
]
MARIADB_MODULE = """
---
document: modulemd
version: 2
data:
name: mariadb-devel
stream: 10.3
version: 8010020200108182321
context: cdc1202b
arch: x86_64
summary: MariaDB Module
license:
content:
- (CDDL or GPLv2 with exceptions) and ASL 2.0
module:
- MIT
description: >-
MariaDB is a community developed branch of MySQL.
components:
rpms:
Judy:
rationale: MariaDB dependency for OQgraph computation engine
ref: a3583b33f939e74a530f2a1dff0552dff2c8ea73
buildorder: 4
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- Judy-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
- Judy-debuginfo-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
"""
JAVAPACKAGES_TOOLS_MODULE = """
---
document: modulemd
version: 2
data:
name: javapackages-tools
stream: 201801
version: 8000020190628172923
context: b07bea58
arch: x86_64
summary: Tools and macros for Java packaging support
license:
content:
- (CDDL or GPLv2 with exceptions) and ASL 2.0
module:
- MIT
description: >-
Java Packages Tools is a collection of tools that make it easier to build RPM
packages containing software running on Java platform.
components:
rpms:
ant:
rationale: "Runtime dependency of ant-contrib"
ref: 2eaf095676540e2805ee7e8c7f6f78285c428fdc
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.noarch
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.src
"""
ANT_DEFAULTS = """
data:
module: ant
profiles:
'1.10':
- common
stream: '1.10'
document: modulemd-defaults
version: '1'
"""
def setUp(self):
self.setUpPyfakefs()
os.makedirs(PATH_TO_REPOS)
os.makedirs(os.path.join(PATH_TO_REPOS, 'modules/x86_64'))
with open(os.path.join(PATH_TO_REPOS, 'modules/x86_64',
'javapackages-tools-201801-8000020190628172923.b07bea58'), 'w') as f:
f.write(self.JAVAPACKAGES_TOOLS_MODULE)
with open(os.path.join(PATH_TO_REPOS, 'modules/x86_64',
'mariadb-devel-10.3-8010020200108182321.cdc1202b'), 'w') as f:
f.write(self.MARIADB_MODULE)
for filepath in self.FILES_TO_CREATE:
os.makedirs(os.path.join(PATH_TO_REPOS, os.path.dirname(filepath)), exist_ok=True)
open(os.path.join(PATH_TO_REPOS, filepath), 'w').close()
self._koji = KojiMock(PATH_TO_REPOS, os.path.join(PATH_TO_REPOS, 'modules'))
@ddt.data(
[0, {
'completion_ts': 0,
'arch': 'x86_64',
'extra': {
'typeinfo': {
'module': {
'content_koji_tag': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'context': 'b07bea58',
'name': 'javapackages-tools',
'stream': '201801',
'version': '8000020190628172923'
}
}
},
'id': 0,
'name': 'javapackages-tools',
'release': '8000020190628172923.b07bea58',
'state': 'COMPLETE',
'version': '201801'
}],
[1, {
'completion_ts': 0,
'arch': 'x86_64',
'extra': {
'typeinfo': {
'module': {
'content_koji_tag': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'context': 'cdc1202b',
'name': 'mariadb-devel',
'stream': '10.3',
'version': '8010020200108182321'
}
}
},
'id': 1,
'name': 'mariadb-devel',
'release': '8010020200108182321.cdc1202b',
'state': 'COMPLETE',
'version': '10.3'
}]
)
@ddt.unpack
def test_get_build_info(self, build_id, result):
"""
        Check that we are able to get build information from the getBuild method
"""
build_info = self._koji.getBuild(build_id)
self.assertEqual(result, build_info)
@ddt.data(
[0, [{'btype': 'module', 'build_id': 0, 'filename': 'modulemd.x86_64.txt'},
{'btype': 'module', 'build_id': 0, 'filename': 'modulemd.txt'}]],
[1, [{'btype': 'module', 'build_id': 1, 'filename': 'modulemd.x86_64.txt'},
{'btype': 'module', 'build_id': 1, 'filename': 'modulemd.txt'}]]
)
@ddt.unpack
def test_list_archives(self, build_id, result):
"""
Provides list of archives of module descriptions.
Always should contain at least two files, so
I did a little hack and added modulemd.txt (it is on real koji)
but it is not used later by pungi
"""
build_info = self._koji.listArchives(build_id)
self.assertEqual(result, build_info)
@ddt.data(
[
'javapackages-tools-201801-8000020190628172923.b07bea58',
[
[
{
'arch': 'noarch',
'build_id': 0,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'ant',
'release': '1.module_el8.0.0+30+832da3a1',
'size': 0,
'version': '1.10.5'
},
{
'arch': 'src',
'build_id': 0,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'ant',
'release': '1.module_el8.0.0+30+832da3a1',
'size': 0,
'version': '1.10.5'
}
],
[
{
'build_id': 0,
'id': 0,
'name': 'javapackages-tools',
'nvr': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'package_name': 'javapackages-tools',
'release': '8000020190628172923',
'tag_name': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'version': '201801',
'volume_name': 'DEFAULT'
}
]
]
],
[
'mariadb-devel-10.3-8010020200108182321.cdc1202b',
[
[
{
'arch': 'i686',
'build_id': 1,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'Judy',
'release': '18.module_el8.1.0+217+4d875839',
'size': 0,
'version': '1.0.5'
},
{
'arch': 'i686',
'build_id': 1,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'Judy-debuginfo',
'release': '18.module_el8.1.0+217+4d875839',
'size': 0,
'version': '1.0.5'
}
],
[
{'build_id': 1,
'id': 1,
'name': 'mariadb-devel',
'nvr': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'package_name': 'mariadb-devel',
'release': '8010020200108182321',
'tag_name': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'version': '10.3',
'volume_name': 'DEFAULT'
}
]
]
],
[
'dist-c8-compose',
[
[
{
'arch': 'x86_64',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'libgit2-devel',
'release': '2.el8',
'version': '0.26.8'
},
{
'arch': 'i686',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'OpenEXR-devel',
'release': '11.el8',
'version': '2.2.0'
},
{
'arch': 'x86_64',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'mingw-binutils-generic',
'release': '1.el8',
'version': '2.30'
}
],
            # no builds are needed in this case because pungi does not use them
[]
]
],
)
@ddt.unpack
def test_list_tagged_rpms(self, tag, result):
"""
This method is used by pungi to get list of rpms:
either modular or just prepared for release
"""
self.assertEqual(result, self._koji.listTaggedRPMS(tag))
def test_list_tagged(self):
"""
        Used only to get the list of modules for a release.
"""
result = self._koji.listTagged('dist-c8-module-compose')
self.assertEqual([
{
'build_id': 0,
'id': 0,
'name': 'javapackages-tools',
'nvr': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'owner_name': 'centos',
'package_name': 'javapackages-tools',
'release': '8000020190628172923.b07bea58',
'tag_name': 'dist-c8-module-compose',
'version': '201801'
},
{
'build_id': 1,
'id': 1,
'name': 'mariadb-devel',
'nvr': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'owner_name': 'centos',
'package_name': 'mariadb-devel',
'release': '8010020200108182321.cdc1202b',
'tag_name': 'dist-c8-module-compose',
'version': '10.3'
}], result)
if __name__ == '__main__':
unittest.main()

View File

@@ -54,7 +54,7 @@ class KojiWrapperBaseTestCase(unittest.TestCase):
)
)
self.koji_profile = koji.get_profile_module.return_value
self.koji = KojiWrapper(compose)
self.koji = KojiWrapper(compose, real_koji=True)
def tearDown(self):
os.remove(self.tmpfile)

View File

@@ -1,15 +1,19 @@
# -*- coding: utf-8 -*-
import json
import mock
import os
import re
import six
from ddt import ddt, data, unpack
from typing import AnyStr, List, Set, Dict, Tuple
try:
import unittest2 as unittest
from unittest2 import mock
except ImportError:
import unittest
from unittest import mock
from pungi.phases.pkgset.sources import source_koji
from tests import helpers
@@ -674,18 +678,34 @@ class TestFilterByWhitelist(unittest.TestCase):
self.assertEqual(expected, set())
class MockModule(object):
def __init__(self, path, strict=True):
self.path = path
def __repr__(self):
return "MockModule(%r)" % self.path
def __eq__(self, other):
return self.path == other.path
# TODO: multiarch support was removed from modules
# and will be added by https://cloudlinux.atlassian.net/browse/LNX-108
@mock.patch("pungi.module_util.Modulemd.ModuleStream.read_file", new=MockModule)
@unittest.skipIf(Modulemd is None, "Skipping tests, no module support")
class TestAddModuleToVariant(helpers.PungiTestCase):
def setUp(self):
super(TestAddModuleToVariant, self).setUp()
self.koji = mock.Mock()
self.koji.koji_module.pathinfo.typedir.return_value = MMDS_DIR
self.koji.koji_module.pathinfo.topdir = "/mnt/koji"
files = ["modulemd.x86_64.txt", "modulemd.armv7hl.txt", "modulemd.txt"]
self.koji.koji_proxy.listArchives.return_value = [
{"btype": "module", "filename": fname} for fname in files
] + [{"btype": "foo"}]
self.buildinfo = {
"id": 1234,
"arch": "fake_arch",
"extra": {
"typeinfo": {
"module": {
@@ -693,74 +713,108 @@ class TestAddModuleToVariant(helpers.PungiTestCase):
"stream": "master",
"version": "20190318",
"context": "abcdef",
'content_koji_tag': 'module:master-20190318-abcdef'
},
},
},
}
def test_adding_module(self):
variant = mock.Mock(arches=["armhfp", "x86_64"], arch_mmds={}, modules=[])
variant = mock.Mock(arches=[
# "armhfp",
"x86_64"
], arch_mmds={}, modules=[])
source_koji._add_module_to_variant(self.koji, variant, self.buildinfo)
mod1 = variant.arch_mmds["armhfp"]["module:master:20190318:abcdef"]
self.assertEqual(mod1.get_NSVCA(), "module:master:20190318:abcdef:armhfp")
mod2 = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod2.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
self.assertEqual(len(variant.arch_mmds), 2)
self.assertEqual(
variant.arch_mmds,
{
# "armhfp": {
# "module:master:20190318:abcdef": MockModule(
# "/mnt/koji/modules/armv7hl/module:master-20190318-abcdef"
# ),
# },
"x86_64": {
"module:master:20190318:abcdef": MockModule(
"/mnt/koji/modules/fake_arch/module:master-20190318-abcdef"
),
},
},
)
self.assertEqual(variant.modules, [])
def test_adding_module_to_existing(self):
variant = mock.Mock(
arches=["armhfp", "x86_64"],
arches=[
# "armhfp",
"x86_64"
],
arch_mmds={
"x86_64": {
"m1:latest:20190101:cafe": read_single_module_stream_from_file(
os.path.join(MMDS_DIR, "m1.x86_64.txt")
)
}
"x86_64": {"m1:latest:20190101:cafe": MockModule("/mnt/koji/modules/fake_arch/m1:latest:20190101:cafe")}
},
modules=[{"name": "m1:latest-20190101:cafe", "glob": False}],
)
source_koji._add_module_to_variant(self.koji, variant, self.buildinfo)
mod1 = variant.arch_mmds["armhfp"]["module:master:20190318:abcdef"]
self.assertEqual(mod1.get_NSVCA(), "module:master:20190318:abcdef:armhfp")
mod2 = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod2.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
mod3 = variant.arch_mmds["x86_64"]["m1:latest:20190101:cafe"]
self.assertEqual(mod3.get_NSVCA(), "m1:latest:20190101:cafe:x86_64")
self.assertEqual(
variant.arch_mmds,
{
# "armhfp": {
# "module:master:20190318:abcdef": MockModule(
# "/mnt/koji/modules/armv7hl/module:master-20190318-abcdef"
# ),
# },
"x86_64": {
"module:master:20190318:abcdef": MockModule(
"/mnt/koji/modules/fake_arch/module:master-20190318-abcdef"
),
"m1:latest:20190101:cafe": MockModule("/mnt/koji/modules/fake_arch/m1:latest:20190101:cafe"),
},
},
)
self.assertEqual(
variant.modules, [{"name": "m1:latest-20190101:cafe", "glob": False}]
)
def test_adding_module_with_add_module(self):
variant = mock.Mock(arches=["armhfp", "x86_64"], arch_mmds={}, modules=[])
variant = mock.Mock(arches=[
# "armhfp",
"x86_64"
], arch_mmds={}, modules=[])
source_koji._add_module_to_variant(
self.koji, variant, self.buildinfo, add_to_variant_modules=True
)
mod1 = variant.arch_mmds["armhfp"]["module:master:20190318:abcdef"]
self.assertEqual(mod1.get_NSVCA(), "module:master:20190318:abcdef:armhfp")
mod2 = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod2.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
self.assertEqual(
variant.arch_mmds,
{
# "armhfp": {
# "module:master:20190318:abcdef": MockModule(
# "/mnt/koji/modules/module:master-20190318-abcdef"
# ),
# },
"x86_64": {
"module:master:20190318:abcdef": MockModule(
"/mnt/koji/modules/fake_arch/module:master-20190318-abcdef"
)
},
},
)
self.assertEqual(
variant.modules, [{"name": "module:master:20190318:abcdef", "glob": False}]
)
def test_adding_module_to_existing_with_add_module(self):
variant = mock.Mock(
arches=["armhfp", "x86_64"],
arches=[
# "armhfp",
"x86_64"
],
arch_mmds={
"x86_64": {
"m1:latest:20190101:cafe": read_single_module_stream_from_file(
os.path.join(MMDS_DIR, "m1.x86_64.txt")
)
}
"x86_64": {"m1:latest:20190101:cafe": MockModule("/mnt/koji/modules/fake_arch/m1:latest:20190101:cafe")}
},
modules=[{"name": "m1:latest-20190101:cafe", "glob": False}],
)
@@ -769,13 +823,22 @@ class TestAddModuleToVariant(helpers.PungiTestCase):
self.koji, variant, self.buildinfo, add_to_variant_modules=True
)
mod1 = variant.arch_mmds["armhfp"]["module:master:20190318:abcdef"]
self.assertEqual(mod1.get_NSVCA(), "module:master:20190318:abcdef:armhfp")
mod2 = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod2.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
mod3 = variant.arch_mmds["x86_64"]["m1:latest:20190101:cafe"]
self.assertEqual(mod3.get_NSVCA(), "m1:latest:20190101:cafe:x86_64")
self.assertEqual(
variant.arch_mmds,
{
# "armhfp": {
# "module:master:20190318:abcdef": MockModule(
# "/koji/modulemd.armv7hl.txt"
# ),
# },
"x86_64": {
"module:master:20190318:abcdef": MockModule(
"/mnt/koji/modules/fake_arch/module:master-20190318-abcdef"
),
"m1:latest:20190101:cafe": MockModule("/mnt/koji/modules/fake_arch/m1:latest:20190101:cafe"),
},
},
)
self.assertEqual(
variant.modules,
[
@@ -789,7 +852,10 @@ class TestAddModuleToVariant(helpers.PungiTestCase):
self.topdir, {"filter_modules": [(".*", {"*": ["module:*"]})]}
)
variant = mock.Mock(
arches=["armhfp", "x86_64"], arch_mmds={}, modules=[], uid="Variant"
arches=[
# "armhfp",
"x86_64"
], arch_mmds={}, modules=[], uid="Variant"
)
nsvc = source_koji._add_module_to_variant(
@@ -867,7 +933,10 @@ class TestAddScratchModuleToVariant(helpers.PungiTestCase):
def test_adding_scratch_module(self):
variant = mock.Mock(
arches=["armhfp", "x86_64"],
arches=[
# "armhfp",
"x86_64"
],
arch_mmds={},
modules=[],
module_uid_to_koji_tag={},
@@ -904,3 +973,124 @@
self.compose.log_warning.assert_called_once_with(
"Only test composes could include scratch module builds"
)
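# The cases below exercise _filter_expected_modules: a module is dropped from
# expected_modules when a filter's variant pattern matches and the filter's
# arch is one the variant actually has ('*' matches any variant or arch);
# filters for arches the variant lacks (aarch64 here) leave the module in place.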
@ddt
class TestSourceKoji(unittest.TestCase):
@unpack
@data(
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'aarch64': [
'python39-devel:3.9',
]
}
)
],
{
'python39-devel:3.9',
}
),
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
'python38-devel:3.8',
'python38:3.8',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'*': [
'python38-devel:3.8',
]
}
)
],
{
'python39-devel:3.9',
'python38:3.8',
}
),
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
'python38-devel:3.8',
'python38:3.8',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'aarch64': [
'python38-devel:3.8',
]
}
),
(
'*',
{
'*': [
'python38-devel:3.8',
]
}
),
],
{
'python39-devel:3.9',
'python38:3.8',
}
),
)
def test__filter_expected_modules(
self,
variant_name: AnyStr,
variant_arches: List[AnyStr],
expected_modules: Set[AnyStr],
filtered_modules: List[Tuple[AnyStr, Dict[AnyStr, List[AnyStr]]]],
expected_result: Set[AnyStr],
) -> None:
real_result = source_koji._filter_expected_modules(
variant_name=variant_name,
variant_arches=variant_arches,
expected_modules=expected_modules,
filtered_modules=filtered_modules,
)
self.assertSetEqual(
real_result,
expected_result,
)