Compare commits

198 Commits

Author SHA1 Message Date
Stepan Oksanichenko bc8c776872
- Method `get_remote_file_content` is now an object method 2024-05-04 10:43:19 +03:00
Stepan Oksanichenko 91d282708e
- Method `get_remote_file_content` is now an object method 2023-11-21 09:19:01 +02:00
Stepan Oksanichenko ccaf31bc87
- Method `get_remote_file_content` is now an object method 2023-11-21 08:51:05 +02:00
Stepan Oksanichenko 5fe0504265
- Spec's changelog chronology is fixed 2023-11-15 15:14:22 +02:00
Stepan Oksanichenko d79f163685
- Bump version 2023-11-15 14:49:51 +02:00
Stepan Oksanichenko 793fb23958
- Bump version 2023-11-15 14:02:10 +02:00
Stepan Oksanichenko 65d0c09e97
- Return empty list if a repo doesn't contain any module 2023-11-15 13:17:57 +02:00
Stepan Oksanichenko 0a9e5df66c
- Properly removing tmp files 2023-11-10 21:38:01 +02:00
Stepan Oksanichenko ae527a2e01
- The unittests are fixed 2023-11-10 18:08:03 +02:00
Aditya Bisoi 4991144a01
4.5.0 release
Signed-off-by: Aditya Bisoi <abisoi@redhat.com>

(cherry picked from commit 4c7611291d (centos_master))
2023-11-10 16:58:03 +02:00
Lubomír Sedlář 68d94ff488
kojiwrapper: Stop being smart about local access
Rather than trying to use local access when it's accessible, let the user
make the decision:

 * if koji_cache is configured use it and download stuff
 * if not, fall back to local access

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 0d3cd150bd)
2023-11-10 16:57:53 +02:00
Ozan Unsal ce45fdc39a
Fix unittest errors
Signed-off-by: Ozan Unsal <ounsal@redhat.com>

(cherry picked from commit aa0aae3d3e (centos_master))
2023-11-10 16:57:51 +02:00
Lubomír Sedlář b625ccea06
Add integrity checking for builds
When a real build is downloaded, Koji can provide a checksum via API.
This commit adds verification of that checksum.

A mismatch will abort the compose. If Koji doesn't provide a checksum
for the particular sigkey, no checking will happen.

Scratch builds and images are still not checked at all.

This patch requires Koji 1.32. When talking to an older version, there
is no checking done.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 77f8fa25ad)
2023-11-10 16:55:44 +02:00
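As a rough sketch of that kind of verification (function name and chunked-read approach are illustrative, not the actual Pungi code), the downloaded file's digest can be compared with the checksum reported by Koji:

    import hashlib

    def verify_checksum(path, expected_digest, algorithm="sha256"):
        # Hash the file in chunks so large RPMs do not need to fit in memory.
        digest = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                digest.update(chunk)
        # A mismatch aborts the compose; when Koji reports no checksum for
        # the sigkey, no check is performed at all.
        return digest.hexdigest() == expected_digest
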
Lubomír Sedlář 8eccfc5a03
Add script for cleaning up the cache
Pungi would by default only ever add files to the cache. That would
eventually result in essentially a mirror of the Koji volume.

This patch adds a helper cleanup script. When called, it goes through
files in the cache and deletes anything that is not hardlinked from
elsewhere and with mtime not updated recently.

Cleaning up files that are hardlinked from some compose would not save any
space anyway. The mtime check should account for cases like a subpackage
being downloaded but not included in any compose. This avoids it being
downloaded over and over again.

When a compose fails or is aborted, there can be a stale lock file left
behind in the cache. This script cleans that up too.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit e6d9f31ef4 (centos_master))
2023-11-10 16:55:43 +02:00
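A minimal sketch of the cleanup logic described above (function name, age threshold, and lock-file suffix are assumptions, not the actual script):

    import os
    import time

    def cleanup_cache(cache_dir, max_age_days=7):
        cutoff = time.time() - max_age_days * 24 * 3600
        for root, _dirs, files in os.walk(cache_dir):
            for name in files:
                path = os.path.join(root, name)
                st = os.stat(path)
                if st.st_mtime >= cutoff:
                    continue  # recently used; keep it
                # st_nlink == 1: no compose hardlinks this file anymore.
                # Stale .lock files from aborted composes are removed too.
                if st.st_nlink == 1 or name.endswith(".lock"):
                    os.remove(path)
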
Lubomír Sedlář f5a0e06af5
Add ability to download images
This patch extends the ability to download files from Koji to image
building phases too.

There is no integrity checking for the downloaded images.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit bf3e9bc53a)
2023-11-10 16:55:20 +02:00
Lubomír Sedlář f6f54b56ca
Add support for not having koji volume mounted locally
With this patch, Pungi can be configured with a local directory to be
used as a cache for RPMs, and it will download packages from Koji over
HTTP instead of reading them from filesystem directly.

The files from the cache can then be hardlinked as usual.

There is locking in place to prevent composes running at the same time from
stepping on each other.

This is now supported for RPMs only, be it real builds or scratch
builds.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 631bb01d8f)
2023-11-10 16:55:19 +02:00
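In rough outline, the cache-with-locking scheme could look like the following sketch (hypothetical names; the real implementation lives in Pungi's Koji wrapper):

    import fcntl
    import os
    import shutil
    import urllib.request

    def download_to_cache(url, cache_path):
        """Fetch an RPM over HTTP into the local cache, holding a lock
        file so concurrent composes do not race on the same file."""
        os.makedirs(os.path.dirname(cache_path), exist_ok=True)
        with open(cache_path + ".lock", "w") as lock:
            fcntl.flock(lock, fcntl.LOCK_EX)
            if not os.path.exists(cache_path):
                with urllib.request.urlopen(url) as resp:
                    with open(cache_path, "wb") as out:
                        shutil.copyfileobj(resp, out)
        # The cached file can then be hardlinked into the compose.
        return cache_path
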
Aditya Bisoi fcee346c7c
Remove repository cloning multiple times
JIRA: RHELCMP-8913
Signed-off-by: Aditya Bisoi <abisoi@redhat.com>
(cherry picked from commit b6296bdfcd)
2023-11-10 16:55:18 +02:00
Lubomír Sedlář 82ec38ad60
Support require_all_comps_packages on DNF backend
It's not a great name anymore though, because it will fail the compose
if any input package is missing, no matter whether it's from comps,
prepopulate or additional_packages.

JIRA: RHELCMP-12484
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 1c4275bbfa)
2023-11-10 16:55:17 +02:00
Lubomír Sedlář c9cbd80569
Fix new warnings from flake8
Use isinstance rather than directly comparing types.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit fe2dad3b3c)
2023-11-10 16:55:16 +02:00
Aditya Bisoi 035fca1e6d
4.4.1 release
Signed-off-by: Aditya Bisoi <abisoi@redhat.com>

(cherry picked from commit 7128021654 (centos_master))
2023-11-10 16:55:15 +02:00
Lubomír Sedlář 0f8cae69b7
ostree: Add configuration for custom runroot packages
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit bd64894a03)
2023-11-10 16:55:01 +02:00
Lubomír Sedlář f17628dd5f
pkgset: Emit better error for missing modulemd file
The exceptions from libmodulemd are not particularly helpful as they do
not contain information about which file caused them.

   modulemd-yaml-error-quark: Failed to open file: Permission denied (0)

This patch should add the path to the problematic file into the message.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 14e025a5a1)
2023-11-10 16:55:00 +02:00
Lubomír Sedlář f3485410ad
Add support for git-credential-helper
This patch adds an additional field `options` to scm_dict, which can be
used to provide additional information to the backends.

It implements a single new option for GitWrapper. This option allows
setting a custom git credentials wrapper. This can be useful if Pungi
needs to get files from a git repository that requires authentication.

The helper can be as simple as this (assuming the username is already
provided in the url):

    #!/bin/sh
    echo password=i-am-secret

The helper would need to be referenced by an absolute path from the
pungi configuration, or prefixed with ! to have git interpret it as a
shell script and look it up in PATH.

See https://git-scm.com/docs/gitcredentials for more details.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
JIRA: RHELCMP-11808
(cherry picked from commit ada8f4e346)
2023-11-10 16:54:59 +02:00
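A hypothetical scm_dict using the new field might look like this; the exact option key ("credential_helper") is an assumption based on the description above, and the repo and helper paths are made up:

    extra_files = [
        {
            "scm": "git",
            "repo": "https://git.example.com/private/extra-files.git",
            "file": ["GPL", "EULA"],
            # Assumed option name for the git credentials wrapper.
            "options": {"credential_helper": "/usr/local/bin/my-cred-helper"},
        }
    ]
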
Haibo Lin cccfaea14e
Support OIDC Client Credentials authentication to CTS
JIRA: RHELCMP-11324
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit e4c525ecbf)
2023-11-10 16:54:58 +02:00
Lubomír Sedlář e2057b75c5
4.4.0 release
JIRA: RHELCMP-11764
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit 091d228219 (centos_stream))
2023-11-10 16:54:57 +02:00
Lubomír Sedlář 44ea4d4419
gather-dnf: Run latest() later
The initial version of the code filtered the latest builds at the start. That
doesn't matter in many cases:

* When there are no lookaside repos, there is generally a single version
  of each package.
* When lookaside repos do not overlap with compose repos, or contain
  only older versions.

It is however a problem when the lookaside repos contain a higher version
of a package than what is in a compose repo, and some package explicitly
requires the older version.

Consider this scenario:

* lookaside contains bar-1.1
* compose repo contains bar-1.0 and foo-1.0
* foo-1.0 `Requires: bar < 1.1`

The original code would filter out the bar-1.0 package, and then fail on
unresolved dependencies.

This patch moves the computation of latest packages much later, to part
of code where all options to satisfy a dependency are selected and the
best match is chosen. At that point if there are multiple versions
available, we do want the latest one.

JIRA: SPMM-13483
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit bcc440491e)
2023-11-10 16:54:43 +02:00
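Conceptually, the later selection step boils down to an EVR comparison among all candidates that can satisfy a dependency, along these lines (a sketch using python3-rpm; the candidate dicts are illustrative, not Pungi's data model):

    import rpm  # python3-rpm

    def pick_latest(candidates):
        """Among packages that satisfy a dependency, pick the highest EVR;
        older versions remain available for explicit '<' requires."""
        def evr(pkg):
            return (str(pkg.get("epoch") or 0), pkg["version"], pkg["release"])
        best = candidates[0]
        for pkg in candidates[1:]:
            if rpm.labelCompare(evr(pkg), evr(best)) > 0:
                best = pkg
        return best
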
Lubomír Sedlář d4425f7935
iso: Support joliet long names
Without this option, the names reported by the Joliet tree are truncated.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit fa50eedfad)
2023-11-10 16:54:42 +02:00
Lubomír Sedlář c8118527ea
Drop pungi-orchestrator code
This was never actually used.

JIRA: RHELCMP-10218
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit b7adbf8a91 (centos_master))
2023-11-10 16:54:40 +02:00
Lubomír Sedlář a8ea322907
isos: Ensure proper file ownership and permissions
The genisoimage backend uses the -rational-rock option, which sets uid
and gid to 0, and makes files readable by everyone.

With xorriso this must be done explicitly. Setting ownership is a single
command, but the permissions require a per-file command so that files are
not made executable where not needed.

Fixes: https://bugzilla.redhat.com/show_bug.cgi?id=2203888
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit 82ae9e86d5 (centos_master))
2023-11-10 16:54:22 +02:00
Lubomír Sedlář c4995c8f4b
gather: Always get latest packages
If lookaside contains an older version of a package, but with a
different arch, the depsolver doesn't notice that and prefers the
lookaside version.

This is not correct. The latest package should be used no matter if
there are different arches available.

The filtering in DNF doesn't ensure this, so we have to build it
ourselves. To limit the performance impact, only run this filtering when
there actually are some lookaside repos configured.

JIRA: RHELCMP-11728

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 2ad341a01c)
2023-11-10 16:54:01 +02:00
Lubomír Sedlář 997e372f25
Add back compatibility with jsonschema <3.0.0
Resolves: https://pagure.io/pungi/issue/1667
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit e888e76992 (centos_master))
2023-11-10 16:54:00 +02:00
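The usual pattern for this kind of compatibility (a sketch, not necessarily the exact change in this commit) is to fall back to an older validator class when the newer one is unavailable:

    try:
        from jsonschema import Draft7Validator as Validator
    except ImportError:  # jsonschema < 3.0.0 has no draft 7 support
        from jsonschema import Draft4Validator as Validator

    def validate(config, schema):
        # Return a sorted list of violations instead of raising on the first.
        return sorted(Validator(schema).iter_errors(config), key=str)
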
Lubomír Sedlář 42f1c62528
Remove useless debug message
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 6e72de7efe)
2023-11-10 16:52:27 +02:00
Lubomír Sedlář 3fd29d0ee0
Remove fedmsg from requirements
The code for sending messages in Fedora actually relies on the
fedora-messaging library now. However, we do not have any tests for
that, so there's little reason to pull the library in via
requirements.txt.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit c8263fcd39 (centos_master))
2023-11-10 16:52:04 +02:00
Lubomír Sedlář c1f2fa5035
gather: Support dotarch in DNF backend
The documentation claims that dotarch syntax is supported for additional
packages. For the yum backend this seems to be handled automatically, but
the DNF backend could not interpret it.

This patch checks whether a package is specified in this syntax and contains
a valid architecture. If so, the query will honor the arch.

JIRA: RHELCMP-11728
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 82ca4f4e65)
2023-11-10 16:51:55 +02:00
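The dotarch handling amounts to splitting on the last dot and checking the suffix against known arches, roughly like this sketch (the arch set and helper name are illustrative):

    VALID_ARCHES = {"x86_64", "i686", "aarch64", "ppc64le", "s390x", "noarch", "src"}

    def split_dotarch(spec):
        """Split 'name.arch' into (name, arch) when the suffix is a known
        architecture; otherwise the whole string is the package name."""
        if "." in spec:
            name, arch = spec.rsplit(".", 1)
            if arch in VALID_ARCHES:
                return name, arch
        return spec, None
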
Aurélien Bompard 85c9e9e776
Set the priority in the fedora-messaging notifier
According to [infra ticket #10899](https://pagure.io/fedora-infrastructure/issue/10899),
ostree messages should have priority 3.

Signed-off-by: Aurélien Bompard <aurelien@bompard.org>
(cherry picked from commit b8b6b46ce7)
2023-11-10 16:51:54 +02:00
Lubomír Sedlář 33012ab31e
Fix compatibility with createrepo_c 0.21.1
The length of the file entry tuple has changed, so it can no longer be
unpacked reliably.

Relates: https://github.com/rpm-software-management/createrepo_c/issues/360
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit e9d836c115)
2023-11-10 16:51:53 +02:00
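The defensive fix is to index the stable leading fields instead of unpacking a fixed-size tuple, roughly as below (assuming the first three fields of each createrepo_c file entry remain type, base, and filename):

    def iter_file_paths(package):
        # createrepo_c 0.21.1 extended the per-file tuple, so avoid
        # `for ftype, base, name in package.files` style unpacking.
        for entry in package.files:
            ftype, base, filename = entry[0], entry[1], entry[2]
            yield (base or "") + filename
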
Lubomír Sedlář 72ddf65e62
comps: Apply arch filtering to environment/optionlist
Let's filter this list too, not just the grouplist tag.

JIRA: RHELCMP-7926
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit d3f0701e01)
2023-11-10 16:51:52 +02:00
Haibo Lin c402ff3d60
Add config file for cleaning up cache files
systemd-tmpfiles is required to enable the automatic cleanup.

JIRA: RHELCMP-6327
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit 8f6f0f463f)
2023-11-10 16:51:51 +02:00
Haibo Lin 8dd344f9ee
4.3.8 release
JIRA: RHELCMP-11448
Signed-off-by: Haibo Lin <hlin@redhat.com>

(cherry picked from commit 467c7a7f6a (centos_master))
2023-11-10 16:51:49 +02:00
Lubomír Sedlář d07f517a90
createiso: Update possibly changed file on DVD
There's no good way of detecting if the buildinstall phase tweaked the boot
configuration (and efiboot.img). We should update those files on the DVD
just to be sure.

The .discinfo file is always different and needs to be updated.

Relates: https://pagure.io/pungi/issue/1647
JIRA: RHELCMP-10811
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit e1d7544c2b)
2023-11-10 16:51:39 +02:00
Lubomír Sedlář 48366177cc
pkgset: Stop reuse if configuration changed
When options controlling arch exclusion change, reuse should be broken.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit a71c8e23be)
2023-11-10 16:51:38 +02:00
Lubomír Sedlář 4cb8671fe4
Allow disabling inheriting ExcludeArch to noarch packages
Copying ExcludeArch/ExclusiveArch from source rpm to noarch is an easy
option to block shipping that particular noarch package from a certain
architecture. However, there is no way to bypass it, and it is rather
confusing and not discoverable.

An alternative way to remove an unwanted package is to use the good old
`filter_packages`, which has enough granularity to remove pretty much
anything from anywhere. The only downside is that it requires a change
in configuration, so it can't be done by a packager directly from a spec
file.

When we decide to break backwards compatibility, this option should be
removed and the entire ExcludeArch/ExclusiveArch inheritance removed
completely.

JIRA: ENGCMP-2606
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit ab508c1511)
2023-11-10 16:51:37 +02:00
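For reference, a `filter_packages` entry pairs a variant regex with per-arch package globs; a sketch of removing one noarch package everywhere (the variant regex and package name are made up):

    filter_packages = [
        ("^.*$", {"*": ["unwanted-noarch-package"]}),
    ]
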
Lubomír Sedlář 135bbbfe7e
pkgset: Support extra builds with no tags
This is a rather fringe use case. If the configuration contains
pkgset_koji_builds or pkgset_koji_scratch_tasks but no pkgset_koji_tag,
the compose will be empty.

The expectation though is that the packages should be pulled.

The extra RPMs are added to all non-modular tags because they are
supposed to mask builds of the same packages (e.g. a user may want to
explicitly pull in an older version than the tagged one).

This patch adds support for composes containing only explicitly listed
builds by creating a dummy package set that is not actually using any
tag.

JIRA: RHELCMP-11385
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit f960b4d155)
2023-11-10 16:51:36 +02:00
Lubomír Sedlář 5624829564
buildinstall: Avoid pointlessly tweaking the boot images
Only modify boot images if there actually is some change.

The tweak function updates config files with volume id and kickstart
file. Even if we don't have a kickstart and there is no change in the
config files, the image will be regenerated. This leads to a change in
checksum for no good reason.

This patch keeps track of modified config files. If there are none, it
avoids touching anything else.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 602b698080)
2023-11-10 16:51:35 +02:00
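A sketch of that bookkeeping (placeholder tokens and names are hypothetical, not the actual tweak function):

    def tweak_configs(config_paths, volume_id, kickstart_file=None):
        """Rewrite boot config files, returning only the paths that actually
        changed; callers skip regenerating the boot image when the returned
        list is empty."""
        modified = []
        for path in config_paths:
            with open(path) as f:
                original = f.read()
            updated = original.replace("@VOLID@", volume_id)
            if kickstart_file:
                updated = updated.replace("@KSFILE@", kickstart_file)
            if updated != original:
                with open(path, "w") as f:
                    f.write(updated)
                modified.append(path)
        return modified
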
Haibo Lin 5fb4f86312
Prevent reuse if unsigned packages are allowed
JIRA: RHELCMP-8415
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit b30f7e0d83)
2023-11-10 16:51:34 +02:00
Lubomír Sedlář e891fe7b09
Pass parent id/respin id to CTS
When the --target-dir option is used, the compose can be created in CTS,
but the parent and respin information is not passed through. That leads
to data missing later on.

JIRA: RHELCMP-11411
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>

(cherry picked from commit 0c3b6e22f9 (centos_master))
2023-11-10 16:51:33 +02:00
Haibo Lin 4cd7d39914
Exclude existing files in boot.iso
JIRA: RHELCMP-10811
Fixes: https://pagure.io/pungi/issue/1647
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit 3175ede38a)
2023-11-10 16:50:46 +02:00
Lubomír Sedlář 5de829d05b
image-build/osbuild: Pull ISOs into the compose
OSBuild tasks can produce ISO files. If they do, we should include them
in the compose, and we should pull them into the iso/ subdirectory
together with other ISOs.

Fixes: https://pagure.io/pungi/issue/1657
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 8920eef339)
2023-11-10 16:50:45 +02:00
Lubomír Sedlář 2930a1cc54
Retry 401 error from CTS
This could be a transient error caused by Kerberos server instability.

JIRA: RHELCMP-11251
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 58036eab84)
2023-11-10 16:50:43 +02:00
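A sketch of that retry behavior (names and backoff values are illustrative):

    import time
    import requests

    def get_with_retry(url, attempts=3, backoff=5):
        """Treat a 401 from CTS as possibly transient and retry it a few
        times before giving up."""
        for attempt in range(attempts):
            response = requests.get(url)
            if response.status_code != 401:
                return response
            time.sleep(backoff * (attempt + 1))
        response.raise_for_status()
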
Lubomír Sedlář 9c4d3d496d
gather: Better detection of debuginfo in lookaside
If the depsolver wants to include a package that is present in both the
source repo and a lookaside repo, it reliably detects binary packages
present in lookaside, but for debuginfo it's not so reliable.

There is a separate package object for each package in each repo.
Depending on which one is used, debuginfo could be included in the
result or not. This patch fixes that by actually checking whether the same
package is present in any lookaside repo.

JIRA: RHELCMP-9373
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit a4476f2570)
2023-11-10 16:50:42 +02:00
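The key change, conceptually, is to test membership by NEVRA across all lookaside repos instead of trusting whichever per-repo package object the depsolver happens to hold; a sketch with hypothetical names:

    def is_in_lookaside(package, lookaside_nevras):
        """lookaside_nevras: set of (name, epoch, version, release, arch)
        tuples collected from every lookaside repo."""
        nevra = (package.name, package.epoch, package.version,
                 package.release, package.arch)
        return nevra in lookaside_nevras
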
Haibo Lin 4637fd6697
Log versions of all installed packages
JIRA: RHELCMP-9493
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit 8c06b7a3f1)
2023-11-10 16:50:41 +02:00
Lubomír Sedlář 2ff8132eaf
Use authentication for all CTS calls
The update of the compose URL relied on the environment being set from the
initial import. This broke when a unique credentials cache started to be
used and was cleaned up after the import.

JIRA: RHELCMP-11072
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 64ae81b416)
2023-11-10 16:50:40 +02:00
Lubomír Sedlář f9190d1fd1
Fix black complaints
These are newly detected by black 23.1.0.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 826169af7c)
2023-11-10 16:50:38 +02:00
Lubomír Sedlář 80ad0448ec
Add vhd.gz extension to compressed VHD images
JIRA: RHELCMP-11027
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit d97b8bdd33)
2023-11-10 16:50:37 +02:00
Lubomír Sedlář 027380f969
Add vhd-compressed image type
JIRA: RHELCMP-11027
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 8768b23cbe)
2023-11-10 16:50:36 +02:00
Lubomír Sedlář 41048f60b7
Update to work with latest mock
The `called_once` attribute now raises an exception. Switch to the
`assert_called_once` method. Also replace `assertTrue(x.called)` with
`x.assert_called()`.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 51628a974d)
2023-11-10 16:50:34 +02:00
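The change in test style, in miniature:

    from unittest import mock

    m = mock.Mock()
    m()

    # Old style (recent mock releases reject `called_once`, since it was
    # never a real assertion and silently passed before):
    #     assertTrue(m.called)
    # New style:
    m.assert_called()
    m.assert_called_once()
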
Ondrej Nosek 9f8f6a7956
Default bztar format for sdist command
Usage of the 'bztar' format is unchanged; only the way it is configured
changes. The previous method was deprecated.

Signed-off-by: Ondrej Nosek <onosek@redhat.com>
(cherry picked from commit 88327d5784)
2023-11-10 16:50:33 +02:00
Lubomír Sedlář 3d3e4bafdf
- New upstream release 4.5.0
(cherry picked from commit 4dfabb647b (fedora_master))
2023-11-10 16:47:04 +02:00
Lubomír Sedlář 8fe0257e93
Release 4.4.1
(cherry picked from commit 4c604f434a (fedora_master))
2023-11-10 16:46:02 +02:00
Fedora Release Engineering d7b5fd2278
Rebuilt for https://fedoraproject.org/wiki/Fedora_39_Mass_Rebuild
Signed-off-by: Fedora Release Engineering <releng@fedoraproject.org>

(cherry picked from commit bf4f5b6e53 (fedora_master))
2023-11-10 16:44:52 +02:00
Lubomír Sedlář 8b49d4ad61
Backport patch from upstream PR 1690
(cherry picked from commit 2362ef59c5 (fedora_master))
2023-11-10 16:44:19 +02:00
Lubomír Sedlář 57443cd0aa
Backport patch from upstream PR 1690
(cherry picked from commit 9ee6caf117 (fedora_master))
2023-11-10 16:43:47 +02:00
Python Maint 1d146bb8d5
Rebuilt for Python 3.12
(cherry picked from commit 8b8b558fbc (fedora_master))
2023-11-10 16:42:36 +02:00
Lubomír Sedlář 790091b7d7
Release 4.4.0
(cherry picked from commit a6196da315 (fedora_master))
2023-11-10 16:42:10 +02:00
Lubomír Sedlář 28aad3ea40
Rebuild without fedmsg dependencies
(cherry picked from commit d142464ef1 (fedora_master))
2023-11-10 16:41:29 +02:00
Pierre-Yves Chibon 7373b4dbbf
Replace the requirement on fedmsg to one on fedora-messaging
Signed-off-by: Pierre-Yves Chibon <pingou@pingoured.fr>
(cherry picked from commit 802f5fe854)
2023-11-10 16:40:34 +02:00
Lubomír Sedlář 218b11f1b7
Backport patches
(cherry picked from commit 20a5d00961 (fedora_master))
2023-11-10 16:40:33 +02:00
Haibo Lin bfbe9095d2
Release 4.3.8
Signed-off-by: Haibo Lin <hlin@redhat.com>

(cherry picked from commit 3548f55821 (fedora_master))
2023-11-10 16:38:58 +02:00
Lubomír Sedlář eb17182c04
Update license tag to SPDX
(cherry picked from commit f9143f6ea1 (fedora_master))
2023-11-10 16:33:41 +02:00
Stepan Oksanichenko f91f90cf64 - Test empty sub-package 2023-10-26 00:01:45 +03:00
Stepan Oksanichenko 49931082b2 - Test empty sub-package 2023-10-25 23:11:26 +03:00
Stepan Oksanichenko 8ba8609bda - Test empty sub-package 2023-10-25 22:58:28 +03:00
Stepan Oksanichenko 6f495a8133 - Test empty sub-package 2023-10-25 22:55:18 +03:00
Stepan Oksanichenko 2b4bddbfe0 - Test empty sub-package 2023-10-25 22:17:42 +03:00
Stepan Oksanichenko 032cf725de - Bump version
- Changelog
2023-07-25 11:12:03 +03:00
Stepan Oksanichenko 8b11bb81af AL-5220: Investigate why CL9 can't be built on the new nebula
- Exclude the packages for use in a build
2023-07-24 18:26:51 +03:00
soksanichenko 114a73f100 - gather-module can find modules through symlinks
- Bump version
- Update changelog
2023-04-15 20:03:27 +03:00
soksanichenko 1c3e5dce5e - CLI option `--label` can be passed through a Pungi config file
- Bump version
- Update changelog
2023-04-13 00:57:39 +03:00
soksanichenko e55abb17f1 - Bump version 2023-04-04 10:12:22 +03:00
soksanichenko e81d78a1d1 - The log message contains a variant's name if Pungi didn't find one or more modules for that variant 2023-04-04 10:11:59 +03:00
soksanichenko 68915d04f8 - Excluded/included modules/packages will be processed correctly 2023-04-02 22:27:24 +03:00
soksanichenko a25bf72fb8 - Changelog is updated
- Version is bumped
2023-03-31 12:07:22 +03:00
Stepan Oksanichenko 68aee1fa2d Merge pull request 'ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically' (#15) from ALBS-987 into al_master
Reviewed-on: #15
2023-03-31 09:03:39 +00:00
soksanichenko 6592735aec ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Unittests are fixed
2023-03-30 14:05:47 +03:00
soksanichenko 943fd8e77d ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Script `create extra repo` is fixed
- Unittests are fixed
2023-03-30 12:52:51 +03:00
soksanichenko 004fc4382f ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Review comments
2023-03-29 11:40:00 +03:00
soksanichenko 596c5c0b7f ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Refactoring
- Some absent packages are in packages.json now
2023-03-28 12:58:08 +03:00
soksanichenko 141d00e941 ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- More info about unsigned packages
2023-03-24 16:39:10 +02:00
soksanichenko 4b64d20826 ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Path.rglob/glob doesn't work with symlinks (this is a bug and it has been reported)
- Refactoring
2023-03-24 12:45:28 +02:00
soksanichenko 0747e967b0 ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- Some refactoring
2023-03-23 09:36:52 +02:00
soksanichenko 6d58bc2ed8 ALBS-987: Generate i686 and dev repositories with pungi on building new distr. version automatically
- [Generator of packages.json] Replace CLI usage with config.yaml
- [Gather RPMs] os.path is replaced by Path
2023-03-22 15:56:58 +02:00
Stepan Oksanichenko 60a347a4a2 Merge pull request 'ALBS-1030: Generate Devel section in packages.json' (#14) from ALBS-1030 into al_master
Reviewed-on: #14
2023-03-22 10:06:58 +00:00
soksanichenko 53ed7386f3 ALBS-1030: Generate Devel section in packages.json
- Redundant empty lines are removed
2023-03-20 13:56:44 +02:00
soksanichenko ed43f0038e ALBS-1030: Generate Devel section in packages.json
- Style fix
2023-03-20 13:55:06 +02:00
soksanichenko fcc9b4f1ca ALBS-1030: Generate Devel section in packages.json
- Skip verifying an RPM signature if sigkeys are empty
2023-03-20 13:25:45 +02:00
soksanichenko d32c293bca ALBS-1030: Generate Devel section in packages.json
- Some upstream changes to KojiMock parts
2023-03-19 21:11:12 +02:00
soksanichenko f0bd1af999 ALBS-1030: Generate Devel section in packages.json
- Also the tool can combine (remove and add) packages in a variant from different
  sources according to the type of the source URL
2023-03-19 18:21:33 +02:00
soksanichenko 1b4747b915 - Changelog is updated
- Version is bumped
- New release 4.3.7-3.alma
2023-03-17 12:02:48 +02:00
Lubomír Sedlář 6aabfc9285 osbuild: test passing of rich repos from configuration
Test that "rich" repositories defined as dicts in the configuration
stay as dicts in the arguments passed to the osbuild phase.

Signed-off-by: Tomáš Hozza <thozza@redhat.com>
(cherry picked from commit 8be0d84f8a)
2023-03-17 11:58:11 +02:00
Tomáš Hozza 9e014fed6a osbuild: support specifying `package_sets` for repos
The `koji-osbuild` plugin supports additional formats for the `repo`
property since v4 [1]. Specifically, a repo can be specified as a
dictionary with a `baseurl` key and a `package_sets` list containing
the names of specific package sets that the repository should be used for.

Extend the configuration schema to reflect the plugin change.
Extend the documentation to cover the new repository format.
Extend an existing unit test to specify additional repository using the
added format.

[1] https://github.com/osbuild/koji-osbuild/pull/82

Signed-off-by: Tomáš Hozza <thozza@redhat.com>
(cherry picked from commit 8f0906be53)
2023-03-17 11:58:11 +02:00
Tomáš Hozza 7ccb1d4849 osbuild: don't use `util.get_repo_urls()`
Don't use `util.get_repo_urls()` to resolve provided repositories, but
implement an osbuild-specific variant of the function named
`_get_repo_urls()`. The reason is that the function from `utils`
transforms repositories defined as dicts to strings, which is
undesired for osbuild. The requirement for osbuild is to preserve the
dict as is, and just resolve the string in `baseurl` to the actual
repository URL.

Add a unit test covering the newly added function. It is inspired by a
similar test from `test_util.py`.

Signed-off-by: Tomáš Hozza <thozza@redhat.com>
(cherry picked from commit e3072c3d5f)
2023-03-17 11:58:11 +02:00
Tomáš Hozza abec28256d osbuild: update schema and config documentation
The `koji-osbuild` Hub schema has been relaxed a bit in the latest
release (v11). Adjust the schema in Pungi to reflect changes in
`koji-osbuild`.

For more information on the changes in `koji-osbuild`, see:
https://github.com/osbuild/koji-osbuild/pull/108

Signed-off-by: Tomáš Hozza <thozza@redhat.com>
(cherry picked from commit ef6d40dce4)
2023-03-17 11:58:11 +02:00
Lubomír Sedlář 46216b4f17 Speed up tests by 30 seconds
The retry test for CTS doesn't actually need to wait. Let's mock the
sleep function.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit df6664098d)
2023-03-17 11:58:11 +02:00
Lubomír Sedlář 02b3adbaeb Stop sending compose paths to CTS
The tracking service will reject it as it's not an HTTP URL. Let's not
even try.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 147df93f75)
2023-03-17 11:58:11 +02:00
Lubomír Sedlář d17e578645 Report errors from CTS
If the service returns a status code indicating a user error, report
that and do not retry.

Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit dd8c1002d4)
2023-03-17 11:58:11 +02:00
Lubomír Sedlář 6c1c9d9efd createiso: Create Joliet tree with xorriso
This structure is important for isoinfo -J, which is in turn called by
virt-install.

This can be tested by taking a bootable ISO and modifying it with a dummy
additional file while preserving boot records:

    $ xorriso -indev netinst.iso -outdev test.iso -boot_image any replay -map setup.py setup.py -end
    ...
    $ isoinfo -J -i test.iso
    isoinfo: Unable to find Joliet SVD
    $ rm test.iso
    $ xorriso -indev netinst.iso -outdev test.iso -joliet on -boot_image any replay -map setup.py setup.py -end
    ...
    $ isoinfo -J -i test.iso
    $

Fixes: https://bugzilla.redhat.com/show_bug.cgi?id=2144105
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 12e3a46390)
2023-03-17 11:58:04 +02:00
Stepan Oksanichenko 8dd7d8326f Merge pull request 'ALBS-1040: Investigate why Pungi doesn't put modules packages into the final repos' (#13) from ALBS-1040 into al_master
Reviewed-on: #13
2023-03-16 11:52:02 +00:00
soksanichenko d7b173cae5 ALBS-1040: Investigate why Pungi doesn't put modules packages into the final repos
- The unittest is fixed
2023-03-14 18:43:14 +02:00
soksanichenko fa4640f03e ALBS-1040: Investigate why Pungi doesn't put modules packages into the final repos
- Refactoring
- KojiMock extracts all modules which are suitable for the variant's arches
2023-03-14 18:25:21 +02:00
Stepan Oksanichenko d66eb0dea8 Merge pull request 'ALBS-1032: Generate i686 section for all variants in packages.json' (#12) from ALBS-1032 into al_master
Reviewed-on: #12
2023-03-14 16:21:41 +00:00
soksanichenko d56227ab4a ALBS-1032: Generate i686 section for all variants in packages.json
- Remove old unnecessary methods
- Some fixes to arch code
2023-03-09 12:32:11 +02:00
soksanichenko 12433157dd - changelog 2022-11-12 00:04:44 +02:00
soksanichenko 623955cb1f - python3-distro as dependency 2022-11-11 19:21:37 +02:00
soksanichenko 4e0d2d14c9 - Unify branch for both RHEL versions 2022-11-11 16:31:43 +02:00
soksanichenko b61e59d676 - Use unittest.mock instead of external mock 2022-11-11 15:32:00 +02:00
soksanichenko eb35d7baac - Unify branch for both RHEL versions 2022-11-11 01:38:14 +02:00
soksanichenko 54209f3643 ALBS-732 2022-11-09 21:42:13 +02:00
soksanichenko 80c4536eaa ALBS-732 2022-11-09 21:27:51 +02:00
soksanichenko 9bb5550d36 ALBS-732 2022-11-09 21:01:30 +02:00
soksanichenko 364ed6c3af - kojimock is added to pungi.phases.gather._make_lookaside_repo#prefixes
- unittests are fixed
2022-11-09 20:56:56 +02:00
soksanichenko 0b965096ee - PkgsetSourceKojiMock is added to ALL_SOURCES 2022-11-09 18:18:12 +02:00
soksanichenko d914626d92 - "kojimock" is valid value for option "pkgset_source" 2022-11-09 17:59:50 +02:00
soksanichenko 32215d955a - fedmsg is removed as not needed 2022-11-09 12:38:34 +02:00
soksanichenko d711f8a2d6 - fedmsg is removed as not needed 2022-11-09 09:06:09 +02:00
soksanichenko bd9d800b52 - Fix spec 2022-11-08 17:11:21 +02:00
soksanichenko e03648589d - Fix spec 2022-11-08 17:09:03 +02:00
soksanichenko b5fe2e8129 - Fix spec 2022-11-08 17:06:36 +02:00
soksanichenko b14e85324c - Fix unittests 2022-11-08 14:57:52 +02:00
soksanichenko 5a19ad2258 - Fix unittests 2022-11-08 12:47:14 +02:00
soksanichenko 9ae49dae5b - Fix unittests 2022-11-08 01:43:53 +02:00
soksanichenko c82cbfdc32 - Fix unittests 2022-11-08 00:59:10 +02:00
soksanichenko ee9c9a74e6 - Fix unittests 2022-11-07 23:55:26 +02:00
soksanichenko ea0f933315 - Updates from upstream (https://pagure.io/pungi.git#master) 2022-11-07 23:40:26 +02:00
soksanichenko 323d31df2b Merge branch 'master' into a8_updated
# Conflicts:
#	pungi.spec
#	pungi/wrappers/kojiwrapper.py
#	setup.py
#	tests/test_extra_isos_phase.py
#	tests/test_pkgset_pkgsets.py
2022-11-07 23:38:38 +02:00
soksanichenko 9acd7f5fa4 Merge remote-tracking branch 'centos-origin/master' 2022-11-07 23:33:20 +02:00
soksanichenko a2b16eb44f - spec is updated (merged with the latest changes from the Fedora repo
https://src.fedoraproject.org/rpms/pungi/blob/main/f/pungi.spec)
2022-11-07 23:33:03 +02:00
soksanichenko ff946d3f7b - Unittests are fixed 2022-11-07 20:15:37 +02:00
soksanichenko ede91bcd03 - Right name of the class in constructor 2022-11-07 20:03:59 +02:00
soksanichenko 0fa459eb9e - Right name of the class in constructor 2022-11-07 19:56:02 +02:00
soksanichenko b49ffee06d - Mock of Koji is moved to separate modules and classes
- Unittests for the mock of Koji are moved to separate modules
2022-11-07 19:24:39 +02:00
soksanichenko fce5493f09 Merge remote-tracking branch 'centos-origin/master'
# Conflicts:
#	pungi/phases/init.py
#	pungi/wrappers/comps.py
2022-11-03 22:49:11 +02:00
soksanichenko 750499eda1 - The unittests are fixed 2022-10-19 14:10:48 +03:00
soksanichenko d999960235 - bump the dependency version 2022-10-19 13:00:32 +03:00
soksanichenko 6edece449d - changelog
- bump version
2022-10-19 04:40:39 +03:00
Stepan Oksanichenko dd22d94a9e Merge pull request 'Replace list of cr.packages by cr.PackageIterator' (#6) from package_iterator into aln8
Reviewed-on: #6
2022-10-19 01:38:44 +00:00
soksanichenko b157a1825a Do not lose a module from koji if we have more than one arch (e.g. x86_64 + i686) 2022-10-19 04:33:34 +03:00
soksanichenko fd298d4f17 Replace list of cr.packages by cr.PackageIterator 2022-10-18 22:53:50 +03:00
soksanichenko f21ed6f607 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-04 20:16:23 +03:00
Stepan Oksanichenko cfe6ec3f4e Merge pull request 'ALBS-334: Make the ability of Pungi to give module_defaults from remote sources' (#4) from ALBS-334 into aln8
Reviewed-on: #4
2022-05-04 17:05:45 +00:00
soksanichenko e6c6f74176 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-03 18:18:17 +03:00
soksanichenko 8676941655 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-02 02:25:32 +03:00
soksanichenko 5f74175c33 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-01 03:41:40 +03:00
soksanichenko 1e18e8995d ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-05-01 03:32:01 +03:00
soksanichenko 38ea822260 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-30 00:27:31 +03:00
soksanichenko 34eb45c7ec ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:39:51 +03:00
soksanichenko 7422d1e045 ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:33:28 +03:00
soksanichenko 97801e772e ALBS-334: Make the ability of Pungi to give module_defaults from remote sources 2022-04-29 21:25:59 +03:00
soksanichenko dff346eedb - Unit tests are fixed 2022-04-28 16:44:47 +03:00
soksanichenko de53dd0bbd - Unit tests are fixed 2022-04-28 16:30:03 +03:00
soksanichenko 88121619bc ALBS-226: Patch pungi/lorax for building AL9
- Default modules can be empty, but pungi detects an
  empty folder while copying and raises an exception in this case
2022-03-18 22:37:57 +02:00
soksanichenko 0484426e0c ALBS-97: Build AlmaLinux PPC64le repos and ISOs with pungi
- Changelog
- Version is bumped

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I933925b7a27a5e1b642020e060f59212fdc6ebf4
2021-12-30 12:42:34 +02:00
soksanichenko b9d86b90e1 ALBS-97: Build AlmaLinux PPC64le repos and ISOs with pungi
- Scripts `create_packages_json` & `gather_modules` can process lzma compressed yaml files
- Script `create_packages_json` can use repodata where there are packages with a different
  arch compared with the one passed to the script

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Ia9a3bacfa4344f0cf33b9f416649fd4a5f8d3c37
2021-12-28 16:08:04 +02:00
soksanichenko 58a16e5688 - The version is bumped
- The changelog is updated
- The test `create_packages_json` is fixed

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I173013da990eb296e58ca8f3555a05913ca1c852
2021-12-20 14:11:17 +02:00
soksanichenko f2ed64d952 ALBS-66: Prepare Jenkins jobs for building distribution of AlmaLinux 8.5
- Script `create_packages_json` can duplicate the packages with the
  same version in different variants

Change-Id: I3c79ad06c4c22442423c12d5fa06baf82d663a3f
2021-11-10 15:29:59 +02:00
stepan_oksanichenko b2c49dcaf6 - The version is bumped
- The changelog is updated

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Iadbf3d7223db85a58ba82f41597de27dbfffe1ca
2021-06-18 14:47:09 +03:00
stepan_oksanichenko 14dd6a195f LNX-326: Add the ability to include any package by mask in packages.json to the generator
- The reference packages should be replaced only by newer reference packages
- The non-reference packages should be replaced by both types of packages

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I881bd4e58527ae219ef6e1adbc6332b3b05933c1
2021-06-18 14:23:42 +03:00
stepan_oksanichenko 084321dd97 LNX-326: Add the ability to include any package by mask in packages.json to the generator
- The ability is added
- Also the generator includes only the latest packages by version in packages.json
- The generator has a key `--is-reference` for each repo. This key marks a repo as a reference.
  A reference repo is used as the main source of packages. A non-reference repo is used as a source
  of packages which don't exist in the reference repos.
- All cases are covered by the unittest

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I2f80ba4fbfce27fb9a30500ae46c0b8a2f2aabcd
2021-06-15 17:42:12 +03:00
stepan_oksanichenko 941d6b064a LNX-318: Modify build scripts for building CloudLinux OS 8.4
- [Fixed] The script `create_packages_json` selected the first
          encountered package from a variant, but it should select
          the package with the highest version

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I36268f2a493897fc11e787c040066d2d501a1c81
2021-06-04 12:36:03 +03:00
stepan_oksanichenko aaeee7132d - The version is bumped
- The changelog is added

@BS-TARGET-CL8

Change-Id: I51eef1eb45ba54d034e6bed46d99b0470f4e9221
2021-05-25 21:28:47 +03:00
stepan_oksanichenko cc4d99441c LNX-108: Add multiarch support to pungi
@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Ibfd540454941922d790ae4e56cc0992c0c85635d
2021-05-24 18:07:11 +03:00
stepan_oksanichenko a435eeed06 - The changelog is added
@BS-NOBUILD

Change-Id: I3a0a0377f9c1cefabf52c33fbc0d19ab0e4fe4f1
2021-04-29 17:15:17 +03:00
stepan_oksanichenko b9f554bf39 LNX-311: Add ability to productmd to set a main variant while dumping TreeInfo
@BS-NOBUILD
@BS-TARGET-CL8
@BS-LINKED-608ab56299ce8ac801a396c5  # python3-productmd

Change-Id: Id86d627ae8ae0b9a73b5ce6531c20538f3d040b1
2021-04-29 17:01:49 +03:00
stepan_oksanichenko ebf028ca3b LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- The modules from the parsed output of FUS should have a stream
  with the dash replaced by an underscore

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: If36d3d0a1ef8010bf85a4a0218b9838e0888453c
2021-04-27 13:39:09 +03:00
stepan_oksanichenko 305103a38e LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- Some modules can be absent in the koji env but present in variants.xml,
  and Pungi will fail in this case. So we must filter those modules out of
  the expected modules list using the list from the pungi build config

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: I22c15c42868412e34fd554030130bd7c3e25b8ef
2021-04-23 13:03:05 +03:00
stepan_oksanichenko 01bce26275 LNX-286: Prepare pungi configuration and setup Jenkins job for AlmaLinux 8.4 beta
- The script `gather_modules` should replace `-` with `_`
  in module streams, as pungi itself does

@BS-NOBUILD
@BS-TARGET-CL8

Change-Id: Iea05b70afbf80f3ccd20ad4943c9d86c7ed7aa90
2021-04-22 13:40:48 +03:00
soksanichenko 4d763514c1 - Version is bumped
- Changelog is added

Change-Id: I440b44f12c4a1aa41619acd3ba5ca354dc71b419
2021-02-24 17:42:22 +02:00
Danylo Kuropiatnyk 41381df6a5 LU-2202: Start unittests during installation or build of pungi
* added section with tests and pytest module to requires
IMPORTANT - build.sh script is commented out
* added pyfakefs dependency
* fixed little mock_open issue for runroot test
* bumped version

@BS-TARGET-CL8

Change-Id: I036db225646875eb610736cd26f473850a78447c
2021-02-23 07:55:36 -05:00
soksanichenko 02686d7bdf LU-2186 .treeinfo file in AlmaLinux public kickstart repo should contain AppStream variant
- We are modifying the existing repo's .treeinfo:
-- Take info about included variants from the iso's .treeinfo and put it into the repo's .treeinfo

Change-Id: I29bf655d90994e8a1bda40ad04568dd7364f5dca
2021-02-23 06:48:15 -05:00
soksanichenko 2e48c9a56f LU-2195 Change path to sources and iso when generating repositories
- We should add the images to the compose if they will be used only as netinstall
  images, e.g. *-boot.iso.
- And we shouldn't add them if the images will be modified in the `extra_isos`
  phase, e.g. *-minimal.iso

Change-Id: I9095cfd87414ecca46b1213553589731c82dd2e2
2021-02-22 13:23:48 +02:00
soksanichenko b3a8c3f28a - Version is bumped
- Changelog is added

Change-Id: Ib1366f1fe2639037db99b8e939537bb63801058e
2021-02-11 14:50:12 +02:00
soksanichenko 5434d24027 LU-2133: Prepare CI for iso builds of CLOSS 8
@BS-TARGET-CL8
@BS-NOBUILD

- A script is added which can collect packages/modules
  from remote repos (including BS repos) and merge them
  into one local repo with the right repodata (including
  modules.yaml.gz)
- The script `create_packages_json` can use regexps for the list of excluded packages

Change-Id: I1365b712460959db6bb451d1199d640bff6ffe5e
2021-02-09 10:47:46 +02:00
soksanichenko 3b5501b4bf LNX-133: Create a server for building nightly builds of AlmaLinux
- The key argument '--json-output-path' is added to the script `pungi-generate-package-json`

Change-Id: Ic18fa2708cc4913002023828b3be018d4907de25
2021-01-28 14:03:40 +02:00
soksanichenko cea8d92906 Bump version for setup.py
Change-Id: I980e9ebb728c3a88597c987d585e1b5937499e81
2021-01-28 00:06:40 +02:00
soksanichenko 1a29de435e - The changelog is added
- The version is bumped

Change-Id: I4c7b8d9c64da3379a24d93837657cec2686a8511
2021-01-27 23:47:39 +02:00
soksanichenko 69ed7699e8 LNX-133: Create a server for building nightly builds of AlmaLinux
- The dependency `python3-dataclasses` is added to the spec

Change-Id: Id6b6f33ca6621ddc1408d9ab51e278801e4dd0a2
2021-01-27 07:47:07 -05:00
Stepan Oksanichenko 103c3dc608 LNX-133: Create a server for building nightly builds of AlmaLinux
- Script `pungi-gather-modules` can find valid *modules.yaml.gz in the repo dirs by itself

@BS-LINKED-5ffda6156f44affc6c5ea239  # pungi & dependencies
@BS-TARGET-CL8

Change-Id: I3cddc0cf41ea1087183e23de39126a52c69bc9ac
2021-01-25 16:17:35 +02:00
Stepan Oksanichenko 94ad7603b8 LNX-104: Create gather_prepopulate file generator for Pungi
- A tool is added which can generate a JSON file like `centos-packages.json` using repodata from completed repos.

@BS-LINKED-5ffda6156f44affc6c5ea239  # pungi & dependencies
@BS-TARGET-CL8

Change-Id: Ib0466a1d8e06feb855e81fb7160fe170e2e82e04
2021-01-25 16:17:34 +02:00
oshyshatskyi 903db91c0f LNX-102: Patch pungi tool to use local koji mock
Instead of koji.mbox, use a local koji-like wrapper.

@BS-LINKED-5ff8b8cb6f44affc6c5e9a7a
@BS-TARGET-CL8

Change-Id: I82a2bc8bc71ae06240656898f3df71bb28bcb9e9
2021-01-25 16:17:33 +02:00
oshyshatskyi 552343fffe LNX-102: Add tool that gathers directory for all rpms
Tool that finds all available rpm files in a directory
and creates a special tree for pungi:
 # ls /mnt/koji/
   i686/  noarch/  x86_64/

Change-Id: Ibcf2d23c46411ad89477058f4d56e07ca117f0d1
2021-01-25 16:17:33 +02:00
oshyshatskyi 5806217041 LNX-102: Add tool that collects information about modules
Add special tool that gathers given modules.tar.gz files
and collects information about modules into two dirs:
 - module_defaults
 - modules

 The first one is used by pungi during the repocreate phase and
 the second one is used by the koji mock to get the list of
 available modules and their versions.

Change-Id: I50a095a5f3bafa7e7a1effc2c0d4a2fc52ba603b
2021-01-25 16:17:33 +02:00
Andrew Lukoshko 67eacf8483 LNX-103 Update .spec file for AlmaLinux
New binaries added to pungi rpm:
pungi-gather-rpms
pungi-gather-modules

Change-Id: Idb25dffb10d50fa9f566c99d714d32df962b6f52
2021-01-25 16:17:32 +02:00
Ken Dreyer 38789d07ee doc: remove default createrepo_checksum value from example
createrepo_checksum already defaults to sha256. Remove this setting from
the documented Minimal Example configuration to make it easier to read.

Signed-off-by: Ken Dreyer <kdreyer@redhat.com>
(cherry picked from commit 39b847094a)
2021-01-25 14:06:34 +02:00
Lubomír Sedlář 3735aaa443 comps: Preserve default arg on groupid
When the wrapper processes a comps file, it wasn't emitting the "default"
argument for the groupid element. The default is false and most entries
actually use the default, so let's only emit it if set to true.

Fixes: https://bugzilla.redhat.com/show_bug.cgi?id=1882358
Signed-off-by: Lubomír Sedlář <lsedlar@redhat.com>
(cherry picked from commit 9ea1098eae)
2021-01-25 14:06:33 +02:00
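A sketch of emitting the element that way with ElementTree (the real wrapper uses Pungi's comps code, not this helper):

    import xml.etree.ElementTree as ET

    def emit_groupid(parent, group_id, default=False):
        elem = ET.SubElement(parent, "groupid")
        elem.text = group_id
        # "default" is false for most entries, so only emit it when true.
        if default:
            elem.set("default", "true")
        return elem
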
Haibo Lin 2c1603c414 Stop copying .git directory with module defaults
JIRA: RHELCMP-3016
Fixes: https://pagure.io/pungi/issue/1464

Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit f518c1bb7c)
2021-01-25 14:06:33 +02:00
Haibo Lin f2fd10b0ab React to SIGINT signal
ODCS sends SIGINT signal.

JIRA: RHELCMP-3687
Signed-off-by: Haibo Lin <hlin@redhat.com>
(cherry picked from commit f470599f6c)
2021-01-25 14:06:33 +02:00
Sergey Fokin ac601ab8ea change Source0 in spec file 2021-01-08 13:30:17 +03:00
oshyshatskyi 757a6ed653 Revert unneeded commit to match upstream sources
This reverts commit b2e439e5

Change-Id: Ia6706415039681a6fe7b5ec6a735c3bda66d6bb1
2020-12-30 13:58:06 +02:00
Oleksandr Shyshatskyi b2e439e561 current 2020-12-29 10:44:49 +02:00
98 changed files with 5490 additions and 365 deletions


@@ -53,7 +53,7 @@ copyright = "2016, Red Hat, Inc."
# The short X.Y version.
version = "4.5"
# The full version, including alpha/beta/rc tags.
release = "4.5.1"
release = "4.5.0"
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.

File diff suppressed because it is too large


@@ -609,6 +609,7 @@ def make_schema():
"release_discinfo_description": {"type": "string"},
"treeinfo_version": {"type": "string"},
"compose_type": {"type": "string", "enum": COMPOSE_TYPES},
"label": {"type": "string"},
"base_product_name": {"type": "string"},
"base_product_short": {"type": "string"},
"base_product_version": {"type": "string"},
@@ -686,7 +687,11 @@ def make_schema():
"pkgset_allow_reuse": {"type": "boolean", "default": True},
"createiso_allow_reuse": {"type": "boolean", "default": True},
"extraiso_allow_reuse": {"type": "boolean", "default": True},
"pkgset_source": {"type": "string", "enum": ["koji", "repos"]},
"pkgset_source": {"type": "string", "enum": [
"koji",
"repos",
"kojimock",
]},
"createrepo_c": {"type": "boolean", "default": True},
"createrepo_checksum": {
"type": "string",
@@ -815,6 +820,14 @@ def make_schema():
"type": "string",
"enum": ["lorax", "buildinstall"],
},
# In phase `buildinstall` we should add to compose only the
# images that will be used only as netinstall
"netinstall_variants": {
"$ref": "#/definitions/list_of_strings",
"default": [
"BaseOS",
],
},
"buildinstall_topdir": {"type": "string"},
"buildinstall_kickstart": {"$ref": "#/definitions/str_or_scm_dict"},
"buildinstall_use_guestmount": {"type": "boolean", "default": True},


@@ -17,7 +17,6 @@
from enum import Enum
from functools import cmp_to_key
from itertools import count, groupby
import errno
import logging
import os
import re
@@ -1068,12 +1067,9 @@ class Gather(GatherBase):
# Link downloaded package in (or link package from file repo)
try:
linker.link(pkg.localPkg(), target)
except Exception as ex:
if ex.errno == errno.EEXIST:
self.logger.warning("Downloaded package exists in %s", target)
else:
self.logger.error("Unable to link %s from the yum cache.", pkg.name)
raise
except Exception:
self.logger.error("Unable to link %s from the yum cache." % pkg.name)
raise
def log_count(self, msg, method, *args):
"""


@@ -546,7 +546,14 @@ def link_boot_iso(compose, arch, variant, can_fail):
img.volume_id = iso.get_volume_id(new_boot_iso_path)
except RuntimeError:
pass
compose.im.add(variant.uid, arch, img)
# In this phase we should add to the compose only the images that
# will be used solely as netinstall images.
# On this step lorax generates the environment
# for creating isos and creates them.
# On the `extra_isos` step we overwrite the unneeded `boot Minimal` iso with a
# new iso. It already contains the necessary packages from included variants.
if variant.uid in compose.conf['netinstall_variants']:
compose.im.add(variant.uid, arch, img)
compose.log_info("[DONE ] %s" % msg)


@@ -420,6 +420,12 @@ def get_iso_contents(
original_treeinfo,
os.path.join(extra_files_dir, ".treeinfo"),
)
tweak_repo_treeinfo(
compose,
include_variants,
original_treeinfo,
original_treeinfo,
)
# Add extra files specific for the ISO
files.update(
@@ -431,6 +437,45 @@ def get_iso_contents(
return gp
def tweak_repo_treeinfo(compose, include_variants, source_file, dest_file):
"""
The method includes the variants to file .treeinfo of a variant. It takes
the variants which are described
by options `extra_isos -> include_variants`.
"""
ti = productmd.treeinfo.TreeInfo()
ti.load(source_file)
main_variant = next(iter(ti.variants))
for variant_uid in include_variants:
variant = compose.all_variants[variant_uid]
var = productmd.treeinfo.Variant(ti)
var.id = variant.id
var.uid = variant.uid
var.name = variant.name
var.type = variant.type
ti.variants.add(var)
for variant_id in ti.variants:
var = ti.variants[variant_id]
if variant_id == main_variant:
var.paths.packages = 'Packages'
var.paths.repository = '.'
else:
var.paths.packages = os.path.join(
'../../..',
var.uid,
var.arch,
'os/Packages',
)
var.paths.repository = os.path.join(
'../../..',
var.uid,
var.arch,
'os',
)
ti.dump(dest_file, main_variant=main_variant)
def tweak_treeinfo(compose, include_variants, source_file, dest_file):
ti = load_and_tweak_treeinfo(source_file)
for variant_uid in include_variants:
@@ -446,7 +491,6 @@ def tweak_treeinfo(compose, include_variants, source_file, dest_file):
var = ti.variants[variant_id]
var.paths.packages = os.path.join(var.uid, "Packages")
var.paths.repository = var.uid
ti.dump(dest_file)


@@ -23,6 +23,7 @@ import threading
from kobo.rpmlib import parse_nvra
from kobo.shortcuts import run
from productmd.rpms import Rpms
from pungi.phases.pkgset.common import get_all_arches
from six.moves import cPickle as pickle
try:
@@ -649,6 +650,11 @@ def _make_lookaside_repo(compose, variant, arch, pkg_map, package_sets=None):
pungi.wrappers.kojiwrapper.KojiWrapper(compose).koji_module.config.topdir,
).rstrip("/")
+ "/",
"kojimock": lambda: pungi.wrappers.kojiwrapper.KojiMockWrapper(
compose,
get_all_arches(compose),
).koji_module.config.topdir.rstrip("/")
+ "/",
}
path_prefix = prefixes[compose.conf["pkgset_source"]]()
package_list = set()


@@ -23,6 +23,8 @@ import itertools
import json
import os
import time
import pgpy
import rpm
from six.moves import cPickle as pickle
from functools import partial
@@ -152,9 +154,15 @@ class PackageSetBase(kobo.log.LoggingBase):
"""
def nvr_formatter(package_info):
# joins NVR parts of the package with '-' character.
return "-".join(
(package_info["name"], package_info["version"], package_info["release"])
epoch_suffix = ''
if package_info['epoch'] is not None:
epoch_suffix = ':' + package_info['epoch']
return (
f"{package_info['name']}"
f"{epoch_suffix}-"
f"{package_info['version']}-"
f"{package_info['release']}."
f"{package_info['arch']}"
)
def get_error(sigkeys, infos):
@@ -503,7 +511,8 @@ class KojiPackageSet(PackageSetBase):
response = None
if self.cache_region:
cache_key = "KojiPackageSet.get_latest_rpms_%s_%s_%s" % (
cache_key = "%s.get_latest_rpms_%s_%s_%s" % (
str(self.__class__.__name__),
str(tag),
str(event),
str(inherit),
@@ -525,6 +534,8 @@
return response
def get_package_path(self, queue_item):
rpm_info, build_info = queue_item
@@ -882,6 +893,67 @@
return False
class KojiMockPackageSet(KojiPackageSet):
def _is_rpm_signed(self, rpm_path) -> bool:
ts = rpm.TransactionSet()
ts.setVSFlags(rpm._RPMVSF_NOSIGNATURES)
sigkeys = [
sigkey.lower() for sigkey in self.sigkey_ordering
if sigkey is not None
]
if not sigkeys:
return True
with open(rpm_path, 'rb') as fd:
header = ts.hdrFromFdno(fd)
signature = header[rpm.RPMTAG_SIGGPG] or header[rpm.RPMTAG_SIGPGP]
if signature is None:
return False
pgp_msg = pgpy.PGPMessage.from_blob(signature)
return any(
signature.signer.lower() in sigkeys
for signature in pgp_msg.signatures
)
def get_package_path(self, queue_item):
rpm_info, build_info = queue_item
# Check if this RPM is coming from scratch task.
# In this case, we already know the path.
if "path_from_task" in rpm_info:
return rpm_info["path_from_task"]
# we replaced this part because pungi uses way
# of guessing path of package on koji based on sigkey
# we don't need that because all our packages will
# be ready for release
# signature verification is still done during deps resolution
pathinfo = self.koji_wrapper.koji_module.pathinfo
rpm_path = os.path.join(pathinfo.topdir, pathinfo.rpm(rpm_info))
if os.path.isfile(rpm_path):
if not self._is_rpm_signed(rpm_path):
self._invalid_sigkey_rpms.append(rpm_info)
self.log_error(
'RPM "%s" not found for sigs: "%s". Path checked: "%s"',
rpm_info, self.sigkey_ordering, rpm_path
)
return
return rpm_path
else:
self.log_warning("RPM %s not found" % rpm_path)
return None
def populate(self, tag, event=None, inherit=True, include_packages=None):
result = super().populate(
tag=tag,
event=event,
inherit=inherit,
include_packages=include_packages,
)
return result
def _is_src(rpm_info):
"""Check if rpm info object returned by Koji refers to source packages."""
return rpm_info["arch"] in ("src", "nosrc")


@@ -15,8 +15,10 @@
from .source_koji import PkgsetSourceKoji
from .source_repos import PkgsetSourceRepos
from .source_kojimock import PkgsetSourceKojiMock
ALL_SOURCES = {
"koji": PkgsetSourceKoji,
"repos": PkgsetSourceRepos,
"kojimock": PkgsetSourceKojiMock,
}

File diff suppressed because it is too large


@@ -94,7 +94,7 @@ class Runroot(kobo.log.LoggingBase):
log_file = os.path.join(log_dir, "program.log")
try:
with open(log_file) as f:
for line in f:
for line in f.readlines():
if "losetup: cannot find an unused loop device" in line:
return True
if re.match("losetup: .* failed to set up loop device", line):


@@ -0,0 +1,441 @@
# coding=utf-8
import argparse
import os
import subprocess
import tempfile
from shutil import rmtree
from typing import (
AnyStr,
List,
Dict,
Optional,
)
import createrepo_c as cr
import requests
import yaml
from dataclasses import dataclass, field
from .create_packages_json import (
PackagesGenerator,
RepoInfo,
VariantInfo,
)
@dataclass
class ExtraVariantInfo(VariantInfo):
modules: List[AnyStr] = field(default_factory=list)
packages: List[AnyStr] = field(default_factory=list)
class CreateExtraRepo(PackagesGenerator):
def __init__(
self,
variants: List[ExtraVariantInfo],
bs_auth_token: AnyStr,
local_repository_path: AnyStr,
clear_target_repo: bool = True,
):
self.variants = [] # type: List[ExtraVariantInfo]
super().__init__(variants, [], [])
self.auth_headers = {
'Authorization': f'Bearer {bs_auth_token}',
}
# modules data of modules.yaml.gz from an existing local repo
self.local_modules_data = []
self.local_repository_path = local_repository_path
# path to modules.yaml, which generated by the class
self.default_modules_yaml_path = os.path.join(
local_repository_path,
'modules.yaml',
)
if clear_target_repo:
if os.path.exists(self.local_repository_path):
rmtree(self.local_repository_path)
os.makedirs(self.local_repository_path, exist_ok=True)
else:
self._read_local_modules_yaml()
def _read_local_modules_yaml(self):
"""
Read modules data from an existing local repo
"""
repomd_file_path = os.path.join(
self.local_repository_path,
'repodata',
'repomd.xml',
)
repomd_object = self._parse_repomd(repomd_file_path)
for repomd_record in repomd_object.records:
if repomd_record.type != 'modules':
continue
modules_yaml_path = os.path.join(
self.local_repository_path,
repomd_record.location_href,
)
self.local_modules_data = list(self._parse_modules_file(
modules_yaml_path,
))
break
def _dump_local_modules_yaml(self):
"""
Dump merged modules data to a local repo
"""
if self.local_modules_data:
with open(self.default_modules_yaml_path, 'w') as yaml_file:
yaml.dump_all(
self.local_modules_data,
yaml_file,
)
@staticmethod
def get_repo_info_from_bs_repo(
auth_token: AnyStr,
build_id: AnyStr,
arch: AnyStr,
packages: Optional[List[AnyStr]] = None,
modules: Optional[List[AnyStr]] = None,
) -> List[ExtraVariantInfo]:
"""
Get info about a BS repo and convert it to
a list of ExtraVariantInfo objects
:param auth_token: auth token for the Build System
:param build_id: ID of a build from the BS
:param arch: architecture of the repo which will be used
:param packages: list of names of packages which will be put into a
local repo from a BS repo
:param modules: list of names of modules which will be put into a
local repo from a BS repo
:return: list of ExtraVariantInfo with info about the BS repos
"""
bs_url = 'https://build.cloudlinux.com'
api_uri = 'api/v1'
bs_repo_suffix = 'build_repos'
variants_info = []
# get the full info about a BS repo
repo_request = requests.get(
url=os.path.join(
bs_url,
api_uri,
'builds',
build_id,
),
headers={
'Authorization': f'Bearer {auth_token}',
},
)
repo_request.raise_for_status()
result = repo_request.json()
for build_platform in result['build_platforms']:
platform_name = build_platform['name']
for architecture in build_platform['architectures']:
# skip repo with unsuitable architecture
if architecture != arch:
continue
variant_info = ExtraVariantInfo(
name=f'{build_id}-{platform_name}-{architecture}',
arch=architecture,
packages=packages,
modules=modules,
repos=[
RepoInfo(
path=os.path.join(
bs_url,
bs_repo_suffix,
build_id,
platform_name,
),
folder=architecture,
is_remote=True,
)
]
)
variants_info.append(variant_info)
return variants_info
def _create_local_extra_repo(self):
"""
Call `createrepo_c <path_to_repo>` for creating a local repo
"""
subprocess.call(
f'createrepo_c {self.local_repository_path}',
shell=True,
)
# remove an unnecessary temporary modules.yaml
if os.path.exists(self.default_modules_yaml_path):
os.remove(self.default_modules_yaml_path)
def get_remote_file_content(
self,
file_url: AnyStr,
) -> AnyStr:
"""
Get content from a remote file and write it to a temp file
:param file_url: url of a remote file
:return: path to a temp file
"""
file_request = requests.get(
url=file_url,
# for the case when we get a file from BS
headers=self.auth_headers,
)
file_request.raise_for_status()
with tempfile.NamedTemporaryFile(delete=False) as file_stream:
file_stream.write(file_request.content)
return file_stream.name
def _download_rpm_to_local_repo(
self,
package_location: AnyStr,
repo_info: RepoInfo,
) -> None:
"""
Download a rpm package from a remote repo and save it to a local repo
:param package_location: relative uri of a package in a remote repo
:param repo_info: info about a remote repo which contains a specific
rpm package
"""
rpm_package_remote_path = os.path.join(
repo_info.path,
repo_info.folder,
package_location,
)
rpm_package_local_path = os.path.join(
self.local_repository_path,
os.path.basename(package_location),
)
rpm_request = requests.get(
url=rpm_package_remote_path,
headers=self.auth_headers,
)
rpm_request.raise_for_status()
with open(rpm_package_local_path, 'wb') as rpm_file:
rpm_file.write(rpm_request.content)
def _download_packages(
self,
packages: Dict[AnyStr, cr.Package],
variant_info: ExtraVariantInfo
):
"""
Download all defined packages from a remote repo
:param packages: information about all packages (including
modularity) in a remote repo
:param variant_info: information about a remote variant
"""
for package in packages.values():
package_name = package.name
# Skip the current package from a remote repo if the list of
# packages is defined and the current package doesn't belong to it
if variant_info.packages and \
package_name not in variant_info.packages:
continue
for repo_info in variant_info.repos:
self._download_rpm_to_local_repo(
package_location=package.location_href,
repo_info=repo_info,
)
def _download_modules(
self,
modules_data: List[Dict],
variant_info: ExtraVariantInfo,
packages: Dict[AnyStr, cr.Package]
):
"""
Download all defined modularity packages and their data from
a remote repo
:param modules_data: information about all modules in a remote repo
:param variant_info: information about a remote variant
:param packages: information about all packages (including
modularity) in a remote repo
"""
for module in modules_data:
module_data = module['data']
# Skip the current module from a remote repo if the list of
# modules is defined and the current module doesn't belong to it
if variant_info.modules and \
module_data['name'] not in variant_info.modules:
continue
# we should add info about a module if the local repodata
# doesn't have it
if module not in self.local_modules_data:
self.local_modules_data.append(module)
# just skip a module's record if it doesn't have rpm artifacts
if module['document'] != 'modulemd' or \
'artifacts' not in module_data or \
'rpms' not in module_data['artifacts']:
continue
for rpm in module['data']['artifacts']['rpms']:
# Empty repo_info.packages means that we will download
# all packages from repo including
# the modularity packages
if not variant_info.packages:
break
# skip a rpm if it doesn't belong to a processed repo
if rpm not in packages:
continue
for repo_info in variant_info.repos:
self._download_rpm_to_local_repo(
package_location=packages[rpm].location_href,
repo_info=repo_info,
)
def create_extra_repo(self):
"""
1. Get the specific (or all) packages/modules from the remote repos
2. Save them to a local repo
3. Save info about the modules to a local repo
4. Call `createrepo_c` which creates a local repo
with the right repodata
"""
for variant_info in self.variants:
for repo_info in variant_info.repos:
repomd_records = self._get_repomd_records(
repo_info=repo_info,
)
packages_iterator = self.get_packages_iterator(repo_info)
# parse the repodata (including modules.yaml.gz)
modules_data = self._parse_module_repomd_record(
repo_info=repo_info,
repomd_records=repomd_records,
)
# convert the packages dict to more usable form
# for future checking that a rpm from the module's artifacts
# belongs to a processed repository
packages = {
f'{package.name}-{package.epoch}:{package.version}-'
f'{package.release}.{package.arch}':
package for package in packages_iterator
}
self._download_modules(
modules_data=modules_data,
variant_info=variant_info,
packages=packages,
)
self._download_packages(
packages=packages,
variant_info=variant_info,
)
self._dump_local_modules_yaml()
self._create_local_extra_repo()
def create_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'--bs-auth-token',
help='Auth token for Build System',
)
parser.add_argument(
'--local-repo-path',
help='Path to a local repo. E.g. /var/repo/test_repo',
required=True,
)
parser.add_argument(
'--clear-local-repo',
help='Clear a local repo before creating a new one',
action='store_true',
default=False,
)
parser.add_argument(
'--repo',
action='append',
help='Path to a folder with repo folders or a build ID. E.g. '
'"http://koji.cloudlinux.com/mirrors/rhel_mirror" or '
'"601809b3c2f5b0e458b14cd3"',
required=True,
)
parser.add_argument(
'--repo-folder',
action='append',
help='A folder which contains a repodata folder. E.g. "baseos-stream"',
required=True,
)
parser.add_argument(
'--repo-arch',
action='append',
help='Architecture of the packages a repository contains. E.g. "x86_64"',
required=True,
)
parser.add_argument(
'--packages',
action='append',
type=str,
default=[],
help='A list of package names which we want to download to a local '
'extra repo. All packages are downloaded if the param is empty',
required=True,
)
parser.add_argument(
'--modules',
action='append',
type=str,
default=[],
help='A list of module names which we want to download to a local '
'extra repo. All modules are downloaded if the param is empty',
required=True,
)
return parser
def cli_main():
args = create_parser().parse_args()
repos_info = []
for repo, repo_folder, repo_arch, packages, modules in zip(
args.repo,
args.repo_folder,
args.repo_arch,
args.packages,
args.modules,
):
modules = modules.split()
packages = packages.split()
if repo.startswith('http://'):
repos_info.append(
ExtraVariantInfo(
name=repo_folder,
arch=repo_arch,
repos=[
RepoInfo(
path=repo,
folder=repo_folder,
is_remote=True,
)
],
modules=modules,
packages=packages,
)
)
else:
repos_info.extend(
CreateExtraRepo.get_repo_info_from_bs_repo(
auth_token=args.bs_auth_token,
build_id=repo,
arch=repo_arch,
modules=modules,
packages=packages,
)
)
cer = CreateExtraRepo(
variants=repos_info,
bs_auth_token=args.bs_auth_token,
local_repository_path=args.local_repo_path,
clear_target_repo=args.clear_local_repo,
)
cer.create_extra_repo()
if __name__ == '__main__':
cli_main()
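# Illustrative usage sketch (hypothetical values; the console-script name
# comes from the setup.py entry points in this changeset): mirror selected
# packages from a remote repo into a local extra repo and regenerate its
# repodata. --packages/--modules take whitespace-separated names:
#
#   pungi-create-extra-repo \
#       --repo http://koji.cloudlinux.com/mirrors/rhel_mirror \
#       --repo-folder baseos-stream \
#       --repo-arch x86_64 \
#       --packages "bash bash-doc" \
#       --modules "" \
#       --local-repo-path /var/repo/test_repo \
#       --clear-local-repo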

View File

@ -0,0 +1,514 @@
# coding=utf-8
"""
The tool allows generating packages.json. This file is used by pungi
as the parameter `gather_prepopulate`.
A sample of using repodata files is taken from
https://github.com/rpm-software-management/createrepo_c/blob/master/examples/python/repodata_parsing.py
"""
import argparse
import gzip
import json
import logging
import lzma
import os
import re
import tempfile
from collections import defaultdict
from itertools import tee
from pathlib import Path
from typing import (
AnyStr,
Dict,
List,
Any,
Iterator,
Optional,
Tuple,
Union,
)
import binascii
from urllib.parse import urljoin
import requests
import rpm
import yaml
from createrepo_c import (
Package,
PackageIterator,
Repomd,
RepomdRecord,
)
from dataclasses import dataclass, field
from kobo.rpmlib import parse_nvra
logging.basicConfig(level=logging.INFO)
def _is_compressed_file(first_two_bytes: bytes, initial_bytes: bytes):
return binascii.hexlify(first_two_bytes) == initial_bytes
def is_gzip_file(first_two_bytes):
return _is_compressed_file(
first_two_bytes=first_two_bytes,
initial_bytes=b'1f8b',
)
def is_xz_file(first_two_bytes):
return _is_compressed_file(
first_two_bytes=first_two_bytes,
initial_bytes=b'fd37',
)
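# Illustrative check (standard magic numbers): gzip streams start with
# bytes 0x1f 0x8b and xz streams with 0xfd '7zXZ', so
#   is_gzip_file(b'\x1f\x8b')  -> True
#   is_xz_file(b'\xfd7')       -> True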
@dataclass
class RepoInfo:
# path to a directory with repo directories. E.g. '/var/repos' contains
# 'appstream', 'baseos', etc.
# Or 'http://koji.cloudlinux.com/mirrors/rhel_mirror' if you are
# using remote repo
path: str
# name of folder with a repodata folder. E.g. 'baseos', 'appstream', etc
folder: str
# Is a repo remote or local
is_remote: bool
# Is this a reference repository (usually a RHEL repo)?
# The layout of packages from such a repository is taken as an example.
# Only the layout of a specific package (which doesn't exist
# in a reference repository) is taken from a non-reference repository.
is_reference: bool = False
# The packages from 'present' repo will be added to a variant.
# The packages from 'absent' repo will be removed from a variant.
repo_type: str = 'present'
@dataclass
class VariantInfo:
# name of variant. E.g. 'BaseOS', 'AppStream', etc
name: AnyStr
# architecture of variant. E.g. 'x86_64', 'i686', etc
arch: AnyStr
# The packages which will not be added to a variant
excluded_packages: List[str] = field(default_factory=list)
# Repos of a variant
repos: List[RepoInfo] = field(default_factory=list)
class PackagesGenerator:
repo_arches = defaultdict(lambda: list(('noarch',)))
addon_repos = {
'x86_64': ['i686'],
'ppc64le': [],
'aarch64': [],
's390x': [],
'i686': [],
}
def __init__(
self,
variants: List[VariantInfo],
excluded_packages: List[AnyStr],
included_packages: List[AnyStr],
):
self.variants = variants
self.pkgs = dict()
self.excluded_packages = excluded_packages
self.included_packages = included_packages
self.tmp_files = [] # type: list[Path]
for arch, arch_list in self.addon_repos.items():
self.repo_arches[arch].extend(arch_list)
self.repo_arches[arch].append(arch)
def __del__(self):
for tmp_file in self.tmp_files:
if tmp_file.exists():
tmp_file.unlink()
@staticmethod
def _get_full_repo_path(repo_info: RepoInfo):
result = os.path.join(
repo_info.path,
repo_info.folder
)
if repo_info.is_remote:
result = urljoin(
repo_info.path + '/',
repo_info.folder,
)
return result
@staticmethod
def _warning_callback(warning_type, message):
"""
Warning callback for createrepo_c parsing functions
"""
print(f'Warning message: "{message}"; warning type: "{warning_type}"')
return True
def get_remote_file_content(self, file_url: AnyStr) -> AnyStr:
"""
Get content from a remote file and write it to a temp file
:param file_url: url of a remote file
:return: path to a temp file
"""
file_request = requests.get(
url=file_url,
)
file_request.raise_for_status()
with tempfile.NamedTemporaryFile(delete=False) as file_stream:
file_stream.write(file_request.content)
self.tmp_files.append(Path(file_stream.name))
return file_stream.name
@staticmethod
def _parse_repomd(repomd_file_path: AnyStr) -> Repomd:
"""
Parse file repomd.xml and create object Repomd
:param repomd_file_path: path to local repomd.xml
"""
return Repomd(repomd_file_path)
@classmethod
def _parse_modules_file(
cls,
modules_file_path: AnyStr,
) -> Iterator[Any]:
"""
Parse modules.yaml.gz and return the parsed data
:param modules_file_path: path to local modules.yaml.gz
:return: list of dicts, one for each module in a repo
"""
with open(modules_file_path, 'rb') as modules_file:
data = modules_file.read()
if is_gzip_file(data[:2]):
data = gzip.decompress(data)
elif is_xz_file(data[:2]):
data = lzma.decompress(data)
return yaml.load_all(
data,
Loader=yaml.BaseLoader,
)
def _get_repomd_records(
self,
repo_info: RepoInfo,
) -> List[RepomdRecord]:
"""
Fetch and parse repomd.xml and extract the repomd records from it
:param repo_info: structure which contains info about a current repo
:return: list with repomd records
"""
repomd_file_path = os.path.join(
repo_info.path,
repo_info.folder,
'repodata',
'repomd.xml',
)
if repo_info.is_remote:
repomd_file_path = urljoin(
urljoin(
repo_info.path + '/',
repo_info.folder
) + '/',
'repodata/repomd.xml'
)
repomd_file_path = self.get_remote_file_content(repomd_file_path)
repomd_object = self._parse_repomd(repomd_file_path)
if repo_info.is_remote:
os.remove(repomd_file_path)
return repomd_object.records
def _download_repomd_records(
self,
repo_info: RepoInfo,
repomd_records: List[RepomdRecord],
repomd_records_dict: Dict[str, str],
):
"""
Download repomd records
:param repo_info: structure which contains info about a current repo
:param repomd_records: list with repomd records
:param repomd_records_dict: dict with paths to repodata files
"""
for repomd_record in repomd_records:
if repomd_record.type not in (
'primary',
'filelists',
'other',
):
continue
repomd_record_file_path = os.path.join(
repo_info.path,
repo_info.folder,
repomd_record.location_href,
)
if repo_info.is_remote:
repomd_record_file_path = self.get_remote_file_content(
repomd_record_file_path)
repomd_records_dict[repomd_record.type] = repomd_record_file_path
def _parse_module_repomd_record(
self,
repo_info: RepoInfo,
repomd_records: List[RepomdRecord],
) -> List[Dict]:
"""
Parse the modules repomd record and return the parsed modules data
:param repo_info: structure which contains info about a current repo
:param repomd_records: list with repomd records
:return: list of dicts with parsed modules data, or an empty list
"""
for repomd_record in repomd_records:
if repomd_record.type != 'modules':
continue
repomd_record_file_path = os.path.join(
repo_info.path,
repo_info.folder,
repomd_record.location_href,
)
if repo_info.is_remote:
repomd_record_file_path = self.get_remote_file_content(
repomd_record_file_path)
return list(self._parse_modules_file(
repomd_record_file_path,
))
return []
@staticmethod
def compare_pkgs_version(package_1: Package, package_2: Package) -> int:
version_tuple_1 = (
package_1.epoch,
package_1.version,
package_1.release,
)
version_tuple_2 = (
package_2.epoch,
package_2.version,
package_2.release,
)
return rpm.labelCompare(version_tuple_1, version_tuple_2)
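# Illustrative example (hypothetical versions): rpm.labelCompare returns
# -1, 0 or 1, so a higher release of the same version and epoch wins:
#   rpm.labelCompare(('0', '1.0', '1'), ('0', '1.0', '2'))  -> -1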
def get_packages_iterator(
self,
repo_info: RepoInfo,
) -> Union[PackageIterator, Iterator]:
full_repo_path = self._get_full_repo_path(repo_info)
pkgs_iterator = self.pkgs.get(full_repo_path)
if pkgs_iterator is None:
repomd_records = self._get_repomd_records(
repo_info=repo_info,
)
repomd_records_dict = {} # type: Dict[str, str]
self._download_repomd_records(
repo_info=repo_info,
repomd_records=repomd_records,
repomd_records_dict=repomd_records_dict,
)
pkgs_iterator = PackageIterator(
primary_path=repomd_records_dict['primary'],
filelists_path=repomd_records_dict['filelists'],
other_path=repomd_records_dict['other'],
warningcb=self._warning_callback,
)
pkgs_iterator, self.pkgs[full_repo_path] = tee(pkgs_iterator)
return pkgs_iterator
def get_package_arch(
self,
package: Package,
variant_arch: str,
) -> str:
result = variant_arch
if package.arch in self.repo_arches[variant_arch]:
result = package.arch
return result
def is_skipped_module_package(
self,
package: Package,
variant_arch: str,
) -> bool:
package_key = self.get_package_key(package, variant_arch)
# Even a module package will be added to packages.json if
# it is present in the list of included packages
return 'module' in package.release and not any(
re.search(
f'^{included_pkg}$',
package_key,
) or included_pkg in (package.name, package_key)
for included_pkg in self.included_packages
)
def is_excluded_package(
self,
package: Package,
variant_arch: str,
excluded_packages: List[str],
) -> bool:
package_key = self.get_package_key(package, variant_arch)
return any(
re.search(
f'^{excluded_pkg}$',
package_key,
) or excluded_pkg in (package.name, package_key)
for excluded_pkg in excluded_packages
)
@staticmethod
def get_source_rpm_name(package: Package) -> str:
source_rpm_nvra = parse_nvra(package.rpm_sourcerpm)
return source_rpm_nvra['name']
def get_package_key(self, package: Package, variant_arch: str) -> str:
return (
f'{package.name}.'
f'{self.get_package_arch(package, variant_arch)}'
)
def generate_packages_json(
self
) -> Dict[AnyStr, Dict[AnyStr, Dict[AnyStr, List[AnyStr]]]]:
"""
Generate packages.json
"""
packages = defaultdict(lambda: defaultdict(lambda: {
'variants': list(),
}))
for variant_info in self.variants:
for repo_info in variant_info.repos:
is_reference = repo_info.is_reference
for package in self.get_packages_iterator(repo_info=repo_info):
if self.is_skipped_module_package(
package=package,
variant_arch=variant_info.arch,
):
continue
if self.is_excluded_package(
package=package,
variant_arch=variant_info.arch,
excluded_packages=self.excluded_packages,
):
continue
if self.is_excluded_package(
package=package,
variant_arch=variant_info.arch,
excluded_packages=variant_info.excluded_packages,
):
continue
package_key = self.get_package_key(
package,
variant_info.arch,
)
source_rpm_name = self.get_source_rpm_name(package)
package_info = packages[source_rpm_name][package_key]
if 'is_reference' not in package_info:
package_info['variants'].append(variant_info.name)
package_info['is_reference'] = is_reference
package_info['package'] = package
elif not package_info['is_reference'] or \
package_info['is_reference'] == is_reference and \
self.compare_pkgs_version(
package_1=package,
package_2=package_info['package'],
) > 0:
package_info['variants'] = [variant_info.name]
package_info['is_reference'] = is_reference
package_info['package'] = package
elif self.compare_pkgs_version(
package_1=package,
package_2=package_info['package'],
) == 0 and repo_info.repo_type != 'absent':
package_info['variants'].append(variant_info.name)
result = defaultdict(lambda: defaultdict(
lambda: defaultdict(list),
))
for variant_info in self.variants:
for source_rpm_name, packages_info in packages.items():
for package_key, package_info in packages_info.items():
variant_pkgs = result[variant_info.name][variant_info.arch]
if variant_info.name not in package_info['variants']:
continue
variant_pkgs[source_rpm_name].append(package_key)
return result
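# Illustrative shape of the returned structure (hypothetical values),
# keyed by variant, then arch, then source rpm name:
#   {
#       "BaseOS": {
#           "x86_64": {
#               "bash": ["bash.x86_64", "bash-doc.noarch"]
#           }
#       }
#   }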
def create_parser():
parser = argparse.ArgumentParser()
parser.add_argument(
'-c',
'--config',
type=Path,
default=Path('config.yaml'),
required=False,
help='Path to a config',
)
parser.add_argument(
'-o',
'--json-output-path',
type=str,
help='Full path to output json file',
required=True,
)
return parser
def read_config(config_path: Path) -> Optional[Dict]:
if not config_path.exists():
logging.error('A config by path "%s" does not exist', config_path)
exit(1)
with config_path.open('r') as config_fd:
return yaml.safe_load(config_fd)
def process_config(config_data: Dict) -> Tuple[
List[VariantInfo],
List[str],
List[str],
]:
excluded_packages = config_data.get('excluded_packages', [])
included_packages = config_data.get('included_packages', [])
variants = [VariantInfo(
name=variant_name,
arch=variant_info['arch'],
excluded_packages=variant_info.get('excluded_packages', []),
repos=[RepoInfo(
path=variant_repo['path'],
folder=variant_repo['folder'],
is_remote=variant_repo['remote'],
is_reference=variant_repo['reference'],
repo_type=variant_repo.get('repo_type', 'present'),
) for variant_repo in variant_info['repos']]
) for variant_name, variant_info in config_data['variants'].items()]
return variants, excluded_packages, included_packages
def cli_main():
args = create_parser().parse_args()
variants, excluded_packages, included_packages = process_config(
config_data=read_config(args.config)
)
pg = PackagesGenerator(
variants=variants,
excluded_packages=excluded_packages,
included_packages=included_packages,
)
result = pg.generate_packages_json()
with open(args.json_output_path, 'w') as packages_file:
json.dump(
result,
packages_file,
indent=4,
sort_keys=True,
)
if __name__ == '__main__':
cli_main()
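# Illustrative config sketch (hypothetical paths and names) matching the
# keys consumed by read_config/process_config above:
#
#   excluded_packages: []
#   included_packages: []
#   variants:
#     BaseOS:
#       arch: x86_64
#       repos:
#         - path: /var/repos
#           folder: baseos
#           remote: false
#           reference: true
#           repo_type: present
#
# Rendered with the console script wired up in setup.py:
#   pungi-generate-packages-json -c config.yaml -o packages.json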

View File

@ -0,0 +1,255 @@
import gzip
import lzma
import os
from argparse import ArgumentParser, FileType
from glob import iglob
from io import BytesIO
from pathlib import Path
from typing import List, AnyStr, Iterable, Union, Optional
import logging
from urllib.parse import urljoin
import yaml
import createrepo_c as cr
from typing import BinaryIO
from .create_packages_json import PackagesGenerator, is_gzip_file, is_xz_file
EMPTY_FILE = '.empty'
def read_modules_yaml(modules_yaml_path: Union[str, Path]) -> BytesIO:
with open(modules_yaml_path, 'rb') as fp:
return BytesIO(fp.read())
def grep_list_of_modules_yaml(repos_path: AnyStr) -> Iterable[BytesIO]:
"""
Find all valid *modules.yaml.* files in repos
:param repos_path: path to a directory which contains repo dirs
:return: iterable object of content from *modules.yaml.*
"""
return (
read_modules_yaml_from_specific_repo(repo_path=Path(path).parent)
for path in iglob(
str(Path(repos_path).joinpath('**/repodata')),
recursive=True
)
)
def _is_remote(path: str):
return any(str(path).startswith(protocol)
for protocol in ('http', 'https'))
def read_modules_yaml_from_specific_repo(
repo_path: Union[str, Path]
) -> Optional[BytesIO]:
"""
Read modules_yaml from a specific repo (remote or local)
:param repo_path: path/url to a specific repo
(final dir should contain dir `repodata`)
:return: BytesIO content of modules.yaml, or None if the repo has no modules record
"""
if _is_remote(repo_path):
repomd_url = urljoin(
repo_path + '/',
'repodata/repomd.xml',
)
packages_generator = PackagesGenerator(
variants=[],
excluded_packages=[],
included_packages=[],
)
repomd_file_path = packages_generator.get_remote_file_content(
file_url=repomd_url
)
else:
repomd_file_path = os.path.join(
repo_path,
'repodata/repomd.xml',
)
repomd_obj = cr.Repomd(str(repomd_file_path))
for record in repomd_obj.records:
if record.type != 'modules':
continue
else:
if _is_remote(repo_path):
modules_yaml_url = urljoin(
repo_path + '/',
record.location_href,
)
packages_generator = PackagesGenerator(
variants=[],
excluded_packages=[],
included_packages=[],
)
modules_yaml_path = packages_generator.get_remote_file_content(
file_url=modules_yaml_url
)
else:
modules_yaml_path = os.path.join(
repo_path,
record.location_href,
)
return read_modules_yaml(modules_yaml_path=modules_yaml_path)
else:
return None
def _should_grep_defaults(
document_type: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
) -> bool:
xor_flag = grep_only_modules_data == grep_only_modules_defaults_data
if document_type == 'modulemd' and (xor_flag or grep_only_modules_data):
return True
return False
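# Note: despite their names, _should_grep_defaults above matches plain
# 'modulemd' documents, while _should_grep_modules below matches
# 'modulemd-defaults' documents; the call sites in collect_modules rely
# on exactly this pairing.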
def _should_grep_modules(
document_type: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
) -> bool:
xor_flag = grep_only_modules_data == grep_only_modules_defaults_data
if document_type == 'modulemd-defaults' and \
(xor_flag or grep_only_modules_defaults_data):
return True
return False
def collect_modules(
modules_paths: List[BinaryIO],
target_dir: str,
grep_only_modules_data: bool = False,
grep_only_modules_defaults_data: bool = False,
):
"""
Read the given modules.yaml.gz files and export the modulemd
and modulemd-defaults documents from them.
"""
xor_flag = grep_only_modules_defaults_data is grep_only_modules_data
modules_path = os.path.join(target_dir, 'modules')
module_defaults_path = os.path.join(target_dir, 'module_defaults')
if grep_only_modules_data or xor_flag:
os.makedirs(modules_path, exist_ok=True)
if grep_only_modules_defaults_data or xor_flag:
os.makedirs(module_defaults_path, exist_ok=True)
# Module defaults can be empty, but pungi detects an
# empty folder while copying and raises an exception in this case
Path(os.path.join(module_defaults_path, EMPTY_FILE)).touch()
for module_file in modules_paths:
data = module_file.read()
if is_gzip_file(data[:2]):
data = gzip.decompress(data)
elif is_xz_file(data[:2]):
data = lzma.decompress(data)
documents = yaml.load_all(data, Loader=yaml.BaseLoader)
for doc in documents:
path = None
if _should_grep_modules(
doc['document'],
grep_only_modules_data,
grep_only_modules_defaults_data,
):
name = f"{doc['data']['module']}.yaml"
path = os.path.join(module_defaults_path, name)
logging.info('Found %s module defaults', name)
elif _should_grep_defaults(
doc['document'],
grep_only_modules_data,
grep_only_modules_defaults_data,
):
# pungi.phases.pkgset.sources.source_koji.get_koji_modules
stream = doc['data']['stream'].replace('-', '_')
doc_data = doc['data']
name = f"{doc_data['name']}-{stream}-" \
f"{doc_data['version']}.{doc_data['context']}"
arch_dir = os.path.join(
modules_path,
doc_data['arch']
)
os.makedirs(arch_dir, exist_ok=True)
path = os.path.join(
arch_dir,
name,
)
logging.info('Found module %s', name)
if 'artifacts' not in doc['data']:
logging.warning(
'Module %s does not have an explicit list of artifacts',
name
)
if path is not None:
with open(path, 'w') as f:
yaml.dump(doc, f, default_flow_style=False)
def cli_main():
parser = ArgumentParser()
content_type_group = parser.add_mutually_exclusive_group(required=False)
content_type_group.add_argument(
'--get-only-modules-data',
action='store_true',
help='Parse and get only modules data',
)
content_type_group.add_argument(
'--get-only-modules-defaults-data',
action='store_true',
help='Parse and get only modules_defaults data',
)
path_group = parser.add_mutually_exclusive_group(required=True)
path_group.add_argument(
'-p', '--path',
type=FileType('rb'), nargs='+',
help='Path to modules.yaml.gz file. '
'You may pass multiple files by passing -p path1 path2'
)
path_group.add_argument(
'-rp', '--repo-path',
required=False,
type=str,
default=None,
help='Path to a directory which contains repodirs. E.g. /var/repos'
)
path_group.add_argument(
'-rd', '--repodata-paths',
required=False,
type=str,
nargs='+',
default=[],
help='Paths/urls to the directories with directory `repodata`',
)
parser.add_argument('-t', '--target', required=True)
namespace = parser.parse_args()
if namespace.repodata_paths:
modules = []
for repodata_path in namespace.repodata_paths:
modules.append(read_modules_yaml_from_specific_repo(
repodata_path,
))
elif namespace.path is not None:
modules = namespace.path
else:
modules = grep_list_of_modules_yaml(namespace.repo_path)
modules = list(filter(lambda i: i is not None, modules))
collect_modules(
modules,
namespace.target,
namespace.get_only_modules_data,
namespace.get_only_modules_defaults_data,
)
if __name__ == '__main__':
cli_main()
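# Illustrative usage sketch (hypothetical paths; the console-script name
# comes from the setup.py entry points in this changeset): collect all
# modulemd and modulemd-defaults documents from repos under /var/repos:
#
#   pungi-gather-modules -rp /var/repos -t /tmp/gathered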

View File

@ -0,0 +1,96 @@
import re
from argparse import ArgumentParser
import os
from glob import iglob
from typing import List
from pathlib import Path
from dataclasses import dataclass
from productmd.common import parse_nvra
@dataclass
class Package:
nvra: dict
path: Path
def search_rpms(top_dir: Path) -> List[Package]:
"""
Search for all *.rpm files recursively
in the given top directory
Returns:
list: list of Package objects
"""
return [Package(
nvra=parse_nvra(Path(path).stem),
path=Path(path),
) for path in iglob(str(top_dir.joinpath('**/*.rpm')), recursive=True)]
def is_excluded_package(
package: Package,
excluded_packages: List[str],
) -> bool:
package_key = f'{package.nvra["name"]}.{package.nvra["arch"]}'
return any(
re.search(
f'^{excluded_pkg}$',
package_key,
) or excluded_pkg in (package.nvra['name'], package_key)
for excluded_pkg in excluded_packages
)
def copy_rpms(
packages: List[Package],
target_top_dir: Path,
excluded_packages: List[str],
):
"""
Search synced repos for rpms and prepare
a koji-like structure for pungi.
Instead of repos, the following structure is used:
# ls /mnt/koji/
i686/ noarch/ x86_64/
Returns:
None
"""
for package in packages:
if is_excluded_package(package, excluded_packages):
continue
target_arch_dir = target_top_dir.joinpath(package.nvra['arch'])
target_file = target_arch_dir.joinpath(package.path.name)
os.makedirs(target_arch_dir, exist_ok=True)
if not target_file.exists():
try:
os.link(package.path, target_file)
except OSError:
# hardlink failed, try symlinking
target_file.symlink_to(package.path)
def cli_main():
parser = ArgumentParser()
parser.add_argument('-p', '--path', required=True, type=Path)
parser.add_argument('-t', '--target', required=True, type=Path)
parser.add_argument(
'-e',
'--excluded-packages',
required=False,
nargs='+',
type=str,
default=[],
)
namespace = parser.parse_args()
rpms = search_rpms(namespace.path)
copy_rpms(rpms, namespace.target, namespace.excluded_packages)
if __name__ == '__main__':
cli_main()
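# Illustrative usage sketch (hypothetical paths): hardlink every rpm found
# under /mnt/synced_repos into a koji-like per-arch tree, skipping
# debuginfo packages:
#
#   pungi-gather-rpms -p /mnt/synced_repos -t /mnt/koji -e '.*-debuginfo.*'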

View File

@ -252,9 +252,15 @@ def main():
kobo.log.add_stderr_logger(logger)
conf = util.load_config(opts.config)
compose_type = opts.compose_type or conf.get("compose_type", "production")
if compose_type == "production" and not opts.label and not opts.no_label:
label = opts.label or conf.get("label")
if label:
try:
productmd.composeinfo.verify_label(label)
except ValueError as ex:
abort(str(ex))
if compose_type == "production" and not label and not opts.no_label:
abort("must specify label for a production compose")
if (
@ -304,7 +310,7 @@ def main():
opts.target_dir,
conf,
compose_type=compose_type,
compose_label=opts.label,
compose_label=label,
parent_compose_ids=opts.parent_compose_id,
respin_of=opts.respin_of,
)
@ -315,7 +321,7 @@ def main():
ci = Compose.get_compose_info(
conf,
compose_type=compose_type,
compose_label=opts.label,
compose_label=label,
parent_compose_ids=opts.parent_compose_id,
respin_of=opts.respin_of,
)

View File

@ -306,6 +306,8 @@ class CompsWrapper(object):
append_common_info(doc, group_node, group, force_description=True)
append_bool(doc, group_node, "default", group.default)
append_bool(doc, group_node, "uservisible", group.uservisible)
if group.display_order is not None:
append(doc, group_node, "display_order", str(group.display_order))
if group.lang_only:
append(doc, group_node, "langonly", group.lang_only)

View File

@ -88,5 +88,12 @@ def parse_output(output):
packages.add((name, arch, frozenset(flags)))
else:
name, arch = nevra.rsplit(".", 1)
modules.add(name.split(":", 1)[1])
# replace dashes with underscores in the stream of the module's nevra
# the source of the name looks like
# module:llvm-toolset:rhel8:8040020210411062713:9f9e2e7e.x86_64
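# e.g. (illustrative) for the nevra above the stored value would be
# "llvm-toolset:rhel8:8040020210411062713:9f9e2e7e"; a stream like
# "rhel-8" would be stored as "rhel_8".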
name = ':'.join(
item.replace('-', '_') if i == 1 else item for
i, item in enumerate(name.split(':')[1:])
)
modules.add(name)
return packages, modules

pungi/wrappers/kojimock.py (new file, 299 lines)
View File

@ -0,0 +1,299 @@
import os
import time
from pathlib import Path
from attr import dataclass
from kobo.rpmlib import parse_nvra
from pungi.module_util import Modulemd
# just a random value which we don't
# use in the mock currently;
# originally builds are filtered by this value
# to get a consistent snapshot of tags and packages
from pungi.scripts.gather_rpms import search_rpms
LAST_EVENT_ID = 999999
# last event time is not important, but build
# time should be less than it
LAST_EVENT_TIME = time.time()
BUILD_TIME = 0
# virtual build that collects all
# packages built for some arch
RELEASE_BUILD_ID = 15270
# tag that should have all packages available
ALL_PACKAGES_TAG = 'dist-c8-compose'
# tag that should have all modules available
ALL_MODULES_TAG = 'dist-c8-module-compose'
@dataclass
class Module:
build_id: int
name: str
nvr: str
stream: str
version: str
context: str
arch: str
class KojiMock:
"""
Class that acts like real koji (for some needed methods)
but uses local storage as data source
"""
def __init__(self, packages_dir, modules_dir, all_arches):
self._modules = self._gather_modules(modules_dir)
self._modules_dir = modules_dir
self._packages_dir = packages_dir
self._all_arches = all_arches
@staticmethod
def _gather_modules(modules_dir):
modules = {}
for index, (f, arch) in enumerate(
(sub_path.name, sub_path.parent.name)
for path in Path(modules_dir).glob('*')
for sub_path in path.iterdir()
):
parsed = parse_nvra(f)
modules[index] = Module(
name=parsed['name'],
nvr=f,
version=parsed['release'],
context=parsed['arch'],
stream=parsed['version'],
build_id=index,
arch=arch,
)
return modules
@staticmethod
def getLastEvent(*args, **kwargs):
return {'id': LAST_EVENT_ID, 'ts': LAST_EVENT_TIME}
def listTagged(self, tag_name, *args, **kwargs):
"""
Returns a list of virtual 'builds' that contain packages with a given tag.
There are two kinds of tags: modular and distributive.
For now, only one kind, the modular one, is supported.
"""
if tag_name != ALL_MODULES_TAG:
raise ValueError("I don't know what tag is %s" % tag_name)
builds = []
for module in self._modules.values():
builds.append({
'build_id': module.build_id,
'owner_name': 'centos',
'package_name': module.name,
'nvr': module.nvr,
'version': module.stream,
'release': '%s.%s' % (module.version, module.context),
'name': module.name,
'id': module.build_id,
'tag_name': tag_name,
'arch': module.arch,
# Following fields are currently not
# used but returned by real koji
# left them here just for reference
#
# 'task_id': None,
# 'state': 1,
# 'start_time': '2020-12-23 16:43:59',
# 'creation_event_id': 309485,
# 'creation_time': '2020-12-23 17:05:33.553748',
# 'epoch': None, 'tag_id': 533,
# 'completion_time': '2020-12-23 17:05:23',
# 'volume_id': 0,
# 'package_id': 3221,
# 'owner_id': 11,
# 'volume_name': 'DEFAULT',
})
return builds
@staticmethod
def getFullInheritance(*args, **kwargs):
"""
Unneeded because we use local storage.
"""
return []
def getBuild(self, build_id, *args, **kwargs):
"""
Used to get information about a build
(used in pungi only for modules currently)
"""
module = self._modules[build_id]
result = {
'id': build_id,
'name': module.name,
'version': module.stream,
'release': '%s.%s' % (module.version, module.context),
'completion_ts': BUILD_TIME,
'state': 'COMPLETE',
'arch': module.arch,
'extra': {
'typeinfo': {
'module': {
'stream': module.stream,
'version': module.version,
'name': module.name,
'context': module.context,
'content_koji_tag': '-'.join([
module.name,
module.stream,
module.version
]) + '.' + module.context
}
}
}
}
return result
def listArchives(self, build_id, *args, **kwargs):
"""
Originally lists artifacts for a build, but in pungi it is used
only to get the list of modulemd files for a module
"""
module = self._modules[build_id]
return [
{
'build_id': module.build_id,
'filename': f'modulemd.{module.arch}.txt',
'btype': 'module'
},
# no one ever uses this file,
# but it must be present because pungi ignores builds
# with len(files) <= 1
{
'build_id': module.build_id,
'filename': 'modulemd.txt',
'btype': 'module'
}
]
def listTaggedRPMS(self, tag_name, *args, **kwargs):
"""
Get information about packages that are tagged by tag.
There are two kinds of tags: per-module and per-distr.
"""
if tag_name == ALL_PACKAGES_TAG:
builds, packages = self._get_release_packages()
else:
builds, packages = self._get_module_packages(tag_name)
return [
packages,
builds
]
def _get_release_packages(self):
"""
Search the packages dir and keep only
packages that are non-modular.
This matches the way real koji works:
- modular packages are tagged by module-* tag
- all other packages are tagged with dist* tag
"""
packages = []
# get all rpms in folder
rpms = search_rpms(Path(self._packages_dir))
for rpm in rpms:
info = parse_nvra(rpm.path.stem)
if 'module' in info['release']:
continue
packages.append({
"build_id": RELEASE_BUILD_ID,
"name": info['name'],
"extra": None,
"arch": info['arch'],
"epoch": info['epoch'] or None,
"version": info['version'],
"metadata_only": False,
"release": info['release'],
# not used currently
# "id": 262555,
# "size": 0
})
builds = []
return builds, packages
def _get_module_packages(self, tag_name):
"""
Get list of builds for module and given module tag name.
"""
builds = []
packages = []
modules = self._get_modules_by_name(tag_name)
for module in modules:
if module is None:
raise ValueError('Module %s is not found' % tag_name)
path = os.path.join(
self._modules_dir,
module.arch,
tag_name,
)
builds.append({
"build_id": module.build_id,
"package_name": module.name,
"nvr": module.nvr,
"tag_name": module.nvr,
"version": module.stream,
"release": module.version,
"id": module.build_id,
"name": module.name,
"volume_name": "DEFAULT",
# Following fields are currently not
# used but returned by real koji
# left them here just for reference
#
# "owner_name": "mbox-mbs-backend",
# "task_id": 195937,
# "state": 1,
# "start_time": "2020-12-22 19:20:12.504578",
# "creation_event_id": 306731,
# "creation_time": "2020-12-22 19:20:12.504578",
# "epoch": None,
# "tag_id": 1192,
# "completion_time": "2020-12-22 19:34:34.716615",
# "volume_id": 0,
# "package_id": 104,
# "owner_id": 6,
})
if os.path.exists(path):
info = Modulemd.ModuleStream.read_string(open(path).read(), strict=True)
for art in info.get_rpm_artifacts():
data = parse_nvra(art)
packages.append({
"build_id": module.build_id,
"name": data['name'],
"extra": None,
"arch": data['arch'],
"epoch": data['epoch'] or None,
"version": data['version'],
"metadata_only": False,
"release": data['release'],
"id": 262555,
"size": 0
})
else:
raise RuntimeError('Unable to find module %s' % path)
return builds, packages
def _get_modules_by_name(self, tag_name):
modules = []
for arch in self._all_arches:
for module in self._modules.values():
if module.nvr != tag_name or module.arch != arch:
continue
modules.append(module)
return modules
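# Illustrative layout sketch (hypothetical paths): KojiMock reads from a
# local tree where packages_dir holds per-arch rpm dirs (see gather_rpms)
# and modules_dir holds per-arch modulemd files (see gather_modules), e.g.
#
#   /mnt/koji/i686/ /mnt/koji/noarch/ /mnt/koji/x86_64/
#   /mnt/koji/modules/x86_64/module-master-20190318.abcdef
#
#   KojiMock(
#       packages_dir='/mnt/koji',
#       modules_dir='/mnt/koji/modules',
#       all_arches=['i686', 'x86_64'],
#   )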

View File

@ -32,6 +32,7 @@ import six.moves.xmlrpc_client as xmlrpclib
from flufl.lock import Lock
from datetime import timedelta
from .kojimock import KojiMock
from .. import util
from ..arch_utils import getBaseArch
@ -869,6 +870,45 @@ class KojiWrapper(object):
pass
class KojiMockWrapper(object):
lock = threading.Lock()
def __init__(self, compose, all_arches):
self.all_arches = all_arches
self.compose = compose
try:
self.profile = self.compose.conf["koji_profile"]
except KeyError:
raise RuntimeError("Koji profile must be configured")
with self.lock:
self.koji_module = koji.get_profile_module(self.profile)
session_opts = {}
for key in (
"timeout",
"keepalive",
"max_retries",
"retry_interval",
"anon_retry",
"offline_retry",
"offline_retry_interval",
"debug",
"debug_xmlrpc",
"serverca",
"use_fast_upload",
):
value = getattr(self.koji_module.config, key, None)
if value is not None:
session_opts[key] = value
self.koji_proxy = KojiMock(
packages_dir=self.koji_module.config.topdir,
modules_dir=os.path.join(
self.koji_module.config.topdir,
'modules',
),
all_arches=self.all_arches,
)
def get_buildroot_rpms(compose, task_id):
"""Get build root RPMs - either from runroot or local"""
result = []

View File

@ -20,7 +20,7 @@ packages = sorted(packages)
setup(
name="pungi",
version="4.5.1",
version="4.5.0",
description="Distribution compose tool",
url="https://pagure.io/pungi",
author="Dennis Gilmore",
@ -42,6 +42,10 @@ setup(
"pungi-config-dump = pungi.scripts.config_dump:cli_main",
"pungi-config-validate = pungi.scripts.config_validate:cli_main",
"pungi-cache-cleanup = pungi.scripts.cache_cleanup:main",
"pungi-gather-modules = pungi.scripts.gather_modules:cli_main",
"pungi-gather-rpms = pungi.scripts.gather_rpms:cli_main",
"pungi-generate-packages-json = pungi.scripts.create_packages_json:cli_main", # noqa: E501
"pungi-create-extra-repo = pungi.scripts.create_extra_repo:cli_main"
]
},
scripts=["contrib/yum-dnf-compare/pungi-compare-depsolving"],
@ -62,5 +66,5 @@ setup(
"dogpile.cache",
],
extras_require={':python_version=="2.7"': ["enum34", "lockfile"]},
tests_require=["mock", "pytest", "pytest-cov"],
tests_require=["mock", "pytest", "pytest-cov", "pyfakefs"],
)

View File

@ -0,0 +1,36 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1612479076</revision>
<data type="primary">
<checksum type="sha256">08941fae6bdb14f3b22bfad38b9d7dcb685a9df58fe8f515a3a0b2fe1af903bb</checksum>
<open-checksum type="sha256">2a15e618f049a883d360ccbf3e764b30640255f47dc526c633b1722fe23cbcbc</open-checksum>
<location href="repodata/08941fae6bdb14f3b22bfad38b9d7dcb685a9df58fe8f515a3a0b2fe1af903bb-primary.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>1240</size>
<open-size>3888</open-size>
</data>
<data type="filelists">
<checksum type="sha256">e37a0b4a63b2b245dca1727195300cd3961f80aebc82ae7b9849dbf7482f5d0f</checksum>
<open-checksum type="sha256">b1782bc4207a5b7c3e64115d5a1d001802e8d363f022ea165df7cdab6f14651c</open-checksum>
<location href="repodata/e37a0b4a63b2b245dca1727195300cd3961f80aebc82ae7b9849dbf7482f5d0f-filelists.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>439</size>
<open-size>1295</open-size>
</data>
<data type="other">
<checksum type="sha256">92992176bce71dcde9e4b6ad1442e7b5c7f3de9b7f019a2cd27d042ab38ea2b1</checksum>
<open-checksum type="sha256">3b847919691ad32279b13463de6c08f1f8b32f51e87b7d8d7e95a3ec2f46ef51</open-checksum>
<location href="repodata/92992176bce71dcde9e4b6ad1442e7b5c7f3de9b7f019a2cd27d042ab38ea2b1-other.xml.gz"/>
<timestamp>1612479075</timestamp>
<size>630</size>
<open-size>1911</open-size>
</data>
<data type="modules">
<checksum type="sha256">e7a671401f8e207e4cd3b90b4ac92d621f84a34dc9026f57c3f427fbed444c57</checksum>
<open-checksum type="sha256">d59fee86c18018cc18bb7325aa74aa0abf923c64d29a4ec45e08dcd01a0c3966</open-checksum>
<location href="repodata/e7a671401f8e207e4cd3b90b4ac92d621f84a34dc9026f57c3f427fbed444c57-modules.yaml.gz"/>
<timestamp>1612479075</timestamp>
<size>920</size>
<open-size>3308</open-size>
</data>
</repomd>

View File

@ -0,0 +1,55 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1666177486</revision>
<data type="primary">
<checksum type="sha256">89cb9cc1181635c9147864a7076d91fb81072641d481cd202832a2d257453576</checksum>
<open-checksum type="sha256">07255d9856f7531b52a6459f6fc7701c6d93c6d6c29d1382d83afcc53f13494a</open-checksum>
<location href="repodata/89cb9cc1181635c9147864a7076d91fb81072641d481cd202832a2d257453576-primary.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>1387</size>
<open-size>6528</open-size>
</data>
<data type="filelists">
<checksum type="sha256">f69ca03957574729fd5150335b0d87afddcfb37a97aed5b06272212854f1773d</checksum>
<open-checksum type="sha256">c2e1e674d7d48bccaa16cae0a5f70cb55ef4cd7352b4d9d4fdaa619075d07dbc</open-checksum>
<location href="repodata/f69ca03957574729fd5150335b0d87afddcfb37a97aed5b06272212854f1773d-filelists.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>1252</size>
<open-size>5594</open-size>
</data>
<data type="other">
<checksum type="sha256">b3827bd6c9ea67ffa3912002515c64e4d9fe5c4dacbf7c46b0d8768b7abbb84f</checksum>
<open-checksum type="sha256">9ce24c526239e349d023c577b2ae3872c8b0f1888aed1fb24b9b9aa12063fdf3</open-checksum>
<location href="repodata/b3827bd6c9ea67ffa3912002515c64e4d9fe5c4dacbf7c46b0d8768b7abbb84f-other.xml.gz"/>
<timestamp>1666177486</timestamp>
<size>999</size>
<open-size>6320</open-size>
</data>
<data type="primary_db">
<checksum type="sha256">ab8df35061dfa0285069b843f24a7076e31266d9a8abe8282340bcb936aa61d7</checksum>
<open-checksum type="sha256">2bce9554ce4496cef34b5cd69f186f7f3143c7cabae8fa384fc5c9eeab326f7f</open-checksum>
<location href="repodata/ab8df35061dfa0285069b843f24a7076e31266d9a8abe8282340bcb936aa61d7-primary.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>3558</size>
<open-size>106496</open-size>
<database_version>10</database_version>
</data>
<data type="filelists_db">
<checksum type="sha256">8bcf6d40db4e922934ac47e8ac7fb8d15bdacf579af8c819d2134ed54d30550b</checksum>
<open-checksum type="sha256">f7001d1df7f5f7e4898919b15710bea8ed9711ce42faf68e22b757e63169b1fb</open-checksum>
<location href="repodata/8bcf6d40db4e922934ac47e8ac7fb8d15bdacf579af8c819d2134ed54d30550b-filelists.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>2360</size>
<open-size>28672</open-size>
<database_version>10</database_version>
</data>
<data type="other_db">
<checksum type="sha256">01b82e9eb7ee9151f283c6e761ae450de18ed2d64b5e32de88689eaf95216a80</checksum>
<open-checksum type="sha256">07f5b9750af1e440d37ca216e719dd288149e79e9132f2fdccb6f73b2e5dd541</open-checksum>
<location href="repodata/01b82e9eb7ee9151f283c6e761ae450de18ed2d64b5e32de88689eaf95216a80-other.sqlite.bz2"/>
<timestamp>1666177486</timestamp>
<size>2196</size>
<open-size>32768</open-size>
<database_version>10</database_version>
</data>
</repomd>

View File

@ -0,0 +1,55 @@
<?xml version="1.0" encoding="UTF-8"?>
<repomd xmlns="http://linux.duke.edu/metadata/repo" xmlns:rpm="http://linux.duke.edu/metadata/rpm">
<revision>1666177500</revision>
<data type="primary">
<checksum type="sha256">a1d342aa7cef3a2034fc3f9d6ee02d63572780bc76e61749a57e50b6b3ca9869</checksum>
<open-checksum type="sha256">a9e3eae447dd44282d7d96db5f15f049b757925397adb752f4df982176bab7e0</open-checksum>
<location href="repodata/a1d342aa7cef3a2034fc3f9d6ee02d63572780bc76e61749a57e50b6b3ca9869-primary.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>3501</size>
<open-size>37296</open-size>
</data>
<data type="filelists">
<checksum type="sha256">6778922d5853d20f213ae7702699a76f1e87e55d6bfb5e4ac6a117d904d47b3c</checksum>
<open-checksum type="sha256">e30b666d9d88a70de69a08f45e6696bcd600c45485d856bd0213395d7da7bd49</open-checksum>
<location href="repodata/6778922d5853d20f213ae7702699a76f1e87e55d6bfb5e4ac6a117d904d47b3c-filelists.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>27624</size>
<open-size>318187</open-size>
</data>
<data type="other">
<checksum type="sha256">5a60d79d8bce6a805f4fdb22fd891524359dce8ccc665c0b54e7299e79debe84</checksum>
<open-checksum type="sha256">b18138f4a3de45714e578fb1f30b7ec54fdcdaf1a22585891625b6af0894388e</open-checksum>
<location href="repodata/5a60d79d8bce6a805f4fdb22fd891524359dce8ccc665c0b54e7299e79debe84-other.xml.gz"/>
<timestamp>1666177500</timestamp>
<size>1876</size>
<open-size>28701</open-size>
</data>
<data type="primary_db">
<checksum type="sha256">c27bc2ce947173aba305041552c3c6d8db71442c1a2e5dcaf35ff750fe0469fc</checksum>
<open-checksum type="sha256">586e1af8934229925adb9e746ae5ced119859dfd97f4e3237399bb36a7d7f071</open-checksum>
<location href="repodata/c27bc2ce947173aba305041552c3c6d8db71442c1a2e5dcaf35ff750fe0469fc-primary.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>11528</size>
<open-size>126976</open-size>
<database_version>10</database_version>
</data>
<data type="filelists_db">
<checksum type="sha256">ed350865982e7a1e45b144839b56eac888e5d8f680571dd2cd06b37dc83e0fd8</checksum>
<open-checksum type="sha256">697903989d0f77de2d44a2b603e75c9b4ca23b3795eb136d175caf5666ce6459</open-checksum>
<location href="repodata/ed350865982e7a1e45b144839b56eac888e5d8f680571dd2cd06b37dc83e0fd8-filelists.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>20440</size>
<open-size>163840</open-size>
<database_version>10</database_version>
</data>
<data type="other_db">
<checksum type="sha256">35eff699131e0976429144c6f4514d21568177dc64bb4091c3ff62f76b293725</checksum>
<open-checksum type="sha256">3bd999a1bdf300df836a4607b7b75f845d8e1432e3e4e1ab6f0c7cc8a853db39</open-checksum>
<location href="repodata/35eff699131e0976429144c6f4514d21568177dc64bb4091c3ff62f76b293725-other.sqlite.bz2"/>
<timestamp>1666177500</timestamp>
<size>4471</size>
<open-size>49152</open-size>
<database_version>10</database_version>
</data>
</repomd>

View File

@ -0,0 +1,58 @@
[checksums]
images/boot.iso = sha256:fc8a4be604b6425746f12fa706116eb940f93358f036b8fbbe518b516cb6870c
[general]
; WARNING.0 = This section provides compatibility with pre-productmd treeinfos.
; WARNING.1 = Read productmd documentation for details about new format.
arch = x86_64
family = Test
name = Test 1.0
packagedir = Packages
platforms = x86_64,xen
repository = .
timestamp = 1531881582
variant = Server
variants = Client,Server
version = 1.0
[header]
type = productmd.treeinfo
version = 1.2
[images-x86_64]
boot.iso = images/boot.iso
[images-xen]
initrd = images/pxeboot/initrd.img
kernel = images/pxeboot/vmlinuz
[release]
name = Test
short = T
version = 1.0
[stage2]
mainimage = images/install.img
[tree]
arch = x86_64
build_timestamp = 1531881582
platforms = x86_64,xen
variants = Client,Server
[variant-Client]
id = Client
name = Client
packages = ../../../Client/x86_64/os/Packages
repository = ../../../Client/x86_64/os
type = variant
uid = Client
[variant-Server]
id = Server
name = Server
packages = Packages
repository = .
type = variant
uid = Server

View File

@ -0,0 +1,20 @@
---
document: modulemd
version: 2
data:
name: module
stream: master
version: 20190318
context: abcdef
arch: x86_64
summary: Dummy module
description: Dummy module
license:
module:
- Beerware
content:
- Beerware
artifacts:
rpms:
- foobar-0:1.0-1.noarch
...

View File

@ -0,0 +1,20 @@
---
document: modulemd
version: 2
data:
name: module
stream: master
version: 20190318
context: abcdef
arch: x86_64
summary: Dummy module
description: Dummy module
license:
module:
- Beerware
content:
- Beerware
artifacts:
rpms:
- foobar-0:1.0-1.noarch
...

View File

@ -0,0 +1,20 @@
---
document: modulemd
version: 2
data:
name: scratch-module
stream: master
version: 20200710
context: abcdef
arch: x86_64
summary: Dummy module
description: Dummy module
license:
module:
- Beerware
content:
- Beerware
artifacts:
rpms:
- foobar-0:1.0-1.noarch
...

View File

@ -0,0 +1,20 @@
---
document: modulemd
version: 2
data:
name: scratch-module
stream: master
version: 20200710
context: abcdef
arch: x86_64
summary: Dummy module
description: Dummy module
license:
module:
- Beerware
content:
- Beerware
artifacts:
rpms:
- foobar-0:1.0-1.noarch
...

View File

@ -7,7 +7,7 @@ import shutil
import tempfile
from collections import defaultdict
import mock
from unittest import mock
import six
from kobo.rpmlib import parse_nvr

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import unittest
from pungi.arch import (

View File

@ -1,4 +1,4 @@
import mock
from unittest import mock
try:
import unittest2 as unittest

View File

@ -1,15 +1,15 @@
# -*- coding: utf-8 -*-
try:
import unittest2 as unittest
except ImportError:
import unittest
import mock
from unittest import mock
import six
from copy import copy
from six.moves import StringIO
from ddt import ddt, data
import os
@ -2014,6 +2014,7 @@ class BuildinstallThreadTestCase(PungiTestCase):
self.assertEqual(ret, None)
@ddt
class TestSymlinkIso(PungiTestCase):
def setUp(self):
super(TestSymlinkIso, self).setUp()
@ -2029,8 +2030,13 @@ class TestSymlinkIso(PungiTestCase):
@mock.patch("pungi.phases.buildinstall.get_file_size")
@mock.patch("pungi.phases.buildinstall.iso")
@mock.patch("pungi.phases.buildinstall.run")
def test_hardlink(self, run, iso, get_file_size, get_mtime, ImageCls):
self.compose.conf = {"buildinstall_symlink": False, "disc_types": {}}
@data(['Server'], ['BaseOS'])
def test_hardlink(self, netinstall_variants, run, iso, get_file_size, get_mtime, ImageCls):
self.compose.conf = {
"buildinstall_symlink": False,
"disc_types": {},
"netinstall_variants": netinstall_variants,
}
get_file_size.return_value = 1024
get_mtime.return_value = 13579
@ -2080,9 +2086,14 @@ class TestSymlinkIso(PungiTestCase):
self.assertEqual(image.bootable, True)
self.assertEqual(image.implant_md5, iso.get_implanted_md5.return_value)
self.assertEqual(image.can_fail, False)
self.assertEqual(
self.compose.im.add.mock_calls, [mock.call("Server", "x86_64", image)]
)
if 'Server' in netinstall_variants:
self.assertEqual(
self.compose.im.add.mock_calls, [mock.call("Server", "x86_64", image)]
)
else:
self.assertEqual(
self.compose.im.add.mock_calls, []
)
@mock.patch("pungi.phases.buildinstall.Image")
@mock.patch("pungi.phases.buildinstall.get_mtime")
@ -2095,6 +2106,7 @@ class TestSymlinkIso(PungiTestCase):
self.compose.conf = {
"buildinstall_symlink": False,
"disc_types": {"boot": "netinst"},
"netinstall_variants": ['Server'],
}
get_file_size.return_value = 1024
get_mtime.return_value = 13579

View File

@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
try:
import unittest2 as unittest

View File

@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import logging
import mock
from unittest import mock
try:
import unittest2 as unittest

View File

@ -7,7 +7,7 @@ except ImportError:
import unittest
import six
import mock
from unittest import mock
from pungi import checks
from tests.helpers import load_config, PKGSET_REPOS

View File

@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
import six

View File

@ -0,0 +1,228 @@
# coding=utf-8
import os
from unittest import TestCase, mock, main
import yaml
from pungi.scripts.create_extra_repo import CreateExtraRepo, ExtraVariantInfo, RepoInfo
FOLDER_WITH_TEST_DATA = os.path.join(
os.path.dirname(
os.path.abspath(__file__)
),
'data/test_create_extra_repo/',
)
TEST_MODULE_INFO = yaml.load("""
---
document: modulemd
version: 2
data:
name: perl-App-cpanminus
stream: 1.7044
version: 8030020210126085450
context: 3a33b840
arch: x86_64
summary: Get, unpack, build and install CPAN modules
description: >
This is a CPAN client that requires zero configuration, and stands alone but it's
maintainable and extensible with plug-ins and friendly to shell scripting.
license:
module:
- MIT
content:
- (GPL+ or Artistic) and GPLv2+
- ASL 2.0
- GPL+ or Artistic
dependencies:
- buildrequires:
perl: [5.30]
platform: [el8.3.0]
requires:
perl: [5.30]
perl-YAML: []
platform: [el8]
references:
community: https://metacpan.org/release/App-cpanminus
profiles:
common:
description: App-cpanminus distribution
rpms:
- perl-App-cpanminus
api:
rpms:
- perl-App-cpanminus
filter:
rpms:
- perl-CPAN-DistnameInfo-dummy
- perl-Test-Deep
buildopts:
rpms:
macros: >
%_without_perl_CPAN_Meta_Check_enables_extra_test 1
components:
rpms:
perl-App-cpanminus:
rationale: The API.
ref: perl-App-cpanminus-1.7044-5.module+el8.2.0+4278+abcfa81a.src.rpm
buildorder: 1
arches: [i686, x86_64]
perl-CPAN-DistnameInfo:
rationale: Run-time dependency.
ref: stream-0.12-rhel-8.3.0
arches: [i686, x86_64]
perl-CPAN-Meta-Check:
rationale: Run-time dependency.
ref: perl-CPAN-Meta-Check-0.014-6.module+el8.2.0+4278+abcfa81a.src.rpm
buildorder: 1
arches: [i686, x86_64]
perl-File-pushd:
rationale: Run-time dependency.
ref: perl-File-pushd-1.014-6.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Module-CPANfile:
rationale: Run-time dependency.
ref: perl-Module-CPANfile-1.1002-7.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Parse-PMFile:
rationale: Run-time dependency.
ref: perl-Parse-PMFile-0.41-7.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-String-ShellQuote:
rationale: Run-time dependency.
ref: perl-String-ShellQuote-1.04-24.module+el8.2.0+4278+abcfa81a.src.rpm
arches: [i686, x86_64]
perl-Test-Deep:
rationale: Build-time dependency.
ref: stream-1.127-rhel-8.3.0
arches: [i686, x86_64]
artifacts:
rpms:
- perl-App-cpanminus-0:1.7044-5.module_el8.3.0+2027+c8990d1d.noarch
- perl-App-cpanminus-0:1.7044-5.module_el8.3.0+2027+c8990d1d.src
- perl-CPAN-Meta-Check-0:0.014-6.module_el8.3.0+2027+c8990d1d.noarch
- perl-CPAN-Meta-Check-0:0.014-6.module_el8.3.0+2027+c8990d1d.src
- perl-File-pushd-0:1.014-6.module_el8.3.0+2027+c8990d1d.noarch
- perl-File-pushd-0:1.014-6.module_el8.3.0+2027+c8990d1d.src
- perl-Module-CPANfile-0:1.1002-7.module_el8.3.0+2027+c8990d1d.noarch
- perl-Module-CPANfile-0:1.1002-7.module_el8.3.0+2027+c8990d1d.src
- perl-Parse-PMFile-0:0.41-7.module_el8.3.0+2027+c8990d1d.noarch
- perl-Parse-PMFile-0:0.41-7.module_el8.3.0+2027+c8990d1d.src
- perl-String-ShellQuote-0:1.04-24.module_el8.3.0+2027+c8990d1d.noarch
- perl-String-ShellQuote-0:1.04-24.module_el8.3.0+2027+c8990d1d.src
...
""", Loader=yaml.BaseLoader)
TEST_REPO_INFO = RepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo',
is_remote=False,
)
TEST_VARIANT_INFO = ExtraVariantInfo(
name='TestRepo',
arch='x86_64',
packages=[],
modules=[],
repos=[TEST_REPO_INFO]
)
BS_BUILD_INFO = {
'build_platforms': [
{
'architectures': ['non_fake_arch', 'fake_arch'],
'name': 'fake_platform'
}
]
}
class TestCreteExtraRepo(TestCase):
maxDiff = None
def test_01_get_repo_info_from_bs_repo(self):
auth_token = 'fake_auth_token'
build_id = 'fake_build_id'
arch = 'fake_arch'
packages = ['fake_package1', 'fake_package2']
modules = ['fake_module1', 'fake_module2']
request_object = mock.Mock()
request_object.raise_for_status = lambda: True
request_object.json = lambda: BS_BUILD_INFO
with mock.patch(
'pungi.scripts.create_extra_repo.requests.get',
return_value=request_object,
) as mock_request_get:
repos_info = CreateExtraRepo.get_repo_info_from_bs_repo(
auth_token=auth_token,
build_id=build_id,
arch=arch,
packages=packages,
modules=modules,
)
self.assertEqual(
[
ExtraVariantInfo(
name=f'{build_id}-fake_platform-{arch}',
arch=arch,
packages=packages,
modules=modules,
repos=[
RepoInfo(
path='https://build.cloudlinux.com/'
f'build_repos/{build_id}/fake_platform',
folder=arch,
is_remote=True,
)
]
)
],
repos_info,
)
mock_request_get.assert_called_once_with(
url=f'https://build.cloudlinux.com/api/v1/builds/{build_id}',
headers={
'Authorization': f'Bearer {auth_token}',
}
)
def test_02_create_extra_repo(self):
with mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._read_local_modules_yaml',
return_value=[],
) as mock__read_local_modules_yaml, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._download_rpm_to_local_repo',
) as mock__download_rpm_to_local_repo, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._dump_local_modules_yaml'
) as mock__dump_local_modules_yaml, mock.patch(
'pungi.scripts.create_extra_repo.'
'CreateExtraRepo._create_local_extra_repo'
) as mock__create_local_extra_repo:
cer = CreateExtraRepo(
variants=[TEST_VARIANT_INFO],
bs_auth_token='fake_auth_token',
local_repository_path='/path/to/local/repo',
clear_target_repo=False,
)
mock__read_local_modules_yaml.assert_called_once_with()
cer.create_extra_repo()
mock__download_rpm_to_local_repo.assert_called_once_with(
package_location='perl-App-cpanminus-1.7044-5.'
'module_el8.3.0+2027+c8990d1d.noarch.rpm',
repo_info=TEST_REPO_INFO,
)
mock__dump_local_modules_yaml.assert_called_once_with()
mock__create_local_extra_repo.assert_called_once_with()
self.assertEqual(
[TEST_MODULE_INFO],
cer.local_modules_data,
)
if __name__ == '__main__':
main()

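A note on the fixture above: TEST_MODULE_INFO is parsed with yaml.BaseLoader, which keeps every scalar as a plain string. That matters for modulemd data, where a stream such as 1.7044 must not be coerced into a float. A small illustrative sketch:

import yaml

# BaseLoader keeps scalars as strings; the default loaders coerce types.
doc = yaml.load("stream: 1.7044", Loader=yaml.BaseLoader)
assert doc["stream"] == "1.7044"                      # stays a string
assert yaml.safe_load("stream: 1.7044") == {"stream": 1.7044}  # becomes a float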
@@ -0,0 +1,112 @@
# coding=utf-8
import os
from collections import defaultdict
from unittest import TestCase, mock, main
from pungi.scripts.create_packages_json import (
PackagesGenerator,
RepoInfo,
VariantInfo,
)
FOLDER_WITH_TEST_DATA = os.path.join(
os.path.dirname(
os.path.abspath(__file__)
),
'data/test_create_packages_json/',
)
test_repo_info = RepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo',
is_remote=False,
is_reference=True,
)
test_repo_info_2 = RepoInfo(
path=FOLDER_WITH_TEST_DATA,
folder='test_repo_2',
is_remote=False,
is_reference=True,
)
variant_info_1 = VariantInfo(
name='TestRepo',
arch='x86_64',
repos=[test_repo_info]
)
variant_info_2 = VariantInfo(
name='TestRepo2',
arch='x86_64',
repos=[test_repo_info_2]
)
class TestPackagesJson(TestCase):
def test_01_get_remote_file_content(self):
"""
Test fetching the content of a remote file
"""
request_object = mock.Mock()
request_object.raise_for_status = lambda: True
request_object.content = b'TestContent'
with mock.patch(
'pungi.scripts.create_packages_json.requests.get',
return_value=request_object,
) as mock_requests_get, mock.patch(
'pungi.scripts.create_packages_json.tempfile.NamedTemporaryFile',
) as mock_tempfile:
mock_tempfile.return_value.__enter__.return_value.name = 'tmpfile'
packages_generator = PackagesGenerator(
variants=[],
excluded_packages=[],
included_packages=[],
)
file_name = packages_generator.get_remote_file_content(
file_url='fakeurl')
mock_requests_get.assert_called_once_with(url='fakeurl')
mock_tempfile.assert_called_once_with(delete=False)
mock_tempfile.return_value.__enter__().\
write.assert_called_once_with(b'TestContent')
self.assertEqual(
file_name,
'tmpfile',
)
def test_02_generate_additional_packages(self):
pg = PackagesGenerator(
variants=[
variant_info_1,
variant_info_2,
],
excluded_packages=['zziplib-utils'],
included_packages=['vim-file*'],
)
test_packages = defaultdict(
lambda: defaultdict(
lambda: defaultdict(
list,
)
)
)
test_packages['TestRepo']['x86_64']['zziplib'] = \
[
'zziplib.i686',
'zziplib.x86_64',
]
test_packages['TestRepo2']['x86_64']['vim'] = \
[
'vim-X11.i686',
'vim-common.i686',
'vim-enhanced.i686',
'vim-filesystem.noarch',
]
result = pg.generate_packages_json()
self.assertEqual(
test_packages,
result,
)
if __name__ == '__main__':
main()

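Both test modules above stub out the network the same way: requests.get is patched where the module under test imports it, and a prepared response object is returned. A self-contained sketch of the pattern, with hypothetical values:

from unittest import mock
import requests

# Build a canned response; no real HTTP request is made.
response = mock.Mock()
response.raise_for_status = lambda: None
response.content = b'TestContent'

with mock.patch('requests.get', return_value=response) as fake_get:
    assert requests.get(url='https://example.invalid').content == b'TestContent'
fake_get.assert_called_once_with(url='https://example.invalid')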
@@ -2,7 +2,7 @@
import logging
import mock
from unittest import mock
import six
import os

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
from parameterized import parameterized
import os

@@ -8,7 +8,7 @@ except ImportError:
import glob
import os
import mock
from unittest import mock
import six
from pungi.module_util import Modulemd

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
from productmd.extra_files import ExtraFiles

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
import logging
import mock
from typing import AnyStr, List
from unittest import mock
import six
import logging
import os
@@ -614,6 +614,7 @@ class GetExtraFilesTest(helpers.PungiTestCase):
)
@mock.patch("pungi.phases.extra_isos.tweak_repo_treeinfo")
@mock.patch("pungi.phases.extra_isos.tweak_treeinfo")
@mock.patch("pungi.wrappers.iso.write_graft_points")
@mock.patch("pungi.wrappers.iso.get_graft_points")
@@ -623,7 +624,7 @@ class GetIsoContentsTest(helpers.PungiTestCase):
self.compose = helpers.DummyCompose(self.topdir, {})
self.variant = self.compose.variants["Server"]
def test_non_bootable_binary(self, ggp, wgp, tt):
def test_non_bootable_binary(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/x86_64/os/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/x86_64/os/repodata": {
@@ -693,7 +694,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_inherit_extra_files(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def test_inherit_extra_files(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/x86_64/os/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/x86_64/os/repodata": {
@@ -767,7 +776,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_source(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def test_source(self, ggp, wgp, tt, trt):
gp = {
"compose/Client/source/tree/Packages": {"f/foo.rpm": "/mnt/f/foo.rpm"},
"compose/Client/source/tree/repodata": {
@@ -837,7 +854,15 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
def test_bootable(self, ggp, wgp, tt):
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='source/tree',
)
def test_bootable(self, ggp, wgp, tt, trt):
self.compose.conf["buildinstall_method"] = "lorax"
bi_dir = os.path.join(self.topdir, "work/x86_64/buildinstall/Server")
@@ -939,6 +964,42 @@ class GetIsoContentsTest(helpers.PungiTestCase):
],
)
# Check correct call to tweak_repo_treeinfo
self._tweak_repo_treeinfo_call_list_checker(
trt_mock=trt,
main_variant='Server',
addon_variants=['Client'],
sub_path='x86_64/os',
)
def _tweak_repo_treeinfo_call_list_checker(
self,
trt_mock: mock.Mock,
main_variant: AnyStr,
addon_variants: List[AnyStr],
sub_path: AnyStr) -> None:
"""
Check correct call to tweak_repo_treeinfo
"""
path_to_treeinfo = os.path.join(
self.topdir,
'compose',
main_variant,
sub_path,
'.treeinfo',
)
self.assertEqual(
trt_mock.call_args_list,
[
mock.call(
self.compose,
addon_variants,
path_to_treeinfo,
path_to_treeinfo,
)
]
)
class GetFilenameTest(helpers.PungiTestCase):
def test_use_original_name(self):
@@ -1016,6 +1077,15 @@ class TweakTreeinfoTest(helpers.PungiTestCase):
self.assertFilesEqual(output, expected)
def test_repo_tweak(self):
compose = helpers.DummyCompose(self.topdir, {})
input = os.path.join(helpers.FIXTURE_DIR, "extraiso.treeinfo")
output = os.path.join(self.topdir, "actual-treeinfo")
expected = os.path.join(helpers.FIXTURE_DIR, "extraiso-tweaked-expected.treeinfo")
extra_isos.tweak_repo_treeinfo(compose, ["Client"], input, output)
self.assertFilesEqual(output, expected)
class PrepareMetadataTest(helpers.PungiTestCase):
@mock.patch("pungi.metadata.create_media_repo")

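The _tweak_repo_treeinfo_call_list_checker helper added above compares the mock's whole call history against a list of mock.call objects. A toy sketch of that assertion style:

from unittest import mock

trt = mock.Mock()
trt('compose', ['Client'], '/tmp/.treeinfo', '/tmp/.treeinfo')
# call_args_list records every call; mock.call builds the expected entry.
assert trt.call_args_list == [
    mock.call('compose', ['Client'], '/tmp/.treeinfo', '/tmp/.treeinfo')
]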
@@ -153,7 +153,10 @@ class TestParseOutput(unittest.TestCase):
self.assertEqual(modules, set())
def test_extracts_modules(self):
touch(self.file, "module:mod:master:20181003:cafebeef.x86_64@repo-0\n")
touch(
self.file,
"module:mod:master-1:20181003:cafebeef.x86_64@repo-0\n"
)
packages, modules = fus.parse_output(self.file)
self.assertEqual(packages, set())
self.assertEqual(modules, set(["mod:master:20181003:cafebeef"]))
self.assertEqual(modules, set(["mod:master_1:20181003:cafebeef"]))

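For context on the fixture change above: fus emits module lines of the form module:NAME:STREAM:VERSION:CONTEXT.ARCH@repo, and parse_output reduces them to NAME:STREAM:VERSION:CONTEXT. A rough sketch of that reduction; the dash-to-underscore normalization of the stream is an assumption inferred from the expected value in the test:

line = 'module:mod:master-1:20181003:cafebeef.x86_64@repo-0'
nsvca = line[len('module:'):].split('@', 1)[0]  # mod:master-1:20181003:cafebeef.x86_64
nsvc = nsvca.rsplit('.', 1)[0]                  # strip the arch suffix
assert nsvc == 'mod:master-1:20181003:cafebeef'
# pungi's parser apparently also maps '-' in the stream to '_' (assumed),
# which yields the expected 'mod:master_1:20181003:cafebeef'.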
@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
from pungi.phases.gather.methods import method_deps as deps
from tests import helpers

@@ -2,7 +2,7 @@
from collections import namedtuple
import copy
import mock
from unittest import mock
import os
import six

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
import six

@@ -0,0 +1,124 @@
# -*- coding: utf-8 -*-
import gzip
import os
from io import StringIO
import yaml
from pungi.scripts.gather_modules import collect_modules, EMPTY_FILE
import unittest
from pyfakefs.fake_filesystem_unittest import TestCase
MARIADB_MODULE = yaml.load("""
---
document: modulemd
version: 2
data:
name: mariadb-devel
stream: 10.3-1
version: 8010020200108182321
context: cdc1202b
arch: x86_64
summary: MariaDB Module
description: >-
MariaDB is a community developed branch of MySQL.
components:
rpms:
Judy:
rationale: MariaDB dependency for OQgraph computation engine
ref: a3583b33f939e74a530f2a1dff0552dff2c8ea73
buildorder: 4
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- Judy-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
- Judy-debuginfo-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
""", Loader=yaml.BaseLoader)
JAVAPACKAGES_TOOLS_MODULE = yaml.load("""
---
document: modulemd
version: 2
data:
name: javapackages-tools
stream: 201801
version: 8000020190628172923
context: b07bea58
arch: x86_64
summary: Tools and macros for Java packaging support
description: >-
Java Packages Tools is a collection of tools that make it easier to build RPM
packages containing software running on Java platform.
components:
rpms:
ant:
rationale: "Runtime dependency of ant-contrib"
ref: 2eaf095676540e2805ee7e8c7f6f78285c428fdc
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.noarch
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.src
""", Loader=yaml.BaseLoader)
ANT_DEFAULTS = yaml.load("""
data:
module: ant
profiles:
'1.10':
- common
stream: '1.10'
document: modulemd-defaults
version: '1'
""", Loader=yaml.BaseLoader)
PATH_TO_KOJI = '/path/to/koji'
MODULES_YAML_GZ = 'modules.yaml.gz'
class TestModulesYamlParser(TestCase):
maxDiff = None
def setUp(self):
self.setUpPyfakefs()
def _prepare_test_data(self):
"""
Create modules.yaml.gz with some test data
"""
os.makedirs(PATH_TO_KOJI)
modules_gz_path = os.path.join(PATH_TO_KOJI, MODULES_YAML_GZ)
# dump modules into a compressed file, as in generic RPM repositories
io = StringIO()
yaml.dump_all([MARIADB_MODULE, JAVAPACKAGES_TOOLS_MODULE, ANT_DEFAULTS], io)
with open(os.path.join(PATH_TO_KOJI, MODULES_YAML_GZ), 'wb') as f:
f.write(gzip.compress(io.getvalue().encode()))
return modules_gz_path
def test_export_modules(self):
modules_gz_path = self._prepare_test_data()
paths = [open(modules_gz_path, 'rb')]
collect_modules(paths, PATH_TO_KOJI)
# check directory structure matches expected
self.assertEqual([MODULES_YAML_GZ, 'modules', 'module_defaults'], os.listdir(PATH_TO_KOJI))
self.assertEqual(['mariadb-devel-10.3_1-8010020200108182321.cdc1202b',
'javapackages-tools-201801-8000020190628172923.b07bea58'],
os.listdir(os.path.join(PATH_TO_KOJI, 'modules/x86_64')))
self.assertEqual([EMPTY_FILE, 'ant.yaml'],
os.listdir(os.path.join(PATH_TO_KOJI, 'module_defaults')))
# check that modules were exported
self.assertEqual(MARIADB_MODULE, yaml.safe_load(
open(os.path.join(PATH_TO_KOJI, 'modules/x86_64', 'mariadb-devel-10.3_1-8010020200108182321.cdc1202b'))))
self.assertEqual(JAVAPACKAGES_TOOLS_MODULE, yaml.safe_load(
open(os.path.join(PATH_TO_KOJI, 'modules/x86_64', 'javapackages-tools-201801-8000020190628172923.b07bea58'))))
# check that defaults were copied
self.assertEqual(ANT_DEFAULTS, yaml.safe_load(
open(os.path.join(PATH_TO_KOJI, 'module_defaults', 'ant.yaml'))))
if __name__ == '__main__':
unittest.main()

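The fixture preparation in _prepare_test_data mirrors how modules.yaml.gz ships in RPM repositories: several YAML documents dumped into one stream and gzip-compressed. A minimal round-trip sketch:

import gzip
import io
import yaml

docs = [{'document': 'modulemd'}, {'document': 'modulemd-defaults'}]
buf = io.StringIO()
yaml.dump_all(docs, buf)                      # one stream, '---' separated
blob = gzip.compress(buf.getvalue().encode())
# Decompressing and re-parsing yields the original documents.
assert list(yaml.safe_load_all(gzip.decompress(blob).decode())) == docs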
@@ -4,7 +4,7 @@ import copy
import json
import os
import mock
from unittest import mock
try:
import unittest2 as unittest

tests/test_gather_rpms.py
@@ -0,0 +1,151 @@
# -*- coding: utf-8 -*-
import os
import unittest
from pathlib import Path
from pyfakefs.fake_filesystem_unittest import TestCase
from pungi.scripts.gather_rpms import search_rpms, copy_rpms, Package
from productmd.common import parse_nvra
PATH_TO_REPOS = '/path/to/repos'
MODULES_YAML_GZ = 'modules.yaml.gz'
class TestGatherRpms(TestCase):
maxDiff = None
FILES_TO_CREATE = [
'powertools/Packages/libvirt-6.0.0-28.module_el'
'8.3.0+555+a55c8938.i686.rpm',
'powertools/Packages/libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'powertools/Packages/xalan-j2-2.7.1-38.module_el'
'8.0.0+30+832da3a1.noarch.rpm',
'appstream/Packages/bnd-maven-plugin-3.5.0-4.module_el'
'8.0.0+30+832da3a1.noarch.rpm',
'appstream/Packages/OpenEXR-devel-2.2.0-11.el8.i686.rpm',
'appstream/Packages/mingw-binutils-generic-2.30-1.el8.x86_64.rpm',
'appstream/Packages/somenonrpm',
]
def setUp(self):
self.setUpPyfakefs()
os.makedirs(PATH_TO_REPOS)
for filepath in self.FILES_TO_CREATE:
os.makedirs(
os.path.join(PATH_TO_REPOS, os.path.dirname(filepath)),
exist_ok=True,
)
open(os.path.join(PATH_TO_REPOS, filepath), 'w').close()
def test_gather_rpms(self):
self.assertEqual(
[Package(nvra=parse_nvra('libvirt-6.0.0-28.module_'
'el8.3.0+555+a55c8938.i686'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'libvirt-6.0.0-28.module_el'
f'8.3.0+555+a55c8938.i686.rpm'
)),
Package(nvra=parse_nvra('libgit2-devel-0.26.8-2.el8.x86_64'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'libgit2-devel-0.26.8-2.el8.x86_64.rpm'
)),
Package(nvra=parse_nvra('xalan-j2-2.7.1-38.module_el'
'8.0.0+30+832da3a1.noarch'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'xalan-j2-2.7.1-38.module_el'
f'8.0.0+30+832da3a1.noarch.rpm'
)),
Package(nvra=parse_nvra('bnd-maven-plugin-3.5.0-4.module_el'
'8.0.0+30+832da3a1.noarch'),
path=Path(
'/path/to/repos/appstream/Packages/'
'bnd-maven-plugin-3.5.0-4.module_el'
'8.0.0+30+832da3a1.noarch.rpm'
)),
Package(nvra=parse_nvra('OpenEXR-devel-2.2.0-11.el8.i686'),
path=Path(
f'{PATH_TO_REPOS}/appstream/Packages/'
f'OpenEXR-devel-2.2.0-11.el8.i686.rpm'
)),
Package(nvra=parse_nvra('mingw-binutils-generic-'
'2.30-1.el8.x86_64'),
path=Path(
f'{PATH_TO_REPOS}/appstream/Packages/'
f'mingw-binutils-generic-2.30-1.el8.x86_64.rpm'
))
],
search_rpms(Path(PATH_TO_REPOS))
)
def test_copy_rpms(self):
target_path = Path('/mnt/koji')
packages = [
Package(nvra=parse_nvra('libvirt-6.0.0-28.module_'
'el8.3.0+555+a55c8938.i686'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'libvirt-6.0.0-28.module_el'
f'8.3.0+555+a55c8938.i686.rpm'
)),
Package(nvra=parse_nvra('libgit2-devel-0.26.8-2.el8.x86_64'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'libgit2-devel-0.26.8-2.el8.x86_64.rpm'
)),
Package(nvra=parse_nvra('xalan-j2-2.7.1-38.module_'
'el8.0.0+30+832da3a1.noarch'),
path=Path(
f'{PATH_TO_REPOS}/powertools/Packages/'
f'xalan-j2-2.7.1-38.module_el'
f'8.0.0+30+832da3a1.noarch.rpm'
)),
Package(nvra=parse_nvra('bnd-maven-plugin-3.5.0-4.module_el'
'8.0.0+30+832da3a1.noarch'),
path=Path(
'/path/to/repos/appstream/Packages/'
'bnd-maven-plugin-3.5.0-4.module_el'
'8.0.0+30+832da3a1.noarch.rpm'
)),
Package(nvra=parse_nvra('OpenEXR-devel-2.2.0-11.el8.i686'),
path=Path(
f'{PATH_TO_REPOS}/appstream/Packages/'
f'OpenEXR-devel-2.2.0-11.el8.i686.rpm'
)),
Package(nvra=parse_nvra('mingw-binutils-generic-'
'2.30-1.el8.x86_64'),
path=Path(
f'{PATH_TO_REPOS}/appstream/Packages/'
f'mingw-binutils-generic-2.30-1.el8.x86_64.rpm'
))
]
copy_rpms(packages, target_path, [])
self.assertCountEqual([
'xalan-j2-2.7.1-38.module_el8.0.0+30+832da3a1.noarch.rpm',
'bnd-maven-plugin-3.5.0-4.module_el8.0.0+30+832da3a1.noarch.rpm'
], os.listdir(target_path / 'noarch'))
self.assertCountEqual([
'libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'mingw-binutils-generic-2.30-1.el8.x86_64.rpm'
], os.listdir(target_path / 'x86_64'))
self.assertCountEqual([
'libvirt-6.0.0-28.module_el8.3.0+555+a55c8938.i686.rpm',
'OpenEXR-devel-2.2.0-11.el8.i686.rpm'
], os.listdir(target_path / 'i686'))
self.assertCountEqual([
'i686', 'x86_64', 'noarch'
], os.listdir(target_path))
if __name__ == '__main__':
unittest.main()

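The Package fixtures above are keyed by productmd's parse_nvra, which splits a name-version-release.arch string into its fields. A sketch, assuming the dict keys productmd.common.parse_nvra returns:

from productmd.common import parse_nvra

nvra = parse_nvra('libgit2-devel-0.26.8-2.el8.x86_64')
assert nvra['name'] == 'libgit2-devel'
assert nvra['version'] == '0.26.8'
assert nvra['release'] == '2.el8'
assert nvra['arch'] == 'x86_64'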
@@ -5,7 +5,7 @@ try:
except ImportError:
import unittest
import mock
from unittest import mock
import six
from pungi.phases.gather.sources.source_module import GatherSourceModule

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import six

@@ -4,7 +4,7 @@ try:
import unittest2 as unittest
except ImportError:
import unittest
import mock
from unittest import mock
import os
import tempfile

@@ -5,7 +5,7 @@ try:
import unittest2 as unittest
except ImportError:
import unittest
import mock
from unittest import mock
import six

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import itertools
import mock
from unittest import mock
import os
import six

@@ -0,0 +1,364 @@
# -*- coding: utf-8 -*-
import os
import ddt
import unittest
from pyfakefs.fake_filesystem_unittest import TestCase
from pungi.wrappers.kojimock import KojiMock, RELEASE_BUILD_ID
PATH_TO_REPOS = '/path/to/repos'
MODULES_YAML_GZ = 'modules.yaml.gz'
@ddt.ddt
class TestLocalKojiMock(TestCase):
maxDiff = None
FILES_TO_CREATE = [
# modular package that should be excluded from global list
'powertools/Packages/ant-1.10.5-1.module_el8.0.0+30+832da3a1.noarch.rpm',
# packages that should be gathered
'powertools/Packages/libgit2-devel-0.26.8-2.el8.x86_64.rpm',
'appstream/Packages/OpenEXR-devel-2.2.0-11.el8.i686.rpm',
'appstream/Packages/mingw-binutils-generic-2.30-1.el8.x86_64.rpm',
# non-rpm
'appstream/Packages/somenonrpm',
]
MARIADB_MODULE = """
---
document: modulemd
version: 2
data:
name: mariadb-devel
stream: 10.3
version: 8010020200108182321
context: cdc1202b
arch: x86_64
summary: MariaDB Module
license:
content:
- (CDDL or GPLv2 with exceptions) and ASL 2.0
module:
- MIT
description: >-
MariaDB is a community developed branch of MySQL.
components:
rpms:
Judy:
rationale: MariaDB dependency for OQgraph computation engine
ref: a3583b33f939e74a530f2a1dff0552dff2c8ea73
buildorder: 4
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- Judy-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
- Judy-debuginfo-0:1.0.5-18.module_el8.1.0+217+4d875839.i686
"""
JAVAPACKAGES_TOOLS_MODULE = """
---
document: modulemd
version: 2
data:
name: javapackages-tools
stream: 201801
version: 8000020190628172923
context: b07bea58
arch: x86_64
summary: Tools and macros for Java packaging support
license:
content:
- (CDDL or GPLv2 with exceptions) and ASL 2.0
module:
- MIT
description: >-
Java Packages Tools is a collection of tools that make it easier to build RPM
packages containing software running on Java platform.
components:
rpms:
ant:
rationale: "Runtime dependency of ant-contrib"
ref: 2eaf095676540e2805ee7e8c7f6f78285c428fdc
arches: [aarch64, i686, ppc64le, x86_64]
artifacts:
rpms:
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.noarch
- ant-0:1.10.5-1.module_el8.0.0+30+832da3a1.src
"""
ANT_DEFAULTS = """
data:
module: ant
profiles:
'1.10':
- common
stream: '1.10'
document: modulemd-defaults
version: '1'
"""
def setUp(self):
self.setUpPyfakefs()
os.makedirs(PATH_TO_REPOS)
os.makedirs(os.path.join(PATH_TO_REPOS, 'modules/x86_64'))
with open(os.path.join(PATH_TO_REPOS, 'modules/x86_64',
'javapackages-tools-201801-8000020190628172923.b07bea58'), 'w') as f:
f.write(self.JAVAPACKAGES_TOOLS_MODULE)
with open(os.path.join(PATH_TO_REPOS, 'modules/x86_64',
'mariadb-devel-10.3-8010020200108182321.cdc1202b'), 'w') as f:
f.write(self.MARIADB_MODULE)
for filepath in self.FILES_TO_CREATE:
os.makedirs(os.path.join(PATH_TO_REPOS, os.path.dirname(filepath)), exist_ok=True)
open(os.path.join(PATH_TO_REPOS, filepath), 'w').close()
self._koji = KojiMock(
PATH_TO_REPOS,
os.path.join(PATH_TO_REPOS, 'modules'),
['x86_64', 'noarch', 'i686'],
)
@ddt.data(
[0, {
'completion_ts': 0,
'arch': 'x86_64',
'extra': {
'typeinfo': {
'module': {
'content_koji_tag': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'context': 'b07bea58',
'name': 'javapackages-tools',
'stream': '201801',
'version': '8000020190628172923'
}
}
},
'id': 0,
'name': 'javapackages-tools',
'release': '8000020190628172923.b07bea58',
'state': 'COMPLETE',
'version': '201801'
}],
[1, {
'completion_ts': 0,
'arch': 'x86_64',
'extra': {
'typeinfo': {
'module': {
'content_koji_tag': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'context': 'cdc1202b',
'name': 'mariadb-devel',
'stream': '10.3',
'version': '8010020200108182321'
}
}
},
'id': 1,
'name': 'mariadb-devel',
'release': '8010020200108182321.cdc1202b',
'state': 'COMPLETE',
'version': '10.3'
}]
)
@ddt.unpack
def test_get_build_info(self, build_id, result):
"""
Check that we can get build information from the getBuild method
"""
build_info = self._koji.getBuild(build_id)
self.assertEqual(result, build_info)
@ddt.data(
[0, [{'btype': 'module', 'build_id': 0, 'filename': 'modulemd.x86_64.txt'},
{'btype': 'module', 'build_id': 0, 'filename': 'modulemd.txt'}]],
[1, [{'btype': 'module', 'build_id': 1, 'filename': 'modulemd.x86_64.txt'},
{'btype': 'module', 'build_id': 1, 'filename': 'modulemd.txt'}]]
)
@ddt.unpack
def test_list_archives(self, build_id, result):
"""
Provide the list of archives with module descriptions.
It should always contain at least two files, so as a small hack
modulemd.txt is added as well (it exists on a real Koji),
although Pungi does not use it later.
"""
build_info = self._koji.listArchives(build_id)
self.assertEqual(result, build_info)
@ddt.data(
[
'javapackages-tools-201801-8000020190628172923.b07bea58',
[
[
{
'arch': 'noarch',
'build_id': 0,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'ant',
'release': '1.module_el8.0.0+30+832da3a1',
'size': 0,
'version': '1.10.5'
},
{
'arch': 'src',
'build_id': 0,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'ant',
'release': '1.module_el8.0.0+30+832da3a1',
'size': 0,
'version': '1.10.5'
}
],
[
{
'build_id': 0,
'id': 0,
'name': 'javapackages-tools',
'nvr': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'package_name': 'javapackages-tools',
'release': '8000020190628172923',
'tag_name': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'version': '201801',
'volume_name': 'DEFAULT'
}
]
]
],
[
'mariadb-devel-10.3-8010020200108182321.cdc1202b',
[
[
{
'arch': 'i686',
'build_id': 1,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'Judy',
'release': '18.module_el8.1.0+217+4d875839',
'size': 0,
'version': '1.0.5'
},
{
'arch': 'i686',
'build_id': 1,
'epoch': '0',
'extra': None,
'id': 262555,
'metadata_only': False,
'name': 'Judy-debuginfo',
'release': '18.module_el8.1.0+217+4d875839',
'size': 0,
'version': '1.0.5'
}
],
[
{'build_id': 1,
'id': 1,
'name': 'mariadb-devel',
'nvr': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'package_name': 'mariadb-devel',
'release': '8010020200108182321',
'tag_name': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'version': '10.3',
'volume_name': 'DEFAULT'
}
]
]
],
[
'dist-c8-compose',
[
[
{
'arch': 'x86_64',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'libgit2-devel',
'release': '2.el8',
'version': '0.26.8'
},
{
'arch': 'i686',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'OpenEXR-devel',
'release': '11.el8',
'version': '2.2.0'
},
{
'arch': 'x86_64',
'build_id': RELEASE_BUILD_ID,
'epoch': None,
'extra': None,
'metadata_only': False,
'name': 'mingw-binutils-generic',
'release': '1.el8',
'version': '2.30'
}
],
# no builds are needed in this case because pungi does not use them
[]
]
],
)
@ddt.unpack
def test_list_tagged_rpms(self, tag, result):
"""
This method is used by Pungi to get the list of RPMs,
either modular ones or those prepared for release
"""
self.assertEqual(result, self._koji.listTaggedRPMS(tag))
def test_list_tagged(self):
"""
Used only to get the list of modules for a release.
"""
result = self._koji.listTagged('dist-c8-module-compose')
self.assertEqual([
{
'arch': 'x86_64',
'build_id': 0,
'id': 0,
'name': 'javapackages-tools',
'nvr': 'javapackages-tools-201801-8000020190628172923.b07bea58',
'owner_name': 'centos',
'package_name': 'javapackages-tools',
'release': '8000020190628172923.b07bea58',
'tag_name': 'dist-c8-module-compose',
'version': '201801'
},
{
'arch': 'x86_64',
'build_id': 1,
'id': 1,
'name': 'mariadb-devel',
'nvr': 'mariadb-devel-10.3-8010020200108182321.cdc1202b',
'owner_name': 'centos',
'package_name': 'mariadb-devel',
'release': '8010020200108182321.cdc1202b',
'tag_name': 'dist-c8-module-compose',
'version': '10.3'
}], result)
if __name__ == '__main__':
unittest.main()

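TestLocalKojiMock drives each check through ddt: @ddt.data supplies one argument list per case, and @ddt.unpack spreads it over the method's parameters. A toy sketch of the same pattern:

import unittest
import ddt

@ddt.ddt
class DoubleTest(unittest.TestCase):
    # Each list becomes one test case; unpack maps it onto the arguments.
    @ddt.data([2, 4], [3, 6])
    @ddt.unpack
    def test_double(self, value, expected):
        self.assertEqual(value * 2, expected)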
@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import json
import mock
from unittest import mock
try:
import unittest2 as unittest

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import errno
import os
import stat

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import six

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os

@@ -4,7 +4,7 @@ try:
import unittest2 as unittest
except ImportError:
import unittest
import mock
from unittest import mock
from pungi import media_split

@@ -1,4 +1,4 @@
import mock
from unittest import mock
import os
import six

@@ -2,7 +2,7 @@
from datetime import datetime
import json
import mock
from unittest import mock
try:
import unittest2 as unittest

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import json
import copy

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
import shutil

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os

@@ -2,7 +2,7 @@
import json
import mock
from unittest import mock
import os

@@ -4,7 +4,7 @@
import json
import os
import mock
from unittest import mock
import six
import yaml

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
try:

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
try:
import unittest2 as unittest

@@ -2,7 +2,7 @@
import os
import mock
from unittest import mock
import six
from pungi.module_util import Modulemd
@@ -135,16 +135,16 @@ class TestMaterializedPkgsetCreate(helpers.PungiTestCase):
amm.assert_called_once()
self.assertEqual(
amm.mock_calls[0].args[1], os.path.join(self.topdir, "work/x86_64/repo/foo")
amm.mock_calls[0][1][1], os.path.join(self.topdir, "work/x86_64/repo/foo")
)
self.assertIsInstance(amm.mock_calls[0].args[2], Modulemd.ModuleIndex)
self.assertIsNotNone(amm.mock_calls[0].args[2].get_module("mod_name"))
self.assertIsInstance(amm.mock_calls[0][1][2], Modulemd.ModuleIndex)
self.assertIsNotNone(amm.mock_calls[0][1][2].get_module("mod_name"))
# Check if proper Index is used by add_modular_metadata
self.assertIsNotNone(
amm.mock_calls[0].args[2].get_module("mod_name").get_obsoletes()
amm.mock_calls[0][1][2].get_module("mod_name").get_obsoletes()
)
self.assertEqual(
amm.mock_calls[0].args[3],
amm.mock_calls[0][1][3],
os.path.join(self.topdir, "logs/x86_64/arch_repo_modulemd.foo.x86_64.log"),
)

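The change above swaps mock_calls[0].args for mock_calls[0][1]: the .args/.kwargs accessors on call objects only exist on Python 3.8+, while a call object has always been unpackable positionally as (name, args, kwargs). Sketch:

from unittest import mock

m = mock.Mock()
m('a', key='b')
# Works on all supported interpreters, unlike call.args / call.kwargs.
name, args, kwargs = m.mock_calls[0]
assert args == ('a',)
assert kwargs == {'key': 'b'}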
@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
import ddt
from unittest import mock
import os
import six
@@ -141,9 +141,25 @@ class DummySystem(object):
return self.methods
@ddt.ddt
@mock.patch("pungi.phases.pkgset.pkgsets.ReaderPool", new=FakePool)
@mock.patch("kobo.pkgset.FileCache", new=MockFileCache)
class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
@classmethod
def setUpClass(cls) -> None:
cls.patcher = mock.patch.object(
pkgsets.KojiMockPackageSet,
'_is_rpm_signed',
return_value=True,
)
cls.patcher.start()
@classmethod
def tearDownClass(cls) -> None:
cls.patcher.stop()
def setUp(self):
super(TestKojiPkgset, self).setUp()
with open(os.path.join(helpers.FIXTURE_DIR, "tagged-rpms.json")) as f:
@@ -167,7 +183,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
six.assertCountEqual(self, v1, v2)
self.assertEqual({}, actual, msg="Some architectures were missing")
def test_all_arches(self):
@ddt.data(
pkgsets.KojiMockPackageSet,
pkgsets.KojiPackageSet,
)
def test_all_arches(self, package_set):
self._touch_files(
[
"rpms/pungi@4.1.3@3.fc25@noarch",
@@ -180,7 +200,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
]
)
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset", self.koji_wrapper, [None], downloader=self.koji_downloader
)
@@ -207,7 +227,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
},
)
def test_only_one_arch(self):
@ddt.data(
pkgsets.KojiPackageSet,
pkgsets.KojiMockPackageSet,
)
def test_only_one_arch(self, package_set):
self._touch_files(
[
"rpms/bash@4.3.42@4.fc24@x86_64",
@@ -215,7 +239,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
]
)
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -325,7 +349,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
)
figure = re.compile(
r"^RPM\(s\) not found for sigs: .+Check log for details.+bash-4\.3\.42-4\.fc24.+bash-debuginfo-4\.3\.42-4\.fc24$", # noqa: E501
r"^RPM\(s\) not found for sigs: .+Check log for details.+bash-4\.3\.42-4\.fc24\.x86_64.+bash-debuginfo-4\.3\.42-4\.fc24\.x86_64$", # noqa: E501
re.DOTALL,
)
self.assertRegex(str(ctx.exception), figure)
@@ -404,7 +428,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
pkgset.raise_invalid_sigkeys_exception(pkgset.invalid_sigkey_rpms)
figure = re.compile(
r"^RPM\(s\) not found for sigs: .+Check log for details.+bash-4\.3\.42-4\.fc24.+bash-debuginfo-4\.3\.42-4\.fc24$", # noqa: E501
r"^RPM\(s\) not found for sigs: .+Check log for details.+bash-4\.3\.42-4\.fc24\.x86_64.+bash-debuginfo-4\.3\.42-4\.fc24\.x86_64$", # noqa: E501
re.DOTALL,
)
self.assertRegex(str(ctx.exception), figure)
@@ -458,7 +482,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
# Two packages making three attempts each, so two waits per package.
self.assertEqual(time.call_args_list, [mock.call(5)] * 4)
def test_packages_attribute(self):
@ddt.data(
pkgsets.KojiPackageSet,
pkgsets.KojiMockPackageSet,
)
def test_packages_attribute(self, package_set):
self._touch_files(
[
"rpms/pungi@4.1.3@3.fc25@noarch",
@@ -471,7 +499,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
]
)
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -496,8 +524,12 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
},
)
def test_get_extra_rpms_from_tasks(self):
pkgset = pkgsets.KojiPackageSet(
@ddt.data(
pkgsets.KojiPackageSet,
pkgsets.KojiMockPackageSet,
)
def test_get_extra_rpms_from_tasks(self, package_set):
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -563,7 +595,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
rpms = pkgset.get_extra_rpms_from_tasks()
self.assertEqual(rpms, expected_rpms)
def test_get_latest_rpms_cache(self):
@ddt.data(
pkgsets.KojiMockPackageSet,
pkgsets.KojiPackageSet,
)
def test_get_latest_rpms_cache(self, package_set):
self._touch_files(
[
"rpms/bash@4.3.42@4.fc24@x86_64",
@@ -572,7 +608,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
)
cache_region = make_region().configure("dogpile.cache.memory")
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -603,7 +639,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
},
)
def test_get_latest_rpms_cache_different_id(self):
@ddt.data(
pkgsets.KojiMockPackageSet,
pkgsets.KojiPackageSet,
)
def test_get_latest_rpms_cache_different_id(self, package_set):
self._touch_files(
[
"rpms/bash@4.3.42@4.fc24@x86_64",
@@ -612,7 +652,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
)
cache_region = make_region().configure("dogpile.cache.memory")
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -640,7 +680,11 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
},
)
def test_extra_builds_attribute(self):
@ddt.data(
pkgsets.KojiMockPackageSet,
pkgsets.KojiPackageSet,
)
def test_extra_builds_attribute(self, package_set):
self._touch_files(
[
"rpms/pungi@4.1.3@3.fc25@noarch",
@@ -671,7 +715,7 @@ class TestKojiPkgset(PkgsetCompareMixin, helpers.PungiTestCase):
[b for b in self.tagged_rpms[1] if b["package_name"] != "pungi"],
]
pkgset = pkgsets.KojiPackageSet(
pkgset = package_set(
"pkgset",
self.koji_wrapper,
[None],
@@ -747,6 +791,211 @@ class TestReuseKojiPkgset(helpers.PungiTestCase):
)
self.assert_not_reuse()
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_build_under_tag_changed(self, mock_old_topdir):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(side_effect=[3, 1])
self.koji_wrapper.koji_proxy.queryHistory.return_value = {"tag_listing": [{}]}
self.pkgset.try_to_reuse(self.compose, self.tag)
self.assertEqual(
self.pkgset.log_debug.mock_calls,
[
mock.call(
"Koji event doesn't match, querying changes between event 1 and 3"
),
mock.call("Builds under tag %s changed. Can't reuse." % self.tag),
],
)
self.assert_not_reuse()
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_build_under_inherited_tag_changed(self, mock_old_topdir):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(side_effect=[3, 1])
self.koji_wrapper.koji_proxy.queryHistory.side_effect = [
{"tag_listing": [], "tag_inheritance": []},
{"tag_listing": [{}], "tag_inheritance": []},
]
self.koji_wrapper.koji_proxy.getFullInheritance.return_value = [
{"name": self.inherited_tag}
]
self.pkgset.try_to_reuse(self.compose, self.tag)
self.assertEqual(
self.pkgset.log_debug.mock_calls,
[
mock.call(
"Koji event doesn't match, querying changes between event 1 and 3"
),
mock.call(
"Builds under inherited tag %s changed. Can't reuse."
% self.inherited_tag
),
],
)
self.assert_not_reuse()
@mock.patch("pungi.paths.os.path.exists", return_value=True)
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_failed_load_reuse_file(self, mock_old_topdir, mock_exists):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(side_effect=[3, 1])
self.koji_wrapper.koji_proxy.queryHistory.return_value = {
"tag_listing": [], "tag_inheritance": []
}
self.koji_wrapper.koji_proxy.getFullInheritance.return_value = []
self.pkgset.load_old_file_cache = mock.Mock(
side_effect=Exception("unknown error")
)
self.pkgset.try_to_reuse(self.compose, self.tag)
self.assertEqual(
self.pkgset.log_debug.mock_calls,
[
mock.call(
"Koji event doesn't match, querying changes between event 1 and 3"
),
mock.call(
"Loading reuse file: %s"
% os.path.join(
self.old_compose_dir,
"work/global",
"pkgset_%s_reuse.pickle" % self.tag,
)
),
mock.call("Failed to load reuse file: unknown error"),
],
)
self.assert_not_reuse()
@mock.patch("pungi.paths.os.path.exists", return_value=True)
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_criteria_not_match(self, mock_old_topdir, mock_exists):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(side_effect=[3, 1])
self.koji_wrapper.koji_proxy.queryHistory.return_value = {
"tag_listing": [], "tag_inheritance": []
}
self.koji_wrapper.koji_proxy.getFullInheritance.return_value = []
self.pkgset.load_old_file_cache = mock.Mock(
return_value={"allow_invalid_sigkeys": True}
)
self.pkgset.try_to_reuse(self.compose, self.tag)
self.assertEqual(
self.pkgset.log_debug.mock_calls,
[
mock.call(
"Koji event doesn't match, querying changes between event 1 and 3"
),
mock.call(
"Loading reuse file: %s"
% os.path.join(
self.old_compose_dir,
"work/global",
"pkgset_%s_reuse.pickle" % self.tag,
)
),
],
)
self.assertEqual(
self.pkgset.log_info.mock_calls,
[
mock.call("Trying to reuse pkgset data of old compose"),
mock.call("Criteria does not match. Nothing to reuse."),
],
)
self.assert_not_reuse()
@mock.patch("pungi.phases.pkgset.pkgsets.copy_all")
@mock.patch("pungi.paths.os.path.exists", return_value=True)
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_pkgset(self, mock_old_topdir, mock_exists, mock_copy_all):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(side_effect=[3, 1])
self.koji_wrapper.koji_proxy.queryHistory.return_value = {
"tag_listing": [], "tag_inheritance": []
}
self.koji_wrapper.koji_proxy.getFullInheritance.return_value = []
self.pkgset.load_old_file_cache = mock.Mock(
return_value={
"allow_invalid_sigkeys": self.pkgset._allow_invalid_sigkeys,
"packages": self.pkgset.packages,
"populate_only_packages": self.pkgset.populate_only_packages,
"extra_builds": self.pkgset.extra_builds,
"sigkeys": self.pkgset.sigkey_ordering,
"include_packages": None,
"rpms_by_arch": mock.Mock(),
"srpms_by_name": mock.Mock(),
"inherit_to_noarch": True,
"exclusive_noarch": True,
}
)
self.pkgset.old_file_cache = mock.Mock()
self.pkgset.try_to_reuse(self.compose, self.tag)
old_repo_dir = os.path.join(self.old_compose_dir, "work/global/repo", self.tag)
self.assertEqual(
self.pkgset.log_info.mock_calls,
[
mock.call("Trying to reuse pkgset data of old compose"),
mock.call("Copying repo data for reuse: %s" % old_repo_dir),
],
)
self.assertEqual(old_repo_dir, self.pkgset.reuse)
self.assertEqual(self.pkgset.file_cache, self.pkgset.old_file_cache)
class TestReuseKojiMockPkgset(helpers.PungiTestCase):
def setUp(self):
super(TestReuseKojiMockPkgset, self).setUp()
self.old_compose_dir = tempfile.mkdtemp()
self.old_compose = helpers.DummyCompose(self.old_compose_dir, {})
self.compose = helpers.DummyCompose(
self.topdir, {"old_composes": os.path.dirname(self.old_compose_dir)}
)
self.koji_wrapper = mock.Mock()
self.tag = "test-tag"
self.inherited_tag = "inherited-test-tag"
self.pkgset = pkgsets.KojiMockPackageSet(
self.tag, self.koji_wrapper, [None], arches=["x86_64"]
)
self.pkgset.log_debug = mock.Mock()
self.pkgset.log_info = mock.Mock()
def assert_not_reuse(self):
self.assertIsNone(getattr(self.pkgset, "reuse", None))
def test_resue_no_old_compose_found(self):
self.pkgset.try_to_reuse(self.compose, self.tag)
self.pkgset.log_info.assert_called_once_with(
"Trying to reuse pkgset data of old compose"
)
self.pkgset.log_debug.assert_called_once_with(
"No old compose found. Nothing to reuse."
)
self.assert_not_reuse()
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_read_koji_event_file_failed(self, mock_old_topdir):
mock_old_topdir.return_value = self.old_compose_dir
self.pkgset._get_koji_event_from_file = mock.Mock(
side_effect=Exception("unknown error")
)
self.pkgset.try_to_reuse(self.compose, self.tag)
self.pkgset.log_debug.assert_called_once_with(
"Can't read koji event from file: unknown error"
)
self.assert_not_reuse()
@mock.patch.object(helpers.paths.Paths, "get_old_compose_topdir")
def test_reuse_build_under_tag_changed(self, mock_old_topdir):
mock_old_topdir.return_value = self.old_compose_dir

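The new setUpClass/tearDownClass pair in TestKojiPkgset keeps one patcher alive for the whole class rather than re-patching in every test. A generic sketch of that lifecycle, with an arbitrarily chosen patch target:

import os
from unittest import TestCase, mock

class PatchedOnce(TestCase):
    @classmethod
    def setUpClass(cls):
        # Started once for the class; every test sees the patched value.
        cls.patcher = mock.patch('os.getcwd', return_value='/fake')
        cls.patcher.start()

    @classmethod
    def tearDownClass(cls):
        cls.patcher.stop()

    def test_patched(self):
        self.assertEqual(os.getcwd(), '/fake')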
@@ -1,17 +1,23 @@
# -*- coding: utf-8 -*-
import json
import mock
import os
import re
import six
from ddt import ddt, data, unpack
from typing import AnyStr, List, Set, Dict, Tuple
from tests.test_gather_method_hybrid import MockModule
try:
import unittest2 as unittest
from unittest2 import mock
except ImportError:
import unittest
from unittest import mock
from pungi.phases.pkgset.sources import source_koji
from pungi.phases.pkgset.sources import source_koji, source_kojimock
from tests import helpers
from pungi.module_util import Modulemd
from pungi.util import read_single_module_stream_from_file
@@ -826,6 +832,153 @@ class TestAddModuleToVariant(helpers.PungiTestCase):
self.assertEqual(variant.modules, [])
@mock.patch("pungi.module_util.Modulemd.ModuleStream.read_file", new=MockModule)
@unittest.skipIf(Modulemd is None, "Skipping tests, no module support")
class TestAddModuleToVariantForKojiMock(helpers.PungiTestCase):
def setUp(self):
super(TestAddModuleToVariantForKojiMock, self).setUp()
self.koji = mock.Mock()
self.koji.koji_module.pathinfo.typedir.return_value = "/koji"
self.compose = helpers.DummyCompose(self.topdir, {})
self.koji.koji_module.pathinfo.topdir = MMDS_DIR
files = [
"modulemd.x86_64.txt",
"scratch-module.x86_64.txt",
]
self.koji.koji_proxy.listArchives.return_value = [
{"btype": "module", "filename": fname} for fname in files
]
self.buildinfo = {
"id": 1234,
"arch": "x86_64",
"extra": {
"typeinfo": {
"module": {
"name": "module",
"stream": "master",
"version": "20190318",
"context": "abcdef",
'content_koji_tag': 'module:master-20190318-abcdef'
},
},
},
}
def test_adding_module(self):
variant = mock.Mock(
arches=[
"x86_64"
],
arch_mmds={},
modules=[],
)
source_kojimock._add_module_to_variant(
self.koji,
variant,
self.buildinfo,
)
mod = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
self.assertEqual(len(variant.arch_mmds), 1)
self.assertEqual(variant.modules, [])
def test_adding_module_to_existing(self):
variant = mock.Mock(
arches=[
"x86_64"
],
arch_mmds={
"x86_64": {
"m1:latest:20190101:cafe": read_single_module_stream_from_file(
os.path.join(MMDS_DIR, "m1.x86_64.txt")
)}
},
modules=[{"name": "m1:latest-20190101:cafe", "glob": False}],
)
source_koji._add_module_to_variant(
self.koji, variant, self.buildinfo, compose=self.compose
)
mod = variant.arch_mmds["x86_64"]["m1:latest:20190101:cafe"]
self.assertEqual(mod.get_NSVCA(), "m1:latest:20190101:cafe:x86_64")
self.assertEqual(
variant.modules,
[{"name": "m1:latest-20190101:cafe", "glob": False}]
)
def test_adding_module_with_add_module(self):
variant = mock.Mock(arches=[
"x86_64"
], arch_mmds={}, modules=[])
source_kojimock._add_module_to_variant(
self.koji, variant, self.buildinfo, add_to_variant_modules=True
)
mod = variant.arch_mmds["x86_64"]["module:master:20190318:abcdef"]
self.assertEqual(mod.get_NSVCA(), "module:master:20190318:abcdef:x86_64")
self.assertEqual(
variant.modules, [{"name": "module:master:20190318:abcdef", "glob": False}]
)
def test_adding_module_to_existing_with_add_module(self):
variant = mock.Mock(
arches=[
"x86_64"
],
arch_mmds={
"x86_64": {"m1:latest:20190101:cafe": read_single_module_stream_from_file(
os.path.join(MMDS_DIR, "m1.x86_64.txt")
)
}
},
modules=[{"name": "m1:latest-20190101:cafe", "glob": False}],
)
source_kojimock._add_module_to_variant(
self.koji, variant, self.buildinfo, add_to_variant_modules=True
)
mod = variant.arch_mmds["x86_64"]["m1:latest:20190101:cafe"]
self.assertEqual(mod.get_NSVCA(), "m1:latest:20190101:cafe:x86_64")
self.assertEqual(
variant.modules,
[
{"name": "m1:latest-20190101:cafe", "glob": False},
{"name": "module:master:20190318:abcdef", "glob": False},
],
)
def test_adding_module_but_filtered(self):
compose = helpers.DummyCompose(
self.topdir, {"filter_modules": [(".*", {"*": ["module:*"]})]}
)
variant = mock.Mock(
arches=[
"x86_64"
], arch_mmds={}, modules=[], uid="Variant"
)
nsvc = source_kojimock._add_module_to_variant(
self.koji,
variant,
self.buildinfo,
add_to_variant_modules=True,
compose=compose,
)
self.assertIsNone(nsvc)
self.assertEqual(variant.arch_mmds, {})
self.assertEqual(variant.modules, [])
class TestIsModuleFiltered(helpers.PungiTestCase):
def assertIsFiltered(self, name, stream):
self.assertTrue(
@@ -888,7 +1041,10 @@ class TestAddScratchModuleToVariant(helpers.PungiTestCase):
def test_adding_scratch_module(self):
variant = mock.Mock(
arches=["armhfp", "x86_64"],
arches=[
# "armhfp",
"x86_64"
],
arch_mmds={},
modules=[],
module_uid_to_koji_tag={},
@@ -925,3 +1081,124 @@ class TestAddScratchModuleToVariant(helpers.PungiTestCase):
self.compose.log_warning.assert_called_once_with(
"Only test composes could include scratch module builds"
)
@ddt
class TestSourceKoji(unittest.TestCase):
@unpack
@data(
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'aarch64': [
'python39-devel:3.9',
]
}
)
],
{
'python39-devel:3.9',
}
),
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
'python38-devel:3.8',
'python38:3.8',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'*': [
'python38-devel:3.8',
]
}
)
],
{
'python39-devel:3.9',
'python38:3.8',
}
),
(
'AppStream',
[
'x86_64',
'i386'
],
{
'python39-devel:3.9',
'python39:3.9',
'python38-devel:3.8',
'python38:3.8',
},
[
(
'^(BaseOS|AppStream|PowerTools)$',
{
'x86_64': [
'python39:3.9',
],
'aarch64': [
'python38-devel:3.8',
]
}
),
(
'*',
{
'*': [
'python38-devel:3.8',
]
}
),
],
{
'python39-devel:3.9',
'python38:3.8',
}
),
)
def test__filter_expected_modules(
self,
variant_name: AnyStr,
variant_arches: List[AnyStr],
expected_modules: Set[AnyStr],
filtered_modules: List[Tuple[AnyStr, Dict[AnyStr, List[AnyStr]]]],
expected_result: Set[AnyStr],
) -> None:
real_result = source_kojimock._filter_expected_modules(
variant_name=variant_name,
variant_arches=variant_arches,
expected_modules=expected_modules,
filtered_modules=filtered_modules,
)
self.assertSetEqual(
real_result,
expected_result,
)

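TestAddModuleToVariantForKojiMock above patches ModuleStream.read_file with new=MockModule, i.e. a concrete replacement instead of an auto-created MagicMock; with new= the decorator does not pass an extra mock argument to the test. A sketch with an arbitrary target:

import os
from unittest import mock

def fake_getcwd():
    return '/fake'

# new= installs the given object directly; no mock argument is injected.
@mock.patch('os.getcwd', new=fake_getcwd)
def run_check():
    assert os.getcwd() == '/fake'

run_check()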
@@ -6,7 +6,7 @@ try:
except ImportError:
import unittest
import mock
from unittest import mock
import six
import pungi.phases.repoclosure as repoclosure_phase

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
from pungi.runroot import Runroot

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
try:
import unittest2 as unittest

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
import pungi.phases.test as test_phase

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
import mock
from unittest import mock
import os
import shutil
import six

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
import argparse
import mock
from unittest import mock
import os
try: