unify repo and repo_from options
The config options 'repo' and 'repo_from' are used in several phases; merge them into a single 'repo' option. 'append' in the schema is used to append the values from the deprecated options to 'repo', so the change does not break existing config files that still use the old 'repo_from' and 'source_repo_from' (an alias of 'repo_from') options. The 'repo' schema is also updated to accept a repo dict as the value or as an item in the value list. A repo dict is simply a dict of repo options in which 'baseurl' is required, for example: {"baseurl": "http://example.com/url/to/repo"} or: {"baseurl": "Server"}. Currently this is used in the ostree phase to support extra repo options, for example: {"baseurl": "Server", "exclude": "systemd-container"}.

Signed-off-by: Qixiang Wan <qwan@redhat.com>
parent 0ee2189d9c
commit 2f5d6d7dcd
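For orientation, here is a condensed before/after sketch of what this unification looks like in an ostree config (condensed from the example config in the documentation changes below; the URLs and variant names are illustrative):

    # Before this commit: the source variant and extra repos were split across options.
    ostree = [
        ("^Atomic$", {
            "x86_64": {
                "treefile": "fedora-atomic-docker-host.json",
                "config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
                "repo_from": "Server",                     # deprecated
                "ostree_repo": "/mnt/koji/compose/atomic/Rawhide/",
            }
        })
    ]

    # After this commit: one 'repo' option that mixes variant UIDs, plain URLs
    # and repo dicts ('baseurl' is required in each dict).
    ostree = [
        ("^Atomic$", {
            "x86_64": {
                "treefile": "fedora-atomic-docker-host.json",
                "config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
                "repo": [
                    "Server",
                    "http://example.com/repo/x86_64/os",
                    {"baseurl": "Everything"},
                    {"baseurl": "http://example.com/linux/repo", "exclude": "systemd-container"},
                ],
                "ostree_repo": "/mnt/koji/compose/atomic/Rawhide/",
            }
        })
    ]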
@@ -941,8 +941,7 @@ Live Images Settings
* ``ksurl`` (*str*) [optional] -- where to get the kickstart from
* ``name`` (*str*)
* ``version`` (*str*)
* ``repo`` (*list*) -- external repos specified by URL
* ``repo_from`` (*list*) -- repos from other variants
* ``repo`` (*str|[str]*) -- repos specified by URL or variant UID
* ``specfile`` (*str*) -- for images wrapped in RPM
* ``scratch`` (*bool*) -- only RPM-wrapped images can use scratch builds,
but by default this is turned off
@@ -953,7 +952,8 @@ Live Images Settings

Deprecated options:

* ``additional_repos`` (*list*) -- deprecated, use ``repo`` instead
* ``additional_repos`` -- deprecated, use ``repo`` instead
* ``repo_from`` -- deprecated, use ``repo`` instead

**live_images_no_rename**
(*bool*) -- When set to ``True``, filenames generated by Koji will be used.
@@ -986,11 +986,14 @@ Live Media Settings
for automatically generating one. See :ref:`common options
<auto_release>` for details.
* ``skip_tag`` (*bool*)
* ``repo`` (*[str]*) -- external repo
* ``repo_from`` (*[str]*) -- list of variants to take extra repos from
* ``repo`` (*str|[str]*) -- repos specified by URL or variant UID
* ``title`` (*str*)
* ``install_tree_from`` (*str*) -- variant to take install tree from

Deprecated options:

* ``repo_from`` -- deprecated, use ``repo`` instead

If many of your media use the same value for one of ``ksurl``, ``release``,
``target`` or ``version``, consider using these options to set the value in one
place and have all media inherit it.
@@ -1035,9 +1038,6 @@ Image Build Settings
If you explicitly set ``release`` to ``None``, it will be replaced with
a value generated as described in :ref:`common options <auto_release>`.

You can also add extra variants to get repos from with key ``repo_from``.
The value should be a list of variant names.

Please don't set ``install_tree``. This gets automatically set by *pungi*
based on current variant. You can use ``install_tree_from`` key to use
install tree from another variant.
@@ -1114,7 +1114,7 @@ Example

# Use install tree and repo from Everything variant.
'install_tree_from': 'Everything',
'repo_from': ['Everything'],
'repo': ['Everything'],

# Set release automatically.
'release': None,
@@ -1141,21 +1141,12 @@ a new commit.

* ``treefile`` -- (*str*) Filename of configuration for ``rpm-ostree``.
* ``config_url`` -- (*str*) URL for Git repository with the ``treefile``.
* ``repo_from`` -- (*str*) Name of variant serving as source repository.
* ``repo`` -- (*str|dict|[str|dict]*) repos specified by URL or variant UID
or a dict of repo options, ``baseurl`` is required in the dict.
* ``ostree_repo`` -- (*str*) Where to put the ostree repository

These keys are optional:

* ``repo`` -- (*[dict]*) Extra source repos to get packages
while composing the OSTree repository. Each dict represents a yum repo.
The allowed keys are:

* ``name`` (required)
* ``baseurl`` (required) -- URL of external repo or variant UID, in the case
of variant UID, url to variant repo will be built automatically.
* ``gpgcheck`` (optional)
* ``exclude`` (optional)

* ``keep_original_sources`` -- (*bool*) Keep the existing source repos in
the tree config file. If not enabled, all the original source repos will
be removed from the tree config file.
@@ -1171,8 +1162,10 @@ a new commit.

Deprecated options:

* ``source_repo_from`` -- (*str*) Deprecated, use ``repo_from`` instead.
* ``extra_source_repos`` -- (*[dict]*) Deprecated, use ``repo`` instead.
* ``repo_from`` -- Deprecated, use ``repo`` instead.
* ``source_repo_from`` -- Deprecated, use ``repo`` instead.
* ``extra_source_repos`` -- Deprecated, use ``repo`` instead.


Example config
@@ -1184,18 +1177,11 @@ Example config
"x86_64": {
"treefile": "fedora-atomic-docker-host.json",
"config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
"repo_from": "Server",
"repo": [
{
"name": "repo_a",
"baseurl": "http://example.com/repo/x86_64/os",
"exclude": "systemd-container",
"gpgcheck": False
},
{
"name": "Everything",
"baseurl": "Everything",
}
"Server",
"http://example.com/repo/x86_64/os",
{"baseurl": "Everything"},
{"baseurl": "http://example.com/linux/repo", "exclude": "systemd-container"},
],
"keep_original_sources": True,
"ostree_repo": "/mnt/koji/compose/atomic/Rawhide/",
@@ -1218,12 +1204,9 @@ an OSTree repository. This always runs in Koji as a ``runroot`` task.

The configuration dict for each variant arch pair must have this key:

* ``repo_from`` -- (*str|[str]*) Name of variant or a name list of
variants serving as source repositories.

These keys are optional:

* ``repo`` -- (*str|[str]*) URL of a repo or a list of urls.
* ``repo`` -- (*str|[str]*) repos specified by URL or variant UID
* ``release`` -- (*str*) Release value to set for the installer image. Set
to ``None`` to generate the value :ref:`automatically <auto_release>`.
* ``failable`` -- (*[str]*) List of architectures for which this
@@ -1247,7 +1230,8 @@ an OSTree repository. This always runs in Koji as a ``runroot`` task.

Deprecated options:

* ``source_repo_from`` -- (*str|[str]*) Deprecated, use ``repo_from`` instead.
* ``repo_from`` -- Deprecated, use ``repo`` instead.
* ``source_repo_from`` -- Deprecated, use ``repo`` instead.

Example config
--------------
@@ -1256,7 +1240,11 @@ Example config
ostree_installer = [
("^Atomic$", {
"x86_64": {
"repo_from": "Everything",
"repo": [
"Everything",
"https://example.com/extra-repo1.repo",
"https://example.com/extra-repo2.repo",
],
"release": None,
"installpkgs": ["fedora-productimg-atomic"],
"add_template": ["atomic-installer/lorax-configure-repo.tmpl"],
@@ -1272,12 +1260,6 @@ Example config
]
'template_repo': 'https://git.fedorahosted.org/git/spin-kickstarts.git',
'template_branch': 'f24',

# optional
"repo": [
"https://example.com/extra-repo1.repo",
"https://example.com/extra-repo2.repo",
],
}
})
]
@@ -1318,9 +1300,9 @@ they are not scratch builds).

A value for ``yum_repourls`` will be created automatically and point at a
repository in the current compose. You can add extra repositories with
``repo`` key having a list of urls pointing to ``.repo`` files or
``repo_from`` as a list of variants in current compose. ``gpgkey`` can be
specified to enable gpgcheck in repo files for variants.
``repo`` key having a list of urls pointing to ``.repo`` files or just
variant uid, Pungi will create the .repo file for that variant. ``gpgkey``
can be specified to enable gpgcheck in repo files for variants.


Example config
@@ -1336,8 +1318,7 @@ Example config
# optional
"name": "fedora-docker-base",
"version": "24",
"repo": ["https://example.com/extra-repo.repo"],
"repo_from": ["Everything"],
"repo": ["Everything", "https://example.com/extra-repo.repo"],
# This will result in three repo urls being passed to the task.
# They will be in this order: Server, Everything, example.com/
"gpgkey": 'file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release',
@@ -410,15 +410,36 @@ def _make_schema():
]
},

"source_repo_dict": {
"repo_dict": {
"type": "object",
"properties": {
"name": {"type": "string"},
"baseurl": {"type": "string"},
"exclude": {"type": "string"},
"gpgcheck": {"type": "boolean"},
"gpgcheck": {"type": "string"},
"enabled": {"type": "string"},
},
"additionalProperties": False,
"required": ["baseurl"],
},

"repo": {
"anyOf": [
{"type": "string"},
{"$ref": "#/definitions/repo_dict"},
]
},

"list_of_repos": {
"type": "array",
"items": {"$ref": "#/definitions/repo"},
},

"repos": {
"anyOf": [
{"$ref": "#/definitions/repo"},
{"$ref": "#/definitions/list_of_repos"},
]
},

"list_of_strings": {
@@ -426,11 +447,6 @@ def _make_schema():
"items": {"type": "string"},
},

"list_of_source_repo_dicts": {
"type": "array",
"items": {"$ref": "#/definitions/source_repo_dict"},
},

"strings": {
"anyOf": [
{"type": "string"},
@@ -454,10 +470,10 @@ def _make_schema():
"subvariant": {"type": "string"},
"version": {"type": "string"},
"repo": {
"$ref": "#/definitions/strings",
"$ref": "#/definitions/repos",
"alias": "additional_repos",
"append": "repo_from",
},
"repo_from": {"$ref": "#/definitions/strings"},
"specfile": {"type": "string"},
"scratch": {"type": "boolean"},
"type": {"type": "string"},
@@ -792,8 +808,10 @@ def _make_schema():
"name": {"type": "string"},
"subvariant": {"type": "string"},
"title": {"type": "string"},
"repo": {"$ref": "#/definitions/strings"},
"repo_from": {"$ref": "#/definitions/strings"},
"repo": {
"$ref": "#/definitions/repos",
"append": "repo_from",
},
"target": {"type": "string"},
"arches": {"$ref": "#/definitions/list_of_strings"},
"failable": {"$ref": "#/definitions/list_of_strings"},
@@ -812,13 +830,10 @@ def _make_schema():
"properties": {
"treefile": {"type": "string"},
"config_url": {"type": "string"},
"repo_from": {
"type": "string",
"alias": "source_repo_from",
},
"repo": {
"$ref": "#/definitions/list_of_source_repo_dicts",
"$ref": "#/definitions/repos",
"alias": "extra_source_repos",
"append": ["repo_from", "source_repo_from"],
},
"keep_original_sources": {"type": "boolean"},
"ostree_repo": {"type": "string"},
@@ -828,17 +843,16 @@ def _make_schema():
"config_branch": {"type": "string"},
"tag_ref": {"type": "boolean"},
},
"required": ["treefile", "config_url", "repo_from", "ostree_repo"],
"required": ["treefile", "config_url", "repo", "ostree_repo"],
"additionalProperties": False,
}),

"ostree_installer": _variant_arch_mapping({
"type": "object",
"properties": {
"repo": {"$ref": "#/definitions/strings"},
"repo_from": {
"$ref": "#/definitions/strings",
"alias": "source_repo_from",
"repo": {
"$ref": "#/definitions/repos",
"append": ["repo_from", "source_repo_from"],
},
"release": {"$ref": "#/definitions/optional_string"},
"failable": {"$ref": "#/definitions/list_of_strings"},
@@ -851,7 +865,7 @@ def _make_schema():
"template_repo": {"type": "string"},
"template_branch": {"type": "string"},
},
"required": ["repo_from"],
"required": ["repo"],
"additionalProperties": False,
}),

@@ -887,7 +901,10 @@ def _make_schema():
"name": {"type": "string"},
"kickstart": {"type": "string"},
"arches": {"$ref": "#/definitions/list_of_strings"},
"repo_from": {"$ref": "#/definitions/strings"},
"repo": {
"$ref": "#/definitions/repos",
"append": "repo_from",
},
"install_tree_from": {"type": "string"},
"subvariant": {"type": "string"},
"format": {"$ref": "#/definitions/string_tuples"},
@@ -954,8 +971,10 @@ def _make_schema():
"version": {"type": "string"},
"scratch": {"type": "boolean"},
"priority": {"type": "number"},
"repo": {"$ref": "#/definitions/strings"},
"repo_from": {"$ref": "#/definitions/strings"},
"repo": {
"$ref": "#/definitions/repos",
"append": "repo_from",
},
"gpgkey": {"type": "string"},
},
"required": ["url", "target"]
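As a reading aid for the ``alias``/``append`` keywords used in the schema above: configs written against the deprecated options keep validating and are rewritten into the new shape before the phases see them. A minimal sketch of the intended effect (illustrative only; the actual mechanics live in the schema validator, not in this snippet):

    # Old-style config, still accepted after this change:
    cfg = {
        'repo': 'http://example.com/a.repo',
        'repo_from': ['Everything'],          # deprecated option
    }

    # What the phase code effectively receives after validation,
    # with the deprecated value appended to 'repo':
    cfg = {
        'repo': ['http://example.com/a.repo', 'Everything'],
    }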
@@ -89,10 +89,8 @@ class Tree(OSTree):
self.extra_config = self.args.extra_config
if self.extra_config:
self.extra_config = json.load(open(self.extra_config, 'r'))
source_repo_from = self.extra_config.get('repo_from', None)
extra_source_repos = self.extra_config.get('repo', [])
repos = self.extra_config.get('repo', [])
keep_original_sources = self.extra_config.get('keep_original_sources', False)
repos = extra_source_repos + [{'name': 'source_repo_from', 'baseurl': source_repo_from}]
tweak_treeconf(self.treefile, source_repos=repos, keep_original_sources=keep_original_sources)

self.commitid_file = make_log_file(self.logdir, 'commitid')
@@ -6,7 +6,7 @@ import time
from kobo import shortcuts

from pungi.util import get_variant_data, makedirs, get_mtime, get_file_size, failable
from pungi.util import translate_path
from pungi.util import translate_path, get_repo_urls
from pungi.phases import base
from pungi.linker import Linker
from pungi.wrappers.kojiwrapper import KojiWrapper
@@ -49,28 +49,15 @@ class ImageBuildPhase(base.PhaseLoggerMixin, base.ImageConfigMixin, base.ConfigG
def _get_repo(self, image_conf, variant):
"""
Get a comma separated list of repos. First included are those
explicitly listed in config, followed by repos from other variants,
finally followed by repo for current variant.

The `repo_from` key is removed from the dict (if present).
explicitly listed in config, followed by repo for current variant
if it's not included in the list already.
"""
repo = shortcuts.force_list(image_conf.get('repo', []))
repos = shortcuts.force_list(image_conf.get('repo', []))

extras = shortcuts.force_list(image_conf.pop('repo_from', []))
if not variant.is_empty:
extras.append(variant.uid)
if not variant.is_empty and variant.uid not in repos:
repos.append(variant.uid)

for extra in extras:
v = self.compose.all_variants.get(extra)
if not v:
raise RuntimeError(
'There is no variant %s to get repo from when building image for %s.'
% (extra, variant.uid))
repo.append(translate_path(
self.compose,
self.compose.paths.compose.os_tree('$arch', v, create_dir=False)))

return ",".join(repo)
return ",".join(get_repo_urls(self.compose, repos, arch='$arch'))

def _get_arches(self, image_conf, arches):
if 'arches' in image_conf['image-build']:
@@ -28,7 +28,7 @@ from pungi.wrappers.kojiwrapper import KojiWrapper
from pungi.wrappers import iso
from pungi.phases import base
from pungi.util import get_arch_variant_data, makedirs, get_mtime, get_file_size, failable
from pungi.util import translate_path
from pungi.util import get_repo_urls


# HACK: define cmp in python3
@@ -44,29 +44,12 @@ class LiveImagesPhase(base.PhaseLoggerMixin, base.ImageConfigMixin, base.ConfigG
super(LiveImagesPhase, self).__init__(compose)
self.pool = ThreadPool(logger=self.logger)

def _get_extra_repos(self, arch, variant, extras):
repo = []
for extra in extras:
v = self.compose.all_variants.get(extra)
if not v:
raise RuntimeError(
'There is no variant %s to get repo from when building live image for %s.'
% (extra, variant.uid))
repo.append(translate_path(
self.compose, self.compose.paths.compose.repository(arch, v, create_dir=False)))

return repo

def _get_repos(self, arch, variant, data):
repos = []
if not variant.is_empty:
repos.append(translate_path(
self.compose, self.compose.paths.compose.repository(arch, variant, create_dir=False)))

# additional repos
repos.extend(data.get("repo", []))
repos.extend(self._get_extra_repos(arch, variant, force_list(data.get('repo_from', []))))
return repos
repos.append(variant.uid)
repos.extend(force_list(data.get('repo', [])))
return get_repo_urls(self.compose, repos, arch=arch)

def run(self):
symlink_isos_to = self.compose.conf.get("symlink_isos_to")
@@ -5,7 +5,7 @@ import time
from kobo import shortcuts

from pungi.util import get_variant_data, makedirs, get_mtime, get_file_size, failable
from pungi.util import translate_path
from pungi.util import translate_path, get_repo_urls
from pungi.phases.base import ConfigGuardedPhase, ImageConfigMixin, PhaseLoggerMixin
from pungi.linker import Linker
from pungi.wrappers.kojiwrapper import KojiWrapper
@@ -23,29 +23,16 @@ class LiveMediaPhase(PhaseLoggerMixin, ImageConfigMixin, ConfigGuardedPhase):

def _get_repos(self, image_conf, variant):
"""
Get a comma separated list of repos. First included are those
explicitly listed in config, followed by repos from other variants,
finally followed by repo for current variant.

The `repo_from` key is removed from the dict (if present).
Get a list of repo urls. First included are those explicitly listed in config,
followed by repo for current variant if it's not present in the list.
"""
repo = shortcuts.force_list(image_conf.get('repo', []))
repos = shortcuts.force_list(image_conf.get('repo', []))

extras = shortcuts.force_list(image_conf.pop('repo_from', []))
if not variant.is_empty:
extras.append(variant.uid)
if variant.uid not in repos:
repos.append(variant.uid)

for extra in extras:
v = self.compose.all_variants.get(extra)
if not v:
raise RuntimeError(
'There is no variant %s to get repo from when building live media for %s.'
% (extra, variant.uid))
repo.append(translate_path(
self.compose,
self.compose.paths.compose.repository('$basearch', v, create_dir=False)))

return repo
return get_repo_urls(self.compose, repos)

def _get_arches(self, image_conf, arches):
if 'arches' in image_conf:
@@ -53,13 +53,11 @@ class OSBSThread(WorkerThread):
source = util.resolve_git_url(config.pop('url'))
target = config.pop('target')
priority = config.pop('priority', None)
repos = shortcuts.force_list(config.pop('repo', []))
gpgkey = config.pop('gpgkey', None)
compose_repos = [self._get_repo(compose, v, gpgkey=gpgkey)
for v in [variant.uid] + shortcuts.force_list(
config.pop('repo_from', []))]
repos = [self._get_repo(compose, v, gpgkey=gpgkey)
for v in [variant.uid] + shortcuts.force_list(config.pop('repo', []))]

config['yum_repourls'] = compose_repos + repos
config['yum_repourls'] = repos

task_id = koji.koji_proxy.buildContainer(source, target, config,
priority=priority)
@@ -120,17 +118,21 @@ class OSBSThread(WorkerThread):
self.pool.metadata.setdefault(
variant.uid, {}).setdefault(arch, []).append(data)

def _get_repo(self, compose, variant_uid, gpgkey=None):
def _get_repo(self, compose, repo, gpgkey=None):
"""
Write a .repo file pointing to current variant and return URL to the
file.
Return the repo file URL for repo. If repo contains "://", it is already
a URL of a repo file. Otherwise it is a variant UID; write a .repo file
pointing to that variant and return the URL to the .repo file.
"""
if "://" in repo:
return repo

try:
variant = compose.all_variants[variant_uid]
variant = compose.all_variants[repo]
except KeyError:
raise RuntimeError(
'There is no variant %s to get repo from to pass to OSBS.'
% (variant_uid))
% (repo))
os_tree = compose.paths.compose.os_tree('$basearch', variant,
create_dir=False)
repo_file = os.path.join(compose.paths.work.tmp_dir(None, variant),
@@ -3,12 +3,13 @@
import copy
import json
import os
from kobo import shortcuts
from kobo.threads import ThreadPool, WorkerThread

from .base import ConfigGuardedPhase
from .. import util
from ..ostree.utils import get_ref_from_treefile, get_commitid_from_commitid_file
from ..util import translate_path
from ..util import get_repo_dicts
from ..wrappers import kojiwrapper, scm


@@ -45,41 +46,16 @@ class OSTreeThread(WorkerThread):
self.logdir = compose.paths.log.topdir('%s/%s/ostree-%d' %
(arch, variant.uid, self.num))
repodir = os.path.join(workdir, 'config_repo')

source_variant = compose.all_variants[config['repo_from']]
source_repo = translate_path(compose,
compose.paths.compose.repository('$basearch',
source_variant,
create_dir=False))

self._clone_repo(repodir, config['config_url'], config.get('config_branch', 'master'))

source_repos = [{'name': '%s-%s' % (compose.compose_id, config['repo_from']),
'baseurl': source_repo}]

extra_source_repos = config.get('repo', None)
if extra_source_repos:
for extra in extra_source_repos:
baseurl = extra['baseurl']
if "://" not in baseurl:
# it's variant UID, translate to url
variant = compose.variants[baseurl]
url = translate_path(compose,
compose.paths.compose.repository('$basearch',
variant,
create_dir=False))
extra['baseurl'] = url

source_repos = source_repos + extra_source_repos
repos = get_repo_dicts(compose, shortcuts.force_list(config['repo']))

# copy the original config and update it before saving to a json file
new_config = copy.copy(config)

# repos in configuration can have repo url set to variant UID,
# update it to have the actual url that we just translated.
new_config.update({'repo_from': source_repo})
if extra_source_repos:
new_config.update({'repo': extra_source_repos})
new_config.update({'repo': repos})

# remove unnecessary (for 'pungi-make-ostree tree' script) elements
# from config, it doesn't hurt to have them, however remove them can
@@ -88,8 +64,6 @@ class OSTreeThread(WorkerThread):
'failable', 'version', 'update_summary']:
new_config.pop(k, None)

extra_config_file = None
if new_config:
# write a json file to save the configuration, so 'pungi-make-ostree tree'
# can make use of it
extra_config_file = os.path.join(workdir, 'extra_config.json')
@@ -9,7 +9,7 @@ from kobo import shortcuts

from .base import ConfigGuardedPhase, PhaseLoggerMixin
from .. import util
from ..util import get_volid, translate_path
from ..util import get_volid, get_repo_urls
from ..wrappers import kojiwrapper, iso, lorax, scm


@@ -45,10 +45,8 @@ class OstreeInstallerThread(WorkerThread):
self.pool.log_info('[BEGIN] %s' % msg)
self.logdir = compose.paths.log.topdir('%s/%s/ostree_installer-%s' % (arch, variant, self.num))

source_from_repos = [self._get_source_repo(compose, arch, v)
for v in shortcuts.force_list(config['repo_from'])]
repos = shortcuts.force_list(config.pop('repo', []))
source_repos = source_from_repos + repos
repos = get_repo_urls(compose, shortcuts.force_list(config['repo']), arch=arch)
repos = [url.replace('$arch', arch) for url in repos]
output_dir = os.path.join(compose.paths.work.topdir(arch), variant.uid, 'ostree_installer')
util.makedirs(os.path.dirname(output_dir))

@@ -57,25 +55,13 @@ class OstreeInstallerThread(WorkerThread):
disc_type = compose.conf['disc_types'].get('ostree', 'ostree')

volid = get_volid(compose, arch, variant, disc_type=disc_type)
task_id = self._run_ostree_cmd(compose, variant, arch, config, source_repos, output_dir, volid)
task_id = self._run_ostree_cmd(compose, variant, arch, config, repos, output_dir, volid)

filename = compose.get_image_name(arch, variant, disc_type=disc_type)
self._copy_image(compose, variant, arch, filename, output_dir)
self._add_to_manifest(compose, variant, arch, filename)
self.pool.log_info('[DONE ] %s, (task id: %s)' % (msg, task_id))

def _get_source_repo(self, compose, arch, source):
"""
If `source` is a URL, return it as-is (possibly replacing $arch with
actual arch. Otherwise treat is a a variant name and return path to
repo in that variant.
"""
if '://' in source:
return source.replace('$arch', arch)
source_variant = compose.all_variants[source]
return translate_path(
compose, compose.paths.compose.repository(arch, source_variant, create_dir=False))

def _clone_templates(self, url, branch='master'):
if not url:
self.template_dir = None
pungi/util.py
@@ -17,6 +17,7 @@
import subprocess
import os
import shutil
import string
import sys
import hashlib
import errno
@@ -658,3 +659,105 @@ def translate_path(compose, path):
return normpath.replace(prefix, newvalue, 1)

return normpath


def get_repo_url(compose, repo, arch='$basearch'):
"""
Convert repo to repo URL.

@param compose - required for access to variants
@param repo - string or a dict which at least contains 'baseurl' key
@param arch - string to be used as arch in repo url
"""
if isinstance(repo, dict):
try:
repo = repo['baseurl']
except KeyError:
raise RuntimeError('Baseurl is required in repo dict %s' % str(repo))
if '://' not in repo:
# this is a variant name
v = compose.all_variants.get(repo)
if not v:
raise RuntimeError('There is no variant %s to get repo from.' % repo)
repo = translate_path(compose, compose.paths.compose.repository(arch, v, create_dir=False))
return repo


def get_repo_urls(compose, repos, arch='$basearch'):
"""
Convert repos to a list of repo URLs.

@param compose - required for access to variants
@param repos - list of strings or dicts, if an item is a dict, key 'baseurl' is required
@param arch - string to be used as arch in repo url
"""
urls = []
for repo in repos:
repo = get_repo_url(compose, repo, arch=arch)
urls.append(repo)
return urls


def _translate_url_to_repo_id(url):
"""
Translate url to a valid repo id by replacing any invalid char with '_'.
"""
_REPOID_CHARS = string.ascii_letters + string.digits + '-_.:'
return ''.join([s if s in list(_REPOID_CHARS) else '_' for s in url])


def get_repo_dict(compose, repo, arch='$basearch'):
"""
Convert repo to a dict of repo options.

If repo is a string, translate it to a repo url if necessary (when it's
not a url), and set it as 'baseurl' in the result dict, also generate
a repo id/name as the 'name' key in the result dict.
If repo is a dict, translate the value of the 'baseurl' key to a url if
necessary; if the 'name' key is missing in the dict, generate one for it.

@param compose - required for access to variants
@param repo - a string or dict, if it is a dict, key 'baseurl' is required
@param arch - string to be used as arch in repo url
"""
repo_dict = {}
if isinstance(repo, dict):
url = repo['baseurl']
name = repo.get('name', None)
if '://' in url:
if name is None:
name = _translate_url_to_repo_id(url)
else:
# url is variant uid
if name is None:
name = '%s-%s' % (compose.compose_id, url)
url = get_repo_url(compose, url, arch=arch)
repo['name'] = name
repo['baseurl'] = url
return repo
else:
# repo is normal url or variant uid
repo_dict = {}
if '://' in repo:
repo_dict['name'] = _translate_url_to_repo_id(repo)
repo_dict['baseurl'] = repo
else:
repo_dict['name'] = '%s-%s' % (compose.compose_id, repo)
repo_dict['baseurl'] = get_repo_url(compose, repo)

return repo_dict


def get_repo_dicts(compose, repos, arch='$basearch'):
"""
Convert repos to a list of repo dicts.

@param compose - required for access to variants
@param repos - a list of strings or dicts, if an item is a dict, key 'baseurl' is required
@param arch - string to be used as arch in repo url
"""
repo_dicts = []
for repo in repos:
repo_dict = get_repo_dict(compose, repo, arch=arch)
repo_dicts.append(repo_dict)
return repo_dicts
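To make the new ``pungi.util`` helpers above concrete, a short usage sketch mirroring the unit tests added at the end of this diff (it assumes ``compose`` is a Compose object set up with ``translate_paths`` and Server/Client variants, as those tests do):

    from pungi.util import get_repo_urls, get_repo_dicts

    repos = [
        'http://example.com/repo',          # plain URL: passed through unchanged
        'Server',                           # variant UID: translated to that variant's repo URL
        {'baseurl': 'Client'},              # repo dict whose baseurl is a variant UID
        {'baseurl': 'ftp://example.com/linux/repo', 'exclude': 'systemd-container'},
    ]

    urls = get_repo_urls(compose, repos)
    # e.g. ['http://example.com/repo',
    #       'http://example.com/<compose dir>/compose/Server/$basearch/os',
    #       'http://example.com/<compose dir>/compose/Client/$basearch/os',
    #       'ftp://example.com/linux/repo']

    dicts = get_repo_dicts(compose, repos)
    # Each entry gets a 'name' (derived from the URL, or compose id + variant UID)
    # and a resolved 'baseurl'; extra keys such as 'exclude' are preserved.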
@@ -273,7 +273,7 @@ class OstreeConfigTestCase(ConfigTestCase):
"x86_64": {
"treefile": "fedora-atomic-docker-host.json",
"config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
"repo_from": "Everything",
"repo": "Everything",
"ostree_repo": "/mnt/koji/compose/atomic/Rawhide/"
}
})
@@ -298,7 +298,7 @@ class OstreeInstallerConfigTestCase(ConfigTestCase):
ostree_installer=[
("^Atomic$", {
"x86_64": {
"repo_from": "Everything",
"repo": "Everything",
"release": None,
"installpkgs": ["fedora-productimg-atomic"],
"add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -326,7 +326,7 @@ class OstreeInstallerConfigTestCase(ConfigTestCase):
ostree_installer=[
("^Atomic$", {
"x86_64": {
"repo_from": "Everything",
"repo": "Everything",
"release": None,
"installpkgs": ["fedora-productimg-atomic"],
"add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -23,8 +23,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything', 'Server-optional'],
'repo': ['http://example.com/repo/', 'Everything', 'Server-optional'],
'release': None,
}
})
@@ -76,8 +75,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': 'Everything',
'repo': ['http://example.com/repo/', 'Everything'],
'release': None,
}
})
@@ -124,8 +122,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
'release': None,
}
})
@@ -171,12 +168,10 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': [{
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
}, {
'kickstart': 'another.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
}]
})
],
@@ -244,8 +239,7 @@ class TestLiveImagesPhase(PungiTestCase):
'amd64': {
'kickstart': 'test.ks',
'ksurl': 'https://git.example.com/kickstarts.git?#HEAD',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
'type': 'appliance',
}
})
@@ -299,8 +293,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
'type': 'appliance',
}
})
@@ -354,8 +347,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
'type': 'appliance',
}
})
@@ -406,8 +398,7 @@ class TestLiveImagesPhase(PungiTestCase):
('^Client$', {
'amd64': {
'kickstart': 'test.ks',
'repo': ['http://example.com/repo/'],
'repo_from': ['Everything'],
'repo': ['http://example.com/repo/', 'Everything'],
'release': None,
}
})
@@ -353,7 +353,7 @@ class TestLiveMediaPhase(PungiTestCase):

phase = LiveMediaPhase(compose)

with self.assertRaisesRegexp(RuntimeError, r'no.+Missing.+when building.+Server'):
with self.assertRaisesRegexp(RuntimeError, r'There is no variant Missing to get repo from.'):
phase.run()

@mock.patch('pungi.util.resolve_git_url')
@@ -23,7 +23,7 @@ class OSBSPhaseTest(helpers.PungiTestCase):

@mock.patch('pungi.phases.osbs.ThreadPool')
def test_run(self, ThreadPool):
cfg = mock.Mock()
cfg = helpers.IterableMock()
compose = helpers.DummyCompose(self.topdir, {
'osbs': {'^Everything$': cfg}
})
@@ -310,8 +310,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
'target': 'f24-docker-candidate',
'name': 'my-name',
'version': '1.0',
'repo': 'http://pkgs.example.com/my.repo',
'repo_from': 'Everything',
'repo': ['Everything', 'http://pkgs.example.com/my.repo']
}
self._setupMock(KojiWrapper, resolve_git_url)
self._assertConfigCorrect(cfg)
@@ -339,8 +338,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
'target': 'f24-docker-candidate',
'name': 'my-name',
'version': '1.0',
'repo': ['http://pkgs.example.com/my.repo'],
'repo_from': ['Everything', 'Client'],
'repo': ['Everything', 'Client', 'http://pkgs.example.com/my.repo'],
}
self._assertConfigCorrect(cfg)
self._setupMock(KojiWrapper, resolve_git_url)
@@ -370,8 +368,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
'target': 'f24-docker-candidate',
'name': 'my-name',
'version': '1.0',
'repo': ['http://pkgs.example.com/my.repo'],
'repo_from': ['Everything', 'Client'],
'repo': ['Everything', 'Client', 'http://pkgs.example.com/my.repo'],
'gpgkey': gpgkey,
}
self._assertConfigCorrect(cfg)
@@ -389,7 +386,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
'target': 'f24-docker-candidate',
'name': 'my-name',
'version': '1.0',
'repo_from': 'Gold',
'repo': 'Gold',
}
self._assertConfigCorrect(cfg)
self._setupMock(KojiWrapper, resolve_git_url)
@@ -136,7 +136,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
self.compose.supported = False
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': '20160321.n.0',
}
koji = KojiWrapper.return_value
@@ -172,7 +172,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_file_size, get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'http://example.com/repo/$arch/',
'repo': 'http://example.com/repo/$arch/',
'release': '20160321.n.0',
}
koji = KojiWrapper.return_value
@@ -206,9 +206,9 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_file_size, get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'release': '20160321.n.0',
'repo': [
'Everything',
'https://example.com/extra-repo1.repo',
'https://example.com/extra-repo2.repo',
],
@@ -244,9 +244,10 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_file_size, get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': ['Everything', 'Server'],
'release': '20160321.n.0',
'repo': [
'Everything',
'Server',
'https://example.com/extra-repo1.repo',
'https://example.com/extra-repo2.repo',
],
@@ -284,7 +285,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': '20160321.n.0',
'add_template': ['some-file.txt'],
}
@@ -317,7 +318,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_dir_from_scm):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': '20160321.n.0',
'add_template': ['some_file.txt'],
'add_arch_template': ['other_file.txt'],
@@ -365,7 +366,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_file_size, get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': None,
"installpkgs": ["fedora-productimg-atomic"],
"add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -426,7 +427,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': None,
'failable': ['x86_64']
}
@@ -452,7 +453,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
get_file_size, get_mtime, ImageCls, run):
pool = mock.Mock()
cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'release': None,
'failable': ['*'],
}
@@ -51,7 +51,7 @@ class OSTreeThreadTest(helpers.PungiTestCase):
self.repo = os.path.join(self.topdir, 'place/for/atomic')
os.makedirs(os.path.join(self.repo, 'refs', 'heads'))
self.cfg = {
'repo_from': 'Everything',
'repo': 'Everything',
'config_url': 'https://git.fedorahosted.org/git/fedora-atomic.git',
'config_branch': 'f24',
'treefile': 'fedora-atomic-docker-host.json',
@@ -305,8 +305,8 @@ class OSTreeThreadTest(helpers.PungiTestCase):
koji.run_runroot_cmd.side_effect = self._mock_runroot(0)

cfg = {
'repo_from': 'Everything',
'repo': [
'Everything',
{
'name': 'repo_a',
'baseurl': 'http://url/to/repo/a',
@@ -333,9 +333,10 @@ class OSTreeThreadTest(helpers.PungiTestCase):
self.assertTrue(os.path.isfile(extra_config_file))
extra_config = json.load(open(extra_config_file, 'r'))
self.assertTrue(extra_config.get('keep_original_sources', False))
self.assertEqual(extra_config.get('repo_from', None), 'http://example.com/Everything/$basearch/os')
self.assertEqual(len(extra_config.get('repo', [])), len(cfg['repo']))
self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://example.com/Server/$basearch/os')
self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://url/to/repo/a')
self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://example.com/Everything/$basearch/os')

if __name__ == '__main__':
unittest.main()
@@ -156,8 +156,11 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):

extra_config_file = os.path.join(self.topdir, 'extra_config.json')
extra_config = {
"repo_from": "http://www.example.com/Server.repo",
"repo": [
{
"name": "server",
"baseurl": "http://www.example.com/Server/repo",
},
{
"name": "optional",
"baseurl": "http://example.com/repo/x86_64/optional",
@@ -180,14 +183,14 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
'--extra-config=%s' % extra_config_file,
])

source_repo_from_name = "source_repo_from-%s" % timestamp
source_repo_from_repo = os.path.join(configdir, "%s.repo" % source_repo_from_name)
self.assertTrue(os.path.isfile(source_repo_from_repo))
with open(source_repo_from_repo, 'r') as f:
server_repo_name = "server-%s" % timestamp
server_repo = os.path.join(configdir, "%s.repo" % server_repo_name)
self.assertTrue(os.path.isfile(server_repo))
with open(server_repo, 'r') as f:
content = f.read()
self.assertIn("[%s]" % source_repo_from_name, content)
self.assertIn("name=%s" % source_repo_from_name, content)
self.assertIn("baseurl=http://www.example.com/Server.repo", content)
self.assertIn("[%s]" % server_repo_name, content)
self.assertIn("name=%s" % server_repo_name, content)
self.assertIn("baseurl=http://www.example.com/Server/repo", content)
self.assertIn("gpgcheck=0", content)

optional_repo_name = "optional-%s" % timestamp
@@ -213,7 +216,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
treeconf = json.load(open(treefile, 'r'))
repos = treeconf['repos']
self.assertEqual(len(repos), 3)
for name in [source_repo_from_name, optional_repo_name, extra_repo_name]:
for name in [server_repo_name, optional_repo_name, extra_repo_name]:
self.assertIn(name, repos)

@mock.patch('pungi.ostree.utils.datetime')
@@ -230,8 +233,11 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):

extra_config_file = os.path.join(self.topdir, 'extra_config.json')
extra_config = {
"repo_from": "http://www.example.com/Server.repo",
"repo": [
{
"name": "server",
"baseurl": "http://www.example.com/Server/repo",
},
{
"name": "optional",
"baseurl": "http://example.com/repo/x86_64/optional",
@@ -255,7 +261,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
'--extra-config=%s' % extra_config_file,
])

source_repo_from_name = "source_repo_from-%s" % timestamp
server_repo_name = "server-%s" % timestamp
optional_repo_name = "optional-%s" % timestamp
extra_repo_name = "extra-%s" % timestamp

@@ -263,7 +269,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
repos = treeconf['repos']
self.assertEqual(len(repos), 6)
for name in ['fedora-rawhide', 'fedora-24', 'fedora-23',
source_repo_from_name, optional_repo_name, extra_repo_name]:
server_repo_name, optional_repo_name, extra_repo_name]:
self.assertIn(name, repos)

@@ -540,5 +540,107 @@ class TranslatePathTestCase(unittest.TestCase):
self.assertEqual(ret, '/mnt/fedora_koji/compose/rawhide/XYZ')


class GetRepoFuncsTestCase(unittest.TestCase):
@mock.patch('pungi.compose.ComposeInfo')
def setUp(self, ci):
self.tmp_dir = tempfile.mkdtemp()
conf = {
'translate_paths': [(self.tmp_dir, 'http://example.com')]
}
ci.return_value.compose.respin = 0
ci.return_value.compose.id = 'RHEL-8.0-20180101.n.0'
ci.return_value.compose.date = '20160101'
ci.return_value.compose.type = 'nightly'
ci.return_value.compose.type_suffix = '.n'
ci.return_value.compose.label = 'RC-1.0'
ci.return_value.compose.label_major_version = '1'

compose_dir = os.path.join(self.tmp_dir, ci.return_value.compose.id)
self.compose = compose.Compose(conf, compose_dir)
server_variant = mock.Mock(uid='Server', type='variant')
client_variant = mock.Mock(uid='Client', type='variant')
self.compose.all_variants = {
'Server': server_variant,
'Client': client_variant,
}

def tearDown(self):
shutil.rmtree(self.tmp_dir)

def test_get_repo_url_from_normal_url(self):
url = util.get_repo_url(self.compose, 'http://example.com/repo')
self.assertEqual(url, 'http://example.com/repo')

def test_get_repo_url_from_variant_uid(self):
url = util.get_repo_url(self.compose, 'Server')
self.assertEqual(url, 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os')

def test_get_repo_url_from_repo_dict(self):
repo = {'baseurl': 'http://example.com/repo'}
url = util.get_repo_url(self.compose, repo)
self.assertEqual(url, 'http://example.com/repo')

repo = {'baseurl': 'Server'}
url = util.get_repo_url(self.compose, repo)
self.assertEqual(url, 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os')

def test_get_repo_urls(self):
repos = [
'http://example.com/repo',
'Server',
{'baseurl': 'Client'},
{'baseurl': 'ftp://example.com/linux/repo'},
]

expect = [
'http://example.com/repo',
'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os',
'http://example.com/RHEL-8.0-20180101.n.0/compose/Client/$basearch/os',
'ftp://example.com/linux/repo',
]

self.assertEqual(util.get_repo_urls(self.compose, repos), expect)

def test_get_repo_dict_from_normal_url(self):
repo_dict = util.get_repo_dict(self.compose, 'http://example.com/repo')
expect = {'name': 'http:__example.com_repo', 'baseurl': 'http://example.com/repo'}
self.assertEqual(repo_dict, expect)

def test_get_repo_dict_from_variant_uid(self):
repo_dict = util.get_repo_dict(self.compose, 'Server')
expect = {
'name': "%s-%s" % (self.compose.compose_id, 'Server'),
'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os',
}
self.assertEqual(repo_dict, expect)

def test_get_repo_dict_from_repo_dict(self):
repo = {'baseurl': 'Server'}
expect = {
'name': '%s-%s' % (self.compose.compose_id, 'Server'),
'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os'
}
repo_dict = util.get_repo_dict(self.compose, repo)
self.assertEqual(repo_dict, expect)

def test_get_repo_dicts(self):
repos = [
'http://example.com/repo',
'Server',
{'baseurl': 'Client'},
{'baseurl': 'ftp://example.com/linux/repo'},
{'name': 'testrepo', 'baseurl': 'ftp://example.com/linux/repo'},
]
expect = [
{'name': 'http:__example.com_repo', 'baseurl': 'http://example.com/repo'},
{'name': '%s-%s' % (self.compose.compose_id, 'Server'), 'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os'},
{'name': '%s-%s' % (self.compose.compose_id, 'Client'), 'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Client/$basearch/os'},
{'name': 'ftp:__example.com_linux_repo', 'baseurl': 'ftp://example.com/linux/repo'},
{'name': 'testrepo', 'baseurl': 'ftp://example.com/linux/repo'},
]
repos = util.get_repo_dicts(self.compose, repos)
self.assertEqual(repos, expect)


if __name__ == "__main__":
unittest.main()