unify repo and repo_from options

The config options 'repo' and 'repo_from' are used in several phases; merge them into a single 'repo' option. The 'append' keyword in the schema appends the values from deprecated options to 'repo', so existing config files that still use the old 'repo_from' and 'source_repo_from' options (an alias of 'repo_from') keep working. The 'repo' schema is also updated to accept a repo dict, either as the value itself or as an item in a list. A repo dict is simply a dict of repo options in which 'baseurl' is required, e.g. {"baseurl": "http://example.com/url/to/repo"} or {"baseurl": "Server"}. Currently this is used in the ostree phase to support extra repo options, e.g. {"baseurl": "Server", "exclude": "systemd-container"}.

Signed-off-by: Qixiang Wan <qwan@redhat.com>
This commit is contained in:
parent
0ee2189d9c
commit
2f5d6d7dcd
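The 'append' behavior described above can be illustrated with a small sketch. The helper names here are illustrative, not Pungi's actual API; the real merging happens inside Pungi's schema validation.

```python
# Sketch of the 'append' semantics: values from deprecated options are folded
# into the new option before validation, so old config files keep working.

def _force_list(value):
    # normalize a scalar or list into a list
    if isinstance(value, list):
        return value
    return [value]

def merge_deprecated(config, schema_props):
    """Fold deprecated option values into their replacement option."""
    for key, prop in schema_props.items():
        for old_key in _force_list(prop.get("append", [])):
            if old_key in config:
                merged = _force_list(config.get(key, [])) + _force_list(config.pop(old_key))
                config[key] = merged
    return config
```

With a schema entry like `"repo": {"append": ["repo_from", "source_repo_from"]}`, a config that still sets `repo_from` ends up with those values appended to `repo`.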
@@ -941,8 +941,7 @@ Live Images Settings
 * ``ksurl`` (*str*) [optional] -- where to get the kickstart from
 * ``name`` (*str*)
 * ``version`` (*str*)
-* ``repo`` (*list*) -- external repos specified by URL
-* ``repo_from`` (*list*) -- repos from other variants
+* ``repo`` (*str|[str]*) -- repos specified by URL or variant UID
 * ``specfile`` (*str*) -- for images wrapped in RPM
 * ``scratch`` (*bool*) -- only RPM-wrapped images can use scratch builds,
   but by default this is turned off
@@ -953,7 +952,8 @@ Live Images Settings

 Deprecated options:

-* ``additional_repos`` (*list*) -- deprecated, use ``repo`` instead
+* ``additional_repos`` -- deprecated, use ``repo`` instead
+* ``repo_from`` -- deprecated, use ``repo`` instead

 **live_images_no_rename**
     (*bool*) -- When set to ``True``, filenames generated by Koji will be used.
@@ -986,11 +986,14 @@ Live Media Settings
   for automatically generating one. See :ref:`common options
   <auto_release>` for details.
 * ``skip_tag`` (*bool*)
-* ``repo`` (*[str]*) -- external repo
-* ``repo_from`` (*[str]*) -- list of variants to take extra repos from
+* ``repo`` (*str|[str]*) -- repos specified by URL or variant UID
 * ``title`` (*str*)
 * ``install_tree_from`` (*str*) -- variant to take install tree from

+Deprecated options:
+
+* ``repo_from`` -- deprecated, use ``repo`` instead
+
 If many of your media use the same value for one of ``ksurl``, ``release``,
 ``target`` or ``version``, consider using these options to set the value in one
 place and have all media inherit it.
@@ -1035,9 +1038,6 @@ Image Build Settings
   If you explicitly set ``release`` to ``None``, it will be replaced with
   a value generated as described in :ref:`common options <auto_release>`.

-  You can also add extra variants to get repos from with key ``repo_from``.
-  The value should be a list of variant names.
-
   Please don't set ``install_tree``. This gets automatically set by *pungi*
   based on current variant. You can use ``install_tree_from`` key to use
   install tree from another variant.
@@ -1114,7 +1114,7 @@ Example

             # Use install tree and repo from Everything variant.
             'install_tree_from': 'Everything',
-            'repo_from': ['Everything'],
+            'repo': ['Everything'],

             # Set release automatically.
             'release': None,
@@ -1141,21 +1141,12 @@ a new commit.

 * ``treefile`` -- (*str*) Filename of configuration for ``rpm-ostree``.
 * ``config_url`` -- (*str*) URL for Git repository with the ``treefile``.
-* ``repo_from`` -- (*str*) Name of variant serving as source repository.
+* ``repo`` -- (*str|dict|[str|dict]*) repos specified by URL or variant UID
+  or a dict of repo options, ``baseurl`` is required in the dict.
 * ``ostree_repo`` -- (*str*) Where to put the ostree repository

 These keys are optional:

-* ``repo`` -- (*[dict]*) Extra source repos to get packages
-  while composing the OSTree repository. Each dict represents a yum repo.
-  The allowed keys are:
-
-  * ``name`` (required)
-  * ``baseurl`` (required) -- URL of external repo or variant UID, in the case
-    of variant UID, url to variant repo will be built automatically.
-  * ``gpgcheck`` (optional)
-  * ``exclude`` (optional)
-
 * ``keep_original_sources`` -- (*bool*) Keep the existing source repos in
   the tree config file. If not enabled, all the original source repos will
   be removed from the tree config file.
@@ -1171,8 +1162,10 @@ a new commit.

 Deprecated options:

-* ``source_repo_from`` -- (*str*) Deprecated, use ``repo_from`` instead.
-* ``extra_source_repos`` -- (*[dict]*) Deprecated, use ``repo`` instead.
+* ``repo_from`` -- Deprecated, use ``repo`` instead.
+* ``source_repo_from`` -- Deprecated, use ``repo`` instead.
+* ``extra_source_repos`` -- Deprecated, use ``repo`` instead.


 Example config
@@ -1184,18 +1177,11 @@ Example config
     "x86_64": {
         "treefile": "fedora-atomic-docker-host.json",
         "config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
-        "repo_from": "Server",
-        "repo": [
-            {
-                "name": "repo_a",
-                "baseurl": "http://example.com/repo/x86_64/os",
-                "exclude": "systemd-container",
-                "gpgcheck": False
-            },
-            {
-                "name": "Everything",
-                "baseurl": "Everything",
-            }
+        "repo": [
+            "Server",
+            "http://example.com/repo/x86_64/os",
+            {"baseurl": "Everything"},
+            {"baseurl": "http://example.com/linux/repo", "exclude": "systemd-container"},
         ],
         "keep_original_sources": True,
         "ostree_repo": "/mnt/koji/compose/atomic/Rawhide/",
@@ -1218,12 +1204,9 @@ an OSTree repository. This always runs in Koji as a ``runroot`` task.

 The configuration dict for each variant arch pair must have this key:

-* ``repo_from`` -- (*str|[str]*) Name of variant or a name list of
-  variants serving as source repositories.
-
 These keys are optional:

-* ``repo`` -- (*str|[str]*) URL of a repo or a list of urls.
+* ``repo`` -- (*str|[str]*) repos specified by URL or variant UID
 * ``release`` -- (*str*) Release value to set for the installer image. Set
   to ``None`` to generate the value :ref:`automatically <auto_release>`.
 * ``failable`` -- (*[str]*) List of architectures for which this
@@ -1247,7 +1230,8 @@ an OSTree repository. This always runs in Koji as a ``runroot`` task.

 Deprecated options:

-* ``source_repo_from`` -- (*str|[str]*) Deprecated, use ``repo_from`` instead.
+* ``repo_from`` -- Deprecated, use ``repo`` instead.
+* ``source_repo_from`` -- Deprecated, use ``repo`` instead.

 Example config
 --------------
@@ -1256,7 +1240,11 @@ Example config
 ostree_installer = [
     ("^Atomic$", {
         "x86_64": {
-            "repo_from": "Everything",
+            "repo": [
+                "Everything",
+                "https://example.com/extra-repo1.repo",
+                "https://example.com/extra-repo2.repo",
+            ],
             "release": None,
             "installpkgs": ["fedora-productimg-atomic"],
             "add_template": ["atomic-installer/lorax-configure-repo.tmpl"],
@@ -1272,12 +1260,6 @@ Example config
             ]
             'template_repo': 'https://git.fedorahosted.org/git/spin-kickstarts.git',
             'template_branch': 'f24',
-
-            # optional
-            "repo": [
-                "https://example.com/extra-repo1.repo",
-                "https://example.com/extra-repo2.repo",
-            ],
         }
     })
 ]
@@ -1318,9 +1300,9 @@ they are not scratch builds).

 A value for ``yum_repourls`` will be created automatically and point at a
 repository in the current compose. You can add extra repositories with
-``repo`` key having a list of urls pointing to ``.repo`` files or
-``repo_from`` as a list of variants in current compose. ``gpgkey`` can be
-specified to enable gpgcheck in repo files for variants.
+``repo`` key having a list of urls pointing to ``.repo`` files or just
+variant uid, Pungi will create the .repo file for that variant. ``gpgkey``
+can be specified to enable gpgcheck in repo files for variants.


 Example config
@@ -1336,8 +1318,7 @@ Example config
         # optional
         "name": "fedora-docker-base",
         "version": "24",
-        "repo": ["https://example.com/extra-repo.repo"],
-        "repo_from": ["Everything"],
+        "repo": ["Everything", "https://example.com/extra-repo.repo"],
         # This will result in three repo urls being passed to the task.
         # They will be in this order: Server, Everything, example.com/
         "gpgkey": 'file:///etc/pki/rpm-gpg/RPM-GPG-KEY-redhat-release',
@@ -410,15 +410,36 @@ def _make_schema():
             ]
         },

-        "source_repo_dict": {
+        "repo_dict": {
             "type": "object",
             "properties": {
                 "name": {"type": "string"},
                 "baseurl": {"type": "string"},
                 "exclude": {"type": "string"},
-                "gpgcheck": {"type": "boolean"},
+                "gpgcheck": {"type": "string"},
+                "enabled": {"type": "string"},
             },
             "additionalProperties": False,
+            "required": ["baseurl"],
+        },
+
+        "repo": {
+            "anyOf": [
+                {"type": "string"},
+                {"$ref": "#/definitions/repo_dict"},
+            ]
+        },
+
+        "list_of_repos": {
+            "type": "array",
+            "items": {"$ref": "#/definitions/repo"},
+        },
+
+        "repos": {
+            "anyOf": [
+                {"$ref": "#/definitions/repo"},
+                {"$ref": "#/definitions/list_of_repos"},
+            ]
         },

         "list_of_strings": {
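The new schema definitions amount to: a repo value is a plain string, a repo dict with a required `baseurl`, or a list mixing both. A minimal hand-rolled check mirroring that shape (a sketch only; Pungi enforces the real schema through jsonschema):

```python
# Mirror of the repo/repo_dict/repos schema shape: string, dict with required
# 'baseurl' and only the allowed keys, or a list mixing both forms.

ALLOWED_KEYS = {"name", "baseurl", "exclude", "gpgcheck", "enabled"}

def is_valid_repo(value):
    if isinstance(value, str):
        return True
    if isinstance(value, dict):
        # 'baseurl' is required; no extra keys allowed (additionalProperties: False)
        return "baseurl" in value and set(value) <= ALLOWED_KEYS
    return False

def is_valid_repos(value):
    # "repos" accepts a single repo or a list of repos
    if isinstance(value, list):
        return all(is_valid_repo(item) for item in value)
    return is_valid_repo(value)
```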
@@ -426,11 +447,6 @@ def _make_schema():
             "items": {"type": "string"},
         },

-        "list_of_source_repo_dicts": {
-            "type": "array",
-            "items": {"$ref": "#/definitions/source_repo_dict"},
-        },
-
         "strings": {
             "anyOf": [
                 {"type": "string"},
@@ -454,10 +470,10 @@ def _make_schema():
                 "subvariant": {"type": "string"},
                 "version": {"type": "string"},
                 "repo": {
-                    "$ref": "#/definitions/strings",
+                    "$ref": "#/definitions/repos",
                     "alias": "additional_repos",
+                    "append": "repo_from",
                 },
-                "repo_from": {"$ref": "#/definitions/strings"},
                 "specfile": {"type": "string"},
                 "scratch": {"type": "boolean"},
                 "type": {"type": "string"},
@@ -792,8 +808,10 @@ def _make_schema():
                 "name": {"type": "string"},
                 "subvariant": {"type": "string"},
                 "title": {"type": "string"},
-                "repo": {"$ref": "#/definitions/strings"},
-                "repo_from": {"$ref": "#/definitions/strings"},
+                "repo": {
+                    "$ref": "#/definitions/repos",
+                    "append": "repo_from",
+                },
                 "target": {"type": "string"},
                 "arches": {"$ref": "#/definitions/list_of_strings"},
                 "failable": {"$ref": "#/definitions/list_of_strings"},
@@ -812,13 +830,10 @@ def _make_schema():
             "properties": {
                 "treefile": {"type": "string"},
                 "config_url": {"type": "string"},
-                "repo_from": {
-                    "type": "string",
-                    "alias": "source_repo_from",
-                },
                 "repo": {
-                    "$ref": "#/definitions/list_of_source_repo_dicts",
+                    "$ref": "#/definitions/repos",
                     "alias": "extra_source_repos",
+                    "append": ["repo_from", "source_repo_from"],
                 },
                 "keep_original_sources": {"type": "boolean"},
                 "ostree_repo": {"type": "string"},
@@ -828,17 +843,16 @@ def _make_schema():
                 "config_branch": {"type": "string"},
                 "tag_ref": {"type": "boolean"},
             },
-            "required": ["treefile", "config_url", "repo_from", "ostree_repo"],
+            "required": ["treefile", "config_url", "repo", "ostree_repo"],
             "additionalProperties": False,
         }),

         "ostree_installer": _variant_arch_mapping({
             "type": "object",
             "properties": {
-                "repo": {"$ref": "#/definitions/strings"},
-                "repo_from": {
-                    "$ref": "#/definitions/strings",
-                    "alias": "source_repo_from",
+                "repo": {
+                    "$ref": "#/definitions/repos",
+                    "append": ["repo_from", "source_repo_from"],
                 },
                 "release": {"$ref": "#/definitions/optional_string"},
                 "failable": {"$ref": "#/definitions/list_of_strings"},
@@ -851,7 +865,7 @@ def _make_schema():
                 "template_repo": {"type": "string"},
                 "template_branch": {"type": "string"},
             },
-            "required": ["repo_from"],
+            "required": ["repo"],
             "additionalProperties": False,
         }),

@@ -887,7 +901,10 @@ def _make_schema():
                 "name": {"type": "string"},
                 "kickstart": {"type": "string"},
                 "arches": {"$ref": "#/definitions/list_of_strings"},
-                "repo_from": {"$ref": "#/definitions/strings"},
+                "repo": {
+                    "$ref": "#/definitions/repos",
+                    "append": "repo_from",
+                },
                 "install_tree_from": {"type": "string"},
                 "subvariant": {"type": "string"},
                 "format": {"$ref": "#/definitions/string_tuples"},
@@ -954,8 +971,10 @@ def _make_schema():
                 "version": {"type": "string"},
                 "scratch": {"type": "boolean"},
                 "priority": {"type": "number"},
-                "repo": {"$ref": "#/definitions/strings"},
-                "repo_from": {"$ref": "#/definitions/strings"},
+                "repo": {
+                    "$ref": "#/definitions/repos",
+                    "append": "repo_from",
+                },
                 "gpgkey": {"type": "string"},
             },
             "required": ["url", "target"]
@@ -89,10 +89,8 @@ class Tree(OSTree):
         self.extra_config = self.args.extra_config
         if self.extra_config:
             self.extra_config = json.load(open(self.extra_config, 'r'))
-            source_repo_from = self.extra_config.get('repo_from', None)
-            extra_source_repos = self.extra_config.get('repo', [])
+            repos = self.extra_config.get('repo', [])
             keep_original_sources = self.extra_config.get('keep_original_sources', False)
-            repos = extra_source_repos + [{'name': 'source_repo_from', 'baseurl': source_repo_from}]
             tweak_treeconf(self.treefile, source_repos=repos, keep_original_sources=keep_original_sources)

         self.commitid_file = make_log_file(self.logdir, 'commitid')
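`tweak_treeconf` is imported from Pungi's ostree tooling and its implementation is not part of this diff. A rough, file-free sketch of the merge it is asked to perform on the parsed treefile JSON, assuming each repo dict carries a `name` key, might look like:

```python
# Sketch: replace or extend the 'repos' list of an rpm-ostree treefile dict.
# rpm-ostree resolves these names against .repo files next to the treefile.

def merge_treeconf_repos(treeconf, source_repos, keep_original_sources=False):
    names = [repo["name"] for repo in source_repos]
    if keep_original_sources:
        # keep the repos already listed in the treefile and append ours
        treeconf["repos"] = treeconf.get("repos", []) + names
    else:
        # replace the original source repos entirely
        treeconf["repos"] = names
    return treeconf
```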
@@ -6,7 +6,7 @@ import time
 from kobo import shortcuts

 from pungi.util import get_variant_data, makedirs, get_mtime, get_file_size, failable
-from pungi.util import translate_path
+from pungi.util import translate_path, get_repo_urls
 from pungi.phases import base
 from pungi.linker import Linker
 from pungi.wrappers.kojiwrapper import KojiWrapper
@@ -49,28 +49,15 @@ class ImageBuildPhase(base.PhaseLoggerMixin, base.ImageConfigMixin, base.ConfigGuardedPhase):
     def _get_repo(self, image_conf, variant):
         """
         Get a comma separated list of repos. First included are those
-        explicitly listed in config, followed by repos from other variants,
-        finally followed by repo for current variant.
-
-        The `repo_from` key is removed from the dict (if present).
+        explicitly listed in config, followed by repo for current variant
+        if it's not included in the list already.
         """
-        repo = shortcuts.force_list(image_conf.get('repo', []))
-
-        extras = shortcuts.force_list(image_conf.pop('repo_from', []))
-        if not variant.is_empty:
-            extras.append(variant.uid)
-
-        for extra in extras:
-            v = self.compose.all_variants.get(extra)
-            if not v:
-                raise RuntimeError(
-                    'There is no variant %s to get repo from when building image for %s.'
-                    % (extra, variant.uid))
-            repo.append(translate_path(
-                self.compose,
-                self.compose.paths.compose.os_tree('$arch', v, create_dir=False)))
-
-        return ",".join(repo)
+        repos = shortcuts.force_list(image_conf.get('repo', []))
+        if not variant.is_empty and variant.uid not in repos:
+            repos.append(variant.uid)
+        return ",".join(get_repo_urls(self.compose, repos, arch='$arch'))

     def _get_arches(self, image_conf, arches):
         if 'arches' in image_conf['image-build']:
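The rewritten `_get_repo` delegates to a new `pungi.util.get_repo_urls` helper whose implementation is not shown in this diff. The behavior the phases rely on can be sketched as follows; the signature is simplified and `translate_to_url` is a stand-in for translating a variant UID into its compose repo URL:

```python
# Sketch of get_repo_urls behavior: resolve a mixed list of plain URLs and
# variant UIDs into a flat list of URLs.

def get_repo_urls(repos, translate_to_url):
    urls = []
    for repo in repos:
        if "://" in repo:
            urls.append(repo)  # already a URL, keep as-is
        else:
            urls.append(translate_to_url(repo))  # variant UID -> compose URL
    return urls
```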
@@ -28,7 +28,7 @@ from pungi.wrappers.kojiwrapper import KojiWrapper
 from pungi.wrappers import iso
 from pungi.phases import base
 from pungi.util import get_arch_variant_data, makedirs, get_mtime, get_file_size, failable
-from pungi.util import translate_path
+from pungi.util import get_repo_urls


 # HACK: define cmp in python3
@@ -44,29 +44,12 @@ class LiveImagesPhase(base.PhaseLoggerMixin, base.ImageConfigMixin, base.ConfigGuardedPhase):
         super(LiveImagesPhase, self).__init__(compose)
         self.pool = ThreadPool(logger=self.logger)

-    def _get_extra_repos(self, arch, variant, extras):
-        repo = []
-        for extra in extras:
-            v = self.compose.all_variants.get(extra)
-            if not v:
-                raise RuntimeError(
-                    'There is no variant %s to get repo from when building live image for %s.'
-                    % (extra, variant.uid))
-            repo.append(translate_path(
-                self.compose, self.compose.paths.compose.repository(arch, v, create_dir=False)))
-
-        return repo
-
     def _get_repos(self, arch, variant, data):
         repos = []
         if not variant.is_empty:
-            repos.append(translate_path(
-                self.compose, self.compose.paths.compose.repository(arch, variant, create_dir=False)))
-
-        # additional repos
-        repos.extend(data.get("repo", []))
-        repos.extend(self._get_extra_repos(arch, variant, force_list(data.get('repo_from', []))))
-        return repos
+            repos.append(variant.uid)
+        repos.extend(force_list(data.get('repo', [])))
+        return get_repo_urls(self.compose, repos, arch=arch)

     def run(self):
         symlink_isos_to = self.compose.conf.get("symlink_isos_to")
@@ -5,7 +5,7 @@ import time
 from kobo import shortcuts

 from pungi.util import get_variant_data, makedirs, get_mtime, get_file_size, failable
-from pungi.util import translate_path
+from pungi.util import translate_path, get_repo_urls
 from pungi.phases.base import ConfigGuardedPhase, ImageConfigMixin, PhaseLoggerMixin
 from pungi.linker import Linker
 from pungi.wrappers.kojiwrapper import KojiWrapper
@@ -23,29 +23,16 @@ class LiveMediaPhase(PhaseLoggerMixin, ImageConfigMixin, ConfigGuardedPhase):

     def _get_repos(self, image_conf, variant):
         """
-        Get a comma separated list of repos. First included are those
-        explicitly listed in config, followed by repos from other variants,
-        finally followed by repo for current variant.
-
-        The `repo_from` key is removed from the dict (if present).
+        Get a list of repo urls. First included are those explicitly listed in config,
+        followed by repo for current variant if it's not present in the list.
         """
-        repo = shortcuts.force_list(image_conf.get('repo', []))
-
-        extras = shortcuts.force_list(image_conf.pop('repo_from', []))
-        if not variant.is_empty:
-            extras.append(variant.uid)
-
-        for extra in extras:
-            v = self.compose.all_variants.get(extra)
-            if not v:
-                raise RuntimeError(
-                    'There is no variant %s to get repo from when building live media for %s.'
-                    % (extra, variant.uid))
-            repo.append(translate_path(
-                self.compose,
-                self.compose.paths.compose.repository('$basearch', v, create_dir=False)))
-
-        return repo
+        repos = shortcuts.force_list(image_conf.get('repo', []))
+        if not variant.is_empty:
+            if variant.uid not in repos:
+                repos.append(variant.uid)
+        return get_repo_urls(self.compose, repos)

     def _get_arches(self, image_conf, arches):
         if 'arches' in image_conf:
@@ -53,13 +53,11 @@ class OSBSThread(WorkerThread):
         source = util.resolve_git_url(config.pop('url'))
         target = config.pop('target')
         priority = config.pop('priority', None)
-        repos = shortcuts.force_list(config.pop('repo', []))
         gpgkey = config.pop('gpgkey', None)
-        compose_repos = [self._get_repo(compose, v, gpgkey=gpgkey)
-                         for v in [variant.uid] + shortcuts.force_list(
-                             config.pop('repo_from', []))]
+        repos = [self._get_repo(compose, v, gpgkey=gpgkey)
+                 for v in [variant.uid] + shortcuts.force_list(config.pop('repo', []))]

-        config['yum_repourls'] = compose_repos + repos
+        config['yum_repourls'] = repos

         task_id = koji.koji_proxy.buildContainer(source, target, config,
                                                  priority=priority)
@@ -120,17 +118,21 @@ class OSBSThread(WorkerThread):
         self.pool.metadata.setdefault(
             variant.uid, {}).setdefault(arch, []).append(data)

-    def _get_repo(self, compose, variant_uid, gpgkey=None):
+    def _get_repo(self, compose, repo, gpgkey=None):
         """
-        Write a .repo file pointing to current variant and return URL to the
-        file.
+        Return repo file URL of repo, if repo contains "://", it's already
+        a URL of repo file. Or it's a variant UID, then write a .repo file
+        pointing to current variant and return the URL to .repo file.
         """
+        if "://" in repo:
+            return repo
+
         try:
-            variant = compose.all_variants[variant_uid]
+            variant = compose.all_variants[repo]
         except KeyError:
             raise RuntimeError(
                 'There is no variant %s to get repo from to pass to OSBS.'
-                % (variant_uid))
+                % (repo))
         os_tree = compose.paths.compose.os_tree('$basearch', variant,
                                                 create_dir=False)
         repo_file = os.path.join(compose.paths.work.tmp_dir(None, variant),
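The net effect of the OSBS changes above is that `yum_repourls` is built from the current variant followed by the configured `repo` entries, each resolved by `_get_repo` (URLs pass through, variant UIDs get a generated `.repo` file). A simplified model of that ordering, with `resolve` standing in for `_get_repo`:

```python
# Current variant first, then configured repos, each resolved to a URL.
def yum_repourls(variant_uid, repo, resolve):
    return [resolve(r) for r in [variant_uid] + list(repo)]
```

This matches the order documented in the example config: Server (current variant), Everything, then the external `.repo` URL.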
@@ -3,12 +3,13 @@
 import copy
 import json
 import os

+from kobo import shortcuts
 from kobo.threads import ThreadPool, WorkerThread

 from .base import ConfigGuardedPhase
 from .. import util
 from ..ostree.utils import get_ref_from_treefile, get_commitid_from_commitid_file
-from ..util import translate_path
+from ..util import get_repo_dicts
 from ..wrappers import kojiwrapper, scm
@@ -45,41 +46,16 @@ class OSTreeThread(WorkerThread):
         self.logdir = compose.paths.log.topdir('%s/%s/ostree-%d' %
                                                (arch, variant.uid, self.num))
         repodir = os.path.join(workdir, 'config_repo')

-        source_variant = compose.all_variants[config['repo_from']]
-        source_repo = translate_path(compose,
-                                     compose.paths.compose.repository('$basearch',
-                                                                      source_variant,
-                                                                      create_dir=False))
-
         self._clone_repo(repodir, config['config_url'], config.get('config_branch', 'master'))

-        source_repos = [{'name': '%s-%s' % (compose.compose_id, config['repo_from']),
-                         'baseurl': source_repo}]
-
-        extra_source_repos = config.get('repo', None)
-        if extra_source_repos:
-            for extra in extra_source_repos:
-                baseurl = extra['baseurl']
-                if "://" not in baseurl:
-                    # it's variant UID, translate to url
-                    variant = compose.variants[baseurl]
-                    url = translate_path(compose,
-                                         compose.paths.compose.repository('$basearch',
-                                                                          variant,
-                                                                          create_dir=False))
-                    extra['baseurl'] = url
-
-            source_repos = source_repos + extra_source_repos
-
+        repos = get_repo_dicts(compose, shortcuts.force_list(config['repo']))

         # copy the original config and update before save to a json file
         new_config = copy.copy(config)

         # repos in configuration can have repo url set to variant UID,
         # update it to have the actual url that we just translated.
-        new_config.update({'repo_from': source_repo})
-        if extra_source_repos:
-            new_config.update({'repo': extra_source_repos})
-
+        new_config.update({'repo': repos})
         # remove unnecessary (for 'pungi-make-ostree tree' script ) elements
         # from config, it doesn't hurt to have them, however remove them can
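With this hunk, the single `repo` option accepts a plain URL, a variant UID, or a repo dict carrying extra options, and `get_repo_dicts` normalizes all three. A minimal standalone sketch of that acceptance logic (the `force_list` helper mimics `kobo.shortcuts.force_list`, and `classify` is an illustrative name, not part of Pungi):

```python
# Sketch of the input shapes the unified 'repo' option accepts; the
# "'://' in value" check mirrors how the phase code tells URLs from
# variant UIDs. Names here are illustrative only.

def force_list(value):
    # Wrap a scalar in a list so 'repo' may be given as a single string.
    return value if isinstance(value, list) else [value]

def classify(repo):
    # A repo item may be a plain URL, a variant UID, or a dict with 'baseurl'.
    if isinstance(repo, dict):
        if 'baseurl' not in repo:
            raise RuntimeError('Baseurl is required in repo dict %s' % repo)
        repo = repo['baseurl']
    return 'url' if '://' in repo else 'variant'

config = {'repo': ['http://example.com/repo',
                   'Everything',
                   {'baseurl': 'Server', 'exclude': 'systemd-container'}]}

kinds = [classify(r) for r in force_list(config['repo'])]
```

The dict form is what lets the ostree phase carry per-repo options such as `exclude` alongside the baseurl.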
@@ -88,13 +64,11 @@ class OSTreeThread(WorkerThread):
                   'failable', 'version', 'update_summary']:
             new_config.pop(k, None)

-        extra_config_file = None
-        if new_config:
-            # write a json file to save the configuration, so 'pungi-make-ostree tree'
-            # can take use of it
-            extra_config_file = os.path.join(workdir, 'extra_config.json')
-            with open(extra_config_file, 'w') as f:
-                json.dump(new_config, f, indent=4)
+        # write a json file to save the configuration, so 'pungi-make-ostree tree'
+        # can take use of it
+        extra_config_file = os.path.join(workdir, 'extra_config.json')
+        with open(extra_config_file, 'w') as f:
+            json.dump(new_config, f, indent=4)

         # Ensure target directory exists, otherwise Koji task will fail to
         # mount it.
@@ -9,7 +9,7 @@ from kobo import shortcuts

 from .base import ConfigGuardedPhase, PhaseLoggerMixin
 from .. import util
-from ..util import get_volid, translate_path
+from ..util import get_volid, get_repo_urls
 from ..wrappers import kojiwrapper, iso, lorax, scm


@@ -45,10 +45,8 @@ class OstreeInstallerThread(WorkerThread):
         self.pool.log_info('[BEGIN] %s' % msg)
         self.logdir = compose.paths.log.topdir('%s/%s/ostree_installer-%s' % (arch, variant, self.num))

-        source_from_repos = [self._get_source_repo(compose, arch, v)
-                             for v in shortcuts.force_list(config['repo_from'])]
-        repos = shortcuts.force_list(config.pop('repo', []))
-        source_repos = source_from_repos + repos
+        repos = get_repo_urls(compose, shortcuts.force_list(config['repo']), arch=arch)
+        repos = [url.replace('$arch', arch) for url in repos]
         output_dir = os.path.join(compose.paths.work.topdir(arch), variant.uid, 'ostree_installer')
         util.makedirs(os.path.dirname(output_dir))

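The installer now resolves every `repo` entry to a URL and then substitutes `$arch`. A small sketch of that second step (starting from already-resolved URLs, since `get_repo_urls` itself needs a compose object; `substitute_arch` is an illustrative name):

```python
# Sketch of the $arch substitution the installer applies after resolving
# repo entries to URLs. Note that '$basearch' does not contain the
# substring '$arch', so yum-style $basearch placeholders survive intact.

def substitute_arch(urls, arch):
    # Mirror of: repos = [url.replace('$arch', arch) for url in repos]
    return [url.replace('$arch', arch) for url in urls]

resolved = ['http://example.com/repo/$arch/',
            'http://example.com/Everything/$basearch/os']
repos = substitute_arch(resolved, 'x86_64')
```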
@@ -57,25 +55,13 @@ class OstreeInstallerThread(WorkerThread):
         disc_type = compose.conf['disc_types'].get('ostree', 'ostree')

         volid = get_volid(compose, arch, variant, disc_type=disc_type)
-        task_id = self._run_ostree_cmd(compose, variant, arch, config, source_repos, output_dir, volid)
+        task_id = self._run_ostree_cmd(compose, variant, arch, config, repos, output_dir, volid)

         filename = compose.get_image_name(arch, variant, disc_type=disc_type)
         self._copy_image(compose, variant, arch, filename, output_dir)
         self._add_to_manifest(compose, variant, arch, filename)
         self.pool.log_info('[DONE ] %s, (task id: %s)' % (msg, task_id))

-    def _get_source_repo(self, compose, arch, source):
-        """
-        If `source` is a URL, return it as-is (possibly replacing $arch with
-        actual arch. Otherwise treat is a a variant name and return path to
-        repo in that variant.
-        """
-        if '://' in source:
-            return source.replace('$arch', arch)
-        source_variant = compose.all_variants[source]
-        return translate_path(
-            compose, compose.paths.compose.repository(arch, source_variant, create_dir=False))
-
     def _clone_templates(self, url, branch='master'):
         if not url:
             self.template_dir = None
103 pungi/util.py
@@ -17,6 +17,7 @@
 import subprocess
 import os
 import shutil
+import string
 import sys
 import hashlib
 import errno
@@ -658,3 +659,105 @@ def translate_path(compose, path):
             return normpath.replace(prefix, newvalue, 1)

     return normpath
+
+
+def get_repo_url(compose, repo, arch='$basearch'):
+    """
+    Convert repo to repo URL.
+
+    @param compose - required for access to variants
+    @param repo - string or a dict which at least contains 'baseurl' key
+    @param arch - string to be used as arch in repo url
+    """
+    if isinstance(repo, dict):
+        try:
+            repo = repo['baseurl']
+        except KeyError:
+            raise RuntimeError('Baseurl is required in repo dict %s' % str(repo))
+    if '://' not in repo:
+        # this is a variant name
+        v = compose.all_variants.get(repo)
+        if not v:
+            raise RuntimeError('There is no variant %s to get repo from.' % repo)
+        repo = translate_path(compose, compose.paths.compose.repository(arch, v, create_dir=False))
+    return repo
+
+
+def get_repo_urls(compose, repos, arch='$basearch'):
+    """
+    Convert repos to a list of repo URLs.
+
+    @param compose - required for access to variants
+    @param repos - list of string or dict, if item is a dict, key 'baseurl' is required
+    @param arch - string to be used as arch in repo url
+    """
+    urls = []
+    for repo in repos:
+        repo = get_repo_url(compose, repo, arch=arch)
+        urls.append(repo)
+    return urls
+
+
+def _translate_url_to_repo_id(url):
+    """
+    Translate url to valid repo id by replacing any invalid char to '_'.
+    """
+    _REPOID_CHARS = string.ascii_letters + string.digits + '-_.:'
+    return ''.join([s if s in list(_REPOID_CHARS) else '_' for s in url])
+
+
+def get_repo_dict(compose, repo, arch='$basearch'):
+    """
+    Convert repo to a dict of repo options.
+
+    If repo is a string, translate it to repo url if necessary (when it's
+    not a url), and set it as 'baseurl' in result dict, also generate
+    a repo id/name as 'name' key in result dict.
+    If repo is a dict, translate value of 'baseurl' key to url if necessary,
+    if 'name' key is missing in the dict, generate one for it.
+
+    @param compose - required for access to variants
+    @param repo - A string or dict, if it is a dict, key 'baseurl' is required
+    @param arch - string to be used as arch in repo url
+    """
+    repo_dict = {}
+    if isinstance(repo, dict):
+        url = repo['baseurl']
+        name = repo.get('name', None)
+        if '://' in url:
+            if name is None:
+                name = _translate_url_to_repo_id(url)
+        else:
+            # url is variant uid
+            if name is None:
+                name = '%s-%s' % (compose.compose_id, url)
+            url = get_repo_url(compose, url, arch=arch)
+        repo['name'] = name
+        repo['baseurl'] = url
+        return repo
+    else:
+        # repo is normal url or variant uid
+        repo_dict = {}
+        if '://' in repo:
+            repo_dict['name'] = _translate_url_to_repo_id(repo)
+            repo_dict['baseurl'] = repo
+        else:
+            repo_dict['name'] = '%s-%s' % (compose.compose_id, repo)
+            repo_dict['baseurl'] = get_repo_url(compose, repo)
+
+    return repo_dict
+
+
+def get_repo_dicts(compose, repos, arch='$basearch'):
+    """
+    Convert repos to a list of repo dicts.
+
+    @param compose - required for access to variants
+    @param repo - A list of string or dict, if item is a dict, key 'baseurl' is required
+    @param arch - string to be used as arch in repo url
+    """
+    repo_dicts = []
+    for repo in repos:
+        repo_dict = get_repo_dict(compose, repo, arch=arch)
+        repo_dicts.append(repo_dict)
+    return repo_dicts
@ -273,7 +273,7 @@ class OstreeConfigTestCase(ConfigTestCase):
|
|||||||
"x86_64": {
|
"x86_64": {
|
||||||
"treefile": "fedora-atomic-docker-host.json",
|
"treefile": "fedora-atomic-docker-host.json",
|
||||||
"config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
|
"config_url": "https://git.fedorahosted.org/git/fedora-atomic.git",
|
||||||
"repo_from": "Everything",
|
"repo": "Everything",
|
||||||
"ostree_repo": "/mnt/koji/compose/atomic/Rawhide/"
|
"ostree_repo": "/mnt/koji/compose/atomic/Rawhide/"
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
@@ -298,7 +298,7 @@ class OstreeInstallerConfigTestCase(ConfigTestCase):
             ostree_installer=[
                 ("^Atomic$", {
                     "x86_64": {
-                        "repo_from": "Everything",
+                        "repo": "Everything",
                         "release": None,
                         "installpkgs": ["fedora-productimg-atomic"],
                         "add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -326,7 +326,7 @@ class OstreeInstallerConfigTestCase(ConfigTestCase):
             ostree_installer=[
                 ("^Atomic$", {
                     "x86_64": {
-                        "repo_from": "Everything",
+                        "repo": "Everything",
                         "release": None,
                         "installpkgs": ["fedora-productimg-atomic"],
                         "add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -23,8 +23,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything', 'Server-optional'],
+                   'repo': ['http://example.com/repo/', 'Everything', 'Server-optional'],
                    'release': None,
                }
            })
@@ -76,8 +75,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': 'Everything',
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'release': None,
                }
            })
@@ -124,8 +122,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'release': None,
                }
            })
@@ -171,12 +168,10 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': [{
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                }, {
                    'kickstart': 'another.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                }]
            })
        ],
@@ -244,8 +239,7 @@ class TestLiveImagesPhase(PungiTestCase):
                'amd64': {
                    'kickstart': 'test.ks',
                    'ksurl': 'https://git.example.com/kickstarts.git?#HEAD',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'type': 'appliance',
                }
            })
@@ -299,8 +293,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'type': 'appliance',
                }
            })
@@ -354,8 +347,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'type': 'appliance',
                }
            })
@@ -406,8 +398,7 @@ class TestLiveImagesPhase(PungiTestCase):
            ('^Client$', {
                'amd64': {
                    'kickstart': 'test.ks',
-                   'repo': ['http://example.com/repo/'],
-                   'repo_from': ['Everything'],
+                   'repo': ['http://example.com/repo/', 'Everything'],
                    'release': None,
                }
            })
@@ -353,7 +353,7 @@ class TestLiveMediaPhase(PungiTestCase):

         phase = LiveMediaPhase(compose)

-        with self.assertRaisesRegexp(RuntimeError, r'no.+Missing.+when building.+Server'):
+        with self.assertRaisesRegexp(RuntimeError, r'There is no variant Missing to get repo from.'):
             phase.run()

     @mock.patch('pungi.util.resolve_git_url')
@@ -23,7 +23,7 @@ class OSBSPhaseTest(helpers.PungiTestCase):

     @mock.patch('pungi.phases.osbs.ThreadPool')
     def test_run(self, ThreadPool):
-        cfg = mock.Mock()
+        cfg = helpers.IterableMock()
         compose = helpers.DummyCompose(self.topdir, {
             'osbs': {'^Everything$': cfg}
         })
@@ -310,8 +310,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
             'target': 'f24-docker-candidate',
             'name': 'my-name',
             'version': '1.0',
-            'repo': 'http://pkgs.example.com/my.repo',
-            'repo_from': 'Everything',
+            'repo': ['Everything', 'http://pkgs.example.com/my.repo']
         }
         self._setupMock(KojiWrapper, resolve_git_url)
         self._assertConfigCorrect(cfg)
@@ -339,8 +338,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
             'target': 'f24-docker-candidate',
             'name': 'my-name',
             'version': '1.0',
-            'repo': ['http://pkgs.example.com/my.repo'],
-            'repo_from': ['Everything', 'Client'],
+            'repo': ['Everything', 'Client', 'http://pkgs.example.com/my.repo'],
         }
         self._assertConfigCorrect(cfg)
         self._setupMock(KojiWrapper, resolve_git_url)
@@ -370,8 +368,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
             'target': 'f24-docker-candidate',
             'name': 'my-name',
             'version': '1.0',
-            'repo': ['http://pkgs.example.com/my.repo'],
-            'repo_from': ['Everything', 'Client'],
+            'repo': ['Everything', 'Client', 'http://pkgs.example.com/my.repo'],
             'gpgkey': gpgkey,
         }
         self._assertConfigCorrect(cfg)
@@ -389,7 +386,7 @@ class OSBSThreadTest(helpers.PungiTestCase):
             'target': 'f24-docker-candidate',
             'name': 'my-name',
             'version': '1.0',
-            'repo_from': 'Gold',
+            'repo': 'Gold',
         }
         self._assertConfigCorrect(cfg)
         self._setupMock(KojiWrapper, resolve_git_url)
@@ -136,7 +136,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
         self.compose.supported = False
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': '20160321.n.0',
         }
         koji = KojiWrapper.return_value
@@ -172,7 +172,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_file_size, get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'http://example.com/repo/$arch/',
+            'repo': 'http://example.com/repo/$arch/',
             'release': '20160321.n.0',
         }
         koji = KojiWrapper.return_value
@@ -206,9 +206,9 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_file_size, get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
             'release': '20160321.n.0',
             'repo': [
+                'Everything',
                 'https://example.com/extra-repo1.repo',
                 'https://example.com/extra-repo2.repo',
             ],
@@ -244,9 +244,10 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_file_size, get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': ['Everything', 'Server'],
             'release': '20160321.n.0',
             'repo': [
+                'Everything',
+                'Server',
                 'https://example.com/extra-repo1.repo',
                 'https://example.com/extra-repo2.repo',
             ],
@@ -284,7 +285,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': '20160321.n.0',
             'add_template': ['some-file.txt'],
         }
@@ -317,7 +318,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_dir_from_scm):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': '20160321.n.0',
             'add_template': ['some_file.txt'],
             'add_arch_template': ['other_file.txt'],
@@ -365,7 +366,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_file_size, get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': None,
             "installpkgs": ["fedora-productimg-atomic"],
             "add_template": ["/spin-kickstarts/atomic-installer/lorax-configure-repo.tmpl"],
@@ -426,7 +427,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': None,
             'failable': ['x86_64']
         }
@@ -452,7 +453,7 @@ class OstreeThreadTest(helpers.PungiTestCase):
                                       get_file_size, get_mtime, ImageCls, run):
         pool = mock.Mock()
         cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'release': None,
             'failable': ['*'],
         }
@@ -51,7 +51,7 @@ class OSTreeThreadTest(helpers.PungiTestCase):
         self.repo = os.path.join(self.topdir, 'place/for/atomic')
         os.makedirs(os.path.join(self.repo, 'refs', 'heads'))
         self.cfg = {
-            'repo_from': 'Everything',
+            'repo': 'Everything',
             'config_url': 'https://git.fedorahosted.org/git/fedora-atomic.git',
             'config_branch': 'f24',
             'treefile': 'fedora-atomic-docker-host.json',
@@ -305,8 +305,8 @@ class OSTreeThreadTest(helpers.PungiTestCase):
         koji.run_runroot_cmd.side_effect = self._mock_runroot(0)

         cfg = {
-            'repo_from': 'Everything',
             'repo': [
+                'Everything',
                 {
                     'name': 'repo_a',
                     'baseurl': 'http://url/to/repo/a',
@@ -333,9 +333,10 @@ class OSTreeThreadTest(helpers.PungiTestCase):
         self.assertTrue(os.path.isfile(extra_config_file))
         extra_config = json.load(open(extra_config_file, 'r'))
         self.assertTrue(extra_config.get('keep_original_sources', False))
-        self.assertEqual(extra_config.get('repo_from', None), 'http://example.com/Everything/$basearch/os')
         self.assertEqual(len(extra_config.get('repo', [])), len(cfg['repo']))
         self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://example.com/Server/$basearch/os')
+        self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://url/to/repo/a')
+        self.assertEqual(extra_config.get('repo').pop()['baseurl'], 'http://example.com/Everything/$basearch/os')

 if __name__ == '__main__':
     unittest.main()
@@ -156,8 +156,11 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):

         extra_config_file = os.path.join(self.topdir, 'extra_config.json')
         extra_config = {
-            "repo_from": "http://www.example.com/Server.repo",
             "repo": [
+                {
+                    "name": "server",
+                    "baseurl": "http://www.example.com/Server/repo",
+                },
                 {
                     "name": "optional",
                     "baseurl": "http://example.com/repo/x86_64/optional",
@@ -180,14 +183,14 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
             '--extra-config=%s' % extra_config_file,
         ])

-        source_repo_from_name = "source_repo_from-%s" % timestamp
-        source_repo_from_repo = os.path.join(configdir, "%s.repo" % source_repo_from_name)
-        self.assertTrue(os.path.isfile(source_repo_from_repo))
-        with open(source_repo_from_repo, 'r') as f:
+        server_repo_name = "server-%s" % timestamp
+        server_repo = os.path.join(configdir, "%s.repo" % server_repo_name)
+        self.assertTrue(os.path.isfile(server_repo))
+        with open(server_repo, 'r') as f:
             content = f.read()
-        self.assertIn("[%s]" % source_repo_from_name, content)
-        self.assertIn("name=%s" % source_repo_from_name, content)
-        self.assertIn("baseurl=http://www.example.com/Server.repo", content)
+        self.assertIn("[%s]" % server_repo_name, content)
+        self.assertIn("name=%s" % server_repo_name, content)
+        self.assertIn("baseurl=http://www.example.com/Server/repo", content)
         self.assertIn("gpgcheck=0", content)

         optional_repo_name = "optional-%s" % timestamp
@@ -213,7 +216,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
         treeconf = json.load(open(treefile, 'r'))
         repos = treeconf['repos']
         self.assertEqual(len(repos), 3)
-        for name in [source_repo_from_name, optional_repo_name, extra_repo_name]:
+        for name in [server_repo_name, optional_repo_name, extra_repo_name]:
             self.assertIn(name, repos)

     @mock.patch('pungi.ostree.utils.datetime')
@@ -230,8 +233,11 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):

         extra_config_file = os.path.join(self.topdir, 'extra_config.json')
         extra_config = {
-            "repo_from": "http://www.example.com/Server.repo",
             "repo": [
+                {
+                    "name": "server",
+                    "baseurl": "http://www.example.com/Server/repo",
+                },
                 {
                     "name": "optional",
                     "baseurl": "http://example.com/repo/x86_64/optional",
@@ -255,7 +261,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
             '--extra-config=%s' % extra_config_file,
         ])

-        source_repo_from_name = "source_repo_from-%s" % timestamp
+        server_repo_name = "server-%s" % timestamp
         optional_repo_name = "optional-%s" % timestamp
         extra_repo_name = "extra-%s" % timestamp

@@ -263,7 +269,7 @@ class OstreeTreeScriptTest(helpers.PungiTestCase):
         repos = treeconf['repos']
         self.assertEqual(len(repos), 6)
         for name in ['fedora-rawhide', 'fedora-24', 'fedora-23',
-                     source_repo_from_name, optional_repo_name, extra_repo_name]:
+                     server_repo_name, optional_repo_name, extra_repo_name]:
             self.assertIn(name, repos)

@@ -540,5 +540,107 @@ class TranslatePathTestCase(unittest.TestCase):
         self.assertEqual(ret, '/mnt/fedora_koji/compose/rawhide/XYZ')


+class GetRepoFuncsTestCase(unittest.TestCase):
+    @mock.patch('pungi.compose.ComposeInfo')
+    def setUp(self, ci):
+        self.tmp_dir = tempfile.mkdtemp()
+        conf = {
+            'translate_paths': [(self.tmp_dir, 'http://example.com')]
+        }
+        ci.return_value.compose.respin = 0
+        ci.return_value.compose.id = 'RHEL-8.0-20180101.n.0'
+        ci.return_value.compose.date = '20160101'
+        ci.return_value.compose.type = 'nightly'
+        ci.return_value.compose.type_suffix = '.n'
+        ci.return_value.compose.label = 'RC-1.0'
+        ci.return_value.compose.label_major_version = '1'
+
+        compose_dir = os.path.join(self.tmp_dir, ci.return_value.compose.id)
+        self.compose = compose.Compose(conf, compose_dir)
+        server_variant = mock.Mock(uid='Server', type='variant')
+        client_variant = mock.Mock(uid='Client', type='variant')
+        self.compose.all_variants = {
+            'Server': server_variant,
+            'Client': client_variant,
+        }
+
+    def tearDown(self):
+        shutil.rmtree(self.tmp_dir)
+
+    def test_get_repo_url_from_normal_url(self):
+        url = util.get_repo_url(self.compose, 'http://example.com/repo')
+        self.assertEqual(url, 'http://example.com/repo')
+
+    def test_get_repo_url_from_variant_uid(self):
+        url = util.get_repo_url(self.compose, 'Server')
+        self.assertEqual(url, 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os')
+
+    def test_get_repo_url_from_repo_dict(self):
+        repo = {'baseurl': 'http://example.com/repo'}
+        url = util.get_repo_url(self.compose, repo)
+        self.assertEqual(url, 'http://example.com/repo')
+
+        repo = {'baseurl': 'Server'}
+        url = util.get_repo_url(self.compose, repo)
+        self.assertEqual(url, 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os')
+
+    def test_get_repo_urls(self):
+        repos = [
+            'http://example.com/repo',
+            'Server',
+            {'baseurl': 'Client'},
+            {'baseurl': 'ftp://example.com/linux/repo'},
+        ]
+
+        expect = [
+            'http://example.com/repo',
+            'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os',
+            'http://example.com/RHEL-8.0-20180101.n.0/compose/Client/$basearch/os',
+            'ftp://example.com/linux/repo',
+        ]
+
+        self.assertEqual(util.get_repo_urls(self.compose, repos), expect)
+
+    def test_get_repo_dict_from_normal_url(self):
+        repo_dict = util.get_repo_dict(self.compose, 'http://example.com/repo')
+        expect = {'name': 'http:__example.com_repo', 'baseurl': 'http://example.com/repo'}
+        self.assertEqual(repo_dict, expect)
|
||||||
|
|
||||||
|
def test_get_repo_dict_from_variant_uid(self):
|
||||||
|
repo_dict = util.get_repo_dict(self.compose, 'Server')
|
||||||
|
expect = {
|
||||||
|
'name': "%s-%s" % (self.compose.compose_id, 'Server'),
|
||||||
|
'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os',
|
||||||
|
}
|
||||||
|
self.assertEqual(repo_dict, expect)
|
||||||
|
|
||||||
|
def test_get_repo_dict_from_repo_dict(self):
|
||||||
|
repo = {'baseurl': 'Server'}
|
||||||
|
expect = {
|
||||||
|
'name': '%s-%s' % (self.compose.compose_id, 'Server'),
|
||||||
|
'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os'
|
||||||
|
}
|
||||||
|
repo_dict = util.get_repo_dict(self.compose, repo)
|
||||||
|
self.assertEqual(repo_dict, expect)
|
||||||
|
|
||||||
|
def test_get_repo_dicts(self):
|
||||||
|
repos = [
|
||||||
|
'http://example.com/repo',
|
||||||
|
'Server',
|
||||||
|
{'baseurl': 'Client'},
|
||||||
|
{'baseurl': 'ftp://example.com/linux/repo'},
|
||||||
|
{'name': 'testrepo', 'baseurl': 'ftp://example.com/linux/repo'},
|
||||||
|
]
|
||||||
|
expect = [
|
||||||
|
{'name': 'http:__example.com_repo', 'baseurl': 'http://example.com/repo'},
|
||||||
|
{'name': '%s-%s' % (self.compose.compose_id, 'Server'), 'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Server/$basearch/os'},
|
||||||
|
{'name': '%s-%s' % (self.compose.compose_id, 'Client'), 'baseurl': 'http://example.com/RHEL-8.0-20180101.n.0/compose/Client/$basearch/os'},
|
||||||
|
{'name': 'ftp:__example.com_linux_repo', 'baseurl': 'ftp://example.com/linux/repo'},
|
||||||
|
{'name': 'testrepo', 'baseurl': 'ftp://example.com/linux/repo'},
|
||||||
|
]
|
||||||
|
repos = util.get_repo_dicts(self.compose, repos)
|
||||||
|
self.assertEqual(repos, expect)
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
if __name__ == "__main__":
|
||||||
unittest.main()
|
unittest.main()
|
||||||
|
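The new tests pin down how the unified `repo` option is resolved: a value may be a plain URL, a variant UID, or a repo dict carrying `baseurl` (which itself may be a URL or a variant UID). A minimal standalone sketch of that resolution logic is below; it is an illustration of the behavior the tests assert, not the real `pungi.util` implementation, and the function signatures (`compose_path_url`, `all_variants`, `compose_id` as explicit arguments) are invented for the example:

```python
def get_repo_url(compose_path_url, all_variants, repo, arch='$basearch'):
    """Resolve a repo spec (URL string, variant UID, or dict with
    'baseurl') to a repo URL. Simplified sketch, not pungi.util."""
    if isinstance(repo, dict):
        # A repo dict must have 'baseurl'; it may itself be a variant UID.
        repo = repo['baseurl']
    if repo in all_variants:
        # Variant UID: point at that variant's tree inside the compose.
        return '%s/compose/%s/%s/os' % (compose_path_url, repo, arch)
    # Anything else is treated as an ordinary URL and passed through.
    return repo


def get_repo_dict(compose_id, compose_path_url, all_variants, repo):
    """Normalize a repo spec into {'name': ..., 'baseurl': ...},
    keeping any extra options (e.g. 'exclude') from a repo dict."""
    repo_dict = dict(repo) if isinstance(repo, dict) else {'baseurl': repo}
    baseurl = repo_dict['baseurl']
    if 'name' not in repo_dict:
        if baseurl in all_variants:
            # Variant UID: derive the name from the compose id.
            repo_dict['name'] = '%s-%s' % (compose_id, baseurl)
        else:
            # Plain URL: mangle it into a usable repo name.
            repo_dict['name'] = baseurl.replace('/', '_')
    repo_dict['baseurl'] = get_repo_url(compose_path_url, all_variants, baseurl)
    return repo_dict
```

With `all_variants = {'Server', 'Client'}` and `compose_path_url = 'http://example.com/RHEL-8.0-20180101.n.0'`, this sketch reproduces the expectations above, e.g. `'Server'` resolving to `.../compose/Server/$basearch/os` and `'http://example.com/repo'` getting the name `'http:__example.com_repo'`.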