Merged update from upstream sources

This is an automated DistroBaker update from upstream sources.
If you do not know what this is about or would like to opt out,
contact the OSCI team.

Source: https://src.fedoraproject.org/rpms/pyproject-rpm-macros.git#26bb3cb4d123a7f57df5ec56b17ccbf3d415c505
This commit is contained in:
DistroBaker 2021-02-04 13:32:12 +00:00
parent ab4eb9e25e
commit 795a9b3332
33 changed files with 19512 additions and 399 deletions

README.md

@ -1,27 +1,54 @@
pyproject RPM macros
====================
This is a provisional implementation of pyproject RPM macros for Fedora.
These macros allow projects that follow the Python [packaging specifications]
to be packaged as RPMs.
These macros are useful for packaging Python projects that use the [PEP 517] `pyproject.toml` file, which specifies the package's build dependencies (including the build system, such as setuptools, flit or poetry).
They are still *provisional*: we can make non-backwards-compatible changes to
the API.
Please subscribe to Fedora's [python-devel list] if you use the macros.
They work for:
* traditional Setuptools-based projects that use the `setup.py` file,
* newer Setuptools-based projects that have a `setup.cfg` file,
* general Python projects that use the [PEP 517] `pyproject.toml` file (which allows using any build system, such as setuptools, flit or poetry).
These macros replace `%py3_build` and `%py3_install`, which only work with `setup.py`.
[packaging specifications]: https://packaging.python.org/specifications/
[python-devel list]: https://lists.fedoraproject.org/archives/list/python-devel@lists.fedoraproject.org/
Usage
-----
If your upstream sources include `pyproject.toml` and you want to use these macros, BuildRequire them:
To use these macros, first BuildRequire them:
BuildRequires: pyproject-rpm-macros
This will bring in python3-devel, so you don't need to require python3-devel explicitly.
Also BuildRequire the devel package for the Python you are building against.
In Fedora, that's `python3-devel`.
(In the future, we plan to make `python3-devel` itself require
`pyproject-rpm-macros`.)
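For orientation, here is a hypothetical skeleton showing where the macros described in the following sections are placed (the module name `example` is made up; a real spec also needs Name, Version, %prep, %description and so on):

BuildRequires:  pyproject-rpm-macros

%generate_buildrequires
%pyproject_buildrequires -r

%build
%pyproject_wheel

%install
%pyproject_install
%pyproject_save_files example

%files -f %{pyproject_files}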
In order to get automatic build dependencies on Fedora 31+, run `%pyproject_buildrequires` in the `%generate_buildrequires` section:
Next, you need to generate more build dependencies (of your projects and
the macros themselves) by running `%pyproject_buildrequires` in the
`%generate_buildrequires` section:
%generate_buildrequires
%pyproject_buildrequires
Only build dependencies according to [PEP 517] and [PEP 518] will be added.
All other build dependencies (such as non-Python libraries or test dependencies) still need to be specified manually.
This will add build dependencies according to [PEP 517] and [PEP 518].
To also add run-time and test-time dependencies, see the section below.
If you need more dependencies, such as non-Python libraries, BuildRequire
them manually.
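For instance, non-Python build dependencies are still written out by hand next to the macro BuildRequires (the package names below are only illustrative):

# non-Python build dependencies are not generated automatically
BuildRequires:  gcc
BuildRequires:  libffi-devel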
Note that `%generate_buildrequires` may produce error messages `(exit 11)` in
the build log. This is expected behavior of BuildRequires generators; see
[the Fedora change] for details.
[the Fedora change]: https://fedoraproject.org/wiki/Changes/DynamicBuildRequires
Then, build a wheel in `%build` with `%pyproject_wheel`:
@ -33,7 +60,7 @@ And install the wheel in `%install` with `%pyproject_install`:
%install
%pyproject_install
`%pyproject_install` installs all wheels in `$PWD/pyproject-wheeldir/`. If you would like to save wheels somewhere else redefine `%{_pyproject_wheeldir}`.
`%pyproject_install` installs all wheels in `$PWD/pyproject-wheeldir/`.
Adding run-time and test-time dependencies
@ -41,17 +68,19 @@ Adding run-time and test-time dependencies
To run tests in the `%check` section, the package's runtime dependencies
often need to also be included as build requirements.
If the project's build system supports the [`prepare-metadata-for-build-wheel`
hook](https://www.python.org/dev/peps/pep-0517/#prepare-metadata-for-build-wheel),
this can be done using the `-r` flag:
This can be done using the `-r` flag:
%generate_buildrequires
%pyproject_buildrequires -r
For this to work, the project's build system must support the
[`prepare-metadata-for-build-wheel` hook](https://www.python.org/dev/peps/pep-0517/#prepare-metadata-for-build-wheel).
The popular buildsystems (setuptools, flit, poetry) do support it.
For projects that specify test requirements using an [`extra`
provide](https://packaging.python.org/specifications/core-metadata/#provides-extra-multiple-use),
these can be added using the `-x` flag.
Multiple extras can be supplied as a comma separated list.
Multiple extras can be supplied by repeating the flag or as a comma separated list.
For example, if upstream suggests installing test dependencies with
`pip install mypackage[testing]`, the test deps would be generated by:
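A sketch of the corresponding invocation, reusing the `testing` extra named in the upstream example above:

%generate_buildrequires
%pyproject_buildrequires -x testing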
@ -93,7 +122,8 @@ Running tox based tests
-----------------------
In case you want to run the tests as specified in [tox] configuration,
you can use the `%tox` macro:
you must use `%pyproject_buildrequires` with `-t` or `-e` as explained above.
Then, use the `%tox` macro in `%check`:
%check
%tox
@ -104,7 +134,7 @@ The macro:
- If not defined, sets `$PYTHONPATH` to `%{buildroot}%{python3_sitearch}:%{buildroot}%{python3_sitelib}`
- If not defined, sets `$TOX_TESTENV_PASSENV` to `*`
- Runs `tox` with `-q` (quiet), `--recreate` and `--current-env` (from [tox-current-env]) flags
- Implicitly uses the tox environment name stored in `%{toxenv}` - as overridden by `%pyproject_buildrequires -t`
- Implicitly uses the tox environment name stored in `%{toxenv}` - as overridden by `%pyproject_buildrequires -e`
By using the `-e` flag, you can use a different tox environment(s):
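For example, with a made-up environment name:

%check
%tox -e integration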
@ -126,10 +156,6 @@ Or (note the two sequential `--`s):
%tox -- -- --flag-for-posargs
**Warning:** This macro assumes you have used `%pyproject_buildrequires -t` or `-e`
in `%generate_buildrequires`. If not, you need to add:
BuildRequires: python3dist(tox-current-env)
Generating the %files section
@ -156,7 +182,7 @@ You can use globs in the module names if listing them explicitly would be too te
%pyproject_install
%pyproject_save_files '*requests'
In fully automated environmets, you can use the `*` glob to include all modules (put it in single quotes to prevent Shell from expanding it). In Fedora however, you should always use a more specific glob to avoid accidentally packaging unwanted files (for example, a top level module named `test`).
In fully automated environments, you can use the `*` glob to include all modules (put it in single quotes to prevent Shell from expanding it). In Fedora however, you should always use a more specific glob to avoid accidentally packaging unwanted files (for example, a top level module named `test`).
Speaking of automated environments, some files cannot be classified with `%pyproject_save_files`, but it is possible to list all unclassified files by adding a special `+auto` argument.
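A minimal sketch of such a fully automated invocation (note the quoted glob):

%pyproject_install
%pyproject_save_files '*' +auto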
@ -177,6 +203,46 @@ However, in Fedora packages, always list executables explicitly to avoid uninten
%license LICENSE
%{_bindir}/downloader
`%pyproject_save_files` also automatically recognizes language (`*.mo`) files and marks them with `%lang` macro and appropriate language code.
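For illustration, an entry produced for a translation file might look like this in the generated file list (the module name, language code and path are hypothetical):

%lang(cs) /usr/lib/python3.9/site-packages/mymodule/locale/cs/LC_MESSAGES/mymodule.mo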
Note that `%pyproject_save_files` uses data from the [RECORD file](https://www.python.org/dev/peps/pep-0627/).
If you wish to rename, remove or otherwise change the installed files of a package
*after* `%pyproject_install`, `%pyproject_save_files` might break.
If possible, remove/rename such files in `%prep`.
If not possible, avoid using `%pyproject_save_files` or edit/replace `%{pyproject_files}`.
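A sketch of the recommended approach, with a made-up path, dropping an unwanted file in `%prep` so it never appears in the RECORD:

%prep
%autosetup -p1
# this file would otherwise be picked up by %%pyproject_save_files
rm -f src/example/_vendored_shim.py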
Generating Extras subpackages
-----------------------------
The `%pyproject_extras_subpkg` macro generates simple subpackage(s)
for Python extras.
The macro should be placed after the base package's `%description` to avoid
issues in building the SRPM.
For example, if the `requests` project's metadata defines the extras
`security` and `socks`, the following invocation will generate the subpackage
`python3-requests+security` that provides `python3dist(requests[security])`,
and a similar one for `socks`.
%pyproject_extras_subpkg -n python3-requests security socks
The macro works like `%python_extras_subpkg`,
except the `-i`/`-f`/`-F` arguments are optional and discouraged.
A filelist written by `%pyproject_install` is used by default.
For more information on `%python_extras_subpkg`, see the [Fedora change].
[Fedora change]: https://fedoraproject.org/wiki/Changes/PythonExtras
These arguments are still required:
* -n: name of the “base” package (e.g. python3-requests)
* Positional arguments: the extra name(s).
Multiple subpackages are generated when multiple names are provided.
The macro does nothing on Fedora 32 and lower, as automation around
extras was only added in f33.
Limitations
-----------

macros.pyproject

@ -12,6 +12,7 @@
%pyproject_files %{_builddir}/pyproject-files
%pyproject_ghost_distinfo %{_builddir}/pyproject-ghost-distinfo
%pyproject_record %{_builddir}/pyproject-record
%pyproject_wheel() %{expand:\\\
export TMPDIR="${PWD}/%{_pyproject_builddir}"
@ -26,22 +27,31 @@ specifier=$(ls %{_pyproject_wheeldir}/*.whl | xargs basename --multiple | sed -E
export TMPDIR="${PWD}/%{_pyproject_builddir}"
%{__python3} -m pip install --root %{buildroot} --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links %{_pyproject_wheeldir} $specifier
if [ -d %{buildroot}%{_bindir} ]; then
pathfix%{python3_version}.py -pni "%{__python3}" -k%{?py3_shbang_opts: -a%{py3_shbang_opts_nodash}} %{buildroot}%{_bindir}/*
%py3_shebang_fix %{buildroot}%{_bindir}/*
rm -rfv %{buildroot}%{_bindir}/__pycache__
fi
rm -f %{pyproject_ghost_distinfo}
site_dirs=()
# Process %%{python3_sitelib} if exists
if [ -d %{buildroot}%{python3_sitelib} ]; then
for distinfo in %{buildroot}%{python3_sitelib}/*.dist-info; do
echo "%ghost ${distinfo#%{buildroot}}" >> %{pyproject_ghost_distinfo}
sed -i 's/pip/rpm/' ${distinfo}/INSTALLER
done
site_dirs+=( "%{python3_sitelib}" )
fi
# Process %%{python3_sitearch} if exists and does not equal to %%{python3_sitelib}
if [ %{buildroot}%{python3_sitearch} != %{buildroot}%{python3_sitelib} ] && [ -d %{buildroot}%{python3_sitearch} ]; then
for distinfo in %{buildroot}%{python3_sitearch}/*.dist-info; do
site_dirs+=( "%{python3_sitearch}" )
fi
# Process all *.dist-info dirs in sitelib/sitearch
for site_dir in ${site_dirs[@]}; do
for distinfo in %{buildroot}$site_dir/*.dist-info; do
echo "%ghost ${distinfo#%{buildroot}}" >> %{pyproject_ghost_distinfo}
sed -i 's/pip/rpm/' ${distinfo}/INSTALLER
PYTHONPATH=%{_rpmconfigdir}/redhat \\
%{__python3} -B %{_rpmconfigdir}/redhat/pyproject_preprocess_record.py \\
--buildroot %{buildroot} --record ${distinfo}/RECORD --output %{pyproject_record}
rm -fv ${distinfo}/RECORD
rm -fv ${distinfo}/REQUESTED
done
fi
done
lines=$(wc -l %{pyproject_ghost_distinfo} | cut -f1 -d" ")
if [ $lines -ne 1 ]; then
echo -e "\\n\\nWARNING: %%%%pyproject_extras_subpkg won't work without explicit -i or -F, found $lines dist-info directories.\\n\\n" >/dev/stderr
@ -61,6 +71,7 @@ fi
--sitelib "%{python3_sitelib}" \\
--sitearch "%{python3_sitearch}" \\
--python-version "%{python3_version}" \\
--pyproject-record "%{pyproject_record}" \\
%{*}
}
@ -70,25 +81,34 @@ fi
%pyproject_buildrequires(rxte:) %{expand:\\\
%{-e:%{expand:%global toxenv %{-e*}}}
%{-e:%{expand:%global toxenv %(%{__python3} -s %{_rpmconfigdir}/redhat/pyproject_construct_toxenv.py %{?**})}}
echo 'python%{python3_pkgversion}-devel'
echo 'python%{python3_pkgversion}dist(pip) >= 19'
echo 'python%{python3_pkgversion}dist(packaging)'
echo 'python%{python3_pkgversion}dist(toml)'
# The first part is for cases when %%{python3_version_nodots} is not yet available
if [ ! -z "%{?python3_version_nodots}" ] && [ %{python3_version_nodots} -lt 38 ]; then
echo 'python%{python3_pkgversion}dist(importlib-metadata)'
if [ -f pyproject.toml ]; then
echo 'python%{python3_pkgversion}dist(toml)'
else
# Note: If the default requirements change, also change them in the script!
echo 'python%{python3_pkgversion}dist(setuptools) >= 40.8'
echo 'python%{python3_pkgversion}dist(wheel)'
fi
# Check if we can generate dependencies on Python extras
if [ "%{py_dist_name []}" == "[]" ]; then
extras_flag=%{?!_python_no_extras_requires:--generate-extras}
else
extras_flag=
fi
# setuptools assumes no pre-existing dist-info
rm -rfv *.dist-info/ >&2
if [ -f %{__python3} ]; then
RPM_TOXENV="%{toxenv}" HOSTNAME="rpmbuild" %{__python3} -I %{_rpmconfigdir}/redhat/pyproject_buildrequires.py --python3_pkgversion %{python3_pkgversion} %{?**}
RPM_TOXENV="%{toxenv}" HOSTNAME="rpmbuild" %{__python3} -s %{_rpmconfigdir}/redhat/pyproject_buildrequires.py $extras_flag --python3_pkgversion %{python3_pkgversion} %{?**}
fi
}
%tox(e:) %{expand:\\\
TOX_TESTENV_PASSENV="${TOX_TESTENV_PASSENV:-*}" \\
PYTHONDONTWRITEBYTECODE=1 \\
PATH="%{buildroot}%{_bindir}:$PATH" \\
PYTHONPATH="${PYTHONPATH:-%{buildroot}%{python3_sitearch}:%{buildroot}%{python3_sitelib}}" \\
HOSTNAME="rpmbuild" \\

pyproject-rpm-macros.spec

@ -6,7 +6,7 @@ License: MIT
# Keep the version at zero and increment only release
Version: 0
Release: 24.1%{?dist}
Release: 37%{?dist}
# Macro files
Source001: macros.pyproject
@ -14,6 +14,9 @@ Source001: macros.pyproject
# Implementation files
Source101: pyproject_buildrequires.py
Source102: pyproject_save_files.py
Source103: pyproject_convert.py
Source104: pyproject_preprocess_record.py
Source105: pyproject_construct_toxenv.py
# Tests
Source201: test_pyproject_buildrequires.py
@ -36,24 +39,31 @@ BuildArch: noarch
BuildRequires: python3dist(pytest)
BuildRequires: python3dist(pyyaml)
BuildRequires: python3dist(packaging)
%if 0%{?fedora} < 32
# The %%if should not be needed, it works around:
# https://github.com/rpm-software-management/mock/issues/336
BuildRequires: (python3dist(importlib-metadata) if python3 < 3.8)
%endif
BuildRequires: python3dist(pip)
BuildRequires: python3dist(setuptools)
BuildRequires: python3dist(toml)
BuildRequires: python3dist(tox-current-env) >= 0.0.2
BuildRequires: python3dist(tox-current-env) >= 0.0.3
BuildRequires: python3dist(wheel)
%endif
%description
This is a provisional implementation of pyproject RPM macros for Fedora 30+.
These macros are useful for packaging Python projects that use the PEP 517
pyproject.toml file, which specifies the package's build dependencies
(including the build system, such as setuptools, flit or poetry).
These macros allow projects that follow the Python packaging specifications
to be packaged as RPMs.
They are still provisional: we can make non-backwards-compatible changes to
the API.
Please subscribe to Fedora's python-devel list if you use the macros.
They work for:
* traditional Setuptools-based projects that use the setup.py file,
* newer Setuptools-based projects that have a setup.cfg file,
* general Python projects that use the PEP 517 pyproject.toml file
(which allows using any build system, such as setuptools, flit or poetry).
These macros replace %%py3_build and %%py3_install,
which only work with setup.py.
%prep
@ -70,7 +80,10 @@ mkdir -p %{buildroot}%{_rpmmacrodir}
mkdir -p %{buildroot}%{_rpmconfigdir}/redhat
install -m 644 macros.pyproject %{buildroot}%{_rpmmacrodir}/
install -m 644 pyproject_buildrequires.py %{buildroot}%{_rpmconfigdir}/redhat/
install -m 644 pyproject_convert.py %{buildroot}%{_rpmconfigdir}/redhat/
install -m 644 pyproject_save_files.py %{buildroot}%{_rpmconfigdir}/redhat/
install -m 644 pyproject_preprocess_record.py %{buildroot}%{_rpmconfigdir}/redhat/
install -m 644 pyproject_construct_toxenv.py %{buildroot}%{_rpmconfigdir}/redhat/
%if %{with tests}
%check
@ -82,14 +95,66 @@ export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856
%files
%{_rpmmacrodir}/macros.pyproject
%{_rpmconfigdir}/redhat/pyproject_buildrequires.py
%{_rpmconfigdir}/redhat/pyproject_convert.py
%{_rpmconfigdir}/redhat/pyproject_save_files.py
%{_rpmconfigdir}/redhat/pyproject_preprocess_record.py
%{_rpmconfigdir}/redhat/pyproject_construct_toxenv.py
%doc README.md
%license LICENSE
%changelog
* Wed Dec 09 2020 Petr Šabata <contyk@redhat.com> - 0-24.1
- Downstream workaround for the %%fedora macro
* Tue Feb 02 2021 Miro Hrončok <mhroncok@redhat.com> - 0-37
- Remove support for Python 3.7 from %%pyproject_buildrequires
- Generate python3dist(toml) BR with pyproject.toml earlier to avoid extra install round
- Generate python3dist(setuptools/wheel) BR without pyproject.toml earlier as well
* Wed Jan 27 2021 Fedora Release Engineering <releng@fedoraproject.org> - 0-36
- Rebuilt for https://fedoraproject.org/wiki/Fedora_34_Mass_Rebuild
* Fri Jan 15 2021 Miro Hrončok <mhroncok@redhat.com> - 0-35
- Update the description of the package to match the new README content
* Fri Dec 04 2020 Miro Hrončok <miro@hroncok.cz> - 0-34
- List all files in %%pyproject_files explicitly to avoid duplicate %%lang entries
- If you amend the installed files after %%pyproject_install, %%pyproject_files might break
* Fri Nov 27 2020 Miro Hrončok <mhroncok@redhat.com> - 0-33
- Pass PYTHONDONTWRITEBYTECODE=1 to %%tox to avoid packaged PYTEST bytecode
* Tue Nov 03 2020 Miro Hrončok <mhroncok@redhat.com> - 0-32
- Allow multiple -e in %%pyproject_buildrequires
- Fixes: rhbz#1886509
* Mon Oct 05 2020 Miro Hrončok <mhroncok@redhat.com> - 0-31
- Support PEP 517 list based backend-path
* Tue Sep 29 2020 Lumír Balhar <lbalhar@redhat.com> - 0-30
- Process RECORD files in %%pyproject_install and remove them
- Support the extras configuration option of tox in %%pyproject_buildrequires -t
- Support multiple -x options for %%pyproject_buildrequires
- Fixes: rhbz#1877977
- Fixes: rhbz#1877978
* Wed Sep 23 2020 Miro Hrončok <mhroncok@redhat.com> - 0-29
- Check the requirements after installing "requires_for_build_wheel"
- If not checked, installing runtime requirements might fail
* Tue Sep 08 2020 Gordon Messmer <gordon.messmer@gmail.com> - 0-28
- Support more Python version specifiers in generated BuildRequires
- This adds support for the '~=' operator and wildcards
* Fri Sep 04 2020 Miro Hrončok <miro@hroncok.cz> - 0-27
- Make code in $PWD importable from %%pyproject_buildrequires
- Only require toml for projects with pyproject.toml
- Remove a no longer useful warning for unrecognized files in %%pyproject_save_files
* Mon Aug 24 2020 Tomas Hrnciar <thrnciar@redhat.com> - 0-26
- Implement automatic detection of %%lang files in %%pyproject_save_files
and mark them with %%lang in filelist
* Fri Aug 14 2020 Miro Hrončok <mhroncok@redhat.com> - 0-25
- Handle Python Extras in %%pyproject_buildrequires on Fedora 33+
* Tue Aug 11 2020 Miro Hrončok <mhroncok@redhat.com> - 0-24
- Allow multiple, comma-separated extras in %%pyproject_buildrequires -x

pyproject_buildrequires.py

@ -1,6 +1,6 @@
import os
import sys
import importlib
import importlib.metadata
import argparse
import functools
import traceback
@ -23,18 +23,16 @@ class EndPass(Exception):
try:
import toml
from packaging.requirements import Requirement, InvalidRequirement
from packaging.utils import canonicalize_name, canonicalize_version
try:
import importlib.metadata as importlib_metadata
except ImportError:
import importlib_metadata
except ImportError as e:
print_err('Import error:', e)
# already echoed by the %pyproject_buildrequires macro
sys.exit(0)
# uses packaging, needs to be imported after packaging is verified to be present
from pyproject_convert import convert
@contextlib.contextmanager
def hook_call():
@ -47,19 +45,29 @@ def hook_call():
class Requirements:
"""Requirement printer"""
def __init__(self, get_installed_version, extras='',
python3_pkgversion='3'):
def __init__(self, get_installed_version, extras=None,
generate_extras=False, python3_pkgversion='3'):
self.get_installed_version = get_installed_version
self.extras = set()
if extras:
self.marker_envs = [{'extra': e.strip()} for e in extras.split(',')]
else:
self.marker_envs = [{'extra': ''}]
for extra in extras:
self.add_extras(*extra.split(','))
self.missing_requirements = False
self.generate_extras = generate_extras
self.python3_pkgversion = python3_pkgversion
def add_extras(self, *extras):
self.extras |= set(e.strip() for e in extras)
@property
def marker_envs(self):
if self.extras:
return [{'extra': e} for e in sorted(self.extras)]
return [{'extra': ''}]
def evaluate_all_environamnets(self, requirement):
for marker_env in self.marker_envs:
if requirement.marker.evaluate(environment=marker_env):
@ -86,45 +94,45 @@ class Requirements:
return
try:
# TODO: check if requirements with extras are satisfied
installed = self.get_installed_version(requirement.name)
except importlib_metadata.PackageNotFoundError:
except importlib.metadata.PackageNotFoundError:
print_err(f'Requirement not satisfied: {requirement_str}')
installed = None
if installed and installed in requirement.specifier:
print_err(f'Requirement satisfied: {requirement_str}')
print_err(f' (installed: {requirement.name} {installed})')
if requirement.extras:
print_err(f' (extras are currently not checked)')
else:
self.missing_requirements = True
together = []
for specifier in sorted(
requirement.specifier,
key=lambda s: (s.operator, s.version),
):
version = canonicalize_version(specifier.version)
if not VERSION_RE.fullmatch(str(specifier.version)):
raise ValueError(
f'Unknown character in version: {specifier.version}. '
+ '(This is probably a bug in pyproject-rpm-macros.)',
)
if specifier.operator == '!=':
lower = python3dist(name, '<', version,
self.python3_pkgversion)
higher = python3dist(name, '>', f'{version}.0',
self.python3_pkgversion)
together.append(
f'({lower} or {higher})'
)
else:
together.append(python3dist(name, specifier.operator, version,
self.python3_pkgversion))
if len(together) == 0:
print(python3dist(name,
python3_pkgversion=self.python3_pkgversion))
elif len(together) == 1:
print(together[0])
if self.generate_extras:
extra_names = [f'{name}[{extra}]' for extra in sorted(requirement.extras)]
else:
print(f"({' and '.join(together)})")
extra_names = []
for name in [name] + extra_names:
together = []
for specifier in sorted(
requirement.specifier,
key=lambda s: (s.operator, s.version),
):
version = canonicalize_version(specifier.version)
if not VERSION_RE.fullmatch(str(specifier.version)):
raise ValueError(
f'Unknown character in version: {specifier.version}. '
+ '(This is probably a bug in pyproject-rpm-macros.)',
)
together.append(convert(python3dist(name, python3_pkgversion=self.python3_pkgversion),
specifier.operator, version))
if len(together) == 0:
print(python3dist(name,
python3_pkgversion=self.python3_pkgversion))
elif len(together) == 1:
print(together[0])
else:
print(f"({' with '.join(together)})")
def check(self, *, source=None):
"""End current pass if any unsatisfied dependencies were output"""
@ -144,6 +152,13 @@ def get_backend(requirements):
except FileNotFoundError:
pyproject_data = {}
else:
try:
# lazy import toml here, not needed without pyproject.toml
import toml
except ImportError as e:
print_err('Import error:', e)
# already echoed by the %pyproject_buildrequires macro
sys.exit(0)
with f:
pyproject_data = toml.load(f)
@ -162,6 +177,10 @@ def get_backend(requirements):
# (either directly, or by implicitly invoking the [following] backend).
backend_name = 'setuptools.build_meta:__legacy__'
# Note: For projects without pyproject.toml, this was already echoed
# by the %pyproject_buildrequires macro, but this also handles cases
# with pyproject.toml without a specified build backend.
# If the default requirements change, also change them in the macro!
requirements.add('setuptools >= 40.8', source='default build backend')
requirements.add('wheel', source='default build backend')
@ -169,7 +188,10 @@ def get_backend(requirements):
backend_path = buildsystem_data.get('backend-path')
if backend_path:
sys.path.insert(0, backend_path)
# PEP 517 example shows the path as a list, but some projects don't follow that
if isinstance(backend_path, str):
backend_path = [backend_path]
sys.path = backend_path + sys.path
module_name, _, object_name = backend_name.partition(":")
backend_module = importlib.import_module(module_name)
@ -186,6 +208,7 @@ def generate_build_requirements(backend, requirements):
with hook_call():
new_reqs = get_requires()
requirements.extend(new_reqs, source='get_requires_for_build_wheel')
requirements.check(source='get_requires_for_build_wheel')
def generate_run_requirements(backend, requirements):
@ -218,18 +241,21 @@ def parse_tox_requires_lines(lines):
f'WARNING: Skipping dependency line: {line}\n'
+ f' tox deps options other than -r are not supported (yet).',
)
else:
elif line:
packages.append(line)
return packages
def generate_tox_requirements(toxenv, requirements):
requirements.add('tox-current-env >= 0.0.2', source='tox itself')
toxenv = ','.join(toxenv)
requirements.add('tox-current-env >= 0.0.3', source='tox itself')
requirements.check(source='tox itself')
with tempfile.NamedTemporaryFile('r') as depfile:
with tempfile.NamedTemporaryFile('r') as deps, tempfile.NamedTemporaryFile('r') as extras:
r = subprocess.run(
[sys.executable, '-m', 'tox', '--print-deps-to-file',
depfile.name, '-qre', toxenv],
[sys.executable, '-m', 'tox',
'--print-deps-to', deps.name,
'--print-extras-to', extras.name,
'-qre', toxenv],
check=False,
encoding='utf-8',
stdout=subprocess.PIPE,
@ -239,8 +265,9 @@ def generate_tox_requirements(toxenv, requirements):
print_err(r.stdout, end='')
r.check_returncode()
deplines = depfile.read().splitlines()
deplines = deps.read().splitlines()
packages = parse_tox_requires_lines(deplines)
requirements.add_extras(*extras.read().splitlines())
requirements.extend(packages,
source=f'tox --print-deps-only: {toxenv}')
@ -257,23 +284,24 @@ def python3dist(name, op=None, version=None, python3_pkgversion="3"):
def generate_requires(
*, include_runtime=False, toxenv=None, extras='',
get_installed_version=importlib_metadata.version, # for dep injection
python3_pkgversion="3",
*, include_runtime=False, toxenv=None, extras=None,
get_installed_version=importlib.metadata.version, # for dep injection
generate_extras=False, python3_pkgversion="3",
):
"""Generate the BuildRequires for the project in the current directory
This is the main Python entry point.
"""
requirements = Requirements(
get_installed_version, extras=extras,
get_installed_version, extras=extras or [],
generate_extras=generate_extras,
python3_pkgversion=python3_pkgversion
)
try:
backend = get_backend(requirements)
generate_build_requirements(backend, requirements)
if toxenv is not None:
if toxenv:
include_runtime = True
generate_tox_requirements(toxenv, requirements)
if include_runtime:
@ -291,8 +319,8 @@ def main(argv):
help='Generate run-time requirements',
)
parser.add_argument(
'-e', '--toxenv', metavar='TOXENVS', default=None,
help=('specify tox environments'
'-e', '--toxenv', metavar='TOXENVS', action='append',
help=('specify tox environments (comma separated and/or repeated)'
'(implies --tox)'),
)
parser.add_argument(
@ -301,9 +329,13 @@ def main(argv):
'(implies --runtime)'),
)
parser.add_argument(
'-x', '--extras', metavar='EXTRAS', default='',
'-x', '--extras', metavar='EXTRAS', action='append',
help='comma separated list of "extras" for runtime requirements '
'(e.g. -x testing,feature-x) (implies --runtime)',
'(e.g. -x testing,feature-x) (implies --runtime, can be repeated)',
)
parser.add_argument(
'--generate-extras', action='store_true',
help='Generate build requirements on Python Extras',
)
parser.add_argument(
'-p', '--python3_pkgversion', metavar='PYTHON3_PKGVERSION',
@ -318,8 +350,9 @@ def main(argv):
if args.tox:
args.runtime = True
args.toxenv = (args.toxenv or os.getenv('RPM_TOXENV') or
f'py{sys.version_info.major}{sys.version_info.minor}')
if not args.toxenv:
_default = f'py{sys.version_info.major}{sys.version_info.minor}'
args.toxenv = [os.getenv('RPM_TOXENV', _default)]
if args.extras:
args.runtime = True
@ -329,6 +362,7 @@ def main(argv):
include_runtime=args.runtime,
toxenv=args.toxenv,
extras=args.extras,
generate_extras=args.generate_extras,
python3_pkgversion=args.python3_pkgversion,
)
except Exception:

pyproject_buildrequires_testcases.yaml

@ -20,6 +20,7 @@ Insufficient version of setuptools:
installed:
setuptools: 5
wheel: 1
toml: 1
pyproject.toml: |
# empty
expected: |
@ -27,7 +28,7 @@ Insufficient version of setuptools:
python3dist(wheel)
result: 0
Empty pyproject.toml, empty setup.py:
No pyproject.toml, empty setup.py:
installed:
setuptools: 50
wheel: 1
@ -42,6 +43,7 @@ Default build system, empty setup.py:
installed:
setuptools: 50
wheel: 1
toml: 1
pyproject.toml: |
# empty
setup.py: |
@ -60,41 +62,68 @@ Erroring setup.py:
result: 77
Bad character in version:
installed: {}
installed:
toml: 1
pyproject.toml: |
[build-system]
requires = ["pkg == 0.$.^.*"]
except: ValueError
Build system dependencies in pyproject.toml:
Build system dependencies in pyproject.toml with extras:
generate_extras: true
installed:
setuptools: 50
wheel: 1
toml: 1
pyproject.toml: |
[build-system]
requires = [
"foo",
"bar[baz] > 5",
"ne!=1",
"ge>=1.2",
"le <= 1.2.3",
"lt < 1.2.3.4 ",
" gt > 1.2.3.4.5",
"multi[extras1,extras2] == 6.0",
"combo >2, <5, != 3.0.0",
"invalid!!ignored",
"py2 ; python_version < '2.7'",
"py3 ; python_version > '3.0'",
"pkg [extra-currently-ignored]",
]
expected: |
python3dist(foo)
(python3dist(ne) < 1 or python3dist(ne) > 1.0)
python3dist(bar) > 5
python3dist(bar[baz]) > 5
(python3dist(ne) < 1 or python3dist(ne) > 1)
python3dist(ge) >= 1.2
python3dist(le) <= 1.2.3
python3dist(lt) < 1.2.3.4
python3dist(gt) > 1.2.3.4.5
((python3dist(combo) < 3 or python3dist(combo) > 3.0) and python3dist(combo) < 5 and python3dist(combo) > 2)
python3dist(multi) = 6
python3dist(multi[extras1]) = 6
python3dist(multi[extras2]) = 6
((python3dist(combo) < 3 or python3dist(combo) > 3) with python3dist(combo) < 5 with python3dist(combo) > 2)
python3dist(py3)
python3dist(pkg)
python3dist(setuptools) >= 40.8
python3dist(wheel)
result: 0
Build system dependencies in pyproject.toml without extras:
generate_extras: false
installed:
setuptools: 50
wheel: 1
toml: 1
pyproject.toml: |
[build-system]
requires = [
"bar[baz] > 5",
"multi[extras1,extras2] == 6.0",
]
expected: |
python3dist(bar) > 5
python3dist(multi) = 6
python3dist(setuptools) >= 40.8
python3dist(wheel)
result: 0
@ -108,7 +137,7 @@ Default build system, build dependencies in setup.py:
setup(
name='test',
version='0.1',
setup_requires=['foo', 'bar!=2'],
setup_requires=['foo', 'bar!=2', 'baz~=1.1.1'],
install_requires=['inst'],
)
expected: |
@ -116,7 +145,8 @@ Default build system, build dependencies in setup.py:
python3dist(wheel)
python3dist(wheel)
python3dist(foo)
(python3dist(bar) < 2 or python3dist(bar) > 2.0)
(python3dist(bar) < 2 or python3dist(bar) > 2)
(python3dist(baz) >= 1.1.1 with python3dist(baz) < 1.2)
result: 0
Default build system, run dependencies in setup.py:
@ -205,7 +235,8 @@ Run dependencies with extras (selected):
wheel: 1
pyyaml: 1
include_runtime: true
extras: testing
extras:
- testing
setup.py: *pytest_setup_py
expected: |
python3dist(setuptools) >= 40.8
@ -231,7 +262,9 @@ Run dependencies with multiple extras:
wheel: 1
pyyaml: 1
include_runtime: true
extras: testing,more-testing, even-more-testing , cool-feature
extras:
- testing,more-testing
- even-more-testing , cool-feature
setup.py: |
from setuptools import setup
setup(
@ -257,8 +290,9 @@ Tox dependencies:
setuptools: 50
wheel: 1
tox: 3.5.3
tox-current-env: 0.0.2
toxenv: py3
tox-current-env: 0.0.3
toxenv:
- py3
setup.py: |
from setuptools import setup
setup(
@ -279,8 +313,53 @@ Tox dependencies:
python3dist(setuptools) >= 40.8
python3dist(wheel)
python3dist(wheel)
python3dist(tox-current-env) >= 0.0.2
python3dist(tox-current-env) >= 0.0.3
python3dist(toxdep1)
python3dist(toxdep2)
python3dist(inst)
result: 0
Tox extras:
installed:
setuptools: 50
wheel: 1
tox: 3.5.3
tox-current-env: 0.0.3
toxenv:
- py3
setup.py: |
from setuptools import setup
setup(
name='test',
version='0.1',
install_requires=['inst'],
extras_require={
'extra1': ['dep11 > 11', 'dep12'],
'extra2': ['dep21', 'dep22', 'dep23'],
'nope': ['nopedep'],
}
)
tox.ini: |
[tox]
envlist = py36,py37,py38
[testenv]
deps =
toxdep
extras =
extra2
extra1
commands =
true
expected: |
python3dist(setuptools) >= 40.8
python3dist(wheel)
python3dist(wheel)
python3dist(tox-current-env) >= 0.0.3
python3dist(toxdep)
python3dist(inst)
python3dist(dep11) > 11
python3dist(dep12)
python3dist(dep21)
python3dist(dep22)
python3dist(dep23)
result: 0

pyproject_construct_toxenv.py

@ -0,0 +1,15 @@
import argparse
import sys
def main(argv):
parser = argparse.ArgumentParser(
description='Parse -e arguments instead of RPM getopt.'
)
parser.add_argument('-e', '--toxenv', action='append')
args, _ = parser.parse_known_args(argv)
return ','.join(args.toxenv)
if __name__ == '__main__':
print(main(sys.argv[1:]))

pyproject_convert.py Normal file

@ -0,0 +1,142 @@
# Copyright 2019 Gordon Messmer <gordon.messmer@gmail.com>
#
# Upstream: https://github.com/gordonmessmer/pyreq2rpm
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation files
# (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge,
# publish, distribute, sublicense, and/or sell copies of the Software,
# and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
# BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
from packaging.requirements import Requirement
from packaging.version import parse as parse_version
class RpmVersion():
def __init__(self, version_id):
version = parse_version(version_id)
if isinstance(version._version, str):
self.version = version._version
else:
self.epoch = version._version.epoch
self.version = list(version._version.release)
self.pre = version._version.pre
self.dev = version._version.dev
self.post = version._version.post
def increment(self):
self.version[-1] += 1
self.pre = None
self.dev = None
self.post = None
return self
def __str__(self):
if isinstance(self.version, str):
return self.version
if self.epoch:
rpm_epoch = str(self.epoch) + ':'
else:
rpm_epoch = ''
while len(self.version) > 1 and self.version[-1] == 0:
self.version.pop()
rpm_version = '.'.join(str(x) for x in self.version)
if self.pre:
rpm_suffix = '~{}'.format(''.join(str(x) for x in self.pre))
elif self.dev:
rpm_suffix = '~~{}'.format(''.join(str(x) for x in self.dev))
elif self.post:
rpm_suffix = '^post{}'.format(self.post[1])
else:
rpm_suffix = ''
return '{}{}{}'.format(rpm_epoch, rpm_version, rpm_suffix)
def convert_compatible(name, operator, version_id):
if version_id.endswith('.*'):
return 'Invalid version'
version = RpmVersion(version_id)
if len(version.version) == 1:
return 'Invalid version'
upper_version = RpmVersion(version_id)
upper_version.version.pop()
upper_version.increment()
return '({} >= {} with {} < {})'.format(
name, version, name, upper_version)
def convert_equal(name, operator, version_id):
if version_id.endswith('.*'):
version_id = version_id[:-2] + '.0'
return convert_compatible(name, '~=', version_id)
version = RpmVersion(version_id)
return '{} = {}'.format(name, version)
def convert_arbitrary_equal(name, operator, version_id):
if version_id.endswith('.*'):
return 'Invalid version'
version = RpmVersion(version_id)
return '{} = {}'.format(name, version)
def convert_not_equal(name, operator, version_id):
if version_id.endswith('.*'):
version_id = version_id[:-2]
version = RpmVersion(version_id)
lower_version = RpmVersion(version_id).increment()
else:
version = RpmVersion(version_id)
lower_version = version
return '({} < {} or {} > {})'.format(
name, version, name, lower_version)
def convert_ordered(name, operator, version_id):
if version_id.endswith('.*'):
# PEP 440 does not define semantics for prefix matching
# with ordered comparisons
version_id = version_id[:-2]
version = RpmVersion(version_id)
if operator == '>':
# distutils will allow a prefix match with '>'
operator = '>='
if operator == '<=':
# distutils will not allow a prefix match with '<='
operator = '<'
else:
version = RpmVersion(version_id)
return '{} {} {}'.format(name, operator, version)
OPERATORS = {'~=': convert_compatible,
'==': convert_equal,
'===': convert_arbitrary_equal,
'!=': convert_not_equal,
'<=': convert_ordered,
'<': convert_ordered,
'>=': convert_ordered,
'>': convert_ordered}
def convert(name, operator, version_id):
return OPERATORS[operator](name, operator, version_id)
def convert_requirement(req):
parsed_req = Requirement.parse(req)
reqs = []
for spec in parsed_req.specs:
reqs.append(convert(parsed_req.project_name, spec[0], spec[1]))
if len(reqs) == 0:
return parsed_req.project_name
if len(reqs) == 1:
return reqs[0]
else:
reqs.sort()
return '({})'.format(' with '.join(reqs))

pyproject_preprocess_record.py

@ -0,0 +1,85 @@
import argparse
import csv
import json
import os
from pathlib import PosixPath
from pyproject_save_files import BuildrootPath
def read_record(record_path):
"""
A generator yielding individual RECORD triplets.
https://www.python.org/dev/peps/pep-0376/#record
The triplet is str-path, hash, size -- the last two optional.
We will later care only for the paths anyway.
Example:
>>> g = read_record(PosixPath('./test_RECORD'))
>>> next(g)
['../../../bin/__pycache__/tldr.cpython-....pyc', '', '']
>>> next(g)
['../../../bin/tldr', 'sha256=...', '12766']
>>> next(g)
['../../../bin/tldr.py', 'sha256=...', '12766']
"""
with open(record_path, newline="", encoding="utf-8") as f:
yield from csv.reader(
f, delimiter=",", quotechar='"', lineterminator=os.linesep
)
def parse_record(record_path, record_content):
"""
Returns a list with BuildrootPaths parsed from record_content
params:
record_path: RECORD BuildrootPath
record_content: list of RECORD triplets
first item is a str-path relative to directory where dist-info directory is
(it can also be absolute according to the standard, but not from pip)
Examples:
>>> parse_record(BuildrootPath('/usr/lib/python3.7/site-packages/requests-2.22.0.dist-info/RECORD'),
... [('requests/sessions.py', 'sha256=xxx', '666')])
['/usr/lib/python3.7/site-packages/requests/sessions.py']
>>> parse_record(BuildrootPath('/usr/lib/python3.7/site-packages/tldr-0.5.dist-info/RECORD'),
... [('../../../bin/tldr', 'sha256=yyy', '777')])
['/usr/bin/tldr']
"""
sitedir = record_path.parent.parent # through the dist-info directory
# / with absolute right operand will remove the left operand
# any .. parts are resolved via normpath
return [str((sitedir / row[0]).normpath()) for row in record_content]
def save_parsed_record(record_path, parsed_record, output_file):
content = {}
if output_file.is_file():
content = json.loads(output_file.read_text())
content[str(record_path)] = parsed_record
output_file.write_text(json.dumps(content))
def main(cli_args):
record_path = BuildrootPath.from_real(cli_args.record, root=cli_args.buildroot)
parsed_record = parse_record(record_path, read_record(cli_args.record))
save_parsed_record(record_path, parsed_record, cli_args.output)
def argparser():
parser = argparse.ArgumentParser()
r = parser.add_argument_group("required arguments")
r.add_argument("--buildroot", type=PosixPath, required=True)
r.add_argument("--record", type=PosixPath, required=True)
r.add_argument("--output", type=PosixPath, required=True)
return parser
if __name__ == "__main__":
cli_args = argparser().parse_args()
main(cli_args)

pyproject_save_files.py Executable file → Normal file

@ -1,8 +1,7 @@
import argparse
import csv
import fnmatch
import json
import os
import warnings
from collections import defaultdict
from pathlib import PosixPath, PurePosixPath
@ -56,79 +55,6 @@ class BuildrootPath(PurePosixPath):
return type(self)(os.path.normpath(self))
def locate_record(root, sitedirs):
"""
Find a RECORD file in the given root.
sitedirs are BuildrootPaths.
Only RECORDs in dist-info dirs inside sitedirs are considered.
There can only be one RECORD file.
Returns a PosixPath of the RECORD file.
"""
records = []
for sitedir in sitedirs:
records.extend(sitedir.to_real(root).glob("*.dist-info/RECORD"))
sitedirs_text = ", ".join(str(p) for p in sitedirs)
if len(records) == 0:
raise FileNotFoundError(f"There is no *.dist-info/RECORD in {sitedirs_text}")
if len(records) > 1:
raise FileExistsError(f"Multiple *.dist-info directories in {sitedirs_text}")
return records[0]
def read_record(record_path):
"""
A generator yielding individual RECORD triplets.
https://www.python.org/dev/peps/pep-0376/#record
The triplet is str-path, hash, size -- the last two optional.
We will later care only for the paths anyway.
Example:
>>> g = read_record(PosixPath('./test_RECORD'))
>>> next(g)
['../../../bin/__pycache__/tldr.cpython-....pyc', '', '']
>>> next(g)
['../../../bin/tldr', 'sha256=...', '12766']
>>> next(g)
['../../../bin/tldr.py', 'sha256=...', '12766']
"""
with open(record_path, newline="", encoding="utf-8") as f:
yield from csv.reader(
f, delimiter=",", quotechar='"', lineterminator=os.linesep
)
def parse_record(record_path, record_content):
"""
Returns a generator with BuildrootPaths parsed from record_content
params:
record_path: RECORD BuildrootPath
record_content: list of RECORD triplets
first item is a str-path relative to directory where dist-info directory is
(it can also be absolute according to the standard, but not from pip)
Examples:
>>> next(parse_record(BuildrootPath('/usr/lib/python3.7/site-packages/requests-2.22.0.dist-info/RECORD'),
... [('requests/sessions.py', 'sha256=xxx', '666'), ...]))
BuildrootPath('/usr/lib/python3.7/site-packages/requests/sessions.py')
>>> next(parse_record(BuildrootPath('/usr/lib/python3.7/site-packages/tldr-0.5.dist-info/RECORD'),
... [('../../../bin/tldr', 'sha256=yyy', '777'), ...]))
BuildrootPath('/usr/bin/tldr')
"""
sitedir = record_path.parent.parent # through the dist-info directory
# / with absolute right operand will remove the left operand
# any .. parts are resolved via normpath
return ((sitedir / row[0]).normpath() for row in record_content)
def pycached(script, python_version):
"""
For a script BuildrootPath, return a list with that path and its bytecode glob.
@ -151,21 +77,41 @@ def pycached(script, python_version):
return [script, pyc]
def add_file_to_module(paths, module_name, module_type, *files):
def add_file_to_module(paths, module_name, module_type, files_dirs, *files):
"""
Helper procedure, adds given files to the module_name of a given module_type
"""
for module in paths["modules"][module_name]:
if module["type"] == module_type:
if files[0] not in module["files"]:
module["files"].extend(files)
if files[0] not in module[files_dirs]:
module[files_dirs].extend(files)
break
else:
paths["modules"][module_name].append(
{"type": module_type, "files": list(files)}
{"type": module_type, "files": [], "dirs": [], files_dirs: list(files)}
)
def add_lang_to_module(paths, module_name, path):
"""
Helper procedure, divides lang files by language and adds them to the module_name
Returns True if the language code detection was successful
"""
for i, parent in enumerate(path.parents):
if i > 0 and parent.name == 'locale':
lang_country_code = path.parents[i-1].name
break
else:
return False
# convert potential en_US to plain en
lang_code = lang_country_code.partition('_')[0]
if module_name not in paths["lang"]:
paths["lang"].update({module_name: defaultdict(list)})
paths["lang"][module_name][lang_code].append(path)
return True
def classify_paths(
record_path, parsed_record_content, sitedirs, python_version
):
@ -175,7 +121,7 @@ def classify_paths(
For the dict structure, look at the beginning of this function's code.
Each "module" is a dict with "type" ("package", "script", "extension") and "files".
Each "module" is a dict with "type" ("package", "script", "extension"), and "files" and "dirs".
"""
distinfo = record_path.parent
paths = {
@ -185,6 +131,7 @@ def classify_paths(
"docs": [], # to be used once there is upstream way to recognize READMEs
"licenses": [], # to be used once there is upstream way to recognize LICENSEs
},
"lang": {}, # %lang entries: [module_name or None][language_code] lists of .mo files
"modules": defaultdict(list), # each importable module (directory, .py, .so)
"other": {"files": []}, # regular %file entries we could not parse :(
}
@ -198,6 +145,10 @@ def classify_paths(
continue
if path.parent == distinfo:
if path.name in ("RECORD", "REQUESTED"):
# RECORD and REQUESTED files are removed in %pyproject_install
# See PEP 627
continue
# TODO is this a license/documentation?
paths["metadata"]["files"].append(path)
continue
@ -208,23 +159,32 @@ def classify_paths(
if path.suffix == ".so":
# extension modules can have 2 suffixes
name = BuildrootPath(path.stem).stem
add_file_to_module(paths, name, "extension", path)
add_file_to_module(paths, name, "extension", "files", path)
elif path.suffix == ".py":
name = path.stem
add_file_to_module(
paths, name, "script", *pycached(path, python_version)
paths, name, "script", "files", *pycached(path, python_version)
)
else:
paths["other"]["files"].append(path)
else:
# this file is inside a dir, we classify that dir
# this file is inside a dir, we add all dirs upwards until sitedir
index = path.parents.index(sitedir)
module_dir = path.parents[index - 1]
add_file_to_module(paths, module_dir.name, "package", module_dir)
for parent in list(path.parents)[:index]: # no direct slice until Python 3.10
add_file_to_module(paths, module_dir.name, "package", "dirs", parent)
is_lang = False
if path.suffix == ".mo":
is_lang = add_lang_to_module(paths, module_dir.name, path)
if not is_lang:
path = pycached(path, python_version) if path.suffix == ".py" else [path]
add_file_to_module(paths, module_dir.name, "package", "files", *path)
break
else:
warnings.warn(f"Unrecognized file: {path}")
paths["other"]["files"].append(path)
if path.suffix == ".mo":
add_lang_to_module(paths, None, path) or paths["other"]["files"].append(path)
else:
paths["other"]["files"].append(path)
return paths
@ -244,6 +204,11 @@ def generate_file_list(paths_dict, module_globs, include_others=False):
if include_others:
files.update(f"{p}" for p in paths_dict["other"]["files"])
try:
for lang_code in paths_dict["lang"][None]:
files.update(f"%lang({lang_code}) {path}" for path in paths_dict["lang"][None][lang_code])
except KeyError:
pass
files.update(f"{p}" for p in paths_dict["metadata"]["files"])
for macro in "dir", "doc", "license":
@ -257,11 +222,14 @@ def generate_file_list(paths_dict, module_globs, include_others=False):
for name in modules:
if fnmatch.fnmatchcase(name, glob):
if name not in done_modules:
try:
for lang_code in paths_dict["lang"][name]:
files.update(f"%lang({lang_code}) {path}" for path in paths_dict["lang"][name][lang_code])
except KeyError:
pass
for module in modules[name]:
if module["type"] == "package":
files.update(f"{p}/" for p in module["files"])
else:
files.update(f"{p}" for p in module["files"])
files.update(f"%dir {p}" for p in module["dirs"])
files.update(f"{p}" for p in module["files"])
done_modules.add(name)
done_globs.add(glob)
@ -352,7 +320,24 @@ def parse_varargs(varargs):
return globs, include_auto
def pyproject_save_files(buildroot, sitelib, sitearch, python_version, varargs):
def load_parsed_record(pyproject_record):
parsed_record = {}
with open(pyproject_record) as pyproject_record_file:
content = json.load(pyproject_record_file)
if len(content) > 1:
raise FileExistsError("%pyproject install has found more than one *.dist-info/RECORD file. "
"Currently, %pyproject_save_files supports only one wheel → one file list mapping. "
"Feel free to open a bugzilla for pyproject-rpm-macros and describe your usecase.")
# Redefine strings stored in JSON to BuildRootPaths
for record_path, files in content.items():
parsed_record[BuildrootPath(record_path)] = [BuildrootPath(f) for f in files]
return parsed_record
def pyproject_save_files(buildroot, sitelib, sitearch, python_version, pyproject_record, varargs):
"""
Takes arguments from the %{pyproject_save_files} macro
@ -363,14 +348,20 @@ def pyproject_save_files(buildroot, sitelib, sitearch, python_version, varargs):
sitedirs = sorted({sitelib, sitearch})
globs, include_auto = parse_varargs(varargs)
record_path_real = locate_record(buildroot, sitedirs)
record_path = BuildrootPath.from_real(record_path_real, root=buildroot)
parsed_record = parse_record(record_path, read_record(record_path_real))
parsed_records = load_parsed_record(pyproject_record)
paths_dict = classify_paths(
record_path, parsed_record, sitedirs, python_version
)
return generate_file_list(paths_dict, globs, include_auto)
final_file_list = []
for record_path, files in parsed_records.items():
paths_dict = classify_paths(
record_path, files, sitedirs, python_version
)
final_file_list.extend(
generate_file_list(paths_dict, globs, include_auto)
)
return final_file_list
def main(cli_args):
@ -379,6 +370,7 @@ def main(cli_args):
cli_args.sitelib,
cli_args.sitearch,
cli_args.python_version,
cli_args.pyproject_record,
cli_args.varargs,
)
@ -393,6 +385,7 @@ def argparser():
r.add_argument("--sitelib", type=BuildrootPath, required=True)
r.add_argument("--sitearch", type=BuildrootPath, required=True)
r.add_argument("--python-version", type=str, required=True)
r.add_argument("--pyproject-record", type=PosixPath, required=True)
parser.add_argument("varargs", nargs="+")
return parser

File diff suppressed because it is too large

test_pyproject_buildrequires.py

@ -1,4 +1,5 @@
from pathlib import Path
import importlib.metadata
import io
import pytest
@ -6,10 +7,6 @@ import yaml
from pyproject_buildrequires import generate_requires
try:
import importlib.metadata as importlib_metadata
except ImportError:
import importlib_metadata
testcases = {}
with Path(__file__).parent.joinpath('pyproject_buildrequires_testcases.yaml').open() as f:
@ -35,7 +32,7 @@ def test_data(case_name, capsys, tmp_path, monkeypatch):
try:
return str(case['installed'][dist_name])
except (KeyError, TypeError):
raise importlib_metadata.PackageNotFoundError(
raise importlib.metadata.PackageNotFoundError(
f'info not found for {dist_name}'
)
@ -43,8 +40,9 @@ def test_data(case_name, capsys, tmp_path, monkeypatch):
generate_requires(
get_installed_version=get_installed_version,
include_runtime=case.get('include_runtime', False),
extras=case.get('extras', ''),
extras=case.get('extras', []),
toxenv=case.get('toxenv', None),
generate_extras=case.get('generate_extras', False),
)
except SystemExit as e:
assert e.code == case['result']

test_pyproject_save_files.py

@ -4,10 +4,10 @@ import yaml
from pathlib import Path
from pprint import pprint
from pyproject_save_files import argparser, generate_file_list, main
from pyproject_save_files import locate_record, parse_record, read_record
from pyproject_save_files import BuildrootPath
from pyproject_preprocess_record import parse_record, read_record, save_parsed_record
from pyproject_save_files import argparser, generate_file_list, BuildrootPath
from pyproject_save_files import main as save_files_main
DIR = Path(__file__).parent
BINDIR = BuildrootPath("/usr/bin")
@ -22,38 +22,37 @@ EXPECTED_FILES = yaml_data["dumped"]
TEST_RECORDS = yaml_data["records"]
def create_root(tmp_path, *records):
r"""
Create mock buildroot in tmp_path
parameters:
tmp_path: path where buildroot should be created
records: dicts with:
path: expected path found in buildroot
content: string content of the file
Example:
>>> record = {'path': '/usr/lib/python/tldr-0.5.dist-info/RECORD', 'content': '__pycache__/tldr.cpython-37.pyc,,\n...'}
>>> create_root(Path('tmp'), record)
PosixPath('tmp/buildroot')
The example creates ./tmp/buildroot/usr/lib/python/tldr-0.5.dist-info/RECORD with the content.
>>> import shutil
>>> shutil.rmtree(Path('./tmp'))
"""
buildroot = tmp_path / "buildroot"
for record in records:
dest = buildroot / Path(record["path"]).relative_to("/")
dest.parent.mkdir(parents=True)
dest.write_text(record["content"])
return buildroot
@pytest.fixture
def tldr_root(tmp_path):
prepare_pyproject_record(tmp_path, package="tldr")
return tmp_path
@pytest.fixture
def tldr_root(tmp_path):
return create_root(tmp_path, TEST_RECORDS["tldr"])
def pyproject_record(tmp_path):
return tmp_path / "pyproject-record"
def prepare_pyproject_record(tmp_path, package=None, content=None):
"""
Creates RECORD from test data and then uses
functions from pyproject_process_record to convert
it to pyproject-record file which is then
further processed by functions from pyproject_save_files.
"""
record_file = tmp_path / "RECORD"
pyproject_record = tmp_path / "pyproject-record"
if package is not None:
# Get test data and write dist-info/RECORD file
record_path = BuildrootPath(TEST_RECORDS[package]["path"])
record_file.write_text(TEST_RECORDS[package]["content"])
# Parse RECORD file
parsed_record = parse_record(record_path, read_record(record_file))
# Save JSON content to pyproject-record
save_parsed_record(record_path, parsed_record, pyproject_record)
elif content is not None:
save_parsed_record(*content, output_file=pyproject_record)
@pytest.fixture
@ -61,79 +60,23 @@ def output(tmp_path):
return tmp_path / "pyproject_files"
def test_locate_record_good(tmp_path):
sitedir = tmp_path / "ha/ha/ha/site-packages"
distinfo = sitedir / "foo-0.6.dist-info"
distinfo.mkdir(parents=True)
record = distinfo / "RECORD"
record.write_text("\n")
sitedir = BuildrootPath.from_real(sitedir, root=tmp_path)
assert locate_record(tmp_path, {sitedir}) == record
def test_locate_record_missing(tmp_path):
sitedir = tmp_path / "ha/ha/ha/site-packages"
distinfo = sitedir / "foo-0.6.dist-info"
distinfo.mkdir(parents=True)
sitedir = BuildrootPath.from_real(sitedir, root=tmp_path)
with pytest.raises(FileNotFoundError):
locate_record(tmp_path, {sitedir})
def test_locate_record_misplaced(tmp_path):
sitedir = tmp_path / "ha/ha/ha/site-packages"
fakedir = tmp_path / "no/no/no/site-packages"
distinfo = fakedir / "foo-0.6.dist-info"
distinfo.mkdir(parents=True)
record = distinfo / "RECORD"
record.write_text("\n")
sitedir = BuildrootPath.from_real(sitedir, root=tmp_path)
with pytest.raises(FileNotFoundError):
locate_record(tmp_path, {sitedir})
def test_locate_record_two_packages(tmp_path):
sitedir = tmp_path / "ha/ha/ha/site-packages"
for name in "foo-0.6.dist-info", "bar-1.8.dist-info":
distinfo = sitedir / name
distinfo.mkdir(parents=True)
record = distinfo / "RECORD"
record.write_text("\n")
sitedir = BuildrootPath.from_real(sitedir, root=tmp_path)
with pytest.raises(FileExistsError):
locate_record(tmp_path, {sitedir})
def test_locate_record_two_sitedirs(tmp_path):
sitedirs = ["ha/ha/ha/site-packages", "ha/ha/ha64/site-packages"]
for idx, sitedir in enumerate(sitedirs):
sitedir = tmp_path / sitedir
distinfo = sitedir / "foo-0.6.dist-info"
distinfo.mkdir(parents=True)
record = distinfo / "RECORD"
record.write_text("\n")
sitedirs[idx] = BuildrootPath.from_real(sitedir, root=tmp_path)
with pytest.raises(FileExistsError):
locate_record(tmp_path, set(sitedirs))
def test_parse_record_tldr():
record_path = BuildrootPath(TEST_RECORDS["tldr"]["path"])
record_content = read_record(DIR / "test_RECORD")
output = list(parse_record(record_path, record_content))
pprint(output)
expected = [
BINDIR / "__pycache__/tldr.cpython-37.pyc",
BINDIR / "tldr",
BINDIR / "tldr.py",
SITELIB / "__pycache__/tldr.cpython-37.pyc",
SITELIB / "tldr-0.5.dist-info/INSTALLER",
SITELIB / "tldr-0.5.dist-info/LICENSE",
SITELIB / "tldr-0.5.dist-info/METADATA",
SITELIB / "tldr-0.5.dist-info/RECORD",
SITELIB / "tldr-0.5.dist-info/WHEEL",
SITELIB / "tldr-0.5.dist-info/top_level.txt",
SITELIB / "tldr.py",
str(BINDIR / "__pycache__/tldr.cpython-37.pyc"),
str(BINDIR / "tldr"),
str(BINDIR / "tldr.py"),
str(SITELIB / "__pycache__/tldr.cpython-37.pyc"),
str(SITELIB / "tldr-0.5.dist-info/INSTALLER"),
str(SITELIB / "tldr-0.5.dist-info/LICENSE"),
str(SITELIB / "tldr-0.5.dist-info/METADATA"),
str(SITELIB / "tldr-0.5.dist-info/RECORD"),
str(SITELIB / "tldr-0.5.dist-info/WHEEL"),
str(SITELIB / "tldr-0.5.dist-info/top_level.txt"),
str(SITELIB / "tldr.py"),
]
assert output == expected
@ -149,15 +92,15 @@ def test_parse_record_tensorflow():
output = list(parse_record(record_path, record_content))
pprint(output)
expected = [
BINDIR / "toco_from_protos",
SITELIB / long,
SITEARCH / "tensorflow-2.1.0.dist-info/METADATA",
str(BINDIR / "toco_from_protos"),
str(SITELIB / long),
str(SITEARCH / "tensorflow-2.1.0.dist-info/METADATA"),
]
assert output == expected
def remove_others(expected):
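# Drop the entries that only appear when the +auto glob is used: executables in
# BINDIR, data files in DATADIR and .pth files. For the DATADIR check the path is
# taken as the last space-separated token, since an entry may carry an RPM
# directive prefix (e.g. %lang) in front of the actual path.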
return [p for p in expected if not (p.startswith(str(BINDIR)) or p.startswith(str(DATADIR)) or p.endswith(".pth"))]
return [p for p in expected if not (p.startswith(str(BINDIR)) or p.endswith(".pth") or p.rpartition(' ')[-1].startswith(str(DATADIR)))]
@pytest.mark.parametrize("include_auto", (True, False))
@ -182,7 +125,7 @@ def test_generate_file_list_unused_glob():
assert "kerb" not in str(excinfo.value)
def default_options(output, mock_root):
def default_options(output, mock_root, pyproject_record):
return [
"--output",
str(output),
@ -193,18 +136,20 @@ def default_options(output, mock_root):
"--sitearch",
str(SITEARCH),
"--python-version",
"3.7", # test data are for 3.7
"3.7", # test data are for 3.7,
"--pyproject-record",
str(pyproject_record)
]
@pytest.mark.parametrize("include_auto", (True, False))
@pytest.mark.parametrize("package, glob, expected", EXPECTED_FILES)
def test_cli(tmp_path, package, glob, expected, include_auto):
mock_root = create_root(tmp_path, TEST_RECORDS[package])
def test_cli(tmp_path, package, glob, expected, include_auto, pyproject_record):
prepare_pyproject_record(tmp_path, package)
output = tmp_path / "files"
globs = [glob, "+auto"] if include_auto else [glob]
cli_args = argparser().parse_args([*default_options(output, mock_root), *globs])
main(cli_args)
cli_args = argparser().parse_args([*default_options(output, tmp_path, pyproject_record), *globs])
save_files_main(cli_args)
if not include_auto:
expected = remove_others(expected)
@ -212,54 +157,49 @@ def test_cli(tmp_path, package, glob, expected, include_auto):
assert tested == "\n".join(expected) + "\n"
def test_cli_no_RECORD(tmp_path):
mock_root = create_root(tmp_path)
def test_cli_no_pyproject_record(tmp_path, pyproject_record):
output = tmp_path / "files"
cli_args = argparser().parse_args([*default_options(output, mock_root), "tldr*"])
cli_args = argparser().parse_args([*default_options(output, tmp_path, pyproject_record), "tldr*"])
with pytest.raises(FileNotFoundError):
main(cli_args)
save_files_main(cli_args)
def test_cli_misplaced_RECORD(tmp_path, output):
record = {"path": "/usr/lib/", "content": TEST_RECORDS["tldr"]["content"]}
mock_root = create_root(tmp_path, record)
cli_args = argparser().parse_args([*default_options(output, mock_root), "tldr*"])
with pytest.raises(FileNotFoundError):
main(cli_args)
def test_cli_find_too_many_RECORDS(tldr_root, output):
mock_root = create_root(tldr_root.parent, TEST_RECORDS["tensorflow"])
cli_args = argparser().parse_args([*default_options(output, mock_root), "tldr*"])
def test_cli_too_many_RECORDS(tldr_root, output, pyproject_record):
# Two calls to simulate how %pyproject_install processes more than one RECORD file
prepare_pyproject_record(tldr_root,
content=("foo/bar/dist-info/RECORD", []))
prepare_pyproject_record(tldr_root,
content=("foo/baz/dist-info/RECORD", []))
cli_args = argparser().parse_args([*default_options(output, tldr_root, pyproject_record), "tldr*"])
with pytest.raises(FileExistsError):
main(cli_args)
save_files_main(cli_args)
def test_cli_bad_argument(tldr_root, output):
def test_cli_bad_argument(tldr_root, output, pyproject_record):
cli_args = argparser().parse_args(
[*default_options(output, tldr_root), "tldr*", "+foodir"]
[*default_options(output, tldr_root, pyproject_record), "tldr*", "+foodir"]
)
with pytest.raises(ValueError):
main(cli_args)
save_files_main(cli_args)
def test_cli_bad_option(tldr_root, output):
def test_cli_bad_option(tldr_root, output, pyproject_record):
prepare_pyproject_record(tldr_root.parent, content=("RECORD1", []))
cli_args = argparser().parse_args(
[*default_options(output, tldr_root), "tldr*", "you_cannot_have_this"]
[*default_options(output, tldr_root, pyproject_record), "tldr*", "you_cannot_have_this"]
)
with pytest.raises(ValueError):
main(cli_args)
save_files_main(cli_args)
def test_cli_bad_namespace(tldr_root, output):
def test_cli_bad_namespace(tldr_root, output, pyproject_record):
cli_args = argparser().parse_args(
[*default_options(output, tldr_root), "tldr.didntread"]
[*default_options(output, tldr_root, pyproject_record), "tldr.didntread"]
)
with pytest.raises(ValueError):
main(cli_args)
save_files_main(cli_args)


@ -2,6 +2,9 @@
. /etc/os-release
fedora=$VERSION_ID
pkgname=${1}
shift
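# the first argument is the package name; any remaining arguments are passed through to mock below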
config="/tmp/fedora-${fedora}-x86_64-ci.cfg"
# create mock config if not present
@ -11,6 +14,9 @@ if [ ! -f $config ]; then
original="/etc/mock/fedora-${fedora}-x86_64.cfg"
cp $original $config
echo -e '\n\n' >> $config
echo -e 'config_opts["package_manager_max_attempts"] = 5' >> $config
echo -e 'config_opts["package_manager_attempt_delay"] = 20' >> $config
echo -e '\n\nconfig_opts[f"{config_opts.package_manager}.conf"] += """' >> $config
# The zuul CI has zuul-build.repo
@ -26,23 +32,23 @@ fi
# prepare the rpmbuild folders; make sure nothing relevant is there
mkdir -p ~/rpmbuild/{SOURCES,SRPMS}
rm -f ~/rpmbuild/SRPMS/${1}-*.src.rpm
rm -f ~/rpmbuild/SRPMS/${pkgname}-*.src.rpm
# download the sources and create SRPM
spectool -g -R ${1}.spec
rpmbuild -bs ${1}.spec
spectool -g -R ${pkgname}.spec
rpmbuild -bs ${pkgname}.spec
# build the SRPM in mock
res=0
mock -r $config --enablerepo=local init
mock -r $config --enablerepo=local ~/rpmbuild/SRPMS/${1}-*.src.rpm || res=$?
mock -r $config --enablerepo=local "$@" ~/rpmbuild/SRPMS/${pkgname}-*.src.rpm || res=$?
# move the results to the artifacts directory, so we can examine them
artifacts=${TEST_ARTIFACTS:-/tmp/artifacts}
pushd /var/lib/mock/fedora-*-x86_64/result
mv *.rpm ${artifacts}/ || :
for log in *.log; do
mv ${log} ${artifacts}/${1}-${log}
mv ${log} ${artifacts}/${pkgname}-${log}
done
popd

tests/printrun.spec (new file)

@ -0,0 +1,52 @@
Name: printrun
Version: 2.0.0~rc6
%global upstream_version 2.0.0rc6
Release: 0%{?dist}
Summary: RepRap printer interface and tools
License: GPLv3+ and FSFAP
URL: https://github.com/kliment/Printrun
Source0: https://github.com/kliment/Printrun/archive/%{name}-%{upstream_version}.tar.gz
# fix locale location
Patch0: https://github.com/kliment/Printrun/pull/1101.patch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
BuildRequires: gcc
%description
This package contains lang files outside of the printrun module.
Building this tests that lang files are marked with %%lang in the file list.
%prep
%autosetup -p1 -n Printrun-printrun-%{upstream_version}
%generate_buildrequires
%pyproject_buildrequires
%build
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files printrun +auto
%check
# Internal check if the generated lang entries are the same as
# the ones generated using %%find_lang
%find_lang pronterface
%find_lang plater
grep '^%%lang' %{pyproject_files} | sort > tested.lang
sort pronterface.lang plater.lang > expected.lang
diff tested.lang expected.lang
%files -f %{pyproject_files}
%doc README*
%license COPYING


@ -10,6 +10,7 @@ Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
%description
Tests building with the poetry build backend.
@ -36,9 +37,16 @@ Summary: %{summary}
%install
%pyproject_install
%pyproject_save_files clikit
%check
# Internal check that the RECORD and REQUESTED files are
# always removed in %%pyproject_install
test ! $(find %{buildroot}%{python3_sitelib}/ | grep -E "\.dist-info/RECORD$")
test ! $(find %{buildroot}%{python3_sitelib}/ | grep -E "\.dist-info/REQUESTED$")
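# (test ! succeeds only when grep prints nothing, so the build fails if either file is still installed)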
%files -n python3-%{pypi_name} -f %{pyproject_files}
%files -n python3-%{pypi_name}
%doc README.md
%license LICENSE
%{python3_sitelib}/%{pypi_name}/
%{python3_sitelib}/%{pypi_name}-%{version}.dist-info/


@ -0,0 +1,50 @@
Name: python-distroinfo
Version: 0.3.2
Release: 0%{?dist}
Summary: Parsing and querying distribution metadata stored in text/YAML files
License: ASL 2.0
URL: https://github.com/softwarefactory-project/distroinfo
Source0: %{pypi_source distroinfo}
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
BuildRequires: python3-pytest
BuildRequires: git-core
%description
This package uses setuptools and pbr.
It has setup_requires, and we test that %%pyproject_buildrequires correctly
handles it, including the runtime requirements.
%package -n python3-distroinfo
Summary: %{summary}
%description -n python3-distroinfo
...
%prep
%autosetup -p1 -n distroinfo-%{version}
%generate_buildrequires
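# -r also adds the package's runtime requirements as BuildRequires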
%pyproject_buildrequires -r
%build
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files distroinfo
%check
%pytest
%files -n python3-distroinfo -f %{pyproject_files}
%doc README.rst AUTHORS
%license LICENSE

tests/python-django.spec (new file)

@ -0,0 +1,67 @@
Name: python-django
Version: 3.0.7
Release: 0%{?dist}
Summary: A high-level Python Web framework
License: BSD
URL: https://www.djangoproject.com/
Source0: %{pypi_source Django}
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
%description
This package contains lang files.
Building this tests that lang files are marked with %%lang in the file list.
%package -n python3-django
Summary: %{summary}
%description -n python3-django
...
%prep
%autosetup -p1 -n Django-%{version}
%py3_shebang_fix django/conf/project_template/manage.py-tpl django/bin/django-admin.py
%if 0%{?fedora} < 32 && 0%{?rhel} < 9
# The Python RPM dependency generator doesn't support ~= yet
# https://bugzilla.redhat.com/show_bug.cgi?id=1758141
sed -i 's/asgiref ~= /asgiref >= /' setup.py
%endif
%generate_buildrequires
%pyproject_buildrequires
%build
# remove .po files (in an ideal world, we would rebuild the .mo files first)
find -name "*.po" | xargs rm -f
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files django
%check
# Internal check if the generated lang entries are the same as
# the ones generated using %%find_lang
%find_lang django
%find_lang djangojs
grep '^%%lang' %{pyproject_files} | sort > tested.lang
sort django.lang djangojs.lang > expected.lang
diff tested.lang expected.lang
%files -n python3-django -f %{pyproject_files}
%doc README.rst
%license LICENSE
%{_bindir}/django-admin
%{_bindir}/django-admin.py


@ -0,0 +1,82 @@
Name: python-dns-lexicon
# For testing purposes, we package different versions on different Fedoras,
# because otherwise we would miss some dependencies.
# Please don't write spec files like this in Fedora; it is forbidden.
%if 0%{?fedora} >= 34 || 0%{?rhel} >= 9
Version: 3.5.2
%else
Version: 3.4.0
%endif
Release: 0%{?dist}
Summary: Manipulate DNS records on various DNS providers in a standardized/agnostic way
License: MIT
URL: https://github.com/AnalogJ/lexicon
Source0: %{url}/archive/v%{version}/lexicon-%{version}.tar.gz
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
%description
This package has extras specified in its tox configuration;
we test that the extras are installed when -e is used.
This package also uses a custom toxenv and creates several extras subpackages.
%package -n python3-dns-lexicon
Summary: %{summary}
%description -n python3-dns-lexicon
...
%pyproject_extras_subpackage -n python3-dns-lexicon plesk route53
%prep
%autosetup -n lexicon-%{version}
# The tox configuration lists a [dev] extra, but it installs nothing (the extra is missing from the package metadata).
# The test requirements are only specified via poetry.dev-dependencies.
# Here we amend the data a bit so we can test more things, adding the test deps to the dev extra:
sed -i \
's/\[tool.poetry.extras\]/'\
'pytest = {version = ">3", optional = true}\n'\
'vcrpy = {version = ">1", optional = true}\n\n'\
'[tool.poetry.extras]\n'\
'dev = ["pytest", "vcrpy"]/' pyproject.toml
%generate_buildrequires
%if 0%{?fedora} >= 33 || 0%{?rhel} >= 9
# We use the "light" toxenv because the default one installs the [full] extra and we don't have all the deps.
# Note that [full] contains [plesk] and [route53] but we specify them manually instead:
%pyproject_buildrequires -e light -x plesk -x route53
%else
# older Fedoras don't have the required runtime dependencies, so we don't test it there
%pyproject_buildrequires
%endif
%build
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files lexicon
%if 0%{?fedora} >= 33 || 0%{?rhel} >= 9
%check
# we cannot use %%tox here, because the configured commands call poetry directly :/
# we use %%pytest instead, running only a subset of tests so as not to waste CI time
%pytest -k "test_route53 or test_plesk"
%endif
%files -n python3-dns-lexicon -f %{pyproject_files}
%license LICENSE
%doc README.rst
%{_bindir}/lexicon


@ -9,6 +9,7 @@ Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
BuildRequires: python3-devel
%description
This package contains one .py module
@ -41,8 +42,8 @@ Summary: %{summary}
%check
# Internal check: Top level __pycache__ is never owned
grep -vE '/__pycache__$' %{pyproject_files}
grep -vE '/__pycache__/$' %{pyproject_files}
! grep -E '/__pycache__$' %{pyproject_files}
! grep -E '/__pycache__/$' %{pyproject_files}
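# while the byte-compiled files inside __pycache__ must be listed: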
grep -F '/__pycache__/' %{pyproject_files}


@ -0,0 +1,47 @@
Name: python-flit-core
Version: 3.0.0
Release: 0%{?dist}
Summary: Distribution-building parts of Flit
License: BSD
URL: https://pypi.org/project/flit-core/
Source0: %{pypi_source flit_core}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description
Test a build with pyproject.toml backend-path = .
flit-core builds with flit-core.
%package -n python3-flit-core
Summary: %{summary}
%description -n python3-flit-core
...
%prep
%autosetup -p1 -n flit_core-%{version}
%generate_buildrequires
%pyproject_buildrequires
%build
%if 0%{?fedora} < 33 && 0%{?rhel} < 9
# the old pip version cannot handle backend-path properly; let's help it:
export PYTHONPATH=$PWD
%endif
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files flit_core
%files -n python3-flit-core -f %{pyproject_files}

tests/python-httpbin.spec (new file)

@ -0,0 +1,66 @@
Name: python-httpbin
Version: 0.7.0
Release: 0%{?dist}
Summary: HTTP Request & Response Service, written in Python + Flask
License: MIT
URL: https://github.com/Runscope/httpbin
Source0: %{url}/archive/v%{version}/httpbin-%{version}.tar.gz
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description
This package BuildRequires a package with an extra: raven[flask].
%package -n python3-httpbin
Summary: %{summary}
%if 0%{?fedora} < 33 && 0%{?rhel} < 9
# Old Fedoras don't understand Python extras yet
# This package needs raven[flask]
# So we add the transitive dependencies manually:
BuildRequires: %{py3_dist blinker flask}
Requires: %{py3_dist blinker flask}
%endif
%description -n python3-httpbin
%{summary}.
%prep
%autosetup -n httpbin-%{version}
# the brotlipy wrapper is not packaged; httpbin works fine with brotli
sed -i s/brotlipy/brotli/ setup.py
# update test_httpbin.py to reflect new behavior of werkzeug
sed -i /Content-Length/d test_httpbin.py
%generate_buildrequires
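# -t also adds the test dependencies from the default tox environment as BuildRequires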
%pyproject_buildrequires -t
%build
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files httpbin
%check
%tox
# Internal check for our macros
# The runtime dependencies contain raven[flask]; we assert we got them.
# The %%tox above also dies without it, but this makes the check more explicit.
%{python3} -c 'import blinker, flask' # transitive deps
%files -n python3-httpbin -f %{pyproject_files}
%doc README*
%license LICENSE*


@ -9,6 +9,7 @@ License: MIT
URL: https://github.com/timothycrosley/%{modname}
Source0: %{url}/archive/%{version}-2/%{modname}-%{version}-2.tar.gz
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description
@ -46,7 +47,7 @@ test -d %{buildroot}%{python3_sitelib}/%{modname}/
test -d %{buildroot}%{python3_sitelib}/%{modname}-%{version}.dist-info/
# Internal check that executables are not present when +auto was not used with %%pyproject_save_files
grep -vF %{buildroot}%{_bindir}/%{modname} %{pyproject_files}
! grep -F %{buildroot}%{_bindir}/%{modname} %{pyproject_files}
%files -n python3-%{modname} -f %{pyproject_files}


@ -5,6 +5,7 @@ License: Python
Summary: An object-oriented API to access LDAP directory servers
Source0: %{pypi_source}
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
BuildRequires: cyrus-sasl-devel
@ -63,14 +64,14 @@ test -f %{buildroot}%{python3_sitearch}/_ldap.cpython-*.so
# Internal check: Unmatched modules are not supposed to be listed in %%{pyproject_files}
# We'll list them explicitly
grep -vF %{python3_sitearch}/ldif.py %{pyproject_files}
grep -vF %{python3_sitearch}/__pycache__/ldif.cpython-%{python3_version_nodots}.pyc %{pyproject_files}
grep -vF %{python3_sitearch}/__pycache__/ldif.cpython-%{python3_version_nodots}.opt-1.pyc %{pyproject_files}
grep -vF %{python3_sitearch}/slapdtest/ %{pyproject_files}
! grep -F %{python3_sitearch}/ldif.py %{pyproject_files}
! grep -F %{python3_sitearch}/__pycache__/ldif.cpython-%{python3_version_nodots}.pyc %{pyproject_files}
! grep -F %{python3_sitearch}/__pycache__/ldif.cpython-%{python3_version_nodots}.opt-1.pyc %{pyproject_files}
! grep -F %{python3_sitearch}/slapdtest %{pyproject_files}
# Internal check: Top level __pycache__ is never owned
grep -vE '/__pycache__$' %{pyproject_files}
grep -vE '/__pycache__/$' %{pyproject_files}
! grep -E '/__pycache__$' %{pyproject_files}
! grep -E '/__pycache__/$' %{pyproject_files}
%files -n python3-ldap -f %{pyproject_files}


@ -8,6 +8,7 @@ URL: https://github.com/lepture/mistune
Source0: %{url}/archive/v%{version}.tar.gz
BuildRequires: gcc
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
# optional dependency, listed explicitly to have the extension module:


@ -9,6 +9,7 @@ URL: https://github.com/os-autoinst/openQA-python-client
Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description


@ -10,6 +10,7 @@ Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: pyproject-rpm-macros
# purely for test purposes, we deliberately don't BR python3-devel here; in real spec files we recommend you do
%description
A pure Python library. The package contains tox.ini. Does not contain executables.


@ -0,0 +1,49 @@
Name: python-poetry-core
Version: 1.0.0
Release: 0%{?dist}
Summary: Poetry PEP 517 Build Backend
License: MIT
URL: https://pypi.org/project/poetry-core/
Source0: %{pypi_source poetry-core}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description
Test a build with pyproject.toml backend-path = [.]
poetry-core builds with poetry-core.
%package -n python3-poetry-core
Summary: %{summary}
%description -n python3-poetry-core
...
%prep
%autosetup -p1 -n poetry-core-%{version}
%generate_buildrequires
%pyproject_buildrequires
%build
%if 0%{?fedora} < 33 && 0%{?rhel} < 9
# the old pip version cannot handle backend-path properly; let's help it:
export PYTHONPATH=$PWD
%endif
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files poetry
%files -n python3-poetry-core -f %{pyproject_files}
%doc README.md
%license LICENSE


@ -8,6 +8,7 @@ URL: https://pytest.org
Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description


@ -14,6 +14,8 @@ BuildRequires: pyproject-rpm-macros
%description
This package uses multiple extras in %%pyproject_extras_subpkg and in
%%pyproject_buildrequires.
This test is mostly obsoleted by python-dns-lexicon.spec on Fedora 33+,
but we keep it around until Fedora 32 EOL.
%package -n python3-requests


@ -1,5 +1,14 @@
Name: python-setuptools_scm
# For testing purposes, we package different versions on different Fedoras,
# because otherwise we would miss some dependencies.
# Please don't write spec files like this in Fedora; it is forbidden.
%if 0%{?fedora} >= 33 || 0%{?rhel} >= 9
Version: 5.0.1
%else
Version: 3.5.0
%endif
Release: 0%{?dist}
Summary: The blessed package to manage your versions by SCM tags
License: MIT
@ -9,6 +18,8 @@ Source0: %{pypi_source setuptools_scm}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
BuildRequires: /usr/bin/git
BuildRequires: /usr/bin/hg
%description
Here we test that %%pyproject_extras_subpkg works and generates the
@ -16,6 +27,10 @@ setuptools_scm[toml] extra subpackage.
Note that it only works on Fedora 33+.
We also check passing multiple -e flags to %%pyproject_buildrequires.
The tox environments also have a dependency on an extra ("toml").
%package -n python3-setuptools_scm
Summary: %{summary}
@ -30,7 +45,15 @@ Summary: %{summary}
%generate_buildrequires
%if 0%{?fedora} >= 33 || 0%{?rhel} >= 9
# Note that you should not run flake8-like linters in Fedora spec files;
# here we do it solely to check the *ability* to use multiple toxenvs.
%pyproject_buildrequires -e %{default_toxenv}-test -e flake8
%else
# older Fedoras don't have the required runtime dependencies, so we don't test it there
%pyproject_buildrequires
%endif
%build
@ -43,6 +66,16 @@ Summary: %{summary}
%check
%if 0%{?fedora} >= 33 || 0%{?rhel} >= 9
# This tox should run all the toxenvs specified via -e in %%pyproject_buildrequires
# We only run some of the tests (running all of them requires a network connection and is slow)
%tox -- -- -k test_version | tee toxlog
# Internal check for our macros: Assert both toxenvs were executed.
grep -F 'py%{python3_version_nodots}-test: commands succeeded' toxlog
grep -F 'flake8: commands succeeded' toxlog
%endif
# Internal check for our macros
# making sure that %%{pyproject_ghost_distinfo} has the right content
test -f %{pyproject_ghost_distinfo}


@ -34,6 +34,12 @@ Summary: %{summary}
%pyproject_install
%pyproject_save_files zope +auto
%check
# Internal check that the RECORD and REQUESTED files are
# always removed in %%pyproject_install
test ! $(find %{buildroot}%{python3_sitelib}/ | grep -E "\.dist-info/RECORD$")
test ! $(find %{buildroot}%{python3_sitelib}/ | grep -E "\.dist-info/REQUESTED$")
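# (test ! succeeds only when grep prints nothing, so the build fails if either file is still installed)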
%files -n python3-zope-event -f %{pyproject_files}
%doc README.rst
%license LICENSE.txt


@ -25,12 +25,18 @@
- clikit:
dir: .
run: ./mocktest.sh python-clikit
- distroinfo:
dir: .
run: ./mocktest.sh python-distroinfo
- tldr:
dir: .
run: ./mocktest.sh tldr
- openqa_client:
dir: .
run: ./mocktest.sh python-openqa_client
- httpbin:
dir: .
run: ./mocktest.sh python-httpbin
- ldap:
dir: .
run: ./mocktest.sh python-ldap
@ -52,6 +58,21 @@
- zope:
dir: .
run: ./mocktest.sh python-zope-event
- django:
dir: .
run: ./mocktest.sh python-django
- printrun:
dir: .
run: ./mocktest.sh printrun
- dns_lexicon:
dir: .
run: ./mocktest.sh python-dns-lexicon
- flit_core:
dir: .
run: ./mocktest.sh python-flit-core
- poetry_core:
dir: .
run: ./mocktest.sh python-poetry-core
required_packages:
- mock
- rpmdevtools


@ -8,6 +8,7 @@ URL: https://github.com/tldr-pages/tldr-python-client
Source0: %{pypi_source}
BuildArch: noarch
BuildRequires: python3-devel
BuildRequires: pyproject-rpm-macros
%description