Compare commits

...

3 Commits
a8 ... c9-beta

12 changed files with 1806 additions and 353 deletions

View File

@ -79,8 +79,21 @@ using the `-R` flag:
%generate_buildrequires
%pyproject_buildrequires -R
Alternatively, if the project specifies its dependencies in the pyproject.toml
`[project]` table (as defined in [PEP 621](https://www.python.org/dev/peps/pep-0621/)),
the runtime dependencies can be obtained by reading that metadata.
This can be enabled by using the `-p` flag.
This flag supports reading both the runtime dependencies and the selected extras
(see the `-x` flag described below).
Please note that not all build backends which use pyproject.toml support the
`[project]` table scheme.
For example, poetry-core (at least in 1.9.0) defines package metadata in the
custom `[tool.poetry]` table which is not supported by the `%pyproject_buildrequires` macro.
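For illustration, the `-p` flag can be combined with extras (the extra name `testing` here is only an example):
%generate_buildrequires
%pyproject_buildrequires -p -x testing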
Finally, the runtime dependencies can be obtained by building the wheel and reading the metadata from the built wheel.
This can be enabled with the `-w` flag and cannot be combined with `-p`.
Support for building wheels with `%pyproject_buildrequires -w` is **provisional** and the behavior might change.
Please subscribe to Fedora's [python-devel list] if you use the option.
@ -111,6 +124,14 @@ For example, if upstream suggests installing test dependencies with
%generate_buildrequires
%pyproject_buildrequires -x testing
For projects that specify test requirements using [PEP 735] dependency groups,
these can be added using the `-g` flag.
Multiple groups can be supplied by repeating the flag or as a comma separated list.
For example, if upstream uses a dependency group called `tests`, the test deps would be generated by:
%generate_buildrequires
%pyproject_buildrequires -g tests
For projects that specify test requirements in their [tox] configuration,
these can be added using the `-t` flag (default tox environment)
or the `-e` flag followed by the tox environment.
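For example, if upstream runs its test suite in a dedicated tox environment (the environment name below is hypothetical), the test deps could be generated by:
%generate_buildrequires
%pyproject_buildrequires -e %{default_toxenv}-unit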
@ -134,20 +155,26 @@ The `-e` option redefines `%{toxenv}` for further reuse.
Use `%{default_toxenv}` to get the default value.
The `-t`/`-e` option uses [tox-current-env]'s `--print-deps-to-file` behind the scenes.
It generates dependencies listed directly in `deps`,
dependencies defined through `extras`,
and on tox 4.22+ also dependencies defined through `dependency_groups`.
If your package specifies some tox plugins in `tox.requires`,
such plugins will be BuildRequired as well.
Not all plugins are guaranteed to play well with [tox-current-env];
in the worst case, patch/sed the requirement out of the tox configuration.
Note that neither `-x` nor `-t` can be used with `-R` or `-N`,
because runtime dependencies are always required for testing.
You can only use those options if the build backend supports the [prepare-metadata-for-build-wheel hook],
or together with `-p` or `-w`.
However, using `-g` with `-R` or `-N` is supported because dependency groups don't need to be used for testing
and can be obtained by reading `pyproject.toml` only.
[tox]: https://tox.readthedocs.io/
[tox-current-env]: https://github.com/fedora-python/tox-current-env/
[prepare-metadata-for-build-wheel hook]: https://www.python.org/dev/peps/pep-0517/#prepare-metadata-for-build-wheel
[python-devel list]: https://lists.fedoraproject.org/archives/list/python-devel@lists.fedoraproject.org/
In addition to the generated requirements, you can supply multiple file names to the `%pyproject_buildrequires` macro.
Dependencies will be loaded from them:
@ -157,12 +184,49 @@ Dependencies will be loaded from them:
For packages not using a build system, you can use `-N` to entirely skip the automatic
generation of requirements and install requirements only from manually specified files.
The `-N` option implies `-R` and cannot be used in combination with the other options mentioned above
(`-w`, `-e`, `-t`, `-x`, `-p`).
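As a sketch (the requirements file name is hypothetical):
%generate_buildrequires
%pyproject_buildrequires -N build-requirements.txt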
The `%pyproject_buildrequires` macro also accepts the `-r` flag for backward compatibility;
it means "include runtime dependencies" which has been the default since version 0-53.
Passing config settings to build backends
-----------------------------------------
The `%pyproject_buildrequires` and `%pyproject_wheel` macros accept a `-C` flag
to pass [configuration settings][config_settings] to the build backend.
Options take the form of `-C KEY`, `-C KEY=VALUE`, or `-C--option-with-dashes`.
Pass `-C` multiple times to specify multiple options.
This option is equivalent to pip's `--config-settings` flag.
These are passed on to PEP 517 hooks' `config_settings` argument as a Python
dictionary.
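For instance (the config setting key is backend-specific; `--no-cython-compile` is only an illustration for a setuptools/Cython based project):
%build
%pyproject_wheel -C--global-option=--no-cython-compile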
The `%pyproject_buildrequires` macro passes these options to the
`get_requires_for_build_wheel` and `prepare_metadata_for_build_wheel` hooks.
Passing `-C` to `%pyproject_buildrequires` is incompatible with `-N` which does
not call these hooks at all.
The `%pyproject_wheel` macro passes these options to the `build_wheel` hook.
Consult the project's upstream documentation and/or the corresponding build
backend's documentation for more information.
Note that some projects don't use config settings at all
and other projects may only accept config settings for one of the two steps.
Note that the current implementation of the macros uses `pip` to build wheels.
On some systems (notably on RHEL 9 with Python 3.9),
`pip` is too old to understand `--config-settings`.
Using the `-C` option for `%pyproject_wheel` (or `%pyproject_buildrequires -w`)
is not supported there and will result in an error like:
Usage:
/usr/bin/python3 -m pip wheel [options] <requirement specifier> ...
...
no such option: --config-settings
[config_settings]: https://peps.python.org/pep-0517/#config-settings
Running tox based tests
-----------------------
@ -250,7 +314,13 @@ However, in Fedora packages, always list executables explicitly to avoid uninten
`%pyproject_save_files` can automatically mark license files with `%license` macro
and language (`*.mo`) files with `%lang` macro and appropriate language code.
Only license files declared via [PEP 639] `License-File` field are detected.
[PEP 639] is still provisional and can be changed in the future.
It is possible to use the `-l` flag to declare that a missing license should
terminate the build or `-L` (the default) to explicitly disable this check.
Packagers are encouraged to use the `-l` flag when the `%license` file is not manually listed in `%files`
to avoid accidentally losing the file in a future version.
When the `%license` file is manually listed in `%files`,
packagers can use the `-L` flag to ensure future compatibility in case the `-l` behavior eventually becomes a default.
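A sketch of the stricter variant (the module name `mymodule` is hypothetical):
%install
%pyproject_install
%pyproject_save_files -l mymodule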
Note that `%pyproject_save_files` uses data from the [RECORD file](https://www.python.org/dev/peps/pep-0627/).
If you wish to rename, remove or otherwise change the installed files of a package
@ -305,6 +375,12 @@ The `%pyproject_check_import` macro also accepts positional arguments with
additional qualified module names to check, useful for example if some modules are installed manually.
Note that filtering by `-t`/`-e` also applies to the positional arguments.
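A sketch (the module names are hypothetical):
%check
%pyproject_check_import -e 'mymodule.tests*' mymodule._manual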
Another macro, `%_pyproject_check_import_allow_no_modules`, allows the import check to pass
even if no Python modules are detected in the package.
This may be a valid case for packages containing e.g. typing stubs.
Don't use this macro in Fedora packages.
It's only intended to be used in automated build environments such as Copr.
Generating Extras subpackages
-----------------------------
@ -336,89 +412,76 @@ These arguments are still required:
Multiple subpackages are generated when multiple names are provided.
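For illustration (the package and extra names are hypothetical):
%pyproject_extras_subpkg -n python3-mypkg testing docs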
Provisional: Declarative Buildsystem (RPM 4.20+)
------------------------------------------------
It is possible to reduce some of the spec boilerplate by using the provided
pyproject [declarative buildsystem].
This option is only available with RPM 4.20+ (e.g. in Fedora 41+).
The declarative buildsystem is **provisional** and the behavior might change.
Please subscribe to Fedora's [python-devel list] if you use the feature.
To enable the pyproject declarative buildsystem, use the following:
BuildSystem: pyproject
BuildOption(install): <options for %%pyproject_save_files>
That way, RPM will automatically fill in the `%prep`, `%generate_buildrequires`,
`%build`, `%install`, and `%check` sections with the following defaults:
%prep
%autosetup -p1 -C
%generate_buildrequires
%pyproject_buildrequires
%build
%pyproject_wheel
%install
%pyproject_install
%pyproject_save_files <options from BuildOption(install)>
%check
%pyproject_check_import
To pass options to the individual macros, use `BuildOption` (see the [documentation of declarative buildsystems][declarative buildsystem]).
# pass options for %%pyproject_save_files (mandatory when not overriding %%install)
BuildOption(install): -l _module +auto
# replace the default options for %%autosetup
BuildOption(prep): -S git_am -C
# pass options to %%pyproject_buildrequires
BuildOption(generate_buildrequires): docs-requirements.txt -t
# pass options to %%pyproject_wheel
BuildOption(build): -C--global-option=--no-cython-compile
# pass options to %%pyproject_check_import
BuildOption(check): -e '*.test*'
Alternatively, you can supply your own sections to override the automatic ones:
BuildOption(generate_buildrequires): -w
...
%build
# do nothing, the wheel was built in %%generate_buildrequires
You can append to the end of the automatic sections:
%check -a
# run %%pytest after %%pyproject_check_import
%pytest
Or prepend to the beginning of them:
%prep -p
# run %%gpgverify before %%autosetup
%gpgverify -k2 -s1 -d0
[declarative buildsystem]: https://rpm-software-management.github.io/rpm/manual/buildsystem.html
PROVISIONAL: Importing just-built (extension) modules in %build
---------------------------------------------------------------
Sometimes, it is desired to be able to import the *just-built* extension modules
in the `%build` section, e.g. to build the documentation with Sphinx.
%build
%pyproject_wheel
... build the docs here ...
With pure Python packages, it might be possible to set `PYTHONPATH=${PWD}` or `PYTHONPATH=${PWD}/src`.
However, it is a bit more complicated with extension modules.
The location of just-built modules might differ depending on Python version, architecture, pip version, etc.
Hence, the macro `%{pyproject_build_lib}` exists to be used like this:
%build
%pyproject_wheel
PYTHONPATH=%{pyproject_build_lib} ... build the docs here ...
This macro is currently **provisional** and the behavior might change.
Please subscribe to Fedora's [python-devel list] if you use the macro.
The `%{pyproject_build_lib}` macro expands to a shell `$(...)` expression and does not work when put into single quotes (`'`).
Depending on the pip version, the expanded value will differ:
### New pip 21.3+ with in-tree-build and setuptools 62.1+ (Fedora 37+)
Always use the macro from the same directory where you called `%pyproject_wheel` from.
The value will expand to something like:
* `/builddir/build/BUILD/%{name}-%{version}/build/lib.linux-x86_64-cpython-311` for wheels with extension modules
* `/builddir/build/BUILD/%{name}-%{version}/build/lib` for pure Python wheels
If multiple wheels were built from the same directory,
some pure Python and some with extension modules,
the expanded value will be combined with `:`:
* `/builddir/build/BUILD/%{name}-%{version}/build/lib.linux-x86_64-cpython-311:/builddir/build/BUILD/%{name}-%{version}/build/lib`
If multiple wheels were built from different directories,
the value will differ depending on the current directory.
### New pip 21.3+ with in-tree-build and older setuptools (Fedora 36)
Always use the macro from the same directory where you called `%pyproject_wheel` from.
The value will expand to something like:
* `/builddir/build/BUILD/%{name}-%{version}/build/lib.linux-x86_64-3.10` for wheels with extension modules
* `/builddir/build/BUILD/%{name}-%{version}/build/lib` for pure Python wheels
If multiple wheels were built from the same directory,
some pure Python and some with extension modules,
the expanded value will be combined with `:`:
* `/builddir/build/BUILD/%{name}-%{version}/build/lib.linux-x86_64-3.10:/builddir/build/BUILD/%{name}-%{version}/build/lib`
If multiple wheels were built from different directories,
the value will differ depending on the current directory.
### Older pip with out-of-tree-build (Fedora 35 and EL 9)
The value will expand to something like:
* `/builddir/build/BUILD/%{name}-%{version}/.pyproject-builddir/pip-req-build-xxxxxxxx/build/lib.linux-x86_64-3.10` for wheels with extension modules
* `/builddir/build/BUILD/%{name}-%{version}/.pyproject-builddir/pip-req-build-xxxxxxxx/build/lib` for pure Python wheels
Note that the exact value is **not stable** between builds
(the `xxxxxxxx` part is randomly generated,
nor should you consider the `.pyproject-builddir` directory to remain stable).
If multiple wheels are built,
the expanded value will always be combined with `:` regardless of the current directory, e.g.:
* `/builddir/build/BUILD/%{name}-%{version}/.pyproject-builddir/pip-req-build-xxxxxxxx/build/lib.linux-x86_64-3.10:/builddir/build/BUILD/%{name}-%{version}/.pyproject-builddir/pip-req-build-yyyyyyyy/build/lib.linux-x86_64-3.10:/builddir/build/BUILD/%{name}-%{version}/.pyproject-builddir/pip-req-build-zzzzzzzz/build/lib`
**Note:** If you manage to build some wheels with the in-tree-build and some with the out-of-tree-build option,
the expanded value will contain all relevant directories.
Limitations
@ -470,9 +533,16 @@ so be prepared for problems.
[PEP 517]: https://www.python.org/dev/peps/pep-0517/
[PEP 518]: https://www.python.org/dev/peps/pep-0518/
[PEP 639]: https://www.python.org/dev/peps/pep-0639/
[PEP 735]: https://www.python.org/dev/peps/pep-0735/
[pip's documentation]: https://pip.pypa.io/en/stable/cli/pip_install/#vcs-support
Deprecated
----------
The `%{pyproject_build_lib}` macro is deprecated, don't use it.
Testing the macros
------------------

View File

@ -4,4 +4,14 @@
# this macro will cause the package with the real macro to be installed.
# When macros.pyproject is installed, it overrides this macro.
# Note: This needs to maintain the same set of options as the real macro.
%pyproject_buildrequires(rRxtNwpe:g:C:) echo 'pyproject-rpm-macros' && exit 0
# Declarative buildsystem, requires RPM 4.20+ to work
# https://rpm-software-management.github.io/rpm/manual/buildsystem.html
# This is the minimal implementation to be in the srpm package,
# as required even before the BuildRequires are installed
%buildsystem_pyproject_conf() %nil
%buildsystem_pyproject_generate_buildrequires() %pyproject_buildrequires %*
%buildsystem_pyproject_build() %nil
%buildsystem_pyproject_install() %nil

View File

@ -1,5 +1,9 @@
# This is a backward-compatible suffix used in all pyproject-rpm-macros directories
# For the main Python it's empty, for all others it's "-3.X"
%_pyproject_files_pkgversion %{expr:"%{python3_pkgversion}" != "3" ? "-%{python3_pkgversion}" : ""}
# This is a directory where wheels are stored and installed from, absolute
%_pyproject_wheeldir %{_builddir}%{?buildsubdir:/%{buildsubdir}}/pyproject-wheeldir%{_pyproject_files_pkgversion}
# This is a directory used as TMPDIR, where pip copies sources to and builds from, relative to PWD
# For proper debugsource packages, we create TMPDIR within PWD
@ -8,32 +12,46 @@
# This will be used in debugsource package paths (applies to extension modules only)
# NB: pytest collects tests from here if not hidden
# https://docs.pytest.org/en/latest/reference.html#confval-norecursedirs
%_pyproject_builddir %{_builddir}%{?buildsubdir:/%{buildsubdir}}/.pyproject-builddir%{_pyproject_files_pkgversion}
# We prefix all created files with this value to make them unique
# Ideally, we would put them into %%{buildsubdir}, but that value changes during the spec
# The used value is similar to the one used to define the default %%buildroot
%_pyproject_files_prefix %{name}-%{version}-%{release}.%{_arch}%{_pyproject_files_pkgversion}
%pyproject_files %{_builddir}/%{_pyproject_files_prefix}-pyproject-files
%_pyproject_modules %{_builddir}/%{_pyproject_files_prefix}-pyproject-modules
%_pyproject_ghost_distinfo %{_builddir}/%{_pyproject_files_prefix}-pyproject-ghost-distinfo
%_pyproject_record %{_builddir}/%{_pyproject_files_prefix}-pyproject-record
%_pyproject_buildrequires %{_builddir}/%{_pyproject_files_prefix}-pyproject-buildrequires
# Internal macro, takes %%set_build_flags and strips all the exports
# TODO: Make such a list an actual source of %%set_build_flags (in redhat-rpm-config)
# Cannot use %%gsub directly to preserve EL 9 compatibility
%_pyproject_build_flags %{lua:local exports = rpm.expand('%{set_build_flags} ;'); print((exports:gsub('%s*;+%s+export%s+[%u_]+%s*;+%s*', ' ')))}
# Avoid leaking %%{_pyproject_builddir} to pytest collection
# https://bugzilla.redhat.com/show_bug.cgi?id=1935212
# The value is read and used by the %%pytest and %%tox macros:
%_set_pytest_addopts %global __pytest_addopts --ignore=%{_pyproject_builddir}
%pyproject_wheel(C:) %{expand:\\\
%_set_pytest_addopts
mkdir -p "%{_pyproject_builddir}"
%{_pyproject_build_flags} \\\
TMPDIR="%{_pyproject_builddir}" \\\
%{__python3} -Bs %{_rpmconfigdir}/redhat/pyproject_wheel.py %{?**} %{_pyproject_wheeldir}
}
%pyproject_build_lib %{!?__pyproject_build_lib_warned:%{warn:The %%{pyproject_build_lib} macro is deprecated.
It only works with setuptools and is not build-backend-agnostic.
The macro is not scheduled for removal, but there is a possibility of incompatibilities with future versions of setuptools.
As a replacement for the macro for the setuptools backend on Fedora 37+, you can use $PWD/build/lib for pure Python packages,
or $PWD/build/lib.%%{python3_platform}-cpython-%%{python3_version_nodots} for packages with extension modules.
Other build backends and older distributions may need different paths.
See https://lists.fedoraproject.org/archives/list/python-devel@lists.fedoraproject.org/thread/HMLOPAU3RZLXD4BOJHTIPKI3I4U6U7OE/ for details.
}%global __pyproject_build_lib_warned 1}%{expand:\\\
$(
pyproject_build_lib=()
if [ -d build/lib.%{python3_platform}-cpython-%{python3_version_nodots} ]; then
@ -57,6 +75,10 @@ echo $(IFS=:; echo "${pyproject_build_lib[*]}")
%pyproject_install() %{expand:\\\
specifier=$(ls %{_pyproject_wheeldir}/*.whl | xargs basename --multiple | sed -E 's/([^-]+)-([^-]+)-.+\\\.whl/\\\1==\\\2/')
if [ -z $specifier ]; then
echo 'ERROR: %%%%pyproject_install found no wheel in %%%%{_pyproject_wheeldir} %{_pyproject_wheeldir}' >&2
exit 1
fi
TMPDIR="%{_pyproject_builddir}" %{__python3} -m pip install --root %{buildroot} --prefix %{_prefix} --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links %{_pyproject_wheeldir} $specifier
if [ -d %{buildroot}%{_bindir} ]; then
%py3_shebang_fix %{buildroot}%{_bindir}/*
@ -93,10 +115,15 @@ fi
# Note: the three times nested questionmarked -i -f -F pattern means: If none of those options was used -- in that case, we inject our own -f
%pyproject_extras_subpkg(n:i:f:FaA) %{expand:%{?python_extras_subpkg:%{python_extras_subpkg%{?!-i:%{?!-f:%{?!-F: -f %{_pyproject_ghost_distinfo}}}} %**}}}
# Escaping shell-globs, percentage signs and spaces was reworked in RPM 4.19+
# https://github.com/rpm-software-management/rpm/issues/1749#issuecomment-1020420616
# Since we support both ways, we pass either 4.19 or 4.18 to the script, so it knows which one to use
# Rather than passing the actual version, we let RPM compare the versions, as it is easier done here than in Python
%pyproject_save_files(lL) %{expand:\\\
%{expr:v"0%{?rpmversion}" >= v"4.18.90" ? "RPM_FILES_ESCAPE=4.19" : "RPM_FILES_ESCAPE=4.18" } \\
%{__python3} %{_rpmconfigdir}/redhat/pyproject_save_files.py \\
--output-files "%{pyproject_files}" \\
--output-modules "%{_pyproject_modules}" \\
@ -106,7 +133,7 @@ fi
--python-version "%{python3_version}" \\
--pyproject-record "%{_pyproject_record}" \\
--prefix "%{_prefix}" \\
%{**}
}
# -t - Process only top-level modules
@ -120,22 +147,38 @@ fi
}
%_pyproject_check_import_allow_no_modules(e:t) \
if [ -z "$(cat %{_pyproject_modules})" ]; then\
echo "No modules to check found, exiting check"\
else\
%pyproject_check_import %{?**}\
fi
%default_toxenv py%{python3_version_nodots}
%toxenv %{default_toxenv}
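# Echoes the dependency needed to read pyproject.toml: tomli when the targeted Python is older than 3.11, nothing otherwise (tomllib is in the standard library)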
%_pyproject_tomlidep %["%{python3_pkgversion}" == "3"\
? "echo '(python%{python3_pkgversion}dist(tomli) if python%{python3_pkgversion}-devel < 3.11)'"\
: "%[v"%{python3_pkgversion}" < v"3.11"\
? "echo 'python%{python3_pkgversion}dist(tomli)'"\
: "true # will use tomllib, echo nothing"\
]"\
]
# Note: Keep the options in sync with this macro from macros.aaa-pyproject-srpm
%pyproject_buildrequires(rRxtNwpe:g:C:) %{expand:\\\
%_set_pytest_addopts
# The _auto_set_build_flags feature does not do this in %%generate_buildrequires section,
# but we want to get an environment consistent with %%build:
%{?_auto_set_build_flags:%set_build_flags}
# The default flags expect the package note file to exist
# see https://bugzilla.redhat.com/show_bug.cgi?id=2097535
%{?_package_note_flags:%_generate_package_note_file}
%{-R:
%{-r:%{error:The -R and -r options are mutually exclusive}}
%{-x:%{error:The -R and -x options are mutually exclusive}}
%{-e:%{error:The -R and -e options are mutually exclusive}}
%{-t:%{error:The -R and -t options are mutually exclusive}}
%{-w:%{error:The -R and -w options are mutually exclusive}}
%{-p:%{error:The -R and -p options are mutually exclusive}}
}
%{-N:
%{-r:%{error:The -N and -r options are mutually exclusive}}
@ -143,24 +186,25 @@ fi
%{-e:%{error:The -N and -e options are mutually exclusive}}
%{-t:%{error:The -N and -t options are mutually exclusive}}
%{-w:%{error:The -N and -w options are mutually exclusive}}
%{-p:%{error:The -N and -p options are mutually exclusive}}
%{-C:%{error:The -N and -C options are mutually exclusive}}
%{-g:if [ -f pyproject.toml ]; then
%_pyproject_tomlidep
fi}
}
%{-w:
%{-p:%{error:The -w and -p options are mutually exclusive}}
}
%{-e:%{expand:%global toxenv %(%{__python3} -s %{_rpmconfigdir}/redhat/pyproject_construct_toxenv.py %{?**})}}
echo 'pyproject-rpm-macros' # first stdout line matches the implementation in macros.aaa-pyproject-srpm
echo 'python%{python3_pkgversion}-devel'
echo 'python%{python3_pkgversion}dist(packaging)'
%{!-N:echo 'python%{python3_pkgversion}dist(pip) >= 19'
if [ -f pyproject.toml ]; then
%_pyproject_tomlidep
elif [ -f setup.py ]; then
# Note: If the default requirements change, also change them in the script!
echo 'python%{python3_pkgversion}dist(setuptools) >= 40.8'
echo 'python%{python3_pkgversion}dist(wheel)'
else
echo 'ERROR: Neither pyproject.toml nor setup.py found, consider using %%%%pyproject_buildrequires -N <requirements-file> if this is not a Python package.' >&2
exit 1
@ -169,9 +213,14 @@ fi}
rm -rfv *.dist-info/ >&2
if [ -f %{__python3} ]; then
mkdir -p "%{_pyproject_builddir}"
echo -n > %{_pyproject_buildrequires}
%{_pyproject_build_flags} \\\
TMPDIR="%{_pyproject_builddir}" \\\
RPM_TOXENV="%{toxenv}" HOSTNAME="rpmbuild" %{__python3} -Bs %{_rpmconfigdir}/redhat/pyproject_buildrequires.py %{?!_python_no_extras_requires:--generate-extras} --python3_pkgversion %{python3_pkgversion} --wheeldir %{_pyproject_wheeldir} --output %{_pyproject_buildrequires} %{?**} >&2
cat %{_pyproject_buildrequires}
fi
# Incomplete .dist-info dir might confuse importlib.metadata
rm -rfv *.dist-info/ >&2
}
@ -184,3 +233,13 @@ PYTHONPATH="${PYTHONPATH:-%{buildroot}%{python3_sitearch}:%{buildroot}%{python3_
HOSTNAME="rpmbuild" \\
%{__python3} -m tox --current-env -q --recreate -e "%{-e:%{-e*}}%{!-e:%{toxenv}}" %{?*}
}
# Declarative buildsystem, requires RPM 4.20+ to work
# https://rpm-software-management.github.io/rpm/manual/buildsystem.html
%buildsystem_pyproject_conf() %nil
%buildsystem_pyproject_generate_buildrequires() %pyproject_buildrequires %*
%buildsystem_pyproject_build() %pyproject_wheel %*
%buildsystem_pyproject_install() %["%{shrink:%*}" == "" ? "%{error:BuildOption(install) is mandatory with pyproject BuildSystem.}" : "%pyproject_install \
%pyproject_save_files %*"]
%buildsystem_pyproject_check() %pyproject_check_import %*

View File

@ -4,18 +4,18 @@ import os
import sys
import importlib.metadata
import argparse
import tempfile
import traceback
import contextlib
import json
import subprocess
import re
import tempfile
import email.parser
import functools
import pathlib
import zipfile
from pyproject_requirements_txt import convert_requirements_txt
from pyproject_wheel import parse_config_settings_args
# Some valid Python version specifiers are not supported.
@ -35,6 +35,7 @@ def print_err(*args, **kwargs):
try:
from packaging.markers import Marker
from packaging.requirements import Requirement, InvalidRequirement
from packaging.utils import canonicalize_name
except ImportError as e:
@ -46,39 +47,6 @@ except ImportError as e:
from pyproject_convert import convert
@contextlib.contextmanager
def hook_call():
"""Context manager that records all stdout content (on FD level)
and prints it to stderr at the end, with a 'HOOK STDOUT: ' prefix."""
tmpfile = io.TextIOWrapper(
tempfile.TemporaryFile(buffering=0),
encoding='utf-8',
errors='replace',
write_through=True,
)
stdout_fd = 1
stdout_fd_dup = os.dup(stdout_fd)
stdout_orig = sys.stdout
# begin capture
sys.stdout = tmpfile
os.dup2(tmpfile.fileno(), stdout_fd)
try:
yield
finally:
# end capture
sys.stdout = stdout_orig
os.dup2(stdout_fd_dup, stdout_fd)
tmpfile.seek(0) # rewind
for line in tmpfile:
print_err('HOOK STDOUT:', line, end='')
tmpfile.close()
def guess_reason_for_invalid_requirement(requirement_str):
if ':' in requirement_str:
message = (
@ -100,10 +68,11 @@ def guess_reason_for_invalid_requirement(requirement_str):
class Requirements:
"""Requirement gatherer. The macro will eventually print out output_lines."""
def __init__(self, get_installed_version, extras=None,
generate_extras=False, python3_pkgversion='3', config_settings=None):
self.get_installed_version = get_installed_version
self.output_lines = []
self.extras = set()
if extras:
@ -111,9 +80,11 @@ class Requirements:
self.add_extras(*extra.split(','))
self.missing_requirements = False
self.ignored_alien_requirements = []
self.generate_extras = generate_extras
self.python3_pkgversion = python3_pkgversion
self.config_settings = config_settings
def add_extras(self, *extras):
self.extras |= set(e.strip() for e in extras)
@ -130,15 +101,20 @@ class Requirements:
return True
return False
def add(self, requirement, *, package_name=None, source=None, extra=None):
"""Output a Python-style requirement string as RPM dep"""
requirement_str = str(requirement)
print_err(f'Handling {requirement_str} from {source}')
# requirements read initially from the metadata are strings
# further on we work with them as Requirement instances
if not isinstance(requirement, Requirement):
try:
requirement = Requirement(requirement)
except InvalidRequirement:
hint = guess_reason_for_invalid_requirement(requirement)
message = f'Requirement {requirement!r} from {source} is invalid.'
if hint:
message += f' Hint: {hint}'
raise ValueError(message)
@ -149,9 +125,31 @@ class Requirements:
)
name = canonicalize_name(requirement.name)
if extra is not None:
extra_str = f'extra == "{extra}"'
if requirement.marker is not None:
extra_str = f'({requirement.marker}) and {extra_str}'
requirement.marker = Marker(extra_str)
if (requirement.marker is not None and
not self.evaluate_all_environments(requirement)):
print_err(f'Ignoring alien requirement:', requirement_str)
self.ignored_alien_requirements.append(requirement)
return
# Handle self-referencing requirements
if package_name and canonicalize_name(package_name) == name:
# Self-referential extras need to be handled specially
if requirement.extras:
if not (requirement.extras <= self.extras): # only handle it if needed
# let all further requirements know we want those extras
self.add_extras(*requirement.extras)
# re-add all of the alien requirements ignored in the past
# they might no longer be alien now
self.readd_ignored_alien_requirements(package_name=package_name)
else:
print_err(f'Ignoring self-referential requirement without extras:', requirement_str)
return return
# We need to always accept pre-releases as satisfying the requirement
@ -192,12 +190,12 @@ class Requirements:
together.append(convert(python3dist(name, python3_pkgversion=self.python3_pkgversion),
specifier.operator, specifier.version))
if len(together) == 0:
dep = python3dist(name, python3_pkgversion=self.python3_pkgversion)
self.output_lines.append(dep)
elif len(together) == 1:
self.output_lines.append(together[0])
else:
self.output_lines.append(f"({' with '.join(together)})")
def check(self, *, source=None):
"""End current pass if any unsatisfied dependencies were output"""
@ -210,26 +208,29 @@ class Requirements:
for req_str in requirement_strs:
self.add(req_str, **kwargs)
def readd_ignored_alien_requirements(self, **kwargs):
"""add() previously ignored alien requirements again."""
requirements, self.ignored_alien_requirements = self.ignored_alien_requirements, []
kwargs.setdefault('source', 'Previously ignored alien requirements')
self.extend(requirements, **kwargs)
def toml_load(opened_binary_file):
try:
# tomllib is in the standard library since 3.11.0b1
import tomllib
except ImportError:
try:
import tomli as tomllib
except ImportError as e:
print_err('Import error:', e)
# already echoed by the %pyproject_buildrequires macro
sys.exit(0)
return tomllib.load(opened_binary_file)
@functools.cache
def load_pyproject():
try:
f = open('pyproject.toml', 'rb')
except FileNotFoundError:
@ -237,6 +238,11 @@ def get_backend(requirements):
else:
with f:
pyproject_data = toml_load(f)
return pyproject_data
def get_backend(requirements):
pyproject_data = load_pyproject()
buildsystem_data = pyproject_data.get('build-system', {})
requirements.extend(
@ -262,7 +268,6 @@ def get_backend(requirements):
# with pyproject.toml without a specified build backend.
# If the default requirements change, also change them in the macro!
requirements.add('setuptools >= 40.8', source='default build backend')
requirements.add('wheel', source='default build backend')
requirements.check(source='build backend')
@ -285,17 +290,30 @@ def get_backend(requirements):
def generate_build_requirements(backend, requirements):
get_requires = getattr(backend, 'get_requires_for_build_wheel', None)
if get_requires:
new_reqs = get_requires(config_settings=requirements.config_settings)
requirements.extend(new_reqs, source='get_requires_for_build_wheel')
requirements.check(source='get_requires_for_build_wheel')
def parse_metadata_file(metadata_file):
return email.parser.Parser().parse(metadata_file, headersonly=True)
def requires_from_parsed_metadata_file(message):
return {k: message.get_all(k, ()) for k in ('Requires', 'Requires-Dist')}
def package_name_from_parsed_metadata_file(message):
return message.get('name')
def package_name_and_requires_from_metadata_file(metadata_file):
message = parse_metadata_file(metadata_file)
package_name = package_name_from_parsed_metadata_file(message)
requires = requires_from_parsed_metadata_file(message)
return package_name, requires
def generate_run_requirements_hook(backend, requirements):
hook_name = 'prepare_metadata_for_build_wheel'
prepare_metadata = getattr(backend, hook_name, None)
@ -303,14 +321,18 @@ def generate_run_requirements_hook(backend, requirements):
raise ValueError(
'The build backend cannot provide build metadata '
'(incl. runtime requirements) before build. '
'If the dependencies are specified in the pyproject.toml [project] '
'table, you can use the -p flag to read them. '
'Alternatively, use the provisional -w flag to build the wheel and parse the metadata from it, '
'or use the -R flag not to generate runtime dependencies.'
)
dir_basename = prepare_metadata('.', config_settings=requirements.config_settings)
with open(dir_basename + '/METADATA') as metadata_file:
name, requires = package_name_and_requires_from_metadata_file(metadata_file)
for key, req in requires.items():
requirements.extend(req,
package_name=name,
source=f'hook generated metadata: {key} ({name})')
def find_built_wheel(wheeldir):
@ -327,8 +349,17 @@ def generate_run_requirements_wheel(backend, requirements, wheeldir):
# Reuse the wheel from the previous round of %pyproject_buildrequires (if it exists)
wheel = find_built_wheel(wheeldir)
if not wheel:
# pip is already echoed from the macro
# but we need to explicitly restart if it has not yet been installed
# see https://bugzilla.redhat.com/2169855
requirements.add('pip >= 19', source='%pyproject_buildrequires -w')
requirements.check(source='%pyproject_buildrequires -w')
import pyproject_wheel
returncode = pyproject_wheel.build_wheel(
wheeldir=wheeldir,
stdout=sys.stderr,
config_settings=requirements.config_settings,
)
if returncode != 0:
raise RuntimeError('Failed to build the wheel for %pyproject_buildrequires -w.')
wheel = find_built_wheel(wheeldir)
@ -340,15 +371,45 @@ def generate_run_requirements_wheel(backend, requirements, wheeldir):
for name in wheelfile.namelist():
if name.count('/') == 1 and name.endswith('.dist-info/METADATA'):
with io.TextIOWrapper(wheelfile.open(name), encoding='utf-8') as metadata_file:
name, requires = package_name_and_requires_from_metadata_file(metadata_file)
for key, req in requires.items():
requirements.extend(req,
package_name=name,
source=f'built wheel metadata: {key} ({name})')
break
else:
raise RuntimeError('Could not find *.dist-info/METADATA in built wheel.')
def generate_run_requirements_pyproject(requirements):
pyproject_data = load_pyproject()
if not (project_table := pyproject_data.get('project', {})):
raise ValueError('Could not find the [project] table in pyproject.toml.')
dynamic_fields = project_table.get('dynamic', [])
if 'dependencies' in dynamic_fields or 'optional-dependencies' in dynamic_fields:
raise ValueError('Could not read the dependencies or optional-dependencies '
'from the [project] table in pyproject.toml, as the field is dynamic.')
dependencies = project_table.get('dependencies', [])
name = project_table.get('name')
requirements.extend(dependencies,
package_name=name,
source=f'pyproject.toml generated metadata: [dependencies] ({name})')
optional_dependencies = project_table.get('optional-dependencies', {})
for extra, dependencies in optional_dependencies.items():
requirements.extend(dependencies,
package_name=name,
source=f'pyproject.toml generated metadata: [optional-dependencies] {extra} ({name})',
extra=extra)
def generate_run_requirements(backend, requirements, *, build_wheel, read_pyproject_dependencies, wheeldir):
if read_pyproject_dependencies:
generate_run_requirements_pyproject(requirements)
elif build_wheel:
generate_run_requirements_wheel(backend, requirements, wheeldir)
else:
generate_run_requirements_hook(backend, requirements)
@ -378,7 +439,7 @@ def generate_tox_requirements(toxenv, requirements):
provision_content = provision.read()
if provision_content and r.returncode != 0:
provision_requires = json.loads(provision_content)
if provision_requires.get('minversion') is not None:
requirements.add(f'tox >= {provision_requires["minversion"]}',
source='tox provision (minversion)')
if 'requires' in provision_requires:
@ -398,6 +459,103 @@ def generate_tox_requirements(toxenv, requirements):
source=f'tox --print-deps-only: {toxenv}')
def tox_dependency_groups(toxenv):
# We call this command separately instead of folding it into the previous one
# because --print-dependency-groups-to only works with tox 4.22+ and tox-current-env 0.0.14+.
# We handle failure gracefully: upstreams using dependency_groups should require tox >= 4.22.
toxenv = ','.join(toxenv)
with tempfile.NamedTemporaryFile('r') as groups:
r = subprocess.run(
[sys.executable, '-m', 'tox',
'--print-dependency-groups-to', groups.name,
'-q', '-e', toxenv],
check=False,
encoding='utf-8',
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
)
if r.returncode == 0:
if r.stdout:
print_err(r.stdout, end='')
if output := groups.read().strip():
return output.splitlines()
return []
def generate_dependency_groups(requested_groups, requirements):
"""Adapted from https://peps.python.org/pep-0735/#reference-implementation (public domain)"""
from collections import defaultdict
def _normalize_name(name: str) -> str:
return re.sub(r"[-_.]+", "-", name).lower()
def _normalize_group_names(dependency_groups: dict) -> dict:
original_names = defaultdict(list)
normalized_groups = {}
for group_name, value in dependency_groups.items():
normed_group_name = _normalize_name(group_name)
original_names[normed_group_name].append(group_name)
normalized_groups[normed_group_name] = value
errors = []
for normed_name, names in original_names.items():
if len(names) > 1:
errors.append(f"{normed_name} ({', '.join(names)})")
if errors:
raise ValueError(f"Duplicate dependency group names: {', '.join(errors)}")
return normalized_groups
def _resolve_dependency_group(
dependency_groups: dict, group: str, past_groups: tuple[str, ...] = ()
) -> list[str]:
if group in past_groups:
raise ValueError(f"Cyclic dependency group include: {group} -> {past_groups}")
if group not in dependency_groups:
raise LookupError(f"Dependency group '{group}' not found")
raw_group = dependency_groups[group]
if not isinstance(raw_group, list):
raise ValueError(f"Dependency group '{group}' is not a list")
realized_group = []
for item in raw_group:
if isinstance(item, str):
realized_group.append(item)
elif isinstance(item, dict):
if tuple(item.keys()) != ("include-group",):
raise ValueError(f"Invalid dependency group item: {item}")
include_group = _normalize_name(next(iter(item.values())))
realized_group.extend(
_resolve_dependency_group(
dependency_groups, include_group, past_groups + (group,)
)
)
else:
raise ValueError(f"Invalid dependency group item: {item}")
return realized_group
def resolve(dependency_groups: dict, group: str) -> list[str]:
if not isinstance(dependency_groups, dict):
raise TypeError("Dependency Groups table is not a dict")
return _resolve_dependency_group(dependency_groups, _normalize_name(group))
pyproject_data = load_pyproject()
dependency_groups_raw = pyproject_data.get("dependency-groups", {})
dependency_groups = _normalize_group_names(dependency_groups_raw)
for group_names in requested_groups:
for group_name in group_names.split(","):
requirements.extend(
resolve(dependency_groups, group_name),
source=f"Dependency group {group_name}",
)
def python3dist(name, op=None, version=None, python3_pkgversion="3"):
prefix = f"python{python3_pkgversion}dist"
@ -410,23 +568,29 @@ def python3dist(name, op=None, version=None, python3_pkgversion="3"):
def generate_requires(
*, include_runtime=False, build_wheel=False, wheeldir=None, toxenv=None, extras=None, dependency_groups=None,
get_installed_version=importlib.metadata.version,  # for dep injection
generate_extras=False, python3_pkgversion="3", requirement_files=None, use_build_system=True,
read_pyproject_dependencies=False,
output, config_settings=None,
):
"""Generate the BuildRequires for the project in the current directory
The generated BuildRequires are written to the provided output.
This is the main Python entry point.
"""
requirements = Requirements(
get_installed_version, extras=extras or [],
generate_extras=generate_extras,
python3_pkgversion=python3_pkgversion,
config_settings=config_settings,
)
dependency_groups = dependency_groups or []
try:
if (include_runtime or toxenv or read_pyproject_dependencies) and not use_build_system:
raise ValueError('-N option cannot be used in combination with -r, -e, -t, -x, -p options')
if requirement_files:
for req_file in requirement_files:
requirements.extend(
@ -440,10 +604,16 @@ def generate_requires(
if toxenv:
include_runtime = True
generate_tox_requirements(toxenv, requirements)
dependency_groups.extend(tox_dependency_groups(toxenv))
if dependency_groups:
generate_dependency_groups(dependency_groups, requirements)
if include_runtime:
generate_run_requirements(backend, requirements, build_wheel=build_wheel,
read_pyproject_dependencies=read_pyproject_dependencies, wheeldir=wheeldir)
except EndPass:
return
finally:
output.write_text(os.linesep.join(requirements.output_lines) + os.linesep)
def main(argv):
@ -466,9 +636,12 @@ def main(argv):
help=argparse.SUPPRESS,
)
parser.add_argument(
'--python3_pkgversion', metavar='PYTHON3_PKGVERSION',
default="3", help=argparse.SUPPRESS,
)
parser.add_argument(
'--output', type=pathlib.Path, required=True, help=argparse.SUPPRESS,
)
parser.add_argument(
'--wheeldir', metavar='PATH', default=None,
help=argparse.SUPPRESS,
@ -478,6 +651,11 @@ def main(argv):
help='comma separated list of "extras" for runtime requirements '
'(e.g. -x testing,feature-x) (implies --runtime, can be repeated)',
)
parser.add_argument(
'-g', '--dependency-groups', metavar='GROUPS', action='append',
help='comma separated list of dependency groups (PEP 735) for requirements '
'(e.g. -g tests,docs) (can be repeated)',
)
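As a made-up illustration of what the `-g` flag consumes, a project declares PEP 735 dependency groups in its pyproject.toml like this (the group names and packages are invented for the example):

    [dependency-groups]
    tests = ["pytest>=7", "pytest-xdist"]
    docs = ["sphinx"]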
parser.add_argument( parser.add_argument(
'-t', '--tox', action='store_true', '-t', '--tox', action='store_true',
help=('generate test requirements from tox environment ' help=('generate test requirements from tox environment '
@ -493,6 +671,11 @@ def main(argv):
help=('Generate run-time requirements by building the wheel ' help=('Generate run-time requirements by building the wheel '
'(useful for build backends without the prepare_metadata_for_build_wheel hook)'), '(useful for build backends without the prepare_metadata_for_build_wheel hook)'),
) )
parser.add_argument(
'-p', '--read-pyproject-dependencies', action='store_true', default=False,
help=('Generate dependencies from [project] table of pyproject.toml '
'instead of calling prepare_metadata_for_build_wheel hook'), 'instead of calling prepare_metadata_for_build_wheel hook'),
)
parser.add_argument( parser.add_argument(
'-R', '--no-runtime', action='store_false', dest='runtime', '-R', '--no-runtime', action='store_false', dest='runtime',
help="Don't generate run-time requirements (implied by -N)", help="Don't generate run-time requirements (implied by -N)",
@ -506,6 +689,12 @@ def main(argv):
metavar='REQUIREMENTS.TXT', metavar='REQUIREMENTS.TXT',
help=('Add buildrequires from file'), help=('Add buildrequires from file'),
) )
parser.add_argument(
'-C',
dest='config_settings',
action='append',
help='Configuration settings to pass to the PEP 517 backend',
)
args = parser.parse_args(argv) args = parser.parse_args(argv)
@ -535,10 +724,14 @@ def main(argv):
wheeldir=args.wheeldir, wheeldir=args.wheeldir,
toxenv=args.toxenv, toxenv=args.toxenv,
extras=args.extras, extras=args.extras,
dependency_groups=args.dependency_groups,
generate_extras=args.generate_extras, generate_extras=args.generate_extras,
python3_pkgversion=args.python3_pkgversion, python3_pkgversion=args.python3_pkgversion,
requirement_files=args.requirement_files, requirement_files=args.requirement_files,
use_build_system=args.use_build_system, use_build_system=args.use_build_system,
read_pyproject_dependencies=args.read_pyproject_dependencies,
output=args.output,
config_settings=parse_config_settings_args(args.config_settings),
) )
except Exception: except Exception:
# Log the traceback explicitly (it's useful debug info) # Log the traceback explicitly (it's useful debug info)

File diff suppressed because it is too large

View File

@ -2,6 +2,7 @@ import argparse
import fnmatch import fnmatch
import json import json
import os import os
import re
from collections import defaultdict from collections import defaultdict
from keyword import iskeyword from keyword import iskeyword
@ -11,6 +12,15 @@ from importlib.metadata import Distribution
# From RPM's build/files.c strtokWithQuotes delim argument # From RPM's build/files.c strtokWithQuotes delim argument
RPM_FILES_DELIMETERS = ' \n\t' RPM_FILES_DELIMETERS = ' \n\t'
RPM_GLOB_SYMBOLS = '[]{}*?!'
# Combined for escape_rpm_path_4_19()
RPM_SPECIAL_SYMBOLS = RPM_FILES_DELIMETERS + RPM_GLOB_SYMBOLS + '"' + "\\"
RPM_ESCAPE_REGEX = re.compile(f"([{re.escape(RPM_SPECIAL_SYMBOLS)}])")
# See the comment in the macro that wraps this script
RPM_FILES_ESCAPE = os.getenv('RPM_FILES_ESCAPE', '4.19')
PYCACHED_SUFFIX = '{,.opt-?}.pyc'
# RPM hardcodes the lists of manpage extensions and directories, # RPM hardcodes the lists of manpage extensions and directories,
# so we have to maintain separate ones :( # so we have to maintain separate ones :(
@ -115,8 +125,9 @@ def pycached(script, python_version):
""" """
assert script.suffix == ".py" assert script.suffix == ".py"
pyver = "".join(python_version.split(".")[:2]) pyver = "".join(python_version.split(".")[:2])
pycname = f"{script.stem}.cpython-{pyver}{{,.opt-?}}.pyc" pycname = f"{script.stem}.cpython-{pyver}{PYCACHED_SUFFIX}"
pyc = pycache_dir(script) / pycname pyc = pycache_dir(script) / pycname
pyc.glob_suffix_len = len(PYCACHED_SUFFIX)
return [script, pyc] return [script, pyc]
@ -209,10 +220,12 @@ def normalize_manpage_filename(prefix, path):
if fnmatch.fnmatch(str(path.parent), mandir) and path.name != "dir": if fnmatch.fnmatch(str(path.parent), mandir) and path.name != "dir":
# "abc.1.gz2" -> "abc.1*" # "abc.1.gz2" -> "abc.1*"
if path.suffix[1:] in MANPAGE_EXTENSIONS: if path.suffix[1:] in MANPAGE_EXTENSIONS:
return BuildrootPath(path.parent / (path.stem + "*")) path = BuildrootPath(path.parent / (path.stem + "*"))
# "abc.1 -> abc.1*" # "abc.1 -> abc.1*"
else: else:
return BuildrootPath(path.parent / (path.name + "*")) path = BuildrootPath(path.parent / (path.name + "*"))
path.glob_suffix_len = 1
return path
else: else:
return path return path
@ -342,7 +355,7 @@ def classify_paths(
} }
license_files = metadata.get_all('License-File') license_files = metadata.get_all('License-File')
license_directory = distinfo / 'licenses' # See PEP 369 "Root License Directory" license_directory = distinfo / 'licenses' # See PEP 639 "Root License Directory"
# setuptools was the first known build backend to implement License-File. # setuptools was the first known build backend to implement License-File.
# Unfortunately they don't put licenses to the license directory (yet): # Unfortunately they don't put licenses to the license directory (yet):
# https://github.com/pypa/setuptools/issues/3596 # https://github.com/pypa/setuptools/issues/3596
@ -421,63 +434,139 @@ def classify_paths(
return paths return paths
def escape_rpm_path(path): def escape_rpm_path_4_19(path):
r"""
Escape special characters in string-paths or BuildrootPaths, RPM >= 4.19
E.g. a space in path otherwise makes RPM think it's multiple paths,
unless we escape it.
Or a literal % symbol in path might be expanded as a macro if not escaped by %%.
See https://github.com/rpm-software-management/rpm/pull/2103
and https://github.com/rpm-software-management/rpm/pull/2206
If the path ends with a glob produced by our other functions,
we cannot escape that part.
The BuildrootPath.glob_suffix_len attribute is used to indicate such globs.
When such a suffix exists, it is not escaped.
Examples:
>>> escape_rpm_path_4_19(BuildrootPath('/usr/lib/python3.9/site-packages/setuptools'))
'/usr/lib/python3.9/site-packages/setuptools'
>>> escape_rpm_path_4_19('/usr/lib/python3.9/site-packages/setuptools/script (dev).tmpl')
'/usr/lib/python3.9/site-packages/setuptools/script\\ (dev).tmpl'
>>> escape_rpm_path_4_19('/usr/share/data/100%valid.path')
'/usr/share/data/100%%valid.path'
>>> escape_rpm_path_4_19('/usr/share/data/100 % valid.path')
'/usr/share/data/100\\ %%\\ valid.path'
>>> escape_rpm_path_4_19('/usr/share/data/1000 %% valid.path')
'/usr/share/data/1000\\ %%%%\\ valid.path'
>>> escape_rpm_path_4_19('/usr/share/data/spaces and "quotes" and ?')
'/usr/share/data/spaces\\ and\\ \\"quotes\\"\\ and\\ \\?'
>>> escape_rpm_path_4_19('/usr/share/data/spaces and [square brackets]')
'/usr/share/data/spaces\\ and\\ \\[square\\ brackets\\]'
>>> path = BuildrootPath('/whatever/__pycache__/bar.cpython-38{,.opt-?}.pyc')
>>> path.glob_suffix_len = len('{,.opt-?}.pyc')
>>> escape_rpm_path_4_19(path)
'/whatever/__pycache__/bar.cpython-38{,.opt-?}.pyc'
>>> path = BuildrootPath('/spa ces/__pycache__/bar.cpython-38{,.opt-?}.pyc')
>>> path.glob_suffix_len = len('{,.opt-?}.pyc')
>>> escape_rpm_path_4_19(path)
'/spa\\ ces/__pycache__/bar.cpython-38{,.opt-?}.pyc'
>>> path = BuildrootPath('/usr/man/man5/ipykernel.5*')
>>> path.glob_suffix_len = 1
>>> escape_rpm_path_4_19(path)
'/usr/man/man5/ipykernel.5*'
""" """
Escape special characters in string-paths or BuildrootPaths glob_suffix_len = getattr(path, "glob_suffix_len", 0)
suffix = ""
path = str(path)
if glob_suffix_len:
suffix = path[-glob_suffix_len:]
path = path[:-glob_suffix_len]
if "%" in path:
path = path.replace("%", "%%")
# Prepend all matched/special characters (\1) with a backslash (escaped, hence \\):
return RPM_ESCAPE_REGEX.sub(r'\\\1', path) + suffix
def escape_rpm_path_4_18(path):
"""
Escape special characters in string-paths or BuildrootPaths, RPM < 4.19
E.g. a space in path otherwise makes RPM think it's multiple paths, E.g. a space in path otherwise makes RPM think it's multiple paths,
unless we put it in "quotes". unless we put it in "quotes".
Or a literal % symbol in path might be expanded as a macro if not escaped. Or a literal % symbol in path might be expanded as a macro if not escaped.
Due to limitations in RPM, Due to limitations in RPM < 4.19,
some paths with spaces and other special characters are not supported. some paths with spaces and other special characters are not supported.
See this thread http://lists.rpm.org/pipermail/rpm-list/2021-June/002048.html
Examples: Examples:
>>> escape_rpm_path(BuildrootPath('/usr/lib/python3.9/site-packages/setuptools')) >>> escape_rpm_path_4_18(BuildrootPath('/usr/lib/python3.9/site-packages/setuptools'))
'/usr/lib/python3.9/site-packages/setuptools' '/usr/lib/python3.9/site-packages/setuptools'
>>> escape_rpm_path('/usr/lib/python3.9/site-packages/setuptools/script (dev).tmpl') >>> escape_rpm_path_4_18('/usr/lib/python3.9/site-packages/setuptools/script (dev).tmpl')
'"/usr/lib/python3.9/site-packages/setuptools/script (dev).tmpl"' '"/usr/lib/python3.9/site-packages/setuptools/script (dev).tmpl"'
>>> escape_rpm_path('/usr/share/data/100%valid.path') >>> escape_rpm_path_4_18('/usr/share/data/100%valid.path')
'/usr/share/data/100%%%%%%%%valid.path' '/usr/share/data/100%%%%%%%%valid.path'
>>> escape_rpm_path('/usr/share/data/100 % valid.path') >>> escape_rpm_path_4_18('/usr/share/data/100 % valid.path')
'"/usr/share/data/100 %%%%%%%% valid.path"' '"/usr/share/data/100 %%%%%%%% valid.path"'
>>> escape_rpm_path('/usr/share/data/1000 %% valid.path') >>> escape_rpm_path_4_18('/usr/share/data/1000 %% valid.path')
'"/usr/share/data/1000 %%%%%%%%%%%%%%%% valid.path"' '"/usr/share/data/1000 %%%%%%%%%%%%%%%% valid.path"'
>>> escape_rpm_path('/usr/share/data/spaces and "quotes"') >>> escape_rpm_path_4_18('/usr/share/data/spaces and "quotes"')
Traceback (most recent call last): Traceback (most recent call last):
... ...
NotImplementedError: ... NotImplementedError: ...
>>> escape_rpm_path('/usr/share/data/spaces and [square brackets]') >>> escape_rpm_path_4_18('/usr/share/data/spaces and [square brackets]')
Traceback (most recent call last): Traceback (most recent call last):
... ...
NotImplementedError: ... NotImplementedError: ...
""" """
orig_path = path = str(path) orig_path = path = str(path)
if "%" in path: if "%" in path:
# Escaping by 8 %s has been verified in RPM 4.16 and 4.17, but probably not stable # Escaping an actual percentage sign in path by 8 signs
# See this thread http://lists.rpm.org/pipermail/rpm-list/2021-June/002048.html # has been verified in RPM 4.16 and 4.17:
# On the CI, we build tests/escape_percentages.spec to verify this assumption
path = path.replace("%", "%" * 8) path = path.replace("%", "%" * 8)
if any(symbol in path for symbol in RPM_FILES_DELIMETERS): if any(symbol in path for symbol in RPM_FILES_DELIMETERS):
if '"' in path: if '"' in path:
# As far as we know, RPM cannot list such file individually # As far as we know, RPM < 4.19 cannot list such file individually
# See this thread http://lists.rpm.org/pipermail/rpm-list/2021-June/002048.html # See this thread http://lists.rpm.org/pipermail/rpm-list/2021-June/002048.html
raise NotImplementedError(f'" symbol in path with spaces is not supported by %pyproject_save_files: {orig_path!r}') raise NotImplementedError(f'" symbol in path with spaces is not supported by %pyproject_save_files on RPM < 4.19: {orig_path!r}')
if "[" in path or "]" in path: if "[" in path or "]" in path:
# See https://bugzilla.redhat.com/show_bug.cgi?id=1990879 # See https://bugzilla.redhat.com/show_bug.cgi?id=1990879
# and https://github.com/rpm-software-management/rpm/issues/1749 # and https://github.com/rpm-software-management/rpm/issues/1749
raise NotImplementedError(f'[ or ] symbol in path with spaces is not supported by %pyproject_save_files: {orig_path!r}') raise NotImplementedError(f'[ or ] symbol in path with spaces is not supported by %pyproject_save_files on RPM < 4.19: {orig_path!r}')
return f'"{path}"' return f'"{path}"'
return path return path
if RPM_FILES_ESCAPE == "4.19":
escape_rpm_path = escape_rpm_path_4_19
elif RPM_FILES_ESCAPE == "4.18":
escape_rpm_path = escape_rpm_path_4_18
else:
raise RuntimeError("RPM_FILES_ESCAPE must be 4.18 or 4.19")
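A minimal contrast of the two strategies on the same path, taken directly from the doctests above (RPM < 4.19 quotes the whole path, RPM >= 4.19 backslash-escapes the special characters):

    >>> escape_rpm_path_4_18('/usr/share/data/script (dev).tmpl')
    '"/usr/share/data/script (dev).tmpl"'
    >>> escape_rpm_path_4_19('/usr/share/data/script (dev).tmpl')
    '/usr/share/data/script\\ (dev).tmpl'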
def generate_file_list(paths_dict, module_globs, include_others=False): def generate_file_list(paths_dict, module_globs, include_others=False):
""" """
This function takes the classified paths_dict and turns it into lines This function takes the classified paths_dict and turns it into lines
@ -673,7 +762,7 @@ def load_parsed_record(pyproject_record):
content = json.load(pyproject_record_file) content = json.load(pyproject_record_file)
if len(content) > 1: if len(content) > 1:
raise FileExistsError("%pyproject install has found more than one *.dist-info/RECORD file. " raise FileExistsError("%pyproject_install has found more than one *.dist-info/RECORD file. "
"Currently, %pyproject_save_files supports only one wheel → one file list mapping. " "Currently, %pyproject_save_files supports only one wheel → one file list mapping. "
"Feel free to open a bugzilla for pyproject-rpm-macros and describe your usecase.") "Feel free to open a bugzilla for pyproject-rpm-macros and describe your usecase.")
@ -693,12 +782,15 @@ def dist_metadata(buildroot, record_path):
return dist.metadata return dist.metadata
def pyproject_save_files_and_modules(buildroot, sitelib, sitearch, python_version, pyproject_record, prefix, varargs): def pyproject_save_files_and_modules(buildroot, sitelib, sitearch, python_version, pyproject_record, prefix, assert_license, varargs):
""" """
Takes arguments from the %{pyproject_save_files} macro Takes arguments from the %{pyproject_save_files} macro
Returns tuple: list of paths for the %files section and list of module names Returns tuple: list of paths for the %files section and list of module names
for the %check section for the %check section
Raises ValueError when assert_license is true and no License-File (PEP 639)
is found.
""" """
# On 32 bit architectures, sitelib equals to sitearch # On 32 bit architectures, sitelib equals to sitearch
# This saves us browsing one directory twice # This saves us browsing one directory twice
@ -710,11 +802,15 @@ def pyproject_save_files_and_modules(buildroot, sitelib, sitearch, python_versio
final_file_list = [] final_file_list = []
final_module_list = [] final_module_list = []
# we assume OK when not asserting
license_ok = not assert_license
for record_path, files in parsed_records.items(): for record_path, files in parsed_records.items():
metadata = dist_metadata(buildroot, record_path) metadata = dist_metadata(buildroot, record_path)
paths_dict = classify_paths( paths_dict = classify_paths(
record_path, files, metadata, sitedirs, python_version, prefix record_path, files, metadata, sitedirs, python_version, prefix
) )
license_ok = license_ok or bool(paths_dict["metadata"]["licenses"])
final_file_list.extend( final_file_list.extend(
generate_file_list(paths_dict, globs, include_auto) generate_file_list(paths_dict, globs, include_auto)
@ -723,6 +819,15 @@ def pyproject_save_files_and_modules(buildroot, sitelib, sitearch, python_versio
generate_module_list(paths_dict, globs) generate_module_list(paths_dict, globs)
) )
if not license_ok:
raise ValueError(
"No License-File (PEP 639) in upstream metadata found. "
"Adjust the upstream metadata "
"if the project's build backend supports PEP 639 "
"or use `%pyproject_save_files -L` "
"and include the %license file in %files manually."
)
return final_file_list, final_module_list return final_file_list, final_module_list
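A hypothetical spec snippet exercising the new assertion (the module glob `mymodule` is invented for the example; with `-l`, the build fails if the wheel ships no License-File metadata):

    %install
    %pyproject_install
    %pyproject_save_files -l mymodule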
@ -734,6 +839,7 @@ def main(cli_args):
cli_args.python_version, cli_args.python_version,
cli_args.pyproject_record, cli_args.pyproject_record,
cli_args.prefix, cli_args.prefix,
cli_args.assert_license,
cli_args.varargs, cli_args.varargs,
) )
@ -747,7 +853,7 @@ def argparser():
prog="%pyproject_save_files", prog="%pyproject_save_files",
add_help=False, add_help=False,
# custom usage to add +auto # custom usage to add +auto
usage="%(prog)s MODULE_GLOB [MODULE_GLOB ...] [+auto]", usage="%(prog)s [-l|-L] MODULE_GLOB [MODULE_GLOB ...] [+auto]",
) )
parser.add_argument( parser.add_argument(
'--help', action='help', '--help', action='help',
@ -763,6 +869,14 @@ def argparser():
r.add_argument("--python-version", type=str, required=True, help=argparse.SUPPRESS) r.add_argument("--python-version", type=str, required=True, help=argparse.SUPPRESS)
r.add_argument("--pyproject-record", type=PosixPath, required=True, help=argparse.SUPPRESS) r.add_argument("--pyproject-record", type=PosixPath, required=True, help=argparse.SUPPRESS)
r.add_argument("--prefix", type=PosixPath, required=True, help=argparse.SUPPRESS) r.add_argument("--prefix", type=PosixPath, required=True, help=argparse.SUPPRESS)
parser.add_argument(
"-l", "--assert-license", action="store_true", default=False,
help="Fail when no License-File (PEP 639) is found.",
)
parser.add_argument(
"-L", "--no-assert-license", action="store_false", dest="assert_license",
help="Don't fail when no License-File (PEP 639) is found (the default).",
)
parser.add_argument( parser.add_argument(
"varargs", nargs="+", metavar="MODULE_GLOB", "varargs", nargs="+", metavar="MODULE_GLOB",
help="Shell-like glob matching top-level module names to save into %%{pyproject_files}", help="Shell-like glob matching top-level module names to save into %%{pyproject_files}",

View File

@ -457,7 +457,7 @@ classified:
- /usr/lib/python3.7/site-packages/comic2pdf-3.1.0.dist-info/top_level.txt - /usr/lib/python3.7/site-packages/comic2pdf-3.1.0.dist-info/top_level.txt
- /usr/lib/python3.7/site-packages/comic2pdf-3.1.0.dist-info/zip-safe - /usr/lib/python3.7/site-packages/comic2pdf-3.1.0.dist-info/zip-safe
licenses: [] licenses: []
modules: [] modules: {}
other: other:
files: files:
- /usr/bin/comic2pdf.py - /usr/bin/comic2pdf.py

View File

@ -1,8 +1,46 @@
import argparse
import sys import sys
import subprocess import subprocess
def build_wheel(*, wheeldir, stdout=None): def parse_config_settings_args(config_settings):
"""
Given a list of `KEY=VALUE` formatted config settings,
return a dictionary that can be passed to PEP 517 hook functions.
"""
if not config_settings:
return config_settings
new_config_settings = {}
for arg in config_settings:
key, _, value = arg.partition('=')
if key in new_config_settings:
if not isinstance(new_config_settings[key], list):
# convert the existing value to a list
new_config_settings[key] = [new_config_settings[key]]
new_config_settings[key].append(value)
else:
new_config_settings[key] = value
return new_config_settings
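For example (the keys and values below are made up), repeated keys accumulate into a list while single occurrences stay plain strings:

    >>> parse_config_settings_args(['--build-option=-j4', '--build-option=-q', 'editable-mode=compat'])
    {'--build-option': ['-j4', '-q'], 'editable-mode': 'compat'}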
def get_config_settings_args(config_settings):
"""
Given a dictionary of PEP 517 backend config_settings,
yield --config-settings args that can be passed to pip's CLI
"""
if not config_settings:
return
for key, values in config_settings.items():
if not isinstance(values, list):
values = [values]
for value in values:
if value == '':
yield f'--config-settings={key}'
else:
yield f'--config-settings={key}={value}'
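Continuing the made-up example, the dictionary is turned back into pip command-line arguments; an empty value yields a bare key:

    >>> list(get_config_settings_args({'--build-option': ['-j4', '-q'], 'flag': ''}))
    ['--config-settings=--build-option=-j4', '--config-settings=--build-option=-q', '--config-settings=flag']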
def build_wheel(*, wheeldir, stdout=None, config_settings=None):
command = ( command = (
sys.executable, sys.executable,
'-m', 'pip', '-m', 'pip',
@ -15,11 +53,26 @@ def build_wheel(*, wheeldir, stdout=None):
'--no-clean', '--no-clean',
'--progress-bar', 'off', '--progress-bar', 'off',
'--verbose', '--verbose',
*get_config_settings_args(config_settings),
'.', '.',
) )
cp = subprocess.run(command, stdout=stdout) cp = subprocess.run(command, stdout=stdout)
return cp.returncode return cp.returncode
def parse_args(argv=None):
parser = argparse.ArgumentParser(prog='%pyproject_wheel')
parser.add_argument('wheeldir', help=argparse.SUPPRESS)
parser.add_argument(
'-C',
dest='config_settings',
action='append',
help='Configuration settings to pass to the PEP 517 backend',
)
args = parser.parse_args(argv)
args.config_settings = parse_config_settings_args(args.config_settings)
return args
if __name__ == '__main__': if __name__ == '__main__':
sys.exit(build_wheel(wheeldir=sys.argv[1])) sys.exit(build_wheel(**vars(parse_args())))
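A hypothetical invocation matching the new argument parser (the wheeldir path and the settings are illustrative only):

    python3 pyproject_wheel.py ./pyproject-wheeldir -C key1=value -C key2=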

View File

@ -1,17 +1,41 @@
from pathlib import Path from pathlib import Path
import importlib.metadata import importlib.metadata
import packaging.version
import pytest import pytest
import setuptools
import yaml import yaml
from pyproject_buildrequires import generate_requires from pyproject_buildrequires import generate_requires, load_pyproject
SETUPTOOLS_VERSION = packaging.version.parse(setuptools.__version__)
SETUPTOOLS_60 = SETUPTOOLS_VERSION >= packaging.version.parse('60')
try:
import tox
except ImportError:
TOX_4_22 = False
else:
TOX_VERSION = packaging.version.parse(tox.__version__)
TOX_4_22 = TOX_VERSION >= packaging.version.parse('4.22')
testcases = {} testcases = {}
with Path(__file__).parent.joinpath('pyproject_buildrequires_testcases.yaml').open() as f: with Path(__file__).parent.joinpath('pyproject_buildrequires_testcases.yaml').open() as f:
testcases = yaml.safe_load(f) testcases = yaml.safe_load(f)
@pytest.fixture(autouse=True)
def clear_pyproject_data():
"""
Clear pyproject data before each test.
In real builds we process one RPM package at a time, so the once-loaded
pyproject.toml contents can be cached for the whole run.
In the tests, that cache would leak data loaded for one test case
into all the following ones.
"""
load_pyproject.cache_clear()
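The fixture relies on `load_pyproject` being memoized. A minimal sketch of what such a cached loader could look like, assuming `functools.cache` (the real implementation lives in the suppressed pyproject_buildrequires.py diff and may differ):

    import functools
    import tomllib  # assumption: tomli would be used on Python < 3.11

    @functools.cache
    def load_pyproject():
        # load pyproject.toml from the current directory once and cache it
        with open('pyproject.toml', 'rb') as f:
            return tomllib.load(f)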
@pytest.mark.parametrize('case_name', testcases) @pytest.mark.parametrize('case_name', testcases)
def test_data(case_name, capfd, tmp_path, monkeypatch): def test_data(case_name, capfd, tmp_path, monkeypatch):
case = testcases[case_name] case = testcases[case_name]
@ -21,12 +45,16 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
monkeypatch.chdir(cwd) monkeypatch.chdir(cwd)
wheeldir = cwd.joinpath('wheeldir') wheeldir = cwd.joinpath('wheeldir')
wheeldir.mkdir() wheeldir.mkdir()
output = tmp_path.joinpath('output.txt')
if case.get('xfail'): if case.get('xfail'):
pytest.xfail(case.get('xfail')) pytest.xfail(case.get('xfail'))
if case.get('skipif') and eval(case.get('skipif')):
pytest.skip(case.get('skipif'))
for filename in case: for filename in case:
file_types = ('.toml', '.py', '.in', '.ini', '.txt') file_types = ('.toml', '.py', '.in', '.ini', '.txt', '.cfg')
if filename.endswith(file_types): if filename.endswith(file_types):
cwd.joinpath(filename).write_text(case[filename]) cwd.joinpath(filename).write_text(case[filename])
@ -43,6 +71,7 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
requirement_files = case.get('requirement_files', []) requirement_files = case.get('requirement_files', [])
requirement_files = [open(f) for f in requirement_files] requirement_files = [open(f) for f in requirement_files]
use_build_system = case.get('use_build_system', True) use_build_system = case.get('use_build_system', True)
read_pyproject_dependencies = case.get('read_pyproject_dependencies', False)
try: try:
generate_requires( generate_requires(
get_installed_version=get_installed_version, get_installed_version=get_installed_version,
@ -50,10 +79,14 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
build_wheel=case.get('build_wheel', False), build_wheel=case.get('build_wheel', False),
wheeldir=str(wheeldir), wheeldir=str(wheeldir),
extras=case.get('extras', []), extras=case.get('extras', []),
dependency_groups=case.get('dependency_groups', []),
toxenv=case.get('toxenv', None), toxenv=case.get('toxenv', None),
generate_extras=case.get('generate_extras', False), generate_extras=case.get('generate_extras', False),
requirement_files=requirement_files, requirement_files=requirement_files,
use_build_system=use_build_system, use_build_system=use_build_system,
read_pyproject_dependencies=read_pyproject_dependencies,
output=output,
config_settings=case.get('config_settings'),
) )
except SystemExit as e: except SystemExit as e:
assert e.code == case['result'] assert e.code == case['result']
@ -69,14 +102,15 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
assert 'expected' in case or 'stderr_contains' in case assert 'expected' in case or 'stderr_contains' in case
out, err = capfd.readouterr() out, err = capfd.readouterr()
dependencies = output.read_text()
if 'expected' in case: if 'expected' in case:
expected = case['expected'] expected = case['expected']
if isinstance(expected, list): if isinstance(expected, list):
# at least one of them needs to match # at least one of them needs to match
assert any(out == e for e in expected) assert dependencies in expected
else: else:
assert out == expected assert dependencies == expected
# stderr_contains may be a string or list of strings # stderr_contains may be a string or list of strings
stderr_contains = case.get('stderr_contains') stderr_contains = case.get('stderr_contains')

View File

@ -25,6 +25,21 @@ TEST_RECORDS = yaml_data["records"]
TEST_METADATAS = yaml_data["metadata"] TEST_METADATAS = yaml_data["metadata"]
# insert glob_suffix_len for .pyc files and man pages globs
for paths_dict in EXPECTED_DICT.values():
for modules in paths_dict["modules"].values():
for module in modules:
for idx, file in enumerate(module["files"]):
if file.endswith(".pyc"):
module["files"][idx] = BuildrootPath(file)
module["files"][idx].glob_suffix_len = len("{,.opt-?}.pyc")
if "other" in paths_dict and "files" in paths_dict["other"]:
for idx, file in enumerate(paths_dict["other"]["files"]):
if file.endswith("*"):
paths_dict["other"]["files"][idx] = BuildrootPath(file)
paths_dict["other"]["files"][idx].glob_suffix_len = len("*")
@pytest.fixture @pytest.fixture
def tldr_root(tmp_path): def tldr_root(tmp_path):
prepare_pyproject_record(tmp_path, package="tldr") prepare_pyproject_record(tmp_path, package="tldr")

View File

@ -1,10 +1,12 @@
Name: pyproject-rpm-macros Name: pyproject-rpm-macros
Summary: RPM macros for PEP 517 Python packages Summary: RPM macros for PEP 517 Python packages
# SPDX
License: MIT License: MIT
# Disable tests on RHEL9 as to not pull in the test dependencies %bcond tests 1
# Specify --with tests to run the tests e.g. on EPEL # pytest-xdist and tox are not desired in RHEL
%bcond_with tests %bcond pytest_xdist %{undefined rhel}
%bcond tox_tests %{undefined rhel}
# The idea is to follow the spirit of semver # The idea is to follow the spirit of semver
# Given version X.Y.Z: # Given version X.Y.Z:
@ -12,7 +14,7 @@ License: MIT
# Increment Y and reset Z when new macros or features are added # Increment Y and reset Z when new macros or features are added
# Increment Z when this is a bugfix or a cosmetic change # Increment Z when this is a bugfix or a cosmetic change
# Dropping support for EOL Fedoras is *not* considered a breaking change # Dropping support for EOL Fedoras is *not* considered a breaking change
Version: 1.6.2 Version: 1.16.2
Release: 1%{?dist} Release: 1%{?dist}
# Macro files # Macro files
@ -49,14 +51,33 @@ BuildArch: noarch
%if %{with tests} %if %{with tests}
BuildRequires: python3dist(pytest) BuildRequires: python3dist(pytest)
%if %{with pytest_xdist}
BuildRequires: python3dist(pytest-xdist) BuildRequires: python3dist(pytest-xdist)
%endif
BuildRequires: python3dist(pyyaml) BuildRequires: python3dist(pyyaml)
BuildRequires: python3dist(packaging) BuildRequires: python3dist(packaging)
BuildRequires: python3dist(pip) BuildRequires: python3dist(pip)
BuildRequires: python3dist(setuptools) BuildRequires: python3dist(setuptools)
%if %{with tox_tests}
BuildRequires: python3dist(tox-current-env) >= 0.0.6 BuildRequires: python3dist(tox-current-env) >= 0.0.6
%endif
BuildRequires: python3dist(wheel) BuildRequires: python3dist(wheel)
BuildRequires: (python3dist(toml) if python3-devel < 3.11) BuildRequires: (python3dist(tomli) if python3 < 3.11)
# RHEL 9: We also run pytest with Python 3.11 and 3.12
BuildRequires: python3.11dist(pytest)
BuildRequires: python3.11dist(pyyaml)
BuildRequires: python3.11dist(packaging)
BuildRequires: python3.11dist(pip)
BuildRequires: python3.11dist(setuptools)
BuildRequires: python3.11dist(wheel)
BuildRequires: python3.12dist(pytest)
BuildRequires: python3.12dist(pyyaml)
BuildRequires: python3.12dist(packaging)
BuildRequires: python3.12dist(pip)
BuildRequires: python3.12dist(setuptools)
BuildRequires: python3.12dist(wheel)
%endif %endif
# We build on top of those: # We build on top of those:
@ -72,6 +93,13 @@ Requires: (pyproject-srpm-macros = %{?epoch:%{epoch}:}%{version}-%{release
Requires: /usr/bin/find Requires: /usr/bin/find
Requires: /usr/bin/sed Requires: /usr/bin/sed
# This package requires the %%generate_buildrequires functionality.
# It has been introduced in RPM 4.15 (4.14.90 is the alpha of 4.15).
# What we need is rpmlib(DynamicBuildRequires), but that is impossible to (Build)Require.
# Also, we need to avoid 4.19.90..4.19.91-7 due to rhbz#2284187
Requires: ((rpm-build >= 4.14.90 with (rpm-build < 4.19.90 or rpm-build >= 4.19.91-8)) if rpm-build)
BuildRequires: rpm-build >= 4.14.90
%description %description
These macros allow projects that follow the Python packaging specifications These macros allow projects that follow the Python packaging specifications
to be packaged as RPMs. to be packaged as RPMs.
@ -90,6 +118,7 @@ which only work with setup.py.
%package -n pyproject-srpm-macros %package -n pyproject-srpm-macros
Summary: Minimal implementation of %%pyproject_buildrequires Summary: Minimal implementation of %%pyproject_buildrequires
Requires: (pyproject-rpm-macros = %{?epoch:%{epoch}:}%{version}-%{release} if pyproject-rpm-macros) Requires: (pyproject-rpm-macros = %{?epoch:%{epoch}:}%{version}-%{release} if pyproject-rpm-macros)
Requires: (rpm-build >= 4.14.90 if rpm-build)
%description -n pyproject-srpm-macros %description -n pyproject-srpm-macros
This package contains a minimal implementation of %%pyproject_buildrequires. This package contains a minimal implementation of %%pyproject_buildrequires.
@ -104,6 +133,9 @@ takes precedence.
%setup -c -T %setup -c -T
cp -p %{sources} . cp -p %{sources} .
%generate_buildrequires
# nothing to do, this is here just to assert we have that functionality
%build %build
# nothing to do, sources are not buildable # nothing to do, sources are not buildable
@ -120,10 +152,25 @@ install -pm 644 pyproject_construct_toxenv.py %{buildroot}%{_rpmconfigdir}/redha
install -pm 644 pyproject_requirements_txt.py %{buildroot}%{_rpmconfigdir}/redhat/ install -pm 644 pyproject_requirements_txt.py %{buildroot}%{_rpmconfigdir}/redhat/
install -pm 644 pyproject_wheel.py %{buildroot}%{_rpmconfigdir}/redhat/ install -pm 644 pyproject_wheel.py %{buildroot}%{_rpmconfigdir}/redhat/
%if %{with tests}
%check %check
# assert the two signatures of %%pyproject_buildrequires match exactly
signature1="$(grep '^%%pyproject_buildrequires' macros.pyproject | cut -d' ' -f1)"
signature2="$(grep '^%%pyproject_buildrequires' macros.aaa-pyproject-srpm | cut -d' ' -f1)"
test "$signature1" == "$signature2"
# but also assert we are not comparing empty strings
test "$signature1" != ""
%if %{with tests}
export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856356 export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856356
%pytest -vv --doctest-modules -n auto %pytest -vv --doctest-modules %{?with_pytest_xdist:-n auto} %{!?with_tox_tests:-k "not tox"}
# RHEL 9 only:
%global __pytest pytest-3.11
%pytest -vv --doctest-modules -k "not tox"
# RHEL 9 only:
%global __pytest pytest-3.12
%pytest -vv --doctest-modules -k "not tox"
# brp-compress is provided as an argument to get the right directory macro expansion # brp-compress is provided as an argument to get the right directory macro expansion
%{python3} compare_mandata.py -f %{_rpmconfigdir}/brp-compress %{python3} compare_mandata.py -f %{_rpmconfigdir}/brp-compress
@ -149,6 +196,91 @@ export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856
%changelog %changelog
* Wed Nov 13 2024 Miro Hrončok <mhroncok@redhat.com> - 1.16.2-1
- Fix one remaining test for setuptools 70+
* Thu Nov 07 2024 Miro Hrončok <miro@hroncok.cz> - 1.16.1-1
- Support for setuptools 70+
- wheel is no longer generated as a dependency of the default build system
* Mon Nov 04 2024 Miro Hrončok <mhroncok@redhat.com> - 1.16.0-1
- %%pyproject_buildrequires: Add support for dependency groups (PEP 735), via the -g flag
- This is implied when the tox testenvs in use depend on dependency groups (requires tox 4.22+)
- Fixes: rhbz#2318849
* Thu Oct 03 2024 Karolina Surma <ksurma@redhat.com> - 1.15.1-1
- Fix handling of self-referencing extras when reading pyproject.toml
* Tue Sep 17 2024 Python Maint <python-maint@redhat.com> - 1.15.0-1
- Add a possibility to read runtime requirements from pyproject.toml [project] table
- Fixes: rhbz#2261939
- Don't generate a dependency on pip when %%pyproject_buildrequires -N is used
- Fixes: rhbz#2294510
- Even when %%_auto_set_build_flags is disabled, set all compiler flags when building wheels
- Fixes: rhbz#2293616
* Tue Jul 23 2024 Miro Hrončok <mhroncok@redhat.com> - 1.14.0-1
- Add a provisional RPM Declarative Buildsystem (RPM 4.20+)
* Fri Jul 19 2024 Fedora Release Engineering <releng@fedoraproject.org> - 1.13.0-2
- Rebuilt for https://fedoraproject.org/wiki/Fedora_41_Mass_Rebuild
* Tue Jul 02 2024 Miro Hrončok <mhroncok@redhat.com> - 1.13.0-1
- Properly escape weird characters from paths in %%{pyproject_files} (RPM 4.19+ only)
- Revert the temporary workaround for RPM 4.20 alpha 2 leaking \x1f (unit separators)
- Fixes: rhbz#1990879
* Tue Jun 25 2024 Cristian Le <fedora@lecris.me> - 1.12.2-1
- %%pyproject_extras_subpkg: Allow passing -a or -A to %%python_extras_subpkg
* Tue Jun 04 2024 Miro Hrončok <mhroncok@redhat.com> - 1.12.1-1
- Add a temporary workaround for RPM 4.20 alpha 2 leaking \x1f (unit separators)
- Related: rhbz#2284187
* Fri Jan 26 2024 Miro Hrončok <miro@hroncok.cz> - 1.12.0-1
- Namespace pyproject-rpm-macros generated text files with %%{python3_pkgversion}
- That way, a single-spec can be used to build packages for multiple Python versions
- Fixes: rhbz#2209055
* Wed Sep 27 2023 Miro Hrončok <mhroncok@redhat.com> - 1.11.0-1
- Add the -l/-L flag to %%pyproject_save_files
- The -l flag can be used to assert at least 1 License-File was detected
- The -L flag explicitly disables this check (which remains the default)
- Prevent incorrect usage of %%pyproject_buildrequires -R with -x/-e/-t
- Fixes: rhbz#2244282
- Show a better error message when %%pyproject_install finds no wheel
- Fixes: rhbz#2242452
- Fix %%pyproject_buildrequires -w when the build backend is already installed and pip isn't
- Fixes: rhbz#2169855
* Wed Sep 13 2023 Python Maint <python-maint@redhat.com> - 1.10.0-1
- Add %%_pyproject_check_import_allow_no_modules for automated environments
- Fix handling of tox 4 provision without an explicit tox minversion
- Fixes: rhbz#2240590
* Wed May 31 2023 Maxwell G <maxwell@gtmx.me> - 1.9.0-1
- Allow passing config_settings to the build backend.
* Wed May 31 2023 Miro Hrončok <mhroncok@redhat.com> - 1.8.1-1
- On Python older than 3.11, use tomli instead of deprecated toml
- Fix literal %% handling in %%{pyproject_files} on RPM 4.19
* Tue May 23 2023 Miro Hrončok <mhroncok@redhat.com> - 1.8.0-2
- Rebuilt for ELN dependency changes
* Thu Apr 27 2023 Miro Hrončok <mhroncok@redhat.com> - 1.8.0-1
- %%pyproject_buildrequires: Add support for self-referential extras requirements
- Deprecate the provisional %%{pyproject_build_lib} macro
See https://lists.fedoraproject.org/archives/list/python-devel@lists.fedoraproject.org/thread/HMLOPAU3RZLXD4BOJHTIPKI3I4U6U7OE/
* Fri Mar 31 2023 Miro Hrončok <mhroncok@redhat.com> - 1.7.0-1
- %%pyproject_buildrequires: Redirect stdout to stderr via Shell
- Dependencies are recorded to a text file that is catted at the end
* Mon Feb 13 2023 Lumír Balhar <lbalhar@redhat.com> - 1.6.3-1
- Remove .dist-info directory at the end of %%pyproject_buildrequires
- An incomplete .dist-info directory in $PWD can confuse tests in %%check
* Wed Feb 08 2023 Lumír Balhar <lbalhar@redhat.com> - 1.6.2-1 * Wed Feb 08 2023 Lumír Balhar <lbalhar@redhat.com> - 1.6.2-1
- Improve detection of lang files - Improve detection of lang files
@ -165,7 +297,8 @@ export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856
- Use %%py3_test_envvars in %%tox when available - Use %%py3_test_envvars in %%tox when available
* Mon Sep 19 2022 Python Maint <python-maint@redhat.com> - 1.4.0-1 * Mon Sep 19 2022 Python Maint <python-maint@redhat.com> - 1.4.0-1
- %%pyproject_save_files: Support License-Files installed into the *Root License Directory* from PEP 369 - %%pyproject_save_files: Support License-Files installed into the *Root License Directory* from PEP 639
- %%pyproject_check_import: Import only the modules whose top-level names - %%pyproject_check_import: Import only the modules whose top-level names
match any of the globs provided to %%pyproject_save_files match any of the globs provided to %%pyproject_save_files