Introduce %%pyproject_check_import

%%pyproject_save_files now also saves a list of importable modules.
%%pyproject_check_import uses that list to run an import test
on each module name.
%%pyproject_check_import accepts two options:
-t: check only top-level modules
-e: exclude module names matching the given glob from the import check

Related: rhbz#1950291
Karolina Surma 2021-10-27 11:28:45 +02:00
parent 655f6dda0e
commit d427052f40
9 changed files with 1191 additions and 35 deletions


@@ -226,6 +226,53 @@ If you wish to rename, remove or otherwise change the installed files of a package
If possible, remove/rename such files in `%prep`.
If not possible, avoid using `%pyproject_save_files` or edit/replace `%{pyproject_files}`.
Performing an import check on all importable modules
----------------------------------------------------
If the upstream test suite cannot be used during the package build
and you use `%pyproject_save_files`,
you can benefit from the `%pyproject_check_import` macro.
If `%pyproject_save_files` is not used, calling `%pyproject_check_import` will fail.
When `%pyproject_save_files` is invoked,
it creates a list of all valid and public (i.e. not starting with `_`)
importable module names found in the package.
This list is then usable by `%pyproject_check_import` which performs an import check for each listed module.
When a module fails to import, the build fails.
The modules are imported from both installed and buildroot's `%{python3_sitearch}`
and `%{python3_sitelib}`, not from the current directory.
Use the macro in `%check`:
```
%check
%pyproject_check_import
```
By using the `-e` flag, you can exclude module names matching the given glob(s) from the import check
(put the glob in single quotes to prevent the shell from expanding it).
The flag can be used repeatedly.
For example, to exclude all submodules ending with `config` and all submodules starting with `test`, you can use:
```
%pyproject_check_import -e '*.config' -e '*.test*'
```
There must be at least one module left for the import check;
if, as a result of greedy excluding, no modules are left to check, the check fails.
When the `-t` flag is used, only top-level modules are checked;
qualified module names containing a dot (`.`) are excluded.
If the modules detected by `%pyproject_save_files` are `requests`, `requests.models`, and `requests.packages`, this will only perform an import of `requests`:
```
%pyproject_check_import -t
```
The modifying flags should only be used when there is a valid reason for not checking all available modules.
The reason should be documented in a comment.
The `%pyproject_check_import` macro also accepts positional arguments with
additional qualified module names to check, useful for example if some modules are installed manually.
Note that filtering by `-t`/`-e` also applies to the positional arguments.
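The filtering semantics described above can be sketched in plain Python. This is only an illustration of the documented behavior, not the macro's actual implementation (the real macro delegates to `%py3_check_import`); `filter_modules` and the module names are made up for the example:

```python
from fnmatch import fnmatchcase

def filter_modules(modules, top_level=False, excludes=()):
    """Illustrate %pyproject_check_import's -t/-e filtering (simplified)."""
    if top_level:
        # -t: keep only top-level modules (no dot in the qualified name)
        modules = [m for m in modules if "." not in m]
    # -e: drop module names matching any of the given globs
    modules = [m for m in modules
               if not any(fnmatchcase(m, glob) for glob in excludes)]
    if not modules:
        # greedy excluding must not leave nothing to check
        raise ValueError("No modules left for the import check")
    return modules

names = ["requests", "requests.models", "requests.packages"]
print(filter_modules(names, top_level=True))  # ['requests']
```

As in the macro, excluding everything is treated as an error rather than a silently empty check.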
Generating Extras subpackages
-----------------------------


@@ -11,6 +11,7 @@
%_pyproject_builddir %{_builddir}%{?buildsubdir:/%{buildsubdir}}/.pyproject-builddir
%pyproject_files %{_builddir}/pyproject-files
%_pyproject_modules %{_builddir}/pyproject-modules
%_pyproject_ghost_distinfo %{_builddir}/pyproject-ghost-distinfo
%_pyproject_record %{_builddir}/pyproject-record
@@ -70,7 +71,8 @@ fi
%pyproject_save_files() %{expand:\\\
%{__python3} %{_rpmconfigdir}/redhat/pyproject_save_files.py \\
-  --output "%{pyproject_files}" \\
+  --output-files "%{pyproject_files}" \\
+  --output-modules "%{_pyproject_modules}" \\
  --buildroot "%{buildroot}" \\
  --sitelib "%{python3_sitelib}" \\
  --sitearch "%{python3_sitearch}" \\
@@ -79,6 +81,16 @@ fi
  %{*}
}
# -t - Process only top-level modules
# -e - Exclude the module names matching given glob, may be used repeatedly
%pyproject_check_import(e:t) %{expand:\\\
if [ ! -f "%{_pyproject_modules}" ]; then
echo 'ERROR: %%%%pyproject_check_import only works when %%%%pyproject_save_files is used' >&2
exit 1
fi
%py3_check_import -f "%{_pyproject_modules}" %{?**}
}
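At its core, the check that `%py3_check_import` performs with the saved modules file boils down to importing every listed name. A minimal sketch, assuming a one-name-per-line file as `%pyproject_save_files` writes it (`check_import` is a hypothetical helper, not the macro's real code, which supports more options):

```python
import importlib
from pathlib import Path

def check_import(modules_file):
    """Import every module name listed in the file, one name per line.

    Any ImportError propagates, which is what makes the RPM build fail.
    """
    for name in Path(modules_file).read_text().split():
        importlib.import_module(name)

# Example: a modules file similar to what %pyproject_save_files would write
Path("modules.txt").write_text("json\nos.path\n")
check_import("modules.txt")  # stdlib modules import fine, no output
```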
%default_toxenv py%{python3_version_nodots}
%toxenv %{default_toxenv}


@@ -12,7 +12,7 @@ License: MIT
# In other cases, such as backports, increment the point
# release.
Version: 0
-Release: 48%{?dist}
+Release: 49%{?dist}
# Macro files
Source001: macros.pyproject
@@ -119,6 +119,11 @@ export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856
%license LICENSE
%changelog
* Tue Oct 19 2021 Karolina Surma <ksurma@redhat.com> - 0-49
- %%pyproject_save_files: Save %%_pyproject_modules file with importable module names
- Introduce %%pyproject_check_import which passes %%_pyproject_modules to %%py3_check_import
- Introduce -t, -e filtering options to %%pyproject_check_import
* Sat Oct 16 2021 Miro Hrončok <mhroncok@redhat.com> - 0-48
- %%pyproject_buildrequires: Accept installed pre-releases for all requirements


@@ -4,6 +4,7 @@ import json
import os
from collections import defaultdict
from keyword import iskeyword
from pathlib import PosixPath, PurePosixPath
from importlib.metadata import Distribution
@@ -144,12 +145,81 @@ def add_lang_to_module(paths, module_name, path):
    return True
def is_valid_module_name(s):
    """Return True if a string is considered a valid module name and False otherwise.

    String must be a valid Python name, not a Python keyword and must not
    start with underscore - we treat those as private.

    Examples:

        >>> is_valid_module_name('module_name')
        True

        >>> is_valid_module_name('12module_name')
        False

        >>> is_valid_module_name('module-name')
        False

        >>> is_valid_module_name('return')
        False

        >>> is_valid_module_name('_module_name')
        False
    """
    if (s.isidentifier() and not iskeyword(s) and not s.startswith("_")):
        return True
    return False
def module_names_from_path(path):
    """Get all importable module names from given path.

    Paths containing ".py" and ".so" files are considered importable modules,
    and so their respective directories (ie. "foo/bar/baz.py": "foo", "foo.bar",
    "foo.bar.baz").
    Paths containing invalid Python strings are discarded.

    Return set of all valid possibilities.
    """
    # Discard all files that are not valid modules
    if path.suffix not in (".py", ".so"):
        return set()

    parts = list(path.parts)

    # Modify the file names according to their suffixes
    if path.suffix == ".py":
        parts[-1] = path.stem
    elif path.suffix == ".so":
        # .so files can have two suffixes - cut both of them
        parts[-1] = PosixPath(path.stem).stem

    # '__init__' indicates a module but we don't want to import the actual file
    # It's unclear whether there can be __init__.so files in the Python packages.
    # The idea to implement this file was raised in 2008 on Python-ideas mailing list
    # (https://mail.python.org/pipermail/python-ideas/2008-October/002292.html)
    # and there are a few reports of people compiling their __init__.py to __init__.so.
    # However it's not officially documented nor forbidden,
    # so we're checking for the stem after stripping the suffix from the file.
    if parts[-1] == "__init__":
        del parts[-1]

    # For each part of the path check whether it's valid
    # If not, discard the whole path - return an empty set
    for path_part in parts:
        if not is_valid_module_name(path_part):
            return set()
    else:
        return {'.'.join(parts[:x+1]) for x in range(len(parts))}
def classify_paths(
    record_path, parsed_record_content, metadata, sitedirs, python_version
):
    """
    For each BuildrootPath in parsed_record_content classify it to a dict structure
-    that allows to filter the files for the %files section easier.
+    that allows to filter the files for the %files and %check section easier.

    For the dict structure, look at the beginning of this function's code.
@@ -165,6 +235,7 @@ def classify_paths(
        },
        "lang": {},  # %lang entries: [module_name or None][language_code] lists of .mo files
        "modules": defaultdict(list),  # each importable module (directory, .py, .so)
        "module_names": set(),  # qualified names of each importable module ("foo.bar.baz")
        "other": {"files": []},  # regular %file entries we could not parse :(
    }
@@ -190,6 +261,9 @@ def classify_paths(
    for sitedir in sitedirs:
        if sitedir in path.parents:
            # Get only the part without sitedir prefix to classify module names
            relative_path = path.relative_to(sitedir)
            paths["module_names"].update(module_names_from_path(relative_path))
            if path.parent == sitedir:
                if path.suffix == ".so":
                    # extension modules can have 2 suffixes
@@ -453,11 +527,13 @@ def dist_metadata(buildroot, record_path):
    dist = Distribution.at(real_dist_path)
    return dist.metadata

-def pyproject_save_files(buildroot, sitelib, sitearch, python_version, pyproject_record, varargs):
+def pyproject_save_files_and_modules(buildroot, sitelib, sitearch, python_version, pyproject_record, varargs):
    """
    Takes arguments from the %{pyproject_save_files} macro
-    Returns list of paths for the %files section
+    Returns tuple: list of paths for the %files section and list of module names
+    for the %check section
    """
    # On 32 bit architectures, sitelib equals to sitearch
    # This saves us browsing one directory twice
@@ -467,6 +543,7 @@ def pyproject_save_files(buildroot, sitelib, sitearch, python_version, pyproject
    parsed_records = load_parsed_record(pyproject_record)

    final_file_list = []
    all_module_names = set()

    for record_path, files in parsed_records.items():
        metadata = dist_metadata(buildroot, record_path)
@@ -477,12 +554,16 @@ def pyproject_save_files(buildroot, sitelib, sitearch, python_version, pyproject
        final_file_list.extend(
            generate_file_list(paths_dict, globs, include_auto)
        )
        all_module_names.update(paths_dict["module_names"])

-    return final_file_list
+    # Sort values, so they are always checked in the same order
+    all_module_names = sorted(all_module_names)
+    return final_file_list, all_module_names
def main(cli_args):
-    file_section = pyproject_save_files(
+    file_section, module_names = pyproject_save_files_and_modules(
        cli_args.buildroot,
        cli_args.sitelib,
        cli_args.sitearch,
@@ -491,13 +572,15 @@ def main(cli_args):
        cli_args.varargs,
    )
-    cli_args.output.write_text("\n".join(file_section) + "\n", encoding="utf-8")
+    cli_args.output_files.write_text("\n".join(file_section) + "\n", encoding="utf-8")
+    cli_args.output_modules.write_text("\n".join(module_names) + "\n", encoding="utf-8")
def argparser():
    parser = argparse.ArgumentParser()
    r = parser.add_argument_group("required arguments")
-    r.add_argument("--output", type=PosixPath, required=True)
+    r.add_argument("--output-files", type=PosixPath, required=True)
+    r.add_argument("--output-modules", type=PosixPath, required=True)
    r.add_argument("--buildroot", type=PosixPath, required=True)
    r.add_argument("--sitelib", type=BuildrootPath, required=True)
    r.add_argument("--sitearch", type=BuildrootPath, required=True)

File diff suppressed because it is too large.


@@ -8,6 +8,7 @@ from pyproject_preprocess_record import parse_record, read_record, save_parsed_r
from pyproject_save_files import argparser, generate_file_list, BuildrootPath
from pyproject_save_files import main as save_files_main
from pyproject_save_files import module_names_from_path

DIR = Path(__file__).parent
BINDIR = BuildrootPath("/usr/bin")
@@ -61,9 +62,13 @@ def prepare_pyproject_record(tmp_path, package=None, content=None):
@pytest.fixture
-def output(tmp_path):
+def output_files(tmp_path):
    return tmp_path / "pyproject_files"

@pytest.fixture
def output_modules(tmp_path):
    return tmp_path / "pyproject_modules"
def test_parse_record_tldr():
    record_path = BuildrootPath(TEST_RECORDS["tldr"]["path"])
@@ -109,15 +114,15 @@ def remove_others(expected):
@pytest.mark.parametrize("include_auto", (True, False))
-@pytest.mark.parametrize("package, glob, expected", EXPECTED_FILES)
+@pytest.mark.parametrize("package, glob, expected_files, expected_modules", EXPECTED_FILES)
-def test_generate_file_list(package, glob, expected, include_auto):
+def test_generate_file_list(package, glob, expected_files, include_auto, expected_modules):
    paths_dict = EXPECTED_DICT[package]
    modules_glob = {glob}
    if not include_auto:
-        expected = remove_others(expected)
+        expected_files = remove_others(expected_files)
    tested = generate_file_list(paths_dict, modules_glob, include_auto)
-    assert tested == expected
+    assert tested == expected_files
def test_generate_file_list_unused_glob():
@@ -130,10 +135,41 @@ def test_generate_file_list_unused_glob():
    assert "kerb" not in str(excinfo.value)
@pytest.mark.parametrize(
    "path, expected",
    [
        ("foo/bar/baz.py", {"foo", "foo.bar", "foo.bar.baz"}),
        ("foo/bar.py", {"foo", "foo.bar"}),
        ("foo.py", {"foo"}),
        ("foo/bar.so.2", set()),
        ("foo.cpython-37m-x86_64-linux-gnu.so", {"foo"}),
        ("foo/_api/v2/__init__.py", set()),
        ("foo/__init__.py", {"foo"}),
        ("foo/_priv.py", set()),
        ("foo/_bar/lib.so", set()),
        ("foo/bar/baz.so", {"foo", "foo.bar", "foo.bar.baz"}),
        ("foo/bar/baz.pth", set()),
        ("foo/bar/baz.pyc", set()),
        ("def.py", set()),
        ("foo-bar/baz.py", set()),
        ("foobar/12baz.py", set()),
        ("foo/\nbar/baz.py", set()),
        ("foo/+bar/baz.py", set()),
        ("foo/__init__.cpython-39-x86_64-linux-gnu.so", {"foo"}),
        ("foo/bar/__pycache__/abc.cpython-37.pyc", set()),
    ],
)
def test_module_names_from_path(path, expected):
    tested = Path(path)
    assert module_names_from_path(tested) == expected

-def default_options(output, mock_root, pyproject_record):
+def default_options(output_files, output_modules, mock_root, pyproject_record):
    return [
-        "--output",
-        str(output),
+        "--output-files",
+        str(output_files),
+        "--output-modules",
+        str(output_modules),
        "--buildroot",
        str(mock_root),
        "--sitelib",
@@ -148,62 +184,68 @@ def default_options(output, mock_root, pyproject_record):
@pytest.mark.parametrize("include_auto", (True, False))
-@pytest.mark.parametrize("package, glob, expected", EXPECTED_FILES)
+@pytest.mark.parametrize("package, glob, expected_files, expected_modules", EXPECTED_FILES)
-def test_cli(tmp_path, package, glob, expected, include_auto, pyproject_record):
+def test_cli(tmp_path, package, glob, expected_files, expected_modules, include_auto, pyproject_record):
    prepare_pyproject_record(tmp_path, package)
-    output = tmp_path / "files"
+    output_files = tmp_path / "files"
+    output_modules = tmp_path / "modules"
    globs = [glob, "+auto"] if include_auto else [glob]
-    cli_args = argparser().parse_args([*default_options(output, tmp_path, pyproject_record), *globs])
+    cli_args = argparser().parse_args([*default_options(output_files, output_modules, tmp_path, pyproject_record), *globs])
    save_files_main(cli_args)
    if not include_auto:
-        expected = remove_others(expected)
+        expected_files = remove_others(expected_files)
-    tested = output.read_text()
-    assert tested == "\n".join(expected) + "\n"
+    tested_files = output_files.read_text()
+    assert tested_files == "\n".join(expected_files) + "\n"
+    tested_modules = output_modules.read_text().split()
+    assert tested_modules == expected_modules
def test_cli_no_pyproject_record(tmp_path, pyproject_record):
-    output = tmp_path / "files"
-    cli_args = argparser().parse_args([*default_options(output, tmp_path, pyproject_record), "tldr*"])
+    output_files = tmp_path / "files"
+    output_modules = tmp_path / "modules"
+    cli_args = argparser().parse_args([*default_options(output_files, output_modules, tmp_path, pyproject_record), "tldr*"])
    with pytest.raises(FileNotFoundError):
        save_files_main(cli_args)
-def test_cli_too_many_RECORDS(tldr_root, output, pyproject_record):
+def test_cli_too_many_RECORDS(tldr_root, output_files, output_modules, pyproject_record):
    # Two calls to simulate how %pyproject_install process more than one RECORD file
    prepare_pyproject_record(tldr_root,
                             content=("foo/bar/dist-info/RECORD", []))
    prepare_pyproject_record(tldr_root,
                             content=("foo/baz/dist-info/RECORD", []))
-    cli_args = argparser().parse_args([*default_options(output, tldr_root, pyproject_record), "tldr*"])
+    cli_args = argparser().parse_args([*default_options(output_files, output_modules, tldr_root, pyproject_record), "tldr*"])
    with pytest.raises(FileExistsError):
        save_files_main(cli_args)
-def test_cli_bad_argument(tldr_root, output, pyproject_record):
+def test_cli_bad_argument(tldr_root, output_files, output_modules, pyproject_record):
    cli_args = argparser().parse_args(
-        [*default_options(output, tldr_root, pyproject_record), "tldr*", "+foodir"]
+        [*default_options(output_files, output_modules, tldr_root, pyproject_record), "tldr*", "+foodir"]
    )
    with pytest.raises(ValueError):
        save_files_main(cli_args)
-def test_cli_bad_option(tldr_root, output, pyproject_record):
+def test_cli_bad_option(tldr_root, output_files, output_modules, pyproject_record):
    prepare_pyproject_record(tldr_root.parent, content=("RECORD1", []))
    cli_args = argparser().parse_args(
-        [*default_options(output, tldr_root, pyproject_record), "tldr*", "you_cannot_have_this"]
+        [*default_options(output_files, output_modules, tldr_root, pyproject_record), "tldr*", "you_cannot_have_this"]
    )
    with pytest.raises(ValueError):
        save_files_main(cli_args)
-def test_cli_bad_namespace(tldr_root, output, pyproject_record):
+def test_cli_bad_namespace(tldr_root, output_files, output_modules, pyproject_record):
    cli_args = argparser().parse_args(
-        [*default_options(output, tldr_root, pyproject_record), "tldr.didntread"]
+        [*default_options(output_files, output_modules, tldr_root, pyproject_record), "tldr.didntread"]
    )
    with pytest.raises(ValueError):


@@ -16,6 +16,8 @@ BuildRequires: git-core
This package uses setuptools and pbr.
It has setup_requires and tests that %%pyproject_buildrequires correctly
handles that including runtime requirements.
Run %%pyproject_check_import with top-level modules filtering.

%package -n python3-distroinfo
Summary: %{summary}

@@ -43,6 +45,7 @@ Summary: %{summary}
%check
%pytest
%pyproject_check_import -t

%files -n python3-distroinfo -f %{pyproject_files}


@@ -15,6 +15,12 @@ BuildRequires: python3-devel
This package contains data files.
Building this tests that data files are not listed when +auto is not used
with %%pyproject_save_files.
Run %%pyproject_check_import on installed package and exclude unwanted modules
(if they're not excluded, build fails).
- We don't want to pull test dependencies just to check import
- The others fail to find `gi` and `matplotlib` which weren't declared
  in the upstream metadata

%package -n python3-ipykernel
Summary: %{summary}
@@ -26,7 +32,7 @@ Summary: %{summary}
%autosetup -p1 -n ipykernel-%{version}

%generate_buildrequires
-%pyproject_buildrequires
+%pyproject_buildrequires -r

%build
%pyproject_wheel
@@ -35,6 +41,9 @@ Summary: %{summary}
%pyproject_install
%pyproject_save_files 'ipykernel*' +auto

%check
%pyproject_check_import -e '*.test*' -e 'ipykernel.gui*' -e 'ipykernel.pylab.backend_inline'

%files -n python3-ipykernel -f %{pyproject_files}
%license COPYING.md
%doc README.md


@@ -20,6 +20,7 @@ Has a script (.py) and extension (.so) with identical name.
Building this tests:
- installing both a script and an extension with the same name
- default build backend without pyproject.toml
Check %%pyproject_check_import basic functionality.

%package -n python3-mistune
@@ -47,6 +48,8 @@ Summary: %summary
%check
%pyproject_check_import

# Internal check for our macros
# making sure that pyproject_install outputs these files so that we can test behaviour of %%pyproject_save_files
# when a package has multiple files with the same name (here script and extension)