Note that the code is completely unchanged except for the indentation
under the new `if __name__ == "__main__":` block.
Note that this change is necessary, but not sufficient to use the
RpmVersion class.
The __init__ of the RpmVersion class will fail when called from an outside
script, because the `parse_version()` function is lazily imported by the
code outside the class. However, adding the import of
`parse_version()` to the RpmVersion class is not done right now, because
while we would import it from `pkg_resources`, other scripts might want
to rely instead on the lightweight `packaging` module for the import.
Thus I'm leaving this conundrum to be addressed in the future.
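For illustration, an outside script can already work around this by
providing `parse_version` itself before constructing RpmVersion. A minimal
sketch, assuming the generator is importable as a module named
`pythondistdeps` (that name and its importability are assumptions):

    # hedged sketch -- the module name `pythondistdeps` is an assumption
    import pythondistdeps
    from packaging.version import parse   # lightweight alternative to pkg_resources

    # RpmVersion.__init__ looks up `parse_version` in the module namespace,
    # where normally only the __main__ block imports it lazily, so supply it:
    pythondistdeps.parse_version = parse
    v = pythondistdeps.RpmVersion('2.0.post1')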
--normalized-names-format FORMAT
FORMAT of normalized names can be `pep503` [default] or `legacy-dots` (dots allowed)
--normalized-names-provide-both
Provide both `pep503` and `legacy-dots` formats of normalized names (useful for a transition period)
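For context, this is roughly how such options could be declared with
argparse (a sketch mirroring the help text above, not necessarily the
generator's actual parser):

    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument('--normalized-names-format',
                        choices=['pep503', 'legacy-dots'], default='pep503',
                        help='FORMAT of normalized names')
    parser.add_argument('--normalized-names-provide-both', action='store_true',
                        help='provide both formats of normalized names')
    args = parser.parse_args(['--normalized-names-format', 'legacy-dots'])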
Notes from an attempted rewrite from pkg_resources to importlib.metadata in 2020:
1. While pkg_resources can open metadata at a specified path
(Distribution.from_location()), importlib provides access only to
"installed package metadata", i.e. the dist-info or egg-info directory
must be "discoverable", which means it has to be on sys.path.
- Thankfully only the dist/egg-info directory must exist; the
corresponding Python module does not have to be present.
- The problems this causes:
(a) You have to manipulate the sys.path to add the specific location of
the site-packages directory inside the buildroot (see the sketch after
these notes)
(b) If you have package "foo" in this newly added directory on sys.path
and its dist/egg-info metadata are for some reason not found,
importlib.metadata continues searching the sys.path and may discover a
package with the same name (possibly even the same version) outside the
buildroot.
To get around this, you can manipulate the sys.path to remove all
other "site-packages" directories. But you have to leave the
standard library there, because importlib may import other modules
(in my testing: base64, quopri, random, socket, calendar, uu)
(c) I have not tested how well it works if you're inspecting metadata of
a different Python version than the one you run the script with
(especially Python 2 vs Python 3). This might also cause problems with
dependency specifiers (e.g. python_version != "3.4")
2. Handling of dependencies (requires) is problematic in importlib.metadata
- pkg_resources provides a way to separately list the standard requires
and the requires for each "extras" category. importlib does not provide
this; it only spits out a list of strings, each string in a format like:
- 'packaging>=14',
- 'towncrier>=18.5.0; extra == "docs"', or
- 'psutil<6,>=5.6.1; (python_version != "3.4") and extra == "testing"'
you can either parse these with a regex (fragile) or use the external
`packaging` Python module (a sketch follows the equivalents table below).
`packaging`, however, also doesn't have great support for figuring out
extra dependencies; it provides the Marker API:
- <Marker(\'python_version != "3.4" and extra == "testing"\')>
you can use the Marker API to evaluate the condition, but not to parse it.
For parsing you can access the private API Marker._markers:
- marker._markers=[[(<Variable('python_version')>, <Op('!=')>, \
<Value('3.4')>)], 'and', (<Variable('extra')>, <Op('==')>, \
<Value('testing')>)]
which, beyond the problem of being private, is also not very useful for
parsing due to its structure.
- pkg_resources also provides version parsing, which importlib does not,
so `packaging` needs to be used for that as well
- importlib is part of the standard library, but packaging and its
2 runtime dependencies (pyparsing and six) are not, and therefore we
would go from 1 dependency to 3
3. A few minor issues, more in the next section about equivalents.
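A sketch of the sys.path juggling from point 1 above (the buildroot path
and the package name are illustrative assumptions, not real values):

    import sys
    import importlib.metadata

    buildroot_sitedir = '/builddir/build/BUILDROOT/foo-1.0-1.fc33.x86_64/usr/lib/python3.9/site-packages'

    # (a) make the buildroot site-packages discoverable
    sys.path.insert(0, buildroot_sitedir)
    # (b) drop every other site-packages entry so a same-named package on
    #     the host cannot be found instead; the standard library must stay,
    #     because importlib imports further modules while working
    sys.path = [p for p in sys.path
                if p == buildroot_sitedir or 'site-packages' not in p]

    dist = importlib.metadata.distribution('foo')  # PackageNotFoundError if missing
    print(dist.metadata['Name'], dist.version)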
importlib.metadata.distribution equivalents of pkg_resources.Distribution attributes:
- pkg_resources: dist.py_version
importlib: # not implemented (but can be guessed from the /usr/lib/pythonXX.YY/ path)
- pkg_resources: dist.project_name
importlib: dist.metadata['name']
- pkg_resources: dist.key
importlib: # not implemented
- pkg_resources: dist.version
importlib: dist.version
- pkg_resources: dist.requires()
importlib: dist.requires # but returns strings with almost no parsing done, and also lists extras
- pkg_resources: dist.requires(extras=dist.extras)
importlib: # not implemented, has to be parsed from dist.requires
- pkg_resources: dist.get_entry_map('console_scripts')
importlib: [ep for ep in importlib.metadata.entry_points()['console_scripts'] if ep.name == pkg][0]
# I have not found a better way to get the console_scripts
- pkg_resources: dist.get_entry_map('gui_scripts')
importlib: # Presumably same as console_scripts, but untested
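To illustrate point 2 above: the strings from dist.requires can be split
into base requirements and per-extra requirements using only public
`packaging` APIs. A sketch, not the generator's code (towncrier is just
the example package from above; any installed distribution works):

    from importlib.metadata import distribution
    from packaging.requirements import Requirement

    dist = distribution('towncrier')
    extras = dist.metadata.get_all('Provides-Extra') or []

    base, by_extra = [], {extra: [] for extra in extras}
    for req_string in dist.requires or []:
        req = Requirement(req_string)
        # the `extra` marker variable is not part of the default marker
        # environment, so it must be supplied for every evaluation
        if req.marker is None or req.marker.evaluate({'extra': ''}):
            base.append(req)  # applies even when no extra is selected
        for extra in extras:
            # note: markers that don't mention `extra` may also land here
            if req.marker is not None and req.marker.evaluate({'extra': extra}):
                by_extra[extra].append(req)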
That is, we add new provides that replace dots with dashes.
Package that used to provide python3dist(zope.component) and python3.8dist(zope.component)
now also provides python3dist(zope-component) and python3.8dist(zope-component).
Package that used to provide python3dist(a.-.-.-.a) now provides python3dist(a-a) as well.
This is consistent with pip behavior: `pip install zope-component` installs zope.component.
Historically, we have always used dist.key (safe_name) from setuptools,
but that is a non-standardized convention -- whether or not it replaces dots
with dashes is not even documented.
We say we use "canonical name" or "normalized name" everywhere, yet we didn't.
We really need to follow the standard (PEP 503):
https://www.python.org/dev/peps/pep-0503/#normalized-names
The proper function here would be packaging.utils.canonicalize_name
https://packaging.pypa.io/en/latest/utils/#packaging.utils.canonicalize_name
-- we reimplement it here to avoid an external dependency.
This is the first required step needed if we want to change our requirements later.
If we decide we don't, for whatever reason, this doesn't break anything.
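Roughly what the reimplementation amounts to (a sketch; the pep503 rule
matches packaging.utils.canonicalize_name, while the legacy-dots rule here
is an assumption modelled on setuptools' safe_name() and may differ in detail):

    import re

    def pep503_name(name):
        # runs of ".", "-" and "_" collapse into a single "-"
        return re.sub(r'[-_.]+', '-', name).lower()

    def legacy_dots_name(name):
        # older convention: dots are kept (assumption based on safe_name)
        return re.sub(r'[^A-Za-z0-9.]+', '-', name).lower()

    assert pep503_name('zope.component') == 'zope-component'
    assert pep503_name('a.-.-.-.a') == 'a-a'
    assert legacy_dots_name('zope.component') == 'zope.component'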
Puts bounded requirements into parentheses
Fixes: https://github.com/rpm-software-management/rpm/issues/995
Upstream: https://github.com/rpm-software-management/rpm/pull/996
For this input: pyparsing>=2.0.1,!=2.0.4,!=2.1.2,!=2.1.6
Instead of (invalid):
(python3.8dist(pyparsing) >= 2.0.1 with
python3.8dist(pyparsing) < 2.1.2 or python3.8dist(pyparsing) >= 2.1.2.0 with
python3.8dist(pyparsing) < 2.1.6 or python3.8dist(pyparsing) >= 2.1.6.0 with
python3.8dist(pyparsing) < 2.0.4 or python3.8dist(pyparsing) >= 2.0.4.0)
Produces (valid):
(python3.8dist(pyparsing) >= 2.0.1 with
(python3.8dist(pyparsing) < 2.1.2 or python3.8dist(pyparsing) >= 2.1.2.0) with
(python3.8dist(pyparsing) < 2.0.4 or python3.8dist(pyparsing) >= 2.0.4.0) with
(python3.8dist(pyparsing) < 2.1.6 or python3.8dist(pyparsing) >= 2.1.6.0))
For this input: babel>=1.3,!=2.0
Instead of (invalid):
(python3.8dist(babel) >= 1.3 with
python3.8dist(babel) < 2 or python3.8dist(babel) >= 2.0)
Produces (valid):
(python3.8dist(babel) >= 1.3 with
(python3.8dist(babel) < 2 or python3.8dist(babel) >= 2.0))
For this input: pbr!=2.1.0,>=2.0.0
Instead of (invalid):
(python3.8dist(pbr) >= 2 with
python3.8dist(pbr) < 2.1 or python3.8dist(pbr) >= 2.1.0)
Produces (valid):
(python3.8dist(pbr) >= 2 with
(python3.8dist(pbr) < 2.1 or python3.8dist(pbr) >= 2.1.0))
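The fix itself boils down to parenthesizing every `or` alternative (which
the `!=` specifiers expand to) before the clauses are joined with `with`.
A rough sketch of that step, not the actual pythondistdeps.py code:

    def combine_clauses(clauses):
        wrapped = []
        for clause in clauses:
            if ' or ' in clause:
                # "!=" expands to "dist < x or dist >= x.0"; without the
                # parentheses the combined rich dependency is invalid
                clause = '(%s)' % clause
            wrapped.append(clause)
        return '(%s)' % ' with '.join(wrapped)

    combine_clauses([
        'python3.8dist(babel) >= 1.3',
        'python3.8dist(babel) < 2 or python3.8dist(babel) >= 2.0',
    ])
    # returns the valid babel form shown above, as a single string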
The purelib and platlib were both defined under /usr/lib64/python on
64-bit systems. This is because:
>>> get_python_lib(standard_lib=1, plat_specific=0)
'/usr/lib64/python3.7'
>>> get_python_lib(standard_lib=1, plat_specific=1)
'/usr/lib64/python3.7'
>>> get_python_lib(standard_lib=0, plat_specific=0)
'/usr/lib/python3.7/site-packages'
>>> get_python_lib(standard_lib=0, plat_specific=1)
'/usr/lib64/python3.7/site-packages'
So now we use standard_lib=0 to get the site-packages base path, so that
the purelib path comes from /usr/lib and not /usr/lib64.
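For clarity, the mapping the fix relies on (a sketch reusing the calls
from the session above; distutils was still the tool of choice here):

    >>> from distutils.sysconfig import get_python_lib
    >>> purelib = get_python_lib(standard_lib=0, plat_specific=0)  # noarch modules
    >>> platlib = get_python_lib(standard_lib=0, plat_specific=1)  # arch-specific modules
    >>> purelib, platlib
    ('/usr/lib/python3.7/site-packages', '/usr/lib64/python3.7/site-packages')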
Fixes https://bugzilla.redhat.com/show_bug.cgi?id=1609492
Running this Python script on all possible files is way too expensive.
Some of the packages time out because of that.
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>
This package is not being kept up to date; it's hard to maintain, and we
will need to tune it from time to time, which is painful.
This also removes a whole layer of bootstrapping.
Signed-off-by: Igor Gnatenko <ignatenkobrain@fedoraproject.org>