Commit 04a916ed authored by Jason R. Coombs's avatar Jason R. Coombs

Merge branch 'main' into maintenance/flake8-exclude-.tox

parents b091db99 bf195695
[bumpversion]
current_version = 51.1.0
current_version = 51.2.0
commit = True
tag = True
......
[flake8]
max-line-length = 88
# jaraco/skeleton#34
max-complexity = 10
extend-exclude =
build
setuptools/_vendor
setuptools/_distutils
pkg_resources/_vendor
ignore =
# W503 violates spec https://github.com/PyCQA/pycodestyle/issues/513
W503
# W504 has issues https://github.com/OCA/maintainer-quality-tools/issues/545
W504
extend-ignore =
# Black creates whitespace before colon
E203
setuptools/site-patch.py F821
......
name: automerge
on:
pull_request:
types:
- labeled
- unlabeled
- synchronize
- opened
- edited
- ready_for_review
- reopened
- unlocked
pull_request_review:
types:
- submitted
check_suite:
types:
- completed
status: {}
jobs:
automerge:
runs-on: ubuntu-latest
steps:
- name: automerge
uses: "pascalgn/automerge-action@v0.12.0"
env:
GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"
name: Automated Tests
name: tests
on: [push, pull_request]
......
repos:
- repo: https://github.com/psf/black
rev: stable
rev: 20.8b1
hooks:
- id: black
- repo: https://github.com/asottile/blacken-docs
rev: v1.8.0
rev: v1.9.1
hooks:
- id: blacken-docs
v51.2.0
-------
Changes
^^^^^^^
* #2493: Use importlib.import_module() rather than the deprecated loader.load_module()
in pkg_resources namespace declaration -- by :user:`encukou`
Documentation changes
^^^^^^^^^^^^^^^^^^^^^
* #2525: Fix typo in the documentation page about entry points. -- by :user:`jtr109`
Misc
^^^^
* #2534: Avoid hitting network during test_easy_install.
v51.1.2
-------
Misc
^^^^
* #2505: Disable inclusion of package data as it causes 'tests' to be included as data.
v51.1.1
-------
Misc
^^^^
* #2534: Avoid hitting network during test_virtualenv.test_test_command.
v51.1.0
-------
......@@ -371,6 +407,7 @@ v47.2.0
Changes
^^^^^^^
* #2194: Editable-installed entry points now load significantly faster on Python versions 3.8+.
* #1471: Incidentally fixed by #2194 on Python 3.8 or when importlib_metadata is present.
v47.1.1
......
......@@ -6,9 +6,9 @@
.. _PyPI link: https://pypi.org/project/setuptools
.. image:: https://github.com/pypa/setuptools/workflows/Automated%20Tests/badge.svg
:target: https://github.com/pypa/setuptools/actions?query=workflow%3A%22Automated+Tests%22
:alt: Automated Tests
.. image:: https://github.com/pypa/setuptools/workflows/tests/badge.svg
:target: https://github.com/pypa/setuptools/actions?query=workflow%3A%22tests%22
:alt: tests
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
......
Define ``create_module()`` and ``exec_module()`` methods in ``VendorImporter``
to get rid of ``ImportWarning`` -- by :user:`hroncok`
Fixed an issue where version tags may be added multiple times
......@@ -157,7 +157,7 @@ To use this feature:
]
build-backend = "setuptools.build_meta"
* Use a :pep:`517` compatible build frontend, such as ``pip >= 19`` or ``pep517``.
* Use a :pep:`517` compatible build frontend, such as ``pip >= 19`` or ``build``.
.. warning::
......
......@@ -72,7 +72,7 @@ When your project is installed (e.g. using pip), all of the dependencies not
already installed will be located (via PyPI), downloaded, built (if necessary),
and installed and 2) Any scripts in your project will be installed with wrappers
that verify the availability of the specified dependencies at runtime.
Platform specific dependencies
------------------------------
......@@ -202,7 +202,7 @@ Optional dependencies
Setuptools allows you to declare dependencies that only get installed under
specific circumstances. These dependencies are specified with ``extras_require``
keyword and are only installed if another package depends on it (either
directly or indirectly). This makes it convenient to declare dependencies for
ancillary functions such as "tests" and "docs".
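As a minimal sketch of the syntax (the project name, extras, and dependencies below are illustrative, not taken from this diff):

.. code-block:: python

    from setuptools import setup

    setup(
        name="Project-A",  # hypothetical project
        version="1.0",
        extras_require={
            # each key names an optional feature; each value lists the
            # dependencies installed only when that extra is requested,
            # e.g. via ``pip install Project-A[PDF]``
            "PDF": ["ReportLab>=1.2", "RXP"],
            "tests": ["pytest"],
        },
    )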
.. note::
......@@ -262,8 +262,12 @@ First is the console_scripts entry point:
}
)
When the script ``rst2pdf`` is run, it will trigger the installation of
the two dependencies ``PDF`` maps to.
This syntax indicates that the entry point (in this case a console script)
is only valid when the PDF extra is installed. It is up to the installer
to determine how to handle the situation where PDF was not indicated
(e.g. omit the console script, provide a warning when attempting to load
the entry point, assume the extras are present and let the implementation
fail later).
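The example elided in the context above can be sketched as follows (a minimal sketch; the module path ``project_a.tools.pdfgen`` is assumed for illustration):

.. code-block:: python

    from setuptools import setup

    setup(
        name="Project-A",  # hypothetical project
        version="1.0",
        extras_require={"PDF": ["ReportLab>=1.2", "RXP"]},
        entry_points={
            "console_scripts": [
                # the trailing [PDF] marks this console script as valid
                # only when the PDF extra is installed
                "rst2pdf = project_a.tools.pdfgen [PDF]",
            ],
        },
    )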
The second use case is that other packages can use this "extra" for their
own dependencies. For example, if "Project-B" needs "project A" with PDF support
......
......@@ -28,7 +28,7 @@ with ``__init__.py`` as:
.. code-block:: python
def helloworld():
def hello_world():
print("Hello world")
and ``__main__.py`` providing a hook:
......
......@@ -38,6 +38,7 @@ import itertools
import inspect
import ntpath
import posixpath
import importlib
from pkgutil import get_importer
try:
......@@ -696,7 +697,8 @@ class WorkingSet:
keys2.append(dist.key)
self._added_new(dist)
def resolve(self, requirements, env=None, installer=None,
# FIXME: 'WorkingSet.resolve' is too complex (11)
def resolve(self, requirements, env=None, installer=None, # noqa: C901
replace_conflicting=False, extras=None):
"""List all distributions needed to (recursively) meet `requirements`
......@@ -1745,7 +1747,8 @@ class ZipProvider(EggProvider):
timestamp = time.mktime(date_time)
return timestamp, size
def _extract_resource(self, manager, zip_path):
# FIXME: 'ZipProvider._extract_resource' is too complex (12)
def _extract_resource(self, manager, zip_path): # noqa: C901
if zip_path in self._index():
for name in self._index()[zip_path]:
......@@ -2209,7 +2212,7 @@ def _handle_ns(packageName, path_item):
if subpath is not None:
path = module.__path__
path.append(subpath)
loader.load_module(packageName)
importlib.import_module(packageName)
_rebuild_mod_path(path, packageName, module)
return subpath
......@@ -2858,7 +2861,8 @@ class Distribution:
"""Return the EntryPoint object for `group`+`name`, or ``None``"""
return self.get_entry_map(group).get(name)
def insert_on(self, path, loc=None, replace=False):
# FIXME: 'Distribution.insert_on' is too complex (13)
def insert_on(self, path, loc=None, replace=False): # noqa: C901
"""Ensure self.location is on path
If replace=False (default):
......
......@@ -54,6 +54,12 @@ class VendorImporter:
"distribution.".format(**locals())
)
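# These two methods implement the PEP 451 loader protocol; providing
# them stops the import machinery from falling back to the legacy
# load_module() path, which emits ImportWarning.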
def create_module(self, spec):
return self.load_module(spec.name)
def exec_module(self, module):
pass
def install(self):
"""
Install this importer into sys.meta_path if not already present.
......
......@@ -12,18 +12,16 @@ skip-string-normalization = true
[tool.setuptools_scm]
# jaraco/skeleton#22
[tool.jaraco.pytest.plugins.black]
[pytest.enabler.black]
#addopts = "--black"
# jaraco/skeleton#22
[tool.jaraco.pytest.plugins.mypy]
[pytest.enabler.mypy]
#addopts = "--mypy"
[tool.jaraco.pytest.plugins.flake8]
[pytest.enabler.flake8]
addopts = "--flake8"
[tool.jaraco.pytest.plugins.cov]
[pytest.enabler.cov]
addopts = "--cov"
[tool.towncrier]
......
[metadata]
license_file = LICENSE
license_files =
LICENSE
name = setuptools
version = 51.1.0
version = 51.2.0
author = Python Packaging Authority
author_email = distutils-sig@python.org
description = Easily download, build, install, upgrade, and uninstall Python packages
......@@ -22,14 +23,20 @@ project_urls =
Documentation = https://setuptools.readthedocs.io/
[options]
packages = find:
packages = find_namespace:
py_modules = easy_install
include_package_data = true
# disabled as it causes tests to be included #2505
# include_package_data = true
python_requires = >=3.6
install_requires =
[options.packages.find]
exclude = *.tests
exclude =
build*
docs*
tests*
*.tests
tools*
[options.extras_require]
testing =
......@@ -40,8 +47,7 @@ testing =
pytest-black >= 0.3.7; python_implementation != "PyPy"
pytest-cov
pytest-mypy; python_implementation != "PyPy"
# jaraco/skeleton#22
jaraco.test >= 3.2.0
pytest-enabler
# local
mock
......@@ -57,7 +63,7 @@ docs =
# Keep these in sync with docs/requirements.txt
# upstream
sphinx
jaraco.packaging >= 6.1
jaraco.packaging >= 8.2
rst.linker >= 1.9
# local
......
......@@ -125,6 +125,56 @@ def unpack_zipfile(filename, extract_dir, progress_filter=default_filter):
os.chmod(target, unix_attributes)
def _resolve_tar_file_or_dir(tar_obj, tar_member_obj):
"""Resolve any links and extract link targets as normal files."""
while tar_member_obj is not None and (
tar_member_obj.islnk() or tar_member_obj.issym()):
linkpath = tar_member_obj.linkname
if tar_member_obj.issym():
base = posixpath.dirname(tar_member_obj.name)
linkpath = posixpath.join(base, linkpath)
linkpath = posixpath.normpath(linkpath)
tar_member_obj = tar_obj._getmember(linkpath)
is_file_or_dir = (
tar_member_obj is not None and
(tar_member_obj.isfile() or tar_member_obj.isdir())
)
if is_file_or_dir:
return tar_member_obj
raise LookupError('Got unknown file type')
def _iter_open_tar(tar_obj, extract_dir, progress_filter):
"""Emit member-destination pairs from a tar archive."""
# don't do any chowning!
tar_obj.chown = lambda *args: None
with contextlib.closing(tar_obj):
for member in tar_obj:
name = member.name
# don't extract absolute paths or ones with .. in them
if name.startswith('/') or '..' in name.split('/'):
continue
prelim_dst = os.path.join(extract_dir, *name.split('/'))
try:
member = _resolve_tar_file_or_dir(tar_obj, member)
except LookupError:
continue
final_dst = progress_filter(name, prelim_dst)
if not final_dst:
continue
if final_dst.endswith(os.sep):
final_dst = final_dst[:-1]
yield member, final_dst
def unpack_tarfile(filename, extract_dir, progress_filter=default_filter):
"""Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`
......@@ -138,38 +188,18 @@ def unpack_tarfile(filename, extract_dir, progress_filter=default_filter):
raise UnrecognizedFormat(
"%s is not a compressed or uncompressed tar file" % (filename,)
) from e
with contextlib.closing(tarobj):
# don't do any chowning!
tarobj.chown = lambda *args: None
for member in tarobj:
name = member.name
# don't extract absolute paths or ones with .. in them
if not name.startswith('/') and '..' not in name.split('/'):
prelim_dst = os.path.join(extract_dir, *name.split('/'))
# resolve any links and to extract the link targets as normal
# files
while member is not None and (
member.islnk() or member.issym()):
linkpath = member.linkname
if member.issym():
base = posixpath.dirname(member.name)
linkpath = posixpath.join(base, linkpath)
linkpath = posixpath.normpath(linkpath)
member = tarobj._getmember(linkpath)
if member is not None and (member.isfile() or member.isdir()):
final_dst = progress_filter(name, prelim_dst)
if final_dst:
if final_dst.endswith(os.sep):
final_dst = final_dst[:-1]
try:
# XXX Ugh
tarobj._extract_member(member, final_dst)
except tarfile.ExtractError:
# chown/chmod/mkfifo/mknode/makedev failed
pass
return True
for member, final_dst in _iter_open_tar(
tarobj, extract_dir, progress_filter,
):
try:
# XXX Ugh
tarobj._extract_member(member, final_dst)
except tarfile.ExtractError:
# chown/chmod/mkfifo/mknode/makedev failed
pass
return True
extraction_drivers = unpack_directory, unpack_zipfile, unpack_tarfile
......@@ -153,7 +153,7 @@ class bdist_egg(Command):
self.run_command(cmdname)
return cmd
def run(self):
def run(self): # noqa: C901 # is too complex (14) # FIXME
# Generate metadata first
self.run_command("egg_info")
# We run install_lib before install_data, because some data hacks
......
......@@ -226,7 +226,7 @@ class easy_install(Command):
print(tmpl.format(**locals()))
raise SystemExit()
def finalize_options(self):
def finalize_options(self): # noqa: C901 # is too complex (25) # FIXME
self.version and self._render_version()
py_version = sys.version.split()[0]
......@@ -437,7 +437,7 @@ class easy_install(Command):
def warn_deprecated_options(self):
pass
def check_site_dir(self):
def check_site_dir(self): # noqa: C901 # is too complex (12) # FIXME
"""Verify that self.install_dir is .pth-capable dir, if needed"""
instdir = normalize_path(self.install_dir)
......@@ -713,7 +713,10 @@ class easy_install(Command):
if getattr(self, attrname) is None:
setattr(self, attrname, scheme[key])
def process_distribution(self, requirement, dist, deps=True, *info):
# FIXME: 'easy_install.process_distribution' is too complex (12)
def process_distribution( # noqa: C901
self, requirement, dist, deps=True, *info,
):
self.update_pth(dist)
self.package_index.add(dist)
if dist in self.local_index[dist.key]:
......@@ -837,12 +840,19 @@ class easy_install(Command):
def install_eggs(self, spec, dist_filename, tmpdir):
# .egg dirs or files are already built, so just return them
if dist_filename.lower().endswith('.egg'):
return [self.install_egg(dist_filename, tmpdir)]
elif dist_filename.lower().endswith('.exe'):
return [self.install_exe(dist_filename, tmpdir)]
elif dist_filename.lower().endswith('.whl'):
return [self.install_wheel(dist_filename, tmpdir)]
installer_map = {
'.egg': self.install_egg,
'.exe': self.install_exe,
'.whl': self.install_wheel,
}
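# each supported suffix is exactly four characters long, so the last
# four characters of the filename select the appropriate installer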
try:
install_dist = installer_map[
dist_filename.lower()[-4:]
]
except KeyError:
pass
else:
return [install_dist(dist_filename, tmpdir)]
# Anything else, try to extract and build
setup_base = tmpdir
......@@ -887,7 +897,8 @@ class easy_install(Command):
metadata = EggMetadata(zipimport.zipimporter(egg_path))
return Distribution.from_filename(egg_path, metadata=metadata)
def install_egg(self, egg_path, tmpdir):
# FIXME: 'easy_install.install_egg' is too complex (11)
def install_egg(self, egg_path, tmpdir): # noqa: C901
destination = os.path.join(
self.install_dir,
os.path.basename(egg_path),
......@@ -986,7 +997,8 @@ class easy_install(Command):
# install the .egg
return self.install_egg(egg_path, tmpdir)
def exe_to_egg(self, dist_filename, egg_tmp):
# FIXME: 'easy_install.exe_to_egg' is too complex (12)
def exe_to_egg(self, dist_filename, egg_tmp): # noqa: C901
"""Extract a bdist_wininst to the directories an egg would use"""
# Check for .pth file and set up prefix translations
prefixes = get_exe_prefixes(dist_filename)
......@@ -1184,16 +1196,18 @@ class easy_install(Command):
cfg_filename = os.path.join(base, 'setup.cfg')
setopt.edit_config(cfg_filename, settings)
def update_pth(self, dist):
def update_pth(self, dist): # noqa: C901 # is too complex (11) # FIXME
if self.pth_file is None:
return
for d in self.pth_file[dist.key]: # drop old entries
if self.multi_version or d.location != dist.location:
log.info("Removing %s from easy-install.pth file", d)
self.pth_file.remove(d)
if d.location in self.shadow_path:
self.shadow_path.remove(d.location)
if not self.multi_version and d.location == dist.location:
continue
log.info("Removing %s from easy-install.pth file", d)
self.pth_file.remove(d)
if d.location in self.shadow_path:
self.shadow_path.remove(d.location)
if not self.multi_version:
if dist.location in self.pth_file.paths:
......@@ -1207,19 +1221,21 @@ class easy_install(Command):
if dist.location not in self.shadow_path:
self.shadow_path.append(dist.location)
if not self.dry_run:
if self.dry_run:
return
self.pth_file.save()
self.pth_file.save()
if dist.key == 'setuptools':
# Ensure that setuptools itself never becomes unavailable!
# XXX should this check for latest version?
filename = os.path.join(self.install_dir, 'setuptools.pth')
if os.path.islink(filename):
os.unlink(filename)
f = open(filename, 'wt')
f.write(self.pth_file.make_relative(dist.location) + '\n')
f.close()
if dist.key != 'setuptools':
return
# Ensure that setuptools itself never becomes unavailable!
# XXX should this check for latest version?
filename = os.path.join(self.install_dir, 'setuptools.pth')
if os.path.islink(filename):
os.unlink(filename)
with open(filename, 'wt') as f:
f.write(self.pth_file.make_relative(dist.location) + '\n')
def unpack_progress(self, src, dst):
# Progress filter for unpacking
......@@ -1360,58 +1376,63 @@ def get_site_dirs():
if sys.exec_prefix != sys.prefix:
prefixes.append(sys.exec_prefix)
for prefix in prefixes:
if prefix:
if sys.platform in ('os2emx', 'riscos'):
sitedirs.append(os.path.join(prefix, "Lib", "site-packages"))
elif os.sep == '/':
sitedirs.extend([
os.path.join(
prefix,
"lib",
"python{}.{}".format(*sys.version_info),
"site-packages",
),
os.path.join(prefix, "lib", "site-python"),
])
else:
sitedirs.extend([
prefix,
os.path.join(prefix, "lib", "site-packages"),
])
if sys.platform == 'darwin':
# for framework builds *only* we add the standard Apple
# locations. Currently only per-user, but /Library and
# /Network/Library could be added too
if 'Python.framework' in prefix:
home = os.environ.get('HOME')
if home:
home_sp = os.path.join(
home,
'Library',
'Python',
'{}.{}'.format(*sys.version_info),
'site-packages',
)
sitedirs.append(home_sp)
if not prefix:
continue
if sys.platform in ('os2emx', 'riscos'):
sitedirs.append(os.path.join(prefix, "Lib", "site-packages"))
elif os.sep == '/':
sitedirs.extend([
os.path.join(
prefix,
"lib",
"python{}.{}".format(*sys.version_info),
"site-packages",
),
os.path.join(prefix, "lib", "site-python"),
])
else:
sitedirs.extend([
prefix,
os.path.join(prefix, "lib", "site-packages"),
])
if sys.platform != 'darwin':
continue
# for framework builds *only* we add the standard Apple
# locations. Currently only per-user, but /Library and
# /Network/Library could be added too
if 'Python.framework' not in prefix:
continue
home = os.environ.get('HOME')
if not home:
continue
home_sp = os.path.join(
home,
'Library',
'Python',
'{}.{}'.format(*sys.version_info),
'site-packages',
)
sitedirs.append(home_sp)
lib_paths = get_path('purelib'), get_path('platlib')
for site_lib in lib_paths:
if site_lib not in sitedirs:
sitedirs.append(site_lib)
sitedirs.extend(s for s in lib_paths if s not in sitedirs)
if site.ENABLE_USER_SITE:
sitedirs.append(site.USER_SITE)
try:
with contextlib.suppress(AttributeError):
sitedirs.extend(site.getsitepackages())
except AttributeError:
pass
sitedirs = list(map(normalize_path, sitedirs))
return sitedirs
def expand_paths(inputs):
def expand_paths(inputs): # noqa: C901 # is too complex (11) # FIXME
"""Yield sys.path directories that might contain "old-style" packages"""
seen = {}
......@@ -1443,13 +1464,18 @@ def expand_paths(inputs):
# Yield existing non-dupe, non-import directory lines from it
for line in lines:
if not line.startswith("import"):
line = normalize_path(line.rstrip())
if line not in seen:
seen[line] = 1
if not os.path.isdir(line):
continue
yield line, os.listdir(line)
if line.startswith("import"):
continue
line = normalize_path(line.rstrip())
if line in seen:
continue
seen[line] = 1
if not os.path.isdir(line):
continue
yield line, os.listdir(line)
def extract_wininst_cfg(dist_filename):
......
......@@ -8,6 +8,7 @@ from distutils.util import convert_path
from distutils import log
import distutils.errors
import distutils.filelist
import functools
import os
import re
import sys
......@@ -31,7 +32,7 @@ from setuptools.extern import packaging
from setuptools import SetuptoolsDeprecationWarning
def translate_pattern(glob):
def translate_pattern(glob): # noqa: C901 # is too complex (14) # FIXME
"""
Translate a file path glob like '*.txt' into a regular expression.
This differs from fnmatch.translate which allows wildcards to match
......@@ -130,10 +131,12 @@ class InfoCommon:
egg_info may be called more than once for a distribution,
in which case the version string already contains all tags.
"""
return (
version if self.vtags and version.endswith(self.vtags)
else version + self.vtags
)
# Remove the tags if they exist. The tags may have been normalized
# (e.g. turning .dev into .dev0), so we can't just compare strings
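# e.g. parse_version('3.1.dev1') yields base_version '3.1'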
base_version = parse_version(version).base_version
# Add the tags
return base_version + self.vtags
def tags(self):
version = ''
......@@ -332,70 +335,74 @@ class FileList(_FileList):
# patterns, (dir and patterns), or (dir_pattern).
(action, patterns, dir, dir_pattern) = self._parse_template_line(line)
action_map = {
'include': self.include,
'exclude': self.exclude,
'global-include': self.global_include,
'global-exclude': self.global_exclude,
'recursive-include': functools.partial(
self.recursive_include, dir,
),
'recursive-exclude': functools.partial(
self.recursive_exclude, dir,
),
'graft': self.graft,
'prune': self.prune,
}
log_map = {
'include': "warning: no files found matching '%s'",
'exclude': (
"warning: no previously-included files found "
"matching '%s'"
),
'global-include': (
"warning: no files found matching '%s' "
"anywhere in distribution"
),
'global-exclude': (
"warning: no previously-included files matching "
"'%s' found anywhere in distribution"
),
'recursive-include': (
"warning: no files found matching '%s' "
"under directory '%s'"
),
'recursive-exclude': (
"warning: no previously-included files matching "
"'%s' found under directory '%s'"
),
'graft': "warning: no directories found matching '%s'",
'prune': "no previously-included directories found matching '%s'",
}
try:
process_action = action_map[action]
except KeyError:
raise DistutilsInternalError(
"this cannot happen: invalid action '{action!s}'".
format(action=action),
)
# OK, now we know that the action is valid and we have the
# right number of words on the line for that action -- so we
# can proceed with minimal error-checking.
if action == 'include':
self.debug_print("include " + ' '.join(patterns))
for pattern in patterns:
if not self.include(pattern):
log.warn("warning: no files found matching '%s'", pattern)
elif action == 'exclude':
self.debug_print("exclude " + ' '.join(patterns))
for pattern in patterns:
if not self.exclude(pattern):
log.warn(("warning: no previously-included files "
"found matching '%s'"), pattern)
elif action == 'global-include':
self.debug_print("global-include " + ' '.join(patterns))
for pattern in patterns:
if not self.global_include(pattern):
log.warn(("warning: no files found matching '%s' "
"anywhere in distribution"), pattern)
elif action == 'global-exclude':
self.debug_print("global-exclude " + ' '.join(patterns))
for pattern in patterns:
if not self.global_exclude(pattern):
log.warn(("warning: no previously-included files matching "
"'%s' found anywhere in distribution"),
pattern)
elif action == 'recursive-include':
self.debug_print("recursive-include %s %s" %
(dir, ' '.join(patterns)))
for pattern in patterns:
if not self.recursive_include(dir, pattern):
log.warn(("warning: no files found matching '%s' "
"under directory '%s'"),
pattern, dir)
elif action == 'recursive-exclude':
self.debug_print("recursive-exclude %s %s" %
(dir, ' '.join(patterns)))
for pattern in patterns:
if not self.recursive_exclude(dir, pattern):
log.warn(("warning: no previously-included files matching "
"'%s' found under directory '%s'"),
pattern, dir)
elif action == 'graft':
self.debug_print("graft " + dir_pattern)
if not self.graft(dir_pattern):
log.warn("warning: no directories found matching '%s'",
dir_pattern)
elif action == 'prune':
self.debug_print("prune " + dir_pattern)
if not self.prune(dir_pattern):
log.warn(("no previously-included directories found "
"matching '%s'"), dir_pattern)
else:
raise DistutilsInternalError(
"this cannot happen: invalid action '%s'" % action)
action_is_recursive = action.startswith('recursive-')
if action in {'graft', 'prune'}:
patterns = [dir_pattern]
extra_log_args = (dir, ) if action_is_recursive else ()
log_tmpl = log_map[action]
self.debug_print(
' '.join(
[action] +
([dir] if action_is_recursive else []) +
patterns,
)
)
for pattern in patterns:
if not process_action(pattern):
log.warn(log_tmpl, pattern, *extra_log_args)
def _remove_files(self, predicate):
"""
......
......@@ -119,7 +119,7 @@ def read_pkg_file(self, file):
# Based on Python 3.5 version
def write_pkg_file(self, file):
def write_pkg_file(self, file): # noqa: C901 # is too complex (14) # FIXME
"""Write the PKG-INFO format data to a file object.
"""
version = self.get_metadata_version()
......@@ -548,7 +548,8 @@ class Distribution(_Distribution):
req.marker = None
return req
def _parse_config_files(self, filenames=None):
# FIXME: 'Distribution._parse_config_files' is too complex (14)
def _parse_config_files(self, filenames=None): # noqa: C901
"""
Adapted from distutils.dist.Distribution.parse_config_files,
this method provides the same functionality in subtly-improved
......@@ -557,14 +558,12 @@ class Distribution(_Distribution):
from configparser import ConfigParser
# Ignore install directory options if we have a venv
if sys.prefix != sys.base_prefix:
ignore_options = [
'install-base', 'install-platbase', 'install-lib',
'install-platlib', 'install-purelib', 'install-headers',
'install-scripts', 'install-data', 'prefix', 'exec-prefix',
'home', 'user', 'root']
else:
ignore_options = []
ignore_options = [] if sys.prefix == sys.base_prefix else [
'install-base', 'install-platbase', 'install-lib',
'install-platlib', 'install-purelib', 'install-headers',
'install-scripts', 'install-data', 'prefix', 'exec-prefix',
'home', 'user', 'root',
]
ignore_options = frozenset(ignore_options)
......@@ -585,32 +584,37 @@ class Distribution(_Distribution):
opt_dict = self.get_option_dict(section)
for opt in options:
if opt != '__name__' and opt not in ignore_options:
val = parser.get(section, opt)
opt = opt.replace('-', '_')
opt_dict[opt] = (filename, val)
if opt == '__name__' or opt in ignore_options:
continue
val = parser.get(section, opt)
opt = opt.replace('-', '_')
opt_dict[opt] = (filename, val)
# Make the ConfigParser forget everything (so we retain
# the original filenames that options come from)
parser.__init__()
if 'global' not in self.command_options:
return
# If there was a "global" section in the config file, use it
# to set Distribution options.
if 'global' in self.command_options:
for (opt, (src, val)) in self.command_options['global'].items():
alias = self.negative_opt.get(opt)
try:
if alias:
setattr(self, alias, not strtobool(val))
elif opt in ('verbose', 'dry_run'): # ugh!
setattr(self, opt, strtobool(val))
else:
setattr(self, opt, val)
except ValueError as e:
raise DistutilsOptionError(e) from e
for (opt, (src, val)) in self.command_options['global'].items():
alias = self.negative_opt.get(opt)
if alias:
val = not strtobool(val)
elif opt in ('verbose', 'dry_run'): # ugh!
val = strtobool(val)
try:
setattr(self, alias or opt, val)
except ValueError as e:
raise DistutilsOptionError(e) from e
def _set_command_options(self, command_obj, option_dict=None):
# FIXME: 'Distribution._set_command_options' is too complex (14)
def _set_command_options(self, command_obj, option_dict=None): # noqa: C901
"""
Set the options for 'command_obj' from 'option_dict'. Basically
this means copying elements of a dictionary ('option_dict') to
......
......@@ -54,6 +54,12 @@ class VendorImporter:
"distribution.".format(**locals())
)
def create_module(self, spec):
return self.load_module(spec.name)
def exec_module(self, module):
pass
def install(self):
"""
Install this importer into sys.meta_path if not already present.
......
......@@ -47,6 +47,8 @@ def iglob(pathname, recursive=False):
def _iglob(pathname, recursive):
dirname, basename = os.path.split(pathname)
glob_in_dir = glob2 if recursive and _isrecursive(basename) else glob1
if not has_magic(pathname):
if basename:
if os.path.lexists(pathname):
......@@ -56,13 +58,9 @@ def _iglob(pathname, recursive):
if os.path.isdir(dirname):
yield pathname
return
if not dirname:
if recursive and _isrecursive(basename):
for x in glob2(dirname, basename):
yield x
else:
for x in glob1(dirname, basename):
yield x
yield from glob_in_dir(dirname, basename)
return
# `os.path.split()` returns the argument itself as a dirname if it is a
# drive or UNC path. Prevent an infinite recursion if a drive or UNC path
......@@ -71,12 +69,7 @@ def _iglob(pathname, recursive):
dirs = _iglob(dirname, recursive)
else:
dirs = [dirname]
if has_magic(basename):
if recursive and _isrecursive(basename):
glob_in_dir = glob2
else:
glob_in_dir = glob1
else:
if not has_magic(basename):
glob_in_dir = glob0
for dirname in dirs:
for name in glob_in_dir(dirname, basename):
......
......@@ -51,7 +51,7 @@ def _legacy_fetch_build_egg(dist, req):
return cmd.easy_install(req)
def fetch_build_egg(dist, req):
def fetch_build_egg(dist, req): # noqa: C901 # is too complex (16) # FIXME
"""Fetch an egg needed for building.
Use pip/wheel to fetch/build a wheel."""
......@@ -80,20 +80,17 @@ def fetch_build_egg(dist, req):
if 'allow_hosts' in opts:
raise DistutilsError('the `allow-hosts` option is not supported '
'when using pip to install requirements.')
if 'PIP_QUIET' in os.environ or 'PIP_VERBOSE' in os.environ:
quiet = False
else:
quiet = True
quiet = 'PIP_QUIET' not in os.environ and 'PIP_VERBOSE' not in os.environ
if 'PIP_INDEX_URL' in os.environ:
index_url = None
elif 'index_url' in opts:
index_url = opts['index_url'][1]
else:
index_url = None
if 'find_links' in opts:
find_links = _fixup_find_links(opts['find_links'][1])[:]
else:
find_links = []
find_links = (
_fixup_find_links(opts['find_links'][1])[:] if 'find_links' in opts
else []
)
if dist.dependency_links:
find_links.extend(dist.dependency_links)
eggs_dir = os.path.realpath(dist.get_egg_cache_dir())
......@@ -112,16 +109,12 @@ def fetch_build_egg(dist, req):
cmd.append('--quiet')
if index_url is not None:
cmd.extend(('--index-url', index_url))
if find_links is not None:
for link in find_links:
cmd.extend(('--find-links', link))
for link in find_links or []:
cmd.extend(('--find-links', link))
# If requirement is a PEP 508 direct URL, directly pass
# the URL to pip, as `req @ url` does not work on the
# command line.
if req.url:
cmd.append(req.url)
else:
cmd.append(str(req))
cmd.append(req.url or str(req))
try:
subprocess.check_call(cmd)
except subprocess.CalledProcessError as e:
......
......@@ -24,6 +24,7 @@ from io import open
from os import listdir, pathsep
from os.path import join, isfile, isdir, dirname
import sys
import contextlib
import platform
import itertools
import subprocess
......@@ -724,28 +725,23 @@ class SystemInfo:
ms = self.ri.microsoft
vckeys = (self.ri.vc, self.ri.vc_for_python, self.ri.vs)
vs_vers = []
for hkey in self.ri.HKEYS:
for key in vckeys:
try:
bkey = winreg.OpenKey(hkey, ms(key), 0, winreg.KEY_READ)
except (OSError, IOError):
continue
with bkey:
subkeys, values, _ = winreg.QueryInfoKey(bkey)
for i in range(values):
try:
ver = float(winreg.EnumValue(bkey, i)[0])
if ver not in vs_vers:
vs_vers.append(ver)
except ValueError:
pass
for i in range(subkeys):
try:
ver = float(winreg.EnumKey(bkey, i))
if ver not in vs_vers:
vs_vers.append(ver)
except ValueError:
pass
for hkey, key in itertools.product(self.ri.HKEYS, vckeys):
try:
bkey = winreg.OpenKey(hkey, ms(key), 0, winreg.KEY_READ)
except (OSError, IOError):
continue
with bkey:
subkeys, values, _ = winreg.QueryInfoKey(bkey)
for i in range(values):
with contextlib.suppress(ValueError):
ver = float(winreg.EnumValue(bkey, i)[0])
if ver not in vs_vers:
vs_vers.append(ver)
for i in range(subkeys):
with contextlib.suppress(ValueError):
ver = float(winreg.EnumKey(bkey, i))
if ver not in vs_vers:
vs_vers.append(ver)
return sorted(vs_vers)
def find_programdata_vs_vers(self):
......@@ -925,8 +921,8 @@ class SystemInfo:
"""
return self._use_last_dir_name(join(self.WindowsSdkDir, 'lib'))
@property
def WindowsSdkDir(self):
@property # noqa: C901
def WindowsSdkDir(self): # noqa: C901 # is too complex (12) # FIXME
"""
Microsoft Windows SDK directory.
......
......@@ -320,7 +320,8 @@ class PackageIndex(Environment):
else:
self.opener = urllib.request.urlopen
def process_url(self, url, retrieve=False):
# FIXME: 'PackageIndex.process_url' is too complex (14)
def process_url(self, url, retrieve=False): # noqa: C901
"""Evaluate a URL as a possible download, and maybe retrieve it"""
if url in self.scanned_urls and not retrieve:
return
......@@ -428,49 +429,53 @@ class PackageIndex(Environment):
dist.precedence = SOURCE_DIST
self.add(dist)
def _scan(self, link):
# Process a URL to see if it's for a package page
NO_MATCH_SENTINEL = None, None
if not link.startswith(self.index_url):
return NO_MATCH_SENTINEL
parts = list(map(
urllib.parse.unquote, link[len(self.index_url):].split('/')
))
if len(parts) != 2 or '#' in parts[1]:
return NO_MATCH_SENTINEL
# it's a package page, sanitize and index it
pkg = safe_name(parts[0])
ver = safe_version(parts[1])
self.package_pages.setdefault(pkg.lower(), {})[link] = True
return to_filename(pkg), to_filename(ver)
def process_index(self, url, page):
"""Process the contents of a PyPI page"""
def scan(link):
# Process a URL to see if it's for a package page
if link.startswith(self.index_url):
parts = list(map(
urllib.parse.unquote, link[len(self.index_url):].split('/')
))
if len(parts) == 2 and '#' not in parts[1]:
# it's a package page, sanitize and index it
pkg = safe_name(parts[0])
ver = safe_version(parts[1])
self.package_pages.setdefault(pkg.lower(), {})[link] = True
return to_filename(pkg), to_filename(ver)
return None, None
# process an index page into the package-page index
for match in HREF.finditer(page):
try:
scan(urllib.parse.urljoin(url, htmldecode(match.group(1))))
self._scan(urllib.parse.urljoin(url, htmldecode(match.group(1))))
except ValueError:
pass
pkg, ver = scan(url) # ensure this page is in the page index
if pkg:
# process individual package page
for new_url in find_external_links(url, page):
# Process the found URL
base, frag = egg_info_for_url(new_url)
if base.endswith('.py') and not frag:
if ver:
new_url += '#egg=%s-%s' % (pkg, ver)
else:
self.need_version_info(url)
self.scan_url(new_url)
return PYPI_MD5.sub(
lambda m: '<a href="%s#md5=%s">%s</a>' % m.group(1, 3, 2), page
)
else:
pkg, ver = self._scan(url) # ensure this page is in the page index
if not pkg:
return "" # no sense double-scanning non-package pages
# process individual package page
for new_url in find_external_links(url, page):
# Process the found URL
base, frag = egg_info_for_url(new_url)
if base.endswith('.py') and not frag:
if ver:
new_url += '#egg=%s-%s' % (pkg, ver)
else:
self.need_version_info(url)
self.scan_url(new_url)
return PYPI_MD5.sub(
lambda m: '<a href="%s#md5=%s">%s</a>' % m.group(1, 3, 2), page
)
def need_version_info(self, url):
self.scan_all(
"Page at %s links to .py file(s) without version info; an index "
......@@ -591,7 +596,7 @@ class PackageIndex(Environment):
spec = parse_requirement_arg(spec)
return getattr(self.fetch_distribution(spec, tmpdir), 'location', None)
def fetch_distribution(
def fetch_distribution( # noqa: C901 # is too complex (14) # FIXME
self, requirement, tmpdir, force_scan=False, source=False,
develop_ok=False, local_index=None):
"""Obtain a distribution suitable for fulfilling `requirement`
......@@ -762,7 +767,8 @@ class PackageIndex(Environment):
def reporthook(self, url, filename, blocknum, blksize, size):
pass # no-op
def open_url(self, url, warning=None):
# FIXME:
def open_url(self, url, warning=None): # noqa: C901 # is too complex (12)
if url.startswith('file:'):
return local_open(url)
try:
......
......@@ -56,7 +56,7 @@ if not CertificateError:
pass
if not match_hostname:
if not match_hostname: # noqa: C901 # 'If 59' is too complex (21) # FIXME
def _dnsname_match(dn, hostname, max_wildcards=1):
"""Matching according to RFC 6125, section 6.4.3
......
......@@ -38,6 +38,16 @@ from .files import build_files
from .textwrap import DALS
@pytest.fixture(autouse=True)
def pip_disable_index(monkeypatch):
"""
Important: Disable the default index for pip to avoid
querying packages in the index and potentially resolving
and installing packages there.
"""
monkeypatch.setenv('PIP_NO_INDEX', 'true')
class FakeDist:
def get_entry_map(self, group):
if group != 'console_scripts':
......@@ -445,6 +455,7 @@ class TestSetupRequires:
"""
monkeypatch.setenv(str('PIP_RETRIES'), str('0'))
monkeypatch.setenv(str('PIP_TIMEOUT'), str('0'))
monkeypatch.setenv('PIP_NO_INDEX', 'false')
with contexts.quiet():
# create an sdist that has a build-time dependency.
with TestSetupRequires.create_sdist() as dist_file:
......@@ -618,6 +629,7 @@ class TestSetupRequires:
def test_setup_requires_honors_pip_env(self, mock_index, monkeypatch):
monkeypatch.setenv(str('PIP_RETRIES'), str('0'))
monkeypatch.setenv(str('PIP_TIMEOUT'), str('0'))
monkeypatch.setenv('PIP_NO_INDEX', 'false')
monkeypatch.setenv(str('PIP_INDEX_URL'), mock_index.url)
with contexts.save_pkg_resources_state():
with contexts.tempdir() as temp_dir:
......
......@@ -6,6 +6,7 @@ import re
import stat
import time
from setuptools.build_meta import prepare_metadata_for_build_wheel
from setuptools.command.egg_info import (
egg_info, manifest_maker, EggInfoDeprecationWarning, get_pkg_info_revision,
)
......@@ -19,6 +20,26 @@ from .textwrap import DALS
from . import contexts
def _run_egg_info_command(tmpdir_cwd, env, cmd=None, output=None):
environ = os.environ.copy()
environ.update(HOME=env.paths['home'])
if cmd is None:
cmd = [
'egg_info',
]
code, data = environment.run_setup_py(
cmd=cmd,
pypath=os.pathsep.join([env.paths['lib'], str(tmpdir_cwd)]),
data_stream=1,
env=environ,
)
assert not code, data
if output:
assert output in data
class Environment(str):
pass
......@@ -132,7 +153,7 @@ class TestEggInfo:
def test_expected_files_produced(self, tmpdir_cwd, env):
self._create_project()
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
actual = os.listdir('foo.egg-info')
expected = [
......@@ -166,7 +187,7 @@ class TestEggInfo:
# currently configured to use a subprocess, the actual traceback
# object is lost and we need to parse it from stderr
with pytest.raises(AssertionError) as exc:
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
# Hopefully this is not too fragile: the only argument to the
# assertion error should be a traceback, ending with:
......@@ -180,13 +201,13 @@ class TestEggInfo:
"""Ensure timestamps are updated when the command is re-run."""
self._create_project()
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
timestamp_a = os.path.getmtime('foo.egg-info')
# arbitrary sleep just to handle *really* fast systems
time.sleep(.001)
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
timestamp_b = os.path.getmtime('foo.egg-info')
assert timestamp_a != timestamp_b
......@@ -201,7 +222,7 @@ class TestEggInfo:
'usage.rst': "Run 'hi'",
}
})
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
egg_info_dir = os.path.join('.', 'foo.egg-info')
sources_txt = os.path.join(egg_info_dir, 'SOURCES.txt')
with open(sources_txt) as f:
......@@ -441,7 +462,7 @@ class TestEggInfo:
self, tmpdir_cwd, env, requires, use_setup_cfg,
expected_requires, install_cmd_kwargs):
self._setup_script_with_requires(requires, use_setup_cfg)
self._run_egg_info_command(tmpdir_cwd, env, **install_cmd_kwargs)
_run_egg_info_command(tmpdir_cwd, env, **install_cmd_kwargs)
egg_info_dir = os.path.join('.', 'foo.egg-info')
requires_txt = os.path.join(egg_info_dir, 'requires.txt')
if os.path.exists(requires_txt):
......@@ -461,14 +482,14 @@ class TestEggInfo:
req = 'install_requires={"fake-factory==0.5.2", "pytz"}'
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
def test_extras_require_with_invalid_marker(self, tmpdir_cwd, env):
tmpl = 'extras_require={{":{marker}": ["barbazquux"]}},'
req = tmpl.format(marker=self.invalid_marker)
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == []
def test_extras_require_with_invalid_marker_in_req(self, tmpdir_cwd, env):
......@@ -476,7 +497,7 @@ class TestEggInfo:
req = tmpl.format(marker=self.invalid_marker)
self._setup_script_with_requires(req)
with pytest.raises(AssertionError):
self._run_egg_info_command(tmpdir_cwd, env)
_run_egg_info_command(tmpdir_cwd, env)
assert glob.glob(os.path.join(env.paths['lib'], 'barbazquux*')) == []
def test_provides_extra(self, tmpdir_cwd, env):
......@@ -865,26 +886,22 @@ class TestEggInfo:
sources = f.read().split('\n')
assert 'setup.py' in sources
def _run_egg_info_command(self, tmpdir_cwd, env, cmd=None, output=None):
environ = os.environ.copy().update(
HOME=env.paths['home'],
)
if cmd is None:
cmd = [
'egg_info',
]
code, data = environment.run_setup_py(
cmd=cmd,
pypath=os.pathsep.join([env.paths['lib'], str(tmpdir_cwd)]),
data_stream=1,
env=environ,
)
assert not code, data
if output:
assert output in data
def test_egg_info_tag_only_once(self, tmpdir_cwd, env):
@pytest.mark.parametrize(
('make_metadata_path', 'run_command'),
[
(
lambda env: os.path.join('.', 'foo.egg-info', 'PKG-INFO'),
lambda tmpdir_cwd, env: _run_egg_info_command(tmpdir_cwd, env)
),
(
lambda env: os.path.join(env, 'foo.dist-info', 'METADATA'),
lambda tmpdir_cwd, env: prepare_metadata_for_build_wheel(env)
)
]
)
def test_egg_info_tag_only_once(
self, tmpdir_cwd, env, make_metadata_path, run_command
):
self._create_project()
build_files({
'setup.cfg': DALS("""
......@@ -894,11 +911,10 @@ class TestEggInfo:
tag_svn_revision = 0
"""),
})
self._run_egg_info_command(tmpdir_cwd, env)
egg_info_dir = os.path.join('.', 'foo.egg-info')
with open(os.path.join(egg_info_dir, 'PKG-INFO')) as pkginfo_file:
pkg_info_lines = pkginfo_file.read().split('\n')
assert 'Version: 0.0.0.dev0' in pkg_info_lines
run_command(tmpdir_cwd, env)
with open(make_metadata_path(env)) as metadata_file:
metadata_lines = metadata_file.read().split('\n')
assert 'Version: 0.0.0.dev0' in metadata_lines
def test_get_pkg_info_revision_deprecated(self):
pytest.warns(EggInfoDeprecationWarning, get_pkg_info_revision)
......@@ -179,6 +179,10 @@ def test_test_command_install_requirements(virtualenv, tmpdir):
# Ensure pip/wheel packages are installed.
virtualenv.run(
"python -c \"__import__('pkg_resources').require(['pip', 'wheel'])\"")
# uninstall setuptools so that 'setup.py develop' works
virtualenv.run("python -m pip uninstall -y setuptools")
# disable index URL so bits and bobs aren't requested from PyPI
virtualenv.env['PIP_NO_INDEX'] = '1'
_check_test_command_install_requirements(virtualenv, tmpdir)
......
......@@ -46,6 +46,26 @@ For example, here's a session of the [path project](https://pypi.org/project/pat
Thereafter, the target project can make whatever customizations it deems relevant to the scaffolding. The project may even at some point decide that the divergence is too great to merit renewed merging with the original skeleton. This approach applies maximal guidance while creating minimal constraints.
## Periodic Collapse
In late 2020, this project [introduced](https://github.com/jaraco/skeleton/issues/27) the idea of a periodic but infrequent (O(years)) collapse of commits to limit the number of commits a new consumer will need to accept to adopt the skeleton.
The full history of commits is collapsed into a single commit and that commit becomes the new mainline head.
When one of these collapse operations happens, any project that previously pulled from the skeleton will no longer have a related history with that new main branch. For those projects, the skeleton provides a "handoff" branch that reconciles the two branches. Any project that has previously merged with the skeleton but now gets an error "fatal: refusing to merge unrelated histories" should instead use the handoff branch once to incorporate the new main branch.
```
$ git pull https://github.com/jaraco/skeleton 2020-handoff
```
This handoff needs to be pulled just once and thereafter the project can pull from the main head.
The archive and handoff branches from prior collapses are indicated here:
| refresh | archive | handoff |
|---------|-----------------|--------------|
| 2020-12 | archive/2020-12 | 2020-handoff |
# Features
The features/techniques employed by the skeleton include:
......@@ -118,6 +138,8 @@ Features include:
- test against multiple Python versions
- run on late (and updated) platform versions
- automated releases of tagged commits
- [automatic merging of PRs](https://github.com/marketplace/actions/merge-pull-requests) (requires [protecting branches with required status checks](https://docs.github.com/en/free-pro-team@latest/github/administering-a-repository/enabling-required-status-checks), [not possible through API](https://github.community/t/set-all-status-checks-to-be-required-as-branch-protection-using-the-github-api/119493))
### Continuous Deployments
......
......@@ -54,7 +54,7 @@ commands =
[testenv:release]
skip_install = True
deps =
pep517>=0.5
build
twine[keyring]>=1.13
path
jaraco.develop>=7.1
......@@ -70,7 +70,7 @@ commands =
python -c "import path; path.Path('dist').rmtree_p()"
# unset tag_build and tag_date pypa/setuptools#2500
python setup.py egg_info -Db "" saveopts
python -m pep517.build .
python -m build
python -m twine upload dist/*
python -m jaraco.develop.create-github-release
python -m jaraco.tidelift.publish-release-notes
......