
Commit 1405a40

Merge branch 'master' into add-use-user-site-to-preparer
2 parents 11b3fc2 + 8453fa5 commit 1405a40

21 files changed: +409, -270 lines

.github/no-response.yml

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 # Number of days of inactivity before issue is closed for lack of response
-daysUntilClose: 30
+daysUntilClose: 15
 # Label requiring a response
 responseRequiredLabel: "S: awaiting response"
 # Comment to post when closing an Issue for lack of response. Set to `false` to disable

docs/html/development/architecture/package-finding.rst

Lines changed: 22 additions & 17 deletions
@@ -15,17 +15,16 @@ Overview
 Here is a rough description of the process that pip uses to choose what
 file to download for a package, given a requirement:

-1. Access the various network and file system locations configured for pip
-   that contain package files. These locations can include, for example,
-   pip's :ref:`--index-url <--index-url>` (with default
-   https://pypi.org/simple/ ) and any configured
-   :ref:`--extra-index-url <--extra-index-url>` locations.
-   Each of these locations is a `PEP 503`_ "simple repository" page, which
-   is an HTML page of anchor links.
-2. Collect together all of the links (e.g. by parsing the anchor links
-   from the HTML pages) and create ``Link`` objects from each of these.
-   The :ref:`LinkCollector <link-collector-class>` class is responsible
-   for both this step and the previous.
+1. Collect together the various network and file system locations containing
+   project package files. These locations are derived, for example, from pip's
+   :ref:`--index-url <--index-url>` (with default https://pypi.org/simple/ )
+   setting and any configured :ref:`--extra-index-url <--extra-index-url>`
+   locations. Each of the project page URL's is an HTML page of anchor links,
+   as defined in `PEP 503`_, the "Simple Repository API."
+2. For each project page URL, fetch the HTML and parse out the anchor links,
+   creating a ``Link`` object from each one. The :ref:`LinkCollector
+   <link-collector-class>` class is responsible for both the previous step
+   and fetching the HTML over the network.
 3. Determine which of the links are minimally relevant, using the
    :ref:`LinkEvaluator <link-evaluator-class>` class. Create an
    ``InstallationCandidate`` object (aka candidate for install) for each
@@ -111,6 +110,12 @@ One of ``PackageFinder``'s main top-level methods is
 class's ``compute_best_candidate()`` method on the return value of
 ``find_all_candidates()``. This corresponds to steps 4-5 of the Overview.

+``PackageFinder`` also has a ``process_project_url()`` method (called by
+``find_best_candidate()``) to process a `PEP 503`_ "simple repository"
+project page. This method fetches and parses the HTML from a PEP 503 project
+page URL, extracts the anchor elements and creates ``Link`` objects from
+them, and then evaluates those links.
+

 .. _link-collector-class:

@@ -119,12 +124,8 @@ The ``LinkCollector`` class

 The :ref:`LinkCollector <link-collector-class>` class is the class
 responsible for collecting the raw list of "links" to package files
-(represented as ``Link`` objects). An instance of the class accesses the
-various `PEP 503`_ HTML "simple repository" pages, parses their HTML,
-extracts the links from the anchor elements, and creates ``Link`` objects
-from that information. The ``LinkCollector`` class is "unintelligent" in that
-it doesn't do any evaluation of whether the links are relevant to the
-original requirement; it just collects them.
+(represented as ``Link`` objects) from file system locations, as well as the
+`PEP 503`_ project page URL's that ``PackageFinder`` should access.

 The ``LinkCollector`` class takes into account the user's :ref:`--find-links
 <--find-links>`, :ref:`--extra-index-url <--extra-index-url>`, and related
@@ -133,6 +134,10 @@ method is the ``collect_links()`` method. The :ref:`PackageFinder
 <package-finder-class>` class invokes this method as the first step of its
 ``find_all_candidates()`` method.

+``LinkCollector`` also has a ``fetch_page()`` method to fetch the HTML from a
+project page URL. This method is "unintelligent" in that it doesn't parse the
+HTML.
+
 The ``LinkCollector`` class is the only class in the ``index`` sub-package that
 makes network requests and is the only class in the sub-package that depends
 directly on ``PipSession``, which stores pip's configuration options and
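
To make the documented flow concrete, the sketch below drives it end to end from Python. It is only a rough illustration based on pip's internal APIs from around the time of this commit: the module paths, constructor signatures and the ``SearchScope``/``SelectionPreferences`` arguments are assumptions on my part, not something this diff establishes, and pip's internals can change without notice.

    # Rough sketch only: internal APIs; paths and signatures are assumptions.
    from pip._internal.index.collector import LinkCollector
    from pip._internal.index.package_finder import PackageFinder
    from pip._internal.models.search_scope import SearchScope
    from pip._internal.models.selection_prefs import SelectionPreferences
    from pip._internal.network.session import PipSession

    session = PipSession()
    search_scope = SearchScope.create(
        find_links=[], index_urls=["https://pypi.org/simple"],
    )
    # Step 1: the LinkCollector knows every location that should be searched.
    link_collector = LinkCollector(session=session, search_scope=search_scope)

    # Steps 2-5: PackageFinder fetches each project page (process_project_url),
    # evaluates the parsed links and picks the best candidate.
    finder = PackageFinder.create(
        link_collector=link_collector,
        selection_prefs=SelectionPreferences(allow_yanked=True),
    )
    print(finder.find_best_candidate("pip").best_candidate)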

docs/html/development/release-process.rst

Lines changed: 2 additions & 2 deletions
@@ -80,13 +80,13 @@ Creating a new release
 ----------------------

 #. Checkout the current pip ``master`` branch.
-#. Ensure you have the latest ``nox`` and ``twine`` installed.
+#. Ensure you have the latest ``nox`` installed.
 #. Prepare for release using ``nox -s prepare-release -- YY.N``.
    This will update the relevant files and tag the correct commit.
 #. Build the release artifacts using ``nox -s build-release -- YY.N``.
    This will checkout the tag, generate the distribution files to be
    uploaded and checkout the master branch again.
-#. Upload the distribution files to PyPI using ``twine upload dist/*``.
+#. Upload the release to PyPI using ``nox -s upload-release -- YY.N``.
 #. Push all of the changes including the tag.
 #. Regenerate the ``get-pip.py`` script in the `get-pip repository`_ (as
    documented there) and commit the results.

docs/html/user_guide.rst

Lines changed: 3 additions & 2 deletions
@@ -394,8 +394,8 @@ set like this:
 ignore-installed = true
 no-dependencies = yes

-To enable the boolean options ``--no-compile`` and ``--no-cache-dir``, falsy
-values have to be used:
+To enable the boolean options ``--no-compile``, ``--no-warn-script-location``
+and ``--no-cache-dir``, falsy values have to be used:

 .. code-block:: ini

@@ -404,6 +404,7 @@ values have to be used:

 [install]
 no-compile = no
+no-warn-script-location = false

 Appending options like ``--find-links`` can be written on multiple lines:
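
As an aside, "falsy values" here are the spellings an ini-style boolean parser treats as false. The snippet below shows which spellings Python's standard ``configparser`` accepts; it is purely an illustration and not necessarily how pip itself parses its configuration files:

    import configparser

    parser = configparser.ConfigParser()
    parser["install"] = {
        "no-compile": "no",
        "no-warn-script-location": "false",
    }

    for option in ("no-compile", "no-warn-script-location"):
        # getboolean() accepts yes/no, true/false, on/off and 1/0 (case-insensitive).
        print(option, parser.getboolean("install", option))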

news/31044E84-3F3C-48A8-84B2-6028E21FEBF1.trivial

Whitespace-only changes.

news/6410.bugfix

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+Enforce PEP 508 requirement format in ``pyproject.toml``
+``build-system.requires``.

noxfile.py

Lines changed: 34 additions & 1 deletion
@@ -189,7 +189,7 @@ def prepare_release(session):
 def build_release(session):
     version = release.get_version_from_arguments(session.posargs)
     if not version:
-        session.error("Usage: nox -s upload-release -- YY.N[.P]")
+        session.error("Usage: nox -s build-release -- YY.N[.P]")

     session.log("# Ensure no files in dist/")
     if release.have_files_in_folder("dist"):
@@ -209,3 +209,36 @@ def build_release(session):

     session.log("# Checkout the master branch")
     session.run("git", "checkout", "master", external=True, silent=True)
+
+
+@nox.session(name="upload-release")
+def upload_release(session):
+    version = release.get_version_from_arguments(session.posargs)
+    if not version:
+        session.error("Usage: nox -s upload-release -- YY.N[.P]")
+
+    session.log("# Install dependencies")
+    session.install("twine")
+
+    distribution_files = glob.glob("dist/*")
+    session.log(f"# Distribution files: {distribution_files}")
+
+    # Sanity check: Make sure there's 2 distribution files.
+    count = len(distribution_files)
+    if count != 2:
+        session.error(
+            f"Expected 2 distribution files for upload, got {count}. "
+            f"Remove dist/ and run 'nox -s build-release -- {version}'"
+        )
+    # Sanity check: Make sure the files are correctly named.
+    expected_distribution_files = [
+        f"pip-{version}-py2.py3-none-any.whl",
+        f"pip-{version}.tar.gz",
+    ]
+    if sorted(distribution_files) != sorted(expected_distribution_files):
+        session.error(
+            f"Distribution files do not seem to be for {version} release."
+        )
+
+    session.log("# Upload distributions")
+    session.run("twine", "upload", *distribution_files)

src/pip/_internal/commands/download.py

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@
 from pip._internal.cli.cmdoptions import make_target_python
 from pip._internal.cli.req_command import RequirementCommand
 from pip._internal.req import RequirementSet
-from pip._internal.req.req_tracker import RequirementTracker
+from pip._internal.req.req_tracker import get_requirement_tracker
 from pip._internal.utils.filesystem import check_path_owner
 from pip._internal.utils.misc import ensure_dir, normalize_path, write_output
 from pip._internal.utils.temp_dir import TempDirectory
@@ -111,7 +111,7 @@ def run(self, options, args):
             )
             options.cache_dir = None

-        with RequirementTracker() as req_tracker, TempDirectory(
+        with get_requirement_tracker() as req_tracker, TempDirectory(
             options.build_dir, delete=build_delete, kind="download"
         ) as directory:


src/pip/_internal/commands/install.py

Lines changed: 2 additions & 2 deletions
@@ -31,7 +31,7 @@
 from pip._internal.locations import distutils_scheme
 from pip._internal.operations.check import check_install_conflicts
 from pip._internal.req import RequirementSet, install_given_reqs
-from pip._internal.req.req_tracker import RequirementTracker
+from pip._internal.req.req_tracker import get_requirement_tracker
 from pip._internal.utils.filesystem import check_path_owner, test_writable_dir
 from pip._internal.utils.misc import (
     ensure_dir,
@@ -343,7 +343,7 @@ def run(self, options, args):
             )
             options.cache_dir = None

-        with RequirementTracker() as req_tracker, TempDirectory(
+        with get_requirement_tracker() as req_tracker, TempDirectory(
             options.build_dir, delete=build_delete, kind="install"
         ) as directory:
             requirement_set = RequirementSet(

src/pip/_internal/commands/wheel.py

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@
 from pip._internal.cli.req_command import RequirementCommand
 from pip._internal.exceptions import CommandError, PreviousBuildDirError
 from pip._internal.req import RequirementSet
-from pip._internal.req.req_tracker import RequirementTracker
+from pip._internal.req.req_tracker import get_requirement_tracker
 from pip._internal.utils.temp_dir import TempDirectory
 from pip._internal.utils.typing import MYPY_CHECK_RUNNING
 from pip._internal.wheel_builder import WheelBuilder
@@ -157,7 +157,7 @@ def run(self, options, args):
         build_delete = (not (options.no_clean or options.build_dir))
         wheel_cache = WheelCache(options.cache_dir, options.format_control)

-        with RequirementTracker() as req_tracker, TempDirectory(
+        with get_requirement_tracker() as req_tracker, TempDirectory(
             options.build_dir, delete=build_delete, kind="wheel"
         ) as directory:

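
The three command modules above switch from constructing ``RequirementTracker`` directly to calling a ``get_requirement_tracker()`` factory. The diff does not show that factory, but the usual motivation for such a change is to let one place decide whether to create a fresh tracker or reuse one that is already active. A minimal, self-contained illustration of that pattern follows; the names are hypothetical and this is not pip's actual implementation.

    from contextlib import contextmanager

    _active_tracker = None  # module-level "current tracker", if any


    class Tracker(object):
        def track(self, name):
            print("tracking", name)


    @contextmanager
    def get_tracker():
        """Yield the active tracker, creating (and tearing down) one if needed."""
        global _active_tracker
        if _active_tracker is not None:
            # An outer caller already set one up; reuse it.
            yield _active_tracker
            return
        _active_tracker = Tracker()
        try:
            yield _active_tracker
        finally:
            _active_tracker = None


    with get_tracker() as tracker:
        tracker.track("example-requirement")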

src/pip/_internal/index/collector.py

Lines changed: 22 additions & 25 deletions
@@ -27,8 +27,8 @@

 if MYPY_CHECK_RUNNING:
     from typing import (
-        Callable, Dict, Iterable, List, MutableMapping, Optional, Sequence,
-        Tuple, Union,
+        Callable, Iterable, List, MutableMapping, Optional, Sequence, Tuple,
+        Union,
     )
     import xml.etree.ElementTree

@@ -435,29 +435,36 @@ def sort_path(path):
 class CollectedLinks(object):

     """
-    Encapsulates all the Link objects collected by a call to
-    LinkCollector.collect_links(), stored separately as--
+    Encapsulates the return value of a call to LinkCollector.collect_links().
+
+    The return value includes both URLs to project pages containing package
+    links, as well as individual package Link objects collected from other
+    sources.
+
+    This info is stored separately as:

     (1) links from the configured file locations,
     (2) links from the configured find_links, and
-    (3) a dict mapping HTML page url to links from that page.
+    (3) urls to HTML project pages, as described by the PEP 503 simple
+        repository API.
     """

     def __init__(
         self,
-        files,       # type: List[Link]
-        find_links,  # type: List[Link]
-        pages,       # type: Dict[str, List[Link]]
+        files,         # type: List[Link]
+        find_links,    # type: List[Link]
+        project_urls,  # type: List[Link]
     ):
         # type: (...) -> None
         """
         :param files: Links from file locations.
         :param find_links: Links from find_links.
-        :param pages: A dict mapping HTML page url to links from that page.
+        :param project_urls: URLs to HTML project pages, as described by
+            the PEP 503 simple repository API.
         """
         self.files = files
         self.find_links = find_links
-        self.pages = pages
+        self.project_urls = project_urls


 class LinkCollector(object):
@@ -483,18 +490,12 @@ def find_links(self):
         # type: () -> List[str]
         return self.search_scope.find_links

-    def _get_pages(self, locations):
-        # type: (Iterable[Link]) -> Iterable[HTMLPage]
+    def fetch_page(self, location):
+        # type: (Link) -> Optional[HTMLPage]
         """
-        Yields (page, page_url) from the given locations, skipping
-        locations that have errors.
+        Fetch an HTML page containing package links.
         """
-        for location in locations:
-            page = _get_html_page(location, session=self.session)
-            if page is None:
-                continue
-
-            yield page
+        return _get_html_page(location, session=self.session)

     def collect_links(self, project_name):
         # type: (str) -> CollectedLinks
@@ -537,12 +538,8 @@ def collect_links(self, project_name):
             lines.append('* {}'.format(link))
         logger.debug('\n'.join(lines))

-        pages_links = {}
-        for page in self._get_pages(url_locations):
-            pages_links[page.url] = list(parse_links(page))
-
         return CollectedLinks(
             files=file_links,
             find_links=find_link_links,
-            pages=pages_links,
+            project_urls=url_locations,
         )
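
The net effect in ``collector.py`` is that ``collect_links()`` no longer fetches or parses any HTML: it returns the project page URLs, and callers fetch pages one at a time through ``fetch_page()``. A rough caller-side sketch follows; it assumes a ``link_collector`` built as in the earlier sketch, and these are internal APIs that may differ between pip versions.

    # `link_collector` is assumed to be a LinkCollector instance (see the
    # earlier sketch); parse_links() is the helper that package_finder.py
    # starts importing in this commit.
    from pip._internal.index.collector import parse_links

    collected = link_collector.collect_links("pip")

    all_links = list(collected.files) + list(collected.find_links)
    for project_url in collected.project_urls:  # URLs now, not pre-fetched pages
        html_page = link_collector.fetch_page(project_url)
        if html_page is None:  # fetch errors are skipped, as before
            continue
        all_links.extend(parse_links(html_page))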

src/pip/_internal/index/package_finder.py

Lines changed: 25 additions & 8 deletions
@@ -19,6 +19,7 @@
     InvalidWheelFilename,
     UnsupportedWheel,
 )
+from pip._internal.index.collector import parse_links
 from pip._internal.models.candidate import InstallationCandidate
 from pip._internal.models.format_control import FormatControl
 from pip._internal.models.link import Link
@@ -778,6 +779,25 @@ def evaluate_links(self, link_evaluator, links):

         return candidates

+    def process_project_url(self, project_url, link_evaluator):
+        # type: (Link, LinkEvaluator) -> List[InstallationCandidate]
+        logger.debug(
+            'Fetching project page and analyzing links: %s', project_url,
+        )
+        html_page = self._link_collector.fetch_page(project_url)
+        if html_page is None:
+            return []
+
+        page_links = list(parse_links(html_page))
+
+        with indent_log():
+            package_links = self.evaluate_links(
+                link_evaluator,
+                links=page_links,
+            )
+
+        return package_links
+
     def find_all_candidates(self, project_name):
         # type: (str) -> List[InstallationCandidate]
         """Find all available InstallationCandidate for project_name
@@ -798,14 +818,11 @@ def find_all_candidates(self, project_name):
         )

         page_versions = []
-        for page_url, page_links in collected_links.pages.items():
-            logger.debug('Analyzing links from page %s', page_url)
-            with indent_log():
-                new_versions = self.evaluate_links(
-                    link_evaluator,
-                    links=page_links,
-                )
-            page_versions.extend(new_versions)
+        for project_url in collected_links.project_urls:
+            package_links = self.process_project_url(
+                project_url, link_evaluator=link_evaluator,
+            )
+            page_versions.extend(package_links)

         file_versions = self.evaluate_links(
             link_evaluator,

src/pip/_internal/pyproject.py

Lines changed: 16 additions & 0 deletions
@@ -5,6 +5,7 @@
 import sys

 from pip._vendor import pytoml, six
+from pip._vendor.packaging.requirements import InvalidRequirement, Requirement

 from pip._internal.exceptions import InstallationError
 from pip._internal.utils.typing import MYPY_CHECK_RUNNING
@@ -150,6 +151,21 @@ def load_pyproject_toml(
             reason="'build-system.requires' is not a list of strings.",
         ))

+    # Each requirement must be valid as per PEP 508
+    for requirement in requires:
+        try:
+            Requirement(requirement)
+        except InvalidRequirement:
+            raise InstallationError(
+                error_template.format(
+                    package=req_name,
+                    reason=(
+                        "'build-system.requires' contains an invalid "
+                        "requirement: {!r}".format(requirement)
+                    ),
+                )
+            )
+
     backend = build_system.get("build-backend")
     check = []  # type: List[str]
     if backend is None:
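
For reference, the same PEP 508 validation can be reproduced outside pip with the public ``packaging`` distribution (pip itself uses its vendored copy, ``pip._vendor.packaging``, as the diff shows). The requirement strings below are made up for illustration:

    from packaging.requirements import InvalidRequirement, Requirement

    requires = ["setuptools >= 40.8.0", "wheel", "this is not a requirement !!"]

    for requirement in requires:
        try:
            Requirement(requirement)
        except InvalidRequirement as exc:
            print("invalid 'build-system.requires' entry {!r}: {}".format(
                requirement, exc,
            ))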
