feat: Experimental uv integration #3144

Merged · 16 commits · Sep 23, 2024
10 changes: 8 additions & 2 deletions .github/workflows/ci.yml
@@ -90,6 +90,10 @@ jobs:
with:
path: .venv
key: venv-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('pdm.lock') }}
- name: Install uv
uses: astral-sh/setup-uv@v1
with:
version: "latest"
- name: Install current PDM via pip
if: matrix.install-via == 'pip'
run: python -m pip install -U .
@@ -117,8 +121,10 @@
Pack:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
with:
python-version: 3.x
- name: Install PDM
13 changes: 12 additions & 1 deletion CHANGELOG.md
@@ -1,6 +1,5 @@
## Release v2.18.2 (2024-09-10)


### Bug Fixes

- Respect the `excludes` and `overrides` settings when installing packages. ([#3113](https://github.com/pdm-project/pdm/issues/3113))
@@ -14,6 +13,18 @@

- Skip tests related to python installation on non-standard platforms. ([#3053](https://github.com/pdm-project/pdm/issues/3053))

## Release v2.19.0a0 (2024-09-05)

### Breaking Changes

- `pre_install` and `post_install` signals now receive the list of packages to be installed, instead of a candidate mapping. ([#3144](https://github.com/pdm-project/pdm/issues/3144))

### Features & Improvements

- Deprecate `Core.synchronizer_class` attribute. To get the synchronizer class, use `Project.get_synchronizer` method instead.
Deprecate `Core.resolver_class` attribute. To get the resolver class, use `Project.get_resolver` method instead. ([#3144](https://github.com/pdm-project/pdm/issues/3144))
- Add experimental support for `uv` as the resolver and installer. One can opt in by setting `use_uv` to `true` with the `pdm config` command. ([#3144](https://github.com/pdm-project/pdm/issues/3144))


## Release v2.18.1 (2024-08-16)

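The breaking change above affects plugin authors: the `pre_install` and `post_install` signals now carry a `packages` list instead of a `candidates` mapping. A minimal sketch of a hook receiver adapted to the new payload follows; the receiver names and echoed messages are illustrative, and the exact shape of the package objects should be checked against the PDM source (the diff in `src/pdm/cli/actions.py` below shows they expose a `candidate` attribute).

```python
# Sketch of a PDM plugin adapting to the new signal payload (#3144).
# Assumptions: receiver names and messages are illustrative; `packages` is the
# list emitted by do_sync(), whose items expose a `.candidate` attribute.
from pdm.core import Core
from pdm.signals import post_install, pre_install


def on_pre_install(project, packages, dry_run, **kwargs):
    # Previously this hook received `candidates`, a mapping of name -> Candidate.
    names = [p.candidate.identify() for p in packages]
    project.core.ui.echo(f"About to install {len(names)} package(s): {', '.join(names)}")


def on_post_install(project, packages, dry_run, **kwargs):
    project.core.ui.echo(f"Finished installing {len(packages)} package(s) (dry_run={dry_run})")


def register(core: Core) -> None:
    # Plugin entry point: connect the receivers to the signals.
    pre_install.connect(on_pre_install)
    post_install.connect(on_post_install)
```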
30 changes: 30 additions & 0 deletions docs/usage/uv.md
@@ -0,0 +1,30 @@
# Use uv (Experimental)

+++ 2.19.0

PDM has experimental support for [uv](https://github.com/astral-sh/uv) as the resolver and installer. To enable it:

```
pdm config use_uv true
```

PDM will automatically detect the `uv` binary on your system. You need to install `uv` first. See [uv's installation guide](https://docs.astral.sh/uv/getting-started/installation/) for more details.
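If uv is not installed yet, any of the standard installation routes works; the commands below are the usual ones from uv's documentation and are shown here only as a convenience (see the linked guide for platform-specific options):

```
# standalone installer
curl -LsSf https://astral.sh/uv/install.sh | sh
# or install it as a Python tool
pipx install uv
# verify that it is on PATH so PDM can find it
uv --version
```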

## Reuse the Python installations of uv

uv also supports installing Python interpreters. To avoid overhead, you can configure PDM to reuse the Python installations of uv by:

```
pdm config python.install_root $(uv python dir)
```

## Limitations

Despite the significant performance improvements brought by uv, it is important to note the following limitations:

- The cache files are stored in uv's own cache directory, and you have to use the `uv` command to manage them.
- PEP 582 local packages layout is not supported.
- `inherit_metadata` lock strategy is not supported by uv. This will be ignored when writing to the lock file.
- Update strategies other than `all` and `reuse` are not supported.
- Editable requirements must be local paths. Requirements like `-e git+<git_url>` are not supported.
- `overrides` and `excludes` settings under `[tool.pdm.resolution]` are not supported.
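If none of these limitations applies to your project, the whole opt-in workflow is just a config switch around the usual commands. A minimal sketch, using standard PDM commands with `use_uv` as the only new key:

```
# opt in to uv as the resolver and installer
pdm config use_uv true
# lock and install as usual; uv is used under the hood
pdm lock
pdm install
# switch back to the built-in resolver and installer at any time
pdm config use_uv false
```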
1 change: 1 addition & 0 deletions mkdocs.yml
@@ -60,6 +60,7 @@ nav:
- Lock Files:
- usage/lockfile.md
- usage/lock-targets.md
- usage/uv.md
- usage/publish.md
- usage/config.md
- usage/scripts.md
8 changes: 4 additions & 4 deletions src/pdm/_types.py
@@ -88,17 +88,17 @@ def __rich__(self) -> str:
CandidateInfo = Tuple[List[str], str, str]


class Package(NamedTuple):
class SearchResult(NamedTuple):
name: str
version: str
summary: str


SearchResult = List[Package]
SearchResults = List[SearchResult]


if TYPE_CHECKING:
from typing import TypedDict
from typing import Required, TypedDict

class Comparable(Protocol):
def __lt__(self, __other: Any) -> bool: ...
@@ -117,7 +117,7 @@ def __rich__(self) -> str: ...

class FileHash(TypedDict, total=False):
url: str
hash: str
hash: Required[str]
file: str


154 changes: 62 additions & 92 deletions src/pdm/cli/actions.py
@@ -1,7 +1,6 @@
from __future__ import annotations

import contextlib
import dataclasses
import datetime
import hashlib
import inspect
@@ -11,15 +10,13 @@
import textwrap
from typing import TYPE_CHECKING, Collection, Iterable, cast

from resolvelib.reporters import BaseReporter
from resolvelib.resolvers import ResolutionImpossible, ResolutionTooDeep, Resolver
from resolvelib.resolvers import ResolutionImpossible, ResolutionTooDeep

from pdm import termui
from pdm.cli.filters import GroupSelection
from pdm.cli.hooks import HookManager
from pdm.cli.utils import (
check_project_file,
fetch_hashes,
find_importable_files,
format_resolution_impossible,
get_pep582_path,
@@ -28,19 +25,16 @@
from pdm.environments import BareEnvironment
from pdm.exceptions import PdmException, PdmUsageError, ProjectError
from pdm.models.candidates import Candidate
from pdm.models.markers import EnvSpec, get_marker
from pdm.models.repositories import LockedRepository
from pdm.models.specifiers import PySpecSet
from pdm.models.markers import EnvSpec
from pdm.models.repositories import LockedRepository, Package
from pdm.project import Project
from pdm.project.lockfile import FLAG_CROSS_PLATFORM, FLAG_DIRECT_MINIMAL_VERSIONS, FLAG_INHERIT_METADATA
from pdm.resolver import resolve
from pdm.project.lockfile import FLAG_CROSS_PLATFORM, FLAG_INHERIT_METADATA, FLAG_STATIC_URLS
from pdm.resolver.reporters import RichLockReporter
from pdm.termui import logger
from pdm.utils import deprecation_warning

if TYPE_CHECKING:
from pdm.models.requirements import Requirement
from pdm.resolver.providers import BaseProvider


def do_lock(
@@ -59,8 +53,6 @@ def do_lock(
"""Performs the locking process and update lockfile."""
hooks = hooks or HookManager(project)
check_project_file(project)
if not project.config["strategy.inherit_metadata"]:
project.lockfile.default_strategies.remove(FLAG_INHERIT_METADATA)
lock_strategy = project.lockfile.apply_strategy_change(strategy_change or [])
if FLAG_CROSS_PLATFORM in lock_strategy: # pragma: no cover
project.core.ui.deprecated(
@@ -79,7 +71,7 @@
candidates = [entry.candidate for entry in locked_repo.packages.values()]
for c in candidates:
c.hashes.clear()
fetch_hashes(repo, candidates)
repo.fetch_hashes(candidates)
lockfile = locked_repo.format_lockfile(groups=project.lockfile.groups, strategy=lock_strategy)
project.write_lockfile(lockfile)
return locked_repo.all_candidates
@@ -123,31 +115,25 @@ def do_lock(
with ui.logging("lock"):
# The context managers are nested to ensure the spinner is stopped before
# any message is thrown to the output.
resolver_class = project.get_resolver()
with RichLockReporter(requirements, ui) as reporter:
try:
for target in targets:
if supports_env_spec:
provider = project.get_provider(
strategy,
tracked_names,
direct_minimal_versions=FLAG_DIRECT_MINIMAL_VERSIONS in lock_strategy,
env_spec=target,
locked_repository=locked_repo,
)
else: # pragma: no cover
provider = project.get_provider(
strategy,
tracked_names,
direct_minimal_versions=FLAG_DIRECT_MINIMAL_VERSIONS in lock_strategy,
ignore_compatibility=target.is_allow_all(),
)
provider.repository.reporter = reporter
mapping = _lock_for_env(
project, target, provider, reporter, requirements, lock_strategy, resolve_max_rounds
resolver = resolver_class(
environment=project.environment,
requirements=[r for r in requirements if not r.marker or r.marker.matches(target)],
update_strategy=strategy,
strategies=lock_strategy,
target=target,
tracked_names=list(tracked_names or ()),
locked_repository=locked_repo,
reporter=reporter,
)
locked_repo.merge_result(target, mapping.values(), provider.fetched_dependencies)
reporter.update(f"Resolve for environment {target}")
resolved, collected_groups = resolver.resolve()
locked_repo.merge_result(target, resolved)
if result_repo is not locked_repo:
result_repo.merge_result(target, mapping.values(), provider.fetched_dependencies)
result_repo.merge_result(target, resolved)
except ResolutionTooDeep:
reporter.update(f"{termui.Emoji.LOCK} Lock failed.", info="", completed=1)
ui.echo(
@@ -163,8 +149,7 @@ def do_lock(
ui.error(format_resolution_impossible(err))
raise ResolutionImpossible("Unable to find a resolution") from None
else:
groups = list(set(groups) | provider.repository.collected_groups)
provider.repository.collected_groups.clear()
groups = list(set(groups) | collected_groups)
data = result_repo.format_lockfile(groups=groups, strategy=lock_strategy)
if project.enable_write_lockfile:
reporter.update(f"{termui.Emoji.LOCK} Lock successful.", info="", completed=1)
@@ -174,56 +159,24 @@ def do_lock(
return result_repo.all_candidates


def _lock_for_env(
project: Project,
env_spec: EnvSpec,
provider: BaseProvider,
reporter: RichLockReporter,
requirements: list[Requirement],
lock_strategy: set[str],
max_rounds: int,
) -> dict[str, Candidate]:
reporter.update(f"Resolve for environment {env_spec}")
requirements = [req for req in requirements if not req.marker or req.marker.matches(env_spec)]
resolver: Resolver = project.core.resolver_class(provider, reporter)
mapping, *_ = resolve(
resolver,
requirements,
max_rounds=max_rounds,
inherit_metadata=FLAG_INHERIT_METADATA in lock_strategy,
)
if project.enable_write_lockfile:
reporter.update(info="Fetching hashes for resolved packages")
fetch_hashes(provider.repository, mapping.values())
if not (env_python := PySpecSet(env_spec.requires_python)).is_superset(project.environment.python_requires):
python_marker = get_marker(env_python.as_marker_string())
for candidate in mapping.values():
marker = candidate.req.marker or get_marker("")
candidate.req = dataclasses.replace(candidate.req, marker=marker & python_marker)
return mapping


def resolve_candidates_from_lockfile(
def resolve_from_lockfile(
project: Project,
requirements: Iterable[Requirement],
cross_platform: bool | None = None,
groups: Collection[str] | None = None,
env_spec: EnvSpec | None = None,
) -> dict[str, Candidate]:
) -> Iterable[Package]:
from dep_logic.tags import EnvCompatibility

from pdm.resolver.resolvelib import RLResolver

ui = project.core.ui
resolve_max_rounds = int(project.config["strategy.resolve_max_rounds"])
if cross_platform is not None: # pragma: no cover
deprecation_warning("cross_platform argument is deprecated", stacklevel=2)

if env_spec is None:
# Resolve for the current environment by default
env_spec = project.environment.spec
reqs = [req for req in requirements if not req.marker or req.marker.matches(env_spec)]
with ui.open_spinner("Resolving packages from lockfile...") as spinner:
reporter = BaseReporter()
provider = project.get_provider(for_install=True, env_spec=env_spec)
locked_repo = cast("LockedRepository", provider.repository)
with ui.open_spinner("Resolving packages from lockfile..."):
locked_repo = project.get_locked_repository(env_spec)
lock_targets = locked_repo.targets
if env_spec not in lock_targets:
compatibilities = [target.compare(env_spec) for target in lock_targets]
@@ -246,25 +199,41 @@ def resolve_candidates_from_lockfile(
raise PdmException("No compatible lock target found")

with ui.logging("install-resolve"):
if FLAG_INHERIT_METADATA in project.lockfile.strategy and groups is not None:
strategies = project.lockfile.strategy.copy()
if FLAG_INHERIT_METADATA in strategies and groups is not None:
return locked_repo.evaluate_candidates(groups)
resolver: Resolver = project.core.resolver_class(provider, reporter)
strategies.update((FLAG_STATIC_URLS, FLAG_INHERIT_METADATA))
resolver = project.get_resolver()(
environment=project.environment,
requirements=reqs,
update_strategy="reuse",
strategies=strategies,
target=env_spec,
tracked_names=(),
locked_repository=locked_repo,
)
if isinstance(resolver, RLResolver): # resolve from lock file
resolver.provider.repository = locked_repo
try:
mapping, *_ = resolve(
resolver,
reqs,
max_rounds=resolve_max_rounds,
inherit_metadata=True,
)
return resolver.resolve().packages
except ResolutionImpossible as e:
logger.exception("Broken lockfile")
raise PdmException(
"Resolving from lockfile failed. You may fix the lockfile by `pdm lock --update-reuse` and retry."
) from e
else:
spinner.update("Fetching hashes for resolved packages...")
fetch_hashes(provider.repository, mapping.values())
return mapping


def resolve_candidates_from_lockfile(
project: Project,
requirements: Iterable[Requirement],
cross_platform: bool | None = None,
groups: Collection[str] | None = None,
env_spec: EnvSpec | None = None,
) -> dict[str, Candidate]:
if cross_platform is not None: # pragma: no cover
deprecation_warning("cross_platform argument is deprecated", stacklevel=2)
packages = resolve_from_lockfile(project, requirements, groups, env_spec)
return {p.candidate.identify(): p.candidate for p in packages}


def check_lockfile(project: Project, raise_not_exist: bool = True) -> str | None:
@@ -312,11 +281,10 @@ def do_sync(
selection.validate()
for group in selection:
requirements.extend(project.get_dependencies(group))
candidates = resolve_candidates_from_lockfile(project, requirements, groups=list(selection))
packages = resolve_from_lockfile(project, requirements, groups=list(selection))
if tracked_names and dry_run:
candidates = {name: c for name, c in candidates.items() if name in tracked_names}
synchronizer = project.core.synchronizer_class(
candidates,
packages = [p for p in packages if p.candidate.identify() in tracked_names]
synchronizer = project.get_synchronizer()(
project.environment,
clean=clean,
dry_run=dry_run,
@@ -325,11 +293,13 @@
reinstall=reinstall,
only_keep=only_keep,
fail_fast=fail_fast,
packages=packages,
requirements=requirements,
)
with project.core.ui.logging("install"):
hooks.try_emit("pre_install", candidates=candidates, dry_run=dry_run)
hooks.try_emit("pre_install", packages=packages, dry_run=dry_run)
synchronizer.synchronize()
hooks.try_emit("post_install", candidates=candidates, dry_run=dry_run)
hooks.try_emit("post_install", packages=packages, dry_run=dry_run)


def ask_for_import(project: Project) -> None:
Expand Down
2 changes: 1 addition & 1 deletion src/pdm/cli/commands/add.py
@@ -154,7 +154,7 @@ def do_add(
for req in group_deps:
req.specifier = get_specifier("")

reqs = [r for g, deps in all_dependencies.items() if lock_groups is None or g in lock_groups for r in deps]
reqs = [r for g, deps in all_dependencies.items() for r in deps if lock_groups is None or g in lock_groups]
with hooks.skipping("post_lock"):
resolved = do_lock(
project,