Compare commits: 026416d158 ... 33b1176800

2 commits: 33b1176800, 6a36a9d2d5

17 changed files with 1776 additions and 205 deletions
CHANGELOG.md (20 changes)

@@ -1,3 +1,17 @@
+# 0.1.0
+
+* Add remote mode for harvesting a remote machine via a local workstation (no need to install enroll remotely)
+  Optionally use `--no-sudo` if you don't want the remote user to have passwordless sudo when conducting the
+  harvest, albeit you'll end up with less useful data (same as if running `enroll harvest` on a machine without
+  sudo)
+* Add `--dangerous` flag to capture even sensitive data (use at your own risk!)
+* Add `--sops` flag which makes the harvest and the manifest 'out' data encrypted as a single SOPS data file.
+  This would make `--dangerous` a little bit safer, if your intention is just to store the Ansible manifest
+  in git or somewhere similar for disaster-recovery purposes (e.g encrypted at rest for safe-keeping).
+* Do a better job at capturing other config files in `/etc/<package>/` even if that package doesn't normally
+  ship or manage those files.
+* Don't collect files ending in `.log`
+
 # 0.0.5
 
 * Use JinjaTurtle to generate dynamic template/inventory if it's on the PATH

@@ -7,7 +21,7 @@
 of the same role. Use 'single site' mode (no `--fqdn`) if you want more readable,
 self-contained roles (in which case, store each manifested output in its own
 repo per server)
-* Generate an ansible.cfg if not present, to support host_vars plugin and other params,
+* Generate an ansible.cfg if not present, to support `host_vars` plugin and other params,
 when using `--fqdn` mode
 * Be more permissive with files that we previously thought contained secrets (ignore commented lines)
 

@@ -24,10 +38,10 @@
 # 0.0.2
 
 * Merge pkg_ and roles created based on file/service detection
-* Avoid idempotency issue with users (password_lock)
+* Avoid idempotency issue with users (`password_lock`)
 * Rename subcommands/args ('export' is now 'enroll', '--bundle' is now '--harvest')
 * Don't try and start systemd services that were Inactive at harvest time
-* Capture miscellaneous files in /etc under their own etc_custom role, but not backup files
+* Capture miscellaneous files in /etc under their own `etc_custom` role, but not backup files
 * Add tests
 * Various other bug fixes
 

@@ -25,6 +25,7 @@ RUN set -eux; \
     python3-all \
     python3-poetry-core \
     python3-yaml \
+    python3-paramiko \
     rsync \
     ca-certificates \
     ; \
README.md (151 changes)

@@ -8,7 +8,7 @@
 It aims to be **optimistic and noninteractive**:
 - Detects packages that have been installed
-- Detects Debian package ownership of `/etc` files using dpkg’s local database.
+- Detects Debian package ownership of `/etc` files using dpkg's local database.
 - Captures config that has **changed from packaged defaults** (dpkg conffile hashes + package md5sums when available).
 - Also captures **service-relevant custom/unowned files** under `/etc/<service>/...` (e.g. drop-in config includes).
 - Defensively excludes likely secrets (path denylist + content sniff + size caps).
 
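The dpkg ownership and conffile checks mentioned above can be reproduced by hand if you want to see the kind of information enroll works from (commands shown for illustration only; the package and path are examples):

```bash
# Which package, if any, owns a given file under /etc?
dpkg -S /etc/ssh/sshd_config

# Which conffiles does a package register, and what are their baseline md5sums?
dpkg-query --show --showformat='${Conffiles}\n' openssh-server
```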
@@ -23,12 +23,12 @@ It aims to be **optimistic and noninteractive**:
 **enroll** has two distinct ways to generate Ansible:
 
 ## 1) Single-site mode (default: *no* `--fqdn`)
-Use this when you’re enrolling **one server** (or you’re generating a “golden” role set you intend to reuse).
+Use this when you're enrolling **one server** (or you're generating a "golden" role set you intend to reuse).
 
 **What you get**
 - Config, templates, and defaults are primarily **contained inside each role**.
-- Raw config files (when not templated) live in the role’s `files/`.
-- Template variables (when templated) live in the role’s `defaults/main.yml`.
+- Raw config files (when not templated) live in the role's `files/`.
+- Template variables (when templated) live in the role's `defaults/main.yml`.
 
 **Pros**
 - Roles are more **self-contained** and easier to understand.

@@ -36,14 +36,14 @@ Use this when you’re enrolling **one server** (or you’re generating a “gol
 - Less inventory abstraction/duplication.
 
 **Cons**
-- Less convenient for quickly enrolling multiple hosts with divergent configs (you’ll do more manual work to make roles flexible across hosts).
+- Less convenient for quickly enrolling multiple hosts with divergent configs (you'll do more manual work to make roles flexible across hosts).
 
 ## 2) Multi-site mode (`--fqdn`)
 Use this when you want to enroll **several existing servers** quickly, especially if they differ.
 
 **What you get**
 - Roles are **shared** across hosts, but host-specific data lives in inventory.
-- Host inventory drives what’s managed:
+- Host inventory drives what's managed:
   - which files to deploy for that host
   - which packages are relevant for that host
   - which services should be enabled/started for that host

@@ -51,17 +51,17 @@ Use this when you want to enroll **several existing servers** quickly, especiall
 
 **Pros**
 - Fastest way to retrofit **multiple servers** into config management.
-- Avoids shared-role “host A breaks host B” problems by keeping host-specific state in inventory.
+- Avoids shared-role "host A breaks host B" problems by keeping host-specific state in inventory.
 - Better fit when you already have a fleet and want to capture/reflect reality first.
 
 **Cons**
-- More abstraction: roles become more “data-driven”.
+- More abstraction: roles become more "data-driven".
 - Potential duplication: raw files may exist per-host in inventory (even if identical).
 - Harder to use the roles to **provision a brand-new server** without also building an inventory for that new host, because multi-site output assumes the server already exists and is being retrofitted.
 
 **Rule of thumb**
-- If your goal is *“make this one server reproducible / provisionable”* → start with **single-site**.
-- If your goal is *“get several already-running servers under management quickly”* → use **multi-site**.
+- If your goal is *"make this one server reproducible / provisionable"* → start with **single-site**.
+- If your goal is *"get several already-running servers under management quickly"* → use **multi-site**.
 
 ---
 
@@ -75,6 +75,21 @@ It also detects if any config files have been *changed* from their packaged defa
 
 The harvest writes a state.json file explaining all the data it harvested and, if it chose not to harvest something, explanations as to why that is the case (see below: sensitive data).
 
+### Remote harvesting (workstation → remote)
+
+If you'd prefer not to install **enroll** on the target host, you can run the harvest over SSH from your workstation and pull the harvest bundle back locally:
+
+```bash
+enroll harvest --remote-host myhost.example.com --remote-user myuser --out /tmp/enroll-harvest
+```
+
+- `--remote-port` defaults to `22`
+- `--remote-user` defaults to your local `$USER`
+
+This uploads a self-contained `enroll` zipapp to a temporary directory on the remote host, runs `harvest` there, then downloads the resulting harvest bundle to the `--out` directory on your workstation.
+
+**Privilege note:** A "full" harvest typically needs root access. Remote harvesting assumes the remote user can run `sudo` **without a password prompt** (NOPASSWD) so the harvest can run non-interactively. If you don't want this, pass `--no-sudo` as well.
+
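If you do want that non-interactive sudo behaviour, a minimal sudoers drop-in on the remote host could look like the following (the username and file name are illustrative; adapt and audit for your environment):

```bash
# Grant passwordless sudo to the harvesting user via a sudoers drop-in
echo 'myuser ALL=(ALL) NOPASSWD: ALL' | sudo tee /etc/sudoers.d/90-myuser
sudo chmod 0440 /etc/sudoers.d/90-myuser
```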
 ## Sensitive data
 
 **enroll** doesn't make any assumptions about how you might handle sensitive data from your config files, in Ansible. Some people might use SOPS, others might use Vault, others might do something else entirely.
@@ -85,6 +100,68 @@ This inevitably means that it will deliberately miss some important config files
 
 Nonetheless, in the Harvest 'state' file, there should be an explanation of 'excluded files'. You can parse or inspect this file to find what it chose to ignore, and then you know what you might want to augment the results with later, once you 'manifest' the harvest into Ansible configuration.
 
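For example, to eyeball what was skipped you can pretty-print the state file and look for the exclusion entries (the exact key names inside `state.json` are not shown here, so treat this as a starting point rather than a documented schema):

```bash
# Pretty-print the harvest state and page through it
python3 -m json.tool /tmp/enroll-harvest/state.json | less

# Or, with jq installed, list the top-level keys to find the excluded-files section
jq 'keys' /tmp/enroll-harvest/state.json
```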
+Nonetheless, in some cases it may be appropriate to truly grab as much as you can, including secrets. For that, read on for the `--dangerous` flag.
+
+### Opting in to fetching sensitive data: `--dangerous`
+
+**WARNING:** `--dangerous` disables enroll's "likely a secret" safety checks. This can cause private keys, TLS key material, API tokens, database passwords, and other credentials to be copied into your harvest output **in plaintext**.
+
+Only use `--dangerous` if you explicitly want to scoop up sensitive files and you understand where the harvest output is stored, who can read it, and how it will be handled (backups, git commits, etc, as well as risk of using `--out` with a shared `/tmp` location where other users could see the data). We offer no liability if your sensitive data is compromised through the use of this tool!
+
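Before a `--dangerous` harvest leaves your machine (or lands in git), a quick grep for obvious key material is a cheap sanity check (crude and not exhaustive, but it catches the worst cases):

```bash
# Look for PEM-style private keys anywhere in the harvest output
grep -rl -- "-----BEGIN .*PRIVATE KEY-----" /tmp/enroll-harvest || echo "no obvious private keys found"
```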
+**Strong recommendation:** If you plan to keep harvested files long-term (especially in git), encrypt secrets at rest. A common approach is to use **SOPS** and then use the **community.sops** Ansible collection to load/decrypt encrypted content during deploy.
+
+Install the collection:
+
+```bash
+ansible-galaxy collection install community.sops
+```
+
+Then you can use the collection's lookup/vars plugins or modules to decrypt or load SOPS-encrypted vars at runtime.
+
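As a sketch of that workflow, you might move any secret values that enroll captured into a vars file and encrypt it with SOPS so the collection can decrypt it at deploy time (the paths and fingerprint variable below are illustrative):

```bash
# Encrypt an Ansible vars file with SOPS, using a GPG key
sops --encrypt --pgp "$FINGERPRINT" host_vars/myhost.example.com/secrets.yml \
  > host_vars/myhost.example.com/secrets.sops.yml
```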
+Note the section below **also** talks about SOPS, but this is in the context of simply encrypting the data generated by `enroll` at rest for safe-keeping, **not** for direct integration with Ansible.
+
+
+### Encrypting harvest/manifests at rest with `--sops`
+
+If you want to use `--dangerous` (or you simply want to keep the harvested artifacts private when they're sitting on disk, in git, etc), you can pass `--sops` to `harvest`, `manifest`, or `single-shot`.
+
+To use `--sops`, you will need to have [sops](https://github.com/getsops/sops) installed on your `$PATH`.
+
+- `--sops` expects one or more **GPG key fingerprints**. If `sops` is not on the `$PATH`, **enroll** will error.
+- `harvest --sops ...` writes a *single* encrypted file (`harvest.tar.gz.sops`) instead of a plaintext directory.
+- `manifest --sops ...` (and `single-shot --sops ...`) will:
+  - decrypt the harvest bundle with `sops -d` (if the `--harvest` input is an encrypted file), then generate manifests as normal
+  - bundle the entire generated Ansible output into a *single* encrypted file (`manifest.tar.gz.sops`)
+
+⚠️ **Important:** `manifest --sops` (and `single-shot --sops`) produces **one encrypted file**. It is **not** an Ansible repo you can point `ansible-playbook` at directly. It is **not** the same as using SOPS inventory with the Ansible SOPS collection.
+
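If you're not sure which fingerprint(s) to pass, your local GPG keyring will show them:

```bash
# List your secret keys together with their full fingerprints
gpg --list-secret-keys --fingerprint
```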
+To use the encrypted SOPS manifest, decrypt and extract it first, then run Ansible from inside the extracted `manifest/` directory:
+
+```bash
+sops -d /path/to/manifest.tar.gz.sops | tar -xzvf -
+cd manifest
+ansible-playbook ...
+```
+
+Example:
+
+```bash
+# Harvest (encrypted-at-rest)
+enroll harvest --out /tmp/enroll-harvest --dangerous --sops <FINGERPRINT(s)>
+
+# Manifest (encrypted-at-rest)
+enroll manifest --harvest /tmp/enroll-harvest/harvest.tar.gz.sops --out /tmp/enroll-ansible --sops <FINGERPRINT(s)>
+
+# Decrypt/extract manifest output for inspection / ansible runs
+cd /tmp/enroll-ansible
+sops -d manifest.tar.gz.sops | tar -xzvf -
+cd manifest
+```
+
+(If you want to manually inspect an encrypted harvest bundle, extract it into its own directory, e.g. `mkdir -p harvest && sops -d harvest.tar.gz.sops | tar -xzvf - -C harvest`.)
+
+
 
 ## Manifest
 
 The 'manifest' subcommand expects to be given a path to the 'harvest' obtained in the first step. It will then attempt to generate Ansible roles and playbooks (and potentially 'inventory') from that harvest.
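In its simplest (plain, single-site) form that step looks like this, pointing at the directory produced by `harvest`:

```bash
enroll manifest --harvest /tmp/enroll-harvest --out /tmp/enroll-ansible
```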
@@ -116,13 +193,13 @@ If you *do* have JinjaTurtle installed, but *don't* wish to make use of it, you
 
 ---
 
-# How multi-site avoids “shared role breaks a host”
+# How multi-site avoids "shared role breaks a host"
 
 In multi-site mode, **roles are data-driven**. The role contains generic tasks like:
 
-- “deploy all files listed for this host”
-- “install packages listed for this host”
-- “apply systemd enable/start state listed for this host”
+- "deploy all files listed for this host"
+- "install packages listed for this host"
+- "apply systemd enable/start state listed for this host"
 
 The host inventory is what decides which files/packages/services apply to that host. This prevents the classic failure mode where host2 adds a config file to a shared role and host1 then fails trying to deploy a file it never had.
 
@@ -130,7 +207,7 @@ Raw non-templated files are stored under:
 
 - `inventory/host_vars/<fqdn>/<role>/.files/...`
 
-…and the host’s role variables describe which of those files should be deployed.
+…and the host's role variables describe which of those files should be deployed.
 
 ---
 
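To see what that looks like for a given host after manifesting, you can simply list that part of the inventory (the FQDN below is illustrative, and the exact nesting under `.files/` depends on which paths were captured):

```bash
find /tmp/enroll-ansible/inventory/host_vars/myhost.example.com -type f | head
```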
@@ -182,6 +259,42 @@ On the host (root recommended to harvest as much data as possible):
 ```bash
 enroll harvest --out /tmp/enroll-harvest
 ```
+### Remote harvest over SSH (no enroll install required on the remote host)
+
+```bash
+enroll harvest --remote-host myhost.example.com --remote-user myuser --out /tmp/enroll-harvest
+```
+
+### `--dangerous` (captures potentially sensitive files — read the warning above)
+
+```bash
+enroll harvest --out /tmp/enroll-harvest --dangerous
+```
+
+Remote + dangerous:
+
+```bash
+enroll harvest --remote-host myhost.example.com --remote-user myuser --out /tmp/enroll-harvest --dangerous
+```
+
+### `--sops` (encrypt bundles at rest)
+
+`--sops` bundles and encrypts the output as a single SOPS-encrypted `.tar.gz.sops` file (GPG). This is particularly useful if you're using `--dangerous`.
+
+```bash
+# Encrypted harvest bundle (writes /tmp/enroll-harvest/harvest.tar.gz.sops)
+enroll harvest --out /tmp/enroll-harvest --dangerous --sops <FINGERPRINT(s)>
+
+# Encrypted manifest bundle (writes /tmp/enroll-ansible/manifest.tar.gz.sops)
+enroll manifest --harvest /tmp/enroll-harvest/harvest.tar.gz.sops --out /tmp/enroll-ansible --sops <FINGERPRINT(s)>
+
+# Decrypt/extract the manifest bundle, then run Ansible from inside ./manifest/
+cd /tmp/enroll-ansible
+sops -d manifest.tar.gz.sops | tar -xzvf -
+cd manifest
+ansible-playbook ./playbook.yml
+```
+
+
 ## 2. Generate Ansible manifests (roles/playbook) from that harvest
 
@@ -208,6 +321,14 @@ Alternatively, do both steps in one shot:
 ```bash
 enroll single-shot --harvest /tmp/enroll-harvest --out /tmp/enroll-ansible --fqdn "$(hostname -f)"
 ```
+Remote single-shot (run harvest over SSH, then manifest locally):
+
+```bash
+enroll single-shot --remote-host myhost.example.com --remote-user myuser --harvest /tmp/enroll-harvest --out /tmp/enroll-ansible --fqdn "myhost.example.com"
+```
+
+In multi-site mode (`--fqdn`), you can run single-shot repeatedly against multiple hosts while reusing the same `--out` directory so each host merges into the existing Ansible repo.
+
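For example, enrolling two existing hosts into one repo (host names and paths are illustrative):

```bash
enroll single-shot --remote-host host1.example.com --harvest /tmp/h1 --out ~/ansible --fqdn host1.example.com
enroll single-shot --remote-host host2.example.com --harvest /tmp/h2 --out ~/ansible --fqdn host2.example.com
```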
 
 ## 3. Run Ansible
 
debian/changelog (vendored, 16 changes)

@@ -1,3 +1,19 @@
+enroll (0.1.0) unstable; urgency=medium
+
+  * Add remote mode for harvesting a remote machine via a local workstation (no need to install enroll remotely)
+    Optionally use `--no-sudo` if you don't want the remote user to have passwordless sudo when conducting the
+    harvest, albeit you'll end up with less useful data (same as if running `enroll harvest` on a machine without
+    sudo)
+  * Add `--dangerous` flag to capture even sensitive data (use at your own risk!)
+  * Add `--sops` flag which makes the harvest and the manifest 'out' data encrypted as a single SOPS data file.
+    This would make `--dangerous` a little bit safer, if your intention is just to store the Ansible manifest
+    in git or somewhere similar for disaster-recovery purposes (e.g encrypted at rest for safe-keeping).
+  * Do a better job at capturing other config files in `/etc/<package>/` even if that package doesn't normally
+    ship or manage those files.
+  * Don't collect files ending in `.log`
+
+ -- Miguel Jacq <mig@mig5.net>  Tue, 17 Dec 2025 18:00:00 +1100
+
 enroll (0.0.5) unstable; urgency=medium
 
  * Use JinjaTurtle to generate dynamic template/inventory if it's on the PATH
debian/control (vendored, 5 changes)

@@ -9,12 +9,13 @@ Build-Depends:
 pybuild-plugin-pyproject,
 python3-all,
 python3-yaml,
-python3-poetry-core
+python3-poetry-core,
+python3-paramiko
 Standards-Version: 4.6.2
 Homepage: https://git.mig5.net/mig5/enroll
 
 Package: enroll
 Architecture: all
-Depends: ${misc:Depends}, ${python3:Depends}, python3-yaml
+Depends: ${misc:Depends}, ${python3:Depends}, python3-yaml, python3-paramiko
 Description: Harvest a host into Ansible roles
 A tool that inspects a system and emits Ansible roles/playbooks to reproduce it.
enroll/cache.py (new file, 79 lines)

@@ -0,0 +1,79 @@
from __future__ import annotations

import os
import re
import tempfile
from dataclasses import dataclass
from datetime import datetime
from pathlib import Path
from typing import Optional


def _safe_component(s: str) -> str:
    s = s.strip()
    if not s:
        return "unknown"
    s = re.sub(r"[^A-Za-z0-9_.-]+", "_", s)
    s = re.sub(r"_+", "_", s)
    return s[:64]


def enroll_cache_dir() -> Path:
    """Return the base cache directory for enroll.

    We default to ~/.local/cache to match common Linux conventions in personal
    homedirs, but honour XDG_CACHE_HOME if set.
    """
    base = os.environ.get("XDG_CACHE_HOME")
    if base:
        root = Path(base).expanduser()
    else:
        root = Path.home() / ".local" / "cache"
    return root / "enroll"


@dataclass(frozen=True)
class HarvestCache:
    """A locally-persistent directory that holds a harvested bundle."""

    dir: Path

    @property
    def state_json(self) -> Path:
        return self.dir / "state.json"


def _ensure_dir_secure(path: Path) -> None:
    """Create a directory with restrictive permissions; refuse symlinks."""
    # Refuse a symlink at the leaf.
    if path.exists() and path.is_symlink():
        raise RuntimeError(f"Refusing to use symlink path: {path}")
    path.mkdir(parents=True, exist_ok=True, mode=0o700)
    try:
        os.chmod(path, 0o700)
    except OSError:
        # Best-effort; on some FS types chmod may fail.
        pass


def new_harvest_cache_dir(*, hint: Optional[str] = None) -> HarvestCache:
    """Create a new, unpredictable harvest directory under the user's cache.

    This mitigates pre-guessing attacks (e.g. an attacker creating a directory
    in advance in a shared temp location) by creating the bundle directory under
    the user's home and using mkdtemp() randomness.
    """
    base = enroll_cache_dir() / "harvest"
    _ensure_dir_secure(base)

    ts = datetime.now().strftime("%Y%m%d-%H%M%S")
    safe = _safe_component(hint or "harvest")
    prefix = f"{ts}-{safe}-"

    # mkdtemp creates a new directory with a random suffix.
    d = Path(tempfile.mkdtemp(prefix=prefix, dir=str(base)))
    try:
        os.chmod(d, 0o700)
    except OSError:
        pass
    return HarvestCache(dir=d)
enroll/cli.py (340 changes)

@@ -1,9 +1,66 @@
from __future__ import annotations
|
||||
|
||||
import argparse
|
||||
import os
|
||||
import tarfile
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
|
||||
from .cache import new_harvest_cache_dir
|
||||
from .harvest import harvest
|
||||
from .manifest import manifest
|
||||
from .remote import remote_harvest
|
||||
from .sopsutil import SopsError, encrypt_file_binary
|
||||
|
||||
|
||||
def _resolve_sops_out_file(out: Optional[str], *, hint: str) -> Path:
|
||||
"""Resolve an output *file* path for --sops mode.
|
||||
|
||||
If `out` looks like a directory (or points to an existing directory), we
|
||||
place the encrypted harvest inside it as harvest.tar.gz.sops.
|
||||
"""
|
||||
if out:
|
||||
p = Path(out).expanduser()
|
||||
if p.exists() and p.is_dir():
|
||||
return p / "harvest.tar.gz.sops"
|
||||
# Heuristic: treat paths with a suffix as files; otherwise directories.
|
||||
if p.suffix:
|
||||
return p
|
||||
return p / "harvest.tar.gz.sops"
|
||||
|
||||
# Default: use a secure cache directory.
|
||||
d = new_harvest_cache_dir(hint=hint).dir
|
||||
return d / "harvest.tar.gz.sops"
|
||||
|
||||
|
||||
def _tar_dir_to(path_dir: Path, tar_path: Path) -> None:
|
||||
tar_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
with tarfile.open(tar_path, mode="w:gz") as tf:
|
||||
# Keep a stable on-disk layout when extracted: state.json + artifacts/
|
||||
tf.add(str(path_dir), arcname=".")
|
||||
|
||||
|
||||
def _encrypt_harvest_dir_to_sops(
|
||||
bundle_dir: Path, out_file: Path, fps: list[str]
|
||||
) -> Path:
|
||||
out_file = Path(out_file)
|
||||
out_file.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Create the tarball alongside the output file (keeps filesystem permissions/locality sane).
|
||||
fd, tmp_tgz = tempfile.mkstemp(
|
||||
prefix=".enroll-harvest-", suffix=".tar.gz", dir=str(out_file.parent)
|
||||
)
|
||||
os.close(fd)
|
||||
try:
|
||||
_tar_dir_to(bundle_dir, Path(tmp_tgz))
|
||||
encrypt_file_binary(Path(tmp_tgz), out_file, pgp_fingerprints=fps, mode=0o600)
|
||||
finally:
|
||||
try:
|
||||
os.unlink(tmp_tgz)
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
return out_file
|
||||
|
||||
|
||||
def _add_common_manifest_args(p: argparse.ArgumentParser) -> None:
|
||||
|
|
@ -32,46 +89,295 @@ def _jt_mode(args: argparse.Namespace) -> str:
|
|||
return "auto"
|
||||
|
||||
|
||||
def _add_remote_args(p: argparse.ArgumentParser) -> None:
|
||||
p.add_argument(
|
||||
"--remote-host",
|
||||
help="SSH host to run harvesting on (if set, harvest runs remotely and is pulled locally).",
|
||||
)
|
||||
p.add_argument(
|
||||
"--remote-port",
|
||||
type=int,
|
||||
default=22,
|
||||
help="SSH port for --remote-host (default: 22).",
|
||||
)
|
||||
p.add_argument(
|
||||
"--remote-user",
|
||||
default=os.environ.get("USER") or None,
|
||||
help="SSH username for --remote-host (default: local $USER).",
|
||||
)
|
||||
|
||||
|
||||
def main() -> None:
|
||||
ap = argparse.ArgumentParser(prog="enroll")
|
||||
sub = ap.add_subparsers(dest="cmd", required=True)
|
||||
|
||||
h = sub.add_parser("harvest", help="Harvest service/package/config state")
|
||||
h.add_argument("--out", required=True, help="Harvest output directory")
|
||||
h.add_argument(
|
||||
"--out",
|
||||
help=(
|
||||
"Harvest output directory. If --sops is set, this may be either a directory "
|
||||
"(an encrypted file named harvest.tar.gz.sops will be created inside) or a file path."
|
||||
),
|
||||
)
|
||||
h.add_argument(
|
||||
"--dangerous",
|
||||
action="store_true",
|
||||
help="Collect files more aggressively (may include secrets). Disables secret-avoidance checks.",
|
||||
)
|
||||
h.add_argument(
|
||||
"--sops",
|
||||
nargs="+",
|
||||
metavar="GPG_FINGERPRINT",
|
||||
help=(
|
||||
"Encrypt the harvest output as a SOPS-encrypted tarball using the given GPG fingerprint(s). "
|
||||
"Requires `sops` on PATH."
|
||||
),
|
||||
)
|
||||
h.add_argument(
|
||||
"--no-sudo",
|
||||
action="store_true",
|
||||
help="Don't use sudo on the remote host (when using --remote options). This may result in a limited harvest due to permission restrictions.",
|
||||
)
|
||||
_add_remote_args(h)
|
||||
|
||||
r = sub.add_parser("manifest", help="Render Ansible roles from a harvest")
|
||||
r.add_argument(
|
||||
m = sub.add_parser("manifest", help="Render Ansible roles from a harvest")
|
||||
m.add_argument(
|
||||
"--harvest",
|
||||
required=True,
|
||||
help="Path to the directory created by the harvest command",
|
||||
help=(
|
||||
"Path to the directory created by the harvest command, or (with --sops) "
|
||||
"a SOPS-encrypted harvest tarball."
|
||||
),
|
||||
)
|
||||
r.add_argument(
|
||||
m.add_argument(
|
||||
"--out",
|
||||
required=True,
|
||||
help="Output directory for generated roles/playbook Ansible manifest",
|
||||
help=(
|
||||
"Output location for the generated manifest. In plain mode this is a directory. "
|
||||
"In --sops mode this may be either a directory (an encrypted file named manifest.tar.gz.sops will be created inside) "
|
||||
"or a file path."
|
||||
),
|
||||
)
|
||||
_add_common_manifest_args(r)
|
||||
m.add_argument(
|
||||
"--sops",
|
||||
nargs="+",
|
||||
metavar="GPG_FINGERPRINT",
|
||||
help=(
|
||||
"In --sops mode, decrypt the harvest using `sops -d` (if the harvest is an encrypted file) "
|
||||
"and then bundle+encrypt the entire generated manifest output into a single SOPS-encrypted tarball "
|
||||
"(binary) using the given GPG fingerprint(s). Requires `sops` on PATH."
|
||||
),
|
||||
)
|
||||
_add_common_manifest_args(m)
|
||||
|
||||
e = sub.add_parser(
|
||||
s = sub.add_parser(
|
||||
"single-shot", help="Harvest state, then manifest Ansible code, in one shot"
|
||||
)
|
||||
e.add_argument(
|
||||
"--harvest", required=True, help="Path to the directory to place the harvest in"
|
||||
s.add_argument(
|
||||
"--harvest",
|
||||
help=(
|
||||
"Where to place the harvest. In plain mode this is a directory; in --sops mode this may be "
|
||||
"a directory or a file path (an encrypted file is produced)."
|
||||
),
|
||||
)
|
||||
e.add_argument(
|
||||
s.add_argument(
|
||||
"--dangerous",
|
||||
action="store_true",
|
||||
help="Collect files more aggressively (may include secrets). Disables secret-avoidance checks.",
|
||||
)
|
||||
s.add_argument(
|
||||
"--sops",
|
||||
nargs="+",
|
||||
metavar="GPG_FINGERPRINT",
|
||||
help=(
|
||||
"Encrypt the harvest as a SOPS-encrypted tarball, and bundle+encrypt the manifest output in --out "
|
||||
"(same behavior as `harvest --sops` and `manifest --sops`)."
|
||||
),
|
||||
)
|
||||
s.add_argument(
|
||||
"--no-sudo",
|
||||
action="store_true",
|
||||
help="Don't use sudo on the remote host (when using --remote options). This may result in a limited harvest due to permission restrictions.",
|
||||
)
|
||||
s.add_argument(
|
||||
"--out",
|
||||
required=True,
|
||||
help="Output directory for generated roles/playbook Ansible manifest",
|
||||
help=(
|
||||
"Output location for the generated manifest. In plain mode this is a directory. "
|
||||
"In --sops mode this may be either a directory (an encrypted file named manifest.tar.gz.sops will be created inside) "
|
||||
"or a file path."
|
||||
),
|
||||
)
|
||||
_add_common_manifest_args(e)
|
||||
_add_common_manifest_args(s)
|
||||
_add_remote_args(s)
|
||||
|
||||
args = ap.parse_args()
|
||||
|
||||
remote_host: Optional[str] = getattr(args, "remote_host", None)
|
||||
|
||||
try:
|
||||
if args.cmd == "harvest":
|
||||
path = harvest(args.out)
|
||||
sops_fps = getattr(args, "sops", None)
|
||||
if remote_host:
|
||||
if sops_fps:
|
||||
out_file = _resolve_sops_out_file(args.out, hint=remote_host)
|
||||
with tempfile.TemporaryDirectory(prefix="enroll-harvest-") as td:
|
||||
tmp_bundle = Path(td) / "bundle"
|
||||
tmp_bundle.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(tmp_bundle, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
remote_harvest(
|
||||
local_out_dir=tmp_bundle,
|
||||
remote_host=remote_host,
|
||||
remote_port=int(args.remote_port),
|
||||
remote_user=args.remote_user,
|
||||
dangerous=bool(args.dangerous),
|
||||
no_sudo=bool(args.no_sudo),
|
||||
)
|
||||
_encrypt_harvest_dir_to_sops(
|
||||
tmp_bundle, out_file, list(sops_fps)
|
||||
)
|
||||
print(str(out_file))
|
||||
else:
|
||||
out_dir = (
|
||||
Path(args.out)
|
||||
if args.out
|
||||
else new_harvest_cache_dir(hint=remote_host).dir
|
||||
)
|
||||
state = remote_harvest(
|
||||
local_out_dir=out_dir,
|
||||
remote_host=remote_host,
|
||||
remote_port=int(args.remote_port),
|
||||
remote_user=args.remote_user,
|
||||
dangerous=bool(args.dangerous),
|
||||
no_sudo=bool(args.no_sudo),
|
||||
)
|
||||
print(str(state))
|
||||
else:
|
||||
if sops_fps:
|
||||
out_file = _resolve_sops_out_file(args.out, hint="local")
|
||||
with tempfile.TemporaryDirectory(prefix="enroll-harvest-") as td:
|
||||
tmp_bundle = Path(td) / "bundle"
|
||||
tmp_bundle.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(tmp_bundle, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
harvest(str(tmp_bundle), dangerous=bool(args.dangerous))
|
||||
_encrypt_harvest_dir_to_sops(
|
||||
tmp_bundle, out_file, list(sops_fps)
|
||||
)
|
||||
print(str(out_file))
|
||||
else:
|
||||
if not args.out:
|
||||
raise SystemExit(
|
||||
"error: --out is required unless --remote-host is set"
|
||||
)
|
||||
path = harvest(args.out, dangerous=bool(args.dangerous))
|
||||
print(path)
|
||||
elif args.cmd == "manifest":
|
||||
manifest(args.harvest, args.out, fqdn=args.fqdn, jinjaturtle=_jt_mode(args))
|
||||
out_enc = manifest(
|
||||
args.harvest,
|
||||
args.out,
|
||||
fqdn=args.fqdn,
|
||||
jinjaturtle=_jt_mode(args),
|
||||
sops_fingerprints=getattr(args, "sops", None),
|
||||
)
|
||||
if getattr(args, "sops", None) and out_enc:
|
||||
print(str(out_enc))
|
||||
elif args.cmd == "single-shot":
|
||||
harvest(args.harvest)
|
||||
manifest(args.harvest, args.out, fqdn=args.fqdn, jinjaturtle=_jt_mode(args))
|
||||
sops_fps = getattr(args, "sops", None)
|
||||
if remote_host:
|
||||
if sops_fps:
|
||||
out_file = _resolve_sops_out_file(args.harvest, hint=remote_host)
|
||||
with tempfile.TemporaryDirectory(prefix="enroll-harvest-") as td:
|
||||
tmp_bundle = Path(td) / "bundle"
|
||||
tmp_bundle.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(tmp_bundle, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
remote_harvest(
|
||||
local_out_dir=tmp_bundle,
|
||||
remote_host=remote_host,
|
||||
remote_port=int(args.remote_port),
|
||||
remote_user=args.remote_user,
|
||||
dangerous=bool(args.dangerous),
|
||||
no_sudo=bool(args.no_sudo),
|
||||
)
|
||||
_encrypt_harvest_dir_to_sops(
|
||||
tmp_bundle, out_file, list(sops_fps)
|
||||
)
|
||||
|
||||
manifest(
|
||||
str(out_file),
|
||||
args.out,
|
||||
fqdn=args.fqdn,
|
||||
jinjaturtle=_jt_mode(args),
|
||||
sops_fingerprints=list(sops_fps),
|
||||
)
|
||||
if not args.harvest:
|
||||
print(str(out_file))
|
||||
else:
|
||||
harvest_dir = (
|
||||
Path(args.harvest)
|
||||
if args.harvest
|
||||
else new_harvest_cache_dir(hint=remote_host).dir
|
||||
)
|
||||
remote_harvest(
|
||||
local_out_dir=harvest_dir,
|
||||
remote_host=remote_host,
|
||||
remote_port=int(args.remote_port),
|
||||
remote_user=args.remote_user,
|
||||
dangerous=bool(args.dangerous),
|
||||
no_sudo=bool(args.no_sudo),
|
||||
)
|
||||
manifest(
|
||||
str(harvest_dir),
|
||||
args.out,
|
||||
fqdn=args.fqdn,
|
||||
jinjaturtle=_jt_mode(args),
|
||||
)
|
||||
# For usability (when --harvest wasn't provided), print the harvest path.
|
||||
if not args.harvest:
|
||||
print(str(harvest_dir / "state.json"))
|
||||
else:
|
||||
if sops_fps:
|
||||
out_file = _resolve_sops_out_file(args.harvest, hint="local")
|
||||
with tempfile.TemporaryDirectory(prefix="enroll-harvest-") as td:
|
||||
tmp_bundle = Path(td) / "bundle"
|
||||
tmp_bundle.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(tmp_bundle, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
harvest(str(tmp_bundle), dangerous=bool(args.dangerous))
|
||||
_encrypt_harvest_dir_to_sops(
|
||||
tmp_bundle, out_file, list(sops_fps)
|
||||
)
|
||||
|
||||
manifest(
|
||||
str(out_file),
|
||||
args.out,
|
||||
fqdn=args.fqdn,
|
||||
jinjaturtle=_jt_mode(args),
|
||||
sops_fingerprints=list(sops_fps),
|
||||
)
|
||||
if not args.harvest:
|
||||
print(str(out_file))
|
||||
else:
|
||||
if not args.harvest:
|
||||
raise SystemExit(
|
||||
"error: --harvest is required unless --remote-host is set"
|
||||
)
|
||||
harvest(args.harvest, dangerous=bool(args.dangerous))
|
||||
manifest(
|
||||
args.harvest,
|
||||
args.out,
|
||||
fqdn=args.fqdn,
|
||||
jinjaturtle=_jt_mode(args),
|
||||
)
|
||||
except SopsError as e:
|
||||
raise SystemExit(f"error: {e}")
|
||||
|
|
|
|||
|
|
@ -199,7 +199,11 @@ def _maybe_add_specific_paths(hints: Set[str]) -> List[str]:
|
|||
|
||||
|
||||
def _scan_unowned_under_roots(
|
||||
roots: List[str], owned_etc: Set[str], limit: int = MAX_UNOWNED_FILES_PER_ROLE
|
||||
roots: List[str],
|
||||
owned_etc: Set[str],
|
||||
limit: int = MAX_UNOWNED_FILES_PER_ROLE,
|
||||
*,
|
||||
confish_only: bool = True,
|
||||
) -> List[str]:
|
||||
found: List[str] = []
|
||||
for root in roots:
|
||||
|
|
@ -218,7 +222,7 @@ def _scan_unowned_under_roots(
|
|||
continue
|
||||
if not os.path.isfile(p) or os.path.islink(p):
|
||||
continue
|
||||
if not _is_confish(p):
|
||||
if confish_only and not _is_confish(p):
|
||||
continue
|
||||
found.append(p)
|
||||
return found
|
||||
|
|
@ -233,8 +237,20 @@ def _topdirs_for_package(pkg: str, pkg_to_etc_paths: Dict[str, List[str]]) -> Se
|
|||
return topdirs
|
||||
|
||||
|
||||
def harvest(bundle_dir: str, policy: Optional[IgnorePolicy] = None) -> str:
|
||||
policy = policy or IgnorePolicy()
|
||||
def harvest(
|
||||
bundle_dir: str,
|
||||
policy: Optional[IgnorePolicy] = None,
|
||||
*,
|
||||
dangerous: bool = False,
|
||||
) -> str:
|
||||
# If a policy is not supplied, build one. `--dangerous` relaxes secret
|
||||
# detection and deny-glob skipping.
|
||||
if policy is None:
|
||||
policy = IgnorePolicy(dangerous=dangerous)
|
||||
elif dangerous:
|
||||
# If callers explicitly provided a policy but also requested
|
||||
# dangerous behavior, honour the CLI intent.
|
||||
policy.dangerous = True
|
||||
os.makedirs(bundle_dir, exist_ok=True)
|
||||
|
||||
if hasattr(os, "geteuid") and os.geteuid() != 0:
|
||||
|
|
@ -338,10 +354,42 @@ def harvest(bundle_dir: str, policy: Optional[IgnorePolicy] = None) -> str:
|
|||
if current != baseline:
|
||||
candidates.setdefault(path, "modified_packaged_file")
|
||||
|
||||
roots: List[str] = []
|
||||
# Capture custom/unowned files living under /etc/<name> for this service.
|
||||
#
|
||||
# Historically we only captured "config-ish" files (by extension). That
|
||||
# misses important runtime-generated artifacts like certificates and
|
||||
# key material under service directories (e.g. /etc/openvpn/*.crt).
|
||||
#
|
||||
# To avoid exploding output for shared trees (e.g. /etc/systemd), keep
|
||||
# the older "config-ish only" behavior for known shared topdirs.
|
||||
any_roots: List[str] = []
|
||||
confish_roots: List[str] = []
|
||||
for h in hints:
|
||||
roots.extend([f"/etc/{h}", f"/etc/{h}.d"])
|
||||
for pth in _scan_unowned_under_roots(roots, owned_etc):
|
||||
roots_for_h = [f"/etc/{h}", f"/etc/{h}.d"]
|
||||
if h in SHARED_ETC_TOPDIRS:
|
||||
confish_roots.extend(roots_for_h)
|
||||
else:
|
||||
any_roots.extend(roots_for_h)
|
||||
|
||||
found: List[str] = []
|
||||
found.extend(
|
||||
_scan_unowned_under_roots(
|
||||
any_roots,
|
||||
owned_etc,
|
||||
limit=MAX_UNOWNED_FILES_PER_ROLE,
|
||||
confish_only=False,
|
||||
)
|
||||
)
|
||||
if len(found) < MAX_UNOWNED_FILES_PER_ROLE:
|
||||
found.extend(
|
||||
_scan_unowned_under_roots(
|
||||
confish_roots,
|
||||
owned_etc,
|
||||
limit=MAX_UNOWNED_FILES_PER_ROLE - len(found),
|
||||
confish_only=True,
|
||||
)
|
||||
)
|
||||
for pth in found:
|
||||
candidates.setdefault(pth, "custom_unowned")
|
||||
|
||||
if not pkgs and not candidates:
|
||||
|
|
@ -449,8 +497,14 @@ def harvest(bundle_dir: str, policy: Optional[IgnorePolicy] = None) -> str:
|
|||
roots.extend([f"/etc/logrotate.d/{td}"])
|
||||
roots.extend([f"/etc/sysctl.d/{td}.conf"])
|
||||
|
||||
# Capture any custom/unowned files under /etc/<topdir> for this
|
||||
# manually-installed package. This may include runtime-generated
|
||||
# artifacts like certificates, key files, and helper scripts which are
|
||||
# not owned by any .deb.
|
||||
for pth in _scan_unowned_under_roots(
|
||||
[r for r in roots if os.path.isdir(r)], owned_etc
|
||||
[r for r in roots if os.path.isdir(r)],
|
||||
owned_etc,
|
||||
confish_only=False,
|
||||
):
|
||||
candidates.setdefault(pth, "custom_unowned")
|
||||
|
||||
|
|
|
|||
|
|
@ -38,9 +38,13 @@ BLOCK_END = b"*/"
|
|||
|
||||
@dataclass
|
||||
class IgnorePolicy:
|
||||
deny_globs: list[str] = None
|
||||
deny_globs: Optional[list[str]] = None
|
||||
max_file_bytes: int = 256_000
|
||||
sample_bytes: int = 64_000
|
||||
# If True, be much less conservative about collecting potentially
|
||||
# sensitive files. This disables deny globs (e.g. /etc/shadow,
|
||||
# /etc/ssl/private/*) and skips heuristic content scanning.
|
||||
dangerous: bool = False
|
||||
|
||||
def __post_init__(self) -> None:
|
||||
if self.deny_globs is None:
|
||||
|
|
@ -69,7 +73,12 @@ class IgnorePolicy:
|
|||
yield raw
|
||||
|
||||
def deny_reason(self, path: str) -> Optional[str]:
|
||||
for g in self.deny_globs:
|
||||
# Always ignore plain *.log files (rarely useful as config, often noisy).
|
||||
if path.endswith(".log"):
|
||||
return "log_file"
|
||||
|
||||
if not self.dangerous:
|
||||
for g in self.deny_globs or []:
|
||||
if fnmatch.fnmatch(path, g):
|
||||
return "denied_path"
|
||||
|
||||
|
|
@ -93,6 +102,7 @@ class IgnorePolicy:
|
|||
if b"\x00" in data:
|
||||
return "binary_like"
|
||||
|
||||
if not self.dangerous:
|
||||
for line in self.iter_effective_lines(data):
|
||||
for pat in SENSITIVE_CONTENT_PATTERNS:
|
||||
if pat.search(line):
|
||||
|
|
|
|||
|
|
@ -1,6 +1,5 @@
|
|||
from __future__ import annotations
|
||||
|
||||
import re
|
||||
import shutil
|
||||
import subprocess # nosec
|
||||
import tempfile
|
||||
|
|
@ -81,25 +80,3 @@ def run_jinjaturtle(
|
|||
return JinjifyResult(
|
||||
template_text=template_text, vars_text=vars_text.rstrip() + "\n"
|
||||
)
|
||||
|
||||
|
||||
def replace_or_append_block(
|
||||
base_text: str,
|
||||
*,
|
||||
begin: str,
|
||||
end: str,
|
||||
block_body: str,
|
||||
) -> str:
|
||||
"""Replace a marked block if present; else append it."""
|
||||
pattern = re.compile(
|
||||
re.escape(begin) + r".*?" + re.escape(end),
|
||||
flags=re.DOTALL,
|
||||
)
|
||||
new_block = f"{begin}\n{block_body.rstrip()}\n{end}"
|
||||
if pattern.search(base_text):
|
||||
return pattern.sub(new_block, base_text).rstrip() + "\n"
|
||||
# ensure base ends with newline
|
||||
bt = base_text.rstrip() + "\n"
|
||||
if not bt.endswith("\n"):
|
||||
bt += "\n"
|
||||
return bt + "\n" + new_block + "\n"
|
||||
|
|
|
|||
|
|
@ -3,6 +3,9 @@ from __future__ import annotations
|
|||
import json
|
||||
import os
|
||||
import shutil
|
||||
import stat
|
||||
import tarfile
|
||||
import tempfile
|
||||
from pathlib import Path
|
||||
from typing import Any, Dict, List, Optional, Set, Tuple
|
||||
|
||||
|
|
@ -12,9 +15,12 @@ from .jinjaturtle import (
|
|||
run_jinjaturtle,
|
||||
)
|
||||
|
||||
|
||||
JINJATURTLE_BEGIN = "# BEGIN JINJATURTLE (generated by enroll)"
|
||||
JINJATURTLE_END = "# END JINJATURTLE"
|
||||
from .remote import _safe_extract_tar
|
||||
from .sopsutil import (
|
||||
decrypt_file_binary_to,
|
||||
encrypt_file_binary,
|
||||
require_sops_cmd,
|
||||
)
|
||||
|
||||
|
||||
def _try_yaml():
|
||||
|
|
@ -70,36 +76,6 @@ def _yaml_dump_mapping(obj: Dict[str, Any], *, sort_keys: bool = True) -> str:
|
|||
)
|
||||
|
||||
|
||||
def _merge_list_keep_order(existing: List[Any], new: List[Any]) -> List[Any]:
|
||||
out = list(existing)
|
||||
seen = set(existing)
|
||||
for item in new:
|
||||
if item not in seen:
|
||||
out.append(item)
|
||||
seen.add(item)
|
||||
return out
|
||||
|
||||
|
||||
def _merge_mappings_preserve(
|
||||
existing: Dict[str, Any], incoming: Dict[str, Any]
|
||||
) -> Dict[str, Any]:
|
||||
"""Merge incoming into existing:
|
||||
- lists: union (preserve existing order)
|
||||
- scalars/dicts: only set if missing (do not overwrite)
|
||||
"""
|
||||
merged = dict(existing)
|
||||
for k, v in incoming.items():
|
||||
if k in merged:
|
||||
if isinstance(merged[k], list) and isinstance(v, list):
|
||||
merged[k] = _merge_list_keep_order(merged[k], v)
|
||||
else:
|
||||
# keep existing value (non-overwriting)
|
||||
continue
|
||||
else:
|
||||
merged[k] = v
|
||||
return merged
|
||||
|
||||
|
||||
def _merge_mappings_overwrite(
|
||||
existing: Dict[str, Any], incoming: Dict[str, Any]
|
||||
) -> Dict[str, Any]:
|
||||
|
|
@ -113,49 +89,28 @@ def _merge_mappings_overwrite(
|
|||
return merged
|
||||
|
||||
|
||||
def _write_role_defaults_merge(role_dir: str, incoming: Dict[str, Any]) -> None:
|
||||
"""Write/merge role defaults without clobbering existing values.
|
||||
Used in site mode to keep roles reusable across hosts.
|
||||
"""
|
||||
defaults_path = os.path.join(role_dir, "defaults", "main.yml")
|
||||
existing: Dict[str, Any] = {}
|
||||
if os.path.exists(defaults_path):
|
||||
def _copy2_replace(src: str, dst: str) -> None:
|
||||
dst_dir = os.path.dirname(dst)
|
||||
os.makedirs(dst_dir, exist_ok=True)
|
||||
|
||||
# Copy to a temp file in the same directory, then atomically replace.
|
||||
fd, tmp = tempfile.mkstemp(prefix=".enroll-tmp-", dir=dst_dir)
|
||||
os.close(fd)
|
||||
try:
|
||||
existing_text = Path(defaults_path).read_text(encoding="utf-8")
|
||||
existing = _yaml_load_mapping(existing_text)
|
||||
except Exception:
|
||||
existing = {}
|
||||
merged = _merge_mappings_preserve(existing, incoming)
|
||||
body = "---\n" + _yaml_dump_mapping(merged, sort_keys=True)
|
||||
with open(defaults_path, "w", encoding="utf-8") as f:
|
||||
f.write(body)
|
||||
shutil.copy2(src, tmp)
|
||||
|
||||
# Ensure the working tree stays mergeable: make the file user-writable.
|
||||
st = os.stat(tmp, follow_symlinks=False)
|
||||
mode = stat.S_IMODE(st.st_mode)
|
||||
if not (mode & stat.S_IWUSR):
|
||||
os.chmod(tmp, mode | stat.S_IWUSR)
|
||||
|
||||
def _extract_jinjaturtle_block(text: str) -> str:
|
||||
"""Return YAML text inside JINJATURTLE_BEGIN/END markers, or the whole text if no markers."""
|
||||
if JINJATURTLE_BEGIN in text and JINJATURTLE_END in text:
|
||||
start = text.split(JINJATURTLE_BEGIN, 1)[1]
|
||||
inner = start.split(JINJATURTLE_END, 1)[0]
|
||||
return inner.strip() + "\n"
|
||||
return text.strip() + "\n"
|
||||
|
||||
|
||||
def _normalise_jinjaturtle_vars_text(vars_text: str) -> str:
|
||||
"""Deduplicate keys in a vars fragment by parsing as YAML and dumping it back."""
|
||||
m = _yaml_load_mapping(vars_text)
|
||||
if not m:
|
||||
# if YAML isn't available or parsing failed, return raw text (best-effort)
|
||||
return vars_text.rstrip() + (
|
||||
"\n" if vars_text and not vars_text.endswith("\n") else ""
|
||||
)
|
||||
return _yaml_dump_mapping(m, sort_keys=True)
|
||||
|
||||
|
||||
def _yaml_list(items: List[str], indent: int = 2) -> str:
|
||||
pad = " " * indent
|
||||
if not items:
|
||||
return f"{pad}[]"
|
||||
return "\n".join(f"{pad}- {x}" for x in items)
|
||||
os.replace(tmp, dst)
|
||||
finally:
|
||||
try:
|
||||
os.unlink(tmp)
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
|
||||
|
||||
def _copy_artifacts(
|
||||
|
|
@ -195,7 +150,7 @@ def _copy_artifacts(
|
|||
if preserve_existing and os.path.exists(dst):
|
||||
continue
|
||||
os.makedirs(os.path.dirname(dst), exist_ok=True)
|
||||
shutil.copy2(src, dst)
|
||||
_copy2_replace(src, dst)
|
||||
|
||||
|
||||
def _write_role_scaffold(role_dir: str) -> None:
|
||||
|
|
@ -380,28 +335,6 @@ def _jinjify_managed_files(
|
|||
return templated, ""
|
||||
|
||||
|
||||
def _hostvars_only_jinjaturtle(vars_text: str) -> str:
|
||||
# keep as valid YAML file
|
||||
return _defaults_with_jinjaturtle("---\n", vars_text)
|
||||
|
||||
|
||||
def _defaults_with_jinjaturtle(base_defaults: str, vars_text: str) -> str:
|
||||
if not vars_text.strip():
|
||||
return base_defaults.rstrip() + "\n"
|
||||
vars_text = _normalise_jinjaturtle_vars_text(vars_text)
|
||||
# Always regenerate the block (we regenerate whole defaults files anyway)
|
||||
return (
|
||||
base_defaults.rstrip()
|
||||
+ "\n\n"
|
||||
+ JINJATURTLE_BEGIN
|
||||
+ "\n"
|
||||
+ vars_text.rstrip()
|
||||
+ "\n"
|
||||
+ JINJATURTLE_END
|
||||
+ "\n"
|
||||
)
|
||||
|
||||
|
||||
def _write_role_defaults(role_dir: str, mapping: Dict[str, Any]) -> None:
|
||||
"""Overwrite role defaults/main.yml with the provided mapping."""
|
||||
defaults_path = os.path.join(role_dir, "defaults", "main.yml")
|
||||
|
|
@ -535,7 +468,153 @@ def _render_generic_files_tasks(
|
|||
"""
|
||||
|
||||
|
||||
def manifest(
|
||||
def _prepare_bundle_dir(
|
||||
bundle: str,
|
||||
*,
|
||||
sops_mode: bool,
|
||||
) -> tuple[str, Optional[tempfile.TemporaryDirectory]]:
|
||||
"""Return (bundle_dir, tempdir).
|
||||
|
||||
- In non-sops mode, `bundle` must be a directory.
|
||||
- In sops mode, `bundle` may be a directory (already-decrypted) *or*
|
||||
a SOPS-encrypted tarball. In the tarball case we decrypt+extract into
|
||||
a secure temp directory.
|
||||
"""
|
||||
p = Path(bundle).expanduser()
|
||||
|
||||
if p.is_dir():
|
||||
return str(p), None
|
||||
|
||||
if not sops_mode:
|
||||
raise RuntimeError(f"Harvest path is not a directory: {p}")
|
||||
|
||||
if not p.exists():
|
||||
raise RuntimeError(f"Harvest path not found: {p}")
|
||||
|
||||
# Ensure sops is available early for clear error messages.
|
||||
require_sops_cmd()
|
||||
|
||||
td = tempfile.TemporaryDirectory(prefix="enroll-harvest-")
|
||||
td_path = Path(td.name)
|
||||
try:
|
||||
os.chmod(td_path, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
|
||||
tar_path = td_path / "harvest.tar.gz"
|
||||
out_dir = td_path / "bundle"
|
||||
out_dir.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(out_dir, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
|
||||
decrypt_file_binary_to(p, tar_path, mode=0o600)
|
||||
|
||||
# Extract using the same safe extraction rules as remote harvesting.
|
||||
with tarfile.open(tar_path, mode="r:gz") as tf:
|
||||
_safe_extract_tar(tf, out_dir)
|
||||
|
||||
return str(out_dir), td
|
||||
|
||||
|
||||
def _resolve_sops_manifest_out_file(out: str) -> Path:
|
||||
"""Resolve an output *file* path for manifest --sops mode.
|
||||
|
||||
If `out` looks like a directory (or points to an existing directory), we
|
||||
place the encrypted manifest bundle inside it as manifest.tar.gz.sops.
|
||||
"""
|
||||
p = Path(out).expanduser()
|
||||
if p.exists() and p.is_dir():
|
||||
return p / "manifest.tar.gz.sops"
|
||||
# Heuristic: treat paths with a suffix as files; otherwise directories.
|
||||
if p.suffix:
|
||||
return p
|
||||
return p / "manifest.tar.gz.sops"
|
||||
|
||||
|
||||
def _tar_dir_to_with_progress(
|
||||
src_dir: Path, tar_path: Path, *, desc: str = "tarring"
|
||||
) -> None:
|
||||
"""Create a tar.gz of src_dir at tar_path, with a simple per-entry progress display."""
|
||||
src_dir = Path(src_dir)
|
||||
tar_path = Path(tar_path)
|
||||
tar_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Collect paths (dirs + files)
|
||||
paths: list[Path] = [src_dir]
|
||||
for root, dirs, files in os.walk(str(src_dir)):
|
||||
root_p = Path(root)
|
||||
for d in sorted(dirs):
|
||||
paths.append(root_p / d)
|
||||
for f in sorted(files):
|
||||
paths.append(root_p / f)
|
||||
|
||||
total = len(paths)
|
||||
is_tty = hasattr(os, "isatty") and os.isatty(2)
|
||||
|
||||
def _print_progress(i: int, p: Path) -> None:
|
||||
if not is_tty:
|
||||
return
|
||||
pct = (i / total * 100.0) if total else 100.0
|
||||
rel = "."
|
||||
try:
|
||||
rel = str(p.relative_to(src_dir))
|
||||
except Exception:
|
||||
rel = str(p)
|
||||
msg = f"{desc}: {i}/{total} ({pct:5.1f}%) {rel}"
|
||||
try:
|
||||
cols = shutil.get_terminal_size((80, 20)).columns
|
||||
msg = msg[: cols - 1]
|
||||
except Exception:
|
||||
pass
|
||||
os.write(2, ("\r" + msg).encode("utf-8", errors="replace"))
|
||||
|
||||
with tarfile.open(tar_path, mode="w:gz") as tf:
|
||||
prefix = Path("manifest")
|
||||
|
||||
for i, p in enumerate(paths, start=1):
|
||||
if p == src_dir:
|
||||
arcname = str(prefix)
|
||||
else:
|
||||
rel = p.relative_to(src_dir)
|
||||
arcname = str(prefix / rel)
|
||||
tf.add(str(p), arcname=arcname, recursive=False)
|
||||
_print_progress(i, p)
|
||||
|
||||
if is_tty:
|
||||
os.write(2, b"\n")
|
||||
|
||||
|
||||
def _encrypt_manifest_out_dir_to_sops(
|
||||
out_dir: Path, out_file: Path, fps: list[str]
|
||||
) -> Path:
|
||||
"""Tar+encrypt the generated manifest output directory into a single .sops file."""
|
||||
require_sops_cmd()
|
||||
out_file = Path(out_file)
|
||||
out_file.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
fd, tmp_tgz = tempfile.mkstemp(
|
||||
prefix=".enroll-manifest-",
|
||||
suffix=".tar.gz",
|
||||
dir=str(out_file.parent),
|
||||
)
|
||||
os.close(fd)
|
||||
try:
|
||||
_tar_dir_to_with_progress(
|
||||
Path(out_dir), Path(tmp_tgz), desc="Bundling manifest"
|
||||
)
|
||||
encrypt_file_binary(Path(tmp_tgz), out_file, pgp_fingerprints=fps, mode=0o600)
|
||||
finally:
|
||||
try:
|
||||
os.unlink(tmp_tgz)
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
|
||||
return out_file
|
||||
|
||||
|
||||
def _manifest_from_bundle_dir(
|
||||
bundle_dir: str,
|
||||
out_dir: str,
|
||||
*,
|
||||
|
|
@ -1240,3 +1319,69 @@ Generated for package `{pkg}`.
|
|||
)
|
||||
else:
|
||||
_write_playbook_all(os.path.join(out_dir, "playbook.yml"), all_roles)
|
||||
|
||||
|
||||
def manifest(
|
||||
bundle_dir: str,
|
||||
out: str,
|
||||
*,
|
||||
fqdn: Optional[str] = None,
|
||||
jinjaturtle: str = "auto", # auto|on|off
|
||||
sops_fingerprints: Optional[List[str]] = None,
|
||||
) -> Optional[str]:
|
||||
"""Render an Ansible manifest from a harvest.
|
||||
|
||||
Plain mode:
|
||||
- `bundle_dir` must be a directory
|
||||
- `out` is a directory written in-place
|
||||
|
||||
SOPS mode (when `sops_fingerprints` is provided):
|
||||
- `bundle_dir` may be either a directory (already decrypted) or a SOPS
|
||||
encrypted tarball (binary) produced by `harvest --sops`
|
||||
- the manifest output is bundled (tar.gz) and encrypted into a single
|
||||
SOPS file (binary) at the resolved output path.
|
||||
|
||||
Returns:
|
||||
- In SOPS mode: the path to the encrypted manifest bundle (.sops)
|
||||
- In plain mode: None
|
||||
"""
|
||||
sops_mode = bool(sops_fingerprints)
|
||||
|
||||
# Decrypt/extract the harvest bundle if needed.
|
||||
resolved_bundle_dir, td_bundle = _prepare_bundle_dir(
|
||||
bundle_dir, sops_mode=sops_mode
|
||||
)
|
||||
|
||||
td_out: Optional[tempfile.TemporaryDirectory] = None
|
||||
try:
|
||||
if not sops_mode:
|
||||
_manifest_from_bundle_dir(
|
||||
resolved_bundle_dir, out, fqdn=fqdn, jinjaturtle=jinjaturtle
|
||||
)
|
||||
return None
|
||||
|
||||
# SOPS mode: generate into a secure temp dir, then tar+encrypt into a single file.
|
||||
out_file = _resolve_sops_manifest_out_file(out)
|
||||
|
||||
td_out = tempfile.TemporaryDirectory(prefix="enroll-manifest-")
|
||||
tmp_out = Path(td_out.name) / "out"
|
||||
tmp_out.mkdir(parents=True, exist_ok=True)
|
||||
try:
|
||||
os.chmod(tmp_out, 0o700)
|
||||
except OSError:
|
||||
pass
|
||||
|
||||
_manifest_from_bundle_dir(
|
||||
resolved_bundle_dir, str(tmp_out), fqdn=fqdn, jinjaturtle=jinjaturtle
|
||||
)
|
||||
|
||||
enc = _encrypt_manifest_out_dir_to_sops(
|
||||
tmp_out, out_file, list(sops_fingerprints or [])
|
||||
)
|
||||
return str(enc)
|
||||
|
||||
finally:
|
||||
if td_out is not None:
|
||||
td_out.cleanup()
|
||||
if td_bundle is not None:
|
||||
td_bundle.cleanup()
|
||||
|
|

enroll/remote.py (new file, 235 lines)

@@ -0,0 +1,235 @@
from __future__ import annotations

import os
import shutil
import tarfile
import tempfile
import zipapp
from pathlib import Path
from pathlib import PurePosixPath
from typing import Optional


def _safe_extract_tar(tar: tarfile.TarFile, dest: Path) -> None:
    """Safely extract a tar archive into dest.

    Protects against path traversal (e.g. entries containing ../).
    """

    # Note: tar member names use POSIX separators regardless of platform.
    dest = dest.resolve()

    for m in tar.getmembers():
        name = m.name

        # Some tar implementations include a top-level '.' entry when created
        # with `tar -C <dir> .`. That's harmless and should be allowed.
        if name in {".", "./"}:
            continue

        # Reject absolute paths and any '..' components up front.
        p = PurePosixPath(name)
        if p.is_absolute() or ".." in p.parts:
            raise RuntimeError(f"Unsafe tar member path: {name}")

        # Refuse to extract links or device nodes from an untrusted archive.
        # (A symlink can be used to redirect subsequent writes outside dest.)
        if m.issym() or m.islnk() or m.isdev():
            raise RuntimeError(f"Refusing to extract special tar member: {name}")

        member_path = (dest / Path(*p.parts)).resolve()
        if member_path != dest and not str(member_path).startswith(str(dest) + os.sep):
            raise RuntimeError(f"Unsafe tar member path: {name}")

    # Extract members one-by-one after validation.
    for m in tar.getmembers():
        if m.name in {".", "./"}:
            continue
        tar.extract(m, path=dest)


def _build_enroll_pyz(tmpdir: Path) -> Path:
    """Build a self-contained enroll zipapp (pyz) on the local machine.

    The resulting file is stdlib-only and can be executed on the remote host
    as long as it has Python 3 available.
    """
    import enroll as pkg

    pkg_dir = Path(pkg.__file__).resolve().parent
    stage = tmpdir / "stage"
    (stage / "enroll").mkdir(parents=True, exist_ok=True)

    def _ignore(d: str, names: list[str]) -> set[str]:
        return {
            n
            for n in names
            if n in {"__pycache__", ".pytest_cache"} or n.endswith(".pyc")
        }

    shutil.copytree(pkg_dir, stage / "enroll", dirs_exist_ok=True, ignore=_ignore)

    pyz_path = tmpdir / "enroll.pyz"
    zipapp.create_archive(
        stage,
        target=pyz_path,
        main="enroll.cli:main",
        compressed=True,
    )
    return pyz_path


def _ssh_run(ssh, cmd: str) -> tuple[int, str, str]:
    """Run a command over a Paramiko SSHClient."""
    _stdin, stdout, stderr = ssh.exec_command(cmd)
    out = stdout.read().decode("utf-8", errors="replace")
    err = stderr.read().decode("utf-8", errors="replace")
    rc = stdout.channel.recv_exit_status()
    return rc, out, err


def remote_harvest(
    *,
    local_out_dir: Path,
    remote_host: str,
    remote_port: int = 22,
    remote_user: Optional[str] = None,
    remote_python: str = "python3",
    dangerous: bool = False,
    no_sudo: bool = False,
) -> Path:
    """Run enroll harvest on a remote host via SSH and pull the bundle locally.

    Returns the local path to state.json inside local_out_dir.
    """

    try:
        import paramiko  # type: ignore
    except Exception as e:
        raise RuntimeError(
            "Remote harvesting requires the 'paramiko' package. "
            "Install it with: pip install paramiko"
        ) from e

    local_out_dir = Path(local_out_dir)
    local_out_dir.mkdir(parents=True, exist_ok=True)
    try:
        os.chmod(local_out_dir, 0o700)
    except OSError:
        pass

    # Build a zipapp locally and upload it to the remote.
    with tempfile.TemporaryDirectory(prefix="enroll-remote-") as td:
        td_path = Path(td)
        pyz = _build_enroll_pyz(td_path)
        local_tgz = td_path / "bundle.tgz"

        ssh = paramiko.SSHClient()
        ssh.load_system_host_keys()
        # Default: refuse unknown host keys.
        # Users should add the key to known_hosts.
        ssh.set_missing_host_key_policy(paramiko.RejectPolicy())

        ssh.connect(
            hostname=remote_host,
            port=int(remote_port),
            username=remote_user,
            allow_agent=True,
            look_for_keys=True,
        )

        # If no username was explicitly provided, SSH may have selected a default.
        # We need a concrete username for the (sudo) chown step below.
        resolved_user = remote_user
        if not resolved_user:
            rc, out, err = _ssh_run(ssh, "id -un")
            if rc == 0 and out.strip():
                resolved_user = out.strip()

        sftp = ssh.open_sftp()
        rtmp: Optional[str] = None
        try:
            rc, out, err = _ssh_run(ssh, "mktemp -d")
            if rc != 0:
                raise RuntimeError(f"Remote mktemp failed: {err.strip()}")
            rtmp = out.strip()

            # Be explicit: restrict the remote staging area to the current user.
            rc, out, err = _ssh_run(ssh, f"chmod 700 {rtmp}")
            if rc != 0:
                raise RuntimeError(f"Remote chmod failed: {err.strip()}")

            rapp = f"{rtmp}/enroll.pyz"
            rbundle = f"{rtmp}/bundle"

            sftp.put(str(pyz), rapp)

            # Run remote harvest.
            _cmd = f"{remote_python} {rapp} harvest --out {rbundle}"
            if not no_sudo:
                cmd = f"sudo {_cmd}"
            else:
                cmd = _cmd
            if dangerous:
                cmd += " --dangerous"
            rc, out, err = _ssh_run(ssh, cmd)
            if rc != 0:
                raise RuntimeError(
                    "Remote harvest failed.\n"
                    f"Command: {cmd}\n"
                    f"Exit code: {rc}\n"
                    f"Stderr: {err.strip()}"
                )

            if not no_sudo:
                # Ensure user can read the files, before we tar it
                if not resolved_user:
                    raise RuntimeError(
                        "Unable to determine remote username for chown. "
                        "Pass --remote-user explicitly or use --no-sudo."
                    )
                cmd = f"sudo chown -R {resolved_user} {rbundle}"
                rc, out, err = _ssh_run(ssh, cmd)
                if rc != 0:
                    raise RuntimeError(
                        "chown of harvest failed.\n"
                        f"Command: {cmd}\n"
                        f"Exit code: {rc}\n"
                        f"Stderr: {err.strip()}"
                    )

            # Stream a tarball back to the local machine (avoid creating a tar file on the remote).
            cmd = f"tar -cz -C {rbundle} ."
            _stdin, stdout, stderr = ssh.exec_command(cmd)
            with open(local_tgz, "wb") as f:
                while True:
                    chunk = stdout.read(1024 * 128)
                    if not chunk:
                        break
                    f.write(chunk)
            rc = stdout.channel.recv_exit_status()
            err_text = stderr.read().decode("utf-8", errors="replace")
            if rc != 0:
                raise RuntimeError(
                    "Remote tar stream failed.\n"
                    f"Command: {cmd}\n"
                    f"Exit code: {rc}\n"
                    f"Stderr: {err_text.strip()}"
                )

            # Extract into the destination.
            with tarfile.open(local_tgz, mode="r:gz") as tf:
                _safe_extract_tar(tf, local_out_dir)

        finally:
            # Cleanup remote tmpdir even on failure.
            if rtmp:
                _ssh_run(ssh, f"rm -rf {rtmp}")
            try:
                sftp.close()
                ssh.close()
            except Exception:
                ssh.close()
                raise RuntimeError("Something went wrong generating the harvest")

    return local_out_dir / "state.json"
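As a rough usage sketch, this is what the `enroll harvest --remote-host ...` path drives under the hood (see the CLI tests further down). The host, user and output directory below are illustrative placeholders, and the target host's SSH key must already be in your known_hosts, since `remote_harvest` refuses unknown host keys:

```python
from pathlib import Path

from enroll.remote import remote_harvest

# Illustrative values only; adjust for your own host and workstation paths.
state_json = remote_harvest(
    local_out_dir=Path("/tmp/enroll-bundle"),   # harvest is pulled back here
    remote_host="server.example.org",
    remote_user="admin",
    dangerous=False,   # keep the sensitive-data exclusions
    no_sudo=False,     # run the remote harvest under sudo
)
print(state_json)      # path to the state.json fetched from the remote host
```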
137 enroll/sopsutil.py (new file)

@@ -0,0 +1,137 @@
from __future__ import annotations

import os
import shutil
import subprocess
import tempfile
from pathlib import Path
from typing import Iterable, List, Optional


class SopsError(RuntimeError):
    pass


def find_sops_cmd() -> Optional[str]:
    """Return the `sops` executable path if present on PATH."""
    return shutil.which("sops")


def require_sops_cmd() -> str:
    exe = find_sops_cmd()
    if not exe:
        raise SopsError(
            "--sops was requested but `sops` was not found on PATH. "
            "Install sops and ensure it is available as `sops`."
        )
    return exe


def _pgp_arg(fingerprints: Iterable[str]) -> str:
    fps = [f.strip() for f in fingerprints if f and f.strip()]
    if not fps:
        raise SopsError("No GPG fingerprints provided for --sops")
    # sops accepts a comma-separated list for --pgp.
    return ",".join(fps)


def encrypt_file_binary(
    src_path: Path,
    dst_path: Path,
    *,
    pgp_fingerprints: List[str],
    mode: int = 0o600,
) -> None:
    """Encrypt src_path with sops (binary) and write to dst_path atomically."""
    sops = require_sops_cmd()
    src_path = Path(src_path)
    dst_path = Path(dst_path)
    dst_path.parent.mkdir(parents=True, exist_ok=True)

    res = subprocess.run(
        [
            sops,
            "--encrypt",
            "--input-type",
            "binary",
            "--output-type",
            "binary",
            "--pgp",
            _pgp_arg(pgp_fingerprints),
            str(src_path),
        ],
        capture_output=True,
        check=False,
    )
    if res.returncode != 0:
        raise SopsError(
            "sops encryption failed:\n"
            f" cmd: {sops} --encrypt ... {src_path}\n"
            f" rc: {res.returncode}\n"
            f" stderr: {res.stderr.decode('utf-8', errors='replace').strip()}"
        )

    # Write atomically in the destination directory.
    fd, tmp = tempfile.mkstemp(prefix=".enroll-sops-", dir=str(dst_path.parent))
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(res.stdout)
        try:
            os.chmod(tmp, mode)
        except OSError:
            pass
        os.replace(tmp, dst_path)
    finally:
        try:
            os.unlink(tmp)
        except FileNotFoundError:
            pass


def decrypt_file_binary_to(
    src_path: Path,
    dst_path: Path,
    *,
    mode: int = 0o600,
) -> None:
    """Decrypt a sops-encrypted file (binary) into dst_path."""
    sops = require_sops_cmd()
    src_path = Path(src_path)
    dst_path = Path(dst_path)
    dst_path.parent.mkdir(parents=True, exist_ok=True)

    res = subprocess.run(
        [
            sops,
            "--decrypt",
            "--input-type",
            "binary",
            "--output-type",
            "binary",
            str(src_path),
        ],
        capture_output=True,
        check=False,
    )
    if res.returncode != 0:
        raise SopsError(
            "sops decryption failed:\n"
            f" cmd: {sops} --decrypt ... {src_path}\n"
            f" rc: {res.returncode}\n"
            f" stderr: {res.stderr.decode('utf-8', errors='replace').strip()}"
        )

    fd, tmp = tempfile.mkstemp(prefix=".enroll-sops-", dir=str(dst_path.parent))
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(res.stdout)
        try:
            os.chmod(tmp, mode)
        except OSError:
            pass
        os.replace(tmp, dst_path)
    finally:
        try:
            os.unlink(tmp)
        except FileNotFoundError:
            pass
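A quick sketch of how the two helpers pair up. The file paths and the GPG fingerprint are placeholders, and this assumes `sops` plus a matching key are available on the workstation (the same prerequisites the `--sops` flag relies on):

```python
from pathlib import Path

from enroll.sopsutil import encrypt_file_binary, decrypt_file_binary_to

# Placeholder fingerprint: use the key(s) you would pass alongside --sops.
fingerprints = ["ABCDEF0123456789ABCDEF0123456789ABCDEF01"]

encrypt_file_binary(
    Path("/tmp/harvest.tar"),          # plaintext input
    Path("/tmp/harvest.tar.sops"),     # encrypted output, written atomically
    pgp_fingerprints=fingerprints,
)

decrypt_file_binary_to(
    Path("/tmp/harvest.tar.sops"),
    Path("/tmp/harvest.decrypted.tar"),
)
```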
357 poetry.lock (generated)

@@ -1,10 +1,85 @@
# This file is automatically @generated by Poetry and should not be changed by hand.
# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
|
||||
[[package]]
|
||||
name = "bcrypt"
|
||||
version = "5.0.0"
|
||||
description = "Modern password hashing for your software and your servers"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f3c08197f3039bec79cee59a606d62b96b16669cff3949f21e74796b6e3cd2be"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:200af71bc25f22006f4069060c88ed36f8aa4ff7f53e67ff04d2ab3f1e79a5b2"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:baade0a5657654c2984468efb7d6c110db87ea63ef5a4b54732e7e337253e44f"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c58b56cdfb03202b3bcc9fd8daee8e8e9b6d7e3163aa97c631dfcfcc24d36c86"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4bfd2a34de661f34d0bda43c3e4e79df586e4716ef401fe31ea39d69d581ef23"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ed2e1365e31fc73f1825fa830f1c8f8917ca1b3ca6185773b349c20fd606cec2"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:83e787d7a84dbbfba6f250dd7a5efd689e935f03dd83b0f919d39349e1f23f83"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:137c5156524328a24b9fac1cb5db0ba618bc97d11970b39184c1d87dc4bf1746"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:38cac74101777a6a7d3b3e3cfefa57089b5ada650dce2baf0cbdd9d65db22a9e"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:d8d65b564ec849643d9f7ea05c6d9f0cd7ca23bdd4ac0c2dbef1104ab504543d"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:741449132f64b3524e95cd30e5cd3343006ce146088f074f31ab26b94e6c75ba"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:212139484ab3207b1f0c00633d3be92fef3c5f0af17cad155679d03ff2ee1e41"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-win32.whl", hash = "sha256:9d52ed507c2488eddd6a95bccee4e808d3234fa78dd370e24bac65a21212b861"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f6984a24db30548fd39a44360532898c33528b74aedf81c26cf29c51ee47057e"},
|
||||
{file = "bcrypt-5.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:9fffdb387abe6aa775af36ef16f55e318dcda4194ddbf82007a6f21da29de8f5"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-macosx_10_12_universal2.whl", hash = "sha256:4870a52610537037adb382444fefd3706d96d663ac44cbb2f37e3919dca3d7ef"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:48f753100931605686f74e27a7b49238122aa761a9aefe9373265b8b7aa43ea4"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f70aadb7a809305226daedf75d90379c397b094755a710d7014b8b117df1ebbf"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:744d3c6b164caa658adcb72cb8cc9ad9b4b75c7db507ab4bc2480474a51989da"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a28bc05039bdf3289d757f49d616ab3efe8cf40d8e8001ccdd621cd4f98f4fc9"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7f277a4b3390ab4bebe597800a90da0edae882c6196d3038a73adf446c4f969f"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:79cfa161eda8d2ddf29acad370356b47f02387153b11d46042e93a0a95127493"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a5393eae5722bcef046a990b84dff02b954904c36a194f6cfc817d7dca6c6f0b"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7f4c94dec1b5ab5d522750cb059bb9409ea8872d4494fd152b53cca99f1ddd8c"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0cae4cb350934dfd74c020525eeae0a5f79257e8a201c0c176f4b84fdbf2a4b4"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-win32.whl", hash = "sha256:b17366316c654e1ad0306a6858e189fc835eca39f7eb2cafd6aaca8ce0c40a2e"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:92864f54fb48b4c718fc92a32825d0e42265a627f956bc0361fe869f1adc3e7d"},
|
||||
{file = "bcrypt-5.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:dd19cf5184a90c873009244586396a6a884d591a5323f0e8a5922560718d4993"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9"},
|
||||
{file = "bcrypt-5.0.0-cp38-abi3-win_arm64.whl", hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2"},
|
||||
{file = "bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927"},
|
||||
{file = "bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7edda91d5ab52b15636d9c30da87d2cc84f426c72b9dba7a9b4fe142ba11f534"},
|
||||
{file = "bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:046ad6db88edb3c5ece4369af997938fb1c19d6a699b9c1b27b0db432faae4c4"},
|
||||
{file = "bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:dcd58e2b3a908b5ecc9b9df2f0085592506ac2d5110786018ee5e160f28e0911"},
|
||||
{file = "bcrypt-5.0.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:6b8f520b61e8781efee73cba14e3e8c9556ccfb375623f4f97429544734545b4"},
|
||||
{file = "bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd"},
|
||||
]
|
||||
|
||||
[package.extras]
|
||||
tests = ["pytest (>=3.2.1,!=3.3.0)"]
|
||||
typecheck = ["mypy"]
|
||||
|
||||
[[package]]
|
||||
name = "certifi"
|
||||
version = "2025.11.12"
|
||||
description = "Python package for providing Mozilla's CA Bundle."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
|
|
@ -12,11 +87,106 @@ files = [
|
|||
{file = "certifi-2025.11.12.tar.gz", hash = "sha256:d8ab5478f2ecd78af242878415affce761ca6bc54a22a27e026d7c25357c3316"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "cffi"
|
||||
version = "2.0.0"
|
||||
description = "Foreign Function Interface for Python calling C code."
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
{file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"},
|
||||
{file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"},
|
||||
{file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"},
|
||||
{file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"},
|
||||
{file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"},
|
||||
{file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"},
|
||||
{file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"},
|
||||
{file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"},
|
||||
{file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
pycparser = {version = "*", markers = "implementation_name != \"PyPy\""}
|
||||
|
||||
[[package]]
|
||||
name = "charset-normalizer"
|
||||
version = "3.4.4"
|
||||
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
|
|
@ -139,7 +309,6 @@ files = [
|
|||
name = "colorama"
|
||||
version = "0.4.6"
|
||||
description = "Cross-platform colored terminal text."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
|
||||
files = [
|
||||
|
|
@ -151,7 +320,6 @@ files = [
|
|||
name = "coverage"
|
||||
version = "7.13.0"
|
||||
description = "Code coverage measurement for Python"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.10"
|
||||
files = [
|
||||
|
|
@ -255,11 +423,87 @@ tomli = {version = "*", optional = true, markers = "python_full_version <= \"3.1
|
|||
[package.extras]
|
||||
toml = ["tomli"]
|
||||
|
||||
[[package]]
|
||||
name = "cryptography"
|
||||
version = "46.0.3"
|
||||
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
|
||||
optional = false
|
||||
python-versions = "!=3.9.0,!=3.9.1,>=3.8"
|
||||
files = [
|
||||
{file = "cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb"},
|
||||
{file = "cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db"},
|
||||
{file = "cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f"},
|
||||
{file = "cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372"},
|
||||
{file = "cryptography-46.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:a23582810fedb8c0bc47524558fb6c56aac3fc252cb306072fd2815da2a47c32"},
|
||||
{file = "cryptography-46.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:e7aec276d68421f9574040c26e2a7c3771060bc0cff408bae1dcb19d3ab1e63c"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:7ce938a99998ed3c8aa7e7272dca1a610401ede816d36d0693907d863b10d9ea"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:191bb60a7be5e6f54e30ba16fdfae78ad3a342a0599eb4193ba88e3f3d6e185b"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c70cc23f12726be8f8bc72e41d5065d77e4515efae3690326764ea1b07845cfb"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:9394673a9f4de09e28b5356e7fff97d778f8abad85c9d5ac4a4b7e25a0de7717"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:94cd0549accc38d1494e1f8de71eca837d0509d0d44bf11d158524b0e12cebf9"},
|
||||
{file = "cryptography-46.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6b5063083824e5509fdba180721d55909ffacccc8adbec85268b48439423d78c"},
|
||||
{file = "cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
cffi = {version = ">=2.0.0", markers = "python_full_version >= \"3.9\" and platform_python_implementation != \"PyPy\""}
|
||||
typing-extensions = {version = ">=4.13.2", markers = "python_full_version < \"3.11\""}
|
||||
|
||||
[package.extras]
|
||||
docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs", "sphinx-rtd-theme (>=3.0.0)"]
|
||||
docstest = ["pyenchant (>=3)", "readme-renderer (>=30.0)", "sphinxcontrib-spelling (>=7.3.1)"]
|
||||
nox = ["nox[uv] (>=2024.4.15)"]
|
||||
pep8test = ["check-sdist", "click (>=8.0.1)", "mypy (>=1.14)", "ruff (>=0.11.11)"]
|
||||
sdist = ["build (>=1.0.0)"]
|
||||
ssh = ["bcrypt (>=3.1.5)"]
|
||||
test = ["certifi (>=2024)", "cryptography-vectors (==46.0.3)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
|
||||
test-randomorder = ["pytest-randomly"]
|
||||
|
||||
[[package]]
|
||||
name = "desktop-entry-lib"
|
||||
version = "5.0"
|
||||
description = "A library for working with .desktop files"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.10"
|
||||
files = [
|
||||
|
|
@ -274,7 +518,6 @@ xdg-desktop-portal = ["jeepney"]
|
|||
name = "exceptiongroup"
|
||||
version = "1.3.1"
|
||||
description = "Backport of PEP 654 (exception groups)"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.7"
|
||||
files = [
|
||||
|
|
@ -292,7 +535,6 @@ test = ["pytest (>=6)"]
|
|||
name = "idna"
|
||||
version = "3.11"
|
||||
description = "Internationalized Domain Names in Applications (IDNA)"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@ -307,7 +549,6 @@ all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2
|
|||
name = "iniconfig"
|
||||
version = "2.3.0"
|
||||
description = "brain-dead simple config-ini parsing"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.10"
|
||||
files = [
|
||||
|
|
@ -315,11 +556,21 @@ files = [
|
|||
{file = "iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "invoke"
|
||||
version = "2.2.1"
|
||||
description = "Pythonic task execution"
|
||||
optional = false
|
||||
python-versions = ">=3.6"
|
||||
files = [
|
||||
{file = "invoke-2.2.1-py3-none-any.whl", hash = "sha256:2413bc441b376e5cd3f55bb5d364f973ad8bdd7bf87e53c79de3c11bf3feecc8"},
|
||||
{file = "invoke-2.2.1.tar.gz", hash = "sha256:515bf49b4a48932b79b024590348da22f39c4942dff991ad1fb8b8baea1be707"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "packaging"
|
||||
version = "25.0"
|
||||
description = "Core utilities for Python packages"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@@ -327,11 +578,30 @@ files = [
    {file = "packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f"},
]

[[package]]
name = "paramiko"
version = "4.0.0"
description = "SSH2 protocol library"
optional = false
python-versions = ">=3.9"
files = [
    {file = "paramiko-4.0.0-py3-none-any.whl", hash = "sha256:0e20e00ac666503bf0b4eda3b6d833465a2b7aff2e2b3d79a8bba5ef144ee3b9"},
    {file = "paramiko-4.0.0.tar.gz", hash = "sha256:6a25f07b380cc9c9a88d2b920ad37167ac4667f8d9886ccebd8f90f654b5d69f"},
]

[package.dependencies]
bcrypt = ">=3.2"
cryptography = ">=3.3"
invoke = ">=2.0"
pynacl = ">=1.5"

[package.extras]
gssapi = ["gssapi (>=1.4.1)", "pyasn1 (>=0.1.7)", "pywin32 (>=2.1.8)"]

[[package]]
name = "pluggy"
version = "1.6.0"
description = "plugin and hook calling mechanisms for python"
category = "dev"
optional = false
python-versions = ">=3.9"
files = [
@ -343,11 +613,21 @@ files = [
|
|||
dev = ["pre-commit", "tox"]
|
||||
testing = ["coverage", "pytest", "pytest-benchmark"]
|
||||
|
||||
[[package]]
|
||||
name = "pycparser"
|
||||
version = "2.23"
|
||||
description = "C parser in Python"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934"},
|
||||
{file = "pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2"},
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pygments"
|
||||
version = "2.19.2"
|
||||
description = "Pygments is a syntax highlighting package written in Python."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@ -358,11 +638,53 @@ files = [
|
|||
[package.extras]
|
||||
windows-terminal = ["colorama (>=0.4.6)"]
|
||||
|
||||
[[package]]
|
||||
name = "pynacl"
|
||||
version = "1.6.1"
|
||||
description = "Python binding to the Networking and Cryptography (NaCl) library"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-macosx_10_10_universal2.whl", hash = "sha256:7d7c09749450c385301a3c20dca967a525152ae4608c0a096fe8464bfc3df93d"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fc734c1696ffd49b40f7c1779c89ba908157c57345cf626be2e0719488a076d3"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3cd787ec1f5c155dc8ecf39b1333cfef41415dc96d392f1ce288b4fe970df489"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b35d93ab2df03ecb3aa506be0d3c73609a51449ae0855c2e89c7ed44abde40b"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dece79aecbb8f4640a1adbb81e4aa3bfb0e98e99834884a80eb3f33c7c30e708"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:c2228054f04bf32d558fb89bb99f163a8197d5a9bf4efa13069a7fa8d4b93fc3"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:2b12f1b97346f177affcdfdc78875ff42637cb40dcf79484a97dae3448083a78"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e735c3a1bdfde3834503baf1a6d74d4a143920281cb724ba29fb84c9f49b9c48"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3384a454adf5d716a9fadcb5eb2e3e72cd49302d1374a60edc531c9957a9b014"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-win32.whl", hash = "sha256:d8615ee34d01c8e0ab3f302dcdd7b32e2bcf698ba5f4809e7cc407c8cdea7717"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-win_amd64.whl", hash = "sha256:5f5b35c1a266f8a9ad22525049280a600b19edd1f785bccd01ae838437dcf935"},
|
||||
{file = "pynacl-1.6.1-cp314-cp314t-win_arm64.whl", hash = "sha256:d984c91fe3494793b2a1fb1e91429539c6c28e9ec8209d26d25041ec599ccf63"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-macosx_10_10_universal2.whl", hash = "sha256:a6f9fd6d6639b1e81115c7f8ff16b8dedba1e8098d2756275d63d208b0e32021"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:e49a3f3d0da9f79c1bec2aa013261ab9fa651c7da045d376bd306cf7c1792993"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7713f8977b5d25f54a811ec9efa2738ac592e846dd6e8a4d3f7578346a841078"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5a3becafc1ee2e5ea7f9abc642f56b82dcf5be69b961e782a96ea52b55d8a9fc"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4ce50d19f1566c391fedc8dc2f2f5be265ae214112ebe55315e41d1f36a7f0a9"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:543f869140f67d42b9b8d47f922552d7a967e6c116aad028c9bfc5f3f3b3a7b7"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:a2bb472458c7ca959aeeff8401b8efef329b0fc44a89d3775cffe8fad3398ad8"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:3206fa98737fdc66d59b8782cecc3d37d30aeec4593d1c8c145825a345bba0f0"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:53543b4f3d8acb344f75fd4d49f75e6572fce139f4bfb4815a9282296ff9f4c0"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:319de653ef84c4f04e045eb250e6101d23132372b0a61a7acf91bac0fda8e58c"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:262a8de6bba4aee8a66f5edf62c214b06647461c9b6b641f8cd0cb1e3b3196fe"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-win32.whl", hash = "sha256:9fd1a4eb03caf8a2fe27b515a998d26923adb9ddb68db78e35ca2875a3830dde"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-win_amd64.whl", hash = "sha256:a569a4069a7855f963940040f35e87d8bc084cb2d6347428d5ad20550a0a1a21"},
|
||||
{file = "pynacl-1.6.1-cp38-abi3-win_arm64.whl", hash = "sha256:5953e8b8cfadb10889a6e7bd0f53041a745d1b3d30111386a1bb37af171e6daf"},
|
||||
{file = "pynacl-1.6.1.tar.gz", hash = "sha256:8d361dac0309f2b6ad33b349a56cd163c98430d409fa503b10b70b3ad66eaa1d"},
|
||||
]
|
||||
|
||||
[package.dependencies]
|
||||
cffi = {version = ">=2.0.0", markers = "platform_python_implementation != \"PyPy\" and python_version >= \"3.9\""}
|
||||
|
||||
[package.extras]
|
||||
docs = ["sphinx (<7)", "sphinx_rtd_theme"]
|
||||
tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
|
||||
|
||||
[[package]]
|
||||
name = "pyproject-appimage"
|
||||
version = "4.2"
|
||||
description = "Generate AppImages from your Python projects"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
|
|
@ -379,7 +701,6 @@ tomli = {version = "*", markers = "python_version < \"3.11\""}
|
|||
name = "pytest"
|
||||
version = "8.4.2"
|
||||
description = "pytest: simple powerful testing with Python"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
|
|
@ -403,7 +724,6 @@ dev = ["argcomplete", "attrs (>=19.2)", "hypothesis (>=3.56)", "mock", "requests
|
|||
name = "pytest-cov"
|
||||
version = "5.0.0"
|
||||
description = "Pytest plugin for measuring coverage."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@ -422,7 +742,6 @@ testing = ["fields", "hunter", "process-tests", "pytest-xdist", "virtualenv"]
|
|||
name = "pyyaml"
|
||||
version = "6.0.3"
|
||||
description = "YAML parser and emitter for Python"
|
||||
category = "main"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@ -505,7 +824,6 @@ files = [
|
|||
name = "requests"
|
||||
version = "2.32.5"
|
||||
description = "Python HTTP for Humans."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
|
|
@ -527,7 +845,6 @@ use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
|
|||
name = "tomli"
|
||||
version = "2.3.0"
|
||||
description = "A lil' TOML parser"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.8"
|
||||
files = [
|
||||
|
|
@ -579,7 +896,6 @@ files = [
|
|||
name = "typing-extensions"
|
||||
version = "4.15.0"
|
||||
description = "Backported and Experimental Type Hints for Python 3.9+"
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
|
|
@ -591,7 +907,6 @@ files = [
|
|||
name = "urllib3"
|
||||
version = "2.6.2"
|
||||
description = "HTTP library with thread-safe connection pooling, file post, and more."
|
||||
category = "dev"
|
||||
optional = false
|
||||
python-versions = ">=3.9"
|
||||
files = [
|
||||
|
|
@@ -608,4 +923,4 @@ zstd = ["backports-zstd (>=1.0.0)"]
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
content-hash = "10c279bd393cab27a94b4848c6f88f3a7a3d1af5062882c3e6fd2c2e15c945c8"
content-hash = "c3466a6595a9822763431a6dff0c7f835407a2591b92d5995592f8e6802c774a"
pyproject.toml

@@ -1,6 +1,6 @@
[tool.poetry]
name = "enroll"
version = "0.0.5"
version = "0.1.0"
description = "Enroll a server's running state retrospectively into Ansible"
authors = ["Miguel Jacq <mig@mig5.net>"]
license = "GPL-3.0-or-later"

@@ -11,6 +11,7 @@ repository = "https://git.mig5.net/mig5/enroll"
[tool.poetry.dependencies]
python = "^3.10"
pyyaml = "^6.0.3"
paramiko = "^4.0.0"

[tool.poetry.scripts]
enroll = "enroll.cli:main"
@ -6,8 +6,9 @@ import enroll.cli as cli
|
|||
def test_cli_harvest_subcommand_calls_harvest(monkeypatch, capsys, tmp_path):
|
||||
called = {}
|
||||
|
||||
def fake_harvest(out: str):
|
||||
def fake_harvest(out: str, dangerous: bool = False):
|
||||
called["out"] = out
|
||||
called["dangerous"] = dangerous
|
||||
return str(tmp_path / "state.json")
|
||||
|
||||
monkeypatch.setattr(cli, "harvest", fake_harvest)
|
||||
|
|
@ -15,6 +16,7 @@ def test_cli_harvest_subcommand_calls_harvest(monkeypatch, capsys, tmp_path):
|
|||
|
||||
cli.main()
|
||||
assert called["out"] == str(tmp_path)
|
||||
assert called["dangerous"] is False
|
||||
captured = capsys.readouterr()
|
||||
assert str(tmp_path / "state.json") in captured.out
|
||||
|
||||
|
|
@ -53,8 +55,8 @@ def test_cli_manifest_subcommand_calls_manifest(monkeypatch, tmp_path):
|
|||
def test_cli_enroll_subcommand_runs_harvest_then_manifest(monkeypatch, tmp_path):
|
||||
calls = []
|
||||
|
||||
def fake_harvest(bundle_dir: str):
|
||||
calls.append(("harvest", bundle_dir))
|
||||
def fake_harvest(bundle_dir: str, dangerous: bool = False):
|
||||
calls.append(("harvest", bundle_dir, dangerous))
|
||||
return str(tmp_path / "bundle" / "state.json")
|
||||
|
||||
def fake_manifest(bundle_dir: str, out_dir: str, **kwargs):
|
||||
|
|
@ -85,11 +87,142 @@ def test_cli_enroll_subcommand_runs_harvest_then_manifest(monkeypatch, tmp_path)

    cli.main()
    assert calls == [
        ("harvest", str(tmp_path / "bundle")),
        ("harvest", str(tmp_path / "bundle"), False),
        ("manifest", str(tmp_path / "bundle"), str(tmp_path / "ansible"), None, "auto"),
    ]


def test_cli_harvest_dangerous_flag_is_forwarded(monkeypatch, tmp_path):
    called = {}

    def fake_harvest(out: str, dangerous: bool = False):
        called["out"] = out
        called["dangerous"] = dangerous
        return str(tmp_path / "state.json")

    monkeypatch.setattr(cli, "harvest", fake_harvest)
    monkeypatch.setattr(
        sys, "argv", ["enroll", "harvest", "--out", str(tmp_path), "--dangerous"]
    )

    cli.main()
    assert called["dangerous"] is True


def test_cli_harvest_remote_calls_remote_harvest_and_uses_cache_dir(
    monkeypatch, capsys, tmp_path
):
    from enroll.cache import HarvestCache

    cache_dir = tmp_path / "cache"
    cache_dir.mkdir()

    called = {}

    def fake_cache_dir(*, hint=None):
        called["hint"] = hint
        return HarvestCache(dir=cache_dir)

    def fake_remote_harvest(
        *,
        local_out_dir,
        remote_host,
        remote_port,
        remote_user,
        dangerous,
        no_sudo,
    ):
        called.update(
            {
                "local_out_dir": local_out_dir,
                "remote_host": remote_host,
                "remote_port": remote_port,
                "remote_user": remote_user,
                "dangerous": dangerous,
                "no_sudo": no_sudo,
            }
        )
        return cache_dir / "state.json"

    monkeypatch.setattr(cli, "new_harvest_cache_dir", fake_cache_dir)
    monkeypatch.setattr(cli, "remote_harvest", fake_remote_harvest)
    monkeypatch.setattr(
        sys,
        "argv",
        [
            "enroll",
            "harvest",
            "--remote-host",
            "example.test",
            "--remote-user",
            "alice",
        ],
    )

    cli.main()
    out = capsys.readouterr().out
    assert str(cache_dir / "state.json") in out
    assert called["hint"] == "example.test"
    assert called["local_out_dir"] == cache_dir
    assert called["remote_host"] == "example.test"
    assert called["remote_port"] == 22
    assert called["remote_user"] == "alice"
    assert called["dangerous"] is False
    assert called["no_sudo"] is False


def test_cli_single_shot_remote_without_harvest_prints_state_path(
    monkeypatch, capsys, tmp_path
):
    from enroll.cache import HarvestCache

    cache_dir = tmp_path / "cache"
    cache_dir.mkdir()
    ansible_dir = tmp_path / "ansible"

    calls = []

    def fake_cache_dir(*, hint=None):
        return HarvestCache(dir=cache_dir)

    def fake_remote_harvest(**kwargs):
        calls.append(("remote_harvest", kwargs))
        return cache_dir / "state.json"

    def fake_manifest(harvest_dir: str, out_dir: str, **kwargs):
        calls.append(("manifest", harvest_dir, out_dir, kwargs.get("fqdn")))

    monkeypatch.setattr(cli, "new_harvest_cache_dir", fake_cache_dir)
    monkeypatch.setattr(cli, "remote_harvest", fake_remote_harvest)
    monkeypatch.setattr(cli, "manifest", fake_manifest)
    monkeypatch.setattr(
        sys,
        "argv",
        [
            "enroll",
            "single-shot",
            "--remote-host",
            "example.test",
            "--remote-user",
            "alice",
            "--out",
            str(ansible_dir),
            "--fqdn",
            "example.test",
        ],
    )

    cli.main()
    out = capsys.readouterr().out

    # It should print the derived state.json path for usability when --harvest
    # wasn't provided.
    assert str(cache_dir / "state.json") in out

    # And it should manifest using the cache dir.
    assert ("manifest", str(cache_dir), str(ansible_dir), "example.test") in calls


def test_cli_manifest_common_args(monkeypatch, tmp_path):
    """Ensure --fqdn and jinjaturtle mode flags are forwarded correctly."""

@ -223,3 +223,29 @@ def test_manifest_site_mode_creates_host_inventory_and_raw_files(tmp_path: Path)
    assert (
        out / "inventory" / "host_vars" / fqdn / "foo" / ".files" / "etc" / "foo.conf"
    ).exists()


def test_copy2_replace_overwrites_readonly_destination(tmp_path: Path):
    """Merging into an existing manifest should tolerate read-only files.

    Some harvested artifacts (e.g. private keys) may be mode 0400. If a previous
    run copied them into the destination tree, a subsequent run must still be
    able to update/replace them.
    """

    import os
    import stat

    from enroll.manifest import _copy2_replace

    src = tmp_path / "src"
    dst = tmp_path / "dst"
    src.write_text("new", encoding="utf-8")
    dst.write_text("old", encoding="utf-8")
    os.chmod(dst, 0o400)

    _copy2_replace(str(src), str(dst))

    assert dst.read_text(encoding="utf-8") == "new"
    mode = stat.S_IMODE(dst.stat().st_mode)
    assert mode & stat.S_IWUSR  # destination should remain mergeable
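
The helper exercised above could be implemented along these lines. This is a minimal sketch assuming only the behaviour the test asserts; the project's actual `_copy2_replace` may differ:

```python
import os
import shutil
import stat


def _copy2_replace(src: str, dst: str) -> None:
    """Copy src over dst, tolerating an existing read-only destination."""
    if os.path.exists(dst) and not os.access(dst, os.W_OK):
        # Restore the owner-write bit so the file can be overwritten now and
        # stays mergeable on subsequent runs.
        os.chmod(dst, stat.S_IMODE(os.stat(dst).st_mode) | stat.S_IWUSR)
    shutil.copy2(src, dst)
```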