This commit is contained in:
2026-02-25 15:32:23 +02:00
parent 6cff65f288
commit 741173f617
24 changed files with 1205 additions and 124 deletions

View File

@@ -39,6 +39,9 @@ This installs `flow` to `~/.local/bin/flow`.
Files are read alphabetically (`*.yaml` and `*.yml`) and merged at top level.
If the same top-level key appears in multiple files, the later filename wins.
`repository.pull-before-edit` controls whether `flow dotfiles edit` runs `git pull --rebase` first (default: `true`).
When pull brings new changes, flow shows an info message and waits for Enter before opening the editor.
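Assuming each config file has already been parsed into a mapping, the documented merge rule (files read alphabetically, later filename wins on top-level keys) amounts to a simple ordered update; this is an illustrative sketch, not the flow implementation:

```python
def merge_top_level(docs):
    """Merge parsed YAML mappings in filename order.

    Later documents win when the same top-level key appears twice;
    nested keys are not merged recursively.
    """
    merged = {}
    for doc in docs:
        merged.update(doc or {})  # top-level keys only; later file wins
    return merged


# a.yaml sets editor and theme; b.yaml overrides editor
print(merge_top_level([{"editor": "vim", "theme": "dark"}, {"editor": "nvim"}]))
```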
### Dotfiles layout (flat with reserved dirs)

Inside your dotfiles repo root:
@@ -50,21 +53,35 @@ _shared/
config.yaml
packages.yaml
profiles.yaml
dnsmasq/
.user_hosts
_root/
opt/homebrew/etc/dnsmasq.conf
git/
.gitconfig
_root/
general/
etc/
hostname
linux-auto/
nvim/
.config/nvim/init.lua
```
- `_shared/`: linked for all profiles
- `_root/` is a marker inside a package for absolute paths (via sudo), e.g. `dnsmasq/_root/etc/hostname -> /etc/hostname`
- every other directory at this level is a profile name
- any target conflict fails (including `_shared` vs profile)
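The target rule above can be sketched in a few lines: a package-relative path under the `_root/` marker maps to an absolute path (linked via sudo), and everything else maps under `$HOME`. A minimal illustration of the rule, not the actual flow code:

```python
from pathlib import Path
from typing import Optional

RESERVED_ROOT = "_root"


def link_target(rel: Path, home: Path) -> Optional[Path]:
    """Map a package-relative file path to its symlink target."""
    parts = rel.parts
    if parts and parts[0] == RESERVED_ROOT:
        if len(parts) < 2:
            return None  # a bare `_root` entry has no target
        # dnsmasq/_root/etc/hostname -> /etc/hostname
        return Path("/").joinpath(*parts[1:])
    # everything else lands under the user's home directory
    return home / rel
```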
### External module packages
Packages can be backed by an external git repository using `_module.yaml`:
```yaml
source: github:org/nvim-config
ref:
branch: main
```
- If a package directory contains `_module.yaml`, flow uses the fetched module content as package source.
- Any sibling files in that package directory are ignored (shown only in `--verbose`).
- Modules are refreshed on `flow dotfiles init` and `flow dotfiles sync` (not on `link`).
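The `github:` shorthand in `source` is expanded to an HTTPS clone URL before fetching; full URLs and `git@` remotes pass through unchanged. A sketch of that normalization (local relative paths, which the real code resolves against the package directory, are omitted here):

```python
def normalize_module_source(source: str) -> str:
    """Expand the `github:` shorthand used by `_module.yaml`.

    `github:org/repo` becomes an HTTPS clone URL; anything else
    (full URLs, `git@` remotes) is returned as-is.
    """
    if source.startswith("github:"):
        repo = source.split(":", 1)[1]
        return f"https://github.com/{repo}.git"
    return source


print(normalize_module_source("github:org/nvim-config"))
```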
## Manifest model
@@ -144,7 +161,7 @@ profiles:
- `requires` validation for required env vars
- `ssh-keygen` definitions
- `runcmd` (runs after package installation)
- automatic config linking (`_shared` + profile, including package-local `_root` markers)
- `post-link` hook (runs after symlink phase)
- config skip patterns:
- package names (e.g. `nvim`)
@@ -161,6 +178,8 @@ flow dev create api -i tm0/node -p ~/projects/api
flow dotfiles init --repo git@github.com:you/dotfiles.git
flow dotfiles link --profile linux-auto
flow dotfiles status
flow dotfiles modules list
flow dotfiles modules sync
flow bootstrap list
flow bootstrap show linux-auto

View File

@@ -6,8 +6,8 @@ This folder contains a complete dotfiles + bootstrap setup for the current `flow
- Flat repo-root layout with reserved dirs:
- `_shared/` (shared configs)
- `_root/` (root-targeted configs)
- profile dirs (`linux-auto/`, `macos-dev/`)
- package-local `_root/` marker for root-targeted files
- Unified YAML config under `_shared/flow/.config/flow/*.yaml`
- Profile package list syntax: string, type prefix, and object entries
- Binary install definition with `asset-pattern`, `platform-map`, `extract-dir`, and `install`
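The binary install definition is guarded by the path validation this commit adds: declared install paths must be relative and free of parent traversal so an entry can never escape the extracted archive. A self-contained sketch of that check (the real installer raises `RuntimeError`; the names here are illustrative):

```python
from pathlib import Path


def validate_install_path(package: str, declared: Path) -> None:
    """Reject absolute or parent-traversing declared install paths."""
    if declared.is_absolute():
        raise ValueError(
            f"Install path for '{package}' must be relative: {declared}"
        )
    if any(part == ".." for part in declared.parts):
        raise ValueError(
            f"Install path for '{package}' must not include parent traversal: {declared}"
        )
```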
@@ -19,7 +19,6 @@ This folder contains a complete dotfiles + bootstrap setup for the current `flow
- `dotfiles-repo/_shared/flow/.config/flow/packages.yaml`
- `dotfiles-repo/_shared/flow/.config/flow/profiles.yaml`
- `dotfiles-repo/_shared/...`
- `dotfiles-repo/_root/...`
- `dotfiles-repo/linux-auto/...`
- `dotfiles-repo/macos-dev/...`
@@ -45,6 +44,8 @@ Check repo commands:
flow dotfiles repo status
flow dotfiles repo pull --relink --profile linux-auto
flow dotfiles repo push
flow dotfiles modules list
flow dotfiles modules sync
```
Edit package or file/path targets:

View File

@@ -1,6 +1,7 @@
repository:
dotfiles-url: /ABSOLUTE/PATH/TO/flow-cli/example/dotfiles-repo
dotfiles-branch: main
pull-before-edit: true
paths:
projects-dir: ~/projects

View File

@@ -1,3 +0,0 @@
[user]
name = Example Linux User
email = linux@example.com

View File

@@ -0,0 +1,2 @@
Host *
AddKeysToAgent yes

View File

@@ -0,0 +1 @@
theme = dark

View File

@@ -1,5 +0,0 @@
export EDITOR=vim
export PATH="$HOME/.local/bin:$PATH"
alias ll='ls -lah'
alias gs='git status -sb'

View File

@@ -314,6 +314,25 @@ def _strip_prefix(path: Path, prefix: Path) -> Path:
return path
def _is_under(path: Path, root: Path) -> bool:
try:
path.resolve().relative_to(root.resolve())
return True
except ValueError:
return False
def _validate_declared_install_path(package_name: str, declared_path: Path) -> None:
if declared_path.is_absolute():
raise RuntimeError(
f"Install path for '{package_name}' must be relative: {declared_path}"
)
if any(part == ".." for part in declared_path.parts):
raise RuntimeError(
f"Install path for '{package_name}' must not include parent traversal: {declared_path}"
)
def _install_destination(kind: str) -> Path:
home = Path.home()
if kind == "bin":
@@ -412,6 +431,7 @@ def _install_binary_package(
raise RuntimeError(
f"extract-dir '{extract_dir_value}' not found for package '{spec['name']}'"
)
source_root_resolved = source_root.resolve()
for kind in ("bin", "share", "man", "lib"):
items = install.get(kind, [])
@@ -424,7 +444,13 @@ def _install_binary_package(
rendered = substitute_template(raw_item, template_ctx)
declared_path = Path(rendered)
_validate_declared_install_path(spec["name"], declared_path)
src = (source_root / declared_path).resolve()
if not _is_under(src, source_root_resolved):
raise RuntimeError(
f"Install path escapes extract-dir for '{spec['name']}': {declared_path}"
)
if not src.exists():
raise RuntimeError(
f"Install path not found for '{spec['name']}': {declared_path}"

View File

@@ -116,16 +116,32 @@ def _list_bootstrap_profiles() -> List[str]:
def _list_manifest_packages() -> List[str]:
manifest = _safe_manifest()
packages = manifest.get("packages", [])
names: Set[str] = set()
if isinstance(packages, list):
for pkg in packages:
if not isinstance(pkg, dict):
continue
name = pkg.get("name")
if not isinstance(name, str) or not name:
continue
if str(pkg.get("type", "pkg")) == "binary":
names.add(name)
if isinstance(packages, dict):
for key, pkg in packages.items():
if not isinstance(pkg, dict):
continue
if str(pkg.get("type", "pkg")) != "binary":
continue
raw_name = pkg.get("name")
if isinstance(raw_name, str) and raw_name:
names.add(raw_name)
elif isinstance(key, str) and key:
names.add(key)
return sorted(names)
def _list_installed_packages() -> List[str]: def _list_installed_packages() -> List[str]:
@@ -297,11 +313,23 @@ def _complete_dev(before: Sequence[str], current: str) -> List[str]:
def _complete_dotfiles(before: Sequence[str], current: str) -> List[str]:
if len(before) <= 1:
if current.startswith("-"):
return _filter(["--verbose", "-h", "--help"], current)
return _filter(
["init", "link", "unlink", "status", "sync", "relink", "clean", "edit", "repo", "modules"],
current,
)
if before[1] == "--verbose":
if len(before) <= 2:
if current.startswith("-"):
return _filter(["-h", "--help"], current)
return _filter(
["init", "link", "unlink", "status", "sync", "relink", "clean", "edit", "repo", "modules"],
current,
)
before = [before[0]] + list(before[2:])
sub = before[1]
if sub == "init":
@@ -322,6 +350,25 @@ def _complete_dotfiles(before: Sequence[str], current: str) -> List[str]:
return []
if sub == "modules":
if len(before) <= 2:
if current.startswith("-"):
return _filter(["-h", "--help"], current)
return _filter(["list", "sync"], current)
modules_sub = before[2]
if modules_sub in {"list", "sync"}:
if before and before[-1] == "--profile":
return _filter(_list_dotfiles_profiles(), current)
if current.startswith("-"):
return _filter(["--profile", "-h", "--help"], current)
profile = _profile_from_before(before)
return _filter(_list_dotfiles_packages(profile), current)
return []
if sub in {"link", "relink"}:
if before and before[-1] == "--profile":
return _filter(_list_dotfiles_profiles(), current)

View File

@@ -11,11 +11,14 @@ from dataclasses import dataclass
from pathlib import Path
from typing import Dict, List, Optional, Set
import yaml
from flow.core.config import FlowContext
from flow.core.paths import DOTFILES_DIR, LINKED_STATE, MODULES_DIR
RESERVED_SHARED = "_shared"
RESERVED_ROOT = "_root"
MODULE_FILE = "_module.yaml"
@dataclass
@@ -26,8 +29,18 @@ class LinkSpec:
is_directory_link: bool = False
@dataclass
class ModuleSpec:
package: str
source: str
ref_type: str
ref_value: str
package_dir: Path
def register(subparsers):
p = subparsers.add_parser("dotfiles", aliases=["dot"], help="Manage dotfiles")
p.add_argument("--verbose", action="store_true", help="Show detailed output")
sub = p.add_subparsers(dest="dotfiles_command")
init = sub.add_parser("init", help="Clone dotfiles repository")
@@ -54,6 +67,21 @@ def register(subparsers):
sync.add_argument("--profile", help="Profile to use when relinking")
sync.set_defaults(handler=run_sync)
modules = sub.add_parser("modules", help="Inspect and refresh external modules")
modules_sub = modules.add_subparsers(dest="dotfiles_modules_command")
modules_list = modules_sub.add_parser("list", help="List detected module packages")
modules_list.add_argument("packages", nargs="*", help="Filter by package name")
modules_list.add_argument("--profile", help="Limit to shared + one profile")
modules_list.set_defaults(handler=run_modules_list)
modules_sync = modules_sub.add_parser("sync", help="Refresh module checkouts")
modules_sync.add_argument("packages", nargs="*", help="Filter by package name")
modules_sync.add_argument("--profile", help="Limit to shared + one profile")
modules_sync.set_defaults(handler=run_modules_sync)
modules.set_defaults(handler=run_modules_list)
repo = sub.add_parser("repo", help="Manage dotfiles repository")
repo_sub = repo.add_subparsers(dest="dotfiles_repo_command")
@@ -100,12 +128,8 @@ def register(subparsers):
p.set_defaults(handler=lambda ctx, args: p.print_help())
def _flow_config_dir(dotfiles_dir: Optional[Path] = None) -> Path:
return dotfiles_dir or DOTFILES_DIR
def _is_root_package(package: str) -> bool:
return package == RESERVED_ROOT or package.startswith(f"{RESERVED_ROOT}/")
def _insert_spec(
@@ -125,6 +149,295 @@ def _insert_spec(
desired[target] = LinkSpec(source=source, target=target, package=package)
def _is_path_like_target(target: str) -> bool:
raw = Path(target)
return "/" in target or target.startswith(".") or raw.suffix != ""
def _module_cache_dir(spec: ModuleSpec) -> Path:
key = spec.package.replace("/", "--")
return MODULES_DIR / key
def _normalize_module_source(source: str, *, package_dir: Optional[Path] = None) -> str:
if source.startswith("github:"):
repo = source.split(":", 1)[1]
return f"https://github.com/{repo}.git"
if "://" in source or source.startswith("git@"):
return source
raw = Path(source).expanduser()
if raw.is_absolute():
return str(raw)
if package_dir is None:
return source
return str((package_dir / raw).resolve())
def _load_module_spec(package_dir: Path, package: str) -> ModuleSpec:
module_file = package_dir / MODULE_FILE
if not module_file.exists():
raise RuntimeError(f"Module file not found: {module_file}")
try:
with open(module_file, "r", encoding="utf-8") as handle:
raw = yaml.safe_load(handle) or {}
except yaml.YAMLError as e:
raise RuntimeError(f"Invalid YAML in {module_file}: {e}") from e
if not isinstance(raw, dict):
raise RuntimeError(f"{module_file} must contain a mapping")
source = raw.get("source")
if not isinstance(source, str) or not source:
raise RuntimeError(f"{module_file} must define non-empty 'source'")
ref = raw.get("ref")
if not isinstance(ref, dict):
raise RuntimeError(f"{module_file} must define 'ref' mapping")
choices = [key for key in ("branch", "tag", "commit") if isinstance(ref.get(key), str) and ref.get(key)]
if len(choices) != 1:
raise RuntimeError(f"{module_file} 'ref' must include exactly one of: branch, tag, commit")
ref_type = choices[0]
ref_value = str(ref[ref_type])
return ModuleSpec(
package=package,
source=_normalize_module_source(source, package_dir=package_dir),
ref_type=ref_type,
ref_value=ref_value,
package_dir=package_dir,
)
def _run_git(dir_path: Path, *args: str, capture: bool = True) -> subprocess.CompletedProcess:
return subprocess.run(
["git", "-C", str(dir_path)] + list(args),
capture_output=capture,
text=True,
check=False,
)
def _pull_requires_ack(stdout: str, stderr: str) -> bool:
text = f"{stdout}\n{stderr}".strip()
if not text:
return False
lowered = text.lower()
if "already up to date" in lowered or "already up-to-date" in lowered:
return False
return True
def _pull_repo_before_edit(ctx: FlowContext, repo_dir: Path, *, verbose: bool = False) -> None:
ctx.console.info(f"Pulling latest changes in {repo_dir}...")
result = _run_git(repo_dir, "pull", "--rebase", capture=True)
if result.returncode != 0:
ctx.console.warn(f"Git pull failed: {result.stderr.strip()}")
return
if verbose:
output = result.stdout.strip()
if output:
print(output)
if _pull_requires_ack(result.stdout, result.stderr):
ctx.console.info("Repository updated before edit. Review incoming changes first.")
try:
input("Press Enter to continue editing... ")
except (EOFError, KeyboardInterrupt):
print()
def _refresh_module(spec: ModuleSpec) -> None:
module_dir = _module_cache_dir(spec)
module_dir.parent.mkdir(parents=True, exist_ok=True)
if not module_dir.exists():
clone = subprocess.run(
["git", "clone", "--recurse-submodules", spec.source, str(module_dir)],
capture_output=True,
text=True,
check=False,
)
if clone.returncode != 0:
raise RuntimeError(
f"Failed to clone module {spec.package} from {spec.source}: {clone.stderr.strip()}"
)
if spec.ref_type == "branch":
fetch = _run_git(module_dir, "fetch", "origin", spec.ref_value)
if fetch.returncode != 0:
raise RuntimeError(
f"Failed to fetch module {spec.package} branch {spec.ref_value}: {fetch.stderr.strip()}"
)
checkout = _run_git(module_dir, "checkout", spec.ref_value)
if checkout.returncode != 0:
create = _run_git(module_dir, "checkout", "-B", spec.ref_value, f"origin/{spec.ref_value}")
if create.returncode != 0:
raise RuntimeError(
f"Failed to checkout branch {spec.ref_value} for module {spec.package}: "
f"{create.stderr.strip()}"
)
pull = _run_git(module_dir, "pull", "--ff-only", "origin", spec.ref_value)
if pull.returncode != 0:
raise RuntimeError(
f"Failed to update module {spec.package} branch {spec.ref_value}: {pull.stderr.strip()}"
)
elif spec.ref_type == "tag":
fetch = _run_git(module_dir, "fetch", "--tags", "origin")
if fetch.returncode != 0:
raise RuntimeError(
f"Failed to fetch tags for module {spec.package}: {fetch.stderr.strip()}"
)
checkout = _run_git(module_dir, "checkout", f"tags/{spec.ref_value}")
if checkout.returncode != 0:
raise RuntimeError(
f"Failed to checkout tag {spec.ref_value} for module {spec.package}: "
f"{checkout.stderr.strip()}"
)
else:
fetch = _run_git(module_dir, "fetch", "origin")
if fetch.returncode != 0:
raise RuntimeError(
f"Failed to fetch module {spec.package}: {fetch.stderr.strip()}"
)
checkout = _run_git(module_dir, "checkout", spec.ref_value)
if checkout.returncode != 0:
raise RuntimeError(
f"Failed to checkout commit {spec.ref_value} for module {spec.package}: "
f"{checkout.stderr.strip()}"
)
update = _run_git(module_dir, "submodule", "update", "--init", "--recursive")
if update.returncode != 0:
raise RuntimeError(
f"Failed to update nested submodules for module {spec.package}: {update.stderr.strip()}"
)
def _iter_package_dirs(
dotfiles_dir: Path,
*,
profile: Optional[str] = None,
package_filter: Optional[Set[str]] = None,
) -> List[tuple[str, Path]]:
out: List[tuple[str, Path]] = []
flow_dir = _flow_config_dir(dotfiles_dir)
shared = flow_dir / RESERVED_SHARED
if shared.is_dir():
for pkg_dir in sorted(shared.iterdir()):
if pkg_dir.is_dir() and not pkg_dir.name.startswith("."):
if package_filter and pkg_dir.name not in package_filter:
continue
out.append((f"{RESERVED_SHARED}/{pkg_dir.name}", pkg_dir))
profiles = [profile] if profile else _list_profiles(flow_dir)
for profile_name in profiles:
profile_dir = flow_dir / profile_name
if not profile_dir.is_dir():
continue
for pkg_dir in sorted(profile_dir.iterdir()):
if pkg_dir.is_dir() and not pkg_dir.name.startswith("."):
if package_filter and pkg_dir.name not in package_filter:
continue
out.append((f"{profile_name}/{pkg_dir.name}", pkg_dir))
return out
def _collect_module_specs(
dotfiles_dir: Path,
*,
profile: Optional[str] = None,
package_filter: Optional[Set[str]] = None,
) -> List[ModuleSpec]:
specs: List[ModuleSpec] = []
for package, package_dir in _iter_package_dirs(
dotfiles_dir,
profile=profile,
package_filter=package_filter,
):
module_file = package_dir / MODULE_FILE
if not module_file.exists():
continue
specs.append(_load_module_spec(package_dir, package))
return specs
def _sync_modules(
ctx: FlowContext,
*,
verbose: bool = False,
profile: Optional[str] = None,
package_filter: Optional[Set[str]] = None,
) -> None:
_ensure_flow_dir(ctx)
for spec in _collect_module_specs(
DOTFILES_DIR,
profile=profile,
package_filter=package_filter,
):
if verbose:
ctx.console.info(
f"Updating module {spec.package} from {spec.source} ({spec.ref_type}={spec.ref_value})"
)
_refresh_module(spec)
def _module_ref_label(spec: ModuleSpec) -> str:
return f"{spec.ref_type}:{spec.ref_value}"
def _module_head_short(module_dir: Path) -> str:
result = _run_git(module_dir, "rev-parse", "--short", "HEAD")
if result.returncode != 0:
return "unknown"
return result.stdout.strip() or "unknown"
def _resolved_package_source(
ctx: FlowContext,
package: str,
package_dir: Path,
*,
verbose: bool = False,
) -> Path:
module_file = package_dir / MODULE_FILE
if not module_file.exists():
return package_dir
if verbose:
extras = [p.name for p in package_dir.iterdir() if p.name != MODULE_FILE]
if extras:
ctx.console.info(
f"Package {package} uses {MODULE_FILE}; ignoring local files: {', '.join(sorted(extras))}"
)
spec = _load_module_spec(package_dir, package)
module_dir = _module_cache_dir(spec)
if not module_dir.exists():
raise RuntimeError(
f"Module source missing for package '{package}'. Run 'flow dotfiles sync' first."
)
return module_dir
def _load_state() -> dict:
if LINKED_STATE.exists():
with open(LINKED_STATE, "r", encoding="utf-8") as handle:
@@ -194,8 +507,12 @@ def _list_profiles(flow_dir: Path) -> List[str]:
def _walk_package(source_dir: Path):
for root, dirs, files in os.walk(source_dir):
# Never traverse git metadata from module-backed package sources.
dirs[:] = [entry for entry in dirs if entry != ".git"]
for fname in files:
if fname == ".git":
continue
src = Path(root) / fname
rel = src.relative_to(source_dir)
yield src, rel
@@ -244,7 +561,7 @@ def _discover_packages(dotfiles_dir: Path, profile: Optional[str] = None) -> dic
return packages
def _find_package_dir(package_name: str, dotfiles_dir: Optional[Path] = None) -> Optional[Path]:
flow_dir = _flow_config_dir(dotfiles_dir)
shared_dir = flow_dir / RESERVED_SHARED / package_name
@@ -259,19 +576,20 @@ def _find_package_dir(package_name: str, dotfiles_dir: Path = DOTFILES_DIR) -> O
return None
def _resolve_edit_target(target: str, dotfiles_dir: Optional[Path] = None) -> Optional[Path]:
dotfiles_dir = dotfiles_dir or DOTFILES_DIR
base_dir = dotfiles_dir.resolve()
raw = Path(target).expanduser()
if raw.is_absolute():
if not _is_under(raw, base_dir):
return None
return raw
is_path_like = _is_path_like_target(target)
if is_path_like:
candidate = dotfiles_dir / raw
if not _is_under(candidate, base_dir):
return None
if candidate.exists() or candidate.parent.exists():
return candidate
return None
@@ -354,6 +672,14 @@ def _is_in_home(path: Path, home: Path) -> bool:
return False
def _is_under(path: Path, parent: Path) -> bool:
try:
path.resolve().relative_to(parent.resolve())
return True
except ValueError:
return False
def _run_sudo(cmd: List[str], *, dry_run: bool = False) -> None:
if dry_run:
print(" " + " ".join(shlex.quote(part) for part in (["sudo"] + cmd)))
@@ -385,11 +711,14 @@ def _same_symlink(target: Path, source: Path) -> bool:
def _collect_home_specs(
ctx: FlowContext,
flow_dir: Path,
home: Path,
profile: Optional[str],
skip: Set[str],
package_filter: Optional[Set[str]],
*,
verbose: bool = False,
) -> Dict[Path, LinkSpec]:
desired: Dict[Path, LinkSpec] = {}
@@ -405,10 +734,20 @@ def _collect_home_specs(
continue
package_name = f"{RESERVED_SHARED}/{pkg_dir.name}"
source_dir = _resolved_package_source(ctx, package_name, pkg_dir, verbose=verbose)
for src, rel in _walk_package(source_dir):
if rel.parts and rel.parts[0] == RESERVED_ROOT:
if RESERVED_ROOT in skip:
continue
if len(rel.parts) < 2:
continue
target = Path("/") / Path(*rel.parts[1:])
else:
target = home / rel
_insert_spec(
desired,
target=target,
source=src,
package=package_name,
)
@@ -425,10 +764,20 @@ def _collect_home_specs(
continue
package_name = f"{profile}/{pkg_dir.name}"
source_dir = _resolved_package_source(ctx, package_name, pkg_dir, verbose=verbose)
for src, rel in _walk_package(source_dir):
if rel.parts and rel.parts[0] == RESERVED_ROOT:
if RESERVED_ROOT in skip:
continue
if len(rel.parts) < 2:
continue
target = Path("/") / Path(*rel.parts[1:])
else:
target = home / rel
_insert_spec(
desired,
target=target,
source=src,
package=package_name,
)
@@ -436,31 +785,6 @@ def _collect_home_specs(
return desired
def _collect_root_specs(flow_dir: Path, skip: Set[str], include_root: bool) -> Dict[Path, LinkSpec]:
desired: Dict[Path, LinkSpec] = {}
if not include_root or RESERVED_ROOT in skip:
return desired
root_dir = flow_dir / RESERVED_ROOT
if not root_dir.is_dir():
return desired
for root_pkg_dir in sorted(root_dir.iterdir()):
if not root_pkg_dir.is_dir() or root_pkg_dir.name.startswith("."):
continue
for src, rel in _walk_package(root_pkg_dir):
target = Path("/") / rel
_insert_spec(
desired,
target=target,
source=src,
package=f"{RESERVED_ROOT}/{root_pkg_dir.name}",
)
return desired
def _validate_conflicts(
desired: Dict[Path, LinkSpec],
current: Dict[Path, LinkSpec],
@@ -488,7 +812,7 @@ def _validate_conflicts(
def _apply_link_spec(spec: LinkSpec, *, copy: bool, dry_run: bool) -> bool:
use_sudo = not _is_in_home(spec.target, Path.home())
if copy and use_sudo:
print(f" SKIP COPY (root target): {spec.target}")
@@ -534,7 +858,7 @@ def _sync_to_desired(
for target in sorted(current.keys(), key=str):
if target in desired:
continue
use_sudo = not _is_in_home(target, Path.home())
_remove_target(target, use_sudo=use_sudo, dry_run=dry_run)
del current[target]
@@ -547,7 +871,7 @@ def _sync_to_desired(
exists = target.exists() or target.is_symlink()
if exists:
use_sudo = not _is_in_home(target, Path.home())
_remove_target(target, use_sudo=use_sudo, dry_run=dry_run)
applied = _apply_link_spec(spec, copy=copy, dry_run=dry_run)
@@ -562,32 +886,22 @@ def _desired_links_for_profile(
ctx: FlowContext,
profile: Optional[str],
package_filter: Optional[Set[str]],
*,
verbose: bool = False,
) -> Dict[Path, LinkSpec]:
flow_dir = _flow_config_dir()
home = Path.home()
skip = _profile_skip_set(ctx, profile)
return _collect_home_specs(
ctx,
flow_dir,
home,
profile,
skip,
package_filter,
verbose=verbose,
)
def run_init(ctx: FlowContext, args):
@@ -602,9 +916,16 @@ def run_init(ctx: FlowContext, args):
DOTFILES_DIR.parent.mkdir(parents=True, exist_ok=True)
branch = ctx.config.dotfiles_branch
cmd = ["git", "clone", "-b", branch, "--recurse-submodules", repo_url, str(DOTFILES_DIR)]
ctx.console.info(f"Cloning {repo_url} (branch: {branch})...")
subprocess.run(cmd, check=True)
try:
_sync_modules(ctx, verbose=bool(getattr(args, "verbose", False)))
except RuntimeError as e:
ctx.console.error(str(e))
sys.exit(1)
ctx.console.success(f"Dotfiles cloned to {DOTFILES_DIR}")
@@ -618,7 +939,17 @@ def run_link(ctx: FlowContext, args):
sys.exit(1)
package_filter = set(args.packages) if args.packages else None
try:
desired = _desired_links_for_profile(
ctx,
profile,
package_filter,
verbose=bool(getattr(args, "verbose", False)),
)
except RuntimeError as e:
ctx.console.error(str(e))
sys.exit(1)
if not desired:
ctx.console.warn("No link targets found for selected profile/filters")
@@ -670,7 +1001,7 @@ def run_unlink(ctx: FlowContext, args):
if filters and not _package_match(spec.package, filters):
continue
use_sudo = not _is_in_home(target, Path.home())
try:
_remove_target(target, use_sudo=use_sudo, dry_run=False)
except RuntimeError as e:
@@ -718,6 +1049,7 @@ def run_sync(ctx: FlowContext, args):
     try:
         _pull_dotfiles(ctx, rebase=True)
+        _sync_modules(ctx, verbose=bool(getattr(args, "verbose", False)))
     except RuntimeError as e:
         ctx.console.error(str(e))
         sys.exit(1)
@@ -727,6 +1059,90 @@ def run_sync(ctx: FlowContext, args):
     run_relink(ctx, relink_args)


+def _validated_profile_name(profile: Optional[str]) -> Optional[str]:
+    if not profile:
+        return None
+    profiles = _list_profiles(_flow_config_dir())
+    if profile not in profiles:
+        raise RuntimeError(f"Profile not found: {profile}")
+    return profile
+
+
+def _package_filter_from_args(args) -> Optional[Set[str]]:
+    packages = getattr(args, "packages", [])
+    if not packages:
+        return None
+    return {str(pkg) for pkg in packages}
+
+
+def run_modules_list(ctx: FlowContext, args):
+    _ensure_flow_dir(ctx)
+    try:
+        profile = _validated_profile_name(getattr(args, "profile", None))
+    except RuntimeError as e:
+        ctx.console.error(str(e))
+        sys.exit(1)
+    package_filter = _package_filter_from_args(args)
+    specs = _collect_module_specs(
+        DOTFILES_DIR,
+        profile=profile,
+        package_filter=package_filter,
+    )
+    if not specs:
+        ctx.console.info("No module packages found.")
+        return
+    rows = []
+    for spec in sorted(specs, key=lambda item: item.package):
+        module_dir = _module_cache_dir(spec)
+        if module_dir.exists():
+            status = f"ready@{_module_head_short(module_dir)}"
+        else:
+            status = "missing"
+        rows.append([spec.package, _module_ref_label(spec), spec.source, status])
+    ctx.console.table(["PACKAGE", "REF", "SOURCE", "STATUS"], rows)
+
+
+def run_modules_sync(ctx: FlowContext, args):
+    _ensure_flow_dir(ctx)
+    try:
+        profile = _validated_profile_name(getattr(args, "profile", None))
+    except RuntimeError as e:
+        ctx.console.error(str(e))
+        sys.exit(1)
+    package_filter = _package_filter_from_args(args)
+    specs = _collect_module_specs(
+        DOTFILES_DIR,
+        profile=profile,
+        package_filter=package_filter,
+    )
+    if not specs:
+        ctx.console.info("No module packages to sync.")
+        return
+    try:
+        _sync_modules(
+            ctx,
+            verbose=bool(getattr(args, "verbose", False)),
+            profile=profile,
+            package_filter=package_filter,
+        )
+    except RuntimeError as e:
+        ctx.console.error(str(e))
+        sys.exit(1)
+    ctx.console.success(f"Synced {len(specs)} module(s)")
+
+
 def run_repo_status(ctx: FlowContext, args):
     _ensure_dotfiles_dir(ctx)
@@ -747,6 +1163,7 @@ def run_repo_pull(ctx: FlowContext, args):
     try:
         _pull_dotfiles(ctx, rebase=args.rebase)
+        _sync_modules(ctx, verbose=bool(getattr(args, "verbose", False)))
     except RuntimeError as e:
         ctx.console.error(str(e))
         sys.exit(1)
@@ -803,7 +1220,7 @@ def run_clean(ctx: FlowContext, args):
             if args.dry_run:
                 print(f"Would remove broken symlink: {target}")
             else:
-                use_sudo = _is_root_package(current[target].package) or not _is_in_home(target, Path.home())
+                use_sudo = not _is_in_home(target, Path.home())
                 _remove_target(target, use_sudo=use_sudo, dry_run=False)
                 del current[target]
                 removed += 1
@@ -821,15 +1238,42 @@ def run_edit(ctx: FlowContext, args):
     _ensure_dotfiles_dir(ctx)
     target_name = args.target
-    edit_target = _resolve_edit_target(target_name)
+    verbose = bool(getattr(args, "verbose", False))
+    edit_target = None
+    if not _is_path_like_target(target_name):
+        package_dir = _find_package_dir(target_name)
+        if package_dir is not None:
+            try:
+                package_layer = package_dir.parent.name
+                package_id = f"{package_layer}/{package_dir.name}"
+                edit_target = _resolved_package_source(
+                    ctx,
+                    package_id,
+                    package_dir,
+                    verbose=verbose,
+                )
+            except RuntimeError as e:
+                ctx.console.error(str(e))
+                sys.exit(1)
+    if edit_target is None:
+        edit_target = _resolve_edit_target(target_name)
     if edit_target is None:
         ctx.console.error(f"No matching package or path found for: {target_name}")
         sys.exit(1)
-    ctx.console.info("Pulling latest changes...")
-    result = _run_dotfiles_git("pull", "--rebase", capture=True)
-    if result.returncode != 0:
-        ctx.console.warn(f"Git pull failed: {result.stderr.strip()}")
+    module_mode = _is_under(edit_target, MODULES_DIR)
+    if verbose and module_mode:
+        ctx.console.info(f"Editing module workspace: {edit_target}")
+    if ctx.config.dotfiles_pull_before_edit:
+        pull_repo = DOTFILES_DIR
+        if module_mode:
+            pull_repo = edit_target if edit_target.is_dir() else edit_target.parent
+        _pull_repo_before_edit(ctx, pull_repo, verbose=verbose)

     editor = os.environ.get("EDITOR", "vim")
     ctx.console.info(f"Opening {edit_target} in {editor}...")
@@ -837,6 +1281,34 @@ def run_edit(ctx: FlowContext, args):
     if edit_result.returncode != 0:
         ctx.console.warn(f"Editor exited with status {edit_result.returncode}")

+    if module_mode:
+        module_git_dir = edit_target if edit_target.is_dir() else edit_target.parent
+        result = _run_git(module_git_dir, "status", "--porcelain", capture=True)
+        if result.stdout.strip() and not args.no_commit:
+            ctx.console.info("Module changes detected, committing...")
+            subprocess.run(["git", "-C", str(module_git_dir), "add", "."], check=True)
+            subprocess.run(
+                ["git", "-C", str(module_git_dir), "commit", "-m", f"Update {target_name}"],
+                check=True,
+            )
+            try:
+                response = input("Push module changes to remote? [Y/n] ")
+            except (EOFError, KeyboardInterrupt):
+                response = "n"
+                print()
+            if response.lower() != "n":
+                subprocess.run(["git", "-C", str(module_git_dir), "push"], check=True)
+                ctx.console.success("Module changes committed and pushed")
+            else:
+                ctx.console.info("Module changes committed locally (not pushed)")
+        elif result.stdout.strip() and args.no_commit:
+            ctx.console.info("Module changes detected; skipped commit (--no-commit)")
+        else:
+            ctx.console.info("No module changes to commit")
+        return
+
     result = _run_dotfiles_git("status", "--porcelain", capture=True)
     if result.stdout.strip() and not args.no_commit:
@@ -30,9 +30,17 @@ def register(subparsers):

 def _load_installed() -> dict:
-    if INSTALLED_STATE.exists():
-        with open(INSTALLED_STATE, "r", encoding="utf-8") as handle:
-            return json.load(handle)
+    if not INSTALLED_STATE.exists():
+        return {}
+    try:
+        with open(INSTALLED_STATE, "r", encoding="utf-8") as handle:
+            state = json.load(handle)
+    except (OSError, json.JSONDecodeError):
+        return {}
+    if isinstance(state, dict):
+        return state
     return {}
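The hardened loader in this hunk tolerates missing, unreadable, malformed, and non-mapping state files. A minimal standalone sketch of the same defensive pattern (the function and file names here are illustrative, not flow's actual API):

```python
import json
from pathlib import Path


def load_state(state_file: Path) -> dict:
    """Return the saved state mapping, or {} on any malformed input."""
    if not state_file.exists():
        return {}
    try:
        state = json.loads(state_file.read_text(encoding="utf-8"))
    except (OSError, json.JSONDecodeError):
        return {}
    # A valid state file must be a JSON object; lists and scalars are rejected.
    return state if isinstance(state, dict) else {}
```

The `isinstance` check matters because `json.load` happily returns lists or strings, which would crash callers that iterate `.items()`.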
@@ -43,11 +43,15 @@ def _git(repo: str, *cmd, capture: bool = True) -> subprocess.CompletedProcess:
     )


+def _is_git_repo(repo_path: str) -> bool:
+    git_dir = os.path.join(repo_path, ".git")
+    return os.path.isdir(git_dir) or os.path.isfile(git_dir)
+
+
 def _check_repo(repo_path: str, do_fetch: bool = True):
     """Check a single repo, return (name, issues list)."""
     name = os.path.basename(repo_path)
-    git_dir = os.path.join(repo_path, ".git")
-    if not os.path.isdir(git_dir):
+    if not _is_git_repo(repo_path):
         return name, None  # Not a git repo

     issues = []
@@ -151,7 +155,7 @@ def run_fetch(ctx: FlowContext, args):
     fetched = 0
     for entry in sorted(os.listdir(projects_dir)):
         repo_path = os.path.join(projects_dir, entry)
-        if not os.path.isdir(os.path.join(repo_path, ".git")):
+        if not _is_git_repo(repo_path):
             continue
         ctx.console.info(f"Fetching {entry}...")
         result = _git(repo_path, "fetch", "--all", "--quiet")
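The `_is_git_repo` helper exists because `.git` is a directory in an ordinary clone but a plain file (containing a `gitdir:` pointer) in linked worktrees and submodules, so an `isdir`-only check silently skips those repos. A self-contained sketch of the same check, using only the stdlib:

```python
import os


def is_git_repo(repo_path: str) -> bool:
    """True for normal clones (.git directory) and worktrees/submodules (.git file)."""
    git_dir = os.path.join(repo_path, ".git")
    # os.path.isdir covers regular clones; os.path.isfile covers the
    # "gitdir: <path>" pointer file that `git worktree add` creates.
    return os.path.isdir(git_dir) or os.path.isfile(git_dir)
```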
@@ -23,6 +23,7 @@ class TargetConfig:
 class AppConfig:
     dotfiles_url: str = ""
     dotfiles_branch: str = "main"
+    dotfiles_pull_before_edit: bool = True
     projects_dir: str = "~/projects"
     container_registry: str = "registry.tomastm.com"
     container_tag: str = "latest"
@@ -39,6 +40,20 @@ def _get_value(mapping: Any, *keys: str, default: Any = None) -> Any:
     return default


+def _as_bool(value: Any, default: bool) -> bool:
+    if isinstance(value, bool):
+        return value
+    if isinstance(value, (int, float)):
+        return bool(value)
+    if isinstance(value, str):
+        normalized = value.strip().lower()
+        if normalized in {"1", "true", "yes", "y", "on"}:
+            return True
+        if normalized in {"0", "false", "no", "n", "off"}:
+            return False
+    return default
+
+
 def _parse_target_config(key: str, value: str) -> Optional[TargetConfig]:
     """Parse a target line from config-like syntax.
@@ -249,6 +264,15 @@ def load_config(path: Optional[Path] = None) -> AppConfig:
             default=merged.get("dotfiles_branch", cfg.dotfiles_branch),
         )
     )
+    cfg.dotfiles_pull_before_edit = _as_bool(
+        _get_value(
+            repository,
+            "pull_before_edit",
+            "pull-before-edit",
+            default=merged.get("dotfiles_pull_before_edit", cfg.dotfiles_pull_before_edit),
+        ),
+        cfg.dotfiles_pull_before_edit,
+    )
     cfg.projects_dir = str(
         _get_value(
             paths_section,
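The `_as_bool` coercion added above accepts YAML-style truthy/falsy spellings and falls back to the caller's default for anything unrecognized, so a typo in `pull-before-edit` never flips the setting unexpectedly. A standalone sketch of the same rules:

```python
from typing import Any


def as_bool(value: Any, default: bool) -> bool:
    """Coerce a config value to bool; unrecognized inputs keep the default."""
    if isinstance(value, bool):
        return value
    if isinstance(value, (int, float)):
        return bool(value)
    if isinstance(value, str):
        normalized = value.strip().lower()
        if normalized in {"1", "true", "yes", "y", "on"}:
            return True
        if normalized in {"0", "false", "no", "n", "off"}:
            return False
    # Anything else (None, typos, lists, ...) preserves the default.
    return default
```

Checking `bool` before `int` matters: `True` is an `int` subclass in Python, so the branches must be ordered this way.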
@@ -18,6 +18,7 @@ MANIFEST_FILE = CONFIG_DIR / "manifest.yaml"
 CONFIG_FILE = CONFIG_DIR / "config.yaml"

 DOTFILES_DIR = DATA_DIR / "dotfiles"
+MODULES_DIR = DATA_DIR / "modules"
 PACKAGES_DIR = DATA_DIR / "packages"
 SCRATCH_DIR = DATA_DIR / "scratch"
 PROJECTS_DIR = HOME / "projects"
@@ -33,5 +34,5 @@ DOTFILES_CONFIG = DOTFILES_FLOW_CONFIG / "config.yaml"

 def ensure_dirs() -> None:
     """Create all required directories if they don't exist."""
-    for d in (CONFIG_DIR, DATA_DIR, STATE_DIR, PACKAGES_DIR, SCRATCH_DIR):
+    for d in (CONFIG_DIR, DATA_DIR, STATE_DIR, MODULES_DIR, PACKAGES_DIR, SCRATCH_DIR):
         d.mkdir(parents=True, exist_ok=True)
@@ -1,12 +1,14 @@
 """Tests for flow.commands.bootstrap helpers and schema behavior."""

 import os
+from pathlib import Path

 import pytest

 from flow.commands.bootstrap import (
     _ensure_required_variables,
     _get_profiles,
+    _install_binary_package,
     _normalize_profile_package_entry,
     _resolve_package_manager,
     _resolve_package_spec,
@@ -141,3 +143,73 @@ def test_ensure_required_variables_accepts_vars(monkeypatch):
     env["USER_EMAIL"] = "a@b"
     env["TARGET_HOSTNAME"] = "devbox"
     _ensure_required_variables({"requires": ["USER_EMAIL", "TARGET_HOSTNAME"]}, env)
+
+
+class _FakeResponse:
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exc_type, exc, tb):
+        return False
+
+    def read(self):
+        return b"archive"
+
+
+def _patch_binary_download(monkeypatch, after_unpack=None):
+    monkeypatch.setattr(
+        "flow.commands.bootstrap.urllib.request.urlopen",
+        lambda *args, **kwargs: _FakeResponse(),
+    )
+
+    def _fake_unpack(_archive, extract_dir):
+        extracted = Path(extract_dir)
+        extracted.mkdir(parents=True, exist_ok=True)
+        if after_unpack:
+            after_unpack(extracted)
+
+    monkeypatch.setattr("flow.commands.bootstrap.shutil.unpack_archive", _fake_unpack)
+
+
+def test_install_binary_package_rejects_absolute_declared_path(monkeypatch, tmp_path, ctx):
+    absolute_item = tmp_path / "outside-bin"
+    absolute_item.write_text("binary")
+    _patch_binary_download(monkeypatch)
+    monkeypatch.setattr(
+        "flow.commands.bootstrap._copy_install_item",
+        lambda *args, **kwargs: pytest.fail("_copy_install_item should not be called"),
+    )
+    spec = {
+        "name": "demo",
+        "type": "binary",
+        "source": "https://example.invalid/demo",
+        "asset-pattern": "demo.tar.gz",
+        "install": {"bin": [str(absolute_item)]},
+    }
+    with pytest.raises(RuntimeError, match="must be relative"):
+        _install_binary_package(ctx, spec, {}, dry_run=False)
+
+
+def test_install_binary_package_rejects_parent_traversal_declared_path(monkeypatch, ctx):
+    def _after_unpack(extracted):
+        (extracted.parent / "escape-bin").write_text("binary")
+
+    _patch_binary_download(monkeypatch, after_unpack=_after_unpack)
+    monkeypatch.setattr(
+        "flow.commands.bootstrap._copy_install_item",
+        lambda *args, **kwargs: pytest.fail("_copy_install_item should not be called"),
+    )
+    spec = {
+        "name": "demo",
+        "type": "binary",
+        "source": "https://example.invalid/demo",
+        "asset-pattern": "demo.tar.gz",
+        "install": {"bin": ["../escape-bin"]},
+    }
+    with pytest.raises(RuntimeError, match="parent traversal"):
+        _install_binary_package(ctx, spec, {}, dry_run=False)
@@ -36,6 +36,43 @@ def test_complete_package_remove(monkeypatch):
     assert out == ["hello"]


+def test_list_manifest_packages_is_consistent_for_list_and_dict_forms(monkeypatch):
+    manifests = [
+        {
+            "packages": [
+                {"name": "neovim", "type": "binary"},
+                {"name": "ripgrep", "type": "pkg"},
+                {"name": "fzf", "type": "binary"},
+            ]
+        },
+        {
+            "packages": {
+                "neovim": {"type": "binary"},
+                "ripgrep": {"type": "pkg"},
+                "fzf": {"type": "binary"},
+            }
+        },
+    ]
+    monkeypatch.setattr(completion, "_safe_manifest", lambda: manifests.pop(0))
+    from_list = completion._list_manifest_packages()
+    from_dict = completion._list_manifest_packages()
+    assert from_list == ["fzf", "neovim"]
+    assert from_dict == ["fzf", "neovim"]
+
+
+def test_list_manifest_packages_uses_mapping_key_when_name_missing(monkeypatch):
+    monkeypatch.setattr(
+        completion,
+        "_safe_manifest",
+        lambda: {"packages": {"bat": {"type": "binary"}, "git": {"type": "pkg"}}},
+    )
+    assert completion._list_manifest_packages() == ["bat"]
+
+
 def test_complete_dotfiles_profile_value(monkeypatch):
     monkeypatch.setattr(completion, "_list_dotfiles_profiles", lambda: ["work", "personal"])
     out = completion.complete(["flow", "dotfiles", "link", "--profile", "w"], 5)
@@ -47,6 +84,17 @@ def test_complete_dotfiles_repo_subcommands():
     assert out == ["pull", "push"]


+def test_complete_dotfiles_modules_subcommands():
+    out = completion.complete(["flow", "dotfiles", "modules", "s"], 4)
+    assert out == ["sync"]
+
+
+def test_complete_dotfiles_modules_profile_value(monkeypatch):
+    monkeypatch.setattr(completion, "_list_dotfiles_profiles", lambda: ["work", "personal"])
+    out = completion.complete(["flow", "dotfiles", "modules", "list", "--profile", "w"], 6)
+    assert out == ["work"]
+
+
 def test_complete_enter_targets(monkeypatch):
     monkeypatch.setattr(completion, "_list_targets", lambda: ["personal@orb", "work@ec2"])
     out = completion.complete(["flow", "enter", "p"], 3)
@@ -10,6 +10,7 @@ def test_load_config_missing_path(tmp_path):
     assert isinstance(cfg, AppConfig)
     assert cfg.dotfiles_url == ""
     assert cfg.container_registry == "registry.tomastm.com"
+    assert cfg.dotfiles_pull_before_edit is True


 def test_load_config_merged_yaml(tmp_path):
@@ -17,6 +18,7 @@ def test_load_config_merged_yaml(tmp_path):
         "repository:\n"
         "  dotfiles-url: git@github.com:user/dots.git\n"
         "  dotfiles-branch: dev\n"
+        "  pull-before-edit: false\n"
         "paths:\n"
         "  projects-dir: ~/code\n"
         "defaults:\n"
@@ -31,6 +33,7 @@ def test_load_config_merged_yaml(tmp_path):
     cfg = load_config(tmp_path)
     assert cfg.dotfiles_url == "git@github.com:user/dots.git"
     assert cfg.dotfiles_branch == "dev"
+    assert cfg.dotfiles_pull_before_edit is False
     assert cfg.projects_dir == "~/code"
     assert cfg.container_registry == "my.registry.com"
     assert cfg.container_tag == "v1"
@@ -40,6 +43,16 @@ def test_load_config_merged_yaml(tmp_path):
     assert cfg.targets[1].ssh_identity == "~/.ssh/id_work"


+def test_load_config_pull_before_edit_string_true(tmp_path):
+    (tmp_path / "10-config.yaml").write_text(
+        "repository:\n"
+        "  pull-before-edit: yes\n"
+    )
+    cfg = load_config(tmp_path)
+    assert cfg.dotfiles_pull_before_edit is True
+
+
 def test_load_manifest_missing_path(tmp_path):
     result = load_manifest(tmp_path / "nonexistent")
     assert result == {}
@@ -2,7 +2,10 @@

 import pytest

-from flow.commands.dotfiles import _discover_packages, _resolve_edit_target, _walk_package
+from flow.commands.dotfiles import _collect_home_specs, _discover_packages, _resolve_edit_target, _walk_package
+from flow.core.config import AppConfig, FlowContext
+from flow.core.console import ConsoleLogger
+from flow.core.platform import PlatformInfo


 def _make_tree(tmp_path):
@@ -20,6 +23,15 @@ def _make_tree(tmp_path):
     return tmp_path


+def _ctx() -> FlowContext:
+    return FlowContext(
+        config=AppConfig(),
+        manifest={"profiles": {"work": {"os": "linux", "configs": {"skip": []}}}},
+        platform=PlatformInfo(os="linux", arch="x64", platform="linux-x64"),
+        console=ConsoleLogger(),
+    )
+
+
 def test_discover_packages_shared_only(tmp_path):
     tree = _make_tree(tmp_path)
     packages = _discover_packages(tree)
@@ -43,8 +55,7 @@ def test_discover_packages_profile_overrides_shared(tmp_path):
     (profile_zsh / ".zshrc").write_text("# work zsh")
     with pytest.raises(RuntimeError, match="Conflicting dotfile targets"):
-        from flow.commands.dotfiles import _collect_home_specs
-        _collect_home_specs(tree, tmp_path / "home", "work", set(), None)
+        _collect_home_specs(_ctx(), tree, tmp_path / "home", "work", set(), None)


 def test_walk_package_returns_relative_paths(tmp_path):
@@ -70,6 +81,24 @@ def test_resolve_edit_target_repo_path(tmp_path):
     assert target == tree / "_shared" / "zsh" / ".zshrc"


+def test_resolve_edit_target_rejects_parent_traversal(tmp_path):
+    tree = _make_tree(tmp_path / "repo")
+    outside = tmp_path / "outside.txt"
+    outside.write_text("secret")
+    target = _resolve_edit_target("../outside.txt", dotfiles_dir=tree)
+    assert target is None
+
+
+def test_resolve_edit_target_rejects_nested_repo_escape(tmp_path):
+    tree = _make_tree(tmp_path / "repo")
+    outside = tmp_path / "escape.txt"
+    outside.write_text("secret")
+    target = _resolve_edit_target("_shared/../../escape.txt", dotfiles_dir=tree)
+    assert target is None
+
+
 def test_resolve_edit_target_missing_returns_none(tmp_path):
     tree = _make_tree(tmp_path)
     assert _resolve_edit_target("does-not-exist", dotfiles_dir=tree) is None
@@ -1,6 +1,7 @@
-"""Tests for flat-layout dotfiles helpers and state format."""
+"""Tests for dotfiles link planning, root markers, and module sources."""

 import json
+import subprocess
 from pathlib import Path

 import pytest
@@ -8,11 +9,25 @@ import pytest
 from flow.commands.dotfiles import (
     LinkSpec,
     _collect_home_specs,
-    _collect_root_specs,
     _list_profiles,
     _load_link_specs_from_state,
+    _pull_requires_ack,
+    _resolved_package_source,
     _save_link_specs_to_state,
+    _sync_modules,
 )
+from flow.core.config import AppConfig, FlowContext
+from flow.core.console import ConsoleLogger
+from flow.core.platform import PlatformInfo
+
+
+def _ctx() -> FlowContext:
+    return FlowContext(
+        config=AppConfig(),
+        manifest={"profiles": {"work": {"os": "linux", "configs": {"skip": []}}}},
+        platform=PlatformInfo(os="linux", arch="x64", platform="linux-x64"),
+        console=ConsoleLogger(),
+    )


 def _make_flow_tree(tmp_path: Path) -> Path:
@@ -25,12 +40,9 @@ def _make_flow_tree(tmp_path: Path) -> Path:
     (flow_root / "work" / "git").mkdir(parents=True)
     (flow_root / "work" / "git" / ".gitconfig").write_text("profile")

-    (flow_root / "work" / "nvim").mkdir(parents=True)
-    (flow_root / "work" / "nvim" / ".config" / "nvim").mkdir(parents=True)
-    (flow_root / "work" / "nvim" / ".config" / "nvim" / "init.lua").write_text("-- init")
-
-    (flow_root / "_root" / "general" / "etc").mkdir(parents=True)
-    (flow_root / "_root" / "general" / "etc" / "hostname").write_text("devbox")
+    (flow_root / "_shared" / "dnsmasq" / "_root" / "etc").mkdir(parents=True)
+    (flow_root / "_shared" / "dnsmasq" / "_root" / "etc" / "hostname").write_text("devbox")
     return flow_root
@@ -47,14 +59,33 @@ def test_collect_home_specs_conflict_fails(tmp_path):
     home.mkdir()
     with pytest.raises(RuntimeError, match="Conflicting dotfile targets"):
-        _collect_home_specs(flow_root, home, "work", set(), None)
+        _collect_home_specs(_ctx(), flow_root, home, "work", set(), None)


-def test_collect_root_specs_maps_to_absolute_paths(tmp_path):
-    flow_root = _make_flow_tree(tmp_path)
-    specs = _collect_root_specs(flow_root, set(), include_root=True)
-    assert Path("/etc/hostname") in specs
-    assert specs[Path("/etc/hostname")].package == "_root/general"
+def test_collect_home_specs_maps_root_marker_to_absolute(tmp_path):
+    flow_root = tmp_path
+    (flow_root / "_shared" / "dnsmasq" / "_root" / "opt" / "homebrew" / "etc").mkdir(parents=True)
+    src = flow_root / "_shared" / "dnsmasq" / "_root" / "opt" / "homebrew" / "etc" / "dnsmasq.conf"
+    src.write_text("conf")
+    home = tmp_path / "home"
+    home.mkdir()
+    specs = _collect_home_specs(_ctx(), flow_root, home, None, set(), None)
+    assert Path("/opt/homebrew/etc/dnsmasq.conf") in specs
+    assert specs[Path("/opt/homebrew/etc/dnsmasq.conf")].source == src
+
+
+def test_collect_home_specs_skip_root_marker(tmp_path):
+    flow_root = tmp_path
+    (flow_root / "_shared" / "dnsmasq" / "_root" / "etc").mkdir(parents=True)
+    (flow_root / "_shared" / "dnsmasq" / "_root" / "etc" / "hostname").write_text("devbox")
+    home = tmp_path / "home"
+    home.mkdir()
+    specs = _collect_home_specs(_ctx(), flow_root, home, None, {"_root"}, None)
+    assert Path("/etc/hostname") not in specs


 def test_state_round_trip(tmp_path, monkeypatch):
@@ -92,3 +123,153 @@ def test_state_old_format_rejected(tmp_path, monkeypatch):
     with pytest.raises(RuntimeError, match="Unsupported linked state format"):
         _load_link_specs_from_state()
+
+
+def test_module_source_requires_sync(tmp_path):
+    package_dir = tmp_path / "_shared" / "nvim"
+    package_dir.mkdir(parents=True)
+    (package_dir / "_module.yaml").write_text(
+        "source: github:dummy/example\n"
+        "ref:\n"
+        "  branch: main\n"
+    )
+    with pytest.raises(RuntimeError, match="Run 'flow dotfiles sync' first"):
+        _resolved_package_source(_ctx(), "_shared/nvim", package_dir)
+
+
+def test_sync_modules_populates_cache_and_resolves_source(tmp_path, monkeypatch):
+    module_src = tmp_path / "module-src"
+    module_src.mkdir()
+    subprocess.run(["git", "init", "-b", "main", str(module_src)], check=True)
+    (module_src / ".config" / "nvim").mkdir(parents=True)
+    (module_src / ".config" / "nvim" / "init.lua").write_text("-- module")
+    subprocess.run(["git", "-C", str(module_src), "add", "."], check=True)
+    subprocess.run(
+        [
+            "git",
+            "-C",
+            str(module_src),
+            "-c",
+            "user.name=Flow Test",
+            "-c",
+            "user.email=flow-test@example.com",
+            "commit",
+            "-m",
+            "init module",
+        ],
+        check=True,
+    )
+    dotfiles = tmp_path / "dotfiles"
+    package_dir = dotfiles / "_shared" / "nvim"
+    package_dir.mkdir(parents=True)
+    (package_dir / "_module.yaml").write_text(
+        f"source: {module_src}\n"
+        "ref:\n"
+        "  branch: main\n"
+    )
+    (package_dir / "notes.txt").write_text("ignore me")
+    monkeypatch.setattr("flow.commands.dotfiles.DOTFILES_DIR", dotfiles)
+    monkeypatch.setattr("flow.commands.dotfiles.MODULES_DIR", tmp_path / "modules")
+    _sync_modules(_ctx(), verbose=False)
+    resolved = _resolved_package_source(_ctx(), "_shared/nvim", package_dir)
+    assert (resolved / ".config" / "nvim" / "init.lua").exists()
+
+
+def test_module_backed_link_specs_exclude_git_internals(tmp_path, monkeypatch):
+    module_src = tmp_path / "module-src"
+    module_src.mkdir()
+    subprocess.run(["git", "init", "-b", "main", str(module_src)], check=True)
+    (module_src / ".config" / "nvim").mkdir(parents=True)
+    (module_src / ".config" / "nvim" / "init.lua").write_text("-- module")
+    subprocess.run(["git", "-C", str(module_src), "add", "."], check=True)
+    subprocess.run(
+        [
+            "git",
+            "-C",
+            str(module_src),
+            "-c",
+            "user.name=Flow Test",
+            "-c",
+            "user.email=flow-test@example.com",
+            "commit",
+            "-m",
+            "init module",
+        ],
+        check=True,
+    )
+    dotfiles = tmp_path / "dotfiles"
+    package_dir = dotfiles / "_shared" / "nvim"
+    package_dir.mkdir(parents=True)
+    (package_dir / "_module.yaml").write_text(
+        f"source: {module_src}\n"
+        "ref:\n"
+        "  branch: main\n"
+    )
+    monkeypatch.setattr("flow.commands.dotfiles.DOTFILES_DIR", dotfiles)
+    monkeypatch.setattr("flow.commands.dotfiles.MODULES_DIR", tmp_path / "modules")
+    _sync_modules(_ctx(), verbose=False)
+    home = tmp_path / "home"
+    home.mkdir()
+    specs = _collect_home_specs(_ctx(), dotfiles, home, None, set(), None)
+    assert home / ".config" / "nvim" / "init.lua" in specs
+    assert not any(target.relative_to(home).parts[0] == ".git" for target in specs)
+
+
+def test_sync_modules_resolves_relative_source_independent_of_cwd(tmp_path, monkeypatch):
+    module_src = tmp_path / "module-src"
+    module_src.mkdir()
+    subprocess.run(["git", "init", "-b", "main", str(module_src)], check=True)
+    (module_src / ".config" / "nvim").mkdir(parents=True)
+    (module_src / ".config" / "nvim" / "init.lua").write_text("-- module")
+    subprocess.run(["git", "-C", str(module_src), "add", "."], check=True)
+    subprocess.run(
+        [
+            "git",
+            "-C",
+            str(module_src),
+            "-c",
+            "user.name=Flow Test",
+            "-c",
+            "user.email=flow-test@example.com",
+            "commit",
+            "-m",
+            "init module",
+        ],
+        check=True,
+    )
+    dotfiles = tmp_path / "dotfiles"
+    package_dir = dotfiles / "_shared" / "nvim"
+    package_dir.mkdir(parents=True)
+    relative_source = Path("../../../module-src")
+    (package_dir / "_module.yaml").write_text(
+        f"source: {relative_source}\n"
+        "ref:\n"
+        "  branch: main\n"
+    )
+    unrelated_cwd = tmp_path / "unrelated-cwd"
+    unrelated_cwd.mkdir()
+    monkeypatch.chdir(unrelated_cwd)
+    monkeypatch.setattr("flow.commands.dotfiles.DOTFILES_DIR", dotfiles)
+    monkeypatch.setattr("flow.commands.dotfiles.MODULES_DIR", tmp_path / "modules")
+    _sync_modules(_ctx(), verbose=False)
+    resolved = _resolved_package_source(_ctx(), "_shared/nvim", package_dir)
+    assert (resolved / ".config" / "nvim" / "init.lua").exists()
+
+
+def test_pull_requires_ack_only_on_real_updates():
+    assert _pull_requires_ack("Already up to date.\n", "") is False
+    assert _pull_requires_ack("Updating 123..456\n", "") is True
tests/test_package.py (new file)
@@ -0,0 +1,45 @@
+"""Tests for flow.commands.package."""
+
+from types import SimpleNamespace
+
+from flow.commands import package
+
+
+def test_load_installed_returns_empty_on_malformed_json(tmp_path, monkeypatch):
+    state_file = tmp_path / "installed.json"
+    state_file.write_text("{broken", encoding="utf-8")
+    monkeypatch.setattr(package, "INSTALLED_STATE", state_file)
+    assert package._load_installed() == {}
+
+
+def test_load_installed_returns_empty_on_non_mapping_json(tmp_path, monkeypatch):
+    state_file = tmp_path / "installed.json"
+    state_file.write_text('["neovim"]', encoding="utf-8")
+    monkeypatch.setattr(package, "INSTALLED_STATE", state_file)
+    assert package._load_installed() == {}
+
+
+class _ConsoleCapture:
+    def __init__(self):
+        self.info_messages = []
+
+    def info(self, message):
+        self.info_messages.append(message)
+
+    def table(self, _headers, _rows):
+        raise AssertionError("table() should not be called when installed state is malformed")
+
+
+def test_run_list_handles_malformed_installed_state(tmp_path, monkeypatch):
+    state_file = tmp_path / "installed.json"
+    state_file.write_text("{oops", encoding="utf-8")
+    monkeypatch.setattr(package, "INSTALLED_STATE", state_file)
+    monkeypatch.setattr(package, "_get_definitions", lambda _ctx: {})
+    ctx = SimpleNamespace(console=_ConsoleCapture())
+    package.run_list(ctx, SimpleNamespace(all=False))
+    assert ctx.console.info_messages == ["No packages installed."]
@@ -10,6 +10,7 @@ from flow.core.paths import (
     INSTALLED_STATE,
     LINKED_STATE,
     MANIFEST_FILE,
+    MODULES_DIR,
     PACKAGES_DIR,
     SCRATCH_DIR,
     STATE_DIR,
@@ -45,6 +46,10 @@ def test_packages_dir():
     assert PACKAGES_DIR == DATA_DIR / "packages"
 
 
+def test_modules_dir():
+    assert MODULES_DIR == DATA_DIR / "modules"
+
+
 def test_scratch_dir():
     assert SCRATCH_DIR == DATA_DIR / "scratch"
@@ -58,6 +63,7 @@ def test_ensure_dirs(tmp_path, monkeypatch):
     monkeypatch.setattr("flow.core.paths.CONFIG_DIR", tmp_path / "config")
     monkeypatch.setattr("flow.core.paths.DATA_DIR", tmp_path / "data")
     monkeypatch.setattr("flow.core.paths.STATE_DIR", tmp_path / "state")
+    monkeypatch.setattr("flow.core.paths.MODULES_DIR", tmp_path / "data" / "modules")
     monkeypatch.setattr("flow.core.paths.PACKAGES_DIR", tmp_path / "data" / "packages")
     monkeypatch.setattr("flow.core.paths.SCRATCH_DIR", tmp_path / "data" / "scratch")
@@ -66,5 +72,6 @@ def test_ensure_dirs(tmp_path, monkeypatch):
     assert (tmp_path / "config").is_dir()
     assert (tmp_path / "data").is_dir()
     assert (tmp_path / "state").is_dir()
+    assert (tmp_path / "data" / "modules").is_dir()
     assert (tmp_path / "data" / "packages").is_dir()
     assert (tmp_path / "data" / "scratch").is_dir()
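The hunks above add `MODULES_DIR` to the directories created on startup and assert that `ensure_dirs` creates it. An idempotent helper of that shape (a minimal sketch assuming the paths are plain `pathlib.Path` objects, not flow's actual implementation) could look like:

```python
from pathlib import Path


def ensure_dirs(*dirs: Path) -> None:
    # parents=True creates intermediate directories as needed;
    # exist_ok=True makes repeated calls a no-op.
    for directory in dirs:
        directory.mkdir(parents=True, exist_ok=True)
```

Because the call is idempotent, the tests can invoke it unconditionally and simply assert `is_dir()` on each expected path.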

tests/test_sync.py (new file, 88 lines)
View File

@@ -0,0 +1,88 @@
"""Tests for flow.commands.sync."""

from types import SimpleNamespace

import pytest

from flow.commands import sync


def _git_clean_repo(_repo, *cmd, capture=True):
    _ = capture
    if cmd == ("rev-parse", "--abbrev-ref", "HEAD"):
        return SimpleNamespace(returncode=0, stdout="main\n")
    if cmd == ("diff", "--quiet"):
        return SimpleNamespace(returncode=0, stdout="")
    if cmd == ("diff", "--cached", "--quiet"):
        return SimpleNamespace(returncode=0, stdout="")
    if cmd == ("ls-files", "--others", "--exclude-standard"):
        return SimpleNamespace(returncode=0, stdout="")
    if cmd == ("rev-parse", "--abbrev-ref", "main@{u}"):
        return SimpleNamespace(returncode=0, stdout="origin/main\n")
    if cmd == ("rev-list", "--oneline", "main@{u}..main"):
        return SimpleNamespace(returncode=0, stdout="")
    if cmd == ("for-each-ref", "--format=%(refname:short)", "refs/heads"):
        return SimpleNamespace(returncode=0, stdout="main\n")
    raise AssertionError(f"Unexpected git command: {cmd!r}")


@pytest.mark.parametrize("git_style", ["dir", "file"])
def test_check_repo_detects_git_dir_and_worktree_file(tmp_path, monkeypatch, git_style):
    repo = tmp_path / "repo"
    repo.mkdir()
    if git_style == "dir":
        (repo / ".git").mkdir()
    else:
        (repo / ".git").write_text("gitdir: /tmp/worktrees/repo\n", encoding="utf-8")
    monkeypatch.setattr(sync, "_git", _git_clean_repo)
    name, issues = sync._check_repo(str(repo), do_fetch=False)
    assert name == "repo"
    assert issues == []
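The parametrized test relies on a git detail: a normal clone has a `.git` directory, while linked worktrees and submodules have a `.git` file containing a `gitdir:` pointer. A detection helper covering both cases (a sketch of the behavior under test, not flow's actual code) might be:

```python
from pathlib import Path


def looks_like_git_checkout(path: Path) -> bool:
    # Normal clones have a .git directory; linked worktrees and
    # submodules have a .git *file* containing "gitdir: <path>".
    git_entry = path / ".git"
    return git_entry.is_dir() or git_entry.is_file()
```

Checking only `is_dir()` is the common bug this test guards against: it silently skips worktree-style checkouts.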


class _ConsoleCapture:
    def __init__(self):
        self.info_messages = []
        self.error_messages = []
        self.success_messages = []

    def info(self, message):
        self.info_messages.append(message)

    def error(self, message):
        self.error_messages.append(message)

    def success(self, message):
        self.success_messages.append(message)


def test_run_fetch_includes_worktree_style_repo(tmp_path, monkeypatch):
    projects = tmp_path / "projects"
    projects.mkdir()
    worktree_repo = projects / "worktree"
    worktree_repo.mkdir()
    (worktree_repo / ".git").write_text("gitdir: /tmp/worktrees/worktree\n", encoding="utf-8")
    (projects / "non_git").mkdir()
    calls = []

    def _git_fetch(repo, *cmd, capture=True):
        _ = capture
        calls.append((repo, cmd))
        return SimpleNamespace(returncode=0, stdout="")

    monkeypatch.setattr(sync, "_git", _git_fetch)
    console = _ConsoleCapture()
    ctx = SimpleNamespace(config=SimpleNamespace(projects_dir=str(projects)), console=console)
    sync.run_fetch(ctx, SimpleNamespace())
    assert calls == [(str(worktree_repo), ("fetch", "--all", "--quiet"))]
    assert console.success_messages == ["All remotes fetched."]
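The fetch test expects `run_fetch` to scan the direct children of `projects_dir`, skip non-repos, and fetch everything with a `.git` entry (directory or worktree-style file). A sketch of that scan logic, under those assumptions and separate from flow's real implementation, could be:

```python
import subprocess
from pathlib import Path


def find_git_repos(projects_dir: str) -> list[Path]:
    # Direct children only, in sorted order; .exists() matches both
    # .git directories and worktree-style .git files.
    return [
        child
        for child in sorted(Path(projects_dir).iterdir())
        if (child / ".git").exists()
    ]


def fetch_all(projects_dir: str) -> None:
    for repo in find_git_repos(projects_dir):
        subprocess.run(
            ["git", "-C", str(repo), "fetch", "--all", "--quiet"],
            check=False,
        )
```

Keeping discovery separate from the `git fetch` subprocess call is what makes the behavior easy to test with a fake `_git`, as the test above does.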