working version

README.md (318 lines changed)
@@ -1,24 +1,18 @@
# flow

`flow` is a CLI for managing development instances, containers, dotfiles, bootstrap profiles, and
binary packages.

This repository contains the Python implementation of the tool and its command modules.
`flow` is a CLI for managing development instances, containers, dotfiles, and host bootstrap.

## What is implemented

- Instance access via `flow enter`
- Container lifecycle commands under `flow dev` (`create`, `exec`, `connect`, `list`, `stop`,
  `remove`, `respawn`)
- Dotfiles management (`dotfiles` / `dot`)
- Bootstrap planning and execution (`bootstrap` / `setup` / `provision`)
- Binary package installation from manifest definitions (`package` / `pkg`)
- Multi-repo sync checks (`sync`)
- Container lifecycle under `flow dev`
- Dotfiles repo management (`flow dotfiles`)
- Bootstrap provisioning (`flow bootstrap`)
- Package installs from unified manifest definitions (`flow package`)
- Project sync checks (`flow sync`)

## Installation

Build and install a standalone binary (no pip install required for use):

```bash
make build
make install-local
```
@@ -26,241 +20,163 @@ make install-local

This installs `flow` to `~/.local/bin/flow`.

## Configuration
## Core behavior

`flow` uses XDG paths by default:
### Security model

- `~/.config/devflow/config`
- `~/.config/devflow/manifest.yaml`
- `~/.local/share/devflow/`
- `~/.local/state/devflow/`
- `flow` must run as a regular user (root/sudo invocation is rejected).
- At startup, `flow` refreshes sudo credentials once (`sudo -v`) for privileged steps.
- Package `post-install` hooks run without sudo by default.
- A package hook can use sudo only when `allow_sudo: true` is explicitly set.
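As a rough illustration of the hook policy above (an assumption about how enforcement might look, not the project's actual implementation; the substring check is deliberately naive):

```python
import subprocess

def run_post_install_hook(script: str, allow_sudo: bool = False) -> None:
    """Run a package post-install hook under the documented sudo policy.

    Hedged sketch: any mention of `sudo` in the hook body is rejected
    unless the package explicitly sets `allow_sudo: true`.
    """
    if "sudo" in script and not allow_sudo:
        raise PermissionError("hook uses sudo but allow_sudo is not set")
    # Hooks run as the invoking (non-root) user.
    subprocess.run(["sh", "-c", script], check=True)
```

A hook like `sudo groupadd docker` would therefore fail fast unless its package definition opts in.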

### `config` (INI)
### Config location and merge rules

```ini
[repository]
dotfiles_url = git@github.com:you/dotfiles.git
dotfiles_branch = main
`flow` loads all YAML files from:

[paths]
projects_dir = ~/projects
1. `~/.local/share/flow/dotfiles/_shared/flow/.config/flow/` (self-hosted, if present)
2. `~/.config/flow/` (local fallback)

[defaults]
container_registry = registry.tomastm.com
container_tag = latest
tmux_session = default
Files are read alphabetically (`*.yaml` and `*.yml`) and merged at top level.
If the same top-level key appears in multiple files, the later filename wins.
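The merge rule can be sketched like this (illustrative only; plain dicts stand in for the parsed YAML files, already ordered by filename):

```python
def merge_top_level(docs):
    """Merge parsed YAML documents at the top level only.

    `docs` must be ordered alphabetically by filename; when the same
    top-level key appears in two documents, the later one wins wholesale
    (the value is replaced, not deep-merged).
    """
    merged = {}
    for doc in docs:
        merged.update(doc or {})
    return merged
```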

[targets]
# Format A: namespace = platform ssh_host [ssh_identity]
personal = orb personal.orb

### Dotfiles layout (flat with reserved dirs)

# Format B: namespace@platform = ssh_host [ssh_identity]
work@ec2 = work.internal ~/.ssh/id_work
```

Inside your dotfiles repo root:

```text
_shared/
  flow/
    .config/flow/
      config.yaml
      packages.yaml
      profiles.yaml
  git/
    .gitconfig
_root/
  etc/
    hostname
general/
linux-auto/
  nvim/
    .config/nvim/init.lua
```


## Manifest format

- `_shared/`: linked for all profiles
- `_root/`: linked to absolute paths (via sudo), e.g. `_root/etc/hostname -> /etc/hostname`
- every other directory at this level is a profile name
- when `_shared` and a profile conflict on the same target file, the profile wins
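A minimal sketch of that precedence rule (hypothetical helper; target paths map to source paths, and profile entries simply replace `_shared` ones):

```python
def resolve_sources(shared, profile):
    """Combine _shared and profile link maps; the profile wins on conflict."""
    resolved = dict(shared)   # start from the _shared entries
    resolved.update(profile)  # profile overrides the same target file
    return resolved
```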

The manifest is YAML with these top-level sections used by the current code:
## Manifest model

- `profiles` for bootstrap profiles
- `binaries` for package definitions
- `package-map` for cross-package-manager name mapping
Top-level keys:

`environments` is no longer supported.
- `profiles`
- `packages`
- optional global settings like `repository`, `paths`, `defaults`, `targets`

Example:
`environments` is not supported.

### Packages (unified)

```yaml
profiles:
  linux-vm:
    os: linux
    hostname: "$HOSTNAME"
    shell: zsh
    locale: en_US.UTF-8
    requires: [HOSTNAME]
    packages:
      standard: [git, tmux, zsh, fd]
      binary: [neovim]
    ssh_keygen:
      - type: ed25519
        comment: "$USER@$HOSTNAME"
    runcmd:
      - mkdir -p ~/projects
packages:
  - name: fd
    type: pkg
    sources:
      apt: fd-find
      dnf: fd-find
      brew: fd

package-map:
  fd:
    apt: fd-find
    dnf: fd-find
    brew: fd
  - name: wezterm
    type: cask
    sources:
      brew: wezterm

binaries:
  neovim:
  - name: neovim
    type: binary
    source: github:neovim/neovim
    version: "0.10.4"
    asset-pattern: "nvim-{{os}}-{{arch}}.tar.gz"
    platform-map:
      linux-amd64: { os: linux, arch: x86_64 }
      linux-x64: { os: linux, arch: x64 }
      linux-arm64: { os: linux, arch: arm64 }
      macos-arm64: { os: macos, arch: arm64 }
    install-script: |
      curl -fL "{{downloadUrl}}" -o /tmp/nvim.tar.gz
      tar -xzf /tmp/nvim.tar.gz -C /tmp
      rm -rf ~/.local/bin/nvim
      cp /tmp/nvim-*/bin/nvim ~/.local/bin/nvim
      darwin-arm64: { os: macos, arch: arm64 }
    extract-dir: "nvim-{{os}}64"
    install:
      bin: [bin/nvim]
      share: [share/nvim]
      man: [share/man/man1/nvim.1]
      lib: [lib/libnvim.so]
```

### Profile package syntaxes

All are supported in one profile list:

```yaml
profiles:
  macos-dev:
    os: macos
    packages:
      - git
      - cask/wezterm
      - binary/neovim
      - name: docker
        allow_sudo: true
        post-install: |
          sudo groupadd docker || true
          sudo usermod -aG docker $USER
```

### Templates

- `{{ env.VAR_NAME }}`
- `{{ version }}`
- `{{ os }}`
- `{{ arch }}`
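A toy renderer for these placeholders (an assumption about the semantics; the real template engine may behave differently, e.g. around unknown keys):

```python
import os
import re

def render(template: str, context: dict) -> str:
    """Expand {{ version }}/{{ os }}/{{ arch }} from context and {{ env.X }} from the environment."""
    def substitute(match):
        key = match.group(1).strip()
        if key.startswith("env."):
            return os.environ.get(key[4:], "")
        # Unknown keys are left untouched rather than erased.
        return str(context.get(key, match.group(0)))
    return re.sub(r"\{\{\s*(.+?)\s*\}\}", substitute, template)
```

For example, the `asset-pattern` above expands under `{"os": "linux", "arch": "x64"}` to `nvim-linux-x64.tar.gz`.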

### Bootstrap profile features

- `os` is required (`linux` or `macos`)
- `package-manager` optional (auto-detected if omitted)
- default locale is `en_US.UTF-8`
- shell auto-install + `chsh` when `shell:` is declared and missing
- `requires` validation for required env vars
- `ssh-keygen` definitions
- `runcmd` (runs after package installation)
- automatic config linking (`_shared` + profile + `_root`)
- `post-link` hook (runs after symlink phase)
- config skip patterns:
  - package names (e.g. `nvim`)
  - `_shared`
  - `_profile`
  - `_root`

## Command overview

### Enter instances

```bash
flow enter personal@orb
flow enter root@personal@orb
flow enter personal@orb --dry-run
```

If your local terminal uses `xterm-ghostty` or `wezterm`, `flow enter` shows a terminfo warning and
a manual fix command before connecting. `flow` never installs terminfo on the target automatically.

### Containers

```bash
flow dev create api -i tm0/node -p ~/projects/api
flow dev connect api
flow dev exec api -- npm test
flow dev list
flow dev stop api
flow dev remove api
```

### Dotfiles

```bash
flow dotfiles init --repo git@github.com:you/dotfiles.git
flow dotfiles link
flow dotfiles link --profile linux-auto
flow dotfiles status
flow dotfiles relink
flow dotfiles clean --dry-run
flow dotfiles repo status
flow dotfiles repo pull --relink
flow dotfiles repo push
```

### Bootstrap

```bash
flow bootstrap list
flow bootstrap show linux-vm
flow bootstrap packages --profile linux-vm
flow bootstrap packages --profile linux-vm --resolved
flow bootstrap run --profile linux-vm --var HOSTNAME=devbox
flow bootstrap run --profile linux-vm --dry-run
```
flow bootstrap show linux-auto
flow bootstrap run --profile linux-auto --var USER_EMAIL=you@example.com

`flow bootstrap` auto-detects the package manager (`brew`, `apt`, `dnf`) when
`package-manager` is not set in a profile.

### Packages

```bash
flow package install neovim
flow package list
flow package list --all
flow package remove neovim
```

### Sync

```bash
flow sync check
flow sync check --no-fetch
flow sync fetch
flow sync summary
```

### Completion

```bash
flow completion install-zsh
flow completion zsh
```

## Self-hosted config priority

When present, `flow` prefers config from a linked dotfiles package:

1. `~/.local/share/devflow/dotfiles/flow/.config/flow/config`
2. `~/.config/devflow/config`

And for the manifest:

1. `~/.local/share/devflow/dotfiles/flow/.config/flow/manifest.yaml`
2. `~/.config/devflow/manifest.yaml`

Passing an explicit file path to internal loaders bypasses this cascade.

## State format policy

`flow` currently supports only the v2 dotfiles link state format (`linked.json`). Older state
formats are intentionally not supported.
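For illustration, a loader enforcing this policy might look like the following (hedged sketch; the function name is an assumption, and only the v2 shape `{"version": 2, "links": {...}}` is accepted, never migrated):

```python
import json
from pathlib import Path

def load_link_state(path: Path) -> dict:
    """Load linked.json, accepting only the v2 schema; older formats are rejected."""
    if not path.exists():
        return {"version": 2, "links": {}}
    state = json.loads(path.read_text(encoding="utf-8"))
    if state.get("version") != 2 or not isinstance(state.get("links"), dict):
        raise RuntimeError("Unsupported linked state format; remove linked.json and relink")
    return state
```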

## CLI behavior

- User errors return non-zero exit codes.
- External command failures are surfaced as concise one-line errors (no traceback spam).
- `Ctrl+C` exits with code `130`.

## Zsh completion

Recommended one-shot install:

```bash
flow completion install-zsh
```

Manual install (equivalent):

```bash
mkdir -p ~/.zsh/completions
flow completion zsh > ~/.zsh/completions/_flow
```

Then ensure your `.zshrc` includes:

```bash
fpath=(~/.zsh/completions $fpath)
autoload -Uz compinit && compinit
```

Completion is dynamic and pulls values from your current config/manifest/state (for example
bootstrap profiles, package names, dotfiles packages, and configured `enter` targets).

## Development

Binary build (maintainers):

```bash
python3 -m pip install pyinstaller
make build
make install-local
```

Useful targets:

```bash
make clean
```

Run tests:

```bash
python3 -m pytest
```

Local development setup:

```bash
python3 -m venv .venv
.venv/bin/pip install -e ".[dev]"
.venv/bin/pytest
python3 -m pytest
```

@@ -1,27 +1,27 @@
# Example working scenario

This folder contains a complete, practical dotfiles + bootstrap setup that exercises most `flow`
features.
This folder contains a complete dotfiles + bootstrap setup for the current `flow` schema.

## What this example shows

- Dotfiles repository layout with `common/` packages and `profiles/work/` overrides
- Self-hosted `flow` config + manifest in `common/flow/.config/flow/`
- Bootstrap profiles for Linux (auto PM detection), Ubuntu (`apt`), Fedora (`dnf`), and macOS
  (`brew`)
- Bootstrap actions: `requires`, `hostname`, `locale`, `shell`, package install, binary install,
  `ssh_keygen`, `configs`, and `runcmd`
- Package name mapping via `package-map` (`apt`/`dnf`/`brew`)
- Dotfiles repo workflows: `status`, `pull`, `push`, `sync --relink`, and `edit`
- Flat repo-root layout with reserved dirs:
  - `_shared/` (shared configs)
  - `_root/` (root-targeted configs)
  - profile dirs (`linux-auto/`, `macos-dev/`)
- Unified YAML config under `_shared/flow/.config/flow/*.yaml`
- Profile package list syntax: string, type prefix, and object entries
- Binary install definition with `asset-pattern`, `platform-map`, `extract-dir`, and `install`
- Required env vars, templating, SSH keygen, runcmd, post-link, and config skip patterns

## Layout

- `dotfiles-repo/common/flow/.config/flow/config` example `flow` config
- `dotfiles-repo/common/flow/.config/flow/manifest.yaml` profiles + package map + binaries
- `dotfiles-repo/common/zsh/.zshrc`, `common/git/.gitconfig`, `common/tmux/.tmux.conf`
- `dotfiles-repo/common/nvim/.config/nvim/init.lua`
- `dotfiles-repo/common/bin/.local/bin/flow-hello`
- `dotfiles-repo/profiles/work/git/.gitconfig` and `profiles/work/zsh/.zshrc` overrides
- `dotfiles-repo/_shared/flow/.config/flow/config.yaml`
- `dotfiles-repo/_shared/flow/.config/flow/packages.yaml`
- `dotfiles-repo/_shared/flow/.config/flow/profiles.yaml`
- `dotfiles-repo/_shared/...`
- `dotfiles-repo/_root/...`
- `dotfiles-repo/linux-auto/...`
- `dotfiles-repo/macos-dev/...`

## Quick start

@@ -35,7 +35,7 @@ Initialize and link dotfiles:

```bash
flow dotfiles init --repo "$EXAMPLE_REPO"
flow dotfiles link
flow dotfiles link --profile linux-auto
flow dotfiles status
```

@@ -43,15 +43,15 @@ Check repo commands:

```bash
flow dotfiles repo status
flow dotfiles repo pull --relink
flow dotfiles repo pull --relink --profile linux-auto
flow dotfiles repo push
```

Edit package or file/path targets:

```bash
flow dotfiles edit zsh --no-commit
flow dotfiles edit common/flow/.config/flow/manifest.yaml --no-commit
flow dotfiles edit git --no-commit
flow dotfiles edit _shared/flow/.config/flow/profiles.yaml --no-commit
```

Inspect bootstrap profiles and package resolution:
@@ -59,20 +59,13 @@ Inspect bootstrap profiles and package resolution:
```bash
flow bootstrap list
flow bootstrap packages --resolved
flow bootstrap packages --profile fedora-dev --resolved
flow bootstrap packages --profile linux-auto --resolved
flow bootstrap show linux-auto
```

Run bootstrap in dry-run mode:
Run bootstrap dry-run:

```bash
flow bootstrap run --profile linux-auto --var TARGET_HOSTNAME=devbox --var USER_EMAIL=you@example.com --dry-run
flow bootstrap run --profile work-linux --var WORK_EMAIL=you@company.com --dry-run
flow bootstrap run --profile macos-dev --dry-run
```

## Manifest notes

- `linux-auto` omits `package-manager` to demonstrate auto-detection.
- `ubuntu-dev` uses the legacy `packages.package` key to show compatibility.
- `package-map` rewrites logical names like `fd` and `python-dev` per package manager.
- If a mapping is missing for the selected manager, `flow` uses the original package name and warns.
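That fallback rule can be sketched as follows (hypothetical helper mirroring the documented behavior, not the project's actual code):

```python
def resolve_package_name(name, manager, package_map, warn=print):
    """Map a logical package name for a package manager; fall back to the raw name with a warning."""
    mapped = package_map.get(name, {}).get(manager)
    if mapped is None:
        warn(f"no {manager} mapping for '{name}'; using it unchanged")
        return name
    return mapped
```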

example/dotfiles-repo/_root/etc/hostname (new file)
@@ -0,0 +1 @@
{{ env.TARGET_HOSTNAME }}
@@ -0,0 +1,3 @@
#!/usr/bin/env sh

echo "custom root script"
example/dotfiles-repo/_shared/flow/.config/flow/config.yaml (new file)
@@ -0,0 +1,15 @@
repository:
  dotfiles-url: /ABSOLUTE/PATH/TO/flow-cli/example/dotfiles-repo
  dotfiles-branch: main

paths:
  projects-dir: ~/projects

defaults:
  container-registry: registry.example.com
  container-tag: latest
  tmux-session: default

targets:
  personal: orb personal.orb
  work@ec2: work.internal ~/.ssh/id_work
@@ -0,0 +1,37 @@
packages:
  - name: fd
    type: pkg
    sources:
      apt: fd-find
      dnf: fd-find
      brew: fd

  - name: ripgrep
    type: pkg
    sources:
      apt: ripgrep
      dnf: ripgrep
      brew: ripgrep

  - name: wezterm
    type: cask
    sources:
      brew: wezterm

  - name: neovim
    type: binary
    source: github:neovim/neovim
    version: "0.10.4"
    asset-pattern: "nvim-{{os}}-{{arch}}.tar.gz"
    platform-map:
      linux-x64: { os: linux, arch: x64 }
      linux-arm64: { os: linux, arch: arm64 }
      darwin-arm64: { os: macos, arch: arm64 }
    extract-dir: "nvim-{{os}}64"
    install:
      bin: [bin/nvim]
      share: [share/nvim]
      man: [share/man/man1/nvim.1]

  - name: docker
    type: pkg
@@ -0,0 +1,39 @@
profiles:
  linux-auto:
    os: linux
    requires: [TARGET_HOSTNAME, USER_EMAIL]
    hostname: "{{ env.TARGET_HOSTNAME }}"
    shell: zsh
    packages:
      - git
      - tmux
      - zsh
      - fd
      - ripgrep
      - binary/neovim
      - name: docker
        allow_sudo: true
        post-install: |
          sudo groupadd docker || true
          sudo usermod -aG docker $USER
    ssh-keygen:
      - type: ed25519
        filename: id_ed25519
        comment: "{{ env.USER_EMAIL }}"
    configs:
      skip: [tmux]
    runcmd:
      - mkdir -p ~/projects
      - git config --global user.email "{{ env.USER_EMAIL }}"
    post-link: |
      echo "All configs linked."
      echo "Restart your shell to apply changes."

  macos-dev:
    os: macos
    shell: zsh
    packages:
      - git
      - tmux
      - cask/wezterm
      - binary/neovim
example/dotfiles-repo/_shared/zsh/.zshrc (new file)
@@ -0,0 +1,4 @@
export EDITOR=vim
export PATH="$HOME/.local/bin:$PATH"

alias ll='ls -lah'
@@ -1,15 +0,0 @@
[repository]
dotfiles_url = /ABSOLUTE/PATH/TO/flow-cli/example/dotfiles-repo
dotfiles_branch = main

[paths]
projects_dir = ~/projects

[defaults]
container_registry = registry.example.com
container_tag = latest
tmux_session = default

[targets]
personal = orb personal.orb
work@ec2 = work.internal ~/.ssh/id_work
@@ -1,2 +0,0 @@
export FLOW_ENV=example
export FLOW_EDITOR=vim
@@ -1,96 +0,0 @@
profiles:
  linux-auto:
    os: linux
    requires: [TARGET_HOSTNAME, USER_EMAIL]
    hostname: "$TARGET_HOSTNAME"
    locale: en_US.UTF-8
    shell: zsh
    packages:
      standard: [git, tmux, zsh, fd, ripgrep, python-dev]
      binary: [neovim, lazygit]
    ssh_keygen:
      - type: ed25519
        filename: id_ed25519
        comment: "$USER_EMAIL"
    configs: [flow, zsh, git, tmux, nvim, bin]
    runcmd:
      - mkdir -p ~/projects
      - git config --global user.email "$USER_EMAIL"

  ubuntu-dev:
    os: linux
    package-manager: apt
    packages:
      package: [git, tmux, zsh, fd, ripgrep, python-dev]
      binary: [neovim]
    configs: [flow, zsh, git, tmux]

  fedora-dev:
    os: linux
    package-manager: dnf
    packages:
      standard: [git, tmux, zsh, fd, ripgrep, python-dev]
      binary: [neovim]
    configs: [flow, zsh, git, tmux]

  macos-dev:
    os: macos
    package-manager: brew
    packages:
      standard: [git, tmux, zsh, fd, ripgrep]
      cask: [wezterm]
      binary: [neovim]
    configs: [flow, zsh, git, nvim]

  work-linux:
    os: linux
    package-manager: apt
    requires: [WORK_EMAIL]
    packages:
      standard: [git, tmux, zsh]
    configs: [git, zsh]
    runcmd:
      - git config --global user.email "$WORK_EMAIL"

package-map:
  fd:
    apt: fd-find
    dnf: fd-find
    brew: fd
  python-dev:
    apt: python3-dev
    dnf: python3-devel
    brew: python
  ripgrep:
    apt: ripgrep
    dnf: ripgrep
    brew: ripgrep

binaries:
  neovim:
    source: github:neovim/neovim
    version: "0.10.4"
    asset-pattern: "nvim-{{os}}-{{arch}}.tar.gz"
    platform-map:
      linux-amd64: { os: linux, arch: x86_64 }
      linux-arm64: { os: linux, arch: arm64 }
      macos-arm64: { os: macos, arch: arm64 }
    install-script: |
      curl -fL "{{downloadUrl}}" -o /tmp/nvim.tar.gz
      tar -xzf /tmp/nvim.tar.gz -C /tmp
      mkdir -p ~/.local/bin
      cp /tmp/nvim-*/bin/nvim ~/.local/bin/nvim

  lazygit:
    source: github:jesseduffield/lazygit
    version: "0.44.1"
    asset-pattern: "lazygit_{{version}}_{{os}}_{{arch}}.tar.gz"
    platform-map:
      linux-amd64: { os: Linux, arch: x86_64 }
      linux-arm64: { os: Linux, arch: arm64 }
      macos-arm64: { os: Darwin, arch: arm64 }
    install-script: |
      curl -fL "{{downloadUrl}}" -o /tmp/lazygit.tar.gz
      tar -xzf /tmp/lazygit.tar.gz -C /tmp
      mkdir -p ~/.local/bin
      cp /tmp/lazygit ~/.local/bin/lazygit
@@ -1,8 +0,0 @@
export EDITOR=vim
export PATH="$HOME/.local/bin:$PATH"

alias ll='ls -lah'

if [ -f "$HOME/.config/flow/env.sh" ]; then
  . "$HOME/.config/flow/env.sh"
fi
example/dotfiles-repo/linux-auto/git/.gitconfig (new file)
@@ -0,0 +1,3 @@
[user]
  name = Example Linux User
  email = linux@example.com
@@ -3,5 +3,3 @@ export PATH="$HOME/.local/bin:$PATH"

alias ll='ls -lah'
alias gs='git status -sb'

export WORK_MODE=1
@@ -1,6 +0,0 @@
[user]
  name = Example Work User
  email = work@example.com

[url "git@github.com:work/"]
  insteadOf = https://github.com/work/
@@ -1,6 +1,8 @@
"""CLI entry point — argparse routing and context creation."""

import argparse
import os
import shutil
import subprocess
import sys

@@ -14,6 +16,27 @@ from flow.core.platform import detect_platform
COMMAND_MODULES = [enter, container, dotfiles, bootstrap, package, sync, completion]


def _ensure_non_root(console: ConsoleLogger) -> None:
    if os.geteuid() == 0:
        console.error("flow must be run as a regular user (not root/sudo)")
        sys.exit(1)


def _refresh_sudo_credentials(console: ConsoleLogger) -> None:
    if os.environ.get("FLOW_SKIP_SUDO_REFRESH") == "1":
        return

    if not shutil.which("sudo"):
        console.error("sudo is required but was not found in PATH")
        sys.exit(1)

    try:
        subprocess.run(["sudo", "-v"], check=True)
    except subprocess.CalledProcessError:
        console.error("Failed to refresh sudo credentials")
        sys.exit(1)


def main():
    parser = argparse.ArgumentParser(
        prog="flow",
@@ -34,6 +57,9 @@ def main():
        parser.print_help()
        sys.exit(0)

    console = ConsoleLogger()
    _ensure_non_root(console)

    if args.command == "completion":
        handler = getattr(args, "handler", None)
        if handler:
@@ -43,7 +69,7 @@ def main():
            return

    ensure_dirs()
    console = ConsoleLogger()
    _refresh_sudo_credentials(console)

    try:
        platform_info = detect_platform()

File diff suppressed because it is too large
@@ -115,7 +115,17 @@ def _list_bootstrap_profiles() -> List[str]:

def _list_manifest_packages() -> List[str]:
    manifest = _safe_manifest()
    return sorted(manifest.get("binaries", {}).keys())
    packages = manifest.get("packages", [])
    if not isinstance(packages, list):
        return []

    names = []
    for pkg in packages:
        if isinstance(pkg, dict) and isinstance(pkg.get("name"), str):
            if str(pkg.get("type", "pkg")) == "binary":
                names.append(pkg["name"])

    return sorted(set(names))

def _list_installed_packages() -> List[str]:
@@ -132,36 +142,47 @@ def _list_installed_packages() -> List[str]:

def _list_dotfiles_profiles() -> List[str]:
    profiles_dir = DOTFILES_DIR / "profiles"
    if not profiles_dir.is_dir():
    flow_dir = DOTFILES_DIR
    if not flow_dir.is_dir():
        return []
    return sorted([p.name for p in profiles_dir.iterdir() if p.is_dir() and not p.name.startswith(".")])

    return sorted(
        [
            p.name
            for p in flow_dir.iterdir()
            if p.is_dir() and not p.name.startswith(".") and not p.name.startswith("_")
        ]
    )

def _list_dotfiles_packages(profile: Optional[str] = None) -> List[str]:
    package_names: Set[str] = set()
    flow_dir = DOTFILES_DIR

    common = DOTFILES_DIR / "common"
    if common.is_dir():
        for pkg in common.iterdir():
    if not flow_dir.is_dir():
        return []

    shared = flow_dir / "_shared"
    if shared.is_dir():
        for pkg in shared.iterdir():
            if pkg.is_dir() and not pkg.name.startswith("."):
                package_names.add(pkg.name)

    if profile:
        profile_dir = DOTFILES_DIR / "profiles" / profile
        profile_dir = flow_dir / profile
        if profile_dir.is_dir():
            for pkg in profile_dir.iterdir():
                if pkg.is_dir() and not pkg.name.startswith("."):
                    package_names.add(pkg.name)
    else:
        profiles_dir = DOTFILES_DIR / "profiles"
        if profiles_dir.is_dir():
            for profile_dir in profiles_dir.iterdir():
                if not profile_dir.is_dir():
                    continue
                for pkg in profile_dir.iterdir():
                    if pkg.is_dir() and not pkg.name.startswith("."):
                        package_names.add(pkg.name)
        for profile_dir in flow_dir.iterdir():
            if profile_dir.name.startswith(".") or profile_dir.name.startswith("_"):
                continue
            if not profile_dir.is_dir():
                continue
            for pkg in profile_dir.iterdir():
                if pkg.is_dir() and not pkg.name.startswith("."):
                    package_names.add(pkg.name)

    return sorted(package_names)

@@ -1,4 +1,4 @@
"""flow dotfiles — dotfile management with GNU Stow-style symlinking."""
"""flow dotfiles — dotfile management with flat repo layout."""

import argparse
import json
@@ -7,48 +7,53 @@ import shlex
import shutil
import subprocess
import sys
from dataclasses import dataclass
from pathlib import Path
from typing import Optional
from typing import Dict, List, Optional, Set

from flow.core.config import FlowContext
from flow.core.paths import DOTFILES_DIR, LINKED_STATE
from flow.core.stow import LinkTree, TreeFolder

RESERVED_SHARED = "_shared"
RESERVED_ROOT = "_root"


@dataclass
class LinkSpec:
    source: Path
    target: Path
    package: str
    is_directory_link: bool = False


def register(subparsers):
    p = subparsers.add_parser("dotfiles", aliases=["dot"], help="Manage dotfiles")
    sub = p.add_subparsers(dest="dotfiles_command")

    # init
    init = sub.add_parser("init", help="Clone dotfiles repository")
    init.add_argument("--repo", help="Override repository URL")
    init.set_defaults(handler=run_init)

    # link
    link = sub.add_parser("link", help="Create symlinks for dotfile packages")
    link.add_argument("packages", nargs="*", help="Specific packages to link (default: all)")
    link.add_argument("--profile", help="Profile to use for overrides")
    link.add_argument("--profile", help="Profile to use")
    link.add_argument("--copy", action="store_true", help="Copy instead of symlink")
    link.add_argument("--force", action="store_true", help="Overwrite existing files")
    link.add_argument("--dry-run", action="store_true", help="Show what would be done")
    link.set_defaults(handler=run_link)

    # unlink
    unlink = sub.add_parser("unlink", help="Remove dotfile symlinks")
    unlink.add_argument("packages", nargs="*", help="Specific packages to unlink (default: all)")
    unlink.set_defaults(handler=run_unlink)

    # status
    status = sub.add_parser("status", help="Show dotfiles link status")
    status.set_defaults(handler=run_status)

    # sync
    sync = sub.add_parser("sync", help="Pull latest dotfiles from remote")
    sync.add_argument("--relink", action="store_true", help="Run relink after pull")
    sync.add_argument("--profile", help="Profile to use when relinking")
    sync.set_defaults(handler=run_sync)

    # repo
    repo = sub.add_parser("repo", help="Manage dotfiles repository")
    repo_sub = repo.add_subparsers(dest="dotfiles_repo_command")

@@ -56,8 +61,18 @@ def register(subparsers):
    repo_status.set_defaults(handler=run_repo_status)

    repo_pull = repo_sub.add_parser("pull", help="Pull latest changes")
    repo_pull.add_argument("--rebase", dest="rebase", action="store_true", help="Use rebase strategy (default)")
    repo_pull.add_argument("--no-rebase", dest="rebase", action="store_false", help="Disable rebase strategy")
    repo_pull.add_argument(
        "--rebase",
        dest="rebase",
        action="store_true",
        help="Use rebase strategy (default)",
    )
    repo_pull.add_argument(
        "--no-rebase",
        dest="rebase",
        action="store_false",
        help="Disable rebase strategy",
    )
    repo_pull.add_argument("--relink", action="store_true", help="Run relink after pull")
    repo_pull.add_argument("--profile", help="Profile to use when relinking")
    repo_pull.set_defaults(rebase=True)
@@ -68,18 +83,15 @@ def register(subparsers):

    repo.set_defaults(handler=lambda ctx, args: repo.print_help())

    # relink
    relink = sub.add_parser("relink", help="Refresh symlinks after changes")
    relink.add_argument("packages", nargs="*", help="Specific packages to relink (default: all)")
    relink.add_argument("--profile", help="Profile to use for overrides")
    relink.add_argument("--profile", help="Profile to use")
    relink.set_defaults(handler=run_relink)

    # clean
    clean = sub.add_parser("clean", help="Remove broken symlinks")
    clean.add_argument("--dry-run", action="store_true", help="Show what would be done")
    clean.set_defaults(handler=run_clean)

    # edit
    edit = sub.add_parser("edit", help="Edit package or path with auto-commit")
    edit.add_argument("target", help="Package name or path inside dotfiles repo")
    edit.add_argument("--no-commit", action="store_true", help="Skip auto-commit")
@@ -88,95 +100,161 @@ def register(subparsers):

    p.set_defaults(handler=lambda ctx, args: p.print_help())

def _flow_config_dir(dotfiles_dir: Path = DOTFILES_DIR) -> Path:
    return dotfiles_dir


def _is_root_package(package: str) -> bool:
    return package == RESERVED_ROOT or package.startswith(f"{RESERVED_ROOT}/")


def _insert_spec(
    desired: Dict[Path, LinkSpec],
    *,
    target: Path,
    source: Path,
    package: str,
) -> None:
    existing = desired.get(target)
    if existing is not None:
        raise RuntimeError(
            "Conflicting dotfile targets are not allowed: "
            f"{target} from {existing.package} and {package}"
        )

    desired[target] = LinkSpec(source=source, target=target, package=package)


def _load_state() -> dict:
    if LINKED_STATE.exists():
        with open(LINKED_STATE) as f:
            return json.load(f)
    return {"links": {}}
        with open(LINKED_STATE, "r", encoding="utf-8") as handle:
            return json.load(handle)
    return {"version": 2, "links": {}}


def _save_state(state: dict):
def _save_state(state: dict) -> None:
    LINKED_STATE.parent.mkdir(parents=True, exist_ok=True)
    with open(LINKED_STATE, "w") as f:
        json.dump(state, f, indent=2)
    with open(LINKED_STATE, "w", encoding="utf-8") as handle:
        json.dump(state, handle, indent=2)


def _discover_packages(dotfiles_dir: Path, profile: Optional[str] = None) -> dict:
|
||||
"""Discover packages from common/ and optionally profiles/<name>/.
|
||||
def _load_link_specs_from_state() -> Dict[Path, LinkSpec]:
|
||||
state = _load_state()
|
||||
links = state.get("links", {})
|
||||
if not isinstance(links, dict):
|
||||
raise RuntimeError("Unsupported linked state format. Remove linked.json and relink dotfiles.")
|
||||
|
||||
Returns {package_name: source_dir} with profile dirs taking precedence.
|
||||
"""
|
||||
packages = {}
|
||||
common = dotfiles_dir / "common"
|
||||
if common.is_dir():
|
||||
for pkg in sorted(common.iterdir()):
|
||||
if pkg.is_dir() and not pkg.name.startswith("."):
|
||||
packages[pkg.name] = pkg
|
||||
resolved: Dict[Path, LinkSpec] = {}
|
||||
for package, pkg_links in links.items():
|
||||
if not isinstance(pkg_links, dict):
|
||||
raise RuntimeError("Unsupported linked state format. Remove linked.json and relink dotfiles.")
|
||||
|
||||
if profile:
|
||||
profile_dir = dotfiles_dir / "profiles" / profile
|
||||
if profile_dir.is_dir():
|
||||
for pkg in sorted(profile_dir.iterdir()):
|
||||
if pkg.is_dir() and not pkg.name.startswith("."):
|
||||
packages[pkg.name] = pkg # Override common
|
||||
for target_str, link_info in pkg_links.items():
|
||||
if not isinstance(link_info, dict) or "source" not in link_info:
|
||||
raise RuntimeError(
|
||||
"Unsupported linked state format. Remove linked.json and relink dotfiles."
|
||||
)
|
||||
|
||||
return packages
|
||||
target = Path(target_str)
|
||||
resolved[target] = LinkSpec(
|
||||
source=Path(link_info["source"]),
|
||||
target=target,
|
||||
package=str(package),
|
||||
is_directory_link=bool(link_info.get("is_directory_link", False)),
|
||||
)
|
||||
|
||||
return resolved
|
||||
|
||||
|
||||
def _walk_package(source_dir: Path, home: Path):
|
||||
"""Yield (source_file, target_file) pairs for a package directory.
|
||||
def _save_link_specs_to_state(specs: Dict[Path, LinkSpec]) -> None:
|
||||
grouped: Dict[str, Dict[str, dict]] = {}
|
||||
for spec in sorted(specs.values(), key=lambda s: str(s.target)):
|
||||
grouped.setdefault(spec.package, {})[str(spec.target)] = {
|
||||
"source": str(spec.source),
|
||||
"is_directory_link": spec.is_directory_link,
|
||||
}
|
||||
|
||||
Files in the package directory map relative to $HOME.
|
||||
"""
|
||||
_save_state({"version": 2, "links": grouped})
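The version-2 state written above groups link entries by package, keyed by absolute target path. A minimal sketch of that layout and its JSON round-trip; the paths and the `shared/zsh` package name here are hypothetical, not taken from a real manifest:

```python
import json

# Hypothetical version-2 linked state, shaped like the "grouped" dict that
# _save_link_specs_to_state passes to _save_state.
state = {
    "version": 2,
    "links": {
        "shared/zsh": {
            "/home/user/.zshrc": {
                "source": "/home/user/.dotfiles/shared/zsh/.zshrc",
                "is_directory_link": False,
            }
        }
    },
}

# Round-trip through JSON exactly as _save_state/_load_state do.
encoded = json.dumps(state, indent=2)
decoded = json.loads(encoded)
```

Keeping targets as the keys is what lets `_load_link_specs_from_state` rebuild a `Dict[Path, LinkSpec]` without scanning the filesystem.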


def _list_profiles(flow_dir: Path) -> List[str]:
    if not flow_dir.exists() or not flow_dir.is_dir():
        return []

    profiles: List[str] = []
    for child in flow_dir.iterdir():
        if not child.is_dir():
            continue
        if child.name.startswith("."):
            continue
        if child.name.startswith("_"):
            continue
        profiles.append(child.name)
    return sorted(profiles)


def _walk_package(source_dir: Path):
    for root, _dirs, files in os.walk(source_dir):
        for fname in files:
            src = Path(root) / fname
            rel = src.relative_to(source_dir)
            dst = home / rel
            yield src, dst
            yield src, rel
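The rewritten `_walk_package` yields `(absolute source file, path relative to the package root)` and leaves choosing the link target to the caller. A standalone sketch against a throwaway package directory (the file names are made up):

```python
import os
import tempfile
from pathlib import Path

def walk_package(source_dir: Path):
    # Mirrors the new _walk_package: every regular file under the package
    # directory is yielded with its path relative to the package root.
    for root, _dirs, files in os.walk(source_dir):
        for fname in files:
            src = Path(root) / fname
            yield src, src.relative_to(source_dir)

with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp) / "zsh"
    (pkg / ".config").mkdir(parents=True)
    (pkg / ".zshrc").write_text("")
    (pkg / ".config" / "starship.toml").write_text("")
    rels = sorted(str(rel) for _src, rel in walk_package(pkg))
```

The relative path is what `_collect_home_specs` joins onto `$HOME` (and `_collect_root_specs` onto `/`) to form the target.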


def _ensure_dotfiles_dir(ctx: FlowContext):
    if not DOTFILES_DIR.exists():
        ctx.console.error(f"Dotfiles not found at {DOTFILES_DIR}. Run 'flow dotfiles init' first.")
        sys.exit(1)
def _profile_skip_set(ctx: FlowContext, profile: Optional[str]) -> Set[str]:
    if not profile:
        return set()

    profiles = ctx.manifest.get("profiles", {})
    if not isinstance(profiles, dict):
        return set()

    profile_cfg = profiles.get(profile, {})
    if not isinstance(profile_cfg, dict):
        return set()

    configs = profile_cfg.get("configs", {})
    if not isinstance(configs, dict):
        return set()

    skip = configs.get("skip", [])
    if not isinstance(skip, list):
        return set()

    return {str(item) for item in skip if item}


def _run_dotfiles_git(*cmd, capture: bool = True) -> subprocess.CompletedProcess:
    return subprocess.run(
        ["git", "-C", str(DOTFILES_DIR)] + list(cmd),
        capture_output=capture,
        text=True,
    )
def _discover_packages(dotfiles_dir: Path, profile: Optional[str] = None) -> dict:
    flow_dir = _flow_config_dir(dotfiles_dir)
    packages = {}

    shared = flow_dir / RESERVED_SHARED
    if shared.is_dir():
        for pkg in sorted(shared.iterdir()):
            if pkg.is_dir() and not pkg.name.startswith("."):
                packages[pkg.name] = pkg

def _pull_dotfiles(ctx: FlowContext, *, rebase: bool = True) -> None:
    pull_cmd = ["pull"]
    if rebase:
        pull_cmd.append("--rebase")
    if profile:
        profile_dir = flow_dir / profile
        if profile_dir.is_dir():
            for pkg in sorted(profile_dir.iterdir()):
                if pkg.is_dir() and not pkg.name.startswith("."):
                    packages[pkg.name] = pkg

    strategy = "with rebase" if rebase else "without rebase"
    ctx.console.info(f"Pulling latest dotfiles ({strategy})...")
    result = _run_dotfiles_git(*pull_cmd, capture=True)

    if result.returncode != 0:
        raise RuntimeError(f"Git pull failed: {result.stderr.strip()}")

    output = result.stdout.strip()
    if output:
        print(output)

    ctx.console.success("Dotfiles synced.")
    return packages


def _find_package_dir(package_name: str, dotfiles_dir: Path = DOTFILES_DIR) -> Optional[Path]:
    common_dir = dotfiles_dir / "common" / package_name
    if common_dir.exists():
        return common_dir
    flow_dir = _flow_config_dir(dotfiles_dir)

    profile_dirs = list((dotfiles_dir / "profiles").glob(f"*/{package_name}"))
    if profile_dirs:
        return profile_dirs[0]
    shared_dir = flow_dir / RESERVED_SHARED / package_name
    if shared_dir.exists():
        return shared_dir

    for profile in _list_profiles(flow_dir):
        profile_pkg = flow_dir / profile / package_name
        if profile_pkg.exists():
            return profile_pkg

    return None

@@ -209,10 +287,313 @@ def _resolve_edit_target(target: str, dotfiles_dir: Path = DOTFILES_DIR) -> Opti
    return None


def _ensure_dotfiles_dir(ctx: FlowContext):
    if not DOTFILES_DIR.exists():
        ctx.console.error(f"Dotfiles not found at {DOTFILES_DIR}. Run 'flow dotfiles init' first.")
        sys.exit(1)


def _ensure_flow_dir(ctx: FlowContext):
    _ensure_dotfiles_dir(ctx)
    flow_dir = _flow_config_dir()
    if not flow_dir.exists() or not flow_dir.is_dir():
        ctx.console.error(f"Dotfiles repository not found at {flow_dir}")
        sys.exit(1)


def _run_dotfiles_git(*cmd, capture: bool = True) -> subprocess.CompletedProcess:
    return subprocess.run(
        ["git", "-C", str(DOTFILES_DIR)] + list(cmd),
        capture_output=capture,
        text=True,
    )


def _pull_dotfiles(ctx: FlowContext, *, rebase: bool = True) -> None:
    pull_cmd = ["pull"]
    if rebase:
        pull_cmd.append("--rebase")

    strategy = "with rebase" if rebase else "without rebase"
    ctx.console.info(f"Pulling latest dotfiles ({strategy})...")
    result = _run_dotfiles_git(*pull_cmd, capture=True)

    if result.returncode != 0:
        raise RuntimeError(f"Git pull failed: {result.stderr.strip()}")

    output = result.stdout.strip()
    if output:
        print(output)

    ctx.console.success("Dotfiles synced.")


def _resolve_profile(ctx: FlowContext, requested: Optional[str]) -> Optional[str]:
    flow_dir = _flow_config_dir()
    profiles = _list_profiles(flow_dir)

    if requested:
        if requested not in profiles:
            raise RuntimeError(f"Profile not found: {requested}")
        return requested

    if len(profiles) == 1:
        return profiles[0]

    if len(profiles) > 1:
        raise RuntimeError(f"Multiple profiles available. Use --profile: {', '.join(profiles)}")

    return None
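`_resolve_profile` encodes a small selection rule: an explicit request must name a known profile, a single discovered profile is used implicitly, and several profiles without a request is an error. A self-contained restatement of that rule (a sketch, not the project's actual function):

```python
from typing import List, Optional

def pick_profile(requested: Optional[str], profiles: List[str]) -> Optional[str]:
    # Explicit request wins, but must name a known profile.
    if requested:
        if requested not in profiles:
            raise RuntimeError(f"Profile not found: {requested}")
        return requested
    # Exactly one discovered profile: use it implicitly.
    if len(profiles) == 1:
        return profiles[0]
    # Ambiguous: force the caller to choose with --profile.
    if len(profiles) > 1:
        raise RuntimeError("Multiple profiles available. Use --profile")
    return None
```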


def _is_in_home(path: Path, home: Path) -> bool:
    try:
        path.relative_to(home)
        return True
    except ValueError:
        return False


def _run_sudo(cmd: List[str], *, dry_run: bool = False) -> None:
    if dry_run:
        print(" " + " ".join(shlex.quote(part) for part in (["sudo"] + cmd)))
        return
    subprocess.run(["sudo"] + cmd, check=True)


def _remove_target(path: Path, *, use_sudo: bool, dry_run: bool) -> None:
    if not (path.exists() or path.is_symlink()):
        return

    if path.is_dir() and not path.is_symlink():
        raise RuntimeError(f"Cannot overwrite directory: {path}")

    if use_sudo:
        _run_sudo(["rm", "-f", str(path)], dry_run=dry_run)
        return

    if dry_run:
        print(f" REMOVE: {path}")
        return
    path.unlink()


def _same_symlink(target: Path, source: Path) -> bool:
    if not target.is_symlink():
        return False
    return target.resolve(strict=False) == source.resolve(strict=False)
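`_same_symlink` compares fully resolved paths rather than raw link text, so a relative symlink still matches its absolute source. A quick check in a throwaway directory (POSIX only; the file names are made up):

```python
import tempfile
from pathlib import Path

def same_symlink(target: Path, source: Path) -> bool:
    # Mirrors _same_symlink: only a symlink can match, and comparison is on
    # resolved paths, so relative link text still compares equal.
    if not target.is_symlink():
        return False
    return target.resolve(strict=False) == source.resolve(strict=False)

with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "src.conf"
    src.write_text("x")
    link = Path(tmp) / "link.conf"
    link.symlink_to(src)
    link_matches = same_symlink(link, src)       # a real symlink to src
    plain_file_matches = same_symlink(src, src)  # a regular file, never a symlink
```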


def _collect_home_specs(
    flow_dir: Path,
    home: Path,
    profile: Optional[str],
    skip: Set[str],
    package_filter: Optional[Set[str]],
) -> Dict[Path, LinkSpec]:
    desired: Dict[Path, LinkSpec] = {}

    if RESERVED_SHARED not in skip:
        shared_dir = flow_dir / RESERVED_SHARED
        if shared_dir.is_dir():
            for pkg_dir in sorted(shared_dir.iterdir()):
                if not pkg_dir.is_dir() or pkg_dir.name.startswith("."):
                    continue
                if package_filter and pkg_dir.name not in package_filter:
                    continue
                if pkg_dir.name in skip:
                    continue

                package_name = f"{RESERVED_SHARED}/{pkg_dir.name}"
                for src, rel in _walk_package(pkg_dir):
                    _insert_spec(
                        desired,
                        target=home / rel,
                        source=src,
                        package=package_name,
                    )

    if profile and "_profile" not in skip:
        profile_dir = flow_dir / profile
        if profile_dir.is_dir():
            for pkg_dir in sorted(profile_dir.iterdir()):
                if not pkg_dir.is_dir() or pkg_dir.name.startswith("."):
                    continue
                if package_filter and pkg_dir.name not in package_filter:
                    continue
                if pkg_dir.name in skip:
                    continue

                package_name = f"{profile}/{pkg_dir.name}"
                for src, rel in _walk_package(pkg_dir):
                    _insert_spec(
                        desired,
                        target=home / rel,
                        source=src,
                        package=package_name,
                    )

    return desired


def _collect_root_specs(flow_dir: Path, skip: Set[str], include_root: bool) -> Dict[Path, LinkSpec]:
    desired: Dict[Path, LinkSpec] = {}
    if not include_root or RESERVED_ROOT in skip:
        return desired

    root_dir = flow_dir / RESERVED_ROOT
    if not root_dir.is_dir():
        return desired

    for root_pkg_dir in sorted(root_dir.iterdir()):
        if not root_pkg_dir.is_dir() or root_pkg_dir.name.startswith("."):
            continue

        for src, rel in _walk_package(root_pkg_dir):
            target = Path("/") / rel
            _insert_spec(
                desired,
                target=target,
                source=src,
                package=f"{RESERVED_ROOT}/{root_pkg_dir.name}",
            )

    return desired


def _validate_conflicts(
    desired: Dict[Path, LinkSpec],
    current: Dict[Path, LinkSpec],
    force: bool,
) -> List[str]:
    conflicts: List[str] = []
    for target, spec in desired.items():
        if not (target.exists() or target.is_symlink()):
            continue

        if _same_symlink(target, spec.source):
            continue

        if target in current:
            continue

        if target.is_dir() and not target.is_symlink():
            conflicts.append(f"Conflict: {target} is a directory")
            continue

        if not force:
            conflicts.append(f"Conflict: {target} already exists and is not managed by flow")

    return conflicts


def _apply_link_spec(spec: LinkSpec, *, copy: bool, dry_run: bool) -> bool:
    use_sudo = _is_root_package(spec.package)

    if copy and use_sudo:
        print(f" SKIP COPY (root target): {spec.target}")
        return False

    if use_sudo:
        _run_sudo(["mkdir", "-p", str(spec.target.parent)], dry_run=dry_run)
        _run_sudo(["ln", "-sfn", str(spec.source), str(spec.target)], dry_run=dry_run)
        return True

    if dry_run:
        if copy:
            print(f" COPY: {spec.source} -> {spec.target}")
        else:
            print(f" LINK: {spec.target} -> {spec.source}")
        return True

    spec.target.parent.mkdir(parents=True, exist_ok=True)
    if copy:
        shutil.copy2(spec.source, spec.target)
        return True
    spec.target.symlink_to(spec.source)
    return True


def _sync_to_desired(
    ctx: FlowContext,
    desired: Dict[Path, LinkSpec],
    *,
    force: bool,
    dry_run: bool,
    copy: bool,
) -> None:
    current = _load_link_specs_from_state()
    conflicts = _validate_conflicts(desired, current, force)

    if conflicts:
        for conflict in conflicts:
            ctx.console.error(conflict)
        if not force:
            raise RuntimeError("Use --force to overwrite existing files")

    for target in sorted(current.keys(), key=str):
        if target in desired:
            continue
        use_sudo = _is_root_package(current[target].package) or not _is_in_home(target, Path.home())
        _remove_target(target, use_sudo=use_sudo, dry_run=dry_run)
        del current[target]

    for target in sorted(desired.keys(), key=str):
        spec = desired[target]

        if _same_symlink(target, spec.source):
            current[target] = spec
            continue

        exists = target.exists() or target.is_symlink()
        if exists:
            use_sudo = _is_root_package(spec.package) or not _is_in_home(target, Path.home())
            _remove_target(target, use_sudo=use_sudo, dry_run=dry_run)

        applied = _apply_link_spec(spec, copy=copy, dry_run=dry_run)
        if applied:
            current[target] = spec

    if not dry_run:
        _save_link_specs_to_state(current)


def _desired_links_for_profile(
    ctx: FlowContext,
    profile: Optional[str],
    package_filter: Optional[Set[str]],
) -> Dict[Path, LinkSpec]:
    flow_dir = _flow_config_dir()
    home = Path.home()

    skip = _profile_skip_set(ctx, profile)
    include_root = package_filter is None or RESERVED_ROOT in package_filter

    effective_filter = None
    if package_filter is not None:
        effective_filter = set(package_filter)
        effective_filter.discard(RESERVED_ROOT)
        if not effective_filter:
            effective_filter = set()

    home_specs = _collect_home_specs(flow_dir, home, profile, skip, effective_filter)
    root_specs = _collect_root_specs(flow_dir, skip, include_root)
    combined = {}
    combined.update(home_specs)
    for target, spec in root_specs.items():
        _insert_spec(
            combined,
            target=target,
            source=spec.source,
            package=spec.package,
        )
    return combined


def run_init(ctx: FlowContext, args):
    repo_url = args.repo or ctx.config.dotfiles_url
    if not repo_url:
        ctx.console.error("No dotfiles repository URL. Set it in config or pass --repo.")
        ctx.console.error("No dotfiles repository URL. Set it in YAML config or pass --repo.")
        sys.exit(1)

    if DOTFILES_DIR.exists():
@@ -228,165 +609,108 @@ def run_init(ctx: FlowContext, args):


def run_link(ctx: FlowContext, args):
    _ensure_dotfiles_dir(ctx)
    _ensure_flow_dir(ctx)

    home = Path.home()
    packages = _discover_packages(DOTFILES_DIR, args.profile)

    # Filter to requested packages
    if args.packages:
        packages = {k: v for k, v in packages.items() if k in args.packages}
        missing = set(args.packages) - set(packages.keys())
        if missing:
            ctx.console.warn(f"Packages not found: {', '.join(missing)}")
        if not packages:
            ctx.console.error("No valid packages selected")
            sys.exit(1)

    # Build current link tree from state
    state = _load_state()
    try:
        tree = LinkTree.from_state(state)
        profile = _resolve_profile(ctx, args.profile)
    except RuntimeError as e:
        ctx.console.error(str(e))
        sys.exit(1)
    folder = TreeFolder(tree)

    all_operations = []
    copied_count = 0
    package_filter = set(args.packages) if args.packages else None
    desired = _desired_links_for_profile(ctx, profile, package_filter)

    for pkg_name, source_dir in packages.items():
        ctx.console.info(f"[{pkg_name}]")
        for src, dst in _walk_package(source_dir, home):
            if args.copy:
                if dst.exists() or dst.is_symlink():
                    if not args.force:
                        ctx.console.warn(f" Skipped (exists): {dst}")
                        continue
                    if dst.is_dir() and not dst.is_symlink():
                        ctx.console.error(f"Cannot overwrite directory with --copy: {dst}")
                        continue
                    if not args.dry_run:
                        dst.unlink()

                if args.dry_run:
                    print(f" COPY: {src} -> {dst}")
                else:
                    dst.parent.mkdir(parents=True, exist_ok=True)
                    shutil.copy2(src, dst)
                    print(f" Copied: {src} -> {dst}")
                    copied_count += 1
                continue

            ops = folder.plan_link(src, dst, pkg_name)
            all_operations.extend(ops)

    if args.copy:
        if args.dry_run:
            return
        ctx.console.success(f"Copied {copied_count} item(s)")
    if not desired:
        ctx.console.warn("No link targets found for selected profile/filters")
        return

    # Conflict detection (two-phase)
    conflicts = folder.detect_conflicts(all_operations)
    if conflicts and not args.force:
        for conflict in conflicts:
            ctx.console.error(conflict)
        ctx.console.error("\nUse --force to overwrite existing files")
    try:
        _sync_to_desired(
            ctx,
            desired,
            force=args.force,
            dry_run=args.dry_run,
            copy=args.copy,
        )
    except RuntimeError as e:
        ctx.console.error(str(e))
        sys.exit(1)

    # Handle force mode: remove conflicting targets
    if args.force and not args.dry_run:
        for op in all_operations:
            if op.type != "create_symlink":
                continue
            if not (op.target.exists() or op.target.is_symlink()):
                continue
            if op.target in tree.links:
                continue
            if op.target.is_dir() and not op.target.is_symlink():
                ctx.console.error(f"Cannot overwrite directory with --force: {op.target}")
                sys.exit(1)
            op.target.unlink()

    # Execute operations
    if args.dry_run:
        ctx.console.info("\nPlanned operations:")
        for op in all_operations:
            print(str(op))
    else:
        folder.execute_operations(all_operations, dry_run=False)
        state = folder.to_state()
        _save_state(state)
        ctx.console.success(f"Linked {len(all_operations)} item(s)")
    return

    ctx.console.success(f"Linked {len(desired)} item(s)")


def _package_match(package_id: str, filters: Set[str]) -> bool:
    if package_id in filters:
        return True

    # Allow users to pass just package basename (e.g. zsh)
    base = package_id.split("/", 1)[-1]
    return base in filters
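`_package_match` lets users pass either the full `group/name` package id or just the basename. A standalone mirror with both cases spelled out:

```python
from typing import Set

def package_match(package_id: str, filters: Set[str]) -> bool:
    # Exact id match first, then fall back to the basename after "/".
    if package_id in filters:
        return True
    base = package_id.split("/", 1)[-1]
    return base in filters
```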


def run_unlink(ctx: FlowContext, args):
    state = _load_state()
    links_by_package = state.get("links", {})
    if not links_by_package:
    try:
        current = _load_link_specs_from_state()
    except RuntimeError as e:
        ctx.console.error(str(e))
        sys.exit(1)

    if not current:
        ctx.console.info("No linked dotfiles found.")
        return

    packages_to_unlink = args.packages if args.packages else list(links_by_package.keys())
    filters = set(args.packages) if args.packages else None
    removed = 0

    for pkg_name in packages_to_unlink:
        links = links_by_package.get(pkg_name, {})
        if not links:
    for target in sorted(list(current.keys()), key=str):
        spec = current[target]
        if filters and not _package_match(spec.package, filters):
            continue

        ctx.console.info(f"[{pkg_name}]")
        for dst_str in list(links.keys()):
            dst = Path(dst_str)
            if dst.is_symlink():
                dst.unlink()
                print(f" Removed: {dst}")
                removed += 1
            elif dst.exists():
                ctx.console.warn(f" Not a symlink, skipping: {dst}")
            else:
                print(f" Already gone: {dst}")
        use_sudo = _is_root_package(spec.package) or not _is_in_home(target, Path.home())
        try:
            _remove_target(target, use_sudo=use_sudo, dry_run=False)
        except RuntimeError as e:
            ctx.console.warn(str(e))
            continue

        links_by_package.pop(pkg_name, None)
        removed += 1
        del current[target]

    _save_state(state)
    _save_link_specs_to_state(current)
    ctx.console.success(f"Removed {removed} symlink(s)")


def run_status(ctx: FlowContext, args):
    state = _load_state()
    links_by_package = state.get("links", {})
    if not links_by_package:
    try:
        current = _load_link_specs_from_state()
    except RuntimeError as e:
        ctx.console.error(str(e))
        sys.exit(1)

    if not current:
        ctx.console.info("No linked dotfiles.")
        return

    for pkg_name, links in links_by_package.items():
        ctx.console.info(f"[{pkg_name}]")
        for dst_str, link_info in links.items():
            dst = Path(dst_str)
    grouped: Dict[str, List[LinkSpec]] = {}
    for spec in current.values():
        grouped.setdefault(spec.package, []).append(spec)

            if not isinstance(link_info, dict) or "source" not in link_info:
                ctx.console.error(
                    "Unsupported linked state format. Remove linked.json and relink dotfiles."
                )
                sys.exit(1)

            src_str = link_info["source"]
            is_dir_link = bool(link_info.get("is_directory_link", False))

            link_type = "FOLDED" if is_dir_link else "OK"

            if dst.is_symlink():
                target = os.readlink(dst)
                if target == src_str or str(dst.resolve()) == str(Path(src_str).resolve()):
                    print(f" {link_type}: {dst} -> {src_str}")
    for package in sorted(grouped.keys()):
        ctx.console.info(f"[{package}]")
        for spec in sorted(grouped[package], key=lambda s: str(s.target)):
            if spec.target.is_symlink():
                if _same_symlink(spec.target, spec.source):
                    print(f" OK: {spec.target} -> {spec.source}")
                else:
                    print(f" CHANGED: {dst} -> {target} (expected {src_str})")
            elif dst.exists():
                print(f" NOT SYMLINK: {dst}")
                    print(f" CHANGED: {spec.target}")
            elif spec.target.exists():
                print(f" NOT SYMLINK: {spec.target}")
            else:
                print(f" BROKEN: {dst} (missing)")
                print(f" BROKEN: {spec.target} (missing)")


def run_sync(ctx: FlowContext, args):
@@ -448,15 +772,11 @@ def run_repo_push(ctx: FlowContext, args):


def run_relink(ctx: FlowContext, args):
    """Refresh symlinks after changes (unlink + link)."""
    _ensure_dotfiles_dir(ctx)
    _ensure_flow_dir(ctx)

    # First unlink
    ctx.console.info("Unlinking current symlinks...")
    run_unlink(ctx, args)

    # Then link again — set defaults for attributes that run_link expects
    # but the relink parser doesn't define.
    args.copy = False
    args.force = False
    args.dry_run = False
@@ -465,29 +785,31 @@ def run_relink(ctx: FlowContext, args):


def run_clean(ctx: FlowContext, args):
    """Remove broken symlinks."""
    state = _load_state()
    if not state.get("links"):
    try:
        current = _load_link_specs_from_state()
    except RuntimeError as e:
        ctx.console.error(str(e))
        sys.exit(1)

    if not current:
        ctx.console.info("No linked dotfiles found.")
        return

    removed = 0
    for pkg_name, links in state["links"].items():
        for dst_str in list(links.keys()):
            dst = Path(dst_str)
    for target in sorted(list(current.keys()), key=str):
        if not target.is_symlink() or target.exists():
            continue

            # Check if symlink is broken
            if dst.is_symlink() and not dst.exists():
                if args.dry_run:
                    print(f"Would remove broken symlink: {dst}")
                else:
                    dst.unlink()
                    print(f"Removed broken symlink: {dst}")
                    del links[dst_str]
                removed += 1
        if args.dry_run:
            print(f"Would remove broken symlink: {target}")
        else:
            use_sudo = _is_root_package(current[target].package) or not _is_in_home(target, Path.home())
            _remove_target(target, use_sudo=use_sudo, dry_run=False)
            del current[target]
            removed += 1

    if not args.dry_run:
        _save_state(state)
        _save_link_specs_to_state(current)

    if removed > 0:
        ctx.console.success(f"Cleaned {removed} broken symlink(s)")
@@ -496,7 +818,6 @@ def run_clean(ctx: FlowContext, args):


def run_edit(ctx: FlowContext, args):
    """Edit package config with auto-commit workflow."""
    _ensure_dotfiles_dir(ctx)

    target_name = args.target
@@ -505,24 +826,20 @@ def run_edit(ctx: FlowContext, args):
        ctx.console.error(f"No matching package or path found for: {target_name}")
        sys.exit(1)

    # Git pull before editing
    ctx.console.info("Pulling latest changes...")
    result = _run_dotfiles_git("pull", "--rebase", capture=True)
    if result.returncode != 0:
        ctx.console.warn(f"Git pull failed: {result.stderr.strip()}")

    # Open editor
    editor = os.environ.get("EDITOR", "vim")
    ctx.console.info(f"Opening {edit_target} in {editor}...")
    edit_result = subprocess.run(shlex.split(editor) + [str(edit_target)])
    if edit_result.returncode != 0:
        ctx.console.warn(f"Editor exited with status {edit_result.returncode}")

    # Check for changes
    result = _run_dotfiles_git("status", "--porcelain", capture=True)

    if result.stdout.strip() and not args.no_commit:
        # Auto-commit changes
        ctx.console.info("Changes detected, committing...")
        subprocess.run(["git", "-C", str(DOTFILES_DIR), "add", "."], check=True)
        subprocess.run(
@@ -530,12 +847,11 @@ def run_edit(ctx: FlowContext, args):
            check=True,
        )

        # Ask before pushing
        try:
            response = input("Push changes to remote? [Y/n] ")
        except (EOFError, KeyboardInterrupt):
            response = "n"
            print()  # newline after ^C / EOF
            print()
        if response.lower() != "n":
            subprocess.run(["git", "-C", str(DOTFILES_DIR), "push"], check=True)
            ctx.console.success("Changes committed and pushed")

@@ -1,31 +1,27 @@
"""flow package — binary package management from manifest definitions."""
"""flow package — package management from unified manifest definitions."""

import json
import subprocess
import sys
from typing import Any, Dict, Optional, Tuple
from typing import Any, Dict

from flow.commands.bootstrap import _get_package_catalog, _install_binary_package
from flow.core.config import FlowContext
from flow.core.paths import INSTALLED_STATE
from flow.core.variables import substitute_template


def register(subparsers):
    p = subparsers.add_parser("package", aliases=["pkg"], help="Manage binary packages")
    p = subparsers.add_parser("package", aliases=["pkg"], help="Manage packages")
    sub = p.add_subparsers(dest="package_command")

    # install
    inst = sub.add_parser("install", help="Install packages from manifest")
    inst.add_argument("packages", nargs="+", help="Package names to install")
    inst.add_argument("--dry-run", action="store_true", help="Show what would be done")
    inst.set_defaults(handler=run_install)

    # list
    ls = sub.add_parser("list", help="List installed and available packages")
    ls.add_argument("--all", action="store_true", help="Show all available packages")
    ls.set_defaults(handler=run_list)

    # remove
    rm = sub.add_parser("remove", help="Remove installed packages")
    rm.add_argument("packages", nargs="+", help="Package names to remove")
    rm.set_defaults(handler=run_remove)
@@ -35,53 +31,24 @@ def register(subparsers):

def _load_installed() -> dict:
    if INSTALLED_STATE.exists():
        with open(INSTALLED_STATE) as f:
            return json.load(f)
        with open(INSTALLED_STATE, "r", encoding="utf-8") as handle:
            return json.load(handle)
    return {}


def _save_installed(state: dict):
    INSTALLED_STATE.parent.mkdir(parents=True, exist_ok=True)
    with open(INSTALLED_STATE, "w") as f:
        json.dump(state, f, indent=2)
    with open(INSTALLED_STATE, "w", encoding="utf-8") as handle:
        json.dump(state, handle, indent=2)


def _get_definitions(ctx: FlowContext) -> dict:
    """Get package definitions from manifest (binaries section)."""
    return ctx.manifest.get("binaries", {})


def _resolve_download_url(
    pkg_def: Dict[str, Any],
    platform_str: str,
) -> Optional[Tuple[str, Dict[str, str]]]:
    """Build GitHub release download URL from package definition."""
    source = pkg_def.get("source", "")
    if not source.startswith("github:"):
        return None

    owner_repo = source[len("github:"):]
    version = pkg_def.get("version", "")
    asset_pattern = pkg_def.get("asset-pattern", "")
    platform_map = pkg_def.get("platform-map", {})

    mapping = platform_map.get(platform_str)
    if not mapping:
        return None

    # Build template context
    template_ctx = {**mapping, "version": version}
    asset = substitute_template(asset_pattern, template_ctx)
    url = f"https://github.com/{owner_repo}/releases/download/v{version}/{asset}"

    template_ctx["downloadUrl"] = url
    return url, template_ctx
|
||||
def _get_definitions(ctx: FlowContext) -> Dict[str, Dict[str, Any]]:
|
||||
return _get_package_catalog(ctx)
|
||||
|
||||
|
||||
def run_install(ctx: FlowContext, args):
|
||||
definitions = _get_definitions(ctx)
|
||||
installed = _load_installed()
|
||||
platform_str = ctx.platform.platform
|
||||
had_error = False
|
||||
|
||||
for pkg_name in args.packages:
|
||||
@@ -91,48 +58,33 @@ def run_install(ctx: FlowContext, args):
            had_error = True
            continue

        ctx.console.info(f"Installing {pkg_name} v{pkg_def.get('version', '?')}...")

        result = _resolve_download_url(pkg_def, platform_str)
        if not result:
            ctx.console.error(f"No download available for {pkg_name} on {platform_str}")
        pkg_type = pkg_def.get("type", "pkg")
        if pkg_type != "binary":
            ctx.console.error(
                f"'flow package install' supports binary packages only. "
                f"'{pkg_name}' is type '{pkg_type}'."
            )
            had_error = True
            continue

        url, template_ctx = result

        if args.dry_run:
            ctx.console.info(f"[{pkg_name}] Would download: {url}")
            install_script = pkg_def.get("install-script", "")
            if install_script:
                ctx.console.info(f"[{pkg_name}] Would run install script")
            continue

        # Run install script with template vars resolved
        install_script = pkg_def.get("install-script", "")
        if not install_script:
            ctx.console.error(f"Package '{pkg_name}' has no install-script")
        ctx.console.info(f"Installing {pkg_name}...")
        try:
            _install_binary_package(ctx, pkg_def, extra_env={}, dry_run=args.dry_run)
        except RuntimeError as e:
            ctx.console.error(str(e))
            had_error = True
            continue

        resolved_script = substitute_template(install_script, template_ctx)
        ctx.console.info(f"Running install script for {pkg_name}...")
        proc = subprocess.run(
            resolved_script, shell=True,
            capture_output=False,
        )
        if proc.returncode != 0:
            ctx.console.error(f"Install script failed for {pkg_name}")
            had_error = True
            continue
        if not args.dry_run:
            installed[pkg_name] = {
                "version": str(pkg_def.get("version", "")),
                "type": pkg_type,
            }
            ctx.console.success(f"Installed {pkg_name}")

        installed[pkg_name] = {
            "version": pkg_def.get("version", ""),
            "source": pkg_def.get("source", ""),
        }
        ctx.console.success(f"Installed {pkg_name} v{pkg_def.get('version', '')}")
    if not args.dry_run:
        _save_installed(installed)

    _save_installed(installed)
    if had_error:
        sys.exit(1)
@@ -141,26 +93,24 @@ def run_list(ctx: FlowContext, args):
    definitions = _get_definitions(ctx)
    installed = _load_installed()

    headers = ["PACKAGE", "INSTALLED", "AVAILABLE"]
    headers = ["PACKAGE", "TYPE", "INSTALLED", "AVAILABLE"]
    rows = []

    if args.all:
        # Show all defined packages
        if not definitions:
            ctx.console.info("No packages defined in manifest.")
            return
        for name, pkg_def in sorted(definitions.items()):
            inst_ver = installed.get(name, {}).get("version", "-")
            avail_ver = pkg_def.get("version", "?")
            rows.append([name, inst_ver, avail_ver])
            avail_ver = str(pkg_def.get("version", "")) or "-"
            rows.append([name, str(pkg_def.get("type", "pkg")), inst_ver, avail_ver])
    else:
        # Show installed only
        if not installed:
            ctx.console.info("No packages installed.")
            return
        for name, info in sorted(installed.items()):
            avail = definitions.get(name, {}).get("version", "?")
            rows.append([name, info.get("version", "?"), avail])
            avail = str(definitions.get(name, {}).get("version", "")) or "-"
            rows.append([name, str(info.get("type", "?")), str(info.get("version", "?")), avail])

    ctx.console.table(headers, rows)
@@ -173,9 +123,10 @@ def run_remove(ctx: FlowContext, args):
            ctx.console.warn(f"Package not installed: {pkg_name}")
            continue

        # Remove from installed state
        del installed[pkg_name]
        ctx.console.success(f"Removed {pkg_name} from installed packages")
        ctx.console.warn("Note: binary files were not automatically deleted. Remove manually if needed.")
        ctx.console.warn(
            "Note: installed files were not automatically deleted. Remove manually if needed."
        )

    _save_installed(installed)
@@ -1,14 +1,13 @@
"""Configuration loading (INI config + YAML manifest) and FlowContext."""
"""Configuration loading (merged YAML) and FlowContext."""

import configparser
from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, Dict, List, Optional

import yaml

from flow.core.console import ConsoleLogger
from flow.core import paths
from flow.core.console import ConsoleLogger
from flow.core.platform import PlatformInfo
@@ -31,8 +30,17 @@ class AppConfig:
    targets: List[TargetConfig] = field(default_factory=list)


def _get_value(mapping: Any, *keys: str, default: Any = None) -> Any:
    if not isinstance(mapping, dict):
        return default
    for key in keys:
        if key in mapping:
            return mapping[key]
    return default


def _parse_target_config(key: str, value: str) -> Optional[TargetConfig]:
    """Parse a target line from config.
    """Parse a target line from config-like syntax.

    Supported formats:
    1) namespace = platform ssh_host [ssh_identity]
@@ -66,83 +74,218 @@ def _parse_target_config(key: str, value: str) -> Optional[TargetConfig]:
    )


def load_config(path: Optional[Path] = None) -> AppConfig:
    """Load INI config file into AppConfig with cascading priority.
def _list_yaml_files(directory: Path) -> List[Path]:
    if not directory.exists() or not directory.is_dir():
        return []

    Priority:
    1. Dotfiles repo (self-hosted): ~/.local/share/devflow/dotfiles/flow/.config/flow/config
    2. Local override: ~/.config/devflow/config
    3. Empty fallback
    """
    cfg = AppConfig()
    files = []
    for child in directory.iterdir():
        if not child.is_file():
            continue
        if child.suffix.lower() in {".yaml", ".yml"}:
            files.append(child)

    if path is None:
        # Priority 1: Check dotfiles repo for self-hosted config
        if paths.DOTFILES_CONFIG.exists():
            path = paths.DOTFILES_CONFIG
        # Priority 2: Fall back to local config
        else:
            path = paths.CONFIG_FILE

    assert path is not None

    if not path.exists():
        return cfg

    parser = configparser.ConfigParser()
    parser.read(path)

    if parser.has_section("repository"):
        cfg.dotfiles_url = parser.get("repository", "dotfiles_url", fallback=cfg.dotfiles_url)
        cfg.dotfiles_branch = parser.get("repository", "dotfiles_branch", fallback=cfg.dotfiles_branch)

    if parser.has_section("paths"):
        cfg.projects_dir = parser.get("paths", "projects_dir", fallback=cfg.projects_dir)

    if parser.has_section("defaults"):
        cfg.container_registry = parser.get("defaults", "container_registry", fallback=cfg.container_registry)
        cfg.container_tag = parser.get("defaults", "container_tag", fallback=cfg.container_tag)
        cfg.tmux_session = parser.get("defaults", "tmux_session", fallback=cfg.tmux_session)

    if parser.has_section("targets"):
        for key in parser.options("targets"):
            raw_value = parser.get("targets", key)
            tc = _parse_target_config(key, raw_value)
            if tc is not None:
                cfg.targets.append(tc)

    return cfg
    return sorted(files, key=lambda p: p.name)


def load_manifest(path: Optional[Path] = None) -> Dict[str, Any]:
    """Load YAML manifest file with cascading priority.
def _load_yaml_file(path: Path) -> Dict[str, Any]:
    try:
        with open(path, "r", encoding="utf-8") as handle:
            data = yaml.safe_load(handle)
    except yaml.YAMLError as e:
        raise RuntimeError(f"Invalid YAML in {path}: {e}") from e

    Priority:
    1. Dotfiles repo (self-hosted): ~/.local/share/devflow/dotfiles/flow/.config/flow/manifest.yaml
    2. Local override: ~/.config/devflow/manifest.yaml
    3. Empty fallback
    """
    if path is None:
        # Priority 1: Check dotfiles repo for self-hosted manifest
        if paths.DOTFILES_MANIFEST.exists():
            path = paths.DOTFILES_MANIFEST
        # Priority 2: Fall back to local manifest
        else:
            path = paths.MANIFEST_FILE
    if data is None:
        return {}

    assert path is not None
    if not isinstance(data, dict):
        raise RuntimeError(f"YAML file must contain a mapping at root: {path}")

    return data


def _load_merged_yaml(directory: Path) -> Dict[str, Any]:
    merged: Dict[str, Any] = {}
    for file_path in _list_yaml_files(directory):
        merged.update(_load_yaml_file(file_path))
    return merged


def _resolve_default_yaml_root() -> Path:
    # Priority 1: self-hosted config from linked dotfiles
    if paths.DOTFILES_FLOW_CONFIG.exists() and _list_yaml_files(paths.DOTFILES_FLOW_CONFIG):
        return paths.DOTFILES_FLOW_CONFIG

    # Priority 2: local config directory
    return paths.CONFIG_DIR


def _load_yaml_source(path: Path) -> Dict[str, Any]:
    if not path.exists():
        return {}

    try:
        with open(path, "r") as f:
            data = yaml.safe_load(f)
    except yaml.YAMLError as e:
        raise RuntimeError(f"Invalid YAML in {path}: {e}") from e
    if path.is_file():
        return _load_yaml_file(path)

    if path.is_dir():
        return _load_merged_yaml(path)

    return {}


def _parse_targets(raw_targets: Any) -> List[TargetConfig]:
    targets: List[TargetConfig] = []

    if isinstance(raw_targets, dict):
        for key, value in raw_targets.items():
            if isinstance(value, str):
                parsed = _parse_target_config(key, value)
                if parsed is not None:
                    targets.append(parsed)
                continue

            if not isinstance(value, dict):
                continue

            namespace_from_key = key
            platform_from_key = None
            if "@" in key:
                namespace_from_key, platform_from_key = key.split("@", 1)

            namespace = str(
                _get_value(
                    value,
                    "namespace",
                    default=namespace_from_key,
                )
            )
            platform = str(
                _get_value(
                    value,
                    "platform",
                    default=platform_from_key,
                )
            )
            ssh_host = _get_value(value, "ssh_host", "ssh-host", "host", default="")
            ssh_identity = _get_value(value, "ssh_identity", "ssh-identity", "identity")

            if not namespace or not platform or not ssh_host:
                continue

            targets.append(
                TargetConfig(
                    namespace=namespace,
                    platform=platform,
                    ssh_host=str(ssh_host),
                    ssh_identity=str(ssh_identity) if ssh_identity else None,
                )
            )

    elif isinstance(raw_targets, list):
        for item in raw_targets:
            if not isinstance(item, dict):
                continue

            namespace = _get_value(item, "namespace")
            platform = _get_value(item, "platform")
            ssh_host = _get_value(item, "ssh_host", "ssh-host", "host")
            ssh_identity = _get_value(item, "ssh_identity", "ssh-identity", "identity")

            if not namespace or not platform or not ssh_host:
                continue

            targets.append(
                TargetConfig(
                    namespace=str(namespace),
                    platform=str(platform),
                    ssh_host=str(ssh_host),
                    ssh_identity=str(ssh_identity) if ssh_identity else None,
                )
            )

    return targets


def load_manifest(path: Optional[Path] = None) -> Dict[str, Any]:
    """Load merged YAML manifest/config data.

    Default priority:
    1) ~/.local/share/flow/dotfiles/_shared/flow/.config/flow/*.y[a]ml
    2) ~/.config/flow/*.y[a]ml
    """
    source = path if path is not None else _resolve_default_yaml_root()
    assert source is not None
    data = _load_yaml_source(source)
    return data if isinstance(data, dict) else {}


def load_config(path: Optional[Path] = None) -> AppConfig:
    """Load merged YAML config into AppConfig."""
    source = path if path is not None else _resolve_default_yaml_root()
    assert source is not None
    merged = _load_yaml_source(source)

    cfg = AppConfig()
    if not isinstance(merged, dict):
        return cfg

    repository = merged.get("repository") if isinstance(merged.get("repository"), dict) else {}
    paths_section = merged.get("paths") if isinstance(merged.get("paths"), dict) else {}
    defaults = merged.get("defaults") if isinstance(merged.get("defaults"), dict) else {}

    cfg.dotfiles_url = str(
        _get_value(
            repository,
            "dotfiles_url",
            "dotfiles-url",
            default=merged.get("dotfiles_url", cfg.dotfiles_url),
        )
    )
    cfg.dotfiles_branch = str(
        _get_value(
            repository,
            "dotfiles_branch",
            "dotfiles-branch",
            default=merged.get("dotfiles_branch", cfg.dotfiles_branch),
        )
    )
    cfg.projects_dir = str(
        _get_value(
            paths_section,
            "projects_dir",
            "projects-dir",
            default=merged.get("projects_dir", cfg.projects_dir),
        )
    )
    cfg.container_registry = str(
        _get_value(
            defaults,
            "container_registry",
            "container-registry",
            default=merged.get("container_registry", cfg.container_registry),
        )
    )
    cfg.container_tag = str(
        _get_value(
            defaults,
            "container_tag",
            "container-tag",
            default=merged.get("container_tag", cfg.container_tag),
        )
    )
    cfg.tmux_session = str(
        _get_value(
            defaults,
            "tmux_session",
            "tmux-session",
            default=merged.get("tmux_session", cfg.tmux_session),
        )
    )
    cfg.targets = _parse_targets(merged.get("targets", {}))

    return cfg


@dataclass
class FlowContext:
    config: AppConfig
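The loaders above merge every `*.yaml`/`*.yml` file in the chosen directory, sorted by filename, via `dict.update`. A minimal sketch of that merge semantics, with plain dicts standing in for parsed files (file names are illustrative):

```python
# Stand-ins for two parsed YAML files, merged in sorted filename order.
parsed_files = {
    "10-base.yaml": {
        "defaults": {"container-tag": "latest"},
        "repository": {"dotfiles-branch": "main"},
    },
    "20-override.yaml": {"defaults": {"container-tag": "v1"}},
}

merged: dict = {}
for name in sorted(parsed_files):
    merged.update(parsed_files[name])  # later files win on key collisions

# Note: update() is a *shallow* merge — the entire "defaults" mapping from
# 20-override.yaml replaces the earlier one; nested keys are not deep-merged.
assert merged["defaults"] == {"container-tag": "v1"}
assert merged["repository"] == {"dotfiles-branch": "main"}
```

The shallow-merge behavior is worth keeping in mind when splitting config across files: an override file must restate the whole top-level section it touches.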
@@ -1,4 +1,4 @@
"""XDG-compliant path constants for DevFlow."""
"""XDG-compliant path constants for flow."""

import os
from pathlib import Path
@@ -10,12 +10,12 @@ def _xdg(env_var: str, fallback: str) -> Path:
HOME = Path.home()

CONFIG_DIR = _xdg("XDG_CONFIG_HOME", str(HOME / ".config")) / "devflow"
DATA_DIR = _xdg("XDG_DATA_HOME", str(HOME / ".local" / "share")) / "devflow"
STATE_DIR = _xdg("XDG_STATE_HOME", str(HOME / ".local" / "state")) / "devflow"
CONFIG_DIR = _xdg("XDG_CONFIG_HOME", str(HOME / ".config")) / "flow"
DATA_DIR = _xdg("XDG_DATA_HOME", str(HOME / ".local" / "share")) / "flow"
STATE_DIR = _xdg("XDG_STATE_HOME", str(HOME / ".local" / "state")) / "flow"

MANIFEST_FILE = CONFIG_DIR / "manifest.yaml"
CONFIG_FILE = CONFIG_DIR / "config"
CONFIG_FILE = CONFIG_DIR / "config.yaml"

DOTFILES_DIR = DATA_DIR / "dotfiles"
PACKAGES_DIR = DATA_DIR / "packages"
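The `_xdg` helper named in the hunk header above is not shown in this diff; a plausible sketch of its shape (an assumption for illustration, not the actual implementation):

```python
import os
from pathlib import Path

def _xdg(env_var: str, fallback: str) -> Path:
    # Assumed behavior: honor the XDG variable when set and non-empty,
    # otherwise use the conventional fallback path.
    value = os.environ.get(env_var, "").strip()
    return Path(value) if value else Path(fallback)

HOME = Path.home()
CONFIG_DIR = _xdg("XDG_CONFIG_HOME", str(HOME / ".config")) / "flow"
```

Either way, every constant below derives from these three roots, so renaming the app directory from `devflow` to `flow` is a one-line change per root.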
@@ -25,10 +25,10 @@ PROJECTS_DIR = HOME / "projects"
LINKED_STATE = STATE_DIR / "linked.json"
INSTALLED_STATE = STATE_DIR / "installed.json"

# Self-hosted flow config paths (from dotfiles repo)
DOTFILES_FLOW_CONFIG = DOTFILES_DIR / "flow" / ".config" / "flow"
# Self-hosted flow config path (from dotfiles repo)
DOTFILES_FLOW_CONFIG = DOTFILES_DIR / "_shared" / "flow" / ".config" / "flow"
DOTFILES_MANIFEST = DOTFILES_FLOW_CONFIG / "manifest.yaml"
DOTFILES_CONFIG = DOTFILES_FLOW_CONFIG / "config"
DOTFILES_CONFIG = DOTFILES_FLOW_CONFIG / "config.yaml"


def ensure_dirs() -> None:
@@ -7,8 +7,8 @@ from dataclasses import dataclass
@dataclass
class PlatformInfo:
    os: str = "linux"  # "linux" or "macos"
    arch: str = "amd64"  # "amd64" or "arm64"
    platform: str = ""  # "linux-amd64", etc.
    arch: str = "x64"  # "x64" or "arm64"
    platform: str = ""  # "linux-x64", etc.

    def __post_init__(self):
        if not self.platform:
@@ -16,7 +16,7 @@ class PlatformInfo:

_OS_MAP = {"Darwin": "macos", "Linux": "linux"}
_ARCH_MAP = {"x86_64": "amd64", "aarch64": "arm64", "arm64": "arm64"}
_ARCH_MAP = {"x86_64": "x64", "amd64": "x64", "aarch64": "arm64", "arm64": "arm64"}


def detect_platform() -> PlatformInfo:
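The widened `_ARCH_MAP` folds the common machine-string spellings (`x86_64`, `amd64`, `aarch64`, `arm64`) onto the new `x64`/`arm64` vocabulary. A self-contained sketch of that lookup; the fall-through for unmapped values is an assumption, since `detect_platform`'s body is not shown in this diff:

```python
_ARCH_MAP = {"x86_64": "x64", "amd64": "x64", "aarch64": "arm64", "arm64": "arm64"}

def normalize_arch(machine: str) -> str:
    # Assumed behavior: unknown machine strings pass through unchanged
    # rather than raising, so new architectures degrade gracefully.
    return _ARCH_MAP.get(machine.lower(), machine)

assert normalize_arch("x86_64") == "x64"    # Linux uname -m
assert normalize_arch("arm64") == "arm64"   # macOS on Apple silicon
assert normalize_arch("riscv64") == "riscv64"  # unmapped: passed through
```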
@@ -1,9 +1,9 @@
"""Variable substitution for $VAR/${VAR} and {{var}} templates."""
"""Variable substitution for shell-style and template expressions."""

import os
import re
from pathlib import Path
from typing import Dict
from typing import Any, Dict


def substitute(text: str, variables: Dict[str, str]) -> str:
@@ -26,13 +26,36 @@ def substitute(text: str, variables: Dict[str, str]) -> str:
    return pattern.sub(_replace, text)


def substitute_template(text: str, context: Dict[str, str]) -> str:
    """Replace {{key}} placeholders with values from context dict."""
def _resolve_template_value(expr: str, context: Dict[str, Any]) -> Any:
    if expr.startswith("env."):
        env_key = expr.split(".", 1)[1]
        env_ctx = context.get("env", {})
        if isinstance(env_ctx, dict) and env_key in env_ctx:
            return env_ctx[env_key]
        return os.environ.get(env_key)

    if expr in context:
        return context[expr]

    current: Any = context
    for part in expr.split("."):
        if not isinstance(current, dict) or part not in current:
            return None
        current = current[part]

    return current


def substitute_template(text: str, context: Dict[str, Any]) -> str:
    """Replace {{expr}} placeholders with values from context dict."""
    if not isinstance(text, str):
        return text

    def _replace(match: re.Match[str]) -> str:
        key = match.group(1).strip()
        return context.get(key, match.group(0))
        value = _resolve_template_value(key, context)
        if value is None:
            return match.group(0)
        return str(value)

    return re.sub(r"\{\{(\w+)\}\}", _replace, text)
    return re.sub(r"\{\{\s*([^{}]+?)\s*\}\}", _replace, text)
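The new resolver accepts dotted paths and an `env.` prefix inside `{{…}}`, and the widened regex tolerates whitespace around the expression. A condensed, runnable version of the two helpers above (the in-context `env` override is omitted here for brevity):

```python
import os
import re
from typing import Any, Dict

def _resolve(expr: str, context: Dict[str, Any]) -> Any:
    if expr.startswith("env."):
        return os.environ.get(expr.split(".", 1)[1])
    if expr in context:
        return context[expr]
    # Walk dotted paths like "platform.arch" through nested dicts.
    current: Any = context
    for part in expr.split("."):
        if not isinstance(current, dict) or part not in current:
            return None
        current = current[part]
    return current

def substitute_template(text: str, context: Dict[str, Any]) -> str:
    def _replace(match: re.Match) -> str:
        value = _resolve(match.group(1).strip(), context)
        return match.group(0) if value is None else str(value)
    return re.sub(r"\{\{\s*([^{}]+?)\s*\}\}", _replace, text)

ctx = {"version": "0.10.4", "platform": {"os": "linux", "arch": "x64"}}
assert substitute_template("nvim-{{ platform.os }}-{{platform.arch}}.tar.gz", ctx) == "nvim-linux-x64.tar.gz"
assert substitute_template("v{{version}}", ctx) == "v0.10.4"
assert substitute_template("{{missing.key}}", ctx) == "{{missing.key}}"  # unresolved: left verbatim
```

Leaving unresolved placeholders verbatim (rather than substituting an empty string) makes a bad manifest expression visible in the rendered output instead of silently disappearing.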
@@ -1,12 +1,16 @@
"""Tests for flow.commands.bootstrap — action planning."""
"""Tests for flow.commands.bootstrap helpers and schema behavior."""

import os

import pytest

from flow.commands.bootstrap import (
    _ensure_required_variables,
    _get_profiles,
    _plan_actions,
    _normalize_profile_package_entry,
    _resolve_package_manager,
    _resolve_package_name,
    _resolve_package_spec,
    _resolve_pkg_source_name,
)
from flow.core.config import AppConfig, FlowContext
from flow.core.console import ConsoleLogger
@@ -18,127 +22,28 @@ def ctx():
    return FlowContext(
        config=AppConfig(),
        manifest={
            "binaries": {
                "neovim": {
                    "version": "0.10.4",
                    "source": "github:neovim/neovim",
                    "asset-pattern": "nvim-{{os}}-{{arch}}.tar.gz",
                    "platform-map": {"linux-arm64": {"os": "linux", "arch": "arm64"}},
                    "install-script": "echo install",
            "packages": [
                {
                    "name": "fd",
                    "type": "pkg",
                    "sources": {"apt": "fd-find", "dnf": "fd-find", "brew": "fd"},
                },
                },
                {
                    "name": "neovim",
                    "type": "binary",
                    "source": "github:neovim/neovim",
                    "version": "0.10.4",
                    "asset-pattern": "nvim-{{os}}-{{arch}}.tar.gz",
                    "platform-map": {"linux-x64": {"os": "linux", "arch": "x64"}},
                    "install": {"bin": ["bin/nvim"]},
                },
            ]
        },
        platform=PlatformInfo(os="linux", arch="arm64", platform="linux-arm64"),
        platform=PlatformInfo(os="linux", arch="x64", platform="linux-x64"),
        console=ConsoleLogger(),
    )


def test_plan_empty_profile(ctx):
    actions = _plan_actions(ctx, "test", {}, {})
    assert actions == []


def test_plan_hostname(ctx):
    actions = _plan_actions(ctx, "test", {"hostname": "myhost"}, {})
    types = [a.type for a in actions]
    assert "set-hostname" in types


def test_plan_locale_and_shell(ctx):
    actions = _plan_actions(ctx, "test", {"locale": "en_US.UTF-8", "shell": "zsh"}, {})
    types = [a.type for a in actions]
    assert "set-locale" in types
    assert "set-shell" in types


def test_plan_packages(ctx):
    env_config = {
        "packages": {
            "standard": ["git", "zsh", "tmux"],
            "binary": ["neovim"],
        },
    }
    actions = _plan_actions(ctx, "test", env_config, {})
    types = [a.type for a in actions]
    assert "pm-update" in types
    assert "install-packages" in types
    assert "install-binary" in types


def test_plan_packages_uses_package_map(ctx):
    ctx.manifest["package-map"] = {
        "fd": {"apt": "fd-find"},
    }
    env_config = {
        "package-manager": "apt",
        "packages": {
            "standard": ["fd"],
        },
    }

    actions = _plan_actions(ctx, "test", env_config, {})
    install = [a for a in actions if a.type == "install-packages"][0]
    assert install.data["packages"] == ["fd-find"]


def test_plan_ssh_keygen(ctx):
    env_config = {
        "ssh_keygen": [
            {"type": "ed25519", "comment": "test@host", "filename": "id_ed25519"},
        ],
    }
    actions = _plan_actions(ctx, "test", env_config, {})
    types = [a.type for a in actions]
    assert "generate-ssh-key" in types


def test_plan_runcmd(ctx):
    env_config = {"runcmd": ["echo hello", "mkdir -p ~/tmp"]}
    actions = _plan_actions(ctx, "test", env_config, {})
    run_cmds = [a for a in actions if a.type == "run-command"]
    assert len(run_cmds) == 2


def test_plan_requires(ctx):
    env_config = {"requires": ["VAR1", "VAR2"]}
    actions = _plan_actions(ctx, "test", env_config, {})
    checks = [a for a in actions if a.type == "check-variable"]
    assert len(checks) == 2
    assert all(not a.skip_on_error for a in checks)


def test_plan_full_profile(ctx):
    """Test planning with a realistic linux-vm profile."""
    env_config = {
        "requires": ["TARGET_HOSTNAME"],
        "os": "linux",
        "hostname": "$TARGET_HOSTNAME",
        "shell": "zsh",
        "locale": "en_US.UTF-8",
        "packages": {
            "standard": ["zsh", "tmux", "git"],
            "binary": ["neovim"],
        },
        "ssh_keygen": [{"type": "ed25519", "comment": "test"}],
        "configs": ["bin"],
        "runcmd": ["mkdir -p ~/projects"],
    }
    actions = _plan_actions(ctx, "linux-vm", env_config, {"TARGET_HOSTNAME": "myvm"})
    assert len(actions) >= 8

    types = [a.type for a in actions]
    assert "check-variable" in types
    assert "set-hostname" in types
    assert "set-locale" in types
    assert "set-shell" in types
    assert "pm-update" in types
    assert "install-packages" in types
    assert "install-binary" in types
    assert "generate-ssh-key" in types
    assert "link-config" in types
    assert "run-command" in types


def test_get_profiles_from_manifest(ctx):
    ctx.manifest = {"profiles": {"linux": {"os": "linux"}}}
    assert "linux" in _get_profiles(ctx)
@@ -151,38 +56,88 @@ def test_get_profiles_rejects_environments(ctx):
def test_resolve_package_manager_explicit_value(ctx):
    assert _resolve_package_manager(ctx, {"package-manager": "dnf"}) == "dnf"
    assert _resolve_package_manager(ctx, {"os": "linux", "package-manager": "dnf"}) == "dnf"


def test_resolve_package_manager_linux_ubuntu(ctx):
    os_release = "ID=ubuntu\nID_LIKE=debian"
    assert _resolve_package_manager(ctx, {}, os_release_text=os_release) == "apt"
def test_resolve_package_manager_linux_auto_apt(monkeypatch, ctx):
    monkeypatch.setattr("flow.commands.bootstrap.shutil.which", lambda name: "/usr/bin/apt" if name == "apt" else None)
    assert _resolve_package_manager(ctx, {"os": "linux"}) == "apt"


def test_resolve_package_manager_linux_fedora(ctx):
    os_release = "ID=fedora\nID_LIKE=rhel"
    assert _resolve_package_manager(ctx, {}, os_release_text=os_release) == "dnf"
def test_resolve_package_manager_linux_auto_dnf(monkeypatch, ctx):
    monkeypatch.setattr("flow.commands.bootstrap.shutil.which", lambda name: "/usr/bin/dnf" if name == "dnf" else None)
    assert _resolve_package_manager(ctx, {"os": "linux"}) == "dnf"


def test_resolve_package_name_with_package_map(ctx):
    ctx.manifest["package-map"] = {
def test_resolve_package_manager_requires_os(ctx):
    with pytest.raises(RuntimeError, match="must be set"):
        _resolve_package_manager(ctx, {})


def test_normalize_package_entry_string():
    assert _normalize_profile_package_entry("git") == {"name": "git"}


def test_normalize_package_entry_type_prefix():
    assert _normalize_profile_package_entry("cask/wezterm") == {"name": "wezterm", "type": "cask"}


def test_normalize_package_entry_object():
    out = _normalize_profile_package_entry({"name": "docker", "allow_sudo": True})
    assert out["name"] == "docker"
    assert out["allow_sudo"] is True


def test_resolve_package_spec_uses_catalog_type(ctx):
    catalog = {
        "fd": {
            "apt": "fd-find",
            "dnf": "fd-find",
            "brew": "fd",
            "name": "fd",
            "type": "pkg",
            "sources": {"apt": "fd-find"},
        }
    }
    assert _resolve_package_name(ctx, "fd", "apt") == "fd-find"
    assert _resolve_package_name(ctx, "fd", "dnf") == "fd-find"
    assert _resolve_package_name(ctx, "fd", "brew") == "fd"
    resolved = _resolve_package_spec(catalog, {"name": "fd"})
    assert resolved["type"] == "pkg"
    assert resolved["sources"]["apt"] == "fd-find"


def test_resolve_package_name_falls_back_with_warning(ctx):
    warnings = []
    ctx.console.warn = warnings.append
    ctx.manifest["package-map"] = {"python3-dev": {"apt": "python3-dev"}}
def test_resolve_package_spec_defaults_to_pkg(ctx):
    resolved = _resolve_package_spec({}, {"name": "git"})
    assert resolved["type"] == "pkg"

    resolved = _resolve_package_name(ctx, "python3-dev", "dnf", warn_missing=True)

    assert resolved == "python3-dev"
    assert warnings
def test_resolve_package_spec_profile_override(ctx):
    catalog = {
        "neovim": {
            "name": "neovim",
            "type": "binary",
            "version": "0.10.4",
        }
    }
    resolved = _resolve_package_spec(catalog, {"name": "neovim", "post-install": "echo ok"})
    assert resolved["type"] == "binary"
    assert resolved["post-install"] == "echo ok"


def test_resolve_pkg_source_name_with_mapping(ctx):
    spec = {"name": "fd", "sources": {"apt": "fd-find", "dnf": "fd-find", "brew": "fd"}}
    assert _resolve_pkg_source_name(spec, "apt") == "fd-find"
    assert _resolve_pkg_source_name(spec, "dnf") == "fd-find"
    assert _resolve_pkg_source_name(spec, "brew") == "fd"


def test_resolve_pkg_source_name_fallback_to_name(ctx):
    spec = {"name": "ripgrep", "sources": {"apt": "ripgrep"}}
    assert _resolve_pkg_source_name(spec, "dnf") == "ripgrep"


def test_ensure_required_variables_missing_raises():
    with pytest.raises(RuntimeError, match="Missing required environment variables"):
        _ensure_required_variables({"requires": ["USER_EMAIL", "TARGET_HOSTNAME"]}, {"USER_EMAIL": "a@b"})


def test_ensure_required_variables_accepts_vars(monkeypatch):
    env = dict(os.environ)
    env["USER_EMAIL"] = "a@b"
    env["TARGET_HOSTNAME"] = "devbox"
    _ensure_required_variables({"requires": ["USER_EMAIL", "TARGET_HOSTNAME"]}, env)
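The `_normalize_profile_package_entry` tests above pin down three accepted spellings for a profile package entry: a bare name, a `type/name` shorthand, and a full object. A sketch of the behavior those tests imply (the real helper may handle more cases):

```python
from typing import Any, Dict, Union

def normalize_entry(entry: Union[str, Dict[str, Any]]) -> Dict[str, Any]:
    if isinstance(entry, dict):
        return dict(entry)                        # object form: taken as-is
    if "/" in entry:
        pkg_type, name = entry.split("/", 1)
        return {"name": name, "type": pkg_type}   # "cask/wezterm" shorthand
    return {"name": entry}                        # bare name: "git"

assert normalize_entry("git") == {"name": "git"}
assert normalize_entry("cask/wezterm") == {"name": "wezterm", "type": "cask"}
assert normalize_entry({"name": "docker", "allow_sudo": True})["allow_sudo"] is True
```

Normalizing early means everything downstream (catalog lookup, per-source name resolution) can assume a dict with at least a `name` key.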
@@ -7,7 +7,9 @@ import sys

def _clean_env():
    """Return env dict without DF_* variables that trigger enter's guard."""
    return {k: v for k, v in os.environ.items() if not k.startswith("DF_")}
    env = {k: v for k, v in os.environ.items() if not k.startswith("DF_")}
    env["FLOW_SKIP_SUDO_REFRESH"] = "1"
    return env


def test_version():
@@ -1,37 +1,34 @@
"""Tests for flow.core.config."""

from pathlib import Path
import pytest

from flow.core.config import AppConfig, FlowContext, load_config, load_manifest
from flow.core.config import AppConfig, load_config, load_manifest


def test_load_config_missing_file(tmp_path):
def test_load_config_missing_path(tmp_path):
    cfg = load_config(tmp_path / "nonexistent")
    assert isinstance(cfg, AppConfig)
    assert cfg.dotfiles_url == ""
    assert cfg.container_registry == "registry.tomastm.com"


def test_load_config_ini(tmp_path):
    config_file = tmp_path / "config"
    config_file.write_text("""
[repository]
dotfiles_url=git@github.com:user/dots.git
dotfiles_branch=dev
def test_load_config_merged_yaml(tmp_path):
    (tmp_path / "10-config.yaml").write_text(
        "repository:\n"
        "  dotfiles-url: git@github.com:user/dots.git\n"
        "  dotfiles-branch: dev\n"
        "paths:\n"
        "  projects-dir: ~/code\n"
        "defaults:\n"
        "  container-registry: my.registry.com\n"
        "  container-tag: v1\n"
        "  tmux-session: main\n"
        "targets:\n"
        "  personal: orb personal@orb\n"
        "  work@ec2: work.ec2.internal ~/.ssh/id_work\n"
    )

[paths]
projects_dir=~/code

[defaults]
container_registry=my.registry.com
container_tag=v1
tmux_session=main

[targets]
personal=orb personal@orb
work=ec2 work.ec2.internal ~/.ssh/id_work
""")
    cfg = load_config(config_file)
    cfg = load_config(tmp_path)
    assert cfg.dotfiles_url == "git@github.com:user/dots.git"
    assert cfg.dotfiles_branch == "dev"
    assert cfg.projects_dir == "~/code"
@@ -40,31 +37,28 @@ work=ec2 work.ec2.internal ~/.ssh/id_work
|
||||
assert cfg.tmux_session == "main"
|
||||
assert len(cfg.targets) == 2
|
||||
assert cfg.targets[0].namespace == "personal"
|
||||
assert cfg.targets[0].platform == "orb"
|
||||
assert cfg.targets[0].ssh_host == "personal@orb"
|
||||
assert cfg.targets[1].ssh_identity == "~/.ssh/id_work"
|
||||
|
||||
|
||||
def test_load_manifest_missing_file(tmp_path):
|
||||
result = load_manifest(tmp_path / "nonexistent.yaml")
|
||||
def test_load_manifest_missing_path(tmp_path):
|
||||
result = load_manifest(tmp_path / "nonexistent")
|
||||
assert result == {}
|
||||
|
||||
|
||||
def test_load_manifest_valid(tmp_path):
|
||||
manifest = tmp_path / "manifest.yaml"
|
||||
manifest.write_text("""
|
||||
profiles:
|
||||
linux-vm:
|
||||
os: linux
|
||||
hostname: test
|
||||
""")
|
||||
result = load_manifest(manifest)
|
||||
assert "profiles" in result
|
||||
def test_load_manifest_valid_directory(tmp_path):
|
||||
(tmp_path / "manifest.yaml").write_text(
|
||||
"profiles:\n"
|
||||
" linux-vm:\n"
|
||||
" os: linux\n"
|
||||
" hostname: devbox\n"
|
||||
)
|
||||
result = load_manifest(tmp_path)
|
||||
assert result["profiles"]["linux-vm"]["os"] == "linux"
|
||||
|
||||
|
||||
def test_load_manifest_non_dict(tmp_path):
|
||||
manifest = tmp_path / "manifest.yaml"
|
||||
manifest.write_text("- a\n- b\n")
|
||||
result = load_manifest(manifest)
|
||||
assert result == {}
|
||||
def test_load_manifest_non_dict_raises(tmp_path):
|
||||
bad = tmp_path / "bad.yaml"
|
||||
bad.write_text("- a\n- b\n")
|
||||
|
||||
with pytest.raises(RuntimeError, match="must contain a mapping"):
|
||||
load_manifest(bad)
|
||||
|
||||
@@ -1,80 +1,75 @@
-"""Tests for flow.commands.dotfiles — link/unlink/status logic."""
-
-import json
-from pathlib import Path
+"""Tests for flow.commands.dotfiles discovery and path resolution."""

 import pytest

 from flow.commands.dotfiles import _discover_packages, _resolve_edit_target, _walk_package
-from flow.core.config import AppConfig, FlowContext
-from flow.core.console import ConsoleLogger
-from flow.core.platform import PlatformInfo


-@pytest.fixture
-def dotfiles_tree(tmp_path):
-    """Create a sample dotfiles directory structure."""
-    common = tmp_path / "common"
-    (common / "zsh").mkdir(parents=True)
-    (common / "zsh" / ".zshrc").write_text("# zshrc")
-    (common / "zsh" / ".zshenv").write_text("# zshenv")
-    (common / "tmux").mkdir(parents=True)
-    (common / "tmux" / ".tmux.conf").write_text("# tmux")
+def _make_tree(tmp_path):
+    flow_root = tmp_path
+    shared = flow_root / "_shared"
+    (shared / "zsh").mkdir(parents=True)
+    (shared / "zsh" / ".zshrc").write_text("# zsh")
+    (shared / "tmux").mkdir(parents=True)
+    (shared / "tmux" / ".tmux.conf").write_text("# tmux")

-    profiles = tmp_path / "profiles" / "work"
-    (profiles / "git").mkdir(parents=True)
-    (profiles / "git" / ".gitconfig").write_text("[user]\nname = Work")
+    profile = flow_root / "work"
+    (profile / "git").mkdir(parents=True)
+    (profile / "git" / ".gitconfig").write_text("[user]\nname = Work")

     return tmp_path


-def test_discover_packages_common(dotfiles_tree):
-    packages = _discover_packages(dotfiles_tree)
+def test_discover_packages_shared_only(tmp_path):
+    tree = _make_tree(tmp_path)
+    packages = _discover_packages(tree)
     assert "zsh" in packages
     assert "tmux" in packages
-    assert "git" not in packages  # git is only in profiles
+    assert "git" not in packages


-def test_discover_packages_with_profile(dotfiles_tree):
-    packages = _discover_packages(dotfiles_tree, profile="work")
+def test_discover_packages_with_profile(tmp_path):
+    tree = _make_tree(tmp_path)
+    packages = _discover_packages(tree, profile="work")
     assert "zsh" in packages
     assert "tmux" in packages
     assert "git" in packages


-def test_discover_packages_profile_overrides(dotfiles_tree):
-    # Add zsh to work profile
-    work_zsh = dotfiles_tree / "profiles" / "work" / "zsh"
-    work_zsh.mkdir(parents=True)
-    (work_zsh / ".zshrc").write_text("# work zshrc")
+def test_discover_packages_profile_overrides_shared(tmp_path):
+    tree = _make_tree(tmp_path)
+    profile_zsh = tree / "work" / "zsh"
+    profile_zsh.mkdir(parents=True)
+    (profile_zsh / ".zshrc").write_text("# work zsh")

-    packages = _discover_packages(dotfiles_tree, profile="work")
-    # Profile should override common
-    assert packages["zsh"] == work_zsh
+    with pytest.raises(RuntimeError, match="Conflicting dotfile targets"):
+        from flow.commands.dotfiles import _collect_home_specs
+        _collect_home_specs(tree, tmp_path / "home", "work", set(), None)


-def test_walk_package(dotfiles_tree):
-    home = Path("/tmp/fakehome")
-    source = dotfiles_tree / "common" / "zsh"
-    pairs = list(_walk_package(source, home))
-    assert len(pairs) == 2
-    sources = {str(s.name) for s, _ in pairs}
-    assert ".zshrc" in sources
-    assert ".zshenv" in sources
-    targets = {str(t) for _, t in pairs}
-    assert str(home / ".zshrc") in targets
-    assert str(home / ".zshenv") in targets
+def test_walk_package_returns_relative_paths(tmp_path):
+    tree = _make_tree(tmp_path)
+    source = tree / "_shared" / "zsh"
+
+    pairs = list(_walk_package(source))
+    assert len(pairs) == 1
+    src, rel = pairs[0]
+    assert src.name == ".zshrc"
+    assert str(rel) == ".zshrc"


-def test_resolve_edit_target_package(dotfiles_tree):
-    target = _resolve_edit_target("zsh", dotfiles_dir=dotfiles_tree)
-    assert target == dotfiles_tree / "common" / "zsh"
+def test_resolve_edit_target_package(tmp_path):
+    tree = _make_tree(tmp_path)
+    target = _resolve_edit_target("zsh", dotfiles_dir=tree)
+    assert target == tree / "_shared" / "zsh"


-def test_resolve_edit_target_repo_path(dotfiles_tree):
-    target = _resolve_edit_target("common/zsh/.zshrc", dotfiles_dir=dotfiles_tree)
-    assert target == dotfiles_tree / "common" / "zsh" / ".zshrc"
+def test_resolve_edit_target_repo_path(tmp_path):
+    tree = _make_tree(tmp_path)
+    target = _resolve_edit_target("_shared/zsh/.zshrc", dotfiles_dir=tree)
+    assert target == tree / "_shared" / "zsh" / ".zshrc"


-def test_resolve_edit_target_missing_returns_none(dotfiles_tree):
-    assert _resolve_edit_target("does-not-exist", dotfiles_dir=dotfiles_tree) is None
+def test_resolve_edit_target_missing_returns_none(tmp_path):
+    tree = _make_tree(tmp_path)
+    assert _resolve_edit_target("does-not-exist", dotfiles_dir=tree) is None
@@ -1,298 +1,94 @@
-"""Integration tests for dotfiles tree folding behavior."""
+"""Tests for flat-layout dotfiles helpers and state format."""

-import os
+import json
 from pathlib import Path

 import pytest

-from flow.commands.dotfiles import _discover_packages, _walk_package
-from flow.core.config import AppConfig, FlowContext
-from flow.core.console import ConsoleLogger
-from flow.core.platform import PlatformInfo
-from flow.core.stow import LinkTree, TreeFolder
+from flow.commands.dotfiles import (
+    LinkSpec,
+    _collect_home_specs,
+    _collect_root_specs,
+    _list_profiles,
+    _load_link_specs_from_state,
+    _save_link_specs_to_state,
+)


-@pytest.fixture
-def ctx():
-    """Create a mock FlowContext."""
-    return FlowContext(
-        config=AppConfig(),
-        manifest={},
-        platform=PlatformInfo(),
-        console=ConsoleLogger(),
-    )
+def _make_flow_tree(tmp_path: Path) -> Path:
+    flow_root = tmp_path
+
+    (flow_root / "_shared" / "git").mkdir(parents=True)
+    (flow_root / "_shared" / "git" / ".gitconfig").write_text("shared")
+    (flow_root / "_shared" / "tmux").mkdir(parents=True)
+    (flow_root / "_shared" / "tmux" / ".tmux.conf").write_text("tmux")
+
+    (flow_root / "work" / "git").mkdir(parents=True)
+    (flow_root / "work" / "git" / ".gitconfig").write_text("profile")
+    (flow_root / "work" / "nvim").mkdir(parents=True)
+    (flow_root / "work" / "nvim" / ".config" / "nvim").mkdir(parents=True)
+    (flow_root / "work" / "nvim" / ".config" / "nvim" / "init.lua").write_text("-- init")
+
+    (flow_root / "_root" / "general" / "etc").mkdir(parents=True)
+    (flow_root / "_root" / "general" / "etc" / "hostname").write_text("devbox")
+
+    return flow_root


-@pytest.fixture
-def dotfiles_with_nested(tmp_path):
-    """Create dotfiles with nested directory structure for folding tests."""
-    common = tmp_path / "common"
-
-    # nvim package with nested config
-    nvim = common / "nvim" / ".config" / "nvim"
-    nvim.mkdir(parents=True)
-    (nvim / "init.lua").write_text("-- init")
-    (nvim / "lua").mkdir()
-    (nvim / "lua" / "config.lua").write_text("-- config")
-    (nvim / "lua" / "plugins.lua").write_text("-- plugins")
-
-    # zsh package with flat structure
-    zsh = common / "zsh"
-    zsh.mkdir(parents=True)
-    (zsh / ".zshrc").write_text("# zshrc")
-    (zsh / ".zshenv").write_text("# zshenv")
-
-    return tmp_path
+def test_list_profiles_ignores_reserved_dirs(tmp_path):
+    flow_root = _make_flow_tree(tmp_path)
+    profiles = _list_profiles(flow_root)
+    assert profiles == ["work"]


-@pytest.fixture
-def home_dir(tmp_path):
-    """Create a temporary home directory."""
+def test_collect_home_specs_conflict_fails(tmp_path):
+    flow_root = _make_flow_tree(tmp_path)
     home = tmp_path / "home"
     home.mkdir()
-    return home

+    with pytest.raises(RuntimeError, match="Conflicting dotfile targets"):
+        _collect_home_specs(flow_root, home, "work", set(), None)

-def test_tree_folding_single_package(dotfiles_with_nested, home_dir):
-    """Test that a single package can be folded into directory symlink."""
-    # Discover nvim package
-    packages = _discover_packages(dotfiles_with_nested)
-    nvim_source = packages["nvim"]
-
-    # Build link tree
-    tree = LinkTree()
-    folder = TreeFolder(tree)
-
-    # Plan links for all nvim files
-    operations = []
-    for src, dst in _walk_package(nvim_source, home_dir):
-        ops = folder.plan_link(src, dst, "nvim")
-        operations.extend(ops)
-
-    # Execute operations
-    folder.execute_operations(operations, dry_run=False)
-
-    # Check that we created efficient symlinks
-    # In ideal case, we'd have one directory symlink instead of 3 file symlinks
-    nvim_config = home_dir / ".config" / "nvim"
-
-    # Verify links work
-    assert (nvim_config / "init.lua").exists()
-    assert (nvim_config / "lua" / "config.lua").exists()
+def test_collect_root_specs_maps_to_absolute_paths(tmp_path):
+    flow_root = _make_flow_tree(tmp_path)
+    specs = _collect_root_specs(flow_root, set(), include_root=True)
+    assert Path("/etc/hostname") in specs
+    assert specs[Path("/etc/hostname")].package == "_root/general"


-def test_tree_unfolding_conflict(dotfiles_with_nested, home_dir):
-    """Test that tree unfolds when second package needs same directory."""
-    common = dotfiles_with_nested / "common"
+def test_state_round_trip(tmp_path, monkeypatch):
+    state_file = tmp_path / "linked.json"
+    monkeypatch.setattr("flow.commands.dotfiles.LINKED_STATE", state_file)

-    # Create second package that shares .config
-    tmux = common / "tmux" / ".config" / "tmux"
-    tmux.mkdir(parents=True)
-    (tmux / "tmux.conf").write_text("# tmux")
-
-    # First, link nvim (can fold .config/nvim)
-    tree = LinkTree()
-    folder = TreeFolder(tree)
-
-    nvim_source = common / "nvim"
-    for src, dst in _walk_package(nvim_source, home_dir):
-        ops = folder.plan_link(src, dst, "nvim")
-        folder.execute_operations(ops, dry_run=False)
-
-    # Now link tmux (should unfold if needed)
-    tmux_source = common / "tmux"
-    for src, dst in _walk_package(tmux_source, home_dir):
-        ops = folder.plan_link(src, dst, "tmux")
-        folder.execute_operations(ops, dry_run=False)
-
-    # Both packages should be linked
-    assert (home_dir / ".config" / "nvim" / "init.lua").exists()
-    assert (home_dir / ".config" / "tmux" / "tmux.conf").exists()
-
-
-def test_state_format_with_directory_links(dotfiles_with_nested, home_dir):
-    """Test that state file correctly tracks directory vs file links."""
-    tree = LinkTree()
-
-    # Add a directory link
-    tree.add_link(
-        home_dir / ".config" / "nvim",
-        dotfiles_with_nested / "common" / "nvim" / ".config" / "nvim",
-        "nvim",
-        is_dir_link=True,
-    )
-
-    # Add a file link
-    tree.add_link(
-        home_dir / ".zshrc",
-        dotfiles_with_nested / "common" / "zsh" / ".zshrc",
-        "zsh",
-        is_dir_link=False,
-    )
-
-    # Convert to state
-    state = tree.to_state()
-
-    # Verify format
-    assert state["version"] == 2
-    nvim_link = state["links"]["nvim"][str(home_dir / ".config" / "nvim")]
-    assert nvim_link["is_directory_link"] is True
-
-    zsh_link = state["links"]["zsh"][str(home_dir / ".zshrc")]
-    assert zsh_link["is_directory_link"] is False
-
-
-def test_state_backward_compatibility_rejected(home_dir):
-    """Old state format should be rejected (no backward compatibility)."""
-    old_state = {
-        "links": {
-            "zsh": {
-                str(home_dir / ".zshrc"): str(home_dir.parent / "dotfiles" / "zsh" / ".zshrc"),
-            }
-        }
-    }
+    specs = {
+        Path("/home/user/.gitconfig"): LinkSpec(
+            source=Path("/repo/_shared/git/.gitconfig"),
+            target=Path("/home/user/.gitconfig"),
+            package="_shared/git",
+        )
+    }
+    _save_link_specs_to_state(specs)

+    loaded = _load_link_specs_from_state()
+    assert Path("/home/user/.gitconfig") in loaded
+    assert loaded[Path("/home/user/.gitconfig")].package == "_shared/git"


+def test_state_old_format_rejected(tmp_path, monkeypatch):
+    state_file = tmp_path / "linked.json"
+    monkeypatch.setattr("flow.commands.dotfiles.LINKED_STATE", state_file)
+    state_file.write_text(
+        json.dumps(
+            {
+                "links": {
+                    "zsh": {
+                        "/home/user/.zshrc": "/repo/.zshrc",
+                    }
+                }
+            }
+        )
+    )

     with pytest.raises(RuntimeError, match="Unsupported linked state format"):
-        LinkTree.from_state(old_state)
-
-
-def test_discover_packages_with_flow_package(tmp_path):
-    """Test discovering the flow package itself from dotfiles."""
-    common = tmp_path / "common"
-
-    # Create flow package
-    flow_pkg = common / "flow" / ".config" / "flow"
-    flow_pkg.mkdir(parents=True)
-    (flow_pkg / "manifest.yaml").write_text("profiles: {}")
-    (flow_pkg / "config").write_text("[repository]\n")
-
-    packages = _discover_packages(tmp_path)
-
-    # Flow package should be discovered like any other
-    assert "flow" in packages
-    assert packages["flow"] == common / "flow"
-
-
-def test_walk_flow_package(tmp_path):
-    """Test walking the flow package structure."""
-    flow_pkg = tmp_path / "flow"
-    flow_config = flow_pkg / ".config" / "flow"
-    flow_config.mkdir(parents=True)
-    (flow_config / "manifest.yaml").write_text("profiles: {}")
-    (flow_config / "config").write_text("[repository]\n")
-
-    home = Path("/tmp/fakehome")
-    pairs = list(_walk_package(flow_pkg, home))
-
-    # Should find both files
-    assert len(pairs) == 2
-    targets = [str(t) for _, t in pairs]
-    assert str(home / ".config" / "flow" / "manifest.yaml") in targets
-    assert str(home / ".config" / "flow" / "config") in targets
-
-
-def test_conflict_detection_before_execution(dotfiles_with_nested, home_dir):
-    """Test that conflicts are detected before any changes are made."""
-    # Create existing file that conflicts
-    existing = home_dir / ".zshrc"
-    existing.parent.mkdir(parents=True, exist_ok=True)
-    existing.write_text("# existing zshrc")
-
-    # Try to link package that wants .zshrc
-    tree = LinkTree()
-    folder = TreeFolder(tree)
-
-    zsh_source = dotfiles_with_nested / "common" / "zsh"
-    operations = []
-    for src, dst in _walk_package(zsh_source, home_dir):
-        ops = folder.plan_link(src, dst, "zsh")
-        operations.extend(ops)
-
-    # Should detect conflict
-    conflicts = folder.detect_conflicts(operations)
-    assert len(conflicts) > 0
-    assert any("already exists" in c for c in conflicts)
-
-    # Original file should be unchanged
-    assert existing.read_text() == "# existing zshrc"
-
-
-def test_profile_switching_relink(tmp_path):
-    """Test switching between profiles maintains correct links."""
-    # Create profiles
-    common = tmp_path / "common"
-    profiles = tmp_path / "profiles"
-
-    # Common zsh
-    (common / "zsh").mkdir(parents=True)
-    (common / "zsh" / ".zshrc").write_text("# common zsh")
-
-    # Work profile override
-    (profiles / "work" / "zsh").mkdir(parents=True)
-    (profiles / "work" / "zsh" / ".zshrc").write_text("# work zsh")
-
-    # Personal profile override
-    (profiles / "personal" / "zsh").mkdir(parents=True)
-    (profiles / "personal" / "zsh" / ".zshrc").write_text("# personal zsh")
-
-    # Test that profile discovery works correctly
-    work_packages = _discover_packages(tmp_path, profile="work")
-    personal_packages = _discover_packages(tmp_path, profile="personal")
-
-    # Both should find zsh, but from different sources
-    assert "zsh" in work_packages
-    assert "zsh" in personal_packages
-    assert work_packages["zsh"] != personal_packages["zsh"]
-
-
-def test_can_fold_empty_directory():
-    """Test can_fold with empty directory."""
-    tree = LinkTree()
-    target_dir = Path("/home/user/.config/nvim")
-
-    # Empty directory - should be able to fold
-    assert tree.can_fold(target_dir, "nvim")
-
-
-def test_can_fold_with_subdirectories():
-    """Test can_fold with nested directory structure."""
-    tree = LinkTree()
-    base = Path("/home/user/.config/nvim")
-
-    # Add nested files from same package
-    tree.add_link(base / "init.lua", Path("/dotfiles/nvim/init.lua"), "nvim")
-    tree.add_link(base / "lua" / "config.lua", Path("/dotfiles/nvim/lua/config.lua"), "nvim")
-    tree.add_link(base / "lua" / "plugins" / "init.lua", Path("/dotfiles/nvim/lua/plugins/init.lua"), "nvim")
-
-    # Should be able to fold at base level
-    assert tree.can_fold(base, "nvim")
-
-    # Add file from different package
-    tree.add_link(base / "other.lua", Path("/dotfiles/other/other.lua"), "other")
-
-    # Now cannot fold
-    assert not tree.can_fold(base, "nvim")
-
-
-def test_execute_operations_creates_parent_dirs(tmp_path):
-    """Test that execute_operations creates necessary parent directories."""
-    tree = LinkTree()
-    folder = TreeFolder(tree)
-
-    source = tmp_path / "dotfiles" / "nvim" / ".config" / "nvim" / "init.lua"
-    target = tmp_path / "home" / ".config" / "nvim" / "init.lua"
-
-    # Create source
-    source.parent.mkdir(parents=True)
-    source.write_text("-- init")
-
-    # Target parent doesn't exist yet
-    assert not target.parent.exists()
-
-    # Plan and execute
-    ops = folder.plan_link(source, target, "nvim")
-    folder.execute_operations(ops, dry_run=False)
-
-    # Parent should be created
-    assert target.parent.exists()
-    assert target.is_symlink()
+        _load_link_specs_from_state()
@@ -18,15 +18,15 @@ from flow.core.paths import (


 def test_config_dir_under_home():
-    assert ".config/devflow" in str(CONFIG_DIR)
+    assert ".config/flow" in str(CONFIG_DIR)


 def test_data_dir_under_home():
-    assert ".local/share/devflow" in str(DATA_DIR)
+    assert ".local/share/flow" in str(DATA_DIR)


 def test_state_dir_under_home():
-    assert ".local/state/devflow" in str(STATE_DIR)
+    assert ".local/state/flow" in str(STATE_DIR)


 def test_manifest_file_in_config_dir():
@@ -34,7 +34,7 @@ def test_manifest_file_in_config_dir():


 def test_config_file_in_config_dir():
-    assert CONFIG_FILE == CONFIG_DIR / "config"
+    assert CONFIG_FILE == CONFIG_DIR / "config.yaml"


 def test_dotfiles_dir():
@@ -11,7 +11,7 @@ def test_detect_platform_returns_platforminfo():
     info = detect_platform()
     assert isinstance(info, PlatformInfo)
     assert info.os in ("linux", "macos")
-    assert info.arch in ("amd64", "arm64")
+    assert info.arch in ("x64", "arm64")
     assert info.platform == f"{info.os}-{info.arch}"


@@ -27,4 +27,3 @@ def test_detect_platform_unsupported_arch(monkeypatch):
         detect_platform()


-
@@ -1,215 +1,81 @@
-"""Tests for self-hosting flow config from dotfiles repository."""
+"""Tests for self-hosted merged YAML config loading."""

-from pathlib import Path
-from unittest.mock import patch
-
 import pytest
-import yaml

 from flow.core import paths as paths_module
 from flow.core.config import load_config, load_manifest


 @pytest.fixture
-def mock_paths(tmp_path, monkeypatch):
-    """Mock path constants for testing."""
-    config_dir = tmp_path / "config"
-    dotfiles_dir = tmp_path / "dotfiles"
+def mock_roots(tmp_path, monkeypatch):
+    local_root = tmp_path / "local-flow"
+    dotfiles_root = tmp_path / "dotfiles" / "_shared" / "flow" / ".config" / "flow"

-    config_dir.mkdir()
-    dotfiles_dir.mkdir()
+    local_root.mkdir(parents=True)
+    dotfiles_root.mkdir(parents=True)

-    test_paths = {
-        "config_dir": config_dir,
-        "dotfiles_dir": dotfiles_dir,
-        "local_config": config_dir / "config",
-        "local_manifest": config_dir / "manifest.yaml",
-        "dotfiles_config": dotfiles_dir / "flow" / ".config" / "flow" / "config",
-        "dotfiles_manifest": dotfiles_dir / "flow" / ".config" / "flow" / "manifest.yaml",
+    monkeypatch.setattr(paths_module, "CONFIG_DIR", local_root)
+    monkeypatch.setattr(paths_module, "DOTFILES_FLOW_CONFIG", dotfiles_root)
+
+    return {
+        "local": local_root,
+        "dotfiles": dotfiles_root,
     }

-    # Patch at the paths module level
-    monkeypatch.setattr(paths_module, "CONFIG_FILE", test_paths["local_config"])
-    monkeypatch.setattr(paths_module, "MANIFEST_FILE", test_paths["local_manifest"])
-    monkeypatch.setattr(paths_module, "DOTFILES_CONFIG", test_paths["dotfiles_config"])
-    monkeypatch.setattr(paths_module, "DOTFILES_MANIFEST", test_paths["dotfiles_manifest"])
-
-    return test_paths

+def test_load_manifest_priority_dotfiles_first(mock_roots):
+    (mock_roots["local"] / "profiles.yaml").write_text("profiles:\n  local: {os: linux}\n")
+    (mock_roots["dotfiles"] / "profiles.yaml").write_text("profiles:\n  dotfiles: {os: macos}\n")

-def test_load_manifest_priority_dotfiles_first(mock_paths):
-    """Test that dotfiles manifest takes priority over local."""
-    # Create both manifests
-    local_manifest = mock_paths["local_manifest"]
-    dotfiles_manifest = mock_paths["dotfiles_manifest"]
-
-    local_manifest.write_text("profiles:\n  local:\n    os: linux")
-
-    dotfiles_manifest.parent.mkdir(parents=True)
-    dotfiles_manifest.write_text("profiles:\n  dotfiles:\n    os: macos")
-
-    # Should load from dotfiles
     manifest = load_manifest()
     assert "dotfiles" in manifest.get("profiles", {})
     assert "local" not in manifest.get("profiles", {})


-def test_load_manifest_fallback_to_local(mock_paths):
-    """Test fallback to local manifest when dotfiles doesn't exist."""
-    local_manifest = mock_paths["local_manifest"]
-    local_manifest.write_text("profiles:\n  local:\n    os: linux")
+def test_load_manifest_fallback_to_local(mock_roots):
+    (mock_roots["local"] / "profiles.yaml").write_text("profiles:\n  local: {os: linux}\n")
+
+    # Remove dotfiles yaml file so local takes over.
+    dot_yaml = mock_roots["dotfiles"] / "profiles.yaml"
+    if dot_yaml.exists():
+        dot_yaml.unlink()

-    # Dotfiles manifest doesn't exist
     manifest = load_manifest()
     assert "local" in manifest.get("profiles", {})


-def test_load_manifest_empty_when_none_exist(mock_paths):
-    """Test empty dict returned when no manifests exist."""
+def test_load_manifest_empty_when_none_exist(mock_roots):
     manifest = load_manifest()
     assert manifest == {}


-def test_load_config_priority_dotfiles_first(mock_paths):
-    """Test that dotfiles config takes priority over local."""
-    local_config = mock_paths["local_config"]
-    dotfiles_config = mock_paths["dotfiles_config"]
-
-    # Create local config
-    local_config.write_text(
-        "[repository]\n"
-        "dotfiles_url = https://github.com/user/dotfiles-local.git\n"
+def test_load_config_from_merged_yaml(mock_roots):
+    (mock_roots["dotfiles"] / "config.yaml").write_text(
+        "repository:\n"
+        "  dotfiles-url: git@github.com:user/dotfiles.git\n"
+        "defaults:\n"
+        "  container-registry: registry.example.com\n"
     )

-    # Create dotfiles config
-    dotfiles_config.parent.mkdir(parents=True)
-    dotfiles_config.write_text(
-        "[repository]\n"
-        "dotfiles_url = https://github.com/user/dotfiles-from-repo.git\n"
-    )
-
-    # Should load from dotfiles
-    config = load_config()
-    assert "dotfiles-from-repo" in config.dotfiles_url
+    cfg = load_config()
+    assert cfg.dotfiles_url == "git@github.com:user/dotfiles.git"
+    assert cfg.container_registry == "registry.example.com"


-def test_load_config_fallback_to_local(mock_paths):
-    """Test fallback to local config when dotfiles doesn't exist."""
-    local_config = mock_paths["local_config"]
-    local_config.write_text(
-        "[repository]\n"
-        "dotfiles_url = https://github.com/user/dotfiles-local.git\n"
-    )
+def test_yaml_merge_is_alphabetical_last_writer_wins(mock_roots):
+    (mock_roots["local"] / "10-a.yaml").write_text("profiles:\n  a: {os: linux}\n")
+    (mock_roots["local"] / "20-b.yaml").write_text("profiles:\n  b: {os: linux}\n")

-    # Dotfiles config doesn't exist
-    config = load_config()
-    assert "dotfiles-local" in config.dotfiles_url
+    manifest = load_manifest(mock_roots["local"])
+    assert "b" in manifest.get("profiles", {})
+    assert "a" not in manifest.get("profiles", {})


-def test_load_config_empty_when_none_exist(mock_paths):
-    """Test default config returned when no configs exist."""
-    config = load_config()
-    assert config.dotfiles_url == ""
-    assert config.dotfiles_branch == "main"
+def test_explicit_file_path_loads_single_yaml(tmp_path):
+    one_file = tmp_path / "single.yaml"
+    one_file.write_text("profiles:\n  only: {os: linux}\n")


-def test_self_hosting_workflow(tmp_path, monkeypatch):
-    """Test complete self-hosting workflow.
-
-    Simulates:
-    1. User has dotfiles repo with flow config
-    2. Flow links its own config from dotfiles
-    3. Flow reads from self-hosted location
-    """
-    # Setup paths
-    home = tmp_path / "home"
-    dotfiles = tmp_path / "dotfiles"
-    home.mkdir()
-    dotfiles.mkdir()
-
-    # Create flow package in dotfiles
-    flow_pkg = dotfiles / "flow" / ".config" / "flow"
-    flow_pkg.mkdir(parents=True)
-
-    # Create manifest in dotfiles
-    manifest_content = {
-        "profiles": {
-            "test-env": {
-                "os": "linux",
-                "packages": {"standard": ["git", "vim"]},
-            }
-        }
-    }
-    (flow_pkg / "manifest.yaml").write_text(yaml.dump(manifest_content))
-
-    # Create config in dotfiles
-    (flow_pkg / "config").write_text(
-        "[repository]\n"
-        "dotfiles_url = https://github.com/user/dotfiles.git\n"
-    )
-
-    # Mock paths to use our temp directories
-    monkeypatch.setattr(paths_module, "DOTFILES_MANIFEST", flow_pkg / "manifest.yaml")
-    monkeypatch.setattr(paths_module, "DOTFILES_CONFIG", flow_pkg / "config")
-    monkeypatch.setattr(paths_module, "MANIFEST_FILE", home / ".config" / "devflow" / "manifest.yaml")
-    monkeypatch.setattr(paths_module, "CONFIG_FILE", home / ".config" / "devflow" / "config")
-
-    # Load config and manifest - should come from dotfiles
-    manifest = load_manifest()
-    config = load_config()
-
-    assert "test-env" in manifest.get("profiles", {})
-    assert "github.com/user/dotfiles.git" in config.dotfiles_url
-
-
-def test_manifest_cascade_with_symlink(tmp_path, monkeypatch):
-    """Test that loading works correctly when symlink is used."""
-    # Setup
-    dotfiles = tmp_path / "dotfiles"
-    home_config = tmp_path / "home" / ".config" / "flow"
-    flow_pkg = dotfiles / "flow" / ".config" / "flow"
-
-    flow_pkg.mkdir(parents=True)
-    home_config.mkdir(parents=True)
-
-    # Create manifest in dotfiles
-    manifest_content = {"profiles": {"from-dotfiles": {"os": "linux"}}}
-    (flow_pkg / "manifest.yaml").write_text(yaml.dump(manifest_content))
-
-    # Create symlink from home config to dotfiles
-    manifest_link = home_config / "manifest.yaml"
-    manifest_link.symlink_to(flow_pkg / "manifest.yaml")
-
-    # Mock paths
-    monkeypatch.setattr(paths_module, "DOTFILES_MANIFEST", flow_pkg / "manifest.yaml")
-    monkeypatch.setattr(paths_module, "MANIFEST_FILE", manifest_link)
-
-    # Load - should work through symlink
-    manifest = load_manifest()
-    assert "from-dotfiles" in manifest.get("profiles", {})
-
-
-def test_config_priority_documentation(mock_paths):
-    """Document the config loading priority for users."""
-    # This test serves as documentation of the cascade behavior
-
-    # Priority 1: Dotfiles repo (self-hosted)
-    dotfiles_manifest = mock_paths["dotfiles_manifest"]
-    dotfiles_manifest.parent.mkdir(parents=True)
-    dotfiles_manifest.write_text("profiles:\n  priority-1: {}")
-
-    manifest = load_manifest()
-    assert "priority-1" in manifest.get("profiles", {})
-
-    # If we remove dotfiles, falls back to Priority 2: Local override
-    dotfiles_manifest.unlink()
-    local_manifest = mock_paths["local_manifest"]
-    local_manifest.write_text("profiles:\n  priority-2: {}")
-
-    manifest = load_manifest()
-    assert "priority-2" in manifest.get("profiles", {})
-
-    # If neither exists, Priority 3: Empty fallback
-    local_manifest.unlink()
-    manifest = load_manifest()
-    assert manifest == {}
+    manifest = load_manifest(one_file)
+    assert "only" in manifest["profiles"]
@@ -50,3 +50,8 @@ def test_substitute_template_non_string():
 def test_substitute_template_no_placeholders():
     result = substitute_template("plain text", {"os": "linux"})
     assert result == "plain text"
+
+
+def test_substitute_template_env_namespace():
+    result = substitute_template("{{ env.USER_EMAIL }}", {"env": {"USER_EMAIL": "you@example.com"}})
+    assert result == "you@example.com"