Compare commits

...

35 Commits

Author SHA1 Message Date
Félix Saparelli 9f1f2e9d04
chore: Release 2024-05-16 14:20:58 +12:00
Félix Saparelli 0e393c25cf
update changelog 2024-05-16 14:20:36 +12:00
Luca Barbato 2026c52abd
feat: Add git-describe support (#832) 2024-05-15 13:02:25 +00:00
Félix Saparelli 72f069a847
chore: Release 2024-04-30 20:41:43 +12:00
Adit 4affed6fff
fix(cli): recursive paths provided by user getting treated non-recursively (#828) 2024-04-30 07:10:28 +00:00
Félix Saparelli e0084e69f8
fix ci again 2024-04-28 19:14:21 +12:00
Félix Saparelli 592b712c95
chore: Release 2024-04-28 18:55:23 +12:00
Félix Saparelli c9a3b9df00
chore: Release 2024-04-28 18:53:42 +12:00
Félix Saparelli e63d37f601
chore: Release 2024-04-28 18:52:50 +12:00
Félix Saparelli 14e6294f5a
chore: Release 2024-04-28 18:51:48 +12:00
Félix Saparelli 234d606563
chore: Release 2024-04-28 18:50:18 +12:00
Félix Saparelli 77405c8ce1
chore: Release 2024-04-28 18:48:50 +12:00
Félix Saparelli 6c23afe839
feat: make it possible to watch non-recursively (#827)
Fixes #227
Fixes #174

docs(cli): be more precise in print-events advice to use `-v`
docs(cli): improve jaq error help
feat(cli): add `-W` for non-recursive watches
feat(cli): use non-blocking logging
feat(globset): hide `fmt::Debug` spew from ignore crate
feat(ignore-files): hide `fmt::Debug` spew from ignore crate
feat(lib): make it possible to watch non-recursively
fix(lib): inserting `WatchedPath`s directly should be possible
refactor(lib): move `WatchedPath` out of `fs` mod
2024-04-28 06:33:07 +00:00
dependabot[bot] ee3795d776
Bump softprops/action-gh-release from 2.0.3 to 2.0.4 (#823)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-28 18:19:34 +12:00
Félix Saparelli eff96c7324
feat(project-origins): add support for out-of-tree git repos (#826) 2024-04-28 14:26:07 +12:00
Félix Saparelli a4df258735
doc: fix --on-busy-update help text (#825) 2024-04-23 14:44:59 +12:00
Félix Saparelli d388a280f0
ci: more build improvements (for next time) 2024-04-21 02:11:37 +12:00
Félix Saparelli bb97f71c8c
gha: probably the most frustrating syntax in the world 2024-04-21 02:04:56 +12:00
Félix Saparelli 953fa89dd9
even better 2024-04-21 02:02:57 +12:00
Félix Saparelli 0ef87821f2
Run manpage and completions in release when we've already built in releases 2024-04-21 01:57:09 +12:00
Félix Saparelli 62af5dd868
Fix dist manifest 2024-04-21 01:52:11 +12:00
Félix Saparelli 4497aaf515
Fix release builder 2024-04-21 01:38:11 +12:00
Félix Saparelli a63864c5f2
chore: Release 2024-04-21 01:18:24 +12:00
Félix Saparelli ee815ba166
chore: Release 2024-04-21 01:06:46 +12:00
Félix Saparelli d6138b9961
chore: Release 2024-04-21 01:04:18 +12:00
Félix Saparelli f73d388d18
Changelogs for filterers 2024-04-21 01:03:58 +12:00
Félix Saparelli 86d6c7d448
Remove more PR machinery 2024-04-21 01:02:40 +12:00
Félix Saparelli d317540fd3
chore: Release 2024-04-21 01:00:28 +12:00
Félix Saparelli 9d91c51651
chore: Release 2024-04-21 00:56:27 +12:00
Félix Saparelli 96480cb588
chore: Release 2024-04-21 00:55:14 +12:00
Félix Saparelli fd5afb8b3a
Add --wrap-process (#822) 2024-04-20 12:39:28 +00:00
Félix Saparelli e1cef25d7f
Fix watchexec-events tests 2024-04-21 00:36:59 +12:00
Félix Saparelli 22b58a66ab
Remove tagged filterer 2024-04-21 00:32:01 +12:00
Félix Saparelli 1c47ffbe1a
Update release.toml config 2024-04-21 00:30:56 +12:00
Félix Saparelli 48ff7ec68b
Remove PR machinery 2024-04-21 00:28:06 +12:00
83 changed files with 918 additions and 3710 deletions

@@ -4,7 +4,6 @@
app_name: "watchexec",
app_version: $version,
changelog_title: "CLI \($version)",
changelog_body: $changelog,
artifacts: [ $files | split("\n") | .[] | {
name: .,
kind: (if (. | test("[.](deb|rpm)$")) then "installer" else "executable-zip" end),

@@ -1,7 +1,6 @@
name: CLI Release
on:
workflow_call:
workflow_dispatch:
push:
tags:
@@ -17,8 +16,6 @@ jobs:
runs-on: ubuntu-latest
outputs:
cli_version: ${{ steps.version.outputs.cli_version }}
release_notes: ${{ fromJSON(steps.notes.outputs.notes_json || 'null') }}
announce: ${{ steps.announce.outputs.announce || '' }}
steps:
- uses: actions/checkout@v4
- name: Extract version
@@ -36,40 +33,6 @@ jobs:
echo "cli_version=$version" >> $GITHUB_OUTPUT
- name: Extract release notes
if: github.event.head_commit.message
id: notes
shell: bash
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPO: ${{ github.repository }}
release_commit: ${{ github.event.head_commit.message }}
run: |
set -x
set +eo pipefail
if [[ -z "$release_commit" ]]; then
echo "notes_json=null" >> $GITHUB_OUTPUT
exit
fi
release_pr=$(head -n1 <<< "$release_commit" | grep -oP '(?<=[(]#)\d+(?=[)])')
if [[ -z "$release_pr" ]]; then
echo "notes_json=null" >> $GITHUB_OUTPUT
exit
fi
gh \
pr --repo "$GITHUB_REPO" \
view "$release_pr" \
--json body \
--jq '"notes_json=\((.body | split("### Release notes")[1] // "") | tojson)"' \
>> $GITHUB_OUTPUT
- name: Make a new announcement post
id: announce
if: endsWith(steps.version.outputs.cli_version, '.0')
run: echo "announce=Announcements" >> $GITHUB_OUTPUT
build:
strategy:
matrix:
@@ -233,19 +196,29 @@ jobs:
with:
tool: cross
- name: Build (cargo)
if: "!matrix.cross"
run: cargo build --package watchexec-cli --release --locked --target ${{ matrix.target }}
- name: Build (cross)
if: matrix.cross
run: cross build --package watchexec-cli --release --locked --target ${{ matrix.target }}
- name: Build
shell: bash
run: |
${{ matrix.cross && 'cross' || 'cargo' }} build \
-p watchexec-cli \
--release --locked \
--target ${{ matrix.target }}
- name: Make manpage
run: cargo run -p watchexec-cli -- --manual > doc/watchexec.1
shell: bash
run: |
cargo run -p watchexec-cli \
${{ (!matrix.cross) && '--release --target' || '' }} \
${{ (!matrix.cross) && matrix.target || '' }} \
--locked -- --manual > doc/watchexec.1
- name: Make completions
run: bin/completions
shell: bash
run: |
bin/completions \
${{ (!matrix.cross) && '--release --target' || '' }} \
${{ (!matrix.cross) && matrix.target || '' }} \
--locked
- name: Package
shell: bash
@@ -285,7 +258,7 @@ jobs:
- uses: actions/upload-artifact@v4
with:
name: builds
name: ${{ matrix.name }}
retention-days: 1
path: |
watchexec-*.tar.xz
@@ -310,13 +283,12 @@
- uses: actions/download-artifact@v4
with:
name: builds
merge-multiple: true
- name: Dist manifest
run: |
jq -ncf .github/workflows/dist-manifest.jq \
--arg version "${{ needs.info.outputs.cli_version }}" \
--arg changelog "${{ needs.info.outputs.release_notes }}" \
--arg files "$(ls watchexec-*)" \
> dist-manifest.json
@@ -334,13 +306,11 @@
sha512sum $file | cut -d ' ' -f1 > "$file.sha512"
done
- uses: softprops/action-gh-release@3198ee18f814cdf787321b4a32a26ddbf37acc52
- uses: softprops/action-gh-release@9d7c94cfd0a1f3ed45544c887983e9fa900f0564
with:
tag_name: v${{ needs.info.outputs.cli_version }}
name: CLI v${{ needs.info.outputs.cli_version }}
body: ${{ needs.info.outputs.release_notes }}
append_body: true
discussion_category_name: ${{ needs.info.outputs.announce }}
files: |
dist-manifest.json
watchexec-*.tar.xz

@@ -1,61 +0,0 @@
<!-- <%- JSON.stringify({ "release-pr": { v2: { crates, version } } }) %> -->
This is a release PR for **<%= crate.name %>** version **<%= version.actual %>**<%
if (version.actual != version.desired) {
%> (performing a <%= version.desired %> bump).<%
} else {
%>.<%
}
%>
**Use squash merge.**
<% if (crate.name == "watchexec-cli") { %>
Upon merging, this will automatically create the tag `v<%= version.actual %>`, build the CLI, and create a GitHub release.
You will still need to manually publish the cargo crate:
```
$ git switch main
$ git pull
$ git switch --detach v<%= version.actual %>
$ cargo publish -p <%= crate.name %>
```
<% } else { %>
Remember to review the crate's changelog!
Upon merging, this will create the tag `<%= crate.name %>-v<%= version.actual %>`.
You will still need to manually publish the cargo crate:
```
$ git switch main
$ git pull
$ git switch --detach <%= crate.name %>-v<%= version.actual %>
$ cargo publish -p <%= crate.name %>
```
<% } %>
To trigger builds initially: either close and then immediately re-open this PR once, or push to the branch (perhaps with edits to the README.md or CHANGELOG.md!).
<% if (pr.releaseNotes) { %>
---
_Edit release notes into the section below:_
<!-- do not change or remove this heading -->
<% if (crate.name == "watchexec-cli") { %>
### Release notes
_Software development often involves running the same commands over and over. Boring! Watchexec is a simple, standalone tool that watches a path and runs a command whenever it detects modifications. Install it today with [`cargo-binstall watchexec-cli`](https://github.com/cargo-bins/cargo-binstall), from the binaries below, find it [in your favourite package manager](https://github.com/watchexec/watchexec/blob/main/doc/packages.md), or build it from source with `cargo install watchexec-cli`._
#### In this release:
-
#### Other changes:
-
<% } else { %>
### Changelog
-
<% } %>
<% } %>

@@ -1,54 +0,0 @@
name: Open a release PR
on:
workflow_dispatch:
inputs:
crate:
description: Crate to release
required: true
type: choice
options:
- cli
- lib
- bosion
- events
- ignore-files
- project-origins
- signals
- supervisor
- filterer/globset
- filterer/ignore
- filterer/tagged
version:
description: Version to release
required: true
type: string
default: patch
jobs:
make-release-pr:
permissions:
id-token: write # Enable OIDC
pull-requests: write
contents: write
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: chainguard-dev/actions/setup-gitsign@main
- name: Install cargo-release
uses: taiki-e/install-action@v2
with:
tool: cargo-release
- uses: cargo-bins/release-pr@v2
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
version: ${{ inputs.version }}
crate-path: crates/${{ inputs.crate }}
pr-release-notes: ${{ inputs.crate == 'cli' }}
pr-label: release
pr-template-file: .github/workflows/release-pr.ejs
env:
GITSIGN_LOG: /tmp/gitsign.log
- run: cat /tmp/gitsign.log
if: ${{ failure() }}

@@ -1,45 +0,0 @@
name: Tag a release
on:
push:
branches:
- main
tags-ignore:
- "*"
jobs:
make-tag:
runs-on: ubuntu-latest
# because we control the release PR title and only allow squashes,
# PRs that are named `release: {crate-name} v{version}` will get tagged!
# the commit message will look like: `release: {crate-name} v{version} (#{pr-number})`
if: "startsWith(github.event.head_commit.message, 'release: ')"
steps:
- name: Extract tag from commit message
env:
COMMIT_MESSAGE: ${{ github.event.head_commit.message }}
run: |
set -euxo pipefail
message="$(head -n1 <<< "$COMMIT_MESSAGE")"
crate="$(cut -d ' ' -f 2 <<< "${message}")"
version="$(cut -d ' ' -f 3 <<< "${message}")"
if [[ "$crate" == "watchexec-cli" ]]; then
echo "CUSTOM_TAG=${version}" >> $GITHUB_ENV
else
echo "CUSTOM_TAG=${crate}-${version}" >> $GITHUB_ENV
fi
- uses: actions/checkout@v4
- name: Push release tag
id: tag_version
uses: mathieudutour/github-tag-action@v6.2
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
custom_tag: ${{ env.CUSTOM_TAG }}
tag_prefix: ''
release-cli:
needs: make-tag
if: "startsWith(github.event.head_commit.message, 'release: watchexec-cli v')"
uses: ./.github/workflows/release-cli.yml
secrets: inherit
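The tag-derivation step in the workflow above can be sketched as a standalone script. The commit subject used here is a made-up example of the `release: {crate-name} v{version} (#{pr-number})` shape described in the workflow's comment; the PR number is hypothetical:

```shell
# Derive a release tag from a squash-merge commit subject, mirroring
# the "Extract tag from commit message" step above.
# The message below is a fabricated example, not a real commit.
message='release: watchexec-cli v2.1.1 (#830)'

# Field 2 of the subject is the crate name, field 3 the version.
crate=$(printf '%s\n' "$message" | head -n1 | cut -d ' ' -f 2)
version=$(printf '%s\n' "$message" | head -n1 | cut -d ' ' -f 3)

# The CLI crate is tagged as plain v{version}; other crates get a
# {crate}- prefix, matching the if/else in the workflow.
if [ "$crate" = "watchexec-cli" ]; then
  tag="$version"
else
  tag="$crate-$version"
fi
echo "$tag"
```

For a non-CLI crate such as `ignore-files`, the same logic would yield a tag like `ignore-files-v3.0.1`.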

@@ -68,9 +68,11 @@ jobs:
key: ${{ runner.os }}-target-stable-${{ hashFiles('**/Cargo.lock') }}
- name: Run test suite
run: cargo test ${{ env.flags }}
run: cargo test
- name: Run watchexec-events integration tests
run: cargo test -p watchexec-events -F serde
- name: Check that CLI runs
run: cargo run ${{ env.flags }} -p watchexec-cli -- -1 echo
run: cargo run -p watchexec-cli -- -1 echo
- name: Install coreutils on mac
if: ${{ matrix.platform == 'macos' }}
@@ -89,7 +91,7 @@
shell: bash
- name: Generate manpage
run: cargo run ${{ env.flags }} -p watchexec-cli -- --manual > doc/watchexec.1
run: cargo run -p watchexec-cli -- --manual > doc/watchexec.1
- name: Check that manpage is up to date
run: git diff --exit-code -- doc/

@@ -3,8 +3,8 @@ message: |
If you use this software, please cite it using these metadata.
title: "Watchexec: a tool to react to filesystem changes, and a crate ecosystem to power it"
version: "1.25.1"
date-released: 2024-01-05
version: "2.1.1"
date-released: 2024-04-30
repository-code: https://github.com/watchexec/watchexec
license: Apache-2.0

Cargo.lock (generated)
@@ -488,7 +488,7 @@ dependencies = [
[[package]]
name = "bosion"
version = "1.0.3"
version = "1.1.0"
dependencies = [
"gix",
"time",
@@ -1317,6 +1317,7 @@ dependencies = [
"gix-glob",
"gix-hash",
"gix-hashtable",
"gix-index",
"gix-lock",
"gix-macros",
"gix-object",
@@ -1354,6 +1355,15 @@ dependencies = [
"winnow 0.6.6",
]
[[package]]
name = "gix-bitmap"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a371db66cbd4e13f0ed9dc4c0fea712d7276805fccc877f77e96374d317e87ae"
dependencies = [
"thiserror",
]
[[package]]
name = "gix-chunk"
version = "0.4.8"
@@ -1513,6 +1523,33 @@ dependencies = [
"parking_lot",
]
[[package]]
name = "gix-index"
version = "0.32.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "881ab3b1fa57f497601a5add8289e72a7ae09471fc0b9bbe483b628ae8e418a1"
dependencies = [
"bitflags 2.5.0",
"bstr",
"filetime",
"fnv",
"gix-bitmap",
"gix-features",
"gix-fs",
"gix-hash",
"gix-lock",
"gix-object",
"gix-traverse",
"gix-utils",
"hashbrown 0.14.3",
"itoa",
"libc",
"memmap2",
"rustix",
"smallvec",
"thiserror",
]
[[package]]
name = "gix-lock"
version = "13.1.1"
@@ -1988,7 +2025,7 @@ dependencies = [
[[package]]
name = "ignore-files"
version = "2.1.0"
version = "3.0.1"
dependencies = [
"dunce",
"futures",
@@ -2833,7 +2870,7 @@ checksum = "744a264d26b88a6a7e37cbad97953fa233b94d585236310bcbc88474b4092d79"
[[package]]
name = "project-origins"
version = "1.3.0"
version = "1.4.0"
dependencies = [
"futures",
"miette",
@@ -3685,6 +3722,18 @@ dependencies = [
"tracing-core",
]
[[package]]
name = "tracing-appender"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3566e8ce28cc0a3fe42519fc80e6b4c943cc4c8cef275620eb8dac2d3d4e06cf"
dependencies = [
"crossbeam-channel",
"thiserror",
"time",
"tracing-subscriber",
]
[[package]]
name = "tracing-attributes"
version = "0.1.27"
@@ -3800,15 +3849,6 @@ dependencies = [
"winapi",
]
[[package]]
name = "unicase"
version = "2.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7d2d4dafb69621809a81864c9c1b864479e1235c0dd4e199924b9742439ed89"
dependencies = [
"version_check",
]
[[package]]
name = "unicode-bidi"
version = "0.3.15"
@@ -4004,7 +4044,7 @@ checksum = "af190c94f2773fdb3729c55b007a722abb5384da03bc0986df4c289bf5567e96"
[[package]]
name = "watchexec"
version = "3.0.1"
version = "4.1.0"
dependencies = [
"async-priority-channel",
"async-recursion",
@@ -4022,14 +4062,14 @@ dependencies = [
"tokio",
"tracing",
"tracing-subscriber",
"watchexec-events 3.0.0",
"watchexec-signals 3.0.0",
"watchexec-events",
"watchexec-signals",
"watchexec-supervisor",
]
[[package]]
name = "watchexec-cli"
version = "1.25.1"
version = "2.1.1"
dependencies = [
"ahash",
"argfile",
@@ -4069,28 +4109,17 @@ dependencies = [
"termcolor",
"tokio",
"tracing",
"tracing-appender",
"tracing-subscriber",
"tracing-test",
"uuid",
"watchexec",
"watchexec-events 3.0.0",
"watchexec-events",
"watchexec-filterer-globset",
"watchexec-signals 3.0.0",
"watchexec-signals",
"which",
]
[[package]]
name = "watchexec-events"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8fa905a7f327bfdda78b9c06831d3180a419b7b722bd1ef779ac13ff2ab69df0"
dependencies = [
"nix 0.27.1",
"notify",
"serde",
"watchexec-signals 2.1.0",
]
[[package]]
name = "watchexec-events"
version = "3.0.0"
@@ -4100,13 +4129,12 @@ dependencies = [
"serde",
"serde_json",
"snapbox",
"watchexec-events 2.0.1",
"watchexec-signals 3.0.0",
"watchexec-signals",
]
[[package]]
name = "watchexec-filterer-globset"
version = "3.0.0"
version = "4.0.1"
dependencies = [
"ignore",
"ignore-files",
@@ -4114,13 +4142,13 @@ dependencies = [
"tracing",
"tracing-subscriber",
"watchexec",
"watchexec-events 3.0.0",
"watchexec-events",
"watchexec-filterer-ignore",
]
[[package]]
name = "watchexec-filterer-ignore"
version = "3.0.1"
version = "4.0.1"
dependencies = [
"dunce",
"ignore",
@@ -4131,41 +4159,8 @@ dependencies = [
"tracing",
"tracing-subscriber",
"watchexec",
"watchexec-events 3.0.0",
"watchexec-signals 3.0.0",
]
[[package]]
name = "watchexec-filterer-tagged"
version = "2.0.0"
dependencies = [
"futures",
"globset",
"ignore",
"ignore-files",
"miette",
"nom",
"project-origins",
"regex",
"thiserror",
"tokio",
"tracing",
"tracing-subscriber",
"unicase",
"watchexec",
"watchexec-events 3.0.0",
"watchexec-filterer-ignore",
"watchexec-signals 3.0.0",
]
[[package]]
name = "watchexec-signals"
version = "2.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "af0a778522cf0fc2fa8a8f1380e32893208cb2e7fd33e64de8bd81a00a2a7838"
dependencies = [
"nix 0.27.1",
"serde",
"watchexec-events",
"watchexec-signals",
]
[[package]]
@@ -4180,7 +4175,7 @@ dependencies = [
[[package]]
name = "watchexec-supervisor"
version = "1.0.3"
version = "2.0.0"
dependencies = [
"boxcar",
"futures",
@@ -4188,8 +4183,8 @@ dependencies = [
"process-wrap",
"tokio",
"tracing",
"watchexec-events 3.0.0",
"watchexec-signals 3.0.0",
"watchexec-events",
"watchexec-signals",
]
[[package]]

@@ -8,7 +8,6 @@ members = [
"crates/supervisor",
"crates/filterer/globset",
"crates/filterer/ignore",
"crates/filterer/tagged",
"crates/bosion",
"crates/ignore-files",
"crates/project-origins",
@@ -20,7 +19,6 @@ tempfile = "3.8.0"
tracing-test = "0.2.4"
rand = "0.8"
uuid = "1.5.0"
watchexec-events = "2.0.1"
[profile.release]
lto = true

@@ -1,7 +1,7 @@
#!/bin/sh
cargo run -p watchexec-cli -- --completions bash > completions/bash
cargo run -p watchexec-cli -- --completions elvish > completions/elvish
cargo run -p watchexec-cli -- --completions fish > completions/fish
cargo run -p watchexec-cli -- --completions nu > completions/nu
cargo run -p watchexec-cli -- --completions powershell > completions/powershell
cargo run -p watchexec-cli -- --completions zsh > completions/zsh
cargo run -p watchexec-cli $* -- --completions bash > completions/bash
cargo run -p watchexec-cli $* -- --completions elvish > completions/elvish
cargo run -p watchexec-cli $* -- --completions fish > completions/fish
cargo run -p watchexec-cli $* -- --completions nu > completions/nu
cargo run -p watchexec-cli $* -- --completions powershell > completions/powershell
cargo run -p watchexec-cli $* -- --completions zsh > completions/zsh
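The `$*` change to bin/completions above lets callers forward extra cargo flags into each `cargo run` invocation. A minimal sketch of that forwarding pattern, with `echo` standing in for `cargo run -p watchexec-cli` and a hypothetical target triple:

```shell
# Wrapper that forwards its own arguments into the inner command
# before the `--` separator, mirroring the bin/completions change.
run_with_flags() {
  # `echo` stands in for `cargo run -p watchexec-cli` here, so the
  # composed command line is printed rather than executed.
  echo cargo run -p watchexec-cli $* -- --completions bash
}

out=$(run_with_flags --release --target x86_64-unknown-linux-gnu)
echo "$out"
```

This is what allows the release workflow's `bin/completions --release --target …` invocation (seen earlier in this compare) to reuse the binary already built in release mode.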

@@ -19,7 +19,7 @@ _watchexec() {
case "${cmd}" in
watchexec)
opts="-w -c -o -r -s -d -p -n -E -1 -N -q -e -f -j -i -v -h -V --watch --clear --on-busy-update --restart --signal --stop-signal --stop-timeout --map-signal --debounce --stdin-quit --no-vcs-ignore --no-project-ignore --no-global-ignore --no-default-ignore --no-discover-ignore --ignore-nothing --postpone --delay-run --poll --shell --no-environment --emit-events-to --only-emit-events --env --no-process-group --notify --color --timings --quiet --bell --project-origin --workdir --exts --filter --filter-file --filter-prog --ignore --ignore-file --fs-events --no-meta --print-events --verbose --log-file --manual --completions --help --version [COMMAND]..."
opts="-w -W -c -o -r -s -d -p -n -E -1 -N -q -e -f -j -i -v -h -V --watch --watch-non-recursive --clear --on-busy-update --restart --signal --stop-signal --stop-timeout --map-signal --debounce --stdin-quit --no-vcs-ignore --no-project-ignore --no-global-ignore --no-default-ignore --no-discover-ignore --ignore-nothing --postpone --delay-run --poll --shell --no-environment --emit-events-to --only-emit-events --env --no-process-group --wrap-process --notify --color --timings --quiet --bell --project-origin --workdir --exts --filter --filter-file --filter-prog --ignore --ignore-file --fs-events --no-meta --print-events --manual --completions --verbose --log-file --help --version [COMMAND]..."
if [[ ${cur} == -* || ${COMP_CWORD} -eq 1 ]] ; then
COMPREPLY=( $(compgen -W "${opts}" -- "${cur}") )
return 0
@@ -33,6 +33,14 @@ _watchexec() {
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--watch-non-recursive)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
-W)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--clear)
COMPREPLY=($(compgen -W "clear reset" -- "${cur}"))
return 0
@@ -101,6 +109,10 @@ _watchexec() {
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--wrap-process)
COMPREPLY=($(compgen -W "group session none" -- "${cur}"))
return 0
;;
--color)
COMPREPLY=($(compgen -W "auto always never" -- "${cur}"))
return 0
@@ -185,14 +197,14 @@ _watchexec() {
COMPREPLY=($(compgen -W "access create remove rename modify metadata" -- "${cur}"))
return 0
;;
--log-file)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--completions)
COMPREPLY=($(compgen -W "bash elvish fish nu powershell zsh" -- "${cur}"))
return 0
;;
--log-file)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
*)
COMPREPLY=()
;;

@@ -20,6 +20,8 @@ set edit:completion:arg-completer[watchexec] = {|@words|
&'watchexec'= {
cand -w 'Watch a specific file or directory'
cand --watch 'Watch a specific file or directory'
cand -W 'Watch a specific directory, non-recursively'
cand --watch-non-recursive 'Watch a specific directory, non-recursively'
cand -c 'Clear screen before running command'
cand --clear 'Clear screen before running command'
cand -o 'What to do when receiving events while the command is running'
@@ -37,6 +39,7 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand --emit-events-to 'Configure event emission'
cand -E 'Add env vars to the command'
cand --env 'Add env vars to the command'
cand --wrap-process 'Configure how the process is wrapped'
cand --color 'When to use terminal colours'
cand --project-origin 'Set the project origin'
cand --workdir 'Set the working directory'
@@ -51,8 +54,8 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand --ignore 'Filename patterns to filter out'
cand --ignore-file 'Files to load ignores from'
cand --fs-events 'Filesystem events to filter to'
cand --log-file 'Write diagnostic logs to a file'
cand --completions 'Generate a shell completions script'
cand --log-file 'Write diagnostic logs to a file'
cand -r 'Restart the process if it''s still running'
cand --restart 'Restart the process if it''s still running'
cand --stdin-quit 'Exit when stdin closes'
@@ -77,9 +80,9 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand --bell 'Ring the terminal bell on command completion'
cand --no-meta 'Don''t emit fs events for metadata changes'
cand --print-events 'Print events that trigger actions'
cand --manual 'Show the manual page'
cand -v 'Set diagnostic log level'
cand --verbose 'Set diagnostic log level'
cand --manual 'Show the manual page'
cand -h 'Print help (see more with ''--help'')'
cand --help 'Print help (see more with ''--help'')'
cand -V 'Print version'

@@ -1,4 +1,5 @@
complete -c watchexec -s w -l watch -d 'Watch a specific file or directory' -r -F
complete -c watchexec -s W -l watch-non-recursive -d 'Watch a specific directory, non-recursively' -r -F
complete -c watchexec -s c -l clear -d 'Clear screen before running command' -r -f -a "{clear '',reset ''}"
complete -c watchexec -s o -l on-busy-update -d 'What to do when receiving events while the command is running' -r -f -a "{queue '',do-nothing '',restart '',signal ''}"
complete -c watchexec -s s -l signal -d 'Send a signal to the process when it\'s still running' -r
@@ -11,6 +12,7 @@ complete -c watchexec -l poll -d 'Poll for filesystem changes' -r
complete -c watchexec -l shell -d 'Use a different shell' -r
complete -c watchexec -l emit-events-to -d 'Configure event emission' -r -f -a "{environment '',stdio '',file '',json-stdio '',json-file '',none ''}"
complete -c watchexec -s E -l env -d 'Add env vars to the command' -r
complete -c watchexec -l wrap-process -d 'Configure how the process is wrapped' -r -f -a "{group '',session '',none ''}"
complete -c watchexec -l color -d 'When to use terminal colours' -r -f -a "{auto '',always '',never ''}"
complete -c watchexec -l project-origin -d 'Set the project origin' -r -f -a "(__fish_complete_directories)"
complete -c watchexec -l workdir -d 'Set the working directory' -r -f -a "(__fish_complete_directories)"
@@ -21,8 +23,8 @@ complete -c watchexec -s j -l filter-prog -d '[experimental] Filter programs' -r
complete -c watchexec -s i -l ignore -d 'Filename patterns to filter out' -r
complete -c watchexec -l ignore-file -d 'Files to load ignores from' -r -F
complete -c watchexec -l fs-events -d 'Filesystem events to filter to' -r -f -a "{access '',create '',remove '',rename '',modify '',metadata ''}"
complete -c watchexec -l log-file -d 'Write diagnostic logs to a file' -r -F
complete -c watchexec -l completions -d 'Generate a shell completions script' -r -f -a "{bash '',elvish '',fish '',nu '',powershell '',zsh ''}"
complete -c watchexec -l log-file -d 'Write diagnostic logs to a file' -r -F
complete -c watchexec -s r -l restart -d 'Restart the process if it\'s still running'
complete -c watchexec -l stdin-quit -d 'Exit when stdin closes'
complete -c watchexec -l no-vcs-ignore -d 'Don\'t load gitignores'
@@ -43,7 +45,7 @@ complete -c watchexec -s q -l quiet -d 'Don\'t print starting and stopping messa
complete -c watchexec -l bell -d 'Ring the terminal bell on command completion'
complete -c watchexec -l no-meta -d 'Don\'t emit fs events for metadata changes'
complete -c watchexec -l print-events -d 'Print events that trigger actions'
complete -c watchexec -s v -l verbose -d 'Set diagnostic log level'
complete -c watchexec -l manual -d 'Show the manual page'
complete -c watchexec -s v -l verbose -d 'Set diagnostic log level'
complete -c watchexec -s h -l help -d 'Print help (see more with \'--help\')'
complete -c watchexec -s V -l version -d 'Print version'

@@ -12,6 +12,10 @@ module completions {
[ "environment" "stdio" "file" "json-stdio" "json-file" "none" ]
}
def "nu-complete watchexec wrap_process" [] {
[ "group" "session" "none" ]
}
def "nu-complete watchexec color" [] {
[ "auto" "always" "never" ]
}
@@ -28,6 +32,7 @@
export extern watchexec [
...command: string # Command to run on changes
--watch(-w): string # Watch a specific file or directory
--watch-non-recursive(-W): string # Watch a specific directory, non-recursively
--clear(-c): string@"nu-complete watchexec screen_clear" # Clear screen before running command
--on-busy-update(-o): string@"nu-complete watchexec on_busy_update" # What to do when receiving events while the command is running
--restart(-r) # Restart the process if it's still running
@@ -53,6 +58,7 @@
--only-emit-events # Only emit events to stdout, run no commands
--env(-E): string # Add env vars to the command
--no-process-group # Don't use a process group
--wrap-process: string@"nu-complete watchexec wrap_process" # Configure how the process is wrapped
-1 # Testing only: exit Watchexec after the first run
--notify(-N) # Alert when commands start and end
--color: string@"nu-complete watchexec color" # When to use terminal colours
@@ -70,10 +76,10 @@
--fs-events: string@"nu-complete watchexec filter_fs_events" # Filesystem events to filter to
--no-meta # Don't emit fs events for metadata changes
--print-events # Print events that trigger actions
--verbose(-v) # Set diagnostic log level
--log-file: string # Write diagnostic logs to a file
--manual # Show the manual page
--completions: string@"nu-complete watchexec completions" # Generate a shell completions script
--verbose(-v) # Set diagnostic log level
--log-file: string # Write diagnostic logs to a file
--help(-h) # Print help (see more with '--help')
--version(-V) # Print version
]

@@ -23,6 +23,8 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
'watchexec' {
[CompletionResult]::new('-w', 'w', [CompletionResultType]::ParameterName, 'Watch a specific file or directory')
[CompletionResult]::new('--watch', 'watch', [CompletionResultType]::ParameterName, 'Watch a specific file or directory')
[CompletionResult]::new('-W', 'W ', [CompletionResultType]::ParameterName, 'Watch a specific directory, non-recursively')
[CompletionResult]::new('--watch-non-recursive', 'watch-non-recursive', [CompletionResultType]::ParameterName, 'Watch a specific directory, non-recursively')
[CompletionResult]::new('-c', 'c', [CompletionResultType]::ParameterName, 'Clear screen before running command')
[CompletionResult]::new('--clear', 'clear', [CompletionResultType]::ParameterName, 'Clear screen before running command')
[CompletionResult]::new('-o', 'o', [CompletionResultType]::ParameterName, 'What to do when receiving events while the command is running')
@@ -40,6 +42,7 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('--emit-events-to', 'emit-events-to', [CompletionResultType]::ParameterName, 'Configure event emission')
[CompletionResult]::new('-E', 'E ', [CompletionResultType]::ParameterName, 'Add env vars to the command')
[CompletionResult]::new('--env', 'env', [CompletionResultType]::ParameterName, 'Add env vars to the command')
[CompletionResult]::new('--wrap-process', 'wrap-process', [CompletionResultType]::ParameterName, 'Configure how the process is wrapped')
[CompletionResult]::new('--color', 'color', [CompletionResultType]::ParameterName, 'When to use terminal colours')
[CompletionResult]::new('--project-origin', 'project-origin', [CompletionResultType]::ParameterName, 'Set the project origin')
[CompletionResult]::new('--workdir', 'workdir', [CompletionResultType]::ParameterName, 'Set the working directory')
@@ -54,8 +57,8 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('--ignore', 'ignore', [CompletionResultType]::ParameterName, 'Filename patterns to filter out')
[CompletionResult]::new('--ignore-file', 'ignore-file', [CompletionResultType]::ParameterName, 'Files to load ignores from')
[CompletionResult]::new('--fs-events', 'fs-events', [CompletionResultType]::ParameterName, 'Filesystem events to filter to')
[CompletionResult]::new('--log-file', 'log-file', [CompletionResultType]::ParameterName, 'Write diagnostic logs to a file')
[CompletionResult]::new('--completions', 'completions', [CompletionResultType]::ParameterName, 'Generate a shell completions script')
[CompletionResult]::new('--log-file', 'log-file', [CompletionResultType]::ParameterName, 'Write diagnostic logs to a file')
[CompletionResult]::new('-r', 'r', [CompletionResultType]::ParameterName, 'Restart the process if it''s still running')
[CompletionResult]::new('--restart', 'restart', [CompletionResultType]::ParameterName, 'Restart the process if it''s still running')
[CompletionResult]::new('--stdin-quit', 'stdin-quit', [CompletionResultType]::ParameterName, 'Exit when stdin closes')
@@ -80,9 +83,9 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('--bell', 'bell', [CompletionResultType]::ParameterName, 'Ring the terminal bell on command completion')
[CompletionResult]::new('--no-meta', 'no-meta', [CompletionResultType]::ParameterName, 'Don''t emit fs events for metadata changes')
[CompletionResult]::new('--print-events', 'print-events', [CompletionResultType]::ParameterName, 'Print events that trigger actions')
[CompletionResult]::new('--manual', 'manual', [CompletionResultType]::ParameterName, 'Show the manual page')
[CompletionResult]::new('-v', 'v', [CompletionResultType]::ParameterName, 'Set diagnostic log level')
[CompletionResult]::new('--verbose', 'verbose', [CompletionResultType]::ParameterName, 'Set diagnostic log level')
[CompletionResult]::new('--manual', 'manual', [CompletionResultType]::ParameterName, 'Show the manual page')
[CompletionResult]::new('-h', 'h', [CompletionResultType]::ParameterName, 'Print help (see more with ''--help'')')
[CompletionResult]::new('--help', 'help', [CompletionResultType]::ParameterName, 'Print help (see more with ''--help'')')
[CompletionResult]::new('-V', 'V ', [CompletionResultType]::ParameterName, 'Print version')


@@ -17,6 +2,8 @@ _watchexec() {
_arguments "${_arguments_options[@]}" \
'*-w+[Watch a specific file or directory]:PATH:_files' \
'*--watch=[Watch a specific file or directory]:PATH:_files' \
'*-W+[Watch a specific directory, non-recursively]:PATH:_files' \
'*--watch-non-recursive=[Watch a specific directory, non-recursively]:PATH:_files' \
'-c+[Clear screen before running command]' \
'--clear=[Clear screen before running command]' \
'-o+[What to do when receiving events while the command is running]:MODE:(queue do-nothing restart signal)' \
@@ -34,6 +36,7 @@ _watchexec() {
'--emit-events-to=[Configure event emission]:MODE:(environment stdio file json-stdio json-file none)' \
'*-E+[Add env vars to the command]:KEY=VALUE: ' \
'*--env=[Add env vars to the command]:KEY=VALUE: ' \
'--wrap-process=[Configure how the process is wrapped]:MODE:(group session none)' \
'--color=[When to use terminal colours]:MODE:(auto always never)' \
'--project-origin=[Set the project origin]:DIRECTORY:_files -/' \
'--workdir=[Set the working directory]:DIRECTORY:_files -/' \
@@ -48,8 +51,8 @@ _watchexec() {
'*--ignore=[Filename patterns to filter out]:PATTERN: ' \
'*--ignore-file=[Files to load ignores from]:PATH:_files' \
'*--fs-events=[Filesystem events to filter to]:EVENTS:(access create remove rename modify metadata)' \
'--log-file=[Write diagnostic logs to a file]' \
'(--manual)--completions=[Generate a shell completions script]:COMPLETIONS:(bash elvish fish nu powershell zsh)' \
'--log-file=[Write diagnostic logs to a file]' \
'(-o --on-busy-update)-r[Restart the process if it'\''s still running]' \
'(-o --on-busy-update)--restart[Restart the process if it'\''s still running]' \
'--stdin-quit[Exit when stdin closes]' \
@@ -74,9 +77,9 @@ _watchexec() {
'--bell[Ring the terminal bell on command completion]' \
'(--fs-events)--no-meta[Don'\''t emit fs events for metadata changes]' \
'--print-events[Print events that trigger actions]' \
'(--completions)--manual[Show the manual page]' \
'*-v[Set diagnostic log level]' \
'*--verbose[Set diagnostic log level]' \
'(--completions)--manual[Show the manual page]' \
'-h[Print help (see more with '\''--help'\'')]' \
'--help[Print help (see more with '\''--help'\'')]' \
'-V[Print version]' \


@@ -2,6 +2,10 @@
## Next (YYYY-MM-DD)
## v1.1.0 (2024-05-16)
- Add `git-describe` support (#832, by @lu-zero)
## v1.0.3 (2024-04-20)
- Deps: gix 0.62


@@ -1,6 +1,6 @@
[package]
name = "bosion"
version = "1.0.3"
version = "1.1.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0 OR MIT"
@@ -22,6 +22,7 @@ features = ["macros", "formatting"]
version = "0.62.0"
optional = true
default-features = false
features = ["revision"]
[features]
default = ["git", "reproducible", "std"]


@@ -15,7 +15,7 @@ In your `Cargo.toml`:
```toml
[build-dependencies]
bosion = "1.0.3"
bosion = "1.1.0"
```
In your `build.rs`:


@@ -8,6 +8,24 @@ version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f26201604c87b1e01bd3d98f8d5d9a8fcbb815e8cedb41ffccbeb4bf593a35fe"
[[package]]
name = "ahash"
version = "0.8.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e89da841a80418a9b391ebaea17f5c112ffaaa96f621d2c285b5174da76b9011"
dependencies = [
"cfg-if",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "allocator-api2"
version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c6cb57a04249c6480766f7f7cef5467412af1490f8d1e243141daddada3264f"
[[package]]
name = "anstream"
version = "0.6.13"
@@ -82,7 +100,7 @@ checksum = "cf4b9d6a944f767f8e5e0db018570623c85f3d925ac718db4e06d0187adb21c1"
[[package]]
name = "bosion"
version = "1.0.2"
version = "1.0.3"
dependencies = [
"gix",
"time",
@@ -211,6 +229,18 @@ version = "2.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "658bd65b1cf4c852a3cc96f18a8ce7b5640f6b703f905c7d74532294c2a63984"
[[package]]
name = "filetime"
version = "0.2.23"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1ee447700ac8aa0b2f2bd7bc4462ad686ba06baa6727ac149a2d6277f0d240fd"
dependencies = [
"cfg-if",
"libc",
"redox_syscall",
"windows-sys",
]
[[package]]
name = "flate2"
version = "1.0.28"
@@ -221,6 +251,12 @@ dependencies = [
"miniz_oxide",
]
[[package]]
name = "fnv"
version = "1.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3f9eec918d3f24069decb9af1554cad7c880e2da24a9afd88aca000531ab82c1"
[[package]]
name = "form_urlencoded"
version = "1.2.1"
@@ -247,6 +283,7 @@ dependencies = [
"gix-glob",
"gix-hash",
"gix-hashtable",
"gix-index",
"gix-lock",
"gix-macros",
"gix-object",
@@ -284,6 +321,15 @@ dependencies = [
"winnow",
]
[[package]]
name = "gix-bitmap"
version = "0.2.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a371db66cbd4e13f0ed9dc4c0fea712d7276805fccc877f77e96374d317e87ae"
dependencies = [
"thiserror",
]
[[package]]
name = "gix-chunk"
version = "0.4.8"
@@ -443,6 +489,33 @@ dependencies = [
"parking_lot",
]
[[package]]
name = "gix-index"
version = "0.32.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "881ab3b1fa57f497601a5add8289e72a7ae09471fc0b9bbe483b628ae8e418a1"
dependencies = [
"bitflags 2.5.0",
"bstr",
"filetime",
"fnv",
"gix-bitmap",
"gix-features",
"gix-fs",
"gix-hash",
"gix-lock",
"gix-object",
"gix-traverse",
"gix-utils",
"hashbrown",
"itoa",
"libc",
"memmap2",
"rustix",
"smallvec",
"thiserror",
]
[[package]]
name = "gix-lock"
version = "13.1.1"
@@ -702,6 +775,10 @@ name = "hashbrown"
version = "0.14.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "290f1a1d9242c78d09ce40a5e87e7554ee637af1351968159f4952f028f75604"
dependencies = [
"ahash",
"allocator-api2",
]
[[package]]
name = "heck"
@@ -1076,6 +1153,12 @@ version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "version_check"
version = "0.9.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "49874b5167b65d7193b8aba1567f5c7d93d001cafc34600cee003eda787e483f"
[[package]]
name = "walkdir"
version = "2.5.0"
@@ -1255,3 +1338,23 @@ checksum = "f0c976aaaa0e1f90dbb21e9587cdaf1d9679a1cde8875c0d6bd83ab96a208352"
dependencies = [
"memchr",
]
[[package]]
name = "zerocopy"
version = "0.7.34"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ae87e3fcd617500e5d106f0380cf7b77f3c6092aae37191433159dda23cfb087"
dependencies = [
"zerocopy-derive",
]
[[package]]
name = "zerocopy-derive"
version = "0.7.34"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "15e934569e47891f7d9411f1a451d947a60e000ab3bd24fbb970f000387d1b3b"
dependencies = [
"proc-macro2",
"quote",
"syn",
]


@@ -13,6 +13,9 @@ struct Args {
#[clap(long)]
dates: bool,
#[clap(long)]
describe: bool,
}
fn main() {
@@ -23,17 +26,15 @@ fn main() {
"{}",
Bosion::long_version_with(&[("extra", "field"), ("custom", "1.2.3"),])
);
} else
if args.features {
} else if args.features {
println!("Features: {}", Bosion::CRATE_FEATURE_STRING);
} else
if args.dates {
} else if args.dates {
println!("commit date: {}", Bosion::GIT_COMMIT_DATE);
println!("commit datetime: {}", Bosion::GIT_COMMIT_DATETIME);
println!("build date: {}", Bosion::BUILD_DATE);
println!("build datetime: {}", Bosion::BUILD_DATETIME);
} else if args.describe {
println!("commit description: {}", Bosion::GIT_COMMIT_DESCRIPTION);
} else {
println!("{}", Bosion::LONG_VERSION);
}


@@ -145,6 +145,9 @@ pub struct GitInfo {
/// The datetime of the current commit, in the format `YYYY-MM-DD HH:MM:SS`, at UTC.
pub git_datetime: String,
/// The `git describe` equivalent output
pub git_description: String,
}
#[cfg(feature = "git")]
@@ -163,6 +166,7 @@ impl GitInfo {
git_shorthash: head.short_id().err_string()?.to_string(),
git_date: timestamp.format(DATE_FORMAT).err_string()?,
git_datetime: timestamp.format(DATETIME_FORMAT).err_string()?,
git_description: head.describe().format().err_string()?.to_string(),
})
}
}


@@ -74,6 +74,7 @@ pub fn gather_to(filename: &str, structname: &str, public: bool) {
git_shorthash,
git_date,
git_datetime,
git_description,
..
}) = git
{
@@ -104,6 +105,11 @@ pub fn gather_to(filename: &str, structname: &str, public: bool) {
/// This is the date and time (`YYYY-MM-DD HH:MM:SS`) of the commit that was built. Same
/// caveats as with `GIT_COMMIT_HASH` apply.
pub const GIT_COMMIT_DATETIME: &'static str = {git_datetime:?};
/// The git description
///
/// This is the string equivalent to what `git describe` would output
pub const GIT_COMMIT_DESCRIPTION: &'static str = {git_description:?};
"
), format!("{crate_version} ({git_shorthash} {git_date}) {crate_feature_string}\ncommit-hash: {git_hash}\ncommit-date: {git_date}\nbuild-date: {build_date}\nrelease: {crate_version}\nfeatures: {crate_feature_list}"))
} else {
@@ -244,6 +250,7 @@ pub fn gather_to_env_with_prefix(prefix: &str) {
git_shorthash,
git_date,
git_datetime,
git_description,
..
}) = git
{
@@ -251,5 +258,6 @@ pub fn gather_to_env_with_prefix(prefix: &str) {
println!("cargo:rustc-env={prefix}GIT_COMMIT_SHORTHASH={git_shorthash}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DATE={git_date}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DATETIME={git_datetime}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DESCRIPTION={git_description}");
}
}


@@ -1,6 +1,6 @@
[package]
name = "watchexec-cli"
version = "1.25.1"
version = "2.1.1"
authors = ["Félix Saparelli <felix@passcod.name>", "Matt Green <mattgreenrocks@gmail.com>"]
license = "Apache-2.0"
@@ -44,6 +44,7 @@ serde_json = "1.0.107"
tempfile = "3.8.1"
termcolor = "1.4.0"
tracing = "0.1.40"
tracing-appender = "0.2.3"
which = "6.0.1"
[dependencies.blake3]
@@ -68,7 +69,7 @@ features = ["log", "env_logger"]
optional = true
[dependencies.ignore-files]
version = "2.1.0"
version = "3.0.1"
path = "../ignore-files"
[dependencies.miette]
@@ -80,11 +81,11 @@ version = "0.1.1"
optional = true
[dependencies.project-origins]
version = "1.3.0"
version = "1.4.0"
path = "../project-origins"
[dependencies.watchexec]
version = "3.0.1"
version = "4.1.0"
path = "../lib"
[dependencies.watchexec-events]
@@ -97,7 +98,7 @@ version = "3.0.0"
path = "../signals"
[dependencies.watchexec-filterer-globset]
version = "3.0.0"
version = "4.0.1"
path = "../filterer/globset"
[dependencies.tokio]
@@ -129,7 +130,7 @@ mimalloc = "0.1.39"
embed-resource = "2.4.0"
[build-dependencies.bosion]
version = "1.0.3"
version = "1.1.0"
path = "../bosion"
[dev-dependencies]


@@ -1,7 +1,9 @@
pre-release-commit-message = "release: cli v{{version}}"
tag-prefix = "cli-"
tag-prefix = ""
tag-message = "watchexec {{version}}"
pre-release-hook = ["sh", "-c", "cd ../.. && bin/completions && bin/manpage"]
[[pre-release-replacements]]
file = "watchexec.exe.manifest"
search = "^ version=\"[\\d.]+[.]0\""


@@ -1,21 +1,28 @@
use std::{
collections::BTreeSet,
ffi::{OsStr, OsString},
path::PathBuf,
fs::canonicalize,
mem::take,
path::{Path, PathBuf},
str::FromStr,
time::Duration,
};
use clap::{
builder::TypedValueParser, error::ErrorKind, Arg, ArgAction, Command, CommandFactory, Parser,
ValueEnum, ValueHint,
builder::TypedValueParser, error::ErrorKind, Arg, Command, CommandFactory, Parser, ValueEnum,
ValueHint,
};
use miette::{IntoDiagnostic, Result};
use tokio::{fs::File, io::AsyncReadExt};
use watchexec::paths::PATH_SEPARATOR;
use tracing::{debug, info, trace, warn};
use tracing_appender::non_blocking::WorkerGuard;
use watchexec::{paths::PATH_SEPARATOR, sources::fs::WatchedPath};
use watchexec_signals::Signal;
use crate::filterer::parse::parse_filter_program;
mod logging;
const OPTSET_FILTERING: &str = "Filtering";
const OPTSET_COMMAND: &str = "Command";
const OPTSET_DEBUGGING: &str = "Debugging";
@@ -128,7 +135,25 @@ pub struct Args {
value_hint = ValueHint::AnyPath,
value_name = "PATH",
)]
pub paths: Vec<PathBuf>,
pub recursive_paths: Vec<PathBuf>,
/// Watch a specific directory, non-recursively
///
/// Unlike '-w', folders watched with this option are not recursed into.
///
/// This option can be specified multiple times to watch multiple directories non-recursively.
#[arg(
short = 'W',
long = "watch-non-recursive",
help_heading = OPTSET_FILTERING,
value_hint = ValueHint::AnyPath,
value_name = "PATH",
)]
pub non_recursive_paths: Vec<PathBuf>,
#[doc(hidden)]
#[arg(skip)]
pub paths: Vec<WatchedPath>,
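
A usage sketch (the project layout here is hypothetical): the new non-recursive `-W` flag combines freely with the existing recursive `-w` watches on one command line.

```shell
# watch src/ recursively, but the project root itself only one level deep
watchexec -w src -W . -- cargo check
```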
/// Clear screen before running command
///
@@ -145,17 +170,14 @@ pub struct Args {
/// What to do when receiving events while the command is running
///
/// Default is to 'queue' up events and run the command once again when the previous run has
/// finished. You can also use 'do-nothing', which ignores events while the command is running
/// and may be useful to avoid spurious changes made by that command, or 'restart', which
/// terminates the running command and starts a new one. Finally, there's 'signal', which only
/// sends a signal; this can be useful with programs that can reload their configuration without
/// a full restart.
/// Default is to 'do-nothing', which ignores events while the command is running, so that
/// changes that occur due to the command are ignored, like compilation outputs. You can also
/// use 'queue' which will run the command once again when the current run has finished if any
/// events occur while it's running, or 'restart', which terminates the running command and starts
/// a new one. Finally, there's 'signal', which only sends a signal; this can be useful with
/// programs that can reload their configuration without a full restart.
///
/// The signal can be specified with the '--signal' option.
///
/// Note that this option is scheduled to change its default to 'do-nothing' in the next major
/// release. File an issue if you have any concerns.
#[arg(
short,
long,
@@ -636,12 +658,31 @@ pub struct Args {
/// By default, Watchexec will run the command in a process group, so that signals and
/// terminations are sent to all processes in the group. Sometimes that's not what you want, and
/// you can disable the behaviour with this option.
///
/// Deprecated, use '--wrap-process=none' instead.
#[arg(
long,
help_heading = OPTSET_COMMAND,
)]
pub no_process_group: bool,
/// Configure how the process is wrapped
///
/// By default, Watchexec will run the command in a process group in Unix, and in a Job Object
/// in Windows.
///
/// Some Unix programs prefer running in a session, while others do not work in a process group.
///
/// Use 'group' to use a process group, 'session' to use a process session, and 'none' to run
/// the command directly. On Windows, either of 'group' or 'session' will use a Job Object.
#[arg(
long,
help_heading = OPTSET_COMMAND,
value_name = "MODE",
default_value = "group",
)]
pub wrap_process: WrapMode,
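
The mode-to-flags mapping described above can be sketched in plain Rust. The `SpawnOptions` struct here is a simplified stand-in for watchexec's own (only the two flags touched by `--wrap-process`), mirroring the `matches!` logic in the config change later in this set:

```rust
#[derive(Clone, Copy, Debug, Default, PartialEq, Eq)]
enum WrapMode {
    #[default]
    Group,
    Session,
    None,
}

// Simplified stand-in for watchexec's SpawnOptions.
#[derive(Debug, Default, PartialEq, Eq)]
struct SpawnOptions {
    grouped: bool,
    session: bool,
}

// 'group' sets grouped, 'session' sets session, and 'none' leaves both
// unset so the command is spawned directly.
fn spawn_options(mode: WrapMode) -> SpawnOptions {
    SpawnOptions {
        grouped: matches!(mode, WrapMode::Group),
        session: matches!(mode, WrapMode::Session),
    }
}
```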
/// Testing only: exit Watchexec after the first run
#[arg(short = '1', hide = true)]
pub once: bool,
@@ -775,6 +816,7 @@ pub struct Args {
///
/// Provide your own custom filter programs in jaq (similar to jq) syntax. Programs are given
/// an event in the same format as described in '--emit-events-to' and must return a boolean.
/// Invalid programs will make watchexec fail to start; use '-v' to see program runtime errors.
///
/// In addition to the jaq stdlib, watchexec adds some custom filter definitions:
///
@@ -905,54 +947,13 @@ pub struct Args {
/// This prints the events that triggered the action when handling it (after debouncing), in a
/// human readable form. This is useful for debugging filters.
///
/// Use '-v' when you need more diagnostic information.
/// Use '-vvv' instead when you need more diagnostic information.
#[arg(
long,
help_heading = OPTSET_DEBUGGING,
)]
pub print_events: bool,
/// Set diagnostic log level
///
/// This enables diagnostic logging, which is useful for investigating bugs or gaining more
/// insight into faulty filters or "missing" events. Use multiple times to increase verbosity.
///
/// Goes up to '-vvvv'. When submitting bug reports, default to a '-vvv' log level.
///
/// You may want to use with '--log-file' to avoid polluting your terminal.
///
/// Setting $RUST_LOG also works, and takes precedence, but is not recommended. However, using
/// $RUST_LOG is the only way to get logs from before these options are parsed.
#[arg(
long,
short,
help_heading = OPTSET_DEBUGGING,
action = ArgAction::Count,
num_args = 0,
)]
pub verbose: Option<u8>,
/// Write diagnostic logs to a file
///
/// This writes diagnostic logs to a file, instead of the terminal, in JSON format. If a log
/// level was not already specified, this will set it to '-vvv'.
///
/// If a path is not provided, the default is the working directory. Note that with
/// '--ignore-nothing', the write events to the log will likely get picked up by Watchexec,
/// causing a loop; prefer setting a path outside of the watched directory.
///
/// If the path provided is a directory, a file will be created in that directory. The file name
/// will be the current date and time, in the format 'watchexec.YYYY-MM-DDTHH-MM-SSZ.log'.
#[arg(
long,
help_heading = OPTSET_DEBUGGING,
num_args = 0..=1,
default_missing_value = ".",
value_hint = ValueHint::AnyPath,
value_name = "PATH",
)]
pub log_file: Option<PathBuf>,
/// Show the manual page
///
/// This shows the manual page for Watchexec, if the output is a terminal and the 'man' program
@@ -977,6 +978,9 @@ pub struct Args {
conflicts_with_all = ["command", "manual"],
)]
pub completions: Option<ShellCompletion>,
#[command(flatten)]
pub logging: logging::LoggingArgs,
}
#[derive(Clone, Copy, Debug, Default, ValueEnum)]
@@ -999,6 +1003,14 @@ pub enum OnBusyUpdate {
Signal,
}
#[derive(Clone, Copy, Debug, Default, ValueEnum)]
pub enum WrapMode {
#[default]
Group,
Session,
None,
}
#[derive(Clone, Copy, Debug, Default, ValueEnum)]
pub enum ClearMode {
#[default]
@@ -1135,11 +1147,10 @@ fn expand_args_up_to_doubledash() -> Result<Vec<OsString>, std::io::Error> {
}
#[inline]
pub async fn get_args() -> Result<Args> {
use tracing::{debug, trace, warn};
if std::env::var("RUST_LOG").is_ok() {
warn!("⚠ RUST_LOG environment variable set, logging options have no effect");
pub async fn get_args() -> Result<(Args, Option<WorkerGuard>)> {
let prearg_logs = logging::preargs();
if prearg_logs {
warn!("⚠ RUST_LOG environment variable set or hardcoded, logging options have no effect");
}
debug!("expanding @argfile arguments if any");
@@ -1148,6 +1159,12 @@ pub async fn get_args() -> Result<Args> {
debug!("parsing arguments");
let mut args = Args::parse_from(args);
let log_guard = if !prearg_logs {
logging::postargs(&args.logging).await?
} else {
None
};
// https://no-color.org/
if args.color == ColourMode::Auto && std::env::var("NO_COLOR").is_ok() {
args.color = ColourMode::Never;
@@ -1168,9 +1185,15 @@ pub async fn get_args() -> Result<Args> {
}
if args.no_environment {
warn!("--no-environment is deprecated");
args.emit_events_to = EmitEvents::None;
}
if args.no_process_group {
warn!("--no-process-group is deprecated");
args.wrap_process = WrapMode::None;
}
if args.filter_fs_meta {
args.filter_fs_events = vec![
FsEvent::Create,
@@ -1193,6 +1216,63 @@ pub async fn get_args() -> Result<Args> {
.exit();
}
let workdir = if let Some(w) = take(&mut args.workdir) {
w
} else {
let curdir = std::env::current_dir().into_diagnostic()?;
canonicalize(curdir).into_diagnostic()?
};
info!(path=?workdir, "effective working directory");
args.workdir = Some(workdir.clone());
let project_origin = if let Some(p) = take(&mut args.project_origin) {
p
} else {
crate::dirs::project_origin(&args).await?
};
info!(path=?project_origin, "effective project origin");
args.project_origin = Some(project_origin.clone());
args.paths = take(&mut args.recursive_paths)
.into_iter()
.map(|path| {
{
if path.is_absolute() {
Ok(path)
} else {
canonicalize(project_origin.join(path)).into_diagnostic()
}
}
.map(WatchedPath::recursive)
})
.chain(take(&mut args.non_recursive_paths).into_iter().map(|path| {
{
if path.is_absolute() {
Ok(path)
} else {
canonicalize(project_origin.join(path)).into_diagnostic()
}
}
.map(WatchedPath::non_recursive)
}))
.collect::<Result<BTreeSet<_>>>()?
.into_iter()
.collect();
if args.paths.len() == 1
&& args
.paths
.first()
.map_or(false, |p| p.as_ref() == Path::new("/dev/null"))
{
info!("only path is /dev/null, not watching anything");
args.paths = Vec::new();
} else if args.paths.is_empty() {
info!("no paths, using current directory");
args.paths.push(args.workdir.clone().unwrap().into());
}
info!(paths=?args.paths, "effective watched paths");
for (n, prog) in args.filter_programs.iter_mut().enumerate() {
if let Some(progpath) = prog.strip_prefix('@') {
trace!(?n, path=?progpath, "reading filter program from file");
@@ -1205,12 +1285,14 @@ pub async fn get_args() -> Result<Args> {
}
}
args.filter_programs_parsed = std::mem::take(&mut args.filter_programs)
args.filter_programs_parsed = take(&mut args.filter_programs)
.into_iter()
.enumerate()
.map(parse_filter_program)
.collect::<Result<_, _>>()?;
debug!(?args, "got arguments");
Ok(args)
debug_assert!(args.workdir.is_some());
debug_assert!(args.project_origin.is_some());
info!(?args, "got arguments");
Ok((args, log_guard))
}
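
The path resolution added to `get_args` can be illustrated with a self-contained sketch. `Watched` is a simplified stand-in for watchexec's `WatchedPath`, and the real code also canonicalises each path via `canonicalize`; the shape of the logic is the same: absolute paths pass through, relative paths are joined onto the project origin, and the `BTreeSet` collapses duplicates.

```rust
use std::{
    collections::BTreeSet,
    path::{Path, PathBuf},
};

// Simplified stand-in for watchexec's WatchedPath: a path tagged with
// whether it is watched recursively.
#[derive(Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct Watched {
    path: PathBuf,
    recursive: bool,
}

// Resolve recursive and non-recursive path arguments against the project
// origin, deduplicating via a BTreeSet as in the real implementation.
fn resolve_paths(origin: &Path, recursive: &[&str], non_recursive: &[&str]) -> Vec<Watched> {
    let absolute = |p: &str| {
        let p = Path::new(p);
        if p.is_absolute() {
            p.to_owned()
        } else {
            origin.join(p)
        }
    };
    recursive
        .iter()
        .map(|p| Watched { path: absolute(p), recursive: true })
        .chain(
            non_recursive
                .iter()
                .map(|p| Watched { path: absolute(p), recursive: false }),
        )
        .collect::<BTreeSet<_>>()
        .into_iter()
        .collect()
}
```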


@@ -0,0 +1,132 @@
use std::{env::var, io::stderr, path::PathBuf};
use clap::{ArgAction, Parser, ValueHint};
use miette::{bail, Result};
use tokio::fs::metadata;
use tracing::{info, warn};
use tracing_appender::{non_blocking, non_blocking::WorkerGuard, rolling};
#[derive(Debug, Clone, Parser)]
pub struct LoggingArgs {
/// Set diagnostic log level
///
/// This enables diagnostic logging, which is useful for investigating bugs or gaining more
/// insight into faulty filters or "missing" events. Use multiple times to increase verbosity.
///
/// Goes up to '-vvvv'. When submitting bug reports, default to a '-vvv' log level.
///
/// You may want to use with '--log-file' to avoid polluting your terminal.
///
/// Setting $RUST_LOG also works, and takes precedence, but is not recommended. However, using
/// $RUST_LOG is the only way to get logs from before these options are parsed.
#[arg(
long,
short,
help_heading = super::OPTSET_DEBUGGING,
action = ArgAction::Count,
default_value = "0",
num_args = 0,
)]
pub verbose: u8,
/// Write diagnostic logs to a file
///
/// This writes diagnostic logs to a file, instead of the terminal, in JSON format. If a log
/// level was not already specified, this will set it to '-vvv'.
///
/// If a path is not provided, the default is the working directory. Note that with
/// '--ignore-nothing', the write events to the log will likely get picked up by Watchexec,
/// causing a loop; prefer setting a path outside of the watched directory.
///
/// If the path provided is a directory, a file will be created in that directory. The file name
/// will be the current date and time, in the format 'watchexec.YYYY-MM-DDTHH-MM-SSZ.log'.
#[arg(
long,
help_heading = super::OPTSET_DEBUGGING,
num_args = 0..=1,
default_missing_value = ".",
value_hint = ValueHint::AnyPath,
value_name = "PATH",
)]
pub log_file: Option<PathBuf>,
}
pub fn preargs() -> bool {
let mut log_on = false;
#[cfg(feature = "dev-console")]
match console_subscriber::try_init() {
Ok(_) => {
warn!("dev-console enabled");
log_on = true;
}
Err(e) => {
eprintln!("Failed to initialise tokio console, falling back to normal logging\n{e}")
}
}
if !log_on && var("RUST_LOG").is_ok() {
match tracing_subscriber::fmt::try_init() {
Ok(()) => {
warn!(RUST_LOG=%var("RUST_LOG").unwrap(), "logging configured from RUST_LOG");
log_on = true;
}
Err(e) => eprintln!("Failed to initialise logging with RUST_LOG, falling back\n{e}"),
}
}
log_on
}
pub async fn postargs(args: &LoggingArgs) -> Result<Option<WorkerGuard>> {
if args.verbose == 0 {
return Ok(None);
}
let (log_writer, guard) = if let Some(file) = &args.log_file {
let is_dir = metadata(&file).await.map_or(false, |info| info.is_dir());
let (dir, filename) = if is_dir {
(
file.to_owned(),
PathBuf::from(format!(
"watchexec.{}.log",
chrono::Utc::now().format("%Y-%m-%dT%H-%M-%SZ")
)),
)
} else if let (Some(parent), Some(file_name)) = (file.parent(), file.file_name()) {
(parent.into(), PathBuf::from(file_name))
} else {
bail!("Failed to determine log file name");
};
non_blocking(rolling::never(dir, filename))
} else {
non_blocking(stderr())
};
let mut builder = tracing_subscriber::fmt().with_env_filter(match args.verbose {
0 => unreachable!("checked by if earlier"),
1 => "warn",
2 => "info",
3 => "debug",
_ => "trace",
});
if args.verbose > 2 {
use tracing_subscriber::fmt::format::FmtSpan;
builder = builder.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE);
}
match if args.log_file.is_some() {
builder.json().with_writer(log_writer).try_init()
} else if args.verbose > 3 {
builder.pretty().with_writer(log_writer).try_init()
} else {
builder.with_writer(log_writer).try_init()
} {
Ok(()) => info!("logging initialised"),
Err(e) => eprintln!("Failed to initialise logging, continuing with none\n{e}"),
}
Ok(Some(guard))
}
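
The verbosity-to-filter mapping in `postargs` can be shown on its own (a sketch; in the real code the zero case is handled by the early return, so the match marks it unreachable):

```rust
// Map the -v count to a tracing env-filter directive: one -v gives
// warnings, -vvvv and beyond gives full traces.
fn env_filter_for(verbose: u8) -> Option<&'static str> {
    match verbose {
        0 => None, // logging stays off, as with the early return
        1 => Some("warn"),
        2 => Some("info"),
        3 => Some("debug"),
        _ => Some("trace"),
    }
}
```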


@@ -1,11 +1,10 @@
use std::{
borrow::Cow,
collections::HashMap,
env::{current_dir, var},
env::var,
ffi::{OsStr, OsString},
fs::File,
io::{IsTerminal, Write},
path::Path,
process::Stdio,
sync::{
atomic::{AtomicBool, AtomicU8, Ordering},
@@ -32,7 +31,7 @@ use watchexec_events::{Event, Keyboard, ProcessEnd, Tag};
use watchexec_signals::Signal;
use crate::{
args::{Args, ClearMode, ColourMode, EmitEvents, OnBusyUpdate, SignalMapping},
args::{Args, ClearMode, ColourMode, EmitEvents, OnBusyUpdate, SignalMapping, WrapMode},
state::RotatingTempFile,
};
use crate::{emits::events_to_simple_format, state::State};
@@ -68,19 +67,7 @@ pub fn make_config(args: &Args, state: &State) -> Result<Config> {
eprintln!("[[Error (not fatal)]]\n{}", Report::new(err.error));
});
config.pathset(if args.paths.is_empty() {
vec![current_dir().into_diagnostic()?]
} else if args.paths.len() == 1
&& args
.paths
.first()
.map_or(false, |p| p == Path::new("/dev/null"))
{
// special case: /dev/null means "don't start the fs event source"
Vec::new()
} else {
args.paths.clone()
});
config.pathset(args.paths.clone());
config.throttle(args.debounce.0);
config.keyboard_events(args.stdin_quit);
@@ -545,7 +532,8 @@ fn interpret_command_args(args: &Args) -> Result<Arc<Command>> {
Ok(Arc::new(Command {
program,
options: SpawnOptions {
grouped: !args.no_process_group,
grouped: matches!(args.wrap_process, WrapMode::Group),
session: matches!(args.wrap_process, WrapMode::Session),
..Default::default()
},
}))


@@ -1,7 +1,5 @@
use std::{
borrow::Cow,
collections::HashSet,
env,
path::{Path, PathBuf},
};
@@ -14,16 +12,7 @@ use watchexec::paths::common_prefix;
use crate::args::Args;
type ProjectOriginPath = PathBuf;
type WorkDirPath = PathBuf;
/// Extract relevant directories (in particular the project origin and work directory)
/// given the command line arguments that were provided
pub async fn dirs(args: &Args) -> Result<(ProjectOriginPath, WorkDirPath)> {
let curdir = env::current_dir().into_diagnostic()?;
let curdir = canonicalize(curdir).await.into_diagnostic()?;
debug!(?curdir, "current directory");
pub async fn project_origin(args: &Args) -> Result<PathBuf> {
let project_origin = if let Some(origin) = &args.project_origin {
debug!(?origin, "project origin override");
canonicalize(origin).await.into_diagnostic()?
@@ -34,27 +23,19 @@ pub async fn dirs(args: &Args) -> Result<(ProjectOriginPath, WorkDirPath)> {
};
debug!(?homedir, "home directory");
let mut paths = HashSet::new();
for path in &args.paths {
paths.insert(canonicalize(path).await.into_diagnostic()?);
}
let homedir_requested = homedir.as_ref().map_or(false, |home| paths.contains(home));
let homedir_requested = homedir.as_ref().map_or(false, |home| {
args.paths
.binary_search_by_key(home, |w| PathBuf::from(w.clone()))
.is_ok()
});
debug!(
?homedir_requested,
"resolved whether the homedir is explicitly requested"
);
if paths.is_empty() {
debug!("no paths, using current directory");
paths.insert(curdir.clone());
}
debug!(?paths, "resolved all watched paths");
let mut origins = HashSet::new();
for path in paths {
origins.extend(project_origins::origins(&path).await);
for path in &args.paths {
origins.extend(project_origins::origins(path).await);
}
match (homedir, homedir_requested) {
@@ -67,7 +48,7 @@ pub async fn dirs(args: &Args) -> Result<(ProjectOriginPath, WorkDirPath)> {
if origins.is_empty() {
debug!("no origins, using current directory");
origins.insert(curdir.clone());
origins.insert(args.workdir.clone().unwrap());
}
debug!(?origins, "resolved all project origins");
@@ -80,12 +61,9 @@ pub async fn dirs(args: &Args) -> Result<(ProjectOriginPath, WorkDirPath)> {
.await
.into_diagnostic()?
};
info!(?project_origin, "resolved common/project origin");
debug!(?project_origin, "resolved common/project origin");
let workdir = curdir;
info!(?workdir, "resolved working directory");
Ok((project_origin, workdir))
Ok(project_origin)
}
pub async fn vcs_types(origin: &Path) -> Vec<ProjectType> {
@@ -94,41 +72,34 @@ pub async fn vcs_types(origin: &Path) -> Vec<ProjectType> {
.into_iter()
.filter(|pt| pt.is_vcs())
.collect::<Vec<_>>();
info!(?vcs_types, "resolved vcs types");
info!(?vcs_types, "effective vcs types");
vcs_types
}
pub async fn ignores(
args: &Args,
vcs_types: &[ProjectType],
origin: &Path,
) -> Result<Vec<IgnoreFile>> {
fn higher_make_absolute_if_needed<'a>(
origin: &'a Path,
) -> impl 'a + Fn(&'a PathBuf) -> Cow<'a, Path> {
|path| {
if path.is_absolute() {
Cow::Borrowed(path)
} else {
Cow::Owned(origin.join(path))
}
}
}
pub async fn ignores(args: &Args, vcs_types: &[ProjectType]) -> Result<Vec<IgnoreFile>> {
let origin = args.project_origin.clone().unwrap();
let mut skip_git_global_excludes = false;
let mut ignores = if args.no_project_ignore {
Vec::new()
} else {
let make_absolute_if_needed = higher_make_absolute_if_needed(origin);
let include_paths = args.paths.iter().map(&make_absolute_if_needed);
let ignore_files = args.ignore_files.iter().map(&make_absolute_if_needed);
let ignore_files = args.ignore_files.iter().map(|path| {
if path.is_absolute() {
path.into()
} else {
origin.join(path)
}
});
let (mut ignores, errors) = ignore_files::from_origin(
IgnoreFilesFromOriginArgs::new_unchecked(origin, include_paths, ignore_files)
.canonicalise()
.await
.into_diagnostic()?,
IgnoreFilesFromOriginArgs::new_unchecked(
&origin,
args.paths.iter().map(PathBuf::from),
ignore_files,
)
.canonicalise()
.await
.into_diagnostic()?,
)
.await;
@@ -221,7 +192,7 @@ pub async fn ignores(
.filter(|ig| {
!ig.applies_in
.as_ref()
.map_or(false, |p| p.starts_with(origin))
.map_or(false, |p| p.starts_with(&origin))
})
.collect::<Vec<_>>();
debug!(


@@ -16,7 +16,6 @@ use watchexec_filterer_globset::GlobsetFilterer;
use crate::args::{Args, FsEvent};
mod dirs;
pub(crate) mod parse;
mod proglib;
mod progs;
@@ -71,13 +70,14 @@ impl Filterer for WatchexecFilterer {
impl WatchexecFilterer {
/// Create a new filterer from the given arguments
pub async fn new(args: &Args) -> Result<Arc<Self>> {
let (project_origin, workdir) = dirs::dirs(args).await?;
let project_origin = args.project_origin.clone().unwrap();
let workdir = args.workdir.clone().unwrap();
let ignore_files = if args.no_discover_ignore {
Vec::new()
} else {
let vcs_types = dirs::vcs_types(&project_origin).await;
dirs::ignores(args, &vcs_types, &project_origin).await?
let vcs_types = crate::dirs::vcs_types(&project_origin).await;
crate::dirs::ignores(args, &vcs_types).await?
};
let mut ignores = Vec::new();


@@ -10,7 +10,7 @@ pub fn parse_filter_program((n, prog): (usize, String)) -> Result<jaq_syn::Main>
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join("\n");
return Err(miette!("failed to load filter program #{}: {:?}", n, errs));
return Err(miette!("{}", errs).wrap_err(format!("failed to load filter program #{}", n)));
}
main.ok_or_else(|| miette!("failed to load filter program #{} (no reason given)", n))

View File

@ -1,7 +1,7 @@
#![deny(rust_2018_idioms)]
#![allow(clippy::missing_const_for_fn, clippy::future_not_send)]
use std::{env::var, fs::File, io::Write, process::Stdio, sync::Mutex};
use std::{io::Write, process::Stdio};
use args::{Args, ShellCompletion};
use clap::CommandFactory;
@ -9,8 +9,8 @@ use clap_complete::{Generator, Shell};
use clap_mangen::Man;
use is_terminal::IsTerminal;
use miette::{IntoDiagnostic, Result};
use tokio::{fs::metadata, io::AsyncWriteExt, process::Command};
use tracing::{debug, info, warn};
use tokio::{io::AsyncWriteExt, process::Command};
use tracing::{debug, info};
use watchexec::Watchexec;
use watchexec_events::{Event, Priority};
@ -18,86 +18,11 @@ use crate::filterer::WatchexecFilterer;
pub mod args;
mod config;
mod dirs;
mod emits;
mod filterer;
mod state;
async fn init() -> Result<Args> {
let mut log_on = false;
#[cfg(feature = "dev-console")]
match console_subscriber::try_init() {
Ok(_) => {
warn!("dev-console enabled");
log_on = true;
}
Err(e) => {
eprintln!("Failed to initialise tokio console, falling back to normal logging\n{e}")
}
}
if !log_on && var("RUST_LOG").is_ok() {
match tracing_subscriber::fmt::try_init() {
Ok(()) => {
warn!(RUST_LOG=%var("RUST_LOG").unwrap(), "logging configured from RUST_LOG");
log_on = true;
}
Err(e) => eprintln!("Failed to initialise logging with RUST_LOG, falling back\n{e}"),
}
}
let args = args::get_args().await?;
let verbosity = args.verbose.unwrap_or(0);
if log_on {
warn!("ignoring logging options from args");
} else if verbosity > 0 {
let log_file = if let Some(file) = &args.log_file {
let is_dir = metadata(&file).await.map_or(false, |info| info.is_dir());
let path = if is_dir {
let filename = format!(
"watchexec.{}.log",
chrono::Utc::now().format("%Y-%m-%dT%H-%M-%SZ")
);
file.join(filename)
} else {
file.to_owned()
};
// TODO: use tracing-appender instead
Some(File::create(path).into_diagnostic()?)
} else {
None
};
let mut builder = tracing_subscriber::fmt().with_env_filter(match verbosity {
0 => unreachable!("checked by if earlier"),
1 => "warn",
2 => "info",
3 => "debug",
_ => "trace",
});
if verbosity > 2 {
use tracing_subscriber::fmt::format::FmtSpan;
builder = builder.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE);
}
match if let Some(writer) = log_file {
builder.json().with_writer(Mutex::new(writer)).try_init()
} else if verbosity > 3 {
builder.pretty().try_init()
} else {
builder.try_init()
} {
Ok(()) => info!("logging initialised"),
Err(e) => eprintln!("Failed to initialise logging, continuing with none\n{e}"),
}
}
Ok(args)
}
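For reference, the verbosity-to-filter mapping used by `init` above is a simple ladder over the count of `-v` flags. A standalone sketch (the `0` arm is illustrative only; the real code takes the no-logging path before ever reaching the mapping at zero verbosity):

```rust
// Sketch of init()'s verbosity-to-env-filter mapping.
// The 0 arm is hypothetical: the original marks it unreachable!.
fn env_filter_for(verbosity: u8) -> &'static str {
    match verbosity {
        0 => "error",
        1 => "warn",
        2 => "info",
        3 => "debug",
        _ => "trace",
    }
}

fn main() {
    assert_eq!(env_filter_for(2), "info");
    assert_eq!(env_filter_for(9), "trace");
    println!("ok");
}
```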
async fn run_watchexec(args: Args) -> Result<()> {
info!(version=%env!("CARGO_PKG_VERSION"), "constructing Watchexec from CLI");
@ -191,8 +116,7 @@ async fn run_completions(shell: ShellCompletion) -> Result<()> {
}
pub async fn run() -> Result<()> {
let args = init().await?;
debug!(?args, "arguments");
let (args, _log_guard) = args::get_args().await?;
if args.manual {
run_manpage(args).await

View File

@ -3,7 +3,7 @@
<assemblyIdentity
type="win32"
name="Watchexec.Cli.watchexec"
version="1.25.1.0"
version="2.1.1.0"
/>
<trustInfo>

View File

@ -33,7 +33,6 @@ version = "0.28.0"
features = ["signal"]
[dev-dependencies]
watchexec-events = { workspace = true, features = ["serde"] }
snapbox = "0.5.9"
serde_json = "1.0.107"

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: events v{{version}}"
tag-prefix = "events-"
tag-prefix = "watchexec-events-"
tag-message = "watchexec-events {{version}}"
[[pre-release-replacements]]

View File

@ -1,3 +1,5 @@
#![cfg(feature = "serde")]
use std::num::{NonZeroI32, NonZeroI64};
use snapbox::{assert_eq, file};

View File

@ -2,6 +2,14 @@
## Next (YYYY-MM-DD)
## v4.0.1 (2024-04-28)
- Hide `fmt::Debug` spew from the ignore crate; use the `full_debug` feature to restore it.
## v4.0.0 (2024-04-20)
- Deps: watchexec 4
## v3.0.0 (2024-01-01)
- Deps: `watchexec-filterer-ignore` and `ignore-files`

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-filterer-globset"
version = "3.0.0"
version = "4.0.1"
authors = ["Matt Green <mattgreenrocks@gmail.com>", "Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
@ -20,11 +20,11 @@ ignore = "0.4.18"
tracing = "0.1.40"
[dependencies.ignore-files]
version = "2.1.0"
version = "3.0.1"
path = "../../ignore-files"
[dependencies.watchexec]
version = "3.0.1"
version = "4.1.0"
path = "../../lib"
[dependencies.watchexec-events]
@ -32,7 +32,7 @@ version = "3.0.0"
path = "../../events"
[dependencies.watchexec-filterer-ignore]
version = "3.0.1"
version = "4.0.1"
path = "../ignore"
[dev-dependencies]
@ -47,3 +47,9 @@ features = [
"rt-multi-thread",
"macros",
]
[features]
default = []
## Don't hide ignore::gitignore::Gitignore Debug impl
full_debug = []

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: filterer-globset v{{version}}"
tag-prefix = "filterer-globset-"
tag-prefix = "watchexec-filterer-globset-"
tag-message = "watchexec-filterer-globset {{version}}"
[[pre-release-replacements]]

View File

@ -10,6 +10,7 @@
use std::{
ffi::OsString,
fmt,
path::{Path, PathBuf},
};
@ -21,7 +22,7 @@ use watchexec_events::{Event, FileType, Priority};
use watchexec_filterer_ignore::IgnoreFilterer;
/// A simple filterer in the style of the watchexec v1.17 filter.
#[derive(Debug)]
#[cfg_attr(feature = "full_debug", derive(Debug))]
pub struct GlobsetFilterer {
#[cfg_attr(not(unix), allow(dead_code))]
origin: PathBuf,
@ -31,6 +32,19 @@ pub struct GlobsetFilterer {
extensions: Vec<OsString>,
}
#[cfg(not(feature = "full_debug"))]
impl fmt::Debug for GlobsetFilterer {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("GlobsetFilterer")
.field("origin", &self.origin)
.field("filters", &"ignore::gitignore::Gitignore{...}")
.field("ignores", &"ignore::gitignore::Gitignore{...}")
.field("ignore_files", &self.ignore_files)
.field("extensions", &self.extensions)
.finish()
}
}
impl GlobsetFilterer {
/// Create a new `GlobsetFilterer` from a project origin, allowed extensions, and lists of globs.
///

View File

@ -2,6 +2,12 @@
## Next (YYYY-MM-DD)
## v4.0.1 (2024-04-28)
## v4.0.0 (2024-04-20)
- Deps: watchexec 4
## v3.0.1 (2024-01-04)
- Normalise paths on all platforms (via `normalize-path`).

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-filterer-ignore"
version = "3.0.1"
version = "4.0.1"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
@ -22,11 +22,11 @@ normalize-path = "0.2.1"
tracing = "0.1.40"
[dependencies.ignore-files]
version = "2.1.0"
version = "3.0.1"
path = "../../ignore-files"
[dependencies.watchexec]
version = "3.0.1"
version = "4.1.0"
path = "../../lib"
[dependencies.watchexec-events]
@ -41,7 +41,7 @@ path = "../../signals"
tracing-subscriber = "0.3.6"
[dev-dependencies.project-origins]
version = "1.3.0"
version = "1.4.0"
path = "../../project-origins"
[dev-dependencies.tokio]

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: filterer-ignore v{{version}}"
tag-prefix = "filterer-ignore-"
tag-prefix = "watchexec-filterer-ignore-"
tag-message = "watchexec-filterer-ignore {{version}}"
[[pre-release-replacements]]

View File

@ -1,32 +0,0 @@
# Changelog
## Next (YYYY-MM-DD)
- Deps: miette 7
## v2.0.0 (2024-01-01)
- Depend on `watchexec-events` instead of the `watchexec` re-export.
## v1.0.0 (2023-12-10)
- Officially deprecate (crate is now unmaintained).
- Depend on `watchexec-events` instead of the `watchexec` re-export.
- Remove error diagnostic codes.
- Deps: upgrade Tokio requirement to 1.32.
## v0.3.0 (2023-03-18)
- Ditch MSRV policy. The `rust-version` indication will remain, for the minimum estimated Rust version for the code features used in the crate's own code, but dependencies may have already moved on. From now on, only latest stable is assumed and tested for. ([#510](https://github.com/watchexec/watchexec/pull/510))
## v0.2.0 (2023-01-09)
- MSRV: bump to 1.61.0
## v0.1.1 (2022-09-07)
- Deps: update miette to 5.3.0
## v0.1.0 (2022-06-23)
- Initial release as a separate crate.

View File

@ -1,71 +0,0 @@
[package]
name = "watchexec-filterer-tagged"
version = "2.0.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
description = "Watchexec filterer component using tagged filters"
keywords = ["watchexec", "filterer", "tags"]
documentation = "https://docs.rs/watchexec-filterer-tagged"
homepage = "https://watchexec.github.io"
repository = "https://github.com/watchexec/watchexec"
readme = "README.md"
rust-version = "1.61.0"
edition = "2021"
[badges.maintenance]
status = "deprecated"
[dependencies]
futures = "0.3.25"
globset = "0.4.8"
ignore = "0.4.18"
miette = "7.2.0"
nom = "7.0.0"
regex = "1.5.4"
thiserror = "1.0.26"
tracing = "0.1.26"
unicase = "2.6.0"
[dependencies.ignore-files]
version = "2.1.0"
path = "../../ignore-files"
[dependencies.tokio]
version = "1.32.0"
features = [
"fs",
]
[dependencies.watchexec]
version = "3.0.1"
path = "../../lib"
[dependencies.watchexec-events]
version = "3.0.0"
path = "../../events"
[dependencies.watchexec-filterer-ignore]
version = "3.0.1"
path = "../ignore"
[dependencies.watchexec-signals]
version = "3.0.0"
path = "../../signals"
[dev-dependencies]
tracing-subscriber = "0.3.6"
[dev-dependencies.project-origins]
version = "1.3.0"
path = "../../project-origins"
[dev-dependencies.tokio]
version = "1.32.0"
features = [
"fs",
"io-std",
"sync",
]

View File

@ -1,19 +0,0 @@
[![Crates.io page](https://badgen.net/crates/v/watchexec-filterer-tagged)](https://crates.io/crates/watchexec-filterer-tagged)
[![API Docs](https://docs.rs/watchexec-filterer-tagged/badge.svg)][docs]
[![Crate license: Apache 2.0](https://badgen.net/badge/license/Apache%202.0)][license]
[![CI status](https://github.com/watchexec/watchexec/actions/workflows/check.yml/badge.svg)](https://github.com/watchexec/watchexec/actions/workflows/check.yml)
# Watchexec filterer: tagged
_Experimental filterer using tagged filters._
- **[API documentation][docs]**.
- Licensed under [Apache 2.0][license].
- Status: soft-deprecated.
The tagged filterer is not used by the Watchexec CLI, but this crate will continue to be updated
for third-party users, until and unless maintaining it becomes too burdensome. It is expected that
some of the code will eventually be reused in a more generic filter crate without the tagged syntax.
[docs]: https://docs.rs/watchexec-filterer-tagged
[license]: ../../../LICENSE

View File

@ -1,10 +0,0 @@
pre-release-commit-message = "release: filterer-tagged v{{version}}"
tag-prefix = "filterer-tagged-"
tag-message = "watchexec-filterer-tagged {{version}}"
[[pre-release-replacements]]
file = "CHANGELOG.md"
search = "^## Next.*$"
replace = "## Next (YYYY-MM-DD)\n\n## v{{version}} ({{date}})"
prerelease = true
max = 1

View File

@ -1,73 +0,0 @@
use std::collections::HashMap;
use ignore::gitignore::Gitignore;
use miette::Diagnostic;
use thiserror::Error;
use tokio::sync::watch::error::SendError;
use watchexec::error::RuntimeError;
use watchexec_filterer_ignore::IgnoreFilterer;
use crate::{Filter, Matcher};
/// Errors emitted by the `TaggedFilterer`.
#[derive(Debug, Diagnostic, Error)]
#[non_exhaustive]
pub enum TaggedFiltererError {
/// Generic I/O error, with some context.
#[error("io({about}): {err}")]
IoError {
/// What it was about.
about: &'static str,
/// The I/O error which occurred.
#[source]
err: std::io::Error,
},
/// Error received when a tagged filter cannot be parsed.
#[error("cannot parse filter `{src}`: {err:?}")]
Parse {
/// The source of the filter.
#[source_code]
src: String,
/// What went wrong.
err: nom::error::ErrorKind,
},
/// Error received when a filter cannot be added or removed from a tagged filter list.
#[error("cannot {action} filter: {err:?}")]
FilterChange {
/// The action that was attempted.
action: &'static str,
/// The underlying error.
#[source]
err: SendError<HashMap<Matcher, Vec<Filter>>>,
},
/// Error received when a glob cannot be parsed.
#[error("cannot parse glob: {0}")]
GlobParse(#[source] ignore::Error),
/// Error received when a compiled globset cannot be changed.
#[error("cannot change compiled globset: {0:?}")]
GlobsetChange(#[source] SendError<Option<Gitignore>>),
/// Error received about the internal ignore filterer.
#[error("ignore filterer: {0}")]
Ignore(#[source] ignore_files::Error),
/// Error received when a new ignore filterer cannot be swapped in.
#[error("cannot swap in new ignore filterer: {0:?}")]
IgnoreSwap(#[source] SendError<IgnoreFilterer>),
}
impl From<TaggedFiltererError> for RuntimeError {
fn from(err: TaggedFiltererError) -> Self {
Self::Filterer {
kind: "tagged",
err: Box::new(err) as _,
}
}
}

View File

@ -1,93 +0,0 @@
use std::{
env,
io::Error,
path::{Path, PathBuf},
str::FromStr,
};
use ignore_files::{discover_file, IgnoreFile};
use tokio::fs::read_to_string;
use crate::{Filter, TaggedFiltererError};
/// A filter file.
///
/// This is merely a type wrapper around an [`IgnoreFile`], as the only difference is how the file
/// is parsed.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct FilterFile(pub IgnoreFile);
/// Finds all filter files that apply to the current runtime.
///
/// This considers:
/// - `$XDG_CONFIG_HOME/watchexec/filter`, as well as other locations (APPDATA on Windows…)
/// - Files from the `WATCHEXEC_FILTER_FILES` environment variable (comma-separated)
///
/// All errors (permissions, etc) are collected and returned alongside the ignore files: you may
/// want to show them to the user while still using whatever ignores were successfully found. Errors
/// from files not being found are silently ignored (the files are just not returned).
pub async fn discover_files_from_environment() -> (Vec<FilterFile>, Vec<Error>) {
let mut files = Vec::new();
let mut errors = Vec::new();
for path in env::var("WATCHEXEC_FILTER_FILES")
.unwrap_or_default()
.split(',')
{
discover_file(&mut files, &mut errors, None, None, PathBuf::from(path)).await;
}
let mut wgis = Vec::with_capacity(5);
if let Ok(home) = env::var("XDG_CONFIG_HOME") {
wgis.push(Path::new(&home).join("watchexec/filter"));
}
if let Ok(home) = env::var("APPDATA") {
wgis.push(Path::new(&home).join("watchexec/filter"));
}
if let Ok(home) = env::var("USERPROFILE") {
wgis.push(Path::new(&home).join(".watchexec/filter"));
}
if let Ok(home) = env::var("HOME") {
wgis.push(Path::new(&home).join(".watchexec/filter"));
}
for path in wgis {
if discover_file(&mut files, &mut errors, None, None, path).await {
break;
}
}
(files.into_iter().map(FilterFile).collect(), errors)
}
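As a standalone sketch, the comma-splitting of `WATCHEXEC_FILTER_FILES` described above looks like this. Note the empty-segment guard is an addition for illustration; the original hands every segment to `discover_file`, which silently ignores files that don't exist:

```rust
use std::path::PathBuf;

// Split a WATCHEXEC_FILTER_FILES-style value into candidate paths.
// Hypothetical helper: empty segments are skipped here, whereas the
// original passes them through to discover_file.
fn candidate_paths(var: &str) -> Vec<PathBuf> {
    var.split(',')
        .filter(|s| !s.is_empty())
        .map(PathBuf::from)
        .collect()
}

fn main() {
    let paths = candidate_paths("/etc/we/filter,/home/me/.filter");
    assert_eq!(paths.len(), 2);
    assert_eq!(paths[0], PathBuf::from("/etc/we/filter"));
    assert!(candidate_paths("").is_empty());
    println!("ok");
}
```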
impl FilterFile {
/// Read and parse into [`Filter`]s.
///
/// Empty lines and lines starting with `#` are ignored. The `applies_in` field of the
/// [`IgnoreFile`] is used for the `in_path` field of each [`Filter`].
///
/// This method reads the entire file into memory.
pub async fn load(&self) -> Result<Vec<Filter>, TaggedFiltererError> {
let content =
read_to_string(&self.0.path)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "filter file load",
err,
})?;
let lines = content.lines();
let mut filters = Vec::with_capacity(lines.size_hint().0);
for line in lines {
if line.is_empty() || line.starts_with('#') {
continue;
}
let mut f = Filter::from_str(line)?;
f.in_path = self.0.applies_in.clone();
filters.push(f);
}
Ok(filters)
}
}
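The comment-and-blank-line skipping that `load` performs before parsing can be shown in isolation (hypothetical helper; the real method then parses each surviving line into a `Filter`):

```rust
// Keep only lines that FilterFile::load would try to parse: empty
// lines and `#` comments are skipped, exactly as in load() above.
// Note lines are not trimmed, so whitespace-only lines survive.
fn parseable_lines(content: &str) -> Vec<&str> {
    content
        .lines()
        .filter(|line| !line.is_empty() && !line.starts_with('#'))
        .collect()
}

fn main() {
    let file = "# comment\n\nfirst\nsecond";
    assert_eq!(parseable_lines(file), vec!["first", "second"]);
    println!("ok");
}
```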

View File

@ -1,276 +0,0 @@
use std::collections::HashSet;
use std::path::PathBuf;
use globset::Glob;
use regex::Regex;
use tokio::fs::canonicalize;
use tracing::{trace, warn};
use unicase::UniCase;
use watchexec_events::Tag;
use crate::TaggedFiltererError;
/// A tagged filter.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct Filter {
/// Path the filter applies from.
pub in_path: Option<PathBuf>,
/// Which tag the filter applies to.
pub on: Matcher,
/// The operation to perform on the tag's value.
pub op: Op,
/// The pattern to match against the tag's value.
pub pat: Pattern,
/// If true, a positive match with this filter will override negative matches from previous
/// filters on the same tag, and negative matches will be ignored.
pub negate: bool,
}
impl Filter {
/// Matches the filter against a subject.
///
/// This is really an internal method to the tagged filterer machinery, exposed so you can build
/// your own filterer using the same types or the textual syntax. As such its behaviour is not
/// guaranteed to be stable (its signature is, though).
pub fn matches(&self, subject: impl AsRef<str>) -> Result<bool, TaggedFiltererError> {
let subject = subject.as_ref();
trace!(op=?self.op, pat=?self.pat, ?subject, "performing filter match");
Ok(match (self.op, &self.pat) {
(Op::Equal, Pattern::Exact(pat)) => UniCase::new(subject) == UniCase::new(pat),
(Op::NotEqual, Pattern::Exact(pat)) => UniCase::new(subject) != UniCase::new(pat),
(Op::Regex, Pattern::Regex(pat)) => pat.is_match(subject),
(Op::NotRegex, Pattern::Regex(pat)) => !pat.is_match(subject),
(Op::InSet, Pattern::Set(set)) => set.contains(subject),
(Op::InSet, Pattern::Exact(pat)) => subject == pat,
(Op::NotInSet, Pattern::Set(set)) => !set.contains(subject),
(Op::NotInSet, Pattern::Exact(pat)) => subject != pat,
(op @ (Op::Glob | Op::NotGlob), Pattern::Glob(glob)) => {
// FIXME: someway that isn't this horrible
match Glob::new(glob) {
Ok(glob) => {
let matches = glob.compile_matcher().is_match(subject);
match op {
Op::Glob => matches,
Op::NotGlob => !matches,
_ => unreachable!(),
}
}
Err(err) => {
warn!(
"failed to compile glob for non-path match, skipping (pass): {}",
err
);
true
}
}
}
(op, pat) => {
warn!(
"trying to match pattern {:?} with op {:?}, that cannot work",
pat, op
);
false
}
})
}
/// Create a filter from a gitignore-style glob pattern.
///
/// The optional path is for the `in_path` field of the filter. When parsing gitignore files, it
/// should be set to the path of the _directory_ the ignore file is in.
///
/// The resulting filter matches on [`Path`][Matcher::Path], with the [`NotGlob`][Op::NotGlob]
/// op, and a [`Glob`][Pattern::Glob] pattern. If it starts with a `!`, it is negated.
#[must_use]
pub fn from_glob_ignore(in_path: Option<PathBuf>, glob: &str) -> Self {
let (glob, negate) = glob.strip_prefix('!').map_or((glob, false), |g| (g, true));
Self {
in_path,
on: Matcher::Path,
op: Op::NotGlob,
pat: Pattern::Glob(glob.to_string()),
negate,
}
}
/// Returns the filter with its `in_path` canonicalised.
pub async fn canonicalised(mut self) -> Result<Self, TaggedFiltererError> {
if let Some(ctx) = self.in_path {
self.in_path =
Some(
canonicalize(&ctx)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise Filter in_path",
err,
})?,
);
trace!(canon=?ctx, "canonicalised in_path");
}
Ok(self)
}
}
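The `!` handling at the heart of `from_glob_ignore` is worth seeing on its own; a minimal sketch of just the prefix-stripping:

```rust
// Mirror of the negation handling in Filter::from_glob_ignore:
// a leading `!` is stripped and flips the filter to a negation.
fn split_negation(glob: &str) -> (&str, bool) {
    glob.strip_prefix('!').map_or((glob, false), |g| (g, true))
}

fn main() {
    assert_eq!(split_negation("target/"), ("target/", false));
    assert_eq!(split_negation("!keep.log"), ("keep.log", true));
    println!("ok");
}
```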
/// What a filter matches on.
#[derive(Clone, Copy, Debug, Eq, PartialEq, Hash)]
#[non_exhaustive]
pub enum Matcher {
/// The presence of a tag on an event.
Tag,
/// A path in a filesystem event. Paths are always canonicalised.
///
/// Note that there may be multiple paths in an event (e.g. both source and destination for renames), and filters
/// will be matched on all of them.
Path,
/// The file type of an object in a filesystem event.
///
/// This is not guaranteed to be present for every filesystem event.
///
/// It can be any of these values: `file`, `dir`, `symlink`, `other`. That last one means
/// "not any of the first three."
FileType,
/// The [`EventKind`][notify::event::EventKind] of a filesystem event.
///
/// This is the Debug representation of the event kind. Examples:
/// - `Access(Close(Write))`
/// - `Modify(Data(Any))`
/// - `Modify(Metadata(Permissions))`
/// - `Remove(Folder)`
///
/// You should probably use globs or regexes to match these, ex:
/// - `Create(*)`
/// - `Modify\(Name\(.+`
FileEventKind,
/// The [event source][crate::event::Source] the event came from.
///
/// These are the lowercase names of the variants.
Source,
/// The ID of the process which caused the event.
///
/// Note that it's rare for events to carry this information.
Process,
/// A signal sent to the main process.
///
/// This can be matched both on the signal number as an integer, and on the signal name as a
/// string. On Windows, only `BREAK` is supported; `CTRL_C` parses but won't work. Matching is
/// on both uppercase and lowercase forms.
///
/// Interrupt signals (`TERM` and `INT` on Unix, `CTRL_C` on Windows) are parsed, but these are
/// marked Urgent internally to Watchexec, and thus bypass filtering entirely.
Signal,
/// The exit status of a subprocess.
///
/// This is only present for events issued when the subprocess exits. The value is matched on
/// both the exit code as an integer, and on either `success` or `fail`, whichever applies.
ProcessCompletion,
/// The [`Priority`] of the event.
///
/// This is never `urgent`, as urgent events bypass filtering.
Priority,
}
impl Matcher {
pub(crate) fn from_tag(tag: &Tag) -> &'static [Self] {
match tag {
Tag::Path {
file_type: None, ..
} => &[Self::Path],
Tag::Path { .. } => &[Self::Path, Self::FileType],
Tag::FileEventKind(_) => &[Self::FileEventKind],
Tag::Source(_) => &[Self::Source],
Tag::Process(_) => &[Self::Process],
Tag::Signal(_) => &[Self::Signal],
Tag::ProcessCompletion(_) => &[Self::ProcessCompletion],
_ => {
warn!("unhandled tag: {:?}", tag);
&[]
}
}
}
}
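The dual name/number matching documented for `Matcher::Signal` can be sketched without the watchexec types. This is a hypothetical standalone helper; the real implementation calls `Filter::matches` once with each form:

```rust
// A signal filter value may be the bare name ("TERM"), the SIG-prefixed
// name ("SIGTERM"), or the signal number ("15"); names match
// case-insensitively, per the Matcher::Signal docs.
fn signal_matches(filter_value: &str, name: &str, number: i32) -> bool {
    filter_value.eq_ignore_ascii_case(name)
        || filter_value.eq_ignore_ascii_case(&format!("SIG{name}"))
        || filter_value == number.to_string()
}

fn main() {
    assert!(signal_matches("term", "TERM", 15));
    assert!(signal_matches("SIGTERM", "TERM", 15));
    assert!(signal_matches("15", "TERM", 15));
    assert!(!signal_matches("INT", "TERM", 15));
    println!("ok");
}
```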
/// How a filter value is interpreted.
///
/// - `==` and `!=` match on the exact value as string equality (case-insensitively),
/// - `~=` and `~!` match using a [regex],
/// - `*=` and `*!` match using a glob, either via [globset] or [ignore]
/// - `:=` and `:!` match via exact string comparisons, but on any of the list of values separated
/// by `,`
/// - `=`, the "auto" operator, behaves as `*=` if the matcher is `Path`, and as `==` otherwise.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
#[non_exhaustive]
pub enum Op {
/// The auto operator, `=`, resolves to `*=` or `==` depending on the matcher.
Auto,
/// The `==` operator, matches on exact string equality.
Equal,
/// The `!=` operator, matches on exact string inequality.
NotEqual,
/// The `~=` operator, matches on a regex.
Regex,
/// The `~!` operator, matches on a regex (matches are fails).
NotRegex,
/// The `*=` operator, matches on a glob.
Glob,
/// The `*!` operator, matches on a glob (matches are fails).
NotGlob,
/// The `:=` operator, matches (with string compares) on a set of values (belongs are passes).
InSet,
/// The `:!` operator, matches on a set of values (belongs are fails).
NotInSet,
}
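The set operators (`:=` / `:!`) reduce to exact-string membership in a comma-separated value list, as described in the doc comment above. A minimal standalone sketch:

```rust
use std::collections::HashSet;

// `:=` membership test: the subject must equal one of the
// comma-separated values; `:!` is simply the negation of this.
fn in_set(values: &str, subject: &str) -> bool {
    let set: HashSet<&str> = values.split(',').collect();
    set.contains(subject)
}

fn main() {
    assert!(in_set("low,normal", "normal"));
    assert!(!in_set("low,normal", "high"));
    println!("ok");
}
```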
/// A filter value (pattern to match with).
#[derive(Debug, Clone)]
#[non_exhaustive]
pub enum Pattern {
/// An exact string.
Exact(String),
/// A regex.
Regex(Regex),
/// A glob.
///
/// This is stored as a string as globs are compiled together rather than on a per-filter basis.
Glob(String),
/// A set of exact strings.
Set(HashSet<String>),
}
impl PartialEq<Self> for Pattern {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(Self::Exact(l), Self::Exact(r)) | (Self::Glob(l), Self::Glob(r)) => l == r,
(Self::Regex(l), Self::Regex(r)) => l.as_str() == r.as_str(),
(Self::Set(l), Self::Set(r)) => l == r,
_ => false,
}
}
}
impl Eq for Pattern {}

View File

@ -1,537 +0,0 @@
use std::path::PathBuf;
use std::sync::Arc;
use std::{collections::HashMap, convert::Into};
use futures::{stream::FuturesOrdered, TryStreamExt};
use ignore::{
gitignore::{Gitignore, GitignoreBuilder},
Match,
};
use ignore_files::{IgnoreFile, IgnoreFilter};
use tokio::fs::canonicalize;
use tracing::{debug, trace, trace_span};
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{Event, FileType, Priority, ProcessEnd, Tag};
use watchexec_filterer_ignore::IgnoreFilterer;
use watchexec_signals::Signal;
use crate::{swaplock::SwapLock, Filter, Matcher, Op, Pattern, TaggedFiltererError};
/// A complex filterer that can match any event tag and supports different matching operators.
///
/// See the crate-level documentation for more information.
#[derive(Debug)]
pub struct TaggedFilterer {
/// The directory the project is in, its origin.
///
/// This is used to resolve absolute paths without an `in_path` context.
origin: PathBuf,
/// Where the program is running from.
///
/// This is used to resolve relative paths without an `in_path` context.
workdir: PathBuf,
/// All filters that are applied, in order, by matcher.
filters: SwapLock<HashMap<Matcher, Vec<Filter>>>,
/// Sub-filterer for ignore files.
ignore_filterer: SwapLock<IgnoreFilterer>,
/// Compiled matcher for Glob filters.
glob_compiled: SwapLock<Option<Gitignore>>,
/// Compiled matcher for NotGlob filters.
not_glob_compiled: SwapLock<Option<Gitignore>>,
}
impl Filterer for TaggedFilterer {
fn check_event(&self, event: &Event, priority: Priority) -> Result<bool, RuntimeError> {
self.check(event, priority).map_err(Into::into)
}
}
impl TaggedFilterer {
fn check(&self, event: &Event, priority: Priority) -> Result<bool, TaggedFiltererError> {
let _span = trace_span!("filterer_check").entered();
trace!(?event, ?priority, "checking event");
{
trace!("checking priority");
if let Some(filters) = self.filters.borrow().get(&Matcher::Priority).cloned() {
trace!(filters=%filters.len(), "found some filters for priority");
//
let mut pri_match = true;
for filter in &filters {
let _span = trace_span!("checking filter against priority", ?filter).entered();
let applies = filter.matches(match priority {
Priority::Low => "low",
Priority::Normal => "normal",
Priority::High => "high",
Priority::Urgent => unreachable!("urgent by-passes filtering"),
})?;
if filter.negate {
if applies {
trace!(prev=%pri_match, now=%true, "negate filter passes, passing this priority");
pri_match = true;
break;
}
trace!(prev=%pri_match, now=%pri_match, "negate filter fails, ignoring");
} else {
trace!(prev=%pri_match, this=%applies, now=%(pri_match&applies), "filter applies to priority");
pri_match &= applies;
}
}
if !pri_match {
trace!("priority fails check, failing entire event");
return Ok(false);
}
} else {
trace!("no filters for priority, skipping (pass)");
}
}
{
trace!("checking internal ignore filterer");
let igf = self.ignore_filterer.borrow();
if !igf
.check_event(event, priority)
.expect("IgnoreFilterer never errors")
{
trace!("internal ignore filterer matched (fail)");
return Ok(false);
}
}
if self.filters.borrow().is_empty() {
trace!("no filters, skipping entire check (pass)");
return Ok(true);
}
trace!(tags=%event.tags.len(), "checking all tags on the event");
for tag in &event.tags {
let _span = trace_span!("check_tag", ?tag).entered();
trace!("checking tag");
for matcher in Matcher::from_tag(tag) {
let _span = trace_span!("check_matcher", ?matcher).entered();
let filters = self.filters.borrow().get(matcher).cloned();
if let Some(tag_filters) = filters {
if tag_filters.is_empty() {
trace!("no filters for this matcher, skipping (pass)");
continue;
}
trace!(filters=%tag_filters.len(), "found some filters for this matcher");
let mut tag_match = true;
if let (Matcher::Path, Tag::Path { path, file_type }) = (matcher, tag) {
let is_dir = file_type.map_or(false, |ft| matches!(ft, FileType::Dir));
{
let gc = self.glob_compiled.borrow();
if let Some(igs) = gc.as_ref() {
let _span =
trace_span!("checking_compiled_filters", compiled=%"Glob")
.entered();
match if path.strip_prefix(&self.origin).is_ok() {
trace!("checking against path or parents");
igs.matched_path_or_any_parents(path, is_dir)
} else {
trace!("checking against path only");
igs.matched(path, is_dir)
} {
Match::None => {
trace!("no match (fail)");
tag_match &= false;
}
Match::Ignore(glob) => {
if glob
.from()
.map_or(true, |f| path.strip_prefix(f).is_ok())
{
trace!(?glob, "positive match (pass)");
tag_match &= true;
} else {
trace!(
?glob,
"positive match, but not in scope (ignore)"
);
}
}
Match::Whitelist(glob) => {
trace!(?glob, "negative match (ignore)");
}
}
}
}
{
let ngc = self.not_glob_compiled.borrow();
if let Some(ngs) = ngc.as_ref() {
let _span =
trace_span!("checking_compiled_filters", compiled=%"NotGlob")
.entered();
match if path.strip_prefix(&self.origin).is_ok() {
trace!("checking against path or parents");
ngs.matched_path_or_any_parents(path, is_dir)
} else {
trace!("checking against path only");
ngs.matched(path, is_dir)
} {
Match::None => {
trace!("no match (pass)");
tag_match &= true;
}
Match::Ignore(glob) => {
if glob
.from()
.map_or(true, |f| path.strip_prefix(f).is_ok())
{
trace!(?glob, "positive match (fail)");
tag_match &= false;
} else {
trace!(
?glob,
"positive match, but not in scope (ignore)"
);
}
}
Match::Whitelist(glob) => {
trace!(?glob, "negative match (pass)");
tag_match = true;
}
}
}
}
}
// those are handled with the compiled ignore filters above
let tag_filters = tag_filters
.into_iter()
.filter(|f| {
!matches!(
(tag, matcher, f),
(
Tag::Path { .. },
Matcher::Path,
Filter {
on: Matcher::Path,
op: Op::Glob | Op::NotGlob,
pat: Pattern::Glob(_),
..
}
)
)
})
.collect::<Vec<_>>();
if tag_filters.is_empty() && tag_match {
trace!("no more filters for this matcher, skipping (pass)");
continue;
}
trace!(filters=%tag_filters.len(), "got some filters to check still");
for filter in &tag_filters {
let _span = trace_span!("checking filter against tag", ?filter).entered();
if let Some(app) = self.match_tag(filter, tag)? {
if filter.negate {
if app {
trace!(prev=%tag_match, now=%true, "negate filter passes, passing this matcher");
tag_match = true;
break;
}
trace!(prev=%tag_match, now=%tag_match, "negate filter fails, ignoring");
} else {
trace!(prev=%tag_match, this=%app, now=%(tag_match&app), "filter applies to this tag");
tag_match &= app;
}
}
}
if !tag_match {
trace!("matcher fails check, failing entire event");
return Ok(false);
}
trace!("matcher passes check, continuing");
} else {
trace!("no filters for this matcher, skipping (pass)");
}
}
}
trace!("passing event");
Ok(true)
}
/// Initialise a new tagged filterer with no filters.
///
/// This takes two paths: the project origin, and the current directory. The current directory
/// is not obtained from the environment so you can customise it; generally you should use
/// [`std::env::current_dir()`] though.
///
/// The origin is the directory the main project that is being watched is in. This is used to
/// resolve absolute paths given in filters without an `in_path` field (e.g. all filters parsed
/// from text), and for ignore file based filtering.
///
/// The workdir is used to resolve relative paths given in filters without an `in_path` field.
///
/// So, if origin is `/path/to/project` and workdir is `/path/to/project/subtree`:
/// - `path=foo.bar` is resolved to `/path/to/project/subtree/foo.bar`
/// - `path=/foo.bar` is resolved to `/path/to/project/foo.bar`
pub async fn new(origin: PathBuf, workdir: PathBuf) -> Result<Arc<Self>, TaggedFiltererError> {
let origin = canonicalize(origin)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise origin on new tagged filterer",
err,
})?;
Ok(Arc::new(Self {
filters: SwapLock::new(HashMap::new()),
ignore_filterer: SwapLock::new(IgnoreFilterer(IgnoreFilter::empty(&origin))),
glob_compiled: SwapLock::new(None),
not_glob_compiled: SwapLock::new(None),
workdir: canonicalize(workdir)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise workdir on new tagged filterer",
err,
})?,
origin,
}))
}
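The origin/workdir resolution rules from the doc comment above can be condensed into a small sketch (hypothetical helper; the actual filterer applies these rules per-tag in `match_tag`, by stripping prefixes from event paths rather than joining):

```rust
use std::path::{Path, PathBuf};

// Resolve a filter path without an `in_path` context, per the rules
// documented on TaggedFilterer::new: relative paths resolve against
// the workdir, absolute paths against the project origin.
fn resolve(origin: &Path, workdir: &Path, pat: &str) -> PathBuf {
    if let Some(abs) = pat.strip_prefix('/') {
        origin.join(abs)
    } else {
        workdir.join(pat)
    }
}

fn main() {
    let origin = Path::new("/path/to/project");
    let workdir = Path::new("/path/to/project/subtree");
    assert_eq!(
        resolve(origin, workdir, "foo.bar"),
        Path::new("/path/to/project/subtree/foo.bar")
    );
    assert_eq!(
        resolve(origin, workdir, "/foo.bar"),
        Path::new("/path/to/project/foo.bar")
    );
    println!("ok");
}
```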
// filter ctx   event path         filter    outcome
// /foo/bar     /foo/bar/baz.txt   baz.txt   pass
// /foo/bar     /foo/bar/baz.txt   /baz.txt  pass
// /foo/bar     /foo/bar/baz.txt   /baz.*    pass
// /foo/bar     /foo/bar/baz.txt   /blah     fail
// /foo/quz     /foo/bar/baz.txt   /baz.*    skip
// Ok(Some(bool)) => the match was applied, bool is the result
// Ok(None) => for some precondition, the match was not done (mismatched tag, out of context, …)
fn match_tag(&self, filter: &Filter, tag: &Tag) -> Result<Option<bool>, TaggedFiltererError> {
const fn sig_match(sig: Signal) -> (&'static str, i32) {
match sig {
Signal::Hangup | Signal::Custom(1) => ("HUP", 1),
Signal::ForceStop | Signal::Custom(9) => ("KILL", 9),
Signal::Interrupt | Signal::Custom(2) => ("INT", 2),
Signal::Quit | Signal::Custom(3) => ("QUIT", 3),
Signal::Terminate | Signal::Custom(15) => ("TERM", 15),
Signal::User1 | Signal::Custom(10) => ("USR1", 10),
Signal::User2 | Signal::Custom(12) => ("USR2", 12),
Signal::Custom(n) => ("UNK", n),
_ => ("UNK", 0),
}
}
trace!(matcher=?filter.on, "matching filter to tag");
match (tag, filter.on) {
(tag, Matcher::Tag) => filter.matches(tag.discriminant_name()),
(Tag::Path { path, .. }, Matcher::Path) => {
let resolved = if let Some(ctx) = &filter.in_path {
if let Ok(suffix) = path.strip_prefix(ctx) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else {
return Ok(None);
}
} else if let Ok(suffix) = path.strip_prefix(&self.workdir) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else if let Ok(suffix) = path.strip_prefix(&self.origin) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else {
path.strip_prefix("/").unwrap_or(path)
};
trace!(?resolved, "resolved path to match filter against");
if matches!(filter.op, Op::Glob | Op::NotGlob) {
trace!("path glob match with match_tag is already handled");
return Ok(None);
}
filter.matches(resolved.to_string_lossy())
}
(
Tag::Path {
file_type: Some(ft),
..
},
Matcher::FileType,
) => filter.matches(ft.to_string()),
(Tag::FileEventKind(kind), Matcher::FileEventKind) => {
filter.matches(format!("{kind:?}"))
}
(Tag::Source(src), Matcher::Source) => filter.matches(src.to_string()),
(Tag::Process(pid), Matcher::Process) => filter.matches(pid.to_string()),
(Tag::Signal(sig), Matcher::Signal) => {
let (text, int) = sig_match(*sig);
Ok(filter.matches(text)?
|| filter.matches(format!("SIG{text}"))?
|| filter.matches(int.to_string())?)
}
(Tag::ProcessCompletion(ope), Matcher::ProcessCompletion) => match ope {
None => filter.matches("_"),
Some(ProcessEnd::Success) => filter.matches("success"),
Some(ProcessEnd::ExitError(int)) => filter.matches(format!("error({int})")),
Some(ProcessEnd::ExitSignal(sig)) => {
let (text, int) = sig_match(*sig);
Ok(filter.matches(format!("signal({text})"))?
|| filter.matches(format!("signal(SIG{text})"))?
|| filter.matches(format!("signal({int})"))?)
}
Some(ProcessEnd::ExitStop(int)) => filter.matches(format!("stop({int})")),
Some(ProcessEnd::Exception(int)) => filter.matches(format!("exception({int:X})")),
Some(ProcessEnd::Continued) => filter.matches("continued"),
},
(_, _) => {
trace!("no match for tag, skipping");
return Ok(None);
}
}
.map(Some)
}
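The `sig_match` helper above lets one signal filter value match the bare name, the `SIG`-prefixed name, or the signal number. A dependency-free sketch of that normalisation (the function names here are illustrative, not part of the crate):

```rust
// Each signal is matchable under three spellings: "INT", "SIGINT", and "2".
// These helpers mirror the idea of sig_match above; they are a sketch, not
// the crate's API.
fn sig_forms(name: &str, num: i32) -> [String; 3] {
    [name.to_string(), format!("SIG{name}"), num.to_string()]
}

fn signal_filter_matches(pattern: &str, name: &str, num: i32) -> bool {
    sig_forms(name, num).iter().any(|form| form.as_str() == pattern)
}

fn main() {
    assert!(signal_filter_matches("INT", "INT", 2));
    assert!(signal_filter_matches("SIGINT", "INT", 2));
    assert!(signal_filter_matches("2", "INT", 2));
    assert!(!signal_filter_matches("HUP", "INT", 2));
}
```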
/// Add some filters to the filterer.
///
/// This is async as it submits the new filters to the live filterer, which may be holding a
/// read lock. It takes a slice of filters so it can efficiently add a large number of filters
/// with a single write, without needing to acquire the lock repeatedly.
///
/// If any of the added filters use glob operations, this method recompiles the filterer's
/// glob matchers after inserting them. This should not be used for inserting an
/// [`IgnoreFile`]: use [`add_ignore_file()`](Self::add_ignore_file) instead.
pub async fn add_filters(&self, filters: &[Filter]) -> Result<(), TaggedFiltererError> {
debug!(?filters, "adding filters to filterer");
let mut recompile_globs = false;
let mut recompile_not_globs = false;
#[allow(clippy::from_iter_instead_of_collect)]
let filters = FuturesOrdered::from_iter(
filters
.iter()
.cloned()
.inspect(|f| match f.op {
Op::Glob => {
recompile_globs = true;
}
Op::NotGlob => {
recompile_not_globs = true;
}
_ => {}
})
.map(Filter::canonicalised),
)
.try_collect::<Vec<_>>()
.await?;
trace!(?filters, "canonicalised filters");
// TODO: use miette's related and issue canonicalisation errors for all of them
self.filters
.change(|fs| {
for filter in filters {
fs.entry(filter.on).or_default().push(filter);
}
})
.map_err(|err| TaggedFiltererError::FilterChange { action: "add", err })?;
trace!("inserted filters into swaplock");
if recompile_globs {
self.recompile_globs(Op::Glob)?;
}
if recompile_not_globs {
self.recompile_globs(Op::NotGlob)?;
}
Ok(())
}
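The insertion loop above buckets filters by their matcher with `entry().or_default().push()`, so checking can later iterate one group at a time. The same grouping pattern in isolation (the `group` helper is hypothetical):

```rust
use std::collections::HashMap;

// Bucket (key, value) pairs into per-key vectors, preserving insertion
// order within each bucket — the same shape as add_filters' insertion loop.
fn group<K: std::hash::Hash + Eq, V>(items: Vec<(K, V)>) -> HashMap<K, Vec<V>> {
    let mut map: HashMap<K, Vec<V>> = HashMap::new();
    for (k, v) in items {
        map.entry(k).or_default().push(v);
    }
    map
}

fn main() {
    let grouped = group(vec![("path", "a"), ("signal", "b"), ("path", "c")]);
    assert_eq!(grouped["path"], vec!["a", "c"]);
    assert_eq!(grouped["signal"], vec!["b"]);
}
```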
fn recompile_globs(&self, op_filter: Op) -> Result<(), TaggedFiltererError> {
trace!(?op_filter, "recompiling globs");
let target = match op_filter {
Op::Glob => &self.glob_compiled,
Op::NotGlob => &self.not_glob_compiled,
_ => unreachable!("recompile_globs called with invalid op"),
};
let globs = {
let filters = self.filters.borrow();
if let Some(fs) = filters.get(&Matcher::Path) {
trace!(?op_filter, "pulling filters from swaplock");
// we want to hold the lock as little as possible, so we clone the filters
fs.iter()
.filter(|&f| f.op == op_filter)
.cloned()
.collect::<Vec<_>>()
} else {
trace!(?op_filter, "no filters, erasing compiled glob");
return target
.replace(None)
.map_err(TaggedFiltererError::GlobsetChange);
}
};
let mut builder = GitignoreBuilder::new(&self.origin);
for filter in globs {
if let Pattern::Glob(mut glob) = filter.pat {
if filter.negate {
glob.insert(0, '!');
}
trace!(?op_filter, in_path=?filter.in_path, ?glob, "adding new glob line");
builder
.add_line(filter.in_path, &glob)
.map_err(TaggedFiltererError::GlobParse)?;
}
}
trace!(?op_filter, "finalising compiled glob");
let compiled = builder.build().map_err(TaggedFiltererError::GlobParse)?;
trace!(?op_filter, "swapping in new compiled glob");
target
.replace(Some(compiled))
.map_err(TaggedFiltererError::GlobsetChange)
}
/// Reads a gitignore-style [`IgnoreFile`] and adds it to the filterer.
pub async fn add_ignore_file(&self, file: &IgnoreFile) -> Result<(), TaggedFiltererError> {
let mut new = { self.ignore_filterer.borrow().clone() };
new.0
.add_file(file)
.await
.map_err(TaggedFiltererError::Ignore)?;
self.ignore_filterer
.replace(new)
.map_err(TaggedFiltererError::IgnoreSwap)?;
Ok(())
}
/// Clears all filters from the filterer.
///
/// This also recompiles the glob matchers, effectively resetting the entire filterer state.
pub fn clear_filters(&self) -> Result<(), TaggedFiltererError> {
debug!("removing all filters from filterer");
self.filters.replace(Default::default()).map_err(|err| {
TaggedFiltererError::FilterChange {
action: "clear all",
err,
}
})?;
self.recompile_globs(Op::Glob)?;
self.recompile_globs(Op::NotGlob)?;
Ok(())
}
}
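The prefix-stripping order in `match_tag` above (filter context, then workdir, then origin, then a bare leading `/`) can be sketched with `std::path` alone. `resolve` is a hypothetical name, and the real code also strips a leading `/` from each suffix and handles the filter's `in_path` context:

```rust
use std::path::Path;

// Resolve an event path to the form a filter is matched against: relative
// to the workdir if possible, else relative to the origin, else with any
// leading "/" removed.
fn resolve<'a>(path: &'a Path, workdir: &Path, origin: &Path) -> &'a Path {
    path.strip_prefix(workdir)
        .or_else(|_| path.strip_prefix(origin))
        .unwrap_or_else(|_| path.strip_prefix("/").unwrap_or(path))
}

fn main() {
    let origin = Path::new("/path/to/project");
    let workdir = Path::new("/path/to/project/subtree");
    // Matches the doc example on TaggedFilterer::new.
    assert_eq!(
        resolve(Path::new("/path/to/project/subtree/foo.bar"), workdir, origin),
        Path::new("foo.bar")
    );
    assert_eq!(
        resolve(Path::new("/path/to/project/lib.rs"), workdir, origin),
        Path::new("lib.rs")
    );
}
```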


@@ -1,92 +0,0 @@
//! A filterer implementation that exposes the full capabilities of Watchexec.
//!
//! Filters match against [event tags][Tag]; can be exact matches, glob matches, regex matches, or
//! set matches; can reverse the match (equal/not equal, etc); and can be negated.
//!
//! [Filters][Filter] can be generated from your application and inserted directly, or they can be
//! parsed from a textual format:
//!
//! ```text
//! [!]{Matcher}{Op}{Value}
//! ```
//!
//! For example:
//!
//! ```text
//! path==/foo/bar
//! path*=**/bar
//! path~=bar$
//! !kind=file
//! ```
//!
//! There is a set of [operators][Op]:
//! - `==` and `!=`: exact match and exact not match (case insensitive)
//! - `~=` and `~!`: regex match and regex not match
//! - `*=` and `*!`: glob match and glob not match
//! - `:=` and `:!`: set match and set not match
//!
//! Sets are a list of values separated by `,`.
//!
//! In addition to the two-symbol operators, there is the `=` "auto" operator, which maps to the
//! most convenient operator for the given _matcher_. The current mapping is:
//!
//! | Matcher | Operator |
//! |---------------------------------------------------|---------------|
//! | [`Tag`](Matcher::Tag) | `:=` (in set) |
//! | [`Path`](Matcher::Path) | `*=` (glob) |
//! | [`FileType`](Matcher::FileType) | `:=` (in set) |
//! | [`FileEventKind`](Matcher::FileEventKind) | `*=` (glob) |
//! | [`Source`](Matcher::Source) | `:=` (in set) |
//! | [`Process`](Matcher::Process) | `:=` (in set) |
//! | [`Signal`](Matcher::Signal) | `:=` (in set) |
//! | [`ProcessCompletion`](Matcher::ProcessCompletion) | `*=` (glob) |
//! | [`Priority`](Matcher::Priority) | `:=` (in set) |
//!
//! [Matchers][Matcher] correspond to Tags, but are not one-to-one: the `path` matcher operates on
//! the `path` part of the `Path` tag, and the `type` matcher operates on the `file_type`, for
//! example.
//!
//! | Matcher | Syntax | Tag |
//! |-------------------------------------------|----------|----------------------------------------------|
//! | [`Tag`](Matcher::Tag) | `tag` | _the presence of a Tag on the event_ |
//! | [`Path`](Matcher::Path) | `path` | [`Path`](Tag::Path) (`path` field) |
//! | [`FileType`](Matcher::FileType) | `type` | [`Path`](Tag::Path) (`file_type` field, when Some) |
//! | [`FileEventKind`](Matcher::FileEventKind) | `kind` or `fek` | [`FileEventKind`](Tag::FileEventKind) |
//! | [`Source`](Matcher::Source) | `source` or `src` | [`Source`](Tag::Source) |
//! | [`Process`](Matcher::Process) | `process` or `pid` | [`Process`](Tag::Process) |
//! | [`Signal`](Matcher::Signal) | `signal` | [`Signal`](Tag::Signal) |
//! | [`ProcessCompletion`](Matcher::ProcessCompletion) | `complete` or `exit` | [`ProcessCompletion`](Tag::ProcessCompletion) |
//! | [`Priority`](Matcher::Priority) | `priority` | special: event [`Priority`] |
//!
//! Filters are checked in order, grouped per tag and per matcher. Filter groups may be checked in
//! any order, but the filters in the groups are checked in add order. Path glob filters are always
//! checked first, for internal reasons.
//!
//! The `negate` boolean field behaves specially: it is not operator negation, but rather the same
//! kind of behaviour that is applied to `!`-prefixed globs in gitignore files: if a negated filter
//! matches the event, the result of the event checking for that matcher is reverted to `true`, even
//! if a previous filter set it to `false`. Unmatched negated filters are ignored.
//!
//! Glob syntax is as supported by the [ignore] crate for Paths, and by [globset] otherwise. (As of
//! writing, the ignore crate uses globset internally). Regex syntax is the default syntax of the
//! [regex] crate.
#![doc(html_favicon_url = "https://watchexec.github.io/logo:watchexec.svg")]
#![doc(html_logo_url = "https://watchexec.github.io/logo:watchexec.svg")]
#![warn(clippy::unwrap_used, missing_docs)]
#![deny(rust_2018_idioms)]
// to make filters
pub use regex::Regex;
pub use error::*;
pub use files::*;
pub use filter::*;
pub use filterer::*;
mod error;
mod files;
mod filter;
mod filterer;
mod parse;
mod swaplock;
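The `[!]{Matcher}{Op}{Value}` grammar documented above can be approximated without nom by scanning for the operator. This sketch ignores quoting and escapes, always takes the first operator occurrence, and `split_filter` is a hypothetical helper, not the crate's parser:

```rust
// Split a textual filter like "path*=**/bar" or "!kind=file" into its
// (negated, matcher, op, value) parts. Two-symbol operators are tried
// before the bare "=" auto operator, mirroring the grammar above.
fn split_filter(input: &str) -> Option<(bool, &str, &str, &str)> {
    let (negated, rest) = match input.strip_prefix('!') {
        Some(rest) => (true, rest),
        None => (false, input),
    };
    for op in ["==", "!=", "~=", "~!", "*=", "*!", ":=", ":!", "="] {
        if let Some(idx) = rest.find(op) {
            let (matcher, tail) = rest.split_at(idx);
            return Some((negated, matcher, op, &tail[op.len()..]));
        }
    }
    None
}

fn main() {
    assert_eq!(split_filter("path*=**/bar"), Some((false, "path", "*=", "**/bar")));
    assert_eq!(split_filter("!kind=file"), Some((true, "kind", "=", "file")));
    assert_eq!(split_filter("no-operator-here"), None);
}
```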


@@ -1,139 +0,0 @@
use std::str::FromStr;
use nom::{
branch::alt,
bytes::complete::{is_not, tag, tag_no_case, take_while1},
character::complete::char,
combinator::{map_res, opt},
sequence::{delimited, tuple},
Finish, IResult,
};
use regex::Regex;
use tracing::trace;
use crate::{Filter, Matcher, Op, Pattern, TaggedFiltererError};
impl FromStr for Filter {
type Err = TaggedFiltererError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
fn matcher(i: &str) -> IResult<&str, Matcher> {
map_res(
alt((
tag_no_case("tag"),
tag_no_case("path"),
tag_no_case("type"),
tag_no_case("kind"),
tag_no_case("fek"),
tag_no_case("source"),
tag_no_case("src"),
tag_no_case("priority"),
tag_no_case("process"),
tag_no_case("pid"),
tag_no_case("signal"),
tag_no_case("sig"),
tag_no_case("complete"),
tag_no_case("exit"),
)),
|m: &str| match m.to_ascii_lowercase().as_str() {
"tag" => Ok(Matcher::Tag),
"path" => Ok(Matcher::Path),
"type" => Ok(Matcher::FileType),
"kind" | "fek" => Ok(Matcher::FileEventKind),
"source" | "src" => Ok(Matcher::Source),
"priority" => Ok(Matcher::Priority),
"process" | "pid" => Ok(Matcher::Process),
"signal" | "sig" => Ok(Matcher::Signal),
"complete" | "exit" => Ok(Matcher::ProcessCompletion),
m => Err(format!("unknown matcher: {m}")),
},
)(i)
}
fn op(i: &str) -> IResult<&str, Op> {
map_res(
alt((
tag("=="),
tag("!="),
tag("~="),
tag("~!"),
tag("*="),
tag("*!"),
tag(":="),
tag(":!"),
tag("="),
)),
|o: &str| match o {
"==" => Ok(Op::Equal),
"!=" => Ok(Op::NotEqual),
"~=" => Ok(Op::Regex),
"~!" => Ok(Op::NotRegex),
"*=" => Ok(Op::Glob),
"*!" => Ok(Op::NotGlob),
":=" => Ok(Op::InSet),
":!" => Ok(Op::NotInSet),
"=" => Ok(Op::Auto),
o => Err(format!("unknown op: `{o}`")),
},
)(i)
}
fn pattern(i: &str) -> IResult<&str, &str> {
alt((
// TODO: escapes
delimited(char('"'), is_not("\""), char('"')),
delimited(char('\''), is_not("'"), char('\'')),
take_while1(|_| true),
))(i)
}
fn filter(i: &str) -> IResult<&str, Filter> {
map_res(
tuple((opt(tag("!")), matcher, op, pattern)),
|(n, m, o, p)| -> Result<_, ()> {
Ok(Filter {
in_path: None,
on: m,
op: match o {
Op::Auto => match m {
Matcher::Path
| Matcher::FileEventKind
| Matcher::ProcessCompletion => Op::Glob,
_ => Op::InSet,
},
o => o,
},
pat: match (o, m) {
// TODO: carry regex/glob errors through
(
Op::Auto,
Matcher::Path | Matcher::FileEventKind | Matcher::ProcessCompletion,
)
| (Op::Glob | Op::NotGlob, _) => Pattern::Glob(p.to_string()),
(Op::Auto | Op::InSet | Op::NotInSet, _) => {
Pattern::Set(p.split(',').map(|s| s.trim().to_string()).collect())
}
(Op::Regex | Op::NotRegex, _) => {
Pattern::Regex(Regex::new(p).map_err(drop)?)
}
(Op::Equal | Op::NotEqual, _) => Pattern::Exact(p.to_string()),
},
negate: n.is_some(),
})
},
)(i)
}
trace!(src=?s, "parsing tagged filter");
filter(s)
.finish()
.map(|(_, f)| {
trace!(src=?s, filter=?f, "parsed tagged filter");
f
})
.map_err(|e| TaggedFiltererError::Parse {
src: s.to_string(),
err: e.code,
})
}
}
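The `Op::Auto` arm above maps the bare `=` operator to glob matching for the `path`, `kind`, and `complete` matchers and to set membership otherwise. A minimal sketch of that mapping, using matcher syntax strings instead of the crate's `Matcher` enum (types here are illustrative):

```rust
// Which concrete operator the "auto" `=` resolves to, per matcher.
#[derive(Debug, PartialEq)]
enum SketchOp {
    Glob,
    InSet,
}

fn auto_op(matcher: &str) -> SketchOp {
    match matcher {
        // Path, FileEventKind, and ProcessCompletion default to glob.
        "path" | "kind" | "fek" | "complete" | "exit" => SketchOp::Glob,
        // Everything else defaults to set membership.
        _ => SketchOp::InSet,
    }
}

fn main() {
    assert_eq!(auto_op("path"), SketchOp::Glob);
    assert_eq!(auto_op("signal"), SketchOp::InSet);
}
```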


@@ -1,58 +0,0 @@
//! A value that is always available, but can be swapped out.
use std::fmt;
use tokio::sync::watch::{channel, error::SendError, Receiver, Ref, Sender};
/// A value that is always available, but can be swapped out.
///
/// This is a wrapper around a [Tokio `watch`][tokio::sync::watch] channel. The value can be
/// read cheaply and without blocking; writes replace the entire value through the channel's
/// sender. Borrows should be held for as short a time as possible, as they hold a read lock.
pub struct SwapLock<T: Clone> {
r: Receiver<T>,
s: Sender<T>,
}
impl<T> SwapLock<T>
where
T: Clone,
{
/// Create a new `SwapLock` with the given value.
pub fn new(inner: T) -> Self {
let (s, r) = channel(inner);
Self { r, s }
}
/// Get a reference to the value.
pub fn borrow(&self) -> Ref<'_, T> {
self.r.borrow()
}
/// Rewrite the value using a closure.
///
/// This obtains a clone of the value, and then calls the closure with a mutable reference to
/// it. Once the closure returns, the value is swapped in.
pub fn change(&self, f: impl FnOnce(&mut T)) -> Result<(), SendError<T>> {
let mut new = { self.r.borrow().clone() };
f(&mut new);
self.s.send(new)
}
/// Replace the value with a new one.
pub fn replace(&self, new: T) -> Result<(), SendError<T>> {
self.s.send(new)
}
}
impl<T> fmt::Debug for SwapLock<T>
where
T: fmt::Debug + Clone,
{
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), fmt::Error> {
f.debug_struct("SwapLock")
.field("(watch)", &self.r)
.finish_non_exhaustive()
}
}
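The clone-mutate-swap pattern `SwapLock::change` implements can be sketched with a std `RwLock` in place of the Tokio watch channel; readers always see either the old or the new value, never a half-applied change. Names here are illustrative:

```rust
use std::sync::RwLock;

// A value that is always readable, replaced wholesale on write.
struct Swap<T: Clone>(RwLock<T>);

impl<T: Clone> Swap<T> {
    fn new(inner: T) -> Self {
        Self(RwLock::new(inner))
    }

    // Clone the current value, let the closure mutate the clone, then
    // swap the whole clone in — the shape of SwapLock::change above.
    fn change(&self, f: impl FnOnce(&mut T)) {
        let mut new = self.0.read().unwrap().clone();
        f(&mut new);
        *self.0.write().unwrap() = new;
    }

    fn get(&self) -> T {
        self.0.read().unwrap().clone()
    }
}

fn main() {
    let swap = Swap::new(vec![1, 2]);
    swap.change(|v| v.push(3));
    assert_eq!(swap.get(), vec![1, 2, 3]);
}
```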


@@ -1,114 +0,0 @@
use watchexec_events::{filekind::*, ProcessEnd, Source};
use watchexec_signals::Signal;
mod helpers;
use helpers::tagged_ff::*;
#[tokio::test]
async fn empty_filter_passes_everything() {
let filterer = filt("", &[], &[file("empty.wef").await]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/test/Cargo.toml");
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("apples/carrots/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples/carrots/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_does_pass("apples/oranges/bananas");
filterer.source_does_pass(Source::Keyboard);
filterer.fek_does_pass(FileEventKind::Create(CreateKind::File));
filterer.pid_does_pass(1234);
filterer.signal_does_pass(Signal::User1);
filterer.complete_does_pass(None);
filterer.complete_does_pass(Some(ProcessEnd::Success));
}
#[tokio::test]
async fn folder() {
let filterer = filt("", &[], &[file("folder.wef").await]).await;
filterer.file_doesnt_pass("apples");
filterer.file_doesnt_pass("apples/oranges/bananas");
filterer.dir_doesnt_pass("apples");
filterer.dir_doesnt_pass("apples/carrots");
filterer.file_doesnt_pass("raw-prunes");
filterer.dir_doesnt_pass("raw-prunes");
filterer.file_doesnt_pass("prunes");
filterer.file_doesnt_pass("prunes/oranges/bananas");
filterer.dir_does_pass("prunes");
filterer.dir_does_pass("prunes/carrots/cauliflowers/oranges");
}
#[tokio::test]
async fn patterns() {
let filterer = filt("", &[], &[file("path-patterns.wef").await]).await;
// Unmatched
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("rat");
filterer.file_does_pass("foo/bar/rat");
filterer.file_does_pass("/foo/bar/rat");
// Cargo.toml
filterer.file_doesnt_pass("Cargo.toml");
filterer.dir_doesnt_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
// package.json
filterer.file_doesnt_pass("package.json");
filterer.dir_doesnt_pass("package.json");
filterer.file_does_pass("package.toml");
// *.gemspec
filterer.file_doesnt_pass("pearl.gemspec");
filterer.dir_doesnt_pass("sapphire.gemspec");
filterer.file_doesnt_pass(".gemspec");
filterer.file_does_pass("diamond.gemspecial");
// test-[^u]+
filterer.file_does_pass("test-unit");
filterer.dir_doesnt_pass("test-integration");
filterer.file_does_pass("tester-helper");
// [.]sw[a-z]$
filterer.file_doesnt_pass("source.swa");
filterer.file_doesnt_pass(".source.swb");
filterer.file_doesnt_pass("sub/source.swc");
filterer.file_does_pass("sub/dir.swa/file");
filterer.file_does_pass("source.sw1");
}
#[tokio::test]
async fn negate() {
let filterer = filt("", &[], &[file("negate.wef").await]).await;
filterer.file_doesnt_pass("yeah");
filterer.file_does_pass("nah");
filterer.file_does_pass("nah.yeah");
}
#[tokio::test]
async fn ignores_and_filters() {
let filterer = filt("", &[file("globs").await.0], &[file("folder.wef").await]).await;
// ignored
filterer.dir_doesnt_pass("test-helper");
// not filtered
filterer.dir_doesnt_pass("tester-helper");
// not ignored && filtered
filterer.dir_does_pass("prunes/tester-helper");
}


@@ -1,349 +0,0 @@
#![allow(dead_code)]
use std::{
path::{Path, PathBuf},
str::FromStr,
sync::Arc,
};
use ignore_files::{IgnoreFile, IgnoreFilter};
use project_origins::ProjectType;
use tokio::fs::canonicalize;
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{
filekind::FileEventKind, Event, FileType, Priority, ProcessEnd, Source, Tag,
};
use watchexec_filterer_ignore::IgnoreFilterer;
use watchexec_filterer_tagged::{Filter, FilterFile, Matcher, Op, Pattern, TaggedFilterer};
use watchexec_signals::Signal;
pub mod tagged {
pub use super::ig_file as file;
pub use super::tagged_filt as filt;
pub use super::Applies;
pub use super::FilterExt;
pub use super::PathHarness;
pub use super::TaggedHarness;
pub use super::{filter, glob_filter, notglob_filter};
pub use watchexec_events::Priority;
}
pub mod tagged_ff {
pub use super::ff_file as file;
pub use super::tagged::*;
pub use super::tagged_fffilt as filt;
}
pub trait PathHarness: Filterer {
fn check_path(
&self,
path: PathBuf,
file_type: Option<FileType>,
) -> std::result::Result<bool, RuntimeError> {
let event = Event {
tags: vec![Tag::Path { path, file_type }],
metadata: Default::default(),
};
self.check_event(&event, Priority::Normal)
}
fn path_pass(&self, path: &str, file_type: Option<FileType>, pass: bool) {
let origin = std::fs::canonicalize(".").unwrap();
let full_path = if let Some(suf) = path.strip_prefix("/test/") {
origin.join(suf)
} else if Path::new(path).has_root() {
path.into()
} else {
origin.join(path)
};
tracing::info!(?path, ?file_type, ?pass, "check");
assert_eq!(
self.check_path(full_path, file_type).unwrap(),
pass,
"{} {:?} (expected {})",
match file_type {
Some(FileType::File) => "file",
Some(FileType::Dir) => "dir",
Some(FileType::Symlink) => "symlink",
Some(FileType::Other) => "other",
None => "path",
},
path,
if pass { "pass" } else { "fail" }
);
}
fn file_does_pass(&self, path: &str) {
self.path_pass(path, Some(FileType::File), true);
}
fn file_doesnt_pass(&self, path: &str) {
self.path_pass(path, Some(FileType::File), false);
}
fn dir_does_pass(&self, path: &str) {
self.path_pass(path, Some(FileType::Dir), true);
}
fn dir_doesnt_pass(&self, path: &str) {
self.path_pass(path, Some(FileType::Dir), false);
}
fn unk_does_pass(&self, path: &str) {
self.path_pass(path, None, true);
}
fn unk_doesnt_pass(&self, path: &str) {
self.path_pass(path, None, false);
}
}
impl PathHarness for TaggedFilterer {}
impl PathHarness for IgnoreFilterer {}
pub trait TaggedHarness {
fn check_tag(&self, tag: Tag, priority: Priority) -> std::result::Result<bool, RuntimeError>;
fn priority_pass(&self, priority: Priority, pass: bool) {
tracing::info!(?priority, ?pass, "check");
assert_eq!(
self.check_tag(Tag::Source(Source::Filesystem), priority)
.unwrap(),
pass,
"{priority:?} (expected {})",
if pass { "pass" } else { "fail" }
);
}
fn priority_does_pass(&self, priority: Priority) {
self.priority_pass(priority, true);
}
fn priority_doesnt_pass(&self, priority: Priority) {
self.priority_pass(priority, false);
}
fn tag_pass(&self, tag: Tag, pass: bool) {
tracing::info!(?tag, ?pass, "check");
assert_eq!(
self.check_tag(tag.clone(), Priority::Normal).unwrap(),
pass,
"{tag:?} (expected {})",
if pass { "pass" } else { "fail" }
);
}
fn fek_does_pass(&self, fek: FileEventKind) {
self.tag_pass(Tag::FileEventKind(fek), true);
}
fn fek_doesnt_pass(&self, fek: FileEventKind) {
self.tag_pass(Tag::FileEventKind(fek), false);
}
fn source_does_pass(&self, source: Source) {
self.tag_pass(Tag::Source(source), true);
}
fn source_doesnt_pass(&self, source: Source) {
self.tag_pass(Tag::Source(source), false);
}
fn pid_does_pass(&self, pid: u32) {
self.tag_pass(Tag::Process(pid), true);
}
fn pid_doesnt_pass(&self, pid: u32) {
self.tag_pass(Tag::Process(pid), false);
}
fn signal_does_pass(&self, sig: Signal) {
self.tag_pass(Tag::Signal(sig), true);
}
fn signal_doesnt_pass(&self, sig: Signal) {
self.tag_pass(Tag::Signal(sig), false);
}
fn complete_does_pass(&self, exit: Option<ProcessEnd>) {
self.tag_pass(Tag::ProcessCompletion(exit), true);
}
fn complete_doesnt_pass(&self, exit: Option<ProcessEnd>) {
self.tag_pass(Tag::ProcessCompletion(exit), false);
}
}
impl TaggedHarness for TaggedFilterer {
fn check_tag(&self, tag: Tag, priority: Priority) -> std::result::Result<bool, RuntimeError> {
let event = Event {
tags: vec![tag],
metadata: Default::default(),
};
self.check_event(&event, priority)
}
}
fn tracing_init() {
use tracing_subscriber::{
fmt::{format::FmtSpan, Subscriber},
util::SubscriberInitExt,
EnvFilter,
};
Subscriber::builder()
.pretty()
.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE)
.with_env_filter(EnvFilter::from_default_env())
.finish()
.try_init()
.ok();
}
pub async fn ignore_filt(origin: &str, ignore_files: &[IgnoreFile]) -> IgnoreFilter {
tracing_init();
let origin = canonicalize(".").await.unwrap().join(origin);
IgnoreFilter::new(origin, ignore_files)
.await
.expect("making filterer")
}
pub async fn tagged_filt(filters: &[Filter]) -> Arc<TaggedFilterer> {
let origin = canonicalize(".").await.unwrap();
tracing_init();
let filterer = TaggedFilterer::new(origin.clone(), origin)
.await
.expect("creating filterer");
filterer.add_filters(filters).await.expect("adding filters");
filterer
}
pub async fn tagged_igfilt(origin: &str, ignore_files: &[IgnoreFile]) -> Arc<TaggedFilterer> {
let origin = canonicalize(".").await.unwrap().join(origin);
tracing_init();
let filterer = TaggedFilterer::new(origin.clone(), origin)
.await
.expect("creating filterer");
for file in ignore_files {
tracing::info!(?file, "loading ignore file");
filterer
.add_ignore_file(file)
.await
.expect("adding ignore file");
}
filterer
}
pub async fn tagged_fffilt(
origin: &str,
ignore_files: &[IgnoreFile],
filter_files: &[FilterFile],
) -> Arc<TaggedFilterer> {
let filterer = tagged_igfilt(origin, ignore_files).await;
let mut filters = Vec::new();
for file in filter_files {
tracing::info!(?file, "loading filter file");
filters.extend(file.load().await.expect("loading filter file"));
}
filterer
.add_filters(&filters)
.await
.expect("adding filters");
filterer
}
pub async fn ig_file(name: &str) -> IgnoreFile {
let path = canonicalize(".")
.await
.unwrap()
.join("tests")
.join("ignores")
.join(name);
IgnoreFile {
path,
applies_in: None,
applies_to: None,
}
}
pub async fn ff_file(name: &str) -> FilterFile {
FilterFile(ig_file(name).await)
}
pub trait Applies {
fn applies_in(self, origin: &str) -> Self;
fn applies_to(self, project_type: ProjectType) -> Self;
}
impl Applies for IgnoreFile {
fn applies_in(mut self, origin: &str) -> Self {
let origin = std::fs::canonicalize(".").unwrap().join(origin);
self.applies_in = Some(origin);
self
}
fn applies_to(mut self, project_type: ProjectType) -> Self {
self.applies_to = Some(project_type);
self
}
}
impl Applies for FilterFile {
fn applies_in(self, origin: &str) -> Self {
Self(self.0.applies_in(origin))
}
fn applies_to(self, project_type: ProjectType) -> Self {
Self(self.0.applies_to(project_type))
}
}
pub fn filter(expr: &str) -> Filter {
Filter::from_str(expr).expect("parse filter")
}
pub fn glob_filter(pat: &str) -> Filter {
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Glob,
pat: Pattern::Glob(pat.into()),
negate: false,
}
}
pub fn notglob_filter(pat: &str) -> Filter {
Filter {
in_path: None,
on: Matcher::Path,
op: Op::NotGlob,
pat: Pattern::Glob(pat.into()),
negate: false,
}
}
pub trait FilterExt {
fn in_path(self) -> Self
where
Self: Sized,
{
self.in_subpath("")
}
fn in_subpath(self, sub: impl AsRef<Path>) -> Self;
}
impl FilterExt for Filter {
fn in_subpath(mut self, sub: impl AsRef<Path>) -> Self {
let origin = std::fs::canonicalize(".").unwrap();
self.in_path = Some(origin.join(sub));
self
}
}


@@ -1,3 +0,0 @@
# comment
# blank line


@@ -1,2 +0,0 @@
type==dir
path*=prunes


@@ -1,11 +0,0 @@
Cargo.toml
package.json
*.gemspec
test-*
*.sw*
sources.*/
/output.*
**/possum
zebra/**
elep/**/hant
song/**/bird/


@@ -1,2 +0,0 @@
path=nah
!path=nah.yeah


@@ -1,5 +0,0 @@
path*!Cargo.toml
path*!package.json
path*!*.gemspec
path~!test-[^u]+
path~![.]sw[a-z]$


@@ -1,453 +0,0 @@
use std::num::{NonZeroI32, NonZeroI64};
use watchexec_events::{filekind::*, ProcessEnd, Source};
use watchexec_filterer_tagged::TaggedFilterer;
use watchexec_signals::Signal;
mod helpers;
use helpers::tagged::*;
#[tokio::test]
async fn empty_filter_passes_everything() {
let filterer = filt(&[]).await;
filterer.source_does_pass(Source::Keyboard);
filterer.fek_does_pass(FileEventKind::Create(CreateKind::File));
filterer.pid_does_pass(1234);
filterer.signal_does_pass(Signal::User1);
filterer.complete_does_pass(None);
filterer.complete_does_pass(Some(ProcessEnd::Success));
}
// Source is used as a relatively simple test case for common text-based ops, so
// these aren't repeated for the other tags, which instead focus on their own
// special characteristics.
#[tokio::test]
async fn source_exact() {
let filterer = filt(&[filter("source==keyboard")]).await;
filterer.source_does_pass(Source::Keyboard);
filterer.source_doesnt_pass(Source::Mouse);
}
#[tokio::test]
async fn source_glob() {
let filterer = filt(&[filter("source*=*i*m*")]).await;
filterer.source_does_pass(Source::Filesystem);
filterer.source_does_pass(Source::Time);
filterer.source_doesnt_pass(Source::Internal);
}
#[tokio::test]
async fn source_regex() {
let filterer = filt(&[filter("source~=(keyboard|mouse)")]).await;
filterer.source_does_pass(Source::Keyboard);
filterer.source_does_pass(Source::Mouse);
filterer.source_doesnt_pass(Source::Internal);
}
#[tokio::test]
async fn source_two_filters() {
let filterer = filt(&[filter("source*=*s*"), filter("source!=mouse")]).await;
filterer.source_doesnt_pass(Source::Mouse);
filterer.source_does_pass(Source::Filesystem);
}
#[tokio::test]
async fn source_allowlisting() {
// allowlisting is vastly easier to achieve with e.g. `source==mouse`
// but this pattern is nonetheless useful for more complex cases.
let filterer = filt(&[filter("source*!*"), filter("!source==mouse")]).await;
filterer.source_does_pass(Source::Mouse);
filterer.source_doesnt_pass(Source::Filesystem);
}
#[tokio::test]
async fn source_set() {
let f = filter("source:=keyboard,mouse");
assert_eq!(f, filter("source=keyboard,mouse"));
let filterer = filt(&[f]).await;
filterer.source_does_pass(Source::Keyboard);
filterer.source_does_pass(Source::Mouse);
filterer.source_doesnt_pass(Source::Internal);
let filterer = filt(&[filter("source:!keyboard,mouse")]).await;
filterer.source_doesnt_pass(Source::Keyboard);
filterer.source_doesnt_pass(Source::Mouse);
filterer.source_does_pass(Source::Internal);
}
#[tokio::test]
async fn fek_glob_level_one() {
let f = filter("kind*=Create(*)");
assert_eq!(f, filter("fek*=Create(*)"));
assert_eq!(f, filter("kind=Create(*)"));
assert_eq!(f, filter("fek=Create(*)"));
let filterer = filt(&[f]).await;
filterer.fek_does_pass(FileEventKind::Create(CreateKind::Any));
filterer.fek_does_pass(FileEventKind::Create(CreateKind::File));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Data(DataChange::Content)));
}
#[tokio::test]
async fn fek_glob_level_two() {
let filterer = filt(&[filter("fek=Modify(Data(*))")]).await;
filterer.fek_does_pass(FileEventKind::Modify(ModifyKind::Data(DataChange::Content)));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Other));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Metadata(
MetadataKind::Permissions,
)));
filterer.fek_doesnt_pass(FileEventKind::Create(CreateKind::Any));
}
#[tokio::test]
async fn fek_level_three() {
fn suite(filterer: &TaggedFilterer) {
filterer.fek_does_pass(FileEventKind::Modify(ModifyKind::Data(DataChange::Content)));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Data(DataChange::Size)));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Other));
filterer.fek_doesnt_pass(FileEventKind::Modify(ModifyKind::Metadata(
MetadataKind::Permissions,
)));
filterer.fek_doesnt_pass(FileEventKind::Create(CreateKind::Any));
}
suite(filt(&[filter("fek=Modify(Data(Content))")]).await.as_ref());
suite(filt(&[filter("fek==Modify(Data(Content))")]).await.as_ref());
}
#[tokio::test]
async fn pid_set_single() {
let f = filter("process:=1234");
assert_eq!(f, filter("pid:=1234"));
assert_eq!(f, filter("process=1234"));
assert_eq!(f, filter("pid=1234"));
let filterer = filt(&[f]).await;
filterer.pid_does_pass(1234);
filterer.pid_doesnt_pass(5678);
filterer.pid_doesnt_pass(12345);
filterer.pid_doesnt_pass(123);
}
#[tokio::test]
async fn pid_set_multiple() {
let filterer = filt(&[filter("pid=123,456")]).await;
filterer.pid_does_pass(123);
filterer.pid_does_pass(456);
filterer.pid_doesnt_pass(123456);
filterer.pid_doesnt_pass(12);
filterer.pid_doesnt_pass(23);
filterer.pid_doesnt_pass(45);
filterer.pid_doesnt_pass(56);
filterer.pid_doesnt_pass(1234);
filterer.pid_doesnt_pass(3456);
filterer.pid_doesnt_pass(4567);
filterer.pid_doesnt_pass(34567);
filterer.pid_doesnt_pass(0);
}
#[tokio::test]
async fn pid_equals() {
let f = filter("process==1234");
assert_eq!(f, filter("pid==1234"));
let filterer = filt(&[f]).await;
filterer.pid_does_pass(1234);
filterer.pid_doesnt_pass(5678);
filterer.pid_doesnt_pass(12345);
filterer.pid_doesnt_pass(123);
}
#[tokio::test]
async fn signal_set_single_without_sig() {
let f = filter("signal=INT");
assert_eq!(f, filter("sig=INT"));
assert_eq!(f, filter("signal:=INT"));
assert_eq!(f, filter("sig:=INT"));
let filterer = filt(&[f]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_single_with_sig() {
let filterer = filt(&[filter("signal:=SIGINT")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_multiple_without_sig() {
let filterer = filt(&[filter("sig:=INT,TERM")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_does_pass(Signal::Terminate);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_multiple_with_sig() {
let filterer = filt(&[filter("signal:=SIGINT,SIGTERM")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_does_pass(Signal::Terminate);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_multiple_mixed_sig() {
let filterer = filt(&[filter("sig:=SIGINT,TERM")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_does_pass(Signal::Terminate);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_equals_without_sig() {
let filterer = filt(&[filter("sig==INT")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_equals_with_sig() {
let filterer = filt(&[filter("signal==SIGINT")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_single_numbers() {
let filterer = filt(&[filter("signal:=2")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_multiple_numbers() {
let filterer = filt(&[filter("sig:=2,15")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_does_pass(Signal::Terminate);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_equals_numbers() {
let filterer = filt(&[filter("sig==2")]).await;
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_doesnt_pass(Signal::Hangup);
}
#[tokio::test]
async fn signal_set_all_mixed() {
let filterer = filt(&[filter("signal:=SIGHUP,INT,15")]).await;
filterer.signal_does_pass(Signal::Hangup);
filterer.signal_does_pass(Signal::Interrupt);
filterer.signal_does_pass(Signal::Terminate);
filterer.signal_doesnt_pass(Signal::User1);
}
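
The three spellings exercised by the signal tests above (`SIGINT`, `INT`, and the raw number `2`) are treated as equivalent by the filter syntax. A minimal self-contained sketch of such a normalizer (hypothetical, not the crate's actual parser; it covers only the signals used in these tests):

```rust
// Resolve a signal spec to its conventional Unix number.
// Accepts "SIGINT", "INT", or "2" interchangeably.
fn parse_signal(s: &str) -> Option<i32> {
    // An optional "SIG" prefix is stripped before matching.
    let name = s.strip_prefix("SIG").unwrap_or(s);
    match name {
        "HUP" => Some(1),
        "INT" => Some(2),
        "TERM" => Some(15),
        // Anything else is tried as a raw signal number.
        _ => name.parse().ok(),
    }
}
```

Under this scheme, `sig:=SIGINT,TERM` and `sig:=2,15` compile to the same set, which is exactly what `signal_set_multiple_mixed_sig` and `signal_set_multiple_numbers` assert.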
#[tokio::test]
async fn complete_empty() {
let f = filter("complete=_");
assert_eq!(f, filter("complete*=_"));
assert_eq!(f, filter("exit=_"));
assert_eq!(f, filter("exit*=_"));
let filterer = filt(&[f]).await;
filterer.complete_does_pass(None);
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
}
#[tokio::test]
async fn complete_any() {
let filterer = filt(&[filter("complete=*")]).await;
filterer.complete_does_pass(Some(ProcessEnd::Success));
filterer.complete_does_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
filterer.complete_does_pass(None);
}
#[tokio::test]
async fn complete_with_success() {
let filterer = filt(&[filter("complete*=success")]).await;
filterer.complete_does_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_continued() {
let filterer = filt(&[filter("complete*=continued")]).await;
filterer.complete_does_pass(Some(ProcessEnd::Continued));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_exit_error() {
let filterer = filt(&[filter("complete*=error(1)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_any_exit_error() {
let filterer = filt(&[filter("complete*=error(*)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(1).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(63).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::ExitError(
NonZeroI64::new(-12823912738).unwrap(),
)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_stop() {
let filterer = filt(&[filter("complete*=stop(19)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(19).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_any_stop() {
let filterer = filt(&[filter("complete*=stop(*)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(1).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(63).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::ExitStop(
NonZeroI32::new(-128239127).unwrap(),
)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_exception() {
let filterer = filt(&[filter("complete*=exception(4B53)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::Exception(NonZeroI32::new(19283).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_any_exception() {
let filterer = filt(&[filter("complete*=exception(*)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::Exception(NonZeroI32::new(1).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::Exception(NonZeroI32::new(63).unwrap())));
filterer.complete_does_pass(Some(ProcessEnd::Exception(
NonZeroI32::new(-128239127).unwrap(),
)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_signal_with_sig() {
let filterer = filt(&[filter("complete*=signal(SIGINT)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Interrupt)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(19).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_signal_without_sig() {
let filterer = filt(&[filter("complete*=signal(INT)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Interrupt)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(19).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_specific_signal_number() {
let filterer = filt(&[filter("complete*=signal(2)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Interrupt)));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(19).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn complete_with_any_signal() {
let filterer = filt(&[filter("complete*=signal(*)")]).await;
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Interrupt)));
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Terminate)));
filterer.complete_does_pass(Some(ProcessEnd::ExitSignal(Signal::Custom(123))));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitStop(NonZeroI32::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::ExitError(NonZeroI64::new(63).unwrap())));
filterer.complete_doesnt_pass(Some(ProcessEnd::Success));
filterer.complete_doesnt_pass(None);
}
#[tokio::test]
async fn priority_auto() {
let filterer = filt(&[filter("priority=normal")]).await;
filterer.priority_doesnt_pass(Priority::Low);
filterer.priority_does_pass(Priority::Normal);
filterer.priority_doesnt_pass(Priority::High);
}
#[tokio::test]
async fn priority_set() {
let filterer = filt(&[filter("priority:=normal,high")]).await;
filterer.priority_doesnt_pass(Priority::Low);
filterer.priority_does_pass(Priority::Normal);
filterer.priority_does_pass(Priority::High);
}
#[tokio::test]
async fn priority_none() {
let filterer = filt(&[]).await;
filterer.priority_does_pass(Priority::Low);
filterer.priority_does_pass(Priority::Normal);
filterer.priority_does_pass(Priority::High);
}

@@ -1,226 +0,0 @@
use std::{collections::HashSet, str::FromStr};
use watchexec_filterer_tagged::{Filter, Matcher, Op, Pattern, Regex, TaggedFiltererError};
mod helpers;
use helpers::tagged::*;
#[test]
fn empty_filter() {
assert!(matches!(
Filter::from_str(""),
Err(TaggedFiltererError::Parse { .. })
));
}
#[test]
fn only_bang() {
assert!(matches!(
Filter::from_str("!"),
Err(TaggedFiltererError::Parse { .. })
));
}
#[test]
fn no_op() {
assert!(matches!(
Filter::from_str("foobar"),
Err(TaggedFiltererError::Parse { .. })
));
}
#[test]
fn path_auto_op() {
assert_eq!(
filter("path=foo"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Glob,
pat: Pattern::Glob("foo".to_string()),
negate: false,
}
);
}
#[test]
fn fek_auto_op() {
assert_eq!(
filter("fek=foo"),
Filter {
in_path: None,
on: Matcher::FileEventKind,
op: Op::Glob,
pat: Pattern::Glob("foo".to_string()),
negate: false,
}
);
}
#[test]
fn other_auto_op() {
assert_eq!(
filter("type=foo"),
Filter {
in_path: None,
on: Matcher::FileType,
op: Op::InSet,
pat: Pattern::Set(HashSet::from(["foo".to_string()])),
negate: false,
}
);
}
#[test]
fn op_equal() {
assert_eq!(
filter("path==foo"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Equal,
pat: Pattern::Exact("foo".to_string()),
negate: false,
}
);
}
#[test]
fn op_not_equal() {
assert_eq!(
filter("path!=foo"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::NotEqual,
pat: Pattern::Exact("foo".to_string()),
negate: false,
}
);
}
#[test]
fn op_regex() {
assert_eq!(
filter("path~=^fo+$"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Regex,
pat: Pattern::Regex(Regex::new("^fo+$").unwrap()),
negate: false,
}
);
}
#[test]
fn op_not_regex() {
assert_eq!(
filter("path~!f(o|al)+"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::NotRegex,
pat: Pattern::Regex(Regex::new("f(o|al)+").unwrap()),
negate: false,
}
);
}
#[test]
fn op_glob() {
assert_eq!(
filter("path*=**/foo"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Glob,
pat: Pattern::Glob("**/foo".to_string()),
negate: false,
}
);
}
#[test]
fn op_not_glob() {
assert_eq!(
filter("path*!foo.*"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::NotGlob,
pat: Pattern::Glob("foo.*".to_string()),
negate: false,
}
);
}
#[test]
fn op_in_set() {
assert_eq!(
filter("path:=foo,bar"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::InSet,
pat: Pattern::Set(HashSet::from(["foo".to_string(), "bar".to_string()])),
negate: false,
}
);
}
#[test]
fn op_not_in_set() {
assert_eq!(
filter("path:!baz,qux"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::NotInSet,
pat: Pattern::Set(HashSet::from(["baz".to_string(), "qux".to_string()])),
negate: false,
}
);
}
#[test]
fn quoted_single() {
assert_eq!(
filter("path='blanche neige'"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Glob,
pat: Pattern::Glob("blanche neige".to_string()),
negate: false,
}
);
}
#[test]
fn quoted_double() {
assert_eq!(
filter("path=\"et les sept nains\""),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Glob,
pat: Pattern::Glob("et les sept nains".to_string()),
negate: false,
}
);
}
#[test]
fn negate() {
assert_eq!(
filter("!path~=^f[om]+$"),
Filter {
in_path: None,
on: Matcher::Path,
op: Op::Regex,
pat: Pattern::Regex(Regex::new("^f[om]+$").unwrap()),
negate: true,
}
);
}

@@ -1,454 +0,0 @@
use std::sync::Arc;
use watchexec_filterer_tagged::TaggedFilterer;
mod helpers;
use helpers::tagged::*;
#[tokio::test]
async fn empty_filter_passes_everything() {
let filterer = filt(&[]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/test/Cargo.toml");
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("apples/carrots/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples/carrots/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_does_pass("apples/oranges/bananas");
}
#[tokio::test]
async fn exact_filename() {
let filterer = filt(&[glob_filter("Cargo.toml")]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("/test/foo/bar/Cargo.toml");
filterer.file_doesnt_pass("Cargo.json");
filterer.file_doesnt_pass("Gemfile.toml");
filterer.file_doesnt_pass("FINAL-FINAL.docx");
filterer.dir_doesnt_pass("/a/folder");
filterer.dir_does_pass("/test/Cargo.toml");
}
#[tokio::test]
async fn exact_filenames_multiple() {
let filterer = filt(&[glob_filter("Cargo.toml"), glob_filter("package.json")]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("/test/foo/bar/Cargo.toml");
filterer.file_does_pass("package.json");
filterer.file_does_pass("/test/foo/bar/package.json");
filterer.file_doesnt_pass("Cargo.json");
filterer.file_doesnt_pass("package.toml");
filterer.file_doesnt_pass("Gemfile.toml");
filterer.file_doesnt_pass("FINAL-FINAL.docx");
filterer.dir_doesnt_pass("/a/folder");
filterer.dir_does_pass("/test/Cargo.toml");
filterer.dir_does_pass("/test/package.json");
}
#[tokio::test]
async fn glob_single_final_ext_star() {
let filterer = filt(&[glob_filter("Cargo.*")]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_doesnt_pass("Gemfile.toml");
filterer.file_doesnt_pass("FINAL-FINAL.docx");
filterer.dir_doesnt_pass("/a/folder");
filterer.dir_does_pass("Cargo.toml");
}
#[tokio::test]
async fn glob_star_trailing_slash() {
let filterer = filt(&[glob_filter("Cargo.*/")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("Cargo.json");
filterer.file_doesnt_pass("Gemfile.toml");
filterer.file_doesnt_pass("FINAL-FINAL.docx");
filterer.dir_doesnt_pass("/a/folder");
filterer.dir_does_pass("Cargo.toml");
filterer.unk_doesnt_pass("Cargo.toml");
}
#[tokio::test]
async fn glob_star_leading_slash() {
let filterer = filt(&[glob_filter("/Cargo.*")]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.dir_does_pass("Cargo.toml");
filterer.unk_does_pass("Cargo.toml");
filterer.file_doesnt_pass("foo/Cargo.toml");
filterer.dir_doesnt_pass("foo/Cargo.toml");
}
#[tokio::test]
async fn glob_leading_double_star() {
let filterer = filt(&[glob_filter("**/possum")]).await;
filterer.file_does_pass("possum");
filterer.file_does_pass("foo/bar/possum");
filterer.file_does_pass("/foo/bar/possum");
filterer.dir_does_pass("possum");
filterer.dir_does_pass("foo/bar/possum");
filterer.dir_does_pass("/foo/bar/possum");
filterer.file_doesnt_pass("rat");
filterer.file_doesnt_pass("foo/bar/rat");
filterer.file_doesnt_pass("/foo/bar/rat");
}
#[tokio::test]
async fn glob_trailing_double_star() {
let filterer = filt(&[glob_filter("possum/**")]).await;
filterer.file_doesnt_pass("possum");
filterer.file_does_pass("possum/foo/bar");
filterer.file_doesnt_pass("/possum/foo/bar");
filterer.file_does_pass("/test/possum/foo/bar");
filterer.dir_doesnt_pass("possum");
filterer.dir_doesnt_pass("foo/bar/possum");
filterer.dir_doesnt_pass("/foo/bar/possum");
filterer.dir_does_pass("possum/foo/bar");
filterer.dir_doesnt_pass("/possum/foo/bar");
filterer.dir_does_pass("/test/possum/foo/bar");
filterer.file_doesnt_pass("rat");
filterer.file_doesnt_pass("foo/bar/rat");
filterer.file_doesnt_pass("/foo/bar/rat");
}
#[tokio::test]
async fn glob_middle_double_star() {
let filterer = filt(&[glob_filter("apples/**/oranges")]).await;
filterer.dir_doesnt_pass("/a/folder");
filterer.file_does_pass("apples/carrots/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_does_pass("apples/carrots/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
// different from globset/v1 behaviour, but correct:
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples/oranges/bananas");
}
#[tokio::test]
async fn glob_double_star_trailing_slash() {
let filterer = filt(&[glob_filter("apples/**/oranges/")]).await;
filterer.dir_doesnt_pass("/a/folder");
filterer.file_doesnt_pass("apples/carrots/oranges");
filterer.file_doesnt_pass("apples/carrots/cauliflowers/oranges");
filterer.file_doesnt_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_does_pass("apples/carrots/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.unk_doesnt_pass("apples/carrots/oranges");
filterer.unk_doesnt_pass("apples/carrots/cauliflowers/oranges");
filterer.unk_doesnt_pass("apples/carrots/cauliflowers/artichokes/oranges");
// different from globset/v1 behaviour, but correct:
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples/oranges/bananas");
}
#[tokio::test]
async fn ignore_exact_filename() {
let filterer = filt(&[notglob_filter("Cargo.toml")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("/test/foo/bar/Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.dir_doesnt_pass("/test/Cargo.toml");
}
#[tokio::test]
async fn ignore_exact_filenames_multiple() {
let filterer = filt(&[notglob_filter("Cargo.toml"), notglob_filter("package.json")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("/test/foo/bar/Cargo.toml");
filterer.file_doesnt_pass("package.json");
filterer.file_doesnt_pass("/test/foo/bar/package.json");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("package.toml");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.dir_doesnt_pass("/test/Cargo.toml");
filterer.dir_doesnt_pass("/test/package.json");
}
#[tokio::test]
async fn ignore_glob_single_final_ext_star() {
let filterer = filt(&[notglob_filter("Cargo.*")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.dir_doesnt_pass("Cargo.toml");
}
#[tokio::test]
async fn ignore_glob_star_trailing_slash() {
let filterer = filt(&[notglob_filter("Cargo.*/")]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.dir_doesnt_pass("Cargo.toml");
filterer.unk_does_pass("Cargo.toml");
}
#[tokio::test]
async fn ignore_glob_star_leading_slash() {
let filterer = filt(&[notglob_filter("/Cargo.*")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("Cargo.json");
filterer.dir_doesnt_pass("Cargo.toml");
filterer.unk_doesnt_pass("Cargo.toml");
filterer.file_does_pass("foo/Cargo.toml");
filterer.dir_does_pass("foo/Cargo.toml");
}
#[tokio::test]
async fn ignore_glob_leading_double_star() {
let filterer = filt(&[notglob_filter("**/possum")]).await;
filterer.file_doesnt_pass("possum");
filterer.file_doesnt_pass("foo/bar/possum");
filterer.file_doesnt_pass("/foo/bar/possum");
filterer.dir_doesnt_pass("possum");
filterer.dir_doesnt_pass("foo/bar/possum");
filterer.dir_doesnt_pass("/foo/bar/possum");
filterer.file_does_pass("rat");
filterer.file_does_pass("foo/bar/rat");
filterer.file_does_pass("/foo/bar/rat");
}
#[tokio::test]
async fn ignore_glob_trailing_double_star() {
let filterer = filt(&[notglob_filter("possum/**")]).await;
filterer.file_does_pass("possum");
filterer.file_doesnt_pass("possum/foo/bar");
filterer.file_does_pass("/possum/foo/bar");
filterer.file_doesnt_pass("/test/possum/foo/bar");
filterer.dir_does_pass("possum");
filterer.dir_does_pass("foo/bar/possum");
filterer.dir_does_pass("/foo/bar/possum");
filterer.dir_doesnt_pass("possum/foo/bar");
filterer.dir_does_pass("/possum/foo/bar");
filterer.dir_doesnt_pass("/test/possum/foo/bar");
filterer.file_does_pass("rat");
filterer.file_does_pass("foo/bar/rat");
filterer.file_does_pass("/foo/bar/rat");
}
#[tokio::test]
async fn ignore_glob_middle_double_star() {
let filterer = filt(&[notglob_filter("apples/**/oranges")]).await;
filterer.dir_does_pass("/a/folder");
filterer.file_doesnt_pass("apples/carrots/oranges");
filterer.file_doesnt_pass("apples/carrots/cauliflowers/oranges");
filterer.file_doesnt_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_doesnt_pass("apples/carrots/oranges");
filterer.dir_doesnt_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_doesnt_pass("apples/carrots/cauliflowers/artichokes/oranges");
// different from globset/v1 behaviour, but correct:
filterer.file_doesnt_pass("apples/oranges/bananas");
filterer.dir_doesnt_pass("apples/oranges/bananas");
}
#[tokio::test]
async fn ignore_glob_double_star_trailing_slash() {
let filterer = filt(&[notglob_filter("apples/**/oranges/")]).await;
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("apples/carrots/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_doesnt_pass("apples/carrots/oranges");
filterer.dir_doesnt_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_doesnt_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.unk_does_pass("apples/carrots/oranges");
filterer.unk_does_pass("apples/carrots/cauliflowers/oranges");
filterer.unk_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
// different from globset/v1 behaviour, but correct:
filterer.file_doesnt_pass("apples/oranges/bananas");
filterer.dir_doesnt_pass("apples/oranges/bananas");
}
#[tokio::test]
async fn ignores_take_precedence() {
let filterer = filt(&[
glob_filter("*.docx"),
glob_filter("*.toml"),
glob_filter("*.json"),
notglob_filter("*.toml"),
notglob_filter("*.json"),
])
.await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("/test/foo/bar/Cargo.toml");
filterer.file_doesnt_pass("package.json");
filterer.file_doesnt_pass("/test/foo/bar/package.json");
filterer.dir_doesnt_pass("/test/Cargo.toml");
filterer.dir_doesnt_pass("/test/package.json");
filterer.file_does_pass("FINAL-FINAL.docx");
}
#[tokio::test]
async fn scopes_global() {
let filterer = filt(&[notglob_filter("*.toml")]).await;
filterer.file_doesnt_pass("Cargo.toml");
filterer.dir_doesnt_pass("Cargo.toml");
filterer.file_doesnt_pass("/outside/Cargo.toml");
filterer.dir_doesnt_pass("/outside/Cargo.toml");
filterer.file_does_pass("/outside/package.json");
filterer.dir_does_pass("/outside/package.json");
filterer.file_does_pass("package.json");
filterer.file_does_pass("FINAL-FINAL.docx");
}
#[tokio::test]
async fn scopes_local() {
let filterer = filt(&[notglob_filter("*.toml").in_path()]).await;
filterer.file_doesnt_pass("/test/Cargo.toml");
filterer.dir_doesnt_pass("/test/Cargo.toml");
filterer.file_does_pass("/outside/Cargo.toml");
filterer.dir_does_pass("/outside/Cargo.toml");
filterer.file_does_pass("/outside/package.json");
filterer.dir_does_pass("/outside/package.json");
filterer.file_does_pass("package.json");
filterer.file_does_pass("FINAL-FINAL.docx");
}
#[tokio::test]
async fn scopes_sublocal() {
let filterer = filt(&[notglob_filter("*.toml").in_subpath("src")]).await;
filterer.file_doesnt_pass("/test/src/Cargo.toml");
filterer.dir_doesnt_pass("/test/src/Cargo.toml");
filterer.file_does_pass("/test/Cargo.toml");
filterer.dir_does_pass("/test/Cargo.toml");
filterer.file_does_pass("/test/tests/Cargo.toml");
filterer.dir_does_pass("/test/tests/Cargo.toml");
filterer.file_does_pass("/outside/Cargo.toml");
filterer.dir_does_pass("/outside/Cargo.toml");
filterer.file_does_pass("/outside/package.json");
filterer.dir_does_pass("/outside/package.json");
filterer.file_does_pass("package.json");
filterer.file_does_pass("FINAL-FINAL.docx");
}
// The following tests check that the "buggy"/"confusing" watchexec v1 behaviour
// is no longer present.
fn watchexec_v1_confusing_suite(filterer: Arc<TaggedFilterer>) {
filterer.file_does_pass("apples");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("raw-prunes");
filterer.dir_does_pass("raw-prunes");
filterer.file_does_pass("raw-prunes/carrots/cauliflowers/oranges");
filterer.file_does_pass("raw-prunes/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("raw-prunes/oranges/bananas");
filterer.dir_does_pass("raw-prunes/carrots/cauliflowers/oranges");
filterer.dir_does_pass("raw-prunes/carrots/cauliflowers/artichokes/oranges");
filterer.dir_doesnt_pass("prunes/carrots/cauliflowers/oranges");
filterer.dir_doesnt_pass("prunes/carrots/cauliflowers/artichokes/oranges");
filterer.file_doesnt_pass("prunes/carrots/cauliflowers/oranges");
filterer.file_doesnt_pass("prunes/carrots/cauliflowers/artichokes/oranges");
filterer.file_doesnt_pass("prunes/oranges/bananas");
}
#[tokio::test]
async fn ignore_folder_with_bare_match() {
let filterer = filt(&[notglob_filter("prunes").in_path()]).await;
filterer.file_doesnt_pass("prunes");
filterer.dir_doesnt_pass("prunes");
watchexec_v1_confusing_suite(filterer);
}
#[tokio::test]
async fn ignore_folder_with_bare_and_leading_slash() {
let filterer = filt(&[notglob_filter("/prunes").in_path()]).await;
filterer.file_doesnt_pass("prunes");
filterer.dir_doesnt_pass("prunes");
watchexec_v1_confusing_suite(filterer);
}
#[tokio::test]
async fn ignore_folder_with_bare_and_trailing_slash() {
let filterer = filt(&[notglob_filter("prunes/").in_path()]).await;
filterer.file_does_pass("prunes");
filterer.dir_doesnt_pass("prunes");
watchexec_v1_confusing_suite(filterer);
}
#[tokio::test]
async fn ignore_folder_with_only_double_double_glob() {
let filterer = filt(&[notglob_filter("**/prunes/**").in_path()]).await;
filterer.file_does_pass("prunes");
filterer.dir_does_pass("prunes");
watchexec_v1_confusing_suite(filterer);
}
#[tokio::test]
async fn ignore_folder_with_double_and_double_double_globs() {
let filterer = filt(&[
notglob_filter("**/prunes").in_path(),
notglob_filter("**/prunes/**").in_path(),
])
.await;
filterer.file_doesnt_pass("prunes");
filterer.dir_doesnt_pass("prunes");
watchexec_v1_confusing_suite(filterer);
}

@@ -2,6 +2,12 @@
## Next (YYYY-MM-DD)
## v3.0.1 (2024-04-28)
- Hide fmt::Debug spew from ignore crate, use `full_debug` feature to restore.
## v3.0.0 (2024-04-20)
- Deps: gix-config 0.36
- Deps: miette 7

@@ -1,6 +1,6 @@
[package]
name = "ignore-files"
version = "2.1.0"
version = "3.0.1"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
@@ -35,8 +35,14 @@ features = [
]
[dependencies.project-origins]
version = "1.3.0"
version = "1.4.0"
path = "../project-origins"
[dev-dependencies]
tracing-subscriber = "0.3.6"
[features]
default = []
## Don't hide ignore::gitignore::Gitignore Debug impl
full_debug = []

@@ -1,3 +1,4 @@
use std::fmt;
use std::path::{Path, PathBuf};
use futures::stream::{FuturesUnordered, StreamExt};
@@ -11,12 +12,23 @@ use tracing::{trace, trace_span};
use crate::{simplify_path, Error, IgnoreFile};
#[derive(Clone, Debug)]
#[derive(Clone)]
#[cfg_attr(feature = "full_debug", derive(Debug))]
struct Ignore {
gitignore: Gitignore,
builder: Option<GitignoreBuilder>,
}
#[cfg(not(feature = "full_debug"))]
impl fmt::Debug for Ignore {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("Ignore")
.field("gitignore", &"ignore::gitignore::Gitignore{...}")
.field("builder", &"ignore::gitignore::GitignoreBuilder{...}")
.finish()
}
}
/// A mutable filter dedicated to ignore files and trees of ignore files.
///
/// This reads and compiles ignore files, and should be used for handling ignore files. It's created
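
The hand-written `Debug` impl in the hunk above is a common pattern for silencing verbose third-party types: print a short placeholder for the noisy field instead of recursing into it. A self-contained sketch of the same pattern (the types here are stand-ins, not the real `ignore` crate types):

```rust
use std::fmt;

// Stand-in for a noisy third-party type whose full Debug output
// would flood the logs.
struct Gitignore {
    globs: Vec<String>,
}

struct Ignore {
    gitignore: Gitignore,
}

// Manual Debug: emit a fixed placeholder for the noisy field.
impl fmt::Debug for Ignore {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.debug_struct("Ignore")
            .field("gitignore", &"Gitignore{...}")
            .finish()
    }
}
```

Gating the derived impl behind a `full_debug` feature, as the diff does with `#[cfg_attr(feature = "full_debug", derive(Debug))]`, lets users opt back into the verbose output when they need it.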

@@ -2,6 +2,14 @@
## Next (YYYY-MM-DD)
## v4.1.0 (2024-04-28)
- Feature: non-recursive watches with `WatchedPath::non_recursive()`
- Fix: `config.pathset()` now preserves `WatchedPath` attributes
- Refactor: move `WatchedPath` to the root of the crate (old path remains as re-export for now)
## v4.0.0 (2024-04-20)
- Deps: replace command-group with process-wrap (in supervisor, but has flow-on effects)
- Deps: miette 7
- Deps: nix 0.28

@@ -1,6 +1,6 @@
[package]
name = "watchexec"
version = "3.0.1"
version = "4.1.0"
authors = ["Félix Saparelli <felix@passcod.name>", "Matt Green <mattgreenrocks@gmail.com>"]
license = "Apache-2.0"
@@ -39,15 +39,15 @@ version = "3.0.0"
path = "../signals"
[dependencies.watchexec-supervisor]
version = "1.0.3"
version = "2.0.0"
path = "../supervisor"
[dependencies.ignore-files]
version = "2.1.0"
version = "3.0.1"
path = "../ignore-files"
[dependencies.project-origins]
version = "1.3.0"
version = "1.4.0"
path = "../project-origins"
[dependencies.tokio]

@@ -1,6 +1,6 @@
pre-release-commit-message = "release: lib v{{version}}"
tag-prefix = "lib-"
tag-message = "watchexec-lib {{version}}"
tag-prefix = "watchexec-"
tag-message = "watchexec {{version}}"
[[pre-release-replacements]]
file = "CHANGELOG.md"

@@ -1,6 +1,6 @@
//! Configuration and builders for [`crate::Watchexec`].
use std::{future::Future, path::Path, pin::pin, sync::Arc, time::Duration};
use std::{future::Future, pin::pin, sync::Arc, time::Duration};
use tokio::sync::Notify;
use tracing::{debug, trace};
@@ -195,9 +195,9 @@ impl Config {
pub fn pathset<I, P>(&self, pathset: I) -> &Self
where
I: IntoIterator<Item = P>,
P: AsRef<Path>,
P: Into<WatchedPath>,
{
let pathset = pathset.into_iter().map(|p| p.as_ref().into()).collect();
let pathset = pathset.into_iter().map(|p| p.into()).collect();
debug!(?pathset, "Config: pathset");
self.pathset.replace(pathset);
self.signal_change()
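
The hunk above widens `pathset` from `P: AsRef<Path>` to `P: Into<WatchedPath>`, so callers can keep passing plain paths while also being able to pass fully-configured `WatchedPath`s directly. A self-contained sketch of this API-widening pattern (names mirror the diff, but this is illustrative, not the crate's code):

```rust
use std::path::{Path, PathBuf};

// Simplified stand-in for the crate's WatchedPath.
#[derive(Debug, Clone, PartialEq)]
struct WatchedPath(PathBuf);

// The From impls (as in the removed fs-mod code) are what make
// the Into<WatchedPath> bound ergonomic for callers.
impl From<&str> for WatchedPath {
    fn from(p: &str) -> Self {
        Self(p.into())
    }
}
impl From<PathBuf> for WatchedPath {
    fn from(p: PathBuf) -> Self {
        Self(p)
    }
}
impl From<&Path> for WatchedPath {
    fn from(p: &Path) -> Self {
        Self(p.into())
    }
}

// Accepts &str, PathBuf, &Path, or WatchedPath values alike.
fn pathset<I, P>(set: I) -> Vec<WatchedPath>
where
    I: IntoIterator<Item = P>,
    P: Into<WatchedPath>,
{
    set.into_iter().map(Into::into).collect()
}
```

Because `Into<WatchedPath>` is satisfied by anything with a matching `From` impl, the old `AsRef<Path>` call sites keep compiling while `WatchedPath` attributes (such as non-recursive mode) survive the round trip, which is the fix the v4.1.0 changelog entry describes.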

@@ -68,12 +68,14 @@ pub mod config;
mod id;
mod late_join_set;
mod watched_path;
mod watchexec;
#[doc(inline)]
pub use crate::{
id::Id,
watchexec::{ErrorHook, Watchexec},
watched_path::WatchedPath,
};
#[doc(no_inline)]

@@ -4,7 +4,6 @@ use std::{
collections::{HashMap, HashSet},
fs::metadata,
mem::take,
path::{Path, PathBuf},
sync::Arc,
time::Duration,
};
@@ -20,6 +19,9 @@ use crate::{
Config,
};
// re-export for compatibility, until next major version
pub use crate::WatchedPath;
/// What kind of filesystem watcher to use.
///
/// For now only native and poll watchers are supported. In the future there may be additional
@@ -72,42 +74,6 @@ impl Watcher {
}
}
/// A path to watch.
///
/// This is currently only a wrapper around a [`PathBuf`], but may be augmented in the future.
#[derive(Clone, Debug, Default, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct WatchedPath(PathBuf);
impl From<PathBuf> for WatchedPath {
fn from(path: PathBuf) -> Self {
Self(path)
}
}
impl From<&str> for WatchedPath {
fn from(path: &str) -> Self {
Self(path.into())
}
}
impl From<&Path> for WatchedPath {
fn from(path: &Path) -> Self {
Self(path.into())
}
}
impl From<WatchedPath> for PathBuf {
fn from(path: WatchedPath) -> Self {
path.0
}
}
impl AsRef<Path> for WatchedPath {
fn as_ref(&self) -> &Path {
self.0.as_ref()
}
}
/// Launch the filesystem event worker.
///
/// While you can run several, you should only have one.
@ -190,6 +156,7 @@ pub async fn worker(
// now let's calculate which paths we should add to the watch, and which we should drop:
let config_pathset = config.pathset.get();
tracing::info!(?config_pathset, "obtaining pathset");
let (to_watch, to_drop) = if pathset.is_empty() {
// if the current pathset is empty, we can take a shortcut
(config_pathset, Vec::new())
@ -222,7 +189,7 @@ pub async fn worker(
for path in to_drop {
trace!(?path, "removing path from the watcher");
if let Err(err) = watcher.unwatch(path.as_ref()) {
if let Err(err) = watcher.unwatch(path.path.as_ref()) {
error!(?err, "notify unwatch() error");
for e in notify_multi_path_errors(watcher_type, path, err, true) {
errors.send(e).await?;
@ -234,13 +201,18 @@ pub async fn worker(
for path in to_watch {
trace!(?path, "adding path to the watcher");
if let Err(err) = watcher.watch(path.as_ref(), notify::RecursiveMode::Recursive) {
if let Err(err) = watcher.watch(
path.path.as_ref(),
if path.recursive {
notify::RecursiveMode::Recursive
} else {
notify::RecursiveMode::NonRecursive
},
) {
error!(?err, "notify watch() error");
for e in notify_multi_path_errors(watcher_type, path, err, false) {
errors.send(e).await?;
}
// TODO: unwatch and re-watch manually while ignoring all the erroring paths
// See https://github.com/watchexec/watchexec/issues/218
} else {
pathset.insert(path);
}
@ -250,13 +222,13 @@ pub async fn worker(
fn notify_multi_path_errors(
kind: Watcher,
path: WatchedPath,
watched_path: WatchedPath,
mut err: notify::Error,
rm: bool,
) -> Vec<RuntimeError> {
let mut paths = take(&mut err.paths);
if paths.is_empty() {
paths.push(path.into());
paths.push(watched_path.into());
}
let generic = err.to_string();

View File

@ -0,0 +1,82 @@
use std::path::{Path, PathBuf};
/// A path to watch.
///
/// Can be a recursive or non-recursive watch.
#[derive(Clone, Debug, Default, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct WatchedPath {
pub(crate) path: PathBuf,
pub(crate) recursive: bool,
}
impl From<PathBuf> for WatchedPath {
fn from(path: PathBuf) -> Self {
Self {
path,
recursive: true,
}
}
}
impl From<&str> for WatchedPath {
fn from(path: &str) -> Self {
Self {
path: path.into(),
recursive: true,
}
}
}
impl From<String> for WatchedPath {
fn from(path: String) -> Self {
Self {
path: path.into(),
recursive: true,
}
}
}
impl From<&Path> for WatchedPath {
fn from(path: &Path) -> Self {
Self {
path: path.into(),
recursive: true,
}
}
}
impl From<WatchedPath> for PathBuf {
fn from(path: WatchedPath) -> Self {
path.path
}
}
impl From<&WatchedPath> for PathBuf {
fn from(path: &WatchedPath) -> Self {
path.path.clone()
}
}
impl AsRef<Path> for WatchedPath {
fn as_ref(&self) -> &Path {
self.path.as_ref()
}
}
impl WatchedPath {
/// Create a new watched path, recursively descending into subdirectories.
pub fn recursive(path: impl Into<PathBuf>) -> Self {
Self {
path: path.into(),
recursive: true,
}
}
/// Create a new watched path, not descending into subdirectories.
pub fn non_recursive(path: impl Into<PathBuf>) -> Self {
Self {
path: path.into(),
recursive: false,
}
}
}
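For illustration, here is how the new API behaves. The type below is a trimmed local copy of the `WatchedPath` shown above (the real one lives in this crate), so the snippet stands alone:

```rust
use std::path::{Path, PathBuf};

// Trimmed local copy of the crate's `WatchedPath`, enough to show the API.
#[derive(Clone, Debug, Default, PartialEq, Eq, PartialOrd, Ord, Hash)]
pub struct WatchedPath {
    path: PathBuf,
    recursive: bool,
}

impl From<&str> for WatchedPath {
    fn from(path: &str) -> Self {
        Self { path: path.into(), recursive: true }
    }
}

impl AsRef<Path> for WatchedPath {
    fn as_ref(&self) -> &Path {
        self.path.as_ref()
    }
}

impl WatchedPath {
    /// Recursive watch (the default for all `From` conversions).
    pub fn recursive(path: impl Into<PathBuf>) -> Self {
        Self { path: path.into(), recursive: true }
    }

    /// Non-recursive watch: only reachable through this constructor.
    pub fn non_recursive(path: impl Into<PathBuf>) -> Self {
        Self { path: path.into(), recursive: false }
    }
}

fn main() {
    // Plain conversions keep the old behaviour: recursive.
    let implicit = WatchedPath::from("src");
    assert_eq!(implicit, WatchedPath::recursive("src"));

    // Opting out of recursion is explicit, and the two values differ.
    let flat = WatchedPath::non_recursive("src");
    assert_ne!(implicit, flat);
    assert_eq!(flat.as_ref(), Path::new("src"));
}
```

Because every `From` impl sets `recursive: true`, existing callers that pass `PathBuf`s or strings keep their old recursive behaviour unchanged.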

View File

@ -2,6 +2,10 @@
## Next (YYYY-MM-DD)
## v1.4.0 (2024-04-28)
- Add out-of-tree Git repositories (`.git` file instead of folder).
## v1.3.0 (2024-01-01)
- Remove `README.md` files from detection; those were causing too many false positives and were a weak signal anyway.

View File

@ -1,6 +1,6 @@
[package]
name = "project-origins"
version = "1.3.0"
version = "1.4.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"

View File

@ -49,7 +49,7 @@ pub enum ProjectType {
/// VCS: [Git](https://git-scm.com/).
///
/// Detects when a `.git` folder is present, or any of the files `.gitattributes` or
/// Detects when a `.git` file or folder is present, or any of the files `.gitattributes` or
/// `.gitmodules`. Does _not_ check or return from the presence of `.gitignore` files, as Git
/// supports nested ignores, and that would result in false-positives.
Git,
@ -208,6 +208,7 @@ pub async fn origins(path: impl AsRef<Path> + Send) -> HashSet<PathBuf> {
list.has_file(".codecov.yml"),
list.has_file(".ctags"),
list.has_file(".editorconfig"),
list.has_file(".git"),
list.has_file(".gitattributes"),
list.has_file(".gitmodules"),
list.has_file(".hgignore"),
@ -293,6 +294,7 @@ pub async fn types(path: impl AsRef<Path> + Send) -> HashSet<ProjectType> {
list.if_has_dir(".svn", ProjectType::Subversion),
list.if_has_file(".bzrignore", ProjectType::Bazaar),
list.if_has_file(".ctags", ProjectType::C),
list.if_has_file(".git", ProjectType::Git),
list.if_has_file(".gitattributes", ProjectType::Git),
list.if_has_file(".gitmodules", ProjectType::Git),
list.if_has_file(".hgignore", ProjectType::Mercurial),
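As context for the two new `.git` *file* checks: with worktrees and out-of-tree repositories, `.git` is a plain file containing a `gitdir:` pointer rather than a directory, so detection has to accept both forms. A self-contained sketch of that idea (the temporary directory name and helper are illustrative, not the crate's API):

```rust
use std::fs;
use std::path::Path;

// A directory looks like a Git root if `.git` exists there, whether as a
// folder (regular repository) or as a file (worktree / out-of-tree
// repository holding a `gitdir: <path>` pointer).
fn looks_like_git_root(dir: &Path) -> bool {
    dir.join(".git").exists()
}

fn main() {
    let tmp = std::env::temp_dir().join(format!("git-detect-{}", std::process::id()));
    let _ = fs::remove_dir_all(&tmp);
    fs::create_dir_all(&tmp).unwrap();
    assert!(!looks_like_git_root(&tmp));

    // An out-of-tree checkout marks itself with a `.git` file:
    fs::write(tmp.join(".git"), "gitdir: /somewhere/else.git\n").unwrap();
    assert!(looks_like_git_root(&tmp));

    let _ = fs::remove_dir_all(&tmp);
}
```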

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: signals v{{version}}"
tag-prefix = "signals-"
tag-prefix = "watchexec-signals-"
tag-message = "watchexec-signals {{version}}"
[[pre-release-replacements]]

View File

@ -2,6 +2,8 @@
## Next (YYYY-MM-DD)
## v2.0.0 (2024-04-20)
- Deps: replace command-group with process-wrap
- Deps: nix 0.28

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-supervisor"
version = "1.0.3"
version = "2.0.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0 OR MIT"

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: supervisor v{{version}}"
tag-prefix = "supervisor-"
tag-prefix = "watchexec-supervisor-"
tag-message = "watchexec-supervisor {{version}}"
[[pre-release-replacements]]

View File

@ -1,10 +1,10 @@
.ie \n(.g .ds Aq \(aq
.el .ds Aq '
.TH watchexec 1 "watchexec 1.25.1"
.TH watchexec 1 "watchexec 2.1.1"
.SH NAME
watchexec \- Execute commands when watched files change
.SH SYNOPSIS
\fBwatchexec\fR [\fB\-w\fR|\fB\-\-watch\fR] [\fB\-c\fR|\fB\-\-clear\fR] [\fB\-o\fR|\fB\-\-on\-busy\-update\fR] [\fB\-r\fR|\fB\-\-restart\fR] [\fB\-s\fR|\fB\-\-signal\fR] [\fB\-\-stop\-signal\fR] [\fB\-\-stop\-timeout\fR] [\fB\-\-map\-signal\fR] [\fB\-d\fR|\fB\-\-debounce\fR] [\fB\-\-stdin\-quit\fR] [\fB\-\-no\-vcs\-ignore\fR] [\fB\-\-no\-project\-ignore\fR] [\fB\-\-no\-global\-ignore\fR] [\fB\-\-no\-default\-ignore\fR] [\fB\-\-no\-discover\-ignore\fR] [\fB\-\-ignore\-nothing\fR] [\fB\-p\fR|\fB\-\-postpone\fR] [\fB\-\-delay\-run\fR] [\fB\-\-poll\fR] [\fB\-\-shell\fR] [\fB\-n \fR] [\fB\-\-emit\-events\-to\fR] [\fB\-\-only\-emit\-events\fR] [\fB\-E\fR|\fB\-\-env\fR] [\fB\-\-no\-process\-group\fR] [\fB\-N\fR|\fB\-\-notify\fR] [\fB\-\-color\fR] [\fB\-\-timings\fR] [\fB\-q\fR|\fB\-\-quiet\fR] [\fB\-\-bell\fR] [\fB\-\-project\-origin\fR] [\fB\-\-workdir\fR] [\fB\-e\fR|\fB\-\-exts\fR] [\fB\-f\fR|\fB\-\-filter\fR] [\fB\-\-filter\-file\fR] [\fB\-j\fR|\fB\-\-filter\-prog\fR] [\fB\-i\fR|\fB\-\-ignore\fR] [\fB\-\-ignore\-file\fR] [\fB\-\-fs\-events\fR] [\fB\-\-no\-meta\fR] [\fB\-\-print\-events\fR] [\fB\-v\fR|\fB\-\-verbose\fR]... [\fB\-\-log\-file\fR] [\fB\-\-manual\fR] [\fB\-\-completions\fR] [\fB\-h\fR|\fB\-\-help\fR] [\fB\-V\fR|\fB\-\-version\fR] [\fICOMMAND\fR]
\fBwatchexec\fR [\fB\-w\fR|\fB\-\-watch\fR] [\fB\-W\fR|\fB\-\-watch\-non\-recursive\fR] [\fB\-c\fR|\fB\-\-clear\fR] [\fB\-o\fR|\fB\-\-on\-busy\-update\fR] [\fB\-r\fR|\fB\-\-restart\fR] [\fB\-s\fR|\fB\-\-signal\fR] [\fB\-\-stop\-signal\fR] [\fB\-\-stop\-timeout\fR] [\fB\-\-map\-signal\fR] [\fB\-d\fR|\fB\-\-debounce\fR] [\fB\-\-stdin\-quit\fR] [\fB\-\-no\-vcs\-ignore\fR] [\fB\-\-no\-project\-ignore\fR] [\fB\-\-no\-global\-ignore\fR] [\fB\-\-no\-default\-ignore\fR] [\fB\-\-no\-discover\-ignore\fR] [\fB\-\-ignore\-nothing\fR] [\fB\-p\fR|\fB\-\-postpone\fR] [\fB\-\-delay\-run\fR] [\fB\-\-poll\fR] [\fB\-\-shell\fR] [\fB\-n \fR] [\fB\-\-emit\-events\-to\fR] [\fB\-\-only\-emit\-events\fR] [\fB\-E\fR|\fB\-\-env\fR] [\fB\-\-no\-process\-group\fR] [\fB\-\-wrap\-process\fR] [\fB\-N\fR|\fB\-\-notify\fR] [\fB\-\-color\fR] [\fB\-\-timings\fR] [\fB\-q\fR|\fB\-\-quiet\fR] [\fB\-\-bell\fR] [\fB\-\-project\-origin\fR] [\fB\-\-workdir\fR] [\fB\-e\fR|\fB\-\-exts\fR] [\fB\-f\fR|\fB\-\-filter\fR] [\fB\-\-filter\-file\fR] [\fB\-j\fR|\fB\-\-filter\-prog\fR] [\fB\-i\fR|\fB\-\-ignore\fR] [\fB\-\-ignore\-file\fR] [\fB\-\-fs\-events\fR] [\fB\-\-no\-meta\fR] [\fB\-\-print\-events\fR] [\fB\-\-manual\fR] [\fB\-\-completions\fR] [\fB\-v\fR|\fB\-\-verbose\fR]... [\fB\-\-log\-file\fR] [\fB\-h\fR|\fB\-\-help\fR] [\fB\-V\fR|\fB\-\-version\fR] [\fICOMMAND\fR]
.SH DESCRIPTION
Execute commands when watched files change.
.PP
@ -48,6 +48,13 @@ This option can be specified multiple times to watch multiple files or directori
The special value \*(Aq/dev/null\*(Aq, provided as the only path watched, will cause Watchexec to not watch any paths. Other event sources (like signals or key events) may still be used.
.TP
\fB\-W\fR, \fB\-\-watch\-non\-recursive\fR=\fIPATH\fR
Watch a specific directory, non\-recursively
Unlike \*(Aq\-w\*(Aq, folders watched with this option are not recursed into.
This option can be specified multiple times to watch multiple directories non\-recursively.
.TP
\fB\-c\fR, \fB\-\-clear\fR=\fIMODE\fR
Clear screen before running command
@ -56,11 +63,9 @@ If this doesn\*(Aqt completely clear the screen, try \*(Aq\-\-clear=reset\*(Aq.
\fB\-o\fR, \fB\-\-on\-busy\-update\fR=\fIMODE\fR
What to do when receiving events while the command is running
Default is to \*(Aqqueue\*(Aq up events and run the command once again when the previous run has finished. You can also use \*(Aqdo\-nothing\*(Aq, which ignores events while the command is running and may be useful to avoid spurious changes made by that command, or \*(Aqrestart\*(Aq, which terminates the running command and starts a new one. Finally, there\*(Aqs \*(Aqsignal\*(Aq, which only sends a signal; this can be useful with programs that can reload their configuration without a full restart.
Default is to \*(Aqdo\-nothing\*(Aq, which ignores events while the command is running, so that changes that occur due to the command are ignored, like compilation outputs. You can also use \*(Aqqueue\*(Aq which will run the command once again when the current run has finished if any events occur while it\*(Aqs running, or \*(Aqrestart\*(Aq, which terminates the running command and starts a new one. Finally, there\*(Aqs \*(Aqsignal\*(Aq, which only sends a signal; this can be useful with programs that can reload their configuration without a full restart.
The signal can be specified with the \*(Aq\-\-signal\*(Aq option.
Note that this option is scheduled to change its default to \*(Aqdo\-nothing\*(Aq in the next major release. File an issue if you have any concerns.
.TP
\fB\-r\fR, \fB\-\-restart\fR
Restart the process if it\*(Aqs still running
@ -370,6 +375,17 @@ Use key=value syntax. Multiple variables can be set by repeating the option.
Don\*(Aqt use a process group
By default, Watchexec will run the command in a process group, so that signals and terminations are sent to all processes in the group. Sometimes that\*(Aqs not what you want, and you can disable the behaviour with this option.
Deprecated, use \*(Aq\-\-wrap\-process=none\*(Aq instead.
.TP
\fB\-\-wrap\-process\fR=\fIMODE\fR [default: group]
Configure how the process is wrapped
By default, Watchexec will run the command in a process group in Unix, and in a Job Object in Windows.
Some Unix programs prefer running in a session, while others do not work in a process group.
Use \*(Aqgroup\*(Aq to use a process group, \*(Aqsession\*(Aq to use a process session, and \*(Aqnone\*(Aq to run the command directly. On Windows, either of \*(Aqgroup\*(Aq or \*(Aqsession\*(Aq will use a Job Object.
.TP
\fB\-N\fR, \fB\-\-notify\fR
Alert when commands start and end
@ -430,7 +446,7 @@ This can also be used via the $WATCHEXEC_FILTER_FILES environment variable.
/!\\ This option is EXPERIMENTAL and may change and/or vanish without notice.
Provide your own custom filter programs in jaq (similar to jq) syntax. Programs are given an event in the same format as described in \*(Aq\-\-emit\-events\-to\*(Aq and must return a boolean.
Provide your own custom filter programs in jaq (similar to jq) syntax. Programs are given an event in the same format as described in \*(Aq\-\-emit\-events\-to\*(Aq and must return a boolean. Invalid programs will make watchexec fail to start; use \*(Aq\-v\*(Aq to see program runtime errors.
In addition to the jaq stdlib, watchexec adds some custom filter definitions:
@ -501,7 +517,19 @@ Print events that trigger actions
This prints the events that triggered the action when handling it (after debouncing), in a human readable form. This is useful for debugging filters.
Use \*(Aq\-v\*(Aq when you need more diagnostic information.
Use \*(Aq\-vvv\*(Aq instead when you need more diagnostic information.
.TP
\fB\-\-manual\fR
Show the manual page
This shows the manual page for Watchexec, if the output is a terminal and the \*(Aqman\*(Aq program is available. If not, the manual page is printed to stdout in ROFF format (suitable for writing to a watchexec.1 file).
.TP
\fB\-\-completions\fR=\fICOMPLETIONS\fR
Generate a shell completions script
Provides a completions script or configuration for the given shell. If Watchexec is not distributed with pre\-generated completions, you can use this to generate them yourself.
Supported shells: bash, elvish, fish, nu, powershell, zsh.
.TP
\fB\-v\fR, \fB\-\-verbose\fR
Set diagnostic log level
@ -523,18 +551,6 @@ If a path is not provided, the default is the working directory. Note that with
If the path provided is a directory, a file will be created in that directory. The file name will be the current date and time, in the format \*(Aqwatchexec.YYYY\-MM\-DDTHH\-MM\-SSZ.log\*(Aq.
.TP
\fB\-\-manual\fR
Show the manual page
This shows the manual page for Watchexec, if the output is a terminal and the \*(Aqman\*(Aq program is available. If not, the manual page is printed to stdout in ROFF format (suitable for writing to a watchexec.1 file).
.TP
\fB\-\-completions\fR=\fICOMPLETIONS\fR
Generate a shell completions script
Provides a completions script or configuration for the given shell. If Watchexec is not distributed with pre\-generated completions, you can use this to generate them yourself.
Supported shells: bash, elvish, fish, nu, powershell, zsh.
.TP
\fB\-h\fR, \fB\-\-help\fR
Print help (see a summary with \*(Aq\-h\*(Aq)
.TP
@ -566,6 +582,6 @@ Use @argfile as first argument to load arguments from the file \*(Aqargfile\*(Aq
Didn\*(Aqt expect this much output? Use the short \*(Aq\-h\*(Aq flag to get short help.
.SH VERSION
v1.25.1
v2.1.1
.SH AUTHORS
Félix Saparelli <felix@passcod.name>, Matt Green <mattgreenrocks@gmail.com>

View File

@ -4,7 +4,8 @@ watchexec - Execute commands when watched files change
# SYNOPSIS
**watchexec** \[**-w**\|**\--watch**\] \[**-c**\|**\--clear**\]
**watchexec** \[**-w**\|**\--watch**\]
\[**-W**\|**\--watch-non-recursive**\] \[**-c**\|**\--clear**\]
\[**-o**\|**\--on-busy-update**\] \[**-r**\|**\--restart**\]
\[**-s**\|**\--signal**\] \[**\--stop-signal**\] \[**\--stop-timeout**\]
\[**\--map-signal**\] \[**-d**\|**\--debounce**\] \[**\--stdin-quit**\]
@ -14,15 +15,16 @@ watchexec - Execute commands when watched files change
\[**-p**\|**\--postpone**\] \[**\--delay-run**\] \[**\--poll**\]
\[**\--shell**\] \[**-n **\] \[**\--emit-events-to**\]
\[**\--only-emit-events**\] \[**-E**\|**\--env**\]
\[**\--no-process-group**\] \[**-N**\|**\--notify**\] \[**\--color**\]
\[**\--timings**\] \[**-q**\|**\--quiet**\] \[**\--bell**\]
\[**\--project-origin**\] \[**\--workdir**\] \[**-e**\|**\--exts**\]
\[**-f**\|**\--filter**\] \[**\--filter-file**\]
\[**-j**\|**\--filter-prog**\] \[**-i**\|**\--ignore**\]
\[**\--ignore-file**\] \[**\--fs-events**\] \[**\--no-meta**\]
\[**\--print-events**\] \[**-v**\|**\--verbose**\]\...
\[**\--log-file**\] \[**\--manual**\] \[**\--completions**\]
\[**-h**\|**\--help**\] \[**-V**\|**\--version**\] \[*COMMAND*\]
\[**\--no-process-group**\] \[**\--wrap-process**\]
\[**-N**\|**\--notify**\] \[**\--color**\] \[**\--timings**\]
\[**-q**\|**\--quiet**\] \[**\--bell**\] \[**\--project-origin**\]
\[**\--workdir**\] \[**-e**\|**\--exts**\] \[**-f**\|**\--filter**\]
\[**\--filter-file**\] \[**-j**\|**\--filter-prog**\]
\[**-i**\|**\--ignore**\] \[**\--ignore-file**\] \[**\--fs-events**\]
\[**\--no-meta**\] \[**\--print-events**\] \[**\--manual**\]
\[**\--completions**\] \[**-v**\|**\--verbose**\]\...
\[**\--log-file**\] \[**-h**\|**\--help**\] \[**-V**\|**\--version**\]
\[*COMMAND*\]
# DESCRIPTION
@ -81,6 +83,15 @@ The special value /dev/null, provided as the only path watched, will
cause Watchexec to not watch any paths. Other event sources (like
signals or key events) may still be used.
**-W**, **\--watch-non-recursive**=*PATH*
: Watch a specific directory, non-recursively
Unlike -w, folders watched with this option are not recursed into.
This option can be specified multiple times to watch multiple
directories non-recursively.
**-c**, **\--clear**=*MODE*
: Clear screen before running command
@ -91,19 +102,17 @@ If this doesn't completely clear the screen, try \--clear=reset.

: What to do when receiving events while the command is running
Default is to queue up events and run the command once again when the
previous run has finished. You can also use do-nothing, which ignores
events while the command is running and may be useful to avoid spurious
changes made by that command, or restart, which terminates the running
command and starts a new one. Finally, there's signal, which only sends a
signal; this can be useful with programs that can reload their
configuration without a full restart.
Default is to do-nothing, which ignores events while the command is
running, so that changes that occur due to the command are ignored, like
compilation outputs. You can also use queue which will run the command
once again when the current run has finished if any events occur while
it's running, or restart, which terminates the running command and starts
a new one. Finally, there's signal, which only sends a signal; this can
be useful with programs that can reload their configuration without a
full restart.
The signal can be specified with the \--signal option.
Note that this option is scheduled to change its default to do-nothing
in the next major release. File an issue if you have any concerns.
**-r**, **\--restart**
: Restart the process if it's still running
@ -518,6 +527,22 @@ signals and terminations are sent to all processes in the group.
Sometimes that's not what you want, and you can disable the behaviour
with this option.
Deprecated, use \--wrap-process=none instead.
**\--wrap-process**=*MODE* \[default: group\]
: Configure how the process is wrapped
By default, Watchexec will run the command in a process group in Unix,
and in a Job Object in Windows.
Some Unix programs prefer running in a session, while others do not work
in a process group.
Use group to use a process group, session to use a process session, and
none to run the command directly. On Windows, either of group or session
will use a Job Object.
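As a rough illustration of what "group" mode means on Unix: watchexec itself delegates this to the process-wrap crate, but the standard library can express the core idea. This sketch is Unix-only and only an approximation:

```rust
use std::os::unix::process::CommandExt;
use std::process::{Command, Output};

// Spawn a shell in a fresh process group, roughly what `--wrap-process=group`
// arranges, so signals sent to the group reach the whole command tree.
fn run_in_new_group(script: &str) -> Output {
    Command::new("sh")
        .args(["-c", script])
        .process_group(0) // like setpgid(0, 0): new group, child as leader
        .output()
        .expect("failed to spawn sh")
}

fn main() {
    let out = run_in_new_group("echo running in own group");
    assert!(out.status.success());
    assert_eq!(String::from_utf8_lossy(&out.stdout).trim(), "running in own group");
}
```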
**-N**, **\--notify**
: Alert when commands start and end
@ -615,7 +640,8 @@ notice.
Provide your own custom filter programs in jaq (similar to jq) syntax.
Programs are given an event in the same format as described in
\--emit-events-to and must return a boolean.
\--emit-events-to and must return a boolean. Invalid programs will make
watchexec fail to start; use -v to see program runtime errors.
In addition to the jaq stdlib, watchexec adds some custom filter
definitions:
@ -731,7 +757,25 @@ This prints the events that triggered the action when handling it (after
debouncing), in a human readable form. This is useful for debugging
filters.
Use -v when you need more diagnostic information.
Use -vvv instead when you need more diagnostic information.
**\--manual**
: Show the manual page
This shows the manual page for Watchexec, if the output is a terminal
and the man program is available. If not, the manual page is printed to
stdout in ROFF format (suitable for writing to a watchexec.1 file).
**\--completions**=*COMPLETIONS*
: Generate a shell completions script
Provides a completions script or configuration for the given shell. If
Watchexec is not distributed with pre-generated completions, you can use
this to generate them yourself.
Supported shells: bash, elvish, fish, nu, powershell, zsh.
**-v**, **\--verbose**
@ -767,24 +811,6 @@ If the path provided is a directory, a file will be created in that
directory. The file name will be the current date and time, in the
format watchexec.YYYY-MM-DDTHH-MM-SSZ.log.
**\--manual**
: Show the manual page
This shows the manual page for Watchexec, if the output is a terminal
and the man program is available. If not, the manual page is printed to
stdout in ROFF format (suitable for writing to a watchexec.1 file).
**\--completions**=*COMPLETIONS*
: Generate a shell completions script
Provides a completions script or configuration for the given shell. If
Watchexec is not distributed with pre-generated completions, you can use
this to generate them yourself.
Supported shells: bash, elvish, fish, nu, powershell, zsh.
**-h**, **\--help**
: Print help (see a summary with -h)
@ -835,7 +861,7 @@ Didn't expect this much output? Use the short -h flag to get short help.
# VERSION
v1.25.1
v2.1.1
# AUTHORS