Compare commits

...

115 Commits

Author SHA1 Message Date
Félix Saparelli 9f1f2e9d04
chore: Release 2024-05-16 14:20:58 +12:00
Félix Saparelli 0e393c25cf
update changelog 2024-05-16 14:20:36 +12:00
Luca Barbato 2026c52abd
feat: Add git-describe support (#832) 2024-05-15 13:02:25 +00:00
Félix Saparelli 72f069a847
chore: Release 2024-04-30 20:41:43 +12:00
Adit 4affed6fff
fix(cli): recursive paths provided by user getting treated non-recursively (#828) 2024-04-30 07:10:28 +00:00
Félix Saparelli e0084e69f8
fix ci again 2024-04-28 19:14:21 +12:00
Félix Saparelli 592b712c95
chore: Release 2024-04-28 18:55:23 +12:00
Félix Saparelli c9a3b9df00
chore: Release 2024-04-28 18:53:42 +12:00
Félix Saparelli e63d37f601
chore: Release 2024-04-28 18:52:50 +12:00
Félix Saparelli 14e6294f5a
chore: Release 2024-04-28 18:51:48 +12:00
Félix Saparelli 234d606563
chore: Release 2024-04-28 18:50:18 +12:00
Félix Saparelli 77405c8ce1
chore: Release 2024-04-28 18:48:50 +12:00
Félix Saparelli 6c23afe839
feat: make it possible to watch non-recursively (#827)
Fixes #227
Fixes #174

docs(cli): be more precise in print-events advice to use `-v`
docs(cli): improve jaq error help
feat(cli): add `-W` for non-recursive watches
feat(cli): use non-blocking logging
feat(globset): hide `fmt::Debug` spew from ignore crate
feat(ignore-files): hide `fmt::Debug` spew from ignore crate
feat(lib): make it possible to watch non-recursively
fix(lib): inserting `WatchedPath`s directly should be possible
refactor(lib): move `WatchedPath` out of `fs` mod
2024-04-28 06:33:07 +00:00
dependabot[bot] ee3795d776
Bump softprops/action-gh-release from 2.0.3 to 2.0.4 (#823)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-28 18:19:34 +12:00
Félix Saparelli eff96c7324
feat(project-origins): add support for out-of-tree git repos (#826) 2024-04-28 14:26:07 +12:00
Félix Saparelli a4df258735
doc: fix --on-busy-update help text (#825) 2024-04-23 14:44:59 +12:00
Félix Saparelli d388a280f0
ci: more build improvements (for next time) 2024-04-21 02:11:37 +12:00
Félix Saparelli bb97f71c8c
gha: probably the most frustrating syntax in the world 2024-04-21 02:04:56 +12:00
Félix Saparelli 953fa89dd9
even better 2024-04-21 02:02:57 +12:00
Félix Saparelli 0ef87821f2
Run manpage and completions in release when we've already built in releases 2024-04-21 01:57:09 +12:00
Félix Saparelli 62af5dd868
Fix dist manifest 2024-04-21 01:52:11 +12:00
Félix Saparelli 4497aaf515
Fix release builder 2024-04-21 01:38:11 +12:00
Félix Saparelli a63864c5f2
chore: Release 2024-04-21 01:18:24 +12:00
Félix Saparelli ee815ba166
chore: Release 2024-04-21 01:06:46 +12:00
Félix Saparelli d6138b9961
chore: Release 2024-04-21 01:04:18 +12:00
Félix Saparelli f73d388d18
Changelogs for filterers 2024-04-21 01:03:58 +12:00
Félix Saparelli 86d6c7d448
Remove more PR machinery 2024-04-21 01:02:40 +12:00
Félix Saparelli d317540fd3
chore: Release 2024-04-21 01:00:28 +12:00
Félix Saparelli 9d91c51651
chore: Release 2024-04-21 00:56:27 +12:00
Félix Saparelli 96480cb588
chore: Release 2024-04-21 00:55:14 +12:00
Félix Saparelli fd5afb8b3a
Add --wrap-process (#822) 2024-04-20 12:39:28 +00:00
Félix Saparelli e1cef25d7f
Fix watchexec-events tests 2024-04-21 00:36:59 +12:00
Félix Saparelli 22b58a66ab
Remove tagged filterer 2024-04-21 00:32:01 +12:00
Félix Saparelli 1c47ffbe1a
Update release.toml config 2024-04-21 00:30:56 +12:00
Félix Saparelli 48ff7ec68b
Remove PR machinery 2024-04-21 00:28:06 +12:00
Félix Saparelli 4023bf7124
chore: Release 2024-04-21 00:21:04 +12:00
Félix Saparelli 8864811e79
Fix watchexec-events self-dependency 2024-04-21 00:19:11 +12:00
Félix Saparelli 7535e17661
Fix #809: clear screen before starting process, not on every event (#821) 2024-04-20 12:15:52 +00:00
Félix Saparelli 8ad12b1f65
chore: Release 2024-04-21 00:13:30 +12:00
Félix Saparelli dca13fed43
chore: Release 2024-04-21 00:12:21 +12:00
Félix Saparelli f81aed1260
Don't create a tmpfile until one is needed (#820) 2024-04-20 23:52:28 +12:00
Félix Saparelli ec316a7279
Breaking changes to CLI: various removals (#819) 2024-04-20 11:44:21 +00:00
Félix Saparelli e505a9ad05
Breaking changes for --on-busy-update (#818) 2024-04-20 11:15:25 +00:00
Félix Saparelli 317221584a
Breaking changes for --shell (#817) 2024-04-20 10:58:29 +00:00
Félix Saparelli af24252f21
Experimental filter programs (#571) 2024-04-20 10:06:53 +00:00
Félix Saparelli b72248a38c
Update deps (#816) 2024-04-20 05:45:50 +00:00
Chris West 11b98f776a
feat: under --clear reset, always reset at exit (#797) 2024-04-20 17:03:19 +12:00
dependabot[bot] 8c22d0cac7
Bump actions/download-artifact from 3 to 4 (#732)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-20 04:59:04 +00:00
dependabot[bot] 338999eb65
Bump softprops/action-gh-release from 1 to 2 (#799)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-20 16:58:33 +12:00
dependabot[bot] 538439f045
Bump mathieudutour/github-tag-action from 6.1 to 6.2 (#798)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-20 16:58:27 +12:00
Félix Saparelli 75b2c4b4ae
Adapt supervisor to process-wrap (#815) 2024-04-20 16:58:17 +12:00
dependabot[bot] a6e0b3f70a
Bump actions/cache from 3 to 4 (#772)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-20 16:42:55 +12:00
dependabot[bot] f538b74e81
Bump actions/upload-artifact from 3 to 4 (#733)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-04-20 16:42:08 +12:00
dependabot[bot] a50ce396cb
Bump mio from 0.8.10 to 0.8.11 (#796)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-06 13:01:52 +13:00
Félix Saparelli 1846c96b86
feat(cli): Add NO_COLOR support (#779) 2024-02-11 05:13:41 +00:00
dependabot[bot] 8b39279423
Bump h2 from 0.3.22 to 0.3.24 (#769)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-20 04:39:42 +00:00
Félix Saparelli 7f4fba02ef
Remove fish completion from rpm and deb packaging (#767) 2024-01-13 01:29:55 +00:00
github-actions[bot] d3949cc6e9
release: watchexec-cli v1.25.1 (#764)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-05 04:41:32 +00:00
github-actions[bot] 465ccfc597
release: watchexec-filterer-ignore v3.0.1 (#763)
Co-authored-by: github-actions <github-actions@github.com>
2024-01-05 15:48:04 +13:00
github-actions[bot] 6a2f637a60
release: ignore-files v2.1.0 (#762)
Co-authored-by: github-actions <github-actions@github.com>
2024-01-04 11:32:58 +00:00
Félix Saparelli 4f757de8df
Canonicalise paths for ignore discovery (#760) 2024-01-04 09:32:47 +00:00
Félix Saparelli 217f57f6a2
Update to async-priority-channel 0.2.0 (#761) 2024-01-04 22:13:08 +13:00
github-actions[bot] 682a9b4d21
release: watchexec-cli v1.25.0 (#754)
Co-authored-by: github-actions <github-actions@github.com>
2024-01-01 10:39:18 +00:00
github-actions[bot] 447b6fa963
release: watchexec-filterer-globset v3.0.0 (#752)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-01 09:10:43 +00:00
github-actions[bot] 3cbf277b2e
release: watchexec-filterer-tagged v2.0.0 (#753)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-01 09:09:24 +00:00
github-actions[bot] 48793008eb
release: watchexec-filterer-ignore v3.0.0 (#751)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-01 08:47:32 +00:00
github-actions[bot] 1ef2fcebf1
release: ignore-files v2.0.0 (#750)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-01 07:54:19 +00:00
github-actions[bot] 8523bd196c
release: project-origins v1.3.0 (#749)
Co-authored-by: github-actions <github-actions@github.com>
2024-01-01 06:07:03 +00:00
Victor Adossi ("vados") cb1cfb6bf5
Optimise ignore file gathering (#663)
Co-authored-by: Félix Saparelli <felix@passcod.name>
2024-01-01 05:01:14 +00:00
Félix Saparelli bf9c85f598
Tweak origin detection lists (#748) 2024-01-01 03:41:14 +00:00
thislooksfun 3ad0e1aa57
Respect `applies_in` scope when processing nested ignores (#746)
Previously, when importing multiple nested ignore files, some info from
the parent—notably the "root" path—would be inherited. This led to some
problems with matching of "pseudo-absolute" rules (those with a leading
slash) in nested ignore files (see #745 for more details). To fix this,
we now fully isolate each path of the tree during the import process.
This leads to more accurate, though unfortunately slightly less
performant, rule matching. The only time a builder is reused now is if
two input files have the same `applies_in` value, in which case they are
merged together.

I have added tests to ensure correctness and prevent a regression. I
also was careful to make sure no previous tests broke in any way (all
changes to existing tests were made in isolation, and thus are not
affected by the logic changes). As far as I can tell, the only behavior
change is that some previously-ignored rules will now be applied,
which could, in very rare configurations, lead to files being
unintentionally ignored. However, due to the aforementioned logic bug,
those files were all ignored by git already, so I suspect the number of
people actually caught off guard by this change to be extremely low,
likely zero.

Fixes #745.
2023-12-30 14:12:59 +13:00
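The isolation-by-`applies_in` approach described in the commit above can be illustrated with a minimal Rust sketch. The types and function names below are hypothetical stand-ins, not the actual `ignore-files` API: the idea is simply to group ignore files by their `applies_in` root and build one matcher per group, so rules are only interpreted relative to their own root, and files are merged only when they share the same root.

```rust
use std::collections::HashMap;
use std::path::PathBuf;

// Hypothetical stand-ins for illustration only; these do not match the
// real ignore-files crate's types or method names.
#[derive(Debug)]
struct IgnoreFile {
    path: PathBuf,
    applies_in: Option<PathBuf>,
}

#[derive(Debug)]
struct Matcher {
    root: Option<PathBuf>,
    sources: Vec<PathBuf>,
}

// Build one matcher per distinct `applies_in` root. Files sharing a root are
// merged into the same matcher; everything else stays fully isolated.
fn build_matchers(files: Vec<IgnoreFile>) -> Vec<Matcher> {
    let mut by_root: HashMap<Option<PathBuf>, Vec<PathBuf>> = HashMap::new();
    for file in files {
        by_root.entry(file.applies_in).or_default().push(file.path);
    }
    by_root
        .into_iter()
        .map(|(root, sources)| Matcher { root, sources })
        .collect()
}

fn main() {
    let files = vec![
        IgnoreFile { path: ".gitignore".into(), applies_in: Some(".".into()) },
        IgnoreFile { path: "sub/.gitignore".into(), applies_in: Some("sub".into()) },
        IgnoreFile { path: "sub/.ignore".into(), applies_in: Some("sub".into()) },
    ];
    // Yields two matchers: one rooted at ".", one rooted at "sub" (merged).
    for matcher in build_matchers(files) {
        println!("{matcher:?}");
    }
}
```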
Félix Saparelli f65a0d21ba
Fix CLI release workflow more (#740) 2023-12-20 01:06:41 +00:00
Félix Saparelli aa0bf1c24f
Fix CLI release workflow when dispatching (#739) 2023-12-19 23:09:48 +00:00
Félix Saparelli 0a6811f1fb
Update cargo.lock (#738) 2023-12-19 22:34:29 +00:00
github-actions[bot] e9cce54179
release: watchexec-cli v1.24.2 (#736)
Co-authored-by: github-actions <github-actions@github.com>
2023-12-19 13:10:59 +00:00
github-actions[bot] 6ecc5569e4
release: watchexec-supervisor v1.0.3 (#735)
Co-authored-by: github-actions <github-actions@github.com>
2023-12-19 11:31:34 +00:00
Félix Saparelli eb4f2ce201
Fix queueing behaviour (#734) 2023-12-19 11:22:59 +00:00
Félix Saparelli b4a64a096a
Add eyra support as a feature (#728) 2023-12-13 14:08:03 +13:00
github-actions[bot] a72ff0e142
release: watchexec-cli v1.24.1 (#722)
Co-authored-by: github-actions <github-actions@github.com>
2023-12-11 12:30:05 +00:00
Félix Saparelli 53c9e344eb
Fix argfile regression (#720) 2023-12-11 17:05:38 +13:00
Félix Saparelli e4e8d39546
Graceful quit on Ctrl-C (#721) 2023-12-11 01:21:57 +00:00
github-actions[bot] 4026938c18
release: watchexec-filterer-tagged v1.0.0 (#719)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-12-10 22:45:04 +00:00
Félix Saparelli 0557d70963
watchexec-filterers: fix bad publish (#715) 2023-12-09 11:24:00 +00:00
github-actions[bot] 477d59d319
release: watchexec-cli v1.24.0 (#699)
Co-authored-by: Félix Saparelli <felix@passcod.name>
Co-authored-by: github-actions <github-actions@github.com>
2023-12-09 10:52:40 +00:00
github-actions[bot] a166b3bc9f
release: watchexec-supervisor v1.0.2 (#713)
Co-authored-by: github-actions <github-actions@github.com>
2023-12-09 10:43:55 +00:00
github-actions[bot] deb6072a26
release: watchexec-signals v2.1.0 (#711)
Co-authored-by: github-actions <github-actions@github.com>
2023-12-09 23:25:41 +13:00
Félix Saparelli 709dbe5151
New option: --signal-map (#710) 2023-12-09 09:30:58 +00:00
Félix Saparelli 03460a6181
More logging for CLI and Supervisor (#709) 2023-12-09 09:10:11 +00:00
github-actions[bot] 44d794c921
release: watchexec-supervisor v1.0.1 (#708)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-11-29 06:10:10 +00:00
github-actions[bot] c75134d255
release: watchexec-events v2.0.1 (#707)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-11-29 05:56:23 +00:00
github-actions[bot] e90bf3756e
release: watchexec v3.0.1 (#706)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-11-29 05:41:27 +00:00
github-actions[bot] 91b34bc96e
release: watchexec-signals v2.0.0 (#705)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-11-29 05:18:03 +00:00
github-actions[bot] c4dceb2d88
release: watchexec-events v2.0.0 (#704)
Co-authored-by: github-actions <github-actions@github.com>
Co-authored-by: Félix Saparelli <felix@passcod.name>
2023-11-29 05:15:33 +00:00
Félix Saparelli 16be20050b
Amend readmes (#702) 2023-11-28 11:30:33 +00:00
Félix Saparelli 0e94f220e3
Tweaks to help (#700) 2023-11-27 12:57:01 +00:00
Félix Saparelli 9af9189ac4
Add --quiet, --timings, --colo[u]r, --bell (#698) 2023-11-27 12:12:51 +00:00
github-actions[bot] 16e606e944
release: watchexec-filterer-globset v2.0.0 (#696)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-27 10:52:28 +00:00
Félix Saparelli dc91e69966
Fix bosion readme (#697) 2023-11-27 10:51:51 +00:00
Félix Saparelli 2751caeb39
Add --ignore-nothing (#695) 2023-11-27 10:50:39 +00:00
github-actions[bot] e7580b0a35
release: watchexec-filterer-ignore v2.0.0 (#694)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-27 10:38:06 +00:00
Félix Saparelli 63562fe64d
Add --only-emit-events (#691) 2023-11-27 23:29:55 +13:00
github-actions[bot] d9f6d20b6b
release: watchexec v3.0.0 (#692)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 04:54:43 +00:00
github-actions[bot] fb2c5449af
release: watchexec-supervisor v1.0.0 (#690)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 04:23:20 +00:00
github-actions[bot] 64bdf7c9d5
release: ignore-files v1.3.2 (#689)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 04:22:00 +00:00
github-actions[bot] 65e2db31bc
release: watchexec-events v1.1.0 (#688)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 03:33:47 +00:00
github-actions[bot] f66aa5d808
release: project-origins v1.2.1 (#687)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 03:30:39 +00:00
github-actions[bot] efacb29f86
release: watchexec-signals v1.0.1 (#686)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 03:29:53 +00:00
github-actions[bot] 10556dea11
release: bosion v1.0.2 (#685)
Co-authored-by: github-actions <github-actions@github.com>
2023-11-26 03:29:49 +00:00
Félix Saparelli 87ca5d6b20
Dependabot: add supervisor and remove tagged filterer (#684) 2023-11-26 16:21:13 +13:00
Félix Saparelli 318f850e9b
Remove obsolete release order (#683) 2023-11-26 03:15:29 +00:00
Félix Saparelli b54cd60146
Update all changelogs (#682) 2023-11-26 03:01:13 +00:00
Félix Saparelli 89e3d60ecf
Clippy nursery (#681) 2023-11-26 02:40:57 +00:00
Félix Saparelli a13bc429eb
Watchexec lib v3 (#601)
Co-authored-by: emilHof <95590295+emilHof@users.noreply.github.com>
2023-11-25 20:33:44 +00:00
Felix Yan 7f23fbd68a
Update Arch Linux package URL in packages.md (#679) 2023-11-21 23:26:59 +00:00
dependabot[bot] 652b8c9b2f
Bump actions/checkout from 3 to 4 (#651)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-05 16:05:40 +12:00
197 changed files with 12008 additions and 10642 deletions

View File

@ -1,4 +1,3 @@
-D clippy::all
-W clippy::nursery
-W clippy::pedantic
-A clippy::module-name-repetitions
@ -10,3 +9,4 @@
-A clippy::default-trait-access
-A clippy::enum-glob-use
-A clippy::option-if-let-else
-A clippy::blocks-in-conditions

View File

@ -7,44 +7,44 @@ updates:
# default location of `.github/workflows`
directory: "/"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/cli"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/lib"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/events"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/signals"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/supervisor"
schedule:
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/filterer/ignore"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/filterer/globset"
schedule:
interval: "daily"
- package-ecosystem: "cargo"
directory: "/crates/filterer/tagged"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/bosion"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/ignore-files"
schedule:
interval: "daily"
interval: "weekly"
- package-ecosystem: "cargo"
directory: "/crates/project-origins"
schedule:
interval: "daily"
interval: "weekly"

View File

@ -31,7 +31,7 @@ jobs:
runs-on: "${{ matrix.platform }}-latest"
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Configure toolchain
run: |
rustup toolchain install stable --profile minimal --no-self-update --component clippy
@ -46,7 +46,7 @@ jobs:
echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
- name: Configure caching
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: |
~/.cargo/registry/index/

View File

@ -4,7 +4,6 @@
app_name: "watchexec",
app_version: $version,
changelog_title: "CLI \($version)",
changelog_body: $changelog,
artifacts: [ $files | split("\n") | .[] | {
name: .,
kind: (if (. | test("[.](deb|rpm)$")) then "installer" else "executable-zip" end),

View File

@ -1,7 +1,6 @@
name: CLI Release
on:
workflow_call:
workflow_dispatch:
push:
tags:
@ -17,10 +16,8 @@ jobs:
runs-on: ubuntu-latest
outputs:
cli_version: ${{ steps.version.outputs.cli_version }}
release_notes: ${{ fromJSON(steps.notes.outputs.notes_json) }}
announce: ${{ steps.announce.outputs.announce }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Extract version
id: version
shell: bash
@ -36,33 +33,6 @@ jobs:
echo "cli_version=$version" >> $GITHUB_OUTPUT
- name: Extract release notes
id: notes
shell: bash
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_REPO: ${{ github.repository }}
release_commit: ${{ github.event.head_commit.message }}
run: |
set -euxo pipefail
release_pr=$(head -n1 <<< "${release_commit:-}" | grep -oP '(?<=[(]#)\d+(?=[)])')
if [[ -z "$release_pr" ]]; then
echo "notes_json=null" >> $GITHUB_OUTPUT
exit
fi
gh \
pr --repo "$GITHUB_REPO" \
view "$release_pr" \
--json body \
--jq '"notes_json=\((.body | split("### Release notes")[1] // "") | tojson)"' \
>> $GITHUB_OUTPUT
- name: Make a new announcement post
id: announce
if: endsWith(steps.version.outputs.cli_version, '.0')
run: echo "announce=Announcements" >> $GITHUB_OUTPUT
build:
strategy:
matrix:
@ -161,7 +131,7 @@ jobs:
dst: watchexec-${{ needs.info.outputs.cli_version }}-${{ matrix.target }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
# https://github.com/actions/cache/issues/752
- if: ${{ runner.os == 'Windows' }}
@ -172,7 +142,7 @@ jobs:
echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
- name: Configure caching
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: |
~/.cargo/registry/index/
@ -184,6 +154,8 @@ jobs:
${{ runner.os }}-cargo-${{ matrix.target }}-
${{ runner.os }}-cargo-
- run: sudo apt update
if: startsWith(matrix.os, 'ubuntu-')
- name: Add musl tools
run: sudo apt install -y musl musl-dev musl-tools
if: endsWith(matrix.target, '-musl')
@ -224,19 +196,29 @@ jobs:
with:
tool: cross
- name: Build (cargo)
if: "!matrix.cross"
run: cargo build --package watchexec-cli --release --locked --target ${{ matrix.target }}
- name: Build (cross)
if: matrix.cross
run: cross build --package watchexec-cli --release --locked --target ${{ matrix.target }}
- name: Build
shell: bash
run: |
${{ matrix.cross && 'cross' || 'cargo' }} build \
-p watchexec-cli \
--release --locked \
--target ${{ matrix.target }}
- name: Make manpage
run: cargo run -p watchexec-cli -- --manual > doc/watchexec.1
shell: bash
run: |
cargo run -p watchexec-cli \
${{ (!matrix.cross) && '--release --target' || '' }} \
${{ (!matrix.cross) && matrix.target || '' }} \
--locked -- --manual > doc/watchexec.1
- name: Make completions
run: bin/completions
shell: bash
run: |
bin/completions \
${{ (!matrix.cross) && '--release --target' || '' }} \
${{ (!matrix.cross) && matrix.target || '' }} \
--locked
- name: Package
shell: bash
@ -274,9 +256,9 @@ jobs:
shell: bash
run: 7z a "$dst.zip" "$dst"
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: builds
name: ${{ matrix.name }}
retention-days: 1
path: |
watchexec-*.tar.xz
@ -292,22 +274,21 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install b3sum
uses: taiki-e/install-action@v2
with:
tool: b3sum
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with:
name: builds
merge-multiple: true
- name: Dist manifest
run: |
jq -ncf .github/workflows/dist-manifest.jq \
--arg version "{{ needs.info.outputs.cli_version }}" \
--arg changelog "{{ needs.info.outputs.release_notes }}" \
--arg version "${{ needs.info.outputs.cli_version }}" \
--arg files "$(ls watchexec-*)" \
> dist-manifest.json
@ -325,13 +306,11 @@ jobs:
sha512sum $file | cut -d ' ' -f1 > "$file.sha512"
done
- uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844
- uses: softprops/action-gh-release@9d7c94cfd0a1f3ed45544c887983e9fa900f0564
with:
tag_name: v${{ needs.info.outputs.cli_version }}
name: CLI v${{ needs.info.outputs.cli_version }}
body: ${{ needs.info.outputs.release_notes }}
append_body: true
discussion_category_name: ${{ needs.info.outputs.announce }}
files: |
dist-manifest.json
watchexec-*.tar.xz

View File

@ -1,61 +0,0 @@
<!-- <%- JSON.stringify({ "release-pr": { v2: { crates, version } } }) %> -->
This is a release PR for **<%= crate.name %>** version **<%= version.actual %>**<%
if (version.actual != version.desired) {
%> (performing a <%= version.desired %> bump).<%
} else {
%>.<%
}
%>
**Use squash merge.**
<% if (crate.name == "watchexec-cli") { %>
Upon merging, this will automatically create the tag `v<%= version.actual %>`, build the CLI, and create a GitHub release.
You will still need to manually publish the cargo crate:
```
$ git switch main
$ git pull
$ git switch --detach v<%= version.actual %>
$ cargo publish -p <%= crate.name %>
```
<% } else { %>
Remember to review the crate's changelog!
Upon merging, this will create the tag `<%= crate.name %>-v<%= version.actual %>`.
You will still need to manually publish the cargo crate:
```
$ git switch main
$ git pull
$ git switch --detach <%= crate.name %>-v<%= version.actual %>
$ cargo publish -p <%= crate.name %>
```
<% } %>
To trigger builds initially: either close and then immediately re-open this PR once, or **enable auto-merge**.
<% if (pr.releaseNotes) { %>
---
_Edit release notes into the section below:_
<!-- do not change or remove this heading -->
<% if (crate.name == "watchexec-cli") { %>
### Release notes
_Software development often involves running the same commands over and over. Boring! Watchexec is a simple, standalone tool that watches a path and runs a command whenever it detects modifications. Install it today with [`cargo-binstall watchexec-cli`](https://github.com/cargo-bins/cargo-binstall), from the binaries below, find it [in your favourite package manager](https://github.com/watchexec/watchexec/blob/main/doc/packages.md), or build it from source with `cargo install watchexec-cli`._
#### In this release:
-
#### Other changes:
-
<% } else { %>
### Changelog
-
<% } %>
<% } %>

View File

@ -1,53 +0,0 @@
name: Open a release PR
on:
workflow_dispatch:
inputs:
crate:
description: Crate to release
required: true
type: choice
options:
- cli
- lib
- bosion
- events
- ignore-files
- project-origins
- signals
- filterer/globset
- filterer/ignore
- filterer/tagged
version:
description: Version to release
required: true
type: string
default: patch
jobs:
make-release-pr:
permissions:
id-token: write # Enable OIDC
pull-requests: write
contents: write
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: chainguard-dev/actions/setup-gitsign@main
- name: Install cargo-release
uses: taiki-e/install-action@v2
with:
tool: cargo-release
- uses: cargo-bins/release-pr@v2
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
version: ${{ inputs.version }}
crate-path: crates/${{ inputs.crate }}
pr-release-notes: true
pr-label: release
pr-template-file: .github/workflows/release-pr.ejs
env:
GITSIGN_LOG: /tmp/gitsign.log
- run: cat /tmp/gitsign.log
if: ${{ failure() }}

View File

@ -1,45 +0,0 @@
name: Tag a release
on:
push:
branches:
- main
tags-ignore:
- "*"
jobs:
make-tag:
runs-on: ubuntu-latest
# because we control the release PR title and only allow squashes,
# PRs that are named `release: {crate-name} v{version}` will get tagged!
# the commit message will look like: `release: {crate-name} v{version} (#{pr-number})`
if: "startsWith(github.event.head_commit.message, 'release: ')"
steps:
- name: Extract tag from commit message
env:
COMMIT_MESSAGE: ${{ github.event.head_commit.message }}
run: |
set -euxo pipefail
message="$(head -n1 <<< "$COMMIT_MESSAGE")"
crate="$(cut -d ' ' -f 2 <<< "${message}")"
version="$(cut -d ' ' -f 3 <<< "${message}")"
if [[ "$crate" == "watchexec-cli" ]]; then
echo "CUSTOM_TAG=${version}" >> $GITHUB_ENV
else
echo "CUSTOM_TAG=${crate}-${version}" >> $GITHUB_ENV
fi
- uses: actions/checkout@v3
- name: Push release tag
id: tag_version
uses: mathieudutour/github-tag-action@v6.1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
custom_tag: ${{ env.CUSTOM_TAG }}
tag_prefix: ''
release-cli:
needs: make-tag
if: "startsWith(github.event.head_commit.message, 'release: watchexec-cli v')"
uses: ./.github/workflows/release-cli.yml
secrets: inherit

View File

@ -35,7 +35,7 @@ jobs:
runs-on: "${{ matrix.platform }}-latest"
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Configure toolchain
run: |
rustup toolchain install --profile minimal --no-self-update stable
@ -50,7 +50,7 @@ jobs:
echo C:\Program Files\Git\usr\bin>>"%GITHUB_PATH%"
- name: Cargo caching
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: |
~/.cargo/registry/index/
@ -62,15 +62,28 @@ jobs:
${{ runner.os }}-cargo-
- name: Compilation caching
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: target/
key: ${{ runner.os }}-target-stable-${{ hashFiles('**/Cargo.lock') }}
- name: Run test suite
run: cargo test ${{ env.flags }}
run: cargo test
- name: Run watchexec-events integration tests
run: cargo test -p watchexec-events -F serde
- name: Check that CLI runs
run: cargo run ${{ env.flags }} -p watchexec-cli -- -1 echo
run: cargo run -p watchexec-cli -- -1 echo
- name: Install coreutils on mac
if: ${{ matrix.platform == 'macos' }}
run: brew install coreutils
- name: Run watchexec integration tests (unix)
if: ${{ matrix.platform != 'windows' }}
run: crates/cli/run-tests.sh
shell: bash
env:
WATCHEXEC_BIN: target/debug/watchexec
- name: Run bosion integration tests
run: ./run-tests.sh
@ -78,7 +91,7 @@ jobs:
shell: bash
- name: Generate manpage
run: cargo run ${{ env.flags }} -p watchexec-cli -- --manual > doc/watchexec.1
run: cargo run -p watchexec-cli -- --manual > doc/watchexec.1
- name: Check that manpage is up to date
run: git diff --exit-code -- doc/
@ -92,7 +105,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Configure toolchain
run: |
rustup toolchain install --profile minimal --no-self-update stable
@ -107,7 +120,7 @@ jobs:
tool: cross
- name: Cargo caching
uses: actions/cache@v3
uses: actions/cache@v4
with:
path: |
~/.cargo/registry/index/
@ -122,15 +135,14 @@ jobs:
- run: cross check --target x86_64-unknown-freebsd
- run: cross check --target x86_64-unknown-netbsd
# Dummy job to have a stable name for the "all tests pass" requirement
tests-pass:
if: always() # always run even if dependencies fail
if: always()
name: Tests pass
needs:
- test
- cross-checks
runs-on: ubuntu-latest
steps:
# fail if ANY dependency has failed or been skipped or cancelled
- if: "contains(needs.*.result, 'failure') || contains(needs.*.result, 'skipped') || contains(needs.*.result, 'cancelled')"
run: exit 1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}

View File

@ -3,8 +3,8 @@ message: |
If you use this software, please cite it using these metadata.
title: "Watchexec: a tool to react to filesystem changes, and a crate ecosystem to power it"
version: "1.23.0"
date-released: 2023-08-30
version: "2.1.1"
date-released: 2024-04-30
repository-code: https://github.com/watchexec/watchexec
license: Apache-2.0

View File

@ -59,14 +59,11 @@ A release goes like this:
### Release order
When all crates need releasing, the order is:
Use this command to see the tree of workspace dependencies:
- project-origins (depends on nothing)
- ignore-files (depends on project-origins)
- watchexec lib
- ignore filterer (depends on watchexec lib)
- other filterers (depends on ignore filterer)
- watchexec cli
```console
$ cargo tree -p watchexec-cli | rg -F '(/' --color=never | sed 's/ v[0-9].*//'
```
## Overview

Cargo.lock (generated): 2624 changed lines; file diff suppressed because it is too large.

View File

@ -5,14 +5,21 @@ members = [
"crates/cli",
"crates/events",
"crates/signals",
"crates/supervisor",
"crates/filterer/globset",
"crates/filterer/ignore",
"crates/filterer/tagged",
"crates/bosion",
"crates/ignore-files",
"crates/project-origins",
]
[workspace.dependencies]
miette = "5.10.0"
tempfile = "3.8.0"
tracing-test = "0.2.4"
rand = "0.8"
uuid = "1.5.0"
[profile.release]
lto = true
debug = 1 # for stack traces

View File

@ -4,7 +4,7 @@
Software development often involves running the same commands over and over. Boring!
`watchexec` is a **simple**, standalone tool that watches a path and runs a command whenever it detects modifications.
`watchexec` is a simple, standalone tool that watches a path and runs a command whenever it detects modifications.
Example use cases:
@ -21,7 +21,7 @@ Example use cases:
* Coalesces multiple filesystem events into one, for editors that use swap/backup files during saving
* Loads `.gitignore` and `.ignore` files
* Uses process groups to keep hold of forking programs
* Provides the paths that changed in environment variables
* Provides the paths that changed in environment variables or STDIN
* Does not require a language runtime, not tied to any particular language or ecosystem
* [And more!](./crates/cli/#features)
@ -49,7 +49,7 @@ More usage examples: [in the CLI README](./crates/cli/#usage-examples)!
All options in detail: [in the CLI README](./crates/cli/#installation),
in the online help (`watchexec -h`, `watchexec --help`, or `watchexec --manual`),
and [in the manual page](./doc/watchexec.1.md) ([PDF](./doc/watchexec.1.pdf)).
and [in the manual page](./doc/watchexec.1.md).
## Augment
@ -62,11 +62,23 @@ Watchexec pairs well with:
## Extend
- [watchexec library](./crates/lib/): to create more specialised watchexec-powered tools! such as:
- [cargo watch](https://github.com/watchexec/cargo-watch): for Rust/Cargo projects.
- [watchexec library](./crates/lib/): to create more specialised watchexec-powered tools.
- [watchexec-events](./crates/events/): event types for watchexec.
- [watchexec-signals](./crates/signals/): signal types for watchexec.
- [watchexec-supervisor](./crates/supervisor/): process lifecycle manager (the _exec_ part of watchexec).
- [clearscreen](https://github.com/watchexec/clearscreen): to clear the (terminal) screen on every platform.
- [command group](https://github.com/watchexec/command-group): to run commands in process groups.
- [ignore files](./crates/ignore-files/): to find, parse, and interpret ignore files.
- [project origins](./crates/project-origins/): to find the origin(s) directory of a project.
- [notify](https://github.com/notify-rs/notify): to respond to file modifications (third-party).
- [globset](https://crates.io/crates/globset): to match globs (third-party).
### Downstreams
Selected downstreams of watchexec and associated crates:
- [cargo watch](https://github.com/watchexec/cargo-watch): a specialised watcher for Rust/Cargo projects.
- [cargo lambda](https://github.com/cargo-lambda/cargo-lambda): a dev tool for Rust-powered AWS Lambda functions.
- [create-rust-app](https://create-rust-app.dev): a template for Rust+React web apps.
- [dotter](https://github.com/supercuber/dotter): a dotfile manager.
- [ghciwatch](https://github.com/mercurytechnologies/ghciwatch): a specialised watcher for Haskell projects.
- [tectonic](https://tectonic-typesetting.github.io/book/latest/): a TeX/LaTeX typesetting system.

View File

@ -1,7 +1,7 @@
#!/bin/sh
cargo run -p watchexec-cli -- --completions bash > completions/bash
cargo run -p watchexec-cli -- --completions elvish > completions/elvish
cargo run -p watchexec-cli -- --completions fish > completions/fish
cargo run -p watchexec-cli -- --completions nu > completions/nu
cargo run -p watchexec-cli -- --completions powershell > completions/powershell
cargo run -p watchexec-cli -- --completions zsh > completions/zsh
cargo run -p watchexec-cli $* -- --completions bash > completions/bash
cargo run -p watchexec-cli $* -- --completions elvish > completions/elvish
cargo run -p watchexec-cli $* -- --completions fish > completions/fish
cargo run -p watchexec-cli $* -- --completions nu > completions/nu
cargo run -p watchexec-cli $* -- --completions powershell > completions/powershell
cargo run -p watchexec-cli $* -- --completions zsh > completions/zsh

bin/dates.mjs (new executable file, 10 lines added)
View File

@ -0,0 +1,10 @@
#!/usr/bin/env node
const id = Math.floor(Math.random() * 100);
let n = 0;
const m = 5;
while (n < m) {
n += 1;
console.log(`[${id} : ${n}/${m}] ${new Date}`);
await new Promise(done => setTimeout(done, 2000));
}

View File

@ -1,4 +1,3 @@
#!/bin/sh
cargo run -p watchexec-cli -- --manual > doc/watchexec.1
roff2pdf < doc/watchexec.1 > doc/watchexec.1.pdf
pandoc doc/watchexec.1 -t markdown > doc/watchexec.1.md

View File

@ -19,7 +19,7 @@ _watchexec() {
case "${cmd}" in
watchexec)
opts="-w -c -o -W -r -s -k -d -p -n -E -1 -N -e -f -i -v -h -V --watch --clear --on-busy-update --watch-when-idle --restart --signal --kill --stop-signal --stop-timeout --debounce --stdin-quit --no-vcs-ignore --no-project-ignore --no-global-ignore --no-default-ignore --no-discover-ignore --postpone --delay-run --poll --shell --no-shell-long --no-environment --emit-events-to --env --no-process-group --notify --project-origin --workdir --exts --filter --filter-file --ignore --ignore-file --fs-events --no-meta --print-events --verbose --log-file --manual --completions --help --version [COMMAND]..."
opts="-w -W -c -o -r -s -d -p -n -E -1 -N -q -e -f -j -i -v -h -V --watch --watch-non-recursive --clear --on-busy-update --restart --signal --stop-signal --stop-timeout --map-signal --debounce --stdin-quit --no-vcs-ignore --no-project-ignore --no-global-ignore --no-default-ignore --no-discover-ignore --ignore-nothing --postpone --delay-run --poll --shell --no-environment --emit-events-to --only-emit-events --env --no-process-group --wrap-process --notify --color --timings --quiet --bell --project-origin --workdir --exts --filter --filter-file --filter-prog --ignore --ignore-file --fs-events --no-meta --print-events --manual --completions --verbose --log-file --help --version [COMMAND]..."
if [[ ${cur} == -* || ${COMP_CWORD} -eq 1 ]] ; then
COMPREPLY=( $(compgen -W "${opts}" -- "${cur}") )
return 0
@ -33,6 +33,14 @@ _watchexec() {
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--watch-non-recursive)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
-W)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--clear)
COMPREPLY=($(compgen -W "clear reset" -- "${cur}"))
return 0
@ -65,6 +73,10 @@ _watchexec() {
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--map-signal)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--debounce)
COMPREPLY=($(compgen -f "${cur}"))
return 0
@ -86,7 +98,7 @@ _watchexec() {
return 0
;;
--emit-events-to)
COMPREPLY=($(compgen -W "environment stdin file json-stdin json-file none" -- "${cur}"))
COMPREPLY=($(compgen -W "environment stdio file json-stdio json-file none" -- "${cur}"))
return 0
;;
--env)
@ -97,12 +109,26 @@ _watchexec() {
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--wrap-process)
COMPREPLY=($(compgen -W "group session none" -- "${cur}"))
return 0
;;
--color)
COMPREPLY=($(compgen -W "auto always never" -- "${cur}"))
return 0
;;
--project-origin)
COMPREPLY=($(compgen -f "${cur}"))
COMPREPLY=()
if [[ "${BASH_VERSINFO[0]}" -ge 4 ]]; then
compopt -o plusdirs
fi
return 0
;;
--workdir)
COMPREPLY=($(compgen -f "${cur}"))
COMPREPLY=()
if [[ "${BASH_VERSINFO[0]}" -ge 4 ]]; then
compopt -o plusdirs
fi
return 0
;;
--exts)
@ -122,6 +148,25 @@ _watchexec() {
return 0
;;
--filter-file)
local oldifs
if [ -n "${IFS+x}" ]; then
oldifs="$IFS"
fi
IFS=$'\n'
COMPREPLY=($(compgen -f "${cur}"))
if [ -n "${oldifs+x}" ]; then
IFS="$oldifs"
fi
if [[ "${BASH_VERSINFO[0]}" -ge 4 ]]; then
compopt -o filenames
fi
return 0
;;
--filter-prog)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
-j)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
@ -134,21 +179,32 @@ _watchexec() {
return 0
;;
--ignore-file)
local oldifs
if [ -n "${IFS+x}" ]; then
oldifs="$IFS"
fi
IFS=$'\n'
COMPREPLY=($(compgen -f "${cur}"))
if [ -n "${oldifs+x}" ]; then
IFS="$oldifs"
fi
if [[ "${BASH_VERSINFO[0]}" -ge 4 ]]; then
compopt -o filenames
fi
return 0
;;
--fs-events)
COMPREPLY=($(compgen -W "access create remove rename modify metadata" -- "${cur}"))
return 0
;;
--log-file)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
--completions)
COMPREPLY=($(compgen -W "bash elvish fish nu powershell zsh" -- "${cur}"))
return 0
;;
--log-file)
COMPREPLY=($(compgen -f "${cur}"))
return 0
;;
*)
COMPREPLY=()
;;
@ -159,4 +215,8 @@ _watchexec() {
esac
}
complete -F _watchexec -o nosort -o bashdefault -o default watchexec
if [[ "${BASH_VERSINFO[0]}" -eq 4 && "${BASH_VERSINFO[1]}" -ge 4 || "${BASH_VERSINFO[0]}" -gt 4 ]]; then
complete -F _watchexec -o nosort -o bashdefault -o default watchexec
else
complete -F _watchexec -o bashdefault -o default watchexec
fi

View File

@ -20,6 +20,8 @@ set edit:completion:arg-completer[watchexec] = {|@words|
&'watchexec'= {
cand -w 'Watch a specific file or directory'
cand --watch 'Watch a specific file or directory'
cand -W 'Watch a specific directory, non-recursively'
cand --watch-non-recursive 'Watch a specific directory, non-recursively'
cand -c 'Clear screen before running command'
cand --clear 'Clear screen before running command'
cand -o 'What to do when receiving events while the command is running'
@ -28,6 +30,7 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand --signal 'Send a signal to the process when it''s still running'
cand --stop-signal 'Signal to send to stop the command'
cand --stop-timeout 'Time to wait for the command to exit gracefully'
cand --map-signal 'Translate signals from the OS to signals to send to the command'
cand -d 'Time to wait for new events before taking action'
cand --debounce 'Time to wait for new events before taking action'
cand --delay-run 'Sleep before running the command'
@ -36,6 +39,8 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand --emit-events-to 'Configure event emission'
cand -E 'Add env vars to the command'
cand --env 'Add env vars to the command'
cand --wrap-process 'Configure how the process is wrapped'
cand --color 'When to use terminal colours'
cand --project-origin 'Set the project origin'
cand --workdir 'Set the working directory'
cand -e 'Filename extensions to filter to'
@ -43,38 +48,41 @@ set edit:completion:arg-completer[watchexec] = {|@words|
cand -f 'Filename patterns to filter to'
cand --filter 'Filename patterns to filter to'
cand --filter-file 'Files to load filters from'
cand -j '[experimental] Filter programs'
cand --filter-prog '[experimental] Filter programs'
cand -i 'Filename patterns to filter out'
cand --ignore 'Filename patterns to filter out'
cand --ignore-file 'Files to load ignores from'
cand --fs-events 'Filesystem events to filter to'
cand --log-file 'Write diagnostic logs to a file'
cand --completions 'Generate a shell completions script'
cand -W 'Deprecated alias for ''--on-busy-update=do-nothing'''
cand --watch-when-idle 'Deprecated alias for ''--on-busy-update=do-nothing'''
cand --log-file 'Write diagnostic logs to a file'
cand -r 'Restart the process if it''s still running'
cand --restart 'Restart the process if it''s still running'
cand -k 'Hidden legacy shorthand for ''--signal=kill'''
cand --kill 'Hidden legacy shorthand for ''--signal=kill'''
cand --stdin-quit 'Exit when stdin closes'
cand --no-vcs-ignore 'Don''t load gitignores'
cand --no-project-ignore 'Don''t load project-local ignores'
cand --no-global-ignore 'Don''t load global ignores'
cand --no-default-ignore 'Don''t use internal default ignores'
cand --no-discover-ignore 'Don''t discover ignore files at all'
cand --ignore-nothing 'Don''t ignore anything at all'
cand -p 'Wait until first change before running command'
cand --postpone 'Wait until first change before running command'
cand -n 'Don''t use a shell'
cand --no-shell-long 'Don''t use a shell'
cand --no-environment 'Shorthand for ''--emit-events=none'''
cand -n 'Shorthand for ''--shell=none'''
cand --no-environment 'Deprecated shorthand for ''--emit-events=none'''
cand --only-emit-events 'Only emit events to stdout, run no commands'
cand --no-process-group 'Don''t use a process group'
cand -1 'Testing only: exit Watchexec after the first run'
cand -N 'Alert when commands start and end'
cand --notify 'Alert when commands start and end'
cand --timings 'Print how long the command took to run'
cand -q 'Don''t print starting and stopping messages'
cand --quiet 'Don''t print starting and stopping messages'
cand --bell 'Ring the terminal bell on command completion'
cand --no-meta 'Don''t emit fs events for metadata changes'
cand --print-events 'Print events that trigger actions'
cand --manual 'Show the manual page'
cand -v 'Set diagnostic log level'
cand --verbose 'Set diagnostic log level'
cand --manual 'Show the manual page'
cand -h 'Print help (see more with ''--help'')'
cand --help 'Print help (see more with ''--help'')'
cand -V 'Print version'

View File

@ -1,44 +1,51 @@
complete -c watchexec -s w -l watch -d 'Watch a specific file or directory' -r -F
complete -c watchexec -s c -l clear -d 'Clear screen before running command' -r -f -a "{clear ,reset }"
complete -c watchexec -s o -l on-busy-update -d 'What to do when receiving events while the command is running' -r -f -a "{queue ,do-nothing ,restart ,signal }"
complete -c watchexec -s W -l watch-non-recursive -d 'Watch a specific directory, non-recursively' -r -F
complete -c watchexec -s c -l clear -d 'Clear screen before running command' -r -f -a "{clear '',reset ''}"
complete -c watchexec -s o -l on-busy-update -d 'What to do when receiving events while the command is running' -r -f -a "{queue '',do-nothing '',restart '',signal ''}"
complete -c watchexec -s s -l signal -d 'Send a signal to the process when it\'s still running' -r
complete -c watchexec -l stop-signal -d 'Signal to send to stop the command' -r
complete -c watchexec -l stop-timeout -d 'Time to wait for the command to exit gracefully' -r
complete -c watchexec -l map-signal -d 'Translate signals from the OS to signals to send to the command' -r
complete -c watchexec -s d -l debounce -d 'Time to wait for new events before taking action' -r
complete -c watchexec -l delay-run -d 'Sleep before running the command' -r
complete -c watchexec -l poll -d 'Poll for filesystem changes' -r
complete -c watchexec -l shell -d 'Use a different shell' -r
complete -c watchexec -l emit-events-to -d 'Configure event emission' -r -f -a "{environment ,stdin ,file ,json-stdin ,json-file ,none }"
complete -c watchexec -l emit-events-to -d 'Configure event emission' -r -f -a "{environment '',stdio '',file '',json-stdio '',json-file '',none ''}"
complete -c watchexec -s E -l env -d 'Add env vars to the command' -r
complete -c watchexec -l wrap-process -d 'Configure how the process is wrapped' -r -f -a "{group '',session '',none ''}"
complete -c watchexec -l color -d 'When to use terminal colours' -r -f -a "{auto '',always '',never ''}"
complete -c watchexec -l project-origin -d 'Set the project origin' -r -f -a "(__fish_complete_directories)"
complete -c watchexec -l workdir -d 'Set the working directory' -r -f -a "(__fish_complete_directories)"
complete -c watchexec -s e -l exts -d 'Filename extensions to filter to' -r
complete -c watchexec -s f -l filter -d 'Filename patterns to filter to' -r
complete -c watchexec -l filter-file -d 'Files to load filters from' -r -F
complete -c watchexec -s j -l filter-prog -d '[experimental] Filter programs' -r
complete -c watchexec -s i -l ignore -d 'Filename patterns to filter out' -r
complete -c watchexec -l ignore-file -d 'Files to load ignores from' -r -F
complete -c watchexec -l fs-events -d 'Filesystem events to filter to' -r -f -a "{access ,create ,remove ,rename ,modify ,metadata }"
complete -c watchexec -l fs-events -d 'Filesystem events to filter to' -r -f -a "{access '',create '',remove '',rename '',modify '',metadata ''}"
complete -c watchexec -l completions -d 'Generate a shell completions script' -r -f -a "{bash '',elvish '',fish '',nu '',powershell '',zsh ''}"
complete -c watchexec -l log-file -d 'Write diagnostic logs to a file' -r -F
complete -c watchexec -l completions -d 'Generate a shell completions script' -r -f -a "{bash ,elvish ,fish ,nu ,powershell ,zsh }"
complete -c watchexec -s W -l watch-when-idle -d 'Deprecated alias for \'--on-busy-update=do-nothing\''
complete -c watchexec -s r -l restart -d 'Restart the process if it\'s still running'
complete -c watchexec -s k -l kill -d 'Hidden legacy shorthand for \'--signal=kill\''
complete -c watchexec -l stdin-quit -d 'Exit when stdin closes'
complete -c watchexec -l no-vcs-ignore -d 'Don\'t load gitignores'
complete -c watchexec -l no-project-ignore -d 'Don\'t load project-local ignores'
complete -c watchexec -l no-global-ignore -d 'Don\'t load global ignores'
complete -c watchexec -l no-default-ignore -d 'Don\'t use internal default ignores'
complete -c watchexec -l no-discover-ignore -d 'Don\'t discover ignore files at all'
complete -c watchexec -l ignore-nothing -d 'Don\'t ignore anything at all'
complete -c watchexec -s p -l postpone -d 'Wait until first change before running command'
complete -c watchexec -s n -d 'Don\'t use a shell'
complete -c watchexec -l no-shell-long -d 'Don\'t use a shell'
complete -c watchexec -l no-environment -d 'Shorthand for \'--emit-events=none\''
complete -c watchexec -s n -d 'Shorthand for \'--shell=none\''
complete -c watchexec -l no-environment -d 'Deprecated shorthand for \'--emit-events=none\''
complete -c watchexec -l only-emit-events -d 'Only emit events to stdout, run no commands'
complete -c watchexec -l no-process-group -d 'Don\'t use a process group'
complete -c watchexec -s 1 -d 'Testing only: exit Watchexec after the first run'
complete -c watchexec -s N -l notify -d 'Alert when commands start and end'
complete -c watchexec -l timings -d 'Print how long the command took to run'
complete -c watchexec -s q -l quiet -d 'Don\'t print starting and stopping messages'
complete -c watchexec -l bell -d 'Ring the terminal bell on command completion'
complete -c watchexec -l no-meta -d 'Don\'t emit fs events for metadata changes'
complete -c watchexec -l print-events -d 'Print events that trigger actions'
complete -c watchexec -s v -l verbose -d 'Set diagnostic log level'
complete -c watchexec -l manual -d 'Show the manual page'
complete -c watchexec -s v -l verbose -d 'Set diagnostic log level'
complete -c watchexec -s h -l help -d 'Print help (see more with \'--help\')'
complete -c watchexec -s V -l version -d 'Print version'

View File

@ -9,7 +9,15 @@ module completions {
}
def "nu-complete watchexec emit_events_to" [] {
[ "environment" "stdin" "file" "json-stdin" "json-file" "none" ]
[ "environment" "stdio" "file" "json-stdio" "json-file" "none" ]
}
def "nu-complete watchexec wrap_process" [] {
[ "group" "session" "none" ]
}
def "nu-complete watchexec color" [] {
[ "auto" "always" "never" ]
}
def "nu-complete watchexec filter_fs_events" [] {
@ -24,14 +32,14 @@ module completions {
export extern watchexec [
...command: string # Command to run on changes
--watch(-w): string # Watch a specific file or directory
--watch-non-recursive(-W): string # Watch a specific directory, non-recursively
--clear(-c): string@"nu-complete watchexec screen_clear" # Clear screen before running command
--on-busy-update(-o): string@"nu-complete watchexec on_busy_update" # What to do when receiving events while the command is running
--watch-when-idle(-W) # Deprecated alias for '--on-busy-update=do-nothing'
--restart(-r) # Restart the process if it's still running
--signal(-s): string # Send a signal to the process when it's still running
--kill(-k) # Hidden legacy shorthand for '--signal=kill'
--stop-signal: string # Signal to send to stop the command
--stop-timeout: string # Time to wait for the command to exit gracefully
--map-signal: string # Translate signals from the OS to signals to send to the command
--debounce(-d): string # Time to wait for new events before taking action
--stdin-quit # Exit when stdin closes
--no-vcs-ignore # Don't load gitignores
@ -39,36 +47,43 @@ module completions {
--no-global-ignore # Don't load global ignores
--no-default-ignore # Don't use internal default ignores
--no-discover-ignore # Don't discover ignore files at all
--ignore-nothing # Don't ignore anything at all
--postpone(-p) # Wait until first change before running command
--delay-run: string # Sleep before running the command
--poll: string # Poll for filesystem changes
--shell: string # Use a different shell
-n # Don't use a shell
--no-shell-long # Don't use a shell
--no-environment # Shorthand for '--emit-events=none'
-n # Shorthand for '--shell=none'
--no-environment # Deprecated shorthand for '--emit-events=none'
--emit-events-to: string@"nu-complete watchexec emit_events_to" # Configure event emission
--only-emit-events # Only emit events to stdout, run no commands
--env(-E): string # Add env vars to the command
--no-process-group # Don't use a process group
--wrap-process: string@"nu-complete watchexec wrap_process" # Configure how the process is wrapped
-1 # Testing only: exit Watchexec after the first run
--notify(-N) # Alert when commands start and end
--color: string@"nu-complete watchexec color" # When to use terminal colours
--timings # Print how long the command took to run
--quiet(-q) # Don't print starting and stopping messages
--bell # Ring the terminal bell on command completion
--project-origin: string # Set the project origin
--workdir: string # Set the working directory
--exts(-e): string # Filename extensions to filter to
--filter(-f): string # Filename patterns to filter to
--filter-file: string # Files to load filters from
--filter-prog(-j): string # [experimental] Filter programs
--ignore(-i): string # Filename patterns to filter out
--ignore-file: string # Files to load ignores from
--fs-events: string@"nu-complete watchexec filter_fs_events" # Filesystem events to filter to
--no-meta # Don't emit fs events for metadata changes
--print-events # Print events that trigger actions
--verbose(-v) # Set diagnostic log level
--log-file: string # Write diagnostic logs to a file
--manual # Show the manual page
--completions: string@"nu-complete watchexec completions" # Generate a shell completions script
--verbose(-v) # Set diagnostic log level
--log-file: string # Write diagnostic logs to a file
--help(-h) # Print help (see more with '--help')
--version(-V) # Print version
]
}
use completions *
export use completions *

View File

@ -23,6 +23,8 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
'watchexec' {
[CompletionResult]::new('-w', 'w', [CompletionResultType]::ParameterName, 'Watch a specific file or directory')
[CompletionResult]::new('--watch', 'watch', [CompletionResultType]::ParameterName, 'Watch a specific file or directory')
[CompletionResult]::new('-W', 'W ', [CompletionResultType]::ParameterName, 'Watch a specific directory, non-recursively')
[CompletionResult]::new('--watch-non-recursive', 'watch-non-recursive', [CompletionResultType]::ParameterName, 'Watch a specific directory, non-recursively')
[CompletionResult]::new('-c', 'c', [CompletionResultType]::ParameterName, 'Clear screen before running command')
[CompletionResult]::new('--clear', 'clear', [CompletionResultType]::ParameterName, 'Clear screen before running command')
[CompletionResult]::new('-o', 'o', [CompletionResultType]::ParameterName, 'What to do when receiving events while the command is running')
@ -31,6 +33,7 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('--signal', 'signal', [CompletionResultType]::ParameterName, 'Send a signal to the process when it''s still running')
[CompletionResult]::new('--stop-signal', 'stop-signal', [CompletionResultType]::ParameterName, 'Signal to send to stop the command')
[CompletionResult]::new('--stop-timeout', 'stop-timeout', [CompletionResultType]::ParameterName, 'Time to wait for the command to exit gracefully')
[CompletionResult]::new('--map-signal', 'map-signal', [CompletionResultType]::ParameterName, 'Translate signals from the OS to signals to send to the command')
[CompletionResult]::new('-d', 'd', [CompletionResultType]::ParameterName, 'Time to wait for new events before taking action')
[CompletionResult]::new('--debounce', 'debounce', [CompletionResultType]::ParameterName, 'Time to wait for new events before taking action')
[CompletionResult]::new('--delay-run', 'delay-run', [CompletionResultType]::ParameterName, 'Sleep before running the command')
@ -39,6 +42,8 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('--emit-events-to', 'emit-events-to', [CompletionResultType]::ParameterName, 'Configure event emission')
[CompletionResult]::new('-E', 'E ', [CompletionResultType]::ParameterName, 'Add env vars to the command')
[CompletionResult]::new('--env', 'env', [CompletionResultType]::ParameterName, 'Add env vars to the command')
[CompletionResult]::new('--wrap-process', 'wrap-process', [CompletionResultType]::ParameterName, 'Configure how the process is wrapped')
[CompletionResult]::new('--color', 'color', [CompletionResultType]::ParameterName, 'When to use terminal colours')
[CompletionResult]::new('--project-origin', 'project-origin', [CompletionResultType]::ParameterName, 'Set the project origin')
[CompletionResult]::new('--workdir', 'workdir', [CompletionResultType]::ParameterName, 'Set the working directory')
[CompletionResult]::new('-e', 'e', [CompletionResultType]::ParameterName, 'Filename extensions to filter to')
@ -46,38 +51,41 @@ Register-ArgumentCompleter -Native -CommandName 'watchexec' -ScriptBlock {
[CompletionResult]::new('-f', 'f', [CompletionResultType]::ParameterName, 'Filename patterns to filter to')
[CompletionResult]::new('--filter', 'filter', [CompletionResultType]::ParameterName, 'Filename patterns to filter to')
[CompletionResult]::new('--filter-file', 'filter-file', [CompletionResultType]::ParameterName, 'Files to load filters from')
[CompletionResult]::new('-j', 'j', [CompletionResultType]::ParameterName, '[experimental] Filter programs')
[CompletionResult]::new('--filter-prog', 'filter-prog', [CompletionResultType]::ParameterName, '[experimental] Filter programs')
[CompletionResult]::new('-i', 'i', [CompletionResultType]::ParameterName, 'Filename patterns to filter out')
[CompletionResult]::new('--ignore', 'ignore', [CompletionResultType]::ParameterName, 'Filename patterns to filter out')
[CompletionResult]::new('--ignore-file', 'ignore-file', [CompletionResultType]::ParameterName, 'Files to load ignores from')
[CompletionResult]::new('--fs-events', 'fs-events', [CompletionResultType]::ParameterName, 'Filesystem events to filter to')
[CompletionResult]::new('--log-file', 'log-file', [CompletionResultType]::ParameterName, 'Write diagnostic logs to a file')
[CompletionResult]::new('--completions', 'completions', [CompletionResultType]::ParameterName, 'Generate a shell completions script')
[CompletionResult]::new('-W', 'W ', [CompletionResultType]::ParameterName, 'Deprecated alias for ''--on-busy-update=do-nothing''')
[CompletionResult]::new('--watch-when-idle', 'watch-when-idle', [CompletionResultType]::ParameterName, 'Deprecated alias for ''--on-busy-update=do-nothing''')
[CompletionResult]::new('--log-file', 'log-file', [CompletionResultType]::ParameterName, 'Write diagnostic logs to a file')
[CompletionResult]::new('-r', 'r', [CompletionResultType]::ParameterName, 'Restart the process if it''s still running')
[CompletionResult]::new('--restart', 'restart', [CompletionResultType]::ParameterName, 'Restart the process if it''s still running')
[CompletionResult]::new('-k', 'k', [CompletionResultType]::ParameterName, 'Hidden legacy shorthand for ''--signal=kill''')
[CompletionResult]::new('--kill', 'kill', [CompletionResultType]::ParameterName, 'Hidden legacy shorthand for ''--signal=kill''')
[CompletionResult]::new('--stdin-quit', 'stdin-quit', [CompletionResultType]::ParameterName, 'Exit when stdin closes')
[CompletionResult]::new('--no-vcs-ignore', 'no-vcs-ignore', [CompletionResultType]::ParameterName, 'Don''t load gitignores')
[CompletionResult]::new('--no-project-ignore', 'no-project-ignore', [CompletionResultType]::ParameterName, 'Don''t load project-local ignores')
[CompletionResult]::new('--no-global-ignore', 'no-global-ignore', [CompletionResultType]::ParameterName, 'Don''t load global ignores')
[CompletionResult]::new('--no-default-ignore', 'no-default-ignore', [CompletionResultType]::ParameterName, 'Don''t use internal default ignores')
[CompletionResult]::new('--no-discover-ignore', 'no-discover-ignore', [CompletionResultType]::ParameterName, 'Don''t discover ignore files at all')
[CompletionResult]::new('--ignore-nothing', 'ignore-nothing', [CompletionResultType]::ParameterName, 'Don''t ignore anything at all')
[CompletionResult]::new('-p', 'p', [CompletionResultType]::ParameterName, 'Wait until first change before running command')
[CompletionResult]::new('--postpone', 'postpone', [CompletionResultType]::ParameterName, 'Wait until first change before running command')
[CompletionResult]::new('-n', 'n', [CompletionResultType]::ParameterName, 'Don''t use a shell')
[CompletionResult]::new('--no-shell-long', 'no-shell-long', [CompletionResultType]::ParameterName, 'Don''t use a shell')
[CompletionResult]::new('--no-environment', 'no-environment', [CompletionResultType]::ParameterName, 'Shorthand for ''--emit-events=none''')
[CompletionResult]::new('-n', 'n', [CompletionResultType]::ParameterName, 'Shorthand for ''--shell=none''')
[CompletionResult]::new('--no-environment', 'no-environment', [CompletionResultType]::ParameterName, 'Deprecated shorthand for ''--emit-events=none''')
[CompletionResult]::new('--only-emit-events', 'only-emit-events', [CompletionResultType]::ParameterName, 'Only emit events to stdout, run no commands')
[CompletionResult]::new('--no-process-group', 'no-process-group', [CompletionResultType]::ParameterName, 'Don''t use a process group')
[CompletionResult]::new('-1', '1', [CompletionResultType]::ParameterName, 'Testing only: exit Watchexec after the first run')
[CompletionResult]::new('-N', 'N ', [CompletionResultType]::ParameterName, 'Alert when commands start and end')
[CompletionResult]::new('--notify', 'notify', [CompletionResultType]::ParameterName, 'Alert when commands start and end')
[CompletionResult]::new('--timings', 'timings', [CompletionResultType]::ParameterName, 'Print how long the command took to run')
[CompletionResult]::new('-q', 'q', [CompletionResultType]::ParameterName, 'Don''t print starting and stopping messages')
[CompletionResult]::new('--quiet', 'quiet', [CompletionResultType]::ParameterName, 'Don''t print starting and stopping messages')
[CompletionResult]::new('--bell', 'bell', [CompletionResultType]::ParameterName, 'Ring the terminal bell on command completion')
[CompletionResult]::new('--no-meta', 'no-meta', [CompletionResultType]::ParameterName, 'Don''t emit fs events for metadata changes')
[CompletionResult]::new('--print-events', 'print-events', [CompletionResultType]::ParameterName, 'Print events that trigger actions')
[CompletionResult]::new('--manual', 'manual', [CompletionResultType]::ParameterName, 'Show the manual page')
[CompletionResult]::new('-v', 'v', [CompletionResultType]::ParameterName, 'Set diagnostic log level')
[CompletionResult]::new('--verbose', 'verbose', [CompletionResultType]::ParameterName, 'Set diagnostic log level')
[CompletionResult]::new('--manual', 'manual', [CompletionResultType]::ParameterName, 'Show the manual page')
[CompletionResult]::new('-h', 'h', [CompletionResultType]::ParameterName, 'Print help (see more with ''--help'')')
[CompletionResult]::new('--help', 'help', [CompletionResultType]::ParameterName, 'Print help (see more with ''--help'')')
[CompletionResult]::new('-V', 'V ', [CompletionResultType]::ParameterName, 'Print version')

View File

@ -17,22 +17,27 @@ _watchexec() {
_arguments "${_arguments_options[@]}" \
'*-w+[Watch a specific file or directory]:PATH:_files' \
'*--watch=[Watch a specific file or directory]:PATH:_files' \
'*-W+[Watch a specific directory, non-recursively]:PATH:_files' \
'*--watch-non-recursive=[Watch a specific directory, non-recursively]:PATH:_files' \
'-c+[Clear screen before running command]' \
'--clear=[Clear screen before running command]' \
'-o+[What to do when receiving events while the command is running]:MODE:(queue do-nothing restart signal)' \
'--on-busy-update=[What to do when receiving events while the command is running]:MODE:(queue do-nothing restart signal)' \
'(-r --restart -W --watch-when-idle)-s+[Send a signal to the process when it'\''s still running]:SIGNAL: ' \
'(-r --restart -W --watch-when-idle)--signal=[Send a signal to the process when it'\''s still running]:SIGNAL: ' \
'(-r --restart)-s+[Send a signal to the process when it'\''s still running]:SIGNAL: ' \
'(-r --restart)--signal=[Send a signal to the process when it'\''s still running]:SIGNAL: ' \
'--stop-signal=[Signal to send to stop the command]:SIGNAL: ' \
'--stop-timeout=[Time to wait for the command to exit gracefully]:TIMEOUT: ' \
'*--map-signal=[Translate signals from the OS to signals to send to the command]:SIGNAL:SIGNAL: ' \
'-d+[Time to wait for new events before taking action]:TIMEOUT: ' \
'--debounce=[Time to wait for new events before taking action]:TIMEOUT: ' \
'--delay-run=[Sleep before running the command]:DURATION: ' \
'--poll=[Poll for filesystem changes]' \
'--shell=[Use a different shell]:SHELL: ' \
'--emit-events-to=[Configure event emission]:MODE:(environment stdin file json-stdin json-file none)' \
'--emit-events-to=[Configure event emission]:MODE:(environment stdio file json-stdio json-file none)' \
'*-E+[Add env vars to the command]:KEY=VALUE: ' \
'*--env=[Add env vars to the command]:KEY=VALUE: ' \
'--wrap-process=[Configure how the process is wrapped]:MODE:(group session none)' \
'--color=[When to use terminal colours]:MODE:(auto always never)' \
'--project-origin=[Set the project origin]:DIRECTORY:_files -/' \
'--workdir=[Set the working directory]:DIRECTORY:_files -/' \
'*-e+[Filename extensions to filter to]:EXTENSIONS: ' \
@ -40,38 +45,41 @@ _watchexec() {
'*-f+[Filename patterns to filter to]:PATTERN: ' \
'*--filter=[Filename patterns to filter to]:PATTERN: ' \
'*--filter-file=[Files to load filters from]:PATH:_files' \
'*-j+[\[experimental\] Filter programs]:EXPRESSION: ' \
'*--filter-prog=[\[experimental\] Filter programs]:EXPRESSION: ' \
'*-i+[Filename patterns to filter out]:PATTERN: ' \
'*--ignore=[Filename patterns to filter out]:PATTERN: ' \
'*--ignore-file=[Files to load ignores from]:PATH:_files' \
'*--fs-events=[Filesystem events to filter to]:EVENTS:(access create remove rename modify metadata)' \
'--log-file=[Write diagnostic logs to a file]' \
'(--manual)--completions=[Generate a shell completions script]:COMPLETIONS:(bash elvish fish nu powershell zsh)' \
'(-o --on-busy-update -r --restart)-W[Deprecated alias for '\''--on-busy-update=do-nothing'\'']' \
'(-o --on-busy-update -r --restart)--watch-when-idle[Deprecated alias for '\''--on-busy-update=do-nothing'\'']' \
'(-o --on-busy-update -W --watch-when-idle)-r[Restart the process if it'\''s still running]' \
'(-o --on-busy-update -W --watch-when-idle)--restart[Restart the process if it'\''s still running]' \
'-k[Hidden legacy shorthand for '\''--signal=kill'\'']' \
'--kill[Hidden legacy shorthand for '\''--signal=kill'\'']' \
'--log-file=[Write diagnostic logs to a file]' \
'(-o --on-busy-update)-r[Restart the process if it'\''s still running]' \
'(-o --on-busy-update)--restart[Restart the process if it'\''s still running]' \
'--stdin-quit[Exit when stdin closes]' \
'--no-vcs-ignore[Don'\''t load gitignores]' \
'--no-project-ignore[Don'\''t load project-local ignores]' \
'--no-global-ignore[Don'\''t load global ignores]' \
'--no-default-ignore[Don'\''t use internal default ignores]' \
'--no-discover-ignore[Don'\''t discover ignore files at all]' \
'--ignore-nothing[Don'\''t ignore anything at all]' \
'-p[Wait until first change before running command]' \
'--postpone[Wait until first change before running command]' \
'-n[Don'\''t use a shell]' \
'--no-shell-long[Don'\''t use a shell]' \
'--no-environment[Shorthand for '\''--emit-events=none'\'']' \
'-n[Shorthand for '\''--shell=none'\'']' \
'--no-environment[Deprecated shorthand for '\''--emit-events=none'\'']' \
'(--completions --manual)--only-emit-events[Only emit events to stdout, run no commands]' \
'--no-process-group[Don'\''t use a process group]' \
'-1[Testing only\: exit Watchexec after the first run]' \
'-N[Alert when commands start and end]' \
'--notify[Alert when commands start and end]' \
'--timings[Print how long the command took to run]' \
'-q[Don'\''t print starting and stopping messages]' \
'--quiet[Don'\''t print starting and stopping messages]' \
'--bell[Ring the terminal bell on command completion]' \
'(--fs-events)--no-meta[Don'\''t emit fs events for metadata changes]' \
'--print-events[Print events that trigger actions]' \
'(--completions)--manual[Show the manual page]' \
'*-v[Set diagnostic log level]' \
'*--verbose[Set diagnostic log level]' \
'(--completions)--manual[Show the manual page]' \
'-h[Print help (see more with '\''--help'\'')]' \
'--help[Print help (see more with '\''--help'\'')]' \
'-V[Print version]' \

View File

@ -2,9 +2,21 @@
## Next (YYYY-MM-DD)
## v1.1.0 (2024-05-16)
- Add `git-describe` support (#832, by @lu-zero)
## v1.0.3 (2024-04-20)
- Deps: gix 0.62
## v1.0.2 (2023-11-26)
- Deps: upgrade to gix 0.55
## v1.0.1 (2023-07-02)
- Dep: upgrade to gix 0.44
- Deps: upgrade to gix 0.44
## v1.0.0 (2023-03-05)

View File

@ -1,6 +1,6 @@
[package]
name = "bosion"
version = "1.0.1"
version = "1.1.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0 OR MIT"
@ -15,13 +15,14 @@ rust-version = "1.64.0"
edition = "2021"
[dependencies.time]
version = "0.3.20"
version = "0.3.30"
features = ["macros", "formatting"]
[dependencies.gix]
version = "0.48"
version = "0.62.0"
optional = true
default-features = false
features = ["revision"]
[features]
default = ["git", "reproducible", "std"]

View File

@ -15,7 +15,7 @@ In your `Cargo.toml`:
```toml
[build-dependencies]
bosion = "1.0.1"
bosion = "1.1.0"
```
In your `build.rs`:
@ -33,7 +33,7 @@ include!(env!("BOSION_PATH"));
fn main() {
// default output, like rustc -Vv
println!("{}", Bosion::long_version());
println!("{}", Bosion::LONG_VERSION);
// with additional fields
println!("{}", Bosion::long_version_with(&[

File diff suppressed because it is too large

View File

@ -13,6 +13,9 @@ struct Args {
#[clap(long)]
dates: bool,
#[clap(long)]
describe: bool,
}
fn main() {
@ -23,17 +26,15 @@ fn main() {
"{}",
Bosion::long_version_with(&[("extra", "field"), ("custom", "1.2.3"),])
);
} else
if args.features {
} else if args.features {
println!("Features: {}", Bosion::CRATE_FEATURE_STRING);
} else
if args.dates {
} else if args.dates {
println!("commit date: {}", Bosion::GIT_COMMIT_DATE);
println!("commit datetime: {}", Bosion::GIT_COMMIT_DATETIME);
println!("build date: {}", Bosion::BUILD_DATE);
println!("build datetime: {}", Bosion::BUILD_DATETIME);
} else if args.describe {
println!("commit description: {}", Bosion::GIT_COMMIT_DESCRIPTION);
} else {
println!("{}", Bosion::LONG_VERSION);
}

File diff suppressed because it is too large

View File

@ -15,6 +15,6 @@ version = "*"
path = "../.."
[dependencies]
leon = { version = "0.0.1", default-features = false }
snapbox = "0.4.8"
time = { version = "0.3.20", features = ["formatting", "macros"] }
leon = { version = "2.0.1", default-features = false }
snapbox = "0.5.9"
time = { version = "0.3.30", features = ["formatting", "macros"] }

View File

@ -3,14 +3,56 @@
version = 3
[[package]]
name = "bitflags"
version = "1.3.2"
name = "anstream"
version = "0.6.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
checksum = "d96bd03f33fe50a863e394ee9718a706f988b9079b20c3784fb726e7678b62fb"
dependencies = [
"anstyle",
"anstyle-parse",
"anstyle-query",
"anstyle-wincon",
"colorchoice",
"utf8parse",
]
[[package]]
name = "anstyle"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8901269c6307e8d93993578286ac0edf7f195079ffff5ebdeea6a59ffb7e36bc"
[[package]]
name = "anstyle-parse"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c75ac65da39e5fe5ab759307499ddad880d724eed2f6ce5b5e8a26f4f387928c"
dependencies = [
"utf8parse",
]
[[package]]
name = "anstyle-query"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e28923312444cdd728e4738b3f9c9cac739500909bb3d3c94b43551b16517648"
dependencies = [
"windows-sys",
]
[[package]]
name = "anstyle-wincon"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1cd54b81ec8d6180e24654d0b371ad22fc3dd083b6ff8ba325b72e00c87660a7"
dependencies = [
"anstyle",
"windows-sys",
]
[[package]]
name = "bosion"
version = "1.0.0"
version = "1.0.2"
dependencies = [
"time",
]
@ -26,104 +68,35 @@ dependencies = [
]
[[package]]
name = "cc"
version = "1.0.79"
name = "colorchoice"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50d30906286121d95be3d479533b458f87493b30a4b5f79a607db8f5d11aa91f"
checksum = "acbf1af155f9b9ef647e42cdc158db4b64a1b61f743629225fde6f3e0be2a7c7"
[[package]]
name = "concolor"
version = "0.0.12"
name = "deranged"
version = "0.3.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7b3e3c41e9488eeda196b6806dbf487742107d61b2e16485bcca6c25ed5755b"
checksum = "b42b6fa04a440b495c8b04d0e71b707c585f83cb9cb28cf8cd0d976c315e31b4"
dependencies = [
"bitflags",
"concolor-query",
"is-terminal",
]
[[package]]
name = "concolor-query"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82a90734b3d5dcf656e7624cca6bce9c3a90ee11f900e80141a7427ccfb3d317"
[[package]]
name = "errno"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f639046355ee4f37944e44f60642c6f3a7efa3cf6b78c78a0d989a8ce6c396a1"
dependencies = [
"errno-dragonfly",
"libc",
"winapi",
]
[[package]]
name = "errno-dragonfly"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aa68f1b12764fab894d2755d2518754e71b4fd80ecfb822714a1206c2aab39bf"
dependencies = [
"cc",
"libc",
]
[[package]]
name = "hermit-abi"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fed44880c466736ef9a5c5b5facefb5ed0785676d0c02d612db14e54f0d84286"
[[package]]
name = "io-lifetimes"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1abeb7a0dd0f8181267ff8adc397075586500b81b28a73e8a0208b00fc170fb3"
dependencies = [
"libc",
"windows-sys",
]
[[package]]
name = "is-terminal"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21b6b32576413a8e69b90e952e4a026476040d81017b80445deda5f2d3921857"
dependencies = [
"hermit-abi",
"io-lifetimes",
"rustix",
"windows-sys",
"powerfmt",
]
[[package]]
name = "itoa"
version = "1.0.6"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "453ad9f582a441959e5f0d088b02ce04cfe8d51a8eaf077f12ac6d3e94164ca6"
checksum = "49f1f14873335454500d59611f1cf4a4b0f786f9ac11f4312a78e4cf2566695b"
[[package]]
name = "leon"
version = "0.0.1"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1afa3794684c32f91a5aa105e5109743bc6f2999a869c28fffa40aeffa30cfd0"
checksum = "52df920dfe9751d43501ff2ee12dd81c457d9e810d3f64b5200ee461fe73800b"
dependencies = [
"thiserror",
]
[[package]]
name = "libc"
version = "0.2.139"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "201de327520df007757c1f0adce6e827fe8562fbc28bfd9c15571c66ca1f5f79"
[[package]]
name = "linux-raw-sys"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f051f77a7c8e6957c0696eac88f26b0117e54f52d3fc682ab19397a8812846a4"
[[package]]
name = "normalize-line-endings"
version = "0.3.0"
@ -131,73 +104,88 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "61807f77802ff30975e01f4f071c8ba10c022052f98b3294119f3e615d13e5be"
[[package]]
name = "proc-macro2"
version = "1.0.51"
name = "num-conv"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5d727cae5b39d21da60fa540906919ad737832fe0b1c165da3a34d6548c849d6"
checksum = "51d515d32fb182ee37cda2ccdcb92950d6a3c2893aa280e540671c2cd0f3b1d9"
[[package]]
name = "powerfmt"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "439ee305def115ba05938db6eb1644ff94165c5ab5e9420d1c1bcedbba909391"
[[package]]
name = "proc-macro2"
version = "1.0.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3d1597b0c024618f09a9c3b8655b7e430397a36d23fdafec26d6965e9eec3eba"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.23"
version = "1.0.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8856d8364d252a14d474036ea1358d63c9e6965c8e5c1885c18f73d70bff9c7b"
checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7"
dependencies = [
"proc-macro2",
]
[[package]]
name = "rustix"
version = "0.36.9"
name = "serde"
version = "1.0.198"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd5c6ff11fecd55b40746d1995a02f2eb375bf8c00d192d521ee09f42bef37bc"
checksum = "9846a40c979031340571da2545a4e5b7c4163bdae79b301d5f86d03979451fcc"
dependencies = [
"bitflags",
"errno",
"io-lifetimes",
"libc",
"linux-raw-sys",
"windows-sys",
"serde_derive",
]
[[package]]
name = "serde"
version = "1.0.152"
name = "serde_derive"
version = "1.0.198"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bb7d1f0d3021d347a83e556fc4683dea2ea09d87bccdf88ff5c12545d89d5efb"
checksum = "e88edab869b01783ba905e7d0153f9fc1a6505a96e4ad3018011eedb838566d9"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "similar"
version = "2.2.1"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "420acb44afdae038210c99e69aae24109f32f15500aa708e81d46c9f29d55fcf"
checksum = "fa42c91313f1d05da9b26f267f931cf178d4aba455b4c4622dd7355eb80c6640"
[[package]]
name = "snapbox"
version = "0.4.8"
version = "0.5.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4389a6395e9925166f19d67b64874e526ec28a4b8455f3321b686c912299c3ea"
checksum = "8ac441e1ecf678f68423d47f376d53fabce1afba92c8f68e31508eb27df8562a"
dependencies = [
"concolor",
"anstream",
"anstyle",
"normalize-line-endings",
"similar",
"snapbox-macros",
"yansi",
]
[[package]]
name = "snapbox-macros"
version = "0.3.1"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "485e65c1203eb37244465e857d15a26d3a85a5410648ccb53b18bd44cb3a7336"
checksum = "e1c4b838b05d15ab22754068cb73500b2f3b07bf09d310e15b27f88160f1de40"
dependencies = [
"anstream",
]
[[package]]
name = "syn"
version = "1.0.109"
version = "2.0.60"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72b64191b275b66ffe2469e8af2c1cfe3bafa67b529ead792a6d0160888b4237"
checksum = "909518bc7b1c9b779f1bbf07f2929d35af9f0f37e47c6e9ef7f9dddc1e1821f3"
dependencies = [
"proc-macro2",
"quote",
@ -206,18 +194,18 @@ dependencies = [
[[package]]
name = "thiserror"
version = "1.0.38"
version = "1.0.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a9cd18aa97d5c45c6603caea1da6628790b37f7a34b6ca89522331c5180fed0"
checksum = "03468839009160513471e86a034bb2c5c0e4baae3b43f79ffc55c4a5427b3297"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.38"
version = "1.0.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fb327af4685e4d03fa8cbcf1716380da910eeb2bb8be417e7f9fd3fb164f36f"
checksum = "c61f3ba182994efc43764a46c018c347bc492c79f024e705f46567b418f6d4f7"
dependencies = [
"proc-macro2",
"quote",
@ -226,11 +214,14 @@ dependencies = [
[[package]]
name = "time"
version = "0.3.20"
version = "0.3.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd0cbfecb4d19b5ea75bb31ad904eb5b9fa13f21079c3b92017ebdf4999a5890"
checksum = "5dfd88e563464686c916c7e46e623e520ddc6d79fa6641390f2e3fa86e83e885"
dependencies = [
"deranged",
"itoa",
"num-conv",
"powerfmt",
"serde",
"time-core",
"time-macros",
@ -238,65 +229,51 @@ dependencies = [
[[package]]
name = "time-core"
version = "0.1.0"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e153e1f1acaef8acc537e68b44906d2db6436e2b35ac2c6b42640fff91f00fd"
checksum = "ef927ca75afb808a4d64dd374f00a2adf8d0fcff8e7b184af886c3c87ec4a3f3"
[[package]]
name = "time-macros"
version = "0.2.8"
version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd80a657e71da814b8e5d60d3374fc6d35045062245d80224748ae522dd76f36"
checksum = "3f252a68540fde3a3877aeea552b832b40ab9a69e318efd078774a01ddee1ccf"
dependencies = [
"num-conv",
"time-core",
]
[[package]]
name = "unicode-ident"
version = "1.0.7"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "775c11906edafc97bc378816b94585fbd9a054eabaf86fdd0ced94af449efab7"
checksum = "3354b9ac3fae1ff6755cb6db53683adb661634f67557942dea4facebec0fee4b"
[[package]]
name = "winapi"
version = "0.3.9"
name = "utf8parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419"
dependencies = [
"winapi-i686-pc-windows-gnu",
"winapi-x86_64-pc-windows-gnu",
]
[[package]]
name = "winapi-i686-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6"
[[package]]
name = "winapi-x86_64-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f"
checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "windows-sys"
version = "0.45.0"
version = "0.52.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75283be5efb2831d37ea142365f009c02ec203cd29a3ebecbc093d52315b66d0"
checksum = "282be5f36a8ce781fad8c8ae18fa3f9beff57ec1b52cb3de0789201425d9a33d"
dependencies = [
"windows-targets",
]
[[package]]
name = "windows-targets"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e2522491fbfcd58cc84d47aeb2958948c4b8982e9a2d8a2a35bbaed431390e7"
checksum = "6f0713a46559409d202e70e28227288446bf7841d3211583a4b53e3f6d96e7eb"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_gnullvm",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
@ -305,48 +282,48 @@ dependencies = [
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c9864e83243fdec7fc9c5444389dcbbfd258f745e7853198f365e3c4968a608"
checksum = "7088eed71e8b8dda258ecc8bac5fb1153c5cffaf2578fc8ff5d61e23578d3263"
[[package]]
name = "windows_aarch64_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c8b1b673ffc16c47a9ff48570a9d85e25d265735c503681332589af6253c6c7"
checksum = "9985fd1504e250c615ca5f281c3f7a6da76213ebd5ccc9561496568a2752afb6"
[[package]]
name = "windows_i686_gnu"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "de3887528ad530ba7bdbb1faa8275ec7a1155a45ffa57c37993960277145d640"
checksum = "88ba073cf16d5372720ec942a8ccbf61626074c6d4dd2e745299726ce8b89670"
[[package]]
name = "windows_i686_gnullvm"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87f4261229030a858f36b459e748ae97545d6f1ec60e5e0d6a3d32e0dc232ee9"
[[package]]
name = "windows_i686_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf4d1122317eddd6ff351aa852118a2418ad4214e6613a50e0191f7004372605"
checksum = "db3c2bf3d13d5b658be73463284eaf12830ac9a26a90c717b7f771dfe97487bf"
[[package]]
name = "windows_x86_64_gnu"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1040f221285e17ebccbc2591ffdc2d44ee1f9186324dd3e84e99ac68d699c45"
checksum = "4e4246f76bdeff09eb48875a0fd3e2af6aada79d409d33011886d3e1581517d9"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "628bfdf232daa22b0d64fdb62b09fcc36bb01f05a3939e20ab73aaf9470d0463"
checksum = "852298e482cd67c356ddd9570386e2862b5673c85bd5f88df9ab6802b334c596"
[[package]]
name = "windows_x86_64_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "447660ad36a13288b1db4d4248e857b510e8c3a225c822ba4fb748c0aafecffd"
[[package]]
name = "yansi"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09041cd90cf85f7f8b2df60c646f853b7f535ce68f85244eb6731cf89fa498ec"
checksum = "bec47e5bfd1bff0eeaf6d8b485cc1074891a197ab4225d504cb7a1ab88b02bf0"

View File

@ -17,6 +17,6 @@ default-features = false
features = ["std"]
[dependencies]
leon = { version = "0.0.1", default-features = false }
snapbox = "0.4.8"
time = { version = "0.3.20", features = ["formatting", "macros"] }
leon = { version = "2.0.1", default-features = false }
snapbox = "0.5.9"
time = { version = "0.3.30", features = ["formatting", "macros"] }

View File

@ -3,14 +3,56 @@
version = 3
[[package]]
name = "bitflags"
version = "1.3.2"
name = "anstream"
version = "0.6.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
checksum = "d96bd03f33fe50a863e394ee9718a706f988b9079b20c3784fb726e7678b62fb"
dependencies = [
"anstyle",
"anstyle-parse",
"anstyle-query",
"anstyle-wincon",
"colorchoice",
"utf8parse",
]
[[package]]
name = "anstyle"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8901269c6307e8d93993578286ac0edf7f195079ffff5ebdeea6a59ffb7e36bc"
[[package]]
name = "anstyle-parse"
version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c75ac65da39e5fe5ab759307499ddad880d724eed2f6ce5b5e8a26f4f387928c"
dependencies = [
"utf8parse",
]
[[package]]
name = "anstyle-query"
version = "1.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e28923312444cdd728e4738b3f9c9cac739500909bb3d3c94b43551b16517648"
dependencies = [
"windows-sys",
]
[[package]]
name = "anstyle-wincon"
version = "3.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1cd54b81ec8d6180e24654d0b371ad22fc3dd083b6ff8ba325b72e00c87660a7"
dependencies = [
"anstyle",
"windows-sys",
]
[[package]]
name = "bosion"
version = "1.0.0"
version = "1.0.2"
dependencies = [
"time",
]
@ -26,104 +68,35 @@ dependencies = [
]
[[package]]
name = "cc"
version = "1.0.79"
name = "colorchoice"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50d30906286121d95be3d479533b458f87493b30a4b5f79a607db8f5d11aa91f"
checksum = "acbf1af155f9b9ef647e42cdc158db4b64a1b61f743629225fde6f3e0be2a7c7"
[[package]]
name = "concolor"
version = "0.0.12"
name = "deranged"
version = "0.3.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f7b3e3c41e9488eeda196b6806dbf487742107d61b2e16485bcca6c25ed5755b"
checksum = "b42b6fa04a440b495c8b04d0e71b707c585f83cb9cb28cf8cd0d976c315e31b4"
dependencies = [
"bitflags",
"concolor-query",
"is-terminal",
]
[[package]]
name = "concolor-query"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82a90734b3d5dcf656e7624cca6bce9c3a90ee11f900e80141a7427ccfb3d317"
[[package]]
name = "errno"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f639046355ee4f37944e44f60642c6f3a7efa3cf6b78c78a0d989a8ce6c396a1"
dependencies = [
"errno-dragonfly",
"libc",
"winapi",
]
[[package]]
name = "errno-dragonfly"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aa68f1b12764fab894d2755d2518754e71b4fd80ecfb822714a1206c2aab39bf"
dependencies = [
"cc",
"libc",
]
[[package]]
name = "hermit-abi"
version = "0.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fed44880c466736ef9a5c5b5facefb5ed0785676d0c02d612db14e54f0d84286"
[[package]]
name = "io-lifetimes"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1abeb7a0dd0f8181267ff8adc397075586500b81b28a73e8a0208b00fc170fb3"
dependencies = [
"libc",
"windows-sys",
]
[[package]]
name = "is-terminal"
version = "0.4.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21b6b32576413a8e69b90e952e4a026476040d81017b80445deda5f2d3921857"
dependencies = [
"hermit-abi",
"io-lifetimes",
"rustix",
"windows-sys",
"powerfmt",
]
[[package]]
name = "itoa"
version = "1.0.6"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "453ad9f582a441959e5f0d088b02ce04cfe8d51a8eaf077f12ac6d3e94164ca6"
checksum = "49f1f14873335454500d59611f1cf4a4b0f786f9ac11f4312a78e4cf2566695b"
[[package]]
name = "leon"
version = "0.0.1"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1afa3794684c32f91a5aa105e5109743bc6f2999a869c28fffa40aeffa30cfd0"
checksum = "52df920dfe9751d43501ff2ee12dd81c457d9e810d3f64b5200ee461fe73800b"
dependencies = [
"thiserror",
]
[[package]]
name = "libc"
version = "0.2.139"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "201de327520df007757c1f0adce6e827fe8562fbc28bfd9c15571c66ca1f5f79"
[[package]]
name = "linux-raw-sys"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f051f77a7c8e6957c0696eac88f26b0117e54f52d3fc682ab19397a8812846a4"
[[package]]
name = "normalize-line-endings"
version = "0.3.0"
@ -131,73 +104,88 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "61807f77802ff30975e01f4f071c8ba10c022052f98b3294119f3e615d13e5be"
[[package]]
name = "proc-macro2"
version = "1.0.51"
name = "num-conv"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5d727cae5b39d21da60fa540906919ad737832fe0b1c165da3a34d6548c849d6"
checksum = "51d515d32fb182ee37cda2ccdcb92950d6a3c2893aa280e540671c2cd0f3b1d9"
[[package]]
name = "powerfmt"
version = "0.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "439ee305def115ba05938db6eb1644ff94165c5ab5e9420d1c1bcedbba909391"
[[package]]
name = "proc-macro2"
version = "1.0.81"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3d1597b0c024618f09a9c3b8655b7e430397a36d23fdafec26d6965e9eec3eba"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.23"
version = "1.0.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8856d8364d252a14d474036ea1358d63c9e6965c8e5c1885c18f73d70bff9c7b"
checksum = "0fa76aaf39101c457836aec0ce2316dbdc3ab723cdda1c6bd4e6ad4208acaca7"
dependencies = [
"proc-macro2",
]
[[package]]
name = "rustix"
version = "0.36.9"
name = "serde"
version = "1.0.198"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd5c6ff11fecd55b40746d1995a02f2eb375bf8c00d192d521ee09f42bef37bc"
checksum = "9846a40c979031340571da2545a4e5b7c4163bdae79b301d5f86d03979451fcc"
dependencies = [
"bitflags",
"errno",
"io-lifetimes",
"libc",
"linux-raw-sys",
"windows-sys",
"serde_derive",
]
[[package]]
name = "serde"
version = "1.0.152"
name = "serde_derive"
version = "1.0.198"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bb7d1f0d3021d347a83e556fc4683dea2ea09d87bccdf88ff5c12545d89d5efb"
checksum = "e88edab869b01783ba905e7d0153f9fc1a6505a96e4ad3018011eedb838566d9"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "similar"
version = "2.2.1"
version = "2.5.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "420acb44afdae038210c99e69aae24109f32f15500aa708e81d46c9f29d55fcf"
checksum = "fa42c91313f1d05da9b26f267f931cf178d4aba455b4c4622dd7355eb80c6640"
[[package]]
name = "snapbox"
version = "0.4.8"
version = "0.5.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4389a6395e9925166f19d67b64874e526ec28a4b8455f3321b686c912299c3ea"
checksum = "8ac441e1ecf678f68423d47f376d53fabce1afba92c8f68e31508eb27df8562a"
dependencies = [
"concolor",
"anstream",
"anstyle",
"normalize-line-endings",
"similar",
"snapbox-macros",
"yansi",
]
[[package]]
name = "snapbox-macros"
version = "0.3.1"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "485e65c1203eb37244465e857d15a26d3a85a5410648ccb53b18bd44cb3a7336"
checksum = "e1c4b838b05d15ab22754068cb73500b2f3b07bf09d310e15b27f88160f1de40"
dependencies = [
"anstream",
]
[[package]]
name = "syn"
version = "1.0.109"
version = "2.0.60"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "72b64191b275b66ffe2469e8af2c1cfe3bafa67b529ead792a6d0160888b4237"
checksum = "909518bc7b1c9b779f1bbf07f2929d35af9f0f37e47c6e9ef7f9dddc1e1821f3"
dependencies = [
"proc-macro2",
"quote",
@ -206,18 +194,18 @@ dependencies = [
[[package]]
name = "thiserror"
version = "1.0.38"
version = "1.0.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6a9cd18aa97d5c45c6603caea1da6628790b37f7a34b6ca89522331c5180fed0"
checksum = "03468839009160513471e86a034bb2c5c0e4baae3b43f79ffc55c4a5427b3297"
dependencies = [
"thiserror-impl",
]
[[package]]
name = "thiserror-impl"
version = "1.0.38"
version = "1.0.58"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fb327af4685e4d03fa8cbcf1716380da910eeb2bb8be417e7f9fd3fb164f36f"
checksum = "c61f3ba182994efc43764a46c018c347bc492c79f024e705f46567b418f6d4f7"
dependencies = [
"proc-macro2",
"quote",
@ -226,11 +214,14 @@ dependencies = [
[[package]]
name = "time"
version = "0.3.20"
version = "0.3.36"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd0cbfecb4d19b5ea75bb31ad904eb5b9fa13f21079c3b92017ebdf4999a5890"
checksum = "5dfd88e563464686c916c7e46e623e520ddc6d79fa6641390f2e3fa86e83e885"
dependencies = [
"deranged",
"itoa",
"num-conv",
"powerfmt",
"serde",
"time-core",
"time-macros",
@ -238,65 +229,51 @@ dependencies = [
[[package]]
name = "time-core"
version = "0.1.0"
version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2e153e1f1acaef8acc537e68b44906d2db6436e2b35ac2c6b42640fff91f00fd"
checksum = "ef927ca75afb808a4d64dd374f00a2adf8d0fcff8e7b184af886c3c87ec4a3f3"
[[package]]
name = "time-macros"
version = "0.2.8"
version = "0.2.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd80a657e71da814b8e5d60d3374fc6d35045062245d80224748ae522dd76f36"
checksum = "3f252a68540fde3a3877aeea552b832b40ab9a69e318efd078774a01ddee1ccf"
dependencies = [
"num-conv",
"time-core",
]
[[package]]
name = "unicode-ident"
version = "1.0.7"
version = "1.0.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "775c11906edafc97bc378816b94585fbd9a054eabaf86fdd0ced94af449efab7"
checksum = "3354b9ac3fae1ff6755cb6db53683adb661634f67557942dea4facebec0fee4b"
[[package]]
name = "winapi"
version = "0.3.9"
name = "utf8parse"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c839a674fcd7a98952e593242ea400abe93992746761e38641405d28b00f419"
dependencies = [
"winapi-i686-pc-windows-gnu",
"winapi-x86_64-pc-windows-gnu",
]
[[package]]
name = "winapi-i686-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac3b87c63620426dd9b991e5ce0329eff545bccbbb34f3be09ff6fb6ab51b7b6"
[[package]]
name = "winapi-x86_64-pc-windows-gnu"
version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "712e227841d057c1ee1cd2fb22fa7e5a5461ae8e48fa2ca79ec42cfc1931183f"
checksum = "711b9620af191e0cdc7468a8d14e709c3dcdb115b36f838e601583af800a370a"
[[package]]
name = "windows-sys"
version = "0.45.0"
version = "0.52.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75283be5efb2831d37ea142365f009c02ec203cd29a3ebecbc093d52315b66d0"
checksum = "282be5f36a8ce781fad8c8ae18fa3f9beff57ec1b52cb3de0789201425d9a33d"
dependencies = [
"windows-targets",
]
[[package]]
name = "windows-targets"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e2522491fbfcd58cc84d47aeb2958948c4b8982e9a2d8a2a35bbaed431390e7"
checksum = "6f0713a46559409d202e70e28227288446bf7841d3211583a4b53e3f6d96e7eb"
dependencies = [
"windows_aarch64_gnullvm",
"windows_aarch64_msvc",
"windows_i686_gnu",
"windows_i686_gnullvm",
"windows_i686_msvc",
"windows_x86_64_gnu",
"windows_x86_64_gnullvm",
@ -305,48 +282,48 @@ dependencies = [
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8c9864e83243fdec7fc9c5444389dcbbfd258f745e7853198f365e3c4968a608"
checksum = "7088eed71e8b8dda258ecc8bac5fb1153c5cffaf2578fc8ff5d61e23578d3263"
[[package]]
name = "windows_aarch64_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c8b1b673ffc16c47a9ff48570a9d85e25d265735c503681332589af6253c6c7"
checksum = "9985fd1504e250c615ca5f281c3f7a6da76213ebd5ccc9561496568a2752afb6"
[[package]]
name = "windows_i686_gnu"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "de3887528ad530ba7bdbb1faa8275ec7a1155a45ffa57c37993960277145d640"
checksum = "88ba073cf16d5372720ec942a8ccbf61626074c6d4dd2e745299726ce8b89670"
[[package]]
name = "windows_i686_gnullvm"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87f4261229030a858f36b459e748ae97545d6f1ec60e5e0d6a3d32e0dc232ee9"
[[package]]
name = "windows_i686_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf4d1122317eddd6ff351aa852118a2418ad4214e6613a50e0191f7004372605"
checksum = "db3c2bf3d13d5b658be73463284eaf12830ac9a26a90c717b7f771dfe97487bf"
[[package]]
name = "windows_x86_64_gnu"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1040f221285e17ebccbc2591ffdc2d44ee1f9186324dd3e84e99ac68d699c45"
checksum = "4e4246f76bdeff09eb48875a0fd3e2af6aada79d409d33011886d3e1581517d9"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "628bfdf232daa22b0d64fdb62b09fcc36bb01f05a3939e20ab73aaf9470d0463"
checksum = "852298e482cd67c356ddd9570386e2862b5673c85bd5f88df9ab6802b334c596"
[[package]]
name = "windows_x86_64_msvc"
version = "0.42.1"
version = "0.52.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "447660ad36a13288b1db4d4248e857b510e8c3a225c822ba4fb748c0aafecffd"
[[package]]
name = "yansi"
version = "0.5.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09041cd90cf85f7f8b2df60c646f853b7f535ce68f85244eb6731cf89fa498ec"
checksum = "bec47e5bfd1bff0eeaf6d8b485cc1074891a197ab4225d504cb7a1ab88b02bf0"

View File

@ -22,6 +22,6 @@ path = "../.."
default-features = false
[dependencies]
leon = { version = "0.0.1", default-features = false }
snapbox = "0.4.8"
time = { version = "0.3.20", features = ["formatting", "macros"] }
leon = { version = "2.0.1", default-features = false }
snapbox = "0.5.9"
time = { version = "0.3.30", features = ["formatting", "macros"] }

View File

@ -77,7 +77,7 @@ impl Info {
#[cfg(feature = "git")]
git: GitInfo::gather()
.map_err(|e| {
println!("cargo:warning=git info gathering failed: {}", e);
println!("cargo:warning=git info gathering failed: {e}");
})
.ok(),
#[cfg(not(feature = "git"))]
@ -145,12 +145,16 @@ pub struct GitInfo {
/// The datetime of the current commit, in the format `YYYY-MM-DD HH:MM:SS`, at UTC.
pub git_datetime: String,
/// The `git describe` equivalent output
pub git_description: String,
}
#[cfg(feature = "git")]
impl GitInfo {
fn gather() -> Result<Self, String> {
let (path, _) = gix::discover::upwards(".").err_string()?;
use std::path::Path;
let (path, _) = gix::discover::upwards(Path::new(".")).err_string()?;
let repo = gix::discover(path).err_string()?;
let head = repo.head_commit().err_string()?;
let time = head.time().err_string()?;
@ -162,6 +166,7 @@ impl GitInfo {
git_shorthash: head.short_id().err_string()?.to_string(),
git_date: timestamp.format(DATE_FORMAT).err_string()?,
git_datetime: timestamp.format(DATETIME_FORMAT).err_string()?,
git_description: head.describe().format().err_string()?.to_string(),
})
}
}

View File

@ -74,6 +74,7 @@ pub fn gather_to(filename: &str, structname: &str, public: bool) {
git_shorthash,
git_date,
git_datetime,
git_description,
..
}) = git
{
@ -104,10 +105,15 @@ pub fn gather_to(filename: &str, structname: &str, public: bool) {
/// This is the date and time (`YYYY-MM-DD HH:MM:SS`) of the commit that was built. Same
/// caveats as with `GIT_COMMIT_HASH` apply.
pub const GIT_COMMIT_DATETIME: &'static str = {git_datetime:?};
/// The git description
///
/// This is the string equivalent to what `git describe` would output
pub const GIT_COMMIT_DESCRIPTION: &'static str = {git_description:?};
"
), format!("{crate_version} ({git_shorthash} {git_date}) {crate_feature_string}\ncommit-hash: {git_hash}\ncommit-date: {git_date}\nbuild-date: {build_date}\nrelease: {crate_version}\nfeatures: {crate_feature_list}"))
} else {
("".to_string(), format!("{crate_version} ({build_date}) {crate_feature_string}\nbuild-date: {build_date}\nrelease: {crate_version}\nfeatures: {crate_feature_list}"))
(String::new(), format!("{crate_version} ({build_date}) {crate_feature_string}\nbuild-date: {build_date}\nrelease: {crate_version}\nfeatures: {crate_feature_list}"))
};
#[cfg(feature = "std")]
@ -120,7 +126,7 @@ pub fn gather_to(filename: &str, structname: &str, public: bool) {
let mut output = Self::LONG_VERSION.to_string();
for (k, v) in extra {
output.push_str(&format!("\n{}: {}", k, v));
output.push_str(&format!("\n{k}: {v}"));
}
output
@ -244,6 +250,7 @@ pub fn gather_to_env_with_prefix(prefix: &str) {
git_shorthash,
git_date,
git_datetime,
git_description,
..
}) = git
{
@ -251,5 +258,6 @@ pub fn gather_to_env_with_prefix(prefix: &str) {
println!("cargo:rustc-env={prefix}GIT_COMMIT_SHORTHASH={git_shorthash}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DATE={git_date}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DATETIME={git_datetime}");
println!("cargo:rustc-env={prefix}GIT_COMMIT_DESCRIPTION={git_description}");
}
}
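
For downstream crates using the env-based API, the new field is then available at compile time like the others. A minimal sketch, assuming the `BOSION_` prefix and that git info gathering succeeds at build time:

```rust
// build.rs (sketch): export Bosion's fields as BOSION_-prefixed rustc env vars
fn main() {
    bosion::gather_to_env_with_prefix("BOSION_");
}
```

```rust
// src/main.rs (sketch): the cargo:rustc-env values set in build.rs are
// readable at compile time with env!()
fn main() {
    println!("commit description: {}", env!("BOSION_GIT_COMMIT_DESCRIPTION"));
}
```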

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-cli"
version = "1.23.0"
version = "2.1.1"
authors = ["Félix Saparelli <felix@passcod.name>", "Matt Green <mattgreenrocks@gmail.com>"]
license = "Apache-2.0"
@ -20,64 +20,89 @@ name = "watchexec"
path = "src/main.rs"
[dependencies]
argfile = "0.1.5"
chrono = "0.4.23"
clap_complete = "4.1.4"
clap_complete_nushell = "4.3.1"
clap_mangen = "0.2.9"
ahash = "0.8.6" # needs to be in sync with jaq's
argfile = "0.2.0"
chrono = "0.4.31"
clap_complete = "4.4.4"
clap_complete_nushell = "4.4.2"
clap_mangen = "0.2.15"
clearscreen = "3.0.0"
dashmap = "5.4.0"
dirs = "5.0.0"
futures = "0.3.17"
futures = "0.3.29"
humantime = "2.1.0"
indexmap = "2.2.6" # needs to be in sync with jaq's
is-terminal = "0.4.4"
notify-rust = "4.5.2"
serde_json = "1.0.94"
tempfile = "3.4.0"
tracing = "0.1.26"
which = "4.4.0"
jaq-core = "1.2.1"
jaq-interpret = "1.2.1"
jaq-parse = "1.0.2"
jaq-std = "1.2.1"
jaq-syn = "1.1.0"
notify-rust = "4.9.0"
once_cell = "1.17.1"
serde_json = "1.0.107"
tempfile = "3.8.1"
termcolor = "1.4.0"
tracing = "0.1.40"
tracing-appender = "0.2.3"
which = "6.0.1"
[dependencies.blake3]
version = "1.3.3"
features = ["rayon"]
[dependencies.command-group]
version = "2.1.0"
features = ["with-tokio"]
[dependencies.console-subscriber]
version = "0.1.0"
optional = true
[dependencies.clap]
version = "4.1.8"
version = "4.4.7"
features = ["cargo", "derive", "env", "wrap_help"]
[dependencies.console-subscriber]
version = "0.2.0"
optional = true
[dependencies.eyra]
version = "0.16.8"
features = ["log", "env_logger"]
optional = true
[dependencies.ignore-files]
version = "1.3.1"
version = "3.0.1"
path = "../ignore-files"
[dependencies.miette]
version = "5.3.0"
version = "7.2.0"
features = ["fancy"]
[dependencies.pid1]
version = "0.1.1"
optional = true
[dependencies.project-origins]
version = "1.2.0"
version = "1.4.0"
path = "../project-origins"
[dependencies.watchexec]
version = "2.3.0"
version = "4.1.0"
path = "../lib"
[dependencies.watchexec-events]
version = "1.0.0"
version = "3.0.0"
path = "../events"
features = ["serde"]
[dependencies.watchexec-signals]
version = "1.0.0"
version = "3.0.0"
path = "../signals"
[dependencies.watchexec-filterer-globset]
version = "1.2.0"
version = "4.0.1"
path = "../filterer/globset"
[dependencies.tokio]
version = "1.24.2"
version = "1.33.0"
features = [
"fs",
"io-std",
@ -95,27 +120,38 @@ features = [
"fmt",
"json",
"tracing-log",
"ansi",
]
[target.'cfg(target_env = "musl")'.dependencies]
mimalloc = "0.1.26"
[target.'cfg(target_os = "linux")'.dependencies]
shadow-rs = "0.22.0"
[target.'cfg(target_os = "linux")'.build-dependencies]
shadow-rs = "0.22.0"
mimalloc = "0.1.39"
[build-dependencies]
embed-resource = "2.1.1"
embed-resource = "2.4.0"
[build-dependencies.bosion]
version = "1.0.1"
version = "1.1.0"
path = "../bosion"
[dev-dependencies]
tracing-test = "0.2.4"
uuid = { workspace = true, features = [ "v4", "fast-rng" ] }
rand = { workspace = true }
[features]
default = ["pid1"]
## Build using Eyra's pure-Rust libc
eyra = ["dep:eyra"]
## Enables PID1 handling.
pid1 = ["dep:pid1"]
## Enables logging for PID1 handling.
pid1-withlog = ["pid1"]
## For debugging only: enables the Tokio Console.
dev-console = ["console-subscriber"]
dev-console = ["dep:console-subscriber"]
[package.metadata.binstall]
pkg-url = "{ repo }/releases/download/v{ version }/watchexec-{ version }-{ target }.{ archive-format }"
@ -137,7 +173,7 @@ assets = [
["../../doc/watchexec.1.md", "usr/share/doc/watchexec/watchexec.1.md", "644"],
["../../doc/watchexec.1", "usr/share/man/man1/watchexec.1.html", "644"],
["../../completions/bash", "usr/share/bash-completion/completions/watchexec", "644"],
["../../completions/fish", "usr/share/fish/completions/watchexec.fish", "644"],
["../../completions/fish", "usr/share/fish/vendor_completions.d/watchexec.fish", "644"],
["../../completions/zsh", "usr/share/zsh/site-functions/_watchexec", "644"],
["../../doc/logo.svg", "usr/share/icons/hicolor/scalable/apps/watchexec.svg", "644"],
]
@ -149,7 +185,7 @@ assets = [
{ source = "../../doc/watchexec.1.md", dest = "/usr/share/doc/watchexec/watchexec.1.md", mode = "644", doc = true },
{ source = "../../doc/watchexec.1", dest = "/usr/share/man/man1/watchexec.1.html", mode = "644" },
{ source = "../../completions/bash", dest = "/usr/share/bash-completion/completions/watchexec", mode = "644" },
{ source = "../../completions/fish", dest = "/usr/share/fish/completions/watchexec.fish", mode = "644" },
{ source = "../../completions/fish", dest = "/usr/share/fish/vendor_completions.d/watchexec.fish", mode = "644" },
{ source = "../../completions/zsh", dest = "/usr/share/zsh/site-functions/_watchexec", mode = "644" },
{ source = "../../doc/logo.svg", dest = "/usr/share/icons/hicolor/scalable/apps/watchexec.svg", mode = "644" },
# set conf = true for config file when that lands

View File

@ -37,7 +37,7 @@ Example use cases:
These variables may contain multiple paths: these are separated by the platform's path separator, as with the `PATH` system environment variable. On Unix that is `:`, and on Windows `;`. Within each variable, paths are deduplicated and sorted in binary order (i.e. neither Unicode nor locale aware).
This can be disabled or limited with `--no-environment` (doesn't set any of these variables) and `--no-meta` (ignores metadata changes).
This can be disabled with `--emit-events=none` or changed to JSON events on STDIN with `--emit-events=json-stdio`.
## Anti-Features
@ -45,7 +45,7 @@ Example use cases:
* Not tied to Git or the presence of a repository/project
* Does not require a cryptic command line involving `xargs`
## Simple Usage Examples
## Usage Examples
Watch all JavaScript, CSS and HTML files in the current directory and all subdirectories for changes, running `make` when a change is detected:
@ -65,11 +65,11 @@ Call/restart `python server.py` when any Python file in the current directory (a
Call/restart `my_server` when any file in the current directory (and all subdirectories) changes, sending `SIGKILL` to stop the command:
$ watchexec -r -s SIGKILL my_server
$ watchexec -r --stop-signal SIGKILL my_server
Send a SIGHUP to the command upon changes (note: using `-n` here, we're executing `my_server` directly instead of wrapping it in a shell):
$ watchexec -n -s SIGHUP my_server
$ watchexec -n --signal SIGHUP my_server
Run `make` when any file changes, using the `.gitignore` file in the current directory to filter:
@ -87,38 +87,40 @@ Run two commands:
$ watchexec 'date; make'
Get desktop ("toast") notifications when the command starts and finishes:
$ watchexec -N go build
Only run when files are created:
$ watchexec --fs-events create -- s3 sync . s3://my-bucket
If you come from `entr`, note that the watchexec command is run in a shell by default. You can use `-n` or `--shell=none` to not do that:
$ watchexec -n -- echo ';' lorem ipsum
On Windows, you may prefer to use Powershell:
$ watchexec --shell=powershell -- test-connection localhost
$ watchexec --shell=pwsh -- Test-Connection example.com
## Complex Usage Examples
You can eschew running commands entirely and get a stream of events to process on your own:
Turn a plain converter tool like PlantUML or Pandoc into a powerful live-editing tool, either as a script
```console
$ watchexec --emit-events-to=json-stdio --only-emit-events
#!/usr/bin/env bash
set -Eeuo pipefail
{"tags":[{"kind":"source","source":"filesystem"},{"kind":"fs","simple":"modify","full":"Modify(Data(Any))"},{"kind":"path","absolute":"/home/code/rust/watchexec/crates/cli/README.md","filetype":"file"}]}
{"tags":[{"kind":"source","source":"filesystem"},{"kind":"fs","simple":"modify","full":"Modify(Data(Any))"},{"kind":"path","absolute":"/home/code/rust/watchexec/crates/lib/Cargo.toml","filetype":"file"}]}
{"tags":[{"kind":"source","source":"filesystem"},{"kind":"fs","simple":"modify","full":"Modify(Data(Any))"},{"kind":"path","absolute":"/home/code/rust/watchexec/crates/cli/src/args.rs","filetype":"file"}]}
```
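
One way to consume that JSON event stream (a sketch; it assumes `jq` is installed and only extracts the changed paths):

```console
$ watchexec --emit-events-to=json-stdio --only-emit-events | jq -r '.tags[] | select(.kind == "path") | .absolute'
```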
SOURCE="test.puml" # Define source file
TARGET="test.png" # Define conversion target file
CONVERT="plantuml $SOURCE" # Define how to convert source to target
VIEW="feh $TARGET" # Define how to open target file
if [ ! -f $TARGET ]; then $CONVERT; fi # Ensure target file exists for opening
$VIEW & # Open target file in viewer in the background
watchexec --filter $SOURCE -- $CONVERT # Update target file on any source file change
Print the time commands take to run:
or condensed as a single line
# Bash
$ SOURCE="test.puml"; TARGET="test.png"; CONVERT="plantuml $SOURCE"; VIEW="feh $TARGET"; if [ ! -f $TARGET ]; then $CONVERT; fi; ($VIEW &); watchexec -f $SOURCE -- $CONVERT
# Zsh
$ SOURCE="test.puml"; TARGET="test.png"; CONVERT="plantuml $SOURCE"; VIEW="feh $TARGET"; if [ ! -f $TARGET ]; then $CONVERT; fi; ($=VIEW &); watchexec -f $SOURCE -- $CONVERT
Replace [PlantUML](https://plantuml.com/) with another converter like [Pandoc](https://pandoc.org/): `plantuml $SOURCE` turns into `pandoc $SOURCE --output $TARGET`.
Similarly, replace the [Feh](https://feh.finalrewind.org/) image viewer with another viewer for your target file like the PDF viewer [Evince](https://wiki.gnome.org/Apps/Evince): `feh $TARGET` turns into `evince $TARGET`.
```console
$ watchexec --timings -- make
[Running: make]
...
[Command was successful, lasted 52.748081074s]
```
## Installation
@ -174,4 +176,22 @@ If not bundled, you can generate completions for your shell with `watchexec --co
There's a manual page at `doc/watchexec.1`. Install it to `/usr/share/man/man1/`.
If not bundled, you can generate a manual page with `watchexec --manual > /path/to/watchexec.1`, or view it inline with `watchexec --manual` (requires `man`).
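
For example (a sketch; the destination paths are illustrative):

```console
$ watchexec --completions zsh > /usr/share/zsh/site-functions/_watchexec
$ watchexec --manual > /usr/share/man/man1/watchexec.1
```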
You can also [read a text version](../../doc/watchexec.1.md) or a [PDF](../../doc/watchexec.1.pdf).
You can also [read a text version](../../doc/watchexec.1.md).
Note that it is automatically generated from the help text, so it is not as pretty as a carefully hand-written one.
## Advanced builds
These are additional options available with custom builds by setting features:
### PID1
If you're using Watchexec as PID1 (most frequently in containers or namespaces), and it's not doing what you expect, you can create a build with PID1 early logging: `--features pid1-withlog`.
If you don't need PID1 support, or if you're doing something that conflicts with this program's PID1 support, you can disable it with `--no-default-features`.
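
As a sketch (assuming a build from the `crates/cli` directory), those two variants look like:

```console
$ cargo build --release --features pid1-withlog
$ cargo build --release --no-default-features
```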
### Eyra
[Eyra](https://github.com/sunfishcode/eyra) is a system to build Linux programs with no dependency on C code (in the libc path). To build Watchexec like this, use `--features eyra` and a Nightly compiler.
This feature also lets you get early logging into program startup, with `RUST_LOG=trace`.
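
A minimal sketch of such a build and an early-logging run (the binary path is an assumption):

```console
$ cargo +nightly build --release --features eyra
$ RUST_LOG=trace ./target/release/watchexec -- echo hello
```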

View File

@ -1,4 +1,8 @@
fn main() {
embed_resource::compile("watchexec-manifest.rc", embed_resource::NONE);
bosion::gather();
if std::env::var("CARGO_FEATURE_EYRA").is_ok() {
println!("cargo:rustc-link-arg=-nostartfiles");
}
}

crates/cli/integration/env.sh Executable file
View File

@ -0,0 +1,7 @@
#!/bin/bash
set -euxo pipefail
watchexec=${WATCHEXEC_BIN:-watchexec}
$watchexec -1 --env FOO=BAR echo '$FOO' | grep BAR

View File

@ -0,0 +1,7 @@
#!/bin/bash
set -euxo pipefail
watchexec=${WATCHEXEC_BIN:-watchexec}
$watchexec -1 -n echo 'foo bar' | grep 'foo bar'

View File

@ -0,0 +1,7 @@
#!/bin/bash
set -euxo pipefail
watchexec=${WATCHEXEC_BIN:-watchexec}
timeout -s9 30s sh -c "sleep 10 | $watchexec --stdin-quit echo"

View File

@ -0,0 +1,7 @@
#!/bin/bash
set -euxo pipefail
watchexec=${WATCHEXEC_BIN:-watchexec}
$watchexec -1 -- echo @trailingargfile

View File

@ -1,7 +1,9 @@
pre-release-commit-message = "release: cli v{{version}}"
tag-prefix = "cli-"
tag-prefix = ""
tag-message = "watchexec {{version}}"
pre-release-hook = ["sh", "-c", "cd ../.. && bin/completions && bin/manpage"]
[[pre-release-replacements]]
file = "watchexec.exe.manifest"
search = "^ version=\"[\\d.]+[.]0\""

crates/cli/run-tests.sh Executable file
View File

@ -0,0 +1,13 @@
#!/bin/bash
set -euo pipefail
export WATCHEXEC_BIN=$(realpath ${WATCHEXEC_BIN:-$(which watchexec)})
cd "$(dirname "${BASH_SOURCE[0]}")/integration"
for test in *.sh; do
echo
echo
echo "======= Testing $test ======="
./$test
done

File diff suppressed because it is too large


@@ -0,0 +1,132 @@
use std::{env::var, io::stderr, path::PathBuf};
use clap::{ArgAction, Parser, ValueHint};
use miette::{bail, Result};
use tokio::fs::metadata;
use tracing::{info, warn};
use tracing_appender::{non_blocking, non_blocking::WorkerGuard, rolling};
#[derive(Debug, Clone, Parser)]
pub struct LoggingArgs {
/// Set diagnostic log level
///
/// This enables diagnostic logging, which is useful for investigating bugs or gaining more
/// insight into faulty filters or "missing" events. Use multiple times to increase verbosity.
///
/// Goes up to '-vvvv'. When submitting bug reports, default to a '-vvv' log level.
///
/// You may want to use with '--log-file' to avoid polluting your terminal.
///
/// Setting $RUST_LOG also works, and takes precedence, but is not recommended. However, using
/// $RUST_LOG is the only way to get logs from before these options are parsed.
#[arg(
long,
short,
help_heading = super::OPTSET_DEBUGGING,
action = ArgAction::Count,
default_value = "0",
num_args = 0,
)]
pub verbose: u8,
/// Write diagnostic logs to a file
///
/// This writes diagnostic logs to a file, instead of the terminal, in JSON format. If a log
/// level was not already specified, this will set it to '-vvv'.
///
/// If a path is not provided, the default is the working directory. Note that with
/// '--ignore-nothing', the write events to the log will likely get picked up by Watchexec,
/// causing a loop; prefer setting a path outside of the watched directory.
///
/// If the path provided is a directory, a file will be created in that directory. The file name
/// will be the current date and time, in the format 'watchexec.YYYY-MM-DDTHH-MM-SSZ.log'.
#[arg(
long,
help_heading = super::OPTSET_DEBUGGING,
num_args = 0..=1,
default_missing_value = ".",
value_hint = ValueHint::AnyPath,
value_name = "PATH",
)]
pub log_file: Option<PathBuf>,
}
pub fn preargs() -> bool {
let mut log_on = false;
#[cfg(feature = "dev-console")]
match console_subscriber::try_init() {
Ok(_) => {
warn!("dev-console enabled");
log_on = true;
}
Err(e) => {
eprintln!("Failed to initialise tokio console, falling back to normal logging\n{e}")
}
}
if !log_on && var("RUST_LOG").is_ok() {
match tracing_subscriber::fmt::try_init() {
Ok(()) => {
warn!(RUST_LOG=%var("RUST_LOG").unwrap(), "logging configured from RUST_LOG");
log_on = true;
}
Err(e) => eprintln!("Failed to initialise logging with RUST_LOG, falling back\n{e}"),
}
}
log_on
}
pub async fn postargs(args: &LoggingArgs) -> Result<Option<WorkerGuard>> {
if args.verbose == 0 {
return Ok(None);
}
let (log_writer, guard) = if let Some(file) = &args.log_file {
let is_dir = metadata(&file).await.map_or(false, |info| info.is_dir());
let (dir, filename) = if is_dir {
(
file.to_owned(),
PathBuf::from(format!(
"watchexec.{}.log",
chrono::Utc::now().format("%Y-%m-%dT%H-%M-%SZ")
)),
)
} else if let (Some(parent), Some(file_name)) = (file.parent(), file.file_name()) {
(parent.into(), PathBuf::from(file_name))
} else {
bail!("Failed to determine log file name");
};
non_blocking(rolling::never(dir, filename))
} else {
non_blocking(stderr())
};
let mut builder = tracing_subscriber::fmt().with_env_filter(match args.verbose {
0 => unreachable!("checked by if earlier"),
1 => "warn",
2 => "info",
3 => "debug",
_ => "trace",
});
if args.verbose > 2 {
use tracing_subscriber::fmt::format::FmtSpan;
builder = builder.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE);
}
match if args.log_file.is_some() {
builder.json().with_writer(log_writer).try_init()
} else if args.verbose > 3 {
builder.pretty().with_writer(log_writer).try_init()
} else {
builder.with_writer(log_writer).try_init()
} {
Ok(()) => info!("logging initialised"),
Err(e) => eprintln!("Failed to initialise logging, continuing with none\n{e}"),
}
Ok(Some(guard))
}


@@ -1,5 +1,708 @@
mod init;
mod runtime;
use std::{
borrow::Cow,
collections::HashMap,
env::var,
ffi::{OsStr, OsString},
fs::File,
io::{IsTerminal, Write},
process::Stdio,
sync::{
atomic::{AtomicBool, AtomicU8, Ordering},
Arc,
},
time::Duration,
};
pub use init::init;
pub use runtime::runtime;
use clearscreen::ClearScreen;
use miette::{miette, IntoDiagnostic, Report, Result};
use notify_rust::Notification;
use termcolor::{Color, ColorChoice, ColorSpec, StandardStream, WriteColor};
use tokio::{process::Command as TokioCommand, time::sleep};
use tracing::{debug, debug_span, error, instrument, trace, trace_span, Instrument};
use watchexec::{
action::ActionHandler,
command::{Command, Program, Shell, SpawnOptions},
error::RuntimeError,
job::{CommandState, Job},
sources::fs::Watcher,
Config, ErrorHook, Id,
};
use watchexec_events::{Event, Keyboard, ProcessEnd, Tag};
use watchexec_signals::Signal;
use crate::{
args::{Args, ClearMode, ColourMode, EmitEvents, OnBusyUpdate, SignalMapping, WrapMode},
state::RotatingTempFile,
};
use crate::{emits::events_to_simple_format, state::State};
#[derive(Clone, Copy, Debug)]
struct OutputFlags {
quiet: bool,
colour: ColorChoice,
timings: bool,
bell: bool,
toast: bool,
}
pub fn make_config(args: &Args, state: &State) -> Result<Config> {
let _span = debug_span!("args-runtime").entered();
let config = Config::default();
config.on_error(|err: ErrorHook| {
if let RuntimeError::IoError {
about: "waiting on process group",
..
} = err.error
{
// "No child processes" and such
// these are often spurious, so condemn them to -v only
error!("{}", err.error);
return;
}
if cfg!(debug_assertions) {
eprintln!("[[{:?}]]", err.error);
}
eprintln!("[[Error (not fatal)]]\n{}", Report::new(err.error));
});
config.pathset(args.paths.clone());
config.throttle(args.debounce.0);
config.keyboard_events(args.stdin_quit);
if let Some(interval) = args.poll {
config.file_watcher(Watcher::Poll(interval.0));
}
let once = args.once;
let clear = args.screen_clear;
let emit_events_to = args.emit_events_to;
let emit_file = state.emit_file.clone();
if args.only_emit_events {
config.on_action(move |mut action| {
// if we got a terminate or interrupt signal, quit
if action
.signals()
.any(|sig| sig == Signal::Terminate || sig == Signal::Interrupt)
{
// no need to be graceful as there's no commands
action.quit();
return action;
}
// clear the screen before printing events
if let Some(mode) = clear {
match mode {
ClearMode::Clear => {
clearscreen::clear().ok();
}
ClearMode::Reset => {
reset_screen();
}
}
}
match emit_events_to {
EmitEvents::Stdio => {
println!(
"{}",
events_to_simple_format(action.events.as_ref()).unwrap_or_default()
);
}
EmitEvents::JsonStdio => {
for event in action.events.iter().filter(|e| !e.is_empty()) {
println!("{}", serde_json::to_string(event).unwrap_or_default());
}
}
other => unreachable!(
"emit_events_to should have been validated earlier: {:?}",
other
),
}
action
});
return Ok(config);
}
let delay_run = args.delay_run.map(|ts| ts.0);
let on_busy = args.on_busy_update;
let stdin_quit = args.stdin_quit;
let signal = args.signal;
let stop_signal = args.stop_signal;
let stop_timeout = args.stop_timeout.0;
let print_events = args.print_events;
let outflags = OutputFlags {
quiet: args.quiet,
colour: match args.color {
ColourMode::Auto if !std::io::stdin().is_terminal() => ColorChoice::Never,
ColourMode::Auto => ColorChoice::Auto,
ColourMode::Always => ColorChoice::Always,
ColourMode::Never => ColorChoice::Never,
},
timings: args.timings,
bell: args.bell,
toast: args.notify,
};
let workdir = Arc::new(args.workdir.clone());
let mut add_envs = HashMap::new();
for pair in &args.env {
if let Some((k, v)) = pair.split_once('=') {
add_envs.insert(k.to_owned(), OsString::from(v));
} else {
return Err(miette!("{pair} is not in key=value format"));
}
}
debug!(
?add_envs,
"additional environment variables to add to command"
);
let id = Id::default();
let command = interpret_command_args(args)?;
let signal_map: Arc<HashMap<Signal, Option<Signal>>> = Arc::new(
args.signal_map
.iter()
.copied()
.map(|SignalMapping { from, to }| (from, to))
.collect(),
);
let queued = Arc::new(AtomicBool::new(false));
let quit_again = Arc::new(AtomicU8::new(0));
config.on_action_async(move |mut action| {
let add_envs = add_envs.clone();
let command = command.clone();
let emit_file = emit_file.clone();
let queued = queued.clone();
let quit_again = quit_again.clone();
let signal_map = signal_map.clone();
let workdir = workdir.clone();
Box::new(
async move {
trace!(events=?action.events, "handling action");
let add_envs = add_envs.clone();
let command = command.clone();
let emit_file = emit_file.clone();
let queued = queued.clone();
let quit_again = quit_again.clone();
let signal_map = signal_map.clone();
let workdir = workdir.clone();
trace!("set spawn hook for workdir and environment variables");
let job = action.get_or_create_job(id, move || command.clone());
let events = action.events.clone();
job.set_spawn_hook(move |command, _| {
let add_envs = add_envs.clone();
let emit_file = emit_file.clone();
let events = events.clone();
if let Some(ref workdir) = workdir.as_ref() {
debug!(?workdir, "set command workdir");
command.command_mut().current_dir(workdir);
}
emit_events_to_command(
command.command_mut(),
events,
emit_file,
emit_events_to,
add_envs,
);
});
let show_events = {
let events = action.events.clone();
move || {
if print_events {
trace!("print events to stderr");
for (n, event) in events.iter().enumerate() {
eprintln!("[EVENT {n}] {event}");
}
}
}
};
let clear_screen = {
let events = action.events.clone();
move || {
if let Some(mode) = clear {
match mode {
ClearMode::Clear => {
clearscreen::clear().ok();
debug!("cleared screen");
}
ClearMode::Reset => {
reset_screen();
debug!("hard-reset screen");
}
}
}
// re-show events after clearing
if print_events {
trace!("print events to stderr");
for (n, event) in events.iter().enumerate() {
eprintln!("[EVENT {n}] {event}");
}
}
}
};
let quit = |mut action: ActionHandler| {
match quit_again.fetch_add(1, Ordering::Relaxed) {
0 => {
eprintln!("[Waiting {stop_timeout:?} for processes to exit before stopping...]");
// eprintln!("[Waiting {stop_timeout:?} for processes to exit before stopping... Ctrl-C again to exit faster]");
// see TODO in action/worker.rs
action.quit_gracefully(
stop_signal.unwrap_or(Signal::Terminate),
stop_timeout,
);
}
1 => {
action.quit_gracefully(Signal::ForceStop, Duration::ZERO);
}
_ => {
action.quit();
}
}
action
};
if once {
debug!("debug mode: run once and quit");
show_events();
if let Some(delay) = delay_run {
job.run_async(move |_| {
Box::new(async move {
sleep(delay).await;
})
});
}
// this blocks the event loop, but also this is a debug feature so i don't care
job.start().await;
job.to_wait().await;
return quit(action);
}
let is_keyboard_eof = action
.events
.iter()
.any(|e| e.tags.contains(&Tag::Keyboard(Keyboard::Eof)));
if stdin_quit && is_keyboard_eof {
debug!("keyboard EOF, quit");
show_events();
return quit(action);
}
let signals: Vec<Signal> = action.signals().collect();
trace!(?signals, "received some signals");
// if we got a terminate or interrupt signal and they're not mapped, quit
if (signals.contains(&Signal::Terminate)
&& !signal_map.contains_key(&Signal::Terminate))
|| (signals.contains(&Signal::Interrupt)
&& !signal_map.contains_key(&Signal::Interrupt))
{
debug!("unmapped terminate or interrupt signal, quit");
show_events();
return quit(action);
}
// pass all other signals on
for signal in signals {
match signal_map.get(&signal) {
Some(Some(mapped)) => {
debug!(?signal, ?mapped, "passing mapped signal");
job.signal(*mapped);
}
Some(None) => {
debug!(?signal, "discarding signal");
}
None => {
debug!(?signal, "passing signal on");
job.signal(signal);
}
}
}
// only filesystem events below here (or empty synthetic events)
if action.paths().next().is_none() && !action.events.iter().any(|e| e.is_empty()) {
debug!("no filesystem or synthetic events, skip without doing more");
show_events();
return action;
}
show_events();
if let Some(delay) = delay_run {
trace!("delaying run by sleeping inside the job");
job.run_async(move |_| {
Box::new(async move {
sleep(delay).await;
})
});
}
trace!("querying job state via run_async");
job.run_async({
let job = job.clone();
move |context| {
let job = job.clone();
let is_running = matches!(context.current, CommandState::Running { .. });
Box::new(async move {
let innerjob = job.clone();
if is_running {
trace!(?on_busy, "job is running, decide what to do");
match on_busy {
OnBusyUpdate::DoNothing => {}
OnBusyUpdate::Signal => {
job.signal(if cfg!(windows) {
Signal::ForceStop
} else {
stop_signal.or(signal).unwrap_or(Signal::Terminate)
});
}
OnBusyUpdate::Restart if cfg!(windows) => {
job.restart();
job.run(move |context| {
clear_screen();
setup_process(
innerjob.clone(),
context.command.clone(),
outflags,
)
});
}
OnBusyUpdate::Restart => {
job.restart_with_signal(
stop_signal.unwrap_or(Signal::Terminate),
stop_timeout,
);
job.run(move |context| {
clear_screen();
setup_process(
innerjob.clone(),
context.command.clone(),
outflags,
)
});
}
OnBusyUpdate::Queue => {
let job = job.clone();
let already_queued =
queued.fetch_or(true, Ordering::SeqCst);
if already_queued {
debug!("next start is already queued, do nothing");
} else {
debug!("queueing next start of job");
tokio::spawn({
let queued = queued.clone();
async move {
trace!("waiting for job to finish");
job.to_wait().await;
trace!("job finished, starting queued");
job.start();
job.run(move |context| {
clear_screen();
setup_process(
innerjob.clone(),
context.command.clone(),
outflags,
)
})
.await;
trace!("resetting queued state");
queued.store(false, Ordering::SeqCst);
}
});
}
}
}
} else {
trace!("job is not running, start it");
job.start();
job.run(move |context| {
clear_screen();
setup_process(
innerjob.clone(),
context.command.clone(),
outflags,
)
});
}
})
}
});
action
}
.instrument(trace_span!("action handler")),
)
});
Ok(config)
}
#[instrument(level = "debug")]
fn interpret_command_args(args: &Args) -> Result<Arc<Command>> {
let mut cmd = args.command.clone();
if cmd.is_empty() {
panic!("(clap) Bug: command is not present");
}
let shell = if args.no_shell {
None
} else {
let shell = args.shell.clone().or_else(|| var("SHELL").ok());
match shell
.as_deref()
.or_else(|| {
if cfg!(not(windows)) {
Some("sh")
} else if var("POWERSHELL_DISTRIBUTION_CHANNEL").is_ok()
&& (which::which("pwsh").is_ok() || which::which("pwsh.exe").is_ok())
{
trace!("detected pwsh");
Some("pwsh")
} else if var("PSModulePath").is_ok()
&& (which::which("powershell").is_ok()
|| which::which("powershell.exe").is_ok())
{
trace!("detected powershell");
Some("powershell")
} else {
Some("cmd")
}
})
.or(Some("default"))
{
Some("") => return Err(RuntimeError::CommandShellEmptyShell).into_diagnostic(),
Some("none") | None => None,
#[cfg(windows)]
Some("cmd") | Some("cmd.exe") | Some("CMD") | Some("CMD.EXE") => Some(Shell::cmd()),
Some(other) => {
let sh = other.split_ascii_whitespace().collect::<Vec<_>>();
// UNWRAP: checked by Some("")
#[allow(clippy::unwrap_used)]
let (shprog, shopts) = sh.split_first().unwrap();
Some(Shell {
prog: shprog.into(),
options: shopts.iter().map(|s| (*s).to_string()).collect(),
program_option: Some(Cow::Borrowed(OsStr::new("-c"))),
})
}
}
};
let program = if let Some(shell) = shell {
Program::Shell {
shell,
command: cmd.join(" "),
args: Vec::new(),
}
} else {
Program::Exec {
prog: cmd.remove(0).into(),
args: cmd,
}
};
Ok(Arc::new(Command {
program,
options: SpawnOptions {
grouped: matches!(args.wrap_process, WrapMode::Group),
session: matches!(args.wrap_process, WrapMode::Session),
..Default::default()
},
}))
}
#[instrument(level = "trace")]
fn setup_process(job: Job, command: Arc<Command>, outflags: OutputFlags) {
if outflags.toast {
Notification::new()
.summary("Watchexec: change detected")
.body(&format!("Running {command}"))
.show()
.map_or_else(
|err| {
eprintln!("[[Failed to send desktop notification: {err}]]");
},
drop,
);
}
if !outflags.quiet {
let mut stderr = StandardStream::stderr(outflags.colour);
stderr.reset().ok();
stderr
.set_color(ColorSpec::new().set_fg(Some(Color::Green)))
.ok();
writeln!(&mut stderr, "[Running: {command}]").ok();
stderr.reset().ok();
}
tokio::spawn(async move {
job.to_wait().await;
job.run(move |context| end_of_process(context.current, outflags));
});
}
#[instrument(level = "trace")]
fn end_of_process(state: &CommandState, outflags: OutputFlags) {
let CommandState::Finished {
status,
started,
finished,
} = state
else {
return;
};
let duration = *finished - *started;
let timing = if outflags.timings {
format!(", lasted {duration:?}")
} else {
String::new()
};
let (msg, fg) = match status {
ProcessEnd::ExitError(code) => (format!("Command exited with {code}{timing}"), Color::Red),
ProcessEnd::ExitSignal(sig) => {
(format!("Command killed by {sig:?}{timing}"), Color::Magenta)
}
ProcessEnd::ExitStop(sig) => (format!("Command stopped by {sig:?}{timing}"), Color::Blue),
ProcessEnd::Continued => (format!("Command continued{timing}"), Color::Cyan),
ProcessEnd::Exception(ex) => (
format!("Command ended by exception {ex:#x}{timing}"),
Color::Yellow,
),
ProcessEnd::Success => (format!("Command was successful{timing}"), Color::Green),
};
if outflags.toast {
Notification::new()
.summary("Watchexec: command ended")
.body(&msg)
.show()
.map_or_else(
|err| {
eprintln!("[[Failed to send desktop notification: {err}]]");
},
drop,
);
}
if !outflags.quiet {
let mut stderr = StandardStream::stderr(outflags.colour);
stderr.reset().ok();
stderr.set_color(ColorSpec::new().set_fg(Some(fg))).ok();
writeln!(&mut stderr, "[{msg}]").ok();
stderr.reset().ok();
}
if outflags.bell {
let mut stdout = std::io::stdout();
stdout.write_all(b"\x07").ok();
stdout.flush().ok();
}
}
#[instrument(level = "trace")]
fn emit_events_to_command(
command: &mut TokioCommand,
events: Arc<[Event]>,
emit_file: RotatingTempFile,
emit_events_to: EmitEvents,
mut add_envs: HashMap<String, OsString>,
) {
use crate::emits::*;
let mut stdin = None;
match emit_events_to {
EmitEvents::Environment => {
add_envs.extend(emits_to_environment(&events));
}
EmitEvents::Stdio => match emits_to_file(&emit_file, &events)
.and_then(|path| File::open(path).into_diagnostic())
{
Ok(file) => {
stdin.replace(Stdio::from(file));
}
Err(err) => {
error!("Failed to write events to stdin, continuing without it: {err}");
}
},
EmitEvents::File => match emits_to_file(&emit_file, &events) {
Ok(path) => {
add_envs.insert("WATCHEXEC_EVENTS_FILE".into(), path.into());
}
Err(err) => {
error!("Failed to write WATCHEXEC_EVENTS_FILE, continuing without it: {err}");
}
},
EmitEvents::JsonStdio => match emits_to_json_file(&emit_file, &events)
.and_then(|path| File::open(path).into_diagnostic())
{
Ok(file) => {
stdin.replace(Stdio::from(file));
}
Err(err) => {
error!("Failed to write events to stdin, continuing without it: {err}");
}
},
EmitEvents::JsonFile => match emits_to_json_file(&emit_file, &events) {
Ok(path) => {
add_envs.insert("WATCHEXEC_EVENTS_FILE".into(), path.into());
}
Err(err) => {
error!("Failed to write WATCHEXEC_EVENTS_FILE, continuing without it: {err}");
}
},
EmitEvents::None => {}
}
for (k, v) in add_envs {
debug!(?k, ?v, "inserting environment variable");
command.env(k, v);
}
if let Some(stdin) = stdin {
debug!("set command stdin");
command.stdin(stdin);
}
}
pub(crate) fn reset_screen() {
for cs in [
ClearScreen::WindowsCooked,
ClearScreen::WindowsVt,
ClearScreen::VtLeaveAlt,
ClearScreen::VtWellDone,
ClearScreen::default(),
] {
cs.clear().ok();
}
}


@@ -1,52 +0,0 @@
use std::convert::Infallible;
use miette::Report;
use tracing::error;
use watchexec::{
config::InitConfig,
error::{FsWatcherError, RuntimeError},
handler::SyncFnHandler,
ErrorHook,
};
use crate::args::Args;
pub fn init(_args: &Args) -> InitConfig {
let mut config = InitConfig::default();
config.on_error(SyncFnHandler::from(
|err: ErrorHook| -> std::result::Result<(), Infallible> {
if let RuntimeError::IoError {
about: "waiting on process group",
..
} = err.error
{
// "No child processes" and such
// these are often spurious, so condemn them to -v only
error!("{}", err.error);
return Ok(());
}
if let RuntimeError::FsWatcher {
err:
FsWatcherError::Create { .. }
| FsWatcherError::TooManyWatches { .. }
| FsWatcherError::TooManyHandles { .. },
..
} = err.error
{
err.elevate();
return Ok(());
}
if cfg!(debug_assertions) {
eprintln!("[[{:?}]]", err.error);
}
eprintln!("[[Error (not fatal)]]\n{}", Report::new(err.error));
Ok(())
},
));
config
}


@@ -1,373 +0,0 @@
use std::{
collections::HashMap, convert::Infallible, env::current_dir, ffi::OsString, fs::File,
process::Stdio,
};
use miette::{miette, IntoDiagnostic, Result};
use notify_rust::Notification;
use tracing::{debug, debug_span, error};
use watchexec::{
action::{Action, Outcome, PostSpawn, PreSpawn},
command::{Command, Shell},
config::RuntimeConfig,
error::RuntimeError,
fs::Watcher,
handler::SyncFnHandler,
};
use watchexec_events::{Event, Keyboard, ProcessEnd, Tag};
use watchexec_signals::Signal;
use crate::args::{Args, ClearMode, EmitEvents, OnBusyUpdate};
use crate::state::State;
pub fn runtime(args: &Args, state: &State) -> Result<RuntimeConfig> {
let _span = debug_span!("args-runtime").entered();
let mut config = RuntimeConfig::default();
config.command(interpret_command_args(args)?);
config.pathset(if args.paths.is_empty() {
vec![current_dir().into_diagnostic()?]
} else {
args.paths.clone()
});
config.action_throttle(args.debounce.0);
config.command_grouped(!args.no_process_group);
config.keyboard_emit_eof(args.stdin_quit);
if let Some(interval) = args.poll {
config.file_watcher(Watcher::Poll(interval.0));
}
let clear = args.screen_clear;
let notif = args.notify;
let on_busy = args.on_busy_update;
let signal = args.signal;
let stop_signal = args.stop_signal;
let stop_timeout = args.stop_timeout.0;
let print_events = args.print_events;
let once = args.once;
let delay_run = args.delay_run.map(|ts| ts.0);
config.on_action(move |action: Action| {
let fut = async { Ok::<(), Infallible>(()) };
if print_events {
for (n, event) in action.events.iter().enumerate() {
eprintln!("[EVENT {n}] {event}");
}
}
if once {
action.outcome(Outcome::both(
if let Some(delay) = &delay_run {
Outcome::both(Outcome::Sleep(*delay), Outcome::Start)
} else {
Outcome::Start
},
Outcome::wait(Outcome::Exit),
));
return fut;
}
let signals: Vec<Signal> = action.events.iter().flat_map(Event::signals).collect();
let has_paths = action.events.iter().flat_map(Event::paths).next().is_some();
if signals.contains(&Signal::Terminate) {
action.outcome(Outcome::both(Outcome::Stop, Outcome::Exit));
return fut;
}
if signals.contains(&Signal::Interrupt) {
action.outcome(Outcome::both(Outcome::Stop, Outcome::Exit));
return fut;
}
let is_keyboard_eof = action
.events
.iter()
.any(|e| e.tags.contains(&Tag::Keyboard(Keyboard::Eof)));
if is_keyboard_eof {
action.outcome(Outcome::both(Outcome::Stop, Outcome::Exit));
return fut;
}
if !has_paths {
if !signals.is_empty() {
let mut out = Outcome::DoNothing;
for sig in signals {
out = Outcome::both(out, Outcome::Signal(sig));
}
action.outcome(out);
return fut;
}
let completion = action.events.iter().flat_map(Event::completions).next();
if let Some(status) = completion {
let (msg, printit) = match status {
Some(ProcessEnd::ExitError(code)) => {
(format!("Command exited with {code}"), true)
}
Some(ProcessEnd::ExitSignal(sig)) => {
(format!("Command killed by {sig:?}"), true)
}
Some(ProcessEnd::ExitStop(sig)) => {
(format!("Command stopped by {sig:?}"), true)
}
Some(ProcessEnd::Continued) => ("Command continued".to_string(), true),
Some(ProcessEnd::Exception(ex)) => {
(format!("Command ended by exception {ex:#x}"), true)
}
Some(ProcessEnd::Success) => ("Command was successful".to_string(), false),
None => ("Command completed".to_string(), false),
};
if printit {
eprintln!("[[{msg}]]");
}
if notif {
Notification::new()
.summary("Watchexec: command ended")
.body(&msg)
.show()
.map_or_else(
|err| {
eprintln!("[[Failed to send desktop notification: {err}]]");
},
drop,
);
}
action.outcome(Outcome::DoNothing);
return fut;
}
}
let start = if let Some(mode) = clear {
Outcome::both(
match mode {
ClearMode::Clear => Outcome::Clear,
ClearMode::Reset => Outcome::Reset,
},
Outcome::Start,
)
} else {
Outcome::Start
};
let start = if let Some(delay) = &delay_run {
Outcome::both(Outcome::Sleep(*delay), start)
} else {
start
};
let when_idle = start.clone();
let when_running = match on_busy {
OnBusyUpdate::Restart if cfg!(windows) => Outcome::both(Outcome::Stop, start),
OnBusyUpdate::Restart => Outcome::both(
Outcome::both(
Outcome::Signal(stop_signal.unwrap_or(Signal::Terminate)),
Outcome::wait_timeout(stop_timeout, Outcome::Stop),
),
start,
),
OnBusyUpdate::Signal if cfg!(windows) => Outcome::Stop,
OnBusyUpdate::Signal => {
Outcome::Signal(stop_signal.or(signal).unwrap_or(Signal::Terminate))
}
OnBusyUpdate::Queue => Outcome::wait(start),
OnBusyUpdate::DoNothing => Outcome::DoNothing,
};
action.outcome(Outcome::if_running(when_running, when_idle));
fut
});
let mut add_envs = HashMap::new();
// TODO: move to args?
for pair in &args.env {
if let Some((k, v)) = pair.split_once('=') {
add_envs.insert(k.to_owned(), OsString::from(v));
} else {
return Err(miette!("{pair} is not in key=value format"));
}
}
debug!(
?add_envs,
"additional environment variables to add to command"
);
let workdir = args.workdir.clone();
let emit_events_to = args.emit_events_to;
let emit_file = state.emit_file.clone();
config.on_pre_spawn(move |prespawn: PreSpawn| {
use crate::emits::*;
let workdir = workdir.clone();
let mut add_envs = add_envs.clone();
let mut stdin = None;
match emit_events_to {
EmitEvents::Environment => {
add_envs.extend(emits_to_environment(&prespawn.events));
}
EmitEvents::Stdin => match emits_to_file(&emit_file, &prespawn.events)
.and_then(|path| File::open(path).into_diagnostic())
{
Ok(file) => {
stdin.replace(Stdio::from(file));
}
Err(err) => {
error!("Failed to write events to stdin, continuing without it: {err}");
}
},
EmitEvents::File => match emits_to_file(&emit_file, &prespawn.events) {
Ok(path) => {
add_envs.insert("WATCHEXEC_EVENTS_FILE".into(), path.into());
}
Err(err) => {
error!("Failed to write WATCHEXEC_EVENTS_FILE, continuing without it: {err}");
}
},
EmitEvents::JsonStdin => match emits_to_json_file(&emit_file, &prespawn.events)
.and_then(|path| File::open(path).into_diagnostic())
{
Ok(file) => {
stdin.replace(Stdio::from(file));
}
Err(err) => {
error!("Failed to write events to stdin, continuing without it: {err}");
}
},
EmitEvents::JsonFile => match emits_to_json_file(&emit_file, &prespawn.events) {
Ok(path) => {
add_envs.insert("WATCHEXEC_EVENTS_FILE".into(), path.into());
}
Err(err) => {
error!("Failed to write WATCHEXEC_EVENTS_FILE, continuing without it: {err}");
}
},
EmitEvents::None => {}
}
async move {
if !add_envs.is_empty() || workdir.is_some() || stdin.is_some() {
if let Some(mut command) = prespawn.command().await {
for (k, v) in add_envs {
debug!(?k, ?v, "inserting environment variable");
command.env(k, v);
}
if let Some(ref workdir) = workdir {
debug!(?workdir, "set command workdir");
command.current_dir(workdir);
}
if let Some(stdin) = stdin {
debug!("set command stdin");
command.stdin(stdin);
}
}
}
Ok::<(), Infallible>(())
}
});
config.on_post_spawn(SyncFnHandler::from(move |postspawn: PostSpawn| {
if notif {
Notification::new()
.summary("Watchexec: change detected")
.body(&format!("Running {}", postspawn.command))
.show()
.map_or_else(
|err| {
eprintln!("[[Failed to send desktop notification: {err}]]");
},
drop,
);
}
Ok::<(), Infallible>(())
}));
Ok(config)
}
fn interpret_command_args(args: &Args) -> Result<Command> {
let mut cmd = args.command.clone();
if cmd.is_empty() {
panic!("(clap) Bug: command is not present");
}
Ok(if args.no_shell || args.no_shell_long {
Command::Exec {
prog: cmd.remove(0),
args: cmd,
}
} else {
let (shell, shopts) = if let Some(s) = &args.shell {
if s.is_empty() {
return Err(RuntimeError::CommandShellEmptyShell).into_diagnostic();
} else if s.eq_ignore_ascii_case("powershell") {
(Shell::Powershell, Vec::new())
} else if s.eq_ignore_ascii_case("none") {
return Ok(Command::Exec {
prog: cmd.remove(0),
args: cmd,
});
} else if s.eq_ignore_ascii_case("cmd") {
(cmd_shell(s.into()), Vec::new())
} else {
let sh = s.split_ascii_whitespace().collect::<Vec<_>>();
// UNWRAP: checked by first if branch
#[allow(clippy::unwrap_used)]
let (shprog, shopts) = sh.split_first().unwrap();
(
Shell::Unix((*shprog).to_string()),
shopts.iter().map(|s| (*s).to_string()).collect(),
)
}
} else {
(default_shell(), Vec::new())
};
Command::Shell {
shell,
args: shopts,
command: cmd.join(" "),
}
})
}
// until 2.0, then Powershell
#[cfg(windows)]
fn default_shell() -> Shell {
Shell::Cmd
}
#[cfg(not(windows))]
fn default_shell() -> Shell {
Shell::Unix("sh".to_string())
}
// because Shell::Cmd is only on windows
#[cfg(windows)]
fn cmd_shell(_: String) -> Shell {
Shell::Cmd
}
#[cfg(not(windows))]
fn cmd_shell(s: String) -> Shell {
Shell::Unix(s)
}


@@ -1,10 +1,9 @@
use std::{
collections::HashSet,
env,
path::{Path, PathBuf},
};
use ignore_files::IgnoreFile;
use ignore_files::{IgnoreFile, IgnoreFilesFromOriginArgs};
use miette::{miette, IntoDiagnostic, Result};
use project_origins::ProjectType;
use tokio::fs::canonicalize;
@ -13,11 +12,7 @@ use watchexec::paths::common_prefix;
use crate::args::Args;
pub async fn dirs(args: &Args) -> Result<(PathBuf, PathBuf)> {
let curdir = env::current_dir().into_diagnostic()?;
let curdir = canonicalize(curdir).await.into_diagnostic()?;
debug!(?curdir, "current directory");
pub async fn project_origin(args: &Args) -> Result<PathBuf> {
let project_origin = if let Some(origin) = &args.project_origin {
debug!(?origin, "project origin override");
canonicalize(origin).await.into_diagnostic()?
@ -28,27 +23,19 @@ pub async fn dirs(args: &Args) -> Result<(PathBuf, PathBuf)> {
};
debug!(?homedir, "home directory");
let mut paths = HashSet::new();
for path in &args.paths {
paths.insert(canonicalize(path).await.into_diagnostic()?);
}
let homedir_requested = homedir.as_ref().map_or(false, |home| paths.contains(home));
let homedir_requested = homedir.as_ref().map_or(false, |home| {
args.paths
.binary_search_by_key(home, |w| PathBuf::from(w.clone()))
.is_ok()
});
debug!(
?homedir_requested,
"resolved whether the homedir is explicitly requested"
);
if paths.is_empty() {
debug!("no paths, using current directory");
paths.insert(curdir.clone());
}
debug!(?paths, "resolved all watched paths");
let mut origins = HashSet::new();
for path in paths {
origins.extend(project_origins::origins(&path).await);
for path in &args.paths {
origins.extend(project_origins::origins(path).await);
}
match (homedir, homedir_requested) {
@ -61,7 +48,7 @@ pub async fn dirs(args: &Args) -> Result<(PathBuf, PathBuf)> {
if origins.is_empty() {
debug!("no origins, using current directory");
origins.insert(curdir.clone());
origins.insert(args.workdir.clone().unwrap());
}
debug!(?origins, "resolved all project origins");
@ -74,12 +61,9 @@ pub async fn dirs(args: &Args) -> Result<(PathBuf, PathBuf)> {
.await
.into_diagnostic()?
};
info!(?project_origin, "resolved common/project origin");
debug!(?project_origin, "resolved common/project origin");
let workdir = curdir;
info!(?workdir, "resolved working directory");
Ok((project_origin, workdir))
Ok(project_origin)
}
pub async fn vcs_types(origin: &Path) -> Vec<ProjectType> {
@ -88,17 +72,37 @@ pub async fn vcs_types(origin: &Path) -> Vec<ProjectType> {
.into_iter()
.filter(|pt| pt.is_vcs())
.collect::<Vec<_>>();
info!(?vcs_types, "resolved vcs types");
info!(?vcs_types, "effective vcs types");
vcs_types
}
pub async fn ignores(args: &Args, vcs_types: &[ProjectType], origin: &Path) -> Vec<IgnoreFile> {
pub async fn ignores(args: &Args, vcs_types: &[ProjectType]) -> Result<Vec<IgnoreFile>> {
let origin = args.project_origin.clone().unwrap();
let mut skip_git_global_excludes = false;
let mut ignores = if args.no_project_ignore {
Vec::new()
} else {
let (mut ignores, errors) = ignore_files::from_origin(origin).await;
let ignore_files = args.ignore_files.iter().map(|path| {
if path.is_absolute() {
path.into()
} else {
origin.join(path)
}
});
let (mut ignores, errors) = ignore_files::from_origin(
IgnoreFilesFromOriginArgs::new_unchecked(
&origin,
args.paths.iter().map(PathBuf::from),
ignore_files,
)
.canonicalise()
.await
.into_diagnostic()?,
)
.await;
for err in errors {
warn!("while discovering project-local ignore files: {}", err);
}
@ -188,7 +192,7 @@ pub async fn ignores(args: &Args, vcs_types: &[ProjectType], origin: &Path) -> V
.filter(|ig| {
!ig.applies_in
.as_ref()
.map_or(false, |p| p.starts_with(origin))
.map_or(false, |p| p.starts_with(&origin))
})
.collect::<Vec<_>>();
debug!(
@ -214,5 +218,5 @@ pub async fn ignores(args: &Args, vcs_types: &[ProjectType], origin: &Path) -> V
}
info!(files=?ignores.iter().map(|ig| ig.path.as_path()).collect::<Vec<_>>(), "found some ignores");
ignores
Ok(ignores)
}


@ -12,7 +12,7 @@ pub fn emits_to_environment(events: &[Event]) -> impl Iterator<Item = (String, O
.map(|(k, v)| (format!("WATCHEXEC_{k}_PATH"), v))
}
fn events_to_simple_format(events: &[Event]) -> Result<String> {
pub fn events_to_simple_format(events: &[Event]) -> Result<String> {
let mut buf = String::new();
for event in events {
let feks = event


@@ -1,4 +1,175 @@
mod common;
mod globset;
use std::{
ffi::OsString,
path::{Path, PathBuf, MAIN_SEPARATOR},
sync::Arc,
};
pub use globset::globset;
use miette::{IntoDiagnostic, Result};
use tokio::io::{AsyncBufReadExt, BufReader};
use tracing::{info, trace, trace_span};
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{
filekind::{FileEventKind, ModifyKind},
Event, Priority, Tag,
};
use watchexec_filterer_globset::GlobsetFilterer;
use crate::args::{Args, FsEvent};
pub(crate) mod parse;
mod proglib;
mod progs;
mod syncval;
/// A custom filterer that combines the library's Globset filterer and a switch for --no-meta
#[derive(Debug)]
pub struct WatchexecFilterer {
inner: GlobsetFilterer,
fs_events: Vec<FsEvent>,
progs: Option<progs::FilterProgs>,
}
impl Filterer for WatchexecFilterer {
#[tracing::instrument(level = "trace", skip(self))]
fn check_event(&self, event: &Event, priority: Priority) -> Result<bool, RuntimeError> {
for tag in &event.tags {
if let Tag::FileEventKind(fek) = tag {
let normalised = match fek {
FileEventKind::Access(_) => FsEvent::Access,
FileEventKind::Modify(ModifyKind::Name(_)) => FsEvent::Rename,
FileEventKind::Modify(ModifyKind::Metadata(_)) => FsEvent::Metadata,
FileEventKind::Modify(_) => FsEvent::Modify,
FileEventKind::Create(_) => FsEvent::Create,
FileEventKind::Remove(_) => FsEvent::Remove,
_ => continue,
};
trace!(allowed=?self.fs_events, this=?normalised, "check against fs event filter");
if !self.fs_events.contains(&normalised) {
return Ok(false);
}
}
}
trace!("check against original event");
if !self.inner.check_event(event, priority)? {
return Ok(false);
}
if let Some(progs) = &self.progs {
trace!("check against program filters");
if !progs.check(event)? {
return Ok(false);
}
}
Ok(true)
}
}
impl WatchexecFilterer {
/// Create a new filterer from the given arguments
pub async fn new(args: &Args) -> Result<Arc<Self>> {
let project_origin = args.project_origin.clone().unwrap();
let workdir = args.workdir.clone().unwrap();
let ignore_files = if args.no_discover_ignore {
Vec::new()
} else {
let vcs_types = crate::dirs::vcs_types(&project_origin).await;
crate::dirs::ignores(args, &vcs_types).await?
};
let mut ignores = Vec::new();
if !args.no_default_ignore {
ignores.extend([
(format!("**{MAIN_SEPARATOR}.DS_Store"), None),
(String::from("watchexec.*.log"), None),
(String::from("*.py[co]"), None),
(String::from("#*#"), None),
(String::from(".#*"), None),
(String::from(".*.kate-swp"), None),
(String::from(".*.sw?"), None),
(String::from(".*.sw?x"), None),
(format!("**{MAIN_SEPARATOR}.bzr{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}_darcs{MAIN_SEPARATOR}**"), None),
(
format!("**{MAIN_SEPARATOR}.fossil-settings{MAIN_SEPARATOR}**"),
None,
),
(format!("**{MAIN_SEPARATOR}.git{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.hg{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.pijul{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.svn{MAIN_SEPARATOR}**"), None),
]);
}
let mut filters = args
.filter_patterns
.iter()
.map(|f| (f.to_owned(), Some(workdir.clone())))
.collect::<Vec<_>>();
for filter_file in &args.filter_files {
filters.extend(read_filter_file(filter_file).await?);
}
ignores.extend(
args.ignore_patterns
.iter()
.map(|f| (f.to_owned(), Some(workdir.clone()))),
);
let exts = args
.filter_extensions
.iter()
.map(|e| OsString::from(e.strip_prefix('.').unwrap_or(e)));
info!("initialising Globset filterer");
Ok(Arc::new(Self {
inner: GlobsetFilterer::new(project_origin, filters, ignores, ignore_files, exts)
.await
.into_diagnostic()?,
fs_events: args.filter_fs_events.clone(),
progs: if args.filter_programs_parsed.is_empty() {
None
} else {
Some(progs::FilterProgs::new(args)?)
},
}))
}
}
async fn read_filter_file(path: &Path) -> Result<Vec<(String, Option<PathBuf>)>> {
let _span = trace_span!("loading filter file", ?path).entered();
let file = tokio::fs::File::open(path).await.into_diagnostic()?;
let metadata_len = file
.metadata()
.await
.map(|m| usize::try_from(m.len()))
.unwrap_or(Ok(0))
.into_diagnostic()?;
let filter_capacity = if metadata_len == 0 {
0
} else {
metadata_len / 20
};
let mut filters = Vec::with_capacity(filter_capacity);
let reader = BufReader::new(file);
let mut lines = reader.lines();
while let Some(line) = lines.next_line().await.into_diagnostic()? {
let line = line.trim();
if line.is_empty() || line.starts_with('#') {
continue;
}
trace!(?line, "adding filter line");
filters.push((line.to_owned(), Some(path.to_owned())));
}
Ok(filters)
}


@@ -1,143 +0,0 @@
use std::{
ffi::OsString,
path::{Path, PathBuf, MAIN_SEPARATOR},
sync::Arc,
};
use miette::{IntoDiagnostic, Result};
use tokio::io::{AsyncBufReadExt, BufReader};
use tracing::{info, trace, trace_span};
use watchexec::{
error::RuntimeError,
event::{
filekind::{FileEventKind, ModifyKind},
Event, Priority, Tag,
},
filter::Filterer,
};
use watchexec_filterer_globset::GlobsetFilterer;
use crate::args::{Args, FsEvent};
pub async fn globset(args: &Args) -> Result<Arc<WatchexecFilterer>> {
let (project_origin, workdir) = super::common::dirs(args).await?;
let ignore_files = if args.no_discover_ignore {
Vec::new()
} else {
let vcs_types = super::common::vcs_types(&project_origin).await;
super::common::ignores(args, &vcs_types, &project_origin).await
};
let mut ignores = Vec::new();
if !args.no_default_ignore {
ignores.extend([
(format!("**{MAIN_SEPARATOR}.DS_Store"), None),
(String::from("watchexec.*.log"), None),
(String::from("*.py[co]"), None),
(String::from("#*#"), None),
(String::from(".#*"), None),
(String::from(".*.kate-swp"), None),
(String::from(".*.sw?"), None),
(String::from(".*.sw?x"), None),
(format!("**{MAIN_SEPARATOR}.bzr{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}_darcs{MAIN_SEPARATOR}**"), None),
(
format!("**{MAIN_SEPARATOR}.fossil-settings{MAIN_SEPARATOR}**"),
None,
),
(format!("**{MAIN_SEPARATOR}.git{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.hg{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.pijul{MAIN_SEPARATOR}**"), None),
(format!("**{MAIN_SEPARATOR}.svn{MAIN_SEPARATOR}**"), None),
]);
}
let mut filters = args
.filter_patterns
.iter()
.map(|f| (f.to_owned(), Some(workdir.clone())))
.collect::<Vec<_>>();
for filter_file in &args.filter_files {
filters.extend(read_filter_file(filter_file).await?);
}
ignores.extend(
args.ignore_patterns
.iter()
.map(|f| (f.to_owned(), Some(workdir.clone()))),
);
let exts = args
.filter_extensions
.iter()
.map(|e| OsString::from(e.strip_prefix('.').unwrap_or(e)));
info!("initialising Globset filterer");
Ok(Arc::new(WatchexecFilterer {
inner: GlobsetFilterer::new(project_origin, filters, ignores, ignore_files, exts)
.await
.into_diagnostic()?,
fs_events: args.filter_fs_events.clone(),
}))
}
async fn read_filter_file(path: &Path) -> Result<Vec<(String, Option<PathBuf>)>> {
let _span = trace_span!("loading filter file", ?path).entered();
let file = tokio::fs::File::open(path).await.into_diagnostic()?;
let mut filters =
Vec::with_capacity(file.metadata().await.map(|m| m.len() as usize).unwrap_or(0) / 20);
let reader = BufReader::new(file);
let mut lines = reader.lines();
while let Some(line) = lines.next_line().await.into_diagnostic()? {
let line = line.trim();
if line.is_empty() || line.starts_with('#') {
continue;
}
trace!(?line, "adding filter line");
filters.push((line.to_owned(), Some(path.to_owned())));
}
Ok(filters)
}
/// A custom filterer that combines the library's Globset filterer and a switch for --no-meta
#[derive(Debug)]
pub struct WatchexecFilterer {
inner: GlobsetFilterer,
fs_events: Vec<FsEvent>,
}
impl Filterer for WatchexecFilterer {
fn check_event(&self, event: &Event, priority: Priority) -> Result<bool, RuntimeError> {
for tag in &event.tags {
if let Tag::FileEventKind(fek) = tag {
let normalised = match fek {
FileEventKind::Access(_) => FsEvent::Access,
FileEventKind::Modify(ModifyKind::Name(_)) => FsEvent::Rename,
FileEventKind::Modify(ModifyKind::Metadata(_)) => FsEvent::Metadata,
FileEventKind::Modify(_) => FsEvent::Modify,
FileEventKind::Create(_) => FsEvent::Create,
FileEventKind::Remove(_) => FsEvent::Remove,
_ => continue,
};
if !self.fs_events.contains(&normalised) {
return Ok(false);
}
}
}
trace!("check against original event");
if !self.inner.check_event(event, priority)? {
return Ok(false);
}
Ok(true)
}
}


@@ -0,0 +1,17 @@
use miette::{miette, Result};
pub fn parse_filter_program((n, prog): (usize, String)) -> Result<jaq_syn::Main> {
let parser = jaq_parse::main();
let (main, errs) = jaq_parse::parse(&prog, parser);
if !errs.is_empty() {
let errs = errs
.into_iter()
.map(|err| err.to_string())
.collect::<Vec<_>>()
.join("\n");
return Err(miette!("{}", errs).wrap_err(format!("failed to load filter program #{}", n)));
}
main.ok_or_else(|| miette!("failed to load filter program #{} (no reason given)", n))
}


@@ -0,0 +1,27 @@
use jaq_interpret::ParseCtx;
use miette::Result;
use tracing::debug;
mod file;
mod hash;
mod kv;
mod macros;
mod output;
pub fn jaq_lib() -> Result<ParseCtx> {
let mut jaq = ParseCtx::new(Vec::new());
debug!("loading jaq core library");
jaq.insert_natives(jaq_core::core());
debug!("loading jaq std library");
jaq.insert_defs(jaq_std::std());
debug!("loading jaq watchexec library");
file::load(&mut jaq);
hash::load(&mut jaq);
kv::load(&mut jaq);
output::load(&mut jaq);
Ok(jaq)
}


@@ -0,0 +1,173 @@
use std::{
fs::{metadata, File, FileType, Metadata},
io::{BufReader, Read},
iter::once,
time::{SystemTime, UNIX_EPOCH},
};
use jaq_interpret::{Error, Native, ParseCtx, Val};
use serde_json::{json, Value};
use tracing::{debug, error, trace};
use super::macros::*;
pub fn load(jaq: &mut ParseCtx) {
trace!("jaq: add file_read filter");
jaq.insert_native(
"file_read".into(),
1,
Native::new({
move |args, (ctx, val)| {
let path = match &val {
Val::Str(v) => v.to_string(),
_ => return_err!(Err(Error::str("expected string (path) but got {val:?}"))),
};
let bytes = match int_arg!(args, 0, ctx, &val) {
Ok(v) => v,
Err(e) => return_err!(Err(e)),
};
Box::new(once(Ok(match File::open(&path) {
Ok(file) => {
let buf_reader = BufReader::new(file);
let mut limited = buf_reader.take(bytes);
let mut buffer = String::with_capacity(bytes as _);
match limited.read_to_string(&mut buffer) {
Ok(read) => {
debug!("jaq: read {read} bytes from {path:?}");
Val::Str(buffer.into())
}
Err(err) => {
error!("jaq: failed to read from {path:?}: {err:?}");
Val::Null
}
}
}
Err(err) => {
error!("jaq: failed to open file {path:?}: {err:?}");
Val::Null
}
})))
}
}),
);
trace!("jaq: add file_meta filter");
jaq.insert_native(
"file_meta".into(),
0,
Native::new({
move |_, (_, val)| {
let path = match &val {
Val::Str(v) => v.to_string(),
_ => return_err!(Err(Error::str("expected string (path) but got {val:?}"))),
};
Box::new(once(Ok(match metadata(&path) {
Ok(meta) => Val::from(json_meta(meta)),
Err(err) => {
error!("jaq: failed to open {path:?}: {err:?}");
Val::Null
}
})))
}
}),
);
trace!("jaq: add file_size filter");
jaq.insert_native(
"file_size".into(),
0,
Native::new({
move |_, (_, val)| {
let path = match &val {
Val::Str(v) => v.to_string(),
_ => return_err!(Err(Error::str("expected string (path) but got {val:?}"))),
};
Box::new(once(Ok(match metadata(&path) {
Ok(meta) => Val::Int(meta.len() as _),
Err(err) => {
error!("jaq: failed to open {path:?}: {err:?}");
Val::Null
}
})))
}
}),
);
}
fn json_meta(meta: Metadata) -> Value {
let perms = meta.permissions();
let mut val = json!({
"type": filetype_str(meta.file_type()),
"size": meta.len(),
"modified": fs_time(meta.modified()),
"accessed": fs_time(meta.accessed()),
"created": fs_time(meta.created()),
"dir": meta.is_dir(),
"file": meta.is_file(),
"symlink": meta.is_symlink(),
"readonly": perms.readonly(),
});
#[cfg(unix)]
{
use std::os::unix::fs::PermissionsExt;
let map = val.as_object_mut().unwrap();
map.insert(
"mode".to_string(),
Value::String(format!("{:o}", perms.mode())),
);
map.insert("mode_byte".to_string(), Value::from(perms.mode()));
map.insert(
"executable".to_string(),
Value::Bool(perms.mode() & 0o111 != 0),
);
}
val
}
fn filetype_str(filetype: FileType) -> &'static str {
#[cfg(unix)]
{
use std::os::unix::fs::FileTypeExt;
if filetype.is_char_device() {
return "char";
} else if filetype.is_block_device() {
return "block";
} else if filetype.is_fifo() {
return "fifo";
} else if filetype.is_socket() {
return "socket";
}
}
#[cfg(windows)]
{
use std::os::windows::fs::FileTypeExt;
if filetype.is_symlink_dir() {
return "symdir";
} else if filetype.is_symlink_file() {
return "symfile";
}
}
if filetype.is_dir() {
"dir"
} else if filetype.is_file() {
"file"
} else if filetype.is_symlink() {
"symlink"
} else {
"unknown"
}
}
fn fs_time(time: std::io::Result<SystemTime>) -> Option<u64> {
time.ok()
.and_then(|time| time.duration_since(UNIX_EPOCH).ok())
.map(|dur| dur.as_secs())
}


@@ -0,0 +1,62 @@
use std::{fs::File, io::Read, iter::once};
use jaq_interpret::{Error, Native, ParseCtx, Val};
use tracing::{debug, error, trace};
use super::macros::*;
pub fn load(jaq: &mut ParseCtx) {
trace!("jaq: add hash filter");
jaq.insert_native(
"hash".into(),
0,
Native::new({
move |_, (_, val)| {
let string = match &val {
Val::Str(v) => v.to_string(),
_ => return_err!(Err(Error::str("expected string but got {val:?}"))),
};
Box::new(once(Ok(Val::Str(
blake3::hash(string.as_bytes()).to_hex().to_string().into(),
))))
}
}),
);
trace!("jaq: add file_hash filter");
jaq.insert_native(
"file_hash".into(),
0,
Native::new({
move |_, (_, val)| {
let path = match &val {
Val::Str(v) => v.to_string(),
_ => return_err!(Err(Error::str("expected string but got {val:?}"))),
};
Box::new(once(Ok(match File::open(&path) {
Ok(mut file) => {
const BUFFER_SIZE: usize = 1024 * 1024;
let mut hasher = blake3::Hasher::new();
let mut buf = vec![0; BUFFER_SIZE];
while let Ok(bytes) = file.read(&mut buf) {
debug!("jaq: read {bytes} bytes from {path:?}");
if bytes == 0 {
break;
}
hasher.update(&buf[..bytes]);
buf = vec![0; BUFFER_SIZE];
}
Val::Str(hasher.finalize().to_hex().to_string().into())
}
Err(err) => {
error!("jaq: failed to open file {path:?}: {err:?}");
Val::Null
}
})))
}
}),
);
}


@@ -0,0 +1,69 @@
use std::{iter::once, sync::Arc};
use dashmap::DashMap;
use jaq_interpret::{Error, Native, ParseCtx, Val};
use once_cell::sync::OnceCell;
use tracing::trace;
use crate::filterer::syncval::SyncVal;
use super::macros::*;
type KvStore = Arc<DashMap<String, SyncVal>>;
fn kv_store() -> KvStore {
static KV_STORE: OnceCell<KvStore> = OnceCell::new();
KV_STORE.get_or_init(|| KvStore::default()).clone()
}
pub fn load(jaq: &mut ParseCtx) {
trace!("jaq: add kv_clear filter");
jaq.insert_native(
"kv_clear".into(),
0,
Native::new({
move |_, (_, val)| {
let kv = kv_store();
kv.clear();
Box::new(once(Ok(val)))
}
}),
);
trace!("jaq: add kv_store filter");
jaq.insert_native(
"kv_store".into(),
1,
Native::new({
move |args, (ctx, val)| {
let kv = kv_store();
let key = match string_arg!(args, 0, ctx, val) {
Ok(v) => v,
Err(e) => return_err!(Err(e)),
};
kv.insert(key, (&val).into());
Box::new(once(Ok(val)))
}
}),
);
trace!("jaq: add kv_fetch filter");
jaq.insert_native(
"kv_fetch".into(),
1,
Native::new({
move |args, (ctx, val)| {
let kv = kv_store();
let key = match string_arg!(args, 0, ctx, val) {
Ok(v) => v,
Err(e) => return_err!(Err(e)),
};
Box::new(once(Ok(kv
.get(&key)
.map(|val| val.value().into())
.unwrap_or(Val::Null))))
}
}),
);
}


@@ -0,0 +1,30 @@
macro_rules! return_err {
($err:expr) => {
return Box::new(once($err))
};
}
pub(crate) use return_err;
macro_rules! string_arg {
($args:expr, $n:expr, $ctx:expr, $val:expr) => {
match ::jaq_interpret::FilterT::run($args.get($n), ($ctx.clone(), $val.clone())).next() {
Some(Ok(Val::Str(v))) => Ok(v.to_string()),
Some(Ok(val)) => Err(Error::str(format!("expected string but got {val:?}"))),
Some(Err(e)) => Err(e),
None => Err(Error::str("value expected but none found")),
}
};
}
pub(crate) use string_arg;
macro_rules! int_arg {
($args:expr, $n:expr, $ctx:expr, $val:expr) => {
match ::jaq_interpret::FilterT::run($args.get($n), ($ctx.clone(), $val.clone())).next() {
Some(Ok(Val::Int(v))) => Ok(v as _),
Some(Ok(val)) => Err(Error::str(format!("expected int but got {val:?}"))),
Some(Err(e)) => Err(e),
None => Err(Error::str("value expected but none found")),
}
};
}
pub(crate) use int_arg;


@@ -0,0 +1,83 @@
use std::iter::once;
use jaq_interpret::{Error, Native, ParseCtx, Val};
use tracing::{debug, error, info, trace, warn};
use super::macros::*;
macro_rules! log_action {
($level:expr, $val:expr) => {
match $level.to_ascii_lowercase().as_str() {
"trace" => trace!("jaq: {}", $val),
"debug" => debug!("jaq: {}", $val),
"info" => info!("jaq: {}", $val),
"warn" => warn!("jaq: {}", $val),
"error" => error!("jaq: {}", $val),
_ => return_err!(Err(Error::str("invalid log level"))),
}
};
}
pub fn load(jaq: &mut ParseCtx) {
trace!("jaq: add log filter");
jaq.insert_native(
"log".into(),
1,
Native::with_update(
|args, (ctx, val)| {
let level = match string_arg!(args, 0, ctx, val) {
Ok(v) => v,
Err(e) => return_err!(Err(e)),
};
log_action!(level, val);
// passthrough
Box::new(once(Ok(val)))
},
|args, (ctx, val), _| {
let level = match string_arg!(args, 0, ctx, val) {
Ok(v) => v,
Err(e) => return_err!(Err(e)),
};
log_action!(level, val);
// passthrough
Box::new(once(Ok(val)))
},
),
);
trace!("jaq: add printout filter");
jaq.insert_native(
"printout".into(),
0,
Native::with_update(
|_, (_, val)| {
println!("{}", val);
Box::new(once(Ok(val)))
},
|_, (_, val), _| {
println!("{}", val);
Box::new(once(Ok(val)))
},
),
);
trace!("jaq: add printerr filter");
jaq.insert_native(
"printerr".into(),
0,
Native::with_update(
|_, (_, val)| {
eprintln!("{}", val);
Box::new(once(Ok(val)))
},
|_, (_, val), _| {
eprintln!("{}", val);
Box::new(once(Ok(val)))
},
),
);
}


@@ -0,0 +1,143 @@
use std::{iter::empty, marker::PhantomData};
use jaq_interpret::{Ctx, FilterT, RcIter, Val};
use miette::miette;
use tokio::{
sync::{mpsc, oneshot},
task::{block_in_place, spawn_blocking},
};
use tracing::{error, trace, warn};
use watchexec::error::RuntimeError;
use watchexec_events::Event;
use crate::args::Args;
const BUFFER: usize = 128;
#[derive(Debug)]
pub struct FilterProgs {
channel: Requester<Event, bool>,
}
#[derive(Debug, Clone)]
pub struct Requester<S, R> {
sender: mpsc::Sender<(S, oneshot::Sender<R>)>,
_receiver: PhantomData<R>,
}
impl<S, R> Requester<S, R>
where
S: Send + Sync,
R: Send + Sync,
{
pub fn new(capacity: usize) -> (Self, mpsc::Receiver<(S, oneshot::Sender<R>)>) {
let (sender, receiver) = mpsc::channel(capacity);
(
Self {
sender,
_receiver: PhantomData,
},
receiver,
)
}
pub fn call(&self, value: S) -> Result<R, RuntimeError> {
// FIXME: this should really be async with a timeout, but that needs filtering in general
// to be async, which should be done at some point
block_in_place(|| {
let (sender, receiver) = oneshot::channel();
self.sender.blocking_send((value, sender)).map_err(|err| {
RuntimeError::External(miette!("filter progs internal channel: {}", err).into())
})?;
receiver
.blocking_recv()
.map_err(|err| RuntimeError::External(Box::new(err)))
})
}
}
impl FilterProgs {
pub fn check(&self, event: &Event) -> Result<bool, RuntimeError> {
self.channel.call(event.clone())
}
pub fn new(args: &Args) -> miette::Result<Self> {
let progs = args.filter_programs_parsed.clone();
eprintln!(
"EXPERIMENTAL: filter programs are unstable and may change/vanish without notice"
);
let (requester, mut receiver) = Requester::<Event, bool>::new(BUFFER);
let task =
spawn_blocking(move || {
'chan: while let Some((event, sender)) = receiver.blocking_recv() {
let val = serde_json::to_value(&event)
.map_err(|err| miette!("failed to serialize event: {}", err))
.map(Val::from)?;
for (n, prog) in progs.iter().enumerate() {
trace!(?n, "trying filter program");
let mut jaq = super::proglib::jaq_lib()?;
let filter = jaq.compile(prog.clone());
if !jaq.errs.is_empty() {
for (error, span) in jaq.errs {
error!(%error, "failed to compile filter program #{n}@{}:{}", span.start, span.end);
}
continue;
}
let inputs = RcIter::new(empty());
let mut results = filter.run((Ctx::new([], &inputs), val.clone()));
if let Some(res) = results.next() {
match res {
Ok(Val::Bool(false)) => {
trace!(
?n,
verdict = false,
"filter program finished; fail so stopping there"
);
sender
.send(false)
.unwrap_or_else(|_| warn!("failed to send filter result"));
continue 'chan;
}
Ok(Val::Bool(true)) => {
trace!(
?n,
verdict = true,
"filter program finished; pass so trying next"
);
continue;
}
Ok(val) => {
error!(?n, ?val, "filter program returned non-boolean, ignoring and trying next");
continue;
}
Err(err) => {
error!(?n, error=%err, "filter program failed, so trying next");
continue;
}
}
}
}
trace!("all filters failed, sending pass as default");
sender
.send(true)
.unwrap_or_else(|_| warn!("failed to send filter result"));
}
Ok(()) as miette::Result<()>
});
tokio::spawn(async {
match task.await {
Ok(Ok(())) => {}
Ok(Err(err)) => error!("filter progs task failed: {}", err),
Err(err) => error!("filter progs task panicked: {}", err),
}
});
Ok(Self { channel: requester })
}
}


@@ -0,0 +1,71 @@
/// Jaq's [Val](jaq_interpret::Val) uses Rc, but we want to use in Sync contexts. UGH!
use std::{rc::Rc, sync::Arc};
use indexmap::IndexMap;
use jaq_interpret::Val;
#[derive(Clone, Debug)]
pub enum SyncVal {
Null,
Bool(bool),
Int(isize),
Float(f64),
Num(Arc<str>),
Str(Arc<str>),
Arr(Arc<[SyncVal]>),
Obj(Arc<IndexMap<Arc<str>, SyncVal>>),
}
impl From<&Val> for SyncVal {
fn from(val: &Val) -> Self {
match val {
Val::Null => Self::Null,
Val::Bool(b) => Self::Bool(*b),
Val::Int(i) => Self::Int(*i),
Val::Float(f) => Self::Float(*f),
Val::Num(s) => Self::Num(s.to_string().into()),
Val::Str(s) => Self::Str(s.to_string().into()),
Val::Arr(a) => Self::Arr({
let mut arr = Vec::with_capacity(a.len());
for v in a.iter() {
arr.push(v.into());
}
arr.into()
}),
Val::Obj(m) => Self::Obj(Arc::new({
let mut map = IndexMap::new();
for (k, v) in m.iter() {
map.insert(k.to_string().into(), v.into());
}
map
})),
}
}
}
impl From<&SyncVal> for Val {
fn from(val: &SyncVal) -> Self {
match val {
SyncVal::Null => Self::Null,
SyncVal::Bool(b) => Self::Bool(*b),
SyncVal::Int(i) => Self::Int(*i),
SyncVal::Float(f) => Self::Float(*f),
SyncVal::Num(s) => Self::Num(s.to_string().into()),
SyncVal::Str(s) => Self::Str(s.to_string().into()),
SyncVal::Arr(a) => Self::Arr({
let mut arr = Vec::with_capacity(a.len());
for v in a.iter() {
arr.push(v.into());
}
arr.into()
}),
SyncVal::Obj(m) => Self::Obj(Rc::new({
let mut map: IndexMap<_, _, ahash::RandomState> = Default::default();
for (k, v) in m.iter() {
map.insert(k.to_string().into(), v.into());
}
map
})),
}
}
}


@@ -1,115 +1,37 @@
#![deny(rust_2018_idioms)]
#![allow(clippy::missing_const_for_fn, clippy::future_not_send)]
use std::{env::var, fs::File, io::Write, process::Stdio, sync::Mutex};
use std::{io::Write, process::Stdio};
use args::{Args, ShellCompletion};
use clap::CommandFactory;
use clap_complete::{Generator, Shell};
use clap_mangen::Man;
use command_group::AsyncCommandGroup;
use is_terminal::IsTerminal;
use miette::{IntoDiagnostic, Result};
use tokio::{fs::metadata, io::AsyncWriteExt, process::Command};
use tracing::{debug, info, warn};
use watchexec::{
event::{Event, Priority},
Watchexec,
};
use tokio::{io::AsyncWriteExt, process::Command};
use tracing::{debug, info};
use watchexec::Watchexec;
use watchexec_events::{Event, Priority};
use crate::filterer::WatchexecFilterer;
pub mod args;
mod config;
mod dirs;
mod emits;
mod filterer;
mod state;
async fn init() -> Result<Args> {
let mut log_on = false;
#[cfg(feature = "dev-console")]
match console_subscriber::try_init() {
Ok(_) => {
warn!("dev-console enabled");
log_on = true;
}
Err(e) => {
eprintln!("Failed to initialise tokio console, falling back to normal logging\n{e}")
}
}
if !log_on && var("RUST_LOG").is_ok() {
match tracing_subscriber::fmt::try_init() {
Ok(_) => {
warn!(RUST_LOG=%var("RUST_LOG").unwrap(), "logging configured from RUST_LOG");
log_on = true;
}
Err(e) => eprintln!("Failed to initialise logging with RUST_LOG, falling back\n{e}"),
}
}
let args = args::get_args();
let verbosity = args.verbose.unwrap_or(0);
if log_on {
warn!("ignoring logging options from args");
} else if verbosity > 0 {
let log_file = if let Some(file) = &args.log_file {
let is_dir = metadata(&file).await.map_or(false, |info| info.is_dir());
let path = if is_dir {
let filename = format!(
"watchexec.{}.log",
chrono::Utc::now().format("%Y-%m-%dT%H-%M-%SZ")
);
file.join(filename)
} else {
file.to_owned()
};
// TODO: use tracing-appender instead
Some(File::create(path).into_diagnostic()?)
} else {
None
};
let mut builder = tracing_subscriber::fmt().with_env_filter(match verbosity {
0 => unreachable!("checked by if earlier"),
1 => "warn",
2 => "info",
3 => "debug",
_ => "trace",
});
if verbosity > 2 {
use tracing_subscriber::fmt::format::FmtSpan;
builder = builder.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE);
}
match if let Some(writer) = log_file {
builder.json().with_writer(Mutex::new(writer)).try_init()
} else if verbosity > 3 {
builder.pretty().try_init()
} else {
builder.try_init()
} {
Ok(_) => info!("logging initialised"),
Err(e) => eprintln!("Failed to initialise logging, continuing with none\n{e}"),
}
}
Ok(args)
}
async fn run_watchexec(args: Args) -> Result<()> {
info!(version=%env!("CARGO_PKG_VERSION"), "constructing Watchexec from CLI");
let init = config::init(&args);
let state = state::State::new()?;
let mut runtime = config::runtime(&args, &state)?;
runtime.filterer(filterer::globset(&args).await?);
let state = state::State::default();
let config = config::make_config(&args, &state)?;
config.filterer(WatchexecFilterer::new(&args).await?);
info!("initialising Watchexec runtime");
let wx = Watchexec::new(init, runtime)?;
let wx = Watchexec::with_config(config)?;
if !args.postpone {
debug!("kicking off with empty event");
@ -118,6 +40,11 @@ async fn run_watchexec(args: Args) -> Result<()> {
info!("running main loop");
wx.main().await.into_diagnostic()??;
if matches!(args.screen_clear, Some(args::ClearMode::Reset)) {
config::reset_screen();
}
info!("done with main loop");
Ok(())
@ -137,12 +64,10 @@ async fn run_manpage(_args: Args) -> Result<()> {
.stdin(Stdio::piped())
.stdout(Stdio::inherit())
.stderr(Stdio::inherit())
.group()
.kill_on_drop(true)
.spawn()
.into_diagnostic()?;
child
.inner()
.stdin
.as_mut()
.unwrap()
@ -169,14 +94,15 @@ async fn run_manpage(_args: Args) -> Result<()> {
Ok(())
}
#[allow(clippy::unused_async)]
async fn run_completions(shell: ShellCompletion) -> Result<()> {
info!(version=%env!("CARGO_PKG_VERSION"), "constructing completions");
fn generate(generator: impl Generator) {
let mut cmd = Args::command();
clap_complete::generate(generator, &mut cmd, "watchexec", &mut std::io::stdout());
}
info!(version=%env!("CARGO_PKG_VERSION"), "constructing completions");
match shell {
ShellCompletion::Bash => generate(Shell::Bash),
ShellCompletion::Elvish => generate(Shell::Elvish),
@ -190,8 +116,7 @@ async fn run_completions(shell: ShellCompletion) -> Result<()> {
}
pub async fn run() -> Result<()> {
let args = init().await?;
debug!(?args, "arguments");
let (args, _log_guard) = args::get_args().await?;
if args.manual {
run_manpage(args).await

View File

@ -1,14 +1,22 @@
#[cfg(feature = "eyra")]
extern crate eyra;
use miette::IntoDiagnostic;
#[cfg(target_env = "musl")]
#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;
#[tokio::main]
async fn main() -> miette::Result<()> {
watchexec_cli::run().await?;
fn main() -> miette::Result<()> {
#[cfg(feature = "pid1")]
pid1::Pid1Settings::new()
.enable_log(cfg!(feature = "pid1-withlog"))
.launch()
.into_diagnostic()?;
if std::process::id() == 1 {
std::process::exit(0);
}
Ok(())
tokio::runtime::Builder::new_multi_thread()
.enable_all()
.build()
.unwrap()
.block_on(async { watchexec_cli::run().await })
}

View File

@ -1,4 +1,5 @@
use std::{
env::var_os,
io::Write,
path::PathBuf,
sync::{Arc, Mutex},
@ -7,41 +8,41 @@ use std::{
use miette::{IntoDiagnostic, Result};
use tempfile::NamedTempFile;
#[derive(Clone, Debug)]
#[derive(Clone, Debug, Default)]
pub struct State {
pub emit_file: RotatingTempFile,
}
impl State {
pub fn new() -> Result<Self> {
let emit_file = RotatingTempFile::new()?;
Ok(Self { emit_file })
}
}
#[derive(Clone, Debug)]
pub struct RotatingTempFile(Arc<Mutex<NamedTempFile>>);
#[derive(Clone, Debug, Default)]
pub struct RotatingTempFile(Arc<Mutex<Option<NamedTempFile>>>);
impl RotatingTempFile {
pub fn new() -> Result<Self> {
let file = Arc::new(Mutex::new(NamedTempFile::new().into_diagnostic()?));
Ok(Self(file))
}
pub fn rotate(&self) -> Result<()> {
let mut file = self.0.lock().unwrap();
*file = NamedTempFile::new().into_diagnostic()?;
// implicitly drops the old file
*self.0.lock().unwrap() = Some(
if let Some(dir) = var_os("WATCHEXEC_TMPDIR") {
NamedTempFile::new_in(dir)
} else {
NamedTempFile::new()
}
.into_diagnostic()?,
);
Ok(())
}
pub fn write(&self, data: &[u8]) -> Result<()> {
let mut file = self.0.lock().unwrap();
file.write_all(data).into_diagnostic()?;
if let Some(file) = self.0.lock().unwrap().as_mut() {
file.write_all(data).into_diagnostic()?;
}
Ok(())
}
pub fn path(&self) -> PathBuf {
self.0.lock().unwrap().path().to_owned()
if let Some(file) = self.0.lock().unwrap().as_ref() {
file.path().to_owned()
} else {
PathBuf::new()
}
}
}
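
Because the temp file is now created lazily, callers get a usable `Default` value and only pay for the file once it is first rotated. A minimal sketch of the new behaviour; the `demo` function is illustrative, not part of the crate:

```rust
use miette::Result;

// Illustrative only: Default starts with no backing file, so write() is a
// no-op and path() is empty until rotate() creates a NamedTempFile
// (in $WATCHEXEC_TMPDIR when that variable is set).
fn demo() -> Result<()> {
    let emit_file = RotatingTempFile::default();
    assert_eq!(emit_file.path(), std::path::PathBuf::new()); // nothing yet
    emit_file.rotate()?; // creates the backing temp file
    emit_file.write(b"[]\n")?; // now lands in the current file
    println!("emitting events to {}", emit_file.path().display());
    Ok(())
}
```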

View File

@ -0,0 +1,121 @@
use std::path::PathBuf;
use std::{fs, sync::OnceLock};
use miette::{Context, IntoDiagnostic, Result};
use rand::Rng;
static PLACEHOLDER_DATA: OnceLock<String> = OnceLock::new();
fn get_placeholder_data() -> &'static str {
PLACEHOLDER_DATA.get_or_init(|| "PLACEHOLDER\n".repeat(500))
}
/// The amount of nesting that will be used for generated files
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) enum GeneratedFileNesting {
/// Only one level of files
Flat,
/// Random, up to a certain maximum
RandomToMax(usize),
}
/// Configuration for creating testing subfolders
#[derive(Debug, Clone, PartialEq, Eq)]
pub(crate) struct TestSubfolderConfiguration {
/// The amount of nesting that will be used when folders are generated
pub(crate) nesting: GeneratedFileNesting,
/// Number of files the folder should contain
pub(crate) file_count: usize,
/// Subfolder name
pub(crate) name: String,
}
/// Options for generating test files
#[derive(Debug, Clone, PartialEq, Eq, Default)]
pub(crate) struct GenerateTestFilesArgs {
/// The path where the files should be generated
/// if None, the current working directory will be used.
pub(crate) path: Option<PathBuf>,
/// Configurations for subfolders to generate
pub(crate) subfolder_configs: Vec<TestSubfolderConfiguration>,
}
/// Generate test files
///
/// This returns the root path followed by one path per requested subfolder, in the order of
/// `subfolder_configs` (one more path than the number of configurations).
pub(crate) fn generate_test_files(args: GenerateTestFilesArgs) -> Result<Vec<PathBuf>> {
// Use or create a temporary directory for the test files
let tmpdir = if let Some(p) = args.path {
p
} else {
tempfile::tempdir()
.into_diagnostic()
.wrap_err("failed to build tempdir")?
.into_path()
};
let mut paths = vec![tmpdir.clone()];
// Generate subfolders matching each config
for subfolder_config in args.subfolder_configs.iter() {
// Create the subfolder path
let subfolder_path = tmpdir.join(&subfolder_config.name);
fs::create_dir(&subfolder_path)
.into_diagnostic()
.wrap_err(format!(
"failed to create path for dir [{}]",
subfolder_path.display()
))?;
paths.push(subfolder_path.clone());
// Fill the subfolder with files
match subfolder_config.nesting {
GeneratedFileNesting::Flat => {
for idx in 0..subfolder_config.file_count {
// Write stub file contents
fs::write(
subfolder_path.join(format!("stub-file-{idx}")),
get_placeholder_data(),
)
.into_diagnostic()
.wrap_err(format!(
"failed to write temporary file in subfolder {} @ idx {idx}",
subfolder_path.display()
))?;
}
}
GeneratedFileNesting::RandomToMax(max_depth) => {
let mut generator = rand::thread_rng();
for idx in 0..subfolder_config.file_count {
// Build a randomized path up to max depth
let mut generated_path = subfolder_path.clone();
let depth = generator.gen_range(0..max_depth);
for _ in 0..depth {
generated_path.push("stub-dir");
}
// Create the path
fs::create_dir_all(&generated_path)
.into_diagnostic()
.wrap_err(format!(
"failed to create randomly generated path [{}]",
generated_path.display()
))?;
// Write stub file contents @ the new randomized path
fs::write(
generated_path.join(format!("stub-file-{idx}")),
get_placeholder_data(),
)
.into_diagnostic()
.wrap_err(format!(
"failed to write temporary file in subfolder {} @ idx {idx}",
subfolder_path.display()
))?;
}
}
}
}
Ok(paths)
}
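
To make the return shape concrete, a minimal test-side sketch; the subfolder names and counts here are illustrative:

```rust
#[test]
fn returns_root_then_one_path_per_subfolder() {
    // Illustrative: one flat subfolder and one nested one; the Vec comes
    // back as [root, "watch" subfolder, "ignored" subfolder].
    let paths = generate_test_files(GenerateTestFilesArgs {
        path: None, // let the helper create its own tempdir
        subfolder_configs: vec![
            TestSubfolderConfiguration {
                name: "watch".into(),
                nesting: GeneratedFileNesting::Flat,
                file_count: 5,
            },
            TestSubfolderConfiguration {
                name: "ignored".into(),
                nesting: GeneratedFileNesting::RandomToMax(8),
                file_count: 100,
            },
        ],
    })
    .expect("generating test files");
    assert_eq!(paths.len(), 3); // root + the two subfolders
}
```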

crates/cli/tests/ignore.rs (new file, 146 lines)
View File

@ -0,0 +1,146 @@
use std::{
path::{Path, PathBuf},
process::Stdio,
time::Duration,
};
use miette::{IntoDiagnostic, Result, WrapErr};
use tokio::{process::Command, time::Instant};
use tracing_test::traced_test;
use uuid::Uuid;
mod common;
use common::{generate_test_files, GenerateTestFilesArgs};
use crate::common::{GeneratedFileNesting, TestSubfolderConfiguration};
/// Directory name that will be used for the dir that *should* be watched
const WATCH_DIR_NAME: &str = "watch";
/// The token that watch will echo every time a match is found
const WATCH_TOKEN: &str = "updated";
/// Ensure that watchexec runtime does not increase with the
/// number of *ignored* files in a given folder
///
/// This test creates two separate folders, one small and the other large
///
/// Each folder has two subfolders:
/// - a shallow one to be watched, with a few files at a single depth (5 files)
/// - a deep one to be ignored, with many files at varying depths (small case 200 files, large case 200,000 files)
///
/// watchexec, when executed on *either* folder, should *not* experience more
/// than a 10x degradation in performance, because the vast majority of the files
/// are supposed to be ignored to begin with.
///
/// When running the CLI on the root folders, it should *not* take a long time to start
#[tokio::test]
#[traced_test]
async fn e2e_ignore_many_files_200_000() -> Result<()> {
// Create a tempdir so that drop will clean it up
let small_test_dir = tempfile::tempdir()
.into_diagnostic()
.wrap_err("failed to create tempdir for test use")?;
// Determine the watchexec bin to use & build arguments
let wexec_bin = std::env::var("TEST_WATCHEXEC_BIN").unwrap_or(
option_env!("CARGO_BIN_EXE_watchexec")
.map(std::string::ToString::to_string)
.unwrap_or("watchexec".into()),
);
let token = format!("{WATCH_TOKEN}-{}", Uuid::new_v4());
let args: Vec<String> = vec![
"-1".into(), // exit as soon as watch completes
"--watch".into(),
WATCH_DIR_NAME.into(),
"echo".into(),
token.clone(),
];
// Generate a small directory of files containing dirs that *will* and will *not* be watched
let [ref root_dir_path, _, _] = generate_test_files(GenerateTestFilesArgs {
path: Some(PathBuf::from(small_test_dir.path())),
subfolder_configs: vec![
// Shallow folder has a small number of files and *will* be watched
TestSubfolderConfiguration {
name: "watch".into(),
nesting: GeneratedFileNesting::Flat,
file_count: 5,
},
// Deep folder has *many* small files and will *not* be watched
TestSubfolderConfiguration {
name: "unrelated".into(),
nesting: GeneratedFileNesting::RandomToMax(42),
file_count: 200,
},
],
})?[..] else {
panic!("unexpected number of paths returned from generate_test_files");
};
// Measure the elapsed time for the small case
let small_elapsed = run_watchexec_cmd(&wexec_bin, root_dir_path, args.clone()).await?;
// Create a tempdir so that drop will clean it up
let large_test_dir = tempfile::tempdir()
.into_diagnostic()
.wrap_err("failed to create tempdir for test use")?;
// Generate a *large* directory of files
let [ref root_dir_path, _, _] = generate_test_files(GenerateTestFilesArgs {
path: Some(PathBuf::from(large_test_dir.path())),
subfolder_configs: vec![
// Shallow folder has a small number of files and *will* be watched
TestSubfolderConfiguration {
name: "watch".into(),
nesting: GeneratedFileNesting::Flat,
file_count: 5,
},
// Deep folder has *many* small files and will *not* be watched
TestSubfolderConfiguration {
name: "unrelated".into(),
nesting: GeneratedFileNesting::RandomToMax(42),
file_count: 200_000,
},
],
})?[..] else {
panic!("unexpected number of paths returned from generate_test_files");
};
// Measure the elapsed time for the large case
let large_elapsed = run_watchexec_cmd(&wexec_bin, root_dir_path, args.clone()).await?;
// We expect the ignores to not impact watchexec startup time at all
// whether there are 200 files in there or 200k
assert!(
large_elapsed < small_elapsed * 10,
"200k ignore folder ({:?}) took more than 10x more time ({:?}) than 200 ignore folder ({:?})",
large_elapsed,
small_elapsed * 10,
small_elapsed,
);
Ok(())
}
/// Run a watchexec command once
async fn run_watchexec_cmd(
wexec_bin: impl AsRef<str>,
dir: impl AsRef<Path>,
args: impl Into<Vec<String>>,
) -> Result<Duration> {
// Build the subprocess command
let mut cmd = Command::new(wexec_bin.as_ref());
cmd.args(args.into());
cmd.current_dir(dir);
cmd.stdout(Stdio::piped());
cmd.stderr(Stdio::piped());
let start = Instant::now();
cmd.kill_on_drop(true)
.output()
.await
.into_diagnostic()
.wrap_err("fixed")?;
Ok(start.elapsed())
}

View File

@ -3,7 +3,7 @@
<assemblyIdentity
type="win32"
name="Watchexec.Cli.watchexec"
version="1.23.0.0"
version="2.1.1.0"
/>
<trustInfo>

View File

@ -2,6 +2,25 @@
## Next (YYYY-MM-DD)
## v3.0.0 (2024-04-20)
- Deps: nix 0.28
## v2.0.1 (2023-11-29)
- Add `ProcessEnd::into_exitstatus` testing-only utility method.
- Deps: upgrade to Notify 6.0
- Deps: upgrade to nix 0.27
- Deps: upgrade to watchexec-signals 2.0.0
## v2.0.0 (2023-11-29)
Same as 2.0.1, but yanked.
## v1.1.0 (2023-11-26)
Same as 2.0.1, but yanked.
## v1.0.0 (2023-03-18)
- Split off new `watchexec-events` crate (this one), to have a lightweight library that can parse

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-events"
version = "1.0.0"
version = "3.0.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0 OR MIT"
@ -15,27 +15,26 @@ rust-version = "1.61.0"
edition = "2021"
[dependencies.notify]
version = "5.0.0"
version = "6.0.0"
optional = true
[dependencies.serde]
version = "1.0.152"
version = "1.0.183"
optional = true
features = ["derive"]
[dependencies.watchexec-signals]
version = "1.0.0"
version = "3.0.0"
path = "../signals"
default-features = false
[target.'cfg(unix)'.dependencies.nix]
version = "0.26.2"
version = "0.28.0"
features = ["signal"]
[dev-dependencies]
watchexec-events = { version = "*", features = ["serde"], path = "." }
snapbox = "0.4.10"
serde_json = "1.0.94"
snapbox = "0.5.9"
serde_json = "1.0.107"
[features]
default = ["notify"]

View File

@ -16,7 +16,7 @@ Fundamentally, events in watchexec have three purposes:
3. To carry information about what caused the event, which may be provided to the process.
Outside of Watchexec, this library is particularly useful if you're building a tool that runs under
it, and want to easily read its events (with `--emit-events-to=json-file` and `--emit-events-to=json-stdin`).
it, and want to easily read its events (with `--emit-events-to=json-file` and `--emit-events-to=json-stdio`).
```rust ,no_run
use std::io::{stdin, Result};

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: events v{{version}}"
tag-prefix = "events-"
tag-prefix = "watchexec-events-"
tag-message = "watchexec-events {{version}}"
[[pre-release-replacements]]

View File

@ -53,7 +53,7 @@ pub enum Tag {
/// The event is about a signal being delivered to the main process.
Signal(Signal),
/// The event is about the subprocess ending.
/// The event is about a subprocess ending.
ProcessCompletion(Option<ProcessEnd>),
#[cfg(feature = "serde")]

View File

@ -88,3 +88,50 @@ impl From<ExitStatus> for ProcessEnd {
}
}
}
impl ProcessEnd {
/// Convert a `ProcessEnd` to an `ExitStatus`.
///
/// This is a testing function only! **It will panic** if the `ProcessEnd` is not representable
/// as an `ExitStatus` on Unix. This is also not guaranteed to be accurate, as the waitpid()
/// status union is platform-specific. Exit codes and signals are implemented, other variants
/// are not.
#[cfg(unix)]
#[must_use]
pub fn into_exitstatus(self) -> ExitStatus {
use std::os::unix::process::ExitStatusExt;
match self {
Self::Success => ExitStatus::from_raw(0),
Self::ExitError(code) => {
ExitStatus::from_raw(i32::from(u8::try_from(code.get()).unwrap_or_default()) << 8)
}
Self::ExitSignal(signal) => {
ExitStatus::from_raw(signal.to_nix().map_or(0, |sig| sig as i32))
}
Self::Continued => ExitStatus::from_raw(0xffff),
_ => unimplemented!(),
}
}
/// Convert a `ProcessEnd` to an `ExitStatus`.
///
/// This is a testing function only! **It will panic** if the `ProcessEnd` is not representable
/// as an `ExitStatus` on Windows.
#[cfg(windows)]
#[must_use]
pub fn into_exitstatus(self) -> ExitStatus {
use std::os::windows::process::ExitStatusExt;
match self {
Self::Success => ExitStatus::from_raw(0),
Self::ExitError(code) => ExitStatus::from_raw(code.get().try_into().unwrap()),
_ => unimplemented!(),
}
}
/// Unimplemented on this platform.
#[cfg(not(any(unix, windows)))]
#[must_use]
pub fn into_exitstatus(self) -> ExitStatus {
unimplemented!()
}
}
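
A minimal sketch of the testing helper in use; only the `Success` round-trip is shown, since that variant behaves the same on Unix and Windows:

```rust
use std::process::ExitStatus;
use watchexec_events::ProcessEnd;

// Illustrative test: a successful ProcessEnd maps to a zero ExitStatus and
// converts back via the From<ExitStatus> impl above.
#[test]
fn success_roundtrip() {
    let status: ExitStatus = ProcessEnd::Success.into_exitstatus();
    assert!(status.success());
    assert!(matches!(ProcessEnd::from(status), ProcessEnd::Success));
}
```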

View File

@ -113,7 +113,7 @@ impl From<Tag> for SerdeTag {
},
Tag::FileEventKind(fek) => Self {
kind: TagKind::Fs,
full: Some(format!("{:?}", fek)),
full: Some(format!("{fek:?}")),
simple: Some(fek.into()),
..Default::default()
},
@ -145,12 +145,10 @@ impl From<Tag> for SerdeTag {
Tag::ProcessCompletion(Some(end)) => Self {
kind: TagKind::Completion,
code: match &end {
ProcessEnd::Success => None,
ProcessEnd::ExitSignal(_) => None,
ProcessEnd::Success | ProcessEnd::Continued | ProcessEnd::ExitSignal(_) => None,
ProcessEnd::ExitError(err) => Some(err.get()),
ProcessEnd::ExitStop(code) => Some(code.get().into()),
ProcessEnd::Exception(exc) => Some(exc.get().into()),
ProcessEnd::Continued => None,
},
signal: if let ProcessEnd::ExitSignal(sig) = &end {
Some(*sig)
@ -172,6 +170,7 @@ impl From<Tag> for SerdeTag {
}
}
#[allow(clippy::fallible_impl_from)] // due to the unwraps
impl From<SerdeTag> for Tag {
fn from(value: SerdeTag) -> Self {
match value {
@ -314,6 +313,7 @@ impl From<SerdeTag> for Tag {
..
} if code != 0 && i32::try_from(code).is_ok() => {
Self::ProcessCompletion(Some(ProcessEnd::ExitStop(unsafe {
// SAFETY&UNWRAP: checked above
NonZeroI32::new_unchecked(code.try_into().unwrap())
})))
}
@ -324,6 +324,7 @@ impl From<SerdeTag> for Tag {
..
} if exc != 0 && i32::try_from(exc).is_ok() => {
Self::ProcessCompletion(Some(ProcessEnd::Exception(unsafe {
// SAFETY&UNWRAP: checked above
NonZeroI32::new_unchecked(exc.try_into().unwrap())
})))
}

View File

@ -1,6 +1,8 @@
#![cfg(feature = "serde")]
use std::num::{NonZeroI32, NonZeroI64};
use snapbox::assert_eq_path;
use snapbox::{assert_eq, file};
use watchexec_events::{
filekind::{CreateKind, FileEventKind as EventKind, ModifyKind, RemoveKind, RenameMode},
Event, FileType, Keyboard, ProcessEnd, Source, Tag,
@ -18,8 +20,8 @@ fn single() {
metadata: Default::default(),
};
assert_eq_path(
"tests/snapshots/single.json",
assert_eq(
file!["snapshots/single.json"],
serde_json::to_string_pretty(&single).unwrap(),
);
@ -52,8 +54,8 @@ fn array() {
},
];
assert_eq_path(
"tests/snapshots/array.json",
assert_eq(
file!["snapshots/array.json"],
serde_json::to_string_pretty(array).unwrap(),
);
@ -71,8 +73,8 @@ fn metadata() {
.into(),
}];
assert_eq_path(
"tests/snapshots/metadata.json",
assert_eq(
file!["snapshots/metadata.json"],
serde_json::to_string_pretty(metadata).unwrap(),
);
@ -134,8 +136,8 @@ fn sources() {
},
];
assert_eq_path(
"tests/snapshots/sources.json",
assert_eq(
file!["snapshots/sources.json"],
serde_json::to_string_pretty(&sources).unwrap(),
);
@ -162,8 +164,8 @@ fn signals() {
},
];
assert_eq_path(
"tests/snapshots/signals.json",
assert_eq(
file!["snapshots/signals.json"],
serde_json::to_string_pretty(&signals).unwrap(),
);
@ -193,8 +195,8 @@ fn completions() {
},
];
assert_eq_path(
"tests/snapshots/completions.json",
assert_eq(
file!["snapshots/completions.json"],
serde_json::to_string_pretty(&completions).unwrap(),
);
@ -244,8 +246,8 @@ fn paths() {
},
];
assert_eq_path(
"tests/snapshots/paths.json",
assert_eq(
file!["snapshots/paths.json"],
serde_json::to_string_pretty(&paths).unwrap(),
);

View File

@ -2,6 +2,22 @@
## Next (YYYY-MM-DD)
## v4.0.1 (2024-04-28)
- Hide fmt::Debug spew from ignore crate, use `full_debug` feature to restore.
## v4.0.0 (2024-04-20)
- Deps: watchexec 4
## v3.0.0 (2024-01-01)
- Deps: `watchexec-filterer-ignore` and `ignore-files`
## v2.0.1 (2023-12-09)
- Depend on `watchexec-events` instead of the `watchexec` re-export.
## v1.2.0 (2023-03-18)
- Ditch MSRV policy. The `rust-version` indication will remain, for the minimum estimated Rust version for the code features used in the crate's own code, but dependencies may have already moved on. From now on, only latest stable is assumed and tested for. ([#510](https://github.com/watchexec/watchexec/pull/510))

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-filterer-globset"
version = "1.2.0"
version = "4.0.1"
authors = ["Matt Green <mattgreenrocks@gmail.com>", "Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
@ -17,29 +17,29 @@ edition = "2021"
[dependencies]
ignore = "0.4.18"
tracing = "0.1.26"
tracing = "0.1.40"
[dependencies.ignore-files]
version = "1.3.1"
version = "3.0.1"
path = "../../ignore-files"
[dependencies.watchexec]
version = "2.3.0"
version = "4.1.0"
path = "../../lib"
[dependencies.watchexec-events]
version = "3.0.0"
path = "../../events"
[dependencies.watchexec-filterer-ignore]
version = "1.2.1"
version = "4.0.1"
path = "../ignore"
[dev-dependencies]
tracing-subscriber = "0.3.6"
[dev-dependencies.project-origins]
version = "1.2.0"
path = "../../project-origins"
[dev-dependencies.tokio]
version = "1.24.2"
version = "1.33.0"
features = [
"fs",
"io-std",
@ -47,3 +47,9 @@ features = [
"rt-multi-thread",
"macros",
]
[features]
default = []
## Don't hide ignore::gitignore::Gitignore Debug impl
full_debug = []

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: filterer-globset v{{version}}"
tag-prefix = "filterer-globset-"
tag-prefix = "watchexec-filterer-globset-"
tag-message = "watchexec-filterer-globset {{version}}"
[[pre-release-replacements]]

View File

@ -10,21 +10,19 @@
use std::{
ffi::OsString,
fmt,
path::{Path, PathBuf},
};
use ignore::gitignore::{Gitignore, GitignoreBuilder};
use ignore_files::{Error, IgnoreFile, IgnoreFilter};
use tracing::{debug, trace, trace_span};
use watchexec::{
error::RuntimeError,
event::{Event, FileType, Priority},
filter::Filterer,
};
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{Event, FileType, Priority};
use watchexec_filterer_ignore::IgnoreFilterer;
/// A simple filterer in the style of the watchexec v1.17 filter.
#[derive(Debug)]
#[cfg_attr(feature = "full_debug", derive(Debug))]
pub struct GlobsetFilterer {
#[cfg_attr(not(unix), allow(dead_code))]
origin: PathBuf,
@ -34,6 +32,19 @@ pub struct GlobsetFilterer {
extensions: Vec<OsString>,
}
#[cfg(not(feature = "full_debug"))]
impl fmt::Debug for GlobsetFilterer {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_struct("GlobsetFilterer")
.field("origin", &self.origin)
.field("filters", &"ignore::gitignore::Gitignore{...}")
.field("ignores", &"ignore::gitignore::Gitignore{...}")
.field("ignore_files", &self.ignore_files)
.field("extensions", &self.extensions)
.finish()
}
}
impl GlobsetFilterer {
/// Create a new `GlobsetFilterer` from a project origin, allowed extensions, and lists of globs.
///

View File

@ -374,10 +374,8 @@ async fn extensions_fail_extensionless() {
#[tokio::test]
async fn multipath_allow_on_any_one_pass() {
use watchexec::{
event::{Event, FileType, Tag},
filter::Filterer,
};
use watchexec::filter::Filterer;
use watchexec_events::{Event, FileType, Tag};
let filterer = filt(&[], &[], &["py"]).await;
let origin = tokio::fs::canonicalize(".").await.unwrap();
@ -442,10 +440,8 @@ async fn leading_single_glob_file() {
#[tokio::test]
async fn nonpath_event_passes() {
use watchexec::{
event::{Event, Source, Tag},
filter::Filterer,
};
use watchexec::filter::Filterer;
use watchexec_events::{Event, Source, Tag};
let filterer = filt(&[], &[], &["py"]).await;

View File

@ -3,21 +3,15 @@ use std::{
path::{Path, PathBuf},
};
use ignore_files::IgnoreFile;
use project_origins::ProjectType;
use watchexec::{
error::RuntimeError,
event::{Event, FileType, Priority, Tag},
filter::Filterer,
};
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{Event, FileType, Priority, Tag};
use watchexec_filterer_globset::GlobsetFilterer;
use watchexec_filterer_ignore::IgnoreFilterer;
pub mod globset {
pub use super::globset_filt as filt;
pub use super::Applies;
pub use super::PathHarness;
pub use watchexec::event::Priority;
pub use watchexec_events::Priority;
}
pub trait PathHarness: Filterer {
@ -122,21 +116,3 @@ pub async fn globset_filt(
.await
.expect("making filterer")
}
pub trait Applies {
fn applies_in(self, origin: &str) -> Self;
fn applies_to(self, project_type: ProjectType) -> Self;
}
impl Applies for IgnoreFile {
fn applies_in(mut self, origin: &str) -> Self {
let origin = std::fs::canonicalize(".").unwrap().join(origin);
self.applies_in = Some(origin);
self
}
fn applies_to(mut self, project_type: ProjectType) -> Self {
self.applies_to = Some(project_type);
self
}
}

View File

@ -2,6 +2,24 @@
## Next (YYYY-MM-DD)
## v4.0.1 (2024-04-28)
## v4.0.0 (2024-04-20)
- Deps: watchexec 4
## v3.0.1 (2024-01-04)
- Normalise paths on all platforms (via `normalize-path`).
## v3.0.0 (2024-01-01)
- Deps: `ignore-files` 2.0.0
## v2.0.1 (2023-12-09)
- Depend on `watchexec-events` instead of the `watchexec` re-export.
## v1.2.1 (2023-05-14)
- Use IO-free dunce::simplify to normalise paths on Windows.

View File

@ -1,6 +1,6 @@
[package]
name = "watchexec-filterer-ignore"
version = "1.2.1"
version = "4.0.1"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
@ -17,30 +17,35 @@ edition = "2021"
[dependencies]
ignore = "0.4.18"
tracing = "0.1.26"
dunce = "1.0.4"
normalize-path = "0.2.1"
tracing = "0.1.40"
[dependencies.ignore-files]
version = "1.3.1"
version = "3.0.1"
path = "../../ignore-files"
[dependencies.watchexec]
version = "2.3.0"
version = "4.1.0"
path = "../../lib"
[dependencies.watchexec-events]
version = "3.0.0"
path = "../../events"
[dependencies.watchexec-signals]
version = "1.0.0"
version = "3.0.0"
path = "../../signals"
[dev-dependencies]
tracing-subscriber = "0.3.6"
[dev-dependencies.project-origins]
version = "1.2.0"
version = "1.4.0"
path = "../../project-origins"
[dev-dependencies.tokio]
version = "1.24.2"
version = "1.33.0"
features = [
"fs",
"io-std",

View File

@ -1,5 +1,5 @@
pre-release-commit-message = "release: filterer-ignore v{{version}}"
tag-prefix = "filterer-ignore-"
tag-prefix = "watchexec-filterer-ignore-"
tag-message = "watchexec-filterer-ignore {{version}}"
[[pre-release-replacements]]

View File

@ -13,13 +13,10 @@
use ignore::Match;
use ignore_files::IgnoreFilter;
use normalize_path::NormalizePath;
use tracing::{trace, trace_span};
use watchexec::{
error::RuntimeError,
event::{Event, FileType, Priority},
filter::Filterer,
};
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{Event, FileType, Priority};
/// A Watchexec [`Filterer`] implementation for [`IgnoreFilter`].
#[derive(Clone, Debug)]
@ -35,7 +32,8 @@ impl Filterer for IgnoreFilterer {
let mut pass = true;
for (path, file_type) in event.paths() {
let path = dunce::simplified(path);
let path = dunce::simplified(path).normalize();
let path = path.as_path();
let _span = trace_span!("checking_against_compiled", ?path, ?file_type).entered();
let is_dir = file_type.map_or(false, |t| matches!(t, FileType::Dir));

View File

@ -60,7 +60,7 @@ fn folders_suite(filterer: &IgnoreFilterer, name: &str) {
#[tokio::test]
async fn globs() {
let filterer = filt("", &[file("globs")]).await;
let filterer = filt("", &[file("globs").applies_globally()]).await;
// Unmatched
filterer.file_does_pass("FINAL-FINAL.docx");
@ -214,8 +214,8 @@ async fn scopes() {
let filterer = filt(
"",
&[
file("scopes-global"),
file("scopes-local").applies_in(""),
file("scopes-global").applies_globally(),
file("scopes-local"),
file("scopes-sublocal").applies_in("tests"),
file("none-allowed").applies_in("tests/child"),
],
@ -235,7 +235,7 @@ async fn scopes() {
#[cfg(not(windows))]
filterer.file_does_pass("/local.b");
// FIXME flaky
//filterer.file_doesnt_pass("tests/local.c");
// filterer.file_doesnt_pass("tests/local.c");
filterer.file_does_pass("sublocal.a");
// #[cfg(windows)] FIXME should work

View File

@ -2,10 +2,9 @@ use std::path::{Path, PathBuf};
use ignore_files::{IgnoreFile, IgnoreFilter};
use project_origins::ProjectType;
use watchexec::{
error::RuntimeError,
event::{filekind::FileEventKind, Event, FileType, Priority, ProcessEnd, Source, Tag},
filter::Filterer,
use watchexec::{error::RuntimeError, filter::Filterer};
use watchexec_events::{
filekind::FileEventKind, Event, FileType, Priority, ProcessEnd, Source, Tag,
};
use watchexec_filterer_ignore::IgnoreFilterer;
use watchexec_signals::Signal;
@ -15,7 +14,6 @@ pub mod ignore {
pub use super::ignore_filt as filt;
pub use super::Applies;
pub use super::PathHarness;
pub use watchexec::event::Priority;
}
pub trait PathHarness: Filterer {
@ -188,24 +186,27 @@ pub async fn ignore_filt(origin: &str, ignore_files: &[IgnoreFile]) -> IgnoreFil
}
pub fn ig_file(name: &str) -> IgnoreFile {
let path = std::fs::canonicalize(".")
.unwrap()
.join("tests")
.join("ignores")
.join(name);
let origin = std::fs::canonicalize(".").unwrap();
let path = origin.join("tests").join("ignores").join(name);
IgnoreFile {
path,
applies_in: None,
applies_in: Some(origin),
applies_to: None,
}
}
pub trait Applies {
fn applies_globally(self) -> Self;
fn applies_in(self, origin: &str) -> Self;
fn applies_to(self, project_type: ProjectType) -> Self;
}
impl Applies for IgnoreFile {
fn applies_globally(mut self) -> Self {
self.applies_in = None;
self
}
fn applies_in(mut self, origin: &str) -> Self {
let origin = std::fs::canonicalize(".").unwrap().join(origin);
self.applies_in = Some(origin);

View File

@ -1,19 +0,0 @@
# Changelog
## Next (YYYY-MM-DD)
## v0.3.0 (2023-03-18)
- Ditch MSRV policy. The `rust-version` indication will remain, for the minimum estimated Rust version for the code features used in the crate's own code, but dependencies may have already moved on. From now on, only latest stable is assumed and tested for. ([#510](https://github.com/watchexec/watchexec/pull/510))
## v0.2.0 (2023-01-09)
- MSRV: bump to 1.61.0
## v0.1.1 (2022-09-07)
- Deps: update miette to 5.3.0
## v0.1.0 (2022-06-23)
- Initial release as a separate crate.

View File

@ -1,64 +0,0 @@
[package]
name = "watchexec-filterer-tagged"
version = "0.3.0"
authors = ["Félix Saparelli <felix@passcod.name>"]
license = "Apache-2.0"
description = "Watchexec filterer component using tagged filters"
keywords = ["watchexec", "filterer", "tags"]
documentation = "https://docs.rs/watchexec-filterer-tagged"
homepage = "https://watchexec.github.io"
repository = "https://github.com/watchexec/watchexec"
readme = "README.md"
rust-version = "1.61.0"
edition = "2021"
[dependencies]
futures = "0.3.25"
globset = "0.4.8"
ignore = "0.4.18"
miette = "5.3.0"
nom = "7.0.0"
regex = "1.5.4"
thiserror = "1.0.26"
tracing = "0.1.26"
unicase = "2.6.0"
[dependencies.ignore-files]
version = "1.3.1"
path = "../../ignore-files"
[dependencies.tokio]
version = "1.24.2"
features = [
"fs",
]
[dependencies.watchexec]
version = "2.3.0"
path = "../../lib"
[dependencies.watchexec-filterer-ignore]
version = "1.2.1"
path = "../ignore"
[dependencies.watchexec-signals]
version = "1.0.0"
path = "../../signals"
[dev-dependencies]
tracing-subscriber = "0.3.6"
[dev-dependencies.project-origins]
version = "1.2.0"
path = "../../project-origins"
[dev-dependencies.tokio]
version = "1.24.2"
features = [
"fs",
"io-std",
"sync",
]

View File

@ -1,15 +0,0 @@
[![Crates.io page](https://badgen.net/crates/v/watchexec-filterer-tagged)](https://crates.io/crates/watchexec-filterer-tagged)
[![API Docs](https://docs.rs/watchexec-filterer-tagged/badge.svg)][docs]
[![Crate license: Apache 2.0](https://badgen.net/badge/license/Apache%202.0)][license]
[![CI status](https://github.com/watchexec/watchexec/actions/workflows/check.yml/badge.svg)](https://github.com/watchexec/watchexec/actions/workflows/check.yml)
# Watchexec filterer: tagged
_Experimental filterer using tagged filters._
- **[API documentation][docs]**.
- Licensed under [Apache 2.0][license].
- Status: maintained.
[docs]: https://docs.rs/watchexec-filterer-tagged
[license]: ../../../LICENSE

View File

@ -1,81 +0,0 @@
use std::collections::HashMap;
use ignore::gitignore::Gitignore;
use miette::Diagnostic;
use thiserror::Error;
use tokio::sync::watch::error::SendError;
use watchexec::error::RuntimeError;
use watchexec_filterer_ignore::IgnoreFilterer;
use crate::{Filter, Matcher};
/// Errors emitted by the `TaggedFilterer`.
#[derive(Debug, Diagnostic, Error)]
#[non_exhaustive]
#[diagnostic(url(docsrs))]
pub enum TaggedFiltererError {
/// Generic I/O error, with some context.
#[error("io({about}): {err}")]
#[diagnostic(code(watchexec::filter::io_error))]
IoError {
/// What it was about.
about: &'static str,
/// The I/O error which occurred.
#[source]
err: std::io::Error,
},
/// Error received when a tagged filter cannot be parsed.
#[error("cannot parse filter `{src}`: {err:?}")]
#[diagnostic(code(watchexec::filter::tagged::parse))]
Parse {
/// The source of the filter.
#[source_code]
src: String,
/// What went wrong.
err: nom::error::ErrorKind,
},
/// Error received when a filter cannot be added or removed from a tagged filter list.
#[error("cannot {action} filter: {err:?}")]
#[diagnostic(code(watchexec::filter::tagged::filter_change))]
FilterChange {
/// The action that was attempted.
action: &'static str,
/// The underlying error.
#[source]
err: SendError<HashMap<Matcher, Vec<Filter>>>,
},
/// Error received when a glob cannot be parsed.
#[error("cannot parse glob: {0}")]
#[diagnostic(code(watchexec::filter::tagged::glob_parse))]
GlobParse(#[source] ignore::Error),
/// Error received when a compiled globset cannot be changed.
#[error("cannot change compiled globset: {0:?}")]
#[diagnostic(code(watchexec::filter::tagged::globset_change))]
GlobsetChange(#[source] SendError<Option<Gitignore>>),
/// Error received about the internal ignore filterer.
#[error("ignore filterer: {0}")]
#[diagnostic(code(watchexec::filter::tagged::ignore))]
Ignore(#[source] ignore_files::Error),
/// Error received when a new ignore filterer cannot be swapped in.
#[error("cannot swap in new ignore filterer: {0:?}")]
#[diagnostic(code(watchexec::filter::tagged::ignore_swap))]
IgnoreSwap(#[source] SendError<IgnoreFilterer>),
}
impl From<TaggedFiltererError> for RuntimeError {
fn from(err: TaggedFiltererError) -> Self {
Self::Filterer {
kind: "tagged",
err: Box::new(err) as _,
}
}
}

View File

@ -1,93 +0,0 @@
use std::{
env,
io::Error,
path::{Path, PathBuf},
str::FromStr,
};
use ignore_files::{discover_file, IgnoreFile};
use tokio::fs::read_to_string;
use crate::{Filter, TaggedFiltererError};
/// A filter file.
///
/// This is merely a type wrapper around an [`IgnoreFile`], as the only difference is how the file
/// is parsed.
#[derive(Debug, Clone, PartialEq, Eq, Hash)]
pub struct FilterFile(pub IgnoreFile);
/// Finds all filter files that apply to the current runtime.
///
/// This considers:
/// - `$XDG_CONFIG_HOME/watchexec/filter`, as well as other locations (APPDATA on Windows…)
/// - Files from the `WATCHEXEC_FILTER_FILES` environment variable (comma-separated)
///
/// All errors (permissions, etc) are collected and returned alongside the ignore files: you may
/// want to show them to the user while still using whatever ignores were successfully found. Errors
/// from files not being found are silently ignored (the files are just not returned).
pub async fn discover_files_from_environment() -> (Vec<FilterFile>, Vec<Error>) {
let mut files = Vec::new();
let mut errors = Vec::new();
for path in env::var("WATCHEXEC_FILTER_FILES")
.unwrap_or_default()
.split(',')
{
discover_file(&mut files, &mut errors, None, None, PathBuf::from(path)).await;
}
let mut wgis = Vec::with_capacity(5);
if let Ok(home) = env::var("XDG_CONFIG_HOME") {
wgis.push(Path::new(&home).join("watchexec/filter"));
}
if let Ok(home) = env::var("APPDATA") {
wgis.push(Path::new(&home).join("watchexec/filter"));
}
if let Ok(home) = env::var("USERPROFILE") {
wgis.push(Path::new(&home).join(".watchexec/filter"));
}
if let Ok(home) = env::var("HOME") {
wgis.push(Path::new(&home).join(".watchexec/filter"));
}
for path in wgis {
if discover_file(&mut files, &mut errors, None, None, path).await {
break;
}
}
(files.into_iter().map(FilterFile).collect(), errors)
}
impl FilterFile {
/// Read and parse into [`Filter`]s.
///
/// Empty lines and lines starting with `#` are ignored. The `applies_in` field of the
/// [`IgnoreFile`] is used for the `in_path` field of each [`Filter`].
///
/// This method reads the entire file into memory.
pub async fn load(&self) -> Result<Vec<Filter>, TaggedFiltererError> {
let content =
read_to_string(&self.0.path)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "filter file load",
err,
})?;
let lines = content.lines();
let mut filters = Vec::with_capacity(lines.size_hint().0);
for line in lines {
if line.is_empty() || line.starts_with('#') {
continue;
}
let mut f = Filter::from_str(line)?;
f.in_path = self.0.applies_in.clone();
filters.push(f);
}
Ok(filters)
}
}
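
For reference, the removed API above composed like this; a minimal sketch assuming a tokio context, with the `load_env_filters` helper being illustrative only:

```rust
// Illustrative: gather filter files from the environment, report the ones
// that could not be read, and parse the rest into Filters.
async fn load_env_filters() -> Result<Vec<Filter>, TaggedFiltererError> {
    let (files, errors) = discover_files_from_environment().await;
    for err in &errors {
        eprintln!("skipping unreadable filter file: {err}");
    }
    let mut filters = Vec::new();
    for file in &files {
        filters.extend(file.load().await?);
    }
    Ok(filters)
}
```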

View File

@ -1,276 +0,0 @@
use std::collections::HashSet;
use std::path::PathBuf;
use globset::Glob;
use regex::Regex;
use tokio::fs::canonicalize;
use tracing::{trace, warn};
use unicase::UniCase;
use watchexec::event::Tag;
use crate::TaggedFiltererError;
/// A tagged filter.
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct Filter {
/// Path the filter applies from.
pub in_path: Option<PathBuf>,
/// Which tag the filter applies to.
pub on: Matcher,
/// The operation to perform on the tag's value.
pub op: Op,
/// The pattern to match against the tag's value.
pub pat: Pattern,
/// If true, a positive match with this filter will override negative matches from previous
/// filters on the same tag, and negative matches will be ignored.
pub negate: bool,
}
impl Filter {
/// Matches the filter against a subject.
///
/// This is really an internal method to the tagged filterer machinery, exposed so you can build
/// your own filterer using the same types or the textual syntax. As such its behaviour is not
/// guaranteed to be stable (its signature is, though).
pub fn matches(&self, subject: impl AsRef<str>) -> Result<bool, TaggedFiltererError> {
let subject = subject.as_ref();
trace!(op=?self.op, pat=?self.pat, ?subject, "performing filter match");
Ok(match (self.op, &self.pat) {
(Op::Equal, Pattern::Exact(pat)) => UniCase::new(subject) == UniCase::new(pat),
(Op::NotEqual, Pattern::Exact(pat)) => UniCase::new(subject) != UniCase::new(pat),
(Op::Regex, Pattern::Regex(pat)) => pat.is_match(subject),
(Op::NotRegex, Pattern::Regex(pat)) => !pat.is_match(subject),
(Op::InSet, Pattern::Set(set)) => set.contains(subject),
(Op::InSet, Pattern::Exact(pat)) => subject == pat,
(Op::NotInSet, Pattern::Set(set)) => !set.contains(subject),
(Op::NotInSet, Pattern::Exact(pat)) => subject != pat,
(op @ (Op::Glob | Op::NotGlob), Pattern::Glob(glob)) => {
// FIXME: someway that isn't this horrible
match Glob::new(glob) {
Ok(glob) => {
let matches = glob.compile_matcher().is_match(subject);
match op {
Op::Glob => matches,
Op::NotGlob => !matches,
_ => unreachable!(),
}
}
Err(err) => {
warn!(
"failed to compile glob for non-path match, skipping (pass): {}",
err
);
true
}
}
}
(op, pat) => {
warn!(
"trying to match pattern {:?} with op {:?}, that cannot work",
pat, op
);
false
}
})
}
/// Create a filter from a gitignore-style glob pattern.
///
/// The optional path is for the `in_path` field of the filter. When parsing gitignore files, it
/// should be set to the path of the _directory_ the ignore file is in.
///
/// The resulting filter matches on [`Path`][Matcher::Path], with the [`NotGlob`][Op::NotGlob]
/// op, and a [`Glob`][Pattern::Glob] pattern. If it starts with a `!`, it is negated.
#[must_use]
pub fn from_glob_ignore(in_path: Option<PathBuf>, glob: &str) -> Self {
let (glob, negate) = glob.strip_prefix('!').map_or((glob, false), |g| (g, true));
Self {
in_path,
on: Matcher::Path,
op: Op::NotGlob,
pat: Pattern::Glob(glob.to_string()),
negate,
}
}
/// Returns the filter with its `in_path` canonicalised.
pub async fn canonicalised(mut self) -> Result<Self, TaggedFiltererError> {
if let Some(ctx) = self.in_path {
self.in_path =
Some(
canonicalize(&ctx)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise Filter in_path",
err,
})?,
);
trace!(canon=?ctx, "canonicalised in_path");
}
Ok(self)
}
}
/// What a filter matches on.
#[derive(Clone, Copy, Debug, Eq, PartialEq, Hash)]
#[non_exhaustive]
pub enum Matcher {
/// The presence of a tag on an event.
Tag,
/// A path in a filesystem event. Paths are always canonicalised.
///
/// Note that there may be multiple paths in an event (e.g. both source and destination for renames), and filters
/// will be matched on all of them.
Path,
/// The file type of an object in a filesystem event.
///
/// This is not guaranteed to be present for every filesystem event.
///
/// It can be any of these values: `file`, `dir`, `symlink`, `other`. That last one means
/// "not any of the first three."
FileType,
/// The [`EventKind`][notify::event::EventKind] of a filesystem event.
///
/// This is the Debug representation of the event kind. Examples:
/// - `Access(Close(Write))`
/// - `Modify(Data(Any))`
/// - `Modify(Metadata(Permissions))`
/// - `Remove(Folder)`
///
/// You should probably use globs or regexes to match these, ex:
/// - `Create(*)`
/// - `Modify\(Name\(.+`
FileEventKind,
/// The [event source][crate::event::Source] the event came from.
///
/// These are the lowercase names of the variants.
Source,
/// The ID of the process which caused the event.
///
/// Note that it's rare for events to carry this information.
Process,
/// A signal sent to the main process.
///
/// This can be matched both on the signal number as an integer, and on the signal name as a
/// string. On Windows, only `BREAK` is supported; `CTRL_C` parses but won't work. Matching is
/// on both uppercase and lowercase forms.
///
/// Interrupt signals (`TERM` and `INT` on Unix, `CTRL_C` on Windows) are parsed, but these are
/// marked Urgent internally to Watchexec, and thus bypass filtering entirely.
Signal,
/// The exit status of a subprocess.
///
/// This is only present for events issued when the subprocess exits. The value is matched on
/// both the exit code as an integer, and either `success` or `fail`, whichever succeeds.
ProcessCompletion,
/// The [`Priority`] of the event.
///
/// This is never `urgent`, as urgent events bypass filtering.
Priority,
}
impl Matcher {
pub(crate) fn from_tag(tag: &Tag) -> &'static [Self] {
match tag {
Tag::Path {
file_type: None, ..
} => &[Self::Path],
Tag::Path { .. } => &[Self::Path, Self::FileType],
Tag::FileEventKind(_) => &[Self::FileEventKind],
Tag::Source(_) => &[Self::Source],
Tag::Process(_) => &[Self::Process],
Tag::Signal(_) => &[Self::Signal],
Tag::ProcessCompletion(_) => &[Self::ProcessCompletion],
_ => {
warn!("unhandled tag: {:?}", tag);
&[]
}
}
}
}
/// How a filter value is interpreted.
///
/// - `==` and `!=` match on the exact value as string equality (case-insensitively),
/// - `~=` and `~!` match using a [regex],
/// - `*=` and `*!` match using a glob, either via [globset] or [ignore]
/// - `:=` and `:!` match via exact string comparisons, but on any of the list of values separated
/// by `,`
/// - `=`, the "auto" operator, behaves as `*=` if the matcher is `Path`, and as `==` otherwise.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
#[non_exhaustive]
pub enum Op {
/// The auto operator, `=`, resolves to `*=` or `==` depending on the matcher.
Auto,
/// The `==` operator, matches on exact string equality.
Equal,
/// The `!=` operator, matches on exact string inequality.
NotEqual,
/// The `~=` operator, matches on a regex.
Regex,
/// The `~!` operator, matches on a regex (matches are fails).
NotRegex,
/// The `*=` operator, matches on a glob.
Glob,
/// The `*!` operator, matches on a glob (matches are fails).
NotGlob,
/// The `:=` operator, matches (with string compares) on a set of values (belongs are passes).
InSet,
/// The `:!` operator, matches on a set of values (belongs are fails).
NotInSet,
}
/// A filter value (pattern to match with).
#[derive(Debug, Clone)]
#[non_exhaustive]
pub enum Pattern {
/// An exact string.
Exact(String),
/// A regex.
Regex(Regex),
/// A glob.
///
/// This is stored as a string as globs are compiled together rather than on a per-filter basis.
Glob(String),
/// A set of exact strings.
Set(HashSet<String>),
}
impl PartialEq<Self> for Pattern {
fn eq(&self, other: &Self) -> bool {
match (self, other) {
(Self::Exact(l), Self::Exact(r)) | (Self::Glob(l), Self::Glob(r)) => l == r,
(Self::Regex(l), Self::Regex(r)) => l.as_str() == r.as_str(),
(Self::Set(l), Self::Set(r)) => l == r,
_ => false,
}
}
}
impl Eq for Pattern {}
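
To make the operator table above concrete, a minimal sketch of the removed `Filter` type matching an event source with the `:=` (InSet) operator; the set values and helper name are illustrative:

```rust
use std::collections::HashSet;

// Illustrative: a Source filter that passes for "keyboard" or "filesystem".
fn source_in_set() -> Result<(), TaggedFiltererError> {
    let filter = Filter {
        in_path: None,
        on: Matcher::Source,
        op: Op::InSet,
        pat: Pattern::Set(HashSet::from([
            "keyboard".to_string(),
            "filesystem".to_string(),
        ])),
        negate: false,
    };
    assert!(filter.matches("keyboard")?);
    assert!(!filter.matches("internal")?);
    Ok(())
}
```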

View File

@ -1,540 +0,0 @@
use std::path::PathBuf;
use std::sync::Arc;
use std::{collections::HashMap, convert::Into};
use futures::{stream::FuturesOrdered, TryStreamExt};
use ignore::{
gitignore::{Gitignore, GitignoreBuilder},
Match,
};
use ignore_files::{IgnoreFile, IgnoreFilter};
use tokio::fs::canonicalize;
use tracing::{debug, trace, trace_span};
use watchexec::{
error::RuntimeError,
event::{Event, FileType, Priority, ProcessEnd, Tag},
filter::Filterer,
};
use watchexec_filterer_ignore::IgnoreFilterer;
use watchexec_signals::Signal;
use crate::{swaplock::SwapLock, Filter, Matcher, Op, Pattern, TaggedFiltererError};
/// A complex filterer that can match any event tag and supports different matching operators.
///
/// See the crate-level documentation for more information.
#[derive(Debug)]
pub struct TaggedFilterer {
/// The directory the project is in, its origin.
///
/// This is used to resolve absolute paths without an `in_path` context.
origin: PathBuf,
/// Where the program is running from.
///
/// This is used to resolve relative paths without an `in_path` context.
workdir: PathBuf,
/// All filters that are applied, in order, by matcher.
filters: SwapLock<HashMap<Matcher, Vec<Filter>>>,
/// Sub-filterer for ignore files.
ignore_filterer: SwapLock<IgnoreFilterer>,
/// Compiled matcher for Glob filters.
glob_compiled: SwapLock<Option<Gitignore>>,
/// Compiled matcher for NotGlob filters.
not_glob_compiled: SwapLock<Option<Gitignore>>,
}
impl Filterer for TaggedFilterer {
fn check_event(&self, event: &Event, priority: Priority) -> Result<bool, RuntimeError> {
self.check(event, priority).map_err(Into::into)
}
}
impl TaggedFilterer {
fn check(&self, event: &Event, priority: Priority) -> Result<bool, TaggedFiltererError> {
let _span = trace_span!("filterer_check").entered();
trace!(?event, ?priority, "checking event");
{
trace!("checking priority");
if let Some(filters) = self.filters.borrow().get(&Matcher::Priority).cloned() {
trace!(filters=%filters.len(), "found some filters for priority");
//
let mut pri_match = true;
for filter in &filters {
let _span = trace_span!("checking filter against priority", ?filter).entered();
let applies = filter.matches(match priority {
Priority::Low => "low",
Priority::Normal => "normal",
Priority::High => "high",
Priority::Urgent => unreachable!("urgent by-passes filtering"),
})?;
if filter.negate {
if applies {
trace!(prev=%pri_match, now=%true, "negate filter passes, passing this priority");
pri_match = true;
break;
}
trace!(prev=%pri_match, now=%pri_match, "negate filter fails, ignoring");
} else {
trace!(prev=%pri_match, this=%applies, now=%(pri_match&applies), "filter applies to priority");
pri_match &= applies;
}
}
if !pri_match {
trace!("priority fails check, failing entire event");
return Ok(false);
}
} else {
trace!("no filters for priority, skipping (pass)");
}
}
{
trace!("checking internal ignore filterer");
let igf = self.ignore_filterer.borrow();
if !igf
.check_event(event, priority)
.expect("IgnoreFilterer never errors")
{
trace!("internal ignore filterer matched (fail)");
return Ok(false);
}
}
if self.filters.borrow().is_empty() {
trace!("no filters, skipping entire check (pass)");
return Ok(true);
}
trace!(tags=%event.tags.len(), "checking all tags on the event");
for tag in &event.tags {
let _span = trace_span!("check_tag", ?tag).entered();
trace!("checking tag");
for matcher in Matcher::from_tag(tag) {
let _span = trace_span!("check_matcher", ?matcher).entered();
let filters = self.filters.borrow().get(matcher).cloned();
if let Some(tag_filters) = filters {
if tag_filters.is_empty() {
trace!("no filters for this matcher, skipping (pass)");
continue;
}
trace!(filters=%tag_filters.len(), "found some filters for this matcher");
let mut tag_match = true;
if let (Matcher::Path, Tag::Path { path, file_type }) = (matcher, tag) {
let is_dir = file_type.map_or(false, |ft| matches!(ft, FileType::Dir));
{
let gc = self.glob_compiled.borrow();
if let Some(igs) = gc.as_ref() {
let _span =
trace_span!("checking_compiled_filters", compiled=%"Glob")
.entered();
match if path.strip_prefix(&self.origin).is_ok() {
trace!("checking against path or parents");
igs.matched_path_or_any_parents(path, is_dir)
} else {
trace!("checking against path only");
igs.matched(path, is_dir)
} {
Match::None => {
trace!("no match (fail)");
tag_match &= false;
}
Match::Ignore(glob) => {
if glob
.from()
.map_or(true, |f| path.strip_prefix(f).is_ok())
{
trace!(?glob, "positive match (pass)");
tag_match &= true;
} else {
trace!(
?glob,
"positive match, but not in scope (ignore)"
);
}
}
Match::Whitelist(glob) => {
trace!(?glob, "negative match (ignore)");
}
}
}
}
{
let ngc = self.not_glob_compiled.borrow();
if let Some(ngs) = ngc.as_ref() {
let _span =
trace_span!("checking_compiled_filters", compiled=%"NotGlob")
.entered();
match if path.strip_prefix(&self.origin).is_ok() {
trace!("checking against path or parents");
ngs.matched_path_or_any_parents(path, is_dir)
} else {
trace!("checking against path only");
ngs.matched(path, is_dir)
} {
Match::None => {
trace!("no match (pass)");
tag_match &= true;
}
Match::Ignore(glob) => {
if glob
.from()
.map_or(true, |f| path.strip_prefix(f).is_ok())
{
trace!(?glob, "positive match (fail)");
tag_match &= false;
} else {
trace!(
?glob,
"positive match, but not in scope (ignore)"
);
}
}
Match::Whitelist(glob) => {
trace!(?glob, "negative match (pass)");
tag_match = true;
}
}
}
}
}
// those are handled with the compiled ignore filters above
let tag_filters = tag_filters
.into_iter()
.filter(|f| {
!matches!(
(tag, matcher, f),
(
Tag::Path { .. },
Matcher::Path,
Filter {
on: Matcher::Path,
op: Op::Glob | Op::NotGlob,
pat: Pattern::Glob(_),
..
}
)
)
})
.collect::<Vec<_>>();
if tag_filters.is_empty() && tag_match {
trace!("no more filters for this matcher, skipping (pass)");
continue;
}
trace!(filters=%tag_filters.len(), "got some filters to check still");
for filter in &tag_filters {
let _span = trace_span!("checking filter against tag", ?filter).entered();
if let Some(app) = self.match_tag(filter, tag)? {
if filter.negate {
if app {
trace!(prev=%tag_match, now=%true, "negate filter passes, passing this matcher");
tag_match = true;
break;
}
trace!(prev=%tag_match, now=%tag_match, "negate filter fails, ignoring");
} else {
trace!(prev=%tag_match, this=%app, now=%(tag_match&app), "filter applies to this tag");
tag_match &= app;
}
}
}
if !tag_match {
trace!("matcher fails check, failing entire event");
return Ok(false);
}
trace!("matcher passes check, continuing");
} else {
trace!("no filters for this matcher, skipping (pass)");
}
}
}
trace!("passing event");
Ok(true)
}
/// Initialise a new tagged filterer with no filters.
///
/// This takes two paths: the project origin, and the current directory. The current directory
/// is not obtained from the environment so you can customise it; generally you should use
/// [`std::env::current_dir()`] though.
///
/// The origin is the directory the main project that is being watched is in. This is used to
/// resolve absolute paths given in filters without an `in_path` field (e.g. all filters parsed
/// from text), and for ignore file based filtering.
///
/// The workdir is used to resolve relative paths given in filters without an `in_path` field.
///
/// So, if origin is `/path/to/project` and workdir is `/path/to/project/subtree`:
/// - `path=foo.bar` is resolved to `/path/to/project/subtree/foo.bar`
/// - `path=/foo.bar` is resolved to `/path/to/project/foo.bar`
pub async fn new(origin: PathBuf, workdir: PathBuf) -> Result<Arc<Self>, TaggedFiltererError> {
let origin = canonicalize(origin)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise origin on new tagged filterer",
err,
})?;
Ok(Arc::new(Self {
filters: SwapLock::new(HashMap::new()),
ignore_filterer: SwapLock::new(IgnoreFilterer(IgnoreFilter::empty(&origin))),
glob_compiled: SwapLock::new(None),
not_glob_compiled: SwapLock::new(None),
workdir: canonicalize(workdir)
.await
.map_err(|err| TaggedFiltererError::IoError {
about: "canonicalise workdir on new tagged filterer",
err,
})?,
origin,
}))
}
// filter ctx event path filter outcome
// /foo/bar /foo/bar/baz.txt baz.txt pass
// /foo/bar /foo/bar/baz.txt /baz.txt pass
// /foo/bar /foo/bar/baz.txt /baz.* pass
// /foo/bar /foo/bar/baz.txt /blah fail
// /foo/quz /foo/bar/baz.txt /baz.* skip
// Ok(Some(bool)) => the match was applied, bool is the result
// Ok(None) => for some precondition, the match was not done (mismatched tag, out of context, …)
fn match_tag(&self, filter: &Filter, tag: &Tag) -> Result<Option<bool>, TaggedFiltererError> {
trace!(matcher=?filter.on, "matching filter to tag");
fn sig_match(sig: Signal) -> (&'static str, i32) {
match sig {
Signal::Hangup | Signal::Custom(1) => ("HUP", 1),
Signal::ForceStop | Signal::Custom(9) => ("KILL", 9),
Signal::Interrupt | Signal::Custom(2) => ("INT", 2),
Signal::Quit | Signal::Custom(3) => ("QUIT", 3),
Signal::Terminate | Signal::Custom(15) => ("TERM", 15),
Signal::User1 | Signal::Custom(10) => ("USR1", 10),
Signal::User2 | Signal::Custom(12) => ("USR2", 12),
Signal::Custom(n) => ("UNK", n),
_ => ("UNK", 0),
}
}
match (tag, filter.on) {
(tag, Matcher::Tag) => filter.matches(tag.discriminant_name()),
(Tag::Path { path, .. }, Matcher::Path) => {
let resolved = if let Some(ctx) = &filter.in_path {
if let Ok(suffix) = path.strip_prefix(ctx) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else {
return Ok(None);
}
} else if let Ok(suffix) = path.strip_prefix(&self.workdir) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else if let Ok(suffix) = path.strip_prefix(&self.origin) {
suffix.strip_prefix("/").unwrap_or(suffix)
} else {
path.strip_prefix("/").unwrap_or(path)
};
trace!(?resolved, "resolved path to match filter against");
if matches!(filter.op, Op::Glob | Op::NotGlob) {
trace!("path glob match with match_tag is already handled");
return Ok(None);
}
filter.matches(resolved.to_string_lossy())
}
(
Tag::Path {
file_type: Some(ft),
..
},
Matcher::FileType,
) => filter.matches(ft.to_string()),
(Tag::FileEventKind(kind), Matcher::FileEventKind) => {
filter.matches(format!("{kind:?}"))
}
(Tag::Source(src), Matcher::Source) => filter.matches(src.to_string()),
(Tag::Process(pid), Matcher::Process) => filter.matches(pid.to_string()),
(Tag::Signal(sig), Matcher::Signal) => {
let (text, int) = sig_match(*sig);
Ok(filter.matches(text)?
|| filter.matches(format!("SIG{text}"))?
|| filter.matches(int.to_string())?)
}
(Tag::ProcessCompletion(ope), Matcher::ProcessCompletion) => match ope {
None => filter.matches("_"),
Some(ProcessEnd::Success) => filter.matches("success"),
Some(ProcessEnd::ExitError(int)) => filter.matches(format!("error({int})")),
Some(ProcessEnd::ExitSignal(sig)) => {
let (text, int) = sig_match(*sig);
Ok(filter.matches(format!("signal({text})"))?
|| filter.matches(format!("signal(SIG{text})"))?
|| filter.matches(format!("signal({int})"))?)
}
Some(ProcessEnd::ExitStop(int)) => filter.matches(format!("stop({int})")),
Some(ProcessEnd::Exception(int)) => filter.matches(format!("exception({int:X})")),
Some(ProcessEnd::Continued) => filter.matches("continued"),
},
(_, _) => {
trace!("no match for tag, skipping");
return Ok(None);
}
}
.map(Some)
}
/// Add some filters to the filterer.
///
/// This is async as it submits the new filters to the live filterer, which may be holding a
/// read lock. It takes a slice of filters so it can efficiently add a large number of filters
/// with a single write, without needing to acquire the lock repeatedly.
///
/// If filters with glob operations are added, this method also recompiles the filterer's glob
/// matchers once the new filters are in place. It should not be used to insert an
/// [`IgnoreFile`]: use [`add_ignore_file()`](Self::add_ignore_file) for that instead.
pub async fn add_filters(&self, filters: &[Filter]) -> Result<(), TaggedFiltererError> {
debug!(?filters, "adding filters to filterer");
let mut recompile_globs = false;
let mut recompile_not_globs = false;
#[allow(clippy::from_iter_instead_of_collect)]
let filters = FuturesOrdered::from_iter(
filters
.iter()
.cloned()
.inspect(|f| match f.op {
Op::Glob => {
recompile_globs = true;
}
Op::NotGlob => {
recompile_not_globs = true;
}
_ => {}
})
.map(Filter::canonicalised),
)
.try_collect::<Vec<_>>()
.await?;
trace!(?filters, "canonicalised filters");
// TODO: use miette's related and issue canonicalisation errors for all of them
self.filters
.change(|fs| {
for filter in filters {
fs.entry(filter.on).or_default().push(filter);
}
})
.map_err(|err| TaggedFiltererError::FilterChange { action: "add", err })?;
trace!("inserted filters into swaplock");
if recompile_globs {
self.recompile_globs(Op::Glob)?;
}
if recompile_not_globs {
self.recompile_globs(Op::NotGlob)?;
}
Ok(())
}
fn recompile_globs(&self, op_filter: Op) -> Result<(), TaggedFiltererError> {
trace!(?op_filter, "recompiling globs");
let target = match op_filter {
Op::Glob => &self.glob_compiled,
Op::NotGlob => &self.not_glob_compiled,
_ => unreachable!("recompile_globs called with invalid op"),
};
let globs = {
let filters = self.filters.borrow();
if let Some(fs) = filters.get(&Matcher::Path) {
trace!(?op_filter, "pulling filters from swaplock");
// we want to hold the lock as little as possible, so we clone the filters
fs.iter()
.filter(|&f| f.op == op_filter)
.cloned()
.collect::<Vec<_>>()
} else {
trace!(?op_filter, "no filters, erasing compiled glob");
return target
.replace(None)
.map_err(TaggedFiltererError::GlobsetChange);
}
};
let mut builder = GitignoreBuilder::new(&self.origin);
for filter in globs {
if let Pattern::Glob(mut glob) = filter.pat {
if filter.negate {
glob.insert(0, '!');
}
trace!(?op_filter, in_path=?filter.in_path, ?glob, "adding new glob line");
builder
.add_line(filter.in_path, &glob)
.map_err(TaggedFiltererError::GlobParse)?;
}
}
trace!(?op_filter, "finalising compiled glob");
let compiled = builder.build().map_err(TaggedFiltererError::GlobParse)?;
trace!(?op_filter, "swapping in new compiled glob");
target
.replace(Some(compiled))
.map_err(TaggedFiltererError::GlobsetChange)
}
/// Reads a gitignore-style [`IgnoreFile`] and adds it to the filterer.
pub async fn add_ignore_file(&self, file: &IgnoreFile) -> Result<(), TaggedFiltererError> {
let mut new = { self.ignore_filterer.borrow().clone() };
new.0
.add_file(file)
.await
.map_err(TaggedFiltererError::Ignore)?;
self.ignore_filterer
.replace(new)
.map_err(TaggedFiltererError::IgnoreSwap)?;
Ok(())
}
/// Clears all filters from the filterer.
///
/// This also recompiles the glob matchers, so essentially it resets the entire filterer state.
pub fn clear_filters(&self) -> Result<(), TaggedFiltererError> {
debug!("removing all filters from filterer");
self.filters.replace(Default::default()).map_err(|err| {
TaggedFiltererError::FilterChange {
action: "clear all",
err,
}
})?;
self.recompile_globs(Op::Glob)?;
self.recompile_globs(Op::NotGlob)?;
Ok(())
}
}
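
Taken together, `new`, `add_filters`, `add_ignore_file`, and `clear_filters` form the public surface of this filterer. A minimal usage sketch, assuming the type is named `TaggedFilterer`, is re-exported from a crate named `watchexec-filterer-tagged`, and runs in an async (Tokio) context; the paths and filter text are illustrative:

```rust
use std::sync::Arc;

// Crate name is assumed; the items are re-exported at the crate root.
use watchexec_filterer_tagged::{Filter, TaggedFilterer, TaggedFiltererError};

async fn project_filterer() -> Result<Arc<TaggedFilterer>, TaggedFiltererError> {
	// Origin and workdir are canonicalised by `new`; relative filters are
	// resolved against the workdir, `/`-prefixed ones against the origin.
	let filterer = TaggedFilterer::new(
		"/path/to/project".into(),
		"/path/to/project/subtree".into(),
	)
	.await?;

	// Parse a textual filter and add it; because it uses a glob operation,
	// the compiled glob matcher is rebuilt as part of `add_filters`.
	let only_rust: Filter = "path*=**/*.rs".parse()?;
	filterer.add_filters(&[only_rust]).await?;

	Ok(filterer)
}
```

Every fallible step here returns `TaggedFiltererError`, so the whole construction chains with `?`.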


@@ -1,92 +0,0 @@
//! A filterer implementation that exposes the full capabilities of Watchexec.
//!
//! Filters match against [event tags][Tag]; can be exact matches, glob matches, regex matches, or
//! set matches; can reverse the match (equal/not equal, etc); and can be negated.
//!
//! [Filters][Filter] can be generated from your application and inserted directly, or they can be
//! parsed from a textual format:
//!
//! ```text
//! [!]{Matcher}{Op}{Value}
//! ```
//!
//! For example:
//!
//! ```text
//! path==/foo/bar
//! path*=**/bar
//! path~=bar$
//! !kind=file
//! ```
//!
//! There is a set of [operators][Op]:
//! - `==` and `!=`: exact match and exact not match (case insensitive)
//! - `~=` and `~!`: regex match and regex not match
//! - `*=` and `*!`: glob match and glob not match
//! - `:=` and `:!`: set match and set not match
//!
//! Sets are a list of values separated by `,`.
//!
//! In addition to the two-symbol operators, there is the `=` "auto" operator, which maps to the
//! most convenient operator for the given _matcher_. The current mapping is:
//!
//! | Matcher | Operator |
//! |-------------------------------------------------|---------------|
//! | [Tag](Matcher::Tag) | `:=` (in set) |
//! | [Path](Matcher::Path) | `*=` (glob) |
//! | [FileType](Matcher::FileType) | `:=` (in set) |
//! | [FileEventKind](Matcher::FileEventKind) | `*=` (glob) |
//! | [Source](Matcher::Source) | `:=` (in set) |
//! | [Process](Matcher::Process) | `:=` (in set) |
//! | [Signal](Matcher::Signal) | `:=` (in set) |
//! | [ProcessCompletion](Matcher::ProcessCompletion) | `*=` (glob) |
//! | [Priority](Matcher::Priority) | `:=` (in set) |
//!
//! [Matchers][Matcher] correspond to Tags, but are not one-to-one: the `path` matcher operates on
//! the `path` part of the `Path` tag, and the `type` matcher operates on the `file_type`, for
//! example.
//!
//! | Matcher | Syntax | Tag |
//! |------------------------------------|----------|----------------------------------------------|
//! | [Tag](Matcher::Tag) | `tag` | _the presence of a Tag on the event_ |
//! | [Path](Matcher::Path) | `path` | [Path](Tag::Path) (`path` field) |
//! | [FileType](Matcher::FileType) | `type` | [Path](Tag::Path) (`file_type` field, when Some) |
//! | [FileEventKind](Matcher::FileEventKind) | `kind` or `fek` | [FileEventKind](Tag::FileEventKind) |
//! | [Source](Matcher::Source) | `source` or `src` | [Source](Tag::Source) |
//! | [Process](Matcher::Process) | `process` or `pid` | [Process](Tag::Process) |
//! | [Signal](Matcher::Signal) | `signal` | [Signal](Tag::Signal) |
//! | [ProcessCompletion](Matcher::ProcessCompletion) | `complete` or `exit` | [ProcessCompletion](Tag::ProcessCompletion) |
//! | [Priority](Matcher::Priority) | `priority` | special: event [Priority] |
//!
//! Filters are checked in order, grouped per tag and per matcher. Filter groups may be checked in
//! any order, but the filters in the groups are checked in add order. Path glob filters are always
//! checked first, for internal reasons.
//!
//! The `negate` boolean field behaves specially: it is not operator negation, but rather the same
//! kind of behaviour that is applied to `!`-prefixed globs in gitignore files: if a negated filter
//! matches the event, the result of the event checking for that matcher is reverted to `true`, even
//! if a previous filter set it to `false`. Unmatched negated filters are ignored.
//!
//! Glob syntax is as supported by the [ignore] crate for Paths, and by [globset] otherwise. (As of
//! writing, the ignore crate uses globset internally). Regex syntax is the default syntax of the
//! [regex] crate.
#![doc(html_favicon_url = "https://watchexec.github.io/logo:watchexec.svg")]
#![doc(html_logo_url = "https://watchexec.github.io/logo:watchexec.svg")]
#![warn(clippy::unwrap_used, missing_docs)]
#![deny(rust_2018_idioms)]
// to make filters
pub use regex::Regex;
pub use error::*;
pub use files::*;
pub use filter::*;
pub use filterer::*;
mod error;
mod files;
mod filter;
mod filterer;
mod parse;
mod swaplock;
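
The module docs above note that filters can be built in code and inserted directly, rather than parsed from text. A sketch of that route, assuming the `Filter` fields are public and the types are re-exported at the crate root as the `pub use` lines suggest; the paths and glob are illustrative:

```rust
use watchexec_filterer_tagged::{Filter, Matcher, Op, Pattern};

fn main() {
	// Roughly the textual filter `path*!**/target/**`, plus an `in_path` scope:
	// exclude anything under a `target` directory, but only for events whose
	// paths fall under the (illustrative) `crates` subtree.
	let filter = Filter {
		in_path: Some("/path/to/project/crates".into()),
		on: Matcher::Path,
		op: Op::NotGlob,
		pat: Pattern::Glob("**/target/**".into()),
		negate: false,
	};
	println!("{filter:?}");
}
```

The `in_path` scope has no textual equivalent: the parser below always leaves it as `None`, so it can only be set this way.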


@@ -1,139 +0,0 @@
use std::str::FromStr;
use nom::{
branch::alt,
bytes::complete::{is_not, tag, tag_no_case, take_while1},
character::complete::char,
combinator::{map_res, opt},
sequence::{delimited, tuple},
Finish, IResult,
};
use regex::Regex;
use tracing::trace;
use crate::{Filter, Matcher, Op, Pattern, TaggedFiltererError};
impl FromStr for Filter {
type Err = TaggedFiltererError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
fn matcher(i: &str) -> IResult<&str, Matcher> {
map_res(
alt((
tag_no_case("tag"),
tag_no_case("path"),
tag_no_case("type"),
tag_no_case("kind"),
tag_no_case("fek"),
tag_no_case("source"),
tag_no_case("src"),
tag_no_case("priority"),
tag_no_case("process"),
tag_no_case("pid"),
tag_no_case("signal"),
tag_no_case("sig"),
tag_no_case("complete"),
tag_no_case("exit"),
)),
|m: &str| match m.to_ascii_lowercase().as_str() {
"tag" => Ok(Matcher::Tag),
"path" => Ok(Matcher::Path),
"type" => Ok(Matcher::FileType),
"kind" | "fek" => Ok(Matcher::FileEventKind),
"source" | "src" => Ok(Matcher::Source),
"priority" => Ok(Matcher::Priority),
"process" | "pid" => Ok(Matcher::Process),
"signal" | "sig" => Ok(Matcher::Signal),
"complete" | "exit" => Ok(Matcher::ProcessCompletion),
m => Err(format!("unknown matcher: {m}")),
},
)(i)
}
fn op(i: &str) -> IResult<&str, Op> {
map_res(
alt((
tag("=="),
tag("!="),
tag("~="),
tag("~!"),
tag("*="),
tag("*!"),
tag(":="),
tag(":!"),
tag("="),
)),
|o: &str| match o {
"==" => Ok(Op::Equal),
"!=" => Ok(Op::NotEqual),
"~=" => Ok(Op::Regex),
"~!" => Ok(Op::NotRegex),
"*=" => Ok(Op::Glob),
"*!" => Ok(Op::NotGlob),
":=" => Ok(Op::InSet),
":!" => Ok(Op::NotInSet),
"=" => Ok(Op::Auto),
o => Err(format!("unknown op: `{o}`")),
},
)(i)
}
fn pattern(i: &str) -> IResult<&str, &str> {
alt((
// TODO: escapes
delimited(char('"'), is_not("\""), char('"')),
delimited(char('\''), is_not("'"), char('\'')),
take_while1(|_| true),
))(i)
}
fn filter(i: &str) -> IResult<&str, Filter> {
map_res(
tuple((opt(tag("!")), matcher, op, pattern)),
|(n, m, o, p)| -> Result<_, ()> {
Ok(Filter {
in_path: None,
on: m,
op: match o {
Op::Auto => match m {
Matcher::Path
| Matcher::FileEventKind
| Matcher::ProcessCompletion => Op::Glob,
_ => Op::InSet,
},
o => o,
},
pat: match (o, m) {
// TODO: carry regex/glob errors through
(
Op::Auto,
Matcher::Path | Matcher::FileEventKind | Matcher::ProcessCompletion,
)
| (Op::Glob | Op::NotGlob, _) => Pattern::Glob(p.to_string()),
(Op::Auto | Op::InSet | Op::NotInSet, _) => {
Pattern::Set(p.split(',').map(|s| s.trim().to_string()).collect())
}
(Op::Regex | Op::NotRegex, _) => {
Pattern::Regex(Regex::new(p).map_err(drop)?)
}
(Op::Equal | Op::NotEqual, _) => Pattern::Exact(p.to_string()),
},
negate: n.is_some(),
})
},
)(i)
}
trace!(src=?s, "parsing tagged filter");
filter(s)
.finish()
.map(|(_, f)| {
trace!(src=?s, filter=?f, "parsed tagged filter");
f
})
.map_err(|e| TaggedFiltererError::Parse {
src: s.to_string(),
err: e.code,
})
}
}
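
A short sketch of what this parser produces, including the `=` auto-operator mapping described in the crate docs. It assumes `Filter`, `Matcher`, `Op`, and `Pattern` expose public fields and derive `Debug`/`PartialEq`, consistent with how they are used elsewhere in these files:

```rust
use watchexec_filterer_tagged::{Filter, Matcher, Op, Pattern};

fn main() {
	// `=` is the auto operator: for the `path` matcher it maps to a glob.
	let f: Filter = "path=**/*.rs".parse().expect("filter should parse");
	assert_eq!(f.on, Matcher::Path);
	assert_eq!(f.op, Op::Glob);
	assert!(matches!(&f.pat, Pattern::Glob(g) if g.as_str() == "**/*.rs"));

	// For the `signal` matcher, `=` maps to a set match instead.
	let f: Filter = "signal=SIGINT,SIGTERM".parse().expect("filter should parse");
	assert_eq!(f.on, Matcher::Signal);
	assert_eq!(f.op, Op::InSet);

	// A leading `!` negates the filter as a whole.
	let f: Filter = "!kind=file".parse().expect("filter should parse");
	assert!(f.negate);
	assert_eq!(f.op, Op::Glob); // auto maps to glob for the file-event-kind matcher
}
```

Quoted patterns (single or double) are also accepted, though escapes are not yet handled, per the TODO above.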


@@ -1,58 +0,0 @@
//! A value that is always available, but can be swapped out.
use std::fmt;
use tokio::sync::watch::{channel, error::SendError, Receiver, Ref, Sender};
/// A value that is always available, but can be swapped out.
///
/// This is a wrapper around a [Tokio `watch`][tokio::sync::watch] channel. The value can be read
/// and replaced without awaiting. Borrows should be held for as short a time as possible, as they
/// keep a read lock.
pub struct SwapLock<T: Clone> {
r: Receiver<T>,
s: Sender<T>,
}
impl<T> SwapLock<T>
where
T: Clone,
{
/// Create a new `SwapLock` with the given value.
pub fn new(inner: T) -> Self {
let (s, r) = channel(inner);
Self { r, s }
}
/// Get a reference to the value.
pub fn borrow(&self) -> Ref<'_, T> {
self.r.borrow()
}
/// Rewrite the value using a closure.
///
/// This obtains a clone of the value, and then calls the closure with a mutable reference to
/// it. Once the closure returns, the value is swapped in.
pub fn change(&self, f: impl FnOnce(&mut T)) -> Result<(), SendError<T>> {
let mut new = { self.r.borrow().clone() };
f(&mut new);
self.s.send(new)
}
/// Replace the value with a new one.
pub fn replace(&self, new: T) -> Result<(), SendError<T>> {
self.s.send(new)
}
}
impl<T> fmt::Debug for SwapLock<T>
where
T: fmt::Debug + Clone,
{
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> Result<(), fmt::Error> {
f.debug_struct("SwapLock")
.field("(watch)", &self.r)
.finish()
}
}
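
A minimal sketch of how `SwapLock` behaves, written as a hypothetical unit test inside this module (it is private, declared as `mod swaplock;` without `pub`); the vector value is illustrative. `send` cannot fail here because the lock holds its own receiver alongside the sender:

```rust
#[cfg(test)]
mod tests {
	use super::SwapLock;

	#[test]
	fn change_and_replace() {
		let lock = SwapLock::new(vec![1, 2, 3]);
		assert_eq!(*lock.borrow(), vec![1, 2, 3]);

		// `change` clones the current value, lets the closure mutate the clone,
		// then swaps the clone in over the watch channel.
		lock.change(|v| v.push(4)).expect("receiver held by the SwapLock");
		assert_eq!(*lock.borrow(), vec![1, 2, 3, 4]);

		// `replace` swaps in a wholly new value.
		lock.replace(vec![9]).expect("receiver held by the SwapLock");
		assert_eq!(*lock.borrow(), vec![9]);
	}
}
```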


@@ -1,114 +0,0 @@
use watchexec::event::{filekind::*, ProcessEnd, Source};
use watchexec_signals::Signal;
mod helpers;
use helpers::tagged_ff::*;
#[tokio::test]
async fn empty_filter_passes_everything() {
let filterer = filt("", &[], &[file("empty.wef").await]).await;
filterer.file_does_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
filterer.file_does_pass("Gemfile.toml");
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/test/Cargo.toml");
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("apples/carrots/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/oranges");
filterer.file_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.file_does_pass("apples/oranges/bananas");
filterer.dir_does_pass("apples/carrots/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/oranges");
filterer.dir_does_pass("apples/carrots/cauliflowers/artichokes/oranges");
filterer.dir_does_pass("apples/oranges/bananas");
filterer.source_does_pass(Source::Keyboard);
filterer.fek_does_pass(FileEventKind::Create(CreateKind::File));
filterer.pid_does_pass(1234);
filterer.signal_does_pass(Signal::User1);
filterer.complete_does_pass(None);
filterer.complete_does_pass(Some(ProcessEnd::Success));
}
#[tokio::test]
async fn folder() {
let filterer = filt("", &[], &[file("folder.wef").await]).await;
filterer.file_doesnt_pass("apples");
filterer.file_doesnt_pass("apples/oranges/bananas");
filterer.dir_doesnt_pass("apples");
filterer.dir_doesnt_pass("apples/carrots");
filterer.file_doesnt_pass("raw-prunes");
filterer.dir_doesnt_pass("raw-prunes");
filterer.file_doesnt_pass("prunes");
filterer.file_doesnt_pass("prunes/oranges/bananas");
filterer.dir_does_pass("prunes");
filterer.dir_does_pass("prunes/carrots/cauliflowers/oranges");
}
#[tokio::test]
async fn patterns() {
let filterer = filt("", &[], &[file("path-patterns.wef").await]).await;
// Unmatched
filterer.file_does_pass("FINAL-FINAL.docx");
filterer.dir_does_pass("/a/folder");
filterer.file_does_pass("rat");
filterer.file_does_pass("foo/bar/rat");
filterer.file_does_pass("/foo/bar/rat");
// Cargo.toml
filterer.file_doesnt_pass("Cargo.toml");
filterer.dir_doesnt_pass("Cargo.toml");
filterer.file_does_pass("Cargo.json");
// package.json
filterer.file_doesnt_pass("package.json");
filterer.dir_doesnt_pass("package.json");
filterer.file_does_pass("package.toml");
// *.gemspec
filterer.file_doesnt_pass("pearl.gemspec");
filterer.dir_doesnt_pass("sapphire.gemspec");
filterer.file_doesnt_pass(".gemspec");
filterer.file_does_pass("diamond.gemspecial");
// test-[^u]+
filterer.file_does_pass("test-unit");
filterer.dir_doesnt_pass("test-integration");
filterer.file_does_pass("tester-helper");
// [.]sw[a-z]$
filterer.file_doesnt_pass("source.swa");
filterer.file_doesnt_pass(".source.swb");
filterer.file_doesnt_pass("sub/source.swc");
filterer.file_does_pass("sub/dir.swa/file");
filterer.file_does_pass("source.sw1");
}
#[tokio::test]
async fn negate() {
let filterer = filt("", &[], &[file("negate.wef").await]).await;
filterer.file_doesnt_pass("yeah");
filterer.file_does_pass("nah");
filterer.file_does_pass("nah.yeah");
}
#[tokio::test]
async fn ignores_and_filters() {
let filterer = filt("", &[file("globs").await.0], &[file("folder.wef").await]).await;
// ignored
filterer.dir_doesnt_pass("test-helper");
// not filtered
filterer.dir_doesnt_pass("tester-helper");
// not ignored && filtered
filterer.dir_does_pass("prunes/tester-helper");
}

Some files were not shown because too many files have changed in this diff.