Compare commits

...

391 Commits

Author SHA1 Message Date
Jakub Jirutka 7c61b462dd disable unnecessary/unused regex features to reduce binary size
This will reduce the monolith binary size by ~15%.
2022-09-20 11:46:26 -04:00
Simone Mosciatti ef3684025b move to use http instead of https 2022-09-11 14:30:44 -04:00
Simone Mosciatti db7ee697b3 rewrite small part of the input argument handling
the commit rewrites a small part of the input argument handling, trying
to follow best Rust practices.
We get rid of a variable and of a mutable reference while keeping the
code a bit more concise.
2022-09-11 14:30:44 -04:00
Sunshine 89ce5029b9
add option to blacklist/whitelist domains 2022-09-01 13:35:52 -10:00
dependabot[bot] 54609b10e5
Bump iana-time-zone from 0.1.44 to 0.1.46 (#316)
Bumps [iana-time-zone](https://github.com/strawlab/iana-time-zone) from 0.1.44 to 0.1.46.
- [Release notes](https://github.com/strawlab/iana-time-zone/releases)
- [Changelog](https://github.com/strawlab/iana-time-zone/blob/main/CHANGELOG.md)
- [Commits](https://github.com/strawlab/iana-time-zone/compare/0.1.44...v0.1.46)

---
updated-dependencies:
- dependency-name: iana-time-zone
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-08-31 11:35:38 -10:00
Sunshine 013d93bacc
update 3rd-party dependencies and bump version number 2022-08-14 05:12:39 -10:00
Sunshine 0df8613789
Rewrite part of function retrieve_asset, include support for brotli and deflate (#312)
do not crash the app if reqwest throws, add support for deflate & brotli
2022-08-06 19:07:39 -10:00
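[Editorial note] A minimal sketch of the error handling described in the commit above, assuming reqwest's blocking client; this is an illustration, not monolith's actual retrieve_asset. Brotli and deflate support is a matter of enabling the corresponding reqwest cargo features, as the Cargo.toml diff further down shows.

```rust
use reqwest::blocking::Client;

// Hypothetical helper: fetch a URL and return its bytes, propagating
// network failures as a Result instead of panicking with unwrap/expect.
fn fetch_asset(client: &Client, url: &str) -> Result<Vec<u8>, reqwest::Error> {
    let response = client.get(url).send()?; // network error -> Err, no crash
    let bytes = response.bytes()?;          // body read error -> Err, no crash
    Ok(bytes.to_vec())
}

fn main() {
    let client = Client::new();
    match fetch_asset(&client, "https://example.com/style.css") {
        Ok(data) => println!("retrieved {} bytes", data.len()),
        Err(err) => eprintln!("request failed, continuing without asset: {}", err),
    }
}
```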
Sunshine 68a1531a11
Update packages (#313)
update dependencies
2022-08-06 18:21:53 -10:00
Sunshine 99c3be1804
Merge pull request #308 from Y2Z/dependabot/cargo/tokio-1.16.1
Bump tokio from 1.12.0 to 1.16.1
2022-08-06 17:07:18 -10:00
Sunshine 80559e7224
Merge pull request #309 from Y2Z/dependabot/cargo/regex-1.5.5
Bump regex from 1.5.4 to 1.5.5
2022-08-06 16:56:18 -10:00
dependabot[bot] c5c5f1ca44
Bump regex from 1.5.4 to 1.5.5
Bumps [regex](https://github.com/rust-lang/regex) from 1.5.4 to 1.5.5.
- [Release notes](https://github.com/rust-lang/regex/releases)
- [Changelog](https://github.com/rust-lang/regex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rust-lang/regex/compare/1.5.4...1.5.5)

---
updated-dependencies:
- dependency-name: regex
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-06 21:07:06 +00:00
dependabot[bot] de6a13a884
Bump tokio from 1.12.0 to 1.16.1
Bumps [tokio](https://github.com/tokio-rs/tokio) from 1.12.0 to 1.16.1.
- [Release notes](https://github.com/tokio-rs/tokio/releases)
- [Commits](https://github.com/tokio-rs/tokio/compare/tokio-1.12.0...tokio-1.16.1)

---
updated-dependencies:
- dependency-name: tokio
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2022-06-06 19:44:19 +00:00
Sunshine ef16355f9f
Merge pull request #303 from timoteostewart/master
fix typo 'non-standart' to 'non-standard'
2022-03-17 04:21:16 -04:00
Tim Stewart a4dc0ed9b4
fix typo 'non-standart' to 'non-standard' 2022-03-16 17:54:48 -05:00
Sunshine cd0e366979
Merge pull request #301 from liamwarfield/patch-1
Updated monk project link
2022-02-22 15:22:33 -10:00
Liam Warfield d4c6c458f9
Updated monk project link
The monk project has recently moved to Github! Just changing the link here to the new repo.
2022-02-22 14:17:40 -07:00
Sunshine c9970b3a8e
Merge pull request #292 from snshn/include-unsafe-eval-origin-for-isolated-documents
Include unsafe-eval origin for isolated documents
2021-12-05 20:26:44 -10:00
Sunshine 404d322b99
make tests pass for newly added 'unsafe-eval' origin addition 2021-12-05 20:16:37 -10:00
Sunshine 1b353d0b46
include unsafe-eval origin for isolated documents 2021-12-05 20:09:26 -10:00
Sunshine f920a5e4d6
Merge pull request #290 from matildepark/patch-1
README: remove duplicate macports instructions
2021-11-10 20:33:35 -10:00
matildepark d3ca1ecad3
README: remove duplicate macports instructions 2021-11-10 23:10:31 -05:00
Sunshine 9e057472c6
Update README.md 2021-10-20 16:21:55 -10:00
Sunshine d453145bf8
Merge pull request #288 from snshn/update-markdown-files
Update Markdown files
2021-10-20 15:54:07 -10:00
Sunshine 8c131d649f
update Markdown files 2021-10-20 15:46:08 -10:00
Sunshine a221fdb368
Merge pull request #287 from snshn/ci-ignore-some-files
Update README files and set CI to ignore irrelevant paths
2021-10-20 15:40:43 -10:00
Sunshine 15dd82e300
update README files, set CI to ignore irrelevant paths 2021-10-20 15:31:54 -10:00
Sunshine de492caaa5
Merge pull request #286 from snshn/move-test-data
Move test data files under _data_
2021-10-17 22:51:22 -10:00
Sunshine 9096447c70
move test data files under _data_ 2021-10-17 22:46:06 -10:00
Sunshine 354340db86
Merge pull request #285 from snshn/use-percent-encoding-crate
Offload percent decoding to percent-encoding crate
2021-10-17 22:32:10 -10:00
Sunshine 900dd8d163
offload percent decoding to percent-encoding crate 2021-10-17 22:26:11 -10:00
Sunshine a11c4496b0
Merge pull request #284 from snshn/move-tests-to-upper-level
Get rid of macros, move tests out of src
2021-10-16 21:39:53 -10:00
Sunshine dd33b16876
Merge pull request #283 from snshn/formatting
Format README.md and annotate workflows
2021-10-16 21:16:53 -10:00
Sunshine 2cc1870033
get rid of macros, move tests out of src 2021-10-16 21:16:37 -10:00
Sunshine d41e6c041b
format README.md and annotate workflows 2021-10-16 18:48:32 -10:00
Sunshine 460a461373
Update README.md 2021-07-14 00:09:41 -10:00
Sunshine 1e6e87b6aa
Merge pull request #277 from Oliver-Hanikel/master
Reduce size of Docker image
2021-07-11 11:45:18 -10:00
Oliver Hanikel 54094270b3 Update run-in-container.sh 2021-07-11 20:07:48 +02:00
Oliver Hanikel e6cf367e23 reduce size of docker image 2021-07-11 20:00:39 +02:00
Sunshine e8437ecb28
Update README.md 2021-07-10 16:41:30 -10:00
Sunshine 543bebbd8d
Merge pull request #275 from snshn/improve-readme-code-snippets
Remove dollar signs from code snippets
2021-07-10 16:40:20 -10:00
Sunshine dc6c0200bc
remove dollar sign from code snippets 2021-07-10 16:32:56 -10:00
Sunshine 04bdb3072f
Update README.md 2021-07-08 13:14:37 -10:00
Sunshine a9228f0522
Merge pull request #274 from snshn/arm64-cd-job
Downgrade AArch64 CD job from Ubuntu 20.04 to Ubuntu 18.04
2021-07-06 15:29:55 -10:00
Sunshine aae68c4c82
downgrade AArch64 CD job from Ubuntu 20.04 to Ubuntu 18.04 2021-07-06 14:41:56 -10:00
Sunshine dd23826205
Merge pull request #273 from herbygillot/patch-1
README: add MacPorts install instructions
2021-07-04 21:16:18 -10:00
Herby Gillot 781f4cd3b5
README: add MacPorts install instructions 2021-07-05 03:07:55 -04:00
Sunshine 6826b59ab9
Merge pull request #272 from snshn/new-release
New release (2.6.1)
2021-07-03 19:39:32 -10:00
Sunshine 2be725eeb5
bump version number (2.6.0 -> 2.6.1) 2021-07-03 19:33:09 -10:00
Sunshine dd2e9ca2e5
update crates 2021-07-03 19:31:55 -10:00
Sunshine 50bccae476
Merge pull request #267 from snshn/aarch64-binary
Add GNU/Linux AArch64 CD job
2021-07-03 00:15:04 -10:00
Sunshine b3bcb1d85b
add GNU/Linux AArch64 CD job 2021-07-03 00:10:14 -10:00
Sunshine c58d044459
Merge pull request #271 from snshn/fix-charset-detection-mechanism
Fix charset detection logic
2021-07-02 21:47:56 -10:00
Sunshine eeaea0df16
fix use of wrong charset 2021-07-02 21:35:06 -10:00
Sunshine 2539aac4c0
Merge pull request #265 from snshn/version-bump
Bump version (2.5.0 -> 2.6.0)
2021-06-08 13:16:40 -10:00
Sunshine 03b9af543a
bump version (2.5.0 -> 2.6.0) 2021-06-08 13:09:50 -10:00
Sunshine 1bb8141021
Merge pull request #264 from snshn/fixes
Fixes
2021-06-08 13:04:57 -10:00
Sunshine 4bc8043f0f
account for charset when creating data URLs 2021-06-08 12:54:16 -10:00
Sunshine 5effa38392
use proper charset detection for linked assets 2021-06-08 12:25:19 -10:00
Sunshine 125aeeec3b
improve validation of charset found in HTML, use genuinely infinite timeout 2021-06-08 11:50:46 -10:00
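[Editorial note] A minimal sketch of charset-label validation using the encoding_rs crate (listed in the Cargo.toml diff further down); the helper is hypothetical, not monolith's actual code.

```rust
use encoding_rs::Encoding;

// Validate a charset label found in an HTML meta tag; fall back to UTF-8
// when the label is unknown.
fn resolve_charset(label: &str) -> &'static Encoding {
    Encoding::for_label(label.trim().as_bytes()).unwrap_or(encoding_rs::UTF_8)
}

fn main() {
    assert_eq!(resolve_charset("windows-1251").name(), "windows-1251");
    assert_eq!(resolve_charset("bogus-charset").name(), "UTF-8");
}
```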
Sunshine c938ba6a2f
modify proper attribute for (i)frame elements 2021-06-08 04:49:14 -10:00
Sunshine f354affc36
Merge pull request #263 from snshn/save-with-custom-charset
Add option for saving document using custom encoding
2021-06-08 04:15:49 -10:00
Sunshine 7686b2ea64
avoid excessive parsing of HTML into DOM 2021-06-08 03:57:28 -10:00
Sunshine b29b9a6a7c
add option for saving document using custom encoding 2021-06-08 03:39:27 -10:00
Sunshine cbda57cfa8
Merge pull request #262 from snshn/support-more-encodings
Add support for wider range of charsets
2021-06-08 02:39:24 -10:00
Sunshine b8aa545e8c
add support for wider range of charsets 2021-06-08 02:30:15 -10:00
Sunshine 22a031af5d
Merge pull request #256 from snshn/more-tests-fixes-and-improvements
More tests, fixes, improvements
2021-06-02 04:06:37 -10:00
Sunshine 6e6a60b305
Merge branch 'master' into more-tests-fixes-and-improvements 2021-06-02 04:01:41 -10:00
Sunshine 77d6022d84
bump version (2.4.1 -> 2.5.0) 2021-06-02 04:00:18 -10:00
Sunshine 5db19d1a3e
update dependencies 2021-06-02 03:58:28 -10:00
Sunshine a6e891b3c5
add more tests 2021-06-02 03:41:41 -10:00
Sunshine d7a82a008b
Merge pull request #260 from snshn/ie-css-hack-fix
Remove optional trailing space from CSS idents
2021-05-28 23:04:34 -10:00
Sunshine 2369a4dd3c
remove optional trailing space from CSS idents 2021-05-28 12:03:19 -10:00
Sunshine d27e53fb36
Merge pull request #259 from snshn/related-project-monk
Add Monk to related projects in README.md
2021-05-24 10:54:11 -10:00
Sunshine 2cb51477d2
add Monk to related projects in README.md 2021-05-24 01:47:19 -10:00
Sunshine a308a20411
simplify code of CLI tests 2021-03-15 20:10:50 -10:00
Sunshine a6ddf1c13a
simplify code responsible for processing CSS 2021-03-14 19:42:57 -10:00
Sunshine 8256d17efd
Merge pull request #253 from snshn/unwrap-noscript
Make possible to unwrap NOSCRIPT nodes
2021-03-11 22:43:28 -10:00
Sunshine efa12935ba
Merge pull request #254 from snshn/no-containers-md
Get rid of containers.md (now part of README.md)
2021-03-11 22:39:49 -10:00
Sunshine 7126a98023
Merge pull request #255 from snshn/pkgsrc
Add installation instructions using pkgsrc
2021-03-11 22:38:33 -10:00
Sunshine c7ee3ec6e2
get rid of containers.md (now part of README.md) 2021-03-11 22:27:44 -10:00
Sunshine c4218031e2
add installation instructions using pkgsrc 2021-03-11 22:26:32 -10:00
Sunshine 6f918f6c1c
make possible to unwrap NOSCRIPT nodes 2021-03-11 18:18:39 -10:00
Sunshine 6ecda080e8
Merge pull request #252 from snshn/revamp
Revamp codebase
2021-03-11 14:25:10 -10:00
Sunshine 2e86ee67a5
revamp codebase 2021-03-11 14:15:18 -10:00
Sunshine 359616b901
Update README.md 2021-03-09 16:04:32 -10:00
Sunshine ea2cdab330
Update README.md 2021-03-09 15:52:23 -10:00
Sunshine 4434823c46
Update README.md 2021-03-09 14:49:10 -10:00
Sunshine e0a78ffc9d
Update README.md 2021-03-09 13:31:15 -10:00
Sunshine cbbb297473
Merge pull request #251 from snshn/bump-version-again
Bump version number to 2.4.1
2021-03-09 02:17:17 -10:00
Sunshine 98ddb821a5
bump version number 2021-03-09 02:07:07 -10:00
Sunshine be097b1d4e
Merge pull request #250 from snshn/alternate-stylesheets
Embed alternate stylesheets
2021-03-09 01:58:08 -10:00
Sunshine 325688acf5
add test for alternate stylesheets 2021-03-09 01:48:41 -10:00
Sunshine 11207d49d2
embed alternate stylesheets 2021-03-09 01:46:15 -10:00
Sunshine 96da64e193
Merge pull request #247 from snshn/cc0
Change project license to CC0 1.0 Universal (CC0 1.0)
2021-03-01 13:28:49 -10:00
Sunshine 8a62a51210
Merge pull request #248 from snshn/update-container-instructions
Running in container instructions update
2021-02-28 23:24:10 -10:00
Sunshine a6ac1df93d
running in container instructions update 2021-02-28 21:46:38 -10:00
Sunshine 49e81149df
switch license to CC0-1.0 2021-02-28 19:54:46 -10:00
Sunshine a3516b2ae9
Merge pull request #245 from snshn/change-meta-charset-to-utf-8
Forcefully set document's charset to UTF-8
2021-02-23 23:48:49 -10:00
Sunshine 385301bf16
clean up unused code 2021-02-23 23:39:51 -10:00
Sunshine 4921a70dda
Merge branch 'master' into change-meta-charset-to-utf-8 2021-02-23 23:38:03 -10:00
Sunshine e0273c664a
forcefully set document's charset to UTF-8 2021-02-23 23:35:35 -10:00
Sunshine 6d629bfd4a
Merge pull request #244 from snshn/process-noscript
Process contents of NOSCRIPT tags
2021-02-22 20:13:26 -10:00
Sunshine ae9d78a891
process contents of NOSCRIPT tags 2021-02-22 19:42:39 -10:00
Sunshine 0f55fb3c49
Merge pull request #243 from snshn/fix-embedding-picture-srcset
Fix embedding of srcset assets for PICTURE nodes
2021-02-22 16:27:22 -10:00
Sunshine e41fd6a1c6
fix embedding of srcset for PICTURE nodes 2021-02-22 16:21:12 -10:00
Sunshine eaf662bb3b
Update README.md 2021-02-15 15:38:06 -10:00
Sunshine fa71f6a42c
Merge pull request #240 from snshn/color
Add color to asset download log
2021-01-30 10:48:35 -10:00
Sunshine 9a27c6c5ee
add color to asset download log 2021-01-29 20:24:35 -10:00
Sunshine 4ad07c0519
Merge pull request #239 from snshn/update-crates
Update dependencies
2021-01-29 17:27:43 -10:00
Sunshine e78405f2ae
update dependencies 2021-01-29 17:19:38 -10:00
Sunshine e81462be41
Merge pull request #237 from snshn/choco
Add Chocolatey spec file
2020-12-31 15:32:27 -10:00
Sunshine b972d717ce
add chocolatey spec 2020-12-31 15:30:41 -10:00
Sunshine edb679d2b3
Merge pull request #236 from snshn/pipe-in-target-test
Add test for stdin pipe
2020-12-31 14:44:57 -10:00
Sunshine 2e1462a953
add test for stdin pipe 2020-12-31 14:38:31 -10:00
Sunshine 57883b84b2
Merge pull request #235 from snshn/allow-empty-user-agent-string
Make it possible to specify an empty user-agent string
2020-12-31 13:02:35 -10:00
Sunshine 4fa2eda983
make it possible to specify an empty user-agent string 2020-12-31 12:57:22 -10:00
Sunshine 028187a31e
Merge pull request #234 from snshn/update-dependencies
Update crates
2020-12-28 12:11:25 -10:00
Sunshine c469c30cbd
update crates 2020-12-28 12:04:27 -10:00
Sunshine 6de36243f9
Fix armhf build in cd.yml 2020-12-27 05:52:47 -10:00
Sunshine 4f162d0cc0
Update README.md 2020-12-25 22:59:24 -10:00
Sunshine 95040173fc
Merge pull request #233 from snshn/cargo-fix-license-identifier
Fix license identifier in Cargo.toml
2020-12-25 22:42:31 -10:00
Sunshine b10d41f82e
fix license identifier in Cargo.toml 2020-12-25 22:41:31 -10:00
Sunshine 4c2c55d166
Merge pull request #232 from snshn/fix-cargo-toml
Reduce amount of keywords to 5 (max)
2020-12-25 22:34:08 -10:00
Sunshine 2dd1c465e4
reduce amount of keywords to 5 (max) 2020-12-25 22:28:19 -10:00
Sunshine a5afda9c80
Merge pull request #229 from snshn/cargo-install
Update Cargo.toml for publishing on crates.io
2020-12-25 22:15:13 -10:00
Sunshine ab6fed6d1f
Merge pull request #231 from snshn/fix-srcset
Fix srcset parsing
2020-12-25 22:07:45 -10:00
Sunshine f8dcb335e7
bump version, make possible to install via cargo 2020-12-25 22:07:19 -10:00
Sunshine 913051870a
Merge pull request #230 from snshn/stdin
Make possible to use stdin as input method
2020-12-25 21:58:14 -10:00
Sunshine 614a518475
fix srcset parsing 2020-12-25 21:56:40 -10:00
Sunshine 870a4b150e
make possible to use stdin as input method 2020-12-25 21:23:29 -10:00
Sunshine 0533b287b7
Merge pull request #228 from snshn/audio-video-support
Add support for embedding video and audio files
2020-12-25 16:55:42 -10:00
Sunshine 4ba4285b6b
add support for embedding video and audio files 2020-12-25 16:49:43 -10:00
Sunshine 2b9caf9840
Merge pull request #227 from snshn/fix-trailing-comma-for-srcset-parsing
Fix crash associated with trailing/repeating commas within srcset
2020-12-25 14:33:20 -10:00
Sunshine 8adf059980
fix crash associated with trailing/repeating commas within srcset 2020-12-25 14:24:52 -10:00
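[Editorial note] A minimal sketch of the trailing/repeating-comma case; this naive comma split only illustrates skipping the empty entries, it is not monolith's real srcset parser.

```rust
// Split a srcset attribute into candidate entries, skipping the empty
// pieces produced by trailing or repeated commas (e.g. "a.png 1x,,b.png 2x,").
fn srcset_candidates(srcset: &str) -> Vec<&str> {
    srcset
        .split(',')
        .map(str::trim)
        .filter(|piece| !piece.is_empty())
        .collect()
}

fn main() {
    let candidates = srcset_candidates("small.png 1x,, large.png 2x, ");
    assert_eq!(candidates, vec!["small.png 1x", "large.png 2x"]);
}
```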
Sunshine 8ad252868e
Merge pull request #226 from snshn/base-tag-option
Add base URL option
2020-12-25 13:02:05 -10:00
Sunshine e145df372f
Merge branch 'master' into base-tag-option 2020-12-25 12:10:54 -10:00
Sunshine 816b6175ac
rewrite ADR #8 (Base Tag) 2020-12-25 12:06:56 -10:00
Sunshine d89b4d5f5b
refactor code that processes the DOM 2020-12-25 11:09:47 -10:00
Sunshine 15d98a7269
don't modify base url by default, add option for setting it 2020-12-24 18:38:44 -10:00
Sunshine 36e82cb511
Update README.md 2020-12-13 08:42:41 -10:00
Sunshine 1b1befd7b0
Merge pull request #222 from snshn/readme-no-metadata-info
Add description of -M option to README.md
2020-12-09 21:29:33 -05:00
Sunshine a2f59b4418
Update README.md 2020-12-09 16:23:17 -10:00
Sunshine 124a62920f
Merge pull request #221 from snshn/related-project-hako
Add Hako
2020-12-09 07:54:52 -05:00
Sunshine f557504bed
Update README.md
Add Hako to the list of related projects
2020-12-08 17:02:09 -10:00
Sunshine 5ac520b4da
Merge pull request #219 from snshn/ignore-network-errors-option
Account for network errors
2020-11-22 17:54:30 -10:00
Sunshine 7a97291498
add ADR 7 (Network errors) 2020-11-22 17:20:37 -10:00
Sunshine 38a6f963ad
account for network errors, add option to ignore them 2020-11-22 16:49:26 -10:00
Sunshine 052f8f49ec
Merge pull request #218 from snshn/update-crates
Update crates
2020-11-20 00:36:33 -10:00
Sunshine 08de486382
use newer dependencies 2020-11-20 00:30:05 -10:00
Sunshine c0e0a69773
Merge pull request #214 from zfhrp6/remove_use_unused_opts
remove unused import opts::Options;
2020-11-04 01:04:38 -08:00
Sunshine 1636540693
Merge pull request #216 from zfhrp6/suppress_deprecation_of_dependency
update clap 2.33.1 -> 2.33.3
2020-11-02 19:03:43 -08:00
zfhrp 3e80cb02ce update clap 2.33.1 -> 2.33.3 2020-11-02 00:11:41 +09:00
zfhrp a296531b3f remove unused import opts::Options; 2020-11-01 23:04:08 +09:00
Sunshine 8462b6bc31
Merge pull request #207 from snshn/bump-version
bump version (2.3.0 -> 2.3.1)
2020-08-01 21:00:26 -04:00
Sunshine 92f38556b6
bump version (2.3.0 -> 2.3.1) 2020-08-01 20:24:38 -04:00
Sunshine c0bdeab2e3
Merge pull request #206 from snshn/update-crates
Update crates
2020-08-01 19:43:00 -04:00
Sunshine 5a502eab4b
update crate versions 2020-08-01 19:20:20 -04:00
Sunshine 19f08265a2
Merge pull request #205 from snshn/base-tag
Implement support for BASE tag
2020-08-01 02:47:33 -04:00
Sunshine 1d6392cb28
implement support for BASE tag 2020-08-01 02:35:07 -04:00
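[Editorial note] A minimal sketch of BASE-style URL resolution using the url crate (a dependency shown in the Cargo.toml diff further down); illustrative only, not the actual implementation.

```rust
use url::Url;

fn main() -> Result<(), url::ParseError> {
    // If the document contains <base href="https://example.com/articles/">,
    // relative asset URLs should be resolved against it.
    let base = Url::parse("https://example.com/articles/")?;
    let asset = base.join("images/photo.png")?;
    assert_eq!(asset.as_str(), "https://example.com/articles/images/photo.png");
    Ok(())
}
```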
Sunshine 03cdc0e0b2
Merge pull request #201 from snshn/refactor-and-version-bump
Refactor and version bump
2020-07-14 03:51:31 -04:00
Sunshine b98b7af0b4
Merge pull request #202 from snshn/minus-stdout
Treat - for stdout
2020-07-14 03:51:18 -04:00
Sunshine 73c35eaccb
treat minus for output target file path as stdout 2020-07-14 03:35:59 -04:00
Sunshine 2c5d1e930b
bump version (2.2.7 -> 2.3.0) 2020-07-14 03:29:08 -04:00
Sunshine 90f7c3a0d0
alphabetical order for function names 2020-07-14 03:27:52 -04:00
Sunshine c1fec5967d
Merge pull request #200 from snshn/favicon
Automatically obtain favicon.ico
2020-07-14 03:24:10 -04:00
Sunshine 09d41d2cf1
automatically obtain favicon.ico 2020-07-14 02:58:29 -04:00
Sunshine 8f1da3c792
Update cd.yml 2020-07-13 19:09:01 -04:00
Sunshine a8449a2b32
Update README.md 2020-07-13 01:16:38 -04:00
Sunshine 164e728ad3
Merge pull request #197 from snshn/addetional-black-box-test-data
Additional black box test data
2020-07-06 16:51:49 -04:00
Sunshine 8883bd6aca
add more black box test data 2020-07-06 16:15:57 -04:00
Sunshine eae5d4dc6b
Merge pull request #196 from snshn/help-message-update
Update help message
2020-07-01 06:41:32 -04:00
Sunshine ec85121d28
update help message 2020-07-01 06:29:56 -04:00
Sunshine a8a85a4191
Merge pull request #195 from snshn/logo
Logo
2020-07-01 06:24:28 -04:00
Sunshine decd5b2119
add ASCII logo atop of help message 2020-07-01 06:13:58 -04:00
Sunshine bef6d848e9
add raster icon along with its Blender scene 2020-07-01 05:54:48 -04:00
Sunshine 4263e42cd1
Merge pull request #194 from snshn/indented-tree
Indented tree
2020-06-28 16:37:10 -04:00
Sunshine 23de5ced21
add tests for utils::indent() 2020-06-28 16:15:42 -04:00
Sunshine bc98aca2a2
indent items in retrieval log to form a tree-like structure 2020-06-28 16:11:15 -04:00
Sunshine 438ebd520a
Merge pull request #193 from snshn/options-struct
Pass options object instead of using separate parameters
2020-06-28 01:51:05 -04:00
Sunshine ddb97009e9
pass options object instead of using separate parameters 2020-06-28 01:36:41 -04:00
Sunshine 6e67545b92
Merge pull request #192 from snshn/more-test-data
Add more sample data for blackbox tests
2020-06-27 14:57:07 -04:00
Sunshine 9e5d8ec691
add more sample data for blackbox tests 2020-06-27 14:55:10 -04:00
Sunshine fb835fae28
Merge pull request #191 from snshn/trim-style
Trim CSS if it contains nothing but whitespaces
2020-06-26 23:41:41 -04:00
Sunshine 29bf042da0
trim CSS if it contains nothing but whitespaces 2020-06-26 23:26:55 -04:00
Sunshine d67483cf8e
Merge pull request #190 from snshn/refactor-csp
Refactor CSP code
2020-06-26 21:42:19 -04:00
Sunshine 4140d8ebad
Create references.md 2020-06-26 18:16:18 -04:00
Sunshine 2ac964fae5
include font-src into CSP 2020-06-26 18:14:46 -04:00
Sunshine ae5d6d2df4
refactor CSP code 2020-06-26 16:19:44 -04:00
Sunshine 2ed151d883
Update web-apps.md 2020-06-26 15:05:47 -04:00
Sunshine 3cdfdc45d3
Update snapcraft.yaml 2020-06-26 14:57:52 -04:00
Sunshine ac04af2cfc
Update ADR-0006 2020-06-26 14:44:54 -04:00
Sunshine 769953d7bd
Merge pull request #187 from snshn/arm-snapcraft
Add armhf target to snapcraft.yaml
2020-06-26 14:40:46 -04:00
Sunshine 136dcc31cf
Merge pull request #189 from snshn/remove-unwanted-meta-tags
Automatically remove "Refresh" and "Location" META tags
2020-06-26 01:29:41 -04:00
Sunshine 44cac65a83
automatically remove "Refresh" and "Location" META tags 2020-06-26 01:18:52 -04:00
Sunshine c3ca2ad1d5
Merge pull request #188 from snshn/metadata-tag-function
Move metadata tag code into a function
2020-06-25 18:31:50 -04:00
Sunshine 0347fd3985
move metadata tag code into a function 2020-06-25 18:23:56 -04:00
Sunshine 95d0083b3c
Update README.md 2020-06-25 17:38:39 -04:00
Sunshine 3ce26b5fdd
Merge pull request #186 from snshn/code-improvements
Code improvements
2020-06-24 03:31:11 -04:00
Sunshine 7f9458adfe
add armhf target to snapcraft.yaml 2020-06-24 03:20:36 -04:00
Sunshine 5c229c51da
move functions related to URL manipulation into url.rs 2020-06-24 03:16:40 -04:00
Sunshine f6ea16b3ad
create a separate function for appending URL fragments 2020-06-24 02:26:05 -04:00
Sunshine 877b11d52c
Merge pull request #185 from snshn/upd-crates
Update crates
2020-06-20 03:41:00 -04:00
Sunshine f9aac6f41b
update crates 2020-06-20 01:05:39 -04:00
Sunshine 0a30c286fe
add x86_64 GNU/Linux target to CD 2020-06-19 07:44:57 -04:00
Sunshine ea56b9b4c1
Update README.md 2020-06-19 03:51:19 -04:00
Sunshine e821591efe
Merge pull request #183 from snshn/update-readme-freebsd-instructions
Add FreeBSD installation instructions to README.md
2020-06-18 22:26:56 -04:00
Sunshine 4e5d2fdc8d
Merge pull request #184 from snshn/update-readme-ascii
Update README.md
2020-06-18 00:25:55 -04:00
Sunshine 7c2ed2c9ca
Update README.md 2020-06-18 00:19:16 -04:00
Sunshine 60d21ae071
Update README.md 2020-06-17 07:42:10 -04:00
Sunshine bfdcd459e1
Update web-apps.md 2020-06-04 02:32:01 -04:00
Sunshine 6c020dfa88
Create web-apps.md 2020-06-04 02:31:41 -04:00
Sunshine 9894213393
Merge pull request #182 from snshn/version-bump
Version bump
2020-06-01 05:48:08 -04:00
Sunshine 80523c5a59
version bump 2020-06-01 05:41:42 -04:00
Sunshine 65b5ff4ec0
Merge pull request #181 from snshn/only-remove-credentals-from-http-urls
Only attempt to remove credentials from HTTP(S) URLs
2020-06-01 05:36:01 -04:00
Sunshine 4e31d0433e
only attempt to remove credentials from HTTP(S) URLs 2020-06-01 05:28:02 -04:00
Sunshine ed82b96152
Merge pull request #179 from snshn/refine-adrs
Refine ADRs
2020-05-25 00:41:38 -04:00
Sunshine f16a2a9ed5
refine ADRs 2020-05-24 21:21:52 -04:00
Sunshine 38d7873d6e
Update 0004-asset-integrity-check.md 2020-05-24 21:10:35 -04:00
Sunshine d848179a43
Merge pull request #124 from snshn/adr-integrity
Propose ADR 0004: Asset integrity check
2020-05-24 06:24:26 -04:00
Sunshine 399f515eeb
Merge pull request #178 from snshn/tests-code-refactor
Group all tests into either passing or failing groups
2020-05-24 03:46:34 -04:00
Sunshine 46616f327b
Merge pull request #177 from snshn/update-readme
Update README.md
2020-05-24 03:46:21 -04:00
Sunshine 090d647390
group all tests into either passing or failing groups 2020-05-23 03:49:04 -04:00
Sunshine 4fa88b7aba
update README.md 2020-05-23 03:16:08 -04:00
Sunshine 3d678d80ee
Merge pull request #176 from snshn/img-srcset
IMG srcset
2020-05-17 14:26:30 -04:00
Sunshine 19a87f426e
version bump 2020-05-17 14:06:55 -04:00
Sunshine cbe3f9f554
implement support for embedding images within srcset 2020-05-17 14:06:44 -04:00
Sunshine b6a44c64cf
Merge pull request #174 from snshn/armhf-cd
Improve CD for compiling ARM binary asset
2020-05-12 03:31:37 -04:00
Sunshine 84e2dd789c
improve CD for compiling ARM binary asset 2020-05-12 03:29:32 -04:00
Sunshine ac4945ca97
Merge pull request #173 from snshn/sha2-integrity
Add asset integrity validation
2020-05-12 03:15:02 -04:00
Sunshine 2ca2c7aff8
version bump 2020-05-12 03:10:43 -04:00
Sunshine a18df74946
refactor code and implement integrity validation 2020-05-12 02:51:37 -04:00
Sunshine 2bc8414cc1
Merge pull request #172 from snshn/update-metadata-comment
improve metadata comments
2020-04-30 22:39:25 -04:00
Sunshine c4569343a4
improve metadata comments 2020-04-30 20:23:09 -04:00
Sunshine 5f5820c71a
Merge pull request #168 from snshn/context-comment
Metadata comment tag
2020-04-30 20:06:40 -04:00
Sunshine 4719a6fecf
Merge pull request #170 from snshn/svg-image-href
Embed SVG IMAGE assets
2020-04-30 20:00:59 -04:00
Sunshine c999359b9f
Merge branch 'context-comment' of github.com:Alch-Emi/monolith into context-comment 2020-04-30 19:54:13 -04:00
Sunshine f22e2b6e68
embed SVG IMAGE assets 2020-04-30 19:51:30 -04:00
Sunshine 31a9550f5b
Merge pull request #171 from snshn/improve-ci-cd
Add rustfmt installation step to CI
2020-04-30 19:51:04 -04:00
Sunshine 201f2d61b9
add rustfmt installation step to CI 2020-04-30 19:45:44 -04:00
Sunshine 3ae4dfae8e
Update README.md 2020-04-28 09:07:47 -04:00
Sunshine 7b095fe4ff
Merge pull request #167 from snshn/version-bump
version bump
2020-04-25 03:50:10 -04:00
Sunshine 890bcb1bb6
version bump 2020-04-25 01:03:49 -04:00
Sunshine aa97ea9f82
Merge pull request #165 from snshn/no-fonts
Add flag for excluding web fonts
2020-04-22 09:16:30 -04:00
Sunshine 9b40dbbf27
add option to exclude web fonts 2020-04-22 09:11:20 -04:00
Sunshine 289f3e801b
Merge pull request #161 from snshn/cache-blob
Store blobs instead of data URLs in cache
2020-04-19 13:33:03 -04:00
Sunshine edacd09dc8
store blobs instead of data URLs in cache 2020-04-19 13:26:14 -04:00
Sunshine 5682863725
Merge pull request #164 from snshn/raspberry-pi-artifact-update
Update GitHub Action for assembling ARM artifacts
2020-04-18 13:46:44 -04:00
Sunshine 4304d7a638
update GitHub Action for assembling ARM artifacts 2020-04-18 13:44:26 -04:00
Sunshine f56f88da94
Merge pull request #91 from snshn/unwrap-noscript-if-no-js
Propose ADR-0002 (NOSCRIPT nodes)
2020-04-16 23:24:30 -04:00
Sunshine 87c8b361ea
add ADR-0002 (NOSCRIPT nodes) 2020-04-16 23:24:03 -04:00
Sunshine cd505ddb6c
Merge pull request #163 from snshn/proper-css-ident-escaping
Escape all special chars within #id and .class CSS selectors
2020-04-11 18:33:41 -04:00
Sunshine eeea617fb1
escape all special chars within #id and .class CSS selectors 2020-04-11 17:50:23 -04:00
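[Editorial note] A minimal sketch of escaping CSS identifiers with the cssparser crate (listed in the Cargo.toml diff further down); the helper name is hypothetical and this is not monolith's exact code.

```rust
use cssparser::serialize_identifier;

// Escape a raw id or class name so it can be used safely in a CSS selector.
fn css_escape(ident: &str) -> String {
    let mut escaped = String::new();
    serialize_identifier(ident, &mut escaped).unwrap();
    escaped
}

fn main() {
    // A colon or a leading digit must be escaped inside #id / .class selectors.
    println!("#{}", css_escape("main:header"));
    println!(".{}", css_escape("1st-item"));
}
```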
Sunshine cc6dbddb49
Merge pull request #162 from snshn/colons-in-css-class-names
Escape colons within CSS idents
2020-04-10 21:20:37 -04:00
Sunshine 9d3df2cdc6
escape colons within CSS idents 2020-04-10 20:59:56 -04:00
Sunshine ab601c3830
Merge pull request #160 from snshn/more-css-image-url-detection-props
Treat url()'s found in @counter-style rules as images
2020-04-10 07:28:55 -04:00
Sunshine 3738be2b6d
treat url()'s found in @counter-style rules as images 2020-04-10 07:22:02 -04:00
Sunshine 53160f01c7
Merge pull request #159 from snshn/implement-data-url-media-type-detection
Improve data URL media type detection
2020-04-10 06:04:49 -04:00
Sunshine 594ad55bd8
improve data URL media type detection 2020-04-10 05:50:33 -04:00
Sunshine d2615f51dc
Merge pull request #158 from snshn/improve-data-url-support
Improve parsing of data URLs
2020-04-10 01:49:34 -04:00
Sunshine c097733ae7
improve parsing of data URLs 2020-04-09 20:27:07 -04:00
Sunshine 67d4b7dafc
Merge pull request #157 from snshn/2-2-3
Upgrade base64 crate & version bump (2.2.2 → 2.2.3)
2020-04-08 19:56:24 -04:00
Sunshine b1d6bbce0c
upgrade base64 crate & version bump (2.2.2 → 2.2.3) 2020-04-08 19:49:46 -04:00
Sunshine 20124f4891
Merge pull request #156 from snshn/raspberry-pi-artifact
Make the pipeline build and upload armhf executable with every new release
2020-04-08 19:40:41 -04:00
Sunshine 0dd540afaf
make the pipeline build and upload armhf executable with every new release 2020-04-08 19:29:17 -04:00
Sunshine df71083359
Merge pull request #155 from snshn/fix-css-unit-sign-bug
Fix css unit sign bug
2020-04-08 18:19:32 -04:00
Sunshine 349c7bb3ea
properly parse negative units in CSS 2020-04-08 18:07:39 -04:00
Sunshine 5a30c6b44b
Merge branch 'master' of github.com:snshn/monolith 2020-04-08 10:53:29 -04:00
Sunshine 929924accd
Merge pull request #153 from snshn/proper-quotation-marks
use proper quotation marks in the README
2020-04-05 16:25:40 -04:00
Sunshine 812b46960c
use proper quotation marks in the README 2020-04-05 16:24:18 -04:00
Sunshine 874080dbda
Merge pull request #152 from snshn/separate-ci-build-jobs
Separate OS build jobs
2020-04-05 15:34:21 -04:00
Sunshine 93dd9d4ed4
separate build job per OS 2020-04-05 15:32:25 -04:00
Sunshine 3f0ced0143
Merge pull request #151 from snshn/2-2-2
version bump (2.2.1 → 2.2.2)
2020-04-05 14:44:48 -04:00
Sunshine 8112ab6d04
version bump (2.2.1 → 2.2.2) 2020-04-05 14:38:40 -04:00
Sunshine e5fc05f5cd
Merge pull request #150 from snshn/cd-windows-executable
Make the pipeline upload windows build to every new release
2020-04-05 14:35:44 -04:00
Sunshine 1068ff659a
make the pipeline upload windows build to every new release 2020-04-05 14:29:06 -04:00
Sunshine d4d9bbe424
update cd.yml 2020-04-04 22:12:35 -04:00
Sunshine cf3a8c8ede
Merge pull request #149 from snshn/remove-travis-ci-and-appveyor
Remove TravisCI and AppVeyor from the project
2020-04-04 19:38:03 -04:00
Sunshine 920d992459
remove TravisCI and AppVeyor from the project 2020-04-04 19:26:58 -04:00
Sunshine c61b3ba858
Merge pull request #148 from snshn/github-actions-build
Improve GitHub Actions integration
2020-04-04 19:14:52 -04:00
Sunshine dc6e564ea2
integrate GitHub Actions CI further 2020-04-04 19:05:49 -04:00
Sunshine 24536b5e18
Merge pull request #147 from Y2Z/github-actions-ci
Implement CI using GitHub Actions
2020-04-04 17:51:28 -04:00
Sunshine 908fd59019
Update ci.yml 2020-04-04 17:08:19 -04:00
Sunshine a19aa37ea8
Merge pull request #145 from snshn/no-images-svg
Empty SVG nodes when excluding images
2020-04-04 15:55:26 -04:00
Sunshine c46bd5900b
Merge pull request #146 from snshn/image-map-area-href
Resolve hrefs of <area> image-map tags
2020-04-04 15:51:45 -04:00
Sunshine 5f98ed23b3
set autocrlf to false to let windows builds pass 2020-04-04 15:42:53 -04:00
Sunshine c6b135398a
Implement CI using GitHub Actions 2020-04-04 15:30:13 -04:00
Sunshine 791e44796e
resolve hrefs of <area> image-map tags 2020-04-04 14:55:45 -04:00
Sunshine b428dd8471
Merge pull request #144 from snshn/macros-unit-test
Implement unit tests for macros
2020-04-04 13:11:19 -04:00
Sunshine b88479446c
implement unit tests for macros 2020-04-04 08:21:41 -04:00
Sunshine 1d6217ef5a
empty SVG nodes if --no-images 2020-04-03 21:56:46 -04:00
Sunshine 746c7f05de
Merge pull request #143 from snshn/embed-input-images
Add support for image inputs
2020-04-03 04:12:06 -04:00
Sunshine 29836d979a
add support for image inputs 2020-04-03 03:30:52 -04:00
Sunshine 5ba6e33fa8
Merge pull request #142 from snshn/robatipoors-improvements
Revamp is_icon() and get_node_name()
2020-04-03 01:39:45 -04:00
Sunshine 643c4ce7ef
implement improvements suggested by @robatipoor 2020-04-03 00:00:08 -04:00
Sunshine c011f90b76
Merge pull request #141 from snshn/update-help-dialog
Update help dialog
2020-04-02 22:49:59 -04:00
Sunshine 875481b9a2
update help dialog 2020-04-02 03:04:21 -04:00
Sunshine 05275d864c
Merge pull request #140 from snshn/cssparser
Switch to token-based CSS parser
2020-04-02 02:28:58 -04:00
Sunshine 4951fea730
implement full CSS parsing 2020-04-02 01:09:32 -04:00
Sunshine b8315a7bd5
Merge pull request #138 from snshn/improved-media-type-detection
Improve SVG media type detection
2020-03-24 18:39:33 -04:00
Sunshine be25784297
improve SVG media type detection 2020-03-24 08:50:39 -04:00
Sunshine b0f1c39175
Merge pull request #137 from snshn/master
Bump version to 2.2.0
2020-03-24 08:23:56 -04:00
Sunshine f27d5fa23e
bump version number (2.1.2 → 2.2.0) 2020-03-22 23:30:31 -04:00
Sunshine 4f2944a600
Merge pull request #136 from snshn/restructure-tests
Restructure tests
2020-03-22 23:28:04 -04:00
Sunshine 479c42e1ce
improve test code structure 2020-03-22 22:08:41 -04:00
Sunshine 933379c798
ensure consistent naming across all tests 2020-03-22 19:03:33 -04:00
Sunshine 061386ccc2
Merge pull request #135 from snshn/local-file-support
Add support for working with local assets
2020-03-22 17:18:43 -04:00
Sunshine 59a8be493d
add support for working with local assets 2020-03-22 15:48:23 -04:00
Sunshine a653bbe7d4
Merge pull request #133 from Y2Z/docker-instructions
Move Docker instructions under docs/
2020-03-18 00:42:40 -04:00
Sunshine c7aab235d9
Merge pull request #134 from Y2Z/adr-asset-minimization
Add ADR describing asset minimization
2020-03-16 00:46:28 -04:00
Sunshine 60ef631315
add ADR describing asset minimization 2020-03-15 23:04:03 -04:00
Sunshine b800947151
move Docker instructions into docs/ 2020-03-14 12:51:05 -04:00
Sunshine 808ce3e722
Merge pull request #130 from snshn/body-background
Account for legacy BODY background="" attribute
2020-03-05 08:32:06 -05:00
Sunshine a92bba4ec5
Update README.md 2020-03-05 05:15:13 -05:00
Sunshine a445098409
Update README.md 2020-03-05 05:11:54 -05:00
Sunshine 224d4fc480
Merge pull request #129 from snshn/dockerfile
add Dockerfile
2020-03-05 05:08:13 -05:00
Sunshine d5ee8ae6ab
account for legacy BODY background="" attribute 2020-03-05 04:56:09 -05:00
Sunshine c16e80f507
add Dockerfile 2020-03-05 04:14:37 -05:00
Sunshine 1c1f2c7128
Merge pull request #127 from snshn/win-travis
add windows target OS to TravisCI
2020-02-27 18:38:50 -05:00
Sunshine efba6a048d
add windows target OS to TravisCI 2020-02-27 01:25:22 -05:00
Sunshine 1701425003
Merge pull request #125 from snshn/frames
Treat frames the same way as iframes
2020-02-24 21:35:29 -05:00
Sunshine 7654eec7e2
treat frames the same way as iframes 2020-02-24 20:18:13 -05:00
Sunshine 00942e0b1d
Merge pull request #119 from snshn/data-url-input
Data URL input
2020-02-23 23:33:25 -05:00
Sunshine 8fbae735fa
add ADR 0004: Asset integrity check 2020-02-23 23:15:32 -05:00
Sunshine 0d1e21e9ad
add black box tests 2020-02-23 22:48:14 -05:00
Sunshine 3d2d40e7cd
add support for data URL targets 2020-02-23 22:25:37 -05:00
Sunshine b8b6d8cff6
fix "succeeding" to "passing" in tests 2020-02-23 22:24:33 -05:00
Sunshine 928664dc88
correct is_valid_url to is_http_url 2020-02-23 22:24:33 -05:00
Sunshine 5c8d75539b
rename dataurl to data_url 2020-02-23 22:24:32 -05:00
Sunshine ee2055a2a3
Merge pull request #123 from snshn/adr-arch-dir
Move ADRs under docs/arch
2020-02-21 19:16:40 -05:00
Sunshine b4c46c59d4
move ADRs to docs/arch 2020-02-21 07:58:23 -05:00
Sunshine 8574b7899b
Merge pull request #121 from snshn/improve-help
Update help dialog and README.md
2020-02-20 08:07:01 -05:00
Sunshine 969bfbdd59
Merge pull request #120 from snshn/update-crates
Update crates
2020-02-15 12:41:29 -05:00
Sunshine 63f3a204a6
Merge pull request #122 from snshn/adr-timeout
Introduce ADR 0003-network-request-timeout.md
2020-02-15 12:40:02 -05:00
Sunshine 094be09e90
add ADR 0003-network-request-timeout.md 2020-02-15 09:09:12 -05:00
Sunshine 23ceaed493
update crates 2020-02-15 01:47:08 -05:00
Sunshine d9602e25eb
update help dialog and README.md 2020-02-15 01:33:20 -05:00
Sunshine 0c50aa223b
Update README.md 2020-02-13 23:47:30 -05:00
Sunshine e5425ee9d0
Update README.md 2020-02-12 08:38:08 -05:00
Sunshine f720fe0176
Merge pull request #114 from snshn/custom-network-timeout-option
Add option for custom network request timeout
2020-02-10 21:13:17 -05:00
Sunshine 727a5a410c
add option for custom network request timeout 2020-02-10 20:08:06 -05:00
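[Editorial note] A minimal sketch of wiring a user-supplied timeout into reqwest's blocking client; the hard-coded value stands in for the real CLI option and is illustrative only. With the blocking ClientBuilder, passing None instead of a Duration disables the timeout entirely.

```rust
use std::time::Duration;
use reqwest::blocking::Client;

fn main() -> Result<(), reqwest::Error> {
    // Suppose the CLI parsed a user-supplied timeout in seconds (hypothetical value).
    let timeout_seconds: u64 = 30;

    let client = Client::builder()
        .timeout(Duration::from_secs(timeout_seconds))
        .build()?;

    let response = client.get("https://example.com/").send()?;
    println!("status: {}", response.status());
    Ok(())
}
```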
Sunshine 23af174822
Merge pull request #115 from snshn/remove-javascript-anchors
Nullify JS within As' href attributes when needed
2020-02-05 22:57:48 -05:00
Sunshine 5ef2b7c9dc
nullify JS within As' href attributes when needed 2020-02-03 01:47:35 -05:00
Sunshine 1e8348543a
Merge pull request #111 from snshn/adr
Introduce ADRs
2020-01-22 23:57:25 -05:00
Sunshine f9bafe092d
Introduce ADRs 2020-01-22 01:03:31 -05:00
Sunshine f876e9243c
Merge pull request #109 from snshn/version-bump
version bump (2.1.1 → 2.1.2)
2020-01-21 08:39:10 -05:00
Sunshine b6896febf1
version bump (2.1.1 → 2.1.2) 2020-01-21 02:32:29 -05:00
Sunshine 29d2ba5857
Merge pull request #107 from snshn/update-readme
Update README.md
2020-01-21 02:18:10 -05:00
Sunshine 8b1ebc7871
Update README.md 2020-01-21 02:16:36 -05:00
Sunshine d753c83c76
Merge pull request #108 from rhysd/revert-manual-proxy-support
Revert #106 since reqwest supports system proxies by default
2020-01-21 02:15:29 -05:00
rhysd 47a825f5ed add proxies instruction in README.md 2020-01-21 13:02:45 +09:00
rhysd 0e12cecd85 Revert "Merge pull request #106 from rhysd/proxy-support"
This reverts commit d8def879b2, reversing
changes made to a9d114d04d.
2020-01-21 13:01:22 +09:00
Sunshine d8def879b2
Merge pull request #106 from rhysd/proxy-support
Support HTTP and HTTPS proxies
2020-01-20 18:36:00 -05:00
Linda_pp 0420854ed6
remove '$' from environment variable names in README.md 2020-01-20 23:11:14 +09:00
rhysd d47482fcd9 fix crash at setting empty values to HTTP proxies
with this patch, empty `https_proxy=` and `http_proxy=` values are handled correctly.
2020-01-20 17:17:24 +09:00
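[Editorial note] A minimal sketch of the empty-proxy fix described above, assuming reqwest's blocking client and Proxy API; illustrative only (and, as the revert merged in #108 notes, reqwest picks up system proxies by default, so manual wiring like this was ultimately dropped).

```rust
use reqwest::blocking::ClientBuilder;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut builder = ClientBuilder::new();

    // Treat an empty https_proxy= / http_proxy= value as "no proxy"
    // instead of trying to parse it as a URL (which would fail).
    if let Ok(value) = std::env::var("https_proxy") {
        if !value.is_empty() {
            builder = builder.proxy(reqwest::Proxy::https(value.as_str())?);
        }
    }
    if let Ok(value) = std::env::var("http_proxy") {
        if !value.is_empty() {
            builder = builder.proxy(reqwest::Proxy::http(value.as_str())?);
        }
    }

    let _client = builder.build()?;
    Ok(())
}
```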
rhysd b68624f2f3 support HTTP and HTTPS proxies (fix #103) 2020-01-20 17:02:43 +09:00
Sunshine a9d114d04d
Merge pull request #105 from rhysd/refactor-main
Refactoring for main.rs to address several issues
2020-01-20 01:10:29 -05:00
rhysd 4e4ebe9c98 refactor main to address several issues
Addressed issues:

- when the specified URL was invalid, it exited successfully while doing
  nothing, leaving users no way to tell why it did not work
- it exited successfully even if an invalid User-Agent value was specified
- it created the file twice when the `--output` option was specified, which may
  cause an issue when some file watcher (e.g. FsEvents on macOS) is watching

Improvements:
- handle errors with `Result::expect` consistently so the tool correctly exits
  with a non-zero status on error
- define `Output` enum for handling both stdout and file outputs
2020-01-15 16:52:20 +09:00
Sunshine 429217d8f7
Merge pull request #104 from rhysd/complete-dom-event-handlers
Use complete list of DOM event handlers for detecting JS attributes
2020-01-15 01:34:01 -05:00
rhysd 1779f4a374 better comments for JS_DOM_EVENT_ATTRS constant 2020-01-15 14:33:27 +09:00
rhysd 26e89ae6d3 use complete list of DOM event handlers 2020-01-15 13:58:09 +09:00
Sunshine b333d19d04
Update README.md 2020-01-14 03:42:04 -05:00
Sunshine c1dc798ded
Merge pull request #101 from rhysd/ignore-preload
Improve handling preload links and white spaces in attribute values
2020-01-13 17:51:25 -05:00
rhysd 69d99b69e8 remove . in line comment 2020-01-13 23:47:07 +09:00
Sunshine aae53d20f0
Merge pull request #102 from popey/update-snap-config
Update snapcraft configuration
2020-01-13 08:39:15 -05:00
Alan Pope 14cf2ce8a6
Update snapcraft configuration
This changes the build slightly. If snapcraft is triggered when there is a new tagged release on the project's GitHub releases page, and it is newer than the version in the Snap Store beta channel, we build that stable release. If, however, the latest stable release on GitHub is already the same as the one in the Snap Store beta channel, then we build the tip of master.

This gives a couple of advantages. 

  * One yaml can be used to build tip-of-git snaps, and stable releases alike
  * Closing the beta channel in the Snap Store will mean the next triggered build will re-build whatever the last stable release is. This is useful to force a rebuild of the stable version in case a dependency (not that there are many) has a security issue.

We also now set the version dynamically based on the git tags.
2020-01-13 11:14:08 +00:00
Emi Simpson 05985583f0
Switch timestamps from rfc822 local time to iso8601 UTC 2020-01-10 14:30:35 -05:00
Emi Simpson 651fa716b4
Clean user, pass, and fragment from URL before writing 2020-01-10 14:18:15 -05:00
rhysd 67b79e92f9 simplify &x.into_iter() to x.iter() 2020-01-10 14:45:02 +09:00
rhysd b51f41fe34 trim attribute values 2020-01-10 14:41:05 +09:00
rhysd 6f158dc6db compare value of 'rel' properties case-insensitively 2020-01-10 13:52:31 +09:00
rhysd 8d7052b39c ignore preload and prefetch sources
since all resources are embedded as data URL.
2020-01-09 18:18:21 +09:00
rhysd 660511b8a0 define link type of <link> element as enum and prefer match statement
since match statement checks exhaustiveness
2020-01-09 16:55:42 +09:00
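[Editorial note] A minimal sketch combining the two commits above, with hypothetical names (not the actual monolith code): classify the rel value of a LINK element with an enum, rely on an exhaustive match, and skip preload/prefetch hints because every resource ends up embedded as a data URL anyway.

```rust
// Hypothetical classification of <link rel="..."> values (illustrative only).
#[derive(Debug, PartialEq)]
enum LinkType {
    Stylesheet,
    Icon,
    Preload,
    Prefetch,
    Other,
}

fn parse_link_type(rel: &str) -> LinkType {
    // 'rel' values are compared case-insensitively.
    match rel.trim().to_lowercase().as_str() {
        "stylesheet" => LinkType::Stylesheet,
        "icon" | "shortcut icon" => LinkType::Icon,
        "preload" => LinkType::Preload,
        "prefetch" | "dns-prefetch" => LinkType::Prefetch,
        _ => LinkType::Other,
    }
}

fn should_embed(link_type: &LinkType) -> bool {
    // The match is exhaustive, so adding a new variant forces a decision here.
    match link_type {
        LinkType::Stylesheet | LinkType::Icon => true,
        // Preload/prefetch hints are pointless once everything is a data URL.
        LinkType::Preload | LinkType::Prefetch | LinkType::Other => false,
    }
}

fn main() {
    assert!(should_embed(&parse_link_type("Stylesheet")));
    assert!(!should_embed(&parse_link_type("preload")));
}
```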
Emi Simpson 9be3982dc6
Added --no-context flag to disable adding context comment 2020-01-08 19:00:53 -05:00
Emi Simpson 27c9fb4cd3
Added comment indicating the context under which the page was downloaded 2020-01-08 18:51:18 -05:00
Sunshine 929512f4f5
Merge pull request #97 from rhysd/reqwest-0.10.0
Upgrade reqwest to v0.10.0 for better binary size and build time
2020-01-08 01:43:55 -05:00
Sunshine a46d89cefc
Merge pull request #98 from rhysd/fix-ci
Fix nightly and beta CI
2020-01-07 18:14:30 -05:00
rhysd f93646e17a ignore beta channel again on AppVeyor
since rustc command crashes on combination of
channel=beta & target=i686-pc-windows-gnu
2020-01-07 17:31:36 +09:00
rhysd 9d14b6dfea rename appveyor.yml to .appveyor.yml
align to .travis.yml
2020-01-07 15:28:29 +09:00
rhysd 9783b96524 check beta channel on CI not to break this crate with next Rust version 2020-01-07 15:28:29 +09:00
rhysd 106efe58ce fix nightly and beta on CI failing
we always use stable rustfmt so checking with nightly/beta rustfmt is not
necessary.
2020-01-07 15:28:29 +09:00
rhysd 6e99ad13e7 upgrade reqwest to v0.10.0
This will improve build time and binary size as follows:

* Before

- **Compile targets**: 220
- **Build time**: `cargo build --release  1264.95s user 39.72s system 335% cpu 6:29.14 total`
- **Binary size**: 6578568 bytes

* After

- **Compile targets**: 170
- **Build time**: `cargo build --release  1130.64s user 32.15s system 359% cpu 5:23.69 total`
- **Binary size**: 6107088 bytes

* Differences

- **Compile targets**: 1.29x smaller
- **Build time**: 1.23x faster
- **Binary size**: 1.07x smaller
2020-01-07 14:22:32 +09:00
Sunshine 413dd66886
Merge pull request #96 from rhysd/refactorings
Refactorings
2020-01-05 18:46:31 -05:00
rhysd dc7ec6e7a8 remove more redundant type annotations 2020-01-04 16:33:11 +09:00
rhysd ed879231af fix test code was broken by refactoring 2020-01-04 08:07:19 +09:00
rhysd ddf4b8ac13 prefer &str to String for reducing allocations 2020-01-04 08:05:02 +09:00
rhysd 84c13f0605 prefer unwrap_or_default to unwrap_or 2020-01-04 07:58:29 +09:00
rhysd ce03e0e487 reduce allocation when checking DOM attributes and do not hard-code the number of elements of an array constant
`to_lower` allocates a new string, but the allocation is not necessary
here.
2020-01-04 07:52:47 +09:00
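[Editorial note] An illustration of the allocation point in the commit above (`to_lower` presumably refers to str::to_lowercase): comparing with eq_ignore_ascii_case avoids the temporary String.

```rust
fn main() {
    let attr_name = "HREF";

    // Allocates a new lowercased String just for the comparison.
    let _slow = attr_name.to_lowercase() == "href";

    // Compares in place, no allocation.
    let fast = attr_name.eq_ignore_ascii_case("href");

    assert!(fast);
}
```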
rhysd 63e19998d0 reduce clones and fix some code styles and redundant code 2020-01-04 07:49:26 +09:00
Sunshine e3321bbb07
Merge pull request #95 from rhysd/rust2018
Migrate to Rust2018 edition
2020-01-03 02:00:47 -05:00
rhysd 0a38cd0eae add rhysd to authors list 2020-01-03 15:43:25 +09:00
rhysd 75fb6961ed migrate to Rust 2018 2020-01-03 00:33:49 +09:00
108 changed files with 9312 additions and 2838 deletions

1
.adr-dir Normal file

@ -0,0 +1 @@
docs/arch

35
.github/workflows/build_gnu_linux.yml vendored Normal file

@ -0,0 +1,35 @@
name: GNU/Linux

on:
  push:
    branches: [ master ]
    paths-ignore:
      - 'assets/'
      - 'dist/'
      - 'docs/'
      - 'snap/'
      - '.adr-dir'
      - 'Dockerfile'
      - 'LICENSE'
      - 'Makefile'
      - 'monolith.nuspec'
      - 'README.md'

jobs:
  build:
    strategy:
      matrix:
        os:
          - ubuntu-latest
        rust:
          - stable
    runs-on: ${{ matrix.os }}
    steps:
      - run: git config --global core.autocrlf false
      - uses: actions/checkout@v2
      - name: Build
        run: cargo build --all --locked --verbose

35
.github/workflows/build_macos.yml vendored Normal file

@ -0,0 +1,35 @@
name: macOS

on:
  push:
    branches: [ master ]
    paths-ignore:
      - 'assets/'
      - 'dist/'
      - 'docs/'
      - 'snap/'
      - '.adr-dir'
      - 'Dockerfile'
      - 'LICENSE'
      - 'Makefile'
      - 'monolith.nuspec'
      - 'README.md'

jobs:
  build:
    strategy:
      matrix:
        os:
          - macos-latest
        rust:
          - stable
    runs-on: ${{ matrix.os }}
    steps:
      - run: git config --global core.autocrlf false
      - uses: actions/checkout@v2
      - name: Build
        run: cargo build --all --locked --verbose

35
.github/workflows/build_windows.yml vendored Normal file

@ -0,0 +1,35 @@
name: Windows

on:
  push:
    branches: [ master ]
    paths-ignore:
      - 'assets/'
      - 'dist/'
      - 'docs/'
      - 'snap/'
      - '.adr-dir'
      - 'Dockerfile'
      - 'LICENSE'
      - 'Makefile'
      - 'monolith.nuspec'
      - 'README.md'

jobs:
  build:
    strategy:
      matrix:
        os:
          - windows-latest
        rust:
          - stable
    runs-on: ${{ matrix.os }}
    steps:
      - run: git config --global core.autocrlf false
      - uses: actions/checkout@v2
      - name: Build
        run: cargo build --all --locked --verbose

108
.github/workflows/cd.yml vendored Normal file

@ -0,0 +1,108 @@
# CD GitHub Actions workflow for monolith
name: CD

on:
  release:
    types:
      - created

jobs:
  windows:
    runs-on: windows-2019
    steps:
      - run: git config --global core.autocrlf false
      - name: Checkout the repository
        uses: actions/checkout@v2
      - name: Build the executable
        run: cargo build --release
      - uses: Shopify/upload-to-release@1.0.0
        with:
          name: monolith.exe
          path: target\release\monolith.exe
          repo-token: ${{ secrets.GITHUB_TOKEN }}

  gnu_linux_armhf:
    runs-on: ubuntu-18.04
    steps:
      - name: Checkout the repository
        uses: actions/checkout@v2
      - name: Prepare cross-platform environment
        run: |
          sudo mkdir /cross-build
          sudo touch /etc/apt/sources.list.d/armhf.list
          echo "deb [arch=armhf] http://ports.ubuntu.com/ubuntu-ports/ bionic main" | sudo tee -a /etc/apt/sources.list.d/armhf.list
          sudo apt-get update
          sudo apt-get install -y gcc-arm-linux-gnueabihf libc6-armhf-cross libc6-dev-armhf-cross
          sudo apt-get download libssl1.1:armhf libssl-dev:armhf
          sudo dpkg -x libssl1.1*.deb /cross-build
          sudo dpkg -x libssl-dev*.deb /cross-build
          rustup target add arm-unknown-linux-gnueabihf
          echo "C_INCLUDE_PATH=/cross-build/usr/include" >> $GITHUB_ENV
          echo "OPENSSL_INCLUDE_DIR=/cross-build/usr/include/arm-linux-gnueabihf" >> $GITHUB_ENV
          echo "OPENSSL_LIB_DIR=/cross-build/usr/lib/arm-linux-gnueabihf" >> $GITHUB_ENV
          echo "PKG_CONFIG_ALLOW_CROSS=1" >> $GITHUB_ENV
          echo "RUSTFLAGS=-C linker=arm-linux-gnueabihf-gcc -L/usr/arm-linux-gnueabihf/lib -L/cross-build/usr/lib/arm-linux-gnueabihf -L/cross-build/lib/arm-linux-gnueabihf" >> $GITHUB_ENV
      - name: Build the executable
        run: cargo build --release --target=arm-unknown-linux-gnueabihf
      - name: Attach artifact to the release
        uses: Shopify/upload-to-release@1.0.0
        with:
          name: monolith-gnu-linux-armhf
          path: target/arm-unknown-linux-gnueabihf/release/monolith
          repo-token: ${{ secrets.GITHUB_TOKEN }}

  gnu_linux_aarch64:
    runs-on: ubuntu-18.04
    steps:
      - name: Checkout the repository
        uses: actions/checkout@v2
      - name: Prepare cross-platform environment
        run: |
          sudo mkdir /cross-build
          sudo touch /etc/apt/sources.list.d/arm64.list
          echo "deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ bionic main" | sudo tee -a /etc/apt/sources.list.d/arm64.list
          sudo apt-get update
          sudo apt-get install -y gcc-aarch64-linux-gnu libc6-arm64-cross libc6-dev-arm64-cross
          sudo apt-get download libssl1.1:arm64 libssl-dev:arm64
          sudo dpkg -x libssl1.1*.deb /cross-build
          sudo dpkg -x libssl-dev*.deb /cross-build
          rustup target add aarch64-unknown-linux-gnu
          echo "C_INCLUDE_PATH=/cross-build/usr/include" >> $GITHUB_ENV
          echo "OPENSSL_INCLUDE_DIR=/cross-build/usr/include/aarch64-linux-gnu" >> $GITHUB_ENV
          echo "OPENSSL_LIB_DIR=/cross-build/usr/lib/aarch64-linux-gnu" >> $GITHUB_ENV
          echo "PKG_CONFIG_ALLOW_CROSS=1" >> $GITHUB_ENV
          echo "RUSTFLAGS=-C linker=aarch64-linux-gnu-gcc -L/usr/aarch64-linux-gnu/lib -L/cross-build/usr/lib/aarch64-linux-gnu" >> $GITHUB_ENV
      - name: Build the executable
        run: cargo build --release --target=aarch64-unknown-linux-gnu
      - name: Attach artifact to the release
        uses: Shopify/upload-to-release@1.0.0
        with:
          name: monolith-gnu-linux-aarch64
          path: target/aarch64-unknown-linux-gnu/release/monolith
          repo-token: ${{ secrets.GITHUB_TOKEN }}

  gnu_linux_x86_64:
    runs-on: ubuntu-18.04
    steps:
      - name: Checkout the repository
        uses: actions/checkout@v2
      - name: Build the executable
        run: cargo build --release
      - uses: Shopify/upload-to-release@1.0.0
        with:
          name: monolith-gnu-linux-x86_64
          path: target/release/monolith
          repo-token: ${{ secrets.GITHUB_TOKEN }}

49
.github/workflows/ci.yml vendored Normal file

@ -0,0 +1,49 @@
# CI GitHub Actions workflow for monolith
name: CI

on:
  pull_request:
    branches: [ master ]
    paths-ignore:
      - 'assets/'
      - 'dist/'
      - 'docs/'
      - 'snap/'
      - '.adr-dir'
      - 'Dockerfile'
      - 'LICENSE'
      - 'Makefile'
      - 'monolith.nuspec'
      - 'README.md'

jobs:
  build_and_test:
    strategy:
      matrix:
        os:
          - ubuntu-latest
          - macos-latest
          - windows-latest
        rust:
          - stable
          - beta
          - nightly
    runs-on: ${{ matrix.os }}
    steps:
      - run: git config --global core.autocrlf false
      - uses: actions/checkout@v2
      - name: Build
        run: cargo build --all --locked --verbose
      - name: Run tests
        run: cargo test --all --locked --verbose
      - name: Check code formatting
        run: |
          rustup component add rustfmt
          cargo fmt --all -- --check

3
.gitignore vendored

@ -4,6 +4,3 @@
# These are backup files generated by rustfmt
**/*.rs.bk
# Exclude accidental HTML files
*.html

.travis.yml

@ -1,27 +0,0 @@
language: rust
cache: cargo
sudo: false
os:
  - linux
  - osx
rust:
  - stable
  - beta
  - nightly
before_script:
  - rustup component add rustfmt
script:
  - cargo build --all --locked --verbose
  - cargo test --all --locked --verbose
  - cargo fmt --all -- --check
jobs:
  allow_failures:
    - rust: beta
    - rust: nightly
  fast_finish: true

2125
Cargo.lock generated

File diff suppressed because it is too large

Cargo.toml

@ -1,19 +1,48 @@
[package]
name = "monolith"
version = "2.1.1"
version = "2.6.2"
authors = [
"Sunshine <sunshine@uberspace.net>",
"Mahdi Robatipoor <mahdi.robatipoor@gmail.com>",
"Emmanuel Delaborde <th3rac25@gmail.com>",
"Emi Simpson <emi@alchemi.dev>",
"rhysd <lin90162@yahoo.co.jp>",
]
edition = "2018"
description = "CLI tool for saving web pages as a single HTML file"
homepage = "https://github.com/Y2Z/monolith"
repository = "https://github.com/Y2Z/monolith"
readme = "README.md"
keywords = ["web", "http", "html", "download", "command-line"]
categories = ["command-line-utilities", "web-programming"]
include = [
"src/*.rs",
"Cargo.toml",
]
license = "CC0-1.0"
[dependencies]
base64 = "0.10.1"
clap = "2.33.0"
atty = "0.2.14" # Used for highlighting network errors
base64 = "0.13.0" # Used for integrity attributes
chrono = "0.4.20" # Used for formatting creation timestamp
clap = "3.2.16"
cssparser = "0.29.6"
encoding_rs = "0.8.31"
html5ever = "0.24.1"
lazy_static = "1.4.0"
regex = "1.3.1"
reqwest = "0.9.20"
url = "2.1.0"
percent-encoding = "2.1.0"
sha2 = "0.10.2" # Used for calculating checksums during integrity checks
url = "2.2.2"
# Used for parsing srcset and NOSCRIPT
[dependencies.regex]
version = "1.6.0"
default-features = false
features = ["std", "perf-dfa", "unicode-perl"]
[dependencies.reqwest]
version = "0.11.11"
default-features = false
features = ["default-tls", "blocking", "gzip", "brotli", "deflate"]
[dev-dependencies]
assert_cmd = "2.0.4"

22
Dockerfile Normal file

@ -0,0 +1,22 @@
FROM ekidd/rust-musl-builder as builder
RUN curl -L -o monolith.tar.gz $(curl -s https://api.github.com/repos/y2z/monolith/releases/latest \
| grep "tarball_url.*\"," \
| cut -d '"' -f 4)
RUN tar xfz monolith.tar.gz \
&& mv Y2Z-monolith-* monolith \
&& rm monolith.tar.gz
WORKDIR monolith/
RUN make install
FROM alpine
RUN apk update && \
apk add --no-cache openssl && \
rm -rf "/var/cache/apk/*"
COPY --from=builder /home/rust/.cargo/bin/monolith /usr/bin/monolith
WORKDIR /tmp
ENTRYPOINT ["/usr/bin/monolith"]

137
LICENSE

@ -1,24 +1,121 @@
This is free and unencumbered software released into the public domain.
Creative Commons Legal Code
Anyone is free to copy, modify, publish, use, compile, sell, or
distribute this software, either in source code form or as a compiled
binary, for any purpose, commercial or non-commercial, and by any
means.
CC0 1.0 Universal
In jurisdictions that recognize copyright laws, the author or authors
of this software dedicate any and all copyright interest in the
software to the public domain. We make this dedication for the benefit
of the public at large and to the detriment of our heirs and
successors. We intend this dedication to be an overt act of
relinquishment in perpetuity of all present and future rights to this
software under copyright law.
CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
HEREUNDER.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
Statement of Purpose
For more information, please refer to <http://unlicense.org>
The laws of most jurisdictions throughout the world automatically confer
exclusive Copyright and Related Rights (defined below) upon the creator
and subsequent owner(s) (each and all, an "owner") of an original work of
authorship and/or a database (each, a "Work").
Certain owners wish to permanently relinquish those rights to a Work for
the purpose of contributing to a commons of creative, cultural and
scientific works ("Commons") that the public can reliably and without fear
of later claims of infringement build upon, modify, incorporate in other
works, reuse and redistribute as freely as possible in any form whatsoever
and for any purposes, including without limitation commercial purposes.
These owners may contribute to the Commons to promote the ideal of a free
culture and the further production of creative, cultural and scientific
works, or to gain reputation or greater distribution for their Work in
part through the use and efforts of others.
For these and/or other purposes and motivations, and without any
expectation of additional consideration or compensation, the person
associating CC0 with a Work (the "Affirmer"), to the extent that he or she
is an owner of Copyright and Related Rights in the Work, voluntarily
elects to apply CC0 to the Work and publicly distribute the Work under its
terms, with knowledge of his or her Copyright and Related Rights in the
Work and the meaning and intended legal effect of CC0 on those rights.
1. Copyright and Related Rights. A Work made available under CC0 may be
protected by copyright and related or neighboring rights ("Copyright and
Related Rights"). Copyright and Related Rights include, but are not
limited to, the following:
i. the right to reproduce, adapt, distribute, perform, display,
communicate, and translate a Work;
ii. moral rights retained by the original author(s) and/or performer(s);
iii. publicity and privacy rights pertaining to a person's image or
likeness depicted in a Work;
iv. rights protecting against unfair competition in regards to a Work,
subject to the limitations in paragraph 4(a), below;
v. rights protecting the extraction, dissemination, use and reuse of data
in a Work;
vi. database rights (such as those arising under Directive 96/9/EC of the
European Parliament and of the Council of 11 March 1996 on the legal
protection of databases, and under any national implementation
thereof, including any amended or successor version of such
directive); and
vii. other similar, equivalent or corresponding rights throughout the
world based on applicable law or treaty, and any national
implementations thereof.
2. Waiver. To the greatest extent permitted by, but not in contravention
of, applicable law, Affirmer hereby overtly, fully, permanently,
irrevocably and unconditionally waives, abandons, and surrenders all of
Affirmer's Copyright and Related Rights and associated claims and causes
of action, whether now known or unknown (including existing as well as
future claims and causes of action), in the Work (i) in all territories
worldwide, (ii) for the maximum duration provided by applicable law or
treaty (including future time extensions), (iii) in any current or future
medium and for any number of copies, and (iv) for any purpose whatsoever,
including without limitation commercial, advertising or promotional
purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
member of the public at large and to the detriment of Affirmer's heirs and
successors, fully intending that such Waiver shall not be subject to
revocation, rescission, cancellation, termination, or any other legal or
equitable action to disrupt the quiet enjoyment of the Work by the public
as contemplated by Affirmer's express Statement of Purpose.
3. Public License Fallback. Should any part of the Waiver for any reason
be judged legally invalid or ineffective under applicable law, then the
Waiver shall be preserved to the maximum extent permitted taking into
account Affirmer's express Statement of Purpose. In addition, to the
extent the Waiver is so judged Affirmer hereby grants to each affected
person a royalty-free, non transferable, non sublicensable, non exclusive,
irrevocable and unconditional license to exercise Affirmer's Copyright and
Related Rights in the Work (i) in all territories worldwide, (ii) for the
maximum duration provided by applicable law or treaty (including future
time extensions), (iii) in any current or future medium and for any number
of copies, and (iv) for any purpose whatsoever, including without
limitation commercial, advertising or promotional purposes (the
"License"). The License shall be deemed effective as of the date CC0 was
applied by Affirmer to the Work. Should any part of the License for any
reason be judged legally invalid or ineffective under applicable law, such
partial invalidity or ineffectiveness shall not invalidate the remainder
of the License, and in such case Affirmer hereby affirms that he or she
will not (i) exercise any of his or her remaining Copyright and Related
Rights in the Work or (ii) assert any associated claims and causes of
action with respect to the Work, in either case contrary to Affirmer's
express Statement of Purpose.
4. Limitations and Disclaimers.
a. No trademark or patent rights held by Affirmer are waived, abandoned,
surrendered, licensed or otherwise affected by this document.
b. Affirmer offers the Work as-is and makes no representations or
warranties of any kind concerning the Work, express, implied,
statutory or otherwise, including without limitation warranties of
title, merchantability, fitness for a particular purpose, non
infringement, or the absence of latent or other defects, accuracy, or
the presence or absence of errors, whether or not discoverable, all to
the greatest extent permissible under applicable law.
c. Affirmer disclaims responsibility for clearing rights of other persons
that may apply to the Work or any use thereof, including without
limitation any person's Copyright and Related Rights in the Work.
Further, Affirmer disclaims responsibility for obtaining any necessary
consents, permissions or other rights required for any use of the
Work.
d. Affirmer understands and acknowledges that Creative Commons is not a
party to this document and has no duty or obligation with respect to
this CC0 or use of the Work.

View File

@ -1,16 +1,29 @@
.PHONY: all build install run test lint
# Makefile for monolith
all: test build
all: build
.PHONY: all
build:
@cargo build --locked
.PHONY: build
install:
@cargo install --force --locked --path .
test:
test: build
@cargo test --locked
@cargo fmt --all -- --check
.PHONY: test
lint:
@cargo fmt --all --
.PHONY: lint
install:
@cargo install --force --locked --path .
.PHONY: install
uninstall:
@cargo uninstall
.PHONY: uninstall
clean:
@cargo clean
.PHONY: clean

180
README.md
View File

@ -1,53 +1,175 @@
[![Travis CI Build Status](https://travis-ci.org/Y2Z/monolith.svg?branch=master)](https://travis-ci.org/Y2Z/monolith)
[![AppVeyor Build status](https://ci.appveyor.com/api/projects/status/ae7soyjih8jg2bv7/branch/master?svg=true)](https://ci.appveyor.com/project/snshn/monolith/branch/master)
[![monolith build status on GNU/Linux](https://github.com/Y2Z/monolith/workflows/GNU%2FLinux/badge.svg)](https://github.com/Y2Z/monolith/actions?query=workflow%3AGNU%2FLinux)
[![monolith build status on macOS](https://github.com/Y2Z/monolith/workflows/macOS/badge.svg)](https://github.com/Y2Z/monolith/actions?query=workflow%3AmacOS)
[![monolith build status on Windows](https://github.com/Y2Z/monolith/workflows/Windows/badge.svg)](https://github.com/Y2Z/monolith/actions?query=workflow%3AWindows)
```
___ ___________ __________ ___________________ ___
| \ / \ | | | | | |
| \_/ __ \_| __ | | ___ ___ |__| |
| | | | | | | | | | | |
| |__| _ |__| |____| | | | | __ |
| |\_/| | \ | | | | | | |
|___| |__________| \____________________| |___| |___| |___|
_____ ______________ __________ ___________________ ___
| \ / \ | | | | | |
| \_/ __ \_| __ | | ___ ___ |__| |
| | | | | | | | | | | |
| |\ /| |__| _ |__| |____| | | | | __ |
| | \___/ | | \ | | | | | | |
|___| |__________| \_____________________| |___| |___| |___|
```
A data hoarder's dream come true: bundle any web page into a single HTML file.
You can finally replace that gazillion of open tabs with a gazillion of .html files stored somewhere on your precious little drive.
A data hoarder's dream come true: bundle any web page into a single HTML file. You can finally replace that gazillion of open tabs with a gazillion of .html files stored somewhere on your precious little drive.
Unlike the conventional "Save page as", `monolith` not only saves the target document, it embeds CSS, image, and JavaScript assets **all at once**, producing a single HTML5 document that is a joy to store and share.
Unlike the conventional “Save page as”, `monolith` not only saves the target document, it embeds CSS, image, and JavaScript assets **all at once**, producing a single HTML5 document that is a joy to store and share.
If compared to saving websites with `wget -mpk`, this tool embeds all assets as data URLs and therefore lets browsers render the saved page exactly the way it was on the Internet, even when no network connection is available.
---------------------------------------------------
## Installation
### From source
$ git clone https://github.com/Y2Z/monolith.git
$ cd monolith
$ cargo install --path .
#### Using [Cargo](https://crates.io/crates/monolith)
```console
cargo install monolith
```
#### Via [Homebrew](https://formulae.brew.sh/formula/monolith) (macOS and GNU/Linux)
```console
brew install monolith
```
#### Via [MacPorts](https://ports.macports.org/port/monolith/summary) (macOS)
```console
sudo port install monolith
```
#### Using [Snapcraft](https://snapcraft.io/monolith) (GNU/Linux)
```console
snap install monolith
```
#### Using [FreeBSD packages](https://svnweb.freebsd.org/ports/head/www/monolith/) (FreeBSD)
```console
pkg install monolith
```
#### Using [FreeBSD ports](https://www.freshports.org/www/monolith/) (FreeBSD)
```console
cd /usr/ports/www/monolith/
make install clean
```
#### Using [pkgsrc](https://pkgsrc.se/www/monolith) (NetBSD, OpenBSD, Haiku, etc)
```console
cd /usr/pkgsrc/www/monolith
make install clean
```
#### Using [containers](https://www.docker.com/)
```console
docker build -t Y2Z/monolith .
sudo install -b dist/run-in-container.sh /usr/local/bin/monolith
```
#### From [source](https://github.com/Y2Z/monolith)
Dependency: `libssl`
```console
git clone https://github.com/Y2Z/monolith.git
cd monolith
make install
```
#### Using [pre-built binaries](https://github.com/Y2Z/monolith/releases) (Windows, ARM-based devices, etc)
Every release contains pre-built binaries for Windows, GNU/Linux, as well as platforms with non-standard CPU architecture.
---------------------------------------------------
### On macOS (via Homebrew)
$ brew install monolith
## Usage
$ monolith https://lyrics.github.io/db/p/portishead/dummy/roads/ -o portishead-roads-lyrics.html
```console
monolith https://lyrics.github.io/db/P/Portishead/Dummy/Roads/ -o portishead-roads-lyrics.html
```
```console
cat index.html | monolith -aIiFfcMv -b https://original.site/ - > result.html
```
---------------------------------------------------
## Options
- `-c`: Ignore styles
- `-f`: Exclude iframes
- `-a`: Exclude audio sources
- `-b`: Use custom `base URL`
- `-c`: Exclude CSS
- `-C`: Save document using custom `charset`
- `-d`: Allow retrieving assets only from specified `domain(s)` (see example below)
- `-e`: Ignore network errors
- `-E`: Avoid retrieving assets located within specified domains
- `-f`: Omit frames
- `-F`: Exclude web fonts
- `-i`: Remove images
- `-I`: Isolate document
- `-I`: Isolate the document
- `-j`: Exclude JavaScript
- `-k`: Accept invalid X.509 (TLS) certificates
- `-o`: Write output to file
- `-s`: Silent mode
- `-u`: Specify custom User-Agent
- `-M`: Don't add timestamp and URL information
- `-n`: Extract contents of NOSCRIPT elements
- `-o`: Write output to `file` (use “-” for STDOUT)
- `-s`: Be quiet
- `-t`: Adjust `network request timeout`
- `-u`: Provide custom `User-Agent`
- `-v`: Exclude videos
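To restrict asset retrieval to a whitelist of domains, pass `-d` one or more times; combined with `-E`, the same list is treated as a blacklist instead (the domain names and URLs below are placeholders):
```console
monolith -d example.com -d cdn.example.com https://example.com/ -o example.html
monolith -E -d ads.example.com https://example.com/ -o example.html
```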
---------------------------------------------------
## Proxies
Please set `https_proxy`, `http_proxy`, and `no_proxy` environment variables.
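A minimal example, assuming a local HTTP proxy listening on port 8080 (the address is a placeholder):
```console
https_proxy=http://localhost:8080 http_proxy=http://localhost:8080 monolith https://example.com/ -o example.html
```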
---------------------------------------------------
## Contributing
Please open an issue if something is wrong; that helps make this project better.
---------------------------------------------------
## Related projects
- `Pagesaver`: https://github.com/distributed-mind/pagesaver
- `SingleFile`: https://github.com/gildas-lormeau/SingleFile
- Monolith Chrome Extension: https://github.com/rhysd/monolith-of-web
- Pagesaver: https://github.com/distributed-mind/pagesaver
- Personal WayBack Machine: https://github.com/popey/pwbm
- Hako: https://github.com/dmpop/hako
- Monk: https://github.com/monk-dev/monk
---------------------------------------------------
## License
The Unlicense
To the extent possible under law, the author(s) have dedicated all copyright related and neighboring rights to this software to the public domain worldwide.
This software is distributed without any warranty.
---------------------------------------------------
<!-- Microtext -->
<sub>Keep in mind that `monolith` is not aware of your browser's session</sub>
<sub>Keep in mind that `monolith` is not aware of your browser's session</sub>

View File

@ -1,131 +0,0 @@
# Appveyor configuration template for Rust using rustup for Rust installation
# https://github.com/starkat99/appveyor-rust
## Operating System (VM environment) ##
# Rust needs at least Visual Studio 2013 Appveyor OS for MSVC targets.
os: Visual Studio 2015
## Build Matrix ##
# This configuration will setup a build for each channel & target combination (12 windows
# combinations in all).
#
# There are 3 channels: stable, beta, and nightly.
#
# Alternatively, the full version may be specified for the channel to build using that specific
# version (e.g. channel: 1.5.0)
#
# The values for target are the set of windows Rust build targets. Each value is of the form
#
# ARCH-pc-windows-TOOLCHAIN
#
# Where ARCH is the target architecture, either x86_64 or i686, and TOOLCHAIN is the linker
# toolchain to use, either msvc or gnu. See https://www.rust-lang.org/downloads.html#win-foot for
# a description of the toolchain differences.
# See https://github.com/rust-lang-nursery/rustup.rs/#toolchain-specification for description of
# toolchains and host triples.
#
# Comment out channel/target combos you do not wish to build in CI.
#
# You may use the `cargoflags` and `RUSTFLAGS` variables to set additional flags for cargo commands
# and rustc, respectively. For instance, you can uncomment the cargoflags lines in the nightly
# channels to enable unstable features when building for nightly. Or you could add additional
# matrix entries to test different combinations of features.
environment:
matrix:
### MSVC Toolchains ###
# Stable 64-bit MSVC
- channel: stable
target: x86_64-pc-windows-msvc
# Stable 32-bit MSVC
- channel: stable
target: i686-pc-windows-msvc
# Beta 64-bit MSVC
- channel: beta
target: x86_64-pc-windows-msvc
# Beta 32-bit MSVC
- channel: beta
target: i686-pc-windows-msvc
# Nightly 64-bit MSVC
- channel: nightly
target: x86_64-pc-windows-msvc
#cargoflags: --features "unstable"
# Nightly 32-bit MSVC
- channel: nightly
target: i686-pc-windows-msvc
#cargoflags: --features "unstable"
### GNU Toolchains ###
# Stable 64-bit GNU
- channel: stable
target: x86_64-pc-windows-gnu
MINGW_PATH: 'C:\mingw-w64\x86_64-6.3.0-posix-seh-rt_v5-rev1\mingw64\bin'
# Stable 32-bit GNU
- channel: stable
target: i686-pc-windows-gnu
MINGW_PATH: 'C:\MinGW\bin'
# Beta 64-bit GNU
- channel: beta
target: x86_64-pc-windows-gnu
MINGW_PATH: 'C:\mingw-w64\x86_64-6.3.0-posix-seh-rt_v5-rev1\mingw64\bin'
# Beta 32-bit GNU
- channel: beta
target: i686-pc-windows-gnu
MINGW_PATH: 'C:\MinGW\bin'
# Nightly 64-bit GNU
- channel: nightly
target: x86_64-pc-windows-gnu
MINGW_PATH: 'C:\mingw-w64\x86_64-6.3.0-posix-seh-rt_v5-rev1\mingw64\bin'
#cargoflags: --features "unstable"
# Nightly 32-bit GNU
- channel: nightly
target: i686-pc-windows-gnu
MINGW_PATH: 'C:\MinGW\bin'
#cargoflags: --features "unstable"
### Allowed failures ###
# See Appveyor documentation for specific details. In short, place any channel or targets you wish
# to allow build failures on (usually nightly at least is a wise choice). This will prevent a build
# or test failure in the matching channels/targets from failing the entire build.
matrix:
allow_failures:
- channel: beta
- channel: nightly
# If you only care about stable channel build failures, uncomment the following line:
#- channel: beta
## Install Script ##
# This is the most important part of the Appveyor configuration. This installs the version of Rust
# specified by the 'channel' and 'target' environment variables from the build matrix. This uses
# rustup to install Rust.
#
# For simple configurations, instead of using the build matrix, you can simply set the
# default-toolchain and default-host manually here.
install:
- appveyor DownloadFile https://win.rustup.rs/ -FileName rustup-init.exe
- rustup-init -yv --default-toolchain %channel% --default-host %target%
- set PATH=%PATH%;%USERPROFILE%\.cargo\bin
- if defined MINGW_PATH set PATH=%PATH%;%MINGW_PATH%
- rustc -vV
- cargo -vV
- rustup component add rustfmt
## Build Script ##
# 'cargo test' takes care of building for us, so disable Appveyor's build stage. This prevents
# the "directory does not contain a project or solution file" error.
build: false
# Uses 'cargo test' to run tests and build. Alternatively, the project may call compiled programs
#directly or perform other testing commands. Rust will automatically be placed in the PATH
# environment variable.
test_script:
- cargo test --verbose %cargoflags%
- cargo fmt --all -- --check

BIN
assets/icon/icon.blend Normal file

Binary file not shown.

BIN
assets/icon/icon.png Normal file

Binary file not shown.

Size: 3.2 MiB

10
dist/run-in-container.sh vendored Normal file
View File

@ -0,0 +1,10 @@
#!/bin/sh
DOCKER=docker
PROG_NAME=monolith
if which podman > /dev/null 2>&1; then
DOCKER=podman
fi
$DOCKER run --rm Y2Z/$PROG_NAME "$@"

View File

@ -0,0 +1,19 @@
# 1. Record architecture decisions
Date: 2019-12-25
## Status
Accepted
## Context
We need to record the architectural decisions made on this project.
## Decision
We will use Architecture Decision Records, as [described by Michael Nygard](http://thinkrelevance.com/blog/2011/11/15/documenting-architecture-decisions).
## Consequences
See Michael Nygard's article, linked above. For a lightweight ADR toolset, see Nat Pryce's [adr-tools](https://github.com/npryce/adr-tools).

View File

@ -0,0 +1,19 @@
# 2. NOSCRIPT nodes
Date: 2020-04-16
## Status
Accepted
## Context
HTML pages can contain `noscript` nodes, which reveal their contents only when JavaScript is not available. Most of the time they contain hidden messages informing the user that certain JavaScript-dependent features will not be operational; however, they can sometimes also contain media assets or even iframes.
## Decision
When the document is being saved with or without JavaScript, each `noscript` node should be preserved while its children are processed exactly the same way as the rest of the document. This approach ensures that even hidden remote assets are embedded, since those hidden elements may have to be displayed later in a browser that has JavaScript turned off. An option should be available to "unwrap" all `noscript` nodes in order to make their contents always visible in the document, complementing the "disable JS" function of the program.
## Consequences
Saved documents will have the contents of all `noscript` nodes processed as if they were part of the document's DOM, and will therefore properly display images encapsulated within `noscript` nodes when viewed in browsers that have JavaScript turned off (or have no JavaScript support in the first place). The new option to "unwrap" `noscript` elements will help the user ensure that the resulting document always represents what the original web page looked like in a browser that had JavaScript turned off.
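As an illustration, the unwrap option described above maps to the `-n` flag; the URL and output file name are placeholders:
```console
monolith -n https://example.com/ -o example.html
```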

View File

@ -0,0 +1,21 @@
# 3. Network request timeout
Date: 2020-02-15
## Status
Accepted
## Context
A slow network connection or an overloaded server may negatively impact network response time.
## Decision
Make the program simulate the behavior of popular web browsers and CLI tools, where the default network response timeout is most often set to 120 seconds.
Instead of featuring retries for timed-out network requests, the program should have an option to adjust the timeout length, along with making it indefinite when given "0" as its value.
## Consequences
The user is able to retrieve resources that have a long response time, as well as obtain full control over how soon, if at all, network requests should time out.
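For example, the timeout option described above could be used as follows; `-t 30` shortens the timeout to 30 seconds, while `-t 0` disables it entirely (the URLs are placeholders):
```console
monolith -t 30 https://example.com/ -o example.html
monolith -t 0 https://example.com/slow-page -o slow-page.html
```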

View File

@ -0,0 +1,21 @@
# 4. Asset integrity check
Date: 2020-02-23
## Status
Accepted
## Context
In HTML5, `link` and `script` nodes can have an attribute named `integrity`, which lets the browser check if the remote file is valid, mostly for the purpose of enhancing page security.
## Decision
In order to replicate the browser's behavior, the program should perform the integrity check the same way a browser does, excluding the linked asset from the final result if that check fails.
The `integrity` attribute should be removed from nodes, as it bears no benefit for resources embedded as data URLs.
## Consequences
Assets that fail to pass the check get excluded from the saved document. Meanwhile, saved documents no longer contain `integrity` attributes on their `link` and `script` nodes.
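A minimal sketch of such a check, assuming the `sha2` and `base64` crates; this is illustrative only, not the project's actual implementation:
```rust
use sha2::{Digest, Sha256, Sha384, Sha512};

// Returns true if `data` matches an SRI value such as "sha384-<base64 digest>".
fn integrity_matches(data: &[u8], integrity: &str) -> bool {
    match integrity.split_once('-') {
        Some(("sha256", expected)) => base64::encode(Sha256::digest(data)) == expected,
        Some(("sha384", expected)) => base64::encode(Sha384::digest(data)) == expected,
        Some(("sha512", expected)) => base64::encode(Sha512::digest(data)) == expected,
        // Unknown algorithm or malformed value: treat the asset as failing the check
        _ => false,
    }
}
```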

View File

@ -0,0 +1,19 @@
# 5. Asset Minimization
Date: 2020-03-14
## Status
Accepted
## Context
It may look like a good idea to make monolith compress retrieved assets while saving the page for the purpose of reducing the resulting document's file size.
## Decision
Given that the main purpose of this program is to save pages in a manner that is convenient to store and share (it is mostly an archiving tool), and given that monolith can already be told to exclude certain types of assets (e.g. images, CSS, JavaScript), it would be outside the scope of this program to implement code for compressing assets. Minimizing files before embedding them does not reduce the amount of data that needs to be transferred either. A separate tool can be used later to compress and minimize pages saved by monolith, if needed.
## Consequences
Monolith will not support modification of original document assets for the purpose of reducing their size, sticking to performing only a minimal amount of modifications to the original web page: whatever is needed to provide security or exclude unwanted asset types.

View File

@ -0,0 +1,19 @@
# 6. Reload and location `meta` tags
Date: 2020-06-25
## Status
Accepted
## Context
HTML documents may contain `meta` tags capable of automatically refreshing the page or redirecting to another location.
## Decision
Since the resulting document is saved to disk and generally not intended to be served over the network, it only makes sense to remove `meta` tags that have an `http-equiv` attribute equal to "Refresh" or "Location", in order to prevent them from reloading the page or redirecting to another location.
## Consequences
Monolith will ensure that saved documents do not contain `meta` tags capable of changing location or reloading the page.
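Illustrative examples of the kind of tags that get removed (the URLs are placeholders):
```html
<meta http-equiv="refresh" content="5; url=https://example.com/next.html">
<meta http-equiv="location" content="https://example.com/">
```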

View File

@ -0,0 +1,19 @@
# 7. Network errors
Date: 2020-11-22
## Status
Accepted
## Context
Servers may return information with HTTP response codes other than `200`; however, those responses may still contain useful data.
## Decision
Fail by default, notifying the user of the network error. Add an option to continue retrieving assets by treating all response codes as `200`.
## Consequences
Monolith will fail to obtain resources with status other than `200`, unless told to ignore network errors.
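For example, with the ignore-errors flag (`-e`) the program keeps going and embeds whatever the server returns, even for, say, a `404` page (the URL is a placeholder):
```console
monolith -e https://example.com/might-be-missing -o result.html
```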

View File

@ -0,0 +1,40 @@
# 8. Base Tag
Date: 2020-12-25
## Status
Accepted
## Context
HTML documents may contain a `base` tag, which influences resolution of anchor links and relative URLs, as well as dynamically loaded resources.
Sometimes, in order to make certain saved documents function closer to how they operate while being served from a remote server, a `base` tag specifying the source page's URL may need to be added to the document.
There can be only one such tag; if multiple `base` tags are present, only the first one encountered ends up being used.
## Decision
Adding the `base` tag should be optional: saved documents should not contain a `base` tag unless it was specified by the user or the document originally had one.
The `href` attribute value of the original `base` tag should be used for resolving the document's relative links instead of the document's own URL (precisely the way browsers do it).
## Consequences
- If the base tag does not exist in the source document
  - With base URL option provided
    - use the specified base URL value to retrieve assets, keep original base URL value in the document
  - Without base URL option provided
    - download document as usual, do not add base tag
- If the base tag already exists in the source document
  - With base URL option provided
    - overwrite the original base URL before retrieving assets, keep new base URL value in the document
  - Without base URL option provided
    - use the base URL from the original document to retrieve assets, keep original base URL value in the document
The program will obtain the ability to retrieve remote assets for non-remote sources (such as data URLs and local files).
The program will also obtain the ability to get rid of existing base tag values (by providing an empty one).
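Illustrative usage of the base URL option discussed above; the file names and URLs are placeholders:
```console
# Resolve relative links in a locally saved page against its original location
monolith -b https://original.site/ index.html -o result.html
# Provide an empty value to get rid of an existing base tag
monolith -b "" saved-page.html -o result.html
```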

3
docs/references.md Normal file
View File

@ -0,0 +1,3 @@
# References
- https://content-security-policy.com/

23
docs/web-apps.md Normal file
View File

@ -0,0 +1,23 @@
# Web apps that can be saved with Monolith
These apps retain all or most of their functionality when saved with Monolith:
## Converse
| Website | https://conversejs.org |
|:-----------------------|:--------------------------------------------------------------------|
| Description | An XMPP client built using web technologies |
| Functionality retained | **full** |
| Command to use | `monolith https://conversejs.org/fullscreen.html > conversejs.html` |
| Monolith version used | 2.2.7 |
## Markdown Tables generator
| Website | https://www.tablesgenerator.com |
|:--------------------------|:-----------------------------------------------------------------------------------------------|
| Description | Tool for creating tables in extended Markdown format |
| Functionality retained | **full** |
| Command to use | `monolith -I https://www.tablesgenerator.com/markdown_tables -o markdown-table-generator.html` |
| Monolith version used | 2.6.1 |

25
monolith.nuspec Normal file
View File

@ -0,0 +1,25 @@
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd">
<metadata>
<id>monolith</id>
<version>2.4.0</version>
<title>Monolith</title>
<authors>Sunshine, Mahdi Robatipoor, Emmanuel Delaborde, Emi Simpson, rhysd</authors>
<projectUrl>https://github.com/Y2Z/monolith</projectUrl>
<iconUrl>https://raw.githubusercontent.com/Y2Z/monolith/master/assets/icon/icon.png</iconUrl>
<licenseUrl>https://raw.githubusercontent.com/Y2Z/monolith/master/LICENSE</licenseUrl>
<requireLicenseAcceptance>false</requireLicenseAcceptance>
<description>CLI tool for saving complete web pages as a single HTML file
A data hoarder's dream come true: bundle any web page into a single HTML file. You can finally replace that gazillion of open tabs with a gazillion of .html files stored somewhere on your precious little drive.
Unlike the conventional “Save page as”, monolith not only saves the target document, it embeds CSS, image, and JavaScript assets all at once, producing a single HTML5 document that is a joy to store and share.
If compared to saving websites using wget, this tool embeds all assets as data URLs and therefore lets browsers render the saved page exactly the way it was on the Internet, even when no network connection is available.
</description>
<copyright>Public Domain</copyright>
<language>en-US</language>
<tags>scraping archiving</tags>
<docsUrl>https://github.com/Y2Z/monolith/blob/master/README.md</docsUrl>
</metadata>
</package>

View File

@ -1,6 +1,7 @@
name: monolith
base: core18
version: git
# Version data defined inside the monolith part below
adopt-info: monolith
summary: Monolith - Save HTML pages with ease
description: |
A data hoarder's dream come true: bundle any web page into a single
@ -17,6 +18,14 @@ description: |
confinement: strict
architectures:
- build-on: amd64
- build-on: arm64
- build-on: armhf
- build-on: i386
- build-on: ppc64el
- build-on: s390x
parts:
monolith:
plugin: rust
@ -24,6 +33,21 @@ parts:
build-packages:
- libssl-dev
- pkg-config
override-pull: |
snapcraftctl pull
# Determine the current tag
last_committed_tag="$(git describe --tags --abbrev=0)"
last_committed_tag_ver="$(echo ${last_committed_tag} | sed 's/v//')"
# Determine the most recent version in the beta channel in the Snap Store
last_released_tag="$(snap info $SNAPCRAFT_PROJECT_NAME | awk '$1 == "beta:" { print $2 }')"
# If the latest tag from the upstream project has not been released to
# beta, build that tag instead of master.
if [ "${last_committed_tag_ver}" != "${last_released_tag}" ]; then
git fetch
git checkout "${last_committed_tag}"
fi
# set version number of the snap based on what we did above
snapcraftctl set-version $(git describe --tags --abbrev=0)
apps:
monolith:

View File

@ -1,65 +0,0 @@
use clap::{App, Arg};
#[derive(Default)]
pub struct AppArgs {
pub url_target: String,
pub no_css: bool,
pub no_frames: bool,
pub no_images: bool,
pub no_js: bool,
pub insecure: bool,
pub isolate: bool,
pub output: String,
pub silent: bool,
pub user_agent: String,
}
const DEFAULT_USER_AGENT: &str =
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:66.0) Gecko/20100101 Firefox/66.0";
impl AppArgs {
pub fn get() -> AppArgs {
let app = App::new("monolith")
.version(crate_version!())
.author(crate_authors!("\n"))
.about(crate_description!())
.arg(
Arg::with_name("url")
.required(true)
.takes_value(true)
.index(1)
.help("URL to download"),
)
// .args_from_usage("-a, --include-audio 'Embed audio sources'")
.args_from_usage("-c, --no-css 'Ignore styles'")
.args_from_usage("-f, --no-frames 'Exclude iframes'")
.args_from_usage("-i, --no-images 'Remove images'")
.args_from_usage("-I, --isolate 'Cut off from the Internet'")
.args_from_usage("-j, --no-js 'Exclude JavaScript'")
.args_from_usage("-k, --insecure 'Accept invalid X.509 (TLS) certificates'")
.args_from_usage("-o, --output=[document.html] 'Write output to <file>'")
.args_from_usage("-s, --silent 'Suppress verbosity'")
.args_from_usage("-u, --user-agent=[Iceweasel] 'Custom User-Agent string'")
// .args_from_usage("-v, --include-video 'Embed video sources'")
.get_matches();
let mut app_args = AppArgs::default();
// Process the command
app_args.url_target = app
.value_of("url")
.expect("please set target url")
.to_string();
app_args.no_css = app.is_present("no-css");
app_args.no_frames = app.is_present("no-frames");
app_args.no_images = app.is_present("no-images");
app_args.no_js = app.is_present("no-js");
app_args.insecure = app.is_present("insecure");
app_args.isolate = app.is_present("isolate");
app_args.silent = app.is_present("silent");
app_args.output = app.value_of("output").unwrap_or("").to_string();
app_args.user_agent = app
.value_of("user-agent")
.unwrap_or_else(|| DEFAULT_USER_AGENT)
.to_string();
app_args
}
}

447
src/css.rs Normal file
View File

@ -0,0 +1,447 @@
use cssparser::{
serialize_identifier, serialize_string, ParseError, Parser, ParserInput, SourcePosition, Token,
};
use reqwest::blocking::Client;
use std::collections::HashMap;
use url::Url;
use crate::opts::Options;
use crate::url::{create_data_url, resolve_url, EMPTY_IMAGE_DATA_URL};
use crate::utils::retrieve_asset;
const CSS_PROPS_WITH_IMAGE_URLS: &[&str] = &[
// Universal
"background",
"background-image",
"border-image",
"border-image-source",
"content",
"cursor",
"list-style",
"list-style-image",
"mask",
"mask-image",
// Specific to @counter-style
"additive-symbols",
"negative",
"pad",
"prefix",
"suffix",
"symbols",
];
pub fn embed_css(
cache: &mut HashMap<String, Vec<u8>>,
client: &Client,
document_url: &Url,
css: &str,
options: &Options,
depth: u32,
) -> String {
let mut input = ParserInput::new(&css);
let mut parser = Parser::new(&mut input);
process_css(
cache,
client,
document_url,
&mut parser,
options,
depth,
"",
"",
"",
)
.unwrap()
}
pub fn format_ident(ident: &str) -> String {
let mut res: String = "".to_string();
let _ = serialize_identifier(ident, &mut res);
res = res.trim_end().to_string();
res
}
pub fn format_quoted_string(string: &str) -> String {
let mut res: String = "".to_string();
let _ = serialize_string(string, &mut res);
res
}
pub fn is_image_url_prop(prop_name: &str) -> bool {
CSS_PROPS_WITH_IMAGE_URLS
.iter()
.find(|p| prop_name.eq_ignore_ascii_case(p))
.is_some()
}
pub fn process_css<'a>(
cache: &mut HashMap<String, Vec<u8>>,
client: &Client,
document_url: &Url,
parser: &mut Parser,
options: &Options,
depth: u32,
rule_name: &str,
prop_name: &str,
func_name: &str,
) -> Result<String, ParseError<'a, String>> {
let mut result: String = "".to_string();
let mut curr_rule: String = rule_name.clone().to_string();
let mut curr_prop: String = prop_name.clone().to_string();
let mut token: &Token;
let mut token_offset: SourcePosition;
loop {
token_offset = parser.position();
token = match parser.next_including_whitespace_and_comments() {
Ok(token) => token,
Err(_) => {
break;
}
};
match *token {
Token::Comment(_) => {
let token_slice = parser.slice_from(token_offset);
result.push_str(token_slice);
}
Token::Semicolon => result.push_str(";"),
Token::Colon => result.push_str(":"),
Token::Comma => result.push_str(","),
Token::ParenthesisBlock | Token::SquareBracketBlock | Token::CurlyBracketBlock => {
if options.no_fonts && curr_rule == "font-face" {
continue;
}
let closure: &str;
if token == &Token::ParenthesisBlock {
result.push_str("(");
closure = ")";
} else if token == &Token::SquareBracketBlock {
result.push_str("[");
closure = "]";
} else {
result.push_str("{");
closure = "}";
}
let block_css: String = parser
.parse_nested_block(|parser| {
process_css(
cache,
client,
document_url,
parser,
options,
depth,
rule_name,
curr_prop.as_str(),
func_name,
)
})
.unwrap();
result.push_str(block_css.as_str());
result.push_str(closure);
}
Token::CloseParenthesis => result.push_str(")"),
Token::CloseSquareBracket => result.push_str("]"),
Token::CloseCurlyBracket => result.push_str("}"),
Token::IncludeMatch => result.push_str("~="),
Token::DashMatch => result.push_str("|="),
Token::PrefixMatch => result.push_str("^="),
Token::SuffixMatch => result.push_str("$="),
Token::SubstringMatch => result.push_str("*="),
Token::CDO => result.push_str("<!--"),
Token::CDC => result.push_str("-->"),
Token::WhiteSpace(ref value) => {
result.push_str(value);
}
// div...
Token::Ident(ref value) => {
curr_rule = "".to_string();
curr_prop = value.to_string();
result.push_str(&format_ident(value));
}
// @import, @font-face, @charset, @media...
Token::AtKeyword(ref value) => {
curr_rule = value.to_string();
if options.no_fonts && curr_rule == "font-face" {
continue;
}
result.push_str("@");
result.push_str(value);
}
Token::Hash(ref value) => {
result.push_str("#");
result.push_str(value);
}
Token::QuotedString(ref value) => {
if curr_rule == "import" {
// Reset current at-rule value
curr_rule = "".to_string();
// Skip empty import values
if value.len() == 0 {
result.push_str("''");
continue;
}
let import_full_url: Url = resolve_url(&document_url, value);
match retrieve_asset(
cache,
client,
&document_url,
&import_full_url,
options,
depth + 1,
) {
Ok((
import_contents,
import_final_url,
import_media_type,
import_charset,
)) => {
let mut import_data_url = create_data_url(
&import_media_type,
&import_charset,
embed_css(
cache,
client,
&import_final_url,
&String::from_utf8_lossy(&import_contents),
options,
depth + 1,
)
.as_bytes(),
&import_final_url,
);
import_data_url.set_fragment(import_full_url.fragment());
result.push_str(
format_quoted_string(&import_data_url.to_string()).as_str(),
);
}
Err(_) => {
// Keep remote reference if unable to retrieve the asset
if import_full_url.scheme() == "http"
|| import_full_url.scheme() == "https"
{
result.push_str(
format_quoted_string(&import_full_url.to_string()).as_str(),
);
}
}
}
} else {
if func_name == "url" {
// Skip empty url()'s
if value.len() == 0 {
continue;
}
if options.no_images && is_image_url_prop(curr_prop.as_str()) {
result.push_str(format_quoted_string(EMPTY_IMAGE_DATA_URL).as_str());
} else {
let resolved_url: Url = resolve_url(&document_url, value);
match retrieve_asset(
cache,
client,
&document_url,
&resolved_url,
options,
depth + 1,
) {
Ok((data, final_url, media_type, charset)) => {
let mut data_url =
create_data_url(&media_type, &charset, &data, &final_url);
data_url.set_fragment(resolved_url.fragment());
result.push_str(
format_quoted_string(&data_url.to_string()).as_str(),
);
}
Err(_) => {
// Keep remote reference if unable to retrieve the asset
if resolved_url.scheme() == "http"
|| resolved_url.scheme() == "https"
{
result.push_str(
format_quoted_string(&resolved_url.to_string())
.as_str(),
);
}
}
}
}
} else {
result.push_str(format_quoted_string(value).as_str());
}
}
}
Token::Number {
ref has_sign,
ref value,
..
} => {
if *has_sign && *value >= 0. {
result.push_str("+");
}
result.push_str(&value.to_string())
}
Token::Percentage {
ref has_sign,
ref unit_value,
..
} => {
if *has_sign && *unit_value >= 0. {
result.push_str("+");
}
result.push_str(&(unit_value * 100.0).to_string());
result.push_str("%");
}
Token::Dimension {
ref has_sign,
ref value,
ref unit,
..
} => {
if *has_sign && *value >= 0. {
result.push_str("+");
}
result.push_str(&value.to_string());
result.push_str(&unit.to_string());
}
// #selector, #id...
Token::IDHash(ref value) => {
curr_rule = "".to_string();
result.push_str("#");
result.push_str(&format_ident(value));
}
// url()
Token::UnquotedUrl(ref value) => {
let is_import: bool = curr_rule == "import";
if is_import {
// Reset current at-rule value
curr_rule = "".to_string();
}
// Skip empty url()'s
if value.len() < 1 {
result.push_str("url()");
continue;
} else if value.starts_with("#") {
result.push_str("url(");
result.push_str(value);
result.push_str(")");
continue;
}
result.push_str("url(");
if is_import {
let full_url: Url = resolve_url(&document_url, value);
match retrieve_asset(
cache,
client,
&document_url,
&full_url,
options,
depth + 1,
) {
Ok((css, final_url, media_type, charset)) => {
let mut data_url = create_data_url(
&media_type,
&charset,
embed_css(
cache,
client,
&final_url,
&String::from_utf8_lossy(&css),
options,
depth + 1,
)
.as_bytes(),
&final_url,
);
data_url.set_fragment(full_url.fragment());
result.push_str(format_quoted_string(&data_url.to_string()).as_str());
}
Err(_) => {
// Keep remote reference if unable to retrieve the asset
if full_url.scheme() == "http" || full_url.scheme() == "https" {
result
.push_str(format_quoted_string(&full_url.to_string()).as_str());
}
}
}
} else {
if is_image_url_prop(curr_prop.as_str()) && options.no_images {
result.push_str(format_quoted_string(EMPTY_IMAGE_DATA_URL).as_str());
} else {
let full_url: Url = resolve_url(&document_url, value);
match retrieve_asset(
cache,
client,
&document_url,
&full_url,
options,
depth + 1,
) {
Ok((data, final_url, media_type, charset)) => {
let mut data_url =
create_data_url(&media_type, &charset, &data, &final_url);
data_url.set_fragment(full_url.fragment());
result
.push_str(format_quoted_string(&data_url.to_string()).as_str());
}
Err(_) => {
// Keep remote reference if unable to retrieve the asset
if full_url.scheme() == "http" || full_url.scheme() == "https" {
result.push_str(
format_quoted_string(&full_url.to_string()).as_str(),
);
}
}
}
}
}
result.push_str(")");
}
// =
Token::Delim(ref value) => result.push_str(&value.to_string()),
Token::Function(ref name) => {
let function_name: &str = &name.clone();
result.push_str(function_name);
result.push_str("(");
let block_css: String = parser
.parse_nested_block(|parser| {
process_css(
cache,
client,
document_url,
parser,
options,
depth,
curr_rule.as_str(),
curr_prop.as_str(),
function_name,
)
})
.unwrap();
result.push_str(block_css.as_str());
result.push_str(")");
}
Token::BadUrl(_) | Token::BadString(_) => {}
}
}
// Ensure empty CSS is really empty
if result.len() > 0 && result.trim().len() == 0 {
result = result.trim().to_string()
}
Ok(result)
}

File diff suppressed because it is too large

View File

@ -1,67 +0,0 @@
use reqwest::header::CONTENT_TYPE;
use reqwest::Client;
use std::collections::HashMap;
use utils::{clean_url, data_to_dataurl, is_data_url};
pub fn retrieve_asset(
cache: &mut HashMap<String, String>,
client: &Client,
url: &str,
as_dataurl: bool,
mime: &str,
opt_silent: bool,
) -> Result<(String, String), reqwest::Error> {
let cache_key = clean_url(&url);
if is_data_url(&url).unwrap() {
Ok((url.to_string(), url.to_string()))
} else {
if cache.contains_key(&cache_key) {
// url is in cache
if !opt_silent {
eprintln!("{} (from cache)", &url);
}
let data = cache.get(&cache_key).unwrap();
Ok((data.to_string(), url.to_string()))
} else {
// url not in cache, we request it
let mut response = client.get(url).send()?;
if !opt_silent {
if url == response.url().as_str() {
eprintln!("{}", &url);
} else {
eprintln!("{} -> {}", &url, &response.url().as_str());
}
}
let new_cache_key = clean_url(response.url().to_string());
if as_dataurl {
// Convert response into a byte array
let mut data: Vec<u8> = vec![];
response.copy_to(&mut data)?;
// Attempt to obtain MIME type by reading the Content-Type header
let mimetype = if mime == "" {
response
.headers()
.get(CONTENT_TYPE)
.and_then(|header| header.to_str().ok())
.unwrap_or(&mime)
} else {
mime
};
let dataurl = data_to_dataurl(&mimetype, &data);
// insert in cache
cache.insert(new_cache_key, dataurl.to_string());
Ok((dataurl, response.url().to_string()))
} else {
let content = response.text().unwrap();
// insert in cache
cache.insert(new_cache_key, content.clone());
Ok((content, response.url().to_string()))
}
}
}
}

111
src/js.rs
View File

@ -1,32 +1,103 @@
const JS_DOM_EVENT_ATTRS: [&str; 21] = [
// Input
"onfocus",
const JS_DOM_EVENT_ATTRS: &'static [&str] = &[
// From WHATWG HTML spec 8.1.5.2 "Event handlers on elements, Document objects, and Window objects":
// https://html.spec.whatwg.org/#event-handlers-on-elements,-document-objects,-and-window-objects
// https://html.spec.whatwg.org/#attributes-3 (table "List of event handler content attributes")
// Global event handlers
"onabort",
"onauxclick",
"onblur",
"onselect",
"oncancel",
"oncanplay",
"oncanplaythrough",
"onchange",
"onsubmit",
"onreset",
"onclick",
"onclose",
"oncontextmenu",
"oncuechange",
"ondblclick",
"ondrag",
"ondragend",
"ondragenter",
"ondragexit",
"ondragleave",
"ondragover",
"ondragstart",
"ondrop",
"ondurationchange",
"onemptied",
"onended",
"onerror",
"onfocus",
"onformdata",
"oninput",
"oninvalid",
"onkeydown",
"onkeypress",
"onkeyup",
// Mouse
"onmouseover",
"onmouseout",
"onmousedown",
"onmouseup",
"onmousemove",
// Click
"onclick",
"ondblclick",
// Load
"onload",
"onunload",
"onabort",
"onerror",
"onloadeddata",
"onloadedmetadata",
"onloadstart",
"onmousedown",
"onmouseenter",
"onmouseleave",
"onmousemove",
"onmouseout",
"onmouseover",
"onmouseup",
"onwheel",
"onpause",
"onplay",
"onplaying",
"onprogress",
"onratechange",
"onreset",
"onresize",
"onscroll",
"onsecuritypolicyviolation",
"onseeked",
"onseeking",
"onselect",
"onslotchange",
"onstalled",
"onsubmit",
"onsuspend",
"ontimeupdate",
"ontoggle",
"onvolumechange",
"onwaiting",
"onwebkitanimationend",
"onwebkitanimationiteration",
"onwebkitanimationstart",
"onwebkittransitionend",
// Event handlers for <body/> and <frameset/> elements
"onafterprint",
"onbeforeprint",
"onbeforeunload",
"onhashchange",
"onlanguagechange",
"onmessage",
"onmessageerror",
"onoffline",
"ononline",
"onpagehide",
"onpageshow",
"onpopstate",
"onrejectionhandled",
"onstorage",
"onunhandledrejection",
"onunload",
// Event handlers for <html/> element
"oncut",
"oncopy",
"onpaste",
];
// Returns true if DOM attribute name matches a native JavaScript event handler
pub fn attr_is_event_handler(attr_name: &str) -> bool {
JS_DOM_EVENT_ATTRS.contains(&attr_name.to_lowercase().as_str())
JS_DOM_EVENT_ATTRS
.iter()
.find(|a| attr_name.eq_ignore_ascii_case(a))
.is_some()
}

View File

@ -1,17 +1,6 @@
extern crate html5ever;
#[macro_use]
extern crate lazy_static;
extern crate regex;
extern crate reqwest;
extern crate url;
#[macro_use]
mod macros;
pub mod css;
pub mod html;
pub mod http;
pub mod js;
pub mod opts;
pub mod url;
pub mod utils;
#[cfg(test)]
pub mod tests;

View File

@ -1,9 +0,0 @@
#[macro_export]
macro_rules! str {
() => {
String::new()
};
($val: expr) => {
ToString::to_string(&$val)
};
}

View File

@ -1,104 +1,329 @@
#[macro_use]
extern crate clap;
extern crate monolith;
extern crate reqwest;
mod args;
mod macros;
use args::AppArgs;
use monolith::html::{html_to_dom, stringify_document, walk_and_embed_assets};
use monolith::http::retrieve_asset;
use monolith::utils::is_valid_url;
use encoding_rs::Encoding;
use html5ever::rcdom::RcDom;
use reqwest::blocking::Client;
use reqwest::header::{HeaderMap, HeaderValue, USER_AGENT};
use std::collections::HashMap;
use std::fs::{remove_file, File};
use std::io::{Error, Write};
use std::fs;
use std::io::{self, prelude::*, Error, Write};
use std::path::Path;
use std::process;
use std::time::Duration;
use url::Url;
fn create_file(file_path: &String, content: String) -> Result<(), Error> {
let file = File::create(file_path.as_str());
use monolith::html::{
add_favicon, create_metadata_tag, get_base_url, get_charset, has_favicon, html_to_dom,
serialize_document, set_base_url, set_charset, walk_and_embed_assets,
};
use monolith::opts::Options;
use monolith::url::{create_data_url, resolve_url};
use monolith::utils::retrieve_asset;
let mut file = match file {
Ok(file) => file,
Err(error) => return Err(error),
};
if content != str!() {
file.write_all(content.as_bytes())?;
file.write_all("\n".as_bytes())?;
file.sync_all()?;
} else {
// Remove the file right away if it had no content
remove_file(file_path.as_str())?;
}
Ok(())
enum Output {
Stdout(io::Stdout),
File(fs::File),
}
fn main() {
let app_args = AppArgs::get();
let cache = &mut HashMap::new();
// Attempt to create output file
if app_args.output != str!() {
create_file(&app_args.output, str!()).unwrap();
impl Output {
fn new(file_path: &str) -> Result<Output, Error> {
if file_path.is_empty() || file_path.eq("-") {
Ok(Output::Stdout(io::stdout()))
} else {
Ok(Output::File(fs::File::create(file_path)?))
}
}
if is_valid_url(app_args.url_target.as_str()) {
// Initialize client
let mut header_map = HeaderMap::new();
match HeaderValue::from_str(&app_args.user_agent) {
Ok(header) => header_map.insert(USER_AGENT, header),
Err(err) => {
eprintln!("Invalid user agent! {}", err);
return;
fn write(&mut self, bytes: &Vec<u8>) -> Result<(), Error> {
match self {
Output::Stdout(stdout) => {
stdout.write_all(bytes)?;
// Ensure newline at end of output
if bytes.last() != Some(&b"\n"[0]) {
stdout.write(b"\n")?;
}
stdout.flush()
}
Output::File(file) => {
file.write_all(bytes)?;
// Ensure newline at end of output
if bytes.last() != Some(&b"\n"[0]) {
file.write(b"\n")?;
}
file.flush()
}
};
let client = reqwest::Client::builder()
.timeout(Duration::from_secs(10))
.danger_accept_invalid_certs(app_args.insecure)
.default_headers(header_map)
.build()
.expect("Failed to initialize HTTP client");
// Retrieve root document
let (data, final_url) = retrieve_asset(
cache,
&client,
app_args.url_target.as_str(),
false,
"",
app_args.silent,
)
.unwrap();
let dom = html_to_dom(&data);
walk_and_embed_assets(
cache,
&client,
&final_url,
&dom.document,
app_args.no_css,
app_args.no_js,
app_args.no_images,
app_args.silent,
app_args.no_frames,
);
let html: String = stringify_document(
&dom.document,
app_args.no_css,
app_args.no_frames,
app_args.no_js,
app_args.no_images,
app_args.isolate,
);
if app_args.output == str!() {
println!("{}", html);
} else {
create_file(&app_args.output, html).unwrap();
}
}
}
pub fn read_stdin() -> Vec<u8> {
let mut buffer: Vec<u8> = vec![];
match io::stdin().lock().read_to_end(&mut buffer) {
Ok(_) => buffer,
Err(_) => buffer,
}
}
fn main() {
let options = Options::from_args();
// Check if target was provided
if options.target.len() == 0 {
if !options.silent {
eprintln!("No target specified");
}
process::exit(1);
}
// Check if custom charset is valid
if let Some(custom_charset) = options.charset.clone() {
if !Encoding::for_label_no_replacement(custom_charset.as_bytes()).is_some() {
eprintln!("Unknown encoding: {}", &custom_charset);
process::exit(1);
}
}
let mut use_stdin: bool = false;
let target_url = match options.target.as_str() {
"-" => {
// Read from pipe (stdin)
use_stdin = true;
// Set default target URL to an empty data URL; the user can set it via --base-url
Url::parse("data:text/html,").unwrap()
}
target => match Url::parse(&target) {
Ok(url) => match url.scheme() {
"data" | "file" | "http" | "https" => url,
unsupported_scheme => {
if !options.silent {
eprintln!("Unsupported target URL type: {}", unsupported_scheme);
}
process::exit(1)
}
},
Err(_) => {
// Failed to parse given base URL (perhaps it's a filesystem path?)
let path: &Path = Path::new(&target);
match path.exists() {
true => match path.is_file() {
true => {
let canonical_path = fs::canonicalize(&path).unwrap();
match Url::from_file_path(canonical_path) {
Ok(url) => url,
Err(_) => {
if !options.silent {
eprintln!(
"Could not generate file URL out of given path: {}",
&target
);
}
process::exit(1);
}
}
}
false => {
if !options.silent {
eprintln!("Local target is not a file: {}", &target);
}
process::exit(1);
}
},
false => {
// It is not a FS path, now we do what browsers do:
// prepend "http://" and hope it points to a website
Url::parse(&format!("http://{hopefully_url}", hopefully_url = &target))
.unwrap()
}
}
}
},
};
// Initialize client
let mut cache = HashMap::new();
let mut header_map = HeaderMap::new();
if let Some(user_agent) = &options.user_agent {
header_map.insert(
USER_AGENT,
HeaderValue::from_str(&user_agent).expect("Invalid User-Agent header specified"),
);
}
let client = if options.timeout > 0 {
Client::builder().timeout(Duration::from_secs(options.timeout))
} else {
// No timeout is default
Client::builder()
}
.danger_accept_invalid_certs(options.insecure)
.default_headers(header_map)
.build()
.expect("Failed to initialize HTTP client");
// At first we assume that base URL is the same as target URL
let mut base_url: Url = target_url.clone();
let data: Vec<u8>;
let mut document_encoding: String = "".to_string();
let mut dom: RcDom;
// Retrieve target document
if use_stdin {
data = read_stdin();
} else if target_url.scheme() == "file"
|| (target_url.scheme() == "http" || target_url.scheme() == "https")
|| target_url.scheme() == "data"
{
match retrieve_asset(&mut cache, &client, &target_url, &target_url, &options, 0) {
Ok((retrieved_data, final_url, media_type, charset)) => {
// Make sure the media type is text/html
if !media_type.eq_ignore_ascii_case("text/html") {
if !options.silent {
eprintln!("Unsupported document media type");
}
process::exit(1);
}
if options
.base_url
.clone()
.unwrap_or("".to_string())
.is_empty()
{
base_url = final_url;
}
data = retrieved_data;
document_encoding = charset;
}
Err(_) => {
if !options.silent {
eprintln!("Could not retrieve target document");
}
process::exit(1);
}
}
} else {
process::exit(1);
}
// Initial parse
dom = html_to_dom(&data, document_encoding.clone());
// TODO: investigate if charset from filesystem/data URL/HTTP headers
// has say over what's specified in HTML
// Attempt to determine document's charset
if let Some(html_charset) = get_charset(&dom.document) {
if !html_charset.is_empty() {
// Check if the charset specified inside HTML is valid
if let Some(encoding) = Encoding::for_label_no_replacement(html_charset.as_bytes()) {
document_encoding = html_charset;
dom = html_to_dom(&data, encoding.name().to_string());
}
}
}
// Use custom base URL if specified, read and use what's in the DOM otherwise
let custom_base_url: String = options.base_url.clone().unwrap_or("".to_string());
if custom_base_url.is_empty() {
// No custom base URL is specified
// Try to see if document has BASE element
if let Some(existing_base_url) = get_base_url(&dom.document) {
base_url = resolve_url(&target_url, &existing_base_url);
}
} else {
// Custom base URL provided
match Url::parse(&custom_base_url) {
Ok(parsed_url) => {
if parsed_url.scheme() == "file" {
// File base URLs can only work with
// documents saved from filesystem
if target_url.scheme() == "file" {
base_url = parsed_url;
}
} else {
base_url = parsed_url;
}
}
Err(_) => {
// Failed to parse given base URL, perhaps it's a filesystem path?
if target_url.scheme() == "file" {
// Relative paths could work for documents saved from filesystem
let path: &Path = Path::new(&custom_base_url);
if path.exists() {
match Url::from_file_path(fs::canonicalize(&path).unwrap()) {
Ok(file_url) => {
base_url = file_url;
}
Err(_) => {
if !options.silent {
eprintln!(
"Could not map given path to base URL: {}",
custom_base_url
);
}
process::exit(1);
}
}
}
}
}
}
}
// Traverse through the document and embed remote assets
walk_and_embed_assets(&mut cache, &client, &base_url, &dom.document, &options, 0);
// Update or add new BASE element to reroute network requests and hash-links
if let Some(new_base_url) = options.base_url.clone() {
dom = set_base_url(&dom.document, new_base_url);
}
// Request and embed /favicon.ico (unless it's already linked in the document)
if !options.no_images
&& (target_url.scheme() == "http" || target_url.scheme() == "https")
&& !has_favicon(&dom.document)
{
let favicon_ico_url: Url = resolve_url(&base_url, "/favicon.ico");
match retrieve_asset(
&mut cache,
&client,
&target_url,
&favicon_ico_url,
&options,
0,
) {
Ok((data, final_url, media_type, charset)) => {
let favicon_data_url: Url =
create_data_url(&media_type, &charset, &data, &final_url);
dom = add_favicon(&dom.document, favicon_data_url.to_string());
}
Err(_) => {
// Failed to retrieve /favicon.ico
}
}
}
// Save using specified charset, if given
if let Some(custom_charset) = options.charset.clone() {
document_encoding = custom_charset;
dom = set_charset(dom, document_encoding.clone());
}
// Serialize DOM tree
let mut result: Vec<u8> = serialize_document(dom, document_encoding, &options);
// Prepend metadata comment tag
if !options.no_metadata {
let mut metadata_comment: String = create_metadata_tag(&target_url);
metadata_comment += "\n";
result.splice(0..0, metadata_comment.as_bytes().to_vec());
}
// Define output
let mut output = Output::new(&options.output).expect("Could not prepare output");
// Write result into stdout or file
output.write(&result).expect("Could not write HTML output");
}

144
src/opts.rs Normal file
View File

@ -0,0 +1,144 @@
use clap::{App, Arg, ArgAction};
use std::env;
#[derive(Default)]
pub struct Options {
pub no_audio: bool,
pub base_url: Option<String>,
pub no_css: bool,
pub charset: Option<String>,
pub domains: Option<Vec<String>>,
pub ignore_errors: bool,
pub exclude_domains: bool,
pub no_frames: bool,
pub no_fonts: bool,
pub no_images: bool,
pub isolate: bool,
pub no_js: bool,
pub insecure: bool,
pub no_metadata: bool,
pub output: String,
pub silent: bool,
pub timeout: u64,
pub user_agent: Option<String>,
pub no_video: bool,
pub target: String,
pub no_color: bool,
pub unwrap_noscript: bool,
}
const ASCII: &'static str = " \
_____ ______________ __________ ___________________ ___
| \\ / \\ | | | | | |
| \\_/ __ \\_| __ | | ___ ___ |__| |
| | | | | | | | | | | |
| |\\ /| |__| _ |__| |____| | | | | __ |
| | \\___/ | | \\ | | | | | | |
|___| |__________| \\_____________________| |___| |___| |___|
";
const DEFAULT_NETWORK_TIMEOUT: u64 = 120;
const DEFAULT_USER_AGENT: &'static str =
"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:73.0) Gecko/20100101 Firefox/73.0";
const ENV_VAR_NO_COLOR: &str = "NO_COLOR";
const ENV_VAR_TERM: &str = "TERM";
impl Options {
pub fn from_args() -> Options {
let app = App::new(env!("CARGO_PKG_NAME"))
.version(env!("CARGO_PKG_VERSION"))
.author(format!("\n{}\n\n", env!("CARGO_PKG_AUTHORS").replace(':', "\n")).as_str())
.about(format!("{}\n{}", ASCII, env!("CARGO_PKG_DESCRIPTION")).as_str())
.args_from_usage("-a, --no-audio 'Removes audio sources'")
.args_from_usage("-b, --base-url=[http://localhost/] 'Sets custom base URL'")
.args_from_usage("-c, --no-css 'Removes CSS'")
.args_from_usage("-C, --charset=[UTF-8] 'Enforces custom encoding'")
.arg(
Arg::with_name("domains")
.short('d')
.long("domains")
.takes_value(true)
.value_name("DOMAINS")
.action(ArgAction::Append)
.help("Whitelist of domains"),
)
.args_from_usage("-e, --ignore-errors 'Ignore network errors'")
.args_from_usage("-E, --exclude-domains 'Treat specified domains as blacklist'")
.args_from_usage("-f, --no-frames 'Removes frames and iframes'")
.args_from_usage("-F, --no-fonts 'Removes fonts'")
.args_from_usage("-i, --no-images 'Removes images'")
.args_from_usage("-I, --isolate 'Cuts off document from the Internet'")
.args_from_usage("-j, --no-js 'Removes JavaScript'")
.args_from_usage("-k, --insecure 'Allows invalid X.509 (TLS) certificates'")
.args_from_usage("-M, --no-metadata 'Excludes timestamp and source information'")
.args_from_usage(
"-n, --unwrap-noscript 'Replaces NOSCRIPT elements with their contents'",
)
.args_from_usage(
"-o, --output=[document.html] 'Writes output to <file>, use - for STDOUT'",
)
.args_from_usage("-s, --silent 'Suppresses verbosity'")
.args_from_usage("-t, --timeout=[60] 'Adjusts network request timeout'")
.args_from_usage("-u, --user-agent=[Firefox] 'Sets custom User-Agent string'")
.args_from_usage("-v, --no-video 'Removes video sources'")
.arg(
Arg::with_name("target")
.required(true)
.takes_value(true)
.index(1)
.help("URL or file path, use - for STDIN"),
)
.get_matches();
let mut options: Options = Options::default();
// Process the command
options.target = app
.value_of("target")
.expect("please set target")
.to_string();
options.no_audio = app.is_present("no-audio");
if let Some(base_url) = app.value_of("base-url") {
options.base_url = Some(base_url.to_string());
}
options.no_css = app.is_present("no-css");
if let Some(charset) = app.value_of("charset") {
options.charset = Some(charset.to_string());
}
if let Some(domains) = app.get_many::<String>("domains") {
let list_of_domains: Vec<String> = domains.map(|v| v.clone()).collect::<Vec<_>>();
options.domains = Some(list_of_domains);
}
options.ignore_errors = app.is_present("ignore-errors");
options.exclude_domains = app.is_present("exclude-domains");
options.no_frames = app.is_present("no-frames");
options.no_fonts = app.is_present("no-fonts");
options.no_images = app.is_present("no-images");
options.isolate = app.is_present("isolate");
options.no_js = app.is_present("no-js");
options.insecure = app.is_present("insecure");
options.no_metadata = app.is_present("no-metadata");
options.output = app.value_of("output").unwrap_or("").to_string();
options.silent = app.is_present("silent");
options.timeout = app
.value_of("timeout")
.unwrap_or(&DEFAULT_NETWORK_TIMEOUT.to_string())
.parse::<u64>()
.unwrap();
if let Some(user_agent) = app.value_of("user-agent") {
options.user_agent = Some(user_agent.to_string());
} else {
options.user_agent = Some(DEFAULT_USER_AGENT.to_string());
}
options.unwrap_noscript = app.is_present("unwrap-noscript");
options.no_video = app.is_present("no-video");
options.no_color =
env::var_os(ENV_VAR_NO_COLOR).is_some() || atty::isnt(atty::Stream::Stderr);
if let Some(term) = env::var_os(ENV_VAR_TERM) {
if term == "dumb" {
options.no_color = true;
}
}
options
}
}


@ -1,518 +0,0 @@
use crate::html::{
get_node_name, get_parent_node, html_to_dom, is_icon, stringify_document, walk_and_embed_assets,
};
use html5ever::rcdom::{Handle, NodeData};
use html5ever::serialize::{serialize, SerializeOpts};
use std::collections::HashMap;
#[test]
fn test_is_icon() {
assert_eq!(is_icon("icon"), true);
assert_eq!(is_icon("Shortcut Icon"), true);
assert_eq!(is_icon("ICON"), true);
assert_eq!(is_icon("mask-icon"), true);
assert_eq!(is_icon("fluid-icon"), true);
assert_eq!(is_icon("stylesheet"), false);
assert_eq!(is_icon(""), false);
}
#[test]
fn test_get_parent_node_name() {
let html = "<!doctype html><html><HEAD></HEAD><body><div><P></P></div></body></html>";
let dom = html_to_dom(&html);
let mut count = 0;
fn test_walk(node: &Handle, i: &mut i8) {
*i += 1;
match &node.data {
NodeData::Document => {
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
NodeData::Element { ref name, .. } => {
let node_name = name.local.as_ref().to_string();
let parent_node_name = get_node_name(&get_parent_node(node));
if node_name == "head" || node_name == "body" {
assert_eq!(parent_node_name, "html");
} else if node_name == "div" {
assert_eq!(parent_node_name, "body");
} else if node_name == "p" {
assert_eq!(parent_node_name, "div");
}
println!("{}", node_name);
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
_ => (),
};
}
test_walk(&dom.document, &mut count);
assert_eq!(count, 7);
}
#[test]
fn test_walk_and_embed_assets() {
let cache = &mut HashMap::new();
let html = "<div><P></P></div>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><div><p></p></div></body></html>"
);
}
#[test]
fn test_walk_and_embed_assets_ensure_no_recursive_iframe() {
let html = "<div><P></P><iframe src=\"\"></iframe></div>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><div><p></p><iframe src=\"\"></iframe></div></body></html>"
);
}
#[test]
fn test_walk_and_embed_assets_no_css() {
let html = "<link rel=\"stylesheet\" href=\"main.css\">\
<style>html{background-color: #000;}</style>\
<div style=\"display: none;\"></div>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let opt_no_css: bool = true;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html>\
<head>\
<link rel=\"stylesheet\" href=\"\">\
<style></style>\
</head>\
<body>\
<div></div>\
</body>\
</html>"
);
}
#[test]
fn test_walk_and_embed_assets_no_images() {
let html = "<link rel=\"icon\" href=\"favicon.ico\">\
<div><img src=\"http://localhost/assets/mono_lisa.png\" /></div>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = true;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html>\
<head>\
<link rel=\"icon\" href=\"\">\
</head>\
<body>\
<div>\
<img src=\"data:image/png;base64,\
iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0\
lEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII=\">\
</div>\
</body>\
</html>"
);
}
#[test]
fn test_walk_and_embed_assets_no_frames() {
let html = "<iframe src=\"http://trackbook.com\"></iframe>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let opt_no_css: bool = false;
let opt_no_frames: bool = true;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><iframe src=\"\"></iframe></body></html>"
);
}
#[test]
fn test_walk_and_embed_assets_no_js() {
let html = "<div onClick=\"void(0)\">\
<script src=\"http://localhost/assets/some.js\"></script>\
<script>alert(1)</script>\
</div>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = true;
let opt_no_images: bool = false;
let opt_silent = true;
let client = reqwest::Client::new();
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><div><script src=\"\"></script>\
<script></script></div></body></html>"
);
}
#[test]
fn test_walk_and_embed_with_no_integrity() {
let html = "<title>No integrity</title>\
<link integrity=\"sha384-...\" rel=\"something\"/>\
<script integrity=\"sha384-...\" src=\"some.js\"></script>";
let dom = html_to_dom(&html);
let url = "http://localhost";
let cache = &mut HashMap::new();
let client = reqwest::Client::new();
let opt_no_css: bool = true;
let opt_no_frames: bool = true;
let opt_no_js: bool = true;
let opt_no_images: bool = true;
let opt_silent = true;
walk_and_embed_assets(
cache,
&client,
&url,
&dom.document,
opt_no_css,
opt_no_js,
opt_no_images,
opt_silent,
opt_no_frames,
);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html>\
<head><title>No integrity</title><link rel=\"something\"><script src=\"\"></script></head>\
<body></body>\
</html>"
);
}
#[test]
fn test_stringify_document() {
let html = "<div><script src=\"some.js\"></script></div>";
let dom = html_to_dom(&html);
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_isolate: bool = false;
assert_eq!(
stringify_document(
&dom.document,
opt_no_css,
opt_no_frames,
opt_no_js,
opt_no_images,
opt_isolate,
),
"<html><head></head><body><div><script src=\"some.js\"></script></div></body></html>"
);
}
#[test]
fn test_stringify_document_isolate() {
let html = "<title>Isolated document</title>\
<link rel=\"something\" href=\"some.css\" />\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<div><script src=\"some.js\"></script></div>";
let dom = html_to_dom(&html);
let opt_no_css: bool = false;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_isolate: bool = true;
assert_eq!(
stringify_document(
&dom.document,
opt_no_css,
opt_no_frames,
opt_no_js,
opt_no_images,
opt_isolate,
),
"<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src 'unsafe-inline' data:;\"></meta>\
<title>Isolated document</title>\
<link rel=\"something\" href=\"some.css\">\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
</head>\
<body>\
<div>\
<script src=\"some.js\"></script>\
</div>\
</body>\
</html>"
);
}
#[test]
fn test_stringify_document_no_css() {
let html = "<!doctype html>\
<title>Unstyled document</title>\
<link rel=\"stylesheet\" href=\"main.css\"/>\
<div style=\"display: none;\"></div>";
let dom = html_to_dom(&html);
let opt_no_css: bool = true;
let opt_no_frames: bool = false;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_isolate: bool = false;
assert_eq!(
stringify_document(
&dom.document,
opt_no_css,
opt_no_frames,
opt_no_js,
opt_no_images,
opt_isolate,
),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"style-src 'none';\"></meta>\
<title>Unstyled document</title>\
<link rel=\"stylesheet\" href=\"main.css\">\
</head>\
<body><div style=\"display: none;\"></div></body>\
</html>"
);
}
#[test]
fn test_stringify_document_no_frames() {
let html = "<!doctype html>\
<title>Frameless document</title>\
<link rel=\"something\"/>\
<div><script src=\"some.js\"></script></div>";
let dom = html_to_dom(&html);
let opt_no_css: bool = false;
let opt_no_frames: bool = true;
let opt_no_js: bool = false;
let opt_no_images: bool = false;
let opt_isolate: bool = false;
assert_eq!(
stringify_document(
&dom.document,
opt_no_css,
opt_no_frames,
opt_no_js,
opt_no_images,
opt_isolate,
),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"frame-src 'none';child-src 'none';\"></meta>\
<title>Frameless document</title>\
<link rel=\"something\">\
</head>\
<body><div><script src=\"some.js\"></script></div></body>\
</html>"
);
}
#[test]
fn test_stringify_document_isolate_no_frames_no_js_no_css_no_images() {
let html = "<!doctype html>\
<title>no-frame no-css no-js no-image isolated document</title>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<link rel=\"stylesheet\" href=\"some.css\">\
<div>\
<script src=\"some.js\"></script>\
<img style=\"width: 100%;\" src=\"some.png\" />\
<iframe src=\"some.html\"></iframe>\
</div>";
let dom = html_to_dom(&html);
let opt_isolate: bool = true;
let opt_no_css: bool = true;
let opt_no_frames: bool = true;
let opt_no_js: bool = true;
let opt_no_images: bool = true;
assert_eq!(
stringify_document(
&dom.document,
opt_no_css,
opt_no_frames,
opt_no_js,
opt_no_images,
opt_isolate,
),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src \'unsafe-inline\' data:; style-src \'none\'; frame-src \'none\';child-src \'none\'; script-src \'none\'; img-src data:;\"></meta>\
<title>no-frame no-css no-js no-image isolated document</title>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<link rel=\"stylesheet\" href=\"some.css\">\
</head>\
<body>\
<div>\
<script src=\"some.js\"></script>\
<img style=\"width: 100%;\" src=\"some.png\">\
<iframe src=\"some.html\"></iframe>\
</div>\
</body>\
</html>"
);
}


@ -1,23 +0,0 @@
use crate::http::retrieve_asset;
use std::collections::HashMap;
#[test]
fn test_retrieve_asset() {
let cache = &mut HashMap::new();
let client = reqwest::Client::new();
let (data, final_url) =
retrieve_asset(cache, &client, "data:text/html;base64,...", true, "", false).unwrap();
assert_eq!(&data, "data:text/html;base64,...");
assert_eq!(&final_url, "data:text/html;base64,...");
let (data, final_url) = retrieve_asset(
cache,
&client,
"data:text/html;base64,...",
true,
"image/png",
false,
)
.unwrap();
assert_eq!(&data, "data:text/html;base64,...");
assert_eq!(&final_url, "data:text/html;base64,...");
}


@ -1,13 +0,0 @@
use crate::js::attr_is_event_handler;
#[test]
fn test_attr_is_event_handler() {
// succeeding
assert!(attr_is_event_handler("onBlur"));
assert!(attr_is_event_handler("onclick"));
assert!(attr_is_event_handler("onClick"));
// failing
assert!(!attr_is_event_handler("href"));
assert!(!attr_is_event_handler(""));
assert!(!attr_is_event_handler("class"));
}


@ -1,4 +0,0 @@
mod html;
mod http;
mod js;
mod utils;


@ -1,177 +0,0 @@
use crate::utils::{
clean_url, data_to_dataurl, detect_mimetype, is_data_url, is_valid_url, resolve_url,
url_has_protocol,
};
use url::ParseError;
#[test]
fn test_data_to_dataurl() {
let mime = "application/javascript";
let data = "var word = 'hello';\nalert(word);\n";
let datauri = data_to_dataurl(mime, data.as_bytes());
assert_eq!(
&datauri,
"data:application/javascript;base64,dmFyIHdvcmQgPSAnaGVsbG8nOwphbGVydCh3b3JkKTsK"
);
}
#[test]
fn test_detect_mimetype() {
// image
assert_eq!(detect_mimetype(b"GIF87a"), "image/gif");
assert_eq!(detect_mimetype(b"GIF89a"), "image/gif");
assert_eq!(detect_mimetype(b"\xFF\xD8\xFF"), "image/jpeg");
assert_eq!(detect_mimetype(b"\x89PNG\x0D\x0A\x1A\x0A"), "image/png");
assert_eq!(detect_mimetype(b"<?xml "), "image/svg+xml");
assert_eq!(detect_mimetype(b"<svg "), "image/svg+xml");
assert_eq!(detect_mimetype(b"RIFF....WEBPVP8 "), "image/webp");
assert_eq!(detect_mimetype(b"\x00\x00\x01\x00"), "image/x-icon");
// audio
assert_eq!(detect_mimetype(b"ID3"), "audio/mpeg");
assert_eq!(detect_mimetype(b"\xFF\x0E"), "audio/mpeg");
assert_eq!(detect_mimetype(b"\xFF\x0F"), "audio/mpeg");
assert_eq!(detect_mimetype(b"OggS"), "audio/ogg");
assert_eq!(detect_mimetype(b"RIFF....WAVEfmt "), "audio/wav");
assert_eq!(detect_mimetype(b"fLaC"), "audio/x-flac");
// video
assert_eq!(detect_mimetype(b"RIFF....AVI LIST"), "video/avi");
assert_eq!(detect_mimetype(b"....ftyp"), "video/mp4");
assert_eq!(detect_mimetype(b"\x00\x00\x01\x0B"), "video/mpeg");
assert_eq!(detect_mimetype(b"....moov"), "video/quicktime");
assert_eq!(detect_mimetype(b"\x1A\x45\xDF\xA3"), "video/webm");
}
#[test]
fn test_url_has_protocol() {
// succeeding
assert_eq!(
url_has_protocol("mailto:somebody@somewhere.com?subject=hello"),
true
);
assert_eq!(url_has_protocol("tel:5551234567"), true);
assert_eq!(
url_has_protocol("ftp:user:password@some-ftp-server.com"),
true
);
assert_eq!(url_has_protocol("javascript:void(0)"), true);
assert_eq!(url_has_protocol("http://news.ycombinator.com"), true);
assert_eq!(url_has_protocol("https://github.com"), true);
assert_eq!(
url_has_protocol("MAILTO:somebody@somewhere.com?subject=hello"),
true
);
// failing
assert_eq!(
url_has_protocol("//some-hostname.com/some-file.html"),
false
);
assert_eq!(url_has_protocol("some-hostname.com/some-file.html"), false);
assert_eq!(url_has_protocol("/some-file.html"), false);
assert_eq!(url_has_protocol(""), false);
}
#[test]
fn test_is_valid_url() {
// succeeding
assert!(is_valid_url("https://www.rust-lang.org/"));
assert!(is_valid_url("http://kernel.org"));
// failing
assert!(!is_valid_url("//kernel.org"));
assert!(!is_valid_url("./index.html"));
assert!(!is_valid_url("some-local-page.htm"));
assert!(!is_valid_url("ftp://1.2.3.4/www/index.html"));
assert!(!is_valid_url(
"data:text/html;base64,V2VsY29tZSBUbyBUaGUgUGFydHksIDxiPlBhbDwvYj4h"
));
}
#[test]
fn test_resolve_url() -> Result<(), ParseError> {
let resolved_url = resolve_url("https://www.kernel.org", "../category/signatures.html")?;
assert_eq!(
resolved_url.as_str(),
"https://www.kernel.org/category/signatures.html"
);
let resolved_url = resolve_url("https://www.kernel.org", "category/signatures.html")?;
assert_eq!(
resolved_url.as_str(),
"https://www.kernel.org/category/signatures.html"
);
let resolved_url = resolve_url(
"saved_page.htm",
"https://www.kernel.org/category/signatures.html",
)?;
assert_eq!(
resolved_url.as_str(),
"https://www.kernel.org/category/signatures.html"
);
let resolved_url = resolve_url(
"https://www.kernel.org",
"//www.kernel.org/theme/images/logos/tux.png",
)?;
assert_eq!(
resolved_url.as_str(),
"https://www.kernel.org/theme/images/logos/tux.png"
);
let resolved_url = resolve_url(
"https://www.kernel.org",
"//another-host.org/theme/images/logos/tux.png",
)?;
assert_eq!(
resolved_url.as_str(),
"https://another-host.org/theme/images/logos/tux.png"
);
let resolved_url = resolve_url(
"https://www.kernel.org/category/signatures.html",
"/theme/images/logos/tux.png",
)?;
assert_eq!(
resolved_url.as_str(),
"https://www.kernel.org/theme/images/logos/tux.png"
);
let resolved_url = resolve_url(
"https://www.w3schools.com/html/html_iframe.asp",
"default.asp",
)?;
assert_eq!(
resolved_url.as_str(),
"https://www.w3schools.com/html/default.asp"
);
Ok(())
}
#[test]
fn test_is_data_url() {
// succeeding
assert!(
is_data_url("data:text/html;base64,V2VsY29tZSBUbyBUaGUgUGFydHksIDxiPlBhbDwvYj4h")
.unwrap_or(false)
);
// failing
assert!(!is_data_url("https://kernel.org").unwrap_or(false));
assert!(!is_data_url("//kernel.org").unwrap_or(false));
assert!(!is_data_url("").unwrap_or(false));
}
#[test]
fn test_clean_url() {
assert_eq!(
clean_url("https://somewhere.com/font.eot#iefix"),
"https://somewhere.com/font.eot"
);
assert_eq!(
clean_url("https://somewhere.com/font.eot#"),
"https://somewhere.com/font.eot"
);
assert_eq!(
clean_url("https://somewhere.com/font.eot?#"),
"https://somewhere.com/font.eot"
);
}

src/url.rs (new file, 82 lines)

@ -0,0 +1,82 @@
use base64;
use percent_encoding::percent_decode_str;
use url::Url;
use crate::utils::{detect_media_type, parse_content_type};
pub const EMPTY_IMAGE_DATA_URL: &'static str = "data:image/png;base64,\
iVBORw0KGgoAAAANSUhEUgAAAA0AAAANCAQAAADY4iz3AAAAEUlEQVR42mNkwAkYR6UolgIACvgADsuK6xYAAAAASUVORK5CYII=";
pub fn clean_url(url: Url) -> Url {
let mut url = url.clone();
// Clear fragment (if any)
url.set_fragment(None);
url
}
pub fn create_data_url(media_type: &str, charset: &str, data: &[u8], final_asset_url: &Url) -> Url {
// TODO: move this block out of this function
let media_type: String = if media_type.is_empty() {
detect_media_type(data, &final_asset_url)
} else {
media_type.to_string()
};
let mut data_url: Url = Url::parse("data:,").unwrap();
let c: String =
if !charset.trim().is_empty() && !charset.trim().eq_ignore_ascii_case("US-ASCII") {
format!(";charset={}", charset.trim())
} else {
"".to_string()
};
data_url.set_path(format!("{}{};base64,{}", media_type, c, base64::encode(data)).as_str());
data_url
}
pub fn is_url_and_has_protocol(input: &str) -> bool {
match Url::parse(&input) {
Ok(parsed_url) => {
return parsed_url.scheme().len() > 0;
}
Err(_) => {
return false;
}
}
}
pub fn parse_data_url(url: &Url) -> (String, String, Vec<u8>) {
let path: String = url.path().to_string();
let comma_loc: usize = path.find(',').unwrap_or(path.len());
// Split data URL into meta data and raw data
let content_type: String = path.chars().take(comma_loc).collect();
let data: String = path.chars().skip(comma_loc + 1).collect();
// Parse meta data
let (media_type, charset, is_base64) = parse_content_type(&content_type);
// Parse raw data into vector of bytes
let text: String = percent_decode_str(&data).decode_utf8_lossy().to_string();
let blob: Vec<u8> = if is_base64 {
base64::decode(&text).unwrap_or(vec![])
} else {
text.as_bytes().to_vec()
};
(media_type, charset, blob)
}
pub fn resolve_url(from: &Url, to: &str) -> Url {
match Url::parse(&to) {
Ok(parsed_url) => parsed_url,
Err(_) => match from.join(to) {
Ok(joined) => joined,
Err(_) => Url::parse("data:,").unwrap(),
},
}
}
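A brief usage sketch (editor's illustration, not part of the file above) of the helpers in src/url.rs; the sample URLs are made up:
use url::Url;

fn example() {
    // parse_data_url() splits a data URL into (media type, charset, raw bytes)
    let url = Url::parse("data:text/plain;charset=UTF-8;base64,SGVsbG8=").unwrap();
    let (media_type, charset, blob) = parse_data_url(&url);
    assert_eq!(media_type, "text/plain");
    assert_eq!(charset, "UTF-8");
    assert_eq!(blob, b"Hello".to_vec());

    // resolve_url() joins a relative reference against the document URL;
    // clean_url() then drops the fragment (e.g. before the URL is used as a cache key)
    let base = Url::parse("https://www.kernel.org/category/signatures.html").unwrap();
    let asset = resolve_url(&base, "/theme/images/logos/tux.png#top");
    assert_eq!(
        clean_url(asset).as_str(),
        "https://www.kernel.org/theme/images/logos/tux.png"
    );
}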


@ -1,56 +1,21 @@
extern crate base64;
use self::base64::encode;
use http::retrieve_asset;
use regex::Regex;
use reqwest::Client;
use reqwest::blocking::Client;
use reqwest::header::CONTENT_TYPE;
use std::collections::HashMap;
use url::{ParseError, Url};
use std::fs;
use std::path::{Path, PathBuf};
use url::Url;
/// This monster of a regex is used to match any kind of URL found in CSS.
///
/// There are roughly three different categories that a found URL could fit
/// into:
/// - Font [found after a src: property in an @font-face rule]
/// - Stylesheet [denoted by an @import before the url]
/// - Image [covers all other uses of the url() function]
///
/// This regex aims to extract the following information:
/// - What type of URL is it (font/image/css)
/// - Where is the part that needs to be replaced (incl any wrapping quotes)
/// - What is the URL (excl any wrapping quotes)
///
/// Essentially, the regex can be broken down into two parts:
///
/// `(?:(?:(?P<stylesheet>@import)|(?P<font>src\s*:))\s+)?`
/// This matches the precursor to a font or CSS URL, and fills in a match under
/// either `<stylesheet>` (if it's a CSS @import) or `<font>` (if it's a font).
/// Determining whether or not it's an image can be done by the negation of both
/// of these. Either zero or one of these can match.
///
/// `url\((?P<to_repl>['"]?(?P<url>[^"'\)]+)['"]?)\)`
/// This matches the actual URL part of the url(), and must always match. It also
/// sets `<to_repl>` and `<url>` which correspond to everything within
/// `url(...)` and a usable URL, respectively.
///
/// Note, however, that this does not perform any validation of the found URL.
/// Malformed CSS could lead to an invalid URL being present. It is therefore
/// recommended that the URL gets manually validated.
const CSS_URL_REGEX_STR: &str = r###"(?:(?:(?P<stylesheet>@import)|(?P<font>src\s*:))\s+)?url\((?P<to_repl>['"]?(?P<url>[^"'\)]+)['"]?)\)"###;
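A minimal sketch (editor's illustration, not part of this changeset) of how the named capture groups described above could be consumed; the classify_css_urls helper and the sample stylesheet string are made up:
use regex::Regex;

// Illustrative helper: prints the category and URL of every url() occurrence found.
fn classify_css_urls(css: &str) {
    let re = Regex::new(CSS_URL_REGEX_STR).unwrap();
    for caps in re.captures_iter(css) {
        let url = caps.name("url").unwrap().as_str();
        let kind = if caps.name("stylesheet").is_some() {
            "stylesheet (@import)"
        } else if caps.name("font").is_some() {
            "font (src:)"
        } else {
            "image (plain url())"
        };
        println!("{}: {}", kind, url);
    }
}
// classify_css_urls("@import url('reset.css'); body { background: url(bg.png); }");
// -> stylesheet (@import): reset.css
// -> image (plain url()): bg.png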
use crate::opts::Options;
use crate::url::{clean_url, parse_data_url};
lazy_static! {
static ref HAS_PROTOCOL: Regex = Regex::new(r"^[a-z0-9]+:").unwrap();
static ref REGEX_URL: Regex = Regex::new(r"^https?://").unwrap();
static ref REGEX_CSS_URL: Regex = Regex::new(CSS_URL_REGEX_STR).unwrap();
}
const MAGIC: [[&[u8]; 2]; 19] = [
const ANSI_COLOR_RED: &'static str = "\x1b[31m";
const ANSI_COLOR_RESET: &'static str = "\x1b[0m";
const MAGIC: [[&[u8]; 2]; 18] = [
// Image
[b"GIF87a", b"image/gif"],
[b"GIF89a", b"image/gif"],
[b"\xFF\xD8\xFF", b"image/jpeg"],
[b"\x89PNG\x0D\x0A\x1A\x0A", b"image/png"],
[b"<?xml ", b"image/svg+xml"],
[b"<svg ", b"image/svg+xml"],
[b"RIFF....WEBPVP8 ", b"image/webp"],
[b"\x00\x00\x01\x00", b"image/x-icon"],
@ -68,142 +33,367 @@ const MAGIC: [[&[u8]; 2]; 19] = [
[b"....moov", b"video/quicktime"],
[b"\x1A\x45\xDF\xA3", b"video/webm"],
];
const PLAINTEXT_MEDIA_TYPES: &[&str] = &[
"application/javascript",
"application/json",
"image/svg+xml",
];
pub fn data_to_dataurl(mime: &str, data: &[u8]) -> String {
let mimetype = if mime.is_empty() {
detect_mimetype(data)
} else {
mime.to_string()
};
format!("data:{};base64,{}", mimetype, encode(data))
}
pub fn detect_mimetype(data: &[u8]) -> String {
for item in MAGIC.iter() {
if data.starts_with(item[0]) {
return String::from_utf8(item[1].to_vec()).unwrap();
pub fn detect_media_type(data: &[u8], url: &Url) -> String {
// First, attempt to read the file's header
for magic_item in MAGIC.iter() {
if data.starts_with(magic_item[0]) {
return String::from_utf8(magic_item[1].to_vec()).unwrap();
}
}
"".to_owned()
// If header didn't match any known magic signatures,
// try to guess media type from file name
let parts: Vec<&str> = url.path().split('/').collect();
detect_media_type_by_file_name(parts.last().unwrap())
}
pub fn url_has_protocol<T: AsRef<str>>(url: T) -> bool {
HAS_PROTOCOL.is_match(url.as_ref().to_lowercase().as_str())
}
pub fn detect_media_type_by_file_name(filename: &str) -> String {
let filename_lowercased: &str = &filename.to_lowercase();
let parts: Vec<&str> = filename_lowercased.split('.').collect();
pub fn is_data_url<T: AsRef<str>>(url: T) -> Result<bool, ParseError> {
Url::parse(url.as_ref()).and_then(|u| Ok(u.scheme() == "data"))
}
pub fn is_valid_url<T: AsRef<str>>(path: T) -> bool {
REGEX_URL.is_match(path.as_ref())
}
pub fn resolve_url<T: AsRef<str>, U: AsRef<str>>(from: T, to: U) -> Result<String, ParseError> {
let result = if is_valid_url(to.as_ref()) {
to.as_ref().to_string()
} else {
Url::parse(from.as_ref())?
.join(to.as_ref())?
.as_ref()
.to_string()
let mime: &str = match parts.last() {
Some(v) => match *v {
"avi" => "video/avi",
"bmp" => "image/bmp",
"css" => "text/css",
"flac" => "audio/flac",
"gif" => "image/gif",
"htm" | "html" => "text/html",
"ico" => "image/x-icon",
"jpeg" | "jpg" => "image/jpeg",
"js" => "application/javascript",
"json" => "application/json",
"mp3" => "audio/mpeg",
"mp4" | "m4v" => "video/mp4",
"ogg" => "audio/ogg",
"ogv" => "video/ogg",
"pdf" => "application/pdf",
"png" => "image/png",
"svg" => "image/svg+xml",
"swf" => "application/x-shockwave-flash",
"tif" | "tiff" => "image/tiff",
"txt" => "text/plain",
"wav" => "audio/wav",
"webp" => "image/webp",
"woff" => "font/woff",
"woff2" => "font/woff2",
"xml" => "text/xml",
&_ => "",
},
None => "",
};
Ok(result)
mime.to_string()
}
pub fn resolve_css_imports(
cache: &mut HashMap<String, String>,
client: &Client,
css_string: &str,
as_dataurl: bool,
href: &str,
opt_no_images: bool,
opt_silent: bool,
) -> String {
let mut resolved_css = String::from(css_string);
pub fn domain_is_within_domain(domain: &str, domain_to_match_against: &str) -> bool {
if domain_to_match_against.len() == 0 {
return false;
}
for link in REGEX_CSS_URL.captures_iter(&css_string) {
let target_link = link.name("url").unwrap().as_str();
if domain_to_match_against == "." {
return true;
}
// Determine the type of link
let is_stylesheet = link.name("stylesheet").is_some();
let is_font = link.name("font").is_some();
let is_image = !is_stylesheet && !is_font;
let domain_partials: Vec<&str> = domain.trim_end_matches(".").rsplit(".").collect();
let domain_to_match_against_partials: Vec<&str> = domain_to_match_against
.trim_end_matches(".")
.rsplit(".")
.collect();
let domain_to_match_against_starts_with_a_dot = domain_to_match_against.starts_with(".");
// Generate absolute URL for content
let embedded_url = match resolve_url(href, target_link) {
Ok(url) => url,
Err(_) => continue, // Malformed URL
let mut i: usize = 0;
let l: usize = std::cmp::max(
domain_partials.len(),
domain_to_match_against_partials.len(),
);
let mut ok: bool = true;
while i < l {
// Exit and return false if we went out of bounds of the domain to match against and it didn't start with a dot
if !domain_to_match_against_starts_with_a_dot
&& domain_to_match_against_partials.len() < i + 1
{
ok = false;
break;
}
let domain_partial = if domain_partials.len() < i + 1 {
""
} else {
domain_partials.get(i).unwrap()
};
let domain_to_match_against_partial = if domain_to_match_against_partials.len() < i + 1 {
""
} else {
domain_to_match_against_partials.get(i).unwrap()
};
// Download the asset. If it's more CSS, resolve that too
let content = if is_stylesheet {
// The link is an @import link
retrieve_asset(
cache,
client,
&embedded_url,
false, // Formatting as data URL will be done later
"text/css", // Expect CSS
opt_silent,
)
.map(|(content, _)| {
resolve_css_imports(
cache,
client,
&content,
true, // Finally, convert to a dataurl
&embedded_url,
opt_no_images,
opt_silent,
)
})
} else if (is_image && !opt_no_images) || is_font {
// The link is some other, non-@import link
retrieve_asset(
cache,
client,
&embedded_url,
true, // Format as data URL
"", // Unknown MIME type
opt_silent,
)
.map(|(a, _)| a)
} else {
// If it's a datatype that has been opt_no'd out of, replace with
// absolute URL
let parts_match = domain_to_match_against_partial.eq_ignore_ascii_case(domain_partial);
Ok(embedded_url.clone())
if !parts_match && domain_to_match_against_partial.len() != 0 {
ok = false;
break;
}
.unwrap_or_else(|e| {
eprintln!("Warning: {}", e);
// If failed to resolve, replace with absolute URL
embedded_url
});
let replacement = format!("\"{}\"", &content);
let dest = link.name("to_repl").unwrap();
let offset = resolved_css.len() - css_string.len();
let target_range = (dest.start() + offset)..(dest.end() + offset);
resolved_css.replace_range(target_range, &replacement);
i += 1;
}
if as_dataurl {
data_to_dataurl("text/css", resolved_css.as_bytes())
ok
}
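A few illustrative calls (editor's sketch, not part of the diff) showing how domain_is_within_domain() treats a leading dot in the -d/--domains value:
// domain_is_within_domain("ycombinator.com", "ycombinator.com")       -> true  (exact host matches)
// domain_is_within_domain("news.ycombinator.com", "ycombinator.com")  -> false (no leading dot, subdomains excluded)
// domain_is_within_domain("news.ycombinator.com", ".ycombinator.com") -> true  (leading dot matches any subdomain)
// domain_is_within_domain("anything.example", ".")                    -> true  ("." matches every domain)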
pub fn indent(level: u32) -> String {
let mut result: String = String::new();
let mut l: u32 = level;
while l > 0 {
result += " ";
l -= 1;
}
result
}
pub fn is_plaintext_media_type(media_type: &str) -> bool {
media_type.to_lowercase().as_str().starts_with("text/")
|| PLAINTEXT_MEDIA_TYPES.contains(&media_type.to_lowercase().as_str())
}
pub fn parse_content_type(content_type: &str) -> (String, String, bool) {
let mut media_type: String = "text/plain".to_string();
let mut charset: String = "US-ASCII".to_string();
let mut is_base64: bool = false;
// Parse meta data
let content_type_items: Vec<&str> = content_type.split(';').collect();
let mut i: i8 = 0;
for item in &content_type_items {
if i == 0 {
if item.trim().len() > 0 {
media_type = item.trim().to_string();
}
} else {
if item.trim().eq_ignore_ascii_case("base64") {
is_base64 = true;
} else if item.trim().starts_with("charset=") {
charset = item.trim().chars().skip(8).collect();
}
}
i += 1;
}
(media_type, charset, is_base64)
}
pub fn retrieve_asset(
cache: &mut HashMap<String, Vec<u8>>,
client: &Client,
parent_url: &Url,
url: &Url,
options: &Options,
depth: u32,
) -> Result<(Vec<u8>, Url, String, String), reqwest::Error> {
if url.scheme() == "data" {
let (media_type, charset, data) = parse_data_url(url);
Ok((data, url.clone(), media_type, charset))
} else if url.scheme() == "file" {
// Check if parent_url is also a file: URL (if not, then we don't embed the asset)
if parent_url.scheme() != "file" {
if !options.silent {
eprintln!(
"{}{}{} ({}){}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
&url,
"Security Error",
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
// Provoke error
client.get("").send()?;
}
let path_buf: PathBuf = url.to_file_path().unwrap().clone();
let path: &Path = path_buf.as_path();
if path.exists() {
if path.is_dir() {
if !options.silent {
eprintln!(
"{}{}{} (is a directory){}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
&url,
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
// Provoke error
Err(client.get("").send().unwrap_err())
} else {
if !options.silent {
eprintln!("{}{}", indent(depth).as_str(), &url);
}
let file_blob: Vec<u8> = fs::read(&path).expect("Unable to read file");
Ok((
file_blob.clone(),
url.clone(),
detect_media_type(&file_blob, url),
"".to_string(),
))
}
} else {
if !options.silent {
eprintln!(
"{}{}{} (not found){}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
&url,
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
// Provoke error
Err(client.get("").send().unwrap_err())
}
} else {
resolved_css
}
}
let cache_key: String = clean_url(url.clone()).as_str().to_string();
pub fn clean_url<T: AsRef<str>>(url: T) -> String {
let mut result = Url::parse(url.as_ref()).unwrap();
// Clear fragment
result.set_fragment(None);
// Get rid of stray question mark
if result.query() == Some("") {
result.set_query(None);
if cache.contains_key(&cache_key) {
// URL is in cache, we get and return it
if !options.silent {
eprintln!("{}{} (from cache)", indent(depth).as_str(), &url);
}
Ok((
cache.get(&cache_key).unwrap().to_vec(),
url.clone(),
"".to_string(),
"".to_string(),
))
} else {
if let Some(domains) = &options.domains {
let domain_matches = domains
.iter()
.any(|d| domain_is_within_domain(url.host_str().unwrap(), &d.trim()));
if (options.exclude_domains && domain_matches)
|| (!options.exclude_domains && !domain_matches)
{
return Err(client.get("").send().unwrap_err());
}
}
// URL not in cache, we retrieve the file
match client.get(url.as_str()).send() {
Ok(response) => {
if !options.ignore_errors && response.status() != reqwest::StatusCode::OK {
if !options.silent {
eprintln!(
"{}{}{} ({}){}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
&url,
response.status(),
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
// Provoke error
return Err(client.get("").send().unwrap_err());
}
let response_url: Url = response.url().clone();
if !options.silent {
if url.as_str() == response_url.as_str() {
eprintln!("{}{}", indent(depth).as_str(), &url);
} else {
eprintln!("{}{} -> {}", indent(depth).as_str(), &url, &response_url);
}
}
let new_cache_key: String = clean_url(response_url.clone()).to_string();
// Attempt to obtain media type and charset by reading Content-Type header
let content_type: &str = response
.headers()
.get(CONTENT_TYPE)
.and_then(|header| header.to_str().ok())
.unwrap_or("");
let (media_type, charset, _is_base64) = parse_content_type(&content_type);
// Convert response into a byte array
let mut data: Vec<u8> = vec![];
match response.bytes() {
Ok(b) => {
data = b.to_vec();
}
Err(error) => {
if !options.silent {
eprintln!(
"{}{}{}{}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
error,
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
}
}
// Add retrieved resource to cache
cache.insert(new_cache_key, data.clone());
// Return
Ok((data, response_url, media_type, charset))
}
Err(error) => {
if !options.silent {
eprintln!(
"{}{}{} ({}){}",
indent(depth).as_str(),
if options.no_color { "" } else { ANSI_COLOR_RED },
&url,
error,
if options.no_color {
""
} else {
ANSI_COLOR_RESET
},
);
}
Err(client.get("").send().unwrap_err())
}
}
}
}
result.to_string()
}


@ -0,0 +1,19 @@
<!doctype html>
<html lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Local HTML file</title>
<link href="local-style.css" rel="stylesheet" type="text/css" />
<link href="local-style-does-not-exist.css" rel="stylesheet" type="text/css" />
</head>
<body>
<img src="monolith.png" alt="" />
<a href="//local-file.html">Tricky href</a>
<a href="https://github.com/Y2Z/monolith">Remote URL</a>
<script src="local-script.js"></script>
</body>
</html>


@ -0,0 +1,2 @@
document.body.style.backgroundColor = "green";
document.body.style.color = "red";


@ -0,0 +1,4 @@
body {
background-color: #000;
color: #fff;
}


@ -0,0 +1,11 @@
<style>
@charset 'UTF-8';
@import 'style.css';
@import url(style.css);
@import url('style.css');
</style>


@ -0,0 +1 @@
body{background-color:#000;color:#fff}


@ -0,0 +1,23 @@
<!doctype html>
<html lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<title>Attempt to import CSS via data URL asset</title>
<style>
body {
background-color: white;
color: black;
}
</style>
<link href="data:text/css;base64,QGltcG9ydCAic3R5bGUuY3NzIjsK" rel="stylesheet" type="text/css" />
</head>
<body>
<p>If you see a pink background with white foreground then we're in trouble</p>
</body>
</html>


@ -0,0 +1,4 @@
body {
background-color: pink;
color: white;
}


@ -0,0 +1,17 @@
<!doctype html>
<html lang="en">
<head>
<title>Local HTML file</title>
<link href="style.css" rel="stylesheet" type="text/css" integrity="sha512-IWaCTORHkRhOWzcZeILSVmV6V6gPTHgNem6o6rsFAyaKTieDFkeeMrWjtO0DuWrX3bqZY46CVTZXUu0mia0qXQ==" crossorigin="anonymous" />
<link href="style.css" rel="stylesheet" type="text/css" integrity="sha512-vWBzl4NE9oIg8NFOPAyOZbaam0UXWr6aDHPaY2kodSzAFl+mKoj/RMNc6C31NDqK4mE2i68IWxYWqWJPLCgPOw==" crossorigin="anonymous" />
</head>
<body>
<p>This page should have black background and white foreground, but only when served via http: (not via file:)</p>
<script src="script.js" integrity="sha256-ecrEsYh3+ICCX8BCrNSotXgI5534282JwJjx8Q9ZWLc="></script>
<script src="script.js" integrity="sha256-6idk9dK0bOkVdG7Oz4/0YLXSJya8xZHqbRZKMhYrt6o="></script>
</body>
</html>


@ -0,0 +1,3 @@
function noop() {
console.log("monolith");
}


@ -0,0 +1,4 @@
body {
background-color: #000;
color: #FFF;
}


@ -0,0 +1,5 @@
<svg version="1.1" baseProfile="full" width="300" height="200" xmlns="http://www.w3.org/2000/svg">
<rect width="100%" height="100%" fill="red" />
<circle cx="150" cy="100" r="80" fill="green" />
<text x="150" y="125" font-size="60" text-anchor="middle" fill="white">SVG</text>
</svg>



@ -0,0 +1 @@
<body><noscript><img src="image.svg" /></noscript></body>


@ -0,0 +1 @@
<body><noscript><h1>JS is not active</h1><noscript><img src="image.svg" /></noscript></noscript></body>


@ -0,0 +1 @@
<body><noscript><script>alert(1);</script><img src="image.svg" /></noscript></body>


@ -0,0 +1,5 @@
<svg version="1.1" baseProfile="full" width="300" height="200" xmlns="http://www.w3.org/2000/svg">
<rect width="100%" height="100%" fill="red" />
<circle cx="150" cy="100" r="80" fill="green" />
<text x="150" y="125" font-size="60" text-anchor="middle" fill="white">SVG</text>
</svg>



@ -0,0 +1 @@
<div style="background-image: url('image.svg')"></div>


@ -0,0 +1,9 @@
<html>
<head>
<meta http-equiv="content-type" content="text/html;charset=GB2312"/>
<title>近七成人减少线下需求 银行数字化转型提速--经济·科技--人民网 </title>
</head>
<body>
<h1>近七成人减少线下需求 银行数字化转型提速</h1>
</body>
</html>


@ -0,0 +1,8 @@
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
</head>
<body>
&copy; Some Company
</body>
</html>

tests/cli/base_url.rs (new file, 115 lines)

@ -0,0 +1,115 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use std::env;
use std::process::Command;
#[test]
fn add_new_when_provided() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-b")
.arg("http://localhost:8000/")
.arg("data:text/html,Hello%2C%20World!")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain newly added base URL
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<base href=\"http://localhost:8000/\"></base>\
</head><body>Hello, World!</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn keep_existing_when_none_provided() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("data:text/html,<base href=\"http://localhost:8000/\" />Hello%2C%20World!")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain the existing base URL, kept as-is
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<base href=\"http://localhost:8000/\">\
</head><body>Hello, World!</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn override_existing_when_provided() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-b")
.arg("http://localhost/")
.arg("data:text/html,<base href=\"http://localhost:8000/\" />Hello%2C%20World!")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain the newly provided base URL
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<base href=\"http://localhost/\">\
</head><body>Hello, World!</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn set_existing_to_empty_when_empty_provided() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-b")
.arg("")
.arg("data:text/html,<base href=\"http://localhost:8000/\" />Hello%2C%20World!")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain an empty base URL
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<base href=\"\">\
</head><body>Hello, World!</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}

tests/cli/basic.rs (new file, 144 lines)

@ -0,0 +1,144 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use std::env;
use std::fs;
use std::path::Path;
use std::process::{Command, Stdio};
use url::Url;
#[test]
fn print_help_information() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd.arg("-h").output().unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain program name, version, and usage information
// TODO
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn print_version() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd.arg("-V").output().unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain program name and version
assert_eq!(
String::from_utf8_lossy(&out.stdout),
format!("{} {}\n", env!("CARGO_PKG_NAME"), env!("CARGO_PKG_VERSION"))
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn stdin_target_input() {
let mut echo = Command::new("echo")
.arg("Hello from STDIN")
.stdout(Stdio::piped())
.spawn()
.unwrap();
let echo_out = echo.stdout.take().unwrap();
echo.wait().unwrap();
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
cmd.stdin(echo_out);
let out = cmd.arg("-M").arg("-").output().unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML created out of STDIN
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head></head><body>Hello from STDIN\n</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn css_import_string() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/css/index.html");
let path_css: &Path = Path::new("tests/_data_/css/style.css");
assert!(path_html.is_file());
assert!(path_css.is_file());
let out = cmd.arg("-M").arg(path_html.as_os_str()).output().unwrap();
// STDERR should list files that got retrieved
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_css}\n \
{file_url_css}\n \
{file_url_css}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_css = Url::from_file_path(fs::canonicalize(&path_css).unwrap()).unwrap(),
)
);
// STDOUT should contain embedded CSS url()'s
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head><style>\n\n @charset \"UTF-8\";\n\n @import \"data:text/css;base64,Ym9keXtiYWNrZ3JvdW5kLWNvbG9yOiMwMDA7Y29sb3I6I2ZmZn0K\";\n\n @import url(\"data:text/css;base64,Ym9keXtiYWNrZ3JvdW5kLWNvbG9yOiMwMDA7Y29sb3I6I2ZmZn0K\");\n\n @import url(\"data:text/css;base64,Ym9keXtiYWNrZ3JvdW5kLWNvbG9yOiMwMDA7Y29sb3I6I2ZmZn0K\");\n\n</style>\n</head><body></body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use assert_cmd::prelude::*;
use std::env;
use std::process::Command;
#[test]
fn bad_input_empty_target() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd.arg("").output().unwrap();
// STDERR should contain error description
assert_eq!(
String::from_utf8_lossy(&out.stderr),
"No target specified\n"
);
// STDOUT should be empty
assert_eq!(String::from_utf8_lossy(&out.stdout), "");
// Exit code should be 1
out.assert().code(1);
}
}

tests/cli/data_url.rs (new file, 233 lines)

@ -0,0 +1,233 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use std::env;
use std::process::Command;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn isolate_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-I")
.arg("data:text/html,Hello%2C%20World!")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain isolated HTML
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src 'unsafe-eval' 'unsafe-inline' data:;\"></meta>\
</head><body>Hello, World!</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn remove_css_from_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-c")
.arg("data:text/html,<style>body{background-color:pink}</style>Hello")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no CSS
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"style-src 'none';\"></meta>\
<style></style>\
</head><body>Hello</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn remove_fonts_from_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-F")
.arg("data:text/html,<style>@font-face { font-family: myFont; src: url(font.woff); }</style>Hi")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no web fonts
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"font-src 'none';\"></meta>\
<style></style>\
</head><body>Hi</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn remove_frames_from_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-f")
.arg("data:text/html,<iframe src=\"https://duckduckgo.com\"></iframe>Hi")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no iframes
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"frame-src 'none'; child-src 'none';\"></meta>\
</head><body><iframe src=\"\"></iframe>Hi</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn remove_images_from_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-i")
.arg("data:text/html,<img src=\"https://google.com\"/>Hi")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no images
assert_eq!(
String::from_utf8_lossy(&out.stdout),
format!(
"<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"img-src data:;\"></meta>\
</head>\
<body>\
<img src=\"{empty_image}\">\
Hi\
</body>\
</html>\n",
empty_image = EMPTY_IMAGE_DATA_URL,
)
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn remove_js_from_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-j")
.arg("data:text/html,<script>alert(2)</script>Hi")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no JS
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"script-src 'none';\"></meta>\
<script></script></head>\
<body>Hi</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use assert_cmd::prelude::*;
use std::env;
use std::process::Command;
#[test]
fn bad_input_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd.arg("data:,Hello%2C%20World!").output().unwrap();
// STDERR should contain error description
assert_eq!(
String::from_utf8_lossy(&out.stderr),
"Unsupported document media type\n"
);
// STDOUT should be empty
assert_eq!(String::from_utf8_lossy(&out.stdout), "");
// Exit code should be 1
out.assert().code(1);
}
#[test]
fn security_disallow_local_assets_within_data_url_targets() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("data:text/html,%3Cscript%20src=\"src/tests/data/basic/local-script.js\"%3E%3C/script%3E")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML with no JS in it
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head><script src=\"data:application/javascript;base64,\"></script></head><body></body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}

tests/cli/local_files.rs (new file, 271 lines)

@ -0,0 +1,271 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use std::env;
use std::fs;
use std::path::{Path, MAIN_SEPARATOR};
use std::process::Command;
use url::Url;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn local_file_target_input_relative_target_path() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let cwd_normalized: String = env::current_dir()
.unwrap()
.to_str()
.unwrap()
.replace("\\", "/");
let out = cmd
.arg("-M")
.arg(format!(
"tests{s}_data_{s}basic{s}local-file.html",
s = MAIN_SEPARATOR
))
.output()
.unwrap();
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
// STDERR should contain list of retrieved file URLs, two missing
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file}{cwd}/tests/_data_/basic/local-file.html\n \
{file}{cwd}/tests/_data_/basic/local-style.css\n \
{file}{cwd}/tests/_data_/basic/local-style-does-not-exist.css (not found)\n \
{file}{cwd}/tests/_data_/basic/monolith.png (not found)\n \
{file}{cwd}/tests/_data_/basic/local-script.js\n\
",
file = file_url_protocol,
cwd = cwd_normalized
)
);
// STDOUT should contain HTML from the local file
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"\
<!DOCTYPE html><html lang=\"en\"><head>\n \
<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">\n \
<title>Local HTML file</title>\n \
<link href=\"data:text/css;base64,Ym9keSB7CiAgICBiYWNrZ3JvdW5kLWNvbG9yOiAjMDAwOwogICAgY29sb3I6ICNmZmY7Cn0K\" rel=\"stylesheet\" type=\"text/css\">\n \
<link rel=\"stylesheet\" type=\"text/css\">\n</head>\n\n<body>\n \
<img alt=\"\">\n \
<a href=\"file://local-file.html/\">Tricky href</a>\n \
<a href=\"https://github.com/Y2Z/monolith\">Remote URL</a>\n \
<script src=\"data:application/javascript;base64,ZG9jdW1lbnQuYm9keS5zdHlsZS5iYWNrZ3JvdW5kQ29sb3IgPSAiZ3JlZW4iOwpkb2N1bWVudC5ib2R5LnN0eWxlLmNvbG9yID0gInJlZCI7Cg==\"></script>\n\n\n\n\
</body></html>\n\
"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn local_file_target_input_absolute_target_path() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/basic/local-file.html");
let out = cmd
.arg("-M")
.arg("-Ijci")
.arg(path_html.as_os_str())
.output()
.unwrap();
// STDERR should contain only the target file
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"{file_url_html}\n",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML from the local file
assert_eq!(
String::from_utf8_lossy(&out.stdout),
format!(
"\
<!DOCTYPE html><html lang=\"en\"><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src 'unsafe-eval' 'unsafe-inline' data:; style-src 'none'; script-src 'none'; img-src data:;\"></meta>\n \
<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">\n \
<title>Local HTML file</title>\n \
<link rel=\"stylesheet\" type=\"text/css\">\n \
<link rel=\"stylesheet\" type=\"text/css\">\n</head>\n\n<body>\n \
<img src=\"{empty_image}\" alt=\"\">\n \
<a href=\"file://local-file.html/\">Tricky href</a>\n \
<a href=\"https://github.com/Y2Z/monolith\">Remote URL</a>\n \
<script></script>\n\n\n\n\
</body></html>\n\
",
empty_image = EMPTY_IMAGE_DATA_URL
)
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn local_file_url_target_input() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let cwd_normalized: String = env::current_dir()
.unwrap()
.to_str()
.unwrap()
.replace("\\", "/");
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
let out = cmd
.arg("-M")
.arg("-cji")
.arg(format!(
"{file}{cwd}/tests/_data_/basic/local-file.html",
file = file_url_protocol,
cwd = cwd_normalized,
))
.output()
.unwrap();
// STDERR should contain list of retrieved file URLs
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"{file}{cwd}/tests/_data_/basic/local-file.html\n",
file = file_url_protocol,
cwd = cwd_normalized,
)
);
// STDOUT should contain HTML from the local file
assert_eq!(
String::from_utf8_lossy(&out.stdout),
format!(
"\
<!DOCTYPE html><html lang=\"en\"><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"style-src 'none'; script-src 'none'; img-src data:;\"></meta>\n \
<meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">\n \
<title>Local HTML file</title>\n \
<link rel=\"stylesheet\" type=\"text/css\">\n \
<link rel=\"stylesheet\" type=\"text/css\">\n</head>\n\n<body>\n \
<img src=\"{empty_image}\" alt=\"\">\n \
<a href=\"file://local-file.html/\">Tricky href</a>\n \
<a href=\"https://github.com/Y2Z/monolith\">Remote URL</a>\n \
<script></script>\n\n\n\n\
</body></html>\n\
",
empty_image = EMPTY_IMAGE_DATA_URL
)
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn embed_file_url_local_asset_within_style_attribute() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/svg/index.html");
let path_svg: &Path = Path::new("tests/_data_/svg/image.svg");
let out = cmd.arg("-M").arg(path_html.as_os_str()).output().unwrap();
// STDERR should list files that got retrieved
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_svg}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_svg = Url::from_file_path(fs::canonicalize(&path_svg).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML with a data URL for background-image in it
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head></head><body><div style=\"background-image: url(&quot;data:image/svg+xml;base64,PHN2ZyB2ZXJzaW9uPSIxLjEiIGJhc2VQcm9maWxlPSJmdWxsIiB3aWR0aD0iMzAwIiBoZWlnaHQ9IjIwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KICAgIDxyZWN0IHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InJlZCIgLz4KICAgIDxjaXJjbGUgY3g9IjE1MCIgY3k9IjEwMCIgcj0iODAiIGZpbGw9ImdyZWVuIiAvPgogICAgPHRleHQgeD0iMTUwIiB5PSIxMjUiIGZvbnQtc2l6ZT0iNjAiIHRleHQtYW5jaG9yPSJtaWRkbGUiIGZpbGw9IndoaXRlIj5TVkc8L3RleHQ+Cjwvc3ZnPgo=&quot;)\"></div>\n</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn discard_integrity_for_local_files() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let cwd_normalized: String = env::current_dir()
.unwrap()
.to_str()
.unwrap()
.replace("\\", "/");
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
let out = cmd
.arg("-M")
.arg("-i")
.arg(format!(
"{file}{cwd}/tests/_data_/integrity/index.html",
file = file_url_protocol,
cwd = cwd_normalized,
))
.output()
.unwrap();
// STDERR should contain list of retrieved file URLs
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file}{cwd}/tests/_data_/integrity/index.html\n \
{file}{cwd}/tests/_data_/integrity/style.css\n \
{file}{cwd}/tests/_data_/integrity/style.css\n \
{file}{cwd}/tests/_data_/integrity/script.js\n \
{file}{cwd}/tests/_data_/integrity/script.js\n\
",
file = file_url_protocol,
cwd = cwd_normalized,
)
);
// STDOUT should contain HTML from the local file; integrity attributes should be missing
assert_eq!(
String::from_utf8_lossy(&out.stdout),
format!(
"\
<!DOCTYPE html><html lang=\"en\"><head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"img-src data:;\"></meta>\n \
<title>Local HTML file</title>\n \
<link href=\"data:text/css;base64,Ym9keSB7CiAgICBiYWNrZ3JvdW5kLWNvbG9yOiAjMDAwOwogICAgY29sb3I6ICNGRkY7Cn0K\" rel=\"stylesheet\" type=\"text/css\" crossorigin=\"anonymous\">\n \
<link href=\"style.css\" rel=\"stylesheet\" type=\"text/css\" crossorigin=\"anonymous\">\n</head>\n\n<body>\n \
<p>This page should have black background and white foreground, but only when served via http: (not via file:)</p>\n \
<script src=\"data:application/javascript;base64,ZnVuY3Rpb24gbm9vcCgpIHsKICAgIGNvbnNvbGUubG9nKCJtb25vbGl0aCIpOwp9Cg==\"></script>\n \
<script src=\"script.js\"></script>\n\n\n\n\
</body></html>\n\
"
)
);
// Exit code should be 0
out.assert().code(0);
}
}
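The file_url_protocol / cwd_normalized idiom used throughout these CLI tests mirrors how url::Url renders absolute file paths; a small sketch (Unix-like path, purely illustrative):

use url::Url;

fn main() {
    // An absolute Unix path becomes file:///..., which is why the tests join
    // "file://" with a cwd that already starts with "/". On Windows the drive
    // letter makes the rendered form file:///C:/..., hence the "file:///" prefix.
    let url = Url::from_file_path("/tmp/example.html").unwrap();
    println!("{}", url); // file:///tmp/example.html
}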

6
tests/cli/mod.rs Normal file

@ -0,0 +1,6 @@
mod base_url;
mod basic;
mod data_url;
mod local_files;
mod noscript;
mod unusual_encodings;

170
tests/cli/noscript.rs Normal file

@ -0,0 +1,170 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use std::env;
use std::fs;
use std::path::Path;
use std::process::Command;
use url::Url;
#[test]
fn parse_noscript_contents() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/noscript/index.html");
let path_svg: &Path = Path::new("tests/_data_/noscript/image.svg");
let out = cmd.arg("-M").arg(path_html.as_os_str()).output().unwrap();
// STDERR should contain target HTML and embedded SVG files
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_svg}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_svg = Url::from_file_path(fs::canonicalize(&path_svg).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML with the SVG embedded inside the NOSCRIPT element
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head></head><body><noscript><img src=\"data:image/svg+xml;base64,PHN2ZyB2ZXJzaW9uPSIxLjEiIGJhc2VQcm9maWxlPSJmdWxsIiB3aWR0aD0iMzAwIiBoZWlnaHQ9IjIwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KICAgIDxyZWN0IHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InJlZCIgLz4KICAgIDxjaXJjbGUgY3g9IjE1MCIgY3k9IjEwMCIgcj0iODAiIGZpbGw9ImdyZWVuIiAvPgogICAgPHRleHQgeD0iMTUwIiB5PSIxMjUiIGZvbnQtc2l6ZT0iNjAiIHRleHQtYW5jaG9yPSJtaWRkbGUiIGZpbGw9IndoaXRlIj5TVkc8L3RleHQ+Cjwvc3ZnPgo=\"></noscript>\n</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn unwrap_noscript_contents() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/noscript/index.html");
let path_svg: &Path = Path::new("tests/_data_/noscript/image.svg");
let out = cmd.arg("-Mn").arg(path_html.as_os_str()).output().unwrap();
// STDERR should contain target HTML and embedded SVG files
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_svg}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_svg = Url::from_file_path(fs::canonicalize(&path_svg).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML with the NOSCRIPT contents unwrapped into comments
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head></head><body><!--noscript--><img src=\"data:image/svg+xml;base64,PHN2ZyB2ZXJzaW9uPSIxLjEiIGJhc2VQcm9maWxlPSJmdWxsIiB3aWR0aD0iMzAwIiBoZWlnaHQ9IjIwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KICAgIDxyZWN0IHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InJlZCIgLz4KICAgIDxjaXJjbGUgY3g9IjE1MCIgY3k9IjEwMCIgcj0iODAiIGZpbGw9ImdyZWVuIiAvPgogICAgPHRleHQgeD0iMTUwIiB5PSIxMjUiIGZvbnQtc2l6ZT0iNjAiIHRleHQtYW5jaG9yPSJtaWRkbGUiIGZpbGw9IndoaXRlIj5TVkc8L3RleHQ+Cjwvc3ZnPgo=\"><!--/noscript-->\n</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn unwrap_noscript_contents_nested() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/noscript/nested.html");
let path_svg: &Path = Path::new("tests/_data_/noscript/image.svg");
let out = cmd.arg("-Mn").arg(path_html.as_os_str()).output().unwrap();
// STDERR should contain target HTML and embedded SVG files
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_svg}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_svg = Url::from_file_path(fs::canonicalize(&path_svg).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML with the nested NOSCRIPT contents unwrapped into comments
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head></head><body><!--noscript--><h1>JS is not active</h1><!--noscript--><img src=\"data:image/svg+xml;base64,PHN2ZyB2ZXJzaW9uPSIxLjEiIGJhc2VQcm9maWxlPSJmdWxsIiB3aWR0aD0iMzAwIiBoZWlnaHQ9IjIwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KICAgIDxyZWN0IHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InJlZCIgLz4KICAgIDxjaXJjbGUgY3g9IjE1MCIgY3k9IjEwMCIgcj0iODAiIGZpbGw9ImdyZWVuIiAvPgogICAgPHRleHQgeD0iMTUwIiB5PSIxMjUiIGZvbnQtc2l6ZT0iNjAiIHRleHQtYW5jaG9yPSJtaWRkbGUiIGZpbGw9IndoaXRlIj5TVkc8L3RleHQ+Cjwvc3ZnPgo=\"><!--/noscript--><!--/noscript-->\n</body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn unwrap_noscript_contents_with_script() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let path_html: &Path = Path::new("tests/_data_/noscript/script.html");
let path_svg: &Path = Path::new("tests/_data_/noscript/image.svg");
let out = cmd.arg("-Mn").arg(path_html.as_os_str()).output().unwrap();
// STDERR should contain target HTML and embedded SVG files
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"\
{file_url_html}\n \
{file_url_svg}\n\
",
file_url_html = Url::from_file_path(fs::canonicalize(&path_html).unwrap()).unwrap(),
file_url_svg = Url::from_file_path(fs::canonicalize(&path_svg).unwrap()).unwrap(),
)
);
// STDOUT should contain HTML with the NOSCRIPT contents unwrapped into comments
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html>\
<head></head>\
<body>\
<!--noscript-->\
<img src=\"data:image/svg+xml;base64,PHN2ZyB2ZXJzaW9uPSIxLjEiIGJhc2VQcm9maWxlPSJmdWxsIiB3aWR0aD0iMzAwIiBoZWlnaHQ9IjIwMCIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KICAgIDxyZWN0IHdpZHRoPSIxMDAlIiBoZWlnaHQ9IjEwMCUiIGZpbGw9InJlZCIgLz4KICAgIDxjaXJjbGUgY3g9IjE1MCIgY3k9IjEwMCIgcj0iODAiIGZpbGw9ImdyZWVuIiAvPgogICAgPHRleHQgeD0iMTUwIiB5PSIxMjUiIGZvbnQtc2l6ZT0iNjAiIHRleHQtYW5jaG9yPSJtaWRkbGUiIGZpbGw9IndoaXRlIj5TVkc8L3RleHQ+Cjwvc3ZnPgo=\">\
<!--/noscript-->\n\
</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn unwrap_noscript_contents_attr_data_url() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-n")
.arg("data:text/html,<noscript class=\"\">test</noscript>")
.output()
.unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain unwrapped contents of NOSCRIPT element
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html><head><!--noscript class=\"\"-->test<!--/noscript--></head><body></body></html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}

239
tests/cli/unusual_encodings.rs Normal file

@ -0,0 +1,239 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use assert_cmd::prelude::*;
use encoding_rs::Encoding;
use std::env;
use std::path::MAIN_SEPARATOR;
use std::process::{Command, Stdio};
#[test]
fn properly_save_document_with_gb2312() {
let cwd = env::current_dir().unwrap();
let cwd_normalized: String = cwd.to_str().unwrap().replace("\\", "/");
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg(format!(
"tests{s}_data_{s}unusual_encodings{s}gb2312.html",
s = MAIN_SEPARATOR
))
.output()
.unwrap();
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
// STDERR should contain only the target file
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"{file}{cwd}/tests/_data_/unusual_encodings/gb2312.html\n",
file = file_url_protocol,
cwd = cwd_normalized,
)
);
// STDOUT should contain the original document without any modifications
let s: String;
if let Some(encoding) = Encoding::for_label(b"gb2312") {
let (string, _, _) = encoding.decode(&out.stdout);
s = string.to_string();
} else {
s = String::from_utf8_lossy(&out.stdout).to_string();
}
assert_eq!(
s,
"<html>\
<head>\n \
<meta http-equiv=\"content-type\" content=\"text/html;charset=GB2312\">\n \
<title>线\u{3000}--·-- </title>\n\
</head>\n\
<body>\n \
<h1>线\u{3000}</h1>\n\n\n\
</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn properly_save_document_with_gb2312_from_stdin() {
let mut echo = Command::new("cat")
.arg(format!(
"tests{s}_data_{s}unusual_encodings{s}gb2312.html",
s = MAIN_SEPARATOR
))
.stdout(Stdio::piped())
.spawn()
.unwrap();
let echo_out = echo.stdout.take().unwrap();
echo.wait().unwrap();
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
cmd.stdin(echo_out);
let out = cmd.arg("-M").arg("-").output().unwrap();
// STDERR should be empty
assert_eq!(String::from_utf8_lossy(&out.stderr), "");
// STDOUT should contain HTML created out of STDIN
let s: String;
if let Some(encoding) = Encoding::for_label(b"gb2312") {
let (string, _, _) = encoding.decode(&out.stdout);
s = string.to_string();
} else {
s = String::from_utf8_lossy(&out.stdout).to_string();
}
assert_eq!(
s,
"<html>\
<head>\n \
<meta http-equiv=\"content-type\" content=\"text/html;charset=GB2312\">\n \
<title>线\u{3000}--·-- </title>\n\
</head>\n\
<body>\n \
<h1>线\u{3000}</h1>\n\n\n\
</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn properly_save_document_with_gb2312_custom_charset() {
let cwd = env::current_dir().unwrap();
let cwd_normalized: String = cwd.to_str().unwrap().replace("\\", "/");
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-C")
.arg("utf8")
.arg(format!(
"tests{s}_data_{s}unusual_encodings{s}gb2312.html",
s = MAIN_SEPARATOR
))
.output()
.unwrap();
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
// STDERR should contain only the target file
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"{file}{cwd}/tests/_data_/unusual_encodings/gb2312.html\n",
file = file_url_protocol,
cwd = cwd_normalized,
)
);
// STDOUT should contain the original document without any modifications
assert_eq!(
String::from_utf8_lossy(&out.stdout).to_string(),
"<html>\
<head>\n \
<meta http-equiv=\"content-type\" content=\"text/html;charset=utf8\">\n \
<title>线\u{3000}--·-- </title>\n\
</head>\n\
<body>\n \
<h1>线\u{3000}</h1>\n\n\n\
</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
#[test]
fn properly_save_document_with_gb2312_custom_charset_bad() {
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg("-C")
.arg("utf0")
.arg(format!(
"tests{s}_data_{s}unusual_encodings{s}gb2312.html",
s = MAIN_SEPARATOR
))
.output()
.unwrap();
// STDERR should contain error message
assert_eq!(
String::from_utf8_lossy(&out.stderr),
"Unknown encoding: utf0\n"
);
// STDOUT should be empty
assert_eq!(String::from_utf8_lossy(&out.stdout).to_string(), "");
// Exit code should be 1
out.assert().code(1);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use assert_cmd::prelude::*;
use std::env;
use std::path::MAIN_SEPARATOR;
use std::process::Command;
#[test]
fn change_iso88591_to_utf8_to_properly_display_html_entities() {
let cwd = env::current_dir().unwrap();
let cwd_normalized: String = cwd.to_str().unwrap().replace("\\", "/");
let mut cmd = Command::cargo_bin(env!("CARGO_PKG_NAME")).unwrap();
let out = cmd
.arg("-M")
.arg(format!(
"tests{s}_data_{s}unusual_encodings{s}iso-8859-1.html",
s = MAIN_SEPARATOR
))
.output()
.unwrap();
let file_url_protocol: &str = if cfg!(windows) { "file:///" } else { "file://" };
// STDERR should contain only the target file
assert_eq!(
String::from_utf8_lossy(&out.stderr),
format!(
"{file}{cwd}/tests/_data_/unusual_encodings/iso-8859-1.html\n",
file = file_url_protocol,
cwd = cwd_normalized,
)
);
// STDOUT should contain original document but with UTF-8 charset
assert_eq!(
String::from_utf8_lossy(&out.stdout),
"<html>\
<head>\n \
<meta http-equiv=\"Content-Type\" content=\"text/html; charset=iso-8859-1\">\n \
</head>\n \
<body>\n \
<EFBFBD> Some Company\n \
\n\n</body>\
</html>\n"
);
// Exit code should be 0
out.assert().code(0);
}
}
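The replacement character in the expected output above is what a lone high ISO-8859-1 byte turns into when the bytes are read as UTF-8; a small sketch with an illustrative byte (not the actual fixture content):

fn main() {
    // 0xA9 is printable in ISO-8859-1 but is not valid UTF-8 on its own, so a
    // lossy UTF-8 decode substitutes U+FFFD, the character seen above.
    let bytes = [0xA9u8, b' ', b'S', b'o', b'm', b'e'];
    println!("{}", String::from_utf8_lossy(&bytes));
}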

371
tests/css/embed_css.rs Normal file

@ -0,0 +1,371 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use reqwest::blocking::Client;
use reqwest::Url;
use std::collections::HashMap;
use monolith::css;
use monolith::opts::Options;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn empty_input() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("data:,").unwrap();
let options = Options::default();
assert_eq!(
css::embed_css(cache, &client, &document_url, "", &options, 0),
""
);
}
#[test]
fn trim_if_empty() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let options = Options::default();
assert_eq!(
css::embed_css(cache, &client, &document_url, "\t \t ", &options, 0,),
""
);
}
#[test]
fn style_exclude_unquoted_images() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.no_images = true;
options.silent = true;
const STYLE: &str = "/* border: none;*/\
background-image: url(https://somewhere.com/bg.png); \
list-style: url(/assets/images/bullet.svg);\
width:99.998%; \
margin-top: -20px; \
line-height: -1; \
height: calc(100vh - 10pt)";
assert_eq!(
css::embed_css(cache, &client, &document_url, &STYLE, &options, 0,),
format!(
"/* border: none;*/\
background-image: url(\"{empty_image}\"); \
list-style: url(\"{empty_image}\");\
width:99.998%; \
margin-top: -20px; \
line-height: -1; \
height: calc(100vh - 10pt)",
empty_image = EMPTY_IMAGE_DATA_URL
)
);
}
#[test]
fn style_exclude_single_quoted_images() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("data:,").unwrap();
let mut options = Options::default();
options.no_images = true;
options.silent = true;
const STYLE: &str = "/* border: none;*/\
background-image: url('https://somewhere.com/bg.png'); \
list-style: url('/assets/images/bullet.svg');\
width:99.998%; \
margin-top: -20px; \
line-height: -1; \
height: calc(100vh - 10pt)";
assert_eq!(
css::embed_css(cache, &client, &document_url, &STYLE, &options, 0),
format!(
"/* border: none;*/\
background-image: url(\"{empty_image}\"); \
list-style: url(\"{empty_image}\");\
width:99.998%; \
margin-top: -20px; \
line-height: -1; \
height: calc(100vh - 10pt)",
empty_image = EMPTY_IMAGE_DATA_URL
)
);
}
#[test]
fn style_block() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("file:///").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
#id.class-name:not(:nth-child(3n+0)) {\n \
// border: none;\n \
background-image: url(\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAQAAAC1HAwCAAAAC0lEQVR42mNkYAAAAAYAAjCB0C8AAAAASUVORK5CYII=\");\n\
}\n\
\n\
html > body {}";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0),
CSS
);
}
#[test]
fn attribute_selectors() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
[data-value] {
/* Attribute exists */
}
[data-value=\"foo\"] {
/* Attribute has this exact value */
}
[data-value*=\"foo\"] {
/* Attribute value contains this value somewhere in it */
}
[data-value~=\"foo\"] {
/* Attribute has this value in a space-separated list somewhere */
}
[data-value^=\"foo\"] {
/* Attribute value starts with this */
}
[data-value|=\"foo\"] {
/* Attribute value starts with this in a dash-separated list */
}
[data-value$=\"foo\"] {
/* Attribute value ends with this */
}
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0),
CSS
);
}
#[test]
fn import_string() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
@charset 'UTF-8';\n\
\n\
@import 'data:text/css,html{background-color:%23000}';\n\
\n\
@import url('data:text/css,html{color:%23fff}')\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
"\
@charset \"UTF-8\";\n\
\n\
@import \"data:text/css;base64,aHRtbHtiYWNrZ3JvdW5kLWNvbG9yOiMwMDB9\";\n\
\n\
@import url(\"data:text/css;base64,aHRtbHtjb2xvcjojZmZmfQ==\")\n\
"
);
}
#[test]
fn hash_urls() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
body {\n \
behavior: url(#default#something);\n\
}\n\
\n\
.scissorHalf {\n \
offset-path: url(#somePath);\n\
}\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS
);
}
#[test]
fn transform_percentages_and_degrees() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
div {\n \
transform: translate(-50%, -50%) rotate(-45deg);\n\
transform: translate(50%, 50%) rotate(45deg);\n\
transform: translate(+50%, +50%) rotate(+45deg);\n\
}\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS
);
}
#[test]
fn unusual_indents() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
.is\\:good:hover {\n \
color: green\n\
}\n\
\n\
#\\~\\!\\@\\$\\%\\^\\&\\*\\(\\)\\+\\=\\,\\.\\/\\\\\\'\\\"\\;\\:\\?\\>\\<\\[\\]\\{\\}\\|\\`\\# {\n \
color: black\n\
}\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS
);
}
#[test]
fn exclude_fonts() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("https://doesntmatter.local/").unwrap();
let mut options = Options::default();
options.no_fonts = true;
options.silent = true;
const CSS: &str = "\
@font-face {\n \
font-family: 'My Font';\n \
src: url(my_font.woff);\n\
}\n\
\n\
#identifier {\n \
font-family: 'My Font' Arial\n\
}\n\
\n\
@font-face {\n \
font-family: 'My Font';\n \
src: url(my_font.woff);\n\
}\n\
\n\
div {\n \
font-family: 'My Font' Verdana\n\
}\n\
";
const CSS_OUT: &str = " \
\n\
\n\
#identifier {\n \
font-family: \"My Font\" Arial\n\
}\n\
\n \
\n\
\n\
div {\n \
font-family: \"My Font\" Verdana\n\
}\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS_OUT
);
}
#[test]
fn content() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("data:,").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
#language a[href=\"#translations\"]:before {\n\
content: url(data:,) \"\\A\";\n\
white-space: pre }\n\
";
const CSS_OUT: &str = "\
#language a[href=\"#translations\"]:before {\n\
content: url(\"data:text/plain;base64,\") \"\\a \";\n\
white-space: pre }\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS_OUT
);
}
#[test]
fn ie_css_hack() {
let cache = &mut HashMap::new();
let client = Client::new();
let document_url: Url = Url::parse("data:,").unwrap();
let mut options = Options::default();
options.silent = true;
const CSS: &str = "\
div#p>svg>foreignObject>section:not(\\9) {\n\
width: 300px;\n\
width: 500px\\9;\n\
}\n\
";
const CSS_OUT: &str = "\
div#p>svg>foreignObject>section:not(\\9) {\n\
width: 300px;\n\
width: 500px\t;\n\
}\n\
";
assert_eq!(
css::embed_css(cache, &client, &document_url, &CSS, &options, 0,),
CSS_OUT
);
}
}

88
tests/css/is_image_url_prop.rs Normal file

@ -0,0 +1,88 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::css;
#[test]
fn background() {
assert!(css::is_image_url_prop("background"));
}
#[test]
fn background_image() {
assert!(css::is_image_url_prop("background-image"));
}
#[test]
fn background_image_uppercase() {
assert!(css::is_image_url_prop("BACKGROUND-IMAGE"));
}
#[test]
fn border_image() {
assert!(css::is_image_url_prop("border-image"));
}
#[test]
fn content() {
assert!(css::is_image_url_prop("content"));
}
#[test]
fn cursor() {
assert!(css::is_image_url_prop("cursor"));
}
#[test]
fn list_style() {
assert!(css::is_image_url_prop("list-style"));
}
#[test]
fn list_style_image() {
assert!(css::is_image_url_prop("list-style-image"));
}
#[test]
fn mask_image() {
assert!(css::is_image_url_prop("mask-image"));
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::css;
#[test]
fn empty() {
assert!(!css::is_image_url_prop(""));
}
#[test]
fn width() {
assert!(!css::is_image_url_prop("width"));
}
#[test]
fn color() {
assert!(!css::is_image_url_prop("color"));
}
#[test]
fn z_index() {
assert!(!css::is_image_url_prop("z-index"));
}
}
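For reference, a predicate consistent with every case above (a sketch only, not the crate's implementation) lower-cases the property name and matches it against the CSS properties that can carry image URLs:

fn is_image_url_prop_sketch(prop: &str) -> bool {
    // Case-insensitive match against the image-carrying CSS properties
    // exercised by the passing and failing tests above.
    matches!(
        prop.to_lowercase().as_str(),
        "background"
            | "background-image"
            | "border-image"
            | "content"
            | "cursor"
            | "list-style"
            | "list-style-image"
            | "mask-image"
    )
}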

2
tests/css/mod.rs Normal file

@ -0,0 +1,2 @@
mod embed_css;
mod is_image_url_prop;

29
tests/html/add_favicon.rs Normal file

@ -0,0 +1,29 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use html5ever::serialize::{serialize, SerializeOpts};
use monolith::html;
#[test]
fn basic() {
let html = "<div>text</div>";
let mut dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
dom = html::add_favicon(&dom.document, "I_AM_A_FAVICON_DATA_URL".to_string());
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head><link rel=\"icon\" href=\"I_AM_A_FAVICON_DATA_URL\"></link></head><body><div>text</div></body></html>"
);
}
}

89
tests/html/check_integrity.rs Normal file

@ -0,0 +1,89 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
#[test]
fn empty_input_sha256() {
assert!(html::check_integrity(
"".as_bytes(),
"sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU="
));
}
#[test]
fn sha256() {
assert!(html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha256-9EWAHgy4mSYsm54hmDaIDXPKLRsLnBX7lZyQ6xISNOM="
));
}
#[test]
fn sha384() {
assert!(html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha384-gc9l7omltke8C33bedgh15E12M7RrAQa5t63Yb8APlpe7ZhiqV23+oqiulSJl3Kw"
));
}
#[test]
fn sha512() {
assert!(html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha512-zG5B88cYMqcdiMi9gz0XkOFYw2BpjeYdn5V6+oFrMgSNjRpqL7EF8JEwl17ztZbK3N7I/tTwp3kxQbN1RgFBww=="
));
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::html;
#[test]
fn empty_hash() {
assert!(!html::check_integrity("abcdef0123456789".as_bytes(), ""));
}
#[test]
fn empty_input_empty_hash() {
assert!(!html::check_integrity("".as_bytes(), ""));
}
#[test]
fn sha256() {
assert!(!html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha256-badhash"
));
}
#[test]
fn sha384() {
assert!(!html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha384-badhash"
));
}
#[test]
fn sha512() {
assert!(!html::check_integrity(
"abcdef0123456789".as_bytes(),
"sha512-badhash"
));
}
}
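For reference, the SRI values asserted above are base64-encoded digests prefixed with the algorithm name; a minimal sketch of producing one (assuming the sha2 crate and a base64 crate version that still exposes base64::encode):

use sha2::{Digest, Sha256};

fn sri_sha256(data: &[u8]) -> String {
    // Hash the input, base64-encode the raw digest, and prefix the algorithm name.
    format!("sha256-{}", base64::encode(Sha256::digest(data)))
}

// sri_sha256(b"") yields "sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=",
// the empty-input value used in the first passing test above.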

83
tests/html/compose_csp.rs Normal file

@ -0,0 +1,83 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
use monolith::opts::Options;
#[test]
fn isolated() {
let mut options = Options::default();
options.isolate = true;
let csp_content = html::compose_csp(&options);
assert_eq!(
csp_content,
"default-src 'unsafe-eval' 'unsafe-inline' data:;"
);
}
#[test]
fn no_css() {
let mut options = Options::default();
options.no_css = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "style-src 'none';");
}
#[test]
fn no_fonts() {
let mut options = Options::default();
options.no_fonts = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "font-src 'none';");
}
#[test]
fn no_frames() {
let mut options = Options::default();
options.no_frames = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "frame-src 'none'; child-src 'none';");
}
#[test]
fn no_js() {
let mut options = Options::default();
options.no_js = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "script-src 'none';");
}
#[test]
fn no_images() {
let mut options = Options::default();
options.no_images = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "img-src data:;");
}
#[test]
fn all() {
let mut options = Options::default();
options.isolate = true;
options.no_css = true;
options.no_fonts = true;
options.no_frames = true;
options.no_js = true;
options.no_images = true;
let csp_content = html::compose_csp(&options);
assert_eq!(csp_content, "default-src 'unsafe-eval' 'unsafe-inline' data:; style-src 'none'; font-src 'none'; frame-src 'none'; child-src 'none'; script-src 'none'; img-src data:;");
}
}
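The assertions above also pin down the order in which directives are appended; a sketch consistent with them (plain booleans standing in for the Options fields, not the crate's implementation):

fn compose_csp_sketch(
    isolate: bool,
    no_css: bool,
    no_fonts: bool,
    no_frames: bool,
    no_js: bool,
    no_images: bool,
) -> String {
    // Append each applicable directive in a fixed order, then trim the trailing space.
    let mut csp = String::new();
    if isolate { csp += "default-src 'unsafe-eval' 'unsafe-inline' data:; "; }
    if no_css { csp += "style-src 'none'; "; }
    if no_fonts { csp += "font-src 'none'; "; }
    if no_frames { csp += "frame-src 'none'; child-src 'none'; "; }
    if no_js { csp += "script-src 'none'; "; }
    if no_images { csp += "img-src data:; "; }
    csp.trim_end().to_string()
}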

66
tests/html/create_metadata_tag.rs Normal file

@ -0,0 +1,66 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use chrono::prelude::*;
use reqwest::Url;
use monolith::html;
#[test]
fn http_url() {
let url: Url = Url::parse("http://192.168.1.1/").unwrap();
let timestamp = Utc::now().to_rfc3339_opts(SecondsFormat::Secs, true);
let metadata_comment: String = html::create_metadata_tag(&url);
assert_eq!(
metadata_comment,
format!(
"<!-- Saved from {} at {} using {} v{} -->",
&url,
timestamp,
env!("CARGO_PKG_NAME"),
env!("CARGO_PKG_VERSION"),
)
);
}
#[test]
fn file_url() {
let url: Url = Url::parse("file:///home/monolith/index.html").unwrap();
let timestamp = Utc::now().to_rfc3339_opts(SecondsFormat::Secs, true);
let metadata_comment: String = html::create_metadata_tag(&url);
assert_eq!(
metadata_comment,
format!(
"<!-- Saved from local source at {} using {} v{} -->",
timestamp,
env!("CARGO_PKG_NAME"),
env!("CARGO_PKG_VERSION"),
)
);
}
#[test]
fn data_url() {
let url: Url = Url::parse("data:text/html,Hello%2C%20World!").unwrap();
let timestamp = Utc::now().to_rfc3339_opts(SecondsFormat::Secs, true);
let metadata_comment: String = html::create_metadata_tag(&url);
assert_eq!(
metadata_comment,
format!(
"<!-- Saved from local source at {} using {} v{} -->",
timestamp,
env!("CARGO_PKG_NAME"),
env!("CARGO_PKG_VERSION"),
)
);
}
}
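The timestamp interpolated into the metadata comment comes from chrono's to_rfc3339_opts(SecondsFormat::Secs, true), i.e. a second-precision UTC timestamp with a Z suffix; a minimal sketch:

use chrono::prelude::*;

fn main() {
    // Second-precision RFC 3339 in UTC, e.g. "2022-01-01T00:00:00Z".
    println!("{}", Utc::now().to_rfc3339_opts(SecondsFormat::Secs, true));
}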

156
tests/html/embed_srcset.rs Normal file

@ -0,0 +1,156 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use reqwest::blocking::Client;
use reqwest::Url;
use std::collections::HashMap;
use monolith::html;
use monolith::opts::Options;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn small_medium_large() {
let cache = &mut HashMap::new();
let client = Client::new();
let srcset_value = "small.png 1x, medium.png 1.5x, large.png 2x";
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let embedded_css = html::embed_srcset(
cache,
&client,
&Url::parse("data:,").unwrap(),
&srcset_value,
&options,
0,
);
assert_eq!(
embedded_css,
format!(
"{} 1x, {} 1.5x, {} 2x",
EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL,
),
);
}
#[test]
fn small_medium_only_medium_has_scale() {
let cache = &mut HashMap::new();
let client = Client::new();
let srcset_value = "small.png, medium.png 1.5x";
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let embedded_css = html::embed_srcset(
cache,
&client,
&Url::parse("data:,").unwrap(),
&srcset_value,
&options,
0,
);
assert_eq!(
embedded_css,
format!("{}, {} 1.5x", EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL),
);
}
#[test]
fn commas_within_file_names() {
let cache = &mut HashMap::new();
let client = Client::new();
let srcset_value = "small,s.png 1x, large,l.png 2x";
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let embedded_css = html::embed_srcset(
cache,
&client,
&Url::parse("data:,").unwrap(),
&srcset_value,
&options,
0,
);
assert_eq!(
embedded_css,
format!("{} 1x, {} 2x", EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL),
);
}
#[test]
fn tabs_and_newlines_after_commas() {
let cache = &mut HashMap::new();
let client = Client::new();
let srcset_value = "small,s.png 1x,\nmedium,m.png 2x,\nlarge,l.png 3x";
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let embedded_css = html::embed_srcset(
cache,
&client,
&Url::parse("data:,").unwrap(),
&srcset_value,
&options,
0,
);
assert_eq!(
embedded_css,
format!(
"{} 1x, {} 2x, {} 3x",
EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL
),
);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use reqwest::blocking::Client;
use reqwest::Url;
use std::collections::HashMap;
use monolith::html;
use monolith::opts::Options;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn trailing_comma() {
let cache = &mut HashMap::new();
let client = Client::new();
let srcset_value = "small.png 1x, large.png 2x,";
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let embedded_css = html::embed_srcset(
cache,
&client,
&Url::parse("data:,").unwrap(),
&srcset_value,
&options,
0,
);
assert_eq!(
embedded_css,
format!("{} 1x, {} 2x,", EMPTY_IMAGE_DATA_URL, EMPTY_IMAGE_DATA_URL),
);
}
}

104
tests/html/get_base_url.rs Normal file

@ -0,0 +1,104 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
#[test]
fn present() {
let html = "<!doctype html>
<html>
<head>
<base href=\"https://musicbrainz.org\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(
html::get_base_url(&dom.document),
Some("https://musicbrainz.org".to_string())
);
}
#[test]
fn multiple_tags() {
let html = "<!doctype html>
<html>
<head>
<base href=\"https://www.discogs.com/\" />
<base href=\"https://musicbrainz.org\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(
html::get_base_url(&dom.document),
Some("https://www.discogs.com/".to_string())
);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::html;
#[test]
fn absent() {
let html = "<!doctype html>
<html>
<head>
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_base_url(&dom.document), None);
}
#[test]
fn no_href() {
let html = "<!doctype html>
<html>
<head>
<base />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_base_url(&dom.document), None);
}
#[test]
fn empty_href() {
let html = "<!doctype html>
<html>
<head>
<base href=\"\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_base_url(&dom.document), Some("".to_string()));
}
}

72
tests/html/get_charset.rs Normal file

@ -0,0 +1,72 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
#[test]
fn meta_content_type() {
let html = "<!doctype html>
<html>
<head>
<meta http-equiv=\"content-type\" content=\"text/html;charset=GB2312\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_charset(&dom.document), Some("GB2312".to_string()));
}
#[test]
fn meta_charset() {
let html = "<!doctype html>
<html>
<head>
<meta charset=\"GB2312\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_charset(&dom.document), Some("GB2312".to_string()));
}
#[test]
fn multiple_conflicting_meta_charset_first() {
let html = "<!doctype html>
<html>
<head>
<meta charset=\"utf-8\" />
<meta http-equiv=\"content-type\" content=\"text/html;charset=GB2312\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_charset(&dom.document), Some("utf-8".to_string()));
}
#[test]
fn multiple_conflicting_meta_content_type_first() {
let html = "<!doctype html>
<html>
<head>
<meta http-equiv=\"content-type\" content=\"text/html;charset=GB2312\" />
<meta charset=\"utf-8\" />
</head>
<body>
</body>
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
assert_eq!(html::get_charset(&dom.document), Some("GB2312".to_string()));
}
}
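The labels returned here are the same ones the CLI's charset handling feeds to encoding_rs (see the unusual_encodings tests earlier); a quick sketch of how such a label resolves:

use encoding_rs::Encoding;

fn main() {
    // Per the WHATWG Encoding Standard, "GB2312" is a label for the GBK encoding.
    let encoding = Encoding::for_label(b"GB2312").expect("known label");
    println!("{}", encoding.name()); // "GBK"
    // Unknown labels such as "utf0" resolve to None, which is what makes the
    // custom-charset-bad CLI test above exit with an error.
}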

54
tests/html/get_node_attr.rs Normal file

@ -0,0 +1,54 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use html5ever::rcdom::{Handle, NodeData};
use monolith::html;
#[test]
fn div_two_style_attributes() {
let html = "<!doctype html><html><head></head><body><DIV STYLE=\"color: blue;\" style=\"display: none;\"></div></body></html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut count = 0;
fn test_walk(node: &Handle, i: &mut i8) {
*i += 1;
match &node.data {
NodeData::Document => {
// Dig deeper
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
NodeData::Element { ref name, .. } => {
let node_name = name.local.as_ref().to_string();
if node_name == "body" {
assert_eq!(html::get_node_attr(node, "class"), None);
} else if node_name == "div" {
assert_eq!(
html::get_node_attr(node, "style"),
Some("color: blue;".to_string())
);
}
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
_ => (),
};
}
test_walk(&dom.document, &mut count);
assert_eq!(count, 6);
}
}

53
tests/html/get_node_name.rs Normal file

@ -0,0 +1,53 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use html5ever::rcdom::{Handle, NodeData};
use monolith::html;
#[test]
fn parent_node_names() {
let html = "<!doctype html><html><HEAD></HEAD><body><div><P></P></div></body></html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut count = 0;
fn test_walk(node: &Handle, i: &mut i8) {
*i += 1;
match &node.data {
NodeData::Document => {
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
NodeData::Element { ref name, .. } => {
let node_name = name.local.as_ref().to_string();
let parent = html::get_parent_node(node);
let parent_node_name = html::get_node_name(&parent);
if node_name == "head" || node_name == "body" {
assert_eq!(parent_node_name, Some("html"));
} else if node_name == "div" {
assert_eq!(parent_node_name, Some("body"));
} else if node_name == "p" {
assert_eq!(parent_node_name, Some("div"));
}
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
_ => (),
};
}
test_walk(&dom.document, &mut count);
assert_eq!(count, 7);
}
}

50
tests/html/has_favicon.rs Normal file

@ -0,0 +1,50 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
#[test]
fn icon() {
let html = "<link rel=\"icon\" href=\"\" /><div>text</div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let res: bool = html::has_favicon(&dom.document);
assert!(res);
}
#[test]
fn shortcut_icon() {
let html = "<link rel=\"shortcut icon\" href=\"\" /><div>text</div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let res: bool = html::has_favicon(&dom.document);
assert!(res);
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::html;
#[test]
fn absent() {
let html = "<div>text</div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let res: bool = html::has_favicon(&dom.document);
assert!(!res);
}
}

58
tests/html/is_icon.rs Normal file

@ -0,0 +1,58 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
#[test]
fn icon() {
assert!(html::is_icon("icon"));
}
#[test]
fn shortcut_icon_capitalized() {
assert!(html::is_icon("Shortcut Icon"));
}
#[test]
fn icon_uppercase() {
assert!(html::is_icon("ICON"));
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::html;
#[test]
fn mask_icon() {
assert!(!html::is_icon("mask-icon"));
}
#[test]
fn fluid_icon() {
assert!(!html::is_icon("fluid-icon"));
}
#[test]
fn stylesheet() {
assert!(!html::is_icon("stylesheet"));
}
#[test]
fn empty_string() {
assert!(!html::is_icon(""));
}
}

14
tests/html/mod.rs Normal file

@ -0,0 +1,14 @@
mod add_favicon;
mod check_integrity;
mod compose_csp;
mod create_metadata_tag;
mod embed_srcset;
mod get_base_url;
mod get_charset;
mod get_node_attr;
mod get_node_name;
mod has_favicon;
mod is_icon;
mod serialize_document;
mod set_node_attr;
mod walk_and_embed_assets;

153
tests/html/serialize_document.rs Normal file

@ -0,0 +1,153 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::html;
use monolith::opts::Options;
#[test]
fn div_as_root_element() {
let html = "<div><script src=\"some.js\"></script></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let options = Options::default();
assert_eq!(
String::from_utf8_lossy(&html::serialize_document(dom, "".to_string(), &options)),
"<html><head></head><body><div><script src=\"some.js\"></script></div></body></html>"
);
}
#[test]
fn full_page_with_no_html_head_or_body() {
let html = "<title>Isolated document</title>\
<link rel=\"something\" href=\"some.css\" />\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<div><script src=\"some.js\"></script></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut options = Options::default();
options.isolate = true;
assert_eq!(
String::from_utf8_lossy(&html::serialize_document(
dom,
"".to_string(),
&options
)),
"<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src 'unsafe-eval' 'unsafe-inline' data:;\"></meta>\
<title>Isolated document</title>\
<link rel=\"something\" href=\"some.css\">\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
</head>\
<body>\
<div>\
<script src=\"some.js\"></script>\
</div>\
</body>\
</html>"
);
}
#[test]
fn doctype_and_the_rest_no_html_head_or_body() {
let html = "<!doctype html>\
<title>Unstyled document</title>\
<link rel=\"stylesheet\" href=\"main.css\"/>\
<div style=\"display: none;\"></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut options = Options::default();
options.no_css = true;
assert_eq!(
String::from_utf8_lossy(&html::serialize_document(dom, "".to_string(), &options)),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"style-src 'none';\"></meta>\
<title>Unstyled document</title>\
<link rel=\"stylesheet\" href=\"main.css\">\
</head>\
<body><div style=\"display: none;\"></div></body>\
</html>"
);
}
#[test]
fn doctype_and_the_rest_no_html_head_or_body_forbid_frames() {
let html = "<!doctype html>\
<title>Frameless document</title>\
<link rel=\"something\"/>\
<div><script src=\"some.js\"></script></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut options = Options::default();
options.no_frames = true;
assert_eq!(
String::from_utf8_lossy(&html::serialize_document(
dom,
"".to_string(),
&options
)),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"frame-src 'none'; child-src 'none';\"></meta>\
<title>Frameless document</title>\
<link rel=\"something\">\
</head>\
<body><div><script src=\"some.js\"></script></div></body>\
</html>"
);
}
#[test]
fn doctype_and_the_rest_all_forbidden() {
let html = "<!doctype html>\
<title>no-frame no-css no-js no-image isolated document</title>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<link rel=\"stylesheet\" href=\"some.css\">\
<div>\
<script src=\"some.js\"></script>\
<img style=\"width: 100%;\" src=\"some.png\" />\
<iframe src=\"some.html\"></iframe>\
</div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut options = Options::default();
options.isolate = true;
options.no_css = true;
options.no_fonts = true;
options.no_frames = true;
options.no_js = true;
options.no_images = true;
assert_eq!(
String::from_utf8_lossy(&html::serialize_document(
dom,
"".to_string(),
&options
)),
"<!DOCTYPE html>\
<html>\
<head>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src 'unsafe-eval' 'unsafe-inline' data:; style-src 'none'; font-src 'none'; frame-src 'none'; child-src 'none'; script-src 'none'; img-src data:;\"></meta>\
<title>no-frame no-css no-js no-image isolated document</title>\
<meta http-equiv=\"Content-Security-Policy\" content=\"default-src https:\">\
<link rel=\"stylesheet\" href=\"some.css\">\
</head>\
<body>\
<div>\
<script src=\"some.js\"></script>\
<img style=\"width: 100%;\" src=\"some.png\">\
<iframe src=\"some.html\"></iframe>\
</div>\
</body>\
</html>"
);
}
}

108
tests/html/set_node_attr.rs Normal file

@ -0,0 +1,108 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use html5ever::rcdom::{Handle, NodeData};
use monolith::html;
#[test]
fn html_lang_and_body_style() {
let html = "<!doctype html><html lang=\"en\"><head></head><body></body></html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut count = 0;
fn test_walk(node: &Handle, i: &mut i8) {
*i += 1;
match &node.data {
NodeData::Document => {
// Dig deeper
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
NodeData::Element { ref name, .. } => {
let node_name = name.local.as_ref().to_string();
if node_name == "html" {
assert_eq!(html::get_node_attr(node, "lang"), Some("en".to_string()));
html::set_node_attr(node, "lang", Some("de".to_string()));
assert_eq!(html::get_node_attr(node, "lang"), Some("de".to_string()));
html::set_node_attr(node, "lang", None);
assert_eq!(html::get_node_attr(node, "lang"), None);
html::set_node_attr(node, "lang", Some("".to_string()));
assert_eq!(html::get_node_attr(node, "lang"), Some("".to_string()));
} else if node_name == "body" {
assert_eq!(html::get_node_attr(node, "style"), None);
html::set_node_attr(node, "style", Some("display: none;".to_string()));
assert_eq!(
html::get_node_attr(node, "style"),
Some("display: none;".to_string())
);
}
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
_ => (),
};
}
test_walk(&dom.document, &mut count);
assert_eq!(count, 5);
}
#[test]
fn body_background() {
let html = "<!doctype html><html lang=\"en\"><head></head><body background=\"1\" background=\"2\"></body></html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let mut count = 0;
fn test_walk(node: &Handle, i: &mut i8) {
*i += 1;
match &node.data {
NodeData::Document => {
// Dig deeper
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
NodeData::Element { ref name, .. } => {
let node_name = name.local.as_ref().to_string();
if node_name == "body" {
assert_eq!(
html::get_node_attr(node, "background"),
Some("1".to_string())
);
html::set_node_attr(node, "background", None);
assert_eq!(html::get_node_attr(node, "background"), None);
}
for child in node.children.borrow().iter() {
test_walk(child, &mut *i);
}
}
_ => (),
};
}
test_walk(&dom.document, &mut count);
assert_eq!(count, 5);
}
}

518
tests/html/walk_and_embed_assets.rs Normal file

@ -0,0 +1,518 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use html5ever::serialize::{serialize, SerializeOpts};
use reqwest::blocking::Client;
use std::collections::HashMap;
use url::Url;
use monolith::html;
use monolith::opts::Options;
use monolith::url::EMPTY_IMAGE_DATA_URL;
#[test]
fn basic() {
let cache = &mut HashMap::new();
let html: &str = "<div><P></P></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let mut options = Options::default();
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><div><p></p></div></body></html>"
);
}
#[test]
fn ensure_no_recursive_iframe() {
let html = "<div><P></P><iframe src=\"\"></iframe></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body><div><p></p><iframe src=\"\"></iframe></div></body></html>"
);
}
#[test]
fn ensure_no_recursive_frame() {
let html = "<frameset><frame src=\"\"></frameset>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><frameset><frame src=\"\"></frameset></html>"
);
}
#[test]
fn no_css() {
let html = "\
<link rel=\"stylesheet\" href=\"main.css\">\
<link rel=\"alternate stylesheet\" href=\"main.css\">\
<style>html{background-color: #000;}</style>\
<div style=\"display: none;\"></div>\
";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_css = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<link rel=\"stylesheet\">\
<link rel=\"alternate stylesheet\">\
<style></style>\
</head>\
<body>\
<div></div>\
</body>\
</html>\
"
);
}
#[test]
fn no_images() {
let html = "<link rel=\"icon\" href=\"favicon.ico\">\
<div><img src=\"http://localhost/assets/mono_lisa.png\" /></div>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
format!(
"<html>\
<head>\
<link rel=\"icon\">\
</head>\
<body>\
<div>\
<img src=\"{empty_image}\">\
</div>\
</body>\
</html>",
empty_image = EMPTY_IMAGE_DATA_URL
)
);
}
#[test]
fn no_body_background_images() {
let html =
"<body background=\"no/such/image.png\" background=\"no/such/image2.png\"></body>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"<html><head></head><body></body></html>"
);
}
#[test]
fn no_frames() {
let html = "<frameset><frame src=\"http://trackbook.com\"></frameset>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_frames = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
</head>\
<frameset>\
<frame src=\"\">\
</frameset>\
</html>\
"
);
}
#[test]
fn no_iframes() {
let html = "<iframe src=\"http://trackbook.com\"></iframe>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_frames = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head></head>\
<body>\
<iframe src=\"\"></iframe>\
</body>\
</html>\
"
);
}
#[test]
fn no_js() {
let html = "\
<div onClick=\"void(0)\">\
<script src=\"http://localhost/assets/some.js\"></script>\
<script>alert(1)</script>\
</div>\
";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_js = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head></head>\
<body>\
<div>\
<script></script>\
<script></script>\
</div>\
</body>\
</html>\
"
);
}
#[test]
fn keeps_integrity_for_unfamiliar_links() {
let html = "<title>Has integrity</title>\
<link integrity=\"sha384-12345\" rel=\"something\" href=\"https://some-site.com/some-file.ext\" />";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<title>Has integrity</title>\
<link integrity=\"sha384-12345\" rel=\"something\" href=\"https://some-site.com/some-file.ext\">\
</head>\
<body></body>\
</html>\
"
);
}
#[test]
fn discards_integrity_for_known_links_nojs_nocss() {
let html = "\
<title>No integrity</title>\
<link integrity=\"\" rel=\"stylesheet\" href=\"data:;\"/>\
<script integrity=\"\" src=\"some.js\"></script>\
";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_css = true;
options.no_js = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<title>No integrity</title>\
<link rel=\"stylesheet\">\
<script></script>\
</head>\
<body></body>\
</html>\
"
);
}
#[test]
fn discards_integrity_for_embedded_assets() {
let html = "\
<title>No integrity</title>\
<link integrity=\"sha384-123\" rel=\"something\" href=\"data:;\"/>\
<script integrity=\"sha384-456\" src=\"some.js\"></script>\
";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_css = true;
options.no_js = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<title>No integrity</title>\
<link integrity=\"sha384-123\" rel=\"something\" href=\"data:;\">\
<script></script>\
</head>\
<body>\
</body>\
</html>\
"
);
}
#[test]
fn removes_unwanted_meta_tags() {
let html = "\
<html>\
<head>\
<meta http-equiv=\"Refresh\" content=\"2\"/>\
<meta http-equiv=\"Location\" content=\"https://freebsd.org\"/>\
</head>\
<body>\
</body>\
</html>\
";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_css = true;
options.no_frames = true;
options.no_js = true;
options.no_images = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<meta content=\"2\">\
<meta content=\"https://freebsd.org\">\
</head>\
<body>\
</body>\
</html>"
);
}
#[test]
fn processes_noscript_tags() {
let html = "\
<html>\
<body>\
<noscript>\
<img src=\"image.png\" />\
</noscript>\
</body>\
</html>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.no_images = true;
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
format!(
"\
<html>\
<head>\
</head>\
<body>\
<noscript>\
<img src=\"{}\">\
</noscript>\
</body>\
</html>",
EMPTY_IMAGE_DATA_URL,
)
);
}
#[test]
fn preserves_script_type_json() {
let html = "<script id=\"data\" type=\"application/json\">{\"mono\":\"lith\"}</script>";
let dom = html::html_to_dom(&html.as_bytes().to_vec(), "".to_string());
let url: Url = Url::parse("http://localhost").unwrap();
let cache = &mut HashMap::new();
let mut options = Options::default();
options.silent = true;
let client = Client::new();
html::walk_and_embed_assets(cache, &client, &url, &dom.document, &options, 0);
let mut buf: Vec<u8> = Vec::new();
serialize(&mut buf, &dom.document, SerializeOpts::default()).unwrap();
assert_eq!(
buf.iter().map(|&c| c as char).collect::<String>(),
"\
<html>\
<head>\
<script id=\"data\" type=\"application/json\">{\"mono\":\"lith\"}</script>\
</head>\
<body>\
</body>\
</html>"
);
}
}
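// The tests in this module all repeat the same serialize-then-stringify steps.
// A shared helper along these lines (hypothetical, not part of this diff) could
// collapse that boilerplate; it assumes the same html5ever serialize machinery
// the tests above already call directly:
fn serialize_to_string<T: html5ever::serialize::Serialize>(node: &T) -> String {
    use html5ever::serialize::{serialize, SerializeOpts};
    let mut buf: Vec<u8> = Vec::new();
    // Same call the tests make by hand, with the byte buffer decoded once here.
    serialize(&mut buf, node, SerializeOpts::default()).unwrap();
    String::from_utf8_lossy(&buf).into_owned()
}
// e.g. assert_eq!(serialize_to_string(&dom.document), "<html>...</html>");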

@@ -0,0 +1,53 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::js;
#[test]
fn onblur_camelcase() {
assert!(js::attr_is_event_handler("onBlur"));
}
#[test]
fn onclick_lowercase() {
assert!(js::attr_is_event_handler("onclick"));
}
#[test]
fn onclick_camelcase() {
assert!(js::attr_is_event_handler("onClick"));
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::js;
#[test]
fn href() {
assert!(!js::attr_is_event_handler("href"));
}
#[test]
fn empty_string() {
assert!(!js::attr_is_event_handler(""));
}
#[test]
fn class() {
assert!(!js::attr_is_event_handler("class"));
}
}
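// For reference, one implementation consistent with every case above would be a
// case-insensitive "on" prefix check (a sketch only; the real code in src/js.rs
// may differ):
pub fn attr_is_event_handler(attr_name: &str) -> bool {
    // "onBlur", "onclick", "onClick" => true; "href", "class", "" => false.
    attr_name.to_lowercase().starts_with("on")
}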

tests/js/mod.rs Normal file
@@ -0,0 +1 @@
mod attr_is_event_handler;

@@ -0,0 +1,14 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
#[test]
fn contains_correct_image_data() {
assert_eq!(empty_image!(), "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAA0AAAANCAQAAADY4iz3AAAAEUlEQVR42mNkwAkYR6UolgIACvgADsuK6xYAAAAASUVORK5CYII=");
}
}
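// The asserted value above pins down what the macro must expand to, so a minimal
// definition could be as simple as this (a sketch; where monolith actually
// defines it is not shown in this diff):
macro_rules! empty_image {
    () => {
        // Same literal the test asserts against.
        "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAA0AAAANCAQAAADY4iz3AAAAEUlEQVR42mNkwAkYR6UolgIACvgADsuK6xYAAAAASUVORK5CYII="
    };
}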

tests/macros/mod.rs Normal file
@@ -0,0 +1,2 @@
mod empty_image;
mod str;

tests/macros/str.rs Normal file
@@ -0,0 +1,24 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
#[test]
fn returns_empty_string() {
assert_eq!(str!(), "");
}
#[test]
fn converts_integer_into_string() {
assert_eq!(str!(123), "123");
}
#[test]
fn converts_str_into_string() {
assert_eq!(str!("abc"), "abc");
}
}
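// A definition that satisfies all three cases above could look like this
// (a sketch; monolith's own macro may be written differently):
macro_rules! str {
    () => {
        String::new()
    };
    ($e:expr) => {
        $e.to_string()
    };
}
// str!() == "", str!(123) == "123", str!("abc") == "abc"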

tests/mod.rs Normal file
@@ -0,0 +1,8 @@
mod cli;
mod css;
mod html;
mod js;
// mod macros;
mod opts;
mod url;
mod utils;

tests/opts.rs Normal file
@@ -0,0 +1,35 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::opts::Options;
#[test]
fn defaults() {
let options: Options = Options::default();
assert_eq!(options.no_audio, false);
assert_eq!(options.base_url, None);
assert_eq!(options.no_css, false);
assert_eq!(options.charset, None);
assert_eq!(options.no_frames, false);
assert_eq!(options.no_fonts, false);
assert_eq!(options.no_images, false);
assert_eq!(options.isolate, false);
assert_eq!(options.no_js, false);
assert_eq!(options.insecure, false);
assert_eq!(options.no_metadata, false);
assert_eq!(options.output, "".to_string());
assert_eq!(options.silent, false);
assert_eq!(options.timeout, 0);
assert_eq!(options.user_agent, None);
assert_eq!(options.no_video, false);
assert_eq!(options.target, "".to_string());
}
}
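// Read together, the asserts above imply a default-constructible Options roughly
// shaped like this (a sketch: field types are inferred from the comparisons, and
// src/opts.rs may carry additional fields not exercised here):
#[derive(Default)]
pub struct Options {
    pub no_audio: bool,
    pub base_url: Option<String>,
    pub no_css: bool,
    pub charset: Option<String>,
    pub no_frames: bool,
    pub no_fonts: bool,
    pub no_images: bool,
    pub isolate: bool,
    pub no_js: bool,
    pub insecure: bool,
    pub no_metadata: bool,
    pub output: String,
    pub silent: bool,
    pub timeout: u64,
    pub user_agent: Option<String>,
    pub no_video: bool,
    pub target: String,
}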

tests/url/clean_url.rs Normal file
@@ -0,0 +1,63 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use reqwest::Url;
use monolith::url;
#[test]
fn preserve_original() {
let u: Url = Url::parse("https://somewhere.com/font.eot#iefix").unwrap();
let clean_u: Url = url::clean_url(u.clone());
assert_eq!(clean_u.as_str(), "https://somewhere.com/font.eot");
assert_eq!(u.as_str(), "https://somewhere.com/font.eot#iefix");
}
#[test]
fn removes_fragment() {
assert_eq!(
url::clean_url(Url::parse("https://somewhere.com/font.eot#iefix").unwrap()).as_str(),
"https://somewhere.com/font.eot"
);
}
#[test]
fn removes_empty_fragment() {
assert_eq!(
url::clean_url(Url::parse("https://somewhere.com/font.eot#").unwrap()).as_str(),
"https://somewhere.com/font.eot"
);
}
#[test]
fn removes_empty_fragment_and_keeps_empty_query() {
assert_eq!(
url::clean_url(Url::parse("https://somewhere.com/font.eot?#").unwrap()).as_str(),
"https://somewhere.com/font.eot?"
);
}
#[test]
fn removes_empty_fragment_and_keeps_query() {
assert_eq!(
url::clean_url(Url::parse("https://somewhere.com/font.eot?a=b&#").unwrap()).as_str(),
"https://somewhere.com/font.eot?a=b&"
);
}
#[test]
fn keeps_credentials() {
assert_eq!(
url::clean_url(Url::parse("https://cookie:monster@gibson.internet/").unwrap()).as_str(),
"https://cookie:monster@gibson.internet/"
);
}
}
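// Every case above is satisfied by simply dropping the fragment and leaving the
// rest of the URL untouched (a sketch; the actual src/url.rs implementation may
// do more):
pub fn clean_url(url: reqwest::Url) -> reqwest::Url {
    let mut result = url;
    // set_fragment(None) removes "#..." but keeps an empty query ("?") and any
    // credentials, matching the assertions above.
    result.set_fragment(None);
    result
}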

@@ -0,0 +1,109 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use reqwest::Url;
use monolith::url;
#[test]
fn encode_string_with_specific_media_type() {
let media_type = "application/javascript";
let data = "var word = 'hello';\nalert(word);\n";
let data_url = url::create_data_url(
media_type,
"",
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(
data_url.as_str(),
"data:application/javascript;base64,dmFyIHdvcmQgPSAnaGVsbG8nOwphbGVydCh3b3JkKTsK"
);
}
#[test]
fn encode_append_fragment() {
let data = "<svg></svg>\n";
let data_url = url::create_data_url(
"image/svg+xml",
"",
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(
data_url.as_str(),
"data:image/svg+xml;base64,PHN2Zz48L3N2Zz4K"
);
}
#[test]
fn encode_string_with_specific_media_type_and_charset() {
let media_type = "application/javascript";
let charset = "utf8";
let data = "var word = 'hello';\nalert(word);\n";
let data_url = url::create_data_url(
media_type,
charset,
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(
data_url.as_str(),
"data:application/javascript;charset=utf8;base64,dmFyIHdvcmQgPSAnaGVsbG8nOwphbGVydCh3b3JkKTsK"
);
}
#[test]
fn create_data_url_with_us_ascii_charset() {
let media_type = "";
let charset = "us-ascii";
let data = "";
let data_url = url::create_data_url(
media_type,
charset,
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(data_url.as_str(), "data:;base64,");
}
#[test]
fn create_data_url_with_utf8_charset() {
let media_type = "";
let charset = "utf8";
let data = "";
let data_url = url::create_data_url(
media_type,
charset,
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(data_url.as_str(), "data:;charset=utf8;base64,");
}
#[test]
fn create_data_url_with_media_type_text_plain_and_utf8_charset() {
let media_type = "text/plain";
let charset = "utf8";
let data = "";
let data_url = url::create_data_url(
media_type,
charset,
data.as_bytes(),
&Url::parse("data:,").unwrap(),
);
assert_eq!(data_url.as_str(), "data:text/plain;charset=utf8;base64,");
}
}
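// A simplified sketch that reproduces the strings asserted above. The parameter
// names are placeholders, the trailing &Url argument (always "data:," in these
// tests) is ignored here, and the use of base64::encode() is an assumption about
// available dependencies:
pub fn create_data_url(
    media_type: &str,
    charset: &str,
    data: &[u8],
    _final_asset_url: &reqwest::Url,
) -> reqwest::Url {
    // "us-ascii" is the data: URL default, so it is omitted, as the tests expect.
    let charset_part = if charset.is_empty() || charset.eq_ignore_ascii_case("us-ascii") {
        String::new()
    } else {
        format!(";charset={}", charset)
    };
    let url = format!("data:{}{};base64,{}", media_type, charset_part, base64::encode(data));
    reqwest::Url::parse(&url).unwrap()
}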

@@ -0,0 +1,110 @@
// ██████╗ █████╗ ███████╗███████╗██╗███╗ ██╗ ██████╗
// ██╔══██╗██╔══██╗██╔════╝██╔════╝██║████╗ ██║██╔════╝
// ██████╔╝███████║███████╗███████╗██║██╔██╗ ██║██║ ███╗
// ██╔═══╝ ██╔══██║╚════██║╚════██║██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║███████║███████║██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚══════╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod passing {
use monolith::url;
#[test]
fn mailto() {
assert!(url::is_url_and_has_protocol(
"mailto:somebody@somewhere.com?subject=hello"
));
}
#[test]
fn tel() {
assert!(url::is_url_and_has_protocol("tel:5551234567"));
}
#[test]
fn ftp_no_slashes() {
assert!(url::is_url_and_has_protocol("ftp:some-ftp-server.com"));
}
#[test]
fn ftp_with_credentials() {
assert!(url::is_url_and_has_protocol(
"ftp://user:password@some-ftp-server.com"
));
}
#[test]
fn javascript() {
assert!(url::is_url_and_has_protocol("javascript:void(0)"));
}
#[test]
fn http() {
assert!(url::is_url_and_has_protocol("http://news.ycombinator.com"));
}
#[test]
fn https() {
assert!(url::is_url_and_has_protocol("https://github.com"));
}
#[test]
fn file() {
assert!(url::is_url_and_has_protocol("file:///tmp/image.png"));
}
#[test]
fn mailto_uppercase() {
assert!(url::is_url_and_has_protocol(
"MAILTO:somebody@somewhere.com?subject=hello"
));
}
#[test]
fn empty_data_url() {
assert!(url::is_url_and_has_protocol("data:text/html,"));
}
#[test]
fn empty_data_url_surrounded_by_spaces() {
assert!(url::is_url_and_has_protocol(" data:text/html, "));
}
}
// ███████╗ █████╗ ██╗██╗ ██╗███╗ ██╗ ██████╗
// ██╔════╝██╔══██╗██║██║ ██║████╗ ██║██╔════╝
// █████╗ ███████║██║██║ ██║██╔██╗ ██║██║ ███╗
// ██╔══╝ ██╔══██║██║██║ ██║██║╚██╗██║██║ ██║
// ██║ ██║ ██║██║███████╗██║██║ ╚████║╚██████╔╝
// ╚═╝ ╚═╝ ╚═╝╚═╝╚══════╝╚═╝╚═╝ ╚═══╝ ╚═════╝
#[cfg(test)]
mod failing {
use monolith::url;
#[test]
fn url_with_no_protocol() {
assert_eq!(
url::is_url_and_has_protocol("//some-hostname.com/some-file.html"),
false
);
}
#[test]
fn relative_path() {
assert_eq!(
url::is_url_and_has_protocol("some-hostname.com/some-file.html"),
false
);
}
#[test]
fn relative_to_root_path() {
assert_eq!(url::is_url_and_has_protocol("/some-file.html"), false);
}
#[test]
fn empty_string() {
assert_eq!(url::is_url_and_has_protocol(""), false);
}
}
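// One implementation consistent with both the passing and failing groups above
// (an assumption about the real src/url.rs code): anything the url crate parses
// as an absolute URL necessarily carries a scheme, while relative references and
// empty strings fail to parse.
pub fn is_url_and_has_protocol(value: &str) -> bool {
    match reqwest::Url::parse(value.trim()) {
        Ok(url) => !url.scheme().is_empty(),
        Err(_) => false,
    }
}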

tests/url/mod.rs Normal file
@@ -0,0 +1,5 @@
mod clean_url;
mod create_data_url;
mod is_url_and_has_protocol;
mod parse_data_url;
mod resolve_url;

Some files were not shown because too many files have changed in this diff.