Compare commits


619 Commits

Author SHA1 Message Date
bd6556eee1 Use proper file extension for uniq command docs (#1411) 2020-02-18 09:37:46 -05:00
18d988d4c8 Restrict short-hand flag detection to exact match. (#1406) 2020-02-18 01:58:30 -05:00
0f7c723672 Bump version to 0.10.0 (#1403) 2020-02-18 16:56:09 +13:00
afce2fd0f9 Revert "Display rows in the same table regardless of their column order given they are equal. (#1392)" (#1401)
This reverts commit 4fd9974204.
2020-02-17 17:34:37 -08:00
4fd9974204 Display rows in the same table regardless of their column order given they are equal. (#1392) 2020-02-16 20:35:01 -05:00
71615f77a7 Fix minor typo in calc command error (#1395) 2020-02-16 16:02:41 -05:00
9bc5022c9c Force a \n at the end of a stdout stream (#1391)
* Force a \n at the end of a stdout stream

* clippy
2020-02-14 18:15:32 -08:00
552848b8b9 Leave raw mode correctly. (#1388)
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-02-14 17:31:21 -05:00
8ae8ebd107 Add support for multiline script files (#1386)
* Add support for multiline script files

* clippy
2020-02-13 21:24:18 -08:00
473e9f9422 Tiny improvement to sys (#1385) 2020-02-13 08:33:55 -08:00
96985aa692 Fix invalid shorthand flag (#1384) 2020-02-13 07:47:34 -08:00
0961da406d Add string to datetime to str plugin (#1381)
* Add string to datetime to str plugin

* Test string to date/time conversion
2020-02-13 07:47:04 -08:00
84927d52b5 Refuse internal command execution given unexpected arguments. (#1383) 2020-02-13 02:34:43 -05:00
73312b506f Finer grained parsing and coloring command tail. (#1382) 2020-02-12 20:20:19 -05:00
c1bec3b443 Return error on a divide by zero (#1376)
Return error on a divide by zero
2020-02-12 08:38:04 -05:00
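The guard described in #1376 can be sketched in a few lines of Rust; `divide` is a hypothetical helper for illustration, not the actual nushell function:

```rust
// Hypothetical sketch of the behavior in #1376: surface an error
// instead of panicking when the divisor is zero.
fn divide(lhs: i64, rhs: i64) -> Result<i64, String> {
    lhs.checked_div(rhs) // returns None on division by zero (or overflow)
        .ok_or_else(|| String::from("division by zero"))
}

fn main() {
    assert_eq!(divide(10, 2), Ok(5));
    assert!(divide(1, 0).is_err());
}
```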
c0be02a434 Short-hand flags (#1378)
* typo fixes

* Change signature to take in short-hand flags

* update help information

* Parse short-hand flags as their long counterparts

* lints

* Modified a couple tests to use shorthand flags
2020-02-11 18:24:31 -08:00
2ab8d035e6 External it and nu variable column path fetch support. (#1379) 2020-02-11 18:25:56 -05:00
24094acee9 Allow switch flags anywhere in the pipeline. (#1375) 2020-02-11 03:49:00 -05:00
0b2be52bb5 Only add quotes if not in Windows (which adds its own?) (#1374)
* Only add quotes if not in Windows (which adds its own?)

* Only add quotes if not in Windows (which adds its own?)
2020-02-10 23:07:44 -08:00
6a371802b4 Add block size to du (#1341)
* Add block size to du

* Change blocks to physical size

* Use path instead of strings for file/directory names

* Why don't I just use paths instead of strings anyway?

* shorten physical size and apparent size to physical and apparent resp.
2020-02-10 12:32:18 -08:00
29ccb9f5cd Ensure stable plugins get installed. (#1373) 2020-02-10 15:32:10 -05:00
20ab125861 bump version (#1370) 2020-02-10 09:18:00 -08:00
fb532f3f4e Prototype shebang support (#1368)
* Add shebang support to nu.

* Move test file

* Add test for scripts

Co-authored-by: Jason Gedge <jason.gedge@shopify.com>
2020-02-10 08:49:45 -08:00
a29d52158e Do not panic when failing to decode lines from external stdout (#1364) 2020-02-10 07:37:48 -08:00
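The fix in #1364 amounts to decoding output leniently rather than unwrapping; a minimal standard-library sketch:

```rust
fn main() {
    // Bytes from an external's stdout: "hi" followed by an invalid UTF-8 byte.
    let bytes = [0x68, 0x69, 0xFF];
    // `from_utf8_lossy` substitutes U+FFFD for invalid sequences
    // instead of returning an error (which `unwrap` would turn into a panic).
    let line = String::from_utf8_lossy(&bytes);
    assert_eq!(line, "hi\u{FFFD}");
    println!("{line}");
}
```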
dc50e61f26 Switch stdin redirect to manual. Add test (#1367) 2020-02-09 22:55:07 -08:00
a2668e3327 Add some nu_source docs for meta.rs (#1366)
* Add some docs for meta.rs

* add better explanation for Span merging

* Add some doc tests - not sure how to get them to run

* get rid of doc comments for the temporary method

* add doc test for is_unknown

* fmt
2020-02-09 18:08:14 -08:00
e606407d79 Add error codes to -c (#1361) 2020-02-08 20:04:53 -08:00
5f4fae5b06 Pipeline sink refactor (#1359)
* Refactor pipeline ahead of block changes. Add '-c' commandline option

* Update pipelining an error value

* Fmt

* Clippy

* Add stdin redirect for -c flag

* Add stdin redirect for -c flag
2020-02-08 18:24:33 -08:00
3687603799 Only spawn external once when no $it argument (#1358) 2020-02-08 17:57:05 -08:00
643b532537 Fixed mv not throwing error when the source path was invalid (#1351)
* Fixed mv not throwing error when the source path was invalid

* Fixed failing test

* Fixed another lint error

* Fix $PATH conflicts in .gitpod.Dockerfile (#1349)

- Use the correct user for gitpod Dockerfile.
- Remove unneeded packages (curl, rustc) from gitpod Dockerfile.

* Added test to check for the error

* Fixed linting error

* Fixed mv not moving files on Windows. (#1342)

Move files correctly in Windows.

* Fixed mv not throwing error when the source path was invalid

* Fixed failing test

* Fixed another lint error

* Added test to check for the error

* Fixed linting error

* Changed error message

* Typo and fixed test

Co-authored-by: Sean Hellum <seanhellum45@gmail.com>
2020-02-07 12:40:48 -05:00
ed86b1fbe8 Fixed mv not moving files on Windows. (#1342)
Move files correctly in Windows.
2020-02-07 11:24:01 -05:00
44a114111e Fix $PATH conflicts in .gitpod.Dockerfile (#1349)
- Use the correct user for gitpod Dockerfile.
- Remove unneeded packages (curl, rustc) from gitpod Dockerfile.
2020-02-06 15:20:18 -05:00
812a76d588 Update more futures-preview to futures (#1346) 2020-02-05 20:28:42 -08:00
e3be849c2a Futures v0.3 upgrade (#1344)
* Upgrade futures, async-stream, and futures_codec

These were the last three dependencies on futures-preview. `nu` itself
is now fully dependent on `futures@0.3`, as opposed to `futures-preview`
alpha.

Because the update to `futures` from `0.3.0-alpha.19` to `0.3.0` removed
the `Stream` implementation of `VecDeque` ([changelog][changelog]), most
commands that convert a `VecDeque` to an `OutputStream` broke and had to
be fixed.

The current solution is to now convert `VecDeque`s to a `Stream` via
`futures::stream::iter`. However, it may be useful for `futures` to
create an `IntoStream` trait, implemented on the `std::collections` types (or
really any `IntoIterator`). If something like this happens, it may be
worthwhile to update the trait implementations on `OutputStream` and
refactor these commands again.

While upgrading `futures_codec`, we remove a custom implementation of
`LinesCodec`, as one has been added to the library. There's also a small
refactor to make the stream output more idiomatic.

[changelog]: https://github.com/rust-lang/futures-rs/blob/master/CHANGELOG.md#030---2019-11-5

* Upgrade sys & ps plugin dependencies

They were previously dependent on `futures-preview`, and `nu_plugin_ps`
was dependent on an old version of `futures-timer`.

* Remove dependency on futures-timer from nu

* Update Cargo.lock

* Fix formatting

* Revert fmt regressions

CI is still on 1.40.0, but the latest rustfmt v1.41.0 has changes to the
`val @ pattern` syntax, causing the linting job to fail.

* Fix clippy warnings
2020-02-05 19:46:48 -08:00
ba1b67c072 Attempt rustup update on each PR (#1345)
* Attempt update on each PR

* Update fmt
2020-02-05 19:28:49 -08:00
fa910b95b7 Have from-ssv not fail for header-only inputs (#1334) 2020-02-05 11:54:14 -08:00
427bde83f7 Allow cp to overwrite existing files (#1339) 2020-02-05 01:54:05 -05:00
7a0bc6bc46 Opt-out unused heim features from sys/ps plugins. (#1335) 2020-02-04 01:51:14 -05:00
c6da56949c Add support for plugin names containing numbers (#1321)
* Add the ability to have numbers in a plugin name. The plugin name must start with an alphabetic character

* Remove the requirement that the first character be alphabetic

* Update cli.rs

Going ahead and changing to a plus to prevent the issue notryanb found

* Update cli.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-02-01 22:08:38 -08:00
5b398d2ed2 Adding cross-platform kill command (#1326)
* Adding kill command, unclean code

* Removing old comments

* Added quiet option, supports variable number of ids

* Made it per_item_command, calling commands directly without the shell
2020-02-01 10:46:28 -08:00
dcdfa2a866 Improve tests and labeling in FilesystemShell (#1305)
Additional `ls` command tests and better FilesystemShell error and label messages.
2020-02-01 03:34:34 -05:00
9474fa1ea5 Improved code in du command (#1320)
Made the code a little easier to read
2020-02-01 03:32:06 -05:00
49a1385543 Make tests work from directory names with spaces (#1325) 2020-01-31 22:12:56 -08:00
6427ea2331 Update Cargo.lock for ichwh fix (#1312)
`ichwh@0.3.1` fixes a bug that causes path searches to fail. We update
`Cargo.lock` to fix this.

Resolves #1207
2020-01-31 22:11:42 -08:00
3610baa227 Default plugins are independent and called from Nu. (#1322) 2020-01-31 17:45:33 -05:00
4e201d20ca Paths from Nu config take priority over external paths. (#1319) 2020-01-31 14:19:47 -05:00
1fa21ff056 Exclude images to reduce crate by 3MB (#1316)
Maybe there are more candidates for exclusion, but 'images/'
seemed obviously unnecessary.

Something I started realizing lately is that cargo puts most
of the root directory into the crate archive, causing huge
crates to appear on crates.io.

Now that I am in China, I do seem to notice every kilobyte.
2020-01-31 10:38:26 -05:00
0bbd12e37f Improve the default help message (#1313) 2020-01-30 20:13:14 -08:00
7df8fdfb28 Rename the now-deprecated add command docs into insert comm… (#1307) 2020-01-30 08:15:20 -05:00
6a39cd8546 Add docs for the calc command (#1290) 2020-01-29 08:34:54 -05:00
dc3370b103 Make a calc command (#1280) 2020-01-29 08:34:36 -05:00
ac5ad45783 Pretty Nu print default, pretty print regular secondary as raw flag. (#1302) 2020-01-29 02:46:54 -05:00
8ef5c47515 Update cargo flags (#1295)
* Update cargo flags

See https://github.com/nushell/nushell.github.io/issues/29

* Update to suggested flag
2020-01-28 22:44:49 -08:00
5b19bebe7d Isolate environment state changes into Host. (#1296)
Moves the state changes for setting and removing environment variables
into the context's host as opposed to calling `std::env::*` directly
from anywhere else.

Introduced FakeHost to prevent environment state changes from leaking
between unit tests and causing random test failures.
2020-01-29 00:40:06 -05:00
2c529cd849 Fix bug where --with-symlink-targets would not display the targets column (#1300) 2020-01-28 21:36:20 -08:00
407f36af29 Remove unused dep (#1298) 2020-01-29 16:44:03 +13:00
763fcbc137 Bump to 0.9.0 (#1297) 2020-01-29 15:17:02 +13:00
7061af712e ls will return error if no files/folders match path/pattern (#1286)
* `ls` will return error if no files/folders match path/pattern

* Revert changes to src/data/files.rs

* Add a name_only flag to dir_entry_dict

Add name_only flag to indicate if the caller only cares about filenames
or wants the whole path

* Update ls changes from feedback

* Little cleanup

* Resolve merge conflicts

* lints
2020-01-29 05:58:31 +13:00
9b4ba09c95 Nu env vars from config have higher priority. (#1294) 2020-01-28 02:10:15 -05:00
9ec6d0c90e Add --with-symlink-targets option for the ls command that displays a new column for the target files of symlinks (#1292)
* Add `--with-symlink-targets` option for the `ls` command that displays a new column for the target files of symlinks

* Fix clippy warning

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-01-28 19:48:41 +13:00
f20a4a42e8 Lets the expander look for tokens from the start. (#1293) 2020-01-28 01:03:28 -05:00
caa6830184 Baseline environment and configuration work. (#1287) 2020-01-27 22:13:22 -05:00
f8be1becf2 Updated rustyline to 6.0.0. Added completion_mode config (#1289)
* Updated rustyline to 6.0.0. Added completion_mode config

* Formatted completion_mode config
2020-01-27 16:41:17 +13:00
af51a0e6f0 Update motto 2020-01-27 16:32:02 +13:00
23d11d5e84 Nu source overview (#1282)
* add some notes into README for more elaboration

* rewrite the overview

* remove unused first line

* add last part about tracing and debugging

* change the wording to make it easier to read

* Add example of metadata system

* Add contact information as other helpful links
2020-01-27 15:55:02 +13:00
6da9e2aced Upgrade crossterm (#1288)
* WIP

* Finish porting to new crossterm

* Fmt
2020-01-27 15:51:46 +13:00
32dfb32741 Switch from subprocess crate to the builtin std::process (#1284)
* Switch from subprocess crate to the builtin std::process

* Update external.rs

* Update external.rs

* Update external.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-01-26 16:03:21 +13:00
d48f99cb0e compute directory sizes from contained files and directories (#1250)
* compute directory sizes from contained files and directories

* De-lint

* Revert "De-lint"

This reverts commit 9df9fc07d777014fef8f5749a84b4e52e1ee652a.

* Revert "compute directory sizes from contained files and directories"

This reverts commit d43583e9aa20438bd613f78a36e641c9fd48cae3.

* Nu du command

* Nu du for you

* Add async support

* Lints

* so much bug fixing
2020-01-26 15:43:29 +13:00
35359cbc22 Buffer tables until a timeout or threshold is met (#1283) 2020-01-26 09:09:51 +13:00
b52dbcc8ef Separate dissimilar tables into separate tables (#1281)
* Allow the table command to stream

* Next part of table view refactor
2020-01-26 07:10:20 +13:00
4429a75e17 Make ls show only the file name (#1276)
* Make ls show only the file name

* Refactor and remove unwraps

* Put functionality in separate flag
2020-01-26 05:20:33 +13:00
583f27dc41 Added attributes to from-xml command (#1272)
* Added attributes to from-xml command

* Added attributes as their own rows

* Removed unnecessary lifetime declarations

* from-xml now has children and attributes side by side

* Fixed tests and linting

* Fixed lint-problem
2020-01-26 05:16:40 +13:00
83db5c34c3 Add docs for the from-ods and from-xlsx commands (#1279) 2020-01-26 04:31:20 +13:00
cdbfdf282f Allow the table command to stream (#1278) 2020-01-25 16:13:12 +13:00
a5e1372bc2 RM error on bad filename (#1244)
* rm error on bad filename

* De-lint

* Fix error message in test
2020-01-25 08:16:41 +13:00
798a24eda5 Soften restrictions for external parameters (#1277)
* Soften restrictions for external parameters

* Add test
2020-01-25 08:14:49 +13:00
a2bb23d78c Update README.md 2020-01-25 06:51:24 +13:00
d38a63473b Improve shelling out (#1273)
Improvements to shelling out
2020-01-24 08:24:31 +13:00
2b37ae3e81 Switch to using subprocess::shell (#1264)
* Switch to using `shell`

Switch to using the shell for subprocess to enable more natural shelling out.

* Update external.rs

* This is a test with .shell() for external

* El pollo loco's PR

* co co co

* Attempt to fix windows

* Fmt

* Less is more?

Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-24 05:21:05 +13:00
bc5a969562 [Gitpod] Add some VSCode extensions (#1268)
VSCode extensions for productive work.
2020-01-23 00:51:08 -05:00
fe4ad5f77e Color named type help special case. (#1263)
Refactored out help named type as switch.
2020-01-22 19:36:48 -05:00
07191754bf Update ichwh to 3.0 (#1267)
The [newest update for ichwh][changes] introduced some breaking changes.
This PR bumps the version and refactors the `which` command to take
these into account.

[changes]: https://gitlab.com/avandesa/ichwh-rs/blob/master/CHANGELOG.md#030-2020-01-22
2020-01-23 12:26:49 +13:00
66bd331ba9 Make futures-timer a non-optional dependency (#1265)
Originally, it was only brought in with the `ps` feature enabled.
However, commit ba7a17 made the crate's use in
`src/commands/classified/external.rs` unconditional, causing the build
to fail when built without the `ps` feature.

This commit fixes the problem by making it a non-optional dependency.
2020-01-23 10:56:29 +13:00
762c798670 It ls test setup rewrite. (#1260) 2020-01-21 22:56:12 -05:00
3c01526869 Test binaries no longer belong to stable or default features. (#1259) 2020-01-21 22:00:27 -05:00
7efb31a4e4 Restructure and streamline token expansion (#1123)
Restructure and streamline token expansion

The purpose of this commit is to streamline the token expansion code, by
removing aspects of the code that are no longer relevant, removing
pointless duplication, and eliminating the need to pass the same
arguments to `expand_syntax`.

The first big-picture change in this commit is that instead of a handful
of `expand_` functions, which take a TokensIterator and ExpandContext, a
smaller number of methods on the `TokensIterator` do the same job.

The second big-picture change in this commit is fully eliminating the
coloring traits, making coloring a responsibility of the base expansion
implementations. This also means that the coloring tracer is merged into
the expansion tracer, so you can follow a single expansion and see how
the expansion process produced colored tokens.

One side effect of this change is that the expander itself is marginally
more error-correcting. The error correction works by switching from
structured expansion to `BackoffColoringMode` when an unexpected token
is found, which guarantees that all spans of the source are colored, but
may not be the most optimal error recovery strategy.

That said, because `BackoffColoringMode` only extends as far as a
closing delimiter (`)`, `]`, `}`) or pipe (`|`), it does result in a
fairly granular correction strategy.

The current code still produces an `Err` (plus a complete list of
colored shapes) from the parsing process if any errors are encountered,
but this could easily be addressed now that the underlying expansion is
error-correcting.

This commit also colors any spans that are syntax errors in red, and
causes the parser to include some additional information about what
tokens were expected at any given point where an error was encountered,
so that completions and hinting could be more robust in the future.

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-21 17:45:03 -05:00
c8dd7838a8 Bump Pipeline images (#1255)
* Bump Pipeline images

* Update azure-pipelines.yml

* Update azure-pipelines.yml
2020-01-22 10:39:31 +13:00
3b57ee5dda Add internal clear command (#1249)
* Add clear.rs

* update

* update

* cross-platformify

* update

* fix

* format

* fix warnings

* update implementation

* remove return

* remove semicolon

* change from `.output()` to `.status()`

* format
2020-01-20 20:05:32 +13:00
fb977ab941 add automated setup badge and add .gitpod.yml patch (#1246)
* add automated setup badge and add .gitpod.yml patch

* Update .gitpod.yml
2020-01-20 14:40:04 +13:00
e059c74a06 Add support for primitive values to sort-by (#1241)
* Remove redundant clone

* Add support for primitive values to sort-by #1238
2020-01-20 08:08:36 +13:00
47d987d37f Add ctrl_c to RunnablePerItemContext. (#1239)
Also, this commit makes `ls` a per-item command.

A command that processes things item by item may still take some time to stream
out the results from a single item. For example, `ls` on a directory with a lot
of files could be interrupted in the middle of showing all of these files.
2020-01-19 15:25:07 +13:00
3abfefc025 More docs and random fixes (#1237) 2020-01-19 08:42:36 +13:00
a5c5b4e711 Add --help for commands (#1226)
* WIP --help works for PerItemCommands.

* De-linting

* Add more comments (#1228)

* Add some more docs

* More docs

* More docs

* More docs (#1229)

* Add some more docs

* More docs

* More docs

* Add more docs

* External commands: wrap values that contain spaces in quotes (#1214) (#1220)

* External commands: wrap values that contain spaces in quotes (#1214)

* Add fn's argument_contains_whitespace & add_quotes (#1214)

*  Fix formatting with cargo fmt

* Don't wrap argument in quotes when $it is already quoted (#1214)

* Implement --help for internal commands

* Externals now spawn independently. (#1230)

This commit changes the way we shell out externals when using the `"$it"` argument. It also pipes each row to the external's stdin when no `"$it"` argument is present.

Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better idea of the lower-level details of external commands until we can find the right abstractions for making them more generic and unified within the pipeline-calling logic of Nu internals and externals.

* Poll externals quicker. (#1231)

* WIP --help works for PerItemCommands.

* De-linting

* Implement --help for internal commands

* Make having --help the default

* Update test to include new default switch

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
Co-authored-by: Koenraad Verheyden <mail@koenraadverheyden.com>
Co-authored-by: Andrés N. Robalino <andres@androbtech.com>
2020-01-18 11:46:18 +13:00
ba9cb753d5 Bump some of our dependencies (#1234) 2020-01-18 09:35:48 +13:00
ba7a1752db Poll externals quicker. (#1231) 2020-01-16 06:27:12 -05:00
29431e73c2 Externals now spawn independently. (#1230)
This commit changes the way we shell out externals when using the `"$it"` argument. It also pipes each row to the external's stdin when no `"$it"` argument is present.

Further separation of logic (preparing the external's command arguments, getting the data for piping, emitting values, spawning processes) will give us a better idea of the lower-level details of external commands until we can find the right abstractions for making them more generic and unified within the pipeline-calling logic of Nu internals and externals.
2020-01-16 04:05:53 -05:00
d29fe6f6de External commands: wrap values that contain spaces in quotes (#1214) (#1220)
* External commands: wrap values that contain spaces in quotes (#1214)

* Add fn's argument_contains_whitespace & add_quotes (#1214)

*  Fix formatting with cargo fmt

* Don't wrap argument in quotes when $it is already quoted (#1214)
2020-01-16 13:38:16 +13:00
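A hypothetical sketch of the quoting rule those commits describe; `add_quotes_if_needed` is an illustrative name, not the actual nushell function:

```rust
fn add_quotes_if_needed(arg: &str) -> String {
    // Wrap in quotes only when the value contains whitespace
    // and is not already quoted.
    let already_quoted =
        arg.len() >= 2 && arg.starts_with('"') && arg.ends_with('"');
    if arg.contains(char::is_whitespace) && !already_quoted {
        format!("\"{arg}\"")
    } else {
        arg.to_string()
    }
}

fn main() {
    assert_eq!(add_quotes_if_needed("plain"), "plain");
    assert_eq!(add_quotes_if_needed("has space"), "\"has space\"");
    assert_eq!(add_quotes_if_needed("\"already quoted\""), "\"already quoted\"");
}
```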
e2e9abab0a More docs (#1229)
* Add some more docs

* More docs

* More docs

* Add more docs
2020-01-16 07:32:46 +13:00
2956b0b087 Add more comments (#1228)
* Add some more docs

* More docs

* More docs
2020-01-16 05:28:31 +13:00
b32eceffb3 Add some comments (#1225) 2020-01-14 20:38:56 +13:00
3adf52b1c4 update .gitignore to exclude target directories in the crate directory (#1221) 2020-01-14 20:14:24 +13:00
78a644da2b Restrict Nu with a cleaned environment. (#1222) 2020-01-13 23:17:20 -05:00
98028433ad $it: add conversion from Int for external commands (#1218) 2020-01-13 18:57:44 -05:00
2ab5803f00 $it: add conversion from Path for external commands (#1210)
* $it: add conversion from Path for external commands (#1203)

* Replace PathBuf::to_str with to_string_lossy
2020-01-14 05:41:18 +13:00
65980c7beb Revert 8cadc5a4 (#1211) 2020-01-13 19:38:58 +13:00
29fd8b55fb Keep dummies in default features for convenience. (#1212) 2020-01-13 01:17:56 -05:00
2f039b3abc Fix crash when attempting to enter help shell (#1201)
`enter help` would result in a crash
2020-01-13 17:27:00 +13:00
d3dae05714 Groundwork for coverage with Nu internals. (#1205) 2020-01-12 16:44:22 -05:00
5fd3191d91 Fix randomly failing test (#1200)
* Fix randomly failing test

* Fix randomly failing test
2020-01-13 06:03:28 +13:00
0dcd90cb8f Silence stdout for test runs. (#1198) 2020-01-12 04:14:10 -05:00
02d0a4107e A few ls improvements. New welcome message (#1195) 2020-01-12 09:49:20 +13:00
63885c4ee6 Change black to other colors (#1194) 2020-01-12 06:21:59 +13:00
147bfefd7e Sort ls case-insensitively by default (#1192) 2020-01-11 20:59:55 +13:00
60043df917 Allow ColumnPaths when picking tables. (#1191) 2020-01-11 01:45:09 -05:00
6d3a30772d Get error message improvements. (#1185)
More specific "get" command error messages + Test refactoring.
2020-01-10 10:44:24 -05:00
347f91ab53 Have internal/external/pipelines taken an optional InputStream. (#1182)
Primarily, this fixes an issue where we always open a stdin pipe for
external commands, which can break various interactive commands (e.g.,
editors).
2020-01-09 22:31:44 -08:00
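The stdin issue above comes down to when a pipe is attached; a minimal standard-library sketch (assuming a Unix `cat` binary is available):

```rust
use std::io::Write;
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Attach a stdin pipe only when there is upstream data to feed in.
    let mut child = Command::new("cat")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()?;
    child.stdin.take().unwrap().write_all(b"hello")?; // drop closes the pipe
    let out = child.wait_with_output()?;
    assert_eq!(out.stdout, b"hello".to_vec());
    // With no upstream input, `Stdio::inherit()` leaves stdin connected to
    // the terminal instead, so interactive externals (e.g. editors) work.
    Ok(())
}
```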
5692a08e7f Update README.md 2020-01-10 09:40:30 +13:00
515a3b33f8 Thin-lines for tables for better rendering (#1181)
The thick lines are pretty subtle, and some fonts have issues with them. Keeping the lines consistent seems to work better across fonts.
2020-01-09 12:33:02 -08:00
c3e466e464 Make debug command always pretty-print (Resolves #1178) (#1180) 2020-01-09 11:24:21 -08:00
00c0327031 Support more Values to plain string. (#1169)
* Support more Values to plain string.

* Continue converting to delimited data for simple values.
2020-01-08 06:12:59 -05:00
7451414b9e Eliminate ClassifiedInputStream in favour of InputStream. (#1056) 2020-01-07 13:00:01 -08:00
41ebc6b42d Bump to 0.8.0 (#1166) 2020-01-07 20:08:31 +13:00
b574dc6365 Add the from-ods command (#1161)
* Put a sample_data.ods file for testing

This is a copy of the sample_data.xlsx file but in ods format

* Add the from-ods command

Most of the work was doing `rg xlsx` and then copy/paste with light editing

* Add tests for the from-ods command

* Fix failing test

The problem was improper filename sorting in the test `prepares_and_decorates_filesystem_source_files`
2020-01-07 19:35:00 +13:00
4af9e1de41 Resolves #750 (#1164)
Pick now produces an error when none of the columns are found
2020-01-07 17:06:48 +13:00
77d856fd53 Last unwraps (#1160)
* Work through most of the last unwraps

* Finish removing unwraps
2020-01-04 19:44:17 +13:00
6dceabf389 Isolate data processing helpers. (#1159)
Isolate data processing helpers. Remove unwraps; we're now down to zero unwraps.
2020-01-03 23:00:39 -05:00
5919c6c433 Remove unwraps (#1153)
* Remove a batch of unwraps

* finish another batch
2020-01-04 10:11:21 +13:00
339a2de0eb More ununwraps (#1152)
* More ununwraps

* More ununwraps

* Update completer.rs

* Update completer.rs
2020-01-03 06:51:20 +13:00
3e3cb15f3d Yet more ununwraps (#1150) 2020-01-02 20:07:17 +13:00
5e31851070 A couple more (#1149) 2020-01-02 18:24:41 +13:00
0f626dd076 Another batch of un-unwrapping (#1148)
Another batch of un-unwrappings
2020-01-02 17:02:46 +13:00
aa577bf9bf Clean up some unwraps (#1147) 2020-01-02 09:45:32 +13:00
25298d35e4 Bump rustyline (#1146)
* Slightly improve new which command

* Bump rustyline
2020-01-02 06:54:25 +13:00
78016446dc Slightly improve new which command (#1145) 2020-01-01 20:47:25 +13:00
b304de8199 Rewrite which (#1144)
* Detect built-in commands passed as args to `which`

This expands the built-in `which` command to detect nushell commands
that may have the same name as a binary in the path.

* Allow which to interpret multiple arguments

Previously, it would discard any argument besides the first. This allows
`which` to process multiple arguments. It also makes the output a stream
of rows.

* Use map to build the output

* Add boolean column for builtins

* Use macros for entry creation shortcuts

* Process command args and use async_stream

In order to use `ichwh`, I'll need to use async_stream. But in order to
avoid lifetime errors with that, I have to process the command args
before using them. I'll admit I don't fully understand what is going on
with the `args.process(...)` function, but it works.

* Use `ichwh` for path searching

This commit transitions from `which` to `ichwh`. The path search is now
done asynchronously.

* Enable the `--all` flag on `which`

* Make `which` respect external commands

Escaped commands passed to `which` (e.g., `which "^ls"`) are now searched
before builtins.

* Fix clippy warnings

This commit resolves two warnings from clippy, in light of #1142.

* Update Cargo.lock to get new `ichwh` version

`ichwh@0.2.1` has support for local paths.

* Add documentation for command
2020-01-01 19:45:27 +13:00
72838cc083 Move to using clippy (#1142)
* Clippy fixes

* Finish converting to use clippy

* fix warnings in new master

* fix windows

* fix windows

Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
2019-12-31 20:36:08 +13:00
8093612cac Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight (#1141)
* Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight

* Document changes

* Format
2019-12-31 17:06:36 +13:00
f37f29b441 Add uniq command (#1132)
* start playing with ways to use the uniq command

* WIP

* Got uniq working, but still need to figure out args issue and add tests

* Add some tests for uniq

* fmt

* remove commented out code

* Add documentation and some additional tests showing uniq values and rows. Also removed args TODO

* add changes that didn't get committed

* whoops, I didn't save the docs correctly...

* fmt

* Add a test for uniq with nested json

* Add another test

* Fix unique-ness when json keys are out of order and make the test json more complicated
2019-12-31 17:05:02 +13:00
dba82ac530 handle single quoted external command args (#1139)
fixes #1138
2019-12-31 06:47:14 +13:00
0615adac94 Inc refactoring, Value helper test method extractions, and more integration helpers. (#1135)
* Manifests check. Ignore doctests for now.

* We continue with refactorings towards the separation of concerns between
crates. `nu_plugin_inc` and `nu_plugin_str` common test helpers usage
has been refactored into `nu-plugin` value test helpers.

Inc also uses the new API for integration tests.
2019-12-29 00:17:24 -05:00
21e508009f Refactor struct names for old commands (ls, cd, pwd) (#1133) 2019-12-29 10:33:31 +13:00
a9317d939f Update README.md 2019-12-28 15:27:51 +13:00
65d843c2a1 Merge pull request #1128 from andrasio/nu-plugin-extract
Extract nu-plugin crate.
2019-12-27 09:16:18 -05:00
f6c62bf121 Nu plugins now depend on nu-plugin crate. 2019-12-27 08:52:15 -05:00
b4bc5fe9af Merge pull request #1126 from jonathandturner/utf8_fix
UTF8 fix for twitter-reported issue
2019-12-27 19:48:42 +13:00
10368d7060 UTF8 fix for twitter-reported issue 2019-12-27 19:25:44 +13:00
68a314b5cb UTF8 fix for twitter-reported issue 2019-12-27 19:03:00 +13:00
3c7633ae9f Merge pull request #1125 from notryanb/update-readme
update readme to reflect >= 0.7.2 $nu variables
2019-12-27 15:42:25 +13:00
dba347ad00 update readme to show >= 0.7 nu path 2019-12-26 20:08:30 -05:00
bfba2c57f8 Merge pull request #1124 from quebin31/master
Fix positional macro on crate nu-macros
2019-12-27 07:16:47 +13:00
c69bf9f46f Merge branch 'master' of https://github.com/nushell/nushell 2019-12-26 12:32:28 -05:00
7ce1ddc6fd Fixed optional and required argument in signature.
This fixes issues like #1117
2019-12-26 12:29:41 -05:00
e7ce6f2fcd Merge pull request #1113 from jonathandturner/bump_0_7_2
Bump to 0.7.2
2019-12-24 14:51:58 +13:00
0c786bb890 Bump to 0.7.2 2019-12-24 14:51:10 +13:00
8d31c32bda Merge pull request #1112 from jonathandturner/assorted_fixes
Fix an assortment of issues
2019-12-24 14:45:15 +13:00
e7fb15be59 Fix an assortment of issues 2019-12-24 14:26:47 +13:00
be7550822c Merge pull request #1109 from nushell/ctrl_l_clear
Move to git rustyline to fix Ctrl-L
2019-12-24 05:48:42 +13:00
0ce216eec4 Move to git rustyline to fix Ctrl+L 2019-12-24 05:26:30 +13:00
1fe85cb91e Merge pull request #1108 from thegedge/faster-pipelines
Wait for process instead of polling its status.
2019-12-23 07:06:16 +13:00
8cadc5a4ac Wait for process instead of polling its status.
This provides a huge performance boost for pipelines that end in an
external command. Rough testing shows an improvement from roughly 400ms
to 30ms when `cat`-ing a large file.
2019-12-22 14:14:03 -03:30
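The performance fix above replaces a busy-poll with a blocking wait; a standard-library sketch (assuming a Unix `true` binary is on the path):

```rust
use std::process::Command;

fn main() -> std::io::Result<()> {
    let mut child = Command::new("true").spawn()?;
    // Polling: `child.try_wait()?` returns immediately and must be retried
    // in a loop, burning CPU or adding sleep latency between checks.
    // Blocking: `wait` parks the thread until the OS reports the exit,
    // which is where the ~400ms -> ~30ms improvement comes from.
    let status = child.wait()?;
    assert!(status.success());
    Ok(())
}
```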
f9da7f7d58 Merge pull request #1102 from jonathandturner/bump_nu
Bump nu version
2019-12-20 10:54:30 +13:00
367f11a62e Bump nu version 2019-12-20 09:03:54 +13:00
8a45ca9cc3 Merge pull request #1100 from nushell/fix-stable
Fix the stable plugins to correct list
2019-12-20 06:37:55 +13:00
e336930fd8 Update Cargo.toml 2019-12-20 06:18:06 +13:00
172ccc910e Fix the stable plugins to correct list 2019-12-20 06:01:42 +13:00
a8425daf14 Merge pull request #1097 from jonathandturner/fix_workspace
Fix the workspace I commented out
2019-12-18 10:14:16 -08:00
b629136528 Fix the workspace I commented out 2019-12-19 06:58:23 +13:00
91ebb7f718 Merge pull request #1096 from jonathandturner/copy_core_plugins
Copy core plugins back so we can publish
2019-12-18 08:54:31 -08:00
96484161c0 Copy core plugins back so we can publish 2019-12-19 05:35:17 +13:00
d21ddeeae6 Merge pull request #1094 from jonathandturner/rename_test_support
Rename test-support to nu-test-support
2019-12-17 11:08:24 -08:00
4322d373e6 More renames 2019-12-18 07:54:39 +13:00
08571392e6 Rename test-support to nu-test-support 2019-12-18 07:41:47 +13:00
f52235b1c1 Merge pull request #1093 from jonathandturner/fix_asset
Try to fix asset building
2019-12-17 10:28:15 -08:00
a66147da47 Try to fix asset building 2019-12-18 07:09:38 +13:00
df778afd1f Try to fix asset building 2019-12-18 07:05:12 +13:00
d7ddaa376b Merge pull request #1092 from jonathandturner/oops
More oops
2019-12-17 09:11:52 -08:00
2ce892c6f0 More oops 2019-12-18 06:11:14 +13:00
28179ef450 Merge pull request #1091 from jonathandturner/add_descs
Oops
2019-12-17 09:09:30 -08:00
2c6336c806 Oops 2019-12-18 06:08:45 +13:00
761fc9ae73 Merge pull request #1090 from jonathandturner/add_descs
Add missing descriptions and licenses to subcrates
2019-12-17 09:07:36 -08:00
314c3c4a97 Add missing descriptions and licenses to subcrates 2019-12-18 06:07:00 +13:00
f7f1fba94f Merge pull request #1089 from jonathandturner/bump
Bump Nu version
2019-12-17 08:54:02 -08:00
14817ef229 Subcrate versions 2019-12-18 05:18:10 +13:00
98233dcec1 Subcrate versions 2019-12-18 05:09:53 +13:00
6540509911 Bump Nu version 2019-12-18 04:55:49 +13:00
594eae1cbc Merge pull request #1085 from andrasio/externals-line
$it can contain a string line or plain string data.
2019-12-16 17:42:49 -05:00
5e961815fc $it can contain a string line or plain string data. 2019-12-16 17:27:36 -05:00
fa9329c8e3 Merge pull request #1082 from sebastian-xyz/update-book-links
update links to books
2019-12-15 14:34:38 -08:00
6c577e18ca Merge pull request #1081 from andrasio/test-extract
Start test organization facelift.
2019-12-15 11:46:58 -05:00
4034129dba This commit is the continuing phase of extracting functionality to subcrates. We extract test helpers and begin to change Nu shell's test organization along with it. 2019-12-15 11:34:58 -05:00
52cf65c19e Merge pull request #1080 from andrasio/command-refactor
Separate internal and external command definitions.
2019-12-15 08:49:45 -05:00
cbbb246a6d update links to books 2019-12-15 13:56:26 +01:00
87cc6d6f01 Separate internal and external command definitions. 2019-12-15 01:24:31 -05:00
4b9ef5a9d0 Merge pull request #1079 from jonathandturner/bump_some_deps
Bump heim and necessary deps
2019-12-14 09:32:23 -08:00
31c703891a Bump heim and necessary deps 2019-12-15 02:27:14 +13:00
550bda477b Merge pull request #1060 from naufraghi/issues-972-expand-tilde-as-home-in-external-commands
Expand tilde as home in external commands
2019-12-13 08:46:08 -08:00
219b7e64cd Use shellexpand to expand ~ in external commands
Add tests for ~tilde expansion:

- test that "~" is expanded (no more "~" in output)
- ensure that "1~1" is not expanded to "1/home/user1" as it was
  before

Fixes #972

Note: the first test does not check the literal expansion because
the path on Windows is expanded as a Linux path, but the correct
expansion may come for free once `shellexpand` uses the `dirs`
crate too (https://github.com/netvl/shellexpand/issues/3).
2019-12-13 11:54:41 +01:00
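The rule the tests above describe can be sketched as a minimal standalone example (a hypothetical hand-rolled version for illustration; nushell itself delegates this to the `shellexpand` crate, and the `expand_tilde` function and `home` parameter here are assumptions):

```rust
// Minimal sketch of the tilde rule: expand "~" only when it is the
// leading path component, so strings like "1~1" stay untouched.
fn expand_tilde(path: &str, home: &str) -> String {
    if path == "~" {
        home.to_string()
    } else if let Some(rest) = path.strip_prefix("~/") {
        format!("{}/{}", home, rest)
    } else {
        path.to_string() // "1~1" and friends are left alone
    }
}

fn main() {
    assert_eq!(expand_tilde("~", "/home/user"), "/home/user");
    assert_eq!(expand_tilde("~/docs", "/home/user"), "/home/user/docs");
    assert_eq!(expand_tilde("1~1", "/home/user"), "1~1");
    println!("ok");
}
```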
98c59f77b2 Merge pull request #1078 from nushell/enable_coloring_in_tokens
Remove the coloring_in_tokens feature flag
2019-12-12 13:08:35 -08:00
e8800fdd0c Remove the coloring_in_tokens feature flag
Stabilize and enable
2019-12-12 11:34:43 -08:00
09f903c37a Merge pull request #1077 from nushell/implement-signature-syntax
Add Range and start Signature support
2019-12-11 21:58:09 -08:00
57af9b5040 Add Range and start Signature support
This commit contains two improvements:

- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax

Implementing the Range syntax resulted in cleaning up how operators in
the core syntax works. There are now two kinds of infix operators

- tight operators (`.` and `..`)
- loose operators

Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.

Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.

The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.

Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.

The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.

That work establishes a few things:

- `;` and newlines are handled in the core grammar, and both count as
  "separators"
- line comments begin with `#` and continue until the end of the line

In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
REPL usage.

We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
2019-12-11 16:41:07 -08:00
16272b1b20 Merge pull request #1076 from jonathandturner/finish_plugin_refactor
Trying this as a workaround to the [[bin]] issue
2019-12-09 20:20:51 -08:00
1dcbd89a89 Trying this as a workaround to the [[bin]] issue 2019-12-10 16:57:55 +13:00
eb6ef02ad1 Merge pull request #1075 from jonathandturner/finish_plugin_refactor
Finish plugin refactor
2019-12-09 18:34:26 -08:00
17586bdfbd Fix missing dep 2019-12-10 15:13:22 +13:00
0e98cf3f1e Merge branch 'finish_plugin_refactor' of github.com:jonathandturner/nushell into finish_plugin_refactor 2019-12-10 13:59:44 +13:00
e2a95c3e1d Move str and inc to core plugins 2019-12-10 13:59:13 +13:00
5cb7df57fc Update azure-pipelines.yml 2019-12-10 13:09:25 +13:00
88f899d341 Move some plugins back to being core shippable plugins 2019-12-10 13:05:40 +13:00
7d70b5feda Try to fix CI with new subcrates 2019-12-10 08:14:58 +13:00
fd6ee03391 Remove old ValueExt 2019-12-10 07:52:01 +13:00
9f702fe01a Move the remainder of the plugins to crates 2019-12-10 07:39:51 +13:00
c9d9eec7f8 Merge pull request #1073 from jonathandturner/docker_wrap
Remove partial docker plugin. Embed->wrap
2019-12-08 21:08:03 -08:00
38cbfdb8a9 Remove partial docker plugin. Embed->wrap 2019-12-09 17:41:09 +13:00
f9b7376949 Merge pull request #1072 from jonathandturner/format_parse
Move format/parse to core commands
2019-12-08 18:26:35 -08:00
e98ed1b43d Move format/parse to core commands 2019-12-09 15:04:13 +13:00
251c3e103d Move format/parse to core commands 2019-12-09 14:57:53 +13:00
d26e938436 Merge pull request #1071 from jonathandturner/fix_1068
Fix 1068
2019-12-08 12:38:10 -08:00
dbadf9499e Fix 1068 2019-12-09 08:15:14 +13:00
28df1559ea Merge pull request #1070 from jonathandturner/upgrade_some_deps
Upgrade some dependencies
2019-12-08 10:19:39 -08:00
91784218c0 Upgrade some dependencies 2019-12-09 06:56:21 +13:00
eeec5e10c3 Merge pull request #1069 from jonathandturner/param_complete
Named param completion
2019-12-08 08:55:13 -08:00
0515ed976c Fix panic 2019-12-09 05:36:24 +13:00
f653992b4a A little cleanup 2019-12-08 19:42:43 +13:00
b5f8c1cc50 param completions work now 2019-12-08 19:23:31 +13:00
f9a46ce1e7 WIP param completions 2019-12-08 19:04:23 +13:00
b6ba7f97fd WIP param completions 2019-12-08 18:58:53 +13:00
7a47905f11 Merge pull request #1066 from thibran/fix-more-clippy-warnings
Fix more Clippy warnings
2019-12-07 16:10:36 -08:00
683f4c35d9 Fix more Clippy warnings
cargo clippy -- -W clippy::correctness
2019-12-07 21:04:58 +01:00
dfa5173cf4 Merge pull request #1064 from thibran/split-table-from-list
split format/table::from_list into multiple functions
2019-12-07 09:00:14 -08:00
04b214bef6 split format/table::from_list into multiple functions 2019-12-07 14:52:52 +01:00
37cb7fec77 Merge pull request #1063 from jonathandturner/unused_deps
Remove some unused deps
2019-12-06 23:44:52 -08:00
8833969e4a Remove some unused deps 2019-12-07 20:23:29 +13:00
bda238267c Merge pull request #1062 from jonathandturner/fetch_post
Fetch/post as plugins
2019-12-06 22:46:30 -08:00
d07dc57537 Add missing fallback case 2019-12-07 19:24:58 +13:00
d0a2888e88 Finish adding makeshift support for the fetch/post plugins 2019-12-07 17:23:59 +13:00
cec2eff933 Merge branch 'master' into fetch_post 2019-12-07 16:53:50 +13:00
38b7a3e32b WIP move post/fetch to plugins 2019-12-07 16:46:05 +13:00
9dfb6c023f Merge pull request #1061 from thibran/fix-most-clippy-warnings
Fix most Clippy performance warnings
2019-12-06 19:26:20 -08:00
cde92a9fb9 Fix most Clippy performance warnings
command used: cargo clippy -- -W clippy::perf
2019-12-06 23:25:47 +01:00
5622bbdd48 Merge pull request #1059 from coolshaurya/patch-1
Fix minor error in reject command docs
2019-12-06 08:13:55 -08:00
3d79a9c37a Fix minor error in reject command docs 2019-12-06 17:27:14 +05:30
a2a5b30568 Merge pull request #1058 from jonathandturner/edit_insert_core
Move edit and insert to core
2019-12-05 12:42:19 -08:00
768adb84a4 Remove commented out region 2019-12-06 09:19:24 +13:00
26b0250e22 Remove commented out region 2019-12-06 09:18:16 +13:00
6893850fce Move edit and insert to core 2019-12-06 09:15:41 +13:00
8834e6905e Merge pull request #1055 from jonathandturner/ps_sys_crates
Extract ps and sys subcrates. Move helper methods to UntaggedValue
2019-12-04 12:24:45 -08:00
1d5f13ddca formatting 2019-12-05 08:57:03 +13:00
d12c16a331 Extract ps and sys subcrates. Move helper methods to UntaggedValue 2019-12-05 08:52:31 +13:00
ecf47bb3ab Merge pull request #1054 from jonathandturner/binaryview_crate
Move binaryview to a sub-crate
2019-12-04 10:17:01 -08:00
a4bb5d4ff5 Move binaryview to a sub-crate 2019-12-05 06:51:20 +13:00
e9ee7bda46 Merge pull request #1052 from jonathandturner/fix_textview
Re-enable the textview plugin, now its own crate
2019-12-04 08:49:40 -08:00
1d196394f6 Merge pull request #1045 from sebastian-xyz/range
add range command
2019-12-04 08:37:03 -08:00
cfda67ff82 Finish making the textview plugin optional 2019-12-05 05:28:48 +13:00
59510a85d1 fix build warnings 2019-12-04 17:13:21 +01:00
35edf22ac3 Test all subcrates 2019-12-04 19:53:06 +13:00
871fc72892 Test all subcrates 2019-12-04 19:49:38 +13:00
1fcf671ca4 Re-enable the textview plugin, now its own crate 2019-12-04 19:38:40 +13:00
ecebe1314a update to new crates structure 2019-12-03 20:56:39 +01:00
bda5db59c8 Merge remote-tracking branch 'upstream/master' into range 2019-12-03 20:23:49 +01:00
4526d757b6 Merge pull request #1049 from andrasio/embed-list
embed as column when embedding a list
2019-12-03 02:51:58 -05:00
e5405d7f5c embed as column when embedding a list 2019-12-03 02:26:01 -05:00
201506a5ad add tests for range + run rustfmt 2019-12-03 08:24:49 +01:00
49f9253ca2 Merge pull request #1047 from jonathandturner/new_lines
Add new line primitive, bump version, allow bare filepaths
2019-12-02 23:14:08 -08:00
efc879b955 Add new line primitive, bump version, allow bare filepaths 2019-12-03 19:44:59 +13:00
3fa03eb7a4 Merge pull request #1046 from nushell/fix-external-words
Clean up expansion of external words
2019-12-02 17:12:50 -08:00
24bad78607 Clean up expansion of external words
Previously, external words accidentally used
ExpansionRule::new().allow_external_command(), when it should have been
ExpansionRule::new().allow_external_word().

External words are the broadest category in the parser, and are the
appropriate category for external arguments. This was just a mistake.
2019-12-02 16:34:33 -08:00
8de4c9dbb7 Merge pull request #1044 from nushell/protocol-extraction
Extract into crates
2019-12-02 14:29:04 -08:00
f858e854bf Fix a rebase mistake 2019-12-02 13:48:34 -08:00
87dbd3d5ac Extract build.rs 2019-12-02 13:14:51 -08:00
fe66b4c8ea Merge remote-tracking branch 'origin/master' into protocol-extraction 2019-12-02 11:16:00 -08:00
8390cc97e1 add range command 2019-12-02 20:15:14 +01:00
c0a7d4e2a7 Update .gitpod.yml 2019-12-02 11:02:59 -08:00
ce23a672d9 add documentation for compact command 2019-12-02 11:02:59 -08:00
9851317aeb add documentation for default command 2019-12-02 11:02:59 -08:00
3fb4a5d6e6 add documentation for format 2019-12-02 11:02:59 -08:00
340e701124 fix error in save.md 2019-12-02 11:02:59 -08:00
36938a4407 add documentation for save, config 2019-12-02 11:02:59 -08:00
6a6589a357 Update where.md 2019-12-02 11:02:59 -08:00
b94a32e523 add documentation for from-json, from-yaml, history, split-row 2019-12-02 11:02:59 -08:00
7db3c69984 update histogram, nth documentation 2019-12-02 11:02:59 -08:00
5406450c42 Add documentation for histogram, split-column 2019-12-02 11:02:59 -08:00
d6a6e16d21 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-12-02 11:02:59 -08:00
ea1b65916d Update Cargo.toml 2019-12-02 11:02:59 -08:00
cd9d9ad50b improve duration print 2019-12-02 11:02:58 -08:00
552272b37e replace and find-replace str plugin additions. 2019-12-02 11:02:58 -08:00
388ce738e3 expand tilde in externals 2019-12-02 11:02:58 -08:00
ef7fbcbe9f Update README.md 2019-12-02 11:02:58 -08:00
80941ace37 Add 0.6.1 release 2019-12-02 11:02:58 -08:00
f317500873 Update from-yaml.md 2019-12-02 11:02:58 -08:00
911414a190 Update config.md 2019-12-02 11:02:58 -08:00
cca6360bcc add documentation for from-tsv, from-xml 2019-12-02 11:02:58 -08:00
f68503fa21 add documentation for get, ps 2019-12-02 11:02:58 -08:00
911b69dff0 Update some command docs 2019-12-02 11:02:58 -08:00
4115634bfc Try to re-apply #1039 2019-12-02 11:02:58 -08:00
8a0bdde17a Remove env var from starship 2019-12-02 11:02:58 -08:00
a1e21828d6 Fix tests 2019-12-02 11:02:57 -08:00
0f193c2337 Update histogram.rs 2019-12-02 11:02:57 -08:00
526d94d862 improve duration print
original commit: ddb9d3a864
2019-12-02 11:02:57 -08:00
2fdafa52b1 replace and find-replace str plugin additions. 2019-12-02 11:02:57 -08:00
f52c0655c7 expand tilde in externals
original: 9f42d7693f
2019-12-02 11:02:57 -08:00
97331c7b25 Update README 2019-12-02 11:02:57 -08:00
1fb5a419a7 Bump the release version 2019-12-02 11:02:57 -08:00
4e9afd6698 Refactor classified.rs into separate modules.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
2019-12-02 11:02:57 -08:00
8f9dd6516e Add =~ and !~ operators on strings
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.

A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.

The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
2019-12-02 11:02:57 -08:00
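The contains semantics described above can be sketched in isolation (a hypothetical example on bare `&str`; the real `apply_operator` in nushell works on `Value`s and is part of the evaluator):

```rust
// Sketch of the `=~` / `!~` semantics: substring containment via
// Rust's `String::contains`, with `!~` as the negation.
fn apply_operator(op: &str, left: &str, right: &str) -> Option<bool> {
    match op {
        "=~" => Some(left.contains(right)),
        "!~" => Some(!left.contains(right)),
        _ => None, // unknown operator
    }
}

fn main() {
    assert_eq!(apply_operator("=~", "nushell", "shell"), Some(true));
    assert_eq!(apply_operator("!~", "nushell", "fish"), Some(true));
    assert_eq!(apply_operator("=~", "nu", "shell"), Some(false));
    println!("ok");
}
```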
e4226def16 Extract core stuff into own crates
This commit extracts five new crates:

- nu-source, which contains the core source-code handling logic in Nu,
  including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
  used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
  conveniences
- nu-textview, which is the textview plugin extracted into a crate

One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).

This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
2019-12-02 10:54:12 -08:00
c199a84dbb Merge pull request #1039 from thegedge/move-pipeline-execution-out-of-cli
Move pipeline execution code into classified::Pipeline
2019-12-01 19:47:34 -08:00
5a4ca11362 Merge pull request #1043 from JesterOrNot/master
install all features for nushell for gitpod
2019-12-01 18:32:15 -08:00
f2968c8385 Update .gitpod.yml 2019-12-01 17:16:53 -06:00
8d01b019f4 Merge pull request #1041 from tchak/docs-compact-default
document compact and default
2019-12-01 09:01:50 -08:00
bf87330d6e add documentation for compact command 2019-12-01 17:44:43 +01:00
2bb85bdbd4 add documentation for default command 2019-12-01 17:39:09 +01:00
8f34c6eeda Merge pull request #1032 from sebastian-xyz/doc
add documentation for save, config, get, ps, from-tsv, from-xml
2019-11-30 18:15:39 -08:00
ac5543bad9 Move pipeline execution code into classified::Pipeline 2019-11-30 16:12:34 -05:00
e4c56a25c6 Merge remote-tracking branch 'refs/remotes/origin/doc' into doc 2019-11-30 21:21:15 +01:00
11ff8190b1 add documentation for format 2019-11-30 21:15:12 +01:00
9bd25d7427 fix error in save.md 2019-11-30 21:07:43 +01:00
5676713b1f Update README.md 2019-12-01 07:12:14 +13:00
b59231d32b Merge pull request #1035 from jonathandturner/bump_to_0_6_1
Add 0.6.1 release
2019-11-30 10:11:31 -08:00
e530cf0a9d Add 0.6.1 release 2019-12-01 07:10:51 +13:00
6bfb4207c4 Update from-yaml.md 2019-12-01 07:00:36 +13:00
c63ad610f5 Update config.md 2019-12-01 06:59:53 +13:00
e38a4323b4 add documentation for from-tsv, from-xml 2019-11-30 13:38:52 +01:00
d40aea5d0a add documentation for get, ps 2019-11-30 12:48:23 +01:00
1ba69e4b11 Merge pull request #1030 from jonathandturner/more_doc_updates
Update some command docs
2019-11-29 17:53:02 -08:00
f10390b1be Update some command docs 2019-11-30 14:24:39 +13:00
c2b1908644 Merge pull request #1029 from jonathandturner/fix_starship_env_var
Remove env var from starship
2019-11-29 12:00:10 -08:00
0a93335f6d Remove env var from starship 2019-11-30 08:38:44 +13:00
fbb65cde44 add documentation for save, config 2019-11-29 18:15:51 +01:00
8e7acd1094 Update where.md 2019-11-29 08:41:27 +13:00
c6ee6273db Merge pull request #1015 from sebastian-xyz/doc
Add documentation for histogram, split-column
2019-11-28 11:19:36 -08:00
c77059f891 add documentation for from-json, from-yaml, history, split-row 2019-11-28 19:33:17 +01:00
5bdda06ca6 update histogram, nth documentation 2019-11-28 19:32:31 +01:00
d8303dd6d6 Merge pull request #1026 from est31/new-cargo-lock
Switch to the new Cargo.lock format
2019-11-27 19:11:54 -08:00
60ec68b097 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-11-28 03:27:37 +01:00
deae66c194 Cargo update
This performs a cargo update to allow the upcoming commit
that switches to the new Cargo.lock format to be only about
that format change.
2019-11-28 03:17:31 +01:00
0bdb6e735a Update Cargo.toml 2019-11-27 17:14:45 +13:00
7933e01e77 Update Cargo.toml 2019-11-27 15:55:02 +13:00
b443a2d713 Merge pull request #1017 from jonathandturner/better_duration
improve duration print
2019-11-27 15:32:17 +13:00
7a28ababd1 Update histogram.rs 2019-11-27 15:32:05 +13:00
ddb9d3a864 improve duration print 2019-11-27 15:07:55 +13:00
186b75a848 Merge pull request #1016 from andrasio/str
replace and find-replace str plugin additions.
2019-11-26 19:29:16 -05:00
8cedd2ee5b replace and find-replace str plugin additions. 2019-11-26 19:03:22 -05:00
0845572878 Add documentation for histogram, split-column 2019-11-26 20:47:34 +01:00
2e4b0b0b17 Merge pull request #1014 from jonathandturner/fix_1013
expand tilde in externals
2019-11-27 06:52:30 +13:00
9f42d7693f expand tilde in externals 2019-11-27 06:34:02 +13:00
3424334ce5 Merge pull request #1012 from jonathandturner/bump_release_version
Bump release version
2019-11-26 21:21:33 +13:00
c68d236fd7 Update README 2019-11-26 21:00:34 +13:00
7c6e82c990 Bump the release version 2019-11-26 20:59:43 +13:00
eb5d0d295b Merge pull request #1009 from nushell/cleanup-wip
Extract nu_source into a crate
2019-11-25 19:54:22 -08:00
2eae5a2a89 Merge remote-tracking branch 'origin/master' into cleanup-wip 2019-11-25 19:25:12 -08:00
595c9f2999 Merge branch 'master' into cleanup-wip 2019-11-25 18:32:24 -08:00
70d63e34e9 Merge pull request #1008 from thegedge/move-pipeline-to-classified
Move pipeline code from cli to classified
2019-11-25 18:21:07 -05:00
83ac65ced3 Merge pull request #997 from bndbsh/operator-contains
Add `=~` and `!~` operators on strings
2019-11-25 18:19:58 -05:00
be140382cf Merge pull request #1011 from andrasio/nth-checks
nth can select more than one row at a time.
2019-11-25 17:55:33 -05:00
d320ffe742 nth can select more than one row at a time. 2019-11-25 17:16:58 -05:00
fbc6f01cfb Add =~ and !~ operators on strings
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.

A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.

The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
2019-11-25 15:06:11 -05:00
3008434c0f Eliminate repetitive code and fix Unix failure 2019-11-25 11:09:59 -08:00
5fbea31d15 Remove unused Display implementations
After the previous commit, nushell uses PrettyDebug and
PrettyDebugWithSource for our pretty-printed display output.

PrettyDebug produces a structured `pretty.rs` document rather than
writing directly into a fmt::Formatter, and types that implement
`PrettyDebug` have a convenience `display` method that produces a string
(to be used in situations where `Display` is needed for compatibility
with other traits, or where simple rendering is appropriate).
2019-11-25 10:07:20 -08:00
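The convenience the commit above mentions can be sketched as a trait with a default method (a hypothetical shape for illustration; the real `PrettyDebug` renders a structured pretty.rs document rather than building a `String` directly, and the `Span` impl here is an assumption):

```rust
// Sketch: a pretty-printing trait whose default `display` method
// yields a plain String for Display-compatibility situations.
trait PrettyDebug {
    fn pretty(&self) -> String;

    // Convenience for callers that just need a rendered string.
    fn display(&self) -> String {
        self.pretty()
    }
}

struct Span {
    start: usize,
    end: usize,
}

impl PrettyDebug for Span {
    fn pretty(&self) -> String {
        format!("span[{}..{}]", self.start, self.end)
    }
}

fn main() {
    let s = Span { start: 3, end: 7 };
    assert_eq!(s.display(), "span[3..7]");
    println!("ok");
}
```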
f70c6d5d48 Extract nu_source into a crate
This commit extracts Tag, Span, Text, as well as source-related debug
facilities into a new crate called nu_source.

This change is much bigger than one might have expected because the
previous code relied heavily on implementing inherent methods on
`Tagged<T>` and `Spanned<T>`, which is no longer possible.

As a result, this change creates more concrete types instead of using
`Tagged<T>`. One notable example: Tagged<Value> became Value, and Value
became UntaggedValue.

This change clarifies the intent of the code in many places, but it does
make it a big change.
2019-11-25 07:37:33 -08:00
71e7eb7cfc Move all pipeline execution code from cli to classified::pipeline 2019-11-24 22:52:37 -05:00
339ec46961 Refactor classified.rs into separate modules.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
2019-11-24 17:19:12 -05:00
fe53c37654 Merge pull request #1006 from andrasio/additions
Default.
2019-11-24 04:55:12 -05:00
06857fbc52 Take all rows having the column present. 2019-11-24 04:35:36 -05:00
1c830b5c95 default command introduced. 2019-11-24 04:20:08 -05:00
a74145961e Always check the row's columns. 2019-11-24 01:25:41 -05:00
91698b2657 Merge pull request #1003 from andrasio/compact
Compact.
2019-11-23 22:03:20 -05:00
40fd8070a9 Merge pull request #1004 from jonathandturner/revert_some_table_changes
Revert some of the recent styled string changes
2019-11-24 14:28:58 +13:00
4d5f1f6023 Revert some of the recent styled string changes 2019-11-24 13:56:19 +13:00
bc2d65cd2e Remove raw data debugging. 2019-11-23 19:16:25 -05:00
1a0b339897 compact command introduced. 2019-11-23 19:05:44 -05:00
8d3a937413 Display raw debugging data (rust representation). 2019-11-23 18:53:50 -05:00
e85e1b2c9e Merge pull request #986 from nushell/int-columns
Integer columns and better debug infra
2019-11-22 09:07:03 -08:00
c8aa8cb842 debug command facelift. 2019-11-22 03:31:58 -05:00
88c4473283 Remove fuzzysearch. 2019-11-22 03:25:09 -05:00
f4d9975dab Clean up feature build flags. 2019-11-22 03:11:36 -05:00
6e8b768d79 Requiring at least one member is no longer necessary. 2019-11-22 01:18:06 -05:00
cdb0eeafa2 --no-edit 2019-11-21 14:22:32 -08:00
388fc24191 Merge pull request #990 from drmason13/combine-csv-and-tsv
combine functions behind to/from-c/tsv commands
2019-11-19 11:29:33 -05:00
b3c021899c combine functions behind to/from-c/tsv commands
fixes #969, admittedly without a --delimiter alias

moves from_structured_data.rs to from_delimited_data.rs to better
identify its scope and adds to_delimited_data.rs. Now csv and tsv both
use the same code, tsv passes in a fixed '\t' argument where csv passes
in the value of --separator
2019-11-19 16:02:35 +00:00
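The sharing described above can be sketched as one generic parser that both formats call with a different separator (hypothetical code for illustration; nushell's real implementation lives in from_delimited_data.rs and uses the `csv` crate rather than a naive `split`):

```rust
// One generic delimited parser; csv and tsv differ only in the
// separator they pass in, mirroring the commit's description.
fn from_delimited(input: &str, sep: char) -> Vec<Vec<String>> {
    input
        .lines()
        .map(|line| line.split(sep).map(str::to_string).collect())
        .collect()
}

fn from_csv(input: &str) -> Vec<Vec<String>> {
    from_delimited(input, ',')
}

fn from_tsv(input: &str) -> Vec<Vec<String>> {
    from_delimited(input, '\t') // fixed '\t', as the commit describes
}

fn main() {
    assert_eq!(from_csv("a,b\n1,2"), vec![vec!["a", "b"], vec!["1", "2"]]);
    assert_eq!(from_tsv("a\tb"), vec![vec!["a", "b"]]);
    println!("ok");
}
```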
bff50c6987 Merge pull request #988 from jonathandturner/umask
Add umask to unix --full list
2019-11-19 21:10:15 +13:00
111fcf188e Add umask to unix --full list 2019-11-19 18:46:47 +13:00
015693aea7 Update README.md 2019-11-19 03:41:16 +13:00
03a52f1988 Merge pull request #984 from nushell/latest_nightly
Fix build errors on latest nightly
2019-11-18 16:33:46 +13:00
372f6c16b3 Fix build errors on latest nightly 2019-11-18 16:12:37 +13:00
c04da4c232 Merge pull request #982 from Aloso/master
Format durations nicely
2019-11-18 11:49:58 +13:00
a070cb8154 Format durations nicely 2019-11-17 22:51:56 +01:00
bf4273776f Merge pull request #980 from jonathandturner/remove_fuzzy_search
Remove fuzzy search because of compat issues
2019-11-18 08:22:45 +13:00
95ca3ed4fa Remove fuzzy search because of compat issues 2019-11-18 08:01:17 +13:00
54c0603263 Merge pull request #979 from jonathandturner/abbrev_ls
Abbreviate ls by default, add --full flag
2019-11-18 07:06:19 +13:00
c598cd4255 Fix tests 2019-11-18 06:38:44 +13:00
2bb03d9813 Abbreviate ls by default, add --full flag 2019-11-18 06:10:50 +13:00
9c41f581a9 Merge pull request #978 from jonathandturner/duration_primitive
Make duration its own primitive
2019-11-17 19:07:51 +13:00
6231367bc8 Make duration its own primitive 2019-11-17 18:48:48 +13:00
a7d7098b1a Merge pull request #977 from jonathandturner/from_xls
Add from-xlsx for importing excel files
2019-11-17 16:36:22 +13:00
90aeb700ea Add from_xlsx for importing excel files 2019-11-17 16:18:41 +13:00
9dfc647386 Merge pull request #976 from bndbsh/save-error
Improve error messages for save
2019-11-17 14:58:55 +13:00
f992f5de95 Update save.rs 2019-11-17 14:13:52 +13:00
946f7256e4 Improve error messages for save
`save` attempts to convert input based on the target filename extension,
and expects a stream of text otherwise. However the error message is
unclear and provides little guidance, hopefully this is less confusing
to new users.

It might be worthwhile to also add a hint about adding an extension,
though I'm not sure if it's possible to emit multiple diagnostics.
2019-11-16 19:08:38 -05:00
57d425d929 Merge pull request #975 from jonathandturner/process_prompt_once
Process prompts once rather than twice
2019-11-17 10:22:49 +13:00
dd36bf07f4 Process prompts once rather than twice 2019-11-17 09:42:35 +13:00
406fb8d1d9 Merge pull request #973 from jonathandturner/fix_windows_starship
Give rustyline non-ansi to begin with. Fixes starship in windows
2019-11-17 09:25:45 +13:00
2d4a225e2a Fix formatting 2019-11-17 09:06:00 +13:00
db218e06dc Give rustyline non-ansi to begin with. Fixes Windows 2019-11-17 09:02:26 +13:00
17e8a5ce38 Merge pull request #970 from jonathandturner/starship-prompt
Starship prompt
2019-11-17 06:43:59 +13:00
07db14f72e Merge master 2019-11-17 06:17:05 +13:00
412831cb9c Merge pull request #968 from sebastian-xyz/patch-4
add group-by command documentation
2019-11-17 05:59:41 +13:00
f4dc79f4ba add group-by command documentation 2019-11-16 15:31:28 +01:00
9cb573b3b4 Merge pull request #967 from jonathandturner/fix_warning
Fix build warnings
2019-11-16 22:05:28 +13:00
ce106bfda9 Fix build warnings 2019-11-16 21:23:04 +13:00
a3ffc4baf0 Merge pull request #966 from jonathandturner/duration_comparison
Add comparison between dates
2019-11-16 15:03:12 +13:00
3c3637b674 Add comparison between dates 2019-11-16 14:36:51 +13:00
bcecd08825 Merge pull request #965 from sebastian-xyz/patch-3
Add prepend command documentation
2019-11-16 06:19:14 +13:00
55f99073ad Merge pull request #964 from sebastian-xyz/patch-1
Add append command documentation
2019-11-16 06:18:35 +13:00
008c60651c Merge pull request #961 from rtlechow/patch-1
Document pivot command
2019-11-16 06:14:43 +13:00
63667d9e46 Add prepend command documentation 2019-11-15 15:53:58 +01:00
08b770719c Add append command documentation 2019-11-15 15:37:41 +01:00
e0d27ebf84 Merge pull request #960 from uma0317/master
Fix moving a file to a different partition on Windows
2019-11-15 00:39:00 -05:00
0756145caf Fix moving a file to a different partition on Windows 2019-11-15 11:52:51 +09:00
036860770b Document pivot command
Part of https://github.com/nushell/nushell/issues/711
2019-11-14 16:59:39 -05:00
aa1ef39da3 Merge pull request #916 from t-hart/pr/from-tsv-csv-headerless
Make --headerless treat first row as data
2019-11-14 05:34:49 +13:00
7c8969d4ea Merge pull request #957 from nushell/futures-codec-dgrade
Downgrade futures-codec.
2019-11-12 14:49:24 -05:00
87d58535ff Downgrade futures-codec. 2019-11-12 14:04:53 -05:00
1060ba2206 Fixes --headerless functionality for from-ssv.
Squashed commit of the following:

commit fc59d47a2291461d84e0587fc0fe63af0dc26f9f
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 15:39:38 2019 +0100

    Fixes inconsistencies in output.

commit da4084e9fdd983557b101207b381e333a443e551
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 13:04:10 2019 +0100

    remove unused enum.

commit 7f6a105879c8746786b99fb19bb9f0860c41796a
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 12:58:41 2019 +0100

    Starts refactoring from_ssv.

commit b70ddd169ef0c900e03fb590cb171cc7181528db
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 11:34:06 2019 +0100

    Fixes --headerless for non-aligned columns.

commit 6332778dd26de8d07be77b291124115141479892
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Tue Nov 12 10:27:35 2019 +0100

    Fixes from-ssv headerless aligned-columns logic.

commit 747d8c812e06349b4a15b8c130721881d86fff98
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 23:53:59 2019 +0100

    fixes unit tests for ssv.

commit c77cb451623b37a7a9742c791a4fc38cad053d3d
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 22:49:21 2019 +0100

    it compiles! one broken test.

commit 08a05964f56cf92507c255057d0aaf2b6dbb6f45
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 18:52:54 2019 +0100

    Backed into a corner. Help.

commit c95ab683025a8007b8a6f8e1659f021a002df584
Author: Thomas Hartmann <thomas.o.hartmann@gmail.com>
Date:   Mon Nov 11 17:30:54 2019 +0100

    broken but on the way
2019-11-12 16:04:55 +01:00
0401087175 Refactors out structured parsing logic to a separate module. 2019-11-12 16:04:55 +01:00
f8dc06ef49 Changes implementation of --headerless for from-tsv. 2019-11-12 16:04:55 +01:00
282cb46ff1 Implements --headerless for from-csv 2019-11-12 16:04:55 +01:00
a3ff5f1246 Updates tests for from tsv, csv, and ssv.
With the proposed changes, these tests now become invalid. If the first line is
to be counted as data, then converting the headers to ints will fail. Removing
the headers and instead treating the first line as data, however, reflects the
new, desired mode of operation.
2019-11-12 16:04:55 +01:00
5bb822dcd4 Merge pull request #954 from andrasio/reduce
Expose histogram and split-by command.
2019-11-12 04:10:37 -05:00
00b3c2036a This is part of ongoing work on capabilities for working with
tables and processing them for data viewing purposes. At the moment,
among the ways to process said tables, we are able to view a histogram
of a given column.

As usage matures, we may find certain core commands that could
be used ergonomically when working with tables in Nu.
2019-11-12 03:39:30 -05:00
3163b0d362 Data processing mvp histogram. 2019-11-12 02:08:28 -05:00
21f48577ae Reductions placeholder. 2019-11-12 02:08:28 -05:00
11e4410d1c Merge pull request #806 from DrSensor/ci/github/quay.io
ci(github): replace docker.pkg.github.com with quay.io
2019-11-12 12:03:00 +13:00
27a950d28e Merge pull request #952 from JesterOrNot/master
edit install cmd
2019-11-11 14:40:09 -08:00
f3d056110a DOCKER_USER should come from secrets 2019-11-11 13:33:52 -05:00
b39c2e2f75 edit install cmd 2019-11-11 18:17:55 +00:00
7cf3c6eb95 Move env declaration to jobs.docker 2019-11-11 07:51:41 +07:00
cdec0254ec Merge pull request #951 from jonathandturner/bump_deps
Bump dep versions
2019-11-10 10:15:18 -08:00
02f3330812 Merge pull request #950 from coolshaurya/docs-size
Make documentation for size command
2019-11-10 09:52:43 -08:00
6f013d0225 Merge pull request #949 from coolshaurya/docs-count
Add docs for the count command
2019-11-10 09:51:25 -08:00
1f06f57de3 Merge pull request #948 from coolshaurya/docs-pick-reject
Add documentation for the pick and reject command
2019-11-10 09:50:29 -08:00
0f405f24c7 Bump dep versions 2019-11-11 06:48:49 +13:00
5a8128dd30 Make documentation for size command 2019-11-10 14:41:23 +05:30
50616cc62c Add docs for the count command
Partial fix of issue #711
2019-11-10 14:12:59 +05:30
9d345cab07 Add documentation for the pick and reject command
Partial fix of issue#711
2019-11-10 12:37:27 +05:30
59ab11e932 Merge pull request #947 from jonathandturner/bump_and_plugin_load
Bump Nu version and change plugin load logic for debug
2019-11-09 21:29:09 -08:00
df302d4bac Bump Nu version and change plugin load logic for debug 2019-11-10 16:44:05 +13:00
6bbfd0f4f6 Merge pull request #945 from jonathandturner/format
Format
2019-11-09 16:39:51 -08:00
943e0045e7 Update readme 2019-11-10 13:16:52 +13:00
62a5250554 Add format command 2019-11-10 13:14:59 +13:00
9043970e97 Merge pull request #943 from drmason13/from_csv-add-separator-arg
Add --separator argument to from_csv
2019-11-09 15:09:32 -08:00
d32c9ce1b6 Merge pull request #944 from jonathandturner/read_to_parse
Read to parse
2019-11-09 15:07:40 -08:00
73d8478678 Update readme 2019-11-10 11:27:56 +13:00
bab58576b4 Rename read to parse 2019-11-10 11:26:44 +13:00
41212c1ad1 Merge pull request #942 from BurNiinTRee/master
removed the requirement on the 'regex' feature for the match plugin
2019-11-08 09:11:36 -08:00
4a6122905b fmt: cargo fmt --all 2019-11-08 15:27:29 +00:00
15986c598a Add --separator command to from_csv
The command takes a string, checks that it is a single character, and then
passes it to csv::ReaderBuilder via the .delimiter() method as a u8.
2019-11-08 15:06:33 +00:00
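The validation described in that commit can be sketched as follows. This is an illustrative standalone function, not nushell's actual code; `separator_to_u8` is an invented name, and the ASCII check is an assumption (csv::ReaderBuilder::delimiter takes a u8, so a multi-byte character cannot be used):

```rust
// Hypothetical sketch: accept the --separator argument as a string,
// require exactly one ASCII character, and produce the u8 that
// csv::ReaderBuilder::delimiter expects.
fn separator_to_u8(s: &str) -> Result<u8, String> {
    let mut chars = s.chars();
    match (chars.next(), chars.next()) {
        // Exactly one character, and it must fit in a single byte.
        (Some(c), None) if c.is_ascii() => Ok(c as u8),
        _ => Err(format!("expected a single ASCII character, got {:?}", s)),
    }
}
```

A caller would then pass the resulting byte to the reader builder, e.g. `ReaderBuilder::new().delimiter(sep)`.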
078342442d removed the requirement on the 'regex' feature for the match plugin
The nu_plugin_match binary wasn't built anymore
after the regex dependency was made non-optional in
https://github.com/nushell/nushell/pull/889, causing
the removal of the regex feature, which nu_plugin_match
depended on.
2019-11-08 13:33:28 +01:00
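The mechanism behind that breakage can be sketched in a Cargo manifest. This is an illustrative fragment, not nushell's actual Cargo.toml (the feature and binary names are assumptions): an optional dependency defines an implicit feature, and a binary can require that feature in order to be built at all. Making the dependency non-optional removes the implicit feature, so the binary gated on it silently stops building.

```toml
# Illustrative sketch, not the real manifest.
[dependencies]
regex = { version = "1", optional = true }  # optional => an implicit "regex" feature exists

[features]
regex-support = ["regex"]

[[bin]]
name = "nu_plugin_match"
required-features = ["regex-support"]
```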
8855c54391 Update README.md 2019-11-08 08:19:41 +13:00
5dfc81a157 Merge pull request #940 from jonathandturner/stable_v2
Second attempt to remove rust-toolchain
2019-11-07 11:18:50 -08:00
c42d97fb97 try again 2019-11-08 08:00:46 +13:00
13314ad1e7 try again 2019-11-08 07:54:52 +13:00
ff6026ca79 try again 2019-11-08 07:47:43 +13:00
c6c6c0f295 try again 2019-11-08 07:44:34 +13:00
1cca5557b1 Second attempt to remove rust-toolchain 2019-11-08 07:27:39 +13:00
76208110b9 Update README.md 2019-11-08 07:17:12 +13:00
56dd0282f0 Merge pull request #938 from nushell/jonathandturner-patch-1
Update docker to stable
2019-11-07 09:59:53 -08:00
c01b602b86 Update docker to stable 2019-11-08 06:34:53 +13:00
d6f46236e9 Merge pull request #937 from nushell/jonathandturner-patch-1
Move azure pipeline to stable
2019-11-07 09:32:54 -08:00
104b30142f Move azure pipeline to stable 2019-11-08 06:13:39 +13:00
f3a885d920 Merge pull request #936 from nushell/move-to-stable
Move Nu to the stable Rust 1.39 release
2019-11-07 09:11:18 -08:00
60445b0559 Move Nu to the stable Rust 1.39 release 2019-11-08 05:51:21 +13:00
01d6287a8f Update README.md 2019-11-06 18:25:23 +13:00
0462b2db80 Merge pull request #925 from jonathandturner/bump_to_0_5_0
Bump version to 0.5.0
2019-11-06 18:24:45 +13:00
4cb399ed70 Bump version to 0.5.0 2019-11-06 18:24:04 +13:00
7ef9f7702f Merge pull request #924 from jonathandturner/help_flags_last
Move flags help to last
2019-11-06 15:50:25 +13:00
44a1686a76 Move flags help to last 2019-11-06 15:28:26 +13:00
15c6d24178 Merge pull request #919 from JesterOrNot/master
Update .gitpod.yml to install nu rather than just build!
2019-11-05 07:24:20 +13:00
3b84e3ccfe Update .gitpod.yml 2019-11-04 11:44:56 -06:00
da7d6beb22 Merge pull request #917 from thegedge/eliminate-is-first-command
Eliminate is_first_command by defaulting to Value::nothing()
2019-11-05 06:34:33 +13:00
f012eb7bdd Eliminate is_first_command by defaulting to Value::nothing() 2019-11-03 20:06:59 -05:00
f966394b63 Merge pull request #888 from andrasio/data-primitives
WIP [data processing]
2019-11-03 16:52:21 -05:00
889d2bb378 Isolate feature. 2019-11-03 16:36:47 -05:00
a2c4e485ba Merge pull request #914 from andrasio/column_path-semantic-fix
ColumnPaths should expect members encapsulated as members.
2019-11-03 06:56:19 -05:00
8860d8de8d At the moment, ColumnPaths represent a set of Members (e.g. package.authors is a column path of two members).
The functions for retrieving, replacing, and inserting values into values all assumed they get the complete
column path as regular tagged strings. This commit changes these to accept tagged values instead. Basically,
it means we can have column paths containing strings and numbers (e.g. package.authors.1).

Unfortunately, for the moment, all members parsed and deserialized for a command that expects column paths
of tagged values will arrive as tagged values (encapsulating Members) that are strings only.

This makes it impossible to determine whether package.authors.1 or package.authors."1" (meaning the "number" 1) is
a string member or a number member, and thus prevents us from knowing, and enforcing for the user, that paths enclosed
in double quotes mean "retrieve the column with this name" and that numbers are for retrieving a particular row
from a table.

This commit sets in place the infrastructure needed for when integer members land; in the meantime, the workaround
is to convert the tagged values passed from the column paths back to strings.
2019-11-03 06:30:32 -05:00
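The string-versus-number member distinction described above can be modeled as follows. This is an illustrative sketch, not nushell's actual types or parser (`Member` and `parse_column_path` are invented names): an unquoted numeric segment becomes an integer member, while a quoted one is forced to a string member.

```rust
// Hypothetical model of column path members: a member is either a
// column name (string) or a row index (integer).
#[derive(Debug, PartialEq)]
enum Member {
    String(String),
    Int(usize),
}

fn parse_column_path(path: &str) -> Vec<Member> {
    path.split('.')
        .map(|part| {
            if part.len() >= 2 && part.starts_with('"') && part.ends_with('"') {
                // Quoted members are always column names, never row indices.
                Member::String(part[1..part.len() - 1].to_string())
            } else {
                // Bare numbers are row indices; everything else is a name.
                match part.parse::<usize>() {
                    Ok(n) => Member::Int(n),
                    Err(_) => Member::String(part.to_string()),
                }
            }
        })
        .collect()
}
```

Under this model, `package.authors.1` fetches row 1 of the `authors` table, while `package.authors."1"` fetches a column literally named `1`.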
d7b768ee9f Fallback internally to String primitives until Member int serialization lands. 2019-11-03 05:38:47 -05:00
6ea8e42331 Move column paths to support broader value types. 2019-11-03 05:38:47 -05:00
1b784cb77a Merge pull request #913 from andrasio/tests-builtins
`get` preserves anchored inputs.
2019-11-03 05:11:09 -05:00
4a0ec1207c Preserve anchored meta data for all get queries in the pipeline 2019-11-03 03:49:06 -05:00
ffb2fedca9 Update README.md 2019-11-03 18:24:11 +13:00
382b1ba85f Merge pull request #912 from jonathandturner/fix_910
Make column logic in from-ssv optional
2019-11-03 17:31:15 +13:00
3b42655b51 Make column logic in from-ssv optional 2019-11-03 17:04:59 +13:00
e43e906f86 Update README.md 2019-11-03 16:13:00 +13:00
e51d9d0935 Update README.md 2019-11-03 16:12:36 +13:00
f57489ed92 get command tests already present and move to their own. 2019-11-02 21:05:27 -05:00
503e521820 Merge pull request #909 from jonathandturner/config_set_into
Add support for config --set_into
2019-11-03 13:06:58 +13:00
c317094947 Add support for config --set_into 2019-11-03 12:43:15 +13:00
243df63978 Move config to async_stream 2019-11-03 12:22:30 +13:00
05ff102e09 Merge pull request #908 from jonathandturner/fix_907
Fix 907 and improve substring
2019-11-03 09:14:13 +13:00
cd30fac050 Approach fix differently 2019-11-03 08:57:28 +13:00
f589d3c795 Fix 907 and improve substring 2019-11-03 07:49:28 +13:00
51879d022e Merge pull request #895 from Flare576/substring
Adds new substring function to str plugin
2019-11-02 17:42:45 +13:00
2260b3dda3 Update str.rs 2019-11-02 17:25:20 +13:00
aa64442453 Merge pull request #906 from jonathandturner/nu_env_vars
Add initial support for env vars
2019-11-02 17:12:13 +13:00
129ee45944 Add initial support for env vars 2019-11-02 16:41:58 +13:00
2fe7d105b0 Merge pull request #905 from jonathandturner/add_to_insert
Rename add to insert
2019-11-02 15:07:41 +13:00
136c8acba6 Update README 2019-11-02 14:48:18 +13:00
e92d4b2ccb Rename add to insert 2019-11-02 14:47:14 +13:00
6e91c96dd7 Merge pull request #904 from jonathandturner/plugin_nu_path
Use nu:path for plugin loading
2019-11-02 14:12:51 +13:00
7801c03e2d plugin_nu_path 2019-11-02 13:36:21 +13:00
763bbe1c01 Updated Doc, error on bad input 2019-11-01 17:25:08 -05:00
0f67569cc3 Merge branch 'master' of github.com:nushell/nushell 2019-11-01 13:35:20 -07:00
0ea3527544 Update issue templates 2019-11-02 09:21:29 +13:00
20dfca073f Merge pull request #902 from jonathandturner/updated_echo
Make echo more flexible with data types
2019-11-02 08:34:20 +13:00
a3679f0f4e Make echo more flexible with data types 2019-11-02 08:15:53 +13:00
e75fdc2865 Merge pull request #897 from nushell/modernize_external_tokens
Modernize external tokens
2019-11-02 06:18:38 +13:00
4be88ff572 Modernize external parse and improve trace
The original purpose of this PR was to modernize the external parser to
use the new Shape system.

This commit does include some of that change, but a more important
aspect of this change is an improvement to the expansion trace.

A previous commit (6a7c00ea) added trace infrastructure to the syntax coloring
feature. This commit adds tracing to the expander.

The bulk of that work, in addition to the tree builder logic, was an
overhaul of the formatter traits to make them more general purpose, and
more structured.

Some highlights:

- `ToDebug` was split into two traits (`ToDebug` and `DebugFormat`)
  because implementations needed to become objects, but a convenience
  method on `ToDebug` didn't qualify
- `DebugFormat`'s `fmt_debug` method now takes a `DebugFormatter` rather
  than a standard formatter, and `DebugFormatter` has a new (but still
  limited) facility for structured formatting.
- Implementations of `ExpandSyntax` need to produce output that
  implements `DebugFormat`.

Unlike the highlighter changes, these changes are fairly focused in the
trace output, so these changes aren't behind a flag.
2019-11-01 08:45:45 -07:00
992789af26 Merge pull request #899 from loksonarius/document-tags-command
Add documentation for tags command
2019-11-01 21:25:55 +13:00
b822e13f12 Add documentation for tags command 2019-11-01 00:08:24 -04:00
cd058db046 Substring option for str plugin
Adds new substr function to str plugin with tests and documentation

The function takes a start/end location as a string in the form "##,##"; both sides of the comma are optional, and
it behaves like Rust's own index operator [##..##].
2019-10-31 19:49:17 -05:00
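The "##,##" range syntax can be sketched like this. This is an illustrative standalone function, not the plugin's actual implementation (`substring` here is an invented free function; the real plugin works on tagged values), under the assumption that an omitted start defaults to 0 and an omitted end defaults to the end of the string:

```rust
// Hypothetical sketch of the "start,end" range syntax, mirroring
// Rust's slice syntax s[start..end]. Either side may be omitted.
fn substring(s: &str, range: &str) -> String {
    let (start_s, end_s) = match range.split_once(',') {
        Some((a, b)) => (a, b),
        None => (range, ""), // no comma: treat the whole input as the start
    };
    let start = start_s.trim().parse::<usize>().unwrap_or(0);
    let end = end_s
        .trim()
        .parse::<usize>()
        .unwrap_or(s.chars().count())
        .min(s.chars().count());
    if start >= end {
        String::new()
    } else {
        // Index by character rather than byte, so multi-byte input is safe.
        s.chars().skip(start).take(end - start).collect()
    }
}
```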
1b3143d3d4 Merge pull request #898 from andrasio/numbers-are-valid-column-names
get :: support fetching rows from tables using column paths named as numbers.
2019-10-31 14:43:41 -05:00
e31ed66610 get :: support fetching rows using numbers in column path. 2019-10-31 14:20:22 -05:00
7f18ff10b2 Merge pull request #892 from andrasio/column_path-fetch-table
Value operations and error handling separation.
2019-10-31 05:31:16 -05:00
65ae24fbf1 suite in place. 2019-10-31 04:42:18 -05:00
b54ce921dd Better error messages. 2019-10-31 04:36:08 -05:00
4935129c5a Merge branch 'master' of github.com:nushell/nushell 2019-10-30 18:40:34 -07:00
7614ce4b49 Allow handling errors with failure callbacks. 2019-10-30 17:46:40 -05:00
9d34ec9153 Merge pull request #891 from nushell/jonathandturner-patch-1
Move rustyline dep back to crates
2019-10-31 09:30:30 +13:00
fd92271884 Move rustyline dep back to crates 2019-10-31 09:14:47 +13:00
cea8fab307 "Integers" in column paths fetch a row from a table. 2019-10-30 05:55:26 -05:00
2d44b7d296 Update README.md 2019-10-30 20:22:01 +13:00
faccb0627f Merge pull request #890 from jonathandturner/append_prepend
Add prepend and append commands
2019-10-30 20:20:06 +13:00
a9cd6b4f7a Format files 2019-10-30 20:04:39 +13:00
81691e07c6 Add prepend and append commands 2019-10-30 19:54:06 +13:00
26f40dcabc Merge pull request #889 from jonathandturner/read_plugin
Add a simple read/parse plugin to better handle text data
2019-10-30 12:08:28 +13:00
3820fef801 Add a simple read/parse plugin to better handle text data 2019-10-30 11:33:36 +13:00
392ff286b2 This commit is ongoing work toward making Nu a joy to work with
for data processing. Fundamentally, we embrace functional programming
principles for transforming a dataset from any format picked up by Nu. These
table processing "primitive" commands will build up and make pipelines
composable with data processing capabilities, allowing us to evaluate,
reduce, and map tables, even composing this declaratively.

In this regard, `split-by` expects a table with grouped data, and we
can use it further in interesting ways (e.g. collecting labels for
visualizing the data in charts and/or suiting it to a particular chart
of our interest).
2019-10-29 16:04:31 -05:00
b6824d8b88 Merge pull request #886 from notryanb/fetch-from-variable
WIP fetch command - support reading url from variable
2019-10-29 13:52:35 +13:00
e09160e80d add ability to create PathBuf from string to avoid type mismatch 2019-10-28 20:22:51 -04:00
8ba5388438 Merge pull request #885 from jonathandturner/update_path
Allow updating PATH in config
2019-10-29 11:38:53 +13:00
30b6eac03d Allow updating path in config 2019-10-29 10:22:31 +13:00
17ad07ce27 Merge pull request #884 from jonathandturner/nu_path_var
Add support for $nu:path
2019-10-29 08:23:02 +13:00
53911ebecd Add support for :path 2019-10-29 07:40:34 +13:00
bc309705a9 Merge pull request #883 from jonathandturner/magic_env_vars
Add support for $nu:config and $nu:env
2019-10-29 07:22:44 +13:00
1de80aeac3 Add support for :config and :env 2019-10-29 06:51:08 +13:00
1eaaf368ee Merge pull request #879 from andrasio/tilde-pattern
Expand tilde in patterns.
2019-10-28 12:09:02 -05:00
36e40ebb85 Merge pull request #882 from jonathandturner/arg_descs
Add descriptions to arguments
2019-10-28 18:47:56 +13:00
3f600c5b82 Fix build issues 2019-10-28 18:30:14 +13:00
fbd980f8b0 Add descriptions to arguments 2019-10-28 18:15:35 +13:00
7d383421c6 Merge pull request #881 from jonathandturner/history
Always save history, add history command
2019-10-28 06:36:54 +13:00
aed386b3cd Always save history, add history command 2019-10-28 05:58:39 +13:00
540cc4016e Expand tilde in patterns. 2019-10-27 03:55:30 -05:00
1b3a09495d Merge pull request #874 from andrasio/move-out-tag
Move out tags when parsing and building tree nodes.
2019-10-25 22:09:39 -05:00
b7af34371b Merge pull request #871 from oknozor/master
Create docs for from-csv command
2019-10-25 21:36:38 -05:00
105762e1c3 Merge pull request #873 from oknozor/doc/from-toml
Create docs for from-toml command
2019-10-25 21:35:28 -05:00
2706ae076d Move out tags when parsing and building tree nodes. 2019-10-25 18:31:25 -05:00
07ceec3e0b Create docs for from-toml command
Partial fix of issue nushell#711
2019-10-25 20:47:00 +02:00
72fd1b047f Create docs for from-csv command
Partial fix of issue nushell#711
2019-10-25 20:40:51 +02:00
178b6d4d8d Merge pull request #870 from jonathandturner/rusty
rustyline git and add plus for filenames
2019-10-26 06:15:45 +13:00
d160e834eb rustyline git and add plus for filenames 2019-10-26 05:43:31 +13:00
3e8b9e7e8b Merge pull request #867 from nushell/bump
Bump version
2019-10-23 21:15:53 +13:00
c34ebfe739 Bump version
Bump version so we can tell a difference between what has been released and what's in master.
2019-10-23 20:57:04 +13:00
571b33a11c Merge pull request #857 from andrasio/group-by
Can group rows by given column name.
2019-10-23 18:25:52 +13:00
07b90f4b4b Merge pull request #866 from andrasio/color-external
color escaped external command.
2019-10-22 20:16:03 -05:00
f1630da2cc Suggest a column name in case one unknown column is supplied. 2019-10-22 20:10:42 -05:00
16751b5dee color escaped external command. 2019-10-22 19:29:45 -05:00
29ec9a436a Merge pull request #864 from nushell/coloring_in_tokens
Coloring in tokens
2019-10-22 16:38:16 -07:00
6a7c00eaef Finish the job of moving shapes into the stream
This commit should finish the `coloring_in_tokens` feature, which moves
the shape accumulator into the token stream. This allows rollbacks of
the token stream to also roll back any shapes that were added.

This commit also adds a much nicer syntax highlighter trace, which shows
all of the paths the highlighter took to arrive at a particular coloring
output. This change is fairly substantial, but really improves the
understandability of the flow. I intend to update the normal parser with
a similar tracing view.

In general, this change also fleshes out the concept of "atomic" token
stream operations.

A good next step would be to try to make the parser more
error-correcting, using the coloring infrastructure. A follow-up step
would involve merging the parser and highlighter shapes themselves.
2019-10-22 16:19:22 -07:00
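The core idea above — moving the shape accumulator into the token stream so that rolling the stream back also rolls back shapes — can be sketched with a checkpoint/truncate pattern. All types here are invented for illustration; they are not nushell's actual parser types:

```rust
// Minimal sketch of shapes living inside the stream: a checkpoint
// records the current length, and rollback truncates to it, discarding
// any shapes added after the checkpoint.
#[derive(Debug, PartialEq)]
struct Shape(&'static str);

struct TokenStream {
    shapes: Vec<Shape>,
}

struct Checkpoint(usize);

impl TokenStream {
    fn new() -> Self {
        TokenStream { shapes: Vec::new() }
    }
    // Record a shape produced while coloring.
    fn color(&mut self, shape: Shape) {
        self.shapes.push(shape);
    }
    // An "atomic" operation boundary: remember how many shapes exist.
    fn checkpoint(&self) -> Checkpoint {
        Checkpoint(self.shapes.len())
    }
    // Undo everything colored since the checkpoint.
    fn rollback(&mut self, cp: Checkpoint) {
        self.shapes.truncate(cp.0);
    }
}
```

When a speculative parse fails, the highlighter rolls back to the checkpoint and the shapes it emitted along the way disappear with it.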
82b24d9beb Merge pull request #863 from andrasio/cov-enter
Cover failure not found files cases.
2019-10-22 08:24:48 -05:00
a317072e4e Cover failure not found files cases. 2019-10-22 08:08:24 -05:00
5b701cd197 Merge pull request #862 from Detegr/master
Fix `enter` crashing on nonexistent file
2019-10-22 07:40:23 -05:00
8f035616a0 Fix enter crashing on nonexistent file
Fixes #839
2019-10-22 15:22:47 +03:00
81f8ba9e4c Merge pull request #861 from andrasio/from_xml-cov
baseline coverage for xml parsing.
2019-10-22 04:10:36 -05:00
380ab19910 Merge pull request #858 from Charles-Schleich/master
added Docs for sort-by command
2019-10-22 03:49:18 -05:00
4329629ee9 baseline coverage for xml parsing. 2019-10-22 03:47:59 -05:00
39fde52d8e added Docs for sort-by command 2019-10-21 17:59:20 +02:00
0611f56776 Can group cells by given column name. 2019-10-20 18:42:07 -05:00
8923e91e39 Merge pull request #856 from andrasio/value-improvements
Improvements to Value mutable operations.
2019-10-21 06:57:36 +13:00
d6e6811bb9 Merge pull request #854 from jdvr/master
#194 Connect `rm` command to platform's recycle bin
2019-10-21 05:16:48 +13:00
f24bc5c826 Improvements to Value mutable operations. 2019-10-20 06:55:56 -05:00
c209d0d487 194 Fixed file format 2019-10-19 22:52:39 +02:00
74dddc880d "#194 Added trash switch checked before normal rm command action" 2019-10-19 22:31:18 +02:00
f3c41bbdf1 Merge pull request #851 from t-hart/pr/remove-unwrap-unit
Deletes impl From<&str> for Unit
2019-10-20 07:29:07 +13:00
c45ddc8f22 Merge pull request #848 from andrasio/column_path-inc
Inc plugin increments appropriately given a table containing a version.
2019-10-20 07:27:47 +13:00
84a98995bf Merge pull request #845 from t-hart/from-ssv/headers-as-markers
`from-ssv` logic updated
2019-10-20 07:26:04 +13:00
ed83449514 Merge pull request #808 from notryanb/plugin-average
Average Plugin
2019-10-20 07:23:22 +13:00
9eda573a43 filter out the files that have the same size on multiple operating systems 2019-10-18 20:43:37 -04:00
4f91d2512a add a test to calculate average of bytes 2019-10-18 20:43:37 -04:00
2f5eeab567 fix typos and incorrect commands 2019-10-18 20:43:37 -04:00
f9fbb0eb3c add docs for average and give more specific examples for sum 2019-10-18 20:43:37 -04:00
43fbf4345d remove comment and add test for averaging integers 2019-10-18 20:43:37 -04:00
8262c2dd33 add support for average on byte columns and fmt the code 2019-10-18 20:43:37 -04:00
0e86430ea3 get very basic average working 2019-10-18 20:43:37 -04:00
fc1301c92d #194 Added trash crate and send files to the trash using a flag 2019-10-19 00:41:24 +02:00
e913e26c01 Deletes impl From<&str>
The code still compiles, so this doesn't seem to break anything. That also means
it's not critical to fix it, but having dead code around isn't great either.
2019-10-18 20:02:24 +02:00
5ce4b12cc1 Inc plugin increments appropriately given a table containing a version in it. 2019-10-18 07:30:36 -05:00
94429d781f Merge pull request #847 from Detegr/master
Fix size comparison in 'where size'
2019-10-18 09:27:02 +13:00
321629a693 Fix size comparison in 'where size'
Fixes #840
2019-10-17 22:57:02 +03:00
f21405399c Formats file. 2019-10-17 09:56:06 +02:00
305ca11eb5 Changes the parsing to use the full value of the final column.
Previously it would split the last column on the first separator value found
between the start of the column and the end of the row. Changing this to using
everything from the start of the column to the end of the string makes it behave
more similarly to the other columns, making it less surprising.
2019-10-17 09:40:00 +02:00
9b1ff9b566 Updates the table creation logic.
The table parsing/creation logic has changed from treating every line the same
to processing each line in context of the column header's placement. Previously,
values on separate rows would be assigned to the same column as long as they
shared the same index based on the separator alone. Now, each item's column is
determined by vertical alignment with the column header.

This may seem brittle, but it solves the problem of some tables operating with
empty cells that would cause remaining values to be paired with the wrong
column.

Based on kubernetes output (get pods, events), the new method has shown to have
much greater success rates for parsing.
2019-10-17 00:25:43 +02:00
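The header-alignment approach from the two commits above can be sketched as follows. This is an illustrative standalone version, not the actual from-ssv implementation (function names are invented; byte-offset alignment is assumed, which holds for ASCII output like `kubectl get pods`):

```rust
// Record the byte offset where each column header begins.
fn column_starts(header: &str) -> Vec<usize> {
    let mut starts = Vec::new();
    let mut prev_space = true;
    for (i, c) in header.char_indices() {
        if prev_space && !c.is_whitespace() {
            starts.push(i);
        }
        prev_space = c.is_whitespace();
    }
    starts
}

// Slice each row at the header offsets, so an empty cell does not shift
// later values into the wrong column.
fn split_row(row: &str, starts: &[usize]) -> Vec<String> {
    (0..starts.len())
        .map(|i| {
            let begin = starts[i].min(row.len());
            // The final column takes everything to the end of the row.
            let end = if i + 1 < starts.len() {
                starts[i + 1].min(row.len())
            } else {
                row.len()
            };
            row[begin..end].trim().to_string()
        })
        .collect()
}
```

Note how the last column keeps its full remaining text (spaces included, then trimmed) rather than being split at the next run of whitespace.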
a0ed6ea3c8 Adds new tests and updates old ones.
New tests are added to test for additional cases that might be trickier to
handle with the new logic.

Old tests are updated where their expectations are no longer expected to hold true.
For instance: previously, lines were treated separately, allowing any index
offset between columns on different rows, as long as the values had the same
index within the row as decided by the separator. Since this is no longer the
case, some expectations need to be adjusted.
2019-10-17 00:17:58 +02:00
4a6529973e Merge pull request #844 from nushell/unknown-value
Rename <unknown> to <value>
2019-10-17 08:24:15 +13:00
9a02fac0e5 Rename <unknown> to <value> 2019-10-17 07:28:49 +13:00
2c6a9e9e48 Merge pull request #838 from jonathandturner/master
Update cargo.lock
2019-10-16 15:18:09 +13:00
d91b735442 Update cargo.lock 2019-10-16 15:09:47 +13:00
f8d337ad29 chore: omit the entire git.rs file when starship is used 2019-10-09 08:42:46 +01:00
47150efc14 chore: switch starship dependency back to the main one 2019-10-09 08:36:55 +01:00
3e14de158b fix(ci): can't push to quay.io (#1)
* ci(github): lowercase ${{ github.actor }}

* ci(github): fix robot username

* fix(ci): fix tag name on suffixed version
2019-10-09 09:58:49 +07:00
c8671c719f fix: addressed unused imports and dead code 2019-10-08 21:50:28 +01:00
0412c3a2f8 fix: remove the additional characters from highlighter
This resolves a small integration issue that would make custom prompts problematic (if they are implemented). The approach was to use the highlighter implementation in Helper to insert colour codes into the prompt; however, it heavily relies on the prompt being in a specific format, ending with a `> ` sequence. But this should really be the job of the prompt itself, not the presentation layer.

For now, I've simply stripped off the additional `> ` characters and passed in just the prompt itself without slicing off the last two characters. I moved the `\x1b[m` control sequence to the prompt creation in `cli.rs` as this feels like the more logical home for controlling what the prompt looks like. I can think of better ways to do this in future but this should be a fine solution for now.

In future it would probably make sense to completely separate prompts (be it, internal or external) from this code so it can be configured as an isolated piece of code.
2019-10-08 21:39:58 +01:00
ef3e8eb778 fix: update Cargo.lock with correct hash for starship fork 2019-10-08 21:16:52 +01:00
fb8cfeb70d feat: starship prompt
Kind of touches on #356 by integrating the Starship prompt directly into the shell.

Not finished yet and has surfaced a potential bug in rustyline anyway. It depends on https://github.com/starship/starship/pull/509 being merged so the Starship prompt can be used as a library.

I could have tackled #356 completely and implemented a full custom prompt feature but I felt this was a simpler approach given that Starship is both written in Rust so shelling out isn't necessary and it already has a bunch of useful features built in.

However, I would understand if it would be preferable to just scrap integrating Starship directly and instead implement a custom prompt system which would facilitate simply shelling out to Starship.
2019-10-08 16:25:12 +01:00
93ae5043cc ci(github): change REGISTRY to quay.io 2019-10-07 03:53:07 +07:00
b134394319 ci(github): refactor docker related run scripts 2019-10-07 03:14:51 +07:00
b163775112 ci(github): install cross from release page
Instead of compiling `cross` via `cargo install`,
downloading the binary executable from the release page speeds up the CI.
2019-10-07 02:53:23 +07:00
8bd035f51d ci(github): renew trigger definition
There is an update in workflow syntax docs
https://help.github.com/en/articles/workflow-syntax-for-github-actions#filter-pattern-cheat-sheet
2019-10-07 02:51:35 +07:00
fa859f1461 Merge branch 'master' of github.com:nushell/nushell 2019-09-02 21:53:26 -07:00
556f4b2f12 Merge branch 'master' of github.com:nushell/nushell 2019-09-01 23:14:59 -07:00
530 changed files with 41349 additions and 22942 deletions


@@ -3,26 +3,26 @@ trigger:
strategy:
matrix:
linux-nightly:
image: ubuntu-16.04
linux-stable:
image: ubuntu-18.04
style: 'unflagged'
macos-nightly:
macos-stable:
image: macos-10.14
style: 'unflagged'
windows-nightly:
image: vs2017-win2016
windows-stable:
image: windows-2019
style: 'unflagged'
linux-nightly-canary:
image: ubuntu-16.04
image: ubuntu-18.04
style: 'canary'
macos-nightly-canary:
image: macos-10.14
style: 'canary'
windows-nightly-canary:
image: vs2017-win2016
image: windows-2019
style: 'canary'
fmt:
image: ubuntu-16.04
image: ubuntu-18.04
style: 'fmt'
pool:
@@ -35,18 +35,28 @@ steps:
then
sudo apt-get -y install libxcb-composite0-dev libx11-dev
fi
curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain `cat rust-toolchain`
if [ "$(uname)" == "Darwin" ]; then
curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain "stable"
export PATH=$HOME/.cargo/bin:$PATH
fi
rustup update
rustc -Vv
echo "##vso[task.prependpath]$HOME/.cargo/bin"
rustup component add rustfmt --toolchain `cat rust-toolchain`
rustup component add rustfmt
displayName: Install Rust
- bash: RUSTFLAGS="-D warnings" cargo test --all-features
- bash: RUSTFLAGS="-D warnings" cargo test --all --features stable,test-bins
condition: eq(variables['style'], 'unflagged')
displayName: Run tests
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all-features
- bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'unflagged')
displayName: Check clippy lints
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all --features stable,test-bins
condition: eq(variables['style'], 'canary')
displayName: Run tests
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'canary')
displayName: Check clippy lints
- bash: cargo fmt --all -- --check
condition: eq(variables['style'], 'fmt')
displayName: Lint


@@ -0,0 +1,3 @@
[build]
#rustflags = ["--cfg", "coloring_in_tokens"]

.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,30 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1.
2.
3.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Configuration (please complete the following information):**
- OS: [e.g. Windows]
- Version [e.g. 0.4.0]
- Optional features (if any)
Add any other context about the problem here.


@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -2,7 +2,7 @@ name: Publish consumable Docker images
on:
push:
tags: ['*.*.*']
tags: ['v?[0-9]+.[0-9]+.[0-9]+*']
jobs:
compile:
@@ -14,7 +14,11 @@ jobs:
- x86_64-unknown-linux-gnu
steps:
- uses: actions/checkout@v1
- run: cargo install cross
- name: Install rust-embedded/cross
env: { VERSION: v0.1.16 }
run: >-
wget -nv https://github.com/rust-embedded/cross/releases/download/${VERSION}/cross-${VERSION}-x86_64-unknown-linux-gnu.tar.gz
-O- | sudo tar xz -C /usr/local/bin/
- name: compile for specific target
env: { arch: '${{ matrix.arch }}' }
run: |
@@ -31,6 +35,10 @@ jobs:
name: Build and publish docker images
needs: compile
runs-on: ubuntu-latest
env:
DOCKER_REGISTRY: quay.io/nushell
DOCKER_PASSWORD: ${{ secrets.DOCKER_REGISTRY }}
DOCKER_USER: ${{ secrets.DOCKER_USER }}
strategy:
matrix:
tag:
@@ -58,41 +66,53 @@ jobs:
- uses: actions/download-artifact@master
with: { name: '${{ matrix.arch }}', path: target/release }
- name: Build and publish exact version
run: |
REGISTRY=${REGISTRY,,}; export TAG=${GITHUB_REF##*/}-${{ matrix.tag }};
run: |-
export DOCKER_TAG=${GITHUB_REF##*/}-${{ matrix.tag }}
export NU_BINS=target/release/$( [ ${{ matrix.plugin }} = true ] && echo nu* || echo nu )
export PATCH=$([ ${{ matrix.use-patch }} = true ] && echo .${{ matrix.tag }} || echo '')
chmod +x $NU_BINS
echo ${{ secrets.DOCKER_REGISTRY }} | docker login docker.pkg.github.com -u ${{ github.actor }} --password-stdin
echo ${DOCKER_PASSWORD} | docker login ${DOCKER_REGISTRY} -u ${DOCKER_USER} --password-stdin
docker-compose --file docker/docker-compose.package.yml build
docker-compose --file docker/docker-compose.package.yml push # exact version
env:
BASE_IMAGE: ${{ matrix.base-image }}
REGISTRY: docker.pkg.github.com/${{ github.repository }}
#region semantics tagging
- name: Retag and push without suffixing version
run: |
- name: Retag and push with suffixed version
run: |-
VERSION=${GITHUB_REF##*/}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${{ matrix.tag }}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%%.*}-${{ matrix.tag }}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%.*}-${{ matrix.tag }}
docker push ${REGISTRY,,}/nu:${VERSION%.*}-${{ matrix.tag }} # latest patch
docker push ${REGISTRY,,}/nu:${VERSION%%.*}-${{ matrix.tag }} # latest features
docker push ${REGISTRY,,}/nu:${{ matrix.tag }} # latest version
env: { REGISTRY: 'docker.pkg.github.com/${{ github.repository }}' }
latest_version=${VERSION%%%.*}-${{ matrix.tag }}
latest_feature=${VERSION%%.*}-${{ matrix.tag }}
latest_patch=${VERSION%.*}-${{ matrix.tag }}
exact_version=${VERSION}-${{ matrix.tag }}
tags=( ${latest_version} ${latest_feature} ${latest_patch} ${exact_version} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
docker push ${DOCKER_REGISTRY}/nu:${{ matrix.tag }}
- name: Retag and push debian as latest
if: matrix.tag == 'debian'
run: |
run: |-
VERSION=${GITHUB_REF##*/}
docker tag ${REGISTRY,,}/nu:${{ matrix.tag }} ${REGISTRY,,}/nu:latest
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%.*}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION%%.*}
docker tag ${REGISTRY,,}/nu:${VERSION}-${{ matrix.tag }} ${REGISTRY,,}/nu:${VERSION}
docker push ${REGISTRY,,}/nu:${VERSION} # exact version
docker push ${REGISTRY,,}/nu:${VERSION%%.*} # latest features
docker push ${REGISTRY,,}/nu:${VERSION%.*} # latest patch
docker push ${REGISTRY,,}/nu:latest # latest version
env: { REGISTRY: 'docker.pkg.github.com/${{ github.repository }}' }
# ${latest features} ${latest patch} ${exact version}
tags=( ${VERSION%%.*} ${VERSION%.*} ${VERSION} )
for tag in ${tags[@]}; do
docker tag ${DOCKER_REGISTRY}/nu:${VERSION}-${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:${tag}
docker push ${DOCKER_REGISTRY}/nu:${tag}
done
# latest version
docker tag ${DOCKER_REGISTRY}/nu:${{ matrix.tag }} ${DOCKER_REGISTRY}/nu:latest
docker push ${DOCKER_REGISTRY}/nu:latest
#endregion semantics tagging
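The tag fan-out in the retag steps above relies on Bash parameter expansion: `%%` strips the longest matching suffix, `%` the shortest. A minimal sketch with illustrative `VERSION` and `TAG` values (standing in for `${GITHUB_REF##*/}` and `${{ matrix.tag }}`) shows how one release string collapses into the feature and patch tags:

```shell
# Illustrative inputs, not taken from the workflow environment.
VERSION=0.10.0
TAG=debian

latest_feature=${VERSION%%.*}-${TAG}   # strip longest ".*" suffix -> 0-debian
latest_patch=${VERSION%.*}-${TAG}      # strip shortest ".*" suffix -> 0.10-debian
exact_version=${VERSION}-${TAG}        # full version -> 0.10.0-debian

echo "$latest_feature $latest_patch $exact_version"
```

Pushing all three means a consumer can pin to a major (`nu:0-debian`), a minor (`nu:0.10-debian`), or an exact release.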

.gitignore vendored (2 lines changed)

@@ -3,7 +3,7 @@
**/*.rs.bk
history.txt
tests/fixtures/nuplayground
crates/*/target
# Debian/Ubuntu
debian/.debhelper/
debian/debhelper-build-stamp

.gitpod.Dockerfile vendored (12 lines changed)

@@ -1,7 +1,9 @@
FROM gitpod/workspace-full
USER root
RUN apt-get update && apt-get install -y libssl-dev \
USER gitpod
RUN sudo apt-get update && \
sudo apt-get install -y \
libssl-dev \
libxcb-composite0-dev \
pkg-config \
curl \
rustc
pkg-config


@@ -1,8 +1,8 @@
image:
file: .gitpod.Dockerfile
tasks:
- init: cargo build
command: cargo run
- init: cargo install --path . --force --features=stable
command: nu
github:
prebuilds:
# enable for the master/default branch (defaults to true)
@@ -19,3 +19,10 @@ github:
addBadge: false
# add a label once the prebuild is ready to pull requests (defaults to false)
addLabel: prebuilt-in-gitpod
vscode:
extensions:
- hbenl.vscode-test-explorer@2.15.0:koqDUMWDPJzELp/hdS/lWw==
- Swellaby.vscode-rust-test-adapter@0.11.0:Xg+YeZZQiVpVUsIkH+uiiw==
- serayuzgur.crates@0.4.7:HMkoguLcXp9M3ud7ac3eIw==
- belfz.search-crates-io@1.2.1:kSLnyrOhXtYPjQpKnMr4eQ==
- vadimcn.vscode-lldb@1.4.5:lwHCNwtm0kmOBXeQUIPGMQ==

Cargo.lock generated (4144 lines changed)

File diff suppressed because it is too large.


@@ -1,6 +1,6 @@
[package]
name = "nu"
version = "0.4.0"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "A shell for the GitHub era"
license = "MIT"
@@ -9,92 +9,159 @@ readme = "README.md"
default-run = "nu"
repository = "https://github.com/nushell/nushell"
homepage = "https://www.nushell.sh"
documentation = "https://book.nushell.sh"
documentation = "https://www.nushell.sh/book/"
exclude = ["images"]
[workspace]
members = [
"crates/nu-macros",
"crates/nu-errors",
"crates/nu-source",
"crates/nu_plugin_average",
"crates/nu_plugin_binaryview",
"crates/nu_plugin_fetch",
"crates/nu_plugin_inc",
"crates/nu_plugin_match",
"crates/nu_plugin_post",
"crates/nu_plugin_ps",
"crates/nu_plugin_str",
"crates/nu_plugin_sum",
"crates/nu_plugin_sys",
"crates/nu_plugin_textview",
"crates/nu_plugin_tree",
"crates/nu-protocol",
"crates/nu-plugin",
"crates/nu-parser",
"crates/nu-value-ext",
"crates/nu-build"
]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
rustyline = "5.0.3"
chrono = { version = "0.4.9", features = ["serde"] }
nu-source = { version = "0.10.0", path = "./crates/nu-source" }
nu-plugin = { version = "0.10.0", path = "./crates/nu-plugin" }
nu-protocol = { version = "0.10.0", path = "./crates/nu-protocol" }
nu-errors = { version = "0.10.0", path = "./crates/nu-errors" }
nu-parser = { version = "0.10.0", path = "./crates/nu-parser" }
nu-value-ext = { version = "0.10.0", path = "./crates/nu-value-ext" }
nu_plugin_average = { version = "0.10.0", path = "./crates/nu_plugin_average", optional=true }
nu_plugin_binaryview = { version = "0.10.0", path = "./crates/nu_plugin_binaryview", optional=true }
nu_plugin_fetch = { version = "0.10.0", path = "./crates/nu_plugin_fetch", optional=true }
nu_plugin_inc = { version = "0.10.0", path = "./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.10.0", path = "./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.10.0", path = "./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.10.0", path = "./crates/nu_plugin_ps", optional=true }
nu_plugin_str = { version = "0.10.0", path = "./crates/nu_plugin_str", optional=true }
nu_plugin_sum = { version = "0.10.0", path = "./crates/nu_plugin_sum", optional=true }
nu_plugin_sys = { version = "0.10.0", path = "./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.10.0", path = "./crates/nu_plugin_textview", optional=true }
nu_plugin_tree = { version = "0.10.0", path = "./crates/nu_plugin_tree", optional=true }
nu-macros = { version = "0.10.0", path = "./crates/nu-macros" }
query_interface = "0.3.5"
typetag = "0.1.4"
rustyline = "6.0.0"
chrono = { version = "0.4.10", features = ["serde"] }
derive-new = "0.5.8"
prettytable-rs = "0.8.0"
itertools = "0.8.0"
itertools = "0.8.2"
ansi_term = "0.12.1"
nom = "5.0.0"
nom = "5.0.1"
dunce = "1.0.0"
indexmap = { version = "1.2.0", features = ["serde-1"] }
chrono-humanize = "0.0.11"
byte-unit = "3.0.1"
base64 = "0.10.1"
futures-preview = { version = "=0.3.0-alpha.18", features = ["compat", "io-compat"] }
async-stream = "0.1.1"
futures_codec = "0.2.5"
num-traits = "0.2.8"
indexmap = { version = "1.3.2", features = ["serde-1"] }
byte-unit = "3.0.3"
base64 = "0.11"
futures = { version = "0.3", features = ["compat", "io-compat"] }
async-stream = "0.2"
futures_codec = "0.4"
num-traits = "0.2.11"
term = "0.5.2"
bytes = "0.4.12"
log = "0.4.8"
pretty_env_logger = "0.3.1"
serde = { version = "1.0.100", features = ["derive"] }
pretty_env_logger = "0.4.0"
serde = { version = "1.0.104", features = ["derive"] }
bson = { version = "0.14.0", features = ["decimal128"] }
serde_json = "1.0.40"
serde_json = "1.0.47"
serde-hjson = "0.9.1"
serde_yaml = "0.8"
serde_bytes = "0.11.2"
getset = "0.0.8"
serde_bytes = "0.11.3"
getset = "0.0.9"
language-reporting = "0.4.0"
app_dirs = "1.2.1"
csv = "1.1"
toml = "0.5.3"
toml = "0.5.6"
clap = "2.33.0"
git2 = { version = "0.10.1", default_features = false }
git2 = { version = "0.11.0", default_features = false }
dirs = "2.0.2"
glob = "0.3.0"
ctrlc = "3.1.3"
surf = "1.0.2"
url = "2.1.0"
roxmltree = "0.7.0"
roxmltree = "0.9.1"
nom_locate = "1.0.0"
nom-tracable = "0.4.0"
nom-tracable = "0.4.1"
unicode-xid = "0.2.0"
serde_ini = "0.2.0"
subprocess = "0.1.18"
mime = "0.3.14"
pretty-hex = "0.1.0"
hex = "0.3.2"
pretty-hex = "0.1.1"
hex = "0.4"
tempfile = "3.1.0"
semver = "0.9.0"
which = "2.0.1"
which = "3.1.0"
ichwh = "0.3"
textwrap = {version = "0.11.0", features = ["term_size"]}
shellexpand = "1.0.0"
futures-timer = "0.4.0"
shellexpand = "1.1.1"
pin-utils = "0.1.0-alpha.4"
num-bigint = { version = "0.2.3", features = ["serde"] }
num-bigint = { version = "0.2.6", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
natural = "0.3.0"
serde_urlencoded = "0.6.1"
sublime_fuzzy = "0.5"
trash = "1.0.0"
regex = "1"
cfg-if = "0.1"
strip-ansi-escapes = "0.1.0"
calamine = "0.16"
umask = "0.1"
futures-util = "0.3.4"
termcolor = "1.1.0"
natural = "0.3.0"
parking_lot = "0.10.0"
meval = "0.2"
regex = {version = "1", optional = true }
neso = { version = "0.5.0", optional = true }
crossterm = { version = "0.10.2", optional = true }
clipboard = {version = "0.5", optional = true }
ptree = {version = "0.2" }
starship = { version = "0.35.1", optional = true}
syntect = {version = "3.2.0", optional = true }
onig_sys = {version = "=69.1.0", optional = true }
heim = {version = "0.0.8", optional = true }
battery = {version = "0.7.4", optional = true }
rawkey = {version = "0.1.2", optional = true }
clipboard = {version = "0.5", optional = true }
ptree = {version = "0.2", optional = true }
image = { version = "0.22.2", default_features = false, features = ["png_codec", "jpeg"], optional = true }
crossterm = {version = "0.16.0", optional = true}
url = {version = "2.1.1", optional = true}
semver = {version = "0.9.0", optional = true}
filesize = "0.1.0"
[target.'cfg(unix)'.dependencies]
users = "0.9"
[features]
default = ["textview", "sys", "ps"]
raw-key = ["rawkey", "neso"]
textview = ["syntect", "onig_sys", "crossterm"]
binaryview = ["image", "crossterm"]
sys = ["heim", "battery"]
ps = ["heim"]
# trace = ["nom-tracable/trace"]
all = ["raw-key", "textview", "binaryview", "sys", "ps", "clipboard", "ptree"]
# Test executables
test-bins = []
default = ["sys", "ps", "textview", "inc", "str"]
stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard"]
# Default
textview = ["crossterm", "syntect", "onig_sys", "url", "nu_plugin_textview"]
sys = ["nu_plugin_sys"]
ps = ["nu_plugin_ps"]
inc = ["semver", "nu_plugin_inc"]
str = ["nu_plugin_str"]
# Stable
average = ["nu_plugin_average"]
binaryview = ["nu_plugin_binaryview"]
fetch = ["nu_plugin_fetch"]
match = ["nu_plugin_match"]
post = ["nu_plugin_post"]
starship-prompt = ["starship"]
sum = ["nu_plugin_sum"]
trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"]
[dependencies.rusqlite]
version = "0.20.0"
@@ -102,78 +169,103 @@ features = ["bundled", "blob"]
[dev-dependencies]
pretty_assertions = "0.6.1"
nu-test-support = { version = "0.10.0", path = "./crates/nu-test-support" }
[build-dependencies]
toml = "0.5.3"
serde = { version = "1.0.101", features = ["derive"] }
toml = "0.5.6"
serde = { version = "1.0.104", features = ["derive"] }
nu-build = { version = "0.10.0", path = "./crates/nu-build" }
[lib]
name = "nu"
doctest = false
path = "src/lib.rs"
[[bin]]
name = "nu_plugin_inc"
path = "src/plugins/inc.rs"
name = "fail"
path = "crates/nu-test-support/src/bins/fail.rs"
required-features = ["test-bins"]
[[bin]]
name = "nu_plugin_sum"
path = "src/plugins/sum.rs"
name = "chop"
path = "crates/nu-test-support/src/bins/chop.rs"
required-features = ["test-bins"]
[[bin]]
name = "nu_plugin_embed"
path = "src/plugins/embed.rs"
name = "cococo"
path = "crates/nu-test-support/src/bins/cococo.rs"
required-features = ["test-bins"]
[[bin]]
name = "nu_plugin_add"
path = "src/plugins/add.rs"
name = "nonu"
path = "crates/nu-test-support/src/bins/nonu.rs"
required-features = ["test-bins"]
# Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary
# unless we use [[bin]], so we use this as a workaround
[[bin]]
name = "nu_plugin_edit"
path = "src/plugins/edit.rs"
[[bin]]
name = "nu_plugin_str"
path = "src/plugins/str.rs"
[[bin]]
name = "nu_plugin_skip"
path = "src/plugins/skip.rs"
[[bin]]
name = "nu_plugin_match"
path = "src/plugins/match.rs"
required-features = ["regex"]
[[bin]]
name = "nu_plugin_sys"
path = "src/plugins/sys.rs"
required-features = ["sys"]
[[bin]]
name = "nu_plugin_ps"
path = "src/plugins/ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_tree"
path = "src/plugins/tree.rs"
required-features = ["tree"]
[[bin]]
name = "nu_plugin_binaryview"
path = "src/plugins/binaryview.rs"
required-features = ["binaryview"]
[[bin]]
name = "nu_plugin_textview"
path = "src/plugins/textview.rs"
name = "nu_plugin_core_textview"
path = "src/plugins/nu_plugin_core_textview.rs"
required-features = ["textview"]
[[bin]]
name = "nu_plugin_docker"
path = "src/plugins/docker.rs"
required-features = ["docker"]
name = "nu_plugin_core_inc"
path = "src/plugins/nu_plugin_core_inc.rs"
required-features = ["inc"]
[[bin]]
name = "nu_plugin_core_ps"
path = "src/plugins/nu_plugin_core_ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_core_str"
path = "src/plugins/nu_plugin_core_str.rs"
required-features = ["str"]
[[bin]]
name = "nu_plugin_core_sys"
path = "src/plugins/nu_plugin_core_sys.rs"
required-features = ["sys"]
# Stable plugins
[[bin]]
name = "nu_plugin_stable_average"
path = "src/plugins/nu_plugin_stable_average.rs"
required-features = ["average"]
[[bin]]
name = "nu_plugin_stable_fetch"
path = "src/plugins/nu_plugin_stable_fetch.rs"
required-features = ["fetch"]
[[bin]]
name = "nu_plugin_stable_binaryview"
path = "src/plugins/nu_plugin_stable_binaryview.rs"
required-features = ["binaryview"]
[[bin]]
name = "nu_plugin_stable_match"
path = "src/plugins/nu_plugin_stable_match.rs"
required-features = ["match"]
[[bin]]
name = "nu_plugin_stable_post"
path = "src/plugins/nu_plugin_stable_post.rs"
required-features = ["post"]
[[bin]]
name = "nu_plugin_stable_sum"
path = "src/plugins/nu_plugin_stable_sum.rs"
required-features = ["sum"]
[[bin]]
name = "nu_plugin_stable_tree"
path = "src/plugins/nu_plugin_stable_tree.rs"
required-features = ["tree"]
# Main nu binary
[[bin]]
name = "nu"
path = "src/main.rs"


@@ -4,21 +4,12 @@ command = "lalrpop"
args = ["src/parser/parser.lalrpop"]
[tasks.baseline]
dependencies = ["lalrpop"]
[tasks.build]
command = "cargo"
args = ["build"]
dependencies = ["lalrpop"]
args = ["build", "--bins"]
[tasks.run]
command = "cargo"
args = ["run", "--release"]
dependencies = ["baseline"]
[tasks.release]
command = "cargo"
args = ["build", "--release"]
args = ["run"]
dependencies = ["baseline"]
[tasks.test]

README.md (152 lines changed)

@@ -1,3 +1,4 @@
[![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/nushell/nushell)
[![Crates.io](https://img.shields.io/crates/v/nu.svg)](https://crates.io/crates/nu)
[![Build Status](https://dev.azure.com/nushell/nushell/_apis/build/status/nushell.nushell?branchName=master)](https://dev.azure.com/nushell/nushell/_build/latest?definitionId=2&branchName=master)
[![Discord](https://img.shields.io/discord/601130461678272522.svg?logo=discord)](https://discord.gg/NtAbbGn)
@@ -6,7 +7,7 @@
# Nu Shell
A modern shell for the GitHub era.
A new type of shell.
![Example of nushell](images/nushell-autocomplete.gif "Example of nushell")
@@ -18,12 +19,14 @@ Nu comes with a set of built-in commands (listed below). If a command is unknown
# Learning more
There are a few good resources to learn about Nu. There is a [book](https://book.nushell.sh) about Nu that is currently in progress. The book focuses on using Nu and its core concepts.
There are a few good resources to learn about Nu. There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress. The book focuses on using Nu and its core concepts.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://github.com/nushell/contributor-book/tree/master/en) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us.
You can also find more learning resources in our [documentation](https://www.nushell.sh/documentation.html) site.
Try it in Gitpod.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
@@ -32,9 +35,9 @@ Try it in Gitpod.
## Local
Up-to-date installation instructions can be found in the [installation chapter of the book](https://book.nushell.sh/en/installation).
Up-to-date installation instructions can be found in the [installation chapter of the book](https://www.nushell.sh/book/en/installation.html). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support.
To build Nu, you will need to use the **nightly** version of the compiler.
To build Nu, you will need to use the **latest stable (1.39 or later)** version of the compiler.
Required dependencies:
@@ -46,16 +49,16 @@ Optional dependencies:
* To use Nu with all possible optional features enabled, you'll also need the following:
* on Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev`
To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the beta compiler via `rustup install beta`):
To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the latest stable compiler via `rustup install stable`):
```
cargo +beta install nu
cargo install nu
```
You can also install Nu with all the bells and whistles (be sure to have installed the [dependencies](https://book.nushell.sh/en/installation#dependencies) for your platform):
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/en/installation.html#dependencies) for your platform), once you have checked out this repo with git:
```
cargo +beta install nu --all-features
cargo build --workspace --features=stable
```
## Docker
@@ -118,7 +121,7 @@ Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing lef
```
/home/jonathan/Source/nushell(master)> ls | where type == "Directory" | autoview
━━━━┯━━━━━━━━━━━┯━━━━━━━━━━━┯━━━━━━━━━━┯━━━━━━━━┯━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━
────┬───────────┬───────────┬──────────┬────────┬──────────────┬────────────────
# │ name │ type │ readonly │ size │ accessed │ modified
────┼───────────┼───────────┼──────────┼────────┼──────────────┼────────────────
0 │ .azure │ Directory │ │ 4.1 KB │ 2 months ago │ a day ago
@@ -129,7 +132,7 @@ Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing lef
5 │ src │ Directory │ │ 4.1 KB │ 2 months ago │ 37 minutes ago
6 │ assets │ Directory │ │ 4.1 KB │ a month ago │ a month ago
7 │ docs │ Directory │ │ 4.1 KB │ 2 months ago │ 2 months ago
━━━━┷━━━━━━━━━━━┷━━━━━━━━━━━┷━━━━━━━━━━┷━━━━━━━━┷━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━
────┴───────────┴───────────┴──────────┴────────┴──────────────┴────────────────
```
Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed. We could have also written the above:
@@ -142,15 +145,14 @@ Being able to use the same commands and compose them differently is an important
```text
/home/jonathan/Source/nushell(master)> ps | where cpu > 0
━━━┯━━━━━━━┯━━━━━━━━━━━━━━━━━┯━━━━━━━━━━┯━━━━━━━━━━
───┬───────┬─────────────────┬──────────┬──────────
# │ pid │ name │ status │ cpu
───┼───────┼─────────────────┼──────────┼──────────
0 │ 992 │ chrome │ Sleeping │ 6.988768
1 │ 4240 │ chrome │ Sleeping │ 5.645982
2 │ 13973 │ qemu-system-x86 │ Sleeping │ 4.996551
3 │ 15746 │ nu │ Sleeping │ 84.59905
━━━┷━━━━━━━┷━━━━━━━━━━━━━━━━━┷━━━━━━━━━━┷━━━━━━━━━━
───┴───────┴─────────────────┴──────────┴──────────
```
## Opening files
@@ -159,33 +161,53 @@ Nu can load file and URL contents as raw text or as structured data (if it recog
```
/home/jonathan/Source/nushell(master)> open Cargo.toml
━━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━
──────────────────┬────────────────┬──────────────────
bin │ dependencies │ dev-dependencies
──────────────────┼────────────────┼──────────────────
[table: 12 rows] │ [table: 1 row] │ [table: 1 row]
━━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━
──────────────────┴────────────────┴──────────────────
```
We can pipeline this into a command that gets the contents of one of the columns:
```
/home/jonathan/Source/nushell(master)> open Cargo.toml | get package
━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━━━┯━━━━━━┯━━━━━━━━━
─────────────────┬────────────────────────────┬─────────┬─────────┬──────┬─────────
authors │ description │ edition │ license │ name │ version
─────────────────┼────────────────────────────┼─────────┼─────────┼──────┼─────────
[table: 3 rows] │ A shell for the GitHub era │ 2018 │ MIT │ nu │ 0.4.0
━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━━━┷━━━━━━┷━━━━━━━━━
[table: 3 rows] │ A shell for the GitHub era │ 2018 │ MIT │ nu │ 0.9.0
─────────────────┴────────────────────────────┴─────────┴─────────┴──────┴─────────
```
Finally, we can use commands outside of Nu once we have the data we want:
```
/home/jonathan/Source/nushell(master)> open Cargo.toml | get package.version | echo $it
0.4.0
0.9.0
```
Here we use the variable `$it` to refer to the value being piped to the external command.
## Configuration
Nu has early support for configuring the shell. It currently supports the following settings:
| Variable | Type | Description |
| --------------- | -------------------- | -------------------------------------------------------------- |
| path | table of strings | PATH to use to find binaries |
| env | row | the environment variables to pass to external commands |
| ctrlc_exit | boolean | whether or not to exit Nu after multiple ctrl-c presses |
| table_mode | "light" or other | enable lightweight or normal tables |
| edit_mode | "vi" or "emacs" | changes line editing to "vi" or "emacs" mode |
| completion_mode | "circular" or "list" | changes completion type to "circular" (default) or "list" mode |
To set one of these variables, you can use `config --set`. For example:
```
> config --set [edit_mode "vi"]
> config --set [path $nu.path]
```
## Shells
Nu will work inside of a single directory and allow you to navigate around your filesystem by default. Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time.
@@ -217,96 +239,8 @@ Nu adheres closely to a set of goals that make up its design philosophy. As feat
* Finally, Nu views data functionally. Rather than using mutation, pipelines act as a means to load, change, and save data without mutable state.
# Commands
## Initial commands
| command | description |
| ------------- | ------------- |
| cd path | Change to a new path |
| cp source path | Copy files |
| date (--utc) | Get the current datetime |
| fetch url | Fetch contents from a url and retrieve data as a table if possible |
| help | Display help information about commands |
| ls (path) | View the contents of the current or given path |
| mkdir path | Make directories, creates intermediary directories as required. |
| mv source target | Move files or directories. |
| open filename | Load a file into a cell, convert to table if possible (avoid by appending '--raw') |
| post url body (--user <user>) (--password <password>) | Post content to a url and retrieve data as a table if possible |
| ps | View current processes |
| sys | View information about the current system |
| which filename | Finds a program file. |
| rm {file or directory} | Remove a file, (for removing directory append '--recursive') |
| version | Display Nu version |
## Shell commands
| command | description |
| ------- | ----------- |
| exit (--now) | Exit the current shell (or all shells) |
| enter (path) | Create a new shell and begin at this path |
| p | Go to previous shell |
| n | Go to next shell |
| shells | Display the list of current shells |
## Filters on tables (structured data)
| command | description |
| ------------- | ------------- |
| add column-or-column-path value | Add a new column to the table |
| edit column-or-column-path value | Edit an existing column to have a new value |
| embed column | Creates a new table of one column with the given name, and places the current table inside of it |
| first amount | Show only the first number of rows |
| get column-or-column-path | Open column and get data from the corresponding cells |
| inc (column-or-column-path) | Increment a value or version. Optionally use the column of a table |
| last amount | Show only the last number of rows |
| nth row-number | Return only the selected row |
| pick ...columns | Down-select table to only these columns |
| pivot --header-row <headers> | Pivot the tables, making columns into rows and vice versa |
| reject ...columns | Remove the given columns from the table |
| reverse | Reverses the table. |
| skip amount | Skip a number of rows |
| skip-while condition | Skips rows while the condition matches. |
| sort-by ...columns | Sort by the given columns |
| str (column) | Apply string function. Optionally use the column of a table |
| sum | Sum a column of values |
| tags | Read the tags (metadata) for values |
| to-bson | Convert table into .bson binary data |
| to-csv | Convert table into .csv text |
| to-json | Convert table into .json text |
| to-sqlite | Convert table to sqlite .db binary data |
| to-toml | Convert table into .toml text |
| to-tsv | Convert table into .tsv text |
| to-url | Convert table to a urlencoded string |
| to-yaml | Convert table into .yaml text |
| where condition | Filter table to match the condition |
## Filters on text (unstructured data)
| command | description |
| ------------- | ------------- |
| from-bson | Parse binary data as .bson and create table |
| from-csv | Parse text as .csv and create table |
| from-ini | Parse text as .ini and create table |
| from-json | Parse text as .json and create table |
| from-sqlite | Parse binary data as sqlite .db and create table |
| from-ssv --minimum-spaces <minimum number of spaces to count as a separator> | Parse text as space-separated values and create table |
| from-toml | Parse text as .toml and create table |
| from-tsv | Parse text as .tsv and create table |
| from-url | Parse urlencoded string and create a table |
| from-xml | Parse text as .xml and create a table |
| from-yaml | Parse text as a .yaml/.yml and create a table |
| lines | Split single string into rows, one per line |
| size | Gather word count statistics on the text |
| split-column sep ...column-names | Split row contents across multiple columns via the separator, optionally give the columns names |
| split-row sep | Split row contents over multiple rows via the separator |
| trim | Trim leading and following whitespace from text data |
| {external-command} $it | Run external command with given arguments, replacing $it with each row text |
## Consuming commands
| command | description |
| ------------- | ------------- |
| autoview | View the contents of the pipeline as a table or list |
| binaryview | Autoview of binary data (optional feature) |
| clip | Copy the contents of the pipeline to the copy/paste buffer (optional feature) |
| save filename | Save the contents of the pipeline to a file |
| table | View the contents of the pipeline as a table |
| textview | Autoview of text data |
| tree | View the contents of the pipeline as a tree (optional feature) |
You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references).
# License

TODO.md, new file (60 lines changed)

@@ -0,0 +1,60 @@
This pattern is extremely repetitive and can be abstracted:
```rs
let args = args.evaluate_once(registry)?;
let tag = args.name_tag();
let input = args.input;
let stream = async_stream! {
let values: Vec<Value> = input.values.collect().await;
let mut concat_string = String::new();
let mut latest_tag: Option<Tag> = None;
for value in values {
latest_tag = Some(value_tag.clone());
let value_span = value.tag.span;
match &value.value {
UntaggedValue::Primitive(Primitive::String(s)) => {
concat_string.push_str(&s);
concat_string.push_str("\n");
}
_ => yield Err(ShellError::labeled_error_with_secondary(
"Expected a string from pipeline",
"requires string input",
name_span,
"value originates from here",
value_span,
)),
}
}
```
Mandatory and Optional in parse_command
trace_remaining?
select_fields and select_fields take unnecessary Tag
Value#value should be Value#untagged
Unify dictionary building, probably around a macro
sys plugin in own crate
textview in own crate
Combine atomic and atomic_parse in parser
at_end_possible_ws needs to be comment and separator sensitive
Eliminate unnecessary `nodes` parser
#[derive(HasSpan)]
Figure out a solution for the duplication in stuff like NumberShape vs. NumberExpressionShape
use `struct Expander` from signature.rs


@@ -1,39 +1,3 @@
use serde::Deserialize;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::Path;
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
fn main() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR").unwrap();
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(",").map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning={}",
"Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let path = Path::new(&input).join("features.toml");
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled == true || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
nu_build::build()
}


@@ -0,0 +1,16 @@
[package]
name = "nu-build"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core build system for nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.103", features = ["derive"] }
lazy_static = "1.4.0"
serde_json = "1.0.44"
toml = "0.5.5"


@@ -0,0 +1,80 @@
use lazy_static::lazy_static;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
lazy_static! {
static ref WORKSPACES: Mutex<BTreeMap<String, &'static Path>> = Mutex::new(BTreeMap::new());
}
// got from https://github.com/mitsuhiko/insta/blob/b113499249584cb650150d2d01ed96ee66db6b30/src/runtime.rs#L67-L88
fn get_cargo_workspace(manifest_dir: &str) -> Result<Option<&Path>, Box<dyn std::error::Error>> {
let mut workspaces = WORKSPACES.lock()?;
if let Some(rv) = workspaces.get(manifest_dir) {
Ok(Some(rv))
} else {
#[derive(Deserialize)]
struct Manifest {
workspace_root: String,
}
let output = std::process::Command::new(env!("CARGO"))
.arg("metadata")
.arg("--format-version=1")
.current_dir(manifest_dir)
.output()?;
let manifest: Manifest = serde_json::from_slice(&output.stdout)?;
let path = Box::leak(Box::new(PathBuf::from(manifest.workspace_root)));
workspaces.insert(manifest_dir.to_string(), path.as_path());
Ok(workspaces.get(manifest_dir).cloned())
}
}
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
pub fn build() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR")?;
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(',').map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning=Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let workspace = match get_cargo_workspace(&input)? {
// If the crate is being downloaded from crates.io, it won't have a workspace root, and that's ok
None => return Ok(()),
Some(workspace) => workspace,
};
let path = Path::new(&workspace).join("features.toml");
// If the crate is being downloaded from crates.io, it won't have a features.toml, and that's ok
if !path.exists() {
return Ok(());
}
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
}
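`get_cargo_workspace` above caches each workspace root as a `&'static Path` by leaking a `PathBuf` with `Box::leak`. A minimal, std-only sketch of that leak-and-cache pattern (the `cache` helper and its map are stand-ins for `WORKSPACES`, not part of the source):

```rust
use std::collections::BTreeMap;
use std::path::{Path, PathBuf};

// Leak a PathBuf once so the map can hold 'static borrows for the
// remainder of the process, just like the WORKSPACES map above.
fn cache(map: &mut BTreeMap<String, &'static Path>, key: &str, value: PathBuf) -> &'static Path {
    let leaked: &'static Path = Box::leak(Box::new(value)).as_path();
    map.insert(key.to_string(), leaked);
    leaked
}

fn main() {
    let mut map = BTreeMap::new();
    let p = cache(&mut map, "crate-a", PathBuf::from("/tmp/ws"));
    assert_eq!(p, Path::new("/tmp/ws"));
    // Later lookups reuse the same leaked allocation.
    assert_eq!(map.get("crate-a").copied(), Some(Path::new("/tmp/ws")));
    println!("ok");
}
```

The leak is bounded: one small allocation per distinct manifest directory, for the lifetime of the build process.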


@ -0,0 +1,32 @@
[package]
name = "nu-errors"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core error subsystem for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.10.0" }
ansi_term = "0.12.1"
bigdecimal = { version = "0.1.0", features = ["serde"] }
derive-new = "0.5.8"
language-reporting = "0.4.0"
num-bigint = { version = "0.2.3", features = ["serde"] }
num-traits = "0.2.10"
serde = { version = "1.0.103", features = ["derive"] }
nom = "5.0.1"
nom_locate = "1.0.0"
getset = "0.0.9"
# implement conversions
serde_yaml = "0.8"
toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.10.0", path = "../nu-build" }


@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

crates/nu-errors/src/lib.rs

@ -0,0 +1,976 @@
use ansi_term::Color;
use bigdecimal::BigDecimal;
use derive_new::new;
use getset::Getters;
use language_reporting::{Diagnostic, Label, Severity};
use nu_source::{
b, DebugDocBuilder, HasFallibleSpan, PrettyDebug, Span, Spanned, SpannedItem, TracableContext,
};
use num_bigint::BigInt;
use num_traits::ToPrimitive;
use serde::{Deserialize, Serialize};
use std::fmt;
use std::ops::Range;
/// A structured reason for a ParseError. Note that parsing in nu is more like macro expansion in
/// other languages, so the kinds of errors that can occur during parsing are more contextual than
/// you might expect.
#[derive(Debug, Clone, Eq, PartialEq)]
pub enum ParseErrorReason {
/// The parser encountered an EOF rather than what it was expecting
Eof { expected: String, span: Span },
/// The parser expected to see the end of a token stream (possibly the token
/// stream from inside a delimited token node), but found something else.
ExtraTokens { actual: Spanned<String> },
/// The parser encountered something other than what it was expecting
Mismatch {
expected: String,
actual: Spanned<String>,
},
/// An unexpected internal error has occurred
InternalError { message: Spanned<String> },
/// The parser tried to parse an argument for a command, but it failed for
/// some reason
ArgumentError {
command: Spanned<String>,
error: ArgumentError,
},
}
/// A newtype for `ParseErrorReason`
#[derive(Debug, Clone, Eq, PartialEq, Getters)]
pub struct ParseError {
#[get = "pub"]
reason: ParseErrorReason,
}
impl ParseError {
/// Construct a [ParseErrorReason::Eof](ParseErrorReason::Eof)
pub fn unexpected_eof(expected: impl Into<String>, span: Span) -> ParseError {
ParseError {
reason: ParseErrorReason::Eof {
expected: expected.into(),
span,
},
}
}
/// Construct a [ParseErrorReason::ExtraTokens](ParseErrorReason::ExtraTokens)
pub fn extra_tokens(actual: Spanned<impl Into<String>>) -> ParseError {
let Spanned { span, item } = actual;
ParseError {
reason: ParseErrorReason::ExtraTokens {
actual: item.into().spanned(span),
},
}
}
/// Construct a [ParseErrorReason::Mismatch](ParseErrorReason::Mismatch)
pub fn mismatch(expected: impl Into<String>, actual: Spanned<impl Into<String>>) -> ParseError {
let Spanned { span, item } = actual;
ParseError {
reason: ParseErrorReason::Mismatch {
expected: expected.into(),
actual: item.into().spanned(span),
},
}
}
/// Construct a [ParseErrorReason::InternalError](ParseErrorReason::InternalError)
pub fn internal_error(message: Spanned<impl Into<String>>) -> ParseError {
ParseError {
reason: ParseErrorReason::InternalError {
message: message.item.into().spanned(message.span),
},
}
}
/// Construct a [ParseErrorReason::ArgumentError](ParseErrorReason::ArgumentError)
pub fn argument_error(command: Spanned<impl Into<String>>, kind: ArgumentError) -> ParseError {
ParseError {
reason: ParseErrorReason::ArgumentError {
command: command.item.into().spanned(command.span),
error: kind,
},
}
}
}
/// Convert a [ParseError](ParseError) into a [ShellError](ShellError)
impl From<ParseError> for ShellError {
fn from(error: ParseError) -> ShellError {
match error.reason {
ParseErrorReason::Eof { expected, span } => ShellError::unexpected_eof(expected, span),
ParseErrorReason::ExtraTokens { actual } => ShellError::type_error("nothing", actual),
ParseErrorReason::Mismatch { actual, expected } => {
ShellError::type_error(expected, actual)
}
ParseErrorReason::InternalError { message } => ShellError::labeled_error(
format!("Internal error: {}", message.item),
&message.item,
&message.span,
),
ParseErrorReason::ArgumentError { command, error } => {
ShellError::argument_error(command, error)
}
}
}
}
/// ArgumentError describes various ways that the parser could fail because of unexpected arguments.
/// Nu commands are like a combination of functions and macros, and these errors correspond to
/// problems that could be identified during expansion based on the syntactic signature of a
/// command.
#[derive(Debug, Eq, PartialEq, Clone, Ord, Hash, PartialOrd, Serialize, Deserialize)]
pub enum ArgumentError {
/// The command specified a mandatory flag, but it was missing.
MissingMandatoryFlag(String),
/// The command specified a mandatory positional argument, but it was missing.
MissingMandatoryPositional(String),
/// A flag was found, and it should have been followed by a value, but no value was found
MissingValueForName(String),
/// An argument was found, but the command does not recognize it
UnexpectedArgument(Spanned<String>),
/// A flag was found, but the command does not recognize it
UnexpectedFlag(Spanned<String>),
/// A sequence of characters was found that was not syntactically valid (but would have
/// been valid if the command was an external command)
InvalidExternalWord,
}
impl PrettyDebug for ArgumentError {
fn pretty(&self) -> DebugDocBuilder {
match self {
ArgumentError::MissingMandatoryFlag(flag) => {
b::description("missing `")
+ b::description(flag)
+ b::description("` as mandatory flag")
}
ArgumentError::UnexpectedArgument(name) => {
b::description("unexpected `")
+ b::description(&name.item)
+ b::description("` is not supported")
}
ArgumentError::UnexpectedFlag(name) => {
b::description("unexpected `")
+ b::description(&name.item)
+ b::description("` is not supported")
}
ArgumentError::MissingMandatoryPositional(pos) => {
b::description("missing `")
+ b::description(pos)
+ b::description("` as mandatory positional argument")
}
ArgumentError::MissingValueForName(name) => {
b::description("missing value for flag `")
+ b::description(name)
+ b::description("`")
}
ArgumentError::InvalidExternalWord => b::description("invalid word"),
}
}
}
/// A `ShellError` is a proximate error and a possible cause, which could have its own cause,
/// creating a cause chain.
#[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Clone, Serialize, Deserialize, Hash)]
pub struct ShellError {
error: ProximateShellError,
cause: Option<Box<ShellError>>,
}
/// `PrettyDebug` is for internal debugging. For user-facing debugging, [into_diagnostic](ShellError::into_diagnostic)
/// is used, which prints an error, highlighting spans.
impl PrettyDebug for ShellError {
fn pretty(&self) -> DebugDocBuilder {
match &self.error {
ProximateShellError::SyntaxError { problem } => {
b::error("Syntax Error")
+ b::space()
+ b::delimit("(", b::description(&problem.item), ")")
}
ProximateShellError::UnexpectedEof { .. } => b::error("Unexpected end"),
ProximateShellError::TypeError { expected, actual } => {
b::error("Type Error")
+ b::space()
+ b::delimit(
"(",
b::description("expected:")
+ b::space()
+ b::description(expected)
+ b::description(",")
+ b::space()
+ b::description("actual:")
+ b::space()
+ b::option(actual.item.as_ref().map(b::description)),
")",
)
}
ProximateShellError::MissingProperty { subpath, expr } => {
b::error("Missing Property")
+ b::space()
+ b::delimit(
"(",
b::description("expr:")
+ b::space()
+ b::description(&expr.item)
+ b::description(",")
+ b::space()
+ b::description("subpath:")
+ b::space()
+ b::description(&subpath.item),
")",
)
}
ProximateShellError::InvalidIntegerIndex { subpath, .. } => {
b::error("Invalid integer index")
+ b::space()
+ b::delimit(
"(",
b::description("subpath:") + b::space() + b::description(&subpath.item),
")",
)
}
ProximateShellError::MissingValue { reason, .. } => {
b::error("Missing Value")
+ b::space()
+ b::delimit(
"(",
b::description("reason:") + b::space() + b::description(reason),
")",
)
}
ProximateShellError::ArgumentError { command, error } => {
b::error("Argument Error")
+ b::space()
+ b::delimit(
"(",
b::description("command:")
+ b::space()
+ b::description(&command.item)
+ b::description(",")
+ b::space()
+ b::description("error:")
+ b::space()
+ error.pretty(),
")",
)
}
ProximateShellError::RangeError {
kind,
actual_kind,
operation,
} => {
b::error("Range Error")
+ b::space()
+ b::delimit(
"(",
b::description("expected:")
+ b::space()
+ kind.pretty()
+ b::description(",")
+ b::space()
+ b::description("actual:")
+ b::space()
+ b::description(&actual_kind.item)
+ b::description(",")
+ b::space()
+ b::description("operation:")
+ b::space()
+ b::description(operation),
")",
)
}
ProximateShellError::Diagnostic(_) => b::error("diagnostic"),
ProximateShellError::CoerceError { left, right } => {
b::error("Coercion Error")
+ b::space()
+ b::delimit(
"(",
b::description("left:")
+ b::space()
+ b::description(&left.item)
+ b::description(",")
+ b::space()
+ b::description("right:")
+ b::space()
+ b::description(&right.item),
")",
)
}
ProximateShellError::UntaggedRuntimeError { reason } => {
b::error("Unknown Error") + b::delimit("(", b::description(reason), ")")
}
}
}
}
impl std::fmt::Display for ShellError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.pretty().display())
}
}
impl serde::de::Error for ShellError {
fn custom<T>(msg: T) -> Self
where
T: std::fmt::Display,
{
ShellError::untagged_runtime_error(msg.to_string())
}
}
impl ShellError {
/// An error that describes a mismatch between the given type and the expected type
pub fn type_error(
expected: impl Into<String>,
actual: Spanned<impl Into<String>>,
) -> ShellError {
ProximateShellError::TypeError {
expected: expected.into(),
actual: actual.map(|i| Some(i.into())),
}
.start()
}
pub fn missing_property(
subpath: Spanned<impl Into<String>>,
expr: Spanned<impl Into<String>>,
) -> ShellError {
ProximateShellError::MissingProperty {
subpath: subpath.map(|s| s.into()),
expr: expr.map(|e| e.into()),
}
.start()
}
pub fn invalid_integer_index(
subpath: Spanned<impl Into<String>>,
integer: impl Into<Span>,
) -> ShellError {
ProximateShellError::InvalidIntegerIndex {
subpath: subpath.map(|s| s.into()),
integer: integer.into(),
}
.start()
}
pub fn untagged_runtime_error(error: impl Into<String>) -> ShellError {
ProximateShellError::UntaggedRuntimeError {
reason: error.into(),
}
.start()
}
pub fn unexpected_eof(expected: impl Into<String>, span: impl Into<Span>) -> ShellError {
ProximateShellError::UnexpectedEof {
expected: expected.into(),
span: span.into(),
}
.start()
}
pub fn range_error(
expected: impl Into<ExpectedRange>,
actual: &Spanned<impl fmt::Debug>,
operation: impl Into<String>,
) -> ShellError {
ProximateShellError::RangeError {
kind: expected.into(),
actual_kind: format!("{:?}", actual.item).spanned(actual.span),
operation: operation.into(),
}
.start()
}
pub fn syntax_error(problem: Spanned<impl Into<String>>) -> ShellError {
ProximateShellError::SyntaxError {
problem: problem.map(|p| p.into()),
}
.start()
}
pub fn coerce_error(
left: Spanned<impl Into<String>>,
right: Spanned<impl Into<String>>,
) -> ShellError {
ProximateShellError::CoerceError {
left: left.map(|l| l.into()),
right: right.map(|r| r.into()),
}
.start()
}
pub fn argument_error(command: Spanned<impl Into<String>>, kind: ArgumentError) -> ShellError {
ProximateShellError::ArgumentError {
command: command.map(|c| c.into()),
error: kind,
}
.start()
}
pub fn parse_error(
error: nom::Err<(
nom_locate::LocatedSpanEx<&str, TracableContext>,
nom::error::ErrorKind,
)>,
) -> ShellError {
use language_reporting::*;
match error {
nom::Err::Incomplete(_) => {
// TODO: Get span of EOF
let diagnostic = Diagnostic::new(
Severity::Error,
"Parse Error: Unexpected end of line".to_string(),
);
ShellError::diagnostic(diagnostic)
}
nom::Err::Failure(span) | nom::Err::Error(span) => {
let diagnostic = Diagnostic::new(Severity::Error, "Parse Error".to_string())
.with_label(Label::new_primary(Span::from(span.0)));
ShellError::diagnostic(diagnostic)
}
}
}
pub fn diagnostic(diagnostic: Diagnostic<Span>) -> ShellError {
ProximateShellError::Diagnostic(ShellDiagnostic { diagnostic }).start()
}
pub fn into_diagnostic(self) -> Diagnostic<Span> {
match self.error {
ProximateShellError::MissingValue { span, reason } => {
let mut d = Diagnostic::new(
Severity::Bug,
format!("Internal Error (missing value) :: {}", reason),
);
if let Some(span) = span {
d = d.with_label(Label::new_primary(span));
}
d
}
ProximateShellError::ArgumentError {
command,
error,
} => match error {
ArgumentError::InvalidExternalWord => Diagnostic::new(
Severity::Error,
"Invalid bare word for Nu command (did you intend to invoke an external command?)".to_string())
.with_label(Label::new_primary(command.span)),
ArgumentError::UnexpectedArgument(argument) => Diagnostic::new(
Severity::Error,
format!(
"{} unexpected {}",
Color::Cyan.paint(&command.item),
Color::Green.bold().paint(&argument.item)
),
)
.with_label(
Label::new_primary(argument.span).with_message(
format!("unexpected argument (try {} -h)", &command.item))
),
ArgumentError::UnexpectedFlag(flag) => Diagnostic::new(
Severity::Error,
format!(
"{} unexpected {}",
Color::Cyan.paint(&command.item),
Color::Green.bold().paint(&flag.item)
),
)
.with_label(
Label::new_primary(flag.span).with_message(
format!("unexpected flag (try {} -h)", &command.item))
),
ArgumentError::MissingMandatoryFlag(name) => Diagnostic::new(
Severity::Error,
format!(
"{} requires {}{}",
Color::Cyan.paint(&command.item),
Color::Green.bold().paint("--"),
Color::Green.bold().paint(name)
),
)
.with_label(Label::new_primary(command.span)),
ArgumentError::MissingMandatoryPositional(name) => Diagnostic::new(
Severity::Error,
format!(
"{} requires {} parameter",
Color::Cyan.paint(&command.item),
Color::Green.bold().paint(name.clone())
),
)
.with_label(
Label::new_primary(command.span).with_message(format!("requires {} parameter", name)),
),
ArgumentError::MissingValueForName(name) => Diagnostic::new(
Severity::Error,
format!(
"{} is missing value for flag {}{}",
Color::Cyan.paint(&command.item),
Color::Green.bold().paint("--"),
Color::Green.bold().paint(name)
),
)
.with_label(Label::new_primary(command.span)),
},
ProximateShellError::TypeError {
expected,
actual:
Spanned {
item: Some(actual),
span,
},
} => Diagnostic::new(Severity::Error, "Type Error").with_label(
Label::new_primary(span)
.with_message(format!("Expected {}, found {}", expected, actual)),
),
ProximateShellError::TypeError {
expected,
actual:
Spanned {
item: None,
span
},
} => Diagnostic::new(Severity::Error, "Type Error")
.with_label(Label::new_primary(span).with_message(expected)),
ProximateShellError::UnexpectedEof {
expected, span
} => Diagnostic::new(Severity::Error, "Unexpected end of input".to_string())
.with_label(Label::new_primary(span).with_message(format!("Expected {}", expected))),
ProximateShellError::RangeError {
kind,
operation,
actual_kind:
Spanned {
item,
span
},
} => Diagnostic::new(Severity::Error, "Range Error").with_label(
Label::new_primary(span).with_message(format!(
"Expected to convert {} to {} while {}, but it was out of range",
item,
kind.display(),
operation
)),
),
ProximateShellError::SyntaxError {
problem:
Spanned {
span,
item
},
} => Diagnostic::new(Severity::Error, "Syntax Error")
.with_label(Label::new_primary(span).with_message(item)),
ProximateShellError::MissingProperty { subpath, expr, .. } => {
let mut diag = Diagnostic::new(Severity::Error, "Missing property");
if subpath.span == Span::unknown() {
diag.message = format!("Missing property (for {})", subpath.item);
} else {
let subpath = Label::new_primary(subpath.span).with_message(subpath.item);
diag = diag.with_label(subpath);
if expr.span != Span::unknown() {
let expr = Label::new_primary(expr.span).with_message(expr.item);
diag = diag.with_label(expr)
}
}
diag
}
ProximateShellError::InvalidIntegerIndex { subpath, integer } => {
let mut diag = Diagnostic::new(Severity::Error, "Invalid integer property");
if subpath.span == Span::unknown() {
diag.message = format!("Invalid integer property (for {})", subpath.item)
} else {
let label = Label::new_primary(subpath.span).with_message(subpath.item);
diag = diag.with_label(label)
}
diag = diag.with_label(Label::new_secondary(integer).with_message("integer"));
diag
}
ProximateShellError::Diagnostic(diag) => diag.diagnostic,
ProximateShellError::CoerceError { left, right } => {
Diagnostic::new(Severity::Error, "Coercion error")
.with_label(Label::new_primary(left.span).with_message(left.item))
.with_label(Label::new_secondary(right.span).with_message(right.item))
}
ProximateShellError::UntaggedRuntimeError { reason } => Diagnostic::new(Severity::Error, format!("Error: {}", reason))
}
}
pub fn labeled_error(
msg: impl Into<String>,
label: impl Into<String>,
span: impl Into<Span>,
) -> ShellError {
ShellError::diagnostic(
Diagnostic::new(Severity::Error, msg.into())
.with_label(Label::new_primary(span.into()).with_message(label.into())),
)
}
pub fn labeled_error_with_secondary(
msg: impl Into<String>,
primary_label: impl Into<String>,
primary_span: impl Into<Span>,
secondary_label: impl Into<String>,
secondary_span: impl Into<Span>,
) -> ShellError {
ShellError::diagnostic(
Diagnostic::new_error(msg.into())
.with_label(
Label::new_primary(primary_span.into()).with_message(primary_label.into()),
)
.with_label(
Label::new_secondary(secondary_span.into())
.with_message(secondary_label.into()),
),
)
}
pub fn unimplemented(title: impl Into<String>) -> ShellError {
ShellError::untagged_runtime_error(&format!("Unimplemented: {}", title.into()))
}
pub fn unexpected(title: impl Into<String>) -> ShellError {
ShellError::untagged_runtime_error(&format!("Unexpected: {}", title.into()))
}
}
/// `ExpectedRange` describes a range of values that was expected by a command. In addition
/// to typical ranges, this enum allows an error to specify that the range of allowed values
/// corresponds to a particular numeric type (which is a dominant use-case for the
/// [RangeError](ProximateShellError::RangeError) error type).
#[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Serialize, Deserialize)]
pub enum ExpectedRange {
I8,
I16,
I32,
I64,
I128,
U8,
U16,
U32,
U64,
U128,
F32,
F64,
Usize,
Size,
BigInt,
BigDecimal,
Range { start: usize, end: usize },
}
/// Convert a Rust range into an [ExpectedRange](ExpectedRange).
impl From<Range<usize>> for ExpectedRange {
fn from(range: Range<usize>) -> Self {
ExpectedRange::Range {
start: range.start,
end: range.end,
}
}
}
impl PrettyDebug for ExpectedRange {
fn pretty(&self) -> DebugDocBuilder {
b::description(match self {
ExpectedRange::I8 => "an 8-bit signed integer",
ExpectedRange::I16 => "a 16-bit signed integer",
ExpectedRange::I32 => "a 32-bit signed integer",
ExpectedRange::I64 => "a 64-bit signed integer",
ExpectedRange::I128 => "a 128-bit signed integer",
ExpectedRange::U8 => "an 8-bit unsigned integer",
ExpectedRange::U16 => "a 16-bit unsigned integer",
ExpectedRange::U32 => "a 32-bit unsigned integer",
ExpectedRange::U64 => "a 64-bit unsigned integer",
ExpectedRange::U128 => "a 128-bit unsigned integer",
ExpectedRange::F32 => "a 32-bit float",
ExpectedRange::F64 => "a 64-bit float",
ExpectedRange::Usize => "a list index",
ExpectedRange::Size => "a list offset",
ExpectedRange::BigDecimal => "a decimal",
ExpectedRange::BigInt => "an integer",
ExpectedRange::Range { start, end } => {
return b::description(format!("{} to {}", start, end))
}
})
}
}
#[derive(Debug, Eq, PartialEq, Clone, Ord, PartialOrd, Serialize, Deserialize, Hash)]
pub enum ProximateShellError {
SyntaxError {
problem: Spanned<String>,
},
UnexpectedEof {
expected: String,
span: Span,
},
TypeError {
expected: String,
actual: Spanned<Option<String>>,
},
MissingProperty {
subpath: Spanned<String>,
expr: Spanned<String>,
},
InvalidIntegerIndex {
subpath: Spanned<String>,
integer: Span,
},
MissingValue {
span: Option<Span>,
reason: String,
},
ArgumentError {
command: Spanned<String>,
error: ArgumentError,
},
RangeError {
kind: ExpectedRange,
actual_kind: Spanned<String>,
operation: String,
},
Diagnostic(ShellDiagnostic),
CoerceError {
left: Spanned<String>,
right: Spanned<String>,
},
UntaggedRuntimeError {
reason: String,
},
}
impl ProximateShellError {
fn start(self) -> ShellError {
ShellError {
cause: None,
error: self,
}
}
}
impl HasFallibleSpan for ShellError {
fn maybe_span(&self) -> Option<Span> {
self.error.maybe_span()
}
}
impl HasFallibleSpan for ProximateShellError {
fn maybe_span(&self) -> Option<Span> {
Some(match self {
ProximateShellError::SyntaxError { problem } => problem.span,
ProximateShellError::UnexpectedEof { span, .. } => *span,
ProximateShellError::TypeError { actual, .. } => actual.span,
ProximateShellError::MissingProperty { subpath, .. } => subpath.span,
ProximateShellError::InvalidIntegerIndex { subpath, .. } => subpath.span,
ProximateShellError::MissingValue { span, .. } => return *span,
ProximateShellError::ArgumentError { command, .. } => command.span,
ProximateShellError::RangeError { actual_kind, .. } => actual_kind.span,
ProximateShellError::Diagnostic(_) => return None,
ProximateShellError::CoerceError { left, right } => left.span.until(right.span),
ProximateShellError::UntaggedRuntimeError { .. } => return None,
})
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ShellDiagnostic {
pub(crate) diagnostic: Diagnostic<Span>,
}
impl std::hash::Hash for ShellDiagnostic {
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
self.diagnostic.severity.hash(state);
self.diagnostic.code.hash(state);
self.diagnostic.message.hash(state);
for label in &self.diagnostic.labels {
label.span.hash(state);
label.message.hash(state);
match label.style {
language_reporting::LabelStyle::Primary => 0.hash(state),
language_reporting::LabelStyle::Secondary => 1.hash(state),
}
}
}
}
impl PartialEq for ShellDiagnostic {
fn eq(&self, _other: &ShellDiagnostic) -> bool {
false
}
}
impl Eq for ShellDiagnostic {}
impl std::cmp::PartialOrd for ShellDiagnostic {
fn partial_cmp(&self, _other: &Self) -> Option<std::cmp::Ordering> {
Some(std::cmp::Ordering::Less)
}
}
impl std::cmp::Ord for ShellDiagnostic {
fn cmp(&self, _other: &Self) -> std::cmp::Ordering {
std::cmp::Ordering::Less
}
}
#[derive(Debug, Ord, PartialOrd, Eq, PartialEq, new, Clone, Serialize, Deserialize)]
pub struct StringError {
title: String,
error: String,
}
impl std::error::Error for ShellError {}
impl std::convert::From<Box<dyn std::error::Error>> for ShellError {
fn from(input: Box<dyn std::error::Error>) -> ShellError {
ShellError::untagged_runtime_error(format!("{}", input))
}
}
impl std::convert::From<std::io::Error> for ShellError {
fn from(input: std::io::Error) -> ShellError {
ShellError::untagged_runtime_error(format!("{}", input))
}
}
impl std::convert::From<serde_yaml::Error> for ShellError {
fn from(input: serde_yaml::Error) -> ShellError {
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<toml::ser::Error> for ShellError {
fn from(input: toml::ser::Error) -> ShellError {
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<serde_json::Error> for ShellError {
fn from(input: serde_json::Error) -> ShellError {
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
impl std::convert::From<Box<dyn std::error::Error + Send + Sync>> for ShellError {
fn from(input: Box<dyn std::error::Error + Send + Sync>) -> ShellError {
ShellError::untagged_runtime_error(format!("{:?}", input))
}
}
pub trait CoerceInto<U> {
fn coerce_into(self, operation: impl Into<String>) -> Result<U, ShellError>;
}
trait ToExpectedRange {
fn to_expected_range() -> ExpectedRange;
}
macro_rules! ranged_int {
($ty:tt -> $op:tt -> $variant:tt) => {
impl ToExpectedRange for $ty {
fn to_expected_range() -> ExpectedRange {
ExpectedRange::$variant
}
}
impl CoerceInto<$ty> for nu_source::Tagged<BigInt> {
fn coerce_into(self, operation: impl Into<String>) -> Result<$ty, ShellError> {
match self.$op() {
Some(v) => Ok(v),
None => Err(ShellError::range_error(
$ty::to_expected_range(),
&self.item.spanned(self.tag.span),
operation.into(),
)),
}
}
}
impl CoerceInto<$ty> for nu_source::Tagged<&BigInt> {
fn coerce_into(self, operation: impl Into<String>) -> Result<$ty, ShellError> {
match self.$op() {
Some(v) => Ok(v),
None => Err(ShellError::range_error(
$ty::to_expected_range(),
&self.item.spanned(self.tag.span),
operation.into(),
)),
}
}
}
};
}
ranged_int!(u8 -> to_u8 -> U8);
ranged_int!(u16 -> to_u16 -> U16);
ranged_int!(u32 -> to_u32 -> U32);
ranged_int!(u64 -> to_u64 -> U64);
ranged_int!(i8 -> to_i8 -> I8);
ranged_int!(i16 -> to_i16 -> I16);
ranged_int!(i32 -> to_i32 -> I32);
ranged_int!(i64 -> to_i64 -> I64);
macro_rules! ranged_decimal {
($ty:tt -> $op:tt -> $variant:tt) => {
impl ToExpectedRange for $ty {
fn to_expected_range() -> ExpectedRange {
ExpectedRange::$variant
}
}
impl CoerceInto<$ty> for nu_source::Tagged<BigDecimal> {
fn coerce_into(self, operation: impl Into<String>) -> Result<$ty, ShellError> {
match self.$op() {
Some(v) => Ok(v),
None => Err(ShellError::range_error(
$ty::to_expected_range(),
&self.item.spanned(self.tag.span),
operation.into(),
)),
}
}
}
impl CoerceInto<$ty> for nu_source::Tagged<&BigDecimal> {
fn coerce_into(self, operation: impl Into<String>) -> Result<$ty, ShellError> {
match self.$op() {
Some(v) => Ok(v),
None => Err(ShellError::range_error(
$ty::to_expected_range(),
&self.item.spanned(self.tag.span),
operation.into(),
)),
}
}
}
};
}
ranged_decimal!(f32 -> to_f32 -> F32);
ranged_decimal!(f64 -> to_f64 -> F64);
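The `ranged_int!`/`ranged_decimal!` macros above generate checked narrowing conversions that produce a `RangeError` naming the expected range and the operation being performed. A std-only sketch of the same idea (the `coerce_u8` helper and its error string are illustrative, not the crate's API):

```rust
use std::convert::TryFrom;

// Illustrative helper: narrow an i64 into u8, reporting the expected
// range and the operation on failure, as CoerceInto does above.
fn coerce_u8(value: i64, operation: &str) -> Result<u8, String> {
    u8::try_from(value).map_err(|_| {
        format!(
            "Range Error: expected an 8-bit unsigned integer while {}, got {}",
            operation, value
        )
    })
}

fn main() {
    assert_eq!(coerce_u8(200, "indexing"), Ok(200));
    assert!(coerce_u8(-1, "indexing").is_err());
    assert!(coerce_u8(300, "indexing").is_err());
    println!("ok");
}
```

The macros avoid writing this by hand for every width by pairing each target type with its `to_*` conversion and its `ExpectedRange` variant.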


@ -0,0 +1,13 @@
[package]
name = "nu-macros"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core macros for building Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.10.0" }


@ -0,0 +1,25 @@
#[macro_export]
macro_rules! signature {
(def $name:tt {
$usage:tt
$(
$positional_name:tt $positional_ty:tt - $positional_desc:tt
)*
}) => {{
let signature = Signature::new(stringify!($name)).desc($usage);
$(
$crate::positional! { signature, $positional_name $positional_ty - $positional_desc }
)*
signature
}};
}
#[macro_export]
macro_rules! positional {
($ident:tt, $name:tt (optional $shape:tt) - $desc:tt) => {
let $ident = $ident.optional(stringify!($name), SyntaxShape::$shape, $desc);
};
($ident:tt, $name:tt ($shape:tt) - $desc:tt) => {
let $ident = $ident.required(stringify!($name), SyntaxShape::$shape, $desc);
};
}
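The `signature!` macro above expands to a chain of shadowed `let signature = signature.required(...)`/`.optional(...)` rebindings. A self-contained mimic of that expansion with stub `Signature`/`SyntaxShape` types (the real ones live in nu-protocol; these stubs are simplified assumptions):

```rust
// Stub types standing in for nu-protocol's Signature and SyntaxShape.
#[derive(Debug, PartialEq)]
enum SyntaxShape { String }

#[derive(Debug)]
struct Signature {
    name: String,
    desc: String,
    positional: Vec<(String, SyntaxShape, String)>,
}

impl Signature {
    fn new(name: &str) -> Self {
        Signature { name: name.into(), desc: String::new(), positional: vec![] }
    }
    fn desc(mut self, d: &str) -> Self { self.desc = d.into(); self }
    fn required(mut self, n: &str, s: SyntaxShape, d: &str) -> Self {
        self.positional.push((n.into(), s, d.into()));
        self
    }
}

fn main() {
    // Hand-expanded form of:
    //   signature! { def open { "Load a file" path (String) - "the path" } }
    let signature = Signature::new("open").desc("Load a file");
    let signature = signature.required("path", SyntaxShape::String, "the path");
    assert_eq!(signature.name, "open");
    assert_eq!(signature.positional.len(), 1);
    println!("ok");
}
```

Shadowing keeps the macro hygienic: each `positional!` arm rebinds the same identifier rather than mutating shared state.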


@ -0,0 +1,48 @@
[package]
name = "nu-parser"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core parser used in Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-errors = { path = "../nu-errors", version = "0.10.0" }
nu-source = { path = "../nu-source", version = "0.10.0" }
nu-protocol = { path = "../nu-protocol", version = "0.10.0" }
pretty_env_logger = "0.3.1"
pretty = "0.5.2"
termcolor = "1.0.5"
log = "0.4.8"
indexmap = { version = "1.3.0", features = ["serde-1"] }
serde = { version = "1.0.102", features = ["derive"] }
nom = "5.0.1"
nom_locate = "1.0.0"
nom-tracable = "0.4.1"
num-traits = "0.2.8"
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
derive-new = "0.5.8"
getset = "0.0.9"
cfg-if = "0.1"
itertools = "0.8.1"
shellexpand = "1.0.0"
ansi_term = "0.12.1"
ptree = { version = "0.2" }
language-reporting = "0.4.0"
unicode-xid = "0.2.0"
enumflags2 = "0.6.2"
[dev-dependencies]
pretty_assertions = "0.6.1"
[build-dependencies]
nu-build = { version = "0.10.0", path = "../nu-build" }
[features]
stable = []
trace = ["nom-tracable/trace"]


@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}


@ -0,0 +1,34 @@
pub mod classified;
use crate::commands::classified::external::{ExternalArg, ExternalArgs, ExternalCommand};
use crate::commands::classified::ClassifiedCommand;
use crate::hir::expand_external_tokens::ExternalTokensShape;
use crate::hir::tokens_iterator::TokensIterator;
use nu_errors::ParseError;
use nu_source::{Spanned, Tagged};
// Classify this command as an external command, which doesn't give special meaning
// to nu syntactic constructs, and passes all arguments to the external command as
// strings.
pub(crate) fn external_command(
tokens: &mut TokensIterator,
name: Tagged<&str>,
) -> Result<ClassifiedCommand, ParseError> {
let Spanned { item, span } = tokens.expand_infallible(ExternalTokensShape).tokens;
let full_span = name.span().until(span);
Ok(ClassifiedCommand::External(ExternalCommand {
name: name.to_string(),
name_tag: name.tag(),
args: ExternalArgs {
list: item
.iter()
.map(|x| ExternalArg {
tag: x.span.into(),
arg: x.item.clone(),
})
.collect(),
span: full_span,
},
}))
}


@ -0,0 +1,100 @@
pub mod external;
pub mod internal;
use crate::commands::classified::external::ExternalCommand;
use crate::commands::classified::internal::InternalCommand;
use crate::hir;
use crate::parse::token_tree::SpannedToken;
use derive_new::new;
use nu_errors::ParseError;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span};
#[derive(Debug, Clone, Eq, PartialEq)]
pub enum ClassifiedCommand {
#[allow(unused)]
Expr(SpannedToken),
#[allow(unused)]
Dynamic(hir::Call),
Internal(InternalCommand),
External(ExternalCommand),
Error(ParseError),
}
impl PrettyDebugWithSource for ClassifiedCommand {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
ClassifiedCommand::Expr(token) => b::typed("command", token.pretty_debug(source)),
ClassifiedCommand::Dynamic(call) => b::typed("command", call.pretty_debug(source)),
ClassifiedCommand::Error(_) => b::error("no command"),
ClassifiedCommand::Internal(internal) => internal.pretty_debug(source),
ClassifiedCommand::External(external) => external.pretty_debug(source),
}
}
}
impl HasSpan for ClassifiedCommand {
fn span(&self) -> Span {
match self {
ClassifiedCommand::Expr(node) => node.span(),
ClassifiedCommand::Internal(command) => command.span(),
ClassifiedCommand::Dynamic(call) => call.span,
ClassifiedCommand::Error(_) => Span::unknown(),
ClassifiedCommand::External(command) => command.span(),
}
}
}
#[derive(new, Debug, Eq, PartialEq)]
pub(crate) struct DynamicCommand {
pub(crate) args: hir::Call,
}
#[derive(Debug, Clone)]
pub struct Commands {
pub list: Vec<ClassifiedCommand>,
pub span: Span,
}
impl std::ops::Deref for Commands {
type Target = [ClassifiedCommand];
fn deref(&self) -> &Self::Target {
&self.list
}
}
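The `Deref` impl above lets `Commands` be used anywhere a `&[ClassifiedCommand]` slice is expected. A minimal self-contained sketch of the same pattern (with `String` standing in for `ClassifiedCommand`; not the real nu-parser types):

```rust
use std::ops::Deref;

// Simplified stand-in for `Commands`: a wrapper around a Vec.
struct Pipeline {
    list: Vec<String>,
}

// Deref to a slice gives the wrapper all slice methods for free.
impl Deref for Pipeline {
    type Target = [String];
    fn deref(&self) -> &Self::Target {
        &self.list
    }
}

fn main() {
    let p = Pipeline {
        list: vec!["ls".into(), "sort-by".into()],
    };
    // len() and indexing come through the Deref impl.
    assert_eq!(p.len(), 2);
    assert_eq!(p[0], "ls");
}
```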
#[derive(Debug, Clone)]
pub struct ClassifiedPipeline {
pub commands: Commands,
// this is not a Result to make it crystal clear that these shapes
// aren't intended to be used directly with `?`
pub failed: Option<nu_errors::ParseError>,
}
impl ClassifiedPipeline {
pub fn commands(list: Vec<ClassifiedCommand>, span: impl Into<Span>) -> ClassifiedPipeline {
ClassifiedPipeline {
commands: Commands {
list,
span: span.into(),
},
failed: None,
}
}
}
impl HasSpan for ClassifiedPipeline {
fn span(&self) -> Span {
self.commands.span
}
}
impl PrettyDebugWithSource for ClassifiedPipeline {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::intersperse(
self.commands.iter().map(|c| c.pretty_debug(source)),
b::operator(" | "),
)
.or(b::delimit("<", b::description("empty pipeline"), ">"))
}
}


@ -0,0 +1,97 @@
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebug, Span, Tag};
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalArg {
pub arg: String,
pub tag: Tag,
}
impl ExternalArg {
pub fn has(&self, name: &str) -> bool {
self.arg == name
}
pub fn is_it(&self) -> bool {
self.has("$it")
}
pub fn is_nu(&self) -> bool {
self.has("$nu")
}
pub fn looks_like_it(&self) -> bool {
self.arg.starts_with("$it") && (self.arg.starts_with("$it.") || self.is_it())
}
pub fn looks_like_nu(&self) -> bool {
self.arg.starts_with("$nu") && (self.arg.starts_with("$nu.") || self.is_nu())
}
}
impl std::ops::Deref for ExternalArg {
type Target = str;
fn deref(&self) -> &str {
&self.arg
}
}
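The `looks_like_it` predicate above accepts exactly `$it` or a path rooted at it (`$it.foo`), but rejects mere prefixes such as `$items`. A standalone sketch of that rule, with the span/tag machinery stripped away:

```rust
// An argument "looks like $it" when it is exactly "$it" or begins the
// path form "$it."; "$it" as a bare prefix of a longer word does not count.
fn looks_like_it(arg: &str) -> bool {
    arg.starts_with("$it") && (arg.starts_with("$it.") || arg == "$it")
}

fn main() {
    assert!(looks_like_it("$it"));
    assert!(looks_like_it("$it.cpu"));
    assert!(!looks_like_it("$items")); // prefix alone is not enough
    assert!(!looks_like_it("it.cpu")); // must start with the sigil
}
```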
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalArgs {
pub list: Vec<ExternalArg>,
pub span: Span,
}
impl ExternalArgs {
pub fn iter(&self) -> impl Iterator<Item = &ExternalArg> {
self.list.iter()
}
}
impl std::ops::Deref for ExternalArgs {
type Target = [ExternalArg];
fn deref(&self) -> &[ExternalArg] {
&self.list
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalCommand {
pub name: String,
pub name_tag: Tag,
pub args: ExternalArgs,
}
impl ExternalCommand {
pub fn has_it_argument(&self) -> bool {
self.args.iter().any(|arg| arg.looks_like_it())
}
pub fn has_nu_argument(&self) -> bool {
self.args.iter().any(|arg| arg.looks_like_nu())
}
}
impl PrettyDebug for ExternalCommand {
fn pretty(&self) -> DebugDocBuilder {
b::typed(
"external command",
b::description(&self.name)
+ b::preceded(
b::space(),
b::intersperse(
self.args.iter().map(|a| b::primitive(a.arg.to_string())),
b::space(),
),
),
)
}
}
impl HasSpan for ExternalCommand {
fn span(&self) -> Span {
self.name_tag.span.until(self.args.span)
}
}
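The `HasSpan` impl above builds the command's full extent by joining the name's span with the arguments' span via `until`. A simplified sketch of that operation (the real `Span` lives in nu-source; this stand-in only models start/end offsets):

```rust
// Simplified Span: byte offsets into the source text.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

impl Span {
    // The covering span from the start of `self` to the end of `other`.
    fn until(self, other: Span) -> Span {
        Span { start: self.start, end: other.end }
    }
}

fn main() {
    let name = Span { start: 0, end: 2 }; // e.g. "ls"
    let args = Span { start: 3, end: 8 }; // e.g. "*.txt"
    assert_eq!(name.until(args), Span { start: 0, end: 8 });
}
```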


@ -0,0 +1,28 @@
use crate::hir;
use derive_new::new;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Tag};
#[derive(new, Debug, Clone, Eq, PartialEq)]
pub struct InternalCommand {
pub name: String,
pub name_tag: Tag,
pub args: hir::Call,
}
impl PrettyDebugWithSource for InternalCommand {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"internal command",
b::description(&self.name) + b::space() + self.args.pretty_debug(source),
)
}
}
impl HasSpan for InternalCommand {
fn span(&self) -> Span {
let start = self.name_tag.span;
start.until(self.args.span)
}
}

crates/nu-parser/src/hir.rs (new file, 492 lines)

@ -0,0 +1,492 @@
pub(crate) mod baseline_parse;
pub(crate) mod binary;
pub(crate) mod expand_external_tokens;
pub(crate) mod external_command;
pub(crate) mod named;
pub(crate) mod path;
pub(crate) mod range;
pub mod syntax_shape;
pub(crate) mod tokens_iterator;
use crate::hir::syntax_shape::Member;
use crate::parse::operator::CompareOperator;
use crate::parse::parser::Number;
use crate::parse::unit::Unit;
use derive_new::new;
use getset::Getters;
use nu_protocol::{PathMember, ShellTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, IntoSpanned, PrettyDebug, PrettyDebugRefineKind,
PrettyDebugWithSource, Span, Spanned,
};
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use crate::parse::number::RawNumber;
pub(crate) use self::binary::Binary;
pub(crate) use self::path::Path;
pub(crate) use self::range::Range;
pub(crate) use self::tokens_iterator::TokensIterator;
pub use self::external_command::ExternalCommand;
pub use self::named::{NamedArguments, NamedValue};
#[derive(Debug, Clone)]
pub struct Signature {
unspanned: nu_protocol::Signature,
span: Span,
}
impl Signature {
pub fn new(unspanned: nu_protocol::Signature, span: impl Into<Span>) -> Signature {
Signature {
unspanned,
span: span.into(),
}
}
}
impl HasSpan for Signature {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for Signature {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.unspanned.pretty_debug(source)
}
}
#[derive(Debug, Clone, Eq, PartialEq, Getters, Serialize, Deserialize, new)]
pub struct Call {
#[get = "pub(crate)"]
pub head: Box<SpannedExpression>,
#[get = "pub(crate)"]
pub positional: Option<Vec<SpannedExpression>>,
#[get = "pub(crate)"]
pub named: Option<NamedArguments>,
pub span: Span,
}
impl Call {
pub fn switch_preset(&self, switch: &str) -> bool {
self.named
.as_ref()
.and_then(|n| n.get(switch))
.map(|t| match t {
NamedValue::PresentSwitch(_) => true,
_ => false,
})
.unwrap_or(false)
}
}
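`switch_preset` above collapses three cases into `false` via `Option` chaining: no named table at all, the switch key missing, and the switch recorded as absent. A self-contained sketch of the same chain (using `HashMap` and `matches!` in place of the real `NamedArguments`):

```rust
use std::collections::HashMap;

// Simplified stand-in for nu-parser's NamedValue.
#[derive(Debug)]
enum NamedValue {
    AbsentSwitch,
    PresentSwitch,
}

// Only an explicitly present switch yields true; everything else is false.
fn switch_present(named: Option<&HashMap<String, NamedValue>>, switch: &str) -> bool {
    named
        .and_then(|n| n.get(switch))
        .map(|v| matches!(v, NamedValue::PresentSwitch))
        .unwrap_or(false)
}

fn main() {
    let mut named = HashMap::new();
    named.insert("full".to_string(), NamedValue::PresentSwitch);
    named.insert("help".to_string(), NamedValue::AbsentSwitch);
    assert!(switch_present(Some(&named), "full"));
    assert!(!switch_present(Some(&named), "help")); // recorded but absent
    assert!(!switch_present(None, "full")); // no named arguments at all
}
```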
impl PrettyDebugWithSource for Call {
fn refined_pretty_debug(&self, refine: PrettyDebugRefineKind, source: &str) -> DebugDocBuilder {
match refine {
PrettyDebugRefineKind::ContextFree => self.pretty_debug(source),
PrettyDebugRefineKind::WithContext => {
self.head
.refined_pretty_debug(PrettyDebugRefineKind::WithContext, source)
+ b::preceded_option(
Some(b::space()),
self.positional.as_ref().map(|pos| {
b::intersperse(
pos.iter().map(|expr| {
expr.refined_pretty_debug(
PrettyDebugRefineKind::WithContext,
source,
)
}),
b::space(),
)
}),
)
+ b::preceded_option(
Some(b::space()),
self.named.as_ref().map(|named| {
named.refined_pretty_debug(PrettyDebugRefineKind::WithContext, source)
}),
)
}
}
}
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"call",
self.refined_pretty_debug(PrettyDebugRefineKind::WithContext, source),
)
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Expression {
Literal(Literal),
ExternalWord,
Synthetic(Synthetic),
Variable(Variable),
Binary(Box<Binary>),
Range(Box<Range>),
Block(Vec<SpannedExpression>),
List(Vec<SpannedExpression>),
Path(Box<Path>),
FilePath(PathBuf),
ExternalCommand(ExternalCommand),
Command(Span),
Boolean(bool),
}
impl ShellTypeName for Expression {
fn type_name(&self) -> &'static str {
match self {
Expression::Literal(literal) => literal.type_name(),
Expression::Synthetic(synthetic) => synthetic.type_name(),
Expression::Command(..) => "command",
Expression::ExternalWord => "external word",
Expression::FilePath(..) => "file path",
Expression::Variable(..) => "variable",
Expression::List(..) => "list",
Expression::Binary(..) => "binary",
Expression::Range(..) => "range",
Expression::Block(..) => "block",
Expression::Path(..) => "variable path",
Expression::Boolean(..) => "boolean",
Expression::ExternalCommand(..) => "external",
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Synthetic {
String(String),
}
impl ShellTypeName for Synthetic {
fn type_name(&self) -> &'static str {
match self {
Synthetic::String(_) => "string",
}
}
}
impl IntoSpanned for Expression {
type Output = SpannedExpression;
fn into_spanned(self, span: impl Into<Span>) -> Self::Output {
SpannedExpression {
expr: self,
span: span.into(),
}
}
}
impl Expression {
pub fn into_expr(self, span: impl Into<Span>) -> SpannedExpression {
self.into_spanned(span)
}
pub fn into_unspanned_expr(self) -> SpannedExpression {
SpannedExpression {
expr: self,
span: Span::unknown(),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub struct SpannedExpression {
pub expr: Expression,
pub span: Span,
}
impl SpannedExpression {
pub fn new(expr: Expression, span: Span) -> SpannedExpression {
SpannedExpression { expr, span }
}
}
impl std::ops::Deref for SpannedExpression {
type Target = Expression;
fn deref(&self) -> &Expression {
&self.expr
}
}
impl HasSpan for SpannedExpression {
fn span(&self) -> Span {
self.span
}
}
impl ShellTypeName for SpannedExpression {
fn type_name(&self) -> &'static str {
self.expr.type_name()
}
}
impl PrettyDebugWithSource for SpannedExpression {
fn refined_pretty_debug(&self, refine: PrettyDebugRefineKind, source: &str) -> DebugDocBuilder {
match refine {
PrettyDebugRefineKind::ContextFree => self.pretty_debug(source),
PrettyDebugRefineKind::WithContext => match &self.expr {
Expression::Literal(literal) => literal
.clone()
.into_spanned(self.span)
.refined_pretty_debug(refine, source),
Expression::ExternalWord => {
b::delimit("e\"", b::primitive(self.span.slice(source)), "\"").group()
}
Expression::Synthetic(s) => match s {
Synthetic::String(_) => {
b::delimit("s\"", b::primitive(self.span.slice(source)), "\"").group()
}
},
Expression::Variable(Variable::Other(_)) => b::keyword(self.span.slice(source)),
Expression::Variable(Variable::It(_)) => b::keyword("$it"),
Expression::Binary(binary) => binary.pretty_debug(source),
Expression::Range(range) => range.pretty_debug(source),
Expression::Block(_) => b::opaque("block"),
Expression::List(list) => b::delimit(
"[",
b::intersperse(
list.iter()
.map(|item| item.refined_pretty_debug(refine, source)),
b::space(),
),
"]",
),
Expression::Path(path) => path.pretty_debug(source),
Expression::FilePath(path) => b::typed("path", b::primitive(path.display())),
Expression::ExternalCommand(external) => {
b::keyword("^") + b::keyword(external.name.slice(source))
}
Expression::Command(command) => b::keyword(command.slice(source)),
Expression::Boolean(boolean) => match boolean {
true => b::primitive("$yes"),
false => b::primitive("$no"),
},
},
}
}
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match &self.expr {
Expression::Literal(literal) => {
literal.clone().into_spanned(self.span).pretty_debug(source)
}
Expression::ExternalWord => {
b::typed("external word", b::primitive(self.span.slice(source)))
}
Expression::Synthetic(s) => match s {
Synthetic::String(s) => b::typed("synthetic", b::primitive(format!("{:?}", s))),
},
Expression::Variable(Variable::Other(_)) => b::keyword(self.span.slice(source)),
Expression::Variable(Variable::It(_)) => b::keyword("$it"),
Expression::Binary(binary) => binary.pretty_debug(source),
Expression::Range(range) => range.pretty_debug(source),
Expression::Block(_) => b::opaque("block"),
Expression::List(list) => b::delimit(
"[",
b::intersperse(
list.iter().map(|item| item.pretty_debug(source)),
b::space(),
),
"]",
),
Expression::Path(path) => path.pretty_debug(source),
Expression::FilePath(path) => b::typed("path", b::primitive(path.display())),
Expression::ExternalCommand(external) => b::typed(
"command",
b::keyword("^") + b::primitive(external.name.slice(source)),
),
Expression::Command(command) => {
b::typed("command", b::primitive(command.slice(source)))
}
Expression::Boolean(boolean) => match boolean {
true => b::primitive("$yes"),
false => b::primitive("$no"),
},
}
}
}
impl Expression {
pub fn number(i: impl Into<Number>) -> Expression {
Expression::Literal(Literal::Number(i.into()))
}
pub fn size(i: impl Into<Number>, unit: impl Into<Unit>) -> Expression {
Expression::Literal(Literal::Size(i.into(), unit.into()))
}
pub fn string(inner: impl Into<Span>) -> Expression {
Expression::Literal(Literal::String(inner.into()))
}
pub fn synthetic_string(string: impl Into<String>) -> Expression {
Expression::Synthetic(Synthetic::String(string.into()))
}
pub fn column_path(members: Vec<Member>) -> Expression {
Expression::Literal(Literal::ColumnPath(members))
}
pub fn path(head: SpannedExpression, tail: Vec<impl Into<PathMember>>) -> Expression {
let tail = tail.into_iter().map(|t| t.into()).collect();
Expression::Path(Box::new(Path::new(head, tail)))
}
pub fn dot_member(head: SpannedExpression, next: impl Into<PathMember>) -> Expression {
let SpannedExpression { expr: item, span } = head;
let next = next.into();
match item {
Expression::Path(path) => {
let (head, mut tail) = path.parts();
tail.push(next);
Expression::path(head, tail)
}
other => Expression::path(other.into_expr(span), vec![next]),
}
}
pub fn infix(
left: SpannedExpression,
op: Spanned<impl Into<CompareOperator>>,
right: SpannedExpression,
) -> Expression {
Expression::Binary(Box::new(Binary::new(left, op.map(|o| o.into()), right)))
}
pub fn range(left: SpannedExpression, op: Span, right: SpannedExpression) -> Expression {
Expression::Range(Box::new(Range::new(left, op, right)))
}
pub fn file_path(path: impl Into<PathBuf>) -> Expression {
Expression::FilePath(path.into())
}
pub fn list(list: Vec<SpannedExpression>) -> Expression {
Expression::List(list)
}
pub fn bare() -> Expression {
Expression::Literal(Literal::Bare)
}
pub fn pattern(inner: impl Into<String>) -> Expression {
Expression::Literal(Literal::GlobPattern(inner.into()))
}
pub fn variable(inner: impl Into<Span>) -> Expression {
Expression::Variable(Variable::Other(inner.into()))
}
pub fn external_command(inner: impl Into<Span>) -> Expression {
Expression::ExternalCommand(ExternalCommand::new(inner.into()))
}
pub fn it_variable(inner: impl Into<Span>) -> Expression {
Expression::Variable(Variable::It(inner.into()))
}
}
impl From<Spanned<Path>> for SpannedExpression {
fn from(path: Spanned<Path>) -> SpannedExpression {
Expression::Path(Box::new(path.item)).into_expr(path.span)
}
}
/// Literals are expressions that are:
///
/// 1. Copy
/// 2. Can be evaluated without additional context
/// 3. Evaluation cannot produce an error
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Literal {
Number(Number),
Size(Number, Unit),
String(Span),
GlobPattern(String),
ColumnPath(Vec<Member>),
Bare,
}
impl Literal {
pub fn into_spanned(self, span: impl Into<Span>) -> SpannedLiteral {
SpannedLiteral {
literal: self,
span: span.into(),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub struct SpannedLiteral {
pub literal: Literal,
pub span: Span,
}
impl ShellTypeName for Literal {
fn type_name(&self) -> &'static str {
match &self {
Literal::Number(..) => "number",
Literal::Size(..) => "size",
Literal::String(..) => "string",
Literal::ColumnPath(..) => "column path",
Literal::Bare => "string",
Literal::GlobPattern(_) => "pattern",
}
}
}
impl PrettyDebugWithSource for SpannedLiteral {
fn refined_pretty_debug(&self, refine: PrettyDebugRefineKind, source: &str) -> DebugDocBuilder {
match refine {
PrettyDebugRefineKind::ContextFree => self.pretty_debug(source),
PrettyDebugRefineKind::WithContext => match &self.literal {
Literal::Number(number) => number.pretty(),
Literal::Size(number, unit) => (number.pretty() + unit.pretty()).group(),
Literal::String(string) => b::primitive(format!("{:?}", string.slice(source))),
Literal::GlobPattern(pattern) => b::primitive(pattern),
Literal::ColumnPath(path) => {
b::intersperse_with_source(path.iter(), b::space(), source)
}
Literal::Bare => b::delimit("b\"", b::primitive(self.span.slice(source)), "\""),
},
}
}
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match &self.literal {
Literal::Number(number) => number.pretty(),
Literal::Size(number, unit) => {
b::typed("size", (number.pretty() + unit.pretty()).group())
}
Literal::String(string) => b::typed(
"string",
b::primitive(format!("{:?}", string.slice(source))),
),
Literal::GlobPattern(pattern) => b::typed("pattern", b::primitive(pattern)),
Literal::ColumnPath(path) => b::typed(
"column path",
b::intersperse_with_source(path.iter(), b::space(), source),
),
Literal::Bare => b::typed("bare", b::primitive(self.span.slice(source))),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Variable {
It(Span),
Other(Span),
}
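The `dot_member` constructor earlier in this file folds successive `.member` accesses onto a single `Path` rather than nesting paths. A sketch of that folding with simplified stand-in types (plain `String` members instead of `PathMember`, no spans):

```rust
// Simplified expression type: a variable, or a path = head + member tail.
#[derive(Debug, PartialEq)]
enum Expr {
    Var(String),
    Path(Box<Expr>, Vec<String>),
}

// Appending a member to an existing path extends its tail;
// any other expression becomes the head of a fresh path.
fn dot_member(head: Expr, next: &str) -> Expr {
    match head {
        Expr::Path(h, mut tail) => {
            tail.push(next.to_string());
            Expr::Path(h, tail)
        }
        other => Expr::Path(Box::new(other), vec![next.to_string()]),
    }
}

fn main() {
    let e = dot_member(Expr::Var("$it".into()), "cpu");
    let e = dot_member(e, "cores");
    // Both members accumulate on one flat path.
    assert_eq!(
        e,
        Expr::Path(
            Box::new(Expr::Var("$it".into())),
            vec!["cpu".into(), "cores".into()]
        )
    );
}
```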


@ -0,0 +1,2 @@
#[cfg(test)]
pub mod tests;


@ -0,0 +1,301 @@
use crate::commands::classified::{internal::InternalCommand, ClassifiedCommand};
use crate::hir::expand_external_tokens::{ExternalTokensShape, ExternalTokensSyntax};
use crate::hir::{
self, named::NamedValue, syntax_shape::*, Expression, NamedArguments, SpannedExpression,
TokensIterator,
};
use crate::parse::files::Files;
use crate::parse::token_tree_builder::{CurriedToken, TokenTreeBuilder as b};
use crate::SpannedToken;
use derive_new::new;
use indexmap::IndexMap;
use nu_errors::{ParseError, ShellError};
use nu_protocol::{outln, PathMember, Signature, SyntaxShape};
use nu_source::{HasSpan, PrettyDebugWithSource, Span, SpannedItem, Tag, Text};
use pretty_assertions::assert_eq;
use std::fmt::Debug;
#[test]
fn test_parse_external() {
parse_tokens(
fallible(ExternalTokensShape),
"5kb",
vec![b::bare("5kb")],
|tokens| {
ExternalTokensSyntax::new(
vec![format!("5kb").spanned(tokens[0].span())].spanned(tokens[0].span()),
)
},
);
parse_tokens(
fallible(ExternalTokensShape),
"cargo +nightly run -- --features all",
vec![
b::bare("cargo"),
b::sp(),
b::external_word("+nightly"),
b::sp(),
b::bare("run"),
b::sp(),
b::external_word("--"),
b::sp(),
b::flag("features"),
b::sp(),
b::bare("all"),
],
|tokens| {
let cargo = format!("cargo").spanned(tokens[0].span());
let nightly = format!("+nightly").spanned(tokens[2].span());
let run = format!("run").spanned(tokens[4].span());
let dashdash = format!("--").spanned(tokens[6].span());
let features = format!("--features").spanned(tokens[8].span());
let all = format!("all").spanned(tokens[10].span());
let span = tokens[0].span().until(tokens[10].span());
ExternalTokensSyntax::new(
vec![cargo, nightly, run, dashdash, features, all].spanned(span),
)
},
);
}
#[test]
fn test_parse_string() {
parse_tokens(
CoerceStringShape,
r#""hello""#,
vec![b::string("hello")],
|tokens| {
Expression::string(inner_string_span(tokens[0].span())).into_expr(tokens[0].span())
},
);
}
#[test]
fn test_parse_path() {
let _ = pretty_env_logger::try_init();
parse_expr(
AnyExpressionShape,
"$it.cpu",
vec![b::it_var(), b::dot(), b::bare("cpu")],
|tokens| {
let (outer_var, inner_var) = tokens[0].expect_var();
let bare = tokens[2].expect_bare();
Expression::path(
Expression::it_variable(inner_var).into_expr(outer_var),
vec![PathMember::string("cpu", bare)],
)
.into_expr(outer_var.until(bare))
},
);
parse_expr(
VariablePathShape,
r#"$cpu.amount."max ghz""#,
vec![
b::var("cpu"),
b::dot(),
b::bare("amount"),
b::dot(),
b::string("max ghz"),
],
|tokens| {
let (outer_var, inner_var) = tokens[0].expect_var();
let amount = tokens[2].expect_bare();
let (outer_max_ghz, _) = tokens[4].expect_string();
Expression::path(
Expression::variable(inner_var).into_expr(outer_var),
vec![
PathMember::string("amount", amount),
PathMember::string("max ghz", outer_max_ghz),
],
)
.into_expr(outer_var.until(outer_max_ghz))
},
);
}
#[test]
fn test_parse_command() {
parse_tokens(
fallible(ClassifiedCommandShape),
"ls *.txt",
vec![b::bare("ls"), b::sp(), b::pattern("*.txt")],
|tokens| {
let bare = tokens[0].expect_bare();
let pat = tokens[2].expect_pattern();
let mut map = IndexMap::new();
map.insert("full".to_string(), NamedValue::AbsentSwitch);
map.insert("help".to_string(), NamedValue::AbsentSwitch);
ClassifiedCommand::Internal(InternalCommand::new(
"ls".to_string(),
Tag {
span: bare,
anchor: None,
},
hir::Call {
head: Box::new(Expression::Command(bare).into_expr(bare)),
positional: Some(vec![Expression::pattern("*.txt").into_expr(pat)]),
named: Some(NamedArguments { named: map }),
span: bare.until(pat),
},
))
},
);
}
#[derive(Debug, Clone, new)]
struct TestRegistry {
#[new(default)]
signatures: indexmap::IndexMap<String, Signature>,
}
impl TestRegistry {
fn insert(&mut self, key: &str, value: Signature) {
self.signatures.insert(key.to_string(), value);
}
}
impl SignatureRegistry for TestRegistry {
fn has(&self, name: &str) -> bool {
self.signatures.contains_key(name)
}
fn get(&self, name: &str) -> Option<Signature> {
self.signatures.get(name).cloned()
}
fn clone_box(&self) -> Box<dyn SignatureRegistry> {
Box::new(self.clone())
}
}
fn with_empty_context(source: &Text, callback: impl FnOnce(ExpandContext)) {
let mut registry = TestRegistry::new();
registry.insert(
"ls",
Signature::build("ls")
.optional(
"path",
SyntaxShape::Pattern,
"a path to get the directory contents from",
)
.switch(
"full",
"list all available columns for each entry",
Some('f'),
),
);
callback(ExpandContext::new(Box::new(registry), source, None))
}
trait Expand {}
fn parse_tokens<T: Eq + HasSpan + PrettyDebugWithSource + Clone + Debug + 'static>(
shape: impl ExpandSyntax<Output = Result<T, ParseError>>,
syntax: &str,
tokens: Vec<CurriedToken>,
expected: impl FnOnce(&[SpannedToken]) -> T,
) {
// let parsed_tokens = parse(syntax);
let tokens = b::token_list(tokens);
let (tokens, source) = b::build(tokens);
let text = Text::from(&source);
assert_eq!(syntax, source);
with_empty_context(&text, |context| {
let tokens = tokens.expect_list();
let mut iterator = TokensIterator::new(&tokens.item, context, tokens.span);
let expr = iterator.expand_syntax(shape);
let expr = match expr {
Ok(expr) => expr,
Err(err) => {
outln!("");
ptree::print_tree(&iterator.expand_tracer().print(text.clone())).unwrap();
outln!("");
print_err(err.into(), &iterator.context().source().clone());
panic!("Parse failed");
}
};
let expected = expected(&tokens.item);
if expr != expected {
outln!("");
ptree::print_tree(&iterator.expand_tracer().print(text.clone())).unwrap();
outln!("");
assert_eq!(expr, expected);
}
})
}
fn parse_expr(
shape: impl ExpandSyntax<Output = Result<SpannedExpression, ParseError>>,
syntax: &str,
tokens: Vec<CurriedToken>,
expected: impl FnOnce(&[SpannedToken]) -> SpannedExpression,
) {
// let parsed_tokens = parse(syntax);
let tokens = b::token_list(tokens);
let (tokens, source) = b::build(tokens);
let text = Text::from(&source);
assert_eq!(syntax, source);
with_empty_context(&text, |context| {
let tokens = tokens.expect_list();
let mut iterator = TokensIterator::new(&tokens.item, context, tokens.span);
let expr = iterator.expand_syntax(shape);
let expr = match expr {
Ok(expr) => expr,
Err(err) => {
outln!("");
ptree::print_tree(&iterator.expand_tracer().print(text.clone())).unwrap();
outln!("");
print_err(err.into(), &iterator.source());
panic!("Parse failed");
}
};
let expected = expected(&tokens.item);
if expr != expected {
outln!("");
ptree::print_tree(&iterator.expand_tracer().print(text.clone())).unwrap();
outln!("");
assert_eq!(expr, expected);
}
})
}
fn inner_string_span(span: Span) -> Span {
Span::new(span.start() + 1, span.end() - 1)
}
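`inner_string_span` above shaves one character off each end of a string token's span to drop the surrounding quotes. The same idea with plain byte offsets (an assumption-free sketch, not the real `Span` API):

```rust
// Trim one byte from each end: the span of "hello" inside "\"hello\"".
fn inner_string(span: (usize, usize)) -> (usize, usize) {
    (span.0 + 1, span.1 - 1)
}

fn main() {
    let source = r#""hello""#; // 7 bytes including both quotes
    let (start, end) = inner_string((0, source.len()));
    assert_eq!(&source[start..end], "hello");
}
```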
pub fn print_err(err: ShellError, source: &Text) {
let diag = err.into_diagnostic();
let writer = termcolor::StandardStream::stderr(termcolor::ColorChoice::Auto);
let mut source = source.to_string();
source.push_str(" ");
let files = Files::new(source);
let _ = language_reporting::emit(
&mut writer.lock(),
&files,
&diag,
&language_reporting::DefaultConfig,
);
}


@ -0,0 +1,31 @@
use crate::{hir::SpannedExpression, CompareOperator};
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Spanned};
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
#[get = "pub"]
pub struct Binary {
left: SpannedExpression,
op: Spanned<CompareOperator>,
right: SpannedExpression,
}
impl PrettyDebugWithSource for Binary {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::delimit(
"<",
self.left.pretty_debug(source)
+ b::space()
+ b::keyword(self.op.span.slice(source))
+ b::space()
+ self.right.pretty_debug(source),
">",
)
.group()
}
}


@ -0,0 +1,175 @@
use crate::parse::token_tree::Token;
use crate::{
hir::syntax_shape::{ExpandSyntax, FlatShape, MaybeSpaceShape},
TokensIterator,
};
use derive_new::new;
use nu_errors::ParseError;
use nu_protocol::SpannedTypeName;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebug, Span, Spanned, SpannedItem};
#[derive(Debug, Eq, PartialEq, Clone, new)]
pub struct ExternalTokensSyntax {
pub tokens: Spanned<Vec<Spanned<String>>>,
}
impl HasSpan for ExternalTokensSyntax {
fn span(&self) -> Span {
self.tokens.span
}
}
impl PrettyDebug for ExternalTokensSyntax {
fn pretty(&self) -> DebugDocBuilder {
b::intersperse(
self.tokens
.iter()
.map(|token| b::primitive(format!("{:?}", token.item))),
b::space(),
)
}
}
#[derive(Debug, Copy, Clone)]
pub struct ExternalTokensShape;
impl ExpandSyntax for ExternalTokensShape {
type Output = ExternalTokensSyntax;
fn name(&self) -> &'static str {
"external tokens"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> ExternalTokensSyntax {
let mut out: Vec<Spanned<String>> = vec![];
let start = token_nodes.span_at_cursor();
loop {
match token_nodes.expand_syntax(ExternalExpressionShape) {
Err(_) => break,
Ok(span) => out.push(span.spanned_string(&token_nodes.source())),
}
}
let end = token_nodes.span_at_cursor();
ExternalTokensSyntax {
tokens: out.spanned(start.until(end)),
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct ExternalExpressionShape;
impl ExpandSyntax for ExternalExpressionShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"external expression"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_infallible(MaybeSpaceShape);
let first = token_nodes.expand_syntax(ExternalStartToken)?;
let mut last = first;
loop {
let continuation = token_nodes.expand_syntax(ExternalStartToken);
if let Ok(continuation) = continuation {
last = continuation;
} else {
break;
}
}
Ok(first.until(last))
}
}
#[derive(Debug, Copy, Clone)]
struct ExternalStartToken;
impl ExpandSyntax for ExternalStartToken {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"external start token"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let mut span: Option<Span> = None;
loop {
let boundary = token_nodes.expand_infallible(PeekExternalBoundary);
if boundary {
break;
}
let peeked = token_nodes.peek().not_eof("external start token")?;
let node = peeked.node;
let new_span = match node.unspanned() {
Token::Comment(_)
| Token::Separator
| Token::Whitespace
| Token::Pipeline(_) => {
return Err(ParseError::mismatch(
"external start token",
node.spanned_type_name(),
))
}
_ => {
let node = peeked.commit();
node.span()
}
};
span = match span {
None => Some(new_span),
Some(before) => Some(before.until(new_span)),
};
}
match span {
None => Err(token_nodes.err_next_token("external start token")),
Some(span) => {
token_nodes.color_shape(FlatShape::ExternalWord.spanned(span));
Ok(span)
}
}
})
}
}
#[derive(Debug, Copy, Clone)]
struct PeekExternalBoundary;
impl ExpandSyntax for PeekExternalBoundary {
type Output = bool;
fn name(&self) -> &'static str {
"external boundary"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Self::Output {
let next = token_nodes.peek();
match next.node {
None => true,
Some(node) => match node.unspanned() {
Token::Delimited(_) => true,
Token::Whitespace => true,
Token::Comment(_) => true,
Token::Separator => true,
Token::Call(_) => true,
_ => false,
},
}
}
}
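`PeekExternalBoundary` above treats end-of-input and a handful of token kinds as hard stops for an external word. A behavior-equivalent sketch with a simplified token enum, using `matches!` in place of the explicit `true`/`false` arms:

```rust
// Simplified token kinds; only the ones the boundary check cares about.
#[derive(Debug)]
enum Token {
    Bare,
    Whitespace,
    Separator,
    Comment,
    Delimited,
}

// End of input always ends the external word; so do whitespace,
// separators, comments, and delimited groups.
fn is_external_boundary(next: Option<&Token>) -> bool {
    match next {
        None => true,
        Some(t) => matches!(
            t,
            Token::Whitespace | Token::Separator | Token::Comment | Token::Delimited
        ),
    }
}

fn main() {
    assert!(is_external_boundary(None));
    assert!(is_external_boundary(Some(&Token::Whitespace)));
    assert!(!is_external_boundary(Some(&Token::Bare)));
}
```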


@ -0,0 +1,12 @@
use derive_new::new;
use getset::Getters;
use nu_source::Span;
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
#[get = "pub"]
pub struct ExternalCommand {
pub(crate) name: Span,
}


@ -0,0 +1,109 @@
use crate::hir::SpannedExpression;
use crate::Flag;
use indexmap::IndexMap;
use log::trace;
use nu_source::{b, DebugDocBuilder, PrettyDebugRefineKind, PrettyDebugWithSource, Tag};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Eq, PartialEq, Serialize, Deserialize)]
pub enum NamedValue {
AbsentSwitch,
PresentSwitch(Tag),
AbsentValue,
Value(SpannedExpression),
}
impl PrettyDebugWithSource for NamedValue {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
NamedValue::AbsentSwitch => b::typed("switch", b::description("absent")),
NamedValue::PresentSwitch(_) => b::typed("switch", b::description("present")),
NamedValue::AbsentValue => b::description("absent"),
NamedValue::Value(value) => value.pretty_debug(source),
}
}
fn refined_pretty_debug(&self, refine: PrettyDebugRefineKind, source: &str) -> DebugDocBuilder {
match refine {
PrettyDebugRefineKind::ContextFree => self.pretty_debug(source),
PrettyDebugRefineKind::WithContext => match self {
NamedValue::AbsentSwitch => b::value("absent"),
NamedValue::PresentSwitch(_) => b::value("present"),
NamedValue::AbsentValue => b::value("absent"),
NamedValue::Value(value) => value.refined_pretty_debug(refine, source),
},
}
}
}
#[derive(Debug, Default, Clone, Eq, PartialEq, Serialize, Deserialize)]
pub struct NamedArguments {
pub named: IndexMap<String, NamedValue>,
}
impl NamedArguments {
pub fn new() -> NamedArguments {
Default::default()
}
pub fn iter(&self) -> impl Iterator<Item = (&String, &NamedValue)> {
self.named.iter()
}
pub fn get(&self, name: &str) -> Option<&NamedValue> {
self.named.get(name)
}
}
impl NamedArguments {
pub fn insert_switch(&mut self, name: impl Into<String>, switch: Option<Flag>) {
let name = name.into();
trace!("Inserting switch -- {} = {:?}", name, switch);
match switch {
None => self.named.insert(name, NamedValue::AbsentSwitch),
Some(flag) => self.named.insert(
name,
NamedValue::PresentSwitch(Tag {
span: *flag.name(),
anchor: None,
}),
),
};
}
pub fn insert_optional(&mut self, name: impl Into<String>, expr: Option<SpannedExpression>) {
match expr {
None => self.named.insert(name.into(), NamedValue::AbsentValue),
Some(expr) => self.named.insert(name.into(), NamedValue::Value(expr)),
};
}
pub fn insert_mandatory(&mut self, name: impl Into<String>, expr: SpannedExpression) {
self.named.insert(name.into(), NamedValue::Value(expr));
}
}
impl PrettyDebugWithSource for NamedArguments {
fn refined_pretty_debug(&self, refine: PrettyDebugRefineKind, source: &str) -> DebugDocBuilder {
match refine {
PrettyDebugRefineKind::ContextFree => self.pretty_debug(source),
PrettyDebugRefineKind::WithContext => b::intersperse(
self.named.iter().map(|(key, value)| {
b::key(key)
+ b::equals()
+ value.refined_pretty_debug(PrettyDebugRefineKind::WithContext, source)
}),
b::space(),
),
}
}
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::delimit(
"(",
self.refined_pretty_debug(PrettyDebugRefineKind::WithContext, source),
")",
)
}
}


@ -0,0 +1,41 @@
use crate::hir::SpannedExpression;
use derive_new::new;
use getset::{Getters, MutGetters};
use nu_protocol::PathMember;
use nu_source::{b, DebugDocBuilder, PrettyDebug, PrettyDebugWithSource};
use serde::{Deserialize, Serialize};
#[derive(
Debug,
Clone,
Eq,
PartialEq,
Ord,
PartialOrd,
Hash,
Getters,
MutGetters,
Serialize,
Deserialize,
new,
)]
#[get = "pub"]
pub struct Path {
head: SpannedExpression,
#[get_mut = "pub(crate)"]
tail: Vec<PathMember>,
}
impl PrettyDebugWithSource for Path {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.head.pretty_debug(source)
+ b::operator(".")
+ b::intersperse(self.tail.iter().map(|m| m.pretty()), b::operator("."))
}
}
impl Path {
pub(crate) fn parts(self) -> (SpannedExpression, Vec<PathMember>) {
(self.head, self.tail)
}
}


@ -0,0 +1,33 @@
use crate::hir::SpannedExpression;
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Span};
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
pub struct Range {
#[get = "pub"]
left: SpannedExpression,
#[get = "pub"]
dotdot: Span,
#[get = "pub"]
right: SpannedExpression,
}
impl PrettyDebugWithSource for Range {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::delimit(
"<",
self.left.pretty_debug(source)
+ b::space()
+ b::keyword(self.dotdot.slice(source))
+ b::space()
+ self.right.pretty_debug(source),
">",
)
.group()
}
}


@ -0,0 +1,475 @@
use crate::hir;
use crate::hir::syntax_shape::{
expand_atom, expand_syntax, BareShape, ExpandContext, ExpandSyntax, ExpansionRule,
UnspannedAtomicToken, WhitespaceShape,
};
use crate::hir::tokens_iterator::TokensIterator;
use crate::parse::comment::Comment;
use derive_new::new;
use nu_errors::ParseError;
use nu_protocol::{RowType, SpannedTypeName, Type};
use nu_source::{
b, DebugDocBuilder, HasFallibleSpan, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem,
};
use std::fmt::Debug;
// A Signature is a command without an implementation.
//
// In Nu, a command is a function combined with macro expansion rules.
//
// def cd
// # Change to a new path.
// optional directory(Path) # the directory to change to
// end
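//
// As a hypothetical sketch, such a definition would correspond to a signature
// built roughly like the one below (the builder chain mirrors how
// nu_protocol::Signature is used elsewhere in this file; the exact call for
// the optional parameter is an assumption, not verbatim from this crate):
//
//     nu_protocol::Signature::new("cd")
//         .desc("Change to a new path.")
//         .optional("directory", SyntaxShape::Path, "the directory to change to")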
#[derive(new)]
struct Expander<'a, 'b, 'c, 'd> {
iterator: &'b mut TokensIterator<'a>,
context: &'d ExpandContext<'c>,
}
impl<'a, 'b, 'c, 'd> Expander<'a, 'b, 'c, 'd> {
fn expand<O>(&mut self, syntax: impl ExpandSyntax<Output = O>) -> Result<O, ParseError>
where
O: HasFallibleSpan + Clone + std::fmt::Debug + 'static,
{
expand_syntax(&syntax, self.iterator, self.context)
}
fn optional<O>(&mut self, syntax: impl ExpandSyntax<Output = O>) -> Option<O>
where
O: HasFallibleSpan + Clone + std::fmt::Debug + 'static,
{
match expand_syntax(&syntax, self.iterator, self.context) {
Err(_) => None,
Ok(value) => Some(value),
}
}
fn pos(&mut self) -> Span {
self.iterator.span_at_cursor()
}
fn slice_string(&mut self, span: impl Into<Span>) -> String {
span.into().slice(self.context.source()).to_string()
}
}
#[derive(Debug, Copy, Clone)]
struct SignatureShape;
impl ExpandSyntax for SignatureShape {
type Output = hir::Signature;
fn name(&self) -> &'static str {
"signature"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let mut expander = Expander::new(token_nodes, context);
let start = expander.pos();
expander.expand(keyword("def"))?;
expander.expand(WhitespaceShape)?;
let name = expander.expand(BareShape)?;
expander.expand(SeparatorShape)?;
let usage = expander.expand(CommentShape)?;
expander.expand(SeparatorShape)?;
let end = expander.pos();
Ok(hir::Signature::new(
nu_protocol::Signature::new(&name.word).desc(expander.slice_string(usage.text)),
start.until(end),
))
})
}
}
fn keyword(kw: &'static str) -> KeywordShape {
KeywordShape { keyword: kw }
}
#[derive(Debug, Copy, Clone)]
struct KeywordShape {
keyword: &'static str,
}
impl ExpandSyntax for KeywordShape {
type Output = Span;
fn name(&self) -> &'static str {
"keyword"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "keyword", context, ExpansionRule::new())?;
if let UnspannedAtomicToken::Word { text } = &atom.unspanned {
let word = text.slice(context.source());
if word == self.keyword {
return Ok(atom.span);
}
}
Err(ParseError::mismatch(self.keyword, atom.spanned_type_name()))
}
}
#[derive(Debug, Copy, Clone)]
struct SeparatorShape;
impl ExpandSyntax for SeparatorShape {
type Output = Span;
fn name(&self) -> &'static str {
"separator"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "separator", context, ExpansionRule::new())?;
match &atom.unspanned {
UnspannedAtomicToken::Separator { text } => Ok(*text),
_ => Err(ParseError::mismatch("separator", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone)]
struct CommentShape;
impl ExpandSyntax for CommentShape {
type Output = Comment;
fn name(&self) -> &'static str {
"comment"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "comment", context, ExpansionRule::new())?;
match &atom.unspanned {
UnspannedAtomicToken::Comment { body } => Ok(Comment::line(body, atom.span)),
_ => Err(ParseError::mismatch("comment", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone, new)]
struct TupleShape<A, B> {
first: A,
second: B,
}
#[derive(Debug, Clone, new)]
struct TupleSyntax<A, B> {
first: A,
second: B,
}
impl<A, B> PrettyDebugWithSource for TupleSyntax<A, B>
where
A: PrettyDebugWithSource,
B: PrettyDebugWithSource,
{
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"pair",
self.first.pretty_debug(source) + b::space() + self.second.pretty_debug(source),
)
}
}
impl<A, B> HasFallibleSpan for TupleSyntax<A, B>
where
A: HasFallibleSpan + Debug + Clone,
B: HasFallibleSpan + Debug + Clone,
{
fn maybe_span(&self) -> Option<Span> {
match (self.first.maybe_span(), self.second.maybe_span()) {
(Some(first), Some(second)) => Some(first.until(second)),
(Some(first), None) => Some(first),
(None, Some(second)) => Some(second),
(None, None) => None,
}
}
}
impl<A, B, AOut, BOut> ExpandSyntax for TupleShape<A, B>
where
A: ExpandSyntax<Output = AOut> + Debug + Copy,
B: ExpandSyntax<Output = BOut> + Debug + Copy,
AOut: HasFallibleSpan + Debug + Clone + 'static,
BOut: HasFallibleSpan + Debug + Clone + 'static,
{
type Output = TupleSyntax<AOut, BOut>;
fn name(&self) -> &'static str {
"pair"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let first = expand_syntax(&self.first, token_nodes, context)?;
let second = expand_syntax(&self.second, token_nodes, context)?;
Ok(TupleSyntax { first, second })
})
}
}
#[derive(Debug, Clone)]
pub struct PositionalParam {
optional: Option<Span>,
name: Identifier,
ty: Spanned<Type>,
desc: Spanned<String>,
span: Span,
}
impl HasSpan for PositionalParam {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for PositionalParam {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
(match self.optional {
Some(_) => b::description("optional") + b::space(),
None => b::blank(),
}) + self.ty.pretty_debug(source)
}
}
#[derive(Debug, Copy, Clone)]
pub struct PositionalParamShape;
impl ExpandSyntax for PositionalParamShape {
type Output = PositionalParam;
fn name(&self) -> &'static str {
"positional param"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let mut expander = Expander::new(token_nodes, context);
let optional = expander
.optional(TupleShape::new(keyword("optional"), WhitespaceShape))
.map(|s| s.first);
let name = expander.expand(IdentifierShape)?;
expander.optional(WhitespaceShape);
let _ty = expander.expand(TypeShape)?;
Ok(PositionalParam {
optional,
name,
ty: Type::Nothing.spanned(Span::unknown()),
desc: String::new().spanned(Span::unknown()),
span: Span::unknown(),
})
})
}
}
#[derive(Debug, Clone)]
struct Identifier {
body: String,
span: Span,
}
impl HasSpan for Identifier {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for Identifier {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed("id", b::description(self.span.slice(source)))
}
}
#[derive(Debug, Copy, Clone)]
struct IdentifierShape;
impl ExpandSyntax for IdentifierShape {
type Output = Identifier;
fn name(&self) -> &'static str {
"identifier"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "identifier", context, ExpansionRule::new())?;
if let UnspannedAtomicToken::Word { text } = atom.unspanned {
let body = text.slice(context.source());
if is_id(body) {
return Ok(Identifier {
body: body.to_string(),
span: text,
});
}
}
Err(ParseError::mismatch("identifier", atom.spanned_type_name()))
}
}
fn is_id(input: &str) -> bool {
let source = nu_source::nom_input(input);
match crate::parse::parser::ident(source) {
Err(_) => false,
Ok((input, _)) => input.fragment.is_empty(),
}
}
#[derive(Debug, Clone, new)]
struct TypeSyntax {
ty: Type,
span: Span,
}
impl HasSpan for TypeSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for TypeSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.ty.pretty_debug(source)
}
}
#[derive(Debug, Copy, Clone)]
struct TypeShape;
impl ExpandSyntax for TypeShape {
type Output = TypeSyntax;
fn name(&self) -> &'static str {
"type"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "type", context, ExpansionRule::new())?;
match atom.unspanned {
UnspannedAtomicToken::Word { text } => {
let word = text.slice(context.source());
Ok(TypeSyntax::new(
match word {
"nothing" => Type::Nothing,
"integer" => Type::Int,
"decimal" => Type::Decimal,
"bytesize" => Type::Bytesize,
"string" => Type::String,
"column-path" => Type::ColumnPath,
"pattern" => Type::Pattern,
"boolean" => Type::Boolean,
"date" => Type::Date,
"duration" => Type::Duration,
"filename" => Type::Path,
"binary" => Type::Binary,
"row" => Type::Row(RowType::new()),
"table" => Type::Table(vec![]),
"block" => Type::Block,
_ => return Err(ParseError::mismatch("type", atom.spanned_type_name())),
},
atom.span,
))
}
_ => Err(ParseError::mismatch("type", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone)]
struct TypeAnnotation;
impl ExpandSyntax for TypeAnnotation {
type Output = TypeSyntax;
fn name(&self) -> &'static str {
"type annotation"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(
token_nodes,
"type annotation",
context,
ExpansionRule::new(),
)?;
match atom.unspanned {
UnspannedAtomicToken::RoundDelimited { nodes, .. } => {
token_nodes.atomic_parse(|token_nodes| {
token_nodes.child(
(&nodes[..]).spanned(atom.span),
context.source().clone(),
|token_nodes| {
let ty = expand_syntax(&TypeShape, token_nodes, context)?;
let next = token_nodes.peek_non_ws();
match next.node {
None => Ok(ty),
Some(node) => {
Err(ParseError::extra_tokens(node.spanned_type_name()))
}
}
},
)
})
}
_ => Err(ParseError::mismatch(
"type annotation",
atom.spanned_type_name(),
)),
}
}
}


@ -0,0 +1,680 @@
#![allow(clippy::large_enum_variant, clippy::type_complexity)]
mod block;
mod expression;
pub mod flat_shape;
use crate::commands::classified::internal::InternalCommand;
use crate::commands::classified::{ClassifiedCommand, ClassifiedPipeline};
use crate::commands::external_command;
use crate::hir;
use crate::hir::syntax_shape::block::CoerceBlockShape;
use crate::hir::syntax_shape::expression::range::RangeShape;
use crate::hir::syntax_shape::flat_shape::ShapeResult;
use crate::hir::tokens_iterator::TokensIterator;
use crate::hir::{Expression, SpannedExpression};
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::{
ExternalCommandType, PipelineType, SpannedToken, Token, WhitespaceType, WordType,
};
use crate::parse_command::parse_command_tail;
use derive_new::new;
use getset::Getters;
use nu_errors::ParseError;
use nu_protocol::{ShellTypeName, Signature, SpannedTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
Tag, TaggedItem, Text,
};
use std::path::{Path, PathBuf};
pub(crate) use self::expression::delimited::DelimitedSquareShape;
pub(crate) use self::expression::file_path::{ExternalWordShape, FilePathShape};
pub(crate) use self::expression::list::{BackoffColoringMode, ExpressionListShape};
pub(crate) use self::expression::number::{
DecimalShape, IntExpressionShape, IntShape, NumberExpressionShape, NumberShape,
};
pub(crate) use self::expression::pattern::{PatternExpressionShape, PatternShape};
pub(crate) use self::expression::string::{CoerceStringShape, StringExpressionShape, StringShape};
pub(crate) use self::expression::unit::UnitExpressionShape;
pub(crate) use self::expression::variable_path::{
ColumnPathShape, ColumnPathSyntax, ExpressionContinuationShape, Member, MemberShape,
PathTailShape, PathTailSyntax, VariablePathShape, VariableShape,
};
pub(crate) use self::expression::{AnyExpressionShape, AnyExpressionStartShape};
pub(crate) use self::flat_shape::FlatShape;
use nu_protocol::SyntaxShape;
use std::fmt::Debug;
impl ExpandSyntax for SyntaxShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
match self {
SyntaxShape::Any => "shape[any]",
SyntaxShape::Int => "shape[integer]",
SyntaxShape::Range => "shape[range]",
SyntaxShape::String => "shape[string]",
SyntaxShape::Member => "shape[column name]",
SyntaxShape::ColumnPath => "shape[column path]",
SyntaxShape::Number => "shape[number]",
SyntaxShape::Path => "shape[file path]",
SyntaxShape::Pattern => "shape[glob pattern]",
SyntaxShape::Block => "shape[block]",
}
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
match self {
SyntaxShape::Any => token_nodes.expand_syntax(AnyExpressionShape),
SyntaxShape::Int => token_nodes
.expand_syntax(IntExpressionShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::Range => token_nodes
.expand_syntax(RangeShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::String => token_nodes
.expand_syntax(CoerceStringShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::Member => {
let syntax = token_nodes.expand_syntax(MemberShape)?;
Ok(syntax.to_expr())
}
SyntaxShape::ColumnPath => {
let column_path = token_nodes.expand_syntax(ColumnPathShape)?;
let ColumnPathSyntax {
path: column_path,
tag,
} = column_path;
Ok(Expression::column_path(column_path).into_expr(tag.span))
}
SyntaxShape::Number => token_nodes
.expand_syntax(NumberExpressionShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::Path => token_nodes
.expand_syntax(FilePathShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::Pattern => token_nodes
.expand_syntax(PatternShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
SyntaxShape::Block => token_nodes
.expand_syntax(CoerceBlockShape)
.or_else(|_| token_nodes.expand_syntax(VariablePathShape)),
}
}
}
pub trait SignatureRegistry: Debug {
fn has(&self, name: &str) -> bool;
fn get(&self, name: &str) -> Option<Signature>;
fn clone_box(&self) -> Box<dyn SignatureRegistry>;
}
impl SignatureRegistry for Box<dyn SignatureRegistry> {
fn has(&self, name: &str) -> bool {
(&**self).has(name)
}
fn get(&self, name: &str) -> Option<Signature> {
(&**self).get(name)
}
fn clone_box(&self) -> Box<dyn SignatureRegistry> {
(&**self).clone_box()
}
}
#[derive(Debug, Getters, new)]
pub struct ExpandContext<'context> {
#[get = "pub(crate)"]
pub registry: Box<dyn SignatureRegistry>,
pub source: &'context Text,
pub homedir: Option<PathBuf>,
}
impl<'context> ExpandContext<'context> {
pub(crate) fn homedir(&self) -> Option<&Path> {
self.homedir.as_ref().map(|h| h.as_path())
}
pub(crate) fn source(&self) -> &'context Text {
self.source
}
}
pub trait ExpandSyntax: std::fmt::Debug + Clone {
type Output: Clone + std::fmt::Debug + 'static;
fn name(&self) -> &'static str;
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Self::Output;
}
pub fn fallible<T, S>(syntax: S) -> FallibleSyntax<S>
where
T: Clone + Debug + 'static,
S: ExpandSyntax<Output = T>,
{
FallibleSyntax { inner: syntax }
}
#[derive(Debug, Copy, Clone)]
pub struct FallibleSyntax<I> {
inner: I,
}
impl<I, T> ExpandSyntax for FallibleSyntax<I>
where
I: ExpandSyntax<Output = T>,
T: Clone + Debug + 'static,
{
type Output = Result<T, ParseError>;
fn name(&self) -> &'static str {
"fallible"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<T, ParseError> {
Ok(self.inner.expand(token_nodes))
}
}
#[derive(Debug, Clone)]
enum BarePathState {
Initial,
Seen(Span, Span),
Error(ParseError),
}
impl BarePathState {
pub fn seen(self, span: Span) -> BarePathState {
match self {
BarePathState::Initial => BarePathState::Seen(span, span),
BarePathState::Seen(start, _) => BarePathState::Seen(start, span),
BarePathState::Error(err) => BarePathState::Error(err),
}
}
pub fn end(self, node: Option<&SpannedToken>, expected: &'static str) -> BarePathState {
match self {
BarePathState::Initial => match node {
None => BarePathState::Error(ParseError::unexpected_eof(expected, Span::unknown())),
Some(token) => {
BarePathState::Error(ParseError::mismatch(expected, token.spanned_type_name()))
}
},
BarePathState::Seen(start, end) => BarePathState::Seen(start, end),
BarePathState::Error(err) => BarePathState::Error(err),
}
}
pub fn into_bare(self) -> Result<Span, ParseError> {
match self {
BarePathState::Initial => unreachable!("into_bare in initial state"),
BarePathState::Seen(start, end) => Ok(start.until(end)),
BarePathState::Error(err) => Err(err),
}
}
}
pub fn expand_bare(
token_nodes: &'_ mut TokensIterator<'_>,
predicate: impl Fn(&SpannedToken) -> bool,
) -> Result<Span, ParseError> {
let mut state = BarePathState::Initial;
loop {
if token_nodes.at_end() {
state = state.end(None, "word");
break;
}
let source = token_nodes.source();
let mut peeked = token_nodes.peek();
let node = peeked.node;
match node {
Some(token) if predicate(token) => {
peeked.commit();
state = state.clone().seen(token.span());
let shapes = FlatShape::shapes(token, &source);
token_nodes.color_shapes(shapes);
}
token => {
state = state.clone().end(token, "word");
break;
}
}
}
state.into_bare()
}
#[derive(Debug, Copy, Clone)]
pub struct BareExpressionShape;
impl ExpandSyntax for BareExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"bare expression"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_syntax(BarePathShape)
.map(|span| Expression::bare().into_expr(span))
}
}
#[derive(Debug, Copy, Clone)]
pub struct BarePathShape;
impl ExpandSyntax for BarePathShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"bare path"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
expand_bare(token_nodes, |token| match token.unspanned() {
Token::Bare | Token::EvaluationOperator(EvaluationOperator::Dot) => true,
_ => false,
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct BareShape;
#[derive(Debug, Clone)]
pub struct BareSyntax {
pub word: String,
pub span: Span,
}
impl HasSpan for BareSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebug for BareSyntax {
fn pretty(&self) -> DebugDocBuilder {
b::primitive(&self.word)
}
}
impl ExpandSyntax for BareShape {
type Output = Result<BareSyntax, ParseError>;
fn name(&self) -> &'static str {
"word"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<BareSyntax, ParseError> {
let source = token_nodes.source();
token_nodes.expand_token(WordType, |span| {
Ok((
FlatShape::Word,
BareSyntax {
word: span.string(&source),
span,
},
))
})
}
}
#[derive(Debug, Clone)]
pub enum CommandSignature {
Internal(Spanned<Signature>),
LiteralExternal { outer: Span, inner: Span },
External(Span),
Expression(hir::SpannedExpression),
}
impl PrettyDebugWithSource for CommandSignature {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
CommandSignature::Internal(internal) => {
b::typed("command", b::description(&internal.name))
}
CommandSignature::LiteralExternal { outer, .. } => {
b::typed("command", b::description(outer.slice(source)))
}
CommandSignature::External(external) => b::typed(
"command",
b::description("^") + b::description(external.slice(source)),
),
CommandSignature::Expression(expr) => b::typed("command", expr.pretty_debug(source)),
}
}
}
impl HasSpan for CommandSignature {
fn span(&self) -> Span {
match self {
CommandSignature::Internal(spanned) => spanned.span,
CommandSignature::LiteralExternal { outer, .. } => *outer,
CommandSignature::External(span) => *span,
CommandSignature::Expression(expr) => expr.span,
}
}
}
impl CommandSignature {
pub fn to_expression(&self) -> hir::SpannedExpression {
match self {
CommandSignature::Internal(command) => {
let span = command.span;
hir::Expression::Command(span).into_expr(span)
}
CommandSignature::LiteralExternal { outer, inner } => {
hir::Expression::ExternalCommand(hir::ExternalCommand::new(*inner))
.into_expr(*outer)
}
CommandSignature::External(span) => {
hir::Expression::ExternalCommand(hir::ExternalCommand::new(*span)).into_expr(*span)
}
CommandSignature::Expression(expr) => expr.clone(),
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct PipelineShape;
impl ExpandSyntax for PipelineShape {
type Output = ClassifiedPipeline;
fn name(&self) -> &'static str {
"pipeline"
}
fn expand<'content, 'me>(
&self,
token_nodes: &'me mut TokensIterator<'content>,
) -> ClassifiedPipeline {
if token_nodes.at_end() {
return ClassifiedPipeline::commands(vec![], Span::unknown());
}
let start = token_nodes.span_at_cursor();
// whitespace is allowed at the beginning
token_nodes.expand_infallible(MaybeSpaceShape);
let pipeline = token_nodes
.expand_token(PipelineType, |pipeline| Ok(((), pipeline)))
.expect("PipelineShape is only expected to be called with a Pipeline token");
let parts = &pipeline.parts[..];
let mut out = vec![];
for part in parts {
if let Some(span) = part.pipe {
token_nodes.color_shape(FlatShape::Pipe.spanned(span));
}
let tokens: Spanned<&[SpannedToken]> = part.tokens().spanned(part.span());
let (shapes, classified) = token_nodes.child(tokens, move |token_nodes| {
token_nodes.expand_infallible(ClassifiedCommandShape)
});
for shape in shapes {
match shape {
ShapeResult::Success(shape) => token_nodes.color_shape(shape),
ShapeResult::Fallback { shape, allowed } => {
token_nodes.color_err(shape, allowed)
}
}
}
out.push(classified);
}
token_nodes.expand_infallible(BackoffColoringMode::new(vec!["no more tokens".to_string()]));
let end = token_nodes.span_at_cursor();
ClassifiedPipeline::commands(out, start.until(end))
}
}
pub enum CommandHeadKind {
External,
Internal(Signature),
}
#[derive(Debug, Copy, Clone)]
pub struct CommandHeadShape;
impl ExpandSyntax for CommandHeadShape {
type Output = Result<CommandSignature, ParseError>;
fn name(&self) -> &'static str {
"command head"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<CommandSignature, ParseError> {
token_nodes.expand_infallible(MaybeSpaceShape);
let source = token_nodes.source();
let registry = &token_nodes.context().registry.clone_box();
token_nodes
.expand_token(ExternalCommandType, |(inner, outer)| {
Ok((
FlatShape::ExternalCommand,
CommandSignature::LiteralExternal { outer, inner },
))
})
.or_else(|_| {
token_nodes.expand_token(WordType, |span| {
let name = span.slice(&source);
if registry.has(name) {
let signature = registry.get(name).unwrap();
Ok((
FlatShape::InternalCommand,
CommandSignature::Internal(signature.spanned(span)),
))
} else {
Ok((FlatShape::ExternalCommand, CommandSignature::External(span)))
}
})
})
.or_else(|_| {
token_nodes
.expand_syntax(AnyExpressionShape)
.map(CommandSignature::Expression)
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct ClassifiedCommandShape;
impl ExpandSyntax for ClassifiedCommandShape {
type Output = ClassifiedCommand;
fn name(&self) -> &'static str {
"classified command"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> ClassifiedCommand {
let start = token_nodes.span_at_cursor();
let source = token_nodes.source();
let head = match token_nodes.expand_syntax(CommandHeadShape) {
Err(err) => {
token_nodes
.expand_infallible(BackoffColoringMode::new(vec!["command".to_string()]));
return ClassifiedCommand::Error(err);
}
Ok(head) => head,
};
match head {
CommandSignature::Expression(expr) => ClassifiedCommand::Error(ParseError::mismatch(
"command",
expr.type_name().spanned(expr.span),
)),
CommandSignature::External(name) => {
let name_str = name.slice(&source);
match external_command(token_nodes, name_str.tagged(name)) {
Err(err) => ClassifiedCommand::Error(err),
Ok(command) => command,
}
}
// If the command starts with `^`, treat it as an external command no matter what
CommandSignature::LiteralExternal { outer, inner } => {
let name_str = inner.slice(&source);
match external_command(token_nodes, name_str.tagged(outer)) {
Err(err) => ClassifiedCommand::Error(err),
Ok(command) => command,
}
}
CommandSignature::Internal(signature) => {
let tail = parse_command_tail(&signature.item, token_nodes, signature.span);
let tail = match tail {
Err(err) => {
return ClassifiedCommand::Error(err);
}
Ok(tail) => tail,
};
let (positional, named) = match tail {
None => (None, None),
Some((positional, named)) => (positional, named),
};
let end = token_nodes.span_at_cursor();
let expr = hir::Expression::Command(signature.span).into_expr(signature.span);
let call = hir::Call {
head: Box::new(expr),
positional,
named,
span: start.until(end),
};
ClassifiedCommand::Internal(InternalCommand::new(
signature.item.name.clone(),
Tag {
span: signature.span,
anchor: None,
},
call,
))
}
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct MaybeWhitespaceEof;
impl ExpandSyntax for MaybeWhitespaceEof {
type Output = Result<(), ParseError>;
fn name(&self) -> &'static str {
"<whitespace? eof>"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Self::Output {
token_nodes.atomic_parse(|token_nodes| {
token_nodes.expand_infallible(MaybeSpaceShape);
token_nodes.expand_syntax(EofShape)
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct EofShape;
impl ExpandSyntax for EofShape {
type Output = Result<(), ParseError>;
fn name(&self) -> &'static str {
"eof"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<(), ParseError> {
let next = token_nodes.peek();
let node = next.node;
match node {
None => Ok(()),
Some(node) => Err(ParseError::mismatch("eof", node.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct WhitespaceShape;
impl ExpandSyntax for WhitespaceShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"whitespace"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(WhitespaceType, |span| Ok((FlatShape::Whitespace, span)))
}
}
#[derive(Debug, Copy, Clone)]
pub struct MaybeSpaceShape;
impl ExpandSyntax for MaybeSpaceShape {
type Output = Option<Span>;
fn name(&self) -> &'static str {
"whitespace?"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Option<Span> {
let result = token_nodes.expand_token(WhitespaceType, |span| {
Ok((FlatShape::Whitespace, Some(span)))
});
// No space is acceptable, but we need to err inside expand_token so we don't
// consume the non-whitespace token
result.unwrap_or(None)
}
}
#[derive(Debug, Copy, Clone)]
pub struct SpaceShape;
#[derive(Debug, Copy, Clone)]
pub struct CommandShape;


@ -0,0 +1,159 @@
use crate::hir::Expression;
use crate::{
hir,
hir::syntax_shape::{
ExpandSyntax, ExpressionContinuationShape, MemberShape, PathTailShape, PathTailSyntax,
VariablePathShape,
},
hir::tokens_iterator::TokensIterator,
};
use hir::SpannedExpression;
use nu_errors::ParseError;
use nu_source::Span;
#[derive(Debug, Copy, Clone)]
pub struct CoerceBlockShape;
impl ExpandSyntax for CoerceBlockShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"any block"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
// is it just a block?
token_nodes
.expand_syntax(BlockShape)
.or_else(|_| token_nodes.expand_syntax(ShorthandBlockShape))
}
}
#[derive(Debug, Copy, Clone)]
pub struct BlockShape;
impl ExpandSyntax for BlockShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"block"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
let exprs = token_nodes.block()?;
Ok(hir::Expression::Block(exprs.item).into_expr(exprs.span))
}
}
#[derive(Debug, Copy, Clone)]
pub struct ShorthandBlockShape;
impl ExpandSyntax for ShorthandBlockShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"shorthand block"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
let mut current = token_nodes.expand_syntax(ShorthandPath)?;
loop {
match token_nodes.expand_syntax(ExpressionContinuationShape) {
Result::Err(_) => break,
Result::Ok(continuation) => current = continuation.append_to(current),
}
}
let span = current.span;
let block = hir::Expression::Block(vec![current]).into_expr(span);
Ok(block)
}
}
/// A shorthand for `$it.foo."bar"`, used inside of a shorthand block
#[derive(Debug, Copy, Clone)]
pub struct ShorthandPath;
impl ExpandSyntax for ShorthandPath {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"shorthand path"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
// if it's a variable path, that's the head part
let path = token_nodes.expand_syntax(VariablePathShape);
if let Ok(path) = path {
return Ok(path);
}
// Synthesize the head of the shorthand path (`<member>` -> `$it.<member>`)
let mut head = token_nodes.expand_syntax(ShorthandHeadShape)?;
// Now that we've synthesized the head of the path, proceed to expand the tail of the path
// like any other path.
let tail = token_nodes.expand_syntax(PathTailShape);
match tail {
Err(_) => Ok(head),
Ok(PathTailSyntax { tail, span }) => {
let span = head.span.until(span);
// For each member that `PathTailShape` expanded, join it onto the existing expression
// to form a new path
for member in tail {
head = Expression::dot_member(head, member).into_expr(span);
}
Ok(head)
}
}
}
}
/// A shorthand for `$it.foo."bar"`, used inside of a shorthand block
#[derive(Debug, Copy, Clone)]
pub struct ShorthandHeadShape;
impl ExpandSyntax for ShorthandHeadShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"shorthand head"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
let head = token_nodes.expand_syntax(MemberShape)?;
let head = head.to_path_member(&token_nodes.source());
// Synthesize an `$it` expression
let it = synthetic_it();
let span = head.span;
Ok(Expression::path(it, vec![head]).into_expr(span))
}
}
fn synthetic_it() -> hir::SpannedExpression {
Expression::it_variable(Span::unknown()).into_expr(Span::unknown())
}


@ -0,0 +1,72 @@
# Meaningful Primitive Tokens
- `int`
- `decimal`
- `op::name`
- `dot`
- `dotdot`
- `string`
- `var::it`
- `var::other`
- `external-command`
- `pattern::glob`
- `word`
- `comment`
- `whitespace`
- `separator`
- `longhand-flag`
- `shorthand-flag`
# Grouped Tokens
- `(call head ...tail)`
- `(list ...nodes)`
- `(paren ...nodes)`
- `(square ...nodes)`
- `(curly ...nodes)`
- `(pipeline ...elements) where elements: pipeline-element`
- `(pipeline-element pipe? token)`
# Atomic Tokens
- `(unit number unit) where number: number, unit: unit`
# Expression
```
start(ExpressionStart) continuation(ExpressionContinuation)* -> Expression
```
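The production above (one `ExpressionStart` followed by zero or more continuations) is the same shape that `AnyExpressionShape` implements later in this diff: parse a head, then fold continuations onto it. A self-contained, left-associative sketch with hypothetical types (no operator precedence):

```rust
// Illustrative sketch of the `start continuation*` rule: parse one atomic
// start, then fold each infix continuation onto the current expression.
// Types and token shapes here are stand-ins, not nu-parser's real API.
#[derive(Debug, PartialEq)]
enum Expr {
    Num(i64),
    Infix(Box<Expr>, char, Box<Expr>),
}

fn parse_expr(
    tokens: &mut std::iter::Peekable<std::slice::Iter<'_, String>>,
) -> Option<Expr> {
    // ExpressionStart: a single number token.
    let mut current = Expr::Num(tokens.next()?.parse().ok()?);
    // ExpressionContinuation*: while an operator follows, consume it plus a
    // right-hand side and fold the result onto `current` (left-associative).
    while let Some(op) = tokens.peek().and_then(|t| {
        let c = t.chars().next()?;
        "+-*".contains(c).then_some(c)
    }) {
        tokens.next();
        let rhs = Expr::Num(tokens.next()?.parse().ok()?);
        current = Expr::Infix(Box::new(current), op, Box::new(rhs));
    }
    Some(current)
}
```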
## ExpressionStart
```
word -> String
unit -> Unit
number -> Number
string -> String
var::it -> Var::It
var::other -> Var::Other
pattern::glob -> Pattern::Glob
square -> Array
```
## TightExpressionContinuation
```
dot AnyExpression -> Member
dotdot AnyExpression -> RangeContinuation
```
## InfixExpressionContinuation
```
whitespace op whitespace AnyExpression -> InfixContinuation
```
## Member
```
int -> Member::Int
word -> Member::Word
string -> Member::String
```


@ -0,0 +1,77 @@
pub(crate) mod delimited;
pub(crate) mod file_path;
pub(crate) mod list;
pub(crate) mod number;
pub(crate) mod pattern;
pub(crate) mod range;
pub(crate) mod string;
pub(crate) mod unit;
pub(crate) mod variable_path;
use crate::hir::syntax_shape::{
BareExpressionShape, DelimitedSquareShape, ExpandContext, ExpandSyntax,
ExpressionContinuationShape, NumberExpressionShape, PatternExpressionShape,
StringExpressionShape, UnitExpressionShape, VariableShape,
};
use crate::hir::{SpannedExpression, TokensIterator};
use nu_errors::ParseError;
use std::path::PathBuf;
#[derive(Debug, Copy, Clone)]
pub struct AnyExpressionShape;
impl ExpandSyntax for AnyExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"any expression"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
// Look for an atomic expression at the cursor
let mut current = token_nodes.expand_syntax(AnyExpressionStartShape)?;
loop {
match token_nodes.expand_syntax(ExpressionContinuationShape) {
Err(_) => return Ok(current),
Ok(continuation) => current = continuation.append_to(current),
}
}
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct AnyExpressionStartShape;
impl ExpandSyntax for AnyExpressionStartShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"any expression start"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_syntax(VariableShape)
.or_else(|_| token_nodes.expand_syntax(UnitExpressionShape))
.or_else(|_| token_nodes.expand_syntax(BareExpressionShape))
.or_else(|_| token_nodes.expand_syntax(PatternExpressionShape))
.or_else(|_| token_nodes.expand_syntax(NumberExpressionShape))
.or_else(|_| token_nodes.expand_syntax(StringExpressionShape))
.or_else(|_| token_nodes.expand_syntax(DelimitedSquareShape))
}
}
pub fn expand_file_path(string: &str, context: &ExpandContext) -> PathBuf {
let expanded = shellexpand::tilde_with_context(string, || context.homedir());
PathBuf::from(expanded.as_ref())
}


@ -0,0 +1,760 @@
use crate::hir::syntax_shape::FlatShape;
use crate::hir::syntax_shape::{
expand_syntax, expression::expand_file_path, parse_single_node, BarePathShape,
BarePatternShape, ExpandContext, UnitShape, UnitSyntax,
};
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parse::tokens::UnspannedToken;
use crate::parse::unit::Unit;
use crate::{
hir,
hir::{Expression, RawNumber, TokensIterator},
parse::flag::{Flag, FlagKind},
};
use nu_errors::{ParseError, ShellError};
use nu_protocol::{ShellTypeName, SpannedTypeName};
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
use std::ops::Deref;
#[derive(Debug, Clone)]
pub enum UnspannedAtomicToken<'tokens> {
Eof {
span: Span,
},
Error {
error: Spanned<ShellError>,
},
Number {
number: RawNumber,
},
Size {
number: RawNumber,
unit: Spanned<Unit>,
},
String {
body: Span,
},
ItVariable {
name: Span,
},
Variable {
name: Span,
},
ExternalCommand {
command: Span,
},
ExternalWord {
text: Span,
},
GlobPattern {
pattern: Span,
},
Word {
text: Span,
},
SquareDelimited {
spans: (Span, Span),
nodes: &'tokens Vec<TokenNode>,
},
#[allow(unused)]
RoundDelimited {
spans: (Span, Span),
nodes: &'tokens Vec<TokenNode>,
},
ShorthandFlag {
name: Span,
},
CompareOperator {
text: Span,
},
Dot {
text: Span,
},
DotDot {
text: Span,
},
Whitespace {
text: Span,
},
Separator {
text: Span,
},
Comment {
body: Span,
},
}
impl<'tokens> UnspannedAtomicToken<'tokens> {
pub fn into_atomic_token(self, span: impl Into<Span>) -> AtomicToken<'tokens> {
AtomicToken {
unspanned: self,
span: span.into(),
}
}
}
impl<'tokens> ShellTypeName for AtomicToken<'tokens> {
fn type_name(&self) -> &'static str {
self.unspanned.type_name()
}
}
impl<'tokens> ShellTypeName for UnspannedAtomicToken<'tokens> {
fn type_name(&self) -> &'static str {
match &self {
UnspannedAtomicToken::Eof { .. } => "eof",
UnspannedAtomicToken::Error { .. } => "error",
UnspannedAtomicToken::CompareOperator { .. } => "compare operator",
UnspannedAtomicToken::ShorthandFlag { .. } => "shorthand flag",
UnspannedAtomicToken::Whitespace { .. } => "whitespace",
UnspannedAtomicToken::Separator { .. } => "separator",
UnspannedAtomicToken::Comment { .. } => "comment",
UnspannedAtomicToken::Dot { .. } => "dot",
UnspannedAtomicToken::DotDot { .. } => "dotdot",
UnspannedAtomicToken::Number { .. } => "number",
UnspannedAtomicToken::Size { .. } => "size",
UnspannedAtomicToken::String { .. } => "string",
UnspannedAtomicToken::ItVariable { .. } => "$it",
UnspannedAtomicToken::Variable { .. } => "variable",
UnspannedAtomicToken::ExternalCommand { .. } => "external command",
UnspannedAtomicToken::ExternalWord { .. } => "external word",
UnspannedAtomicToken::GlobPattern { .. } => "file pattern",
UnspannedAtomicToken::Word { .. } => "word",
UnspannedAtomicToken::SquareDelimited { .. } => "array literal",
UnspannedAtomicToken::RoundDelimited { .. } => "paren delimited",
}
}
}
#[derive(Debug, Clone)]
pub struct AtomicToken<'tokens> {
pub unspanned: UnspannedAtomicToken<'tokens>,
pub span: Span,
}
impl<'tokens> HasSpan for AtomicToken<'tokens> {
fn span(&self) -> Span {
self.span
}
}
impl<'tokens> Deref for AtomicToken<'tokens> {
type Target = UnspannedAtomicToken<'tokens>;
fn deref(&self) -> &UnspannedAtomicToken<'tokens> {
&self.unspanned
}
}
impl<'tokens> AtomicToken<'tokens> {
pub fn to_hir(
&self,
context: &ExpandContext,
expected: &'static str,
) -> Result<hir::Expression, ParseError> {
Ok(match &self.unspanned {
UnspannedAtomicToken::Eof { .. } => {
return Err(ParseError::mismatch(
expected,
"eof atomic token".spanned(self.span),
))
}
UnspannedAtomicToken::Error { .. } => {
return Err(ParseError::mismatch(expected, "error".spanned(self.span)))
}
UnspannedAtomicToken::RoundDelimited { .. }
| UnspannedAtomicToken::CompareOperator { .. }
| UnspannedAtomicToken::ShorthandFlag { .. }
| UnspannedAtomicToken::Whitespace { .. }
| UnspannedAtomicToken::Separator { .. }
| UnspannedAtomicToken::Comment { .. }
| UnspannedAtomicToken::Dot { .. }
| UnspannedAtomicToken::DotDot { .. }
| UnspannedAtomicToken::SquareDelimited { .. } => {
return Err(ParseError::mismatch(expected, self.spanned_type_name()));
}
UnspannedAtomicToken::Number { number } => {
Expression::number(number.to_number(context.source), self.span)
}
UnspannedAtomicToken::Size { number, unit } => {
Expression::size(number.to_number(context.source), **unit, self.span)
}
UnspannedAtomicToken::String { body } => Expression::string(*body, self.span),
UnspannedAtomicToken::ItVariable { name } => Expression::it_variable(*name, self.span),
UnspannedAtomicToken::Variable { name } => Expression::variable(*name, self.span),
UnspannedAtomicToken::ExternalCommand { command } => {
Expression::external_command(*command, self.span)
}
UnspannedAtomicToken::ExternalWord { text } => Expression::string(*text, self.span),
UnspannedAtomicToken::GlobPattern { pattern } => Expression::pattern(
expand_file_path(pattern.slice(context.source), context).to_string_lossy(),
self.span,
),
UnspannedAtomicToken::Word { text } => Expression::string(*text, *text),
})
}
pub(crate) fn color_tokens(&self, shapes: &mut Vec<Spanned<FlatShape>>) {
match &self.unspanned {
UnspannedAtomicToken::Eof { .. } => {}
UnspannedAtomicToken::Error { .. } => shapes.push(FlatShape::Error.spanned(self.span)),
UnspannedAtomicToken::CompareOperator { .. } => {
shapes.push(FlatShape::CompareOperator.spanned(self.span))
}
UnspannedAtomicToken::ShorthandFlag { .. } => {
shapes.push(FlatShape::ShorthandFlag.spanned(self.span))
}
UnspannedAtomicToken::Whitespace { .. } => {
shapes.push(FlatShape::Whitespace.spanned(self.span))
}
UnspannedAtomicToken::Number {
number: RawNumber::Decimal(_),
} => shapes.push(FlatShape::Decimal.spanned(self.span)),
UnspannedAtomicToken::Number {
number: RawNumber::Int(_),
} => shapes.push(FlatShape::Int.spanned(self.span)),
UnspannedAtomicToken::Size { number, unit } => shapes.push(
FlatShape::Size {
number: number.span(),
unit: unit.span,
}
.spanned(self.span),
),
UnspannedAtomicToken::String { .. } => {
shapes.push(FlatShape::String.spanned(self.span))
}
UnspannedAtomicToken::ItVariable { .. } => {
shapes.push(FlatShape::ItVariable.spanned(self.span))
}
UnspannedAtomicToken::Variable { .. } => {
shapes.push(FlatShape::Variable.spanned(self.span))
}
UnspannedAtomicToken::ExternalCommand { .. } => {
shapes.push(FlatShape::ExternalCommand.spanned(self.span))
}
UnspannedAtomicToken::ExternalWord { .. } => {
shapes.push(FlatShape::ExternalWord.spanned(self.span))
}
UnspannedAtomicToken::GlobPattern { .. } => {
shapes.push(FlatShape::GlobPattern.spanned(self.span))
}
UnspannedAtomicToken::Word { .. } => shapes.push(FlatShape::Word.spanned(self.span)),
_ => shapes.push(FlatShape::Error.spanned(self.span)),
}
}
}
impl PrettyDebugWithSource for AtomicToken<'_> {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
fn atom(value: DebugDocBuilder) -> DebugDocBuilder {
b::delimit("(", b::kind("atom") + b::space() + value.group(), ")").group()
}
fn atom_kind(kind: impl std::fmt::Display, value: DebugDocBuilder) -> DebugDocBuilder {
b::delimit(
"(",
(b::kind("atom") + b::delimit("[", b::kind(kind), "]")).group()
+ b::space()
+ value.group(),
")",
)
.group()
}
atom(match &self.unspanned {
UnspannedAtomicToken::Eof { .. } => b::description("eof"),
UnspannedAtomicToken::Error { .. } => b::error("error"),
UnspannedAtomicToken::Number { number } => number.pretty_debug(source),
UnspannedAtomicToken::Size { number, unit } => {
number.pretty_debug(source) + b::keyword(unit.span.slice(source))
}
UnspannedAtomicToken::String { body } => b::primitive(body.slice(source)),
UnspannedAtomicToken::ItVariable { .. } | UnspannedAtomicToken::Variable { .. } => {
b::keyword(self.span.slice(source))
}
UnspannedAtomicToken::ExternalCommand { .. } => b::primitive(self.span.slice(source)),
UnspannedAtomicToken::ExternalWord { text } => {
atom_kind("external word", b::primitive(text.slice(source)))
}
UnspannedAtomicToken::GlobPattern { pattern } => {
atom_kind("pattern", b::primitive(pattern.slice(source)))
}
UnspannedAtomicToken::Word { text } => {
atom_kind("word", b::primitive(text.slice(source)))
}
UnspannedAtomicToken::SquareDelimited { nodes, .. } => b::delimit(
"[",
b::intersperse_with_source(nodes.iter(), b::space(), source),
"]",
),
UnspannedAtomicToken::RoundDelimited { nodes, .. } => b::delimit(
"(",
b::intersperse_with_source(nodes.iter(), b::space(), source),
")",
),
UnspannedAtomicToken::ShorthandFlag { name } => {
atom_kind("shorthand flag", b::key(name.slice(source)))
}
UnspannedAtomicToken::Dot { .. } => atom(b::kind("dot")),
UnspannedAtomicToken::DotDot { .. } => atom(b::kind("dotdot")),
UnspannedAtomicToken::CompareOperator { text } => {
atom_kind("operator", b::keyword(text.slice(source)))
}
UnspannedAtomicToken::Whitespace { text } => atom_kind(
"whitespace",
b::description(format!("{:?}", text.slice(source))),
),
UnspannedAtomicToken::Separator { text } => atom_kind(
"separator",
b::description(format!("{:?}", text.slice(source))),
),
UnspannedAtomicToken::Comment { body } => {
atom_kind("comment", b::description(body.slice(source)))
}
})
}
}
#[derive(Debug)]
pub enum WhitespaceHandling {
#[allow(unused)]
AllowWhitespace,
RejectWhitespace,
}
#[derive(Debug)]
pub struct ExpansionRule {
pub(crate) allow_external_command: bool,
pub(crate) allow_external_word: bool,
pub(crate) allow_cmp_operator: bool,
pub(crate) allow_eval_operator: bool,
pub(crate) allow_eof: bool,
pub(crate) allow_separator: bool,
pub(crate) treat_size_as_word: bool,
pub(crate) separate_members: bool,
pub(crate) commit_errors: bool,
pub(crate) whitespace: WhitespaceHandling,
pub(crate) allow_comments: bool,
}
impl ExpansionRule {
pub fn new() -> ExpansionRule {
ExpansionRule {
allow_external_command: false,
allow_external_word: false,
allow_eval_operator: false,
allow_cmp_operator: false,
allow_eof: false,
treat_size_as_word: false,
separate_members: false,
commit_errors: false,
allow_separator: false,
whitespace: WhitespaceHandling::RejectWhitespace,
allow_comments: false,
}
}
/// The intent of permissive mode is to return an atomic token for every possible
/// input token. This is important for error-correcting parsing, such as the
/// syntax highlighter.
pub fn permissive() -> ExpansionRule {
ExpansionRule {
allow_external_command: true,
allow_external_word: true,
allow_cmp_operator: true,
allow_eval_operator: true,
allow_eof: true,
separate_members: false,
treat_size_as_word: false,
commit_errors: true,
allow_separator: true,
allow_comments: true,
whitespace: WhitespaceHandling::AllowWhitespace,
}
}
#[allow(unused)]
pub fn allow_external_command(mut self) -> ExpansionRule {
self.allow_external_command = true;
self
}
#[allow(unused)]
pub fn allow_cmp_operator(mut self) -> ExpansionRule {
self.allow_cmp_operator = true;
self
}
#[allow(unused)]
pub fn no_cmp_operator(mut self) -> ExpansionRule {
self.allow_cmp_operator = false;
self
}
#[allow(unused)]
pub fn allow_eval_operator(mut self) -> ExpansionRule {
self.allow_eval_operator = true;
self
}
#[allow(unused)]
pub fn no_operator(mut self) -> ExpansionRule {
self.allow_eval_operator = false;
self
}
#[allow(unused)]
pub fn no_external_command(mut self) -> ExpansionRule {
self.allow_external_command = false;
self
}
#[allow(unused)]
pub fn allow_external_word(mut self) -> ExpansionRule {
self.allow_external_word = true;
self
}
#[allow(unused)]
pub fn no_external_word(mut self) -> ExpansionRule {
self.allow_external_word = false;
self
}
#[allow(unused)]
pub fn treat_size_as_word(mut self) -> ExpansionRule {
self.treat_size_as_word = true;
self
}
#[allow(unused)]
pub fn separate_members(mut self) -> ExpansionRule {
self.separate_members = true;
self
}
#[allow(unused)]
pub fn no_separate_members(mut self) -> ExpansionRule {
self.separate_members = false;
self
}
#[allow(unused)]
pub fn commit_errors(mut self) -> ExpansionRule {
self.commit_errors = true;
self
}
#[allow(unused)]
pub fn allow_whitespace(mut self) -> ExpansionRule {
self.whitespace = WhitespaceHandling::AllowWhitespace;
self
}
#[allow(unused)]
pub fn reject_whitespace(mut self) -> ExpansionRule {
self.whitespace = WhitespaceHandling::RejectWhitespace;
self
}
#[allow(unused)]
pub fn allow_separator(mut self) -> ExpansionRule {
self.allow_separator = true;
self
}
#[allow(unused)]
pub fn reject_separator(mut self) -> ExpansionRule {
self.allow_separator = false;
self
}
#[allow(unused)]
pub fn allow_comments(mut self) -> ExpansionRule {
self.allow_comments = true;
self
}
#[allow(unused)]
pub fn reject_comments(mut self) -> ExpansionRule {
self.allow_comments = false;
self
}
}
pub fn expand_atom<'me, 'content>(
token_nodes: &'me mut TokensIterator<'content>,
expected: &'static str,
context: &ExpandContext,
rule: ExpansionRule,
) -> Result<AtomicToken<'content>, ParseError> {
token_nodes.with_expand_tracer(|_, tracer| tracer.start("atom"));
let result = expand_atom_inner(token_nodes, expected, context, rule);
token_nodes.with_expand_tracer(|_, tracer| match &result {
Ok(result) => {
tracer.add_result(result.clone());
tracer.success();
}
Err(err) => tracer.failed(err),
});
result
}
/// If the caller of `expand_atom` throws away the returned atomic token, it
/// must use a checkpoint to roll the token stream back.
fn expand_atom_inner<'me, 'content>(
token_nodes: &'me mut TokensIterator<'content>,
expected: &'static str,
context: &ExpandContext,
rule: ExpansionRule,
) -> Result<AtomicToken<'content>, ParseError> {
if token_nodes.at_end() {
if rule.allow_eof {
return Ok(UnspannedAtomicToken::Eof {
span: Span::unknown(),
}
.into_atomic_token(Span::unknown()));
} else {
return Err(ParseError::unexpected_eof("anything", Span::unknown()));
}
}
// First, we'll need to handle the situation where more than one token corresponds
// to a single atomic token
// If treat_size_as_word, don't try to parse the head of the token stream
// as a size.
if !rule.treat_size_as_word {
match expand_syntax(&UnitShape, token_nodes, context) {
// If the head of the stream isn't a valid unit, we'll try to parse
// it again next as a word
Err(_) => {}
// But if it was a valid unit, we're done here
Ok(UnitSyntax {
unit: (number, unit),
span,
}) => return Ok(UnspannedAtomicToken::Size { number, unit }.into_atomic_token(span)),
}
}
if rule.separate_members {
let mut next = token_nodes.peek_any();
match next.node {
Some(token) if token.is_word() => {
next.commit();
return Ok(UnspannedAtomicToken::Word { text: token.span() }
.into_atomic_token(token.span()));
}
Some(token) if token.is_int() => {
next.commit();
return Ok(UnspannedAtomicToken::Number {
number: RawNumber::Int(token.span()),
}
.into_atomic_token(token.span()));
}
_ => {}
}
}
// Try to parse the head of the stream as a bare path. A bare path includes
// words as well as `.`s, connected together without whitespace.
match expand_syntax(&BarePathShape, token_nodes, context) {
// If we didn't find a bare path
Err(_) => {}
Ok(span) => {
let next = token_nodes.peek_any();
match next.node {
Some(token) if token.is_pattern() => {
// if the very next token is a pattern, we're looking at a glob, not a
// word, and we should try to parse it as a glob next
}
_ => return Ok(UnspannedAtomicToken::Word { text: span }.into_atomic_token(span)),
}
}
}
// Try to parse the head of the stream as a pattern. A pattern includes
// words, words with `*` as well as `.`s, connected together without whitespace.
match expand_syntax(&BarePatternShape, token_nodes, context) {
// If we didn't find a bare pattern
Err(_) => {}
Ok(span) => {
return Ok(UnspannedAtomicToken::GlobPattern { pattern: span }.into_atomic_token(span))
}
}
// The next token corresponds to at most one atomic token
// We need to `peek` because `parse_single_node` doesn't cover all of the
// cases that `expand_atom` covers. We should probably collapse the two
// if possible.
let peeked = token_nodes.peek_any().not_eof(expected)?;
match peeked.node {
TokenNode::Token(_) => {
// handle this next
}
TokenNode::Error(error) => {
peeked.commit();
return Ok(UnspannedAtomicToken::Error {
error: error.clone(),
}
.into_atomic_token(error.span));
}
TokenNode::Separator(span) if rule.allow_separator => {
peeked.commit();
return Ok(UnspannedAtomicToken::Separator { text: *span }.into_atomic_token(span));
}
TokenNode::Comment(comment) if rule.allow_comments => {
peeked.commit();
return Ok(UnspannedAtomicToken::Comment { body: comment.text }
.into_atomic_token(comment.span()));
}
// [ ... ]
TokenNode::Delimited(Spanned {
item:
DelimitedNode {
delimiter: Delimiter::Square,
spans,
children,
},
span,
}) => {
peeked.commit();
let span = *span;
return Ok(UnspannedAtomicToken::SquareDelimited {
nodes: children,
spans: *spans,
}
.into_atomic_token(span));
}
TokenNode::Flag(Flag {
kind: FlagKind::Shorthand,
name,
span,
}) => {
peeked.commit();
return Ok(UnspannedAtomicToken::ShorthandFlag { name: *name }.into_atomic_token(*span));
}
TokenNode::Flag(Flag {
kind: FlagKind::Longhand,
name,
span,
}) => {
peeked.commit();
// `UnspannedAtomicToken` has no longhand variant, so longhand flags are
// also represented with the `ShorthandFlag` atomic token.
return Ok(UnspannedAtomicToken::ShorthandFlag { name: *name }.into_atomic_token(*span));
}
// If we see whitespace, process the whitespace according to the whitespace
// handling rules
TokenNode::Whitespace(span) => match rule.whitespace {
// if whitespace is allowed, return a whitespace token
WhitespaceHandling::AllowWhitespace => {
peeked.commit();
return Ok(
UnspannedAtomicToken::Whitespace { text: *span }.into_atomic_token(*span)
);
}
// if whitespace is disallowed, return an error
WhitespaceHandling::RejectWhitespace => {
return Err(ParseError::mismatch(expected, "whitespace".spanned(*span)))
}
},
other => {
let span = peeked.node.span();
peeked.commit();
return Ok(UnspannedAtomicToken::Error {
error: ShellError::type_error("token", other.type_name().spanned(span))
.spanned(span),
}
.into_atomic_token(span));
}
}
parse_single_node(token_nodes, expected, |token, token_span, err| {
Ok(match token {
// First, the error cases. Each error case corresponds to an expansion rule
// flag that can be used to allow the case
// rule.allow_cmp_operator
UnspannedToken::CompareOperator(_) if !rule.allow_cmp_operator => {
return Err(err.error())
}
// rule.allow_eval_operator
UnspannedToken::EvaluationOperator(_) if !rule.allow_eval_operator => {
return Err(err.error())
}
// rule.allow_external_command
UnspannedToken::ExternalCommand(_) if !rule.allow_external_command => {
return Err(ParseError::mismatch(
expected,
token.type_name().spanned(token_span),
))
}
// rule.allow_external_word
UnspannedToken::ExternalWord if !rule.allow_external_word => {
return Err(ParseError::mismatch(
expected,
"external word".spanned(token_span),
))
}
UnspannedToken::Number(number) => {
UnspannedAtomicToken::Number { number }.into_atomic_token(token_span)
}
UnspannedToken::CompareOperator(_) => {
UnspannedAtomicToken::CompareOperator { text: token_span }
.into_atomic_token(token_span)
}
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => {
UnspannedAtomicToken::Dot { text: token_span }.into_atomic_token(token_span)
}
UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot) => {
UnspannedAtomicToken::DotDot { text: token_span }.into_atomic_token(token_span)
}
UnspannedToken::String(body) => {
UnspannedAtomicToken::String { body }.into_atomic_token(token_span)
}
UnspannedToken::Variable(name) if name.slice(context.source) == "it" => {
UnspannedAtomicToken::ItVariable { name }.into_atomic_token(token_span)
}
UnspannedToken::Variable(name) => {
UnspannedAtomicToken::Variable { name }.into_atomic_token(token_span)
}
UnspannedToken::ExternalCommand(command) => {
UnspannedAtomicToken::ExternalCommand { command }.into_atomic_token(token_span)
}
UnspannedToken::ExternalWord => UnspannedAtomicToken::ExternalWord { text: token_span }
.into_atomic_token(token_span),
UnspannedToken::GlobPattern => UnspannedAtomicToken::GlobPattern {
pattern: token_span,
}
.into_atomic_token(token_span),
UnspannedToken::Bare => {
UnspannedAtomicToken::Word { text: token_span }.into_atomic_token(token_span)
}
})
})
}


@ -0,0 +1,24 @@
use crate::hir::syntax_shape::ExpandSyntax;
use crate::hir::SpannedExpression;
use crate::{hir, hir::TokensIterator};
use nu_errors::ParseError;
#[derive(Debug, Copy, Clone)]
pub struct DelimitedSquareShape;
impl ExpandSyntax for DelimitedSquareShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"delimited square"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
let exprs = token_nodes.square()?;
Ok(hir::Expression::list(exprs.item).into_expr(exprs.span))
}
}


@ -0,0 +1,62 @@
use crate::hir::syntax_shape::{
expression::expand_file_path, BarePathShape, DecimalShape, ExpandContext, ExpandSyntax,
FlatShape, IntShape, StringShape,
};
use crate::hir::{Expression, SpannedExpression, TokensIterator};
use crate::parse::token_tree::ExternalWordType;
use nu_errors::ParseError;
use nu_source::{HasSpan, Span};
#[derive(Debug, Copy, Clone)]
pub struct FilePathShape;
impl ExpandSyntax for FilePathShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"file path"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_syntax(BarePathShape)
.or_else(|_| token_nodes.expand_syntax(ExternalWordShape))
.map(|span| file_path(span, token_nodes.context()).into_expr(span))
.or_else(|_| {
token_nodes.expand_syntax(StringShape).map(|syntax| {
file_path(syntax.inner, token_nodes.context()).into_expr(syntax.span)
})
})
.or_else(|_| {
token_nodes
.expand_syntax(IntShape)
.or_else(|_| token_nodes.expand_syntax(DecimalShape))
.map(|number| {
file_path(number.span(), token_nodes.context()).into_expr(number.span())
})
})
.map_err(|_| token_nodes.err_next_token("file path"))
}
}
fn file_path(text: Span, context: &ExpandContext) -> Expression {
Expression::FilePath(expand_file_path(text.slice(context.source), context))
}
#[derive(Debug, Copy, Clone)]
pub struct ExternalWordShape;
impl ExpandSyntax for ExternalWordShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"external word"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(ExternalWordType, |span| Ok((FlatShape::ExternalWord, span)))
}
}
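`FilePathShape` above chains fallbacks with `Result::or_else`: try a bare path, then an external word, then a string, then a number, and report a single error only when everything fails. A self-contained sketch of that ordered-alternative pattern (the parsers here are hypothetical stand-ins):

```rust
// Ordered alternatives via `Result::or_else`: each parser is tried in turn
// and the first success wins, as in FilePathShape's expand() above.
fn parse_int(s: &str) -> Result<String, ()> {
    s.parse::<i64>().map(|n| format!("int:{}", n)).map_err(|_| ())
}

fn parse_word(s: &str) -> Result<String, ()> {
    if !s.is_empty() && s.chars().all(|c| c.is_alphabetic()) {
        Ok(format!("word:{}", s))
    } else {
        Err(())
    }
}

// Try the int parser first, then fall back to the word parser.
fn parse_any(s: &str) -> Result<String, ()> {
    parse_int(s).or_else(|_| parse_word(s))
}
```

Because `or_else` only runs its closure on `Err`, earlier successes short-circuit the chain; the final `map_err` in the real shape replaces the accumulated failure with one "file path" error.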


@ -0,0 +1,170 @@
use crate::hir::syntax_shape::flat_shape::FlatShape;
use crate::{
hir,
hir::syntax_shape::{AnyExpressionShape, ExpandSyntax, MaybeSpaceShape},
hir::TokensIterator,
};
use derive_new::new;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
#[derive(Debug, Clone)]
pub struct ExpressionListSyntax {
pub exprs: Spanned<Vec<hir::SpannedExpression>>,
}
impl HasSpan for ExpressionListSyntax {
fn span(&self) -> Span {
self.exprs.span
}
}
impl PrettyDebugWithSource for ExpressionListSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::intersperse(
self.exprs.iter().map(|e| e.pretty_debug(source)),
b::space(),
)
}
}
#[derive(Debug, Copy, Clone)]
pub struct ExpressionListShape;
impl ExpandSyntax for ExpressionListShape {
type Output = ExpressionListSyntax;
fn name(&self) -> &'static str {
"expression list"
}
fn expand<'a, 'b>(&self, token_nodes: &mut TokensIterator<'_>) -> ExpressionListSyntax {
// Whether we encountered a parsing error and should continue with simpler
// coloring ("backoff coloring mode")
let mut backoff = false;
let mut exprs = vec![];
let start = token_nodes.span_at_cursor();
token_nodes.expand_infallible(MaybeSpaceShape);
if token_nodes.at_end() {
return ExpressionListSyntax {
exprs: exprs.spanned(start),
};
}
let expr = token_nodes.expand_syntax(AnyExpressionShape);
match expr {
Ok(expr) => exprs.push(expr),
Err(_) => backoff = true,
}
loop {
if token_nodes.at_end() {
let end = token_nodes.span_at_cursor();
return ExpressionListSyntax {
exprs: exprs.spanned(start.until(end)),
};
}
if backoff {
let len = token_nodes.state().shapes().len();
// If we previously encountered a parsing error, use backoff coloring mode
token_nodes
.expand_infallible(SimplestExpression::new(vec!["expression".to_string()]));
if len == token_nodes.state().shapes().len() && !token_nodes.at_end() {
// This should never happen, but if it does, a panic is better than an infinite loop
panic!("Unexpected tokens left that couldn't be colored even with SimplestExpression")
}
} else {
let expr = token_nodes.atomic_parse(|token_nodes| {
token_nodes.expand_infallible(MaybeSpaceShape);
token_nodes.expand_syntax(AnyExpressionShape)
});
match expr {
Ok(expr) => exprs.push(expr),
Err(_) => {
backoff = true;
}
}
// Otherwise, move on to the next expression
}
}
}
}
/// BackoffColoringMode consumes all of the remaining tokens in an infallible way
#[derive(Debug, Clone, new)]
pub struct BackoffColoringMode {
allowed: Vec<String>,
}
impl ExpandSyntax for BackoffColoringMode {
type Output = Option<Span>;
fn name(&self) -> &'static str {
"BackoffColoringMode"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Option<Span> {
loop {
if token_nodes.at_end() {
break;
}
let len = token_nodes.state().shapes().len();
token_nodes.expand_infallible(SimplestExpression::new(self.allowed.clone()));
if len == token_nodes.state().shapes().len() && !token_nodes.at_end() {
// This shouldn't happen, but if it does, a panic is better than an infinite loop
panic!("SimplestExpression failed to consume any tokens, but it's not at the end. This is unexpected\n== token nodes==\n{:#?}\n\n== shapes ==\n{:#?}", token_nodes, token_nodes.state().shapes());
}
}
None
}
}
/// The point of `SimplestExpression` is to serve as an infallible base case for coloring.
/// As a last ditch effort, if we can't find any way to parse the head of the stream as an
/// expression, fall back to simple coloring.
#[derive(Debug, Clone, new)]
pub struct SimplestExpression {
valid_shapes: Vec<String>,
}
impl ExpandSyntax for SimplestExpression {
type Output = Span;
fn name(&self) -> &'static str {
"SimplestExpression"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Span {
if token_nodes.at_end() {
return Span::unknown();
}
let source = token_nodes.source();
let peeked = token_nodes.peek();
match peeked.not_eof("simplest expression") {
Err(_) => token_nodes.span_at_cursor(),
Ok(peeked) => {
let token = peeked.commit();
for shape in FlatShape::shapes(token, &source) {
token_nodes.color_err(shape, self.valid_shapes.clone())
}
token.span()
}
}
}
}
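`BackoffColoringMode` and the `ExpressionListShape` loop above both rely on a progress guarantee: every pass must consume at least one token, and a panic is preferred to an infinite loop. A self-contained sketch of that guard (names are illustrative):

```rust
// Sketch of a progress-guaranteed consume loop, as in BackoffColoringMode:
// each iteration must advance the cursor, and we assert on that rather than
// risk spinning forever if the fallback step ever fails to consume.
fn consume_all(tokens: &[&str]) -> Vec<String> {
    let mut shapes = Vec::new();
    let mut cursor = 0;
    while cursor < tokens.len() {
        let before = cursor;
        // "Simplest" fallback step: record one token as an error shape.
        shapes.push(format!("error:{}", tokens[cursor]));
        cursor += 1;
        // Progress check: a panic beats an infinite loop if nothing advanced.
        assert!(cursor > before, "fallback step failed to consume a token");
    }
    shapes
}
```

The real code compares `shapes().len()` before and after expanding `SimplestExpression` for the same reason: if the infallible base case made no progress and the stream isn't at the end, the loop could never terminate.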


@ -0,0 +1,109 @@
use crate::hir::syntax_shape::{ExpandSyntax, FlatShape};
use crate::hir::{Expression, RawNumber, SpannedExpression, TokensIterator};
use crate::parse::token_tree::{DecimalType, IntType};
use nu_errors::ParseError;
use nu_source::HasSpan;
#[derive(Debug, Copy, Clone)]
pub struct NumberExpressionShape;
impl ExpandSyntax for NumberExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"number"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
let source = token_nodes.source();
token_nodes
.expand_syntax(NumberShape)
.map(|number| Expression::number(number.to_number(&source)).into_expr(number.span()))
}
}
#[derive(Debug, Copy, Clone)]
pub struct IntExpressionShape;
impl ExpandSyntax for IntExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"integer"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
let source = token_nodes.source();
token_nodes.expand_token(IntType, |number| {
Ok((
FlatShape::Int,
Expression::number(number.to_number(&source)),
))
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct IntShape;
impl ExpandSyntax for IntShape {
type Output = Result<RawNumber, ParseError>;
fn name(&self) -> &'static str {
"integer"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<RawNumber, ParseError> {
token_nodes.expand_token(IntType, |number| Ok((FlatShape::Int, number)))
}
}
#[derive(Debug, Copy, Clone)]
pub struct DecimalShape;
impl ExpandSyntax for DecimalShape {
type Output = Result<RawNumber, ParseError>;
fn name(&self) -> &'static str {
"decimal"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<RawNumber, ParseError> {
token_nodes.expand_token(DecimalType, |number| Ok((FlatShape::Decimal, number)))
}
}
#[derive(Debug, Copy, Clone)]
pub struct NumberShape;
impl ExpandSyntax for NumberShape {
type Output = Result<RawNumber, ParseError>;
fn name(&self) -> &'static str {
"number"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<RawNumber, ParseError> {
token_nodes
.expand_syntax(IntShape)
.or_else(|_| token_nodes.expand_syntax(DecimalShape))
}
}


@@ -0,0 +1,86 @@
use crate::hir::syntax_shape::{
expand_bare, expression::expand_file_path, BarePathShape, ExpandContext, ExpandSyntax,
ExternalWordShape, StringShape,
};
use crate::hir::{Expression, SpannedExpression};
use crate::parse::operator::EvaluationOperator;
use crate::{hir, hir::TokensIterator, Token};
use nu_errors::ParseError;
use nu_source::Span;
#[derive(Debug, Copy, Clone)]
pub struct PatternShape;
impl ExpandSyntax for PatternShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"glob pattern"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<hir::SpannedExpression, ParseError> {
let (inner, outer) = token_nodes
.expand_syntax(BarePatternShape)
.or_else(|_| token_nodes.expand_syntax(BarePathShape))
.or_else(|_| token_nodes.expand_syntax(ExternalWordShape))
.map(|span| (span, span))
.or_else(|_| {
token_nodes
.expand_syntax(StringShape)
.map(|syntax| (syntax.inner, syntax.span))
})
.map_err(|_| token_nodes.err_next_token("glob pattern"))?;
Ok(file_pattern(inner, outer, token_nodes.context()))
}
}
fn file_pattern(body: Span, outer: Span, context: &ExpandContext) -> SpannedExpression {
let path = expand_file_path(body.slice(context.source), context);
Expression::pattern(path.to_string_lossy()).into_expr(outer)
}
#[derive(Debug, Copy, Clone)]
pub struct PatternExpressionShape;
impl ExpandSyntax for PatternExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"pattern"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes.expand_syntax(BarePatternShape).map(|span| {
let path = expand_file_path(span.slice(&token_nodes.source()), token_nodes.context());
Expression::pattern(path.to_string_lossy()).into_expr(span)
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct BarePatternShape;
impl ExpandSyntax for BarePatternShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"bare pattern"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
expand_bare(token_nodes, |token| match token.unspanned() {
Token::Bare
| Token::EvaluationOperator(EvaluationOperator::Dot)
| Token::GlobPattern => true,
_ => false,
})
}
}


@@ -0,0 +1,47 @@
use crate::hir::syntax_shape::{AnyExpressionStartShape, ExpandSyntax, FlatShape};
use crate::hir::TokensIterator;
use crate::hir::{Expression, SpannedExpression};
use crate::parse::token_tree::DotDotType;
use nu_errors::ParseError;
use nu_source::{HasSpan, Span};
#[derive(Debug, Copy, Clone)]
pub struct RangeShape;
impl ExpandSyntax for RangeShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"range"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let left = token_nodes.expand_syntax(AnyExpressionStartShape)?;
let dotdot = token_nodes.expand_syntax(DotDotShape)?;
let right = token_nodes.expand_syntax(AnyExpressionStartShape)?;
let span = left.span.until(right.span);
Ok(Expression::range(left, dotdot, right).into_expr(span))
})
}
}
#[derive(Debug, Copy, Clone)]
struct DotDotShape;
impl ExpandSyntax for DotDotShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"dotdot"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(DotDotType, |token| Ok((FlatShape::DotDot, token.span())))
}
}


@@ -0,0 +1,103 @@
use crate::hir::syntax_shape::{ExpandSyntax, FlatShape, NumberShape, VariableShape};
use crate::hir::TokensIterator;
use crate::hir::{Expression, SpannedExpression};
use crate::parse::token_tree::{BareType, StringType};
use nu_errors::ParseError;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span};
#[derive(Debug, Copy, Clone)]
pub struct CoerceStringShape;
impl ExpandSyntax for CoerceStringShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"StringShape"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_token(StringType, |(inner, outer)| {
Ok((
FlatShape::String,
Expression::string(inner).into_expr(outer),
))
})
.or_else(|_| {
token_nodes.expand_token(BareType, |span| {
Ok((FlatShape::String, Expression::string(span).into_expr(span)))
})
})
.or_else(|_| {
token_nodes
.expand_syntax(NumberShape)
.map(|number| Expression::string(number.span()).into_expr(number.span()))
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct StringExpressionShape;
impl ExpandSyntax for StringExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"string"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes.expand_syntax(VariableShape).or_else(|_| {
token_nodes.expand_token(StringType, |(inner, outer)| {
Ok((
FlatShape::String,
Expression::string(inner).into_expr(outer),
))
})
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct StringSyntax {
pub inner: Span,
pub span: Span,
}
impl HasSpan for StringSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for StringSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::primitive(self.span.slice(source))
}
}
#[derive(Debug, Copy, Clone)]
pub struct StringShape;
impl ExpandSyntax for StringShape {
type Output = Result<StringSyntax, ParseError>;
fn name(&self) -> &'static str {
"string"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<StringSyntax, ParseError> {
token_nodes.expand_token(StringType, |(inner, outer)| {
Ok((FlatShape::String, StringSyntax { inner, span: outer }))
})
}
}


@ -0,0 +1,156 @@
use crate::hir::syntax_shape::flat_shape::FlatShape;
use crate::hir::syntax_shape::ExpandSyntax;
use crate::hir::TokensIterator;
use crate::hir::{Expression, SpannedExpression};
use crate::parse::number::RawNumber;
use crate::parse::token_tree::BareType;
use crate::parse::unit::Unit;
use nom::branch::alt;
use nom::bytes::complete::tag;
use nom::character::complete::digit1;
use nom::combinator::{all_consuming, opt, value};
use nom::IResult;
use nu_errors::ParseError;
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem, Text,
};
#[derive(Debug, Clone)]
pub struct UnitSyntax {
pub unit: (RawNumber, Spanned<Unit>),
pub span: Span,
}
impl UnitSyntax {
pub fn into_expr(self, source: &Text) -> SpannedExpression {
let UnitSyntax {
unit: (number, unit),
span,
} = self;
Expression::size(number.to_number(source), *unit).into_expr(span)
}
}
impl PrettyDebugWithSource for UnitSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"unit",
self.unit.0.pretty_debug(source) + b::space() + self.unit.1.pretty_debug(source),
)
}
}
impl HasSpan for UnitSyntax {
fn span(&self) -> Span {
self.span
}
}
#[derive(Debug, Copy, Clone)]
pub struct UnitExpressionShape;
impl ExpandSyntax for UnitExpressionShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"unit expression"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_syntax(UnitShape)
.map(|unit| unit.into_expr(&token_nodes.source()))
}
}
#[derive(Debug, Copy, Clone)]
pub struct UnitShape;
impl ExpandSyntax for UnitShape {
type Output = Result<UnitSyntax, ParseError>;
fn name(&self) -> &'static str {
"unit"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<UnitSyntax, ParseError> {
let source = token_nodes.source();
token_nodes.expand_token(BareType, |span| {
let unit = unit_size(span.slice(&source), span);
let (_, (number, unit)) = match unit {
Err(_) => return Err(ParseError::mismatch("unit", "word".spanned(span))),
Ok((number, unit)) => (number, unit),
};
Ok((
FlatShape::Size {
number: number.span(),
unit: unit.span,
},
UnitSyntax {
unit: (number, unit),
span,
},
))
})
}
}
fn unit_size(input: &str, bare_span: Span) -> IResult<&str, (RawNumber, Spanned<Unit>)> {
let (input, digits) = digit1(input)?;
let (input, dot) = opt(tag("."))(input)?;
let (input, number) = match dot {
Some(dot) => {
let (input, rest) = digit1(input)?;
(
input,
RawNumber::decimal(Span::new(
bare_span.start(),
bare_span.start() + digits.len() + dot.len() + rest.len(),
)),
)
}
None => (
input,
RawNumber::int(Span::new(
bare_span.start(),
bare_span.start() + digits.len(),
)),
),
};
let (input, unit) = all_consuming(alt((
value(Unit::Byte, alt((tag("B"), tag("b")))),
value(Unit::Kilobyte, alt((tag("KB"), tag("kb"), tag("Kb")))),
value(Unit::Megabyte, alt((tag("MB"), tag("mb"), tag("Mb")))),
value(Unit::Gigabyte, alt((tag("GB"), tag("gb"), tag("Gb")))),
value(Unit::Terabyte, alt((tag("TB"), tag("tb"), tag("Tb")))),
value(Unit::Petabyte, alt((tag("PB"), tag("pb"), tag("Pb")))),
value(Unit::Second, tag("s")),
value(Unit::Minute, tag("m")),
value(Unit::Hour, tag("h")),
value(Unit::Day, tag("d")),
value(Unit::Week, tag("w")),
value(Unit::Month, tag("M")),
value(Unit::Year, tag("y")),
)))(input)?;
let start_span = number.span().end();
Ok((
input,
(number, unit.spanned(Span::new(start_span, bare_span.end()))),
))
}


@@ -0,0 +1,644 @@
use crate::hir::syntax_shape::{
AnyExpressionShape, BareShape, ExpandSyntax, FlatShape, IntShape, ParseError, StringShape,
WhitespaceShape,
};
use crate::hir::{Expression, SpannedExpression, TokensIterator};
use crate::parse::token_tree::{CompareOperatorType, DotDotType, DotType, ItVarType, VarType};
use crate::{hir, CompareOperator};
use nu_protocol::{PathMember, ShellTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
Tag, Tagged, TaggedItem, Text,
};
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Copy, Clone)]
pub struct VariablePathShape;
impl ExpandSyntax for VariablePathShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"variable path"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
// 1. let the head be the first token, expecting a variable
// 2. let the tail be an empty list of members
// 3. while the next token (excluding ws) is a dot:
// 1. consume the dot
// 2. consume the next token as a member and push it onto tail
let head = token_nodes.expand_syntax(VariableShape)?;
let start = head.span;
let mut end = start;
let mut tail: Vec<PathMember> = vec![];
loop {
if token_nodes.expand_syntax(DotShape).is_err() {
break;
}
let member = token_nodes.expand_syntax(MemberShape)?;
let member = member.to_path_member(&token_nodes.source());
end = member.span;
tail.push(member);
}
Ok(Expression::path(head, tail).into_expr(start.until(end)))
}
}
#[derive(Debug, Copy, Clone)]
pub struct PathTailShape;
#[derive(Debug, Clone)]
pub struct PathTailSyntax {
pub tail: Vec<PathMember>,
pub span: Span,
}
impl HasSpan for PathTailSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebug for PathTailSyntax {
fn pretty(&self) -> DebugDocBuilder {
b::typed("tail", b::intersperse(self.tail.iter(), b::space()))
}
}
impl ExpandSyntax for PathTailShape {
type Output = Result<PathTailSyntax, ParseError>;
fn name(&self) -> &'static str {
"path continuation"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<PathTailSyntax, ParseError> {
let mut end: Option<Span> = None;
let mut tail: Vec<PathMember> = vec![];
loop {
if token_nodes.expand_syntax(DotShape).is_err() {
break;
}
let member = token_nodes.expand_syntax(MemberShape)?;
let member = member.to_path_member(&token_nodes.source());
end = Some(member.span);
tail.push(member);
}
match end {
None => Err(token_nodes.err_next_token("path continuation")),
Some(end) => Ok(PathTailSyntax { tail, span: end }),
}
}
}
#[derive(Debug, Clone)]
pub struct ContinuationSyntax {
kind: ContinuationSyntaxKind,
span: Span,
}
impl ContinuationSyntax {
pub fn append_to(self, expr: SpannedExpression) -> SpannedExpression {
match self.kind {
ContinuationSyntaxKind::Infix(op, right) => {
let span = expr.span.until(right.span);
Expression::infix(expr, op, right).into_expr(span)
}
ContinuationSyntaxKind::Dot(_, member) => {
let span = expr.span.until(member.span);
Expression::dot_member(expr, member).into_expr(span)
}
ContinuationSyntaxKind::DotDot(_, right) => {
let span = expr.span.until(right.span);
Expression::range(expr, span, right).into_expr(span)
}
}
}
}
impl HasSpan for ContinuationSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for ContinuationSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed("continuation", self.kind.pretty_debug(source))
}
}
#[derive(Debug, Clone)]
pub enum ContinuationSyntaxKind {
Infix(Spanned<CompareOperator>, SpannedExpression),
Dot(Span, PathMember),
DotDot(Span, SpannedExpression),
}
impl PrettyDebugWithSource for ContinuationSyntaxKind {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
ContinuationSyntaxKind::Infix(op, expr) => {
b::operator(op.span.slice(source)) + expr.pretty_debug(source)
}
ContinuationSyntaxKind::Dot(span, member) => {
b::operator(span.slice(source)) + member.pretty_debug(source)
}
ContinuationSyntaxKind::DotDot(span, expr) => {
b::operator(span.slice(source)) + expr.pretty_debug(source)
}
}
}
}
/// An expression continuation
#[derive(Debug, Copy, Clone)]
pub struct ExpressionContinuationShape;
impl ExpandSyntax for ExpressionContinuationShape {
type Output = Result<ContinuationSyntax, ParseError>;
fn name(&self) -> &'static str {
"expression continuation"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<ContinuationSyntax, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
// Try to expand a `.`
let dot = token_nodes.expand_syntax(DotShape);
if let Ok(dot) = dot {
// If a `.` was matched, it's a `Path`, and we expect a `Member` next
let syntax = token_nodes.expand_syntax(MemberShape)?;
let member = syntax.to_path_member(&token_nodes.source());
let member_span = member.span;
return Ok(ContinuationSyntax {
kind: ContinuationSyntaxKind::Dot(dot, member),
span: dot.until(member_span),
});
}
// Try to expand a `..`
let dot = token_nodes.expand_syntax(DotDotShape);
if let Ok(dotdot) = dot {
// If a `..` was matched, it's a `Range`, and we expect an `Expression` next
let expr = token_nodes.expand_syntax(AnyExpressionShape)?;
let expr_span = expr.span;
return Ok(ContinuationSyntax {
kind: ContinuationSyntaxKind::DotDot(dotdot, expr),
span: dotdot.until(expr_span),
});
}
// Otherwise, we expect an infix operator and an expression next
let (_, op, _) = token_nodes.expand_syntax(InfixShape)?.infix.item;
let next = token_nodes.expand_syntax(AnyExpressionShape)?;
let next_span = next.span;
Ok(ContinuationSyntax {
kind: ContinuationSyntaxKind::Infix(op.operator, next),
span: op.operator.span.until(next_span),
})
})
}
}
#[derive(Debug, Copy, Clone)]
pub struct VariableShape;
impl ExpandSyntax for VariableShape {
type Output = Result<SpannedExpression, ParseError>;
fn name(&self) -> &'static str {
"variable"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<SpannedExpression, ParseError> {
token_nodes
.expand_token(ItVarType, |(inner, outer)| {
Ok((
FlatShape::ItVariable,
Expression::it_variable(inner).into_expr(outer),
))
})
.or_else(|_| {
token_nodes.expand_token(VarType, |(inner, outer)| {
Ok((
FlatShape::Variable,
Expression::variable(inner).into_expr(outer),
))
})
})
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Member {
String(/* outer */ Span, /* inner */ Span),
Int(BigInt, Span),
Bare(Span),
}
impl ShellTypeName for Member {
fn type_name(&self) -> &'static str {
match self {
Member::String(_, _) => "string",
Member::Int(_, _) => "integer",
Member::Bare(_) => "word",
}
}
}
impl Member {
pub fn int(span: Span, source: &Text) -> Member {
if let Ok(big_int) = BigInt::from_str(span.slice(source)) {
Member::Int(big_int, span)
} else {
unreachable!("Internal error: could not convert text to BigInt as expected")
}
}
pub fn to_path_member(&self, source: &Text) -> PathMember {
match self {
Member::String(outer, inner) => PathMember::string(inner.slice(source), *outer),
Member::Int(int, span) => PathMember::int(int.clone(), *span),
Member::Bare(span) => PathMember::string(span.slice(source), *span),
}
}
}
impl PrettyDebugWithSource for Member {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
Member::String(outer, _) => b::value(outer.slice(source)),
Member::Int(int, _) => b::value(format!("{}", int)),
Member::Bare(span) => b::value(span.slice(source)),
}
}
}
impl HasSpan for Member {
fn span(&self) -> Span {
match self {
Member::String(outer, ..) => *outer,
Member::Int(_, int) => *int,
Member::Bare(name) => *name,
}
}
}
impl Member {
pub fn to_expr(&self) -> hir::SpannedExpression {
match self {
Member::String(outer, inner) => Expression::string(*inner).into_expr(outer),
Member::Int(number, span) => Expression::number(number.clone()).into_expr(span),
Member::Bare(span) => Expression::string(*span).into_expr(span),
}
}
pub(crate) fn span(&self) -> Span {
match self {
Member::String(outer, _inner) => *outer,
Member::Int(_, span) => *span,
Member::Bare(span) => *span,
}
}
}
enum ColumnPathState {
Initial,
LeadingDot(Span),
Dot(Span, Vec<Member>, Span),
Member(Span, Vec<Member>),
Error(ParseError),
}
impl ColumnPathState {
pub fn dot(self, dot: Span) -> ColumnPathState {
match self {
ColumnPathState::Initial => ColumnPathState::LeadingDot(dot),
ColumnPathState::LeadingDot(_) => {
ColumnPathState::Error(ParseError::mismatch("column", "dot".spanned(dot)))
}
ColumnPathState::Dot(..) => {
ColumnPathState::Error(ParseError::mismatch("column", "dot".spanned(dot)))
}
ColumnPathState::Member(tag, members) => ColumnPathState::Dot(tag, members, dot),
ColumnPathState::Error(err) => ColumnPathState::Error(err),
}
}
pub fn member(self, member: Member) -> ColumnPathState {
match self {
ColumnPathState::Initial => ColumnPathState::Member(member.span(), vec![member]),
ColumnPathState::LeadingDot(tag) => {
ColumnPathState::Member(tag.until(member.span()), vec![member])
}
ColumnPathState::Dot(tag, mut tags, _) => {
ColumnPathState::Member(tag.until(member.span()), {
tags.push(member);
tags
})
}
ColumnPathState::Member(..) => ColumnPathState::Error(ParseError::mismatch(
"column",
member.type_name().spanned(member.span()),
)),
ColumnPathState::Error(err) => ColumnPathState::Error(err),
}
}
pub fn into_path(self, err: ParseError) -> Result<Tagged<Vec<Member>>, ParseError> {
match self {
ColumnPathState::Initial => Err(err),
ColumnPathState::LeadingDot(dot) => {
Err(ParseError::mismatch("column", "dot".spanned(dot)))
}
ColumnPathState::Dot(_tag, _members, dot) => {
Err(ParseError::mismatch("column", "dot".spanned(dot)))
}
ColumnPathState::Member(tag, tags) => Ok(tags.tagged(tag)),
ColumnPathState::Error(err) => Err(err),
}
}
}
#[derive(Debug, Copy, Clone)]
pub struct ColumnPathShape;
impl ExpandSyntax for ColumnPathShape {
type Output = Result<ColumnPathSyntax, ParseError>;
fn name(&self) -> &'static str {
"column path"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<ColumnPathSyntax, ParseError> {
let mut state = ColumnPathState::Initial;
loop {
let member = token_nodes.expand_syntax(MemberShape);
match member {
Err(_) => break,
Ok(member) => state = state.member(member),
}
let dot = token_nodes.expand_syntax(DotShape);
match dot {
Err(_) => break,
Ok(dot) => state = state.dot(dot),
}
}
let path = state.into_path(token_nodes.err_next_token("column path"))?;
Ok(ColumnPathSyntax {
path: path.item,
tag: path.tag,
})
}
}
#[derive(Debug, Clone)]
pub struct ColumnPathSyntax {
pub path: Vec<Member>,
pub tag: Tag,
}
impl HasSpan for ColumnPathSyntax {
fn span(&self) -> Span {
self.tag.span
}
}
impl PrettyDebugWithSource for ColumnPathSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"column path",
b::intersperse(
self.path.iter().map(|member| member.pretty_debug(source)),
b::space(),
),
)
}
}
#[derive(Debug, Copy, Clone)]
pub struct MemberShape;
impl ExpandSyntax for MemberShape {
type Output = Result<Member, ParseError>;
fn name(&self) -> &'static str {
"column"
}
fn expand<'a, 'b>(&self, token_nodes: &mut TokensIterator<'_>) -> Result<Member, ParseError> {
if let Ok(int) = token_nodes.expand_syntax(IntMemberShape) {
return Ok(int);
}
let bare = token_nodes.expand_syntax(BareShape);
if let Ok(bare) = bare {
return Ok(Member::Bare(bare.span()));
}
/* KATZ */
/* let number = NumberShape.test(token_nodes, context);
if let Some(peeked) = number {
let node = peeked.not_eof("column")?.commit();
let (n, span) = node.as_number().ok_or_else(|| {
ParseError::internal_error("can't convert node to number".spanned(node.span()))
})?;
return Ok(Member::Number(n, span))
}*/
let string = token_nodes.expand_syntax(StringShape);
if let Ok(syntax) = string {
return Ok(Member::String(syntax.span, syntax.inner));
}
Err(token_nodes.peek().type_error("column"))
}
}
#[derive(Debug, Copy, Clone)]
struct IntMemberShape;
impl ExpandSyntax for IntMemberShape {
type Output = Result<Member, ParseError>;
fn name(&self) -> &'static str {
"integer member"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<Member, ParseError> {
token_nodes
.expand_syntax(IntShape)
.map(|int| Member::int(int.span(), &token_nodes.source()))
.or_else(|_| Err(token_nodes.err_next_token("integer member")))
}
}
#[derive(Debug, Copy, Clone)]
pub struct DotShape;
#[derive(Debug, Copy, Clone)]
pub struct ColorableDotShape;
impl ExpandSyntax for DotShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"dot"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(DotType, |token| Ok((FlatShape::Dot, token.span())))
}
}
#[derive(Debug, Copy, Clone)]
struct DotDotShape;
impl ExpandSyntax for DotDotShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"dotdot"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(DotDotType, |token| Ok((FlatShape::DotDot, token.span())))
}
}
#[derive(Debug, Copy, Clone)]
pub struct InfixShape;
#[derive(Debug, Clone)]
pub struct InfixSyntax {
infix: Spanned<(Span, InfixInnerSyntax, Span)>,
}
impl HasSpan for InfixSyntax {
fn span(&self) -> Span {
self.infix.span
}
}
impl PrettyDebugWithSource for InfixSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.infix.1.pretty_debug(source)
}
}
impl ExpandSyntax for InfixShape {
type Output = Result<InfixSyntax, ParseError>;
fn name(&self) -> &'static str {
"infix operator"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<InfixSyntax, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
// An infix operator must be prefixed by whitespace
let start = token_nodes.expand_syntax(WhitespaceShape)?;
// Parse the next TokenNode after the whitespace
let operator = token_nodes.expand_syntax(InfixInnerShape)?;
// An infix operator must be followed by whitespace
let end = token_nodes.expand_syntax(WhitespaceShape)?;
Ok(InfixSyntax {
infix: (start, operator, end).spanned(start.until(end)),
})
})
}
}
#[derive(Debug, Clone)]
pub struct InfixInnerSyntax {
pub operator: Spanned<CompareOperator>,
}
impl HasSpan for InfixInnerSyntax {
fn span(&self) -> Span {
self.operator.span
}
}
impl PrettyDebug for InfixInnerSyntax {
fn pretty(&self) -> DebugDocBuilder {
self.operator.pretty()
}
}
#[derive(Debug, Copy, Clone)]
pub struct InfixInnerShape;
impl ExpandSyntax for InfixInnerShape {
type Output = Result<InfixInnerSyntax, ParseError>;
fn name(&self) -> &'static str {
"infix inner"
}
fn expand<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
) -> Result<InfixInnerSyntax, ParseError> {
token_nodes.expand_token(CompareOperatorType, |(span, operator)| {
Ok((
FlatShape::CompareOperator,
InfixInnerSyntax {
operator: operator.spanned(span),
},
))
})
}
}


@@ -0,0 +1,193 @@
use crate::parse::flag::{Flag, FlagKind};
use crate::parse::number::RawNumber;
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::{Delimiter, SpannedToken, Token};
use nu_protocol::ShellTypeName;
use nu_source::{DebugDocBuilder, HasSpan, PrettyDebug, Span, Spanned, SpannedItem, Text};
#[derive(Debug, Copy, Clone)]
pub enum FlatShape {
OpenDelimiter(Delimiter),
CloseDelimiter(Delimiter),
Type,
Identifier,
ItVariable,
Variable,
CompareOperator,
Dot,
DotDot,
InternalCommand,
ExternalCommand,
ExternalWord,
BareMember,
StringMember,
String,
Path,
Word,
Keyword,
Pipe,
GlobPattern,
Flag,
ShorthandFlag,
Int,
Decimal,
Garbage,
Whitespace,
Separator,
Comment,
Size { number: Span, unit: Span },
}
#[derive(Debug, Clone)]
pub enum ShapeResult {
Success(Spanned<FlatShape>),
Fallback {
shape: Spanned<FlatShape>,
allowed: Vec<String>,
},
}
impl HasSpan for ShapeResult {
fn span(&self) -> Span {
match self {
ShapeResult::Success(shape) => shape.span,
ShapeResult::Fallback { shape, .. } => shape.span,
}
}
}
impl PrettyDebug for FlatShape {
fn pretty(&self) -> DebugDocBuilder {
unimplemented!()
}
}
#[derive(Debug, Copy, Clone)]
pub struct TraceShape {
shape: FlatShape,
span: Span,
}
impl ShellTypeName for TraceShape {
fn type_name(&self) -> &'static str {
self.shape.type_name()
}
}
impl PrettyDebug for TraceShape {
fn pretty(&self) -> DebugDocBuilder {
self.shape.pretty()
}
}
impl HasSpan for TraceShape {
fn span(&self) -> Span {
self.span
}
}
impl ShellTypeName for FlatShape {
fn type_name(&self) -> &'static str {
match self {
FlatShape::OpenDelimiter(Delimiter::Brace) => "open brace",
FlatShape::OpenDelimiter(Delimiter::Paren) => "open paren",
FlatShape::OpenDelimiter(Delimiter::Square) => "open square",
FlatShape::CloseDelimiter(Delimiter::Brace) => "close brace",
FlatShape::CloseDelimiter(Delimiter::Paren) => "close paren",
FlatShape::CloseDelimiter(Delimiter::Square) => "close square",
FlatShape::Type => "type",
FlatShape::Identifier => "identifier",
FlatShape::ItVariable => "$it",
FlatShape::Variable => "variable",
FlatShape::CompareOperator => "comparison",
FlatShape::Dot => "dot",
FlatShape::DotDot => "dotdot",
FlatShape::InternalCommand => "internal command",
FlatShape::ExternalCommand => "external command",
FlatShape::ExternalWord => "external word",
FlatShape::BareMember => "bare member",
FlatShape::StringMember => "string member",
FlatShape::String => "string",
FlatShape::Path => "path",
FlatShape::Word => "word",
FlatShape::Keyword => "keyword",
FlatShape::Pipe => "pipe",
FlatShape::GlobPattern => "glob",
FlatShape::Flag => "flag",
FlatShape::ShorthandFlag => "shorthand flag",
FlatShape::Int => "int",
FlatShape::Decimal => "decimal",
FlatShape::Garbage => "garbage",
FlatShape::Whitespace => "whitespace",
FlatShape::Separator => "separator",
FlatShape::Comment => "comment",
FlatShape::Size { .. } => "size",
}
}
}
impl FlatShape {
pub fn into_trace_shape(self, span: Span) -> TraceShape {
TraceShape { shape: self, span }
}
pub fn shapes(token: &SpannedToken, source: &Text) -> Vec<Spanned<FlatShape>> {
let mut shapes = vec![];
FlatShape::from(token, source, &mut shapes);
shapes
}
fn from(token: &SpannedToken, source: &Text, shapes: &mut Vec<Spanned<FlatShape>>) {
let span = token.span();
match token.unspanned() {
Token::Number(RawNumber::Int(_)) => shapes.push(FlatShape::Int.spanned(span)),
Token::Number(RawNumber::Decimal(_)) => shapes.push(FlatShape::Decimal.spanned(span)),
Token::EvaluationOperator(EvaluationOperator::Dot) => {
shapes.push(FlatShape::Dot.spanned(span))
}
Token::EvaluationOperator(EvaluationOperator::DotDot) => {
shapes.push(FlatShape::DotDot.spanned(span))
}
Token::CompareOperator(_) => shapes.push(FlatShape::CompareOperator.spanned(span)),
Token::String(_) => shapes.push(FlatShape::String.spanned(span)),
Token::Variable(v) if v.slice(source) == "it" => {
shapes.push(FlatShape::ItVariable.spanned(span))
}
Token::Variable(_) => shapes.push(FlatShape::Variable.spanned(span)),
Token::ItVariable(_) => shapes.push(FlatShape::ItVariable.spanned(span)),
Token::ExternalCommand(_) => shapes.push(FlatShape::ExternalCommand.spanned(span)),
Token::ExternalWord => shapes.push(FlatShape::ExternalWord.spanned(span)),
Token::GlobPattern => shapes.push(FlatShape::GlobPattern.spanned(span)),
Token::Bare => shapes.push(FlatShape::Word.spanned(span)),
Token::Call(_) => unimplemented!(),
Token::Delimited(v) => {
shapes.push(FlatShape::OpenDelimiter(v.delimiter).spanned(v.spans.0));
for token in &v.children {
FlatShape::from(token, source, shapes);
}
shapes.push(FlatShape::CloseDelimiter(v.delimiter).spanned(v.spans.1));
}
Token::Pipeline(pipeline) => {
for part in &pipeline.parts {
if part.pipe.is_some() {
shapes.push(FlatShape::Pipe.spanned(part.span()));
}
}
}
Token::Flag(Flag {
kind: FlagKind::Longhand,
..
}) => shapes.push(FlatShape::Flag.spanned(span)),
Token::Flag(Flag {
kind: FlagKind::Shorthand,
..
}) => shapes.push(FlatShape::ShorthandFlag.spanned(span)),
Token::Garbage => shapes.push(FlatShape::Garbage.spanned(span)),
Token::Whitespace => shapes.push(FlatShape::Whitespace.spanned(span)),
Token::Separator => shapes.push(FlatShape::Separator.spanned(span)),
Token::Comment(_) => shapes.push(FlatShape::Comment.spanned(span)),
}
}
}


@@ -0,0 +1,602 @@
pub(crate) mod debug;
pub(crate) mod into_shapes;
pub(crate) mod pattern;
pub(crate) mod state;
use self::debug::ExpandTracer;
use self::into_shapes::IntoShapes;
use self::state::{Peeked, TokensIteratorState};
use crate::hir::syntax_shape::flat_shape::{FlatShape, ShapeResult};
use crate::hir::syntax_shape::{ExpandContext, ExpandSyntax, ExpressionListShape};
use crate::hir::SpannedExpression;
use crate::parse::token_tree::{BlockType, DelimitedNode, SpannedToken, SquareType, TokenType};
use getset::{Getters, MutGetters};
use nu_errors::ParseError;
use nu_protocol::SpannedTypeName;
use nu_source::{
HasFallibleSpan, HasSpan, IntoSpanned, PrettyDebugWithSource, Span, Spanned, SpannedItem, Text,
};
use std::borrow::Borrow;
use std::sync::Arc;
#[derive(Getters, MutGetters, Clone, Debug)]
pub struct TokensIterator<'content> {
#[get = "pub"]
#[get_mut = "pub"]
state: TokensIteratorState<'content>,
#[get = "pub"]
#[get_mut = "pub"]
expand_tracer: ExpandTracer<SpannedExpression>,
}
#[derive(Debug)]
pub struct Checkpoint<'content, 'me> {
pub(crate) iterator: &'me mut TokensIterator<'content>,
index: usize,
seen: indexmap::IndexSet<usize>,
shape_start: usize,
committed: bool,
}
impl<'content, 'me> Checkpoint<'content, 'me> {
pub(crate) fn commit(mut self) {
self.committed = true;
}
}
impl<'content, 'me> std::ops::Drop for Checkpoint<'content, 'me> {
fn drop(&mut self) {
if !self.committed {
let state = &mut self.iterator.state;
state.index = self.index;
state.seen = self.seen.clone();
state.shapes.truncate(self.shape_start);
}
}
}
// For parse_command
impl<'content> TokensIterator<'content> {
pub fn sort_shapes(&mut self) {
// This is pretty dubious, but it works. We should look into a better algorithm that doesn't end up requiring
// this solution.
self.state
.shapes
.sort_by(|a, b| a.span().start().cmp(&b.span().start()));
}
/// Run a block of code, retrieving the shapes that were created during the block. This is
/// used by `parse_command` to associate shapes with a particular flag.
pub fn shapes_for<'me, T>(
&'me mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, ParseError>,
) -> (Result<T, ParseError>, Vec<ShapeResult>) {
let index = self.state.index;
let mut shapes = vec![];
let mut errors = self.state.errors.clone();
let seen = self.state.seen.clone();
std::mem::swap(&mut self.state.shapes, &mut shapes);
std::mem::swap(&mut self.state.errors, &mut errors);
let checkpoint = Checkpoint {
iterator: self,
index,
seen,
committed: false,
shape_start: 0,
};
let value = block(checkpoint.iterator);
let value = match value {
Err(err) => {
drop(checkpoint);
std::mem::swap(&mut self.state.shapes, &mut shapes);
std::mem::swap(&mut self.state.errors, &mut errors);
return (Err(err), vec![]);
}
Ok(value) => value,
};
checkpoint.commit();
std::mem::swap(&mut self.state.shapes, &mut shapes);
(Ok(value), shapes)
}
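/// Find the first unseen token for which `f` returns `Some`, mark it as seen,
/// and return its index along with the extracted value. If no token matches,
/// the cursor is reset to the start and `None` is returned.
///
/// An illustrative sketch (the closure body and token variant are assumptions,
/// not taken from real call sites):
///
/// ```ignore
/// // Pull out the first flag token, if any, without consuming other tokens.
/// let flag = token_nodes.extract(|token| match token.unspanned() {
///     Token::Flag(flag) => Some(*flag),
///     _ => None,
/// });
/// ```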
pub fn extract<T>(&mut self, f: impl Fn(&SpannedToken) -> Option<T>) -> Option<(usize, T)> {
let state = &mut self.state;
for (i, item) in state.tokens.iter().enumerate() {
if state.seen.contains(&i) {
continue;
}
match f(item) {
None => {
continue;
}
Some(value) => {
state.seen.insert(i);
return Some((i, value));
}
}
}
self.move_to(0);
None
}
pub fn remove(&mut self, position: usize) {
self.state.seen.insert(position);
}
}
// Delimited
impl<'content> TokensIterator<'content> {
pub fn block(&mut self) -> Result<Spanned<Vec<SpannedExpression>>, ParseError> {
self.expand_token_with_token_nodes(BlockType, |node, token_nodes| {
token_nodes.delimited(node)
})
}
pub fn square(&mut self) -> Result<Spanned<Vec<SpannedExpression>>, ParseError> {
self.expand_token_with_token_nodes(SquareType, |node, token_nodes| {
token_nodes.delimited(node)
})
}
fn delimited(
&mut self,
DelimitedNode {
delimiter,
spans,
children,
}: DelimitedNode,
) -> Result<(Vec<ShapeResult>, Spanned<Vec<SpannedExpression>>), ParseError> {
let span = spans.0.until(spans.1);
let (child_shapes, expr) = self.child(children[..].spanned(span), |token_nodes| {
token_nodes.expand_infallible(ExpressionListShape).exprs
});
let mut shapes = vec![ShapeResult::Success(
FlatShape::OpenDelimiter(delimiter).spanned(spans.0),
)];
shapes.extend(child_shapes);
shapes.push(ShapeResult::Success(
FlatShape::CloseDelimiter(delimiter).spanned(spans.1),
));
Ok((shapes, expr))
}
}
impl<'content> TokensIterator<'content> {
pub fn new(
items: &'content [SpannedToken],
context: ExpandContext<'content>,
span: Span,
) -> TokensIterator<'content> {
let source = context.source();
TokensIterator {
state: TokensIteratorState {
tokens: items,
span,
index: 0,
seen: indexmap::IndexSet::new(),
shapes: vec![],
errors: indexmap::IndexMap::new(),
context: Arc::new(context),
},
expand_tracer: ExpandTracer::new("Expand Trace", source.clone()),
}
}
pub fn len(&self) -> usize {
self.state.tokens.len()
}
pub fn is_empty(&self) -> bool {
self.state.tokens.is_empty()
}
pub fn source(&self) -> Text {
self.state.context.source().clone()
}
pub fn context(&self) -> &ExpandContext {
&self.state.context
}
pub fn color_result(&mut self, shape: ShapeResult) {
match shape {
ShapeResult::Success(shape) => self.color_shape(shape),
ShapeResult::Fallback { shape, allowed } => self.color_err(shape, allowed),
}
}
pub fn color_shape(&mut self, shape: Spanned<FlatShape>) {
self.with_tracer(|_, tracer| tracer.add_shape(shape.into_trace_shape(shape.span)));
self.state.shapes.push(ShapeResult::Success(shape));
}
pub fn color_err(&mut self, shape: Spanned<FlatShape>, valid_shapes: Vec<String>) {
self.with_tracer(|_, tracer| tracer.add_err_shape(shape.into_trace_shape(shape.span)));
self.state.errors.insert(shape.span, valid_shapes.clone());
self.state.shapes.push(ShapeResult::Fallback {
shape,
allowed: valid_shapes,
});
}
pub fn color_shapes(&mut self, shapes: Vec<Spanned<FlatShape>>) {
self.with_tracer(|_, tracer| {
for shape in &shapes {
tracer.add_shape(shape.into_trace_shape(shape.span))
}
});
for shape in &shapes {
self.state.shapes.push(ShapeResult::Success(*shape));
}
}
pub fn child<'me, T>(
&'me mut self,
tokens: Spanned<&'me [SpannedToken]>,
block: impl FnOnce(&mut TokensIterator<'me>) -> T,
) -> (Vec<ShapeResult>, T) {
let mut shapes = vec![];
std::mem::swap(&mut shapes, &mut self.state.shapes);
let mut errors = self.state.errors.clone();
std::mem::swap(&mut errors, &mut self.state.errors);
let mut expand_tracer = ExpandTracer::new("Expand Trace", self.source());
std::mem::swap(&mut expand_tracer, &mut self.expand_tracer);
let mut iterator = TokensIterator {
state: TokensIteratorState {
tokens: tokens.item,
span: tokens.span,
index: 0,
seen: indexmap::IndexSet::new(),
shapes,
errors,
context: self.state.context.clone(),
},
expand_tracer,
};
let result = block(&mut iterator);
std::mem::swap(&mut iterator.state.shapes, &mut self.state.shapes);
std::mem::swap(&mut iterator.state.errors, &mut self.state.errors);
std::mem::swap(&mut iterator.expand_tracer, &mut self.expand_tracer);
(iterator.state.shapes, result)
}
fn with_tracer(
&mut self,
block: impl FnOnce(&mut TokensIteratorState, &mut ExpandTracer<SpannedExpression>),
) {
let state = &mut self.state;
let tracer = &mut self.expand_tracer;
block(state, tracer)
}
pub fn finish_tracer(&mut self) {
self.with_tracer(|_, tracer| tracer.finish())
}
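/// Run `block` atomically: if it returns an error, the iterator's index,
/// seen-set, and colored shapes are rolled back to their state before the
/// call (via the `Checkpoint`'s `Drop` impl).
///
/// A sketch of typical usage (the `ExpressionShape` name is an assumption
/// for illustration):
///
/// ```ignore
/// // Either the whole expression parses, or no tokens are consumed.
/// let expr = token_nodes.atomic_parse(|token_nodes| {
///     token_nodes.expand_syntax(ExpressionShape)
/// })?;
/// ```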
pub fn atomic_parse<'me, T, E>(
&'me mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, E>,
) -> Result<T, E> {
let state = &mut self.state;
let index = state.index;
let shape_start = state.shapes.len();
let seen = state.seen.clone();
let checkpoint = Checkpoint {
iterator: self,
index,
seen,
committed: false,
shape_start,
};
let value = block(checkpoint.iterator)?;
checkpoint.commit();
Ok(value)
}
fn eof_span(&self) -> Span {
Span::new(self.state.span.end(), self.state.span.end())
}
pub fn span_at_cursor(&mut self) -> Span {
let next = self.peek();
match next.node {
None => self.eof_span(),
Some(node) => node.span(),
}
}
pub fn at_end(&self) -> bool {
next_index(&self.state).is_none()
}
pub fn move_to(&mut self, pos: usize) {
self.state.index = pos;
}
/// Peek the next token in the token stream and return a `Peeked`.
///
/// # Example
///
/// ```ignore
/// let peeked = token_nodes.peek().not_eof();
/// let node = peeked.node;
/// match node.unspanned() {
/// Token::Whitespace => {
/// let node = peeked.commit();
/// return Ok(node.span)
/// }
/// other => return Err(ParseError::mismatch("whitespace", node.spanned_type_name()))
/// }
/// ```
pub fn peek<'me>(&'me mut self) -> Peeked<'content, 'me> {
let state = self.state();
let len = state.tokens.len();
let from = state.index;
let index = next_index(state);
let (node, to) = match index {
None => (None, len),
Some(to) => (Some(&state.tokens[to]), to + 1),
};
Peeked {
node,
iterator: self,
from,
to,
}
}
/// Produce an error corresponding to the next token.
///
/// If the next token is EOF, produce an `UnexpectedEof`. Otherwise, produce a `Mismatch`.
pub fn err_next_token(&mut self, expected: &'static str) -> ParseError {
match next_index(&self.state) {
None => ParseError::unexpected_eof(expected, self.eof_span()),
Some(index) => {
ParseError::mismatch(expected, self.state.tokens[index].spanned_type_name())
}
}
}
fn expand_token_with_token_nodes<
'me,
T: 'me,
U: IntoSpanned<Output = V>,
V: HasFallibleSpan,
F: IntoShapes,
>(
&'me mut self,
expected: impl TokenType<Output = T>,
block: impl FnOnce(T, &mut Self) -> Result<(F, U), ParseError>,
) -> Result<V, ParseError> {
let desc = expected.desc();
let peeked = self.peek().not_eof(desc.borrow())?;
let (shapes, val) = {
let node = peeked.node;
let type_name = node.spanned_type_name();
let func = Box::new(|| Err(ParseError::mismatch(desc.clone().into_owned(), type_name)));
match expected.extract_token_value(node, &func) {
Err(err) => return Err(err),
Ok(value) => match block(value, peeked.iterator) {
Err(err) => return Err(err),
Ok((shape, val)) => {
let span = peeked.node.span();
peeked.commit();
(shape.into_shapes(span), val.into_spanned(span))
}
},
}
};
for shape in &shapes {
self.color_result(shape.clone());
}
Ok(val)
}
/// Expand and color a single token. Takes an `impl TokenType` and produces
/// (() | FlatShape | Vec<Spanned<FlatShape>>, Output) (or an error).
///
/// If a single FlatShape is produced, it is annotated with the span of the
/// original token. Otherwise, each FlatShape in the list must already be
/// annotated.
pub fn expand_token<'me, T, U, V, F>(
&'me mut self,
expected: impl TokenType<Output = T>,
block: impl FnOnce(T) -> Result<(F, U), ParseError>,
) -> Result<V, ParseError>
where
T: 'me,
U: IntoSpanned<Output = V>,
V: HasFallibleSpan,
F: IntoShapes,
{
self.expand_token_with_token_nodes(expected, |value, _| block(value))
}
fn commit(&mut self, from: usize, to: usize) {
for index in from..to {
self.state.seen.insert(index);
}
self.state.index = to;
}
pub fn debug_remaining(&self) -> Vec<SpannedToken> {
let mut tokens: TokensIterator = self.clone();
tokens.move_to(0);
tokens.cloned().collect()
}
/// Expand an `ExpandSyntax` whose output is a `Result`, producing either the shape's output
/// or a `ParseError`. If the token stream is at EOF, this method produces a ParseError
/// (`UnexpectedEof`).
///
/// You must use `expand_syntax` if the `Output` of the `ExpandSyntax` is a `Result`, but
/// it's difficult to model this in the Rust type system.
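///
/// A minimal sketch of the calling convention (the `NumberShape` name is an
/// assumption for illustration):
///
/// ```ignore
/// // Fails with `UnexpectedEof` if there are no tokens left.
/// let number = token_nodes.expand_syntax(NumberShape)?;
/// ```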
pub fn expand_syntax<U>(
&mut self,
shape: impl ExpandSyntax<Output = Result<U, ParseError>>,
) -> Result<U, ParseError>
where
U: std::fmt::Debug + HasFallibleSpan + PrettyDebugWithSource + Clone + 'static,
{
if self.at_end() {
self.with_tracer(|_, tracer| tracer.start(shape.name(), None));
self.with_tracer(|_, tracer| tracer.eof_frame());
return Err(ParseError::unexpected_eof(shape.name(), self.eof_span()));
}
let (result, added_shapes) = self.expand(shape);
match &result {
Ok(val) => self.finish_expand(val, added_shapes),
Err(err) => self.with_tracer(|_, tracer| tracer.failed(err)),
}
result
}
/// Expand an `impl ExpandSyntax` and produce its `Output`. Use `expand_infallible` only
/// if the `ExpandSyntax` cannot produce a `Result`. You must use `expand_syntax` if
/// EOF is an error.
///
/// The purpose of `expand_infallible` is to clearly mark the infallible path through
/// an entire list of tokens that produces a fully colored version of the source.
///
/// If the `ExpandSyntax` can produce a `Result`, make sure to use `expand_syntax`,
/// which will correctly show the error in the trace.
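///
/// A minimal sketch (using `ExpressionListShape`, which this file expands the
/// same way in `delimited`):
///
/// ```ignore
/// // Never fails; always produces a fully colored expression list.
/// let exprs = token_nodes.expand_infallible(ExpressionListShape).exprs;
/// ```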
pub fn expand_infallible<U>(&mut self, shape: impl ExpandSyntax<Output = U>) -> U
where
U: std::fmt::Debug + PrettyDebugWithSource + HasFallibleSpan + Clone + 'static,
{
let (result, added_shapes) = self.expand(shape);
self.finish_expand(&result, added_shapes);
result
}
fn finish_expand<V>(&mut self, val: &V, added_shapes: usize)
where
V: PrettyDebugWithSource + HasFallibleSpan + Clone,
{
self.with_tracer(|_, tracer| {
if val.maybe_span().is_some() || added_shapes > 0 {
tracer.add_result(val.clone());
}
tracer.success();
})
}
fn expand<U>(&mut self, shape: impl ExpandSyntax<Output = U>) -> (U, usize)
where
U: std::fmt::Debug + Clone + 'static,
{
let desc = shape.name();
self.with_tracer(|state, tracer| {
tracer.start(
desc,
next_index(state).map(|index| state.tokens[index].clone()),
)
});
let start_shapes = self.state.shapes.len();
let result = shape.expand(self);
let added_shapes = self.state.shapes.len() - start_shapes;
(result, added_shapes)
}
}
impl<'content> Iterator for TokensIterator<'content> {
type Item = &'content SpannedToken;
fn next(&mut self) -> Option<Self::Item> {
next(self)
}
}
fn next_index(state: &TokensIteratorState) -> Option<usize> {
let mut to = state.index;
loop {
if to >= state.tokens.len() {
return None;
}
if state.seen.contains(&to) {
to += 1;
continue;
}
return Some(to);
}
}
fn next<'me, 'content>(
iterator: &'me mut TokensIterator<'content>,
) -> Option<&'content SpannedToken> {
let next = next_index(&iterator.state);
let len = iterator.len();
match next {
None => {
iterator.move_to(len);
None
}
Some(index) => {
iterator.move_to(index + 1);
Some(&iterator.state.tokens[index])
}
}
}


@ -0,0 +1,36 @@
#![allow(unused)]
pub(crate) mod color_trace;
pub(crate) mod expand_trace;
pub(crate) use self::color_trace::*;
pub(crate) use self::expand_trace::*;
use crate::hir::tokens_iterator::TokensIteratorState;
use nu_source::{PrettyDebug, PrettyDebugWithSource, Text};
#[derive(Debug)]
pub(crate) enum DebugIteratorToken {
Seen(String),
Unseen(String),
Cursor,
}
pub(crate) fn debug_tokens(state: &TokensIteratorState, source: &str) -> Vec<DebugIteratorToken> {
let mut out = vec![];
for (i, token) in state.tokens.iter().enumerate() {
if state.index == i {
out.push(DebugIteratorToken::Cursor);
}
let msg = token.debug(source).to_string();
if state.seen.contains(&i) {
out.push(DebugIteratorToken::Seen(msg));
} else {
out.push(DebugIteratorToken::Unseen(msg));
}
}
out
}


@ -0,0 +1,363 @@
use crate::hir::syntax_shape::flat_shape::{FlatShape, ShapeResult};
use ansi_term::Color;
use log::trace;
use nu_errors::{ParseError, ShellError};
use nu_source::{Spanned, Text};
use ptree::*;
use std::borrow::Cow;
use std::io;
#[derive(Debug, Clone)]
pub enum FrameChild {
#[allow(unused)]
Shape(ShapeResult),
Frame(ColorFrame),
}
impl FrameChild {
fn colored_leaf_description(&self, text: &Text, f: &mut impl io::Write) -> io::Result<()> {
match self {
FrameChild::Shape(ShapeResult::Success(shape)) => write!(
f,
"{} {:?}",
Color::White
.bold()
.on(Color::Green)
.paint(format!("{:?}", shape.item)),
shape.span.slice(text)
),
FrameChild::Shape(ShapeResult::Fallback { shape, .. }) => write!(
f,
"{} {:?}",
Color::White
.bold()
.on(Color::Green)
.paint(format!("{:?}", shape.item)),
shape.span.slice(text)
),
FrameChild::Frame(frame) => frame.colored_leaf_description(f),
}
}
fn into_tree_child(self, text: &Text) -> TreeChild {
match self {
FrameChild::Shape(shape) => TreeChild::Shape(shape, text.clone()),
FrameChild::Frame(frame) => TreeChild::Frame(frame, text.clone()),
}
}
}
#[derive(Debug, Clone)]
pub struct ColorFrame {
description: &'static str,
children: Vec<FrameChild>,
error: Option<ParseError>,
}
impl ColorFrame {
fn colored_leaf_description(&self, f: &mut impl io::Write) -> io::Result<()> {
if self.has_only_error_descendents() {
if self.children.is_empty() {
write!(
f,
"{}",
Color::White.bold().on(Color::Red).paint(self.description)
)
} else {
write!(f, "{}", Color::Red.normal().paint(self.description))
}
} else if self.has_descendent_shapes() {
write!(f, "{}", Color::Green.normal().paint(self.description))
} else {
write!(f, "{}", Color::Yellow.bold().paint(self.description))
}
}
fn colored_description(&self, text: &Text, f: &mut impl io::Write) -> io::Result<()> {
if self.children.len() == 1 {
let child = &self.children[0];
self.colored_leaf_description(f)?;
write!(f, " -> ")?;
child.colored_leaf_description(text, f)
} else {
self.colored_leaf_description(f)
}
}
fn children_for_formatting(&self, text: &Text) -> Vec<TreeChild> {
if self.children.len() == 1 {
let child = &self.children[0];
match child {
FrameChild::Shape(_) => vec![],
FrameChild::Frame(frame) => frame.tree_children(text),
}
} else {
self.tree_children(text)
}
}
fn tree_children(&self, text: &Text) -> Vec<TreeChild> {
self.children
.clone()
.into_iter()
.map(|c| c.into_tree_child(text))
.collect()
}
fn add_shape(&mut self, shape: ShapeResult) {
self.children.push(FrameChild::Shape(shape))
}
fn has_child_shapes(&self) -> bool {
self.any_child_shape(|_| true)
}
fn any_child_shape(&self, predicate: impl Fn(&ShapeResult) -> bool) -> bool {
for item in &self.children {
if let FrameChild::Shape(shape) = item {
if predicate(shape) {
return true;
}
}
}
false
}
fn any_child_frame(&self, predicate: impl Fn(&ColorFrame) -> bool) -> bool {
for item in &self.children {
if let FrameChild::Frame(frame) = item {
if predicate(frame) {
return true;
}
}
}
false
}
fn has_descendent_shapes(&self) -> bool {
if self.has_child_shapes() {
true
} else {
self.any_child_frame(|frame| frame.has_descendent_shapes())
}
}
fn has_only_error_descendents(&self) -> bool {
if self.children.is_empty() {
// if this frame has no children at all, it has only error descendents if this frame
// is an error
self.error.is_some()
} else {
// otherwise, it has only error descendents if all of its children terminate in an
// error (transitively)
let mut seen_error = false;
for child in &self.children {
match child {
// if this frame has at least one child shape, this frame has non-error descendents
FrameChild::Shape(_) => return false,
FrameChild::Frame(frame) => {
// if the child frame has only error descendents, remember that we saw an error
if frame.has_only_error_descendents() {
seen_error = true;
} else {
return false;
}
}
}
}
seen_error
}
}
}
#[derive(Debug, Clone)]
pub enum TreeChild {
Shape(ShapeResult, Text),
Frame(ColorFrame, Text),
}
impl TreeChild {
fn colored_leaf_description(&self, f: &mut impl io::Write) -> io::Result<()> {
match self {
TreeChild::Shape(ShapeResult::Success(shape), text) => write!(
f,
"{} {:?}",
Color::White
.bold()
.on(Color::Green)
.paint(format!("{:?}", shape.item)),
shape.span.slice(text)
),
TreeChild::Shape(ShapeResult::Fallback { shape, .. }, text) => write!(
f,
"{} {:?}",
Color::White
.bold()
.on(Color::Green)
.paint(format!("{:?}", shape.item)),
shape.span.slice(text)
),
TreeChild::Frame(frame, _) => frame.colored_leaf_description(f),
}
}
}
impl TreeItem for TreeChild {
type Child = TreeChild;
fn write_self<W: io::Write>(&self, f: &mut W, _style: &Style) -> io::Result<()> {
match self {
shape @ TreeChild::Shape(..) => shape.colored_leaf_description(f),
TreeChild::Frame(frame, text) => frame.colored_description(text, f),
}
}
fn children(&self) -> Cow<[Self::Child]> {
match self {
TreeChild::Shape(..) => Cow::Borrowed(&[]),
TreeChild::Frame(frame, text) => Cow::Owned(frame.children_for_formatting(text)),
}
}
}
#[derive(Debug, Clone)]
pub struct ColorTracer {
frame_stack: Vec<ColorFrame>,
source: Text,
}
impl ColorTracer {
pub fn print(self, source: Text) -> PrintTracer {
PrintTracer {
tracer: self,
source,
}
}
pub fn new(source: Text) -> ColorTracer {
let root = ColorFrame {
description: "Trace",
children: vec![],
error: None,
};
ColorTracer {
frame_stack: vec![root],
source,
}
}
fn current_frame(&mut self) -> &mut ColorFrame {
let frames = &mut self.frame_stack;
let last = frames.len() - 1;
&mut frames[last]
}
fn pop_frame(&mut self) -> ColorFrame {
trace!(target: "nu::color_syntax", "Popping {:#?}", self);
let result = self.frame_stack.pop().expect("Can't pop root tracer frame");
if self.frame_stack.is_empty() {
panic!("Can't pop root tracer frame {:#?}", self);
}
self.debug();
result
}
pub fn start(&mut self, description: &'static str) {
let frame = ColorFrame {
description,
children: vec![],
error: None,
};
self.frame_stack.push(frame);
self.debug();
}
pub fn eof_frame(&mut self) {
let current = self.pop_frame();
self.current_frame()
.children
.push(FrameChild::Frame(current));
}
#[allow(unused)]
pub fn finish(&mut self) {
loop {
if self.frame_stack.len() == 1 {
break;
}
let frame = self.pop_frame();
self.current_frame().children.push(FrameChild::Frame(frame));
}
}
pub fn add_shape(&mut self, shape: ShapeResult) {
self.current_frame().add_shape(shape);
}
pub fn success(&mut self) {
let current = self.pop_frame();
self.current_frame()
.children
.push(FrameChild::Frame(current));
}
pub fn failed(&mut self, error: &ParseError) {
let mut current = self.pop_frame();
current.error = Some(error.clone());
self.current_frame()
.children
.push(FrameChild::Frame(current));
}
fn debug(&self) {
trace!(target: "nu::color_syntax",
"frames = {:?}",
self.frame_stack
.iter()
.map(|f| f.description)
.collect::<Vec<_>>()
);
trace!(target: "nu::color_syntax", "{:#?}", self);
}
}
#[derive(Debug, Clone)]
pub struct PrintTracer {
tracer: ColorTracer,
source: Text,
}
impl TreeItem for PrintTracer {
type Child = TreeChild;
fn write_self<W: io::Write>(&self, f: &mut W, style: &Style) -> io::Result<()> {
write!(f, "{}", style.paint("Color Trace"))
}
fn children(&self) -> Cow<[Self::Child]> {
Cow::Owned(vec![TreeChild::Frame(
self.tracer.frame_stack[0].clone(),
self.source.clone(),
)])
}
}


@ -0,0 +1,494 @@
use crate::hir::syntax_shape::flat_shape::TraceShape;
use crate::hir::SpannedExpression;
use crate::parse::token_tree::SpannedToken;
use ansi_term::Color;
use log::trace;
use nu_errors::{ParseError, ParseErrorReason};
use nu_protocol::{ShellTypeName, SpannedTypeName};
use nu_source::{DebugDoc, PrettyDebug, PrettyDebugWithSource, Span, Spanned, Text};
use ptree::*;
use std::borrow::Cow;
use std::fmt::Debug;
use std::io;
#[derive(Debug, Clone)]
pub enum FrameChild<T: SpannedTypeName> {
Expr(T),
Shape(Result<TraceShape, TraceShape>),
Frame(Box<ExprFrame<T>>),
Result(DebugDoc),
}
fn err_desc(error: &ParseError) -> &'static str {
match error.reason() {
ParseErrorReason::ExtraTokens { .. } => "extra tokens",
ParseErrorReason::Mismatch { .. } => "mismatch",
ParseErrorReason::ArgumentError { .. } => "argument error",
ParseErrorReason::Eof { .. } => "eof",
ParseErrorReason::InternalError { .. } => "internal error",
}
}
impl<T: SpannedTypeName> FrameChild<T> {
fn get_error_leaf(&self) -> Option<(&'static str, &'static str)> {
match self {
FrameChild::Frame(frame) => {
if let Some(error) = &frame.error {
if frame.children.is_empty() {
Some((frame.description, err_desc(error)))
} else {
None
}
} else {
None
}
}
_ => None,
}
}
fn to_tree_child(&self, text: &Text) -> TreeChild {
match self {
FrameChild::Expr(expr) => TreeChild::OkExpr {
source: expr.spanned_type_name().span,
desc: expr.spanned_type_name().item,
text: text.clone(),
},
FrameChild::Shape(Ok(shape)) => TreeChild::OkShape {
source: shape.spanned_type_name().span,
desc: shape.spanned_type_name().item,
text: text.clone(),
fallback: false,
},
FrameChild::Shape(Err(shape)) => TreeChild::OkShape {
source: shape.spanned_type_name().span,
desc: shape.spanned_type_name().item,
text: text.clone(),
fallback: true,
},
FrameChild::Result(result) => {
let result = result.display();
TreeChild::OkNonExpr(result)
}
FrameChild::Frame(frame) => {
if let Some(err) = &frame.error {
if frame.children.is_empty() {
TreeChild::ErrorLeaf(
vec![(frame.description, err_desc(err))],
frame.token_desc(),
)
} else {
TreeChild::ErrorFrame(frame.to_tree_frame(text), text.clone())
}
} else {
TreeChild::OkFrame(frame.to_tree_frame(text), text.clone())
}
}
}
}
}
#[derive(Debug, Clone)]
pub struct ExprFrame<T: SpannedTypeName> {
description: &'static str,
token: Option<SpannedToken>,
children: Vec<FrameChild<T>>,
error: Option<ParseError>,
}
impl<T: SpannedTypeName> ExprFrame<T> {
fn token_desc(&self) -> &'static str {
match &self.token {
None => "EOF",
Some(token) => token.type_name(),
}
}
fn to_tree_frame(&self, text: &Text) -> TreeFrame {
let mut children = vec![];
let mut errors = vec![];
for child in &self.children {
if let Some(error_leaf) = child.get_error_leaf() {
errors.push(error_leaf);
continue;
} else if !errors.is_empty() {
children.push(TreeChild::ErrorLeaf(errors, self.token_desc()));
errors = vec![];
}
children.push(child.to_tree_child(text));
}
if !errors.is_empty() {
children.push(TreeChild::ErrorLeaf(errors, self.token_desc()));
}
TreeFrame {
description: self.description,
token_desc: self.token_desc(),
children,
error: self.error.clone(),
}
}
fn add_return(&mut self, value: T) {
self.children.push(FrameChild::Expr(value))
}
fn add_shape(&mut self, shape: TraceShape) {
self.children.push(FrameChild::Shape(Ok(shape)))
}
fn add_err_shape(&mut self, shape: TraceShape) {
self.children.push(FrameChild::Shape(Err(shape)))
}
fn add_result(&mut self, result: impl PrettyDebug) {
self.children.push(FrameChild::Result(result.to_doc()))
}
}
#[derive(Debug, Clone)]
pub struct TreeFrame {
description: &'static str,
token_desc: &'static str,
children: Vec<TreeChild>,
error: Option<ParseError>,
}
impl TreeFrame {
fn leaf_description(&self, f: &mut impl io::Write) -> io::Result<()> {
if self.children.len() == 1 {
if self.error.is_some() {
write!(f, "{}", Color::Red.normal().paint(self.description))?;
} else if self.has_descendent_green() {
write!(f, "{}", Color::Green.normal().paint(self.description))?;
} else {
write!(f, "{}", Color::Yellow.bold().paint(self.description))?;
}
write!(
f,
"{}",
Color::White.bold().paint(&format!("({})", self.token_desc))
)?;
write!(f, " -> ")?;
self.children[0].leaf_description(f)
} else {
if self.error.is_some() {
if self.children.is_empty() {
write!(
f,
"{}",
Color::White.bold().on(Color::Red).paint(self.description)
)?
} else {
write!(f, "{}", Color::Red.normal().paint(self.description))?
}
} else if self.has_descendent_green() {
write!(f, "{}", Color::Green.normal().paint(self.description))?
} else {
write!(f, "{}", Color::Yellow.bold().paint(self.description))?
}
write!(
f,
"{}",
Color::White.bold().paint(&format!("({})", self.token_desc))
)
}
}
fn has_child_green(&self) -> bool {
self.children.iter().any(|item| match item {
TreeChild::OkFrame(..) | TreeChild::ErrorFrame(..) | TreeChild::ErrorLeaf(..) => false,
TreeChild::OkExpr { .. } | TreeChild::OkShape { .. } | TreeChild::OkNonExpr(..) => true,
})
}
fn any_child_frame(&self, predicate: impl Fn(&TreeFrame) -> bool) -> bool {
for item in &self.children {
if let TreeChild::OkFrame(frame, ..) = item {
if predicate(frame) {
return true;
}
}
}
false
}
fn has_descendent_green(&self) -> bool {
if self.has_child_green() {
true
} else {
self.any_child_frame(|frame| frame.has_child_green())
}
}
fn children_for_formatting(&self, text: &Text) -> Vec<TreeChild> {
if self.children.len() == 1 {
let child: &TreeChild = &self.children[0];
match child {
TreeChild::OkExpr { .. }
| TreeChild::OkShape { .. }
| TreeChild::OkNonExpr(..)
| TreeChild::ErrorLeaf(..) => vec![],
TreeChild::OkFrame(frame, _) | TreeChild::ErrorFrame(frame, _) => {
frame.children_for_formatting(text)
}
}
} else {
self.children.clone()
}
}
}
#[derive(Debug, Clone)]
pub enum TreeChild {
OkNonExpr(String),
OkExpr {
source: Span,
desc: &'static str,
text: Text,
},
OkShape {
source: Span,
desc: &'static str,
text: Text,
fallback: bool,
},
OkFrame(TreeFrame, Text),
ErrorFrame(TreeFrame, Text),
ErrorLeaf(Vec<(&'static str, &'static str)>, &'static str),
}
impl TreeChild {
fn leaf_description(&self, f: &mut impl io::Write) -> io::Result<()> {
match self {
TreeChild::OkExpr { source, desc, text } => write!(
f,
"{} {} {}",
Color::Cyan.normal().paint("returns"),
Color::White.bold().on(Color::Green).paint(*desc),
source.slice(text)
),
TreeChild::OkShape {
source,
desc,
text,
fallback,
} => write!(
f,
"{} {} {}",
Color::Purple.normal().paint("paints"),
Color::White.bold().on(Color::Green).paint(*desc),
source.slice(text)
),
TreeChild::OkNonExpr(result) => write!(
f,
"{} {}",
Color::Cyan.normal().paint("returns"),
Color::White
.bold()
.on(Color::Green)
.paint(result.to_string())
),
TreeChild::ErrorLeaf(desc, token_desc) => {
let last = desc.len() - 1;
for (i, (desc, err_desc)) in desc.iter().enumerate() {
write!(f, "{}", Color::White.bold().on(Color::Red).paint(*desc))?;
write!(f, " {}", Color::White.bold().paint(*err_desc))?;
if i != last {
write!(f, "{}", Color::White.normal().paint(", "))?;
}
}
// write!(f, " {}", Color::Black.bold().paint(*token_desc))?;
Ok(())
}
TreeChild::ErrorFrame(frame, _) | TreeChild::OkFrame(frame, _) => {
frame.leaf_description(f)
}
}
}
}
impl TreeItem for TreeChild {
type Child = TreeChild;
fn write_self<W: io::Write>(&self, f: &mut W, _style: &Style) -> io::Result<()> {
self.leaf_description(f)
}
fn children(&self) -> Cow<[Self::Child]> {
match self {
TreeChild::OkExpr { .. }
| TreeChild::OkShape { .. }
| TreeChild::OkNonExpr(..)
| TreeChild::ErrorLeaf(..) => Cow::Borrowed(&[]),
TreeChild::OkFrame(frame, text) | TreeChild::ErrorFrame(frame, text) => {
Cow::Owned(frame.children_for_formatting(text))
}
}
}
}
#[derive(Debug, Clone)]
pub struct ExpandTracer<T: SpannedTypeName> {
desc: &'static str,
frame_stack: Vec<ExprFrame<T>>,
source: Text,
}
impl<T: SpannedTypeName + Debug> ExpandTracer<T> {
pub fn print(&self, source: Text) -> PrintTracer {
let root = self.frame_stack[0].to_tree_frame(&source);
PrintTracer {
root,
desc: self.desc,
source,
}
}
pub fn new(desc: &'static str, source: Text) -> ExpandTracer<T> {
let root = ExprFrame {
description: "Trace",
children: vec![],
token: None,
error: None,
};
ExpandTracer {
desc,
frame_stack: vec![root],
source,
}
}
fn current_frame(&mut self) -> &mut ExprFrame<T> {
let frames = &mut self.frame_stack;
let last = frames.len() - 1;
&mut frames[last]
}
fn pop_frame(&mut self) -> ExprFrame<T> {
let result = self.frame_stack.pop().expect("Can't pop root tracer frame");
if self.frame_stack.is_empty() {
panic!("Can't pop root tracer frame");
}
self.debug();
result
}
pub fn start(&mut self, description: &'static str, token: Option<SpannedToken>) {
let frame = ExprFrame {
description,
children: vec![],
token,
error: None,
};
self.frame_stack.push(frame);
self.debug();
}
pub fn add_return(&mut self, value: T) {
self.current_frame().add_return(value);
}
pub fn add_shape(&mut self, shape: TraceShape) {
self.current_frame().add_shape(shape);
}
pub fn add_err_shape(&mut self, shape: TraceShape) {
self.current_frame().add_err_shape(shape);
}
pub fn finish(&mut self) {
loop {
if self.frame_stack.len() == 1 {
break;
}
let frame = self.pop_frame();
self.current_frame()
.children
.push(FrameChild::Frame(Box::new(frame)));
}
}
pub fn eof_frame(&mut self) {
let current = self.pop_frame();
self.current_frame()
.children
.push(FrameChild::Frame(Box::new(current)));
}
pub fn add_result(&mut self, result: impl PrettyDebugWithSource) {
let source = self.source.clone();
self.current_frame().add_result(result.debuggable(source));
}
pub fn success(&mut self) {
trace!(target: "parser::expand_syntax", "success {:#?}", self);
let current = self.pop_frame();
self.current_frame()
.children
.push(FrameChild::Frame(Box::new(current)));
}
pub fn failed(&mut self, error: &ParseError) {
let mut current = self.pop_frame();
current.error = Some(error.clone());
self.current_frame()
.children
.push(FrameChild::Frame(Box::new(current)));
}
fn debug(&self) {
trace!(target: "nu::parser::expand",
"frames = {:?}",
self.frame_stack
.iter()
.map(|f| f.description)
.collect::<Vec<_>>()
);
trace!(target: "nu::parser::expand", "{:#?}", self);
}
}
#[derive(Debug, Clone)]
pub struct PrintTracer {
desc: &'static str,
root: TreeFrame,
source: Text,
}
impl TreeItem for PrintTracer {
type Child = TreeChild;
fn write_self<W: io::Write>(&self, f: &mut W, style: &Style) -> io::Result<()> {
write!(f, "{}", style.paint(self.desc))
}
fn children(&self) -> Cow<[Self::Child]> {
Cow::Borrowed(&self.root.children)
}
}


@ -0,0 +1,56 @@
use crate::hir::syntax_shape::flat_shape::{FlatShape, ShapeResult};
use nu_source::{Span, Spanned, SpannedItem};
pub struct FlatShapes {
shapes: Vec<ShapeResult>,
}
impl<'a> IntoIterator for &'a FlatShapes {
type Item = &'a ShapeResult;
type IntoIter = std::slice::Iter<'a, ShapeResult>;
fn into_iter(self) -> Self::IntoIter {
self.shapes.iter()
}
}
pub trait IntoShapes: 'static {
fn into_shapes(self, span: Span) -> FlatShapes;
}
impl IntoShapes for FlatShape {
fn into_shapes(self, span: Span) -> FlatShapes {
FlatShapes {
shapes: vec![ShapeResult::Success(self.spanned(span))],
}
}
}
impl IntoShapes for Vec<Spanned<FlatShape>> {
fn into_shapes(self, _span: Span) -> FlatShapes {
FlatShapes {
shapes: self.into_iter().map(ShapeResult::Success).collect(),
}
}
}
impl IntoShapes for Vec<ShapeResult> {
fn into_shapes(self, _span: Span) -> FlatShapes {
FlatShapes { shapes: self }
}
}
impl IntoShapes for () {
fn into_shapes(self, _span: Span) -> FlatShapes {
FlatShapes { shapes: vec![] }
}
}
impl IntoShapes for Option<FlatShape> {
fn into_shapes(self, span: Span) -> FlatShapes {
match self {
Option::None => ().into_shapes(span),
Option::Some(shape) => shape.into_shapes(span),
}
}
}


@ -0,0 +1,30 @@
use crate::parse::token_tree::{ParseErrorFn, SpannedToken, TokenType};
use nu_errors::ParseError;
use std::borrow::Cow;
pub struct Pattern<T> {
parts: Vec<Box<dyn TokenType<Output = T>>>,
}
impl<T> TokenType for Pattern<T> {
type Output = T;
fn desc(&self) -> Cow<'static, str> {
Cow::Borrowed("pattern")
}
fn extract_token_value(
&self,
token: &SpannedToken,
err: ParseErrorFn<Self::Output>,
) -> Result<Self::Output, ParseError> {
for part in &self.parts {
match part.extract_token_value(token, err) {
Err(_) => {}
Ok(result) => return Ok(result),
}
}
err()
}
}


@ -0,0 +1,105 @@
use crate::hir::syntax_shape::flat_shape::ShapeResult;
use crate::hir::syntax_shape::ExpandContext;
use crate::hir::tokens_iterator::TokensIterator;
use crate::parse::token_tree::SpannedToken;
use getset::Getters;
use nu_errors::ParseError;
use nu_protocol::SpannedTypeName;
use nu_source::Span;
use std::sync::Arc;
#[derive(Getters, Debug, Clone)]
pub struct TokensIteratorState<'content> {
pub(crate) tokens: &'content [SpannedToken],
pub(crate) span: Span,
pub(crate) index: usize,
pub(crate) seen: indexmap::IndexSet<usize>,
#[get = "pub"]
pub(crate) shapes: Vec<ShapeResult>,
pub(crate) errors: indexmap::IndexMap<Span, Vec<String>>,
pub(crate) context: Arc<ExpandContext<'content>>,
}
#[derive(Debug)]
pub struct Peeked<'content, 'me> {
pub(crate) node: Option<&'content SpannedToken>,
pub(crate) iterator: &'me mut TokensIterator<'content>,
pub(crate) from: usize,
pub(crate) to: usize,
}
impl<'content, 'me> Peeked<'content, 'me> {
pub fn commit(&mut self) -> Option<&'content SpannedToken> {
let Peeked {
node,
iterator,
from,
to,
} = self;
let node = (*node)?;
iterator.commit(*from, *to);
Some(node)
}
pub fn rollback(self) {}
pub fn not_eof(self, expected: &str) -> Result<PeekedNode<'content, 'me>, ParseError> {
match self.node {
None => Err(ParseError::unexpected_eof(
expected.to_string(),
self.iterator.eof_span(),
)),
Some(node) => Ok(PeekedNode {
node,
iterator: self.iterator,
from: self.from,
to: self.to,
}),
}
}
pub fn type_error(&self, expected: &'static str) -> ParseError {
peek_error(self.node, self.iterator.eof_span(), expected)
}
}
#[derive(Debug)]
pub struct PeekedNode<'content, 'me> {
pub(crate) node: &'content SpannedToken,
pub(crate) iterator: &'me mut TokensIterator<'content>,
from: usize,
to: usize,
}
impl<'content, 'me> PeekedNode<'content, 'me> {
pub fn commit(self) -> &'content SpannedToken {
let PeekedNode {
node,
iterator,
from,
to,
} = self;
iterator.commit(from, to);
node
}
pub fn rollback(self) {}
pub fn type_error(&self, expected: &'static str) -> ParseError {
peek_error(Some(self.node), self.iterator.eof_span(), expected)
}
}
pub fn peek_error(
node: Option<&SpannedToken>,
eof_span: Span,
expected: &'static str,
) -> ParseError {
match node {
None => ParseError::unexpected_eof(expected, eof_span),
Some(node) => ParseError::mismatch(expected, node.spanned_type_name()),
}
}
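`Peeked`/`PeekedNode` implement a checkpointing idiom: peeking records a `from`/`to` range without touching the iterator, and the caller either commits (advancing the underlying index) or rolls back by simply dropping the value. A minimal standalone sketch of the commit-or-drop idiom over a plain slice (names and types are illustrative, not the crate's):

```rust
// Minimal commit-or-drop peek over a slice of tokens.
struct Tokens<'a> {
    items: &'a [&'a str],
    index: usize,
}

struct Peeked<'a, 'me> {
    node: Option<&'a str>,
    iterator: &'me mut Tokens<'a>,
    to: usize,
}

impl<'a> Tokens<'a> {
    // Peeking records where a commit would move the index, but does not move it.
    fn peek<'me>(&'me mut self) -> Peeked<'a, 'me> {
        let to = self.index + 1;
        let node = self.items.get(self.index).copied();
        Peeked { node, iterator: self, to }
    }
}

impl<'a, 'me> Peeked<'a, 'me> {
    // Commit consumes the peeked token by advancing the underlying index.
    fn commit(self) -> Option<&'a str> {
        let node = self.node?;
        self.iterator.index = self.to;
        Some(node)
    }

    // Rollback is just dropping the value: the index was never advanced.
    fn rollback(self) {}
}

fn main() {
    let items = ["ls", "|", "sort-by"];
    let mut tokens = Tokens { items: &items, index: 0 };
    assert_eq!(tokens.peek().commit(), Some("ls"));
    tokens.peek().rollback(); // looked at "|" but did not consume it
    assert_eq!(tokens.peek().commit(), Some("|"));
    assert_eq!(tokens.peek().commit(), Some("sort-by"));
    assert_eq!(tokens.peek().commit(), None);
}
```

Because `Peeked` borrows the iterator mutably, the borrow checker guarantees nothing else can advance the iterator while a peek is outstanding.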


@ -0,0 +1,21 @@
use crate::hir::TokensIterator;
use crate::parse::token_tree_builder::TokenTreeBuilder as b;
use crate::Span;
#[test]
fn supplies_tokens() -> Result<(), Box<dyn std::error::Error>> {
let tokens = b::token_list(vec![b::var("it"), b::op("."), b::bare("cpu")]);
let (tokens, _) = b::build(tokens);
let tokens = tokens.expect_list();
let mut iterator = TokensIterator::new(tokens, Span::unknown());
iterator.next()?.expect_var();
iterator.next()?.expect_dot();
iterator.next()?.expect_bare();
Ok(())
}


@ -0,0 +1,87 @@
#[macro_use]
pub mod macros;
pub mod commands;
pub mod hir;
pub mod parse;
pub mod parse_command;
pub use crate::commands::classified::{
external::ExternalCommand, internal::InternalCommand, ClassifiedCommand, ClassifiedPipeline,
};
pub use crate::hir::syntax_shape::flat_shape::{FlatShape, ShapeResult};
pub use crate::hir::syntax_shape::{ExpandContext, ExpandSyntax, PipelineShape, SignatureRegistry};
pub use crate::hir::tokens_iterator::TokensIterator;
pub use crate::parse::files::Files;
pub use crate::parse::flag::Flag;
pub use crate::parse::operator::{CompareOperator, EvaluationOperator};
pub use crate::parse::parser::Number;
pub use crate::parse::parser::{module, pipeline};
pub use crate::parse::token_tree::{Delimiter, SpannedToken, Token};
pub use crate::parse::token_tree_builder::TokenTreeBuilder;
use log::log_enabled;
use nu_errors::ShellError;
use nu_protocol::{errln, outln};
use nu_source::{nom_input, HasSpan, Text};
pub fn pipeline_shapes(line: &str, expand_context: ExpandContext) -> Vec<ShapeResult> {
let tokens = parse_pipeline(line);
match tokens {
Err(_) => vec![],
Ok(v) => {
let pipeline = match v.as_pipeline() {
Err(_) => return vec![],
Ok(v) => v,
};
let tokens = vec![Token::Pipeline(pipeline).into_spanned(v.span())];
let mut tokens = TokensIterator::new(&tokens[..], expand_context, v.span());
let shapes = {
// We just constructed a token list that only contains a pipeline, so it can't fail
let result = tokens.expand_infallible(PipelineShape);
if let Some(failure) = result.failed {
errln!(
"BUG: PipelineShape didn't find a pipeline :: {:#?}",
failure
);
}
tokens.finish_tracer();
tokens.state().shapes()
};
if log_enabled!(target: "nu::expand_syntax", log::Level::Debug) {
outln!("");
let _ = ptree::print_tree(&tokens.expand_tracer().clone().print(Text::from(line)));
outln!("");
}
shapes.clone()
}
}
}
pub fn parse_pipeline(input: &str) -> Result<SpannedToken, ShellError> {
let _ = pretty_env_logger::try_init();
match pipeline(nom_input(input)) {
Ok((_rest, val)) => Ok(val),
Err(err) => Err(ShellError::parse_error(err)),
}
}
pub use parse_pipeline as parse;
pub fn parse_script(input: &str) -> Result<SpannedToken, ShellError> {
let _ = pretty_env_logger::try_init();
match module(nom_input(input)) {
Ok((_rest, val)) => Ok(val),
Err(err) => Err(ShellError::parse_error(err)),
}
}


@ -0,0 +1,9 @@
#[macro_export]
macro_rules! return_ok {
($expr:expr) => {
match $expr {
Ok(val) => return Ok(val),
Err(_) => {}
}
};
}
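`return_ok!` is an early-return helper: it unwraps an `Ok` straight out of the enclosing function and silently swallows an `Err` so the caller can fall through to the next alternative. A small usage sketch (the `parse_bool` function is an invented example, not part of the crate):

```rust
// Early-return on success, fall through on failure.
macro_rules! return_ok {
    ($expr:expr) => {
        match $expr {
            Ok(val) => return Ok(val),
            Err(_) => {}
        }
    };
}

// Try several parsers in order; the first success wins.
fn parse_bool(input: &str) -> Result<bool, String> {
    return_ok!(input.parse::<bool>().map_err(|e| e.to_string()));
    // Fall back to numeric truthiness if "true"/"false" parsing failed.
    return_ok!(input.parse::<i64>().map(|n| n != 0).map_err(|e| e.to_string()));
    Err(format!("not a boolean: {:?}", input))
}

fn main() {
    assert_eq!(parse_bool("true"), Ok(true));
    assert_eq!(parse_bool("0"), Ok(false));
    assert!(parse_bool("maybe").is_err());
}
```

This keeps alternative-chaining parsers flat instead of nesting `match` arms for each fallback.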


@ -1,12 +1,12 @@
pub(crate) mod call_node;
pub(crate) mod comment;
pub(crate) mod files;
pub(crate) mod flag;
pub(crate) mod number;
pub(crate) mod operator;
pub(crate) mod parser;
pub(crate) mod pipeline;
pub(crate) mod text;
pub(crate) mod token_tree;
pub(crate) mod token_tree_builder;
pub(crate) mod tokens;
pub(crate) mod unit;
pub(crate) mod util;


@ -0,0 +1,45 @@
use crate::parse::token_tree::SpannedToken;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource};
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters)]
pub struct CallNode {
#[get = "pub(crate)"]
head: Box<SpannedToken>,
#[get = "pub(crate)"]
children: Option<Vec<SpannedToken>>,
}
impl PrettyDebugWithSource for CallNode {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"call",
self.head.pretty_debug(source)
+ b::preceded(
b::space(),
b::intersperse(
self.children.iter().flat_map(|children| {
children.iter().map(|child| child.pretty_debug(source))
}),
b::space(),
),
),
)
}
}
impl CallNode {
pub fn new(head: Box<SpannedToken>, children: Vec<SpannedToken>) -> CallNode {
if children.is_empty() {
CallNode {
head,
children: None,
}
} else {
CallNode {
head,
children: Some(children),
}
}
}
}
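`CallNode::new` normalizes an empty argument list to `None`, so "no children" and "empty children" collapse into a single state and downstream code never has to distinguish them. A tiny standalone sketch of the same normalization (the `normalize` helper is illustrative):

```rust
// Normalize an empty child list to None, as CallNode::new does,
// so Option::None is the single representation of "no arguments".
fn normalize(children: Vec<&str>) -> Option<Vec<&str>> {
    if children.is_empty() {
        None
    } else {
        Some(children)
    }
}

fn main() {
    assert_eq!(normalize(vec![]), None);
    assert_eq!(normalize(vec!["-l"]), Some(vec!["-l"]));
}
```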


@ -0,0 +1,34 @@
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Span};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum CommentKind {
Line,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Getters, new)]
pub struct Comment {
pub(crate) kind: CommentKind,
pub(crate) text: Span,
}
impl Comment {
pub fn line(text: impl Into<Span>) -> Comment {
Comment {
kind: CommentKind::Line,
text: text.into(),
}
}
}
impl PrettyDebugWithSource for Comment {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
let prefix = match self.kind {
CommentKind::Line => b::description("#"),
};
prefix + b::description(self.text.slice(source))
}
}


@ -0,0 +1,151 @@
use derive_new::new;
use language_reporting::{FileName, Location};
use log::trace;
use nu_source::Span;
#[derive(new, Debug, Clone)]
pub struct Files {
snippet: String,
}
impl language_reporting::ReportingFiles for Files {
type Span = Span;
type FileId = usize;
fn byte_span(
&self,
_file: Self::FileId,
from_index: usize,
to_index: usize,
) -> Option<Self::Span> {
Some(Span::new(from_index, to_index))
}
fn file_id(&self, _tag: Self::Span) -> Self::FileId {
0
}
fn file_name(&self, _file: Self::FileId) -> FileName {
FileName::Verbatim("shell".to_string())
}
fn byte_index(&self, _file: Self::FileId, _line: usize, _column: usize) -> Option<usize> {
unimplemented!("byte_index")
}
fn location(&self, _file: Self::FileId, byte_index: usize) -> Option<Location> {
trace!("finding location for {}", byte_index);
let source = &self.snippet;
let mut seen_lines = 0;
let mut seen_bytes = 0;
for (pos, slice) in source.match_indices('\n') {
trace!(
"searching byte_index={} seen_bytes={} pos={} slice={:?} source.len={} source={:?}",
byte_index,
seen_bytes,
pos,
slice,
source.len(),
source
);
if pos >= byte_index {
trace!(
"returning {}:{} seen_lines={} byte_index={} pos={} seen_bytes={}",
seen_lines,
byte_index - seen_bytes,
seen_lines,
byte_index,
pos,
seen_bytes
);
return Some(language_reporting::Location::new(
seen_lines,
byte_index - seen_bytes,
));
} else {
seen_lines += 1;
// move seen_bytes to the first byte of the next line, matching line_span below
seen_bytes = pos + 1;
}
}
if seen_lines == 0 {
trace!("seen_lines=0 col={}", byte_index);
// if we got here, there were no newlines in the source
Some(language_reporting::Location::new(0, byte_index))
} else {
trace!(
"last line seen_lines={} col={}",
seen_lines,
byte_index - seen_bytes
);
// if we got here and we didn't return, it should mean that we're talking about
// the last line
Some(language_reporting::Location::new(
seen_lines,
byte_index - seen_bytes,
))
}
}
fn line_span(&self, _file: Self::FileId, lineno: usize) -> Option<Self::Span> {
trace!("finding line_span for {}", lineno);
let source = &self.snippet;
let mut seen_lines = 0;
let mut seen_bytes = 0;
for (pos, _) in source.match_indices('\n') {
trace!(
"lineno={} seen_lines={} seen_bytes={} pos={}",
lineno,
seen_lines,
seen_bytes,
pos
);
if seen_lines == lineno {
trace!("returning start={} end={}", seen_bytes, pos);
// If the number of seen lines is the lineno, seen_bytes is the start of the
// line and pos is the end of the line
return Some(Span::new(seen_bytes, pos));
} else {
// If it's not, increment seen_lines, and move seen_bytes to the beginning of
// the next line
seen_lines += 1;
seen_bytes = pos + 1;
}
}
if seen_lines == 0 {
trace!("returning start={} end={}", 0, self.snippet.len() - 1);
// if we got here, there were no newlines in the source
Some(Span::new(0, self.snippet.len() - 1))
} else {
trace!(
"returning start={} end={}",
seen_bytes,
self.snippet.len() - 1
);
// if we got here and we didn't return, it should mean that we're talking about
// the last line
Some(Span::new(seen_bytes, self.snippet.len() - 1))
}
}
fn source(&self, span: Self::Span) -> Option<String> {
trace!("source(tag={:?}) snippet={:?}", span, self.snippet);
if span.start() > span.end() || span.end() > self.snippet.len() {
return None;
}
Some(span.slice(&self.snippet).to_string())
}
}
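Both `location` and `line_span` recover line/column information by scanning the source for `\n` positions on demand rather than maintaining a line table. A simplified standalone sketch of that scan, converting a byte offset into a zero-based (line, column) pair (the `locate` name is illustrative):

```rust
// Convert a byte offset into a zero-based (line, column) pair by scanning
// newline positions, in the spirit of Files::location above.
fn locate(source: &str, byte_index: usize) -> (usize, usize) {
    let mut line = 0;
    let mut line_start = 0;
    for (pos, _) in source.match_indices('\n') {
        if pos >= byte_index {
            // byte_index is on the current line; column is the distance
            // from the start of that line.
            return (line, byte_index - line_start);
        }
        line += 1;
        line_start = pos + 1; // first byte of the next line
    }
    // No newline at or after byte_index: the offset is on the last line.
    (line, byte_index - line_start)
}

fn main() {
    let src = "ls\nsort-by name\n";
    assert_eq!(locate(src, 0), (0, 0)); // 'l' of "ls"
    assert_eq!(locate(src, 3), (1, 0)); // 's' of "sort-by"
    assert_eq!(locate(src, 8), (1, 5)); // 'b' of "sort-by"
}
```

The trade-off is O(n) per lookup in exchange for zero bookkeeping, which is fine for a shell snippet but would be slow for large files queried repeatedly.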


@ -0,0 +1,38 @@
use crate::hir::syntax_shape::flat_shape::FlatShape;
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Span, Spanned, SpannedItem};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum FlagKind {
Shorthand,
Longhand,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Getters, new)]
#[get = "pub(crate)"]
pub struct Flag {
pub(crate) kind: FlagKind,
pub(crate) name: Span,
}
impl PrettyDebugWithSource for Flag {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
let prefix = match self.kind {
FlagKind::Longhand => b::description("--"),
FlagKind::Shorthand => b::description("-"),
};
prefix + b::description(self.name.slice(source))
}
}
impl Flag {
pub fn color(&self, span: impl Into<Span>) -> Spanned<FlatShape> {
match self.kind {
FlagKind::Longhand => FlatShape::Flag.spanned(span.into()),
FlagKind::Shorthand => FlatShape::ShorthandFlag.spanned(span.into()),
}
}
}


@ -0,0 +1,70 @@
use crate::hir::syntax_shape::FlatShape;
use crate::parse::parser::Number;
use bigdecimal::BigDecimal;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Text};
use num_bigint::BigInt;
use std::str::FromStr;
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub enum RawNumber {
Int(Span),
Decimal(Span),
}
impl HasSpan for RawNumber {
fn span(&self) -> Span {
match self {
RawNumber::Int(span) => *span,
RawNumber::Decimal(span) => *span,
}
}
}
impl PrettyDebugWithSource for RawNumber {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
RawNumber::Int(span) => b::primitive(span.slice(source)),
RawNumber::Decimal(span) => b::primitive(span.slice(source)),
}
}
}
impl RawNumber {
pub fn as_flat_shape(&self) -> FlatShape {
match self {
RawNumber::Int(_) => FlatShape::Int,
RawNumber::Decimal(_) => FlatShape::Decimal,
}
}
pub fn int(span: impl Into<Span>) -> RawNumber {
let span = span.into();
RawNumber::Int(span)
}
pub fn decimal(span: impl Into<Span>) -> RawNumber {
let span = span.into();
RawNumber::Decimal(span)
}
pub(crate) fn to_number(self, source: &Text) -> Number {
match self {
RawNumber::Int(tag) => {
if let Ok(big_int) = BigInt::from_str(tag.slice(source)) {
Number::Int(big_int)
} else {
unreachable!("Internal error: could not parse text as BigInt as expected")
}
}
RawNumber::Decimal(tag) => {
if let Ok(big_decimal) = BigDecimal::from_str(tag.slice(source)) {
Number::Decimal(big_decimal)
} else {
unreachable!("Internal error: could not parse text as BigDecimal as expected")
}
}
}
}
}


@ -0,0 +1,114 @@
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum CompareOperator {
Equal,
NotEqual,
LessThan,
GreaterThan,
LessThanOrEqual,
GreaterThanOrEqual,
Contains,
NotContains,
}
impl PrettyDebug for CompareOperator {
fn pretty(&self) -> DebugDocBuilder {
b::operator(self.as_str())
}
}
impl CompareOperator {
pub fn print(self) -> String {
self.as_str().to_string()
}
pub fn as_str(self) -> &'static str {
match self {
CompareOperator::Equal => "==",
CompareOperator::NotEqual => "!=",
CompareOperator::LessThan => "<",
CompareOperator::GreaterThan => ">",
CompareOperator::LessThanOrEqual => "<=",
CompareOperator::GreaterThanOrEqual => ">=",
CompareOperator::Contains => "=~",
CompareOperator::NotContains => "!~",
}
}
}
impl From<&str> for CompareOperator {
fn from(input: &str) -> CompareOperator {
if let Ok(output) = CompareOperator::from_str(input) {
output
} else {
unreachable!("Internal error: CompareOperator from failed")
}
}
}
impl FromStr for CompareOperator {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"==" => Ok(CompareOperator::Equal),
"!=" => Ok(CompareOperator::NotEqual),
"<" => Ok(CompareOperator::LessThan),
">" => Ok(CompareOperator::GreaterThan),
"<=" => Ok(CompareOperator::LessThanOrEqual),
">=" => Ok(CompareOperator::GreaterThanOrEqual),
"=~" => Ok(CompareOperator::Contains),
"!~" => Ok(CompareOperator::NotContains),
_ => Err(()),
}
}
}
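`CompareOperator` pairs a fallible `FromStr` with an `as_str` in the other direction, giving a lossless round trip for every valid operator (the infallible `From<&str>` then just delegates and treats failure as an internal bug). A trimmed standalone sketch of the round trip, with only two variants (the `CmpOp` name is illustrative):

```rust
use std::str::FromStr;

// Trimmed two-variant version of the operator round-trip pattern.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum CmpOp {
    Equal,
    NotEqual,
}

impl CmpOp {
    // The canonical textual form of each operator.
    fn as_str(self) -> &'static str {
        match self {
            CmpOp::Equal => "==",
            CmpOp::NotEqual => "!=",
        }
    }
}

impl FromStr for CmpOp {
    type Err = ();
    fn from_str(input: &str) -> Result<Self, ()> {
        match input {
            "==" => Ok(CmpOp::Equal),
            "!=" => Ok(CmpOp::NotEqual),
            _ => Err(()),
        }
    }
}

fn main() {
    // as_str and from_str are inverses for every valid operator.
    for op in [CmpOp::Equal, CmpOp::NotEqual].iter().copied() {
        assert_eq!(op.as_str().parse::<CmpOp>(), Ok(op));
    }
    assert!("=~".parse::<CmpOp>().is_err()); // unknown in this trimmed sketch
}
```

Keeping both tables in one file makes it easy to check by eye that every `as_str` arm has a matching `from_str` arm.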
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum EvaluationOperator {
Dot,
DotDot,
}
impl PrettyDebug for EvaluationOperator {
fn pretty(&self) -> DebugDocBuilder {
b::operator(self.as_str())
}
}
impl EvaluationOperator {
pub fn print(self) -> String {
self.as_str().to_string()
}
pub fn as_str(self) -> &'static str {
match self {
EvaluationOperator::Dot => ".",
EvaluationOperator::DotDot => "..",
}
}
}
impl From<&str> for EvaluationOperator {
fn from(input: &str) -> EvaluationOperator {
if let Ok(output) = EvaluationOperator::from_str(input) {
output
} else {
unreachable!("Internal error: EvaluationOperator 'from' failed")
}
}
}
impl FromStr for EvaluationOperator {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"." => Ok(EvaluationOperator::Dot),
".." => Ok(EvaluationOperator::DotDot),
_ => Err(()),
}
}
}

File diff suppressed because it is too large


@ -0,0 +1,84 @@
use crate::{SpannedToken, Token};
use derive_new::new;
use getset::Getters;
use nu_source::{
b, DebugDocBuilder, HasSpan, IntoSpanned, PrettyDebugWithSource, Span, Spanned, SpannedItem,
};
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Getters, new)]
pub struct Pipeline {
#[get = "pub"]
pub(crate) parts: Vec<PipelineElement>,
}
impl IntoSpanned for Pipeline {
type Output = Spanned<Pipeline>;
fn into_spanned(self, span: impl Into<Span>) -> Self::Output {
self.spanned(span.into())
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters, new)]
pub struct Tokens {
pub(crate) tokens: Vec<SpannedToken>,
pub(crate) span: Span,
}
impl Tokens {
pub fn iter(&self) -> impl Iterator<Item = &SpannedToken> {
self.tokens.iter()
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters)]
pub struct PipelineElement {
pub pipe: Option<Span>,
pub tokens: Tokens,
}
impl HasSpan for PipelineElement {
fn span(&self) -> Span {
match self.pipe {
Option::None => self.tokens.span,
Option::Some(pipe) => pipe.until(self.tokens.span),
}
}
}
impl PipelineElement {
pub fn new(pipe: Option<Span>, tokens: Spanned<Vec<SpannedToken>>) -> PipelineElement {
PipelineElement {
pipe,
tokens: Tokens {
tokens: tokens.item,
span: tokens.span,
},
}
}
pub fn tokens(&self) -> &[SpannedToken] {
&self.tokens.tokens
}
}
impl PrettyDebugWithSource for Pipeline {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::intersperse(
self.parts.iter().map(|token| token.pretty_debug(source)),
b::operator(" | "),
)
}
}
impl PrettyDebugWithSource for PipelineElement {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::intersperse(
self.tokens.iter().map(|token| match token.unspanned() {
Token::Whitespace => b::blank(),
_ => token.pretty_debug(source),
}),
b::space(),
)
}
}


@ -0,0 +1,530 @@
#![allow(clippy::type_complexity)]
use crate::parse::{call_node::*, comment::*, flag::*, number::*, operator::*, pipeline::*};
use derive_new::new;
use getset::Getters;
use nu_errors::{ParseError, ShellError};
use nu_protocol::{ShellTypeName, SpannedTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem, Text,
};
use std::borrow::Cow;
use std::ops::Deref;
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd)]
pub enum Token {
Number(RawNumber),
CompareOperator(CompareOperator),
EvaluationOperator(EvaluationOperator),
String(Span),
Variable(Span),
ItVariable(Span),
ExternalCommand(Span),
ExternalWord,
GlobPattern,
Bare,
Garbage,
Call(CallNode),
Delimited(DelimitedNode),
Pipeline(Pipeline),
Flag(Flag),
Comment(Comment),
Whitespace,
Separator,
}
macro_rules! token_type {
(struct $name:tt (desc: $desc:tt) -> $out:ty { |$span:ident, $pat:pat| => $do:expr }) => {
pub struct $name;
impl TokenType for $name {
type Output = $out;
fn desc(&self) -> Cow<'static, str> {
Cow::Borrowed($desc)
}
fn extract_token_value(
&self,
token: &SpannedToken,
err: ParseErrorFn<$out>,
) -> Result<$out, ParseError> {
let $span = token.span();
match *token.unspanned() {
$pat => Ok($do),
_ => err(),
}
}
}
};
(struct $name:tt (desc: $desc:tt) -> $out:ty { $pat:pat => $do:expr }) => {
pub struct $name;
impl TokenType for $name {
type Output = $out;
fn desc(&self) -> Cow<'static, str> {
Cow::Borrowed($desc)
}
fn extract_token_value(
&self,
token: &SpannedToken,
err: ParseErrorFn<$out>,
) -> Result<$out, ParseError> {
match token.unspanned().clone() {
$pat => Ok($do),
_ => err(),
}
}
}
};
}
pub type ParseErrorFn<'a, T> = &'a dyn Fn() -> Result<T, ParseError>;
token_type!(struct IntType (desc: "integer") -> RawNumber {
Token::Number(number @ RawNumber::Int(_)) => number
});
token_type!(struct DecimalType (desc: "decimal") -> RawNumber {
Token::Number(number @ RawNumber::Decimal(_)) => number
});
token_type!(struct StringType (desc: "string") -> (Span, Span) {
|outer, Token::String(inner)| => (inner, outer)
});
token_type!(struct BareType (desc: "word") -> Span {
|span, Token::Bare| => span
});
token_type!(struct DotType (desc: "dot") -> Span {
|span, Token::EvaluationOperator(EvaluationOperator::Dot)| => span
});
token_type!(struct DotDotType (desc: "dotdot") -> Span {
|span, Token::EvaluationOperator(EvaluationOperator::DotDot)| => span
});
token_type!(struct CompareOperatorType (desc: "compare operator") -> (Span, CompareOperator) {
|span, Token::CompareOperator(operator)| => (span, operator)
});
token_type!(struct ExternalWordType (desc: "external word") -> Span {
|span, Token::ExternalWord| => span
});
token_type!(struct ExternalCommandType (desc: "external command") -> (Span, Span) {
|outer, Token::ExternalCommand(inner)| => (inner, outer)
});
token_type!(struct CommentType (desc: "comment") -> (Comment, Span) {
|outer, Token::Comment(comment)| => (comment, outer)
});
token_type!(struct SeparatorType (desc: "separator") -> Span {
|span, Token::Separator| => span
});
token_type!(struct WhitespaceType (desc: "whitespace") -> Span {
|span, Token::Whitespace| => span
});
token_type!(struct WordType (desc: "word") -> Span {
|span, Token::Bare| => span
});
token_type!(struct ItVarType (desc: "$it") -> (Span, Span) {
|outer, Token::ItVariable(inner)| => (inner, outer)
});
token_type!(struct VarType (desc: "variable") -> (Span, Span) {
|outer, Token::Variable(inner)| => (inner, outer)
});
token_type!(struct PipelineType (desc: "pipeline") -> Pipeline {
Token::Pipeline(pipeline) => pipeline
});
token_type!(struct BlockType (desc: "block") -> DelimitedNode {
Token::Delimited(block @ DelimitedNode { delimiter: Delimiter::Brace, .. }) => block
});
token_type!(struct SquareType (desc: "square") -> DelimitedNode {
Token::Delimited(square @ DelimitedNode { delimiter: Delimiter::Square, .. }) => square
});
pub trait TokenType {
type Output;
fn desc(&self) -> Cow<'static, str>;
fn extract_token_value(
&self,
token: &SpannedToken,
err: ParseErrorFn<Self::Output>,
) -> Result<Self::Output, ParseError>;
}
impl Token {
pub fn into_spanned(self, span: impl Into<Span>) -> SpannedToken {
SpannedToken {
unspanned: self,
span: span.into(),
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters)]
pub struct SpannedToken {
#[get = "pub"]
unspanned: Token,
span: Span,
}
impl Deref for SpannedToken {
type Target = Token;
fn deref(&self) -> &Self::Target {
&self.unspanned
}
}
impl HasSpan for SpannedToken {
fn span(&self) -> Span {
self.span
}
}
impl ShellTypeName for SpannedToken {
fn type_name(&self) -> &'static str {
self.unspanned.type_name()
}
}
impl PrettyDebugWithSource for SpannedToken {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self.unspanned() {
Token::Number(number) => number.pretty_debug(source),
Token::CompareOperator(operator) => operator.pretty_debug(source),
Token::EvaluationOperator(operator) => operator.pretty_debug(source),
Token::String(_) | Token::GlobPattern | Token::Bare => {
b::primitive(self.span.slice(source))
}
Token::Variable(_) => b::var(self.span.slice(source)),
Token::ItVariable(_) => b::keyword(self.span.slice(source)),
Token::ExternalCommand(_) => b::description(self.span.slice(source)),
Token::ExternalWord => b::description(self.span.slice(source)),
Token::Call(call) => call.pretty_debug(source),
Token::Delimited(delimited) => delimited.pretty_debug(source),
Token::Pipeline(pipeline) => pipeline.pretty_debug(source),
Token::Flag(flag) => flag.pretty_debug(source),
Token::Garbage => b::error(self.span.slice(source)),
Token::Whitespace => b::typed(
"whitespace",
b::description(format!("{:?}", self.span.slice(source))),
),
Token::Separator => b::typed(
"separator",
b::description(format!("{:?}", self.span.slice(source))),
),
Token::Comment(comment) => {
b::typed("comment", b::description(comment.text.slice(source)))
}
}
}
}
impl ShellTypeName for Token {
fn type_name(&self) -> &'static str {
match self {
Token::Number(_) => "number",
Token::CompareOperator(_) => "comparison operator",
Token::EvaluationOperator(EvaluationOperator::Dot) => "dot",
Token::EvaluationOperator(EvaluationOperator::DotDot) => "dot dot",
Token::String(_) => "string",
Token::Variable(_) => "variable",
Token::ItVariable(_) => "it variable",
Token::ExternalCommand(_) => "external command",
Token::ExternalWord => "external word",
Token::GlobPattern => "glob pattern",
Token::Bare => "word",
Token::Call(_) => "command",
Token::Delimited(d) => d.type_name(),
Token::Pipeline(_) => "pipeline",
Token::Flag(_) => "flag",
Token::Garbage => "garbage",
Token::Whitespace => "whitespace",
Token::Separator => "separator",
Token::Comment(_) => "comment",
}
}
}
impl From<&SpannedToken> for Span {
fn from(token: &SpannedToken) -> Span {
token.span
}
}
impl SpannedToken {
pub fn as_external_arg(&self, source: &Text) -> String {
self.span().slice(source).to_string()
}
pub fn source<'a>(&self, source: &'a Text) -> &'a str {
self.span().slice(source)
}
pub fn get_variable(&self) -> Result<(Span, Span), ShellError> {
match self.unspanned() {
Token::Variable(inner_span) => Ok((self.span(), *inner_span)),
_ => Err(ShellError::type_error("variable", self.spanned_type_name())),
}
}
pub fn is_bare(&self) -> bool {
match self.unspanned() {
Token::Bare => true,
_ => false,
}
}
pub fn is_string(&self) -> bool {
match self.unspanned() {
Token::String(_) => true,
_ => false,
}
}
pub fn is_number(&self) -> bool {
match self.unspanned() {
Token::Number(_) => true,
_ => false,
}
}
pub fn as_string(&self) -> Option<(Span, Span)> {
match self.unspanned() {
Token::String(inner_span) => Some((self.span(), *inner_span)),
_ => None,
}
}
pub fn is_pattern(&self) -> bool {
match self.unspanned() {
Token::GlobPattern => true,
_ => false,
}
}
pub fn is_word(&self) -> bool {
match self.unspanned() {
Token::Bare => true,
_ => false,
}
}
pub fn is_int(&self) -> bool {
match self.unspanned() {
Token::Number(RawNumber::Int(_)) => true,
_ => false,
}
}
pub fn is_dot(&self) -> bool {
match self.unspanned() {
Token::EvaluationOperator(EvaluationOperator::Dot) => true,
_ => false,
}
}
pub fn as_block(&self) -> Option<(Spanned<&[SpannedToken]>, (Span, Span))> {
match self.unspanned() {
Token::Delimited(DelimitedNode {
delimiter,
children,
spans,
}) if *delimiter == Delimiter::Brace => {
Some(((&children[..]).spanned(self.span()), *spans))
}
_ => None,
}
}
pub fn is_external(&self) -> bool {
match self.unspanned() {
Token::ExternalCommand(..) => true,
_ => false,
}
}
pub(crate) fn as_flag(&self, value: &str, short: Option<char>, source: &Text) -> Option<Flag> {
match self.unspanned() {
Token::Flag(flag) => {
let name = flag.name().slice(source);
match flag.kind {
FlagKind::Longhand if value == name => Some(*flag),
FlagKind::Shorthand => {
if let Some(short_hand) = short {
if short_hand.to_string() == name {
return Some(*flag);
}
}
None
}
_ => None,
}
}
_ => None,
}
}
pub fn as_pipeline(&self) -> Result<Pipeline, ParseError> {
match self.unspanned() {
Token::Pipeline(pipeline) => Ok(pipeline.clone()),
_ => Err(ParseError::mismatch("pipeline", self.spanned_type_name())),
}
}
pub fn is_whitespace(&self) -> bool {
match self.unspanned() {
Token::Whitespace => true,
_ => false,
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters, new)]
#[get = "pub(crate)"]
pub struct DelimitedNode {
pub(crate) delimiter: Delimiter,
pub(crate) spans: (Span, Span),
pub(crate) children: Vec<SpannedToken>,
}
impl HasSpan for DelimitedNode {
fn span(&self) -> Span {
self.spans.0.until(self.spans.1)
}
}
impl PrettyDebugWithSource for DelimitedNode {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::delimit(
self.delimiter.open(),
b::intersperse(
self.children.iter().map(|child| child.pretty_debug(source)),
b::space(),
),
self.delimiter.close(),
)
}
}
impl DelimitedNode {
pub fn type_name(&self) -> &'static str {
match self.delimiter {
Delimiter::Brace => "braced expression",
Delimiter::Paren => "parenthesized expression",
Delimiter::Square => "array literal or index operator",
}
}
}
#[derive(Debug, Copy, Clone, Eq, PartialEq, Ord, PartialOrd)]
pub enum Delimiter {
Paren,
Brace,
Square,
}
impl Delimiter {
pub(crate) fn open(self) -> &'static str {
match self {
Delimiter::Paren => "(",
Delimiter::Brace => "{",
Delimiter::Square => "[",
}
}
pub(crate) fn close(self) -> &'static str {
match self {
Delimiter::Paren => ")",
Delimiter::Brace => "}",
Delimiter::Square => "]",
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters, new)]
#[get = "pub(crate)"]
pub struct PathNode {
head: Box<SpannedToken>,
tail: Vec<SpannedToken>,
}
#[cfg(test)]
impl SpannedToken {
pub fn expect_external(&self) -> Span {
match self.unspanned() {
Token::ExternalCommand(span) => *span,
_ => panic!(
"Only call expect_external if you checked is_external first, found {:?}",
self
),
}
}
pub fn expect_string(&self) -> (Span, Span) {
match self.unspanned() {
Token::String(inner_span) => (self.span(), *inner_span),
other => panic!("Expected string, found {:?}", other),
}
}
pub fn expect_list(&self) -> Spanned<Vec<SpannedToken>> {
match self.unspanned() {
Token::Pipeline(pipeline) => pipeline
.parts()
.iter()
.flat_map(|part| part.tokens())
.cloned()
.collect::<Vec<SpannedToken>>()
.spanned(self.span()),
_ => panic!("Expected list, found {:?}", self),
}
}
pub fn expect_pattern(&self) -> Span {
match self.unspanned() {
Token::GlobPattern => self.span(),
_ => panic!("Expected pattern, found {:?}", self),
}
}
pub fn expect_var(&self) -> (Span, Span) {
match self.unspanned() {
Token::Variable(inner_span) => (self.span(), *inner_span),
Token::ItVariable(inner_span) => (self.span(), *inner_span),
other => panic!("Expected var, found {:?}", other),
}
}
pub fn expect_dot(&self) -> Span {
match self.unspanned() {
Token::EvaluationOperator(EvaluationOperator::Dot) => self.span(),
other => panic!("Expected dot, found {:?}", other),
}
}
pub fn expect_bare(&self) -> Span {
match self.unspanned() {
Token::Bare => self.span(),
_ => panic!("Expected bare, found {:?}", self),
}
}
}


@ -1,27 +1,31 @@
use crate::prelude::*;
use crate::parse::call_node::CallNode;
use crate::parse::comment::Comment;
use crate::parse::flag::{Flag, FlagKind};
use crate::parse::number::RawNumber;
use crate::parse::operator::{CompareOperator, EvaluationOperator};
use crate::parse::pipeline::{Pipeline, PipelineElement};
use crate::parse::token_tree::{DelimitedNode, Delimiter, SpannedToken, Token};
use bigdecimal::BigDecimal;
use nu_source::{Span, Spanned, SpannedItem};
use num_bigint::BigInt;
use crate::parser::parse::flag::{Flag, FlagKind};
use crate::parser::parse::operator::Operator;
use crate::parser::parse::pipeline::{Pipeline, PipelineElement};
use crate::parser::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parser::parse::tokens::{RawNumber, RawToken};
use crate::parser::CallNode;
use derive_new::new;
#[derive(new)]
#[derive(Default)]
pub struct TokenTreeBuilder {
#[new(default)]
pos: usize,
#[new(default)]
output: String,
}
pub type CurriedToken = Box<dyn FnOnce(&mut TokenTreeBuilder) -> TokenNode + 'static>;
pub type CurriedCall = Box<dyn FnOnce(&mut TokenTreeBuilder) -> Tagged<CallNode> + 'static>;
impl TokenTreeBuilder {
pub fn new() -> Self {
Default::default()
}
}
pub type CurriedToken = Box<dyn FnOnce(&mut TokenTreeBuilder) -> SpannedToken + 'static>;
pub type CurriedCall = Box<dyn FnOnce(&mut TokenTreeBuilder) -> Spanned<CallNode> + 'static>;
impl TokenTreeBuilder {
pub fn build(block: impl FnOnce(&mut Self) -> TokenNode) -> (TokenNode, String) {
pub fn build(block: impl FnOnce(&mut Self) -> SpannedToken) -> (SpannedToken, String) {
let mut builder = TokenTreeBuilder::new();
let node = block(&mut builder);
(node, builder.output)
@ -42,7 +46,7 @@ impl TokenTreeBuilder {
Box::new(move |b| {
let start = b.pos;
let mut out: Vec<Spanned<PipelineElement>> = vec![];
let mut out: Vec<PipelineElement> = vec![];
let mut input = input.into_iter().peekable();
let head = input
@@ -52,20 +56,17 @@ impl TokenTreeBuilder {
let pipe = None;
let head = b.build_spanned(|b| head.into_iter().map(|node| node(b)).collect());
let head_span: Span = head.span;
out.push(PipelineElement::new(pipe, head).spanned(head_span));
out.push(PipelineElement::new(pipe, head));
loop {
match input.next() {
None => break,
Some(node) => {
let start = b.pos;
let pipe = Some(b.consume_span("|"));
let node =
b.build_spanned(|b| node.into_iter().map(|node| node(b)).collect());
let end = b.pos;
out.push(PipelineElement::new(pipe, node).spanned(Span::new(start, end)));
out.push(PipelineElement::new(pipe, node));
}
}
}
@@ -76,11 +77,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_pipeline(
input: Vec<Spanned<PipelineElement>>,
span: impl Into<Span>,
) -> TokenNode {
TokenNode::Pipeline(Pipeline::new(input).spanned(span))
pub fn spanned_pipeline(input: Vec<PipelineElement>, span: impl Into<Span>) -> SpannedToken {
Token::Pipeline(Pipeline::new(input)).into_spanned(span)
}
pub fn token_list(input: Vec<CurriedToken>) -> CurriedToken {
@@ -89,15 +87,35 @@ impl TokenTreeBuilder {
let tokens = input.into_iter().map(|i| i(b)).collect();
let end = b.pos;
TokenTreeBuilder::tagged_token_list(tokens, (start, end, None))
TokenTreeBuilder::spanned_token_list(tokens, Span::new(start, end))
})
}
pub fn tagged_token_list(input: Vec<TokenNode>, tag: impl Into<Tag>) -> TokenNode {
TokenNode::Nodes(input.spanned(tag.into().span))
pub fn spanned_token_list(input: Vec<SpannedToken>, span: impl Into<Span>) -> SpannedToken {
let span = span.into();
Token::Pipeline(Pipeline::new(vec![PipelineElement::new(
None,
input.spanned(span),
)]))
.into_spanned(span)
}
pub fn op(input: impl Into<Operator>) -> CurriedToken {
pub fn garbage(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let (start, end) = b.consume(&input);
b.pos = end;
TokenTreeBuilder::spanned_garbage(Span::new(start, end))
})
}
pub fn spanned_garbage(span: impl Into<Span>) -> SpannedToken {
Token::Garbage.into_spanned(span)
}
pub fn op(input: impl Into<CompareOperator>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
@@ -105,12 +123,42 @@ impl TokenTreeBuilder {
b.pos = end;
TokenTreeBuilder::spanned_op(input, Span::new(start, end))
TokenTreeBuilder::spanned_cmp_op(input, Span::new(start, end))
})
}
pub fn spanned_op(input: impl Into<Operator>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::Operator(input.into()).spanned(span.into()))
pub fn spanned_cmp_op(
input: impl Into<CompareOperator>,
span: impl Into<Span>,
) -> SpannedToken {
Token::CompareOperator(input.into()).into_spanned(span)
}
pub fn dot() -> CurriedToken {
Box::new(move |b| {
let (start, end) = b.consume(".");
b.pos = end;
TokenTreeBuilder::spanned_eval_op(".", Span::new(start, end))
})
}
pub fn dotdot() -> CurriedToken {
Box::new(move |b| {
let (start, end) = b.consume("..");
b.pos = end;
TokenTreeBuilder::spanned_eval_op("..", Span::new(start, end))
})
}
pub fn spanned_eval_op(
input: impl Into<EvaluationOperator>,
span: impl Into<Span>,
) -> SpannedToken {
Token::EvaluationOperator(input.into()).into_spanned(span)
}
pub fn string(input: impl Into<String>) -> CurriedToken {
@@ -129,8 +177,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_string(input: impl Into<Span>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::String(input.into()).spanned(span.into()))
pub fn spanned_string(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
Token::String(input.into()).into_spanned(span)
}
pub fn bare(input: impl Into<String>) -> CurriedToken {
@@ -144,8 +192,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_bare(span: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::Bare.spanned(span))
pub fn spanned_bare(span: impl Into<Span>) -> SpannedToken {
Token::Bare.into_spanned(span)
}
pub fn pattern(input: impl Into<String>) -> CurriedToken {
@@ -159,8 +207,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_pattern(input: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::GlobPattern.spanned(input.into()))
pub fn spanned_pattern(input: impl Into<Span>) -> SpannedToken {
Token::GlobPattern.into_spanned(input)
}
pub fn external_word(input: impl Into<String>) -> CurriedToken {
@@ -174,8 +222,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_external_word(input: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::ExternalWord.spanned(input.into()))
pub fn spanned_external_word(input: impl Into<Span>) -> SpannedToken {
Token::ExternalWord.into_spanned(input)
}
pub fn external_command(input: impl Into<String>) -> CurriedToken {
@@ -193,8 +241,11 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_external_command(inner: impl Into<Span>, outer: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::ExternalCommand(inner.into()).spanned(outer.into()))
pub fn spanned_external_command(
inner: impl Into<Span>,
outer: impl Into<Span>,
) -> SpannedToken {
Token::ExternalCommand(inner.into()).into_spanned(outer)
}
pub fn int(input: impl Into<BigInt>) -> CurriedToken {
@@ -225,8 +276,8 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_number(input: impl Into<RawNumber>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::Number(input.into()).spanned(span.into()))
pub fn spanned_number(input: impl Into<RawNumber>, span: impl Into<Span>) -> SpannedToken {
Token::Number(input.into()).into_spanned(span)
}
pub fn var(input: impl Into<String>) -> CurriedToken {
@@ -240,8 +291,21 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_var(input: impl Into<Span>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(RawToken::Variable(input.into()).spanned(span.into()))
pub fn spanned_var(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
Token::Variable(input.into()).into_spanned(span)
}
pub fn it_var() -> CurriedToken {
Box::new(move |b| {
let (start, _) = b.consume("$");
let (inner_start, end) = b.consume("it");
TokenTreeBuilder::spanned_it_var(Span::new(inner_start, end), Span::new(start, end))
})
}
pub fn spanned_it_var(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
Token::ItVariable(input.into()).into_spanned(span)
}
pub fn flag(input: impl Into<String>) -> CurriedToken {
@@ -255,8 +319,9 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_flag(input: impl Into<Span>, span: impl Into<Span>) -> TokenNode {
TokenNode::Flag(Flag::new(FlagKind::Longhand, input.into()).spanned(span.into()))
pub fn spanned_flag(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
let span = span.into();
Token::Flag(Flag::new(FlagKind::Longhand, input.into())).into_spanned(span)
}
pub fn shorthand(input: impl Into<String>) -> CurriedToken {
@@ -270,8 +335,10 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_shorthand(input: impl Into<Span>, span: impl Into<Span>) -> TokenNode {
TokenNode::Flag(Flag::new(FlagKind::Shorthand, input.into()).spanned(span.into()))
pub fn spanned_shorthand(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
let span = span.into();
Token::Flag(Flag::new(FlagKind::Shorthand, input.into())).into_spanned(span)
}
pub fn call(head: CurriedToken, input: Vec<CurriedToken>) -> CurriedCall {
@@ -287,21 +354,24 @@ impl TokenTreeBuilder {
let end = b.pos;
TokenTreeBuilder::tagged_call(nodes, (start, end, None))
TokenTreeBuilder::spanned_call(nodes, Span::new(start, end))
})
}
pub fn tagged_call(input: Vec<TokenNode>, tag: impl Into<Tag>) -> Tagged<CallNode> {
if input.len() == 0 {
pub fn spanned_call(input: Vec<SpannedToken>, span: impl Into<Span>) -> Spanned<CallNode> {
if input.is_empty() {
panic!("BUG: spanned call (TODO)")
}
let mut input = input.into_iter();
let head = input.next().unwrap();
if let Some(head) = input.next() {
let tail = input.collect();
CallNode::new(Box::new(head), tail).tagged(tag.into())
CallNode::new(Box::new(head), tail).spanned(span.into())
} else {
unreachable!("Internal error: spanned_call failed")
}
}
fn consume_delimiter(
@@ -309,7 +379,7 @@ impl TokenTreeBuilder {
input: Vec<CurriedToken>,
_open: &str,
_close: &str,
) -> (Span, Span, Span, Vec<TokenNode>) {
) -> (Span, Span, Span, Vec<SpannedToken>) {
let (start_open_paren, end_open_paren) = self.consume("(");
let mut output = vec![];
for item in input {
@@ -334,13 +404,12 @@ impl TokenTreeBuilder {
}
pub fn spanned_parens(
input: impl Into<Vec<TokenNode>>,
input: impl Into<Vec<SpannedToken>>,
spans: (Span, Span),
span: impl Into<Span>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Paren, spans, input.into()).spanned(span.into()),
)
) -> SpannedToken {
Token::Delimited(DelimitedNode::new(Delimiter::Paren, spans, input.into()))
.into_spanned(span.into())
}
pub fn square(input: Vec<CurriedToken>) -> CurriedToken {
@@ -352,13 +421,12 @@ impl TokenTreeBuilder {
}
pub fn spanned_square(
input: impl Into<Vec<TokenNode>>,
input: impl Into<Vec<SpannedToken>>,
spans: (Span, Span),
span: impl Into<Span>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Square, spans, input.into()).spanned(span.into()),
)
) -> SpannedToken {
Token::Delimited(DelimitedNode::new(Delimiter::Square, spans, input.into()))
.into_spanned(span)
}
pub fn braced(input: Vec<CurriedToken>) -> CurriedToken {
@@ -370,19 +438,18 @@ impl TokenTreeBuilder {
}
pub fn spanned_brace(
input: impl Into<Vec<TokenNode>>,
input: impl Into<Vec<SpannedToken>>,
spans: (Span, Span),
span: impl Into<Span>,
) -> TokenNode {
TokenNode::Delimited(
DelimitedNode::new(Delimiter::Brace, spans, input.into()).spanned(span.into()),
)
) -> SpannedToken {
Token::Delimited(DelimitedNode::new(Delimiter::Brace, spans, input.into()))
.into_spanned(span)
}
pub fn sp() -> CurriedToken {
Box::new(|b| {
let (start, end) = b.consume(" ");
TokenNode::Whitespace(Span::new(start, end))
Token::Whitespace.into_spanned((start, end))
})
}
@@ -395,8 +462,40 @@ impl TokenTreeBuilder {
})
}
pub fn spanned_ws(span: impl Into<Span>) -> TokenNode {
TokenNode::Whitespace(span.into())
pub fn spanned_ws(span: impl Into<Span>) -> SpannedToken {
Token::Whitespace.into_spanned(span)
}
pub fn sep(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let (start, end) = b.consume(&input);
TokenTreeBuilder::spanned_sep(Span::new(start, end))
})
}
pub fn spanned_sep(span: impl Into<Span>) -> SpannedToken {
Token::Separator.into_spanned(span)
}
pub fn comment(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let outer_start = b.pos;
b.consume("#");
let (start, end) = b.consume(&input);
let outer_end = b.pos;
TokenTreeBuilder::spanned_comment((start, end), (outer_start, outer_end))
})
}
pub fn spanned_comment(input: impl Into<Span>, span: impl Into<Span>) -> SpannedToken {
let span = span.into();
Token::Comment(Comment::line(input)).into_spanned(span)
}
fn consume(&mut self, input: &str) -> (usize, usize) {

View File

@@ -0,0 +1,231 @@
use crate::parse::parser::Number;
use crate::{CompareOperator, EvaluationOperator};
use bigdecimal::BigDecimal;
use nu_protocol::ShellTypeName;
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
Text,
};
use num_bigint::BigInt;
use std::fmt;
use std::str::FromStr;
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub enum UnspannedToken {
Number(RawNumber),
CompareOperator(CompareOperator),
EvaluationOperator(EvaluationOperator),
String(Span),
Variable(Span),
ExternalCommand(Span),
ExternalWord,
GlobPattern,
Bare,
}
impl UnspannedToken {
pub fn into_token(self, span: impl Into<Span>) -> Token {
Token {
unspanned: self,
span: span.into(),
}
}
}
impl ShellTypeName for UnspannedToken {
fn type_name(&self) -> &'static str {
match self {
UnspannedToken::Number(_) => "number",
UnspannedToken::CompareOperator(..) => "comparison operator",
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => "dot",
UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot) => "dotdot",
UnspannedToken::String(_) => "string",
UnspannedToken::Variable(_) => "variable",
UnspannedToken::ExternalCommand(_) => "syntax error",
UnspannedToken::ExternalWord => "syntax error",
UnspannedToken::GlobPattern => "glob pattern",
UnspannedToken::Bare => "string",
}
}
}
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub enum RawNumber {
Int(Span),
Decimal(Span),
}
impl HasSpan for RawNumber {
fn span(&self) -> Span {
match self {
RawNumber::Int(span) => *span,
RawNumber::Decimal(span) => *span,
}
}
}
impl PrettyDebugWithSource for RawNumber {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
RawNumber::Int(span) => b::primitive(span.slice(source)),
RawNumber::Decimal(span) => b::primitive(span.slice(source)),
}
}
}
impl RawNumber {
pub fn int(span: impl Into<Span>) -> RawNumber {
let span = span.into();
RawNumber::Int(span)
}
pub fn decimal(span: impl Into<Span>) -> RawNumber {
let span = span.into();
RawNumber::Decimal(span)
}
pub(crate) fn to_number(self, source: &Text) -> Number {
match self {
RawNumber::Int(tag) => {
if let Ok(int) = BigInt::from_str(tag.slice(source)) {
Number::Int(int)
} else {
unreachable!("Internal error: to_number failed")
}
}
RawNumber::Decimal(tag) => {
if let Ok(decimal) = BigDecimal::from_str(tag.slice(source)) {
Number::Decimal(decimal)
} else {
unreachable!("Internal error: to_number failed")
}
}
}
}
}
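The `RawNumber::to_number` conversion above can be illustrated stand-alone. This is a hedged sketch, not the nu code: `i64`/`f64` stand in for `BigInt`/`BigDecimal`, and the local `Span`/`Number` types are minimal stand-ins. The idea is the same, though: the lexer records only a span and a number kind, and the actual value is parsed out of the source text on demand.

```rust
// Minimal analogue of `RawNumber::to_number`: a span selects a slice of
// the source text, which is parsed according to the recorded kind.
#[derive(Debug, Clone, Copy)]
struct Span { start: usize, end: usize }

impl Span {
    fn slice<'a>(&self, source: &'a str) -> &'a str {
        &source[self.start..self.end]
    }
}

#[derive(Debug, Clone, Copy)]
enum RawNumber { Int(Span), Decimal(Span) }

#[derive(Debug, PartialEq)]
enum Number { Int(i64), Decimal(f64) }

impl RawNumber {
    fn to_number(self, source: &str) -> Number {
        match self {
            // The real code parses into BigInt/BigDecimal and treats a
            // parse failure as an internal error; expect() plays that role.
            RawNumber::Int(span) => Number::Int(
                span.slice(source).parse().expect("lexer guaranteed an int"),
            ),
            RawNumber::Decimal(span) => Number::Decimal(
                span.slice(source).parse().expect("lexer guaranteed a decimal"),
            ),
        }
    }
}

fn main() {
    let source = "42 3.14";
    assert_eq!(
        RawNumber::Int(Span { start: 0, end: 2 }).to_number(source),
        Number::Int(42)
    );
    assert_eq!(
        RawNumber::Decimal(Span { start: 3, end: 7 }).to_number(source),
        Number::Decimal(3.14)
    );
    println!("ok");
}
```

Keeping spans instead of owned strings means tokens stay `Copy` and cheap, at the cost of threading the source text through every consumer.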
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub struct Token {
pub unspanned: UnspannedToken,
pub span: Span,
}
impl std::ops::Deref for Token {
type Target = UnspannedToken;
fn deref(&self) -> &UnspannedToken {
&self.unspanned
}
}
impl PrettyDebugWithSource for Token {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self.unspanned {
UnspannedToken::Number(number) => number.pretty_debug(source),
UnspannedToken::CompareOperator(operator) => operator.pretty(),
UnspannedToken::EvaluationOperator(operator) => operator.pretty(),
UnspannedToken::String(_) => b::primitive(self.span.slice(source)),
UnspannedToken::Variable(_) => b::var(self.span.slice(source)),
UnspannedToken::ExternalCommand(_) => b::primitive(self.span.slice(source)),
UnspannedToken::ExternalWord => {
b::typed("external", b::description(self.span.slice(source)))
}
UnspannedToken::GlobPattern => {
b::typed("pattern", b::description(self.span.slice(source)))
}
UnspannedToken::Bare => b::primitive(self.span.slice(source)),
}
}
}
impl Token {
pub fn debug<'a>(&self, source: &'a Text) -> DebugToken<'a> {
DebugToken {
node: *self,
source,
}
}
pub fn extract_number(&self) -> Option<RawNumber> {
match self.unspanned {
UnspannedToken::Number(number) => Some(number),
_ => None,
}
}
pub fn extract_int(&self) -> Option<(Span, Span)> {
match self.unspanned {
UnspannedToken::Number(RawNumber::Int(int)) => Some((int, self.span)),
_ => None,
}
}
pub fn extract_decimal(&self) -> Option<(Span, Span)> {
match self.unspanned {
UnspannedToken::Number(RawNumber::Decimal(decimal)) => Some((decimal, self.span)),
_ => None,
}
}
pub fn extract_operator(&self) -> Option<Spanned<CompareOperator>> {
match self.unspanned {
UnspannedToken::CompareOperator(operator) => Some(operator.spanned(self.span)),
_ => None,
}
}
pub fn extract_string(&self) -> Option<(Span, Span)> {
match self.unspanned {
UnspannedToken::String(span) => Some((span, self.span)),
_ => None,
}
}
pub fn extract_variable(&self) -> Option<(Span, Span)> {
match self.unspanned {
UnspannedToken::Variable(span) => Some((span, self.span)),
_ => None,
}
}
pub fn extract_external_command(&self) -> Option<(Span, Span)> {
match self.unspanned {
UnspannedToken::ExternalCommand(span) => Some((span, self.span)),
_ => None,
}
}
pub fn extract_external_word(&self) -> Option<Span> {
match self.unspanned {
UnspannedToken::ExternalWord => Some(self.span),
_ => None,
}
}
pub fn extract_glob_pattern(&self) -> Option<Span> {
match self.unspanned {
UnspannedToken::GlobPattern => Some(self.span),
_ => None,
}
}
pub fn extract_bare(&self) -> Option<Span> {
match self.unspanned {
UnspannedToken::Bare => Some(self.span),
_ => None,
}
}
}
pub struct DebugToken<'a> {
node: Token,
source: &'a Text,
}
impl fmt::Debug for DebugToken<'_> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
write!(f, "{}", self.node.span.slice(self.source))
}
}
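The `Token` type above pairs an `UnspannedToken` payload with its `Span` and implements `Deref` so callers can match on the token kind without losing the location. A stand-alone sketch of that wrapper pattern, with illustrative local types in place of the real nu ones:

```rust
// The "unspanned payload + span" pattern: the wrapper Derefs to the
// payload, and extractor methods return the span only for a matching kind.
use std::ops::Deref;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Span { start: usize, end: usize }

#[derive(Debug, Clone, Copy, PartialEq)]
enum Unspanned { Bare, GlobPattern }

#[derive(Debug, Clone, Copy)]
struct Token { unspanned: Unspanned, span: Span }

impl Deref for Token {
    type Target = Unspanned;
    fn deref(&self) -> &Unspanned { &self.unspanned }
}

impl Token {
    // Mirrors `extract_bare` above: Some(span) only for the Bare variant.
    fn extract_bare(&self) -> Option<Span> {
        match self.unspanned {
            Unspanned::Bare => Some(self.span),
            _ => None,
        }
    }
}

fn main() {
    let tok = Token { unspanned: Unspanned::Bare, span: Span { start: 0, end: 3 } };
    // Deref lets callers compare the payload directly.
    assert_eq!(*tok, Unspanned::Bare);
    assert_eq!(tok.extract_bare(), Some(Span { start: 0, end: 3 }));
    assert_eq!(
        Token { unspanned: Unspanned::GlobPattern, span: tok.span }.extract_bare(),
        None
    );
    println!("ok");
}
```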

View File

@@ -0,0 +1,127 @@
use crate::parse::parser::Number;
use nu_protocol::{Primitive, UntaggedValue};
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use num_traits::ToPrimitive;
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum Unit {
// Filesize units
Byte,
Kilobyte,
Megabyte,
Gigabyte,
Terabyte,
Petabyte,
// Duration units
Second,
Minute,
Hour,
Day,
Week,
Month,
Year,
}
impl PrettyDebug for Unit {
fn pretty(&self) -> DebugDocBuilder {
b::keyword(self.as_str())
}
}
fn convert_number_to_u64(number: &Number) -> u64 {
match number {
Number::Int(big_int) => {
if let Some(x) = big_int.to_u64() {
x
} else {
unreachable!("Internal error: convert_number_to_u64 given incompatible number")
}
}
Number::Decimal(big_decimal) => {
if let Some(x) = big_decimal.to_u64() {
x
} else {
unreachable!("Internal error: convert_number_to_u64 given incompatible number")
}
}
}
}
impl Unit {
pub fn as_str(self) -> &'static str {
match self {
Unit::Byte => "B",
Unit::Kilobyte => "KB",
Unit::Megabyte => "MB",
Unit::Gigabyte => "GB",
Unit::Terabyte => "TB",
Unit::Petabyte => "PB",
Unit::Second => "s",
Unit::Minute => "m",
Unit::Hour => "h",
Unit::Day => "d",
Unit::Week => "w",
Unit::Month => "M",
Unit::Year => "y",
}
}
pub fn compute(self, size: &Number) -> UntaggedValue {
let size = size.clone();
match self {
Unit::Byte => number(size),
Unit::Kilobyte => number(size * 1024),
Unit::Megabyte => number(size * 1024 * 1024),
Unit::Gigabyte => number(size * 1024 * 1024 * 1024),
Unit::Terabyte => number(size * 1024 * 1024 * 1024 * 1024),
Unit::Petabyte => number(size * 1024 * 1024 * 1024 * 1024 * 1024),
Unit::Second => duration(convert_number_to_u64(&size)),
Unit::Minute => duration(60 * convert_number_to_u64(&size)),
Unit::Hour => duration(60 * 60 * convert_number_to_u64(&size)),
Unit::Day => duration(24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Week => duration(7 * 24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Month => duration(30 * 24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Year => duration(365 * 24 * 60 * 60 * convert_number_to_u64(&size)),
}
}
}
fn number(number: impl Into<Number>) -> UntaggedValue {
let number = number.into();
match number {
Number::Int(int) => UntaggedValue::Primitive(Primitive::Int(int)),
Number::Decimal(decimal) => UntaggedValue::Primitive(Primitive::Decimal(decimal)),
}
}
pub fn duration(secs: u64) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Duration(secs))
}
impl FromStr for Unit {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"B" | "b" => Ok(Unit::Byte),
"KB" | "kb" | "Kb" | "K" | "k" => Ok(Unit::Kilobyte),
"MB" | "mb" | "Mb" => Ok(Unit::Megabyte),
"GB" | "gb" | "Gb" => Ok(Unit::Gigabyte),
"TB" | "tb" | "Tb" => Ok(Unit::Terabyte),
"PB" | "pb" | "Pb" => Ok(Unit::Petabyte),
"s" => Ok(Unit::Second),
"m" => Ok(Unit::Minute),
"h" => Ok(Unit::Hour),
"d" => Ok(Unit::Day),
"w" => Ok(Unit::Week),
"M" => Ok(Unit::Month),
"y" => Ok(Unit::Year),
_ => Err(()),
}
}
}
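The `Unit` suffix handling above boils down to two lookup tables: filesize units multiply by powers of 1024, and duration units convert to seconds (with months and years approximated as 30 and 365 days, as in `compute`). A hedged stand-alone sketch of that arithmetic, with plain `u64` in place of the big-number types:

```rust
// Map a filesize suffix to a byte multiplier (1024-based, as in Unit::compute).
fn filesize_multiplier(unit: &str) -> Option<u64> {
    match unit.to_ascii_uppercase().as_str() {
        "B" => Some(1),
        "KB" | "K" => Some(1024),
        "MB" => Some(1024u64.pow(2)),
        "GB" => Some(1024u64.pow(3)),
        "TB" => Some(1024u64.pow(4)),
        "PB" => Some(1024u64.pow(5)),
        _ => None,
    }
}

// Convert a duration count and suffix to seconds. Suffixes are
// case-sensitive here, matching the FromStr impl above ("m" = minute,
// "M" = month).
fn duration_secs(count: u64, unit: &str) -> Option<u64> {
    let per_unit = match unit {
        "s" => 1,
        "m" => 60,
        "h" => 60 * 60,
        "d" => 24 * 60 * 60,
        "w" => 7 * 24 * 60 * 60,
        "M" => 30 * 24 * 60 * 60,  // month approximated as 30 days
        "y" => 365 * 24 * 60 * 60, // year approximated as 365 days
        _ => return None,
    };
    Some(count * per_unit)
}

fn main() {
    assert_eq!(filesize_multiplier("kb"), Some(1024));
    assert_eq!(filesize_multiplier("GB"), Some(1_073_741_824));
    assert_eq!(duration_secs(2, "h"), Some(7200));
    assert_eq!(duration_secs(1, "x"), None);
    println!("ok");
}
```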

View File

@@ -0,0 +1,409 @@
use crate::hir::syntax_shape::{
BackoffColoringMode, ExpandSyntax, MaybeSpaceShape, MaybeWhitespaceEof,
};
use crate::hir::SpannedExpression;
use crate::{
hir::{self, NamedArguments},
Flag,
};
use crate::{Token, TokensIterator};
use log::trace;
use nu_errors::{ArgumentError, ParseError};
use nu_protocol::{NamedType, PositionalType, Signature, SyntaxShape};
use nu_source::{HasFallibleSpan, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
type OptionalHeadTail = (Option<Vec<hir::SpannedExpression>>, Option<NamedArguments>);
pub fn parse_command_tail(
config: &Signature,
tail: &mut TokensIterator,
command_span: Span,
) -> Result<Option<OptionalHeadTail>, ParseError> {
let mut named = NamedArguments::new();
let mut found_error: Option<ParseError> = None;
let mut rest_signature = config.clone();
trace!(target: "nu::parse::trace_remaining", "");
trace_remaining("nodes", &tail);
for (name, kind) in &config.named {
trace!(target: "nu::parse::trace_remaining", "looking for {} : {:?}", name, kind);
match &kind.0 {
NamedType::Switch(s) => {
let switch = extract_switch(name, *s, tail);
match switch {
None => named.insert_switch(name, None),
Some((pos, flag)) => {
named.insert_switch(name, Some(*flag));
rest_signature.remove_named(name);
tail.color_shape(flag.color(flag.span));
tail.move_to(pos);
tail.expand_infallible(MaybeSpaceShape);
tail.move_to(0);
}
}
}
NamedType::Mandatory(s, syntax_type) => {
match extract_mandatory(config, name, *s, tail, command_span) {
Err(err) => {
// remember this error, but continue coloring
found_error = Some(err);
}
Ok((pos, flag)) => {
let result = expand_flag(tail, *syntax_type, flag, pos);
tail.move_to(0);
match result {
Ok(expr) => {
named.insert_mandatory(name, expr);
rest_signature.remove_named(name);
}
Err(_) => {
found_error = Some(ParseError::argument_error(
config.name.clone().spanned(flag.span),
ArgumentError::MissingValueForName(name.to_string()),
))
}
}
}
}
}
NamedType::Optional(s, syntax_type) => {
match extract_optional(name, *s, tail) {
Err(err) => {
// remember this error, but continue coloring
found_error = Some(err);
}
Ok(Some((pos, flag))) => {
let result = expand_flag(tail, *syntax_type, flag, pos);
tail.move_to(0);
match result {
Ok(expr) => {
named.insert_optional(name, Some(expr));
rest_signature.remove_named(name);
tail.move_to(pos);
}
Err(_) => {
found_error = Some(ParseError::argument_error(
config.name.clone().spanned(flag.span),
ArgumentError::MissingValueForName(name.to_string()),
))
}
}
}
Ok(None) => {
named.insert_optional(name, None);
}
}
}
};
}
trace_remaining("after named", &tail);
let mut positional = vec![];
match continue_parsing_positionals(&config, tail, &mut rest_signature, command_span) {
Ok(positionals) => {
positional = positionals;
}
Err(reason) => {
if found_error.is_none() && !tail.source().contains("help") {
found_error = Some(reason);
}
}
}
trace_remaining("after positional", &tail);
if let Some((syntax_type, _)) = config.rest_positional {
let mut out = vec![];
loop {
if found_error.is_some() {
break;
}
tail.move_to(0);
trace_remaining("start rest", &tail);
eat_any_whitespace(tail);
trace_remaining("after whitespace", &tail);
if tail.at_end() {
break;
}
match tail.expand_syntax(syntax_type) {
Err(err) => found_error = Some(err),
Ok(next) => out.push(next),
};
}
positional.extend(out);
}
trace_remaining("after rest", &tail);
if found_error.is_none() {
if let Some(unexpected_argument_error) = find_unexpected_tokens(config, tail, command_span)
{
found_error = Some(unexpected_argument_error);
}
}
eat_any_whitespace(tail);
// Consume any remaining tokens with backoff coloring mode
tail.expand_infallible(BackoffColoringMode::new(rest_signature.allowed()));
// This is pretty dubious, but it works. We should look into a better algorithm that doesn't end up requiring
// this solution.
tail.sort_shapes();
trace!(target: "nu::parse::trace_remaining", "Constructed positional={:?} named={:?}", positional, named);
let positional = if positional.is_empty() {
None
} else {
Some(positional)
};
let named = if named.named.is_empty() {
None
} else {
Some(named)
};
trace!(target: "nu::parse::trace_remaining", "Normalized positional={:?} named={:?}", positional, named);
if let Some(err) = found_error {
return Err(err);
}
Ok(Some((positional, named)))
}
pub fn continue_parsing_positionals(
config: &Signature,
tail: &mut TokensIterator,
rest_signature: &mut Signature,
command_span: Span,
) -> Result<Vec<SpannedExpression>, ParseError> {
let mut positional = vec![];
eat_any_whitespace(tail);
for arg in &config.positional {
trace!(target: "nu::parse::trace_remaining", "Processing positional {:?}", arg);
tail.move_to(0);
let result = expand_spaced_expr(arg.0.syntax_type(), tail);
match result {
Err(_) => match &arg.0 {
PositionalType::Mandatory(..) => {
return Err(ParseError::argument_error(
config.name.clone().spanned(command_span),
ArgumentError::MissingMandatoryPositional(arg.0.name().to_string()),
))
}
PositionalType::Optional(..) => {
if tail.expand_syntax(MaybeWhitespaceEof).is_ok() {
break;
}
}
},
Ok(result) => {
rest_signature.shift_positional();
positional.push(result);
}
}
}
Ok(positional)
}
fn eat_any_whitespace(tail: &mut TokensIterator) {
loop {
match tail.expand_infallible(MaybeSpaceShape) {
None => break,
Some(_) => continue,
}
}
}
fn expand_flag(
token_nodes: &mut TokensIterator,
syntax_type: SyntaxShape,
flag: Spanned<Flag>,
pos: usize,
) -> Result<SpannedExpression, ()> {
token_nodes.color_shape(flag.color(flag.span));
let result = token_nodes.atomic_parse(|token_nodes| {
token_nodes.move_to(pos);
if token_nodes.at_end() {
return Err(ParseError::unexpected_eof("flag", Span::unknown()));
}
let expr = expand_spaced_expr(syntax_type, token_nodes)?;
Ok(expr)
});
let expr = result.map_err(|_| ())?;
Ok(expr)
}
fn expand_spaced_expr<
T: HasFallibleSpan + PrettyDebugWithSource + Clone + std::fmt::Debug + 'static,
>(
syntax: impl ExpandSyntax<Output = Result<T, ParseError>>,
token_nodes: &mut TokensIterator,
) -> Result<T, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
token_nodes.expand_infallible(MaybeSpaceShape);
token_nodes.expand_syntax(syntax)
})
}
fn extract_switch(
name: &str,
short: Option<char>,
tokens: &mut hir::TokensIterator<'_>,
) -> Option<(usize, Spanned<Flag>)> {
let source = tokens.source();
let switch = tokens.extract(|t| {
t.as_flag(name, short, &source)
.map(|flag| flag.spanned(t.span()))
});
match switch {
None => None,
Some((pos, flag)) => {
tokens.remove(pos);
Some((pos, flag))
}
}
}
fn extract_mandatory(
config: &Signature,
name: &str,
short: Option<char>,
tokens: &mut hir::TokensIterator<'_>,
span: Span,
) -> Result<(usize, Spanned<Flag>), ParseError> {
let source = tokens.source();
let flag = tokens.extract(|t| {
t.as_flag(name, short, &source)
.map(|flag| flag.spanned(t.span()))
});
match flag {
None => Err(ParseError::argument_error(
config.name.clone().spanned(span),
ArgumentError::MissingMandatoryFlag(name.to_string()),
)),
Some((pos, flag)) => {
tokens.remove(pos);
Ok((pos, flag))
}
}
}
fn extract_optional(
name: &str,
short: Option<char>,
tokens: &mut hir::TokensIterator<'_>,
) -> Result<Option<(usize, Spanned<Flag>)>, ParseError> {
let source = tokens.source();
let flag = tokens.extract(|t| {
t.as_flag(name, short, &source)
.map(|flag| flag.spanned(t.span()))
});
match flag {
None => Ok(None),
Some((pos, flag)) => {
tokens.remove(pos);
Ok(Some((pos, flag)))
}
}
}
fn find_unexpected_tokens(
config: &Signature,
tail: &hir::TokensIterator,
command_span: Span,
) -> Option<ParseError> {
let mut tokens = tail.clone();
let source = tail.source();
loop {
tokens.move_to(0);
if let Some(node) = tokens.peek().commit() {
match &node.unspanned() {
Token::Whitespace => {}
Token::Flag { .. } => {
return Some(ParseError::argument_error(
config.name.clone().spanned(command_span),
ArgumentError::UnexpectedFlag(Spanned {
item: node.span().slice(&source).to_string(),
span: node.span(),
}),
));
}
_ => {
return Some(ParseError::argument_error(
config.name.clone().spanned(command_span),
ArgumentError::UnexpectedArgument(Spanned {
item: node.span().slice(&source).to_string(),
span: node.span(),
}),
));
}
}
}
if tokens.at_end() {
break;
}
}
None
}
pub fn trace_remaining(desc: &'static str, tail: &hir::TokensIterator<'_>) {
let offset = tail.clone().span_at_cursor();
let source = tail.source();
trace!(
target: "nu::parse::trace_remaining",
"{} = {}",
desc,
itertools::join(
tail.debug_remaining()
.iter()
.map(|val| {
if val.span().start() == offset.start() {
format!("<|> %{}%", val.debug(&source))
} else {
format!("%{}%", val.debug(&source))
}
}),
" "
)
);
}
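The `extract_switch`/`extract_mandatory`/`extract_optional` helpers above share one pattern: scan the remaining tokens for a flag, remove it from the stream so later positional parsing never sees it, and report the position it occupied. A hedged sketch of that pattern with plain strings standing in for the real spanned tokens:

```rust
// Find a long flag in the token stream, consume it, and return where it was.
fn extract_switch(name: &str, tokens: &mut Vec<String>) -> Option<usize> {
    let wanted = format!("--{}", name);
    let pos = tokens.iter().position(|t| *t == wanted)?;
    tokens.remove(pos); // consumed: positional parsing won't see it
    Some(pos)
}

fn main() {
    let mut tokens: Vec<String> = ["open", "--raw", "file.txt"]
        .iter()
        .map(|s| s.to_string())
        .collect();

    assert_eq!(extract_switch("raw", &mut tokens), Some(1));
    assert_eq!(tokens, vec!["open".to_string(), "file.txt".to_string()]);

    // A second lookup finds nothing: the switch was consumed.
    assert_eq!(extract_switch("raw", &mut tokens), None);
    println!("ok");
}
```

The only difference between the three real helpers is what a miss means: `None` for a switch, `Ok(None)` for an optional flag, and a `MissingMandatoryFlag` parse error for a mandatory one.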

View File

@@ -0,0 +1,24 @@
[package]
name = "nu-plugin"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Nushell Plugin"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.10.0" }
nu-source = { path = "../nu-source", version = "0.10.0" }
nu-errors = { path = "../nu-errors", version = "0.10.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.10.0" }
indexmap = { version = "1.3.0", features = ["serde-1"] }
serde = { version = "1.0.103", features = ["derive"] }
num-bigint = { version = "0.2.3", features = ["serde"] }
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.10.0", path = "../nu-build" }

View File

@@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@@ -0,0 +1,4 @@
mod plugin;
pub mod test_helpers;
pub use crate::plugin::{serve_plugin, Plugin};

View File

@@ -1,6 +1,5 @@
use crate::Signature;
use crate::Tagged;
use crate::{CallInfo, ReturnValue, ShellError, Value};
use nu_errors::ShellError;
use nu_protocol::{outln, CallInfo, ReturnValue, Signature, Value};
use serde::{Deserialize, Serialize};
use std::io;
@@ -11,7 +10,7 @@ pub trait Plugin {
Ok(vec![])
}
fn filter(&mut self, _input: Tagged<Value>) -> Result<Vec<ReturnValue>, ShellError> {
fn filter(&mut self, _input: Value) -> Result<Vec<ReturnValue>, ShellError> {
Ok(vec![])
}
@@ -19,15 +18,15 @@ pub trait Plugin {
Ok(vec![])
}
fn sink(&mut self, _call_info: CallInfo, _input: Vec<Tagged<Value>>) {}
fn sink(&mut self, _call_info: CallInfo, _input: Vec<Value>) {}
fn quit(&mut self) {}
}
pub fn serve_plugin(plugin: &mut dyn Plugin) {
let args = std::env::args();
let mut args = std::env::args();
if args.len() > 1 {
let input = args.skip(1).next();
let input = args.nth(1);
let input = match input {
Some(arg) => std::fs::read_to_string(arg),
@@ -143,8 +142,8 @@ fn send_response<T: Serialize>(result: T) {
let response_raw = serde_json::to_string(&response);
match response_raw {
Ok(response) => println!("{}", response),
Err(err) => println!("{}", err),
Ok(response) => outln!("{}", response),
Err(err) => outln!("{}", err),
}
}
#[derive(Debug, Serialize, Deserialize)]
@@ -152,15 +151,9 @@ fn send_response<T: Serialize>(result: T) {
#[allow(non_camel_case_types)]
pub enum NuCommand {
config,
begin_filter {
params: CallInfo,
},
filter {
params: Tagged<Value>,
},
begin_filter { params: CallInfo },
filter { params: Value },
end_filter,
sink {
params: (CallInfo, Vec<Tagged<Value>>),
},
sink { params: (CallInfo, Vec<Value>) },
quit,
}

View File

@@ -0,0 +1,213 @@
use crate::Plugin;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{CallInfo, EvaluatedArgs, ReturnSuccess, ReturnValue, UntaggedValue, Value};
use nu_source::Tag;
pub struct PluginTest<'a, T: Plugin> {
plugin: &'a mut T,
call_info: CallInfo,
input: Value,
}
impl<'a, T: Plugin> PluginTest<'a, T> {
pub fn for_plugin(plugin: &'a mut T) -> Self {
PluginTest {
plugin,
call_info: CallStub::new().create(),
input: UntaggedValue::nothing().into_value(Tag::unknown()),
}
}
pub fn args(&mut self, call_info: CallInfo) -> &mut PluginTest<'a, T> {
self.call_info = call_info;
self
}
pub fn configure(&mut self, callback: impl FnOnce(Vec<String>)) -> &mut PluginTest<'a, T> {
let signature = self
.plugin
.config()
.expect("There was a problem configuring the plugin.");
callback(signature.named.keys().map(String::from).collect());
self
}
pub fn input(&mut self, value: Value) -> &mut PluginTest<'a, T> {
self.input = value;
self
}
pub fn test(&mut self) -> Result<Vec<ReturnValue>, ShellError> {
let mut return_values = self.plugin.filter(self.input.clone())?;
return_values.extend(self.plugin.end_filter()?);
self.plugin.quit();
Ok(return_values)
}
pub fn setup(
&mut self,
callback: impl FnOnce(&mut T, Result<Vec<ReturnValue>, ShellError>),
) -> &mut PluginTest<'a, T> {
let call_stub = self.call_info.clone();
self.configure(|flags_configured| {
let flags_registered = &call_stub.args.named;
let flag_passed = flags_registered
.as_ref()
.map(|names| names.keys().map(String::from).collect::<Vec<String>>());
if let Some(flags) = flag_passed {
for flag in flags {
assert!(
flags_configured.iter().any(|f| *f == flag),
format!(
"The flag you passed ({}) is not configured in the plugin.",
flag
)
);
}
}
});
let return_values = self.plugin.begin_filter(call_stub);
callback(self.plugin, return_values);
self
}
}
pub fn plugin<T: Plugin>(plugin: &mut T) -> PluginTest<T> {
PluginTest::for_plugin(plugin)
}
#[derive(Default)]
pub struct CallStub {
positionals: Vec<Value>,
flags: IndexMap<String, Value>,
}
impl CallStub {
pub fn new() -> Self {
Default::default()
}
pub fn with_named_parameter(&mut self, name: &str, value: Value) -> &mut Self {
self.flags.insert(name.to_string(), value);
self
}
pub fn with_long_flag(&mut self, name: &str) -> &mut Self {
self.flags.insert(
name.to_string(),
UntaggedValue::boolean(true).into_value(Tag::unknown()),
);
self
}
pub fn with_parameter(&mut self, name: &str) -> Result<&mut Self, ShellError> {
let fields: Vec<Value> = name
.split('.')
.map(|s| UntaggedValue::string(s.to_string()).into_value(Tag::unknown()))
.collect();
self.positionals.push(value::column_path(&fields)?);
Ok(self)
}
pub fn create(&self) -> CallInfo {
CallInfo {
args: EvaluatedArgs::new(Some(self.positionals.clone()), Some(self.flags.clone())),
name_tag: Tag::unknown(),
}
}
}
pub fn expect_return_value_at(
for_results: Result<Vec<Result<ReturnSuccess, ShellError>>, ShellError>,
at: usize,
) -> Value {
let return_values = for_results
.expect("Failed! This seems to be an error getting back the results from the plugin.");
for (idx, item) in return_values.iter().enumerate() {
let item = match item {
Ok(return_value) => return_value,
Err(reason) => panic!(format!("{}", reason)),
};
if idx == at {
if let Some(value) = item.raw_value() {
return value;
} else {
panic!("Internal error: could not get raw value in expect_return_value_at")
}
}
}
panic!(format!(
"Couldn't get return value from stream at {}. (There are {} items)",
at,
return_values.len()
))
}
pub mod value {
use nu_errors::ShellError;
use nu_protocol::{Primitive, TaggedDictBuilder, UntaggedValue, Value};
use nu_source::Tag;
use nu_value_ext::ValueExt;
use num_bigint::BigInt;
pub fn get_data(for_value: Value, key: &str) -> Value {
for_value.get_data(&key.to_string()).borrow().clone()
}
pub fn int(i: impl Into<BigInt>) -> Value {
UntaggedValue::Primitive(Primitive::Int(i.into())).into_untagged_value()
}
pub fn string(input: impl Into<String>) -> Value {
UntaggedValue::string(input.into()).into_untagged_value()
}
pub fn structured_sample_record(key: &str, value: &str) -> Value {
let mut record = TaggedDictBuilder::new(Tag::unknown());
record.insert_untagged(key, UntaggedValue::string(value));
record.into_value()
}
pub fn unstructured_sample_record(value: &str) -> Value {
UntaggedValue::string(value).into_value(Tag::unknown())
}
pub fn table(list: &[Value]) -> Value {
UntaggedValue::table(list).into_untagged_value()
}
pub fn column_path(paths: &[Value]) -> Result<Value, ShellError> {
Ok(UntaggedValue::Primitive(Primitive::ColumnPath(
table(&paths.to_vec()).as_column_path()?.item,
))
.into_untagged_value())
}
}

View File

@ -0,0 +1,41 @@
[package]
name = "nu-protocol"
version = "0.10.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core values and protocols for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.10.0" }
nu-errors = { path = "../nu-errors", version = "0.10.0" }
serde = { version = "1.0.103", features = ["derive"] }
indexmap = { version = "1.3.0", features = ["serde-1"] }
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
chrono = { version = "0.4.10", features = ["serde"] }
num-traits = "0.2.8"
serde_bytes = "0.11.3"
getset = "0.0.9"
derive-new = "0.5.8"
ansi_term = "0.12.1"
language-reporting = "0.4.0"
nom = "5.0.1"
nom_locate = "1.0.0"
nom-tracable = "0.4.1"
typetag = "0.1.4"
query_interface = "0.3.5"
byte-unit = "3.0.3"
natural = "0.3.0"
# implement conversions
serde_yaml = "0.8"
toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.10.0", path = "../nu-build" }

View File

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -0,0 +1,114 @@
use crate::value::Value;
use derive_new::new;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_source::Tag;
use serde::{Deserialize, Serialize};
/// Associated information for the call of a command, including the args passed to the command and a tag that spans the name of the command being called
#[derive(Deserialize, Serialize, Debug, Clone)]
pub struct CallInfo {
/// The arguments associated with this call
pub args: EvaluatedArgs,
/// The tag (underline-able position) of the name of the call itself
pub name_tag: Tag,
}
/// The set of positional and named arguments, after their values have been evaluated.
///
/// * Positional arguments are those that are given as values, without any associated flag. For example, in `foo arg1 arg2`, both `arg1` and `arg2` are positional arguments.
/// * Named arguments are those associated with a flag. For example, in `foo --given bar`, the named argument would be named `given` with the value `bar`.
#[derive(Debug, Default, new, Serialize, Deserialize, Clone)]
pub struct EvaluatedArgs {
pub positional: Option<Vec<Value>>,
pub named: Option<IndexMap<String, Value>>,
}
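As a rough illustration of the positional/named split described above, a plain-Rust sketch (the `split_args` helper here is hypothetical, not the nushell parser) might look like:

```rust
use std::collections::BTreeMap;

// Hypothetical sketch: split tokens like ["arg1", "--given", "bar"] into
// positional values and named flag/value pairs, mirroring the split above.
fn split_args(tokens: &[&str]) -> (Vec<String>, BTreeMap<String, String>) {
    let mut positional = Vec::new();
    let mut named = BTreeMap::new();
    let mut iter = tokens.iter();
    while let Some(tok) = iter.next() {
        if let Some(name) = tok.strip_prefix("--") {
            // A flag consumes the next token as its value, if present.
            let value = iter.next().copied().unwrap_or("true");
            named.insert(name.to_string(), value.to_string());
        } else {
            positional.push(tok.to_string());
        }
    }
    (positional, named)
}

fn main() {
    let (pos, named) = split_args(&["arg1", "arg2", "--given", "bar"]);
    assert_eq!(pos, vec!["arg1", "arg2"]);
    assert_eq!(named.get("given").map(String::as_str), Some("bar"));
}
```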
impl EvaluatedArgs {
/// Retrieve a subset of positional arguments starting at a given position
pub fn slice_from(&self, from: usize) -> Vec<Value> {
let positional = &self.positional;
match positional {
None => vec![],
Some(list) => list[from..].to_vec(),
}
}
/// Get the nth positional argument, if possible
pub fn nth(&self, pos: usize) -> Option<&Value> {
match &self.positional {
None => None,
Some(array) => array.get(pos),
}
}
/// Get the nth positional argument, error if not possible
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
match &self.positional {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(array) => match array.get(pos) {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(item) => Ok(item),
},
}
}
/// Get the number of positional arguments available
pub fn len(&self) -> usize {
match &self.positional {
None => 0,
Some(array) => array.len(),
}
}
/// Return if there are no positional arguments
pub fn is_empty(&self) -> bool {
self.len() == 0
}
/// Return true if the set of named arguments contains the name provided
pub fn has(&self, name: &str) -> bool {
match &self.named {
None => false,
Some(named) => named.contains_key(name),
}
}
/// Gets the corresponding Value for the named argument given, if possible
pub fn get(&self, name: &str) -> Option<&Value> {
match &self.named {
None => None,
Some(named) => named.get(name),
}
}
/// Iterates over the positional arguments
pub fn positional_iter(&self) -> PositionalIter<'_> {
match &self.positional {
None => PositionalIter::Empty,
Some(v) => {
let iter = v.iter();
PositionalIter::Array(iter)
}
}
}
}
/// An iterator to help iterate over positional arguments
pub enum PositionalIter<'a> {
Empty,
Array(std::slice::Iter<'a, Value>),
}
impl<'a> Iterator for PositionalIter<'a> {
type Item = &'a Value;
/// The required `next` function to implement the Iterator trait
fn next(&mut self) -> Option<Self::Item> {
match self {
PositionalIter::Empty => None,
PositionalIter::Array(iter) => iter.next(),
}
}
}
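The `PositionalIter` pattern above — an enum iterator that yields nothing when there are no positional arguments, without allocating an empty `Vec` — can be reproduced standalone (names here are hypothetical):

```rust
/// Minimal standalone sketch of the PositionalIter pattern: iterate over an
/// Option<Vec<T>> without allocating when it is None.
enum MaybeIter<'a, T> {
    Empty,
    Items(std::slice::Iter<'a, T>),
}

impl<'a, T> Iterator for MaybeIter<'a, T> {
    type Item = &'a T;
    fn next(&mut self) -> Option<Self::Item> {
        match self {
            MaybeIter::Empty => None,
            MaybeIter::Items(iter) => iter.next(),
        }
    }
}

fn iter_opt<T>(opt: &Option<Vec<T>>) -> MaybeIter<'_, T> {
    match opt {
        None => MaybeIter::Empty,
        Some(v) => MaybeIter::Items(v.iter()),
    }
}

fn main() {
    let some = Some(vec![1, 2, 3]);
    let none: Option<Vec<i32>> = None;
    assert_eq!(iter_opt(&some).count(), 3);
    assert_eq!(iter_opt(&none).count(), 0);
}
```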

View File

@ -0,0 +1,26 @@
#[macro_use]
mod macros;
mod call_info;
mod maybe_owned;
mod return_value;
mod signature;
mod syntax_shape;
mod type_name;
mod type_shape;
mod value;
pub use crate::call_info::{CallInfo, EvaluatedArgs};
pub use crate::maybe_owned::MaybeOwned;
pub use crate::return_value::{CommandAction, ReturnSuccess, ReturnValue};
pub use crate::signature::{NamedType, PositionalType, Signature};
pub use crate::syntax_shape::SyntaxShape;
pub use crate::type_name::{PrettyType, ShellTypeName, SpannedTypeName};
pub use crate::type_shape::{Row as RowType, Type};
pub use crate::value::column_path::{did_you_mean, ColumnPath, PathMember, UnspannedPathMember};
pub use crate::value::dict::{Dictionary, TaggedDictBuilder};
pub use crate::value::evaluate::{Evaluate, EvaluateTrait, Scope};
pub use crate::value::primitive::Primitive;
pub use crate::value::primitive::{format_date, format_duration, format_primitive};
pub use crate::value::range::{Range, RangeInclusion};
pub use crate::value::{UntaggedValue, Value};

View File

@ -0,0 +1,17 @@
/// Outputs to standard out
///
/// Note: this exists to differentiate between intentional writing to stdout
/// and stray printlns left by accident
#[macro_export]
macro_rules! outln {
($($tokens:tt)*) => { println!($($tokens)*) }
}
/// Outputs to standard error
///
/// Note: this exists to differentiate between intentional writing to stderr
/// and stray eprintlns left by accident
#[macro_export]
macro_rules! errln {
($($tokens:tt)*) => { eprintln!($($tokens)*) }
}
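Both macros simply forward their entire token stream to the underlying formatting macro, so any `println!`-style arguments work. The same forwarding pattern in a self-contained sketch (the `out_string!` macro is hypothetical):

```rust
// Sketch of the forwarding-macro pattern: hand the whole token stream to the
// underlying formatter, so any println!-style arguments are accepted.
macro_rules! out_string {
    ($($tokens:tt)*) => { format!($($tokens)*) }
}

fn main() {
    assert_eq!(out_string!("{} + {} = {}", 1, 2, 1 + 2), "1 + 2 = 3");
}
```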

View File

@ -0,0 +1,18 @@
#![allow(clippy::should_implement_trait)]
/// Helper type to allow passing something that may potentially be owned, but could also be borrowed
#[derive(Debug)]
pub enum MaybeOwned<'a, T> {
Owned(T),
Borrowed(&'a T),
}
impl<T> MaybeOwned<'_, T> {
/// Allows the borrowing of an owned value or passes out the borrowed value
pub fn borrow(&self) -> &T {
match self {
MaybeOwned::Owned(v) => v,
MaybeOwned::Borrowed(v) => v,
}
}
}

View File

@ -0,0 +1,111 @@
use crate::value::Value;
use nu_errors::ShellError;
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
/// The inner set of actions for the command processor. Each denotes a way to change state in the processor without changing it directly from the command itself.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum CommandAction {
/// Change to a new directory or path (in non-filesystem situations)
ChangePath(String),
/// Exit out of Nu
Exit,
/// Display an error
Error(ShellError),
/// Enter a new shell at the given path
EnterShell(String),
/// Convert the value given from one type to another
AutoConvert(Value, String),
/// Enter a value shell, one that allows exploring inside of a Value
EnterValueShell(Value),
/// Enter the help shell, which allows exploring the help system
EnterHelpShell(Value),
/// Go to the previous shell in the shell ring buffer
PreviousShell,
/// Go to the next shell in the shell ring buffer
NextShell,
/// Leave the current shell. If it's the last shell, exit out of Nu
LeaveShell,
}
impl PrettyDebug for CommandAction {
/// Get a command action ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
match self {
CommandAction::ChangePath(path) => b::typed("change path", b::description(path)),
CommandAction::Exit => b::description("exit"),
CommandAction::Error(_) => b::error("error"),
CommandAction::AutoConvert(_, extension) => {
b::typed("auto convert", b::description(extension))
}
CommandAction::EnterShell(s) => b::typed("enter shell", b::description(s)),
CommandAction::EnterValueShell(v) => b::typed("enter value shell", v.pretty()),
CommandAction::EnterHelpShell(v) => b::typed("enter help shell", v.pretty()),
CommandAction::PreviousShell => b::description("previous shell"),
CommandAction::NextShell => b::description("next shell"),
CommandAction::LeaveShell => b::description("leave shell"),
}
}
}
/// The fundamental success type in the pipeline. Commands return these values as their main responsibility
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum ReturnSuccess {
/// A value to be used or shown to the user
Value(Value),
/// A debug-enabled value to be used or shown to the user
DebugValue(Value),
/// An action to be performed as values pass out of the command. These are performed rather than passed to the next command in the pipeline
Action(CommandAction),
}
impl PrettyDebug for ReturnSuccess {
/// Get a return success ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
match self {
ReturnSuccess::Value(value) => b::typed("value", value.pretty()),
ReturnSuccess::DebugValue(value) => b::typed("debug value", value.pretty()),
ReturnSuccess::Action(action) => b::typed("action", action.pretty()),
}
}
}
/// The core Result type for pipelines
pub type ReturnValue = Result<ReturnSuccess, ShellError>;
impl From<Value> for ReturnValue {
fn from(v: Value) -> Self {
Ok(ReturnSuccess::Value(v))
}
}
impl ReturnSuccess {
/// Get to the contained Value, if possible
pub fn raw_value(&self) -> Option<Value> {
match self {
ReturnSuccess::Value(raw) => Some(raw.clone()),
ReturnSuccess::DebugValue(raw) => Some(raw.clone()),
ReturnSuccess::Action(_) => None,
}
}
/// Helper function for an action to change the path
pub fn change_cwd(path: String) -> ReturnValue {
Ok(ReturnSuccess::Action(CommandAction::ChangePath(path)))
}
/// Helper function to create simple values for returning
pub fn value(input: impl Into<Value>) -> ReturnValue {
Ok(ReturnSuccess::Value(input.into()))
}
/// Helper function to create simple debug-enabled values for returning
pub fn debug_value(input: impl Into<Value>) -> ReturnValue {
Ok(ReturnSuccess::DebugValue(input.into()))
}
/// Helper function for creating actions
pub fn action(input: CommandAction) -> ReturnValue {
Ok(ReturnSuccess::Action(input))
}
}
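The helper constructors above exist so command bodies can say `ReturnSuccess::value(v)` instead of spelling out `Ok(ReturnSuccess::Value(v))`. The pattern in miniature, with hypothetical stand-in types:

```rust
// Hypothetical miniature of the ReturnValue pattern: a Result alias over a
// success enum, plus helper constructors that hide the Ok(...) boilerplate.
#[derive(Debug, PartialEq)]
enum Success {
    Value(i64),
    Action(String),
}

// Mirrors `type ReturnValue = Result<ReturnSuccess, ShellError>`.
type Return = Result<Success, String>;

fn value(v: i64) -> Return {
    Ok(Success::Value(v))
}

fn action(name: &str) -> Return {
    Ok(Success::Action(name.to_string()))
}

fn main() {
    assert_eq!(value(42), Ok(Success::Value(42)));
    assert_eq!(action("exit"), Ok(Success::Action("exit".into())));
}
```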

View File

@ -0,0 +1,320 @@
use crate::syntax_shape::SyntaxShape;
use crate::type_shape::Type;
use indexmap::IndexMap;
use nu_source::{b, DebugDocBuilder, PrettyDebug, PrettyDebugWithSource};
use serde::{Deserialize, Serialize};
/// The types of named parameter that a command can have
#[derive(Debug, Serialize, Deserialize, Clone)]
pub enum NamedType {
/// A flag without any associated argument, e.g. `foo --bar`, `foo -b`
Switch(Option<char>),
/// A mandatory flag with an associated argument, e.g. `foo --required xyz`, `foo -r xyz`
Mandatory(Option<char>, SyntaxShape),
/// An optional flag with an associated argument, e.g. `foo --optional abc`, `foo -o abc`
Optional(Option<char>, SyntaxShape),
}
impl NamedType {
pub fn get_short(&self) -> Option<char> {
match self {
NamedType::Switch(s) => *s,
NamedType::Mandatory(s, _) => *s,
NamedType::Optional(s, _) => *s,
}
}
}
/// The type of positional arguments
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum PositionalType {
/// A mandatory positional argument with the expected shape of the value
Mandatory(String, SyntaxShape),
/// An optional positional argument with the expected shape of the value
Optional(String, SyntaxShape),
}
impl PrettyDebug for PositionalType {
/// Prepare the PositionalType for pretty-printing
fn pretty(&self) -> DebugDocBuilder {
match self {
PositionalType::Mandatory(string, shape) => {
b::description(string) + b::delimit("(", shape.pretty(), ")").into_kind().group()
}
PositionalType::Optional(string, shape) => {
b::description(string)
+ b::operator("?")
+ b::delimit("(", shape.pretty(), ")").into_kind().group()
}
}
}
}
impl PositionalType {
/// Helper to create a mandatory positional argument type
pub fn mandatory(name: &str, ty: SyntaxShape) -> PositionalType {
PositionalType::Mandatory(name.to_string(), ty)
}
/// Helper to create a mandatory positional argument with an "any" type
pub fn mandatory_any(name: &str) -> PositionalType {
PositionalType::Mandatory(name.to_string(), SyntaxShape::Any)
}
/// Helper to create a mandatory positional argument with a block type
pub fn mandatory_block(name: &str) -> PositionalType {
PositionalType::Mandatory(name.to_string(), SyntaxShape::Block)
}
/// Helper to create an optional positional argument type
pub fn optional(name: &str, ty: SyntaxShape) -> PositionalType {
PositionalType::Optional(name.to_string(), ty)
}
/// Helper to create an optional positional argument with an "any" type
pub fn optional_any(name: &str) -> PositionalType {
PositionalType::Optional(name.to_string(), SyntaxShape::Any)
}
/// Gets the name of the positional argument
pub fn name(&self) -> &str {
match self {
PositionalType::Mandatory(s, _) => s,
PositionalType::Optional(s, _) => s,
}
}
/// Gets the expected type of a positional argument
pub fn syntax_type(&self) -> SyntaxShape {
match *self {
PositionalType::Mandatory(_, t) => t,
PositionalType::Optional(_, t) => t,
}
}
}
type Description = String;
/// The full signature of a command. All commands have a signature similar to a function signature.
/// Commands will use this information to register themselves with Nu's core engine so that the command
/// can be invoked, help can be displayed, and calls to the command can be error-checked.
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Signature {
/// The name of the command. Used when calling the command
pub name: String,
/// Usage instructions about the command
pub usage: String,
/// The list of positional arguments, both required and optional, and their corresponding types and help text
pub positional: Vec<(PositionalType, Description)>,
/// After the positional arguments, a catch-all for the rest of the arguments that might follow, their type, and help text
pub rest_positional: Option<(SyntaxShape, Description)>,
/// The named flags with corresponding type and help text
pub named: IndexMap<String, (NamedType, Description)>,
/// The type of values being sent out from the command into the pipeline, if any
pub yields: Option<Type>,
/// The type of values being read in from the pipeline into the command, if any
pub input: Option<Type>,
/// If the command is expected to filter data, or to consume it (as a sink)
pub is_filter: bool,
}
impl Signature {
pub fn shift_positional(&mut self) {
self.positional = Vec::from(&self.positional[1..]);
}
pub fn remove_named(&mut self, name: &str) {
self.named.remove(name);
}
pub fn allowed(&self) -> Vec<String> {
let mut allowed = indexmap::IndexSet::new();
for (name, (t, _)) in &self.named {
if let Some(c) = t.get_short() {
allowed.insert(format!("-{}", c));
}
allowed.insert(format!("--{}", name));
}
for (ty, _) in &self.positional {
let shape = ty.syntax_type();
allowed.insert(shape.display());
}
if let Some((shape, _)) = &self.rest_positional {
allowed.insert(shape.display());
}
allowed.into_iter().collect()
}
}
impl PrettyDebugWithSource for Signature {
/// Prepare a Signature for pretty-printing
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"signature",
b::description(&self.name)
+ b::preceded(
b::space(),
b::intersperse(
self.positional
.iter()
.map(|(ty, _)| ty.pretty_debug(source)),
b::space(),
),
),
)
}
}
impl Signature {
/// Create a new command signature with the given name
pub fn new(name: impl Into<String>) -> Signature {
Signature {
name: name.into(),
usage: String::new(),
positional: vec![],
rest_positional: None,
named: indexmap::indexmap! {"help".into() => (NamedType::Switch(Some('h')), "Display this help message".into())},
is_filter: false,
yields: None,
input: None,
}
}
/// Create a new signature
pub fn build(name: impl Into<String>) -> Signature {
Signature::new(name.into())
}
/// Add a description to the signature
pub fn desc(mut self, usage: impl Into<String>) -> Signature {
self.usage = usage.into();
self
}
/// Add a required positional argument to the signature
pub fn required(
mut self,
name: impl Into<String>,
ty: impl Into<SyntaxShape>,
desc: impl Into<String>,
) -> Signature {
self.positional.push((
PositionalType::Mandatory(name.into(), ty.into()),
desc.into(),
));
self
}
/// Add an optional positional argument to the signature
pub fn optional(
mut self,
name: impl Into<String>,
ty: impl Into<SyntaxShape>,
desc: impl Into<String>,
) -> Signature {
self.positional.push((
PositionalType::Optional(name.into(), ty.into()),
desc.into(),
));
self
}
/// Add an optional named flag argument to the signature
pub fn named(
mut self,
name: impl Into<String>,
ty: impl Into<SyntaxShape>,
desc: impl Into<String>,
short: Option<char>,
) -> Signature {
let s = short.and_then(|c| {
debug_assert!(!self.get_shorts().contains(&c));
Some(c)
});
self.named.insert(
name.into(),
(NamedType::Optional(s, ty.into()), desc.into()),
);
self
}
/// Add a required named flag argument to the signature
pub fn required_named(
mut self,
name: impl Into<String>,
ty: impl Into<SyntaxShape>,
desc: impl Into<String>,
short: Option<char>,
) -> Signature {
let s = short.and_then(|c| {
debug_assert!(!self.get_shorts().contains(&c));
Some(c)
});
self.named.insert(
name.into(),
(NamedType::Mandatory(s, ty.into()), desc.into()),
);
self
}
/// Add a switch to the signature
pub fn switch(
mut self,
name: impl Into<String>,
desc: impl Into<String>,
short: Option<char>,
) -> Signature {
let s = short.and_then(|c| {
debug_assert!(!self.get_shorts().contains(&c));
Some(c)
});
self.named
.insert(name.into(), (NamedType::Switch(s), desc.into()));
self
}
/// Set the filter flag for the signature
pub fn filter(mut self) -> Signature {
self.is_filter = true;
self
}
/// Set the type for the "rest" of the positional arguments
pub fn rest(mut self, ty: SyntaxShape, desc: impl Into<String>) -> Signature {
self.rest_positional = Some((ty, desc.into()));
self
}
/// Add a type for the output of the command to the signature
pub fn yields(mut self, ty: Type) -> Signature {
self.yields = Some(ty);
self
}
/// Add a type for the input of the command to the signature
pub fn input(mut self, ty: Type) -> Signature {
self.input = Some(ty);
self
}
/// Get list of the short-hand flags
pub fn get_shorts(&self) -> Vec<char> {
let mut shorts = Vec::new();
for (_, (t, _)) in &self.named {
if let Some(c) = t.get_short() {
shorts.push(c);
}
}
shorts
}
}
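`Signature` uses a consuming-builder style: each method takes `self` by value and returns it, so a command can declare its whole interface in one chained expression. A minimal standalone sketch of the same style (the `Sig` type here is hypothetical, not the nu-protocol API):

```rust
use std::collections::BTreeMap;

/// Hypothetical miniature of the consuming-builder style used by Signature.
#[derive(Debug, Default)]
struct Sig {
    name: String,
    usage: String,
    switches: BTreeMap<String, Option<char>>,
}

impl Sig {
    fn build(name: impl Into<String>) -> Sig {
        Sig { name: name.into(), ..Default::default() }
    }
    // Each method consumes self and returns it, so calls chain.
    fn desc(mut self, usage: impl Into<String>) -> Sig {
        self.usage = usage.into();
        self
    }
    fn switch(mut self, name: impl Into<String>, short: Option<char>) -> Sig {
        self.switches.insert(name.into(), short);
        self
    }
}

fn main() {
    let sig = Sig::build("open")
        .desc("Load a file into a cell")
        .switch("raw", Some('r'));
    assert_eq!(sig.name, "open");
    assert_eq!(sig.switches.get("raw"), Some(&Some('r')));
}
```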

View File

@ -0,0 +1,45 @@
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
/// The syntactic shapes that values must match to be passed into a command. You can think of this as the type-checking that occurs when you call a function.
#[derive(Debug, Copy, Clone, Serialize, Deserialize)]
pub enum SyntaxShape {
/// Any syntactic form is allowed
Any,
/// Strings and string-like bare words are allowed
String,
/// Values that can be the right hand side of a '.'
Member,
/// A dotted path to navigate the table
ColumnPath,
/// Only a numeric (integer or decimal) value is allowed
Number,
/// A range is allowed (eg, `1..3`)
Range,
/// Only an integer value is allowed
Int,
/// A filepath is allowed
Path,
/// A glob pattern is allowed, eg `foo*`
Pattern,
/// A block is allowed, eg `{start this thing}`
Block,
}
impl PrettyDebug for SyntaxShape {
/// Prepare SyntaxShape for pretty-printing
fn pretty(&self) -> DebugDocBuilder {
b::kind(match self {
SyntaxShape::Any => "any",
SyntaxShape::String => "string",
SyntaxShape::Member => "member",
SyntaxShape::ColumnPath => "column path",
SyntaxShape::Number => "number",
SyntaxShape::Range => "range",
SyntaxShape::Int => "integer",
SyntaxShape::Path => "file path",
SyntaxShape::Pattern => "pattern",
SyntaxShape::Block => "block",
})
}
}

View File

@ -0,0 +1,44 @@
use nu_source::{DebugDocBuilder, HasSpan, Spanned, SpannedItem, Tagged};
/// A trait that allows structures to define a known .type_name() which pretty-prints the type
pub trait ShellTypeName {
fn type_name(&self) -> &'static str;
}
impl<T: ShellTypeName> ShellTypeName for Spanned<T> {
/// Return the type_name of the spanned item
fn type_name(&self) -> &'static str {
self.item.type_name()
}
}
impl<T: ShellTypeName> ShellTypeName for &T {
/// Return the type_name for the borrowed reference
fn type_name(&self) -> &'static str {
(*self).type_name()
}
}
/// A trait that allows structures to define a known way to return a spanned type name
pub trait SpannedTypeName {
fn spanned_type_name(&self) -> Spanned<&'static str>;
}
impl<T: ShellTypeName + HasSpan> SpannedTypeName for T {
/// Return the type name as a spanned string
fn spanned_type_name(&self) -> Spanned<&'static str> {
self.type_name().spanned(self.span())
}
}
impl<T: ShellTypeName> SpannedTypeName for Tagged<T> {
/// Return the spanned type name for a Tagged value
fn spanned_type_name(&self) -> Spanned<&'static str> {
self.item.type_name().spanned(self.tag.span)
}
}
/// A trait to enable pretty-printing of type information
pub trait PrettyType {
fn pretty_type(&self) -> DebugDocBuilder;
}

View File

@ -0,0 +1,408 @@
///
/// This file describes the structural types of the nushell system.
///
/// Its primary purpose today is to identify "equivalent" values for the purpose
/// of merging rows into a single table, or to identify rows in a table that have
/// the same shape for reflection.
use crate::value::dict::Dictionary;
use crate::value::primitive::Primitive;
use crate::value::range::RangeInclusion;
use crate::value::{UntaggedValue, Value};
use derive_new::new;
use nu_source::{b, DebugDoc, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Deserializer, Serialize};
use std::collections::BTreeMap;
use std::fmt::Debug;
use std::hash::Hash;
/// Representation of the type of ranges
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize, new)]
pub struct RangeType {
from: (Type, RangeInclusion),
to: (Type, RangeInclusion),
}
/// Representation of the type of a value in Nu
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Type {
/// A value which has no value
Nothing,
/// An integer-based value
Int,
/// A range between two values
Range(Box<RangeType>),
/// A decimal (floating point) value
Decimal,
/// A filesize in bytes
Bytesize,
/// A string of text
String,
/// A line of text (a string with a trailing line ending)
Line,
/// A path through a table
ColumnPath,
/// A glob pattern (like foo*)
Pattern,
/// A boolean value
Boolean,
/// A date value (in UTC)
Date,
/// A duration value
Duration,
/// A filepath value
Path,
/// A binary (non-text) buffer value
Binary,
/// A row of data
Row(Row),
/// A full table of data
Table(Vec<Type>),
/// A block of script (TODO)
Block,
/// An error value (TODO)
Error,
/// Beginning of stream marker (used as bookend markers rather than actual values)
BeginningOfStream,
/// End of stream marker (used as bookend markers rather than actual values)
EndOfStream,
}
/// A shape representation of the type of a row
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, new)]
pub struct Row {
#[new(default)]
map: BTreeMap<Column, Type>,
}
impl Serialize for Row {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serializer.collect_map(self.map.iter())
}
}
impl<'de> Deserialize<'de> for Row {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
struct RowVisitor;
impl<'de> serde::de::Visitor<'de> for RowVisitor {
type Value = Row;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a row")
}
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
where
A: serde::de::MapAccess<'de>,
{
let mut new_map = BTreeMap::new();
loop {
let entry = map.next_entry()?;
match entry {
None => return Ok(Row { map: new_map }),
Some((key, value)) => {
new_map.insert(key, value);
}
}
}
}
}
deserializer.deserialize_map(RowVisitor)
}
}
impl Type {
/// Convert a Primitive into its corresponding Type
pub fn from_primitive(primitive: &Primitive) -> Type {
match primitive {
Primitive::Nothing => Type::Nothing,
Primitive::Int(_) => Type::Int,
Primitive::Range(range) => {
let (left_value, left_inclusion) = &range.from;
let (right_value, right_inclusion) = &range.to;
let left_type = (Type::from_primitive(left_value), *left_inclusion);
let right_type = (Type::from_primitive(right_value), *right_inclusion);
let range = RangeType::new(left_type, right_type);
Type::Range(Box::new(range))
}
Primitive::Decimal(_) => Type::Decimal,
Primitive::Bytes(_) => Type::Bytesize,
Primitive::String(_) => Type::String,
Primitive::Line(_) => Type::Line,
Primitive::ColumnPath(_) => Type::ColumnPath,
Primitive::Pattern(_) => Type::Pattern,
Primitive::Boolean(_) => Type::Boolean,
Primitive::Date(_) => Type::Date,
Primitive::Duration(_) => Type::Duration,
Primitive::Path(_) => Type::Path,
Primitive::Binary(_) => Type::Binary,
Primitive::BeginningOfStream => Type::BeginningOfStream,
Primitive::EndOfStream => Type::EndOfStream,
}
}
/// Convert a dictionary into its corresponding row Type
pub fn from_dictionary(dictionary: &Dictionary) -> Type {
let mut map = BTreeMap::new();
for (key, value) in dictionary.entries.iter() {
let column = Column::String(key.clone());
map.insert(column, Type::from_value(value));
}
Type::Row(Row { map })
}
/// Convert a table into its corresponding Type
pub fn from_table<'a>(table: impl IntoIterator<Item = &'a Value>) -> Type {
let mut vec = vec![];
for item in table.into_iter() {
vec.push(Type::from_value(item))
}
Type::Table(vec)
}
/// Convert a value into its corresponding Type
pub fn from_value<'a>(value: impl Into<&'a UntaggedValue>) -> Type {
match value.into() {
UntaggedValue::Primitive(p) => Type::from_primitive(p),
UntaggedValue::Row(row) => Type::from_dictionary(row),
UntaggedValue::Table(table) => Type::from_table(table.iter()),
UntaggedValue::Error(_) => Type::Error,
UntaggedValue::Block(_) => Type::Block,
}
}
}
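The `from_value`/`from_dictionary` pair above derives a structural type by recursing through a value, so two rows with the same columns and value types map to the same `Type` and can be merged into one table. A self-contained miniature of that derivation (the `Val`/`Shape` types are hypothetical stand-ins):

```rust
use std::collections::BTreeMap;

// Hypothetical miniature of Type::from_value: derive a structural shape
// from a nested value so equal-shaped rows can be grouped together.
#[derive(Debug, Clone, PartialEq, Eq)]
enum Shape {
    Int,
    Text,
    Row(BTreeMap<String, Shape>),
    Table(Vec<Shape>),
}

#[derive(Clone)]
enum Val {
    Int(i64),
    Text(String),
    Row(BTreeMap<String, Val>),
    Table(Vec<Val>),
}

fn shape_of(v: &Val) -> Shape {
    match v {
        Val::Int(_) => Shape::Int,
        Val::Text(_) => Shape::Text,
        Val::Row(map) => {
            Shape::Row(map.iter().map(|(k, v)| (k.clone(), shape_of(v))).collect())
        }
        Val::Table(items) => Shape::Table(items.iter().map(shape_of).collect()),
    }
}

fn main() {
    let mut row = BTreeMap::new();
    row.insert("name".to_string(), Val::Text("nu".into()));
    row.insert("stars".to_string(), Val::Int(1));
    // Two rows with the same columns and value types share a shape.
    assert_eq!(shape_of(&Val::Row(row.clone())), shape_of(&Val::Row(row)));
    // A table's shape is the list of its rows' shapes.
    assert_eq!(
        shape_of(&Val::Table(vec![Val::Int(1)])),
        Shape::Table(vec![Shape::Int])
    );
}
```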
impl PrettyDebug for Type {
/// Prepare Type for pretty-printing
fn pretty(&self) -> DebugDocBuilder {
match self {
Type::Nothing => ty("nothing"),
Type::Int => ty("integer"),
Type::Range(range) => {
let (left, left_inclusion) = &range.from;
let (right, right_inclusion) = &range.to;
let left_bracket = b::delimiter(match left_inclusion {
RangeInclusion::Exclusive => "(",
RangeInclusion::Inclusive => "[",
});
let right_bracket = b::delimiter(match right_inclusion {
RangeInclusion::Exclusive => ")",
RangeInclusion::Inclusive => "]",
});
b::typed(
"range",
(left_bracket
+ left.pretty()
+ b::operator(",")
+ b::space()
+ right.pretty()
+ right_bracket)
.group(),
)
}
Type::Decimal => ty("decimal"),
Type::Bytesize => ty("bytesize"),
Type::String => ty("string"),
Type::Line => ty("line"),
Type::ColumnPath => ty("column-path"),
Type::Pattern => ty("pattern"),
Type::Boolean => ty("boolean"),
Type::Date => ty("date"),
Type::Duration => ty("duration"),
Type::Path => ty("path"),
Type::Binary => ty("binary"),
Type::Error => b::error("error"),
Type::BeginningOfStream => b::keyword("beginning-of-stream"),
Type::EndOfStream => b::keyword("end-of-stream"),
Type::Row(row) => (b::kind("row")
+ b::space()
+ b::intersperse(
row.map.iter().map(|(key, ty)| {
(b::key(match key {
Column::String(string) => string.clone(),
Column::Value => "<value>".to_string(),
}) + b::delimit("(", ty.pretty(), ")").into_kind())
.nest()
}),
b::space(),
)
.nest())
.nest(),
Type::Table(table) => {
let mut group: Group<DebugDoc, Vec<(usize, usize)>> = Group::new();
for (i, item) in table.iter().enumerate() {
group.add(item.to_doc(), i);
}
(b::kind("table") + b::space() + b::keyword("of")).group()
+ b::space()
+ (if group.len() == 1 {
let (doc, _) = group.into_iter().collect::<Vec<_>>()[0].clone();
DebugDocBuilder::from_doc(doc)
} else {
b::intersperse(
group.into_iter().map(|(doc, rows)| {
(b::intersperse(
rows.iter().map(|(from, to)| {
if from == to {
b::description(from)
} else {
(b::description(from)
+ b::space()
+ b::keyword("to")
+ b::space()
+ b::description(to))
.group()
}
}),
b::description(", "),
) + b::description(":")
+ b::space()
+ DebugDocBuilder::from_doc(doc))
.nest()
}),
b::space(),
)
})
}
Type::Block => ty("block"),
}
}
}
/// A view into dictionaries for debug purposes
#[derive(Debug, new)]
struct DebugEntry<'a> {
key: &'a Column,
value: &'a Type,
}
impl<'a> PrettyDebug for DebugEntry<'a> {
/// Prepare debug entries for pretty-printing
fn pretty(&self) -> DebugDocBuilder {
(b::key(match self.key {
Column::String(string) => string.clone(),
Column::Value => "<value>".to_string(),
}) + b::delimit("(", self.value.pretty(), ")").into_kind())
}
}
/// Helper to create a pretty-print for the type
fn ty(name: impl std::fmt::Display) -> DebugDocBuilder {
b::kind(format!("{}", name))
}
pub trait GroupedValue: Debug + Clone {
type Item;
fn new() -> Self;
fn merge(&mut self, value: Self::Item);
}
impl GroupedValue for Vec<(usize, usize)> {
type Item = usize;
fn new() -> Vec<(usize, usize)> {
vec![]
}
fn merge(&mut self, new_value: usize) {
match self.last_mut() {
Some(value) if value.1 == new_value - 1 => {
value.1 += 1;
}
_ => self.push((new_value, new_value)),
}
}
}
#[derive(Debug)]
pub struct Group<K: Debug + Eq + Hash, V: GroupedValue> {
values: indexmap::IndexMap<K, V>,
}
impl<K, G> Group<K, G>
where
K: Debug + Eq + Hash,
G: GroupedValue,
{
pub fn new() -> Group<K, G> {
Group {
values: indexmap::IndexMap::default(),
}
}
pub fn len(&self) -> usize {
self.values.len()
}
pub fn into_iter(self) -> impl Iterator<Item = (K, G)> {
self.values.into_iter()
}
pub fn add(&mut self, key: impl Into<K>, value: impl Into<G::Item>) {
let key = key.into();
let value = value.into();
let group = self.values.get_mut(&key);
match group {
None => {
self.values.insert(key, {
let mut group = G::new();
group.merge(value);
group
});
}
Some(group) => {
group.merge(value);
}
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Serialize, Deserialize, Hash)]
pub enum Column {
String(String),
Value,
}
impl Into<Column> for String {
fn into(self) -> Column {
Column::String(self)
}
}
impl Into<Column> for &String {
fn into(self) -> Column {
Column::String(self.clone())
}
}
impl Into<Column> for &str {
fn into(self) -> Column {
Column::String(self.to_string())
}
}


@ -0,0 +1,380 @@
pub mod column_path;
mod convert;
mod debug;
pub mod dict;
pub mod evaluate;
pub mod primitive;
pub mod range;
mod serde_bigdecimal;
mod serde_bigint;
use crate::type_name::{ShellTypeName, SpannedTypeName};
use crate::value::dict::Dictionary;
use crate::value::evaluate::Evaluate;
use crate::value::primitive::Primitive;
use crate::value::range::{Range, RangeInclusion};
use crate::{ColumnPath, PathMember};
use bigdecimal::BigDecimal;
use chrono::{DateTime, Utc};
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_source::{AnchorLocation, HasSpan, Span, Spanned, Tag};
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::time::SystemTime;
/// The core structured values that flow through a pipeline
#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash, Serialize, Deserialize)]
pub enum UntaggedValue {
/// A primitive (or fundamental) type of values
Primitive(Primitive),
/// A table row
Row(Dictionary),
/// A full inner (or embedded) table
Table(Vec<Value>),
/// An error value that represents an error that occurred as the values in the pipeline were built
Error(ShellError),
/// A block of Nu code, e.g. `{ ls | get name }`
Block(Evaluate),
}
impl UntaggedValue {
/// Tags an UntaggedValue so that it can become a Value
pub fn retag(self, tag: impl Into<Tag>) -> Value {
Value {
value: self,
tag: tag.into(),
}
}
/// Get the corresponding descriptors (column names) associated with this value
pub fn data_descriptors(&self) -> Vec<String> {
match self {
UntaggedValue::Primitive(_) => vec![],
UntaggedValue::Row(columns) => columns.entries.keys().map(|x| x.to_string()).collect(),
UntaggedValue::Block(_) => vec![],
UntaggedValue::Table(_) => vec![],
UntaggedValue::Error(_) => vec![],
}
}
/// Convert this UntaggedValue to a Value with the given Tag
pub fn into_value(self, tag: impl Into<Tag>) -> Value {
Value {
value: self,
tag: tag.into(),
}
}
/// Convert this UntaggedValue into a Value with an empty Tag
pub fn into_untagged_value(self) -> Value {
Value {
value: self,
tag: Tag::unknown(),
}
}
/// Returns true if this value represents boolean true
pub fn is_true(&self) -> bool {
match self {
UntaggedValue::Primitive(Primitive::Boolean(true)) => true,
_ => false,
}
}
/// Returns true if the value represents something other than Nothing
pub fn is_some(&self) -> bool {
!self.is_none()
}
/// Returns true if the value represents Nothing
pub fn is_none(&self) -> bool {
match self {
UntaggedValue::Primitive(Primitive::Nothing) => true,
_ => false,
}
}
/// Returns true if the value represents an error
pub fn is_error(&self) -> bool {
match self {
UntaggedValue::Error(_err) => true,
_ => false,
}
}
/// Expect this value to be an error and return it
pub fn expect_error(&self) -> ShellError {
match self {
UntaggedValue::Error(err) => err.clone(),
_ => panic!("Don't call expect_error without first calling is_error"),
}
}
/// Expect this value to be a string and return it
pub fn expect_string(&self) -> &str {
match self {
UntaggedValue::Primitive(Primitive::String(string)) => &string[..],
_ => panic!("expect_string assumes that the value must be a string"),
}
}
/// Helper for creating row values
pub fn row(entries: IndexMap<String, Value>) -> UntaggedValue {
UntaggedValue::Row(entries.into())
}
/// Helper for creating table values
pub fn table(list: &[Value]) -> UntaggedValue {
UntaggedValue::Table(list.to_vec())
}
/// Helper for creating string values
pub fn string(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(s.into()))
}
/// Helper for creating line values
pub fn line(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Line(s.into()))
}
/// Helper for creating column-path values
pub fn column_path(s: Vec<impl Into<PathMember>>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::ColumnPath(ColumnPath::new(
s.into_iter().map(|p| p.into()).collect(),
)))
}
/// Helper for creating integer values
pub fn int(i: impl Into<BigInt>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Int(i.into()))
}
/// Helper for creating glob pattern values
pub fn pattern(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(s.into()))
}
/// Helper for creating filepath values
pub fn path(s: impl Into<PathBuf>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Path(s.into()))
}
/// Helper for creating bytesize values
pub fn bytes(s: impl Into<u64>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Bytes(s.into()))
}
/// Helper for creating decimal values
pub fn decimal(s: impl Into<BigDecimal>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Decimal(s.into()))
}
/// Helper for creating binary (non-text) buffer values
pub fn binary(binary: Vec<u8>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Binary(binary))
}
/// Helper for creating range values
pub fn range(
left: (Spanned<Primitive>, RangeInclusion),
right: (Spanned<Primitive>, RangeInclusion),
) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Range(Box::new(Range::new(left, right))))
}
/// Helper for creating boolean values
pub fn boolean(s: impl Into<bool>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Boolean(s.into()))
}
/// Helper for creating date duration values
pub fn duration(secs: u64) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Duration(secs))
}
/// Helper for creating datetime values
pub fn system_date(s: SystemTime) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Date(s.into()))
}
pub fn date(d: impl Into<DateTime<Utc>>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Date(d.into()))
}
/// Helper for creating the Nothing value
pub fn nothing() -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Nothing)
}
}
/// The fundamental structured value that flows through the pipeline, with associated metadata
#[derive(Debug, Clone, PartialOrd, PartialEq, Ord, Eq, Hash, Serialize, Deserialize)]
pub struct Value {
pub value: UntaggedValue,
pub tag: Tag,
}
/// Overload dereferencing to give back the UntaggedValue inside of a Value
impl std::ops::Deref for Value {
type Target = UntaggedValue;
fn deref(&self) -> &Self::Target {
&self.value
}
}
impl Value {
/// Get the corresponding anchor (originating location) for the Value
pub fn anchor(&self) -> Option<AnchorLocation> {
self.tag.anchor()
}
/// Get the name (url, filepath, etc) behind an anchor for the Value
pub fn anchor_name(&self) -> Option<String> {
self.tag.anchor_name()
}
/// Get the metadata for the Value
pub fn tag(&self) -> Tag {
self.tag.clone()
}
/// View the Value as a string, if possible
pub fn as_string(&self) -> Result<String, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::String(string)) => Ok(string.clone()),
UntaggedValue::Primitive(Primitive::Line(line)) => Ok(line.clone() + "\n"),
_ => Err(ShellError::type_error("string", self.spanned_type_name())),
}
}
/// View into the borrowed string contents of a Value, if possible
pub fn as_forgiving_string(&self) -> Result<&str, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::String(string)) => Ok(&string[..]),
_ => Err(ShellError::type_error("string", self.spanned_type_name())),
}
}
/// View the Value as a path, if possible
pub fn as_path(&self) -> Result<PathBuf, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::Path(path)) => Ok(path.clone()),
UntaggedValue::Primitive(Primitive::String(path_str)) => Ok(PathBuf::from(&path_str)),
_ => Err(ShellError::type_error("Path", self.spanned_type_name())),
}
}
/// View the Value as a Primitive value, if possible
pub fn as_primitive(&self) -> Result<Primitive, ShellError> {
match &self.value {
UntaggedValue::Primitive(primitive) => Ok(primitive.clone()),
_ => Err(ShellError::type_error(
"Primitive",
self.spanned_type_name(),
)),
}
}
/// View the Value as unsigned 64-bit, if possible
pub fn as_u64(&self) -> Result<u64, ShellError> {
match &self.value {
UntaggedValue::Primitive(primitive) => primitive.as_u64(self.tag.span),
_ => Err(ShellError::type_error("integer", self.spanned_type_name())),
}
}
/// View the Value as boolean, if possible
pub fn as_bool(&self) -> Result<bool, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::Boolean(p)) => Ok(*p),
_ => Err(ShellError::type_error("boolean", self.spanned_type_name())),
}
}
}
impl Into<Value> for String {
fn into(self) -> Value {
let end = self.len();
Value {
value: self.into(),
tag: Tag {
anchor: None,
span: Span::new(0, end),
},
}
}
}
impl Into<UntaggedValue> for &str {
/// Convert a string slice into an UntaggedValue
fn into(self) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(self.to_string()))
}
}
impl Into<UntaggedValue> for Value {
/// Convert a Value into an UntaggedValue
fn into(self) -> UntaggedValue {
self.value
}
}
/// Convert a borrowed Value into a borrowed UntaggedValue
impl<'a> Into<&'a UntaggedValue> for &'a Value {
fn into(self) -> &'a UntaggedValue {
&self.value
}
}
impl HasSpan for Value {
/// Return the corresponding Span for the Value
fn span(&self) -> Span {
self.tag.span
}
}
impl ShellTypeName for Value {
/// Get the type name for the Value
fn type_name(&self) -> &'static str {
ShellTypeName::type_name(&self.value)
}
}
impl ShellTypeName for UntaggedValue {
/// Get the type name for the UntaggedValue
fn type_name(&self) -> &'static str {
match &self {
UntaggedValue::Primitive(p) => p.type_name(),
UntaggedValue::Row(_) => "row",
UntaggedValue::Table(_) => "table",
UntaggedValue::Error(_) => "error",
UntaggedValue::Block(_) => "block",
}
}
}
impl From<Primitive> for UntaggedValue {
/// Convert a Primitive to an UntaggedValue
fn from(input: Primitive) -> UntaggedValue {
UntaggedValue::Primitive(input)
}
}
impl From<String> for UntaggedValue {
/// Convert a String to an UntaggedValue
fn from(input: String) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(input))
}
}
impl From<ShellError> for UntaggedValue {
fn from(e: ShellError) -> Self {
UntaggedValue::Error(e)
}
}


@ -0,0 +1,129 @@
use crate::Value;
use derive_new::new;
use getset::Getters;
use nu_source::{b, span_for_spanned_list, DebugDocBuilder, HasFallibleSpan, PrettyDebug, Span};
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
/// A PathMember that has yet to be spanned so that it can be used in later processing
#[derive(Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum UnspannedPathMember {
String(String),
Int(BigInt),
}
impl UnspannedPathMember {
/// Add the span information and get a full PathMember
pub fn into_path_member(self, span: impl Into<Span>) -> PathMember {
PathMember {
unspanned: self,
span: span.into(),
}
}
}
/// A basic piece of a ColumnPath, which describes the steps to take through a table to arrive at a cell, row, or inner table
#[derive(Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub struct PathMember {
pub unspanned: UnspannedPathMember,
pub span: Span,
}
impl PrettyDebug for &PathMember {
/// Gets the PathMember ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
match &self.unspanned {
UnspannedPathMember::String(string) => b::primitive(format!("{:?}", string)),
UnspannedPathMember::Int(int) => b::primitive(format!("{}", int)),
}
}
}
/// The fundamental path primitive to describe how to navigate through a table to get to a sub-item. A path member can be either a word or a number. Words/strings are taken to mean
/// a column name, and numbers are the row number. Taken together they describe which column or row to narrow to in order to get data.
///
/// Rows must follow column names; they can't come first, e.g. `foo.1` is valid whereas `1.foo` is not.
#[derive(
Debug, Hash, Serialize, Deserialize, Ord, PartialOrd, Eq, PartialEq, Getters, Clone, new,
)]
pub struct ColumnPath {
#[get = "pub"]
members: Vec<PathMember>,
}
impl ColumnPath {
/// Iterate over the members of the column path
pub fn iter(&self) -> impl Iterator<Item = &PathMember> {
self.members.iter()
}
/// Returns the last member and a slice of the remaining members
pub fn split_last(&self) -> Option<(&PathMember, &[PathMember])> {
self.members.split_last()
}
}
impl PrettyDebug for ColumnPath {
/// Gets the ColumnPath ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
let members: Vec<DebugDocBuilder> =
self.members.iter().map(|member| member.pretty()).collect();
b::delimit(
"(",
b::description("path") + b::equals() + b::intersperse(members, b::space()),
")",
)
.nest()
}
}
impl HasFallibleSpan for ColumnPath {
/// Creates a span that will cover the column path, if possible
fn maybe_span(&self) -> Option<Span> {
if self.members.is_empty() {
None
} else {
Some(span_for_spanned_list(self.members.iter().map(|m| m.span)))
}
}
}
impl PathMember {
/// Create a string path member
pub fn string(string: impl Into<String>, span: impl Into<Span>) -> PathMember {
UnspannedPathMember::String(string.into()).into_path_member(span)
}
/// Create a numeric path member
pub fn int(int: impl Into<BigInt>, span: impl Into<Span>) -> PathMember {
UnspannedPathMember::Int(int.into()).into_path_member(span)
}
}
/// Prepares a list of "sounds like" matches for the string you're trying to find
pub fn did_you_mean(obj_source: &Value, field_tried: &PathMember) -> Option<Vec<(usize, String)>> {
let field_tried = match &field_tried.unspanned {
UnspannedPathMember::String(string) => string.clone(),
UnspannedPathMember::Int(int) => format!("{}", int),
};
let possibilities = obj_source.data_descriptors();
let mut possible_matches: Vec<_> = possibilities
.into_iter()
.map(|x| {
let word = x;
let distance = natural::distance::levenshtein_distance(&word, &field_tried);
(distance, word)
})
.collect();
if !possible_matches.is_empty() {
possible_matches.sort();
Some(possible_matches)
} else {
None
}
}
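`did_you_mean` above ranks candidate column names by Levenshtein edit distance (via the `natural` crate) and returns them sorted, closest first. A self-contained sketch of the same ranking, with a small hand-rolled distance function standing in for the crate:

```rust
// Classic two-row Levenshtein edit distance (insert/delete/substitute each cost 1).
fn levenshtein(a: &str, b: &str) -> usize {
    let (a, b): (Vec<char>, Vec<char>) = (a.chars().collect(), b.chars().collect());
    let mut prev: Vec<usize> = (0..=b.len()).collect();
    for (i, ca) in a.iter().enumerate() {
        let mut cur = vec![i + 1];
        for (j, cb) in b.iter().enumerate() {
            let cost = if ca == cb { 0 } else { 1 };
            cur.push((prev[j] + cost).min(prev[j + 1] + 1).min(cur[j] + 1));
        }
        prev = cur;
    }
    prev[b.len()]
}

// Mirrors the shape of `did_you_mean`: score every candidate, sort ascending
// so the closest match comes first, and return None when there are no columns.
fn did_you_mean(columns: &[&str], tried: &str) -> Option<Vec<(usize, String)>> {
    let mut matches: Vec<_> = columns
        .iter()
        .map(|c| (levenshtein(c, tried), c.to_string()))
        .collect();
    if matches.is_empty() {
        None
    } else {
        matches.sort();
        Some(matches)
    }
}

fn main() {
    let ranked = did_you_mean(&["name", "size", "type"], "nmae").unwrap();
    // "name" is the closest candidate to the typo "nmae"
    assert_eq!(ranked[0].1, "name");
    println!("{:?}", ranked);
}
```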


@ -0,0 +1,59 @@
use crate::type_name::SpannedTypeName;
use crate::value::dict::Dictionary;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use nu_errors::{CoerceInto, ShellError};
use nu_source::TaggedItem;
impl std::convert::TryFrom<&Value> for i64 {
type Error = ShellError;
/// Convert to an i64 integer, if possible
fn try_from(value: &Value) -> Result<i64, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::Int(int)) => {
int.tagged(&value.tag).coerce_into("converting to i64")
}
_ => Err(ShellError::type_error("Integer", value.spanned_type_name())),
}
}
}
impl std::convert::TryFrom<&Value> for String {
type Error = ShellError;
/// Convert to a string, if possible
fn try_from(value: &Value) -> Result<String, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::String(s)) => Ok(s.clone()),
_ => Err(ShellError::type_error("String", value.spanned_type_name())),
}
}
}
impl std::convert::TryFrom<&Value> for Vec<u8> {
type Error = ShellError;
/// Convert to a u8 vec, if possible
fn try_from(value: &Value) -> Result<Vec<u8>, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::Binary(b)) => Ok(b.clone()),
_ => Err(ShellError::type_error("Binary", value.spanned_type_name())),
}
}
}
impl<'a> std::convert::TryFrom<&'a Value> for &'a Dictionary {
type Error = ShellError;
/// Convert to a dictionary, if possible
fn try_from(value: &'a Value) -> Result<&'a Dictionary, ShellError> {
match &value.value {
UntaggedValue::Row(d) => Ok(d),
_ => Err(ShellError::type_error(
"Dictionary",
value.spanned_type_name(),
)),
}
}
}
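These `TryFrom<&Value>` impls share one shape: match on the underlying variant, clone out the payload on success, and return a type error otherwise. A minimal sketch of the pattern with stand-in types (`Val` and `TypeError` are illustrative, not the real API):

```rust
use std::convert::TryFrom;

#[derive(Debug)]
enum Val {
    Int(i64),
    Text(String),
}

#[derive(Debug, PartialEq)]
struct TypeError(&'static str);

// Fallible conversion: only the matching variant succeeds,
// mirroring `TryFrom<&Value> for i64` above.
impl TryFrom<&Val> for i64 {
    type Error = TypeError;
    fn try_from(v: &Val) -> Result<i64, TypeError> {
        match v {
            Val::Int(i) => Ok(*i),
            _ => Err(TypeError("expected Integer")),
        }
    }
}

fn main() {
    assert_eq!(i64::try_from(&Val::Int(7)), Ok(7));
    assert!(i64::try_from(&Val::Text("hi".into())).is_err());
    println!("ok");
}
```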


@ -0,0 +1,103 @@
use crate::type_name::PrettyType;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use nu_source::{b, DebugDocBuilder, PrettyDebug};
impl PrettyDebug for &Value {
/// Get a borrowed Value ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
PrettyDebug::pretty(*self)
}
}
impl PrettyDebug for Value {
/// Get a Value ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
match &self.value {
UntaggedValue::Primitive(p) => p.pretty(),
UntaggedValue::Row(row) => row.pretty_builder().nest(1).group().into(),
UntaggedValue::Table(table) => {
b::delimit("[", b::intersperse(table, b::space()), "]").nest()
}
UntaggedValue::Error(_) => b::error("error"),
UntaggedValue::Block(_) => b::opaque("block"),
}
}
}
impl PrettyType for Primitive {
/// Find the type of the Value and prepare it for pretty-printing
fn pretty_type(&self) -> DebugDocBuilder {
match self {
Primitive::Nothing => ty("nothing"),
Primitive::Int(_) => ty("integer"),
Primitive::Range(_) => ty("range"),
Primitive::Decimal(_) => ty("decimal"),
Primitive::Bytes(_) => ty("bytesize"),
Primitive::String(_) => ty("string"),
Primitive::Line(_) => ty("line"),
Primitive::ColumnPath(_) => ty("column-path"),
Primitive::Pattern(_) => ty("pattern"),
Primitive::Boolean(_) => ty("boolean"),
Primitive::Date(_) => ty("date"),
Primitive::Duration(_) => ty("duration"),
Primitive::Path(_) => ty("path"),
Primitive::Binary(_) => ty("binary"),
Primitive::BeginningOfStream => b::keyword("beginning-of-stream"),
Primitive::EndOfStream => b::keyword("end-of-stream"),
}
}
}
impl PrettyDebug for Primitive {
/// Get a Primitive value ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
match self {
Primitive::Nothing => b::primitive("nothing"),
Primitive::Int(int) => prim(format_args!("{}", int)),
Primitive::Decimal(decimal) => prim(format_args!("{}", decimal)),
Primitive::Range(range) => {
let (left, left_inclusion) = &range.from;
let (right, right_inclusion) = &range.to;
b::typed(
"range",
(left_inclusion.debug_left_bracket()
+ left.pretty()
+ b::operator(",")
+ b::space()
+ right.pretty()
+ right_inclusion.debug_right_bracket())
.group(),
)
}
Primitive::Bytes(bytes) => primitive_doc(bytes, "bytesize"),
Primitive::String(string) => prim(string),
Primitive::Line(string) => prim(string),
Primitive::ColumnPath(path) => path.pretty(),
Primitive::Pattern(pattern) => primitive_doc(pattern, "pattern"),
Primitive::Boolean(boolean) => match boolean {
true => b::primitive("$yes"),
false => b::primitive("$no"),
},
Primitive::Date(date) => primitive_doc(date, "date"),
Primitive::Duration(duration) => primitive_doc(duration, "seconds"),
Primitive::Path(path) => primitive_doc(path, "path"),
Primitive::Binary(_) => b::opaque("binary"),
Primitive::BeginningOfStream => b::keyword("beginning-of-stream"),
Primitive::EndOfStream => b::keyword("end-of-stream"),
}
}
}
fn prim(name: impl std::fmt::Debug) -> DebugDocBuilder {
b::primitive(format!("{:?}", name))
}
fn primitive_doc(name: impl std::fmt::Debug, ty: impl Into<String>) -> DebugDocBuilder {
b::primitive(format!("{:?}", name)) + b::delimit("(", b::kind(ty.into()), ")")
}
fn ty(name: impl std::fmt::Debug) -> DebugDocBuilder {
b::kind(format!("{:?}", name))
}
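Note the split between the helpers at the end of this file: `prim` and `primitive_doc` render with `{:?}` (Debug), so strings keep their quotes, and `primitive_doc` appends the kind in parentheses. A sketch of that formatting with plain `String`s standing in for `DebugDocBuilder`:

```rust
// Debug-format the value itself; for &str this keeps the surrounding quotes.
fn prim(name: impl std::fmt::Debug) -> String {
    format!("{:?}", name)
}

// Debug-format the value and suffix a parenthesized kind label,
// mirroring `primitive_doc(duration, "seconds")` style output.
fn primitive_doc(name: impl std::fmt::Debug, ty: impl Into<String>) -> String {
    format!("{:?}({})", name, ty.into())
}

fn main() {
    assert_eq!(prim("glob"), "\"glob\"");
    assert_eq!(primitive_doc(3600u64, "seconds"), "3600(seconds)");
    println!("{} {}", prim("glob"), primitive_doc(3600u64, "seconds"));
}
```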


@ -0,0 +1,236 @@
use crate::maybe_owned::MaybeOwned;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use derive_new::new;
use getset::Getters;
use indexmap::IndexMap;
use nu_source::{b, DebugDocBuilder, PrettyDebug, Spanned, Tag};
use serde::{Deserialize, Serialize};
use std::cmp::{Ord, Ordering, PartialOrd};
use std::hash::{Hash, Hasher};
/// A dictionary that can hold a mapping from names to Values
#[derive(Debug, Default, Serialize, Deserialize, PartialEq, Eq, Clone, Getters, new)]
pub struct Dictionary {
#[get = "pub"]
pub entries: IndexMap<String, Value>,
}
#[allow(clippy::derive_hash_xor_eq)]
impl Hash for Dictionary {
/// Create the hash function to allow the Hash trait for dictionaries
fn hash<H: Hasher>(&self, state: &mut H) {
let mut entries = self.entries.clone();
entries.sort_keys();
entries.keys().collect::<Vec<&String>>().hash(state);
entries.values().collect::<Vec<&Value>>().hash(state);
}
}
impl PartialOrd for Dictionary {
/// Compare two dictionaries for sort ordering
fn partial_cmp(&self, other: &Dictionary) -> Option<Ordering> {
let this: Vec<&String> = self.entries.keys().collect();
let that: Vec<&String> = other.entries.keys().collect();
if this != that {
return this.partial_cmp(&that);
}
let this: Vec<&Value> = self.entries.values().collect();
let that: Vec<&Value> = other.entries.values().collect();
this.partial_cmp(&that)
}
}
impl Ord for Dictionary {
/// Compare two dictionaries for ordering
fn cmp(&self, other: &Dictionary) -> Ordering {
let this: Vec<&String> = self.entries.keys().collect();
let that: Vec<&String> = other.entries.keys().collect();
if this != that {
return this.cmp(&that);
}
let this: Vec<&Value> = self.entries.values().collect();
let that: Vec<&Value> = other.entries.values().collect();
this.cmp(&that)
}
}
impl PartialEq<Value> for Dictionary {
/// Test a dictionary against a Value for equality
fn eq(&self, other: &Value) -> bool {
match &other.value {
UntaggedValue::Row(d) => self == d,
_ => false,
}
}
}
/// A key-value pair specifically meant to be used in debug and pretty-printing
#[derive(Debug, new)]
struct DebugEntry<'a> {
key: &'a str,
value: &'a Value,
}
impl<'a> PrettyDebug for DebugEntry<'a> {
/// Build the information needed to pretty-print the DebugEntry
fn pretty(&self) -> DebugDocBuilder {
(b::key(self.key.to_string()) + b::equals() + self.value.pretty().into_value()).group()
}
}
impl PrettyDebug for Dictionary {
/// Get a Dictionary ready to be pretty-printed
fn pretty(&self) -> DebugDocBuilder {
b::delimit(
"(",
b::intersperse(
self.entries()
.iter()
.map(|(key, value)| DebugEntry::new(key, value)),
b::space(),
),
")",
)
}
}
impl From<IndexMap<String, Value>> for Dictionary {
/// Create a dictionary from a map of strings to Values
fn from(input: IndexMap<String, Value>) -> Dictionary {
let mut out = IndexMap::default();
for (key, value) in input {
out.insert(key, value);
}
Dictionary::new(out)
}
}
impl Dictionary {
/// Find the matching Value for a given key, if possible. If not, return a Primitive::Nothing
pub fn get_data(&self, desc: &str) -> MaybeOwned<'_, Value> {
match self.entries.get(desc) {
Some(v) => MaybeOwned::Borrowed(v),
None => MaybeOwned::Owned(
UntaggedValue::Primitive(Primitive::Nothing).into_untagged_value(),
),
}
}
/// Iterate the keys in the Dictionary
pub fn keys(&self) -> impl Iterator<Item = &String> {
self.entries.keys()
}
/// Checks if given key exists
pub fn contains_key(&self, key: &str) -> bool {
self.entries.contains_key(key)
}
/// Find the matching Value for a key, if possible
pub fn get_data_by_key(&self, name: Spanned<&str>) -> Option<Value> {
let result = self
.entries
.iter()
.find(|(desc_name, _)| *desc_name == name.item)?
.1;
Some(
result
.value
.clone()
.into_value(Tag::new(result.tag.anchor(), name.span)),
)
}
/// Get a mutable entry that matches a key, if possible
pub fn get_mut_data_by_key(&mut self, name: &str) -> Option<&mut Value> {
match self
.entries
.iter_mut()
.find(|(desc_name, _)| *desc_name == name)
{
Some((_, v)) => Some(v),
None => None,
}
}
/// Insert a new key/value pair into the dictionary
pub fn insert_data_at_key(&mut self, name: &str, value: Value) {
self.entries.insert(name.to_string(), value);
}
}
/// A builder for dictionaries. It can insert values into the dictionary while maintaining the tags that need to be applied to the individual members
#[derive(Debug)]
pub struct TaggedDictBuilder {
tag: Tag,
dict: IndexMap<String, Value>,
}
impl TaggedDictBuilder {
/// Create a new builder
pub fn new(tag: impl Into<Tag>) -> TaggedDictBuilder {
TaggedDictBuilder {
tag: tag.into(),
dict: IndexMap::default(),
}
}
/// Build the contents of the builder into a Value
pub fn build(tag: impl Into<Tag>, block: impl FnOnce(&mut TaggedDictBuilder)) -> Value {
let mut builder = TaggedDictBuilder::new(tag);
block(&mut builder);
builder.into_value()
}
/// Create a new builder with a pre-defined capacity
pub fn with_capacity(tag: impl Into<Tag>, n: usize) -> TaggedDictBuilder {
TaggedDictBuilder {
tag: tag.into(),
dict: IndexMap::with_capacity(n),
}
}
/// Insert an untagged key/value pair into the dictionary, to later be tagged when built
pub fn insert_untagged(&mut self, key: impl Into<String>, value: impl Into<UntaggedValue>) {
self.dict
.insert(key.into(), value.into().into_value(&self.tag));
}
/// Insert a key/value pair into the dictionary
pub fn insert_value(&mut self, key: impl Into<String>, value: impl Into<Value>) {
self.dict.insert(key.into(), value.into());
}
/// Convert the dictionary into a tagged Value using the original tag
pub fn into_value(self) -> Value {
let tag = self.tag.clone();
self.into_untagged_value().into_value(tag)
}
/// Convert the dictionary into an UntaggedValue
pub fn into_untagged_value(self) -> UntaggedValue {
UntaggedValue::Row(Dictionary { entries: self.dict })
}
/// Returns true if the dictionary is empty, false otherwise
pub fn is_empty(&self) -> bool {
self.dict.is_empty()
}
}
impl From<TaggedDictBuilder> for Value {
/// Convert a builder into a tagged Value
fn from(input: TaggedDictBuilder) -> Value {
input.into_value()
}
}
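`TaggedDictBuilder` above accumulates key/value pairs in an `IndexMap` (which, unlike a `HashMap`, preserves insertion order) and then freezes them into a row `Value`. A minimal sketch of the builder shape, with a `Vec` of pairs standing in for the ordered map and plain strings standing in for tagged values:

```rust
// Frozen result type, standing in for `UntaggedValue::Row(Dictionary { .. })`.
#[derive(Debug, PartialEq)]
struct Row(Vec<(String, String)>);

// Accumulates entries in insertion order, like `TaggedDictBuilder`'s IndexMap.
#[derive(Default)]
struct RowBuilder {
    entries: Vec<(String, String)>,
}

impl RowBuilder {
    // `impl Into<String>` keys/values mirror the flexible insert_* signatures above.
    fn insert(&mut self, key: impl Into<String>, value: impl Into<String>) {
        self.entries.push((key.into(), value.into()));
    }
    // Consumes the builder to produce the row, like `into_value`.
    fn build(self) -> Row {
        Row(self.entries)
    }
}

fn main() {
    let mut b = RowBuilder::default();
    b.insert("name", "foo.txt");
    b.insert("size", "1024");
    let row = b.build();
    // Insertion order is preserved: "name" stays first
    assert_eq!(row.0[0].0, "name");
    println!("{:?}", row);
}
```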


@ -0,0 +1,108 @@
use crate::value::{Primitive, UntaggedValue, Value};
use indexmap::IndexMap;
use nu_errors::ShellError;
use query_interface::{interfaces, vtable_for, Object, ObjectHash};
use serde::{Deserialize, Serialize};
use std::cmp::{Ord, Ordering, PartialOrd};
use std::fmt::Debug;
/// An evaluation scope. Scopes map variable names to Values and aid in evaluating blocks and expressions.
/// Additionally holds the value of the special $it variable, which refers to the value passing
/// through the pipeline at that moment
#[derive(Debug)]
pub struct Scope {
pub it: Value,
pub vars: IndexMap<String, Value>,
}
impl Scope {
/// Create a new scope
pub fn new(it: Value) -> Scope {
Scope {
it,
vars: IndexMap::new(),
}
}
}
impl Scope {
/// Create an empty scope
pub fn empty() -> Scope {
Scope {
it: UntaggedValue::Primitive(Primitive::Nothing).into_untagged_value(),
vars: IndexMap::new(),
}
}
/// Create an empty scope, setting $it to a known Value
pub fn it_value(value: Value) -> Scope {
Scope {
it: value,
vars: IndexMap::new(),
}
}
}
#[typetag::serde(tag = "type")]
pub trait EvaluateTrait: Debug + Send + Sync + Object + ObjectHash + 'static {
fn invoke(&self, scope: &Scope) -> Result<Value, ShellError>;
fn clone_box(&self) -> Evaluate;
}
interfaces!(Evaluate: dyn ObjectHash);
#[typetag::serde]
impl EvaluateTrait for Evaluate {
fn invoke(&self, scope: &Scope) -> Result<Value, ShellError> {
self.expr.invoke(scope)
}
fn clone_box(&self) -> Evaluate {
self.expr.clone_box()
}
}
#[derive(Debug, Serialize, Deserialize)]
pub struct Evaluate {
expr: Box<dyn EvaluateTrait>,
}
impl Evaluate {
pub fn new(evaluate: impl EvaluateTrait) -> Evaluate {
Evaluate {
expr: Box::new(evaluate),
}
}
}
impl std::hash::Hash for Evaluate {
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
self.expr.obj_hash(state)
}
}
impl Clone for Evaluate {
fn clone(&self) -> Evaluate {
self.expr.clone_box()
}
}
impl Ord for Evaluate {
fn cmp(&self, _: &Self) -> Ordering {
Ordering::Equal
}
}
impl PartialOrd for Evaluate {
fn partial_cmp(&self, _: &Evaluate) -> Option<Ordering> {
Some(Ordering::Equal)
}
}
impl PartialEq for Evaluate {
fn eq(&self, _: &Evaluate) -> bool {
true
}
}
impl Eq for Evaluate {}
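`Scope` above pairs the special `$it` value with a map of named variables. One plausible lookup scheme, assuming `$it` resolves before ordinary variables (the actual resolution lives in the evaluator, not in this file):

```rust
use std::collections::HashMap;

// Simplified stand-in: real scopes map names to `Value`s, not `String`s.
struct Scope {
    it: String,
    vars: HashMap<String, String>,
}

impl Scope {
    // Hypothetical lookup: `$it` is special-cased, everything else hits the map.
    fn get(&self, name: &str) -> Option<&String> {
        if name == "$it" {
            Some(&self.it)
        } else {
            self.vars.get(name)
        }
    }
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("$x".to_string(), "1".to_string());
    let scope = Scope { it: "current".to_string(), vars };
    assert_eq!(scope.get("$it").map(String::as_str), Some("current"));
    assert_eq!(scope.get("$x").map(String::as_str), Some("1"));
    assert!(scope.get("$y").is_none());
}
```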


@ -0,0 +1,262 @@
use crate::type_name::ShellTypeName;
use crate::value::column_path::ColumnPath;
use crate::value::range::Range;
use crate::value::{serde_bigdecimal, serde_bigint};
use bigdecimal::BigDecimal;
use chrono::{DateTime, Utc};
use nu_errors::{ExpectedRange, ShellError};
use nu_source::{PrettyDebug, Span, SpannedItem};
use num_bigint::BigInt;
use num_traits::cast::{FromPrimitive, ToPrimitive};
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
/// The most fundamental of structured values in Nu are the Primitive values. These values represent types like integers, strings, booleans, dates, etc that are then used
/// as the building blocks to build up more complex structures.
///
/// Primitives also include marker values BeginningOfStream and EndOfStream which denote a change of condition in the stream
#[derive(Debug, Clone, Ord, PartialOrd, Eq, PartialEq, Hash, Deserialize, Serialize)]
pub enum Primitive {
/// An empty value
Nothing,
/// A "big int", an integer with arbitrarily large size (aka not limited to 64-bit)
#[serde(with = "serde_bigint")]
Int(BigInt),
/// A "big decimal", an decimal number with arbitrarily large size (aka not limited to 64-bit)
#[serde(with = "serde_bigdecimal")]
Decimal(BigDecimal),
/// A count in the number of bytes, used as a filesize
Bytes(u64),
/// A string value
String(String),
/// A string value with an implied carriage return (or cr/lf) ending
Line(String),
/// A path to travel to reach a value in a table
ColumnPath(ColumnPath),
/// A glob pattern, e.g. foo*
Pattern(String),
/// A boolean value
Boolean(bool),
/// A date value, in UTC
Date(DateTime<Utc>),
/// A count in the number of seconds
Duration(u64),
/// A range of values
Range(Box<Range>),
/// A file path
Path(PathBuf),
/// A vector of raw binary data
#[serde(with = "serde_bytes")]
Binary(Vec<u8>),
/// Beginning of stream marker, a pseudo-value not intended for tables
BeginningOfStream,
/// End of stream marker, a pseudo-value not intended for tables
EndOfStream,
}
impl Primitive {
/// Converts a primitive value to a u64, if possible. Uses a span to build an error if the conversion isn't possible.
pub fn as_u64(&self, span: Span) -> Result<u64, ShellError> {
match self {
Primitive::Int(int) => match int.to_u64() {
None => Err(ShellError::range_error(
ExpectedRange::U64,
&format!("{}", int).spanned(span),
"converting an integer into a 64-bit integer",
)),
Some(num) => Ok(num),
},
other => Err(ShellError::type_error(
"integer",
other.type_name().spanned(span),
)),
}
}
}
impl From<BigDecimal> for Primitive {
/// Helper to convert from decimals to a Primitive value
fn from(decimal: BigDecimal) -> Primitive {
Primitive::Decimal(decimal)
}
}
impl From<f64> for Primitive {
/// Helper to convert from 64-bit float to a Primitive value
fn from(float: f64) -> Primitive {
if let Some(f) = BigDecimal::from_f64(float) {
Primitive::Decimal(f)
} else {
unreachable!("Internal error: protocol did not use f64-compatible decimal")
}
}
}
impl ShellTypeName for Primitive {
/// Get the name of the type of a Primitive value
fn type_name(&self) -> &'static str {
match self {
Primitive::Nothing => "nothing",
Primitive::Int(_) => "integer",
Primitive::Range(_) => "range",
Primitive::Decimal(_) => "decimal",
Primitive::Bytes(_) => "bytes",
Primitive::String(_) => "string",
Primitive::Line(_) => "line",
Primitive::ColumnPath(_) => "column path",
Primitive::Pattern(_) => "pattern",
Primitive::Boolean(_) => "boolean",
Primitive::Date(_) => "date",
Primitive::Duration(_) => "duration",
Primitive::Path(_) => "file path",
Primitive::Binary(_) => "binary",
Primitive::BeginningOfStream => "marker<beginning of stream>",
Primitive::EndOfStream => "marker<end of stream>",
}
}
}
/// Format a Primitive value into a string
pub fn format_primitive(primitive: &Primitive, field_name: Option<&String>) -> String {
match primitive {
Primitive::Nothing => String::new(),
Primitive::BeginningOfStream => String::new(),
Primitive::EndOfStream => String::new(),
Primitive::Path(p) => format!("{}", p.display()),
Primitive::Bytes(b) => {
let byte = byte_unit::Byte::from_bytes(*b as u128);
if byte.get_bytes() == 0u128 {
return "".to_string();
}
let byte = byte.get_appropriate_unit(false);
match byte.get_unit() {
byte_unit::ByteUnit::B => format!("{} B ", byte.get_value()),
_ => byte.format(1),
}
}
Primitive::Duration(sec) => format_duration(*sec),
Primitive::Int(i) => i.to_string(),
Primitive::Decimal(decimal) => format!("{:.4}", decimal),
Primitive::Range(range) => format!(
"{}..{}",
format_primitive(&range.from.0.item, None),
format_primitive(&range.to.0.item, None)
),
Primitive::Pattern(s) => s.to_string(),
Primitive::String(s) => s.to_owned(),
Primitive::Line(s) => s.to_owned(),
Primitive::ColumnPath(p) => {
let mut members = p.iter();
let mut f = String::new();
f.push_str(
&members
.next()
.expect("BUG: column path with zero members")
.display(),
);
for member in members {
f.push('.');
f.push_str(&member.display())
}
f
}
Primitive::Boolean(b) => match (b, field_name) {
(true, None) => "Yes",
(false, None) => "No",
(true, Some(s)) if !s.is_empty() => s,
(false, Some(s)) if !s.is_empty() => "",
(true, Some(_)) => "Yes",
(false, Some(_)) => "No",
}
.to_owned(),
Primitive::Binary(_) => "<binary>".to_owned(),
Primitive::Date(d) => format_date(d),
}
}
/// Format a duration in seconds into a string
pub fn format_duration(sec: u64) -> String {
let (minutes, seconds) = (sec / 60, sec % 60);
let (hours, minutes) = (minutes / 60, minutes % 60);
let (days, hours) = (hours / 24, hours % 24);
match (days, hours, minutes, seconds) {
(0, 0, 0, 1) => "1 sec".to_owned(),
(0, 0, 0, s) => format!("{} secs", s),
(0, 0, m, s) => format!("{}:{:02}", m, s),
(0, h, m, s) => format!("{}:{:02}:{:02}", h, m, s),
(d, h, m, s) => format!("{}:{:02}:{:02}:{:02}", d, h, m, s),
}
}
/// Format a UTC date value into a humanized string (eg "1 week ago" instead of a formal date string)
pub fn format_date(d: &DateTime<Utc>) -> String {
let utc: DateTime<Utc> = Utc::now();
let duration = utc.signed_duration_since(*d);
if duration.num_weeks() >= 52 {
let num_years = duration.num_weeks() / 52;
format!(
"{} year{} ago",
num_years,
if num_years == 1 { "" } else { "s" }
)
} else if duration.num_weeks() >= 4 {
let num_months = duration.num_weeks() / 4;
format!(
"{} month{} ago",
num_months,
if num_months == 1 { "" } else { "s" }
)
} else if duration.num_weeks() >= 1 {
let num_weeks = duration.num_weeks();
format!(
"{} week{} ago",
num_weeks,
if num_weeks == 1 { "" } else { "s" }
)
} else if duration.num_days() >= 1 {
let num_days = duration.num_days();
format!(
"{} day{} ago",
num_days,
if num_days == 1 { "" } else { "s" }
)
} else if duration.num_hours() >= 1 {
let num_hours = duration.num_hours();
format!(
"{} hour{} ago",
num_hours,
if num_hours == 1 { "" } else { "s" }
)
} else if duration.num_minutes() >= 1 {
let num_minutes = duration.num_minutes();
format!(
"{} min{} ago",
num_minutes,
if num_minutes == 1 { "" } else { "s" }
)
} else {
let num_seconds = duration.num_seconds();
format!(
"{} sec{} ago",
num_seconds,
if num_seconds == 1 { "" } else { "s" }
)
}
}