Compare commits

...

216 Commits
0.6.0 ... 0.8.0

Author SHA1 Message Date
41ebc6b42d Bump to 0.8.0 (#1166) 2020-01-07 20:08:31 +13:00
b574dc6365 Add the from-ods command (#1161)
* Put a sample_data.ods file for testing

This is a copy of the sample_data.xlsx file but in ods format

* Add the from-ods command

Most of the work was doing `rg xlsx` and then copy/paste with light editing

* Add tests for the from-ods command

* Fix failing test

The problem was improper filename sorting in the test `prepares_and_decorates_filesystem_source_files`
2020-01-07 19:35:00 +13:00
4af9e1de41 Resolves #750 (#1164)
Pick now produces an error when none of the columns are found
2020-01-07 17:06:48 +13:00
77d856fd53 Last unwraps (#1160)
* Work through most of the last unwraps

* Finish removing unwraps
2020-01-04 19:44:17 +13:00
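The un-unwrapping commits in this range replace panicking `.unwrap()` calls with explicit error propagation (the CI diff further down also starts denying `clippy::result_unwrap_used` and `clippy::option_unwrap_used`). A generic before/after sketch of that pattern in plain Rust; the file name and parsing logic are illustrative, not code from the nushell tree:

```rust
use std::fs;

// Before: panics if the file is missing or does not contain a number.
#[allow(dead_code)]
fn read_count_unwrap(path: &str) -> u64 {
    fs::read_to_string(path).unwrap().trim().parse().unwrap()
}

// After: both failure modes are reported to the caller instead of panicking.
fn read_count(path: &str) -> Result<u64, String> {
    let text = fs::read_to_string(path).map_err(|e| e.to_string())?;
    text.trim().parse().map_err(|e| e.to_string())
}

fn main() {
    match read_count("count.txt") {
        Ok(n) => println!("count = {}", n),
        Err(e) => eprintln!("could not read count: {}", e),
    }
}
```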
6dceabf389 Isolate data processing helpers. (#1159)
Isolate data processing helpers. Remove unwraps; now down to zero unwraps.
2020-01-03 23:00:39 -05:00
5919c6c433 Remove unwraps (#1153)
* Remove a batch of unwraps

* finish another batch
2020-01-04 10:11:21 +13:00
339a2de0eb More ununwraps (#1152)
* More ununwraps

* More ununwraps

* Update completer.rs

* Update completer.rs
2020-01-03 06:51:20 +13:00
3e3cb15f3d Yet more ununwraps (#1150) 2020-01-02 20:07:17 +13:00
5e31851070 A couple more (#1149) 2020-01-02 18:24:41 +13:00
0f626dd076 Another batch of un-unwrapping (#1148)
Another batch of un-unwrappings
2020-01-02 17:02:46 +13:00
aa577bf9bf Clean up some unwraps (#1147) 2020-01-02 09:45:32 +13:00
25298d35e4 Bump rustyline (#1146)
* Slightly improve new which command

* Bump rustyline
2020-01-02 06:54:25 +13:00
78016446dc Slightly improve new which command (#1145) 2020-01-01 20:47:25 +13:00
b304de8199 Rewrite which (#1144)
* Detect built-in commands passed as args to `which`

This expands the built-in `which` command to detect nushell commands
that may have the same name as a binary in the path.

* Allow which to interpret multiple arguments

Previously, it would discard any argument besides the first. This allows
`which` to process multiple arguments. It also makes the output a stream
of rows.

* Use map to build the output

* Add boolean column for builtins

* Use macros for entry creation shortcuts

* Process command args and use async_stream

In order to use `ichwh`, I'll need to use async_stream. But in order to
avoid lifetime errors with that, I have to process the command args
before using them. I'll admit I don't fully understand what is going on
with the `args.process(...)` function, but it works.

* Use `ichwh` for path searching

This commit transitions from `which` to `ichwh`. The path search is now
done asynchronously.

* Enable the `--all` flag on `which`

* Make `which` respect external commands

Escaped commands passed to which (e.g., `which "^ls"`) are now searched
before builtins.

* Fix clippy warnings

This commit resolves two warnings from clippy, in light of #1142.

* Update Cargo.lock to get new `ichwh` version

`ichwh@0.2.1` has support for local paths.

* Add documentation for command
2020-01-01 19:45:27 +13:00
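For the PATH-lookup half of what the commit above describes, a minimal synchronous sketch assuming the `which` crate as a dependency (the crate this PR migrates away from; `ichwh` performs the equivalent lookup asynchronously). The candidate names are illustrative:

```rust
fn main() {
    // Resolve each name against PATH, the way a `which <name>` lookup would.
    for name in &["ls", "definitely-not-a-command"] {
        match which::which(name) {
            Ok(path) => println!("{} -> {}", name, path.display()),
            Err(_) => println!("{}: not found in PATH", name),
        }
    }
}
```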
72838cc083 Move to using clippy (#1142)
* Clippy fixes

* Finish converting to use clippy

* fix warnings in new master

* fix windows

* fix windows

Co-authored-by: Artem Vorotnikov <artem@vorotnikov.me>
2019-12-31 20:36:08 +13:00
8093612cac Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight (#1141)
* Allow moving in text with Ctrl+ArrowLeft, Ctrl+ArrowRight

* Document changes

* Format
2019-12-31 17:06:36 +13:00
f37f29b441 Add uniq command (#1132)
* start playing with ways to use the uniq command

* WIP

* Got uniq working, but still need to figure out args issue and add tests

* Add some tests for uniq

* fmt

* remove commented out code

* Add documentation and some additional tests showing uniq values and rows. Also removed args TODO

* add changes that didn't get committed

* whoops, I didn't save the docs correctly...

* fmt

* Add a test for uniq with nested json

* Add another test

* Fix unique-ness when json keys are out of order and make the test json more complicated
2019-12-31 17:05:02 +13:00
dba82ac530 handle single quoted external command args (#1139)
fixes #1138
2019-12-31 06:47:14 +13:00
0615adac94 Inc refactoring, Value helper test method extractions, and more integration helpers. (#1135)
* Manifests check. Ignore doctests for now.

* We continue with refactorings towards the separation of concerns between
crates. The common test helpers used by `nu_plugin_inc` and `nu_plugin_str`
have been refactored into `nu-plugin` value test helpers.

Inc also uses the new API for integration tests.
2019-12-29 00:17:24 -05:00
21e508009f Refactor struct names for old commands (ls, cd, pwd) (#1133) 2019-12-29 10:33:31 +13:00
a9317d939f Update README.md 2019-12-28 15:27:51 +13:00
65d843c2a1 Merge pull request #1128 from andrasio/nu-plugin-extract
Extract nu-plugin crate.
2019-12-27 09:16:18 -05:00
f6c62bf121 Nu plugins now depend on nu-plugin crate. 2019-12-27 08:52:15 -05:00
b4bc5fe9af Merge pull request #1126 from jonathandturner/utf8_fix
UTF8 fix for twitter-reported issue
2019-12-27 19:48:42 +13:00
10368d7060 UTF8 fix for twitter-reported issue 2019-12-27 19:25:44 +13:00
68a314b5cb UTF8 fix for twitter-reported issue 2019-12-27 19:03:00 +13:00
3c7633ae9f Merge pull request #1125 from notryanb/update-readme
update readme to reflect >= 0.7.2 $nu variables
2019-12-27 15:42:25 +13:00
dba347ad00 update readme to show >= 0.7 nu path 2019-12-26 20:08:30 -05:00
bfba2c57f8 Merge pull request #1124 from quebin31/master
Fix positional macro on crate nu-macros
2019-12-27 07:16:47 +13:00
c69bf9f46f Merge branch 'master' of https://github.com/nushell/nushell 2019-12-26 12:32:28 -05:00
7ce1ddc6fd Fixed optional and required argument in signature.
This fixes issues like #1117
2019-12-26 12:29:41 -05:00
e7ce6f2fcd Merge pull request #1113 from jonathandturner/bump_0_7_2
Bump to 0.7.2
2019-12-24 14:51:58 +13:00
0c786bb890 Bump to 0.7.2 2019-12-24 14:51:10 +13:00
8d31c32bda Merge pull request #1112 from jonathandturner/assorted_fixes
Fix an assortment of issues
2019-12-24 14:45:15 +13:00
e7fb15be59 Fix an assortment of issues 2019-12-24 14:26:47 +13:00
be7550822c Merge pull request #1109 from nushell/ctrl_l_clear
Move to git rustyline to fix Ctrl-L
2019-12-24 05:48:42 +13:00
0ce216eec4 Move to git rustyline to fix Ctrl+L 2019-12-24 05:26:30 +13:00
1fe85cb91e Merge pull request #1108 from thegedge/faster-pipelines
Wait for process instead of polling its status.
2019-12-23 07:06:16 +13:00
8cadc5a4ac Wait for process instead of polling its status.
This provides a huge performance boost for pipelines that end in an
external command. Rough testing shows an improvement from roughly 400ms
to 30ms when `cat`-ing a large file.
2019-12-22 14:14:03 -03:30
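A small sketch of the difference in plain `std::process` terms; the `cat Cargo.toml` command and the 100 ms poll interval are illustrative assumptions, not the actual nushell code:

```rust
use std::process::Command;
use std::{thread, time::Duration};

fn main() -> std::io::Result<()> {
    // Polling: repeatedly ask whether the child has exited, sleeping between
    // checks. Each sleep adds latency after the child is already done.
    let mut child = Command::new("cat").arg("Cargo.toml").spawn()?;
    loop {
        match child.try_wait()? {
            Some(status) => {
                println!("polled exit: {}", status);
                break;
            }
            None => thread::sleep(Duration::from_millis(100)),
        }
    }

    // Waiting: wait() blocks and returns as soon as the child exits.
    let status = Command::new("cat").arg("Cargo.toml").spawn()?.wait()?;
    println!("waited exit: {}", status);
    Ok(())
}
```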
f9da7f7d58 Merge pull request #1102 from jonathandturner/bump_nu
Bump nu version
2019-12-20 10:54:30 +13:00
367f11a62e Bump nu version 2019-12-20 09:03:54 +13:00
8a45ca9cc3 Merge pull request #1100 from nushell/fix-stable
Fix the stable plugins to correct list
2019-12-20 06:37:55 +13:00
e336930fd8 Update Cargo.toml 2019-12-20 06:18:06 +13:00
172ccc910e Fix the stable plugins to correct list 2019-12-20 06:01:42 +13:00
a8425daf14 Merge pull request #1097 from jonathandturner/fix_workspace
Fix the workspace I commented out
2019-12-18 10:14:16 -08:00
b629136528 Fix the workspace I commented out 2019-12-19 06:58:23 +13:00
91ebb7f718 Merge pull request #1096 from jonathandturner/copy_core_plugins
Copy core plugins back so we can publish
2019-12-18 08:54:31 -08:00
96484161c0 Copy core plugins back so we can publish 2019-12-19 05:35:17 +13:00
d21ddeeae6 Merge pull request #1094 from jonathandturner/rename_test_support
Rename test-support to nu-test-support
2019-12-17 11:08:24 -08:00
4322d373e6 More renames 2019-12-18 07:54:39 +13:00
08571392e6 Rename test-support to nu-test-support 2019-12-18 07:41:47 +13:00
f52235b1c1 Merge pull request #1093 from jonathandturner/fix_asset
Try to fix asset building
2019-12-17 10:28:15 -08:00
a66147da47 Try to fix asset building 2019-12-18 07:09:38 +13:00
df778afd1f Try to fix asset building 2019-12-18 07:05:12 +13:00
d7ddaa376b Merge pull request #1092 from jonathandturner/oops
More oops
2019-12-17 09:11:52 -08:00
2ce892c6f0 More oops 2019-12-18 06:11:14 +13:00
28179ef450 Merge pull request #1091 from jonathandturner/add_descs
Oops
2019-12-17 09:09:30 -08:00
2c6336c806 Oops 2019-12-18 06:08:45 +13:00
761fc9ae73 Merge pull request #1090 from jonathandturner/add_descs
Add missing descriptions and licenses to subcrates
2019-12-17 09:07:36 -08:00
314c3c4a97 Add missing descriptions and licenses to subcrates 2019-12-18 06:07:00 +13:00
f7f1fba94f Merge pull request #1089 from jonathandturner/bump
Bump Nu version
2019-12-17 08:54:02 -08:00
14817ef229 Subcrate versions 2019-12-18 05:18:10 +13:00
98233dcec1 Subcrate versions 2019-12-18 05:09:53 +13:00
6540509911 Bump Nu version 2019-12-18 04:55:49 +13:00
594eae1cbc Merge pull request #1085 from andrasio/externals-line
$it can contain a string line or plain string data.
2019-12-16 17:42:49 -05:00
5e961815fc can contain a string line or plain string data. 2019-12-16 17:27:36 -05:00
fa9329c8e3 Merge pull request #1082 from sebastian-xyz/update-book-links
update links to books
2019-12-15 14:34:38 -08:00
6c577e18ca Merge pull request #1081 from andrasio/test-extract
Start test organization facelift.
2019-12-15 11:46:58 -05:00
4034129dba This commit is the continuing phase of extracting functionality to subcrates. We extract test helpers and begin to change Nu shell's test organization along with it. 2019-12-15 11:34:58 -05:00
52cf65c19e Merge pull request #1080 from andrasio/command-refactor
Separate internal and external command definitions.
2019-12-15 08:49:45 -05:00
cbbb246a6d update links to books 2019-12-15 13:56:26 +01:00
87cc6d6f01 Separate internal and external command definitions. 2019-12-15 01:24:31 -05:00
4b9ef5a9d0 Merge pull request #1079 from jonathandturner/bump_some_deps
Bump heim and necessary deps
2019-12-14 09:32:23 -08:00
31c703891a Bump heim and necessary deps 2019-12-15 02:27:14 +13:00
550bda477b Merge pull request #1060 from naufraghi/issues-972-expand-tilde-as-home-in-external-commands
Expand tilde as home in external commands
2019-12-13 08:46:08 -08:00
219b7e64cd Use shellexpand to expand ~ in external commands
Add tests for ~tilde expansion:

- test that "~" is expanded (no more "~" in output)
- ensure that "1~1" is not expanded to "1/home/user1" as it was
  before

Fixes #972

Note: the first test does not check the literal expansion because
the path on Windows is expanded as a Linux path, but the correct
expansion may come for free once `shellexpand` uses the `dirs`
crate too (https://github.com/netvl/shellexpand/issues/3).
2019-12-13 11:54:41 +01:00
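A minimal sketch of the behaviour those tests describe, calling the `shellexpand` crate directly (the crate is assumed as a dependency; the path is illustrative):

```rust
fn main() {
    // A leading "~" expands to the current user's home directory...
    println!("{}", shellexpand::tilde("~/notes.txt"));

    // ...while a "~" inside a word is left untouched, so "1~1" does not
    // become "1/home/user1".
    assert_eq!(shellexpand::tilde("1~1"), "1~1");
    println!("1~1 stays 1~1");
}
```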
98c59f77b2 Merge pull request #1078 from nushell/enable_coloring_in_tokens
Remove the coloring_in_tokens feature flag
2019-12-12 13:08:35 -08:00
e8800fdd0c Remove the coloring_in_tokens feature flag
Stabilize and enable
2019-12-12 11:34:43 -08:00
09f903c37a Merge pull request #1077 from nushell/implement-signature-syntax
Add Range and start Signature support
2019-12-11 21:58:09 -08:00
57af9b5040 Add Range and start Signature support
This commit contains two improvements:

- Support for a Range syntax (and a corresponding Range value)
- Work towards a signature syntax

Implementing the Range syntax resulted in cleaning up how operators in
the core syntax works. There are now two kinds of infix operators

- tight operators (`.` and `..`)
- loose operators

Tight operators may not be interspersed (`$it.left..$it.right` is a
syntax error). Loose operators require whitespace on both sides of the
operator, and can be arbitrarily interspersed. Precedence is left to
right in the core syntax.

Note that delimited syntax (like `( ... )` or `[ ... ]`) is a single
token node in the core syntax. A single token node can be parsed from
beginning to end in a context-free manner.

The rule for `.` is `<token node>.<member>`. The rule for `..` is
`<token node>..<token node>`.

Loose operators all have the same syntactic rule: `<token
node><space><loose op><space><token node>`.

The second aspect of this pull request is the beginning of support for a
signature syntax. Before implementing signatures, a necessary
prerequisite is for the core syntax to support multi-line programs.

That work establishes a few things:

- `;` and newlines are handled in the core grammar, and both count as
  "separators"
- line comments begin with `#` and continue until the end of the line

In this commit, multi-token productions in the core grammar can use
separators interchangeably with spaces. However, I think we will
ultimately want a different rule preventing separators from occurring
before an infix operator, so that the end of a line is always
unambiguous. This would avoid gratuitous differences between modules and
repl usage.

We already effectively have this rule, because otherwise `x<newline> |
y` would be a single pipeline, but of course that wouldn't work.
2019-12-11 16:41:07 -08:00
16272b1b20 Merge pull request #1076 from jonathandturner/finish_plugin_refactor
Trying this as a workaround to the [[bin]] issue
2019-12-09 20:20:51 -08:00
1dcbd89a89 Trying this as a workaround to the [[bin]] issue 2019-12-10 16:57:55 +13:00
eb6ef02ad1 Merge pull request #1075 from jonathandturner/finish_plugin_refactor
Finish plugin refactor
2019-12-09 18:34:26 -08:00
17586bdfbd Fix missing dep 2019-12-10 15:13:22 +13:00
0e98cf3f1e Merge branch 'finish_plugin_refactor' of github.com:jonathandturner/nushell into finish_plugin_refactor 2019-12-10 13:59:44 +13:00
e2a95c3e1d Move str and inc to core plugins 2019-12-10 13:59:13 +13:00
5cb7df57fc Update azure-pipelines.yml 2019-12-10 13:09:25 +13:00
88f899d341 Move some plugins back to being core shippable plugins 2019-12-10 13:05:40 +13:00
7d70b5feda Try to fix CI with new subcrates 2019-12-10 08:14:58 +13:00
fd6ee03391 Remove old ValueExt 2019-12-10 07:52:01 +13:00
9f702fe01a Move the remainder of the plugins to crates 2019-12-10 07:39:51 +13:00
c9d9eec7f8 Merge pull request #1073 from jonathandturner/docker_wrap
Remove partial docker plugin. Embed->wrap
2019-12-08 21:08:03 -08:00
38cbfdb8a9 Remove partial docker plugin. Embed->wrap 2019-12-09 17:41:09 +13:00
f9b7376949 Merge pull request #1072 from jonathandturner/format_parse
Move format/parse to core commands
2019-12-08 18:26:35 -08:00
e98ed1b43d Move format/parse to core commands 2019-12-09 15:04:13 +13:00
251c3e103d Move format/parse to core commands 2019-12-09 14:57:53 +13:00
d26e938436 Merge pull request #1071 from jonathandturner/fix_1068
Fix 1068
2019-12-08 12:38:10 -08:00
dbadf9499e Fix 1068 2019-12-09 08:15:14 +13:00
28df1559ea Merge pull request #1070 from jonathandturner/upgrade_some_deps
Upgrade some dependencies
2019-12-08 10:19:39 -08:00
91784218c0 Upgrade some dependencies 2019-12-09 06:56:21 +13:00
eeec5e10c3 Merge pull request #1069 from jonathandturner/param_complete
Named param completion
2019-12-08 08:55:13 -08:00
0515ed976c Fix panic 2019-12-09 05:36:24 +13:00
f653992b4a A little cleanup 2019-12-08 19:42:43 +13:00
b5f8c1cc50 param completions work now 2019-12-08 19:23:31 +13:00
f9a46ce1e7 WIP param completions 2019-12-08 19:04:23 +13:00
b6ba7f97fd WIP param completions 2019-12-08 18:58:53 +13:00
7a47905f11 Merge pull request #1066 from thibran/fix-more-clippy-warnings
Fix more Clippy warnings
2019-12-07 16:10:36 -08:00
683f4c35d9 Fix more Clippy warnings
cargo clippy -- -W clippy::correctness
2019-12-07 21:04:58 +01:00
dfa5173cf4 Merge pull request #1064 from thibran/split-table-from-list
split format/table::from_list into multiple functions
2019-12-07 09:00:14 -08:00
04b214bef6 split format/table::from_list into multiple functions 2019-12-07 14:52:52 +01:00
37cb7fec77 Merge pull request #1063 from jonathandturner/unused_deps
Remove some unused deps
2019-12-06 23:44:52 -08:00
8833969e4a Remove some unused deps 2019-12-07 20:23:29 +13:00
bda238267c Merge pull request #1062 from jonathandturner/fetch_post
Fetch/post as plugins
2019-12-06 22:46:30 -08:00
d07dc57537 Add missing fallback case 2019-12-07 19:24:58 +13:00
d0a2888e88 Finish adding makeshift support for fetch/post plugins 2019-12-07 17:23:59 +13:00
cec2eff933 Merge branch 'master' into fetch_post 2019-12-07 16:53:50 +13:00
38b7a3e32b WIP move post/fetch to plugins 2019-12-07 16:46:05 +13:00
9dfb6c023f Merge pull request #1061 from thibran/fix-most-clippy-warnings
Fix most Clippy performance warnings
2019-12-06 19:26:20 -08:00
cde92a9fb9 Fix most Clippy performance warnings
command used: cargo clippy -- -W clippy::perf
2019-12-06 23:25:47 +01:00
5622bbdd48 Merge pull request #1059 from coolshaurya/patch-1
Fix minor error in reject command docs
2019-12-06 08:13:55 -08:00
3d79a9c37a Fix minor error in reject command docs 2019-12-06 17:27:14 +05:30
a2a5b30568 Merge pull request #1058 from jonathandturner/edit_insert_core
Move edit and insert to core
2019-12-05 12:42:19 -08:00
768adb84a4 Remove commented out region 2019-12-06 09:19:24 +13:00
26b0250e22 Remove commented out region 2019-12-06 09:18:16 +13:00
6893850fce Move edit and insert to core 2019-12-06 09:15:41 +13:00
8834e6905e Merge pull request #1055 from jonathandturner/ps_sys_crates
Extract ps and sys subcrates. Move helper methods to UntaggedValue
2019-12-04 12:24:45 -08:00
1d5f13ddca formatting 2019-12-05 08:57:03 +13:00
d12c16a331 Extract ps and sys subcrates. Move helper methods to UntaggedValue 2019-12-05 08:52:31 +13:00
ecf47bb3ab Merge pull request #1054 from jonathandturner/binaryview_crate
Move binaryview to a sub-crate
2019-12-04 10:17:01 -08:00
a4bb5d4ff5 Move binaryview to a sub-crate 2019-12-05 06:51:20 +13:00
e9ee7bda46 Merge pull request #1052 from jonathandturner/fix_textview
Re-enable the textview plugin, now its own crate
2019-12-04 08:49:40 -08:00
1d196394f6 Merge pull request #1045 from sebastian-xyz/range
add range command
2019-12-04 08:37:03 -08:00
cfda67ff82 Finish making the textview plugin optional 2019-12-05 05:28:48 +13:00
59510a85d1 fix build warnings 2019-12-04 17:13:21 +01:00
35edf22ac3 Test all subcrates 2019-12-04 19:53:06 +13:00
871fc72892 Test all subcrates 2019-12-04 19:49:38 +13:00
1fcf671ca4 Re-enable the textview plugin, now its own crate 2019-12-04 19:38:40 +13:00
ecebe1314a update to new crates structure 2019-12-03 20:56:39 +01:00
bda5db59c8 Merge remote-tracking branch 'upstream/master' into range 2019-12-03 20:23:49 +01:00
4526d757b6 Merge pull request #1049 from andrasio/embed-list
embed as column when embedding a list
2019-12-03 02:51:58 -05:00
e5405d7f5c embed as column when embedding a list 2019-12-03 02:26:01 -05:00
201506a5ad add tests for range + run rustfmt 2019-12-03 08:24:49 +01:00
49f9253ca2 Merge pull request #1047 from jonathandturner/new_lines
Add new line primitive, bump version, allow bare filepaths
2019-12-02 23:14:08 -08:00
efc879b955 Add new line primitive, bump version, allow bare filepaths 2019-12-03 19:44:59 +13:00
3fa03eb7a4 Merge pull request #1046 from nushell/fix-external-words
Clean up expansion of external words
2019-12-02 17:12:50 -08:00
24bad78607 Clean up expansion of external words
Previously, external words accidentally used
ExpansionRule::new().allow_external_command(), when it should have been
ExpansionRule::new().allow_external_word().

External words are the broadest category in the parser, and are the
appropriate category for external arguments. This was just a mistake.
2019-12-02 16:34:33 -08:00
8de4c9dbb7 Merge pull request #1044 from nushell/protocol-extraction
Extract into crates
2019-12-02 14:29:04 -08:00
f858e854bf Fix a rebase mistake 2019-12-02 13:48:34 -08:00
87dbd3d5ac Extract build.rs 2019-12-02 13:14:51 -08:00
fe66b4c8ea Merge remote-tracking branch 'origin/master' into protocol-extraction 2019-12-02 11:16:00 -08:00
8390cc97e1 add range command 2019-12-02 20:15:14 +01:00
c0a7d4e2a7 Update .gitpod.yml 2019-12-02 11:02:59 -08:00
ce23a672d9 add documentation for compact command 2019-12-02 11:02:59 -08:00
9851317aeb add documentation for default command 2019-12-02 11:02:59 -08:00
3fb4a5d6e6 add documentation for format 2019-12-02 11:02:59 -08:00
340e701124 fix error in save.md 2019-12-02 11:02:59 -08:00
36938a4407 add documentation for save, config 2019-12-02 11:02:59 -08:00
6a6589a357 Update where.md 2019-12-02 11:02:59 -08:00
b94a32e523 add documentation for from-json, from-yaml, history, split-row 2019-12-02 11:02:59 -08:00
7db3c69984 update histogram, nth documentation 2019-12-02 11:02:59 -08:00
5406450c42 Add documentation for histogram, split-column 2019-12-02 11:02:59 -08:00
d6a6e16d21 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-12-02 11:02:59 -08:00
ea1b65916d Update Cargo.toml 2019-12-02 11:02:59 -08:00
cd9d9ad50b improve duration print 2019-12-02 11:02:58 -08:00
552272b37e replace and find-replace str plugin additions. 2019-12-02 11:02:58 -08:00
388ce738e3 expand tilde in externals 2019-12-02 11:02:58 -08:00
ef7fbcbe9f Update README.md 2019-12-02 11:02:58 -08:00
80941ace37 Add 0.6.1 release 2019-12-02 11:02:58 -08:00
f317500873 Update from-yaml.md 2019-12-02 11:02:58 -08:00
911414a190 Update config.md 2019-12-02 11:02:58 -08:00
cca6360bcc add documentation for from-tsv, from-xml 2019-12-02 11:02:58 -08:00
f68503fa21 add documentation for get, ps 2019-12-02 11:02:58 -08:00
911b69dff0 Update some command docs 2019-12-02 11:02:58 -08:00
4115634bfc Try to re-apply #1039 2019-12-02 11:02:58 -08:00
8a0bdde17a Remove env var from starship 2019-12-02 11:02:58 -08:00
a1e21828d6 Fix tests 2019-12-02 11:02:57 -08:00
0f193c2337 Update histogram.rs 2019-12-02 11:02:57 -08:00
526d94d862 improve duration print
original commit: ddb9d3a864
2019-12-02 11:02:57 -08:00
2fdafa52b1 replace and find-replace str plugin additions. 2019-12-02 11:02:57 -08:00
f52c0655c7 expand tilde in externals
original: 9f42d7693f
2019-12-02 11:02:57 -08:00
97331c7b25 Update README 2019-12-02 11:02:57 -08:00
1fb5a419a7 Bump the release version 2019-12-02 11:02:57 -08:00
4e9afd6698 Refactor classified.rs into separate modules.
Adds modules for internal, external, and dynamic commands, as well as
the pipeline functionality. These are exported as their old names from
the classified module so as to keep its "interface" the same.
2019-12-02 11:02:57 -08:00
8f9dd6516e Add =~ and !~ operators on strings
`left =~ right` returns true if left contains right, using Rust's
`String::contains`. `!~` is the negated version.

A new `apply_operator` function is added which decouples evaluation from
`Value::compare`. This returns a `Value` and opens the door to
implementing `+` for example, though it wouldn't be useful immediately.

The `operator!` macro had to be changed slightly as it would choke on
`~` in arguments.
2019-12-02 11:02:57 -08:00
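A rough sketch of the comparison rule described above in plain Rust; this is not the actual `apply_operator` implementation, only the `contains`-based semantics it encodes:

```rust
// `left =~ right` holds when `left` contains `right`; `!~` is the negation.
fn contains_match(left: &str, right: &str) -> bool {
    left.contains(right)
}

fn main() {
    assert!(contains_match("nushell", "shell")); // "nushell" =~ "shell" is true
    assert!(!contains_match("nushell", "fish")); // "nushell" !~ "fish" is true
    println!("operator semantics hold");
}
```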
e4226def16 Extract core stuff into own crates
This commit extracts five new crates:

- nu-source, which contains the core source-code handling logic in Nu,
  including Text, Span, and also the pretty.rs-based debug logic
- nu-parser, which is the parser and expander logic
- nu-protocol, which is the bulk of the types and basic conveniences
  used by plugins
- nu-errors, which contains ShellError, ParseError and error handling
  conveniences
- nu-textview, which is the textview plugin extracted into a crate

One of the major consequences of this refactor is that it's no longer
possible to `impl X for Spanned<Y>` outside of the `nu-source` crate, so
a lot of types became more concrete (Value became a concrete type
instead of Spanned<Value>, for example).

This also turned a number of inherent methods in the main nu crate into
plain functions (impl Value {} became a bunch of functions in the
`value` namespace in `crate::data::value`).
2019-12-02 10:54:12 -08:00
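The `Spanned<Y>` note above is a consequence of Rust's coherence rules once `Spanned` lives in its own `nu-source` crate: downstream crates can no longer add impls to it freely, so inherent methods give way to concrete types and free functions. A rough sketch of that shape, with illustrative names rather than the real nushell types:

```rust
// Illustrative stand-ins for the extracted types.
pub struct Span { pub start: usize, pub end: usize }
pub struct Spanned<T> { pub item: T, pub span: Span }

pub mod value {
    use super::Spanned;

    pub struct Value(pub String);

    // Previously something like a method on Value; after the crate split it
    // becomes a plain function in the `value` namespace.
    pub fn describe(v: &Spanned<Value>) -> String {
        format!("{} at {}..{}", v.item.0, v.span.start, v.span.end)
    }
}

fn main() {
    let v = Spanned {
        item: value::Value("ls".into()),
        span: Span { start: 0, end: 2 },
    };
    println!("{}", value::describe(&v));
}
```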
c199a84dbb Merge pull request #1039 from thegedge/move-pipeline-execution-out-of-cli
Move pipeline execution code into classified::Pipeline
2019-12-01 19:47:34 -08:00
5a4ca11362 Merge pull request #1043 from JesterOrNot/master
install all features for nushell for gitpod
2019-12-01 18:32:15 -08:00
f2968c8385 Update .gitpod.yml 2019-12-01 17:16:53 -06:00
8d01b019f4 Merge pull request #1041 from tchak/docs-compact-default
document compact and default
2019-12-01 09:01:50 -08:00
bf87330d6e add documentation for compact command 2019-12-01 17:44:43 +01:00
2bb85bdbd4 add documentation for default command 2019-12-01 17:39:09 +01:00
8f34c6eeda Merge pull request #1032 from sebastian-xyz/doc
add documentation for save, config, get, ps, from-tsv, from-xml
2019-11-30 18:15:39 -08:00
ac5543bad9 Move pipeline execution code into classified::Pipeline 2019-11-30 16:12:34 -05:00
e4c56a25c6 Merge remote-tracking branch 'refs/remotes/origin/doc' into doc 2019-11-30 21:21:15 +01:00
11ff8190b1 add documentation for format 2019-11-30 21:15:12 +01:00
9bd25d7427 fix error in save.md 2019-11-30 21:07:43 +01:00
5676713b1f Update README.md 2019-12-01 07:12:14 +13:00
b59231d32b Merge pull request #1035 from jonathandturner/bump_to_0_6_1
Add 0.6.1 release
2019-11-30 10:11:31 -08:00
e530cf0a9d Add 0.6.1 release 2019-12-01 07:10:51 +13:00
6bfb4207c4 Update from-yaml.md 2019-12-01 07:00:36 +13:00
c63ad610f5 Update config.md 2019-12-01 06:59:53 +13:00
e38a4323b4 add documentation for from-tsv, from-xml 2019-11-30 13:38:52 +01:00
d40aea5d0a add documentation for get, ps 2019-11-30 12:48:23 +01:00
1ba69e4b11 Merge pull request #1030 from jonathandturner/more_doc_updates
Update some command docs
2019-11-29 17:53:02 -08:00
f10390b1be Update some command docs 2019-11-30 14:24:39 +13:00
c2b1908644 Merge pull request #1029 from jonathandturner/fix_starship_env_var
Remove env var from starship
2019-11-29 12:00:10 -08:00
0a93335f6d Remove env var from starship 2019-11-30 08:38:44 +13:00
fbb65cde44 add documentation for save, config 2019-11-29 18:15:51 +01:00
8e7acd1094 Update where.md 2019-11-29 08:41:27 +13:00
c6ee6273db Merge pull request #1015 from sebastian-xyz/doc
Add documentation for histogram, split-column
2019-11-28 11:19:36 -08:00
c77059f891 add documentation for from-json, from-yaml, history, split-row 2019-11-28 19:33:17 +01:00
5bdda06ca6 update histogram, nth documentation 2019-11-28 19:32:31 +01:00
d8303dd6d6 Merge pull request #1026 from est31/new-cargo-lock
Switch to the new Cargo.lock format
2019-11-27 19:11:54 -08:00
60ec68b097 Switch to the new Cargo.lock format
This was achieved by deleting Cargo.lock
and letting a recent Cargo nightly re-create
it. Support for the format was already
introduced in Rust 1.38, but currently,
stable releases of Cargo only retain it
if encountered but don't generate such
files by default.

The new format is smaller, better suited to
prevent merge conflicts and generates smaller
diffs at dependency updates, leading to
smaller git history.

You can read more about it in this PR: https://github.com/rust-lang/cargo/pull/7070
2019-11-28 03:27:37 +01:00
deae66c194 Cargo update
This performs a cargo update to allow the upcoming commit
that switches to the new Cargo.lock format to be only about
that format change.
2019-11-28 03:17:31 +01:00
0845572878 Add documentation for histogram, split-column 2019-11-26 20:47:34 +01:00
393 changed files with 18990 additions and 15250 deletions

View File

@ -42,12 +42,18 @@ steps:
echo "##vso[task.prependpath]$HOME/.cargo/bin"
rustup component add rustfmt --toolchain "stable"
displayName: Install Rust
- bash: RUSTFLAGS="-D warnings" cargo test --all-features
- bash: RUSTFLAGS="-D warnings" cargo test --all --features=stable
condition: eq(variables['style'], 'unflagged')
displayName: Run tests
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all-features
- bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'unflagged')
displayName: Check clippy lints
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all --features=stable
condition: eq(variables['style'], 'canary')
displayName: Run tests
- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
condition: eq(variables['style'], 'canary')
displayName: Check clippy lints
- bash: cargo fmt --all -- --check
condition: eq(variables['style'], 'fmt')
displayName: Lint

View File

@ -1,7 +1,7 @@
image:
file: .gitpod.Dockerfile
tasks:
- init: cargo install nu
- init: cargo install nu --features=stable
command: nu
github:
prebuilds:

3426 Cargo.lock (generated)

File diff suppressed because it is too large.

View File

@ -1,6 +1,6 @@
[package]
name = "nu"
version = "0.6.0"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "A shell for the GitHub era"
license = "MIT"
@ -9,22 +9,64 @@ readme = "README.md"
default-run = "nu"
repository = "https://github.com/nushell/nushell"
homepage = "https://www.nushell.sh"
documentation = "https://book.nushell.sh"
documentation = "https://www.nushell.sh/book/"
[workspace]
members = ["crates/nu-source"]
members = [
"crates/nu-macros",
"crates/nu-errors",
"crates/nu-source",
"crates/nu_plugin_average",
"crates/nu_plugin_binaryview",
"crates/nu_plugin_fetch",
"crates/nu_plugin_inc",
"crates/nu_plugin_match",
"crates/nu_plugin_post",
"crates/nu_plugin_ps",
"crates/nu_plugin_str",
"crates/nu_plugin_sum",
"crates/nu_plugin_sys",
"crates/nu_plugin_textview",
"crates/nu_plugin_tree",
"crates/nu-protocol",
"crates/nu-plugin",
"crates/nu-parser",
"crates/nu-value-ext",
"crates/nu-build"
]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-source = { version = "0.1.0", path = "./crates/nu-source" }
nu-source = { version = "0.8.0", path = "./crates/nu-source" }
nu-plugin = { version = "0.8.0", path = "./crates/nu-plugin" }
nu-protocol = { version = "0.8.0", path = "./crates/nu-protocol" }
nu-errors = { version = "0.8.0", path = "./crates/nu-errors" }
nu-parser = { version = "0.8.0", path = "./crates/nu-parser" }
nu-value-ext = { version = "0.8.0", path = "./crates/nu-value-ext" }
nu_plugin_average = {version = "0.8.0", path = "./crates/nu_plugin_average", optional=true}
nu_plugin_binaryview = {version = "0.8.0", path = "./crates/nu_plugin_binaryview", optional=true}
nu_plugin_fetch = {version = "0.8.0", path = "./crates/nu_plugin_fetch", optional=true}
nu_plugin_inc = {version = "0.8.0", path = "./crates/nu_plugin_inc", optional=true}
nu_plugin_match = {version = "0.8.0", path = "./crates/nu_plugin_match", optional=true}
nu_plugin_post = {version = "0.8.0", path = "./crates/nu_plugin_post", optional=true}
nu_plugin_ps = {version = "0.8.0", path = "./crates/nu_plugin_ps", optional=true}
nu_plugin_str = {version = "0.8.0", path = "./crates/nu_plugin_str", optional=true}
nu_plugin_sum = {version = "0.8.0", path = "./crates/nu_plugin_sum", optional=true}
nu_plugin_sys = {version = "0.8.0", path = "./crates/nu_plugin_sys", optional=true}
nu_plugin_textview = {version = "0.8.0", path = "./crates/nu_plugin_textview", optional=true}
nu_plugin_tree = {version = "0.8.0", path = "./crates/nu_plugin_tree", optional=true}
nu-macros = { version = "0.8.0", path = "./crates/nu-macros" }
rustyline = "5.0.4"
chrono = { version = "0.4.9", features = ["serde"] }
query_interface = "0.3.5"
typetag = "0.1.4"
rustyline = "5.0.6"
chrono = { version = "0.4.10", features = ["serde"] }
derive-new = "0.5.8"
prettytable-rs = "0.8.0"
itertools = "0.8.1"
itertools = "0.8.2"
ansi_term = "0.12.1"
nom = "5.0.1"
dunce = "1.0.0"
@ -35,82 +77,86 @@ base64 = "0.11"
futures-preview = { version = "=0.3.0-alpha.19", features = ["compat", "io-compat"] }
async-stream = "0.1.2"
futures_codec = "0.2.5"
num-traits = "0.2.8"
num-traits = "0.2.10"
term = "0.5.2"
bytes = "0.4.12"
log = "0.4.8"
pretty_env_logger = "0.3.1"
serde = { version = "1.0.102", features = ["derive"] }
serde = { version = "1.0.103", features = ["derive"] }
bson = { version = "0.14.0", features = ["decimal128"] }
serde_json = "1.0.41"
serde_json = "1.0.44"
serde-hjson = "0.9.1"
serde_yaml = "0.8"
serde_bytes = "0.11.2"
serde_bytes = "0.11.3"
getset = "0.0.9"
language-reporting = "0.4.0"
app_dirs = "1.2.1"
csv = "1.1"
toml = "0.5.5"
clap = "2.33.0"
git2 = { version = "0.10.1", default_features = false }
git2 = { version = "0.10.2", default_features = false }
dirs = "2.0.2"
glob = "0.3.0"
ctrlc = "3.1.3"
surf = "1.0.3"
url = "2.1.0"
roxmltree = "0.7.2"
roxmltree = "0.7.3"
nom_locate = "1.0.0"
nom-tracable = "0.4.1"
unicode-xid = "0.2.0"
serde_ini = "0.2.0"
subprocess = "0.1.18"
mime = "0.3.14"
pretty-hex = "0.1.1"
hex = "0.4"
tempfile = "3.1.0"
semver = "0.9.0"
which = "3.1"
ichwh = "0.2"
textwrap = {version = "0.11.0", features = ["term_size"]}
shellexpand = "1.0.0"
futures-timer = "2.0.0"
pin-utils = "0.1.0-alpha.4"
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
natural = "0.3.0"
serde_urlencoded = "0.6.1"
sublime_fuzzy = "0.6"
trash = "1.0.0"
regex = "1"
cfg-if = "0.1"
strip-ansi-escapes = "0.1.0"
calamine = "0.16"
umask = "0.1"
futures-util = "0.3.0"
pretty = "0.5.2"
futures-util = "0.3.1"
termcolor = "1.0.5"
console = "0.9.1"
natural = "0.3.0"
parking_lot = "0.10.0"
neso = { version = "0.5.0", optional = true }
crossterm = { version = "0.10.2", optional = true }
syntect = {version = "3.2.0", optional = true }
onig_sys = {version = "=69.1.0", optional = true }
heim = {version = "0.0.8", optional = true }
battery = {version = "0.7.4", optional = true }
rawkey = {version = "0.1.2", optional = true }
clipboard = {version = "0.5", optional = true }
ptree = {version = "0.2" }
image = { version = "0.22.2", default_features = false, features = ["png_codec", "jpeg"], optional = true }
starship = { version = "0.26.4", optional = true}
starship = { version = "0.28", optional = true}
heim = {version = "0.0.9", optional = true}
battery = {version = "0.7.5", optional = true}
syntect = {version = "3.2.0", optional = true }
onig_sys = {version = "=69.1.0", optional = true }
crossterm = {version = "0.10.2", optional = true}
futures-timer = {version = "1.0.2", optional = true}
url = {version = "2.1.0", optional = true}
[features]
default = ["textview", "sys", "ps"]
raw-key = ["rawkey", "neso"]
textview = ["syntect", "onig_sys", "crossterm"]
binaryview = ["image", "crossterm"]
default = ["sys", "ps", "textview", "inc", "str"]
stable = ["sys", "ps", "textview", "inc", "str", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard"]
# Default
sys = ["heim", "battery"]
ps = ["heim"]
ps = ["heim", "futures-timer"]
textview = ["crossterm", "syntect", "onig_sys", "url"]
inc = ["nu_plugin_inc"]
str = ["nu_plugin_str"]
# Stable
average = ["nu_plugin_average"]
binaryview = ["nu_plugin_binaryview"]
fetch = ["nu_plugin_fetch"]
match = ["nu_plugin_match"]
post = ["nu_plugin_post"]
starship-prompt = ["starship"]
# trace = ["nom-tracable/trace"]
sum = ["nu_plugin_sum"]
trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"]
[dependencies.rusqlite]
version = "0.20.0"
@ -118,89 +164,47 @@ features = ["bundled", "blob"]
[dev-dependencies]
pretty_assertions = "0.6.1"
nu-test-support = { version = "0.8.0", path = "./crates/nu-test-support" }
[build-dependencies]
toml = "0.5.5"
serde = { version = "1.0.102", features = ["derive"] }
serde = { version = "1.0.103", features = ["derive"] }
nu-build = { version = "0.8.0", path = "./crates/nu-build" }
[lib]
name = "nu"
doctest = false
path = "src/lib.rs"
# Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary
# unless we use [[bin]], so we use this as a workaround
[[bin]]
name = "nu_plugin_inc"
path = "src/plugins/inc.rs"
[[bin]]
name = "nu_plugin_sum"
path = "src/plugins/sum.rs"
[[bin]]
name = "nu_plugin_average"
path = "src/plugins/average.rs"
[[bin]]
name = "nu_plugin_embed"
path = "src/plugins/embed.rs"
[[bin]]
name = "nu_plugin_insert"
path = "src/plugins/insert.rs"
[[bin]]
name = "nu_plugin_edit"
path = "src/plugins/edit.rs"
[[bin]]
name = "nu_plugin_format"
path = "src/plugins/format.rs"
[[bin]]
name = "nu_plugin_parse"
path = "src/plugins/parse.rs"
[[bin]]
name = "nu_plugin_str"
path = "src/plugins/str.rs"
[[bin]]
name = "nu_plugin_skip"
path = "src/plugins/skip.rs"
[[bin]]
name = "nu_plugin_match"
path = "src/plugins/match.rs"
[[bin]]
name = "nu_plugin_sys"
path = "src/plugins/sys.rs"
required-features = ["sys"]
[[bin]]
name = "nu_plugin_ps"
path = "src/plugins/ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_tree"
path = "src/plugins/tree.rs"
required-features = ["tree"]
[[bin]]
name = "nu_plugin_binaryview"
path = "src/plugins/binaryview.rs"
required-features = ["binaryview"]
[[bin]]
name = "nu_plugin_textview"
path = "src/plugins/textview.rs"
name = "nu_plugin_core_textview"
path = "src/plugins/nu_plugin_core_textview.rs"
required-features = ["textview"]
[[bin]]
name = "nu_plugin_docker"
path = "src/plugins/docker.rs"
required-features = ["docker"]
name = "nu_plugin_core_inc"
path = "src/plugins/nu_plugin_core_inc.rs"
required-features = ["inc"]
[[bin]]
name = "nu_plugin_core_ps"
path = "src/plugins/nu_plugin_core_ps.rs"
required-features = ["ps"]
[[bin]]
name = "nu_plugin_core_str"
path = "src/plugins/nu_plugin_core_str.rs"
required-features = ["str"]
[[bin]]
name = "nu_plugin_core_sys"
path = "src/plugins/nu_plugin_core_sys.rs"
required-features = ["sys"]
# Main nu binary
[[bin]]
name = "nu"
path = "src/main.rs"

View File

@ -18,9 +18,9 @@ Nu comes with a set of built-in commands (listed below). If a command is unknown
# Learning more
There are a few good resources to learn about Nu. There is a [book](https://book.nushell.sh) about Nu that is currently in progress. The book focuses on using Nu and its core concepts.
There are a few good resources to learn about Nu. There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress. The book focuses on using Nu and its core concepts.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://github.com/nushell/contributor-book/tree/master/en) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us.
@ -32,7 +32,7 @@ Try it in Gitpod.
## Local
Up-to-date installation instructions can be found in the [installation chapter of the book](https://book.nushell.sh/en/installation). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support.
Up-to-date installation instructions can be found in the [installation chapter of the book](https://www.nushell.sh/book/en/installation.html). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support.
To build Nu, you will need to use the **latest stable (1.39 or later)** version of the compiler.
@ -52,10 +52,10 @@ To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs
cargo install nu
```
You can also install Nu with all the bells and whistles (be sure to have installed the [dependencies](https://book.nushell.sh/en/installation#dependencies) for your platform):
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/en/installation.html#dependencies) for your platform), once you have checked out this repo with git:
```
cargo install nu --all-features
cargo build --all --features=stable
```
## Docker
@ -173,7 +173,7 @@ We can pipeline this into a command that gets the contents of one of the columns
━━━━━━━━━━━━━━━━━┯━━━━━━━━━━━━━━━━━━━━━━━━━━━━┯━━━━━━━━━┯━━━━━━━━━┯━━━━━━┯━━━━━━━━━
authors │ description │ edition │ license │ name │ version
─────────────────┼────────────────────────────┼─────────┼─────────┼──────┼─────────
[table: 3 rows] │ A shell for the GitHub era │ 2018 │ MIT │ nu │ 0.6.0
[table: 3 rows] │ A shell for the GitHub era │ 2018 │ MIT │ nu │ 0.6.1
━━━━━━━━━━━━━━━━━┷━━━━━━━━━━━━━━━━━━━━━━━━━━━━┷━━━━━━━━━┷━━━━━━━━━┷━━━━━━┷━━━━━━━━━
```
@ -181,7 +181,7 @@ Finally, we can use commands outside of Nu once we have the data we want:
```
/home/jonathan/Source/nushell(master)> open Cargo.toml | get package.version | echo $it
0.6.0
0.6.1
```
Here we use the variable `$it` to refer to the value being piped to the external command.
@ -202,7 +202,7 @@ To set one of these variables, you can use `config --set`. For example:
```
> config --set [edit_mode "vi"]
> config --set [path $nu:path]
> config --set [path $nu.path]
```
## Shells

View File

@ -46,3 +46,7 @@ Unify dictionary building, probably around a macro
sys plugin in own crate
textview in own crate
Combine atomic and atomic_parse in parser
at_end_possible_ws needs to be comment and separator sensitive

View File

@ -1,39 +1,3 @@
use serde::Deserialize;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::Path;
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
fn main() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR").unwrap();
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(",").map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning={}",
"Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let path = Path::new(&input).join("features.toml");
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled == true || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
nu_build::build()
}

View File

@ -0,0 +1,16 @@
[package]
name = "nu-build"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core build system for nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.103", features = ["derive"] }
lazy_static = "1.4.0"
serde_json = "1.0.44"
toml = "0.5.5"

View File

@ -0,0 +1,80 @@
use lazy_static::lazy_static;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
lazy_static! {
static ref WORKSPACES: Mutex<BTreeMap<String, &'static Path>> = Mutex::new(BTreeMap::new());
}
// got from https://github.com/mitsuhiko/insta/blob/b113499249584cb650150d2d01ed96ee66db6b30/src/runtime.rs#L67-L88
fn get_cargo_workspace(manifest_dir: &str) -> Result<Option<&Path>, Box<dyn std::error::Error>> {
let mut workspaces = WORKSPACES.lock()?;
if let Some(rv) = workspaces.get(manifest_dir) {
Ok(Some(rv))
} else {
#[derive(Deserialize)]
struct Manifest {
workspace_root: String,
}
let output = std::process::Command::new(env!("CARGO"))
.arg("metadata")
.arg("--format-version=1")
.current_dir(manifest_dir)
.output()?;
let manifest: Manifest = serde_json::from_slice(&output.stdout)?;
let path = Box::leak(Box::new(PathBuf::from(manifest.workspace_root)));
workspaces.insert(manifest_dir.to_string(), path.as_path());
Ok(workspaces.get(manifest_dir).cloned())
}
}
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
pub fn build() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR")?;
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(',').map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning=Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let workspace = match get_cargo_workspace(&input)? {
// If the crate is being downloaded from crates.io, it won't have a workspace root, and that's ok
None => return Ok(()),
Some(workspace) => workspace,
};
let path = Path::new(&workspace).join("features.toml");
// If the crate is being downloaded from crates.io, it won't have a features.toml, and that's ok
if !path.exists() {
return Ok(());
}
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
}

View File

@ -0,0 +1,32 @@
[package]
name = "nu-errors"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core error subsystem for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.8.0" }
ansi_term = "0.12.1"
bigdecimal = { version = "0.1.0", features = ["serde"] }
derive-new = "0.5.8"
language-reporting = "0.4.0"
num-bigint = { version = "0.2.3", features = ["serde"] }
num-traits = "0.2.10"
serde = { version = "1.0.103", features = ["derive"] }
nom = "5.0.1"
nom_locate = "1.0.0"
# implement conversions
subprocess = "0.1.18"
serde_yaml = "0.8"
toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.8.0", path = "../nu-build" }

View File

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -1,71 +1,67 @@
use crate::prelude::*;
use ansi_term::Color;
use bigdecimal::BigDecimal;
use derive_new::new;
use language_reporting::{Diagnostic, Label, Severity};
use nu_source::{Spanned, TracableContext};
use nu_source::{b, DebugDocBuilder, PrettyDebug, Span, Spanned, SpannedItem, TracableContext};
use num_bigint::BigInt;
use num_traits::ToPrimitive;
use serde::{Deserialize, Serialize};
use std::fmt;
use std::ops::Range;
// TODO: Spanned<T> -> HasSpanAndItem<T> ?
#[derive(Debug, Eq, PartialEq, Clone, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Description {
Source(Spanned<String>),
Synthetic(String),
}
impl Description {
fn from_spanned(item: Spanned<impl Into<String>>) -> Description {
Description::Source(item.map(|s| s.into()))
}
fn into_label(self) -> Result<Label<Span>, String> {
match self {
Description::Source(s) => Ok(Label::new_primary(s.span).with_message(s.item)),
Description::Synthetic(s) => Err(s),
}
}
}
impl PrettyDebug for Description {
fn pretty(&self) -> DebugDocBuilder {
match self {
Description::Source(s) => b::description(&s.item),
Description::Synthetic(s) => b::description(s),
}
}
}
/// A structured reason for a ParseError. Note that parsing in nu is more like macro expansion in
/// other languages, so the kinds of errors that can occur during parsing are more contextual than
/// you might expect.
#[derive(Debug, Clone)]
pub enum ParseErrorReason {
Eof {
expected: &'static str,
span: Span,
},
/// The parser encountered an EOF rather than what it was expecting
Eof { expected: &'static str, span: Span },
/// The parser expected to see the end of a token stream (possibly the token
/// stream from inside a delimited token node), but found something else.
ExtraTokens { actual: Spanned<String> },
/// The parser encountered something other than what it was expecting
Mismatch {
expected: &'static str,
actual: Spanned<String>,
},
/// An unexpected internal error has occurred
InternalError { message: Spanned<String> },
/// The parser tried to parse an argument for a command, but it failed for
/// some reason
ArgumentError {
command: Spanned<String>,
error: ArgumentError,
},
}
/// A newtype for `ParseErrorReason`
#[derive(Debug, Clone)]
pub struct ParseError {
reason: ParseErrorReason,
}
impl ParseError {
/// Construct a [ParseErrorReason::Eof](ParseErrorReason::Eof)
pub fn unexpected_eof(expected: &'static str, span: Span) -> ParseError {
ParseError {
reason: ParseErrorReason::Eof { expected, span },
}
}
/// Construct a [ParseErrorReason::ExtraTokens](ParseErrorReason::ExtraTokens)
pub fn extra_tokens(actual: Spanned<impl Into<String>>) -> ParseError {
let Spanned { span, item } = actual;
ParseError {
reason: ParseErrorReason::ExtraTokens {
actual: item.into().spanned(span),
},
}
}
/// Construct a [ParseErrorReason::Mismatch](ParseErrorReason::Mismatch)
pub fn mismatch(expected: &'static str, actual: Spanned<impl Into<String>>) -> ParseError {
let Spanned { span, item } = actual;
@ -77,6 +73,16 @@ impl ParseError {
}
}
/// Construct a [ParseErrorReason::InternalError](ParseErrorReason::InternalError)
pub fn internal_error(message: Spanned<impl Into<String>>) -> ParseError {
ParseError {
reason: ParseErrorReason::InternalError {
message: message.item.into().spanned(message.span),
},
}
}
/// Construct a [ParseErrorReason::ArgumentError](ParseErrorReason::ArgumentError)
pub fn argument_error(command: Spanned<impl Into<String>>, kind: ArgumentError) -> ParseError {
ParseError {
reason: ParseErrorReason::ArgumentError {
@ -87,13 +93,20 @@ impl ParseError {
}
}
/// Convert a [ParseError](ParseError) into a [ShellError](ShellError)
impl From<ParseError> for ShellError {
fn from(error: ParseError) -> ShellError {
match error.reason {
ParseErrorReason::Eof { expected, span } => ShellError::unexpected_eof(expected, span),
ParseErrorReason::ExtraTokens { actual } => ShellError::type_error("nothing", actual),
ParseErrorReason::Mismatch { actual, expected } => {
ShellError::type_error(expected, actual.clone())
ShellError::type_error(expected, actual)
}
ParseErrorReason::InternalError { message } => ShellError::labeled_error(
format!("Internal error: {}", message.item),
&message.item,
&message.span,
),
ParseErrorReason::ArgumentError { command, error } => {
ShellError::argument_error(command, error)
}
@ -101,11 +114,20 @@ impl From<ParseError> for ShellError {
}
}
/// ArgumentError describes various ways that the parser could fail because of unexpected arguments.
/// Nu commands are like a combination of functions and macros, and these errors correspond to
/// problems that could be identified during expansion based on the syntactic signature of a
/// command.
#[derive(Debug, Eq, PartialEq, Clone, Ord, Hash, PartialOrd, Serialize, Deserialize)]
pub enum ArgumentError {
/// The command specified a mandatory flag, but it was missing.
MissingMandatoryFlag(String),
/// The command specified a mandatory positional argument, but it was missing.
MissingMandatoryPositional(String),
/// A flag was found, and it should have been followed by a value, but no value was found
MissingValueForName(String),
/// A sequence of characters was found that was not syntactically valid (but would have
/// been valid if the command was an external command)
InvalidExternalWord,
}
@ -132,12 +154,16 @@ impl PrettyDebug for ArgumentError {
}
}
/// A `ShellError` is a proximate error and a possible cause, which could have its own cause,
/// creating a cause chain.
#[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Clone, Serialize, Deserialize, Hash)]
pub struct ShellError {
error: ProximateShellError,
cause: Option<Box<ProximateShellError>>,
cause: Option<Box<ShellError>>,
}
/// `PrettyDebug` is for internal debugging. For user-facing debugging, [into_diagnostic](ShellError::into_diagnostic)
/// is used, which prints an error, highlighting spans.
impl PrettyDebug for ShellError {
fn pretty(&self) -> DebugDocBuilder {
match &self.error {
@ -159,7 +185,7 @@ impl PrettyDebug for ShellError {
+ b::space()
+ b::description("actual:")
+ b::space()
+ b::option(actual.item.as_ref().map(|actual| b::description(actual))),
+ b::option(actual.item.as_ref().map(b::description)),
")",
)
}
@ -170,12 +196,12 @@ impl PrettyDebug for ShellError {
"(",
b::description("expr:")
+ b::space()
+ expr.pretty()
+ b::description(&expr.item)
+ b::description(",")
+ b::space()
+ b::description("subpath:")
+ b::space()
+ subpath.pretty(),
+ b::description(&subpath.item),
")",
)
}
@ -184,7 +210,7 @@ impl PrettyDebug for ShellError {
+ b::space()
+ b::delimit(
"(",
b::description("subpath:") + b::space() + subpath.pretty(),
b::description("subpath:") + b::space() + b::description(&subpath.item),
")",
)
}
@ -294,8 +320,8 @@ impl ShellError {
expr: Spanned<impl Into<String>>,
) -> ShellError {
ProximateShellError::MissingProperty {
subpath: Description::from_spanned(subpath),
expr: Description::from_spanned(expr),
subpath: subpath.map(|s| s.into()),
expr: expr.map(|e| e.into()),
}
.start()
}
@ -305,7 +331,7 @@ impl ShellError {
integer: impl Into<Span>,
) -> ShellError {
ProximateShellError::InvalidIntegerIndex {
subpath: Description::from_spanned(subpath),
subpath: subpath.map(|s| s.into()),
integer: integer.into(),
}
.start()
@ -318,7 +344,7 @@ impl ShellError {
.start()
}
pub(crate) fn unexpected_eof(expected: impl Into<String>, span: impl Into<Span>) -> ShellError {
pub fn unexpected_eof(expected: impl Into<String>, span: impl Into<Span>) -> ShellError {
ProximateShellError::UnexpectedEof {
expected: expected.into(),
span: span.into(),
@ -326,7 +352,7 @@ impl ShellError {
.start()
}
pub(crate) fn range_error(
pub fn range_error(
expected: impl Into<ExpectedRange>,
actual: &Spanned<impl fmt::Debug>,
operation: impl Into<String>,
@ -339,14 +365,14 @@ impl ShellError {
.start()
}
pub(crate) fn syntax_error(problem: Spanned<impl Into<String>>) -> ShellError {
pub fn syntax_error(problem: Spanned<impl Into<String>>) -> ShellError {
ProximateShellError::SyntaxError {
problem: problem.map(|p| p.into()),
}
.start()
}
pub(crate) fn coerce_error(
pub fn coerce_error(
left: Spanned<impl Into<String>>,
right: Spanned<impl Into<String>>,
) -> ShellError {
@ -357,10 +383,7 @@ impl ShellError {
.start()
}
pub(crate) fn argument_error(
command: Spanned<impl Into<String>>,
kind: ArgumentError,
) -> ShellError {
pub fn argument_error(command: Spanned<impl Into<String>>, kind: ArgumentError) -> ShellError {
ProximateShellError::ArgumentError {
command: command.map(|c| c.into()),
error: kind,
@ -368,7 +391,7 @@ impl ShellError {
.start()
}
pub(crate) fn parse_error(
pub fn parse_error(
error: nom::Err<(
nom_locate::LocatedSpanEx<&str, TracableContext>,
nom::error::ErrorKind,
@ -381,13 +404,13 @@ impl ShellError {
// TODO: Get span of EOF
let diagnostic = Diagnostic::new(
Severity::Error,
format!("Parse Error: Unexpected end of line"),
"Parse Error: Unexpected end of line".to_string(),
);
ShellError::diagnostic(diagnostic)
}
nom::Err::Failure(span) | nom::Err::Error(span) => {
let diagnostic = Diagnostic::new(Severity::Error, format!("Parse Error"))
let diagnostic = Diagnostic::new(Severity::Error, "Parse Error".to_string())
.with_label(Label::new_primary(Span::from(span.0)));
ShellError::diagnostic(diagnostic)
@ -395,11 +418,11 @@ impl ShellError {
}
}
pub(crate) fn diagnostic(diagnostic: Diagnostic<Span>) -> ShellError {
pub fn diagnostic(diagnostic: Diagnostic<Span>) -> ShellError {
ProximateShellError::Diagnostic(ShellDiagnostic { diagnostic }).start()
}
pub(crate) fn to_diagnostic(self) -> Diagnostic<Span> {
pub fn into_diagnostic(self) -> Diagnostic<Span> {
match self.error {
ProximateShellError::MissingValue { span, reason } => {
let mut d = Diagnostic::new(
@ -419,7 +442,7 @@ impl ShellError {
} => match error {
ArgumentError::InvalidExternalWord => Diagnostic::new(
Severity::Error,
format!("Invalid bare word for Nu command (did you intend to invoke an external command?)"))
"Invalid bare word for Nu command (did you intend to invoke an external command?)".to_string())
.with_label(Label::new_primary(command.span)),
ArgumentError::MissingMandatoryFlag(name) => Diagnostic::new(
Severity::Error,
@ -476,7 +499,7 @@ impl ShellError {
ProximateShellError::UnexpectedEof {
expected, span
} => Diagnostic::new(Severity::Error, format!("Unexpected end of input"))
} => Diagnostic::new(Severity::Error, "Unexpected end of input".to_string())
.with_label(Label::new_primary(span).with_message(format!("Expected {}", expected))),
ProximateShellError::RangeError {
@ -491,7 +514,7 @@ impl ShellError {
Label::new_primary(span).with_message(format!(
"Expected to convert {} to {} while {}, but it was out of range",
item,
kind.desc(),
kind.display(),
operation
)),
),
@ -506,31 +529,33 @@ impl ShellError {
.with_label(Label::new_primary(span).with_message(item)),
ProximateShellError::MissingProperty { subpath, expr, .. } => {
let subpath = subpath.into_label();
let expr = expr.into_label();
let mut diag = Diagnostic::new(Severity::Error, "Missing property");
match subpath {
Ok(label) => diag = diag.with_label(label),
Err(ty) => diag.message = format!("Missing property (for {})", ty),
}
if subpath.span == Span::unknown() {
diag.message = format!("Missing property (for {})", subpath.item);
} else {
let subpath = Label::new_primary(subpath.span).with_message(subpath.item);
diag = diag.with_label(subpath);
if expr.span != Span::unknown() {
let expr = Label::new_primary(expr.span).with_message(expr.item);
diag = diag.with_label(expr)
}
if let Ok(label) = expr {
diag = diag.with_label(label);
}
diag
}
ProximateShellError::InvalidIntegerIndex { subpath,integer } => {
let subpath = subpath.into_label();
let mut diag = Diagnostic::new(Severity::Error, "Invalid integer property");
match subpath {
Ok(label) => diag = diag.with_label(label),
Err(ty) => diag.message = format!("Invalid integer property (for {})", ty)
if subpath.span == Span::unknown() {
diag.message = format!("Invalid integer property (for {})", subpath.item)
} else {
let label = Label::new_primary(subpath.span).with_message(subpath.item);
diag = diag.with_label(label)
}
diag = diag.with_label(Label::new_secondary(integer).with_message("integer"));
@ -579,23 +604,19 @@ impl ShellError {
)
}
// pub fn string(title: impl Into<String>) -> ShellError {
// ProximateShellError::String(StringError::new(title.into(), String::new())).start()
// }
//
// pub(crate) fn unreachable(title: impl Into<String>) -> ShellError {
// ShellError::untagged_runtime_error(&format!("BUG: Unreachable: {}", title.into()))
// }
pub(crate) fn unimplemented(title: impl Into<String>) -> ShellError {
pub fn unimplemented(title: impl Into<String>) -> ShellError {
ShellError::untagged_runtime_error(&format!("Unimplemented: {}", title.into()))
}
pub(crate) fn unexpected(title: impl Into<String>) -> ShellError {
pub fn unexpected(title: impl Into<String>) -> ShellError {
ShellError::untagged_runtime_error(&format!("Unexpected: {}", title.into()))
}
}
/// `ExpectedRange` describes a range of values that was expected by a command. In addition
/// to typical ranges, this enum allows an error to specify that the range of allowed values
/// corresponds to a particular numeric type (which is a dominant use-case for the
/// [RangeError](ProximateShellError::RangeError) error type).
#[derive(Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Clone, Serialize, Deserialize)]
pub enum ExpectedRange {
I8,
@ -617,6 +638,7 @@ pub enum ExpectedRange {
Range { start: usize, end: usize },
}
/// Convert a Rust range into an [ExpectedRange](ExpectedRange).
impl From<Range<usize>> for ExpectedRange {
fn from(range: Range<usize>) -> Self {
ExpectedRange::Range {
@ -628,13 +650,7 @@ impl From<Range<usize>> for ExpectedRange {
impl PrettyDebug for ExpectedRange {
fn pretty(&self) -> DebugDocBuilder {
b::description(self.desc())
}
}
impl ExpectedRange {
fn desc(&self) -> String {
match self {
b::description(match self {
ExpectedRange::I8 => "an 8-bit signed integer",
ExpectedRange::I16 => "a 16-bit signed integer",
ExpectedRange::I32 => "a 32-bit signed integer",
@ -651,9 +667,10 @@ impl ExpectedRange {
ExpectedRange::Size => "a list offset",
ExpectedRange::BigDecimal => "a decimal",
ExpectedRange::BigInt => "an integer",
ExpectedRange::Range { start, end } => return format!("{} to {}", start, end),
}
.to_string()
ExpectedRange::Range { start, end } => {
return b::description(format!("{} to {}", start, end))
}
})
}
}
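For orientation, a minimal sketch of how a caller combines these pieces now that `range_error` is `pub`: a plain Rust range converts through the `From<Range<usize>>` impl above, and `.spanned(..)` comes from nu-source's `SpannedItem`. The spans, the `42i64`, and the operation string are invented for illustration.
use nu_errors::ShellError;
use nu_source::{Span, SpannedItem};
// `0usize..10` becomes ExpectedRange::Range { start: 0, end: 10 } via the
// From<Range<usize>> impl; the spanned actual value and the operation string
// fill in the rest of the diagnostic.
let err = ShellError::range_error(
    0usize..10,
    &42i64.spanned(Span::new(10, 12)),
    "indexing into a table",
);
// err.into_diagnostic() then renders it through language-reporting.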
@ -671,11 +688,11 @@ pub enum ProximateShellError {
actual: Spanned<Option<String>>,
},
MissingProperty {
subpath: Description,
expr: Description,
subpath: Spanned<String>,
expr: Spanned<String>,
},
InvalidIntegerIndex {
subpath: Description,
subpath: Spanned<String>,
integer: Span,
},
MissingValue {


@ -0,0 +1,13 @@
[package]
name = "nu-macros"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core macros for building Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.8.0" }


@ -0,0 +1,25 @@
#[macro_export]
macro_rules! signature {
(def $name:tt {
$usage:tt
$(
$positional_name:tt $positional_ty:tt - $positional_desc:tt
)*
}) => {{
let signature = Signature::new(stringify!($name)).desc($usage);
$(
$crate::positional! { signature, $positional_name $positional_ty - $positional_desc }
)*
signature
}};
}
#[macro_export]
macro_rules! positional {
($ident:tt, $name:tt (optional $shape:tt) - $desc:tt) => {
let $ident = $ident.optional(stringify!($name), SyntaxShape::$shape, $desc);
};
($ident:tt, $name:tt ($shape:tt)- $desc:tt) => {
let $ident = $ident.required(stringify!($name), SyntaxShape::$shape, $desc);
};
}
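A rough sketch of the intended call site for these macros (hypothetical; it assumes `Signature` and `SyntaxShape` from nu-protocol are in scope, since `positional!` names them unqualified, and that `SyntaxShape` has a `Path` variant):
use nu_macros::signature;
use nu_protocol::{Signature, SyntaxShape};
// Expands to `Signature::new("cd").desc("Change to a new path.")`, followed by
// one `.optional(..)` call per positional entry via the `positional!` macro.
let sig: Signature = signature! {
    def cd {
        "Change to a new path."
        directory (optional Path) - "the directory to change to"
    }
};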


@ -0,0 +1,48 @@
[package]
name = "nu-parser"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core parser used in Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-errors = { path = "../nu-errors", version = "0.8.0" }
nu-source = { path = "../nu-source", version = "0.8.0" }
nu-protocol = { path = "../nu-protocol", version = "0.8.0" }
pretty_env_logger = "0.3.1"
pretty = "0.5.2"
termcolor = "1.0.5"
log = "0.4.8"
indexmap = { version = "1.3.0", features = ["serde-1"] }
serde = { version = "1.0.102", features = ["derive"] }
nom = "5.0.1"
nom_locate = "1.0.0"
nom-tracable = "0.4.1"
num-traits = "0.2.8"
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
derive-new = "0.5.8"
getset = "0.0.9"
cfg-if = "0.1"
itertools = "0.8.1"
shellexpand = "1.0.0"
ansi_term = "0.12.1"
ptree = {version = "0.2" }
language-reporting = "0.4.0"
unicode-xid = "0.2.0"
enumflags2 = "0.6.2"
[dev-dependencies]
pretty_assertions = "0.6.1"
[build-dependencies]
nu-build = { version = "0.8.0", path = "../nu-build" }
[features]
stable = []
trace = ["nom-tracable/trace"]


@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}


@ -0,0 +1,35 @@
pub mod classified;
use crate::commands::classified::external::{ExternalArg, ExternalArgs, ExternalCommand};
use crate::commands::classified::ClassifiedCommand;
use crate::hir::expand_external_tokens::ExternalTokensShape;
use crate::hir::syntax_shape::{expand_syntax, ExpandContext};
use crate::hir::tokens_iterator::TokensIterator;
use nu_errors::ParseError;
use nu_source::{Spanned, Tagged};
// Classify this command as an external command, which doesn't give special meaning
// to nu syntactic constructs, and passes all arguments to the external command as
// strings.
pub(crate) fn external_command(
tokens: &mut TokensIterator,
context: &ExpandContext,
name: Tagged<&str>,
) -> Result<ClassifiedCommand, ParseError> {
let Spanned { item, span } = expand_syntax(&ExternalTokensShape, tokens, context)?.tokens;
Ok(ClassifiedCommand::External(ExternalCommand {
name: name.to_string(),
name_tag: name.tag(),
args: ExternalArgs {
list: item
.iter()
.map(|x| ExternalArg {
tag: x.span.into(),
arg: x.item.clone(),
})
.collect(),
span,
},
}))
}
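For a concrete picture of the result, this is roughly the value the classifier builds for an invocation like `^ls -la`, written out by hand with invented spans (real spans point into the source buffer; the types are the ones imported at the top of this file):
use nu_source::Span;
// The external command keeps its name plus each argument as a plain string,
// each tagged with where it appeared in the source.
let classified = ClassifiedCommand::External(ExternalCommand {
    name: "ls".to_string(),
    name_tag: Span::new(1, 3).into(),
    args: ExternalArgs {
        list: vec![ExternalArg {
            arg: "-la".to_string(),
            tag: Span::new(4, 7).into(),
        }],
        span: Span::new(4, 7),
    },
});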


@ -0,0 +1,92 @@
pub mod external;
pub mod internal;
use crate::commands::classified::external::ExternalCommand;
use crate::commands::classified::internal::InternalCommand;
use crate::hir;
use crate::parse::token_tree::TokenNode;
use derive_new::new;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span};
#[derive(Debug, Clone, Eq, PartialEq)]
pub enum ClassifiedCommand {
#[allow(unused)]
Expr(TokenNode),
#[allow(unused)]
Dynamic(hir::Call),
Internal(InternalCommand),
External(ExternalCommand),
}
impl PrettyDebugWithSource for ClassifiedCommand {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
ClassifiedCommand::Expr(token) => b::typed("command", token.pretty_debug(source)),
ClassifiedCommand::Dynamic(call) => b::typed("command", call.pretty_debug(source)),
ClassifiedCommand::Internal(internal) => internal.pretty_debug(source),
ClassifiedCommand::External(external) => external.pretty_debug(source),
}
}
}
impl HasSpan for ClassifiedCommand {
fn span(&self) -> Span {
match self {
ClassifiedCommand::Expr(node) => node.span(),
ClassifiedCommand::Internal(command) => command.span(),
ClassifiedCommand::Dynamic(call) => call.span,
ClassifiedCommand::External(command) => command.span(),
}
}
}
#[derive(new, Debug, Eq, PartialEq)]
pub(crate) struct DynamicCommand {
pub(crate) args: hir::Call,
}
#[derive(Debug, Clone)]
pub struct Commands {
pub list: Vec<ClassifiedCommand>,
pub span: Span,
}
impl std::ops::Deref for Commands {
type Target = [ClassifiedCommand];
fn deref(&self) -> &Self::Target {
&self.list
}
}
#[derive(Debug, Clone)]
pub struct ClassifiedPipeline {
pub commands: Commands,
}
impl ClassifiedPipeline {
pub fn commands(list: Vec<ClassifiedCommand>, span: impl Into<Span>) -> ClassifiedPipeline {
ClassifiedPipeline {
commands: Commands {
list,
span: span.into(),
},
}
}
}
impl HasSpan for ClassifiedPipeline {
fn span(&self) -> Span {
self.commands.span
}
}
impl PrettyDebugWithSource for ClassifiedPipeline {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::intersperse(
self.commands.iter().map(|c| c.pretty_debug(source)),
b::operator(" | "),
)
.or(b::delimit("<", b::description("empty pipeline"), ">"))
}
}


@ -0,0 +1,65 @@
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebug, Span, Tag};
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalArg {
pub arg: String,
pub tag: Tag,
}
impl std::ops::Deref for ExternalArg {
type Target = str;
fn deref(&self) -> &str {
&self.arg
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalArgs {
pub list: Vec<ExternalArg>,
pub span: Span,
}
impl ExternalArgs {
pub fn iter(&self) -> impl Iterator<Item = &ExternalArg> {
self.list.iter()
}
}
impl std::ops::Deref for ExternalArgs {
type Target = [ExternalArg];
fn deref(&self) -> &[ExternalArg] {
&self.list
}
}
#[derive(Debug, Clone, Eq, PartialEq)]
pub struct ExternalCommand {
pub name: String,
pub name_tag: Tag,
pub args: ExternalArgs,
}
impl PrettyDebug for ExternalCommand {
fn pretty(&self) -> DebugDocBuilder {
b::typed(
"external command",
b::description(&self.name)
+ b::preceded(
b::space(),
b::intersperse(
self.args.iter().map(|a| b::primitive(a.arg.to_string())),
b::space(),
),
),
)
}
}
impl HasSpan for ExternalCommand {
fn span(&self) -> Span {
self.name_tag.span.until(self.args.span)
}
}


@ -0,0 +1,28 @@
use crate::hir;
use derive_new::new;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Tag};
#[derive(new, Debug, Clone, Eq, PartialEq)]
pub struct InternalCommand {
pub name: String,
pub name_tag: Tag,
pub args: hir::Call,
}
impl PrettyDebugWithSource for InternalCommand {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"internal command",
b::description(&self.name) + b::space() + self.args.pretty_debug(source),
)
}
}
impl HasSpan for InternalCommand {
fn span(&self) -> Span {
let start = self.name_tag.span;
start.until(self.args.span)
}
}


@ -4,30 +4,61 @@ pub(crate) mod expand_external_tokens;
pub(crate) mod external_command;
pub(crate) mod named;
pub(crate) mod path;
pub(crate) mod syntax_shape;
pub(crate) mod range;
pub(crate) mod signature;
pub mod syntax_shape;
pub(crate) mod tokens_iterator;
use crate::parser::hir::path::PathMember;
use crate::parser::hir::syntax_shape::Member;
use crate::parser::{registry, Operator, Unit};
use crate::prelude::*;
use crate::hir::syntax_shape::Member;
use crate::parse::operator::CompareOperator;
use crate::parse::parser::Number;
use crate::parse::unit::Unit;
use derive_new::new;
use getset::Getters;
use nu_source::Spanned;
use nu_protocol::{PathMember, ShellTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
};
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use crate::evaluate::Scope;
use crate::parser::parse::tokens::RawNumber;
use crate::parse::tokens::RawNumber;
pub(crate) use self::binary::Binary;
pub(crate) use self::external_command::ExternalCommand;
pub(crate) use self::named::NamedArguments;
pub(crate) use self::path::Path;
pub(crate) use self::range::Range;
pub(crate) use self::syntax_shape::ExpandContext;
pub(crate) use self::tokens_iterator::TokensIterator;
pub use self::syntax_shape::SyntaxShape;
pub use self::external_command::ExternalCommand;
pub use self::named::{NamedArguments, NamedValue};
#[derive(Debug, Clone)]
pub struct Signature {
unspanned: nu_protocol::Signature,
span: Span,
}
impl Signature {
pub fn new(unspanned: nu_protocol::Signature, span: impl Into<Span>) -> Signature {
Signature {
unspanned,
span: span.into(),
}
}
}
impl HasSpan for Signature {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for Signature {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.unspanned.pretty_debug(source)
}
}
#[derive(Debug, Clone, Eq, PartialEq, Getters, Serialize, Deserialize, new)]
pub struct Call {
@ -60,17 +91,6 @@ impl PrettyDebugWithSource for Call {
}
}
impl Call {
pub fn evaluate(
&self,
registry: &registry::CommandRegistry,
scope: &Scope,
source: &Text,
) -> Result<registry::EvaluatedArgs, ShellError> {
registry::evaluate_args(self, registry, scope, source)
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum RawExpression {
Literal(Literal),
@ -78,6 +98,7 @@ pub enum RawExpression {
Synthetic(Synthetic),
Variable(Variable),
Binary(Box<Binary>),
Range(Box<Range>),
Block(Vec<Expression>),
List(Vec<Expression>),
Path(Box<Path>),
@ -100,6 +121,7 @@ impl ShellTypeName for RawExpression {
RawExpression::Variable(..) => "variable",
RawExpression::List(..) => "list",
RawExpression::Binary(..) => "binary",
RawExpression::Range(..) => "range",
RawExpression::Block(..) => "block",
RawExpression::Path(..) => "variable path",
RawExpression::Boolean(..) => "boolean",
@ -169,6 +191,7 @@ impl PrettyDebugWithSource for Expression {
},
RawExpression::Variable(_) => b::keyword(self.span.slice(source)),
RawExpression::Binary(binary) => binary.pretty_debug(source),
RawExpression::Range(range) => range.pretty_debug(source),
RawExpression::Block(_) => b::opaque("block"),
RawExpression::List(list) => b::delimit(
"[",
@ -196,41 +219,37 @@ impl PrettyDebugWithSource for Expression {
}
impl Expression {
pub(crate) fn number(i: impl Into<Number>, span: impl Into<Span>) -> Expression {
pub fn number(i: impl Into<Number>, span: impl Into<Span>) -> Expression {
let span = span.into();
RawExpression::Literal(RawLiteral::Number(i.into()).into_literal(span)).into_expr(span)
}
pub(crate) fn size(
i: impl Into<Number>,
unit: impl Into<Unit>,
span: impl Into<Span>,
) -> Expression {
pub fn size(i: impl Into<Number>, unit: impl Into<Unit>, span: impl Into<Span>) -> Expression {
let span = span.into();
RawExpression::Literal(RawLiteral::Size(i.into(), unit.into()).into_literal(span))
.into_expr(span)
}
pub(crate) fn synthetic_string(s: impl Into<String>) -> Expression {
pub fn synthetic_string(s: impl Into<String>) -> Expression {
RawExpression::Synthetic(Synthetic::String(s.into())).into_unspanned_expr()
}
pub(crate) fn string(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
pub fn string(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
let outer = outer.into();
RawExpression::Literal(RawLiteral::String(inner.into()).into_literal(outer))
.into_expr(outer)
}
pub(crate) fn column_path(members: Vec<Member>, span: impl Into<Span>) -> Expression {
pub fn column_path(members: Vec<Member>, span: impl Into<Span>) -> Expression {
let span = span.into();
RawExpression::Literal(RawLiteral::ColumnPath(members).into_literal(span)).into_expr(span)
}
pub(crate) fn path(
pub fn path(
head: Expression,
tail: Vec<impl Into<PathMember>>,
span: impl Into<Span>,
@ -239,7 +258,7 @@ impl Expression {
RawExpression::Path(Box::new(Path::new(head, tail))).into_expr(span.into())
}
pub(crate) fn dot_member(head: Expression, next: impl Into<PathMember>) -> Expression {
pub fn dot_member(head: Expression, next: impl Into<PathMember>) -> Expression {
let Expression { expr: item, span } = head;
let next = next.into();
@ -257,9 +276,9 @@ impl Expression {
}
}
pub(crate) fn infix(
pub fn infix(
left: Expression,
op: Spanned<impl Into<Operator>>,
op: Spanned<impl Into<CompareOperator>>,
right: Expression,
) -> Expression {
let new_span = left.span.until(right.span);
@ -268,36 +287,42 @@ impl Expression {
.into_expr(new_span)
}
pub(crate) fn file_path(path: impl Into<PathBuf>, outer: impl Into<Span>) -> Expression {
pub fn range(left: Expression, op: Span, right: Expression) -> Expression {
let new_span = left.span.until(right.span);
RawExpression::Range(Box::new(Range::new(left, op, right))).into_expr(new_span)
}
pub fn file_path(path: impl Into<PathBuf>, outer: impl Into<Span>) -> Expression {
RawExpression::FilePath(path.into()).into_expr(outer)
}
pub(crate) fn list(list: Vec<Expression>, span: impl Into<Span>) -> Expression {
pub fn list(list: Vec<Expression>, span: impl Into<Span>) -> Expression {
RawExpression::List(list).into_expr(span)
}
pub(crate) fn bare(span: impl Into<Span>) -> Expression {
pub fn bare(span: impl Into<Span>) -> Expression {
let span = span.into();
RawExpression::Literal(RawLiteral::Bare.into_literal(span)).into_expr(span)
}
pub(crate) fn pattern(inner: impl Into<String>, outer: impl Into<Span>) -> Expression {
pub fn pattern(inner: impl Into<String>, outer: impl Into<Span>) -> Expression {
let outer = outer.into();
RawExpression::Literal(RawLiteral::GlobPattern(inner.into()).into_literal(outer))
.into_expr(outer)
}
pub(crate) fn variable(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
pub fn variable(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
RawExpression::Variable(Variable::Other(inner.into())).into_expr(outer)
}
pub(crate) fn external_command(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
pub fn external_command(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
RawExpression::ExternalCommand(ExternalCommand::new(inner.into())).into_expr(outer)
}
pub(crate) fn it_variable(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
pub fn it_variable(inner: impl Into<Span>, outer: impl Into<Span>) -> Expression {
RawExpression::Variable(Variable::It(inner.into())).into_expr(outer)
}
}
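A small sketch of how these now-public constructors compose with the new `range` builder; the spans are invented (imagine the source text `1..5`), and bare expressions stand in for the operands purely to keep the example span-only:
use nu_source::Span;
// Two operand expressions plus the span of the `..` operator itself;
// `range` stitches them into one expression covering the whole construct.
let left = Expression::bare(Span::new(0, 1));
let right = Expression::bare(Span::new(3, 4));
let range = Expression::range(left, Span::new(1, 3), right);
// range.span now covers bytes 0..4, i.e. all of `1..5`.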


@ -0,0 +1,2 @@
#[cfg(test)]
pub mod tests;


@ -1,13 +1,13 @@
use crate::commands::classified::InternalCommand;
use crate::commands::ClassifiedCommand;
use crate::env::host::BasicHost;
use crate::parser::hir::TokensIterator;
use crate::parser::hir::{
self, named::NamedValue, path::PathMember, syntax_shape::*, NamedArguments,
};
use crate::parser::parse::token_tree_builder::{CurriedToken, TokenTreeBuilder as b};
use crate::parser::TokenNode;
use crate::commands::classified::{internal::InternalCommand, ClassifiedCommand};
use crate::hir::TokensIterator;
use crate::hir::{self, named::NamedValue, syntax_shape::*, NamedArguments};
use crate::parse::files::Files;
use crate::parse::token_tree_builder::{CurriedToken, TokenTreeBuilder as b};
use crate::TokenNode;
use derive_new::new;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{PathMember, Signature, SyntaxShape};
use nu_source::{HasSpan, Span, Tag, Text};
use pretty_assertions::assert_eq;
use std::fmt::Debug;
@ -23,7 +23,7 @@ fn test_parse_string() {
fn test_parse_path() {
parse_tokens(
VariablePathShape,
vec![b::var("it"), b::op("."), b::bare("cpu")],
vec![b::var("it"), b::dot(), b::bare("cpu")],
|tokens| {
let (outer_var, inner_var) = tokens[0].expect_var();
let bare = tokens[2].expect_bare();
@ -39,9 +39,9 @@ fn test_parse_path() {
VariablePathShape,
vec![
b::var("cpu"),
b::op("."),
b::dot(),
b::bare("amount"),
b::op("."),
b::dot(),
b::string("max ghz"),
],
|tokens| {
@ -90,6 +90,43 @@ fn test_parse_command() {
);
}
#[derive(new)]
struct TestRegistry {
#[new(default)]
signatures: indexmap::IndexMap<String, Signature>,
}
impl TestRegistry {
fn insert(&mut self, key: &str, value: Signature) {
self.signatures.insert(key.to_string(), value);
}
}
impl SignatureRegistry for TestRegistry {
fn has(&self, name: &str) -> Result<bool, ShellError> {
Ok(self.signatures.contains_key(name))
}
fn get(&self, name: &str) -> Result<Option<Signature>, ShellError> {
Ok(self.signatures.get(name).cloned())
}
}
fn with_empty_context(source: &Text, callback: impl FnOnce(ExpandContext)) {
let mut registry = TestRegistry::new();
registry.insert(
"ls",
Signature::build("ls")
.optional(
"path",
SyntaxShape::Pattern,
"a path to get the directory contents from",
)
.switch("full", "list all available columns for each entry"),
);
callback(ExpandContext::new(Box::new(registry), source, None))
}
fn parse_tokens<T: Eq + HasSpan + Clone + Debug + 'static>(
shape: impl ExpandSyntax<Output = T>,
tokens: Vec<CurriedToken>,
@ -99,7 +136,7 @@ fn parse_tokens<T: Eq + HasSpan + Clone + Debug + 'static>(
let (tokens, source) = b::build(tokens);
let text = Text::from(source);
ExpandContext::with_empty(&text, |context| {
with_empty_context(&text, |context| {
let tokens = tokens.expect_list();
let mut iterator = TokensIterator::all(tokens.item, text.clone(), tokens.span);
@ -108,7 +145,7 @@ fn parse_tokens<T: Eq + HasSpan + Clone + Debug + 'static>(
let expr = match expr {
Ok(expr) => expr,
Err(err) => {
crate::cli::print_err(err.into(), &BasicHost, context.source().clone());
print_err(err.into(), &context.source().clone());
panic!("Parse failed");
}
};
@ -120,3 +157,18 @@ fn parse_tokens<T: Eq + HasSpan + Clone + Debug + 'static>(
fn inner_string_span(span: Span) -> Span {
Span::new(span.start() + 1, span.end() - 1)
}
pub fn print_err(err: ShellError, source: &Text) {
let diag = err.into_diagnostic();
let writer = termcolor::StandardStream::stderr(termcolor::ColorChoice::Auto);
let mut source = source.to_string();
source.push_str(" ");
let files = Files::new(source);
let _ = language_reporting::emit(
&mut writer.lock(),
&files,
&diag,
&language_reporting::DefaultConfig,
);
}
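A hypothetical call, purely to show the plumbing: any `ShellError` can be rendered this way once the source text it refers to is at hand.
use nu_errors::ShellError;
use nu_source::Text;
// Render an artificial error against the source it came from.
let source = Text::from("ls".to_string());
print_err(ShellError::unimplemented("demo"), &source);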


@ -1,18 +1,17 @@
use crate::parser::{hir::Expression, Operator};
use crate::prelude::*;
use crate::{hir::Expression, CompareOperator};
use derive_new::new;
use getset::Getters;
use nu_source::Spanned;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Spanned};
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
#[get = "pub(crate)"]
#[get = "pub"]
pub struct Binary {
left: Expression,
op: Spanned<Operator>,
op: Spanned<CompareOperator>,
right: Expression,
}


@ -1,7 +1,4 @@
use crate::errors::ParseError;
#[cfg(not(coloring_in_tokens))]
use crate::parser::hir::syntax_shape::FlatShape;
use crate::parser::{
use crate::{
hir::syntax_shape::{
color_syntax, expand_atom, expand_expr, expand_syntax, AtomicToken, ColorSyntax,
ExpandContext, ExpandExpression, ExpandSyntax, ExpansionRule, MaybeSpaceShape,
@ -10,6 +7,8 @@ use crate::parser::{
hir::Expression,
TokensIterator,
};
use nu_errors::ParseError;
use nu_protocol::SpannedTypeName;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebug, Span, Spanned, SpannedItem};
#[derive(Debug, Clone)]
@ -68,33 +67,6 @@ impl ExpandSyntax for ExternalTokensShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for ExternalTokensShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info {
loop {
// Allow a space
color_syntax(&MaybeSpaceShape, token_nodes, context, shapes);
// Process an external expression. External expressions are mostly words, with a
// few exceptions (like $variables and path expansion rules)
match color_syntax(&ExternalExpression, token_nodes, context, shapes).1 {
ExternalExpressionResult::Eof => break,
ExternalExpressionResult::Processed => continue,
}
}
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for ExternalTokensShape {
type Info = ();
type Input = ();
@ -115,7 +87,7 @@ impl ColorSyntax for ExternalTokensShape {
// Process an external expression. External expressions are mostly words, with a
// few exceptions (like $variables and path expansion rules)
match color_syntax(&ExternalExpression, token_nodes, context).1 {
match color_syntax(&ExternalExpressionShape, token_nodes, context).1 {
ExternalExpressionResult::Eof => break,
ExternalExpressionResult::Processed => continue,
}
@ -144,7 +116,7 @@ impl ExpandSyntax for ExternalExpressionShape {
token_nodes,
"external command",
context,
ExpansionRule::new().allow_external_command(),
ExpansionRule::new().allow_external_word(),
)?
.span;
@ -164,40 +136,6 @@ impl ExpandSyntax for ExternalExpressionShape {
}
}
#[derive(Debug, Copy, Clone)]
struct ExternalExpression;
impl ExpandSyntax for ExternalExpression {
type Output = Option<Span>;
fn name(&self) -> &'static str {
"external expression"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
expand_syntax(&MaybeSpaceShape, token_nodes, context)?;
let first = expand_syntax(&ExternalHeadShape, token_nodes, context)?.span;
let mut last = first;
loop {
let continuation = expand_syntax(&ExternalContinuationShape, token_nodes, context);
if let Ok(continuation) = continuation {
last = continuation.span;
} else {
break;
}
}
Ok(Some(first.until(last)))
}
}
#[derive(Debug, Copy, Clone)]
struct ExternalHeadShape;
@ -229,11 +167,18 @@ impl ExpandExpression for ExternalHeadShape {
UnspannedAtomicToken::Whitespace { .. } => {
unreachable!("ExpansionRule doesn't allow Whitespace")
}
UnspannedAtomicToken::Separator { .. } => {
unreachable!("ExpansionRule doesn't allow Separator")
}
UnspannedAtomicToken::Comment { .. } => {
unreachable!("ExpansionRule doesn't allow Comment")
}
UnspannedAtomicToken::ShorthandFlag { .. }
| UnspannedAtomicToken::SquareDelimited { .. } => {
| UnspannedAtomicToken::SquareDelimited { .. }
| UnspannedAtomicToken::RoundDelimited { .. } => {
return Err(ParseError::mismatch(
"external command name",
"pipeline".spanned(atom.span),
atom.spanned_type_name(),
))
}
UnspannedAtomicToken::ExternalCommand { command } => {
@ -249,7 +194,10 @@ impl ExpandExpression for ExternalHeadShape {
| UnspannedAtomicToken::GlobPattern { .. }
| UnspannedAtomicToken::Word { .. }
| UnspannedAtomicToken::Dot { .. }
| UnspannedAtomicToken::Operator { .. } => Expression::external_command(span, span),
| UnspannedAtomicToken::DotDot { .. }
| UnspannedAtomicToken::CompareOperator { .. } => {
Expression::external_command(span, span)
}
})
}
}
@ -291,6 +239,12 @@ impl ExpandExpression for ExternalContinuationShape {
UnspannedAtomicToken::Whitespace { .. } => {
unreachable!("ExpansionRule doesn't allow Whitespace")
}
UnspannedAtomicToken::Separator { .. } => {
unreachable!("ExpansionRule doesn't allow Separator")
}
UnspannedAtomicToken::Comment { .. } => {
unreachable!("ExpansionRule doesn't allow Comment")
}
UnspannedAtomicToken::String { body } => Expression::string(*body, span),
UnspannedAtomicToken::ItVariable { name } => Expression::it_variable(*name, span),
UnspannedAtomicToken::Variable { name } => Expression::variable(*name, span),
@ -299,24 +253,25 @@ impl ExpandExpression for ExternalContinuationShape {
| UnspannedAtomicToken::Word { .. }
| UnspannedAtomicToken::ShorthandFlag { .. }
| UnspannedAtomicToken::Dot { .. }
| UnspannedAtomicToken::Operator { .. } => Expression::bare(span),
UnspannedAtomicToken::SquareDelimited { .. } => {
| UnspannedAtomicToken::DotDot { .. }
| UnspannedAtomicToken::CompareOperator { .. } => Expression::bare(span),
UnspannedAtomicToken::SquareDelimited { .. }
| UnspannedAtomicToken::RoundDelimited { .. } => {
return Err(ParseError::mismatch(
"external argument",
"pipeline".spanned(atom.span),
atom.spanned_type_name(),
))
}
})
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for ExternalExpression {
impl ColorSyntax for ExternalExpressionShape {
type Info = ExternalExpressionResult;
type Input = ();
fn name(&self) -> &'static str {
"ExternalExpression"
"ExternalExpressionShape"
}
fn color_syntax<'a, 'b>(
@ -340,43 +295,12 @@ impl ColorSyntax for ExternalExpression {
};
token_nodes.mutate_shapes(|shapes| atom.color_tokens(shapes));
return ExternalExpressionResult::Processed;
ExternalExpressionResult::Processed
}
}
#[must_use]
enum ExternalExpressionResult {
pub enum ExternalExpressionResult {
Eof,
Processed,
}
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for ExternalExpression {
type Info = ExternalExpressionResult;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> ExternalExpressionResult {
let atom = match expand_atom(
token_nodes,
"external word",
context,
ExpansionRule::permissive(),
) {
Err(_) => unreachable!("TODO: separate infallible expand_atom"),
Ok(AtomicToken {
unspanned: UnspannedAtomicToken::Eof { .. },
..
}) => return ExternalExpressionResult::Eof,
Ok(atom) => atom,
};
atom.color_tokens(shapes);
return ExternalExpressionResult::Processed;
}
}


@ -1,12 +1,12 @@
use crate::prelude::*;
use derive_new::new;
use getset::Getters;
use nu_source::Span;
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
#[get = "pub(crate)"]
#[get = "pub"]
pub struct ExternalCommand {
pub(crate) name: Span,
}


@ -1,8 +1,8 @@
use crate::parser::hir::Expression;
use crate::parser::Flag;
use crate::prelude::*;
use crate::hir::Expression;
use crate::Flag;
use indexmap::IndexMap;
use log::trace;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Tag};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Eq, PartialEq, Serialize, Deserialize)]
@ -24,16 +24,18 @@ impl PrettyDebugWithSource for NamedValue {
}
}
#[derive(Debug, Clone, Eq, PartialEq, Serialize, Deserialize)]
#[derive(Debug, Default, Clone, Eq, PartialEq, Serialize, Deserialize)]
pub struct NamedArguments {
pub(crate) named: IndexMap<String, NamedValue>,
pub named: IndexMap<String, NamedValue>,
}
impl NamedArguments {
pub fn new() -> NamedArguments {
NamedArguments {
named: IndexMap::new(),
}
Default::default()
}
pub fn iter(&self) -> impl Iterator<Item = (&String, &NamedValue)> {
self.named.iter()
}
}
@ -43,7 +45,7 @@ impl NamedArguments {
trace!("Inserting switch -- {} = {:?}", name, switch);
match switch {
None => self.named.insert(name.into(), NamedValue::AbsentSwitch),
None => self.named.insert(name, NamedValue::AbsentSwitch),
Some(flag) => self.named.insert(
name,
NamedValue::PresentSwitch(Tag {


@ -0,0 +1,41 @@
use crate::hir::Expression;
use derive_new::new;
use getset::{Getters, MutGetters};
use nu_protocol::PathMember;
use nu_source::{b, DebugDocBuilder, PrettyDebug, PrettyDebugWithSource};
use serde::{Deserialize, Serialize};
#[derive(
Debug,
Clone,
Eq,
PartialEq,
Ord,
PartialOrd,
Hash,
Getters,
MutGetters,
Serialize,
Deserialize,
new,
)]
#[get = "pub"]
pub struct Path {
head: Expression,
#[get_mut = "pub(crate)"]
tail: Vec<PathMember>,
}
impl PrettyDebugWithSource for Path {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.head.pretty_debug(source)
+ b::operator(".")
+ b::intersperse(self.tail.iter().map(|m| m.pretty()), b::operator("."))
}
}
impl Path {
pub(crate) fn parts(self) -> (Expression, Vec<PathMember>) {
(self.head, self.tail)
}
}

View File

@ -0,0 +1,33 @@
use crate::hir::Expression;
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Span};
use serde::{Deserialize, Serialize};
#[derive(
Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Getters, Serialize, Deserialize, new,
)]
pub struct Range {
#[get = "pub"]
left: Expression,
#[get = "pub"]
dotdot: Span,
#[get = "pub"]
right: Expression,
}
impl PrettyDebugWithSource for Range {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::delimit(
"<",
self.left.pretty_debug(source)
+ b::space()
+ b::keyword(self.dotdot.slice(source))
+ b::space()
+ self.right.pretty_debug(source),
">",
)
.group()
}
}


@ -0,0 +1,475 @@
use crate::hir;
use crate::hir::syntax_shape::{
expand_atom, expand_syntax, BareShape, ExpandContext, ExpandSyntax, ExpansionRule,
UnspannedAtomicToken, WhitespaceShape,
};
use crate::hir::tokens_iterator::TokensIterator;
use crate::parse::comment::Comment;
use derive_new::new;
use nu_errors::ParseError;
use nu_protocol::{RowType, SpannedTypeName, Type};
use nu_source::{
b, DebugDocBuilder, HasFallibleSpan, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem,
};
use std::fmt::Debug;
// A Signature is a command without implementation.
//
// In Nu, a command is a function combined with macro expansion rules.
//
// def cd
// # Change to a new path.
// optional directory(Path) # the directory to change to
// end
#[derive(new)]
struct Expander<'a, 'b, 'c, 'd> {
iterator: &'b mut TokensIterator<'a>,
context: &'d ExpandContext<'c>,
}
impl<'a, 'b, 'c, 'd> Expander<'a, 'b, 'c, 'd> {
fn expand<O>(&mut self, syntax: impl ExpandSyntax<Output = O>) -> Result<O, ParseError>
where
O: HasFallibleSpan + Clone + std::fmt::Debug + 'static,
{
expand_syntax(&syntax, self.iterator, self.context)
}
fn optional<O>(&mut self, syntax: impl ExpandSyntax<Output = O>) -> Option<O>
where
O: HasFallibleSpan + Clone + std::fmt::Debug + 'static,
{
match expand_syntax(&syntax, self.iterator, self.context) {
Err(_) => None,
Ok(value) => Some(value),
}
}
fn pos(&mut self) -> Span {
self.iterator.span_at_cursor()
}
fn slice_string(&mut self, span: impl Into<Span>) -> String {
span.into().slice(self.context.source()).to_string()
}
}
#[derive(Debug, Copy, Clone)]
struct SignatureShape;
impl ExpandSyntax for SignatureShape {
type Output = hir::Signature;
fn name(&self) -> &'static str {
"signature"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let mut expander = Expander::new(token_nodes, context);
let start = expander.pos();
expander.expand(keyword("def"))?;
expander.expand(WhitespaceShape)?;
let name = expander.expand(BareShape)?;
expander.expand(SeparatorShape)?;
let usage = expander.expand(CommentShape)?;
expander.expand(SeparatorShape)?;
let end = expander.pos();
Ok(hir::Signature::new(
nu_protocol::Signature::new(&name.word).desc(expander.slice_string(usage.text)),
start.until(end),
))
})
}
}
fn keyword(kw: &'static str) -> KeywordShape {
KeywordShape { keyword: kw }
}
#[derive(Debug, Copy, Clone)]
struct KeywordShape {
keyword: &'static str,
}
impl ExpandSyntax for KeywordShape {
type Output = Span;
fn name(&self) -> &'static str {
"keyword"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "keyword", context, ExpansionRule::new())?;
if let UnspannedAtomicToken::Word { text } = &atom.unspanned {
let word = text.slice(context.source());
if word == self.keyword {
return Ok(atom.span);
}
}
Err(ParseError::mismatch(self.keyword, atom.spanned_type_name()))
}
}
#[derive(Debug, Copy, Clone)]
struct SeparatorShape;
impl ExpandSyntax for SeparatorShape {
type Output = Span;
fn name(&self) -> &'static str {
"separator"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "separator", context, ExpansionRule::new())?;
match &atom.unspanned {
UnspannedAtomicToken::Separator { text } => Ok(*text),
_ => Err(ParseError::mismatch("separator", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone)]
struct CommentShape;
impl ExpandSyntax for CommentShape {
type Output = Comment;
fn name(&self) -> &'static str {
"comment"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "comment", context, ExpansionRule::new())?;
match &atom.unspanned {
UnspannedAtomicToken::Comment { body } => Ok(Comment::line(body, atom.span)),
_ => Err(ParseError::mismatch("separator", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone, new)]
struct TupleShape<A, B> {
first: A,
second: B,
}
#[derive(Debug, Clone, new)]
struct TupleSyntax<A, B> {
first: A,
second: B,
}
impl<A, B> PrettyDebugWithSource for TupleSyntax<A, B>
where
A: PrettyDebugWithSource,
B: PrettyDebugWithSource,
{
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed(
"pair",
self.first.pretty_debug(source) + b::space() + self.second.pretty_debug(source),
)
}
}
impl<A, B> HasFallibleSpan for TupleSyntax<A, B>
where
A: HasFallibleSpan + Debug + Clone,
B: HasFallibleSpan + Debug + Clone,
{
fn maybe_span(&self) -> Option<Span> {
match (self.first.maybe_span(), self.second.maybe_span()) {
(Some(first), Some(second)) => Some(first.until(second)),
(Some(first), None) => Some(first),
(None, Some(second)) => Some(second),
(None, None) => None,
}
}
}
impl<A, B, AOut, BOut> ExpandSyntax for TupleShape<A, B>
where
A: ExpandSyntax<Output = AOut> + Debug + Copy,
B: ExpandSyntax<Output = BOut> + Debug + Copy,
AOut: HasFallibleSpan + Debug + Clone + 'static,
BOut: HasFallibleSpan + Debug + Clone + 'static,
{
type Output = TupleSyntax<AOut, BOut>;
fn name(&self) -> &'static str {
"pair"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let first = expand_syntax(&self.first, token_nodes, context)?;
let second = expand_syntax(&self.second, token_nodes, context)?;
Ok(TupleSyntax { first, second })
})
}
}
#[derive(Debug, Clone)]
pub struct PositionalParam {
optional: Option<Span>,
name: Identifier,
ty: Spanned<Type>,
desc: Spanned<String>,
span: Span,
}
impl HasSpan for PositionalParam {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for PositionalParam {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
(match self.optional {
Some(_) => b::description("optional") + b::space(),
None => b::blank(),
}) + self.ty.pretty_debug(source)
}
}
#[derive(Debug, Copy, Clone)]
pub struct PositionalParamShape;
impl ExpandSyntax for PositionalParamShape {
type Output = PositionalParam;
fn name(&self) -> &'static str {
"positional param"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let mut expander = Expander::new(token_nodes, context);
let optional = expander
.optional(TupleShape::new(keyword("optional"), WhitespaceShape))
.map(|s| s.first);
let name = expander.expand(IdentifierShape)?;
expander.optional(WhitespaceShape);
let _ty = expander.expand(TypeShape)?;
Ok(PositionalParam {
optional,
name,
ty: Type::Nothing.spanned(Span::unknown()),
desc: format!("").spanned(Span::unknown()),
span: Span::unknown(),
})
})
}
}
#[derive(Debug, Clone)]
struct Identifier {
body: String,
span: Span,
}
impl HasSpan for Identifier {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for Identifier {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
b::typed("id", b::description(self.span.slice(source)))
}
}
#[derive(Debug, Copy, Clone)]
struct IdentifierShape;
impl ExpandSyntax for IdentifierShape {
type Output = Identifier;
fn name(&self) -> &'static str {
"identifier"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "identifier", context, ExpansionRule::new())?;
if let UnspannedAtomicToken::Word { text } = atom.unspanned {
let body = text.slice(context.source());
if is_id(body) {
return Ok(Identifier {
body: body.to_string(),
span: text,
});
}
}
Err(ParseError::mismatch("identifier", atom.spanned_type_name()))
}
}
fn is_id(input: &str) -> bool {
let source = nu_source::nom_input(input);
match crate::parse::parser::ident(source) {
Err(_) => false,
Ok((input, _)) => input.fragment.is_empty(),
}
}
#[derive(Debug, Clone, new)]
struct TypeSyntax {
ty: Type,
span: Span,
}
impl HasSpan for TypeSyntax {
fn span(&self) -> Span {
self.span
}
}
impl PrettyDebugWithSource for TypeSyntax {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.ty.pretty_debug(source)
}
}
#[derive(Debug, Copy, Clone)]
struct TypeShape;
impl ExpandSyntax for TypeShape {
type Output = TypeSyntax;
fn name(&self) -> &'static str {
"type"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(token_nodes, "type", context, ExpansionRule::new())?;
match atom.unspanned {
UnspannedAtomicToken::Word { text } => {
let word = text.slice(context.source());
Ok(TypeSyntax::new(
match word {
"nothing" => Type::Nothing,
"integer" => Type::Int,
"decimal" => Type::Decimal,
"bytesize" => Type::Bytesize,
"string" => Type::String,
"column-path" => Type::ColumnPath,
"pattern" => Type::Pattern,
"boolean" => Type::Boolean,
"date" => Type::Date,
"duration" => Type::Duration,
"filename" => Type::Path,
"binary" => Type::Binary,
"row" => Type::Row(RowType::new()),
"table" => Type::Table(vec![]),
"block" => Type::Block,
_ => return Err(ParseError::mismatch("type", atom.spanned_type_name())),
},
atom.span,
))
}
_ => Err(ParseError::mismatch("type", atom.spanned_type_name())),
}
}
}
#[derive(Debug, Copy, Clone)]
struct TypeAnnotation;
impl ExpandSyntax for TypeAnnotation {
type Output = TypeSyntax;
fn name(&self) -> &'static str {
"type annotation"
}
fn expand_syntax<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let atom = expand_atom(
token_nodes,
"type annotation",
context,
ExpansionRule::new(),
)?;
match atom.unspanned {
UnspannedAtomicToken::RoundDelimited { nodes, .. } => {
token_nodes.atomic_parse(|token_nodes| {
token_nodes.child(
(&nodes[..]).spanned(atom.span),
context.source().clone(),
|token_nodes| {
let ty = expand_syntax(&TypeShape, token_nodes, context)?;
let next = token_nodes.peek_non_ws();
match next.node {
None => Ok(ty),
Some(node) => {
Err(ParseError::extra_tokens(node.spanned_type_name()))
}
}
},
)
})
}
_ => Err(ParseError::mismatch(
"type annotation",
atom.spanned_type_name(),
)),
}
}
}


@ -1,23 +1,27 @@
mod block;
mod expression;
pub(crate) mod flat_shape;
pub mod flat_shape;
use crate::cli::external_command;
use crate::commands::{
classified::{ClassifiedPipeline, InternalCommand},
ClassifiedCommand, Command,
};
use crate::parser::hir::expand_external_tokens::ExternalTokensShape;
use crate::parser::hir::syntax_shape::block::AnyBlockShape;
use crate::parser::hir::tokens_iterator::Peeked;
use crate::parser::parse::tokens::Token;
use crate::parser::parse_command::{parse_command_tail, CommandTailShape};
use crate::parser::{hir, hir::TokensIterator, Operator, TokenNode, UnspannedToken};
use crate::prelude::*;
use crate::commands::classified::internal::InternalCommand;
use crate::commands::classified::{ClassifiedCommand, ClassifiedPipeline};
use crate::commands::external_command;
use crate::hir;
use crate::hir::expand_external_tokens::ExternalTokensShape;
use crate::hir::syntax_shape::block::AnyBlockShape;
use crate::hir::syntax_shape::expression::range::RangeShape;
use crate::hir::tokens_iterator::{Peeked, TokensIterator};
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::TokenNode;
use crate::parse::tokens::{Token, UnspannedToken};
use crate::parse_command::{parse_command_tail, CommandTailShape};
use derive_new::new;
use getset::Getters;
use nu_source::Spanned;
use serde::{Deserialize, Serialize};
use nu_errors::{ParseError, ShellError};
use nu_protocol::{ShellTypeName, Signature};
use nu_source::{
b, DebugDocBuilder, HasFallibleSpan, HasSpan, PrettyDebug, PrettyDebugWithSource, Span,
Spanned, SpannedItem, Tag, TaggedItem, Text,
};
use std::path::{Path, PathBuf};
pub(crate) use self::expression::atom::{
@ -40,89 +44,8 @@ pub(crate) use self::expression::variable_path::{
pub(crate) use self::expression::{continue_expression, AnyExpressionShape};
pub(crate) use self::flat_shape::FlatShape;
#[cfg(not(coloring_in_tokens))]
use crate::parser::hir::tokens_iterator::debug::debug_tokens;
#[cfg(not(coloring_in_tokens))]
use crate::parser::parse::pipeline::Pipeline;
#[cfg(not(coloring_in_tokens))]
use log::{log_enabled, trace};
use nu_protocol::SyntaxShape;
#[derive(Debug, Copy, Clone, Serialize, Deserialize)]
pub enum SyntaxShape {
Any,
String,
Member,
ColumnPath,
Number,
Int,
Path,
Pattern,
Block,
}
impl PrettyDebug for SyntaxShape {
fn pretty(&self) -> DebugDocBuilder {
b::kind(match self {
SyntaxShape::Any => "any shape",
SyntaxShape::String => "string shape",
SyntaxShape::Member => "member shape",
SyntaxShape::ColumnPath => "column path shape",
SyntaxShape::Number => "number shape",
SyntaxShape::Int => "integer shape",
SyntaxShape::Path => "file path shape",
SyntaxShape::Pattern => "pattern shape",
SyntaxShape::Block => "block shape",
})
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for SyntaxShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
match self {
SyntaxShape::Any => {
color_fallible_syntax(&AnyExpressionShape, token_nodes, context, shapes)
}
SyntaxShape::Int => color_fallible_syntax(&IntShape, token_nodes, context, shapes),
SyntaxShape::String => color_fallible_syntax_with(
&StringShape,
&FlatShape::String,
token_nodes,
context,
shapes,
),
SyntaxShape::Member => {
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)
}
SyntaxShape::ColumnPath => {
color_fallible_syntax(&ColumnPathShape, token_nodes, context, shapes)
}
SyntaxShape::Number => {
color_fallible_syntax(&NumberShape, token_nodes, context, shapes)
}
SyntaxShape::Path => {
color_fallible_syntax(&FilePathShape, token_nodes, context, shapes)
}
SyntaxShape::Pattern => {
color_fallible_syntax(&PatternShape, token_nodes, context, shapes)
}
SyntaxShape::Block => {
color_fallible_syntax(&AnyBlockShape, token_nodes, context, shapes)
}
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for SyntaxShape {
type Info = ();
type Input = ();
@ -140,6 +63,7 @@ impl FallibleColorSyntax for SyntaxShape {
match self {
SyntaxShape::Any => color_fallible_syntax(&AnyExpressionShape, token_nodes, context),
SyntaxShape::Int => color_fallible_syntax(&IntShape, token_nodes, context),
SyntaxShape::Range => color_fallible_syntax(&RangeShape, token_nodes, context),
SyntaxShape::String => {
color_fallible_syntax_with(&StringShape, &FlatShape::String, token_nodes, context)
}
@ -160,6 +84,7 @@ impl ExpandExpression for SyntaxShape {
match self {
SyntaxShape::Any => "shape[any]",
SyntaxShape::Int => "shape[integer]",
SyntaxShape::Range => "shape[range]",
SyntaxShape::String => "shape[string]",
SyntaxShape::Member => "shape[column name]",
SyntaxShape::ColumnPath => "shape[column path]",
@ -178,6 +103,7 @@ impl ExpandExpression for SyntaxShape {
match self {
SyntaxShape::Any => expand_expr(&AnyExpressionShape, token_nodes, context),
SyntaxShape::Int => expand_expr(&IntShape, token_nodes, context),
SyntaxShape::Range => expand_expr(&RangeShape, token_nodes, context),
SyntaxShape::String => expand_expr(&StringShape, token_nodes, context),
SyntaxShape::Member => {
let syntax = expand_syntax(&MemberShape, token_nodes, context)?;
@ -200,13 +126,17 @@ impl ExpandExpression for SyntaxShape {
}
}
pub trait SignatureRegistry {
fn has(&self, name: &str) -> Result<bool, ShellError>;
fn get(&self, name: &str) -> Result<Option<Signature>, ShellError>;
}
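This trait is what lets the parser ask about command signatures without depending on the shell's own command registry. A minimal implementation over an `IndexMap` (essentially the `TestRegistry` used in the parser tests earlier in this change; the name `MapRegistry` is invented here) looks like:
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::Signature;
struct MapRegistry {
    signatures: IndexMap<String, Signature>,
}
impl SignatureRegistry for MapRegistry {
    fn has(&self, name: &str) -> Result<bool, ShellError> {
        Ok(self.signatures.contains_key(name))
    }
    fn get(&self, name: &str) -> Result<Option<Signature>, ShellError> {
        Ok(self.signatures.get(name).cloned())
    }
}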
#[derive(Getters, new)]
pub struct ExpandContext<'context> {
#[get = "pub(crate)"]
registry: &'context CommandRegistry,
#[get = "pub(crate)"]
source: &'context Text,
homedir: Option<PathBuf>,
pub registry: Box<dyn SignatureRegistry>,
pub source: &'context Text,
pub homedir: Option<PathBuf>,
}
impl<'context> ExpandContext<'context> {
@ -214,19 +144,8 @@ impl<'context> ExpandContext<'context> {
self.homedir.as_ref().map(|h| h.as_path())
}
#[cfg(test)]
pub fn with_empty(source: &Text, callback: impl FnOnce(ExpandContext)) {
let mut registry = CommandRegistry::new();
registry.insert(
"ls",
crate::commands::whole_stream_command(crate::commands::LS),
);
callback(ExpandContext {
registry: &registry,
source,
homedir: None,
})
pub(crate) fn source(&self) -> &'context Text {
self.source
}
}
@ -248,7 +167,6 @@ pub trait ExpandExpression: std::fmt::Debug + Copy {
) -> Result<hir::Expression, ParseError>;
}
#[cfg(coloring_in_tokens)]
pub trait FallibleColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
@ -263,35 +181,6 @@ pub trait FallibleColorSyntax: std::fmt::Debug + Copy {
) -> Result<Self::Info, ShellError>;
}
#[cfg(not(coloring_in_tokens))]
pub trait FallibleColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
fn color_syntax<'a, 'b>(
&self,
input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<Self::Info, ShellError>;
}
#[cfg(not(coloring_in_tokens))]
pub trait ColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
fn color_syntax<'a, 'b>(
&self,
input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info;
}
#[cfg(coloring_in_tokens)]
pub trait ColorSyntax: std::fmt::Debug + Copy {
type Info;
type Input;
@ -306,7 +195,7 @@ pub trait ColorSyntax: std::fmt::Debug + Copy {
) -> Self::Info;
}
pub(crate) trait ExpandSyntax: std::fmt::Debug + Copy {
pub trait ExpandSyntax: std::fmt::Debug + Copy {
type Output: HasFallibleSpan + Clone + std::fmt::Debug + 'static;
fn name(&self) -> &'static str;
@ -318,7 +207,7 @@ pub(crate) trait ExpandSyntax: std::fmt::Debug + Copy {
) -> Result<Self::Output, ParseError>;
}
pub(crate) fn expand_syntax<'a, 'b, T: ExpandSyntax>(
pub fn expand_syntax<'a, 'b, T: ExpandSyntax>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
@ -338,7 +227,6 @@ pub(crate) fn expand_expr<'a, 'b, T: ExpandExpression>(
})
}
#[cfg(coloring_in_tokens)]
pub fn color_syntax<'a, 'b, T: ColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
@ -352,70 +240,6 @@ pub fn color_syntax<'a, 'b, T: ColorSyntax<Info = U, Input = ()>, U>(
)
}
#[cfg(not(coloring_in_tokens))]
pub fn color_syntax<'a, 'b, T: ColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> ((), U) {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes.state(), context.source));
let len = shapes.len();
let result = shape.color_syntax(&(), token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes.state(), context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
((), result)
}
#[cfg(not(coloring_in_tokens))]
pub fn color_fallible_syntax<'a, 'b, T: FallibleColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<U, ShellError> {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes.state(), context.source));
if token_nodes.at_end() {
trace!(target: "nu::color_syntax", "at eof");
return Err(ShellError::unexpected_eof("coloring", Tag::unknown()));
}
let len = shapes.len();
let result = shape.color_syntax(&(), token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes.state(), context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
result
}
#[cfg(coloring_in_tokens)]
pub fn color_fallible_syntax<'a, 'b, T: FallibleColorSyntax<Info = U, Input = ()>, U>(
shape: &T,
token_nodes: &'b mut TokensIterator<'a>,
@ -426,37 +250,6 @@ pub fn color_fallible_syntax<'a, 'b, T: FallibleColorSyntax<Info = U, Input = ()
})
}
#[cfg(not(coloring_in_tokens))]
pub fn color_syntax_with<'a, 'b, T: ColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> ((), U) {
trace!(target: "nu::color_syntax", "before {} :: {:?}", std::any::type_name::<T>(), debug_tokens(token_nodes.state(), context.source));
let len = shapes.len();
let result = shape.color_syntax(input, token_nodes, context, shapes);
trace!(target: "nu::color_syntax", "ok :: {:?}", debug_tokens(token_nodes.state(), context.source));
if log_enabled!(target: "nu::color_syntax", log::Level::Trace) {
trace!(target: "nu::color_syntax", "after {}", std::any::type_name::<T>());
if len < shapes.len() {
for i in len..(shapes.len()) {
trace!(target: "nu::color_syntax", "new shape :: {:?}", shapes[i]);
}
} else {
trace!(target: "nu::color_syntax", "no new shapes");
}
}
((), result)
}
#[cfg(coloring_in_tokens)]
pub fn color_syntax_with<'a, 'b, T: ColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
@ -471,20 +264,6 @@ pub fn color_syntax_with<'a, 'b, T: ColorSyntax<Info = U, Input = I>, U, I>(
)
}
#[cfg(not(coloring_in_tokens))]
pub fn color_fallible_syntax_with<'a, 'b, T: FallibleColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<U, ShellError> {
token_nodes.color_fallible_frame(std::any::type_name::<T>(), |token_nodes| {
shape.color_syntax(input, token_nodes, context, shapes)
})
}
#[cfg(coloring_in_tokens)]
pub fn color_fallible_syntax_with<'a, 'b, T: FallibleColorSyntax<Info = U, Input = I>, U, I>(
shape: &T,
input: &I,
@ -604,7 +383,7 @@ impl ExpandSyntax for BarePathShape {
..
})
| TokenNode::Token(Token {
unspanned: UnspannedToken::Operator(Operator::Dot),
unspanned: UnspannedToken::EvaluationOperator(EvaluationOperator::Dot),
..
}) => true,
@ -616,37 +395,6 @@ impl ExpandSyntax for BarePathShape {
#[derive(Debug, Copy, Clone)]
pub struct BareShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for BareShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes
.peek_any_token("word", |token| match token {
// If it's a bare token, color it
TokenNode::Token(Token {
unspanned: UnspannedToken::Bare,
span,
}) => {
shapes.push((*input).spanned(*span));
Ok(())
}
// otherwise, fail
other => Err(ParseError::mismatch("word", other.spanned_type_name())),
})
.map_err(|err| err.into())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for BareShape {
type Info = ();
type Input = FlatShape;
@ -747,7 +495,7 @@ impl TestSyntax for BareShape {
#[derive(Debug, Clone)]
pub enum CommandSignature {
Internal(Spanned<Arc<Command>>),
Internal(Spanned<Signature>),
LiteralExternal { outer: Span, inner: Span },
External(Span),
Expression(hir::Expression),
@ -757,7 +505,7 @@ impl PrettyDebugWithSource for CommandSignature {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self {
CommandSignature::Internal(internal) => {
b::typed("command", b::description(internal.name()))
b::typed("command", b::description(&internal.name))
}
CommandSignature::LiteralExternal { outer, .. } => {
b::typed("command", b::description(outer.slice(source)))
@ -805,43 +553,6 @@ impl CommandSignature {
#[derive(Debug, Copy, Clone)]
pub struct PipelineShape;
#[cfg(not(coloring_in_tokens))]
// The failure mode is if the head of the token stream is not a pipeline
impl FallibleColorSyntax for PipelineShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
// Make sure we're looking at a pipeline
let Pipeline { parts, .. } =
token_nodes.peek_any_token("pipeline", |node| node.as_pipeline())?;
// Enumerate the pipeline parts
for part in parts {
// If the pipeline part has a prefix `|`, emit a pipe to color
if let Some(pipe) = part.pipe {
shapes.push(FlatShape::Pipe.spanned(pipe));
}
// Create a new iterator containing the tokens in the pipeline part to color
let mut token_nodes =
TokensIterator::new(&part.tokens(), part.span(), context.source.clone(), false);
color_syntax(&MaybeSpaceShape, &mut token_nodes, context, shapes);
color_syntax(&CommandShape, &mut token_nodes, context, shapes);
}
Ok(())
}
}
#[cfg(coloring_in_tokens)]
// The failure mode is if the head of the token stream is not a pipeline
impl FallibleColorSyntax for PipelineShape {
type Info = ();
@ -881,46 +592,6 @@ impl FallibleColorSyntax for PipelineShape {
}
}
#[cfg(coloring_in_tokens)]
impl ExpandSyntax for PipelineShape {
type Output = ClassifiedPipeline;
fn name(&self) -> &'static str {
"pipeline"
}
fn expand_syntax<'content, 'me>(
&self,
iterator: &'me mut TokensIterator<'content>,
context: &ExpandContext,
) -> Result<Self::Output, ParseError> {
let start = iterator.span_at_cursor();
let peeked = iterator.peek_any().not_eof("pipeline")?;
let pipeline = peeked.commit().as_pipeline()?;
let parts = &pipeline.parts[..];
let mut out = vec![];
for part in parts {
let tokens: Spanned<&[TokenNode]> = part.tokens().spanned(part.span());
let classified =
iterator.child(tokens, context.source.clone(), move |token_nodes| {
expand_syntax(&ClassifiedCommandShape, token_nodes, context)
})?;
out.push(classified);
}
let end = iterator.span_at_cursor();
Ok(ClassifiedPipeline::commands(out, start.until(end)))
}
}
#[cfg(not(coloring_in_tokens))]
impl ExpandSyntax for PipelineShape {
type Output = ClassifiedPipeline;
@ -967,61 +638,6 @@ pub enum CommandHeadKind {
#[derive(Debug, Copy, Clone)]
pub struct CommandHeadShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for CommandHeadShape {
type Info = CommandHeadKind;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<CommandHeadKind, ShellError> {
// If we don't ultimately find a token, roll back
token_nodes.atomic(|token_nodes| {
// First, take a look at the next token
let atom = expand_atom(
token_nodes,
"command head",
context,
ExpansionRule::permissive(),
)?;
match &atom.unspanned {
// If the head is an explicit external command (^cmd), color it as an external command
UnspannedAtomicToken::ExternalCommand { .. } => {
shapes.push(FlatShape::ExternalCommand.spanned(atom.span));
Ok(CommandHeadKind::External)
}
// If the head is a word, it depends on whether it matches a registered internal command
UnspannedAtomicToken::Word { text } => {
let name = text.slice(context.source);
if context.registry.has(name) {
// If the registry has the command, color it as an internal command
shapes.push(FlatShape::InternalCommand.spanned(text));
let command = context.registry.expect_command(name);
Ok(CommandHeadKind::Internal(command.signature()))
} else {
// Otherwise, color it as an external command
shapes.push(FlatShape::ExternalCommand.spanned(text));
Ok(CommandHeadKind::External)
}
}
// Otherwise, we're not actually looking at a command
_ => Err(ShellError::syntax_error(
"No command at the head".spanned(atom.span),
)),
}
})
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for CommandHeadShape {
type Info = CommandHeadKind;
type Input = ();
@ -1057,11 +673,27 @@ impl FallibleColorSyntax for CommandHeadShape {
UnspannedAtomicToken::Word { text } => {
let name = text.slice(context.source);
if context.registry.has(name) {
if context.registry.has(name)? {
// If the registry has the command, color it as an internal command
token_nodes.color_shape(FlatShape::InternalCommand.spanned(text));
let command = context.registry.expect_command(name);
Ok(CommandHeadKind::Internal(command.signature()))
let signature = context
.registry
.get(name)
.map_err(|_| {
ShellError::labeled_error(
"Internal error: could not load signature from registry",
"could not load from registry",
text,
)
})?
.ok_or_else(|| {
ShellError::labeled_error(
"Internal error: could not load signature from registry",
"could not load from registry",
text,
)
})?;
Ok(CommandHeadKind::Internal(signature))
} else {
// Otherwise, color it as an external command
token_nodes.color_shape(FlatShape::ExternalCommand.spanned(text));
@ -1099,9 +731,15 @@ impl ExpandSyntax for CommandHeadShape {
},
UnspannedToken::Bare => {
let name = token_span.slice(context.source);
if context.registry.has(name) {
let command = context.registry.expect_command(name);
CommandSignature::Internal(command.spanned(token_span))
if context.registry.has(name)? {
let signature = context
.registry
.get(name)
.map_err(|_| ParseError::internal_error(name.spanned(token_span)))?
.ok_or_else(|| {
ParseError::internal_error(name.spanned(token_span))
})?;
CommandSignature::Internal(signature.spanned(token_span))
} else {
CommandSignature::External(token_span)
}
@ -1116,9 +754,9 @@ impl ExpandSyntax for CommandHeadShape {
});
match node {
Ok(expr) => return Ok(expr),
Ok(expr) => Ok(expr),
Err(_) => match expand_expr(&AnyExpressionShape, token_nodes, context) {
Ok(expr) => return Ok(CommandSignature::Expression(expr)),
Ok(expr) => Ok(CommandSignature::Expression(expr)),
Err(_) => Err(token_nodes.peek_non_ws().type_error("command head3")),
},
}
@ -1162,9 +800,8 @@ impl ExpandSyntax for ClassifiedCommandShape {
external_command(iterator, context, name_str.tagged(outer))
}
CommandSignature::Internal(command) => {
let tail =
parse_command_tail(&command.signature(), &context, iterator, command.span)?;
CommandSignature::Internal(signature) => {
let tail = parse_command_tail(&signature.item, &context, iterator, signature.span)?;
let (positional, named) = match tail {
None => (None, None),
@ -1181,9 +818,9 @@ impl ExpandSyntax for ClassifiedCommandShape {
};
Ok(ClassifiedCommand::Internal(InternalCommand::new(
command.item.name().to_string(),
signature.item.name.clone(),
Tag {
span: command.span,
span: signature.span,
anchor: None,
},
call,
@ -1196,46 +833,6 @@ impl ExpandSyntax for ClassifiedCommandShape {
#[derive(Debug, Copy, Clone)]
pub struct InternalCommandHeadShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for InternalCommandHeadShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let peeked_head = token_nodes.peek_non_ws().not_eof("command head4");
let peeked_head = match peeked_head {
Err(_) => return Ok(()),
Ok(peeked_head) => peeked_head,
};
let _expr = match peeked_head.node {
TokenNode::Token(Token {
unspanned: UnspannedToken::Bare,
span,
}) => shapes.push(FlatShape::Word.spanned(*span)),
TokenNode::Token(Token {
unspanned: UnspannedToken::String(_inner_tag),
span,
}) => shapes.push(FlatShape::String.spanned(*span)),
_node => shapes.push(FlatShape::Error.spanned(peeked_head.node.span())),
};
peeked_head.commit();
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for InternalCommandHeadShape {
type Info = ();
type Input = ();
@ -1259,7 +856,7 @@ impl FallibleColorSyntax for InternalCommandHeadShape {
let node = peeked_head.commit();
let _expr = match node {
match node {
TokenNode::Token(Token {
unspanned: UnspannedToken::Bare,
span,
@ -1329,8 +926,8 @@ impl<'token> SingleError<'token> {
}
}
fn parse_single_node<'a, 'b, T>(
token_nodes: &'b mut TokensIterator<'a>,
fn parse_single_node<T>(
token_nodes: &mut TokensIterator<'_>,
expected: &'static str,
callback: impl FnOnce(UnspannedToken, Span, SingleError) -> Result<T, ParseError>,
) -> Result<T, ParseError> {
@ -1351,8 +948,8 @@ fn parse_single_node<'a, 'b, T>(
})
}
fn parse_single_node_skipping_ws<'a, 'b, T>(
token_nodes: &'b mut TokensIterator<'a>,
fn parse_single_node_skipping_ws<T>(
token_nodes: &mut TokensIterator<'_>,
expected: &'static str,
callback: impl FnOnce(UnspannedToken, Span, SingleError) -> Result<T, ShellError>,
) -> Result<T, ShellError> {
@ -1384,38 +981,6 @@ fn parse_single_node_skipping_ws<'a, 'b, T>(
#[derive(Debug, Copy, Clone)]
pub struct WhitespaceShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for WhitespaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("whitespace");
let peeked = match peeked {
Err(_) => return Ok(()),
Ok(peeked) => peeked,
};
let _tag = match peeked.node {
TokenNode::Whitespace(span) => shapes.push(FlatShape::Whitespace.spanned(*span)),
_other => return Ok(()),
};
peeked.commit();
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for WhitespaceShape {
type Info = ();
type Input = ();
@ -1439,7 +1004,7 @@ impl FallibleColorSyntax for WhitespaceShape {
let node = peeked.commit();
let _ = match node {
match node {
TokenNode::Whitespace(span) => {
token_nodes.color_shape(FlatShape::Whitespace.spanned(*span))
}
@ -1556,33 +1121,6 @@ impl ExpandSyntax for MaybeSpaceShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for MaybeSpaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info {
let peeked = token_nodes.peek_any().not_eof("whitespace");
let peeked = match peeked {
Err(_) => return,
Ok(peeked) => peeked,
};
if let TokenNode::Whitespace(span) = peeked.node {
peeked.commit();
shapes.push(FlatShape::Whitespace.spanned(*span));
}
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for MaybeSpaceShape {
type Info = ();
type Input = ();
@ -1614,36 +1152,6 @@ impl ColorSyntax for MaybeSpaceShape {
#[derive(Debug, Copy, Clone)]
pub struct SpaceShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for SpaceShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("whitespace")?;
match peeked.node {
TokenNode::Whitespace(span) => {
peeked.commit();
shapes.push(FlatShape::Whitespace.spanned(*span));
Ok(())
}
other => Err(ShellError::type_error(
"whitespace",
other.spanned_type_name(),
)),
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for SpaceShape {
type Info = ();
type Input = ();
@ -1717,38 +1225,6 @@ fn expand_variable(span: Span, token_span: Span, source: &Text) -> hir::Expressi
#[derive(Debug, Copy, Clone)]
pub struct CommandShape;
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for CommandShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) {
let kind = color_fallible_syntax(&CommandHeadShape, token_nodes, context, shapes);
match kind {
Err(_) => {
// We didn't find a command, so we'll have to fall back to parsing this pipeline part
// as a blob of undifferentiated expressions
color_syntax(&ExpressionListShape, token_nodes, context, shapes);
}
Ok(CommandHeadKind::External) => {
color_syntax(&ExternalTokensShape, token_nodes, context, shapes);
}
Ok(CommandHeadKind::Internal(signature)) => {
color_syntax_with(&CommandTailShape, &signature, token_nodes, context, shapes);
}
};
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for CommandShape {
type Info = ();
type Input = ();

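The hunks above swap the old infallible registry calls (`expect_command`, `command.signature()`) for a fallible `registry.get(name)` returning a `Result<Option<Signature>>`; both the failed lookup and the missing-signature case collapse into one internal error. A minimal stand-alone sketch of that error-mapping pattern, using toy `Signature`/`ShellError` types rather than the real nushell registry API:

struct Signature {
    name: String,
}

struct ShellError(String);

// Stand-in for a registry lookup that can fail outright or simply find nothing.
fn registry_get(name: &str) -> Result<Option<Signature>, String> {
    match name {
        "ls" => Ok(Some(Signature { name: name.to_string() })),
        "broken" => Err("registry unavailable".to_string()),
        _ => Ok(None),
    }
}

// Mirrors the shape of the new lookup in CommandHeadShape: map_err handles the
// failed lookup, ok_or_else handles the missing entry, and the caller gets a Signature.
fn lookup(name: &str) -> Result<Signature, ShellError> {
    registry_get(name)
        .map_err(|_| ShellError(format!("could not load signature for {}", name)))?
        .ok_or_else(|| ShellError(format!("no signature registered for {}", name)))
}

fn main() {
    for name in ["ls", "nope", "broken"] {
        match lookup(name) {
            Ok(signature) => println!("{}: internal command {}", name, signature.name),
            Err(ShellError(reason)) => println!("{}: {}", name, reason),
        }
    }
}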
View File

@ -0,0 +1,251 @@
use crate::{
hir,
hir::syntax_shape::{
color_fallible_syntax, color_syntax_with, continue_expression, expand_expr, expand_syntax,
DelimitedShape, ExpandContext, ExpandExpression, ExpressionContinuationShape,
ExpressionListShape, FallibleColorSyntax, MemberShape, PathTailShape, PathTailSyntax,
VariablePathShape,
},
hir::tokens_iterator::TokensIterator,
parse::token_tree::Delimiter,
};
use nu_errors::{ParseError, ShellError};
use nu_source::Span;
#[derive(Debug, Copy, Clone)]
pub struct AnyBlockShape;
impl FallibleColorSyntax for AnyBlockShape {
type Info = ();
type Input = ();
fn name(&self) -> &'static str {
"AnyBlockShape"
}
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<(), ShellError> {
let block = token_nodes.peek_non_ws().not_eof("block");
let block = match block {
Err(_) => return Ok(()),
Ok(block) => block,
};
// is it just a block?
let block = block.node.as_block();
if let Some((children, spans)) = block {
token_nodes.child(children, context.source.clone(), |token_nodes| {
color_syntax_with(
&DelimitedShape,
&(Delimiter::Brace, spans.0, spans.1),
token_nodes,
context,
);
});
return Ok(());
}
// Otherwise, look for a shorthand block. If none found, fail
color_fallible_syntax(&ShorthandBlock, token_nodes, context)
}
}
impl ExpandExpression for AnyBlockShape {
fn name(&self) -> &'static str {
"any block"
}
fn expand_expr<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
let block = token_nodes.peek_non_ws().not_eof("block")?;
// is it just a block?
let block = block.node.as_block();
if let Some((block, _tags)) = block {
let mut iterator =
TokensIterator::new(&block.item, block.span, context.source.clone(), false);
let exprs = expand_syntax(&ExpressionListShape, &mut iterator, context)?.exprs;
return Ok(hir::RawExpression::Block(exprs.item).into_expr(block.span));
}
expand_syntax(&ShorthandBlock, token_nodes, context)
}
}
#[derive(Debug, Copy, Clone)]
pub struct ShorthandBlock;
impl FallibleColorSyntax for ShorthandBlock {
type Info = ();
type Input = ();
fn name(&self) -> &'static str {
"ShorthandBlock"
}
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<(), ShellError> {
// Try to find a shorthand head. If none found, fail
color_fallible_syntax(&ShorthandPath, token_nodes, context)?;
loop {
// Check to see whether there's any continuation after the head expression
let result = color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context);
match result {
// if no continuation was found, we're done
Err(_) => break,
// if a continuation was found, look for another one
Ok(_) => continue,
}
}
Ok(())
}
}
impl ExpandExpression for ShorthandBlock {
fn name(&self) -> &'static str {
"shorthand block"
}
fn expand_expr<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
let path = expand_expr(&ShorthandPath, token_nodes, context)?;
let start = path.span;
let expr = continue_expression(path, token_nodes, context);
let end = expr.span;
let block = hir::RawExpression::Block(vec![expr]).into_expr(start.until(end));
Ok(block)
}
}
/// A shorthand for `$it.foo."bar"`, used inside of a shorthand block
#[derive(Debug, Copy, Clone)]
pub struct ShorthandPath;
impl FallibleColorSyntax for ShorthandPath {
type Info = ();
type Input = ();
fn name(&self) -> &'static str {
"ShorthandPath"
}
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
let variable = color_fallible_syntax(&VariablePathShape, token_nodes, context);
if variable.is_ok() {
// if it's a variable path, that's the head part
return Ok(());
}
// otherwise, we'll try to find a member path
// look for a member (`<member>` -> `$it.<member>`)
color_fallible_syntax(&MemberShape, token_nodes, context)?;
// Now that we've synthesized the head of the path, proceed to expand the tail of the path
// like any other path.
// It's ok if there's no path tail; a single member is sufficient
let _ = color_fallible_syntax(&PathTailShape, token_nodes, context);
Ok(())
})
}
}
impl ExpandExpression for ShorthandPath {
fn name(&self) -> &'static str {
"shorthand path"
}
fn expand_expr<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
// if it's a variable path, that's the head part
let path = expand_expr(&VariablePathShape, token_nodes, context);
if let Ok(path) = path {
return Ok(path);
}
// Synthesize the head of the shorthand path (`<member>` -> `$it.<member>`)
let mut head = expand_expr(&ShorthandHeadShape, token_nodes, context)?;
// Now that we've synthesized the head of the path, proceed to expand the tail of the path
// like any other path.
let tail = expand_syntax(&PathTailShape, token_nodes, context);
match tail {
Err(_) => Ok(head),
Ok(PathTailSyntax { tail, .. }) => {
// For each member that `PathTailShape` expanded, join it onto the existing expression
// to form a new path
for member in tail {
head = hir::Expression::dot_member(head, member);
}
Ok(head)
}
}
}
}
/// A shorthand for `$it.foo."bar"`, used inside of a shorthand block
#[derive(Debug, Copy, Clone)]
pub struct ShorthandHeadShape;
impl ExpandExpression for ShorthandHeadShape {
fn name(&self) -> &'static str {
"shorthand head"
}
fn expand_expr<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
let head = expand_syntax(&MemberShape, token_nodes, context)?;
let head = head.to_path_member(context.source);
// Synthesize an `$it` expression
let it = synthetic_it();
let span = head.span;
Ok(hir::Expression::path(it, vec![head], span))
}
}
fn synthetic_it() -> hir::Expression {
hir::Expression::it_variable(Span::unknown(), Span::unknown())
}

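This new file handles shorthand blocks (the bare form used in places like a `where` clause): a lone member is rewritten onto the synthetic `$it` variable, and any path tail is joined on member by member. A simplified stand-alone model of that synthesis, using a toy `Expr` enum instead of the real `hir::Expression`:

#[derive(Debug, Clone)]
enum Expr {
    ItVariable,
    Path(Box<Expr>, String),
}

// Mirrors the role of hir::Expression::dot_member: join one more member onto a path.
fn dot_member(head: Expr, member: &str) -> Expr {
    Expr::Path(Box::new(head), member.to_string())
}

// `<member>` becomes `$it.<member>`, then the rest of the tail is folded on,
// roughly what ShorthandHeadShape plus ShorthandPath do above.
fn expand_shorthand(members: &[&str]) -> Expr {
    let mut expr = Expr::ItVariable;
    for member in members.iter().copied() {
        expr = dot_member(expr, member);
    }
    expr
}

fn main() {
    // Roughly what the bare path `size.bytes` becomes inside a shorthand block.
    println!("{:?}", expand_shorthand(&["size", "bytes"]));
}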
View File

@ -4,22 +4,23 @@ pub(crate) mod file_path;
pub(crate) mod list;
pub(crate) mod number;
pub(crate) mod pattern;
pub(crate) mod range;
pub(crate) mod string;
pub(crate) mod unit;
pub(crate) mod variable_path;
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
color_delimited_square, color_fallible_syntax, color_fallible_syntax_with, expand_atom,
expand_delimited_square, expand_expr, expand_syntax, BareShape, ColorableDotShape, DotShape,
ExpandContext, ExpandExpression, ExpandSyntax, ExpansionRule, ExpressionContinuation,
ExpressionContinuationShape, FallibleColorSyntax, FlatShape, ParseError, UnspannedAtomicToken,
ExpressionContinuationShape, FallibleColorSyntax, FlatShape, UnspannedAtomicToken,
};
use crate::parser::{
use crate::{
hir,
hir::{Expression, TokensIterator},
};
use crate::prelude::*;
use nu_source::Spanned;
use nu_errors::{ParseError, ShellError};
use nu_source::{HasSpan, Span, Spanned, SpannedItem, Tag};
use std::path::PathBuf;
#[derive(Debug, Copy, Clone)]
@ -42,34 +43,6 @@ impl ExpandExpression for AnyExpressionShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for AnyExpressionShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
// Look for an expression at the cursor
color_fallible_syntax(&AnyExpressionStartShape, token_nodes, context, shapes)?;
match continue_coloring_expression(token_nodes, context, shapes) {
Err(_) => {
// it's fine for there to be no continuation
}
Ok(()) => {}
}
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for AnyExpressionShape {
type Info = ();
type Input = ();
@ -127,32 +100,6 @@ pub(crate) fn continue_expression(
}
}
#[cfg(not(coloring_in_tokens))]
pub(crate) fn continue_coloring_expression(
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
// if there's not even one expression continuation, fail
color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context, shapes)?;
loop {
// Check to see whether there's any continuation after the head expression
let result =
color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context, shapes);
match result {
Err(_) => {
// We already saw one continuation, so just return
return Ok(());
}
Ok(_) => {}
}
}
}
#[cfg(coloring_in_tokens)]
pub(crate) fn continue_coloring_expression(
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
@ -164,13 +111,9 @@ pub(crate) fn continue_coloring_expression(
// Check to see whether there's any continuation after the head expression
let result = color_fallible_syntax(&ExpressionContinuationShape, token_nodes, context);
match result {
Err(_) => {
// We already saw one continuation, so just return
return Ok(());
}
Ok(_) => {}
if result.is_err() {
// We already saw one continuation, so just return
return Ok(());
}
}
}
@ -191,19 +134,17 @@ impl ExpandExpression for AnyExpressionStartShape {
let atom = expand_atom(token_nodes, "expression", context, ExpansionRule::new())?;
match atom.unspanned {
UnspannedAtomicToken::Size { number, unit } => {
return Ok(hir::Expression::size(
number.to_number(context.source),
unit.item,
Tag {
span: atom.span,
anchor: None,
},
))
}
UnspannedAtomicToken::Size { number, unit } => Ok(hir::Expression::size(
number.to_number(context.source),
unit.item,
Tag {
span: atom.span,
anchor: None,
},
)),
UnspannedAtomicToken::SquareDelimited { nodes, .. } => {
expand_delimited_square(&nodes, atom.span.into(), context)
expand_delimited_square(&nodes, atom.span, context)
}
UnspannedAtomicToken::Word { .. } => {
@ -211,75 +152,13 @@ impl ExpandExpression for AnyExpressionStartShape {
Ok(hir::Expression::bare(atom.span.until_option(end)))
}
other => {
return other
.into_atomic_token(atom.span)
.into_hir(context, "expression")
}
other => other
.into_atomic_token(atom.span)
.to_hir(context, "expression"),
}
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for AnyExpressionStartShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(
token_nodes,
"expression",
context,
ExpansionRule::permissive(),
)
});
let atom = match atom {
Spanned {
item: Err(_err),
span,
} => {
shapes.push(FlatShape::Error.spanned(span));
return Ok(());
}
Spanned {
item: Ok(value), ..
} => value,
};
match &atom.unspanned {
UnspannedAtomicToken::Size { number, unit } => shapes.push(
FlatShape::Size {
number: number.span(),
unit: unit.span.into(),
}
.spanned(atom.span),
),
UnspannedAtomicToken::SquareDelimited { nodes, spans } => {
color_delimited_square(*spans, &nodes, atom.span.into(), context, shapes)
}
UnspannedAtomicToken::Word { .. } => {
shapes.push(FlatShape::Word.spanned(atom.span));
}
_ => atom.color_tokens(shapes),
}
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for AnyExpressionStartShape {
type Info = ();
type Input = ();
@ -321,7 +200,7 @@ impl FallibleColorSyntax for AnyExpressionStartShape {
UnspannedAtomicToken::Size { number, unit } => token_nodes.color_shape(
FlatShape::Size {
number: number.span(),
unit: unit.span.into(),
unit: unit.span,
}
.spanned(atom.span),
),
@ -331,7 +210,7 @@ impl FallibleColorSyntax for AnyExpressionStartShape {
(&nodes[..]).spanned(atom.span),
context.source.clone(),
|tokens| {
color_delimited_square(spans, tokens, atom.span.into(), context);
color_delimited_square(spans, tokens, atom.span, context);
},
);
}
@ -350,64 +229,6 @@ impl FallibleColorSyntax for AnyExpressionStartShape {
#[derive(Debug, Copy, Clone)]
pub struct BareTailShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for BareTailShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let len = shapes.len();
loop {
let word = color_fallible_syntax_with(
&BareShape,
&FlatShape::Word,
token_nodes,
context,
shapes,
);
match word {
// if a word was found, continue
Ok(_) => continue,
// if a word wasn't found, try to find a dot
Err(_) => {}
}
// try to find a dot
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Word,
token_nodes,
context,
shapes,
);
match dot {
// if a dot was found, try to find another word
Ok(_) => continue,
// otherwise, we're done
Err(_) => break,
}
}
if shapes.len() > len {
Ok(())
} else {
Err(ShellError::syntax_error(
"No tokens matched BareTailShape".spanned_unknown(),
))
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for BareTailShape {
type Info = ();
type Input = ();
@ -428,13 +249,13 @@ impl FallibleColorSyntax for BareTailShape {
let word =
color_fallible_syntax_with(&BareShape, &FlatShape::Word, token_nodes, context);
match word {
if word.is_ok() {
// if a word was found, continue
Ok(_) => continue,
// if a word wasn't found, try to find a dot
Err(_) => {}
continue;
}
// if a word wasn't found, try to find a dot
// try to find a dot
let dot = color_fallible_syntax_with(
&ColorableDotShape,

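The rewritten `continue_coloring_expression` keeps the original contract: at least one continuation must color successfully, after which continuations are consumed until the first failure, which is not an error. A tiny stand-alone version of that control flow, with a hypothetical `try_step` closure standing in for `color_fallible_syntax`:

// Require one successful continuation, then keep consuming until one fails.
fn continue_coloring<F>(mut try_step: F) -> Result<usize, &'static str>
where
    F: FnMut() -> Result<(), ()>,
{
    // If there's not even one continuation, fail.
    try_step().map_err(|_| "no continuation found")?;
    let mut consumed = 1;
    loop {
        if try_step().is_err() {
            // We already saw at least one continuation, so a failure here just ends the loop.
            return Ok(consumed);
        }
        consumed += 1;
    }
}

fn main() {
    let mut remaining = 3;
    let colored = continue_coloring(|| {
        if remaining > 0 {
            remaining -= 1;
            Ok(())
        } else {
            Err(())
        }
    });
    println!("{:?}", colored); // Ok(3): three continuations colored, then a clean stop
}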
View File

@ -1,15 +1,20 @@
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::FlatShape;
use crate::hir::syntax_shape::{
expand_syntax, expression::expand_file_path, parse_single_node, BarePathShape,
BarePatternShape, ExpandContext, UnitShape, UnitSyntax,
};
use crate::parser::{
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parse::tokens::UnspannedToken;
use crate::parse::unit::Unit;
use crate::{
hir,
hir::{Expression, RawNumber, TokensIterator},
parse::flag::{Flag, FlagKind},
DelimitedNode, Delimiter, FlatShape, TokenNode, Unit, UnspannedToken,
};
use crate::prelude::*;
use nu_source::Spanned;
use nu_errors::{ParseError, ShellError};
use nu_protocol::{ShellTypeName, SpannedTypeName};
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
use std::ops::Deref;
#[derive(Debug, Clone)]
@ -48,23 +53,36 @@ pub enum UnspannedAtomicToken<'tokens> {
Word {
text: Span,
},
#[allow(unused)]
Dot {
text: Span,
},
SquareDelimited {
spans: (Span, Span),
nodes: &'tokens Vec<TokenNode>,
},
#[allow(unused)]
RoundDelimited {
spans: (Span, Span),
nodes: &'tokens Vec<TokenNode>,
},
ShorthandFlag {
name: Span,
},
Operator {
CompareOperator {
text: Span,
},
Dot {
text: Span,
},
DotDot {
text: Span,
},
Whitespace {
text: Span,
},
Separator {
text: Span,
},
Comment {
body: Span,
},
}
impl<'tokens> UnspannedAtomicToken<'tokens> {
@ -76,15 +94,24 @@ impl<'tokens> UnspannedAtomicToken<'tokens> {
}
}
impl<'tokens> ShellTypeName for AtomicToken<'tokens> {
fn type_name(&self) -> &'static str {
self.unspanned.type_name()
}
}
impl<'tokens> ShellTypeName for UnspannedAtomicToken<'tokens> {
fn type_name(&self) -> &'static str {
match &self {
UnspannedAtomicToken::Eof { .. } => "eof",
UnspannedAtomicToken::Error { .. } => "error",
UnspannedAtomicToken::Operator { .. } => "operator",
UnspannedAtomicToken::CompareOperator { .. } => "compare operator",
UnspannedAtomicToken::ShorthandFlag { .. } => "shorthand flag",
UnspannedAtomicToken::Whitespace { .. } => "whitespace",
UnspannedAtomicToken::Separator { .. } => "separator",
UnspannedAtomicToken::Comment { .. } => "comment",
UnspannedAtomicToken::Dot { .. } => "dot",
UnspannedAtomicToken::DotDot { .. } => "dotdot",
UnspannedAtomicToken::Number { .. } => "number",
UnspannedAtomicToken::Size { .. } => "size",
UnspannedAtomicToken::String { .. } => "string",
@ -95,6 +122,7 @@ impl<'tokens> ShellTypeName for UnspannedAtomicToken<'tokens> {
UnspannedAtomicToken::GlobPattern { .. } => "file pattern",
UnspannedAtomicToken::Word { .. } => "word",
UnspannedAtomicToken::SquareDelimited { .. } => "array literal",
UnspannedAtomicToken::RoundDelimited { .. } => "paren delimited",
}
}
}
@ -105,6 +133,12 @@ pub struct AtomicToken<'tokens> {
pub span: Span,
}
impl<'tokens> HasSpan for AtomicToken<'tokens> {
fn span(&self) -> Span {
self.span
}
}
impl<'tokens> Deref for AtomicToken<'tokens> {
type Target = UnspannedAtomicToken<'tokens>;
@ -114,7 +148,7 @@ impl<'tokens> Deref for AtomicToken<'tokens> {
}
impl<'tokens> AtomicToken<'tokens> {
pub fn into_hir(
pub fn to_hir(
&self,
context: &ExpandContext,
expected: &'static str,
@ -127,31 +161,18 @@ impl<'tokens> AtomicToken<'tokens> {
))
}
UnspannedAtomicToken::Error { .. } => {
return Err(ParseError::mismatch(
expected,
"eof atomic token".spanned(self.span),
))
return Err(ParseError::mismatch(expected, "error".spanned(self.span)))
}
UnspannedAtomicToken::Operator { .. } => {
return Err(ParseError::mismatch(
expected,
"operator".spanned(self.span),
))
}
UnspannedAtomicToken::ShorthandFlag { .. } => {
return Err(ParseError::mismatch(
expected,
"shorthand flag".spanned(self.span),
))
}
UnspannedAtomicToken::Whitespace { .. } => {
return Err(ParseError::mismatch(
expected,
"whitespace".spanned(self.span),
))
}
UnspannedAtomicToken::Dot { .. } => {
return Err(ParseError::mismatch(expected, "dot".spanned(self.span)))
UnspannedAtomicToken::RoundDelimited { .. }
| UnspannedAtomicToken::CompareOperator { .. }
| UnspannedAtomicToken::ShorthandFlag { .. }
| UnspannedAtomicToken::Whitespace { .. }
| UnspannedAtomicToken::Separator { .. }
| UnspannedAtomicToken::Comment { .. }
| UnspannedAtomicToken::Dot { .. }
| UnspannedAtomicToken::DotDot { .. }
| UnspannedAtomicToken::SquareDelimited { .. } => {
return Err(ParseError::mismatch(expected, self.spanned_type_name()));
}
UnspannedAtomicToken::Number { number } => {
Expression::number(number.to_number(context.source), self.span)
@ -171,89 +192,55 @@ impl<'tokens> AtomicToken<'tokens> {
self.span,
),
UnspannedAtomicToken::Word { text } => Expression::string(*text, *text),
UnspannedAtomicToken::SquareDelimited { .. } => unimplemented!("into_hir"),
})
}
#[cfg(not(coloring_in_tokens))]
pub fn spanned_type_name(&self) -> Spanned<&'static str> {
match &self.unspanned {
UnspannedAtomicToken::Eof { .. } => "eof",
UnspannedAtomicToken::Error { .. } => "error",
UnspannedAtomicToken::Operator { .. } => "operator",
UnspannedAtomicToken::ShorthandFlag { .. } => "shorthand flag",
UnspannedAtomicToken::Whitespace { .. } => "whitespace",
UnspannedAtomicToken::Dot { .. } => "dot",
UnspannedAtomicToken::Number { .. } => "number",
UnspannedAtomicToken::Size { .. } => "size",
UnspannedAtomicToken::String { .. } => "string",
UnspannedAtomicToken::ItVariable { .. } => "$it",
UnspannedAtomicToken::Variable { .. } => "variable",
UnspannedAtomicToken::ExternalCommand { .. } => "external command",
UnspannedAtomicToken::ExternalWord { .. } => "external word",
UnspannedAtomicToken::GlobPattern { .. } => "file pattern",
UnspannedAtomicToken::Word { .. } => "word",
UnspannedAtomicToken::SquareDelimited { .. } => "array literal",
}
.spanned(self.span)
}
pub(crate) fn color_tokens(&self, shapes: &mut Vec<Spanned<FlatShape>>) {
match &self.unspanned {
UnspannedAtomicToken::Eof { .. } => {}
UnspannedAtomicToken::Error { .. } => {
return shapes.push(FlatShape::Error.spanned(self.span))
}
UnspannedAtomicToken::Operator { .. } => {
return shapes.push(FlatShape::Operator.spanned(self.span));
UnspannedAtomicToken::Error { .. } => shapes.push(FlatShape::Error.spanned(self.span)),
UnspannedAtomicToken::CompareOperator { .. } => {
shapes.push(FlatShape::CompareOperator.spanned(self.span))
}
UnspannedAtomicToken::ShorthandFlag { .. } => {
return shapes.push(FlatShape::ShorthandFlag.spanned(self.span));
shapes.push(FlatShape::ShorthandFlag.spanned(self.span))
}
UnspannedAtomicToken::Whitespace { .. } => {
return shapes.push(FlatShape::Whitespace.spanned(self.span));
shapes.push(FlatShape::Whitespace.spanned(self.span))
}
UnspannedAtomicToken::Number {
number: RawNumber::Decimal(_),
} => {
return shapes.push(FlatShape::Decimal.spanned(self.span));
}
} => shapes.push(FlatShape::Decimal.spanned(self.span)),
UnspannedAtomicToken::Number {
number: RawNumber::Int(_),
} => {
return shapes.push(FlatShape::Int.spanned(self.span));
}
UnspannedAtomicToken::Size { number, unit } => {
return shapes.push(
FlatShape::Size {
number: number.span(),
unit: unit.span,
}
.spanned(self.span),
);
}
} => shapes.push(FlatShape::Int.spanned(self.span)),
UnspannedAtomicToken::Size { number, unit } => shapes.push(
FlatShape::Size {
number: number.span(),
unit: unit.span,
}
.spanned(self.span),
),
UnspannedAtomicToken::String { .. } => {
return shapes.push(FlatShape::String.spanned(self.span))
shapes.push(FlatShape::String.spanned(self.span))
}
UnspannedAtomicToken::ItVariable { .. } => {
return shapes.push(FlatShape::ItVariable.spanned(self.span))
shapes.push(FlatShape::ItVariable.spanned(self.span))
}
UnspannedAtomicToken::Variable { .. } => {
return shapes.push(FlatShape::Variable.spanned(self.span))
shapes.push(FlatShape::Variable.spanned(self.span))
}
UnspannedAtomicToken::ExternalCommand { .. } => {
return shapes.push(FlatShape::ExternalCommand.spanned(self.span));
shapes.push(FlatShape::ExternalCommand.spanned(self.span))
}
UnspannedAtomicToken::ExternalWord { .. } => {
return shapes.push(FlatShape::ExternalWord.spanned(self.span))
shapes.push(FlatShape::ExternalWord.spanned(self.span))
}
UnspannedAtomicToken::GlobPattern { .. } => {
return shapes.push(FlatShape::GlobPattern.spanned(self.span))
shapes.push(FlatShape::GlobPattern.spanned(self.span))
}
UnspannedAtomicToken::Word { .. } => {
return shapes.push(FlatShape::Word.spanned(self.span))
}
_ => return shapes.push(FlatShape::Error.spanned(self.span)),
UnspannedAtomicToken::Word { .. } => shapes.push(FlatShape::Word.spanned(self.span)),
_ => shapes.push(FlatShape::Error.spanned(self.span)),
}
}
}
@ -301,17 +288,30 @@ impl PrettyDebugWithSource for AtomicToken<'_> {
b::intersperse_with_source(nodes.iter(), b::space(), source),
"]",
),
UnspannedAtomicToken::RoundDelimited { nodes, .. } => b::delimit(
"(",
b::intersperse_with_source(nodes.iter(), b::space(), source),
")",
),
UnspannedAtomicToken::ShorthandFlag { name } => {
atom_kind("shorthand flag", b::key(name.slice(source)))
}
UnspannedAtomicToken::Dot { .. } => atom(b::kind("dot")),
UnspannedAtomicToken::Operator { text } => {
UnspannedAtomicToken::DotDot { .. } => atom(b::kind("dotdot")),
UnspannedAtomicToken::CompareOperator { text } => {
atom_kind("operator", b::keyword(text.slice(source)))
}
UnspannedAtomicToken::Whitespace { text } => atom_kind(
"whitespace",
b::description(format!("{:?}", text.slice(source))),
),
UnspannedAtomicToken::Separator { text } => atom_kind(
"separator",
b::description(format!("{:?}", text.slice(source))),
),
UnspannedAtomicToken::Comment { body } => {
atom_kind("comment", b::description(body.slice(source)))
}
})
}
}
@ -327,12 +327,15 @@ pub enum WhitespaceHandling {
pub struct ExpansionRule {
pub(crate) allow_external_command: bool,
pub(crate) allow_external_word: bool,
pub(crate) allow_operator: bool,
pub(crate) allow_cmp_operator: bool,
pub(crate) allow_eval_operator: bool,
pub(crate) allow_eof: bool,
pub(crate) allow_separator: bool,
pub(crate) treat_size_as_word: bool,
pub(crate) separate_members: bool,
pub(crate) commit_errors: bool,
pub(crate) whitespace: WhitespaceHandling,
pub(crate) allow_comments: bool,
}
impl ExpansionRule {
@ -340,12 +343,15 @@ impl ExpansionRule {
ExpansionRule {
allow_external_command: false,
allow_external_word: false,
allow_operator: false,
allow_eval_operator: false,
allow_cmp_operator: false,
allow_eof: false,
treat_size_as_word: false,
separate_members: false,
commit_errors: false,
allow_separator: false,
whitespace: WhitespaceHandling::RejectWhitespace,
allow_comments: false,
}
}
@ -356,11 +362,14 @@ impl ExpansionRule {
ExpansionRule {
allow_external_command: true,
allow_external_word: true,
allow_operator: true,
allow_cmp_operator: true,
allow_eval_operator: true,
allow_eof: true,
separate_members: false,
treat_size_as_word: false,
commit_errors: true,
allow_separator: true,
allow_comments: true,
whitespace: WhitespaceHandling::AllowWhitespace,
}
}
@ -372,14 +381,26 @@ impl ExpansionRule {
}
#[allow(unused)]
pub fn allow_operator(mut self) -> ExpansionRule {
self.allow_operator = true;
pub fn allow_cmp_operator(mut self) -> ExpansionRule {
self.allow_cmp_operator = true;
self
}
#[allow(unused)]
pub fn no_cmp_operator(mut self) -> ExpansionRule {
self.allow_cmp_operator = false;
self
}
#[allow(unused)]
pub fn allow_eval_operator(mut self) -> ExpansionRule {
self.allow_eval_operator = true;
self
}
#[allow(unused)]
pub fn no_operator(mut self) -> ExpansionRule {
self.allow_operator = false;
self.allow_eval_operator = false;
self
}
@ -436,6 +457,30 @@ impl ExpansionRule {
self.whitespace = WhitespaceHandling::RejectWhitespace;
self
}
#[allow(unused)]
pub fn allow_separator(mut self) -> ExpansionRule {
self.allow_separator = true;
self
}
#[allow(unused)]
pub fn reject_separator(mut self) -> ExpansionRule {
self.allow_separator = false;
self
}
#[allow(unused)]
pub fn allow_comments(mut self) -> ExpansionRule {
self.allow_comments = true;
self
}
#[allow(unused)]
pub fn reject_comments(mut self) -> ExpansionRule {
self.allow_comments = false;
self
}
}
pub fn expand_atom<'me, 'content>(
@ -469,14 +514,13 @@ fn expand_atom_inner<'me, 'content>(
rule: ExpansionRule,
) -> Result<AtomicToken<'content>, ParseError> {
if token_nodes.at_end() {
match rule.allow_eof {
true => {
return Ok(UnspannedAtomicToken::Eof {
span: Span::unknown(),
}
.into_atomic_token(Span::unknown()))
if rule.allow_eof {
return Ok(UnspannedAtomicToken::Eof {
span: Span::unknown(),
}
false => return Err(ParseError::unexpected_eof("anything", Span::unknown())),
.into_atomic_token(Span::unknown()));
} else {
return Err(ParseError::unexpected_eof("anything", Span::unknown()));
}
}
@ -485,9 +529,8 @@ fn expand_atom_inner<'me, 'content>(
// If treat_size_as_word, don't try to parse the head of the token stream
// as a size.
match rule.treat_size_as_word {
true => {}
false => match expand_syntax(&UnitShape, token_nodes, context) {
if !rule.treat_size_as_word {
match expand_syntax(&UnitShape, token_nodes, context) {
// If the head of the stream isn't a valid unit, we'll try to parse
// it again next as a word
Err(_) => {}
@ -497,31 +540,28 @@ fn expand_atom_inner<'me, 'content>(
unit: (number, unit),
span,
}) => return Ok(UnspannedAtomicToken::Size { number, unit }.into_atomic_token(span)),
},
}
}
match rule.separate_members {
false => {}
true => {
let mut next = token_nodes.peek_any();
if rule.separate_members {
let mut next = token_nodes.peek_any();
match next.node {
Some(token) if token.is_word() => {
next.commit();
return Ok(UnspannedAtomicToken::Word { text: token.span() }
.into_atomic_token(token.span()));
}
Some(token) if token.is_int() => {
next.commit();
return Ok(UnspannedAtomicToken::Number {
number: RawNumber::Int(token.span()),
}
match next.node {
Some(token) if token.is_word() => {
next.commit();
return Ok(UnspannedAtomicToken::Word { text: token.span() }
.into_atomic_token(token.span()));
}
_ => {}
}
Some(token) if token.is_int() => {
next.commit();
return Ok(UnspannedAtomicToken::Number {
number: RawNumber::Int(token.span()),
}
.into_atomic_token(token.span()));
}
_ => {}
}
}
@ -574,6 +614,17 @@ fn expand_atom_inner<'me, 'content>(
.into_atomic_token(error.span));
}
TokenNode::Separator(span) if rule.allow_separator => {
peeked.commit();
return Ok(UnspannedAtomicToken::Separator { text: *span }.into_atomic_token(span));
}
TokenNode::Comment(comment) if rule.allow_comments => {
peeked.commit();
return Ok(UnspannedAtomicToken::Comment { body: comment.text }
.into_atomic_token(comment.span()));
}
// [ ... ]
TokenNode::Delimited(Spanned {
item:
@ -645,8 +696,16 @@ fn expand_atom_inner<'me, 'content>(
// First, the error cases. Each error case corresponds to an expansion rule
// flag that can be used to allow the case
// rule.allow_operator
UnspannedToken::Operator(_) if !rule.allow_operator => return Err(err.error()),
// rule.allow_cmp_operator
UnspannedToken::CompareOperator(_) if !rule.allow_cmp_operator => {
return Err(err.error())
}
// rule.allow_eval_operator
UnspannedToken::EvaluationOperator(_) if !rule.allow_eval_operator => {
return Err(err.error())
}
// rule.allow_external_command
UnspannedToken::ExternalCommand(_) if !rule.allow_external_command => {
return Err(ParseError::mismatch(
@ -665,8 +724,15 @@ fn expand_atom_inner<'me, 'content>(
UnspannedToken::Number(number) => {
UnspannedAtomicToken::Number { number }.into_atomic_token(token_span)
}
UnspannedToken::Operator(_) => {
UnspannedAtomicToken::Operator { text: token_span }.into_atomic_token(token_span)
UnspannedToken::CompareOperator(_) => {
UnspannedAtomicToken::CompareOperator { text: token_span }
.into_atomic_token(token_span)
}
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => {
UnspannedAtomicToken::Dot { text: token_span }.into_atomic_token(token_span)
}
UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot) => {
UnspannedAtomicToken::DotDot { text: token_span }.into_atomic_token(token_span)
}
UnspannedToken::String(body) => {
UnspannedAtomicToken::String { body }.into_atomic_token(token_span)

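The `ExpansionRule` changes above split the single operator flag into comparison and evaluation operators and add separator/comment switches, all toggled through chainable builder methods. A reduced stand-alone sketch of that builder style; the `Rule` type and its defaults here are illustrative, not the real `ExpansionRule`:

#[derive(Debug, Clone, Copy)]
struct Rule {
    allow_cmp_operator: bool,
    allow_eval_operator: bool,
    allow_separator: bool,
    allow_comments: bool,
}

impl Rule {
    // The strict default: everything off, callers opt in to exactly what they need.
    fn new() -> Rule {
        Rule {
            allow_cmp_operator: false,
            allow_eval_operator: false,
            allow_separator: false,
            allow_comments: false,
        }
    }

    // The permissive variant turns everything on, shown here for contrast.
    fn permissive() -> Rule {
        Rule {
            allow_cmp_operator: true,
            allow_eval_operator: true,
            allow_separator: true,
            allow_comments: true,
        }
    }

    fn allow_eval_operator(mut self) -> Rule {
        self.allow_eval_operator = true;
        self
    }

    fn allow_separator(mut self) -> Rule {
        self.allow_separator = true;
        self
    }
}

fn main() {
    // How a caller such as RangeShape might opt into `..` without the other flags.
    let rule = Rule::new().allow_eval_operator();
    println!("eval operators allowed: {}", rule.allow_eval_operator);
    println!("permissive: {:?}", Rule::permissive().allow_separator());
}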
View File

@ -1,13 +1,12 @@
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
color_syntax, expand_syntax, ColorSyntax, ExpandContext, ExpressionListShape, TokenNode,
};
use crate::parser::{hir, hir::TokensIterator, Delimiter, FlatShape};
use crate::prelude::*;
#[cfg(not(coloring_in_tokens))]
use nu_source::Spanned;
use crate::{hir, hir::TokensIterator, Delimiter, FlatShape};
use nu_errors::ParseError;
use nu_source::{Span, SpannedItem, Tag};
pub fn expand_delimited_square(
children: &Vec<TokenNode>,
children: &[TokenNode],
span: Span,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
@ -21,21 +20,6 @@ pub fn expand_delimited_square(
))
}
#[cfg(not(coloring_in_tokens))]
pub fn color_delimited_square(
(open, close): (Span, Span),
children: &Vec<TokenNode>,
span: Span,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) {
shapes.push(FlatShape::OpenDelimiter(Delimiter::Square).spanned(open));
let mut tokens = TokensIterator::new(&children, span, context.source.clone(), false);
let _list = color_syntax(&ExpressionListShape, &mut tokens, context, shapes);
shapes.push(FlatShape::CloseDelimiter(Delimiter::Square).spanned(close));
}
#[cfg(coloring_in_tokens)]
pub fn color_delimited_square(
(open, close): (Span, Span),
token_nodes: &mut TokensIterator,
@ -50,24 +34,6 @@ pub fn color_delimited_square(
#[derive(Debug, Copy, Clone)]
pub struct DelimitedShape;
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for DelimitedShape {
type Info = ();
type Input = (Delimiter, Span, Span);
fn color_syntax<'a, 'b>(
&self,
(delimiter, open, close): &(Delimiter, Span, Span),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info {
shapes.push(FlatShape::OpenDelimiter(*delimiter).spanned(*open));
color_syntax(&ExpressionListShape, token_nodes, context, shapes);
shapes.push(FlatShape::CloseDelimiter(*delimiter).spanned(*close));
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for DelimitedShape {
type Info = ();
type Input = (Delimiter, Span, Span);

View File

@ -1,56 +1,16 @@
use crate::parser::hir::syntax_shape::expression::atom::{
use crate::hir::syntax_shape::expression::atom::{
expand_atom, ExpansionRule, UnspannedAtomicToken,
};
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
expression::expand_file_path, ExpandContext, ExpandExpression, FallibleColorSyntax, FlatShape,
ParseError,
};
use crate::parser::{hir, hir::TokensIterator};
use crate::prelude::*;
use crate::{hir, hir::TokensIterator};
use nu_errors::{ParseError, ShellError};
use nu_source::SpannedItem;
#[derive(Debug, Copy, Clone)]
pub struct FilePathShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for FilePathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<nu_source::Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(
token_nodes,
"file path",
context,
ExpansionRule::permissive(),
);
let atom = match atom {
Err(_) => return Ok(()),
Ok(atom) => atom,
};
match &atom.unspanned {
UnspannedAtomicToken::Word { .. }
| UnspannedAtomicToken::String { .. }
| UnspannedAtomicToken::Number { .. }
| UnspannedAtomicToken::Size { .. } => {
shapes.push(FlatShape::Path.spanned(atom.span));
}
_ => atom.color_tokens(shapes),
}
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for FilePathShape {
type Info = ();
type Input = ();
@ -102,20 +62,27 @@ impl ExpandExpression for FilePathShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
let atom = expand_atom(token_nodes, "file path", context, ExpansionRule::new())?;
let atom = expand_atom(
token_nodes,
"file path",
context,
ExpansionRule::new().allow_external_word(),
)?;
match atom.unspanned {
UnspannedAtomicToken::Word { text: body } | UnspannedAtomicToken::String { body } => {
UnspannedAtomicToken::Word { text: body }
| UnspannedAtomicToken::ExternalWord { text: body }
| UnspannedAtomicToken::String { body } => {
let path = expand_file_path(body.slice(context.source), context);
return Ok(hir::Expression::file_path(path, atom.span));
Ok(hir::Expression::file_path(path, atom.span))
}
UnspannedAtomicToken::Number { .. } | UnspannedAtomicToken::Size { .. } => {
let path = atom.span.slice(context.source);
return Ok(hir::Expression::file_path(path, atom.span));
Ok(hir::Expression::file_path(path, atom.span))
}
_ => return atom.into_hir(context, "file path"),
_ => atom.to_hir(context, "file path"),
}
}
}

View File

@ -1,7 +1,4 @@
use crate::errors::ParseError;
#[cfg(not(coloring_in_tokens))]
use crate::parser::hir::syntax_shape::FlatShape;
use crate::parser::{
use crate::{
hir,
hir::syntax_shape::{
color_fallible_syntax, color_syntax, expand_atom, expand_expr, maybe_spaced, spaced,
@ -10,6 +7,7 @@ use crate::parser::{
},
hir::TokensIterator,
};
use nu_errors::ParseError;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
#[derive(Debug, Clone)]
@ -76,77 +74,6 @@ impl ExpandSyntax for ExpressionListShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for ExpressionListShape {
type Info = ();
type Input = ();
/// The intent of this method is to fully color an expression list shape infallibly.
/// This means that if we can't expand a token into an expression, we fall back to
/// a simpler coloring strategy.
///
/// This would apply to something like `where x >`, which includes an incomplete
/// binary operator. Since we will fail to process it as a binary operator, we'll
/// fall back to a simpler coloring and move on.
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) {
// We encountered a parsing error and will continue with simpler coloring ("backoff
// coloring mode")
let mut backoff = false;
// Consume any leading whitespace
color_syntax(&MaybeSpaceShape, token_nodes, context, shapes);
loop {
// If we reached the very end of the token stream, we're done
if token_nodes.at_end() {
return;
}
if backoff {
let len = shapes.len();
// If we previously encountered a parsing error, use backoff coloring mode
color_syntax(&SimplestExpression, token_nodes, context, shapes);
if len == shapes.len() && !token_nodes.at_end() {
// This should never happen, but if it does, a panic is better than an infinite loop
panic!("Unexpected tokens left that couldn't be colored even with SimplestExpression")
}
} else {
// Try to color the head of the stream as an expression
match color_fallible_syntax(&AnyExpressionShape, token_nodes, context, shapes) {
// If no expression was found, switch to backoff coloring mode
Err(_) => {
backoff = true;
continue;
}
Ok(_) => {}
}
// If an expression was found, consume a space
match color_fallible_syntax(&SpaceShape, token_nodes, context, shapes) {
Err(_) => {
// If no space was found, we're either at the end or there's an error.
// Either way, switch to backoff coloring mode. If we're at the end
// it won't have any consequences.
backoff = true;
}
Ok(_) => {
// Otherwise, move on to the next expression
}
}
}
}
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for ExpressionListShape {
type Info = ();
type Input = ();
@ -193,27 +120,21 @@ impl ColorSyntax for ExpressionListShape {
}
} else {
// Try to color the head of the stream as an expression
match color_fallible_syntax(&AnyExpressionShape, token_nodes, context) {
if color_fallible_syntax(&AnyExpressionShape, token_nodes, context).is_err() {
// If no expression was found, switch to backoff coloring mode
Err(_) => {
backoff = true;
continue;
}
Ok(_) => {}
backoff = true;
continue;
}
// If an expression was found, consume a space
match color_fallible_syntax(&SpaceShape, token_nodes, context) {
Err(_) => {
// If no space was found, we're either at the end or there's an error.
// Either way, switch to backoff coloring mode. If we're at the end
// it won't have any consequences.
backoff = true;
}
Ok(_) => {
// Otherwise, move on to the next expression
}
if color_fallible_syntax(&SpaceShape, token_nodes, context).is_err() {
// If no space was found, we're either at the end or there's an error.
// Either way, switch to backoff coloring mode. If we're at the end
// it won't have any consequences.
backoff = true;
}
// Otherwise, move on to the next expression
}
}
}
@ -223,35 +144,6 @@ impl ColorSyntax for ExpressionListShape {
#[derive(Debug, Copy, Clone)]
pub struct BackoffColoringMode;
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for BackoffColoringMode {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info {
loop {
if token_nodes.at_end() {
break;
}
let len = shapes.len();
color_syntax(&SimplestExpression, token_nodes, context, shapes);
if len == shapes.len() && !token_nodes.at_end() {
// This shouldn't happen, but if it does, a panic is better than an infinite loop
panic!("SimplestExpression failed to consume any tokens, but it's not at the end. This is unexpected\n== token nodes==\n{:#?}\n\n== shapes ==\n{:#?}", token_nodes, shapes);
}
}
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for BackoffColoringMode {
type Info = ();
type Input = ();
@ -288,33 +180,6 @@ impl ColorSyntax for BackoffColoringMode {
#[derive(Debug, Copy, Clone)]
pub struct SimplestExpression;
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for SimplestExpression {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) {
let atom = expand_atom(
token_nodes,
"any token",
context,
ExpansionRule::permissive(),
);
match atom {
Err(_) => {}
Ok(atom) => atom.color_tokens(shapes),
}
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for SimplestExpression {
type Info = ();
type Input = ();

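The expression-list coloring above degrades gracefully: it first tries to color a full expression and, once that fails (as in the `where x >` example from the doc comment), switches to a backoff mode that colors one token at a time so the loop always makes progress. A compact stand-alone model of that strategy, with closures standing in for the real shapes:

// Try the rich strategy first; once it fails, fall back to the simplest one.
fn color_list<F, G>(tokens: &[&str], mut expression: F, mut simplest: G) -> Vec<String>
where
    F: FnMut(&str) -> Result<String, ()>,
    G: FnMut(&str) -> String,
{
    let mut shapes = Vec::new();
    let mut backoff = false;
    for token in tokens.iter().copied() {
        if backoff {
            // Backoff coloring mode: always consumes the token, so no infinite loop.
            shapes.push(simplest(token));
            continue;
        }
        match expression(token) {
            Ok(shape) => shapes.push(shape),
            Err(()) => {
                // Parsing failed; color this token simply and stay in backoff mode.
                backoff = true;
                shapes.push(simplest(token));
            }
        }
    }
    shapes
}

fn main() {
    // `where x >` contains an incomplete binary operator, so `>` fails as an expression.
    let shapes = color_list(
        &["where", "x", ">"],
        |t| if t == ">" { Err(()) } else { Ok(format!("expr({})", t)) },
        |t| format!("word({})", t),
    );
    println!("{:?}", shapes);
}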
View File

@ -1,15 +1,15 @@
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
expand_atom, parse_single_node, ExpandContext, ExpandExpression, ExpansionRule,
FallibleColorSyntax, FlatShape, ParseError, TestSyntax,
FallibleColorSyntax, FlatShape, TestSyntax,
};
use crate::parser::hir::tokens_iterator::Peeked;
use crate::parser::{
use crate::hir::tokens_iterator::Peeked;
use crate::parse::tokens::UnspannedToken;
use crate::{
hir,
hir::{RawNumber, TokensIterator},
UnspannedToken,
};
use crate::prelude::*;
use nu_source::Spanned;
use nu_errors::{ParseError, ShellError};
use nu_source::{Spanned, SpannedItem};
#[derive(Debug, Copy, Clone)]
pub struct NumberShape;
@ -26,9 +26,9 @@ impl ExpandExpression for NumberShape {
) -> Result<hir::Expression, ParseError> {
parse_single_node(token_nodes, "Number", |token, token_span, err| {
Ok(match token {
UnspannedToken::GlobPattern | UnspannedToken::Operator(..) => {
return Err(err.error())
}
UnspannedToken::GlobPattern
| UnspannedToken::CompareOperator(..)
| UnspannedToken::EvaluationOperator(..) => return Err(err.error()),
UnspannedToken::Variable(tag) if tag.slice(context.source) == "it" => {
hir::Expression::it_variable(tag, token_span)
}
@ -52,37 +52,6 @@ impl ExpandExpression for NumberShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for NumberShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(token_nodes, "number", context, ExpansionRule::permissive())
});
let atom = match atom {
Spanned { item: Err(_), span } => {
shapes.push(FlatShape::Error.spanned(span));
return Ok(());
}
Spanned { item: Ok(atom), .. } => atom,
};
atom.color_tokens(shapes);
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for NumberShape {
type Info = ();
type Input = ();
@ -131,7 +100,8 @@ impl ExpandExpression for IntShape {
parse_single_node(token_nodes, "Integer", |token, token_span, err| {
Ok(match token {
UnspannedToken::GlobPattern
| UnspannedToken::Operator(..)
| UnspannedToken::CompareOperator(..)
| UnspannedToken::EvaluationOperator(..)
| UnspannedToken::ExternalWord => return Err(err.error()),
UnspannedToken::Variable(span) if span.slice(context.source) == "it" => {
hir::Expression::it_variable(span, token_span)
@ -151,37 +121,6 @@ impl ExpandExpression for IntShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for IntShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = token_nodes.spanned(|token_nodes| {
expand_atom(token_nodes, "integer", context, ExpansionRule::permissive())
});
let atom = match atom {
Spanned { item: Err(_), span } => {
shapes.push(FlatShape::Error.spanned(span));
return Ok(());
}
Spanned { item: Ok(atom), .. } => atom,
};
atom.color_tokens(shapes);
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for IntShape {
type Info = ();
type Input = ();

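`NumberShape` and `IntShape` both funnel through `parse_single_node`: take exactly one token and hand it to a callback that either rejects it (glob patterns, operators) or converts it into an expression. A toy version of that single-token dispatch, with a simplified `Token` enum rather than the real `UnspannedToken`:

#[derive(Debug, Clone, Copy)]
enum Token {
    Int(i64),
    Decimal(f64),
    GlobPattern,
}

#[derive(Debug)]
enum Expr {
    Int(i64),
    Decimal(f64),
}

// Take one token and let the callback decide whether it fits the expected shape.
fn parse_single_node<T>(
    token: Token,
    expected: &'static str,
    callback: impl FnOnce(Token) -> Result<T, &'static str>,
) -> Result<T, String> {
    callback(token).map_err(|kind| format!("expected {}, found {}", expected, kind))
}

fn expand_number(token: Token) -> Result<Expr, String> {
    parse_single_node(token, "Number", |token| match token {
        Token::GlobPattern => Err("glob pattern"),
        Token::Int(i) => Ok(Expr::Int(i)),
        Token::Decimal(d) => Ok(Expr::Decimal(d)),
    })
}

fn main() {
    println!("{:?}", expand_number(Token::Int(42)));
    println!("{:?}", expand_number(Token::Decimal(1.5)));
    println!("{:?}", expand_number(Token::GlobPattern));
}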
View File

@ -1,44 +1,18 @@
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
expand_atom, expand_bare, expression::expand_file_path, ExpandContext, ExpandExpression,
ExpandSyntax, ExpansionRule, FallibleColorSyntax, FlatShape, ParseError, UnspannedAtomicToken,
ExpandSyntax, ExpansionRule, FallibleColorSyntax, FlatShape, UnspannedAtomicToken,
};
use crate::parser::parse::tokens::Token;
use crate::parser::{hir, hir::TokensIterator, Operator, TokenNode, UnspannedToken};
use crate::prelude::*;
#[cfg(not(coloring_in_tokens))]
use nu_source::Spanned;
use crate::parse::operator::EvaluationOperator;
use crate::parse::tokens::{Token, UnspannedToken};
use crate::{hir, hir::TokensIterator, TokenNode};
use nu_errors::{ParseError, ShellError};
use nu_protocol::ShellTypeName;
use nu_source::{Span, SpannedItem};
#[derive(Debug, Copy, Clone)]
pub struct PatternShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for PatternShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
let atom = expand_atom(token_nodes, "pattern", context, ExpansionRule::permissive())?;
match &atom.unspanned {
UnspannedAtomicToken::GlobPattern { .. } | UnspannedAtomicToken::Word { .. } => {
shapes.push(FlatShape::GlobPattern.spanned(atom.span));
Ok(())
}
_ => Err(ShellError::type_error("pattern", atom.spanned_type_name())),
}
})
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for PatternShape {
type Info = ();
type Input = ();
@ -81,16 +55,22 @@ impl ExpandExpression for PatternShape {
token_nodes: &mut TokensIterator<'_>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
let atom = expand_atom(token_nodes, "pattern", context, ExpansionRule::new())?;
let atom = expand_atom(
token_nodes,
"pattern",
context,
ExpansionRule::new().allow_external_word(),
)?;
match atom.unspanned {
UnspannedAtomicToken::Word { text: body }
| UnspannedAtomicToken::String { body }
| UnspannedAtomicToken::ExternalWord { text: body }
| UnspannedAtomicToken::GlobPattern { pattern: body } => {
let path = expand_file_path(body.slice(context.source), context);
return Ok(hir::Expression::pattern(path.to_string_lossy(), atom.span));
Ok(hir::Expression::pattern(path.to_string_lossy(), atom.span))
}
_ => return atom.into_hir(context, "pattern"),
_ => atom.to_hir(context, "pattern"),
}
}
}
@ -116,7 +96,7 @@ impl ExpandSyntax for BarePatternShape {
..
})
| TokenNode::Token(Token {
unspanned: UnspannedToken::Operator(Operator::Dot),
unspanned: UnspannedToken::EvaluationOperator(EvaluationOperator::Dot),
..
})
| TokenNode::Token(Token {
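PatternShape runs the matched word, string, external word, or glob through `expand_file_path` before producing a pattern expression. As a rough illustration of what such an expansion step can do, here is a tilde-expanding sketch; the helper below is hypothetical, not the nu-parser function:

// Expand a leading `~` to the user's home directory before the pattern
// is handed to a globber. Illustrative only.
use std::path::PathBuf;

fn expand_file_path(pattern: &str) -> PathBuf {
    if let Some(rest) = pattern.strip_prefix('~') {
        if let Ok(home) = std::env::var("HOME") {
            // join the remainder onto the home directory
            return PathBuf::from(home + rest);
        }
    }
    PathBuf::from(pattern)
}

fn main() {
    println!("{}", expand_file_path("~/src/*.rs").display());
    println!("{}", expand_file_path("tests/*.toml").display());
}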

View File

@ -0,0 +1,103 @@
use crate::hir::syntax_shape::expression::UnspannedAtomicToken;
use crate::hir::syntax_shape::{
color_fallible_syntax, expand_atom, expand_expr, AnyExpressionShape, ExpandContext,
ExpandExpression, ExpansionRule, FallibleColorSyntax, FlatShape,
};
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::TokenNode;
use crate::parse::tokens::{Token, UnspannedToken};
use crate::{hir, hir::TokensIterator};
use nu_errors::{ParseError, ShellError};
use nu_protocol::SpannedTypeName;
use nu_source::SpannedItem;
#[derive(Debug, Copy, Clone)]
pub struct RangeShape;
impl ExpandExpression for RangeShape {
fn name(&self) -> &'static str {
"range"
}
fn expand_expr<'a, 'b>(
&self,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<hir::Expression, ParseError> {
token_nodes.atomic_parse(|token_nodes| {
let left = expand_expr(&AnyExpressionShape, token_nodes, context)?;
let atom = expand_atom(
token_nodes,
"..",
context,
ExpansionRule::new().allow_eval_operator(),
)?;
let span = match atom.unspanned {
UnspannedAtomicToken::DotDot { text } => text,
_ => return Err(ParseError::mismatch("..", atom.spanned_type_name())),
};
let right = expand_expr(&AnyExpressionShape, token_nodes, context)?;
Ok(hir::Expression::range(left, span, right))
})
}
}
impl FallibleColorSyntax for RangeShape {
type Info = ();
type Input = ();
fn name(&self) -> &'static str {
"RangeShape"
}
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Result<(), ShellError> {
token_nodes.atomic_parse(|token_nodes| {
color_fallible_syntax(&AnyExpressionShape, token_nodes, context)?;
color_fallible_syntax(&DotDotShape, token_nodes, context)?;
color_fallible_syntax(&AnyExpressionShape, token_nodes, context)
})?;
Ok(())
}
}
#[derive(Debug, Copy, Clone)]
struct DotDotShape;
impl FallibleColorSyntax for DotDotShape {
type Info = ();
type Input = ();
fn name(&self) -> &'static str {
".."
}
fn color_syntax<'a, 'b>(
&self,
_input: &Self::Input,
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
) -> Result<Self::Info, ShellError> {
let peeked = token_nodes.peek_any().not_eof("..")?;
match &peeked.node {
TokenNode::Token(Token {
unspanned: UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot),
span,
}) => {
peeked.commit();
token_nodes.color_shape(FlatShape::DotDot.spanned(span));
Ok(())
}
token => Err(ShellError::type_error("..", token.spanned_type_name())),
}
}
}
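RangeShape wraps the whole `left .. right` parse in `atomic_parse`, so a failure at any step rolls the token cursor back. A minimal self-contained illustration of that checkpoint-and-rollback idea, with invented token and cursor types:

// Tiny backtracking parser over a token slice: parse `Int DotDot Int`,
// restoring the cursor if any step fails (illustrative types only).
#[derive(Debug, Clone, Copy, PartialEq)]
enum Tok {
    Int(i64),
    DotDot,
    Other,
}

struct Cursor<'a> {
    toks: &'a [Tok],
    pos: usize,
}

impl<'a> Cursor<'a> {
    fn atomic<T>(&mut self, f: impl FnOnce(&mut Self) -> Option<T>) -> Option<T> {
        let saved = self.pos;
        let result = f(self);
        if result.is_none() {
            // roll back on failure, like an uncommitted checkpoint
            self.pos = saved;
        }
        result
    }

    fn next(&mut self) -> Option<Tok> {
        let tok = self.toks.get(self.pos).copied();
        if tok.is_some() {
            self.pos += 1;
        }
        tok
    }

    fn parse_range(&mut self) -> Option<(i64, i64)> {
        self.atomic(|c| {
            let left = match c.next()? {
                Tok::Int(n) => n,
                _ => return None,
            };
            if c.next()? != Tok::DotDot {
                return None;
            }
            match c.next()? {
                Tok::Int(n) => Some((left, n)),
                _ => None,
            }
        })
    }
}

fn main() {
    let toks = [Tok::Int(1), Tok::DotDot, Tok::Int(5)];
    let mut c = Cursor { toks: &toks, pos: 0 };
    assert_eq!(c.parse_range(), Some((1, 5)));

    let bad = [Tok::Int(1), Tok::Other];
    let mut c = Cursor { toks: &bad, pos: 0 };
    assert_eq!(c.parse_range(), None);
    assert_eq!(c.pos, 0); // cursor was rolled back
}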

View File

@ -1,48 +1,16 @@
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
expand_atom, expand_variable, parse_single_node, AtomicToken, ExpandContext, ExpandExpression,
ExpansionRule, FallibleColorSyntax, FlatShape, ParseError, TestSyntax, UnspannedAtomicToken,
ExpansionRule, FallibleColorSyntax, FlatShape, TestSyntax, UnspannedAtomicToken,
};
use crate::parser::hir::tokens_iterator::Peeked;
use crate::parser::{hir, hir::TokensIterator, UnspannedToken};
use crate::prelude::*;
#[cfg(not(coloring_in_tokens))]
use nu_source::Spanned;
use crate::hir::tokens_iterator::Peeked;
use crate::parse::tokens::UnspannedToken;
use crate::{hir, hir::TokensIterator};
use nu_errors::{ParseError, ShellError};
use nu_source::SpannedItem;
#[derive(Debug, Copy, Clone)]
pub struct StringShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for StringShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(token_nodes, "string", context, ExpansionRule::permissive());
let atom = match atom {
Err(_) => return Ok(()),
Ok(atom) => atom,
};
match atom {
AtomicToken {
unspanned: UnspannedAtomicToken::String { .. },
span,
} => shapes.push((*input).spanned(span)),
other => other.color_tokens(shapes),
}
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for StringShape {
type Info = ();
type Input = FlatShape;
@ -89,7 +57,8 @@ impl ExpandExpression for StringShape {
parse_single_node(token_nodes, "String", |token, token_span, err| {
Ok(match token {
UnspannedToken::GlobPattern
| UnspannedToken::Operator(..)
| UnspannedToken::CompareOperator(..)
| UnspannedToken::EvaluationOperator(..)
| UnspannedToken::ExternalWord => return Err(err.error()),
UnspannedToken::Variable(span) => {
expand_variable(span, token_span, &context.source)
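Like the number shapes, StringShape special-cases the `it` variable when it expands a `Variable` token. A toy sketch of that dispatch, with types invented for illustration:

// Map a variable name to either the special `$it` expression or an
// ordinary variable expression. Illustrative only.
#[derive(Debug, PartialEq)]
enum Expr {
    ItVariable,
    Variable(String),
}

fn expand_variable(name: &str) -> Expr {
    if name == "it" {
        Expr::ItVariable
    } else {
        Expr::Variable(name.to_string())
    }
}

fn main() {
    assert_eq!(expand_variable("it"), Expr::ItVariable);
    assert_eq!(expand_variable("cpu"), Expr::Variable("cpu".into()));
}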

View File

@ -1,15 +1,16 @@
use crate::parser::hir::syntax_shape::{ExpandContext, ExpandSyntax, ParseError};
use crate::parser::parse::tokens::RawNumber;
use crate::parser::parse::tokens::Token;
use crate::parser::parse::unit::Unit;
use crate::parser::{hir::TokensIterator, TokenNode, UnspannedToken};
use crate::prelude::*;
use crate::hir::syntax_shape::{ExpandContext, ExpandSyntax};
use crate::parse::tokens::RawNumber;
use crate::parse::tokens::Token;
use crate::parse::tokens::UnspannedToken;
use crate::parse::unit::Unit;
use crate::{hir::TokensIterator, TokenNode};
use nom::branch::alt;
use nom::bytes::complete::tag;
use nom::character::complete::digit1;
use nom::combinator::{all_consuming, opt, value};
use nom::IResult;
use nu_source::{Span, Spanned};
use nu_errors::ParseError;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem};
#[derive(Debug, Clone)]
pub struct UnitSyntax {

View File

@ -1,16 +1,19 @@
use crate::parser::hir::path::PathMember;
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
color_fallible_syntax, color_fallible_syntax_with, expand_atom, expand_expr, expand_syntax,
parse_single_node, AnyExpressionShape, BareShape, ExpandContext, ExpandExpression,
ExpandSyntax, ExpansionRule, FallibleColorSyntax, FlatShape, ParseError, Peeked, SkipSyntax,
StringShape, TestSyntax, UnspannedAtomicToken, WhitespaceShape,
};
use crate::parser::{
hir, hir::Expression, hir::TokensIterator, Operator, RawNumber, UnspannedToken,
use crate::parse::tokens::{RawNumber, UnspannedToken};
use crate::{hir, hir::Expression, hir::TokensIterator, CompareOperator, EvaluationOperator};
use nu_errors::ShellError;
use nu_protocol::{PathMember, ShellTypeName};
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
Tag, Tagged, TaggedItem, Text,
};
use crate::prelude::*;
use nu_source::{Spanned, Tagged};
use serde::Serialize;
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Copy, Clone)]
@ -38,9 +41,8 @@ impl ExpandExpression for VariablePathShape {
let mut tail: Vec<PathMember> = vec![];
loop {
match DotShape.skip(token_nodes, context) {
Err(_) => break,
Ok(_) => {}
if DotShape.skip(token_nodes, context).is_err() {
break;
}
let member = expand_syntax(&MemberShape, token_nodes, context)?;
@ -54,48 +56,6 @@ impl ExpandExpression for VariablePathShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for VariablePathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| {
// If the head of the token stream is not a variable, fail
color_fallible_syntax(&VariableShape, token_nodes, context, shapes)?;
loop {
// look for a dot at the head of a stream
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
// if there's no dot, we're done
match dot {
Err(_) => break,
Ok(_) => {}
}
// otherwise, look for a member, and if you don't find one, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
}
Ok(())
})
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for VariablePathShape {
type Info = ();
type Input = ();
@ -116,17 +76,16 @@ impl FallibleColorSyntax for VariablePathShape {
loop {
// look for a dot at the head of a stream
let dot = color_fallible_syntax_with(
if color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
);
// if there's no dot, we're done
match dot {
Err(_) => break,
Ok(_) => {}
)
.is_err()
{
// if there's no dot, we're done
break;
}
// otherwise, look for a member, and if you don't find one, fail
@ -141,40 +100,6 @@ impl FallibleColorSyntax for VariablePathShape {
#[derive(Debug, Copy, Clone)]
pub struct PathTailShape;
#[cfg(not(coloring_in_tokens))]
/// The failure mode of `PathTailShape` is a dot followed by a non-member
impl FallibleColorSyntax for PathTailShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
token_nodes.atomic(|token_nodes| loop {
let result = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
match result {
Err(_) => return Ok(()),
Ok(_) => {}
}
// If we've seen a dot but not a member, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
})
}
}
#[cfg(coloring_in_tokens)]
/// The failure mode of `PathTailShape` is a dot followed by a non-member
impl FallibleColorSyntax for PathTailShape {
type Info = ();
@ -198,9 +123,8 @@ impl FallibleColorSyntax for PathTailShape {
context,
);
match result {
Err(_) => return Ok(()),
Ok(_) => {}
if result.is_err() {
return Ok(());
}
// If we've seen a dot but not a member, fail
@ -243,9 +167,8 @@ impl ExpandSyntax for PathTailShape {
let mut tail: Vec<PathMember> = vec![];
loop {
match DotShape.skip(token_nodes, context) {
Err(_) => break,
Ok(_) => {}
if DotShape.skip(token_nodes, context).is_err() {
break;
}
let member = expand_syntax(&MemberShape, token_nodes, context)?;
@ -268,7 +191,7 @@ impl ExpandSyntax for PathTailShape {
#[derive(Debug, Clone)]
pub enum ExpressionContinuation {
DotSuffix(Span, PathMember),
InfixSuffix(Spanned<Operator>, Expression),
InfixSuffix(Spanned<CompareOperator>, Expression),
}
impl PrettyDebugWithSource for ExpressionContinuation {
@ -339,60 +262,6 @@ pub enum ContinuationInfo {
Infix,
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for ExpressionContinuationShape {
type Info = ContinuationInfo;
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<ContinuationInfo, ShellError> {
token_nodes.atomic(|token_nodes| {
// Try to expand a `.`
let dot = color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
token_nodes,
context,
shapes,
);
match dot {
Ok(_) => {
// we found a dot, so let's keep looking for a member; if no member was found, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
Ok(ContinuationInfo::Dot)
}
Err(_) => {
let mut new_shapes = vec![];
let result = token_nodes.atomic(|token_nodes| {
// we didn't find a dot, so let's see if we're looking at an infix. If not found, fail
color_fallible_syntax(&InfixShape, token_nodes, context, &mut new_shapes)?;
// now that we've seen an infix shape, look for any expression. If not found, fail
color_fallible_syntax(
&AnyExpressionShape,
token_nodes,
context,
&mut new_shapes,
)?;
Ok(ContinuationInfo::Infix)
})?;
shapes.extend(new_shapes);
Ok(result)
}
}
})
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for ExpressionContinuationShape {
type Info = ContinuationInfo;
type Input = ();
@ -469,45 +338,6 @@ impl ExpandExpression for VariableShape {
}
}
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for VariableShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let atom = expand_atom(
token_nodes,
"variable",
context,
ExpansionRule::permissive(),
);
let atom = match atom {
Err(err) => return Err(err.into()),
Ok(atom) => atom,
};
match &atom.unspanned {
UnspannedAtomicToken::Variable { .. } => {
shapes.push(FlatShape::Variable.spanned(atom.span));
Ok(())
}
UnspannedAtomicToken::ItVariable { .. } => {
shapes.push(FlatShape::ItVariable.spanned(atom.span));
Ok(())
}
_ => Err(ShellError::type_error("variable", atom.spanned_type_name())),
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for VariableShape {
type Info = ();
type Input = ();
@ -705,57 +535,6 @@ pub fn expand_column_path<'a, 'b>(
#[derive(Debug, Copy, Clone)]
pub struct ColumnPathShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for ColumnPathShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
// If there's not even one member shape, fail
color_fallible_syntax(&MemberShape, token_nodes, context, shapes)?;
loop {
let checkpoint = token_nodes.checkpoint();
match color_fallible_syntax_with(
&ColorableDotShape,
&FlatShape::Dot,
checkpoint.iterator,
context,
shapes,
) {
Err(_) => {
// we already saw at least one member shape, so return successfully
return Ok(());
}
Ok(_) => {
match color_fallible_syntax(&MemberShape, checkpoint.iterator, context, shapes)
{
Err(_) => {
// we saw a dot but not a member (but we saw at least one member),
// so don't commit the dot but return successfully
return Ok(());
}
Ok(_) => {
// we saw a dot and a member, so commit it and continue on
checkpoint.commit();
}
}
}
}
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for ColumnPathShape {
type Info = ();
type Input = ();
@ -849,45 +628,6 @@ impl ExpandSyntax for ColumnPathShape {
#[derive(Debug, Copy, Clone)]
pub struct MemberShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for MemberShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let bare = color_fallible_syntax_with(
&BareShape,
&FlatShape::BareMember,
token_nodes,
context,
shapes,
);
match bare {
Ok(_) => return Ok(()),
Err(_) => {
// If we don't have a bare word, we'll look for a string
}
}
// Look for a string token. If we don't find one, fail
color_fallible_syntax_with(
&StringShape,
&FlatShape::StringMember,
token_nodes,
context,
shapes,
)
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for MemberShape {
type Info = ();
type Input = ();
@ -905,13 +645,12 @@ impl FallibleColorSyntax for MemberShape {
let bare =
color_fallible_syntax_with(&BareShape, &FlatShape::BareMember, token_nodes, context);
match bare {
Ok(_) => return Ok(()),
Err(_) => {
// If we don't have a bare word, we'll look for a string
}
if bare.is_ok() {
return Ok(());
}
// If we don't have a bare word, we'll look for a string
// Look for a string token. If we don't find one, fail
color_fallible_syntax_with(&StringShape, &FlatShape::StringMember, token_nodes, context)
}
@ -944,7 +683,11 @@ impl ExpandSyntax for IntMemberShape {
UnspannedAtomicToken::Number {
number: RawNumber::Int(int),
} => Ok(Member::Int(
BigInt::from_str(int.slice(context.source)).unwrap(),
BigInt::from_str(int.slice(context.source)).map_err(|_| {
ParseError::internal_error(
"can't convert from string to big int".spanned(int),
)
})?,
int,
)),
@ -952,7 +695,7 @@ impl ExpandSyntax for IntMemberShape {
let int = BigInt::from_str(text.slice(context.source));
match int {
Ok(int) => return Ok(Member::Int(int, text)),
Ok(int) => Ok(Member::Int(int, text)),
Err(_) => Err(ParseError::mismatch("integer member", "word".spanned(text))),
}
}
@ -993,7 +736,9 @@ impl ExpandSyntax for MemberShape {
if let Some(peeked) = number {
let node = peeked.not_eof("column")?.commit();
let (n, span) = node.as_number().unwrap();
let (n, span) = node.as_number().ok_or_else(|| {
ParseError::internal_error("can't convert node to number".spanned(node.span()))
})?;
return Ok(Member::Number(n, span))
}*/
@ -1002,7 +747,9 @@ impl ExpandSyntax for MemberShape {
if let Some(peeked) = string {
let node = peeked.not_eof("column")?.commit();
let (outer, inner) = node.as_string().unwrap();
let (outer, inner) = node.as_string().ok_or_else(|| {
ParseError::internal_error("can't convert node to string".spanned(node.span()))
})?;
return Ok(Member::String(outer, inner));
}
@ -1017,33 +764,6 @@ pub struct DotShape;
#[derive(Debug, Copy, Clone)]
pub struct ColorableDotShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for ColorableDotShape {
type Info = ();
type Input = FlatShape;
fn color_syntax<'a, 'b>(
&self,
input: &FlatShape,
token_nodes: &'b mut TokensIterator<'a>,
_context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let peeked = token_nodes.peek_any().not_eof("dot")?;
match peeked.node {
node if node.is_dot() => {
peeked.commit();
shapes.push((*input).spanned(node.span()));
Ok(())
}
other => Err(ShellError::type_error("dot", other.spanned_type_name())),
}
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for ColorableDotShape {
type Info = ();
type Input = FlatShape;
@ -1101,7 +821,7 @@ impl ExpandSyntax for DotShape {
) -> Result<Self::Output, ParseError> {
parse_single_node(token_nodes, "dot", |token, token_span, _| {
Ok(match token {
UnspannedToken::Operator(Operator::Dot) => token_span,
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => token_span,
_ => {
return Err(ParseError::mismatch(
"dot",
@ -1116,52 +836,6 @@ impl ExpandSyntax for DotShape {
#[derive(Debug, Copy, Clone)]
pub struct InfixShape;
#[cfg(not(coloring_in_tokens))]
impl FallibleColorSyntax for InfixShape {
type Info = ();
type Input = ();
fn color_syntax<'a, 'b>(
&self,
_input: &(),
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
outer_shapes: &mut Vec<Spanned<FlatShape>>,
) -> Result<(), ShellError> {
let checkpoint = token_nodes.checkpoint();
let mut shapes = vec![];
// An infix operator must be prefixed by whitespace. If no whitespace was found, fail
color_fallible_syntax(&WhitespaceShape, checkpoint.iterator, context, &mut shapes)?;
// Parse the next TokenNode after the whitespace
parse_single_node(
checkpoint.iterator,
"infix operator",
|token, token_span, err| {
match token {
// If it's an operator (and not `.`), it's a match
UnspannedToken::Operator(operator) if operator != Operator::Dot => {
shapes.push(FlatShape::Operator.spanned(token_span));
Ok(())
}
// Otherwise, it's not a match
_ => Err(err.error()),
}
},
)?;
// An infix operator must be followed by whitespace. If no whitespace was found, fail
color_fallible_syntax(&WhitespaceShape, checkpoint.iterator, context, &mut shapes)?;
outer_shapes.extend(shapes);
checkpoint.commit();
Ok(())
}
}
#[cfg(coloring_in_tokens)]
impl FallibleColorSyntax for InfixShape {
type Info = ();
type Input = ();
@ -1188,9 +862,7 @@ impl FallibleColorSyntax for InfixShape {
|token, token_span, _| {
match token {
// If it's an operator (and not `.`), it's a match
UnspannedToken::Operator(operator) if operator != Operator::Dot => {
Ok(token_span)
}
UnspannedToken::CompareOperator(_operator) => Ok(token_span),
// Otherwise, it's not a match
_ => Err(ParseError::mismatch(
@ -1203,7 +875,7 @@ impl FallibleColorSyntax for InfixShape {
checkpoint
.iterator
.color_shape(FlatShape::Operator.spanned(operator_span));
.color_shape(FlatShape::CompareOperator.spanned(operator_span));
// An infix operator must be followed by whitespace. If no whitespace was found, fail
color_fallible_syntax(&WhitespaceShape, checkpoint.iterator, context)?;
@ -1263,7 +935,7 @@ impl ExpandSyntax for InfixShape {
#[derive(Debug, Clone)]
pub struct InfixInnerSyntax {
pub operator: Spanned<Operator>,
pub operator: Spanned<CompareOperator>,
}
impl HasSpan for InfixInnerSyntax {
@ -1295,12 +967,10 @@ impl ExpandSyntax for InfixInnerShape {
) -> Result<Self::Output, ParseError> {
parse_single_node(token_nodes, "infix operator", |token, token_span, err| {
Ok(match token {
// If it's an operator (and not `.`), it's a match
UnspannedToken::Operator(operator) if operator != Operator::Dot => {
InfixInnerSyntax {
operator: operator.spanned(token_span),
}
}
// If it's a comparison operator, it's a match
UnspannedToken::CompareOperator(operator) => InfixInnerSyntax {
operator: operator.spanned(token_span),
},
// Otherwise, it's not a match
_ => return Err(err.error()),
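Several hunks in this file replace `match result { Err(_) => break, Ok(_) => {} }` with the `if result.is_err() { break; }` form clippy prefers, most visibly in the loops that consume `.member` pairs of a column path. A small standalone loop in the same spirit; the string-based helper is invented for illustration:

// Illustrative dotted-path tail loop: keep taking `.member` pairs and
// stop as soon as the next dot or member is missing.
fn parse_path_tail(mut rest: &str) -> (Vec<String>, &str) {
    let mut members = Vec::new();
    loop {
        let after_dot = match rest.strip_prefix('.') {
            Some(s) => s,
            None => break, // no dot: the tail is finished
        };
        let end = after_dot
            .find(|c: char| !c.is_alphanumeric() && c != '_')
            .unwrap_or(after_dot.len());
        if end == 0 {
            break; // a dot with no member after it: stop before consuming it
        }
        members.push(after_dot[..end].to_string());
        rest = &after_dot[end..];
    }
    (members, rest)
}

fn main() {
    let (members, rest) = parse_path_tail(".name.size | sort-by");
    assert_eq!(members, vec!["name".to_string(), "size".to_string()]);
    assert_eq!(rest, " | sort-by");
}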

View File

@ -1,4 +1,7 @@
use crate::parser::{Delimiter, Flag, FlagKind, Operator, RawNumber, TokenNode, UnspannedToken};
use crate::parse::flag::{Flag, FlagKind};
use crate::parse::operator::EvaluationOperator;
use crate::parse::token_tree::{Delimiter, TokenNode};
use crate::parse::tokens::{RawNumber, UnspannedToken};
use nu_source::{HasSpan, Span, Spanned, SpannedItem, Text};
#[derive(Debug, Copy, Clone)]
@ -7,8 +10,9 @@ pub enum FlatShape {
CloseDelimiter(Delimiter),
ItVariable,
Variable,
Operator,
CompareOperator,
Dot,
DotDot,
InternalCommand,
ExternalCommand,
ExternalWord,
@ -24,12 +28,14 @@ pub enum FlatShape {
Int,
Decimal,
Whitespace,
Separator,
Error,
Comment,
Size { number: Span, unit: Span },
}
impl FlatShape {
pub fn from(token: &TokenNode, source: &Text, shapes: &mut Vec<Spanned<FlatShape>>) -> () {
pub fn from(token: &TokenNode, source: &Text, shapes: &mut Vec<Spanned<FlatShape>>) {
match token {
TokenNode::Token(token) => match token.unspanned {
UnspannedToken::Number(RawNumber::Int(_)) => {
@ -38,10 +44,15 @@ impl FlatShape {
UnspannedToken::Number(RawNumber::Decimal(_)) => {
shapes.push(FlatShape::Decimal.spanned(token.span))
}
UnspannedToken::Operator(Operator::Dot) => {
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => {
shapes.push(FlatShape::Dot.spanned(token.span))
}
UnspannedToken::Operator(_) => shapes.push(FlatShape::Operator.spanned(token.span)),
UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot) => {
shapes.push(FlatShape::DotDot.spanned(token.span))
}
UnspannedToken::CompareOperator(_) => {
shapes.push(FlatShape::CompareOperator.spanned(token.span))
}
UnspannedToken::String(_) => shapes.push(FlatShape::String.spanned(token.span)),
UnspannedToken::Variable(v) if v.slice(source) == "it" => {
shapes.push(FlatShape::ItVariable.spanned(token.span))
@ -73,7 +84,7 @@ impl FlatShape {
}
TokenNode::Pipeline(pipeline) => {
for part in &pipeline.parts {
if let Some(_) = part.pipe {
if part.pipe.is_some() {
shapes.push(FlatShape::Pipe.spanned(part.span()));
}
}
@ -89,6 +100,8 @@ impl FlatShape {
..
}) => shapes.push(FlatShape::ShorthandFlag.spanned(*span)),
TokenNode::Whitespace(_) => shapes.push(FlatShape::Whitespace.spanned(token.span())),
TokenNode::Separator(_) => shapes.push(FlatShape::Separator.spanned(token.span())),
TokenNode::Comment(_) => shapes.push(FlatShape::Comment.spanned(token.span())),
TokenNode::Error(v) => shapes.push(FlatShape::Error.spanned(v.span)),
}
}
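FlatShape::from walks every token node and pushes one spanned shape per token for the highlighter, now including `CompareOperator`, `DotDot`, and `Separator`. A miniature version of that classification over plain words, with all types invented here:

// Sketch of the "flatten tokens into colored shapes" idea: every piece
// contributes one spanned shape for syntax highlighting.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Shape {
    Int,
    CompareOperator,
    Dot,
    DotDot,
    String,
    Whitespace,
    Word,
}

#[derive(Debug, PartialEq, Clone, Copy)]
struct SpannedShape {
    shape: Shape,
    start: usize,
    end: usize,
}

fn classify(word: &str, start: usize, shapes: &mut Vec<SpannedShape>) {
    let shape = match word {
        ".." => Shape::DotDot,
        "." => Shape::Dot,
        "==" | "!=" | "<" | ">" | "<=" | ">=" => Shape::CompareOperator,
        w if w.chars().all(|c| c.is_ascii_digit()) => Shape::Int,
        w if w.starts_with('"') => Shape::String,
        w if w.chars().all(|c| c.is_whitespace()) => Shape::Whitespace,
        _ => Shape::Word,
    };
    shapes.push(SpannedShape { shape, start, end: start + word.len() });
}

fn main() {
    let mut shapes = Vec::new();
    classify("1", 0, &mut shapes);
    classify("..", 1, &mut shapes);
    classify("5", 3, &mut shapes);
    assert_eq!(shapes[1].shape, Shape::DotDot);
    assert_eq!(shapes[2], SpannedShape { shape: Shape::Int, start: 3, end: 4 });
}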

View File

@ -1,38 +1,24 @@
pub(crate) mod debug;
use self::debug::{ColorTracer, ExpandTracer};
use crate::errors::ShellError;
#[cfg(coloring_in_tokens)]
use crate::parser::hir::syntax_shape::FlatShape;
use crate::parser::hir::Expression;
use crate::parser::TokenNode;
use crate::prelude::*;
#[allow(unused)]
use getset::{Getters, MutGetters};
use nu_source::Spanned;
cfg_if::cfg_if! {
if #[cfg(coloring_in_tokens)] {
#[derive(Getters, Debug)]
pub struct TokensIteratorState<'content> {
tokens: &'content [TokenNode],
span: Span,
skip_ws: bool,
index: usize,
seen: indexmap::IndexSet<usize>,
#[get = "pub"]
shapes: Vec<Spanned<FlatShape>>,
}
} else {
#[derive(Getters, Debug)]
pub struct TokensIteratorState<'content> {
tokens: &'content [TokenNode],
span: Span,
skip_ws: bool,
index: usize,
seen: indexmap::IndexSet<usize>,
}
}
use crate::hir::syntax_shape::FlatShape;
use crate::hir::Expression;
use crate::TokenNode;
use getset::{Getters, MutGetters};
use nu_errors::{ParseError, ShellError};
use nu_protocol::SpannedTypeName;
use nu_source::{HasFallibleSpan, HasSpan, Span, Spanned, SpannedItem, Tag, Text};
#[derive(Getters, Debug)]
pub struct TokensIteratorState<'content> {
tokens: &'content [TokenNode],
span: Span,
skip_ws: bool,
index: usize,
seen: indexmap::IndexSet<usize>,
#[get = "pub"]
shapes: Vec<Spanned<FlatShape>>,
}
#[derive(Getters, MutGetters, Debug)]
@ -53,7 +39,7 @@ pub struct Checkpoint<'content, 'me> {
pub(crate) iterator: &'me mut TokensIterator<'content>,
index: usize,
seen: indexmap::IndexSet<usize>,
#[cfg(coloring_in_tokens)]
shape_start: usize,
committed: bool,
}
@ -71,7 +57,7 @@ impl<'content, 'me> std::ops::Drop for Checkpoint<'content, 'me> {
state.index = self.index;
state.seen = self.seen.clone();
#[cfg(coloring_in_tokens)]
state.shapes.truncate(self.shape_start);
}
}
@ -81,8 +67,8 @@ impl<'content, 'me> std::ops::Drop for Checkpoint<'content, 'me> {
pub struct Peeked<'content, 'me> {
pub(crate) node: Option<&'content TokenNode>,
iterator: &'me mut TokensIterator<'content>,
from: usize,
to: usize,
pub from: usize,
pub to: usize,
}
impl<'content, 'me> Peeked<'content, 'me> {
@ -115,7 +101,7 @@ impl<'content, 'me> Peeked<'content, 'me> {
}
pub fn type_error(&self, expected: &'static str) -> ParseError {
peek_error(&self.node, self.iterator.eof_span(), expected)
peek_error(self.node, self.iterator.eof_span(), expected)
}
}
@ -143,14 +129,14 @@ impl<'content, 'me> PeekedNode<'content, 'me> {
pub fn rollback(self) {}
pub fn type_error(&self, expected: &'static str) -> ParseError {
peek_error(&Some(self.node), self.iterator.eof_span(), expected)
peek_error(Some(self.node), self.iterator.eof_span(), expected)
}
}
pub fn peek_error(node: &Option<&TokenNode>, eof_span: Span, expected: &'static str) -> ParseError {
pub fn peek_error(node: Option<&TokenNode>, eof_span: Span, expected: &'static str) -> ParseError {
match node {
None => ParseError::unexpected_eof(expected, eof_span),
Some(node) => ParseError::mismatch(expected, node.type_name().spanned(node.span())),
Some(node) => ParseError::mismatch(expected, node.spanned_type_name()),
}
}
@ -161,34 +147,17 @@ impl<'content> TokensIterator<'content> {
source: Text,
skip_ws: bool,
) -> TokensIterator<'content> {
cfg_if::cfg_if! {
if #[cfg(coloring_in_tokens)] {
TokensIterator {
state: TokensIteratorState {
tokens: items,
span,
skip_ws,
index: 0,
seen: indexmap::IndexSet::new(),
shapes: vec![],
},
color_tracer: ColorTracer::new(source.clone()),
expand_tracer: ExpandTracer::new(source.clone()),
}
} else {
TokensIterator {
state: TokensIteratorState {
tokens: items,
span,
skip_ws,
index: 0,
seen: indexmap::IndexSet::new(),
},
color_tracer: ColorTracer::new(source.clone()),
expand_tracer: ExpandTracer::new(source.clone()),
}
}
TokensIterator {
state: TokensIteratorState {
tokens: items,
span,
skip_ws,
index: 0,
seen: indexmap::IndexSet::new(),
shapes: vec![],
},
color_tracer: ColorTracer::new(source.clone()),
expand_tracer: ExpandTracer::new(source),
}
}
@ -204,6 +173,10 @@ impl<'content> TokensIterator<'content> {
self.state.tokens.len()
}
pub fn is_empty(&self) -> bool {
self.len() == 0
}
pub fn spanned<T>(
&mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> T,
@ -217,13 +190,11 @@ impl<'content> TokensIterator<'content> {
result.spanned(start.until(end))
}
#[cfg(coloring_in_tokens)]
pub fn color_shape(&mut self, shape: Spanned<FlatShape>) {
self.with_color_tracer(|_, tracer| tracer.add_shape(shape));
self.state.shapes.push(shape);
}
#[cfg(coloring_in_tokens)]
pub fn mutate_shapes(&mut self, block: impl FnOnce(&mut Vec<Spanned<FlatShape>>)) {
let new_shapes: Vec<Spanned<FlatShape>> = {
let shapes = &mut self.state.shapes;
@ -239,13 +210,11 @@ impl<'content> TokensIterator<'content> {
});
}
#[cfg(coloring_in_tokens)]
pub fn silently_mutate_shapes(&mut self, block: impl FnOnce(&mut Vec<Spanned<FlatShape>>)) {
let shapes = &mut self.state.shapes;
block(shapes);
}
#[cfg(coloring_in_tokens)]
pub fn sort_shapes(&mut self) {
// This is pretty dubious, but it works. We should look into a better algorithm that doesn't end up requiring
// this solution.
@ -255,7 +224,6 @@ impl<'content> TokensIterator<'content> {
.sort_by(|a, b| a.span.start().cmp(&b.span.start()));
}
#[cfg(coloring_in_tokens)]
pub fn child<'me, T>(
&'me mut self,
tokens: Spanned<&'me [TokenNode]>,
@ -268,58 +236,7 @@ impl<'content> TokensIterator<'content> {
let mut color_tracer = ColorTracer::new(source.clone());
std::mem::swap(&mut color_tracer, &mut self.color_tracer);
let mut expand_tracer = ExpandTracer::new(source.clone());
std::mem::swap(&mut expand_tracer, &mut self.expand_tracer);
cfg_if::cfg_if! {
if #[cfg(coloring_in_tokens)] {
let mut iterator = TokensIterator {
state: TokensIteratorState {
tokens: tokens.item,
span: tokens.span,
skip_ws: false,
index: 0,
seen: indexmap::IndexSet::new(),
shapes,
},
color_tracer,
expand_tracer,
};
} else {
let mut iterator = TokensIterator {
state: TokensIteratorState {
tokens: tokens.item,
span: tokens.span,
skip_ws: false,
index: 0,
seen: indexmap::IndexSet::new(),
},
color_tracer,
expand_tracer,
};
}
}
let result = block(&mut iterator);
std::mem::swap(&mut iterator.state.shapes, &mut self.state.shapes);
std::mem::swap(&mut iterator.color_tracer, &mut self.color_tracer);
std::mem::swap(&mut iterator.expand_tracer, &mut self.expand_tracer);
result
}
#[cfg(not(coloring_in_tokens))]
pub fn child<'me, T>(
&'me mut self,
tokens: Spanned<&'me [TokenNode]>,
source: Text,
block: impl FnOnce(&mut TokensIterator<'me>) -> T,
) -> T {
let mut color_tracer = ColorTracer::new(source.clone());
std::mem::swap(&mut color_tracer, &mut self.color_tracer);
let mut expand_tracer = ExpandTracer::new(source.clone());
let mut expand_tracer = ExpandTracer::new(source);
std::mem::swap(&mut expand_tracer, &mut self.expand_tracer);
let mut iterator = TokensIterator {
@ -329,6 +246,7 @@ impl<'content> TokensIterator<'content> {
skip_ws: false,
index: 0,
seen: indexmap::IndexSet::new(),
shapes,
},
color_tracer,
expand_tracer,
@ -336,6 +254,7 @@ impl<'content> TokensIterator<'content> {
let result = block(&mut iterator);
std::mem::swap(&mut iterator.state.shapes, &mut self.state.shapes);
std::mem::swap(&mut iterator.color_tracer, &mut self.color_tracer);
std::mem::swap(&mut iterator.expand_tracer, &mut self.expand_tracer);
@ -362,7 +281,6 @@ impl<'content> TokensIterator<'content> {
block(state, tracer)
}
#[cfg(coloring_in_tokens)]
pub fn color_frame<T>(
&mut self,
desc: &'static str,
@ -455,7 +373,7 @@ impl<'content> TokensIterator<'content> {
let state = &mut self.state;
let index = state.index;
#[cfg(coloring_in_tokens)]
let shape_start = state.shapes.len();
let seen = state.seen.clone();
@ -464,7 +382,7 @@ impl<'content> TokensIterator<'content> {
index,
seen,
committed: false,
#[cfg(coloring_in_tokens)]
shape_start,
}
}
@ -478,7 +396,7 @@ impl<'content> TokensIterator<'content> {
let state = &mut self.state;
let index = state.index;
#[cfg(coloring_in_tokens)]
let shape_start = state.shapes.len();
let seen = state.seen.clone();
@ -487,26 +405,26 @@ impl<'content> TokensIterator<'content> {
index,
seen,
committed: false,
#[cfg(coloring_in_tokens)]
shape_start,
};
let value = block(checkpoint.iterator)?;
checkpoint.commit();
return Ok(value);
Ok(value)
}
/// Use a checkpoint when you need to peek more than one token ahead, but can't be sure
/// that you'll succeed.
pub fn atomic_parse<'me, T>(
pub fn atomic_parse<'me, T, E>(
&'me mut self,
block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, ParseError>,
) -> Result<T, ParseError> {
block: impl FnOnce(&mut TokensIterator<'content>) -> Result<T, E>,
) -> Result<T, E> {
let state = &mut self.state;
let index = state.index;
#[cfg(coloring_in_tokens)]
let shape_start = state.shapes.len();
let seen = state.seen.clone();
@ -515,17 +433,16 @@ impl<'content> TokensIterator<'content> {
index,
seen,
committed: false,
#[cfg(coloring_in_tokens)]
shape_start,
};
let value = block(checkpoint.iterator)?;
checkpoint.commit();
return Ok(value);
Ok(value)
}
#[cfg(coloring_in_tokens)]
/// Use a checkpoint when you need to peek more than one token ahead, but can't be sure
/// that you'll succeed.
pub fn atomic_returning_shapes<'me, T>(
@ -560,7 +477,7 @@ impl<'content> TokensIterator<'content> {
checkpoint.commit();
std::mem::swap(&mut self.state.shapes, &mut shapes);
return (Ok(value), shapes);
(Ok(value), shapes)
}
fn eof_span(&self) -> Span {
@ -641,7 +558,7 @@ impl<'content> TokensIterator<'content> {
// index: state.index,
// seen: state.seen.clone(),
// skip_ws: state.skip_ws,
// #[cfg(coloring_in_tokens)]
//
// shapes: state.shapes.clone(),
// },
// color_tracer: self.color_tracer.clone(),
@ -669,12 +586,12 @@ impl<'content> TokensIterator<'content> {
let peeked = peeked.not_eof(expected);
match peeked {
Err(err) => return Err(err),
Err(err) => Err(err),
Ok(peeked) => match block(peeked.node) {
Err(err) => return Err(err),
Err(err) => Err(err),
Ok(val) => {
peeked.commit();
return Ok(val);
Ok(val)
}
},
}
@ -744,10 +661,7 @@ fn peek<'content, 'me>(
}
}
fn peek_pos<'content, 'me>(
iterator: &'me TokensIterator<'content>,
skip_ws: bool,
) -> Option<usize> {
fn peek_pos(iterator: &TokensIterator<'_>, skip_ws: bool) -> Option<usize> {
let state = iterator.state();
let mut to = state.index;
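With `coloring_in_tokens` gone, every checkpoint records both the cursor index and the current shape count, and rolls both back on drop unless it is committed. A self-contained sketch of that pattern, using illustrative names:

// Remember the cursor and the shape count, and roll both back on Drop
// unless the caller explicitly commits.
#[derive(Default)]
struct State {
    index: usize,
    shapes: Vec<&'static str>,
}

struct Checkpoint<'a> {
    state: &'a mut State,
    index: usize,
    shape_start: usize,
    committed: bool,
}

impl<'a> Checkpoint<'a> {
    fn new(state: &'a mut State) -> Self {
        let index = state.index;
        let shape_start = state.shapes.len();
        Checkpoint { state, index, shape_start, committed: false }
    }

    fn commit(mut self) {
        self.committed = true;
    }
}

impl<'a> Drop for Checkpoint<'a> {
    fn drop(&mut self) {
        if !self.committed {
            // undo everything recorded since the checkpoint was taken
            self.state.index = self.index;
            self.state.shapes.truncate(self.shape_start);
        }
    }
}

fn main() {
    let mut state = State::default();

    // A failed parse: the checkpoint is dropped without commit.
    {
        let cp = Checkpoint::new(&mut state);
        cp.state.index += 2;
        cp.state.shapes.push("int");
        // no commit, so Drop rolls back
    }
    assert_eq!(state.index, 0);
    assert!(state.shapes.is_empty());

    // A successful parse commits, keeping the progress.
    {
        let cp = Checkpoint::new(&mut state);
        cp.state.index += 1;
        cp.state.shapes.push("dot");
        cp.commit();
    }
    assert_eq!(state.index, 1);
    assert_eq!(state.shapes, vec!["dot"]);
}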

View File

@ -6,7 +6,7 @@ pub(crate) mod expand_trace;
pub(crate) use self::color_trace::*;
pub(crate) use self::expand_trace::*;
use crate::parser::hir::tokens_iterator::TokensIteratorState;
use crate::hir::tokens_iterator::TokensIteratorState;
use nu_source::{PrettyDebug, PrettyDebugWithSource, Text};
#[derive(Debug)]
@ -24,13 +24,11 @@ pub(crate) fn debug_tokens(state: &TokensIteratorState, source: &str) -> Vec<Deb
out.push(DebugIteratorToken::Cursor);
}
let msg = token.debug(source).to_string();
if state.seen.contains(&i) {
out.push(DebugIteratorToken::Seen(format!("{}", token.debug(source))));
out.push(DebugIteratorToken::Seen(msg));
} else {
out.push(DebugIteratorToken::Unseen(format!(
"{}",
token.debug(source)
)));
out.push(DebugIteratorToken::Unseen(msg));
}
}

View File

@ -1,9 +1,8 @@
use crate::errors::ShellError;
use crate::parser::hir::syntax_shape::FlatShape;
use crate::prelude::*;
use crate::hir::syntax_shape::FlatShape;
use ansi_term::Color;
use log::trace;
use nu_source::Spanned;
use nu_errors::ShellError;
use nu_source::{Spanned, Text};
use ptree::*;
use std::borrow::Cow;
use std::io;
@ -50,7 +49,7 @@ pub struct ColorFrame {
impl ColorFrame {
fn colored_leaf_description(&self, f: &mut impl io::Write) -> io::Result<()> {
if self.has_only_error_descendents() {
if self.children.len() == 0 {
if self.children.is_empty() {
write!(
f,
"{}",
@ -110,14 +109,10 @@ impl ColorFrame {
fn any_child_shape(&self, predicate: impl Fn(Spanned<FlatShape>) -> bool) -> bool {
for item in &self.children {
match item {
FrameChild::Shape(shape) => {
if predicate(*shape) {
return true;
}
if let FrameChild::Shape(shape) = item {
if predicate(*shape) {
return true;
}
_ => {}
}
}
@ -126,14 +121,10 @@ impl ColorFrame {
fn any_child_frame(&self, predicate: impl Fn(&ColorFrame) -> bool) -> bool {
for item in &self.children {
match item {
FrameChild::Frame(frame) => {
if predicate(frame) {
return true;
}
if let FrameChild::Frame(frame) = item {
if predicate(frame) {
return true;
}
_ => {}
}
}
@ -149,7 +140,7 @@ impl ColorFrame {
}
fn has_only_error_descendents(&self) -> bool {
if self.children.len() == 0 {
if self.children.is_empty() {
// if this frame has no children at all, it has only error descendents if this frame
// is an error
self.error.is_some()
@ -260,7 +251,7 @@ impl ColorTracer {
let result = self.frame_stack.pop().expect("Can't pop root tracer frame");
if self.frame_stack.len() == 0 {
if self.frame_stack.is_empty() {
panic!("Can't pop root tracer frame {:#?}", self);
}

View File

@ -1,8 +1,9 @@
use crate::parser::hir::Expression;
use crate::prelude::*;
use crate::hir::Expression;
use ansi_term::Color;
use log::trace;
use nu_source::DebugDoc;
use nu_errors::ParseError;
use nu_protocol::ShellTypeName;
use nu_source::{DebugDoc, PrettyDebug, PrettyDebugWithSource, Text};
use ptree::*;
use std::borrow::Cow;
use std::io;
@ -18,7 +19,7 @@ impl FrameChild {
fn get_error_leaf(&self) -> Option<&'static str> {
match self {
FrameChild::Frame(frame) if frame.error.is_some() => {
if frame.children.len() == 0 {
if frame.children.is_empty() {
Some(frame.description)
} else {
None
@ -32,12 +33,12 @@ impl FrameChild {
match self {
FrameChild::Expr(expr) => TreeChild::OkExpr(expr.clone(), text.clone()),
FrameChild::Result(result) => {
let result = format!("{}", result.display());
let result = result.display();
TreeChild::OkNonExpr(result)
}
FrameChild::Frame(frame) => {
if frame.error.is_some() {
if frame.children.len() == 0 {
if frame.children.is_empty() {
TreeChild::ErrorLeaf(vec![frame.description])
} else {
TreeChild::ErrorFrame(frame.to_tree_frame(text), text.clone())
@ -66,7 +67,7 @@ impl ExprFrame {
if let Some(error_leaf) = child.get_error_leaf() {
errors.push(error_leaf);
continue;
} else if errors.len() > 0 {
} else if !errors.is_empty() {
children.push(TreeChild::ErrorLeaf(errors));
errors = vec![];
}
@ -74,7 +75,7 @@ impl ExprFrame {
children.push(child.to_tree_child(text));
}
if errors.len() > 0 {
if !errors.is_empty() {
children.push(TreeChild::ErrorLeaf(errors));
}
@ -114,22 +115,20 @@ impl TreeFrame {
write!(f, " -> ")?;
self.children[0].leaf_description(f)
} else {
if self.error.is_some() {
if self.children.len() == 0 {
write!(
f,
"{}",
Color::White.bold().on(Color::Red).paint(self.description)
)
} else {
write!(f, "{}", Color::Red.normal().paint(self.description))
}
} else if self.has_descendent_green() {
write!(f, "{}", Color::Green.normal().paint(self.description))
} else if self.error.is_some() {
if self.children.is_empty() {
write!(
f,
"{}",
Color::White.bold().on(Color::Red).paint(self.description)
)
} else {
write!(f, "{}", Color::Yellow.bold().paint(self.description))
write!(f, "{}", Color::Red.normal().paint(self.description))
}
} else if self.has_descendent_green() {
write!(f, "{}", Color::Green.normal().paint(self.description))
} else {
write!(f, "{}", Color::Yellow.bold().paint(self.description))
}
}
@ -142,14 +141,10 @@ impl TreeFrame {
fn any_child_frame(&self, predicate: impl Fn(&TreeFrame) -> bool) -> bool {
for item in &self.children {
match item {
TreeChild::OkFrame(frame, ..) => {
if predicate(frame) {
return true;
}
if let TreeChild::OkFrame(frame, ..) = item {
if predicate(frame) {
return true;
}
_ => {}
}
}
@ -208,7 +203,7 @@ impl TreeChild {
Color::White
.bold()
.on(Color::Green)
.paint(format!("{}", result))
.paint(result.to_string())
),
TreeChild::ErrorLeaf(desc) => {
@ -259,12 +254,7 @@ pub struct ExpandTracer {
impl ExpandTracer {
pub fn print(&self, source: Text) -> PrintTracer {
let root = self
.frame_stack
.iter()
.nth(0)
.unwrap()
.to_tree_frame(&source);
let root = self.frame_stack[0].to_tree_frame(&source);
PrintTracer { root, source }
}
@ -291,7 +281,7 @@ impl ExpandTracer {
fn pop_frame(&mut self) -> ExprFrame {
let result = self.frame_stack.pop().expect("Can't pop root tracer frame");
if self.frame_stack.len() == 0 {
if self.frame_stack.is_empty() {
panic!("Can't pop root tracer frame");
}
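The expand tracer keeps a stack of frames: entering a shape pushes a frame, leaving pops it and attaches it to its parent, and popping the root is a bug. A compact sketch of that structure; fields and names are illustrative:

// Minimal frame-stack tracer: frames nest into a tree as shapes finish.
#[derive(Debug)]
struct Frame {
    description: &'static str,
    children: Vec<Frame>,
    error: Option<String>,
}

#[derive(Debug)]
struct Tracer {
    stack: Vec<Frame>,
}

impl Tracer {
    fn new() -> Self {
        Tracer {
            stack: vec![Frame { description: "root", children: vec![], error: None }],
        }
    }

    fn start(&mut self, description: &'static str) {
        self.stack.push(Frame { description, children: vec![], error: None });
    }

    fn finish(&mut self, error: Option<String>) {
        let mut frame = self.stack.pop().expect("can't pop root tracer frame");
        if self.stack.is_empty() {
            panic!("can't pop root tracer frame");
        }
        frame.error = error;
        if let Some(parent) = self.stack.last_mut() {
            parent.children.push(frame);
        }
    }
}

fn main() {
    let mut tracer = Tracer::new();
    tracer.start("range");
    tracer.start("integer");
    tracer.finish(None);
    tracer.finish(Some("expected ..".to_string()));
    assert_eq!(tracer.stack.len(), 1);
    assert_eq!(tracer.stack[0].children[0].description, "range");
}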

View File

@ -0,0 +1,16 @@
use crate::hir::TokensIterator;
use crate::parse::token_tree_builder::TokenTreeBuilder as b;
use crate::Span;
#[test]
fn supplies_tokens() -> Result<(), Box<dyn std::error::Error>> {
let tokens = b::token_list(vec![b::var("it"), b::op("."), b::bare("cpu")]);
let (tokens, _) = b::build(tokens);
let tokens = tokens.expect_list();
let mut iterator = TokensIterator::all(tokens, Span::unknown());
iterator.next()?.expect_var();
iterator.next()?.expect_dot();
iterator.next()?.expect_bare();
Ok(())
}

View File

@ -0,0 +1,43 @@
#![allow(clippy::large_enum_variant, clippy::type_complexity)]
pub mod commands;
pub mod hir;
pub mod parse;
pub mod parse_command;
pub use crate::commands::classified::{
external::ExternalCommand, internal::InternalCommand, ClassifiedCommand, ClassifiedPipeline,
};
pub use crate::hir::syntax_shape::flat_shape::FlatShape;
pub use crate::hir::syntax_shape::{
expand_syntax, ExpandContext, ExpandSyntax, PipelineShape, SignatureRegistry,
};
pub use crate::hir::tokens_iterator::TokensIterator;
pub use crate::parse::files::Files;
pub use crate::parse::flag::Flag;
pub use crate::parse::operator::{CompareOperator, EvaluationOperator};
pub use crate::parse::parser::Number;
pub use crate::parse::parser::{module, pipeline};
pub use crate::parse::token_tree::{Delimiter, TokenNode};
pub use crate::parse::token_tree_builder::TokenTreeBuilder;
use nu_errors::ShellError;
use nu_source::nom_input;
pub fn parse(input: &str) -> Result<TokenNode, ShellError> {
let _ = pretty_env_logger::try_init();
match pipeline(nom_input(input)) {
Ok((_rest, val)) => Ok(val),
Err(err) => Err(ShellError::parse_error(err)),
}
}
pub fn parse_script(input: &str) -> Result<TokenNode, ShellError> {
let _ = pretty_env_logger::try_init();
match module(nom_input(input)) {
Ok((_rest, val)) => Ok(val),
Err(err) => Err(ShellError::parse_error(err)),
}
}
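This new lib.rs makes the token tree reachable from outside the crate through `parse` and `parse_script`. A hypothetical consumer, assuming the `nu-parser` crate is available as a dependency under the name `nu_parser` and that `TokenNode` and `ShellError` implement `Debug` as shown elsewhere in this diff:

// Parse a pipeline into a TokenNode and inspect it. The input string
// is arbitrary.
use nu_parser::parse;

fn main() {
    match parse("ls | where size > 10kb") {
        Ok(token_tree) => println!("{:?}", token_tree),
        Err(err) => eprintln!("parse error: {:?}", err),
    }
}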

View File

@ -1,4 +1,5 @@
pub(crate) mod call_node;
pub(crate) mod comment;
pub(crate) mod files;
pub(crate) mod flag;
pub(crate) mod operator;

View File

@ -1,6 +1,6 @@
use crate::parser::TokenNode;
use crate::prelude::*;
use crate::TokenNode;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource};
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Getters)]
pub struct CallNode {
@ -30,7 +30,7 @@ impl PrettyDebugWithSource for CallNode {
impl CallNode {
pub fn new(head: Box<TokenNode>, children: Vec<TokenNode>) -> CallNode {
if children.len() == 0 {
if children.is_empty() {
CallNode {
head,
children: None,

View File

@ -0,0 +1,42 @@
use derive_new::new;
use getset::Getters;
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum CommentKind {
Line,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Getters, new)]
pub struct Comment {
pub(crate) kind: CommentKind,
pub(crate) text: Span,
pub(crate) span: Span,
}
impl Comment {
pub fn line(text: impl Into<Span>, outer: impl Into<Span>) -> Comment {
Comment {
kind: CommentKind::Line,
text: text.into(),
span: outer.into(),
}
}
}
impl PrettyDebugWithSource for Comment {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
let prefix = match self.kind {
CommentKind::Line => b::description("#"),
};
prefix + b::description(self.text.slice(source))
}
}
impl HasSpan for Comment {
fn span(&self) -> Span {
self.span
}
}
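A `Comment` carries two spans: `text` for the body after the `#` and `span` for the whole comment. A small sketch of how those offsets can be computed for a line comment; the helper is invented for illustration:

// Outer span covers `# text`, text span starts after the `#`.
// Offsets are byte positions into the source.
fn comment_spans(source: &str, start: usize) -> Option<((usize, usize), (usize, usize))> {
    let rest = &source[start..];
    if !rest.starts_with('#') {
        return None;
    }
    let end = start + rest.find('\n').unwrap_or(rest.len());
    let outer = (start, end);
    let text = (start + 1, end);
    Some((text, outer))
}

fn main() {
    let src = "ls # list files\n";
    let (text, outer) = comment_spans(src, 3).expect("comment at offset 3");
    assert_eq!(&src[text.0..text.1], " list files");
    assert_eq!(&src[outer.0..outer.1], "# list files");
}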

View File

@ -0,0 +1,151 @@
use derive_new::new;
use language_reporting::{FileName, Location};
use log::trace;
use nu_source::Span;
#[derive(new, Debug, Clone)]
pub struct Files {
snippet: String,
}
impl language_reporting::ReportingFiles for Files {
type Span = Span;
type FileId = usize;
fn byte_span(
&self,
_file: Self::FileId,
from_index: usize,
to_index: usize,
) -> Option<Self::Span> {
Some(Span::new(from_index, to_index))
}
fn file_id(&self, _tag: Self::Span) -> Self::FileId {
0
}
fn file_name(&self, _file: Self::FileId) -> FileName {
FileName::Verbatim("shell".to_string())
}
fn byte_index(&self, _file: Self::FileId, _line: usize, _column: usize) -> Option<usize> {
unimplemented!("byte_index")
}
fn location(&self, _file: Self::FileId, byte_index: usize) -> Option<Location> {
trace!("finding location for {}", byte_index);
let source = &self.snippet;
let mut seen_lines = 0;
let mut seen_bytes = 0;
for (pos, slice) in source.match_indices('\n') {
trace!(
"searching byte_index={} seen_bytes={} pos={} slice={:?} slice.len={} source={:?}",
byte_index,
seen_bytes,
pos,
slice,
source.len(),
source
);
if pos >= byte_index {
trace!(
"returning {}:{} seen_lines={} byte_index={} pos={} seen_bytes={}",
seen_lines,
byte_index,
pos,
seen_lines,
byte_index,
seen_bytes
);
return Some(language_reporting::Location::new(
seen_lines,
byte_index - pos,
));
} else {
seen_lines += 1;
seen_bytes = pos;
}
}
if seen_lines == 0 {
trace!("seen_lines=0 end={}", source.len() - 1);
// if we got here, there were no newlines in the source
Some(language_reporting::Location::new(0, source.len() - 1))
} else {
trace!(
"last line seen_lines={} end={}",
seen_lines,
source.len() - 1 - byte_index
);
// if we got here and we didn't return, it should mean that we're talking about
// the last line
Some(language_reporting::Location::new(
seen_lines,
source.len() - 1 - byte_index,
))
}
}
fn line_span(&self, _file: Self::FileId, lineno: usize) -> Option<Self::Span> {
trace!("finding line_span for {}", lineno);
let source = &self.snippet;
let mut seen_lines = 0;
let mut seen_bytes = 0;
for (pos, _) in source.match_indices('\n') {
trace!(
"lineno={} seen_lines={} seen_bytes={} pos={}",
lineno,
seen_lines,
seen_bytes,
pos
);
if seen_lines == lineno {
trace!("returning start={} end={}", seen_bytes, pos);
// If the number of seen lines is the lineno, seen_bytes is the start of the
// line and pos is the end of the line
return Some(Span::new(seen_bytes, pos));
} else {
// If it's not, increment seen_lines, and move seen_bytes to the beginning of
// the next line
seen_lines += 1;
seen_bytes = pos + 1;
}
}
if seen_lines == 0 {
trace!("returning start={} end={}", 0, self.snippet.len() - 1);
// if we got here, there were no newlines in the source
Some(Span::new(0, self.snippet.len() - 1))
} else {
trace!(
"returning start={} end={}",
seen_bytes,
self.snippet.len() - 1
);
// if we got here and we didn't return, it should mean that we're talking about
// the last line
Some(Span::new(seen_bytes, self.snippet.len() - 1))
}
}
fn source(&self, span: Self::Span) -> Option<String> {
trace!("source(tag={:?}) snippet={:?}", span, self.snippet);
if span.start() > span.end() || span.end() > self.snippet.len() {
return None;
}
Some(span.slice(&self.snippet).to_string())
}
}
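`Files::location` maps a byte offset to a line/column pair by scanning the snippet for newlines. A simplified sketch of that scan; it skips the real code's no-newline and last-line edge cases:

// Count newlines before the offset and measure the column from the
// start of the current line.
fn location(source: &str, byte_index: usize) -> (usize, usize) {
    let mut line = 0;
    let mut line_start = 0;
    for (pos, _) in source.match_indices('\n') {
        if pos >= byte_index {
            break;
        }
        line += 1;
        line_start = pos + 1;
    }
    (line, byte_index - line_start)
}

fn main() {
    let src = "ls\nwhere size > 10\nsort-by name";
    assert_eq!(location(src, 0), (0, 0));
    assert_eq!(location(src, 9), (1, 6));  // points at "size"
    assert_eq!(location(src, 19), (2, 0)); // start of the third line
}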

View File

@ -1,8 +1,7 @@
use crate::parser::hir::syntax_shape::flat_shape::FlatShape;
use crate::prelude::*;
use crate::hir::syntax_shape::flat_shape::FlatShape;
use derive_new::new;
use getset::Getters;
use nu_source::{Span, Spanned, SpannedItem};
use nu_source::{b, DebugDocBuilder, PrettyDebugWithSource, Span, Spanned, SpannedItem};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]

View File

@ -0,0 +1,114 @@
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum CompareOperator {
Equal,
NotEqual,
LessThan,
GreaterThan,
LessThanOrEqual,
GreaterThanOrEqual,
Contains,
NotContains,
}
impl PrettyDebug for CompareOperator {
fn pretty(&self) -> DebugDocBuilder {
b::operator(self.as_str())
}
}
impl CompareOperator {
pub fn print(self) -> String {
self.as_str().to_string()
}
pub fn as_str(self) -> &'static str {
match self {
CompareOperator::Equal => "==",
CompareOperator::NotEqual => "!=",
CompareOperator::LessThan => "<",
CompareOperator::GreaterThan => ">",
CompareOperator::LessThanOrEqual => "<=",
CompareOperator::GreaterThanOrEqual => ">=",
CompareOperator::Contains => "=~",
CompareOperator::NotContains => "!~",
}
}
}
impl From<&str> for CompareOperator {
fn from(input: &str) -> CompareOperator {
if let Ok(output) = CompareOperator::from_str(input) {
output
} else {
unreachable!("Internal error: CompareOperator from failed")
}
}
}
impl FromStr for CompareOperator {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"==" => Ok(CompareOperator::Equal),
"!=" => Ok(CompareOperator::NotEqual),
"<" => Ok(CompareOperator::LessThan),
">" => Ok(CompareOperator::GreaterThan),
"<=" => Ok(CompareOperator::LessThanOrEqual),
">=" => Ok(CompareOperator::GreaterThanOrEqual),
"=~" => Ok(CompareOperator::Contains),
"!~" => Ok(CompareOperator::NotContains),
_ => Err(()),
}
}
}
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum EvaluationOperator {
Dot,
DotDot,
}
impl PrettyDebug for EvaluationOperator {
fn pretty(&self) -> DebugDocBuilder {
b::operator(self.as_str())
}
}
impl EvaluationOperator {
pub fn print(self) -> String {
self.as_str().to_string()
}
pub fn as_str(self) -> &'static str {
match self {
EvaluationOperator::Dot => ".",
EvaluationOperator::DotDot => "..",
}
}
}
impl From<&str> for EvaluationOperator {
fn from(input: &str) -> EvaluationOperator {
if let Ok(output) = EvaluationOperator::from_str(input) {
output
} else {
unreachable!("Internal error: EvaluationOperator 'from' failed")
}
}
}
impl FromStr for EvaluationOperator {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"." => Ok(EvaluationOperator::Dot),
".." => Ok(EvaluationOperator::DotDot),
_ => Err(()),
}
}
}
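Since `as_str` and `FromStr` are meant to be inverses, a round-trip test is a natural companion to these definitions. A sketch of such a test, assuming it sits in the same module as the enums above:

// Every operator's textual form should parse back to the same variant.
#[cfg(test)]
mod tests {
    use super::{CompareOperator, EvaluationOperator};
    use std::str::FromStr;

    #[test]
    fn compare_operators_round_trip() {
        let all = [
            CompareOperator::Equal,
            CompareOperator::NotEqual,
            CompareOperator::LessThan,
            CompareOperator::GreaterThan,
            CompareOperator::LessThanOrEqual,
            CompareOperator::GreaterThanOrEqual,
            CompareOperator::Contains,
            CompareOperator::NotContains,
        ];
        for op in &all {
            assert_eq!(CompareOperator::from_str(op.as_str()), Ok(*op));
        }
    }

    #[test]
    fn evaluation_operators_round_trip() {
        assert_eq!(EvaluationOperator::from_str("."), Ok(EvaluationOperator::Dot));
        assert_eq!(EvaluationOperator::from_str(".."), Ok(EvaluationOperator::DotDot));
    }
}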

View File

@ -1,8 +1,7 @@
use crate::parser::TokenNode;
use crate::prelude::*;
use crate::TokenNode;
use derive_new::new;
use getset::Getters;
use nu_source::{DebugDocBuilder, PrettyDebugWithSource, Span, Spanned};
use nu_source::{b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned};
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Getters, new)]
pub struct Pipeline {

View File

@ -1,10 +1,12 @@
use crate::errors::{ParseError, ShellError};
use crate::parser::parse::{call_node::*, flag::*, operator::*, pipeline::*, tokens::*};
use crate::prelude::*;
use crate::parse::{call_node::*, comment::*, flag::*, operator::*, pipeline::*, tokens::*};
use derive_new::new;
use getset::Getters;
use nu_source::Spanned;
use nu_source::{Tagged, Text};
use nu_errors::{ParseError, ShellError};
use nu_protocol::ShellTypeName;
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebugWithSource, Span, Spanned, SpannedItem, Tagged,
TaggedItem, Text,
};
use std::fmt;
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd)]
@ -16,7 +18,9 @@ pub enum TokenNode {
Delimited(Spanned<DelimitedNode>),
Pipeline(Pipeline),
Flag(Flag),
Comment(Comment),
Whitespace(Span),
Separator(Span),
Error(Spanned<ShellError>),
}
@ -37,14 +41,32 @@ impl PrettyDebugWithSource for TokenNode {
"whitespace",
b::description(format!("{:?}", space.slice(source))),
),
TokenNode::Separator(span) => b::typed(
"separator",
b::description(format!("{:?}", span.slice(source))),
),
TokenNode::Comment(comment) => {
b::typed("comment", b::description(comment.text.slice(source)))
}
TokenNode::Error(_) => b::error("error"),
}
}
}
impl HasSpan for TokenNode {
fn span(&self) -> Span {
self.get_span()
impl ShellTypeName for TokenNode {
fn type_name(&self) -> &'static str {
match self {
TokenNode::Token(t) => t.type_name(),
TokenNode::Nodes(_) => "nodes",
TokenNode::Call(_) => "command",
TokenNode::Delimited(d) => d.type_name(),
TokenNode::Pipeline(_) => "pipeline",
TokenNode::Flag(_) => "flag",
TokenNode::Whitespace(_) => "whitespace",
TokenNode::Separator(_) => "separator",
TokenNode::Comment(_) => "comment",
TokenNode::Error(_) => "error",
}
}
}
@ -105,12 +127,12 @@ impl fmt::Debug for DebugTokenNode<'_> {
impl From<&TokenNode> for Span {
fn from(token: &TokenNode) -> Span {
token.get_span()
token.span()
}
}
impl TokenNode {
pub fn get_span(&self) -> Span {
impl HasSpan for TokenNode {
fn span(&self) -> Span {
match self {
TokenNode::Token(t) => t.span,
TokenNode::Nodes(t) => t.span,
@ -119,27 +141,14 @@ impl TokenNode {
TokenNode::Pipeline(s) => s.span,
TokenNode::Flag(s) => s.span,
TokenNode::Whitespace(s) => *s,
TokenNode::Separator(s) => *s,
TokenNode::Comment(c) => c.span(),
TokenNode::Error(s) => s.span,
}
}
}
pub fn type_name(&self) -> &'static str {
match self {
TokenNode::Token(t) => t.type_name(),
TokenNode::Nodes(_) => "nodes",
TokenNode::Call(_) => "command",
TokenNode::Delimited(d) => d.type_name(),
TokenNode::Pipeline(_) => "pipeline",
TokenNode::Flag(_) => "flag",
TokenNode::Whitespace(_) => "whitespace",
TokenNode::Error(_) => "error",
}
}
pub fn spanned_type_name(&self) -> Spanned<&'static str> {
self.type_name().spanned(self.span())
}
impl TokenNode {
pub fn tagged_type_name(&self) -> Tagged<&'static str> {
self.type_name().tagged(self.span())
}
@ -242,7 +251,7 @@ impl TokenNode {
pub fn is_dot(&self) -> bool {
match self {
TokenNode::Token(Token {
unspanned: UnspannedToken::Operator(Operator::Dot),
unspanned: UnspannedToken::EvaluationOperator(EvaluationOperator::Dot),
..
}) => true,
_ => false,
@ -340,7 +349,7 @@ pub enum Delimiter {
}
impl Delimiter {
pub(crate) fn open(&self) -> &'static str {
pub(crate) fn open(self) -> &'static str {
match self {
Delimiter::Paren => "(",
Delimiter::Brace => "{",
@ -348,7 +357,7 @@ impl Delimiter {
}
}
pub(crate) fn close(&self) -> &'static str {
pub(crate) fn close(self) -> &'static str {
match self {
Delimiter::Paren => ")",
Delimiter::Brace => "}",
@ -419,7 +428,7 @@ impl TokenNode {
pub fn expect_dot(&self) -> Span {
match self {
TokenNode::Token(Token {
unspanned: UnspannedToken::Operator(Operator::Dot),
unspanned: UnspannedToken::EvaluationOperator(EvaluationOperator::Dot),
span,
}) => *span,
other => panic!("Expected dot, found {:?}", other),

View File

@ -1,24 +1,23 @@
use crate::prelude::*;
use crate::parser::parse::flag::{Flag, FlagKind};
use crate::parser::parse::operator::Operator;
use crate::parser::parse::pipeline::{Pipeline, PipelineElement};
use crate::parser::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parser::parse::tokens::{RawNumber, UnspannedToken};
use crate::parser::CallNode;
use nu_source::Spanned;
use crate::parse::call_node::CallNode;
use crate::parse::comment::Comment;
use crate::parse::flag::{Flag, FlagKind};
use crate::parse::operator::{CompareOperator, EvaluationOperator};
use crate::parse::pipeline::{Pipeline, PipelineElement};
use crate::parse::token_tree::{DelimitedNode, Delimiter, TokenNode};
use crate::parse::tokens::{RawNumber, UnspannedToken};
use bigdecimal::BigDecimal;
use nu_source::{Span, Spanned, SpannedItem};
use num_bigint::BigInt;
#[derive(Default)]
pub struct TokenTreeBuilder {
pos: usize,
output: String,
}
impl TokenTreeBuilder {
pub fn new() -> TokenTreeBuilder {
TokenTreeBuilder {
pos: 0,
output: String::new(),
}
pub fn new() -> Self {
Default::default()
}
}
@ -96,7 +95,7 @@ impl TokenTreeBuilder {
TokenNode::Nodes(input.spanned(span.into()))
}
pub fn op(input: impl Into<Operator>) -> CurriedToken {
pub fn op(input: impl Into<CompareOperator>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
@ -104,12 +103,39 @@ impl TokenTreeBuilder {
b.pos = end;
TokenTreeBuilder::spanned_op(input, Span::new(start, end))
TokenTreeBuilder::spanned_cmp_op(input, Span::new(start, end))
})
}
pub fn spanned_op(input: impl Into<Operator>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(UnspannedToken::Operator(input.into()).into_token(span))
pub fn spanned_cmp_op(input: impl Into<CompareOperator>, span: impl Into<Span>) -> TokenNode {
TokenNode::Token(UnspannedToken::CompareOperator(input.into()).into_token(span))
}
pub fn dot() -> CurriedToken {
Box::new(move |b| {
let (start, end) = b.consume(".");
b.pos = end;
TokenTreeBuilder::spanned_eval_op(".", Span::new(start, end))
})
}
pub fn dotdot() -> CurriedToken {
Box::new(move |b| {
let (start, end) = b.consume("..");
b.pos = end;
TokenTreeBuilder::spanned_eval_op("..", Span::new(start, end))
})
}
pub fn spanned_eval_op(
input: impl Into<EvaluationOperator>,
span: impl Into<Span>,
) -> TokenNode {
TokenNode::Token(UnspannedToken::EvaluationOperator(input.into()).into_token(span))
}
pub fn string(input: impl Into<String>) -> CurriedToken {
@ -291,16 +317,19 @@ impl TokenTreeBuilder {
}
pub fn spanned_call(input: Vec<TokenNode>, span: impl Into<Span>) -> Spanned<CallNode> {
if input.len() == 0 {
if input.is_empty() {
panic!("BUG: spanned call (TODO)")
}
let mut input = input.into_iter();
let head = input.next().unwrap();
let tail = input.collect();
if let Some(head) = input.next() {
let tail = input.collect();
CallNode::new(Box::new(head), tail).spanned(span.into())
CallNode::new(Box::new(head), tail).spanned(span.into())
} else {
unreachable!("Internal error: spanned_call failed")
}
}
fn consume_delimiter(
@ -398,6 +427,36 @@ impl TokenTreeBuilder {
TokenNode::Whitespace(span.into())
}
pub fn sep(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let (start, end) = b.consume(&input);
TokenTreeBuilder::spanned_sep(Span::new(start, end))
})
}
pub fn spanned_sep(span: impl Into<Span>) -> TokenNode {
TokenNode::Separator(span.into())
}
pub fn comment(input: impl Into<String>) -> CurriedToken {
let input = input.into();
Box::new(move |b| {
let outer_start = b.pos;
b.consume("#");
let (start, end) = b.consume(&input);
let outer_end = b.pos;
TokenTreeBuilder::spanned_comment((start, end), (outer_start, outer_end))
})
}
pub fn spanned_comment(input: impl Into<Span>, span: impl Into<Span>) -> TokenNode {
TokenNode::Comment(Comment::line(input, span))
}
fn consume(&mut self, input: &str) -> (usize, usize) {
let start = self.pos;
self.pos += input.len();

View File

@ -1,13 +1,20 @@
use crate::parser::Operator;
use crate::prelude::*;
use nu_source::{Spanned, Text};
use crate::parse::parser::Number;
use crate::{CompareOperator, EvaluationOperator};
use bigdecimal::BigDecimal;
use nu_protocol::ShellTypeName;
use nu_source::{
b, DebugDocBuilder, HasSpan, PrettyDebug, PrettyDebugWithSource, Span, Spanned, SpannedItem,
Text,
};
use num_bigint::BigInt;
use std::fmt;
use std::str::FromStr;
#[derive(Debug, Clone, Copy, Eq, PartialEq, Ord, PartialOrd, Hash)]
pub enum UnspannedToken {
Number(RawNumber),
Operator(Operator),
CompareOperator(CompareOperator),
EvaluationOperator(EvaluationOperator),
String(Span),
Variable(Span),
ExternalCommand(Span),
@ -29,7 +36,9 @@ impl ShellTypeName for UnspannedToken {
fn type_name(&self) -> &'static str {
match self {
UnspannedToken::Number(_) => "number",
UnspannedToken::Operator(..) => "operator",
UnspannedToken::CompareOperator(..) => "comparison operator",
UnspannedToken::EvaluationOperator(EvaluationOperator::Dot) => "dot",
UnspannedToken::EvaluationOperator(EvaluationOperator::DotDot) => "dotdot",
UnspannedToken::String(_) => "string",
UnspannedToken::Variable(_) => "variable",
UnspannedToken::ExternalCommand(_) => "syntax error",
@ -79,9 +88,19 @@ impl RawNumber {
pub(crate) fn to_number(self, source: &Text) -> Number {
match self {
RawNumber::Int(tag) => Number::Int(BigInt::from_str(tag.slice(source)).unwrap()),
RawNumber::Int(tag) => {
if let Ok(int) = BigInt::from_str(tag.slice(source)) {
Number::Int(int)
} else {
unreachable!("Internal error: to_number failed")
}
}
RawNumber::Decimal(tag) => {
Number::Decimal(BigDecimal::from_str(tag.slice(source)).unwrap())
if let Ok(decimal) = BigDecimal::from_str(tag.slice(source)) {
Number::Decimal(decimal)
} else {
unreachable!("Internal error: to_number failed")
}
}
}
}
@ -105,7 +124,8 @@ impl PrettyDebugWithSource for Token {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self.unspanned {
UnspannedToken::Number(number) => number.pretty_debug(source),
UnspannedToken::Operator(operator) => operator.pretty(),
UnspannedToken::CompareOperator(operator) => operator.pretty(),
UnspannedToken::EvaluationOperator(operator) => operator.pretty(),
UnspannedToken::String(_) => b::primitive(self.span.slice(source)),
UnspannedToken::Variable(_) => b::var(self.span.slice(source)),
UnspannedToken::ExternalCommand(_) => b::primitive(self.span.slice(source)),
@ -149,9 +169,9 @@ impl Token {
}
}
pub fn extract_operator(&self) -> Option<Spanned<Operator>> {
pub fn extract_operator(&self) -> Option<Spanned<CompareOperator>> {
match self.unspanned {
UnspannedToken::Operator(operator) => Some(operator.spanned(self.span)),
UnspannedToken::CompareOperator(operator) => Some(operator.spanned(self.span)),
_ => None,
}
}

View File

@ -0,0 +1,127 @@
use crate::parse::parser::Number;
use nu_protocol::{Primitive, UntaggedValue};
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use num_traits::ToPrimitive;
use serde::{Deserialize, Serialize};
use std::str::FromStr;
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub enum Unit {
// Filesize units
Byte,
Kilobyte,
Megabyte,
Gigabyte,
Terabyte,
Petabyte,
// Duration units
Second,
Minute,
Hour,
Day,
Week,
Month,
Year,
}
impl PrettyDebug for Unit {
fn pretty(&self) -> DebugDocBuilder {
b::keyword(self.as_str())
}
}
fn convert_number_to_u64(number: &Number) -> u64 {
match number {
Number::Int(big_int) => {
if let Some(x) = big_int.to_u64() {
x
} else {
unreachable!("Internal error: convert_number_to_u64 given incompatible number")
}
}
Number::Decimal(big_decimal) => {
if let Some(x) = big_decimal.to_u64() {
x
} else {
unreachable!("Internal error: convert_number_to_u64 given incompatible number")
}
}
}
}
impl Unit {
pub fn as_str(self) -> &'static str {
match self {
Unit::Byte => "B",
Unit::Kilobyte => "KB",
Unit::Megabyte => "MB",
Unit::Gigabyte => "GB",
Unit::Terabyte => "TB",
Unit::Petabyte => "PB",
Unit::Second => "s",
Unit::Minute => "m",
Unit::Hour => "h",
Unit::Day => "d",
Unit::Week => "w",
Unit::Month => "M",
Unit::Year => "y",
}
}
pub fn compute(self, size: &Number) -> UntaggedValue {
let size = size.clone();
match self {
Unit::Byte => number(size),
Unit::Kilobyte => number(size * 1024),
Unit::Megabyte => number(size * 1024 * 1024),
Unit::Gigabyte => number(size * 1024 * 1024 * 1024),
Unit::Terabyte => number(size * 1024 * 1024 * 1024 * 1024),
Unit::Petabyte => number(size * 1024 * 1024 * 1024 * 1024 * 1024),
Unit::Second => duration(convert_number_to_u64(&size)),
Unit::Minute => duration(60 * convert_number_to_u64(&size)),
Unit::Hour => duration(60 * 60 * convert_number_to_u64(&size)),
Unit::Day => duration(24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Week => duration(7 * 24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Month => duration(30 * 24 * 60 * 60 * convert_number_to_u64(&size)),
Unit::Year => duration(365 * 24 * 60 * 60 * convert_number_to_u64(&size)),
}
}
}
fn number(number: impl Into<Number>) -> UntaggedValue {
let number = number.into();
match number {
Number::Int(int) => UntaggedValue::Primitive(Primitive::Int(int)),
Number::Decimal(decimal) => UntaggedValue::Primitive(Primitive::Decimal(decimal)),
}
}
pub fn duration(secs: u64) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Duration(secs))
}
impl FromStr for Unit {
type Err = ();
fn from_str(input: &str) -> Result<Self, <Self as std::str::FromStr>::Err> {
match input {
"B" | "b" => Ok(Unit::Byte),
"KB" | "kb" | "Kb" | "K" | "k" => Ok(Unit::Kilobyte),
"MB" | "mb" | "Mb" => Ok(Unit::Megabyte),
"GB" | "gb" | "Gb" => Ok(Unit::Gigabyte),
"TB" | "tb" | "Tb" => Ok(Unit::Terabyte),
"PB" | "pb" | "Pb" => Ok(Unit::Petabyte),
"s" => Ok(Unit::Second),
"m" => Ok(Unit::Minute),
"h" => Ok(Unit::Hour),
"d" => Ok(Unit::Day),
"w" => Ok(Unit::Week),
"M" => Ok(Unit::Month),
"y" => Ok(Unit::Year),
_ => Err(()),
}
}
}
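For illustration only (not part of the change set): a minimal sketch of how the `Unit` type above ties `FromStr` to `compute`, assuming it sits alongside the code in this file. The `scaled` helper and the amounts are made up; only `Unit::from_str`, `Unit::compute`, and the `Number::Int` constructor visible above are relied on.

use nu_protocol::UntaggedValue;
use num_bigint::BigInt;
use std::str::FromStr;

// Hypothetical helper: parse a suffix such as "KB" or "m" and scale an amount by it.
fn scaled(amount: i64, suffix: &str) -> Option<UntaggedValue> {
    let unit = Unit::from_str(suffix).ok()?;
    // Filesize units multiply out to bytes (2 KB -> 2048 B);
    // duration units convert down to whole seconds (3 m -> 180 s).
    Some(unit.compute(&Number::Int(BigInt::from(amount))))
}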

View File

@ -1,17 +1,17 @@
use crate::errors::{ArgumentError, ParseError};
use crate::parser::hir::syntax_shape::{
use crate::hir::syntax_shape::{
color_fallible_syntax, color_syntax, expand_expr, flat_shape::FlatShape, spaced,
BackoffColoringMode, ColorSyntax, MaybeSpaceShape,
};
use crate::parser::registry::{NamedType, PositionalType, Signature};
use crate::parser::TokensIterator;
use crate::parser::{
use crate::TokensIterator;
use crate::{
hir::{self, ExpandContext, NamedArguments},
Flag,
};
use log::trace;
use nu_source::{PrettyDebugWithSource, Text};
use nu_source::{Span, Spanned, SpannedItem};
use nu_source::{PrettyDebugWithSource, Span, Spanned, SpannedItem, Text};
use nu_errors::{ArgumentError, ParseError};
use nu_protocol::{NamedType, PositionalType, Signature};
pub fn parse_command_tail(
config: &Signature,
@ -134,7 +134,7 @@ pub fn parse_command_tail(
trace!(target: "nu::parse", "Constructed positional={:?} named={:?}", positional, named);
let positional = if positional.len() == 0 {
let positional = if positional.is_empty() {
None
} else {
Some(positional)
@ -183,198 +183,6 @@ impl ColoringArgs {
#[derive(Debug, Copy, Clone)]
pub struct CommandTailShape;
#[cfg(not(coloring_in_tokens))]
impl ColorSyntax for CommandTailShape {
type Info = ();
type Input = Signature;
fn color_syntax<'a, 'b>(
&self,
signature: &Signature,
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
shapes: &mut Vec<Spanned<FlatShape>>,
) -> Self::Info {
let mut args = ColoringArgs::new(token_nodes.len());
for (name, kind) in &signature.named {
trace!(target: "nu::color_syntax", "looking for {} : {:?}", name, kind);
match &kind.0 {
NamedType::Switch => {
match token_nodes.extract(|t| t.as_flag(name, context.source())) {
Some((pos, flag)) => args.insert(pos, vec![flag.color()]),
None => {}
}
}
NamedType::Mandatory(syntax_type) => {
match extract_mandatory(
signature,
name,
token_nodes,
context.source(),
Span::unknown(),
) {
Err(_) => {
// The mandatory flag didn't exist at all, so there's nothing to color
}
Ok((pos, flag)) => {
let mut shapes = vec![flag.color()];
token_nodes.move_to(pos);
if token_nodes.at_end() {
args.insert(pos, shapes);
token_nodes.restart();
continue;
}
// We can live with unmatched syntax after a mandatory flag
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If the part after a mandatory flag isn't present, that's ok, but we
// should roll back any whitespace we chomped
color_fallible_syntax(
syntax_type,
token_nodes,
context,
&mut shapes,
)
});
args.insert(pos, shapes);
token_nodes.restart();
}
}
}
NamedType::Optional(syntax_type) => {
match extract_optional(name, token_nodes, context.source()) {
Err(_) => {
// The optional flag didn't exist at all, so there's nothing to color
}
Ok(Some((pos, flag))) => {
let mut shapes = vec![flag.color()];
token_nodes.move_to(pos);
if token_nodes.at_end() {
args.insert(pos, shapes);
token_nodes.restart();
continue;
}
// We can live with unmatched syntax after an optional flag
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If the part after a mandatory flag isn't present, that's ok, but we
// should roll back any whitespace we chomped
color_fallible_syntax(
syntax_type,
token_nodes,
context,
&mut shapes,
)
});
args.insert(pos, shapes);
token_nodes.restart();
}
Ok(None) => {
token_nodes.restart();
}
}
}
};
}
for arg in &signature.positional {
trace!("Processing positional {:?}", arg);
match arg.0 {
PositionalType::Mandatory(..) => {
if token_nodes.at_end() {
break;
}
}
PositionalType::Optional(..) => {
if token_nodes.at_end() {
break;
}
}
}
let mut shapes = vec![];
let pos = token_nodes.pos(false);
match pos {
None => break,
Some(pos) => {
// We can live with an unmatched positional argument. Hopefully it will be
// matched by a future token
let _ = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If no match, we should roll back any whitespace we chomped
color_fallible_syntax(
&arg.0.syntax_type(),
token_nodes,
context,
&mut shapes,
)?;
args.insert(pos, shapes);
Ok(())
});
}
}
}
if let Some((syntax_type, _)) = signature.rest_positional {
loop {
if token_nodes.at_end_possible_ws() {
break;
}
let pos = token_nodes.pos(false);
match pos {
None => break,
Some(pos) => {
let mut shapes = vec![];
// If any arguments don't match, we'll fall back to backoff coloring mode
let result = token_nodes.atomic(|token_nodes| {
color_syntax(&MaybeSpaceShape, token_nodes, context, &mut shapes);
// If no match, we should roll back any whitespace we chomped
color_fallible_syntax(&syntax_type, token_nodes, context, &mut shapes)?;
args.insert(pos, shapes);
Ok(())
});
match result {
Err(_) => break,
Ok(_) => continue,
}
}
}
}
}
args.spread_shapes(shapes);
// Consume any remaining tokens with backoff coloring mode
color_syntax(&BackoffColoringMode, token_nodes, context, shapes);
shapes.sort_by(|a, b| a.span.start().cmp(&b.span.start()));
}
}
#[cfg(coloring_in_tokens)]
impl ColorSyntax for CommandTailShape {
type Info = ();
type Input = Signature;
@ -389,14 +197,14 @@ impl ColorSyntax for CommandTailShape {
token_nodes: &'b mut TokensIterator<'a>,
context: &ExpandContext,
) -> Self::Info {
use crate::parser::hir::syntax_shape::SyntaxShape;
use nu_protocol::SyntaxShape;
let mut args = ColoringArgs::new(token_nodes.len());
trace_remaining("nodes", &token_nodes, context.source());
fn insert_flag(
token_nodes: &mut TokensIterator,
syntax_type: &SyntaxShape,
syntax_type: SyntaxShape,
args: &mut ColoringArgs,
flag: Flag,
pos: usize,
@ -418,7 +226,7 @@ impl ColorSyntax for CommandTailShape {
// If the part after a mandatory flag isn't present, that's ok, but we
// should roll back any whitespace we chomped
color_fallible_syntax(syntax_type, token_nodes, context)?;
color_fallible_syntax(&syntax_type, token_nodes, context)?;
Ok(())
});
@ -435,9 +243,10 @@ impl ColorSyntax for CommandTailShape {
match &kind.0 {
NamedType::Switch => {
match token_nodes.extract(|t| t.as_flag(name, context.source())) {
Some((pos, flag)) => args.insert(pos, vec![flag.color()]),
None => {}
if let Some((pos, flag)) =
token_nodes.extract(|t| t.as_flag(name, context.source()))
{
args.insert(pos, vec![flag.color()])
}
}
NamedType::Mandatory(syntax_type) => {
@ -452,7 +261,7 @@ impl ColorSyntax for CommandTailShape {
// The mandatory flag didn't exist at all, so there's nothing to color
}
Ok((pos, flag)) => {
insert_flag(token_nodes, syntax_type, &mut args, flag, pos, context)
insert_flag(token_nodes, *syntax_type, &mut args, flag, pos, context)
}
}
}
@ -462,7 +271,7 @@ impl ColorSyntax for CommandTailShape {
// The optional flag didn't exist at all, so there's nothing to color
}
Ok(Some((pos, flag))) => {
insert_flag(token_nodes, syntax_type, &mut args, flag, pos, context)
insert_flag(token_nodes, *syntax_type, &mut args, flag, pos, context)
}
Ok(None) => {

View File

@ -0,0 +1,24 @@
[package]
name = "nu-plugin"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Nushell Plugin"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.8.0" }
nu-source = { path = "../nu-source", version = "0.8.0" }
nu-errors = { path = "../nu-errors", version = "0.8.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.8.0" }
indexmap = { version = "1.3.0", features = ["serde-1"] }
serde = { version = "1.0.103", features = ["derive"] }
num-bigint = { version = "0.2.3", features = ["serde"] }
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.8.0", path = "../nu-build" }

View File

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -0,0 +1,4 @@
mod plugin;
pub mod test_helpers;
pub use crate::plugin::{serve_plugin, Plugin};

View File

@ -1,5 +1,5 @@
use crate::Signature;
use crate::{CallInfo, ReturnValue, ShellError, Value};
use nu_errors::ShellError;
use nu_protocol::{outln, CallInfo, ReturnValue, Signature, Value};
use serde::{Deserialize, Serialize};
use std::io;
@ -24,9 +24,9 @@ pub trait Plugin {
}
pub fn serve_plugin(plugin: &mut dyn Plugin) {
let args = std::env::args();
let mut args = std::env::args();
if args.len() > 1 {
let input = args.skip(1).next();
let input = args.nth(1);
let input = match input {
Some(arg) => std::fs::read_to_string(arg),

View File

@ -0,0 +1,213 @@
use crate::Plugin;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{CallInfo, EvaluatedArgs, ReturnSuccess, ReturnValue, UntaggedValue, Value};
use nu_source::Tag;
pub struct PluginTest<'a, T: Plugin> {
plugin: &'a mut T,
call_info: CallInfo,
input: Value,
}
impl<'a, T: Plugin> PluginTest<'a, T> {
pub fn for_plugin(plugin: &'a mut T) -> Self {
PluginTest {
plugin,
call_info: CallStub::new().create(),
input: UntaggedValue::nothing().into_value(Tag::unknown()),
}
}
pub fn args(&mut self, call_info: CallInfo) -> &mut PluginTest<'a, T> {
self.call_info = call_info;
self
}
pub fn configure(&mut self, callback: impl FnOnce(Vec<String>)) -> &mut PluginTest<'a, T> {
let signature = self
.plugin
.config()
.expect("There was a problem configuring the plugin.");
callback(signature.named.keys().map(String::from).collect());
self
}
pub fn input(&mut self, value: Value) -> &mut PluginTest<'a, T> {
self.input = value;
self
}
pub fn test(&mut self) -> Result<Vec<ReturnValue>, ShellError> {
let return_values = self.plugin.filter(self.input.clone());
let mut return_values = match return_values {
Ok(filtered) => filtered,
Err(reason) => return Err(reason),
};
let end = self.plugin.end_filter();
match end {
Ok(filter_ended) => return_values.extend(filter_ended),
Err(reason) => return Err(reason),
}
self.plugin.quit();
Ok(return_values)
}
pub fn setup(
&mut self,
callback: impl FnOnce(&mut T, Result<Vec<ReturnValue>, ShellError>),
) -> &mut PluginTest<'a, T> {
let call_stub = self.call_info.clone();
self.configure(|flags_configured| {
let flags_registered = &call_stub.args.named;
let flag_passed = match flags_registered {
Some(names) => Some(names.keys().map(String::from).collect::<Vec<String>>()),
None => None,
};
if let Some(flags) = flag_passed {
for flag in flags {
assert!(
flags_configured.iter().any(|f| *f == flag),
format!(
"The flag you passed ({}) is not configured in the plugin.",
flag
)
);
}
}
});
let began = self.plugin.begin_filter(call_stub);
let return_values = match began {
Ok(values) => Ok(values),
Err(reason) => Err(reason),
};
callback(self.plugin, return_values);
self
}
}
pub fn plugin<T: Plugin>(plugin: &mut T) -> PluginTest<T> {
PluginTest::for_plugin(plugin)
}
#[derive(Default)]
pub struct CallStub {
positionals: Vec<Value>,
flags: IndexMap<String, Value>,
}
impl CallStub {
pub fn new() -> Self {
Default::default()
}
pub fn with_named_parameter(&mut self, name: &str, value: Value) -> &mut Self {
self.flags.insert(name.to_string(), value);
self
}
pub fn with_long_flag(&mut self, name: &str) -> &mut Self {
self.flags.insert(
name.to_string(),
UntaggedValue::boolean(true).into_value(Tag::unknown()),
);
self
}
pub fn with_parameter(&mut self, name: &str) -> Result<&mut Self, ShellError> {
let fields: Vec<Value> = name
.split('.')
.map(|s| UntaggedValue::string(s.to_string()).into_value(Tag::unknown()))
.collect();
self.positionals.push(value::column_path(&fields)?);
Ok(self)
}
pub fn create(&self) -> CallInfo {
CallInfo {
args: EvaluatedArgs::new(Some(self.positionals.clone()), Some(self.flags.clone())),
name_tag: Tag::unknown(),
}
}
}
pub fn expect_return_value_at(
for_results: Result<Vec<Result<ReturnSuccess, ShellError>>, ShellError>,
at: usize,
) -> Value {
let return_values = for_results
.expect("Failed! This seems to be an error getting back the results from the plugin.");
for (idx, item) in return_values.iter().enumerate() {
let item = match item {
Ok(return_value) => return_value,
Err(reason) => panic!(format!("{}", reason)),
};
if idx == at {
if let Some(value) = item.raw_value() {
return value;
} else {
panic!("Internal error: could not get raw value in expect_return_value_at")
}
}
}
panic!(format!(
"Couldn't get return value from stream at {}. (There are {} items)",
at,
return_values.len() - 1
))
}
pub mod value {
use nu_errors::ShellError;
use nu_protocol::{Primitive, TaggedDictBuilder, UntaggedValue, Value};
use nu_source::Tag;
use nu_value_ext::ValueExt;
use num_bigint::BigInt;
pub fn get_data(for_value: Value, key: &str) -> Value {
for_value.get_data(&key.to_string()).borrow().clone()
}
pub fn int(i: impl Into<BigInt>) -> Value {
UntaggedValue::Primitive(Primitive::Int(i.into())).into_untagged_value()
}
pub fn string(input: impl Into<String>) -> Value {
UntaggedValue::string(input.into()).into_untagged_value()
}
pub fn structured_sample_record(key: &str, value: &str) -> Value {
let mut record = TaggedDictBuilder::new(Tag::unknown());
record.insert_untagged(key, UntaggedValue::string(value));
record.into_value()
}
pub fn unstructured_sample_record(value: &str) -> Value {
UntaggedValue::string(value).into_value(Tag::unknown())
}
pub fn table(list: &[Value]) -> Value {
UntaggedValue::table(list).into_untagged_value()
}
pub fn column_path(paths: &[Value]) -> Result<Value, ShellError> {
Ok(UntaggedValue::Primitive(Primitive::ColumnPath(
table(&paths.to_vec()).as_column_path()?.item,
))
.into_untagged_value())
}
}
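For illustration only: a minimal filter plugin driven through the helpers above. The `Plugin` method signatures here are inferred from how `PluginTest` calls them (`config`, `begin_filter`, `filter`, `end_filter`, `quit`), not copied from the trait definition, and the `Len` plugin itself is made up.

use nu_errors::ShellError;
use nu_plugin::Plugin;
use nu_protocol::{CallInfo, ReturnSuccess, ReturnValue, Signature, UntaggedValue, Value};

struct Len;

impl Plugin for Len {
    fn config(&mut self) -> Result<Signature, ShellError> {
        Ok(Signature::new("len"))
    }
    fn begin_filter(&mut self, _call_info: CallInfo) -> Result<Vec<ReturnValue>, ShellError> {
        Ok(vec![])
    }
    fn filter(&mut self, input: Value) -> Result<Vec<ReturnValue>, ShellError> {
        // Emit the character count of each string that streams through.
        Ok(vec![ReturnSuccess::value(
            UntaggedValue::int(input.as_string()?.len()).into_untagged_value(),
        )])
    }
    fn end_filter(&mut self) -> Result<Vec<ReturnValue>, ShellError> {
        Ok(vec![])
    }
    fn quit(&mut self) {}
}

#[test]
fn len_counts_characters() {
    use nu_plugin::test_helpers::{expect_return_value_at, plugin, value};

    let mut len = Len;
    let result = plugin(&mut len).input(value::string("nushell")).test();
    assert_eq!(expect_return_value_at(result, 0), value::int(7));
}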

View File

@ -0,0 +1,43 @@
[package]
name = "nu-protocol"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core values and protocols for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.8.0" }
nu-errors = { path = "../nu-errors", version = "0.8.0" }
serde = { version = "1.0.103", features = ["derive"] }
indexmap = { version = "1.3.0", features = ["serde-1"] }
num-bigint = { version = "0.2.3", features = ["serde"] }
bigdecimal = { version = "0.1.0", features = ["serde"] }
chrono = { version = "0.4.10", features = ["serde"] }
num-traits = "0.2.8"
serde_bytes = "0.11.3"
getset = "0.0.9"
derive-new = "0.5.8"
ansi_term = "0.12.1"
language-reporting = "0.4.0"
nom = "5.0.1"
nom_locate = "1.0.0"
nom-tracable = "0.4.1"
typetag = "0.1.4"
query_interface = "0.3.5"
byte-unit = "3.0.3"
chrono-humanize = "0.0.11"
natural = "0.3.0"
# implement conversions
subprocess = "0.1.18"
serde_yaml = "0.8"
toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.8.0", path = "../nu-build" }

View File

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -0,0 +1,97 @@
use crate::value::Value;
use derive_new::new;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_source::Tag;
use serde::{Deserialize, Serialize};
#[derive(Deserialize, Serialize, Debug, Clone)]
pub struct CallInfo {
pub args: EvaluatedArgs,
pub name_tag: Tag,
}
#[derive(Debug, Default, new, Serialize, Deserialize, Clone)]
pub struct EvaluatedArgs {
pub positional: Option<Vec<Value>>,
pub named: Option<IndexMap<String, Value>>,
}
impl EvaluatedArgs {
pub fn slice_from(&self, from: usize) -> Vec<Value> {
let positional = &self.positional;
match positional {
None => vec![],
Some(list) => list[from..].to_vec(),
}
}
pub fn nth(&self, pos: usize) -> Option<&Value> {
match &self.positional {
None => None,
Some(array) => array.get(pos),
}
}
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
match &self.positional {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(array) => match array.get(pos) {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(item) => Ok(item),
},
}
}
pub fn len(&self) -> usize {
match &self.positional {
None => 0,
Some(array) => array.len(),
}
}
pub fn is_empty(&self) -> bool {
self.len() == 0
}
pub fn has(&self, name: &str) -> bool {
match &self.named {
None => false,
Some(named) => named.contains_key(name),
}
}
pub fn get(&self, name: &str) -> Option<&Value> {
match &self.named {
None => None,
Some(named) => named.get(name),
}
}
pub fn positional_iter(&self) -> PositionalIter<'_> {
match &self.positional {
None => PositionalIter::Empty,
Some(v) => {
let iter = v.iter();
PositionalIter::Array(iter)
}
}
}
}
pub enum PositionalIter<'a> {
Empty,
Array(std::slice::Iter<'a, Value>),
}
impl<'a> Iterator for PositionalIter<'a> {
type Item = &'a Value;
fn next(&mut self) -> Option<Self::Item> {
match self {
PositionalIter::Empty => None,
PositionalIter::Array(iter) => iter.next(),
}
}
}
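A usage sketch (illustrative values; only the constructor and accessors defined above are used): commands normally receive `EvaluatedArgs` from the evaluator, but the `derive_new` constructor also makes it easy to build one by hand in tests.

use indexmap::IndexMap;
use nu_protocol::{EvaluatedArgs, UntaggedValue};
use nu_source::Tag;

fn demo_args() -> EvaluatedArgs {
    let positional = vec![UntaggedValue::string("a").into_value(Tag::unknown())];

    let mut named = IndexMap::new();
    named.insert(
        "verbose".to_string(),
        UntaggedValue::boolean(true).into_value(Tag::unknown()),
    );

    // `new` comes from #[derive(new)]: (positional, named).
    let args = EvaluatedArgs::new(Some(positional), Some(named));
    assert!(args.has("verbose"));
    assert_eq!(args.len(), 1);
    args
}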

View File

@ -0,0 +1,26 @@
#[macro_use]
mod macros;
mod call_info;
mod maybe_owned;
mod return_value;
mod signature;
mod syntax_shape;
mod type_name;
mod type_shape;
mod value;
pub use crate::call_info::{CallInfo, EvaluatedArgs};
pub use crate::maybe_owned::MaybeOwned;
pub use crate::return_value::{CommandAction, ReturnSuccess, ReturnValue};
pub use crate::signature::{NamedType, PositionalType, Signature};
pub use crate::syntax_shape::SyntaxShape;
pub use crate::type_name::{PrettyType, ShellTypeName, SpannedTypeName};
pub use crate::type_shape::{Row as RowType, Type};
pub use crate::value::column_path::{did_you_mean, ColumnPath, PathMember, UnspannedPathMember};
pub use crate::value::dict::{Dictionary, TaggedDictBuilder};
pub use crate::value::evaluate::{Evaluate, EvaluateTrait, Scope};
pub use crate::value::primitive::format_primitive;
pub use crate::value::primitive::Primitive;
pub use crate::value::range::{Range, RangeInclusion};
pub use crate::value::{UntaggedValue, Value};

View File

@ -0,0 +1,12 @@
// These macros exist to differentiate between intentional writing to stdout
// and stray printlns left by accident
#[macro_export]
macro_rules! outln {
($($tokens:tt)*) => { println!($($tokens)*) }
}
#[macro_export]
macro_rules! errln {
($($tokens:tt)*) => { eprintln!($($tokens)*) }
}
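A usage sketch: the point of these macros is that intentional writes to stdout and stderr stay greppable, while a stray `println!` stands out in review.

use nu_protocol::{errln, outln};

fn report(ok: bool) {
    if ok {
        // Deliberate user-facing output; goes to stdout like println!.
        outln!("done");
    } else {
        // Deliberate diagnostics; go to stderr like eprintln!.
        errln!("something went wrong");
    }
}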

View File

@ -0,0 +1,16 @@
#![allow(clippy::should_implement_trait)]
#[derive(Debug)]
pub enum MaybeOwned<'a, T> {
Owned(T),
Borrowed(&'a T),
}
impl<T> MaybeOwned<'_, T> {
pub fn borrow(&self) -> &T {
match self {
MaybeOwned::Owned(v) => v,
MaybeOwned::Borrowed(v) => v,
}
}
}
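A usage sketch (the helper below is made up): `borrow()` hands back a `&T` no matter which variant was constructed, so a caller can return either a stored reference or a freshly built default through one type.

use nu_protocol::MaybeOwned;

fn first_or_default<'a>(items: &'a [String], default: String) -> MaybeOwned<'a, String> {
    match items.first() {
        Some(item) => MaybeOwned::Borrowed(item),
        None => MaybeOwned::Owned(default),
    }
}

fn demo() {
    let items = vec!["a".to_string()];
    assert_eq!(first_or_default(&items, "fallback".into()).borrow(), "a");
}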

View File

@ -0,0 +1,88 @@
use crate::value::Value;
use nu_errors::ShellError;
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum CommandAction {
ChangePath(String),
Exit,
Error(ShellError),
EnterShell(String),
AutoConvert(Value, String),
EnterValueShell(Value),
EnterHelpShell(Value),
PreviousShell,
NextShell,
LeaveShell,
}
impl PrettyDebug for CommandAction {
fn pretty(&self) -> DebugDocBuilder {
match self {
CommandAction::ChangePath(path) => b::typed("change path", b::description(path)),
CommandAction::Exit => b::description("exit"),
CommandAction::Error(_) => b::error("error"),
CommandAction::AutoConvert(_, extension) => {
b::typed("auto convert", b::description(extension))
}
CommandAction::EnterShell(s) => b::typed("enter shell", b::description(s)),
CommandAction::EnterValueShell(v) => b::typed("enter value shell", v.pretty()),
CommandAction::EnterHelpShell(v) => b::typed("enter help shell", v.pretty()),
CommandAction::PreviousShell => b::description("previous shell"),
CommandAction::NextShell => b::description("next shell"),
CommandAction::LeaveShell => b::description("leave shell"),
}
}
}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum ReturnSuccess {
Value(Value),
DebugValue(Value),
Action(CommandAction),
}
impl PrettyDebug for ReturnSuccess {
fn pretty(&self) -> DebugDocBuilder {
match self {
ReturnSuccess::Value(value) => b::typed("value", value.pretty()),
ReturnSuccess::DebugValue(value) => b::typed("debug value", value.pretty()),
ReturnSuccess::Action(action) => b::typed("action", action.pretty()),
}
}
}
pub type ReturnValue = Result<ReturnSuccess, ShellError>;
impl Into<ReturnValue> for Value {
fn into(self) -> ReturnValue {
Ok(ReturnSuccess::Value(self))
}
}
impl ReturnSuccess {
pub fn raw_value(&self) -> Option<Value> {
match self {
ReturnSuccess::Value(raw) => Some(raw.clone()),
ReturnSuccess::DebugValue(raw) => Some(raw.clone()),
ReturnSuccess::Action(_) => None,
}
}
pub fn change_cwd(path: String) -> ReturnValue {
Ok(ReturnSuccess::Action(CommandAction::ChangePath(path)))
}
pub fn value(input: impl Into<Value>) -> ReturnValue {
Ok(ReturnSuccess::Value(input.into()))
}
pub fn debug_value(input: impl Into<Value>) -> ReturnValue {
Ok(ReturnSuccess::DebugValue(input.into()))
}
pub fn action(input: CommandAction) -> ReturnValue {
Ok(ReturnSuccess::Action(input))
}
}
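A usage sketch (the scenario is made up): a command's output stream is a sequence of `ReturnValue` items, so data and shell actions can be interleaved through the constructors shown above.

use nu_protocol::{ReturnSuccess, ReturnValue, UntaggedValue};
use nu_source::Tag;

fn outputs(extracted_to: String) -> Vec<ReturnValue> {
    vec![
        // Plain data for the pipeline.
        ReturnSuccess::value(UntaggedValue::string("extracted").into_value(Tag::unknown())),
        // An action for the shell itself (wraps CommandAction::ChangePath).
        ReturnSuccess::change_cwd(extracted_to),
    ]
}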

View File

@ -1,11 +1,7 @@
// TODO: Temporary redirect
pub(crate) use crate::context::CommandRegistry;
use crate::evaluate::{evaluate_baseline_expr, Scope};
use crate::parser::{hir, hir::SyntaxShape};
use crate::prelude::*;
use derive_new::new;
use crate::syntax_shape::SyntaxShape;
use crate::type_shape::Type;
use indexmap::IndexMap;
use nu_source::{b, DebugDocBuilder, PrettyDebug, PrettyDebugWithSource};
use serde::{Deserialize, Serialize};
#[derive(Debug, Serialize, Deserialize, Clone)]
@ -25,12 +21,12 @@ impl PrettyDebug for PositionalType {
fn pretty(&self) -> DebugDocBuilder {
match self {
PositionalType::Mandatory(string, shape) => {
b::description(string) + b::delimit("(", shape.pretty(), ")").as_kind().group()
b::description(string) + b::delimit("(", shape.pretty(), ")").into_kind().group()
}
PositionalType::Optional(string, shape) => {
b::description(string)
+ b::operator("?")
+ b::delimit("(", shape.pretty(), ")").as_kind().group()
+ b::delimit("(", shape.pretty(), ")").into_kind().group()
}
}
}
@ -57,14 +53,14 @@ impl PositionalType {
PositionalType::Optional(name.to_string(), SyntaxShape::Any)
}
pub(crate) fn name(&self) -> &str {
pub fn name(&self) -> &str {
match self {
PositionalType::Mandatory(s, _) => s,
PositionalType::Optional(s, _) => s,
}
}
pub(crate) fn syntax_type(&self) -> SyntaxShape {
pub fn syntax_type(&self) -> SyntaxShape {
match *self {
PositionalType::Mandatory(_, t) => t,
PositionalType::Optional(_, t) => t,
@ -81,6 +77,8 @@ pub struct Signature {
pub positional: Vec<(PositionalType, Description)>,
pub rest_positional: Option<(SyntaxShape, Description)>,
pub named: IndexMap<String, (NamedType, Description)>,
pub yields: Option<Type>,
pub input: Option<Type>,
pub is_filter: bool,
}
@ -103,14 +101,16 @@ impl PrettyDebugWithSource for Signature {
}
impl Signature {
pub fn new(name: String) -> Signature {
pub fn new(name: impl Into<String>) -> Signature {
Signature {
name,
name: name.into(),
usage: String::new(),
positional: vec![],
rest_positional: None,
named: IndexMap::new(),
is_filter: false,
yields: None,
input: None,
}
}
@ -191,136 +191,14 @@ impl Signature {
self.rest_positional = Some((ty, desc.into()));
self
}
}
#[derive(Debug, Default, new, Serialize, Deserialize, Clone)]
pub struct EvaluatedArgs {
pub positional: Option<Vec<Value>>,
pub named: Option<IndexMap<String, Value>>,
}
pub fn yields(mut self, ty: Type) -> Signature {
self.yields = Some(ty);
self
}
impl EvaluatedArgs {
pub fn slice_from(&self, from: usize) -> Vec<Value> {
let positional = &self.positional;
match positional {
None => vec![],
Some(list) => list[from..].to_vec(),
}
pub fn input(mut self, ty: Type) -> Signature {
self.input = Some(ty);
self
}
}
impl EvaluatedArgs {
pub fn nth(&self, pos: usize) -> Option<&Value> {
match &self.positional {
None => None,
Some(array) => array.iter().nth(pos),
}
}
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
match &self.positional {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(array) => match array.iter().nth(pos) {
None => Err(ShellError::unimplemented("Better error: expect_nth")),
Some(item) => Ok(item),
},
}
}
pub fn len(&self) -> usize {
match &self.positional {
None => 0,
Some(array) => array.len(),
}
}
pub fn has(&self, name: &str) -> bool {
match &self.named {
None => false,
Some(named) => named.contains_key(name),
}
}
pub fn get(&self, name: &str) -> Option<&Value> {
match &self.named {
None => None,
Some(named) => named.get(name),
}
}
pub fn positional_iter(&self) -> PositionalIter<'_> {
match &self.positional {
None => PositionalIter::Empty,
Some(v) => {
let iter = v.iter();
PositionalIter::Array(iter)
}
}
}
}
pub enum PositionalIter<'a> {
Empty,
Array(std::slice::Iter<'a, Value>),
}
impl<'a> Iterator for PositionalIter<'a> {
type Item = &'a Value;
fn next(&mut self) -> Option<Self::Item> {
match self {
PositionalIter::Empty => None,
PositionalIter::Array(iter) => iter.next(),
}
}
}
pub(crate) fn evaluate_args(
call: &hir::Call,
registry: &CommandRegistry,
scope: &Scope,
source: &Text,
) -> Result<EvaluatedArgs, ShellError> {
let positional: Result<Option<Vec<_>>, _> = call
.positional()
.as_ref()
.map(|p| {
p.iter()
.map(|e| evaluate_baseline_expr(e, registry, scope, source))
.collect()
})
.transpose();
let positional = positional?;
let named: Result<Option<IndexMap<String, Value>>, ShellError> = call
.named()
.as_ref()
.map(|n| {
let mut results = IndexMap::new();
for (name, value) in n.named.iter() {
match value {
hir::named::NamedValue::PresentSwitch(tag) => {
results.insert(name.clone(), UntaggedValue::boolean(true).into_value(tag));
}
hir::named::NamedValue::Value(expr) => {
results.insert(
name.clone(),
evaluate_baseline_expr(expr, registry, scope, source)?,
);
}
_ => {}
};
}
Ok(results)
})
.transpose();
let named = named?;
Ok(EvaluatedArgs::new(positional, named))
}
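A sketch of the `yields`/`input` metadata added in this change (the command name is made up; only constructors visible in this hunk are used):

use nu_protocol::{Signature, Type};

fn my_signature() -> Signature {
    // A filter that takes strings in and produces integers.
    Signature::new("str-length")
        .yields(Type::Int)
        .input(Type::String)
}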

View File

@ -0,0 +1,33 @@
use nu_source::{b, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Serialize};
#[derive(Debug, Copy, Clone, Serialize, Deserialize)]
pub enum SyntaxShape {
Any,
String,
Member,
ColumnPath,
Number,
Range,
Int,
Path,
Pattern,
Block,
}
impl PrettyDebug for SyntaxShape {
fn pretty(&self) -> DebugDocBuilder {
b::kind(match self {
SyntaxShape::Any => "any shape",
SyntaxShape::String => "string shape",
SyntaxShape::Member => "member shape",
SyntaxShape::ColumnPath => "column path shape",
SyntaxShape::Number => "number shape",
SyntaxShape::Range => "range shape",
SyntaxShape::Int => "integer shape",
SyntaxShape::Path => "file path shape",
SyntaxShape::Pattern => "pattern shape",
SyntaxShape::Block => "block shape",
})
}
}

View File

@ -1,5 +1,4 @@
use crate::prelude::*;
use nu_source::{DebugDocBuilder, Spanned, SpannedItem, Tagged};
use nu_source::{DebugDocBuilder, HasSpan, Spanned, SpannedItem, Tagged};
pub trait ShellTypeName {
fn type_name(&self) -> &'static str;

View File

@ -0,0 +1,382 @@
use crate::value::dict::Dictionary;
use crate::value::primitive::Primitive;
use crate::value::range::RangeInclusion;
use crate::value::{UntaggedValue, Value};
use derive_new::new;
use nu_source::{b, DebugDoc, DebugDocBuilder, PrettyDebug};
use serde::{Deserialize, Deserializer, Serialize};
use std::collections::BTreeMap;
use std::fmt::Debug;
use std::hash::Hash;
/**
This file describes the structural types of the nushell system.
Its primary purpose today is to identify "equivalent" values for the purpose
of merging rows into a single table or identify rows in a table that have the
same shape for reflection.
*/
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize, new)]
pub struct RangeType {
from: (Type, RangeInclusion),
to: (Type, RangeInclusion),
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
pub enum Type {
Nothing,
Int,
Range(Box<RangeType>),
Decimal,
Bytesize,
String,
Line,
ColumnPath,
Pattern,
Boolean,
Date,
Duration,
Path,
Binary,
Row(Row),
Table(Vec<Type>),
// TODO: Block arguments
Block,
// TODO: Error type
Error,
// Stream markers (used as bookend markers rather than actual values)
BeginningOfStream,
EndOfStream,
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Hash, new)]
pub struct Row {
#[new(default)]
map: BTreeMap<Column, Type>,
}
impl Serialize for Row {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serializer.collect_map(self.map.iter())
}
}
impl<'de> Deserialize<'de> for Row {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
struct RowVisitor;
impl<'de> serde::de::Visitor<'de> for RowVisitor {
type Value = Row;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a row")
}
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
where
A: serde::de::MapAccess<'de>,
{
let mut new_map = BTreeMap::new();
loop {
let entry = map.next_entry()?;
match entry {
None => return Ok(Row { map: new_map }),
Some((key, value)) => {
new_map.insert(key, value);
}
}
}
}
}
deserializer.deserialize_map(RowVisitor)
}
}
impl Type {
pub fn from_primitive(primitive: &Primitive) -> Type {
match primitive {
Primitive::Nothing => Type::Nothing,
Primitive::Int(_) => Type::Int,
Primitive::Range(range) => {
let (left_value, left_inclusion) = &range.from;
let (right_value, right_inclusion) = &range.to;
let left_type = (Type::from_primitive(left_value), *left_inclusion);
let right_type = (Type::from_primitive(right_value), *right_inclusion);
let range = RangeType::new(left_type, right_type);
Type::Range(Box::new(range))
}
Primitive::Decimal(_) => Type::Decimal,
Primitive::Bytes(_) => Type::Bytesize,
Primitive::String(_) => Type::String,
Primitive::Line(_) => Type::Line,
Primitive::ColumnPath(_) => Type::ColumnPath,
Primitive::Pattern(_) => Type::Pattern,
Primitive::Boolean(_) => Type::Boolean,
Primitive::Date(_) => Type::Date,
Primitive::Duration(_) => Type::Duration,
Primitive::Path(_) => Type::Path,
Primitive::Binary(_) => Type::Binary,
Primitive::BeginningOfStream => Type::BeginningOfStream,
Primitive::EndOfStream => Type::EndOfStream,
}
}
pub fn from_dictionary(dictionary: &Dictionary) -> Type {
let mut map = BTreeMap::new();
for (key, value) in dictionary.entries.iter() {
let column = Column::String(key.clone());
map.insert(column, Type::from_value(value));
}
Type::Row(Row { map })
}
pub fn from_table<'a>(table: impl IntoIterator<Item = &'a Value>) -> Type {
let mut vec = vec![];
for item in table.into_iter() {
vec.push(Type::from_value(item))
}
Type::Table(vec)
}
pub fn from_value<'a>(value: impl Into<&'a UntaggedValue>) -> Type {
match value.into() {
UntaggedValue::Primitive(p) => Type::from_primitive(p),
UntaggedValue::Row(row) => Type::from_dictionary(row),
UntaggedValue::Table(table) => Type::from_table(table.iter()),
UntaggedValue::Error(_) => Type::Error,
UntaggedValue::Block(_) => Type::Block,
}
}
}
impl PrettyDebug for Type {
fn pretty(&self) -> DebugDocBuilder {
match self {
Type::Nothing => ty("nothing"),
Type::Int => ty("integer"),
Type::Range(range) => {
let (left, left_inclusion) = &range.from;
let (right, right_inclusion) = &range.to;
let left_bracket = b::delimiter(match left_inclusion {
RangeInclusion::Exclusive => "(",
RangeInclusion::Inclusive => "[",
});
let right_bracket = b::delimiter(match right_inclusion {
RangeInclusion::Exclusive => ")",
RangeInclusion::Inclusive => "]",
});
b::typed(
"range",
(left_bracket
+ left.pretty()
+ b::operator(",")
+ b::space()
+ right.pretty()
+ right_bracket)
.group(),
)
}
Type::Decimal => ty("decimal"),
Type::Bytesize => ty("bytesize"),
Type::String => ty("string"),
Type::Line => ty("line"),
Type::ColumnPath => ty("column-path"),
Type::Pattern => ty("pattern"),
Type::Boolean => ty("boolean"),
Type::Date => ty("date"),
Type::Duration => ty("duration"),
Type::Path => ty("path"),
Type::Binary => ty("binary"),
Type::Error => b::error("error"),
Type::BeginningOfStream => b::keyword("beginning-of-stream"),
Type::EndOfStream => b::keyword("end-of-stream"),
Type::Row(row) => (b::kind("row")
+ b::space()
+ b::intersperse(
row.map.iter().map(|(key, ty)| {
(b::key(match key {
Column::String(string) => string.clone(),
Column::Value => "<value>".to_string(),
}) + b::delimit("(", ty.pretty(), ")").into_kind())
.nest()
}),
b::space(),
)
.nest())
.nest(),
Type::Table(table) => {
let mut group: Group<DebugDoc, Vec<(usize, usize)>> = Group::new();
for (i, item) in table.iter().enumerate() {
group.add(item.to_doc(), i);
}
(b::kind("table") + b::space() + b::keyword("of")).group()
+ b::space()
+ (if group.len() == 1 {
let (doc, _) = group.into_iter().collect::<Vec<_>>()[0].clone();
DebugDocBuilder::from_doc(doc)
} else {
b::intersperse(
group.into_iter().map(|(doc, rows)| {
(b::intersperse(
rows.iter().map(|(from, to)| {
if from == to {
b::description(from)
} else {
(b::description(from)
+ b::space()
+ b::keyword("to")
+ b::space()
+ b::description(to))
.group()
}
}),
b::description(", "),
) + b::description(":")
+ b::space()
+ DebugDocBuilder::from_doc(doc))
.nest()
}),
b::space(),
)
})
}
Type::Block => ty("block"),
}
}
}
#[derive(Debug, new)]
struct DebugEntry<'a> {
key: &'a Column,
value: &'a Type,
}
impl<'a> PrettyDebug for DebugEntry<'a> {
fn pretty(&self) -> DebugDocBuilder {
(b::key(match self.key {
Column::String(string) => string.clone(),
Column::Value => "<value>".to_string(),
}) + b::delimit("(", self.value.pretty(), ")").into_kind())
}
}
fn ty(name: impl std::fmt::Display) -> DebugDocBuilder {
b::kind(format!("{}", name))
}
pub trait GroupedValue: Debug + Clone {
type Item;
fn new() -> Self;
fn merge(&mut self, value: Self::Item);
}
impl GroupedValue for Vec<(usize, usize)> {
type Item = usize;
fn new() -> Vec<(usize, usize)> {
vec![]
}
fn merge(&mut self, new_value: usize) {
match self.last_mut() {
Some(value) if value.1 == new_value - 1 => {
value.1 += 1;
}
_ => self.push((new_value, new_value)),
}
}
}
#[derive(Debug)]
pub struct Group<K: Debug + Eq + Hash, V: GroupedValue> {
values: indexmap::IndexMap<K, V>,
}
impl<K, G> Group<K, G>
where
K: Debug + Eq + Hash,
G: GroupedValue,
{
pub fn new() -> Group<K, G> {
Group {
values: indexmap::IndexMap::default(),
}
}
pub fn len(&self) -> usize {
self.values.len()
}
pub fn into_iter(self) -> impl Iterator<Item = (K, G)> {
self.values.into_iter()
}
pub fn add(&mut self, key: impl Into<K>, value: impl Into<G::Item>) {
let key = key.into();
let value = value.into();
let group = self.values.get_mut(&key);
match group {
None => {
self.values.insert(key, {
let mut group = G::new();
group.merge(value);
group
});
}
Some(group) => {
group.merge(value);
}
}
}
}
#[derive(Debug, Clone, Eq, PartialEq, Ord, PartialOrd, Serialize, Deserialize, Hash)]
pub enum Column {
String(String),
Value,
}
impl Into<Column> for String {
fn into(self) -> Column {
Column::String(self)
}
}
impl Into<Column> for &String {
fn into(self) -> Column {
Column::String(self.clone())
}
}
impl Into<Column> for &str {
fn into(self) -> Column {
Column::String(self.to_string())
}
}
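A small sketch of the structural typing described in the comment at the top of this file: the `Type` of a value depends only on its shape, which is what lets rows with identical columns be merged into one table.

use nu_protocol::{Type, UntaggedValue};

fn string_values_share_a_type() {
    let a = UntaggedValue::string("nu");
    let b = UntaggedValue::string("shell");
    assert_eq!(Type::from_value(&a), Type::from_value(&b));
    assert_eq!(Type::from_value(&a), Type::String);
}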

View File

@ -0,0 +1,300 @@
pub mod column_path;
mod convert;
mod debug;
pub mod dict;
pub mod evaluate;
pub mod primitive;
pub mod range;
mod serde_bigdecimal;
mod serde_bigint;
use crate::type_name::{ShellTypeName, SpannedTypeName};
use crate::value::dict::Dictionary;
use crate::value::evaluate::Evaluate;
use crate::value::primitive::Primitive;
use crate::value::range::{Range, RangeInclusion};
use crate::{ColumnPath, PathMember};
use bigdecimal::BigDecimal;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_source::{AnchorLocation, HasSpan, Span, Spanned, Tag};
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
use std::time::SystemTime;
#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash, Serialize, Deserialize)]
pub enum UntaggedValue {
Primitive(Primitive),
Row(Dictionary),
Table(Vec<Value>),
// Errors are a type of value too
Error(ShellError),
Block(Evaluate),
}
impl UntaggedValue {
pub fn retag(self, tag: impl Into<Tag>) -> Value {
Value {
value: self,
tag: tag.into(),
}
}
pub fn data_descriptors(&self) -> Vec<String> {
match self {
UntaggedValue::Primitive(_) => vec![],
UntaggedValue::Row(columns) => columns.entries.keys().map(|x| x.to_string()).collect(),
UntaggedValue::Block(_) => vec![],
UntaggedValue::Table(_) => vec![],
UntaggedValue::Error(_) => vec![],
}
}
pub fn into_value(self, tag: impl Into<Tag>) -> Value {
Value {
value: self,
tag: tag.into(),
}
}
pub fn into_untagged_value(self) -> Value {
Value {
value: self,
tag: Tag::unknown(),
}
}
pub fn is_true(&self) -> bool {
match self {
UntaggedValue::Primitive(Primitive::Boolean(true)) => true,
_ => false,
}
}
pub fn is_some(&self) -> bool {
!self.is_none()
}
pub fn is_none(&self) -> bool {
match self {
UntaggedValue::Primitive(Primitive::Nothing) => true,
_ => false,
}
}
pub fn is_error(&self) -> bool {
match self {
UntaggedValue::Error(_err) => true,
_ => false,
}
}
pub fn expect_error(&self) -> ShellError {
match self {
UntaggedValue::Error(err) => err.clone(),
_ => panic!("Don't call expect_error without first calling is_error"),
}
}
pub fn expect_string(&self) -> &str {
match self {
UntaggedValue::Primitive(Primitive::String(string)) => &string[..],
_ => panic!("expect_string assumes that the value must be a string"),
}
}
#[allow(unused)]
pub fn row(entries: IndexMap<String, Value>) -> UntaggedValue {
UntaggedValue::Row(entries.into())
}
pub fn table(list: &[Value]) -> UntaggedValue {
UntaggedValue::Table(list.to_vec())
}
pub fn string(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(s.into()))
}
pub fn line(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Line(s.into()))
}
pub fn column_path(s: Vec<impl Into<PathMember>>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::ColumnPath(ColumnPath::new(
s.into_iter().map(|p| p.into()).collect(),
)))
}
pub fn int(i: impl Into<BigInt>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Int(i.into()))
}
pub fn pattern(s: impl Into<String>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(s.into()))
}
pub fn path(s: impl Into<PathBuf>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Path(s.into()))
}
pub fn bytes(s: impl Into<u64>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Bytes(s.into()))
}
pub fn decimal(s: impl Into<BigDecimal>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Decimal(s.into()))
}
pub fn binary(binary: Vec<u8>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Binary(binary))
}
pub fn range(
left: (Spanned<Primitive>, RangeInclusion),
right: (Spanned<Primitive>, RangeInclusion),
) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Range(Box::new(Range::new(left, right))))
}
pub fn boolean(s: impl Into<bool>) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Boolean(s.into()))
}
pub fn duration(secs: u64) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Duration(secs))
}
pub fn system_date(s: SystemTime) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Date(s.into()))
}
pub fn nothing() -> UntaggedValue {
UntaggedValue::Primitive(Primitive::Nothing)
}
}
#[derive(Debug, Clone, PartialOrd, PartialEq, Ord, Eq, Hash, Serialize, Deserialize)]
pub struct Value {
pub value: UntaggedValue,
pub tag: Tag,
}
impl std::ops::Deref for Value {
type Target = UntaggedValue;
fn deref(&self) -> &Self::Target {
&self.value
}
}
impl Value {
pub fn anchor(&self) -> Option<AnchorLocation> {
self.tag.anchor()
}
pub fn anchor_name(&self) -> Option<String> {
self.tag.anchor_name()
}
pub fn tag(&self) -> Tag {
self.tag.clone()
}
pub fn as_string(&self) -> Result<String, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::String(string)) => Ok(string.clone()),
UntaggedValue::Primitive(Primitive::Line(line)) => Ok(line.clone() + "\n"),
_ => Err(ShellError::type_error("string", self.spanned_type_name())),
}
}
pub fn as_forgiving_string(&self) -> Result<&str, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::String(string)) => Ok(&string[..]),
_ => Err(ShellError::type_error("string", self.spanned_type_name())),
}
}
pub fn as_path(&self) -> Result<PathBuf, ShellError> {
match &self.value {
UntaggedValue::Primitive(Primitive::Path(path)) => Ok(path.clone()),
UntaggedValue::Primitive(Primitive::String(path_str)) => Ok(PathBuf::from(&path_str)),
_ => Err(ShellError::type_error("Path", self.spanned_type_name())),
}
}
pub fn as_primitive(&self) -> Result<Primitive, ShellError> {
match &self.value {
UntaggedValue::Primitive(primitive) => Ok(primitive.clone()),
_ => Err(ShellError::type_error(
"Primitive",
self.spanned_type_name(),
)),
}
}
pub fn as_u64(&self) -> Result<u64, ShellError> {
match &self.value {
UntaggedValue::Primitive(primitive) => primitive.as_u64(self.tag.span),
_ => Err(ShellError::type_error("integer", self.spanned_type_name())),
}
}
}
impl Into<UntaggedValue> for &str {
fn into(self) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(self.to_string()))
}
}
impl Into<UntaggedValue> for Value {
fn into(self) -> UntaggedValue {
self.value
}
}
impl<'a> Into<&'a UntaggedValue> for &'a Value {
fn into(self) -> &'a UntaggedValue {
&self.value
}
}
impl HasSpan for Value {
fn span(&self) -> Span {
self.tag.span
}
}
impl ShellTypeName for Value {
fn type_name(&self) -> &'static str {
ShellTypeName::type_name(&self.value)
}
}
impl ShellTypeName for UntaggedValue {
fn type_name(&self) -> &'static str {
match &self {
UntaggedValue::Primitive(p) => p.type_name(),
UntaggedValue::Row(_) => "row",
UntaggedValue::Table(_) => "table",
UntaggedValue::Error(_) => "error",
UntaggedValue::Block(_) => "block",
}
}
}
impl From<Primitive> for UntaggedValue {
fn from(input: Primitive) -> UntaggedValue {
UntaggedValue::Primitive(input)
}
}
impl From<String> for UntaggedValue {
fn from(input: String) -> UntaggedValue {
UntaggedValue::Primitive(Primitive::String(input))
}
}
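A usage sketch of the two-layer model above (column names are illustrative): an `UntaggedValue` is the bare data, and a `Value` is that data plus the `Tag` (span and anchor) it came from.

use indexmap::IndexMap;
use nu_protocol::{UntaggedValue, Value};
use nu_source::Tag;

fn sample_row() -> Value {
    let mut entries = IndexMap::new();
    entries.insert(
        "name".to_string(),
        UntaggedValue::string("nu").into_value(Tag::unknown()),
    );
    entries.insert(
        "stars".to_string(),
        UntaggedValue::int(5).into_value(Tag::unknown()),
    );
    UntaggedValue::row(entries).into_value(Tag::unknown())
}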

View File

@ -1,8 +1,8 @@
use crate::parser::hir::Expression;
use crate::prelude::*;
use crate::Value;
use derive_new::new;
use getset::{Getters, MutGetters};
use nu_source::{b, span_for_spanned_list, PrettyDebug};
use getset::Getters;
use nu_source::{b, span_for_spanned_list, DebugDocBuilder, HasFallibleSpan, PrettyDebug, Span};
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};
#[derive(Clone, Debug, Eq, PartialEq, Ord, PartialOrd, Hash, Serialize, Deserialize)]
@ -48,8 +48,8 @@ impl ColumnPath {
self.members.iter()
}
pub fn split_last(&self) -> (&PathMember, &[PathMember]) {
self.members.split_last().unwrap()
pub fn split_last(&self) -> Option<(&PathMember, &[PathMember])> {
self.members.split_last()
}
}
@ -69,7 +69,7 @@ impl PrettyDebug for ColumnPath {
impl HasFallibleSpan for ColumnPath {
fn maybe_span(&self) -> Option<Span> {
if self.members.len() == 0 {
if self.members.is_empty() {
None
} else {
Some(span_for_spanned_list(self.members.iter().map(|m| m.span)))
@ -87,37 +87,28 @@ impl PathMember {
}
}
#[derive(
Debug,
Clone,
Eq,
PartialEq,
Ord,
PartialOrd,
Hash,
Getters,
MutGetters,
Serialize,
Deserialize,
new,
)]
#[get = "pub(crate)"]
pub struct Path {
head: Expression,
#[get_mut = "pub(crate)"]
tail: Vec<PathMember>,
}
pub fn did_you_mean(obj_source: &Value, field_tried: &PathMember) -> Option<Vec<(usize, String)>> {
let field_tried = match &field_tried.unspanned {
UnspannedPathMember::String(string) => string.clone(),
UnspannedPathMember::Int(int) => format!("{}", int),
};
impl PrettyDebugWithSource for Path {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
self.head.pretty_debug(source)
+ b::operator(".")
+ b::intersperse(self.tail.iter().map(|m| m.pretty()), b::operator("."))
}
}
impl Path {
pub(crate) fn parts(self) -> (Expression, Vec<PathMember>) {
(self.head, self.tail)
let possibilities = obj_source.data_descriptors();
let mut possible_matches: Vec<_> = possibilities
.into_iter()
.map(|x| {
let word = x;
let distance = natural::distance::levenshtein_distance(&word, &field_tried);
(distance, word)
})
.collect();
if !possible_matches.is_empty() {
possible_matches.sort();
Some(possible_matches)
} else {
None
}
}

View File

@ -0,0 +1,55 @@
use crate::type_name::SpannedTypeName;
use crate::value::dict::Dictionary;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use nu_errors::{CoerceInto, ShellError};
use nu_source::TaggedItem;
impl std::convert::TryFrom<&Value> for i64 {
type Error = ShellError;
fn try_from(value: &Value) -> Result<i64, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::Int(int)) => {
int.tagged(&value.tag).coerce_into("converting to i64")
}
_ => Err(ShellError::type_error("Integer", value.spanned_type_name())),
}
}
}
impl std::convert::TryFrom<&Value> for String {
type Error = ShellError;
fn try_from(value: &Value) -> Result<String, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::String(s)) => Ok(s.clone()),
_ => Err(ShellError::type_error("String", value.spanned_type_name())),
}
}
}
impl std::convert::TryFrom<&Value> for Vec<u8> {
type Error = ShellError;
fn try_from(value: &Value) -> Result<Vec<u8>, ShellError> {
match &value.value {
UntaggedValue::Primitive(Primitive::Binary(b)) => Ok(b.clone()),
_ => Err(ShellError::type_error("Binary", value.spanned_type_name())),
}
}
}
impl<'a> std::convert::TryFrom<&'a Value> for &'a Dictionary {
type Error = ShellError;
fn try_from(value: &'a Value) -> Result<&'a Dictionary, ShellError> {
match &value.value {
UntaggedValue::Row(d) => Ok(d),
_ => Err(ShellError::type_error(
"Dictionary",
value.spanned_type_name(),
)),
}
}
}
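A usage sketch: these `TryFrom` impls let command code coerce a `Value` with `?` and surface a `ShellError` instead of panicking when the type is wrong.

use nu_protocol::{UntaggedValue, Value};
use nu_source::Tag;
use std::convert::TryFrom;

fn demo() {
    let value: Value = UntaggedValue::int(42).into_value(Tag::unknown());
    assert_eq!(i64::try_from(&value).unwrap(), 42);
    // The same Value refuses to coerce into a String.
    assert!(String::try_from(&value).is_err());
}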

View File

@ -1,15 +1,38 @@
use crate::data::base::Primitive;
use crate::traits::PrettyType;
use crate::type_name::PrettyType;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use nu_source::{b, DebugDocBuilder, PrettyDebug};
impl PrettyDebug for &Value {
fn pretty(&self) -> DebugDocBuilder {
PrettyDebug::pretty(*self)
}
}
impl PrettyDebug for Value {
fn pretty(&self) -> DebugDocBuilder {
match &self.value {
UntaggedValue::Primitive(p) => p.pretty(),
UntaggedValue::Row(row) => row.pretty_builder().nest(1).group().into(),
UntaggedValue::Table(table) => {
b::delimit("[", b::intersperse(table, b::space()), "]").nest()
}
UntaggedValue::Error(_) => b::error("error"),
UntaggedValue::Block(_) => b::opaque("block"),
}
}
}
impl PrettyType for Primitive {
fn pretty_type(&self) -> DebugDocBuilder {
match self {
Primitive::Nothing => ty("nothing"),
Primitive::Int(_) => ty("integer"),
Primitive::Range(_) => ty("range"),
Primitive::Decimal(_) => ty("decimal"),
Primitive::Bytes(_) => ty("bytesize"),
Primitive::String(_) => ty("string"),
Primitive::Line(_) => ty("line"),
Primitive::ColumnPath(_) => ty("column-path"),
Primitive::Pattern(_) => ty("pattern"),
Primitive::Boolean(_) => ty("boolean"),
@ -29,8 +52,24 @@ impl PrettyDebug for Primitive {
Primitive::Nothing => b::primitive("nothing"),
Primitive::Int(int) => prim(format_args!("{}", int)),
Primitive::Decimal(decimal) => prim(format_args!("{}", decimal)),
Primitive::Range(range) => {
let (left, left_inclusion) = &range.from;
let (right, right_inclusion) = &range.to;
b::typed(
"range",
(left_inclusion.debug_left_bracket()
+ left.pretty()
+ b::operator(",")
+ b::space()
+ right.pretty()
+ right_inclusion.debug_right_bracket())
.group(),
)
}
Primitive::Bytes(bytes) => primitive_doc(bytes, "bytesize"),
Primitive::String(string) => prim(string),
Primitive::Line(string) => prim(string),
Primitive::ColumnPath(path) => path.pretty(),
Primitive::Pattern(pattern) => primitive_doc(pattern, "pattern"),
Primitive::Boolean(boolean) => match boolean {
@ -51,10 +90,10 @@ fn prim(name: impl std::fmt::Debug) -> DebugDocBuilder {
b::primitive(format!("{:?}", name))
}
fn ty(name: impl std::fmt::Debug) -> DebugDocBuilder {
b::kind(format!("{:?}", name))
}
fn primitive_doc(name: impl std::fmt::Debug, ty: impl Into<String>) -> DebugDocBuilder {
b::primitive(format!("{:?}", name)) + b::delimit("(", b::kind(ty.into()), ")")
}
fn ty(name: impl std::fmt::Debug) -> DebugDocBuilder {
b::kind(format!("{:?}", name))
}

View File

@ -0,0 +1,207 @@
use crate::maybe_owned::MaybeOwned;
use crate::value::primitive::Primitive;
use crate::value::{UntaggedValue, Value};
use derive_new::new;
use getset::Getters;
use indexmap::IndexMap;
use nu_source::{b, DebugDocBuilder, PrettyDebug, Spanned, Tag};
use serde::{Deserialize, Serialize};
use std::cmp::{Ord, Ordering, PartialOrd};
use std::hash::{Hash, Hasher};
#[derive(Debug, Default, Serialize, Deserialize, PartialEq, Eq, Clone, Getters, new)]
pub struct Dictionary {
#[get = "pub"]
pub entries: IndexMap<String, Value>,
}
#[allow(clippy::derive_hash_xor_eq)]
impl Hash for Dictionary {
fn hash<H: Hasher>(&self, state: &mut H) {
let mut entries = self.entries.clone();
entries.sort_keys();
entries.keys().collect::<Vec<&String>>().hash(state);
entries.values().collect::<Vec<&Value>>().hash(state);
}
}
impl PartialOrd for Dictionary {
fn partial_cmp(&self, other: &Dictionary) -> Option<Ordering> {
let this: Vec<&String> = self.entries.keys().collect();
let that: Vec<&String> = other.entries.keys().collect();
if this != that {
return this.partial_cmp(&that);
}
let this: Vec<&Value> = self.entries.values().collect();
let that: Vec<&Value> = self.entries.values().collect();
this.partial_cmp(&that)
}
}
impl Ord for Dictionary {
fn cmp(&self, other: &Dictionary) -> Ordering {
let this: Vec<&String> = self.entries.keys().collect();
let that: Vec<&String> = other.entries.keys().collect();
if this != that {
return this.cmp(&that);
}
let this: Vec<&Value> = self.entries.values().collect();
let that: Vec<&Value> = self.entries.values().collect();
this.cmp(&that)
}
}
impl PartialEq<Value> for Dictionary {
fn eq(&self, other: &Value) -> bool {
match &other.value {
UntaggedValue::Row(d) => self == d,
_ => false,
}
}
}
#[derive(Debug, new)]
struct DebugEntry<'a> {
key: &'a str,
value: &'a Value,
}
impl<'a> PrettyDebug for DebugEntry<'a> {
fn pretty(&self) -> DebugDocBuilder {
(b::key(self.key.to_string()) + b::equals() + self.value.pretty().into_value()).group()
}
}
impl PrettyDebug for Dictionary {
fn pretty(&self) -> DebugDocBuilder {
b::delimit(
"(",
b::intersperse(
self.entries()
.iter()
.map(|(key, value)| DebugEntry::new(key, value)),
b::space(),
),
")",
)
}
}
impl From<IndexMap<String, Value>> for Dictionary {
fn from(input: IndexMap<String, Value>) -> Dictionary {
let mut out = IndexMap::default();
for (key, value) in input {
out.insert(key, value);
}
Dictionary::new(out)
}
}
impl Dictionary {
pub fn get_data(&self, desc: &str) -> MaybeOwned<'_, Value> {
match self.entries.get(desc) {
Some(v) => MaybeOwned::Borrowed(v),
None => MaybeOwned::Owned(
UntaggedValue::Primitive(Primitive::Nothing).into_untagged_value(),
),
}
}
pub fn keys(&self) -> impl Iterator<Item = &String> {
self.entries.keys()
}
pub fn get_data_by_key(&self, name: Spanned<&str>) -> Option<Value> {
let result = self
.entries
.iter()
.find(|(desc_name, _)| *desc_name == name.item)?
.1;
Some(
result
.value
.clone()
.into_value(Tag::new(result.tag.anchor(), name.span)),
)
}
pub fn get_mut_data_by_key(&mut self, name: &str) -> Option<&mut Value> {
match self
.entries
.iter_mut()
.find(|(desc_name, _)| *desc_name == name)
{
Some((_, v)) => Some(v),
None => None,
}
}
pub fn insert_data_at_key(&mut self, name: &str, value: Value) {
self.entries.insert(name.to_string(), value);
}
}
#[derive(Debug)]
pub struct TaggedDictBuilder {
tag: Tag,
dict: IndexMap<String, Value>,
}
impl TaggedDictBuilder {
pub fn new(tag: impl Into<Tag>) -> TaggedDictBuilder {
TaggedDictBuilder {
tag: tag.into(),
dict: IndexMap::default(),
}
}
pub fn build(tag: impl Into<Tag>, block: impl FnOnce(&mut TaggedDictBuilder)) -> Value {
let mut builder = TaggedDictBuilder::new(tag);
block(&mut builder);
builder.into_value()
}
pub fn with_capacity(tag: impl Into<Tag>, n: usize) -> TaggedDictBuilder {
TaggedDictBuilder {
tag: tag.into(),
dict: IndexMap::with_capacity(n),
}
}
pub fn insert_untagged(&mut self, key: impl Into<String>, value: impl Into<UntaggedValue>) {
self.dict
.insert(key.into(), value.into().into_value(&self.tag));
}
pub fn insert_value(&mut self, key: impl Into<String>, value: impl Into<Value>) {
self.dict.insert(key.into(), value.into());
}
pub fn into_value(self) -> Value {
let tag = self.tag.clone();
self.into_untagged_value().into_value(tag)
}
pub fn into_untagged_value(self) -> UntaggedValue {
UntaggedValue::Row(Dictionary { entries: self.dict })
}
pub fn is_empty(&self) -> bool {
self.dict.is_empty()
}
}
impl From<TaggedDictBuilder> for Value {
fn from(input: TaggedDictBuilder) -> Value {
input.into_value()
}
}
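
For orientation, a minimal usage sketch of TaggedDictBuilder as it might appear in calling code; the column names, the values, and the assumption that nu-protocol re-exports these types at its crate root are illustrative, not part of this diff.

use nu_protocol::{TaggedDictBuilder, UntaggedValue, Value};
use nu_source::Tag;

// Hypothetical row with two columns; Tag::unknown() stands in for a real source tag.
fn example_row() -> Value {
    let mut row = TaggedDictBuilder::new(Tag::unknown());
    row.insert_untagged("name", UntaggedValue::string("nushell"));
    row.insert_untagged("stars", UntaggedValue::int(9000));
    row.into_value()
}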

View File

@ -0,0 +1,102 @@
use crate::value::{Primitive, UntaggedValue, Value};
use indexmap::IndexMap;
use nu_errors::ShellError;
use query_interface::{interfaces, vtable_for, Object, ObjectHash};
use serde::{Deserialize, Serialize};
use std::cmp::{Ord, Ordering, PartialOrd};
use std::fmt::Debug;
#[derive(Debug)]
pub struct Scope {
pub it: Value,
pub vars: IndexMap<String, Value>,
}
impl Scope {
pub fn new(it: Value) -> Scope {
Scope {
it,
vars: IndexMap::new(),
}
}
}
impl Scope {
pub fn empty() -> Scope {
Scope {
it: UntaggedValue::Primitive(Primitive::Nothing).into_untagged_value(),
vars: IndexMap::new(),
}
}
pub fn it_value(value: Value) -> Scope {
Scope {
it: value,
vars: IndexMap::new(),
}
}
}
#[typetag::serde(tag = "type")]
pub trait EvaluateTrait: Debug + Send + Sync + Object + ObjectHash + 'static {
fn invoke(&self, scope: &Scope) -> Result<Value, ShellError>;
fn clone_box(&self) -> Evaluate;
}
interfaces!(Evaluate: dyn ObjectHash);
#[typetag::serde]
impl EvaluateTrait for Evaluate {
fn invoke(&self, scope: &Scope) -> Result<Value, ShellError> {
self.expr.invoke(scope)
}
fn clone_box(&self) -> Evaluate {
self.expr.clone_box()
}
}
#[derive(Debug, Serialize, Deserialize)]
pub struct Evaluate {
expr: Box<dyn EvaluateTrait>,
}
impl Evaluate {
pub fn new(evaluate: impl EvaluateTrait) -> Evaluate {
Evaluate {
expr: Box::new(evaluate),
}
}
}
impl std::hash::Hash for Evaluate {
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
self.expr.obj_hash(state)
}
}
impl Clone for Evaluate {
fn clone(&self) -> Evaluate {
self.expr.clone_box()
}
}
impl Ord for Evaluate {
fn cmp(&self, _: &Self) -> Ordering {
Ordering::Equal
}
}
impl PartialOrd for Evaluate {
fn partial_cmp(&self, _: &Evaluate) -> Option<Ordering> {
Some(Ordering::Equal)
}
}
impl PartialEq for Evaluate {
fn eq(&self, _: &Evaluate) -> bool {
true
}
}
impl Eq for Evaluate {}

View File

@ -0,0 +1,173 @@
use crate::type_name::ShellTypeName;
use crate::value::column_path::ColumnPath;
use crate::value::range::Range;
use crate::value::{serde_bigdecimal, serde_bigint};
use bigdecimal::BigDecimal;
use chrono::{DateTime, Utc};
use chrono_humanize::Humanize;
use nu_errors::{ExpectedRange, ShellError};
use nu_source::{PrettyDebug, Span, SpannedItem};
use num_bigint::BigInt;
use num_traits::cast::{FromPrimitive, ToPrimitive};
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
#[derive(Debug, Clone, Ord, PartialOrd, Eq, PartialEq, Hash, Deserialize, Serialize)]
pub enum Primitive {
Nothing,
#[serde(with = "serde_bigint")]
Int(BigInt),
#[serde(with = "serde_bigdecimal")]
Decimal(BigDecimal),
Bytes(u64),
String(String),
Line(String),
ColumnPath(ColumnPath),
Pattern(String),
Boolean(bool),
Date(DateTime<Utc>),
Duration(u64), // Duration in seconds
Range(Box<Range>),
Path(PathBuf),
#[serde(with = "serde_bytes")]
Binary(Vec<u8>),
// Stream markers (used as bookend markers rather than actual values)
BeginningOfStream,
EndOfStream,
}
impl Primitive {
pub fn as_u64(&self, span: Span) -> Result<u64, ShellError> {
match self {
Primitive::Int(int) => match int.to_u64() {
None => Err(ShellError::range_error(
ExpectedRange::U64,
&format!("{}", int).spanned(span),
"converting an integer into a 64-bit integer",
)),
Some(num) => Ok(num),
},
other => Err(ShellError::type_error(
"integer",
other.type_name().spanned(span),
)),
}
}
}
impl From<BigDecimal> for Primitive {
fn from(decimal: BigDecimal) -> Primitive {
Primitive::Decimal(decimal)
}
}
impl From<f64> for Primitive {
fn from(float: f64) -> Primitive {
if let Some(f) = BigDecimal::from_f64(float) {
Primitive::Decimal(f)
} else {
unreachable!("Internal error: protocol did not use f64-compatible decimal")
}
}
}
impl ShellTypeName for Primitive {
fn type_name(&self) -> &'static str {
match self {
Primitive::Nothing => "nothing",
Primitive::Int(_) => "integer",
Primitive::Range(_) => "range",
Primitive::Decimal(_) => "decimal",
Primitive::Bytes(_) => "bytes",
Primitive::String(_) => "string",
Primitive::Line(_) => "line",
Primitive::ColumnPath(_) => "column path",
Primitive::Pattern(_) => "pattern",
Primitive::Boolean(_) => "boolean",
Primitive::Date(_) => "date",
Primitive::Duration(_) => "duration",
Primitive::Path(_) => "file path",
Primitive::Binary(_) => "binary",
Primitive::BeginningOfStream => "marker<beginning of stream>",
Primitive::EndOfStream => "marker<end of stream>",
}
}
}
pub fn format_primitive(primitive: &Primitive, field_name: Option<&String>) -> String {
match primitive {
Primitive::Nothing => String::new(),
Primitive::BeginningOfStream => String::new(),
Primitive::EndOfStream => String::new(),
Primitive::Path(p) => format!("{}", p.display()),
Primitive::Bytes(b) => {
let byte = byte_unit::Byte::from_bytes(*b as u128);
if byte.get_bytes() == 0u128 {
return "".to_string();
}
let byte = byte.get_appropriate_unit(false);
match byte.get_unit() {
byte_unit::ByteUnit::B => format!("{} B ", byte.get_value()),
_ => byte.format(1),
}
}
Primitive::Duration(sec) => format_duration(*sec),
Primitive::Int(i) => i.to_string(),
Primitive::Decimal(decimal) => format!("{:.4}", decimal),
Primitive::Range(range) => format!(
"{}..{}",
format_primitive(&range.from.0.item, None),
format_primitive(&range.to.0.item, None)
),
Primitive::Pattern(s) => s.to_string(),
Primitive::String(s) => s.to_owned(),
Primitive::Line(s) => s.to_owned(),
Primitive::ColumnPath(p) => {
let mut members = p.iter();
let mut f = String::new();
f.push_str(
&members
.next()
.expect("BUG: column path with zero members")
.display(),
);
for member in members {
f.push_str(".");
f.push_str(&member.display())
}
f
}
Primitive::Boolean(b) => match (b, field_name) {
(true, None) => "Yes",
(false, None) => "No",
(true, Some(s)) if !s.is_empty() => s,
(false, Some(s)) if !s.is_empty() => "",
(true, Some(_)) => "Yes",
(false, Some(_)) => "No",
}
.to_owned(),
Primitive::Binary(_) => "<binary>".to_owned(),
Primitive::Date(d) => d.humanize(),
}
}
pub fn format_duration(sec: u64) -> String {
let (minutes, seconds) = (sec / 60, sec % 60);
let (hours, minutes) = (minutes / 60, minutes % 60);
let (days, hours) = (hours / 24, hours % 24);
match (days, hours, minutes, seconds) {
(0, 0, 0, 1) => "1 sec".to_owned(),
(0, 0, 0, s) => format!("{} secs", s),
(0, 0, m, s) => format!("{}:{:02}", m, s),
(0, h, m, s) => format!("{}:{:02}:{:02}", h, m, s),
(d, h, m, s) => format!("{}:{:02}:{:02}:{:02}", d, h, m, s),
}
}
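
A hedged sketch of unit tests for this module, showing how as_u64, format_primitive, and format_duration behave for a few representative inputs; Span::unknown() and the chosen values are assumptions made only for illustration.

#[cfg(test)]
mod tests {
    use super::{format_duration, format_primitive, Primitive};
    use nu_source::Span;
    use num_bigint::BigInt;

    #[test]
    fn as_u64_accepts_in_range_integers() {
        // 42 fits in a u64, so the conversion succeeds.
        let prim = Primitive::Int(BigInt::from(42));
        assert_eq!(prim.as_u64(Span::unknown()).expect("in range"), 42);
    }

    #[test]
    fn formats_a_few_primitives() {
        assert_eq!(format_primitive(&Primitive::Nothing, None), "");
        assert_eq!(format_primitive(&Primitive::Boolean(true), None), "Yes");
        assert_eq!(format_primitive(&Primitive::String("hi".into()), None), "hi");
    }

    #[test]
    fn formats_durations_at_each_scale() {
        assert_eq!(format_duration(1), "1 sec");
        assert_eq!(format_duration(59), "59 secs");
        assert_eq!(format_duration(90), "1:30"); // minutes:seconds
        assert_eq!(format_duration(3_661), "1:01:01"); // hours:minutes:seconds
        assert_eq!(format_duration(90_061), "1:01:01:01"); // days:hours:minutes:seconds
    }
}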

View File

@ -0,0 +1,32 @@
use crate::value::Primitive;
use derive_new::new;
use nu_source::{b, DebugDocBuilder, Spanned};
use serde::{Deserialize, Serialize};
#[derive(Debug, Copy, Clone, PartialEq, PartialOrd, Eq, Ord, Hash, Serialize, Deserialize)]
pub enum RangeInclusion {
Inclusive,
Exclusive,
}
impl RangeInclusion {
pub fn debug_left_bracket(self) -> DebugDocBuilder {
b::delimiter(match self {
RangeInclusion::Exclusive => "(",
RangeInclusion::Inclusive => "[",
})
}
pub fn debug_right_bracket(self) -> DebugDocBuilder {
b::delimiter(match self {
RangeInclusion::Exclusive => ")",
RangeInclusion::Inclusive => "]",
})
}
}
#[derive(Debug, Clone, PartialEq, PartialOrd, Eq, Ord, Hash, Serialize, Deserialize, new)]
pub struct Range {
pub from: (Spanned<Primitive>, RangeInclusion),
pub to: (Spanned<Primitive>, RangeInclusion),
}
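
A minimal construction sketch for Range, assuming SpannedItem and Span::unknown() from nu_source and the num_bigint dependency already used elsewhere in this crate; the endpoints are illustrative.

use nu_source::{Span, SpannedItem};
use num_bigint::BigInt;

// Hypothetical range 1..5: inclusive start, exclusive end, placeholder spans.
fn one_to_five() -> Range {
    let from = Primitive::Int(BigInt::from(1)).spanned(Span::unknown());
    let to = Primitive::Int(BigInt::from(5)).spanned(Span::unknown());
    Range::new(
        (from, RangeInclusion::Inclusive),
        (to, RangeInclusion::Exclusive),
    )
}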

View File

@ -0,0 +1,24 @@
use bigdecimal::BigDecimal;
use num_traits::cast::FromPrimitive;
use num_traits::cast::ToPrimitive;
pub fn serialize<S>(big_decimal: &BigDecimal, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serde::Serialize::serialize(
&big_decimal
.to_f64()
.ok_or_else(|| serde::ser::Error::custom("expected a f64-sized bignum"))?,
serializer,
)
}
pub fn deserialize<'de, D>(deserializer: D) -> Result<BigDecimal, D::Error>
where
D: serde::Deserializer<'de>,
{
let x: f64 = serde::Deserialize::deserialize(deserializer)?;
Ok(BigDecimal::from_f64(x)
.ok_or_else(|| serde::de::Error::custom("expected a f64-sized bigdecimal"))?)
}
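
For reference, a hedged sketch of how a field opts into this helper via #[serde(with = ...)], mirroring the attribute on Primitive::Decimal earlier in this diff; the Price struct and its module location are made up for illustration.

use bigdecimal::BigDecimal;
use serde::{Deserialize, Serialize};

// Hypothetical struct inside nu-protocol: the field round-trips through the
// f64-backed serialize/deserialize functions defined above.
#[derive(Serialize, Deserialize)]
struct Price {
    #[serde(with = "crate::value::serde_bigdecimal")]
    amount: BigDecimal,
}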

View File

@ -0,0 +1,24 @@
use num_bigint::BigInt;
use num_traits::cast::FromPrimitive;
use num_traits::cast::ToPrimitive;
pub fn serialize<S>(big_int: &BigInt, serializer: S) -> Result<S::Ok, S::Error>
where
S: serde::Serializer,
{
serde::Serialize::serialize(
&big_int
.to_i64()
.ok_or_else(|| serde::ser::Error::custom("expected a i64-sized bignum"))?,
serializer,
)
}
pub fn deserialize<'de, D>(deserializer: D) -> Result<BigInt, D::Error>
where
D: serde::Deserializer<'de>,
{
let x: i64 = serde::Deserialize::deserialize(deserializer)?;
Ok(BigInt::from_i64(x)
.ok_or_else(|| serde::de::Error::custom("expected a i64-sized bignum"))?)
}

View File

@ -1,16 +1,16 @@
[package]
name = "nu-source"
version = "0.1.0"
authors = ["Yehuda Katz <wycats@gmail.com>"]
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A source string characterizer for Nushell"
license = "MIT"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.102", features = ["derive"] }
serde = { version = "1.0.103", features = ["derive"] }
derive-new = "0.5.8"
getset = "0.0.9"
nom_locate = "1.0.0"
@ -18,3 +18,6 @@ nom-tracable = "0.4.1"
language-reporting = "0.4.0"
termcolor = "1.0.5"
pretty = "0.5.2"
[build-dependencies]
nu-build = { version = "0.8.0", path = "../nu-build" }

View File

@ -0,0 +1,3 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -11,5 +11,6 @@ pub use self::meta::{
pub use self::pretty::{
b, DebugDoc, DebugDocBuilder, PrettyDebug, PrettyDebugWithSource, ShellAnnotation,
};
pub use self::term_colored::TermColored;
pub use self::text::Text;
pub use self::tracable::{nom_input, NomSpan, TracableContext};

View File

@ -38,7 +38,7 @@ impl Spanned<String> {
pub fn items<'a, U>(
items: impl Iterator<Item = &'a Spanned<String>>,
) -> impl Iterator<Item = &'a str> {
items.into_iter().map(|item| &item.item[..])
items.map(|item| &item.item[..])
}
}
@ -156,7 +156,7 @@ impl<T> Tagged<T> {
Tagged {
item: self.item,
tag: tag,
tag,
}
}
@ -220,10 +220,7 @@ impl<T>
nom_locate::LocatedSpanEx<T, u64>,
),
) -> Span {
Span {
start: input.0.offset,
end: input.1.offset,
}
Span::new(input.0.offset, input.1.offset)
}
}
@ -235,10 +232,7 @@ impl From<(usize, usize)> for Span {
impl From<&std::ops::Range<usize>> for Span {
fn from(input: &std::ops::Range<usize>) -> Span {
Span {
start: input.start,
end: input.end,
}
Span::new(input.start, input.end)
}
}
@ -321,10 +315,7 @@ impl Tag {
pub fn for_char(pos: usize, anchor: AnchorLocation) -> Tag {
Tag {
anchor: Some(anchor),
span: Span {
start: pos,
end: pos + 1,
},
span: Span::new(pos, pos + 1),
}
}
@ -528,11 +519,19 @@ impl Span {
impl language_reporting::ReportingSpan for Span {
fn with_start(&self, start: usize) -> Self {
Span::new(start, self.end)
if self.end < start {
Span::new(start, start)
} else {
Span::new(start, self.end)
}
}
fn with_end(&self, end: usize) -> Self {
Span::new(self.start, end)
if end < self.start {
Span::new(end, end)
} else {
Span::new(self.start, end)
}
}
fn start(&self) -> usize {

View File

@ -135,7 +135,7 @@ impl DebugDocBuilder {
DebugDocBuilder::styled(string, ShellStyle::Value)
}
pub fn as_value(self) -> DebugDocBuilder {
pub fn into_value(self) -> DebugDocBuilder {
self.inner
.annotate(ShellAnnotation::style(ShellStyle::Value))
.into()
@ -149,7 +149,7 @@ impl DebugDocBuilder {
DebugDocBuilder::styled(string, ShellStyle::Kind)
}
pub fn as_kind(self) -> DebugDocBuilder {
pub fn into_kind(self) -> DebugDocBuilder {
self.inner
.annotate(ShellAnnotation::style(ShellStyle::Kind))
.into()
@ -316,7 +316,7 @@ impl DebugDocBuilder {
result = result + item;
}
result.into()
result
}
fn styled(string: impl std::fmt::Display, style: ShellStyle) -> DebugDocBuilder {
@ -414,8 +414,7 @@ pub trait PrettyDebug {
let doc = self.pretty_doc();
let mut buffer = termcolor::Buffer::no_color();
doc.render_raw(width, &mut TermColored::new(&mut buffer))
.unwrap();
let _ = doc.render_raw(width, &mut TermColored::new(&mut buffer));
String::from_utf8_lossy(buffer.as_slice()).to_string()
}
@ -424,8 +423,7 @@ pub trait PrettyDebug {
let doc = self.pretty_doc();
let mut buffer = termcolor::Buffer::ansi();
doc.render_raw(width, &mut TermColored::new(&mut buffer))
.unwrap();
let _ = doc.render_raw(width, &mut TermColored::new(&mut buffer));
String::from_utf8_lossy(buffer.as_slice()).to_string()
}
@ -488,6 +486,7 @@ fn hash_doc<H: std::hash::Hasher>(doc: &PrettyDebugDoc, state: &mut H) {
}
}
#[allow(clippy::derive_hash_xor_eq)]
impl std::hash::Hash for DebugDoc {
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
hash_doc(&self.inner, state);

View File

@ -0,0 +1,17 @@
[package]
name = "nu-test-support"
version = "0.8.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A source string characterizer for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
app_dirs = "1.2.1"
dunce = "1.0.0"
getset = "0.0.9"
glob = "0.3.0"
tempfile = "3.1.0"

View File

@ -0,0 +1,232 @@
use std::io::Read;
use std::ops::Div;
use std::path::{Path, PathBuf};
pub struct AbsoluteFile {
inner: PathBuf,
}
impl AbsoluteFile {
pub fn new(path: impl AsRef<Path>) -> AbsoluteFile {
let path = path.as_ref();
if !path.is_absolute() {
panic!(
"AbsoluteFile::new must take an absolute path :: {}",
path.display()
)
} else if path.is_dir() {
// At the moment, this is not an invariant, but rather a way to catch bugs
// in tests.
panic!(
"AbsoluteFile::new must not take a directory :: {}",
path.display()
)
} else {
AbsoluteFile {
inner: path.to_path_buf(),
}
}
}
pub fn dir(&self) -> AbsolutePath {
AbsolutePath::new(if let Some(parent) = self.inner.parent() {
parent
} else {
unreachable!("Internal error: could not get parent in dir")
})
}
}
impl From<AbsoluteFile> for PathBuf {
fn from(file: AbsoluteFile) -> Self {
file.inner
}
}
pub struct AbsolutePath {
inner: PathBuf,
}
impl AbsolutePath {
pub fn new(path: impl AsRef<Path>) -> AbsolutePath {
let path = path.as_ref();
if path.is_absolute() {
AbsolutePath {
inner: path.to_path_buf(),
}
} else {
panic!("AbsolutePath::new must take an absolute path")
}
}
}
impl Div<&str> for &AbsolutePath {
type Output = AbsolutePath;
fn div(self, rhs: &str) -> Self::Output {
let parts = rhs.split('/');
let mut result = self.inner.clone();
for part in parts {
result = result.join(part);
}
AbsolutePath::new(result)
}
}
impl AsRef<Path> for AbsolutePath {
fn as_ref(&self) -> &Path {
self.inner.as_path()
}
}
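
The Div overloads let test code build nested paths with the / operator; a minimal sketch, assuming a Unix-style absolute base path purely for illustration.

use nu_test_support::fs::AbsolutePath;

#[cfg(unix)]
#[test]
fn appends_nested_segments_with_div() {
    // "/tmp" is only an example; AbsolutePath::new panics on relative paths.
    let base = AbsolutePath::new("/tmp");
    let nested = &base / "sub/dir/file.txt";
    assert!(nested.as_ref().ends_with("sub/dir/file.txt"));
}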
pub struct RelativePath {
inner: PathBuf,
}
impl RelativePath {
pub fn new(path: impl Into<PathBuf>) -> RelativePath {
let path = path.into();
if path.is_relative() {
RelativePath { inner: path }
} else {
panic!("RelativePath::new must take a relative path")
}
}
}
impl<T: AsRef<str>> Div<T> for &RelativePath {
type Output = RelativePath;
fn div(self, rhs: T) -> Self::Output {
let parts = rhs.as_ref().split('/');
let mut result = self.inner.clone();
for part in parts {
result = result.join(part);
}
RelativePath::new(result)
}
}
pub trait DisplayPath {
fn display_path(&self) -> String;
}
impl DisplayPath for AbsolutePath {
fn display_path(&self) -> String {
self.inner.display().to_string()
}
}
impl DisplayPath for PathBuf {
fn display_path(&self) -> String {
self.display().to_string()
}
}
impl DisplayPath for str {
fn display_path(&self) -> String {
self.to_string()
}
}
impl DisplayPath for &str {
fn display_path(&self) -> String {
(*self).to_string()
}
}
impl DisplayPath for String {
fn display_path(&self) -> String {
self.clone()
}
}
impl DisplayPath for &String {
fn display_path(&self) -> String {
(*self).to_string()
}
}
pub enum Stub<'a> {
FileWithContent(&'a str, &'a str),
FileWithContentToBeTrimmed(&'a str, &'a str),
EmptyFile(&'a str),
}
pub fn file_contents(full_path: impl AsRef<Path>) -> String {
let mut file = std::fs::File::open(full_path.as_ref()).expect("can not open file");
let mut contents = String::new();
file.read_to_string(&mut contents)
.expect("can not read file");
contents
}
pub fn file_contents_binary(full_path: impl AsRef<Path>) -> Vec<u8> {
let mut file = std::fs::File::open(full_path.as_ref()).expect("can not open file");
let mut contents = Vec::new();
file.read_to_end(&mut contents).expect("can not read file");
contents
}
pub fn line_ending() -> String {
#[cfg(windows)]
{
String::from("\r\n")
}
#[cfg(not(windows))]
{
String::from("\n")
}
}
pub fn delete_file_at(full_path: impl AsRef<Path>) {
let full_path = full_path.as_ref();
if full_path.exists() {
std::fs::remove_file(full_path).expect("can not delete file");
}
}
pub fn create_file_at(full_path: impl AsRef<Path>) -> Result<(), std::io::Error> {
let full_path = full_path.as_ref();
if let Some(parent) = full_path.parent() {
// The parent directory must already exist for the write below to succeed.
assert!(parent.exists(), "{:?} does not exist", parent.display());
}
std::fs::write(full_path, b"fake data")
}
pub fn copy_file_to(source: &str, destination: &str) {
std::fs::copy(source, destination).expect("can not copy file");
}
pub fn files_exist_at(files: Vec<impl AsRef<Path>>, path: impl AsRef<Path>) -> bool {
files.iter().all(|f| {
let mut loc = PathBuf::from(path.as_ref());
loc.push(f);
loc.exists()
})
}
pub fn delete_directory_at(full_path: &str) {
std::fs::remove_dir_all(PathBuf::from(full_path)).expect("can not remove directory");
}
pub fn executable_path() -> PathBuf {
let mut buf = PathBuf::new();
buf.push("target");
buf.push("debug");
buf.push("nu");
buf
}
pub fn in_directory(str: impl AsRef<Path>) -> String {
str.as_ref().display().to_string()
}

View File

@ -0,0 +1,38 @@
pub mod fs;
pub mod macros;
pub mod playground;
pub fn pipeline(commands: &str) -> String {
commands
.lines()
.skip(1)
.map(|line| line.trim())
.collect::<Vec<&str>>()
.join(" ")
.trim_end()
.to_string()
}
#[cfg(test)]
mod tests {
use super::pipeline;
#[test]
fn constructs_a_pipeline() {
let actual = pipeline(
r#"
open los_tres_amigos.txt
| from-csv
| get rusty_luck
| str --to-int
| sum
| echo "$it"
"#,
);
assert_eq!(
actual,
r#"open los_tres_amigos.txt | from-csv | get rusty_luck | str --to-int | sum | echo "$it""#
);
}
}

View File

@ -0,0 +1,105 @@
#[macro_export]
macro_rules! nu {
(cwd: $cwd:expr, $path:expr, $($part:expr),*) => {{
use $crate::fs::DisplayPath;
let path = format!($path, $(
$part.display_path()
),*);
nu!($cwd, &path)
}};
(cwd: $cwd:expr, $path:expr) => {{
nu!($cwd, $path)
}};
($cwd:expr, $path:expr) => {{
pub use std::error::Error;
pub use std::io::prelude::*;
pub use std::process::{Command, Stdio};
let commands = &*format!(
"
cd {}
{}
exit",
$crate::fs::in_directory($cwd),
$crate::fs::DisplayPath::display_path(&$path)
);
let mut process = match Command::new($crate::fs::executable_path())
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.spawn()
{
Ok(child) => child,
Err(why) => panic!("Can't run test {}", why.to_string()),
};
let stdin = process.stdin.as_mut().expect("couldn't open stdin");
stdin
.write_all(commands.as_bytes())
.expect("couldn't write to stdin");
let output = process
.wait_with_output()
.expect("couldn't read from stdout");
let out = String::from_utf8_lossy(&output.stdout);
let out = out.replace("\r\n", "");
let out = out.replace("\n", "");
out
}};
}
#[macro_export]
macro_rules! nu_error {
(cwd: $cwd:expr, $path:expr, $($part:expr),*) => {{
use $crate::fs::DisplayPath;
let path = format!($path, $(
$part.display_path()
),*);
nu_error!($cwd, &path)
}};
(cwd: $cwd:expr, $path:expr) => {{
nu_error!($cwd, $path)
}};
($cwd:expr, $path:expr) => {{
pub use std::error::Error;
pub use std::io::prelude::*;
pub use std::process::{Command, Stdio};
let commands = &*format!(
"
cd {}
{}
exit",
$crate::fs::in_directory($cwd),
$crate::fs::DisplayPath::display_path(&$path)
);
let mut process = Command::new($crate::fs::executable_path())
.stdin(Stdio::piped())
.stderr(Stdio::piped())
.spawn()
.expect("couldn't run test");
let stdin = process.stdin.as_mut().expect("couldn't open stdin");
stdin
.write_all(commands.as_bytes())
.expect("couldn't write to stdin");
let output = process
.wait_with_output()
.expect("couldn't read from stderr");
let out = String::from_utf8_lossy(&output.stderr);
out.into_owned()
}};
}
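
A minimal sketch of how an integration test might call the nu! macro defined above; the working directory and the echoed value are illustrative, and the assertion assumes the newline-stripped stdout capture shown in the macro body.

use nu_test_support::nu;

#[test]
fn echoes_a_value_back() {
    // Hypothetical test: run one pipeline in a fixture directory and compare
    // the captured, newline-stripped stdout.
    let actual = nu!(
        cwd: "tests/fixtures/formats",
        "echo hello"
    );

    assert_eq!(actual, "hello");
}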

View File

@ -0,0 +1,156 @@
use crate::fs::line_ending;
use crate::fs::Stub;
use getset::Getters;
use glob::glob;
use std::path::{Path, PathBuf};
use tempfile::{tempdir, TempDir};
pub struct Playground {
root: TempDir,
tests: String,
cwd: PathBuf,
}
#[derive(Getters)]
#[get = "pub"]
pub struct Dirs {
pub root: PathBuf,
pub test: PathBuf,
pub fixtures: PathBuf,
}
impl Dirs {
pub fn formats(&self) -> PathBuf {
self.fixtures.join("formats")
}
}
impl Playground {
pub fn root(&self) -> &Path {
self.root.path()
}
pub fn back_to_playground(&mut self) -> &mut Self {
self.cwd = PathBuf::from(self.root()).join(self.tests.clone());
self
}
pub fn setup(topic: &str, block: impl FnOnce(Dirs, &mut Playground)) {
let root = tempdir().expect("Couldn't create a tempdir");
let nuplay_dir = root.path().join(topic);
if PathBuf::from(&nuplay_dir).exists() {
std::fs::remove_dir_all(PathBuf::from(&nuplay_dir)).expect("can not remove directory");
}
std::fs::create_dir(PathBuf::from(&nuplay_dir)).expect("can not create directory");
let mut playground = Playground {
root,
tests: topic.to_string(),
cwd: nuplay_dir,
};
let project_root = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
let playground_root = playground.root.path();
let fixtures = project_root;
let fixtures = fixtures
.parent()
.expect("Couldn't find the fixtures directory")
.parent()
.expect("Couldn't find the fixtures directory")
.join("tests/fixtures");
let fixtures = dunce::canonicalize(fixtures.clone()).unwrap_or_else(|e| {
panic!(
"Couldn't canonicalize fixtures path {}: {:?}",
fixtures.display(),
e
)
});
let test = dunce::canonicalize(playground_root.join(topic)).unwrap_or_else(|e| {
panic!(
"Couldn't canonicalize test path {}: {:?}",
playground_root.join(topic).display(),
e
)
});
let root = dunce::canonicalize(playground_root).unwrap_or_else(|e| {
panic!(
"Couldn't canonicalize tests root path {}: {:?}",
playground_root.display(),
e
)
});
let dirs = Dirs {
root,
test,
fixtures,
};
block(dirs, &mut playground);
}
pub fn mkdir(&mut self, directory: &str) -> &mut Self {
self.cwd.push(directory);
std::fs::create_dir_all(&self.cwd).expect("can not create directory");
self.back_to_playground();
self
}
pub fn with_files(&mut self, files: Vec<Stub>) -> &mut Self {
let endl = line_ending();
files
.iter()
.map(|f| {
let mut path = PathBuf::from(&self.cwd);
let (file_name, contents) = match *f {
Stub::EmptyFile(name) => (name, "fake data".to_string()),
Stub::FileWithContent(name, content) => (name, content.to_string()),
Stub::FileWithContentToBeTrimmed(name, content) => (
name,
content
.lines()
.skip(1)
.map(|line| line.trim())
.collect::<Vec<&str>>()
.join(&endl),
),
};
path.push(file_name);
std::fs::write(path, contents.as_bytes()).expect("can not create file");
})
.for_each(drop);
self.back_to_playground();
self
}
pub fn within(&mut self, directory: &str) -> &mut Self {
self.cwd.push(directory);
std::fs::create_dir(&self.cwd).expect("can not create directory");
self
}
pub fn glob_vec(pattern: &str) -> Vec<PathBuf> {
let glob = glob(pattern);
glob.expect("invalid pattern")
.map(|path| {
if let Ok(path) = path {
path
} else {
unreachable!()
}
})
.collect()
}
}
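
To round out the file, a hedged sketch of the intended Playground flow (set up an isolated sandbox, drop stub files into it, and inspect them through Dirs); the topic name and file contents are made up for illustration.

use nu_test_support::fs::Stub::FileWithContentToBeTrimmed;
use nu_test_support::playground::Playground;

#[test]
fn playground_sets_up_an_isolated_sandbox() {
    Playground::setup("playground_example", |dirs, sandbox| {
        // with_files trims the leading indentation of each stub's content.
        sandbox.with_files(vec![FileWithContentToBeTrimmed(
            "notes.txt",
            r#"
                first line
                second line
            "#,
        )]);

        assert!(dirs.test().join("notes.txt").exists());
    });
}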

Some files were not shown because too many files have changed in this diff.