Compare commits

...

156 Commits

Author SHA1 Message Date
87d71604ad Bump to 0.18.1 (#2335) 2020-08-12 15:59:28 +12:00
e372e7c448 Display built features. Long/Short commit hashes display removed due to cargo publishing. (#2333)
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>

Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-08-12 15:13:33 +12:00
0194dee3a6 Updated version hashing and bumped nu-cli to 0.18.1 (#2331)
* Updated version hashing and bumped nu-cli to 0.18.1

* made code prettier

* Update version.rs

* Update version.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-08-12 09:43:23 +12:00
cc3c10867c Histogram no longer requires a wrap command before it on unnamed columns (#2332) 2020-08-12 09:42:59 +12:00
3c18169f63 Autoenv fix: Exitscripts incorrectly running when visiting a subdirectory (#2326)
* Add test case for issue

* Preliminary fix

* fmt

* Reorder asserts

* move insertion

* Touch nu-env.toml

* Cleanup

* touch nu-env toml

* Remove touch

* Change feature flags
2020-08-12 07:54:49 +12:00
43e9c89125 moved theme assets local to crate (#2329)
* moved theme assets local to crate

* remove the TODO comment
2020-08-11 13:57:03 -05:00
2ad07912d9 Bump to 0.18 (#2325) 2020-08-11 18:44:53 +12:00
51ad019495 Update Cargo.lock (#2322) 2020-08-11 15:00:07 +12:00
9264325e57 Make history file location configurable (#2320)
* Make history location configurable

Add history-path to your config if you want an alternate history file
location

* use IndexMap.get() instead of index

Co-authored-by: Amanita Muscaria <nope>
2020-08-11 13:58:53 +12:00
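As a rough illustration of the option described above (the `history-path` key comes from the commit message; the path value is hypothetical):

```
# illustrative only: point history at an alternate file
> config set history-path "/home/me/.nu_history.txt"
```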
901157341b Bumped which-rs to 4.0.2 (#2321)
This should fix #1541
2020-08-11 13:55:43 +12:00
eb766b80c1 added pkg_mgr/winget + yaml (#2297) 2020-08-11 05:45:53 +12:00
f0dbffd761 Add 228 json html themes for to html (#2308)
* add 228 json html themes
removed old assets, added new zipped asset
added --list to get a list of the theme names
reworked some older theme code
added rust-embed and zip crate
removed the dark tests

* fmt

* Updated, removed excess comments
Changed usage a bit
Updated the error handling
Added some helper items in --list
2020-08-11 05:43:16 +12:00
f14c0df582 Allow disabling welcome message on launch (#2314)
* Implements #2313
2020-08-09 11:38:21 +12:00
362bb1bea3 Add commit hash to version command (#2312)
* Add commit to version command

* Replace unwrap with expect.
2020-08-08 17:39:34 +12:00
724b177c97 Sample variance and Sample standard deviation. (#2310) 2020-08-06 23:56:19 -05:00
50343f2d6a Add stderr back when using do -i (#2309)
* Add stderr back when using do -i

* Add stderr back when using do -i
2020-08-07 16:53:37 +12:00
3122525b96 removed rustyline config duplication (#2306)
* removed rustyline config duplication
set other rustyline defaults if line_editor section doesn't exist
updated keyseq_timeout to -1 if emacs mode is chosen

* change checking rustyline config to if lets

* removed some unnecessary code
2020-08-05 16:34:28 -05:00
8232c6f185 Update rustyline defaults (#2305)
Use rustyline defaults if no config exists for line_editor (for most options)
2020-08-05 13:05:13 -05:00
6202705eb6 parse most common date formats using dtparse crate (#2303)
* use dtparse crate to parse most common date formats

* use dtparse crate to parse most common date formats - cargo fmt
2020-08-05 12:44:52 +12:00
e1c5940b04 Add command "reduce" (#2292)
* initial

* fold working

* tests and cleanup

* change command to reduce, with fold flag

* move complex example to tests

* add --numbered flag
2020-08-05 05:16:19 +12:00
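A sketch of how the flags above combine (block syntax approximated for this era of Nu):

```
# sum a stream; $acc is the accumulator, $it the current item
> echo 1 2 3 4 | reduce { = $acc + $it }
# seed the accumulator with --fold
> echo 1 2 3 4 | reduce --fold 10 { = $acc + $it }
```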
7f35bfc005 histogram: support regular values. (#2300) 2020-08-04 04:57:25 -05:00
c48c092125 String funcs - Contains and IndexOf (#2298)
* Contains and index of string functions

* Clippy and fmt
2020-08-04 18:36:51 +12:00
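Illustrative calls for the two new string functions (the `index-of` spelling is assumed from the PR title):

```
# substring test
> echo "nushell" | str contains "shell"
# position of the first match
> echo "nushell" | str index-of "shell"
```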
028fc9b9cd Data summarize reporting overhaul. (#2299)
Refactored out most of the internal work for summarizing data, opening
the door for generating charts from it. A model is introduced to hold
the information needed for a summary; the Histogram command is an
example of a partial usage. This is the beginning.

Removed the implicit arithmetic traits on Value and Primitive to avoid
mixed-type panics. The std operations traits can't fail, but we can't
guarantee that our operations won't; we can handle this gracefully now
since compute_values was introduced after the parser changes four months
ago. The handling logic should be taken care of either explicitly or in
compute_values.

The zero identity trait was also removed (implementing it forced us to
also implement Add, Mult, etc.).

Also: the `math` operations now omit a column from the output if it is not computable:

```
> ls | math sum
──────┬──────────
 size │ 150.9 KB
──────┴──────────
```
2020-08-03 17:47:19 -05:00
eeb9b4edcb Match cleanup (#2294)
* Delete unnecessary match

* Use `unwrap_or_else()`

* Whitespace was trimmed on file save

* Use `map_or_else()`

* Use a default to group all match arms with same output

* Clippy made me do it
2020-08-04 05:43:27 +12:00
3a7869b422 Switch to maintained app_dirs (#2293)
* Switch to maintained app_dirs

* Update app_dirs under old name
2020-08-04 05:41:57 +12:00
c48ea46c4f Match cleanup (#2290) 2020-08-02 18:34:33 -04:00
f33da33626 Add --partial to 'to html' (#2291) 2020-08-03 08:47:54 +12:00
a88f5c7ae7 Make str collect take an optional separator value (#2289)
* Make `str collect` take an optional separator value

* Make `str collect` take an optional separator value

* Add some tests

* Fix my tests
2020-08-02 19:29:29 +12:00
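A minimal sketch of the optional separator:

```
# join the piped strings with a dash; omit the argument to simply concatenate them
> echo a b c | str collect "-"
```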
cda53b6cda Return incomplete parse from lite_parse (#2284)
* Move lite_parse tests into a submodule

* Have lite_parse return partial parses when error encountered.

Although a parse fails, we can generally still return what was successfully
parsed. This is useful, for example, when figuring out completions at some
cursor position, because we can map the cursor to something more structured
(e.g., cursor is at a flag name).
2020-08-02 06:39:55 +12:00
ee734873ba Fix no longer working histogram example (#2271)
* Fix no longer working histogram example

* Oops
2020-08-02 06:38:45 +12:00
9fb6f5cd09 Change f/full flag to l/long for ls and ps commands (#2283)
* Change `f`/`full` flag to `l`/`long` for `ls` and `ps` commands

* Fix a few more `--full` instances
2020-08-02 06:30:45 +12:00
4ef15b5f80 docs/alias: simplify the 'persistent' section, using --save (#2285)
All the workarounds using `config` aren't necessary anymore. Only `config path` is still of interest.
2020-08-01 08:11:26 -04:00
ba81278ffd Remove build.rs and nu-build (#2282) 2020-08-01 09:21:10 +12:00
10fbed3808 updated cmd builtin commands (#2266)
* updated cmd builtin commands

* removed cd, chdir, exit, prompt, rem

* remove more commands, what remains is useful

* cargo fmt is so finicky
2020-07-31 09:51:42 +12:00
16cfc36aec set default edit_mode to emacs instead of vi (#2278) 2020-07-30 14:59:20 -05:00
aca7f71737 🐛 Fix path command error messages (#2261). (#2276) 2020-07-31 06:51:14 +12:00
3282a509a9 Make insert take in a block (#2265)
* Make insert take in a block

* Add some tests
2020-07-30 16:58:54 +12:00
878b748a41 Add list output for to html (#2273) 2020-07-30 16:54:55 +12:00
18a4505b9b starts_with ends_with match functions for string (#2269) 2020-07-30 16:51:20 +12:00
26e77a4b05 Add url commands (#2274)
* scheme
* path
* query
* host
2020-07-30 08:56:56 +12:00
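Illustrative usage of the new subcommands listed above:

```
# extract the host portion of a URL; scheme, path and query work the same way
> echo "https://www.nushell.sh/book/" | url host
```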
37f10cf273 Add two further path cmds - type and exists (#2264)
* Add two further path cmds - type and exists

* Update type.rs

Try a more universal directory

* Update type.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-27 14:12:07 +12:00
5e0a9aecaa ltrim and rtrim for string (#2262)
* Trim string from left and right

* Move trim to folder

* fmt

* Clippy
2020-07-27 06:09:35 +12:00
7e2c627044 WIP: Path utility commands (#2255)
* Add new path commands

basename, expand and extension. Currently there is no real error
handling. expand returns the initial path if it didn't work, the others
return empty string

* Optionally apply to path
2020-07-26 07:29:15 +12:00
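A small sketch of the new subcommands (error handling is minimal, as noted above):

```
# extension and basename of a path string
> echo "crates/nu-cli/src/cli.rs" | path extension
> echo "crates/nu-cli/src/cli.rs" | path basename
```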
4347339e9a Make all bullet point items uppercase (#2257) 2020-07-26 06:15:12 +12:00
e66a8258ec add example to parse command, with row output (#2256) 2020-07-26 06:14:29 +12:00
e4b42b54ad Simplify NuCompleter. (#2254)
- Removing old code for dealing with escaping, since that has moved elsewhere.
- Eliminating some match statements in favour of result/option methods.
- Fix an issue where completing inside quotes could remove the quote at the
  beginning, if one already existed on the line but the replacement didn't have
  a quote at the beginning.
2020-07-25 10:41:14 -04:00
de18b9ca2c Match cleanup (#2248) 2020-07-25 08:40:35 -04:00
a77f0f7b41 to-xml.md documentation update (#2253)
* Update to-xml.md documentation to be consistent

* Capitalize bullet point items

* Add link to this document within `to.md`
2020-07-25 20:19:15 +12:00
6b31a006b8 Refactor all completion logic into NuCompleter (#2252)
* Refactor all completion logic into `NuCompleter`

This is the next step to improving completions. Previously, completion logic was
scattered about (`FilesystemShell`, `NuCompleter`, `Helper`, and `ShellManager`).
By unifying the core logic into a central location, it will be easier to take the
next steps in improving completion.

* Update context.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-25 11:39:12 +12:00
2db4fe83d8 Remove unnecessary peekable iterator (#2251) 2020-07-24 12:06:12 -04:00
55a2f284d9 Add to xml command (#2141) (#2155) 2020-07-24 19:41:22 +12:00
2d3b1e090a Remove piping of stderr. (#2247)
In any other shell, stderr is inherited like normal, and only piped if you
request it explicitly (e.g., `2>/dev/null`). In the case of a command like
`fzf`, stderr is used for the interactive selection of files. By piping it,
something like

    fzf | xargs echo

does not work. By removing all stderr piping we eliminate this issue. We can
return later with a way to deal with stderr piping when an actual use case
arises.
2020-07-24 17:56:50 +12:00
ed0c1038e3 Step 1 for to html theme-ing (#2245)
* reworked theming step 1.
added theme parameter.
hard coded 4 themes

* forgot about blulocolight

* aarrrrg! test are in another place. fixed i think.
2020-07-23 13:21:58 -05:00
0c20282200 added documentation of available binding options (#2246)
straight from the rustyline source code
2020-07-23 13:13:06 -05:00
e71f44d26f if config file doesn't exist, set defaults. (#2244)
if line_editor in config doesn't exist, set defaults.
2020-07-23 08:27:45 -05:00
e3d7e46855 added defaults. fixed bug of not loading history. (#2243) 2020-07-23 07:19:05 -05:00
9b35aae5e8 update sample configs (#2242)
* update sample configs

* change rustyline to line_editor
2020-07-23 06:49:25 -05:00
7e9f87c57f Expose all rustyline configuration points (#2238)
* Added all rustyline config points

* comments cleanup

* my good friend fmt keeps changing his mind
2020-07-23 09:43:52 +12:00
5d17b72852 update config documentation (#2178)
* update config documentation

* update config syntax

* update config syntax

* Update alias.md

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-23 09:42:04 +12:00
6b4634b293 Convert table of primitives to positional arguments for external cmd (#2232)
* Convert table of primitives to positional arguments for external cmd

* Multiple file test, fix for cococo
2020-07-23 09:41:34 +12:00
2a084fc838 Bump to 0.17.0 (#2237) 2020-07-22 06:41:49 +12:00
a36d2a1586 Revert "hopefully the final fix for history (#2222)" (#2235)
This reverts commit 6829ad7a30.
2020-07-21 18:19:04 +12:00
32b875ada9 sample config settings (#2233) 2020-07-20 21:15:58 -05:00
aaed9c4e8a added ansi example (#2230)
* added ansi example

* added strcollect to example
2020-07-20 18:33:39 -05:00
b9278bdfe1 Char example (#2231)
* added ansi example

* added another example

* changed example

* ansi changes here by mistake
2020-07-20 14:25:38 -05:00
6eb2c94209 Add flag for case-insensitive sort-by (#2225)
* Add flag for case-insensitive sort-by

* Fix test names

* Fix documentation comments
2020-07-21 05:31:58 +12:00
7b1a15b223 Campbell colors (#2219)
* added campbell theme to html colors

* updated test results. had to make change for ci.

* hopefully the last changes for this stupid test :)

* moved tests to html.rs

* remove unnecessary using statement.

* still fighting with tests and tests are winning.
2020-07-20 07:57:29 -05:00
836efd237c fix internal command parsing (args.is_last) (#2224) 2020-07-20 05:49:40 +12:00
aad3cca793 Add benchmark command (#2223) 2020-07-20 05:39:43 +12:00
6829ad7a30 hopefully the final fix for history (#2222) 2020-07-19 07:47:55 -05:00
1f0962eb08 Add some tests for parse_arg (#2220)
* add some tests for parse

* Format

* fix warnings
2020-07-19 19:12:56 +12:00
c65acc174d Add hex pretty print to 'to html' (#2221) 2020-07-19 16:44:15 +12:00
2dea392e40 Add hex pretty print to 'to html' (#2217) 2020-07-19 12:14:40 +12:00
0c43a4d04b Add hex pretty print to 'to html' (#2216) 2020-07-19 10:12:17 +12:00
ebc2d40875 Expose more registry APIs (#2215) 2020-07-19 06:01:05 +12:00
3432078e77 Fix uniq to work with simple values (#2214) 2020-07-19 05:19:03 +12:00
9e5170b3dc Clean up lines command (#2207) 2020-07-19 05:17:56 +12:00
0ae7c5d836 Fix if description (#2204) (#2213) 2020-07-19 05:16:35 +12:00
d0712a00f4 made it easier to change colors (#2212)
* made it easier to change colors
and the beginning of html theming

* fmt
2020-07-18 11:05:45 -05:00
5e722181cb Export more defs from nu-cli (#2205) 2020-07-18 16:47:03 +12:00
ffe3e2c16b Rename calc to math eval and allow it to optionally take an expression as an argument (#2195)
* Rename `calc` to `math eval` and allow it to optionally take the expression as an argument

* Moved calc tests to math eval
Also added 2 tests and changed 1 test

* Move calc docs to math eval
2020-07-18 16:11:19 +12:00
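Illustrative usage of the renamed command, with and without the optional argument:

```
# expression passed as an argument
> math eval "2 + 3 * 4"
# or piped in, as calc previously required
> echo "10 / 4" | math eval
```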
04e8aa31fe update history max size with two different calls. (#2202)
Closes #2193
2020-07-18 15:26:32 +12:00
9d24b440bb Introduce completion abstractions to nushell. (#2198)
* Introduce completion abstractions to nushell.

Currently, we rely on rustyline's completion structures. By abstracting this
away, we are more flexible to introduce someone elses completion engine, or our
own.

* Update value_shell.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-18 14:55:10 +12:00
d8594a62c2 Add wasm support (#2199)
* Working towards a PoC for wasm

* Move bson and sqlite to plugins

* proof of concept now working

* tests are green

* Add CI test for --no-default-features

* Fix some tests

* Fix clippy and windows build

* More fixes

* Fix the windows build

* Fix the windows test
2020-07-18 13:59:23 +12:00
dbe0effd67 Use error propagation operator (#2201) 2020-07-18 13:12:06 +12:00
b358804904 Auto-Generate Documentation for nushell.com (#2139)
* Very rough idea

* Remove colour codes

* Work on command for generating docs

* Quick comment

* Use nested collapsible markdown

* Refine documentation command

* Clippy and rename docs

* This layout probably seems best

Also moved some code to documentation.rs to avoid making help.rs massive

* Delete summaries.md

* Add usage strings

* Remove static annotations

* get_documentation produces value

Which will be used like
'help generate_docs | save "something"'
The resulting yaml can be passed to a script for generating HTML/MD files in the website

* Fix subcommands

* DRY code

* Address clippy:

* Fix links

* Clippy lints

* Move documentation to more central location
2020-07-18 10:22:43 +12:00
7b02604e6d changed colors as per Jörn's suggestion. (#2200)
* changed colors as per Jörn's suggestion.

* cleaned up old comments
2020-07-17 15:02:54 -05:00
6497421615 Keep until and while as subcommands of keep (#2197) 2020-07-18 07:06:48 +12:00
f26151e36d Silence Rust 1.45 Clippy warnings (#2196)
* Silence Rust 1.45 Clippy warnings dealing with using `map_err()`

* Silence false Clippy warning

* Fix last Clippy error for unnecessary conversion

* Fix `and_then` clippy warnings
2020-07-18 05:57:15 +12:00
0f688d7da7 Use '?' for error propagation, remove match (#2194) 2020-07-17 05:39:51 +12:00
a04dfca63a added ability to supply --dark_bg to to html (#2189)
* added ability to supply --dark_bg to to html

* fmt + fixed tests

* updated other html tests

* fmt
2020-07-16 08:19:29 -05:00
72f6513d2a Keybindings and invocation fix (#2186) 2020-07-15 19:51:59 +12:00
7c0a830d84 Match cleanup (#2184)
* Use `unwrap_or()` to remove `match`

* Use `?` for error propagation, and remove `match`
2020-07-15 19:51:41 +12:00
c299d207f7 Remove unnecessary match (#2183) 2020-07-15 19:50:38 +12:00
42a1adf2e9 Indices are (now) green, bold, right-aligned (#2181)
With https://github.com/nushell/nushell/pull/355 the (numeric) index column of tables was changed to be right-aligned. After the move to `nu-table` the index column is now centered instead of right-aligned. I think this is a copy-paste bug where [this line](71e55541d7/crates/nu-cli/src/commands/table.rs (L190)) has been copied from [this line](71e55541d7/crates/nu-cli/src/commands/table.rs (L207)), since the code is out-of-sync with the comment. This change restores harmony between the description and the function of the code.
2020-07-15 15:48:20 +12:00
b4761f9d8a Remove commands meant for internal use. (#2182) 2020-07-14 21:49:46 -05:00
71e55541d7 Merge skip command varieties into one command with sub commands. (#2179) 2020-07-14 20:44:49 -05:00
5f1075544c Remove unnecessary match statement (#2177) 2020-07-14 20:17:28 -04:00
0934410b38 Use matches!() for true/false returning match statements (#2176) 2020-07-14 20:11:41 -04:00
17e6c53b62 Add str reverse subcommand (#2170)
* Add str reverse subcommand

* rustfmt
2020-07-15 08:47:04 +12:00
80d2a7ee7a Fix autoenv executing scripts multiple times (#2171)
* Fix autoenv executing scripts multiple times

Previously, if the user had only specified entry or exitscripts the scripts
would execute many times. This should be fixed now

* Add tests

* Run exitscripts

* More tests and fixes to existing tests

* Test solution with visited dirs

* Track visited directories

* Comments and fmt
2020-07-15 07:16:50 +12:00
8fd22b61be Add variance and stddev subcommands to math command (#2154)
* add variance (population)
subcommand to math

* impl variance subcommand with spanning errors for invalid types

* add stddev subcommand to math

* rename bytes to filesize

* clippy fix -- use expect instead of unwrap in variance tests
2020-07-15 07:15:02 +12:00
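A quick sketch of the new subcommands (population variants, per the commit):

```
> echo 1 2 3 4 | math variance
> echo 1 2 3 4 | math stddev
```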
e9313a61af Make str more strict. (#2173) 2020-07-14 10:04:00 -05:00
f2c4d22739 group-by can generate custom grouping key by block evaluation. (#2172) 2020-07-14 08:45:19 -05:00
8551e06d9e Ensure source buffer is cleared after reading in MaybeTextCodec. (#2168) 2020-07-14 11:24:52 +12:00
97cedeb324 Fix str --to-int usages (#2167) 2020-07-13 15:07:34 -04:00
07594222c0 Extend 'Shell' with open and save capabilities (#2165)
* Extend 'Shell' with open and save capabilities

* clippy fix
2020-07-13 21:07:44 +12:00
7a207a673b Update documentation to properly refer to subcommands with spaces (#2164) 2020-07-13 18:39:36 +12:00
78f13407e6 Documentation for autoenv (#2163)
* Documentation

* Somewhat nicer?

* cat
2020-07-13 18:23:19 +12:00
5a34744d8c add --char flag to 'str trim' (#2162) 2020-07-13 17:45:34 +12:00
0456f4a007 To html with color (#2158)
* adding color to html output

* latest changes

* seems to be working now

* WIP - close. Good is the enemy of Great.

* fixed the final issues... hopefully
2020-07-13 17:40:59 +12:00
f3f40df4dd Tests for autoenv (and fixes for bugs the tests found) (#2148)
* add test basic_autoenv_vars_are_added

* Tests

* Entry and exit scripts

* Recursive set and overwrite

* Make sure that overwritten vals are restored

* Move tests to autoenv

* Move tests out of cli crate

* Tests help, apparently. Windows has issues

On windows, .nu-env is not applied immediately after running autoenv trust.
You have to cd out of the directory for it to work.

* Sort paths non-lexicographically

* Sibling dir test

* Revert "Sort paths non-lexicographically"

This reverts commit 72e4b856af.

* Rename test

* Change conditions

* Revert "Revert "Sort paths non-lexicographically""

This reverts commit 71606bc62f.

* Set vars as they are discovered

This means that if a parent directory is untrusted,
the variables in its child directories are still set properly.

* format

* Fix cleanup issues too

* Run commands in their separate functions

* Make everything into one large function like all the cool kids

* Refactoring

* fmt

* Debugging windows path issue

* Canonicalize

* Trim whitespace

* On windows, use echo nul instead of touch to create file in test

* Avoid cloning by using drain()
2020-07-12 16:14:09 +12:00
bdef5d7d72 Add 'str from' subcommand (#2125)
* add human, precision commands

* add 'str from' subcommand (converted from human/precision commands)

move human tests to str from

* add default locale, platform-specific SystemLocale use

* fix platform specific num-format dependency, remove invalid test

* change 'str from' localization to static num_format::Locale::en

* minor cleanup, nudge ci

* re-attempt ci
2020-07-12 15:57:39 +12:00
8d03cf5b02 added more verbose message to assert (#2157)
* added more verbose message to assert

having a -h short flag is bad for any command since it's already used by --help.

* updated for fmt
2020-07-11 16:49:44 -05:00
3ec0242960 fix the name of 'do' (#2152)
* fix the name of 'do'

* try to fix ci
2020-07-11 17:09:05 +12:00
0bc2e29f99 Rename 'bytes' to 'filesize' (#2153) 2020-07-11 14:17:37 +12:00
1bb6a2d9ed Split key/value in 'config set' (#2151) 2020-07-11 13:15:51 +12:00
e848fc0bbe Updates config to use subcommands (#2146)
* First commit updating `config` to use subcommands (#2119)
    - Implemented `get` subcommand

* Implmented `config set` as a subcommand.

* Implemented `config set_into` as subcommand

* Fixed base `config` command
 - Instead of outputting help, it now outputs the list of all
 configuration parameters.

* Added `config clear` subcommand

* Added `config load` and `config remove` subcommands

* Added `config path` subcommand

* fixed clippy
2020-07-11 12:11:04 +12:00
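An illustrative session with the new subcommands (the key name is an example only):

```
# read, write and locate configuration values
> config get edit_mode
> config set edit_mode "vi"
> config path
```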
6820d70e7d make duration pretty print clearer (#2150)
* make duration pretty print clearer

* fix typo and tests

* fix typo and tests
2020-07-11 09:06:52 +12:00
f32ab696d3 Return iter from sort by (#2149) 2020-07-11 06:49:55 +12:00
e07a9e4ee7 1747 add ns to duration (#2128)
* Added nanos to Duration

* Removed unwraps

* Added nanos to Duration

* Removed unwraps

* Fixed errors

* Removed unwraps

* Changed serialization to String

* Fixed Date and Duration comparison
2020-07-11 05:48:11 +12:00
6a89b1b010 Remove duplicate method (retag) (#2147) 2020-07-10 06:21:13 -04:00
b1b93931cb Return an iter from last command (#2143) 2020-07-09 09:07:51 -04:00
1e62a8fb6e Fix variable name (#2142) 2020-07-08 19:55:01 -04:00
ed6f337a48 Refactor Fetch command (No logic changes) (#2131)
* Refactoring unrelated to Streaming...

Mainly keeping code DRY

* Remove cli as dependency
2020-07-09 04:49:27 +12:00
b004236927 Return iter from from vcf (#2137) 2020-07-08 09:20:57 -04:00
0fdb9ac5e2 str substring additions. (#2140) 2020-07-08 04:45:45 -05:00
28be39494c add requoting for completions (#2129) 2020-07-07 17:13:39 -04:00
32f18536e1 Add space to special prompt chars (#2122)
So spaces don't have to be escaped with quotes in the config.toml
2020-07-06 10:30:47 -05:00
34e1e6e426 Add "move column" command. (#2123) 2020-07-06 10:27:01 -05:00
c3ba1e476f Make every stream-able (#2120)
* Make every stream-able

* Make each over ranges stream-able
2020-07-06 20:23:27 +12:00
a1a0710ee6 Return iter from every command (#2118)
* Return iter from `every` command

* Clippy

* Better variable name
2020-07-06 14:25:39 +12:00
455b1ac294 Update config.yml 2020-07-06 08:21:46 +12:00
b2e0dc5b77 Update azure-pipelines.yml 2020-07-06 08:20:38 +12:00
d30c40b40e Bump to 0.16.1 (#2116) 2020-07-06 08:12:44 +12:00
85d848dd7d Stream results of drop command (#2114)
* Stream results of drop command

* When the number of rows to drop is equal to or greater than the size of the table, output nothing
2020-07-06 05:46:06 +12:00
74717582ac Slightly nicer "rm" message (#2113)
* maybe this was root issue

* quotes

* formatting
2020-07-06 05:42:37 +12:00
ee18f16378 Autoenv rewrite, security and scripting (#2083)
* Add args in .nurc file to environment

* Working dummy version

* Add add_nurc to sync_env command

* Parse .nurc file

* Delete env vars after leaving directory

* Removing vals not working, strangely

* Refactoring, add comment

* Debugging

* Debug by logging to file

* Add and remove env var behavior appears correct

However, it does not use existing code that well.

* Move work to cli.rs

* Parse config directories

* I am in a state of distress

* Rename .nurc to .nu

* Some notes for me

* Refactoring

* Removing vars works, but not done in a very nice fashion

* Refactor env_vars_to_delete

* Refactor env_vars_to_add()

* Move directory environment code to separate file

* Refactor from_config

* Restore env values

* Working?

* Working?

* Update comments and change var name

* Formatting

* Remove vars after leaving dir

* Remove notes I made

* Rename config function

* Clippy

* Cleanup and handle errors

* cargo fmt

* Better error messages, remove last (?) unwrap

* FORMAT PLZ

* Rename whitelisted_directories to allowed_directories

* Add comment to clarify how overwritten values are restored.

* Change list of allowed dirs to indexmap

* Rewrite starting

* rewrite everything

* Overwritten env values tracks an indexmap instead of vector

* Refactor restore function

* Untrack removed vars properly

* Performance concerns

* Performance concerns

* Error handling

* Clippy

* Add type aliases for String and OsString

* Deletion almost works

* Working?

* Error handling and refactoring

* nicer errors

* Add TODO file

* Move outside of loop

* Error handling

* Reworking adding of vars

* Reworking adding of vars

* Ready for testing

* Refactoring

* Restore overwritten vals code

* todo.org

* Remove overwritten values tracking, as it is not needed

* Cleanup, stop tracking overwritten values as nu takes care of it

* Init autoenv command

* Initialize autoenv and autoenv trust

* autoenv trust toml

* toml

* Use serde for autoenv

* Optional directory arg

* Add autoenv untrust command

* ... actually add autoenv untrust this time

* OsString and paths

* Revert "OsString and paths"

This reverts commit e6eedf8824.

* Fix path

* Fix path

* Autoenv trust and untrust

* Start using autoenv

* Check hashes

* Use trust functionality when setting vars

* Remove unused code

* Clippy

* Nicer errors for autoenv commands

* Non-working errors

* Update error description

* Satisfy fmt

* Errors

* Errors print, but not nicely

* Nicer errors

* fmt

* Delete accidentally added todo.org file

* Rename direnv to autoenv

* Use ShellError instead of Error

* Change tests to pass, danger zone?

* Clippy and errors

* Clippy... again

* Replace match with or_else

* Use sha2 crate for hashing

* parsing and error msg

* Refactoring

* Only apply vars once

* if parent dir

* Delete vars

* Rework exit code

* Adding works

* restore

* Fix possibility of infinite loop

* Refactoring

* Non-working

* Revert "Non-working"

This reverts commit e231b85570.

* Revert "Revert "Non-working""

This reverts commit 804092e46a.

* Autoenv trust works without restart

* Cargo fix

* Script vars

* Serde

* Serde errors

* Entry and exitscripts

* Clippy

* Support windows and handle errors

* Formatting

* Fix infinite loop on windows

* Debugging windows loop

* More windows infinite loop debugging

* Windows loop debugging #3

* windows loop #4

* Don't return err

* Cleanup unused code

* Infinite loop debug

* Loop debugging

* Check if infinite loop is vars_to_add

* env_vars_to_add does not terminate, skip loop as test

* Hypothesis: std::env::current_dir() is messing with something

* Hypothesis: std::env::current_dir() is messing with something

* plz

* make clippy happy

* debugging in env_vars_to_add

* Debbuging env_vars_to_add #2

* clippy

* clippy..

* Fool clippy

* Fix another infinite loop

* Binary search for error location x)

* Binary search #3

* fmt

* Binary search #4

* more searching...

* closing in... maybe

* PLZ

* Cleanup

* Restore commented out functionality

* Handle case when user gives the directory "."

* fmt

* Use fs::canonicalize for paths

* Create optional script section

* fmt

* Add exitscripts even if no entryscripts are defined

* All sections in .nu-env are now optional

* Re-read config file each directory change

* Hot reload after autoenv untrust, don't run exitscripts if untrusted

* Debugging

* Fix issue with recursive adding of vars

* Thank you for finding my issues Mr. Azure

* use std::env
2020-07-06 05:34:00 +12:00
9e82e5a2fa Show entire table if number of rows requested for last is greater than table size (#2112)
* Show entire table if number of rows requested for last is greater than table size

* rustfmt

* Add test
2020-07-05 13:04:17 +12:00
8ea2307815 More informative error messages when "rm" fails (#2109)
* add a nicer error message

* small fix to message and remove log
2020-07-05 13:03:12 +12:00
bbc5a28fe9 Fix buffering in lines command (#2111) 2020-07-05 12:20:58 +12:00
04120e00e4 Oops, fix crash in parser updates (#2108) 2020-07-05 08:56:54 +12:00
efd8a633f2 Align tables in "ls -f" (#2105)
* Stuff column with nothing if we have nothing

* Stuff columns at the very start

Remove unnecessary else clauses.
Add the unix cfg portion

* Added some tests and cfg windows

Not sure how I feel about these tests but it's better than nothing
2020-07-05 08:17:36 +12:00
e75c44c95b If command and touchups (#2106) 2020-07-05 07:40:04 +12:00
0629c896eb Return error for unterminated string in bareword parser (#2103)
* add some tests

* add failing tests

* use some; fix test

* clean up code, flesh out tests

* cargo fmt
2020-07-04 17:14:31 +12:00
eb02c773d0 Add 'str length' command (#2102) 2020-07-04 08:17:44 +12:00
e31e8d1550 Convert open/fetch to stream (#2028)
* Types lined up for open with stream

* Chunking stream

* Maybe I didn't need most of the Stream stuff after all?

* Some clean-up

* Merge weird cargo.lock

* Start moving some encoding logic to MaybeTextCodec

Will we lose the nice table formatting if we Stream? How do we get it back? Collect the Stream at the end?

* Clean-up and small refinements

* Put in auto-convert workaround

* Workaround to make sure bat functionality works

* Handle some easy error cases

* All tests pass

* Remove guessing logic

* Address clippy comments

* Pull latest master and fix MaybeTextCodec usage

* Add tag to enable autoview
2020-07-04 07:53:20 +12:00
8775991c2d Add 'split chars' command (#2101) 2020-07-04 07:09:38 +12:00
de8e2841a0 Numbered each (#2100)
* Add --numbered to each

* Fix example tester and add numbered each
2020-07-03 20:43:55 +12:00
5cafead4a4 Added license and license-for-less to wix build (#2097) 2020-07-03 11:30:00 +12:00
180290f3a8 Remove custom escaping for external args. (#2095)
Our own custom escaping unfortunately is far too simple to cover all cases.
Instead, the parser will now do no transforms on the args passed to an external
command, letting the process spawning library deal with doing the appropriate
escaping.
2020-07-03 11:29:28 +12:00
7813063c93 updated less license to raw gh link (#2088) 2020-07-02 16:25:07 +12:00
ba5d774fe1 Add a histogram example to the random dice documentation (#2087) 2020-07-02 16:24:28 +12:00
7be49e43fd added a few more command chars (#2086)
* added a few more command chars

* forgot my ole' friend clippy

* added unicode name synonyms to characters
2020-07-01 17:34:11 -05:00
dcd2227201 Update README.md 2020-07-02 09:00:44 +12:00
2dd28c2909 updated to include less and nushell licenses (#2085) 2020-07-01 10:45:42 +12:00
351 changed files with 14793 additions and 6176 deletions

View File

@@ -1,11 +1,14 @@
 trigger:
-- master
+- main
 strategy:
   matrix:
     linux-stable:
       image: ubuntu-18.04
       style: 'unflagged'
+    linux-minimal:
+      image: ubuntu-18.04
+      style: 'minimal'
     macos-stable:
       image: macos-10.14
       style: 'unflagged'
@@ -41,10 +44,10 @@ steps:
        rustup component add clippy --toolchain stable-x86_64-apple-darwin
        export PATH=$HOME/.cargo/bin:$PATH
      fi
-     rustup update
-     rustc -Vv
-     echo "##vso[task.prependpath]$HOME/.cargo/bin"
-     rustup component add rustfmt
+     # rustup update
+     # rustc -Vv
+     # echo "##vso[task.prependpath]$HOME/.cargo/bin"
+     # rustup component add rustfmt
    displayName: Install Rust
  - bash: RUSTFLAGS="-D warnings" cargo test --all --features stable
    condition: eq(variables['style'], 'unflagged')
@@ -52,12 +55,15 @@ steps:
  - bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
    condition: eq(variables['style'], 'unflagged')
    displayName: Check clippy lints
-  - bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all --features stable
+  - bash: RUSTFLAGS="-D warnings" cargo test --all --features stable
    condition: eq(variables['style'], 'canary')
    displayName: Run tests
-  - bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
+  - bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
    condition: eq(variables['style'], 'canary')
    displayName: Check clippy lints
+  - bash: RUSTFLAGS="-D warnings" cargo test --all --no-default-features
+    condition: eq(variables['style'], 'minimal')
+    displayName: Run tests
  - bash: cargo fmt --all -- --check
    condition: eq(variables['style'], 'fmt')
    displayName: Lint

View File

@ -1,3 +0,0 @@
[build]
#rustflags = ["--cfg", "data_processing_primitives"]

View File

@@ -27,7 +27,7 @@ orbs:
 workflows:
   version: 2.0
-  # This builds on all pull requests to test, and ignores master
+  # This builds on all pull requests to test, and ignores main
   build_without_deploy:
     jobs:
       - docker/publish:
@@ -39,7 +39,7 @@ workflows:
          filters:
            branches:
              ignore:
-               - master
+               - main
          before_build:
            - pull_cache
          after_build:
@@ -98,11 +98,11 @@ workflows:
                docker push quay.io/nushell/nu
-  # publish devel to Docker Hub on merge to master (doesn't build --release)
+  # publish devel to Docker Hub on merge to main (doesn't build --release)
   build_with_deploy_devel:
     jobs:
-      # Deploy devel tag on merge to master
+      # Deploy devel tag on merge to main
      - docker/publish:
          image: nushell/nu-base
          registry: quay.io
@@ -113,7 +113,7 @@ workflows:
            - pull_cache
          filters:
            branches:
-             only: master
+             only: main
          after_build:
            - run:
                name: Build Multistage (smaller) container
@@ -137,7 +137,7 @@ workflows:
        filters:
          branches:
            only:
-             - master
+             - main
    jobs:
      - docker/publish:
          image: nushell/nu-base

View File

@@ -11,31 +11,39 @@ jobs:
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Install libxcb
        run: sudo apt-get install libxcb-composite0-dev
      - name: Set up cargo
        uses: actions-rs/toolchain@v1
        with:
          profile: minimal
          toolchain: stable
          override: true
      - name: Build
        uses: actions-rs/cargo@v1
        with:
          command: build
          args: --release --all --features=stable
      - name: Create output directory
        run: mkdir output
      - name: Copy files to output
        run: |
          cp target/release/nu target/release/nu_plugin_* output/
          cp README.build.txt output/README.txt
+          cp LICENSE output/LICENSE
          rm output/*.d
          rm output/nu_plugin_core_*
          rm output/nu_plugin_stable_*
      # Note: If OpenSSL changes, this path will need to be updated
      - name: Copy OpenSSL to output
        run: cp /usr/lib/x86_64-linux-gnu/libssl.so.1.1 output/
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
@@ -48,26 +56,32 @@ jobs:
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Set up cargo
        uses: actions-rs/toolchain@v1
        with:
          profile: minimal
          toolchain: stable
          override: true
      - name: Build
        uses: actions-rs/cargo@v1
        with:
          command: build
          args: --release --all --features=stable
      - name: Create output directory
        run: mkdir output
      - name: Copy files to output
        run: |
          cp target/release/nu target/release/nu_plugin_* output/
          cp README.build.txt output/README.txt
+          cp LICENSE output/LICENSE
          rm output/*.d
          rm output/nu_plugin_core_*
          rm output/nu_plugin_stable_*
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
@@ -80,29 +94,40 @@ jobs:
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Set up cargo
        uses: actions-rs/toolchain@v1
        with:
          profile: minimal
          toolchain: stable
          override: true
      - name: Add cargo-wix subcommand
        uses: actions-rs/cargo@v1
        with:
          command: install
          args: cargo-wix
      - name: Build
        uses: actions-rs/cargo@v1
        with:
          command: build
          args: --release --all --features=stable
      - name: Create output directory
        run: mkdir output
      - name: Download Less Binary
-        run: Invoke-WebRequest -Uri "https://github.com/jftuga/less-Windows/releases/download/less-v562.1/less.exe" -OutFile "target\release\less.exe"
+        run: Invoke-WebRequest -Uri "https://github.com/jftuga/less-Windows/releases/download/less-v562.0/less.exe" -OutFile "target\release\less.exe"
+      - name: Download Less License
+        run: Invoke-WebRequest -Uri "https://raw.githubusercontent.com/jftuga/less-Windows/master/LICENSE" -OutFile "target\release\LICENSE-for-less.txt"
      - name: Copy files to output
        run: |
          cp target\release\nu.exe output\
+          cp LICENSE output\
+          cp target\release\LICENSE-for-less.txt output\
          rm target\release\nu_plugin_core_*.exe
          rm target\release\nu_plugin_stable_*.exe
          cp target\release\nu_plugin_*.exe output\
@@ -111,16 +136,19 @@ jobs:
      # Note: If the version of `less.exe` needs to be changed, update this URL
      # Similarly, if `less.exe` is checked into the repo, copy from the local path here
      # moved this stuff down to create wix after we download less
      - name: Create msi with wix
        uses: actions-rs/cargo@v1
        with:
          command: wix
          args: --no-build --nocapture --output target\wix\nushell-windows.msi
      - name: Upload installer
        uses: actions/upload-artifact@v2
        with:
          name: windows-installer
          path: target\wix\nushell-windows.msi
      - name: Upload zip
        uses: actions/upload-artifact@v2
        with:
@@ -137,6 +165,7 @@ jobs:
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Determine Release Info
        id: info
        env:
@@ -152,6 +181,7 @@ jobs:
          echo "::set-output name=macosdir::nu_${MAJOR}_${MINOR}_${PATCH}_macOS"
          echo "::set-output name=windowsdir::nu_${MAJOR}_${MINOR}_${PATCH}_windows"
          echo "::set-output name=innerdir::nushell-${VERSION}"
      - name: Create Release Draft
        id: create_release
        uses: actions/create-release@v1
@@ -161,19 +191,24 @@ jobs:
          tag_name: ${{ github.ref }}
          release_name: ${{ steps.info.outputs.version }} Release
          draft: true
      - name: Create Linux Directory
        run: mkdir -p ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
      - name: Download Linux Artifacts
        uses: actions/download-artifact@v2
        with:
          name: linux
          path: ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
      - name: Restore Linux File Modes
        run: |
          chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/nu*
          chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/libssl*
      - name: Create Linux tarball
        run: tar -zcvf ${{ steps.info.outputs.linuxdir }}.tar.gz ${{ steps.info.outputs.linuxdir }}
      - name: Upload Linux Artifact
        uses: actions/upload-release-asset@v1
        env:
@@ -183,17 +218,22 @@ jobs:
          asset_path: ./${{ steps.info.outputs.linuxdir }}.tar.gz
          asset_name: ${{ steps.info.outputs.linuxdir }}.tar.gz
          asset_content_type: application/gzip
      - name: Create macOS Directory
        run: mkdir -p ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
      - name: Download macOS Artifacts
        uses: actions/download-artifact@v2
        with:
          name: macos
          path: ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
      - name: Restore macOS File Modes
        run: chmod 755 ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}/nu*
      - name: Create macOS Archive
        run: zip -r ${{ steps.info.outputs.macosdir }}.zip ${{ steps.info.outputs.macosdir }}
      - name: Upload macOS Artifact
        uses: actions/upload-release-asset@v1
        env:
@@ -203,18 +243,22 @@ jobs:
          asset_path: ./${{ steps.info.outputs.macosdir }}.zip
          asset_name: ${{ steps.info.outputs.macosdir }}.zip
          asset_content_type: application/zip
      - name: Create Windows Directory
        run: mkdir -p ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
      - name: Download Windows zip
        uses: actions/download-artifact@v2
        with:
          name: windows-zip
          path: ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
+      # TODO: Remove Show
      - name: Show Windows Artifacts
        run: ls -la ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
      - name: Create macOS Archive
        run: zip -r ${{ steps.info.outputs.windowsdir }}.zip ${{ steps.info.outputs.windowsdir }}
      - name: Upload Windows zip
        uses: actions/upload-release-asset@v1
        env:
@@ -224,11 +268,13 @@ jobs:
          asset_path: ./${{ steps.info.outputs.windowsdir }}.zip
          asset_name: ${{ steps.info.outputs.windowsdir }}.zip
          asset_content_type: application/zip
      - name: Download Windows installer
        uses: actions/download-artifact@v2
        with:
          name: windows-installer
          path: ./
      - name: Upload Windows installer
        uses: actions/upload-release-asset@v1
        env:

Cargo.lock (generated, 984 changes)

File diff suppressed because it is too large.

View File

@@ -1,16 +1,16 @@
 [package]
-name = "nu"
-version = "0.16.0"
 authors = ["The Nu Project Contributors"]
-description = "A new type of shell"
-license = "MIT"
-edition = "2018"
-readme = "README.md"
 default-run = "nu"
-repository = "https://github.com/nushell/nushell"
-homepage = "https://www.nushell.sh"
+description = "A new type of shell"
 documentation = "https://www.nushell.sh/book/"
+edition = "2018"
 exclude = ["images"]
+homepage = "https://www.nushell.sh"
+license = "MIT"
+name = "nu"
+readme = "README.md"
+repository = "https://github.com/nushell/nushell"
+version = "0.18.1"
 
 [workspace]
 members = ["crates/*/"]
@@ -18,23 +18,28 @@ members = ["crates/*/"]
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 
 [dependencies]
-nu-cli = { version = "0.16.0", path = "./crates/nu-cli" }
-nu-source = { version = "0.16.0", path = "./crates/nu-source" }
-nu-plugin = { version = "0.16.0", path = "./crates/nu-plugin" }
-nu-protocol = { version = "0.16.0", path = "./crates/nu-protocol" }
-nu-errors = { version = "0.16.0", path = "./crates/nu-errors" }
-nu-parser = { version = "0.16.0", path = "./crates/nu-parser" }
-nu-value-ext = { version = "0.16.0", path = "./crates/nu-value-ext" }
-nu_plugin_binaryview = { version = "0.16.0", path = "./crates/nu_plugin_binaryview", optional=true }
-nu_plugin_fetch = { version = "0.16.0", path = "./crates/nu_plugin_fetch", optional=true }
-nu_plugin_inc = { version = "0.16.0", path = "./crates/nu_plugin_inc", optional=true }
-nu_plugin_match = { version = "0.16.0", path = "./crates/nu_plugin_match", optional=true }
-nu_plugin_post = { version = "0.16.0", path = "./crates/nu_plugin_post", optional=true }
-nu_plugin_ps = { version = "0.16.0", path = "./crates/nu_plugin_ps", optional=true }
-nu_plugin_start = { version = "0.16.0", path = "./crates/nu_plugin_start", optional=true }
-nu_plugin_sys = { version = "0.16.0", path = "./crates/nu_plugin_sys", optional=true }
-nu_plugin_textview = { version = "0.16.0", path = "./crates/nu_plugin_textview", optional=true }
-nu_plugin_tree = { version = "0.16.0", path = "./crates/nu_plugin_tree", optional=true }
+nu-cli = {version = "0.18.1", path = "./crates/nu-cli"}
+nu-errors = {version = "0.18.1", path = "./crates/nu-errors"}
+nu-parser = {version = "0.18.1", path = "./crates/nu-parser"}
+nu-plugin = {version = "0.18.1", path = "./crates/nu-plugin"}
+nu-protocol = {version = "0.18.1", path = "./crates/nu-protocol"}
+nu-source = {version = "0.18.1", path = "./crates/nu-source"}
+nu-value-ext = {version = "0.18.1", path = "./crates/nu-value-ext"}
+
+nu_plugin_binaryview = {version = "0.18.1", path = "./crates/nu_plugin_binaryview", optional = true}
+nu_plugin_fetch = {version = "0.18.1", path = "./crates/nu_plugin_fetch", optional = true}
+nu_plugin_from_bson = {version = "0.18.1", path = "./crates/nu_plugin_from_bson", optional = true}
+nu_plugin_from_sqlite = {version = "0.18.1", path = "./crates/nu_plugin_from_sqlite", optional = true}
+nu_plugin_inc = {version = "0.18.1", path = "./crates/nu_plugin_inc", optional = true}
+nu_plugin_match = {version = "0.18.1", path = "./crates/nu_plugin_match", optional = true}
+nu_plugin_post = {version = "0.18.1", path = "./crates/nu_plugin_post", optional = true}
+nu_plugin_ps = {version = "0.18.1", path = "./crates/nu_plugin_ps", optional = true}
+nu_plugin_start = {version = "0.18.1", path = "./crates/nu_plugin_start", optional = true}
+nu_plugin_sys = {version = "0.18.1", path = "./crates/nu_plugin_sys", optional = true}
+nu_plugin_textview = {version = "0.18.1", path = "./crates/nu_plugin_textview", optional = true}
+nu_plugin_to_bson = {version = "0.18.1", path = "./crates/nu_plugin_to_bson", optional = true}
+nu_plugin_to_sqlite = {version = "0.18.1", path = "./crates/nu_plugin_to_sqlite", optional = true}
+nu_plugin_tree = {version = "0.18.1", path = "./crates/nu_plugin_tree", optional = true}
 
 crossterm = {version = "0.17.5", optional = true}
 semver = {version = "0.10.0", optional = true}
@@ -47,38 +52,59 @@ dunce = "1.0.1"
 futures = {version = "0.3", features = ["compat", "io-compat"]}
 log = "0.4.8"
 pretty_env_logger = "0.4.0"
+quick-xml = "0.18.1"
 starship = "0.43.0"
 
 [dev-dependencies]
-nu-test-support = { version = "0.16.0", path = "./crates/nu-test-support" }
+nu-test-support = {version = "0.18.1", path = "./crates/nu-test-support"}
 
 [build-dependencies]
-toml = "0.5.6"
 serde = {version = "1.0.110", features = ["derive"]}
-nu-build = { version = "0.16.0", path = "./crates/nu-build" }
+toml = "0.5.6"
 
 [features]
-default = ["sys", "ps", "textview", "inc"]
-stable = ["default", "binaryview", "match", "tree", "post", "fetch", "clipboard-cli", "trash-support", "start"]
+default = [
+  "sys",
+  "ps",
+  "textview",
+  "inc",
+  "git-support",
+  "directories-support",
+  "ctrlc-support",
+  "which-support",
+  "ptree-support",
+  "term-support",
+  "uuid-support",
+]
+stable = ["default", "binaryview", "match", "tree", "post", "fetch", "clipboard-cli", "trash-support", "start", "starship-prompt", "bson", "sqlite"]
 
 # Default
-textview = ["crossterm", "syntect", "url", "nu_plugin_textview"]
-sys = ["nu_plugin_sys"]
-ps = ["nu_plugin_ps"]
 inc = ["semver", "nu_plugin_inc"]
+ps = ["nu_plugin_ps"]
+sys = ["nu_plugin_sys"]
+textview = ["crossterm", "syntect", "url", "nu_plugin_textview"]
 
 # Stable
 binaryview = ["nu_plugin_binaryview"]
+bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
 fetch = ["nu_plugin_fetch"]
 match = ["nu_plugin_match"]
 post = ["nu_plugin_post"]
+sqlite = ["nu_plugin_from_sqlite", "nu_plugin_to_sqlite"]
+start = ["nu_plugin_start"]
 trace = ["nu-parser/trace"]
 tree = ["nu_plugin_tree"]
-start = ["nu_plugin_start"]
 
 clipboard-cli = ["nu-cli/clipboard-cli"]
-# starship-prompt = ["nu-cli/starship-prompt"]
+ctrlc-support = ["nu-cli/ctrlc"]
+directories-support = ["nu-cli/directories", "nu-cli/dirs"]
+git-support = ["nu-cli/git2"]
+ptree-support = ["nu-cli/ptree"]
+starship-prompt = ["nu-cli/starship-prompt"]
+term-support = ["nu-cli/term"]
 trash-support = ["nu-cli/trash-support"]
+uuid-support = ["nu-cli/uuid_crate"]
+which-support = ["nu-cli/ichwh", "nu-cli/which"]
 
 # Core plugins that ship with `cargo install nu` by default
 # Currently, Cargo limits us to installing only one binary

View File

@ -5,6 +5,7 @@
[![Build Status](https://dev.azure.com/nushell/nushell/_apis/build/status/nushell.nushell?branchName=master)](https://dev.azure.com/nushell/nushell/_build/latest?definitionId=2&branchName=master)
[![Discord](https://img.shields.io/discord/601130461678272522.svg?logo=discord)](https://discord.gg/NtAbbGn)
[![The Changelog #363](https://img.shields.io/badge/The%20Changelog-%23363-61c192.svg)](https://changelog.com/podcast/363)
[![@nu_shell](https://img.shields.io/badge/twitter-@nu_shell-1DA1F3?style=flat-square)](https://twitter.com/nu_shell)
## Nu Shell
@ -50,12 +51,12 @@ To build Nu, you will need to use the **latest stable (1.41 or later)** version
Required dependencies:
* pkg-config and libssl (only needed on Linux)
* on Debian/Ubuntu: `apt install pkg-config libssl-dev` * On Debian/Ubuntu: `apt install pkg-config libssl-dev`
Optional dependencies:
* To use Nu with all possible optional features enabled, you'll also need the following:
* on Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev` * On Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev`
To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the latest stable compiler via `rustup install stable`):
@ -193,7 +194,7 @@ For example, you can load a .toml file as structured data and explore it:
> open Cargo.toml
────────────────────┬───────────────────────────
 bin                │ [table 18 rows]
 build-dependencies │ [row nu-build serde toml] build-dependencies │ [row serde toml]
 dependencies       │ [row 29 columns]
 dev-dependencies   │ [row nu-test-support]
 features           │ [row 19 columns]
@ -235,11 +236,11 @@ Here we use the variable `$it` to refer to the value being piped to the external
Nu has early support for configuring the shell. You can refer to the book for a list of [all supported variables](https://www.nushell.sh/book/en/configuration.html).
To set one of these variables, you can use `config --set`. For example: To set one of these variables, you can use `config set`. For example:
```shell
> config --set [edit_mode "vi"] > config set edit_mode "vi"
> config --set [path $nu.path] > config set path $nu.path
```
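The CLI changes further down in this compare also read two new top-level keys, `skip_welcome_message` and `use_starship`, from the same config. A hedged sketch using the `config set` form shown above; the `$true` literal is an assumption for this version, and a plain string value may be what the shell actually expects:

```shell
> config set skip_welcome_message $true   # suppress the startup banner
> config set use_starship $true           # use the starship prompt when that feature is built in
```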
### Shells

Binary file not shown.

Binary file not shown.


@ -1,3 +0,0 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}


@ -1,16 +0,0 @@
[package]
name = "nu-build"
version = "0.16.0"
authors = ["The Nu Project Contributors"]
edition = "2018"
description = "Core build system for nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.114", features = ["derive"] }
lazy_static = "1.4.0"
serde_json = "1.0.55"
toml = "0.5.6"


@ -1,80 +0,0 @@
use lazy_static::lazy_static;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
lazy_static! {
static ref WORKSPACES: Mutex<BTreeMap<String, &'static Path>> = Mutex::new(BTreeMap::new());
}
// got from https://github.com/mitsuhiko/insta/blob/b113499249584cb650150d2d01ed96ee66db6b30/src/runtime.rs#L67-L88
fn get_cargo_workspace(manifest_dir: &str) -> Result<Option<&Path>, Box<dyn std::error::Error>> {
let mut workspaces = WORKSPACES.lock()?;
if let Some(rv) = workspaces.get(manifest_dir) {
Ok(Some(rv))
} else {
#[derive(Deserialize)]
struct Manifest {
workspace_root: String,
}
let output = std::process::Command::new(env!("CARGO"))
.arg("metadata")
.arg("--format-version=1")
.current_dir(manifest_dir)
.output()?;
let manifest: Manifest = serde_json::from_slice(&output.stdout)?;
let path = Box::leak(Box::new(PathBuf::from(manifest.workspace_root)));
workspaces.insert(manifest_dir.to_string(), path.as_path());
Ok(workspaces.get(manifest_dir).cloned())
}
}
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
pub fn build() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR")?;
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(',').map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning=Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let workspace = match get_cargo_workspace(&input)? {
// If the crate is being downloaded from crates.io, it won't have a workspace root, and that's ok
None => return Ok(()),
Some(workspace) => workspace,
};
let path = Path::new(&workspace).join("features.toml");
// If the crate is being downloaded from crates.io, it won't have a features.toml, and that's ok
if !path.exists() {
return Ok(());
}
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
}


@ -1,42 +1,42 @@
[package] [package]
name = "nu-cli"
version = "0.16.0"
authors = ["The Nu Project Contributors"] authors = ["The Nu Project Contributors"]
description = "CLI for nushell" description = "CLI for nushell"
edition = "2018" edition = "2018"
license = "MIT" license = "MIT"
name = "nu-cli"
version = "0.18.1"
[lib] [lib]
doctest = false doctest = false
[dependencies] [dependencies]
nu-source = { version = "0.16.0", path = "../nu-source" } nu-errors = {version = "0.18.1", path = "../nu-errors"}
nu-plugin = { version = "0.16.0", path = "../nu-plugin" } nu-parser = {version = "0.18.1", path = "../nu-parser"}
nu-protocol = { version = "0.16.0", path = "../nu-protocol" } nu-plugin = {version = "0.18.1", path = "../nu-plugin"}
nu-errors = { version = "0.16.0", path = "../nu-errors" } nu-protocol = {version = "0.18.1", path = "../nu-protocol"}
nu-parser = { version = "0.16.0", path = "../nu-parser" } nu-source = {version = "0.18.1", path = "../nu-source"}
nu-value-ext = { version = "0.16.0", path = "../nu-value-ext" } nu-table = {version = "0.18.1", path = "../nu-table"}
nu-test-support = { version = "0.16.0", path = "../nu-test-support" } nu-test-support = {version = "0.18.1", path = "../nu-test-support"}
nu-table = {version = "0.16.0", path = "../nu-table"} nu-value-ext = {version = "0.18.1", path = "../nu-value-ext"}
ansi_term = "0.12.1" ansi_term = "0.12.1"
app_dirs = "1.2.1" app_dirs = {version = "2", package = "app_dirs2"}
async-recursion = "0.3.1" async-recursion = "0.3.1"
async-trait = "0.1.36" async-trait = "0.1.36"
directories = "2.0.2"
base64 = "0.12.3" base64 = "0.12.3"
bigdecimal = {version = "0.1.2", features = ["serde"]} bigdecimal = {version = "0.1.2", features = ["serde"]}
bson = { version = "0.14.1", features = ["decimal128"] }
byte-unit = "3.1.3" byte-unit = "3.1.3"
bytes = "0.5.5" bytes = "0.5.5"
calamine = "0.16" calamine = "0.16"
cfg-if = "0.1"
chrono = {version = "0.4.11", features = ["serde"]} chrono = {version = "0.4.11", features = ["serde"]}
clap = "2.33.1" clap = "2.33.1"
codespan-reporting = "0.9.5"
csv = "1.1" csv = "1.1"
ctrlc = "3.1.4" ctrlc = {version = "3.1.4", optional = true}
derive-new = "0.5.8" derive-new = "0.5.8"
dirs = "2.0.2" directories = {version = "2.0.2", optional = true}
dirs = {version = "2.0.2", optional = true}
dtparse = "1.1.0"
dunce = "1.0.1" dunce = "1.0.1"
eml-parser = "0.1.0" eml-parser = "0.1.0"
filesize = "0.2.0" filesize = "0.2.0"
@ -44,29 +44,30 @@ futures = { version = "0.3", features = ["compat", "io-compat"] }
futures-util = "0.3.5" futures-util = "0.3.5"
futures_codec = "0.4" futures_codec = "0.4"
getset = "0.1.1" getset = "0.1.1"
git2 = { version = "0.13.6", default_features = false } git2 = {version = "0.13.6", default_features = false, optional = true}
glob = "0.3.0" glob = "0.3.0"
hex = "0.4" hex = "0.4"
htmlescape = "0.3.1" htmlescape = "0.3.1"
ical = "0.6.*" ical = "0.6.*"
ichwh = "0.3.4" ichwh = {version = "0.3.4", optional = true}
indexmap = {version = "1.4.0", features = ["serde-1"]} indexmap = {version = "1.4.0", features = ["serde-1"]}
itertools = "0.9.0" itertools = "0.9.0"
codespan-reporting = "0.9.5"
log = "0.4.8" log = "0.4.8"
meval = "0.2" meval = "0.2"
natural = "0.5.0" natural = "0.5.0"
num-bigint = {version = "0.2.6", features = ["serde"]} num-bigint = {version = "0.2.6", features = ["serde"]}
num-format = {version = "0.4", features = ["with-num-bigint"]}
num-traits = "0.2.11" num-traits = "0.2.11"
parking_lot = "0.11.0" parking_lot = "0.11.0"
pin-utils = "0.1.0" pin-utils = "0.1.0"
pretty-hex = "0.1.1" pretty-hex = "0.1.1"
pretty_env_logger = "0.4.0" pretty_env_logger = "0.4.0"
ptree = {version = "0.2" } ptree = {version = "0.2", optional = true}
query_interface = "0.3.5" query_interface = "0.3.5"
rand = "0.7" rand = "0.7"
regex = "1" regex = "1"
roxmltree = "0.13.0" roxmltree = "0.13.0"
rust-embed = "5.6.0"
rustyline = "6.2.0" rustyline = "6.2.0"
serde = {version = "1.0.114", features = ["derive"]} serde = {version = "1.0.114", features = ["derive"]}
serde-hjson = "0.9.1" serde-hjson = "0.9.1"
@ -75,41 +76,51 @@ serde_ini = "0.2.0"
serde_json = "1.0.55" serde_json = "1.0.55"
serde_urlencoded = "0.6.1" serde_urlencoded = "0.6.1"
serde_yaml = "0.8" serde_yaml = "0.8"
sha2 = "0.9.1"
shellexpand = "2.0.0" shellexpand = "2.0.0"
strip-ansi-escapes = "0.1.0" strip-ansi-escapes = "0.1.0"
tempfile = "3.1.0" tempfile = "3.1.0"
term = "0.5.2" term = {version = "0.5.2", optional = true}
termcolor = "1.1.0"
term_size = "0.3.2" term_size = "0.3.2"
termcolor = "1.1.0"
toml = "0.5.6" toml = "0.5.6"
typetag = "0.1.5" typetag = "0.1.5"
umask = "1.0.0" umask = "1.0.0"
unicode-xid = "0.2.1" unicode-xid = "0.2.1"
uuid_crate = { package = "uuid", version = "0.8.1", features = ["v4"] } uuid_crate = {package = "uuid", version = "0.8.1", features = ["v4"], optional = true}
which = "4.0.1" which = {version = "4.0.2", optional = true}
zip = "0.5.6"
trash = { version = "1.0.1", optional = true }
clipboard = {version = "0.5", optional = true} clipboard = {version = "0.5", optional = true}
starship = "0.43.0"
rayon = "1.3.1"
encoding_rs = "0.8.23" encoding_rs = "0.8.23"
quick-xml = "0.18.1"
rayon = "1.3.1"
starship = {version = "0.43.0", optional = true}
trash = {version = "1.0.1", optional = true}
url = {version = "2.1.1"}
[target.'cfg(unix)'.dependencies] [target.'cfg(unix)'.dependencies]
users = "0.10.0" users = "0.10.0"
# TODO this will be possible with new dependency resolver
# (currently on nightly behind -Zfeatures=itarget):
# https://github.com/rust-lang/cargo/issues/7914
#[target.'cfg(not(windows))'.dependencies]
#num-format = {version = "0.4", features = ["with-system-locale"]}
[dependencies.rusqlite] [dependencies.rusqlite]
version = "0.23.1"
features = ["bundled", "blob"] features = ["bundled", "blob"]
optional = true
version = "0.23.1"
[build-dependencies] [build-dependencies]
nu-build = { version = "0.16.0", path = "../nu-build" }
[dev-dependencies] [dev-dependencies]
quickcheck = "0.9" quickcheck = "0.9"
quickcheck_macros = "0.9" quickcheck_macros = "0.9"
[features] [features]
stable = []
# starship-prompt = ["starship"]
clipboard-cli = ["clipboard"] clipboard-cli = ["clipboard"]
stable = []
starship-prompt = ["starship"]
trash-support = ["trash"] trash-support = ["trash"]
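Since several nu-cli dependencies are now optional, the crate's own `[features]` section above is how they get switched back on when building the crate directly. A small sketch using standard Cargo syntax; the feature names come from the list above, but the particular combination is arbitrary:

```shell
# from the workspace root
> cargo build -p nu-cli --features=starship-prompt,trash-support
```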

Binary file not shown.


@ -1,5 +1,5 @@
use crate::commands::classified::block::run_block; use crate::commands::classified::block::run_block;
use crate::commands::classified::external::{MaybeTextCodec, StringOrBinary}; use crate::commands::classified::maybe_text_codec::{MaybeTextCodec, StringOrBinary};
use crate::commands::plugin::JsonRpc; use crate::commands::plugin::JsonRpc;
use crate::commands::plugin::{PluginCommand, PluginSink}; use crate::commands::plugin::{PluginCommand, PluginSink};
use crate::commands::whole_stream_command; use crate::commands::whole_stream_command;
@ -7,18 +7,20 @@ use crate::context::Context;
use crate::git::current_branch; use crate::git::current_branch;
use crate::path::canonicalize; use crate::path::canonicalize;
use crate::prelude::*; use crate::prelude::*;
use crate::shell::completer::NuCompleter;
use crate::shell::Helper;
use crate::EnvironmentSyncer; use crate::EnvironmentSyncer;
use futures_codec::FramedRead; use futures_codec::FramedRead;
use nu_errors::{ProximateShellError, ShellDiagnostic, ShellError}; use nu_errors::{ProximateShellError, ShellDiagnostic, ShellError};
use nu_protocol::hir::{ClassifiedCommand, Expression, InternalCommand, Literal, NamedArguments}; use nu_protocol::hir::{ClassifiedCommand, Expression, InternalCommand, Literal, NamedArguments};
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value}; use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
#[allow(unused)]
use nu_source::Tagged;
use log::{debug, trace}; use log::{debug, trace};
use rustyline::config::{ColorMode, CompletionType, Config};
use rustyline::error::ReadlineError; use rustyline::error::ReadlineError;
use rustyline::{ use rustyline::{self, config::Configurer, At, Cmd, Editor, KeyPress, Movement, Word};
self, config::Configurer, config::EditMode, At, Cmd, ColorMode, CompletionType, Config, Editor,
KeyPress, Movement, Word,
};
use std::error::Error; use std::error::Error;
use std::io::{BufRead, BufReader, Write}; use std::io::{BufRead, BufReader, Write};
use std::iter::Iterator; use std::iter::Iterator;
@ -149,10 +151,7 @@ pub fn load_plugins(context: &mut Context) -> Result<(), ShellError> {
.map(|path| { .map(|path| {
let bin_name = { let bin_name = {
if let Some(name) = path.file_name() { if let Some(name) = path.file_name() {
match name.to_str() { name.to_str().unwrap_or("")
Some(raw) => raw,
None => "",
}
} else { } else {
"" ""
} }
@ -204,12 +203,28 @@ pub struct History;
impl History { impl History {
pub fn path() -> PathBuf { pub fn path() -> PathBuf {
const FNAME: &str = "history.txt"; const FNAME: &str = "history.txt";
config::user_data() let default = config::user_data()
.map(|mut p| { .map(|mut p| {
p.push(FNAME); p.push(FNAME);
p p
}) })
.unwrap_or_else(|_| PathBuf::from(FNAME)) .unwrap_or_else(|_| PathBuf::from(FNAME));
let cfg = crate::data::config::config(Tag::unknown());
if let Ok(c) = cfg {
match &c.get("history-path") {
Some(Value {
value: UntaggedValue::Primitive(p),
..
}) => match p {
Primitive::String(path) => PathBuf::from(path),
_ => default,
},
_ => default,
}
} else {
default
}
} }
} }
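The reworked `History::path` above keeps `history.txt` in the user data directory as the default, but now prefers a `history-path` string if one is present in the config. A hedged sketch of setting it, with a purely illustrative path:

```shell
> config set history-path "/home/me/nu_history.txt"
```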
@ -251,6 +266,13 @@ pub fn create_default_context(
whole_stream_command(Remove), whole_stream_command(Remove),
whole_stream_command(Open), whole_stream_command(Open),
whole_stream_command(Config), whole_stream_command(Config),
whole_stream_command(ConfigGet),
whole_stream_command(ConfigSet),
whole_stream_command(ConfigSetInto),
whole_stream_command(ConfigClear),
whole_stream_command(ConfigLoad),
whole_stream_command(ConfigRemove),
whole_stream_command(ConfigPath),
whole_stream_command(Help), whole_stream_command(Help),
whole_stream_command(History), whole_stream_command(History),
whole_stream_command(Save), whole_stream_command(Save),
@ -258,9 +280,8 @@ pub fn create_default_context(
whole_stream_command(Cpy), whole_stream_command(Cpy),
whole_stream_command(Date), whole_stream_command(Date),
whole_stream_command(Cal), whole_stream_command(Cal),
whole_stream_command(Calc),
whole_stream_command(Mkdir), whole_stream_command(Mkdir),
whole_stream_command(Move), whole_stream_command(Mv),
whole_stream_command(Kill), whole_stream_command(Kill),
whole_stream_command(Version), whole_stream_command(Version),
whole_stream_command(Clear), whole_stream_command(Clear),
@ -273,6 +294,7 @@ pub fn create_default_context(
// Statistics // Statistics
whole_stream_command(Size), whole_stream_command(Size),
whole_stream_command(Count), whole_stream_command(Count),
whole_stream_command(Benchmark),
// Metadata // Metadata
whole_stream_command(Tags), whole_stream_command(Tags),
// Shells // Shells
@ -288,6 +310,7 @@ pub fn create_default_context(
whole_stream_command(Split), whole_stream_command(Split),
whole_stream_command(SplitColumn), whole_stream_command(SplitColumn),
whole_stream_command(SplitRow), whole_stream_command(SplitRow),
whole_stream_command(SplitChars),
whole_stream_command(Lines), whole_stream_command(Lines),
whole_stream_command(Trim), whole_stream_command(Trim),
whole_stream_command(Echo), whole_stream_command(Echo),
@ -299,15 +322,25 @@ pub fn create_default_context(
whole_stream_command(StrUpcase), whole_stream_command(StrUpcase),
whole_stream_command(StrCapitalize), whole_stream_command(StrCapitalize),
whole_stream_command(StrFindReplace), whole_stream_command(StrFindReplace),
whole_stream_command(StrFrom),
whole_stream_command(StrSubstring), whole_stream_command(StrSubstring),
whole_stream_command(StrSet), whole_stream_command(StrSet),
whole_stream_command(StrToDatetime), whole_stream_command(StrToDatetime),
whole_stream_command(StrContains),
whole_stream_command(StrIndexOf),
whole_stream_command(StrTrim), whole_stream_command(StrTrim),
whole_stream_command(StrTrimLeft),
whole_stream_command(StrTrimRight),
whole_stream_command(StrStartsWith),
whole_stream_command(StrEndsWith),
whole_stream_command(StrCollect), whole_stream_command(StrCollect),
whole_stream_command(StrLength),
whole_stream_command(StrReverse),
whole_stream_command(BuildString), whole_stream_command(BuildString),
whole_stream_command(Ansi), whole_stream_command(Ansi),
whole_stream_command(Char), whole_stream_command(Char),
// Column manipulation // Column manipulation
whole_stream_command(MoveColumn),
whole_stream_command(Reject), whole_stream_command(Reject),
whole_stream_command(Select), whole_stream_command(Select),
whole_stream_command(Get), whole_stream_command(Get),
@ -328,6 +361,7 @@ pub fn create_default_context(
whole_stream_command(Drop), whole_stream_command(Drop),
whole_stream_command(Format), whole_stream_command(Format),
whole_stream_command(Where), whole_stream_command(Where),
whole_stream_command(If),
whole_stream_command(Compact), whole_stream_command(Compact),
whole_stream_command(Default), whole_stream_command(Default),
whole_stream_command(Skip), whole_stream_command(Skip),
@ -342,33 +376,39 @@ pub fn create_default_context(
whole_stream_command(Each), whole_stream_command(Each),
whole_stream_command(IsEmpty), whole_stream_command(IsEmpty),
// Table manipulation // Table manipulation
whole_stream_command(Move),
whole_stream_command(Merge), whole_stream_command(Merge),
whole_stream_command(Shuffle), whole_stream_command(Shuffle),
whole_stream_command(Wrap), whole_stream_command(Wrap),
whole_stream_command(Pivot), whole_stream_command(Pivot),
whole_stream_command(Headers), whole_stream_command(Headers),
whole_stream_command(Reduce),
// Data processing // Data processing
whole_stream_command(Histogram), whole_stream_command(Histogram),
whole_stream_command(Autoenv),
whole_stream_command(AutoenvTrust),
whole_stream_command(AutoenvUnTrust),
whole_stream_command(Math), whole_stream_command(Math),
whole_stream_command(MathAverage), whole_stream_command(MathAverage),
whole_stream_command(MathEval),
whole_stream_command(MathMedian), whole_stream_command(MathMedian),
whole_stream_command(MathMinimum), whole_stream_command(MathMinimum),
whole_stream_command(MathMode), whole_stream_command(MathMode),
whole_stream_command(MathMaximum), whole_stream_command(MathMaximum),
whole_stream_command(MathStddev),
whole_stream_command(MathSummation), whole_stream_command(MathSummation),
whole_stream_command(MathVariance),
// File format output // File format output
whole_stream_command(To), whole_stream_command(To),
whole_stream_command(ToBSON),
whole_stream_command(ToCSV), whole_stream_command(ToCSV),
whole_stream_command(ToHTML), whole_stream_command(ToHTML),
whole_stream_command(ToJSON), whole_stream_command(ToJSON),
whole_stream_command(ToSQLite),
whole_stream_command(ToDB),
whole_stream_command(ToMarkdown), whole_stream_command(ToMarkdown),
whole_stream_command(ToTOML), whole_stream_command(ToTOML),
whole_stream_command(ToTSV), whole_stream_command(ToTSV),
whole_stream_command(ToURL), whole_stream_command(ToURL),
whole_stream_command(ToYAML), whole_stream_command(ToYAML),
whole_stream_command(ToXML),
// File format input // File format input
whole_stream_command(From), whole_stream_command(From),
whole_stream_command(FromCSV), whole_stream_command(FromCSV),
@ -376,11 +416,8 @@ pub fn create_default_context(
whole_stream_command(FromTSV), whole_stream_command(FromTSV),
whole_stream_command(FromSSV), whole_stream_command(FromSSV),
whole_stream_command(FromINI), whole_stream_command(FromINI),
whole_stream_command(FromBSON),
whole_stream_command(FromJSON), whole_stream_command(FromJSON),
whole_stream_command(FromODS), whole_stream_command(FromODS),
whole_stream_command(FromDB),
whole_stream_command(FromSQLite),
whole_stream_command(FromTOML), whole_stream_command(FromTOML),
whole_stream_command(FromURL), whole_stream_command(FromURL),
whole_stream_command(FromXLSX), whole_stream_command(FromXLSX),
@ -395,20 +432,23 @@ pub fn create_default_context(
whole_stream_command(Random), whole_stream_command(Random),
whole_stream_command(RandomBool), whole_stream_command(RandomBool),
whole_stream_command(RandomDice), whole_stream_command(RandomDice),
#[cfg(feature = "uuid_crate")]
whole_stream_command(RandomUUID), whole_stream_command(RandomUUID),
// Path
whole_stream_command(PathCommand),
whole_stream_command(PathExtension),
whole_stream_command(PathBasename),
whole_stream_command(PathExpand),
whole_stream_command(PathExists),
whole_stream_command(PathType),
// Url
whole_stream_command(UrlCommand),
whole_stream_command(UrlScheme),
whole_stream_command(UrlPath),
whole_stream_command(UrlHost),
whole_stream_command(UrlQuery),
]); ]);
cfg_if::cfg_if! {
if #[cfg(data_processing_primitives)] {
context.add_commands(vec![
whole_stream_command(ReduceBy),
whole_stream_command(EvaluateBy),
whole_stream_command(TSortBy),
whole_stream_command(MapMaxBy),
]);
}
}
#[cfg(feature = "clipboard")] #[cfg(feature = "clipboard")]
{ {
context.add_commands(vec![whole_stream_command(crate::commands::clip::Clip)]); context.add_commands(vec![whole_stream_command(crate::commands::clip::Clip)]);
@ -427,6 +467,8 @@ pub async fn run_vec_of_pipelines(
let _ = crate::load_plugins(&mut context); let _ = crate::load_plugins(&mut context);
#[cfg(feature = "ctrlc")]
{
let cc = context.ctrl_c.clone(); let cc = context.ctrl_c.clone();
ctrlc::set_handler(move || { ctrlc::set_handler(move || {
@ -437,6 +479,7 @@ pub async fn run_vec_of_pipelines(
if context.ctrl_c.load(Ordering::SeqCst) { if context.ctrl_c.load(Ordering::SeqCst) {
context.ctrl_c.store(false, Ordering::SeqCst); context.ctrl_c.store(false, Ordering::SeqCst);
} }
}
// before we start up, let's run our startup commands // before we start up, let's run our startup commands
if let Ok(config) = crate::data::config::config(Tag::unknown()) { if let Ok(config) = crate::data::config::config(Tag::unknown()) {
@ -515,18 +558,12 @@ pub async fn run_pipeline_standalone(
Ok(()) Ok(())
} }
/// The entry point for the CLI. Will register all known internal commands, load experimental commands, load plugins, then prepare the prompt and line reader for input. pub fn set_rustyline_configuration() -> (Editor<Helper>, IndexMap<String, Value>) {
pub async fn cli(
mut syncer: EnvironmentSyncer,
mut context: Context,
) -> Result<(), Box<dyn Error>> {
#[cfg(windows)] #[cfg(windows)]
const DEFAULT_COMPLETION_MODE: CompletionType = CompletionType::Circular; const DEFAULT_COMPLETION_MODE: CompletionType = CompletionType::Circular;
#[cfg(not(windows))] #[cfg(not(windows))]
const DEFAULT_COMPLETION_MODE: CompletionType = CompletionType::List; const DEFAULT_COMPLETION_MODE: CompletionType = CompletionType::List;
let _ = load_plugins(&mut context);
let config = Config::builder().color_mode(ColorMode::Forced).build(); let config = Config::builder().color_mode(ColorMode::Forced).build();
let mut rl: Editor<_> = Editor::with_config(config); let mut rl: Editor<_> = Editor::with_config(config);
@ -540,19 +577,194 @@ pub async fn cli(
Cmd::Move(Movement::ForwardWord(1, At::AfterEnd, Word::Vi)), Cmd::Move(Movement::ForwardWord(1, At::AfterEnd, Word::Vi)),
); );
#[cfg(windows)] // Let's set the defaults up front and then override them later if the user indicates
// defaults taken from here https://github.com/kkawakam/rustyline/blob/2fe886c9576c1ea13ca0e5808053ad491a6fe049/src/config.rs#L150-L167
rl.set_max_history_size(100);
rl.set_history_ignore_dups(true);
rl.set_history_ignore_space(false);
rl.set_completion_type(DEFAULT_COMPLETION_MODE);
rl.set_completion_prompt_limit(100);
rl.set_keyseq_timeout(-1);
rl.set_edit_mode(rustyline::config::EditMode::Emacs);
rl.set_auto_add_history(false);
rl.set_bell_style(rustyline::config::BellStyle::default());
rl.set_color_mode(rustyline::ColorMode::Enabled);
rl.set_tab_stop(8);
if let Err(e) = crate::keybinding::load_keybindings(&mut rl) {
println!("Error loading keybindings: {:?}", e);
}
let config = match config::config(Tag::unknown()) {
Ok(config) => config,
Err(e) => {
eprintln!("Config could not be loaded.");
if let ShellError {
error: ProximateShellError::Diagnostic(ShellDiagnostic { diagnostic }),
..
} = e
{ {
let _ = ansi_term::enable_ansi_support(); eprintln!("{}", diagnostic.message);
}
IndexMap::new()
}
};
if let Ok(config) = config::config(Tag::unknown()) {
if let Some(line_editor_vars) = config.get("line_editor") {
for (idx, value) in line_editor_vars.row_entries() {
match idx.as_ref() {
"max_history_size" => {
if let Ok(max_history_size) = value.as_u64() {
rl.set_max_history_size(max_history_size as usize);
}
}
"history_duplicates" => {
// history_duplicates = match value.as_string() {
// Ok(s) if s.to_lowercase() == "alwaysadd" => {
// rustyline::config::HistoryDuplicates::AlwaysAdd
// }
// Ok(s) if s.to_lowercase() == "ignoreconsecutive" => {
// rustyline::config::HistoryDuplicates::IgnoreConsecutive
// }
// _ => rustyline::config::HistoryDuplicates::AlwaysAdd,
// };
if let Ok(history_duplicates) = value.as_bool() {
rl.set_history_ignore_dups(history_duplicates);
}
}
"history_ignore_space" => {
if let Ok(history_ignore_space) = value.as_bool() {
rl.set_history_ignore_space(history_ignore_space);
}
}
"completion_type" => {
let completion_type = match value.as_string() {
Ok(s) if s.to_lowercase() == "circular" => {
rustyline::config::CompletionType::Circular
}
Ok(s) if s.to_lowercase() == "list" => {
rustyline::config::CompletionType::List
}
#[cfg(all(unix, feature = "with-fuzzy"))]
Ok(s) if s.to_lowercase() == "fuzzy" => {
rustyline::config::CompletionType::Fuzzy
}
_ => DEFAULT_COMPLETION_MODE,
};
rl.set_completion_type(completion_type);
}
"completion_prompt_limit" => {
if let Ok(completion_prompt_limit) = value.as_u64() {
rl.set_completion_prompt_limit(completion_prompt_limit as usize);
}
}
"keyseq_timeout_ms" => {
if let Ok(keyseq_timeout_ms) = value.as_u64() {
rl.set_keyseq_timeout(keyseq_timeout_ms as i32);
}
}
"edit_mode" => {
let edit_mode = match value.as_string() {
Ok(s) if s.to_lowercase() == "vi" => rustyline::config::EditMode::Vi,
Ok(s) if s.to_lowercase() == "emacs" => {
rustyline::config::EditMode::Emacs
}
_ => rustyline::config::EditMode::Emacs,
};
rl.set_edit_mode(edit_mode);
// Note: When edit_mode is Emacs, the keyseq_timeout_ms is set to -1
// no matter what you may have configured. This is so that key chords
// can be applied without having to do them in a given timeout. So,
// it essentially turns off the keyseq timeout.
}
"auto_add_history" => {
if let Ok(auto_add_history) = value.as_bool() {
rl.set_auto_add_history(auto_add_history);
}
}
"bell_style" => {
let bell_style = match value.as_string() {
Ok(s) if s.to_lowercase() == "audible" => {
rustyline::config::BellStyle::Audible
}
Ok(s) if s.to_lowercase() == "none" => {
rustyline::config::BellStyle::None
}
Ok(s) if s.to_lowercase() == "visible" => {
rustyline::config::BellStyle::Visible
}
_ => rustyline::config::BellStyle::default(),
};
rl.set_bell_style(bell_style);
}
"color_mode" => {
let color_mode = match value.as_string() {
Ok(s) if s.to_lowercase() == "enabled" => rustyline::ColorMode::Enabled,
Ok(s) if s.to_lowercase() == "forced" => rustyline::ColorMode::Forced,
Ok(s) if s.to_lowercase() == "disabled" => {
rustyline::ColorMode::Disabled
}
_ => rustyline::ColorMode::Enabled,
};
rl.set_color_mode(color_mode);
}
"tab_stop" => {
if let Ok(tab_stop) = value.as_u64() {
rl.set_tab_stop(tab_stop as usize);
}
}
_ => (),
}
}
}
} }
// we are ok if history does not exist // we are ok if history does not exist
let _ = rl.load_history(&History::path()); let _ = rl.load_history(&History::path());
(rl, config)
}
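`set_rustyline_configuration` seeds rustyline with its stock defaults and then overrides them from a `line_editor` row in the config, matching on the keys handled in the arms above. A hedged sketch of inspecting that row; whether `config get` accepts a column path like this is an assumption based on the config subcommands registered in this change:

```shell
# keys recognized above: max_history_size, history_duplicates, history_ignore_space,
# completion_type, completion_prompt_limit, keyseq_timeout_ms, edit_mode,
# auto_add_history, bell_style, color_mode, tab_stop
> config get line_editor
```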
/// The entry point for the CLI. Will register all known internal commands, load experimental commands, load plugins, then prepare the prompt and line reader for input.
pub async fn cli(
mut syncer: EnvironmentSyncer,
mut context: Context,
) -> Result<(), Box<dyn Error>> {
let _ = load_plugins(&mut context);
let (mut rl, config) = set_rustyline_configuration();
let skip_welcome_message = config
.get("skip_welcome_message")
.map(|x| x.is_true())
.unwrap_or(false);
if !skip_welcome_message {
println!(
"Welcome to Nushell {} (type 'help' for more info)",
clap::crate_version!()
);
}
let use_starship = config
.get("use_starship")
.map(|x| x.is_true())
.unwrap_or(false);
#[cfg(windows)]
{
let _ = ansi_term::enable_ansi_support();
}
#[cfg(feature = "ctrlc")]
{
let cc = context.ctrl_c.clone(); let cc = context.ctrl_c.clone();
ctrlc::set_handler(move || { ctrlc::set_handler(move || {
cc.store(true, Ordering::SeqCst); cc.store(true, Ordering::SeqCst);
}) })
.expect("Error setting Ctrl-C handler"); .expect("Error setting Ctrl-C handler");
}
let mut ctrlcbreak = false; let mut ctrlcbreak = false;
// before we start up, let's run our startup commands // before we start up, let's run our startup commands
@ -590,69 +802,15 @@ pub async fn cli(
let cwd = context.shell_manager.path(); let cwd = context.shell_manager.path();
rl.set_helper(Some(crate::shell::Helper::new(context.clone()))); rl.set_helper(Some(crate::shell::Helper::new(
Box::new(<NuCompleter as Default>::default()),
let config = match config::config(Tag::unknown()) { context.clone(),
Ok(config) => config, )));
Err(e) => {
eprintln!("Config could not be loaded.");
if let ShellError {
error: ProximateShellError::Diagnostic(ShellDiagnostic { diagnostic }),
..
} = e
{
eprintln!("{}", diagnostic.message);
}
IndexMap::new()
}
};
let use_starship = match config.get("use_starship") {
Some(b) => match b.as_bool() {
Ok(b) => b,
_ => false,
},
_ => false,
};
let edit_mode = config
.get("edit_mode")
.map(|s| match s.value.expect_string() {
"vi" => EditMode::Vi,
"emacs" => EditMode::Emacs,
_ => EditMode::Emacs,
})
.unwrap_or(EditMode::Emacs);
rl.set_edit_mode(edit_mode);
let max_history_size = config
.get("history_size")
.map(|i| i.value.expect_int())
.unwrap_or(100_000);
rl.set_max_history_size(max_history_size as usize);
let key_timeout = config
.get("key_timeout")
.map(|s| s.value.expect_int())
.unwrap_or(1);
rl.set_keyseq_timeout(key_timeout as i32);
let completion_mode = config
.get("completion_mode")
.map(|s| match s.value.expect_string() {
"list" => CompletionType::List,
"circular" => CompletionType::Circular,
_ => DEFAULT_COMPLETION_MODE,
})
.unwrap_or(DEFAULT_COMPLETION_MODE);
rl.set_completion_type(completion_mode);
let colored_prompt = { let colored_prompt = {
if use_starship { if use_starship {
#[cfg(feature = "starship")]
{
std::env::set_var("STARSHIP_SHELL", ""); std::env::set_var("STARSHIP_SHELL", "");
std::env::set_var("PWD", &cwd); std::env::set_var("PWD", &cwd);
let mut starship_context = let mut starship_context =
@ -668,6 +826,18 @@ pub async fn cli(
_ => {} _ => {}
}; };
starship::print::get_prompt(starship_context) starship::print::get_prompt(starship_context)
}
#[cfg(not(feature = "starship"))]
{
format!(
"\x1b[32m{}{}\x1b[m> ",
cwd,
match current_branch() {
Some(s) => format!("({})", s),
None => "".to_string(),
}
)
}
} else if let Some(prompt) = config.get("prompt") { } else if let Some(prompt) = config.get("prompt") {
let prompt_line = prompt.as_string()?; let prompt_line = prompt.as_string()?;
@ -782,10 +952,7 @@ pub async fn cli(
LineResult::CtrlC => { LineResult::CtrlC => {
let config_ctrlc_exit = config::config(Tag::unknown())? let config_ctrlc_exit = config::config(Tag::unknown())?
.get("ctrlc_exit") .get("ctrlc_exit")
.map(|s| match s.value.expect_string() { .map(|s| s.value.expect_string() == "true")
"true" => true,
_ => false,
})
.unwrap_or(false); // default behavior is to allow CTRL-C spamming similar to other shells .unwrap_or(false); // default behavior is to allow CTRL-C spamming similar to other shells
if !config_ctrlc_exit { if !config_ctrlc_exit {
@ -823,6 +990,7 @@ fn chomp_newline(s: &str) -> &str {
} }
} }
#[derive(Debug)]
pub enum LineResult { pub enum LineResult {
Success(String), Success(String),
Error(String, ShellError), Error(String, ShellError),
@ -830,6 +998,36 @@ pub enum LineResult {
Break, Break,
} }
pub async fn parse_and_eval(line: &str, ctx: &mut Context) -> Result<String, ShellError> {
let line = if line.ends_with('\n') {
&line[..line.len() - 1]
} else {
line
};
let lite_result = nu_parser::lite_parse(&line, 0)?;
// TODO ensure the command whose examples we're testing is actually in the pipeline
let mut classified_block = nu_parser::classify_block(&lite_result, ctx.registry());
classified_block.block.expand_it_usage();
let input_stream = InputStream::empty();
let env = ctx.get_env();
run_block(
&classified_block.block,
ctx,
input_stream,
&Value::nothing(),
&IndexMap::new(),
&env,
)
.await?
.collect_string(Tag::unknown())
.await
.map(|x| x.item)
}
/// Process the line by parsing the text to turn it into commands, classify those commands so that we understand what is being called in the pipeline, and then run this pipeline /// Process the line by parsing the text to turn it into commands, classify those commands so that we understand what is being called in the pipeline, and then run this pipeline
pub async fn process_line( pub async fn process_line(
readline: Result<String, ReadlineError>, readline: Result<String, ReadlineError>,
@ -906,7 +1104,7 @@ pub async fn process_line(
.unwrap_or(true) .unwrap_or(true)
&& canonicalize(ctx.shell_manager.path(), name).is_ok() && canonicalize(ctx.shell_manager.path(), name).is_ok()
&& Path::new(&name).is_dir() && Path::new(&name).is_dir()
&& which::which(&name).is_err() && !crate::commands::classified::external::did_find_command(&name)
{ {
// Here we work differently if we're in Windows because of the expected Windows behavior // Here we work differently if we're in Windows because of the expected Windows behavior
#[cfg(windows)] #[cfg(windows)]
@ -952,20 +1150,17 @@ pub async fn process_line(
let input_stream = if redirect_stdin { let input_stream = if redirect_stdin {
let file = futures::io::AllowStdIo::new(std::io::stdin()); let file = futures::io::AllowStdIo::new(std::io::stdin());
let stream = FramedRead::new(file, MaybeTextCodec).map(|line| { let stream = FramedRead::new(file, MaybeTextCodec::default()).map(|line| {
if let Ok(line) = line { if let Ok(line) = line {
match line { let primitive = match line {
StringOrBinary::String(s) => Ok(Value { StringOrBinary::String(s) => Primitive::String(s),
value: UntaggedValue::Primitive(Primitive::String(s)), StringOrBinary::Binary(b) => Primitive::Binary(b.into_iter().collect()),
};
Ok(Value {
value: UntaggedValue::Primitive(primitive),
tag: Tag::unknown(), tag: Tag::unknown(),
}), })
StringOrBinary::Binary(b) => Ok(Value {
value: UntaggedValue::Primitive(Primitive::Binary(
b.into_iter().collect(),
)),
tag: Tag::unknown(),
}),
}
} else { } else {
panic!("Internal error: could not read lines of text from stdin") panic!("Internal error: could not read lines of text from stdin")
} }


@ -8,10 +8,13 @@ pub(crate) mod alias;
pub(crate) mod ansi; pub(crate) mod ansi;
pub(crate) mod append; pub(crate) mod append;
pub(crate) mod args; pub(crate) mod args;
pub(crate) mod autoenv;
pub(crate) mod autoenv_trust;
pub(crate) mod autoenv_untrust;
pub(crate) mod autoview; pub(crate) mod autoview;
pub(crate) mod benchmark;
pub(crate) mod build_string; pub(crate) mod build_string;
pub(crate) mod cal; pub(crate) mod cal;
pub(crate) mod calc;
pub(crate) mod cd; pub(crate) mod cd;
pub(crate) mod char_; pub(crate) mod char_;
pub(crate) mod classified; pub(crate) mod classified;
@ -20,6 +23,7 @@ pub(crate) mod clip;
pub(crate) mod command; pub(crate) mod command;
pub(crate) mod compact; pub(crate) mod compact;
pub(crate) mod config; pub(crate) mod config;
pub(crate) mod constants;
pub(crate) mod count; pub(crate) mod count;
pub(crate) mod cp; pub(crate) mod cp;
pub(crate) mod date; pub(crate) mod date;
@ -31,21 +35,17 @@ pub(crate) mod du;
pub(crate) mod each; pub(crate) mod each;
pub(crate) mod echo; pub(crate) mod echo;
pub(crate) mod enter; pub(crate) mod enter;
#[allow(unused)]
pub(crate) mod evaluate_by;
pub(crate) mod every; pub(crate) mod every;
pub(crate) mod exit; pub(crate) mod exit;
pub(crate) mod first; pub(crate) mod first;
pub(crate) mod format; pub(crate) mod format;
pub(crate) mod from; pub(crate) mod from;
pub(crate) mod from_bson;
pub(crate) mod from_csv; pub(crate) mod from_csv;
pub(crate) mod from_eml; pub(crate) mod from_eml;
pub(crate) mod from_ics; pub(crate) mod from_ics;
pub(crate) mod from_ini; pub(crate) mod from_ini;
pub(crate) mod from_json; pub(crate) mod from_json;
pub(crate) mod from_ods; pub(crate) mod from_ods;
pub(crate) mod from_sqlite;
pub(crate) mod from_ssv; pub(crate) mod from_ssv;
pub(crate) mod from_toml; pub(crate) mod from_toml;
pub(crate) mod from_tsv; pub(crate) mod from_tsv;
@ -61,24 +61,22 @@ pub(crate) mod headers;
pub(crate) mod help; pub(crate) mod help;
pub(crate) mod histogram; pub(crate) mod histogram;
pub(crate) mod history; pub(crate) mod history;
pub(crate) mod if_;
pub(crate) mod insert; pub(crate) mod insert;
pub(crate) mod is_empty; pub(crate) mod is_empty;
pub(crate) mod keep; pub(crate) mod keep;
pub(crate) mod keep_until;
pub(crate) mod keep_while;
pub(crate) mod last; pub(crate) mod last;
pub(crate) mod lines; pub(crate) mod lines;
pub(crate) mod ls; pub(crate) mod ls;
#[allow(unused)]
pub(crate) mod map_max_by;
pub(crate) mod math; pub(crate) mod math;
pub(crate) mod merge; pub(crate) mod merge;
pub(crate) mod mkdir; pub(crate) mod mkdir;
pub(crate) mod mv; pub(crate) mod move_;
pub(crate) mod next; pub(crate) mod next;
pub(crate) mod nth; pub(crate) mod nth;
pub(crate) mod open; pub(crate) mod open;
pub(crate) mod parse; pub(crate) mod parse;
pub(crate) mod path;
pub(crate) mod pivot; pub(crate) mod pivot;
pub(crate) mod plugin; pub(crate) mod plugin;
pub(crate) mod prepend; pub(crate) mod prepend;
@ -86,8 +84,7 @@ pub(crate) mod prev;
pub(crate) mod pwd; pub(crate) mod pwd;
pub(crate) mod random; pub(crate) mod random;
pub(crate) mod range; pub(crate) mod range;
#[allow(unused)] pub(crate) mod reduce;
pub(crate) mod reduce_by;
pub(crate) mod reject; pub(crate) mod reject;
pub(crate) mod rename; pub(crate) mod rename;
pub(crate) mod reverse; pub(crate) mod reverse;
@ -100,30 +97,26 @@ pub(crate) mod shells;
pub(crate) mod shuffle; pub(crate) mod shuffle;
pub(crate) mod size; pub(crate) mod size;
pub(crate) mod skip; pub(crate) mod skip;
pub(crate) mod skip_until;
pub(crate) mod skip_while;
pub(crate) mod sort_by; pub(crate) mod sort_by;
pub(crate) mod split; pub(crate) mod split;
pub(crate) mod split_by; pub(crate) mod split_by;
pub(crate) mod str_; pub(crate) mod str_;
#[allow(unused)]
pub(crate) mod t_sort_by;
pub(crate) mod table; pub(crate) mod table;
pub(crate) mod tags; pub(crate) mod tags;
pub(crate) mod to; pub(crate) mod to;
pub(crate) mod to_bson;
pub(crate) mod to_csv; pub(crate) mod to_csv;
pub(crate) mod to_html; pub(crate) mod to_html;
pub(crate) mod to_json; pub(crate) mod to_json;
pub(crate) mod to_md; pub(crate) mod to_md;
pub(crate) mod to_sqlite;
pub(crate) mod to_toml; pub(crate) mod to_toml;
pub(crate) mod to_tsv; pub(crate) mod to_tsv;
pub(crate) mod to_url; pub(crate) mod to_url;
pub(crate) mod to_xml;
pub(crate) mod to_yaml; pub(crate) mod to_yaml;
pub(crate) mod trim; pub(crate) mod trim;
pub(crate) mod uniq; pub(crate) mod uniq;
pub(crate) mod update; pub(crate) mod update;
pub(crate) mod url_;
pub(crate) mod version; pub(crate) mod version;
pub(crate) mod what; pub(crate) mod what;
pub(crate) mod where_; pub(crate) mod where_;
@ -140,12 +133,17 @@ pub(crate) use command::{
pub(crate) use alias::Alias; pub(crate) use alias::Alias;
pub(crate) use ansi::Ansi; pub(crate) use ansi::Ansi;
pub(crate) use append::Append; pub(crate) use append::Append;
pub(crate) use autoenv::Autoenv;
pub(crate) use autoenv_trust::AutoenvTrust;
pub(crate) use autoenv_untrust::AutoenvUnTrust;
pub(crate) use benchmark::Benchmark;
pub(crate) use build_string::BuildString; pub(crate) use build_string::BuildString;
pub(crate) use cal::Cal; pub(crate) use cal::Cal;
pub(crate) use calc::Calc;
pub(crate) use char_::Char; pub(crate) use char_::Char;
pub(crate) use compact::Compact; pub(crate) use compact::Compact;
pub(crate) use config::Config; pub(crate) use config::{
Config, ConfigClear, ConfigGet, ConfigLoad, ConfigPath, ConfigRemove, ConfigSet, ConfigSetInto,
};
pub(crate) use count::Count; pub(crate) use count::Count;
pub(crate) use cp::Cpy; pub(crate) use cp::Cpy;
pub(crate) use date::Date; pub(crate) use date::Date;
@ -156,6 +154,7 @@ pub(crate) use drop::Drop;
pub(crate) use du::Du; pub(crate) use du::Du;
pub(crate) use each::Each; pub(crate) use each::Each;
pub(crate) use echo::Echo; pub(crate) use echo::Echo;
pub(crate) use if_::If;
pub(crate) use is_empty::IsEmpty; pub(crate) use is_empty::IsEmpty;
pub(crate) use update::Update; pub(crate) use update::Update;
pub(crate) mod kill; pub(crate) mod kill;
@ -164,22 +163,17 @@ pub(crate) mod clear;
pub(crate) use clear::Clear; pub(crate) use clear::Clear;
pub(crate) mod touch; pub(crate) mod touch;
pub(crate) use enter::Enter; pub(crate) use enter::Enter;
#[allow(unused_imports)]
pub(crate) use evaluate_by::EvaluateBy;
pub(crate) use every::Every; pub(crate) use every::Every;
pub(crate) use exit::Exit; pub(crate) use exit::Exit;
pub(crate) use first::First; pub(crate) use first::First;
pub(crate) use format::Format; pub(crate) use format::Format;
pub(crate) use from::From; pub(crate) use from::From;
pub(crate) use from_bson::FromBSON;
pub(crate) use from_csv::FromCSV; pub(crate) use from_csv::FromCSV;
pub(crate) use from_eml::FromEML; pub(crate) use from_eml::FromEML;
pub(crate) use from_ics::FromIcs; pub(crate) use from_ics::FromIcs;
pub(crate) use from_ini::FromINI; pub(crate) use from_ini::FromINI;
pub(crate) use from_json::FromJSON; pub(crate) use from_json::FromJSON;
pub(crate) use from_ods::FromODS; pub(crate) use from_ods::FromODS;
pub(crate) use from_sqlite::FromDB;
pub(crate) use from_sqlite::FromSQLite;
pub(crate) use from_ssv::FromSSV; pub(crate) use from_ssv::FromSSV;
pub(crate) use from_toml::FromTOML; pub(crate) use from_toml::FromTOML;
pub(crate) use from_tsv::FromTSV; pub(crate) use from_tsv::FromTSV;
@ -197,32 +191,31 @@ pub(crate) use help::Help;
pub(crate) use histogram::Histogram; pub(crate) use histogram::Histogram;
pub(crate) use history::History; pub(crate) use history::History;
pub(crate) use insert::Insert; pub(crate) use insert::Insert;
pub(crate) use keep::Keep; pub(crate) use keep::{Keep, KeepUntil, KeepWhile};
pub(crate) use keep_until::KeepUntil;
pub(crate) use keep_while::KeepWhile;
pub(crate) use last::Last; pub(crate) use last::Last;
pub(crate) use lines::Lines; pub(crate) use lines::Lines;
pub(crate) use ls::Ls; pub(crate) use ls::Ls;
#[allow(unused_imports)]
pub(crate) use map_max_by::MapMaxBy;
pub(crate) use math::{ pub(crate) use math::{
Math, MathAverage, MathMaximum, MathMedian, MathMinimum, MathMode, MathSummation, Math, MathAverage, MathEval, MathMaximum, MathMedian, MathMinimum, MathMode, MathStddev,
MathSummation, MathVariance,
}; };
pub(crate) use merge::Merge; pub(crate) use merge::Merge;
pub(crate) use mkdir::Mkdir; pub(crate) use mkdir::Mkdir;
pub(crate) use mv::Move; pub(crate) use move_::{Move, MoveColumn, Mv};
pub(crate) use next::Next; pub(crate) use next::Next;
pub(crate) use nth::Nth; pub(crate) use nth::Nth;
pub(crate) use open::Open; pub(crate) use open::Open;
pub(crate) use parse::Parse; pub(crate) use parse::Parse;
pub(crate) use path::{PathBasename, PathCommand, PathExists, PathExpand, PathExtension, PathType};
pub(crate) use pivot::Pivot; pub(crate) use pivot::Pivot;
pub(crate) use prepend::Prepend; pub(crate) use prepend::Prepend;
pub(crate) use prev::Previous; pub(crate) use prev::Previous;
pub(crate) use pwd::Pwd; pub(crate) use pwd::Pwd;
pub(crate) use random::{Random, RandomBool, RandomDice, RandomUUID}; #[cfg(feature = "uuid_crate")]
pub(crate) use random::RandomUUID;
pub(crate) use random::{Random, RandomBool, RandomDice};
pub(crate) use range::Range; pub(crate) use range::Range;
#[allow(unused_imports)] pub(crate) use reduce::Reduce;
pub(crate) use reduce_by::ReduceBy;
pub(crate) use reject::Reject; pub(crate) use reject::Reject;
pub(crate) use rename::Rename; pub(crate) use rename::Rename;
pub(crate) use reverse::Reverse; pub(crate) use reverse::Reverse;
@ -233,37 +226,31 @@ pub(crate) use select::Select;
pub(crate) use shells::Shells; pub(crate) use shells::Shells;
pub(crate) use shuffle::Shuffle; pub(crate) use shuffle::Shuffle;
pub(crate) use size::Size; pub(crate) use size::Size;
pub(crate) use skip::Skip; pub(crate) use skip::{Skip, SkipUntil, SkipWhile};
pub(crate) use skip_until::SkipUntil;
pub(crate) use skip_while::SkipWhile;
pub(crate) use sort_by::SortBy; pub(crate) use sort_by::SortBy;
pub(crate) use split::Split; pub(crate) use split::{Split, SplitChars, SplitColumn, SplitRow};
pub(crate) use split::SplitColumn;
pub(crate) use split::SplitRow;
pub(crate) use split_by::SplitBy; pub(crate) use split_by::SplitBy;
pub(crate) use str_::{ pub(crate) use str_::{
Str, StrCapitalize, StrCollect, StrDowncase, StrFindReplace, StrSet, StrSubstring, Str, StrCapitalize, StrCollect, StrContains, StrDowncase, StrEndsWith, StrFindReplace, StrFrom,
StrToDatetime, StrToDecimal, StrToInteger, StrTrim, StrUpcase, StrIndexOf, StrLength, StrReverse, StrSet, StrStartsWith, StrSubstring, StrToDatetime,
StrToDecimal, StrToInteger, StrTrim, StrTrimLeft, StrTrimRight, StrUpcase,
}; };
#[allow(unused_imports)]
pub(crate) use t_sort_by::TSortBy;
pub(crate) use table::Table; pub(crate) use table::Table;
pub(crate) use tags::Tags; pub(crate) use tags::Tags;
pub(crate) use to::To; pub(crate) use to::To;
pub(crate) use to_bson::ToBSON;
pub(crate) use to_csv::ToCSV; pub(crate) use to_csv::ToCSV;
pub(crate) use to_html::ToHTML; pub(crate) use to_html::ToHTML;
pub(crate) use to_json::ToJSON; pub(crate) use to_json::ToJSON;
pub(crate) use to_md::ToMarkdown; pub(crate) use to_md::ToMarkdown;
pub(crate) use to_sqlite::ToDB;
pub(crate) use to_sqlite::ToSQLite;
pub(crate) use to_toml::ToTOML; pub(crate) use to_toml::ToTOML;
pub(crate) use to_tsv::ToTSV; pub(crate) use to_tsv::ToTSV;
pub(crate) use to_url::ToURL; pub(crate) use to_url::ToURL;
pub(crate) use to_xml::ToXML;
pub(crate) use to_yaml::ToYAML; pub(crate) use to_yaml::ToYAML;
pub(crate) use touch::Touch; pub(crate) use touch::Touch;
pub(crate) use trim::Trim; pub(crate) use trim::Trim;
pub(crate) use uniq::Uniq; pub(crate) use uniq::Uniq;
pub(crate) use url_::{UrlCommand, UrlHost, UrlPath, UrlQuery, UrlScheme};
pub(crate) use version::Version; pub(crate) use version::Version;
pub(crate) use what::What; pub(crate) use what::What;
pub(crate) use where_::Where; pub(crate) use where_::Where;


@ -40,6 +40,14 @@ impl WholeStreamCommand for Ansi {
example: r#"ansi reset"#, example: r#"ansi reset"#,
result: Some(vec![Value::from("\u{1b}[0m")]), result: Some(vec![Value::from("\u{1b}[0m")]),
}, },
Example {
description:
"Use ansi to color text (rb = red bold, gb = green bold, pb = purple bold)",
example: r#"echo [$(ansi rb) Hello " " $(ansi gb) Nu " " $(ansi pb) World] | str collect"#,
result: Some(vec![Value::from(
"\u{1b}[1;31mHello \u{1b}[1;32mNu \u{1b}[1;35mWorld",
)]),
},
] ]
} }


@ -0,0 +1,92 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
use serde::Deserialize;
use serde::Serialize;
use sha2::{Digest, Sha256};
use std::io::Read;
use std::path::PathBuf;
pub struct Autoenv;
#[derive(Deserialize, Serialize, Debug, Default)]
pub struct Trusted {
pub files: IndexMap<String, Vec<u8>>,
}
impl Trusted {
pub fn new() -> Self {
Trusted {
files: IndexMap::new(),
}
}
}
pub fn file_is_trusted(nu_env_file: &PathBuf, content: &[u8]) -> Result<bool, ShellError> {
let contentdigest = Sha256::digest(&content).as_slice().to_vec();
let nufile = std::fs::canonicalize(nu_env_file)?;
let trusted = read_trusted()?;
Ok(trusted.files.get(&nufile.to_string_lossy().to_string()) == Some(&contentdigest))
}
pub fn read_trusted() -> Result<Trusted, ShellError> {
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let mut file = std::fs::OpenOptions::new()
.read(true)
.create(true)
.write(true)
.open(config_path)
.map_err(|_| ShellError::untagged_runtime_error("Couldn't open nu-env.toml"))?;
let mut doc = String::new();
file.read_to_string(&mut doc)?;
let allowed = toml::de::from_str(doc.as_str()).unwrap_or_else(|_| Trusted::new());
Ok(allowed)
}
#[async_trait]
impl WholeStreamCommand for Autoenv {
fn name(&self) -> &str {
"autoenv"
}
fn usage(&self) -> &str {
// "Mark a .nu-env file in a directory as trusted. Needs to be re-run after each change to the file or its filepath."
r#"Manage directory specific environment variables and scripts. Create a file called .nu-env in any directory and run 'autoenv trust' to let nushell read it when entering the directory.
The file can contain several optional sections:
env: environment variables to set when visiting the directory. The variables are unset after leaving the directory and any overwritten values are restored.
scriptvars: environment variables that should be set to the return value of a script. After they have been set, they behave in the same way as variables set in the env section.
scripts: scripts to run when entering the directory or leaving it. Note that exitscripts are not run in the directory they are declared in."#
}
fn signature(&self) -> Signature {
Signature::build("autoenv")
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(crate::commands::help::get_help(&Autoenv, &registry))
.into_value(Tag::unknown()),
)))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Example .nu-env file",
example: r#"cat .nu-env
[env]
mykey = "myvalue"
[scriptvars]
myscript = "echo myval"
[scripts]
entryscripts = ["touch hello.txt", "touch hello2.txt"]
exitscripts = ["touch bye.txt"]"#,
result: None,
}]
}
}
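Putting the `autoenv` help text together with the trust command added below, the expected flow is roughly the following (per the usage string, `autoenv trust` has to be re-run after every change to the `.nu-env` file):

```shell
> cd my-project          # directory containing a .nu-env like the example above
> autoenv trust          # record the file's hash as trusted
> cd ..
> cd my-project          # re-entering now applies the [env] and [scripts] sections
```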


@ -0,0 +1,83 @@
use super::autoenv::read_trusted;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::SyntaxShape;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use sha2::{Digest, Sha256};
use std::{fs, path::PathBuf};
pub struct AutoenvTrust;
#[async_trait]
impl WholeStreamCommand for AutoenvTrust {
fn name(&self) -> &str {
"autoenv trust"
}
fn signature(&self) -> Signature {
Signature::build("autoenv trust").optional("dir", SyntaxShape::String, "Directory to allow")
}
fn usage(&self) -> &str {
"Trust a .nu-env file in the current or given directory"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let file_to_trust = match args.call_info.evaluate(registry).await?.args.nth(0) {
Some(Value {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
dir.push(".nu-env");
dir
}
_ => {
let mut dir = fs::canonicalize(std::env::current_dir()?)?;
dir.push(".nu-env");
dir
}
};
let content = std::fs::read(&file_to_trust)?;
let filename = file_to_trust.to_string_lossy().to_string();
let mut allowed = read_trusted()?;
allowed
.files
.insert(filename, Sha256::digest(&content).as_slice().to_vec());
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let tomlstr = toml::to_string(&allowed).map_err(|_| {
ShellError::untagged_runtime_error("Couldn't serialize allowed dirs to nu-env.toml")
})?;
fs::write(config_path, tomlstr).expect("Couldn't write to toml file");
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(".nu-env trusted!").into_value(tag),
)))
}
fn is_binary(&self) -> bool {
false
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Allow .nu-env file in current directory",
example: "autoenv trust",
result: None,
},
Example {
description: "Allow .nu-env file in directory foo",
example: "autoenv trust foo",
result: None,
},
]
}
}


@ -0,0 +1,107 @@
use super::autoenv::Trusted;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::SyntaxShape;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use std::io::Read;
use std::{fs, path::PathBuf};
pub struct AutoenvUnTrust;
#[async_trait]
impl WholeStreamCommand for AutoenvUnTrust {
fn name(&self) -> &str {
"autoenv untrust"
}
fn signature(&self) -> Signature {
Signature::build("autoenv untrust").optional(
"dir",
SyntaxShape::String,
"Directory to disallow",
)
}
fn usage(&self) -> &str {
"Untrust a .nu-env file in the current or given directory"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let file_to_untrust = match args.call_info.evaluate(registry).await?.args.nth(0) {
Some(Value {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
dir.push(".nu-env");
dir
}
_ => {
let mut dir = std::env::current_dir()?;
dir.push(".nu-env");
dir
}
};
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let mut file = match std::fs::OpenOptions::new()
.read(true)
.create(true)
.write(true)
.open(config_path.clone())
{
Ok(p) => p,
Err(_) => {
return Err(ShellError::untagged_runtime_error(
"Couldn't open nu-env.toml",
));
}
};
let mut doc = String::new();
file.read_to_string(&mut doc)?;
let mut allowed: Trusted = toml::from_str(doc.as_str()).unwrap_or_else(|_| Trusted::new());
let file_to_untrust = file_to_untrust.to_string_lossy().to_string();
if allowed.files.remove(&file_to_untrust).is_none() {
return
Err(ShellError::untagged_runtime_error(
"No .nu-env file to untrust in the given directory. Is it missing, or already untrusted?",
));
}
let tomlstr = toml::to_string(&allowed).map_err(|_| {
ShellError::untagged_runtime_error("Couldn't serialize allowed dirs to nu-env.toml")
})?;
fs::write(config_path, tomlstr).expect("Couldn't write to toml file");
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(".nu-env untrusted!").into_value(tag),
)))
}
fn is_binary(&self) -> bool {
false
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Disallow .nu-env file in current directory",
example: "autoenv untrust",
result: None,
},
Example {
description: "Disallow .nu-env file in directory foo",
example: "autoenv untrust foo",
result: None,
},
]
}
}


@ -3,7 +3,7 @@ use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf; use crate::data::value::format_leaf;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{hir, hir::Expression, hir::Literal, hir::SpannedExpression}; use nu_protocol::hir::{self, Expression, ExternalRedirection, Literal, SpannedExpression};
use nu_protocol::{Primitive, Scope, Signature, UntaggedValue, Value}; use nu_protocol::{Primitive, Scope, Signature, UntaggedValue, Value};
use parking_lot::Mutex; use parking_lot::Mutex;
use std::sync::atomic::AtomicBool; use std::sync::atomic::AtomicBool;
@ -328,7 +328,7 @@ fn create_default_command_args(context: &RunnableContextWithoutInput) -> RawComm
positional: None, positional: None,
named: None, named: None,
span, span,
is_last: true, external_redirection: ExternalRedirection::Stdout,
}, },
name_tag: context.name.clone(), name_tag: context.name.clone(),
scope: Scope::new(), scope: Scope::new(),


@ -0,0 +1,78 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use chrono::prelude::*;
pub struct Benchmark;
#[derive(Deserialize, Debug)]
struct BenchmarkArgs {
block: Block,
}
#[async_trait]
impl WholeStreamCommand for Benchmark {
fn name(&self) -> &str {
"benchmark"
}
fn signature(&self) -> Signature {
Signature::build("benchmark").required(
"block",
SyntaxShape::Block,
"the block to run and benchmark",
)
}
fn usage(&self) -> &str {
"Runs a block and return the time it took to do execute it. Eg) benchmark { echo $nu.env.NAME }"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
benchmark(args, registry).await
}
}
async fn benchmark(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let mut context = Context::from_raw(&raw_args, &registry);
let scope = raw_args.call_info.scope.clone();
let (BenchmarkArgs { block }, input) = raw_args.process(&registry).await?;
let start_time: chrono::DateTime<_> = Utc::now();
let result = run_block(
&block,
&mut context,
input,
&scope.it,
&scope.vars,
&scope.env,
)
.await;
let _ = result?.drain_vec().await;
let run_duration: chrono::Duration = Utc::now().signed_duration_since(start_time);
context.clear_errors();
let output = Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Primitive(Primitive::from(run_duration)),
tag: Tag::from(block.span),
}));
Ok(OutputStream::from(vec![output]))
}
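The command above simply brackets the block with two `Utc::now()` samples and reports the signed difference; a tiny synchronous sketch of the same timing pattern:

use chrono::prelude::*;

fn time_it<T>(f: impl FnOnce() -> T) -> (T, chrono::Duration) {
    let start: DateTime<Utc> = Utc::now();
    let out = f();
    // same call the command uses to compute the elapsed time
    (out, Utc::now().signed_duration_since(start))
}

fn main() {
    let (_, took) = time_it(|| (0..1_000_000u64).sum::<u64>());
    println!("took {} ms", took.num_milliseconds());
}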

View File

@ -100,7 +100,7 @@ pub async fn cal(
(current_month, current_month)
};
- let add_months_of_year_to_table_result = add_months_of_year_to_table(
+ add_months_of_year_to_table(
&args,
&mut calendar_vec_deque,
&tag,
@ -108,12 +108,9 @@ pub async fn cal(
month_range,
current_month,
current_day_option,
- );
- match add_months_of_year_to_table_result {
- Ok(()) => Ok(futures::stream::iter(calendar_vec_deque).to_output_stream()),
- Err(error) => Err(error),
- }
+ )?;
+ Ok(futures::stream::iter(calendar_vec_deque).to_output_stream())
}
fn get_invalid_year_shell_error(year_tag: &Tag) -> ShellError {

View File

@ -1,88 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, UntaggedValue, Value};
pub struct Calc;
#[async_trait]
impl WholeStreamCommand for Calc {
fn name(&self) -> &str {
"calc"
}
fn usage(&self) -> &str {
"Parse a math expression into a number"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
calc(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Calculate math in the pipeline",
example: "echo '10 / 4' | calc",
result: Some(vec![UntaggedValue::decimal(2.5).into()]),
}]
}
}
pub async fn calc(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let input = args.input;
let name = args.call_info.name_tag.span;
Ok(input
.map(move |input| {
if let Ok(string) = input.as_string() {
match parse(&string, &input.tag) {
Ok(value) => ReturnSuccess::value(value),
Err(err) => Err(ShellError::labeled_error(
"Calculation error",
err,
&input.tag.span,
)),
}
} else {
Err(ShellError::labeled_error(
"Expected a string from pipeline",
"requires string input",
name,
))
}
})
.to_output_stream())
}
pub fn parse(math_expression: &str, tag: impl Into<Tag>) -> Result<Value, String> {
use std::f64;
let num = meval::eval_str(math_expression);
match num {
Ok(num) => {
if num == f64::INFINITY || num == f64::NEG_INFINITY {
return Err(String::from("cannot represent result"));
}
Ok(UntaggedValue::from(Primitive::from(num)).into_value(tag))
}
Err(error) => Err(error.to_string()),
}
}
#[cfg(test)]
mod tests {
use super::Calc;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Calc {})
}
}
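The removed `calc` delegated parsing to the meval crate and rejected infinite results; a minimal standalone equivalent of that parse step, for reference:

fn parse(expr: &str) -> Result<f64, String> {
    // meval::eval_str returns f64, so "10 / 4" yields 2.5
    let num = meval::eval_str(expr).map_err(|e| e.to_string())?;
    if num.is_infinite() {
        return Err(String::from("cannot represent result"));
    }
    Ok(num)
}

fn main() {
    assert_eq!(parse("10 / 4"), Ok(2.5));
}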

View File

@ -30,11 +30,22 @@ impl WholeStreamCommand for Char {
}
fn examples(&self) -> Vec<Example> {
- vec![Example {
+ vec![
+ Example {
description: "Output newline",
example: r#"char newline"#,
result: Some(vec![Value::from("\n")]),
- }]
+ },
+ Example {
+ description: "Output prompt character, newline and a hamburger character",
+ example: r#"echo $(char prompt) $(char newline) $(char hamburger)"#,
+ result: Some(vec![
+ UntaggedValue::string("\u{25b6}").into(),
+ UntaggedValue::string("\n").into(),
+ UntaggedValue::string("\u{2261}").into(),
+ ]),
+ },
+ ]
}
async fn run(
@ -64,6 +75,25 @@ fn str_to_character(s: &str) -> Option<String> {
match s {
"newline" | "enter" | "nl" => Some("\n".into()),
"tab" => Some("\t".into()),
+ "sp" | "space" => Some(" ".into()),
+ // Unicode names came from https://www.compart.com/en/unicode
+ // Private Use Area (U+E000-U+F8FF)
+ "branch" => Some('\u{e0a0}'.to_string()), // 
+ "segment" => Some('\u{e0b0}'.to_string()), // 
+ "identical_to" | "hamburger" => Some('\u{2261}'.to_string()), // ≡
+ "not_identical_to" | "branch_untracked" => Some('\u{2262}'.to_string()), // ≢
+ "strictly_equivalent_to" | "branch_identical" => Some('\u{2263}'.to_string()), // ≣
+ "upwards_arrow" | "branch_ahead" => Some('\u{2191}'.to_string()), // ↑
+ "downwards_arrow" | "branch_behind" => Some('\u{2193}'.to_string()), // ↓
+ "up_down_arrow" | "branch_ahead_behind" => Some('\u{2195}'.to_string()), // ↕
+ "black_right_pointing_triangle" | "prompt" => Some('\u{25b6}'.to_string()), // ▶
+ "vector_or_cross_product" | "failed" => Some('\u{2a2f}'.to_string()), // ⨯
+ "high_voltage_sign" | "elevated" => Some('\u{26a1}'.to_string()), // ⚡
+ "tilde" | "twiddle" | "squiggly" | "home" => Some("~".into()), // ~
+ "hash" | "hashtag" | "pound_sign" | "sharp" | "root" => Some("#".into()), // #
_ => None,
}
}
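A note on the two Private Use Area entries above: U+E0A0 and U+E0B0 only render as the branch and segment glyphs under a powerline-patched or Nerd Font face. A tiny check that each name maps to a single Unicode scalar:

fn main() {
    // the PUA glyphs need a patched (powerline / Nerd Font) face to display
    for &(name, ch) in [("branch", '\u{e0a0}'), ("segment", '\u{e0b0}'), ("hamburger", '\u{2261}')].iter() {
        println!("{}: {} (U+{:04X})", name, ch, ch as u32);
    }
}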

View File

@ -70,27 +70,21 @@ async fn run_pipeline(
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
- let mut iter = commands.list.clone().into_iter().peekable();
- loop {
- let item: Option<ClassifiedCommand> = iter.next();
- let next: Option<&ClassifiedCommand> = iter.peek();
- input = match (item, next) {
- (Some(ClassifiedCommand::Dynamic(_)), _) | (_, Some(ClassifiedCommand::Dynamic(_))) => {
+ for item in commands.list.clone() {
+ input = match item {
+ ClassifiedCommand::Dynamic(_) => {
return Err(ShellError::unimplemented("Dynamic commands"))
}
- (Some(ClassifiedCommand::Expr(expr)), _) => {
+ ClassifiedCommand::Expr(expr) => {
run_expression_block(*expr, ctx, it, vars, env).await?
}
- (Some(ClassifiedCommand::Error(err)), _) => return Err(err.into()),
- (_, Some(ClassifiedCommand::Error(err))) => return Err(err.clone().into()),
- (Some(ClassifiedCommand::Internal(left)), _) => {
+ ClassifiedCommand::Error(err) => return Err(err.into()),
+ ClassifiedCommand::Internal(left) => {
run_internal_command(left, ctx, input, it, vars, env).await?
}
- (None, _) => break,
};
}

View File

@ -1,3 +1,4 @@
use crate::commands::classified::maybe_text_codec::{MaybeTextCodec, StringOrBinary};
use crate::evaluate::evaluate_baseline_expr; use crate::evaluate::evaluate_baseline_expr;
use crate::futures::ThreadedReceiver; use crate::futures::ThreadedReceiver;
use crate::prelude::*; use crate::prelude::*;
@ -7,91 +8,25 @@ use std::ops::Deref;
use std::process::{Command, Stdio}; use std::process::{Command, Stdio};
use std::sync::mpsc; use std::sync::mpsc;
use bytes::{BufMut, Bytes, BytesMut};
use futures::executor::block_on_stream; use futures::executor::block_on_stream;
// use futures::stream::StreamExt;
use futures_codec::FramedRead; use futures_codec::FramedRead;
use log::trace; use log::trace;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::hir::ExternalCommand; use nu_protocol::hir::{ExternalCommand, ExternalRedirection};
use nu_protocol::{Primitive, Scope, ShellTypeName, UntaggedValue, Value}; use nu_protocol::{Primitive, Scope, ShellTypeName, UntaggedValue, Value};
use nu_source::Tag; use nu_source::Tag;
pub enum StringOrBinary {
String(String),
Binary(Vec<u8>),
}
pub struct MaybeTextCodec;
impl futures_codec::Encoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
match item {
StringOrBinary::String(s) => {
dst.reserve(s.len());
dst.put(s.as_bytes());
Ok(())
}
StringOrBinary::Binary(b) => {
dst.reserve(b.len());
dst.put(Bytes::from(b));
Ok(())
}
}
}
}
impl futures_codec::Decoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
let v: Vec<u8> = src.to_vec();
match String::from_utf8(v) {
Ok(s) => {
src.clear();
if s.is_empty() {
Ok(None)
} else {
Ok(Some(StringOrBinary::String(s)))
}
}
Err(err) => {
// Note: the longest UTF-8 character per Unicode spec is currently 6 bytes. If we fail somewhere earlier than the last 6 bytes,
// we know that we're failing to understand the string encoding and not just seeing a partial character. When this happens, let's
// fall back to assuming it's a binary buffer.
if src.is_empty() {
Ok(None)
} else if src.len() > 6 && (src.len() - err.utf8_error().valid_up_to() > 6) {
// Fall back to assuming binary
let buf = src.to_vec();
src.clear();
Ok(Some(StringOrBinary::Binary(buf)))
} else {
// Looks like a utf-8 string, so let's assume that
let buf = src.split_to(err.utf8_error().valid_up_to() + 1);
String::from_utf8(buf.to_vec())
.map(|x| Some(StringOrBinary::String(x)))
.map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))
}
}
}
}
}
pub(crate) async fn run_external_command( pub(crate) async fn run_external_command(
command: ExternalCommand, command: ExternalCommand,
context: &mut Context, context: &mut Context,
input: InputStream, input: InputStream,
scope: &Scope, scope: &Scope,
is_last: bool, external_redirection: ExternalRedirection,
) -> Result<InputStream, ShellError> { ) -> Result<InputStream, ShellError> {
trace!(target: "nu::run::external", "-> {}", command.name); trace!(target: "nu::run::external", "-> {}", command.name);
if !did_find_command(&command.name).await { if !did_find_command(&command.name) {
return Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Command not found", "Command not found",
"command not found", "command not found",
@ -99,7 +34,7 @@ pub(crate) async fn run_external_command(
)); ));
} }
run_with_stdin(command, context, input, scope, is_last).await run_with_stdin(command, context, input, scope, external_redirection).await
} }
async fn run_with_stdin( async fn run_with_stdin(
@ -107,7 +42,7 @@ async fn run_with_stdin(
context: &mut Context, context: &mut Context,
input: InputStream, input: InputStream,
scope: &Scope, scope: &Scope,
is_last: bool, external_redirection: ExternalRedirection,
) -> Result<InputStream, ShellError> { ) -> Result<InputStream, ShellError> {
let path = context.shell_manager.path(); let path = context.shell_manager.path();
@ -118,35 +53,55 @@ async fn run_with_stdin(
let value = let value =
evaluate_baseline_expr(arg, &context.registry, &scope.it, &scope.vars, &scope.env) evaluate_baseline_expr(arg, &context.registry, &scope.it, &scope.vars, &scope.env)
.await?; .await?;
// Skip any arguments that don't really exist, treating them as optional // Skip any arguments that don't really exist, treating them as optional
// FIXME: we may want to preserve the gap in the future, though it's hard to say // FIXME: we may want to preserve the gap in the future, though it's hard to say
// what value we would put in its place. // what value we would put in its place.
if value.value.is_none() { if value.value.is_none() {
continue; continue;
} }
// Do the cleanup that we need to do on any argument going out: // Do the cleanup that we need to do on any argument going out:
match &value.value {
UntaggedValue::Table(table) => {
for t in table {
match &t.value {
UntaggedValue::Primitive(_) => {
command_args
.push(t.convert_to_string().trim_end_matches('\n').to_string());
}
_ => {
return Err(ShellError::labeled_error(
"Could not convert to positional arguments",
"could not convert to positional arguments",
value.tag(),
));
}
}
}
}
_ => {
let trimmed_value_string = value.as_string()?.trim_end_matches('\n').to_string(); let trimmed_value_string = value.as_string()?.trim_end_matches('\n').to_string();
command_args.push(trimmed_value_string);
let value_string;
#[cfg(not(windows))]
{
value_string = trimmed_value_string
.replace('$', "\\$")
.replace('"', "\\\"")
.to_string()
} }
#[cfg(windows)]
{
value_string = trimmed_value_string
} }
command_args.push(value_string);
} }
let process_args = command_args let process_args = command_args
.iter() .iter()
.map(|arg| { .map(|arg| {
let arg = expand_tilde(arg.deref(), dirs::home_dir); let home_dir;
#[cfg(feature = "dirs")]
{
home_dir = dirs::home_dir;
}
#[cfg(not(feature = "dirs"))]
{
home_dir = || Some(std::path::PathBuf::from("/"));
}
let arg = expand_tilde(arg.deref(), home_dir);
#[cfg(not(windows))] #[cfg(not(windows))]
{ {
@ -167,7 +122,14 @@ async fn run_with_stdin(
}) })
.collect::<Vec<String>>(); .collect::<Vec<String>>();
spawn(&command, &path, &process_args[..], input, is_last, scope) spawn(
&command,
&path,
&process_args[..],
input,
external_redirection,
scope,
)
} }
fn spawn( fn spawn(
@ -175,7 +137,7 @@ fn spawn(
path: &str, path: &str,
args: &[String], args: &[String],
input: InputStream, input: InputStream,
is_last: bool, external_redirection: ExternalRedirection,
scope: &Scope, scope: &Scope,
) -> Result<InputStream, ShellError> { ) -> Result<InputStream, ShellError> {
let command = command.clone(); let command = command.clone();
@ -211,13 +173,23 @@ fn spawn(
// We want stdout regardless of what // We want stdout regardless of what
// we are doing ($it case or pipe stdin) // we are doing ($it case or pipe stdin)
if !is_last { match external_redirection {
ExternalRedirection::Stdout => {
process.stdout(Stdio::piped()); process.stdout(Stdio::piped());
trace!(target: "nu::run::external", "set up stdout pipe"); trace!(target: "nu::run::external", "set up stdout pipe");
}
ExternalRedirection::Stderr => {
process.stderr(Stdio::piped()); process.stderr(Stdio::piped());
trace!(target: "nu::run::external", "set up stderr pipe"); trace!(target: "nu::run::external", "set up stderr pipe");
} }
ExternalRedirection::StdoutAndStderr => {
process.stdout(Stdio::piped());
trace!(target: "nu::run::external", "set up stdout pipe");
process.stderr(Stdio::piped());
trace!(target: "nu::run::external", "set up stderr pipe");
}
_ => {}
}
// open since we have some contents for stdin // open since we have some contents for stdin
if !input.is_empty() { if !input.is_empty() {
@ -300,7 +272,9 @@ fn spawn(
}); });
std::thread::spawn(move || { std::thread::spawn(move || {
if !is_last { if external_redirection == ExternalRedirection::Stdout
|| external_redirection == ExternalRedirection::StdoutAndStderr
{
let stdout = if let Some(stdout) = child.stdout.take() { let stdout = if let Some(stdout) = child.stdout.take() {
stdout stdout
} else { } else {
@ -315,22 +289,8 @@ fn spawn(
return Err(()); return Err(());
}; };
let stderr = if let Some(stderr) = child.stderr.take() {
stderr
} else {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"Can't redirect the stderr for external command",
"can't redirect stderr",
&stdout_name_tag,
)),
tag: stdout_name_tag,
}));
return Err(());
};
let file = futures::io::AllowStdIo::new(stdout); let file = futures::io::AllowStdIo::new(stdout);
let stream = FramedRead::new(file, MaybeTextCodec); let stream = FramedRead::new(file, MaybeTextCodec::default());
for line in block_on_stream(stream) { for line in block_on_stream(stream) {
match line { match line {
@ -382,17 +342,34 @@ fn spawn(
} }
} }
} }
}
if external_redirection == ExternalRedirection::Stderr
|| external_redirection == ExternalRedirection::StdoutAndStderr
{
let stderr = if let Some(stderr) = child.stderr.take() {
stderr
} else {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"Can't redirect the stderr for external command",
"can't redirect stderr",
&stdout_name_tag,
)),
tag: stdout_name_tag,
}));
return Err(());
};
let file = futures::io::AllowStdIo::new(stderr); let file = futures::io::AllowStdIo::new(stderr);
let err_stream = FramedRead::new(file, MaybeTextCodec); let stream = FramedRead::new(file, MaybeTextCodec::default());
for err_line in block_on_stream(err_stream) { for line in block_on_stream(stream) {
match err_line { match line {
Ok(line) => match line { Ok(line) => match line {
StringOrBinary::String(s) => { StringOrBinary::String(s) => {
let result = stdout_read_tx.send(Ok(Value { let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error( value: UntaggedValue::Error(
ShellError::untagged_runtime_error(s.clone()), ShellError::untagged_runtime_error(s),
), ),
tag: stdout_name_tag.clone(), tag: stdout_name_tag.clone(),
})); }));
@ -404,9 +381,7 @@ fn spawn(
StringOrBinary::Binary(_) => { StringOrBinary::Binary(_) => {
let result = stdout_read_tx.send(Ok(Value { let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error( value: UntaggedValue::Error(
ShellError::untagged_runtime_error( ShellError::untagged_runtime_error("<binary stderr>"),
"Binary in stderr output",
),
), ),
tag: stdout_name_tag.clone(), tag: stdout_name_tag.clone(),
})); }));
@ -428,8 +403,8 @@ fn spawn(
if should_error { if should_error {
let _ = stdout_read_tx.send(Ok(Value { let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error( value: UntaggedValue::Error(ShellError::labeled_error(
format!("Unable to read from stderr ({})", e), format!("Unable to read from stdout ({})", e),
"unable to read from stderr", "unable to read from stdout",
&stdout_name_tag, &stdout_name_tag,
)), )),
tag: stdout_name_tag.clone(), tag: stdout_name_tag.clone(),
@ -483,21 +458,28 @@ fn spawn(
} }
} }
async fn did_find_command(name: &str) -> bool { pub fn did_find_command(#[allow(unused)] name: &str) -> bool {
#[cfg(not(windows))] #[cfg(not(feature = "which"))]
{
// we can't perform this check, so just assume it can be found
true
}
#[cfg(all(feature = "which", unix))]
{ {
which::which(name).is_ok() which::which(name).is_ok()
} }
#[cfg(windows)] #[cfg(all(feature = "which", windows))]
{ {
if which::which(name).is_ok() { if which::which(name).is_ok() {
true true
} else { } else {
// Reference: https://ss64.com/nt/syntax-internal.html
let cmd_builtins = [ let cmd_builtins = [
"call", "cls", "color", "date", "dir", "echo", "find", "hostname", "pause", "assoc", "break", "color", "copy", "date", "del", "dir", "dpath", "echo", "erase",
"start", "time", "title", "ver", "copy", "mkdir", "rename", "rd", "rmdir", "type", "for", "ftype", "md", "mkdir", "mklink", "move", "path", "ren", "rename", "rd",
"mklink", "rmdir", "set", "start", "time", "title", "type", "ver", "verify", "vol",
]; ];
cmd_builtins.contains(&name) cmd_builtins.contains(&name)
@ -559,11 +541,17 @@ fn shell_os_paths() -> Vec<std::path::PathBuf> {
mod tests { mod tests {
use super::{ use super::{
add_quotes, argument_contains_whitespace, argument_is_quoted, expand_tilde, remove_quotes, add_quotes, argument_contains_whitespace, argument_is_quoted, expand_tilde, remove_quotes,
run_external_command, Context, InputStream,
}; };
#[cfg(feature = "which")]
use super::{run_external_command, Context, InputStream};
#[cfg(feature = "which")]
use futures::executor::block_on; use futures::executor::block_on;
#[cfg(feature = "which")]
use nu_errors::ShellError; use nu_errors::ShellError;
#[cfg(feature = "which")]
use nu_protocol::Scope; use nu_protocol::Scope;
#[cfg(feature = "which")]
use nu_test_support::commands::ExternalBuilder; use nu_test_support::commands::ExternalBuilder;
// async fn read(mut stream: OutputStream) -> Option<Value> { // async fn read(mut stream: OutputStream) -> Option<Value> {
@ -579,17 +567,23 @@ mod tests {
// } // }
// } // }
#[cfg(feature = "which")]
async fn non_existent_run() -> Result<(), ShellError> { async fn non_existent_run() -> Result<(), ShellError> {
use nu_protocol::hir::ExternalRedirection;
let cmd = ExternalBuilder::for_name("i_dont_exist.exe").build(); let cmd = ExternalBuilder::for_name("i_dont_exist.exe").build();
let input = InputStream::empty(); let input = InputStream::empty();
let mut ctx = Context::basic().expect("There was a problem creating a basic context."); let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
assert!( assert!(run_external_command(
run_external_command(cmd, &mut ctx, input, &Scope::new(), false) cmd,
&mut ctx,
input,
&Scope::new(),
ExternalRedirection::Stdout
)
.await .await
.is_err() .is_err());
);
Ok(()) Ok(())
} }
@ -618,6 +612,7 @@ mod tests {
// block_on(failure_run()) // block_on(failure_run())
// } // }
#[cfg(feature = "which")]
#[test] #[test]
fn identifies_command_not_found() -> Result<(), ShellError> { fn identifies_command_not_found() -> Result<(), ShellError> {
block_on(non_existent_run()) block_on(non_existent_run())
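For orientation, the `is_last: bool` flag that used to decide whether stdout was piped is replaced throughout this file by the `ExternalRedirection` enum. A hedged, standalone sketch (not nushell's actual spawn code) of how such an enum maps onto std::process pipes:

use std::process::{Command, Stdio};

#[derive(Clone, Copy)]
enum ExternalRedirection {
    None,
    Stdout,
    Stderr,
    StdoutAndStderr,
}

fn configure(cmd: &mut Command, redirection: ExternalRedirection) {
    use ExternalRedirection::*;
    if matches!(redirection, Stdout | StdoutAndStderr) {
        cmd.stdout(Stdio::piped()); // capture stdout for the next pipeline stage
    }
    if matches!(redirection, Stderr | StdoutAndStderr) {
        cmd.stderr(Stdio::piped()); // capture stderr, e.g. so `do -i` can drain it
    }
}

fn main() -> std::io::Result<()> {
    // "ls" is illustrative only; any external binary on the PATH would do
    let mut cmd = Command::new("ls");
    configure(&mut cmd, ExternalRedirection::StdoutAndStderr);
    let out = cmd.output()?;
    println!("{} bytes on stdout", out.stdout.len());
    Ok(())
}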

View File

@ -4,7 +4,7 @@ use crate::commands::UnevaluatedCallInfo;
use crate::prelude::*;
use log::{log_enabled, trace};
use nu_errors::ShellError;
- use nu_protocol::hir::InternalCommand;
+ use nu_protocol::hir::{ExternalRedirection, InternalCommand};
use nu_protocol::{CommandAction, Primitive, ReturnSuccess, Scope, UntaggedValue, Value};
pub(crate) async fn run_internal_command(
@ -28,6 +28,10 @@ pub(crate) async fn run_internal_command(
let objects: InputStream = trace_stream!(target: "nu::trace_stream::internal", "input" = input);
let internal_command = context.expect_command(&command.name);
+ if command.name == "autoenv untrust" {
+ context.user_recently_used_autoenv_untrust = true;
+ }
let result = {
context
.run_command(
@ -83,7 +87,7 @@ pub(crate) async fn run_internal_command(
positional: None,
named: None,
span: Span::unknown(),
- is_last: false,
+ external_redirection: ExternalRedirection::Stdout,
},
name_tag: Tag::unknown_anchor(command.name_span),
scope: (&*scope).clone(),
@ -178,10 +182,7 @@ pub(crate) async fn run_internal_command(
}
CommandAction::EnterShell(location) => {
context.shell_manager.insert_at_current(Box::new(
- match FilesystemShell::with_location(
- location,
- context.registry().clone(),
- ) {
+ match FilesystemShell::with_location(location) {
Ok(v) => v,
Err(err) => {
return InputStream::one(

View File

@ -0,0 +1,127 @@
use bytes::{BufMut, Bytes, BytesMut};
use nu_errors::ShellError;
extern crate encoding_rs;
use encoding_rs::{CoderResult, Decoder, Encoding, UTF_8};
#[cfg(not(test))]
const OUTPUT_BUFFER_SIZE: usize = 8192;
#[cfg(test)]
const OUTPUT_BUFFER_SIZE: usize = 4;
#[derive(Debug, Eq, PartialEq)]
pub enum StringOrBinary {
String(String),
Binary(Vec<u8>),
}
pub struct MaybeTextCodec {
decoder: Decoder,
}
impl MaybeTextCodec {
// The constructor takes an Option<&'static Encoding>, because an absence of an encoding indicates that we want BOM sniffing enabled
pub fn new(encoding: Option<&'static Encoding>) -> Self {
let decoder = match encoding {
Some(e) => e.new_decoder_with_bom_removal(),
None => UTF_8.new_decoder(),
};
MaybeTextCodec { decoder }
}
}
impl Default for MaybeTextCodec {
fn default() -> Self {
MaybeTextCodec {
decoder: UTF_8.new_decoder(),
}
}
}
impl futures_codec::Encoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
match item {
StringOrBinary::String(s) => {
dst.reserve(s.len());
dst.put(s.as_bytes());
Ok(())
}
StringOrBinary::Binary(b) => {
dst.reserve(b.len());
dst.put(Bytes::from(b));
Ok(())
}
}
}
}
impl futures_codec::Decoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = ShellError;
fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
if src.is_empty() {
return Ok(None);
}
let mut s = String::with_capacity(OUTPUT_BUFFER_SIZE);
let (res, _read, replacements) = self.decoder.decode_to_string(src, &mut s, false);
let result = if replacements {
// If we had to make replacements when converting to utf8, fall back to binary
StringOrBinary::Binary(src.to_vec())
} else {
// If original buffer size is too small, we continue to allocate new Strings and append
// them to the result until the input buffer is smaller than the allocated String
if let CoderResult::OutputFull = res {
let mut buffer = String::with_capacity(OUTPUT_BUFFER_SIZE);
loop {
let (res, _read, _replacements) =
self.decoder
.decode_to_string(&src[s.len()..], &mut buffer, false);
s.push_str(&buffer);
if let CoderResult::InputEmpty = res {
break;
}
buffer.clear();
}
}
StringOrBinary::String(s)
};
src.clear();
Ok(Some(result))
}
}
#[cfg(test)]
mod tests {
use super::{MaybeTextCodec, StringOrBinary};
use bytes::BytesMut;
use futures_codec::Decoder;
// TODO: Write some more tests
#[test]
fn should_consume_all_bytes_from_source_when_temporary_buffer_overflows() {
let mut maybe_text = MaybeTextCodec::new(None);
let mut bytes = BytesMut::from("0123456789");
let text = maybe_text.decode(&mut bytes);
assert_eq!(
Ok(Some(StringOrBinary::String("0123456789".to_string()))),
text
);
assert!(bytes.is_empty());
}
}
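A compact standalone sketch of the decode policy above using encoding_rs directly: decode once, and if any replacement characters were needed, hand the original bytes back as binary instead. Sizing the buffer via the decoder's worst-case helper is an assumption of this sketch, not part of the codec above:

use encoding_rs::UTF_8;

fn string_or_binary(src: &[u8]) -> Result<String, Vec<u8>> {
    let mut decoder = UTF_8.new_decoder();
    // worst-case output size so a single call can consume all of `src`
    let cap = decoder.max_utf8_buffer_length(src.len()).unwrap_or(src.len() * 3 + 4);
    let mut out = String::with_capacity(cap);
    let (_result, _read, had_replacements) = decoder.decode_to_string(src, &mut out, true);
    if had_replacements {
        Err(src.to_vec()) // not valid text for this encoding; treat as binary
    } else {
        Ok(out)
    }
}

fn main() {
    assert_eq!(string_or_binary(b"hello"), Ok("hello".to_string()));
    assert!(string_or_binary(&[0xff, 0xfe, 0x00]).is_err());
}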

View File

@ -3,6 +3,7 @@ mod dynamic;
pub(crate) mod expr;
pub(crate) mod external;
pub(crate) mod internal;
+ pub(crate) mod maybe_text_codec;
#[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand;

View File

@ -262,10 +262,10 @@ impl EvaluatedCommandArgs {
/// Get the nth positional argument, error if not possible
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
- match self.call_info.args.nth(pos) {
- None => Err(ShellError::unimplemented("Better error: expect_nth")),
- Some(item) => Ok(item),
- }
+ self.call_info
+ .args
+ .nth(pos)
+ .ok_or_else(|| ShellError::unimplemented("Better error: expect_nth"))
}
pub fn get(&self, name: &str) -> Option<&Value> {
@ -343,6 +343,10 @@ impl Command {
self.0.usage()
}
+ pub fn examples(&self) -> Vec<Example> {
+ self.0.examples()
+ }
pub async fn run(
&self,
args: CommandArgs,

View File

@ -40,7 +40,7 @@ impl WholeStreamCommand for Compact {
vec![
Example {
description: "Filter out all null entries in a list",
- example: "echo [1 2 $null 3 $null $null] | compact target",
+ example: "echo [1 2 $null 3 $null $null] | compact",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
@ -49,7 +49,7 @@ impl WholeStreamCommand for Compact {
},
Example {
description: "Filter out all directory entries having no 'target'",
- example: "ls -af | compact target",
+ example: "ls -la | compact target",
result: None,
},
]

View File

@ -1,260 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::config;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Config;
#[derive(Deserialize)]
pub struct ConfigArgs {
load: Option<Tagged<PathBuf>>,
set: Option<(Tagged<String>, Value)>,
set_into: Option<Tagged<String>>,
get: Option<Tagged<String>>,
clear: Tagged<bool>,
remove: Option<Tagged<String>>,
path: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Config {
fn name(&self) -> &str {
"config"
}
fn signature(&self) -> Signature {
Signature::build("config")
.named(
"load",
SyntaxShape::Path,
"load the config from the path given",
Some('l'),
)
.named(
"set",
SyntaxShape::Any,
"set a value in the config, eg) --set [key value]",
Some('s'),
)
.named(
"set_into",
SyntaxShape::String,
"sets a variable from values in the pipeline",
Some('i'),
)
.named(
"get",
SyntaxShape::Any,
"get a value from the config",
Some('g'),
)
.named(
"remove",
SyntaxShape::Any,
"remove a value from the config",
Some('r'),
)
.switch("clear", "clear the config", Some('c'))
.switch("path", "return the path to the config file", Some('p'))
}
fn usage(&self) -> &str {
"Configuration management."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
config(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "See all config values",
example: "config",
result: None,
},
Example {
description: "Set completion_mode to circular",
example: "config --set [completion_mode circular]",
result: None,
},
Example {
description: "Store the contents of the pipeline as a path",
example: "echo ['/usr/bin' '/bin'] | config --set_into path",
result: None,
},
Example {
description: "Get the current startup commands",
example: "config --get startup",
result: None,
},
Example {
description: "Remove the startup commands",
example: "config --remove startup",
result: None,
},
Example {
description: "Clear the config (be careful!)",
example: "config --clear",
result: None,
},
Example {
description: "Get the path to the current config file",
example: "config --path",
result: None,
},
]
}
}
pub async fn config(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
ConfigArgs {
load,
set,
set_into,
get,
clear,
remove,
path,
},
input,
) = args.process(&registry).await?;
let configuration = if let Some(supplied) = load {
Some(supplied.item().clone())
} else {
None
};
let mut result = crate::data::config::read(name_span, &configuration)?;
Ok(if let Some(v) = get {
let key = v.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::labeled_error("Missing key in config", "key", v.tag()))?;
match value {
Value {
value: UntaggedValue::Table(list),
..
} => {
let list: Vec<_> = list
.iter()
.map(|x| ReturnSuccess::value(x.clone()))
.collect();
futures::stream::iter(list).to_output_stream()
}
x => {
let x = x.clone();
OutputStream::one(ReturnSuccess::value(x))
}
}
} else if let Some((key, value)) = set {
result.insert(key.to_string(), value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(&value.tag),
))
} else if let Some(v) = set_into {
let rows: Vec<Value> = input.collect().await;
let key = v.to_string();
if rows.is_empty() {
return Err(ShellError::labeled_error(
"No values given for set_into",
"needs value(s) from pipeline",
v.tag(),
));
} else if rows.len() == 1 {
// A single value
let value = &rows[0];
result.insert(key, value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
} else {
// Take in the pipeline as a table
let value = UntaggedValue::Table(rows).into_value(name.clone());
result.insert(key, value);
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
}
} else if let Tagged { item: true, tag } = clear {
result.clear();
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(tag),
))
} else if let Tagged { item: true, tag } = path {
let path = config::default_path_for(&configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::Path(path)).into_value(tag),
))
} else if let Some(v) = remove {
let key = v.to_string();
if result.contains_key(&key) {
result.swap_remove(&key);
config::write(&result, &configuration)?;
futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(v.tag()),
)])
.to_output_stream()
} else {
return Err(ShellError::labeled_error(
"Key does not exist in config",
"key",
v.tag(),
));
}
} else {
futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream()
})
}
#[cfg(test)]
mod tests {
use super::Config;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Config {})
}
}

View File

@ -0,0 +1,57 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct SubCommand;
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config clear"
}
fn signature(&self) -> Signature {
Signature::build("config clear")
}
fn usage(&self) -> &str {
"clear the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
clear(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Clear the config (be careful!)",
example: "config clear",
result: None,
}]
}
}
pub async fn clear(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
result.clear();
config::write(&result, &None)?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(args.call_info.name_tag),
)))
}

View File

@ -0,0 +1,37 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::{CommandArgs, CommandRegistry, OutputStream};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct Command;
#[async_trait]
impl WholeStreamCommand for Command {
fn name(&self) -> &str {
"config"
}
fn signature(&self) -> Signature {
Signature::build("config")
}
fn usage(&self) -> &str {
"Configuration management."
}
async fn run(
&self,
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag;
let result = crate::data::config::read(name_span, &None)?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream())
}
}

View File

@ -0,0 +1,83 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct GetArgs {
get: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config get"
}
fn signature(&self) -> Signature {
Signature::build("config get").required(
"get",
SyntaxShape::Any,
"value to get from the config",
)
}
fn usage(&self) -> &str {
"Gets a value from the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
get(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the current startup commands",
example: "config get startup",
result: None,
}]
}
}
pub async fn get(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (GetArgs { get }, _) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let result = crate::data::config::read(name_span, &None)?;
let key = get.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::labeled_error("Missing key in config", "key", get.tag()))?;
Ok(match value {
Value {
value: UntaggedValue::Table(list),
..
} => {
let list: Vec<_> = list
.iter()
.map(|x| ReturnSuccess::value(x.clone()))
.collect();
futures::stream::iter(list).to_output_stream()
}
x => {
let x = x.clone();
OutputStream::one(ReturnSuccess::value(x))
}
})
}

View File

@ -0,0 +1,59 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct LoadArgs {
load: Tagged<PathBuf>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config load"
}
fn signature(&self) -> Signature {
Signature::build("config load").required(
"load",
SyntaxShape::Path,
"Path to load the config from",
)
}
fn usage(&self) -> &str {
"Loads the config from the path given"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set(args, registry).await
}
}
pub async fn set(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let name_span = args.call_info.name_tag.clone();
let (LoadArgs { load }, _) = args.process(&registry).await?;
let configuration = load.item().clone();
let result = crate::data::config::read(name_span, &Some(configuration))?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream())
}

View File

@ -0,0 +1,17 @@
pub mod clear;
pub mod command;
pub mod get;
pub mod load;
pub mod path;
pub mod remove;
pub mod set;
pub mod set_into;
pub use clear::SubCommand as ConfigClear;
pub use command::Command as Config;
pub use get::SubCommand as ConfigGet;
pub use load::SubCommand as ConfigLoad;
pub use path::SubCommand as ConfigPath;
pub use remove::SubCommand as ConfigRemove;
pub use set::SubCommand as ConfigSet;
pub use set_into::SubCommand as ConfigSetInto;
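All of these subcommands share the same read → mutate → write round trip against the config file; a minimal standalone sketch of that pattern (plain toml and std::fs here, whereas the code above goes through crate::data::config):

use std::{fs, io, path::Path};
use toml::value::{Table, Value};

fn set_key(config_path: &Path, key: &str, value: Value) -> io::Result<()> {
    // read the existing document, or start from an empty table
    let doc = fs::read_to_string(config_path).unwrap_or_default();
    let mut table: Table = toml::from_str(&doc).unwrap_or_else(|_| Table::new());
    table.insert(key.to_string(), value);
    let out = toml::to_string(&table).expect("config table should serialize");
    fs::write(config_path, out)
}

fn main() -> io::Result<()> {
    // file name is illustrative only
    set_key(Path::new("nu_config_demo.toml"), "completion_mode", Value::String("circular".into()))
}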

View File

@ -0,0 +1,49 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue};
pub struct SubCommand;
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config path"
}
fn signature(&self) -> Signature {
Signature::build("config path")
}
fn usage(&self) -> &str {
"return the path to the config file"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
path(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the path to the current config file",
example: "config path",
result: None,
}]
}
}
pub async fn path(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let path = config::default_path()?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::Path(path)).into_value(args.call_info.name_tag),
)))
}

View File

@ -0,0 +1,75 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct RemoveArgs {
remove: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config remove"
}
fn signature(&self) -> Signature {
Signature::build("config remove").required(
"remove",
SyntaxShape::Any,
"remove a value from the config",
)
}
fn usage(&self) -> &str {
"Removes a value from the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
remove(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Remove the startup commands",
example: "config remove startup",
result: None,
}]
}
}
pub async fn remove(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (RemoveArgs { remove }, _) = args.process(&registry).await?;
let mut result = crate::data::config::read(name_span, &None)?;
let key = remove.to_string();
if result.contains_key(&key) {
result.swap_remove(&key);
config::write(&result, &None)?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(remove.tag()),
)])
.to_output_stream())
} else {
Err(ShellError::labeled_error(
"Key does not exist in config",
"key",
remove.tag(),
))
}
}

View File

@ -0,0 +1,67 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct SetArgs {
key: Tagged<String>,
value: Value,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config set"
}
fn signature(&self) -> Signature {
Signature::build("config set")
.required("key", SyntaxShape::String, "variable name to set")
.required("value", SyntaxShape::Any, "value to use")
}
fn usage(&self) -> &str {
"Sets a value in the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Set completion_mode to circular",
example: "config set [completion_mode circular]",
result: None,
}]
}
}
pub async fn set(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (SetArgs { key, value }, _) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
result.insert(key.to_string(), value.clone());
config::write(&result, &None)?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(&value.tag),
)))
}

View File

@ -0,0 +1,98 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct SetIntoArgs {
set_into: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config set_into"
}
fn signature(&self) -> Signature {
Signature::build("config set_into").required(
"set_into",
SyntaxShape::String,
"sets a variable from values in the pipeline",
)
}
fn usage(&self) -> &str {
"Sets a value in the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set_into(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Store the contents of the pipeline as a path",
example: "echo ['/usr/bin' '/bin'] | config set_into path",
result: None,
}]
}
}
pub async fn set_into(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag.clone();
let (SetIntoArgs { set_into: v }, input) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
// In the original code, this is set to `Some` if the `load` flag is set
let configuration = None;
let rows: Vec<Value> = input.collect().await;
let key = v.to_string();
Ok(if rows.is_empty() {
return Err(ShellError::labeled_error(
"No values given for set_into",
"needs value(s) from pipeline",
v.tag(),
));
} else if rows.len() == 1 {
// A single value
let value = &rows[0];
result.insert(key, value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
} else {
// Take in the pipeline as a table
let value = UntaggedValue::Table(rows).into_value(name.clone());
result.insert(key, value);
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
})
}

View File

@ -0,0 +1,358 @@
pub const BAT_LANGUAGES: &[&str] = &[
"as",
"csv",
"tsv",
"applescript",
"script editor",
"s",
"S",
"adoc",
"asciidoc",
"asc",
"asa",
"yasm",
"nasm",
"asm",
"inc",
"mac",
"awk",
"bat",
"cmd",
"bib",
"sh",
"bash",
"zsh",
".bash_aliases",
".bash_completions",
".bash_functions",
".bash_login",
".bash_logout",
".bash_profile",
".bash_variables",
".bashrc",
".profile",
".textmate_init",
".zshrc",
"PKGBUILD",
".ebuild",
".eclass",
"c",
"h",
"cs",
"csx",
"cpp",
"cc",
"cp",
"cxx",
"c++",
"C",
"h",
"hh",
"hpp",
"hxx",
"h++",
"inl",
"ipp",
"cabal",
"clj",
"cljc",
"cljs",
"edn",
"CMakeLists.txt",
"cmake",
"h.in",
"hh.in",
"hpp.in",
"hxx.in",
"h++.in",
"CMakeCache.txt",
"cr",
"css",
"css.erb",
"css.liquid",
"d",
"di",
"dart",
"diff",
"patch",
"Dockerfile",
"dockerfile",
"ex",
"exs",
"elm",
"erl",
"hrl",
"Emakefile",
"emakefile",
"fs",
"fsi",
"fsx",
"fs",
"fsi",
"fsx",
"fish",
"attributes",
"gitattributes",
".gitattributes",
"COMMIT_EDITMSG",
"MERGE_MSG",
"TAG_EDITMSG",
"gitconfig",
".gitconfig",
".gitmodules",
"exclude",
"gitignore",
".gitignore",
".git",
"gitlog",
"git-rebase-todo",
"go",
"dot",
"DOT",
"gv",
"groovy",
"gvy",
"gradle",
"Jenkinsfile",
"hs",
"hs",
"hsc",
"show-nonprintable",
"html",
"htm",
"shtml",
"xhtml",
"asp",
"html.eex",
"yaws",
"rails",
"rhtml",
"erb",
"html.erb",
"adp",
"twig",
"html.twig",
"ini",
"INI",
"INF",
"reg",
"REG",
"lng",
"cfg",
"CFG",
"desktop",
"url",
"URL",
".editorconfig",
".hgrc",
"hgrc",
"java",
"bsh",
"properties",
"jsp",
"js",
"htc",
"js",
"jsx",
"babel",
"es6",
"js.erb",
"json",
"sublime-settings",
"sublime-menu",
"sublime-keymap",
"sublime-mousemap",
"sublime-theme",
"sublime-build",
"sublime-project",
"sublime-completions",
"sublime-commands",
"sublime-macro",
"sublime-color-scheme",
"ipynb",
"Pipfile.lock",
"jsonnet",
"libsonnet",
"libjsonnet",
"jl",
"kt",
"kts",
"tex",
"ltx",
"less",
"css.less",
"lisp",
"cl",
"clisp",
"l",
"mud",
"el",
"scm",
"ss",
"lsp",
"fasl",
"lhs",
"lua",
"make",
"GNUmakefile",
"makefile",
"Makefile",
"makefile.am",
"Makefile.am",
"makefile.in",
"Makefile.in",
"OCamlMakefile",
"mak",
"mk",
"md",
"mdown",
"markdown",
"markdn",
"matlab",
"build",
"nix",
"m",
"h",
"mm",
"M",
"h",
"ml",
"mli",
"mll",
"mly",
"pas",
"p",
"dpr",
"pl",
"pm",
"pod",
"t",
"PL",
"php",
"php3",
"php4",
"php5",
"php7",
"phps",
"phpt",
"phtml",
"txt",
"ps1",
"psm1",
"psd1",
"proto",
"protodevel",
"pb.txt",
"proto.text",
"textpb",
"pbtxt",
"prototxt",
"pp",
"epp",
"purs",
"py",
"py3",
"pyw",
"pyi",
"pyx",
"pyx.in",
"pxd",
"pxd.in",
"pxi",
"pxi.in",
"rpy",
"cpy",
"SConstruct",
"Sconstruct",
"sconstruct",
"SConscript",
"gyp",
"gypi",
"Snakefile",
"wscript",
"R",
"r",
"s",
"S",
"Rprofile",
"rd",
"re",
"rst",
"rest",
"robot",
"rb",
"Appfile",
"Appraisals",
"Berksfile",
"Brewfile",
"capfile",
"cgi",
"Cheffile",
"config.ru",
"Deliverfile",
"Fastfile",
"fcgi",
"Gemfile",
"gemspec",
"Guardfile",
"irbrc",
"jbuilder",
"Podfile",
"podspec",
"prawn",
"rabl",
"rake",
"Rakefile",
"Rantfile",
"rbx",
"rjs",
"ruby.rail",
"Scanfile",
"simplecov",
"Snapfile",
"thor",
"Thorfile",
"Vagrantfile",
"haml",
"sass",
"rxml",
"builder",
"rs",
"scala",
"sbt",
"sql",
"ddl",
"dml",
"erbsql",
"sql.erb",
"swift",
"log",
"tcl",
"tf",
"tfvars",
"hcl",
"sty",
"cls",
"textile",
"toml",
"tml",
"Cargo.lock",
"Gopkg.lock",
"Pipfile",
"ts",
"tsx",
"varlink",
"vim",
".vimrc",
"xml",
"xsd",
"xslt",
"tld",
"dtml",
"rss",
"opml",
"svg",
"yaml",
"yml",
"sublime-syntax",
];

View File

@ -45,7 +45,7 @@ impl WholeStreamCommand for Default {
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Give a default 'target' to all file entries",
- example: "ls -af | default target 'nothing'",
+ example: "ls -la | default target 'nothing'",
result: None,
}]
}
@ -60,13 +60,13 @@ async fn default(
Ok(input
.map(move |item| {
- let should_add = match item {
- Value {
- value: UntaggedValue::Row(ref r),
- ..
- } => r.get_data(&column.item).borrow().is_none(),
- _ => false,
- };
+ let should_add = matches!(
+ item,
+ Value {
+ value: UntaggedValue::Row(ref r),
+ ..
+ } if r.get_data(&column.item).borrow().is_none()
+ );
if should_add {
match item.insert_data_at_path(&column.item, value.clone()) {
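The rewrite above folds a two-arm `match` that returned a bool into the `matches!` macro with a pattern guard; a generic illustration of the same idiom:

enum Item {
    Row(Option<i32>),
    Other,
}

fn needs_default(item: &Item) -> bool {
    // equivalent to: match item { Item::Row(v) if v.is_none() => true, _ => false }
    matches!(item, Item::Row(v) if v.is_none())
}

fn main() {
    assert!(needs_default(&Item::Row(None)));
    assert!(!needs_default(&Item::Other));
}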

View File

@ -2,7 +2,9 @@ use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{hir::Block, ReturnSuccess, Signature, SyntaxShape, Value}; use nu_protocol::{
hir::Block, hir::ExternalRedirection, ReturnSuccess, Signature, SyntaxShape, Value,
};
pub struct Do; pub struct Do;
@ -19,7 +21,7 @@ impl WholeStreamCommand for Do {
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("with-env") Signature::build("do")
.required("block", SyntaxShape::Block, "the block to run ") .required("block", SyntaxShape::Block, "the block to run ")
.switch( .switch(
"ignore_errors", "ignore_errors",
@ -61,7 +63,7 @@ async fn do_(
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
let registry = registry.clone(); let registry = registry.clone();
let is_last = raw_args.call_info.args.is_last; let external_redirection = raw_args.call_info.args.external_redirection;
let mut context = Context::from_raw(&raw_args, &registry); let mut context = Context::from_raw(&raw_args, &registry);
let scope = raw_args.call_info.scope.clone(); let scope = raw_args.call_info.scope.clone();
@ -72,7 +74,26 @@ async fn do_(
}, },
input, input,
) = raw_args.process(&registry).await?; ) = raw_args.process(&registry).await?;
block.set_is_last(is_last);
let block_redirection = match external_redirection {
ExternalRedirection::None => {
if ignore_errors {
ExternalRedirection::Stderr
} else {
ExternalRedirection::None
}
}
ExternalRedirection::Stdout => {
if ignore_errors {
ExternalRedirection::StdoutAndStderr
} else {
ExternalRedirection::Stdout
}
}
x => x,
};
block.set_redirect(block_redirection);
let result = run_block( let result = run_block(
&block, &block,
@ -85,6 +106,9 @@ async fn do_(
.await; .await;
if ignore_errors { if ignore_errors {
// To properly ignore errors we need to redirect stderr, consume it, and remove
// any errors we see in the process.
match result { match result {
Ok(mut stream) => { Ok(mut stream) => {
let output = stream.drain_vec().await; let output = stream.drain_vec().await;
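The interesting part of this change is how `do --ignore_errors` widens the redirection so stderr gets captured and drained rather than reaching the terminal; restated as a small standalone function (the enum variants mirror the diff, the rest is illustration):

#[derive(Clone, Copy, Debug, PartialEq)]
enum ExternalRedirection {
    None,
    Stdout,
    Stderr,
    StdoutAndStderr,
}

fn block_redirection(outer: ExternalRedirection, ignore_errors: bool) -> ExternalRedirection {
    use ExternalRedirection::*;
    match (outer, ignore_errors) {
        // fold stderr into the redirection so it can be consumed and discarded
        (None, true) => Stderr,
        (Stdout, true) => StdoutAndStderr,
        (x, _) => x,
    }
}

fn main() {
    assert_eq!(
        block_redirection(ExternalRedirection::Stdout, true),
        ExternalRedirection::StdoutAndStderr
    );
}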

View File

@ -35,20 +35,7 @@ impl WholeStreamCommand for Drop {
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
- let (DropArgs { rows }, input) = args.process(&registry).await?;
- let mut v: Vec<_> = input.into_vec().await;
- let rows_to_drop = if let Some(quantity) = rows {
- *quantity as usize
- } else {
- 1
- };
- for _ in 0..rows_to_drop {
- v.pop();
- }
- Ok(futures::stream::iter(v).to_output_stream())
+ drop(args, registry).await
}
fn examples(&self) -> Vec<Example> {
@ -70,6 +57,31 @@ impl WholeStreamCommand for Drop {
}
}
async fn drop(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let (DropArgs { rows }, input) = args.process(&registry).await?;
let v: Vec<_> = input.into_vec().await;
let rows_to_drop = if let Some(quantity) = rows {
*quantity as usize
} else {
1
};
Ok(if rows_to_drop == 0 {
futures::stream::iter(v).to_output_stream()
} else {
let k = if v.len() < rows_to_drop {
0
} else {
v.len() - rows_to_drop
};
let iter = v.into_iter().take(k);
futures::stream::iter(iter).to_output_stream()
})
}
#[cfg(test)]
mod tests {
use super::Drop;
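The new `drop` body above keeps the first `len - n` rows instead of popping in a loop, so asking to drop more rows than exist simply yields an empty stream; the same logic on a plain Vec:

fn drop_last<T>(v: Vec<T>, n: usize) -> Vec<T> {
    // keep the first len - n items, saturating at zero when n exceeds the length
    let keep = v.len().saturating_sub(n);
    v.into_iter().take(keep).collect()
}

fn main() {
    assert_eq!(drop_last(vec![1, 2, 3], 1), vec![1, 2]);
    assert!(drop_last(vec![1, 2, 3], 5).is_empty());
}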

View File

@ -343,7 +343,7 @@ where
let values = vec.into_iter().map(Into::into).collect::<Vec<Value>>(); let values = vec.into_iter().map(Into::into).collect::<Vec<Value>>();
UntaggedValue::Table(values) UntaggedValue::Table(values)
} }
.retag(tag) .into_value(tag)
} }
impl From<DirInfo> for Value { impl From<DirInfo> for Value {
@ -352,17 +352,17 @@ impl From<DirInfo> for Value {
r.insert( r.insert(
"path".to_string(), "path".to_string(),
UntaggedValue::path(d.path).retag(&d.tag), UntaggedValue::path(d.path).into_value(&d.tag),
); );
r.insert( r.insert(
"apparent".to_string(), "apparent".to_string(),
UntaggedValue::bytes(d.size).retag(&d.tag), UntaggedValue::filesize(d.size).into_value(&d.tag),
); );
r.insert( r.insert(
"physical".to_string(), "physical".to_string(),
UntaggedValue::bytes(d.blocks).retag(&d.tag), UntaggedValue::filesize(d.blocks).into_value(&d.tag),
); );
r.insert("directories".to_string(), value_from_vec(d.dirs, &d.tag)); r.insert("directories".to_string(), value_from_vec(d.dirs, &d.tag));
@ -376,7 +376,7 @@ impl From<DirInfo> for Value {
.map(move |e| UntaggedValue::Error(e).into_untagged_value()) .map(move |e| UntaggedValue::Error(e).into_untagged_value())
.collect::<Vec<Value>>(), .collect::<Vec<Value>>(),
) )
.retag(&d.tag); .into_value(&d.tag);
r.insert("errors".to_string(), v); r.insert("errors".to_string(), v);
} }
@ -394,30 +394,33 @@ impl From<FileInfo> for Value {
r.insert( r.insert(
"path".to_string(), "path".to_string(),
UntaggedValue::path(f.path).retag(&f.tag), UntaggedValue::path(f.path).into_value(&f.tag),
); );
r.insert( r.insert(
"apparent".to_string(), "apparent".to_string(),
UntaggedValue::bytes(f.size).retag(&f.tag), UntaggedValue::filesize(f.size).into_value(&f.tag),
); );
let b = f let b = f
.blocks .blocks
.map(UntaggedValue::bytes) .map(UntaggedValue::filesize)
.unwrap_or_else(UntaggedValue::nothing) .unwrap_or_else(UntaggedValue::nothing)
.retag(&f.tag); .into_value(&f.tag);
r.insert("physical".to_string(), b); r.insert("physical".to_string(), b);
r.insert( r.insert(
"directories".to_string(), "directories".to_string(),
UntaggedValue::nothing().retag(&f.tag), UntaggedValue::nothing().into_value(&f.tag),
); );
r.insert("files".to_string(), UntaggedValue::nothing().retag(&f.tag)); r.insert(
"files".to_string(),
UntaggedValue::nothing().into_value(&f.tag),
);
UntaggedValue::row(r).retag(&f.tag) UntaggedValue::row(r).into_value(&f.tag)
} }
} }

View File

@@ -7,14 +7,16 @@ use futures::stream::once;
 use nu_errors::ShellError;
 use nu_protocol::{
 hir::Block, hir::Expression, hir::SpannedExpression, hir::Synthetic, Scope, Signature,
-SyntaxShape, UntaggedValue, Value,
+SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
 };
+use nu_source::Tagged;
 pub struct Each;
 #[derive(Deserialize)]
 pub struct EachArgs {
 block: Block,
+numbered: Tagged<bool>,
 }
 #[async_trait]
@@ -24,10 +26,12 @@ impl WholeStreamCommand for Each {
 }
 fn signature(&self) -> Signature {
-Signature::build("each").required(
-"block",
-SyntaxShape::Block,
-"the block to run on each row",
+Signature::build("each")
+.required("block", SyntaxShape::Block, "the block to run on each row")
+.switch(
+"numbered",
+"returned a numbered item ($it.index and $it.item)",
+Some('n'),
 )
 }
@@ -45,6 +49,11 @@ impl WholeStreamCommand for Each {
 fn examples(&self) -> Vec<Example> {
 vec![
+Example {
+description: "Echo the sum of each row",
+example: "echo [[1 2] [3 4]] | each { echo $it | math sum }",
+result: None,
+},
 Example {
 description: "Echo the square of each integer",
 example: "echo [1 2 3] | each { echo $(= $it * $it) }",
@@ -55,28 +64,23 @@ impl WholeStreamCommand for Each {
 ]),
 },
 Example {
-description: "Echo the sum of each row",
-example: "echo [[1 2] [3 4]] | each { echo $it | math sum }",
-result: Some(vec![
-UntaggedValue::int(3).into(),
-UntaggedValue::int(7).into(),
-]),
+description: "Number each item and echo a message",
+example:
+"echo ['bob' 'fred'] | each --numbered { echo `{{$it.index}} is {{$it.item}}` }",
+result: Some(vec![Value::from("0 is bob"), Value::from("1 is fred")]),
 },
 ]
 }
 }
 fn is_expanded_it_usage(head: &SpannedExpression) -> bool {
-match &*head {
-SpannedExpression {
+matches!(&*head, SpannedExpression {
 expr: Expression::Synthetic(Synthetic::String(s)),
 ..
-} if s == "expanded-each" => true,
-_ => false,
-}
+} if s == "expanded-each")
 }
-async fn process_row(
+pub async fn process_row(
 block: Arc<Block>,
 scope: Arc<Scope>,
 head: Arc<Box<SpannedExpression>>,
@@ -101,6 +105,14 @@ async fn process_row(
 .to_output_stream())
 }
+pub(crate) fn make_indexed_item(index: usize, item: Value) -> Value {
+let mut dict = TaggedDictBuilder::new(item.tag());
+dict.insert_untagged("index", UntaggedValue::int(index));
+dict.insert_value("item", item);
+dict.into_value()
+}
 async fn each(
 raw_args: CommandArgs,
 registry: &CommandRegistry,
@@ -111,12 +123,34 @@ async fn each(
 let context = Arc::new(Context::from_raw(&raw_args, &registry));
 let (each_args, input): (EachArgs, _) = raw_args.process(&registry).await?;
 let block = Arc::new(each_args.block);
+if each_args.numbered.item {
+Ok(input
+.enumerate()
+.then(move |input| {
+let block = block.clone();
+let scope = scope.clone();
+let head = head.clone();
+let context = context.clone();
+let row = make_indexed_item(input.0, input.1);
+async {
+match process_row(block, scope, head, context, row).await {
+Ok(s) => s,
+Err(e) => OutputStream::one(Err(e)),
+}
+}
+})
+.flatten()
+.to_output_stream())
+} else {
 Ok(input
 .then(move |input| {
 let block = block.clone();
 let scope = scope.clone();
 let head = head.clone();
 let context = context.clone();
 async {
 match process_row(block, scope, head, context, input).await {
 Ok(s) => s,
@@ -127,6 +161,7 @@ async fn each(
 .flatten()
 .to_output_stream())
 }
+}
 #[cfg(test)]
 mod tests {
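The new `--numbered` switch wraps every row in an index/item pair before the block runs. A usage sketch lifted from the example added in this diff (0.18-era syntax):

> echo ['bob' 'fred'] | each --numbered { echo `{{$it.index}} is {{$it.item}}` }

which is expected to produce "0 is bob" and "1 is fred".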

View File

@@ -3,7 +3,7 @@ use crate::prelude::*;
 use nu_errors::ShellError;
 use nu_protocol::hir::Operator;
 use nu_protocol::{
-Primitive, RangeInclusion, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
+Primitive, Range, RangeInclusion, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
 };
 pub struct Echo;
@@ -55,60 +55,86 @@ async fn echo(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStr
 let registry = registry.clone();
 let (args, _): (EchoArgs, _) = args.process(&registry).await?;
-let stream = args.rest.into_iter().map(|i| {
-match i.as_string() {
-Ok(s) => {
-OutputStream::one(Ok(ReturnSuccess::Value(
+let stream = args.rest.into_iter().map(|i| match i.as_string() {
+Ok(s) => OutputStream::one(Ok(ReturnSuccess::Value(
 UntaggedValue::string(s).into_value(i.tag.clone()),
-)))
-}
+))),
 _ => match i {
 Value {
 value: UntaggedValue::Table(table),
 ..
-} => {
-futures::stream::iter(table.into_iter().map(ReturnSuccess::value)).to_output_stream()
-}
+} => futures::stream::iter(table.into_iter().map(ReturnSuccess::value))
+.to_output_stream(),
 Value {
 value: UntaggedValue::Primitive(Primitive::Range(range)),
-tag
-} => {
-let mut output_vec = vec![];
-let mut current = range.from.0.item;
-while current != range.to.0.item {
-output_vec.push(Ok(ReturnSuccess::Value(UntaggedValue::Primitive(current.clone()).into_value(&tag))));
-current = match crate::data::value::compute_values(Operator::Plus, &UntaggedValue::Primitive(current), &UntaggedValue::int(1)) {
-Ok(result) => match result {
-UntaggedValue::Primitive(p) => p,
-_ => {
-return OutputStream::one(Err(ShellError::unimplemented("Internal error: expected a primitive result from increment")));
-}
-},
-Err((left_type, right_type)) => {
-return OutputStream::one(Err(ShellError::coerce_error(
-left_type.spanned(tag.span),
-right_type.spanned(tag.span),
-)));
-}
-}
-}
-if let RangeInclusion::Inclusive = range.to.1 {
-output_vec.push(Ok(ReturnSuccess::Value(UntaggedValue::Primitive(current).into_value(&tag))));
-}
-futures::stream::iter(output_vec.into_iter()).to_output_stream()
-}
-_ => {
-OutputStream::one(Ok(ReturnSuccess::Value(i.clone())))
-}
-},
-}
+tag,
+} => futures::stream::iter(RangeIterator::new(*range, tag)).to_output_stream(),
+_ => OutputStream::one(Ok(ReturnSuccess::Value(i.clone()))),
+},
 });
 Ok(futures::stream::iter(stream).flatten().to_output_stream())
 }
struct RangeIterator {
curr: Primitive,
end: Primitive,
tag: Tag,
is_end_inclusive: bool,
is_done: bool,
}
impl RangeIterator {
pub fn new(range: Range, tag: Tag) -> RangeIterator {
RangeIterator {
curr: range.from.0.item,
end: range.to.0.item,
tag,
is_end_inclusive: matches!(range.to.1, RangeInclusion::Inclusive),
is_done: false,
}
}
}
impl Iterator for RangeIterator {
type Item = Result<ReturnSuccess, ShellError>;
fn next(&mut self) -> Option<Self::Item> {
if self.curr != self.end {
let output = UntaggedValue::Primitive(self.curr.clone()).into_value(self.tag.clone());
self.curr = match crate::data::value::compute_values(
Operator::Plus,
&UntaggedValue::Primitive(self.curr.clone()),
&UntaggedValue::int(1),
) {
Ok(result) => match result {
UntaggedValue::Primitive(p) => p,
_ => {
return Some(Err(ShellError::unimplemented(
"Internal error: expected a primitive result from increment",
)));
}
},
Err((left_type, right_type)) => {
return Some(Err(ShellError::coerce_error(
left_type.spanned(self.tag.span),
right_type.spanned(self.tag.span),
)));
}
};
Some(ReturnSuccess::value(output))
} else if self.is_end_inclusive && !self.is_done {
self.is_done = true;
Some(ReturnSuccess::value(
UntaggedValue::Primitive(self.curr.clone()).into_value(self.tag.clone()),
))
} else {
// TODO: add inclusive/exclusive ranges
None
}
}
}
#[cfg(test)]
mod tests {
use super::Echo;
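The RangeIterator above replaces the old inline while-loop: it streams one primitive per step and emits the upper bound only for inclusive ranges. A hypothetical usage sketch (assuming the 0.18-era range literal syntax; illustrative only):

> echo 1..3

would be expected to emit 1, 2 and 3 for an inclusive range, and to stop before the end for an exclusive one.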

View File

@ -3,6 +3,7 @@ use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::hir::ExternalRedirection;
use nu_protocol::{ use nu_protocol::{
CommandAction, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value, CommandAction, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
}; };
@ -121,21 +122,16 @@ async fn enter(
let full_path = std::path::PathBuf::from(cwd); let full_path = std::path::PathBuf::from(cwd);
let (file_extension, contents, contents_tag) = crate::commands::open::fetch( let (file_extension, tagged_contents) = crate::commands::open::fetch(
&full_path, &full_path,
&PathBuf::from(location_clone), &PathBuf::from(location_clone),
tag.span, tag.span,
match encoding { encoding,
Some(e) => e.to_string(),
_ => "".to_string(),
},
) )
.await?; .await?;
match contents { match tagged_contents.value {
UntaggedValue::Primitive(Primitive::String(_)) => { UntaggedValue::Primitive(Primitive::String(_)) => {
let tagged_contents = contents.into_value(&contents_tag);
if let Some(extension) = file_extension { if let Some(extension) = file_extension {
let command_name = format!("from {}", extension); let command_name = format!("from {}", extension);
if let Some(converter) = registry.get_command(&command_name) { if let Some(converter) = registry.get_command(&command_name) {
@ -150,24 +146,24 @@ async fn enter(
positional: None, positional: None,
named: None, named: None,
span: Span::unknown(), span: Span::unknown(),
is_last: false, external_redirection: ExternalRedirection::Stdout,
}, },
name_tag: tag.clone(), name_tag: tag.clone(),
scope: scope.clone(), scope: scope.clone(),
}, },
}; };
let tag = tagged_contents.tag.clone();
let mut result = converter let mut result = converter
.run(new_args.with_input(vec![tagged_contents]), &registry) .run(new_args.with_input(vec![tagged_contents]), &registry)
.await?; .await?;
let result_vec: Vec<Result<ReturnSuccess, ShellError>> = let result_vec: Vec<Result<ReturnSuccess, ShellError>> =
result.drain_vec().await; result.drain_vec().await;
Ok(futures::stream::iter(result_vec.into_iter().map( Ok(futures::stream::iter(result_vec.into_iter().map(
move |res| match res { move |res| match res {
Ok(ReturnSuccess::Value(Value { value, .. })) => Ok( Ok(ReturnSuccess::Value(Value { value, .. })) => Ok(
ReturnSuccess::Action(CommandAction::EnterValueShell(Value { ReturnSuccess::Action(CommandAction::EnterValueShell(Value {
value, value,
tag: contents_tag.clone(), tag: tag.clone(),
})), })),
), ),
x => x, x => x,
@ -185,13 +181,9 @@ async fn enter(
))) )))
} }
} }
_ => { _ => Ok(OutputStream::one(ReturnSuccess::action(
let tagged_contents = contents.into_value(contents_tag);
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents), CommandAction::EnterValueShell(tagged_contents),
))) ))),
}
} }
} }
} }

View File

@ -1,83 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::{evaluate, fetch};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::{SpannedItem, Tagged};
use nu_value_ext::ValueExt;
pub struct EvaluateBy;
#[derive(Deserialize)]
pub struct EvaluateByArgs {
evaluate_with: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for EvaluateBy {
fn name(&self) -> &str {
"evaluate-by"
}
fn signature(&self) -> Signature {
Signature::build("evaluate-by").named(
"evaluate_with",
SyntaxShape::String,
"the name of the column to evaluate by",
Some('w'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows evaluated by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
evaluate_by(args, registry).await
}
}
pub async fn evaluate_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (EvaluateByArgs { evaluate_with }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let evaluate_with = if let Some(evaluator) = evaluate_with {
Some(evaluator.item().clone())
} else {
None
};
match evaluate(&values[0], evaluate_with, name) {
Ok(evaluated) => Ok(OutputStream::one(ReturnSuccess::value(evaluated))),
Err(err) => Err(err),
}
}
}
#[cfg(test)]
mod tests {
use super::EvaluateBy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(EvaluateBy {})
}
}

View File

@@ -71,25 +71,23 @@ impl WholeStreamCommand for Every {
 async fn every(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
 let registry = registry.clone();
 let (EveryArgs { stride, skip }, input) = args.process(&registry).await?;
-let v: Vec<_> = input.into_vec().await;
-let stride_desired = if stride.item < 1 { 1 } else { stride.item } as usize;
-let mut values_vec_deque = VecDeque::new();
-for (i, x) in v.iter().enumerate() {
-let should_include = if skip.item {
-i % stride_desired != 0
-} else {
-i % stride_desired == 0
-};
+let stride = stride.item;
+let skip = skip.item;
+Ok(input
+.enumerate()
+.filter_map(move |(i, value)| async move {
+let stride_desired = if stride < 1 { 1 } else { stride } as usize;
+let should_include = skip == (i % stride_desired != 0);
 if should_include {
-values_vec_deque.push_back(ReturnSuccess::value(x.clone()));
+Some(ReturnSuccess::value(value))
+} else {
+None
 }
-}
-Ok(futures::stream::iter(values_vec_deque).to_output_stream())
+})
+.to_output_stream())
 }
 #[cfg(test)]
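The rewritten `every` filters the enumerated stream directly instead of buffering it into a Vec. A hypothetical usage sketch (0.18-era syntax; the flag name is inferred from the `EveryArgs { stride, skip }` arguments and is not shown in this diff):

> echo [1 2 3 4 5] | every 2

keeps the 1st, 3rd and 5th rows, while the skip flag inverts the selection and keeps the rows in between.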

View File

@@ -5,7 +5,6 @@ use ical::parser::vcard::component::*;
 use ical::property::Property;
 use nu_errors::ShellError;
 use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
-use std::io::BufReader;
 pub struct FromVcf;
@@ -42,28 +41,20 @@ async fn from_vcf(
 let input = args.input;
 let input_string = input.collect_string(tag.clone()).await?.item;
-let input_bytes = input_string.as_bytes();
-let buf_reader = BufReader::new(input_bytes);
-let parser = ical::VcardParser::new(buf_reader);
-let mut values_vec_deque = VecDeque::new();
-for contact in parser {
-match contact {
-Ok(c) => {
-values_vec_deque.push_back(ReturnSuccess::value(contact_to_value(c, tag.clone())))
-}
-Err(_) => {
-return Err(ShellError::labeled_error(
+let input_bytes = input_string.into_bytes();
+let cursor = std::io::Cursor::new(input_bytes);
+let parser = ical::VcardParser::new(cursor);
+let iter = parser.map(move |contact| match contact {
+Ok(c) => ReturnSuccess::value(contact_to_value(c, tag.clone())),
+Err(_) => Err(ShellError::labeled_error(
 "Could not parse as .vcf",
 "input cannot be parsed as .vcf",
 tag.clone(),
-))
-}
-}
-}
-Ok(futures::stream::iter(values_vec_deque).to_output_stream())
+)),
+});
+Ok(futures::stream::iter(iter).to_output_stream())
 }
 fn contact_to_value(contact: VcardContact, tag: Tag) -> Value {

View File

@ -135,7 +135,6 @@ async fn from_xml(
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use crate::commands::from_xml; use crate::commands::from_xml;
use indexmap::IndexMap; use indexmap::IndexMap;
use nu_protocol::{UntaggedValue, Value}; use nu_protocol::{UntaggedValue, Value};

View File

@@ -10,7 +10,7 @@ pub struct GroupBy;
 #[derive(Deserialize)]
 pub struct GroupByArgs {
-column_name: Option<Tagged<String>>,
+grouper: Option<Value>,
 }
 #[async_trait]
@@ -21,14 +21,14 @@ impl WholeStreamCommand for GroupBy {
 fn signature(&self) -> Signature {
 Signature::build("group-by").optional(
-"column_name",
-SyntaxShape::String,
-"the name of the column to group by",
+"grouper",
+SyntaxShape::Any,
+"the grouper value to use",
 )
 }
 fn usage(&self) -> &str {
-"Creates a new table with the data from the table rows grouped by the column given."
+"create a new table grouped."
 }
 async fn run(
@@ -42,12 +42,17 @@ impl WholeStreamCommand for GroupBy {
 fn examples(&self) -> Vec<Example> {
 vec![
 Example {
-description: "Group items by type",
+description: "group items by column named \"type\"",
 example: r#"ls | group-by type"#,
 result: None,
 },
 Example {
-description: "Group items by their value",
+description: "blocks can be used for generating a grouping key (same as above)",
+example: r#"ls | group-by { get type }"#,
+result: None,
+},
+Example {
+description: "you can also group by raw values by leaving out the argument",
 example: "echo [1 3 1 3 2 1 1] | group-by",
 result: Some(vec![UntaggedValue::row(indexmap! {
 "1".to_string() => UntaggedValue::Table(vec![
@@ -68,26 +73,95 @@ impl WholeStreamCommand for GroupBy {
 })
 .into()]),
 },
+Example {
+description: "write pipelines for a more involved grouping key",
+example:
+"echo [1 3 1 3 2 1 1] | group-by { echo `({{$it}} - 1) % 3` | calc | str from }",
+result: None,
+},
 ]
 }
 }
 enum Grouper {
 ByColumn(Option<Tagged<String>>),
+ByBlock,
 }
 pub async fn group_by(
 args: CommandArgs,
 registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
-let registry = registry.clone();
 let name = args.call_info.name_tag.clone();
-let (GroupByArgs { column_name }, input) = args.process(&registry).await?;
+let registry = registry.clone();
+let head = Arc::new(args.call_info.args.head.clone());
+let scope = Arc::new(args.call_info.scope.clone());
+let context = Arc::new(Context::from_raw(&args, &registry));
+let (GroupByArgs { grouper }, input) = args.process(&registry).await?;
 let values: Vec<Value> = input.collect().await;
let mut keys: Vec<Result<String, ShellError>> = vec![];
let mut group_strategy = Grouper::ByColumn(None);
match grouper {
Some(Value {
value: UntaggedValue::Block(block_given),
..
}) => {
let block = Arc::new(block_given);
let error_key = "error";
for value in values.iter() {
let run = block.clone();
let scope = scope.clone();
let head = head.clone();
let context = context.clone();
match crate::commands::each::process_row(run, scope, head, context, value.clone())
.await
{
Ok(mut s) => {
let collection: Vec<Result<ReturnSuccess, ShellError>> =
s.drain_vec().await;
if collection.len() > 1 {
return Err(ShellError::labeled_error(
"expected one value from the block",
"requires a table with one value for grouping",
&name,
));
}
let value = match collection.get(0) {
Some(Ok(return_value)) => {
return_value.raw_value().unwrap_or_else(|| {
UntaggedValue::string(error_key).into_value(&name)
})
}
Some(Err(_)) | None => {
UntaggedValue::string(error_key).into_value(&name)
}
};
keys.push(as_string(&value));
}
Err(_) => {
keys.push(Ok(error_key.into()));
}
}
}
group_strategy = Grouper::ByBlock;
}
Some(other) => {
group_strategy = Grouper::ByColumn(Some(as_string(&other)?.tagged(&name)));
}
_ => {}
}
if values.is_empty() { if values.is_empty() {
return Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Expected table from pipeline", "expected table from pipeline",
"requires a table input", "requires a table input",
name, name,
)); ));
@ -95,10 +169,22 @@ pub async fn group_by(
let values = UntaggedValue::table(&values).into_value(&name); let values = UntaggedValue::table(&values).into_value(&name);
match group(&column_name, &values, name) { let group_value = match group_strategy {
Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))), Grouper::ByBlock => {
Err(reason) => Err(reason), let map = keys.clone();
let block = Box::new(move |idx: usize, row: &Value| match map.get(idx) {
Some(Ok(key)) => Ok(key.clone()),
Some(Err(reason)) => Err(reason.clone()),
None => as_string(row),
});
crate::utils::data::group(&values, &Some(block), &name)
} }
Grouper::ByColumn(column_name) => group(&column_name, &values, name),
};
Ok(OutputStream::one(ReturnSuccess::value(group_value?)))
} }
pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError { pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError {
@ -141,7 +227,7 @@ pub fn group(
match grouper { match grouper {
Grouper::ByColumn(Some(column_name)) => { Grouper::ByColumn(Some(column_name)) => {
let block = Box::new(move |row: &Value| { let block = Box::new(move |_, row: &Value| {
match row.get_data_by_key(column_name.borrow_spanned()) { match row.get_data_by_key(column_name.borrow_spanned()) {
Some(group_key) => Ok(as_string(&group_key)?), Some(group_key) => Ok(as_string(&group_key)?),
None => Err(suggestions(column_name.borrow_tagged(), &row)), None => Err(suggestions(column_name.borrow_tagged(), &row)),
@ -151,90 +237,45 @@ pub fn group(
crate::utils::data::group(&values, &Some(block), &name) crate::utils::data::group(&values, &Some(block), &name)
} }
Grouper::ByColumn(None) => { Grouper::ByColumn(None) => {
let block = Box::new(move |row: &Value| match as_string(row) { let block = Box::new(move |_, row: &Value| as_string(row));
Ok(group_key) => Ok(group_key),
Err(reason) => Err(reason),
});
crate::utils::data::group(&values, &Some(block), &name) crate::utils::data::group(&values, &Some(block), &name)
} }
Grouper::ByBlock => Err(ShellError::unimplemented(
"Block not implemented: This should never happen.",
)),
} }
} }
#[cfg(test)] #[cfg(test)]
mod tests { mod tests {
use super::group; use super::group;
use indexmap::IndexMap; use crate::utils::data::helpers::{committers, date, int, row, string, table};
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{UntaggedValue, Value};
use nu_source::*; use nu_source::*;
fn string(input: impl Into<String>) -> Value {
UntaggedValue::string(input.into()).into_untagged_value()
}
fn row(entries: IndexMap<String, Value>) -> Value {
UntaggedValue::row(entries).into_untagged_value()
}
fn table(list: &[Value]) -> Value {
UntaggedValue::table(list).into_untagged_value()
}
fn nu_releases_committers() -> Vec<Value> {
vec![
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")},
),
row(
indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")},
),
row(
indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")},
),
]
}
#[test] #[test]
fn groups_table_by_date_column() -> Result<(), ShellError> { fn groups_table_by_date_column() -> Result<(), ShellError> {
let for_key = Some(String::from("date").tagged_unknown()); let for_key = Some(String::from("date").tagged_unknown());
let sample = table(&nu_releases_committers()); let sample = table(&committers());
assert_eq!( assert_eq!(
group(&for_key, &sample, Tag::unknown())?, group(&for_key, &sample, Tag::unknown())?,
row(indexmap! { row(indexmap! {
"August 23-2019".into() => table(&[ "2019-07-23".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}), row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-07-23"), "chickens".into() => int(10) }),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}), row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-07-23"), "chickens".into() => int(5) }),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")}) row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-07-23"), "chickens".into() => int(2) })
]), ]),
"October 10-2019".into() => table(&[ "2019-10-10".into() => table(&[
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}), row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-10-10"), "chickens".into() => int(6) }),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}), row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-10-10"), "chickens".into() => int(15) }),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")}) row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-10-10"), "chickens".into() => int(30) })
]), ]),
"Sept 24-2019".into() => table(&[ "2019-09-24".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}), row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-09-24"), "chickens".into() => int(20) }),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}), row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-09-24"), "chickens".into() => int(4) }),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")}) row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-09-24"), "chickens".into() => int(10) })
]), ]),
}) })
); );
@ -245,25 +286,25 @@ mod tests {
#[test] #[test]
fn groups_table_by_country_column() -> Result<(), ShellError> { fn groups_table_by_country_column() -> Result<(), ShellError> {
let for_key = Some(String::from("country").tagged_unknown()); let for_key = Some(String::from("country").tagged_unknown());
let sample = table(&nu_releases_committers()); let sample = table(&committers());
assert_eq!( assert_eq!(
group(&for_key, &sample, Tag::unknown())?, group(&for_key, &sample, Tag::unknown())?,
row(indexmap! { row(indexmap! {
"EC".into() => table(&[ "EC".into() => table(&[
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}), row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-07-23"), "chickens".into() => int(10) }),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}), row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-09-24"), "chickens".into() => int(20) }),
row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")}) row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-10-10"), "chickens".into() => int(30) })
]), ]),
"NZ".into() => table(&[ "NZ".into() => table(&[
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}), row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-07-23"), "chickens".into() => int(5) }),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}), row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-10-10"), "chickens".into() => int(15) }),
row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")}) row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-09-24"), "chickens".into() => int(10) })
]), ]),
"US".into() => table(&[ "US".into() => table(&[
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}), row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-10-10"), "chickens".into() => int(6) }),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}), row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-09-24"), "chickens".into() => int(4) }),
row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")}), row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-07-23"), "chickens".into() => int(2) }),
]), ]),
}) })
); );
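Taken together, `group-by` now accepts a column name, a block, or no argument at all. Usage sketches based on the examples in this diff (0.18-era syntax):

> ls | group-by type
> ls | group-by { get type }
> echo [1 3 1 3 2 1 1] | group-by

The first two are equivalent; the last groups the raw values themselves, yielding one group per distinct value.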

View File

@@ -34,7 +34,7 @@ impl WholeStreamCommand for GroupByDate {
 }
 fn usage(&self) -> &str {
-"Creates a new table with the data from the table rows grouped by the column given."
+"creates a table grouped by date."
 }
 async fn run(
@@ -98,54 +98,42 @@ pub async fn group_by_date(
 Grouper::ByDate(None)
 };
-match (grouper_date, grouper_column) {
+let value_result = match (grouper_date, grouper_column) {
 (Grouper::ByDate(None), GroupByColumn::Name(None)) => {
-let block = Box::new(move |row: &Value| row.format("%Y-%b-%d"));
-match crate::utils::data::group(&values, &Some(block), &name) {
-Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
-Err(err) => Err(err),
-}
+let block = Box::new(move |_, row: &Value| row.format("%Y-%m-%d"));
+crate::utils::data::group(&values, &Some(block), &name)
 }
 (Grouper::ByDate(None), GroupByColumn::Name(Some(column_name))) => {
-let block = Box::new(move |row: &Value| {
-let group_key = match row.get_data_by_key(column_name.borrow_spanned()) {
-Some(group_key) => Ok(group_key),
-None => Err(suggestions(column_name.borrow_tagged(), &row)),
-};
-group_key?.format("%Y-%b-%d")
+let block = Box::new(move |_, row: &Value| {
+let group_key = row
+.get_data_by_key(column_name.borrow_spanned())
+.ok_or_else(|| suggestions(column_name.borrow_tagged(), &row));
+group_key?.format("%Y-%m-%d")
 });
-match crate::utils::data::group(&values, &Some(block), &name) {
-Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
-Err(err) => Err(err),
-}
+crate::utils::data::group(&values, &Some(block), &name)
 }
 (Grouper::ByDate(Some(fmt)), GroupByColumn::Name(None)) => {
-let block = Box::new(move |row: &Value| row.format(&fmt));
-match crate::utils::data::group(&values, &Some(block), &name) {
-Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
-Err(err) => Err(err),
-}
+let block = Box::new(move |_, row: &Value| row.format(&fmt));
+crate::utils::data::group(&values, &Some(block), &name)
 }
 (Grouper::ByDate(Some(fmt)), GroupByColumn::Name(Some(column_name))) => {
-let block = Box::new(move |row: &Value| {
-let group_key = match row.get_data_by_key(column_name.borrow_spanned()) {
-Some(group_key) => Ok(group_key),
-None => Err(suggestions(column_name.borrow_tagged(), &row)),
-};
+let block = Box::new(move |_, row: &Value| {
+let group_key = row
+.get_data_by_key(column_name.borrow_spanned())
+.ok_or_else(|| suggestions(column_name.borrow_tagged(), &row));
 group_key?.format(&fmt)
 });
-match crate::utils::data::group(&values, &Some(block), &name) {
-Ok(grouped) => Ok(OutputStream::one(ReturnSuccess::value(grouped))),
-Err(err) => Err(err),
-}
-}
+crate::utils::data::group(&values, &Some(block), &name)
 }
+};
+Ok(OutputStream::one(ReturnSuccess::value(value_result?)))
 }
 }

View File

@@ -1,12 +1,10 @@
 use crate::commands::WholeStreamCommand;
 use crate::data::command_dict;
+use crate::documentation::{generate_docs, get_documentation, DocumentationConfig};
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{
-NamedType, PositionalType, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder,
-UntaggedValue,
-};
+use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
 use nu_source::{SpannedItem, Tagged};
 use nu_value_ext::get_data_by_key;
@@ -98,6 +96,10 @@ async fn help(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStr
 }))
 .to_output_stream(),
 )
+} else if rest[0].item == "generate_docs" {
+Ok(OutputStream::one(ReturnSuccess::value(generate_docs(
+&registry,
+))))
 } else if rest.len() == 2 {
 // Check for a subcommand
 let command_name = format!("{} {}", rest[0].item, rest[1].item);
@@ -150,165 +152,8 @@ You can also learn more at https://www.nushell.sh/book/"#;
 }
 }
-#[allow(clippy::cognitive_complexity)]
 pub fn get_help(cmd: &dyn WholeStreamCommand, registry: &CommandRegistry) -> String {
-let cmd_name = cmd.name();
+get_documentation(cmd, registry, &DocumentationConfig::default())
let signature = cmd.signature();
let mut long_desc = String::new();
long_desc.push_str(&cmd.usage());
long_desc.push_str("\n");
let mut subcommands = String::new();
for name in registry.names() {
if name.starts_with(&format!("{} ", cmd_name)) {
let subcommand = registry.get_command(&name).expect("This shouldn't happen");
subcommands.push_str(&format!(" {} - {}\n", name, subcommand.usage()));
}
}
let mut one_liner = String::new();
one_liner.push_str(&signature.name);
one_liner.push_str(" ");
for positional in &signature.positional {
match &positional.0 {
PositionalType::Mandatory(name, _m) => {
one_liner.push_str(&format!("<{}> ", name));
}
PositionalType::Optional(name, _o) => {
one_liner.push_str(&format!("({}) ", name));
}
}
}
if signature.rest_positional.is_some() {
one_liner.push_str(" ...args");
}
if !subcommands.is_empty() {
one_liner.push_str("<subcommand> ");
}
if !signature.named.is_empty() {
one_liner.push_str("{flags} ");
}
long_desc.push_str(&format!("\nUsage:\n > {}\n", one_liner));
if !subcommands.is_empty() {
long_desc.push_str("\nSubcommands:\n");
long_desc.push_str(&subcommands);
}
if !signature.positional.is_empty() || signature.rest_positional.is_some() {
long_desc.push_str("\nParameters:\n");
for positional in &signature.positional {
match &positional.0 {
PositionalType::Mandatory(name, _m) => {
long_desc.push_str(&format!(" <{}> {}\n", name, positional.1));
}
PositionalType::Optional(name, _o) => {
long_desc.push_str(&format!(" ({}) {}\n", name, positional.1));
}
}
}
if let Some(rest_positional) = &signature.rest_positional {
long_desc.push_str(&format!(" ...args: {}\n", rest_positional.1));
}
}
if !signature.named.is_empty() {
long_desc.push_str(&get_flags_section(&signature))
}
let palette = crate::shell::palette::DefaultPalette {};
let examples = cmd.examples();
if !examples.is_empty() {
long_desc.push_str("\nExamples:");
}
for example in examples {
long_desc.push_str("\n");
long_desc.push_str(" ");
long_desc.push_str(example.description);
let colored_example =
crate::shell::helper::Painter::paint_string(example.example, registry, &palette);
long_desc.push_str(&format!("\n > {}\n", colored_example));
}
long_desc.push_str("\n");
long_desc
}
fn get_flags_section(signature: &Signature) -> String {
let mut long_desc = String::new();
long_desc.push_str("\nFlags:\n");
for (flag, ty) in &signature.named {
let msg = match ty.0 {
NamedType::Switch(s) => {
if let Some(c) = s {
format!(
" -{}, --{}{} {}\n",
c,
flag,
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{}{} {}\n",
flag,
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
NamedType::Mandatory(s, m) => {
if let Some(c) = s {
format!(
" -{}, --{} <{}> (required parameter){} {}\n",
c,
flag,
m.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{} <{}> (required parameter){} {}\n",
flag,
m.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
NamedType::Optional(s, o) => {
if let Some(c) = s {
format!(
" -{}, --{} <{}>{} {}\n",
c,
flag,
o.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
} else {
format!(
" --{} <{}>{} {}\n",
flag,
o.display(),
if !ty.1.is_empty() { ":" } else { "" },
ty.1
)
}
}
};
long_desc.push_str(&msg);
}
long_desc
}
#[cfg(test)]

View File

@ -1,19 +1,13 @@
use crate::commands::group_by::group;
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::prelude::*; use crate::prelude::*;
use crate::utils::data_processing::{columns_sorted, evaluate, map_max, reduce, t_sort};
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{ use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value};
Primitive, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use nu_source::Tagged; use nu_source::Tagged;
use num_traits::{ToPrimitive, Zero};
pub struct Histogram; pub struct Histogram;
#[derive(Deserialize)] #[derive(Deserialize)]
pub struct HistogramArgs { pub struct HistogramArgs {
column_name: Tagged<String>,
rest: Vec<Tagged<String>>, rest: Vec<Tagged<String>>,
} }
@ -24,13 +18,7 @@ impl WholeStreamCommand for Histogram {
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("histogram") Signature::build("histogram").rest(
.required(
"column_name",
SyntaxShape::String,
"the name of the column to graph by",
)
.rest(
SyntaxShape::String, SyntaxShape::String,
"column name to give the histogram's frequency column", "column name to give the histogram's frequency column",
) )
@ -57,8 +45,8 @@ impl WholeStreamCommand for Histogram {
}, },
Example { Example {
description: description:
"Get a histogram for the types of files, with frequency column named count", "Get a histogram for the types of files, with frequency column named percentage",
example: "ls | histogram type count", example: "ls | histogram type percentage",
result: None, result: None,
}, },
Example { Example {
@ -77,26 +65,16 @@ pub async fn histogram(
let registry = registry.clone(); let registry = registry.clone();
let name = args.call_info.name_tag.clone(); let name = args.call_info.name_tag.clone();
let (HistogramArgs { column_name, rest }, input) = args.process(&registry).await?; let (HistogramArgs { rest: mut columns }, input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await; let values: Vec<Value> = input.collect().await;
let values = UntaggedValue::table(&values).into_value(&name);
let groups = group(&Some(column_name.clone()), &values, &name)?; let column_grouper = if !columns.is_empty() {
let group_labels = columns_sorted(Some(column_name.clone()), &groups, &name); Some(columns.remove(0))
let sorted = t_sort(Some(column_name.clone()), None, &groups, &name)?; } else {
let evaled = evaluate(&sorted, None, &name)?; None
let reduced = reduce(&evaled, None, &name)?; };
let maxima = map_max(&reduced, None, &name)?;
let percents = percentages(&reduced, maxima, &name)?;
match percents { let column_names_supplied: Vec<_> = columns.iter().map(|f| f.item.clone()).collect();
Value {
value: UntaggedValue::Table(datasets),
..
} => {
let mut idx = 0;
let column_names_supplied: Vec<_> = rest.iter().map(|f| f.item.clone()).collect();
let frequency_column_name = if column_names_supplied.is_empty() { let frequency_column_name = if column_names_supplied.is_empty() {
"frequency".to_string() "frequency".to_string()
@ -104,51 +82,41 @@ pub async fn histogram(
column_names_supplied[0].clone() column_names_supplied[0].clone()
}; };
let column = (*column_name).clone(); let column = if let Some(ref column) = column_grouper {
column.clone()
let count_column_name = "count".to_string();
let count_shell_error = ShellError::labeled_error(
"Unable to load group count",
"unabled to load group count",
&name,
);
let mut count_values: Vec<u64> = Vec::new();
for table_entry in reduced.table_entries() {
match table_entry {
Value {
value: UntaggedValue::Table(list),
..
} => {
for i in list {
if let Ok(count) = i.value.clone().into_value(&name).as_u64() {
count_values.push(count);
} else { } else {
return Err(count_shell_error); "value".to_string().tagged(&name)
} };
}
}
_ => {
return Err(count_shell_error);
}
}
}
if let Value { let results = crate::utils::data::report(
value: UntaggedValue::Table(start), &UntaggedValue::table(&values).into_value(&name),
.. crate::utils::data::Operation {
} = datasets.get(0).ok_or_else(|| { grouper: Some(Box::new(move |_, _| Ok(String::from("frequencies")))),
ShellError::labeled_error( splitter: Some(splitter(column_grouper)),
"Unable to load dataset", format: None,
"unabled to load dataset", eval: &None,
},
&name, &name,
) )?;
})? {
let start = start.clone(); let labels = results.labels.y.clone();
Ok( let mut idx = 0;
futures::stream::iter(start.into_iter().map(move |percentage| {
Ok(futures::stream::iter(
results
.percentages
.table_entries()
.map(move |value| {
let values = value.table_entries().cloned().collect::<Vec<_>>();
let count = values.len();
(count, values[count - 1].clone())
})
.collect::<Vec<_>>()
.into_iter()
.map(move |(count, value)| {
let mut fact = TaggedDictBuilder::new(&name); let mut fact = TaggedDictBuilder::new(&name);
let value: Tagged<String> = group_labels let column_value = labels
.get(idx) .get(idx)
.ok_or_else(|| { .ok_or_else(|| {
ShellError::labeled_error( ShellError::labeled_error(
@ -158,99 +126,44 @@ pub async fn histogram(
) )
})? })?
.clone(); .clone();
fact.insert_value(
&column,
UntaggedValue::string(value.item).into_value(value.tag),
);
fact.insert_untagged( fact.insert_value(&column.item, column_value);
&count_column_name, fact.insert_untagged("count", UntaggedValue::int(count));
UntaggedValue::int(count_values[idx]),
);
if let Value {
value: UntaggedValue::Primitive(Primitive::Int(ref num)),
ref tag,
} = percentage
{
let string = std::iter::repeat("*") let string = std::iter::repeat("*")
.take(num.to_i32().ok_or_else(|| { .take(value.as_u64().map_err(|_| {
ShellError::labeled_error( ShellError::labeled_error("expected a number", "expected a number", &name)
"Expected a number",
"expected a number",
tag,
)
})? as usize) })? as usize)
.collect::<String>(); .collect::<String>();
fact.insert_untagged(
&frequency_column_name, fact.insert_untagged(&frequency_column_name, UntaggedValue::string(string));
UntaggedValue::string(string),
);
}
idx += 1; idx += 1;
ReturnSuccess::value(fact.into_value()) ReturnSuccess::value(fact.into_value())
})) }),
.to_output_stream(),
) )
} else { .to_output_stream())
Ok(OutputStream::empty())
}
}
_ => Ok(OutputStream::empty()),
}
} }
fn percentages(values: &Value, max: Value, tag: impl Into<Tag>) -> Result<Value, ShellError> { fn splitter(
let tag = tag.into(); by: Option<Tagged<String>>,
) -> Box<dyn Fn(usize, &Value) -> Result<String, ShellError> + Send> {
match by {
Some(column) => Box::new(move |_, row: &Value| {
let key = &column;
let results: Value = match values { match row.get_data_by_key(key.borrow_spanned()) {
Value { Some(key) => nu_value_ext::as_string(&key),
value: UntaggedValue::Table(datasets), None => Err(ShellError::labeled_error(
.. "unknown column",
} => { "unknown column",
let datasets: Vec<_> = datasets key.tag(),
.iter() )),
.map(|subsets| match subsets {
Value {
value: UntaggedValue::Table(data),
..
} => {
let data = data
.iter()
.map(|d| match d {
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} => {
let max = match &max {
Value {
value: UntaggedValue::Primitive(Primitive::Int(maxima)),
..
} => maxima.clone(),
_ => Zero::zero(),
};
let n = (n * 100) / max;
UntaggedValue::int(n).into_value(&tag)
} }
_ => UntaggedValue::int(0).into_value(&tag), }),
}) None => Box::new(move |_, row: &Value| nu_value_ext::as_string(&row)),
.collect::<Vec<_>>();
UntaggedValue::Table(data).into_value(&tag)
} }
_ => UntaggedValue::Table(vec![]).into_value(&tag),
})
.collect();
UntaggedValue::Table(datasets).into_value(&tag)
}
other => other.clone(),
};
Ok(results)
} }
#[cfg(test)] #[cfg(test)]
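With the histogram rework above, the column to graph by is optional and the trailing names only rename the frequency column. A usage sketch based on the examples kept in this diff (0.18-era syntax; illustrative only):

> ls | histogram type
> ls | histogram type percentage

The second form labels the frequency column "percentage" instead of the default "frequency".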

View File

@ -0,0 +1,188 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{hir::Block, hir::ClassifiedCommand, Signature, SyntaxShape, UntaggedValue};
pub struct If;
#[derive(Deserialize)]
pub struct IfArgs {
condition: Block,
then_case: Block,
else_case: Block,
}
#[async_trait]
impl WholeStreamCommand for If {
fn name(&self) -> &str {
"if"
}
fn signature(&self) -> Signature {
Signature::build("if")
.required(
"condition",
SyntaxShape::Math,
"the condition that must match",
)
.required(
"then_case",
SyntaxShape::Block,
"block to run if condition is true",
)
.required(
"else_case",
SyntaxShape::Block,
"block to run if condition is false",
)
}
fn usage(&self) -> &str {
"Run blocks if a condition is true or false."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
if_command(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Run a block if a condition is true",
example: "echo 10 | if $it > 5 { echo 'greater than 5' } { echo 'less than or equal to 5' }",
result: Some(vec![UntaggedValue::string("greater than 5").into()]),
},
Example {
description: "Run a block if a condition is false",
example: "echo 1 | if $it > 5 { echo 'greater than 5' } { echo 'less than or equal to 5' }",
result: Some(vec![UntaggedValue::string("less than or equal to 5").into()]),
},
]
}
}
async fn if_command(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let scope = Arc::new(raw_args.call_info.scope.clone());
let tag = raw_args.call_info.name_tag.clone();
let context = Arc::new(Context::from_raw(&raw_args, &registry));
let (
IfArgs {
condition,
then_case,
else_case,
},
input,
) = raw_args.process(&registry).await?;
let condition = {
if condition.block.len() != 1 {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
match condition.block[0].list.get(0) {
Some(item) => match item {
ClassifiedCommand::Expr(expr) => expr.clone(),
_ => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
},
None => {
return Err(ShellError::labeled_error(
"Expected a condition",
"expected a condition",
tag,
));
}
}
};
Ok(input
.then(move |input| {
let condition = condition.clone();
let then_case = then_case.clone();
let else_case = else_case.clone();
let registry = registry.clone();
let scope = scope.clone();
let mut context = context.clone();
async move {
//FIXME: should we use the scope that's brought in as well?
let condition =
evaluate_baseline_expr(&condition, &*registry, &input, &scope.vars, &scope.env)
.await;
match condition {
Ok(condition) => match condition.as_bool() {
Ok(b) => {
if b {
match run_block(
&then_case,
Arc::make_mut(&mut context),
InputStream::empty(),
&input,
&scope.vars,
&scope.env,
)
.await
{
Ok(stream) => stream.to_output_stream(),
Err(e) => futures::stream::iter(vec![Err(e)].into_iter())
.to_output_stream(),
}
} else {
match run_block(
&else_case,
Arc::make_mut(&mut context),
InputStream::empty(),
&input,
&scope.vars,
&scope.env,
)
.await
{
Ok(stream) => stream.to_output_stream(),
Err(e) => futures::stream::iter(vec![Err(e)].into_iter())
.to_output_stream(),
}
}
}
Err(e) => {
futures::stream::iter(vec![Err(e)].into_iter()).to_output_stream()
}
},
Err(e) => futures::stream::iter(vec![Err(e)].into_iter()).to_output_stream(),
}
}
})
.flatten()
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::If;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(If {})
}
}
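A usage sketch of the new `if` command, lifted from its own examples (0.18-era syntax):

> echo 10 | if $it > 5 { echo 'greater than 5' } { echo 'less than or equal to 5' }

prints "greater than 5"; piping in a value of 1 instead would take the second block.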

View File

@ -1,8 +1,10 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use futures::stream::once;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{ColumnPath, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value}; use nu_protocol::{ColumnPath, ReturnSuccess, Scope, Signature, SyntaxShape, UntaggedValue, Value};
use nu_value_ext::ValueExt; use nu_value_ext::ValueExt;
pub struct Insert; pub struct Insert;
@ -26,11 +28,7 @@ impl WholeStreamCommand for Insert {
SyntaxShape::ColumnPath, SyntaxShape::ColumnPath,
"the column name to insert", "the column name to insert",
) )
.required( .required("value", SyntaxShape::Any, "the value to give the cell(s)")
"value",
SyntaxShape::String,
"the value to give the cell(s)",
)
} }
fn usage(&self) -> &str { fn usage(&self) -> &str {
@ -46,27 +44,111 @@ impl WholeStreamCommand for Insert {
} }
} }
async fn insert(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> { async fn process_row(
let registry = registry.clone(); scope: Arc<Scope>,
mut context: Arc<Context>,
input: Value,
mut value: Arc<Value>,
column: Arc<ColumnPath>,
) -> Result<OutputStream, ShellError> {
let value = Arc::make_mut(&mut value);
let (InsertArgs { column, value }, input) = args.process(&registry).await?; Ok(match value {
Value {
value: UntaggedValue::Block(block),
..
} => {
let for_block = input.clone();
let input_stream = once(async { Ok(for_block) }).to_input_stream();
Ok(input let result = run_block(
.map(move |row| match row { &block,
Arc::make_mut(&mut context),
input_stream,
&input,
&scope.vars,
&scope.env,
)
.await;
match result {
Ok(mut stream) => {
let errors = context.get_errors();
if let Some(error) = errors.first() {
return Err(error.clone());
}
match input {
obj
@
Value { Value {
value: UntaggedValue::Row(_), value: UntaggedValue::Row(_),
.. ..
} => match row.insert_data_at_column_path(&column, value.clone()) { } => {
Ok(v) => Ok(ReturnSuccess::Value(v)), if let Some(result) = stream.next().await {
Err(err) => Err(err), match obj.insert_data_at_column_path(&column, result) {
}, Ok(v) => OutputStream::one(ReturnSuccess::value(v)),
Err(e) => OutputStream::one(Err(e)),
Value { tag, .. } => Err(ShellError::labeled_error( }
} else {
OutputStream::empty()
}
}
Value { tag, .. } => OutputStream::one(Err(ShellError::labeled_error(
"Unrecognized type in stream", "Unrecognized type in stream",
"original value", "original value",
tag, tag,
)), ))),
}
}
Err(e) => OutputStream::one(Err(e)),
}
}
_ => match input {
obj
@
Value {
value: UntaggedValue::Row(_),
..
} => match obj.insert_data_at_column_path(&column, value.clone()) {
Ok(v) => OutputStream::one(ReturnSuccess::value(v)),
Err(e) => OutputStream::one(Err(e)),
},
Value { tag, .. } => OutputStream::one(Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
tag,
))),
},
}) })
}
async fn insert(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let scope = Arc::new(raw_args.call_info.scope.clone());
let context = Arc::new(Context::from_raw(&raw_args, &registry));
let (InsertArgs { column, value }, input) = raw_args.process(&registry).await?;
let value = Arc::new(value);
let column = Arc::new(column);
Ok(input
.then(move |input| {
let scope = scope.clone();
let context = context.clone();
let value = value.clone();
let column = column.clone();
async {
match process_row(scope, context, input, value, column).await {
Ok(s) => s,
Err(e) => OutputStream::one(Err(e)),
}
}
})
.flatten()
.to_output_stream()) .to_output_stream())
} }

View File

@@ -106,14 +106,14 @@ async fn is_empty(
 ..
 } => {
 if val.is_empty() {
-match obj.replace_data_at_column_path(&field, default.clone()) {
-Some(v) => Ok(v),
-None => Err(ShellError::labeled_error(
+obj.replace_data_at_column_path(&field, default.clone())
+.ok_or_else(|| {
+ShellError::labeled_error(
 "empty? could not find place to check emptiness",
 "column name",
 &field.tag,
-)),
-}
+)
+})
 } else {
 Ok(obj)
 }

View File

@@ -5,15 +5,15 @@ use nu_errors::ShellError;
 use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
 use nu_source::Tagged;
-pub struct Keep;
+pub struct Command;
 #[derive(Deserialize)]
-pub struct KeepArgs {
+pub struct Arguments {
 rows: Option<Tagged<usize>>,
 }
 #[async_trait]
-impl WholeStreamCommand for Keep {
+impl WholeStreamCommand for Command {
 fn name(&self) -> &str {
 "keep"
 }
@@ -22,7 +22,7 @@ impl WholeStreamCommand for Keep {
 Signature::build("keep").optional(
 "rows",
 SyntaxShape::Int,
-"starting from the front, the number of rows to keep",
+"Starting from the front, the number of rows to keep",
 )
 }
@@ -61,7 +61,7 @@ impl WholeStreamCommand for Keep {
 async fn keep(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
 let registry = registry.clone();
-let (KeepArgs { rows }, input) = args.process(&registry).await?;
+let (Arguments { rows }, input) = args.process(&registry).await?;
 let rows_desired = if let Some(quantity) = rows {
 *quantity
 } else {
@@ -73,12 +73,12 @@ async fn keep(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStr
 #[cfg(test)]
 mod tests {
-use super::Keep;
+use super::Command;
 #[test]
 fn examples_work_as_expected() {
 use crate::examples::test as test_examples;
-test_examples(Keep {})
+test_examples(Command {})
 }
 }

View File

@ -0,0 +1,7 @@
mod command;
mod until;
mod while_;
pub use command::Command as Keep;
pub use until::SubCommand as KeepUntil;
pub use while_::SubCommand as KeepWhile;
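The module above re-exports `keep` plus its new space-separated subcommands. Hypothetical usage sketches (0.18-era syntax; the conditions are illustrative, not taken from this diff):

> echo [1 2 3 4 5] | keep 2
> echo [1 2 3 4 5] | keep while $it < 3
> echo [1 2 3 4 5] | keep until $it > 3

`keep while` emits rows as long as the condition holds (1 and 2 here), and `keep until` emits rows until the condition first becomes true (1, 2 and 3).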

View File

@@ -5,20 +5,20 @@ use log::trace;
 use nu_errors::ShellError;
 use nu_protocol::{hir::ClassifiedCommand, Signature, SyntaxShape, UntaggedValue, Value};
-pub struct KeepUntil;
+pub struct SubCommand;
 #[async_trait]
-impl WholeStreamCommand for KeepUntil {
+impl WholeStreamCommand for SubCommand {
 fn name(&self) -> &str {
-"keep-until"
+"keep until"
 }
 fn signature(&self) -> Signature {
-Signature::build("keep-until")
+Signature::build("keep until")
 .required(
 "condition",
 SyntaxShape::Math,
-"the condition that must be met to stop keeping rows",
+"The condition that must be met to stop keeping rows",
 )
 .filter()
 }
@@ -100,10 +100,7 @@ impl WholeStreamCommand for KeepUntil {
 .await;
 trace!("RESULT = {:?}", result);
-match result {
-Ok(ref v) if v.is_true() => false,
-_ => true,
-}
+!matches!(result, Ok(ref v) if v.is_true())
 }
 })
 .to_output_stream())
@@ -112,12 +109,12 @@ impl WholeStreamCommand for KeepUntil {
 #[cfg(test)]
 mod tests {
-use super::KeepUntil;
+use super::SubCommand;
 #[test]
 fn examples_work_as_expected() {
 use crate::examples::test as test_examples;
-test_examples(KeepUntil {})
+test_examples(SubCommand {})
 }
 }

View File

@@ -5,20 +5,20 @@ use log::trace;
 use nu_errors::ShellError;
 use nu_protocol::{hir::ClassifiedCommand, Signature, SyntaxShape, UntaggedValue, Value};
-pub struct KeepWhile;
+pub struct SubCommand;
 #[async_trait]
-impl WholeStreamCommand for KeepWhile {
+impl WholeStreamCommand for SubCommand {
     fn name(&self) -> &str {
-        "keep-while"
+        "keep while"
     }
     fn signature(&self) -> Signature {
-        Signature::build("keep-while")
+        Signature::build("keep while")
             .required(
                 "condition",
                 SyntaxShape::Math,
-                "the condition that must be met to keep rows",
+                "The condition that must be met to keep rows",
             )
             .filter()
     }
@@ -100,10 +100,7 @@ impl WholeStreamCommand for KeepWhile {
                 .await;
             trace!("RESULT = {:?}", result);
-            match result {
-                Ok(ref v) if v.is_true() => true,
-                _ => false,
-            }
+            matches!(result, Ok(ref v) if v.is_true())
         }
     })
     .to_output_stream())
@@ -112,12 +109,12 @@ impl WholeStreamCommand for KeepWhile {
 #[cfg(test)]
 mod tests {
-    use super::KeepWhile;
+    use super::SubCommand;
     #[test]
     fn examples_work_as_expected() {
         use crate::examples::test as test_examples;
-        test_examples(KeepWhile {})
+        test_examples(SubCommand {})
     }
 }
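
Both `keep until` and `keep while` replace a two-arm match with the `matches!` macro; a small standalone check of the equivalence (using a hypothetical `result` value, plain bool instead of nushell's Value):

fn main() {
    let result: Result<bool, ()> = Ok(true);

    // the old style
    let old_style = match result {
        Ok(ref v) if *v => true,
        _ => false,
    };
    // the new style
    let new_style = matches!(result, Ok(ref v) if *v);
    assert_eq!(old_style, new_style);
}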

View File

@@ -2,7 +2,7 @@ use crate::commands::WholeStreamCommand;
 use crate::context::CommandRegistry;
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
+use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
 use nu_source::Tagged;
 pub struct Last;
@@ -63,25 +63,21 @@ async fn last(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let (LastArgs { rows }, input) = args.process(&registry).await?;
     let v: Vec<_> = input.into_vec().await;
-    let rows_desired = if let Some(quantity) = rows {
-        *quantity
+    let end_rows_desired = if let Some(quantity) = rows {
+        *quantity as usize
     } else {
         1
     };
-    let mut values_vec_deque = VecDeque::new();
-    let count = rows_desired as usize;
-    if count < v.len() {
-        let k = v.len() - count;
-        for x in v[k..].iter() {
-            values_vec_deque.push_back(ReturnSuccess::value(x.clone()));
-        }
-    }
-    Ok(futures::stream::iter(values_vec_deque).to_output_stream())
+    let beginning_rows_to_skip = if end_rows_desired < v.len() {
+        v.len() - end_rows_desired
+    } else {
+        0
+    };
+    let iter = v.into_iter().skip(beginning_rows_to_skip);
+    Ok(futures::stream::iter(iter).to_output_stream())
 }
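
The rewritten `last` keeps the final N rows by skipping the first len - N items instead of copying into a VecDeque. A minimal sketch of that logic on a plain Vec (not nushell types):

fn last_n<T>(v: Vec<T>, n: usize) -> Vec<T> {
    // skip everything except the last n items; skip nothing if n >= len
    let to_skip = if n < v.len() { v.len() - n } else { 0 };
    v.into_iter().skip(to_skip).collect()
}

fn main() {
    assert_eq!(last_n(vec![1, 2, 3, 4, 5], 2), vec![4, 5]);
    // asking for more rows than exist returns everything
    assert_eq!(last_n(vec![1, 2], 10), vec![1, 2]);
}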
#[cfg(test)] #[cfg(test)]

View File

@@ -2,6 +2,7 @@ use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
 use nu_errors::ShellError;
 use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
+use parking_lot::Mutex;
 pub struct Lines;
@@ -47,8 +48,7 @@ fn ends_with_line_ending(st: &str) -> bool {
 }
 async fn lines(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-    let leftover = Arc::new(vec![]);
-    let leftover_string = Arc::new(String::new());
+    let leftover_string = Arc::new(Mutex::new(String::new()));
     let registry = registry.clone();
     let args = args.evaluate_once(&registry).await?;
     let tag = args.name_tag();
@@ -62,91 +62,50 @@ async fn lines(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
         .input
         .chain(eos)
         .map(move |item| {
-            let mut leftover = leftover.clone();
-            let mut leftover_string = leftover_string.clone();
+            let leftover_string = leftover_string.clone();
             match item {
                 Value {
                     value: UntaggedValue::Primitive(Primitive::String(st)),
                     ..
-                } => {
-                    let st = (&*leftover_string).clone() + &st;
-                    if let Some(leftover) = Arc::get_mut(&mut leftover) {
-                        leftover.clear();
-                    }
-                    let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
-                    if !ends_with_line_ending(&st) {
-                        if let Some(last) = lines.pop() {
-                            if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                                leftover_string.clear();
-                                leftover_string.push_str(&last);
-                            }
-                        } else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                            leftover_string.clear();
-                        }
-                    } else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                        leftover_string.clear();
-                    }
-                    let success_lines: Vec<_> = lines
-                        .iter()
-                        .map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value()))
-                        .collect();
-                    futures::stream::iter(success_lines)
-                }
-                Value {
+                }
+                | Value {
                     value: UntaggedValue::Primitive(Primitive::Line(st)),
                     ..
                 } => {
+                    let mut leftover_string = leftover_string.lock();
                     let st = (&*leftover_string).clone() + &st;
-                    if let Some(leftover) = Arc::get_mut(&mut leftover) {
-                        leftover.clear();
-                    }
                     let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
+                    leftover_string.clear();
                     if !ends_with_line_ending(&st) {
                         if let Some(last) = lines.pop() {
-                            if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                                leftover_string.clear();
-                                leftover_string.push_str(&last);
-                            }
-                        } else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                            leftover_string.clear();
+                            leftover_string.push_str(&last);
                         }
-                    } else if let Some(leftover_string) = Arc::get_mut(&mut leftover_string) {
-                        leftover_string.clear();
                     }
                     let success_lines: Vec<_> = lines
                         .iter()
                         .map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value()))
                         .collect();
                     futures::stream::iter(success_lines)
                 }
                 Value {
                     value: UntaggedValue::Primitive(Primitive::EndOfStream),
                     ..
                 } => {
-                    if !leftover.is_empty() {
-                        let mut st = (&*leftover_string).clone();
-                        if let Ok(extra) = String::from_utf8((&*leftover).clone()) {
-                            st.push_str(&extra);
-                        }
-                        // futures::stream::iter(vec![ReturnSuccess::value(
-                        //     UntaggedValue::string(st).into_untagged_value(),
-                        // )])
-                        if !st.is_empty() {
-                            futures::stream::iter(vec![ReturnSuccess::value(
-                                UntaggedValue::string(&*leftover_string).into_untagged_value(),
-                            )])
-                        } else {
-                            futures::stream::iter(vec![])
-                        }
-                    } else {
-                        futures::stream::iter(vec![])
-                    }
+                    let st = (&*leftover_string).lock().clone();
+                    if !st.is_empty() {
+                        futures::stream::iter(vec![ReturnSuccess::value(
+                            UntaggedValue::string(st).into_untagged_value(),
+                        )])
+                    } else {
+                        futures::stream::iter(vec![])
+                    }
                 }
                 Value {
                     tag: value_span, ..
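
The `lines` rewrite keeps a trailing partial line in a shared Mutex<String> so it can be completed by the next chunk. A minimal sketch of that buffering idea with plain std types (the real code also handles "\r\n" via ends_with_line_ending):

fn split_chunk(leftover: &mut String, chunk: &str) -> Vec<String> {
    // prepend whatever was held back from the previous chunk
    let joined = format!("{}{}", leftover, chunk);
    leftover.clear();
    let mut lines: Vec<String> = joined.lines().map(|l| l.to_string()).collect();
    if !joined.ends_with('\n') {
        // hold back the incomplete final line for the next chunk
        if let Some(last) = lines.pop() {
            leftover.push_str(&last);
        }
    }
    lines
}

fn main() {
    let mut leftover = String::new();
    assert_eq!(split_chunk(&mut leftover, "a\nb\npartial"), vec!["a", "b"]);
    assert_eq!(split_chunk(&mut leftover, " line\nc\n"), vec!["partial line", "c"]);
    assert!(leftover.is_empty());
}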

View File

@@ -11,7 +11,7 @@ pub struct Ls;
 pub struct LsArgs {
     pub path: Option<Tagged<PathBuf>>,
     pub all: bool,
-    pub full: bool,
+    pub long: bool,
     #[serde(rename = "short-names")]
     pub short_names: bool,
     #[serde(rename = "with-symlink-targets")]
@@ -33,25 +33,26 @@ impl WholeStreamCommand for Ls {
             SyntaxShape::Pattern,
             "a path to get the directory contents from",
         )
-        .switch("all", "also show hidden files", Some('a'))
+        .switch("all", "Show hidden files", Some('a'))
         .switch(
-            "full",
-            "list all available columns for each entry",
-            Some('f'),
+            "long",
+            "List all available columns for each entry",
+            Some('l'),
         )
         .switch(
             "short-names",
-            "only print the file names and not the path",
+            "Only print the file names and not the path",
             Some('s'),
         )
         .switch(
+            // Delete this
            "with-symlink-targets",
-            "display the paths to the target files that symlinks point to",
+            "Display the paths to the target files that symlinks point to",
            Some('w'),
        )
        .switch(
            "du",
-            "display the apparent directory size in place of the directory metadata size",
+            "Display the apparent directory size in place of the directory metadata size",
            Some('d'),
        )
    }

View File

@ -1,84 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::data::value;
use crate::prelude::*;
use crate::utils::data_processing::map_max;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use num_traits::cast::ToPrimitive;
pub struct MapMaxBy;
#[derive(Deserialize)]
pub struct MapMaxByArgs {
column_name: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for MapMaxBy {
fn name(&self) -> &str {
"map-max-by"
}
fn signature(&self) -> Signature {
Signature::build("map-max-by").named(
"column_name",
SyntaxShape::String,
"the name of the column to map-max the table's rows",
Some('c'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows maxed by the column given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
map_max_by(args, registry).await
}
}
pub async fn map_max_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (MapMaxByArgs { column_name }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let map_by_column = if let Some(column_to_map) = column_name {
Some(column_to_map.item().clone())
} else {
None
};
match map_max(&values[0], map_by_column, name) {
Ok(table_maxed) => Ok(OutputStream::one(ReturnSuccess::value(table_maxed))),
Err(err) => Err(err),
}
}
}
#[cfg(test)]
mod tests {
use super::MapMaxBy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(MapMaxBy {})
}
}

View File

@@ -1,14 +1,17 @@
+use crate::prelude::*;
+use crate::commands::math::reducers::{reducer_for, Reduce};
 use crate::commands::math::utils::run_with_function;
 use crate::commands::WholeStreamCommand;
-use crate::prelude::*;
-use crate::utils::data_processing::{reducer_for, Reduce};
+use bigdecimal::{FromPrimitive, Zero};
 use nu_errors::ShellError;
 use nu_protocol::{
     hir::{convert_number_to_u64, Number, Operator},
     Primitive, Signature, UntaggedValue, Value,
 };
-use bigdecimal::FromPrimitive;
 pub struct SubCommand;
 #[async_trait]
@@ -55,23 +58,63 @@ impl WholeStreamCommand for SubCommand {
     }
 }
+fn to_byte(value: &Value) -> Option<Value> {
+    match &value.value {
+        UntaggedValue::Primitive(Primitive::Int(num)) => Some(
+            UntaggedValue::Primitive(Primitive::Filesize(convert_number_to_u64(&Number::Int(
+                num.clone(),
+            ))))
+            .into_untagged_value(),
+        ),
+        _ => None,
+    }
+}
 pub fn average(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
     let sum = reducer_for(Reduce::Summation);
     let number = BigDecimal::from_usize(values.len()).ok_or_else(|| {
-        ShellError::labeled_error(
-            "could not convert to big decimal",
-            "could not convert to big decimal",
-            &name.span,
-        )
+        ShellError::labeled_error("nothing to average", "nothing to average", &name.span)
     })?;
     let total_rows = UntaggedValue::decimal(number);
-    let total = sum(Value::zero(), values.to_vec())?;
+    let are_bytes = values
+        .get(0)
+        .ok_or_else(|| {
+            ShellError::unexpected("Cannot perform aggregate math operation on empty data")
+        })?
+        .is_filesize();
+    let total = if are_bytes {
+        to_byte(&sum(
+            UntaggedValue::int(0).into_untagged_value(),
+            values
+                .to_vec()
+                .iter()
+                .map(|v| match v {
+                    Value {
+                        value: UntaggedValue::Primitive(Primitive::Filesize(num)),
+                        ..
+                    } => UntaggedValue::int(*num as usize).into_untagged_value(),
+                    other => other.clone(),
+                })
+                .collect::<Vec<_>>(),
+        )?)
+        .ok_or_else(|| {
+            ShellError::labeled_error(
+                "could not convert to big decimal",
+                "could not convert to big decimal",
+                &name.span,
+            )
+        })
+    } else {
+        sum(UntaggedValue::int(0).into_untagged_value(), values.to_vec())
+    }?;
     match total {
         Value {
-            value: UntaggedValue::Primitive(Primitive::Bytes(num)),
+            value: UntaggedValue::Primitive(Primitive::Filesize(num)),
             ..
         } => {
             let left = UntaggedValue::from(Primitive::Int(num.into()));
@@ -81,7 +124,7 @@ pub fn average(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
             Ok(UntaggedValue::Primitive(Primitive::Decimal(result))) => {
                 let number = Number::Decimal(result);
                 let number = convert_number_to_u64(&number);
-                Ok(UntaggedValue::bytes(number).into_value(name))
+                Ok(UntaggedValue::filesize(number).into_value(name))
             }
             Ok(_) => Err(ShellError::labeled_error(
                 "could not calculate average of non-integer or unrelated types",

View File

@@ -35,12 +35,13 @@ impl WholeStreamCommand for Command {
 mod tests {
     use super::*;
     use crate::commands::math::{
-        avg::average, max::maximum, median::median, min::minimum, mode::mode, sum::summation,
-        utils::calculate, utils::MathFunction,
+        avg::average, max::maximum, median::median, min::minimum, mode::mode, stddev::stddev,
+        sum::summation, utils::calculate, utils::MathFunction, variance::variance,
     };
     use nu_plugin::row;
     use nu_plugin::test_helpers::value::{decimal, int, table};
     use nu_protocol::Value;
+    use std::str::FromStr;
     #[test]
     fn examples_work_as_expected() {
@@ -75,7 +76,9 @@ mod tests {
                 Ok(int(10)),
                 Ok(int(10)),
                 Ok(table(&[int(10)])),
+                Ok(decimal(0)),
                 Ok(int(10)),
+                Ok(decimal(0)),
             ],
         },
         TestCase {
@@ -88,7 +91,9 @@ mod tests {
                 Ok(int(30)),
                 Ok(int(20)),
                 Ok(table(&[int(10), int(20), int(30)])),
+                Ok(decimal(BigDecimal::from_str("8.164965809277260327324280249019637973219824935522233761442308557503201258191050088466198110348800783").expect("Could not convert to decimal from string"))),
                 Ok(int(60)),
+                Ok(decimal(BigDecimal::from_str("66.66666666666666666666666666666666666666666666666666666666666666666666666666666666666666666666666667").expect("Could not convert to decimal from string"))),
             ],
         },
         TestCase {
@@ -101,7 +106,9 @@ mod tests {
                 Ok(decimal(26.5)),
                 Ok(decimal(26.5)),
                 Ok(table(&[decimal(26.5)])),
+                Ok(decimal(BigDecimal::from_str("7.77817459305202276840928798315333943213319531457321440247173855894902863154158871367713143880202865").expect("Could not convert to decimal from string"))),
                 Ok(decimal(63)),
+                Ok(decimal(60.5)),
             ],
         },
         TestCase {
@@ -114,7 +121,9 @@ mod tests {
                 Ok(int(10)),
                 Ok(int(-11)),
                 Ok(table(&[int(-14), int(-11), int(10)])),
+                Ok(decimal(BigDecimal::from_str("10.67707825203131121081152396559571062628228776946058011397810604284900898365140801704064843595778374").expect("Could not convert to decimal from string"))),
                 Ok(int(-15)),
+                Ok(decimal(114)),
             ],
         },
         TestCase {
@@ -127,7 +136,9 @@ mod tests {
                 Ok(int(10)),
                 Ok(decimal(-11.5)),
                 Ok(table(&[decimal(-13.5), decimal(-11.5), int(10)])),
+                Ok(decimal(BigDecimal::from_str("10.63798226482196513098036125801342585449179971588207816421068645273754903468375890632981926875247027").expect("Could not convert to decimal from string"))),
                 Ok(decimal(-15)),
+                Ok(decimal(BigDecimal::from_str("113.1666666666666666666666666666666666666666666666666666666666666666666666666666666666666666666666667").expect("Could not convert to decimal from string"))),
             ],
         },
         TestCase {
@@ -148,7 +159,12 @@ mod tests {
                 "col1".to_owned() => table(&[int(1), int(2), int(3), int(4)]),
                 "col2".to_owned() => table(&[int(5), int(6), int(7), int(8)])
             ]),
+            Ok(row![
+                "col1".to_owned() => decimal(BigDecimal::from_str("1.118033988749894848204586834365638117720309179805762862135448622705260462818902449707207204189391137").expect("Could not convert to decimal from string")),
+                "col2".to_owned() => decimal(BigDecimal::from_str("1.118033988749894848204586834365638117720309179805762862135448622705260462818902449707207204189391137").expect("Could not convert to decimal from string"))
+            ]),
             Ok(row!["col1".to_owned() => int(10), "col2".to_owned() => int(26)]),
+            Ok(row!["col1".to_owned() => decimal(1.25), "col2".to_owned() => decimal(1.25)]),
         ],
     },
     // TODO-Uncomment once Issue: https://github.com/nushell/nushell/issues/1883 is resolved
@@ -162,8 +178,9 @@ mod tests {
     let test_tag = Tag::unknown();
     for tc in tt.iter() {
         let tc: &TestCase = tc; // Just for type annotations
-        let math_functions: Vec<MathFunction> =
-            vec![average, minimum, maximum, median, mode, summation];
+        let math_functions: Vec<MathFunction> = vec![
+            average, minimum, maximum, median, mode, stddev, summation, variance,
+        ];
         let results = math_functions
             .into_iter()
             .map(|mf| calculate(&tc.values, &test_tag, mf))
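
A quick check of the new expected values for the [10, 20, 30] test case above (the suite exercises the population forms, i.e. no --sample flag):

fn main() {
    let xs = [10.0_f64, 20.0, 30.0];
    let mean = xs.iter().sum::<f64>() / xs.len() as f64;
    // population variance and standard deviation
    let var = xs.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / xs.len() as f64;
    println!("variance = {:.4}, stddev = {:.4}", var, var.sqrt()); // ~66.6667 and ~8.1650
}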

View File

@ -0,0 +1,107 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct SubCommandArgs {
expression: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"math eval"
}
fn usage(&self) -> &str {
"Evaluate a math expression into a number"
}
fn signature(&self) -> Signature {
Signature::build("math eval").desc(self.usage()).optional(
"math expression",
SyntaxShape::String,
"the math expression to evaluate",
)
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
eval(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Evalulate math in the pipeline",
example: "echo '10 / 4' | math eval",
result: Some(vec![UntaggedValue::decimal(2.5).into()]),
}]
}
}
pub async fn eval(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.span;
let (SubCommandArgs { expression }, input) = args.process(registry).await?;
Ok(input
.map(move |x| {
if let Some(Tagged {
tag,
item: expression,
}) = &expression
{
UntaggedValue::string(expression).into_value(tag)
} else {
x
}
})
.map(move |input| {
if let Ok(string) = input.as_string() {
match parse(&string, &input.tag) {
Ok(value) => ReturnSuccess::value(value),
Err(err) => Err(ShellError::labeled_error(
"Math evaluation error",
err,
&input.tag.span,
)),
}
} else {
Err(ShellError::labeled_error(
"Expected a string from pipeline",
"requires string input",
name,
))
}
})
.to_output_stream())
}
pub fn parse<T: Into<Tag>>(math_expression: &str, tag: T) -> Result<Value, String> {
match meval::eval_str(math_expression) {
Ok(num) if num.is_infinite() || num.is_nan() => Err("cannot represent result".to_string()),
Ok(num) => Ok(UntaggedValue::from(Primitive::from(num)).into_value(tag)),
Err(error) => Err(error.to_string().to_lowercase()),
}
}
#[cfg(test)]
mod tests {
use super::SubCommand;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}
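
The new `math eval` subcommand hands the expression string to the meval crate and rejects NaN or infinite results, as in the parse function above. A minimal usage sketch (assuming meval as a dependency, which the source uses):

fn eval_expression(expr: &str) -> Result<f64, String> {
    match meval::eval_str(expr) {
        Ok(n) if n.is_infinite() || n.is_nan() => Err("cannot represent result".to_string()),
        Ok(n) => Ok(n),
        Err(e) => Err(e.to_string().to_lowercase()),
    }
}

fn main() {
    assert_eq!(eval_expression("10 / 4"), Ok(2.5)); // the example in the diff
    assert!(eval_expression("1 / 0").is_err()); // infinity is rejected
}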

View File

@@ -1,7 +1,7 @@
+use crate::commands::math::reducers::{reducer_for, Reduce};
 use crate::commands::math::utils::run_with_function;
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
-use crate::utils::data_processing::{reducer_for, Reduce};
 use nu_errors::ShellError;
 use nu_protocol::{Signature, UntaggedValue, Value};

View File

@@ -1,8 +1,8 @@
+use crate::commands::math::reducers::{reducer_for, Reduce};
 use crate::commands::math::utils::run_with_function;
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
-use crate::utils::data_processing::{reducer_for, Reduce};
-use bigdecimal::{FromPrimitive, Zero};
+use bigdecimal::FromPrimitive;
 use nu_errors::ShellError;
 use nu_protocol::{
     hir::{convert_number_to_u64, Number, Operator},
@@ -73,7 +73,7 @@ pub fn median(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
         sorted.push(item.clone());
     }
-    crate::commands::sort_by::sort(&mut sorted, &[], name)?;
+    crate::commands::sort_by::sort(&mut sorted, &[], name, false)?;
     match take {
         Pick::Median => {
@@ -130,11 +130,11 @@ fn compute_average(values: &[Value], name: impl Into<Tag>) -> Result<Value, ShellError> {
         )
     })?;
     let total_rows = UntaggedValue::decimal(number);
-    let total = sum(Value::zero(), values.to_vec())?;
+    let total = sum(UntaggedValue::int(0).into_untagged_value(), values.to_vec())?;
     match total {
         Value {
-            value: UntaggedValue::Primitive(Primitive::Bytes(num)),
+            value: UntaggedValue::Primitive(Primitive::Filesize(num)),
             ..
         } => {
             let left = UntaggedValue::from(Primitive::Int(num.into()));
@@ -144,7 +144,7 @@ fn compute_average(values: &[Value], name: impl Into<Tag>) -> Result<Value, ShellError> {
             Ok(UntaggedValue::Primitive(Primitive::Decimal(result))) => {
                 let number = Number::Decimal(result);
                 let number = convert_number_to_u64(&number);
-                Ok(UntaggedValue::bytes(number).into_value(name))
+                Ok(UntaggedValue::filesize(number).into_value(name))
             }
             Ok(_) => Err(ShellError::labeled_error(
                 "could not calculate median of non-numeric or unrelated types",

View File

@@ -1,7 +1,7 @@
+use crate::commands::math::reducers::{reducer_for, Reduce};
 use crate::commands::math::utils::run_with_function;
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
-use crate::utils::data_processing::{reducer_for, Reduce};
 use nu_errors::ShellError;
 use nu_protocol::{Signature, UntaggedValue, Value};

View File

@@ -1,16 +1,24 @@
 pub mod avg;
 pub mod command;
+pub mod eval;
 pub mod max;
 pub mod median;
 pub mod min;
 pub mod mode;
+pub mod stddev;
 pub mod sum;
-pub mod utils;
+pub mod variance;
+mod reducers;
+mod utils;
 pub use avg::SubCommand as MathAverage;
 pub use command::Command as Math;
+pub use eval::SubCommand as MathEval;
 pub use max::SubCommand as MathMaximum;
 pub use median::SubCommand as MathMedian;
 pub use min::SubCommand as MathMinimum;
 pub use mode::SubCommand as MathMode;
+pub use stddev::SubCommand as MathStddev;
 pub use sum::SubCommand as MathSummation;
+pub use variance::SubCommand as MathVariance;

View File

@@ -77,7 +77,7 @@ pub fn mode(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
         }
     }
-    crate::commands::sort_by::sort(&mut modes, &[], name)?;
+    crate::commands::sort_by::sort(&mut modes, &[], name, false)?;
     Ok(UntaggedValue::Table(modes).into_value(name))
 }

View File

@ -0,0 +1,135 @@
use crate::data::value::{compare_values, compute_values};
use nu_errors::ShellError;
use nu_protocol::hir::Operator;
use nu_protocol::{UntaggedValue, Value};
use nu_source::{SpannedItem, Tag};
// Re-usable error messages
const ERR_EMPTY_DATA: &str = "Cannot perform aggregate math operation on empty data";
fn formula(
acc_begin: Value,
calculator: Box<dyn Fn(Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static>,
) -> Box<dyn Fn(Value, Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static> {
Box::new(move |acc, datax| -> Result<Value, ShellError> {
let result = match compute_values(Operator::Multiply, &acc, &acc_begin) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned_unknown(),
right_type.spanned_unknown(),
))
}
};
match calculator(datax) {
Ok(total) => Ok(match compute_values(Operator::Plus, &result, &total) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned_unknown(),
right_type.spanned_unknown(),
))
}
}),
Err(reason) => Err(reason),
}
})
}
pub fn reducer_for(
command: Reduce,
) -> Box<dyn Fn(Value, Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static> {
match command {
Reduce::Summation | Reduce::Default => Box::new(formula(
UntaggedValue::int(0).into_untagged_value(),
Box::new(sum),
)),
Reduce::Minimum => Box::new(|_, values| min(values)),
Reduce::Maximum => Box::new(|_, values| max(values)),
}
}
pub enum Reduce {
Summation,
Minimum,
Maximum,
Default,
}
pub fn sum(data: Vec<Value>) -> Result<Value, ShellError> {
let mut acc = UntaggedValue::int(0).into_untagged_value();
for value in data {
match value.value {
UntaggedValue::Primitive(_) => {
acc = match compute_values(Operator::Plus, &acc, &value) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned_unknown(),
right_type.spanned_unknown(),
))
}
};
}
_ => {
return Err(ShellError::labeled_error(
"Attempted to compute the sum of a value that cannot be summed.",
"value appears here",
value.tag.span,
))
}
}
}
Ok(acc)
}
pub fn max(data: Vec<Value>) -> Result<Value, ShellError> {
let mut biggest = data
.first()
.ok_or_else(|| ShellError::unexpected(ERR_EMPTY_DATA))?
.value
.clone();
for value in data.iter() {
if let Ok(greater_than) = compare_values(Operator::GreaterThan, &value.value, &biggest) {
if greater_than {
biggest = value.value.clone();
}
} else {
return Err(ShellError::unexpected(format!(
"Could not compare\nleft: {:?}\nright: {:?}",
biggest, value.value
)));
}
}
Ok(Value {
value: biggest,
tag: Tag::unknown(),
})
}
pub fn min(data: Vec<Value>) -> Result<Value, ShellError> {
let mut smallest = data
.first()
.ok_or_else(|| ShellError::unexpected(ERR_EMPTY_DATA))?
.value
.clone();
for value in data.iter() {
if let Ok(greater_than) = compare_values(Operator::LessThan, &value.value, &smallest) {
if greater_than {
smallest = value.value.clone();
}
} else {
return Err(ShellError::unexpected(format!(
"Could not compare\nleft: {:?}\nright: {:?}",
smallest, value.value
)));
}
}
Ok(Value {
value: smallest,
tag: Tag::unknown(),
})
}
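
The new reducers module builds a boxed fold closure per operation; `reducer_for(Reduce::Summation)` adds a running accumulator to the sum of a batch. A sketch of that factory idea with plain integers instead of nushell Values:

enum Reduce {
    Summation,
    Minimum,
    Maximum,
}

fn reducer_for(command: Reduce) -> Box<dyn Fn(i64, Vec<i64>) -> i64> {
    match command {
        // fold the batch into the accumulator
        Reduce::Summation => Box::new(|acc, values| acc + values.iter().sum::<i64>()),
        Reduce::Minimum => Box::new(|_, values| values.into_iter().min().unwrap_or(0)),
        Reduce::Maximum => Box::new(|_, values| values.into_iter().max().unwrap_or(0)),
    }
}

fn main() {
    let sum = reducer_for(Reduce::Summation);
    assert_eq!(sum(0, vec![1, 2, 3]), 6);
    let max = reducer_for(Reduce::Maximum);
    assert_eq!(max(0, vec![1, 5, 3]), 5);
}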

View File

@ -0,0 +1,150 @@
use super::variance::compute_variance as variance;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Dictionary, Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use nu_source::Tagged;
use std::str::FromStr;
pub struct SubCommand;
#[derive(Deserialize)]
struct Arguments {
sample: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"math stddev"
}
fn signature(&self) -> Signature {
Signature::build("math stddev").switch(
"sample",
"calculate sample standard deviation",
Some('s'),
)
}
fn usage(&self) -> &str {
"Finds the stddev of a list of numbers or tables"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let (Arguments { sample }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.drain_vec().await;
let n = if let Tagged { item: true, .. } = sample {
values.len() - 1
} else {
values.len()
};
let res = if values.iter().all(|v| v.is_primitive()) {
compute_stddev(&values, n, &name)
} else {
// If we are not dealing with Primitives, then perhaps we are dealing with a table
// Create a key for each column name
let mut column_values = IndexMap::new();
for value in values {
if let UntaggedValue::Row(row_dict) = &value.value {
for (key, value) in row_dict.entries.iter() {
column_values
.entry(key.clone())
.and_modify(|v: &mut Vec<Value>| v.push(value.clone()))
.or_insert(vec![value.clone()]);
}
}
}
// The mathematical function operates over the columns of the table
let mut column_totals = IndexMap::new();
for (col_name, col_vals) in column_values {
if let Ok(out) = compute_stddev(&col_vals, n, &name) {
column_totals.insert(col_name, out);
}
}
if column_totals.keys().len() == 0 {
return Err(ShellError::labeled_error(
"Attempted to compute values that can't be operated on",
"value appears here",
name.span,
));
}
Ok(UntaggedValue::Row(Dictionary {
entries: column_totals,
})
.into_untagged_value())
};
match res {
Ok(v) => {
if v.value.is_table() {
Ok(OutputStream::from(
v.table_entries()
.map(|v| ReturnSuccess::value(v.clone()))
.collect::<Vec<_>>(),
))
} else {
Ok(OutputStream::one(ReturnSuccess::value(v)))
}
}
Err(e) => Err(e),
}
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the stddev of a list of numbers",
example: "echo [1 2 3 4 5] | math stddev",
result: Some(vec![UntaggedValue::decimal(BigDecimal::from_str("1.414213562373095048801688724209698078569671875376948073176679737990732478462107038850387534327641573").expect("Could not convert to decimal from string")).into()]),
}]
}
}
#[cfg(test)]
pub fn stddev(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
compute_stddev(values, values.len(), name)
}
pub fn compute_stddev(values: &[Value], n: usize, name: &Tag) -> Result<Value, ShellError> {
let variance = variance(values, n, name)?.as_primitive()?;
let sqrt_var = match variance {
Primitive::Decimal(var) => var.sqrt(),
_ => {
return Err(ShellError::labeled_error(
"Could not take square root of variance",
"error occured here",
name.span,
))
}
};
match sqrt_var {
Some(stddev) => Ok(UntaggedValue::from(Primitive::Decimal(stddev)).into_value(name)),
None => Err(ShellError::labeled_error(
"Could not calculate stddev",
"error occured here",
name.span,
)),
}
}
#[cfg(test)]
mod tests {
use super::SubCommand;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}
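
`math stddev` is the square root of the variance computed above; the --sample switch only changes the divisor from n to n - 1. A minimal sketch of that distinction with plain f64 (assumes a non-empty input):

fn stddev(xs: &[f64], sample: bool) -> f64 {
    let divisor = if sample { xs.len() - 1 } else { xs.len() } as f64;
    let mean = xs.iter().sum::<f64>() / xs.len() as f64;
    let var = xs.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / divisor;
    var.sqrt()
}

fn main() {
    let xs = [1.0, 2.0, 3.0, 4.0, 5.0];
    println!("population = {:.6}", stddev(&xs, false)); // ~1.414214, the example in the diff
    println!("sample     = {:.6}", stddev(&xs, true)); // ~1.581139
}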

View File

@@ -1,10 +1,13 @@
+use crate::commands::math::reducers::{reducer_for, Reduce};
 use crate::commands::math::utils::run_with_function;
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
-use crate::utils::data_processing::{reducer_for, Reduce};
 use nu_errors::ShellError;
-use nu_protocol::{Dictionary, Signature, UntaggedValue, Value};
-use num_traits::identities::Zero;
+use nu_protocol::{
+    hir::{convert_number_to_u64, Number},
+    Primitive, Signature, UntaggedValue, Value,
+};
 pub struct SubCommand;
@@ -59,37 +62,63 @@ impl WholeStreamCommand for SubCommand {
     }
 }
+fn to_byte(value: &Value) -> Option<Value> {
+    match &value.value {
+        UntaggedValue::Primitive(Primitive::Int(num)) => Some(
+            UntaggedValue::Primitive(Primitive::Filesize(convert_number_to_u64(&Number::Int(
+                num.clone(),
+            ))))
+            .into_untagged_value(),
+        ),
+        _ => None,
+    }
+}
 pub fn summation(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
     let sum = reducer_for(Reduce::Summation);
-    if values.iter().all(|v| v.is_primitive()) {
-        Ok(sum(Value::zero(), values.to_vec())?)
-    } else {
-        let mut column_values = IndexMap::new();
-        for value in values {
-            if let UntaggedValue::Row(row_dict) = value.value.clone() {
-                for (key, value) in row_dict.entries.iter() {
-                    column_values
-                        .entry(key.clone())
-                        .and_modify(|v: &mut Vec<Value>| v.push(value.clone()))
-                        .or_insert(vec![value.clone()]);
-                }
-            };
-        }
-        let mut column_totals = IndexMap::new();
-        for (col_name, col_vals) in column_values {
-            let sum = sum(Value::zero(), col_vals)?;
-            column_totals.insert(col_name, sum);
-        }
-        Ok(UntaggedValue::Row(Dictionary {
-            entries: column_totals,
-        })
-        .into_value(name))
+    let first = values.get(0).ok_or_else(|| {
+        ShellError::unexpected("Cannot perform aggregate math operation on empty data")
+    })?;
+    match first {
+        v if v.is_filesize() => to_byte(&sum(
+            UntaggedValue::int(0).into_untagged_value(),
+            values
+                .to_vec()
+                .iter()
+                .map(|v| match v {
+                    Value {
+                        value: UntaggedValue::Primitive(Primitive::Filesize(num)),
+                        ..
+                    } => UntaggedValue::int(*num as usize).into_untagged_value(),
+                    other => other.clone(),
+                })
+                .collect::<Vec<_>>(),
+        )?)
+        .ok_or_else(|| {
+            ShellError::labeled_error(
+                "could not convert to big decimal",
+                "could not convert to big decimal",
+                &name.span,
+            )
+        }),
+        // v is nothing primitive
+        v if v.is_none() => sum(
+            UntaggedValue::int(0).into_untagged_value(),
+            values
+                .to_vec()
+                .iter()
+                .map(|v| match v {
+                    Value {
+                        value: UntaggedValue::Primitive(Primitive::Nothing),
+                        ..
+                    } => UntaggedValue::int(0).into_untagged_value(),
+                    other => other.clone(),
+                })
+                .collect::<Vec<_>>(),
+        ),
+        _ => sum(UntaggedValue::int(0).into_untagged_value(), values.to_vec()),
     }
 }

View File

@@ -13,6 +13,7 @@ pub async fn run_with_function(
     mf: MathFunction,
 ) -> Result<OutputStream, ShellError> {
     let values: Vec<Value> = input.drain_vec().await;
+
     let res = calculate(&values, &name, mf);
     match res {
         Ok(v) => {
@@ -50,12 +51,17 @@ pub fn calculate(values: &[Value], name: &Tag, mf: MathFunction) -> Result<Value, ShellError> {
         // The mathematical function operates over the columns of the table
         let mut column_totals = IndexMap::new();
         for (col_name, col_vals) in column_values {
-            match mf(&col_vals, &name) {
-                Ok(result) => {
-                    column_totals.insert(col_name, result);
-                }
-                Err(err) => return Err(err),
-            }
+            if let Ok(out) = mf(&col_vals, &name) {
+                column_totals.insert(col_name, out);
+            }
         }
+        if column_totals.keys().len() == 0 {
+            return Err(ShellError::labeled_error(
+                "Attempted to compute values that can't be operated on",
+                "value appears here",
+                name.span,
+            ));
+        }
         Ok(UntaggedValue::Row(Dictionary {

View File

@ -0,0 +1,241 @@
use crate::commands::WholeStreamCommand;
use crate::data::value::compute_values;
use crate::prelude::*;
use bigdecimal::FromPrimitive;
use nu_errors::ShellError;
use nu_protocol::{
hir::Operator, Dictionary, Primitive, ReturnSuccess, Signature, UntaggedValue, Value,
};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
struct Arguments {
sample: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"math variance"
}
fn signature(&self) -> Signature {
Signature::build("math variance").switch("sample", "calculate sample variance", Some('s'))
}
fn usage(&self) -> &str {
"Finds the variance of a list of numbers or tables"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let (Arguments { sample }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.drain_vec().await;
let n = if let Tagged { item: true, .. } = sample {
values.len() - 1
} else {
values.len()
};
let res = if values.iter().all(|v| v.is_primitive()) {
compute_variance(&values, n, &name)
} else {
// If we are not dealing with Primitives, then perhaps we are dealing with a table
// Create a key for each column name
let mut column_values = IndexMap::new();
for value in values {
if let UntaggedValue::Row(row_dict) = &value.value {
for (key, value) in row_dict.entries.iter() {
column_values
.entry(key.clone())
.and_modify(|v: &mut Vec<Value>| v.push(value.clone()))
.or_insert(vec![value.clone()]);
}
}
}
// The mathematical function operates over the columns of the table
let mut column_totals = IndexMap::new();
for (col_name, col_vals) in column_values {
if let Ok(out) = compute_variance(&col_vals, n, &name) {
column_totals.insert(col_name, out);
}
}
if column_totals.keys().len() == 0 {
return Err(ShellError::labeled_error(
"Attempted to compute values that can't be operated on",
"value appears here",
name.span,
));
}
Ok(UntaggedValue::Row(Dictionary {
entries: column_totals,
})
.into_untagged_value())
};
match res {
Ok(v) => {
if v.value.is_table() {
Ok(OutputStream::from(
v.table_entries()
.map(|v| ReturnSuccess::value(v.clone()))
.collect::<Vec<_>>(),
))
} else {
Ok(OutputStream::one(ReturnSuccess::value(v)))
}
}
Err(e) => Err(e),
}
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the variance of a list of numbers",
example: "echo [1 2 3 4 5] | math variance",
result: Some(vec![UntaggedValue::decimal(2).into()]),
}]
}
}
fn sum_of_squares(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
let n = BigDecimal::from_usize(values.len()).ok_or_else(|| {
ShellError::labeled_error(
"could not convert to big decimal",
"could not convert to big decimal",
&name.span,
)
})?;
let mut sum_x = UntaggedValue::int(0).into_untagged_value();
let mut sum_x2 = UntaggedValue::int(0).into_untagged_value();
for value in values {
let v = match value {
Value {
value: UntaggedValue::Primitive(Primitive::Filesize(num)),
..
} => {
UntaggedValue::from(Primitive::Int(num.clone().into()))
},
Value {
value: UntaggedValue::Primitive(num),
..
} => {
UntaggedValue::from(num.clone())
},
_ => {
return Err(ShellError::labeled_error(
"Attempted to compute the sum of squared values of a value that cannot be summed or squared.",
"value appears here",
value.tag.span,
))
}
};
let v_squared = compute_values(Operator::Multiply, &v, &v);
match v_squared {
// X^2
Ok(x2) => {
sum_x2 = match compute_values(Operator::Plus, &sum_x2, &x2) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(name.span),
right_type.spanned(name.span),
))
}
};
}
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(value.tag.span),
right_type.spanned(value.tag.span),
))
}
};
sum_x = match compute_values(Operator::Plus, &sum_x, &v) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(name.span),
right_type.spanned(name.span),
))
}
};
}
let sum_x_squared = match compute_values(Operator::Multiply, &sum_x, &sum_x) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(name.span),
right_type.spanned(name.span),
))
}
};
let sum_x_squared_div_n = match compute_values(Operator::Divide, &sum_x_squared, &n.into()) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(name.span),
right_type.spanned(name.span),
))
}
};
let ss = match compute_values(Operator::Minus, &sum_x2, &sum_x_squared_div_n) {
Ok(v) => v.into_untagged_value(),
Err((left_type, right_type)) => {
return Err(ShellError::coerce_error(
left_type.spanned(name.span),
right_type.spanned(name.span),
))
}
};
Ok(ss)
}
#[cfg(test)]
pub fn variance(values: &[Value], name: &Tag) -> Result<Value, ShellError> {
compute_variance(values, values.len(), name)
}
pub fn compute_variance(values: &[Value], n: usize, name: &Tag) -> Result<Value, ShellError> {
let ss = sum_of_squares(values, name)?;
let n = BigDecimal::from_usize(n).ok_or_else(|| {
ShellError::labeled_error(
"could not convert to big decimal",
"could not convert to big decimal",
&name.span,
)
})?;
let variance = compute_values(Operator::Divide, &ss, &n.into());
match variance {
Ok(value) => Ok(value.into_value(name)),
Err((_, _)) => Err(ShellError::labeled_error(
"could not calculate variance of non-integer or unrelated types",
"source",
name,
)),
}
}
#[cfg(test)]
mod tests {
use super::SubCommand;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}
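
compute_variance above uses the sum-of-squares shortcut, variance = (sum(x^2) - sum(x)^2 / n) / n. A small standalone check (plain f64, not nushell types) that this agrees with the definitional mean-of-squared-deviations form and with the `math variance` example result of 2:

fn variance_sum_of_squares(xs: &[f64]) -> f64 {
    let n = xs.len() as f64;
    let sum_x: f64 = xs.iter().sum();
    let sum_x2: f64 = xs.iter().map(|x| x * x).sum();
    (sum_x2 - sum_x * sum_x / n) / n
}

fn variance_definitional(xs: &[f64]) -> f64 {
    let n = xs.len() as f64;
    let mean = xs.iter().sum::<f64>() / n;
    xs.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / n
}

fn main() {
    let xs = [1.0, 2.0, 3.0, 4.0, 5.0];
    assert!((variance_sum_of_squares(&xs) - variance_definitional(&xs)).abs() < 1e-9);
    assert!((variance_sum_of_squares(&xs) - 2.0).abs() < 1e-9);
}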

View File

@ -0,0 +1,335 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::base::select_fields;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, ReturnSuccess, Signature, SyntaxShape, Value};
use nu_source::span_for_spanned_list;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct Arguments {
rest: Vec<ColumnPath>,
after: Option<ColumnPath>,
before: Option<ColumnPath>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"move column"
}
fn signature(&self) -> Signature {
Signature::build("move column")
.rest(SyntaxShape::ColumnPath, "the columns to move")
.named(
"after",
SyntaxShape::ColumnPath,
"the column that will precede the columns moved",
None,
)
.named(
"before",
SyntaxShape::ColumnPath,
"the column that will be next the columns moved",
None,
)
}
fn usage(&self) -> &str {
"Move columns."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
operate(args, registry).await
}
}
async fn operate(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = raw_args.call_info.name_tag.clone();
let registry = registry.clone();
let (
Arguments {
rest: mut columns,
before,
after,
},
input,
) = raw_args.process(&registry).await?;
if columns.is_empty() {
return Err(ShellError::labeled_error(
"expected columns",
"expected columns",
name,
));
}
if columns.iter().any(|c| c.members().len() > 1) {
return Err(ShellError::labeled_error(
"expected columns",
"expected columns",
name,
));
}
if vec![&after, &before]
.iter()
.map(|o| if o.is_some() { 1 } else { 0 })
.sum::<usize>()
> 1
{
return Err(ShellError::labeled_error(
"can't move column(s)",
"pick exactly one (before, after)",
name,
));
}
if let Some(after) = after {
let member = columns.remove(0);
Ok(input
.map(move |item| {
let member = vec![member.clone()];
let column_paths = vec![&member, &columns]
.into_iter()
.flatten()
.collect::<Vec<&ColumnPath>>();
let after_span = span_for_spanned_list(after.members().iter().map(|p| p.span));
if after.members().len() == 1 {
let keys = column_paths
.iter()
.filter_map(|c| c.last())
.map(|c| c.as_string())
.collect::<Vec<_>>();
if let Some(column) = after.last() {
if !keys.contains(&column.as_string()) {
ReturnSuccess::value(move_after(&item, &keys, &after, &name)?)
} else {
let msg =
format!("can't move column {} after itself", column.as_string());
Err(ShellError::labeled_error(
"can't move column",
msg,
after_span,
))
}
} else {
Err(ShellError::labeled_error(
"expected column",
"expected column",
after_span,
))
}
} else {
Err(ShellError::labeled_error(
"expected column",
"expected column",
after_span,
))
}
})
.to_output_stream())
} else if let Some(before) = before {
let member = columns.remove(0);
Ok(input
.map(move |item| {
let member = vec![member.clone()];
let column_paths = vec![&member, &columns]
.into_iter()
.flatten()
.collect::<Vec<&ColumnPath>>();
let before_span = span_for_spanned_list(before.members().iter().map(|p| p.span));
if before.members().len() == 1 {
let keys = column_paths
.iter()
.filter_map(|c| c.last())
.map(|c| c.as_string())
.collect::<Vec<_>>();
if let Some(column) = before.last() {
if !keys.contains(&column.as_string()) {
ReturnSuccess::value(move_before(&item, &keys, &before, &name)?)
} else {
let msg =
format!("can't move column {} before itself", column.as_string());
Err(ShellError::labeled_error(
"can't move column",
msg,
before_span,
))
}
} else {
Err(ShellError::labeled_error(
"expected column",
"expected column",
before_span,
))
}
} else {
Err(ShellError::labeled_error(
"expected column",
"expected column",
before_span,
))
}
})
.to_output_stream())
} else {
Err(ShellError::labeled_error(
"no columns given",
"no columns given",
name,
))
}
}
fn move_after(
table: &Value,
columns: &[String],
from: &ColumnPath,
tag: impl Into<Tag>,
) -> Result<Value, ShellError> {
let tag = tag.into();
let from_fields = span_for_spanned_list(from.members().iter().map(|p| p.span));
let from = if let Some((last, _)) = from.split_last() {
last.as_string()
} else {
return Err(ShellError::labeled_error(
"unknown column",
"unknown column",
from_fields,
));
};
let columns_moved = table
.data_descriptors()
.into_iter()
.map(|name| {
if columns.contains(&name) {
None
} else {
Some(name)
}
})
.collect::<Vec<_>>();
let mut reordered_columns = vec![];
let mut insert = false;
let mut inserted = false;
for name in columns_moved.into_iter() {
if let Some(name) = name {
reordered_columns.push(Some(name.clone()));
if !inserted && name == from {
insert = true;
}
} else {
reordered_columns.push(None);
}
if insert {
for column in columns {
reordered_columns.push(Some(column.clone()));
}
inserted = true;
}
}
Ok(select_fields(
table,
&reordered_columns
.into_iter()
.filter_map(|v| v)
.collect::<Vec<_>>(),
&tag,
))
}
fn move_before(
table: &Value,
columns: &[String],
from: &ColumnPath,
tag: impl Into<Tag>,
) -> Result<Value, ShellError> {
let tag = tag.into();
let from_fields = span_for_spanned_list(from.members().iter().map(|p| p.span));
let from = if let Some((last, _)) = from.split_last() {
last.as_string()
} else {
return Err(ShellError::labeled_error(
"unknown column",
"unknown column",
from_fields,
));
};
let columns_moved = table
.data_descriptors()
.into_iter()
.map(|name| {
if columns.contains(&name) {
None
} else {
Some(name)
}
})
.collect::<Vec<_>>();
let mut reordered_columns = vec![];
let mut inserted = false;
for name in columns_moved.into_iter() {
if let Some(name) = name {
if !inserted && name == from {
for column in columns {
reordered_columns.push(Some(column.clone()));
}
inserted = true;
}
reordered_columns.push(Some(name.clone()));
} else {
reordered_columns.push(None);
}
}
Ok(select_fields(
table,
&reordered_columns
.into_iter()
.filter_map(|v| v)
.collect::<Vec<_>>(),
&tag,
))
}
#[cfg(test)]
mod tests {
use super::SubCommand;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(SubCommand {})
}
}
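
move_after and move_before above rebuild the column order by dropping the moved column names and re-inserting them next to the anchor column. A simplified sketch of the reordering on plain strings (the real code also threads Value/ColumnPath types and tag information):

fn move_after(order: &[&str], moved: &[&str], anchor: &str) -> Vec<String> {
    let mut out = Vec::new();
    for name in order {
        if moved.contains(name) {
            continue; // dropped here, re-inserted after the anchor
        }
        out.push(name.to_string());
        if *name == anchor {
            out.extend(moved.iter().map(|m| m.to_string()));
        }
    }
    out
}

fn main() {
    let cols = ["a", "b", "c", "d"];
    assert_eq!(move_after(&cols, &["a"], "c"), vec!["b", "c", "a", "d"]);
}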

View File

@ -0,0 +1,46 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
#[derive(Clone)]
pub struct Command;
#[async_trait]
impl WholeStreamCommand for Command {
fn name(&self) -> &str {
"move"
}
fn signature(&self) -> Signature {
Signature::build("move")
}
fn usage(&self) -> &str {
"moves across desired subcommand."
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(Ok(ReturnSuccess::Value(
UntaggedValue::string(crate::commands::help::get_help(&Command, &registry))
.into_value(Tag::unknown()),
))))
}
}
#[cfg(test)]
mod tests {
use super::Command;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Command {})
}
}

View File

@ -0,0 +1,7 @@
mod column;
mod command;
pub mod mv;
pub use column::SubCommand as MoveColumn;
pub use command::Command as Move;
pub use mv::Mv;

View File

@@ -6,16 +6,16 @@ use nu_protocol::{Signature, SyntaxShape};
 use nu_source::Tagged;
 use std::path::PathBuf;
-pub struct Move;
+pub struct Mv;
 #[derive(Deserialize)]
-pub struct MoveArgs {
+pub struct Arguments {
     pub src: Tagged<PathBuf>,
     pub dst: Tagged<PathBuf>,
 }
 #[async_trait]
-impl WholeStreamCommand for Move {
+impl WholeStreamCommand for Mv {
     fn name(&self) -> &str {
         "mv"
     }
@@ -78,12 +78,12 @@ async fn mv(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
 #[cfg(test)]
 mod tests {
-    use super::Move;
+    use super::Mv;
     #[test]
     fn examples_work_as_expected() {
         use crate::examples::test as test_examples;
-        test_examples(Move {})
+        test_examples(Mv {})
     }
 }

View File

@@ -70,16 +70,20 @@ async fn nth(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let row_numbers = vec![vec![row_number], and_rows]
         .into_iter()
         .flatten()
-        .collect::<Vec<Tagged<u64>>>();
+        .map(|x| x.item)
+        .collect::<Vec<u64>>();
+    let max_row_number = row_numbers
+        .iter()
+        .max()
+        .expect("Internal error: should be > 0 row numbers");
     Ok(input
+        .take(*max_row_number as usize + 1)
         .enumerate()
         .filter_map(move |(idx, item)| {
             futures::future::ready(
-                if row_numbers
-                    .iter()
-                    .any(|requested| requested.item == idx as u64)
-                {
+                if row_numbers.iter().any(|requested| *requested == idx as u64) {
                     Some(ReturnSuccess::value(item))
                 } else {
                     None
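
The change to `nth` also stops pulling from the input once the largest requested row index has been produced, via take(max + 1). A minimal sketch of that early-stop with plain iterators:

fn nth_rows<T>(input: impl Iterator<Item = T>, rows: &[u64]) -> Vec<T> {
    let max_row = *rows.iter().max().expect("at least one row number");
    input
        .take(max_row as usize + 1) // nothing past the largest index is consumed
        .enumerate()
        .filter(|(idx, _)| rows.contains(&(*idx as u64)))
        .map(|(_, item)| item)
        .collect()
}

fn main() {
    // only the first 4 items of the (potentially long) input are ever touched
    let taken = nth_rows((0..1_000_000).map(|i| i * 10), &[1, 3]);
    assert_eq!(taken, vec![10, 30]);
}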

View File

@ -1,15 +1,14 @@
use crate::commands::classified::maybe_text_codec::StringOrBinary;
use crate::commands::constants::BAT_LANGUAGES;
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::prelude::*; use crate::prelude::*;
use encoding_rs::{Encoding, UTF_8};
use futures_util::StreamExt;
use log::debug;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{CommandAction, ReturnSuccess, Signature, SyntaxShape, UntaggedValue}; use nu_protocol::{CommandAction, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::{AnchorLocation, Span, Tagged}; use nu_source::{AnchorLocation, Span, Tagged};
use std::path::{Path, PathBuf}; use std::path::PathBuf;
extern crate encoding_rs;
use encoding_rs::*;
use std::fs::File;
use std::io::BufWriter;
use std::io::Read;
use std::io::Write;
pub struct Open; pub struct Open;
@@ -81,24 +80,27 @@ documentation link at https://docs.rs/encoding_rs/0.8.23/encoding_rs/#statics"#
     }
 }
-pub fn get_encoding(opt: Option<String>) -> &'static Encoding {
+pub fn get_encoding(opt: Option<Tagged<String>>) -> Result<&'static Encoding, ShellError> {
     match opt {
-        None => UTF_8,
-        Some(label) => match Encoding::for_label((&label).as_bytes()) {
-            None => {
-                //print!("{} is not a known encoding label. Trying UTF-8.", label);
-                //std::process::exit(-2);
-                get_encoding(Some("utf-8".to_string()))
-            }
-            Some(encoding) => encoding,
+        None => Ok(UTF_8),
+        Some(label) => match Encoding::for_label((&label.item).as_bytes()) {
+            None => Err(ShellError::labeled_error(
+                format!(
+                    r#"{} is not a valid encoding, refer to https://docs.rs/encoding_rs/0.8.23/encoding_rs/#statics for a valid list of encodings"#,
+                    label.item
+                ),
+                "invalid encoding",
+                label.span(),
+            )),
+            Some(encoding) => Ok(encoding),
         },
     }
 }
async fn open(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> { async fn open(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let cwd = PathBuf::from(args.shell_manager.path()); let cwd = PathBuf::from(args.shell_manager.path());
let full_path = cwd;
let registry = registry.clone(); let registry = registry.clone();
let shell_manager = args.shell_manager.clone();
let ( let (
OpenArgs { OpenArgs {
@ -108,329 +110,134 @@ async fn open(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStr
}, },
_, _,
) = args.process(&registry).await?; ) = args.process(&registry).await?;
let enc = match encoding {
Some(e) => e.to_string(),
_ => "".to_string(),
};
let result = fetch(&full_path, &path.item, path.tag.span, enc).await;
let (file_extension, contents, contents_tag) = result?; // TODO: Remove once Streams are supported everywhere!
// As a short term workaround for getting AutoConvert and Bat functionality (Those don't currently support Streams)
let file_extension = if raw.item { // Check if the extension has a "from *" command OR "bat" supports syntax highlighting
// AND the user doesn't want the raw output
// In these cases, we will collect the Stream
let ext = if raw.item {
None None
} else { } else {
// If the extension could not be determined via mimetype, try to use the path path.extension()
// extension. Some file types do not declare their mimetypes (such as bson files). .map(|name| name.to_string_lossy().to_string())
file_extension.or_else(|| path.extension().map(|x| x.to_string_lossy().to_string()))
}; };
let tagged_contents = contents.into_value(&contents_tag); if let Some(ext) = ext {
// Check if we have a conversion command
if let Some(_command) = registry.get_command(&format!("from {}", ext)) {
let (_, tagged_contents) = crate::commands::open::fetch(
&cwd,
&PathBuf::from(&path.item),
path.tag.span,
encoding,
)
.await?;
return Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::AutoConvert(tagged_contents, ext),
)));
}
// Check if bat does syntax highlighting
if BAT_LANGUAGES.contains(&ext.as_ref()) {
let (_, tagged_contents) = crate::commands::open::fetch(
&cwd,
&PathBuf::from(&path.item),
path.tag.span,
encoding,
)
.await?;
return Ok(OutputStream::one(ReturnSuccess::value(tagged_contents)));
}
}
if let Some(extension) = file_extension { // Normal Streaming operation
Ok(OutputStream::one(ReturnSuccess::action( let with_encoding = if encoding.is_none() {
CommandAction::AutoConvert(tagged_contents, extension), None
)))
} else { } else {
Ok(OutputStream::one(ReturnSuccess::value(tagged_contents))) Some(get_encoding(encoding)?)
};
let sob_stream = shell_manager.open(&path.item, path.tag.span, with_encoding)?;
let final_stream = sob_stream.map(move |x| {
// The tag that will used when returning a Value
let file_tag = Tag {
span: path.tag.span,
anchor: Some(AnchorLocation::File(path.to_string_lossy().to_string())),
};
match x {
Ok(StringOrBinary::String(s)) => {
ReturnSuccess::value(UntaggedValue::string(s).into_value(file_tag))
} }
Ok(StringOrBinary::Binary(b)) => ReturnSuccess::value(
UntaggedValue::binary(b.into_iter().collect()).into_value(file_tag),
),
Err(se) => Err(se),
}
});
Ok(OutputStream::new(final_stream))
} }
// Note that we do not output a Stream in "fetch" since it is only used by "enter" command
// Which we expect to use a concrete Value a not a Stream
pub async fn fetch( pub async fn fetch(
cwd: &PathBuf, cwd: &PathBuf,
location: &PathBuf, location: &PathBuf,
span: Span, span: Span,
encoding: String, encoding_choice: Option<Tagged<String>>,
) -> Result<(Option<String>, UntaggedValue, Tag), ShellError> { ) -> Result<(Option<String>, Value), ShellError> {
// TODO: I don't understand the point of this? Maybe for better error reporting
let mut cwd = cwd.clone(); let mut cwd = cwd.clone();
let output_encoding: &Encoding = get_encoding(Some("utf-8".to_string())); cwd.push(location);
let input_encoding: &Encoding = get_encoding(Some(encoding.clone())); let nice_location = dunce::canonicalize(&cwd).map_err(|e| {
let mut decoder = input_encoding.new_decoder(); ShellError::labeled_error(
let mut encoder = output_encoding.new_encoder(); format!("Cannot canonicalize file {:?} because {:?}", &cwd, e),
let mut _file: File; "Cannot canonicalize",
let buf = Vec::new(); span,
let mut bufwriter = BufWriter::new(buf); )
})?;
cwd.push(Path::new(location)); // The extension may be used in AutoConvert later on
if let Ok(cwd) = dunce::canonicalize(&cwd) { let ext = location
if !encoding.is_empty() { .extension()
// use the encoding string .map(|name| name.to_string_lossy().to_string());
match File::open(&Path::new(&cwd)) {
Ok(mut _file) => {
convert_via_utf8(
&mut decoder,
&mut encoder,
&mut _file,
&mut bufwriter,
false,
);
//bufwriter.flush()?;
Ok((
cwd.extension()
.map(|name| name.to_string_lossy().to_string()),
UntaggedValue::string(String::from_utf8_lossy(&bufwriter.buffer())),
Tag {
span,
anchor: Some(AnchorLocation::File(cwd.to_string_lossy().to_string())),
},
))
}
Err(_) => Err(ShellError::labeled_error(
format!("Cannot open {:?} for reading.", &cwd),
"file not found",
span,
)),
}
} else {
// Do the old stuff
match std::fs::read(&cwd) {
Ok(bytes) => match std::str::from_utf8(&bytes) {
Ok(s) => Ok((
cwd.extension()
.map(|name| name.to_string_lossy().to_string()),
UntaggedValue::string(s),
Tag {
span,
anchor: Some(AnchorLocation::File(cwd.to_string_lossy().to_string())),
},
)),
Err(_) => {
//Non utf8 data.
match (bytes.get(0), bytes.get(1)) {
(Some(x), Some(y)) if *x == 0xff && *y == 0xfe => {
// Possibly UTF-16 little endian
let utf16 = read_le_u16(&bytes[2..]);
if let Some(utf16) = utf16 { // The tag that will used when returning a Value
match std::string::String::from_utf16(&utf16) { let file_tag = Tag {
Ok(s) => Ok((
cwd.extension()
.map(|name| name.to_string_lossy().to_string()),
UntaggedValue::string(s),
Tag {
span, span,
anchor: Some(AnchorLocation::File( anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(), nice_location.to_string_lossy().to_string(),
)), )),
},
)),
Err(_) => Ok((
None,
UntaggedValue::binary(bytes),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
)),
}
} else {
Ok((
None,
UntaggedValue::binary(bytes),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
))
}
}
(Some(x), Some(y)) if *x == 0xfe && *y == 0xff => {
// Possibly UTF-16 big endian
let utf16 = read_be_u16(&bytes[2..]);
if let Some(utf16) = utf16 {
match std::string::String::from_utf16(&utf16) {
Ok(s) => Ok((
cwd.extension()
.map(|name| name.to_string_lossy().to_string()),
UntaggedValue::string(s),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
)),
Err(_) => Ok((
None,
UntaggedValue::binary(bytes),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
)),
}
} else {
Ok((
None,
UntaggedValue::binary(bytes),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
))
}
}
_ => Ok((
None,
UntaggedValue::binary(bytes),
Tag {
span,
anchor: Some(AnchorLocation::File(
cwd.to_string_lossy().to_string(),
)),
},
)),
}
}
},
Err(_) => Err(ShellError::labeled_error(
format!("Cannot open {:?} for reading.", &cwd),
"file not found",
span,
)),
}
}
} else {
Err(ShellError::labeled_error(
format!("Cannot open {:?} for reading.", &cwd),
"file not found",
span,
))
}
}
fn convert_via_utf8(
decoder: &mut Decoder,
encoder: &mut Encoder,
read: &mut dyn Read,
write: &mut dyn Write,
last: bool,
) {
let mut input_buffer = [0u8; 2048];
let mut intermediate_buffer_bytes = [0u8; 4096];
// Is there a safe way to create a stack-allocated &mut str?
let mut intermediate_buffer: &mut str =
//unsafe { std::mem::transmute(&mut intermediate_buffer_bytes[..]) };
std::str::from_utf8_mut(&mut intermediate_buffer_bytes[..]).expect("error with from_utf8_mut");
let mut output_buffer = [0u8; 4096];
let mut current_input_ended = false;
while !current_input_ended {
match read.read(&mut input_buffer) {
Err(_) => {
print!("Error reading input.");
//std::process::exit(-5);
}
Ok(decoder_input_end) => {
current_input_ended = decoder_input_end == 0;
let input_ended = last && current_input_ended;
let mut decoder_input_start = 0usize;
loop {
let (decoder_result, decoder_read, decoder_written, _) = decoder.decode_to_str(
&input_buffer[decoder_input_start..decoder_input_end],
&mut intermediate_buffer,
input_ended,
);
decoder_input_start += decoder_read;
let last_output = if input_ended {
match decoder_result {
CoderResult::InputEmpty => true,
CoderResult::OutputFull => false,
}
} else {
false
}; };
// Regardless of whether the intermediate buffer got full let res = std::fs::read(location)?;
// or the input buffer was exhausted, let's process what's
// in the intermediate buffer.
if encoder.encoding() == UTF_8 { // If no encoding is provided we try to guess the encoding to read the file with
// If the target is UTF-8, optimize out the encoder. let encoding = if encoding_choice.is_none() {
if write UTF_8
.write_all(&intermediate_buffer.as_bytes()[..decoder_written])
.is_err()
{
print!("Error writing output.");
//std::process::exit(-7);
}
} else { } else {
let mut encoder_input_start = 0usize; get_encoding(encoding_choice.clone())?
loop { };
let (encoder_result, encoder_read, encoder_written, _) = encoder
.encode_from_utf8(
&intermediate_buffer[encoder_input_start..decoder_written],
&mut output_buffer,
last_output,
);
encoder_input_start += encoder_read;
if write.write_all(&output_buffer[..encoder_written]).is_err() {
print!("Error writing output.");
//std::process::exit(-6);
}
match encoder_result {
CoderResult::InputEmpty => {
break;
}
CoderResult::OutputFull => {
continue;
}
}
}
}
// Now let's see if we should read again or process the // If the user specified an encoding, then do not do BOM sniffing
// rest of the current input buffer. let decoded_res = if encoding_choice.is_some() {
match decoder_result { let (cow_res, _replacements) = encoding.decode_with_bom_removal(&res);
CoderResult::InputEmpty => { cow_res
break;
}
CoderResult::OutputFull => {
continue;
}
}
}
}
}
}
}
fn read_le_u16(input: &[u8]) -> Option<Vec<u16>> {
if input.len() % 2 != 0 || input.len() < 2 {
None
} else { } else {
let mut result = vec![]; // Otherwise, use the default UTF-8 encoder with BOM sniffing
let mut pos = 0; let (cow_res, actual_encoding, replacements) = encoding.decode(&res);
while pos < input.len() { // If we had to use replacement characters then fallback to binary
result.push(u16::from_le_bytes([input[pos], input[pos + 1]])); if replacements {
pos += 2; return Ok((ext, UntaggedValue::binary(res).into_value(file_tag)));
}
Some(result)
}
}
fn read_be_u16(input: &[u8]) -> Option<Vec<u16>> {
if input.len() % 2 != 0 || input.len() < 2 {
None
} else {
let mut result = vec![];
let mut pos = 0;
while pos < input.len() {
result.push(u16::from_be_bytes([input[pos], input[pos + 1]]));
pos += 2;
}
Some(result)
} }
debug!("Decoded using {:?}", actual_encoding);
cow_res
};
let v = UntaggedValue::string(decoded_res.to_string()).into_value(file_tag);
Ok((ext, v))
} }
#[cfg(test)] #[cfg(test)]
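The rewritten fetch() above reads the whole file and then decodes it: with an explicit encoding argument it decodes with BOM removal and keeps the text, while without one it decodes as UTF-8 with BOM sniffing and falls back to binary when replacement characters were needed. A standalone sketch of that strategy under those assumptions (the Loaded type and load helper are illustrative, not the nu Value/Tag machinery):

```rust
use encoding_rs::{Encoding, UTF_8};

enum Loaded {
    Text(String),
    Binary(Vec<u8>),
}

fn load(bytes: Vec<u8>, chosen: Option<&'static Encoding>) -> Loaded {
    match chosen {
        // An explicit encoding: strip a BOM if present and keep the text,
        // even if some bytes had to be replaced.
        Some(enc) => {
            let (text, _had_errors) = enc.decode_with_bom_removal(&bytes);
            Loaded::Text(text.into_owned())
        }
        // No encoding given: UTF-8 with BOM sniffing; if replacement
        // characters were needed, keep the raw bytes as binary instead.
        None => {
            let (text, _actual_encoding, had_errors) = UTF_8.decode(&bytes);
            let text = text.into_owned(); // own the text so `bytes` can move below
            if had_errors {
                Loaded::Binary(bytes)
            } else {
                Loaded::Text(text)
            }
        }
    }
}

fn main() {
    // A UTF-8 BOM followed by "hello" decodes cleanly with the default path.
    match load(b"\xEF\xBB\xBFhello".to_vec(), None) {
        Loaded::Text(t) => assert_eq!(t, "hello"),
        Loaded::Binary(_) => unreachable!(),
    }
}
```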


@@ -41,6 +41,17 @@ impl WholeStreamCommand for Command {
     ) -> Result<OutputStream, ShellError> {
         operate(args, registry).await
     }
+    fn examples(&self) -> Vec<Example> {
+        let mut row = IndexMap::new();
+        row.insert("foo".to_string(), Value::from("hi"));
+        row.insert("bar".to_string(), Value::from("there"));
+        vec![Example {
+            description: "Parse a string into two named columns",
+            example: "echo \"hi there\" | parse \"{foo} {bar}\"",
+            result: Some(vec![UntaggedValue::row(row).into()]),
+        }]
+    }
 }
 pub async fn operate(
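The new example documents that parse "{foo} {bar}" splits a string into named columns. One plausible way to realize such a pattern is to compile it into named capture groups; this is a hedged illustration and not necessarily how nu's parse is implemented (pattern_to_regex is a hypothetical helper, and the regex crate is assumed):

```rust
use regex::Regex;

fn pattern_to_regex(pattern: &str) -> Regex {
    let mut out = String::new();
    let mut rest = pattern;
    // Turn each {name} placeholder into a named, non-greedy capture group and
    // escape the literal text around it.
    while let Some(start) = rest.find('{') {
        let end = rest[start..].find('}').map(|i| start + i).expect("unclosed '{'");
        out.push_str(&regex::escape(&rest[..start]));
        out.push_str(&format!("(?P<{}>.*?)", &rest[start + 1..end]));
        rest = &rest[end + 1..];
    }
    out.push_str(&regex::escape(rest));
    Regex::new(&format!("^{}$", out)).expect("invalid pattern")
}

fn main() {
    let re = pattern_to_regex("{foo} {bar}");
    let caps = re.captures("hi there").expect("no match");
    assert_eq!(&caps["foo"], "hi");
    assert_eq!(&caps["bar"], "there");
}
```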


@@ -0,0 +1,61 @@
use super::{operate, DefaultArguments};
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
use std::path::Path;
pub struct PathBasename;
#[async_trait]
impl WholeStreamCommand for PathBasename {
fn name(&self) -> &str {
"path basename"
}
fn signature(&self) -> Signature {
Signature::build("path basename")
.rest(SyntaxShape::ColumnPath, "optionally operate by path")
}
fn usage(&self) -> &str {
"gets the filename of a path"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (DefaultArguments { rest }, input) = args.process(&registry).await?;
operate(input, rest, &action, tag.span).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get basename of a path",
example: "echo '/home/joe/test.txt' | path basename",
result: Some(vec![Value::from("test.txt")]),
}]
}
}
fn action(path: &Path) -> UntaggedValue {
UntaggedValue::string(match path.file_name() {
Some(filename) => filename.to_string_lossy().to_string(),
_ => "".to_string(),
})
}
#[cfg(test)]
mod tests {
use super::PathBasename;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(PathBasename {})
}
}


@@ -0,0 +1,46 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct Path;
#[async_trait]
impl WholeStreamCommand for Path {
fn name(&self) -> &str {
"path"
}
fn signature(&self) -> Signature {
Signature::build("path")
}
fn usage(&self) -> &str {
"Apply path function"
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(crate::commands::help::get_help(&Path, &registry))
.into_value(Tag::unknown()),
)))
}
}
#[cfg(test)]
mod tests {
use super::Path;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Path {})
}
}


@@ -0,0 +1,45 @@
use super::{operate, DefaultArguments};
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
use std::path::Path;
pub struct PathExists;
#[async_trait]
impl WholeStreamCommand for PathExists {
fn name(&self) -> &str {
"path exists"
}
fn signature(&self) -> Signature {
Signature::build("path exists").rest(SyntaxShape::ColumnPath, "optionally operate by path")
}
fn usage(&self) -> &str {
"checks whether the path exists"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (DefaultArguments { rest }, input) = args.process(&registry).await?;
operate(input, rest, &action, tag.span).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Check if file exists",
example: "echo '/home/joe/todo.txt' | path exists",
result: Some(vec![Value::from(UntaggedValue::boolean(false))]),
}]
}
}
fn action(path: &Path) -> UntaggedValue {
UntaggedValue::boolean(path.exists())
}


@@ -0,0 +1,51 @@
use super::{operate, DefaultArguments};
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
use std::path::Path;
pub struct PathExpand;
#[async_trait]
impl WholeStreamCommand for PathExpand {
fn name(&self) -> &str {
"path expand"
}
fn signature(&self) -> Signature {
Signature::build("path expand").rest(SyntaxShape::ColumnPath, "optionally operate by path")
}
fn usage(&self) -> &str {
"expands the path to its absolute form"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (DefaultArguments { rest }, input) = args.process(&registry).await?;
operate(input, rest, &action, tag.span).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Expand relative directories",
example: "echo '/home/joe/foo/../bar' | path expand",
result: Some(vec![Value::from("/home/joe/bar")]),
}]
}
}
fn action(path: &Path) -> UntaggedValue {
let ps = path.to_string_lossy();
let expanded = shellexpand::tilde(&ps);
let path: &Path = expanded.as_ref().as_ref();
UntaggedValue::string(match path.canonicalize() {
Ok(p) => p.to_string_lossy().to_string(),
Err(_) => ps.to_string(),
})
}


@@ -0,0 +1,68 @@
use super::{operate, DefaultArguments};
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
use std::path::Path;
pub struct PathExtension;
#[async_trait]
impl WholeStreamCommand for PathExtension {
fn name(&self) -> &str {
"path extension"
}
fn signature(&self) -> Signature {
Signature::build("path extension")
.rest(SyntaxShape::ColumnPath, "optionally operate by path")
}
fn usage(&self) -> &str {
"gets the extension of a path"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (DefaultArguments { rest }, input) = args.process(&registry).await?;
operate(input, rest, &action, tag.span).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get extension of a path",
example: "echo 'test.txt' | path extension",
result: Some(vec![Value::from("txt")]),
},
Example {
description: "You get an empty string if there is no extension",
example: "echo 'test' | path extension",
result: Some(vec![Value::from("")]),
},
]
}
}
fn action(path: &Path) -> UntaggedValue {
UntaggedValue::string(match path.extension() {
Some(ext) => ext.to_string_lossy().to_string(),
_ => "".to_string(),
})
}
#[cfg(test)]
mod tests {
use super::PathExtension;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(PathExtension {})
}
}


@@ -0,0 +1,75 @@
mod basename;
mod command;
mod exists;
mod expand;
mod extension;
mod r#type;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ColumnPath, Primitive, ReturnSuccess, ShellTypeName, UntaggedValue, Value};
use nu_source::Span;
use std::path::Path;
pub use basename::PathBasename;
pub use command::Path as PathCommand;
pub use exists::PathExists;
pub use expand::PathExpand;
pub use extension::PathExtension;
pub use r#type::PathType;
#[derive(Deserialize)]
struct DefaultArguments {
rest: Vec<ColumnPath>,
}
fn handle_value<F>(action: &F, v: &Value, span: Span) -> Result<Value, ShellError>
where
F: Fn(&Path) -> UntaggedValue + Send + 'static,
{
let v = match &v.value {
UntaggedValue::Primitive(Primitive::Path(buf)) => action(buf).into_value(v.tag()),
UntaggedValue::Primitive(Primitive::String(s))
| UntaggedValue::Primitive(Primitive::Line(s)) => action(s.as_ref()).into_value(v.tag()),
other => {
let got = format!("got {}", other.type_name());
return Err(ShellError::labeled_error_with_secondary(
"value is not string or path",
got,
span,
"originates from here".to_string(),
v.tag().span,
));
}
};
Ok(v)
}
async fn operate<F>(
input: crate::InputStream,
paths: Vec<ColumnPath>,
action: &'static F,
span: Span,
) -> Result<OutputStream, ShellError>
where
F: Fn(&Path) -> UntaggedValue + Send + Sync + 'static,
{
Ok(input
.map(move |v| {
if paths.is_empty() {
ReturnSuccess::value(handle_value(&action, &v, span)?)
} else {
let mut ret = v;
for path in &paths {
ret = ret.swap_data_by_column_path(
path,
Box::new(move |old| handle_value(&action, &old, span)),
)?;
}
ReturnSuccess::value(ret)
}
})
.to_output_stream())
}
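handle_value and operate above are the shared plumbing for the new path subcommands: an action runs either on a bare string/path value or on the selected columns, and any other value type is rejected. A simplified standalone sketch of that per-value dispatch (toy Primitive enum and plain String error, not the nu types):

```rust
use std::path::Path;

#[derive(Debug)]
enum Primitive {
    Path(std::path::PathBuf),
    String(String),
    Int(i64),
}

// Run the per-path action on path-like primitives; reject everything else,
// roughly as handle_value does with a labeled ShellError.
fn handle_value<F>(action: &F, v: &Primitive) -> Result<String, String>
where
    F: Fn(&Path) -> String,
{
    match v {
        Primitive::Path(buf) => Ok(action(buf)),
        Primitive::String(s) => Ok(action(Path::new(s))),
        other => Err(format!("value is not string or path, got {:?}", other)),
    }
}

fn main() {
    // The `path basename` action from basename.rs, reduced to plain std types.
    let basename = |p: &Path| {
        p.file_name()
            .map(|n| n.to_string_lossy().to_string())
            .unwrap_or_default()
    };

    assert_eq!(
        handle_value(&basename, &Primitive::String("/home/joe/test.txt".into())),
        Ok("test.txt".to_string())
    );
    assert_eq!(
        handle_value(&basename, &Primitive::Path("/tmp/a.log".into())),
        Ok("a.log".to_string())
    );
    assert!(handle_value(&basename, &Primitive::Int(3)).is_err());
}
```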


@@ -0,0 +1,62 @@
use super::{operate, DefaultArguments};
use crate::commands::WholeStreamCommand;
use crate::data::files::get_file_type;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
use std::path::Path;
pub struct PathType;
#[async_trait]
impl WholeStreamCommand for PathType {
fn name(&self) -> &str {
"path type"
}
fn signature(&self) -> Signature {
Signature::build("path type").rest(SyntaxShape::ColumnPath, "optionally operate by path")
}
fn usage(&self) -> &str {
"gives the type of the object the path refers to (eg file, dir, symlink)"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (DefaultArguments { rest }, input) = args.process(&registry).await?;
operate(input, rest, &action, tag.span).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Show type of a filepath",
example: "echo '.' | path type",
result: Some(vec![Value::from("Dir")]),
}]
}
}
fn action(path: &Path) -> UntaggedValue {
let meta = std::fs::symlink_metadata(path);
UntaggedValue::string(match &meta {
Ok(md) => get_file_type(md),
Err(_) => "",
})
}
#[cfg(test)]
mod tests {
use super::PathType;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(PathType {})
}
}


@@ -2,10 +2,12 @@ pub mod command;
 pub mod bool;
 pub mod dice;
+#[cfg(feature = "uuid_crate")]
 pub mod uuid;
 pub use command::Command as Random;
 pub use self::bool::SubCommand as RandomBool;
 pub use dice::SubCommand as RandomDice;
+#[cfg(feature = "uuid_crate")]
 pub use uuid::SubCommand as RandomUUID;


@@ -0,0 +1,178 @@
use crate::commands::classified::block::run_block;
use crate::commands::each;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::{CommandArgs, CommandRegistry, Example, OutputStream};
use futures::stream::once;
use nu_errors::ShellError;
use nu_protocol::{hir::Block, Primitive, Scope, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Reduce;
#[derive(Deserialize)]
pub struct ReduceArgs {
block: Block,
fold: Option<Value>,
numbered: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Reduce {
fn name(&self) -> &str {
"reduce"
}
fn signature(&self) -> Signature {
Signature::build("reduce")
.named(
"fold",
SyntaxShape::Any,
"reduce with initial value",
Some('f'),
)
.required("block", SyntaxShape::Block, "reducing function")
.switch(
"numbered",
"returned a numbered item ($it.index and $it.item)",
Some('n'),
)
}
fn usage(&self) -> &str {
"Aggregate a list table to a single value using an accumulator block. Block must be
(A, A) -> A unless --fold is selected, in which case it may be A, B -> A."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
reduce(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Simple summation (equivalent to math sum)",
example: "echo 1 2 3 4 | reduce { = $acc + $it }",
result: Some(vec![UntaggedValue::int(10).into()]),
},
Example {
description: "Summation from starting value using fold",
example: "echo 1 2 3 4 | reduce -f $(= -1) { = $acc + $it }",
result: Some(vec![UntaggedValue::int(9).into()]),
},
Example {
description: "Folding with rows",
example: "<table> | reduce -f 1.6 { = $acc * $(echo $it.a | str to-int) + $(echo $it.b | str to-int) }",
result: None,
},
Example {
description: "Numbered reduce to find index of longest word",
example: "echo one longest three bar | reduce -n { if $(echo $it.item | str length) > $(echo $acc.item | str length) {echo $it} {echo $acc}} | get index",
result: None,
},
]
}
}
async fn process_row(
block: Arc<Block>,
scope: Arc<Scope>,
mut context: Arc<Context>,
row: Value,
) -> Result<InputStream, ShellError> {
let row_clone = row.clone();
let input_stream = once(async { Ok(row_clone) }).to_input_stream();
Ok(run_block(
&block,
Arc::make_mut(&mut context),
input_stream,
&row,
&scope.vars,
&scope.env,
)
.await?)
}
async fn reduce(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let base_scope = raw_args.call_info.scope.clone();
let context = Arc::new(Context::from_raw(&raw_args, &registry));
let (reduce_args, mut input): (ReduceArgs, _) = raw_args.process(&registry).await?;
let block = Arc::new(reduce_args.block);
let (ioffset, start) = match reduce_args.fold {
None => {
let first = input
.next()
.await
.expect("empty stream expected to contain Primitive::Nothing");
if let UntaggedValue::Primitive(Primitive::Nothing) = first.value {
return Err(ShellError::missing_value(None, "empty input"));
}
(1, first)
}
Some(acc) => (0, acc),
};
if reduce_args.numbered.item {
// process_row returns Result<InputStream, ShellError>, so we must fold with one
let initial = Ok(InputStream::one(each::make_indexed_item(
ioffset - 1,
start,
)));
Ok(input
.enumerate()
.fold(initial, move |acc, input| {
let block = Arc::clone(&block);
let mut scope = base_scope.clone();
let context = Arc::clone(&context);
let row = each::make_indexed_item(input.0 + ioffset, input.1);
async {
let f = acc?.into_vec().await[0].clone();
scope.vars.insert(String::from("$acc"), f);
process_row(block, Arc::new(scope), context, row).await
}
})
.await?
.to_output_stream())
} else {
let initial = Ok(InputStream::one(start));
Ok(input
.fold(initial, move |acc, row| {
let block = Arc::clone(&block);
let mut scope = base_scope.clone();
let context = Arc::clone(&context);
async {
scope
.vars
.insert(String::from("$acc"), acc?.into_vec().await[0].clone());
process_row(block, Arc::new(scope), context, row).await
}
})
.await?
.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::Reduce;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Reduce {})
}
}
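The usage string above says the block must be (A, A) -> A unless --fold supplies a seed, in which case it may be (A, B) -> A. A plain-Rust sketch of just that seeding behavior, ignoring streaming and the $acc/$it variable scoping (the reduce function below is illustrative, not the command's implementation):

```rust
// Without a fold value the first element seeds the accumulator; with one,
// the supplied seed does. Mirrors the two summation examples above.
fn reduce(items: &[i64], fold: Option<i64>, f: impl Fn(i64, i64) -> i64) -> Option<i64> {
    let mut iter = items.iter().copied();
    let mut acc = match fold {
        Some(seed) => seed,   // reduce -f <seed> { ... }
        None => iter.next()?, // in this sketch, an empty input yields nothing
    };
    for item in iter {
        acc = f(acc, item);
    }
    Some(acc)
}

fn main() {
    // echo 1 2 3 4 | reduce { = $acc + $it }             => 10
    assert_eq!(reduce(&[1, 2, 3, 4], None, |acc, it| acc + it), Some(10));
    // echo 1 2 3 4 | reduce -f $(= -1) { = $acc + $it }  => 9
    assert_eq!(reduce(&[1, 2, 3, 4], Some(-1), |acc, it| acc + it), Some(9));
}
```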


@@ -1,83 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::reduce;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use num_traits::cast::ToPrimitive;
pub struct ReduceBy;
#[derive(Deserialize)]
pub struct ReduceByArgs {
reduce_with: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for ReduceBy {
fn name(&self) -> &str {
"reduce-by"
}
fn signature(&self) -> Signature {
Signature::build("reduce-by").named(
"reduce_with",
SyntaxShape::String,
"the command to reduce by with",
Some('w'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows reduced by the command given."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
reduce_by(args, registry).await
}
}
pub async fn reduce_by(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (ReduceByArgs { reduce_with }, mut input) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
return Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
));
}
let reduce_with = if let Some(reducer) = reduce_with {
Some(reducer.item().clone())
} else {
None
};
match reduce(&values[0], reduce_with, name) {
Ok(reduced) => Ok(OutputStream::one(ReturnSuccess::value(reduced))),
Err(err) => Err(err),
}
}
#[cfg(test)]
mod tests {
use super::ReduceBy;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(ReduceBy {})
}
}


@@ -41,7 +41,7 @@ impl WholeStreamCommand for AliasCommand {
         let call_info = args.call_info.clone();
         let registry = registry.clone();
         let mut block = self.block.clone();
-        block.set_is_last(call_info.args.is_last);
+        block.set_redirect(call_info.args.external_redirection);
         let alias_command = self.clone();
         let mut context = Context::from_args(&args, &registry);

Some files were not shown because too many files have changed in this diff.