Compare commits


517 Commits

Author SHA1 Message Date
87d71604ad Bump to 0.18.1 (#2335) 2020-08-12 15:59:28 +12:00
e372e7c448 Display built features. Long/Short commit hashes display removed due to cargo publishing. (#2333)
Co-authored-by: Darren Schroeder <343840+fdncred@users.noreply.github.com>
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-08-12 15:13:33 +12:00
0194dee3a6 Updated version hashing and bumped nu-cli to 0.18.1 (#2331)
* Updated version hashing and bumped nu-cli to 0.18.1

* made code prettier

* Update version.rs

* Update version.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-08-12 09:43:23 +12:00
cc3c10867c Histogram no longer requires a wrap command before it on unnamed columns (#2332) 2020-08-12 09:42:59 +12:00
3c18169f63 Autoenv fix: Exitscripts incorrectly running when visiting a subdirectory (#2326)
* Add test case for issue

* Preliminary fix

* fmt

* Reorder asserts

* move insertion

* Touch nu-env.toml

* Cleanup

* touch nu-env toml

* Remove touch

* Change feature flags
2020-08-12 07:54:49 +12:00
43e9c89125 moved theme assets local to crate (#2329)
* moved theme assets local to crate

* remove the TODO comment
2020-08-11 13:57:03 -05:00
2ad07912d9 Bump to 0.18 (#2325) 2020-08-11 18:44:53 +12:00
51ad019495 Update Cargo.lock (#2322) 2020-08-11 15:00:07 +12:00
9264325e57 Make history file location configurable (#2320)
* Make history location configurable

Add history-path to your config if you want an alternate history file
location

* use IndexMap.get() instead of index

Co-authored-by: Amanita Muscaria <nope>
2020-08-11 13:58:53 +12:00
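
For the configurable history location added in #2320, a minimal hedged sketch of setting it through the `config set` subcommand (the `history-path` key comes from the commit message above; the path itself is just an illustration):

```
# point nu at an alternate history file (illustrative path)
> config set history-path "/home/me/.nu_history.txt"
```
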
901157341b Bumped which-rs to 4.0.2 (#2321)
This should fix #1541
2020-08-11 13:55:43 +12:00
eb766b80c1 added pkg_mgr/winget + yaml (#2297) 2020-08-11 05:45:53 +12:00
f0dbffd761 Add 228 json html themes for to html (#2308)
* add 228 json html themes
removed old assets, added new zipped asset
added --list to get a list of the theme names
reworked some older theme code
added rust-embed and zip crate
removed the dark tests

* fmt

* Updated, removed excess comments
Changed usage a bit
Updated the error handling
Added some helper items in --list
2020-08-11 05:43:16 +12:00
f14c0df582 Allow disabling welcome message on launch (#2314)
* Implements #2313
2020-08-09 11:38:21 +12:00
362bb1bea3 Add commit hash to version command (#2312)
* Add commit to version command

* Replace unwrap with expect.
2020-08-08 17:39:34 +12:00
724b177c97 Sample variance and Sample standard deviation. (#2310) 2020-08-06 23:56:19 -05:00
50343f2d6a Add stderr back when using do -i (#2309)
* Add stderr back when using do -i

* Add stderr back when using do -i
2020-08-07 16:53:37 +12:00
3122525b96 removed rustyline config duplication (#2306)
* removed rustyline config duplication
set other rustyline defaults if line_editor section doesn't exist
updated keyseq_timeout to -1 if emacs mode is chosen

* change checking rustyline config to if lets

* removed some unnecessary code
2020-08-05 16:34:28 -05:00
8232c6f185 Update rustyline defaults (#2305)
Use rustyline defaults if no config exists for line_editor (for most options)
2020-08-05 13:05:13 -05:00
6202705eb6 parse most common date formats using dtparse crate (#2303)
* use dtparse crate to parse most common date formats

* use dtparse crate to parse most common date formats - cargo fmt
2020-08-05 12:44:52 +12:00
e1c5940b04 Add command "reduce" (#2292)
* initial

* fold working

* tests and cleanup

* change command to reduce, with fold flag

* move complex example to tests

* add --numbered flag
2020-08-05 05:16:19 +12:00
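
A hedged usage sketch of the new `reduce` command described above; the block variables `$acc` and `$it`, the `=` math-mode prefix, and the `--fold` argument form are assumptions based on the command's description and the syntax of this era:

```
# sum a stream of numbers (assumes $acc/$it block variables)
> echo 1 2 3 4 | reduce { = $acc + $it }
# start the accumulator from an initial value via the --fold flag
> echo 1 2 3 4 | reduce --fold 10 { = $acc + $it }
```
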
7f35bfc005 histogram: support regular values. (#2300) 2020-08-04 04:57:25 -05:00
c48c092125 String funcs - Contains and IndexOf (#2298)
* Contains and index of string functions

* Clippy and fmt
2020-08-04 18:36:51 +12:00
028fc9b9cd Data summarize reporting overhaul. (#2299)
Refactored out most of the internal work for summarizing data, opening
the door for generating charts from it. A model is introduced to hold
the information needed for a summary; the Histogram command is an
example of partial usage. This is the beginning.

Removed the implicit arithmetic traits on Value and Primitive to avoid
panics on mixed types. The std operator traits can't fail, and we can't
guarantee that our operations won't, so failures can now be handled
gracefully, since compute_values was introduced after the parser changes
four months ago. The handling logic should be taken care of either
explicitly or in compute_values.

The zero identity trait was also removed (and implementing this forced
us to also implement Add, Mult, etc)

Also: the `math` operations now omit a column from the output if it is not computable:

```
> ls | math sum
──────┬──────────
 size │ 150.9 KB
──────┴──────────
```
2020-08-03 17:47:19 -05:00
eeb9b4edcb Match cleanup (#2294)
* Delete unnecessary match

* Use `unwrap_or_else()`

* Whitespace was trimmed on file save

* Use `map_or_else()`

* Use a default to group all match arms with same output

* Clippy made me do it
2020-08-04 05:43:27 +12:00
3a7869b422 Switch to maintained app_dirs (#2293)
* Switch to maintained app_dirs

* Update app_dirs under old name
2020-08-04 05:41:57 +12:00
c48ea46c4f Match cleanup (#2290) 2020-08-02 18:34:33 -04:00
f33da33626 Add --partial to 'to html' (#2291) 2020-08-03 08:47:54 +12:00
a88f5c7ae7 Make str collect take an optional separator value (#2289)
* Make `str collect` take an optional separator value

* Make `str collect` take an optional separator value

* Add some tests

* Fix my tests
2020-08-02 19:29:29 +12:00
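
An illustrative (assumed) invocation of `str collect` with the new optional separator argument:

```
# join the incoming strings with a comma; the separator is optional
> echo a b c | str collect ","
```
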
cda53b6cda Return incomplete parse from lite_parse (#2284)
* Move lite_parse tests into a submodule

* Have lite_parse return partial parses when error encountered.

Although a parse fails, we can generally still return what was successfully
parsed. This is useful, for example, when figuring out completions at some
cursor position, because we can map the cursor to something more structured
(e.g., cursor is at a flag name).
2020-08-02 06:39:55 +12:00
ee734873ba Fix no longer working histogram example (#2271)
* Fix no longer working histogram example

* Oops
2020-08-02 06:38:45 +12:00
9fb6f5cd09 Change f/full flag to l/long for ls and ps commands (#2283)
* Change `f`/`full` flag to `l`/`long` for `ls` and `ps` commands

* Fix a few more `--full` instances
2020-08-02 06:30:45 +12:00
4ef15b5f80 docs/alias: simplify the 'persistent' section, using --save (#2285)
All the workarounds using `config` aren't necessary anymore. Only `config path` is still of interest.
2020-08-01 08:11:26 -04:00
ba81278ffd Remove build.rs and nu-build (#2282) 2020-08-01 09:21:10 +12:00
10fbed3808 updated cmd builtin commands (#2266)
* updated cmd builtin commands

* removed cd, chdir, exit, prompt, rem

* remove more commands, what remains is useful

* cargo fmt is so finicky
2020-07-31 09:51:42 +12:00
16cfc36aec set default edit_mode to emacs instead of vi (#2278) 2020-07-30 14:59:20 -05:00
aca7f71737 🐛 Fix path command error messages (#2261). (#2276) 2020-07-31 06:51:14 +12:00
3282a509a9 Make insert take in a block (#2265)
* Make insert take in a block

* Add some tests
2020-07-30 16:58:54 +12:00
878b748a41 Add list output for to html (#2273) 2020-07-30 16:54:55 +12:00
18a4505b9b starts_with ends_with match functions for string (#2269) 2020-07-30 16:51:20 +12:00
26e77a4b05 Add url commands (#2274)
* scheme
* path
* query
* host
2020-07-30 08:56:56 +12:00
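
A rough sketch of the new `url` subcommands applied to a string input (the URL is chosen purely for illustration):

```
# pick apart a URL string with the new subcommands
> echo "https://www.example.com/docs?page=2" | url scheme
> echo "https://www.example.com/docs?page=2" | url host
> echo "https://www.example.com/docs?page=2" | url path
> echo "https://www.example.com/docs?page=2" | url query
```
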
37f10cf273 Add two further path cmds - type and exists (#2264)
* Add two further path cmds - type and exists

* Update type.rs

Try a more universal directory

* Update type.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-27 14:12:07 +12:00
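
Assumed usage of the two new `path` subcommands (the path shown is illustrative):

```
# check whether a path exists and what kind of entry it is
> echo "/tmp" | path exists
> echo "/tmp" | path type
```
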
5e0a9aecaa ltrim and rtrim for string (#2262)
* Trim string from left and right

* Move trim to folder

* fmt

* Clippy
2020-07-27 06:09:35 +12:00
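
A sketch of the new one-sided trims, assuming they are exposed as `str ltrim` and `str rtrim`:

```
# strip whitespace from one side only
> echo "  hello  " | str ltrim
> echo "  hello  " | str rtrim
```
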
7e2c627044 WIP: Path utility commands (#2255)
* Add new path commands

basename, expand, and extension. Currently there is no real error
handling: expand returns the initial path if it didn't work, and the
others return an empty string

* Optionally apply to path
2020-07-26 07:29:15 +12:00
4347339e9a Make all bullet point items uppercase (#2257) 2020-07-26 06:15:12 +12:00
e66a8258ec add example to parse command, with row output (#2256) 2020-07-26 06:14:29 +12:00
e4b42b54ad Simplify NuCompleter. (#2254)
- Removing old code for dealing with escaping, since that has moved elsewhere.
- Eliminating some match statements in favour of result/option methods.
- Fix an issue where completing inside quotes could remove the quote at the
  beginning, if one already existed on the line but the replacement didn't have
  a quote at the beginning.
2020-07-25 10:41:14 -04:00
de18b9ca2c Match cleanup (#2248) 2020-07-25 08:40:35 -04:00
a77f0f7b41 to-xml.md documentation update (#2253)
* Update to-xml.md documentation to be consistent

* Capitalize bullet point items

* Add link to this document within `to.md`
2020-07-25 20:19:15 +12:00
6b31a006b8 Refactor all completion logic into NuCompleter (#2252)
* Refactor all completion logic into `NuCompleter`

This is the next step to improving completions. Previously, completion logic was
scattered about (`FilesystemShell`, `NuCompleter`, `Helper`, and `ShellManager`).
By unifying the core logic into a central location, it will be easier to take the
next steps in improving completion.

* Update context.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-25 11:39:12 +12:00
2db4fe83d8 Remove unnecessary peekable iterator (#2251) 2020-07-24 12:06:12 -04:00
55a2f284d9 Add to xml command (#2141) (#2155) 2020-07-24 19:41:22 +12:00
2d3b1e090a Remove piping of stderr. (#2247)
In any other shell, stderr is inherited like normal, and only piped if you
request it explicitly (e.g., `2>/dev/null`). In the case of a command like
`fzf`, stderr is used for the interactive selection of files. By piping it,
something like

    fzf | xargs echo

does not work. By removing all stderr piping we eliminate this issue. We can
return later with a way to deal with stderr piping when an actual use case
arises.
2020-07-24 17:56:50 +12:00
ed0c1038e3 Step 1 for to html theme-ing (#2245)
* reworked theming step 1.
added theme parameter.
hard coded 4 themes

* forgot about blulocolight

* aarrrrg! tests are in another place. Fixed, I think.
2020-07-23 13:21:58 -05:00
0c20282200 added documentation of available binding options (#2246)
straight from the rustyline source code
2020-07-23 13:13:06 -05:00
e71f44d26f if config file doesn't exist, set defaults. (#2244)
if line_editor in config doesn't exist, set defaults.
2020-07-23 08:27:45 -05:00
e3d7e46855 added defaults. fixed bug of not loading history. (#2243) 2020-07-23 07:19:05 -05:00
9b35aae5e8 update sample configs (#2242)
* update sample configs

* change rustyline to line_editor
2020-07-23 06:49:25 -05:00
7e9f87c57f Expose all rustyline configuration points (#2238)
* Added all rustyline config points

* comments cleanup

* my good friend fmt keeps changing his mind
2020-07-23 09:43:52 +12:00
5d17b72852 update config documentation (#2178)
* update config documentation

* update config syntax

* update config syntax

* Update alias.md

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-23 09:42:04 +12:00
6b4634b293 Convert table of primitives to positional arguments for external cmd (#2232)
* Convert table of primitives to positional arguments for external cmd

* Multiple file test, fix for cococo
2020-07-23 09:41:34 +12:00
2a084fc838 Bump to 0.17.0 (#2237) 2020-07-22 06:41:49 +12:00
a36d2a1586 Revert "hopefully the final fix for history (#2222)" (#2235)
This reverts commit 6829ad7a30.
2020-07-21 18:19:04 +12:00
32b875ada9 sample config settings (#2233) 2020-07-20 21:15:58 -05:00
aaed9c4e8a added ansi example (#2230)
* added ansi example

* added strcollect to example
2020-07-20 18:33:39 -05:00
b9278bdfe1 Char example (#2231)
* added ansi example

* added another example

* changed example

* ansi changes here by mistake
2020-07-20 14:25:38 -05:00
6eb2c94209 Add flag for case-insensitive sort-by (#2225)
* Add flag for case-insensitive sort-by

* Fix test names

* Fix documentation comments
2020-07-21 05:31:58 +12:00
7b1a15b223 Campbell colors (#2219)
* added campbell theme to html colors

* updated test results. had to make change for ci.

* hopefully the last changes for this stupid test :)

* moved tests to html.rs

* remove unnecessary using statement.

* still fighting with tests and tests are winning.
2020-07-20 07:57:29 -05:00
836efd237c fix internal command parsing (args.is_last) (#2224) 2020-07-20 05:49:40 +12:00
aad3cca793 Add benchmark command (#2223) 2020-07-20 05:39:43 +12:00
6829ad7a30 hopefully the final fix for history (#2222) 2020-07-19 07:47:55 -05:00
1f0962eb08 Add some tests for parse_arg (#2220)
* add some tests for parse

* Format

* fix warnings
2020-07-19 19:12:56 +12:00
c65acc174d Add hex pretty print to 'to html' (#2221) 2020-07-19 16:44:15 +12:00
2dea392e40 Add hex pretty print to 'to html' (#2217) 2020-07-19 12:14:40 +12:00
0c43a4d04b Add hex pretty print to 'to html' (#2216) 2020-07-19 10:12:17 +12:00
ebc2d40875 Expose more registry APIs (#2215) 2020-07-19 06:01:05 +12:00
3432078e77 Fix uniq to work with simple values (#2214) 2020-07-19 05:19:03 +12:00
9e5170b3dc Clean up lines command (#2207) 2020-07-19 05:17:56 +12:00
0ae7c5d836 Fix if description (#2204) (#2213) 2020-07-19 05:16:35 +12:00
d0712a00f4 made it easier to change colors (#2212)
* made it easier to change colors
and the beginning of html theming

* fmt
2020-07-18 11:05:45 -05:00
5e722181cb Export more defs from nu-cli (#2205) 2020-07-18 16:47:03 +12:00
ffe3e2c16b Rename calc to math eval and allow it to optionally take an expression as an argument (#2195)
* Rename `calc` to `math eval` and allow it to optionally take the expression as an argument

* Moved calc tests to math eval
Also added 2 tests and changed 1 test

* Move calc docs to math eval
2020-07-18 16:11:19 +12:00
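
Both invocation forms described in the rename above, shown as a hedged sketch (the expression string is illustrative):

```
# expression passed as an argument
> math eval "2 + 3 * 4"
# expression taken from the pipeline instead
> echo "2 + 3 * 4" | math eval
```
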
04e8aa31fe update history max size with two different calls. (#2202)
Closes #2193
2020-07-18 15:26:32 +12:00
9d24b440bb Introduce completion abstractions to nushell. (#2198)
* Introduce completion abstractions to nushell.

Currently, we rely on rustyline's completion structures. By abstracting this
away, we are more flexible to introduce someone elses completion engine, or our
own.

* Update value_shell.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-07-18 14:55:10 +12:00
d8594a62c2 Add wasm support (#2199)
* Working towards a PoC for wasm

* Move bson and sqlite to plugins

* proof of concept now working

* tests are green

* Add CI test for --no-default-features

* Fix some tests

* Fix clippy and windows build

* More fixes

* Fix the windows build

* Fix the windows test
2020-07-18 13:59:23 +12:00
dbe0effd67 User error propagation operator (#2201) 2020-07-18 13:12:06 +12:00
b358804904 Auto-Generate Documentation for nushell.com (#2139)
* Very rough idea

* Remove colour codes

* Work on command for generating docs

* Quick comment

* Use nested collapsible markdown

* Refine documentation command

* Clippy and rename docs

* This layout probably seems best

Also moved some code to documentation.rs to avoid making help.rs massive

* Delete summaries.md

* Add usage strings

* Remove static annotations

* get_documentation produces value

Which will be used like
'help generate_docs | save "something"'
The resulting yaml can be passed to a script for generating HTML/MD files in the website

* Fix subcommands

* DRY code

* Address clippy:

* Fix links

* Clippy lints

* Move documentation to more central location
2020-07-18 10:22:43 +12:00
7b02604e6d changed colors as per Jörn's suggestion. (#2200)
* changed colors as per Jörn's suggestion.

* cleaned up old comments
2020-07-17 15:02:54 -05:00
6497421615 Keep until and while as subcommands of keep (#2197) 2020-07-18 07:06:48 +12:00
f26151e36d Silence Rust 1.45 Clippy warnings (#2196)
* Silence Rust 1.45 Clippy warnings dealing with using `map_err()`

* Silence false Clippy warning

* Fix last Clippy error for unnecessary conversion

* Fix `and_then` clippy warnings
2020-07-18 05:57:15 +12:00
0f688d7da7 Use '?' for error propagation, remove match (#2194) 2020-07-17 05:39:51 +12:00
a04dfca63a added ability to supply --dark_bg to to html (#2189)
* added ability to supply --dark_bg to to html

* fmt + fixed tests

* updated other html tests

* fmt
2020-07-16 08:19:29 -05:00
72f6513d2a Keybindings and invocation fix (#2186) 2020-07-15 19:51:59 +12:00
7c0a830d84 Match cleanup (#2184)
* Use `unwrap_or()` to remove `match`

* Use `?` for error propagation, and remove `match`
2020-07-15 19:51:41 +12:00
c299d207f7 Remove unnecessary match (#2183) 2020-07-15 19:50:38 +12:00
42a1adf2e9 Indices are (now) green, bold, right-aligned (#2181)
With https://github.com/nushell/nushell/pull/355 the (numeric) index column of tables was changed to be right-aligned. After the move to `nu-table` the index column is now centered instead of right-aligned. I think this is a copy-paste bug where [this line](71e55541d7/crates/nu-cli/src/commands/table.rs (L190)) has been copied from [this line](71e55541d7/crates/nu-cli/src/commands/table.rs (L207)), since the code is out-of-sync with the comment. This change restores harmony between the description and the function of the code.
2020-07-15 15:48:20 +12:00
b4761f9d8a Remove commands meant for internal use. (#2182) 2020-07-14 21:49:46 -05:00
71e55541d7 Merge skip command varieties into one command with sub commands. (#2179) 2020-07-14 20:44:49 -05:00
5f1075544c Remove unnecessary match statement (#2177) 2020-07-14 20:17:28 -04:00
0934410b38 Use matches!() for true/false returning match statements (#2176) 2020-07-14 20:11:41 -04:00
17e6c53b62 Add str reverse subcommand (#2170)
* Add str reverse subcommand

* rustfmt
2020-07-15 08:47:04 +12:00
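
A minimal assumed example of the new subcommand:

```
# reverse the characters of the incoming string
> echo "nushell" | str reverse
```
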
80d2a7ee7a Fix autoenv executing scripts multiple times (#2171)
* Fix autoenv executing scripts multiple times

Previously, if the user had only specified entry or exitscripts the scripts
would execute many times. This should be fixed now

* Add tests

* Run exitscripts

* More tests and fixes to existing tests

* Test solution with visited dirs

* Track visited directories

* Comments and fmt
2020-07-15 07:16:50 +12:00
8fd22b61be Add variance and stddev subcommands to math command (#2154)
* add variance (population)
subcommand to math

* impl variance subcommand with spanning errors for invalid types

* add stddev subcommand to math

* rename bytes to filesize

* clippy fix -- use expect instead of unwrap in variance tests
2020-07-15 07:15:02 +12:00
e9313a61af Make str more strict. (#2173) 2020-07-14 10:04:00 -05:00
f2c4d22739 group-by can generate custom grouping key by block evaluation. (#2172) 2020-07-14 08:45:19 -05:00
8551e06d9e Ensure source buffer is cleared after reading in MaybeTextCodec. (#2168) 2020-07-14 11:24:52 +12:00
97cedeb324 Fix str --to-int usages (#2167) 2020-07-13 15:07:34 -04:00
07594222c0 Extend 'Shell' with open and save capabilities (#2165)
* Extend 'Shell' with open and save capabilities

* clippy fix
2020-07-13 21:07:44 +12:00
7a207a673b Update documentation to properly refer to subcommands with spaces (#2164) 2020-07-13 18:39:36 +12:00
78f13407e6 Documentation for autoenv (#2163)
* Documentation

* Somewhat nicer?

* cat
2020-07-13 18:23:19 +12:00
5a34744d8c add --char flag to 'str trim' (#2162) 2020-07-13 17:45:34 +12:00
0456f4a007 To html with color (#2158)
* adding color to html output

* latest changes

* seems to be working now

* WIP - close. Good is the enemy of Great.

* fixed the final issues... hopefully
2020-07-13 17:40:59 +12:00
f3f40df4dd Tests for autoenv (and fixes for bugs the tests found) (#2148)
* add test basic_autoenv_vars_are_added

* Tests

* Entry and exit scripts

* Recursive set and overwrite

* Make sure that overwritten vals are restored

* Move tests to autoenv

* Move tests out of cli crate

* Tests help, apparently. Windows has issues

On windows, .nu-env is not applied immediately after running autoenv trust.
You have to cd out of the directory for it to work.

* Sort paths non-lexicographically

* Sibling dir test

* Revert "Sort paths non-lexicographically"

This reverts commit 72e4b856af.

* Rename test

* Change conditions

* Revert "Revert "Sort paths non-lexicographically""

This reverts commit 71606bc62f.

* Set vars as they are discovered

This means that if a parent directory is untrusted,
the variables in its child directories are still set properly.

* format

* Fix cleanup issues too

* Run commands in their separate functions

* Make everything into one large function like all the cool kids

* Refactoring

* fmt

* Debugging windows path issue

* Canonicalize

* Trim whitespace

* On windows, use echo nul instead of touch to create file in test

* Avoid cloning by using drain()
2020-07-12 16:14:09 +12:00
bdef5d7d72 Add 'str from' subcommand (#2125)
* add human, precision commands

* add 'str from' subcommand (converted from human/precision commands)

move human tests to str from

* add default locale, platform-specific SystemLocale use

* fix platform specific num-format dependency, remove invalid test

* change 'str from' localization to static num_format::Locale::en

* minor cleanup, nudge ci

* re-attempt ci
2020-07-12 15:57:39 +12:00
8d03cf5b02 added more verbose message to assert (#2157)
* added more verbose message to assert

having a -h short flag is bad for any command since -h is already used by --help.

* updated for fmt
2020-07-11 16:49:44 -05:00
3ec0242960 fix the name of 'do' (#2152)
* fix the name of 'do'

* try to fix ci
2020-07-11 17:09:05 +12:00
0bc2e29f99 Rename 'bytes' to 'filesize' (#2153) 2020-07-11 14:17:37 +12:00
1bb6a2d9ed Split key/value in 'config set' (#2151) 2020-07-11 13:15:51 +12:00
e848fc0bbe Updates config to use subcommands (#2146)
* First commit updating `config` to use subcommands (#2119)
    - Implemented `get` subcommand

* Implemented `config set` as a subcommand.

* Implemented `config set_into` as a subcommand

* Fixed base `config` command
 - Instead of outputting help, it now outputs the list of all
 configuration parameters.

* Added `config clear` subcommand

* Added `config load` and `config remove` subcommands

* Added `config path` subcommand

* fixed clippy
2020-07-11 12:11:04 +12:00
6820d70e7d make duration pretty print clearer (#2150)
* make duration pretty print clearer

* fix typo and tests

* fix typo and tests
2020-07-11 09:06:52 +12:00
f32ab696d3 Return iter from sort by (#2149) 2020-07-11 06:49:55 +12:00
e07a9e4ee7 1747 add ns to duration (#2128)
* Added nanos to Duration

* Removed unwraps

* Added nanos to Duration

* Removed unwraps

* Fixed errors

* Removed unwraps

* Changed serialization to String

* Fixed Date and Duration comparison
2020-07-11 05:48:11 +12:00
6a89b1b010 Remove duplicate method (retag) (#2147) 2020-07-10 06:21:13 -04:00
b1b93931cb Return an iter from last command (#2143) 2020-07-09 09:07:51 -04:00
1e62a8fb6e Fix variable name (#2142) 2020-07-08 19:55:01 -04:00
ed6f337a48 Refactor Fetch command (No logic changes) (#2131)
* Refactoring unrelated to Streaming...

Mainly keeping code DRY

* Remove cli as dependency
2020-07-09 04:49:27 +12:00
b004236927 Return iter from from vcf (#2137) 2020-07-08 09:20:57 -04:00
0fdb9ac5e2 str substring additions. (#2140) 2020-07-08 04:45:45 -05:00
28be39494c add requoting for completions (#2129) 2020-07-07 17:13:39 -04:00
32f18536e1 Add space to special prompt chars (#2122)
So spaces don't have to be escaped with quotes in the config.toml
2020-07-06 10:30:47 -05:00
34e1e6e426 Add "move column" command. (#2123) 2020-07-06 10:27:01 -05:00
c3ba1e476f Make every stream-able (#2120)
* Make every stream-able

* Make each over ranges stream-able
2020-07-06 20:23:27 +12:00
a1a0710ee6 Return iter from every command (#2118)
* Return iter from `every` command

* Clippy

* Better variable name
2020-07-06 14:25:39 +12:00
455b1ac294 Update config.yml 2020-07-06 08:21:46 +12:00
b2e0dc5b77 Update azure-pipelines.yml 2020-07-06 08:20:38 +12:00
d30c40b40e Bump to 0.16.1 (#2116) 2020-07-06 08:12:44 +12:00
85d848dd7d Stream results of drop command (#2114)
* Stream results of drop command

* When the number of rows to drop is equal to or greater than the size of the table, output nothing
2020-07-06 05:46:06 +12:00
74717582ac Slightly nicer "rm" message (#2113)
* maybe this was root issue

* quotes

* formatting
2020-07-06 05:42:37 +12:00
ee18f16378 Autoenv rewrite, security and scripting (#2083)
* Add args in .nurc file to environment

* Working dummy version

* Add add_nurc to sync_env command

* Parse .nurc file

* Delete env vars after leaving directory

* Removing vals not working, strangely

* Refactoring, add comment

* Debugging

* Debug by logging to file

* Add and remove env var behavior appears correct

However, it does not use existing code that well.

* Move work to cli.rs

* Parse config directories

* I am in a state of distress

* Rename .nurc to .nu

* Some notes for me

* Refactoring

* Removing vars works, but not done in a very nice fashion

* Refactor env_vars_to_delete

* Refactor env_vars_to_add()

* Move directory environment code to separate file

* Refactor from_config

* Restore env values

* Working?

* Working?

* Update comments and change var name

* Formatting

* Remove vars after leaving dir

* Remove notes I made

* Rename config function

* Clippy

* Cleanup and handle errors

* cargo fmt

* Better error messages, remove last (?) unwrap

* FORMAT PLZ

* Rename whitelisted_directories to allowed_directories

* Add comment to clarify how overwritten values are restored.

* Change list of allowed dirs to indexmap

* Rewrite starting

* rewrite everything

* Overwritten env values tracks an indexmap instead of vector

* Refactor restore function

* Untrack removed vars properly

* Performance concerns

* Performance concerns

* Error handling

* Clippy

* Add type aliases for String and OsString

* Deletion almost works

* Working?

* Error handling and refactoring

* nicer errors

* Add TODO file

* Move outside of loop

* Error handling

* Reworking adding of vars

* Reworking adding of vars

* Ready for testing

* Refactoring

* Restore overwritten vals code

* todo.org

* Remove overwritten values tracking, as it is not needed

* Cleanup, stop tracking overwritten values as nu takes care of it

* Init autoenv command

* Initialize autoenv and autoenv trust

* autoenv trust toml

* toml

* Use serde for autoenv

* Optional directory arg

* Add autoenv untrust command

* ... actually add autoenv untrust this time

* OsString and paths

* Revert "OsString and paths"

This reverts commit e6eedf8824.

* Fix path

* Fix path

* Autoenv trust and untrust

* Start using autoenv

* Check hashes

* Use trust functionality when setting vars

* Remove unused code

* Clippy

* Nicer errors for autoenv commands

* Non-working errors

* Update error description

* Satisfy fmt

* Errors

* Errors print, but not nicely

* Nicer errors

* fmt

* Delete accidentally added todo.org file

* Rename direnv to autoenv

* Use ShellError instead of Error

* Change tests to pass, danger zone?

* Clippy and errors

* Clippy... again

* Replace match with or_else

* Use sha2 crate for hashing

* parsing and error msg

* Refactoring

* Only apply vars once

* if parent dir

* Delete vars

* Rework exit code

* Adding works

* restore

* Fix possibility of infinite loop

* Refactoring

* Non-working

* Revert "Non-working"

This reverts commit e231b85570.

* Revert "Revert "Non-working""

This reverts commit 804092e46a.

* Autoenv trust works without restart

* Cargo fix

* Script vars

* Serde

* Serde errors

* Entry and exitscripts

* Clippy

* Support windows and handle errors

* Formatting

* Fix infinite loop on windows

* Debugging windows loop

* More windows infinite loop debugging

* Windows loop debugging #3

* windows loop #4

* Don't return err

* Cleanup unused code

* Infinite loop debug

* Loop debugging

* Check if infinite loop is vars_to_add

* env_vars_to_add does not terminate, skip loop as test

* Hypothesis: std::env::current_dir() is messing with something

* Hypothesis: std::env::current_dir() is messing with something

* plz

* make clippy happy

* debugging in env_vars_to_add

* Debbuging env_vars_to_add #2

* clippy

* clippy..

* Fool clippy

* Fix another infinite loop

* Binary search for error location x)

* Binary search #3

* fmt

* Binary search #4

* more searching...

* closing in... maybe

* PLZ

* Cleanup

* Restore commented out functionality

* Handle case when user gives the directory "."

* fmt

* Use fs::canonicalize for paths

* Create optional script section

* fmt

* Add exitscripts even if no entryscripts are defined

* All sections in .nu-env are now optional

* Re-read config file each directory change

* Hot reload after autoenv untrust, don't run exitscripts if untrusted

* Debugging

* Fix issue with recursive adding of vars

* Thank you for finding my issues Mr. Azure

* use std::env
2020-07-06 05:34:00 +12:00
9e82e5a2fa Show entire table if number of rows requested for last is greater than table size (#2112)
* Show entire table if number of rows requested for last is greater than table size

* rustfmt

* Add test
2020-07-05 13:04:17 +12:00
8ea2307815 More informative error messages when "rm" fails (#2109)
* add a nicer error message

* small fix to message and remove log
2020-07-05 13:03:12 +12:00
bbc5a28fe9 Fix buffering in lines command (#2111) 2020-07-05 12:20:58 +12:00
04120e00e4 Oops, fix crash in parser updates (#2108) 2020-07-05 08:56:54 +12:00
efd8a633f2 Align tables in "ls -f" (#2105)
* Stuff column with nothing if we have nothing

* Stuff columns at the very start

Remove unnecessary else clauses.
Add the unix cfg portion

* Added some tests and cfg windows

Not sure how I feel about these tests but it's better than nothing
2020-07-05 08:17:36 +12:00
e75c44c95b If command and touchups (#2106) 2020-07-05 07:40:04 +12:00
0629c896eb Return error for unterminated string in bareword parser (#2103)
* add some tests

* add failing tests

* use some; fix test

* clean up code, flesh out tests

* cargo fmt
2020-07-04 17:14:31 +12:00
eb02c773d0 Add 'str length' command (#2102) 2020-07-04 08:17:44 +12:00
e31e8d1550 Convert open/fetch to stream (#2028)
* Types lined up for open with stream

* Chunking stream

* Maybe I didn't need most of the Stream stuff after all?

* Some clean-up

* Merge weird cargo.lock

* Start moving some encoding logic to MaybeTextCodec

Will we lose the nice table formatting if we Stream? How do we get it back? Collect the Stream at the end?

* Clean-up and small refinements

* Put in auto-convert workaround

* Workaround to make sure bat functionality works

* Handle some easy error cases

* All tests pass

* Remove guessing logic

* Address clippy comments

* Pull latest master and fix MaybeTextCodec usage

* Add tag to enable autoview
2020-07-04 07:53:20 +12:00
8775991c2d Add 'split chars' command (#2101) 2020-07-04 07:09:38 +12:00
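
An assumed example of `split chars`, which should emit one row per character of the input string:

```
# split a string into its individual characters
> echo "abc" | split chars
```
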
de8e2841a0 Numbered each (#2100)
* Add --numbered to each

* Fix example tester and add numbered each
2020-07-03 20:43:55 +12:00
5cafead4a4 Added license and license-for-less to wix build (#2097) 2020-07-03 11:30:00 +12:00
180290f3a8 Remove custom escaping for external args. (#2095)
Our own custom escaping unfortunately is far too simple to cover all cases.
Instead, the parser will now do no transforms on the args passed to an external
command, letting the process spawning library deal with doing the appropriate
escaping.
2020-07-03 11:29:28 +12:00
7813063c93 updated less license to raw gh link (#2088) 2020-07-02 16:25:07 +12:00
ba5d774fe1 Add a histogram example to the random dice documentation (#2087) 2020-07-02 16:24:28 +12:00
7be49e43fd added a few more command chars (#2086)
* added a few more command chars

* forgot my ole' friend clippy

* added unicode name synonyms to characters
2020-07-01 17:34:11 -05:00
dcd2227201 Update README.md 2020-07-02 09:00:44 +12:00
2dd28c2909 updated to include less and nushell licenses (#2085) 2020-07-01 10:45:42 +12:00
0522023d4c Bump to 0.16.0 (#2084) 2020-07-01 06:25:09 +12:00
9876169f5d Add dice subcommand to random command (#2082)
* Add dice subcommand to random command

* Update random dice test name

* Stream results of random dice

* Thanks Clippy!
2020-06-30 16:12:51 +12:00
ed10aafa6f Bubble errors even if pipeline isn't used (#2080) 2020-06-30 05:39:11 +12:00
bcddeb3c1f WIP (#2077) 2020-06-29 09:06:05 +12:00
3f170c7fb8 Cal improvements (#2074)
* .get() already checks for the argument, don't need to use .has() as well

* Supplying the month-names flag should also cause the months column to show up, it should not require the -m flag first
2020-06-29 05:16:10 +12:00
8d91d151bf added raw to date for string output (#2075) 2020-06-28 09:01:13 -05:00
821d44af54 Docs autoview pwd touch (#2068)
* [ADD] Add draft documentation for autoview

* [ADD] Add draft documentation for pwd

* [ADD] Add draft documentation for touch

* [MOD] Improve description and add examples

Add the use of `textview` and `binaryview`.
Add examples for single value, source file and binary file.
2020-06-28 14:22:26 +12:00
a30901ff7d added ability to request the date in a format (#2073) 2020-06-27 20:36:15 -05:00
94a1968a88 Add command for printing special characters (#2072) 2020-06-28 09:46:30 +12:00
dffc9c9b1c Properly redirect invocations (#2070)
* Properly redirect invocations

* Don't convert with-env yet, as there's a random test failure
2020-06-28 09:04:57 +12:00
8b3964f518 More ansi (#2067)
* added a few more options for ansi colors
not sure italic works, maybe it's font dependent

* fmt
2020-06-27 10:29:09 -05:00
7fed9992c9 Bump deps and touchup (#2066) 2020-06-27 19:54:31 +12:00
4e2a4236f8 Fix it expansion and add collect (#2065) 2020-06-27 17:38:19 +12:00
05781607f4 Configurable built-in prompts (#2064)
* Add ansi, do, and prompt customization

* Fix test

* Cleanups
2020-06-27 10:37:31 +12:00
6daec399e6 added match plugin and readme.txt for msi (#2063) 2020-06-26 08:39:19 -05:00
306dc89ede Add bool subcommand to random (#2061)
* Add bool subcommand to random

* Fix function name copy paste error

* Fix issue 2062: allow deserialization of a decimal

* Add bias flag to `random bool`
2020-06-26 16:51:05 +12:00
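
A hedged sketch of `random bool`; the `--bias` flag name comes from the bullet above, while its argument (a ratio of true results) is an assumption:

```
# flip a fair coin
> random bool
# weight the result toward true (argument assumed to be a ratio)
> random bool --bias 0.75
```
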
80ce8acf57 Add ThemedPalette (#1873)
* add theme module

* reorganize theme palette code

* improve tests

* move to newtype implementation for ThemeColor and fix Palette name.

* add dead code ignore for now

* fix allow dead code macro

* remove redundant import and unnecessary return

* fix ok_or clippy error

Co-authored-by: Kurtis Nusbaum <kcommiter@gmail.com>
2020-06-26 16:40:12 +12:00
8dfc90a322 Update release.yml 2020-06-26 15:55:18 +12:00
ad5e485594 Update release.yml 2020-06-26 15:24:45 +12:00
60ed40f8bd Update release.yml 2020-06-26 14:34:39 +12:00
a6228cab9e Update main.wxs (#2060) 2020-06-26 11:34:39 +12:00
1857ac69d1 updated wxs to have the right exes (#2059) 2020-06-25 17:38:32 -05:00
e33e80ab24 Update release.yml 2020-06-26 09:40:59 +12:00
d18bc78e7c Update release.yml 2020-06-26 09:28:09 +12:00
3b2a87b6d4 Update release.yml 2020-06-26 09:08:20 +12:00
62c76be7ca Update release.yml 2020-06-26 09:06:33 +12:00
733f93e673 update to make closer to volta's (#2058) 2020-06-26 08:08:59 +12:00
2c88b2fae7 Gh actions with wix (#2057)
* Added wix to gh workflow
Followed volta example

* added --nocapture to see more error detail

* move creation of wix to after we download less.exe

* moved create wix down
2020-06-26 07:30:07 +12:00
501da433d4 Gh actions with wix, added --nocapture (#2056)
* Added wix to gh workflow
Followed volta example

* added --nocapture to see more error detail
2020-06-26 06:26:48 +12:00
0e8a239ae1 Added wix to gh workflow (#2055)
Followed volta example
2020-06-26 05:51:50 +12:00
bb08a221e2 Wix addition for creating msi (#2054)
* WIP - wix

* Updated wxs to have less. Added less.exe.

* removed binary less.exe since we're downloading it
updated wxs to point to output/* for less.exe

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-26 05:50:45 +12:00
0dbe347f84 Update Cargo.lock (#2053) 2020-06-25 19:46:20 +12:00
72a21ad619 Adds random command with uuid subcommand (#2050) 2020-06-25 17:51:09 +12:00
6372d2a18c Made starship usage configurable (#2049)
Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-25 17:44:55 +12:00
4468947ad4 Add release automation with GitHub Actions (#2048) 2020-06-25 11:43:25 +12:00
93144a0132 Make mode subcommand: math mode (#2043)
* Update calculate to return a table when Value is a table

* impl mode subcommand for math

* add tests for math mode subcommand

* add table/row tests for math mode subcommand

* fix formatting
2020-06-25 05:57:27 +12:00
72f7406057 Fix cal command week-start example (#2040) 2020-06-24 17:15:19 +12:00
c56cbd0f6b Cleanup header to work like before (#2039) 2020-06-24 06:51:06 +12:00
1420cbafe4 Refactor out RunnableContext from calculate (#2037) 2020-06-24 06:24:33 +12:00
053bd926ec First pass at updating all documentation formatting and cleaning up output of examples (#2031) 2020-06-24 06:21:47 +12:00
d095cb91e4 bump which from 3 to 4.0.1 (#2035)
This release should work with reparse points like WinGet.
2020-06-22 12:58:10 -05:00
e8476d8fbb removed some comments (#2032)
removed comments that i shouldn't have left in
2020-06-22 08:30:43 -05:00
7532618bdc Add error for division by zero using "=" (#2002) (#2030)
* error for division by zero with =

* cargo fmt

* add helper zero_division_error

* tests for zero division error

* fix test names (zero div error)
2020-06-22 08:50:43 +12:00
e3e1e6f81b Adds a test case for changing drives using drive letter in Windows (#2022)
* added test case to test check switching drives using drive letter in windows

* modified test case to not use config, assert file exists
2020-06-22 07:02:58 +12:00
bce6f5a3e6 Uniq: --count flag to count occurences (#2017)
* uniq: Add counting option (WIP!)

Usage:

fetch https://raw.githubusercontent.com/timbray/topfew/master/test/data/access-1k | lines | wrap item | uniq | sort-by count | last 10

* uniq: Add first test

* uniq: Re-enable the non-counting variant.

* uniq: Also handle primitive lines.

* uniq: Update documentation

* uniq: Final comment about error handling. Let's get some feedback

* uniq: Address review comments.

Not happy with the way I create a TypeError. There must be a cleaner
way. Anyway, good for shipping.

* uniq: Use Labeled_error as suggested by jturner in chat.

* uniq: Return error directly.

Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-06-21 12:22:06 +12:00
480600c465 Fix wrap of carriage returns in cells (#2027)
* Fix carriage returns in cells

* Fix carriage returns in cells
2020-06-21 09:33:58 +12:00
89c737f456 Finish move to nu-table (#2025) 2020-06-21 07:25:07 +12:00
4e83363dd3 Upgrade heim to 0.1.0-beta.3 (#2019) 2020-06-21 06:55:16 +12:00
de6d8738c4 Simplify textview match (#2024)
* simplify textview match code

* Math median tests and documentation additions (#2018)

* Add math median example and unit tests

* Update output of other all math ls command examples to keep consistent with math median output

* Fix output of math max example

* Update output of other math commands using pwd examples to keep data consistent
2020-06-20 12:16:36 -05:00
853d7e7120 Math median tests and documentation additions (#2018)
* Add math median example and unit tests

* Update output of other all math ls command examples to keep consistent with math median output

* Fix output of math max example

* Update output of other math commands using pwd examples to keep data consistent
2020-06-20 00:28:03 -05:00
b0c30098e4 Sort primitives explicitly. (#2016)
* Sort primitives explicitly.

* Write backing up test.
2020-06-19 23:34:36 -05:00
fcbaefed52 Nu table (#2015)
* WIP

* Get ready to land nu-table

* Remove unwrap
2020-06-20 15:41:53 +12:00
77e02ac1c1 Fixed grammar (#2012) 2020-06-19 20:54:25 -05:00
088901b24f Rename average to avg 2020-06-19 18:59:00 -05:00
ed7a62bca3 textview config docs (#2011)
* documentation for bat config changes

* renamed to textview, added fetch example

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-19 15:45:56 -05:00
6bfd8532e4 Bat config (#2010)
* WIP - changes to support bat config

* added bat configuration

* removed debug info

* clippy fix

* changed [bat] to [textview]

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-19 15:08:59 -05:00
bc9cc75c8a Minor Math Sum Additions (#2007)
* Move sum tests into math directory

* Move sum documentation over to math documentation

One sum example already existed in the math examples and a few of the others were outdated and didn't work, so I only moved one over, and updated their output

* Remove no-longer-in-use mod statement
2020-06-20 06:00:18 +12:00
53a6e9f0bd Convert sum command into subcommand of the math command (#2004)
* Convert sum command into subcommand of the math command

* Add bullet points to math.md documentation
2020-06-18 21:02:01 -05:00
5f9de80d9b Math#median - ability to compute median value. 2020-06-18 16:59:43 -05:00
353b33be1b Add support to allow the week day start in cal to be configured via a flag (#1996)
* Add support to allow the week day start in cal to be configurable

* Fix variable name

* Use a flag instead of a configuration setting for specifying the starting day of the week
2020-06-19 05:34:51 +12:00
96d58094cf Fix regression. skip-until 'skips' until condition is met. 2020-06-17 14:08:09 -05:00
94aac0e8dd Remove unused pattern matched tag fields. 2020-06-17 13:34:17 -05:00
9f54d238ba Refactoring and more split-by flexibility. 2020-06-17 13:34:17 -05:00
778e497903 Refactoring and more group-by flexibility. 2020-06-17 13:34:17 -05:00
6914099e28 Cat with wings (#1993)
* WIP - Modified textview to use bat crate

* use input_from_bytes_with_name instead of input_file

* removed old paging
added prettyprint on else blocks
duplicated  too much code
hard coded defaults

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-16 16:17:32 -05:00
1b6f94b46c Cal command code cleanup (#1990)
* Cal command code cleanup

* Reverting "index_map" back to "indexmap", since that is the convention used in the system
2020-06-17 08:00:49 +12:00
3d63901b3b Add 'every' command to select (or skip) every nth row (#1992)
* Add 'every' command

* Add --skip option to 'every' command

This option skips instead of selects every nth row

* Fix descriptions for 'every' command

* Add documentation for 'every' command

* Check actual filenames in 'every' command tests
2020-06-17 07:58:41 +12:00
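
A sketch of the `every` command based on the description above (treating the stride as a positional argument is an assumption):

```
# keep every 2nd row of the listing
> ls | every 2
# invert the selection: skip every 2nd row instead
> ls | every 2 --skip
```
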
eb1ada6115 issue1332 - Fix for yamls with unquoted double curly braces (#1988)
* Gnarly hardcoded fix

* Whoops remove println
2020-06-17 07:12:04 +12:00
831111edec Pass the borrow instead of clone. (#1986) 2020-06-16 05:35:24 +12:00
29ea29261d Bump to 0.15.1 (#1984) 2020-06-15 09:54:30 +12:00
ee835f75db Last batch of removing async_stream (#1983) 2020-06-15 09:00:42 +12:00
bd7ac0d48e Math Documentation (#1982)
* Adding math docs

* Add some comments to calculate

* Remove redundant message

Message already shows up in subcommands list

* Added not working example

Accidentally

* Remove table
2020-06-15 08:42:15 +12:00
d7b1480ad0 Another batch of removing async_stream (#1981) 2020-06-14 11:54:35 +12:00
86b316e930 Another batch of removing async_stream (#1979)
* Another batch of removing async_stream

* Another batch of removing async_stream

* Another batch of removing async_stream
2020-06-14 10:01:44 +12:00
a042f407c1 1882-Add min, max in addition to average for acting lists (#1969)
* Converting average.rs to math.rs

* Making some progress towards math

Examples and unit tests failing, also think I found a bug with passing in strings

* Fix typos

* Found issue with negative numbers

* Add some comments

* Split commands like in split and str_ but do not register?

* register commands in cli

* Address clippy warnings

* Fix bad examples

* Make the example failure message more helpful

* Replace unwraps

* Use compare_values to avoid coercion issues

* Remove unneeded code
2020-06-14 09:49:57 +12:00
40673e4599 Another batch of removing async_stream (#1978) 2020-06-14 07:13:36 +12:00
bcfb084d4c Remove async_stream from some commands (#1976) 2020-06-14 04:30:24 +12:00
a1fd5ad128 Updated Readme to include Roadmap project board. (#1975) 2020-06-13 08:24:30 -05:00
fe6d96e996 Another batch of converting commands away from async_stream (#1974)
* Another batch of removing async_stream

* merge master
2020-06-13 20:43:21 +12:00
e24e0242d1 Removing async_stream! from some more commands (#1973)
* Removing async_stream! from some more commands

* Fix await error

* Fix Clippy warnings
2020-06-13 20:03:13 +12:00
c959dc1ee3 Another batch of removing async_stream (#1972) 2020-06-13 16:03:39 +12:00
d82ce26b44 Another batch of removing async_stream (#1971) 2020-06-13 11:40:23 +12:00
935a5f6b9e Another batch of removing async_stream (#1970) 2020-06-12 20:34:41 +12:00
731aa6bbdd use encoding on open for #1939 (#1949)
* WIP - not compiling

* compiling but panicing

* still broken

* nearly working

* reverted deserializer_string changes
updated enter.rs and open.rs to use Option<Tagged<String>>
Accepted Clippy suggestions
Accepted fmt suggestions
Left original code from open.rs
 We may want to use some of it and only fallback to encoding.

* Don't exit when there is an unknown encoding.

* When encoding is unknown default to utf-8.

* only do encoding if the user says to

* merged some conflicts on open

* made error messages consistent

* Updated unwrap with expect

* updated open test to pass with more descriptive err
updated enter test to not fail

* change _location to location

* changed _visitor to visitor

* Added a more verbose usage statement for encoding
Linked to docs.rs/encoding_rs for details

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-06-11 19:37:43 -05:00
a268e825aa Allow config to be readonly (#1967) 2020-06-12 05:50:57 +12:00
982f067d0e Proper precedence history in math (#1966) 2020-06-12 05:17:08 +12:00
a3e1a3f266 Bump start plugin to 0.15.0 (#1956) 2020-06-10 08:39:15 +12:00
e5a18eb3c2 Bump to 0.15.0 (#1955) 2020-06-10 05:33:59 +12:00
16ba274170 Bump heim dependency version. (#1954)
Most important change is a fix for processes CPU usage calculation (see https://github.com/heim-rs/heim/issues/246)
2020-06-10 04:34:05 +12:00
3bb2c9beed Rename env file to .nu-env (#1953) 2020-06-09 15:54:20 +12:00
2fa83b0bbe Pub expose InterruptibleStream and InputStream. (#1952)
This allows crate users to make sure their long-running
streams can be interrupted with ctrl-c.
2020-06-09 05:17:19 +12:00
bf459e09cb WIP: Per directory env-variables (#1943)
* Add args in .nurc file to environment

* Working dummy version

* Add add_nurc to sync_env command

* Parse .nurc file

* Delete env vars after leaving directory

* Removing vals not working, strangely

* Refactoring, add comment

* Debugging

* Debug by logging to file

* Add and remove env var behavior appears correct

However, it does not use existing code that well.

* Move work to cli.rs

* Parse config directories

* I am in a state of distress

* Rename .nurc to .nu

* Some notes for me

* Refactoring

* Removing vars works, but not done in a very nice fashion

* Refactor env_vars_to_delete

* Refactor env_vars_to_add()

* Move directory environment code to separate file

* Refactor from_config

* Restore env values

* Working?

* Working?

* Update comments and change var name

* Formatting

* Remove vars after leaving dir

* Remove notes I made

* Rename config function

* Clippy

* Cleanup and handle errors

* cargo fmt

* Better error messages, remove last (?) unwrap

* FORMAT PLZ

* Rename whitelisted_directories to allowed_directories

* Add comment to clarify how overwritten values are restored.
2020-06-08 19:55:25 +12:00
ec7ff5960d Remove async_stream! from some commands (#1951)
* Remove async_stream! from open.rs

* Ran rustfmt

* Fix Clippy warning

* Removed async_stream! from evaluate_by.rs

* Removed async_stream! from exit.rs

* Removed async_stream! from from_eml.rs

* Removed async_stream! from group_by_date.rs

* Removed async_stream! from group_by.rs

* Removed async_stream! from map_max.rs

* Removed async_stream! from to_sqlite.rs

* Removed async_stream! from to_md.rs

* Removed async_stream! from to_html.rs
2020-06-08 16:48:10 +12:00
545f70705e ISSUE-1907 Disallow invalid top level TOML (#1946)
* Do not allow invalid top-level toml

Move recursive toml conversion into a helper func

* Forgot to format

* Forgot to use helper inside collect values

Added some additional tests
2020-06-08 08:02:37 +12:00
48672f8e30 Assign variables when passed as an argument. (#1947) 2020-06-08 04:15:57 +12:00
160191e9f4 Cal updates (#1945)
* Clean up `use` statements

* Update cal code to be ready for future data coloring
2020-06-07 15:52:42 +12:00
bef9669b85 When nushell is located in a path that has a space in it, these tests break; this fixes it (#1944) 2020-06-07 15:50:52 +12:00
15e66ae065 Implement an option to show paths made of mkdir. (#1932) 2020-06-06 15:13:38 -04:00
ba6370621f Removing async_stream! from some commands (#1940)
* Removing async_stream! from some commands

* Revert row.rs code

* Simplify logic for first.rs and skip.rs
2020-06-06 19:42:06 +12:00
2a8ea88413 Bring back parse as built-in. 2020-06-04 15:21:13 -05:00
05959d6a61 Bump to latest rustyline (#1937) 2020-06-05 05:50:12 +12:00
012c99839c Moving some commands off of async stream (#1934)
* Remove async_stream from rm

* Remove async_stream from sort_by

* Remove async_stream from split_by

* Remove dbg!() statement

* Remove async_stream from uniq

* Remove async_stream from mkdir

* Don't change functions from private to public

* Clippy fixes

* Peer-review updates
2020-06-04 20:42:23 +12:00
5dd346094e Cut out a function to generate a phrase in the Flags section. (#1930) 2020-06-04 19:09:43 +12:00
b6f9d0ca58 Remove no_auto_pivot (#1931)
no_auto_pivot does not exist any longer. It was rolled into pivot_mode.
2020-06-03 12:27:54 -04:00
ae72593831 changed to-float to to-decimal (#1926)
* changed to-float to to-decimal

* changed to-float to to-decimal
2020-06-02 09:02:57 +12:00
ac22319a5d Update Cargo.lock (#1923) 2020-06-02 04:32:24 +12:00
7e8c84e394 Bump actions/checkout version from v1 to v2. (#1924) 2020-06-02 04:31:48 +12:00
ef4eefa96a Bump more deps (#1921) 2020-05-31 08:54:47 +12:00
2dc43775e3 Bump to latest heim (#1920)
* Bump to latest heim

* Fix pinning issue
2020-05-31 08:54:33 +12:00
4bdf27b173 Batch of moving commands off async_stream #3 (#1919)
* Batch of moving commands off async_stream #3

* remove commented-out section

* merge master
2020-05-31 06:31:50 +12:00
741d7b9f10 Add rm_always_trash option to config (#1869)
* Add `rm_always_trash` option to config

* Add `--permanent` flag to `rm`

* `rm`: error if both `-t` and `-p` are present

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-31 06:31:34 +12:00
ecb67fee40 ISSUE-1744-Glob support for start command (#1912)
* Possible implementation of globbing for start command

* Whoops forgot to remove Error used for debugging

* Use string lossy

* Run clippy

* Pin glob

* Better error messages

* Remove unneeded comment
2020-05-31 05:41:25 +12:00
ad43ef08e5 Support average for tables. 2020-05-30 10:33:09 -05:00
092ee127ee Batch of moving commands off async_stream (#1917) 2020-05-30 16:34:39 +12:00
b84ff99e7f Batch of moving commands off async_stream (#1916) 2020-05-30 11:36:04 +12:00
3a6a3d7409 Implement login for the fetch command (#1915) 2020-05-30 11:22:38 +12:00
48ee20782f Ensure end_filter plugin lifecycle stage gets called. 2020-05-29 04:03:25 -05:00
360e8340d1 Move run to be async (#1913) 2020-05-29 20:22:52 +12:00
fbc0dd57e9 Add plugin_dir to docs (#1911) 2020-05-29 08:46:27 +12:00
3f9871f60d Simplify plugin directory scanning (#1910) 2020-05-29 07:14:32 +12:00
fe01a223a4 Str plugin promoted to built-in Nu command. 2020-05-28 11:18:58 -05:00
0a6692ac44 Simplify parse plugin code. (#1904)
Primarily, instead of building a parse pattern enum, we just build the regex
directly, with the appropriate capture group names so that the column name
codepaths can be shared between simple and `--regex` patterns.

Also removed capture group count compared to column name count. I don't think
this codepath can possibly be reached with the regex we now use for the
simplified capture form.
2020-05-28 09:58:06 -04:00
98a3d9fff6 Allow echo to iterate ranges (#1905) 2020-05-28 06:07:53 +12:00
e2dabecc0b Make it-expansion work when in a list (#1903) 2020-05-27 20:29:05 +12:00
49b0592e3c Implement ctrl+c for the du command (#1901)
* Implement ctrl+c for the du command

* Ignore the "too many arguments" Clippy warning
2020-05-27 16:52:20 +12:00
fa812849b8 Fix warnings and split Scope (#1902) 2020-05-27 16:50:26 +12:00
9567c1f564 Fix for inconsistency when quoted strings are used with with_env shorthand (#1900) 2020-05-26 15:03:55 -04:00
a915471b38 Cal documentation updates (#1895) 2020-05-26 07:21:36 -04:00
bf212a5a3a change the test to use the origin column (#1878) 2020-05-25 18:50:54 -04:00
f0fc9e1038 Merge pivot options (#1888) 2020-05-25 18:40:25 -04:00
cb6ccc3c5a Improve the simplified parse form. (#1875) 2020-05-25 14:19:49 -04:00
07996ea93d Remove as many of the typecasts as possible in the cal command (#1886)
* Remove as many of the typecasts as possible in the cal command

* Run rustfmt on cal.rs
2020-05-25 18:51:23 +12:00
c71510c65c Update CODE_OF_CONDUCT.md 2020-05-25 18:50:14 +12:00
9c9941cf97 Update CODE_OF_CONDUCT.md 2020-05-25 18:49:36 +12:00
005d76cf57 Fix broken ordering of args when parsing command with env vars. (#1841) 2020-05-24 19:26:27 -04:00
8a99d112fc Add --to-float to str plugin (#1872) 2020-05-24 18:11:49 -04:00
fb09d7d1a1 docs: add alias --save flag (#1874) 2020-05-24 13:42:20 -04:00
9c14fb6c02 Show error when trying to sort by invalid column (#1880)
* Show error when trying to sort by invalid column

* Added test for changes

* Addressed comments, updated test

* Removed unnecessary mutable keyword

* Changed split-column to split column after rebase from upstream
2020-05-25 05:37:08 +12:00
d488fddfe1 Add useful commands to CONTRIBUTING.md (#1865)
* Add useful commands to CONTRIBUTING.md

* Add some formatting commands
2020-05-25 05:34:26 +12:00
e1b598d478 added examples and explanation to trim (#1876)
* added examples and explanation to trim

inserted examples (taken from parse in cookbook) for lists and tables using trim

* Move `to-json` to `to json`

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-25 05:10:30 +12:00
edbecda14d Split split command to sub commands. 2020-05-24 02:02:44 -05:00
74c2daf665 Add completion for binaries on PATH (#1866) 2020-05-23 20:27:52 -04:00
aadbcf5ce8 Issue 1787 (#1827) 2020-05-23 20:08:39 -04:00
460daf029b Add space to bottom of table in 'light' mode (#1871) 2020-05-22 21:12:26 -04:00
9e6ab33fd7 Add --regex flag to parse (#1863) 2020-05-22 10:13:58 -04:00
5de30d0ae5 Tweak auto-rotate for single row output (#1861)
* added helper to convert data to strings
added ability to auto-rotate single row output
if row will be greater than terminal width

* Added pivot_to_fit config value

* Added ColumnPath to convert_to_string helper

* Figured out I had to run `cargo fmt --all -- --check`

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
2020-05-22 04:30:58 +12:00
97b9c078b1 Fix completer handling of paths with spaces (#1858)
* Fix completer handling of paths with spaces

* Satisfy Clippy for completer

* Satisfy cargo fmt for completer
2020-05-21 08:32:21 +12:00
8dc5c34932 Save alias (#1852)
* figuring out error with lines

* make progress in printing of block

* support for external commands; fix some tiny bugs in formatting

* basic printing of block; going to experiment with bubbling raw input to the command itself to avoid potential edge cases

* remove fmt::Display impls for hir structs; bubbled raw_input to command args

* compiling checkpoint :)

* process raw input alias to remove save flag; no duplicates stored

* fix warnings; run clippy

* removed tmux log file

* fix bug in looking for same alias; changed unwraps to safe unwraps
2020-05-21 05:31:04 +12:00
3239e5055c Added a count column on the histogram command (#1853)
* Adding initial draft for the addition of the count column on the histogram command

* Update histogram documentation

* Add count column test to histogram command

* Fix error in histogram documentation
2020-05-20 18:02:36 +12:00
b22db39775 Progress readme (#1854)
* Add some progress indicators to the readme

* Add some progress indicators to the readme
2020-05-20 16:46:55 +12:00
7c61f427c6 Update minimum Rust version requirement in README (#1851)
a4c1b092ba introduced uses of `map_or` on `Result`, which was only stabilised in Rust 1.41.
2020-05-20 10:55:46 +12:00
ae8c864934 default history size to 100k (#1845) 2020-05-20 07:28:06 +12:00
ed80933806 String interpolation (#1849)
* Add string interpolation

* fix coloring

* A few more fixups + tests

* merge master again
2020-05-20 07:27:26 +12:00
ae87582cb6 Fix missing invocation errors (#1846) 2020-05-19 08:57:25 -04:00
b89976daef let format access variables also (#1842) 2020-05-19 16:20:09 +12:00
76b170cea0 Update test command (#1840) 2020-05-18 22:01:27 -04:00
Sam
3302586379 added documentation for no_auto_pivot (#1828) 2020-05-18 21:21:35 -04:00
3144dc7f93 add support for specifying max history size in config (#1829) (#1837) 2020-05-19 10:27:08 +12:00
6efabef8d3 Remove interpretation of Primitive::Nothing as the number 0. (#1836) 2020-05-18 15:18:46 -04:00
0743b69ad5 Move from language-reporting to codespan (#1825) 2020-05-19 06:44:27 +12:00
5f1136dcb0 Fix newly added examples. (#1830) 2020-05-18 11:40:44 -04:00
acf13a6fcf Add (near) automatic testing for command examples (#1777) 2020-05-18 08:56:01 -04:00
Sam
3fc4a9f142 added config check to disable auto pivoting of single row outputs (#1791)
* added config check to disable auto pivoting of single row outputs

* fixed change to use built-in boolean values
2020-05-18 20:42:22 +12:00
1d781e8797 Add docs for the shuffle command (#1824) 2020-05-18 19:13:03 +12:00
b6cdfb1b19 Remove the -n flag from shuffle (#1823) 2020-05-18 19:12:35 +12:00
334685af23 Add some examples (#1821)
* Adds some examples

* Run rustfmt

* Fixed a few descriptions in the examples
2020-05-18 19:11:37 +12:00
c475be2df8 Fix starship not getting the correct pwd (#1822) 2020-05-18 17:22:54 +12:00
6ec6eb5199 Call external correctly. 2020-05-17 23:32:55 -05:00
f18424a6f6 Remove test-bins feature. 2020-05-17 23:32:55 -05:00
d1b1438ce5 Check capture group count (#1814) 2020-05-17 14:52:17 -04:00
af6aff8ca3 Allow user to specify the indentation setting on the pretty flag for the to json command (#1818)
* Allow user to specify the indentation setting on the pretty flag for the to json command

* Use "JSON" over "json"
2020-05-18 06:48:58 +12:00
d4dd8284a6 create Palette trait (#1813)
* create Pallet trait

* correct spelling to palette

* move palette to it's own module
2020-05-18 05:48:57 +12:00
e728cecb4d add docs for start command (#1816) 2020-05-18 05:37:15 +12:00
41e1aef369 Fix the insert command (#1815) 2020-05-17 08:30:52 -04:00
e50db1a4de Adds support to pretty format the json in to json (#1812)
* Current work on the --pretty flag for to json

* Deleted notes that were pushed by accident

* Fixed some errors
2020-05-17 16:43:10 +12:00
41412bc577 Update issue templates 2020-05-17 11:30:09 +12:00
e12aa87abe Update issue templates 2020-05-17 11:28:14 +12:00
0abc94f0c6 Bump some of our dependencies (#1809) 2020-05-17 10:34:10 +12:00
48d06f40b1 Remove the old it-hacks from fetch and post (#1807) 2020-05-17 06:18:46 +12:00
f43ed23ed7 Fix parsing of invocations with a dot (#1804) 2020-05-16 19:25:18 +12:00
40ec8c41a0 Cal command updates (#1758)
* Calculate the quarter directly

* Group some data together, remove attribute to ignore Clippy warning

* Group items into structs and add methods

* Updates to cal command

* Update cal.rs

* Update cal.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-16 16:00:06 +12:00
076fde16dd Evaluation of command arguments (#1801)
* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* Finish adding the baseline refactors for argument invocation

* Finish cleanup and add test

* Add missing plugin references
2020-05-16 15:18:24 +12:00
Sam
822440d5ff added nothing value for incalculable dir sizes (#1789) 2020-05-15 12:53:18 -04:00
fc8ee8e4b9 Extracted grouping by date to its own subcommand. (#1792) 2020-05-15 04:16:09 -05:00
5fbe5cf785 Use the directories crate instead of app_dirs (#1782)
The app_dirs crate has been abandoned for quite some time. Use the directories
crate instead, which is maintained and has more OS support.
2020-05-14 20:17:23 +12:00
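As a rough illustration of the swap described above (not the code from this PR; the crate version and the `ProjectDirs::from` identifiers are assumptions), config-path lookup with the `directories` crate looks roughly like this:

```rust
// Hypothetical sketch: locating a per-user config directory with the
// `directories` crate. The "org"/"nushell"/"nu" identifiers are assumed.
use directories::ProjectDirs;
use std::path::PathBuf;

fn config_dir() -> Option<PathBuf> {
    // ProjectDirs picks the right base directory per OS
    // (XDG on Linux, Application Support on macOS, AppData on Windows).
    ProjectDirs::from("org", "nushell", "nu").map(|dirs| dirs.config_dir().to_path_buf())
}
```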
f78536333a doc: add rename command page (#1781) 2020-05-14 12:36:08 +12:00
e7f08cb21d Allow external binary to register custom commands. (#1780)
This changeset contains everything that a separate binary needs to
register its own commands (including the new help function). It is
very possible that this commit misses other pub use exports, but
the contained ones work for our use cases so far.
2020-05-14 12:35:22 +12:00
f5a1d2f4fb Update README.md 2020-05-14 05:37:18 +12:00
8abbfd3ec9 Update README examples (#1779) 2020-05-14 05:36:24 +12:00
6826a9aeac doc: add more from command pages (#1778)
* doc: add from-url command page

* doc: add missing link to existing from-xml page.

* doc: add from-ini command page
2020-05-14 05:23:33 +12:00
e3b7e47515 cal: fix thursday typo (#1776) 2020-05-13 08:06:31 -04:00
196991ae1e Bump to 0.14.1 (#1772) 2020-05-13 20:03:45 +12:00
34b5e5c34e doc: add headers command page (#1775) 2020-05-13 20:03:29 +12:00
cb24a9c7ea doc: rename edit command to update (#1774) 2020-05-13 20:02:41 +12:00
9700b74407 Fix type in config flag description (#1769) 2020-05-13 14:21:57 +12:00
803c6539eb doc: fix nth examples (#1768) 2020-05-12 16:47:45 -04:00
75edcbc0d0 Simplify mv in FilesystemShell (#1587) 2020-05-12 16:40:45 -04:00
b2eecfb110 Bump to 0.14 (#1766) 2020-05-13 04:32:51 +12:00
b0aa142542 Add examples for some more commands (#1765) 2020-05-13 03:54:29 +12:00
247d8b00f8 doc: fix prepend example definition (#1761)
It seems that the description was copy-pasted by mistake from the
append command.
2020-05-12 19:46:21 +12:00
0b520eeaf0 Add a batch of help examples (#1759) 2020-05-12 17:17:17 +12:00
c3535b5c67 It-expansion fixes (#1757)
* It-expansion fixes

* fix clippy
2020-05-12 15:58:16 +12:00
8b9a8daa1d Add a batch of help examples (#1755) 2020-05-12 13:00:55 +12:00
c5ea4a31bd Adding coloring to help examples (#1754) 2020-05-12 11:06:40 +12:00
2275575575 Improve list view and ranges (#1753) 2020-05-12 08:06:09 +12:00
c3a066eeb4 Add examples to commands (#1752)
* Pass &dyn WholeStreamCommand to get_help

* Add an optional example to the WholeStreamCommand trait

* Add an example to the alias command
2020-05-12 08:05:44 +12:00
42eb658c37 Add a simplified list view (#1749) 2020-05-11 14:47:27 +12:00
a2e9bbd358 Improve duration math and printing (#1748)
* Improve duration math and printing

* Fix test
2020-05-11 13:44:49 +12:00
8951a01e58 update cal documentation (#1746) 2020-05-11 13:19:14 +12:00
f702aae72f Don't include year and month by default, adds an option to display th… (#1745)
* Don't include year and month by default, adds an option to display the quarters of the year

* Add a test for cal that checks that year requested is in the return

* rustfmt the cal tests
2020-05-11 12:35:24 +12:00
f5e03aaf1c Add cal command (#1739)
* Add cal command

* Fix documentation to show commands on two lines

* Use bullet points on flag documentation for cal

* Dereference day before calling string method

* Silence Clippy warning regarding a function with too many arguments

* Update cal flag descriptions and documentation

* Add some tests for the cal command
2020-05-10 11:05:48 +12:00
0f0847e45b Move 'start' to use ShellError (#1743)
* Move 'start' to use ShellError

* Remove unnecessary changes

* Add missing macOS change

* Add default

* More fixed

* More fixed
2020-05-10 08:08:53 +12:00
ccd5d69fd1 Bug fix start (#1738)
* fix bug on linux; added start to the stable list

* add to stable and fix clippy lint
2020-05-10 05:28:57 +12:00
55374ee54f Fix help text for alias command. (#1742)
* Fix help text for alias command.

* Rust fmt
2020-05-09 12:16:14 -05:00
f93ff9ec33 Make grouping more flexible. (#1741) 2020-05-09 12:15:47 -05:00
9a94b3c656 start command in nushell (#1727)
* mvp for start command

* modified the signature of the start command

* parse filenames

* working model for macos is done

* refactored to read from pipes

* start command works well on macos; manual testing reveals need of --args flag support

* implemented start error; color printing of warning and errors

* ran clippy and fixed warnings

* fix a clippy lint that was caught in pipeline

* fix dead code clippy lint for windows

* add cfg annotation to import
2020-05-09 06:19:48 +12:00
e04b89f747 [Gitpod] Refactor Gitpod configuration and add full Debugging support (#1728) 2020-05-09 05:41:24 +12:00
180c1204f3 Use playground instead of depending on fixture format files. (#1726) 2020-05-07 06:58:35 -05:00
96e5fc05a3 Pick->Select rename. Integration tests changes. (#1725)
Pick->Select rename. Integration tests changes.
2020-05-07 06:03:43 -05:00
c3efdf2689 Rename edit command to update. (#1724)
Rename edit command to update.
2020-05-07 00:33:30 -05:00
27fdef5479 Read exit status before failing in failed read from stdout pipe (#1723) 2020-05-07 13:42:01 +12:00
7ce8026916 Ignore empty arguments passed to externals (#1722) 2020-05-07 09:18:56 +12:00
8a9fc6a721 Fix changing to a new Windows drive (#1721)
* Fix changing to a new Windows drive

* Update cli.rs
2020-05-07 05:51:03 +12:00
c06a692709 Bash-like shorthand with-env (#1718)
* Bash-like shorthand with-env

* fix clippy warning
2020-05-06 18:57:37 +12:00
b37e420c7c Add with-env command (#1717) 2020-05-06 15:56:31 +12:00
22e70478a4 docs/commands: add to.md, update subcommands (#1715)
This adds a top-level document for the new `to` command, with a list (of links) of all the subcommands.

All the to-* subcommands keep their filename, but the content is updated to use the new subcommand syntax.

Since not all subcommands have documentation, some items in the list are just text without a link. Also filled the list for the undocumented from* commands in the same style.

Fixes #1709
2020-05-05 20:05:23 +12:00
8ab2b92405 docs/commands: add from.md, update subcommands (#1712)
This adds a top-level document for the new `from` command, with a list of links of all the subcommands.

All the from-* subcommands keep their filename, but the content is updated to use the new subcommand syntax.

Needs matching update for to*

Ref #1709
2020-05-05 09:01:31 +12:00
3201c90647 Extend to/from usage text to indicate subcommands (#1711)
Both to and from without a subcommand only print the helptext. Expand the usage line a bit, so a glance at `help commands` indicates the existence of the subcommands and mentions some common formats.

Ref a9968046ed
Ref #1708
2020-05-05 09:00:29 +12:00
454f560eaa Properly deserialize history args (#1710) 2020-05-05 07:50:10 +12:00
d2ac506de3 Changes to allow plugins to be loaded in a multi-threaded manner (#1694)
* Changes to allow plugins to be loaded in a multi-threaded manner in order to decrease startup time.

* Ran rust fmt and clippy to find and fix first pass errors.
Updated launch.jason to make debugging easier in vscode.
Also added tasks.json so tasks like clippy can be ran easily.

* ran fmt again

* Delete launch.json

Remove IDE settings file

* Remove IDE settings file

* Ignore vscode IDE settings

* Cloned the context instead of Arc/Mutexing it.

Co-authored-by: Darren Schroeder <fdncred@hotmail.com>
Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-05-05 06:15:24 +12:00
a9968046ed Add subcommands. Switch from-* and to-* to them (#1708) 2020-05-04 20:44:33 +12:00
453087248a Properly drain iterating pipe so we can see errors (#1707) 2020-05-04 15:29:32 +12:00
81ff598d6c Fix column bugs associated with previous refactoring (#1705)
* Fix: the symlink target column will only display if either the `full` or `with_symlink_targets` options are given

* If the metadata for every item in the size column is None, do not show the size column

* Do not show the symlink target column if the metadata is None for all the items in the table
2020-05-04 14:58:11 +12:00
d7d487de73 Basic documentation for the wrap command (#1704) 2020-05-04 04:54:21 +12:00
8d69c77989 Display either dir metadata size or dir apparent size in ls (#1696)
* Show dir size in ls command

* Add the option to show the apparent directory size via `ls --du`
2020-05-03 17:09:17 +12:00
0779a46179 docs/commands: add alias.md (#1697)
* docs/commands: add alias.md

* docs/commands/alias: drop reference to bash
2020-05-03 16:49:27 +12:00
ada92f41b4 Parse file size (byte) units in all mixes of case (#1693) 2020-05-02 14:14:07 -04:00
ef3049f5a1 Reduce some repetitive code in files.rs (#1692) 2020-05-02 11:14:29 -04:00
1dab82ffa1 Gitignore JetBrains' IDE items (#1691) 2020-05-01 09:43:53 +12:00
e9e3fac59d Remove bin.is_file() because it's expensive (#1689)
This one change takes the startup time from 2.8 seconds to 1.2 seconds in my testing on Windows.
2020-05-01 06:43:59 +12:00
7d403a6cc7 Escape some symbols in external args (#1687)
* Escape some symbols in external args

* Don't escape on Windows, which does its own

* fix warning
2020-04-30 16:54:07 +12:00
cf53264438 Table operating commands. (#1686)
* Table operating commands.

* Updated merge test for clarity.

* More clarity.

* Better like this..
2020-04-29 23:18:24 -05:00
d834708be8 Finish remove the last traces of per-item (#1685) 2020-04-30 14:23:40 +12:00
f8b4784891 Add .DS_Store to .gitignore (#1684) 2020-04-30 13:12:18 +12:00
789b28ac8a Convert if expression to match (#1683) 2020-04-30 12:59:50 +12:00
db8219e798 extend it-expansion to externals (#1682)
* extend it-expansion to externals

* trim the carriage return for external strings
2020-04-30 07:09:14 +12:00
73d5310c9c make it-expansion work through blocks when necessary (#1681) 2020-04-29 19:51:46 +12:00
8d197e1b2f Show symlink sizes in ls (#1680)
* Show symlink sizes in ls

* Reduce redundancy in the size code of ls
2020-04-29 19:23:26 +12:00
c704157bc8 Replace ichwh with which and some fixes for auto-cd (canonicalization) (#1672)
* fix: absolutize path against its parent if it was a symlink.

On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.

* feat: playground function to create symlinks

* fix: use playground dirs

* feat: test for #1631, shift tests names

* Creation of FilesystemShell with custom location may fail

* Replace ichwh with which

* Creation of FilesystemShell with custom location may fail

* Replace ichwh with which

* fix: add ichwh again since it cannot be completely replaced

* fix: replace one more use of which
2020-04-28 05:49:53 +12:00
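A minimal Rust sketch of the symlink fix described in the first bullet above (illustrative only, not the actual nushell code): if `read_link` returns a relative target, resolve it against the link's parent directory rather than the current working directory.

```rust
use std::io;
use std::path::{Path, PathBuf};

// Sketch only: resolve a symlink target, absolutizing relative targets
// against the symlink's own parent directory.
fn resolve_link(link: &Path) -> io::Result<PathBuf> {
    let target = std::fs::read_link(link)?;
    if target.is_relative() {
        // e.g. a link /a/b -> "../c" resolves to /a/../c, not $PWD/../c
        Ok(link.parent().unwrap_or_else(|| Path::new("")).join(target))
    } else {
        Ok(target)
    }
}
```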
6abb9181d5 bump to 0.13.1 (#1670) 2020-04-27 18:50:14 +12:00
006171d669 Fix per-item run_alias (#1669)
* Fix per-item run_alias

* Fix 1609 also
2020-04-27 18:10:34 +12:00
8bd3cedce1 It expansion (#1668)
* First step in it-expansion

* Fix tests

* fix clippy warnings
2020-04-27 14:04:54 +12:00
6f2ef05195 Resolves https://github.com/nushell/nushell/issues/1658 (#1660)
For example, when running the following:

    crates/nu-cli/src

nushell currently parses this as an external command. Before running the command, we check to see if
it's a directory. If it is, we "auto cd" into that directory, otherwise we go through normal
external processing.

If we put a trailing slash on it though, shells typically interpret that as "user is explicitly
referencing directory". So

    crates/nu-cli/src/

should not be interpreted as "run an external command". We intercept a trailing slash in the head
position of a command in a pipeline as such, and inject a `cd` internal command.
2020-04-27 13:22:01 +12:00
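A simplified sketch of that decision (assumed function name, not the parser code itself): a head word with a trailing slash, or one that names an existing directory, is treated as an auto-cd rather than an external command.

```rust
use std::path::Path;

// Sketch: decide whether the head of a pipeline should become `cd <path>`.
// A trailing slash means "this is explicitly a directory"; otherwise we
// only auto-cd when the path exists and is a directory.
fn should_auto_cd(head: &str) -> bool {
    head.ends_with('/') || Path::new(head).is_dir()
}

fn main() {
    // Always an auto-cd because of the trailing slash.
    assert!(should_auto_cd("crates/nu-cli/src/"));
    // Only an auto-cd if the directory actually exists on disk.
    println!("{}", should_auto_cd("crates/nu-cli/src"));
}
```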
80025ea684 Rows and values can be checked for emptiness. Allows to set a value if desired. (#1665) 2020-04-26 12:30:52 -05:00
a62745eefb make trim apply to strings contained in tables (also at deeper nesting (#1664)
* make trim apply to strings contained in tables (also at deeper nesting
levels), not just top-level strings

* remove unnecessary clone (thanks clippy)
2020-04-27 05:26:02 +12:00
2ac501f88e Adds drop number of rows command (#1663)
* Fix access to columns with quoted names

* Add a drop number of rows from end of table
2020-04-26 18:34:45 +12:00
df90d9e4b6 Fix access to columns with quoted names (#1662) 2020-04-26 18:01:55 +12:00
ad7a3fd908 Add not-in: operator (#1661) 2020-04-26 17:32:17 +12:00
ad8ab5b04d Add from-eml command (#1656)
* from-eml initial ver

* Adding tests for `from-eml`

* Add eml to prepares_and_decorates_filesystem_source_files

* Sort the file order

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-04-26 16:26:35 +12:00
e7767ab7b3 suppress the parser error for an insufficient number of required arguments if the named keyword argument 'help' is given (#1659) 2020-04-26 04:45:45 +12:00
846a779516 Fix cd'ing to symlinked directories (#1651)
* fix: absolutize path against its parent if it was a symlink.

On Linux this happens because Rust calls readlink but doesn't canonicalize the resultant path.

* feat: playground function to create symlinks

* fix: use playground dirs

* feat: test for #1631, shift tests names
2020-04-25 18:09:00 +12:00
e7a4f31b38 Docs: Mention how to use str --find-replace in the docs. (#1653)
Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-04-25 18:07:38 +12:00
10768b6ecf str plugin can capitalize and trim strings. (#1652)
* Str plugin can capitalize.

* Str plugin can trim.
2020-04-24 16:37:58 -05:00
716c4def03 Add key_timeout config option (#1649) 2020-04-25 05:28:38 +12:00
0e510ad42b Values can be used as keys for grouping. (#1648) 2020-04-23 20:55:18 -05:00
3c222916c6 [docs]: How to run tests. (#1647)
Co-authored-by: Christoph Siedentop <christoph@siedentop.name>
2020-04-24 12:20:55 +12:00
6887554888 [docs/enter] Warn about enter opening multiple shells (#1645)
Opening a JSON file with a top-level list opens one shell per list element. This can be extremely confusing to unsuspecting users.
2020-04-24 12:17:11 +12:00
d7bd77829f Bump rustyline (#1644)
* Bump rustyline

* Fix new clippy warnings

* Add pipeline command
2020-04-24 08:00:49 +12:00
9e8434326d Update azure-pipelines.yml 2020-04-24 07:23:56 +12:00
27bff35c79 Update azure-pipelines.yml 2020-04-24 07:21:27 +12:00
e2fae63a42 Update azure-pipelines.yml 2020-04-24 07:04:10 +12:00
701711eada Be more resilient with startup lines (#1642) 2020-04-23 17:22:01 +12:00
9ec2aca86f Support completion for paths with multiple dots (#1640)
* refactor: expand_path and expand_ndots now work for any string.

* refactor: refactor test and add new ones.

* refactor: convert expanded to owned string

* feat: pub export of expand_ndots

* feat: add completion for ndots in fs-shell
2020-04-23 16:17:38 +12:00
818171cc2c Make default completion mode OS dependent (#1636) 2020-04-23 10:53:40 +12:00
b3c623396f Fix to-csv command (#1635)
* Fix --headerless option of to-csv and to-tsv

Before to-csv --headerless split the "headerfull" output on lines,
skipped the first, and then concatenated them together. That meant
that all remaining output would be put on the same line, without
any delimiter, making it unusable. Now we replace the range of the
first line with the empty string, maintaining the rest of the
output unchanged.

* Remove extra space in indentation of to_delimited_data

* Add --separator <string> argument to to-csv

This functionality was present before, but wasn't exposed
to the command.
2020-04-23 05:08:53 +12:00
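A tiny sketch of the idea behind the `--headerless` fix above (assumed helper name, not the code in the PR): blank out everything up to and including the first newline instead of re-joining the remaining lines.

```rust
// Sketch: remove the header line from rendered CSV/TSV output while keeping
// the remaining lines (and their newlines) untouched.
fn drop_header_line(output: &str) -> String {
    match output.find('\n') {
        Some(end_of_header) => output[end_of_header + 1..].to_string(),
        None => String::new(),
    }
}

#[test]
fn keeps_remaining_lines_separate() {
    let rendered = "a,b\n1,2\n3,4\n";
    assert_eq!(drop_header_line(rendered), "1,2\n3,4\n");
}
```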
88f06c81b2 Update Cargo.toml 2020-04-22 06:34:32 +12:00
e6b315f05b Bump ichwh to version 0.3.4 (#1627)
Fixes #1626 and [this ichwh issue](https://gitlab.com/avandesa/ichwh-rs/-/issues/3)
2020-04-22 05:37:14 +12:00
01ef6b0732 Add config to disable table index column (#1623) 2020-04-21 18:09:22 +12:00
c7e11a5a28 bump to 0.13.0 (#1625) 2020-04-21 17:01:03 +12:00
ce0231049e Fix the panic running a bad alias (#1624) 2020-04-21 16:11:34 +12:00
0f7b270740 Add exit code verification (#1622) 2020-04-21 15:14:18 +12:00
72cf57dd99 Add booleans and fix semicolon shortcircuit (#1620) 2020-04-21 12:30:01 +12:00
e4fdb36511 External vars (#1615)
* fix empty table and missing spans

* wip

* WIP

* WIP

* working version with vars

* tidying

* WIP

* Fix external quoting issue
2020-04-21 09:45:11 +12:00
2ffb14c7d0 fix empty table and missing spans (#1614) 2020-04-20 19:44:19 +12:00
eec94e4016 Semicolon (#1613)
* WIP on blocks

* Getting further

* add some tests
2020-04-20 18:41:51 +12:00
6412bfd58d Move alias arguments to optional arguments (#1612)
This will allow people to use more variables to capture more, if necessary, without having to implement required/optional/rest
2020-04-20 16:04:40 +12:00
522a828687 Move external closer to internal (#1611)
* Refactor InputStream and affected commands.

First, making `values` private and leaning on the `Stream` implementation makes
consumes of `InputStream` less likely to have to change in the future, if we
change what an `InputStream` is internally.

Second, we're dropping `Option<InputStream>` as the input to pipelines,
internals, and externals. Instead, `InputStream.is_empty` can be used to check
for "emptiness". Empty streams are typically only ever used as the first input
to a pipeline.

* Add run_external internal command.

We want to push external commands closer to internal commands, eventually
eliminating the concept of "external" completely. This means we can consolidate
a couple of things:

- Variable evaluation (for example, `$it`, `$nu`, alias vars)
- Behaviour of whole stream vs per-item external execution

It should also make it easier for us to start introducing argument signatures
for external commands,

* Update run_external.rs

* Update run_external.rs

* Update run_external.rs

* Update run_external.rs

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-04-20 15:30:44 +12:00
6b8c6dec0e Bump ichwh minimum version to 0.3.3 (#1607)
The new version includes a fix for [this issue]

[this issue]: https://gitlab.com/avandesa/ichwh-rs/-/issues/2
2020-04-19 13:38:14 +12:00
2b0212880e Simplify cp command and allow multiple recursive copying (#1580)
* Better message error

* Use custom canonicalize in FileStructure build

* Better glob error in ls

* Use custom canonicalize, remove some duplicate code in cd.

* Enable recursive copying with patterns.

* Change test to fit new error message

* Test recursive with glob pattern

* Show that not matches were found in cp

* Fix typo in message error

* Change old canonicalize usage, follow newest changes
2020-04-19 11:05:24 +12:00
a16a91ede8 Fix precedence parsing of parens. Limit use (#1606) 2020-04-19 06:39:06 +12:00
c2a9bc3bf4 Add better docs to parser (#1604) 2020-04-19 05:30:40 +12:00
e5a79d09df Fix the versions up a bit to prevent breakage (#1602)
* Fix the versions up a bit to prevent breakage

* fix up the quickcheck test to not fail on parse error
2020-04-18 15:31:57 +12:00
7974e09eeb Math operators (#1601)
* Add some math operations

* WIP for adding compound expressions

* precedence parsing

* paren expressions

* better lhs handling

* add compound comparisons and shorthand lefthand parsing

* Add or comparison and shorthand paths
2020-04-18 13:50:58 +12:00
52d2d2b888 A random set of fixes (#1600) 2020-04-17 18:19:49 +12:00
ee778d2b03 WIP fix for the error bubbling (#1597) 2020-04-16 16:25:24 +12:00
928188b18e Redesign custom canonicalize, enable multiple dots expansion earlier. (#1588)
* Expand n dots early where tilde was also expanded.

* Remove normalize, not needed.
New function absolutize; it doesn't follow links nor check existence.
Renamed canonicalize_existing to canonicalize, works as expected.

* Remove normalize usages, change canonicalize.

* Treat strings as paths
2020-04-16 11:29:22 +12:00
59d516064c Add alias support to scripts and -c (#1593) 2020-04-16 05:50:35 +12:00
bd5836e25d Aliases (#1589)
* WIP getting scopes right

* finish adding initial support

* Finish with alias and add startup commands
2020-04-15 17:43:23 +12:00
e3da037b80 Canonical expr block (#1584)
* Add the canonical form for an expression block

* Remove commented out code
2020-04-14 06:59:33 +12:00
08a09e2273 Pipeline blocks (#1579)
* Making Commands match what UntaggedValue needs

* WIP

* WIP

* WIP

* Moved to expressions for conditions

* Add 'each' command to use command blocks

* More cleanup

* Add test for 'each'

* Instead use an expression block
2020-04-13 19:59:57 +12:00
85d6b24be3 Add more cmd builtins (#1583) 2020-04-13 13:46:59 +12:00
ed583bd79b Bump to latest ichwh (#1582) 2020-04-13 08:01:23 +12:00
e0fc09ac52 move rustyline to 6.1.1 to fix windows crash (#1581) 2020-04-13 07:01:44 +12:00
38b2846024 Split canonicalize function in two for missing and existing behavior (#1576)
* Split allow missing logic in two functions

* Replace use of old canonicalize
2020-04-12 20:33:38 +12:00
57c62de66f Remove erronous GPL license header (#1578) 2020-04-12 20:16:50 +12:00
dd4935fb23 Add quickcheck (#1574)
* Move uptime to being a duration value

* Adds our first quickcheck test
2020-04-12 07:05:59 +12:00
18dd009ca8 Unified path expansion under new module and better canonicalize (#1571)
* New 'path' module under nu-cli.
Added normalize and canonicalize method.
Added some unit tests.

* Replace old usages of normalize and canonicalize.

* Fix reading symlinks and existence logic.

* Better explained
2020-04-12 07:05:29 +12:00
c0dda36217 Update CONTRIBUTING.md 2020-04-12 06:54:16 +12:00
75b72f844e Create CONTRIBUTING.md (#998)
* Create CONTRIBUTING.md

* mention gitpod

* Update CONTRIBUTING.md

* mention community supports

* fix capitalization issues
2020-04-12 06:52:53 +12:00
fbddc12c02 Move uptime to being a duration value (#1573) 2020-04-11 19:40:56 +12:00
8e7e8c17e1 Make trash support optional (#1572) 2020-04-11 18:53:53 +12:00
8ac9d781fd Remove source text where not needed (#1567) 2020-04-10 19:56:48 +12:00
c86cf31aac some minor improvements and removing dead code (#1563) 2020-04-10 07:48:10 +12:00
2c513d1883 More dep bumps (#1562) 2020-04-09 10:28:20 +12:00
04702530a3 Bump a lot of deps (#1560) 2020-04-07 19:51:17 +12:00
c9f424977e actually bump version (#1559) 2020-04-07 07:18:47 +12:00
183c8407de fix nu variable. tweak shells (#1558) 2020-04-07 05:30:54 +12:00
d0618b0b32 Enable the use of multiple dots in FS Shell commands (#1547)
Every dot after `..` means another parent directory.
2020-04-06 07:28:56 -04:00
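A rough sketch of that expansion rule (assumed function name, not the nushell implementation): a token of three or more dots becomes a chain of `..` components.

```rust
// Sketch: "..." -> "../..", "...." -> "../../..", and so on.
// Tokens that aren't all dots (or are just "." / "..") pass through unchanged.
fn expand_ndots(token: &str) -> String {
    let is_all_dots = !token.is_empty() && token.chars().all(|c| c == '.');
    if is_all_dots && token.len() > 2 {
        vec![".."; token.len() - 1].join("/")
    } else {
        token.to_string()
    }
}

#[test]
fn expands_extra_dots() {
    assert_eq!(expand_ndots("..."), "../..");
    assert_eq!(expand_ndots("...."), "../../..");
    assert_eq!(expand_ndots(".."), "..");
}
```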
c4daa2e40f Add experimental new parser (#1554)
Move to an experimental new parser
2020-04-06 19:16:14 +12:00
0a198b9bd0 Have FilesystemShell#rm correctly delete symlinks (#1550) 2020-04-05 17:17:52 -04:00
6a604491f5 ci(docker-publish): force remove non-executable (#1540) 2020-04-02 04:20:18 +13:00
791f7dd9c3 Bump to 0.12.0 (#1538) 2020-04-01 06:25:21 +13:00
a4c1b092ba Add configurations for table headers (#1537)
* Add configurations for table headers

* lint

Co-authored-by: Amanita-Muscaria <nope>
2020-03-31 12:19:48 +13:00
6e71c1008d Change get to remove blanks (#1534)
Remove blank values when getting a column of values
2020-03-30 15:36:21 +13:00
906d0b920f A few improvements to du implementation: (#1533)
1. Fixed a bug where `--all` wasn't showing files at the root directories.
2. More use of `Result`'s `map` and `map_err` methods.
3. Making tables be homogeneous so one can, for example, `get directories`.
2020-03-29 21:16:09 -04:00
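In the spirit of item 2 above, a generic (not du-specific) example of trading a `match` for `map`/`map_err` on `Result`; the function and error format here are assumptions for illustration.

```rust
use std::fs;
use std::path::Path;

// Sketch: `map` transforms the success value, `map_err` transforms the error,
// so the whole lookup reads as a small pipeline over the Result.
fn file_len(path: &Path) -> Result<u64, String> {
    fs::metadata(path)
        .map(|meta| meta.len())
        .map_err(|err| format!("{}: {}", path.display(), err))
}
```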
efbf4f48c6 Fix poor message for executable that user doesn't have permissi… (#1535)
Previously, if the user didn't have the appropriate permissions to execute the
binary/script, they would see "command not found", which is confusing.

This commit eliminates the `which` crate in favour of `ichwh`, which deals
better with permissions by not dealing with them at all! This is closer to the
behaviour of `which` in many shells. Permission checks are then left up to the
caller to deal with.
2020-03-29 21:15:55 -04:00
2ddab3e8ce Some small improvements to du readability.
Mostly, making more use of `map` and `map_err` in `Result`. One benefit
is that at least one location had duplicated logic for how to map the
error, which is no longer the case after this commit.
2020-03-29 17:03:01 -04:00
35dc7438a5 Make use of interruptible stream in various places 2020-03-29 17:03:01 -04:00
2a54ee0c54 Introduce InterruptibleStream type.
An interruptible stream can query an `AtomicBool`. If that bool is true,
the stream will no longer produce any values.

Also introducing the `Interruptible` trait, which extends any `Stream`
with the `interruptible` function, to simplify the construction and
allow chaining.
2020-03-29 17:03:01 -04:00
cad2741e9e Split input and output streams into separate modules 2020-03-29 17:03:01 -04:00
ae5f3c8210 WIP: 1486/first row as headers (#1530)
* headers plugin

* Remove plugin

* Add non-functioning headers command

* Add ability to extract headers from first row

* Refactor header extraction

* Rebuild indexmap with proper headers

* Rebuild result properly

* Compiling, probably wrapped too much?

* Refactoring

* Deal with case of empty header cell

* Deal with case of empty header cell

* Fix formatting

* Fix linting, attempt 2.

* Move whole_stream_command(Headers) to more appropriate section

* ... more linting

* Return Err(ShellError...) instead of panic, yield each row instead of entire table

* Insert Column[index] if no header info is found.

* Update error description

* Add initial test

* Add tests for headers command

* Lint test cases in headers

* Change ShellError for headers, Add sample_headers file to utils.rs

* Add empty sheet to test file

* Revert "Add empty sheet to test file"

This reverts commit a4bf38a31d.

* Show error message when given empty table
2020-03-29 15:05:57 +13:00
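A stripped-down sketch of the header-extraction idea from this change (the `Column[index]` fallback is mentioned in the bullets above; the function shape is assumed):

```rust
// Sketch: turn the first row of a table into column headers, substituting
// "Column<n>" when a header cell is empty.
fn extract_headers(first_row: &[String]) -> Vec<String> {
    first_row
        .iter()
        .enumerate()
        .map(|(i, cell)| {
            if cell.trim().is_empty() {
                format!("Column{}", i)
            } else {
                cell.clone()
            }
        })
        .collect()
}
```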
a5e97ca549 Respect CARGO_TARGET_DIR when set (#1528)
This makes the `binaries` function respect the `CARGO_TARGET_DIR` environment variable when set. If it's not present it falls back to the regular target directory used by Cargo.
2020-03-27 17:13:59 -04:00
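A hedged sketch of the lookup described above (the function name and the `./target` fallback are assumptions):

```rust
use std::env;
use std::path::PathBuf;

// Sketch: honor CARGO_TARGET_DIR when it is set, otherwise fall back to
// the conventional ./target directory used by Cargo.
fn target_dir() -> PathBuf {
    env::var_os("CARGO_TARGET_DIR")
        .map(PathBuf::from)
        .unwrap_or_else(|| PathBuf::from("target"))
}
```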
06f87cfbe8 Add support for removing multiple files at once (#1526) 2020-03-25 16:19:01 -04:00
d4e78c6f47 Improve the rotated row wrap (#1524) 2020-03-25 06:27:16 +13:00
3653400ebc testing fix to matrix to define all variables (#1522)
there is currently a bug with invalid syntax for some of the
docker build steps, and I think this is because there are build
variables in the matrix that are not defined. This PR will
attempt to resolve this issue by defining all missing variables
for each row in the matrix.

Signed-off-by: vsoch <vsochat@stanford.edu>
2020-03-24 16:20:39 +13:00
81a48d6d0e Fix '/' and '..' not being valid mv targets (#1519)
* Fix '/' and '..' not being valid mv targets

If `/` or `../` is specified as the destination for `mv`, it will fail with an error message saying it's not a valid destination. This fixes it to account for the fact that `Path::file_name` returns `None` when the file name evaluates to `/` or `..`. It will only take the slow(er) path if `Path::file_name` returns `None` in its initial check.

Fixes #1291

* Add test
2020-03-24 14:00:48 +13:00
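The `Path::file_name` behaviour the fix leans on can be seen in a few lines (standard library only; illustrative, not code from the PR):

```rust
use std::ffi::OsStr;
use std::path::Path;

fn main() {
    // file_name() is None for "/" and for paths ending in "..",
    // which is why "/" and "../" need the extra handling described above.
    assert_eq!(Path::new("/").file_name(), None);
    assert_eq!(Path::new("../").file_name(), None);
    assert_eq!(Path::new("foo/bar.txt").file_name(), Some(OsStr::new("bar.txt")));
    println!("ok");
}
```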
f030ab3f12 Add experimental auto-rotate (#1516) 2020-03-23 09:55:30 +13:00
0dc0c6a10a Add quickstart option to Docker section in README (#1515) 2020-03-23 09:18:50 +13:00
53c8185af3 Fixes the crash for ps --full in Windows (#1514)
* Fixes the crash for `ps --full` in Windows

* Update ps.rs
2020-03-23 08:28:02 +13:00
36b5d063c1 Simplify and improve listing for which. (#1510)
* Simplified implementation
* Show executables, even if the current user doesn't have permissions to
  execute them.
2020-03-22 09:11:39 -04:00
a7ec00a037 Add documentation for from-ics and from-vcf (#1509) 2020-03-21 14:50:13 +13:00
918822ae0d Fix numeric comparison with nothing (#1508) 2020-03-21 11:02:49 +13:00
ab5e24a0e7 WIP: Add vcard/ical support (#1504)
* Initial from-ical implementation

* Initial from-vcard implementation

* Rename from-ics and from-vcf for autoconvert

* Remove redundant clones

* Add from-vcf and from-ics tests

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-03-21 08:35:09 +13:00
b5ea522f0e Add a --full mode to ps (#1507)
* Add a --full mode to ps

* Use a slightly older heim
2020-03-20 20:53:49 +13:00
afa963fd50 Add is_dir check to auto-cd (#1506)
* Add markdown output

* Add is_dir() check
2020-03-20 16:57:36 +13:00
1e343ff00c Add markdown output (#1503) 2020-03-20 08:18:24 +13:00
21a543a901 Make sum plugin as internal command. (#1501) 2020-03-18 18:46:00 -05:00
390deb4ff7 Windows needs to remember auto-cd paths when changing drives (#1500)
* Windows needs to remember auto-cd paths when changing drives

* Windows needs to remember auto-cd paths when changing drives
2020-03-18 15:10:45 +13:00
1c4cb30d64 Add documentation for skip and skip-while (#1499) 2020-03-18 14:22:35 +13:00
1ec2ec72b5 Add automatic change directory (#1496)
* Allow automatic cd in cli mode

* Set correct priority for auto-cd and add test
2020-03-18 07:13:38 +13:00
0d244a9701 Open fails silently, fix #1493 (#1495)
* Fix #1493

The error was wrongfully discarded

* Run cargo fmt
2020-03-17 17:40:04 +13:00
b36d21e76f Infer types from regular delimited plain text unstructured files. (#1494)
* Infer types from regular delimited plain text unstructured files.

* Nothing resolves to an empty string.
2020-03-16 15:50:45 -05:00
d8c4565413 Csv errors (#1490)
* Add error message for csv parsing failures

* Add csv error prettyfier

* Improve readability of the error

Line 2: error is easier to understand than:
Line 2, error

* Remove unnecessary use of the format! macro

Replacing it with .to_string() fixes a clippy warning

* Improve consistency with JSON parsing errors
2020-03-16 12:32:02 -05:00
22ba4c2a2f Add svg support to to-html (#1492) 2020-03-16 20:19:18 +13:00
8d19b21b9f Custom canonicalize method on Filesystem Shell. (#1485)
* Custom canonicalize method for FilesystemShell.

* Use custom canonicalize method.
Fixed missing import.

* Move function body to already impl body.

* Create test that aims to resolve.
2020-03-16 19:28:18 +13:00
45a3afdc79 Update Cargo.toml 2020-03-16 06:12:28 +13:00
2d078849cb Add simple to-html output and bump version (#1487) 2020-03-15 16:04:44 +13:00
b6363f3ce1 Added new flag '--all/-a' for Ls, also refactor some code (#1483)
* Utility function to detect hidden folders.
Implemented for Unix and Windows.

* Rename function argument.

* Revert "Rename function argument."

This reverts commit e7ab70f0f0.

* Add flag '--all/-a' to Ls

* Rename function argument.

* Check if flag '--all/-a' is present and path is hidden.
Replace match with map_err for glob result.
Remove redundancy in stream body.
Included comments on new stream body.
Replace async_stream::stream with async_stream::try_stream.
Minor tweaks to is_empty_dir.
Fix and refactor is_hidden_dir.

* Fix "implicit" bool coerse

* Fixed clippy errors
2020-03-14 06:27:04 +13:00
5ca9e12b7f Fix whitespace and typos (#1481)
* Remove EOL whitespace in files other than docs

* Break paragraphs into lines

See http://rhodesmill.org/brandon/2012/one-sentence-per-line/ for the rationale

* Fix various typos

* Remove EOL whitespace in docs/commands/*.md
2020-03-14 06:23:41 +13:00
5b0b2f1ddd Fixes #1204 : sys | get host.users displays the same user (#1480)
account twice while only one exists (macOS)

- renamed host.users to host.sessions
2020-03-12 14:01:55 +13:00
3afb53b8ce fix typo in calc command documentation (#1477)
minimumum -> minimum
2020-03-11 11:20:22 -04:00
692 changed files with 43915 additions and 26643 deletions

View File

@ -1,11 +1,14 @@
 trigger:
-- master
+- main
 strategy:
   matrix:
     linux-stable:
       image: ubuntu-18.04
       style: 'unflagged'
+    linux-minimal:
+      image: ubuntu-18.04
+      style: 'minimal'
     macos-stable:
       image: macos-10.14
       style: 'unflagged'
@ -37,26 +40,30 @@ steps:
     fi
     if [ "$(uname)" == "Darwin" ]; then
       curl https://sh.rustup.rs -sSf | sh -s -- -y --no-modify-path --default-toolchain "stable"
+      echo "Installing clippy"
+      rustup component add clippy --toolchain stable-x86_64-apple-darwin
       export PATH=$HOME/.cargo/bin:$PATH
     fi
-    rustup update
-    rustc -Vv
-    echo "##vso[task.prependpath]$HOME/.cargo/bin"
-    rustup component add rustfmt
+    # rustup update
+    # rustc -Vv
+    # echo "##vso[task.prependpath]$HOME/.cargo/bin"
+    # rustup component add rustfmt
   displayName: Install Rust
-- bash: RUSTFLAGS="-D warnings" cargo test --all --features stable,test-bins
+- bash: RUSTFLAGS="-D warnings" cargo test --all --features stable
   condition: eq(variables['style'], 'unflagged')
   displayName: Run tests
 - bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
   condition: eq(variables['style'], 'unflagged')
   displayName: Check clippy lints
-- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo test --all --features stable,test-bins
+- bash: RUSTFLAGS="-D warnings" cargo test --all --features stable
   condition: eq(variables['style'], 'canary')
   displayName: Run tests
-- bash: NUSHELL_ENABLE_ALL_FLAGS=1 RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
+- bash: RUSTFLAGS="-D warnings" cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
   condition: eq(variables['style'], 'canary')
   displayName: Check clippy lints
+- bash: RUSTFLAGS="-D warnings" cargo test --all --no-default-features
+  condition: eq(variables['style'], 'minimal')
+  displayName: Run tests
 - bash: cargo fmt --all -- --check
   condition: eq(variables['style'], 'fmt')
   displayName: Lint

View File

@ -1,3 +0,0 @@
-[build]
-
-#rustflags = ["--cfg", "coloring_in_tokens"]

View File

@ -27,7 +27,7 @@ orbs:
 workflows:
   version: 2.0
-  # This builds on all pull requests to test, and ignores master
+  # This builds on all pull requests to test, and ignores main
   build_without_deploy:
     jobs:
       - docker/publish:
@ -39,7 +39,7 @@ workflows:
           filters:
             branches:
               ignore:
-                - master
+                - main
           before_build:
             - pull_cache
           after_build:
@ -98,11 +98,11 @@ workflows:
               docker push quay.io/nushell/nu
-  # publish devel to Docker Hub on merge to master (doesn't build --release)
+  # publish devel to Docker Hub on merge to main (doesn't build --release)
   build_with_deploy_devel:
     jobs:
-      # Deploy devel tag on merge to master
+      # Deploy devel tag on merge to main
       - docker/publish:
           image: nushell/nu-base
           registry: quay.io
@ -113,7 +113,7 @@ workflows:
             - pull_cache
           filters:
             branches:
-              only: master
+              only: main
           after_build:
             - run:
                 name: Build Multistage (smaller) container
@ -137,7 +137,7 @@ workflows:
       filters:
         branches:
           only:
-            - master
+            - main
     jobs:
       - docker/publish:
           image: nushell/nu-base

View File

@ -23,8 +23,8 @@ A clear and concise description of what you expected to happen.
 If applicable, add screenshots to help explain your problem.

 **Configuration (please complete the following information):**
-- OS: [e.g. Windows]
-- Version [e.g. 0.4.0]
-- Optional features (if any)
+- OS (e.g. Windows):
+- Nu version (you can use the `version` command to find out):
+- Optional features (if any):

 Add any other context about the problem here.

View File

@ -13,7 +13,7 @@ jobs:
         - x86_64-unknown-linux-musl
         - x86_64-unknown-linux-gnu
     steps:
-      - uses: actions/checkout@v1
+      - uses: actions/checkout@v2
       - name: Install rust-embedded/cross
         env: { VERSION: v0.1.16 }
         run: >-
@ -24,7 +24,7 @@ jobs:
         run: |
           cross build --target ${{ matrix.arch }} --release
           # leave only the executable file
-          rm -rd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
+          rm -frd target/${{ matrix.arch }}/release/{*/*,*.d,*.rlib,.fingerprint}
           find . -empty -delete
       - uses: actions/upload-artifact@master
         with:
@ -52,17 +52,17 @@ jobs:
           - glibc
           - musl
         include:
-          - { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true }
-          - { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true }
-          - { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true }
-          - { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, use-patch: true }
-          - { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, }
-          - { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, }
-          - { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, use-patch: true }
-          - { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, }
-          - { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, }
+          - { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
+          - { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
+          - { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
+          - { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
+          - { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
+          - { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
+          - { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
+          - { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
+          - { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
     steps:
-      - uses: actions/checkout@v1
+      - uses: actions/checkout@v2
       - uses: actions/download-artifact@master
         with: { name: '${{ matrix.arch }}', path: target/release }
       - name: Build and publish exact version

.github/workflows/release.yml
View File

@ -0,0 +1,286 @@
name: Create Release Draft
on:
push:
tags: ['[0-9]+.[0-9]+.[0-9]+*']
jobs:
linux:
name: Build Linux
runs-on: ubuntu-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Install libxcb
run: sudo apt-get install libxcb-composite0-dev
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Copy files to output
run: |
cp target/release/nu target/release/nu_plugin_* output/
cp README.build.txt output/README.txt
cp LICENSE output/LICENSE
rm output/*.d
rm output/nu_plugin_core_*
rm output/nu_plugin_stable_*
# Note: If OpenSSL changes, this path will need to be updated
- name: Copy OpenSSL to output
run: cp /usr/lib/x86_64-linux-gnu/libssl.so.1.1 output/
- name: Upload artifact
uses: actions/upload-artifact@v2
with:
name: linux
path: output/*
macos:
name: Build macOS
runs-on: macos-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Copy files to output
run: |
cp target/release/nu target/release/nu_plugin_* output/
cp README.build.txt output/README.txt
cp LICENSE output/LICENSE
rm output/*.d
rm output/nu_plugin_core_*
rm output/nu_plugin_stable_*
- name: Upload artifact
uses: actions/upload-artifact@v2
with:
name: macos
path: output/*
windows:
name: Build Windows
runs-on: windows-latest
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Set up cargo
uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: stable
override: true
- name: Add cargo-wix subcommand
uses: actions-rs/cargo@v1
with:
command: install
args: cargo-wix
- name: Build
uses: actions-rs/cargo@v1
with:
command: build
args: --release --all --features=stable
- name: Create output directory
run: mkdir output
- name: Download Less Binary
run: Invoke-WebRequest -Uri "https://github.com/jftuga/less-Windows/releases/download/less-v562.0/less.exe" -OutFile "target\release\less.exe"
- name: Download Less License
run: Invoke-WebRequest -Uri "https://raw.githubusercontent.com/jftuga/less-Windows/master/LICENSE" -OutFile "target\release\LICENSE-for-less.txt"
- name: Copy files to output
run: |
cp target\release\nu.exe output\
cp LICENSE output\
cp target\release\LICENSE-for-less.txt output\
rm target\release\nu_plugin_core_*.exe
rm target\release\nu_plugin_stable_*.exe
cp target\release\nu_plugin_*.exe output\
cp README.build.txt output\README.txt
cp target\release\less.exe output\
# Note: If the version of `less.exe` needs to be changed, update this URL
# Similarly, if `less.exe` is checked into the repo, copy from the local path here
# moved this stuff down to create wix after we download less
- name: Create msi with wix
uses: actions-rs/cargo@v1
with:
command: wix
args: --no-build --nocapture --output target\wix\nushell-windows.msi
- name: Upload installer
uses: actions/upload-artifact@v2
with:
name: windows-installer
path: target\wix\nushell-windows.msi
- name: Upload zip
uses: actions/upload-artifact@v2
with:
name: windows-zip
path: output\*
release:
name: Publish Release
runs-on: ubuntu-latest
needs:
- linux
- macos
- windows
steps:
- name: Check out code
uses: actions/checkout@v2
- name: Determine Release Info
id: info
env:
GITHUB_REF: ${{ github.ref }}
run: |
VERSION=${GITHUB_REF##*/}
MAJOR=${VERSION%%.*}
MINOR=${VERSION%.*}
MINOR=${MINOR#*.}
PATCH=${VERSION##*.}
echo "::set-output name=version::${VERSION}"
echo "::set-output name=linuxdir::nu_${MAJOR}_${MINOR}_${PATCH}_linux"
echo "::set-output name=macosdir::nu_${MAJOR}_${MINOR}_${PATCH}_macOS"
echo "::set-output name=windowsdir::nu_${MAJOR}_${MINOR}_${PATCH}_windows"
echo "::set-output name=innerdir::nushell-${VERSION}"
- name: Create Release Draft
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: ${{ steps.info.outputs.version }} Release
draft: true
- name: Create Linux Directory
run: mkdir -p ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
- name: Download Linux Artifacts
uses: actions/download-artifact@v2
with:
name: linux
path: ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}
- name: Restore Linux File Modes
run: |
chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/nu*
chmod 755 ${{ steps.info.outputs.linuxdir }}/${{ steps.info.outputs.innerdir }}/libssl*
- name: Create Linux tarball
run: tar -zcvf ${{ steps.info.outputs.linuxdir }}.tar.gz ${{ steps.info.outputs.linuxdir }}
- name: Upload Linux Artifact
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.linuxdir }}.tar.gz
asset_name: ${{ steps.info.outputs.linuxdir }}.tar.gz
asset_content_type: application/gzip
- name: Create macOS Directory
run: mkdir -p ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
- name: Download macOS Artifacts
uses: actions/download-artifact@v2
with:
name: macos
path: ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}
- name: Restore macOS File Modes
run: chmod 755 ${{ steps.info.outputs.macosdir }}/${{ steps.info.outputs.innerdir }}/nu*
- name: Create macOS Archive
run: zip -r ${{ steps.info.outputs.macosdir }}.zip ${{ steps.info.outputs.macosdir }}
- name: Upload macOS Artifact
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.macosdir }}.zip
asset_name: ${{ steps.info.outputs.macosdir }}.zip
asset_content_type: application/zip
- name: Create Windows Directory
run: mkdir -p ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
- name: Download Windows zip
uses: actions/download-artifact@v2
with:
name: windows-zip
path: ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
- name: Show Windows Artifacts
run: ls -la ${{ steps.info.outputs.windowsdir }}/${{ steps.info.outputs.innerdir }}
- name: Create macOS Archive
run: zip -r ${{ steps.info.outputs.windowsdir }}.zip ${{ steps.info.outputs.windowsdir }}
- name: Upload Windows zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ steps.info.outputs.windowsdir }}.zip
asset_name: ${{ steps.info.outputs.windowsdir }}.zip
asset_content_type: application/zip
- name: Download Windows installer
uses: actions/download-artifact@v2
with:
name: windows-installer
path: ./
- name: Upload Windows installer
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./nushell-windows.msi
asset_name: ${{ steps.info.outputs.windowsdir }}.msi
asset_content_type: applictaion/x-msi

.gitignore
View File

@ -11,3 +11,12 @@ debian/debhelper-build-stamp
 debian/files
 debian/nu.substvars
 debian/nu/
+
+# macOS junk
+.DS_Store
+
+# JetBrains' IDE items
+.idea/*
+
+# VSCode's IDE items
+.vscode/*

.gitpod.Dockerfile
View File

@ -6,4 +6,9 @@ RUN sudo apt-get update && \
     sudo apt-get install -y \
     libssl-dev \
     libxcb-composite0-dev \
-    pkg-config
+    pkg-config \
+    libpython3.6 \
+    rust-lldb \
+    && sudo rm -rf /var/lib/apt/lists/*
+ENV RUST_LLDB=/usr/bin/lldb-8

View File

@ -1,23 +1,19 @@
 image:
   file: .gitpod.Dockerfile
 tasks:
-  - init: cargo install --path . --force --features=stable
+  - name: Clippy
+    init: cargo clippy --all --features=stable -- -D clippy::result_unwrap_used -D clippy::option_unwrap_used
+  - name: Testing
+    init: cargo test --all --features=stable,test-bins
+  - name: Build
+    init: cargo build --features=stable
+  - name: Nu
+    init: cargo install --path . --features=stable
     command: nu
 github:
   prebuilds:
-    # enable for the master/default branch (defaults to true)
-    master: true
-    # enable for all branches in this repo (defaults to false)
     branches: true
-    # enable for pull requests coming from this repo (defaults to true)
-    pullRequests: true
-    # enable for pull requests coming from forks (defaults to false)
     pullRequestsFromForks: true
-    # add a "Review in Gitpod" button as a comment to pull requests (defaults to true)
-    addComment: true
-    # add a "Review in Gitpod" button to pull requests (defaults to false)
-    addBadge: false
-    # add a label once the prebuild is ready to pull requests (defaults to false)
     addLabel: prebuilt-in-gitpod
 vscode:
   extensions:
@ -25,5 +21,5 @@ vscode:
     - Swellaby.vscode-rust-test-adapter@0.11.0:Xg+YeZZQiVpVUsIkH+uiiw==
     - serayuzgur.crates@0.4.7:HMkoguLcXp9M3ud7ac3eIw==
     - belfz.search-crates-io@1.2.1:kSLnyrOhXtYPjQpKnMr4eQ==
-    - vadimcn.vscode-lldb@1.4.5:lwHCNwtm0kmOBXeQUIPGMQ==
     - bungcip.better-toml@0.3.2:3QfgGxxYtGHfJKQU7H0nEw==
+    - webfreak.debug@0.24.0:1zVcRsAhewYEX3/A9xjMNw==

.theia/launch.json
View File

@ -0,0 +1,14 @@
{
"version": "0.2.0",
"configurations": [
{
"type": "gdb",
"request": "launch",
"name": "Debug Rust Code",
"preLaunchTask": "cargo",
"target": "${workspaceFolder}/target/debug/nu",
"cwd": "${workspaceFolder}",
"valuesFormatting": "parseText"
}
]
}

.theia/tasks.json
View File

@ -0,0 +1,12 @@
{
"tasks": [
{
"command": "cargo",
"args": [
"build"
],
"type": "process",
"label": "cargo",
}
],
}

View File

@ -55,7 +55,7 @@ a project may be further defined and clarified by project maintainers.
 ## Enforcement

 Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported by contacting the project team at wycats@gmail.com. All
+reported by contacting the project team at wycats@gmail.com via email or by reaching out to @jturner, @gedge, or @andras_io on discord. All
 complaints will be reviewed and investigated and will result in a response that
 is deemed necessary and appropriate to the circumstances. The project team is
 obligated to maintain confidentiality with regard to the reporter of an incident.
@ -68,9 +68,9 @@ members of the project's leadership.
 ## Attribution

 This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
-available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
+available at <https://www.contributor-covenant.org/version/1/4/code-of-conduct.html>

 [homepage]: https://www.contributor-covenant.org

 For answers to common questions about this code of conduct, see
-https://www.contributor-covenant.org/faq
+<https://www.contributor-covenant.org/faq>

CONTRIBUTING.md
View File

@ -0,0 +1,62 @@
# Contributing
Welcome to nushell!
*Note: for a more complete guide see [The nu contributor book](https://github.com/nushell/contributor-book)*
For speedy contributions open it in Gitpod, nu will be pre-installed with the latest build in a VSCode like editor all from your browser.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
To get live support from the community see our [Discord](https://discordapp.com/invite/NtAbbGn), [Twitter](https://twitter.com/nu_shell) or file an issue or feature request here on [GitHub](https://github.com/nushell/nushell/issues/new/choose)!
<!--WIP-->
## Developing
### Set up
This is no different than other Rust projects.
```bash
git clone https://github.com/nushell/nushell
cd nushell
cargo build
```
### Useful Commands
Build and run Nushell:
```shell
cargo build --release && cargo run --release
```
Run Clippy on Nushell:
```shell
cargo clippy --all --features=stable
```
Run all tests:
```shell
cargo test --all --features=stable
```
Run all tests for a specific command:
```shell
cargo test --package nu-cli --test main -- commands::<command_name_here>
```
Check to see if there are code formatting issues:
```shell
cargo fmt --all -- --check
```
Format the code in the project:
```shell
cargo fmt --all
```
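The `cargo test --package nu-cli --test main -- commands::<command_name_here>` filter above selects tests by module path inside the `main` integration-test target of `nu-cli`. As a hypothetical sketch of how a per-command test slots into that layout (module and test names here are invented; the real tests additionally lean on the `nu-test-support` crate that appears elsewhere in this diff):

```rust
// Hypothetical layout matching the test filter shown above: the `main` test
// target contains a `commands` module with one submodule per command, so
// `-- commands::my_command` runs only the tests below. Names are illustrative.
mod commands {
    mod my_command {
        #[test]
        fn reports_expected_output() {
            // The real per-command tests build on shared test helpers; this
            // stub only shows the module path the cargo test filter matches.
            assert_eq!(2 + 2, 4);
        }
    }
}
```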

3406
Cargo.lock generated

File diff suppressed because it is too large.

View File

@ -1,16 +1,16 @@
[package] [package]
name = "nu" authors = ["The Nu Project Contributors"]
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "A shell for the GitHub era"
license = "MIT"
edition = "2018"
readme = "README.md"
default-run = "nu" default-run = "nu"
repository = "https://github.com/nushell/nushell" description = "A new type of shell"
homepage = "https://www.nushell.sh"
documentation = "https://www.nushell.sh/book/" documentation = "https://www.nushell.sh/book/"
edition = "2018"
exclude = ["images"] exclude = ["images"]
homepage = "https://www.nushell.sh"
license = "MIT"
name = "nu"
readme = "README.md"
repository = "https://github.com/nushell/nushell"
version = "0.18.1"
[workspace] [workspace]
members = ["crates/*/"] members = ["crates/*/"]
@ -18,100 +18,93 @@ members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies] [dependencies]
nu-cli = { version = "0.11.0", path = "./crates/nu-cli" } nu-cli = {version = "0.18.1", path = "./crates/nu-cli"}
nu-source = { version = "0.11.0", path = "./crates/nu-source" } nu-errors = {version = "0.18.1", path = "./crates/nu-errors"}
nu-plugin = { version = "0.11.0", path = "./crates/nu-plugin" } nu-parser = {version = "0.18.1", path = "./crates/nu-parser"}
nu-protocol = { version = "0.11.0", path = "./crates/nu-protocol" } nu-plugin = {version = "0.18.1", path = "./crates/nu-plugin"}
nu-errors = { version = "0.11.0", path = "./crates/nu-errors" } nu-protocol = {version = "0.18.1", path = "./crates/nu-protocol"}
nu-parser = { version = "0.11.0", path = "./crates/nu-parser" } nu-source = {version = "0.18.1", path = "./crates/nu-source"}
nu-value-ext = { version = "0.11.0", path = "./crates/nu-value-ext" } nu-value-ext = {version = "0.18.1", path = "./crates/nu-value-ext"}
nu_plugin_average = { version = "0.11.0", path = "./crates/nu_plugin_average", optional=true }
nu_plugin_binaryview = { version = "0.11.0", path = "./crates/nu_plugin_binaryview", optional=true }
nu_plugin_fetch = { version = "0.11.0", path = "./crates/nu_plugin_fetch", optional=true }
nu_plugin_inc = { version = "0.11.0", path = "./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.11.0", path = "./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.11.0", path = "./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.11.0", path = "./crates/nu_plugin_ps", optional=true }
nu_plugin_str = { version = "0.11.0", path = "./crates/nu_plugin_str", optional=true }
nu_plugin_sum = { version = "0.11.0", path = "./crates/nu_plugin_sum", optional=true }
nu_plugin_sys = { version = "0.11.0", path = "./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.11.0", path = "./crates/nu_plugin_textview", optional=true }
nu_plugin_tree = { version = "0.11.0", path = "./crates/nu_plugin_tree", optional=true }
nu-macros = { version = "0.11.0", path = "./crates/nu-macros" }
crossterm = { version = "0.16.0", optional = true } nu_plugin_binaryview = {version = "0.18.1", path = "./crates/nu_plugin_binaryview", optional = true}
onig_sys = { version = "=69.1.0", optional = true } nu_plugin_fetch = {version = "0.18.1", path = "./crates/nu_plugin_fetch", optional = true}
semver = { version = "0.9.0", optional = true } nu_plugin_from_bson = {version = "0.18.1", path = "./crates/nu_plugin_from_bson", optional = true}
syntect = { version = "3.2.0", optional = true } nu_plugin_from_sqlite = {version = "0.18.1", path = "./crates/nu_plugin_from_sqlite", optional = true}
nu_plugin_inc = {version = "0.18.1", path = "./crates/nu_plugin_inc", optional = true}
nu_plugin_match = {version = "0.18.1", path = "./crates/nu_plugin_match", optional = true}
nu_plugin_post = {version = "0.18.1", path = "./crates/nu_plugin_post", optional = true}
nu_plugin_ps = {version = "0.18.1", path = "./crates/nu_plugin_ps", optional = true}
nu_plugin_start = {version = "0.18.1", path = "./crates/nu_plugin_start", optional = true}
nu_plugin_sys = {version = "0.18.1", path = "./crates/nu_plugin_sys", optional = true}
nu_plugin_textview = {version = "0.18.1", path = "./crates/nu_plugin_textview", optional = true}
nu_plugin_to_bson = {version = "0.18.1", path = "./crates/nu_plugin_to_bson", optional = true}
nu_plugin_to_sqlite = {version = "0.18.1", path = "./crates/nu_plugin_to_sqlite", optional = true}
nu_plugin_tree = {version = "0.18.1", path = "./crates/nu_plugin_tree", optional = true}
crossterm = {version = "0.17.5", optional = true}
semver = {version = "0.10.0", optional = true}
syntect = {version = "4.2", default-features = false, features = ["default-fancy"], optional = true}
url = {version = "2.1.1", optional = true} url = {version = "2.1.1", optional = true}
clap = "2.33.0" clap = "2.33.1"
ctrlc = "3.1.4" ctrlc = "3.1.4"
dunce = "1.0.0" dunce = "1.0.1"
futures = {version = "0.3", features = ["compat", "io-compat"]} futures = {version = "0.3", features = ["compat", "io-compat"]}
log = "0.4.8" log = "0.4.8"
pretty_env_logger = "0.4.0" pretty_env_logger = "0.4.0"
quick-xml = "0.18.1"
starship = "0.43.0"
[dev-dependencies] [dev-dependencies]
pretty_assertions = "0.6.1" nu-test-support = {version = "0.18.1", path = "./crates/nu-test-support"}
nu-test-support = { version = "0.11.0", path = "./crates/nu-test-support" }
[build-dependencies] [build-dependencies]
serde = {version = "1.0.110", features = ["derive"]}
toml = "0.5.6" toml = "0.5.6"
serde = { version = "1.0.104", features = ["derive"] }
nu-build = { version = "0.11.0", path = "./crates/nu-build" }
[features] [features]
# Test executables default = [
test-bins = [] "sys",
"ps",
default = ["sys", "ps", "textview", "inc", "str"] "textview",
stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard-cli"] "inc",
"git-support",
"directories-support",
"ctrlc-support",
"which-support",
"ptree-support",
"term-support",
"uuid-support",
]
stable = ["default", "binaryview", "match", "tree", "post", "fetch", "clipboard-cli", "trash-support", "start", "starship-prompt", "bson", "sqlite"]
# Default # Default
textview = ["crossterm", "syntect", "onig_sys", "url", "nu_plugin_textview"]
sys = ["nu_plugin_sys"]
ps = ["nu_plugin_ps"]
inc = ["semver", "nu_plugin_inc"] inc = ["semver", "nu_plugin_inc"]
str = ["nu_plugin_str"] ps = ["nu_plugin_ps"]
sys = ["nu_plugin_sys"]
textview = ["crossterm", "syntect", "url", "nu_plugin_textview"]
# Stable # Stable
average = ["nu_plugin_average"]
binaryview = ["nu_plugin_binaryview"] binaryview = ["nu_plugin_binaryview"]
bson = ["nu_plugin_from_bson", "nu_plugin_to_bson"]
fetch = ["nu_plugin_fetch"] fetch = ["nu_plugin_fetch"]
match = ["nu_plugin_match"] match = ["nu_plugin_match"]
post = ["nu_plugin_post"] post = ["nu_plugin_post"]
sum = ["nu_plugin_sum"] sqlite = ["nu_plugin_from_sqlite", "nu_plugin_to_sqlite"]
start = ["nu_plugin_start"]
trace = ["nu-parser/trace"] trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"] tree = ["nu_plugin_tree"]
clipboard-cli = ["nu-cli/clipboard-cli"] clipboard-cli = ["nu-cli/clipboard-cli"]
ctrlc-support = ["nu-cli/ctrlc"]
directories-support = ["nu-cli/directories", "nu-cli/dirs"]
git-support = ["nu-cli/git2"]
ptree-support = ["nu-cli/ptree"]
starship-prompt = ["nu-cli/starship-prompt"] starship-prompt = ["nu-cli/starship-prompt"]
term-support = ["nu-cli/term"]
[[bin]] trash-support = ["nu-cli/trash-support"]
name = "fail" uuid-support = ["nu-cli/uuid_crate"]
path = "crates/nu-test-support/src/bins/fail.rs" which-support = ["nu-cli/ichwh", "nu-cli/which"]
required-features = ["test-bins"]
[[bin]]
name = "chop"
path = "crates/nu-test-support/src/bins/chop.rs"
required-features = ["test-bins"]
[[bin]]
name = "cococo"
path = "crates/nu-test-support/src/bins/cococo.rs"
required-features = ["test-bins"]
[[bin]]
name = "nonu"
path = "crates/nu-test-support/src/bins/nonu.rs"
required-features = ["test-bins"]
[[bin]]
name = "iecho"
path = "crates/nu-test-support/src/bins/iecho.rs"
required-features = ["test-bins"]
# Core plugins that ship with `cargo install nu` by default # Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary # Currently, Cargo limits us to installing only one binary
@ -131,22 +124,12 @@ name = "nu_plugin_core_ps"
path = "src/plugins/nu_plugin_core_ps.rs" path = "src/plugins/nu_plugin_core_ps.rs"
required-features = ["ps"] required-features = ["ps"]
[[bin]]
name = "nu_plugin_core_str"
path = "src/plugins/nu_plugin_core_str.rs"
required-features = ["str"]
[[bin]] [[bin]]
name = "nu_plugin_core_sys" name = "nu_plugin_core_sys"
path = "src/plugins/nu_plugin_core_sys.rs" path = "src/plugins/nu_plugin_core_sys.rs"
required-features = ["sys"] required-features = ["sys"]
# Stable plugins # Stable plugins
[[bin]]
name = "nu_plugin_stable_average"
path = "src/plugins/nu_plugin_stable_average.rs"
required-features = ["average"]
[[bin]] [[bin]]
name = "nu_plugin_stable_fetch" name = "nu_plugin_stable_fetch"
path = "src/plugins/nu_plugin_stable_fetch.rs" path = "src/plugins/nu_plugin_stable_fetch.rs"
@ -167,16 +150,16 @@ name = "nu_plugin_stable_post"
path = "src/plugins/nu_plugin_stable_post.rs" path = "src/plugins/nu_plugin_stable_post.rs"
required-features = ["post"] required-features = ["post"]
[[bin]]
name = "nu_plugin_stable_sum"
path = "src/plugins/nu_plugin_stable_sum.rs"
required-features = ["sum"]
[[bin]] [[bin]]
name = "nu_plugin_stable_tree" name = "nu_plugin_stable_tree"
path = "src/plugins/nu_plugin_stable_tree.rs" path = "src/plugins/nu_plugin_stable_tree.rs"
required-features = ["tree"] required-features = ["tree"]
[[bin]]
name = "nu_plugin_stable_start"
path = "src/plugins/nu_plugin_stable_start.rs"
required-features = ["start"]
# Main nu binary # Main nu binary
[[bin]] [[bin]]
name = "nu" name = "nu"

1
README.build.txt Normal file

@ -0,0 +1 @@
Nu looks for plugins in your PATH on startup. Nu has some functionality without them, but for full functionality you'll need to copy the plugin binaries into your PATH so they can be loaded.
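As a sketch of the discovery rule this note describes (an illustration only, not nushell's actual loader code), scanning PATH for `nu_plugin_*` binaries looks roughly like this:

```rust
use std::{env, fs};

// Walk each directory on PATH and report entries whose file name starts with
// `nu_plugin_`, mirroring the naming convention described above. This is an
// illustration of the rule, not the loader nushell itself uses.
fn main() {
    let path_var = env::var_os("PATH").unwrap_or_default();
    for dir in env::split_paths(&path_var) {
        if let Ok(entries) = fs::read_dir(&dir) {
            for entry in entries.flatten() {
                if entry.file_name().to_string_lossy().starts_with("nu_plugin_") {
                    println!("candidate plugin: {}", entry.path().display());
                }
            }
        }
    }
}
```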

272
README.md

@ -1,27 +1,36 @@
# README
[![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/nushell/nushell) [![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/nushell/nushell)
[![Crates.io](https://img.shields.io/crates/v/nu.svg)](https://crates.io/crates/nu) [![Crates.io](https://img.shields.io/crates/v/nu.svg)](https://crates.io/crates/nu)
[![Build Status](https://dev.azure.com/nushell/nushell/_apis/build/status/nushell.nushell?branchName=master)](https://dev.azure.com/nushell/nushell/_build/latest?definitionId=2&branchName=master) [![Build Status](https://dev.azure.com/nushell/nushell/_apis/build/status/nushell.nushell?branchName=master)](https://dev.azure.com/nushell/nushell/_build/latest?definitionId=2&branchName=master)
[![Discord](https://img.shields.io/discord/601130461678272522.svg?logo=discord)](https://discord.gg/NtAbbGn) [![Discord](https://img.shields.io/discord/601130461678272522.svg?logo=discord)](https://discord.gg/NtAbbGn)
[![The Changelog #363](https://img.shields.io/badge/The%20Changelog-%23363-61c192.svg)](https://changelog.com/podcast/363) [![The Changelog #363](https://img.shields.io/badge/The%20Changelog-%23363-61c192.svg)](https://changelog.com/podcast/363)
[![@nu_shell](https://img.shields.io/badge/twitter-@nu_shell-1DA1F3?style=flat-square)](https://twitter.com/nu_shell)
## Nu Shell
# Nu Shell
A new type of shell. A new type of shell.
![Example of nushell](images/nushell-autocomplete.gif "Example of nushell") ![Example of nushell](images/nushell-autocomplete.gif "Example of nushell")
# Status ## Status
This project has reached a minimum-viable product level of quality. While contributors dogfood it as their daily driver, it may be unstable for some commands. Future releases will work to fill out missing features and improve stability. Its design is also subject to change as it matures. This project has reached a minimum-viable product level of quality.
While contributors dogfood it as their daily driver, it may be unstable for some commands.
Future releases will work to fill out missing features and improve stability.
Its design is also subject to change as it matures.
Nu comes with a set of built-in commands (listed below). If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and macOS), correctly passing through stdin, stdout, and stderr, so things like your daily git workflows and even `vim` will work just fine. Nu comes with a set of built-in commands (listed below).
If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and macOS), correctly passing through stdin, stdout, and stderr, so things like your daily git workflows and even `vim` will work just fine.
# Learning more ## Learning more
There are a few good resources to learn about Nu. There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress. The book focuses on using Nu and its core concepts. There are a few good resources to learn about Nu.
There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress.
The book focuses on using Nu and its core concepts.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in. If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started.
There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us. We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us.
@ -31,44 +40,54 @@ Try it in Gitpod.
[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell) [![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/nushell/nushell)
# Installation ## Installation
## Local ### Local
Up-to-date installation instructions can be found in the [installation chapter of the book](https://www.nushell.sh/book/en/installation.html). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support. Up-to-date installation instructions can be found in the [installation chapter of the book](https://www.nushell.sh/book/en/installation.html). **Windows users**: please note that Nu works on Windows 10 and does not currently have Windows 7/8.1 support.
To build Nu, you will need to use the **latest stable (1.39 or later)** version of the compiler. To build Nu, you will need to use the **latest stable (1.41 or later)** version of the compiler.
Required dependencies: Required dependencies:
* pkg-config and libssl (only needed on Linux) * pkg-config and libssl (only needed on Linux)
* on Debian/Ubuntu: `apt install pkg-config libssl-dev` * On Debian/Ubuntu: `apt install pkg-config libssl-dev`
Optional dependencies: Optional dependencies:
* To use Nu with all possible optional features enabled, you'll also need the following: * To use Nu with all possible optional features enabled, you'll also need the following:
* on Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev` * On Linux (on Debian/Ubuntu): `apt install libxcb-composite0-dev libx11-dev`
To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the latest stable compiler via `rustup install stable`): To install Nu via cargo (make sure you have installed [rustup](https://rustup.rs/) and the latest stable compiler via `rustup install stable`):
``` ```bash
cargo install nu cargo install nu
``` ```
You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/en/installation.html#dependencies) for your platform), once you have checked out this repo with git: You can also build Nu yourself with all the bells and whistles (be sure to have installed the [dependencies](https://www.nushell.sh/book/en/installation.html#dependencies) for your platform), once you have checked out this repo with git:
``` ```bash
cargo build --workspace --features=stable cargo build --workspace --features=stable
``` ```
## Docker ### Docker
#### Quickstart
Want to try Nu right away? Execute the following to get started.
```bash
docker run -it quay.io/nushell/nu:latest
```
#### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell) If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to: on Quay.io. Pulling a container would come down to:
```bash ```bash
$ docker pull quay.io/nushell/nu docker pull quay.io/nushell/nu
$ docker pull quay.io/nushell/nu-base docker pull quay.io/nushell/nu-base
``` ```
Both "nu-base" and "nu" provide the nu binary, however nu-base also includes the source code at `/code` Both "nu-base" and "nu" provide the nu binary, however nu-base also includes the source code at `/code`
@ -78,153 +97,176 @@ Optionally, you can also build the containers locally using the [dockerfiles pro
To build the base image: To build the base image:
```bash ```bash
$ docker build -f docker/Dockerfile.nu-base -t nushell/nu-base . docker build -f docker/Dockerfile.nu-base -t nushell/nu-base .
``` ```
And then to build the smaller container (using a Multistage build): And then to build the smaller container (using a Multistage build):
```bash ```bash
$ docker build -f docker/Dockerfile -t nushell/nu . docker build -f docker/Dockerfile -t nushell/nu .
``` ```
Either way, you can run either container as follows: Either way, you can run either container as follows:
```bash ```bash
$ docker run -it nushell/nu-base docker run -it nushell/nu-base
$ docker run -it nushell/nu docker run -it nushell/nu
/> exit /> exit
``` ```
The second container is a bit smaller if the size is important to you. The second container is a bit smaller if the size is important to you.
## Packaging status ### Packaging status
[![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions) [![Packaging status](https://repology.org/badge/vertical-allrepos/nushell.svg)](https://repology.org/project/nushell/versions)
### Fedora #### Fedora
[COPR repo](https://copr.fedorainfracloud.org/coprs/atim/nushell/): `sudo dnf copr enable atim/nushell -y && sudo dnf install nushell -y` [COPR repo](https://copr.fedorainfracloud.org/coprs/atim/nushell/): `sudo dnf copr enable atim/nushell -y && sudo dnf install nushell -y`
# Philosophy ## Philosophy
Nu draws inspiration from projects like PowerShell, functional programming languages, and modern CLI tools. Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure. For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory. These values can be piped through a series of steps, in a series of commands called a 'pipeline'. Nu draws inspiration from projects like PowerShell, functional programming languages, and modern CLI tools.
Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure.
For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory.
These values can be piped through a series of steps, in a series of commands called a 'pipeline'.
## Pipelines ### Pipelines
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps. Nu takes this a step further and builds heavily on the idea of _pipelines_. Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin. Additionally, commands can output structured data (you can think of this as a third kind of stream). Commands that work in the pipeline fit into one of three categories: In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps.
Nu takes this a step further and builds heavily on the idea of _pipelines_.
Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin.
Additionally, commands can output structured data (you can think of this as a third kind of stream).
Commands that work in the pipeline fit into one of three categories:
* Commands that produce a stream (eg, `ls`) * Commands that produce a stream (eg, `ls`)
* Commands that filter a stream (eg, `where type == "Directory"`) * Commands that filter a stream (eg, `where type == "Dir"`)
* Commands that consume the output of the pipeline (eg, `autoview`) * Commands that consume the output of the pipeline (eg, `autoview`)
Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing left to right. Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing left to right.
``` ```shell
/home/jonathan/Source/nushell(master)> ls | where type == "Directory" | autoview > ls | where type == "Dir" | autoview
────┬───────────┬───────────┬──────────┬────────┬──────────────┬──────────────── ───────────┬──────┬───────┬──────────────
# │ name │ type │ readonly │ size │ accessed │ modified # │ name │ type │ size │ modified
────┼───────────┼───────────┼──────────┼────────┼──────────────┼──────────────── ───────────┼──────┼───────┼──────────────
0 │ .azure │ Directory │ │ 4.1 KB │ 2 months ago │ a day ago 0 │ assets │ Dir │ 128 B │ 5 months ago
1 │ target │ Directory │ │ 4.1 KB │ 3 days ago │ 3 days ago 1 │ crates │ Dir │ 704 B │ 50 mins ago
2 │ images │ Directory │ 4.1 KB │ 2 months ago │ 2 weeks ago 2 │ debian │ Dir352 B │ 5 months ago
3 │ tests │ Directory │ 4.1 KB │ 2 months ago │ 37 minutes ago 3 │ docker │ Dir288 B │ 3 months ago
4 │ tmp │ Directory │ │ 4.1 KB │ 2 weeks ago │ 2 weeks ago 4 │ docs │ Dir │ 192 B │ 50 mins ago
5 │ src │ Directory │ │ 4.1 KB │ 2 months ago │ 37 minutes ago 5 │ images │ Dir │ 160 B │ 5 months ago
6 │ assets │ Directory │ │ 4.1 KB │ a month ago │ a month ago 6 │ src │ Dir │ 128 B │ 1 day ago
7 │ docs │ Directory │ │ 4.1 KB │ 2 months ago │ 2 months ago 7 │ target │ Dir │ 160 B │ 5 days ago
────┴───────────┴───────────┴──────────┴────────┴──────────────┴──────────────── 8 │ tests │ Dir │ 192 B │ 3 months ago
───┴────────┴──────┴───────┴──────────────
``` ```
Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed. We could have also written the above: Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed.
We could have also written the above:
``` ```shell
/home/jonathan/Source/nushell(master)> ls | where type == Directory > ls | where type == Dir
``` ```
Being able to use the same commands and compose them differently is an important philosophy in Nu. For example, we could use the built-in `ps` command as well to get a list of the running processes, using the same `where` as above. Being able to use the same commands and compose them differently is an important philosophy in Nu.
For example, we could use the built-in `ps` command as well to get a list of the running processes, using the same `where` as above.
```text ```shell
/home/jonathan/Source/nushell(master)> ps | where cpu > 0 > ps | where cpu > 0
───┬───────┬─────────────────┬──────────┬────────── ───┬───────┬───────────────────┬──────────┬─────────┬──────────┬──────────
# │ pid │ name │ status │ cpu # │ pid │ name │ status │ cpu │ mem │ virtual
───┼───────┼─────────────────┼──────────┼────────── ───┼───────┼───────────────────┼──────────┼─────────┼──────────┼──────────
0992 │ chrome │ Sleeping │ 6.988768 0 435 │ irq/142-SYNA327 │ Sleeping │ 7.5699 │ 0 B │ 0 B
14240 │ chrome │ Sleeping │ 5.645982 1 1609 │ pulseaudio │ Sleeping │ 6.5605 │ 10.6 MB │ 2.3 GB
2 │ 13973 │ qemu-system-x86 │ Sleeping │ 4.996551 21625 │ gnome-shell │ Sleeping │ 6.5684 │ 639.6 MB │ 7.3 GB
3 │ 15746 │ nu │ Sleeping │ 84.59905 32202 │ Web Content │ Sleeping │ 6.8157 │ 320.8 MB │ 3.0 GB
───┴───────┴─────────────────┴──────────┴────────── 4328788 │ nu_plugin_core_ps │ Sleeping │ 92.5750 │ 5.9 MB │ 633.2 MB
───┴────────┴───────────────────┴──────────┴─────────┴──────────┴──────────
``` ```
## Opening files ### Opening files
Nu can load file and URL contents as raw text or as structured data (if it recognizes the format). For example, you can load a .toml file as structured data and explore it: Nu can load file and URL contents as raw text or as structured data (if it recognizes the format).
For example, you can load a .toml file as structured data and explore it:
``` ```shell
/home/jonathan/Source/nushell(master)> open Cargo.toml > open Cargo.toml
──────────────────┬────────────────┬────────────────── ────────────────────┬───────────────────────────
bin │ dependencies │ dev-dependencies bin [table 18 rows]
──────────────────┼────────────────┼────────────────── build-dependencies │ [row serde toml]
[table: 12 rows] │ [table: 1 row] │ [table: 1 row] dependencies │ [row 29 columns]
──────────────────┴────────────────┴────────────────── dev-dependencies │ [row nu-test-support]
features │ [row 19 columns]
package │ [row 12 columns]
workspace │ [row members]
────────────────────┴───────────────────────────
``` ```
We can pipeline this into a command that gets the contents of one of the columns: We can pipeline this into a command that gets the contents of one of the columns:
``` ```shell
/home/jonathan/Source/nushell(master)> open Cargo.toml | get package > open Cargo.toml | get package
─────────────────┬────────────────────────────┬─────────┬─────────┬──────┬───────── ───────────────┬────────────────────────────────────
authors │ description │ edition │ license │ name │ version authors [table 1 rows]
─────────────────┼────────────────────────────┼─────────┼─────────┼──────┼───────── default-run │ nu
[table: 3 rows] │ A shell for the GitHub era │ 2018 │ MIT │ nu │ 0.9.0 description │ A new type of shell
─────────────────┴────────────────────────────┴─────────┴─────────┴──────┴───────── documentation │ https://www.nushell.sh/book/
edition │ 2018
exclude │ [table 1 rows]
homepage │ https://www.nushell.sh
license │ MIT
name │ nu
readme │ README.md
repository │ https://github.com/nushell/nushell
version │ 0.15.1
───────────────┴────────────────────────────────────
``` ```
Finally, we can use commands outside of Nu once we have the data we want: Finally, we can use commands outside of Nu once we have the data we want:
``` ```shell
/home/jonathan/Source/nushell(master)> open Cargo.toml | get package.version | echo $it > open Cargo.toml | get package.version | echo $it
0.9.0 0.15.1
``` ```
Here we use the variable `$it` to refer to the value being piped to the external command. Here we use the variable `$it` to refer to the value being piped to the external command.
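For comparison, the same `get package.version` lookup done directly in Rust with the `toml` crate (which appears in the dependency lists elsewhere in this diff) might look like the sketch below; error handling is simplified and this is not code from the repository:

```rust
use std::fs;

// Rough Rust equivalent of `open Cargo.toml | get package.version`,
// using the `toml` crate; error handling is simplified for brevity.
fn main() -> Result<(), Box<dyn std::error::Error>> {
    let manifest = fs::read_to_string("Cargo.toml")?;
    let value: toml::Value = toml::from_str(&manifest)?;
    let version = value
        .get("package")
        .and_then(|pkg| pkg.get("version"))
        .and_then(|v| v.as_str())
        .unwrap_or("unknown");
    println!("{}", version);
    Ok(())
}
```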
## Configuration ### Configuration
Nu has early support for configuring the shell. It currently supports the following settings: Nu has early support for configuring the shell. You can refer to the book for a list of [all supported variables](https://www.nushell.sh/book/en/configuration.html).
| Variable | Type | Description | To set one of these variables, you can use `config set`. For example:
| --------------- | -------------------- | -------------------------------------------------------------- |
| path | table of strings | PATH to use to find binaries |
| env | row | the environment variables to pass to external commands |
| ctrlc_exit | boolean | whether or not to exit Nu after multiple ctrl-c presses |
| table_mode | "light" or other | enable lightweight or normal tables |
| edit_mode | "vi" or "emacs" | changes line editing to "vi" or "emacs" mode |
| completion_mode | "circular" or "list" | changes completion type to "circular" (default) or "list" mode |
To set one of these variables, you can use `config --set`. For example: ```shell
> config set edit_mode "vi"
``` > config set path $nu.path
> config --set [edit_mode "vi"]
> config --set [path $nu.path]
``` ```
## Shells ### Shells
Nu will work inside of a single directory and allow you to navigate around your filesystem by default. Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time. Nu will work inside of a single directory and allow you to navigate around your filesystem by default.
Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time.
To do so, use the `enter` command, which will allow you create a new "shell" and enter it at the specified path. You can toggle between this new shell and the original shell with the `p` (for previous) and `n` (for next), allowing you to navigate around a ring buffer of shells. Once you're done with a shell, you can `exit` it and remove it from the ring buffer. To do so, use the `enter` command, which will allow you create a new "shell" and enter it at the specified path.
You can toggle between this new shell and the original shell with the `p` (for previous) and `n` (for next), allowing you to navigate around a ring buffer of shells.
Once you're done with a shell, you can `exit` it and remove it from the ring buffer.
Finally, to get a list of all the current shells, you can use the `shells` command. Finally, to get a list of all the current shells, you can use the `shells` command.
## Plugins ### Plugins
Nu supports plugins that offer additional functionality to the shell and follow the same structured data model that built-in commands use. This allows you to extend nu for your needs. Nu supports plugins that offer additional functionality to the shell and follow the same structured data model that built-in commands use.
This allows you to extend nu for your needs.
There are a few examples in the `plugins` directory. There are a few examples in the `plugins` directory.
Plugins are binaries that are available in your path and follow a `nu_plugin_*` naming convention. These binaries interact with nu via a simple JSON-RPC protocol where the command identifies itself and passes along its configuration, which then makes it available for use. If the plugin is a filter, data streams to it one element at a time, and it can stream data back in return via stdin/stdout. If the plugin is a sink, it is given the full vector of final data and is given free reign over stdin/stdout to use as it pleases. Plugins are binaries that are available in your path and follow a `nu_plugin_*` naming convention.
These binaries interact with nu via a simple JSON-RPC protocol where the command identifies itself and passes along its configuration, which then makes it available for use.
If the plugin is a filter, data streams to it one element at a time, and it can stream data back in return via stdin/stdout.
If the plugin is a sink, it is given the full vector of final data and is given free reign over stdin/stdout to use as it pleases.
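As a rough schematic of the shape of such a protocol (this is an illustration of a line-based JSON request/response loop, not the actual nu-plugin wire format, and it assumes the `serde_json` crate):

```rust
use std::io::{self, BufRead, Write};

// Schematic of a line-based JSON request/response loop, in the spirit of the
// plugin protocol described above. The message shapes are invented for
// illustration and are NOT the real nu-plugin wire format.
fn main() -> io::Result<()> {
    let stdin = io::stdin();
    let stdout = io::stdout();
    let mut out = stdout.lock();

    for line in stdin.lock().lines() {
        let request: serde_json::Value = match serde_json::from_str(&line?) {
            Ok(v) => v,
            Err(_) => continue, // skip anything that isn't valid JSON
        };
        // Echo the request back under a "response" key, one JSON object per line.
        let response = serde_json::json!({ "response": request });
        writeln!(out, "{}", response)?;
    }
    Ok(())
}
```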
# Goals ## Goals
Nu adheres closely to a set of goals that make up its design philosophy. As features are added, they are checked against these goals. Nu adheres closely to a set of goals that make up its design philosophy. As features are added, they are checked against these goals.
@ -238,11 +280,39 @@ Nu adheres closely to a set of goals that make up its design philosophy. As feat
* Finally, Nu views data functionally. Rather than using mutation, pipelines act as a means to load, change, and save data without mutable state. * Finally, Nu views data functionally. Rather than using mutation, pipelines act as a means to load, change, and save data without mutable state.
# Commands ## Commands
You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references). You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references).
# License ## Progress
The project is made available under the MIT license. See "LICENSE" for more information. Nu is in heavy development, and will naturally change as it matures and people use it. The chart below isn't meant to be exhaustive, but rather helps give an idea for some of the areas of development and their relative completion:
| Features | Not started | Prototype | MVP | Preview | Mature | Notes
| -------- |:-----------:|:---------:|:---:|:-------:|:------:| -----
| Aliases | | X | | | | Initial implementation but lacks necessary features
| Notebook | | X | | | | Initial jupyter support, but it loses state and lacks features
| File ops | | | X | | | cp, mv, rm, mkdir have some support, but lacking others
| Environment | | X | | | | Temporary environment, but no session-wide env variables
| Shells | | X | | | | Basic value and file shells, but no opt-in/opt-out for commands
| Protocol | | | X | | | Streaming protocol is serviceable
| Plugins | | X | | | | Plugins work on one row at a time, lack batching and expression eval
| Errors | | | X | | | Error reporting works, but could use usability polish
| Documentation | | X | | | | Book and related are barebones and lack task-based lessons
| Paging | | X | | | | Textview has paging, but we'd like paging for tables
| Functions| X | | | | | No functions, yet, only aliases
| Variables| X | | | | | Nu doesn't yet support variables
| Completions | | X | | | | Completions are currently barebones, at best
| Type-checking | | X | | | | Commands check basic types, but input/output isn't checked
## Current Roadmap
We've added a `Roadmap Board` to help collaboratively capture the direction we're going for the current release as well as capture some important issues we'd like to see in NuShell. You can find the Roadmap [here](https://github.com/nushell/nushell/projects/2).
## Contributing
See [Contributing](CONTRIBUTING.md) for details.
## License
The project is made available under the MIT license. See the `LICENSE` file for more information.

60
TODO.md

@ -1,60 +0,0 @@
This pattern is extremely repetitive and can be abstracted:
```rs
let args = args.evaluate_once(registry)?;
let tag = args.name_tag();
let input = args.input;
let stream = async_stream! {
let values: Vec<Value> = input.values.collect().await;
let mut concat_string = String::new();
let mut latest_tag: Option<Tag> = None;
for value in values {
latest_tag = Some(value_tag.clone());
let value_span = value.tag.span;
match &value.value {
UntaggedValue::Primitive(Primitive::String(s)) => {
concat_string.push_str(&s);
concat_string.push_str("\n");
}
_ => yield Err(ShellError::labeled_error_with_secondary(
"Expected a string from pipeline",
"requires string input",
name_span,
"value originates from here",
value_span,
)),
}
}
```
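One possible shape for that abstraction, reusing the nu-protocol types from the snippet above (a sketch only, with illustrative names, not the refactor nushell eventually shipped):

```rust
use nu_errors::ShellError;
use nu_protocol::{Primitive, UntaggedValue, Value};
use nu_source::Tag;

// Collect string values from a pipeline into a single String, reporting the
// first non-string value as an error, so each command no longer has to repeat
// the match shown above.
fn collect_strings(values: Vec<Value>, name_tag: &Tag) -> Result<String, ShellError> {
    let mut concat_string = String::new();
    for value in values {
        match &value.value {
            UntaggedValue::Primitive(Primitive::String(s)) => {
                concat_string.push_str(s);
                concat_string.push('\n');
            }
            _ => {
                return Err(ShellError::labeled_error_with_secondary(
                    "Expected a string from pipeline",
                    "requires string input",
                    name_tag.span,
                    "value originates from here",
                    value.tag.span,
                ))
            }
        }
    }
    Ok(concat_string)
}
```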
Mandatory and Optional in parse_command
trace_remaining?
select_fields and select_fields take unnecessary Tag
Value#value should be Value#untagged
Unify dictionary building, probably around a macro
sys plugin in own crate
textview in own crate
Combine atomic and atomic_parse in parser
at_end_possible_ws needs to be comment and separator sensitive
Eliminate unnecessary `nodes` parser
#[derive(HasSpan)]
Figure out a solution for the duplication in stuff like NumberShape vs. NumberExpressionShape
use `struct Expander` from signature.rs

Binary file not shown.

Binary file not shown.

View File

@ -1,3 +0,0 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}

View File

@ -1,16 +0,0 @@
[package]
name = "nu-build"
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core build system for nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
serde = { version = "1.0.103", features = ["derive"] }
lazy_static = "1.4.0"
serde_json = "1.0.44"
toml = "0.5.5"

View File

@ -1,80 +0,0 @@
use lazy_static::lazy_static;
use serde::Deserialize;
use std::collections::BTreeMap;
use std::collections::HashMap;
use std::collections::HashSet;
use std::env;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
lazy_static! {
static ref WORKSPACES: Mutex<BTreeMap<String, &'static Path>> = Mutex::new(BTreeMap::new());
}
// got from https://github.com/mitsuhiko/insta/blob/b113499249584cb650150d2d01ed96ee66db6b30/src/runtime.rs#L67-L88
fn get_cargo_workspace(manifest_dir: &str) -> Result<Option<&Path>, Box<dyn std::error::Error>> {
let mut workspaces = WORKSPACES.lock()?;
if let Some(rv) = workspaces.get(manifest_dir) {
Ok(Some(rv))
} else {
#[derive(Deserialize)]
struct Manifest {
workspace_root: String,
}
let output = std::process::Command::new(env!("CARGO"))
.arg("metadata")
.arg("--format-version=1")
.current_dir(manifest_dir)
.output()?;
let manifest: Manifest = serde_json::from_slice(&output.stdout)?;
let path = Box::leak(Box::new(PathBuf::from(manifest.workspace_root)));
workspaces.insert(manifest_dir.to_string(), path.as_path());
Ok(workspaces.get(manifest_dir).cloned())
}
}
#[derive(Deserialize)]
struct Feature {
#[allow(unused)]
description: String,
enabled: bool,
}
pub fn build() -> Result<(), Box<dyn std::error::Error>> {
let input = env::var("CARGO_MANIFEST_DIR")?;
let all_on = env::var("NUSHELL_ENABLE_ALL_FLAGS").is_ok();
let flags: HashSet<String> = env::var("NUSHELL_ENABLE_FLAGS")
.map(|s| s.split(',').map(|s| s.to_string()).collect())
.unwrap_or_else(|_| HashSet::new());
if all_on && !flags.is_empty() {
println!(
"cargo:warning=Both NUSHELL_ENABLE_ALL_FLAGS and NUSHELL_ENABLE_FLAGS were set. You don't need both."
);
}
let workspace = match get_cargo_workspace(&input)? {
// If the crate is being downloaded from crates.io, it won't have a workspace root, and that's ok
None => return Ok(()),
Some(workspace) => workspace,
};
let path = Path::new(&workspace).join("features.toml");
// If the crate is being downloaded from crates.io, it won't have a features.toml, and that's ok
if !path.exists() {
return Ok(());
}
let toml: HashMap<String, Feature> = toml::from_str(&std::fs::read_to_string(path)?)?;
for (key, value) in toml.iter() {
if value.enabled || all_on || flags.contains(key) {
println!("cargo:rustc-cfg={}", key);
}
}
Ok(())
}
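For context on what that `cargo:rustc-cfg={key}` line buys you: each enabled key in features.toml becomes a compile-time `cfg` flag that downstream code can test. A minimal sketch (the flag name is hypothetical, not one defined by this repository):

```rust
// `my_experimental_flag` is a hypothetical key; if features.toml enabled it,
// the build script above would print `cargo:rustc-cfg=my_experimental_flag`
// and the first branch would be compiled in.
fn main() {
    if cfg!(my_experimental_flag) {
        println!("experimental code path enabled at compile time");
    } else {
        println!("experimental code path compiled out");
    }
}
```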

View File

@ -1,109 +1,126 @@
[package] [package]
name = "nu-cli" authors = ["The Nu Project Contributors"]
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "CLI for nushell" description = "CLI for nushell"
edition = "2018" edition = "2018"
license = "MIT" license = "MIT"
name = "nu-cli"
version = "0.18.1"
[lib] [lib]
doctest = false doctest = false
[dependencies] [dependencies]
nu-source = { version = "0.11.0", path = "../nu-source" } nu-errors = {version = "0.18.1", path = "../nu-errors"}
nu-plugin = { version = "0.11.0", path = "../nu-plugin" } nu-parser = {version = "0.18.1", path = "../nu-parser"}
nu-protocol = { version = "0.11.0", path = "../nu-protocol" } nu-plugin = {version = "0.18.1", path = "../nu-plugin"}
nu-errors = { version = "0.11.0", path = "../nu-errors" } nu-protocol = {version = "0.18.1", path = "../nu-protocol"}
nu-parser = { version = "0.11.0", path = "../nu-parser" } nu-source = {version = "0.18.1", path = "../nu-source"}
nu-value-ext = { version = "0.11.0", path = "../nu-value-ext" } nu-table = {version = "0.18.1", path = "../nu-table"}
nu-macros = { version = "0.11.0", path = "../nu-macros" } nu-test-support = {version = "0.18.1", path = "../nu-test-support"}
nu-test-support = { version = "0.11.0", path = "../nu-test-support" } nu-value-ext = {version = "0.18.1", path = "../nu-value-ext"}
ansi_term = "0.12.1" ansi_term = "0.12.1"
app_dirs = "1.2.1" app_dirs = {version = "2", package = "app_dirs2"}
async-stream = "0.2" async-recursion = "0.3.1"
base64 = "0.11" async-trait = "0.1.36"
bigdecimal = { version = "0.1.0", features = ["serde"] } base64 = "0.12.3"
bson = { version = "0.14.0", features = ["decimal128"] } bigdecimal = {version = "0.1.2", features = ["serde"]}
byte-unit = "3.0.3" byte-unit = "3.1.3"
bytes = "0.5.4" bytes = "0.5.5"
calamine = "0.16" calamine = "0.16"
cfg-if = "0.1"
chrono = {version = "0.4.11", features = ["serde"]} chrono = {version = "0.4.11", features = ["serde"]}
clap = "2.33.0" clap = "2.33.1"
codespan-reporting = "0.9.5"
csv = "1.1" csv = "1.1"
ctrlc = "3.1.4" ctrlc = {version = "3.1.4", optional = true}
derive-new = "0.5.8" derive-new = "0.5.8"
dirs = "2.0.2" directories = {version = "2.0.2", optional = true}
dunce = "1.0.0" dirs = {version = "2.0.2", optional = true}
filesize = "0.1.0" dtparse = "1.1.0"
dunce = "1.0.1"
eml-parser = "0.1.0"
filesize = "0.2.0"
futures = {version = "0.3", features = ["compat", "io-compat"]} futures = {version = "0.3", features = ["compat", "io-compat"]}
futures-util = "0.3.4" futures-util = "0.3.5"
futures_codec = "0.4" futures_codec = "0.4"
getset = "0.1.0" getset = "0.1.1"
git2 = { version = "0.11.0", default_features = false } git2 = {version = "0.13.6", default_features = false, optional = true}
glob = "0.3.0" glob = "0.3.0"
hex = "0.4" hex = "0.4"
ichwh = "0.3" htmlescape = "0.3.1"
indexmap = { version = "1.3.2", features = ["serde-1"] } ical = "0.6.*"
ichwh = {version = "0.3.4", optional = true}
indexmap = {version = "1.4.0", features = ["serde-1"]}
itertools = "0.9.0" itertools = "0.9.0"
language-reporting = "0.4.0"
log = "0.4.8" log = "0.4.8"
meval = "0.2" meval = "0.2"
natural = "0.5.0" natural = "0.5.0"
nom = "5.0.1"
nom-tracable = "0.4.1"
nom_locate = "1.0.0"
num-bigint = {version = "0.2.6", features = ["serde"]} num-bigint = {version = "0.2.6", features = ["serde"]}
num-format = {version = "0.4", features = ["with-num-bigint"]}
num-traits = "0.2.11" num-traits = "0.2.11"
parking_lot = "0.10.0" parking_lot = "0.11.0"
pin-utils = "0.1.0-alpha.4" pin-utils = "0.1.0"
pretty-hex = "0.1.1" pretty-hex = "0.1.1"
pretty_env_logger = "0.4.0" pretty_env_logger = "0.4.0"
prettytable-rs = "0.8.0" ptree = {version = "0.2", optional = true}
ptree = {version = "0.2" }
query_interface = "0.3.5" query_interface = "0.3.5"
rand = "0.7" rand = "0.7"
regex = "1" regex = "1"
roxmltree = "0.9.1" roxmltree = "0.13.0"
rustyline = "6.0.0" rust-embed = "5.6.0"
serde = { version = "1.0.104", features = ["derive"] } rustyline = "6.2.0"
serde = {version = "1.0.114", features = ["derive"]}
serde-hjson = "0.9.1" serde-hjson = "0.9.1"
serde_bytes = "0.11.3" serde_bytes = "0.11.5"
serde_ini = "0.2.0" serde_ini = "0.2.0"
serde_json = "1.0.48" serde_json = "1.0.55"
serde_urlencoded = "0.6.1" serde_urlencoded = "0.6.1"
serde_yaml = "0.8" serde_yaml = "0.8"
sha2 = "0.9.1"
shellexpand = "2.0.0" shellexpand = "2.0.0"
strip-ansi-escapes = "0.1.0" strip-ansi-escapes = "0.1.0"
tempfile = "3.1.0" tempfile = "3.1.0"
term = "0.5.2" term = {version = "0.5.2", optional = true}
term_size = "0.3.2"
termcolor = "1.1.0" termcolor = "1.1.0"
textwrap = {version = "0.11.0", features = ["term_size"]}
toml = "0.5.6" toml = "0.5.6"
trash = "1.0.0" typetag = "0.1.5"
typetag = "0.1.4" umask = "1.0.0"
umask = "0.1" unicode-xid = "0.2.1"
unicode-xid = "0.2.0" uuid_crate = {package = "uuid", version = "0.8.1", features = ["v4"], optional = true}
which = "3.1.1" which = {version = "4.0.2", optional = true}
zip = "0.5.6"
clipboard = {version = "0.5", optional = true} clipboard = {version = "0.5", optional = true}
starship = { version = "0.37.0", optional = true } encoding_rs = "0.8.23"
quick-xml = "0.18.1"
rayon = "1.3.1"
starship = {version = "0.43.0", optional = true}
trash = {version = "1.0.1", optional = true}
url = {version = "2.1.1"}
[target.'cfg(unix)'.dependencies] [target.'cfg(unix)'.dependencies]
users = "0.9" users = "0.10.0"
# TODO this will be possible with new dependency resolver
# (currently on nightly behind -Zfeatures=itarget):
# https://github.com/rust-lang/cargo/issues/7914
#[target.'cfg(not(windows))'.dependencies]
#num-format = {version = "0.4", features = ["with-system-locale"]}
[dependencies.rusqlite] [dependencies.rusqlite]
version = "0.21.0"
features = ["bundled", "blob"] features = ["bundled", "blob"]
optional = true
[dev-dependencies] version = "0.23.1"
pretty_assertions = "0.6.1"
[build-dependencies] [build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
[dev-dependencies]
quickcheck = "0.9"
quickcheck_macros = "0.9"
[features] [features]
clipboard-cli = ["clipboard"]
stable = [] stable = []
starship-prompt = ["starship"] starship-prompt = ["starship"]
clipboard-cli = ["clipboard"] trash-support = ["trash"]

Binary file not shown.

File diff suppressed because it is too large.

View File

@ -4,205 +4,256 @@ pub(crate) mod macros;
mod from_delimited_data; mod from_delimited_data;
mod to_delimited_data; mod to_delimited_data;
pub(crate) mod alias;
pub(crate) mod ansi;
pub(crate) mod append; pub(crate) mod append;
pub(crate) mod args; pub(crate) mod args;
pub(crate) mod autoenv;
pub(crate) mod autoenv_trust;
pub(crate) mod autoenv_untrust;
pub(crate) mod autoview; pub(crate) mod autoview;
pub(crate) mod calc; pub(crate) mod benchmark;
pub(crate) mod build_string;
pub(crate) mod cal;
pub(crate) mod cd; pub(crate) mod cd;
pub(crate) mod char_;
pub(crate) mod classified; pub(crate) mod classified;
#[cfg(feature = "clipboard")]
pub(crate) mod clip; pub(crate) mod clip;
pub(crate) mod command; pub(crate) mod command;
pub(crate) mod compact; pub(crate) mod compact;
pub(crate) mod config; pub(crate) mod config;
pub(crate) mod constants;
pub(crate) mod count; pub(crate) mod count;
pub(crate) mod cp; pub(crate) mod cp;
pub(crate) mod date; pub(crate) mod date;
pub(crate) mod debug; pub(crate) mod debug;
pub(crate) mod default; pub(crate) mod default;
pub(crate) mod do_;
pub(crate) mod drop;
pub(crate) mod du; pub(crate) mod du;
pub(crate) mod each;
pub(crate) mod echo; pub(crate) mod echo;
pub(crate) mod edit;
pub(crate) mod enter; pub(crate) mod enter;
#[allow(unused)] pub(crate) mod every;
pub(crate) mod evaluate_by;
pub(crate) mod exit; pub(crate) mod exit;
pub(crate) mod first; pub(crate) mod first;
pub(crate) mod format; pub(crate) mod format;
pub(crate) mod from_bson; pub(crate) mod from;
pub(crate) mod from_csv; pub(crate) mod from_csv;
pub(crate) mod from_eml;
pub(crate) mod from_ics;
pub(crate) mod from_ini; pub(crate) mod from_ini;
pub(crate) mod from_json; pub(crate) mod from_json;
pub(crate) mod from_ods; pub(crate) mod from_ods;
pub(crate) mod from_sqlite;
pub(crate) mod from_ssv; pub(crate) mod from_ssv;
pub(crate) mod from_toml; pub(crate) mod from_toml;
pub(crate) mod from_tsv; pub(crate) mod from_tsv;
pub(crate) mod from_url; pub(crate) mod from_url;
pub(crate) mod from_vcf;
pub(crate) mod from_xlsx; pub(crate) mod from_xlsx;
pub(crate) mod from_xml; pub(crate) mod from_xml;
pub(crate) mod from_yaml; pub(crate) mod from_yaml;
pub(crate) mod get; pub(crate) mod get;
pub(crate) mod group_by; pub(crate) mod group_by;
pub(crate) mod group_by_date;
pub(crate) mod headers;
pub(crate) mod help; pub(crate) mod help;
pub(crate) mod histogram; pub(crate) mod histogram;
pub(crate) mod history; pub(crate) mod history;
pub(crate) mod if_;
pub(crate) mod insert; pub(crate) mod insert;
pub(crate) mod is_empty;
pub(crate) mod keep;
pub(crate) mod last; pub(crate) mod last;
pub(crate) mod lines; pub(crate) mod lines;
pub(crate) mod ls; pub(crate) mod ls;
#[allow(unused)] pub(crate) mod math;
pub(crate) mod map_max_by; pub(crate) mod merge;
pub(crate) mod mkdir; pub(crate) mod mkdir;
pub(crate) mod mv; pub(crate) mod move_;
pub(crate) mod next; pub(crate) mod next;
pub(crate) mod nth; pub(crate) mod nth;
pub(crate) mod open; pub(crate) mod open;
pub(crate) mod parse; pub(crate) mod parse;
pub(crate) mod pick; pub(crate) mod path;
pub(crate) mod pivot; pub(crate) mod pivot;
pub(crate) mod plugin; pub(crate) mod plugin;
pub(crate) mod prepend; pub(crate) mod prepend;
pub(crate) mod prev; pub(crate) mod prev;
pub(crate) mod pwd; pub(crate) mod pwd;
pub(crate) mod random;
pub(crate) mod range; pub(crate) mod range;
#[allow(unused)] pub(crate) mod reduce;
pub(crate) mod reduce_by;
pub(crate) mod reject; pub(crate) mod reject;
pub(crate) mod rename; pub(crate) mod rename;
pub(crate) mod reverse; pub(crate) mod reverse;
pub(crate) mod rm; pub(crate) mod rm;
pub(crate) mod run_alias;
pub(crate) mod run_external;
pub(crate) mod save; pub(crate) mod save;
pub(crate) mod select;
pub(crate) mod shells; pub(crate) mod shells;
pub(crate) mod shuffle; pub(crate) mod shuffle;
pub(crate) mod size; pub(crate) mod size;
pub(crate) mod skip; pub(crate) mod skip;
pub(crate) mod skip_while;
pub(crate) mod sort_by; pub(crate) mod sort_by;
pub(crate) mod split;
pub(crate) mod split_by; pub(crate) mod split_by;
pub(crate) mod split_column; pub(crate) mod str_;
pub(crate) mod split_row;
#[allow(unused)]
pub(crate) mod t_sort_by;
pub(crate) mod table; pub(crate) mod table;
pub(crate) mod tags; pub(crate) mod tags;
pub(crate) mod to_bson; pub(crate) mod to;
pub(crate) mod to_csv; pub(crate) mod to_csv;
pub(crate) mod to_html;
pub(crate) mod to_json; pub(crate) mod to_json;
pub(crate) mod to_sqlite; pub(crate) mod to_md;
pub(crate) mod to_toml; pub(crate) mod to_toml;
pub(crate) mod to_tsv; pub(crate) mod to_tsv;
pub(crate) mod to_url; pub(crate) mod to_url;
pub(crate) mod to_xml;
pub(crate) mod to_yaml; pub(crate) mod to_yaml;
pub(crate) mod trim; pub(crate) mod trim;
pub(crate) mod uniq; pub(crate) mod uniq;
pub(crate) mod update;
pub(crate) mod url_;
pub(crate) mod version; pub(crate) mod version;
pub(crate) mod what; pub(crate) mod what;
pub(crate) mod where_; pub(crate) mod where_;
pub(crate) mod which_; pub(crate) mod which_;
pub(crate) mod with_env;
pub(crate) mod wrap; pub(crate) mod wrap;
pub(crate) use autoview::Autoview; pub(crate) use autoview::Autoview;
pub(crate) use cd::Cd; pub(crate) use cd::Cd;
pub(crate) use command::{ pub(crate) use command::{
per_item_command, whole_stream_command, Command, PerItemCommand, UnevaluatedCallInfo, whole_stream_command, Command, Example, UnevaluatedCallInfo, WholeStreamCommand,
WholeStreamCommand,
}; };
pub(crate) use alias::Alias;
pub(crate) use ansi::Ansi;
pub(crate) use append::Append; pub(crate) use append::Append;
pub(crate) use calc::Calc; pub(crate) use autoenv::Autoenv;
pub(crate) use autoenv_trust::AutoenvTrust;
pub(crate) use autoenv_untrust::AutoenvUnTrust;
pub(crate) use benchmark::Benchmark;
pub(crate) use build_string::BuildString;
pub(crate) use cal::Cal;
pub(crate) use char_::Char;
pub(crate) use compact::Compact; pub(crate) use compact::Compact;
pub(crate) use config::Config; pub(crate) use config::{
Config, ConfigClear, ConfigGet, ConfigLoad, ConfigPath, ConfigRemove, ConfigSet, ConfigSetInto,
};
pub(crate) use count::Count; pub(crate) use count::Count;
pub(crate) use cp::Cpy; pub(crate) use cp::Cpy;
pub(crate) use date::Date; pub(crate) use date::Date;
pub(crate) use debug::Debug; pub(crate) use debug::Debug;
pub(crate) use default::Default; pub(crate) use default::Default;
pub(crate) use do_::Do;
pub(crate) use drop::Drop;
pub(crate) use du::Du; pub(crate) use du::Du;
pub(crate) use each::Each;
pub(crate) use echo::Echo; pub(crate) use echo::Echo;
pub(crate) use edit::Edit; pub(crate) use if_::If;
pub(crate) use is_empty::IsEmpty;
pub(crate) use update::Update;
pub(crate) mod kill; pub(crate) mod kill;
pub(crate) use kill::Kill; pub(crate) use kill::Kill;
pub(crate) mod clear; pub(crate) mod clear;
pub(crate) use clear::Clear; pub(crate) use clear::Clear;
pub(crate) mod touch; pub(crate) mod touch;
pub(crate) use enter::Enter; pub(crate) use enter::Enter;
#[allow(unused_imports)] pub(crate) use every::Every;
pub(crate) use evaluate_by::EvaluateBy;
pub(crate) use exit::Exit; pub(crate) use exit::Exit;
pub(crate) use first::First;
pub(crate) use format::Format;
-pub(crate) use from_bson::FromBSON;
+pub(crate) use from::From;
pub(crate) use from_csv::FromCSV;
+pub(crate) use from_eml::FromEML;
+pub(crate) use from_ics::FromIcs;
pub(crate) use from_ini::FromINI;
pub(crate) use from_json::FromJSON;
pub(crate) use from_ods::FromODS;
-pub(crate) use from_sqlite::FromDB;
-pub(crate) use from_sqlite::FromSQLite;
pub(crate) use from_ssv::FromSSV;
pub(crate) use from_toml::FromTOML;
pub(crate) use from_tsv::FromTSV;
pub(crate) use from_url::FromURL;
+pub(crate) use from_vcf::FromVcf;
pub(crate) use from_xlsx::FromXLSX;
pub(crate) use from_xml::FromXML;
pub(crate) use from_yaml::FromYAML;
pub(crate) use from_yaml::FromYML;
pub(crate) use get::Get;
pub(crate) use group_by::GroupBy;
+pub(crate) use group_by_date::GroupByDate;
+pub(crate) use headers::Headers;
pub(crate) use help::Help;
pub(crate) use histogram::Histogram;
pub(crate) use history::History;
pub(crate) use insert::Insert;
+pub(crate) use keep::{Keep, KeepUntil, KeepWhile};
pub(crate) use last::Last;
pub(crate) use lines::Lines;
pub(crate) use ls::Ls;
-#[allow(unused_imports)]
-pub(crate) use map_max_by::MapMaxBy;
+pub(crate) use math::{
+    Math, MathAverage, MathEval, MathMaximum, MathMedian, MathMinimum, MathMode, MathStddev,
+    MathSummation, MathVariance,
+};
+pub(crate) use merge::Merge;
pub(crate) use mkdir::Mkdir;
-pub(crate) use mv::Move;
+pub(crate) use move_::{Move, MoveColumn, Mv};
pub(crate) use next::Next;
pub(crate) use nth::Nth;
pub(crate) use open::Open;
pub(crate) use parse::Parse;
-pub(crate) use pick::Pick;
+pub(crate) use path::{PathBasename, PathCommand, PathExists, PathExpand, PathExtension, PathType};
pub(crate) use pivot::Pivot;
pub(crate) use prepend::Prepend;
pub(crate) use prev::Previous;
pub(crate) use pwd::Pwd;
+#[cfg(feature = "uuid_crate")]
+pub(crate) use random::RandomUUID;
+pub(crate) use random::{Random, RandomBool, RandomDice};
pub(crate) use range::Range;
-#[allow(unused_imports)]
-pub(crate) use reduce_by::ReduceBy;
+pub(crate) use reduce::Reduce;
pub(crate) use reject::Reject;
pub(crate) use rename::Rename;
pub(crate) use reverse::Reverse;
pub(crate) use rm::Remove;
+pub(crate) use run_external::RunExternalCommand;
pub(crate) use save::Save;
+pub(crate) use select::Select;
pub(crate) use shells::Shells;
pub(crate) use shuffle::Shuffle;
pub(crate) use size::Size;
-pub(crate) use skip::Skip;
-pub(crate) use skip_while::SkipWhile;
+pub(crate) use skip::{Skip, SkipUntil, SkipWhile};
pub(crate) use sort_by::SortBy;
+pub(crate) use split::{Split, SplitChars, SplitColumn, SplitRow};
pub(crate) use split_by::SplitBy;
-pub(crate) use split_column::SplitColumn;
-pub(crate) use split_row::SplitRow;
-#[allow(unused_imports)]
-pub(crate) use t_sort_by::TSortBy;
+pub(crate) use str_::{
+    Str, StrCapitalize, StrCollect, StrContains, StrDowncase, StrEndsWith, StrFindReplace, StrFrom,
+    StrIndexOf, StrLength, StrReverse, StrSet, StrStartsWith, StrSubstring, StrToDatetime,
+    StrToDecimal, StrToInteger, StrTrim, StrTrimLeft, StrTrimRight, StrUpcase,
+};
pub(crate) use table::Table;
pub(crate) use tags::Tags;
-pub(crate) use to_bson::ToBSON;
+pub(crate) use to::To;
pub(crate) use to_csv::ToCSV;
+pub(crate) use to_html::ToHTML;
pub(crate) use to_json::ToJSON;
-pub(crate) use to_sqlite::ToDB;
-pub(crate) use to_sqlite::ToSQLite;
+pub(crate) use to_md::ToMarkdown;
pub(crate) use to_toml::ToTOML;
pub(crate) use to_tsv::ToTSV;
pub(crate) use to_url::ToURL;
pub(crate) use to_xml::ToXML;
pub(crate) use to_yaml::ToYAML;
pub(crate) use touch::Touch;
pub(crate) use trim::Trim;
pub(crate) use uniq::Uniq;
+pub(crate) use url_::{UrlCommand, UrlHost, UrlPath, UrlQuery, UrlScheme};
pub(crate) use version::Version;
pub(crate) use what::What;
pub(crate) use where_::Where;
pub(crate) use which_::Which;
+pub(crate) use with_env::WithEnv;
pub(crate) use wrap::Wrap;

View File

@ -0,0 +1,151 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::config;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, CommandAction, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged;
pub struct Alias;
#[derive(Deserialize)]
pub struct AliasArgs {
pub name: Tagged<String>,
pub args: Vec<Value>,
pub block: Block,
pub save: Option<bool>,
}
#[async_trait]
impl WholeStreamCommand for Alias {
fn name(&self) -> &str {
"alias"
}
fn signature(&self) -> Signature {
Signature::build("alias")
.required("name", SyntaxShape::String, "the name of the alias")
.required("args", SyntaxShape::Table, "the arguments to the alias")
.required(
"block",
SyntaxShape::Block,
"the block to run as the body of the alias",
)
.switch("save", "save the alias to your config", Some('s'))
}
fn usage(&self) -> &str {
"Define a shortcut for another command."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
alias(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "An alias without parameters",
example: "alias say-hi [] { echo 'Hello!' }",
result: None,
},
Example {
description: "An alias with a single parameter",
example: "alias l [x] { ls $x }",
result: None,
},
]
}
}
pub async fn alias(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let mut raw_input = args.raw_input.clone();
let (
AliasArgs {
name,
args: list,
block,
save,
},
_ctx,
) = args.process(&registry).await?;
let mut processed_args: Vec<String> = vec![];
if let Some(true) = save {
let mut result = crate::data::config::read(name.clone().tag, &None)?;
// process the alias to remove the --save flag
let left_brace = raw_input.find('{').unwrap_or(0);
let right_brace = raw_input.rfind('}').unwrap_or_else(|| raw_input.len());
let left = raw_input[..left_brace]
.replace("--save", "")
.replace("-s", "");
let right = raw_input[right_brace..]
.replace("--save", "")
.replace("-s", "");
raw_input = format!("{}{}{}", left, &raw_input[left_brace..right_brace], right);
// create a value from raw_input alias
let alias: Value = raw_input.trim().to_string().into();
let alias_start = raw_input.find('[').unwrap_or(0); // used to check if the same alias already exists
// add to startup if alias doesn't exist and replace if it does
match result.get_mut("startup") {
Some(startup) => {
if let UntaggedValue::Table(ref mut commands) = startup.value {
if let Some(command) = commands.iter_mut().find(|command| {
let cmd_str = command.as_string().unwrap_or_default();
cmd_str.starts_with(&raw_input[..alias_start])
}) {
*command = alias;
} else {
commands.push(alias);
}
}
}
None => {
let table = UntaggedValue::table(&[alias]);
result.insert("startup".to_string(), table.into_value(Tag::default()));
}
}
config::write(&result, &None)?;
}
for item in list.iter() {
if let Ok(string) = item.as_string() {
processed_args.push(format!("${}", string));
} else {
return Err(ShellError::labeled_error(
"Expected a string",
"expected a string",
item.tag(),
));
}
}
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::AddAlias(name.to_string(), processed_args, block),
)))
}
#[cfg(test)]
mod tests {
use super::Alias;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Alias {})
}
}
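A minimal standalone sketch of the --save handling above: the flag is stripped only outside the block braces so the alias body itself is preserved verbatim. The helper name and test string here are illustrative, not part of the diff.

// Illustrative helper mirroring the brace-splitting used by `alias --save`.
fn strip_save_flag(raw: &str) -> String {
    let left_brace = raw.find('{').unwrap_or(0);
    let right_brace = raw.rfind('}').unwrap_or(raw.len());
    // Only the text before '{' and after '}' can carry the flag.
    let left = raw[..left_brace].replace("--save", "").replace("-s", "");
    let right = raw[right_brace..].replace("--save", "").replace("-s", "");
    format!("{}{}{}", left, &raw[left_brace..right_brace], right)
}

fn main() {
    let cleaned = strip_save_flag("alias l [x] { ls $x } --save");
    assert_eq!(cleaned.trim(), "alias l [x] { ls $x }");
}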

View File

@ -0,0 +1,148 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
pub struct Ansi;
#[derive(Deserialize)]
struct AnsiArgs {
color: Value,
}
#[async_trait]
impl WholeStreamCommand for Ansi {
fn name(&self) -> &str {
"ansi"
}
fn signature(&self) -> Signature {
Signature::build("ansi").required(
"color",
SyntaxShape::Any,
"the name of the color to use or 'reset' to reset the color",
)
}
fn usage(&self) -> &str {
"Output ANSI codes to change color"
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Change color to green",
example: r#"ansi green"#,
result: Some(vec![Value::from("\u{1b}[32m")]),
},
Example {
description: "Reset the color",
example: r#"ansi reset"#,
result: Some(vec![Value::from("\u{1b}[0m")]),
},
Example {
description:
"Use ansi to color text (rb = red bold, gb = green bold, pb = purple bold)",
example: r#"echo [$(ansi rb) Hello " " $(ansi gb) Nu " " $(ansi pb) World] | str collect"#,
result: Some(vec![Value::from(
"\u{1b}[1;31mHello \u{1b}[1;32mNu \u{1b}[1;35mWorld",
)]),
},
]
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (AnsiArgs { color }, _) = args.process(&registry).await?;
let color_string = color.as_string()?;
let ansi_code = str_to_ansi_color(color_string);
if let Some(output) = ansi_code {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output).into_value(color.tag()),
)))
} else {
Err(ShellError::labeled_error(
"Unknown color",
"unknown color",
color.tag(),
))
}
}
}
fn str_to_ansi_color(s: String) -> Option<String> {
match s.as_str() {
"g" | "green" => Some(ansi_term::Color::Green.prefix().to_string()),
"gb" | "green_bold" => Some(ansi_term::Color::Green.bold().prefix().to_string()),
"gu" | "green_underline" => Some(ansi_term::Color::Green.underline().prefix().to_string()),
"gi" | "green_italic" => Some(ansi_term::Color::Green.italic().prefix().to_string()),
"gd" | "green_dimmed" => Some(ansi_term::Color::Green.dimmed().prefix().to_string()),
"gr" | "green_reverse" => Some(ansi_term::Color::Green.reverse().prefix().to_string()),
"r" | "red" => Some(ansi_term::Color::Red.prefix().to_string()),
"rb" | "red_bold" => Some(ansi_term::Color::Red.bold().prefix().to_string()),
"ru" | "red_underline" => Some(ansi_term::Color::Red.underline().prefix().to_string()),
"ri" | "red_italic" => Some(ansi_term::Color::Red.italic().prefix().to_string()),
"rd" | "red_dimmed" => Some(ansi_term::Color::Red.dimmed().prefix().to_string()),
"rr" | "red_reverse" => Some(ansi_term::Color::Red.reverse().prefix().to_string()),
"u" | "blue" => Some(ansi_term::Color::Blue.prefix().to_string()),
"ub" | "blue_bold" => Some(ansi_term::Color::Blue.bold().prefix().to_string()),
"uu" | "blue_underline" => Some(ansi_term::Color::Blue.underline().prefix().to_string()),
"ui" | "blue_italic" => Some(ansi_term::Color::Blue.italic().prefix().to_string()),
"ud" | "blue_dimmed" => Some(ansi_term::Color::Blue.dimmed().prefix().to_string()),
"ur" | "blue_reverse" => Some(ansi_term::Color::Blue.reverse().prefix().to_string()),
"b" | "black" => Some(ansi_term::Color::Black.prefix().to_string()),
"bb" | "black_bold" => Some(ansi_term::Color::Black.bold().prefix().to_string()),
"bu" | "black_underline" => Some(ansi_term::Color::Black.underline().prefix().to_string()),
"bi" | "black_italic" => Some(ansi_term::Color::Black.italic().prefix().to_string()),
"bd" | "black_dimmed" => Some(ansi_term::Color::Black.dimmed().prefix().to_string()),
"br" | "black_reverse" => Some(ansi_term::Color::Black.reverse().prefix().to_string()),
"y" | "yellow" => Some(ansi_term::Color::Yellow.prefix().to_string()),
"yb" | "yellow_bold" => Some(ansi_term::Color::Yellow.bold().prefix().to_string()),
"yu" | "yellow_underline" => {
Some(ansi_term::Color::Yellow.underline().prefix().to_string())
}
"yi" | "yellow_italic" => Some(ansi_term::Color::Yellow.italic().prefix().to_string()),
"yd" | "yellow_dimmed" => Some(ansi_term::Color::Yellow.dimmed().prefix().to_string()),
"yr" | "yellow_reverse" => Some(ansi_term::Color::Yellow.reverse().prefix().to_string()),
"p" | "purple" => Some(ansi_term::Color::Purple.prefix().to_string()),
"pb" | "purple_bold" => Some(ansi_term::Color::Purple.bold().prefix().to_string()),
"pu" | "purple_underline" => {
Some(ansi_term::Color::Purple.underline().prefix().to_string())
}
"pi" | "purple_italic" => Some(ansi_term::Color::Purple.italic().prefix().to_string()),
"pd" | "purple_dimmed" => Some(ansi_term::Color::Purple.dimmed().prefix().to_string()),
"pr" | "purple_reverse" => Some(ansi_term::Color::Purple.reverse().prefix().to_string()),
"c" | "cyan" => Some(ansi_term::Color::Cyan.prefix().to_string()),
"cb" | "cyan_bold" => Some(ansi_term::Color::Cyan.bold().prefix().to_string()),
"cu" | "cyan_underline" => Some(ansi_term::Color::Cyan.underline().prefix().to_string()),
"ci" | "cyan_italic" => Some(ansi_term::Color::Cyan.italic().prefix().to_string()),
"cd" | "cyan_dimmed" => Some(ansi_term::Color::Cyan.dimmed().prefix().to_string()),
"cr" | "cyan_reverse" => Some(ansi_term::Color::Cyan.reverse().prefix().to_string()),
"w" | "white" => Some(ansi_term::Color::White.prefix().to_string()),
"wb" | "white_bold" => Some(ansi_term::Color::White.bold().prefix().to_string()),
"wu" | "white_underline" => Some(ansi_term::Color::White.underline().prefix().to_string()),
"wi" | "white_italic" => Some(ansi_term::Color::White.italic().prefix().to_string()),
"wd" | "white_dimmed" => Some(ansi_term::Color::White.dimmed().prefix().to_string()),
"wr" | "white_reverse" => Some(ansi_term::Color::White.reverse().prefix().to_string()),
"reset" => Some("\x1b[0m".to_owned()),
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::Ansi;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Ansi {})
}
}
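A small sketch, assuming ansi_term 0.12, showing that the prefix strings returned above are the same escape sequences the command's examples expect:

fn main() {
    // Bold green prefix as produced by ansi_term; matches the "ansi gb" example output.
    let gb = ansi_term::Color::Green.bold().prefix().to_string();
    assert_eq!(gb, "\u{1b}[1;32m");
    println!("{}Nu{}", gb, "\u{1b}[0m"); // print "Nu" in bold green, then reset
}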

View File

@ -2,7 +2,7 @@ use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
-use nu_protocol::{Signature, SyntaxShape, Value};
+use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
#[derive(Deserialize)]
struct AppendArgs {
@ -11,6 +11,7 @@ struct AppendArgs {
pub struct Append;
+#[async_trait]
impl WholeStreamCommand for Append {
fn name(&self) -> &str {
"append"
@ -28,22 +29,40 @@ impl WholeStreamCommand for Append {
"Append the given row to the table" "Append the given row to the table"
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, append)?.run() let (AppendArgs { row }, input) = args.process(registry).await?;
let eos = futures::stream::iter(vec![row]);
Ok(input.chain(eos).to_output_stream())
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Add something to the end of a list or table",
example: "echo [1 2 3] | append 4",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
UntaggedValue::int(3).into(),
UntaggedValue::int(4).into(),
]),
}]
} }
} }
fn append( #[cfg(test)]
AppendArgs { row }: AppendArgs, mod tests {
RunnableContext { input, .. }: RunnableContext, use super::Append;
) -> Result<OutputStream, ShellError> {
let mut after: VecDeque<Value> = VecDeque::new();
after.push_back(row);
let after = futures::stream::iter(after);
Ok(OutputStream::from_input(input.values.chain(after))) #[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Append {})
}
} }
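A sketch of the stream-append pattern introduced above, using plain futures streams in place of nu's InputStream/OutputStream wrappers:

use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    // Chain a one-element "end of stream" onto the existing input, as `append` does.
    let input = stream::iter(vec![1, 2, 3]);
    let eos = stream::iter(vec![4]);
    let appended: Vec<i32> = block_on(input.chain(eos).collect());
    assert_eq!(appended, vec![1, 2, 3, 4]);
}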

View File

@ -0,0 +1,92 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
use serde::Deserialize;
use serde::Serialize;
use sha2::{Digest, Sha256};
use std::io::Read;
use std::path::PathBuf;
pub struct Autoenv;
#[derive(Deserialize, Serialize, Debug, Default)]
pub struct Trusted {
pub files: IndexMap<String, Vec<u8>>,
}
impl Trusted {
pub fn new() -> Self {
Trusted {
files: IndexMap::new(),
}
}
}
pub fn file_is_trusted(nu_env_file: &PathBuf, content: &[u8]) -> Result<bool, ShellError> {
let contentdigest = Sha256::digest(&content).as_slice().to_vec();
let nufile = std::fs::canonicalize(nu_env_file)?;
let trusted = read_trusted()?;
Ok(trusted.files.get(&nufile.to_string_lossy().to_string()) == Some(&contentdigest))
}
pub fn read_trusted() -> Result<Trusted, ShellError> {
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let mut file = std::fs::OpenOptions::new()
.read(true)
.create(true)
.write(true)
.open(config_path)
.map_err(|_| ShellError::untagged_runtime_error("Couldn't open nu-env.toml"))?;
let mut doc = String::new();
file.read_to_string(&mut doc)?;
let allowed = toml::de::from_str(doc.as_str()).unwrap_or_else(|_| Trusted::new());
Ok(allowed)
}
#[async_trait]
impl WholeStreamCommand for Autoenv {
fn name(&self) -> &str {
"autoenv"
}
fn usage(&self) -> &str {
// "Mark a .nu-env file in a directory as trusted. Needs to be re-run after each change to the file or its filepath."
r#"Manage directory specific environment variables and scripts. Create a file called .nu-env in any directory and run 'autoenv trust' to let nushell read it when entering the directory.
The file can contain several optional sections:
env: environment variables to set when visiting the directory. The variables are unset after leaving the directory and any overwritten values are restored.
scriptvars: environment variables that should be set to the return value of a script. After they have been set, they behave in the same way as variables set in the env section.
scripts: scripts to run when entering the directory or leaving it. Note that exitscripts are not run in the directory they are declared in."#
}
fn signature(&self) -> Signature {
Signature::build("autoenv")
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(crate::commands::help::get_help(&Autoenv, &registry))
.into_value(Tag::unknown()),
)))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Example .nu-env file",
example: r#"cat .nu-env
[env]
mykey = "myvalue"
[scriptvars]
myscript = "echo myval"
[scripts]
entryscripts = ["touch hello.txt", "touch hello2.txt"]
exitscripts = ["touch bye.txt"]"#,
result: None,
}]
}
}
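A sketch of the trust model above, with a plain HashMap standing in for IndexMap and an illustrative path: a directory's .nu-env is trusted only while the SHA-256 of its current contents matches the recorded digest.

use sha2::{Digest, Sha256};
use std::collections::HashMap;

fn main() {
    let content = b"[env]\nmykey = \"myvalue\"\n";
    let mut trusted: HashMap<String, Vec<u8>> = HashMap::new();
    trusted.insert("/some/dir/.nu-env".to_string(), Sha256::digest(content).to_vec());

    // Re-hash on entry; any edit to the file changes the digest and drops the trust.
    let still_trusted =
        trusted.get("/some/dir/.nu-env") == Some(&Sha256::digest(content).to_vec());
    assert!(still_trusted);
}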

View File

@ -0,0 +1,83 @@
use super::autoenv::read_trusted;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::SyntaxShape;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use sha2::{Digest, Sha256};
use std::{fs, path::PathBuf};
pub struct AutoenvTrust;
#[async_trait]
impl WholeStreamCommand for AutoenvTrust {
fn name(&self) -> &str {
"autoenv trust"
}
fn signature(&self) -> Signature {
Signature::build("autoenv trust").optional("dir", SyntaxShape::String, "Directory to allow")
}
fn usage(&self) -> &str {
"Trust a .nu-env file in the current or given directory"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let file_to_trust = match args.call_info.evaluate(registry).await?.args.nth(0) {
Some(Value {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
dir.push(".nu-env");
dir
}
_ => {
let mut dir = fs::canonicalize(std::env::current_dir()?)?;
dir.push(".nu-env");
dir
}
};
let content = std::fs::read(&file_to_trust)?;
let filename = file_to_trust.to_string_lossy().to_string();
let mut allowed = read_trusted()?;
allowed
.files
.insert(filename, Sha256::digest(&content).as_slice().to_vec());
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let tomlstr = toml::to_string(&allowed).map_err(|_| {
ShellError::untagged_runtime_error("Couldn't serialize allowed dirs to nu-env.toml")
})?;
fs::write(config_path, tomlstr).expect("Couldn't write to toml file");
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(".nu-env trusted!").into_value(tag),
)))
}
fn is_binary(&self) -> bool {
false
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Allow .nu-env file in current directory",
example: "autoenv trust",
result: None,
},
Example {
description: "Allow .nu-env file in directory foo",
example: "autoenv trust foo",
result: None,
},
]
}
}

View File

@ -0,0 +1,107 @@
use super::autoenv::Trusted;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::SyntaxShape;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use std::io::Read;
use std::{fs, path::PathBuf};
pub struct AutoenvUnTrust;
#[async_trait]
impl WholeStreamCommand for AutoenvUnTrust {
fn name(&self) -> &str {
"autoenv untrust"
}
fn signature(&self) -> Signature {
Signature::build("autoenv untrust").optional(
"dir",
SyntaxShape::String,
"Directory to disallow",
)
}
fn usage(&self) -> &str {
"Untrust a .nu-env file in the current or given directory"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let file_to_untrust = match args.call_info.evaluate(registry).await?.args.nth(0) {
Some(Value {
value: UntaggedValue::Primitive(Primitive::String(ref path)),
tag: _,
}) => {
let mut dir = fs::canonicalize(path)?;
dir.push(".nu-env");
dir
}
_ => {
let mut dir = std::env::current_dir()?;
dir.push(".nu-env");
dir
}
};
let config_path = config::default_path_for(&Some(PathBuf::from("nu-env.toml")))?;
let mut file = match std::fs::OpenOptions::new()
.read(true)
.create(true)
.write(true)
.open(config_path.clone())
{
Ok(p) => p,
Err(_) => {
return Err(ShellError::untagged_runtime_error(
"Couldn't open nu-env.toml",
));
}
};
let mut doc = String::new();
file.read_to_string(&mut doc)?;
let mut allowed: Trusted = toml::from_str(doc.as_str()).unwrap_or_else(|_| Trusted::new());
let file_to_untrust = file_to_untrust.to_string_lossy().to_string();
if allowed.files.remove(&file_to_untrust).is_none() {
return Err(ShellError::untagged_runtime_error(
"No .nu-env file to untrust in the given directory. Is it missing, or already untrusted?",
));
}
let tomlstr = toml::to_string(&allowed).map_err(|_| {
ShellError::untagged_runtime_error("Couldn't serialize allowed dirs to nu-env.toml")
})?;
fs::write(config_path, tomlstr).expect("Couldn't write to toml file");
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(".nu-env untrusted!").into_value(tag),
)))
}
fn is_binary(&self) -> bool {
false
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Disallow .nu-env file in current directory",
example: "autoenv untrust",
result: None,
},
Example {
description: "Disallow .nu-env file in directory foo",
example: "autoenv untrust foo",
result: None,
},
]
}
}
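A sketch of the nu-env.toml round trip used by trust/untrust, with BTreeMap swapped in for IndexMap to keep the sketch dependency-light:

use serde::{Deserialize, Serialize};
use std::collections::BTreeMap;

#[derive(Serialize, Deserialize, Debug, Default, PartialEq)]
struct Trusted {
    files: BTreeMap<String, Vec<u8>>,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut trusted = Trusted::default();
    trusted.files.insert("/some/dir/.nu-env".into(), vec![1, 2, 3]);

    // Serialize to TOML, then read it back; untrust simply removes the map entry.
    let text = toml::to_string(&trusted)?;
    let back: Trusted = toml::from_str(&text)?;
    assert_eq!(back, trusted);
    Ok(())
}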

View File

@ -1,14 +1,16 @@
use crate::commands::UnevaluatedCallInfo;
use crate::commands::WholeStreamCommand;
+use crate::data::value::format_leaf;
use crate::prelude::*;
use nu_errors::ShellError;
-use nu_parser::{hir, hir::Expression, hir::Literal, hir::SpannedExpression};
-use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
+use nu_protocol::hir::{self, Expression, ExternalRedirection, Literal, SpannedExpression};
+use nu_protocol::{Primitive, Scope, Signature, UntaggedValue, Value};
+use parking_lot::Mutex;
use std::sync::atomic::AtomicBool;
-use std::sync::atomic::Ordering;
pub struct Autoview;
+#[async_trait]
impl WholeStreamCommand for Autoview {
fn name(&self) -> &str {
"autoview"
@ -22,29 +24,46 @@ impl WholeStreamCommand for Autoview {
"View the contents of the pipeline as a table or list." "View the contents of the pipeline as a table or list."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
autoview(RunnableContext { autoview(RunnableContext {
input: args.input, input: args.input,
commands: registry.clone(), registry: registry.clone(),
shell_manager: args.shell_manager, shell_manager: args.shell_manager,
host: args.host, host: args.host,
source: args.call_info.source,
ctrl_c: args.ctrl_c, ctrl_c: args.ctrl_c,
current_errors: args.current_errors,
name: args.call_info.name_tag, name: args.call_info.name_tag,
raw_input: args.raw_input,
}) })
.await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Automatically view the results",
example: "ls | autoview",
result: None,
},
Example {
description: "Autoview is also implied. The above can be written as",
example: "ls",
result: None,
},
]
} }
} }
pub struct RunnableContextWithoutInput { pub struct RunnableContextWithoutInput {
pub shell_manager: ShellManager, pub shell_manager: ShellManager,
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>, pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub source: Text, pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub ctrl_c: Arc<AtomicBool>, pub ctrl_c: Arc<AtomicBool>,
pub commands: CommandRegistry, pub registry: CommandRegistry,
pub name: Tag, pub name: Tag,
} }
@ -53,49 +72,56 @@ impl RunnableContextWithoutInput {
let new_context = RunnableContextWithoutInput {
shell_manager: context.shell_manager,
host: context.host,
-source: context.source,
ctrl_c: context.ctrl_c,
-commands: context.commands,
+current_errors: context.current_errors,
+registry: context.registry,
name: context.name,
};
(context.input, new_context)
}
}
-pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
+pub async fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
let binary = context.get_command("binaryview");
let text = context.get_command("textview");
let table = context.get_command("table");
-Ok(OutputStream::new(async_stream! {
-let (mut input_stream, context) = RunnableContextWithoutInput::convert(context);
-match input_stream.next().await {
-Some(x) => {
+#[derive(PartialEq)]
+enum AutoPivotMode {
+Auto,
+Always,
+Never,
+}
+let pivot_mode = crate::data::config::config(Tag::unknown());
+let pivot_mode = if let Some(v) = pivot_mode?.get("pivot_mode") {
+match v.as_string() {
+Ok(m) if m.to_lowercase() == "auto" => AutoPivotMode::Auto,
+Ok(m) if m.to_lowercase() == "always" => AutoPivotMode::Always,
+Ok(m) if m.to_lowercase() == "never" => AutoPivotMode::Never,
+_ => AutoPivotMode::Always,
+}
+} else {
+AutoPivotMode::Always
+};
+let (mut input_stream, context) = RunnableContextWithoutInput::convert(context);
+let term_width = context.host.lock().width();
+if let Some(x) = input_stream.next().await {
match input_stream.next().await {
Some(y) => {
let ctrl_c = context.ctrl_c.clone();
-let stream = async_stream! {
-yield Ok(x);
-yield Ok(y);
-loop {
-match input_stream.next().await {
-Some(z) => {
-if ctrl_c.load(Ordering::SeqCst) {
-break;
-}
-yield Ok(z);
-}
-_ => break,
-}
-}
-};
-let stream = stream.to_input_stream();
+let xy = vec![x, y];
+let xy_stream = futures::stream::iter(xy)
+.chain(input_stream)
+.interruptible(ctrl_c);
+let stream = InputStream::from_stream(xy_stream);
if let Some(table) = table {
let command_args = create_default_command_args(&context).with_input(stream);
-let result = table.run(command_args, &context.commands);
+let result = table.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
}
}
@ -107,9 +133,12 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
} if anchor.is_some() => {
if let Some(text) = text {
let mut stream = VecDeque::new();
-stream.push_back(UntaggedValue::string(s).into_value(Tag { anchor, span }));
-let command_args = create_default_command_args(&context).with_input(stream);
-let result = text.run(command_args, &context.commands);
+stream.push_back(
+UntaggedValue::string(s).into_value(Tag { anchor, span }),
+);
+let command_args =
+create_default_command_args(&context).with_input(stream);
+let result = text.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{}", s);
@ -127,9 +156,12 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
} if anchor.is_some() => {
if let Some(text) = text {
let mut stream = VecDeque::new();
-stream.push_back(UntaggedValue::string(s).into_value(Tag { anchor, span }));
-let command_args = create_default_command_args(&context).with_input(stream);
-let result = text.run(command_args, &context.commands);
+stream.push_back(
+UntaggedValue::string(s).into_value(Tag { anchor, span }),
+);
+let command_args =
+create_default_command_args(&context).with_input(stream);
+let result = text.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{}\n", s);
@ -157,15 +189,54 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
value: UntaggedValue::Primitive(Primitive::Decimal(n)),
..
} => {
-out!("{}", n);
+// TODO: normalize decimal to remove trailing zeros.
// normalization will be available in next release of bigdecimal crate
let mut output = n.to_string();
if output.contains('.') {
output = output.trim_end_matches('0').to_owned();
}
if output.ends_with('.') {
output.push('0');
}
out!("{}", output);
}
Value {
value: UntaggedValue::Primitive(Primitive::Boolean(b)),
..
} => {
out!("{}", b);
}
Value {
value: UntaggedValue::Primitive(Primitive::Duration(_)),
..
} => {
let output = format_leaf(&x).plain_string(100_000);
out!("{}", output);
}
Value {
value: UntaggedValue::Primitive(Primitive::Date(d)),
..
} => {
out!("{}", d);
}
Value {
value: UntaggedValue::Primitive(Primitive::Range(_)),
..
} => {
let output = format_leaf(&x).plain_string(100_000);
out!("{}", output);
} }
-Value { value: UntaggedValue::Primitive(Primitive::Binary(ref b)), .. } => {
+Value {
+value: UntaggedValue::Primitive(Primitive::Binary(ref b)),
+..
+} => {
if let Some(binary) = binary {
let mut stream = VecDeque::new();
stream.push_back(x);
-let command_args = create_default_command_args(&context).with_input(stream);
-let result = binary.run(command_args, &context.commands);
+let command_args =
+create_default_command_args(&context).with_input(stream);
+let result = binary.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
use pretty_hex::*;
@ -173,15 +244,61 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
}
}
-Value { value: UntaggedValue::Error(e), .. } => {
-yield Err(e);
+Value {
+value: UntaggedValue::Error(e),
+..
+} => {
+return Err(e);
}
-Value { value: ref item, .. } => {
Value {
value: UntaggedValue::Row(row),
..
} if pivot_mode == AutoPivotMode::Always
|| (pivot_mode == AutoPivotMode::Auto
&& (row
.entries
.iter()
.map(|(_, v)| v.convert_to_string())
.collect::<Vec<_>>()
.iter()
.fold(0usize, |acc, len| acc + len.len())
+ row.entries.iter().count() * 2)
> term_width) =>
{
let mut entries = vec![];
for (key, value) in row.entries.iter() {
entries.push(vec![
nu_table::StyledString::new(
key.to_string(),
nu_table::TextStyle {
alignment: nu_table::Alignment::Left,
color: Some(ansi_term::Color::Green),
is_bold: true,
},
),
nu_table::StyledString::new(
format_leaf(value).plain_string(100_000),
nu_table::TextStyle::basic(),
),
]);
}
let table =
nu_table::Table::new(vec![], entries, nu_table::Theme::compact());
nu_table::draw_table(&table, term_width);
}
Value {
value: ref item, ..
} => {
if let Some(table) = table {
let mut stream = VecDeque::new();
stream.push_back(x);
-let command_args = create_default_command_args(&context).with_input(stream);
-let result = table.run(command_args, &context.commands);
+let command_args =
+create_default_command_args(&context).with_input(stream);
+let result = table.run(command_args, &context.registry).await?;
result.collect::<Vec<_>>().await;
} else {
out!("{:?}", item);
@ -191,16 +308,8 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
}
}
}
-_ => {
-//out!("<no results>");
-}
-}
-// Needed for async_stream to type check
-if false {
-yield ReturnSuccess::value(UntaggedValue::nothing().into_untagged_value());
-}
-}))
+Ok(OutputStream::empty())
}
fn create_default_command_args(context: &RunnableContextWithoutInput) -> RawCommandArgs {
@ -208,19 +317,33 @@ fn create_default_command_args(context: &RunnableContextWithoutInput) -> RawComm
RawCommandArgs {
host: context.host.clone(),
ctrl_c: context.ctrl_c.clone(),
+current_errors: context.current_errors.clone(),
shell_manager: context.shell_manager.clone(),
call_info: UnevaluatedCallInfo {
args: hir::Call {
head: Box::new(SpannedExpression::new(
-Expression::Literal(Literal::String(span)),
+Expression::Literal(Literal::String(String::new())),
span,
)),
positional: None,
named: None,
span,
+external_redirection: ExternalRedirection::Stdout,
},
-source: context.source.clone(),
name_tag: context.name.clone(),
+scope: Scope::new(),
},
}
}
#[cfg(test)]
mod tests {
use super::Autoview;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Autoview {})
}
}
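A sketch of the auto-pivot heuristic above: in "auto" mode a single row is pivoted when its rendered cells, plus two characters of padding per column, would not fit the terminal width. The helper name here is illustrative.

fn should_pivot(cells: &[&str], term_width: usize) -> bool {
    // Sum of rendered cell widths plus two padding characters per column.
    let rendered: usize = cells.iter().map(|c| c.len()).sum::<usize>() + cells.len() * 2;
    rendered > term_width
}

fn main() {
    assert!(!should_pivot(&["ls", "0.18.1"], 80));
    assert!(should_pivot(&["a very long cell value", "another very long cell value"], 40));
}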

View File

@ -0,0 +1,78 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use chrono::prelude::*;
pub struct Benchmark;
#[derive(Deserialize, Debug)]
struct BenchmarkArgs {
block: Block,
}
#[async_trait]
impl WholeStreamCommand for Benchmark {
fn name(&self) -> &str {
"benchmark"
}
fn signature(&self) -> Signature {
Signature::build("benchmark").required(
"block",
SyntaxShape::Block,
"the block to run and benchmark",
)
}
fn usage(&self) -> &str {
"Runs a block and return the time it took to do execute it. Eg) benchmark { echo $nu.env.NAME }"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
benchmark(args, registry).await
}
}
async fn benchmark(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let mut context = Context::from_raw(&raw_args, &registry);
let scope = raw_args.call_info.scope.clone();
let (BenchmarkArgs { block }, input) = raw_args.process(&registry).await?;
let start_time: chrono::DateTime<_> = Utc::now();
let result = run_block(
&block,
&mut context,
input,
&scope.it,
&scope.vars,
&scope.env,
)
.await;
let _ = result?.drain_vec().await;
let run_duration: chrono::Duration = Utc::now().signed_duration_since(start_time);
context.clear_errors();
let output = Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Primitive(Primitive::from(run_duration)),
tag: Tag::from(block.span),
}));
Ok(OutputStream::from(vec![output]))
}
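A sketch of the timing approach above: take Utc::now() before and after the work and report the signed difference as a chrono::Duration.

use chrono::Utc;

fn main() {
    let start = Utc::now();
    let _work: u64 = (0..1_000_000u64).sum(); // stand-in for running the block
    let elapsed: chrono::Duration = Utc::now().signed_duration_since(start);
    println!("took {} ms", elapsed.num_milliseconds());
}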

View File

@ -0,0 +1,56 @@
use crate::prelude::*;
use nu_errors::ShellError;
use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
#[derive(Deserialize)]
pub struct BuildStringArgs {
rest: Vec<Value>,
}
pub struct BuildString;
#[async_trait]
impl WholeStreamCommand for BuildString {
fn name(&self) -> &str {
"build-string"
}
fn signature(&self) -> Signature {
Signature::build("build-string")
.rest(SyntaxShape::Any, "all values to form into the string")
}
fn usage(&self) -> &str {
"Builds a string from the arguments"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let (BuildStringArgs { rest }, _) = args.process(&registry).await?;
let mut output_string = String::new();
for r in rest {
output_string.push_str(&format_leaf(&r).plain_string(100_000))
}
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output_string).into_value(tag),
)))
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Builds a string from a string and a number, without spaces between them",
example: "build-string 'foo' 3",
result: None,
}]
}
}

View File

@ -0,0 +1,349 @@
use crate::commands::{command::EvaluatedWholeStreamCommandArgs, WholeStreamCommand};
use crate::prelude::*;
use chrono::{Datelike, Local, NaiveDate};
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{Dictionary, Signature, SyntaxShape, UntaggedValue, Value};
pub struct Cal;
#[async_trait]
impl WholeStreamCommand for Cal {
fn name(&self) -> &str {
"cal"
}
fn signature(&self) -> Signature {
Signature::build("cal")
.switch("year", "Display the year column", Some('y'))
.switch("quarter", "Display the quarter column", Some('q'))
.switch("month", "Display the month column", Some('m'))
.named(
"full-year",
SyntaxShape::Int,
"Display a year-long calendar for the specified year",
None,
)
.named(
"week-start",
SyntaxShape::String,
"Display the calendar with the specified day as the first day of the week",
None,
)
.switch(
"month-names",
"Display the month names instead of integers",
None,
)
}
fn usage(&self) -> &str {
"Display a calendar."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
cal(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "This month's calendar",
example: "cal",
result: None,
},
Example {
description: "The calendar for all of 2012",
example: "cal --full-year 2012",
result: None,
},
Example {
description: "This month's calendar with the week starting on monday",
example: "cal --week-start monday",
result: None,
},
]
}
}
pub async fn cal(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let mut calendar_vec_deque = VecDeque::new();
let tag = args.call_info.name_tag.clone();
let (current_year, current_month, current_day) = get_current_date();
let mut selected_year: i32 = current_year;
let mut current_day_option: Option<u32> = Some(current_day);
let month_range = if let Some(full_year_value) = args.get("full-year") {
if let Ok(year_u64) = full_year_value.as_u64() {
selected_year = year_u64 as i32;
if selected_year != current_year {
current_day_option = None
}
} else {
return Err(get_invalid_year_shell_error(&full_year_value.tag()));
}
(1, 12)
} else {
(current_month, current_month)
};
add_months_of_year_to_table(
&args,
&mut calendar_vec_deque,
&tag,
selected_year,
month_range,
current_month,
current_day_option,
)?;
Ok(futures::stream::iter(calendar_vec_deque).to_output_stream())
}
fn get_invalid_year_shell_error(year_tag: &Tag) -> ShellError {
ShellError::labeled_error("The year is invalid", "invalid year", year_tag)
}
struct MonthHelper {
selected_year: i32,
selected_month: u32,
day_number_of_week_month_starts_on: u32,
number_of_days_in_month: u32,
quarter_number: u32,
month_name: String,
}
impl MonthHelper {
pub fn new(selected_year: i32, selected_month: u32) -> Result<MonthHelper, ()> {
let naive_date = NaiveDate::from_ymd_opt(selected_year, selected_month, 1).ok_or(())?;
let number_of_days_in_month =
MonthHelper::calculate_number_of_days_in_month(selected_year, selected_month)?;
Ok(MonthHelper {
selected_year,
selected_month,
day_number_of_week_month_starts_on: naive_date.weekday().num_days_from_sunday(),
number_of_days_in_month,
quarter_number: ((selected_month - 1) / 3) + 1,
month_name: naive_date.format("%B").to_string().to_ascii_lowercase(),
})
}
fn calculate_number_of_days_in_month(
mut selected_year: i32,
mut selected_month: u32,
) -> Result<u32, ()> {
// Chrono does not provide a method to output the amount of days in a month
// This is a workaround taken from the example code from the Chrono docs here:
// https://docs.rs/chrono/0.3.0/chrono/naive/date/struct.NaiveDate.html#example-30
if selected_month == 12 {
selected_year += 1;
selected_month = 1;
} else {
selected_month += 1;
};
let next_month_naive_date =
NaiveDate::from_ymd_opt(selected_year, selected_month, 1).ok_or(())?;
Ok(next_month_naive_date.pred().day())
}
}
fn get_current_date() -> (i32, u32, u32) {
let local_now_date = Local::now().date();
let current_year: i32 = local_now_date.year();
let current_month: u32 = local_now_date.month();
let current_day: u32 = local_now_date.day();
(current_year, current_month, current_day)
}
fn add_months_of_year_to_table(
args: &EvaluatedWholeStreamCommandArgs,
mut calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
(start_month, end_month): (u32, u32),
current_month: u32,
current_day_option: Option<u32>,
) -> Result<(), ShellError> {
for month_number in start_month..=end_month {
let mut new_current_day_option: Option<u32> = None;
if let Some(current_day) = current_day_option {
if month_number == current_month {
new_current_day_option = Some(current_day)
}
}
let add_month_to_table_result = add_month_to_table(
&args,
&mut calendar_vec_deque,
&tag,
selected_year,
month_number,
new_current_day_option,
);
add_month_to_table_result?
}
Ok(())
}
fn add_month_to_table(
args: &EvaluatedWholeStreamCommandArgs,
calendar_vec_deque: &mut VecDeque<Value>,
tag: &Tag,
selected_year: i32,
current_month: u32,
current_day_option: Option<u32>,
) -> Result<(), ShellError> {
let month_helper_result = MonthHelper::new(selected_year, current_month);
let month_helper = match month_helper_result {
Ok(month_helper) => month_helper,
Err(()) => match args.get("full-year") {
Some(full_year_value) => {
return Err(get_invalid_year_shell_error(&full_year_value.tag()))
}
None => {
return Err(ShellError::labeled_error(
"Issue parsing command",
"invalid command",
tag,
))
}
},
};
let mut days_of_the_week = [
"sunday",
"monday",
"tuesday",
"wednesday",
"thursday",
"friday",
"saturday",
];
let mut week_start_day = days_of_the_week[0].to_string();
if let Some(week_start_value) = args.get("week-start") {
if let Ok(day) = week_start_value.as_string() {
if days_of_the_week.contains(&day.as_str()) {
week_start_day = day;
} else {
return Err(ShellError::labeled_error(
"The specified week start day is invalid",
"invalid week start day",
week_start_value.tag(),
));
}
}
}
let week_start_day_offset = days_of_the_week.len()
- days_of_the_week
.iter()
.position(|day| *day == week_start_day)
.unwrap_or(0);
days_of_the_week.rotate_right(week_start_day_offset);
let mut total_start_offset: u32 =
month_helper.day_number_of_week_month_starts_on + week_start_day_offset as u32;
total_start_offset %= days_of_the_week.len() as u32;
let mut day_number: u32 = 1;
let day_limit: u32 = total_start_offset + month_helper.number_of_days_in_month;
let should_show_year_column = args.has("year");
let should_show_quarter_column = args.has("quarter");
let should_show_month_column = args.has("month");
let should_show_month_names = args.has("month-names");
while day_number <= day_limit {
let mut indexmap = IndexMap::new();
if should_show_year_column {
indexmap.insert(
"year".to_string(),
UntaggedValue::int(month_helper.selected_year).into_value(tag),
);
}
if should_show_quarter_column {
indexmap.insert(
"quarter".to_string(),
UntaggedValue::int(month_helper.quarter_number).into_value(tag),
);
}
if should_show_month_column || should_show_month_names {
let month_value = if should_show_month_names {
UntaggedValue::string(month_helper.month_name.clone()).into_value(tag)
} else {
UntaggedValue::int(month_helper.selected_month).into_value(tag)
};
indexmap.insert("month".to_string(), month_value);
}
for day in &days_of_the_week {
let should_add_day_number_to_table =
(day_number > total_start_offset) && (day_number <= day_limit);
let mut value = UntaggedValue::nothing().into_value(tag);
if should_add_day_number_to_table {
let adjusted_day_number = day_number - total_start_offset;
value = UntaggedValue::int(adjusted_day_number).into_value(tag);
if let Some(current_day) = current_day_option {
if current_day == adjusted_day_number {
// TODO: Update the value here with a color when color support is added
// This colors the current day
}
}
}
indexmap.insert((*day).to_string(), value);
day_number += 1;
}
calendar_vec_deque
.push_back(UntaggedValue::Row(Dictionary::from(indexmap)).into_value(tag));
}
Ok(())
}
#[cfg(test)]
mod tests {
use super::Cal;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Cal {})
}
}
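A sketch of the days-in-month workaround used by cal above: the last day of month M is the day before the first day of month M+1, rolling the year over for December.

use chrono::{Datelike, NaiveDate};

fn days_in_month(mut year: i32, mut month: u32) -> Option<u32> {
    if month == 12 {
        year += 1;
        month = 1;
    } else {
        month += 1;
    }
    // Day before the first of the following month.
    Some(NaiveDate::from_ymd_opt(year, month, 1)?.pred().day())
}

fn main() {
    assert_eq!(days_in_month(2020, 2), Some(29)); // leap year
    assert_eq!(days_in_month(2019, 2), Some(28));
}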

View File

@ -1,63 +0,0 @@
use crate::commands::PerItemCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{CallInfo, Primitive, ReturnSuccess, UntaggedValue, Value};
pub struct Calc;
impl PerItemCommand for Calc {
fn name(&self) -> &str {
"calc"
}
fn usage(&self) -> &str {
"Parse a math expression into a number"
}
fn run(
&self,
_call_info: &CallInfo,
_registry: &CommandRegistry,
raw_args: &RawCommandArgs,
input: Value,
) -> Result<OutputStream, ShellError> {
calc(input, raw_args)
}
}
fn calc(input: Value, args: &RawCommandArgs) -> Result<OutputStream, ShellError> {
let name_span = &args.call_info.name_tag.span;
let output = if let Ok(string) = input.as_string() {
match parse(&string, &input.tag) {
Ok(value) => ReturnSuccess::value(value),
Err(err) => Err(ShellError::labeled_error(
"Calculation error",
err,
&input.tag.span,
)),
}
} else {
Err(ShellError::labeled_error(
"Expected a string from pipeline",
"requires string input",
name_span,
))
};
Ok(vec![output].into())
}
pub fn parse(math_expression: &str, tag: impl Into<Tag>) -> Result<Value, String> {
use std::f64;
let num = meval::eval_str(math_expression);
match num {
Ok(num) => {
if num == f64::INFINITY || num == f64::NEG_INFINITY {
return Err(String::from("cannot represent result"));
}
Ok(UntaggedValue::from(Primitive::from(num)).into_value(tag))
}
Err(error) => Err(error.to_string()),
}
}
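A sketch of the evaluation path in the removed calc command: meval parses the expression and non-finite results are rejected rather than passed downstream. The helper name is illustrative.

fn eval_expression(expr: &str) -> Result<f64, String> {
    let num = meval::eval_str(expr).map_err(|e| e.to_string())?;
    if num.is_infinite() {
        return Err(String::from("cannot represent result"));
    }
    Ok(num)
}

fn main() {
    assert_eq!(eval_expression("2 + 3 * 4"), Ok(14.0));
    assert!(eval_expression("1 / 0").is_err()); // evaluates to infinity, rejected
}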

View File

@ -1,45 +1,82 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
+use std::path::PathBuf;
use nu_errors::ShellError;
-use nu_macros::signature;
use nu_protocol::{Signature, SyntaxShape};
+use nu_source::Tagged;
+#[derive(Deserialize)]
+pub struct CdArgs {
+pub(crate) path: Option<Tagged<PathBuf>>,
+}
pub struct Cd;
+#[async_trait]
impl WholeStreamCommand for Cd {
fn name(&self) -> &str {
"cd"
}
fn signature(&self) -> Signature {
-signature! {
-def cd {
-"the directory to change to"
-directory(optional Path) - "the directory to change to"
-}
-}
-// Signature::build("cd").optional(
-// "directory",
-// SyntaxShape::Path,
-// "the directory to change to",
-// )
+Signature::build("cd").optional(
+"directory",
+SyntaxShape::Path,
+"the directory to change to",
+)
}
fn usage(&self) -> &str {
"Change to a new path."
}
-fn run(
+async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
-cd(args, registry)
+let name = args.call_info.name_tag.clone();
+let shell_manager = args.shell_manager.clone();
+let (args, _): (CdArgs, _) = args.process(&registry).await?;
+shell_manager.cd(args, name)
+}
+fn examples(&self) -> Vec<Example> {
+vec![
+Example {
+description: "Change to a new directory called 'dirname'",
+example: "cd dirname",
+result: None,
+},
+Example {
+description: "Change to your home directory",
+example: "cd",
+result: None,
+},
+Example {
+description: "Change to your home directory (alternate version)",
+example: "cd ~",
+result: None,
+},
+Example {
+description: "Change to the previous directory",
+example: "cd -",
+result: None,
+},
+]
}
}
-fn cd(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-let shell_manager = args.shell_manager.clone();
-let args = args.evaluate_once(registry)?;
-shell_manager.cd(args)
+#[cfg(test)]
+mod tests {
+use super::Cd;
+#[test]
+fn examples_work_as_expected() {
+use crate::examples::test as test_examples;
+test_examples(Cd {})
+}
}

View File

@ -0,0 +1,111 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Char;
#[derive(Deserialize)]
struct CharArgs {
name: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for Char {
fn name(&self) -> &str {
"char"
}
fn signature(&self) -> Signature {
Signature::build("ansi").required(
"character",
SyntaxShape::Any,
"the name of the character to output",
)
}
fn usage(&self) -> &str {
"Output special characters (eg. 'newline')"
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Output newline",
example: r#"char newline"#,
result: Some(vec![Value::from("\n")]),
},
Example {
description: "Output prompt character, newline and a hamburger character",
example: r#"echo $(char prompt) $(char newline) $(char hamburger)"#,
result: Some(vec![
UntaggedValue::string("\u{25b6}").into(),
UntaggedValue::string("\n").into(),
UntaggedValue::string("\u{2261}").into(),
]),
},
]
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let (CharArgs { name }, _) = args.process(&registry).await?;
let special_character = str_to_character(&name.item);
if let Some(output) = special_character {
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(output).into_value(name.tag()),
)))
} else {
Err(ShellError::labeled_error(
"Unknown character",
"unknown character",
name.tag(),
))
}
}
}
fn str_to_character(s: &str) -> Option<String> {
match s {
"newline" | "enter" | "nl" => Some("\n".into()),
"tab" => Some("\t".into()),
"sp" | "space" => Some(" ".into()),
// Unicode names came from https://www.compart.com/en/unicode
// Private Use Area (U+E000-U+F8FF)
"branch" => Some('\u{e0a0}'.to_string()), // 
"segment" => Some('\u{e0b0}'.to_string()), // 
"identical_to" | "hamburger" => Some('\u{2261}'.to_string()), // ≡
"not_identical_to" | "branch_untracked" => Some('\u{2262}'.to_string()), // ≢
"strictly_equivalent_to" | "branch_identical" => Some('\u{2263}'.to_string()), // ≣
"upwards_arrow" | "branch_ahead" => Some('\u{2191}'.to_string()), // ↑
"downwards_arrow" | "branch_behind" => Some('\u{2193}'.to_string()), // ↓
"up_down_arrow" | "branch_ahead_behind" => Some('\u{2195}'.to_string()), // ↕
"black_right_pointing_triangle" | "prompt" => Some('\u{25b6}'.to_string()), // ▶
"vector_or_cross_product" | "failed" => Some('\u{2a2f}'.to_string()), //
"high_voltage_sign" | "elevated" => Some('\u{26a1}'.to_string()), // ⚡
"tilde" | "twiddle" | "squiggly" | "home" => Some("~".into()), // ~
"hash" | "hashtag" | "pound_sign" | "sharp" | "root" => Some("#".into()), // #
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::Char;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Char {})
}
}

View File

@ -0,0 +1,92 @@
use crate::commands::classified::expr::run_expression_block;
use crate::commands::classified::internal::run_internal_command;
use crate::context::Context;
use crate::prelude::*;
use crate::stream::InputStream;
use futures::stream::TryStreamExt;
use nu_errors::ShellError;
use nu_protocol::hir::{Block, ClassifiedCommand, Commands};
use nu_protocol::{ReturnSuccess, UntaggedValue, Value};
use std::sync::atomic::Ordering;
pub(crate) async fn run_block(
block: &Block,
ctx: &mut Context,
mut input: InputStream,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
let mut output: Result<InputStream, ShellError> = Ok(InputStream::empty());
for pipeline in &block.block {
match output {
Ok(inp) if inp.is_empty() => {}
Ok(inp) => {
let mut output_stream = inp.to_output_stream();
loop {
match output_stream.try_next().await {
Ok(Some(ReturnSuccess::Value(Value {
value: UntaggedValue::Error(e),
..
}))) => return Err(e),
Ok(Some(_item)) => {
if let Some(err) = ctx.get_errors().get(0) {
ctx.clear_errors();
return Err(err.clone());
}
if ctx.ctrl_c.load(Ordering::SeqCst) {
break;
}
}
Ok(None) => {
if let Some(err) = ctx.get_errors().get(0) {
ctx.clear_errors();
return Err(err.clone());
}
break;
}
Err(e) => return Err(e),
}
}
}
Err(e) => {
return Err(e);
}
}
output = run_pipeline(pipeline, ctx, input, it, vars, env).await;
input = InputStream::empty();
}
output
}
async fn run_pipeline(
commands: &Commands,
ctx: &mut Context,
mut input: InputStream,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
for item in commands.list.clone() {
input = match item {
ClassifiedCommand::Dynamic(_) => {
return Err(ShellError::unimplemented("Dynamic commands"))
}
ClassifiedCommand::Expr(expr) => {
run_expression_block(*expr, ctx, it, vars, env).await?
}
ClassifiedCommand::Error(err) => return Err(err.into()),
ClassifiedCommand::Internal(left) => {
run_internal_command(left, ctx, input, it, vars, env).await?
}
};
}
Ok(input)
}
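A sketch of the draining loop in run_block above: items are pulled with try_next, the first error short-circuits the pipeline, and exhaustion ends the loop.

use futures::executor::block_on;
use futures::stream::{self, TryStreamExt};

fn main() {
    let results: Vec<Result<i32, String>> = vec![Ok(1), Ok(2), Err("boom".to_string())];
    let mut stream = stream::iter(results);
    let outcome: Result<(), String> = block_on(async {
        // A real pipeline would also check ctrl-c and accumulated context errors here.
        while let Some(_item) = stream.try_next().await? {}
        Ok(())
    });
    assert_eq!(outcome, Err("boom".to_string()));
}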

View File

@ -1,7 +1,7 @@
use derive_new::new;
-use nu_parser::hir;
+use nu_protocol::hir;
-#[derive(new, Debug, Eq, PartialEq)]
+#[derive(new, Debug)]
pub(crate) struct Command {
pub(crate) args: hir::Call,
}

View File

@ -0,0 +1,27 @@
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*;
use log::{log_enabled, trace};
use futures::stream::once;
use nu_errors::ShellError;
use nu_protocol::hir::SpannedExpression;
use nu_protocol::Value;
pub(crate) async fn run_expression_block(
expr: SpannedExpression,
context: &mut Context,
it: &Value,
vars: &IndexMap<String, Value>,
env: &IndexMap<String, String>,
) -> Result<InputStream, ShellError> {
if log_enabled!(log::Level::Trace) {
trace!(target: "nu::run::expr", "->");
trace!(target: "nu::run::expr", "{:?}", expr);
}
let registry = context.registry().clone();
let output = evaluate_baseline_expr(&expr, &registry, it, vars, env).await?;
Ok(once(async { Ok(output) }).to_input_stream())
}

View File

@ -1,105 +1,29 @@
+use crate::commands::classified::maybe_text_codec::{MaybeTextCodec, StringOrBinary};
+use crate::evaluate::evaluate_baseline_expr;
use crate::futures::ThreadedReceiver;
use crate::prelude::*;
-use bytes::{BufMut, Bytes, BytesMut};
-use futures::executor::block_on_stream;
-use futures::stream::StreamExt;
-use futures_codec::FramedRead;
-use log::trace;
-use nu_errors::ShellError;
-use nu_parser::commands::classified::external::ExternalArg;
-use nu_parser::ExternalCommand;
-use nu_protocol::{ColumnPath, Primitive, ShellTypeName, UntaggedValue, Value};
-use nu_source::{Tag, Tagged};
-use nu_value_ext::as_column_path;
use std::io::Write;
use std::ops::Deref;
use std::process::{Command, Stdio};
use std::sync::mpsc;
+use futures::executor::block_on_stream;
+use futures_codec::FramedRead;
+use log::trace;
+use nu_errors::ShellError;
+use nu_protocol::hir::{ExternalCommand, ExternalRedirection};
+use nu_protocol::{Primitive, Scope, ShellTypeName, UntaggedValue, Value};
+use nu_source::Tag;
-pub enum StringOrBinary {
-String(String),
-Binary(Vec<u8>),
-}
-pub struct MaybeTextCodec;
-impl futures_codec::Encoder for MaybeTextCodec {
-type Item = StringOrBinary;
-type Error = std::io::Error;
-fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
match item {
StringOrBinary::String(s) => {
dst.reserve(s.len());
dst.put(s.as_bytes());
Ok(())
}
StringOrBinary::Binary(b) => {
dst.reserve(b.len());
dst.put(Bytes::from(b));
Ok(())
}
}
}
}
impl futures_codec::Decoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
let v: Vec<u8> = src.to_vec();
match String::from_utf8(v) {
Ok(s) => {
src.clear();
if s.is_empty() {
Ok(None)
} else {
Ok(Some(StringOrBinary::String(s)))
}
}
Err(err) => {
// Note: the longest UTF-8 character per Unicode spec is currently 6 bytes. If we fail somewhere earlier than the last 6 bytes,
// we know that we're failing to understand the string encoding and not just seeing a partial character. When this happens, let's
// fall back to assuming it's a binary buffer.
if src.is_empty() {
Ok(None)
} else if src.len() > 6 && (src.len() - err.utf8_error().valid_up_to() > 6) {
// Fall back to assuming binary
let buf = src.to_vec();
src.clear();
Ok(Some(StringOrBinary::Binary(buf)))
} else {
// Looks like a utf-8 string, so let's assume that
let buf = src.split_to(err.utf8_error().valid_up_to() + 1);
String::from_utf8(buf.to_vec())
.map(|x| Some(StringOrBinary::String(x)))
.map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))
}
}
}
}
}
pub fn nu_value_to_string(command: &ExternalCommand, from: &Value) -> Result<String, ShellError> {
match &from.value {
UntaggedValue::Primitive(Primitive::Int(i)) => Ok(i.to_string()),
UntaggedValue::Primitive(Primitive::String(s))
| UntaggedValue::Primitive(Primitive::Line(s)) => Ok(s.clone()),
UntaggedValue::Primitive(Primitive::Path(p)) => Ok(p.to_string_lossy().to_string()),
unsupported => Err(ShellError::labeled_error(
format!("needs string data (given: {})", unsupported.type_name()),
"expected a string",
&command.name_tag,
)),
}
}
-pub(crate) fn run_external_command(
+pub(crate) async fn run_external_command(
     command: ExternalCommand,
     context: &mut Context,
-    input: Option<InputStream>,
-    is_last: bool,
-) -> Result<Option<InputStream>, ShellError> {
+    input: InputStream,
+    scope: &Scope,
+    external_redirection: ExternalRedirection,
+) -> Result<InputStream, ShellError> {
     trace!(target: "nu::run::external", "-> {}", command.name);
     if !did_find_command(&command.name) {
@ -110,306 +34,79 @@ pub(crate) fn run_external_command(
         ));
     }
-    if command.has_it_argument() || command.has_nu_argument() {
-        run_with_iterator_arg(command, context, input, is_last)
-    } else {
-        run_with_stdin(command, context, input, is_last)
-    }
+    run_with_stdin(command, context, input, scope, external_redirection).await
 }
fn prepare_column_path_for_fetching_it_variable( async fn run_with_stdin(
argument: &ExternalArg,
) -> Result<Tagged<ColumnPath>, ShellError> {
// We have "$it.[contents of interest]"
// and start slicing from "$it.[member+]"
// ^ here.
let key = nu_source::Text::from(argument.deref()).slice(4..argument.len());
to_column_path(&key, &argument.tag)
}
fn prepare_column_path_for_fetching_nu_variable(
argument: &ExternalArg,
) -> Result<Tagged<ColumnPath>, ShellError> {
// We have "$nu.[contents of interest]"
// and start slicing from "$nu.[member+]"
// ^ here.
let key = nu_source::Text::from(argument.deref()).slice(4..argument.len());
to_column_path(&key, &argument.tag)
}
fn to_column_path(
path_members: &str,
tag: impl Into<Tag>,
) -> Result<Tagged<ColumnPath>, ShellError> {
let tag = tag.into();
as_column_path(
&UntaggedValue::Table(
path_members
.split('.')
.map(|x| {
let member = match x.parse::<u64>() {
Ok(v) => UntaggedValue::int(v),
Err(_) => UntaggedValue::string(x),
};
member.into_value(&tag)
})
.collect(),
)
.into_value(&tag),
)
}
fn run_with_iterator_arg(
command: ExternalCommand, command: ExternalCommand,
context: &mut Context, context: &mut Context,
input: Option<InputStream>, input: InputStream,
is_last: bool, scope: &Scope,
) -> Result<Option<InputStream>, ShellError> { external_redirection: ExternalRedirection,
) -> Result<InputStream, ShellError> {
let path = context.shell_manager.path(); let path = context.shell_manager.path();
let mut inputs: InputStream = if let Some(input) = input { let input = trace_stream!(target: "nu::trace_stream::external::stdin", "input" = input);
trace_stream!(target: "nu::trace_stream::external::it", "input" = input)
} else {
InputStream::empty()
};
let stream = async_stream! { let mut command_args = vec![];
while let Some(value) = inputs.next().await { for arg in command.args.iter() {
let name = command.name.clone(); let value =
let name_tag = command.name_tag.clone(); evaluate_baseline_expr(arg, &context.registry, &scope.it, &scope.vars, &scope.env)
let home_dir = dirs::home_dir(); .await?;
let path = &path;
let args = command.args.clone();
let it_replacement = { // Skip any arguments that don't really exist, treating them as optional
if command.has_it_argument() { // FIXME: we may want to preserve the gap in the future, though it's hard to say
let empty_arg = ExternalArg { // what value we would put in its place.
arg: "".to_string(), if value.value.is_none() {
tag: name_tag.clone() continue;
};
let key = args.iter()
.find(|arg| arg.looks_like_it())
.unwrap_or_else(|| &empty_arg);
if args.iter().all(|arg| !arg.is_it()) {
let key = match prepare_column_path_for_fetching_it_variable(&key) {
Ok(keypath) => keypath,
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
};
match crate::commands::get::get_column_path(&key, &value) {
Ok(field) => {
match nu_value_to_string(&command, &field) {
Ok(val) => Some(val),
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
},
}
},
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
}
} else {
match nu_value_to_string(&command, &value) {
Ok(val) => Some(val),
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
},
}
}
} else {
None
}
};
let nu_replacement = {
if command.has_nu_argument() {
let empty_arg = ExternalArg {
arg: "".to_string(),
tag: name_tag.clone()
};
let key = args.iter()
.find(|arg| arg.looks_like_nu())
.unwrap_or_else(|| &empty_arg);
let nu_var = match crate::evaluate::variables::nu(&name_tag) {
Ok(variables) => variables,
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
};
if args.iter().all(|arg| !arg.is_nu()) {
let key = match prepare_column_path_for_fetching_nu_variable(&key) {
Ok(keypath) => keypath,
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
};
match crate::commands::get::get_column_path(&key, &nu_var) {
Ok(field) => {
match nu_value_to_string(&command, &field) {
Ok(val) => Some(val),
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
},
}
},
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
}
} else {
match nu_value_to_string(&command, &nu_var) {
Ok(val) => Some(val),
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
},
}
}
} else {
None
}
};
let process_args = args.iter().filter_map(|arg| {
if arg.chars().all(|c| c.is_whitespace()) {
None
} else {
let arg = if arg.looks_like_it() {
if let Some(mut value) = it_replacement.to_owned() {
let mut value = expand_tilde(&value, || home_dir.as_ref()).as_ref().to_string();
#[cfg(not(windows))]
{
value = {
if argument_contains_whitespace(&value) && !argument_is_quoted(&value) {
add_quotes(&value)
} else {
value
}
};
}
Some(value)
} else {
None
}
} else if arg.looks_like_nu() {
if let Some(mut value) = nu_replacement.to_owned() {
#[cfg(not(windows))]
{
value = {
if argument_contains_whitespace(&value) && !argument_is_quoted(&value) {
add_quotes(&value)
} else {
value
}
};
}
Some(value)
} else {
None
}
} else {
Some(arg.to_string())
};
arg
}
}).collect::<Vec<String>>();
match spawn(&command, &path, &process_args[..], None, is_last) {
Ok(res) => {
if let Some(mut res) = res {
while let Some(item) = res.next().await {
yield Ok(item)
}
}
}
Err(reason) => {
yield Ok(Value {
value: UntaggedValue::Error(reason),
tag: name_tag
});
return;
}
}
}
};
Ok(Some(stream.to_input_stream()))
} }
fn run_with_stdin( // Do the cleanup that we need to do on any argument going out:
command: ExternalCommand, match &value.value {
context: &mut Context, UntaggedValue::Table(table) => {
input: Option<InputStream>, for t in table {
is_last: bool, match &t.value {
) -> Result<Option<InputStream>, ShellError> { UntaggedValue::Primitive(_) => {
let path = context.shell_manager.path(); command_args
.push(t.convert_to_string().trim_end_matches('\n').to_string());
}
_ => {
return Err(ShellError::labeled_error(
"Could not convert to positional arguments",
"could not convert to positional arguments",
value.tag(),
));
}
}
}
}
_ => {
let trimmed_value_string = value.as_string()?.trim_end_matches('\n').to_string();
command_args.push(trimmed_value_string);
}
}
}
let input = input let process_args = command_args
.map(|input| trace_stream!(target: "nu::trace_stream::external::stdin", "input" = input));
let process_args = command
.args
.iter() .iter()
.map(|arg| { .map(|arg| {
let arg = expand_tilde(arg.deref(), dirs::home_dir); let home_dir;
#[cfg(feature = "dirs")]
{
home_dir = dirs::home_dir;
}
#[cfg(not(feature = "dirs"))]
{
home_dir = || Some(std::path::PathBuf::from("/"));
}
let arg = expand_tilde(arg.deref(), home_dir);
#[cfg(not(windows))] #[cfg(not(windows))]
{ {
if argument_contains_whitespace(&arg) && argument_is_quoted(&arg) { if argument_contains_whitespace(&arg) && !argument_is_quoted(&arg) {
if let Some(unquoted) = remove_quotes(&arg) { add_quotes(&arg)
format!(r#""{}""#, unquoted)
} else {
arg.as_ref().to_string()
}
} else { } else {
arg.as_ref().to_string() arg.as_ref().to_string()
} }
@ -425,16 +122,24 @@ fn run_with_stdin(
}) })
.collect::<Vec<String>>(); .collect::<Vec<String>>();
spawn(&command, &path, &process_args[..], input, is_last) spawn(
&command,
&path,
&process_args[..],
input,
external_redirection,
scope,
)
 }
 fn spawn(
     command: &ExternalCommand,
     path: &str,
     args: &[String],
-    input: Option<InputStream>,
-    is_last: bool,
-) -> Result<Option<InputStream>, ShellError> {
+    input: InputStream,
+    external_redirection: ExternalRedirection,
+    scope: &Scope,
+) -> Result<InputStream, ShellError> {
     let command = command.clone();
     let mut process = {
@ -444,6 +149,8 @@ fn spawn(
process.arg("/c"); process.arg("/c");
process.arg(&command.name); process.arg(&command.name);
for arg in args { for arg in args {
// Clean the args before we use them:
let arg = arg.replace("|", "\\|");
process.arg(&arg); process.arg(&arg);
} }
process process
@ -461,15 +168,31 @@ fn spawn(
    process.current_dir(path);
    trace!(target: "nu::run::external", "cwd = {:?}", &path);
process.env_clear();
process.envs(scope.env.iter());
     // We want stdout regardless of what
     // we are doing ($it case or pipe stdin)
-    if !is_last {
+    match external_redirection {
+        ExternalRedirection::Stdout => {
             process.stdout(Stdio::piped());
             trace!(target: "nu::run::external", "set up stdout pipe");
         }
+        ExternalRedirection::Stderr => {
+            process.stderr(Stdio::piped());
+            trace!(target: "nu::run::external", "set up stderr pipe");
+        }
+        ExternalRedirection::StdoutAndStderr => {
+            process.stdout(Stdio::piped());
+            trace!(target: "nu::run::external", "set up stdout pipe");
+            process.stderr(Stdio::piped());
+            trace!(target: "nu::run::external", "set up stderr pipe");
+        }
+        _ => {}
+    }
     // open since we have some contents for stdin
-    if input.is_some() {
+    if !input.is_empty() {
         process.stdin(Stdio::piped());
         trace!(target: "nu::run::external", "set up stdin pipe");
     }
@ -488,7 +211,7 @@ fn spawn(
     let stdout_name_tag = command.name_tag;
     std::thread::spawn(move || {
-        if let Some(input) = input {
+        if !input.is_empty() {
             let mut stdin_write = stdin
                 .take()
                 .expect("Internal error: could not get stdin pipe for external command");
@ -549,7 +272,9 @@ fn spawn(
     });
     std::thread::spawn(move || {
-        if !is_last {
+        if external_redirection == ExternalRedirection::Stdout
+            || external_redirection == ExternalRedirection::StdoutAndStderr
+        {
             let stdout = if let Some(stdout) = child.stdout.take() {
                 stdout
             } else {
@ -565,7 +290,7 @@ fn spawn(
             };
             let file = futures::io::AllowStdIo::new(stdout);
-            let stream = FramedRead::new(file, MaybeTextCodec);
+            let stream = FramedRead::new(file, MaybeTextCodec::default());
             for line in block_on_stream(stream) {
                 match line {
@ -593,24 +318,113 @@ fn spawn(
                         }
                     }
                 },
-                Err(_) => {
+                Err(e) => {
+                    // If there's an exit status, it makes sense that we may error when
+                    // trying to read from its stdout pipe (likely been closed). In that
+                    // case, don't emit an error.
+                    let should_error = match child.wait() {
+                        Ok(exit_status) => !exit_status.success(),
+                        Err(_) => true,
+                    };
+                    if should_error {
                         let _ = stdout_read_tx.send(Ok(Value {
                             value: UntaggedValue::Error(ShellError::labeled_error(
-                                "Unable to read from stdout.",
+                                format!("Unable to read from stdout ({})", e),
                                 "unable to read from stdout",
                                 &stdout_name_tag,
                             )),
                             tag: stdout_name_tag.clone(),
                         }));
}
return Ok(());
}
}
}
}
if external_redirection == ExternalRedirection::Stderr
|| external_redirection == ExternalRedirection::StdoutAndStderr
{
let stderr = if let Some(stderr) = child.stderr.take() {
stderr
} else {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
"Can't redirect the stderr for external command",
"can't redirect stderr",
&stdout_name_tag,
)),
tag: stdout_name_tag,
}));
return Err(());
};
let file = futures::io::AllowStdIo::new(stderr);
let stream = FramedRead::new(file, MaybeTextCodec::default());
for line in block_on_stream(stream) {
match line {
Ok(line) => match line {
StringOrBinary::String(s) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(
ShellError::untagged_runtime_error(s),
),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
                            break;
                        }
                    }
StringOrBinary::Binary(_) => {
let result = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(
ShellError::untagged_runtime_error("<binary stderr>"),
),
tag: stdout_name_tag.clone(),
}));
if result.is_err() {
break;
}
}
},
Err(e) => {
// If there's an exit status, it makes sense that we may error when
// trying to read from its stdout pipe (likely been closed). In that
// case, don't emit an error.
let should_error = match child.wait() {
Ok(exit_status) => !exit_status.success(),
Err(_) => true,
};
if should_error {
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::labeled_error(
format!("Unable to read from stdout ({})", e),
"unable to read from stdout",
&stdout_name_tag,
)),
tag: stdout_name_tag.clone(),
}));
}
return Ok(());
}
}
        }
    }
    // We can give an error when we see a non-zero exit code, but this is different
    // than what other shells will do.
-        if child.wait().is_err() {
+        let external_failed = match child.wait() {
+            Err(_) => true,
+            Ok(exit_status) => !exit_status.success(),
+        };
+        if external_failed {
            let cfg = crate::data::config::config(Tag::unknown());
            if let Ok(cfg) = cfg {
                if cfg.contains_key("nonzero_exit_errors") {
@ -620,40 +434,52 @@ fn spawn(
"command failed", "command failed",
&stdout_name_tag, &stdout_name_tag,
)), )),
tag: stdout_name_tag, tag: stdout_name_tag.clone(),
})); }));
} }
} }
let _ = stdout_read_tx.send(Ok(Value {
value: UntaggedValue::Error(ShellError::external_non_zero()),
tag: stdout_name_tag,
}));
} }
         Ok(())
     });
     let stream = ThreadedReceiver::new(rx);
-    Ok(Some(stream.to_input_stream()))
+    Ok(stream.to_input_stream())
     } else {
         Err(ShellError::labeled_error(
-            "Command not found",
-            "command not found",
+            "Failed to spawn process",
+            "failed to spawn",
             &command.name_tag,
         ))
     }
 }
-fn did_find_command(name: &str) -> bool {
-    #[cfg(not(windows))]
+pub fn did_find_command(#[allow(unused)] name: &str) -> bool {
+    #[cfg(not(feature = "which"))]
+    {
+        // we can't perform this check, so just assume it can be found
+        true
+    }
+    #[cfg(all(feature = "which", unix))]
     {
         which::which(name).is_ok()
     }
-    #[cfg(windows)]
+    #[cfg(all(feature = "which", windows))]
     {
         if which::which(name).is_ok() {
             true
         } else {
+            // Reference: https://ss64.com/nt/syntax-internal.html
             let cmd_builtins = [
-                "call", "cls", "color", "date", "dir", "echo", "find", "hostname", "pause",
-                "start", "time", "title", "ver", "copy", "mkdir", "rename", "rd", "rmdir", "type",
+                "assoc", "break", "color", "copy", "date", "del", "dir", "dpath", "echo", "erase",
+                "for", "ftype", "md", "mkdir", "mklink", "move", "path", "ren", "rename", "rd",
+                "rmdir", "set", "start", "time", "title", "type", "ver", "verify", "vol",
             ];
             cmd_builtins.contains(&name)
@ -689,6 +515,7 @@ fn add_quotes(argument: &str) -> String {
format!("\"{}\"", argument) format!("\"{}\"", argument)
} }
#[allow(unused)]
fn remove_quotes(argument: &str) -> Option<&str> {
    if !argument_is_quoted(argument) {
        return None;
@ -714,10 +541,17 @@ fn shell_os_paths() -> Vec<std::path::PathBuf> {
 mod tests {
     use super::{
         add_quotes, argument_contains_whitespace, argument_is_quoted, expand_tilde, remove_quotes,
-        run_external_command, Context,
     };
#[cfg(feature = "which")]
use super::{run_external_command, Context, InputStream};
#[cfg(feature = "which")]
use futures::executor::block_on; use futures::executor::block_on;
#[cfg(feature = "which")]
use nu_errors::ShellError; use nu_errors::ShellError;
#[cfg(feature = "which")]
use nu_protocol::Scope;
#[cfg(feature = "which")]
use nu_test_support::commands::ExternalBuilder; use nu_test_support::commands::ExternalBuilder;
// async fn read(mut stream: OutputStream) -> Option<Value> { // async fn read(mut stream: OutputStream) -> Option<Value> {
@ -733,12 +567,23 @@ mod tests {
// } // }
// } // }
#[cfg(feature = "which")]
     async fn non_existent_run() -> Result<(), ShellError> {
         use nu_protocol::hir::ExternalRedirection;
         let cmd = ExternalBuilder::for_name("i_dont_exist.exe").build();
         let input = InputStream::empty();
         let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
-        assert!(run_external_command(cmd, &mut ctx, None, false).is_err());
+        assert!(run_external_command(
cmd,
&mut ctx,
input,
&Scope::new(),
ExternalRedirection::Stdout
)
.await
.is_err());
Ok(()) Ok(())
} }
@ -767,6 +612,7 @@ mod tests {
// block_on(failure_run()) // block_on(failure_run())
// } // }
#[cfg(feature = "which")]
    #[test]
    fn identifies_command_not_found() -> Result<(), ShellError> {
        block_on(non_existent_run())
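
The rewritten `spawn` above maps an `ExternalRedirection` choice onto which child pipes get captured. A minimal sketch of that mapping using only `std::process`; the `Redirection` enum and the `echo` invocation are illustrative stand-ins and assume a Unix-like `echo` on `PATH`:

```rust
// Sketch of the redirection-to-Stdio mapping shown above, using only std.
// `Redirection` is a local stand-in for nu_protocol::hir::ExternalRedirection.
use std::process::{Command, Stdio};

#[allow(dead_code)]
enum Redirection {
    None,
    Stdout,
    Stderr,
    StdoutAndStderr,
}

fn configure(cmd: &mut Command, redirection: Redirection) {
    match redirection {
        Redirection::Stdout => {
            cmd.stdout(Stdio::piped());
        }
        Redirection::Stderr => {
            cmd.stderr(Stdio::piped());
        }
        Redirection::StdoutAndStderr => {
            cmd.stdout(Stdio::piped());
            cmd.stderr(Stdio::piped());
        }
        Redirection::None => {}
    }
}

fn main() -> std::io::Result<()> {
    let mut cmd = Command::new("echo");
    cmd.arg("hello");
    configure(&mut cmd, Redirection::Stdout);
    let output = cmd.spawn()?.wait_with_output()?;
    assert_eq!(String::from_utf8_lossy(&output.stdout).trim(), "hello");
    Ok(())
}
```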

View File

@ -1,159 +1,233 @@
use crate::commands::command::whole_stream_command;
use crate::commands::run_alias::AliasCommand;
 use crate::commands::UnevaluatedCallInfo;
 use crate::prelude::*;
 use log::{log_enabled, trace};
 use nu_errors::ShellError;
-use nu_parser::InternalCommand;
-use nu_protocol::{CommandAction, Primitive, ReturnSuccess, UntaggedValue, Value};
+use nu_protocol::hir::{ExternalRedirection, InternalCommand};
+use nu_protocol::{CommandAction, Primitive, ReturnSuccess, Scope, UntaggedValue, Value};
-pub(crate) fn run_internal_command(
+pub(crate) async fn run_internal_command(
     command: InternalCommand,
     context: &mut Context,
-    input: Option<InputStream>,
-    source: Text,
-) -> Result<Option<InputStream>, ShellError> {
+    input: InputStream,
+    it: &Value,
+    vars: &IndexMap<String, Value>,
+    env: &IndexMap<String, String>,
+) -> Result<InputStream, ShellError> {
if log_enabled!(log::Level::Trace) { if log_enabled!(log::Level::Trace) {
trace!(target: "nu::run::internal", "->"); trace!(target: "nu::run::internal", "->");
trace!(target: "nu::run::internal", "{}", command.name); trace!(target: "nu::run::internal", "{}", command.name);
trace!(target: "nu::run::internal", "{}", command.args.debug(&source));
} }
-    let objects: InputStream = if let Some(input) = input {
-        trace_stream!(target: "nu::trace_stream::internal", "input" = input)
-    } else {
-        InputStream::empty()
-    };
+    let scope = Scope {
+        it: it.clone(),
+        vars: vars.clone(),
+        env: env.clone(),
+    };
+    let objects: InputStream =
+        trace_stream!(target: "nu::trace_stream::internal", "input" = input);
let internal_command = context.expect_command(&command.name); let internal_command = context.expect_command(&command.name);
if command.name == "autoenv untrust" {
context.user_recently_used_autoenv_untrust = true;
}
let result = { let result = {
context.run_command( context
.run_command(
internal_command?, internal_command?,
command.name_tag.clone(), Tag::unknown_anchor(command.name_span),
command.args.clone(), command.args.clone(),
&source, &scope,
objects, objects,
) )
.await?
}; };
let result = trace_out_stream!(target: "nu::trace_stream::internal", "output" = result); let head = Arc::new(command.args.head.clone());
let mut result = result.values; //let context = Arc::new(context.clone());
let context = context.clone();
let command = Arc::new(command);
let scope = Arc::new(scope);
// let scope = scope.clone();
Ok(InputStream::from_stream(
result
.then(move |item| {
let head = head.clone();
let command = command.clone();
let mut context = context.clone(); let mut context = context.clone();
let scope = scope.clone();
let stream = async_stream! { async move {
let mut soft_errs: Vec<ShellError> = vec![];
let mut yielded = false;
while let Some(item) = result.next().await {
match item { match item {
Ok(ReturnSuccess::Action(action)) => match action { Ok(ReturnSuccess::Action(action)) => match action {
CommandAction::ChangePath(path) => { CommandAction::ChangePath(path) => {
context.shell_manager.set_path(path); context.shell_manager.set_path(path);
InputStream::from_stream(futures::stream::iter(vec![]))
} }
CommandAction::Exit => std::process::exit(0), // TODO: save history.txt CommandAction::Exit => std::process::exit(0), // TODO: save history.txt
CommandAction::Error(err) => { CommandAction::Error(err) => {
context.error(err); context.error(err.clone());
break; InputStream::one(UntaggedValue::Error(err).into_untagged_value())
} }
CommandAction::AutoConvert(tagged_contents, extension) => { CommandAction::AutoConvert(tagged_contents, extension) => {
let contents_tag = tagged_contents.tag.clone(); let contents_tag = tagged_contents.tag.clone();
let command_name = format!("from-{}", extension); let command_name = format!("from {}", extension);
let command = command.clone(); let command = command.clone();
if let Some(converter) = context.registry.get_command(&command_name) { if let Some(converter) = context.registry.get_command(&command_name)
{
let new_args = RawCommandArgs { let new_args = RawCommandArgs {
host: context.host.clone(), host: context.host.clone(),
ctrl_c: context.ctrl_c.clone(), ctrl_c: context.ctrl_c.clone(),
current_errors: context.current_errors.clone(),
shell_manager: context.shell_manager.clone(), shell_manager: context.shell_manager.clone(),
call_info: UnevaluatedCallInfo { call_info: UnevaluatedCallInfo {
args: nu_parser::hir::Call { args: nu_protocol::hir::Call {
head: command.args.head, head: (&*head).clone(),
positional: None, positional: None,
named: None, named: None,
span: Span::unknown() span: Span::unknown(),
external_redirection: ExternalRedirection::Stdout,
},
name_tag: Tag::unknown_anchor(command.name_span),
scope: (&*scope).clone(),
}, },
source: source.clone(),
name_tag: command.name_tag,
}
}; };
let mut result = converter.run(new_args.with_input(vec![tagged_contents]), &context.registry); let result = converter
let result_vec: Vec<Result<ReturnSuccess, ShellError>> = result.drain_vec().await; .run(
new_args.with_input(vec![tagged_contents]),
&context.registry,
)
.await;
match result {
Ok(mut result) => {
let result_vec: Vec<Result<ReturnSuccess, ShellError>> =
result.drain_vec().await;
let mut output = vec![];
for res in result_vec { for res in result_vec {
match res { match res {
Ok(ReturnSuccess::Value(Value { value: UntaggedValue::Table(list), ..})) => { Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Table(list),
..
})) => {
for l in list { for l in list {
yield Ok(l); output.push(Ok(l));
} }
} }
Ok(ReturnSuccess::Value(Value { value, .. })) => { Ok(ReturnSuccess::Value(Value {
yield Ok(value.into_value(contents_tag.clone())); value,
..
})) => {
output
.push(Ok(value
.into_value(contents_tag.clone())));
} }
Err(e) => yield Err(e), Err(e) => output.push(Err(e)),
_ => {} _ => {}
} }
} }
futures::stream::iter(output).to_input_stream()
}
Err(e) => {
context.add_error(e);
InputStream::empty()
}
}
} else { } else {
yield Ok(tagged_contents) InputStream::one(tagged_contents)
} }
} }
CommandAction::EnterHelpShell(value) => { CommandAction::EnterHelpShell(value) => match value {
match value {
Value { Value {
value: UntaggedValue::Primitive(Primitive::String(cmd)), value: UntaggedValue::Primitive(Primitive::String(cmd)),
tag, tag,
} => { } => {
context.shell_manager.insert_at_current(Box::new( context.shell_manager.insert_at_current(Box::new(
HelpShell::for_command( match HelpShell::for_command(
UntaggedValue::string(cmd).into_value(tag), UntaggedValue::string(cmd).into_value(tag),
&context.registry(), &context.registry(),
)?, ) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err).into_untagged_value(),
)
}
},
)); ));
InputStream::from_stream(futures::stream::iter(vec![]))
} }
_ => { _ => {
context.shell_manager.insert_at_current(Box::new( context.shell_manager.insert_at_current(Box::new(
HelpShell::index(&context.registry())?, match HelpShell::index(&context.registry()) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err).into_untagged_value(),
)
}
},
)); ));
InputStream::from_stream(futures::stream::iter(vec![]))
} }
} },
}
CommandAction::EnterValueShell(value) => { CommandAction::EnterValueShell(value) => {
context context
.shell_manager .shell_manager
.insert_at_current(Box::new(ValueShell::new(value))); .insert_at_current(Box::new(ValueShell::new(value)));
InputStream::from_stream(futures::stream::iter(vec![]))
} }
CommandAction::EnterShell(location) => { CommandAction::EnterShell(location) => {
context.shell_manager.insert_at_current(Box::new( context.shell_manager.insert_at_current(Box::new(
FilesystemShell::with_location(location, context.registry().clone()), match FilesystemShell::with_location(location) {
Ok(v) => v,
Err(err) => {
return InputStream::one(
UntaggedValue::Error(err.into())
.into_untagged_value(),
)
}
},
)); ));
InputStream::from_stream(futures::stream::iter(vec![]))
}
CommandAction::AddAlias(name, args, block) => {
context.add_commands(vec![whole_stream_command(
AliasCommand::new(name, args, block),
)]);
InputStream::from_stream(futures::stream::iter(vec![]))
} }
CommandAction::PreviousShell => { CommandAction::PreviousShell => {
context.shell_manager.prev(); context.shell_manager.prev();
InputStream::from_stream(futures::stream::iter(vec![]))
} }
CommandAction::NextShell => { CommandAction::NextShell => {
context.shell_manager.next(); context.shell_manager.next();
InputStream::from_stream(futures::stream::iter(vec![]))
} }
CommandAction::LeaveShell => { CommandAction::LeaveShell => {
context.shell_manager.remove_at_current(); context.shell_manager.remove_at_current();
if context.shell_manager.is_empty() { if context.shell_manager.is_empty() {
std::process::exit(0); // TODO: save history.txt std::process::exit(0); // TODO: save history.txt
} }
InputStream::from_stream(futures::stream::iter(vec![]))
} }
}, },
Ok(ReturnSuccess::Value(Value { Ok(ReturnSuccess::Value(Value {
value: UntaggedValue::Error(err), value: UntaggedValue::Error(err),
.. tag,
})) => { })) => {
context.error(err); context.error(err.clone());
break; InputStream::one(UntaggedValue::Error(err).into_value(tag))
} }
Ok(ReturnSuccess::Value(v)) => { Ok(ReturnSuccess::Value(v)) => InputStream::one(v),
yielded = true;
yield Ok(v);
}
Ok(ReturnSuccess::DebugValue(v)) => { Ok(ReturnSuccess::DebugValue(v)) => {
yielded = true;
let doc = PrettyDebug::pretty_doc(&v); let doc = PrettyDebug::pretty_doc(&v);
let mut buffer = termcolor::Buffer::ansi(); let mut buffer = termcolor::Buffer::ansi();
@ -164,16 +238,17 @@ pub(crate) fn run_internal_command(
let value = String::from_utf8_lossy(buffer.as_slice()); let value = String::from_utf8_lossy(buffer.as_slice());
yield Ok(UntaggedValue::string(value).into_untagged_value()) InputStream::one(UntaggedValue::string(value).into_untagged_value())
} }
Err(err) => { Err(err) => {
context.error(err); context.error(err.clone());
break; InputStream::one(UntaggedValue::Error(err).into_untagged_value())
} }
} }
} }
}; })
.flatten()
Ok(Some(stream.to_input_stream())) .take_while(|x| futures::future::ready(!x.is_error())),
))
}
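
The new `run_internal_command` finishes by composing `then`, `flatten`, and `take_while` so the combined stream stops at the first error value. A generic sketch of that shape with plain `futures` streams, using negative numbers as a stand-in for error items:

```rust
// Generic sketch of the `then` + `flatten` + `take_while` shape used above:
// each input item expands to a sub-stream, and the combined stream stops at
// the first item flagged as an error. Types here are simple stand-ins.
use futures::executor::block_on;
use futures::stream::{self, StreamExt};

fn main() {
    let items = stream::iter(vec![1, 2, -1, 3]);
    let combined = items
        .then(|n| async move {
            // Each item becomes its own small stream, like InputStream::one / empty.
            stream::iter(vec![n])
        })
        .flatten()
        // Stop the pipeline at the first "error" marker (negative number here).
        .take_while(|n| futures::future::ready(*n >= 0));
    let seen: Vec<i32> = block_on(combined.collect());
    assert_eq!(seen, vec![1, 2]);
}
```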

View File

@ -0,0 +1,127 @@
use bytes::{BufMut, Bytes, BytesMut};
use nu_errors::ShellError;
extern crate encoding_rs;
use encoding_rs::{CoderResult, Decoder, Encoding, UTF_8};
#[cfg(not(test))]
const OUTPUT_BUFFER_SIZE: usize = 8192;
#[cfg(test)]
const OUTPUT_BUFFER_SIZE: usize = 4;
#[derive(Debug, Eq, PartialEq)]
pub enum StringOrBinary {
String(String),
Binary(Vec<u8>),
}
pub struct MaybeTextCodec {
decoder: Decoder,
}
impl MaybeTextCodec {
// The constructor takes an Option<&'static Encoding>, because an absence of an encoding indicates that we want BOM sniffing enabled
pub fn new(encoding: Option<&'static Encoding>) -> Self {
let decoder = match encoding {
Some(e) => e.new_decoder_with_bom_removal(),
None => UTF_8.new_decoder(),
};
MaybeTextCodec { decoder }
}
}
impl Default for MaybeTextCodec {
fn default() -> Self {
MaybeTextCodec {
decoder: UTF_8.new_decoder(),
}
}
}
impl futures_codec::Encoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = std::io::Error;
fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
match item {
StringOrBinary::String(s) => {
dst.reserve(s.len());
dst.put(s.as_bytes());
Ok(())
}
StringOrBinary::Binary(b) => {
dst.reserve(b.len());
dst.put(Bytes::from(b));
Ok(())
}
}
}
}
impl futures_codec::Decoder for MaybeTextCodec {
type Item = StringOrBinary;
type Error = ShellError;
fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
if src.is_empty() {
return Ok(None);
}
let mut s = String::with_capacity(OUTPUT_BUFFER_SIZE);
let (res, _read, replacements) = self.decoder.decode_to_string(src, &mut s, false);
let result = if replacements {
// If we had to make replacements when converting to utf8, fall back to binary
StringOrBinary::Binary(src.to_vec())
} else {
// If original buffer size is too small, we continue to allocate new Strings and append
// them to the result until the input buffer is smaller than the allocated String
if let CoderResult::OutputFull = res {
let mut buffer = String::with_capacity(OUTPUT_BUFFER_SIZE);
loop {
let (res, _read, _replacements) =
self.decoder
.decode_to_string(&src[s.len()..], &mut buffer, false);
s.push_str(&buffer);
if let CoderResult::InputEmpty = res {
break;
}
buffer.clear();
}
}
StringOrBinary::String(s)
};
src.clear();
Ok(Some(result))
}
}
#[cfg(test)]
mod tests {
use super::{MaybeTextCodec, StringOrBinary};
use bytes::BytesMut;
use futures_codec::Decoder;
// TODO: Write some more tests
#[test]
fn should_consume_all_bytes_from_source_when_temporary_buffer_overflows() {
let mut maybe_text = MaybeTextCodec::new(None);
let mut bytes = BytesMut::from("0123456789");
let text = maybe_text.decode(&mut bytes);
assert_eq!(
Ok(Some(StringOrBinary::String("0123456789".to_string()))),
text
);
assert!(bytes.is_empty());
}
}
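
A usage sketch for the codec above, mirroring how external.rs consumes child stdout (`AllowStdIo` + `FramedRead` + `block_on_stream`); it assumes `MaybeTextCodec` and `StringOrBinary` are in scope from this module:

```rust
// Wrap a blocking reader in AllowStdIo, frame it with the codec, and drain
// it synchronously, just as the external command runner does for stdout.
use futures::executor::block_on_stream;
use futures::io::AllowStdIo;
use futures_codec::FramedRead;
use std::io::Cursor;

fn main() {
    let reader = AllowStdIo::new(Cursor::new(b"hello world".to_vec()));
    let stream = FramedRead::new(reader, MaybeTextCodec::default());
    for frame in block_on_stream(stream) {
        match frame {
            Ok(StringOrBinary::String(s)) => println!("text chunk: {}", s),
            Ok(StringOrBinary::Binary(b)) => println!("binary chunk: {} bytes", b.len()),
            Err(e) => eprintln!("decode error: {:?}", e),
        }
    }
}
```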

View File

@ -1,7 +1,9 @@
pub(crate) mod block;
mod dynamic; mod dynamic;
pub(crate) mod expr;
pub(crate) mod external; pub(crate) mod external;
pub(crate) mod internal; pub(crate) mod internal;
pub(crate) mod pipeline; pub(crate) mod maybe_text_codec;
#[allow(unused_imports)] #[allow(unused_imports)]
pub(crate) use dynamic::Command as DynamicCommand; pub(crate) use dynamic::Command as DynamicCommand;

View File

@ -1,50 +0,0 @@
use crate::commands::classified::external::run_external_command;
use crate::commands::classified::internal::run_internal_command;
use crate::context::Context;
use crate::stream::InputStream;
use nu_errors::ShellError;
use nu_parser::{ClassifiedCommand, ClassifiedPipeline};
use nu_source::Text;
pub(crate) async fn run_pipeline(
pipeline: ClassifiedPipeline,
ctx: &mut Context,
mut input: Option<InputStream>,
line: &str,
) -> Result<Option<InputStream>, ShellError> {
let mut iter = pipeline.commands.list.into_iter().peekable();
loop {
let item: Option<ClassifiedCommand> = iter.next();
let next: Option<&ClassifiedCommand> = iter.peek();
input = match (item, next) {
(Some(ClassifiedCommand::Dynamic(_)), _) | (_, Some(ClassifiedCommand::Dynamic(_))) => {
return Err(ShellError::unimplemented("Dynamic commands"))
}
(Some(ClassifiedCommand::Expr(_)), _) | (_, Some(ClassifiedCommand::Expr(_))) => {
return Err(ShellError::unimplemented("Expression-only commands"))
}
(Some(ClassifiedCommand::Error(err)), _) => return Err(err.into()),
(_, Some(ClassifiedCommand::Error(err))) => return Err(err.clone().into()),
(Some(ClassifiedCommand::Internal(left)), _) => {
run_internal_command(left, ctx, input, Text::from(line))?
}
(Some(ClassifiedCommand::External(left)), None) => {
run_external_command(left, ctx, input, true)?
}
(Some(ClassifiedCommand::External(left)), _) => {
run_external_command(left, ctx, input, false)?
}
(None, _) => break,
};
}
Ok(input)
}
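
The removed `run_pipeline` decided whether a stage was last by peeking ahead in the command list. A small sketch of that peekable-iterator pattern; the string commands are stand-ins for `ClassifiedCommand`:

```rust
// Sketch of the peekable-iterator pattern the removed run_pipeline used to
// decide whether an external command was last in the pipeline (is_last).
fn main() {
    let commands = vec!["ls", "where size > 10", "to json"];
    let mut iter = commands.into_iter().peekable();
    while let Some(cmd) = iter.next() {
        // `peek` looks ahead without consuming, so `None` means this is the
        // final stage and its output should go straight to the terminal.
        let is_last = iter.peek().is_none();
        println!("{} (last: {})", cmd, is_last);
    }
}
```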

View File

@ -6,25 +6,21 @@ use std::process::Command;
pub struct Clear;
#[async_trait]
impl WholeStreamCommand for Clear {
    fn name(&self) -> &str {
        "clear"
    }
    fn signature(&self) -> Signature {
        Signature::build("clear")
    }
    fn usage(&self) -> &str {
        "clears the terminal"
    }
-    fn run(
-        &self,
-        args: CommandArgs,
-        registry: &CommandRegistry,
-    ) -> Result<OutputStream, ShellError> {
-        clear(args, registry)
-    }
-}
-fn clear(_args: CommandArgs, _registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
+    async fn run(&self, _: CommandArgs, _: &CommandRegistry) -> Result<OutputStream, ShellError> {
         if cfg!(windows) {
             Command::new("cmd")
                 .args(&["/C", "cls"])
@ -38,3 +34,24 @@ fn clear(_args: CommandArgs, _registry: &CommandRegistry) -> Result<OutputStream
} }
Ok(OutputStream::empty()) Ok(OutputStream::empty())
} }
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Clear the screen",
example: "clear",
result: None,
}]
}
}
#[cfg(test)]
mod tests {
use super::Clear;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Clear {})
}
}

View File

@ -1,19 +1,15 @@
#[cfg(feature = "clipboard")]
pub mod clipboard {
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use futures::stream::StreamExt; use futures::stream::StreamExt;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{ReturnValue, Signature, Value}; use nu_protocol::{Signature, Value};
use clipboard::{ClipboardContext, ClipboardProvider}; use clipboard::{ClipboardContext, ClipboardProvider};
 pub struct Clip;
-#[derive(Deserialize)]
-pub struct ClipArgs {}
+#[async_trait]
 impl WholeStreamCommand for Clip {
     fn name(&self) -> &str {
         "clip"
@ -27,41 +23,38 @@ pub mod clipboard {
"Copy the contents of the pipeline to the copy/paste buffer" "Copy the contents of the pipeline to the copy/paste buffer"
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, clip)?.run() clip(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Save text to the clipboard",
example: "echo 'secret value' | clip",
result: None,
}]
} }
} }
pub fn clip( pub async fn clip(
ClipArgs {}: ClipArgs, args: CommandArgs,
RunnableContext { input, name, .. }: RunnableContext, _registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
let stream = async_stream! { let input = args.input;
let values: Vec<Value> = input.values.collect().await; let name = args.call_info.name_tag.clone();
let values: Vec<Value> = input.collect().await;
let mut clip_stream = inner_clip(values, name).await;
while let Some(value) = clip_stream.next().await {
yield value;
}
};
let stream: BoxStream<'static, ReturnValue> = stream.boxed();
Ok(OutputStream::from(stream))
}
async fn inner_clip(input: Vec<Value>, name: Tag) -> OutputStream {
if let Ok(clip_context) = ClipboardProvider::new() { if let Ok(clip_context) = ClipboardProvider::new() {
let mut clip_context: ClipboardContext = clip_context; let mut clip_context: ClipboardContext = clip_context;
let mut new_copy_data = String::new(); let mut new_copy_data = String::new();
if !input.is_empty() { if !values.is_empty() {
let mut first = true; let mut first = true;
for i in input.iter() { for i in values.iter() {
if !first { if !first {
new_copy_data.push_str("\n"); new_copy_data.push_str("\n");
} else { } else {
@ -71,11 +64,11 @@ pub mod clipboard {
let string: String = match i.as_string() { let string: String = match i.as_string() {
Ok(string) => string.to_string(), Ok(string) => string.to_string(),
Err(_) => { Err(_) => {
return OutputStream::one(Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Given non-string data", "Given non-string data",
"expected strings from pipeline", "expected strings from pipeline",
name, name,
))) ))
} }
}; };
@ -86,21 +79,31 @@ pub mod clipboard {
match clip_context.set_contents(new_copy_data) { match clip_context.set_contents(new_copy_data) {
Ok(_) => {} Ok(_) => {}
Err(_) => { Err(_) => {
return OutputStream::one(Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Could not set contents of clipboard", "Could not set contents of clipboard",
"could not set contents of clipboard", "could not set contents of clipboard",
name, name,
))); ));
} }
} }
OutputStream::empty()
} else { } else {
OutputStream::one(Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Could not open clipboard", "Could not open clipboard",
"could not open clipboard", "could not open clipboard",
name, name,
))) ));
} }
Ok(OutputStream::empty())
}
#[cfg(test)]
mod tests {
use super::Clip;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Clip {})
} }
} }
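
The new `clip` collects the pipeline values, joins the strings with newlines, and hands the result to the `clipboard` crate. A minimal sketch of that write path, assuming the same `ClipboardProvider`/`ClipboardContext` API used above:

```rust
// Minimal sketch of the clipboard write shown above, assuming the `clipboard`
// crate as used in this file (ClipboardProvider / ClipboardContext).
use clipboard::{ClipboardContext, ClipboardProvider};

fn copy_lines(lines: &[&str]) -> Result<(), Box<dyn std::error::Error>> {
    // Join the pipeline values with newlines, as the clip command does.
    let joined = lines.join("\n");
    let mut ctx: ClipboardContext = ClipboardProvider::new()?;
    ctx.set_contents(joined)?;
    Ok(())
}

fn main() {
    if let Err(e) = copy_lines(&["first", "second"]) {
        eprintln!("could not write to clipboard: {}", e);
    }
}
```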

View File

@ -6,8 +6,9 @@ use crate::prelude::*;
use derive_new::new; use derive_new::new;
use getset::Getters; use getset::Getters;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_parser::hir; use nu_protocol::hir;
use nu_protocol::{CallInfo, EvaluatedArgs, ReturnValue, Scope, Signature, Value}; use nu_protocol::{CallInfo, EvaluatedArgs, ReturnSuccess, Scope, Signature, UntaggedValue, Value};
use parking_lot::Mutex;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use std::ops::Deref; use std::ops::Deref;
use std::sync::atomic::AtomicBool; use std::sync::atomic::AtomicBool;
@ -15,17 +16,28 @@ use std::sync::atomic::AtomicBool;
 #[derive(Deserialize, Serialize, Debug, Clone)]
 pub struct UnevaluatedCallInfo {
     pub args: hir::Call,
-    pub source: Text,
     pub name_tag: Tag,
+    pub scope: Scope,
 }
impl UnevaluatedCallInfo { impl UnevaluatedCallInfo {
pub fn evaluate( pub async fn evaluate(self, registry: &CommandRegistry) -> Result<CallInfo, ShellError> {
let args = evaluate_args(&self.args, registry, &self.scope).await?;
Ok(CallInfo {
args,
name_tag: self.name_tag,
})
}
pub async fn evaluate_with_new_it(
self, self,
registry: &CommandRegistry, registry: &CommandRegistry,
scope: &Scope, it: &Value,
) -> Result<CallInfo, ShellError> { ) -> Result<CallInfo, ShellError> {
let args = evaluate_args(&self.args, registry, scope, &self.source)?; let mut scope = self.scope.clone();
scope.it = it.clone();
let args = evaluate_args(&self.args, registry, &scope).await?;
Ok(CallInfo { Ok(CallInfo {
args, args,
@ -38,44 +50,16 @@ impl UnevaluatedCallInfo {
} }
} }
pub trait CallInfoExt {
fn process<'de, T: Deserialize<'de>>(
&self,
shell_manager: &ShellManager,
ctrl_c: Arc<AtomicBool>,
callback: fn(T, &RunnablePerItemContext) -> Result<OutputStream, ShellError>,
) -> Result<RunnablePerItemArgs<T>, ShellError>;
}
impl CallInfoExt for CallInfo {
fn process<'de, T: Deserialize<'de>>(
&self,
shell_manager: &ShellManager,
ctrl_c: Arc<AtomicBool>,
callback: fn(T, &RunnablePerItemContext) -> Result<OutputStream, ShellError>,
) -> Result<RunnablePerItemArgs<T>, ShellError> {
let mut deserializer = ConfigDeserializer::from_call_info(self.clone());
Ok(RunnablePerItemArgs {
args: T::deserialize(&mut deserializer)?,
context: RunnablePerItemContext {
shell_manager: shell_manager.clone(),
name: self.name_tag.clone(),
ctrl_c,
},
callback,
})
}
}
 #[derive(Getters)]
 #[get = "pub(crate)"]
 pub struct CommandArgs {
     pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
     pub ctrl_c: Arc<AtomicBool>,
+    pub current_errors: Arc<Mutex<Vec<ShellError>>>,
     pub shell_manager: ShellManager,
     pub call_info: UnevaluatedCallInfo,
     pub input: InputStream,
+    pub raw_input: String,
 }
#[derive(Getters, Clone)] #[derive(Getters, Clone)]
@ -83,6 +67,7 @@ pub struct CommandArgs {
pub struct RawCommandArgs { pub struct RawCommandArgs {
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>, pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub ctrl_c: Arc<AtomicBool>, pub ctrl_c: Arc<AtomicBool>,
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub shell_manager: ShellManager, pub shell_manager: ShellManager,
pub call_info: UnevaluatedCallInfo, pub call_info: UnevaluatedCallInfo,
} }
@ -92,9 +77,11 @@ impl RawCommandArgs {
CommandArgs { CommandArgs {
host: self.host, host: self.host,
ctrl_c: self.ctrl_c, ctrl_c: self.ctrl_c,
current_errors: self.current_errors,
shell_manager: self.shell_manager, shell_manager: self.shell_manager,
call_info: self.call_info, call_info: self.call_info,
input: input.into(), input: input.into(),
raw_input: String::default(),
} }
} }
} }
@ -106,7 +93,7 @@ impl std::fmt::Debug for CommandArgs {
} }
impl CommandArgs { impl CommandArgs {
pub fn evaluate_once( pub async fn evaluate_once(
self, self,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<EvaluatedWholeStreamCommandArgs, ShellError> { ) -> Result<EvaluatedWholeStreamCommandArgs, ShellError> {
@ -114,7 +101,7 @@ impl CommandArgs {
let ctrl_c = self.ctrl_c.clone(); let ctrl_c = self.ctrl_c.clone();
let shell_manager = self.shell_manager.clone(); let shell_manager = self.shell_manager.clone();
let input = self.input; let input = self.input;
let call_info = self.call_info.evaluate(registry, &Scope::empty())?; let call_info = self.call_info.evaluate(registry).await?;
Ok(EvaluatedWholeStreamCommandArgs::new( Ok(EvaluatedWholeStreamCommandArgs::new(
host, host,
@ -125,7 +112,7 @@ impl CommandArgs {
)) ))
} }
pub fn evaluate_once_with_scope( pub async fn evaluate_once_with_scope(
self, self,
registry: &CommandRegistry, registry: &CommandRegistry,
scope: &Scope, scope: &Scope,
@ -134,7 +121,12 @@ impl CommandArgs {
let ctrl_c = self.ctrl_c.clone(); let ctrl_c = self.ctrl_c.clone();
let shell_manager = self.shell_manager.clone(); let shell_manager = self.shell_manager.clone();
let input = self.input; let input = self.input;
let call_info = self.call_info.evaluate(registry, scope)?; let call_info = UnevaluatedCallInfo {
name_tag: self.call_info.name_tag,
args: self.call_info.args,
scope: scope.clone(),
};
let call_info = call_info.evaluate(registry).await?;
Ok(EvaluatedWholeStreamCommandArgs::new( Ok(EvaluatedWholeStreamCommandArgs::new(
host, host,
@ -145,139 +137,33 @@ impl CommandArgs {
)) ))
} }
pub fn source(&self) -> Text { pub async fn process<'de, T: Deserialize<'de>>(
self.call_info.source.clone()
}
pub fn process<'de, T: Deserialize<'de>, O: ToOutputStream>(
self, self,
registry: &CommandRegistry, registry: &CommandRegistry,
callback: fn(T, RunnableContext) -> Result<O, ShellError>, ) -> Result<(T, InputStream), ShellError> {
) -> Result<RunnableArgs<T, O>, ShellError> { let args = self.evaluate_once(registry).await?;
let shell_manager = self.shell_manager.clone();
let host = self.host.clone();
let source = self.source();
let ctrl_c = self.ctrl_c.clone();
let args = self.evaluate_once(registry)?;
let call_info = args.call_info.clone();
let (input, args) = args.split();
let name_tag = args.call_info.name_tag;
let mut deserializer = ConfigDeserializer::from_call_info(call_info);
Ok(RunnableArgs {
args: T::deserialize(&mut deserializer)?,
context: RunnableContext {
input,
commands: registry.clone(),
source,
shell_manager,
name: name_tag,
host,
ctrl_c,
},
callback,
})
}
pub fn process_raw<'de, T: Deserialize<'de>>(
self,
registry: &CommandRegistry,
callback: fn(T, RunnableContext, RawCommandArgs) -> Result<OutputStream, ShellError>,
) -> Result<RunnableRawArgs<T>, ShellError> {
let raw_args = RawCommandArgs {
host: self.host.clone(),
ctrl_c: self.ctrl_c.clone(),
shell_manager: self.shell_manager.clone(),
call_info: self.call_info.clone(),
};
let shell_manager = self.shell_manager.clone();
let host = self.host.clone();
let source = self.source();
let ctrl_c = self.ctrl_c.clone();
let args = self.evaluate_once(registry)?;
let call_info = args.call_info.clone(); let call_info = args.call_info.clone();
let (input, args) = args.split();
let name_tag = args.call_info.name_tag;
let mut deserializer = ConfigDeserializer::from_call_info(call_info); let mut deserializer = ConfigDeserializer::from_call_info(call_info);
Ok(RunnableRawArgs { Ok((T::deserialize(&mut deserializer)?, args.input))
args: T::deserialize(&mut deserializer)?,
context: RunnableContext {
input,
commands: registry.clone(),
source,
shell_manager,
name: name_tag,
host,
ctrl_c,
},
raw_args,
callback,
})
} }
} }
pub struct RunnablePerItemContext {
pub shell_manager: ShellManager,
pub name: Tag,
pub ctrl_c: Arc<AtomicBool>,
}
pub struct RunnableContext { pub struct RunnableContext {
pub input: InputStream, pub input: InputStream,
pub shell_manager: ShellManager, pub shell_manager: ShellManager,
pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>, pub host: Arc<parking_lot::Mutex<Box<dyn Host>>>,
pub source: Text,
pub ctrl_c: Arc<AtomicBool>, pub ctrl_c: Arc<AtomicBool>,
pub commands: CommandRegistry, pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub registry: CommandRegistry,
pub name: Tag, pub name: Tag,
pub raw_input: String,
} }
impl RunnableContext { impl RunnableContext {
pub fn get_command(&self, name: &str) -> Option<Arc<Command>> { pub fn get_command(&self, name: &str) -> Option<Command> {
self.commands.get_command(name) self.registry.get_command(name)
}
}
pub struct RunnablePerItemArgs<T> {
args: T,
context: RunnablePerItemContext,
callback: fn(T, &RunnablePerItemContext) -> Result<OutputStream, ShellError>,
}
impl<T> RunnablePerItemArgs<T> {
pub fn run(self) -> Result<OutputStream, ShellError> {
(self.callback)(self.args, &self.context)
}
}
pub struct RunnableArgs<T, O: ToOutputStream> {
args: T,
context: RunnableContext,
callback: fn(T, RunnableContext) -> Result<O, ShellError>,
}
impl<T, O: ToOutputStream> RunnableArgs<T, O> {
pub fn run(self) -> Result<OutputStream, ShellError> {
(self.callback)(self.args, self.context).map(|v| v.to_output_stream())
}
}
pub struct RunnableRawArgs<T> {
args: T,
raw_args: RawCommandArgs,
context: RunnableContext,
callback: fn(T, RunnableContext, RawCommandArgs) -> Result<OutputStream, ShellError>,
}
impl<T> RunnableRawArgs<T> {
pub fn run(self) -> OutputStream {
match (self.callback)(self.args, self.context, self.raw_args) {
Ok(stream) => stream,
Err(err) => OutputStream::one(Err(err)),
}
} }
} }
@ -374,6 +260,14 @@ impl EvaluatedCommandArgs {
self.call_info.args.nth(pos) self.call_info.args.nth(pos)
} }
/// Get the nth positional argument, error if not possible
pub fn expect_nth(&self, pos: usize) -> Result<&Value, ShellError> {
self.call_info
.args
.nth(pos)
.ok_or_else(|| ShellError::unimplemented("Better error: expect_nth"))
}
pub fn get(&self, name: &str) -> Option<&Value> { pub fn get(&self, name: &str) -> Option<&Value> {
self.call_info.args.get(name) self.call_info.args.get(name)
} }
@ -383,6 +277,13 @@ impl EvaluatedCommandArgs {
} }
} }
pub struct Example {
pub example: &'static str,
pub description: &'static str,
pub result: Option<Vec<Value>>,
}
#[async_trait]
pub trait WholeStreamCommand: Send + Sync { pub trait WholeStreamCommand: Send + Sync {
fn name(&self) -> &str; fn name(&self) -> &str;
@ -392,7 +293,7 @@ pub trait WholeStreamCommand: Send + Sync {
fn usage(&self) -> &str; fn usage(&self) -> &str;
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
@ -401,147 +302,73 @@ pub trait WholeStreamCommand: Send + Sync {
fn is_binary(&self) -> bool { fn is_binary(&self) -> bool {
false false
} }
}
pub trait PerItemCommand: Send + Sync { fn examples(&self) -> Vec<Example> {
fn name(&self) -> &str; Vec::new()
fn signature(&self) -> Signature {
Signature::new(self.name()).desc(self.usage()).filter()
}
fn usage(&self) -> &str;
fn run(
&self,
call_info: &CallInfo,
registry: &CommandRegistry,
raw_args: &RawCommandArgs,
input: Value,
) -> Result<OutputStream, ShellError>;
fn is_binary(&self) -> bool {
false
} }
} }
pub enum Command { #[derive(Clone)]
WholeStream(Arc<dyn WholeStreamCommand>), pub struct Command(Arc<dyn WholeStreamCommand>);
PerItem(Arc<dyn PerItemCommand>),
}
impl PrettyDebugWithSource for Command { impl PrettyDebugWithSource for Command {
fn pretty_debug(&self, source: &str) -> DebugDocBuilder { fn pretty_debug(&self, source: &str) -> DebugDocBuilder {
match self { b::typed(
Command::WholeStream(command) => b::typed(
"whole stream command", "whole stream command",
b::description(command.name()) b::description(self.name())
+ b::space() + b::space()
+ b::equals() + b::equals()
+ b::space() + b::space()
+ command.signature().pretty_debug(source), + self.signature().pretty_debug(source),
), )
Command::PerItem(command) => b::typed(
"per item command",
b::description(command.name())
+ b::space()
+ b::equals()
+ b::space()
+ command.signature().pretty_debug(source),
),
}
} }
} }
impl std::fmt::Debug for Command { impl std::fmt::Debug for Command {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self { write!(f, "Command({})", self.name())
Command::WholeStream(command) => write!(f, "WholeStream({})", command.name()),
Command::PerItem(command) => write!(f, "PerItem({})", command.name()),
}
} }
} }
impl Command { impl Command {
pub fn name(&self) -> &str { pub fn name(&self) -> &str {
match self { self.0.name()
Command::WholeStream(command) => command.name(),
Command::PerItem(command) => command.name(),
}
} }
pub fn signature(&self) -> Signature { pub fn signature(&self) -> Signature {
match self { self.0.signature()
Command::WholeStream(command) => command.signature(),
Command::PerItem(command) => command.signature(),
}
} }
pub fn usage(&self) -> &str { pub fn usage(&self) -> &str {
match self { self.0.usage()
Command::WholeStream(command) => command.usage(),
Command::PerItem(command) => command.usage(),
}
} }
pub fn run(&self, args: CommandArgs, registry: &CommandRegistry) -> OutputStream { pub fn examples(&self) -> Vec<Example> {
if args.call_info.switch_present("help") { self.0.examples()
get_help(self.name(), self.usage(), self.signature()).into()
} else {
match self {
Command::WholeStream(command) => match command.run(args, registry) {
Ok(stream) => stream,
Err(err) => OutputStream::one(Err(err)),
},
Command::PerItem(command) => {
self.run_helper(command.clone(), args, registry.clone())
}
}
}
} }
fn run_helper( pub async fn run(
&self, &self,
command: Arc<dyn PerItemCommand>,
args: CommandArgs, args: CommandArgs,
registry: CommandRegistry, registry: &CommandRegistry,
) -> OutputStream { ) -> Result<OutputStream, ShellError> {
let raw_args = RawCommandArgs { if args.call_info.switch_present("help") {
host: args.host, let cl = self.0.clone();
ctrl_c: args.ctrl_c, let registry = registry.clone();
shell_manager: args.shell_manager, Ok(OutputStream::one(Ok(ReturnSuccess::Value(
call_info: args.call_info, UntaggedValue::string(get_help(&*cl, &registry)).into_value(Tag::unknown()),
}; ))))
} else {
let out = args self.0.run(args, registry).await
.input
.values
.map(move |x| {
let call_info = raw_args
.clone()
.call_info
.evaluate(&registry, &Scope::it_value(x.clone()));
match call_info {
Ok(call_info) => match command.run(&call_info, &registry, &raw_args, x) {
Ok(o) => o,
Err(e) => {
futures::stream::iter(vec![ReturnValue::Err(e)]).to_output_stream()
} }
},
Err(e) => futures::stream::iter(vec![ReturnValue::Err(e)]).to_output_stream(),
}
})
.flatten();
out.to_output_stream()
} }
pub fn is_binary(&self) -> bool { pub fn is_binary(&self) -> bool {
match self { self.0.is_binary()
Command::WholeStream(command) => command.is_binary(),
Command::PerItem(command) => command.is_binary(),
} }
pub fn stream_command(&self) -> &dyn WholeStreamCommand {
&*self.0
} }
} }
@ -550,6 +377,7 @@ pub struct FnFilterCommand {
func: fn(EvaluatedFilterCommandArgs) -> Result<OutputStream, ShellError>, func: fn(EvaluatedFilterCommandArgs) -> Result<OutputStream, ShellError>,
} }
#[async_trait]
impl WholeStreamCommand for FnFilterCommand { impl WholeStreamCommand for FnFilterCommand {
fn name(&self) -> &str { fn name(&self) -> &str {
&self.name &self.name
@ -559,27 +387,33 @@ impl WholeStreamCommand for FnFilterCommand {
"usage" "usage"
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, CommandArgs {
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let CommandArgs {
host, host,
ctrl_c, ctrl_c,
shell_manager, shell_manager,
call_info, call_info,
input, input,
} = args; ..
}: CommandArgs,
let host: Arc<parking_lot::Mutex<dyn Host>> = host.clone(); registry: &CommandRegistry,
let registry: CommandRegistry = registry.clone(); ) -> Result<OutputStream, ShellError> {
let registry = Arc::new(registry.clone());
let func = self.func; let func = self.func;
let result = input.values.map(move |it| { Ok(input
.then(move |it| {
let host = host.clone();
let registry = registry.clone(); let registry = registry.clone();
let call_info = match call_info.clone().evaluate(&registry, &Scope::it_value(it)) { let ctrl_c = ctrl_c.clone();
Err(err) => return OutputStream::from(vec![Err(err)]).values, let shell_manager = shell_manager.clone();
let call_info = call_info.clone();
async move {
let call_info = match call_info.evaluate_with_new_it(&*registry, &it).await {
Err(err) => {
return OutputStream::one(Err(err));
}
Ok(args) => args, Ok(args) => args,
}; };
@ -591,22 +425,16 @@ impl WholeStreamCommand for FnFilterCommand {
); );
match func(args) { match func(args) {
Err(err) => OutputStream::from(vec![Err(err)]).values, Err(err) => return OutputStream::one(Err(err)),
Ok(stream) => stream.values, Ok(stream) => stream,
} }
}); }
})
let result = result.flatten(); .flatten()
let result: BoxStream<ReturnValue> = result.boxed(); .to_output_stream())
Ok(result.into())
} }
} }
-pub fn whole_stream_command(command: impl WholeStreamCommand + 'static) -> Arc<Command> {
-    Arc::new(Command::WholeStream(Arc::new(command)))
-}
-pub fn per_item_command(command: impl PerItemCommand + 'static) -> Arc<Command> {
-    Arc::new(Command::PerItem(Arc::new(command)))
-}
+pub fn whole_stream_command(command: impl WholeStreamCommand + 'static) -> Command {
+    Command(Arc::new(command))
+}
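
With `Command` now a thin wrapper over `Arc<dyn WholeStreamCommand>`, every command implements the single async trait. A minimal sketch of such an implementation against the trait as it appears in this diff; the `nothing` command and its behavior are hypothetical, purely for illustration:

```rust
// A minimal WholeStreamCommand implementation, sketched against the trait as
// shown in this diff (async run via #[async_trait]).
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::Signature;

pub struct Nothing;

#[async_trait]
impl WholeStreamCommand for Nothing {
    fn name(&self) -> &str {
        "nothing"
    }

    fn signature(&self) -> Signature {
        Signature::build("nothing")
    }

    fn usage(&self) -> &str {
        "Consumes its input and produces an empty stream"
    }

    async fn run(
        &self,
        _args: CommandArgs,
        _registry: &CommandRegistry,
    ) -> Result<OutputStream, ShellError> {
        Ok(OutputStream::empty())
    }
}

// Registration would then go through the new wrapper:
// context.add_commands(vec![whole_stream_command(Nothing)]);
```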

View File

@@ -1,9 +1,10 @@
 use crate::commands::WholeStreamCommand;
 use crate::context::CommandRegistry;
 use crate::prelude::*;
+use futures::future;
 use futures::stream::StreamExt;
 use nu_errors::ShellError;
-use nu_protocol::{Signature, SyntaxShape, UntaggedValue, Value};
+use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
 use nu_source::Tagged;

 pub struct Compact;
@@ -13,6 +14,7 @@ pub struct CompactArgs {
     rest: Vec<Tagged<String>>,
 }

+#[async_trait]
 impl WholeStreamCommand for Compact {
     fn name(&self) -> &str {
         "compact"
@@ -26,36 +28,78 @@ impl WholeStreamCommand for Compact {
         "Creates a table with non-empty rows"
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, compact)?.run()
+        compact(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "Filter out all null entries in a list",
+                example: "echo [1 2 $null 3 $null $null] | compact",
+                result: Some(vec![
+                    UntaggedValue::int(1).into(),
+                    UntaggedValue::int(2).into(),
+                    UntaggedValue::int(3).into(),
+                ]),
+            },
+            Example {
+                description: "Filter out all directory entries having no 'target'",
+                example: "ls -la | compact target",
+                result: None,
+            },
+        ]
     }
 }

-pub fn compact(
-    CompactArgs { rest: columns }: CompactArgs,
-    RunnableContext { input, .. }: RunnableContext,
+pub async fn compact(
+    args: CommandArgs,
+    registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
-    let objects = input.values.filter(move |item| {
-        let keep = if columns.is_empty() {
-            item.is_some()
-        } else {
-            match item {
-                Value {
-                    value: UntaggedValue::Row(ref r),
-                    ..
-                } => columns
-                    .iter()
-                    .all(|field| r.get_data(field).borrow().is_some()),
-                _ => false,
-            }
-        };
-        futures::future::ready(keep)
-    });
-
-    Ok(objects.from_input_stream())
+    let registry = registry.clone();
+    let (CompactArgs { rest: columns }, input) = args.process(&registry).await?;
+    Ok(input
+        .filter_map(move |item| {
+            future::ready(if columns.is_empty() {
+                if !item.is_empty() {
+                    Some(ReturnSuccess::value(item))
+                } else {
+                    None
+                }
+            } else {
+                match item {
+                    Value {
+                        value: UntaggedValue::Row(ref r),
+                        ..
+                    } => {
+                        if columns
+                            .iter()
+                            .all(|field| r.get_data(field).borrow().is_some())
+                        {
+                            Some(ReturnSuccess::value(item))
+                        } else {
+                            None
+                        }
+                    }
+                    _ => None,
+                }
+            })
+        })
+        .to_output_stream())
+}
+
+#[cfg(test)]
+mod tests {
+    use super::Compact;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Compact {})
+    }
 }


@@ -1,188 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::data::config;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Config;
#[derive(Deserialize)]
pub struct ConfigArgs {
load: Option<Tagged<PathBuf>>,
set: Option<(Tagged<String>, Value)>,
set_into: Option<Tagged<String>>,
get: Option<Tagged<String>>,
clear: Tagged<bool>,
remove: Option<Tagged<String>>,
path: Tagged<bool>,
}
impl WholeStreamCommand for Config {
fn name(&self) -> &str {
"config"
}
fn signature(&self) -> Signature {
Signature::build("config")
.named(
"load",
SyntaxShape::Path,
"load the config from the path give",
Some('l'),
)
.named(
"set",
SyntaxShape::Any,
"set a value in the config, eg) --set [key value]",
Some('s'),
)
.named(
"set_into",
SyntaxShape::Member,
"sets a variable from values in the pipeline",
Some('i'),
)
.named(
"get",
SyntaxShape::Any,
"get a value from the config",
Some('g'),
)
.named(
"remove",
SyntaxShape::Any,
"remove a value from the config",
Some('r'),
)
.switch("clear", "clear the config", Some('c'))
.switch("path", "return the path to the config file", Some('p'))
}
fn usage(&self) -> &str {
"Configuration management."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, config)?.run()
}
}
pub fn config(
ConfigArgs {
load,
set,
set_into,
get,
clear,
remove,
path,
}: ConfigArgs,
RunnableContext { name, input, .. }: RunnableContext,
) -> Result<OutputStream, ShellError> {
let name_span = name.clone();
let stream = async_stream! {
let configuration = if let Some(supplied) = load {
Some(supplied.item().clone())
} else {
None
};
let mut result = crate::data::config::read(name_span, &configuration)?;
if let Some(v) = get {
let key = v.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::labeled_error("Missing key in config", "key", v.tag()))?;
match value {
Value {
value: UntaggedValue::Table(list),
..
} => {
for l in list {
let value = l.clone();
yield ReturnSuccess::value(l.clone());
}
}
x => yield ReturnSuccess::value(x.clone()),
}
}
else if let Some((key, value)) = set {
result.insert(key.to_string(), value.clone());
config::write(&result, &configuration)?;
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(&value.tag));
}
else if let Some(v) = set_into {
let rows: Vec<Value> = input.values.collect().await;
let key = v.to_string();
if rows.len() == 0 {
yield Err(ShellError::labeled_error("No values given for set_into", "needs value(s) from pipeline", v.tag()));
} else if rows.len() == 1 {
// A single value
let value = &rows[0];
result.insert(key.to_string(), value.clone());
config::write(&result, &configuration)?;
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(name));
} else {
// Take in the pipeline as a table
let value = UntaggedValue::Table(rows).into_value(name.clone());
result.insert(key.to_string(), value.clone());
config::write(&result, &configuration)?;
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(name));
}
}
else if let Tagged { item: true, tag } = clear {
result.clear();
config::write(&result, &configuration)?;
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(tag));
return;
}
else if let Tagged { item: true, tag } = path {
let path = config::default_path_for(&configuration)?;
yield ReturnSuccess::value(UntaggedValue::Primitive(Primitive::Path(path)).into_value(tag));
}
else if let Some(v) = remove {
let key = v.to_string();
if result.contains_key(&key) {
result.swap_remove(&key);
config::write(&result, &configuration)?
} else {
yield Err(ShellError::labeled_error(
"Key does not exist in config",
"key",
v.tag(),
));
}
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(v.tag()));
}
else {
yield ReturnSuccess::value(UntaggedValue::Row(result.into()).into_value(name));
}
};
Ok(stream.to_output_stream())
}


@@ -0,0 +1,57 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct SubCommand;
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config clear"
}
fn signature(&self) -> Signature {
Signature::build("config clear")
}
fn usage(&self) -> &str {
"clear the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
clear(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Clear the config (be careful!)",
example: "config clear",
result: None,
}]
}
}
pub async fn clear(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
result.clear();
config::write(&result, &None)?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(args.call_info.name_tag),
)))
}


@@ -0,0 +1,37 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::{CommandArgs, CommandRegistry, OutputStream};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct Command;
#[async_trait]
impl WholeStreamCommand for Command {
fn name(&self) -> &str {
"config"
}
fn signature(&self) -> Signature {
Signature::build("config")
}
fn usage(&self) -> &str {
"Configuration management."
}
async fn run(
&self,
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag;
let result = crate::data::config::read(name_span, &None)?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream())
}
}


@@ -0,0 +1,83 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct GetArgs {
get: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config get"
}
fn signature(&self) -> Signature {
Signature::build("config get").required(
"get",
SyntaxShape::Any,
"value to get from the config",
)
}
fn usage(&self) -> &str {
"Gets a value from the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
get(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the current startup commands",
example: "config get startup",
result: None,
}]
}
}
pub async fn get(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (GetArgs { get }, _) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let result = crate::data::config::read(name_span, &None)?;
let key = get.to_string();
let value = result
.get(&key)
.ok_or_else(|| ShellError::labeled_error("Missing key in config", "key", get.tag()))?;
Ok(match value {
Value {
value: UntaggedValue::Table(list),
..
} => {
let list: Vec<_> = list
.iter()
.map(|x| ReturnSuccess::value(x.clone()))
.collect();
futures::stream::iter(list).to_output_stream()
}
x => {
let x = x.clone();
OutputStream::one(ReturnSuccess::value(x))
}
})
}


@@ -0,0 +1,59 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
use std::path::PathBuf;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct LoadArgs {
load: Tagged<PathBuf>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config load"
}
fn signature(&self) -> Signature {
Signature::build("config load").required(
"load",
SyntaxShape::Path,
"Path to load the config from",
)
}
fn usage(&self) -> &str {
"Loads the config from the path given"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set(args, registry).await
}
}
pub async fn set(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let name_span = args.call_info.name_tag.clone();
let (LoadArgs { load }, _) = args.process(&registry).await?;
let configuration = load.item().clone();
let result = crate::data::config::read(name_span, &Some(configuration))?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
)])
.to_output_stream())
}


@@ -0,0 +1,17 @@
pub mod clear;
pub mod command;
pub mod get;
pub mod load;
pub mod path;
pub mod remove;
pub mod set;
pub mod set_into;
pub use clear::SubCommand as ConfigClear;
pub use command::Command as Config;
pub use get::SubCommand as ConfigGet;
pub use load::SubCommand as ConfigLoad;
pub use path::SubCommand as ConfigPath;
pub use remove::SubCommand as ConfigRemove;
pub use set::SubCommand as ConfigSet;
pub use set_into::SubCommand as ConfigSetInto;


@@ -0,0 +1,49 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue};
pub struct SubCommand;
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config path"
}
fn signature(&self) -> Signature {
Signature::build("config path")
}
fn usage(&self) -> &str {
"return the path to the config file"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
path(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Get the path to the current config file",
example: "config path",
result: None,
}]
}
}
pub async fn path(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let path = config::default_path()?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Primitive(Primitive::Path(path)).into_value(args.call_info.name_tag),
)))
}


@@ -0,0 +1,75 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct RemoveArgs {
remove: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config remove"
}
fn signature(&self) -> Signature {
Signature::build("config remove").required(
"remove",
SyntaxShape::Any,
"remove a value from the config",
)
}
fn usage(&self) -> &str {
"Removes a value from the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
remove(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Remove the startup commands",
example: "config remove startup",
result: None,
}]
}
}
pub async fn remove(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (RemoveArgs { remove }, _) = args.process(&registry).await?;
let mut result = crate::data::config::read(name_span, &None)?;
let key = remove.to_string();
if result.contains_key(&key) {
result.swap_remove(&key);
config::write(&result, &None)?;
Ok(futures::stream::iter(vec![ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(remove.tag()),
)])
.to_output_stream())
} else {
Err(ShellError::labeled_error(
"Key does not exist in config",
"key",
remove.tag(),
))
}
}


@@ -0,0 +1,67 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct SetArgs {
key: Tagged<String>,
value: Value,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config set"
}
fn signature(&self) -> Signature {
Signature::build("config set")
.required("key", SyntaxShape::String, "variable name to set")
.required("value", SyntaxShape::Any, "value to use")
}
fn usage(&self) -> &str {
"Sets a value in the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Set completion_mode to circular",
example: "config set [completion_mode circular]",
result: None,
}]
}
}
pub async fn set(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let (SetArgs { key, value }, _) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
result.insert(key.to_string(), value.clone());
config::write(&result, &None)?;
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(&value.tag),
)))
}


@@ -0,0 +1,98 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct SubCommand;
#[derive(Deserialize)]
pub struct SetIntoArgs {
set_into: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for SubCommand {
fn name(&self) -> &str {
"config set_into"
}
fn signature(&self) -> Signature {
Signature::build("config set_into").required(
"set_into",
SyntaxShape::String,
"sets a variable from values in the pipeline",
)
}
fn usage(&self) -> &str {
"Sets a value in the config"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
set_into(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Store the contents of the pipeline as a path",
example: "echo ['/usr/bin' '/bin'] | config set_into path",
result: None,
}]
}
}
pub async fn set_into(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name_span = args.call_info.name_tag.clone();
let name = args.call_info.name_tag.clone();
let (SetIntoArgs { set_into: v }, input) = args.process(&registry).await?;
// NOTE: None because we are not loading a new config file, we just want to read from the
// existing config
let mut result = crate::data::config::read(name_span, &None)?;
// In the original code, this is set to `Some` if the `load flag is set`
let configuration = None;
let rows: Vec<Value> = input.collect().await;
let key = v.to_string();
Ok(if rows.is_empty() {
return Err(ShellError::labeled_error(
"No values given for set_into",
"needs value(s) from pipeline",
v.tag(),
));
} else if rows.len() == 1 {
// A single value
let value = &rows[0];
result.insert(key, value.clone());
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
} else {
// Take in the pipeline as a table
let value = UntaggedValue::Table(rows).into_value(name.clone());
result.insert(key, value);
config::write(&result, &configuration)?;
OutputStream::one(ReturnSuccess::value(
UntaggedValue::Row(result.into()).into_value(name),
))
})
}


@@ -0,0 +1,358 @@
pub const BAT_LANGUAGES: &[&str] = &[
"as",
"csv",
"tsv",
"applescript",
"script editor",
"s",
"S",
"adoc",
"asciidoc",
"asc",
"asa",
"yasm",
"nasm",
"asm",
"inc",
"mac",
"awk",
"bat",
"cmd",
"bib",
"sh",
"bash",
"zsh",
".bash_aliases",
".bash_completions",
".bash_functions",
".bash_login",
".bash_logout",
".bash_profile",
".bash_variables",
".bashrc",
".profile",
".textmate_init",
".zshrc",
"PKGBUILD",
".ebuild",
".eclass",
"c",
"h",
"cs",
"csx",
"cpp",
"cc",
"cp",
"cxx",
"c++",
"C",
"h",
"hh",
"hpp",
"hxx",
"h++",
"inl",
"ipp",
"cabal",
"clj",
"cljc",
"cljs",
"edn",
"CMakeLists.txt",
"cmake",
"h.in",
"hh.in",
"hpp.in",
"hxx.in",
"h++.in",
"CMakeCache.txt",
"cr",
"css",
"css.erb",
"css.liquid",
"d",
"di",
"dart",
"diff",
"patch",
"Dockerfile",
"dockerfile",
"ex",
"exs",
"elm",
"erl",
"hrl",
"Emakefile",
"emakefile",
"fs",
"fsi",
"fsx",
"fs",
"fsi",
"fsx",
"fish",
"attributes",
"gitattributes",
".gitattributes",
"COMMIT_EDITMSG",
"MERGE_MSG",
"TAG_EDITMSG",
"gitconfig",
".gitconfig",
".gitmodules",
"exclude",
"gitignore",
".gitignore",
".git",
"gitlog",
"git-rebase-todo",
"go",
"dot",
"DOT",
"gv",
"groovy",
"gvy",
"gradle",
"Jenkinsfile",
"hs",
"hs",
"hsc",
"show-nonprintable",
"html",
"htm",
"shtml",
"xhtml",
"asp",
"html.eex",
"yaws",
"rails",
"rhtml",
"erb",
"html.erb",
"adp",
"twig",
"html.twig",
"ini",
"INI",
"INF",
"reg",
"REG",
"lng",
"cfg",
"CFG",
"desktop",
"url",
"URL",
".editorconfig",
".hgrc",
"hgrc",
"java",
"bsh",
"properties",
"jsp",
"js",
"htc",
"js",
"jsx",
"babel",
"es6",
"js.erb",
"json",
"sublime-settings",
"sublime-menu",
"sublime-keymap",
"sublime-mousemap",
"sublime-theme",
"sublime-build",
"sublime-project",
"sublime-completions",
"sublime-commands",
"sublime-macro",
"sublime-color-scheme",
"ipynb",
"Pipfile.lock",
"jsonnet",
"libsonnet",
"libjsonnet",
"jl",
"kt",
"kts",
"tex",
"ltx",
"less",
"css.less",
"lisp",
"cl",
"clisp",
"l",
"mud",
"el",
"scm",
"ss",
"lsp",
"fasl",
"lhs",
"lua",
"make",
"GNUmakefile",
"makefile",
"Makefile",
"makefile.am",
"Makefile.am",
"makefile.in",
"Makefile.in",
"OCamlMakefile",
"mak",
"mk",
"md",
"mdown",
"markdown",
"markdn",
"matlab",
"build",
"nix",
"m",
"h",
"mm",
"M",
"h",
"ml",
"mli",
"mll",
"mly",
"pas",
"p",
"dpr",
"pl",
"pm",
"pod",
"t",
"PL",
"php",
"php3",
"php4",
"php5",
"php7",
"phps",
"phpt",
"phtml",
"txt",
"ps1",
"psm1",
"psd1",
"proto",
"protodevel",
"pb.txt",
"proto.text",
"textpb",
"pbtxt",
"prototxt",
"pp",
"epp",
"purs",
"py",
"py3",
"pyw",
"pyi",
"pyx",
"pyx.in",
"pxd",
"pxd.in",
"pxi",
"pxi.in",
"rpy",
"cpy",
"SConstruct",
"Sconstruct",
"sconstruct",
"SConscript",
"gyp",
"gypi",
"Snakefile",
"wscript",
"R",
"r",
"s",
"S",
"Rprofile",
"rd",
"re",
"rst",
"rest",
"robot",
"rb",
"Appfile",
"Appraisals",
"Berksfile",
"Brewfile",
"capfile",
"cgi",
"Cheffile",
"config.ru",
"Deliverfile",
"Fastfile",
"fcgi",
"Gemfile",
"gemspec",
"Guardfile",
"irbrc",
"jbuilder",
"Podfile",
"podspec",
"prawn",
"rabl",
"rake",
"Rakefile",
"Rantfile",
"rbx",
"rjs",
"ruby.rail",
"Scanfile",
"simplecov",
"Snapfile",
"thor",
"Thorfile",
"Vagrantfile",
"haml",
"sass",
"rxml",
"builder",
"rs",
"scala",
"sbt",
"sql",
"ddl",
"dml",
"erbsql",
"sql.erb",
"swift",
"log",
"tcl",
"tf",
"tfvars",
"hcl",
"sty",
"cls",
"textile",
"toml",
"tml",
"Cargo.lock",
"Gopkg.lock",
"Pipfile",
"ts",
"tsx",
"varlink",
"vim",
".vimrc",
"xml",
"xsd",
"xslt",
"tld",
"dtml",
"rss",
"opml",
"svg",
"yaml",
"yml",
"sublime-syntax",
];
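A minimal sketch of how a table like this can be consulted; the `is_bat_language` helper below is hypothetical and not part of the file above, it only shows a straightforward slice lookup against the constant.

// Hypothetical helper: true when the given file extension or filename
// appears in the BAT_LANGUAGES table above.
fn is_bat_language(extension: &str) -> bool {
    BAT_LANGUAGES.contains(&extension)
}

fn main() {
    assert!(is_bat_language("rs"));
    assert!(!is_bat_language("unknown-ext"));
}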


@@ -3,13 +3,11 @@ use crate::context::CommandRegistry;
 use crate::prelude::*;
 use futures::stream::StreamExt;
 use nu_errors::ShellError;
-use nu_protocol::{ReturnSuccess, Signature, UntaggedValue, Value};
+use nu_protocol::{Signature, UntaggedValue, Value};

 pub struct Count;

-#[derive(Deserialize)]
-pub struct CountArgs {}
-
+#[async_trait]
 impl WholeStreamCommand for Count {
     fn name(&self) -> &str {
         "count"
@@ -20,27 +18,39 @@ impl WholeStreamCommand for Count {
     }

     fn usage(&self) -> &str {
-        "Show the total number of rows."
+        "Show the total number of rows or items."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
-        registry: &CommandRegistry,
+        _registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, count)?.run()
+        let name = args.call_info.name_tag.clone();
+        let rows: Vec<Value> = args.input.collect().await;
+
+        Ok(OutputStream::one(
+            UntaggedValue::int(rows.len()).into_value(name),
+        ))
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![Example {
+            description: "Count the number of entries in a list",
+            example: "echo [1 2 3 4 5] | count",
+            result: Some(vec![UntaggedValue::int(5).into()]),
+        }]
     }
 }

-pub fn count(
-    CountArgs {}: CountArgs,
-    RunnableContext { input, name, .. }: RunnableContext,
-) -> Result<OutputStream, ShellError> {
-    let stream = async_stream! {
-        let rows: Vec<Value> = input.values.collect().await;
-        yield ReturnSuccess::value(UntaggedValue::int(rows.len()).into_value(name))
-    };
-
-    Ok(stream.to_output_stream())
+#[cfg(test)]
+mod tests {
+    use super::Count;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Count {})
+    }
 }


@@ -1,8 +1,8 @@
-use crate::commands::command::RunnablePerItemContext;
+use crate::commands::WholeStreamCommand;
 use crate::context::CommandRegistry;
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{CallInfo, Signature, SyntaxShape, Value};
+use nu_protocol::{Signature, SyntaxShape};
 use nu_source::Tagged;
 use std::path::PathBuf;
@@ -15,7 +15,8 @@ pub struct CopyArgs {
     pub recursive: Tagged<bool>,
 }

-impl PerItemCommand for Cpy {
+#[async_trait]
+impl WholeStreamCommand for Cpy {
     fn name(&self) -> &str {
         "cp"
     }
@@ -35,20 +36,41 @@ impl PerItemCommand for Cpy {
         "Copy files."
     }

-    fn run(
+    async fn run(
         &self,
-        call_info: &CallInfo,
-        _registry: &CommandRegistry,
-        raw_args: &RawCommandArgs,
-        _input: Value,
+        args: CommandArgs,
+        registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        call_info
-            .process(&raw_args.shell_manager, raw_args.ctrl_c.clone(), cp)?
-            .run()
+        let shell_manager = args.shell_manager.clone();
+        let name = args.call_info.name_tag.clone();
+        let (args, _) = args.process(&registry).await?;
+        shell_manager.cp(args, name)
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "Copy myfile to dir_b",
+                example: "cp myfile dir_b",
+                result: None,
+            },
+            Example {
+                description: "Recursively copy dir_a to dir_b",
+                example: "cp -r dir_a dir_b",
+                result: None,
+            },
+        ]
     }
 }

-fn cp(args: CopyArgs, context: &RunnablePerItemContext) -> Result<OutputStream, ShellError> {
-    let shell_manager = context.shell_manager.clone();
-    shell_manager.cp(args, context)
+#[cfg(test)]
+mod tests {
+    use super::Cpy;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Cpy {})
+    }
 }


@@ -7,10 +7,11 @@ use crate::commands::WholeStreamCommand;
 use chrono::{Datelike, TimeZone, Timelike};
 use core::fmt::Display;
 use indexmap::IndexMap;
-use nu_protocol::{Signature, UntaggedValue};
+use nu_protocol::{Signature, SyntaxShape, UntaggedValue};

 pub struct Date;

+#[async_trait]
 impl WholeStreamCommand for Date {
     fn name(&self) -> &str {
         "date"
@@ -20,27 +21,68 @@ impl WholeStreamCommand for Date {
         Signature::build("date")
             .switch("utc", "use universal time (UTC)", Some('u'))
             .switch("local", "use the local time", Some('l'))
+            .named(
+                "format",
+                SyntaxShape::String,
+                "report datetime in supplied strftime format",
+                Some('f'),
+            )
+            .switch("raw", "print date without tables", Some('r'))
     }

     fn usage(&self) -> &str {
         "Get the current datetime."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        date(args, registry)
+        date(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "Get the current local time and date",
+                example: "date",
+                result: None,
+            },
+            Example {
+                description: "Get the current UTC time and date",
+                example: "date --utc",
+                result: None,
+            },
+            Example {
+                description: "Get the current time and date and report it based on format",
+                example: "date --format '%Y-%m-%d %H:%M:%S.%f %z'",
+                result: None,
+            },
+            Example {
+                description: "Get the current time and date and report it without a table",
+                example: "date --format '%Y-%m-%d %H:%M:%S.%f %z' --raw",
+                result: None,
+            },
+        ]
     }
 }

-pub fn date_to_value<T: TimeZone>(dt: DateTime<T>, tag: Tag) -> Value
+pub fn date_to_value_raw<T: TimeZone>(dt: DateTime<T>, dt_format: String) -> String
+where
+    T::Offset: Display,
+{
+    let result = dt.format(&dt_format);
+    format!("{}", result)
+}
+
+pub fn date_to_value<T: TimeZone>(dt: DateTime<T>, tag: Tag, dt_format: String) -> Value
 where
     T::Offset: Display,
 {
     let mut indexmap = IndexMap::new();

+    if dt_format.is_empty() {
         indexmap.insert(
             "year".to_string(),
             UntaggedValue::int(dt.year()).into_value(&tag),
@@ -71,25 +113,63 @@ where
             "timezone".to_string(),
             UntaggedValue::string(format!("{}", tz)).into_value(&tag),
         );
+    } else {
+        let result = dt.format(&dt_format);
+        indexmap.insert(
+            "formatted".to_string(),
+            UntaggedValue::string(format!("{}", result)).into_value(&tag),
+        );
+    }

     UntaggedValue::Row(Dictionary::from(indexmap)).into_value(&tag)
 }

-pub fn date(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-    let args = args.evaluate_once(registry)?;
-
-    let mut date_out = VecDeque::new();
+pub async fn date(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let args = args.evaluate_once(&registry).await?;
     let tag = args.call_info.name_tag.clone();
+    let raw = args.has("raw");
+
+    let dt_fmt = if args.has("format") {
+        if let Some(dt_fmt) = args.get("format") {
+            dt_fmt.convert_to_string()
+        } else {
+            "".to_string()
+        }
+    } else {
+        "".to_string()
+    };

     let value = if args.has("utc") {
         let utc: DateTime<Utc> = Utc::now();
-        date_to_value(utc, tag)
+        if raw {
+            UntaggedValue::string(date_to_value_raw(utc, dt_fmt)).into_untagged_value()
+        } else {
+            date_to_value(utc, tag, dt_fmt)
+        }
     } else {
         let local: DateTime<Local> = Local::now();
-        date_to_value(local, tag)
+        if raw {
+            UntaggedValue::string(date_to_value_raw(local, dt_fmt)).into_untagged_value()
+        } else {
+            date_to_value(local, tag, dt_fmt)
+        }
     };

-    date_out.push_back(value);
-
-    Ok(futures::stream::iter(date_out).to_output_stream())
+    Ok(OutputStream::one(value))
+}
+
+#[cfg(test)]
+mod tests {
+    use super::Date;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Date {})
+    }
 }
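A small standalone illustration (it only needs the chrono crate) of the strftime-style formatting that the new --format/--raw path hands to chrono's `format`, as `date_to_value_raw` does above; the printed value is just an example.

use chrono::{DateTime, Local};

fn main() {
    let now: DateTime<Local> = Local::now();
    // Prints something like "2020-08-12 15:59:28.000000000 +1200".
    println!("{}", now.format("%Y-%m-%d %H:%M:%S.%f %z"));
}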


@@ -10,6 +10,7 @@ pub struct DebugArgs {
     raw: bool,
 }

+#[async_trait]
 impl WholeStreamCommand for Debug {
     fn name(&self) -> &str {
         "debug"
@@ -23,21 +24,22 @@ impl WholeStreamCommand for Debug {
         "Print the Rust debug representation of the values"
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, debug_value)?.run()
+        debug_value(args, registry).await
     }
 }

-fn debug_value(
-    DebugArgs { raw }: DebugArgs,
-    RunnableContext { input, .. }: RunnableContext,
-) -> Result<impl ToOutputStream, ShellError> {
+async fn debug_value(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let (DebugArgs { raw }, input) = args.process(&registry).await?;
     Ok(input
-        .values
         .map(move |v| {
             if raw {
                 ReturnSuccess::value(
@@ -49,3 +51,15 @@ fn debug_value(
         })
         .to_output_stream())
 }
+
+#[cfg(test)]
+mod tests {
+    use super::Debug;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Debug {})
+    }
+}


@@ -14,6 +14,7 @@ struct DefaultArgs {

 pub struct Default;

+#[async_trait]
 impl WholeStreamCommand for Default {
     fn name(&self) -> &str {
         "default"
@@ -33,44 +34,60 @@ impl WholeStreamCommand for Default {
         "Sets a default row's column if missing."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, default)?.run()
+        default(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![Example {
+            description: "Give a default 'target' to all file entries",
+            example: "ls -la | default target 'nothing'",
+            result: None,
+        }]
     }
 }

-fn default(
-    DefaultArgs { column, value }: DefaultArgs,
-    RunnableContext { input, .. }: RunnableContext,
+async fn default(
+    args: CommandArgs,
+    registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
-    let stream = input
-        .values
-        .map(move |item| {
-            let mut result = VecDeque::new();
-
-            let should_add = match item {
-                Value {
-                    value: UntaggedValue::Row(ref r),
-                    ..
-                } => r.get_data(&column.item).borrow().is_none(),
-                _ => false,
-            };
+    let registry = registry.clone();
+    let (DefaultArgs { column, value }, input) = args.process(&registry).await?;
+
+    Ok(input
+        .map(move |item| {
+            let should_add = matches!(
+                item,
+                Value {
+                    value: UntaggedValue::Row(ref r),
+                    ..
+                } if r.get_data(&column.item).borrow().is_none()
+            );

             if should_add {
                 match item.insert_data_at_path(&column.item, value.clone()) {
-                    Some(new_value) => result.push_back(ReturnSuccess::value(new_value)),
-                    None => result.push_back(ReturnSuccess::value(item)),
+                    Some(new_value) => ReturnSuccess::value(new_value),
+                    None => ReturnSuccess::value(item),
                 }
             } else {
-                result.push_back(ReturnSuccess::value(item));
+                ReturnSuccess::value(item)
             }
-
-            futures::stream::iter(result)
         })
-        .flatten();
-
-    Ok(stream.to_output_stream())
+        .to_output_stream())
+}
+
+#[cfg(test)]
+mod tests {
+    use super::Default;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Default {})
+    }
 }


@@ -0,0 +1,135 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, hir::ExternalRedirection, ReturnSuccess, Signature, SyntaxShape, Value,
};
pub struct Do;
#[derive(Deserialize, Debug)]
struct DoArgs {
block: Block,
ignore_errors: bool,
}
#[async_trait]
impl WholeStreamCommand for Do {
fn name(&self) -> &str {
"do"
}
fn signature(&self) -> Signature {
Signature::build("do")
.required("block", SyntaxShape::Block, "the block to run ")
.switch(
"ignore_errors",
"ignore errors as the block runs",
Some('i'),
)
}
fn usage(&self) -> &str {
"Runs a block, optionally ignoring errors"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
do_(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Run the block",
example: r#"do { echo hello }"#,
result: Some(vec![Value::from("hello")]),
},
Example {
description: "Run the block and ignore errors",
example: r#"do -i { thisisnotarealcommand }"#,
result: Some(vec![Value::nothing()]),
},
]
}
}
async fn do_(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let external_redirection = raw_args.call_info.args.external_redirection;
let mut context = Context::from_raw(&raw_args, &registry);
let scope = raw_args.call_info.scope.clone();
let (
DoArgs {
ignore_errors,
mut block,
},
input,
) = raw_args.process(&registry).await?;
let block_redirection = match external_redirection {
ExternalRedirection::None => {
if ignore_errors {
ExternalRedirection::Stderr
} else {
ExternalRedirection::None
}
}
ExternalRedirection::Stdout => {
if ignore_errors {
ExternalRedirection::StdoutAndStderr
} else {
ExternalRedirection::Stdout
}
}
x => x,
};
block.set_redirect(block_redirection);
let result = run_block(
&block,
&mut context,
input,
&scope.it,
&scope.vars,
&scope.env,
)
.await;
if ignore_errors {
// To properly ignore errors we need to redirect stderr, consume it, and remove
// any errors we see in the process.
match result {
Ok(mut stream) => {
let output = stream.drain_vec().await;
context.clear_errors();
Ok(futures::stream::iter(output).to_output_stream())
}
Err(_) => Ok(OutputStream::one(ReturnSuccess::value(Value::nothing()))),
}
} else {
result.map(|x| x.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::Do;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Do {})
}
}
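The redirection choice that `do -i` makes before running the block can be restated as a small pure function; the enum below is a local stand-in used only for this sketch, not the `ExternalRedirection` type from nu_protocol.

// Local stand-in for the redirection modes; when errors are ignored,
// stderr is captured alongside whatever was already being redirected.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Redirection {
    None,
    Stdout,
    Stderr,
    StdoutAndStderr,
}

fn redirection_for(ignore_errors: bool, current: Redirection) -> Redirection {
    match (ignore_errors, current) {
        (true, Redirection::None) => Redirection::Stderr,
        (true, Redirection::Stdout) => Redirection::StdoutAndStderr,
        (_, other) => other,
    }
}

fn main() {
    assert_eq!(redirection_for(true, Redirection::None), Redirection::Stderr);
    assert_eq!(redirection_for(false, Redirection::Stdout), Redirection::Stdout);
}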


@@ -0,0 +1,95 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct Drop;
#[derive(Deserialize)]
pub struct DropArgs {
rows: Option<Tagged<u64>>,
}
#[async_trait]
impl WholeStreamCommand for Drop {
fn name(&self) -> &str {
"drop"
}
fn signature(&self) -> Signature {
Signature::build("drop").optional(
"rows",
SyntaxShape::Number,
"starting from the back, the number of rows to drop",
)
}
fn usage(&self) -> &str {
"Drop the last number of rows."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
drop(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Remove the last item of a list/table",
example: "echo [1 2 3] | drop",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
]),
},
Example {
description: "Remove the last 2 items of a list/table",
example: "echo [1 2 3] | drop 2",
result: Some(vec![UntaggedValue::int(1).into()]),
},
]
}
}
async fn drop(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let (DropArgs { rows }, input) = args.process(&registry).await?;
let v: Vec<_> = input.into_vec().await;
let rows_to_drop = if let Some(quantity) = rows {
*quantity as usize
} else {
1
};
Ok(if rows_to_drop == 0 {
futures::stream::iter(v).to_output_stream()
} else {
let k = if v.len() < rows_to_drop {
0
} else {
v.len() - rows_to_drop
};
let iter = v.into_iter().take(k);
futures::stream::iter(iter).to_output_stream()
})
}
#[cfg(test)]
mod tests {
use super::Drop;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Drop {})
}
}
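The same keep-all-but-the-last-n arithmetic as `drop`, sketched against a plain Vec so the edge case of dropping more rows than exist is easy to see; `drop_last` is a hypothetical helper written for this illustration, not part of the command.

// Keep everything except the final `n` items; saturating_sub gives 0
// when n >= v.len(), matching the guard in the command above.
fn drop_last(v: Vec<i64>, n: usize) -> Vec<i64> {
    let keep = v.len().saturating_sub(n);
    v.into_iter().take(keep).collect()
}

fn main() {
    assert_eq!(drop_last(vec![1, 2, 3], 1), vec![1, 2]);
    assert_eq!(drop_last(vec![1, 2, 3], 5), Vec::<i64>::new());
}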


@@ -1,12 +1,10 @@
-extern crate filesize;
-use crate::commands::command::RunnablePerItemContext;
+use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
 use filesize::file_real_size_fast;
 use glob::*;
 use indexmap::map::IndexMap;
 use nu_errors::ShellError;
-use nu_protocol::{CallInfo, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
+use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
 use nu_source::Tagged;
 use std::path::PathBuf;
 use std::sync::atomic::Ordering;
@@ -32,7 +30,8 @@ pub struct DuArgs {
     min_size: Option<Tagged<u64>>,
 }

-impl PerItemCommand for Du {
+#[async_trait]
+impl WholeStreamCommand for Du {
     fn name(&self) -> &str {
         NAME
     }
@@ -42,7 +41,7 @@ impl PerItemCommand for Du {
             .optional("path", SyntaxShape::Pattern, "starting directory")
             .switch(
                 "all",
-                "Output File sizes as well as directory sizes",
+                "Output file sizes as well as directory sizes",
                 Some('a'),
             )
             .switch(
@@ -74,115 +73,121 @@ impl PerItemCommand for Du {
         "Find disk usage sizes of specified items"
     }

-    fn run(
+    async fn run(
         &self,
-        call_info: &CallInfo,
-        _registry: &CommandRegistry,
-        raw_args: &RawCommandArgs,
-        _input: Value,
+        args: CommandArgs,
+        registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        call_info
-            .process(&raw_args.shell_manager, raw_args.ctrl_c.clone(), du)?
-            .run()
+        du(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![Example {
+            description: "Disk usage of the current directory",
+            example: "du",
+            result: None,
+        }]
     }
 }

-fn du(args: DuArgs, ctx: &RunnablePerItemContext) -> Result<OutputStream, ShellError> {
-    let tag = ctx.name.clone();
+async fn du(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let tag = args.call_info.name_tag.clone();
+    let ctrl_c = args.ctrl_c.clone();
+    let ctrl_c_copy = ctrl_c.clone();

-    let exclude = args
-        .exclude
-        .clone()
-        .map_or(Ok(None), move |x| match Pattern::new(&x.item) {
-            Ok(p) => Ok(Some(p)),
-            Err(e) => Err(ShellError::labeled_error(
-                e.msg,
-                "Glob error",
-                x.tag.clone(),
-            )),
-        })?;
+    let (args, _): (DuArgs, _) = args.process(&registry).await?;
+    let exclude = args.exclude.map_or(Ok(None), move |x| {
+        Pattern::new(&x.item)
+            .map(Option::Some)
+            .map_err(|e| ShellError::labeled_error(e.msg, "glob error", x.tag.clone()))
+    })?;

-    let path = args.path.clone();
-    let filter_files = path.is_none();
-    let paths = match path {
-        Some(p) => match glob::glob_with(
-            p.item.to_str().expect("Why isn't this encoded properly?"),
-            GLOB_PARAMS,
-        ) {
-            Ok(g) => Ok(g),
-            Err(e) => Err(ShellError::labeled_error(
-                e.msg,
-                "Glob error",
-                p.tag.clone(),
-            )),
-        },
-        None => match glob::glob_with("*", GLOB_PARAMS) {
-            Ok(g) => Ok(g),
-            Err(e) => Err(ShellError::labeled_error(e.msg, "Glob error", tag.clone())),
-        },
-    }?
+    let include_files = args.all;
+    let paths = match args.path {
+        Some(p) => {
+            let p = p.item.to_str().expect("Why isn't this encoded properly?");
+            glob::glob_with(p, GLOB_PARAMS)
+        }
+        None => glob::glob_with("*", GLOB_PARAMS),
+    }
+    .map_err(|e| ShellError::labeled_error(e.msg, "glob error", tag.clone()))?
     .filter(move |p| {
-        if filter_files {
+        if include_files {
             true
         } else {
             match p {
                 Ok(f) if f.is_dir() => true,
                 Err(e) if e.path().is_dir() => true,
                 _ => false,
             }
         }
     })
-    .map(move |p| match p {
-        Err(e) => Err(glob_err_into(e)),
-        Ok(s) => Ok(s),
-    });
-
-    let ctrl_c = ctx.ctrl_c.clone();
+    .map(|v| v.map_err(glob_err_into));

     let all = args.all;
     let deref = args.deref;
     let max_depth = args.max_depth.map(|f| f.item);
     let min_size = args.min_size.map(|f| f.item);

-    let stream = async_stream! {
-        let params = DirBuilder {
-            tag: tag.clone(),
-            min: min_size,
-            deref,
-            ex: exclude,
-            all,
-        };
-
-        for path in paths {
-            if ctrl_c.load(Ordering::SeqCst) {
-                break;
-            }
-            match path {
-                Ok(p) => {
-                    if p.is_dir() {
-                        yield Ok(ReturnSuccess::Value(
-                            DirInfo::new(p, &params, max_depth).into(),
-                        ));
-                    } else {
-                        match FileInfo::new(p, deref, tag.clone()) {
-                            Ok(f) => yield Ok(ReturnSuccess::Value(f.into())),
-                            Err(e) => yield Err(e)
-                        }
-                    }
-                }
-                Err(e) => yield Err(e),
-            }
-        }
-    };
-
-    Ok(stream.to_output_stream())
+    let params = DirBuilder {
+        tag: tag.clone(),
+        min: min_size,
+        deref,
+        exclude,
+        all,
+    };
+
+    let inp = futures::stream::iter(paths);
+
+    Ok(inp
+        .flat_map(move |path| match path {
+            Ok(p) => {
+                let mut output = vec![];
+                if p.is_dir() {
+                    output.push(Ok(ReturnSuccess::Value(
+                        DirInfo::new(p, &params, max_depth, ctrl_c.clone()).into(),
+                    )));
+                } else {
+                    for v in FileInfo::new(p, deref, tag.clone()).into_iter() {
+                        output.push(Ok(ReturnSuccess::Value(v.into())));
+                    }
+                }
+                futures::stream::iter(output)
+            }
+            Err(e) => futures::stream::iter(vec![Err(e)]),
+        })
+        .interruptible(ctrl_c_copy)
+        .to_output_stream())
 }

-struct DirBuilder {
+pub struct DirBuilder {
     tag: Tag,
     min: Option<u64>,
     deref: bool,
-    ex: Option<Pattern>,
+    exclude: Option<Pattern>,
     all: bool,
 }

-struct DirInfo {
+impl DirBuilder {
+    pub fn new(
+        tag: Tag,
+        min: Option<u64>,
+        deref: bool,
+        exclude: Option<Pattern>,
+        all: bool,
+    ) -> DirBuilder {
+        DirBuilder {
+            tag,
+            min,
+            deref,
+            exclude,
+            all,
+        }
+    }
+}
+
+pub struct DirInfo {
     dirs: Vec<DirInfo>,
     files: Vec<FileInfo>,
     errors: Vec<ShellError>,
@@ -225,7 +230,12 @@ impl FileInfo {
 }

 impl DirInfo {
-    fn new(path: impl Into<PathBuf>, params: &DirBuilder, depth: Option<u64>) -> Self {
+    pub fn new(
+        path: impl Into<PathBuf>,
+        params: &DirBuilder,
+        depth: Option<u64>,
+        ctrl_c: Arc<AtomicBool>,
+    ) -> Self {
         let path = path.into();

         let mut s = Self {
@@ -241,17 +251,19 @@ impl DirInfo {
         match std::fs::read_dir(&s.path) {
             Ok(d) => {
                 for f in d {
+                    if ctrl_c.load(Ordering::SeqCst) {
+                        break;
+                    }
+
                     match f {
                         Ok(i) => match i.file_type() {
                             Ok(t) if t.is_dir() => {
-                                s = s.add_dir(i.path(), depth, &params);
+                                s = s.add_dir(i.path(), depth, &params, ctrl_c.clone())
                             }
-                            Ok(_t) => {
-                                s = s.add_file(i.path(), &params);
-                            }
-                            Err(e) => s = s.add_error(ShellError::from(e)),
+                            Ok(_t) => s = s.add_file(i.path(), &params),
+                            Err(e) => s = s.add_error(e.into()),
                         },
-                        Err(e) => s = s.add_error(ShellError::from(e)),
+                        Err(e) => s = s.add_error(e.into()),
                     }
                 }
             }
@@ -265,6 +277,7 @@ impl DirInfo {
         path: impl Into<PathBuf>,
         mut depth: Option<u64>,
         params: &DirBuilder,
+        ctrl_c: Arc<AtomicBool>,
     ) -> Self {
         if let Some(current) = depth {
             if let Some(new) = current.checked_sub(1) {
@@ -274,7 +287,7 @@ impl DirInfo {
             }
         }

-        let d = DirInfo::new(path, &params, depth);
+        let d = DirInfo::new(path, &params, depth, ctrl_c);
         self.size += d.size;
         self.blocks += d.blocks;
         self.dirs.push(d);
@@ -283,8 +296,11 @@ impl DirInfo {
     fn add_file(mut self, f: impl Into<PathBuf>, params: &DirBuilder) -> Self {
         let f = f.into();
-        let ex = params.ex.as_ref().map_or(false, |x| x.matches_path(&f));
-        if !ex {
+        let include = params
+            .exclude
+            .as_ref()
+            .map_or(true, |x| !x.matches_path(&f));
+        if include {
             match FileInfo::new(f, params.deref, self.tag.clone()) {
                 Ok(file) => {
                     let inc = params.min.map_or(true, |s| file.size >= s);
@@ -306,6 +322,10 @@ impl DirInfo {
         self.errors.push(e);
         self
     }
+
+    pub fn get_size(&self) -> u64 {
+        self.size
+    }
 }

 fn glob_err_into(e: GlobError) -> ShellError {
@@ -313,55 +333,51 @@ fn glob_err_into(e: GlobError) -> ShellError {
     ShellError::from(e)
 }

+fn value_from_vec<V>(vec: Vec<V>, tag: &Tag) -> Value
+where
+    V: Into<Value>,
+{
+    if vec.is_empty() {
+        UntaggedValue::nothing()
+    } else {
+        let values = vec.into_iter().map(Into::into).collect::<Vec<Value>>();
+        UntaggedValue::Table(values)
+    }
+    .into_value(tag)
+}
+
 impl From<DirInfo> for Value {
     fn from(d: DirInfo) -> Self {
         let mut r: IndexMap<String, Value> = IndexMap::new();

         r.insert(
             "path".to_string(),
-            UntaggedValue::path(d.path).retag(d.tag.clone()),
+            UntaggedValue::path(d.path).into_value(&d.tag),
         );

         r.insert(
             "apparent".to_string(),
-            UntaggedValue::bytes(d.size).retag(d.tag.clone()),
+            UntaggedValue::filesize(d.size).into_value(&d.tag),
         );

         r.insert(
             "physical".to_string(),
-            UntaggedValue::bytes(d.blocks).retag(d.tag.clone()),
+            UntaggedValue::filesize(d.blocks).into_value(&d.tag),
         );

-        if !d.files.is_empty() {
-            let v = Value {
-                value: UntaggedValue::Table(
-                    d.files
-                        .into_iter()
-                        .map(move |f| f.into())
-                        .collect::<Vec<Value>>(),
-                ),
-                tag: d.tag.clone(),
-            };
-            r.insert("files".to_string(), v);
-        }
-        if !d.dirs.is_empty() {
-            let v = Value {
-                value: UntaggedValue::Table(
-                    d.dirs
-                        .into_iter()
-                        .map(move |d| d.into())
-                        .collect::<Vec<Value>>(),
-                ),
-                tag: d.tag.clone(),
-            };
-            r.insert("directories".to_string(), v);
-        }
+        r.insert("directories".to_string(), value_from_vec(d.dirs, &d.tag));
+
+        r.insert("files".to_string(), value_from_vec(d.files, &d.tag));
+
         if !d.errors.is_empty() {
-            let v = Value {
-                value: UntaggedValue::Table(
-                    d.errors
-                        .into_iter()
-                        .map(move |e| UntaggedValue::Error(e).into_untagged_value())
-                        .collect::<Vec<Value>>(),
-                ),
-                tag: d.tag.clone(),
-            };
+            let v = UntaggedValue::Table(
+                d.errors
+                    .into_iter()
+                    .map(move |e| UntaggedValue::Error(e).into_untagged_value())
+                    .collect::<Vec<Value>>(),
+            )
+            .into_value(&d.tag);
+
             r.insert("errors".to_string(), v);
         }
@@ -375,22 +391,47 @@ impl From<DirInfo> for Value {
 impl From<FileInfo> for Value {
     fn from(f: FileInfo) -> Self {
         let mut r: IndexMap<String, Value> = IndexMap::new();

         r.insert(
             "path".to_string(),
-            UntaggedValue::path(f.path).retag(f.tag.clone()),
+            UntaggedValue::path(f.path).into_value(&f.tag),
         );

         r.insert(
             "apparent".to_string(),
-            UntaggedValue::bytes(f.size).retag(f.tag.clone()),
+            UntaggedValue::filesize(f.size).into_value(&f.tag),
         );

-        let b = match f.blocks {
-            Some(k) => UntaggedValue::bytes(k).retag(f.tag.clone()),
-            None => UntaggedValue::nothing().retag(f.tag.clone()),
-        };
+        let b = f
+            .blocks
+            .map(UntaggedValue::filesize)
+            .unwrap_or_else(UntaggedValue::nothing)
+            .into_value(&f.tag);
+
         r.insert("physical".to_string(), b);

-        Value {
-            value: UntaggedValue::row(r),
-            tag: f.tag,
+        r.insert(
+            "directories".to_string(),
+            UntaggedValue::nothing().into_value(&f.tag),
+        );
+
+        r.insert(
+            "files".to_string(),
+            UntaggedValue::nothing().into_value(&f.tag),
+        );
+
+        UntaggedValue::row(r).into_value(&f.tag)
     }
 }
+
+#[cfg(test)]
+mod tests {
+    use super::Du;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Du {})
+    }
+}


@@ -0,0 +1,176 @@
use crate::commands::classified::block::run_block;
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::once;
use nu_errors::ShellError;
use nu_protocol::{
hir::Block, hir::Expression, hir::SpannedExpression, hir::Synthetic, Scope, Signature,
SyntaxShape, TaggedDictBuilder, UntaggedValue, Value,
};
use nu_source::Tagged;
pub struct Each;
#[derive(Deserialize)]
pub struct EachArgs {
block: Block,
numbered: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Each {
fn name(&self) -> &str {
"each"
}
fn signature(&self) -> Signature {
Signature::build("each")
.required("block", SyntaxShape::Block, "the block to run on each row")
.switch(
"numbered",
"returned a numbered item ($it.index and $it.item)",
Some('n'),
)
}
fn usage(&self) -> &str {
"Run a block on each row of the table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
each(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Echo the sum of each row",
example: "echo [[1 2] [3 4]] | each { echo $it | math sum }",
result: None,
},
Example {
description: "Echo the square of each integer",
example: "echo [1 2 3] | each { echo $(= $it * $it) }",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(4).into(),
UntaggedValue::int(9).into(),
]),
},
Example {
description: "Number each item and echo a message",
example:
"echo ['bob' 'fred'] | each --numbered { echo `{{$it.index}} is {{$it.item}}` }",
result: Some(vec![Value::from("0 is bob"), Value::from("1 is fred")]),
},
]
}
}
fn is_expanded_it_usage(head: &SpannedExpression) -> bool {
matches!(&*head, SpannedExpression {
expr: Expression::Synthetic(Synthetic::String(s)),
..
} if s == "expanded-each")
}
pub async fn process_row(
block: Arc<Block>,
scope: Arc<Scope>,
head: Arc<Box<SpannedExpression>>,
mut context: Arc<Context>,
input: Value,
) -> Result<OutputStream, ShellError> {
let input_clone = input.clone();
let input_stream = if is_expanded_it_usage(&head) {
InputStream::empty()
} else {
once(async { Ok(input_clone) }).to_input_stream()
};
Ok(run_block(
&block,
Arc::make_mut(&mut context),
input_stream,
&input,
&scope.vars,
&scope.env,
)
.await?
.to_output_stream())
}
pub(crate) fn make_indexed_item(index: usize, item: Value) -> Value {
let mut dict = TaggedDictBuilder::new(item.tag());
dict.insert_untagged("index", UntaggedValue::int(index));
dict.insert_value("item", item);
dict.into_value()
}
async fn each(
raw_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let head = Arc::new(raw_args.call_info.args.head.clone());
let scope = Arc::new(raw_args.call_info.scope.clone());
let context = Arc::new(Context::from_raw(&raw_args, &registry));
let (each_args, input): (EachArgs, _) = raw_args.process(&registry).await?;
let block = Arc::new(each_args.block);
if each_args.numbered.item {
Ok(input
.enumerate()
.then(move |input| {
let block = block.clone();
let scope = scope.clone();
let head = head.clone();
let context = context.clone();
let row = make_indexed_item(input.0, input.1);
async {
match process_row(block, scope, head, context, row).await {
Ok(s) => s,
Err(e) => OutputStream::one(Err(e)),
}
}
})
.flatten()
.to_output_stream())
} else {
Ok(input
.then(move |input| {
let block = block.clone();
let scope = scope.clone();
let head = head.clone();
let context = context.clone();
async {
match process_row(block, scope, head, context, input).await {
Ok(s) => s,
Err(e) => OutputStream::one(Err(e)),
}
}
})
.flatten()
.to_output_stream())
}
}
#[cfg(test)]
mod tests {
use super::Each;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Each {})
}
}


@ -1,10 +1,20 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{CallInfo, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value}; use nu_protocol::hir::Operator;
use nu_protocol::{
Primitive, Range, RangeInclusion, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
pub struct Echo; pub struct Echo;
impl PerItemCommand for Echo { #[derive(Deserialize)]
pub struct EchoArgs {
pub rest: Vec<Value>,
}
#[async_trait]
impl WholeStreamCommand for Echo {
fn name(&self) -> &str { fn name(&self) -> &str {
"echo" "echo"
} }
@ -17,51 +27,122 @@ impl PerItemCommand for Echo {
"Echo the arguments back to the user." "Echo the arguments back to the user."
} }
fn run( async fn run(
&self, &self,
call_info: &CallInfo, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
raw_args: &RawCommandArgs,
_input: Value,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
run(call_info, registry, raw_args) echo(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Put a hello message in the pipeline",
example: "echo 'hello'",
result: Some(vec![Value::from("hello")]),
},
Example {
description: "Print the value of the special '$nu' variable",
example: "echo $nu",
result: None,
},
]
} }
} }
fn run( async fn echo(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
call_info: &CallInfo, let registry = registry.clone();
_registry: &CommandRegistry, let (args, _): (EchoArgs, _) = args.process(&registry).await?;
_raw_args: &RawCommandArgs,
) -> Result<OutputStream, ShellError> {
let mut output = vec![];
if let Some(ref positional) = call_info.args.positional { let stream = args.rest.into_iter().map(|i| match i.as_string() {
for i in positional { Ok(s) => OutputStream::one(Ok(ReturnSuccess::Value(
match i.as_string() {
Ok(s) => {
output.push(Ok(ReturnSuccess::Value(
UntaggedValue::string(s).into_value(i.tag.clone()), UntaggedValue::string(s).into_value(i.tag.clone()),
))); ))),
}
_ => match i { _ => match i {
Value { Value {
value: UntaggedValue::Table(table), value: UntaggedValue::Table(table),
.. ..
} => { } => futures::stream::iter(table.into_iter().map(ReturnSuccess::value))
for value in table { .to_output_stream(),
output.push(Ok(ReturnSuccess::Value(value.clone()))); Value {
value: UntaggedValue::Primitive(Primitive::Range(range)),
tag,
} => futures::stream::iter(RangeIterator::new(*range, tag)).to_output_stream(),
_ => OutputStream::one(Ok(ReturnSuccess::Value(i.clone()))),
},
});
Ok(futures::stream::iter(stream).flatten().to_output_stream())
}
struct RangeIterator {
curr: Primitive,
end: Primitive,
tag: Tag,
is_end_inclusive: bool,
is_done: bool,
}
impl RangeIterator {
pub fn new(range: Range, tag: Tag) -> RangeIterator {
RangeIterator {
curr: range.from.0.item,
end: range.to.0.item,
tag,
is_end_inclusive: matches!(range.to.1, RangeInclusion::Inclusive),
is_done: false,
} }
} }
}
impl Iterator for RangeIterator {
type Item = Result<ReturnSuccess, ShellError>;
fn next(&mut self) -> Option<Self::Item> {
if self.curr != self.end {
let output = UntaggedValue::Primitive(self.curr.clone()).into_value(self.tag.clone());
self.curr = match crate::data::value::compute_values(
Operator::Plus,
&UntaggedValue::Primitive(self.curr.clone()),
&UntaggedValue::int(1),
) {
Ok(result) => match result {
UntaggedValue::Primitive(p) => p,
_ => { _ => {
output.push(Ok(ReturnSuccess::Value(i.clone()))); return Some(Err(ShellError::unimplemented(
"Internal error: expected a primitive result from increment",
)));
} }
}, },
Err((left_type, right_type)) => {
return Some(Err(ShellError::coerce_error(
left_type.spanned(self.tag.span),
right_type.spanned(self.tag.span),
)));
}
};
Some(ReturnSuccess::value(output))
} else if self.is_end_inclusive && !self.is_done {
self.is_done = true;
Some(ReturnSuccess::value(
UntaggedValue::Primitive(self.curr.clone()).into_value(self.tag.clone()),
))
} else {
// TODO: add inclusive/exclusive ranges
None
} }
} }
} }
// TODO: This whole block can probably be replaced with `.map()` #[cfg(test)]
let stream = futures::stream::iter(output); mod tests {
use super::Echo;
Ok(stream.to_output_stream()) #[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Echo {})
}
} }
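
RangeIterator above walks from the start of a range to its end by repeatedly adding 1 through compute_values, emitting the end point once more when the range is inclusive. A self-contained sketch of the same bookkeeping over plain i64 values (assuming, as the command does, that the current value starts at or below the end):

struct SimpleRange {
    curr: i64,
    end: i64,
    is_end_inclusive: bool,
    is_done: bool,
}

impl Iterator for SimpleRange {
    type Item = i64;

    fn next(&mut self) -> Option<i64> {
        if self.curr != self.end {
            let out = self.curr;
            self.curr += 1; // the real command goes through Operator::Plus
            Some(out)
        } else if self.is_end_inclusive && !self.is_done {
            self.is_done = true;
            Some(self.curr) // emit the end point exactly once
        } else {
            None
        }
    }
}

fn main() {
    let range = SimpleRange { curr: 1, end: 4, is_end_inclusive: true, is_done: false };
    assert_eq!(range.collect::<Vec<_>>(), vec![1, 2, 3, 4]); // like `echo 1..4`
}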


@ -1,72 +0,0 @@
use crate::commands::PerItemCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{CallInfo, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_value_ext::ValueExt;
pub struct Edit;
impl PerItemCommand for Edit {
fn name(&self) -> &str {
"edit"
}
fn signature(&self) -> Signature {
Signature::build("edit")
.required(
"Field",
SyntaxShape::ColumnPath,
"the name of the column to edit",
)
.required(
"Value",
SyntaxShape::String,
"the new value to give the cell(s)",
)
}
fn usage(&self) -> &str {
"Edit an existing column to have a new value."
}
fn run(
&self,
call_info: &CallInfo,
_registry: &CommandRegistry,
_raw_args: &RawCommandArgs,
value: Value,
) -> Result<OutputStream, ShellError> {
let value_tag = value.tag();
let field = call_info.args.expect_nth(0)?.as_column_path()?;
let replacement = call_info.args.expect_nth(1)?.tagged_unknown();
let stream = match value {
obj
@
Value {
value: UntaggedValue::Row(_),
..
} => match obj.replace_data_at_column_path(&field, replacement.item.clone()) {
Some(v) => futures::stream::iter(vec![Ok(ReturnSuccess::Value(v))]),
None => {
return Err(ShellError::labeled_error(
"edit could not find place to insert column",
"column name",
&field.tag,
))
}
},
_ => {
return Err(ShellError::labeled_error(
"Unrecognized type in stream",
"original value",
value_tag,
))
}
};
Ok(stream.to_output_stream())
}
}


@ -1,49 +1,99 @@
use crate::commands::PerItemCommand;
use crate::commands::UnevaluatedCallInfo; use crate::commands::UnevaluatedCallInfo;
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::hir::ExternalRedirection;
use nu_protocol::{ use nu_protocol::{
CallInfo, CommandAction, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value, CommandAction, Primitive, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
}; };
use nu_source::Tagged;
use std::path::PathBuf;
pub struct Enter; pub struct Enter;
impl PerItemCommand for Enter { #[derive(Deserialize)]
pub struct EnterArgs {
location: Tagged<PathBuf>,
encoding: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for Enter {
fn name(&self) -> &str { fn name(&self) -> &str {
"enter" "enter"
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("enter").required( Signature::build("enter")
.required(
"location", "location",
SyntaxShape::Path, SyntaxShape::Path,
"the location to create a new shell from", "the location to create a new shell from",
) )
.named(
"encoding",
SyntaxShape::String,
"encoding to use to open file",
Some('e'),
)
} }
fn usage(&self) -> &str { fn usage(&self) -> &str {
"Create a new shell and begin at this path." r#"Create a new shell and begin at this path.
Multiple encodings are supported for reading text files by using
the '--encoding <encoding>' parameter. Here is an example of a few:
big5, euc-jp, euc-kr, gbk, iso-8859-1, utf-16, cp1252, latin5
For a more complete list of encodings please refer to the encoding_rs
documentation link at https://docs.rs/encoding_rs/0.8.23/encoding_rs/#statics"#
} }
fn run( async fn run(
&self, &self,
call_info: &CallInfo, args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
enter(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Enter a path as a new shell",
example: "enter ../projectB",
result: None,
},
Example {
description: "Enter a file as a new shell",
example: "enter package.json",
result: None,
},
Example {
description: "Enters file with iso-8859-1 encoding",
example: "enter file.csv --encoding iso-8859-1",
result: None,
},
]
}
}
async fn enter(
raw_args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
raw_args: &RawCommandArgs,
_input: Value,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
let registry = registry.clone(); let registry = registry.clone();
let raw_args = raw_args.clone(); let scope = raw_args.call_info.scope.clone();
match call_info.args.expect_nth(0)? { let shell_manager = raw_args.shell_manager.clone();
Value { let head = raw_args.call_info.args.head.clone();
value: UntaggedValue::Primitive(Primitive::Path(location)), let ctrl_c = raw_args.ctrl_c.clone();
tag, let current_errors = raw_args.current_errors.clone();
.. let host = raw_args.host.clone();
} => { let tag = raw_args.call_info.name_tag.clone();
let (EnterArgs { location, encoding }, _) = raw_args.process(&registry).await?;
let location_string = location.display().to_string(); let location_string = location.display().to_string();
let location_clone = location_string.clone(); let location_clone = location_string.clone();
let tag_clone = tag.clone();
if location_string.starts_with("help") { if location_string.starts_with("help") {
let spec = location_string.split(':').collect::<Vec<&str>>(); let spec = location_string.split(':').collect::<Vec<&str>>();
@ -52,105 +102,100 @@ impl PerItemCommand for Enter {
let (_, command) = (spec[0], spec[1]); let (_, command) = (spec[0], spec[1]);
if registry.has(command) { if registry.has(command) {
return Ok(vec![Ok(ReturnSuccess::Action( return Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterHelpShell( CommandAction::EnterHelpShell(
UntaggedValue::string(command).into_value(Tag::unknown()), UntaggedValue::string(command).into_value(Tag::unknown()),
), ),
))] )));
.into());
} }
} }
Ok(vec![Ok(ReturnSuccess::Action(CommandAction::EnterHelpShell( Ok(OutputStream::one(ReturnSuccess::action(
UntaggedValue::nothing().into_value(Tag::unknown()), CommandAction::EnterHelpShell(UntaggedValue::nothing().into_value(Tag::unknown())),
)))] )))
.into())
} else if location.is_dir() { } else if location.is_dir() {
Ok(vec![Ok(ReturnSuccess::Action(CommandAction::EnterShell( Ok(OutputStream::one(ReturnSuccess::action(
location_clone, CommandAction::EnterShell(location_clone),
)))] )))
.into())
} else { } else {
let stream = async_stream! {
// If it's a file, attempt to open the file as a value and enter it // If it's a file, attempt to open the file as a value and enter it
let cwd = raw_args.shell_manager.path(); let cwd = shell_manager.path();
let full_path = std::path::PathBuf::from(cwd); let full_path = std::path::PathBuf::from(cwd);
let (file_extension, contents, contents_tag) = let (file_extension, tagged_contents) = crate::commands::open::fetch(
crate::commands::open::fetch(
&full_path, &full_path,
&location_clone, &PathBuf::from(location_clone),
tag_clone.span, tag.span,
).await?; encoding,
)
.await?;
match contents { match tagged_contents.value {
UntaggedValue::Primitive(Primitive::String(_)) => { UntaggedValue::Primitive(Primitive::String(_)) => {
let tagged_contents = contents.into_value(&contents_tag);
if let Some(extension) = file_extension { if let Some(extension) = file_extension {
let command_name = format!("from-{}", extension); let command_name = format!("from {}", extension);
if let Some(converter) = if let Some(converter) = registry.get_command(&command_name) {
registry.get_command(&command_name)
{
let new_args = RawCommandArgs { let new_args = RawCommandArgs {
host: raw_args.host, host,
ctrl_c: raw_args.ctrl_c, ctrl_c,
shell_manager: raw_args.shell_manager, current_errors,
shell_manager,
call_info: UnevaluatedCallInfo { call_info: UnevaluatedCallInfo {
args: nu_parser::hir::Call { args: nu_protocol::hir::Call {
head: raw_args.call_info.args.head, head,
positional: None, positional: None,
named: None, named: None,
span: Span::unknown() span: Span::unknown(),
external_redirection: ExternalRedirection::Stdout,
}, },
source: raw_args.call_info.source, name_tag: tag.clone(),
name_tag: raw_args.call_info.name_tag, scope: scope.clone(),
}, },
}; };
let mut result = converter.run( let tag = tagged_contents.tag.clone();
new_args.with_input(vec![tagged_contents]), let mut result = converter
&registry, .run(new_args.with_input(vec![tagged_contents]), &registry)
); .await?;
let result_vec: Vec<Result<ReturnSuccess, ShellError>> = let result_vec: Vec<Result<ReturnSuccess, ShellError>> =
result.drain_vec().await; result.drain_vec().await;
for res in result_vec { Ok(futures::stream::iter(result_vec.into_iter().map(
match res { move |res| match res {
Ok(ReturnSuccess::Value(Value { Ok(ReturnSuccess::Value(Value { value, .. })) => Ok(
ReturnSuccess::Action(CommandAction::EnterValueShell(Value {
value, value,
.. tag: tag.clone(),
})) => { })),
yield Ok(ReturnSuccess::Action(CommandAction::EnterValueShell(
Value {
value,
tag: contents_tag.clone(),
})));
}
x => yield x,
}
}
} else {
yield Ok(ReturnSuccess::Action(CommandAction::EnterValueShell(tagged_contents)));
}
} else {
yield Ok(ReturnSuccess::Action(CommandAction::EnterValueShell(tagged_contents)));
}
}
_ => {
let tagged_contents = contents.into_value(contents_tag);
yield Ok(ReturnSuccess::Action(CommandAction::EnterValueShell(tagged_contents)));
}
}
};
Ok(stream.to_output_stream())
}
}
x => Ok(
vec![Ok(ReturnSuccess::Action(CommandAction::EnterValueShell(
x.clone(),
)))]
.into(),
), ),
x => x,
},
))
.to_output_stream())
} else {
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
)))
}
} else {
Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
)))
}
}
_ => Ok(OutputStream::one(ReturnSuccess::action(
CommandAction::EnterValueShell(tagged_contents),
))),
} }
} }
} }
#[cfg(test)]
mod tests {
use super::Enter;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Enter {})
}
}
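
When `enter` opens a file, it builds a converter name from the file extension; note the rename from `from-{ext}` to the subcommand form `from {ext}`. A hedged sketch of that lookup with a hypothetical registry map (the real CommandRegistry and converter signatures differ):

use std::collections::HashMap;

// Hypothetical converter: takes raw file contents, returns a description of the parse.
fn from_json_stub(contents: &str) -> String {
    format!("parsed {} bytes as json", contents.len())
}

fn main() {
    let mut registry: HashMap<String, fn(&str) -> String> = HashMap::new();
    registry.insert("from json".to_string(), from_json_stub);

    let extension = "json";
    let command_name = format!("from {}", extension); // space-separated subcommand name

    match registry.get(&command_name) {
        Some(converter) => println!("{}", converter("{\"name\":\"nu\"}")),
        None => println!("no converter registered; entering the raw value shell"),
    }
}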


@ -1,72 +0,0 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::{evaluate, fetch};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::{SpannedItem, Tagged};
use nu_value_ext::ValueExt;
pub struct EvaluateBy;
#[derive(Deserialize)]
pub struct EvaluateByArgs {
evaluate_with: Option<Tagged<String>>,
}
impl WholeStreamCommand for EvaluateBy {
fn name(&self) -> &str {
"evaluate-by"
}
fn signature(&self) -> Signature {
Signature::build("evaluate-by").named(
"evaluate_with",
SyntaxShape::String,
"the name of the column to evaluate by",
Some('w'),
)
}
fn usage(&self) -> &str {
"Creates a new table with the data from the tables rows evaluated by the column given."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, evaluate_by)?.run()
}
}
pub fn evaluate_by(
EvaluateByArgs { evaluate_with }: EvaluateByArgs,
RunnableContext { input, name, .. }: RunnableContext,
) -> Result<OutputStream, ShellError> {
let stream = async_stream! {
let values: Vec<Value> = input.values.collect().await;
if values.is_empty() {
yield Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name
))
} else {
let evaluate_with = if let Some(evaluator) = evaluate_with {
Some(evaluator.item().clone())
} else {
None
};
match evaluate(&values[0], evaluate_with, name) {
Ok(evaluated) => yield ReturnSuccess::value(evaluated),
Err(err) => yield Err(err)
}
}
};
Ok(stream.to_output_stream())
}


@ -0,0 +1,103 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged;
pub struct Every;
#[derive(Deserialize)]
pub struct EveryArgs {
stride: Tagged<u64>,
skip: Tagged<bool>,
}
#[async_trait]
impl WholeStreamCommand for Every {
fn name(&self) -> &str {
"every"
}
fn signature(&self) -> Signature {
Signature::build(self.name())
.required(
"stride",
SyntaxShape::Int,
"how many rows to skip between (and including) each row returned",
)
.switch(
"skip",
"skip the rows that would be returned, instead of selecting them",
Some('s'),
)
}
fn usage(&self) -> &str {
"Show (or skip) every n-th row, starting from the first one."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
every(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Get every second row",
example: "echo [1 2 3 4 5] | every 2",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(3).into(),
UntaggedValue::int(5).into(),
]),
},
Example {
description: "Skip every second row",
example: "echo [1 2 3 4 5] | every 2 --skip",
result: Some(vec![
UntaggedValue::int(2).into(),
UntaggedValue::int(4).into(),
]),
},
]
}
}
async fn every(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let (EveryArgs { stride, skip }, input) = args.process(&registry).await?;
let stride = stride.item;
let skip = skip.item;
Ok(input
.enumerate()
.filter_map(move |(i, value)| async move {
let stride_desired = if stride < 1 { 1 } else { stride } as usize;
let should_include = skip == (i % stride_desired != 0);
if should_include {
Some(ReturnSuccess::value(value))
} else {
None
}
})
.to_output_stream())
}
#[cfg(test)]
mod tests {
use super::Every;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Every {})
}
}
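
The filter in `every` keeps row i when `skip == (i % stride != 0)`: without --skip that selects indices 0, stride, 2*stride, and so on; with --skip it selects the complement. A small standalone check of that predicate against the examples above:

fn should_include(skip: bool, index: usize, stride: usize) -> bool {
    let stride = stride.max(1); // mirrors the `stride < 1` guard
    // no --skip: keep rows where index % stride == 0; --skip: keep the rest
    skip == (index % stride != 0)
}

fn main() {
    let rows = [1, 2, 3, 4, 5];

    let kept: Vec<i32> = rows
        .iter()
        .enumerate()
        .filter(|(i, _)| should_include(false, *i, 2))
        .map(|(_, v)| *v)
        .collect();
    assert_eq!(kept, vec![1, 3, 5]); // matches `echo [1 2 3 4 5] | every 2`

    let skipped: Vec<i32> = rows
        .iter()
        .enumerate()
        .filter(|(i, _)| should_include(true, *i, 2))
        .map(|(_, v)| *v)
        .collect();
    assert_eq!(skipped, vec![2, 4]); // matches `echo [1 2 3 4 5] | every 2 --skip`
}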


@ -6,6 +6,7 @@ use nu_protocol::{CommandAction, ReturnSuccess, Signature};
pub struct Exit; pub struct Exit;
#[async_trait]
impl WholeStreamCommand for Exit { impl WholeStreamCommand for Exit {
fn name(&self) -> &str { fn name(&self) -> &str {
"exit" "exit"
@ -19,21 +20,54 @@ impl WholeStreamCommand for Exit {
"Exit the current shell (or all shells)" "Exit the current shell (or all shells)"
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
exit(args, registry) exit(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Exit the current shell",
example: "exit",
result: None,
},
Example {
description: "Exit all shells (exiting Nu)",
example: "exit --now",
result: None,
},
]
} }
} }
pub fn exit(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> { pub async fn exit(
let args = args.evaluate_once(registry)?; args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
if args.call_info.args.has("now") { let command_action = if args.call_info.args.has("now") {
Ok(vec![Ok(ReturnSuccess::Action(CommandAction::Exit))].into()) CommandAction::Exit
} else { } else {
Ok(vec![Ok(ReturnSuccess::Action(CommandAction::LeaveShell))].into()) CommandAction::LeaveShell
};
Ok(OutputStream::one(ReturnSuccess::action(command_action)))
}
#[cfg(test)]
mod tests {
use super::Exit;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Exit {})
} }
} }


@ -2,7 +2,7 @@ use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{Signature, SyntaxShape}; use nu_protocol::{Signature, SyntaxShape, UntaggedValue};
use nu_source::Tagged; use nu_source::Tagged;
pub struct First; pub struct First;
@ -12,6 +12,7 @@ pub struct FirstArgs {
rows: Option<Tagged<usize>>, rows: Option<Tagged<usize>>,
} }
#[async_trait]
impl WholeStreamCommand for First { impl WholeStreamCommand for First {
fn name(&self) -> &str { fn name(&self) -> &str {
"first" "first"
@ -29,26 +30,53 @@ impl WholeStreamCommand for First {
"Show only the first number of rows." "Show only the first number of rows."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, first)?.run() first(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Return the first item of a list/table",
example: "echo [1 2 3] | first",
result: Some(vec![UntaggedValue::int(1).into()]),
},
Example {
description: "Return the first 2 items of a list/table",
example: "echo [1 2 3] | first 2",
result: Some(vec![
UntaggedValue::int(1).into(),
UntaggedValue::int(2).into(),
]),
},
]
} }
} }
fn first( async fn first(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
FirstArgs { rows }: FirstArgs, let registry = registry.clone();
context: RunnableContext, let (FirstArgs { rows }, input) = args.process(&registry).await?;
) -> Result<OutputStream, ShellError> {
let rows_desired = if let Some(quantity) = rows { let rows_desired = if let Some(quantity) = rows {
*quantity *quantity
} else { } else {
1 1
}; };
Ok(OutputStream::from_input( Ok(input.take(rows_desired).to_output_stream())
context.input.values.take(rows_desired), }
))
#[cfg(test)]
mod tests {
use super::First;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(First {})
}
} }


@ -1,22 +1,21 @@
use crate::commands::PerItemCommand; use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::evaluate::evaluate_baseline_expr;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{ use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue};
CallInfo, ColumnPath, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged; use nu_source::Tagged;
use nu_value_ext::{as_column_path, get_data_by_column_path};
use std::borrow::Borrow; use std::borrow::Borrow;
use nom::{
bytes::complete::{tag, take_while},
IResult,
};
pub struct Format; pub struct Format;
impl PerItemCommand for Format { #[derive(Deserialize)]
pub struct FormatArgs {
pattern: Tagged<String>,
}
#[async_trait]
impl WholeStreamCommand for Format {
fn name(&self) -> &str { fn name(&self) -> &str {
"format" "format"
} }
@ -24,7 +23,7 @@ impl PerItemCommand for Format {
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("format").required( Signature::build("format").required(
"pattern", "pattern",
SyntaxShape::Any, SyntaxShape::String,
"the pattern to output. Eg) \"{foo}: {bar}\"", "the pattern to output. Eg) \"{foo}: {bar}\"",
) )
} }
@ -33,70 +32,78 @@ impl PerItemCommand for Format {
"Format columns into a string using a simple pattern." "Format columns into a string using a simple pattern."
} }
fn run( async fn run(
&self, &self,
call_info: &CallInfo, args: CommandArgs,
_registry: &CommandRegistry, registry: &CommandRegistry,
_raw_args: &RawCommandArgs,
value: Value,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
//let value_tag = value.tag(); format_command(args, registry).await
let pattern = call_info.args.expect_nth(0)?; }
let pattern_tag = pattern.tag.clone();
let pattern = pattern.as_string()?;
let format_pattern = format(&pattern).map_err(|_| { fn examples(&self) -> Vec<Example> {
ShellError::labeled_error( vec![Example {
"Could not create format pattern", description: "Print filenames with their sizes",
"could not create format pattern", example: "ls | format '{name}: {size}'",
&pattern_tag, result: None,
) }]
})?; }
let commands = format_pattern.1; }
let output = match value { async fn format_command(
value args: CommandArgs,
@ registry: &CommandRegistry,
Value { ) -> Result<OutputStream, ShellError> {
value: UntaggedValue::Row(_), let registry = Arc::new(registry.clone());
.. let scope = Arc::new(args.call_info.scope.clone());
} => { let (FormatArgs { pattern }, input) = args.process(&registry).await?;
let format_pattern = format(&pattern);
let commands = Arc::new(format_pattern);
Ok(input
.then(move |value| {
let mut output = String::new(); let mut output = String::new();
let commands = commands.clone();
let registry = registry.clone();
let scope = scope.clone();
for command in &commands { async move {
for command in &*commands {
match command { match command {
FormatCommand::Text(s) => { FormatCommand::Text(s) => {
output.push_str(s); output.push_str(&s);
} }
FormatCommand::Column(c) => { FormatCommand::Column(c) => {
let key = to_column_path(&c, &pattern_tag)?; // FIXME: use the correct spans
let full_column_path = nu_parser::parse_full_column_path(
let fetcher = get_data_by_column_path( &(c.to_string()).spanned(Span::unknown()),
&value, &*registry,
&key,
Box::new(move |(_, _, error)| error),
); );
if let Ok(c) = fetcher { let result = evaluate_baseline_expr(
&full_column_path.0,
&registry,
&value,
&scope.vars,
&scope.env,
)
.await;
if let Ok(c) = result {
output output
.push_str(&value::format_leaf(c.borrow()).plain_string(100_000)) .push_str(&value::format_leaf(c.borrow()).plain_string(100_000))
} } else {
// That column doesn't match, so don't emit anything // That column doesn't match, so don't emit anything
} }
} }
} }
output
} }
_ => String::new(),
};
Ok(futures::stream::iter(vec![ReturnSuccess::value( ReturnSuccess::value(UntaggedValue::string(output).into_untagged_value())
UntaggedValue::string(output).into_untagged_value(), }
)]) })
.to_output_stream()) .to_output_stream())
} }
}
#[derive(Debug)] #[derive(Debug)]
enum FormatCommand { enum FormatCommand {
@ -104,54 +111,53 @@ enum FormatCommand {
Column(String), Column(String),
} }
fn format(input: &str) -> IResult<&str, Vec<FormatCommand>> { fn format(input: &str) -> Vec<FormatCommand> {
let mut output = vec![]; let mut output = vec![];
let mut loop_input = input; let mut loop_input = input.chars();
loop { loop {
let (input, before) = take_while(|c| c != '{')(loop_input)?; let mut before = String::new();
while let Some(c) = loop_input.next() {
if c == '{' {
break;
}
before.push(c);
}
if !before.is_empty() { if !before.is_empty() {
output.push(FormatCommand::Text(before.to_string())); output.push(FormatCommand::Text(before.to_string()));
} }
if input != "" {
// Look for column as we're now at one // Look for column as we're now at one
let (input, _) = tag("{")(input)?; let mut column = String::new();
let (input, column) = take_while(|c| c != '}')(input)?;
let (input, _) = tag("}")(input)?;
output.push(FormatCommand::Column(column.to_string())); while let Some(c) = loop_input.next() {
loop_input = input; if c == '}' {
} else { break;
loop_input = input;
} }
if loop_input == "" { column.push(c);
}
if !column.is_empty() {
output.push(FormatCommand::Column(column.to_string()));
}
if before.is_empty() && column.is_empty() {
break; break;
} }
} }
Ok((loop_input, output)) output
} }
fn to_column_path( #[cfg(test)]
path_members: &str, mod tests {
tag: impl Into<Tag>, use super::Format;
) -> Result<Tagged<ColumnPath>, ShellError> {
let tag = tag.into();
as_column_path( #[test]
&UntaggedValue::Table( fn examples_work_as_expected() {
path_members use crate::examples::test as test_examples;
.split('.')
.map(|x| {
let member = match x.parse::<u64>() {
Ok(v) => UntaggedValue::int(v),
Err(_) => UntaggedValue::string(x),
};
member.into_value(&tag) test_examples(Format {})
}) }
.collect(),
)
.into_value(&tag),
)
} }
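
The rewritten format() drops the nom combinators for a plain character loop: read text up to '{', then a column name up to '}', and stop once a pass yields neither. A self-contained sketch of that splitter with a simplified output type (Piece stands in for FormatCommand):

#[derive(Debug, PartialEq)]
enum Piece {
    Text(String),
    Column(String),
}

// Same loop shape as format() above: text up to '{', then a column up to '}',
// repeated until a pass produces neither.
fn split_pattern(input: &str) -> Vec<Piece> {
    let mut output = Vec::new();
    let mut chars = input.chars();

    loop {
        let mut text = String::new();
        while let Some(c) = chars.next() {
            if c == '{' {
                break;
            }
            text.push(c);
        }
        if !text.is_empty() {
            output.push(Piece::Text(text.clone()));
        }

        let mut column = String::new();
        while let Some(c) = chars.next() {
            if c == '}' {
                break;
            }
            column.push(c);
        }
        if !column.is_empty() {
            output.push(Piece::Column(column.clone()));
        }

        if text.is_empty() && column.is_empty() {
            break;
        }
    }

    output
}

fn main() {
    assert_eq!(
        split_pattern("{name}: {size}"),
        vec![
            Piece::Column("name".to_string()),
            Piece::Text(": ".to_string()),
            Piece::Column("size".to_string()),
        ]
    );
}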


@ -0,0 +1,45 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue};
pub struct From;
#[async_trait]
impl WholeStreamCommand for From {
fn name(&self) -> &str {
"from"
}
fn signature(&self) -> Signature {
Signature::build("from")
}
fn usage(&self) -> &str {
"Parse content (string or binary) as a table (input format based on subcommand, like csv, ini, json, toml)"
}
async fn run(
&self,
_args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
Ok(OutputStream::one(ReturnSuccess::value(
UntaggedValue::string(crate::commands::help::get_help(&From, &registry))
.into_value(Tag::unknown()),
)))
}
}
#[cfg(test)]
mod tests {
use super::From;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(From {})
}
}


@ -12,13 +12,14 @@ pub struct FromCSVArgs {
separator: Option<Value>, separator: Option<Value>,
} }
#[async_trait]
impl WholeStreamCommand for FromCSV { impl WholeStreamCommand for FromCSV {
fn name(&self) -> &str { fn name(&self) -> &str {
"from-csv" "from csv"
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("from-csv") Signature::build("from csv")
.named( .named(
"separator", "separator",
SyntaxShape::String, SyntaxShape::String,
@ -36,22 +37,49 @@ impl WholeStreamCommand for FromCSV {
"Parse text as .csv and create table." "Parse text as .csv and create table."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, from_csv)?.run() from_csv(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![
Example {
description: "Convert comma-separated data to a table",
example: "open data.txt | from csv",
result: None,
},
Example {
description: "Convert comma-separated data to a table, ignoring headers",
example: "open data.txt | from csv --headerless",
result: None,
},
Example {
description: "Convert semicolon-separated data to a table",
example: "open data.txt | from csv --separator ';'",
result: None,
},
]
} }
} }
fn from_csv( async fn from_csv(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (
FromCSVArgs { FromCSVArgs {
headerless, headerless,
separator, separator,
}: FromCSVArgs, },
runnable_context: RunnableContext, input,
) -> Result<OutputStream, ShellError> { ) = args.process(&registry).await?;
let sep = match separator { let sep = match separator {
Some(Value { Some(Value {
value: UntaggedValue::Primitive(Primitive::String(s)), value: UntaggedValue::Primitive(Primitive::String(s)),
@ -75,5 +103,17 @@ fn from_csv(
_ => ',', _ => ',',
}; };
from_delimited_data(headerless, sep, "CSV", runnable_context) from_delimited_data(headerless, sep, "CSV", input, name).await
}
#[cfg(test)]
mod tests {
use super::FromCSV;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromCSV {})
}
} }


@ -1,7 +1,7 @@
use crate::prelude::*; use crate::prelude::*;
use csv::ReaderBuilder; use csv::{ErrorKind, ReaderBuilder};
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, TaggedDictBuilder, UntaggedValue, Value}; use nu_protocol::{TaggedDictBuilder, UntaggedValue, Value};
fn from_delimited_string_to_value( fn from_delimited_string_to_value(
s: String, s: String,
@ -27,10 +27,13 @@ fn from_delimited_string_to_value(
for row in reader.records() { for row in reader.records() {
let mut tagged_row = TaggedDictBuilder::new(&tag); let mut tagged_row = TaggedDictBuilder::new(&tag);
for (value, header) in row?.iter().zip(headers.iter()) { for (value, header) in row?.iter().zip(headers.iter()) {
tagged_row.insert_value( if let Ok(i) = value.parse::<i64>() {
header, tagged_row.insert_value(header, UntaggedValue::int(i).into_value(&tag))
UntaggedValue::Primitive(Primitive::String(String::from(value))).into_value(&tag), } else if let Ok(f) = value.parse::<f64>() {
) tagged_row.insert_value(header, UntaggedValue::decimal(f).into_value(&tag))
} else {
tagged_row.insert_value(header, UntaggedValue::string(value).into_value(&tag))
}
} }
rows.push(tagged_row.into_value()); rows.push(tagged_row.into_value());
} }
@ -38,39 +41,61 @@ fn from_delimited_string_to_value(
Ok(UntaggedValue::Table(rows).into_value(&tag)) Ok(UntaggedValue::Table(rows).into_value(&tag))
} }
pub fn from_delimited_data( pub async fn from_delimited_data(
headerless: bool, headerless: bool,
sep: char, sep: char,
format_name: &'static str, format_name: &'static str,
RunnableContext { input, name, .. }: RunnableContext, input: InputStream,
name: Tag,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
let name_tag = name; let name_tag = name;
let stream = async_stream! {
let concat_string = input.collect_string(name_tag.clone()).await?; let concat_string = input.collect_string(name_tag.clone()).await?;
match from_delimited_string_to_value(concat_string.item, headerless, sep, name_tag.clone()) { match from_delimited_string_to_value(concat_string.item, headerless, sep, name_tag.clone()) {
Ok(x) => match x { Ok(x) => match x {
Value { value: UntaggedValue::Table(list), .. } => { Value {
for l in list { value: UntaggedValue::Table(list),
yield ReturnSuccess::value(l); ..
} } => Ok(futures::stream::iter(list).to_output_stream()),
} x => Ok(OutputStream::one(x)),
x => yield ReturnSuccess::value(x),
}, },
Err(_) => { Err(err) => {
let line_one = format!("Could not parse as {}", format_name); let line_one = match pretty_csv_error(err) {
Some(pretty) => format!("Could not parse as {} ({})", format_name, pretty),
None => format!("Could not parse as {}", format_name),
};
let line_two = format!("input cannot be parsed as {}", format_name); let line_two = format!("input cannot be parsed as {}", format_name);
yield Err(ShellError::labeled_error_with_secondary(
Err(ShellError::labeled_error_with_secondary(
line_one, line_one,
line_two, line_two,
name_tag.clone(), name_tag.clone(),
"value originates from here", "value originates from here",
concat_string.tag, concat_string.tag,
)) ))
} ,
} }
}; }
}
Ok(stream.to_output_stream()) fn pretty_csv_error(err: csv::Error) -> Option<String> {
match err.kind() {
ErrorKind::UnequalLengths {
pos,
expected_len,
len,
} => {
if let Some(pos) = pos {
Some(format!(
"Line {}: expected {} fields, found {}",
pos.line(),
expected_len,
len
))
} else {
Some(format!("Expected {} fields, found {}", expected_len, len))
}
}
ErrorKind::Seek => Some("Internal error while parsing csv".to_string()),
_ => None,
}
} }
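
from_delimited_string_to_value now types each cell instead of always storing strings: try i64, then f64, then fall back to a string. The rule in isolation, with a simplified Cell enum standing in for UntaggedValue:

#[derive(Debug, PartialEq)]
enum Cell {
    Int(i64),
    Decimal(f64),
    Text(String),
}

// Mirrors the new cell typing above: try i64, then f64, otherwise keep the string.
fn infer(value: &str) -> Cell {
    if let Ok(i) = value.parse::<i64>() {
        Cell::Int(i)
    } else if let Ok(f) = value.parse::<f64>() {
        Cell::Decimal(f)
    } else {
        Cell::Text(value.to_string())
    }
}

fn main() {
    assert_eq!(infer("42"), Cell::Int(42));
    assert_eq!(infer("3.14"), Cell::Decimal(3.14));
    assert_eq!(infer("n/a"), Cell::Text("n/a".to_string()));
}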


@ -0,0 +1,140 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ::eml_parser::eml::*;
use ::eml_parser::EmlParser;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue};
use nu_source::Tagged;
pub struct FromEML;
const DEFAULT_BODY_PREVIEW: usize = 50;
#[derive(Deserialize, Clone)]
pub struct FromEMLArgs {
#[serde(rename(deserialize = "preview-body"))]
preview_body: Option<Tagged<usize>>,
}
#[async_trait]
impl WholeStreamCommand for FromEML {
fn name(&self) -> &str {
"from eml"
}
fn signature(&self) -> Signature {
Signature::build("from eml").named(
"preview-body",
SyntaxShape::Int,
"How many bytes of the body to preview",
Some('b'),
)
}
fn usage(&self) -> &str {
"Parse text as .eml and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_eml(args, registry).await
}
}
fn emailaddress_to_value(tag: &Tag, email_address: &EmailAddress) -> TaggedDictBuilder {
let mut dict = TaggedDictBuilder::with_capacity(tag, 2);
let (n, a) = match email_address {
EmailAddress::AddressOnly { address } => {
(UntaggedValue::nothing(), UntaggedValue::string(address))
}
EmailAddress::NameAndEmailAddress { name, address } => {
(UntaggedValue::string(name), UntaggedValue::string(address))
}
};
dict.insert_untagged("Name", n);
dict.insert_untagged("Address", a);
dict
}
fn headerfieldvalue_to_value(tag: &Tag, value: &HeaderFieldValue) -> UntaggedValue {
use HeaderFieldValue::*;
match value {
SingleEmailAddress(address) => emailaddress_to_value(tag, address).into_untagged_value(),
MultipleEmailAddresses(addresses) => UntaggedValue::Table(
addresses
.iter()
.map(|a| emailaddress_to_value(tag, a).into_value())
.collect(),
),
Unstructured(s) => UntaggedValue::string(s),
Empty => UntaggedValue::nothing(),
}
}
async fn from_eml(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (eml_args, input): (FromEMLArgs, _) = args.process(&registry).await?;
let value = input.collect_string(tag.clone()).await?;
let body_preview = eml_args
.preview_body
.map(|b| b.item)
.unwrap_or(DEFAULT_BODY_PREVIEW);
let eml = EmlParser::from_string(value.item)
.with_body_preview(body_preview)
.parse()
.map_err(|_| {
ShellError::labeled_error(
"Could not parse .eml file",
"could not parse .eml file",
&tag,
)
})?;
let mut dict = TaggedDictBuilder::new(&tag);
if let Some(subj) = eml.subject {
dict.insert_untagged("Subject", UntaggedValue::string(subj));
}
if let Some(from) = eml.from {
dict.insert_untagged("From", headerfieldvalue_to_value(&tag, &from));
}
if let Some(to) = eml.to {
dict.insert_untagged("To", headerfieldvalue_to_value(&tag, &to));
}
for HeaderField { name, value } in eml.headers.iter() {
dict.insert_untagged(name, headerfieldvalue_to_value(&tag, &value));
}
if let Some(body) = eml.body {
dict.insert_untagged("Body", UntaggedValue::string(body));
}
Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
}
#[cfg(test)]
mod tests {
use super::FromEML;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromEML {})
}
}
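
emailaddress_to_value above flattens both address shapes into a two-column Name/Address row, storing nothing when the name is absent. A std-only sketch with simplified stand-ins for eml_parser's EmailAddress and the row type:

enum EmailAddress {
    AddressOnly { address: String },
    NameAndEmailAddress { name: String, address: String },
}

// (Name, Address) pair; None plays the role of UntaggedValue::nothing().
fn to_row(addr: &EmailAddress) -> (Option<String>, String) {
    match addr {
        EmailAddress::AddressOnly { address } => (None, address.clone()),
        EmailAddress::NameAndEmailAddress { name, address } => {
            (Some(name.clone()), address.clone())
        }
    }
}

fn main() {
    let plain = EmailAddress::AddressOnly { address: "nu@example.com".to_string() };
    let named = EmailAddress::NameAndEmailAddress {
        name: "Nu Shell".to_string(),
        address: "nu@example.com".to_string(),
    };
    println!("{:?}", to_row(&plain)); // (None, "nu@example.com")
    println!("{:?}", to_row(&named)); // (Some("Nu Shell"), "nu@example.com")
}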


@ -0,0 +1,259 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::ical::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::io::BufReader;
pub struct FromIcs;
#[async_trait]
impl WholeStreamCommand for FromIcs {
fn name(&self) -> &str {
"from ics"
}
fn signature(&self) -> Signature {
Signature::build("from ics")
}
fn usage(&self) -> &str {
"Parse text as .ics and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ics(args, registry).await
}
}
async fn from_ics(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.as_bytes();
let buf_reader = BufReader::new(input_bytes);
let parser = ical::IcalParser::new(buf_reader);
// TODO: it should be possible to make this a stream, but some of the lifetime requirements make this tricky.

// Pre-computing for now
let mut output = vec![];
for calendar in parser {
match calendar {
Ok(c) => output.push(ReturnSuccess::value(calendar_to_value(c, tag.clone()))),
Err(_) => output.push(Err(ShellError::labeled_error(
"Could not parse as .ics",
"input cannot be parsed as .ics",
tag.clone(),
))),
}
}
Ok(futures::stream::iter(output).to_output_stream())
}
fn calendar_to_value(calendar: IcalCalendar, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(calendar.properties, tag.clone()),
);
row.insert_untagged("events", events_to_value(calendar.events, tag.clone()));
row.insert_untagged("alarms", alarms_to_value(calendar.alarms, tag.clone()));
row.insert_untagged("to-Dos", todos_to_value(calendar.todos, tag.clone()));
row.insert_untagged(
"journals",
journals_to_value(calendar.journals, tag.clone()),
);
row.insert_untagged(
"free-busys",
free_busys_to_value(calendar.free_busys, tag.clone()),
);
row.insert_untagged("timezones", timezones_to_value(calendar.timezones, tag));
row.into_value()
}
fn events_to_value(events: Vec<IcalEvent>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&events
.into_iter()
.map(|event| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(event.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(event.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn alarms_to_value(alarms: Vec<IcalAlarm>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&alarms
.into_iter()
.map(|alarm| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(alarm.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn todos_to_value(todos: Vec<IcalTodo>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&todos
.into_iter()
.map(|todo| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(todo.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(todo.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn journals_to_value(journals: Vec<IcalJournal>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&journals
.into_iter()
.map(|journal| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(journal.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn free_busys_to_value(free_busys: Vec<IcalFreeBusy>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&free_busys
.into_iter()
.map(|free_busy| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(free_busy.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezones_to_value(timezones: Vec<IcalTimeZone>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&timezones
.into_iter()
.map(|timezone| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(timezone.properties, tag.clone()),
);
row.insert_untagged(
"transitions",
timezone_transitions_to_value(timezone.transitions, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezone_transitions_to_value(
transitions: Vec<IcalTimeZoneTransition>,
tag: Tag,
) -> UntaggedValue {
UntaggedValue::table(
&transitions
.into_iter()
.map(|transition| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(transition.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}
#[cfg(test)]
mod tests {
use super::FromIcs;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromIcs {})
}
}
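
Every component converter above ultimately leans on properties_to_value: each ical Property becomes a name/value/params row, with absent pieces rendered as nothing. A simplified, std-only sketch of that mapping (tuple rows stand in for TaggedDictBuilder):

// Simplified stand-in for ical::property::Property.
struct Property {
    name: String,
    value: Option<String>,
    params: Option<Vec<(String, Vec<String>)>>,
}

// Mirrors properties_to_value: one (name, value, params) row per property,
// with missing pieces rendered as "nothing" (Primitive::Nothing in the real code).
fn properties_to_rows(properties: Vec<Property>) -> Vec<(String, String, String)> {
    properties
        .into_iter()
        .map(|prop| {
            let value = prop.value.unwrap_or_else(|| "nothing".to_string());
            let params = prop
                .params
                .map(|list| format!("{:?}", list))
                .unwrap_or_else(|| "nothing".to_string());
            (prop.name, value, params)
        })
        .collect()
}

fn main() {
    let rows = properties_to_rows(vec![Property {
        name: "SUMMARY".to_string(),
        value: Some("Team sync".to_string()),
        params: None,
    }]);
    assert_eq!(
        rows[0],
        (
            "SUMMARY".to_string(),
            "Team sync".to_string(),
            "nothing".to_string()
        )
    );
}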


@ -1,30 +1,31 @@
use crate::commands::WholeStreamCommand; use crate::commands::WholeStreamCommand;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value}; use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::collections::HashMap; use std::collections::HashMap;
pub struct FromINI; pub struct FromINI;
#[async_trait]
impl WholeStreamCommand for FromINI { impl WholeStreamCommand for FromINI {
fn name(&self) -> &str { fn name(&self) -> &str {
"from-ini" "from ini"
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("from-ini") Signature::build("from ini")
} }
fn usage(&self) -> &str { fn usage(&self) -> &str {
"Parse text as .ini and create table" "Parse text as .ini and create table"
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
from_ini(args, registry) from_ini(args, registry).await
} }
} }
@ -63,34 +64,42 @@ pub fn from_ini_string_to_value(
Ok(convert_ini_top_to_nu_value(&v, tag)) Ok(convert_ini_top_to_nu_value(&v, tag))
} }
fn from_ini(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> { async fn from_ini(
let args = args.evaluate_once(registry)?; args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag(); let tag = args.name_tag();
let input = args.input; let input = args.input;
let stream = async_stream! {
let concat_string = input.collect_string(tag.clone()).await?; let concat_string = input.collect_string(tag.clone()).await?;
match from_ini_string_to_value(concat_string.item, tag.clone()) { match from_ini_string_to_value(concat_string.item, tag.clone()) {
Ok(x) => match x { Ok(x) => match x {
Value { value: UntaggedValue::Table(list), .. } => { Value {
for l in list { value: UntaggedValue::Table(list),
yield ReturnSuccess::value(l); ..
} } => Ok(futures::stream::iter(list).to_output_stream()),
} x => Ok(OutputStream::one(x)),
x => yield ReturnSuccess::value(x),
}, },
Err(_) => { Err(_) => Err(ShellError::labeled_error_with_secondary(
yield Err(ShellError::labeled_error_with_secondary(
"Could not parse as INI", "Could not parse as INI",
"input cannot be parsed as INI", "input cannot be parsed as INI",
&tag, &tag,
"value originates from here", "value originates from here",
concat_string.tag, concat_string.tag,
)) )),
} }
} }
};
Ok(stream.to_output_stream()) #[cfg(test)]
mod tests {
use super::FromINI;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromINI {})
}
} }


@ -10,13 +10,14 @@ pub struct FromJSONArgs {
objects: bool, objects: bool,
} }
#[async_trait]
impl WholeStreamCommand for FromJSON { impl WholeStreamCommand for FromJSON {
fn name(&self) -> &str { fn name(&self) -> &str {
"from-json" "from json"
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("from-json").switch( Signature::build("from json").switch(
"objects", "objects",
"treat each line as a separate value", "treat each line as a separate value",
Some('o'), Some('o'),
@ -27,12 +28,12 @@ impl WholeStreamCommand for FromJSON {
"Parse text as .json and create table." "Parse text as .json and create table."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, from_json)?.run() from_json(args, registry).await
} }
} }
@ -70,64 +71,83 @@ pub fn from_json_string_to_value(s: String, tag: impl Into<Tag>) -> serde_hjson:
Ok(convert_json_value_to_nu_value(&v, tag)) Ok(convert_json_value_to_nu_value(&v, tag))
} }
fn from_json( async fn from_json(
FromJSONArgs { objects }: FromJSONArgs, args: CommandArgs,
RunnableContext { input, name, .. }: RunnableContext, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
let name_tag = name; let name_tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let stream = async_stream! { let (FromJSONArgs { objects }, input) = args.process(&registry).await?;
let concat_string = input.collect_string(name_tag.clone()).await?; let concat_string = input.collect_string(name_tag.clone()).await?;
let string_clone: Vec<_> = concat_string.item.lines().map(|x| x.to_string()).collect();
if objects { if objects {
for json_str in concat_string.item.lines() { Ok(
futures::stream::iter(string_clone.into_iter().filter_map(move |json_str| {
if json_str.is_empty() { if json_str.is_empty() {
continue; return None;
} }
match from_json_string_to_value(json_str.to_string(), &name_tag) { match from_json_string_to_value(json_str, &name_tag) {
Ok(x) => Ok(x) => Some(ReturnSuccess::value(x)),
yield ReturnSuccess::value(x),
Err(e) => { Err(e) => {
let mut message = "Could not parse as JSON (".to_string(); let mut message = "Could not parse as JSON (".to_string();
message.push_str(&e.to_string()); message.push_str(&e.to_string());
message.push_str(")"); message.push_str(")");
yield Err(ShellError::labeled_error_with_secondary( Some(Err(ShellError::labeled_error_with_secondary(
message, message,
"input cannot be parsed as JSON", "input cannot be parsed as JSON",
&name_tag, name_tag.clone(),
"value originates from here", "value originates from here",
concat_string.tag.clone())) concat_string.tag.clone(),
} )))
} }
} }
}))
.to_output_stream(),
)
} else { } else {
match from_json_string_to_value(concat_string.item, name_tag.clone()) { match from_json_string_to_value(concat_string.item, name_tag.clone()) {
Ok(x) => Ok(x) => match x {
match x { Value {
Value { value: UntaggedValue::Table(list), .. } => { value: UntaggedValue::Table(list),
for l in list { ..
yield ReturnSuccess::value(l); } => Ok(
} futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
} .to_output_stream(),
x => yield ReturnSuccess::value(x), ),
} x => Ok(OutputStream::one(ReturnSuccess::value(x))),
},
Err(e) => { Err(e) => {
let mut message = "Could not parse as JSON (".to_string(); let mut message = "Could not parse as JSON (".to_string();
message.push_str(&e.to_string()); message.push_str(&e.to_string());
message.push_str(")"); message.push_str(")");
yield Err(ShellError::labeled_error_with_secondary( Ok(OutputStream::one(Err(
ShellError::labeled_error_with_secondary(
message, message,
"input cannot be parsed as JSON", "input cannot be parsed as JSON",
name_tag, name_tag,
"value originates from here", "value originates from here",
concat_string.tag)) concat_string.tag,
),
)))
}
} }
} }
} }
};
Ok(stream.to_output_stream()) #[cfg(test)]
mod tests {
use super::FromJSON;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromJSON {})
}
} }
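
With --objects, the command collects its input, then parses each non-empty line as its own JSON document, so per-line failures become individual errors instead of aborting the whole stream. A sketch of that line-by-line shape, with parse_line as a stub standing in for from_json_string_to_value:

// Stub in place of from_json_string_to_value: succeeds only on brace-wrapped lines.
fn parse_line(line: &str) -> Result<String, String> {
    if line.starts_with('{') && line.ends_with('}') {
        Ok(format!("row parsed from {}", line))
    } else {
        Err(format!("Could not parse as JSON ({})", line))
    }
}

fn main() {
    let input = "{\"a\":1}\n\n{\"b\":2}\nnot json";

    // Same shape as the --objects branch: split into lines, skip blanks,
    // and parse each remaining line on its own.
    let results: Vec<Result<String, String>> = input
        .lines()
        .filter(|line| !line.is_empty())
        .map(parse_line)
        .collect();

    for result in &results {
        println!("{:?}", result);
    }
    assert_eq!(results.iter().filter(|r| r.is_ok()).count(), 2);
}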


@ -13,13 +13,14 @@ pub struct FromODSArgs {
headerless: bool, headerless: bool,
} }
#[async_trait]
impl WholeStreamCommand for FromODS { impl WholeStreamCommand for FromODS {
fn name(&self) -> &str { fn name(&self) -> &str {
"from-ods" "from ods"
} }
fn signature(&self) -> Signature { fn signature(&self) -> Signature {
Signature::build("from-ods").switch( Signature::build("from ods").switch(
"headerless", "headerless",
"don't treat the first row as column names", "don't treat the first row as column names",
None, None,
@ -30,31 +31,33 @@ impl WholeStreamCommand for FromODS {
"Parse OpenDocument Spreadsheet(.ods) data and create table." "Parse OpenDocument Spreadsheet(.ods) data and create table."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, from_ods)?.run() from_ods(args, registry).await
} }
} }
fn from_ods( async fn from_ods(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let tag = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
FromODSArgs { FromODSArgs {
headerless: _headerless, headerless: _headerless,
}: FromODSArgs, },
runnable_context: RunnableContext, input,
) -> Result<OutputStream, ShellError> { ) = args.process(&registry).await?;
let input = runnable_context.input;
let tag = runnable_context.name;
let stream = async_stream! {
let bytes = input.collect_binary(tag.clone()).await?; let bytes = input.collect_binary(tag.clone()).await?;
let mut buf: Cursor<Vec<u8>> = Cursor::new(bytes.item); let buf: Cursor<Vec<u8>> = Cursor::new(bytes.item);
let mut ods = Ods::<_>::new(buf).map_err(|_| ShellError::labeled_error( let mut ods = Ods::<_>::new(buf).map_err(|_| {
"Could not load ods file", ShellError::labeled_error("Could not load ods file", "could not load ods file", &tag)
"could not load ods file", })?;
&tag))?;
let mut dict = TaggedDictBuilder::new(&tag); let mut dict = TaggedDictBuilder::new(&tag);
@ -84,15 +87,25 @@ fn from_ods(
dict.insert_untagged(sheet_name, sheet_output.into_untagged_value()); dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
} else { } else {
yield Err(ShellError::labeled_error( return Err(ShellError::labeled_error(
"Could not load sheet", "Could not load sheet",
"could not load sheet", "could not load sheet",
&tag)); &tag,
));
} }
} }
yield ReturnSuccess::value(dict.into_value()); Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
}; }
Ok(stream.to_output_stream()) #[cfg(test)]
mod tests {
use super::FromODS;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromODS {})
}
} }


@ -17,9 +17,10 @@ pub struct FromSSVArgs {
minimum_spaces: Option<Tagged<usize>>, minimum_spaces: Option<Tagged<usize>>,
} }
const STRING_REPRESENTATION: &str = "from-ssv"; const STRING_REPRESENTATION: &str = "from ssv";
const DEFAULT_MINIMUM_SPACES: usize = 2; const DEFAULT_MINIMUM_SPACES: usize = 2;
#[async_trait]
impl WholeStreamCommand for FromSSV { impl WholeStreamCommand for FromSSV {
fn name(&self) -> &str { fn name(&self) -> &str {
STRING_REPRESENTATION STRING_REPRESENTATION
@ -45,12 +46,12 @@ impl WholeStreamCommand for FromSSV {
"Parse text as space-separated values and create a table. The default minimum number of spaces counted as a separator is 2." "Parse text as space-separated values and create a table. The default minimum number of spaces counted as a separator is 2."
} }
fn run( async fn run(
&self, &self,
args: CommandArgs, args: CommandArgs,
registry: &CommandRegistry, registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> { ) -> Result<OutputStream, ShellError> {
args.process(registry, from_ssv)?.run() from_ssv(args, registry).await
} }
} }
@ -250,41 +251,53 @@ fn from_ssv_string_to_value(
Some(UntaggedValue::Table(rows).into_value(&tag)) Some(UntaggedValue::Table(rows).into_value(&tag))
} }
fn from_ssv( async fn from_ssv(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let name = args.call_info.name_tag.clone();
let registry = registry.clone();
let (
FromSSVArgs { FromSSVArgs {
headerless, headerless,
aligned_columns, aligned_columns,
minimum_spaces, minimum_spaces,
}: FromSSVArgs, },
RunnableContext { input, name, .. }: RunnableContext, input,
) -> Result<OutputStream, ShellError> { ) = args.process(&registry).await?;
let stream = async_stream! {
let concat_string = input.collect_string(name.clone()).await?; let concat_string = input.collect_string(name.clone()).await?;
let split_at = match minimum_spaces { let split_at = match minimum_spaces {
Some(number) => number.item, Some(number) => number.item,
None => DEFAULT_MINIMUM_SPACES None => DEFAULT_MINIMUM_SPACES,
}; };
match from_ssv_string_to_value(&concat_string.item, headerless, aligned_columns, split_at, name.clone()) { Ok(
match from_ssv_string_to_value(
&concat_string.item,
headerless,
aligned_columns,
split_at,
name.clone(),
) {
Some(x) => match x { Some(x) => match x {
Value { value: UntaggedValue::Table(list), ..} => { Value {
for l in list { yield ReturnSuccess::value(l) } value: UntaggedValue::Table(list),
} ..
x => yield ReturnSuccess::value(x) } => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
.to_output_stream(),
x => OutputStream::one(ReturnSuccess::value(x)),
}, },
None => { None => {
yield Err(ShellError::labeled_error_with_secondary( return Err(ShellError::labeled_error_with_secondary(
"Could not parse as SSV", "Could not parse as SSV",
"input cannot be parsed ssv", "input cannot be parsed ssv",
&name, &name,
"value originates from here", "value originates from here",
&concat_string.tag, &concat_string.tag,
)) ));
},
} }
}; },
)
Ok(stream.to_output_stream())
} }
#[cfg(test)] #[cfg(test)]
@ -489,4 +502,12 @@ mod tests {
assert_eq!(aligned_columns_headerless, separator_headerless); assert_eq!(aligned_columns_headerless, separator_headerless);
assert_eq!(aligned_columns_with_headers, separator_with_headers); assert_eq!(aligned_columns_with_headers, separator_with_headers);
} }
#[test]
fn examples_work_as_expected() {
use super::FromSSV;
use crate::examples::test as test_examples;
test_examples(FromSSV {})
}
} }
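
from ssv treats a run of at least minimum-spaces blanks (2 by default) as a column separator, while shorter runs stay inside a field. A std-only illustration of that rule for a single row (the real parser also handles headers and aligned columns):

fn split_row(row: &str, min_spaces: usize) -> Vec<String> {
    let mut fields = Vec::new();
    let mut current = String::new();
    let mut run = String::new();

    for c in row.chars() {
        if c == ' ' {
            run.push(c);
        } else {
            if run.len() >= min_spaces {
                // long enough run: close the current field (if any)
                if !current.is_empty() {
                    fields.push(current.clone());
                    current.clear();
                }
            } else {
                // short run: the spaces belong inside the field
                current.push_str(&run);
            }
            run.clear();
            current.push(c);
        }
    }
    if !current.is_empty() {
        fields.push(current);
    }
    fields
}

fn main() {
    // default minimum is 2 spaces, matching DEFAULT_MINIMUM_SPACES above
    assert_eq!(
        split_row("proc name   12.3 MB   running", 2),
        vec!["proc name", "12.3 MB", "running"]
    );
}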


@@ -5,25 +5,26 @@ use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, Untagg
 pub struct FromTOML;

+#[async_trait]
 impl WholeStreamCommand for FromTOML {
     fn name(&self) -> &str {
-        "from-toml"
+        "from toml"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-toml")
+        Signature::build("from toml")
     }

     fn usage(&self) -> &str {
         "Parse text as .toml and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        from_toml(args, registry)
+        from_toml(args, registry).await
     }
 }

@@ -63,27 +64,28 @@ pub fn from_toml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value
     Ok(convert_toml_value_to_nu_value(&v, tag))
 }

-pub fn from_toml(
+pub async fn from_toml(
     args: CommandArgs,
     registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
-    let args = args.evaluate_once(registry)?;
+    let registry = registry.clone();
+    let args = args.evaluate_once(&registry).await?;
     let tag = args.name_tag();
     let input = args.input;

-    let stream = async_stream! {
-        let concat_string = input.collect_string(tag.clone()).await?;
+    let concat_string = input.collect_string(tag.clone()).await?;

+    Ok(
         match from_toml_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
-                Value { value: UntaggedValue::Table(list), .. } => {
-                    for l in list {
-                        yield ReturnSuccess::value(l);
-                    }
-                }
-                x => yield ReturnSuccess::value(x),
+                Value {
+                    value: UntaggedValue::Table(list),
+                    ..
+                } => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
+                    .to_output_stream(),
+                x => OutputStream::one(ReturnSuccess::value(x)),
             },
             Err(_) => {
-                yield Err(ShellError::labeled_error_with_secondary(
+                return Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as TOML",
                     "input cannot be parsed as TOML",
                     &tag,
@@ -91,8 +93,18 @@ pub fn from_toml(
                     concat_string.tag,
                 ))
             }
+        },
+    )
 }
-    };
-
-    Ok(stream.to_output_stream())
+
+#[cfg(test)]
+mod tests {
+    use super::FromTOML;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(FromTOML {})
+    }
 }
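The body of from_toml_string_to_value sits outside this hunk; as a rough, standalone sketch of the parsing step it wraps (assuming only the toml crate, with an illustrative input string):

fn main() -> Result<(), toml::de::Error> {
    // Parse arbitrary TOML into a generic toml::Value, the shape that
    // convert_toml_value_to_nu_value consumes above.
    let v: toml::Value = toml::from_str("name = \"nu\"\n[dependencies]\ntoml = \"0.5\"")?;
    println!("{:#?}", v);
    Ok(())
}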


@@ -11,13 +11,14 @@ pub struct FromTSVArgs {
     headerless: bool,
 }

+#[async_trait]
 impl WholeStreamCommand for FromTSV {
     fn name(&self) -> &str {
-        "from-tsv"
+        "from tsv"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-tsv").switch(
+        Signature::build("from tsv").switch(
             "headerless",
             "don't treat the first row as column names",
             None,
@@ -28,18 +29,34 @@ impl WholeStreamCommand for FromTSV {
         "Parse text as .tsv and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, from_tsv)?.run()
+        from_tsv(args, registry).await
     }
 }

-fn from_tsv(
-    FromTSVArgs { headerless }: FromTSVArgs,
-    runnable_context: RunnableContext,
-) -> Result<OutputStream, ShellError> {
-    from_delimited_data(headerless, '\t', "TSV", runnable_context)
+async fn from_tsv(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let name = args.call_info.name_tag.clone();
+    let (FromTSVArgs { headerless }, input) = args.process(&registry).await?;
+
+    from_delimited_data(headerless, '\t', "TSV", input, name).await
+}
+
+#[cfg(test)]
+mod tests {
+    use super::FromTSV;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(FromTSV {})
+    }
 }
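from_delimited_data itself is not part of this diff; a minimal, self-contained illustration of tab-delimited parsing (the csv crate here is an assumption for the sketch, not necessarily what the helper uses):

use csv::ReaderBuilder;

fn main() {
    let data = "name\tcountry\nAR\tEC\nJT\tNZ\n";
    // Same idea as `from tsv`: a reader configured with '\t' instead of ','.
    let mut reader = ReaderBuilder::new()
        .delimiter(b'\t')
        .from_reader(data.as_bytes());
    for record in reader.records() {
        println!("{:?}", record.expect("well-formed TSV row"));
    }
}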


@@ -5,34 +5,38 @@ use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
 pub struct FromURL;

+#[async_trait]
 impl WholeStreamCommand for FromURL {
     fn name(&self) -> &str {
-        "from-url"
+        "from url"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-url")
+        Signature::build("from url")
     }

     fn usage(&self) -> &str {
         "Parse url-encoded string as a table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        from_url(args, registry)
+        from_url(args, registry).await
     }
 }

-fn from_url(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-    let args = args.evaluate_once(registry)?;
+async fn from_url(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let args = args.evaluate_once(&registry).await?;
     let tag = args.name_tag();
     let input = args.input;

-    let stream = async_stream! {
-        let concat_string = input.collect_string(tag.clone()).await?;
+    let concat_string = input.collect_string(tag.clone()).await?;

     let result = serde_urlencoded::from_str::<Vec<(String, String)>>(&concat_string.item);

@@ -45,19 +49,26 @@ fn from_url(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStrea
                 row.insert_untagged(k, UntaggedValue::string(v));
             }

-            yield ReturnSuccess::value(row.into_value());
+            Ok(OutputStream::one(ReturnSuccess::value(row.into_value())))
         }
-        _ => {
-            yield Err(ShellError::labeled_error_with_secondary(
+        _ => Err(ShellError::labeled_error_with_secondary(
             "String not compatible with url-encoding",
             "input not url-encoded",
             tag,
             "value originates from here",
             concat_string.tag,
-            ));
-        }
+        )),
     }
-    };
-
-    Ok(stream.to_output_stream())
+}
+
+#[cfg(test)]
+mod tests {
+    use super::FromURL;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(FromURL {})
+    }
 }
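The call above deserializes the whole input into Vec<(String, String)>; a self-contained sketch of that serde_urlencoded call with a made-up query string:

fn main() {
    let pairs: Vec<(String, String)> =
        serde_urlencoded::from_str("a=1&b=2").expect("valid url-encoded input");
    // Each key/value pair becomes one column of the resulting row.
    assert_eq!(pairs[0], ("a".to_string(), "1".to_string()));
    assert_eq!(pairs[1], ("b".to_string(), "2".to_string()));
}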


@@ -0,0 +1,114 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::vcard::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
pub struct FromVcf;
#[async_trait]
impl WholeStreamCommand for FromVcf {
fn name(&self) -> &str {
"from vcf"
}
fn signature(&self) -> Signature {
Signature::build("from vcf")
}
fn usage(&self) -> &str {
"Parse text as .vcf and create table."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_vcf(args, registry).await
}
}
async fn from_vcf(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let args = args.evaluate_once(&registry).await?;
let tag = args.name_tag();
let input = args.input;
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.into_bytes();
let cursor = std::io::Cursor::new(input_bytes);
let parser = ical::VcardParser::new(cursor);
let iter = parser.map(move |contact| match contact {
Ok(c) => ReturnSuccess::value(contact_to_value(c, tag.clone())),
Err(_) => Err(ShellError::labeled_error(
"Could not parse as .vcf",
"input cannot be parsed as .vcf",
tag.clone(),
)),
});
Ok(futures::stream::iter(iter).to_output_stream())
}
fn contact_to_value(contact: VcardContact, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged("properties", properties_to_value(contact.properties, tag));
row.into_value()
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}
#[cfg(test)]
mod tests {
use super::FromVcf;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(FromVcf {})
}
}


@@ -13,13 +13,14 @@ pub struct FromXLSXArgs {
     headerless: bool,
 }

+#[async_trait]
 impl WholeStreamCommand for FromXLSX {
     fn name(&self) -> &str {
-        "from-xlsx"
+        "from xlsx"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-xlsx").switch(
+        Signature::build("from xlsx").switch(
             "headerless",
             "don't treat the first row as column names",
             None,
@@ -30,28 +31,30 @@ impl WholeStreamCommand for FromXLSX {
         "Parse binary Excel(.xlsx) data and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, from_xlsx)?.run()
+        from_xlsx(args, registry).await
     }
 }

-fn from_xlsx(
+async fn from_xlsx(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let tag = args.call_info.name_tag.clone();
+    let registry = registry.clone();
+    let (
         FromXLSXArgs {
             headerless: _headerless,
-        }: FromXLSXArgs,
-    runnable_context: RunnableContext,
-) -> Result<OutputStream, ShellError> {
-    let input = runnable_context.input;
-    let tag = runnable_context.name;
-
-    let stream = async_stream! {
-        let value = input.collect_binary(tag.clone()).await?;
+        },
+        input,
+    ) = args.process(&registry).await?;
+    let value = input.collect_binary(tag.clone()).await?;

-    let mut buf: Cursor<Vec<u8>> = Cursor::new(value.item);
+    let buf: Cursor<Vec<u8>> = Cursor::new(value.item);
     let mut xls = Xlsx::<_>::new(buf).map_err(|_| {
         ShellError::labeled_error("Could not load xlsx file", "could not load xlsx file", &tag)
     })?;
@@ -84,7 +87,7 @@ fn from_xlsx(
             dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
         } else {
-            yield Err(ShellError::labeled_error(
+            return Err(ShellError::labeled_error(
                 "Could not load sheet",
                 "could not load sheet",
                 &tag,
@@ -92,8 +95,17 @@ fn from_xlsx(
         }
     }

-    yield ReturnSuccess::value(dict.into_value());
-    };
-
-    Ok(stream.to_output_stream())
+    Ok(OutputStream::one(ReturnSuccess::value(dict.into_value())))
+}
+
+#[cfg(test)]
+mod tests {
+    use super::FromXLSX;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(FromXLSX {})
+    }
 }


@@ -5,25 +5,26 @@ use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, Untagg
 pub struct FromXML;

+#[async_trait]
 impl WholeStreamCommand for FromXML {
     fn name(&self) -> &str {
-        "from-xml"
+        "from xml"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-xml")
+        Signature::build("from xml")
     }

     fn usage(&self) -> &str {
         "Parse text as .xml and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        from_xml(args, registry)
+        from_xml(args, registry).await
     }
 }

@@ -98,41 +99,42 @@ pub fn from_xml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value,
     Ok(from_document_to_value(&parsed, tag))
 }

-fn from_xml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-    let args = args.evaluate_once(registry)?;
+async fn from_xml(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let args = args.evaluate_once(&registry).await?;
     let tag = args.name_tag();
     let input = args.input;

-    let stream = async_stream! {
-        let concat_string = input.collect_string(tag.clone()).await?;
+    let concat_string = input.collect_string(tag.clone()).await?;

+    Ok(
         match from_xml_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
-                Value { value: UntaggedValue::Table(list), .. } => {
-                    for l in list {
-                        yield ReturnSuccess::value(l);
-                    }
-                }
-                x => yield ReturnSuccess::value(x),
+                Value {
+                    value: UntaggedValue::Table(list),
+                    ..
+                } => futures::stream::iter(list.into_iter().map(ReturnSuccess::value))
+                    .to_output_stream(),
+                x => OutputStream::one(ReturnSuccess::value(x)),
             },
             Err(_) => {
-                yield Err(ShellError::labeled_error_with_secondary(
+                return Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as XML",
                     "input cannot be parsed as XML",
                     &tag,
                     "value originates from here",
                     &concat_string.tag,
                 ))
-            }
-        }
-    };
-
-    Ok(stream.to_output_stream())
+            },
+        },
+    )
 }

 #[cfg(test)]
 mod tests {
     use crate::commands::from_xml;
     use indexmap::IndexMap;
     use nu_protocol::{UntaggedValue, Value};
@@ -300,4 +302,12 @@ mod tests {
         Ok(())
     }
+
+    #[test]
+    fn examples_work_as_expected() {
+        use super::FromXML;
+        use crate::examples::test as test_examples;
+        test_examples(FromXML {})
+    }
 }


@@ -1,53 +1,55 @@
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
+use nu_protocol::{Primitive, Signature, TaggedDictBuilder, UntaggedValue, Value};

 pub struct FromYAML;

+#[async_trait]
 impl WholeStreamCommand for FromYAML {
     fn name(&self) -> &str {
-        "from-yaml"
+        "from yaml"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-yaml")
+        Signature::build("from yaml")
     }

     fn usage(&self) -> &str {
         "Parse text as .yaml/.yml and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        from_yaml(args, registry)
+        from_yaml(args, registry).await
     }
 }

 pub struct FromYML;

+#[async_trait]
 impl WholeStreamCommand for FromYML {
     fn name(&self) -> &str {
-        "from-yml"
+        "from yml"
     }

     fn signature(&self) -> Signature {
-        Signature::build("from-yml")
+        Signature::build("from yml")
     }

     fn usage(&self) -> &str {
         "Parse text as .yaml/.yml and create table."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        from_yaml(args, registry)
+        from_yaml(args, registry).await
     }
 }

@@ -57,26 +59,18 @@ fn convert_yaml_value_to_nu_value(
 ) -> Result<Value, ShellError> {
     let tag = tag.into();
+    let err_not_compatible_number = ShellError::labeled_error(
+        "Expected a compatible number",
+        "expected a compatible number",
+        &tag,
+    );

     Ok(match v {
         serde_yaml::Value::Bool(b) => UntaggedValue::boolean(*b).into_value(tag),
         serde_yaml::Value::Number(n) if n.is_i64() => {
-            UntaggedValue::int(n.as_i64().ok_or_else(|| {
-                ShellError::labeled_error(
-                    "Expected a compatible number",
-                    "expected a compatible number",
-                    &tag,
-                )
-            })?)
-            .into_value(tag)
+            UntaggedValue::int(n.as_i64().ok_or_else(|| err_not_compatible_number)?).into_value(tag)
         }
         serde_yaml::Value::Number(n) if n.is_f64() => {
-            UntaggedValue::decimal(n.as_f64().ok_or_else(|| {
-                ShellError::labeled_error(
-                    "Expected a compatible number",
-                    "expected a compatible number",
-                    &tag,
-                )
-            })?)
+            UntaggedValue::decimal(n.as_f64().ok_or_else(|| err_not_compatible_number)?)
             .into_value(tag)
         }
         serde_yaml::Value::String(s) => UntaggedValue::string(s).into_value(tag),
@@ -91,11 +85,39 @@ fn convert_yaml_value_to_nu_value(
             let mut collected = TaggedDictBuilder::new(&tag);

             for (k, v) in t.iter() {
-                match k {
-                    serde_yaml::Value::String(k) => {
+                // A ShellError that we re-use multiple times in the Mapping scenario
+                let err_unexpected_map = ShellError::labeled_error(
+                    format!("Unexpected YAML:\nKey: {:?}\nValue: {:?}", k, v),
+                    "unexpected",
+                    tag.clone(),
+                );
+                match (k, v) {
+                    (serde_yaml::Value::String(k), _) => {
                         collected.insert_value(k.clone(), convert_yaml_value_to_nu_value(v, &tag)?);
                     }
-                    _ => unimplemented!("Unknown key type"),
+                    // Hard-code fix for cases where "v" is a string without quotations with double curly braces
+                    // e.g. k = value
+                    //      value: {{ something }}
+                    // Strangely, serde_yaml returns
+                    // "value" -> Mapping(Mapping { map: {Mapping(Mapping { map: {String("something"): Null} }): Null} })
+                    (serde_yaml::Value::Mapping(m), serde_yaml::Value::Null) => {
+                        return m
+                            .iter()
+                            .take(1)
+                            .collect_vec()
+                            .first()
+                            .and_then(|e| match e {
+                                (serde_yaml::Value::String(s), serde_yaml::Value::Null) => Some(
+                                    UntaggedValue::string("{{ ".to_owned() + &s + " }}")
+                                        .into_value(tag),
+                                ),
+                                _ => None,
+                            })
+                            .ok_or(err_unexpected_map);
+                    }
+                    (_, _) => {
+                        return Err(err_unexpected_map);
+                    }
                 }
             }
@@ -118,34 +140,79 @@ pub fn from_yaml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value
     Ok(convert_yaml_value_to_nu_value(&v, tag)?)
 }

-fn from_yaml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
-    let args = args.evaluate_once(registry)?;
+async fn from_yaml(
+    args: CommandArgs,
+    registry: &CommandRegistry,
+) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let args = args.evaluate_once(&registry).await?;
     let tag = args.name_tag();
     let input = args.input;

-    let stream = async_stream! {
-        let concat_string = input.collect_string(tag.clone()).await?;
+    let concat_string = input.collect_string(tag.clone()).await?;

     match from_yaml_string_to_value(concat_string.item, tag.clone()) {
         Ok(x) => match x {
-            Value { value: UntaggedValue::Table(list), .. } => {
-                for l in list {
-                    yield ReturnSuccess::value(l);
-                }
-            }
-            x => yield ReturnSuccess::value(x),
+            Value {
+                value: UntaggedValue::Table(list),
+                ..
+            } => Ok(futures::stream::iter(list).to_output_stream()),
+            x => Ok(OutputStream::one(x)),
         },
-        Err(_) => {
-            yield Err(ShellError::labeled_error_with_secondary(
+        Err(_) => Err(ShellError::labeled_error_with_secondary(
             "Could not parse as YAML",
             "input cannot be parsed as YAML",
             &tag,
             "value originates from here",
             &concat_string.tag,
-            ))
-        }
+        )),
     }
-    };
-
-    Ok(stream.to_output_stream())
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+    use nu_plugin::row;
+    use nu_plugin::test_helpers::value::string;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(FromYAML {})
+    }
+
+    #[test]
+    fn test_problematic_yaml() {
+        struct TestCase {
+            description: &'static str,
+            input: &'static str,
+            expected: Result<Value, ShellError>,
+        }
+        let tt: Vec<TestCase> = vec![
+            TestCase {
+                description: "Double Curly Braces With Quotes",
+                input: r#"value: "{{ something }}""#,
+                expected: Ok(row!["value".to_owned() => string("{{ something }}")]),
+            },
+            TestCase {
+                description: "Double Curly Braces Without Quotes",
+                input: r#"value: {{ something }}"#,
+                expected: Ok(row!["value".to_owned() => string("{{ something }}")]),
+            },
+        ];
+        for tc in tt.into_iter() {
+            let actual = from_yaml_string_to_value(tc.input.to_owned(), Tag::default());
+            if actual.is_err() {
+                assert!(
+                    tc.expected.is_err(),
+                    "actual is Err for test:\nTest Description {}\nErr: {:?}",
+                    tc.description,
+                    actual
+                );
+            } else {
+                assert_eq!(actual, tc.expected, "{}", tc.description);
+            }
+        }
+    }
 }
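The Mapping/Null special case and the test_problematic_yaml cases above come down to how serde_yaml reads double curly braces; a minimal reproduction using serde_yaml alone (same input strings as the tests):

fn main() {
    // With quotes, the value parses as a plain string.
    let quoted: serde_yaml::Value =
        serde_yaml::from_str(r#"value: "{{ something }}""#).expect("valid YAML");
    // Without quotes, `{{ something }}` parses as a mapping keyed by another
    // mapping (with Null values), which is what the special case unwraps back
    // into the string "{{ something }}".
    let unquoted: serde_yaml::Value =
        serde_yaml::from_str("value: {{ something }}").expect("valid YAML");
    println!("quoted:   {:?}", quoted);
    println!("unquoted: {:?}", unquoted);
}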


@@ -4,7 +4,7 @@ use indexmap::set::IndexSet;
 use log::trace;
 use nu_errors::ShellError;
 use nu_protocol::{
-    did_you_mean, ColumnPath, PathMember, ReturnSuccess, ReturnValue, Signature, SyntaxShape,
+    did_you_mean, ColumnPath, PathMember, Primitive, ReturnSuccess, Signature, SyntaxShape,
     UnspannedPathMember, UntaggedValue, Value,
 };
 use nu_source::span_for_spanned_list;
@@ -17,6 +17,7 @@ pub struct GetArgs {
     rest: Vec<ColumnPath>,
 }

+#[async_trait]
 impl WholeStreamCommand for Get {
     fn name(&self) -> &str {
         "get"
@@ -33,12 +34,27 @@ impl WholeStreamCommand for Get {
         "Open given cells as text."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, get)?.run()
+        get(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "Extract the name of files as a list",
+                example: "ls | get name",
+                result: None,
+            },
+            Example {
+                description: "Extract the cpu list from the sys information",
+                example: "sys | get cpu",
+                result: None,
+            },
+        ]
     }
 }

@@ -176,31 +192,24 @@ pub fn get_column_path(path: &ColumnPath, obj: &Value) -> Result<Value, ShellErr
     )
 }

-pub fn get(
-    GetArgs { rest: mut fields }: GetArgs,
-    RunnableContext { mut input, .. }: RunnableContext,
+pub async fn get(
+    args: CommandArgs,
+    registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
+    let registry = registry.clone();
+    let (GetArgs { rest: mut fields }, mut input) = args.process(&registry).await?;
     if fields.is_empty() {
-        let stream = async_stream! {
-            let mut vec = input.drain_vec().await;
+        let vec = input.drain_vec().await;

         let descs = nu_protocol::merge_descriptors(&vec);
-            for desc in descs {
-                yield ReturnSuccess::value(desc);
-            }
-        };
-
-        let stream: BoxStream<'static, ReturnValue> = stream.boxed();
-        Ok(stream.to_output_stream())
+
+        Ok(futures::stream::iter(descs.into_iter().map(ReturnSuccess::value)).to_output_stream())
     } else {
         let member = fields.remove(0);
         trace!("get {:?} {:?}", member, fields);
-        let stream = input
-            .values
-            .map(move |item| {
-                let mut result = VecDeque::new();
+        Ok(input
+            .map(move |item| {
                 let member = vec![member.clone()];

                 let column_paths = vec![&member, &fields]
@@ -208,6 +217,7 @@ pub fn get(
                     .flatten()
                     .collect::<Vec<&ColumnPath>>();

+                let mut output = vec![];
                 for path in column_paths {
                     let res = get_column_path(&path, &item);
@@ -218,21 +228,36 @@ pub fn get(
                             ..
                         } => {
                             for item in rows {
-                                result.push_back(ReturnSuccess::value(item.clone()));
+                                output.push(ReturnSuccess::value(item.clone()));
                             }
                         }
-                        other => result.push_back(ReturnSuccess::value(other.clone())),
+                        Value {
+                            value: UntaggedValue::Primitive(Primitive::Nothing),
+                            ..
+                        } => {}
+                        other => output.push(ReturnSuccess::value(other.clone())),
                     },
-                    Err(reason) => result.push_back(ReturnSuccess::value(
+                    Err(reason) => output.push(ReturnSuccess::value(
                         UntaggedValue::Error(reason).into_untagged_value(),
                     )),
                 }
                 }

-                futures::stream::iter(result)
+                futures::stream::iter(output)
             })
-            .flatten();
-
-        Ok(stream.to_output_stream())
+            .flatten()
+            .to_output_stream())
+    }
+}
+
+#[cfg(test)]
+mod tests {
+    use super::Get;
+
+    #[test]
+    fn examples_work_as_expected() {
+        use crate::examples::test as test_examples;
+        test_examples(Get {})
     }
 }
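The rewritten get maps every input row to a batch of results and flattens those batches into one output stream instead of yielding from an async_stream! block; a toy, self-contained version of that stream pattern using plain integers (futures 0.3, nothing nushell-specific):

use futures::stream::{self, StreamExt};

fn main() {
    let rows = vec![vec![1, 2], vec![3], vec![4, 5, 6]];
    let flattened: Vec<i32> = futures::executor::block_on(
        // One inner stream per row, flattened into a single stream of values.
        stream::iter(rows).map(stream::iter).flatten().collect(),
    );
    assert_eq!(flattened, vec![1, 2, 3, 4, 5, 6]);
}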


@@ -1,191 +1,281 @@
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
+use indexmap::indexmap;
 use nu_errors::ShellError;
-use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value};
+use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
 use nu_source::Tagged;
-use nu_value_ext::get_data_by_key;
+use nu_value_ext::as_string;

 pub struct GroupBy;

 #[derive(Deserialize)]
 pub struct GroupByArgs {
-    column_name: Tagged<String>,
+    grouper: Option<Value>,
 }

+#[async_trait]
 impl WholeStreamCommand for GroupBy {
     fn name(&self) -> &str {
         "group-by"
     }

     fn signature(&self) -> Signature {
-        Signature::build("group-by").required(
-            "column_name",
-            SyntaxShape::String,
-            "the name of the column to group by",
+        Signature::build("group-by").optional(
+            "grouper",
+            SyntaxShape::Any,
+            "the grouper value to use",
         )
     }

     fn usage(&self) -> &str {
-        "Creates a new table with the data from the table rows grouped by the column given."
+        "create a new table grouped."
     }

-    fn run(
+    async fn run(
         &self,
         args: CommandArgs,
         registry: &CommandRegistry,
     ) -> Result<OutputStream, ShellError> {
-        args.process(registry, group_by)?.run()
+        group_by(args, registry).await
+    }
+
+    fn examples(&self) -> Vec<Example> {
+        vec![
+            Example {
+                description: "group items by column named \"type\"",
+                example: r#"ls | group-by type"#,
+                result: None,
+            },
+            Example {
+                description: "blocks can be used for generating a grouping key (same as above)",
+                example: r#"ls | group-by { get type }"#,
+                result: None,
+            },
+            Example {
+                description: "you can also group by raw values by leaving out the argument",
+                example: "echo [1 3 1 3 2 1 1] | group-by",
+                result: Some(vec![UntaggedValue::row(indexmap! {
+                    "1".to_string() => UntaggedValue::Table(vec![
+                        UntaggedValue::int(1).into(),
+                        UntaggedValue::int(1).into(),
+                        UntaggedValue::int(1).into(),
+                        UntaggedValue::int(1).into(),
+                    ]).into(),
+                    "3".to_string() => UntaggedValue::Table(vec![
+                        UntaggedValue::int(3).into(),
+                        UntaggedValue::int(3).into(),
+                    ]).into(),
+                    "2".to_string() => UntaggedValue::Table(vec![
+                        UntaggedValue::int(2).into(),
+                    ]).into(),
+                })
+                .into()]),
+            },
+            Example {
+                description: "write pipelines for a more involved grouping key",
+                example:
+                    "echo [1 3 1 3 2 1 1] | group-by { echo `({{$it}} - 1) % 3` | calc | str from }",
+                result: None,
+            },
+        ]
     }
 }

-pub fn group_by(
-    GroupByArgs { column_name }: GroupByArgs,
-    RunnableContext { input, name, .. }: RunnableContext,
+enum Grouper {
+    ByColumn(Option<Tagged<String>>),
+    ByBlock,
+}
+
+pub async fn group_by(
+    args: CommandArgs,
+    registry: &CommandRegistry,
 ) -> Result<OutputStream, ShellError> {
-    let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        if values.is_empty() {
-            yield Err(ShellError::labeled_error(
-                "Expected table from pipeline",
-                "requires a table input",
-                column_name.span()
-            ))
-        } else {
-            match group(&column_name, values, name) {
-                Ok(grouped) => yield ReturnSuccess::value(grouped),
-                Err(err) => yield Err(err)
-            }
-        }
-    };
-
-    Ok(stream.to_output_stream())
-}
+    let name = args.call_info.name_tag.clone();
+    let registry = registry.clone();
+    let head = Arc::new(args.call_info.args.head.clone());
+    let scope = Arc::new(args.call_info.scope.clone());
+    let context = Arc::new(Context::from_raw(&args, &registry));
+    let (GroupByArgs { grouper }, input) = args.process(&registry).await?;
+
+    let values: Vec<Value> = input.collect().await;
+    let mut keys: Vec<Result<String, ShellError>> = vec![];
+    let mut group_strategy = Grouper::ByColumn(None);
+
+    match grouper {
+        Some(Value {
+            value: UntaggedValue::Block(block_given),
+            ..
+        }) => {
+            let block = Arc::new(block_given);
+            let error_key = "error";
+
+            for value in values.iter() {
+                let run = block.clone();
+                let scope = scope.clone();
+                let head = head.clone();
+                let context = context.clone();
+
+                match crate::commands::each::process_row(run, scope, head, context, value.clone())
+                    .await
+                {
+                    Ok(mut s) => {
+                        let collection: Vec<Result<ReturnSuccess, ShellError>> =
+                            s.drain_vec().await;
+
+                        if collection.len() > 1 {
+                            return Err(ShellError::labeled_error(
+                                "expected one value from the block",
+                                "requires a table with one value for grouping",
+                                &name,
+                            ));
+                        }
+
+                        let value = match collection.get(0) {
+                            Some(Ok(return_value)) => {
+                                return_value.raw_value().unwrap_or_else(|| {
+                                    UntaggedValue::string(error_key).into_value(&name)
+                                })
+                            }
+                            Some(Err(_)) | None => {
+                                UntaggedValue::string(error_key).into_value(&name)
+                            }
+                        };
+
+                        keys.push(as_string(&value));
+                    }
+                    Err(_) => {
+                        keys.push(Ok(error_key.into()));
+                    }
+                }
+            }
+
+            group_strategy = Grouper::ByBlock;
+        }
+        Some(other) => {
+            group_strategy = Grouper::ByColumn(Some(as_string(&other)?.tagged(&name)));
+        }
+        _ => {}
+    }
+
+    if values.is_empty() {
+        return Err(ShellError::labeled_error(
+            "expected table from pipeline",
+            "requires a table input",
+            name,
+        ));
+    }
+
+    let values = UntaggedValue::table(&values).into_value(&name);
+
+    let group_value = match group_strategy {
+        Grouper::ByBlock => {
+            let map = keys.clone();
+
+            let block = Box::new(move |idx: usize, row: &Value| match map.get(idx) {
+                Some(Ok(key)) => Ok(key.clone()),
+                Some(Err(reason)) => Err(reason.clone()),
+                None => as_string(row),
+            });
+
+            crate::utils::data::group(&values, &Some(block), &name)
+        }
+        Grouper::ByColumn(column_name) => group(&column_name, &values, name),
+    };
+
+    Ok(OutputStream::one(ReturnSuccess::value(group_value?)))
+}

-pub fn group(
-    column_name: &Tagged<String>,
-    values: Vec<Value>,
-    tag: impl Into<Tag>,
-) -> Result<Value, ShellError> {
-    let tag = tag.into();
-
-    let mut groups: indexmap::IndexMap<String, Vec<Value>> = indexmap::IndexMap::new();
-
-    for value in values {
-        let group_key = get_data_by_key(&value, column_name.borrow_spanned());
-
-        if let Some(group_key) = group_key {
-            let group_key = group_key.as_string()?.to_string();
-            let group = groups.entry(group_key).or_insert(vec![]);
-            group.push(value);
-        } else {
-            let possibilities = value.data_descriptors();
+pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError {
+    let possibilities = for_value.data_descriptors();

     let mut possible_matches: Vec<_> = possibilities
         .iter()
-        .map(|x| (natural::distance::levenshtein_distance(x, column_name), x))
+        .map(|x| (natural::distance::levenshtein_distance(x, &tried), x))
         .collect();

     possible_matches.sort();

     if !possible_matches.is_empty() {
-        return Err(ShellError::labeled_error(
+        ShellError::labeled_error(
             "Unknown column",
             format!("did you mean '{}'?", possible_matches[0].1),
-            column_name.tag(),
-        ));
+            tried.tag(),
+        )
     } else {
-        return Err(ShellError::labeled_error(
+        ShellError::labeled_error(
             "Unknown column",
             "row does not contain this column",
-            column_name.tag(),
-        ));
-            }
-        }
-    }
-
-    let mut out = TaggedDictBuilder::new(&tag);
-
-    for (k, v) in groups.iter() {
-        out.insert_untagged(k, UntaggedValue::table(v));
+            tried.tag(),
+        )
     }
+}

-    Ok(out.into_value())
+pub fn group(
+    column_name: &Option<Tagged<String>>,
+    values: &Value,
+    tag: impl Into<Tag>,
+) -> Result<Value, ShellError> {
+    let name = tag.into();
+
+    let grouper = if let Some(column_name) = column_name {
+        Grouper::ByColumn(Some(column_name.clone()))
+    } else {
+        Grouper::ByColumn(None)
+    };
+
+    match grouper {
+        Grouper::ByColumn(Some(column_name)) => {
+            let block = Box::new(move |_, row: &Value| {
+                match row.get_data_by_key(column_name.borrow_spanned()) {
+                    Some(group_key) => Ok(as_string(&group_key)?),
+                    None => Err(suggestions(column_name.borrow_tagged(), &row)),
+                }
+            });
+
+            crate::utils::data::group(&values, &Some(block), &name)
+        }
+        Grouper::ByColumn(None) => {
+            let block = Box::new(move |_, row: &Value| as_string(row));
+
+            crate::utils::data::group(&values, &Some(block), &name)
+        }
+        Grouper::ByBlock => Err(ShellError::unimplemented(
+            "Block not implemented: This should never happen.",
+        )),
+    }
 }

 #[cfg(test)]
 mod tests {
-    use crate::commands::group_by::group;
-    use indexmap::IndexMap;
+    use super::group;
+    use crate::utils::data::helpers::{committers, date, int, row, string, table};
     use nu_errors::ShellError;
-    use nu_protocol::{UntaggedValue, Value};
     use nu_source::*;

-    fn string(input: impl Into<String>) -> Value {
-        UntaggedValue::string(input.into()).into_untagged_value()
-    }
-
-    fn row(entries: IndexMap<String, Value>) -> Value {
-        UntaggedValue::row(entries).into_untagged_value()
-    }
-
-    fn table(list: &[Value]) -> Value {
-        UntaggedValue::table(list).into_untagged_value()
-    }
-
-    fn nu_releases_commiters() -> Vec<Value> {
-        vec![
-            row(
-                indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")},
-            ),
-            row(
-                indexmap! {"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")},
-            ),
-        ]
-    }
-
     #[test]
     fn groups_table_by_date_column() -> Result<(), ShellError> {
-        let for_key = String::from("date").tagged_unknown();
+        let for_key = Some(String::from("date").tagged_unknown());
+        let sample = table(&committers());

         assert_eq!(
-            group(&for_key, nu_releases_commiters(), Tag::unknown())?,
+            group(&for_key, &sample, Tag::unknown())?,
             row(indexmap! {
-                "August 23-2019".into() => table(&[
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}),
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}),
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")})
+                "2019-07-23".into() => table(&[
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-07-23"), "chickens".into() => int(10) }),
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-07-23"), "chickens".into() => int(5) }),
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-07-23"), "chickens".into() => int(2) })
                 ]),
-                "October 10-2019".into() => table(&[
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}),
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}),
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")})
+                "2019-10-10".into() => table(&[
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-10-10"), "chickens".into() => int(6) }),
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-10-10"), "chickens".into() => int(15) }),
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-10-10"), "chickens".into() => int(30) })
                 ]),
-                "Sept 24-2019".into() => table(&[
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}),
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}),
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")})
+                "2019-09-24".into() => table(&[
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-09-24"), "chickens".into() => int(20) }),
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-09-24"), "chickens".into() => int(4) }),
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-09-24"), "chickens".into() => int(10) })
                 ]),
             })
         );
@@ -195,29 +285,38 @@ mod tests {

     #[test]
     fn groups_table_by_country_column() -> Result<(), ShellError> {
-        let for_key = String::from("country").tagged_unknown();
+        let for_key = Some(String::from("country").tagged_unknown());
+        let sample = table(&committers());

         assert_eq!(
-            group(&for_key, nu_releases_commiters(), Tag::unknown())?,
+            group(&for_key, &sample, Tag::unknown())?,
             row(indexmap! {
                 "EC".into() => table(&[
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("August 23-2019")}),
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("Sept 24-2019")}),
-                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => string("October 10-2019")})
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-07-23"), "chickens".into() => int(10) }),
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-09-24"), "chickens".into() => int(20) }),
+                    row(indexmap!{"name".into() => string("AR"), "country".into() => string("EC"), "date".into() => date("2019-10-10"), "chickens".into() => int(30) })
                 ]),
                 "NZ".into() => table(&[
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("August 23-2019")}),
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("October 10-2019")}),
-                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => string("Sept 24-2019")})
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-07-23"), "chickens".into() => int(5) }),
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-10-10"), "chickens".into() => int(15) }),
+                    row(indexmap!{"name".into() => string("JT"), "country".into() => string("NZ"), "date".into() => date("2019-09-24"), "chickens".into() => int(10) })
                 ]),
                 "US".into() => table(&[
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("October 10-2019")}),
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("Sept 24-2019")}),
-                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => string("August 23-2019")}),
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-10-10"), "chickens".into() => int(6) }),
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-09-24"), "chickens".into() => int(4) }),
+                    row(indexmap!{"name".into() => string("YK"), "country".into() => string("US"), "date".into() => date("2019-07-23"), "chickens".into() => int(2) }),
                 ]),
             })
         );

         Ok(())
     }
+
+    #[test]
+    fn examples_work_as_expected() {
+        use super::GroupBy;
+        use crate::examples::test as test_examples;
+        test_examples(GroupBy {})
+    }
 }
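The removed implementation built its result directly in an IndexMap keyed by the grouping column's string value (the new code delegates that to crate::utils::data::group and a key-producing closure). A standalone sketch of the original idea, with tuples standing in for nushell rows:

use indexmap::IndexMap;

// Group rows by a caller-supplied key function, preserving first-seen order,
// which is what IndexMap provides over a plain HashMap.
fn group_by_key<T>(rows: Vec<T>, key: impl Fn(&T) -> String) -> IndexMap<String, Vec<T>> {
    let mut groups: IndexMap<String, Vec<T>> = IndexMap::new();
    for row in rows {
        groups.entry(key(&row)).or_insert_with(Vec::new).push(row);
    }
    groups
}

fn main() {
    let committers = vec![("AR", "EC"), ("JT", "NZ"), ("YK", "US"), ("AR", "EC")];
    let grouped = group_by_key(committers, |(_, country)| country.to_string());
    assert_eq!(grouped["EC"].len(), 2);
}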


@@ -0,0 +1,175 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct GroupByDate;
#[derive(Deserialize)]
pub struct GroupByDateArgs {
column_name: Option<Tagged<String>>,
format: Option<Tagged<String>>,
}
#[async_trait]
impl WholeStreamCommand for GroupByDate {
fn name(&self) -> &str {
"group-by date"
}
fn signature(&self) -> Signature {
Signature::build("group-by date")
.optional(
"column_name",
SyntaxShape::String,
"the name of the column to group by",
)
.named(
"format",
SyntaxShape::String,
"Specify date and time formatting",
Some('f'),
)
}
fn usage(&self) -> &str {
"creates a table grouped by date."
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
group_by_date(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Group files by type",
example: "ls | group-by date --format '%d/%m/%Y'",
result: None,
}]
}
}
enum Grouper {
ByDate(Option<Tagged<String>>),
}
enum GroupByColumn {
Name(Option<Tagged<String>>),
}
pub async fn group_by_date(
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let registry = registry.clone();
let name = args.call_info.name_tag.clone();
let (
GroupByDateArgs {
column_name,
format,
},
input,
) = args.process(&registry).await?;
let values: Vec<Value> = input.collect().await;
if values.is_empty() {
Err(ShellError::labeled_error(
"Expected table from pipeline",
"requires a table input",
name,
))
} else {
let values = UntaggedValue::table(&values).into_value(&name);
let grouper_column = if let Some(column_name) = column_name {
GroupByColumn::Name(Some(column_name))
} else {
GroupByColumn::Name(None)
};
let grouper_date = if let Some(date_format) = format {
Grouper::ByDate(Some(date_format))
} else {
Grouper::ByDate(None)
};
let value_result = match (grouper_date, grouper_column) {
(Grouper::ByDate(None), GroupByColumn::Name(None)) => {
let block = Box::new(move |_, row: &Value| row.format("%Y-%m-%d"));
crate::utils::data::group(&values, &Some(block), &name)
}
(Grouper::ByDate(None), GroupByColumn::Name(Some(column_name))) => {
let block = Box::new(move |_, row: &Value| {
let group_key = row
.get_data_by_key(column_name.borrow_spanned())
.ok_or_else(|| suggestions(column_name.borrow_tagged(), &row));
group_key?.format("%Y-%m-%d")
});
crate::utils::data::group(&values, &Some(block), &name)
}
(Grouper::ByDate(Some(fmt)), GroupByColumn::Name(None)) => {
let block = Box::new(move |_, row: &Value| row.format(&fmt));
crate::utils::data::group(&values, &Some(block), &name)
}
(Grouper::ByDate(Some(fmt)), GroupByColumn::Name(Some(column_name))) => {
let block = Box::new(move |_, row: &Value| {
let group_key = row
.get_data_by_key(column_name.borrow_spanned())
.ok_or_else(|| suggestions(column_name.borrow_tagged(), &row));
group_key?.format(&fmt)
});
crate::utils::data::group(&values, &Some(block), &name)
}
};
Ok(OutputStream::one(ReturnSuccess::value(value_result?)))
}
}
pub fn suggestions(tried: Tagged<&str>, for_value: &Value) -> ShellError {
let possibilities = for_value.data_descriptors();
let mut possible_matches: Vec<_> = possibilities
.iter()
.map(|x| (natural::distance::levenshtein_distance(x, &tried), x))
.collect();
possible_matches.sort();
if !possible_matches.is_empty() {
ShellError::labeled_error(
"Unknown column",
format!("did you mean '{}'?", possible_matches[0].1),
tried.tag(),
)
} else {
ShellError::labeled_error(
"Unknown column",
"row does not contain this column",
tried.tag(),
)
}
}
#[cfg(test)]
mod tests {
use super::GroupByDate;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(GroupByDate {})
}
}
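The grouping keys here are strftime-style format strings applied per row ("%Y-%m-%d" by default, or the value of --format); a tiny chrono example of the two formats that appear in this file (chrono as the date backend is an assumption for the sketch):

use chrono::{TimeZone, Utc};

fn main() {
    let d = Utc.ymd(2019, 9, 24).and_hms(0, 0, 0);
    // Default key format, and the "--format '%d/%m/%Y'" form from the example above.
    assert_eq!(d.format("%Y-%m-%d").to_string(), "2019-09-24");
    assert_eq!(d.format("%d/%m/%Y").to_string(), "24/09/2019");
}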


@@ -0,0 +1,114 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::StreamExt;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::Dictionary;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue, Value};
pub struct Headers;
#[async_trait]
impl WholeStreamCommand for Headers {
fn name(&self) -> &str {
"headers"
}
fn signature(&self) -> Signature {
Signature::build("headers")
}
fn usage(&self) -> &str {
"Use the first row of the table as column names"
}
async fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
headers(args, registry).await
}
fn examples(&self) -> Vec<Example> {
vec![Example {
description: "Create headers for a raw string",
example: r#"echo "a b c|1 2 3" | split row "|" | split column " " | headers"#,
result: None,
}]
}
}
pub async fn headers(
args: CommandArgs,
_registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
let input = args.input;
let rows: Vec<Value> = input.collect().await;
if rows.is_empty() {
return Err(ShellError::untagged_runtime_error(
"Couldn't find headers, was the input a properly formatted, non-empty table?",
));
}
//the headers are the first row in the table
let headers: Vec<String> = match &rows[0].value {
UntaggedValue::Row(d) => {
Ok(d.entries
.iter()
.map(|(k, v)| {
match v.as_string() {
Ok(s) => s,
Err(_) => {
//If a cell that should contain a header name is empty, we name the column Column[index]
match d.entries.get_full(k) {
Some((index, _, _)) => format!("Column{}", index),
None => "unknownColumn".to_string(),
}
}
}
})
.collect())
}
_ => Err(ShellError::unexpected_eof(
"Could not get headers, is the table empty?",
rows[0].tag.span,
)),
}?;
Ok(
futures::stream::iter(rows.into_iter().skip(1).map(move |r| {
//Each row is a dictionary with the headers as keys
match &r.value {
UntaggedValue::Row(d) => {
let mut entries = IndexMap::new();
for (i, (_, v)) in d.entries.iter().enumerate() {
entries.insert(headers[i].clone(), v.clone());
}
Ok(ReturnSuccess::Value(
UntaggedValue::Row(Dictionary { entries }).into_value(r.tag.clone()),
))
}
_ => Err(ShellError::unexpected_eof(
"Couldn't iterate through rows, was the input a properly formatted table?",
r.tag.span,
)),
}
}))
.to_output_stream(),
)
}
#[cfg(test)]
mod tests {
use super::Headers;
#[test]
fn examples_work_as_expected() {
use crate::examples::test as test_examples;
test_examples(Headers {})
}
}

Some files were not shown because too many files have changed in this diff.