Compare commits


39 Commits

Author SHA1 Message Date
b40d16310c More relaxed file modes for now. (#1476) 2020-03-11 13:19:15 +13:00
d3718d00db Merge shuffle nu plugin as core command. (#1475) 2020-03-10 17:00:08 -05:00
f716f61fc1 Update Cargo.lock 2020-03-11 08:53:26 +13:00
b2ce669791 Update Cargo.toml 2020-03-11 08:51:53 +13:00
cd155f63e1 Update Cargo.toml 2020-03-11 08:51:17 +13:00
9eaa6877f3 Update Cargo.toml 2020-03-11 08:50:51 +13:00
a6b6afbca9 Update Cargo.toml 2020-03-11 08:08:13 +13:00
62666bebc9 Bump to 0.11.0 (#1474) 2020-03-11 06:34:19 +13:00
d1fcce0cd3 Fixes #1427: Prints help message with -h switch (#1454)
For some commands, like `which`, the `-h` flag would trigger an error
asking for missing required parameters instead of printing the help
message as `--help` does. This commit adds a check in the command parser
to avoid that.
2020-03-11 05:59:50 +13:00
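The fix described in that commit amounts to checking for the help switch before required-parameter validation runs. A minimal sketch of the idea (the function names here are illustrative, not nu's actual parser API):

```rust
/// Return true if the raw arguments request help, in which case
/// required-parameter validation should be skipped entirely.
fn wants_help(args: &[&str]) -> bool {
    args.iter().any(|a| *a == "-h" || *a == "--help")
}

fn parse(args: &[&str], required: usize) -> Result<(), String> {
    // Check for the help switch *before* enforcing required parameters,
    // so `which -h` prints help instead of "missing required argument".
    if wants_help(args) {
        return Ok(()); // caller prints the help message
    }
    if args.len() < required {
        return Err(format!("missing {} required parameter(s)", required - args.len()));
    }
    Ok(())
}

fn main() {
    assert!(parse(&["-h"], 1).is_ok());
    assert!(parse(&[], 1).is_err());
    println!("help switch handled before validation");
}
```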
a2443fbe02 Remove unused parsing logic. (#1473)
* Remove unused parsing logic.

* Run tokens iteration tests baseline.

* Pass lint.

* lifetimes can be elided without being explicit.
2020-03-10 04:31:42 -05:00
db16b56fe1 Columnpath support when passing fields for formatting. (#1472) 2020-03-10 01:55:03 -05:00
54bf671a50 Fix deleting / showing ls named pipes and other fs objects no… (#1461)
* Fix deleting named pipes
* Use std::os::unix::fs::FileTypeExt to show correct type for unix-specific fs objects; Fix formatting

Co-authored-by: Linards Kalvāns <linards.kalvans@twino.eu>
2020-03-09 09:02:53 -04:00
755d0e648b Eliminate some compiler warnings (#1468)
- Unnecessary parentheses
- Deprecated `description()` method
2020-03-09 08:19:07 +13:00
e440d8c939 Bump some deps (#1467) 2020-03-09 08:18:44 +13:00
01dd358a18 Don't emit a newline in autoview. (#1466)
The extra newline character makes it hard to use nu as part of an
external processing pipeline, since the extra character could taint the
results. For example:

```
$ nu -c 'echo test | xxd'
00000000: 7465 7374                                test
```

versus

```
$ nu -c 'echo test' | xxd
00000000: 7465 7374 0a                             test.
```
2020-03-09 08:18:24 +13:00
50fb97f6b7 Merge env into $nu and simplify table/get (#1463) 2020-03-08 18:33:30 +13:00
ebf139f5e5 Auto-detect string / binary in save command (#1459)
* Auto-detect string / binary in save command

* Linter
2020-03-08 07:33:29 +13:00
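The auto-detection above can be done by attempting UTF-8 validation on the payload. A minimal sketch of that decision, not the commit's actual implementation:

```rust
use std::str;

/// What `save` should write: text if the payload is valid UTF-8,
/// raw bytes otherwise. A simplified stand-in for the commit's logic.
enum Payload<'a> {
    Text(&'a str),
    Binary(&'a [u8]),
}

fn classify(data: &[u8]) -> Payload<'_> {
    match str::from_utf8(data) {
        Ok(s) => Payload::Text(s),
        Err(_) => Payload::Binary(data),
    }
}

fn main() {
    assert!(matches!(classify(b"hello"), Payload::Text(_)));
    // 0xFF can never appear in valid UTF-8, so this is binary.
    assert!(matches!(classify(&[0xFF, 0xFE, 0x00]), Payload::Binary(_)));
}
```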
8925ca5da3 Move to bytes/string hybrid codec (#1457)
* WIP: move to bytes codec

* Progress on adding collect helpers

* Progress on adding collect helpers

* Add in line splitting back to lines

* Lines outputting line primitives

* Close to ready?

* Finish fixing lines

* clippy fixes

* fmt fixes

* removed unused code

* Cleanup a few bits

* Cleanup a few bits

* Cleanup a few more bits

* Fix failing test with corrected test case
2020-03-07 05:06:39 +13:00
287652573b Fix and refactor cd for Filesystem Shell. (#1452)
* Fix and refactor cd for Filesystem Shell.
Reorder check conditions, don't check existence twice.
If building for unix check exec bit on folder.

* Import PermissionsExt only on unix target.

* It seems that this is the correct way?
2020-03-06 20:13:47 +13:00
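The unix-only execute-bit check mentioned above can be sketched as follows; this is an illustrative reconstruction, not the PR's exact code (a directory can only be entered if some execute bit is set):

```rust
// Sketch: a directory is enterable on unix only if at least one of the
// owner/group/other execute bits (0o111) permits traversal.
#[cfg(unix)]
fn is_enterable(path: &std::path::Path) -> std::io::Result<bool> {
    use std::os::unix::fs::PermissionsExt;
    let meta = std::fs::metadata(path)?;
    Ok(meta.is_dir() && meta.permissions().mode() & 0o111 != 0)
}

#[cfg(unix)]
fn main() -> std::io::Result<()> {
    println!("/ enterable: {}", is_enterable(std::path::Path::new("/"))?);
    Ok(())
}

#[cfg(not(unix))]
fn main() {}
```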
db24ad8f36 Add --num parameter to limit the number of output lines (#1455)
Add `--num` parameter to limit the number of returned elements
2020-03-05 05:26:46 -05:00
f88674f353 Nu internals are logged under nu filter. (#1451) 2020-03-05 05:18:53 -05:00
59cb0ba381 Color commands appropriately. (#1453) 2020-03-04 23:22:42 -05:00
c4cfab5e16 Make feature options available downstream to nu-cli subcrate. (#1450) 2020-03-04 15:31:12 -05:00
b2c5af457e Move most of the root package into a subcrate. (#1445)
This improves incremental build time when working on what was previously
the root package. For example, previously all plugins would be rebuilt
with a change to `src/commands/classified/external.rs`, but now only
`nu-cli` will have to be rebuilt (and anything that depends on it).
2020-03-04 13:58:20 -05:00
c731a5b628 Columns can be renamed. (#1447) 2020-03-03 16:01:24 -05:00
f97f9d4af3 Update deps lockfile (#1446)
Update deps lockfile
2020-03-03 15:34:22 -05:00
ed7d3fed66 Add shuffle plugin (#1443)
* Add shuffle plugin

see #1437

* Change plugin to integrate into nu structure and build system
2020-03-03 08:44:12 +13:00
7304d06c0b Use threads to avoid blocking reads/writes in externals. (#1440)
In particular, one thing that we can't (properly) do before this commit
is consuming an infinite input stream. For example:

```
yes | grep y | head -n10
```

will give 10 "y"s in most shells, but blocks indefinitely in nu. This PR
resolves that by doing blocking I/O in threads, and reducing the `await`
calls we currently have in our pipeline code.
2020-03-02 06:19:09 +13:00
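The approach described above can be sketched with a dedicated reader thread feeding a channel, so the consumer never blocks on the producer's I/O. This is a simplified stand-in (using an in-memory reader instead of a real external process) for what the PR does:

```rust
use std::io::{BufRead, BufReader};
use std::sync::mpsc;
use std::thread;

// Do the blocking read on its own thread and hand lines to the consumer
// over a channel; the consumer can stop early (like `head -n10`) even if
// the producer is effectively infinite.
fn take_lines<R: std::io::Read + Send + 'static>(reader: R, n: usize) -> Vec<String> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        for line in BufReader::new(reader).lines() {
            // Stop on read error, or when the receiver hung up because
            // the consumer already took what it needed.
            if line.is_err() || tx.send(line.unwrap()).is_err() {
                break;
            }
        }
    });
    rx.into_iter().take(n).collect()
}

fn main() {
    // An in-memory stand-in for an external producer like `yes`.
    let data = "y\n".repeat(100);
    let lines = take_lines(std::io::Cursor::new(data), 10);
    assert_eq!(lines.len(), 10);
    println!("took {} lines without blocking the consumer", lines.len());
}
```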
ca615d9389 Bump to 0.10.1 (#1442) 2020-03-01 20:59:13 +13:00
6d096206b6 Add support for compound shorthand flags (#1414)
* Break multicharacter shorthand flags into single character flags

* Remove shorthand flag test
2020-03-01 13:20:42 +13:00
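Breaking a multicharacter shorthand into single-character flags, as that commit describes, can be sketched like this (an illustrative helper, not nu's actual parser code):

```rust
/// Expand a compound shorthand like `-la` into `-l` and `-a`.
/// Long flags (`--all`) and non-flag arguments pass through untouched.
fn expand_shorthand(arg: &str) -> Vec<String> {
    if arg.len() > 2 && arg.starts_with('-') && !arg.starts_with("--") {
        arg.chars().skip(1).map(|c| format!("-{}", c)).collect()
    } else {
        vec![arg.to_string()]
    }
}

fn main() {
    assert_eq!(expand_shorthand("-la"), vec!["-l", "-a"]);
    assert_eq!(expand_shorthand("--all"), vec!["--all"]);
    assert_eq!(expand_shorthand("-l"), vec!["-l"]);
    println!("ok");
}
```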
2a8cb24309 Add support for downloading unsupported mime types (#1441) 2020-03-01 13:14:36 +13:00
8d38743e27 Add docs for debug (#1438)
* Add docs for `debug`

* Put debug docs in right folder
Also fixed minor spacing problem
2020-03-01 04:09:28 +13:00
eabfa2de54 Let ls ignore permission errors (#1435)
* Create a function to create an empty directory entry

* Print an empty directory entry if permission is denied

* Fix rustfmt whitespace issues.

* Made metadata optional for `dir_entry_dict`.

Removed `empty_dir_entry_dict` as it's not needed anymore.
2020-02-29 14:33:52 +13:00
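Making metadata optional, as the bullets above describe, means a permission error on one entry degrades that entry instead of failing the whole listing. A minimal sketch of the pattern (not the `dir_entry_dict` implementation itself):

```rust
use std::fs;
use std::path::PathBuf;

// When metadata can't be read (e.g. permission denied), keep the entry
// with `None` metadata instead of aborting the whole listing.
fn list(dir: &str) -> std::io::Result<Vec<(PathBuf, Option<fs::Metadata>)>> {
    let mut out = Vec::new();
    for entry in fs::read_dir(dir)? {
        let entry = entry?;
        // `.ok()` turns a per-entry metadata error into None.
        out.push((entry.path(), entry.metadata().ok()));
    }
    Ok(out)
}

fn main() -> std::io::Result<()> {
    for (path, meta) in list(".")? {
        println!("{} (metadata available: {})", path.display(), meta.is_some());
    }
    Ok(())
}
```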
a86a0abb90 Plugin documentation (#1431)
* Add very basic documentation. Need to play with the rest of the API to figure out what it does

* Add some documentation to more of the Plugin API methods

* fmt
2020-02-24 15:28:46 +13:00
adcda450d5 Update LICENSE 2020-02-21 10:49:46 +13:00
147b9d4436 Add Better-TOML (#1417) 2020-02-19 16:59:42 -05:00
c43a58d9d6 Fix incorrect display for zero-size files (#1422) 2020-02-19 09:57:58 -05:00
e38442782e Command documentation for du (#1416) 2020-02-19 09:55:22 +13:00
b98f893217 add a touch command (#1399) 2020-02-19 09:54:32 +13:00
261 changed files with 2418 additions and 1860 deletions

.dockerignore (new file, 1 line added)

@@ -0,0 +1 @@
+target

.gitignore (vendored, 1 line added)

@@ -4,6 +4,7 @@
 history.txt
 tests/fixtures/nuplayground
 crates/*/target
 # Debian/Ubuntu
 debian/.debhelper/
 debian/debhelper-build-stamp

@@ -26,3 +26,4 @@ vscode:
 - serayuzgur.crates@0.4.7:HMkoguLcXp9M3ud7ac3eIw==
 - belfz.search-crates-io@1.2.1:kSLnyrOhXtYPjQpKnMr4eQ==
 - vadimcn.vscode-lldb@1.4.5:lwHCNwtm0kmOBXeQUIPGMQ==
+- bungcip.better-toml@0.3.2:3QfgGxxYtGHfJKQU7H0nEw==

Cargo.lock (generated, 323 changed lines)

@@ -379,9 +379,9 @@ checksum = "4785bdd1c96b2a846b2bd7cc02e86b6b3dbf14e7e53446c4f54c92a361040822"
 [[package]]
 name = "chrono"
-version = "0.4.10"
+version = "0.4.11"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "31850b4a4d6bae316f7a09e691c944c28299298837edc0a03f755618c23cbc01"
+checksum = "80094f509cf8b5ae86a4966a39b3ff66cd7e2a3e594accec3743ff3fabeab5b2"
 dependencies = [
  "num-integer",
  "num-traits 0.2.11",
@@ -673,11 +673,11 @@ dependencies = [
 [[package]]
 name = "ctrlc"
-version = "3.1.3"
+version = "3.1.4"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "c7dfd2d8b4c82121dfdff120f818e09fc4380b0b7e17a742081a89b94853e87f"
+checksum = "7a4ba686dff9fa4c1c9636ce1010b0cf98ceb421361b0bb3d6faeec43bd217a7"
 dependencies = [
- "nix 0.14.1",
+ "nix 0.17.0",
  "winapi 0.3.8",
 ]
@@ -912,9 +912,9 @@ dependencies = [
 [[package]]
 name = "failure"
-version = "0.1.6"
+version = "0.1.7"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "f8273f13c977665c5db7eb2b99ae520952fe5ac831ae4cd09d80c4c7042b5ed9"
+checksum = "b8529c2421efa3066a5cbd8063d2244603824daccb6936b079010bb2aa89464b"
 dependencies = [
  "backtrace",
 ]
@@ -1205,6 +1205,18 @@ dependencies = [
  "syn",
 ]
+[[package]]
+name = "getset"
+version = "0.1.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "f62a139c59ae846c3964c392f12aac68f1997d1a40e9d3b40a89a4ab553e04a0"
+dependencies = [
+ "proc-macro-error",
+ "proc-macro2",
+ "quote",
+ "syn",
+]
 [[package]]
 name = "ghost"
 version = "0.1.1"
@@ -1702,6 +1714,15 @@ dependencies = [
  "either",
 ]
+[[package]]
+name = "itertools"
+version = "0.9.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "284f18f85651fe11e8a991b2adb42cb078325c996ed026d994719efcfca1d54b"
+dependencies = [
+ "either",
+]
 [[package]]
 name = "itoa"
 version = "0.4.5"
@@ -1830,9 +1851,9 @@ dependencies = [
 [[package]]
 name = "libsqlite3-sys"
-version = "0.16.0"
+version = "0.17.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5e5b95e89c330291768dc840238db7f9e204fd208511ab6319b56193a7f2ae25"
+checksum = "266eb8c361198e8d1f682bc974e5d9e2ae90049fb1943890904d11dad7d4a77d"
 dependencies = [
  "cc",
  "pkg-config",
@@ -2049,6 +2070,15 @@ version = "0.3.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "fd659d7d6b4554da2c0e7a486d5952b24dfce0e0bac88ab53b270f4efe1010a6"
+[[package]]
+name = "natural"
+version = "0.5.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "65e6b44f8ddc659cde3555e0140d3441ad26cb03a1410774af1f9a19097c1867"
+dependencies = [
+ "rust-stemmers",
+]
 [[package]]
 name = "neso"
 version = "0.5.0"
@@ -2113,6 +2143,19 @@ dependencies = [
  "void",
 ]
+[[package]]
+name = "nix"
+version = "0.17.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "50e4785f2c3b7589a0d0c1dd60285e1188adac4006e8abd6dd578e1567027363"
+dependencies = [
+ "bitflags",
+ "cc",
+ "cfg-if",
+ "libc",
+ "void",
+]
 [[package]]
 name = "nodrop"
 version = "0.1.14"
@@ -2189,46 +2232,16 @@ dependencies = [
 [[package]]
 name = "nu"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
- "ansi_term 0.12.1",
- "app_dirs",
- "async-stream",
- "base64 0.11.0",
- "bigdecimal",
- "bson",
- "byte-unit",
- "bytes 0.4.12",
- "calamine",
- "cfg-if",
- "chrono",
  "clap",
- "clipboard",
  "crossterm 0.16.0",
- "csv",
  "ctrlc",
- "derive-new",
- "dirs 2.0.2",
  "dunce",
- "filesize",
  "futures 0.3.3",
- "futures-util",
- "futures_codec",
- "getset",
- "git2",
- "glob",
- "hex 0.4.0",
- "ichwh",
- "indexmap",
- "itertools 0.8.2",
- "language-reporting",
  "log",
- "meval",
- "natural",
- "nom 5.1.0",
- "nom-tracable",
- "nom_locate",
  "nu-build",
+ "nu-cli",
  "nu-errors",
  "nu-macros",
  "nu-parser",
@@ -2249,9 +2262,77 @@ dependencies = [
  "nu_plugin_sys",
  "nu_plugin_textview",
  "nu_plugin_tree",
+ "onig_sys",
+ "pretty_assertions",
+ "pretty_env_logger 0.4.0",
+ "semver",
+ "serde 1.0.104",
+ "syntect",
+ "toml 0.5.6",
+ "url",
+]
+[[package]]
+name = "nu-build"
+version = "0.11.0"
+dependencies = [
+ "lazy_static 1.4.0",
+ "serde 1.0.104",
+ "serde_json",
+ "toml 0.5.6",
+]
+[[package]]
+name = "nu-cli"
+version = "0.11.0"
+dependencies = [
+ "ansi_term 0.12.1",
+ "app_dirs",
+ "async-stream",
+ "base64 0.11.0",
+ "bigdecimal",
+ "bson",
+ "byte-unit",
+ "bytes 0.5.4",
+ "calamine",
+ "cfg-if",
+ "chrono",
+ "clap",
+ "clipboard",
+ "csv",
+ "ctrlc",
+ "derive-new",
+ "dirs 2.0.2",
+ "dunce",
+ "filesize",
+ "futures 0.3.3",
+ "futures-util",
+ "futures_codec",
+ "getset 0.1.0",
+ "git2",
+ "glob",
+ "hex 0.4.0",
+ "ichwh",
+ "indexmap",
+ "itertools 0.9.0",
+ "language-reporting",
+ "log",
+ "meval",
+ "natural 0.5.0",
+ "nom 5.1.0",
+ "nom-tracable",
+ "nom_locate",
+ "nu-build",
+ "nu-errors",
+ "nu-macros",
+ "nu-parser",
+ "nu-plugin",
+ "nu-protocol",
+ "nu-source",
+ "nu-test-support",
+ "nu-value-ext",
  "num-bigint",
  "num-traits 0.2.11",
- "onig_sys",
  "parking_lot",
  "pin-utils",
  "pretty-hex",
@@ -2260,11 +2341,11 @@ dependencies = [
  "prettytable-rs",
  "ptree",
  "query_interface",
+ "rand",
  "regex",
  "roxmltree",
  "rusqlite",
  "rustyline",
- "semver",
  "serde 1.0.104",
  "serde-hjson 0.9.1",
  "serde_bytes",
@@ -2272,10 +2353,9 @@ dependencies = [
  "serde_json",
  "serde_urlencoded",
  "serde_yaml",
- "shellexpand",
+ "shellexpand 2.0.0",
  "starship",
  "strip-ansi-escapes",
- "syntect",
  "tempfile",
  "term",
  "termcolor",
@@ -2285,29 +2365,18 @@ dependencies = [
  "typetag",
  "umask",
  "unicode-xid",
- "url",
  "users",
  "which",
 ]
-[[package]]
-name = "nu-build"
-version = "0.10.0"
-dependencies = [
- "lazy_static 1.4.0",
- "serde 1.0.104",
- "serde_json",
- "toml 0.5.6",
-]
 [[package]]
 name = "nu-errors"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "ansi_term 0.12.1",
  "bigdecimal",
  "derive-new",
- "getset",
+ "getset 0.0.9",
  "language-reporting",
  "nom 5.1.0",
  "nom_locate",
@@ -2323,21 +2392,21 @@ dependencies = [
 [[package]]
 name = "nu-macros"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "nu-protocol",
 ]
 [[package]]
 name = "nu-parser"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "ansi_term 0.12.1",
  "bigdecimal",
  "cfg-if",
  "derive-new",
  "enumflags2",
- "getset",
+ "getset 0.0.9",
  "indexmap",
  "itertools 0.8.2",
  "language-reporting",
@@ -2356,14 +2425,14 @@ dependencies = [
  "pretty_env_logger 0.3.1",
  "ptree",
  "serde 1.0.104",
- "shellexpand",
+ "shellexpand 1.1.1",
  "termcolor",
  "unicode-xid",
 ]
 [[package]]
 name = "nu-plugin"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "indexmap",
  "nu-build",
@@ -2378,17 +2447,17 @@ dependencies = [
 [[package]]
 name = "nu-protocol"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "ansi_term 0.12.1",
  "bigdecimal",
  "byte-unit",
  "chrono",
  "derive-new",
- "getset",
+ "getset 0.0.9",
  "indexmap",
  "language-reporting",
- "natural",
+ "natural 0.3.0",
  "nom 5.1.0",
  "nom-tracable",
  "nom_locate",
@@ -2408,10 +2477,10 @@ dependencies = [
 [[package]]
 name = "nu-source"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "derive-new",
- "getset",
+ "getset 0.0.9",
  "language-reporting",
  "nom-tracable",
  "nom_locate",
@@ -2423,11 +2492,11 @@ dependencies = [
 [[package]]
 name = "nu-test-support"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "app_dirs",
  "dunce",
- "getset",
+ "getset 0.0.9",
  "glob",
  "indexmap",
  "nu-build",
@@ -2439,7 +2508,7 @@ dependencies = [
 [[package]]
 name = "nu-value-ext"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "indexmap",
  "itertools 0.8.2",
@@ -2453,7 +2522,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_average"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "nu-build",
  "nu-errors",
@@ -2464,7 +2533,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_binaryview"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "ansi_term 0.12.1",
  "crossterm 0.14.2",
@@ -2481,7 +2550,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_fetch"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "futures 0.3.3",
  "nu-build",
@@ -2495,7 +2564,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_inc"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "nu-build",
  "nu-errors",
@@ -2508,7 +2577,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_match"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "futures 0.3.3",
  "nu-build",
@@ -2521,7 +2590,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_post"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "base64 0.11.0",
  "futures 0.3.3",
@@ -2538,7 +2607,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_ps"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "futures 0.3.3",
  "futures-timer 3.0.1",
@@ -2553,7 +2622,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_str"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "chrono",
  "nu-build",
@@ -2568,7 +2637,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_sum"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "nu-build",
  "nu-errors",
@@ -2579,7 +2648,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_sys"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "battery",
  "futures 0.3.3",
@@ -2594,7 +2663,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_textview"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "ansi_term 0.12.1",
  "crossterm 0.14.2",
@@ -2610,7 +2679,7 @@ dependencies = [
 [[package]]
 name = "nu_plugin_tree"
-version = "0.10.0"
+version = "0.11.0"
 dependencies = [
  "derive-new",
  "nu-build",
@@ -2762,9 +2831,9 @@ dependencies = [
 [[package]]
 name = "open"
-version = "1.3.3"
+version = "1.4.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3dfa632621d66502e1e9298c038d903090fc810a33cc1e6a02958fa0be65e3fb"
+checksum = "7c283bf0114efea9e42f1a60edea9859e8c47528eae09d01df4b29c1e489cc48"
 dependencies = [
  "winapi 0.3.8",
 ]
@@ -2811,15 +2880,12 @@ checksum = "a86ed3f5f244b372d6b1a00b72ef7f8876d0bc6a78a4c9985c53614041512063"
 [[package]]
 name = "os_info"
-version = "1.3.3"
+version = "2.0.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "46c6031e9373f6942a00933638731c7f4543f265b4bd920a1230fbcd62dfdf0c"
+checksum = "cf0044ce3b28b09ffb3ef188c81dbc6592999366d153dccdc065045ee54717f7"
 dependencies = [
- "lazy_static 1.4.0",
  "log",
- "regex",
  "serde 1.0.104",
- "serde_derive",
  "winapi 0.3.8",
 ]
@@ -3022,6 +3088,32 @@ dependencies = [
  "unicode-width",
 ]
+[[package]]
+name = "proc-macro-error"
+version = "0.4.11"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e7959c6467d962050d639361f7703b2051c43036d03493c36f01d440fdd3138a"
+dependencies = [
+ "proc-macro-error-attr",
+ "proc-macro2",
+ "quote",
+ "syn",
+ "version_check 0.9.1",
+]
+[[package]]
+name = "proc-macro-error-attr"
+version = "0.4.11"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e4002d9f55991d5e019fb940a90e1a95eb80c24e77cb2462dd4dc869604d543a"
+dependencies = [
+ "proc-macro2",
+ "quote",
+ "syn",
+ "syn-mid",
+ "version_check 0.9.1",
+]
 [[package]]
 name = "proc-macro-hack"
 version = "0.5.11"
@@ -3322,9 +3414,9 @@ dependencies = [
 [[package]]
 name = "rusqlite"
-version = "0.20.0"
+version = "0.21.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2a194373ef527035645a1bc21b10dc2125f73497e6e155771233eb187aedd051"
+checksum = "64a656821bb6317a84b257737b7934f79c0dbb7eb694710475908280ebad3e64"
 dependencies = [
  "bitflags",
  "fallible-iterator",
@@ -3353,6 +3445,16 @@ version = "0.13.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 checksum = "3e52c148ef37f8c375d49d5a73aa70713125b7f19095948a923f80afdeb22ec2"
+[[package]]
+name = "rust-stemmers"
+version = "1.2.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "e46a2036019fdb888131db7a4c847a1063a7493f971ed94ea82c67eada63ca54"
+dependencies = [
+ "serde 1.0.104",
+ "serde_derive",
+]
 [[package]]
 name = "rustc-demangle"
 version = "0.1.16"
@@ -3584,9 +3686,9 @@ dependencies = [
 [[package]]
 name = "serde_json"
-version = "1.0.47"
+version = "1.0.48"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "15913895b61e0be854afd32fd4163fcd2a3df34142cf2cb961b310ce694cbf90"
+checksum = "9371ade75d4c2d6cb154141b9752cf3781ec9c05e0e5cf35060e1e70ee7b9c25"
 dependencies = [
  "indexmap",
  "itoa",
@@ -3646,6 +3748,15 @@ dependencies = [
  "dirs 2.0.2",
 ]
+[[package]]
+name = "shellexpand"
+version = "2.0.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "9a2b22262a9aaf9464d356f656fea420634f78c881c5eebd5ef5e66d8b9bc603"
+dependencies = [
+ "dirs 2.0.2",
+]
 [[package]]
 name = "signal-hook"
 version = "0.1.13"
@@ -3716,9 +3827,9 @@ checksum = "6e63cff320ae2c57904679ba7cb63280a3dc4613885beafb148ee7bf9aa9042d"
 [[package]]
 name = "starship"
-version = "0.35.1"
+version = "0.37.0"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5e9fd041a10d378370baa43669ccf11a8385a495d492ce57e5d144f83007a8ba"
+checksum = "588e250206eec64cee803f0d7c1d1f7c1d3ba7e123d513f0311b172d07a0f755"
 dependencies = [
  "ansi_term 0.12.1",
  "battery",
@@ -3736,6 +3847,7 @@ dependencies = [
  "path-slash",
  "pretty_env_logger 0.4.0",
  "rayon",
+ "regex",
  "reqwest",
  "serde_json",
  "starship_module_config_derive",
@@ -3814,6 +3926,17 @@ dependencies = [
  "unicode-xid",
 ]
+[[package]]
+name = "syn-mid"
+version = "0.5.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "7be3539f6c128a931cf19dcee741c1af532c7fd387baa739c03dd2e96479338a"
+dependencies = [
+ "proc-macro2",
+ "quote",
+ "syn",
+]
 [[package]]
 name = "syntect"
 version = "3.2.0"
@@ -3838,9 +3961,9 @@ dependencies = [
 [[package]]
 name = "sysinfo"
-version = "0.10.5"
+version = "0.11.6"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b8834e42be61ae4f6338b216fbb69837c7f33c3d4d3a139fb073735b25af4d9e"
+checksum = "0bf9aa50e31b8d8ded9eafa135b4faf717ce308c35a7c919e1ba4cdb10ae1bfe"
 dependencies = [
  "cfg-if",
  "doc-comment",
@@ -4412,9 +4535,9 @@ dependencies = [
 [[package]]
 name = "which"
-version = "3.1.0"
+version = "3.1.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5475d47078209a02e60614f7ba5e645ef3ed60f771920ac1906d7c1cc65024c8"
+checksum = "d011071ae14a2f6671d0b74080ae0cd8ebf3a6f8c9589a2cd45f23126fe29724"
 dependencies = [
  "failure",
  "libc",

Cargo.toml

@@ -1,6 +1,6 @@
 [package]
 name = "nu"
-version = "0.10.0"
+version = "0.11.0"
 authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
 description = "A shell for the GitHub era"
 license = "MIT"
@@ -13,137 +13,60 @@ documentation = "https://www.nushell.sh/book/"
 exclude = ["images"]
 [workspace]
-members = [
-    "crates/nu-macros",
-    "crates/nu-errors",
-    "crates/nu-source",
-    "crates/nu_plugin_average",
-    "crates/nu_plugin_binaryview",
-    "crates/nu_plugin_fetch",
-    "crates/nu_plugin_inc",
-    "crates/nu_plugin_match",
-    "crates/nu_plugin_post",
-    "crates/nu_plugin_ps",
-    "crates/nu_plugin_str",
-    "crates/nu_plugin_sum",
-    "crates/nu_plugin_sys",
-    "crates/nu_plugin_textview",
-    "crates/nu_plugin_tree",
-    "crates/nu-protocol",
-    "crates/nu-plugin",
-    "crates/nu-parser",
-    "crates/nu-value-ext",
-    "crates/nu-build"
-]
+members = ["crates/*/"]
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-nu-source = { version = "0.10.0", path = "./crates/nu-source" }
-nu-plugin = { version = "0.10.0", path = "./crates/nu-plugin" }
-nu-protocol = { version = "0.10.0", path = "./crates/nu-protocol" }
-nu-errors = { version = "0.10.0", path = "./crates/nu-errors" }
-nu-parser = { version = "0.10.0", path = "./crates/nu-parser" }
-nu-value-ext = { version = "0.10.0", path = "./crates/nu-value-ext" }
-nu_plugin_average = { version = "0.10.0", path = "./crates/nu_plugin_average", optional=true }
-nu_plugin_binaryview = { version = "0.10.0", path = "./crates/nu_plugin_binaryview", optional=true }
-nu_plugin_fetch = { version = "0.10.0", path = "./crates/nu_plugin_fetch", optional=true }
-nu_plugin_inc = { version = "0.10.0", path = "./crates/nu_plugin_inc", optional=true }
-nu_plugin_match = { version = "0.10.0", path = "./crates/nu_plugin_match", optional=true }
-nu_plugin_post = { version = "0.10.0", path = "./crates/nu_plugin_post", optional=true }
-nu_plugin_ps = { version = "0.10.0", path = "./crates/nu_plugin_ps", optional=true }
-nu_plugin_str = { version = "0.10.0", path = "./crates/nu_plugin_str", optional=true }
-nu_plugin_sum = { version = "0.10.0", path = "./crates/nu_plugin_sum", optional=true }
-nu_plugin_sys = { version = "0.10.0", path = "./crates/nu_plugin_sys", optional=true }
-nu_plugin_textview = { version = "0.10.0", path = "./crates/nu_plugin_textview", optional=true }
-nu_plugin_tree = { version = "0.10.0", path = "./crates/nu_plugin_tree", optional=true }
-nu-macros = { version = "0.10.0", path = "./crates/nu-macros" }
-query_interface = "0.3.5"
-typetag = "0.1.4"
-rustyline = "6.0.0"
-chrono = { version = "0.4.10", features = ["serde"] }
-derive-new = "0.5.8"
-prettytable-rs = "0.8.0"
-itertools = "0.8.2"
-ansi_term = "0.12.1"
-nom = "5.0.1"
+nu-cli = { version = "0.11.0", path = "./crates/nu-cli" }
+nu-source = { version = "0.11.0", path = "./crates/nu-source" }
+nu-plugin = { version = "0.11.0", path = "./crates/nu-plugin" }
+nu-protocol = { version = "0.11.0", path = "./crates/nu-protocol" }
+nu-errors = { version = "0.11.0", path = "./crates/nu-errors" }
+nu-parser = { version = "0.11.0", path = "./crates/nu-parser" }
+nu-value-ext = { version = "0.11.0", path = "./crates/nu-value-ext" }
+nu_plugin_average = { version = "0.11.0", path = "./crates/nu_plugin_average", optional=true }
+nu_plugin_binaryview = { version = "0.11.0", path = "./crates/nu_plugin_binaryview", optional=true }
+nu_plugin_fetch = { version = "0.11.0", path = "./crates/nu_plugin_fetch", optional=true }
+nu_plugin_inc = { version = "0.11.0", path = "./crates/nu_plugin_inc", optional=true }
+nu_plugin_match = { version = "0.11.0", path = "./crates/nu_plugin_match", optional=true }
+nu_plugin_post = { version = "0.11.0", path = "./crates/nu_plugin_post", optional=true }
+nu_plugin_ps = { version = "0.11.0", path = "./crates/nu_plugin_ps", optional=true }
+nu_plugin_str = { version = "0.11.0", path = "./crates/nu_plugin_str", optional=true }
+nu_plugin_sum = { version = "0.11.0", path = "./crates/nu_plugin_sum", optional=true }
+nu_plugin_sys = { version = "0.11.0", path = "./crates/nu_plugin_sys", optional=true }
+nu_plugin_textview = { version = "0.11.0", path = "./crates/nu_plugin_textview", optional=true }
+nu_plugin_tree = { version = "0.11.0", path = "./crates/nu_plugin_tree", optional=true }
+nu-macros = { version = "0.11.0", path = "./crates/nu-macros" }
+crossterm = { version = "0.16.0", optional = true }
+onig_sys = { version = "=69.1.0", optional = true }
+semver = { version = "0.9.0", optional = true }
+syntect = { version = "3.2.0", optional = true }
+url = { version = "2.1.1", optional = true }
+clap = "2.33.0"
+ctrlc = "3.1.4"
 dunce = "1.0.0"
-indexmap = { version = "1.3.2", features = ["serde-1"] }
-byte-unit = "3.0.3"
-base64 = "0.11"
 futures = { version = "0.3", features = ["compat", "io-compat"] }
-async-stream = "0.2"
-futures_codec = "0.4"
-num-traits = "0.2.11"
-term = "0.5.2"
-bytes = "0.4.12"
 log = "0.4.8"
 pretty_env_logger = "0.4.0"
-serde = { version = "1.0.104", features = ["derive"] }
-bson = { version = "0.14.0", features = ["decimal128"] }
-serde_json = "1.0.47"
-serde-hjson = "0.9.1"
-serde_yaml = "0.8"
-serde_bytes = "0.11.3"
-getset = "0.0.9"
-language-reporting = "0.4.0"
-app_dirs = "1.2.1"
-csv = "1.1"
+[dev-dependencies]
+pretty_assertions = "0.6.1"
+nu-test-support = { version = "0.11.0", path = "./crates/nu-test-support" }
+[build-dependencies]
 toml = "0.5.6"
-clap = "2.33.0"
-git2 = { version = "0.11.0", default_features = false }
-dirs = "2.0.2"
-glob = "0.3.0"
-ctrlc = "3.1.3"
-roxmltree = "0.9.1"
-nom_locate = "1.0.0"
-nom-tracable = "0.4.1"
-unicode-xid = "0.2.0"
-serde_ini = "0.2.0"
-pretty-hex = "0.1.1"
-hex = "0.4"
-tempfile = "3.1.0"
-which = "3.1.0"
-ichwh = "0.3"
-textwrap = {version = "0.11.0", features = ["term_size"]}
-shellexpand = "1.1.1"
-pin-utils = "0.1.0-alpha.4"
-num-bigint = { version = "0.2.6", features = ["serde"] }
-bigdecimal = { version = "0.1.0", features = ["serde"] }
-serde_urlencoded = "0.6.1"
-trash = "1.0.0"
-regex = "1"
-cfg-if = "0.1"
-strip-ansi-escapes = "0.1.0"
-calamine = "0.16"
-umask = "0.1"
-futures-util = "0.3.4"
-termcolor = "1.1.0"
-natural = "0.3.0"
-parking_lot = "0.10.0"
-meval = "0.2"
-clipboard = {version = "0.5", optional = true }
-ptree = {version = "0.2" }
-starship = { version = "0.35.1", optional = true}
-syntect = {version = "3.2.0", optional = true }
-onig_sys = {version = "=69.1.0", optional = true }
-crossterm = {version = "0.16.0", optional = true}
-url = {version = "2.1.1", optional = true}
-semver = {version = "0.9.0", optional = true}
-filesize = "0.1.0"
-[target.'cfg(unix)'.dependencies]
-users = "0.9"
+serde = { version = "1.0.104", features = ["derive"] }
+nu-build = { version = "0.11.0", path = "./crates/nu-build" }
 [features]
 # Test executables
 test-bins = []
 default = ["sys", "ps", "textview", "inc", "str"]
-stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard"]
+stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard-cli"]
 # Default
 textview = ["crossterm", "syntect", "onig_sys", "url", "nu_plugin_textview"]
@@ -158,28 +81,12 @@ binaryview = ["nu_plugin_binaryview"]
 fetch = ["nu_plugin_fetch"]
 match = ["nu_plugin_match"]
 post = ["nu_plugin_post"]
-starship-prompt = ["starship"]
 sum = ["nu_plugin_sum"]
 trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"] tree = ["nu_plugin_tree"]
[dependencies.rusqlite] clipboard-cli = ["nu-cli/clipboard-cli"]
version = "0.20.0" starship-prompt = ["nu-cli/starship-prompt"]
features = ["bundled", "blob"]
[dev-dependencies]
pretty_assertions = "0.6.1"
nu-test-support = { version = "0.10.0", path = "./crates/nu-test-support" }
[build-dependencies]
toml = "0.5.6"
serde = { version = "1.0.104", features = ["derive"] }
nu-build = { version = "0.10.0", path = "./crates/nu-build" }
[lib]
name = "nu"
doctest = false
path = "src/lib.rs"
[[bin]] [[bin]]
name = "fail" name = "fail"
@ -201,6 +108,11 @@ name = "nonu"
path = "crates/nu-test-support/src/bins/nonu.rs" path = "crates/nu-test-support/src/bins/nonu.rs"
required-features = ["test-bins"] required-features = ["test-bins"]
[[bin]]
name = "iecho"
path = "crates/nu-test-support/src/bins/iecho.rs"
required-features = ["test-bins"]
# Core plugins that ship with `cargo install nu` by default # Core plugins that ship with `cargo install nu` by default
# Currently, Cargo limits us to installing only one binary # Currently, Cargo limits us to installing only one binary
# unless we use [[bin]], so we use this as a workaround # unless we use [[bin]], so we use this as a workaround
@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2019 Yehuda Katz, Jonathan Turner
+Copyright (c) 2019 - 2020 Yehuda Katz, Jonathan Turner
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
@ -1,6 +1,6 @@
[package] [package]
name = "nu-build" name = "nu-build"
version = "0.10.0" version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"] authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018" edition = "2018"
description = "Core build system for nushell" description = "Core build system for nushell"

crates/nu-cli/Cargo.toml (new file, 109 lines)

@@ -0,0 +1,109 @@
[package]
name = "nu-cli"
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "CLI for nushell"
edition = "2018"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-source = { version = "0.11.0", path = "../nu-source" }
nu-plugin = { version = "0.11.0", path = "../nu-plugin" }
nu-protocol = { version = "0.11.0", path = "../nu-protocol" }
nu-errors = { version = "0.11.0", path = "../nu-errors" }
nu-parser = { version = "0.11.0", path = "../nu-parser" }
nu-value-ext = { version = "0.11.0", path = "../nu-value-ext" }
nu-macros = { version = "0.11.0", path = "../nu-macros" }
nu-test-support = { version = "0.11.0", path = "../nu-test-support" }
ansi_term = "0.12.1"
app_dirs = "1.2.1"
async-stream = "0.2"
base64 = "0.11"
bigdecimal = { version = "0.1.0", features = ["serde"] }
bson = { version = "0.14.0", features = ["decimal128"] }
byte-unit = "3.0.3"
bytes = "0.5.4"
calamine = "0.16"
cfg-if = "0.1"
chrono = { version = "0.4.11", features = ["serde"] }
clap = "2.33.0"
csv = "1.1"
ctrlc = "3.1.4"
derive-new = "0.5.8"
dirs = "2.0.2"
dunce = "1.0.0"
filesize = "0.1.0"
futures = { version = "0.3", features = ["compat", "io-compat"] }
futures-util = "0.3.4"
futures_codec = "0.4"
getset = "0.1.0"
git2 = { version = "0.11.0", default_features = false }
glob = "0.3.0"
hex = "0.4"
ichwh = "0.3"
indexmap = { version = "1.3.2", features = ["serde-1"] }
itertools = "0.9.0"
language-reporting = "0.4.0"
log = "0.4.8"
meval = "0.2"
natural = "0.5.0"
nom = "5.0.1"
nom-tracable = "0.4.1"
nom_locate = "1.0.0"
num-bigint = { version = "0.2.6", features = ["serde"] }
num-traits = "0.2.11"
parking_lot = "0.10.0"
pin-utils = "0.1.0-alpha.4"
pretty-hex = "0.1.1"
pretty_env_logger = "0.4.0"
prettytable-rs = "0.8.0"
ptree = {version = "0.2" }
query_interface = "0.3.5"
rand = "0.7"
regex = "1"
roxmltree = "0.9.1"
rustyline = "6.0.0"
serde = { version = "1.0.104", features = ["derive"] }
serde-hjson = "0.9.1"
serde_bytes = "0.11.3"
serde_ini = "0.2.0"
serde_json = "1.0.48"
serde_urlencoded = "0.6.1"
serde_yaml = "0.8"
shellexpand = "2.0.0"
strip-ansi-escapes = "0.1.0"
tempfile = "3.1.0"
term = "0.5.2"
termcolor = "1.1.0"
textwrap = {version = "0.11.0", features = ["term_size"]}
toml = "0.5.6"
trash = "1.0.0"
typetag = "0.1.4"
umask = "0.1"
unicode-xid = "0.2.0"
which = "3.1.1"
clipboard = { version = "0.5", optional = true }
starship = { version = "0.37.0", optional = true }
[target.'cfg(unix)'.dependencies]
users = "0.9"
[dependencies.rusqlite]
version = "0.21.0"
features = ["bundled", "blob"]
[dev-dependencies]
pretty_assertions = "0.6.1"
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
[features]
stable = []
starship-prompt = ["starship"]
clipboard-cli = ["clipboard"]
@@ -1,3 +1,4 @@
+use crate::commands::classified::external::{MaybeTextCodec, StringOrBinary};
 use crate::commands::classified::pipeline::run_pipeline;
 use crate::commands::plugin::JsonRpc;
 use crate::commands::plugin::{PluginCommand, PluginSink};
@@ -6,7 +7,8 @@ use crate::context::Context;
 #[cfg(not(feature = "starship-prompt"))]
 use crate::git::current_branch;
 use crate::prelude::*;
-use futures_codec::{FramedRead, LinesCodec};
+use futures_codec::FramedRead;
 use nu_errors::ShellError;
 use nu_parser::{ClassifiedPipeline, PipelineShape, SpannedToken, TokensIterator};
 use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
@@ -93,44 +95,27 @@ fn load_plugin(path: &std::path::Path, context: &mut Context) -> Result<(), ShellError> {
 }
 
 fn search_paths() -> Vec<std::path::PathBuf> {
+    use std::env;
+
     let mut search_paths = Vec::new();
 
-    #[cfg(debug_assertions)]
-    {
-        // Use our debug plugins in debug mode
-        let mut path = std::path::PathBuf::from(".");
-        path.push("target");
-        path.push("debug");
-
-        if path.exists() {
-            search_paths.push(path);
-        }
-    }
+    // Automatically add path `nu` is in as a search path
+    if let Ok(exe_path) = env::current_exe() {
+        if let Some(exe_dir) = exe_path.parent() {
+            search_paths.push(exe_dir.to_path_buf());
+        }
+    }
 
     #[cfg(not(debug_assertions))]
     {
-        use std::env;
-
         match env::var_os("PATH") {
             Some(paths) => {
-                search_paths = env::split_paths(&paths).collect::<Vec<_>>();
+                search_paths.extend(env::split_paths(&paths).collect::<Vec<_>>());
             }
             None => println!("PATH is not defined in the environment."),
         }
-
-        // Use our release plugins in release mode
-        let mut path = std::path::PathBuf::from(".");
-        path.push("target");
-        path.push("release");
-
-        if path.exists() {
-            search_paths.push(path);
-        }
     }
 
+    // permit Nu finding and picking up development plugins
+    // if there are any first.
+    search_paths.reverse();
+
     search_paths
 }
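The `search_paths` change above reduces to a small pattern: push the directory containing the running executable, extend with the entries of `PATH`, then reverse the list. A standalone sketch of that ordering (the function name and the bare `Vec<PathBuf>` return are illustrative, not nushell's API, and the `cfg` gating is omitted):

```rust
use std::env;
use std::path::PathBuf;

// Build a plugin search list the way the new search_paths does:
// exe directory first, then every PATH entry, then reversed.
fn plugin_search_paths() -> Vec<PathBuf> {
    let mut search_paths = Vec::new();

    // The directory the current executable lives in, if resolvable.
    if let Ok(exe_path) = env::current_exe() {
        if let Some(exe_dir) = exe_path.parent() {
            search_paths.push(exe_dir.to_path_buf());
        }
    }

    // Every directory listed in PATH, split portably (':' vs ';').
    if let Some(paths) = env::var_os("PATH") {
        search_paths.extend(env::split_paths(&paths));
    }

    // Reverse, as the diff does, so the entries pushed last are
    // consulted first when scanning for plugins.
    search_paths.reverse();
    search_paths
}

fn main() {
    let paths = plugin_search_paths();
    println!("{} candidate plugin directories", paths.len());
}
```

Note that after the reversal the executable's own directory ends up last in this sketch; the real function applies the `PATH` extension only in release builds, which is what lets development plugins win.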
@@ -255,13 +240,13 @@ pub fn create_default_context(
             per_item_command(Ls),
             per_item_command(Du),
             whole_stream_command(Cd),
-            whole_stream_command(Env),
             per_item_command(Remove),
             per_item_command(Open),
             whole_stream_command(Config),
             per_item_command(Help),
             per_item_command(History),
             whole_stream_command(Save),
+            per_item_command(Touch),
             per_item_command(Cpy),
             whole_stream_command(Date),
             per_item_command(Calc),
@@ -317,8 +302,10 @@ pub fn create_default_context(
             whole_stream_command(Default),
             whole_stream_command(SkipWhile),
             whole_stream_command(Range),
+            whole_stream_command(Rename),
             whole_stream_command(Uniq),
             // Table manipulation
+            whole_stream_command(Shuffle),
             whole_stream_command(Wrap),
             whole_stream_command(Pivot),
             // Data processing
@@ -635,15 +622,21 @@ async fn process_line(
     }
 
     let input_stream = if redirect_stdin {
-        let file = futures::io::AllowStdIo::new(
-            crate::commands::classified::external::StdoutWithNewline::new(std::io::stdin()),
-        );
-        let stream = FramedRead::new(file, LinesCodec).map(|line| {
+        let file = futures::io::AllowStdIo::new(std::io::stdin());
+        let stream = FramedRead::new(file, MaybeTextCodec).map(|line| {
             if let Ok(line) = line {
-                Ok(Value {
-                    value: UntaggedValue::Primitive(Primitive::String(line)),
-                    tag: Tag::unknown(),
-                })
+                match line {
+                    StringOrBinary::String(s) => Ok(Value {
+                        value: UntaggedValue::Primitive(Primitive::String(s)),
+                        tag: Tag::unknown(),
+                    }),
+                    StringOrBinary::Binary(b) => Ok(Value {
+                        value: UntaggedValue::Primitive(Primitive::Binary(
+                            b.into_iter().collect(),
+                        )),
+                        tag: Tag::unknown(),
+                    }),
+                }
             } else {
                 panic!("Internal error: could not read lines of text from stdin")
             }
@@ -23,7 +23,6 @@ pub(crate) mod du;
 pub(crate) mod echo;
 pub(crate) mod edit;
 pub(crate) mod enter;
-pub(crate) mod env;
 #[allow(unused)]
 pub(crate) mod evaluate_by;
 pub(crate) mod exit;
@@ -69,10 +68,12 @@ pub(crate) mod range;
 #[allow(unused)]
 pub(crate) mod reduce_by;
 pub(crate) mod reject;
+pub(crate) mod rename;
 pub(crate) mod reverse;
 pub(crate) mod rm;
 pub(crate) mod save;
 pub(crate) mod shells;
+pub(crate) mod shuffle;
 pub(crate) mod size;
 pub(crate) mod skip;
 pub(crate) mod skip_while;
@@ -123,8 +124,8 @@ pub(crate) mod kill;
 pub(crate) use kill::Kill;
 pub(crate) mod clear;
 pub(crate) use clear::Clear;
+pub(crate) mod touch;
 pub(crate) use enter::Enter;
-pub(crate) use env::Env;
 #[allow(unused_imports)]
 pub(crate) use evaluate_by::EvaluateBy;
 pub(crate) use exit::Exit;
@@ -171,10 +172,12 @@ pub(crate) use range::Range;
 #[allow(unused_imports)]
 pub(crate) use reduce_by::ReduceBy;
 pub(crate) use reject::Reject;
+pub(crate) use rename::Rename;
 pub(crate) use reverse::Reverse;
 pub(crate) use rm::Remove;
 pub(crate) use save::Save;
 pub(crate) use shells::Shells;
+pub(crate) use shuffle::Shuffle;
 pub(crate) use size::Size;
 pub(crate) use skip::Skip;
 pub(crate) use skip_while::SkipWhile;
@@ -195,6 +198,7 @@ pub(crate) use to_toml::ToTOML;
 pub(crate) use to_tsv::ToTSV;
 pub(crate) use to_url::ToURL;
 pub(crate) use to_yaml::ToYAML;
+pub(crate) use touch::Touch;
 pub(crate) use trim::Trim;
 pub(crate) use uniq::Uniq;
 pub(crate) use version::Version;
@@ -92,6 +92,7 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
                 }
             };
             let stream = stream.to_input_stream();
+
             if let Some(table) = table {
                 let command_args = create_default_command_args(&context).with_input(stream);
                 let result = table.run(command_args, &context.commands);
@@ -111,14 +112,14 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
                     let result = text.run(command_args, &context.commands);
                     result.collect::<Vec<_>>().await;
                 } else {
-                    outln!("{}", s);
+                    out!("{}", s);
                 }
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::String(s)),
                 ..
             } => {
-                outln!("{}", s);
+                out!("{}", s);
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::Line(ref s)),
@@ -131,32 +132,32 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
                     let result = text.run(command_args, &context.commands);
                     result.collect::<Vec<_>>().await;
                 } else {
-                    outln!("{}\n", s);
+                    out!("{}\n", s);
                 }
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::Line(s)),
                 ..
             } => {
-                outln!("{}\n", s);
+                out!("{}\n", s);
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::Path(s)),
                 ..
             } => {
-                outln!("{}", s.display());
+                out!("{}", s.display());
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::Int(n)),
                 ..
             } => {
-                outln!("{}", n);
+                out!("{}", n);
             }
             Value {
                 value: UntaggedValue::Primitive(Primitive::Decimal(n)),
                 ..
             } => {
-                outln!("{}", n);
+                out!("{}", n);
             }
 
             Value { value: UntaggedValue::Primitive(Primitive::Binary(ref b)), .. } => {
@@ -168,7 +169,7 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
                     result.collect::<Vec<_>>().await;
                 } else {
                     use pretty_hex::*;
-                    outln!("{:?}", b.hex_dump());
+                    out!("{:?}", b.hex_dump());
                 }
             }
@@ -183,7 +184,7 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
                 let result = table.run(command_args, &context.commands);
                 result.collect::<Vec<_>>().await;
             } else {
-                outln!("{:?}", item);
+                out!("{:?}", item);
             }
         }
     }
@@ -191,7 +192,7 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
             }
         }
         _ => {
-            //outln!("<no results>");
+            //out!("<no results>");
         }
     }
@@ -1,6 +1,9 @@
+use crate::futures::ThreadedReceiver;
 use crate::prelude::*;
+use bytes::{BufMut, Bytes, BytesMut};
+use futures::executor::block_on_stream;
 use futures::stream::StreamExt;
-use futures_codec::{FramedRead, LinesCodec};
+use futures_codec::FramedRead;
 use log::trace;
 use nu_errors::ShellError;
 use nu_parser::commands::classified::external::ExternalArg;
@@ -11,6 +14,71 @@ use nu_value_ext::as_column_path;
 use std::io::Write;
 use std::ops::Deref;
 use std::process::{Command, Stdio};
+use std::sync::mpsc;
+
+pub enum StringOrBinary {
+    String(String),
+    Binary(Vec<u8>),
+}
+
+pub struct MaybeTextCodec;
+
+impl futures_codec::Encoder for MaybeTextCodec {
+    type Item = StringOrBinary;
+    type Error = std::io::Error;
+
+    fn encode(&mut self, item: Self::Item, dst: &mut BytesMut) -> Result<(), Self::Error> {
+        match item {
+            StringOrBinary::String(s) => {
+                dst.reserve(s.len());
+                dst.put(s.as_bytes());
+                Ok(())
+            }
+            StringOrBinary::Binary(b) => {
+                dst.reserve(b.len());
+                dst.put(Bytes::from(b));
+                Ok(())
+            }
+        }
+    }
+}
+
+impl futures_codec::Decoder for MaybeTextCodec {
+    type Item = StringOrBinary;
+    type Error = std::io::Error;
+
+    fn decode(&mut self, src: &mut BytesMut) -> Result<Option<Self::Item>, Self::Error> {
+        let v: Vec<u8> = src.to_vec();
+        match String::from_utf8(v) {
+            Ok(s) => {
+                src.clear();
+                if s.is_empty() {
+                    Ok(None)
+                } else {
+                    Ok(Some(StringOrBinary::String(s)))
+                }
+            }
+            Err(err) => {
+                // Note: the longest UTF-8 character per Unicode spec is currently 6 bytes. If we fail somewhere earlier than the last 6 bytes,
+                // we know that we're failing to understand the string encoding and not just seeing a partial character. When this happens, let's
+                // fall back to assuming it's a binary buffer.
+                if src.is_empty() {
+                    Ok(None)
+                } else if src.len() > 6 && (src.len() - err.utf8_error().valid_up_to() > 6) {
+                    // Fall back to assuming binary
+                    let buf = src.to_vec();
+                    src.clear();
+                    Ok(Some(StringOrBinary::Binary(buf)))
+                } else {
+                    // Looks like a utf-8 string, so let's assume that
+                    let buf = src.split_to(err.utf8_error().valid_up_to() + 1);
+                    String::from_utf8(buf.to_vec())
+                        .map(|x| Some(StringOrBinary::String(x)))
+                        .map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))
+                }
+            }
+        }
+    }
+}
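The heart of the new decoder is the UTF-8 fallback heuristic: attempt a text decode, and if the first invalid byte sits well before the end of the buffer, treat the whole buffer as binary rather than as a truncated multi-byte character. The same rule can be shown in isolation (the `classify` helper and these type names are illustrative, not nushell's actual API):

```rust
// Standalone sketch of the MaybeTextCodec fallback heuristic.
#[derive(Debug)]
enum StringOrBinary {
    String(String),
    Binary(Vec<u8>),
}

fn classify(buf: &[u8]) -> StringOrBinary {
    match std::str::from_utf8(buf) {
        Ok(s) => StringOrBinary::String(s.to_string()),
        Err(e) => {
            // If the first invalid byte is more than a few bytes from the
            // end, this is not just a partial character at the buffer
            // boundary -- assume the whole buffer is binary.
            if buf.len() - e.valid_up_to() > 6 {
                StringOrBinary::Binary(buf.to_vec())
            } else {
                // Otherwise keep the valid prefix as text.
                StringOrBinary::String(
                    String::from_utf8_lossy(&buf[..e.valid_up_to()]).into_owned(),
                )
            }
        }
    }
}

fn main() {
    // Pure ASCII decodes as text.
    assert!(matches!(classify(b"hello"), StringOrBinary::String(_)));

    // An invalid byte early in a long buffer flips the result to binary.
    let mut data = vec![0xFF_u8];
    data.extend_from_slice(b"0123456789");
    assert!(matches!(classify(&data), StringOrBinary::Binary(_)));
}
```

One design note: the real codec operates on a `BytesMut` it can drain incrementally, which is why it distinguishes "empty buffer" and "split off the valid prefix" cases that this sketch collapses.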
 pub fn nu_value_to_string(command: &ExternalCommand, from: &Value) -> Result<String, ShellError> {
     match &from.value {
@@ -26,26 +94,7 @@ pub fn nu_value_to_string(command: &ExternalCommand, from: &Value) -> Result<String, ShellError> {
     }
 }
 
-pub fn nu_value_to_string_for_stdin(
-    command: &ExternalCommand,
-    from: &Value,
-) -> Result<Option<String>, ShellError> {
-    match &from.value {
-        UntaggedValue::Primitive(Primitive::Nothing) => Ok(None),
-        UntaggedValue::Primitive(Primitive::String(s))
-        | UntaggedValue::Primitive(Primitive::Line(s)) => Ok(Some(s.clone())),
-        unsupported => Err(ShellError::labeled_error(
-            format!(
-                "Received unexpected type from pipeline ({})",
-                unsupported.type_name()
-            ),
-            "expected a string",
-            &command.name_tag,
-        )),
-    }
-}
-
-pub(crate) async fn run_external_command(
+pub(crate) fn run_external_command(
     command: ExternalCommand,
     context: &mut Context,
     input: Option<InputStream>,
@@ -62,9 +111,9 @@ pub(crate) fn run_external_command(
     }
 
     if command.has_it_argument() || command.has_nu_argument() {
-        run_with_iterator_arg(command, context, input, is_last).await
+        run_with_iterator_arg(command, context, input, is_last)
     } else {
-        run_with_stdin(command, context, input, is_last).await
+        run_with_stdin(command, context, input, is_last)
     }
 }
@@ -114,7 +163,7 @@ fn to_column_path(
     )
 }
 
-async fn run_with_iterator_arg(
+fn run_with_iterator_arg(
     command: ExternalCommand,
     context: &mut Context,
     input: Option<InputStream>,
@@ -336,7 +385,7 @@ fn run_with_iterator_arg(
     Ok(Some(stream.to_input_stream()))
 }
 
-async fn run_with_stdin(
+fn run_with_stdin(
     command: ExternalCommand,
     context: &mut Context,
     input: Option<InputStream>,
@@ -379,45 +428,6 @@ fn run_with_stdin(
     spawn(&command, &path, &process_args[..], input, is_last)
 }
-/// This is a wrapper for stdout-like readers that ensure a carriage return ends the stream
-pub struct StdoutWithNewline<T: std::io::Read> {
-    stdout: T,
-    ended_in_newline: bool,
-}
-
-impl<T: std::io::Read> StdoutWithNewline<T> {
-    pub fn new(stdout: T) -> StdoutWithNewline<T> {
-        StdoutWithNewline {
-            stdout,
-            ended_in_newline: false,
-        }
-    }
-}
-
-impl<T: std::io::Read> std::io::Read for StdoutWithNewline<T> {
-    fn read(&mut self, buf: &mut [u8]) -> std::io::Result<usize> {
-        match self.stdout.read(buf) {
-            Err(e) => Err(e),
-            Ok(0) => {
-                if !self.ended_in_newline && !buf.is_empty() {
-                    self.ended_in_newline = true;
-                    buf[0] = b'\n';
-                    Ok(1)
-                } else {
-                    Ok(0)
-                }
-            }
-            Ok(len) => {
-                if buf[len - 1] == b'\n' {
-                    self.ended_in_newline = true;
-                } else {
-                    self.ended_in_newline = false;
-                }
-                Ok(len)
-            }
-        }
-    }
-}
 fn spawn(
     command: &ExternalCommand,
     path: &str,
@@ -426,7 +436,6 @@ fn spawn(
     is_last: bool,
 ) -> Result<Option<InputStream>, ShellError> {
     let command = command.clone();
-    let name_tag = command.name_tag.clone();
 
     let mut process = {
         #[cfg(windows)]
@@ -467,76 +476,134 @@ fn spawn(
     trace!(target: "nu::run::external", "built command {:?}", process);
 
+    // TODO Switch to async_std::process once it's stabilized
     if let Ok(mut child) = process.spawn() {
-        let stream = async_stream! {
-            if let Some(mut input) = input {
-                let mut stdin_write = child.stdin
-                    .take()
-                    .expect("Internal error: could not get stdin pipe for external command");
-
-                while let Some(value) = input.next().await {
-                    let input_string = match nu_value_to_string_for_stdin(&command, &value) {
-                        Ok(None) => continue,
-                        Ok(Some(v)) => v,
-                        Err(e) => {
-                            yield Ok(Value {
-                                value: UntaggedValue::Error(e),
-                                tag: name_tag
-                            });
-                            return;
-                        }
-                    };
-
-                    if let Err(e) = stdin_write.write(input_string.as_bytes()) {
-                        let message = format!("Unable to write to stdin (error = {})", e);
-                        yield Ok(Value {
-                            value: UntaggedValue::Error(ShellError::labeled_error(
-                                message,
-                                "application may have closed before completing pipeline",
-                                &name_tag)),
-                            tag: name_tag
-                        });
-                        return;
-                    }
-                }
-            }
+        let (tx, rx) = mpsc::sync_channel(0);
+
+        let mut stdin = child.stdin.take();
+
+        let stdin_write_tx = tx.clone();
+        let stdout_read_tx = tx;
+        let stdin_name_tag = command.name_tag.clone();
+        let stdout_name_tag = command.name_tag;
+
+        std::thread::spawn(move || {
+            if let Some(input) = input {
+                let mut stdin_write = stdin
+                    .take()
+                    .expect("Internal error: could not get stdin pipe for external command");
+
+                for value in block_on_stream(input) {
+                    match &value.value {
+                        UntaggedValue::Primitive(Primitive::Nothing) => continue,
+                        UntaggedValue::Primitive(Primitive::String(s))
+                        | UntaggedValue::Primitive(Primitive::Line(s)) => {
+                            if let Err(e) = stdin_write.write(s.as_bytes()) {
+                                let message = format!("Unable to write to stdin (error = {})", e);
+
+                                let _ = stdin_write_tx.send(Ok(Value {
+                                    value: UntaggedValue::Error(ShellError::labeled_error(
+                                        message,
+                                        "application may have closed before completing pipeline",
+                                        &stdin_name_tag,
+                                    )),
+                                    tag: stdin_name_tag,
+                                }));
+                                return Err(());
+                            }
+                        }
+                        UntaggedValue::Primitive(Primitive::Binary(b)) => {
+                            if let Err(e) = stdin_write.write(b) {
+                                let message = format!("Unable to write to stdin (error = {})", e);
+
+                                let _ = stdin_write_tx.send(Ok(Value {
+                                    value: UntaggedValue::Error(ShellError::labeled_error(
+                                        message,
+                                        "application may have closed before completing pipeline",
+                                        &stdin_name_tag,
+                                    )),
+                                    tag: stdin_name_tag,
+                                }));
+                                return Err(());
+                            }
+                        }
+                        unsupported => {
+                            let _ = stdin_write_tx.send(Ok(Value {
+                                value: UntaggedValue::Error(ShellError::labeled_error(
+                                    format!(
+                                        "Received unexpected type from pipeline ({})",
+                                        unsupported.type_name()
+                                    ),
+                                    "expected a string",
+                                    stdin_name_tag.clone(),
+                                )),
+                                tag: stdin_name_tag,
+                            }));
+                            return Err(());
+                        }
+                    };
+                }
+            }
+
+            Ok(())
+        });
 
+        std::thread::spawn(move || {
             if !is_last {
                 let stdout = if let Some(stdout) = child.stdout.take() {
                     stdout
                 } else {
-                    yield Ok(Value {
-                        value: UntaggedValue::Error(ShellError::labeled_error(
-                            "Can't redirect the stdout for external command",
-                            "can't redirect stdout",
-                            &name_tag)),
-                        tag: name_tag
-                    });
-                    return;
+                    let _ = stdout_read_tx.send(Ok(Value {
+                        value: UntaggedValue::Error(ShellError::labeled_error(
+                            "Can't redirect the stdout for external command",
+                            "can't redirect stdout",
+                            &stdout_name_tag,
+                        )),
+                        tag: stdout_name_tag,
+                    }));
+                    return Err(());
                 };
 
-                let file = futures::io::AllowStdIo::new(StdoutWithNewline::new(stdout));
-                let mut stream = FramedRead::new(file, LinesCodec);
+                let file = futures::io::AllowStdIo::new(stdout);
+                let stream = FramedRead::new(file, MaybeTextCodec);
 
-                while let Some(line) = stream.next().await {
-                    if let Ok(line) = line {
-                        yield Ok(Value {
-                            value: UntaggedValue::Primitive(Primitive::Line(line)),
-                            tag: name_tag.clone(),
-                        });
-                    } else {
-                        yield Ok(Value {
-                            value: UntaggedValue::Error(
-                                ShellError::labeled_error(
-                                    "Unable to read lines from stdout. This usually happens when the output does not end with a newline.",
-                                    "unable to read from stdout",
-                                    &name_tag,
-                                )
-                            ),
-                            tag: name_tag.clone(),
-                        });
-                        return;
-                    }
-                }
+                for line in block_on_stream(stream) {
+                    match line {
+                        Ok(line) => match line {
+                            StringOrBinary::String(s) => {
+                                let result = stdout_read_tx.send(Ok(Value {
+                                    value: UntaggedValue::Primitive(Primitive::String(s.clone())),
+                                    tag: stdout_name_tag.clone(),
+                                }));
+
+                                if result.is_err() {
+                                    break;
+                                }
+                            }
+                            StringOrBinary::Binary(b) => {
+                                let result = stdout_read_tx.send(Ok(Value {
+                                    value: UntaggedValue::Primitive(Primitive::Binary(
+                                        b.into_iter().collect(),
+                                    )),
+                                    tag: stdout_name_tag.clone(),
+                                }));
+
+                                if result.is_err() {
+                                    break;
+                                }
+                            }
+                        },
+                        Err(_) => {
+                            let _ = stdout_read_tx.send(Ok(Value {
+                                value: UntaggedValue::Error(ShellError::labeled_error(
+                                    "Unable to read from stdout.",
+                                    "unable to read from stdout",
+                                    &stdout_name_tag,
+                                )),
+                                tag: stdout_name_tag.clone(),
+                            }));
+                            break;
+                        }
+                    }
+                }
             }
@@ -547,21 +614,22 @@ fn spawn(
             let cfg = crate::data::config::config(Tag::unknown());
             if let Ok(cfg) = cfg {
                 if cfg.contains_key("nonzero_exit_errors") {
-                    yield Ok(Value {
-                        value: UntaggedValue::Error(
-                            ShellError::labeled_error(
-                                "External command failed",
-                                "command failed",
-                                &name_tag,
-                            )
-                        ),
-                        tag: name_tag,
-                    });
+                    let _ = stdout_read_tx.send(Ok(Value {
+                        value: UntaggedValue::Error(ShellError::labeled_error(
+                            "External command failed",
+                            "command failed",
+                            &stdout_name_tag,
+                        )),
+                        tag: stdout_name_tag,
+                    }));
                 }
             }
-        };
+
+            Ok(())
+        });
+
+        let stream = ThreadedReceiver::new(rx);
         Ok(Some(stream.to_input_stream()))
     } else {
         Err(ShellError::labeled_error(
@@ -612,8 +680,8 @@ fn argument_is_quoted(argument: &str) -> bool {
         return false;
     }
 
-    ((argument.starts_with('"') && argument.ends_with('"'))
-        || (argument.starts_with('\'') && argument.ends_with('\'')))
+    (argument.starts_with('"') && argument.ends_with('"'))
+        || (argument.starts_with('\'') && argument.ends_with('\''))
 }
 
 #[allow(unused)]
@@ -670,9 +738,7 @@ mod tests {
         let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
 
-        assert!(run_external_command(cmd, &mut ctx, None, false)
-            .await
-            .is_err());
+        assert!(run_external_command(cmd, &mut ctx, None, false).is_err());
 
         Ok(())
     }
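The thread-plus-channel structure that replaces `async_stream!` in `spawn` boils down to a small pattern: a producer thread feeds a bounded `mpsc` channel, and the receiver side is consumed as an ordinary iterator (nushell wraps it in `ThreadedReceiver`; the plain receiver used here is a simplification):

```rust
use std::sync::mpsc;
use std::thread;

// Producer thread feeding a rendezvous channel (capacity 0): each send
// blocks until the consumer is ready, giving the same backpressure the
// new spawn() relies on.
fn produce_lines(n: usize) -> Vec<String> {
    let (tx, rx) = mpsc::sync_channel::<String>(0);

    let producer = thread::spawn(move || {
        for i in 0..n {
            // Stop producing if the receiver hung up -- the same reason
            // spawn() breaks when stdout_read_tx.send() returns an error.
            if tx.send(format!("line {}", i)).is_err() {
                break;
            }
        }
    });

    // The receiver is just an iterator over whatever was sent.
    let lines: Vec<String> = rx.into_iter().collect();
    producer.join().unwrap();
    lines
}

fn main() {
    let lines = produce_lines(3);
    assert_eq!(lines, vec!["line 0", "line 1", "line 2"]);
    println!("received {} values", lines.len());
}
```

Using capacity 0 rather than an unbounded channel means a slow consumer cannot cause the external command's output to pile up in memory.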
@@ -5,7 +5,7 @@ use nu_errors::ShellError;
 use nu_parser::InternalCommand;
 use nu_protocol::{CommandAction, Primitive, ReturnSuccess, UntaggedValue, Value};
 
-pub(crate) async fn run_internal_command(
+pub(crate) fn run_internal_command(
     command: InternalCommand,
     context: &mut Context,
     input: Option<InputStream>,
@@ -31,15 +31,15 @@ pub(crate) async fn run_pipeline(
             (_, Some(ClassifiedCommand::Error(err))) => return Err(err.clone().into()),
 
             (Some(ClassifiedCommand::Internal(left)), _) => {
-                run_internal_command(left, ctx, input, Text::from(line)).await?
+                run_internal_command(left, ctx, input, Text::from(line))?
             }
 
             (Some(ClassifiedCommand::External(left)), None) => {
-                run_external_command(left, ctx, input, true).await?
+                run_external_command(left, ctx, input, true)?
             }
 
             (Some(ClassifiedCommand::External(left)), _) => {
-                run_external_command(left, ctx, input, false).await?
+                run_external_command(left, ctx, input, false)?
             }
 
             (None, _) => break,
@ -2,7 +2,11 @@ use crate::commands::PerItemCommand;
use crate::context::CommandRegistry; use crate::context::CommandRegistry;
use crate::prelude::*; use crate::prelude::*;
use nu_errors::ShellError; use nu_errors::ShellError;
use nu_protocol::{CallInfo, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value}; use nu_protocol::{
CallInfo, ColumnPath, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value,
};
use nu_source::Tagged;
use nu_value_ext::{as_column_path, get_data_by_column_path};
use std::borrow::Borrow; use std::borrow::Borrow;
use nom::{ use nom::{
@@ -45,35 +49,46 @@ impl PerItemCommand for Format {
                 ShellError::labeled_error(
                     "Could not create format pattern",
                     "could not create format pattern",
-                    pattern_tag,
+                    &pattern_tag,
                 )
             })?;
         let commands = format_pattern.1;
-        let output = if let Value {
-            value: UntaggedValue::Row(dict),
-            ..
-        } = value
-        {
-            let mut output = String::new();
-            for command in &commands {
-                match command {
-                    FormatCommand::Text(s) => {
-                        output.push_str(s);
-                    }
-                    FormatCommand::Column(c) => {
-                        if let Some(c) = dict.entries.get(c) {
-                            output.push_str(&value::format_leaf(c.borrow()).plain_string(100_000))
-                        }
-                        // That column doesn't match, so don't emit anything
-                    }
-                }
-            }
-            output
-        } else {
-            String::new()
-        };
+        let output = match value {
+            value
+            @
+            Value {
+                value: UntaggedValue::Row(_),
+                ..
+            } => {
+                let mut output = String::new();
+                for command in &commands {
+                    match command {
+                        FormatCommand::Text(s) => {
+                            output.push_str(s);
+                        }
+                        FormatCommand::Column(c) => {
+                            let key = to_column_path(&c, &pattern_tag)?;
+
+                            let fetcher = get_data_by_column_path(
+                                &value,
+                                &key,
+                                Box::new(move |(_, _, error)| error),
+                            );
+
+                            if let Ok(c) = fetcher {
+                                output
+                                    .push_str(&value::format_leaf(c.borrow()).plain_string(100_000))
+                            }
+                            // That column doesn't match, so don't emit anything
+                        }
+                    }
+                }
+                output
+            }
+            _ => String::new(),
+        };
         Ok(futures::stream::iter(vec![ReturnSuccess::value(
@@ -116,3 +131,27 @@ fn format(input: &str) -> IResult<&str, Vec<FormatCommand>> {
     Ok((loop_input, output))
 }
+
+fn to_column_path(
+    path_members: &str,
+    tag: impl Into<Tag>,
+) -> Result<Tagged<ColumnPath>, ShellError> {
+    let tag = tag.into();
+
+    as_column_path(
+        &UntaggedValue::Table(
+            path_members
+                .split('.')
+                .map(|x| {
+                    let member = match x.parse::<u64>() {
+                        Ok(v) => UntaggedValue::int(v),
+                        Err(_) => UntaggedValue::string(x),
+                    };
+                    member.into_value(&tag)
+                })
+                .collect(),
+        )
+        .into_value(&tag),
+    )
+}
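The member-splitting inside `to_column_path` can be shown standalone: dotted segments that parse as unsigned integers become row indices, everything else stays a column name. A dependency-free sketch (`Member` is a stand-in for nu's `UntaggedValue`, not the real type):

```rust
// Stand-in for the two kinds of path members nu builds above.
#[derive(Debug, PartialEq)]
enum Member {
    Int(u64),
    String(String),
}

// Split "items.0.name" on '.'; numeric segments index into tables,
// all other segments name columns.
fn parse_column_path(path: &str) -> Vec<Member> {
    path.split('.')
        .map(|seg| match seg.parse::<u64>() {
            Ok(n) => Member::Int(n),
            Err(_) => Member::String(seg.to_string()),
        })
        .collect()
}
```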


@@ -205,32 +205,18 @@ fn from_bson(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStre
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        for value in values {
-            let value_tag = &value.tag;
-            match value.value {
-                UntaggedValue::Primitive(Primitive::Binary(vb)) =>
-                    match from_bson_bytes_to_value(vb, tag.clone()) {
-                        Ok(x) => yield ReturnSuccess::value(x),
-                        Err(_) => {
-                            yield Err(ShellError::labeled_error_with_secondary(
-                                "Could not parse as BSON",
-                                "input cannot be parsed as BSON",
-                                tag.clone(),
-                                "value originates from here",
-                                value_tag,
-                            ))
-                        }
-                    }
-                _ => yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    tag.clone(),
-                    "value originates from here",
-                    value_tag,
-                )),
+        let bytes = input.collect_binary(tag.clone()).await?;
+        match from_bson_bytes_to_value(bytes.item, tag.clone()) {
+            Ok(x) => yield ReturnSuccess::value(x),
+            Err(_) => {
+                yield Err(ShellError::labeled_error_with_secondary(
+                    "Could not parse as BSON",
+                    "input cannot be parsed as BSON",
+                    tag.clone(),
+                    "value originates from here",
+                    bytes.tag,
+                ))
             }
         }
     };


@@ -47,28 +47,9 @@ pub fn from_delimited_data(
     let name_tag = name;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            let value_tag = &value.tag;
-            latest_tag = Some(value_tag.clone());
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            } else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_tag.clone(),
-                    "value originates from here",
-                    value_tag.clone(),
-                ))
-            }
-        }
-
-        match from_delimited_string_to_value(concat_string, headerless, sep, name_tag.clone()) {
+        let concat_string = input.collect_string(name_tag.clone()).await?;
+        match from_delimited_string_to_value(concat_string.item, headerless, sep, name_tag.clone()) {
             Ok(x) => match x {
                 Value { value: UntaggedValue::Table(list), .. } => {
                     for l in list {
@@ -77,7 +58,7 @@ pub fn from_delimited_data(
                 }
                 x => yield ReturnSuccess::value(x),
             },
-            Err(_) => if let Some(last_tag) = latest_tag {
+            Err(_) => {
                 let line_one = format!("Could not parse as {}", format_name);
                 let line_two = format!("input cannot be parsed as {}", format_name);
                 yield Err(ShellError::labeled_error_with_secondary(
@@ -85,7 +66,7 @@ pub fn from_delimited_data(
                     line_two,
                     name_tag.clone(),
                     "value originates from here",
-                    last_tag.clone(),
+                    concat_string.tag,
                 ))
             } ,
         }


@@ -66,32 +66,12 @@ pub fn from_ini_string_to_value(
 fn from_ini(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let args = args.evaluate_once(registry)?;
     let tag = args.name_tag();
-    let span = tag.span;
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            } else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
-
-        match from_ini_string_to_value(concat_string, tag.clone()) {
+        let concat_string = input.collect_string(tag.clone()).await?;
+        match from_ini_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
                 Value { value: UntaggedValue::Table(list), .. } => {
                     for l in list {
@@ -100,15 +80,15 @@ fn from_ini(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStrea
                 }
                 x => yield ReturnSuccess::value(x),
             },
-            Err(_) => if let Some(last_tag) = latest_tag {
+            Err(_) => {
                 yield Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as INI",
                     "input cannot be parsed as INI",
                     &tag,
                     "value originates from here",
-                    last_tag,
+                    concat_string.tag,
                 ))
-            } ,
+            }
         }
     };


@@ -74,35 +74,13 @@ fn from_json(
     FromJSONArgs { objects }: FromJSONArgs,
     RunnableContext { input, name, .. }: RunnableContext,
 ) -> Result<OutputStream, ShellError> {
-    let name_span = name.span;
     let name_tag = name;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            } else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
+        let concat_string = input.collect_string(name_tag.clone()).await?;
         if objects {
-            for json_str in concat_string.lines() {
+            for json_str in concat_string.item.lines() {
                 if json_str.is_empty() {
                     continue;
                 }
@@ -111,23 +89,21 @@ fn from_json(
                     Ok(x) =>
                         yield ReturnSuccess::value(x),
                     Err(e) => {
-                        if let Some(ref last_tag) = latest_tag {
-                            let mut message = "Could not parse as JSON (".to_string();
-                            message.push_str(&e.to_string());
-                            message.push_str(")");
-
-                            yield Err(ShellError::labeled_error_with_secondary(
-                                message,
-                                "input cannot be parsed as JSON",
-                                &name_tag,
-                                "value originates from here",
-                                last_tag))
-                        }
+                        let mut message = "Could not parse as JSON (".to_string();
+                        message.push_str(&e.to_string());
+                        message.push_str(")");
+
+                        yield Err(ShellError::labeled_error_with_secondary(
+                            message,
+                            "input cannot be parsed as JSON",
+                            &name_tag,
+                            "value originates from here",
+                            concat_string.tag.clone()))
                     }
                 }
             }
         } else {
-            match from_json_string_to_value(concat_string, name_tag.clone()) {
+            match from_json_string_to_value(concat_string.item, name_tag.clone()) {
                 Ok(x) =>
                     match x {
                         Value { value: UntaggedValue::Table(list), .. } => {
@@ -138,18 +114,16 @@ fn from_json(
                         x => yield ReturnSuccess::value(x),
                     }
                 Err(e) => {
-                    if let Some(last_tag) = latest_tag {
-                        let mut message = "Could not parse as JSON (".to_string();
-                        message.push_str(&e.to_string());
-                        message.push_str(")");
-
-                        yield Err(ShellError::labeled_error_with_secondary(
-                            message,
-                            "input cannot be parsed as JSON",
-                            name_tag,
-                            "value originates from here",
-                            last_tag))
-                    }
+                    let mut message = "Could not parse as JSON (".to_string();
+                    message.push_str(&e.to_string());
+                    message.push_str(")");
+
+                    yield Err(ShellError::labeled_error_with_secondary(
+                        message,
+                        "input cannot be parsed as JSON",
+                        name_tag,
+                        "value originates from here",
+                        concat_string.tag))
                 }
             }
         }


@@ -0,0 +1,98 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::TaggedListBuilder;
use calamine::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
use std::io::Cursor;
pub struct FromODS;
#[derive(Deserialize)]
pub struct FromODSArgs {
headerless: bool,
}
impl WholeStreamCommand for FromODS {
fn name(&self) -> &str {
"from-ods"
}
fn signature(&self) -> Signature {
Signature::build("from-ods").switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse OpenDocument Spreadsheet(.ods) data and create table."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, from_ods)?.run()
}
}
fn from_ods(
FromODSArgs {
headerless: _headerless,
}: FromODSArgs,
runnable_context: RunnableContext,
) -> Result<OutputStream, ShellError> {
let input = runnable_context.input;
let tag = runnable_context.name;
let stream = async_stream! {
let bytes = input.collect_binary(tag.clone()).await?;
let mut buf: Cursor<Vec<u8>> = Cursor::new(bytes.item);
let mut ods = Ods::<_>::new(buf).map_err(|_| ShellError::labeled_error(
"Could not load ods file",
"could not load ods file",
&tag))?;
let mut dict = TaggedDictBuilder::new(&tag);
let sheet_names = ods.sheet_names().to_owned();
for sheet_name in &sheet_names {
let mut sheet_output = TaggedListBuilder::new(&tag);
if let Some(Ok(current_sheet)) = ods.worksheet_range(sheet_name) {
for row in current_sheet.rows() {
let mut row_output = TaggedDictBuilder::new(&tag);
for (i, cell) in row.iter().enumerate() {
let value = match cell {
DataType::Empty => UntaggedValue::nothing(),
DataType::String(s) => UntaggedValue::string(s),
DataType::Float(f) => UntaggedValue::decimal(*f),
DataType::Int(i) => UntaggedValue::int(*i),
DataType::Bool(b) => UntaggedValue::boolean(*b),
_ => UntaggedValue::nothing(),
};
row_output.insert_untagged(&format!("Column{}", i), value);
}
sheet_output.push_untagged(row_output.into_untagged_value());
}
dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
} else {
yield Err(ShellError::labeled_error(
"Could not load sheet",
"could not load sheet",
&tag));
}
}
yield ReturnSuccess::value(dict.into_value());
};
Ok(stream.to_output_stream())
}
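The cell conversion in the loop above is a straight enum-to-enum mapping. A simplified sketch with stand-in types (`Cell` mimics calamine's `DataType` variants, `Out` mimics the untagged values; nu's real decimal conversion goes through a big-decimal type, which is elided here):

```rust
// Stand-in for the subset of calamine::DataType handled above.
enum Cell {
    Empty,
    Text(String),
    Float(f64),
    Int(i64),
    Bool(bool),
}

// Stand-in for the produced UntaggedValue; anything unrecognized
// collapses to Nothing, just like the `_` arm in from_ods/from_xlsx.
#[derive(Debug, PartialEq)]
enum Out {
    Nothing,
    String(String),
    Decimal(f64),
    Int(i64),
    Bool(bool),
}

fn convert(cell: &Cell) -> Out {
    match cell {
        Cell::Empty => Out::Nothing,
        Cell::Text(s) => Out::String(s.clone()),
        Cell::Float(f) => Out::Decimal(*f),
        Cell::Int(i) => Out::Int(*i),
        Cell::Bool(b) => Out::Bool(*b),
    }
}
```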


@@ -138,40 +138,25 @@ fn from_sqlite(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputSt
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        for value in values {
-            let value_tag = &value.tag;
-            match value.value {
-                UntaggedValue::Primitive(Primitive::Binary(vb)) =>
-                    match from_sqlite_bytes_to_value(vb, tag.clone()) {
-                        Ok(x) => match x {
-                            Value { value: UntaggedValue::Table(list), .. } => {
-                                for l in list {
-                                    yield ReturnSuccess::value(l);
-                                }
-                            }
-                            _ => yield ReturnSuccess::value(x),
-                        }
-                        Err(err) => {
-                            println!("{:?}", err);
-                            yield Err(ShellError::labeled_error_with_secondary(
-                                "Could not parse as SQLite",
-                                "input cannot be parsed as SQLite",
-                                &tag,
-                                "value originates from here",
-                                value_tag,
-                            ))
-                        }
-                    }
-                _ => yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected binary data from pipeline",
-                    "requires binary data input",
-                    &tag,
-                    "value originates from here",
-                    value_tag,
-                )),
-            }
-        }
+        let bytes = input.collect_binary(tag.clone()).await?;
+        match from_sqlite_bytes_to_value(bytes.item, tag.clone()) {
+            Ok(x) => match x {
+                Value { value: UntaggedValue::Table(list), .. } => {
+                    for l in list {
+                        yield ReturnSuccess::value(l);
+                    }
+                }
+                _ => yield ReturnSuccess::value(x),
+            }
+            Err(err) => {
+                println!("{:?}", err);
+                yield Err(ShellError::labeled_error_with_secondary(
+                    "Could not parse as SQLite",
+                    "input cannot be parsed as SQLite",
+                    &tag,
+                    "value originates from here",
+                    bytes.tag,
+                ))
+            }
+        }
     };


@@ -259,45 +259,26 @@ fn from_ssv(
     RunnableContext { input, name, .. }: RunnableContext,
 ) -> Result<OutputStream, ShellError> {
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
+        let concat_string = input.collect_string(name.clone()).await?;
         let split_at = match minimum_spaces {
             Some(number) => number.item,
             None => DEFAULT_MINIMUM_SPACES
         };
-        for value in values {
-            let value_tag = value.tag.clone();
-            latest_tag = Some(value_tag.clone());
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            }
-            else {
-                yield Err(ShellError::labeled_error_with_secondary (
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    &name,
-                    "value originates from here",
-                    &value_tag
-                ))
-            }
-        }
-
-        match from_ssv_string_to_value(&concat_string, headerless, aligned_columns, split_at, name.clone()) {
+        match from_ssv_string_to_value(&concat_string.item, headerless, aligned_columns, split_at, name.clone()) {
             Some(x) => match x {
                 Value { value: UntaggedValue::Table(list), ..} => {
                     for l in list { yield ReturnSuccess::value(l) }
                 }
                 x => yield ReturnSuccess::value(x)
             },
-            None => if let Some(tag) = latest_tag {
+            None => {
                 yield Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as SSV",
                     "input cannot be parsed ssv",
                     &name,
                     "value originates from here",
-                    &tag,
+                    &concat_string.tag,
                 ))
             },
         }


@@ -69,33 +69,11 @@ pub fn from_toml(
 ) -> Result<OutputStream, ShellError> {
     let args = args.evaluate_once(registry)?;
     let tag = args.name_tag();
-    let name_span = tag.span;
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            }
-            else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
-
-        match from_toml_string_to_value(concat_string, tag.clone()) {
+        let concat_string = input.collect_string(tag.clone()).await?;
+        match from_toml_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
                 Value { value: UntaggedValue::Table(list), .. } => {
                     for l in list {
@@ -104,15 +82,15 @@ pub fn from_toml(
                 }
                 x => yield ReturnSuccess::value(x),
             },
-            Err(_) => if let Some(last_tag) = latest_tag {
+            Err(_) => {
                 yield Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as TOML",
                     "input cannot be parsed as TOML",
                     &tag,
                     "value originates from here",
-                    last_tag,
+                    concat_string.tag,
                 ))
-            } ,
+            }
         }
     };


@@ -1,7 +1,7 @@
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
+use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
 pub struct FromURL;
@@ -30,32 +30,12 @@ impl WholeStreamCommand for FromURL {
 fn from_url(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let args = args.evaluate_once(registry)?;
     let tag = args.name_tag();
-    let name_span = tag.span;
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            } else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
-
-        let result = serde_urlencoded::from_str::<Vec<(String, String)>>(&concat_string);
+        let concat_string = input.collect_string(tag.clone()).await?;
+        let result = serde_urlencoded::from_str::<Vec<(String, String)>>(&concat_string.item);
         match result {
             Ok(result) => {
@@ -68,15 +48,13 @@ fn from_url(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStrea
                     yield ReturnSuccess::value(row.into_value());
                 }
                 _ => {
-                    if let Some(last_tag) = latest_tag {
-                        yield Err(ShellError::labeled_error_with_secondary(
-                            "String not compatible with url-encoding",
-                            "input not url-encoded",
-                            tag,
-                            "value originates from here",
-                            last_tag,
-                        ));
-                    }
+                    yield Err(ShellError::labeled_error_with_secondary(
+                        "String not compatible with url-encoding",
+                        "input not url-encoded",
+                        tag,
+                        "value originates from here",
+                        concat_string.tag,
+                    ));
                 }
             }
         };


@@ -0,0 +1,99 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::TaggedListBuilder;
use calamine::*;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue};
use std::io::Cursor;
pub struct FromXLSX;
#[derive(Deserialize)]
pub struct FromXLSXArgs {
headerless: bool,
}
impl WholeStreamCommand for FromXLSX {
fn name(&self) -> &str {
"from-xlsx"
}
fn signature(&self) -> Signature {
Signature::build("from-xlsx").switch(
"headerless",
"don't treat the first row as column names",
None,
)
}
fn usage(&self) -> &str {
"Parse binary Excel(.xlsx) data and create table."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, from_xlsx)?.run()
}
}
fn from_xlsx(
FromXLSXArgs {
headerless: _headerless,
}: FromXLSXArgs,
runnable_context: RunnableContext,
) -> Result<OutputStream, ShellError> {
let input = runnable_context.input;
let tag = runnable_context.name;
let stream = async_stream! {
let value = input.collect_binary(tag.clone()).await?;
let mut buf: Cursor<Vec<u8>> = Cursor::new(value.item);
let mut xls = Xlsx::<_>::new(buf).map_err(|_| {
ShellError::labeled_error("Could not load xlsx file", "could not load xlsx file", &tag)
})?;
let mut dict = TaggedDictBuilder::new(&tag);
let sheet_names = xls.sheet_names().to_owned();
for sheet_name in &sheet_names {
let mut sheet_output = TaggedListBuilder::new(&tag);
if let Some(Ok(current_sheet)) = xls.worksheet_range(sheet_name) {
for row in current_sheet.rows() {
let mut row_output = TaggedDictBuilder::new(&tag);
for (i, cell) in row.iter().enumerate() {
let value = match cell {
DataType::Empty => UntaggedValue::nothing(),
DataType::String(s) => UntaggedValue::string(s),
DataType::Float(f) => UntaggedValue::decimal(*f),
DataType::Int(i) => UntaggedValue::int(*i),
DataType::Bool(b) => UntaggedValue::boolean(*b),
_ => UntaggedValue::nothing(),
};
row_output.insert_untagged(&format!("Column{}", i), value);
}
sheet_output.push_untagged(row_output.into_untagged_value());
}
dict.insert_untagged(sheet_name, sheet_output.into_untagged_value());
} else {
yield Err(ShellError::labeled_error(
"Could not load sheet",
"could not load sheet",
&tag,
));
}
}
yield ReturnSuccess::value(dict.into_value());
};
Ok(stream.to_output_stream())
}


@@ -101,34 +101,12 @@ pub fn from_xml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value,
 fn from_xml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let args = args.evaluate_once(registry)?;
     let tag = args.name_tag();
-    let name_span = tag.span;
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            }
-            else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
-
-        match from_xml_string_to_value(concat_string, tag.clone()) {
+        let concat_string = input.collect_string(tag.clone()).await?;
+        match from_xml_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
                 Value { value: UntaggedValue::Table(list), .. } => {
                     for l in list {
@@ -137,13 +115,13 @@ fn from_xml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStrea
                 }
                 x => yield ReturnSuccess::value(x),
             },
-            Err(_) => if let Some(last_tag) = latest_tag {
+            Err(_) => {
                 yield Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as XML",
                     "input cannot be parsed as XML",
                     &tag,
                     "value originates from here",
-                    &last_tag,
+                    &concat_string.tag,
                 ))
             } ,
         }


@@ -121,34 +121,12 @@ pub fn from_yaml_string_to_value(s: String, tag: impl Into<Tag>) -> Result<Value
 fn from_yaml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
     let args = args.evaluate_once(registry)?;
     let tag = args.name_tag();
-    let name_span = tag.span;
     let input = args.input;
     let stream = async_stream! {
-        let values: Vec<Value> = input.values.collect().await;
-
-        let mut concat_string = String::new();
-        let mut latest_tag: Option<Tag> = None;
-
-        for value in values {
-            latest_tag = Some(value.tag.clone());
-            let value_span = value.tag.span;
-
-            if let Ok(s) = value.as_string() {
-                concat_string.push_str(&s);
-            }
-            else {
-                yield Err(ShellError::labeled_error_with_secondary(
-                    "Expected a string from pipeline",
-                    "requires string input",
-                    name_span,
-                    "value originates from here",
-                    value_span,
-                ))
-            }
-        }
-
-        match from_yaml_string_to_value(concat_string, tag.clone()) {
+        let concat_string = input.collect_string(tag.clone()).await?;
+        match from_yaml_string_to_value(concat_string.item, tag.clone()) {
             Ok(x) => match x {
                 Value { value: UntaggedValue::Table(list), .. } => {
                     for l in list {
@@ -157,15 +135,15 @@ fn from_yaml(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStre
                 }
                 x => yield ReturnSuccess::value(x),
            },
-            Err(_) => if let Some(last_tag) = latest_tag {
+            Err(_) => {
                 yield Err(ShellError::labeled_error_with_secondary(
                     "Could not parse as YAML",
                     "input cannot be parsed as YAML",
                     &tag,
                     "value originates from here",
-                    &last_tag,
+                    &concat_string.tag,
                 ))
-            } ,
+            }
         }
     };


@@ -1,7 +1,5 @@
 use crate::commands::WholeStreamCommand;
-use crate::data::base::shape::Shapes;
 use crate::prelude::*;
-use futures_util::pin_mut;
 use indexmap::set::IndexSet;
 use log::trace;
 use nu_errors::ShellError;
@@ -180,23 +178,15 @@ pub fn get_column_path(path: &ColumnPath, obj: &Value) -> Result<Value, ShellErr
 pub fn get(
     GetArgs { rest: mut fields }: GetArgs,
-    RunnableContext { input, .. }: RunnableContext,
+    RunnableContext { mut input, .. }: RunnableContext,
 ) -> Result<OutputStream, ShellError> {
     if fields.is_empty() {
         let stream = async_stream! {
-            let values = input.values;
-            pin_mut!(values);
-
-            let mut shapes = Shapes::new();
-            let mut index = 0;
-
-            while let Some(row) = values.next().await {
-                shapes.add(&row, index);
-                index += 1;
-            }
-
-            for row in shapes.to_values() {
-                yield ReturnSuccess::value(row);
+            let mut vec = input.drain_vec().await;
+
+            let descs = nu_protocol::merge_descriptors(&vec);
+            for desc in descs {
+                yield ReturnSuccess::value(desc);
             }
         };


@@ -0,0 +1,116 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
pub struct Lines;
impl WholeStreamCommand for Lines {
fn name(&self) -> &str {
"lines"
}
fn signature(&self) -> Signature {
Signature::build("lines")
}
fn usage(&self) -> &str {
"Split single string into rows, one per line."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
lines(args, registry)
}
}
fn ends_with_line_ending(st: &str) -> bool {
let mut temp = st.to_string();
let last = temp.pop();
if let Some(c) = last {
c == '\n'
} else {
false
}
}
fn lines(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
let tag = args.name_tag();
let name_span = tag.span;
let mut input = args.input;
let mut leftover = vec![];
let mut leftover_string = String::new();
let stream = async_stream! {
loop {
match input.values.next().await {
Some(Value { value: UntaggedValue::Primitive(Primitive::String(st)), ..}) => {
let mut st = leftover_string.clone() + &st;
leftover.clear();
let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
if !ends_with_line_ending(&st) {
if let Some(last) = lines.pop() {
leftover_string = last;
} else {
leftover_string.clear();
}
} else {
leftover_string.clear();
}
let success_lines: Vec<_> = lines.iter().map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value())).collect();
yield futures::stream::iter(success_lines)
}
Some(Value { value: UntaggedValue::Primitive(Primitive::Line(st)), ..}) => {
let mut st = leftover_string.clone() + &st;
leftover.clear();
let mut lines: Vec<String> = st.lines().map(|x| x.to_string()).collect();
if !ends_with_line_ending(&st) {
if let Some(last) = lines.pop() {
leftover_string = last;
} else {
leftover_string.clear();
}
} else {
leftover_string.clear();
}
let success_lines: Vec<_> = lines.iter().map(|x| ReturnSuccess::value(UntaggedValue::line(x).into_untagged_value())).collect();
yield futures::stream::iter(success_lines)
}
Some( Value { tag: value_span, ..}) => {
yield futures::stream::iter(vec![Err(ShellError::labeled_error_with_secondary(
"Expected a string from pipeline",
"requires string input",
name_span,
"value originates from here",
value_span,
))]);
}
None => {
if !leftover.is_empty() {
let mut st = leftover_string.clone();
if let Ok(extra) = String::from_utf8(leftover) {
st.push_str(&extra);
}
yield futures::stream::iter(vec![ReturnSuccess::value(UntaggedValue::string(st).into_untagged_value())])
}
break;
}
}
}
if !leftover_string.is_empty() {
yield futures::stream::iter(vec![ReturnSuccess::value(UntaggedValue::string(leftover_string).into_untagged_value())]);
}
}
.flatten();
Ok(stream.to_output_stream())
}
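The buffering that `lines` performs can be condensed into one function: prepend the leftover fragment from the previous chunk, emit the complete lines, and carry forward a trailing fragment that lacks a newline. A sketch under that reading of the code above:

```rust
// Split one incoming chunk into complete lines, carrying a partial
// trailing line in `leftover` across calls (as the lines command does
// with leftover_string).
fn split_chunk(leftover: &mut String, chunk: &str) -> Vec<String> {
    let combined = format!("{}{}", leftover, chunk);
    let mut lines: Vec<String> = combined.lines().map(|l| l.to_string()).collect();
    if combined.ends_with('\n') {
        // Chunk ended on a line boundary: nothing to carry forward.
        leftover.clear();
    } else {
        // Last line is incomplete: hold it back for the next chunk.
        *leftover = lines.pop().unwrap_or_default();
    }
    lines
}
```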


@@ -1,7 +1,9 @@
 use crate::commands::WholeStreamCommand;
 use crate::prelude::*;
 use nu_errors::ShellError;
-use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue, Value};
+use nu_protocol::{
+    merge_descriptors, ReturnSuccess, Signature, SyntaxShape, TaggedDictBuilder, UntaggedValue,
+};
 use nu_source::{SpannedItem, Tagged};
 use nu_value_ext::get_data_by_key;
@@ -52,18 +54,6 @@ impl WholeStreamCommand for Pivot {
     }
 }
-fn merge_descriptors(values: &[Value]) -> Vec<String> {
-    let mut ret = vec![];
-    for value in values {
-        for desc in value.data_descriptors() {
-            if !ret.contains(&desc) {
-                ret.push(desc);
-            }
-        }
-    }
-    ret
-}
 pub fn pivot(args: PivotArgs, context: RunnableContext) -> Result<OutputStream, ShellError> {
     let stream = async_stream! {
         let input = context.input.into_vec().await;


@@ -0,0 +1,97 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
pub struct Rename;
#[derive(Deserialize)]
pub struct Arguments {
column_name: Tagged<String>,
rest: Vec<Tagged<String>>,
}
impl WholeStreamCommand for Rename {
fn name(&self) -> &str {
"rename"
}
fn signature(&self) -> Signature {
Signature::build("rename")
.required(
"column_name",
SyntaxShape::String,
"the name of the column to rename for",
)
.rest(
SyntaxShape::Member,
"Additional column name(s) to rename for",
)
}
fn usage(&self) -> &str {
"Creates a new table with columns renamed."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, rename)?.run()
}
}
pub fn rename(
Arguments { column_name, rest }: Arguments,
RunnableContext { input, name, .. }: RunnableContext,
) -> Result<OutputStream, ShellError> {
let mut new_column_names = vec![vec![column_name]];
new_column_names.push(rest);
let new_column_names = new_column_names.into_iter().flatten().collect::<Vec<_>>();
let stream = input
.values
.map(move |item| {
let mut result = VecDeque::new();
if let Value {
value: UntaggedValue::Row(row),
tag,
} = item
{
let mut renamed_row = IndexMap::new();
for (idx, (key, value)) in row.entries.iter().enumerate() {
let key = if idx < new_column_names.len() {
&new_column_names[idx].item
} else {
key
};
renamed_row.insert(key.clone(), value.clone());
}
let out = UntaggedValue::Row(renamed_row.into()).into_value(tag);
result.push_back(ReturnSuccess::value(out));
} else {
result.push_back(ReturnSuccess::value(
UntaggedValue::Error(ShellError::labeled_error(
"no column names available",
"can't rename",
&name,
))
.into_untagged_value(),
));
}
futures::stream::iter(result)
})
.flatten();
Ok(stream.to_output_stream())
}
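The new `rename` command applies its arguments positionally: the first `new_column_names.len()` columns of each row take the supplied names, and any columns beyond that keep their originals. A minimal sketch of that mapping rule, using `Vec<(String, i32)>` pairs as a stand-in for nushell's `IndexMap`-backed rows (an assumption for the sketch):

```rust
// Positional rename rule: column idx gets new_names[idx] when one exists,
// otherwise it keeps its original key. Values are untouched.
fn rename_columns(row: Vec<(String, i32)>, new_names: &[&str]) -> Vec<(String, i32)> {
    row.into_iter()
        .enumerate()
        .map(|(idx, (key, value))| {
            let key = if idx < new_names.len() {
                new_names[idx].to_string()
            } else {
                key
            };
            (key, value)
        })
        .collect()
}

fn main() {
    let row = vec![
        ("a".to_string(), 1),
        ("b".to_string(), 2),
        ("c".to_string(), 3),
    ];
    // Two new names rename the first two columns; "c" is left alone.
    let renamed = rename_columns(row, &["x", "y"]);
    assert_eq!(
        renamed,
        [("x".to_string(), 1), ("y".to_string(), 2), ("c".to_string(), 3)]
    );
}
```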


@@ -7,6 +7,22 @@ use std::path::{Path, PathBuf};
 
 pub struct Save;
 
+macro_rules! process_unknown {
+    ($scope:tt, $input:ident, $name_tag:ident) => {{
+        if $input.len() > 0 {
+            match $input[0] {
+                Value {
+                    value: UntaggedValue::Primitive(Primitive::Binary(_)),
+                    ..
+                } => process_binary!($scope, $input, $name_tag),
+                _ => process_string!($scope, $input, $name_tag),
+            }
+        } else {
+            process_string!($scope, $input, $name_tag)
+        }
+    }};
+}
+
 macro_rules! process_string {
     ($scope:tt, $input:ident, $name_tag:ident) => {{
         let mut result_string = String::new();
@@ -31,6 +47,32 @@ macro_rules! process_string {
     }};
 }
 
+macro_rules! process_binary {
+    ($scope:tt, $input:ident, $name_tag:ident) => {{
+        let mut result_binary: Vec<u8> = Vec::new();
+        for res in $input {
+            match res {
+                Value {
+                    value: UntaggedValue::Primitive(Primitive::Binary(b)),
+                    ..
+                } => {
+                    for u in b.into_iter() {
+                        result_binary.push(u);
+                    }
+                }
+                _ => {
+                    break $scope Err(ShellError::labeled_error(
+                        "Save could not successfully save",
+                        "unexpected data during binary save",
+                        $name_tag,
+                    ));
+                }
+            }
+        }
+        Ok(result_binary)
+    }};
+}
+
 macro_rules! process_string_return_success {
     ($scope:tt, $result_vec:ident, $name_tag:ident) => {{
         let mut result_string = String::new();
@@ -204,10 +246,10 @@ fn save(
                     process_string_return_success!('scope, result_vec, name_tag)
                 }
             } else {
-                process_string!('scope, input, name_tag)
+                process_unknown!('scope, input, name_tag)
            }
         } else {
-            process_string!('scope, input, name_tag)
+            process_unknown!('scope, input, name_tag)
         }
     } else {
         Ok(string_from(&input).into_bytes())
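The new `process_binary!` macro follows a simple rule: concatenate every binary chunk into one output buffer, and abort the save if a non-binary chunk turns up mid-stream. That logic can be sketched without nushell's macros or `Value` type; the `Chunk` enum below is a simplified stand-in introduced for the sketch:

```rust
// Sketch of the process_binary! happy path: binary chunks are concatenated
// into one buffer; any non-binary chunk aborts with an error.
enum Chunk {
    Binary(Vec<u8>),
    Text(String),
}

fn collect_binary(input: Vec<Chunk>) -> Result<Vec<u8>, String> {
    let mut result_binary: Vec<u8> = Vec::new();
    for chunk in input {
        match chunk {
            Chunk::Binary(b) => result_binary.extend(b),
            Chunk::Text(_) => {
                return Err("unexpected data during binary save".to_string());
            }
        }
    }
    Ok(result_binary)
}

fn main() {
    // Two binary chunks concatenate cleanly.
    let ok = collect_binary(vec![Chunk::Binary(vec![1, 2]), Chunk::Binary(vec![3])]);
    assert_eq!(ok, Ok(vec![1, 2, 3]));

    // A stray string chunk fails the whole save.
    let err = collect_binary(vec![Chunk::Text("oops".to_string())]);
    assert!(err.is_err());
}
```

`process_unknown!` sits in front of this and sniffs only the first chunk to decide between the binary and string paths, which is why mixed input errors out here rather than being silently stringified.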


@@ -0,0 +1,69 @@
+use crate::commands::WholeStreamCommand;
+use crate::context::CommandRegistry;
+use crate::prelude::*;
+use nu_errors::ShellError;
+use nu_protocol::{ReturnSuccess, ReturnValue, Signature, SyntaxShape, Value};
+use nu_source::Tagged;
+use rand::seq::SliceRandom;
+use rand::thread_rng;
+
+pub struct Shuffle;
+
+#[derive(Deserialize)]
+pub struct Arguments {
+    #[serde(rename = "num")]
+    limit: Option<Tagged<u64>>,
+}
+
+impl WholeStreamCommand for Shuffle {
+    fn name(&self) -> &str {
+        "shuffle"
+    }
+
+    fn signature(&self) -> Signature {
+        Signature::build("shuffle").named(
+            "num",
+            SyntaxShape::Int,
+            "Limit `num` number of rows",
+            Some('n'),
+        )
+    }
+
+    fn usage(&self) -> &str {
+        "Shuffle rows randomly."
+    }
+
+    fn run(
+        &self,
+        args: CommandArgs,
+        registry: &CommandRegistry,
+    ) -> Result<OutputStream, ShellError> {
+        args.process(registry, shuffle)?.run()
+    }
+}
+
+fn shuffle(
+    Arguments { limit }: Arguments,
+    RunnableContext { input, .. }: RunnableContext,
+) -> Result<OutputStream, ShellError> {
+    let stream = async_stream! {
+        let mut values: Vec<Value> = input.values.collect().await;
+
+        let out = if let Some(n) = limit {
+            let (shuffled, _) = values.partial_shuffle(&mut thread_rng(), *n as usize);
+            shuffled.to_vec()
+        } else {
+            values.shuffle(&mut thread_rng());
+            values.clone()
+        };
+
+        for val in out.into_iter() {
+            yield ReturnSuccess::value(val);
+        }
+    };
+
+    let stream: BoxStream<'static, ReturnValue> = stream.boxed();
+
+    Ok(stream.to_output_stream())
+}
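The `shuffle` command has two paths: with `--num n` it behaves like rand's `partial_shuffle`, emitting only `n` rows in random order; without it, `shuffle` randomizes and emits every row. A std-only sketch of those semantics, with a small LCG standing in for `rand::thread_rng` (nushell itself uses `rand::seq::SliceRandom`):

```rust
// Fisher-Yates that fixes positions from the front and stops once `take`
// rows are chosen, mirroring partial_shuffle's "shuffle only what you need".
fn shuffle_rows(mut rows: Vec<i32>, limit: Option<usize>, seed: &mut u64) -> Vec<i32> {
    let n = rows.len();
    let take = limit.unwrap_or(n).min(n);
    for i in 0..take {
        // Tiny LCG (constants from Knuth's MMIX) standing in for thread_rng.
        *seed = seed
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        let j = i + ((*seed >> 33) as usize) % (n - i);
        rows.swap(i, j);
    }
    rows.truncate(take);
    rows
}

fn main() {
    let mut seed = 42;

    // With a limit: exactly 2 rows come out.
    let limited = shuffle_rows(vec![1, 2, 3, 4, 5], Some(2), &mut seed);
    assert_eq!(limited.len(), 2);

    // Without a limit: every row comes out, i.e. the output is a permutation.
    let mut all = shuffle_rows(vec![1, 2, 3, 4, 5], None, &mut seed);
    all.sort();
    assert_eq!(all, [1, 2, 3, 4, 5]);
}
```

Note the real command collects the whole input stream into a `Vec` before shuffling, so `shuffle` cannot stream its output incrementally.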

Some files were not shown because too many files have changed in this diff.