Compare commits

...

40 Commits

Author SHA1 Message Date
791f7dd9c3 Bump to 0.12.0 (#1538) 2020-04-01 06:25:21 +13:00
a4c1b092ba Add configurations for table headers (#1537)
* Add configurations for table headers

* lint

Co-authored-by: Amanita-Muscaria <nope>
2020-03-31 12:19:48 +13:00
6e71c1008d Change get to remove blanks (#1534)
Remove blank values when getting a column of values
2020-03-30 15:36:21 +13:00
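The idea in the commit above can be sketched as follows. This is an illustrative, stdlib-only analogue — `get_column` and its row representation are hypothetical names, not nushell's actual API:

```rust
// Hypothetical sketch: extracting a column while dropping blank values,
// mirroring the behavior change described in the commit above.
fn get_column(rows: &[Vec<&str>], index: usize) -> Vec<String> {
    rows.iter()
        .filter_map(|row| row.get(index))          // skip rows missing the column
        .filter(|value| !value.trim().is_empty())  // drop blank values
        .map(|value| value.to_string())
        .collect()
}

fn main() {
    let rows = vec![
        vec!["a", "1"],
        vec!["", "2"],
        vec!["c", ""],
    ];
    assert_eq!(get_column(&rows, 0), vec!["a", "c"]);
    assert_eq!(get_column(&rows, 1), vec!["1", "2"]);
    println!("blanks removed");
}
```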
906d0b920f A few improvements to du implementation: (#1533)
1. Fixed a bug where `--all` wasn't showing files at the root directories.
2. More use of `Result`'s `map` and `map_err` methods.
3. Making tables be homogeneous so one can, for example, `get directories`.
2020-03-29 21:16:09 -04:00
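The `map`/`map_err` refactor pattern mentioned in points 2 above can be sketched generically (this is not the actual `du` code, just an illustration of the transformation):

```rust
// Before: an explicit match, duplicating error-mapping logic at each call site.
fn parse_verbose(s: &str) -> Result<u64, String> {
    match s.trim().parse::<u64>() {
        Ok(n) => Ok(n * 2),
        Err(e) => Err(format!("bad size '{}': {}", s, e)),
    }
}

// After: the same logic expressed with `Result::map` and `map_err`,
// so the success and error transformations each live in exactly one place.
fn parse_concise(s: &str) -> Result<u64, String> {
    s.trim()
        .parse::<u64>()
        .map(|n| n * 2)
        .map_err(|e| format!("bad size '{}': {}", s, e))
}

fn main() {
    assert_eq!(parse_verbose("21"), parse_concise("21"));
    assert_eq!(parse_concise("21"), Ok(42));
    assert!(parse_concise("oops").is_err());
    println!("ok");
}
```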
efbf4f48c6 Fix poor message for executable that user doesn't have permissi… (#1535)
Previously, if the user didn't have the appropriate permissions to execute the
binary/script, they would see "command not found", which is confusing.

This commit eliminates the `which` crate in favour of `ichwh`, which deals
better with permissions by not dealing with them at all! This is closer to the
behaviour of `which` in many shells. Permission checks are then left up to the
caller to deal with.
2020-03-29 21:15:55 -04:00
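The lookup behavior described above — find the file on `PATH` without inspecting permission bits, leaving permission errors to the caller — can be sketched like this. `which_no_perm_check` is a hypothetical name, not the `ichwh` API:

```rust
use std::env;
use std::path::PathBuf;

// Sketch: return the first PATH entry containing the name, checking only
// existence, not execute permission. Permission failures then surface when
// the caller actually tries to run the binary.
fn which_no_perm_check(name: &str) -> Option<PathBuf> {
    env::var_os("PATH").and_then(|paths| {
        env::split_paths(&paths)
            .map(|dir| dir.join(name))
            .find(|candidate| candidate.exists()) // existence only, no mode bits
    })
}

fn main() {
    // `sh` exists on virtually every Unix PATH; hedge on other platforms.
    if cfg!(unix) {
        assert!(which_no_perm_check("sh").is_some());
    }
    assert!(which_no_perm_check("definitely-not-a-real-binary-xyz").is_none());
    println!("ok");
}
```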
2ddab3e8ce Some small improvements to du readability.
Mostly, making more use of `map` and `map_err` in `Result`. One benefit
is that at least one location had duplicated logic for how to map the
error, which is no longer the case after this commit.
2020-03-29 17:03:01 -04:00
35dc7438a5 Make use of interruptible stream in various places 2020-03-29 17:03:01 -04:00
2a54ee0c54 Introduce InterruptibleStream type.
An interruptible stream can query an `AtomicBool`. If that bool is true,
the stream will no longer produce any values.

Also introducing the `Interruptible` trait, which extends any `Stream`
with the `interruptible` function, to simplify the construction and
allow chaining.
2020-03-29 17:03:01 -04:00
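The `InterruptibleStream` and `Interruptible` trait described above can be sketched with a synchronous iterator analogue (stdlib-only; the actual implementation wraps an async `Stream`, and these names are illustrative):

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

// Iterator analogue of the InterruptibleStream idea: once the shared
// flag is set, the wrapper yields no further values.
struct InterruptibleIter<I> {
    inner: I,
    interrupt: Arc<AtomicBool>,
}

impl<I: Iterator> Iterator for InterruptibleIter<I> {
    type Item = I::Item;
    fn next(&mut self) -> Option<I::Item> {
        if self.interrupt.load(Ordering::SeqCst) {
            None // interrupted: stop producing values
        } else {
            self.inner.next()
        }
    }
}

// Extension trait mirroring the described `Interruptible` trait, so any
// iterator can be chained with `.interruptible(flag)`.
trait Interruptible: Iterator + Sized {
    fn interruptible(self, interrupt: Arc<AtomicBool>) -> InterruptibleIter<Self> {
        InterruptibleIter { inner: self, interrupt }
    }
}
impl<I: Iterator> Interruptible for I {}

fn main() {
    let flag = Arc::new(AtomicBool::new(false));
    let mut it = (0..).interruptible(flag.clone());
    assert_eq!(it.next(), Some(0));
    flag.store(true, Ordering::SeqCst);
    assert_eq!(it.next(), None); // no more values after interruption
    println!("interrupted");
}
```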
cad2741e9e Split input and output streams into separate modules 2020-03-29 17:03:01 -04:00
ae5f3c8210 WIP: 1486/first row as headers (#1530)
* headers plugin

* Remove plugin

* Add non-functioning headers command

* Add ability to extract headers from first row

* Refactor header extraction

* Rebuild indexmap with proper headers

* Rebuild result properly

* Compiling, probably wrapped too much?

* Refactoring

* Deal with case of empty header cell

* Deal with case of empty header cell

* Fix formatting

* Fix linting, attempt 2.

* Move whole_stream_command(Headers) to more appropriate section

* ... more linting

* Return Err(ShellError...) instead of panic, yield each row instead of entire table

* Insert Column[index] if no header info is found.

* Update error description

* Add initial test

* Add tests for headers command

* Lint test cases in headers

* Change ShellError for headers, Add sample_headers file to utils.rs

* Add empty sheet to test file

* Revert "Add empty sheet to test file"

This reverts commit a4bf38a31d.

* Show error message when given empty table
2020-03-29 15:05:57 +13:00
a5e97ca549 Respect CARGO_TARGET_DIR when set (#1528)
This makes the `binaries` function respect the `CARGO_TARGET_DIR` environment variable when set. If it's not present, it falls back to the regular target directory used by Cargo.
2020-03-27 17:13:59 -04:00
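The fallback described above can be sketched as a small helper (`target_dir` is a hypothetical name for illustration, not the actual `binaries` function):

```rust
use std::env;
use std::path::PathBuf;

// Honor CARGO_TARGET_DIR when set; otherwise fall back to Cargo's
// default `target` directory.
fn target_dir() -> PathBuf {
    env::var("CARGO_TARGET_DIR")
        .map(PathBuf::from)
        .unwrap_or_else(|_| PathBuf::from("target"))
}

fn main() {
    env::remove_var("CARGO_TARGET_DIR");
    assert_eq!(target_dir(), PathBuf::from("target"));
    env::set_var("CARGO_TARGET_DIR", "/tmp/build");
    assert_eq!(target_dir(), PathBuf::from("/tmp/build"));
    println!("ok");
}
```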
06f87cfbe8 Add support for removing multiple files at once (#1526) 2020-03-25 16:19:01 -04:00
d4e78c6f47 Improve the rotated row wrap (#1524) 2020-03-25 06:27:16 +13:00
3653400ebc testing fix to matrix to define all variables (#1522)
there is currently a bug with invalid syntax for some of the
docker build steps, and I think this is because there are build
variables in the matrix that are not defined. This PR will
attempt to resolve this issue by defining all missing variables
for each row in the matrix.

Signed-off-by: vsoch <vsochat@stanford.edu>
2020-03-24 16:20:39 +13:00
81a48d6d0e Fix '/' and '..' not being valid mv targets (#1519)
* Fix '/' and '..' not being valid mv targets

If `/` or `../` is specified as the destination for `mv`, it will fail with an error message saying it's not a valid destination. This fixes it to account for the fact that `Path::file_name` returns `None` when the file name evaluates to `/` or `..`. It will only take the slow(er) path if `Path::file_name` returns `None` in its initial check.

Fixes #1291

* Add test
2020-03-24 14:00:48 +13:00
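The fix above hinges on a documented quirk of the standard library: `Path::file_name` returns `None` whenever the path terminates in the root or in `..`, which is exactly the case `mv` was mishandling:

```rust
use std::ffi::OsStr;
use std::path::Path;

fn main() {
    // `file_name` is `None` for the root and for paths ending in `..`.
    assert_eq!(Path::new("/").file_name(), None);
    assert_eq!(Path::new("..").file_name(), None);
    assert_eq!(Path::new("../").file_name(), None);
    // An ordinary path still yields its final component.
    assert_eq!(Path::new("/tmp/file.txt").file_name(), Some(OsStr::new("file.txt")));
    println!("ok");
}
```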
f030ab3f12 Add experimental auto-rotate (#1516) 2020-03-23 09:55:30 +13:00
0dc0c6a10a Add quickstart option to Docker section in README (#1515) 2020-03-23 09:18:50 +13:00
53c8185af3 Fixes the crash for ps --full in Windows (#1514)
* Fixes the crash for `ps --full` in Windows

* Update ps.rs
2020-03-23 08:28:02 +13:00
36b5d063c1 Simplify and improve listing for which. (#1510)
* Simplified implementation
* Show executables, even if the current user doesn't have permissions to
  execute them.
2020-03-22 09:11:39 -04:00
a7ec00a037 Add documentation for from-ics and from-vcf (#1509) 2020-03-21 14:50:13 +13:00
918822ae0d Fix numeric comparison with nothing (#1508) 2020-03-21 11:02:49 +13:00
ab5e24a0e7 WIP: Add vcard/ical support (#1504)
* Initial from-ical implementation

* Initial from-vcard implementation

* Rename from-ics and from-vcf for autoconvert

* Remove redundant clones

* Add from-vcf and from-ics tests

Co-authored-by: Jonathan Turner <jonathandturner@users.noreply.github.com>
2020-03-21 08:35:09 +13:00
b5ea522f0e Add a --full mode to ps (#1507)
* Add a --full mode to ps

* Use a slightly older heim
2020-03-20 20:53:49 +13:00
afa963fd50 Add is_dir check to auto-cd (#1506)
* Add markdown output

* Add is_dir() check
2020-03-20 16:57:36 +13:00
1e343ff00c Add markdown output (#1503) 2020-03-20 08:18:24 +13:00
21a543a901 Make sum plugin as internal command. (#1501) 2020-03-18 18:46:00 -05:00
390deb4ff7 Windows needs to remember auto-cd paths when changing drives (#1500)
* Windows needs to remember auto-cd paths when changing drives

* Windows needs to remember auto-cd paths when changing drives
2020-03-18 15:10:45 +13:00
1c4cb30d64 Add documentation for skip and skip-while (#1499) 2020-03-18 14:22:35 +13:00
1ec2ec72b5 Add automatic change directory (#1496)
* Allow automatic cd in cli mode

* Set correct priority for auto-cd and add test
2020-03-18 07:13:38 +13:00
0d244a9701 Open fails silently, fix #1493 (#1495)
* Fix #1493

The error was wrongfully discarded

* Run cargo fmt
2020-03-17 17:40:04 +13:00
b36d21e76f Infer types from regular delimited plain text unstructured files. (#1494)
* Infer types from regular delimited plain text unstructured files.

* Nothing resolves to an empty string.
2020-03-16 15:50:45 -05:00
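The per-field inference described above — and the "Nothing resolves to an empty string" rule — can be sketched minimally. The `Inferred` enum and `infer` function here are hypothetical, standing in for nushell's actual value types:

```rust
// Hypothetical sketch of per-field type inference for delimited text:
// try integer, then float, fall back to string; an empty field is "nothing".
#[derive(Debug, PartialEq)]
enum Inferred {
    Nothing, // renders as an empty string on output
    Int(i64),
    Float(f64),
    Str(String),
}

fn infer(field: &str) -> Inferred {
    let t = field.trim();
    if t.is_empty() {
        Inferred::Nothing
    } else if let Ok(i) = t.parse::<i64>() {
        Inferred::Int(i)
    } else if let Ok(f) = t.parse::<f64>() {
        Inferred::Float(f)
    } else {
        Inferred::Str(t.to_string())
    }
}

fn main() {
    assert_eq!(infer("42"), Inferred::Int(42));
    assert_eq!(infer("3.5"), Inferred::Float(3.5));
    assert_eq!(infer("hello"), Inferred::Str("hello".into()));
    assert_eq!(infer(""), Inferred::Nothing);
    println!("ok");
}
```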
d8c4565413 Csv errors (#1490)
* Add error message for csv parsing failures

* Add csv error prettyfier

* Improve readability of the error

`Line 2: error` is easier to understand than
`Line 2, error`

* Remove unnecessary use of the format! macro

Replacing it with .to_string() fixes a clippy warning

* Improve consistency with JSON parsing errors
2020-03-16 12:32:02 -05:00
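The clippy point above is the `useless_format` lint: `format!` with a single bare `{}` argument is equivalent to calling `.to_string()`, so the macro adds nothing. A minimal illustration:

```rust
fn main() {
    let line = 2;
    // Flagged by clippy (useless_format): no real formatting is happening.
    let a = format!("{}", line);
    // Preferred: to_string() produces the identical result without the macro.
    let b = line.to_string();
    assert_eq!(a, b);
    println!("Line {}: error", line); // the improved message layout
}
```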
22ba4c2a2f Add svg support to to-html (#1492) 2020-03-16 20:19:18 +13:00
8d19b21b9f Custom canonicalize method on Filesystem Shell. (#1485)
* Custom canonicalize method for FilesystemShell.

* Use custom canonicalize method.
Fixed missing import.

* Move function body to already impl body.

* Create test that aims to resolve.
2020-03-16 19:28:18 +13:00
45a3afdc79 Update Cargo.toml 2020-03-16 06:12:28 +13:00
2d078849cb Add simple to-html output and bump version (#1487) 2020-03-15 16:04:44 +13:00
b6363f3ce1 Added new flag '--all/-a' for Ls, also refactor some code (#1483)
* Utility function to detect hidden folders.
Implemented for Unix and Windows.

* Rename function argument.

* Revert "Rename function argument."

This reverts commit e7ab70f0f0.

* Add flag '--all/-a' to Ls

* Rename function argument.

* Check if flag '--all/-a' is present and path is hidden.
Replace match with map_err for glob result.
Remove redundancy in stream body.
Included comments on new stream body.
Replace async_stream::stream with async_stream::try_stream.
Minor tweaks to is_empty_dir.
Fix and refactor is_hidden_dir.

* Fix "implicit" bool coercion

* Fixed clippy errors
2020-03-14 06:27:04 +13:00
5ca9e12b7f Fix whitespace and typos (#1481)
* Remove EOL whitespace in files other than docs

* Break paragraphs into lines

See http://rhodesmill.org/brandon/2012/one-sentence-per-line/ for the rationale

* Fix various typos

* Remove EOL whitespace in docs/commands/*.md
2020-03-14 06:23:41 +13:00
5b0b2f1ddd Fixes #1204: sys | get host.users displays the same user account twice while only one exists (macOS) (#1480)

- renamed host.users to host.sessions
2020-03-12 14:01:55 +13:00
3afb53b8ce fix typo in calc command documentation (#1477)
minimumum -> minimum
2020-03-11 11:20:22 -04:00
158 changed files with 4048 additions and 2154 deletions


@ -59,4 +59,3 @@ steps:
- bash: cargo fmt --all -- --check
condition: eq(variables['style'], 'fmt')
displayName: Lint


@ -1,3 +1,3 @@
[build]
#rustflags = ["--cfg", "coloring_in_tokens"]
#rustflags = ["--cfg", "data_processing_primitives"]


@ -38,7 +38,7 @@ workflows:
extra_build_args: --cache-from=quay.io/nushell/nu-base:devel
filters:
branches:
ignore:
ignore:
- master
before_build:
- pull_cache


@ -52,15 +52,15 @@ jobs:
- glibc
- musl
include:
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true }
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true }
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true }
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, }
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, }
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, }
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, }
- { tag: alpine, base-image: alpine, arch: x86_64-unknown-linux-musl, plugin: true, use-patch: false}
- { tag: slim, base-image: 'debian:stable-slim', arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: debian, base-image: debian, arch: x86_64-unknown-linux-gnu, plugin: true, use-patch: false}
- { tag: glibc-busybox, base-image: 'busybox:glibc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: musl-busybox, base-image: 'busybox:musl', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: musl-distroless, base-image: 'gcr.io/distroless/static', arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
- { tag: glibc-distroless, base-image: 'gcr.io/distroless/cc', arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: true }
- { tag: glibc, base-image: scratch, arch: x86_64-unknown-linux-gnu, plugin: false, use-patch: false}
- { tag: musl, base-image: scratch, arch: x86_64-unknown-linux-musl, plugin: false, use-patch: false}
steps:
- uses: actions/checkout@v1
- uses: actions/download-artifact@master

775 Cargo.lock (generated): file diff suppressed because it is too large.


@ -1,8 +1,8 @@
[package]
name = "nu"
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "A shell for the GitHub era"
version = "0.12.0"
authors = ["The Nu Project Contributors"]
description = "A new kind of shell"
license = "MIT"
edition = "2018"
readme = "README.md"
@ -18,26 +18,25 @@ members = ["crates/*/"]
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
[dependencies]
nu-cli = { version = "0.11.0", path = "./crates/nu-cli" }
nu-source = { version = "0.11.0", path = "./crates/nu-source" }
nu-plugin = { version = "0.11.0", path = "./crates/nu-plugin" }
nu-protocol = { version = "0.11.0", path = "./crates/nu-protocol" }
nu-errors = { version = "0.11.0", path = "./crates/nu-errors" }
nu-parser = { version = "0.11.0", path = "./crates/nu-parser" }
nu-value-ext = { version = "0.11.0", path = "./crates/nu-value-ext" }
nu_plugin_average = { version = "0.11.0", path = "./crates/nu_plugin_average", optional=true }
nu_plugin_binaryview = { version = "0.11.0", path = "./crates/nu_plugin_binaryview", optional=true }
nu_plugin_fetch = { version = "0.11.0", path = "./crates/nu_plugin_fetch", optional=true }
nu_plugin_inc = { version = "0.11.0", path = "./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.11.0", path = "./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.11.0", path = "./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.11.0", path = "./crates/nu_plugin_ps", optional=true }
nu_plugin_str = { version = "0.11.0", path = "./crates/nu_plugin_str", optional=true }
nu_plugin_sum = { version = "0.11.0", path = "./crates/nu_plugin_sum", optional=true }
nu_plugin_sys = { version = "0.11.0", path = "./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.11.0", path = "./crates/nu_plugin_textview", optional=true }
nu_plugin_tree = { version = "0.11.0", path = "./crates/nu_plugin_tree", optional=true }
nu-macros = { version = "0.11.0", path = "./crates/nu-macros" }
nu-cli = { version = "0.12.0", path = "./crates/nu-cli" }
nu-source = { version = "0.12.0", path = "./crates/nu-source" }
nu-plugin = { version = "0.12.0", path = "./crates/nu-plugin" }
nu-protocol = { version = "0.12.0", path = "./crates/nu-protocol" }
nu-errors = { version = "0.12.0", path = "./crates/nu-errors" }
nu-parser = { version = "0.12.0", path = "./crates/nu-parser" }
nu-value-ext = { version = "0.12.0", path = "./crates/nu-value-ext" }
nu_plugin_average = { version = "0.12.0", path = "./crates/nu_plugin_average", optional=true }
nu_plugin_binaryview = { version = "0.12.0", path = "./crates/nu_plugin_binaryview", optional=true }
nu_plugin_fetch = { version = "0.12.0", path = "./crates/nu_plugin_fetch", optional=true }
nu_plugin_inc = { version = "0.12.0", path = "./crates/nu_plugin_inc", optional=true }
nu_plugin_match = { version = "0.12.0", path = "./crates/nu_plugin_match", optional=true }
nu_plugin_post = { version = "0.12.0", path = "./crates/nu_plugin_post", optional=true }
nu_plugin_ps = { version = "0.12.0", path = "./crates/nu_plugin_ps", optional=true }
nu_plugin_str = { version = "0.12.0", path = "./crates/nu_plugin_str", optional=true }
nu_plugin_sys = { version = "0.12.0", path = "./crates/nu_plugin_sys", optional=true }
nu_plugin_textview = { version = "0.12.0", path = "./crates/nu_plugin_textview", optional=true }
nu_plugin_tree = { version = "0.12.0", path = "./crates/nu_plugin_tree", optional=true }
nu-macros = { version = "0.12.0", path = "./crates/nu-macros" }
crossterm = { version = "0.16.0", optional = true }
onig_sys = { version = "=69.1.0", optional = true }
@ -54,19 +53,19 @@ pretty_env_logger = "0.4.0"
[dev-dependencies]
pretty_assertions = "0.6.1"
nu-test-support = { version = "0.11.0", path = "./crates/nu-test-support" }
nu-test-support = { version = "0.12.0", path = "./crates/nu-test-support" }
[build-dependencies]
toml = "0.5.6"
serde = { version = "1.0.104", features = ["derive"] }
nu-build = { version = "0.11.0", path = "./crates/nu-build" }
serde = { version = "1.0.105", features = ["derive"] }
nu-build = { version = "0.12.0", path = "./crates/nu-build" }
[features]
# Test executables
test-bins = []
default = ["sys", "ps", "textview", "inc", "str"]
stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "sum", "post", "fetch", "clipboard-cli"]
stable = ["default", "starship-prompt", "binaryview", "match", "tree", "average", "post", "fetch", "clipboard-cli"]
# Default
textview = ["crossterm", "syntect", "onig_sys", "url", "nu_plugin_textview"]
@ -81,7 +80,6 @@ binaryview = ["nu_plugin_binaryview"]
fetch = ["nu_plugin_fetch"]
match = ["nu_plugin_match"]
post = ["nu_plugin_post"]
sum = ["nu_plugin_sum"]
trace = ["nu-parser/trace"]
tree = ["nu_plugin_tree"]
@ -167,11 +165,6 @@ name = "nu_plugin_stable_post"
path = "src/plugins/nu_plugin_stable_post.rs"
required-features = ["post"]
[[bin]]
name = "nu_plugin_stable_sum"
path = "src/plugins/nu_plugin_stable_sum.rs"
required-features = ["sum"]
[[bin]]
name = "nu_plugin_stable_tree"
path = "src/plugins/nu_plugin_stable_tree.rs"


@ -13,15 +13,22 @@ A new type of shell.
# Status
This project has reached a minimum-viable product level of quality. While contributors dogfood it as their daily driver, it may be unstable for some commands. Future releases will work to fill out missing features and improve stability. Its design is also subject to change as it matures.
This project has reached a minimum-viable product level of quality.
While contributors dogfood it as their daily driver, it may be unstable for some commands.
Future releases will work to fill out missing features and improve stability.
Its design is also subject to change as it matures.
Nu comes with a set of built-in commands (listed below). If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and macOS), correctly passing through stdin, stdout, and stderr, so things like your daily git workflows and even `vim` will work just fine.
Nu comes with a set of built-in commands (listed below).
If a command is unknown, the command will shell-out and execute it (using cmd on Windows or bash on Linux and macOS), correctly passing through stdin, stdout, and stderr, so things like your daily git workflows and even `vim` will work just fine.
# Learning more
There are a few good resources to learn about Nu. There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress. The book focuses on using Nu and its core concepts.
There are a few good resources to learn about Nu.
There is a [book](https://www.nushell.sh/book/) about Nu that is currently in progress.
The book focuses on using Nu and its core concepts.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started. There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
If you're a developer who would like to contribute to Nu, we're also working on a [book for developers](https://www.nushell.sh/contributor-book/) to help you get started.
There are also [good first issues](https://github.com/nushell/nushell/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22) to help you dive in.
We also have an active [Discord](https://discord.gg/NtAbbGn) and [Twitter](https://twitter.com/nu_shell) if you'd like to come and chat with us.
@ -63,6 +70,16 @@ cargo build --workspace --features=stable
## Docker
### Quickstart
Want to try Nu right away? Execute the following to get started.
```bash
docker run -it quay.io/nushell/nu:latest
```
### Guide
If you want to pull a pre-built container, you can browse tags for the [nushell organization](https://quay.io/organization/nushell)
on Quay.io. Pulling a container would come down to:
@ -107,11 +124,18 @@ The second container is a bit smaller if the size is important to you.
# Philosophy
Nu draws inspiration from projects like PowerShell, functional programming languages, and modern CLI tools. Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure. For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory. These values can be piped through a series of steps, in a series of commands called a 'pipeline'.
Nu draws inspiration from projects like PowerShell, functional programming languages, and modern CLI tools.
Rather than thinking of files and services as raw streams of text, Nu looks at each input as something with structure.
For example, when you list the contents of a directory, what you get back is a table of rows, where each row represents an item in that directory.
These values can be piped through a series of steps, in a series of commands called a 'pipeline'.
## Pipelines
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps. Nu takes this a step further and builds heavily on the idea of _pipelines_. Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin. Additionally, commands can output structured data (you can think of this as a third kind of stream). Commands that work in the pipeline fit into one of three categories:
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps.
Nu takes this a step further and builds heavily on the idea of _pipelines_.
Just as the Unix philosophy, Nu allows commands to output from stdout and read from stdin.
Additionally, commands can output structured data (you can think of this as a third kind of stream).
Commands that work in the pipeline fit into one of three categories:
* Commands that produce a stream (eg, `ls`)
* Commands that filter a stream (eg, `where type == "Directory"`)
@ -135,13 +159,15 @@ Commands are separated by the pipe symbol (`|`) to denote a pipeline flowing lef
────┴───────────┴───────────┴──────────┴────────┴──────────────┴────────────────
```
Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed. We could have also written the above:
Because most of the time you'll want to see the output of a pipeline, `autoview` is assumed.
We could have also written the above:
```
/home/jonathan/Source/nushell(master)> ls | where type == Directory
```
Being able to use the same commands and compose them differently is an important philosophy in Nu. For example, we could use the built-in `ps` command as well to get a list of the running processes, using the same `where` as above.
Being able to use the same commands and compose them differently is an important philosophy in Nu.
For example, we could use the built-in `ps` command as well to get a list of the running processes, using the same `where` as above.
```text
/home/jonathan/Source/nushell(master)> ps | where cpu > 0
@ -157,7 +183,8 @@ Being able to use the same commands and compose them differently is an important
## Opening files
Nu can load file and URL contents as raw text or as structured data (if it recognizes the format). For example, you can load a .toml file as structured data and explore it:
Nu can load file and URL contents as raw text or as structured data (if it recognizes the format).
For example, you can load a .toml file as structured data and explore it:
```
/home/jonathan/Source/nushell(master)> open Cargo.toml
@ -210,19 +237,26 @@ To set one of these variables, you can use `config --set`. For example:
## Shells
Nu will work inside of a single directory and allow you to navigate around your filesystem by default. Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time.
Nu will work inside of a single directory and allow you to navigate around your filesystem by default.
Nu also offers a way of adding additional working directories that you can jump between, allowing you to work in multiple directories at the same time.
To do so, use the `enter` command, which will allow you create a new "shell" and enter it at the specified path. You can toggle between this new shell and the original shell with the `p` (for previous) and `n` (for next), allowing you to navigate around a ring buffer of shells. Once you're done with a shell, you can `exit` it and remove it from the ring buffer.
To do so, use the `enter` command, which will allow you create a new "shell" and enter it at the specified path.
You can toggle between this new shell and the original shell with the `p` (for previous) and `n` (for next), allowing you to navigate around a ring buffer of shells.
Once you're done with a shell, you can `exit` it and remove it from the ring buffer.
Finally, to get a list of all the current shells, you can use the `shells` command.
## Plugins
Nu supports plugins that offer additional functionality to the shell and follow the same structured data model that built-in commands use. This allows you to extend nu for your needs.
Nu supports plugins that offer additional functionality to the shell and follow the same structured data model that built-in commands use.
This allows you to extend nu for your needs.
There are a few examples in the `plugins` directory.
Plugins are binaries that are available in your path and follow a `nu_plugin_*` naming convention. These binaries interact with nu via a simple JSON-RPC protocol where the command identifies itself and passes along its configuration, which then makes it available for use. If the plugin is a filter, data streams to it one element at a time, and it can stream data back in return via stdin/stdout. If the plugin is a sink, it is given the full vector of final data and is given free rein over stdin/stdout to use as it pleases.
Plugins are binaries that are available in your path and follow a `nu_plugin_*` naming convention.
These binaries interact with nu via a simple JSON-RPC protocol where the command identifies itself and passes along its configuration, which then makes it available for use.
If the plugin is a filter, data streams to it one element at a time, and it can stream data back in return via stdin/stdout.
If the plugin is a sink, it is given the full vector of final data and is given free rein over stdin/stdout to use as it pleases.
# Goals
@ -240,9 +274,9 @@ Nu adheres closely to a set of goals that make up its design philosophy. As feat
# Commands
You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references).
You can find a list of Nu commands, complete with documentation, in [quick command references](https://www.nushell.sh/documentation.html#quick-command-references).
# License
The project is made available under the MIT license. See "LICENSE" for more information.
The project is made available under the MIT license. See the `LICENSE` file for more information.


@ -1,6 +1,6 @@
[package]
name = "nu-build"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core build system for nushell"


@ -1,6 +1,6 @@
[package]
name = "nu-cli"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
description = "CLI for nushell"
edition = "2018"
@ -10,21 +10,21 @@ license = "MIT"
doctest = false
[dependencies]
nu-source = { version = "0.11.0", path = "../nu-source" }
nu-plugin = { version = "0.11.0", path = "../nu-plugin" }
nu-protocol = { version = "0.11.0", path = "../nu-protocol" }
nu-errors = { version = "0.11.0", path = "../nu-errors" }
nu-parser = { version = "0.11.0", path = "../nu-parser" }
nu-value-ext = { version = "0.11.0", path = "../nu-value-ext" }
nu-macros = { version = "0.11.0", path = "../nu-macros" }
nu-test-support = { version = "0.11.0", path = "../nu-test-support" }
nu-source = { version = "0.12.0", path = "../nu-source" }
nu-plugin = { version = "0.12.0", path = "../nu-plugin" }
nu-protocol = { version = "0.12.0", path = "../nu-protocol" }
nu-errors = { version = "0.12.0", path = "../nu-errors" }
nu-parser = { version = "0.12.0", path = "../nu-parser" }
nu-value-ext = { version = "0.12.0", path = "../nu-value-ext" }
nu-macros = { version = "0.12.0", path = "../nu-macros" }
nu-test-support = { version = "0.12.0", path = "../nu-test-support" }
ansi_term = "0.12.1"
app_dirs = "1.2.1"
async-stream = "0.2"
base64 = "0.11"
base64 = "0.12.0"
bigdecimal = { version = "0.1.0", features = ["serde"] }
bson = { version = "0.14.0", features = ["decimal128"] }
bson = { version = "0.14.1", features = ["decimal128"] }
byte-unit = "3.0.3"
bytes = "0.5.4"
calamine = "0.16"
@ -41,9 +41,11 @@ futures = { version = "0.3", features = ["compat", "io-compat"] }
futures-util = "0.3.4"
futures_codec = "0.4"
getset = "0.1.0"
git2 = { version = "0.11.0", default_features = false }
git2 = { version = "0.13.0", default_features = false }
glob = "0.3.0"
hex = "0.4"
htmlescape = "0.3.1"
ical = "0.6.*"
ichwh = "0.3"
indexmap = { version = "1.3.2", features = ["serde-1"] }
itertools = "0.9.0"
@ -65,9 +67,9 @@ ptree = {version = "0.2" }
query_interface = "0.3.5"
rand = "0.7"
regex = "1"
roxmltree = "0.9.1"
roxmltree = "0.10.0"
rustyline = "6.0.0"
serde = { version = "1.0.104", features = ["derive"] }
serde = { version = "1.0.105", features = ["derive"] }
serde-hjson = "0.9.1"
serde_bytes = "0.11.3"
serde_ini = "0.2.0"
@ -85,10 +87,9 @@ trash = "1.0.0"
typetag = "0.1.4"
umask = "0.1"
unicode-xid = "0.2.0"
which = "3.1.1"
clipboard = { version = "0.5", optional = true }
starship = { version = "0.37.0", optional = true }
starship = { version = "0.38.0", optional = true }
[target.'cfg(unix)'.dependencies]
users = "0.9"
@ -101,7 +102,7 @@ features = ["bundled", "blob"]
pretty_assertions = "0.6.1"
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
nu-build = { version = "0.12.0", path = "../nu-build" }
[features]
stable = []


@ -10,7 +10,10 @@ use crate::prelude::*;
use futures_codec::FramedRead;
use nu_errors::ShellError;
use nu_parser::{ClassifiedPipeline, PipelineShape, SpannedToken, TokensIterator};
use nu_parser::{
ClassifiedCommand, ClassifiedPipeline, ExternalCommand, PipelineShape, SpannedToken,
TokensIterator,
};
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use log::{debug, log_enabled, trace};
@ -308,14 +311,18 @@ pub fn create_default_context(
whole_stream_command(Shuffle),
whole_stream_command(Wrap),
whole_stream_command(Pivot),
whole_stream_command(Headers),
// Data processing
whole_stream_command(Histogram),
whole_stream_command(Sum),
// File format output
whole_stream_command(ToBSON),
whole_stream_command(ToCSV),
whole_stream_command(ToHTML),
whole_stream_command(ToJSON),
whole_stream_command(ToSQLite),
whole_stream_command(ToDB),
whole_stream_command(ToMarkdown),
whole_stream_command(ToTOML),
whole_stream_command(ToTSV),
whole_stream_command(ToURL),
@ -336,6 +343,8 @@ pub fn create_default_context(
whole_stream_command(FromXML),
whole_stream_command(FromYAML),
whole_stream_command(FromYML),
whole_stream_command(FromIcs),
whole_stream_command(FromVcf),
]);
cfg_if::cfg_if! {
@ -365,7 +374,7 @@ pub async fn run_pipeline_standalone(
redirect_stdin: bool,
context: &mut Context,
) -> Result<(), Box<dyn Error>> {
let line = process_line(Ok(pipeline), context, redirect_stdin).await;
let line = process_line(Ok(pipeline), context, redirect_stdin, false).await;
match line {
LineResult::Success(line) => {
@ -514,7 +523,7 @@ pub async fn cli() -> Result<(), Box<dyn Error>> {
initial_command = None;
}
let line = process_line(readline, &mut context, false).await;
let line = process_line(readline, &mut context, false, true).await;
// Check the config to see if we need to update the path
// TODO: make sure config is cached so we don't path this load every call
@ -597,6 +606,7 @@ async fn process_line(
readline: Result<String, ReadlineError>,
ctx: &mut Context,
redirect_stdin: bool,
cli_mode: bool,
) -> LineResult {
match &readline {
Ok(line) if line.trim() == "" => LineResult::Success(line.clone()),
@@ -615,12 +625,68 @@ async fn process_line(
debug!("=== Parsed ===");
debug!("{:#?}", result);
let pipeline = classify_pipeline(&result, ctx, &Text::from(line));
let pipeline = classify_pipeline(&result, &ctx, &Text::from(line));
if let Some(failure) = pipeline.failed {
return LineResult::Error(line.to_string(), failure.into());
}
// There's a special case to check before we process the pipeline:
// If we're given a path by itself
// ...and it's not a command in the path
// ...and it doesn't have any arguments
// ...and we're in the CLI
// ...then change to this directory
if cli_mode && pipeline.commands.list.len() == 1 {
if let ClassifiedCommand::External(ExternalCommand {
ref name, ref args, ..
}) = pipeline.commands.list[0]
{
if dunce::canonicalize(name).is_ok()
&& PathBuf::from(name).is_dir()
&& ichwh::which(name).await.unwrap_or(None).is_none()
&& args.list.is_empty()
{
// Here we work differently on Windows because of the expected Windows behavior
#[cfg(windows)]
{
if name.ends_with(':') {
// This looks like a drive shortcut. We need to a) switch drives and b) go back to the previous directory we were viewing on that drive
// But first, we need to save where we are now
let current_path = ctx.shell_manager.path();
let split_path: Vec<_> = current_path.split(':').collect();
if split_path.len() > 1 {
ctx.windows_drives_previous_cwd
.lock()
.insert(split_path[0].to_string(), current_path);
}
let name = name.to_uppercase();
let new_drive: Vec<_> = name.split(':').collect();
if let Some(val) =
ctx.windows_drives_previous_cwd.lock().get(new_drive[0])
{
ctx.shell_manager.set_path(val.to_string());
return LineResult::Success(line.to_string());
} else {
ctx.shell_manager.set_path(name.to_string());
return LineResult::Success(line.to_string());
}
} else {
ctx.shell_manager.set_path(name.to_string());
return LineResult::Success(line.to_string());
}
}
#[cfg(not(windows))]
{
ctx.shell_manager.set_path(name.to_string());
return LineResult::Success(line.to_string());
}
}
}
}
let input_stream = if redirect_stdin {
let file = futures::io::AllowStdIo::new(std::io::stdin());
let stream = FramedRead::new(file, MaybeTextCodec).map(|line| {
@@ -675,9 +741,8 @@ async fn process_line(
break;
}
}
_ => {
break;
}
Ok(None) => break,
Err(e) => return LineResult::Error(line.to_string(), e),
}
}
}
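The drive-shortcut branch above saves the current directory under its drive letter before jumping, so typing a bare `D:` later returns to the last directory visited on that drive. A minimal sketch of that bookkeeping with a plain `HashMap` instead of the context's locked map (the helper name and shape here are illustrative, not nushell's API):

```rust
use std::collections::HashMap;

/// Record the current directory under its drive letter, then resolve
/// where a bare drive shortcut like "d:" should take us.
fn switch_drive(
    previous_cwd: &mut HashMap<String, String>,
    current_path: &str,
    target: &str, // e.g. "d:" typed by the user
) -> String {
    // Save where we are now, keyed by the current drive letter.
    if let Some((drive, _)) = current_path.split_once(':') {
        previous_cwd.insert(drive.to_string(), current_path.to_string());
    }
    // Go back to the last directory seen on the target drive, if any.
    let target = target.to_uppercase();
    let drive = target.trim_end_matches(':').to_string();
    previous_cwd.get(&drive).cloned().unwrap_or(target)
}

fn main() {
    let mut prev = HashMap::new();
    // First visit to D: just lands on the bare drive.
    assert_eq!(switch_drive(&mut prev, r"C:\Users", "d:"), "D:");
    // Coming back to C: restores the saved directory.
    assert_eq!(switch_drive(&mut prev, r"D:\data", "c:"), r"C:\Users");
}
```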

View File

@@ -30,6 +30,7 @@ pub(crate) mod first;
pub(crate) mod format;
pub(crate) mod from_bson;
pub(crate) mod from_csv;
pub(crate) mod from_ics;
pub(crate) mod from_ini;
pub(crate) mod from_json;
pub(crate) mod from_ods;
@@ -38,11 +39,13 @@ pub(crate) mod from_ssv;
pub(crate) mod from_toml;
pub(crate) mod from_tsv;
pub(crate) mod from_url;
pub(crate) mod from_vcf;
pub(crate) mod from_xlsx;
pub(crate) mod from_xml;
pub(crate) mod from_yaml;
pub(crate) mod get;
pub(crate) mod group_by;
pub(crate) mod headers;
pub(crate) mod help;
pub(crate) mod histogram;
pub(crate) mod history;
@@ -81,13 +84,16 @@ pub(crate) mod sort_by;
pub(crate) mod split_by;
pub(crate) mod split_column;
pub(crate) mod split_row;
pub(crate) mod sum;
#[allow(unused)]
pub(crate) mod t_sort_by;
pub(crate) mod table;
pub(crate) mod tags;
pub(crate) mod to_bson;
pub(crate) mod to_csv;
pub(crate) mod to_html;
pub(crate) mod to_json;
pub(crate) mod to_md;
pub(crate) mod to_sqlite;
pub(crate) mod to_toml;
pub(crate) mod to_tsv;
@@ -133,6 +139,7 @@ pub(crate) use first::First;
pub(crate) use format::Format;
pub(crate) use from_bson::FromBSON;
pub(crate) use from_csv::FromCSV;
pub(crate) use from_ics::FromIcs;
pub(crate) use from_ini::FromINI;
pub(crate) use from_json::FromJSON;
pub(crate) use from_ods::FromODS;
@@ -142,12 +149,14 @@ pub(crate) use from_ssv::FromSSV;
pub(crate) use from_toml::FromTOML;
pub(crate) use from_tsv::FromTSV;
pub(crate) use from_url::FromURL;
pub(crate) use from_vcf::FromVcf;
pub(crate) use from_xlsx::FromXLSX;
pub(crate) use from_xml::FromXML;
pub(crate) use from_yaml::FromYAML;
pub(crate) use from_yaml::FromYML;
pub(crate) use get::Get;
pub(crate) use group_by::GroupBy;
pub(crate) use headers::Headers;
pub(crate) use help::Help;
pub(crate) use histogram::Histogram;
pub(crate) use history::History;
@@ -185,13 +194,16 @@ pub(crate) use sort_by::SortBy;
pub(crate) use split_by::SplitBy;
pub(crate) use split_column::SplitColumn;
pub(crate) use split_row::SplitRow;
pub(crate) use sum::Sum;
#[allow(unused_imports)]
pub(crate) use t_sort_by::TSortBy;
pub(crate) use table::Table;
pub(crate) use tags::Tags;
pub(crate) use to_bson::ToBSON;
pub(crate) use to_csv::ToCSV;
pub(crate) use to_html::ToHTML;
pub(crate) use to_json::ToJSON;
pub(crate) use to_md::ToMarkdown;
pub(crate) use to_sqlite::ToDB;
pub(crate) use to_sqlite::ToSQLite;
pub(crate) use to_toml::ToTOML;

View File

@@ -176,6 +176,76 @@ pub fn autoview(context: RunnableContext) -> Result<OutputStream, ShellError> {
Value { value: UntaggedValue::Error(e), .. } => {
yield Err(e);
}
Value { value: UntaggedValue::Row(row), ..} => {
use prettytable::format::{FormatBuilder, LinePosition, LineSeparator};
use prettytable::{color, Attr, Cell, Row, Table};
use crate::data::value::{format_leaf, style_leaf};
use textwrap::fill;
let termwidth = std::cmp::max(textwrap::termwidth(), 20);
enum TableMode {
Light,
Normal,
}
let mut table = Table::new();
let table_mode = crate::data::config::config(Tag::unknown());
let table_mode = if let Some(s) = table_mode?.get("table_mode") {
match s.as_string() {
Ok(typ) if typ == "light" => TableMode::Light,
_ => TableMode::Normal,
}
} else {
TableMode::Normal
};
match table_mode {
TableMode::Light => {
table.set_format(
FormatBuilder::new()
.separator(LinePosition::Title, LineSeparator::new('─', '─', ' ', ' '))
.padding(1, 1)
.build(),
);
}
_ => {
table.set_format(
FormatBuilder::new()
.column_separator('│')
.separator(LinePosition::Top, LineSeparator::new('─', '┬', ' ', ' '))
.separator(LinePosition::Title, LineSeparator::new('─', '┼', ' ', ' '))
.separator(LinePosition::Bottom, LineSeparator::new('─', '┴', ' ', ' '))
.padding(1, 1)
.build(),
);
}
}
let mut max_key_len = 0;
for (key, _) in row.entries.iter() {
max_key_len = std::cmp::max(max_key_len, key.chars().count());
}
if max_key_len > (termwidth/2 - 1) {
max_key_len = termwidth/2 - 1;
}
let max_val_len = termwidth - max_key_len - 5;
for (key, value) in row.entries.iter() {
table.add_row(Row::new(vec![Cell::new(&fill(&key, max_key_len)).with_style(Attr::ForegroundColor(color::GREEN)).with_style(Attr::Bold),
Cell::new(&fill(&format_leaf(value).plain_string(100_000), max_val_len))]));
}
table.printstd();
// table.print_term(&mut *context.host.lock().out_terminal().ok_or_else(|| ShellError::untagged_runtime_error("Could not open terminal for output"))?)
// .map_err(|_| ShellError::untagged_runtime_error("Internal error: could not print to terminal (for unix systems check to make sure TERM is set)"))?;
}
Value { value: ref item, .. } => {
if let Some(table) = table {
let mut stream = VecDeque::new();
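The row-rendering arm above clamps the key column to at most half the terminal width and gives the value column the remainder, minus borders and padding. That sizing arithmetic on its own (the helper name is ours; the constants mirror the snippet):

```rust
/// Split `termwidth` between a key column and a value column: keys get
/// at most termwidth/2 - 1, values get the remainder minus 5 for
/// borders and padding, matching the autoview snippet.
fn column_widths(termwidth: usize, longest_key: usize) -> (usize, usize) {
    let termwidth = termwidth.max(20); // same floor as the snippet
    let max_key = longest_key.min(termwidth / 2 - 1);
    (max_key, termwidth - max_key - 5)
}

fn main() {
    // A short key keeps its natural width.
    assert_eq!(column_widths(40, 11), (11, 24));
    // A very long key is clamped to termwidth/2 - 1.
    assert_eq!(column_widths(40, 30), (19, 16));
    // Narrow terminals are floored at 20 columns.
    assert_eq!(column_widths(10, 4), (4, 11));
}
```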

View File

@@ -94,7 +94,7 @@ pub fn nu_value_to_string(command: &ExternalCommand, from: &Value) -> Result<Str
}
}
pub(crate) fn run_external_command(
pub(crate) async fn run_external_command(
command: ExternalCommand,
context: &mut Context,
input: Option<InputStream>,
@@ -102,7 +102,7 @@ pub(crate) fn run_external_command(
) -> Result<Option<InputStream>, ShellError> {
trace!(target: "nu::run::external", "-> {}", command.name);
if !did_find_command(&command.name) {
if !did_find_command(&command.name).await {
return Err(ShellError::labeled_error(
"Command not found",
"command not found",
@@ -633,22 +633,22 @@ fn spawn(
Ok(Some(stream.to_input_stream()))
} else {
Err(ShellError::labeled_error(
"Command not found",
"command not found",
"Failed to spawn process",
"failed to spawn",
&command.name_tag,
))
}
}
fn did_find_command(name: &str) -> bool {
async fn did_find_command(name: &str) -> bool {
#[cfg(not(windows))]
{
which::which(name).is_ok()
ichwh::which(name).await.unwrap_or(None).is_some()
}
#[cfg(windows)]
{
if which::which(name).is_ok() {
if ichwh::which(name).await.unwrap_or(None).is_some() {
true
} else {
let cmd_builtins = [
@@ -738,7 +738,9 @@ mod tests {
let mut ctx = Context::basic().expect("There was a problem creating a basic context.");
assert!(run_external_command(cmd, &mut ctx, None, false).is_err());
assert!(run_external_command(cmd, &mut ctx, None, false)
.await
.is_err());
Ok(())
}
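The switch from `which` to `ichwh` above means lookup succeeds for any matching file on `PATH`, leaving permission failures to surface at spawn time with a clearer "failed to spawn" message. A rough, synchronous, std-only sketch of that style of lookup (`ichwh` itself is async and more thorough; this helper is ours):

```rust
use std::env;
use std::path::PathBuf;

/// Find the first file named `name` on PATH, without checking whether
/// it is executable -- permission problems are the spawner's to report.
fn find_on_path(name: &str) -> Option<PathBuf> {
    let path = env::var_os("PATH")?;
    env::split_paths(&path)
        .map(|dir| dir.join(name))
        .find(|candidate| candidate.is_file())
}

fn main() {
    // Whether anything is found depends on the environment, so just
    // demonstrate the call shape.
    match find_on_path("sh") {
        Some(p) => println!("found: {}", p.display()),
        None => println!("not found"),
    }
}
```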

View File

@@ -35,11 +35,11 @@ pub(crate) async fn run_pipeline(
}
(Some(ClassifiedCommand::External(left)), None) => {
run_external_command(left, ctx, input, true)?
run_external_command(left, ctx, input, true).await?
}
(Some(ClassifiedCommand::External(left)), _) => {
run_external_command(left, ctx, input, false)?
run_external_command(left, ctx, input, false).await?
}
(None, _) => break,

View File

@@ -9,7 +9,6 @@ use nu_errors::ShellError;
use nu_protocol::{CallInfo, ReturnSuccess, Signature, SyntaxShape, UntaggedValue, Value};
use nu_source::Tagged;
use std::path::PathBuf;
use std::sync::atomic::Ordering;
const NAME: &str = "du";
const GLOB_PARAMS: MatchOptions = MatchOptions {
@@ -42,7 +41,7 @@ impl PerItemCommand for Du {
.optional("path", SyntaxShape::Pattern, "starting directory")
.switch(
"all",
"Output File sizes as well as directory sizes",
"Output file sizes as well as directory sizes",
Some('a'),
)
.switch(
@@ -90,51 +89,33 @@ impl PerItemCommand for Du {
fn du(args: DuArgs, ctx: &RunnablePerItemContext) -> Result<OutputStream, ShellError> {
let tag = ctx.name.clone();
let exclude = args
.exclude
.clone()
.map_or(Ok(None), move |x| match Pattern::new(&x.item) {
Ok(p) => Ok(Some(p)),
Err(e) => Err(ShellError::labeled_error(
e.msg,
"Glob error",
x.tag.clone(),
)),
})?;
let path = args.path.clone();
let filter_files = path.is_none();
let paths = match path {
Some(p) => match glob::glob_with(
p.item.to_str().expect("Why isn't this encoded properly?"),
GLOB_PARAMS,
) {
Ok(g) => Ok(g),
Err(e) => Err(ShellError::labeled_error(
e.msg,
"Glob error",
p.tag.clone(),
)),
},
None => match glob::glob_with("*", GLOB_PARAMS) {
Ok(g) => Ok(g),
Err(e) => Err(ShellError::labeled_error(e.msg, "Glob error", tag.clone())),
},
}?
let exclude = args.exclude.map_or(Ok(None), move |x| {
Pattern::new(&x.item)
.map(Option::Some)
.map_err(|e| ShellError::labeled_error(e.msg, "glob error", x.tag.clone()))
})?;
let include_files = args.all;
let paths = match args.path {
Some(p) => {
let p = p.item.to_str().expect("Why isn't this encoded properly?");
glob::glob_with(p, GLOB_PARAMS)
}
None => glob::glob_with("*", GLOB_PARAMS),
}
.map_err(|e| ShellError::labeled_error(e.msg, "glob error", tag.clone()))?
.filter(move |p| {
if filter_files {
if include_files {
true
} else {
match p {
Ok(f) if f.is_dir() => true,
Err(e) if e.path().is_dir() => true,
_ => false,
}
} else {
true
}
})
.map(move |p| match p {
Err(e) => Err(glob_err_into(e)),
Ok(s) => Ok(s),
});
.map(|v| v.map_err(glob_err_into));
let ctrl_c = ctx.ctrl_c.clone();
let all = args.all;
@@ -142,35 +123,29 @@ fn du(args: DuArgs, ctx: &RunnablePerItemContext) -> Result<OutputStream, ShellE
let max_depth = args.max_depth.map(|f| f.item);
let min_size = args.min_size.map(|f| f.item);
let stream = async_stream! {
let params = DirBuilder {
tag: tag.clone(),
min: min_size,
deref,
ex: exclude,
all,
};
for path in paths {
if ctrl_c.load(Ordering::SeqCst) {
break;
}
match path {
Ok(p) => {
if p.is_dir() {
yield Ok(ReturnSuccess::Value(
DirInfo::new(p, &params, max_depth).into(),
));
} else {
match FileInfo::new(p, deref, tag.clone()) {
Ok(f) => yield Ok(ReturnSuccess::Value(f.into())),
Err(e) => yield Err(e)
}
}
}
Err(e) => yield Err(e),
}
}
let params = DirBuilder {
tag: tag.clone(),
min: min_size,
deref,
exclude,
all,
};
let stream = futures::stream::iter(paths)
.interruptible(ctrl_c)
.map(move |path| match path {
Ok(p) => {
if p.is_dir() {
Ok(ReturnSuccess::Value(
DirInfo::new(p, &params, max_depth).into(),
))
} else {
FileInfo::new(p, deref, tag.clone()).map(|v| ReturnSuccess::Value(v.into()))
}
}
Err(e) => Err(e),
});
Ok(stream.to_output_stream())
}
@@ -178,7 +153,7 @@ struct DirBuilder {
tag: Tag,
min: Option<u64>,
deref: bool,
ex: Option<Pattern>,
exclude: Option<Pattern>,
all: bool,
}
@@ -243,15 +218,11 @@ impl DirInfo {
for f in d {
match f {
Ok(i) => match i.file_type() {
Ok(t) if t.is_dir() => {
s = s.add_dir(i.path(), depth, &params);
}
Ok(_t) => {
s = s.add_file(i.path(), &params);
}
Err(e) => s = s.add_error(ShellError::from(e)),
Ok(t) if t.is_dir() => s = s.add_dir(i.path(), depth, &params),
Ok(_t) => s = s.add_file(i.path(), &params),
Err(e) => s = s.add_error(e.into()),
},
Err(e) => s = s.add_error(ShellError::from(e)),
Err(e) => s = s.add_error(e.into()),
}
}
}
@@ -283,8 +254,11 @@ impl DirInfo {
fn add_file(mut self, f: impl Into<PathBuf>, params: &DirBuilder) -> Self {
let f = f.into();
let ex = params.ex.as_ref().map_or(false, |x| x.matches_path(&f));
if !ex {
let include = params
.exclude
.as_ref()
.map_or(true, |x| !x.matches_path(&f));
if include {
match FileInfo::new(f, params.deref, self.tag.clone()) {
Ok(file) => {
let inc = params.min.map_or(true, |s| file.size >= s);
@@ -313,55 +287,51 @@ fn glob_err_into(e: GlobError) -> ShellError {
ShellError::from(e)
}
fn value_from_vec<V>(vec: Vec<V>, tag: &Tag) -> Value
where
V: Into<Value>,
{
if vec.is_empty() {
UntaggedValue::nothing()
} else {
let values = vec.into_iter().map(Into::into).collect::<Vec<Value>>();
UntaggedValue::Table(values)
}
.retag(tag)
}
impl From<DirInfo> for Value {
fn from(d: DirInfo) -> Self {
let mut r: IndexMap<String, Value> = IndexMap::new();
r.insert(
"path".to_string(),
UntaggedValue::path(d.path).retag(d.tag.clone()),
UntaggedValue::path(d.path).retag(&d.tag),
);
r.insert(
"apparent".to_string(),
UntaggedValue::bytes(d.size).retag(d.tag.clone()),
UntaggedValue::bytes(d.size).retag(&d.tag),
);
r.insert(
"physical".to_string(),
UntaggedValue::bytes(d.blocks).retag(d.tag.clone()),
UntaggedValue::bytes(d.blocks).retag(&d.tag),
);
if !d.files.is_empty() {
let v = Value {
value: UntaggedValue::Table(
d.files
.into_iter()
.map(move |f| f.into())
.collect::<Vec<Value>>(),
),
tag: d.tag.clone(),
};
r.insert("files".to_string(), v);
}
if !d.dirs.is_empty() {
let v = Value {
value: UntaggedValue::Table(
d.dirs
.into_iter()
.map(move |d| d.into())
.collect::<Vec<Value>>(),
),
tag: d.tag.clone(),
};
r.insert("directories".to_string(), v);
}
r.insert("directories".to_string(), value_from_vec(d.dirs, &d.tag));
r.insert("files".to_string(), value_from_vec(d.files, &d.tag));
if !d.errors.is_empty() {
let v = Value {
value: UntaggedValue::Table(
d.errors
.into_iter()
.map(move |e| UntaggedValue::Error(e).into_untagged_value())
.collect::<Vec<Value>>(),
),
tag: d.tag.clone(),
};
let v = UntaggedValue::Table(
d.errors
.into_iter()
.map(move |e| UntaggedValue::Error(e).into_untagged_value())
.collect::<Vec<Value>>(),
)
.retag(&d.tag);
r.insert("errors".to_string(), v);
}
@@ -375,22 +345,32 @@ impl From<DirInfo> for Value {
impl From<FileInfo> for Value {
fn from(f: FileInfo) -> Self {
let mut r: IndexMap<String, Value> = IndexMap::new();
r.insert(
"path".to_string(),
UntaggedValue::path(f.path).retag(f.tag.clone()),
UntaggedValue::path(f.path).retag(&f.tag),
);
r.insert(
"apparent".to_string(),
UntaggedValue::bytes(f.size).retag(f.tag.clone()),
UntaggedValue::bytes(f.size).retag(&f.tag),
);
let b = match f.blocks {
Some(k) => UntaggedValue::bytes(k).retag(f.tag.clone()),
None => UntaggedValue::nothing().retag(f.tag.clone()),
};
let b = f
.blocks
.map(UntaggedValue::bytes)
.unwrap_or_else(UntaggedValue::nothing)
.retag(&f.tag);
r.insert("physical".to_string(), b);
Value {
value: UntaggedValue::row(r),
tag: f.tag,
}
r.insert(
"directories".to_string(),
UntaggedValue::nothing().retag(&f.tag),
);
r.insert("files".to_string(), UntaggedValue::nothing().retag(&f.tag));
UntaggedValue::row(r).retag(&f.tag)
}
}
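The `.interruptible(ctrl_c)` call in the rewritten `du` above wraps the path stream in the new `InterruptibleStream` type, which stops yielding once a shared `AtomicBool` flips. The same idea, sketched as a synchronous iterator adapter (the real type wraps a `futures::Stream`; names here are ours):

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

/// An iterator that stops producing values once the flag is set,
/// e.g. by a Ctrl-C handler on another thread.
struct Interruptible<I> {
    inner: I,
    interrupted: Arc<AtomicBool>,
}

impl<I: Iterator> Iterator for Interruptible<I> {
    type Item = I::Item;
    fn next(&mut self) -> Option<I::Item> {
        if self.interrupted.load(Ordering::SeqCst) {
            None
        } else {
            self.inner.next()
        }
    }
}

fn interruptible<I: Iterator>(inner: I, flag: Arc<AtomicBool>) -> Interruptible<I> {
    Interruptible { inner, interrupted: flag }
}

fn main() {
    let flag = Arc::new(AtomicBool::new(false));
    let mut it = interruptible(0..10, Arc::clone(&flag));
    assert_eq!(it.next(), Some(0));
    flag.store(true, Ordering::SeqCst); // simulate Ctrl-C
    assert_eq!(it.next(), None);        // no further values
}
```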

View File

@@ -1,42 +1,12 @@
use crate::prelude::*;
use csv::ReaderBuilder;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, TaggedDictBuilder, UntaggedValue, Value};
use nu_parser::hir::syntax_shape::{ExpandContext, SignatureRegistry};
use nu_parser::utils::{parse_line_with_separator as parse, LineSeparatedShape};
use nu_parser::TokensIterator;
use nu_protocol::{ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use nu_source::nom_input;
fn from_delimited_string_to_value(
s: String,
headerless: bool,
separator: char,
tag: impl Into<Tag>,
) -> Result<Value, csv::Error> {
let mut reader = ReaderBuilder::new()
.has_headers(!headerless)
.delimiter(separator as u8)
.from_reader(s.as_bytes());
let tag = tag.into();
let headers = if headerless {
(1..=reader.headers()?.len())
.map(|i| format!("Column{}", i))
.collect::<Vec<String>>()
} else {
reader.headers()?.iter().map(String::from).collect()
};
let mut rows = vec![];
for row in reader.records() {
let mut tagged_row = TaggedDictBuilder::new(&tag);
for (value, header) in row?.iter().zip(headers.iter()) {
tagged_row.insert_value(
header,
UntaggedValue::Primitive(Primitive::String(String::from(value))).into_value(&tag),
)
}
rows.push(tagged_row.into_value());
}
Ok(UntaggedValue::Table(rows).into_value(&tag))
}
use derive_new::new;
pub fn from_delimited_data(
headerless: bool,
@@ -50,15 +20,19 @@ pub fn from_delimited_data(
let concat_string = input.collect_string(name_tag.clone()).await?;
match from_delimited_string_to_value(concat_string.item, headerless, sep, name_tag.clone()) {
Ok(x) => match x {
Value { value: UntaggedValue::Table(list), .. } => {
for l in list {
yield ReturnSuccess::value(l);
Ok(rows) => {
for row in rows {
match row {
Value { value: UntaggedValue::Table(list), .. } => {
for l in list {
yield ReturnSuccess::value(l);
}
}
x => yield ReturnSuccess::value(x),
}
}
x => yield ReturnSuccess::value(x),
},
Err(_) => {
Err(err) => {
let line_one = format!("Could not parse as {}", format_name);
let line_two = format!("input cannot be parsed as {}", format_name);
yield Err(ShellError::labeled_error_with_secondary(
@@ -74,3 +48,122 @@ pub fn from_delimited_data(
Ok(stream.to_output_stream())
}
#[derive(Debug, Clone, new)]
pub struct EmptyRegistry {
#[new(default)]
signatures: indexmap::IndexMap<String, Signature>,
}
impl EmptyRegistry {}
impl SignatureRegistry for EmptyRegistry {
fn has(&self, _name: &str) -> bool {
false
}
fn get(&self, _name: &str) -> Option<Signature> {
None
}
fn clone_box(&self) -> Box<dyn SignatureRegistry> {
Box::new(self.clone())
}
}
fn from_delimited_string_to_value(
s: String,
headerless: bool,
sep: char,
tag: impl Into<Tag>,
) -> Result<Vec<Value>, ShellError> {
let tag = tag.into();
let mut entries = s.lines();
let mut fields = vec![];
let mut out = vec![];
if let Some(first_entry) = entries.next() {
let tokens = match parse(&sep.to_string(), nom_input(first_entry)) {
Ok((_, tokens)) => tokens,
Err(err) => return Err(ShellError::parse_error(err)),
};
let tokens_span = tokens.span;
let source: nu_source::Text = tokens_span.slice(&first_entry).into();
if !headerless {
fields = tokens
.item
.iter()
.filter(|token| !token.is_separator())
.map(|field| field.source(&source).to_string())
.collect::<Vec<_>>();
}
let registry = Box::new(EmptyRegistry::new());
let ctx = ExpandContext::new(registry, &source, None);
let mut iterator = TokensIterator::new(&tokens.item, ctx, tokens_span);
let (results, tokens_identified) = iterator.expand(LineSeparatedShape);
let results = results?;
let mut row = TaggedDictBuilder::new(&tag);
if headerless {
let fallback_columns = (1..=tokens_identified)
.map(|i| format!("Column{}", i))
.collect::<Vec<String>>();
for (idx, field) in results.into_iter().enumerate() {
let key = if headerless {
&fallback_columns[idx]
} else {
&fields[idx]
};
row.insert_value(key, field.into_value(&tag));
}
out.push(row.into_value())
}
}
for entry in entries {
let tokens = match parse(&sep.to_string(), nom_input(entry)) {
Ok((_, tokens)) => tokens,
Err(err) => return Err(ShellError::parse_error(err)),
};
let tokens_span = tokens.span;
let source: nu_source::Text = tokens_span.slice(&entry).into();
let registry = Box::new(EmptyRegistry::new());
let ctx = ExpandContext::new(registry, &source, None);
let mut iterator = TokensIterator::new(&tokens.item, ctx, tokens_span);
let (results, tokens_identified) = iterator.expand(LineSeparatedShape);
let results = results?;
let mut row = TaggedDictBuilder::new(&tag);
let fallback_columns = (1..=tokens_identified)
.map(|i| format!("Column{}", i))
.collect::<Vec<String>>();
for (idx, field) in results.into_iter().enumerate() {
let key = if headerless {
&fallback_columns[idx]
} else {
match fields.get(idx) {
Some(key) => key,
None => &fallback_columns[idx],
}
};
row.insert_value(key, field.into_value(&tag));
}
out.push(row.into_value())
}
Ok(out)
}
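Both branches of the parser above fall back to generated `Column1`, `Column2`, ... names when the input is headerless or a header is missing at some index. That naming rule in isolation (a hypothetical helper, not part of the commit):

```rust
/// Pick a name for each of `count` columns: the parsed header if one
/// exists at that index, otherwise a generated "ColumnN" fallback.
fn column_names(headers: &[String], count: usize, headerless: bool) -> Vec<String> {
    (0..count)
        .map(|idx| {
            if headerless {
                format!("Column{}", idx + 1)
            } else {
                headers
                    .get(idx)
                    .cloned()
                    .unwrap_or_else(|| format!("Column{}", idx + 1))
            }
        })
        .collect()
}

fn main() {
    let headers = vec!["name".to_string(), "size".to_string()];
    // Headers cover the first two columns; the third falls back.
    assert_eq!(
        column_names(&headers, 3, false),
        vec!["name", "size", "Column3"]
    );
    // Headerless input generates every name.
    assert_eq!(column_names(&headers, 2, true), vec!["Column1", "Column2"]);
}
```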

View File

@@ -0,0 +1,240 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::ical::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::io::BufReader;
pub struct FromIcs;
impl WholeStreamCommand for FromIcs {
fn name(&self) -> &str {
"from-ics"
}
fn signature(&self) -> Signature {
Signature::build("from-ics")
}
fn usage(&self) -> &str {
"Parse text as .ics and create table."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_ics(args, registry)
}
}
fn from_ics(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
let tag = args.name_tag();
let input = args.input;
let stream = async_stream! {
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.as_bytes();
let buf_reader = BufReader::new(input_bytes);
let parser = ical::IcalParser::new(buf_reader);
for calendar in parser {
match calendar {
Ok(c) => yield ReturnSuccess::value(calendar_to_value(c, tag.clone())),
Err(_) => yield Err(ShellError::labeled_error(
"Could not parse as .ics",
"input cannot be parsed as .ics",
tag.clone()
)),
}
}
};
Ok(stream.to_output_stream())
}
fn calendar_to_value(calendar: IcalCalendar, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(calendar.properties, tag.clone()),
);
row.insert_untagged("events", events_to_value(calendar.events, tag.clone()));
row.insert_untagged("alarms", alarms_to_value(calendar.alarms, tag.clone()));
row.insert_untagged("to-Dos", todos_to_value(calendar.todos, tag.clone()));
row.insert_untagged(
"journals",
journals_to_value(calendar.journals, tag.clone()),
);
row.insert_untagged(
"free-busys",
free_busys_to_value(calendar.free_busys, tag.clone()),
);
row.insert_untagged("timezones", timezones_to_value(calendar.timezones, tag));
row.into_value()
}
fn events_to_value(events: Vec<IcalEvent>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&events
.into_iter()
.map(|event| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(event.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(event.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn alarms_to_value(alarms: Vec<IcalAlarm>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&alarms
.into_iter()
.map(|alarm| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(alarm.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn todos_to_value(todos: Vec<IcalTodo>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&todos
.into_iter()
.map(|todo| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(todo.properties, tag.clone()),
);
row.insert_untagged("alarms", alarms_to_value(todo.alarms, tag.clone()));
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn journals_to_value(journals: Vec<IcalJournal>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&journals
.into_iter()
.map(|journal| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(journal.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn free_busys_to_value(free_busys: Vec<IcalFreeBusy>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&free_busys
.into_iter()
.map(|free_busy| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(free_busy.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezones_to_value(timezones: Vec<IcalTimeZone>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&timezones
.into_iter()
.map(|timezone| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(timezone.properties, tag.clone()),
);
row.insert_untagged(
"transitions",
timezone_transitions_to_value(timezone.transitions, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn timezone_transitions_to_value(
transitions: Vec<IcalTimeZoneTransition>,
tag: Tag,
) -> UntaggedValue {
UntaggedValue::table(
&transitions
.into_iter()
.map(|transition| {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged(
"properties",
properties_to_value(transition.properties, tag.clone()),
);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}

View File

@@ -0,0 +1,102 @@
extern crate ical;
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use ical::parser::vcard::component::*;
use ical::property::Property;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, TaggedDictBuilder, UntaggedValue, Value};
use std::io::BufReader;
pub struct FromVcf;
impl WholeStreamCommand for FromVcf {
fn name(&self) -> &str {
"from-vcf"
}
fn signature(&self) -> Signature {
Signature::build("from-vcf")
}
fn usage(&self) -> &str {
"Parse text as .vcf and create table."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
from_vcf(args, registry)
}
}
fn from_vcf(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
let tag = args.name_tag();
let input = args.input;
let stream = async_stream! {
let input_string = input.collect_string(tag.clone()).await?.item;
let input_bytes = input_string.as_bytes();
let buf_reader = BufReader::new(input_bytes);
let parser = ical::VcardParser::new(buf_reader);
for contact in parser {
match contact {
Ok(c) => yield ReturnSuccess::value(contact_to_value(c, tag.clone())),
Err(_) => yield Err(ShellError::labeled_error(
"Could not parse as .vcf",
"input cannot be parsed as .vcf",
tag.clone()
)),
}
}
};
Ok(stream.to_output_stream())
}
fn contact_to_value(contact: VcardContact, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag.clone());
row.insert_untagged("properties", properties_to_value(contact.properties, tag));
row.into_value()
}
fn properties_to_value(properties: Vec<Property>, tag: Tag) -> UntaggedValue {
UntaggedValue::table(
&properties
.into_iter()
.map(|prop| {
let mut row = TaggedDictBuilder::new(tag.clone());
let name = UntaggedValue::string(prop.name);
let value = match prop.value {
Some(val) => UntaggedValue::string(val),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
let params = match prop.params {
Some(param_list) => params_to_value(param_list, tag.clone()).into(),
None => UntaggedValue::Primitive(Primitive::Nothing),
};
row.insert_untagged("name", name);
row.insert_untagged("value", value);
row.insert_untagged("params", params);
row.into_value()
})
.collect::<Vec<Value>>(),
)
}
fn params_to_value(params: Vec<(String, Vec<String>)>, tag: Tag) -> Value {
let mut row = TaggedDictBuilder::new(tag);
for (param_name, param_values) in params {
let values: Vec<Value> = param_values.into_iter().map(|val| val.into()).collect();
let values = UntaggedValue::table(&values);
row.insert_untagged(param_name, values);
}
row.into_value()
}

View File

@@ -4,8 +4,8 @@ use indexmap::set::IndexSet;
use log::trace;
use nu_errors::ShellError;
use nu_protocol::{
did_you_mean, ColumnPath, PathMember, ReturnSuccess, ReturnValue, Signature, SyntaxShape,
UnspannedPathMember, UntaggedValue, Value,
did_you_mean, ColumnPath, PathMember, Primitive, ReturnSuccess, ReturnValue, Signature,
SyntaxShape, UnspannedPathMember, UntaggedValue, Value,
};
use nu_source::span_for_spanned_list;
use nu_value_ext::get_data_by_column_path;
@@ -221,6 +221,10 @@ pub fn get(
result.push_back(ReturnSuccess::value(item.clone()));
}
}
Value {
value: UntaggedValue::Primitive(Primitive::Nothing),
..
} => {}
other => result.push_back(ReturnSuccess::value(other.clone())),
},
Err(reason) => result.push_back(ReturnSuccess::value(

View File

@@ -0,0 +1,80 @@
use crate::commands::WholeStreamCommand;
use crate::context::CommandRegistry;
use crate::prelude::*;
use futures::stream::StreamExt;
use indexmap::IndexMap;
use nu_errors::ShellError;
use nu_protocol::Dictionary;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue, Value};
pub struct Headers;
#[derive(Deserialize)]
pub struct HeadersArgs {}
impl WholeStreamCommand for Headers {
fn name(&self) -> &str {
"headers"
}
fn signature(&self) -> Signature {
Signature::build("headers")
}
fn usage(&self) -> &str {
"Use the first row of the table as column names"
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
args.process(registry, headers)?.run()
}
}
pub fn headers(
HeadersArgs {}: HeadersArgs,
RunnableContext { input, .. }: RunnableContext,
) -> Result<OutputStream, ShellError> {
let stream = async_stream! {
let rows: Vec<Value> = input.values.collect().await;
if rows.len() < 1 {
yield Err(ShellError::untagged_runtime_error("Couldn't find headers, was the input a properly formatted, non-empty table?"));
}
//the headers are the first row in the table
let headers: Vec<String> = match &rows[0].value {
UntaggedValue::Row(d) => {
Ok(d.entries.iter().map(|(k, v)| {
match v.as_string() {
Ok(s) => s,
Err(_) => { //If a cell that should contain a header name is empty, we name the column Column[index]
match d.entries.get_full(k) {
Some((index, _, _)) => format!("Column{}", index),
None => "unknownColumn".to_string()
}
}
}
}).collect())
}
_ => Err(ShellError::unexpected_eof("Could not get headers, is the table empty?", rows[0].tag.span))
}?;
//Each row is a dictionary with the headers as keys
for r in rows.iter().skip(1) {
match &r.value {
UntaggedValue::Row(d) => {
let mut i = 0;
let mut entries = IndexMap::new();
for (_, v) in d.entries.iter() {
entries.insert(headers[i].clone(), v.clone());
i += 1;
}
yield Ok(ReturnSuccess::Value(UntaggedValue::Row(Dictionary{entries}).into_value(r.tag.clone())))
}
_ => yield Err(ShellError::unexpected_eof("Couldn't iterate through rows, was the input a properly formatted table?", r.tag.span))
}
}
};
Ok(stream.to_output_stream())
}
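The `headers` command above promotes the first row of a table to column names and re-keys every following row against them. The core transformation on plain data (a simplified, std-only sketch; nushell's real rows are tagged dictionaries, not string vectors):

```rust
/// Turn a raw table into (headers, records): the first row becomes the
/// column names, the remaining rows are zipped against those names.
fn apply_headers(
    rows: Vec<Vec<String>>,
) -> Option<(Vec<String>, Vec<Vec<(String, String)>>)> {
    let mut rows = rows.into_iter();
    let headers = rows.next()?; // empty input: no headers to take
    let records = rows
        .map(|row| headers.iter().cloned().zip(row).collect())
        .collect();
    Some((headers, records))
}

fn main() {
    let table = vec![
        vec!["name".to_string(), "size".to_string()],
        vec!["a.txt".to_string(), "12".to_string()],
    ];
    let (headers, records) = apply_headers(table).unwrap();
    assert_eq!(headers, vec!["name", "size"]);
    assert_eq!(
        records,
        vec![vec![
            ("name".to_string(), "a.txt".to_string()),
            ("size".to_string(), "12".to_string())
        ]]
    );
}
```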

View File

@@ -87,8 +87,8 @@ Here are some tips to help you get started.
* help commands - list all available commands
* help <command name> - display help about a particular command
Nushell works on the idea of a "pipeline". Pipelines are commands connected with the '|' character. Each stage
in the pipeline works together to load, parse, and display information to you.
Nushell works on the idea of a "pipeline". Pipelines are commands connected with the '|' character.
Each stage in the pipeline works together to load, parse, and display information to you.
[Examples]


@@ -10,6 +10,7 @@ pub struct Ls;
#[derive(Deserialize)]
pub struct LsArgs {
pub path: Option<Tagged<PathBuf>>,
pub all: bool,
pub full: bool,
#[serde(rename = "short-names")]
pub short_names: bool,
@@ -29,6 +30,7 @@ impl PerItemCommand for Ls {
SyntaxShape::Pattern,
"a path to get the directory contents from",
)
.switch("all", "also show hidden files", Some('a'))
.switch(
"full",
"list all available columns for each entry",


@@ -80,7 +80,7 @@ fn pick(
"No data to fetch.",
format!("Couldn't pick column \"{}\"", column),
path_member_tried.span,
format!("How about exploring it with \"get\"? Check the input is appropiate originating from here"),
format!("How about exploring it with \"get\"? Check the input is appropriate originating from here"),
obj_source.tag.span)
}


@@ -10,7 +10,7 @@ pub struct Remove;
#[derive(Deserialize)]
pub struct RemoveArgs {
pub target: Tagged<PathBuf>,
pub rest: Vec<Tagged<PathBuf>>,
pub recursive: Tagged<bool>,
pub trash: Tagged<bool>,
}
@@ -22,17 +22,17 @@ impl PerItemCommand for Remove {
fn signature(&self) -> Signature {
Signature::build("rm")
.required("path", SyntaxShape::Pattern, "the file path to remove")
.switch(
"trash",
"use the platform's recycle bin instead of permanently deleting",
Some('t'),
)
.switch("recursive", "delete subdirectories recursively", Some('r'))
.rest(SyntaxShape::Pattern, "the file path(s) to remove")
}
fn usage(&self) -> &str {
"Remove a file"
"Remove file(s)"
}
fn run(


@@ -0,0 +1,55 @@
use crate::commands::WholeStreamCommand;
use crate::prelude::*;
use crate::utils::data_processing::{reducer_for, Reduce};
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, ReturnValue, Signature, Value};
use num_traits::identities::Zero;
pub struct Sum;
impl WholeStreamCommand for Sum {
fn name(&self) -> &str {
"sum"
}
fn signature(&self) -> Signature {
Signature::build("sum")
}
fn usage(&self) -> &str {
"Sums the values."
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
sum(RunnableContext {
input: args.input,
commands: registry.clone(),
shell_manager: args.shell_manager,
host: args.host,
source: args.call_info.source,
ctrl_c: args.ctrl_c,
name: args.call_info.name_tag,
})
}
}
fn sum(RunnableContext { mut input, .. }: RunnableContext) -> Result<OutputStream, ShellError> {
let stream = async_stream! {
let mut values = input.drain_vec().await;
let action = reducer_for(Reduce::Sum);
match action(Value::zero(), values) {
Ok(total) => yield ReturnSuccess::value(total),
Err(err) => yield Err(err),
}
};
let stream: BoxStream<'static, ReturnValue> = stream.boxed();
Ok(stream.to_output_stream())
}
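The `reducer_for(Reduce::Sum)` call above returns a closure that folds the drained values into a total. With plain integers the same shape looks like this — a sketch only; the real reducer works over nu `Value`s and returns a `Result`:

```rust
// A reducer is just a fold function chosen by name; Sum starts from an
// accumulator (Value::zero() above) and adds each item to it.
fn reducer_for_sum() -> impl Fn(i64, &[i64]) -> i64 {
    |acc, values| values.iter().fold(acc, |a, v| a + v)
}

fn main() {
    let action = reducer_for_sum();
    let total = action(0, &[1, 2, 3, 4]);
    println!("{}", total); // 10
}
```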


@@ -140,6 +140,7 @@ fn to_string_tagged_value(v: &Value) -> Result<String, ShellError> {
| UntaggedValue::Primitive(Primitive::Path(_))
| UntaggedValue::Primitive(Primitive::Int(_)) => as_string(v),
UntaggedValue::Primitive(Primitive::Date(d)) => Ok(d.to_string()),
UntaggedValue::Primitive(Primitive::Nothing) => Ok(String::new()),
UntaggedValue::Table(_) => Ok(String::from("[Table]")),
UntaggedValue::Row(_) => Ok(String::from("[Row]")),
_ => Err(ShellError::labeled_error(


@@ -0,0 +1,121 @@
use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf;
use crate::prelude::*;
use futures::StreamExt;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, Signature, UntaggedValue, Value};
use nu_source::AnchorLocation;
pub struct ToHTML;
impl WholeStreamCommand for ToHTML {
fn name(&self) -> &str {
"to-html"
}
fn signature(&self) -> Signature {
Signature::build("to-html")
}
fn usage(&self) -> &str {
"Convert table into simple HTML"
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
to_html(args, registry)
}
}
fn to_html(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
let name_tag = args.name_tag();
//let name_span = name_tag.span;
let stream = async_stream! {
let input: Vec<Value> = args.input.values.collect().await;
let headers = nu_protocol::merge_descriptors(&input);
let mut output_string = "<html><body>".to_string();
if !headers.is_empty() && (headers.len() > 1 || headers[0] != "<value>") {
output_string.push_str("<table>");
output_string.push_str("<tr>");
for header in &headers {
output_string.push_str("<th>");
output_string.push_str(&htmlescape::encode_minimal(&header));
output_string.push_str("</th>");
}
output_string.push_str("</tr>");
}
for row in input {
match row.value {
UntaggedValue::Primitive(Primitive::Binary(b)) => {
// This might be a bit much, but it's fun :)
match row.tag.anchor {
Some(AnchorLocation::Url(f)) |
Some(AnchorLocation::File(f)) => {
let extension = f.split('.').last().map(String::from);
match extension {
Some(s) if ["png", "jpg", "bmp", "gif", "tiff", "jpeg"].contains(&s.to_lowercase().as_str()) => {
output_string.push_str("<img src=\"data:image/");
output_string.push_str(&s);
output_string.push_str(";base64,");
output_string.push_str(&base64::encode(&b));
output_string.push_str("\">");
}
_ => {}
}
}
_ => {}
}
}
UntaggedValue::Primitive(Primitive::String(ref b)) => {
// This might be a bit much, but it's fun :)
match row.tag.anchor {
Some(AnchorLocation::Url(f)) |
Some(AnchorLocation::File(f)) => {
let extension = f.split('.').last().map(String::from);
match extension {
Some(s) if s.to_lowercase() == "svg" => {
output_string.push_str("<img src=\"data:image/svg+xml;base64,");
output_string.push_str(&base64::encode(&b.as_bytes()));
output_string.push_str("\">");
continue;
}
_ => {}
}
}
_ => {}
}
output_string.push_str(&(htmlescape::encode_minimal(&format_leaf(&row.value).plain_string(100_000)).replace("\n", "<br>")));
}
UntaggedValue::Row(row) => {
output_string.push_str("<tr>");
for header in &headers {
let data = row.get_data(header);
output_string.push_str("<td>");
output_string.push_str(&format_leaf(data.borrow()).plain_string(100_000));
output_string.push_str("</td>");
}
output_string.push_str("</tr>");
}
p => {
output_string.push_str(&(htmlescape::encode_minimal(&format_leaf(&p).plain_string(100_000)).replace("\n", "<br>")));
}
}
}
if !headers.is_empty() && (headers.len() > 1 || headers[0] != "<value>") {
output_string.push_str("</table>");
}
output_string.push_str("</body></html>");
yield ReturnSuccess::value(UntaggedValue::string(output_string).into_value(name_tag));
};
Ok(stream.to_output_stream())
}
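`htmlescape::encode_minimal` does the heavy lifting in the loop above. A minimal stand-in covering only the three characters that break markup, plus the per-row `<tr>`/`<td>` emit, looks like this — `escape_minimal` and `row_to_html` are illustrative names, not the crate's API:

```rust
// Minimal HTML escaping: '&' must be replaced first so the entities
// introduced for '<' and '>' are not double-escaped.
fn escape_minimal(s: &str) -> String {
    s.replace('&', "&amp;").replace('<', "&lt;").replace('>', "&gt;")
}

// Emit one <tr> from a list of cell strings, escaping each cell,
// as the UntaggedValue::Row arm does per header.
fn row_to_html(cells: &[&str]) -> String {
    let mut out = String::from("<tr>");
    for cell in cells {
        out.push_str("<td>");
        out.push_str(&escape_minimal(cell));
        out.push_str("</td>");
    }
    out.push_str("</tr>");
    out
}

fn main() {
    println!("{}", row_to_html(&["a<b", "c&d"]));
    // <tr><td>a&lt;b</td><td>c&amp;d</td></tr>
}
```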


@@ -0,0 +1,77 @@
use crate::commands::WholeStreamCommand;
use crate::data::value::format_leaf;
use crate::prelude::*;
use futures::StreamExt;
use nu_errors::ShellError;
use nu_protocol::{ReturnSuccess, Signature, UntaggedValue, Value};
pub struct ToMarkdown;
impl WholeStreamCommand for ToMarkdown {
fn name(&self) -> &str {
"to-md"
}
fn signature(&self) -> Signature {
Signature::build("to-md")
}
fn usage(&self) -> &str {
"Convert table into simple Markdown"
}
fn run(
&self,
args: CommandArgs,
registry: &CommandRegistry,
) -> Result<OutputStream, ShellError> {
to_html(args, registry)
}
}
fn to_html(args: CommandArgs, registry: &CommandRegistry) -> Result<OutputStream, ShellError> {
let args = args.evaluate_once(registry)?;
let name_tag = args.name_tag();
//let name_span = name_tag.span;
let stream = async_stream! {
let input: Vec<Value> = args.input.values.collect().await;
let headers = nu_protocol::merge_descriptors(&input);
let mut output_string = String::new();
if !headers.is_empty() && (headers.len() > 1 || headers[0] != "<value>") {
output_string.push_str("|");
for header in &headers {
output_string.push_str(&htmlescape::encode_minimal(&header));
output_string.push_str("|");
}
output_string.push_str("\n|");
for _ in &headers {
output_string.push_str("-");
output_string.push_str("|");
}
output_string.push_str("\n");
}
for row in input {
match row.value {
UntaggedValue::Row(row) => {
output_string.push_str("|");
for header in &headers {
let data = row.get_data(header);
output_string.push_str(&format_leaf(data.borrow()).plain_string(100_000));
output_string.push_str("|");
}
output_string.push_str("\n");
}
p => {
output_string.push_str(&(htmlescape::encode_minimal(&format_leaf(&p).plain_string(100_000))));
output_string.push_str("\n");
}
}
}
yield ReturnSuccess::value(UntaggedValue::string(output_string).into_value(name_tag));
};
Ok(stream.to_output_stream())
}
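The header row, `|-|` separator line, and data rows that `to-md` builds can be sketched for a plain string table. The `to_md` helper below is a hypothetical standalone version, not the command itself:

```rust
// Build a simple Markdown table: header row, a |-|-| separator, then data rows.
fn to_md(headers: &[&str], rows: &[Vec<&str>]) -> String {
    let mut out = String::from("|");
    for h in headers {
        out.push_str(h);
        out.push('|');
    }
    out.push_str("\n|");
    for _ in headers {
        out.push_str("-|");
    }
    out.push('\n');
    for row in rows {
        out.push('|');
        for cell in row {
            out.push_str(cell);
            out.push('|');
        }
        out.push('\n');
    }
    out
}

fn main() {
    print!("{}", to_md(&["name", "size"], &[vec!["a", "1"]]));
    // |name|size|
    // |-|-|
    // |a|1|
}
```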


@@ -87,66 +87,24 @@ fn which(
application.item.clone()
};
if all {
let stream = async_stream! {
if external {
if let Ok(Some(path)) = ichwh::which(&item).await {
yield ReturnSuccess::value(entry_path!(item, path.into(), application.tag.clone()));
}
}
let stream = async_stream! {
if !external {
let builtin = commands.has(&item);
if builtin {
yield ReturnSuccess::value(entry_builtin!(item, application.tag.clone()));
}
}
if let Ok(paths) = ichwh::which_all(&item).await {
if !builtin && paths.len() == 0 {
yield Err(ShellError::labeled_error(
"Binary not found for argument, and argument is not a builtin",
"not found",
&application.tag,
));
} else {
for path in paths {
yield ReturnSuccess::value(entry_path!(item, path.into(), application.tag.clone()));
}
}
} else {
yield Err(ShellError::labeled_error(
"Error trying to find binary for argument",
"error",
&application.tag,
));
if let Ok(paths) = ichwh::which_all(&item).await {
for path in paths {
yield ReturnSuccess::value(entry_path!(item, path.into(), application.tag.clone()));
}
};
}
};
if all {
Ok(stream.to_output_stream())
} else {
let stream = async_stream! {
if external {
if let Ok(Some(path)) = ichwh::which(&item).await {
yield ReturnSuccess::value(entry_path!(item, path.into(), application.tag.clone()));
}
} else if commands.has(&item) {
yield ReturnSuccess::value(entry_builtin!(item, application.tag.clone()));
} else {
match ichwh::which(&item).await {
Ok(Some(path)) => yield ReturnSuccess::value(entry_path!(item, path.into(), application.tag.clone())),
Ok(None) => yield Err(ShellError::labeled_error(
"Binary not found for argument, and argument is not a builtin",
"not found",
&application.tag,
)),
Err(_) => yield Err(ShellError::labeled_error(
"Error trying to find binary for argument",
"error",
&application.tag,
)),
}
}
};
Ok(stream.to_output_stream())
Ok(stream.take(1).to_output_stream())
}
}


@@ -82,6 +82,9 @@ pub struct Context {
pub current_errors: Arc<Mutex<Vec<ShellError>>>,
pub ctrl_c: Arc<AtomicBool>,
pub(crate) shell_manager: ShellManager,
#[cfg(windows)]
pub windows_drives_previous_cwd: Arc<Mutex<std::collections::HashMap<String, String>>>,
}
impl Context {
@@ -102,15 +105,33 @@ impl Context {
pub(crate) fn basic() -> Result<Context, Box<dyn Error>> {
let registry = CommandRegistry::new();
Ok(Context {
registry: registry.clone(),
host: Arc::new(parking_lot::Mutex::new(Box::new(
crate::env::host::BasicHost,
))),
current_errors: Arc::new(Mutex::new(vec![])),
ctrl_c: Arc::new(AtomicBool::new(false)),
shell_manager: ShellManager::basic(registry)?,
})
#[cfg(windows)]
{
Ok(Context {
registry: registry.clone(),
host: Arc::new(parking_lot::Mutex::new(Box::new(
crate::env::host::BasicHost,
))),
current_errors: Arc::new(Mutex::new(vec![])),
ctrl_c: Arc::new(AtomicBool::new(false)),
shell_manager: ShellManager::basic(registry)?,
windows_drives_previous_cwd: Arc::new(Mutex::new(std::collections::HashMap::new())),
})
}
#[cfg(not(windows))]
{
Ok(Context {
registry: registry.clone(),
host: Arc::new(parking_lot::Mutex::new(Box::new(
crate::env::host::BasicHost,
))),
current_errors: Arc::new(Mutex::new(vec![])),
ctrl_c: Arc::new(AtomicBool::new(false)),
shell_manager: ShellManager::basic(registry)?,
})
}
}
pub(crate) fn error(&mut self, error: ShellError) {


@@ -180,6 +180,12 @@ fn coerce_compare_primitive(
(Bytes(left), Decimal(right)) => {
CompareValues::Decimals(BigDecimal::from(*left), right.clone())
}
(Bytes(left), Nothing) => CompareValues::Ints(BigInt::from(*left), BigInt::from(0)),
(Nothing, Bytes(right)) => CompareValues::Ints(BigInt::from(0), BigInt::from(*right)),
(Int(left), Nothing) => CompareValues::Ints(left.clone(), BigInt::from(0)),
(Nothing, Int(right)) => CompareValues::Ints(BigInt::from(0), right.clone()),
(Decimal(left), Nothing) => CompareValues::Decimals(left.clone(), BigDecimal::from(0.0)),
(Nothing, Decimal(right)) => CompareValues::Decimals(BigDecimal::from(0.0), right.clone()),
(String(left), String(right)) => CompareValues::String(left.clone(), right.clone()),
(Line(left), String(right)) => CompareValues::String(left.clone(), right.clone()),
(String(left), Line(right)) => CompareValues::String(left.clone(), right.clone()),
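The new arms above make `Nothing` compare as zero against the numeric types. Taken in isolation, the rule is just a zero default — sketched here with `Option<i64>` standing in for a possibly-`Nothing` primitive:

```rust
// Treat a missing value as 0 when comparing against a number,
// mirroring the (Int, Nothing) / (Nothing, Int) coercion arms.
fn coerce_nothing_to_zero(v: Option<i64>) -> i64 {
    v.unwrap_or(0)
}

fn main() {
    assert!(coerce_nothing_to_zero(None) < 5);
    assert!(coerce_nothing_to_zero(Some(7)) > 5);
    println!("ok");
}
```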


@@ -36,13 +36,13 @@ pub(crate) fn dir_entry_dict(
metadata: Option<&std::fs::Metadata>,
tag: impl Into<Tag>,
full: bool,
name_only: bool,
short_name: bool,
with_symlink_targets: bool,
) -> Result<Value, ShellError> {
let tag = tag.into();
let mut dict = TaggedDictBuilder::new(&tag);
let name = if name_only {
let name = if short_name {
filename.file_name().and_then(|s| s.to_str())
} else {
filename.to_str()


@@ -208,7 +208,7 @@ mod tests {
assert_eq!(actual, expected);
});
// Now confirm in-memory environment variables synced appropiately
// Now confirm in-memory environment variables synced appropriately
// including the newer one accounted for.
let environment = actual.env.lock();


@@ -6,8 +6,9 @@ use nu_errors::ShellError;
use nu_protocol::{UntaggedValue, Value};
use textwrap::fill;
use prettytable::format::{FormatBuilder, LinePosition, LineSeparator};
use prettytable::format::{Alignment, FormatBuilder, LinePosition, LineSeparator};
use prettytable::{color, Attr, Cell, Row, Table};
use term::color::Color;
type Entries = Vec<Vec<(String, &'static str)>>;
@@ -300,9 +301,34 @@ impl RenderView for TableView {
let mut table = Table::new();
let table_mode = crate::data::config::config(Tag::unknown());
let mut config = crate::data::config::config(Tag::unknown())?;
let header_align = config.get("header_align").map_or(Alignment::LEFT, |a| {
a.as_string()
.map_or(Alignment::LEFT, |a| match a.to_lowercase().as_str() {
"center" | "c" => Alignment::CENTER,
"right" | "r" => Alignment::RIGHT,
_ => Alignment::LEFT,
})
});
let table_mode = if let Some(s) = table_mode?.get("table_mode") {
let header_color = config.get("header_color").map_or(color::GREEN, |c| {
c.as_string().map_or(color::GREEN, |c| {
str_to_color(c.to_lowercase()).unwrap_or(color::GREEN)
})
});
let header_style =
config
.remove("header_style")
.map_or(vec![Attr::Bold], |y| match y.value {
UntaggedValue::Table(t) => to_style_vec(t),
UntaggedValue::Primitive(p) => vec![p
.into_string(Span::unknown())
.map_or(Attr::Bold, |s| str_to_style(s).unwrap_or(Attr::Bold))],
_ => vec![Attr::Bold],
});
let table_mode = if let Some(s) = config.get("table_mode") {
match s.as_string() {
Ok(typ) if typ == "light" => TableMode::Light,
_ => TableMode::Normal,
@@ -337,9 +363,12 @@ impl RenderView for TableView {
.headers
.iter()
.map(|h| {
Cell::new(h)
.with_style(Attr::ForegroundColor(color::GREEN))
.with_style(Attr::Bold)
let mut c = Cell::new_align(h, header_align)
.with_style(Attr::ForegroundColor(header_color));
for &s in &header_style {
c.style(s);
}
c
})
.collect();
@@ -359,3 +388,45 @@ impl RenderView for TableView {
Ok(())
}
}
fn str_to_color(s: String) -> Option<Color> {
match s.as_str() {
"g" | "green" => Some(color::GREEN),
"r" | "red" => Some(color::RED),
"u" | "blue" => Some(color::BLUE),
"b" | "black" => Some(color::BLACK),
"y" | "yellow" => Some(color::YELLOW),
"m" | "magenta" => Some(color::MAGENTA),
"c" | "cyan" => Some(color::CYAN),
"w" | "white" => Some(color::WHITE),
"bg" | "bright green" => Some(color::BRIGHT_GREEN),
"br" | "bright red" => Some(color::BRIGHT_RED),
"bu" | "bright blue" => Some(color::BRIGHT_BLUE),
"by" | "bright yellow" => Some(color::BRIGHT_YELLOW),
"bm" | "bright magenta" => Some(color::BRIGHT_MAGENTA),
"bc" | "bright cyan" => Some(color::BRIGHT_CYAN),
"bw" | "bright white" => Some(color::BRIGHT_WHITE),
_ => None,
}
}
fn to_style_vec(a: Vec<Value>) -> Vec<Attr> {
let mut v: Vec<Attr> = Vec::new();
for t in a {
if let Ok(s) = t.as_string() {
if let Some(r) = str_to_style(s) {
v.push(r);
}
}
}
v
}
fn str_to_style(s: String) -> Option<Attr> {
match s.as_str() {
"b" | "bold" => Some(Attr::Bold),
"i" | "italic" | "italics" => Some(Attr::Italic(true)),
"u" | "underline" | "underlined" => Some(Attr::Underline(true)),
_ => None,
}
}
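The `header_style` lookup above accepts either a single style name or a table of names, falling back to `Attr::Bold` for anything unrecognized. The string-to-style mapping and its fallback can be shown standalone — the `Style` enum below is a local stand-in, not prettytable's `Attr`:

```rust
#[derive(Debug, PartialEq)]
enum Style {
    Bold,
    Italic,
    Underline,
}

// Map a user-supplied name to a style, as str_to_style does for prettytable Attrs.
fn parse_style(s: &str) -> Option<Style> {
    match s {
        "b" | "bold" => Some(Style::Bold),
        "i" | "italic" | "italics" => Some(Style::Italic),
        "u" | "underline" | "underlined" => Some(Style::Underline),
        _ => None,
    }
}

fn main() {
    // Unknown names fall back to Bold, matching the unwrap_or(Attr::Bold) above.
    let style = parse_style("blink").unwrap_or(Style::Bold);
    println!("{:?}", style); // Bold
}
```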


@@ -86,7 +86,7 @@ pub(crate) use crate::shell::filesystem_shell::FilesystemShell;
pub(crate) use crate::shell::help_shell::HelpShell;
pub(crate) use crate::shell::shell_manager::ShellManager;
pub(crate) use crate::shell::value_shell::ValueShell;
pub(crate) use crate::stream::{InputStream, OutputStream};
pub(crate) use crate::stream::{InputStream, InterruptibleStream, OutputStream};
pub(crate) use async_stream::stream as async_stream;
pub(crate) use bigdecimal::BigDecimal;
pub(crate) use futures::stream::BoxStream;
@@ -102,6 +102,7 @@ pub(crate) use num_traits::cast::ToPrimitive;
pub(crate) use serde::Deserialize;
pub(crate) use std::collections::VecDeque;
pub(crate) use std::future::Future;
pub(crate) use std::sync::atomic::AtomicBool;
pub(crate) use std::sync::Arc;
pub(crate) use itertools::Itertools;
@@ -160,3 +161,16 @@ where
}
}
}
pub trait Interruptible<V> {
fn interruptible(self, ctrl_c: Arc<AtomicBool>) -> InterruptibleStream<V>;
}
impl<S, V> Interruptible<V> for S
where
S: Stream<Item = V> + Send + 'static,
{
fn interruptible(self, ctrl_c: Arc<AtomicBool>) -> InterruptibleStream<V> {
InterruptibleStream::new(self, ctrl_c)
}
}


@@ -14,8 +14,8 @@ use nu_parser::ExpandContext;
use nu_protocol::{Primitive, ReturnSuccess, UntaggedValue};
use rustyline::completion::FilenameCompleter;
use rustyline::hint::{Hinter, HistoryHinter};
use std::path::PathBuf;
use std::sync::atomic::Ordering;
use std::collections::HashMap;
use std::path::{Component, Path, PathBuf};
use trash as SendToTrash;
#[cfg(unix)]
@@ -78,6 +78,29 @@ impl FilesystemShell {
hinter: HistoryHinter {},
}
}
fn canonicalize(&self, path: impl AsRef<Path>) -> std::io::Result<PathBuf> {
let path = if path.as_ref().is_relative() {
let components = path.as_ref().components();
let mut result = PathBuf::from(self.path());
for component in components {
match component {
Component::CurDir => { /* ignore current dir */ }
Component::ParentDir => {
result.pop();
}
Component::Normal(normal) => result.push(normal),
_ => {}
}
}
result
} else {
path.as_ref().into()
};
dunce::canonicalize(path)
}
}
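The relative-path branch of `canonicalize` walks `Component`s purely lexically: `.` is dropped, `..` pops a segment, and everything else is pushed. That normalization step on its own (pure `std`, no `dunce`, with a hypothetical `join_normalized` name) is:

```rust
use std::path::{Component, Path, PathBuf};

// Resolve a relative path against a base lexically:
// ignore ".", pop one segment for "..", push normal components.
fn join_normalized(base: &Path, rel: &Path) -> PathBuf {
    let mut result = PathBuf::from(base);
    for component in rel.components() {
        match component {
            Component::CurDir => { /* ignore current dir */ }
            Component::ParentDir => {
                result.pop();
            }
            Component::Normal(n) => result.push(n),
            _ => {}
        }
    }
    result
}

fn main() {
    let p = join_normalized(Path::new("/home/user"), Path::new("../other/./file"));
    println!("{}", p.display()); // /home/other/file
}
```

Note this is lexical only — unlike a true `canonicalize`, it does not resolve symlinks, which is why the real method still finishes with `dunce::canonicalize`.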
impl Shell for FilesystemShell {
@@ -93,6 +116,7 @@ impl Shell for FilesystemShell {
&self,
LsArgs {
path,
all,
full,
short_names,
with_symlink_targets,
@@ -107,7 +131,7 @@ impl Shell for FilesystemShell {
let p_tag = p.tag;
let mut p = p.item;
if p.is_dir() {
if is_dir_empty(&p) {
if is_empty_dir(&p) {
return Ok(OutputStream::empty());
}
p.push("*");
@@ -115,7 +139,7 @@ impl Shell for FilesystemShell {
(p, p_tag)
}
None => {
if is_dir_empty(&self.path().into()) {
if is_empty_dir(&self.path()) {
return Ok(OutputStream::empty());
} else {
(PathBuf::from("./*"), context.name.clone())
@@ -123,11 +147,9 @@ impl Shell for FilesystemShell {
}
};
let mut paths = match glob::glob(&path.to_string_lossy()) {
Ok(g) => Ok(g),
Err(e) => Err(ShellError::labeled_error("Glob error", e.msg, &p_tag)),
}?
.peekable();
let mut paths = glob::glob(&path.to_string_lossy())
.map_err(|e| ShellError::labeled_error("Glob error", e.to_string(), &p_tag))?
.peekable();
if paths.peek().is_none() {
return Err(ShellError::labeled_error(
@@ -137,36 +159,39 @@ impl Shell for FilesystemShell {
));
}
let stream = async_stream! {
// Generated stream: impl Stream<Item = Result<ReturnSuccess, ShellError>>
let stream = async_stream::try_stream! {
for path in paths {
if ctrl_c.load(Ordering::SeqCst) {
break;
}
match path {
Ok(p) => match std::fs::symlink_metadata(&p) {
Ok(m) => {
match dir_entry_dict(&p, Some(&m), name_tag.clone(), full, short_names, with_symlink_targets) {
Ok(d) => yield ReturnSuccess::value(d),
Err(e) => yield Err(e)
}
}
Err(e) => {
match e.kind() {
PermissionDenied => {
match dir_entry_dict(&p, None, name_tag.clone(), full, short_names, with_symlink_targets) {
Ok(d) => yield ReturnSuccess::value(d),
Err(e) => yield Err(e)
}
},
_ => yield Err(ShellError::from(e))
}
}
}
Err(e) => yield Err(e.into_error().into()),
let path = path.map_err(|e| ShellError::from(e.into_error()))?;
if !all && is_hidden_dir(&path) {
continue;
}
let metadata = match std::fs::symlink_metadata(&path) {
Ok(metadata) => Ok(Some(metadata)),
Err(e) => if let PermissionDenied = e.kind() {
Ok(None)
} else {
Err(e)
},
}?;
let entry = dir_entry_dict(
&path,
metadata.as_ref(),
name_tag.clone(),
full,
short_names,
with_symlink_targets
)
.map(|entry| ReturnSuccess::Value(entry.into()))?;
yield entry;
}
};
Ok(stream.to_output_stream())
Ok(stream.interruptible(ctrl_c).to_output_stream())
}
fn cd(&self, args: EvaluatedWholeStreamCommandArgs) -> Result<OutputStream, ShellError> {
@@ -184,11 +209,10 @@ impl Shell for FilesystemShell {
Some(v) => {
let target = v.as_path()?;
if PathBuf::from("-") == target {
if target == Path::new("-") {
PathBuf::from(&self.last_path)
} else {
let path = PathBuf::from(self.path());
let path = dunce::canonicalize(path.join(&target)).map_err(|_| {
let path = self.canonicalize(target).map_err(|_| {
ShellError::labeled_error(
"Cannot change to directory",
"directory not found",
@@ -592,11 +616,27 @@ impl Shell for FilesystemShell {
match destination.file_name() {
Some(name) => PathBuf::from(name),
None => {
return Err(ShellError::labeled_error(
"Rename aborted. Not a valid destination",
"not a valid destination",
dst.tag,
))
let name_maybe =
destination.components().next_back().and_then(
|component| match component {
Component::RootDir => Some(PathBuf::from("/")),
Component::ParentDir => destination
.parent()
.and_then(|parent| parent.file_name())
.map(PathBuf::from),
_ => None,
},
);
if let Some(name) = name_maybe {
name
} else {
return Err(ShellError::labeled_error(
"Rename aborted. Not a valid destination",
"not a valid destination",
dst.tag,
));
}
}
}
};
@@ -936,7 +976,7 @@ impl Shell for FilesystemShell {
fn rm(
&self,
RemoveArgs {
target,
rest: targets,
recursive,
trash,
}: RemoveArgs,
@@ -945,125 +985,141 @@ impl Shell for FilesystemShell {
) -> Result<OutputStream, ShellError> {
let name_tag = name;
if target.item.to_str() == Some(".") || target.item.to_str() == Some("..") {
if targets.is_empty() {
return Err(ShellError::labeled_error(
"Remove aborted. \".\" or \"..\" may not be removed.",
"\".\" or \"..\" may not be removed",
target.tag,
"rm requires target paths",
"needs parameter",
name_tag,
));
}
let mut path = PathBuf::from(path);
let mut all_targets: HashMap<PathBuf, Tag> = HashMap::new();
for target in targets {
if target.item.to_str() == Some(".") || target.item.to_str() == Some("..") {
return Err(ShellError::labeled_error(
"Remove aborted. \".\" or \"..\" may not be removed.",
"\".\" or \"..\" may not be removed",
target.tag,
));
}
path.push(&target.item);
match glob::glob(&path.to_string_lossy()) {
Ok(files) => {
let files: Vec<_> = files.collect();
if files.is_empty() {
Err(ShellError::labeled_error(
"Remove aborted. Not a valid path",
"not a valid path",
target.tag,
let mut path = PathBuf::from(path);
path.push(&target.item);
match glob::glob(&path.to_string_lossy()) {
Ok(files) => {
for file in files {
match file {
Ok(ref f) => {
all_targets
.entry(f.clone())
.or_insert_with(|| target.tag.clone());
}
Err(e) => {
let msg = format!("Could not remove {:}", path.to_string_lossy());
return Err(ShellError::labeled_error(
msg,
e.to_string(),
&target.tag,
));
}
}
}
}
Err(e) => {
return Err(ShellError::labeled_error(
format!("Remove aborted. {:}", e.to_string()),
e.to_string(),
&name_tag,
))
} else {
let stream = async_stream! {
for file in files.iter() {
match file {
Ok(f) => {
let is_empty = match f.read_dir() {
Ok(mut p) => p.next().is_none(),
Err(_) => false
};
}
};
}
let valid_target =
f.exists() && (!f.is_dir() || (is_empty || recursive.item));
if valid_target {
if trash.item {
match SendToTrash::remove(f) {
Err(e) => {
let msg = format!(
"Could not delete {:}",
f.to_string_lossy()
);
let label = format!("{:?}", e);
yield Err(ShellError::labeled_error(
msg,
label,
&target.tag,
))
},
Ok(()) => {
let val = format!("deleted {:}", f.to_string_lossy()).into();
yield Ok(ReturnSuccess::Value(val))
},
}
} else {
let success = if f.is_dir() {
std::fs::remove_dir_all(f)
} else {
std::fs::remove_file(f)
};
match success {
Err(e) => {
let msg = format!(
"Could not delete {:}",
f.to_string_lossy()
);
yield Err(ShellError::labeled_error(
msg,
e.to_string(),
&target.tag,
))
},
Ok(()) => {
let val = format!("deleted {:}", f.to_string_lossy()).into();
yield Ok(ReturnSuccess::Value(
val,
))
},
}
}
} else {
if f.is_dir() {
let msg = format!(
"Cannot remove {:}. try --recursive",
f.to_string_lossy()
);
yield Err(ShellError::labeled_error(
msg,
"cannot remove non-empty directory",
&target.tag,
))
} else {
let msg = format!("Invalid file: {:}", f.to_string_lossy());
yield Err(ShellError::labeled_error(
msg,
"invalid file",
&target.tag,
))
}
}
}
if all_targets.is_empty() {
Err(ShellError::labeled_error(
"Remove aborted. No valid paths",
"no valid paths",
name_tag,
))
} else {
let stream = async_stream! {
for (f, tag) in all_targets.iter() {
let is_empty = match f.read_dir() {
Ok(mut p) => p.next().is_none(),
Err(_) => false
};
let valid_target =
f.exists() && (!f.is_dir() || (is_empty || recursive.item));
if valid_target {
if trash.item {
match SendToTrash::remove(f) {
Err(e) => {
let msg = format!("Could not remove {:}", path.to_string_lossy());
let msg = format!(
"Could not delete {:}",
f.to_string_lossy()
);
let label = format!("{:?}", e);
yield Err(ShellError::labeled_error(
msg,
label,
tag,
))
},
Ok(()) => {
let val = format!("deleted {:}", f.to_string_lossy()).into();
yield Ok(ReturnSuccess::Value(val))
},
}
} else {
let success = if f.is_dir() {
std::fs::remove_dir_all(f)
} else {
std::fs::remove_file(f)
};
match success {
Err(e) => {
let msg = format!(
"Could not delete {:}",
f.to_string_lossy()
);
yield Err(ShellError::labeled_error(
msg,
e.to_string(),
&target.tag,
tag,
))
},
Ok(()) => {
let val = format!("deleted {:}", f.to_string_lossy()).into();
yield Ok(ReturnSuccess::Value(
val,
))
},
}
}
};
Ok(stream.to_output_stream())
}
} else {
if f.is_dir() {
let msg = format!(
"Cannot remove {:}. try --recursive",
f.to_string_lossy()
);
yield Err(ShellError::labeled_error(
msg,
"cannot remove non-empty directory",
tag,
))
} else {
let msg = format!("Invalid file: {:}", f.to_string_lossy());
yield Err(ShellError::labeled_error(
msg,
"invalid file",
tag,
))
}
}
}
}
Err(e) => Err(ShellError::labeled_error(
format!("Remove aborted. {:}", e.to_string()),
e.to_string(),
&name_tag,
)),
};
Ok(stream.to_output_stream())
}
}
@@ -1129,9 +1185,30 @@ impl Shell for FilesystemShell {
}
}
fn is_dir_empty(d: &PathBuf) -> bool {
match d.read_dir() {
Err(_e) => true,
fn is_empty_dir(dir: impl AsRef<Path>) -> bool {
match dir.as_ref().read_dir() {
Err(_) => true,
Ok(mut s) => s.next().is_none(),
}
}
fn is_hidden_dir(dir: impl AsRef<Path>) -> bool {
cfg_if::cfg_if! {
if #[cfg(windows)] {
use std::os::windows::fs::MetadataExt;
if let Ok(metadata) = dir.as_ref().metadata() {
let attributes = metadata.file_attributes();
// https://docs.microsoft.com/en-us/windows/win32/fileio/file-attribute-constants
(attributes & 0x2) != 0
} else {
false
}
} else {
dir.as_ref()
.file_name()
.map(|name| name.to_string_lossy().starts_with('.'))
.unwrap_or(false)
}
}
}
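On non-Windows platforms the hidden check above reduces to a leading-dot test on the final path component. Isolated (with a hypothetical `is_dotfile` name):

```rust
use std::path::Path;

// Unix-style hidden check: the final path component starts with '.'.
// Paths with no file name (e.g. "/") count as not hidden.
fn is_dotfile(path: &Path) -> bool {
    path.file_name()
        .map(|name| name.to_string_lossy().starts_with('.'))
        .unwrap_or(false)
}

fn main() {
    assert!(is_dotfile(Path::new("/home/user/.git")));
    assert!(!is_dotfile(Path::new("/home/user/src")));
    println!("ok");
}
```

On Windows the commit instead reads the `FILE_ATTRIBUTE_HIDDEN` bit (`0x2`) from the file attributes, since dotfiles are not the hidden-file convention there.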


@@ -162,7 +162,7 @@ impl Shell for HelpShell {
cwd.pop();
} else {
match target.to_str() {
Some(target) => match target.chars().nth(0) {
Some(target) => match target.chars().next() {
Some(x) if x == '/' => cwd = PathBuf::from(target),
_ => cwd.push(target),
},


@@ -144,7 +144,7 @@ impl Shell for ValueShell {
cwd = PathBuf::from(&self.last_path);
} else {
match target.to_str() {
Some(target) => match target.chars().nth(0) {
Some(target) => match target.chars().next() {
Some(x) if x == '/' => cwd = PathBuf::from(target),
_ => cwd.push(target),
},


@@ -1,7 +1,7 @@
use crate::prelude::*;
use futures::stream::iter;
use nu_errors::ShellError;
use nu_protocol::{Primitive, ReturnSuccess, ReturnValue, UntaggedValue, Value};
use nu_protocol::{Primitive, UntaggedValue, Value};
use nu_source::{Tagged, TaggedItem};
pub struct InputStream {
@@ -148,106 +148,3 @@ impl From<Vec<Value>> for InputStream {
}
}
}
pub struct OutputStream {
pub(crate) values: BoxStream<'static, ReturnValue>,
}
impl OutputStream {
pub fn new(values: impl Stream<Item = ReturnValue> + Send + 'static) -> OutputStream {
OutputStream {
values: values.boxed(),
}
}
pub fn empty() -> OutputStream {
let v: VecDeque<ReturnValue> = VecDeque::new();
v.into()
}
pub fn one(item: impl Into<ReturnValue>) -> OutputStream {
let mut v: VecDeque<ReturnValue> = VecDeque::new();
v.push_back(item.into());
v.into()
}
pub fn from_input(input: impl Stream<Item = Value> + Send + 'static) -> OutputStream {
OutputStream {
values: input.map(ReturnSuccess::value).boxed(),
}
}
pub fn drain_vec(&mut self) -> impl Future<Output = Vec<ReturnValue>> {
let mut values: BoxStream<'static, ReturnValue> = iter(VecDeque::new()).boxed();
std::mem::swap(&mut values, &mut self.values);
values.collect()
}
}
impl Stream for OutputStream {
type Item = ReturnValue;
fn poll_next(
mut self: std::pin::Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> core::task::Poll<Option<Self::Item>> {
Stream::poll_next(std::pin::Pin::new(&mut self.values), cx)
}
}
impl From<InputStream> for OutputStream {
fn from(input: InputStream) -> OutputStream {
OutputStream {
values: input.values.map(ReturnSuccess::value).boxed(),
}
}
}
impl From<BoxStream<'static, Value>> for OutputStream {
fn from(input: BoxStream<'static, Value>) -> OutputStream {
OutputStream {
values: input.map(ReturnSuccess::value).boxed(),
}
}
}
impl From<BoxStream<'static, ReturnValue>> for OutputStream {
fn from(input: BoxStream<'static, ReturnValue>) -> OutputStream {
OutputStream { values: input }
}
}
impl From<VecDeque<ReturnValue>> for OutputStream {
fn from(input: VecDeque<ReturnValue>) -> OutputStream {
OutputStream {
values: futures::stream::iter(input).boxed(),
}
}
}
impl From<VecDeque<Value>> for OutputStream {
fn from(input: VecDeque<Value>) -> OutputStream {
let stream = input.into_iter().map(ReturnSuccess::value);
OutputStream {
values: futures::stream::iter(stream).boxed(),
}
}
}
impl From<Vec<ReturnValue>> for OutputStream {
fn from(input: Vec<ReturnValue>) -> OutputStream {
OutputStream {
values: futures::stream::iter(input).boxed(),
}
}
}
impl From<Vec<Value>> for OutputStream {
fn from(input: Vec<Value>) -> OutputStream {
let stream = input.into_iter().map(ReturnSuccess::value);
OutputStream {
values: futures::stream::iter(stream).boxed(),
}
}
}


@@ -0,0 +1,35 @@
use crate::prelude::*;
use futures::task::Poll;
use std::sync::atomic::{AtomicBool, Ordering};
pub struct InterruptibleStream<V> {
inner: BoxStream<'static, V>,
ctrl_c: Arc<AtomicBool>,
}
impl<V> InterruptibleStream<V> {
pub fn new<S>(inner: S, ctrl_c: Arc<AtomicBool>) -> InterruptibleStream<V>
where
S: Stream<Item = V> + Send + 'static,
{
InterruptibleStream {
inner: inner.boxed(),
ctrl_c,
}
}
}
impl<V> Stream for InterruptibleStream<V> {
type Item = V;
fn poll_next(
mut self: std::pin::Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> core::task::Poll<Option<Self::Item>> {
if self.ctrl_c.load(Ordering::SeqCst) {
Poll::Ready(None)
} else {
Stream::poll_next(std::pin::Pin::new(&mut self.inner), cx)
}
}
}
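The gating in `poll_next` above can be reproduced without a futures runtime: check the shared flag, and once it is set, stop producing values. A minimal std-only sketch of the same idea over a plain `Iterator` (`InterruptibleIter` and `collect_interruptible` are illustrative names, not part of the crate):

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;

/// Yields items from the inner iterator until the flag is set,
/// mirroring how `InterruptibleStream` gates `poll_next` on ctrl-c.
struct InterruptibleIter<I> {
    inner: I,
    ctrl_c: Arc<AtomicBool>,
}

impl<I: Iterator> Iterator for InterruptibleIter<I> {
    type Item = I::Item;
    fn next(&mut self) -> Option<I::Item> {
        if self.ctrl_c.load(Ordering::SeqCst) {
            None // interrupted: produce no further values
        } else {
            self.inner.next()
        }
    }
}

/// Consumes items, flipping the flag after `stop_after` of them,
/// to simulate a ctrl-c arriving mid-stream.
fn collect_interruptible(items: Vec<i32>, stop_after: usize) -> Vec<i32> {
    let ctrl_c = Arc::new(AtomicBool::new(false));
    let mut it = InterruptibleIter {
        inner: items.into_iter(),
        ctrl_c: ctrl_c.clone(),
    };
    let mut out = Vec::new();
    while let Some(v) = it.next() {
        out.push(v);
        if out.len() == stop_after {
            ctrl_c.store(true, Ordering::SeqCst);
        }
    }
    out
}

fn main() {
    assert_eq!(collect_interruptible(vec![1, 2, 3, 4, 5], 2), vec![1, 2]);
}
```

The real type boxes the inner stream so the wrapper has a single concrete shape regardless of what it interrupts; the flag check happens on every poll, so cancellation takes effect at the next item boundary rather than mid-item.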


@ -0,0 +1,7 @@
mod input;
mod interruptible;
mod output;
pub use input::*;
pub use interruptible::*;
pub use output::*;


@ -0,0 +1,106 @@
use crate::prelude::*;
use futures::stream::iter;
use nu_protocol::{ReturnSuccess, ReturnValue, Value};
pub struct OutputStream {
pub(crate) values: BoxStream<'static, ReturnValue>,
}
impl OutputStream {
pub fn new(values: impl Stream<Item = ReturnValue> + Send + 'static) -> OutputStream {
OutputStream {
values: values.boxed(),
}
}
pub fn empty() -> OutputStream {
let v: VecDeque<ReturnValue> = VecDeque::new();
v.into()
}
pub fn one(item: impl Into<ReturnValue>) -> OutputStream {
let mut v: VecDeque<ReturnValue> = VecDeque::new();
v.push_back(item.into());
v.into()
}
pub fn from_input(input: impl Stream<Item = Value> + Send + 'static) -> OutputStream {
OutputStream {
values: input.map(ReturnSuccess::value).boxed(),
}
}
pub fn drain_vec(&mut self) -> impl Future<Output = Vec<ReturnValue>> {
let mut values: BoxStream<'static, ReturnValue> = iter(VecDeque::new()).boxed();
std::mem::swap(&mut values, &mut self.values);
values.collect()
}
}
impl Stream for OutputStream {
type Item = ReturnValue;
fn poll_next(
mut self: std::pin::Pin<&mut Self>,
cx: &mut std::task::Context<'_>,
) -> core::task::Poll<Option<Self::Item>> {
Stream::poll_next(std::pin::Pin::new(&mut self.values), cx)
}
}
impl From<InputStream> for OutputStream {
fn from(input: InputStream) -> OutputStream {
OutputStream {
values: input.values.map(ReturnSuccess::value).boxed(),
}
}
}
impl From<BoxStream<'static, Value>> for OutputStream {
fn from(input: BoxStream<'static, Value>) -> OutputStream {
OutputStream {
values: input.map(ReturnSuccess::value).boxed(),
}
}
}
impl From<BoxStream<'static, ReturnValue>> for OutputStream {
fn from(input: BoxStream<'static, ReturnValue>) -> OutputStream {
OutputStream { values: input }
}
}
impl From<VecDeque<ReturnValue>> for OutputStream {
fn from(input: VecDeque<ReturnValue>) -> OutputStream {
OutputStream {
values: futures::stream::iter(input).boxed(),
}
}
}
impl From<VecDeque<Value>> for OutputStream {
fn from(input: VecDeque<Value>) -> OutputStream {
let stream = input.into_iter().map(ReturnSuccess::value);
OutputStream {
values: futures::stream::iter(stream).boxed(),
}
}
}
impl From<Vec<ReturnValue>> for OutputStream {
fn from(input: Vec<ReturnValue>) -> OutputStream {
OutputStream {
values: futures::stream::iter(input).boxed(),
}
}
}
impl From<Vec<Value>> for OutputStream {
fn from(input: Vec<Value>) -> OutputStream {
let stream = input.into_iter().map(ReturnSuccess::value);
OutputStream {
values: futures::stream::iter(stream).boxed(),
}
}
}
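The `drain_vec` method above relies on `std::mem::swap`: an empty stream is swapped into `self`, and the old one is handed out to be collected, leaving the `OutputStream` still usable. The pattern in isolation, sketched with a `Vec` standing in for the boxed stream (names are illustrative):

```rust
use std::mem;

struct Output {
    values: Vec<i32>, // stands in for the boxed stream
}

impl Output {
    /// Like `drain_vec`: swap an empty container in, hand the
    /// old contents out, and keep `self` valid afterwards.
    fn drain_vec(&mut self) -> Vec<i32> {
        let mut values = Vec::new();
        mem::swap(&mut values, &mut self.values);
        values
    }
}

fn main() {
    let mut out = Output { values: vec![1, 2, 3] };
    assert_eq!(out.drain_vec(), vec![1, 2, 3]);
    assert!(out.values.is_empty());
}
```

Swapping rather than moving out of `&mut self` is what keeps the borrow checker satisfied: the field is never left in an uninitialized state.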


@ -317,6 +317,10 @@ mod tests {
loc: fixtures().join("sample_data.xlsx"),
at: 0
},
Res {
loc: fixtures().join("sample_headers.xlsx"),
at: 0
},
Res {
loc: fixtures().join("script.nu"),
at: 0


@ -1,10 +1,11 @@
use crate::data::value::compare_values;
use crate::data::TaggedListBuilder;
use chrono::{DateTime, NaiveDate, Utc};
use nu_errors::ShellError;
use nu_parser::CompareOperator;
use nu_protocol::{Primitive, TaggedDictBuilder, UntaggedValue, Value};
use nu_source::{SpannedItem, Tag, Tagged, TaggedItem};
use nu_value_ext::{get_data_by_key, ValueExt};
use num_bigint::BigInt;
use num_traits::Zero;
pub fn columns_sorted(
@ -196,44 +197,31 @@ pub fn evaluate(
Ok(results)
}
fn sum(data: Vec<Value>) -> Result<Value, ShellError> {
let total = data
pub fn sum(data: Vec<Value>) -> Result<Value, ShellError> {
Ok(data
.into_iter()
.fold(Zero::zero(), |acc: BigInt, value| match value {
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} => acc + n,
_ => acc,
});
Ok(UntaggedValue::int(total).into_untagged_value())
.fold(Value::zero(), |acc: Value, value| acc + value))
}
fn formula(
acc_begin: BigInt,
calculator: Box<dyn Fn(Vec<Value>) -> Result<Value, ShellError> + 'static>,
) -> Box<dyn Fn(BigInt, Vec<Value>) -> Result<Value, ShellError> + 'static> {
acc_begin: Value,
calculator: Box<dyn Fn(Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static>,
) -> Box<dyn Fn(Value, Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static> {
Box::new(move |acc, datax| -> Result<Value, ShellError> {
let result = acc * acc_begin.clone();
if let Ok(Value {
value: UntaggedValue::Primitive(Primitive::Int(computed)),
..
}) = calculator(datax)
{
return Ok(UntaggedValue::int(result + computed).into_untagged_value());
match calculator(datax) {
Ok(total) => Ok(result + total),
Err(reason) => Err(reason),
}
Ok(UntaggedValue::int(0).into_untagged_value())
})
}
pub fn reducer_for(
command: Reduce,
) -> Box<dyn Fn(BigInt, Vec<Value>) -> Result<Value, ShellError> + 'static> {
) -> Box<dyn Fn(Value, Vec<Value>) -> Result<Value, ShellError> + Send + Sync + 'static> {
match command {
Reduce::Sum | Reduce::Default => Box::new(formula(Zero::zero(), Box::new(sum))),
Reduce::Sum | Reduce::Default => Box::new(formula(Value::zero(), Box::new(sum))),
}
}
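The rewritten `sum` folds with `Value::zero()` and `Value` addition instead of pattern-matching on `Primitive::Int`, so the reducer works over any addable values. The same fold pattern, sketched with a hypothetical two-variant value type rather than nushell's actual `Value`:

```rust
use std::ops::Add;

#[derive(Debug, Clone, PartialEq)]
enum Value {
    Int(i64),
    Decimal(f64),
}

impl Value {
    fn zero() -> Value {
        Value::Int(0)
    }
}

// Addition promotes to Decimal when either side is a Decimal,
// so the fold stays well-defined over mixed columns.
impl Add for Value {
    type Output = Value;
    fn add(self, rhs: Value) -> Value {
        match (self, rhs) {
            (Value::Int(a), Value::Int(b)) => Value::Int(a + b),
            (Value::Int(a), Value::Decimal(b)) => Value::Decimal(a as f64 + b),
            (Value::Decimal(a), Value::Int(b)) => Value::Decimal(a + b as f64),
            (Value::Decimal(a), Value::Decimal(b)) => Value::Decimal(a + b),
        }
    }
}

fn sum(data: Vec<Value>) -> Value {
    data.into_iter().fold(Value::zero(), |acc, v| acc + v)
}

fn main() {
    assert_eq!(sum(vec![Value::Int(1), Value::Int(2)]), Value::Int(3));
}
```

Centralizing the arithmetic in the value type's `Add` impl is what lets `sum` shrink to a one-line fold and the `Send + Sync` bounds be added without touching the numeric logic.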
@ -262,7 +250,7 @@ pub fn reduce(
let datasets: Vec<_> = datasets
.iter()
.map(|subsets| {
let acc: BigInt = Zero::zero();
let acc = Value::zero();
match subsets {
Value {
value: UntaggedValue::Table(data),
@ -318,37 +306,48 @@ pub fn map_max(
value: UntaggedValue::Table(datasets),
..
} => {
let datasets: Vec<_> = datasets
let datasets: Vec<Value> = datasets
.iter()
.map(|subsets| match subsets {
Value {
value: UntaggedValue::Table(data),
..
} => {
let data: BigInt =
data.iter().fold(Zero::zero(), |acc, value| match value {
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} if *n > acc => n.clone(),
_ => acc,
});
UntaggedValue::int(data).into_value(&tag)
}
} => data.iter().fold(Value::zero(), |acc, value| {
let left = &value.value;
let right = &acc.value;
if let Ok(is_greater_than) =
compare_values(CompareOperator::GreaterThan, left, right)
{
if is_greater_than {
value.clone()
} else {
acc
}
} else {
acc
}
}),
_ => UntaggedValue::int(0).into_value(&tag),
})
.collect();
let datasets: BigInt = datasets
.iter()
.fold(Zero::zero(), |max, value| match value {
Value {
value: UntaggedValue::Primitive(Primitive::Int(n)),
..
} if *n > max => n.clone(),
_ => max,
});
UntaggedValue::int(datasets).into_value(&tag)
datasets.into_iter().fold(Value::zero(), |max, value| {
let left = &value.value;
let right = &max.value;
if let Ok(is_greater_than) =
compare_values(CompareOperator::GreaterThan, left, right)
{
if is_greater_than {
value
} else {
max
}
} else {
max
}
})
}
_ => UntaggedValue::int(-1).into_value(&tag),
};
@ -573,7 +572,7 @@ mod tests {
let action = reducer_for(Reduce::Sum);
assert_eq!(action(Zero::zero(), subject)?, int(3));
assert_eq!(action(Value::zero(), subject)?, int(3));
Ok(())
}


@ -85,6 +85,28 @@ fn filesystem_change_current_directory_to_parent_directory() {
})
}
#[test]
fn filesystem_change_current_directory_to_parent_directory_after_delete_cwd() {
Playground::setup("cd_test_5_1", |dirs, sandbox| {
sandbox.within("foo").mkdir("bar");
let actual = nu!(
cwd: dirs.test().join("foo/bar"),
r#"
rm {}/foo/bar
echo ","
cd ..
pwd | echo $it
"#,
dirs.test()
);
let actual = actual.split(',').nth(1).unwrap();
assert_eq!(PathBuf::from(actual), *dirs.test().join("foo"));
})
}
#[test]
fn filesystem_change_to_home_directory() {
Playground::setup("cd_test_6", |dirs, _| {


@ -0,0 +1,31 @@
use nu_test_support::{nu, pipeline};
#[test]
fn headers_uses_first_row_as_header() {
let actual = nu!(
cwd: "tests/fixtures/formats", pipeline(
r#"
open sample_headers.xlsx
| get Sheet1
| headers
| get header0
| from-json"#
));
assert_eq!(actual, "r1c0r2c0")
}
#[test]
fn headers_adds_missing_column_name() {
let actual = nu!(
cwd: "tests/fixtures/formats", pipeline(
r#"
open sample_headers.xlsx
| get Sheet1
| headers
| get Column1
| from-json"#
));
assert_eq!(actual, "r1c1r2c1")
}


@ -10,6 +10,7 @@ mod first;
mod format;
mod get;
mod group_by;
mod headers;
mod histogram;
mod insert;
mod last;
@ -29,6 +30,7 @@ mod save;
mod sort_by;
mod split_by;
mod split_column;
mod sum;
mod touch;
mod uniq;
mod where_;


@ -230,3 +230,23 @@ fn errors_if_source_doesnt_exist() {
assert!(actual.contains("Invalid File or Pattern"));
})
}
#[test]
fn does_not_error_on_relative_parent_path() {
Playground::setup("mv_test_11", |dirs, sandbox| {
sandbox
.mkdir("first")
.with_files(vec![EmptyFile("first/william_hartnell.txt")]);
let original = dirs.test().join("first/william_hartnell.txt");
let expected = dirs.test().join("william_hartnell.txt");
nu!(
cwd: dirs.test().join("first"),
"mv william_hartnell.txt ./.."
);
assert!(!original.exists());
assert!(expected.exists());
})
}


@ -159,3 +159,86 @@ fn errors_if_attempting_to_delete_two_dot_as_argument() {
assert!(actual.contains("may not be removed"));
})
}
#[test]
fn removes_multiple_directories() {
Playground::setup("rm_test_9", |dirs, sandbox| {
sandbox
.within("src")
.with_files(vec![EmptyFile("a.rs"), EmptyFile("b.rs")])
.within("src/cli")
.with_files(vec![EmptyFile("c.rs"), EmptyFile("d.rs")])
.within("test")
.with_files(vec![EmptyFile("a_test.rs"), EmptyFile("b_test.rs")]);
nu!(
cwd: dirs.test(),
"rm src test --recursive"
);
assert_eq!(
Playground::glob_vec(&format!("{}/*", dirs.test().display())),
Vec::<std::path::PathBuf>::new()
);
})
}
#[test]
fn removes_multiple_files() {
Playground::setup("rm_test_10", |dirs, sandbox| {
sandbox.with_files(vec![
EmptyFile("yehuda.txt"),
EmptyFile("jonathan.txt"),
EmptyFile("andres.txt"),
]);
nu!(
cwd: dirs.test(),
"rm yehuda.txt jonathan.txt andres.txt"
);
assert_eq!(
Playground::glob_vec(&format!("{}/*", dirs.test().display())),
Vec::<std::path::PathBuf>::new()
);
})
}
#[test]
fn removes_multiple_files_with_asterisks() {
Playground::setup("rm_test_11", |dirs, sandbox| {
sandbox.with_files(vec![
EmptyFile("yehuda.txt"),
EmptyFile("jonathan.txt"),
EmptyFile("andres.toml"),
]);
nu!(
cwd: dirs.test(),
"rm *.txt *.toml"
);
assert_eq!(
Playground::glob_vec(&format!("{}/*", dirs.test().display())),
Vec::<std::path::PathBuf>::new()
);
})
}
#[test]
fn allows_doubly_specified_file() {
Playground::setup("rm_test_12", |dirs, sandbox| {
sandbox.with_files(vec![EmptyFile("yehuda.txt"), EmptyFile("jonathan.toml")]);
let actual = nu!(
cwd: dirs.test(),
"rm *.txt yehuda* *.toml"
);
assert_eq!(
Playground::glob_vec(&format!("{}/*", dirs.test().display())),
Vec::<std::path::PathBuf>::new()
);
assert!(!actual.contains("error"))
})
}


@ -0,0 +1,30 @@
use nu_test_support::fs::Stub::FileWithContentToBeTrimmed;
use nu_test_support::playground::Playground;
use nu_test_support::{nu, pipeline};
#[test]
fn all() {
Playground::setup("sum_test_1", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"meals.csv",
r#"
description,calories
"1 large egg",90
"1 cup white rice",250
"1 tablespoon fish oil",108
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open meals.csv
| get calories
| sum
| echo $it
"#
));
assert_eq!(actual, "448");
})
}


@ -67,7 +67,7 @@ fn nested_json_structures() {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"nested_json_structures.json",
r#"
[
[
{
"name": "this is duplicated",
"nesting": [ { "a": "a", "b": "b" },


@ -10,6 +10,16 @@ fn filters_by_unit_size_comparison() {
assert_eq!(actual, "cargo_sample.toml");
}
#[test]
fn filters_with_nothing_comparison() {
let actual = nu!(
cwd: "tests/fixtures/formats",
r#"echo '[{"foo": 3}, {"foo": null}, {"foo": 4}]' | from-json | where foo > 1 | get foo | sum | echo $it"#
);
assert_eq!(actual, "7");
}
#[test]
fn binary_operator_comparisons() {
let actual = nu!(


@ -4,7 +4,7 @@ use nu_test_support::{nu, pipeline};
#[test]
fn wrap_rows_into_a_row() {
Playground::setup("embed_test_1", |dirs, sandbox| {
Playground::setup("wrap_test_1", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_caballeros.txt",
r#"
@ -34,7 +34,7 @@ fn wrap_rows_into_a_row() {
#[test]
fn wrap_rows_into_a_table() {
Playground::setup("embed_test_2", |dirs, sandbox| {
Playground::setup("wrap_test_2", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_caballeros.txt",
r#"


@ -73,8 +73,36 @@ fn table_to_csv_text_skipping_headers_after_conversion() {
}
#[test]
fn from_csv_text_to_table() {
fn infers_types() {
Playground::setup("filter_from_csv_test_1", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_cuatro_mosqueteros.csv",
r#"
first_name,last_name,rusty_luck
Andrés,Robalino,1,d
Jonathan,Turner,1,d
Yehuda,Katz,1,d
Jason,Gedge,1,d
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open los_cuatro_mosqueteros.csv
| where rusty_luck > 0
| count
| echo $it
"#
));
assert_eq!(actual, "4");
})
}
#[test]
fn from_csv_text_to_table() {
Playground::setup("filter_from_csv_test_2", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_caballeros.txt",
r#"
@ -102,7 +130,7 @@ fn from_csv_text_to_table() {
#[test]
fn from_csv_text_with_separator_to_table() {
Playground::setup("filter_from_csv_test_2", |dirs, sandbox| {
Playground::setup("filter_from_csv_test_3", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_caballeros.txt",
r#"
@ -130,7 +158,7 @@ fn from_csv_text_with_separator_to_table() {
#[test]
fn from_csv_text_with_tab_separator_to_table() {
Playground::setup("filter_from_csv_test_3", |dirs, sandbox| {
Playground::setup("filter_from_csv_test_4", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_caballeros.txt",
r#"
@ -158,7 +186,7 @@ fn from_csv_text_with_tab_separator_to_table() {
#[test]
fn from_csv_text_skipping_headers_to_table() {
Playground::setup("filter_from_csv_test_4", |dirs, sandbox| {
Playground::setup("filter_from_csv_test_5", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"los_tres_amigos.txt",
r#"


@ -0,0 +1,28 @@
use nu_test_support::{nu, pipeline};
#[test]
fn out_html_simple() {
let actual = nu!(
cwd: ".", pipeline(
r#"
echo 3 | to-html
"#
));
assert_eq!(actual, "<html><body>3</body></html>");
}
#[test]
fn out_html_table() {
let actual = nu!(
cwd: ".", pipeline(
r#"
echo '{"name": "jason"}' | from-json | to-html
"#
));
assert_eq!(
actual,
"<html><body><table><tr><th>name</th></tr><tr><td>jason</td></tr></table></body></html>"
);
}


@ -0,0 +1,101 @@
use nu_test_support::fs::Stub::FileWithContentToBeTrimmed;
use nu_test_support::playground::Playground;
use nu_test_support::{nu, pipeline};
#[test]
fn infers_types() {
Playground::setup("filter_from_ics_test_1", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"calendar.ics",
r#"
BEGIN:VCALENDAR
PRODID:-//Google Inc//Google Calendar 70.9054//EN
VERSION:2.0
BEGIN:VEVENT
DTSTART:20171007T200000Z
DTEND:20171007T233000Z
DTSTAMP:20200319T182138Z
UID:4l80f6dcovnriq38g57g07btid@google.com
CREATED:20170719T202915Z
DESCRIPTION:
LAST-MODIFIED:20170930T190808Z
LOCATION:
SEQUENCE:1
STATUS:CONFIRMED
SUMMARY:Maryland Game
TRANSP:TRANSPARENT
END:VEVENT
BEGIN:VEVENT
DTSTART:20171002T010000Z
DTEND:20171002T020000Z
DTSTAMP:20200319T182138Z
UID:2v61g7mij4s7ieoubm3sjpun5d@google.com
CREATED:20171001T180103Z
DESCRIPTION:
LAST-MODIFIED:20171001T180103Z
LOCATION:
SEQUENCE:0
STATUS:CONFIRMED
SUMMARY:Halloween Wars
TRANSP:OPAQUE
END:VEVENT
END:VCALENDAR
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open calendar.ics
| get events
| count
| echo $it
"#
));
assert_eq!(actual, "2");
})
}
#[test]
fn from_ics_text_to_table() {
Playground::setup("filter_from_ics_test_2", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"calendar.txt",
r#"
BEGIN:VCALENDAR
BEGIN:VEVENT
DTSTART:20171007T200000Z
DTEND:20171007T233000Z
DTSTAMP:20200319T182138Z
UID:4l80f6dcovnriq38g57g07btid@google.com
CREATED:20170719T202915Z
DESCRIPTION:
LAST-MODIFIED:20170930T190808Z
LOCATION:
SEQUENCE:1
STATUS:CONFIRMED
SUMMARY:Maryland Game
TRANSP:TRANSPARENT
END:VEVENT
END:VCALENDAR
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open calendar.txt
| from-ics
| get events
| get properties
| where name == "SUMMARY"
| first
| get value
| echo $it
"#
));
assert_eq!(actual, "Maryland Game");
})
}


@ -0,0 +1,25 @@
use nu_test_support::{nu, pipeline};
#[test]
fn out_md_simple() {
let actual = nu!(
cwd: ".", pipeline(
r#"
echo 3 | to-md
"#
));
assert_eq!(actual, "3");
}
#[test]
fn out_md_table() {
let actual = nu!(
cwd: ".", pipeline(
r#"
echo '{"name": "jason"}' | from-json | to-md
"#
));
assert_eq!(actual, "|name||-||jason|");
}


@ -1,11 +1,15 @@
mod bson;
mod csv;
mod html;
mod ics;
mod json;
mod markdown;
mod ods;
mod sqlite;
mod ssv;
mod toml;
mod tsv;
mod url;
mod vcf;
mod xlsx;
mod yaml;


@ -0,0 +1,84 @@
use nu_test_support::fs::Stub::FileWithContentToBeTrimmed;
use nu_test_support::playground::Playground;
use nu_test_support::{nu, pipeline};
#[test]
fn infers_types() {
Playground::setup("filter_from_vcf_test_1", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"contacts.vcf",
r#"
BEGIN:VCARD
VERSION:3.0
FN:John Doe
N:Doe;John;;;
EMAIL;TYPE=INTERNET:john.doe99@gmail.com
item1.ORG:'Alpine Ski Resort'
item1.X-ABLabel:Other
item2.TITLE:'Ski Instructor'
item2.X-ABLabel:Other
BDAY:19001106
NOTE:Facebook: john.doe.3\nWebsite: \nHometown: Cleveland\, Ohio
CATEGORIES:myContacts
END:VCARD
BEGIN:VCARD
VERSION:3.0
FN:Alex Smith
N:Smith;Alex;;;
TEL;TYPE=CELL:(890) 123-4567
CATEGORIES:Band,myContacts
END:VCARD
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open contacts.vcf
| count
| echo $it
"#
));
assert_eq!(actual, "2");
})
}
#[test]
fn from_vcf_text_to_table() {
Playground::setup("filter_from_vcf_test_2", |dirs, sandbox| {
sandbox.with_files(vec![FileWithContentToBeTrimmed(
"contacts.txt",
r#"
BEGIN:VCARD
VERSION:3.0
FN:John Doe
N:Doe;John;;;
EMAIL;TYPE=INTERNET:john.doe99@gmail.com
item1.ORG:'Alpine Ski Resort'
item1.X-ABLabel:Other
item2.TITLE:'Ski Instructor'
item2.X-ABLabel:Other
BDAY:19001106
NOTE:Facebook: john.doe.3\nWebsite: \nHometown: Cleveland\, Ohio
CATEGORIES:myContacts
END:VCARD
"#,
)]);
let actual = nu!(
cwd: dirs.test(), pipeline(
r#"
open contacts.txt
| from-vcf
| get properties
| where name == "EMAIL"
| first
| get value
| echo $it
"#
));
assert_eq!(actual, "john.doe99@gmail.com");
})
}


@ -1,6 +1,6 @@
[package]
name = "nu-errors"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core error subsystem for Nushell"
@ -10,7 +10,7 @@ license = "MIT"
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.11.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
ansi_term = "0.12.1"
bigdecimal = { version = "0.1.0", features = ["serde"] }
@ -29,4 +29,4 @@ toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu-macros"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core macros for building Nushell"
@ -10,4 +10,4 @@ license = "MIT"
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.11.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }


@ -1,6 +1,6 @@
[package]
name = "nu-parser"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core parser used in Nushell"
@ -10,9 +10,9 @@ license = "MIT"
doctest = false
[dependencies]
nu-errors = { path = "../nu-errors", version = "0.11.0" }
nu-source = { path = "../nu-source", version = "0.11.0" }
nu-protocol = { path = "../nu-protocol", version = "0.11.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
pretty_env_logger = "0.3.1"
pretty = "0.5.2"
@ -41,7 +41,7 @@ enumflags2 = "0.6.2"
pretty_assertions = "0.6.1"
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
nu-build = { version = "0.12.0", path = "../nu-build" }
[features]
stable = []


@ -1,6 +1,6 @@
use crate::hir;
use crate::hir::syntax_shape::{
expand_atom, expand_syntax, BareShape, ExpandContext, ExpandSyntax, ExpansionRule,
ExpandSyntax, expand_atom, expand_syntax, BareShape, ExpandContext, ExpandSyntax, ExpansionRule,
UnspannedAtomicToken, WhitespaceShape,
};
use crate::hir::tokens_iterator::TokensIterator;


@ -137,7 +137,7 @@ pub struct ExpandContext<'context> {
impl<'context> ExpandContext<'context> {
pub(crate) fn homedir(&self) -> Option<&Path> {
self.homedir.as_ref().map(|h| h.as_path())
self.homedir.as_deref()
}
pub(crate) fn source(&self) -> &'context Text {


@ -477,18 +477,6 @@ impl ExpandSyntax for MemberShape {
return Ok(Member::Bare(bare.span()));
}
/* KATZ */
/* let number = NumberShape.test(token_nodes, context);
if let Some(peeked) = number {
let node = peeked.not_eof("column")?.commit();
let (n, span) = node.as_number().ok_or_else(|| {
ParseError::internal_error("can't convert node to number".spanned(node.span()))
})?;
return Ok(Member::Number(n, span))
}*/
let string = token_nodes.expand_syntax(StringShape);
if let Ok(syntax) = string {


@ -3,9 +3,6 @@ pub(crate) mod into_shapes;
pub(crate) mod pattern;
pub(crate) mod state;
#[cfg(test)]
mod tests;
use self::debug::ExpandTracer;
use self::into_shapes::IntoShapes;
use self::state::{Peeked, TokensIteratorState};
@ -510,7 +507,7 @@ impl<'content> TokensIterator<'content> {
/// The purpose of `expand_infallible` is to clearly mark the infallible path through
/// and entire list of tokens that produces a fully colored version of the source.
///
/// If the `ExpandSyntax` can poroduce a `Result`, make sure to use `expand_syntax`,
/// If the `ExpandSyntax` can produce a `Result`, make sure to use `expand_syntax`,
/// which will correctly show the error in the trace.
pub fn expand_infallible<U>(&mut self, shape: impl ExpandSyntax<Output = U>) -> U
where
@ -536,7 +533,7 @@ impl<'content> TokensIterator<'content> {
})
}
fn expand<U>(&mut self, shape: impl ExpandSyntax<Output = U>) -> (U, usize)
pub fn expand<U>(&mut self, shape: impl ExpandSyntax<Output = U>) -> (U, usize)
where
U: std::fmt::Debug + Clone + 'static,
{


@ -1,46 +0,0 @@
use crate::hir::{syntax_shape::ExpandContext, syntax_shape::SignatureRegistry, TokensIterator};
use crate::parse::token_tree_builder::TokenTreeBuilder as b;
use nu_protocol::Signature;
use nu_source::{Span, Text};
use derive_new::new;
#[derive(Debug, Clone, new)]
struct TestRegistry {
#[new(default)]
signatures: indexmap::IndexMap<String, Signature>,
}
impl TestRegistry {}
impl SignatureRegistry for TestRegistry {
fn has(&self, name: &str) -> bool {
self.signatures.contains_key(name)
}
fn get(&self, name: &str) -> Option<Signature> {
self.signatures.get(name).cloned()
}
fn clone_box(&self) -> Box<dyn SignatureRegistry> {
Box::new(self.clone())
}
}
#[test]
fn supplies_tokens() {
let token = b::it_var();
let (tokens, source) = b::build(token);
let tokens = vec![tokens];
let source = Text::from(&source);
let mut iterator = TokensIterator::new(
&tokens,
ExpandContext::new(Box::new(TestRegistry::new()), &source, None),
Span::unknown(),
);
let token = iterator.next().expect("Token expected.");
token.expect_var();
}


@ -6,6 +6,9 @@ pub mod hir;
pub mod parse;
pub mod parse_command;
#[cfg(test)]
pub mod test_support;
pub use crate::commands::classified::{
external::ExternalCommand, internal::InternalCommand, ClassifiedCommand, ClassifiedPipeline,
};
@ -20,6 +23,11 @@ pub use crate::parse::parser::{module, pipeline};
pub use crate::parse::token_tree::{Delimiter, SpannedToken, Token};
pub use crate::parse::token_tree_builder::TokenTreeBuilder;
pub mod utils {
pub use crate::parse::util::parse_line_with_separator;
pub use crate::parse::util::LineSeparatedShape;
}
use log::log_enabled;
use nu_errors::ShellError;
use nu_protocol::{errln, outln};


@ -7,3 +7,49 @@ macro_rules! return_ok {
}
};
}
#[cfg(test)]
macro_rules! equal_tokens {
($source:tt -> $tokens:expr) => {
let result = apply(pipeline, "pipeline", $source);
let (expected_tree, expected_source) = TokenTreeBuilder::build($tokens);
if result != expected_tree {
let debug_result = format!("{}", result.debug($source));
let debug_expected = format!("{}", expected_tree.debug(&expected_source));
if debug_result == debug_expected {
assert_eq!(
result, expected_tree,
"NOTE: actual and expected had equivalent debug serializations, source={:?}, debug_expected={:?}",
$source,
debug_expected
)
} else {
assert_eq!(debug_result, debug_expected)
}
}
};
(<$parser:tt> $source:tt -> $tokens:expr) => {
let result = apply($parser, stringify!($parser), $source);
let (expected_tree, expected_source) = TokenTreeBuilder::build($tokens);
if result != expected_tree {
let debug_result = format!("{}", result.debug($source));
let debug_expected = format!("{}", expected_tree.debug(&expected_source));
if debug_result == debug_expected {
assert_eq!(
result, expected_tree,
"NOTE: actual and expected had equivalent debug serializations, source={:?}, debug_expected={:?}",
$source,
debug_expected
)
} else {
assert_eq!(debug_result, debug_expected)
}
}
};
}
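The `equal_tokens!` macro's fallback compares debug renderings before values: string diffs are far more readable in test output, and when the strings match but the values differ, that fact itself is the useful error. The same logic as a plain function (illustrative, not the crate's API):

```rust
use std::fmt::Debug;

/// If values differ, compare their debug renderings first: when those
/// match, report the structural difference explicitly; otherwise fail
/// on the strings, which produce a readable diff.
fn assert_equivalent<T: Debug + PartialEq>(actual: T, expected: T) {
    if actual != expected {
        let a = format!("{:?}", actual);
        let e = format!("{:?}", expected);
        if a == e {
            panic!("equal debug output but unequal values: {}", a);
        } else {
            assert_eq!(a, e);
        }
    }
}

fn main() {
    assert_equivalent(vec![1, 2], vec![1, 2]); // passes silently
}
```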


@ -1,5 +1,4 @@
#![allow(unused)]
use crate::parse::{
call_node::*, flag::*, number::*, operator::*, pipeline::*, token_tree::*,
token_tree_builder::*, unit::*,
@ -318,6 +317,7 @@ pub fn dq_string(input: NomSpan) -> IResult<NomSpan, SpannedToken> {
let (input, _) = char('"')(input)?;
let start1 = input.offset;
let (input, _) = many0(none_of("\""))(input)?;
let end1 = input.offset;
let (input, _) = char('"')(input)?;
let end = input.offset;
@ -372,7 +372,7 @@ fn word<'a, T, U, V>(
let (input, _) = start_predicate(input)?;
let (input, _) = many0(next_predicate)(input)?;
let next_char = &input.fragment.chars().nth(0);
let next_char = &input.fragment.chars().next();
match next_char {
Some('.') => {}
@ -609,7 +609,7 @@ fn tight<'a>(
let (input, tail) = opt(alt((many1(range_continuation), many1(dot_member))))(input)?;
let next_char = &input.fragment.chars().nth(0);
let next_char = &input.fragment.chars().next();
if is_boundary(*next_char) {
if let Some(tail) = tail {
@ -939,7 +939,7 @@ pub fn tight_node(input: NomSpan) -> IResult<NomSpan, Vec<SpannedToken>> {
))(input)
}
fn to_list(
pub fn to_list(
parser: impl Fn(NomSpan) -> IResult<NomSpan, SpannedToken>,
) -> impl Fn(NomSpan) -> IResult<NomSpan, Vec<SpannedToken>> {
move |input| {
@ -1017,7 +1017,7 @@ fn parse_int<T>(frag: &str, neg: Option<T>) -> i64 {
}
}
fn is_boundary(c: Option<char>) -> bool {
pub fn is_boundary(c: Option<char>) -> bool {
match c {
None => true,
Some(')') | Some(']') | Some('}') | Some('(') => true,
@ -1140,59 +1140,13 @@ fn is_member_start(c: char) -> bool {
#[cfg(test)]
mod tests {
use super::*;
use crate::parse::token_tree_builder::TokenTreeBuilder as b;
use crate::parse::token_tree_builder::{CurriedToken, TokenTreeBuilder};
use crate::parse::parser::{module, nodes, pipeline};
use crate::parse::token_tree_builder::TokenTreeBuilder::{self, self as b};
use crate::test_support::apply;
use nu_source::PrettyDebugWithSource;
use pretty_assertions::assert_eq;
pub type CurriedNode<T> = Box<dyn FnOnce(&mut TokenTreeBuilder) -> T + 'static>;
macro_rules! equal_tokens {
($source:tt -> $tokens:expr) => {
let result = apply(pipeline, "pipeline", $source);
let (expected_tree, expected_source) = TokenTreeBuilder::build($tokens);
if result != expected_tree {
let debug_result = format!("{}", result.debug($source));
let debug_expected = format!("{}", expected_tree.debug(&expected_source));
if debug_result == debug_expected {
assert_eq!(
result, expected_tree,
"NOTE: actual and expected had equivalent debug serializations, source={:?}, debug_expected={:?}",
$source,
debug_expected
)
} else {
assert_eq!(debug_result, debug_expected)
}
}
};
(<$parser:tt> $source:tt -> $tokens:expr) => {
let result = apply($parser, stringify!($parser), $source);
let (expected_tree, expected_source) = TokenTreeBuilder::build($tokens);
if result != expected_tree {
let debug_result = format!("{}", result.debug($source));
let debug_expected = format!("{}", expected_tree.debug(&expected_source));
if debug_result == debug_expected {
assert_eq!(
result, expected_tree,
"NOTE: actual and expected had equivalent debug serializations, source={:?}, debug_expected={:?}",
$source,
debug_expected
)
} else {
assert_eq!(debug_result, debug_expected)
}
}
};
}
#[test]
fn test_integer() {
equal_tokens! {
@ -1339,7 +1293,7 @@ mod tests {
fn test_flag() {
equal_tokens! {
<nodes>
"--amigos" -> b::token_list(vec![b::flag("arepas")])
"--amigos" -> b::token_list(vec![b::flag("amigos")])
}
equal_tokens! {
@ -1721,119 +1675,4 @@ mod tests {
])
);
}
// #[test]
// fn test_smoke_pipeline() {
// let _ = pretty_env_logger::try_init();
// assert_eq!(
// apply(
// pipeline,
// "pipeline",
// r#"git branch --merged | split-row "`n" | where $it != "* master""#
// ),
// build_token(b::pipeline(vec![
// (
// None,
// b::call(
// b::bare("git"),
// vec![b::sp(), b::bare("branch"), b::sp(), b::flag("merged")]
// ),
// Some(" ")
// ),
// (
// Some(" "),
// b::call(b::bare("split-row"), vec![b::sp(), b::string("`n")]),
// Some(" ")
// ),
// (
// Some(" "),
// b::call(
// b::bare("where"),
// vec![
// b::sp(),
// b::it_var(),
// b::sp(),
// b::op("!="),
// b::sp(),
// b::string("* master")
// ]
// ),
// None
// )
// ]))
// );
// assert_eq!(
// apply(pipeline, "pipeline", "ls | where { $it.size > 100 }"),
// build_token(b::pipeline(vec![
// (None, b::call(b::bare("ls"), vec![]), Some(" ")),
// (
// Some(" "),
// b::call(
// b::bare("where"),
// vec![
// b::sp(),
// b::braced(vec![
// b::path(b::it_var(), vec![b::member("size")]),
// b::sp(),
// b::op(">"),
// b::sp(),
// b::int(100)
// ])
// ]
// ),
// None
// )
// ]))
// )
// }
fn apply(
f: impl Fn(
NomSpan,
)
-> Result<(NomSpan, SpannedToken), nom::Err<(NomSpan, nom::error::ErrorKind)>>,
desc: &str,
string: &str,
) -> SpannedToken {
let result = f(nom_input(string));
match result {
Ok(value) => value.1,
Err(err) => {
let err = nu_errors::ShellError::parse_error(err);
println!("{:?}", string);
crate::hir::baseline_parse::tests::print_err(err, &nu_source::Text::from(string));
panic!("test failed")
}
}
}
fn span((left, right): (usize, usize)) -> Span {
Span::new(left, right)
}
fn delimited(
delimiter: Spanned<Delimiter>,
children: Vec<SpannedToken>,
left: usize,
right: usize,
) -> SpannedToken {
let start = Span::for_char(left);
let end = Span::for_char(right);
let node = DelimitedNode::new(delimiter.item, (start, end), children);
Token::Delimited(node).into_spanned((left, right))
}
fn build<T>(block: CurriedNode<T>) -> T {
let mut builder = TokenTreeBuilder::new();
block(&mut builder)
}
fn build_token(block: CurriedToken) -> SpannedToken {
TokenTreeBuilder::build(block).0
}
}


@ -306,6 +306,13 @@ impl SpannedToken {
}
}
pub fn is_int(&self) -> bool {
match self.unspanned() {
Token::Number(RawNumber::Int(_)) => true,
_ => false,
}
}
pub fn as_string(&self) -> Option<(Span, Span)> {
match self.unspanned() {
Token::String(inner_span) => Some((self.span(), *inner_span)),
@ -327,16 +334,16 @@ impl SpannedToken {
}
}
pub fn is_dot(&self) -> bool {
match self.unspanned() {
Token::EvaluationOperator(EvaluationOperator::Dot) => true,
_ => false,
}
}
pub fn is_separator(&self) -> bool {
match self.unspanned() {
Token::Separator => true,
_ => false,
}
}
@ -479,6 +486,13 @@ impl SpannedToken {
}
}
pub fn expect_number(&self) -> RawNumber {
match self.unspanned() {
Token::Number(raw_number) => *raw_number,
other => panic!("Expected number, found {:?}", other),
}
}
pub fn expect_string(&self) -> (Span, Span) {
match self.unspanned() {
Token::String(inner_span) => (self.span(), *inner_span),


@ -1 +0,0 @@


@ -0,0 +1,2 @@
pub(crate) mod parser;
pub(crate) mod shape;


@ -0,0 +1,272 @@
use crate::parse::number::RawNumber;
use crate::parse::parser::{is_boundary, to_list};
use crate::parse::token_tree::SpannedToken;
use crate::parse::token_tree_builder::TokenTreeBuilder;
use nu_source::{HasSpan, NomSpan, Span, Spanned, SpannedItem};
use nom::branch::alt;
use nom::bytes::complete::{escaped, tag};
use nom::character::complete::*;
use nom::combinator::*;
use nom::multi::*;
use nom::IResult;
use nom_tracable::tracable_parser;
#[tracable_parser]
pub fn parse_line_with_separator<'a, 'b>(
separator: &'b str,
input: NomSpan<'a>,
) -> IResult<NomSpan<'a>, Spanned<Vec<SpannedToken>>> {
let start = input.offset;
let mut nodes = vec![];
let mut next_input = input;
loop {
let node_result = to_list(leaf(separator))(next_input);
let (after_node_input, next_nodes) = match node_result {
Err(_) => break,
Ok((after_node_input, next_node)) => (after_node_input, next_node),
};
nodes.extend(next_nodes);
match separated_by(separator)(after_node_input) {
Err(_) => {
next_input = after_node_input;
break;
}
Ok((input, s)) => {
nodes.push(s);
next_input = input;
}
}
}
let end = next_input.offset;
Ok((next_input, nodes.spanned(Span::new(start, end))))
}
#[tracable_parser]
pub fn fallback_number_without(c: char) -> impl Fn(NomSpan) -> IResult<NomSpan, SpannedToken> {
move |input| {
let (input, number) = fallback_raw_number_without(c)(input)?;
Ok((
input,
TokenTreeBuilder::spanned_number(number, number.span()),
))
}
}
#[tracable_parser]
pub fn fallback_raw_number_without(c: char) -> impl Fn(NomSpan) -> IResult<NomSpan, RawNumber> {
move |input| {
let _anchoral = input;
let start = input.offset;
let (input, _neg) = opt(tag("-"))(input)?;
let (input, _head) = digit1(input)?;
let after_int_head = input;
match input.fragment.chars().next() {
None => return Ok((input, RawNumber::int(Span::new(start, input.offset)))),
Some('.') => (),
other if is_boundary(other) || other == Some(c) => {
return Ok((input, RawNumber::int(Span::new(start, input.offset))))
}
_ => {
return Err(nom::Err::Error(nom::error::make_error(
input,
nom::error::ErrorKind::Tag,
)))
}
}
let dot: IResult<NomSpan, NomSpan, (NomSpan, nom::error::ErrorKind)> = tag(".")(input);
let input = match dot {
Ok((input, _dot)) => input,
// it's just an integer
Err(_) => return Ok((input, RawNumber::int(Span::new(start, input.offset)))),
};
let tail_digits_result: IResult<NomSpan, _> = digit1(input);
let (input, _tail) = match tail_digits_result {
Ok((input, tail)) => (input, tail),
Err(_) => {
return Ok((
after_int_head,
RawNumber::int((start, after_int_head.offset)),
))
}
};
let end = input.offset;
let next = input.fragment.chars().next();
if is_boundary(next) || next == Some(c) {
Ok((input, RawNumber::decimal(Span::new(start, end))))
} else {
Err(nom::Err::Error(nom::error::make_error(
input,
nom::error::ErrorKind::Tag,
)))
}
}
}
#[tracable_parser]
pub fn leaf(c: &str) -> impl Fn(NomSpan) -> IResult<NomSpan, SpannedToken> + '_ {
move |input| {
let separator = c.chars().next().unwrap_or_else(|| ',');
let (input, node) = alt((
fallback_number_without(separator),
string,
fallback_string_without(c),
))(input)?;
Ok((input, node))
}
}
#[tracable_parser]
pub fn separated_by(c: &str) -> impl Fn(NomSpan) -> IResult<NomSpan, SpannedToken> + '_ {
move |input| {
let left = input.offset;
let (input, _) = tag(c)(input)?;
let right = input.offset;
Ok((input, TokenTreeBuilder::spanned_sep(Span::new(left, right))))
}
}
#[tracable_parser]
pub fn dq_string(input: NomSpan) -> IResult<NomSpan, SpannedToken> {
let start = input.offset;
let (input, _) = char('"')(input)?;
let start1 = input.offset;
let (input, _) = escaped(
none_of(r#"\""#),
'\\',
nom::character::complete::one_of(r#"\"rnt"#),
)(input)?;
let end1 = input.offset;
let (input, _) = char('"')(input)?;
let end = input.offset;
Ok((
input,
TokenTreeBuilder::spanned_string(Span::new(start1, end1), Span::new(start, end)),
))
}
#[tracable_parser]
pub fn sq_string(input: NomSpan) -> IResult<NomSpan, SpannedToken> {
let start = input.offset;
let (input, _) = char('\'')(input)?;
let start1 = input.offset;
let (input, _) = many0(none_of("\'"))(input)?;
let end1 = input.offset;
let (input, _) = char('\'')(input)?;
let end = input.offset;
Ok((
input,
TokenTreeBuilder::spanned_string(Span::new(start1, end1), Span::new(start, end)),
))
}
#[tracable_parser]
pub fn string(input: NomSpan) -> IResult<NomSpan, SpannedToken> {
alt((sq_string, dq_string))(input)
}
#[tracable_parser]
pub fn fallback_string_without(c: &str) -> impl Fn(NomSpan) -> IResult<NomSpan, SpannedToken> + '_ {
move |input| {
let start = input.offset;
let (input, _) = many0(none_of(c))(input)?;
let end = input.offset;
Ok((
input,
TokenTreeBuilder::spanned_string(Span::new(start, end), Span::new(start, end)),
))
}
}
#[cfg(test)]
mod tests {
use crate::parse::token_tree_builder::TokenTreeBuilder::{self, self as b};
use crate::parse::util::parse_line_with_separator;
use crate::test_support::apply;
use nom::IResult;
use crate::parse::pipeline::PipelineElement;
use crate::parse::token_tree::SpannedToken;
use nu_source::NomSpan;
use nu_source::PrettyDebugWithSource;
use pretty_assertions::assert_eq;
pub fn nodes(input: NomSpan) -> IResult<NomSpan, SpannedToken> {
let (input, tokens) = parse_line_with_separator(",", input)?;
let span = tokens.span;
Ok((
input,
TokenTreeBuilder::spanned_pipeline(vec![PipelineElement::new(None, tokens)], span),
))
}
#[test]
fn separators() {
equal_tokens! {
<nodes>
r#""name","lastname","age""# -> b::token_list(vec![
b::string("name"),
b::sep(","),
b::string("lastname"),
b::sep(","),
b::string("age")
])
}
equal_tokens! {
<nodes>
r#""Andrés","Robalino",12"# -> b::token_list(vec![
b::string("Andrés"),
b::sep(","),
b::string("Robalino"),
b::sep(","),
b::int(12)
])
}
}
#[test]
fn strings() {
equal_tokens! {
<nodes>
r#""andres""# -> b::token_list(vec![b::string("andres")])
}
}
#[test]
fn numbers() {
equal_tokens! {
<nodes>
"123" -> b::token_list(vec![b::int(123)])
}
equal_tokens! {
<nodes>
"-123" -> b::token_list(vec![b::int(-123)])
}
}
}


@ -0,0 +1,91 @@
use crate::hir::{
self, syntax_shape::ExpandSyntax, syntax_shape::FlatShape, syntax_shape::NumberExpressionShape,
syntax_shape::StringShape,
};
use crate::hir::{Expression, TokensIterator};
use crate::parse::token_tree::SeparatorType;
use nu_errors::ParseError;
use nu_protocol::UntaggedValue;
use nu_source::Span;
#[derive(Debug, Copy, Clone)]
pub struct LineSeparatedShape;
impl ExpandSyntax for LineSeparatedShape {
type Output = Result<Vec<UntaggedValue>, ParseError>;
fn name(&self) -> &'static str {
"any string line separated by"
}
fn expand<'a, 'b>(
&self,
token_nodes: &mut TokensIterator<'_>,
) -> Result<Vec<UntaggedValue>, ParseError> {
let source = token_nodes.source();
if token_nodes.at_end() {
return Ok(vec![]);
}
let mut entries = vec![];
loop {
let field = {
token_nodes
.expand_syntax(NumberExpressionShape)
.or_else(|_| {
token_nodes
.expand_syntax(StringShape)
.map(|syntax| Expression::string(syntax.inner).into_expr(syntax.span))
})
};
if let Ok(field) = field {
match &field.expr {
Expression::Literal(hir::Literal::Number(crate::Number::Int(i))) => {
entries.push(UntaggedValue::int(i.clone()))
}
Expression::Literal(hir::Literal::Number(crate::Number::Decimal(d))) => {
entries.push(UntaggedValue::decimal(d.clone()))
}
Expression::Literal(hir::Literal::String(span)) => {
if span.is_closed() {
entries.push(UntaggedValue::nothing())
} else {
entries.push(UntaggedValue::string(span.slice(&source)))
}
}
_ => {}
}
}
match token_nodes.expand_infallible(SeparatorShape) {
Err(err) if !token_nodes.at_end() => return Err(err),
_ => {}
}
if token_nodes.at_end() {
break;
}
}
Ok(entries)
}
}
#[derive(Debug, Copy, Clone)]
pub struct SeparatorShape;
impl ExpandSyntax for SeparatorShape {
type Output = Result<Span, ParseError>;
fn name(&self) -> &'static str {
"separated"
}
fn expand<'a, 'b>(&self, token_nodes: &'b mut TokensIterator<'a>) -> Result<Span, ParseError> {
token_nodes.expand_token(SeparatorType, |span| Ok((FlatShape::Separator, span)))
}
}


@ -0,0 +1,4 @@
mod line_delimited_parser;
pub use line_delimited_parser::parser::parse_line_with_separator;
pub use line_delimited_parser::shape::LineSeparatedShape;


@ -0,0 +1,104 @@
use crate::hir::{syntax_shape::ExpandContext, syntax_shape::SignatureRegistry};
use crate::parse::files::Files;
use crate::parse::token_tree::{DelimitedNode, Delimiter, SpannedToken, Token};
use crate::parse::token_tree_builder::{CurriedToken, TokenTreeBuilder};
use nu_errors::ShellError;
use nu_protocol::Signature;
use nu_source::{nom_input, NomSpan, Span, Spanned, Text};
pub use nu_source::PrettyDebug;
use derive_new::new;
pub type CurriedNode<T> = Box<dyn FnOnce(&mut TokenTreeBuilder) -> T + 'static>;
#[derive(Debug, Clone, new)]
pub struct TestRegistry {
#[new(default)]
signatures: indexmap::IndexMap<String, Signature>,
}
impl TestRegistry {}
impl SignatureRegistry for TestRegistry {
fn has(&self, name: &str) -> bool {
self.signatures.contains_key(name)
}
fn get(&self, name: &str) -> Option<Signature> {
self.signatures.get(name).cloned()
}
fn clone_box(&self) -> Box<dyn SignatureRegistry> {
Box::new(self.clone())
}
}
pub fn with_empty_context(source: &Text, callback: impl FnOnce(ExpandContext)) {
let registry = TestRegistry::new();
callback(ExpandContext::new(Box::new(registry), source, None))
}
pub fn inner_string_span(span: Span) -> Span {
Span::new(span.start() + 1, span.end() - 1)
}
pub fn print_err(err: ShellError, source: &Text) {
let diag = err.into_diagnostic();
let writer = termcolor::StandardStream::stderr(termcolor::ColorChoice::Auto);
let mut source = source.to_string();
source.push_str(" ");
let files = Files::new(source);
let _ = language_reporting::emit(
&mut writer.lock(),
&files,
&diag,
&language_reporting::DefaultConfig,
);
}
pub fn apply(
f: impl Fn(NomSpan) -> Result<(NomSpan, SpannedToken), nom::Err<(NomSpan, nom::error::ErrorKind)>>,
_desc: &str,
string: &str,
) -> SpannedToken {
let result = f(nom_input(string));
match result {
Ok(value) => value.1,
Err(err) => {
let err = nu_errors::ShellError::parse_error(err);
println!("{:?}", string);
crate::hir::baseline_parse::tests::print_err(err, &nu_source::Text::from(string));
panic!("test failed")
}
}
}
pub fn span((left, right): (usize, usize)) -> Span {
Span::new(left, right)
}
pub fn delimited(
delimiter: Spanned<Delimiter>,
children: Vec<SpannedToken>,
left: usize,
right: usize,
) -> SpannedToken {
let start = Span::for_char(left);
let end = Span::for_char(right);
let node = DelimitedNode::new(delimiter.item, (start, end), children);
Token::Delimited(node).into_spanned((left, right))
}
pub fn build<T>(block: CurriedNode<T>) -> T {
let mut builder = TokenTreeBuilder::new();
block(&mut builder)
}
pub fn build_token(block: CurriedToken) -> SpannedToken {
TokenTreeBuilder::build(block).0
}


@ -1,6 +1,6 @@
[package]
name = "nu-plugin"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Nushell Plugin"
@ -10,10 +10,10 @@ license = "MIT"
doctest = false
[dependencies]
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.12.0" }
indexmap = { version = "1.3.0", features = ["serde-1"] }
serde = { version = "1.0.103", features = ["derive"] }
@ -21,4 +21,4 @@ num-bigint = { version = "0.2.3", features = ["serde"] }
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu-protocol"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Core values and protocols for Nushell"
@ -10,8 +10,8 @@ license = "MIT"
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
serde = { version = "1.0.103", features = ["derive"] }
indexmap = { version = "1.3.0", features = ["serde-1"] }
@ -38,4 +38,4 @@ toml = "0.5.5"
serde_json = "1.0.44"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -379,6 +379,60 @@ impl From<ShellError> for UntaggedValue {
}
}
impl num_traits::Zero for Value {
fn zero() -> Self {
Value {
value: UntaggedValue::Primitive(Primitive::zero()),
tag: Tag::unknown(),
}
}
fn is_zero(&self) -> bool {
match &self.value {
UntaggedValue::Primitive(primitive) => primitive.is_zero(),
UntaggedValue::Row(row) => row.entries.is_empty(),
UntaggedValue::Table(rows) => rows.is_empty(),
_ => false,
}
}
}
impl std::ops::Mul for Value {
type Output = Self;
fn mul(self, rhs: Self) -> Self {
let tag = self.tag.clone();
match (&*self, &*rhs) {
(UntaggedValue::Primitive(left), UntaggedValue::Primitive(right)) => {
let left = left.clone();
let right = right.clone();
UntaggedValue::from(left.mul(right)).into_value(tag)
}
(_, _) => unimplemented!("Internal error: can't multiply non-primitives."),
}
}
}
impl std::ops::Add for Value {
type Output = Self;
fn add(self, rhs: Self) -> Self {
let tag = self.tag.clone();
match (&*self, &*rhs) {
(UntaggedValue::Primitive(left), UntaggedValue::Primitive(right)) => {
let left = left.clone();
let right = right.clone();
UntaggedValue::from(left.add(right)).into_value(tag)
}
(_, _) => unimplemented!("Internal error: can't add non-primitives."),
}
}
}
pub fn merge_descriptors(values: &[Value]) -> Vec<String> {
let mut ret: Vec<String> = vec![];
let value_column = "<value>".to_string();


@ -8,6 +8,7 @@ use nu_errors::{ExpectedRange, ShellError};
use nu_source::{PrettyDebug, Span, SpannedItem};
use num_bigint::BigInt;
use num_traits::cast::{FromPrimitive, ToPrimitive};
use num_traits::identities::Zero;
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
@ -73,6 +74,92 @@ impl Primitive {
)),
}
}
pub fn into_string(self, span: Span) -> Result<String, ShellError> {
match self {
Primitive::String(s) => Ok(s),
other => Err(ShellError::type_error(
"string",
other.type_name().spanned(span),
)),
}
}
}
impl num_traits::Zero for Primitive {
fn zero() -> Self {
Primitive::Int(BigInt::zero())
}
fn is_zero(&self) -> bool {
match self {
Primitive::Int(int) => int.is_zero(),
Primitive::Decimal(decimal) => decimal.is_zero(),
Primitive::Bytes(size) => size.is_zero(),
Primitive::Nothing => true,
_ => false,
}
}
}
impl std::ops::Add for Primitive {
type Output = Primitive;
fn add(self, rhs: Self) -> Self {
match (self, rhs) {
(Primitive::Int(left), Primitive::Int(right)) => Primitive::Int(left + right),
(Primitive::Int(left), Primitive::Decimal(right)) => {
Primitive::Decimal(BigDecimal::from(left) + right)
}
(Primitive::Decimal(left), Primitive::Decimal(right)) => {
Primitive::Decimal(left + right)
}
(Primitive::Decimal(left), Primitive::Int(right)) => {
Primitive::Decimal(left + BigDecimal::from(right))
}
(Primitive::Bytes(left), right) => match right {
Primitive::Bytes(right) => Primitive::Bytes(left + right),
Primitive::Int(right) => {
Primitive::Bytes(left + right.to_u64().unwrap_or_else(|| 0 as u64))
}
Primitive::Decimal(right) => {
Primitive::Bytes(left + right.to_u64().unwrap_or_else(|| 0 as u64))
}
_ => Primitive::Bytes(left),
},
(left, Primitive::Bytes(right)) => match left {
Primitive::Bytes(left) => Primitive::Bytes(left + right),
Primitive::Int(left) => {
Primitive::Bytes(left.to_u64().unwrap_or_else(|| 0 as u64) + right)
}
Primitive::Decimal(left) => {
Primitive::Bytes(left.to_u64().unwrap_or_else(|| 0 as u64) + right)
}
_ => Primitive::Bytes(right),
},
_ => Primitive::zero(),
}
}
}
impl std::ops::Mul for Primitive {
type Output = Self;
fn mul(self, rhs: Self) -> Self {
match (self, rhs) {
(Primitive::Int(left), Primitive::Int(right)) => Primitive::Int(left * right),
(Primitive::Int(left), Primitive::Decimal(right)) => {
Primitive::Decimal(BigDecimal::from(left) * right)
}
(Primitive::Decimal(left), Primitive::Decimal(right)) => {
Primitive::Decimal(left * right)
}
(Primitive::Decimal(left), Primitive::Int(right)) => {
Primitive::Decimal(left * BigDecimal::from(right))
}
_ => unimplemented!("Internal error: can't multiply incompatible primitives."),
}
}
}
impl From<BigDecimal> for Primitive {
@ -82,6 +169,13 @@ impl From<BigDecimal> for Primitive {
}
}
impl From<BigInt> for Primitive {
/// Helper to convert from integers to a Primitive value
fn from(int: BigInt) -> Primitive {
Primitive::Int(int)
}
}
impl From<f64> for Primitive {
/// Helper to convert from 64-bit float to a Primitive value
fn from(float: f64) -> Primitive {


@ -1,6 +1,6 @@
[package]
name = "nu-source"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A source string characterizer for Nushell"
@ -20,4 +20,4 @@ termcolor = "1.0.5"
pretty = "0.5.2"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -2,14 +2,14 @@
## Overview
The `nu-source` crate contains types and traits used for keeping track of _metadata_ about values being processed.
Nu uses `Tag`s to keep track of where a value came from, an `AnchorLocation`,
as well as positional information about the value, a `Span`.
An `AnchorLocation` can be a `Url`, `File`, or `Source` text that a value was parsed from.
The source `Text` is special in that it is a type similar to a `String` that comes with the ability to be cheaply cloned.
A `Span` keeps track of a value's `start` and `end` positions.
These types make up the metadata for a value and are wrapped up together in a `Tagged` struct,
which holds everything needed to track and locate a value.
Nu's metadata system can be seen when reporting errors.
@ -20,11 +20,15 @@ In the following example Nu is able to report to the user where the typo of a co
| ^^^ did you mean 'type'?
```
In addition to metadata tracking, `nu-source` also contains types and traits related to debugging, tracing, and formatting the metadata and values it processes.
## Other Resources
- [Nushell Github Project](https://github.com/nushell): Contains all projects in the Nushell ecosystem such as the source code to Nushell as well as website and books.
- [Nushell Git Repository](https://github.com/nushell/nushell): A direct link to the source git repository for Nushell
- [Nushell Contributor Book](https://github.com/nushell/contributor-book): An overview of topics about Nushell to help you get started contributing to the project.
- [Discord Channel](https://discordapp.com/invite/NtAbbGn)
- [Twitter](https://twitter.com/nu_shell)
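The metadata model described above can be sketched with a minimal, self-contained reimplementation. The real `Span`, `Tag`, and `Tagged` types live in `nu-source` and carry more fields (such as the `AnchorLocation`); the stand-ins below are simplified assumptions, intended only to illustrate how a span locates a value inside its source text:

```rust
// Illustrative stand-ins for nu-source's metadata types.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Span {
    start: usize,
    end: usize,
}

impl Span {
    fn new(start: usize, end: usize) -> Span {
        Span { start, end }
    }

    // A "closed" span has equal start and end, so it covers no characters.
    fn is_closed(&self) -> bool {
        self.start == self.end
    }

    // Slice out the portion of the source text this span covers.
    fn slice<'a>(&self, source: &'a str) -> &'a str {
        &source[self.start..self.end]
    }
}

// A value bundled with the metadata describing where it came from.
#[derive(Debug)]
struct Tagged<T> {
    item: T,
    span: Span,
}

fn main() {
    let source = "make clean";
    let word = Span::new(0, 4);
    let tagged = Tagged {
        item: word.slice(source).to_string(),
        span: word,
    };
    assert_eq!(tagged.item, "make");
    assert!(!tagged.span.is_closed());
    assert!(Span::new(5, 5).is_closed());
}
```

This is the shape error reporting relies on: given only a `Tagged` value, Nu can slice the original source and point an underline at exactly the characters the value came from.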


@ -659,6 +659,27 @@ impl Span {
self.start == 0 && self.end == 0
}
/// Returns `true` if the current Span covers no characters, i.e. its start and end positions are equal.
///
/// # Example
///
/// ```
/// // make clean
/// // ----
/// // (0,4)
/// //
/// // ^(5,5)
///
/// let make_span = Span::new(0,4);
/// let clean_span = Span::new(5,5);
///
/// assert_eq!(make_span.is_closed(), false);
/// assert_eq!(clean_span.is_closed(), true);
/// ```
pub fn is_closed(&self) -> bool {
self.start == self.end
}
/// Returns a slice of the input that covers the start and end of the current Span.
pub fn slice<'a>(&self, source: &'a str) -> &'a str {
&source[self.start..self.end]


@ -1,6 +1,6 @@
[package]
name = "nu-test-support"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A source string characterizer for Nushell"
@ -10,9 +10,9 @@ license = "MIT"
doctest = false
[dependencies]
nu-parser = { path = "../nu-parser", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
app_dirs = "1.2.1"
dunce = "1.0.0"
@ -22,4 +22,4 @@ tempfile = "3.1.0"
indexmap = { version = "1.3.0", features = ["serde-1"] }
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -246,7 +246,10 @@ pub fn root() -> PathBuf {
}
pub fn binaries() -> PathBuf {
std::env::var("CARGO_TARGET_DIR")
.ok()
.map(|target_dir| PathBuf::from(target_dir).join("debug"))
.unwrap_or_else(|| root().join("target/debug"))
}
pub fn fixtures() -> PathBuf {


@ -1,6 +1,6 @@
[package]
name = "nu-value-ext"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "Extension traits for values in Nushell"
@ -10,14 +10,14 @@ license = "MIT"
doctest = false
[dependencies]
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
nu-parser = { path = "../nu-parser", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
num-traits = "0.2.10"
itertools = "0.8.2"
indexmap = { version = "1.3.0", features = ["serde-1"] }
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_average"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "An average value plugin for Nushell"
@ -10,10 +10,10 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_binaryview"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A binary viewer plugin for Nushell"
@ -11,15 +11,15 @@ doctest = false
[dependencies]
ansi_term = "0.12.1"
crossterm = { version = "0.16.0" }
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
pretty-hex = "0.1.1"
image = { version = "0.22.4", default_features = false, features = ["png_codec", "jpeg"] }
rawkey = "0.1.2"
neso = "0.5.0"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_fetch"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A URL fetch plugin for Nushell"
@ -10,13 +10,13 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
futures = { version = "0.3", features = ["compat", "io-compat"] }
surf = "1.0.3"
url = "2.1.1"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_inc"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A version incrementer plugin for Nushell"
@ -10,13 +10,13 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.12.0" }
semver = "0.9.0"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_match"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A regex match plugin for Nushell"
@ -10,12 +10,12 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
futures = { version = "0.3", features = ["compat", "io-compat"] }
regex = "1"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_post"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "An HTTP post plugin for Nushell"
@ -10,10 +10,10 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
futures = { version = "0.3", features = ["compat", "io-compat"] }
surf = "1.0.3"
url = "2.1.1"
@ -22,4 +22,4 @@ base64 = "0.11"
num-traits = "0.2.11"
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -1,6 +1,6 @@
[package]
name = "nu_plugin_ps"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A process list plugin for Nushell"
@ -10,19 +10,18 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
futures = { version = "0.3", features = ["compat", "io-compat"] }
futures-timer = "3.0.1"
pin-utils = "0.1.0-alpha.4"
[dependencies.heim]
version = "0.0.10"
default-features = false
features = ["process", "runtime-polyfill"]
[build-dependencies]
nu-build = { version = "0.12.0", path = "../nu-build" }


@ -9,11 +9,16 @@ impl Plugin for Ps {
fn config(&mut self) -> Result<Signature, ShellError> {
Ok(Signature::build("ps")
.desc("View information about system processes.")
.switch(
"full",
"list all available columns for each entry",
Some('f'),
)
.filter())
}
fn begin_filter(&mut self, callinfo: CallInfo) -> Result<Vec<ReturnValue>, ShellError> {
Ok(block_on(ps(callinfo.name_tag, callinfo.args.has("full")))
.into_iter()
.map(ReturnSuccess::value)
.collect())


@@ -27,8 +27,8 @@ async fn usage(process: Process) -> ProcessResult<(process::Process, Ratio, proc
Ok((process, usage_2 - usage_1, memory))
}
pub async fn ps(tag: Tag) -> Vec<Value> {
let processes = process::processes()
pub async fn ps(tag: Tag, full: bool) -> Vec<Value> {
let mut processes = process::processes()
.map_ok(|process| {
// Note that there is no `.await` here,
// as we want to pass the returned future
@@ -36,7 +36,6 @@ pub async fn ps(tag: Tag) -> Vec<Value> {
usage(process)
})
.try_buffer_unordered(usize::MAX);
pin_utils::pin_mut!(processes);
let mut output = vec![];
while let Some(res) = processes.next().await {
@@ -58,6 +57,25 @@ pub async fn ps(tag: Tag) -> Vec<Value> {
"virtual",
UntaggedValue::bytes(memory.vms().get::<information::byte>()),
);
if full {
if let Ok(parent_pid) = process.parent_pid().await {
dict.insert_untagged("parent", UntaggedValue::int(parent_pid))
}
if let Ok(exe) = process.exe().await {
dict.insert_untagged("exe", UntaggedValue::string(exe.to_string_lossy()))
}
#[cfg(not(windows))]
{
if let Ok(command) = process.command().await {
dict.insert_untagged(
"command",
UntaggedValue::string(command.to_os_string().to_string_lossy()),
);
}
}
}
output.push(dict.into_value());
}
}

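The hunk above gates the extra `parent`, `exe`, and `command` columns behind the new `full` flag, inserting them into the row only when `--full` was passed. As a standalone sketch of that conditional-column pattern (using a plain `BTreeMap` as a hypothetical stand-in for Nushell's `TaggedDictBuilder`, not the actual plugin API):

```rust
use std::collections::BTreeMap;

// Hypothetical stand-in for the plugin's dictionary builder: a row of
// column-name -> value pairs, with extra columns added only when `full` is set.
fn process_row(pid: i64, name: &str, full: bool, parent_pid: Option<i64>) -> BTreeMap<String, String> {
    let mut row = BTreeMap::new();
    row.insert("pid".to_string(), pid.to_string());
    row.insert("name".to_string(), name.to_string());
    if full {
        // Mirrors the diff above: the "parent" column exists only in full mode,
        // and only when the parent pid could actually be retrieved.
        if let Some(ppid) = parent_pid {
            row.insert("parent".to_string(), ppid.to_string());
        }
    }
    row
}

fn main() {
    let brief = process_row(42, "nu", false, Some(1));
    assert!(!brief.contains_key("parent"));

    let full = process_row(42, "nu", true, Some(1));
    assert_eq!(full.get("parent").map(String::as_str), Some("1"));
}
```

Keeping the brief rows homogeneous while adding columns only in `--full` mode is what lets a pipeline like `ps | get name` work the same way regardless of the flag.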

@@ -1,6 +1,6 @@
[package]
name = "nu_plugin_str"
version = "0.11.0"
version = "0.12.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A string manipulation plugin for Nushell"
@@ -10,15 +10,15 @@ license = "MIT"
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.11.0" }
nu-protocol = { path = "../nu-protocol", version = "0.11.0" }
nu-source = { path = "../nu-source", version = "0.11.0" }
nu-errors = { path = "../nu-errors", version = "0.11.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.11.0" }
nu-plugin = { path = "../nu-plugin", version = "0.12.0" }
nu-protocol = { path = "../nu-protocol", version = "0.12.0" }
nu-source = { path = "../nu-source", version = "0.12.0" }
nu-errors = { path = "../nu-errors", version = "0.12.0" }
nu-value-ext = { path = "../nu-value-ext", version = "0.12.0" }
chrono = { version = "0.4.10", features = ["serde"] }
regex = "1"
num-bigint = "0.2.3"
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }
nu-build = { version = "0.12.0", path = "../nu-build" }


@@ -1,19 +0,0 @@
[package]
name = "nu_plugin_sum"
version = "0.11.0"
authors = ["Yehuda Katz <wycats@gmail.com>", "Jonathan Turner <jonathan.d.turner@gmail.com>", "Andrés N. Robalino <andres@androbtech.com>"]
edition = "2018"
description = "A simple summation plugin for Nushell"
license = "MIT"
[lib]
doctest = false
[dependencies]
nu-plugin = { path = "../nu-plugin", version = "0.11.0" }
nu-protocol = { path = "../nu-protocol", version = "0.11.0" }
nu-source = { path = "../nu-source", version = "0.11.0" }
nu-errors = { path = "../nu-errors", version = "0.11.0" }
[build-dependencies]
nu-build = { version = "0.11.0", path = "../nu-build" }


@@ -1,3 +0,0 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
nu_build::build()
}


@@ -1,4 +0,0 @@
mod nu;
mod sum;
pub use sum::Sum;

Some files were not shown because too many files have changed in this diff.